The Role of Universities in the Age of AI

Dr Konstantinos Gkoutzis


Thank you very much.

Regardless of your professional or academic background, I can safely assume that, at some point in your lives, you were all taught something. Perhaps it happened at the school or university you attended; or maybe you were home-schooled, or chose to skip the academic route and instead went straight to work in order to learn by doing. Whichever path you followed, I am certain that you all had someone in your lives who taught you something: a teacher, a parent, a guardian, a friend – maybe even all of the above.

However, as Bob Dylan (2014) correctly remarked many years ago, “the times, they are a-changin’”, so “you better start swimmin’ or you’ll sink like a stone”. This can also be applied to the world of education, where new technological advancements are constantly challenging long-standing practices, as well as those who still believe in them.

One important issue in academic education at the moment is the lack of personalisation in our teaching methods. All students in the same cohort are given the same information in the exact same way. This may seem like treating everyone equally, but it is not actually fair to assume that each unique individual should learn in the single, very specific manner that we believe is best. Even though a singular approach to teaching is easier for academics who also have an ever-increasing research workload to consider, the benefit of the students should always come first — and technology can help us achieve this. With the aid of Artificial Intelligence, we can personalise our teaching to the learning style and pace of each individual, while also receiving instant feedback and metrics on the learning process, which can help us identify any necessary changes and additions to each specific learning object that we offer. All our students can thus be equal without having to be exactly the same, and we can also reach a wider audience than before, by offering a more effective digital curriculum, adjusted to their specific wants and needs.

This summarises my talk, so please feel free to leave now if you wish.


If you decide to stay, you can join me in a journey where we search for the purpose and future of university education — ideally in harmonious co-existence with modern technological advancements.

Why We Learn

Even though I learn something new almost every day, I must admit that my days as a “university student” are behind me. As a university teacher, I sometimes look at modern-day students and, judging by the look on their faces, I feel that they must occasionally ask themselves: “What is the meaning of this? Why do we bother waking up early every single working day of the week, going to uni to hear about all these obscure topics that never seem to end, and then spending our evenings studying, researching and writing up our coursework until we are exhausted, while our parents – or, worse, we ourselves – pay all this money, instead of investing it in index funds and post-bubble cryptocurrencies to save up for the future…?”. I must say this is a great question indeed!


So, why do we learn? There is a popular Latin phrase that states: “Non scholæ sed vitæ discimus” – “We do not learn for school, but for life”. In other words, schools and universities are not ends in themselves, but the media preparing us to cope with what lies ahead. Interesting fact: this quote is actually an inversion of a line from a letter that the Roman Stoic philosopher Seneca wrote to his friend almost two thousand years ago, complaining that, unfortunately, “we learn our lessons, not for life, but for the lecture-room” (Gummere, 1925). As you can see, times have not “a-changed” a lot since then.



What motivates students to attend university studies? What motivates their parents and guardians to urge them to go? Some might say that it is to acquire a specific degree in order to succeed in the future (Doin and Guzzo, 2012), but how is success defined?

Should parents consider it a success when their child achieves what they did not manage to? This sounds somewhat egotistical. Does success perhaps manifest when students graduate and manage to become well-adjusted and productive members of society? This could indeed make sense — for example, using the degree as a way to get that job interview you wanted, eventually securing the high-paying role that will allow you to pay off those student loans and maybe later acquire that house, that car, and the means to start your own family — if you so wish.

However, these days, the majority of so-called “entry-level” jobs usually require up to three years of prior experience, in addition to the necessary certifications (TalentWorks, 2018), so a university degree in itself is no longer adequate if your sole aim is the expedited security of your livelihood. This is why many universities tend to add work placements as part of their degrees, while also offering students 1:1 meetings with Career Service representatives who can assist them in preparing their CVs and looking for summer internships.

Given the fact that university – or college, for my North American friends – education is generally not “gratis”, these “added services” on top of the actual teaching – as well as the insinuated opportunities for networking with businesses and successful alumni – constitute part of the reason why a student chooses one university over another. And then comes the application process..!

Entry Criteria

You see, before school-leavers are able to join the university “studentdom”, they first need to meet the entry requirements of their institution of choice. As one would expect, the more a university has to offer, the more interest it amasses. Top institutions receive countless applications each academic year, from all over the world (Harvard Gazette, 2018), which sometimes happens a long time in advance and thus includes some “educated” guesstimating (Weale, 2016). However, physical spaces are naturally limited, since each student will need a desk to sit at in one of the lecture theatres where the lessons will take place. Additional parameters also factor into this limitation of places, like the availability of staff members needed to support each student, as well as the preservation of the prestige status that each university wishes for its “brand”.

In order to filter the numerous requests they receive, application-based universities state and enforce strict requirements for entry, which sometimes differ per faculty or department. These idealised criteria have been set to identify proof of prior knowledge and to manage student expectations regarding the difficulty of each programme and course. However, in essence, they end up acting as a barrier to knowledge for students who could be genuinely interested in a subject offered by a specific institution, merely because their path up to that point in life had not allowed them to collect the proper paperwork or develop the communication skills that would let them in (Hartocollis, 2018).

There are, of course, some applicants who do not give up, even if they do not meet the set entry criteria, and, in rare cases, they eventually manage to convince an institution to give them a conditional offer, after agreeing to take on extra mandatory, introductory pre-curriculum workload. But why do students go to all this trouble?

The Status Quo

This entire process sounds like a lot of hard work! Parents start preparing their children to attend university many, many years before they even apply: extracurricular activities, participation in international competitions, part-time internships or volunteering — and all these take place on top of the most difficult and challenging subjects that sixth form has to offer!

In 45 BC, Cicero wrote: “Nor again is there anyone who loves or pursues or desires to obtain pain of itself, because it is pain, but because occasionally circumstances occur in which toil and pain can procure some great pleasure” (Rackham, 1914). It is in our human nature to strive for our goals, even if it requires a lot of suffering to get there. The problem in this case, however, is that our goals are not necessarily genuinely ours.

The biologist Conrad Waddington (1961) defined the term “creode” as the path of development that cells follow to form organs. If something tries to inhibit this developmental process, the cell attempts to return to its normal, predefined course. The norm that a cell reverts to is specified in the genetic material inherited from our ancestors. Causing a permanent change to this pathway takes time because, if someone or something attempts to impose a significant modification, the innate – encoded – tendency is to stay with what you know.

The psychologist Jean Piaget (1971) was a big fan of Waddington (Parker et al., 2014) and agreed that our actions are not determined simply based on causal components, but also on coordinated logic, thus including normative properties in the overall equation (Smith and Vonèche, 2006). Since norms can be said to develop over time, alongside – and interdependently with – biology and culture (Smith, 2002), our decisions are in this way affected by our habits, which are in turn dependent on the surrounding environment that provides the cues. This has been called the “habit loop” (Duhigg, 2012) and is directly related to the inherent “reward system” of our brain (Graybiel, 1998). In other words, society dictates normality.

Even if “everyone is doing it”, some students may still ask themselves: “Quō vādis?” — Where are you going? That is the look I can see on their faces, when I feel them pondering the meaning of it all. Surely this acquired habit of academic success must take its toll.


Plato proclaims in Politeia: “a free soul ought not to pursue any study slavishly; […] nothing that is learned under compulsion stays with the mind. […] Do not […] keep children to their studies by compulsion, but by play” (Shorey and Bury, 1969). This is obviously the exact opposite of what is happening at most universities today, with parents, guardians and modern society justifying the mandate of “great inconvenience / high payoff” academic education as some sort of “rite of passage” that every young adult must go through before they are formally accepted to be employed by professional establishments. Even though you sometimes hear about the success stories of university – or high school – dropouts, these are merely the few, fortuitous, exceptions to the rule — hence newsworthy.

Picasso has been quoted as saying that “Every child is an artist. The problem is how to remain an artist once he grows up” (Time, 1976). Even though this sounds slightly comical, unfortunately, researchers have found that children tend to lose their creativity as they get older because they are learning to demonstrate non-creative behaviours (Land and Jarman, 1992). Let this sink in for a second — we are actually teaching students how to be less creative. All these conventional facts and proven techniques that we offer them reinforce the rational part of their minds, but their intuitive part ends up falling behind (Lewis, 1986). Who needs a wild imagination when you have tried-and-tested, scientific methodologies?

The educationist Sir Kenneth Robinson (2011) has defined imagination as “the source of our creativity”, creativity as “applied imagination” and innovation as “applied creativity”. Simply put, those who drive change need to be able to think differently, but modern, prescriptive education is potentially depriving us of fresh, inventive thinkers.

I personally do not feel that anyone becomes a teacher to intentionally “lobotomise” young, energetic, minds — so why are we going along with this?

Why We Teach

Why do we teach? Why do we, the teachers, participate in an education mechanism that could possibly be causing the demise of any creativity left in young adults? Some may – unknowingly – quote playwright Bernard Shaw (1919) and spitefully say: “He who can, does. He who cannot, teaches”, thus insinuating that this is the only thing we know how to do, and that there is nothing we can do to change it — or ourselves. However, Shaw (ibid.) also states the maxim: “The reasonable man adapts himself to the world: the unreasonable one persists in trying to adapt the world to himself. Therefore all progress depends on the unreasonable man”. I feel this “unreasonableness” – the constant questioning of accepted “reason” – describes the motivation behind why I wanted to teach in the first place: to communicate what I understand, and to request the input and involvement of a larger audience, thus gradually cultivating an “army” of knowledge-thirsty minds, eagerly looking to comprehend and challenge everything.

Even though my focal subject has mainly been Computing, the truth is I could potentially teach anything I truly understand — it comes naturally. I imagine this applies to many teachers out there, who feel the innate urge to share what they know with others — who wish to become the medium between students and new understanding. In this way, I also see myself as a “connector” — what the literature has called a “broker” (Wenger, 1998), linking together ideas of different practices, aiming to introduce students to new notions that could perhaps expand their horizons even further than they had originally imagined. From referencing principles of philosophy when explaining Computer Networking and Semantic Web models, to familiarising tutees with Teaching and Learning theories while supervising their theses, I hope that this combination of interdisciplinary concepts can bring out previously unexplored potential in students.

This is why, instead of considering “educare” – which means “to train or to mould” – as the Latin source of the word “education”, I prefer the word “educere”, meaning “to lead out”; to identify and draw out the intrinsic aptitudes of each individual (Craft, 1982).

Quality of Service

I must admit, however, that universities present a superb opportunity for teaching students, because the “mechanism” – the “channel” – is already in place. But, one could ask, what is the goal of universities? Is it to provide a service — if so, to whom? Are students customers, choosing courses like products — and institutions like brands? Or, perhaps, are teachers employees with stock options, always aiming to maximise – what the sociologist Pierre Bourdieu would call – their “capital” (Richardson, 1986), by adding more student synergies and more academic publications to their personal, and company, assets?

Just like publicly traded corporations have a stock value, so do universities — it is called “ranking”. In the UK, the Teaching and Research Excellence Frameworks, TEF and REF, have been used as metrics to quantify the “value” of each academic institution, with the aim of helping students decide which one to choose (Baker, 2018; Owen and Smith, 2018). Perhaps this “peer review pressure” is the reason why universities have been accused of fostering an “ivory tower culture” (Etzkowitz et al., 2000), by focusing on disconnected, abstract pursuits. However, I feel that, nowadays, the exact opposite is happening: in some courses, the focus has shifted too much towards the industry, thus neglecting the original intention of education: inquiry and discourse.

In the past, academic work – like PhD theses – had humorously been rebuked for being “nothing but a transference of bones from one graveyard to another” (Dobie, 1945). Today, academic publications are receiving the same treatment, criticised for being an antiquated, non-interactive medium of knowledge transfer (Somers, 2018), with some papers being accepted hastily by journals (Brembs, 2018), or even blindly (Stribling et al., 2005). This arduous emphasis that universities place on a steady publication flow – in order to increase institutional prestige – can cause great stress to academics who need to balance their workload between being successful researchers and effective teachers; and it can also reflect onto students.

Mental Health

You may hear some academics bluntly state: “Assessment is the engine which drives student learning” (QAA Scotland, 2005). This may sound quite “pragmatic”, but allow me to think out loud for a second — lest we forget: a carrot can drive a donkey, and a bell can make a dog salivate (Schachtman and Reilly, 2011). Have we then, conveniently, reduced academic education to “classical conditioning”, as if we were trying to teach beings with whom we are unable to communicate via language? Did we take “Akadimia” and “Lykeion” – Plato’s Academy and Aristotle’s Lyceum – and turn them from a dialectic forum for debates and analyses, into an austere, impersonal, clique of adamant professors and spiritless, apathetic, students?

It saddens me to say that I have witnessed students “begging” for lack of creativity in their learning, asking for straightforward, extremely well-defined questions with absolutely clear and specific answers, solely focusing on how to pass a piece of coursework or an exam, instead of expanding their horizons further. However, I constantly inform my students that there is not just a single right answer to the questions I pose, and most of them seem to enjoy the process — but definitely not all. I also tell students that they do not have to memorise anything they understand, but, even if we follow the latest “trends” in teaching, we cannot fully ascertain that the methodologies used will be appropriate for everyone (Gardner and Martinko, 1996; Pittenger, 2005; Husmann and O’Loughlin, 2018).

Due to the aforementioned pressure that students sustain to succeed in Higher Education – which of course is only part of their hardships (Kataoka et al., 2012) – as well as the stress that academics experience while balancing their multifaceted roles (De Rond and Miller, 2005), both students (Waite and Braidwood, 2016; Evans et al., 2018) and teachers (Bodovski, 2018) can – unfortunately – develop mental health issues; instead of building up, we are breaking down. The philosopher, psychologist and educationist John Dewey (2013) stated, more than a century ago, that it is not about “the waste of money or the waste of things. These matters count; but the primary waste is that of human life, the life of the children while they are at school, and afterward because of inadequate and perverted preparation”. This should not be the case; as an old saying goes, “you [should not] let your schooling interfere with your education” (Allen, 1895).


Fortunately, evolution comes naturally to us humans, so technology has been brought to our aid. Before the multitalented Charles Babbage came up with the idea of his “Analytical Engine”, and the brilliant Ada Lovelace wrote the first machine-based algorithm, “computers” were humans — people who spent all their working hours computing endless – and, potentially, quite monotonous – calculations. I can imagine there are better ways for one to go insane..!


When digital computers formally came along, their original purpose was “to carry out any operations which could be done by a human computer” (Turing, 1950), and due to their large size and cost, they were not meant for personal use. With the invention of microcomputers, most of us got to own – at least – one “computer” of some sort — be it a desktop, a laptop, or the ones in smartphones, cars and fridges. As the Internet became more popular and accessible to consumers, Email, the Web, and other computer network applications enabled us to communicate faster by distance, thus allowing remote working, and even remote learning.

The creation of Learning Management Systems – or Virtual Learning Environments, as they are called here – facilitated learning by providing immediate access to learning objects and resources from anywhere in the world. All you need to achieve this is a network-enabled computing device, plus Internet access for live usage or some sort of storage medium for offline access. Gradually, these systems also started offering entire curricula, in what came to be called “Massive Open Online Courses” — or MOOCs.


All these new technological developments can be quite expensive to produce or follow. Luckily, another important advantage of universities is that most of them can afford to explore, implement and experiment with new tools and ideas (Greenwood and Adams, 2018) — even if this only happens until a research project is deemed financially inviable. This is why Higher Education has traditionally been a fertile breeding ground for new technologies, and for some of the most brilliant minds of modern history.

It has been noted that, compared to the past, not as many people are innovating these days in HE (Cowen, 2017). University research projects are still producing important results, some of which are even commercially exploitable, but the impact is deemed of smaller scale. Innovation is more streamlined nowadays, adhering to a constant academic publication and patent flow (KU Leuven, 2018) and to a successful series of accepted grant proposals. Even though the number of students attending university studies is growing (Adams, 2017), this does not directly translate into a parallel increase in discoveries and advancements. Perhaps the way we teach students is to blame for this — perhaps they are now too many and we are too few; perhaps we need to consider a different approach that could help.

By continually reviewing our methodologies to ascertain that they are innovation-oriented (Blass and Hayward, 2014) and support critical thinking, while also emphasising multidisciplinary collaboration (Satell, 2016) early on in students’ studies and embracing technology to assist us in teaching as effectively as possible, we could potentially make a difference. However, not everyone agrees with this direction.

Olden Days

Vocal opponents of progress in education, like Eric Donald Hirsch Jr. (2016), have suggested that our approach should not be individualised, nor aim “to follow the learning styles and interests of each developing student”, but instead everyone “should study basically the same early curriculum”. Hirsch may have been referring to early stage education, but his lack of “appreciation of how learners grasp concepts” (Derry, 2017) ignores the numerous benefits of modern technological advancements and, more importantly, proposes to enforce this in the most sensitive years of students, thus cementing the concept of learning “en masse” early on.

Even the more sober critics of combining technology with teaching and learning, insist that education is a “profoundly human” endeavour, and that we “cannot” and “should not automate these processes with teaching machines” (Watters, 2014). This resonates perfectly with some teachers who feel that the consideration of embedding AI in education is nothing but “a heavy push to disrupt and diminish the role of teachers as experts” (Singer, 2017). One academic went as far as stating that digital learning is just “like porn for the mind” (Schrager and Wang, 2017), in an attempt to demonstrate that students are paying for the “real thing” — not “merely” electronic content.

Such aphoristic comments remind me of the narrator in George Orwell’s (1949) novel “Nineteen Eighty-Four”, who worried that “Newspeak” would prevail and completely assimilate “Oldspeak”, so much that, by the year 2050, “all real knowledge of Oldspeak will have disappeared” and “the whole literature of the past will have been destroyed”. However, as we are actively seeing all around us today, Orwell’s dystopian vision of a totalitarian “Big Brother” dictating what is allowed, has been complemented by Aldous Huxley’s (1947) endless “distractions”, where people will “adore the technologies that undo their capacities to think” (Postman, 2006) — and the way some educators are currently treating technology, as if it were a nuisance instead of a tool (Khomami, 2017), can be partially blamed for this.

The demise of our current education system does not need to be mourned: its present main form – what has been called the “Factory System” – was originally devised about two centuries ago to generate more workers, not thinkers (Mokyr, 2001; Galor and Moav, 2006), which is exactly the pattern we are noticing here. This is why students, but also parents and employers, can be heard complaining that “teachers do not explain why [students] are being taught a subject” (Mejia, 2017) — it is a “teacher-centred” approach, where learning depends on what a single teacher believes students should learn, based on what is dictated by the university’s expectations, the relevant industry connections, the accreditation bodies, and – of course – the teacher’s skillset. This does not necessarily meet the personal needs of each individual student, and it definitely cannot scale.


One might expect that the critics of “non-conventional” education would mainly be old-school – pun intended – philosophers and educationalists who, like modern day “Luddites”, are raging against “the machine” that is trying to introduce change to their normality. However, you may be surprised to hear that doubters exist even among us technologists..!

Some time ago, a panel of external academics visited the institution I worked for in order to renew the professional accreditation of its Computing programmes, and I was present due to the parallel administrative role I had at the time. During the lunch break, I started chatting with one of the visitors — an Information Visualisation professor from a well-known UK university, and the conversation eventually turned to novel data manipulation techniques. This is when I mentioned that AI is gradually getting smarter and, at some point, it will be able to automatically produce analytical natural language summaries and visualisations (Automated Insights, 2017; Scientific Computing World, 2017; Bishop, 2018; sciNote, 2018), in an attempt to “explain” to us the information hidden within large masses of data — as if it were teaching us. The professor immediately rejected this, smirked awkwardly and walked away to finish his croissant at a different social circle. I tend to have this effect on people.


There you have it: “one of us”, dismissing AI as if it were a temporary fad that is only good for skipping music tracks on request, or driving your car while you sit there watching, full of amazement and fear..! However, I did not feel that this reaction was caused by a “phobia” of technology, since this individual was immersed in tech by profession; it felt like contempt. While some continue treating teaching as if it were “business as usual”, technology advances “in absentia”, without caring whether they will follow or not. Keeping an open mind can potentially allow us to see the benefits and possible contributions of AI to our work as educators.


At the same time, though, we should avoid falling into the – comfortable – trap of full automation; otherwise we will end up becoming “the tools of [our] tools” — as Henry David Thoreau (1854) put it. It may be appealing to automate everything using AI, but do not forget that, as the brilliant Donald Knuth (1984) said, “Computers are good at following instructions, but not at reading your mind” — which – mostly (Nemrodov et al., 2018) – still holds true today.

Any mistakes we make while designing the algorithms, we will pay for when we execute them — even if it takes a while for the problematic behaviours to manifest. We still need to resolve certain issues of cognitive bias in Machine Learning models — unintentionally introduced by their human engineers (Caliskan et al., 2017; Kliegr et al., 2018), and then we can actually have true student inclusivity by design, since AI can learn to adjust to you and your needs, thus personalising the experience of your learning in an appropriate manner.

Inclusivity of all students, regardless of their background, identity and special needs, is a hot issue that academics are actively trying to tackle and address, and the individualisation of our teaching methodologies can potentially achieve this. In the past, I have taught students one-to-one, in personal tutorials, where I was able to adjust my teaching style accordingly, based on the live, direct and indirect feedback I was receiving from the student. This was easy because that one student had my full attention each time; however, this cannot be true, for example, in MOOCs, where student numbers are enormous, while the teachers and their Teaching Assistants are limited.

MOOCs are currently being treated – and marketed – like a “digital version” of an actual university course, because “people are more likely to invest in training if it confers a qualification that others will recognise” (The Economist, 2017) — but this does not have to be the case. Digital Education has so much more to offer than just a few videos, documents, Multiple Choice Question – or peer reviewed – assessments, and – perhaps – a certificate in the end.


Students are sometimes called “learners” in the education literature, but how can you be certain that learning is indeed taking place online — or even offline? Are assessments enough? With the help of AI in online systems, we can monitor specific metrics – for example, each student’s current attention levels (Strayer and Drews, 2007) – and make informed decisions on whether the learning object or teaching methodology we are using at the moment is appropriate for that student at that time, allowing us to dynamically adjust the active lesson. This will need a lot of preparation from our end initially, but ultimately it will be worth the effort.
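
To make this adaptive loop concrete, here is a minimal, purely illustrative sketch in Python. None of these names (LearningObject, select_next) come from a real system; the attention metric is assumed to arrive as a number between 0 and 1 from whatever monitoring the platform performs.

```python
# Hypothetical sketch: switch the active learning object when a
# student's attention metric drops below a threshold.
from dataclasses import dataclass

@dataclass
class LearningObject:
    title: str
    fmt: str         # e.g. "video", "text", "interactive"
    difficulty: int  # 1 (easy) .. 5 (hard)

def select_next(current, alternatives, attention, threshold=0.5):
    """Keep the current object while attention is high; otherwise
    fall back to the easiest alternative in a different format."""
    if attention >= threshold:
        return current
    candidates = [o for o in alternatives if o.fmt != current.fmt]
    if not candidates:
        return current
    return min(candidates, key=lambda o: o.difficulty)

lecture = LearningObject("Graph theory lecture", "video", 4)
options = [LearningObject("Graph quiz", "interactive", 2),
           LearningObject("Graph notes", "text", 3)]

print(select_next(lecture, options, attention=0.8).title)  # stays on the lecture
print(select_next(lecture, options, attention=0.3).title)  # switches to the quiz
```

A real system would, of course, combine many signals and a learned model rather than a single threshold, but the shape of the decision – metric in, adjusted lesson out – is the same.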

AI can also help by marking assessments almost instantly. However, even though identifying and correcting student mistakes in, for example, programming code (Santos et al., 2018) or mathematical procedures (Feldman et al., 2018) is feasible due to their structured – and strict – nature, this can be somewhat more challenging in free text, like in essays and reports; especially when you are looking for the actual meaning the student is trying to convey, instead of just pointing out their grammatical and syntactic errors. Semantic-based AIs have been in development for many decades already (Lenat, 1995), some of which aim to achieve exactly this by also using Natural Language Processing techniques, and we should see some positive results in this domain within our current lifetimes. This can, of course, also be applied offline, if you combine AI with a robotic “shell”, in order to create physical, digital teachers (Bodkin, 2017).
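
The reason structured work is so much easier to mark automatically can be shown in a few lines of Python. This is an illustrative toy, not any of the cited systems: the task (summing a list), the “student submission” and the marking function are all hypothetical.

```python
# Toy autograder: run a student's function and a reference solution
# over the same test cases and report the fraction that match.

def reference_sum(xs):
    return sum(xs)

def student_sum(xs):  # a hypothetical student submission
    total = 0
    for x in xs:
        total += x
    return total

def mark(student_fn, reference_fn, cases):
    """Return the fraction of test cases the submission passes."""
    passed = sum(1 for c in cases if student_fn(c) == reference_fn(c))
    return passed / len(cases)

cases = [[1, 2, 3], [], [-5, 5], [10]]
print(mark(student_sum, reference_sum, cases))  # prints 1.0 for a correct submission
```

Because the expected behaviour is fully specified, the marker needs no understanding of intent — which is precisely what is missing when the “submission” is an essay rather than a function.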

As expected, there are some who fear that making AI “intelligent” enough to understand us, could bring about the Technological Singularity and possibly the end of the entire world! Before you start worrying about a digital apocalypse, however, bear in mind that AI is considered to be nearing a dead-end with its current techniques (LeVine, 2017), so there is still plenty of room to grow. Some researchers are suggesting that if AI is to ever surpass its current levels of “creativity”, it would first need to somehow become self-aware (Patel, 2017). If we ever reach a Technological Singularity and AI becomes fully conscious, I feel it would be very happy to realise that it is helping us achieve such a noble cause as teaching students!



Between today and the time when AI and robots take over the world, we will need to find a way to live with them. It has been said that “not only should we listen to the machine; we should ask it to figure out what we want” (Friend, 2018), which agrees with a philosophy that some have called “Dataism” (Lohr, 2015; Harari, 2017), where algorithms will make decisions for you. This is already happening, to a certain degree, every time you apply for a credit card, a loan, an insurance policy, or even a job. An algorithm is also followed to decide which student will be accepted by a university — even if it requires heavy human involvement at the moment.

With the onslaught of AI on education, and the tremendous potential of Digital Education technologies, universities will enter a phase of “Creative Destruction” — what is also known as Schumpeter’s (1943) “gale”. During this period, academics will have to learn to work with AI in teaching their modules, and students will have to adjust to having AI monitoring their progress or even assessing their work. The old methods will be assimilated into the new — again and again. However, we need to ensure that our tools will have a human-centred approach, so that they gradually help students realise how to “exercise their creativity and venturesome spirit in ever-new and challenging environments” (Phelps, 2013) — otherwise, they will entrap us.

Some say that this could potentially lead to a global collaborative society where everyone gets paid a basic income, one which will eventually require zero work from its citizens — other than tending to their well-being (Mason, 2016). However, it is estimated that it could take up to fifteen years for the results of these developments to become measurable (Srnicek and Williams, 2015), so time will tell.

Pros & Cons

What we can already foresee at this point is that we can use AI-enabled Digital Education systems to help us work with a large number of students from all over the world. The Internet, the Web, as well as the large variety of personal and portable computing devices, have provided us with a medium that can be exploited to spread our theories, expertise, skills, ideas and thoughts farther than ever before. Furthermore, instead of just forwarding the exact same message to every single recipient, with the help of AI we can personalise the teaching and learning process by monitoring specific metrics, which can also allow us to identify omissions or required modifications to the content we provide, thus aiming to increase student understanding.

However, in parallel to pursuing this vision, we need to rethink how we “teach” AI (Pearl and Mackenzie, 2018), before AI will be able to teach us in a “student-centred” manner. Otherwise, we will once again make the same mistakes as before — only this time on an accelerated and even more universal scale. AI should not just be a mere replacement for human teachers — as you can tell, we have many traits that are not so… enviable; instead, it should become a powerful companion in our efforts to perfect our practices, enabling us to decentralise our teachings, while deeply considering and exploring all the different approaches we can use to communicate our knowledge.

Of course, before this happens, the AI “hype” could wear off again — just like it did in the 1970s (Marcus, 2018), and then this entire talk would have been all for naught!


Everything comes and goes in waves — always moving up and down, backwards and forwards — so synchronicity plays an important role here; hence I am here, talking to you about this right now, hoping that you will find something interesting in my words that you can use and perhaps expand even further in your own environments.


Anything new can be daunting at first, so it is only normal to be sceptical towards neoteric views and approaches — especially if they directly involve us having to adopt and accept a new “status quo”. Things that we take for granted these days were once considered to be “mystical”, “esoteric”, or even “magical”! A few centuries ago, who would have believed that little, invisible, organisms could make you ill, or that you could bring light into a room with the flip of a switch, or that you could travel long distances without having to swap out your horses — or even without horses? Regardless, antibiotics prevailed against home-made concoctions, electricity against candlelight, trains against carriages, and microcomputers against – well – everything..!

In the same manner, AI is increasingly pervading more and more domains, with the original intent being to assist us in speeding up our tasks, though it can eventually end up taking over some sectors entirely. The only way we can successfully cope with these changes is to be prepared to adjust to a new reality, where we get to work together with our “virtual apprentices” in order to achieve the same, or even better, results. This is the reason why the importance of information literacy has been continuously emphasised globally (ALA/ACRL, 2000; UNESCO/IFLA, 2005; Secker and Coonan, 2011), thus demonstrating that this skill can nowadays be considered as important as being able to read, write and count.

So, why do we then still need to spend “eighteen minutes” talking, trying to convince everyone that “this is indeed happening right now — like it or not, so you better deal with it, instead of burying your head in the sand”?


You may have noticed that my description of Artificial Intelligence successfully passed through all “Five Stages of Grief” (Kübler-Ross, 2014)..!


First comes denial, by clinging to the Olden Days, when everything was “simpler” and “purer” — as we tend to picture a romanticised version of the past, briefly forgetting how frustrating and time-consuming it was, for example, finding information before “smart” Search Engines (Jones, 2016), or trying to see a human GP (UCLH, 2018). Then, this denial turns into anger — a sudden Technophobia full of disdain, led by our fear of, and aversion to, changing our ways, which would require our continuous exertion — yet no one asked us whether we are willing to participate. At some point, the bargaining begins, while we are trying to see if we can potentially use technology and AI as a Tool to enhance our profession and ourselves. As soon as we realise the full extent of how different things will be from now on, though, the depression comes, inciting the belief that this will lead to a technological Singularity and to the end of our profession — or even of the entire world! However, eventually acceptance kicks in, as you start rethinking society based on new, Postcapitalistic, standards, gradually shifting away from the past, and getting used to a new reality, which – at some point – you will once again have to surpass.

The chapter that immediately follows this model of grief in the original book mentions that “the one thing that usually persists through all these stages is hope” (Kübler-Ross, op. cit.), thus pointing out that – for better or for worse – humans are optimistic by nature (Sharot et al., 2011), which is what gets us going. I truly feel, however, that AI in education, as the philosopher Arthur Schopenhauer (2010) once wrote, “will fully share the fate that Truth has met within every branch of Knowledge […], that of being granted only a short victory celebration between the two long periods of time when it is condemned as paradoxical or disparaged as trivial”…


But what is actually happening at the moment in the field of education? Perhaps you have heard the parable of the blind men and the elephant. Each of the men tried to identify the animal by touching a different part of its body, which made each of them end up with a different perception of what it looked like, leading to long disputes even though no one actually had the whole picture (Saxe, 1875). It is true that we all see things differently, since reality can be considered a “subjective experience” (Nagel, 1974), and this is fine, as long as we can all agree that no one is entirely correct and that our opinions have a complementary value that should be explored instead.

However, you cannot run a viable – and profitable – business if you dare say “do not just listen to me — everyone is a little bit right”..! Institutions (HEA, 2018), organisations (OECD, 2018) and societies (Ross, 2012) that design frameworks of suggested best practice – as a shared point of reference for educationists – focus on what they think the world needs right now — from their perspective. As “Conway’s Law” goes: “organizations which design systems […] are constrained to produce designs which are copies of the communication structures of these organizations” (Conway, 1968). Just like the blind men with the elephant, strict, explicit, frameworks cannot – by design – address the whole picture, because they are forced to make assumptions and generalisations in order to produce a finite, measurable, model.

You see, we may prefer to perceive university students “traditionally” (Choy, 2002), as youngsters who recently joined adulthood after finishing high school education and are now looking to achieve the next level of studies before initiating the hunt for the perfect job. In reality, however, even at undergraduate level, we have numerous examples of students who do not fit this pattern; who perhaps waited a few years before coming to study — maybe even getting a temp job for a while (Ross-Gordon, 2011). This changes people — it changes their perspective on life, and makes them redefine their goals. Of course, this is even truer in postgraduate education, where students could be of any age and in various life phases: working full-time in parallel to their studies, married and looking for a better job; maybe with a few newborn children tottering around the house — having to deal with coursework and changing diapers all at once!


We are approaching education reform as a standardisation process that aims for the “golden mean” — which is impossible. There is no ultimately perfect, universal, approach to designing the ideal “one-size-fits-all” course that could meet the needs of all students (Gkoutzis, 2017). We must think in an entirely novel manner if we wish to break free from our past confines.


Allow me to try and read your minds for a minute. I understand how my “J’Accuse…!” may sound to some of you. It is like having a large group of highly trained individuals inside a building, dedicating years and years to discussing what colour the new wallpaper should be and how to move the furniture around without causing too much of an inconvenience, when suddenly I show up with a sledgehammer, ready and willing to take down some of these walls altogether.


Maslow’s (1966) Law of the Instrument declares that: “if the only tool you have is a hammer, [you] treat everything as if it were a nail”, so one could argue that, since my “hammer” is technology, I am cognitively biased to perceive all solutions to problems via a technological perspective. However, what I am suggesting here is not merely my solution, or even my proposal, to the stated problem of outdated teaching methodologies, but what groups and individuals from all over the world have been working on for many, many decades already.

In the second half of the 20th century, when computers were only just beginning to reveal their endless capabilities, numerous systems started appearing for supporting computer-based collaborative learning and working (Resta and Laferrière, 2007). At the time, researchers were hoping that AI could eventually “transform machines from passive agents that process and present information, to active agents that enhance interactions” in electronic classrooms, while also acknowledging the challenge of ensuring this happens “in a way that is procedurally and socially desirable to the participants” (Ellis et al., 1991). Others envisioned future, AI-powered, classrooms where the role of the teacher would be “a more conceptual one, with less concern for repetitive examples” (Carbonell, 1970); where students would learn how to use information management systems as “cognitive resource” locators (Pea, 1985), instead of simply memorising endless facts – most of which they would eventually forget (Miller, 1956). These AI “expert systems” were expected to automate the “transfer of expertise”, thus establishing “Intelligent Computer-Aided Instruction” tutoring systems, both as communicators of knowledge and as assessors of the students’ answers (Clancey, 1987); in other words, everything we have been discussing all this time.

We are well into the 21st century, but I still do not see any university courses taught by AI agents, nor any modules on “Search Engine based History 101”. So, what is keeping us behind?


It has been said that “there is always a well-known solution to every human problem — neat, plausible, and wrong” (Mencken, 1920), so I am not here to prove beyond doubt that this is the definitive way to go. Similar to the questions I ask my students, there is not just a single right answer to these issues either — otherwise, they would have already been resolved.

When faced with the “digital dilemma”, many academics initially worry about what would happen to the reputation of their institution if the online version of their courses did not perform as expected — an issue of prestige. However, “non-learning” in Higher Education (Kinchin et al., 2008) can also occur “offline”, so going “digital” is actually a risk worth taking — sooner rather than later.

In addition to teaching the course material, universities also officially aim to inculcate students with lessons on ethics, morals and values. We may like to think that we do not treat students “in loco parentis” — as if we were replacing their parents, but we still attempt to instil extra-curricular skills in them. Loss of the – positive – hidden curriculum (Wren, 1999) — the good “side-effects” of studying in a real-life environment, like socialising and learning from each other, is an important issue that does need to be considered while implementing these changes.

With the help of 3D avatars and Virtual Reality technologies, we can have “live and interactive meetings and lectures” online (Brown and Green, 2016), simulating a virtual campus comprised of students who perhaps would never have met otherwise, had it not been for the digital system. Organising student meetings based on localised “microcampuses” could also engender new relationships and further energise the determination of system participants. In this way, we can attempt to get “the best of both worlds”, as much as that is possible.

Another potential danger of the Ed/Tech marriage is that of the “devious” return of classical conditioning — this time under a “high-tech” guise. You may have heard the phrase “repetition is the mother of learning”, and this actually holds some truth. Researchers have found that repetition can indeed have quite immediate positive effects on the memorisation of studied material (Shtyrov et al., 2010), even though reviewing is still needed in order to ascertain prolonged retention (Reynolds and Glaser, 1964). This is even more important when studying more practical subjects, something that was highlighted more than two thousand years ago, when the Chinese philosopher Xun Kuang stated that simply “knowing” something “is not as good as putting it into practice” (Hutton, 2014).

Computers can be used to teach, or train, students to acquire certain skills via passive haptic repetition, which means utilising an AI to physically force you to repeat the same thing again and again, until it becomes a habit for you. However, this “muscle memory”, as it is sometimes called, can be misleading if student expectations are not properly managed by an instructor (Lee et al., 1994) or even by the system itself. We may be able, for example, to teach students how to type in Morse code (Seim et al., 2016), or even how to play a piece of music on the piano (Huang et al., 2008), but this does not necessarily constitute true learning — it cannot ensure that students have understood the underlying mechanism of this gnosis, allowing them to suddenly start composing creative musical pieces. Hence, it is no different from simply memorising some handouts just to pass an exam, without actually digging deeper into the meaning behind each presented concept. To prevent this, AI tutoring systems should encourage students to produce their own content, which can perhaps also be reused by the AI itself — when technology catches up.

This reveals another potential danger of Digital Education systems — staleness. If tutors think that they will create the material once and then just forget all about it, pretending it has now become an “automated money-making machine” of some sort, they are gravely mistaken. Just think of human teachers without any Continuous Professional Development — even if they lecture on the most well-established and unchanging domains, after a while they start sounding like “broken records”..! If we keep reusing the same knowledge-base to teach a topic, without any corrections, additions or expansions, then these digital systems will end up being no different than those teachers you may have met a few times already, who – choose to – only know how to say one thing in one way, and then clock off as soon as the bell rings..!


Until AI Teaching systems learn how to update themselves, their human counterparts should be prepared to continue enriching and amending the provided digital content, as needed. This may sound like a lot of additional work but, as mentioned before, in the long run, our “virtual apprentices” will end up enhancing our teaching in ways we would not have been able to realise within a single lifetime — while maintaining our mental health.

If these Digital Education systems are indeed as successful as we hope they will be, one final, yet important, concern is that, eventually, there will really be no reason for not knowing something — given enough time, of course. When someone asks a question these days, they may receive the response: “hey, why don’t you search for this on-line, instead of asking me?” As Search Engines grew larger, faster, and easier to use, answering straightforward questions became a form of “reinventing the wheel” — if you will. Always-on, widely available AI Teaching Systems could introduce the same “side-effect”, this time to entire subjects and skills, leaving little to no excuse for not having learned something before you needed it — unless you had not received adequate prior notice. This could bring forth a new “generation gap” — but not between the so-called “digital natives”, who were born into technology, and the “digital immigrants”, who had to “re-learn” even widely accepted concepts (Prensky, 2001), since, by the time it emerges, almost everyone involved will have grown up with some form of digital technology as part of their lives. No, this time it will be between those who have managed to adapt to the new ways of teaching and learning, and those who thought that technology is “great and all”, but it will never affect “_______” — enter whichever word you wish here.


We can choose to pretend that everything will always stay exactly the same, ignoring the popular Ancient Hellenic maxim that states: “all things move and nothing remains still” (Fowler, 1926). However, unless a global catastrophe takes place, wiping humanity off the map — or frying all of our digital circuits and permanent data storage devices, then rest assured that “technology is going to replace jobs” as well as “the people holding those jobs” and, in the end, “few industries, if any, will be untouched” (Pistrui, 2018). We will need to stop “resting on our laurels”, thinking that we do not have to try any harder because “we are the only ones who can do our job”, and realise that AI will eventually learn how to do what we do — regardless of whether that is computer programming and mathematics, or acting and counselling.

The psychologist Lev Vygotsky, paraphrasing the poet Alexander Pushkin, remarked that “imagination is as necessary in geometry as it is in poetry” (Vygotsky et al., 1994). Even though “Creative Learning” tends to be intrinsically related to artistic expression (Mansfield et al., 1978), it has been found that the most effective creativity training programmes focus on the development of cognitive skills and their application, while at the same time remaining “subject to revision and extension” (Scott et al., 2004). Being creative with the common, somewhat artistic, meaning, is not adequate any longer. Each individual needs to take initiative and manage their own learning (Roll et al., 2014), in this student-led, “creative, collaborative, critical, and communicative”, learning environment (Cobcroft et al., 2006; Bruns, 2011), and go after what they want to learn, as soon as they want to learn it.

This approach is called “Just-In-Time”, and provides more flexibility for everyone involved, but it demands high motivation and self-regulation from the student. With the help of AI Teaching Assistants (Maderer, 2016), always available to answer most student questions, we can focus on the teaching and the learning instead of worrying about the logistics. With the semantic allocation of prerequisite knowledge, interconnected AI systems can automatically guide students in their quest for new knowledge, steering them in relevant directions, because they understand how each student learns. This concept of “learning how we learn” has been called “metalearning” and requires students to be aware of their motives for studying a subject, in order for them to have creative “control over their strategy selection and deployment” (Biggs, 1985), instead of the teacher being the “supreme authority” deciding what and how an entire class of students should learn. In other words, students should directly participate in the creation of their curriculum, so that – when the time comes for “reaping what they sowed” – they will achieve exactly what they strove for.
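The semantic allocation of prerequisite knowledge can be pictured as a directed graph of topics, where a system only recommends topics whose prerequisites the student has already mastered. The sketch below is a toy, hypothetical illustration — the topic names and graph are my own, not a real curriculum:

```python
# A minimal sketch of prerequisite-driven guidance: topics form a
# directed graph, and only topics whose prerequisites are all
# mastered are suggested next. The topics below are hypothetical.

prerequisites = {
    "arithmetic": [],
    "algebra": ["arithmetic"],
    "calculus": ["algebra"],
    "statistics": ["algebra"],
}

def next_topics(mastered):
    """Topics not yet mastered whose prerequisites are all mastered."""
    mastered = set(mastered)
    return sorted(
        topic for topic, reqs in prerequisites.items()
        if topic not in mastered and set(reqs) <= mastered
    )

# A student who has mastered arithmetic and algebra is steered
# towards calculus and statistics next.
print(next_topics({"arithmetic", "algebra"}))  # prints ['calculus', 'statistics']
```

An actual system would, of course, infer the “mastered” set from assessment data rather than take it as given, but the steering logic remains this simple graph traversal.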

In the end, for students and teachers alike, it is a choice between “Irrelevance” and “Redefinition”. We must learn to learn in new ways, continuously integrating new realities by informing our practice based on the latest developments. This is not going to happen overnight; it will take time — but it will happen.


All this sounds great — in theory, but what can we do with it right now? Well, unfortunately, not that much — yet. However, do not lose hope! We are currently in a – long – transitional phase, where tangible changes are taking place, and we will get to be one of the first few generations to make the best of what AI in Education has to offer.

The process of “Learning Design” (O’Reilly, 2004; Laurillard, 2016), which aims to define the entire pedagogical experience, must start with teachers producing formalised, shareable, techniques (Dalziel et al., 2016) that can be adjusted and reused by others, even for courses of different domains — including those offered by networked AI agents, that should operate in an interdisciplinary manner (Graesser et al., 1999). These designs should adopt a variety of learning methodologies, in order for the student to be able to shift between them while trying to understand a specific concept. Many of the learning theories that were originally meant for “brick-and-mortar teaching” – if you allow the expression – have the potential to be utilised in a digital environment — with some adjustments (Hsiao, 1998). In this way, personalised learning can become a reality, fulfilling the hopes and expectations of interested stakeholders (Weller, 2017; Pape and Vander Ark, 2018).

Good progress has already been made towards this goal (Olney et al., 2012), with private companies creating platforms that map concepts in “Knowledge Graphs” (Wilson and Nichols, 2015) which then employ “Adaptive Learning” techniques (Waters, 2014) to adjust the teaching methods to each student, according to the perceived understanding — or misunderstanding. A very promising computer-based education system, “ALEKS” — short for “Assessment and Learning in Knowledge Spaces”, uses hard math to define and prove its “Knowledge Space Theory” approach (Falmagne and Doignon, 2011), which enables the system to safely identify and suggest the best possible, personalised, pathway that each student should follow to learn a specific topic. In addition to tailored teaching, education publishing companies have already started working on automated marking solutions (Giles, 2011), while researchers are also exploring AI generated assessments (Yuan et al., 2017), all of which could eventually allow us to offer a more holistic Digital Education approach to students.
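To give a flavour of the Knowledge Space Theory behind systems like ALEKS, without reproducing its actual mathematics: a “knowledge structure” is a family of feasible sets of mastered items, and a student in state K is ready to learn exactly the items in K’s “outer fringe” — those items q for which K together with q is also a feasible state. The items and states below are a toy, hypothetical example of my own, not taken from ALEKS itself:

```python
# A minimal sketch of the Knowledge Space Theory idea: the feasible
# "knowledge states" form a family of item sets, and the "outer
# fringe" of a state holds the items the student is ready to learn.
# The items (a, b, c) and states here are hypothetical illustrations.

states = [
    frozenset(),
    frozenset({"a"}),
    frozenset({"a", "b"}),
    frozenset({"a", "c"}),
    frozenset({"a", "b", "c"}),
]

def outer_fringe(state, states):
    """Items q not in the state such that state ∪ {q} is also a state."""
    items = set().union(*states)
    return sorted(
        q for q in items - state
        if state | {q} in states
    )

# A student whose knowledge state is {a} can tackle either b or c next.
print(outer_fringe(frozenset({"a"}), states))  # prints ['b', 'c']
```

In practice the system must also infer which state the student is in from a short adaptive assessment, which is where the heavier mathematics of Falmagne and Doignon’s theory comes in.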

Even though these online education systems are still mainly privately managed and currently have a limited number of – mostly STEM – courses, I envisage a world where anyone can teach anything they know, and learn anything they want to know, in a personalised “Just-In-Time” manner. This customised “on-demand” and “always-on” approach to studying can allow motivated students to learn what they truly care about – regardless of entry criteria, distances, or time zones –, while also empowering Continuous Professional Development with the help of an AI that remembers you, and can quickly figure out what you can recall and what else you have learned in the meantime by yourself. There is thus no need to antagonise AI — we can work together to cultivate students who are creative and critical thinkers, ready and willing to introduce positive change to our world.


If you visited the Dark Cave in Malaysia, you would be welcomed by a large banner displaying a statement that Baba Dioum – a Senegalese forestry engineer – addressed to the International Union for the Conservation of Nature and Natural Resources back in 1968: “In the end, we will conserve only what we love, we will love only what we understand, and we will understand only what we are taught” (Valenti and Tavana, 2005).

The most important point to take with you from this talk – regardless of whether you are acting as a teacher or a student – is: “keep an open mind”; history repeats itself — only the names change. To break the cycle of deleterious reoccurrences, we must be prepared to let go of our acquired habits, making room for awakening and ascension.

Our utmost goal as teachers should be to become the “yāna” — which is Sanskrit for “vehicle” or “path” (Walser, 2009), and thus help students go on their way of further perfecting themselves. To achieve this, we should make the best possible use of all the tools available to us, regardless of any negative preconceptions fuelled by the fear of our collapsing habits. We must first enhance ourselves before we can enhance others, and technology can help us realise this even more efficiently. Embrace this potent medium, and utilise it in the pursuit of our synergetic transcendence.

Thank you very much.



All images in the public domain by the British Library.


Adams, R. (2017). Almost half of all young people in England go on to Higher Education. [online] the Guardian. Available at: [Accessed 01 Jul 2018].

ALA/ACRL. (2000). Information literacy competency standards for Higher Education. Community & Junior College Libraries, 9(4), pp. 2-4.

Allen, G. (1895). The Woman Who Did. John Lane, London, p. 15.

Automated Insights (2017). Overcoming Dashboard Deficiencies: Using Automation to Gain Clear, Actionable Insights. [online] Available at: [Accessed 01 Jul 2018].

Baker, S. (2018). ‘New elite’ emerges as UK ranking combines TEF and REF. [online] Available at: [Accessed 01 Jul 2018].

Biggs, J.B. (1985). The role of meta-learning in study process. British Journal of Educational Psychology, 55, pp. 185-212.

Bishop, T. (2018). Tableau acquires MIT AI spinoff Empirical Systems, opens new R&D center in Boston area. [online] GeekWire. Available at: [Accessed 01 Jul 2018].

Blass, E. and Hayward, P. (2014). Innovation in Higher Education; will there be a role for “the academe/university” in 2025?. European Journal of Futures Research, 2:41.

Bodkin, H. (2017). ‘Inspirational’ robots to begin replacing teachers within 10 years. [online] The Telegraph. Available at: [Accessed 01 Jul 2018].

Bodovski, K. (2018). Why I Collapsed on the Job. [online] The Chronicle of Higher Education. Available at: [Accessed 01 Jul 2018].

Brembs, B. (2018). Prestigious Science Journals Struggle to Reach Even Average Reliability. Frontiers in human neuroscience, 12, p. 37.

Brown, A. and Green, T. (2016). Virtual reality: Low-cost tools and resources for the classroom. TechTrends, 60(5), pp. 517-519.

Bruns, A. (2011). Beyond Difference: Reconfiguring Education for the User-Led Age. Digital Difference. SensePublishers, pp. 133-144.

Caliskan, A., Bryson, J.J. and Narayanan, A. (2017). Semantics derived automatically from language corpora contain human-like biases. Science, 356(6334), pp. 183-186.

Carbonell, J.R. (1970). AI in CAI: An artificial-intelligence approach to computer-assisted instruction. IEEE transactions on man-machine systems, 11(4), pp. 190-202.

Choy, S. (2002). Nontraditional undergraduates: Findings from the condition of education 2002. Institute of Education Sciences, U.S. Department of Education. pp. 2-3.

Clancey, W.J. (1987). Knowledge-Based Tutoring: The GUIDON Program. The MIT Press, Cambridge, Mass, pp. 1-13.

Cobcroft, R.S., Towers, S.J., Smith, J.E. and Bruns, A. (2006). Mobile learning in review: Opportunities and challenges for learners, teachers, and institutions. In Proceedings Online Learning and Teaching (OLT) Conference 2006, pp. 21-30.

Conway, M.E. (1968). How do committees invent?. Datamation, 14(4), pp. 28-31.

Cowen, T. (2017). Larry Summers on Macroeconomics, Mentorship, and Avoiding Complacency (Ep. 28 — Live). [online] Medium. Available at: [Accessed 01 Jul 2018].

Craft, M. (1982). Education for Diversity: The Challenge of Cultural Pluralism. University of Nottingham, School of Education, p. 5.

Dalziel, J., Conole, G., Wills, S., Walker, S., Bennett, S., Dobozy, E., Cameron, L., Badilescu-Buga, E. and Bower, M. (2016) The Larnaca Declaration on Learning Design. Journal of Interactive Media in Education, 2016(1):7, p. 22.

De Rond, M. and Miller, A.N. (2005). Publish or perish: bane or boon of academic life?. Journal of Management Inquiry, 14(4), pp. 321-329.

Derry, J. (2017) Why knowledge matters: rescuing our children from failed educational theories. By E. D. Hirsch, Jr., British Journal of Educational Studies, 65:4, pp. 517-519.

Dewey, J. (2013). The school and society and the child and the curriculum. University of Chicago Press, p. 64.

Dobie, J.F. (1945). A Texan in England. Little, Brown, p. 26.

Doin, G. and Guzzo, V. (2012). La Educación Prohibida (The Forbidden Education). Argentina: Eulam Producciones. Recuperado de, 02:07:07-02:09:16.

Duhigg, C. (2012). The power of habit: Why we do what we do in life and business. Random House, p. 12.

Dylan, B. (2014). The Lyrics: Since 1962. Simon and Schuster, p. 104.

Ellis, C.A., Gibbs, S.J. and Rein, G. (1991). Groupware: some issues and experiences. Communications of the ACM, 34(1), pp. 39-58.

Etzkowitz, H., Webster, A., Gebhardt, C. and Terra, B.R.C. (2000). The future of the university and the university of the future: evolution of ivory tower to entrepreneurial paradigm. Research policy, 29(2), pp. 313-330.

Evans, T.M., Bira, L., Gastelum, J.B., Weiss, L.T. and Vanderford, N.L. (2018). Evidence for a mental health crisis in graduate education. Nature biotechnology, 36(3), p. 282.

Falmagne, J.C. and Doignon, J.P. (2011). Learning spaces: Interdisciplinary applied mathematics. Springer Science & Business Media, pp. 359-374.

Feldman, M.Q., Cho, J.Y., Ong, M., Gulwani, S., Popović, Z. and Andersen, E. (2018). Automatic Diagnosis of Students’ Misconceptions in K-8 Mathematics. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, p. 264.

Fowler, H.N. (1926). Plato: Cratylus. Parmenides. Greater Hippias. Lesser Hippias, Cambridge (MA): LOEB, pp. 66-67.

Friend, T. (2018). How Frightened Should We Be of A.I.?. [online] The New Yorker. Available at: [Accessed 01 Jul 2018].

Galor, O. and Moav, O. (2006). Das Human-Kapital: A theory of the demise of the class structure. The Review of Economic Studies, 73(1), pp. 85-117.

Gardner, W.L. and Martinko, M.J. (1996). Using the Myers-Briggs Type Indicator to study managers: A literature review and research agenda. Journal of Management, 22(1), pp. 45-83.

Giles, J. (2011). Automated marking takes teachers out of the loop. [online] New Scientist. Available at: [Accessed 01 Jul 2018].

Gkoutzis, K. (2017). (Mis)Understanding Universal Design. [online] Available at: [Accessed 01 Jul 2018].

Graesser, A.C., Wiemer-Hastings, K., Wiemer-Hastings, P., Kreuz, R. and Tutoring Research Group (1999). AutoTutor: A simulation of a human tutor. Cognitive Systems Research, 1(1), pp. 35-51.

Graybiel, A.M. (1998). The basal ganglia and chunking of action repertoires. Neurobiology of Learning and Memory, 70(1-2), pp. 119-136.

Greenwood, X. and Adams, R. (2018). Oxford and Cambridge university colleges hold £21bn in riches. [online] the Guardian. Available at: [Accessed 01 Jul 2018].

Gummere, R.M. (1925). Seneca ad Lucilium epistulae morales III, p. 223.

Harari, Y.N. (2017). Homo Deus: A Brief History of Tomorrow. Vintage, pp. 427-439.

Hartocollis, A. (2018). Harvard Rated Asian-American Applicants Lower on Personality Traits, Suit Says. [online] The New York Times. Available at: [Accessed 01 Jul 2018].

Harvard Gazette. (2018). Record 42,742 apply to Harvard College Class of ’22. [online] Available at: [Accessed 01 Jul 2018].

HEA. (2018). Higher Education Academy frameWORKS – A shared point of reference for the sector. [online] Available at: [Accessed 01 Jul 2018].

Hirsch, E. D., Jr. (2016). Why Knowledge Matters: Rescuing Our Children from Failed Educational Theories. Harvard Education Press, p. 7.

Hsiao, J.W.D.L. (1998). The impact of reflective facilitation on middle school students’ self-regulated learning and their academic achievement in a computer-supported collaborative learning environment. The University of Texas at Austin, pp. 18-46.

Husmann, P.R. and O’Loughlin, V.D. (2018). Another nail in the coffin for learning styles? Disparities among undergraduate anatomy students’ study strategies, class performance, and reported VARK learning styles. Anatomical Sciences Education.

Hutton, E.L. (2014). Xunzi: The complete text. Princeton University Press, p. 64.

Huxley, A. (1947). Brave New World. Chatto & Windus, London, p. 66.

Jones, N. (2016). AI science search engines expand their reach. [online] Nature. Available at: [Accessed 01 Jul 2018].

Kataoka, S., Langley, A.K., Wong, M., Baweja, S. and Stein, B.D. (2012). Responding to students with posttraumatic stress disorder in schools. Child and Adolescent Psychiatric Clinics, 21(1), pp. 119-133.

Khomami, N. (2017). A tool or a distraction? How UK schools’ approaches to mobile phones vary widely. [online] the Guardian. Available at: [Accessed 01 Jul 2018].

Kinchin, I.M., Lygo-Baker, S. and Hay, D.B. (2008). Universities as centres of non-learning. Studies in Higher Education, 33(1), pp. 89-103.

Kliegr, T., Bahník, Š. and Fürnkranz, J. (2018). A review of possible effects of cognitive biases on interpretation of rule-based machine learning models. arXiv preprint arXiv:1804.02969.

Knuth, D.E. (1984). The TeXbook. Addison-Wesley Professional, Ch. 3 p. 9.

KU Leuven (2018). KU Leuven once again tops Reuters ranking of Europe’s most innovative universities. [online] Available at: [Accessed 01 Jul 2018].

Kübler-Ross, E. (2014). On Death & Dying: What the Dying Have to Teach Doctors, Nurses, Clergy & Their Own Families. Scribner Book Company, pp. 37-134.

Land, G. and Jarman, B. (1992). Breakpoint and beyond: Mastering the future today. New York: Harper Business, pp. 153-156.

Laurillard, D. (2016). Foreword. In: Dalziel, J. Learning Design: Conceptualizing a framework for teaching and learning online. Routledge, vii-x.

Lee, T.D., Swinnen, S.P. and Serrien, D.J. (1994). Cognitive effort and motor learning. Quest, 46(3), pp. 328-344.

Lenat, D.B. (1995). CYC: A large-scale investment in knowledge infrastructure. Communications of the ACM, 38(11), pp. 33-38.

LeVine, S. (2017). Artificial intelligence pioneer says we need to start over. [online] Axios. Available at: [Accessed 01 Jul 2018].

Lewis, L.H. (1986). Theater: A catalyst for dialogue and action. New Directions for Adult and Continuing Education, 1986(30), pp. 91-98.

Lohr, S. (2015). Data-ism. Oneworld Publications, pp. 207-215.

Maderer, J. (2016). Artificial Intelligence Course Creates AI Teaching Assistant. [online] Available at: [Accessed 01 Jul 2018].

Mansfield, R.S., Busse, T.V. and Krepelka, E.J. (1978). The effectiveness of creativity training. Review of Educational Research, 48(4), pp. 517-536.

Marcus, G. (2018). Deep Learning: A Critical Appraisal. arXiv preprint arXiv:1801.00631.

Maslow, A.H. (1966). The Psychology of Science: A Reconnaissance. Gateway / Henry Regnery, p. 15.

Mason, P. (2016). Postcapitalism: A guide to our future. Macmillan, pp. 125-134.

Mejia, Z. (2017). Elon Musk: This simple question can help fix what’s wrong with the U.S. education system. [online] CNBC. Available at: [Accessed 01 Jul 2018].

Mencken, H.L. (1920). Prejudices: Second Series. Jonathan Cape, London, p. 158.

Miller, G.A. (1956). The magical number seven, plus or minus two: Some limits on our capacity for processing information. Psychological Review, 63(2), p. 81.

Mokyr, J. (2001). The Rise and Fall of the Factory System: Technology, firms, and households since the Industrial Revolution. In Carnegie-Rochester Conference Series on Public Policy (Vol. 55, No. 1), pp. 1-45.

Nagel, T. (1974). What is it like to be a bat? The Philosophical Review, 83(4), pp. 435-450.

Nemrodov, D., Niemeier, M., Patel, A. and Nestor, A. (2018). The neural dynamics of facial identity processing: Insights from EEG-based pattern analysis and image reconstruction. eNeuro, pp. ENEURO-0358.

O’Reilly, M. (2004). Educational design as transdisciplinary partnership: supporting assessment design for online. Beyond the comfort zone, Proceedings of the 21st ASCILITE Conference, Australasian Society for Computers in Learning in Tertiary Education, pp. 724-733.

OECD. (2018). The future of education and skills: Education 2030. [online] Available at: [Accessed 01 Jul 2018].

Olney, A.M., D’Mello, S., Person, N., Cade, W., Hays, P., Williams, C., Lehman, B. and Graesser, A. (2012, June). Guru: A computer tutor that models expert human tutors. In International Conference on Intelligent Tutoring Systems. Springer, Berlin, Heidelberg, pp. 256-261.

Orwell, G. (1949). Nineteen Eighty-Four. Secker & Warburg, p. 68.

Owen, N. and Smith, M.E. (2018). TEF-REF ranking marks rise of ‘new elite’ in UK Higher Education. [online] Available at: [Accessed 01 Jul 2018].

Pape, B. and Vander Ark, T. (2018). Policies and Practices That Meet Learners Where They Are. [online] National Center for Research in Advanced Information and Digital Technologies – Digital Promise. Available at: [Accessed 01 Jul 2018].

Parker, S.T., Langer, J. and Milbrath, C. eds. (2014). Biology and knowledge revisited: From neurogenesis to psychogenesis. Psychology Press, p. 36.

Patel, N.V. (2017). A Radical New Theory Could Change the Way We Build Artificial Intelligence. [online] Inverse. Available at: [Accessed 01 Jul 2018].

Pea, R.D. (1985). Beyond amplification: Using the computer to reorganize mental functioning. Educational Psychologist, 20(4), pp. 167-182.

Pearl, J. and Mackenzie, D. (2018). The Book of Why: The New Science of Cause and Effect. Hachette UK, p. 37.

Phelps, E.S. (2013). Mass flourishing: How grassroots innovation created jobs, challenge, and change. Princeton University Press, p. 143.

Piaget, J. (1971). Biology and knowledge: An essay on the relations between organic regulations and cognitive processes, p. 19.

Pistrui, J. (2018). The Future of Human Work Is Imagination, Creativity, and Strategy. [online] Harvard Business Review. Available at: [Accessed 01 Jul 2018].

Pittenger, D.J. (2005). Cautionary comments regarding the Myers-Briggs Type Indicator. Consulting Psychology Journal: Practice and Research, 57(3), p. 210.

Prensky, M. (2001). Digital natives, digital immigrants part 1. On the horizon, 9(5), pp. 1-6.

Postman, N. (2006). Amusing ourselves to death: Public discourse in the age of show business. Penguin, xix.

QAA Scotland (2005). Reflections on Assessment: Volume I. Enhancement Themes, p. 74.

Rackham, H. (1914). Cicero de finibus bonorum et malorum. London, William Heinemann Ltd., p. 37.

Resta, P. and Laferrière, T. (2007). Technology in support of collaborative learning. Educational Psychology Review, 19(1), pp. 65-83.

Reynolds, J. H. and Glaser, R. (1964). Effects of repetition and spaced review upon retention of a complex learning task. Journal of Educational Psychology, 55(5), pp. 297-308.

Richardson, J.G. ed. (1986). Handbook of Theory and Research for the Sociology of Education, Westport, CT: Greenwood, pp. 241–58.

Robinson, K. (2011). Out of our minds: Learning to be creative (2nd ed). John Wiley & Sons, pp. 141-142.

Roll, I., Wiese, E.S., Long, Y., Aleven, V. and Koedinger, K.R. (2014). Tutoring self-and co-regulation with intelligent tutoring systems to help students acquire better learning skills. Design recommendations for intelligent tutoring systems, 2, pp. 169-182.

Ross, M. (2012). SFIAplus in the Curriculum. Using SFIA in education and workplace learning, Milton Keynes, The Open University.

Ross-Gordon, J.M. (2011). Research on adult learners: Supporting the needs of a student population that is no longer nontraditional. Peer Review, 13(1), p. 26.

Santos, E.A., Campbell, J.C., Patel, D., Hindle, A. and Amaral, J.N. (2018). Syntax and sensibility: Using language models to detect and correct syntax errors. In 2018 IEEE 25th International Conference on Software Analysis, Evolution and Reengineering (SANER), pp. 311-322.

Satell, G. (2016). Innovative Companies Get Their Best Ideas from Academic Research — Here’s How They Do It. [online] Harvard Business Review. Available at: [Accessed 01 Jul 2018].

Saxe, J.G. (1875). The poems of John Godfrey Saxe. Boston: James R. Osgood and Company, pp. 135-136.

Schachtman, T.R. and Reilly, S.S. eds. (2011). Associative learning and conditioning theory: Human and non-human applications. OUP USA, pp. 3-23.

Schopenhauer, A. (2010). The Cambridge edition of the works of Schopenhauer (1818), trans. C. Janaway. Cambridge University Press, p. 10.

Schrager, A. and Wang, A.X. (2017). Imagine how great universities could be without all those human teachers. [online] Quartz. Available at: [Accessed 01 Jul 2018].

Schumpeter, J.A. (1943). Capitalism, socialism and democracy. Routledge, pp. 81-86.

Scientific Computing World. (2017). sciNote adds AI capabilities. [online] Available at: [Accessed 01 Jul 2018].

sciNote. (2018). Manuscript Writer Generates Draft of Your Scientific Manuscript. [online] Available at: [Accessed 01 Jul 2018].

Scott, G., Leritz, L.E. and Mumford, M.D. (2004). The effectiveness of creativity training: A quantitative review. Creativity Research Journal, 16(4), pp. 361-388.

Secker, J. and Coonan, E. (2011). A New Curriculum for Information Literacy (ANCIL): curriculum and supporting documents. Cambridge: Cambridge University Library, p. 4.

Seim, C., Reynolds-Haertle, S., Srinivas, S. and Starner, T. (2016). Tactile taps teach rhythmic text entry: passive haptic learning of Morse code. In Proceedings of the 2016 ACM International Symposium on Wearable Computers, pp. 164-171.

Sharot, T., Korn, C.W. and Dolan, R.J. (2011). How unrealistic optimism is maintained in the face of reality. Nature neuroscience, 14(11), pp. 1475-1479.

Shaw, G.B. (1919). Man and Superman (1903). Constable and Company Ltd., London, pp. 230, 238.

Shorey, P. and Bury, R.G., (1969). Plato (Vols. 5 & 6). Cambridge, MA, Harvard University Press; London, William Heinemann Ltd., p. 536.

Shtyrov, Y., Nikulin, V.V. and Pulvermüller, F. (2010). Rapid cortical plasticity underlying novel word learning. Journal of Neuroscience, 30(50), pp. 16864-16867.

Singer, N. (2017). The Silicon Valley Billionaires Remaking America’s Schools. [online] The New York Times. Available at: [Accessed 01 Jul 2018].

Smith, L. (2002). Piaget’s model. Blackwell handbook of childhood cognitive development, pp. 515-537.

Smith, L. and Vonèche, J. eds. (2006). Norms in human development. Cambridge University Press, pp. 111-112.

Somers, J. (2018). The Scientific Paper Is Obsolete. [online] The Atlantic. Available at: [Accessed 01 Jul 2018].

Srnicek, N. and Williams, A. (2015). Inventing the future: Postcapitalism and a world without work. Verso Books, p. 55.

Strayer, D.L. and Drews F.A. (2007). Attention. Handbook of Applied Cognition (Chapter 2). Chichester; New York: Wiley, p. 39.

Stribling, J., Krohn, M. and Aguayo, D. (2005). SCIgen – An Automatic CS Paper Generator. [online] Available at: [Accessed 01 Jul 2018].

TalentWorks. (2018). The Science of The Job Search, Part III. [online] Available at: [Accessed 01 Jul 2018].

The Economist. (2017). Established education providers v new contenders. [online] Available at: [Accessed 01 Jul 2018].

Thoreau, H.D. (1854). Walden. Reprint, New York: Heritage Press, 1939, p. 45.

Time. (1976). Modern Living: Ozmosis in Central Park. [online] Available at: [Accessed 01 Jul 2018].

Turing, A.M. (1950). Computing machinery and intelligence. Mind, New Series, Vol. 59, No. 236 (Oct., 1950), pp. 433-460.

UCLH. (2018). Revolutionising healthcare with AI and data science: UCLH and The Alan Turing Institute announces breakthrough partnership today. [online] Available at: [Accessed 01 Jul 2018].

UNESCO/IFLA. (2005). Beacons of the Information Society: The Alexandria Proclamation on Information Literacy and Lifelong Learning. [online] Available at: [Accessed 01 Jul 2018].

Valenti, J.M. and Tavana, G. (2005). Report: continuing science education for environmental journalists and science writers: in situ with the experts. Science Communication, 27(2), p. 308.

Vygotsky, L.S., van der Veer, R.E., Valsiner, J.E. and Prout, T.T. (1994). The Vygotsky reader. Basil Blackwell, p. 270.

Waddington, C. H. (1961). The nature of life. London: George Allen & Unwin, p. 64.

Waite, R. and Braidwood, E. (2016). Mental health problems exposed by AJ Student Survey 2016. [online] Architects Journal. Available at: [Accessed 01 Jul 2018].

Walser, J. (2009). The origin of the term ‘Mahāyāna’ (The Great Vehicle) and its relationship to the Āgamas. Journal of the International Association of Buddhist Studies, 30(1-2), pp. 219-250.

Waters, J.K. (2014). Adaptive learning: Are we there yet?. [online] THE Journal (Technological Horizons In Education) digital edition. Available at: [Accessed 01 Jul 2018].

Watters, A. (2014). The monsters of education technology. CreateSpace Independent Publishing Platform, pp. 43-53.

Weale, S. (2016). Calls for ‘complete overhaul’ of UK university application process. [online] the Guardian. Available at: [Accessed 01 Jul 2018].

Weller, C. (2017). A study of 36,000 students just backed Bill Gates’ favorite style of education. [online] Business Insider. Available at: [Accessed 01 Jul 2018].

Wenger, E. (1998). Communities of practice: Learning, meaning, and identity. Cambridge University Press, p. 109.

Wilson, K. and Nichols, Z. (2015). The Knewton Platform. A General-Purpose Adaptive Learning Infrastructure. Knewton White Paper, pp. 6-20.

Wren, D.J. (1999). School culture: Exploring the hidden curriculum. Adolescence, 34(135), p. 593.

Yuan, X., Wang, T., Gulcehre, C., Sordoni, A., Bachman, P., Subramanian, S., Zhang, S. and Trischler, A. (2017). Machine Comprehension by Text-to-Text Neural Question Generation. arXiv preprint arXiv:1705.02012.

How To Cite This Post

Gkoutzis, K. (2018). The Role of Universities in the Age of AI. [online] Available at: [Accessed XX YYY ZZZZ].