
Generative AI: Transforming education into a personalized, addictive learning experience


Tigran Sloyan

Tigran Sloyan is the co-founder and CEO of CodeSignal — a platform for assessing and measuring technical aptitude during the engineer hiring process.

It’s no surprise that educators have an uneasy relationship with generative AI. They fear plagiarism, machine-generated essays, and the “hallucinations” of tools like ChatGPT and Bard, where the system confidently asserts something untrue simply because it doesn’t know any better. There’s a palpable concern that generative AI will become a substitute for authentic learning: something that helps a person pass a test without absorbing and internalizing the material.

While there is no doubt that AI has been used to circumvent the learning process, ChatGPT has already assumed the role of an ad hoc personal tutor for millions, changing learning consumption patterns and enhancing our relationship with education. The possibility of an AI-powered teaching assistant — one that mentors, encourages, and guides learners through the material in a one-to-one relationship — is within grasp. And the scalability of AI means that anyone can benefit from it.

AI can make — and, for many, already has made — learning addictive. The reasons why have little to do with cutting-edge advancements in AI and computer science and more to do with the fundamentals of what makes a learner engaged, motivated, and excited.

Growing up in Armenia, I was enthralled by the fiercely competitive math Olympiads, and my desire to win drove me to spend hours studying and practicing. Yet, as an adult, I couldn’t find that same motivation while studying math at MIT. I’ve spent a great deal of my life researching and understanding the motivations behind learning, some of which I’ve distilled into this piece and much of which led to me founding CodeSignal.

What do we mean by addictive?

When I talk about AI making learning addictive, I’m talking about a sense of excitement and eagerness: a voracious appetite for self-improvement and growth within a learner. More importantly, that appetite persists long after they’ve accomplished the goal that started their journey. Essentially, this boils down to sustained, long-term motivation. Creating self-motivated learners is a challenge most educators face, and a mountain of educational research touches on the topic.

It’s hard to overstate the importance of motivation. Whether you’re learning to speak a new language or taking the first steps to a career in programming, learning is inherently iterative, where the learner gradually builds confidence and fluency over time. The prolific programming educator Zed Shaw once described this as “climbing a mountain of ignorance.” Those first few months — when you aren’t confident and don’t understand the subject — are the hardest, and it’s all too easy to give up. And that’s why you need an external force to encourage the learner to keep going. Confidence, ability, and perhaps even greatness are just around the corner.

One of my favorite examples of this is Judit Polgár, widely regarded as the greatest female chess player of all time and, at the time, the world’s youngest chess grandmaster. Judit’s father, László, believed that geniuses were made, not born, requiring only sustained education and coaching from a young age. László, breaking with the societal expectations of communist-era Hungary, opted to homeschool Judit and her two sisters, with an intense focus on chess.

And it worked. Before she was even a teenager, Judit was described as a potential prodigy akin to Garry Kasparov and Bobby Fischer. By age 15, she beat a record previously set by Fischer, and two years later defeated Boris Spassky — another chess heavyweight — in an exhibition match.

While the role of nature and nurture is hotly debated (particularly in analytical games like chess), it’s clear that László’s approach worked. By combining intense training with the inherent motivating factor that comes with one-on-one coaching, Judit became a force within the chess world before reaching adulthood. Her sisters, Zsuzsa and Zsófia, also went on to become grandmasters.

In a post-retirement interview with Chess.com, Judit attributed the success of her father’s teaching method to the confidence it instilled in her: “I do believe that having private tuition in whatever field makes children improve so much faster, and because of this, they gain a lot more confidence, which increases their speed and appetite for improving. I think this is one of the most important things for any child, whether at school or not. If you can keep their curiosity, they can improve extremely fast.”

Generative AI can handle the motivational aspect of learning — the encouragement, the relevance, and the specificity — while avoiding the inevitable mistakes that emerge from a cookie-cutter, one-size-fits-all education system. But how?

The search for relevance

Academic research about the impact of generative AI as a tool for learning is still ongoing. Much of the existing academic literature is inherently speculative or anecdotal, discussing what might happen rather than what researchers have observed. This is an inevitable consequence of the newness of generative AI: ChatGPT was less than a year old at the time of writing, and research takes time. As more researchers investigate tools like ChatGPT, it will be interesting to see how my assumptions and predictions align with their findings.

As mentioned, motivation is critical to learner success, and relevance plays a huge role in achieving that. It’s one of the cornerstone factors in John Keller’s ARCS (attention, relevance, confidence, and satisfaction) model of motivation, an established concept in pedagogic theory.

Within the ARCS model, Keller identified several critical components of relevance, with two seeming especially pertinent to the subject of generative AI: needs matching, where the teacher relates the content to the learner’s needs, and modeling, which shows learners how to apply the learning in a practical sense.

Generative AI is well-positioned to achieve these components. As anyone who has used a GPT-4-based product can attest, it can create a hyper-targeted, hyper-personalized lesson about almost any subject. In a matter of seconds, ChatGPT can tell you how trigonometry is used in the real world, or how a niche part of a computer science class, however abstract and confusing it may seem, relates to a broader context. These examples can be created off the cuff, often in response to the student’s unique requirements and requests. This process works for educators, too.

Education has always been centered around the human touch, and it’s hard to imagine a world where machines can replace that. Humans have an ineffable emotional intelligence that cannot be articulated in code alone. I see generative AI extending the capabilities of teachers, who are often overburdened and overextended. One example of how this could work is in modifying, improving, and tailoring learning material.

Typically, a teacher has neither the time nor the energy to create tailored worksheets for each student’s ability, interests, or learning style. They’re overworked and overstretched, and classroom materials are expensive, often paid for out of the teacher’s own pocket. But now they can generate tailored learning materials at scale, on demand, and at negligible cost to the school or teacher. Using a tool like ChatGPT, a teacher can paste in a lesson plan and, with a simple written instruction, substantially change its format or presentation for an individual student while preserving the core material.

This process takes seconds, making it a practical option for even the busiest teacher. It’s a use case I imagine many teachers will embrace alongside generative AI’s other capabilities for ideation, proofreading, and suggestion.
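As a rough sketch of what that workflow might look like in practice (the function name and prompt wording are my own assumptions, not a documented API or product feature), the teacher's instruction and the lesson plan can be combined into a single prompt for any chat-style LLM:

```python
def build_tailoring_prompt(lesson_plan: str, instruction: str) -> str:
    """Combine a teacher's lesson plan with a per-student instruction.

    The returned string would be sent to a chat-style LLM. The prompt
    asks the model to preserve the core material and change only the
    presentation -- mirroring the "paste in a lesson plan, add a simple
    written instruction" workflow described above.
    """
    return (
        "Rewrite the lesson plan below for one student. "
        f"Adaptation requested by the teacher: {instruction}\n"
        "Keep every concept and exercise; change only the wording, "
        "format, and examples.\n\n"
        f"--- LESSON PLAN ---\n{lesson_plan}"
    )


# Example: the same fractions lesson, adapted two different ways.
plan = "Fractions: equivalence, comparison, and addition with unlike denominators."
prompt_a = build_tailoring_prompt(plan, "Use soccer statistics as the running example.")
prompt_b = build_tailoring_prompt(plan, "Simplify the vocabulary for an early reader.")
```

The point of the design is that the core material appears verbatim in every variant, so the teacher only ever writes one lesson plan and one sentence of instruction per student.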

It’s easy to see how generative AI’s potential for content-tailoring could be combined with other proven learning methods, like gamification.

Video games keep people engaged by creating dopamine loops. These loops only work if there’s a semblance of progress. For the gaming experience to feel worthwhile, your character needs to keep evolving and improving. With each challenge, your character acquires new skills and equipment to help them tackle future, more demanding challenges. This mechanic is repetitive on its own, so the “loop” moves with the player, bringing them to new locations and story lines to keep an element of novelty.

Generative AI makes it possible to implement these mechanics in an educational setting. With endless variations of content tailored to the student’s situation and ability, learners can obtain the repetition and reinforcement necessary for long-term success — without the content feeling tired or dull. This loop can continue as the learner moves throughout the subject, tackling more complex and advanced material as they progress.

Person-centered learning

Content alignment is fundamental to long-term success. The curriculum must be relevant to the learners’ interests, and the material must be matched to the people themselves: learners arrive with different incentives and starting abilities that must be addressed from the very beginning.

One paper, published in the IEEE Signal Processing Magazine, provides an overview of the potential impact of AI in person-centered learning and student incentivization. As other pedagogical researchers have observed, it notes that students, and people in general, respond to different incentives during learning.

The author writes: “Some learners exhibit hyperbolic preferences, overweighting the present so much that future rewards are largely ignored. Some learners show strong reactions even to nonmonetary rewards. Some learners demonstrate reference-dependent preferences, implying that the utility is largely determined by its distance from a reference point, for example, a pre-defined goal or the average performance.”

In short, some people want a sense of immediacy, some desire a kind of nontangible reward (a grade, a certificate, or another form of recognition), and others are more focused on how the content will get them to a predetermined destination. These are factors that need to be considered when crafting learning material.

At the same time, it’s essential to recognize that abilities vary. Content needs to be articulated in different ways to be effective. While some may feel comfortable with a dense, academically written explanation of a subject with field-specific jargon incomprehensible to outsiders, others may prefer something more approachable. This is why a one-size-fits-all approach succeeds with some but fails many others.

And there’s the relationship between the teacher and the student, which also plays a crucial role in student motivation. Robert Gower and Jon Saphier, two respected writers on education, highlight three key messages of encouragement that work: “This is important”; “You can do it”; and “I won’t give up on you.” It remains to be seen whether these sentiments will retain their impact when delivered by an AI chatbot. But it’s something that, with a trivial amount of effort, a system could be programmed to say.

While many of the components mentioned are yet to feature in a mainstream generative AI tool (incentivization in particular), others are firmly within grasp. ChatGPT, for example, can provide high-level and low-level explanations of topics. It can respond to prompts to simplify or provide more detail or complexity. Much of the functionality required already exists, albeit in an ad hoc form, and it’s time for generative AI to play a more significant role — not merely in the classroom but also in how people more broadly engage with education.

Building lifelong learners

AI, particularly large language models, has the potential to revolutionize how students learn. This change will be fundamentally beneficial, especially regarding how individuals relate to learning and how it alters their consumption patterns.

Much of the focus has been on AI’s ability to scale personalized education or democratize education beyond the halls of expensive university campuses. While I don’t disagree with these assessments, it’s essential to acknowledge the psychological and sociological impacts of these changes. The notion that AI could make learning not only “fun” but also profoundly compelling feels realistic and imminent. Doing so will create a new generation of hyper-capable, hyper-passionate individuals who can readily adapt to change and constantly refresh and refine their skills.

That will benefit individuals, the economy, and — ultimately — society.
