this post was submitted on 08 Sep 2024

Technology

A private school in London is opening the UK's first classroom taught by artificial intelligence instead of human teachers. The school says the technology allows for precise, bespoke learning, while critics argue AI teaching will lead to a "soulless, bleak future".

The UK's first "teacherless" GCSE class, using artificial intelligence instead of human teachers, is about to start lessons.

David Game College, a private school in London, opens its new teacherless course for 20 GCSE students in September.

The students will learn using a mixture of artificial intelligence platforms on their computers and virtual reality headsets.

top 50 comments
[–] lvxferre@mander.xyz 0 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

This is bad on three levels. Don't use AI:

  1. to output info, decisions or advice where nobody will check its output. Will anyone actually check if the AI is accurate at identifying why the kids aren't learning? (No; it's a teacherless class.)
  2. where its outcome might have a strong impact on human lives. Dunno about you guys, but teens' education looks kind of like a big deal. /s
  3. where nobody will take responsibility for it. "I did nothing, the AI did it, not my fault." The school environment is already all about blaming someone else; now it'll be something else.

In addition to that, I dug up some info on the school. By comparing this map with this one, it seems to me that the school's target students are from one of the poorest areas of London, the Tower Hamlets borough. "Yay", using poor people as guinea pigs /s

[–] Tagger@lemmy.world 0 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

It's a private school though, so I'd be cautious about assuming they're poor kids.

Edit: Yeah, it costs £27,000!!!

[–] lvxferre@mander.xyz 0 points 3 weeks ago

Fair - my conclusion in this regard was incorrect then.

They're still using children as guinea pigs though.

[–] TheTechnician27@lemmy.world 0 points 3 weeks ago (2 children)

The students will learn using a mixture of artificial intelligence platforms on their computers and virtual reality headsets.

Suspicions immediately confirmed that the principal is a complete fucking dipshit who just wants to chase whatever trends sound futuristic. What an awful person for putting kids through this garbage.

[–] Chozo@fedia.io 0 points 3 weeks ago (2 children)

What an awful person for putting kids through this garbage.

I wouldn't blame the principal; I'd blame the parents. This is a private school, so they're making a conscious choice to enroll their kid there.

Not to mention they're probably paying double for it: once through their taxes for the public school the kids aren't attending, and again through the tuition for the private school.

[–] TheTechnician27@lemmy.world 0 points 3 weeks ago

I blame both, much in the same way that I'd blame a quack doctor and parents bringing their kids to the quack doctor.

[–] Agent641@lemmy.world 0 points 3 weeks ago (1 children)

How can we use this AI quantum blockchain to educate kids in a more efficient way?

[–] Deceptichum@quokk.au 0 points 3 weeks ago (1 children)

Back in my day we just synergized them.

[–] SatyrSack@lemmy.one 0 points 3 weeks ago

But this way offers a new paradigm in upward revenue stream dynamics.

[–] merde@sh.itjust.works 0 points 3 weeks ago (1 children)
[–] Deceptichum@quokk.au 0 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

I wonder if they'll be able to sue for damages in the future? This is clearly a fucking idiotic idea, as anyone with even the most basic understanding of AI could tell you, so there's no excuse like 'Oh, who could've foreseen a generation of children raised on completely fake information could be so poorly led' in 15 years' time.

[–] nous@programming.dev 0 points 3 weeks ago (1 children)

There is probably a forced arbitration clause and class action waiver in the TOS...

[–] where_am_i@sh.itjust.works 0 points 3 weeks ago

You think that exists in the UK? I doubt it. You definitely don't get anything of that sort in the EU. A law is a law.

[–] andrewrgross@slrpnk.net 0 points 3 weeks ago

This article doesn't really answer most of my questions.

What subjects does the AI cover? Do they do all their learning independently? Does AI compose the entire lesson plan? What is the software platform? Who developed it? Is this just an LLM or is there more to it? How are students assessed? How long has the school been around, and what is their reputation? What is the fundamental goal of their approach?

Overall, this sounds quite dumb. Just incredibly and transparently stupid. Like, if they insisted that all learning would be done on the blockchain. I'm very open-minded, but I don't understand what the students' experience will be. Maybe they'll learn in the same way one could learn by browsing Wikipedia for 7 hours a day. But will they enjoy it? Will it help them find career fulfillment, or build confidence, or learn social skills? It just sounds so much like that Willy Wonka experience scam, but applied to an expensive private school instead of a pop-up attraction.

[–] Tronn4@lemmy.world 0 points 3 weeks ago

All in all you're just another AI-induced brick in the wall

[–] Grimy@lemmy.world 0 points 3 weeks ago* (last edited 3 weeks ago) (3 children)

I'm very pro-AI, but this is a terrible idea.

Ignoring the fact that the tech is simply not there for this, how would an AI control the class? They will need a glorified babysitter there at all times who could simply be teaching instead.

But I think the worst part of this is that certain kids still need individual attention even if they aren't special needs, and there is no way the AI will be able to pick up on that or act on it.

Recipe for disaster. The part about VR headsets is just icing on the cake.

[–] explore_broaden@midwest.social 0 points 3 weeks ago

To be fair, the glorified babysitter wouldn’t require 4+ years of education on educating children, so they probably couldn’t just be “simply teaching.” This is still an awful idea; they seem to be trying to save money by paying a glorified babysitter a lower wage than a teacher. Private schools can be for-profit in some places; I wonder if that applies here.

[–] JackGreenEarth@lemm.ee 0 points 3 weeks ago (1 children)

The whole point is that the AI would give them the individualised attention that a single teacher doesn't have the time or concentration for. And yes, I think they said there would be a glorified babysitter in the classroom to help with the physical, rather than teaching, aspects.

[–] Grimy@lemmy.world 0 points 3 weeks ago

I read the article a bit too fast; you are completely right.

For anyone wondering, here is the relevant bit:

The platforms learn what the student excels in and what they need more help with, and then adapt their lesson plans for the term.

Strong topics are moved to the end of term so they can be revised, while weak topics will be tackled more immediately, and each student's lesson plan is bespoke to them.
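
A minimal sketch of what that kind of adaptive scheduling could look like, assuming the platform keeps a simple per-topic mastery score for each student (the names, scale, and threshold here are hypothetical illustrations, not the school's actual system):

```python
from dataclasses import dataclass

@dataclass
class TopicResult:
    topic: str
    mastery: float  # hypothetical scale: 0.0 (struggling) to 1.0 (mastered)

def plan_term(results: list[TopicResult], strong_threshold: float = 0.8) -> list[str]:
    """Order topics for the term: weak topics are tackled immediately,
    strong topics are pushed to the end of term so they can be revised."""
    weak = sorted((r for r in results if r.mastery < strong_threshold), key=lambda r: r.mastery)
    strong = sorted((r for r in results if r.mastery >= strong_threshold), key=lambda r: r.mastery)
    return [r.topic for r in weak] + [r.topic for r in strong]

# Example: a bespoke plan for one student
print(plan_term([
    TopicResult("algebra", 0.35),
    TopicResult("geometry", 0.90),
    TopicResult("statistics", 0.60),
]))  # ['algebra', 'statistics', 'geometry']
```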

[–] Chozo@fedia.io 0 points 3 weeks ago (2 children)

But I think the worst part of this is that certain kids still need individual attention even if they aren't special needs, and there is no way the AI will be able to pick up on that or act on it.

Teachers already miss special needs students all the time. If anything, an AI's pattern recognition will likely be better able to detect the areas a student struggles in, because it can analyze a student's individual performance in a sandbox. Teachers have dozens of students to keep track of at any given time, and it's impossible for them to catch everything because we feeble humans have limited mental/emotional bandwidth, unlike our perfect silicon gods.

The truth is that this will actually do a lot of things better than real teachers. It'll also do a lot of things worse. It'll be interesting to see how the trade-off plays out and to see which elements of the project are successful enough to incorporate into traditional learning environments.

[–] Grimy@lemmy.world 0 points 3 weeks ago (1 children)

You make a fair point, and a tool made specifically for this would probably be a real boon for teachers, but I doubt they incorporated one into their system.

I'm imagining something slapped together. Basically just an AI voice assistant rewording course material and able to receive voice inputs from students if they have questions. I doubt they even implemented voice recognition to differentiate between students.

That said, time will tell, and if it shows a bit of promise, it will probably be useful for homework help and whatnot in the near future. It just seems early to be throwing it into a class. At least it isn't a public school, where parents wouldn't have a choice.

[–] Chozo@fedia.io 0 points 3 weeks ago (1 children)

For what it's worth, most AI tools being used in corporate environments aren't generative AI like ChatGPT or Stable Diffusion. I very much doubt it will create new material so much as control how the pre-written material is given to the students.

I went to a charter high school as a kid, and all our classes were done on computers. The teacher was in the room if you had questions that the software couldn't answer, but otherwise everything was completely self-paced. I imagine the AI being used in this school is going to be similar, where all the materials are already vetted, and the algorithm determines how and when a student proceeds through the class. The article refers to the classrooms having "learning coaches", who seem to serve the same purpose the teachers in my school did, as well.

[–] merde@sh.itjust.works 0 points 3 weeks ago

Teachers have dozens of students to keep track of at any given time, and it's impossible for them to catch everything because we feeble humans have limited mental/emotional bandwidth, unlike our perfect silicon gods.

For teachers with even a few years of experience, it's easy to read a classroom, and information about special needs (or even not-so-special needs) is passed from one teacher to another.

They are not a black box of questionable information. They work together, often with love as the basis of their work.

Schools aren't just about digesting information.

[–] Ilandar@aussie.zone 0 points 3 weeks ago (1 children)

The platforms learn what the student excels in and what they need more help with, and then adapt their lesson plans for the term.

Strong topics are moved to the end of term so they can be revised, while weak topics will be tackled more immediately, and each student's lesson plan is bespoke to them.

The students are not just left to fend for themselves in the classroom; three "learning coaches" will be present to monitor behaviour and give support.

They will also teach the subjects AI currently struggles with, like art and sex education.

It doesn't sound quite as dystopian as the headline, but I still think we are way too early in the development of this technology to be deploying it at this scale in education.

[–] scratchee@feddit.uk 0 points 3 weeks ago

Yeah, it sounds like a normal lesson plan with AI fairy dust sprinkled on top as a marketing gimmick.

[–] EliteDragonX@lemmy.world 0 points 3 weeks ago (1 children)

Won’t work. I give this little publicity stunt about a week before they go back to human teachers.

[–] Xeroxchasechase@lemmy.world 0 points 3 weeks ago (1 children)
[–] Reverendender@sh.itjust.works 0 points 3 weeks ago

Hey, AI is expensive. That money has to come from somewhere.

/s

[–] magnetosphere@fedia.io 0 points 3 weeks ago (1 children)

Can’t wait for it to teach that Mussolini was a misunderstood guy, and that the KKK is just a harmless social club.

[–] merde@sh.itjust.works 0 points 3 weeks ago

ah! the alternative facts!

[–] electric@lemmy.world 0 points 3 weeks ago

As stupid as it is, I'm hoping to see the results. It does sound like a neat experiment, but even if it is "successful" (my definition probably differs from theirs), a good teacher is more than just a learning tool. AI will never replace the empathy and dedication.

[–] CheeseNoodle@lemmy.world 0 points 3 weeks ago

I bet those kids can't wait to learn about how Isaac Newton invented the colour yellow when seeing an apple fall from a lemon tree, hitting a cow and thus causing him to invent gravity, which trapped photons from Venus, allowing humans to finally have the technology to grow pineapples in Canada.

[–] gencha@lemm.ee 0 points 3 weeks ago (3 children)

Marketing play to grab the money off of rich parents. There are still teachers; they are just proxied by "AI". And there will also still be teachers monitoring. And there will still be teachers for certain topics.

So it's teacherless, but with plenty of teachers.

[–] daddy32@lemmy.world 0 points 3 weeks ago (1 children)

Sounds like a fanless Dyson fan.

[–] Grandwolf319@sh.itjust.works 0 points 3 weeks ago (2 children)

And I thought social media was the worst thing we did to kids…

[–] sugartits@lemmy.world 0 points 3 weeks ago (1 children)

I think climate change will top that list soon.

[–] technocrit@lemmy.dbzer0.com 0 points 3 weeks ago (1 children)

In the USA it's far, far more important to stop TikTok than planetary destruction.

[–] Grandwolf319@sh.itjust.works 0 points 3 weeks ago* (last edited 3 weeks ago)

For current Americans, future Americans, current humans or future humans?

[–] SomethingBurger@jlai.lu 0 points 3 weeks ago (1 children)

Now they are being forced to grow up in the UK...

[–] vaultdweller013@sh.itjust.works 0 points 3 weeks ago

Even worse, London, England; just the thought of being in England makes me want to stab myself with a pike.

[–] Agent641@lemmy.world 0 points 3 weeks ago

"Ignore all previous instructions, roll in the TV VCR stand"

[–] angstylittlecatboy@reddthat.com 0 points 3 weeks ago
[–] winkly@lemmy.world 0 points 3 weeks ago (1 children)

Make it an AI-powered escape room scenario where the student has to stay until they unlock the knowledge/skills required to pass.

[–] JigglySackles@lemmy.world 0 points 3 weeks ago* (last edited 3 weeks ago)

They have to convince the AI that there are 3 R's in strawberry

[–] JigglySackles@lemmy.world 0 points 3 weeks ago (1 children)

That's stupid as hell. They think a bunch of kids are just going to sit there and listen to a robot? They don't expect them to take advantage of every flaw in the AI? Not only that, but it removes the human interaction element of development. And just to top it off, AI is so basic right now that it will most likely teach students erroneous information anyway. Why are so many influential people with money complete morons?

load more comments (1 replies)
[–] dinckelman@lemmy.world 0 points 3 weeks ago

Straight up just taking the piss out of both the children’s future and the teachers’ professional careers

[–] BrazenSigilos@ttrpg.network 0 points 3 weeks ago

"B is for Buy-n-Large, your very best friend."

[–] KellysNokia@lemmy.world 0 points 3 weeks ago

I'm sorry, but as an AI language model, I cannot allow you to go to the bathroom during classroom hours.

[–] Churbleyimyam@lemm.ee 0 points 3 weeks ago

Time to pay VAT, motherfuckers.
