this post was submitted on 04 Sep 2024

Does AI actually help students learn? A recent experiment in a high school provides a cautionary tale. 

Researchers at the University of Pennsylvania found that Turkish high school students who had access to ChatGPT while doing practice math problems did worse on a math test compared with students who didn’t have access to ChatGPT. Those with ChatGPT solved 48 percent more of the practice problems correctly, but they ultimately scored 17 percent worse on a test of the topic that the students were learning.

A third group of students had access to a revised version of ChatGPT that functioned more like a tutor. This chatbot was programmed to provide hints without directly divulging the answer. The students who used it did spectacularly better on the practice problems, solving 127 percent more of them correctly compared with students who did their practice work without any high-tech aids. But on a test afterwards, these AI-tutored students did no better. Students who just did their practice problems the old-fashioned way — on their own — matched their test scores.

(page 3) 50 comments
[–] Eiri@lemmy.ca 0 points 2 months ago (1 children)
[–] dual_sport_dork@lemmy.world 0 points 2 months ago (1 children)

IKR?

Cheaters who cheat rather than learn don't learn. More on this shocking development at 11.

[–] billwashere@lemmy.world 0 points 2 months ago (1 children)

Using ChatGPT as a study aid is cheating how?

[–] blackbirdbiryani@lemmy.world 0 points 2 months ago (1 children)

Because a huge part of learning is actually figuring out how to extract/summarise information from imperfect sources to solve related problems.

If you use ChatGPT as a crutch because you're too lazy to read between the lines and infer meaning from text, then you're not exercising that particular skill.

[–] billwashere@lemmy.world 0 points 2 months ago (2 children)

I don't disagree, but that's like saying using a calculator will hurt your understanding of higher-order math. It's a tool, not a crutch. I've used it many times to help me understand concepts just out of reach. I don't trust anything an LLM says implicitly, but it can and does help me.

[–] WordBox@lemmy.world 0 points 2 months ago (4 children)

Congrats but there's a reason teachers ban calculators... And it's not always for the pain.

[–] Skates@feddit.nl 0 points 2 months ago (1 children)

There are many reasons for why some teachers do some things.

We should not forget that one of them is "because they're useless cunts who have no idea what they're doing and they're just powertripping their way through some kids' education until the next paycheck".

[–] Zoot@reddthat.com 0 points 2 months ago* (last edited 2 months ago)

Not knowing how to add 6 + 8 just because a calculator is always available isn't okay.

I have friends in my DnD sessions who have to count numbers together on their fingers, and I feel bad for them. Don't blame a teacher for banning a calculator when they just want you to be a smarter, more efficient, and more productive person.

[–] billwashere@lemmy.world 0 points 2 months ago (2 children)

Take a college physics test without a calculator if you wanna talk about pain. And I doubt you could find a single person who could calculate trig functions or logarithms longhand. At some point you move past needing to prove you can do arithmetic. It's just not necessary.

The really interesting question here is whether an LLM is useful as a study aid. It looks like more research is necessary. But an LLM is not smart. It's a complicated next-word predictor, and they have been known to go off the rails. And this article suggests it's not as useful as you might think for new learners.

[–] assassin_aragorn@lemmy.world 0 points 2 months ago

In some cases I'd argue, as an engineer, that having no calculator makes students better at advanced math and problem solving. It forces you to work with the variables and understand how to do the derivation. You learn a lot more manipulating the ideal gas formula as variables and then plugging in numbers at the end, versus adding numbers to start with. You start to implicitly understand the direct and inverse relationships with variables.

Plus, learning to directly use variables is very helpful for coding. And it makes problem solving much more of a focus. I once didn't have enough time left in an exam to come to a final numerical answer, so I instead wrote out exactly what steps I would take to get the answer -- which included doing some graphical solutions on a graphing calculator. I wrote how to use all the results, and I ended up with full credit for the question.

To me, that is the ultimate goal of math and problem solving education. The student should be able to describe how to solve the problem even without the tools to find the exact answer.
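
A minimal sketch of the variables-first workflow the comment above describes, assuming Python with sympy; the ideal gas law and the numbers substituted at the end are purely illustrative choices, not from the study or the thread:

```python
# Rough illustration of "manipulate the formula as variables, plug in numbers at the end",
# using the ideal gas law PV = nRT as the example. Values below are made up.
import sympy as sp

P, V, n, R, T = sp.symbols("P V n R T", positive=True)
ideal_gas = sp.Eq(P * V, n * R * T)

# Work symbolically first: solving for V makes the relationships explicit,
# i.e. V grows with n and T and shrinks as P grows.
V_expr = sp.solve(ideal_gas, V)[0]
print(V_expr)  # n*R*T/P

# Substitute numbers only at the final step (SI units: Pa, mol, J/(mol*K), K).
print(V_expr.subs({n: 1, R: 8.314, T: 300, P: 101325}))  # ~0.0246 m^3
```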

[–] obbeel@lemmy.eco.br 0 points 2 months ago (1 children)

ChatGPT hallucinations inspire me to search for real references. They teach us that we can't blindly trust things that are said. Teachers will commonly insist that they are correct.

[–] billwashere@lemmy.world 0 points 2 months ago

I do honestly have a tendency to more thoroughly verify anything AI tells me.

[–] Akasazh@feddit.nl 0 points 2 months ago (2 children)
[–] Kusimulkku@lemm.ee 0 points 2 months ago (3 children)

I've found AI helpful in asking for it to explain stuff. Why is the problem solved like this, why did you use this and not that, could you put it in simpler terms and so on. Much like you might ask a teacher.

[–] NikkiDimes@lemmy.world 0 points 2 months ago* (last edited 2 months ago) (1 children)

I think this works great if the student is interested in the subject, but if you're just trying to work through a bunch of problems so you can stop working through a bunch of problems, it ain't gonna help you.

I have personally learned so much from LLMs (although you can't really take anything at face value and have to look things up independently, but it gives you a great starting place), but it comes from a genuine interest in the questions I'm asking and things I dig at.

[–] Homescool@lemmy.world 0 points 2 months ago* (last edited 2 months ago) (1 children)

Yep. My first interaction with GPT pro lasted 36 hours and I nearly changed my religion.

AI is the best thing to come to learning, ever. If you are a curious person, this is bigger than Gutenberg, IMO.

[–] captainlezbian@lemmy.world 0 points 2 months ago

That sounds like a manic episode

[–] blazeknave@lemmy.world 0 points 2 months ago (4 children)

Kids who take shortcuts and don't learn suck at recalling knowledge they never had.

[–] ameancow@lemmy.world 0 points 2 months ago* (last edited 2 months ago)

The only reason we're trying to somehow compromise and allow or even incorporate cheating software into student education is because the tech-bros and singularity cultists have been hyping this technology like it's the new, unstoppable force of nature that is going to wash over all things and bring about the new Golden Age of humanity as none of us have to work ever again.

Meanwhile, 80% of AI startups sink, and something like 75% of the “new tech” like AI drive-thru orders and AI phone support goes to call centers in India and the Philippines. The only thing we seem to have gotten is the absolute rotting destruction of all content on the internet, and children growing up thinking it's normal to consume this watered-down, plagiarized, worthless content.

[–] ChickenLadyLovesLife@lemmy.world 0 points 2 months ago (3 children)

I took German in high school and cheated by inventing my own runic script. I would draw elaborate fantasy/sci-fi drawings on the covers of my notebooks with the German verb declensions and whatnot written all over monoliths or knights' armor or dueling spaceships, using my own script instead of regular characters, and then have these notebooks sitting on my desk while taking the tests. I got 100% on every test, and now the only German I can speak is the bullshit I remember Nightcrawler from the X-Men saying. Unglaublich!

[–] HawlSera@lemm.ee 0 points 2 months ago (1 children)

Unsurprised

I would have no problem with AI if the shit actually worked

[–] Gestrid@lemmy.ca 0 points 2 months ago (1 children)

No, I think the point here is that the kids never learned the material, not that AI taught them the wrong material (though there is a high possibility of that).

[–] friend_of_satan@lemmy.world 0 points 2 months ago* (last edited 2 months ago)

See also: competitive cognitive artifacts. https://philosophicaldisquisitions.blogspot.com/2016/09/competitive-cognitive-artifacts-and.html

These are artifacts that amplify and improve our abilities to perform cognitive tasks when we have use of the artifact but when we take away the artifact we are no better (and possibly worse) at performing the cognitive task than we were before.

[–] terminhell@lemmy.world 0 points 2 months ago (1 children)

Maybe, if the system taught more of HOW to think and not WHAT. Basically more critical thinking/deduction.

This same kind of topic came up back when I was in middle/high school, when search engines became widespread.

However, LLMs shouldn't be trusted for anything factual, same as Joe Blow's blog on some random subject. Did they forget to teach cross-referencing too? I'm sounding too bitter and old, so I'll stop.

[–] Soup@lemmy.cafe 0 points 2 months ago

Kids using an AI system trained on edgelord Reddit posts aren’t doing well on tests?

Ya don’t say.

[–] michaelmrose@lemmy.world 0 points 2 months ago

TL;DR: ChatGPT is terrible at math, and most students just ask it for the answer. Giving students the ability to ask something that doesn't know math for the answer makes them less capable. An enhanced chatbot that was pre-fed the questions and correct answers didn't screw up the learning process in the same fashion, but it also didn't help them perform any better on the test, because again they just asked it to spoon-feed them the answer.

references

ChatGPT’s errors also may have been a contributing factor. The chatbot only answered the math problems correctly half of the time. Its arithmetic computations were wrong 8 percent of the time, but the bigger problem was that its step-by-step approach for how to solve a problem was wrong 42 percent of the time.

The tutoring version of ChatGPT was directly fed the correct solutions and these errors were minimized.

The researchers believe the problem is that students are using the chatbot as a “crutch.” When they analyzed the questions that students typed into ChatGPT, students often simply asked for the answer.

[–] fne8w2ah@lemmy.world 0 points 2 months ago

Taking too many shortcuts doesn't help anyone learn anything.
