this post was submitted on 04 Sep 2024

Technology

Does AI actually help students learn? A recent experiment in a high school provides a cautionary tale. 

Researchers at the University of Pennsylvania found that Turkish high school students who had access to ChatGPT while doing practice math problems did worse on a math test compared with students who didn’t have access to ChatGPT. Those with ChatGPT solved 48 percent more of the practice problems correctly, but they ultimately scored 17 percent worse on a test of the topic that the students were learning.

A third group of students had access to a revised version of ChatGPT that functioned more like a tutor. This chatbot was programmed to provide hints without directly divulging the answer. The students who used it did spectacularly better on the practice problems, solving 127 percent more of them correctly compared with students who did their practice work without any high-tech aids. But on a test afterwards, these AI-tutored students did no better. Students who just did their practice problems the old-fashioned way — on their own — matched their test scores.

(page 2) 50 comments
[–] MystikIncarnate@lemmy.ca 0 points 2 months ago

Something I've noticed with institutional education is that they're not looking for the factually correct answer; they're looking for the answer that matches whatever you were told in class. Those two things shouldn't differ, but in my experience they're not always the same.

I have no idea if this is a factor here, but it's something I've noticed. I have actually answered questions with a factually wrong answer, because that's what was taught, just to get the marks.

[–] Allero@lemmy.today 0 points 2 months ago

Perhaps unsurprisingly. Any sort of "assistance" with answers will do that.

Students have to learn why things work the way they do, and they won't be able to grasp it without going ahead and doing every piece manually.

[–] Cornelius_Wangenheim@lemmy.world 0 points 2 months ago (2 children)

This isn't a new issue. Wolfram Alpha has been around for 15 years and can easily handle high-school-level math problems.

[–] PriorityMotif@lemmy.world 0 points 2 months ago (2 children)

Only old farts still use Wolfram

[–] bitjunkie@lemmy.world 0 points 2 months ago

Where did you think you were?

[–] Gradually_Adjusting@lemmy.world 0 points 2 months ago (2 children)
[–] GooseFinger@sh.itjust.works 0 points 2 months ago

ChatGPT apparently lol

[–] PriorityMotif@lemmy.world 0 points 2 months ago* (last edited 2 months ago)

I can't remember, but my dad said before he retired he would just pirate Wolfram because he was too old to bother learning whatever they were using. He spent 25 years in academia teaching graduate chem-e before moving to the private sector. He very briefly worked with one of the Wolfram founders at UIUC.

load more comments (1 reply)
[–] Mr_Dr_Oink@lemmy.world 0 points 2 months ago (1 children)

Because AI, and Google searches before it, are not a substitute for having knowledge and experience. You can learn by googling something and reading about how it works so you can figure out answers for yourself. But googling for answers will not teach you much. Even if it solves a problem, you won't learn how, and you won't be able to fix something in the future without googling the answer again.

If you don't learn how to do something, you won't be experienced enough to know when you are doing it wrong.

I use Google to give me answers all the time when I'm problem solving. But I have to spend a lot more time after the fact learning why what I did fixed the problem.

[–] prosp3kt@lemmy.dbzer0.com 0 points 2 months ago (1 children)

Nope, it doesn't work like that. Sometimes you need someone to explain it, especially in math; YouTube can take that spot, but not always.

[–] Mr_Dr_Oink@lemmy.world 0 points 2 months ago

That's what I am saying. You need to learn it. If someone explains it to you, then you are learning. If someone just gives you the answer, you don't understand it, so you end up worse at the thing itself.

You agree with me...

[–] prosp3kt@lemmy.dbzer0.com 0 points 2 months ago* (last edited 2 months ago) (1 children)

There's a part here that sounds interesting:

The students who used it did spectacularly better on the practice problems, solving 127 percent more of them correctly compared with students who did their practice work without any high-tech aids. But on a test afterwards, these AI-tutored students did no better.

Do you think these students who used ChatGPT can do the exercises "the old-fashioned way"? For me it was a nightmare trying to solve a calculus problem with just the trash books that don't explain a damn thing; I had to go to different resources, Wolfram, YouTube. But what happened when a problem wasn't well explained in any resource? I hate OpenAI, I want to punch Altman in the face. But that doesn't mean we have to bait this hard in the title.

load more comments (1 reply)
[–] Maggoty@lemmy.world 0 points 2 months ago

ChatGPT lies, which is kind of an issue in education.

As far as seeing the answer, I learned a significant amount of math by looking at the answer for a type of question and working backwards. That's not the issue as long as you're honestly trying to understand the process.

[–] PriorityMotif@lemmy.world 0 points 2 months ago

There's a bunch of websites that give you the answers to most homework. You can just Google the question and find the answers pretty quickly. I assume the people using ChatGPT to "study" are just cheating on homework anyway.

[–] flerp@lemm.ee 0 points 2 months ago (2 children)

Like any tool, it depends how you use it. I have been learning a lot of math recently and have been chatting with AI to increase my understanding of the concepts. Sometimes the textbook shows steps without explaining why they happen, and I've questioned AI about it. Sometimes it takes a few tries until you figure out the right question to ask, but that process of thinking helps you along the way anyway by crystallizing in your brain exactly what it is that you don't understand.

I have found it to be a very helpful tool in my educational path. However, I am learning things because I want to understand them, not because I have to pass a test, and that determination to understand makes a big difference. Just getting hints to help you solve the problem might not really help in the long run, but if you're actually curious about what you're learning and focus on getting a deeper understanding of why and how something works rather than just getting the right answer, it can be a very useful tool.

[–] Rekorse@sh.itjust.works 0 points 2 months ago (10 children)

Why are you so confident that the things you are learning from AI are correct? Are you just using it to gather other sources to review by hand or are you trying to have conversations with the AI?

We've all seen AI get the correct answer but the show your work part is nonsense, or vice versa. How do you verify what AI outputs to you?

[–] GaMEChld@lemmy.world 0 points 2 months ago (1 children)

You check its work. I used it to calculate efficiency in a factory game and went through and corrected the inconsistencies I spotted. Always check its work.
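
The "check its work" step can be as simple as substituting a claimed answer back into the original problem. A minimal sketch with a made-up example (the quadratic and the claimed roots are hypothetical, not from this thread):

```python
# Hypothetical: an assistant claims x = 2 and x = 3 solve x**2 - 5*x + 6 = 0.
# Instead of trusting the explanation, plug the answers back in.
def f(x):
    return x**2 - 5*x + 6

claimed_roots = [2, 3]
all_check_out = all(f(x) == 0 for x in claimed_roots)  # True: both roots verify
```

The point is that verification is often far cheaper than derivation: even when the model's "show your work" portion is nonsense, the final answer can be mechanically tested.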

load more comments (1 reply)
[–] rainerloeten@lemmy.world 0 points 2 months ago

I use it for explaining stuff when studying for uni and I do it like this: If I don't understand e.g. a definition, I ask an LLM to explain it, read the original definition again and see if it makes sense.

This is an informal approach, but if the definition is sufficiently complex, false answers are unlikely to lead to an understanding. Not impossible ofc, so always be wary.

For context: I'm studying computer science, so lots of math and theoretical computer science.

load more comments (8 replies)
[–] Gsus4@mander.xyz 0 points 2 months ago* (last edited 2 months ago)

Sometimes it leads me wildly astray when I do that, like a really bad tutor... but it is good if you want a refresher and can spot the bullshit on the side. It is good for surfacing things you didn't know before and can fact-check afterwards.

...but maybe other review papers and textbooks are still better...

[–] pineapplelover@lemm.ee 0 points 2 months ago

Youdontsay.png

[–] randon31415@lemmy.world 0 points 2 months ago (3 children)

Kids who use ChatGPT as a study assistant do worse on tests

But on a test afterwards, these AI-tutored students did no better. Students who just did their practice problems the old-fashioned way — on their own — matched their test scores

Headline: People who flip coins have a much worse chance of calling it if they call heads!

Text: Studies show that people who call heads when flipping coins have an even chance of getting it right compared to people who do it the old-fashioned way of calling tails.
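
The analogy can be made concrete with a quick simulation (a minimal sketch; the coin-flip framing is the commenter's satire, not part of the study). Always calling heads and always calling tails succeed at the same rate, just as the AI-tutored and unaided groups matched on the test:

```python
import random

def call_rate(call: str, flips: int = 100_000, seed: int = 0) -> float:
    """Fraction of fair coin flips correctly predicted by always calling `call`."""
    rng = random.Random(seed)
    return sum(rng.choice(("heads", "tails")) == call for _ in range(flips)) / flips

heads_rate = call_rate("heads")  # close to 0.5
tails_rate = call_rate("tails")  # also close to 0.5 -- neither "strategy" does worse
```

Both rates sit near 0.5, so a headline claiming one group "does worse" when the groups merely tied would be misleading in exactly the way the comment parodies.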

load more comments (3 replies)
load more comments