IKR?
Cheaters who cheat rather than learn don't learn. More on this shocking development at 11.
Using ChatGPT as a study aid is cheating how?
Because a huge part of learning is actually figuring out how to extract/summarise information from imperfect sources to solve related problems.
If you use ChatGPT as a crutch because you're too lazy to read between the lines and infer meaning from text, then you're not exercising that particular skill.
I don't disagree, but that's like saying using a calculator will hurt your understanding of higher-order math. It's a tool, not a crutch. I've used it many times to help me understand concepts just out of reach. I don't implicitly trust anything LLMs say, but it can and does help me.
Congrats but there's a reason teachers ban calculators... And it's not always for the pain.
There are many reasons for why some teachers do some things.
We should not forget that one of them is "because they're useless cunts who have no idea what they're doing and they're just powertripping their way through some kids' education until the next paycheck".
Not knowing how to add 6 + 8 just because a calculator is always available isn't okay.
I have friends in my DnD session who have to count numbers together on their fingers, and I feel bad for them. Don't blame a teacher for banning a calculator when they want you to become a smarter, more efficient, and more productive person.
Take a college physics test without a calculator if you wanna talk about pain. And I doubt you could find a single person who could calculate trig functions or logarithms longhand. At some point you're past needing to prove you can do arithmetic. It's just not necessary.
The really interesting thing here is whether an LLM is useful as a study aid. It looks like more research is necessary. But an LLM is not smart. It's a complicated next-word predictor, and they have been known to go off the rails for sure. And this article suggests it's not as useful as you might think for new learners.
In some cases I'd argue, as an engineer, that having no calculator makes students better at advanced math and problem solving. It forces you to work with the variables and understand how to do the derivation. You learn a lot more manipulating the ideal gas formula as variables and then plugging in numbers at the end, versus plugging in numbers from the start. You start to implicitly understand the direct and inverse relationships between variables.
Plus, learning to directly use variables is very helpful for coding. And it makes problem solving much more of a focus. I once didn't have enough time left in an exam to come to a final numerical answer, so I instead wrote out exactly what steps I would take to get the answer -- which included doing some graphical solutions on a graphing calculator. I wrote how to use all the results, and I ended up with full credit for the question.
To me, that is the ultimate goal of math and problem solving education. The student should be able to describe how to solve the problem even without the tools to find the exact answer.
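To sketch what I mean (an illustrative example with made-up numbers, not one from an actual exam): for a fixed amount of gas, writing PV = nRT at both states gives P1V1/T1 = P2V2/T2, so T2 = T1 × (P2V2)/(P1V1). Only at the end do you substitute values, e.g. T2 = 300 K × (2 atm × 1 L)/(1 atm × 3 L) = 200 K, and the direct and inverse relationships are already visible in the symbolic form before any arithmetic happens.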
ChatGPT's hallucinations inspire me to search for real references. They teach us that we cannot blindly trust things that are said, whereas teachers will commonly reinforce that they are correct.
I do honestly have a tendency to more thoroughly verify anything AI tells me.
I've found AI helpful when asking it to explain stuff: why is the problem solved like this, why did you use this and not that, could you put it in simpler terms, and so on. Much like you might ask a teacher.
I think this works great if the student is interested in the subject, but if you're just trying to work through a bunch of problems so you can stop working through a bunch of problems, it ain't gonna help you.
I have personally learned so much from LLMs (although you can't really take anything at face value and have to look things up independently, they give you a great starting place), but it comes from a genuine interest in the questions I'm asking and the things I dig at.
Yep. My first interaction with GPT pro lasted 36 hours and I nearly changed my religion.
AI is the best thing to come to learning, ever. If you are a curious person, this is bigger than Gutenberg, IMO.
That sounds like a manic episode
Kids who take shortcuts and don't learn suck at recalling knowledge they never had.
The only reason we're trying to somehow compromise and allow or even incorporate cheating software into student education is because the tech-bros and singularity cultists have been hyping this technology like it's the new, unstoppable force of nature that is going to wash over all things and bring about the new Golden Age of humanity as none of us have to work ever again.
Meanwhile, 80% of AI startups sink, and something like 75% of the "new techs" like AI drive-thru orders and AI phone support go to call centers in India and the Philippines. The only thing we seem to have gotten is the absolute rotting destruction of all content on the internet and children growing up thinking it's normal to consume this watered-down, plagiarized, worthless content.
I took German in high school and cheated by inventing my own runic script. I would draw elaborate fantasy/sci-fi drawings on the covers of my notebooks with the German verb conjugations and whatnot written all over monoliths or knight's armor or dueling spaceships, using my own script instead of regular characters, and then have these notebooks sitting on my desk while taking the tests. I got 100% on every test, and now the only German I can speak is the bullshit I remember Nightcrawler from the X-Men saying. Unglaublich!
Good tl;dr
Unsurprised
I would have no problem with AI if the shit actually worked.
No, I think the point here is that the kids never learned the material, not that AI taught them the wrong material (though there is a high possibility of that).
See also: competitive cognitive artifacts. https://philosophicaldisquisitions.blogspot.com/2016/09/competitive-cognitive-artifacts-and.html
These are artifacts that amplify and improve our abilities to perform cognitive tasks when we have use of the artifact but when we take away the artifact we are no better (and possibly worse) at performing the cognitive task than we were before.
No shit
Duh
Maybe, if the system taught more of HOW to think and not WHAT to think. Basically, more critical thinking/deduction.
This same kind of topic came up back when I was in middle/high school, when search engines became widespread.
However, LLMs shouldn't be trusted for factual anything, same as Joe Blow's blog on some random subject. Did they forget to teach cross-referencing too? I'm sounding too bitter and old, so I'll stop.
Kids using an AI system trained on edgelord Reddit posts aren’t doing well on tests?
Ya don’t say.
TLDR: ChatGPT is terrible at math and most students just ask it for the answer. Giving students the ability to ask something that doesn't know math for the answer makes them less capable. An enhanced chatbot which was pre-fed with questions and correct answers didn't screw up the learning process in the same fashion, but it also didn't help them perform any better on the test, because again they just asked it to spoon-feed them the answer.
References:
ChatGPT’s errors also may have been a contributing factor. The chatbot only answered the math problems correctly half of the time. Its arithmetic computations were wrong 8 percent of the time, but the bigger problem was that its step-by-step approach for how to solve a problem was wrong 42 percent of the time.
The tutoring version of ChatGPT was directly fed the correct solutions and these errors were minimized.
The researchers believe the problem is that students are using the chatbot as a “crutch.” When they analyzed the questions that students typed into ChatGPT, students often simply asked for the answer.
Taking too many shortcuts doesn't help anyone learn anything.