this post was submitted on 10 Aug 2023
358 points (96.6% liked)

Asklemmy


A loosely moderated place to ask open-ended questions


If your post meets the following criteria, it's welcome here!

  1. Open-ended question
  2. Not offensive: at this point, we do not have the bandwidth to moderate overtly political discussions. Assume best intent and be excellent to each other.
  3. Not about using or supporting Lemmy: for context, see the list of support communities and tools for finding communities
  4. Not ad nauseam inducing: please make sure it is a question that would be new to most members
  5. An actual topic of discussion


founded 5 years ago

Just out of curiosity. I have no moral stance on it, if a tool works for you I'm definitely not judging anyone for using it. Do whatever you can to get your work done!

[–] flynnguy@programming.dev 68 points 1 year ago (4 children)

I had a coworker come to me with an "issue" he'd learned about. It was wrong, and it wasn't really an issue; then it came out that he'd gotten it from ChatGPT, didn't really know what he was talking about, and couldn't cite an actual source.

I've also played around with it and it's given me straight up wrong answers. I don't think it's really worth it.

It's just predictive text, it's not really AI.
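As a loose illustration of what "predictive text" means here, this is a toy bigram model in Python: it only counts which word follows which, which is orders of magnitude simpler than what ChatGPT actually does, and the training sentence is made up.

```python
from collections import Counter, defaultdict

def train_bigrams(text):
    """Count, for each word, which words were seen following it."""
    words = text.lower().split()
    following = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        following[prev][nxt] += 1
    return following

def predict_next(model, word):
    """Return the most frequent continuation seen in training, or None."""
    candidates = model.get(word.lower())
    return candidates.most_common(1)[0][0] if candidates else None

model = train_bigrams("the cat sat on the mat and the cat slept")
print(predict_next(model, "the"))  # "cat" followed "the" twice, "mat" once -> cat
```

A real LLM predicts over long contexts with a learned network rather than raw counts, but the "pick a likely continuation" framing is the same.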

[–] Echo71Niner@kbin.social 16 points 1 year ago (1 children)

I concur. ChatGPT is, in fact, not an AI; rather, it operates as a predictive text tool. This is the reason behind the numerous errors it tends to generate, and its lack of self-review prior to generating responses is the clearest indication that it is not an AI. You can identify instances where ChatGPT provides incorrect information, correct it, and within 5 seconds of asking again it repeats the same inaccurate information in its response.

[–] rbhfd@lemmy.world 25 points 1 year ago (1 children)

It's definitely not artificial general intelligence, but it's for sure AI.

None of the criteria you mentioned are needed for it to be labeled as AI. Definition from Oxford Libraries:

the theory and development of computer systems able to perform tasks normally requiring human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages.

It definitely fits in this category. It is being used in ways that previously, customer support or a domain expert was needed to talk to. Yes, it makes mistakes, but so do humans. And even if talking to a human would still be better, it's still a useful AI tool, even if it's not flawless yet.

[–] howrar@lemmy.ca 4 points 1 year ago (2 children)

It just seems to me that by this definition, the moment we figure out how to do something with a computer, it ceases to be AI because it no longer requires human intelligence to accomplish.

[–] jamesravey@lemmy.nopro.be 8 points 1 year ago

As Larry Tesler once said, "AI is whatever hasn't been done yet."

[–] Applejuicy@feddit.nl 1 points 1 year ago

I guess the word "normally" takes care of that. It implies a situation outside of the program in question.

[–] dbilitated@aussie.zone 5 points 1 year ago (1 children)

i think learning where it can actually help is a bit of an art - it's just predictive text, but it's very good predictive text - if you know what you need and get good at giving it the right input, it can save a huge amount of time. you're right though, it doesn't offer much if you don't already know what you need.

[–] 7bicycles@hexbear.net 3 points 1 year ago (1 children)

Can you hand me an example? I keep hearing this but every time somebody presents something, be it work related or not, it feels like at best it would serve as better lorem ipsum

[–] surrendertogravity@wayfarershaven.eu 4 points 1 year ago (1 children)

I’ve had good success using it to write Python scripts for me. They’re simple enough I would be able to write them myself, but it would take a lot of time searching and reading StackOverflow/library docs/etc since I’m an amateur and not a pro. GPT lets me spend more time actually doing the things I need the scripts for.
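For a sense of scale, the scripts in question are on the order of this hypothetical example (not one of the commenter's actual scripts): prefixing a folder's CSV files with today's date, the kind of glue code that is quick for GPT but slow to piece together from docs as an amateur.

```python
import shutil
from datetime import date
from pathlib import Path

def prefix_with_date(folder, pattern="*.csv"):
    """Rename files matching `pattern` so their names start with today's date."""
    today = date.today().isoformat()  # e.g. 2023-08-10
    renamed = []
    for path in Path(folder).glob(pattern):
        target = path.with_name(f"{today}_{path.name}")
        shutil.move(str(path), str(target))
        renamed.append(target.name)
    return renamed
```

Nothing here is hard, but checking `pathlib` globbing and `shutil.move` semantics is exactly the StackOverflow time the commenter is describing.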

[–] LordXenu@lemm.ee 1 points 1 year ago

I use it in web development by describing what I want something to look like and having it generate a React component based on my description.

Is what it gives me the final product? Sometimes, but it’s such a help to knock out a bunch of boilerplate and get me close to what I want.

Also generating documentation is nice. I wanted to fill out some internal wiki articles to help people new to the industry have something to reference. Spent maybe an hour having a conversation asking all of the questions I normally run into. Cleaned up the GPT text, checked for inaccuracies, and cranked out a ton of resources. That would have taken me days, if not weeks.

At the end of the day, GPT is better with words than I am, but it doesn’t have the years of experience I have.
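For anyone curious what that kind of request looks like programmatically, here is a hedged sketch: the message structure matches OpenAI's chat completions API, but the model name, prompt wording, and helper function are illustrative assumptions, not the commenter's actual setup.

```python
import json

def component_request(description):
    """Build a chat-API payload asking for a React component.

    The messages format follows OpenAI's chat completions API; the
    system prompt and model choice below are made-up examples.
    """
    return {
        "model": "gpt-3.5-turbo",
        "messages": [
            {"role": "system",
             "content": "You write clean, idiomatic React components."},
            {"role": "user",
             "content": f"Generate a React component that {description}."},
        ],
    }

payload = component_request("renders a sortable table of users")
print(json.dumps(payload, indent=2))
```

The returned boilerplate still needs the human review the commenter describes: checking it compiles, matches house style, and actually does what was asked.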

[–] EliasChao@lemmy.one 3 points 1 year ago

More often than not you need to be very specific and have some knowledge on the stuff you ask it.

However, you can guide it to give you exactly what you want. I feel like knowing how to interact with GPT is becoming similar to being good at googling stuff.

[–] idle@158436977.xyz 1 points 1 year ago

Isn't that what humans do too, and what makes us intelligent? We analyze patterns and predict what will come next.