Just recently, my dad introduced me to ChatGPT and said that talking with it is like talking to a human. I was intrigued, thinking about all the possibilities. Would it be like on TV, when the main character has a cool robot/AI best friend? Or maybe like HAL 9000 from 2001: A Space Odyssey, where the AI has a mind of its own and some evil plans? As soon as I could, I rushed to my computer to check it out, and lo and behold, it is futuristic in a creepy way.
ChatGPT, a new AI chatbot, can mimic human skills such as reading and writing, which I think makes it a good candidate to take over certain jobs. But just because an AI could do a human's job doesn't mean that the job would be done better. Or does it?
Artificial intelligence isn't new, of course. Alexa has been around since 2014 and has had its fair share of attention. Almost every house I've been to has at least one, whether in the kitchen, living room, bedroom or even bathroom. But after learning about ChatGPT, I don't find Alexa impressive anymore.
I first started playing around with ChatGPT by asking random questions to see how it would react. For example, I asked if it was my friend, if it knew who I was, and even how it felt about Alexa. To all of these questions, it gave me detailed answers: why it's not my friend (because it barely knows me), how it has no way of knowing who I am, and why it's incapable of hating Alexa, because it's just software.
As I was doing this, it felt like I was just having a conversation with a real person. One of the requests I made was for it to sing me a song, which was a feature Alexa already had. But to my surprise, the AI wrote a song on the spot. I had suspicions that it might've been pre-written, so I specifically asked it to write a song about spilling milk. I was absolutely flabbergasted when the AI started writing a ballad on the sadness and tragedy of spilt milk. It had a whole chorus, verses, and an outro. The song was created quickly and was surprisingly good (it's not Taylor-Swift-level, but it's better than some songs I've heard that were composed by humans). It was almost scary. Discovering the limits of ChatGPT's knowledge and abilities is like chatting with a friend and realizing they know more about you than you thought they did.
I continued to mess around with it and discovered that it can help you write lots of things, like birthday cards, letters, poems, and even team chants. I experimented with the AI by telling it to write a birthday card for my friend named BFF (a fake name, for the purpose of this example). I then proceeded to feed it fake information about this "BFF," including favorite characters, favorite foods, their best personality traits and how old they were turning. In less than 30 seconds, it wrote me a card to give to "BFF," perfectly incorporating all the information I gave it. And unless I told "BFF" that an AI wrote their card, they would have no way of knowing. I was skeptical, however, that the AI might have just dropped the information into a template. So I copied the information I gave earlier for the birthday card, refreshed my browser, pasted it back into the AI text box and got a different card. It still had all the same information, but it was written differently from the card before, which cleared up my suspicion.
But songs written by software and birthday cards aren't that revolutionary. What truly are the limits? Could it compose a novel? Perhaps even a whole movie script? Shockingly, the answer is yes. I fed the AI the logline of a movie that I was producing. My jaw dropped to the earth's core when, without hesitation, ChatGPT started writing a whole movie screenplay. It didn't write it the way that I would have, however, so rather than repeating the logline with a couple of notes about the story, I just told the AI to rewrite the script with my notes in mind. The AI can remember everything said (or typed) in the current conversation, like a person, so there's no need to repeat the same prompt 100 times until you receive the desired outcome.
After seeing ChatGPT write a screenplay, I woke up to the harsh reality that AI could start replacing humans in skill-based jobs. Writers could be replaced, of course, but there are definitely more jobs that could be in jeopardy. My prediction is that therapists, teachers, and anyone else whose job requires talking to people or writing could be replaced by this AI, considering how it can talk and write like a human. Therapists could be the first to go, seeing how in demand they probably are with depression rates rising in the US. If you have the AI read a book on therapy, I think it could know everything a therapist needs to get licensed. And it's the same with teachers. Give the AI all the information it would learn in a college class about teaching, give it a lesson plan, and boom. Especially since a lot of teachers have been leaving the profession.
But AI can’t replace all jobs that require talking and writing, and it still has its limitations. It can’t replicate or feel human emotions like we can. It doesn’t have common sense, so it can give responses that seem crazy in a specific context. It can create biased responses because of the information it’s been trained on, and it can’t learn from experience and adapt to new situations. At least, that’s what the AI told me its limitations are.
ChatGPT is fairly new and has only been around since November 2022, but why was it created? The great thing about writing about this AI is that I can do most of the research by talking to it as if it's a person and I'm interviewing it.
ChatGPT was created to be used in many different ways, like language translation, content creation and customer service chatbots, with the hope that it would help advance artificial intelligence development. However, it's time to get real. Because ChatGPT can do so many things that humans do and get paid for, it's most definitely going to create controversy. And because the AI lacks emotional intelligence, I hope no one decides that it should replace a therapist.
Therapy can be such an emotional and vulnerable thing for a lot of people, and one thing humans do really well is show empathy and understanding. When people talk to other people about issues in their lives, there is compassion and connection, because for the most part, a lot of us have had similar experiences. If someone feeling sad is talking to a human therapist, they know their therapist might have had the same feelings, understands the place they're in, and can show real empathy. But I feel like talking to an AI as if it were a therapist would not be as helpful or constructive. If you have had a bad day and it's therapy day, are you going to want to talk to the human therapist who has also had bad days and can talk you through it, or the AI therapist who has never experienced human error or a bad day?
Because of AI's lack of human feelings, anything written by the likes of ChatGPT – like songs, stories and screenplays – wouldn't feel as personal or emotive as human-written material. If all of Taylor Swift's songs were written by an AI, I guarantee they wouldn't be as relatable. It would feel the way it does when a TV show's Gen Z character is obviously written by a millennial and you can tell from the character's lackluster humor. Besides, imagine the jobs that would be lost if movie studios and writers started shifting to AI to create stories.
Even though AI can write a whole story, that doesn't mean it's going to be good. In fact, it will most likely have tons of flaws due to the AI's lack of common sense. But seeing how the quality of movie plots is slowly decreasing, movie studios might not even notice and just keep saving money on the writers they aren't hiring.
Imagine how bland a story would be if an AI wrote it. What if all the Harry Potter books were written by an AI? There wouldn't be made-up spells and words like "muggle" and "Expelliarmus," so there would be no imagination, per se. Could an AI write Avatar the way it is now? I don't think it could even come up with the logline for it. You can't create a planet with aliens that speak a different language and have different ways of life without imagination. Franchises like Avatar and Harry Potter would not have the same magic if they were written by a robot. I hope movie studios think before using AI to write their screenplays.
That being said, there is one thing I think AI can be used for – educational purposes. Don't understand the homework questions? Ask the AI (just don't cheat). Don't understand a language? Ask the AI. Imagine if kids were raised playing with an AI instead of watching Cocomelon. Maybe future generations would actually have a chance at being successful adults instead of screen-addicted zombies.
ChatGPT could seriously be the next Google or Alexa, and I think that everyone should familiarize themselves with it, even if it's just for 10 minutes. But while ChatGPT is awesome and can do so many things a person can, there is no reason why it should replace any jobs any time soon. Just because an AI technically has the skills needed to perform a job doesn't mean it's going to do that job better than a human; it might even do it worse.