Lien: AI Is Terrifyingly Competent

Photo by Shantanu Kumar: https://www.pexels.com/photo/chatgpt-webpage-open-on-iphone-16474955/

By Kayla Lien, Opinion Writer

 

Easy-to-use AIs like ChatGPT have sprung up in surprising numbers this past year. Instead of being locked behind paywalls, many are publicly accessible and free. This is in addition to voice-operated AI systems, such as Siri, Alexa and Google, that are already in use and quickly becoming obsolete by comparison.

AI usage allows for increased productivity, but it carries potentially large ramifications for writers.

ChatGPT owes its conception to the software company OpenAI, which also produced DALL-E last year. DALL-E allows users to turn text into images, while ChatGPT has the ability to interact “in a conversational way” via writing. For those who haven’t used the AI, it’s an incredibly simple interface. All a person needs to do is ask the chat a question and in seconds ChatGPT will load the answer with easily understandable language.

For example, when asked what ChatGPT can do for news writers, the system gave me a succinct list that included research assistance, fact-checking and automated news generation, among others. The list even provided the reasoning behind each benefit. Its output reads remarkably like factual articles written by real people, almost unnervingly so. When a user asks for sources, it will load up an itemized list of websites and books, along with how and why each particular source was used.

ChatGPT can write news articles and content pieces. It can write in-character — though, as I’ve been told, bland — fanfiction. It can even diagnose patients and pass a medical licensing exam. It has knowledge of pop culture and can easily find answers to obscure questions. “What is the trash monster that tried eating Luke Skywalker?” Apparently, it’s called a Dianoga, and was given the name “Omi” in a non-canonical Star Wars novel.

ChatGPT can also be used in creative writing by providing suggestions and prompts for writers. It can power chatbots to reply to customers, gather information for researchers, translate texts from one language to another and act as a personal assistant.

This brings up issues of plagiarism and holding individuals accountable. If a student uses the AI to write an essay, it's difficult to prove the work isn't their own, since it is almost impossible to recreate the exact circumstances in which a given response was generated. To alleviate the issue, Edward Tian, a 22-year-old Princeton University student, has created an app that should "quickly and efficiently" tell if ChatGPT was used in writing an essay.

Alongside academic issues, ChatGPT raises legitimate ethical and social concerns. When we can outsource jobs to AI and receive free labor, it takes work away from the skilled individuals who previously held those jobs. Additionally, chatbots are not always accurate and can give wrong information. Though they can relay information convincingly, that doesn't mean they're correct. This is incredibly dangerous for doctors using them to help diagnose patients.

I'm terrified of what ChatGPT means for my future. I'm nearly finished with a degree in journalism, and I've realized just how oversaturated the workforce is, even before factoring in AI. I worry about artificial intelligence putting me out of a job now that employers can automate content production. For them, it's an easy decision to make: with automated content, there's no need to worry about human error anymore.

However, while it is capable of expressing opinions, ChatGPT doesn’t have the actual voice of a person writing it. AI doesn’t have emotions about things the way human writers do. It lacks the human experience. Sure, it can write a poem that makes sense and follows a standard poetic format, but the AI doesn’t have the range of emotions necessary to understand what it’s writing. It has the ability to write about love and family and what it is to be alive, but it’s just following patterns recognized in pre-existing pieces with the same topics. It can only draw on what’s already out there, and cannot theorize or create new pieces without other publications and information to use as inspiration. That’s a small piece of solace that I’m hanging onto.

AI is a powerful tool for expanding one's understanding of a particular subject, but too much of a good thing can become a bad thing. While ChatGPT is useful in a lot of ways, there are some jobs that cannot and should not be automated, jobs that need the full range of the human experience. With AI becoming more accessible as even platforms such as Snapchat adopt it, it's important to make your own decisions about how much technological interference is too much. Where we draw the line is an individual choice, but a necessary one to make.

 

[email protected]

@kaylahlien