Should you be using ChatGPT? Experts say ‘yes,’ but don’t confuse it with a friend

by Alena Kuzub
May 16, 2023

Photo by Matthew Modoono/Northeastern University

When research and development company OpenAI released ChatGPT late last year, it instantly attracted the attention of the media and the general public. In just two months, the artificial intelligence software, a chatbot that can process natural human language and generate answers, reached 100 million monthly active users, beating TikTok (nine months), Instagram (20 months) and Uber (70 months) to that milestone.

“AI will become more and more ubiquitous. This trend cannot be stopped,” says Dakuo Wang, an associate professor at Northeastern University whose research lies at the intersection of human-computer interaction and artificial intelligence.

Wang advises everyone who hasn’t done so already to try the public version of ChatGPT. “It’s called AI literacy,” Wang says. Everyone needs to start learning what AI can and cannot do, he says, and playing with ChatGPT can help relieve people’s concerns about AI taking over the world or eliminating jobs. AI will change people’s lives and jobs, Wang says, but it will not replace humans. Instead, more productive people who embrace AI will replace less productive people who are reluctant to learn how it works.

So, how can an average person use ChatGPT or other chatbots like Replika? Let’s start with the less obvious uses.

Yes, you can have a conversation with chatbots

When the public first got access to ChatGPT, people quickly started having personal conversations with it. In one case, the chatbot professed love to a New York Times reporter. Replika is presented as a safe space to share thoughts, feelings and dreams with an AI companion that helps users learn more about themselves. The chatbot was created by Russian-American entrepreneur Eugenia Kuyda so that she could speak to a deceased friend.

But should I be sharing my deep secrets with AI?
Is it healthy to have in-depth conversations with a chatbot when you are feeling lonely? Robert Volpe, professor and chair of the department of applied psychology at Northeastern, says talking to a chatbot can be therapeutic because it allows people to speak their minds, and it comes across as empathetic.

“If the chatbot asks you how your day was today and you say ‘It is really crummy,’” Volpe says. “And [then the chatbot says] ‘Oh, tell me about that.’ The part of just saying it out loud is therapeutic.”

Remember, chatbots are not your friend

When someone begins to perceive a chatbot as a friend, they risk developing a strong attachment to the “cold machine.” “It’s the way we’re wired,” Volpe says. Users should also be aware that chatbots can reinforce wrong ideas or affirm everything a person tells them. For that reason, Volpe believes chatbots should not tell people how to behave or give advice to someone who is experiencing abuse or suicidal ideation; in those cases, people should reach out to human professionals.

ChatGPT can improve your writing

So, chatbots can’t be your friend, but ChatGPT could be better than a friend when it comes to editing written work. Used properly, ChatGPT can shorten a paper, rewrite passages, check grammar, and improve or change the style of a text. Users will need to give the chatbot a detailed prompt describing what they want it to do, however; only rarely, Wang says, will the first prompt produce exactly the response a user wants.

ChatGPT is not good for fact-checking

Users should keep in mind that ChatGPT is not reliable for fact-checking. First, its knowledge is currently limited to information that existed before 2021. Second, it is intrinsically different from search engines like Google. Search engines index every source and piece of information, Wang says, while AI-based chatbots were inspired by the human brain.
Just like the human brain has 86 billion neurons, AI models like ChatGPT integrate hundreds of billions of parameters and are pre-trained on massive text datasets. The brain comes up with a response to a question based on a person’s knowledge, memory and perception, not by consulting an encyclopedia in one’s head. Similarly, ChatGPT does not rely on any specific source but generates a coherent, contextually relevant response by predicting what the next word will be.

AI tools are useful for sketching and brainstorming

AI tools are another way of interacting with computers, and they can be quite useful in a learning setting, says Rahul Bhargava, assistant professor of art, design and journalism at Northeastern. He has introduced his students to ChatGPT and DALL-E, another AI system from OpenAI that can create realistic images and art from a natural-language description. DALL-E is good for creating sketches, while ChatGPT can help students brainstorm ideas. For example, Bhargava and his students used ChatGPT to design miniature tabletop golf models, asking the chatbot what interactive components a “Tom and Jerry” cartoon-inspired course should include. “The results were on-point for a first round of ideas,” Bhargava says, “and it got students into a brainstorming mindset.”

AI assistants can teach you how to code

AI coding assistants like GitHub Copilot, owned by Microsoft, or CodeGPT are useful for people whose goal is not to become software developers or computer scientists, Bhargava says. With AI assistants, his students learn to build modules of a program that can later be stitched together, and the assistants can generate example code much closer to what a user actually wants to build, whether that is an interactive news story or an animated on-screen interaction for a project.

It can help you plan a party or meal

If all else fails, you can ask ChatGPT to help you plan your next party or meal.
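Before moving on, the “predicting the next word” behavior Wang describes can be made concrete with a toy sketch. This is emphatically not ChatGPT’s actual architecture — real models use neural networks with billions of learned parameters — but a minimal word-level bigram model, with the tiny training text and all function names invented for illustration, shows why generated text reads fluently without being tied to any checkable source:

```python
from collections import defaultdict

# Toy illustration (NOT ChatGPT's real architecture): generate text by
# repeatedly predicting the next word from patterns seen in training
# text, instead of looking a fact up in an index.

def train_bigrams(corpus):
    """Count how often each word follows each other word."""
    counts = defaultdict(lambda: defaultdict(int))
    words = corpus.split()
    for current, following in zip(words, words[1:]):
        counts[current][following] += 1
    return counts

def generate(counts, start, length=8):
    """Emit words one at a time, always taking the most likely continuation."""
    word = start
    out = [word]
    for _ in range(length):
        followers = counts.get(word)
        if not followers:
            break  # no continuation ever observed for this word
        # pick the most frequent next word (a real model instead samples
        # from a learned probability distribution over a huge vocabulary)
        word = max(followers, key=followers.get)
        out.append(word)
    return " ".join(out)

model = train_bigrams("the cat sat on the mat and the cat ran")
print(generate(model, "the", length=4))  # → the cat sat on the
```

The output is plausible-sounding text assembled purely from statistical patterns, which is exactly why, as the article notes, fluency is no guarantee of factual accuracy.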
Ask the chatbot for step-by-step actionable instructions, Wang says, and it will come up with a list of steps one should take to make a meatloaf or organize a fun first birthday party.

Alena Kuzub is a Northeastern Global News reporter. Email her at firstname.lastname@example.org. Follow her on Twitter @AlenaKuzub.