This post originally appeared on News@Northeastern. It was published by Cody Mello-Klein.
At the tail end of 2022, OpenAI released ChatGPT, a chatbot powered by the company's AI model that set more than just the internet on fire.
It's a well-worn idea: Type in a prompt, get a response. While the underlying technology had been around for a while, its latest version was advanced enough to be useful in a wide range of applications, from generating code to writing essays, poems and jokes. It was quickly hailed as a "tipping point for AI," and Microsoft recently invested billions in OpenAI.
It is also a tool with significant limitations. Some people praised it for how it could completely change the way we work; others feared it could spell the end of jobs in creative fields and certain kinds of education. As with AI in general, the story is complicated.
Regardless of where they come down on ChatGPT, most people have been obsessing over what it can produce. But Laura Huang, a distinguished professor of management and organizational development at Northeastern, says the future of tools like ChatGPT rests on the other side of the process.
“Most people are focused on the output of this,” Huang says. “Very few people are focused on the input. You don't magically just get something.”
Tinker enough with ChatGPT and it's clear the tool is far from perfect. It can create surprisingly inventive pieces of writing, but also flat-out incorrect responses. The power lies not in the tool itself, Huang says, but in how people are able to interact with it. ChatGPT is only as creative as the prompt it gets, like the user who asked it to write a "biblical verse in the style of the King James Bible explaining how to remove a peanut butter sandwich from a VCR."
Huang, who recently joined the Northeastern faculty after five years as an associate professor at Harvard Business School, compares ChatGPT to a more advanced form of Google and other search engines. But ChatGPT goes a few steps further than traditional search engines. While Google or Wikipedia provide raw information, ChatGPT can provide direct comparisons between different sets of data. It can, for example, summarize how the styles and upbringings of Notorious B.I.G. and Tupac differ.
Huang postulates that the next evolution of the technology is to personalize those results based on the user's interactions, personality and background. But getting to that point requires what she calls "the jobs of the future": a new kind of professional who knows how to communicate with AI in order to get the most valuable information. In other words, an AI whisperer.
“There are going to be huge, huge opportunities for people who know how to write prompts, writing specific prompts, writing them in a certain way, knowing how to massage and analyze what comes out of things like ChatGPT or the next generation of ChatGPT,” Huang says.
“If you don't ask the right questions, you're not going to get useful answers in life,” Huang adds. “You can't solve the problem unless you actually know how to articulate the problem, and the same applies here. You can't actually get things that are useful unless you understand what's not useful and how to manipulate what's actually going in and what's coming out.”
As tools like ChatGPT become more pervasive, there will be a need for specially trained workers who understand how to manipulate the technology and sidestep the biases built into it. Huang says it's a new form of communication. As happened with email and social media, people will become literate in these new tools and develop a kind of etiquette.
The only difference here is that AI whisperers will need to negotiate conversations that have a third, invisible participant.
“It's no longer just about communicating from one person to another or even interpersonally, from A to B and B to A,” Huang says. “Now it's about being able to communicate in this invisible triadic way where we've got the A, the B, back to the A and we've got all of these invisible Cs everywhere that we need to also be taking into account and communicating in a way that accounts for those thoughts and behaviors.”