New York (CNN) —
Artificial intelligence chatbots are being touted as productivity tools for consumers. For example, they can help you plan a trip or give you advice on writing a confrontational email to your landlord. But they often sound formal, oddly stubborn, or downright weird.
And despite the proliferation of chatbots and other AI tools, many people still struggle to trust them, or don’t necessarily want to use them every day.
Microsoft is currently trying to solve this problem by focusing not only on what chatbots can do for users, but also on the “personality” of the chatbot and how it makes users feel.
Microsoft on Tuesday announced a major update to its AI system, Copilot, that it says is the first step toward creating an “AI companion” for users.
The updated Copilot adds new features such as real-time voice interaction and the ability to interpret images and text on a user’s screen. Microsoft also says it is one of the fastest AI models on the market. But the most important change, according to the company, is that the chatbot “interacts with users with a warm tone and distinctive style, providing not only information but also encouragement, feedback, and advice as they overcome the challenges of everyday life.”
This change could help Microsoft’s Copilot stand out in a growing sea of general-purpose AI chatbots. When Microsoft announced Copilot (then called Bing) early last year, it was seen as a leader among big tech companies in the AI arms race. But over the past 18 months it has been overtaken by competitors offering new features, such as bots that can hold voice conversations and AI integrated (albeit imperfectly) into tools people already use regularly, like Google Search. With Tuesday’s updates, Copilot has caught up on some of those features.
As I tried out the new Copilot Voice feature at Microsoft’s launch event on Tuesday, I asked for advice on how to support a friend who is about to have her first baby. The bot returned practical tips, like offering a meal or running an errand, but it also offered more empathetic advice.
“That’s exciting news!” said a cheerful male voice the company calls “Canyon.” (Copilot is designed to subtly reflect the user’s tone.) “It means a lot to be there for her emotionally. Listen to her, reassure her, be her cheerleader… and don’t forget to celebrate this moment with her.”
The Copilot updates reflect Microsoft’s vision of how the public will use AI as the technology evolves. Microsoft AI CEO Mustafa Suleyman argues that people need AI to be more than a productivity tool; they need it to be a kind of digital friend.
“In the future, the first thing I’ll think will be, ‘Hey, Copilot,’” Suleyman said in an interview with CNN ahead of Tuesday’s announcement.
“You’ll have an AI companion that will remember things, buy things for you, book things for you, help you plan, teach you things… it’ll boost your confidence, it’s going to be there to cheer you on, you know?” he said. “It’s going to be on all your devices, in your car, in your home, present on so many surfaces, and it will really start to live with you.”
Early versions of Microsoft’s AI chatbot received backlash for unexpected tone shifts and sometimes downright disturbing responses. The bot would start a conversation sounding empathetic but could become brash or rude over a long exchange. In one instance, the bot told a New York Times reporter that he should leave his wife because “I just love you and want you to love me.” (Microsoft has since limited the number of messages users can exchange with the chatbot in a single session to prevent such responses.)
Some experts have raised broader concerns about people developing emotional attachments to all-too-human bots at the expense of real-world relationships.
To address these concerns while developing Copilot’s personality, Microsoft assembled a team of dozens of creative directors, linguists, psychologists, and other non-technical workers to interact with the model and give feedback on how it should ideally respond.
“We actually created an AI model that is designed for conversation, so it feels more fluent and more friendly,” Suleyman told CNN. “It has a real energy… like, it has a personality. It can push back at times and be a little funny, but it’s very suited to this long-term conversational exchange rather than a question and answer.”
Suleyman added that if you tell the new Copilot that you love it and want to marry it, it will understand that it shouldn’t play along, and will politely and respectfully remind you that that’s not what it’s there for.
And to avoid the kind of criticism OpenAI drew over its chatbot’s voice resembling that of actor Scarlett Johansson, Microsoft paid voice actors to provide training data for four optional voices intentionally designed not to imitate famous people.
“Imitation leads to confusion. These things are not human and should not try to be human,” Suleyman said. “They should give us enough of a sense of distance, yet be comfortable and fun and approachable and easy to talk to. That boundary is how we build trust.”
Building on its voice capabilities, the new Copilot will offer a “Daily” feature that reads users a summary of the day’s weather forecast and news updates, in partnership with news organizations such as Reuters and the Financial Times.
Microsoft is also embedding Copilot in its Edge browser. Users who need an answer to a question or a text translated can chat with the tool by typing @copilot into the address bar.
Power users who want to try features still in development can access what Microsoft calls “Copilot Labs.” There, the company says, they can test new features including “Think Deeper,” which can reason through more complex questions, and “Copilot Vision,” which can see what’s on the computer screen in order to answer questions and suggest next steps.
Following backlash over privacy risks with Recall, a similar AI tool it released for Windows earlier this year, Microsoft says Copilot Vision sessions are entirely opt-in and that the content the tool sees is neither stored nor used for training.