- Microsoft officially launched its Copilot for Office 365 earlier this month.
- The tool can go to meetings for you, write emails, and summarize documents.
- While Copilot has obvious timesaving advantages, it’s not risk-free.
Microsoft officially launched its Copilot for Office 365 to users earlier this month.
The AI-productivity tool can summarize documents, write emails, and go to meetings for you. It’s designed to be rolled out across an organization and links to Microsoft apps, including Word, Excel, PowerPoint, Outlook, and Teams.
So far, it’s proved popular, so I headed down to Microsoft’s UK offices to try the tool out for myself.
First off, Copilot is surprisingly easy to use. The tool adopts a casual chatbot style, so it’s easy to prompt and accessible even to non-techie workers.
Workers can ask the bot to summarize long documents or turn them into PowerPoints or spreadsheets with a few clicks. It can also reply to emails and scan through your inbox, pulling out the most important messages and drafting responses.
Copilot can also read through long email chains or Teams chats and summarize the main points.
Workers can even send the bot to meetings for them. When enabled on a Teams call, Copilot can record meetings, provide a transcript, and separate the subjects discussed into clips called “chapters.”
The tool could be a game-changer for busy, email-laden workers and help them get up to speed fast.
It doesn’t have much personalization yet, but Microsoft is working on it. The company plans to launch a feature called “Sounds like me,” which lets Copilot mimic a user’s voice and tone in emails and messages.
The tool itself is powered by OpenAI’s GPT and uses the AI company’s DALL-E 3 for image generation. The chatbot feature is similar to Bing Chat, which has been known to make mistakes.
Copilot has obvious timesaving advantages, but it’s not risk-free.
The safest way to use AI is to treat it like a new and inexperienced intern. Microsoft said the tool is aimed at helping workers get to a first draft faster and encourages users to edit and check the final work.
While Microsoft has taken strides to combat AI hallucinations, including adding footnote references in the bot’s responses that link to the source of the information, the tech isn’t perfect. Microsoft encourages companies to train their workers not to treat AI like an infallible search tool.