
Microsoft’s Copilot has an oversharing problem. The company is trying to help customers fix it.

  • Microsoft released tools to address security issues with its AI assistant Copilot.
  • Copilot’s indexing of internal data led to oversharing of sensitive company information.
  • Some corporate customers delayed Copilot deployment due to security and oversharing concerns.

You know when a colleague overshares at work? It’s awkward at best.

Microsoft’s Copilot has been doing an AI version of this behavior, which has unnerved corporate customers so much that some have delayed deploying the product, as Business Insider first reported last week.

Now, the software giant is trying to fix the problem. On Tuesday, Microsoft released new tools and a guide to help customers mitigate a Copilot security issue that inadvertently let employees access sensitive information, such as CEO emails and HR documents.

These updates are designed “to identify and mitigate oversharing and ongoing governance concerns,” the company explained in a new blueprint for Microsoft’s 365 productivity software suite.

“Many data governance challenges in the context of AI were not caused by AI’s arrival,” a Microsoft spokesperson told BI on Wednesday.

AI is simply the latest call to action for enterprises to proactively manage their internal documents and other information, the spokesperson added.

These decisions depend on each company's unique situation. Factors such as industry-specific regulations and varying risk tolerance should inform them, according to the Microsoft spokesperson. For instance, different employees should have access to different types of files, workspaces, and other resources.

“Microsoft is helping customers enhance their central governance of identities and permissions, to help organizations continuously update and manage these fundamental controls,” the spokesperson said.

Copilot’s magic — its ability to create a 10-slide road-mapping presentation, or to summon up a list of your company’s most profitable products — works by browsing and indexing all of your company’s internal information, like the web crawlers used by search engines.

Historically, IT departments at some companies have set up lax permissions for who can access internal documents — selecting “allow all,” say, for the company’s HR software, rather than going through the trouble of selecting specific users.

That never created much of a problem, because there wasn’t a tool that an average employee could use to identify and retrieve sensitive company documents — until Copilot.

As a result, some customers have deployed Copilot, only to discover that it can enable employees to read an executive’s inbox or access sensitive HR documents.

“Now, when Joe Blow logs into an account and kicks off Copilot, they can see everything,” said one Microsoft employee familiar with customer complaints. “All of a sudden Joe Blow can see the CEO’s emails.”
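The failure mode described above can be sketched as a toy example. This is an illustration of the general "allow all permissions plus a search index" problem, not Microsoft's implementation; all names here are hypothetical:

```python
# Illustrative sketch only: a toy document index showing why permission
# scope matters once an AI assistant can search everything an account
# can technically reach. Not how Copilot actually works internally.
from dataclasses import dataclass, field


@dataclass
class Document:
    title: str
    body: str
    # An empty set mimics an "allow all" permission left by an IT department.
    allowed_users: set = field(default_factory=set)


class ToyIndex:
    def __init__(self, docs):
        self.docs = docs

    def search(self, query, user):
        """Return titles of matching documents the user is allowed to read."""
        results = []
        for doc in self.docs:
            readable = not doc.allowed_users or user in doc.allowed_users
            if readable and query.lower() in (doc.title + " " + doc.body).lower():
                results.append(doc.title)
        return results


docs = [
    # Misconfigured: "allow all", so any employee's query can surface it.
    Document("Q3 salary bands", "confidential HR data"),
    # Properly restricted: only the named user can retrieve it.
    Document("CEO inbox export", "acquisition plans", allowed_users={"ceo"}),
]
index = ToyIndex(docs)

# An ordinary employee's broad query surfaces the misconfigured HR file,
# but not the properly locked-down one.
print(index.search("salary", user="joe"))       # ['Q3 salary bands']
print(index.search("acquisition", user="joe"))  # []
```

The point of the sketch: the assistant adds no new permissions, it only makes every readable document trivially retrievable, so documents left on "allow all" are suddenly one natural-language query away.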

Are you a Microsoft employee or someone else with insight to share?

Contact Ashley Stewart via the encrypted messaging app Signal (+1-425-344-8242) or email (astewart@businessinsider.com). Use a nonwork device.

Read the original article on Business Insider



This article was originally published by Ashley Stewart at Business Insider: https://www.businessinsider.com/microsoft-copilot-oversharing-problem-fix-customers-2024-11

