Samsung Reportedly Bans Use of Generative AI Tools – CNET

Samsung has reportedly banned employees from using popular generative AI tools such as ChatGPT, Google Bard and Bing, citing security concerns.

The Korea-based company notified employees at one of its biggest divisions of the new policy on Monday, Bloomberg reported. The ban stems from concern that data uploaded to AI platforms is stored on external servers and could end up being disclosed to others, Bloomberg reported.

“Interest in generative AI platforms such as ChatGPT has been growing internally and externally,” Samsung told staff. “While this interest focuses on the usefulness and efficiency of these platforms, there are also growing concerns about security risks presented by generative AI.”

Generative AI captured the public’s attention with November’s launch of OpenAI’s ChatGPT, a chatbot built on a powerful AI engine that can write software, hold conversations and compose poetry. Microsoft is employing ChatGPT’s technology foundation, GPT-4, to boost Bing search results, offer email writing tips and help build presentations.

The new policy comes amid heightened concern over the risk associated with AI. In March, hundreds of tech executives and experts in AI signed an open letter urging leading artificial intelligence labs to pause development of AI systems, citing “profound risks” to human society.

The move follows an incident in which Samsung engineers accidentally leaked internal source code by uploading it to ChatGPT, the memo said.

“HQ is reviewing security measures to create a secure environment for safely using generative AI to enhance employees’ productivity and efficiency,” the memo said. “However, until these measures are prepared, we are temporarily restricting the use of generative AI.” 

The new rules ban the use of generative AI systems on Samsung-owned computers, tablets and phones, Bloomberg reported.

Samsung didn’t immediately respond to a request for comment.

Editors’ note: CNET is using an AI engine to create some personal finance explainers that are edited and fact-checked by our editors. For more, see this post.
