
Samsung employees leak sensitive data on ChatGPT

Samsung allowed its semiconductor engineers to use ChatGPT in a professional setting, and three incidents of leaked sensitive data have since been documented.

It seems unlikely that Samsung will repeat the experiment. Samsung Semiconductor engineers began using ChatGPT to fix bugs in their source code and, in doing so, leaked sensitive information, reports the Korean outlet The Economist.

Internal data, such as meeting notes and figures related to performance and manufacturing yields, was included in ChatGPT queries. Three problematic uses of ChatGPT led to data leaks. The first involves an employee who allegedly sent the source code of a Samsung application to the AI.

The second involves an employee who entered test sequences used to identify defective chips and asked the AI to optimize them. A third used the Naver Clova tool to transcribe a meeting recording into a document, then asked ChatGPT to prepare a presentation from it.

3% of employees have disclosed information

According to The Economist, Samsung has blocked the use of ChatGPT in its workplaces for fear of further leaks of confidential internal information to external servers. The company also plans to develop its own ChatGPT-like AI service for internal use.

Above all, these AI systems are not bound by professional secrecy. A study conducted by the cybersecurity company Cyberhaven, published in February, found that 3% of employees across various companies had already submitted internal company information to ChatGPT. For these reasons, companies such as JPMorgan and Verizon block their employees' access to ChatGPT.

Author: margaux vulliet
Source: BFM TV

