Apple ‘restricts’ internal use of ChatGPT and GitHub’s Copilot over data-leak risk

New Delhi: Apple has reportedly restricted internal use of OpenAI’s ChatGPT and GitHub’s Copilot over concerns that its confidential data could end up with the developers who train AI models on user data. According to the Wall Street Journal, the iPhone maker is “afraid workers might leak confidential data” as it develops similar technology of its own.

Apple has restricted the use of ChatGPT and other external AI tools to certain employees “as Apple develops its own similar technology,” according to documents reviewed by the WSJ.

The tech giant is developing its own generative AI model but has not yet broadened its use, according to the report. In March, The New York Times reported that several Apple teams, including the team working on Siri, were experimenting with language-generating AI.

ChatGPT was reportedly on Apple’s restricted software list for months.

Samsung has also reportedly blocked the use of generative AI tools such as ChatGPT on company-owned devices, as well as on personal devices connected to its internal networks.

The South Korean giant reportedly plans to develop its own artificial intelligence tools in-house for “software development and translation.” The decision apparently followed the accidental leak of sensitive Samsung data to ChatGPT last month.

Organizations including Bank of America, Citi, Deutsche Bank, Goldman Sachs, Wells Fargo, JPMorgan, Walmart, and Verizon have also barred their employees from accessing ChatGPT.


Jordan Carlson

