Apple has restricted employee use of ChatGPT and other external artificial intelligence tools while it develops similar technology of its own, according to The Wall Street Journal (a subscription is required to read the article).
The report says the tech giant is concerned that such AI tools could leak its confidential data. In addition to ChatGPT, Apple has barred staff from using GitHub Copilot, a tool that autocompletes code as developers write it.
In March, The New York Times reported that Apple is testing AI features similar to ChatGPT that could eventually come to Siri (a subscription is also required to read that article).
Apple engineers, including members of the Siri team, have reportedly been testing language-generation concepts "every week" in response to the rise of chatbots like ChatGPT, the article adds. ChatGPT is an artificial intelligence chatbot developed by OpenAI and launched in November 2022. It is built on top of OpenAI's GPT-3.5 family of large language models and has been fine-tuned using both supervised and reinforcement learning techniques.