Apple has restricted employee use of ChatGPT and other external artificial intelligence tools as it develops its own similar technology, according to The Wall Street Journal (a subscription is required to read the article).
The report says the tech giant is concerned that AI tools could leak the company’s confidential data. In addition to ChatGPT, Apple has barred staff from using GitHub’s Copilot, a tool that helps write code with autocompletion.
In March it was reported that Apple is testing ChatGPT-style artificial intelligence features that could eventually come to Siri, according to The New York Times (a subscription is also required to read this article).
Apple engineers, including members of the Siri team, have reportedly been testing language-generation concepts “every week” in response to the rise of chatbots like ChatGPT, the article adds. ChatGPT is an artificial intelligence chatbot developed by OpenAI and launched in November 2022. It is built on top of OpenAI’s GPT-3 family of large language models and has been fine-tuned using both supervised and reinforcement learning techniques.