Apple is expected to go all in with artificial intelligence this year. Now the company has been granted a patent (number US 11907823 B2) for a “Customizable Chip for AI Applications.”
Apple is quietly working on artificial intelligence (AI) tools that could challenge those of OpenAI Inc., Alphabet Inc.'s Google, and others, but the company has yet to devise a clear strategy for releasing the technology to consumers, according to Bloomberg's Mark Gurman in a July 2023 report.
The company has built its own framework to create large language models — the AI-based systems at the heart of new offerings like ChatGPT and Google's Bard — the article adds. With that foundation, known as "Ajax," Apple also has created a chatbot service that some engineers call "Apple GPT," according to Gurman. Since the tech giant makes its own A-series and M-series processors for iPhones, iPads, Macs, and other devices, making a customizable chip for AI applications seems a likely next step.
Apple's patent involves neural networks. In information technology, a neural network is defined as a system of programs and data structures that approximates the operation of the human brain. A neural network usually involves a large number of processors operating in parallel, each with its own small sphere of knowledge and access to data in its local memory. Apple's patent covers a neural network implemented as a customized integrated circuit.
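To make that definition concrete, here is a minimal sketch of the idea in Python. It is purely illustrative — not Apple's design — but it shows the two properties described above: each "neuron" holds its own small set of local parameters, and the neurons in a layer are evaluated independently, which is what lets hardware run them in parallel.

```python
import math

def neuron(inputs, weights, bias):
    # Each neuron has its own local parameters (weights, bias) --
    # its "small sphere of knowledge."
    activation = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-activation))  # sigmoid squashing

def forward(inputs, layer):
    # A layer is a list of (weights, bias) pairs. Each neuron is
    # evaluated independently, so hardware can compute them in parallel.
    return [neuron(inputs, w, b) for (w, b) in layer]

# Tiny two-layer network with hand-picked, illustrative parameters.
hidden_layer = [([0.5, -0.2], 0.1), ([0.3, 0.8], -0.4)]
output_layer = [([1.0, -1.0], 0.0)]

h = forward([1.0, 2.0], hidden_layer)
y = forward(h, output_layer)
```

In a software implementation the layers run one after another on a CPU or GPU; the point of a custom chip is to lay these computations out physically so they happen in parallel, close to the data they need.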
Example applications include security, access control, identification/authentication, machine vision, artificial intelligence, engineering, manufacturing, robotics, systems control, autonomous vehicles, and other situations involving some form of object or pattern recognition, object detection, or automated decision-making based on an image.
Here’s Apple’s (very technical) abstract of the patent: “In one embodiment, a computing device includes an input sensor providing an input data; a programmable logic device (PLD) implementing a convolutional neural network (CNN), wherein: each compute block of the PLD corresponds to one of a multiple of convolutional layers of the CNN, each compute block of the PLD is placed in proximity to at least two memory blocks, a first one of the memory blocks serves as a buffer for the corresponding layer of the CNN, and a second one of the memory blocks stores model-specific parameters for the corresponding layer of the CNN.”
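The abstract's key structural idea — one compute block per convolutional layer, each paired with two nearby memory blocks (one as a buffer, one holding that layer's parameters) — can be sketched as a simple data model. This is an assumption-laden illustration of the layout the abstract describes, not Apple's actual implementation; all names here are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class MemoryBlock:
    purpose: str                                  # "buffer" or "parameters"
    contents: list = field(default_factory=list)  # data stored in the block

@dataclass
class ComputeBlock:
    layer_index: int           # which convolutional layer this block implements
    buffer: MemoryBlock        # first memory block: activations for this layer
    parameters: MemoryBlock    # second memory block: model-specific weights

def build_pld(num_layers):
    # One compute block per CNN layer, each placed "in proximity" to its
    # own buffer and parameter memory, as the patent abstract describes.
    return [
        ComputeBlock(i, MemoryBlock("buffer"), MemoryBlock("parameters"))
        for i in range(num_layers)
    ]

pld = build_pld(3)
```

Pairing each layer with its own local memory is a common strategy in ML accelerators: it keeps weights and intermediate activations physically close to the logic that uses them, reducing the off-chip memory traffic that usually dominates power consumption.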