Apple has yet to announce a significant generative artificial intelligence product, unlike its competitors, who have rushed out applications with varying degrees of success. This has sometimes been interpreted as a sign that the company is lagging behind. Apple has cautiously stated that it wants to avoid the technology's notorious flaws. One of those concerns is generative AI's dependence on cloud processing. To address it, Apple is working on artificial intelligence models that run on the device itself rather than in the cloud on the iPhone.
Artificial intelligence will run directly on the iPhone
Apple's hiring records, previous disclosures and analyst commentary suggest that the company will announce the results of its generative AI investments this year. Apple's future products will likely run generative AI models using onboard hardware rather than cloud services.
Unlike OpenAI, Microsoft, Google and Adobe, Apple has been cautious about releasing generative AI chatbots or image generators, but it has made significant investments in the sector. A report published last September showed that the company was spending millions of dollars a day on numerous AI projects spanning text, voice and images. Apple CEO Tim Cook acknowledged the company's investments last May, saying he wanted to avoid the pitfalls that early pioneers like Google and OpenAI faced.
Wedbush Securities analyst Daniel Ives told the FT that he expects Apple to take a significant step forward in AI this year. In addition, Morgan Stanley reported that deep learning was mentioned in almost half of Apple's AI job adverts, indicating an aggressive push into the sector.
Morgan Stanley also expects iOS 18, which Apple will introduce at WWDC in June, to focus on artificial intelligence. The company's largest in-house LLM, internally called "Ajax GPT", could become the engine behind an improved version of its virtual assistant Siri. The AI features of iOS 18 are expected to arrive with the iPhone 16 family, due to be introduced in September.
Apple's AI models could benefit from the Neural Engine, the neural processing unit (NPU) in its latest chips such as the M3 and A17 Pro. An NPU similarly underpins Intel's recent push into AI computing with its Meteor Lake laptop processors.
However, the clearest indication of Apple's goal of running LLMs on its devices is the AI study the company released earlier this month, which proposed using flash memory to serve model weights on devices with limited RAM. Samsung, meanwhile, recently unveiled the Galaxy S24 series, which uses built-in AI hardware for image editing, real-time text translation, search and other tasks.
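The core idea in Apple's flash-memory study is to keep model weights in flash storage and page only the parameters needed at each step into RAM, instead of holding the whole model in memory. A minimal sketch of that idea, using a memory-mapped file to stream one layer's weights at a time (the layer sizes, counts and file name here are hypothetical, not from Apple's paper):

```python
import numpy as np

HIDDEN = 64      # hypothetical layer width
NUM_LAYERS = 4   # hypothetical layer count

# Simulate "flash": store all layer weights in one file on disk.
weights = np.random.rand(NUM_LAYERS, HIDDEN, HIDDEN).astype(np.float32)
weights.tofile("weights.bin")

# Memory-map the file: the OS pages weights into RAM only when they are
# touched, so resident memory stays close to the size of the layers
# actually in use rather than the full model.
flash = np.memmap("weights.bin", dtype=np.float32,
                  mode="r", shape=(NUM_LAYERS, HIDDEN, HIDDEN))

def forward(x):
    # Stream one layer at a time from "flash" instead of keeping the
    # whole model resident in memory.
    for layer in range(NUM_LAYERS):
        x = np.maximum(flash[layer] @ x, 0.0)  # linear + ReLU
    return x

out = forward(np.ones(HIDDEN, dtype=np.float32))
print(out.shape)  # (64,)
```

The real technique is considerably more involved (it exploits activation sparsity and groups weights into rows and columns to cut flash reads), but the memory-mapping pattern above captures the basic trade: slower weight access in exchange for a much smaller RAM footprint.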
