Apple Unveils Its New Open Source AI Models, Which Run On-Device Instead of in the Cloud

By Consultants Review Team, Thursday, 25 April 2024

Apple has at last entered the AI race. OpenELM (Open-source Efficient Language Models), a new family of open source large language models (LLMs) from the Cupertino-based tech giant, is designed to run directly on devices rather than through cloud services. The OpenELM models are already available on Hugging Face Hub, a well-known community site for sharing AI code and models.
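
For developers who want to experiment, the sketch below shows one plausible way to load an OpenELM checkpoint from the Hugging Face Hub with the transformers library. The model identifier, the trust_remote_code flag, and the reuse of the Llama-2 tokenizer are assumptions made for illustration; the exact requirements are spelled out on each model card.

# Minimal sketch, assuming the checkpoints are published under the "apple/" organization
# on the Hugging Face Hub and ship custom modeling code (hence trust_remote_code=True).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "apple/OpenELM-270M-Instruct"  # assumed id; larger sizes are also listed on the Hub
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# Assumption: OpenELM reuses the Llama-2 tokenizer rather than shipping its own
# (access to that repository may require accepting its license on the Hub).
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")

inputs = tokenizer("Apple released OpenELM because", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))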

According to the accompanying white paper, OpenELM is a set of eight language models, four of which are instruction-tuned and four of which are pre-trained using the CoreNet library. Across these models, Apple uses a layer-wise scaling strategy to improve accuracy and efficiency.
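
Layer-wise scaling, as described in the paper, means the transformer layers are not all the same width: settings such as the number of attention heads and the feed-forward multiplier grow from the early layers to the later ones instead of staying uniform. The short Python sketch below illustrates the general interpolation idea; the function name and the specific numbers are illustrative and are not Apple's actual configuration.

# Illustrative sketch of layer-wise scaling: per-layer head counts and feed-forward
# multipliers are interpolated linearly between a minimum (first layer) and a
# maximum (last layer) value. All numbers here are made up for demonstration.
def layerwise_scaling(num_layers, min_heads, max_heads, min_ffn_mult, max_ffn_mult):
    configs = []
    for i in range(num_layers):
        t = i / max(num_layers - 1, 1)  # 0.0 at the first layer, 1.0 at the last
        heads = round(min_heads + t * (max_heads - min_heads))
        ffn_mult = min_ffn_mult + t * (max_ffn_mult - min_ffn_mult)
        configs.append({"layer": i, "num_heads": heads, "ffn_multiplier": round(ffn_mult, 2)})
    return configs

for cfg in layerwise_scaling(num_layers=16, min_heads=4, max_heads=12,
                             min_ffn_mult=0.5, max_ffn_mult=4.0):
    print(cfg)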

To differentiate OpenELM from competing releases that offer only pre-trained models, Apple has published the complete framework, including code, training logs, and multiple model versions.

By releasing the OpenELM models as open source software, Apple hopes to empower the research community with cutting-edge language models. The company says that open sourcing the models lets researchers not only use them but also examine their inner workings, leading to "more trustworthy results" and faster progress in natural language AI.

Researchers, developers, and businesses can use the OpenELM models as-is or customize them for specific purposes. This level of transparency also marks a departure from earlier industry practice, in which releases were frequently limited to model weights and inference code, with no access to the underlying configurations or training data.

Apple's on-device AI processing, meanwhile, offers two advantages: efficiency and privacy. By keeping data and computation local, OpenELM addresses growing concerns about user privacy and potential breaches of cloud servers. On-device processing also removes the need for an internet connection, allowing AI features to work offline. Apple underscores the efficiency benefit by noting that OpenELM delivers "enhanced accuracy" while using fewer resources than comparable models.

Researchers benefit from the open sourcing, but Apple also stands to gain strategically. With the work freely shared, Apple can collaborate with other researchers in the field, and outside contributors can improve and extend OpenELM. The open approach may also help the company attract top talent, including scientists, engineers, and specialists. Apple says OpenELM effectively serves as a launchpad for further AI developments, benefiting both Apple and the AI community as a whole.

Although Apple has not yet shipped these AI features on its devices, rumors are circulating that the upcoming iOS 18 release will include on-device AI capabilities. With the introduction of its own LLMs, Apple has made it clear that it is setting the stage for an AI upgrade across its products, including Macs, iPads, and iPhones. Apple is expected to integrate its large language models into those products, making user experiences more efficient and personalized. This move to on-device processing could allow Apple to protect user privacy while giving developers readily accessible, capable AI features.
