Breaking from its traditionally cautious stance on AI, Apple has quietly rolled out frameworks and model libraries tailored for its chips, hinting that generative AI applications could soon run natively on Macs.
Apple's machine learning research team introduced MLX, a framework that lets developers build models optimized for Apple Silicon. Accompanying it is MLX Data, a companion data loading library. Both are open source and available on GitHub and PyPI, making them easy for developers to adopt.
Drawing inspiration from established frameworks like PyTorch, Jax, and ArrayFire, Apple's MLX stands out with its unified memory model: arrays live in shared memory, so operations can run on any supported device—presently CPUs and GPUs—without copying data between them. Reports indicate MLX is capable of training substantial models such as Meta's Llama and Stability AI's Stable Diffusion.
Awni Hannun, an Apple machine learning researcher, described MLX Data as a flexible, efficient package for data loading. It works with MLX, PyTorch, or Jax, emphasizing ease of use regardless of which framework sits on top.
While Apple has integrated AI into its products for years, its focus has predominantly been on machine learning, distancing itself from the popular generative AI realm pursued by rivals like Microsoft and Google. Notably, Apple often sidesteps using the term AI in its keynote presentations.
These releases, however, signal a potential shift: they lay the groundwork for deeper work on foundation models and open a fresh chapter in the company's AI trajectory.