Unfortunately Apple appears to be blocking the use of these LLMs within apps on their App Store. I've been trying to ship an app that contains local LLMs and have hit a brick wall with guideline 2.5.2.
I think Apple will become increasingly draconian about LLMs. Very soon people won't need to buy many of their apps. They can just make them. This threatens Apple's entire business model.
Though of course Apple's rules aren't always consistently enforced: I have two separate apps currently on my phone that can run local LLMs (Google's Edge Gallery and Locally AI).
Is this an issue with Cactus compute stuff as well?
What is your app doing? Just LLM inference?
Seriously, how do people put up with being nannied by Apple?
Come on folks, their IT hardware may be nice but supporting them is not worth it.
In case someone doesn't know, this is the full text:
> 2.5.2 Apps should be self-contained in their bundles, and may not read or write data outside the designated container area, nor may they download, install, or execute code which introduces or changes features or functionality of the app, including other apps. Educational apps designed to teach, develop, or allow students to test executable code may, in limited circumstances, download code provided that such code is not used for other purposes. Such apps must make the source code provided by the app completely viewable and editable by the user.
Why is this related to local LLMs in an app?