Mehh
Very nice project, but not that usable yet. I would use it if I really needed to, but the app is so laggy. I'm using it on an iPhone 14 Pro Max. Also, not enough documentation.
Yes, LLM Farm is completely free and it doesn't have any in-app purchases or subscriptions.
🤔 The LLM Farm app's quality is mixed. Some users are satisfied, while others report issues. Consider reading individual reviews for more context.
LLM Farm is free.
3 out of 5 (1 rating in Türkiye)
I don't know if it's because of my settings or not, but I keep getting weird responses: it starts replying to me, only for the text to suddenly change to random Chinese characters mid-generation and then get stuck loading forever. I am running default settings, by the way.
The app keeps crashing. Every time I send the first message, "hi", it crashes without any response.
Tons of AI fun…great app
The app needs Markdown rendering for LLM responses. Often the LLM will output Markdown, but the app doesn't parse it. Other than that, this is the best LLM app for mobile! It has a lot of customizable settings and it supports a lot of quantized models. It also supports all the Shortcuts. What other open-source LLM app can compete with this? Its developer is active and constantly trying to improve the app. I love this app.
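On the Markdown point above: rendering inline Markdown is fairly simple on iOS 15 and later. The following is a minimal sketch, assuming SwiftUI; the view and property names are illustrative and not taken from LLM Farm's source.

```swift
import SwiftUI

// Minimal sketch: render an LLM reply as Markdown in SwiftUI (iOS 15+).
// `ReplyView` and `reply` are illustrative names, not LLM Farm's own code.
struct ReplyView: View {
    let reply: String

    var body: some View {
        // AttributedString(markdown:) parses inline Markdown such as bold,
        // links, and code spans; fall back to plain text if parsing fails.
        if let attributed = try? AttributedString(
            markdown: reply,
            options: .init(interpretedSyntax: .inlineOnlyPreservingWhitespace)
        ) {
            Text(attributed)
        } else {
            Text(reply)
        }
    }
}
```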
Love this app. Surprisingly powerful, with tons of tweaking options that many apps on full computers lack. I use it on my phone and iPad. I would love to see support for newer Gemma models and the ability to act as an inference server, like Ollama or LM Studio, so I can run local inference for Obsidian or other apps.
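The inference-server request above refers to the OpenAI-compatible HTTP endpoint that LM Studio and Ollama expose, which lets other apps (Obsidian plugins, editors, scripts) talk to a locally running model. Below is a rough sketch of what such a client call looks like, assuming a hypothetical local server on port 1234 and a placeholder model name; LLM Farm does not currently provide this endpoint.

```swift
import Foundation

// Hypothetical client call against an OpenAI-compatible local endpoint,
// the kind of server LM Studio and Ollama provide and the review asks for.
// The port and model name are placeholders.
func askLocalModel(_ prompt: String) async throws -> Data {
    var request = URLRequest(url: URL(string: "http://localhost:1234/v1/chat/completions")!)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONSerialization.data(withJSONObject: [
        "model": "llama-2-7b-chat",
        "messages": [["role": "user", "content": prompt]]
    ])
    let (data, _) = try await URLSession.shared.data(for: request)
    return data  // JSON body containing the model's chat completion
}
```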
This is the iOS equivalent of Draw Things, but for LLMs instead of Stable Diffusion. I would love more settings though, plus integration with the Neural Engine, etc., and getting the actual training to work.
This app looks pretty promising, but it's a little daunting for someone who isn't familiar with setting up LLMs. For example, how do you download the LLMs, and where do you go to get them? Which models are likely to work? It might be a good idea to list specific LLMs that have been tested on specific devices. Some sort of tutorial or instructions would be really useful.
This is one of the best LLM apps for mobile! It has a lot of customizable settings and it supports a lot of quantized models. Shortcut integration is all this amazing app needs; please, great developer, add App Intents to the app!
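App Intents is the framework behind modern Shortcuts integration on iOS. Below is a minimal sketch of what such an intent could look like, assuming a hypothetical `runInference` helper; none of these names come from LLM Farm itself.

```swift
import AppIntents

// Placeholder for the app's actual generation call (hypothetical).
func runInference(prompt: String) async throws -> String {
    return "(model output would go here)"
}

// Hypothetical intent that would expose a "send prompt" action to Shortcuts.
struct AskLocalModelIntent: AppIntent {
    static var title: LocalizedStringResource = "Ask Local Model"

    @Parameter(title: "Prompt")
    var prompt: String

    func perform() async throws -> some IntentResult & ReturnsValue<String> {
        let reply = try await runInference(prompt: prompt)
        return .result(value: reply)
    }
}
```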
I downloaded this app as an alternative to being forced to sign up for an OpenAI account to get ChatGPT. After trying a few models and finding one I liked (LLaMA2-7B), I was quite impressed, as it was my first time trying this sort of large language model.

The first-launch flow could be improved. When I first opened this app I kind of expected some models pre-installed, and was disappointed that I had to download some from the app's website. If models can't be included in the app due to file size, it would at least be nice to have a get-started prompt show up when no models are loaded, such as "Welcome to LLM Farm. To get started, you'll need to download a model…" Also, once you have models imported and configured, it would be nice to have a "quick start" option when starting a new chat that lets you duplicate the model and parameters of an existing chat (but with zero history), instead of always having to go through the rather complicated setup menu as it is right now. Although the "LLaMA2 Chat 7B NEW" settings template really helps!!

Some performance issues: when the model is started, it momentarily freezes my entire device (iPhone 15) for about 5 seconds: I can't type on the keyboard, can't open Control Center, can't go to the Home Screen. Then, while the model is running and producing output, it causes other background tasks, such as playing music, to glitch out and sometimes crash completely. (App Store readers: this just goes to show how much computing power these models take!) It would be helpful if there were a toggle to put the model in a "throttled" mode where it runs slower so other things can run at the same time; additionally, in throttled mode the model could keep running in the background while you use other apps, with a notification when it completes its output. I don't know much about how LLMs operate, but if this is physically possible I would really appreciate it being added.
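The "throttled mode" suggestion above maps onto two knobs that llama.cpp-based apps generally have: how many CPU threads the model gets and the priority of the task running generation. Below is a rough sketch of the idea, using a hypothetical `generate(prompt:threads:)` wrapper; none of this is LLM Farm's actual API.

```swift
import Foundation

// Placeholder for a llama.cpp-style generation call that accepts a thread count (hypothetical).
func generate(prompt: String, threads: Int) async -> String {
    return "(model output)"
}

// Rough sketch of a "throttled" run: fewer threads plus a low-priority task,
// leaving CPU headroom for the keyboard, Control Center, and background audio.
func throttledGenerate(prompt: String) async -> String {
    await Task(priority: .utility) {
        await generate(prompt: prompt, threads: 2)  // e.g. 2 threads instead of all cores
    }.value
}
```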
Chart | Category | Rank
---|---|---
Top Free | | 42
Top Free | | 42
Top Free | | 44
Top Free | | 48
Top Free | | 49
Name | Downloads (30d) | Monthly Revenue | Reviews | Ratings | Recent release
---|---|---|---|---|---
Jellycuts (Supercharge your Shortcuts) | Unlock | Unlock | 0 | | 1 month ago
LogSnag | Unlock | Unlock | 0 | | 1 year ago
Easy CSV Editor Mobile (Elegant & powerful CSV editor) | Unlock | Unlock | 0 | | 2 months ago
Server: Host Files Locally | Unlock | Unlock | 0 | | 1 year ago
Vision Detector (Run your Vision CoreML model) | Unlock | Unlock | 0 | | 9 months ago
NanoBeacon BLE Scanner (The #1 Bluetooth LE scanner) | Unlock | Unlock | 0 | | 1 year ago
Device Info Tool (Hardware and software details) | Unlock | Unlock | 0 | | 11 months ago
ChatOnMac.com — AI Chat Bots (OpenSource bots for Llama, GPT) | Unlock | Unlock | 0 | | 1 year ago
AWS IoT Sensors | Unlock | Unlock | 0 | | 1 week ago
Enchanted LLM (Chat with your private LLM) | Unlock | Unlock | 0 | | 5 months ago