Downloads don’t work
Hello. Too bad, the downloads don’t work… impossible to get the models.
Yes, LLM Farm is completely free and contains no in-app purchases or subscriptions.
🤔 The quality of the LLM Farm app is mixed. Some users are satisfied, while others report problems. Consider reading individual reviews for more context.
LLM Farm is free.
To get estimated revenue for the LLM Farm app and other App Store insights, you can sign up for the AppTail Mobile Analytics Platform.
3 out of 5
2 ratings in France
I don’t know if it is because of the settings or not, but I keep getting weird responses: the app starts to respond to me, only to suddenly switch the text to random Chinese characters mid-generation and get stuck loading infinitely. I am running default settings, by the way.
The app keeps crashing every time I send the first message, “hi”. It crashes without any response.
Tons of AI fun… great app
The app needs Markdown rendering for LLM responses. Often the LLM will output Markdown, but the app doesn’t parse it. Other than that, this is the best LLM app for mobile! It has a lot of customizable settings and it supports a lot of quantized models. It also supports all the Shortcuts. What other open-source LLM app can compete with this? Its developer is active and constantly trying to improve the app. I love this app.
Love this app. Surprisingly powerful, with tons of tweaking options that many apps on full computers lack. I use it on my phone and iPad. I would love to see support for newer Gemma models and the ability to act as an inference server, like Ollama or LM Studio, so I can run local inference for Obsidian or other apps.
The iOS equivalent of Draw Things, but for LLMs rather than Stable Diffusion. Would love more settings, though, plus integration with the Neural Engine, etc., and getting the actual training to work.
This app looks pretty promising, but it’s a little daunting to someone who isn’t as familiar with setting up LLMs. For example, how do you download the LLMs, and where do you go to get them? Which models are likely to work? It might be a good idea to list specific LLMs that have been tested on specific devices. Some sort of tutorial or instructions would be really useful.
This is one of the best LLM apps for mobile! It has a lot of customizable settings and it supports a lot of quantized models. Shortcut integration is all this amazing app needs: please, great developer, add App Intents to the app!
I downloaded this app as an alternative to being forced to sign up for an OpenAI account to get ChatGPT. After trying a few models and finding one I liked (LLaMA2-7B), I was quite impressed, as it was my first time trying this sort of large language model.

The first-launch flow could be improved. When I first opened this app I expected some models to come preinstalled and was disappointed that I had to download some from the app’s website. If models can’t be included in the app due to their file size, it would at least be nice to show a get-started prompt when no models are loaded, such as “Welcome to LLM Farm. To get started, you’ll need to download a model…” Also, once you have models imported and configured, it would be nice to have a “quick start” option when starting a new chat that lets you duplicate the model and parameters of an existing chat (but with zero history), instead of always having to go through the rather complicated setup menu as it is right now. Although the “LLaMA2 Chat 7B NEW” settings template really helps!!

Some performance issues: when the model is started, it momentarily freezes my entire device (iPhone 15) for about 5 seconds (can’t type on the keyboard, can’t open Control Center, can’t go to the Home Screen), and while the model is running and producing output it causes background tasks, such as playing music, to glitch out and sometimes crash completely. (App Store readers: this just goes to show how much computing power these models take!) It would be helpful if there were a toggle to put the model in a “throttled” mode where it runs slower so that other things can run at the same time; additionally, in throttled mode the model could keep running in the background while you use other apps, with a notification when it completes its output. I don’t know much about how LLMs operate, but if this is physically possible I would really appreciate it being added.
| Chart | Category | Rank |
|---|---|---|
| Top Free | | 29 |
| Top Free | | 46 |
| Top Free | | 50 |
| Top Free | | 53 |
| Top Free | | 53 |
| Name | Downloads (30d) | Monthly Revenue | Reviews | Ratings | Latest Release |
|---|---|---|---|---|---|
| Jellycuts (Supercharge your Shortcuts) | Unlock | Unlock | 2 | | 6 days ago |
| Easy CSV Editor Mobile (Elegant & powerful CSV editor) | Unlock | Unlock | 0 | | 1 month ago |
| Server: Host Files Locally | Unlock | Unlock | 3 | | 1 year ago |
| NanoBeacon BLE Scanner (The #1 Bluetooth LE scanner) | Unlock | Unlock | 0 | | 1 year ago |
| Changes for Linear (Native linear.app client) | Unlock | Unlock | 1 | | 8 months ago |
| Device Info Tool (Hardware and software details) | Unlock | Unlock | 0 | | 10 months ago |
| Source Files: Git Storage (for GitHub, GitLab & BitBucket) | Unlock | Unlock | 0 | | 1 week ago |
| ChatOnMac.com — AI Chat Bots (OpenSource bots for Llama, GPT) | Unlock | Unlock | 0 | | 11 months ago |
| AWS IoT Sensors | Unlock | Unlock | 0 | | 6 months ago |
| Enchanted LLM (Chat with your private LLM) | Unlock | Unlock | 3 | | 4 months ago |