BebéTechManiac
4
Saturday, January 3, 2026
I was able to get the GGUF file and select it, but couldn't test this app much. The first load of the character I selected took a long time, but the worse part was that asking something and getting a response was also slow, even with the data stored on the phone. It displayed the reply word by word very slowly. My phone has 8GB of RAM and is not new, but my experience with online AI on this phone has been a lot better than this. The idea is nice, but you will probably need a flagship phone.
Pumpkinhead
1
Sunday, March 15, 2026
Pretty cool. However, I had to tell it that when a cassette tape messes up, it doesn't skip and repeat one syllable over and over like a CD.
Arachia Botanical
2
Sunday, March 29, 2026
While the chat is okay in a story-driven manner, it doesn't have a setting I can find for automatic response generation to make story chats go smoother, with less editing from the user once a story has been established. Also, the images are so far not generated at all; they are just taken from various sites. Not a single image matched what was in the text, and they only barely looked like a version of the character from the description. Anime and unrealistic characters got real-people images too.
Layla Network.AI
Monday, May 4, 2026
Layla uses a realistic image generator by default. I will look into improving this part to make it follow your prompts better during chat.
captain mak sparrow
1
Sunday, April 19, 2026
Can't do anything until I log in.
Layla Network.AI
Monday, May 4, 2026
You can use offline GGUF models if you do not wish to log in. Go to Inference Settings -> Add Custom Model and choose a GGUF file from your phone. GGUF is a popular format for local LLMs; you can find many of them on HuggingFace. If you'd like, please email me at
[email protected] and I'd be happy to recommend some for you!
yahya alabssi
0
Saturday, April 25, 2026
good