You can't run LLMs on your phone itself. You can, however, host your own LLM, such as Persimmon or Llama, and use an app that connects to the server side.
I'm not an expert here, so I can't help you extensively; I just like to play around with projects. There are already tutorials out there for running a private LLM, so you might want to refer to those.
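To give a rough idea of the server-side approach: several self-hosted back-ends (llama.cpp's server, Ollama, and others) expose an OpenAI-compatible HTTP API, so all a phone app has to do is POST JSON to your machine. Here's a minimal sketch of such a client; the host address, port, and model name are placeholder assumptions, not anything from a specific setup.

```python
import json
from urllib import request


def build_chat_payload(model: str, user_message: str, max_tokens: int = 128) -> dict:
    """Build an OpenAI-style chat-completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "max_tokens": max_tokens,
    }


def ask_local_llm(prompt: str, base_url: str = "http://192.168.1.10:8080") -> str:
    """POST the prompt to a self-hosted server's chat endpoint.

    base_url is a placeholder for wherever your server actually runs;
    the "/v1/chat/completions" path is the OpenAI-compatible convention.
    """
    payload = build_chat_payload("llama", prompt)
    req = request.Request(
        base_url + "/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI-style responses put the reply text here:
    return body["choices"][0]["message"]["content"]
```

The nice part of this split is that the phone only does cheap HTTP calls; all the heavy inference stays on the machine hosting the model.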
I've tried running Alpaca 7B on my laptop using Dalai, and let's just say it toasted my device (i5-8265U, 8 GB RAM, 25 W MX250 with 2 GB VRAM). I wouldn't recommend Dalai as the front-end: the Node script is a mess, and it also requires privilege escalation to run commands. There are probably better front-ends out there.
Persimmon 8B is a slightly better model. There's also Mistral 7B, but I don't know much about it.
You might want to run these models using Google Cloud's Vertex AI.