New ask Hacker News story: Did I just create the world's smallest AI server?

4 by willmccollum | 1 comment on Hacker News.
I successfully installed Ollama in Termux on my degoogled Unihertz Jelly Star, reputed to be the world's smallest smartphone, with a 3-inch screen. The Jelly Star packs 8 GB of RAM plus 7 GB of extended (virtual) RAM. I downloaded and then successfully ran the distilled DeepSeek-R1 7B model locally on the device. Is it slow? Yes. But it still steadily outputs text word by word, does not crash, and takes no longer than a couple of minutes to respond in full. Anyone have other examples of micro AI workstations?
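For anyone curious how a setup like this might look, here is a rough sketch of one common Termux approach. These are my assumed commands, not the OP's exact steps; Termux's own repository ships an `ollama` package, though package availability and model tags can change.

```shell
# Hypothetical sketch of an Ollama-in-Termux setup (assumed, not the OP's commands).
# Update Termux packages first.
pkg update && pkg upgrade -y

# Termux's repo provides an ollama package; install it directly.
pkg install -y ollama

# Start the Ollama server in the background.
ollama serve &

# Pull and chat with the distilled DeepSeek-R1 7B model
# (a multi-gigabyte download; expect slow token-by-token output on a phone).
ollama run deepseek-r1:7b
```

On a device with limited RAM like this, the smaller `deepseek-r1:1.5b` tag would likely respond faster at the cost of quality.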
