

I have a similar perspective. I built my own in-home AI server because I assumed if the technology had any staying power, I better learn how it works to some degree and see if I can run it myself.


Jan is another great recommendation!


I’m keeping an eye on Ollama’s service offerings - I don’t think they’re in enshittification territory yet, but I definitely share the concern.
I still don’t think the other LLM engines out there match Ollama’s ease of use, so I still recommend it for now. If nothing else, it can be a stepping stone to other solutions for some.


In case you’re not aware, there are a decent number of open weight (and some open source) large language models.
The Ollama project makes it very approachable to download and use these models.
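To give a sense of how approachable it is, this is roughly the whole workflow once Ollama is installed (the model name here is just an example; any model from their library works the same way):

```shell
# Download an open-weight model from the Ollama library
ollama pull llama3.2

# Start an interactive chat with it, entirely on your own hardware
ollama run llama3.2

# List the models you have downloaded locally
ollama list
```

No accounts, no API keys; everything runs locally.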


The scientific literature overwhelmingly rejects the idea that this is a natural cycle: https://news.cornell.edu/stories/2021/10/more-999-studies-agree-humans-caused-climate-change
Rhetoric like yours adds uncertainty where there is extremely little to be had.
Natural cycles also play out over millennia, not the mere decades over which we’re observing the current change.


DeepSeek R1 and OpenThinker are two more examples. There’s also SmolLM, which I believe also open-sources its training data and ensures it is properly licensed.