Ok yes AI is everywhere. Duh.
I have been thinking about upgrading my GPU situation at home, and as I debate and deliberate, I keep coming to the conclusion that this "feels" kind of like the beginning of the personal computer era (though with some differences). Let me explain.
A quick Google search claims that in 1988 PCs ran between $700 and $2,000 for the average home. Converting that into today's dollars, the range moves to roughly $1,917 to $5,479.
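If you want to sanity-check that conversion yourself, here's the arithmetic. The ~2.74x multiplier is my own rough CPI-based assumption for 1988 to today, not an official figure:

```python
# Rough inflation adjustment, 1988 -> today.
# CPI_MULTIPLIER is an approximation I picked; swap in the current
# official CPI ratio for a more precise answer.
CPI_MULTIPLIER = 2.74

for price_1988 in (700, 2000):
    price_today = price_1988 * CPI_MULTIPLIER
    print(f"${price_1988:,} in 1988 is about ${price_today:,.0f} today")
# prints about $1,918 and $5,480 -- close to the figures above
```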
Currently, the GPUs that (I think) sit in the middle of that price range are the 24-40 GB VRAM devices (ignoring things like the M4 mini, largely because while it would run models, it is likely to be slower than a dedicated GPU).
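Why does 24-40 GB matter? Here's a back-of-the-envelope estimate of what fits. The model sizes, the 4-bit quantization, and the ~10% overhead factor are all my assumptions; real usage varies with context length and runtime:

```python
def vram_gb(params_billions: float, bits_per_weight: float, overhead: float = 1.1) -> float:
    """Rough VRAM needed to hold a model's weights, plus a fudge
    factor for KV cache and runtime overhead. A sketch, not a benchmark."""
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

print(f"32B model at 4-bit: {vram_gb(32, 4):.1f} GB")  # ~17.6 GB, fits on a 24 GB card
print(f"70B model at 4-bit: {vram_gb(70, 4):.1f} GB")  # ~38.5 GB, needs the 40 GB tier
```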
Here is my hypothesis: open-weight models (I guess I can't say open source) will get "good enough" that the average consumer sees no benefit to using paid models; they will fit on hardware in that price range, and they will be capable enough. The cloud is obviously a factor, but I speculate the big (private-model) AI companies will continue to develop AI that gets better and better.
That AI will ultimately be used for the "Big Problems" that consumers, and even tech enthusiasts, don't need solved. Those companies will continue to offer tiered access: I think the extremely advanced models will only be accessible to people with big $$, aka businesses or rich folks. Models comparable to state-of-the-art open-weight models will cost some amount of money, and worse models will be cheaper, of course. Locally hosted models will largely be a middle-class space for early tech adopters, I believe.
I was looking, and at $200 a month (one of OpenAI's tiers), it would take around a year to break even on buying your own lower-end GPUs (ignoring costs like power).
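The break-even math, spelled out. The $2,400 hardware figure is my assumed cost for a lower-end local GPU setup, chosen to match the roughly-a-year claim:

```python
monthly_subscription = 200   # one of OpenAI's tiers, per month
gpu_setup_cost = 2400        # assumed price of a lower-end local GPU setup

months_to_break_even = gpu_setup_cost / monthly_subscription
print(f"{months_to_break_even:.0f} months to break even")  # 12 months, ignoring power
```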
Moving on to thinking about running my own GPU. A few factors come into play here:

1. Cost. If you have your own GPU, there are no cost limits on token usage (besides the natural limit of how much you can compute in a given day with your processing power). There is no guessing how much things will cost, or whether you will be rate limited under your plan.
2. Privacy. I think people are slowly moving toward wanting more privacy. I am sure that in a moment of experimentation I've asked ChatGPT things I wish I hadn't, so moving my future conversations, or at least the potentially sensitive ones, off those platforms is in my best interest.
3. Ads. Everyone is doing it, and ultimately ads will come to the paid/private models too. I can't say for sure whether local models will escape them, but I am willing to claim there will be a market for someone to build the equivalent of Pi-hole for your LLM: either retraining a model to remove ads, or processing the results inline, spotting ads, and removing them (a toy sketch of that second flavor is below).
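To make that inline-filtering idea concrete, here's a minimal sketch. Everything in it is hypothetical: there is no standard marker format for injected ads, and a real filter would need an actual classifier rather than a regex. It's just the shape of the thing:

```python
import re

# Hypothetical: pretend injected ads arrive wrapped in sponsor-style
# markers. A real Pi-hole-for-LLMs would need a classifier; this
# regex exists purely to illustrate the inline-filtering idea.
AD_PATTERN = re.compile(r"\[sponsored\].*?\[/sponsored\]", re.IGNORECASE | re.DOTALL)

def strip_ads(model_output: str) -> str:
    """Post-process a model reply and drop ad-like spans before
    the text ever reaches the user."""
    return AD_PATTERN.sub("", model_output).strip()

reply = "Here is your recipe. [sponsored]Try MegaCorp flour![/sponsored] Enjoy!"
print(strip_ads(reply))  # the reply, with the ad span removed
```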
I think the exploration and discovery of my youth is here again, and while I want to be cautious and safe, I think that ability to discover and explore is here for my kids. There will be dangers and pitfalls, and the AI in my GPU might just turn out to be a 40-year-old living in its parents' basement, but I will end up buying a GPU, just like my parents made the life-changing decision to buy a PC when I was a child. Hopefully it will be life-changing for my children like it was for me (in a good way).
What do you think? Shoot me an email at ai@onaclovtech.com.