Wednesday, January 28, 2026

Text To Speech and Voice Cloning

There are seemingly a million Text to Speech and voice clone things being released. I'm going to note what I find, or what my buddy Nate sends me, here. (Also some other interesting tools will be noted too ;)). I'm noting the date of the first GitHub commit on each repository too.

TTS (Text to Speech)

LuxTTS: https://github.com/ysharma3501/LuxTTS 1/23/2026

Qwen3TTS: https://github.com/QwenLM/Qwen3-TTS 1/22/2026 

PersonaPlex: https://research.nvidia.com/labs/adlr/personaplex/ 1/15/2026 

Chroma: https://github.com/FlashLabs-AI-Corp/FlashLabs-Chroma 11/13/2025

NeuTTS: https://github.com/neuphonic/neutts 10/2/2025 

Inworld TTS: https://github.com/inworld-ai/tts 7/29/2025

Pocket TTS: https://github.com/kyutai-labs/pocket-tts

Soprano TTS: https://github.com/ekwek1/soprano 

Sopro TTS: https://github.com/samuel-vitorino/sopro

VibeVoice: https://github.com/microsoft/VibeVoice

Liquid Audio: https://github.com/Liquid4All/liquid-audio

VoxTream TTS: https://github.com/herimor/voxtream 

Mimo Audio: https://github.com/XiaomiMiMo/MiMo-Audio 

VoxCPM TTS: https://github.com/OpenBMB/VoxCPM

 

... There are plenty more; I'll update as I have time.

STT (Speech to Text)

Whisper 

GLM ASR Nano: https://github.com/zai-org/GLM-ASR

Tools:

Qwen3-Audiobook-Converter: https://github.com/WhiskeyCoder/Qwen3-Audiobook-Converter

Meta Segment Anything Model Audio: (No link, I have facebook/meta blocked at my house, and don't feel like turning off my blocker).


 

* Note: I got hit with a rate limit checking the commit history of the repos, soooo dunno when I'll have dates on some of these.

We are in the beginning of the personal AI Era

Ok, yes, AI is everywhere. Duh.

I have been thinking about upgrading my GPU situation at home, and as I debate and decide, I am coming to the conclusion that this "feels" kinda like the beginning of the personal computer era. (Though with some differences.) Let me explain.

A quick Google search claimed that in 1988 PCs cost between $700 and $2000 for the average home. Converting that into today's dollars, the range moves to roughly $1917 to $5479.
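Just to show the math (a rough sketch; the ~2.74x multiplier is back-calculated from those dollar figures, not pulled from an official CPI table):

```python
# Rough inflation math behind the numbers above. The multiplier is
# back-calculated from the post's own figures ($700 -> ~$1917), so treat
# it as an approximation rather than an official CPI conversion.
INFLATION_1988_TO_NOW = 2.74

def to_todays_dollars(price_1988: float) -> float:
    """Convert a 1988 sticker price into approximate today's dollars."""
    return price_1988 * INFLATION_1988_TO_NOW

for price in (700, 2000):
    print(f"${price} in 1988 is roughly ${to_todays_dollars(price):,.0f} today")
```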

Currently, the GPUs that (I think) sit in the middle of that price range are the 24-40 GB VRAM class of devices (ignoring things like the M4 Mini and its situation, largely because while it would run models, it is likely to be slower than a dedicated GPU).

Here is my hypothesis: open-weight models (I guess I can't say open source) will get "good enough" that, to the average consumer, there isn't a benefit to using paid models; those models will fit on hardware in that price range and will be capable enough. The cloud is obviously a factor, but I speculate the big (private model) AI companies will continue to develop AI that keeps getting better and better.

 

This AI will ultimately be used for the "Big Problems" that consumers, and even tech enthusiasts, don't need. Those companies will continue to offer tiered access to their AI. I think the extremely advanced models will only be accessible to people with big $$, aka businesses or rich folks. Models that compare to state-of-the-art open-source models will cost some amount of money, and worse models will be cheaper, of course. Locally hosted models will largely be a middle-class space for early tech adopters, I believe.

 

I was looking, and at the price of $200 a month (one of OpenAI's tiers), it would take around a year to break even on buying your own lower-end GPUs (ignoring costs like power).
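Here's the back-of-the-envelope version of that break-even claim. The GPU price below is an assumption on my part (a lower-end card in the low-$2000s), and power and other running costs are ignored, like I said:

```python
# Back-of-the-envelope break-even check for the claim above.
# The GPU price is an assumed lower-end figure, not a quote for any
# specific card; power and other running costs are ignored.
subscription_per_month = 200      # the $200/month tier mentioned above
assumed_gpu_cost = 2400           # hypothetical lower-end local-AI GPU budget

months_to_break_even = assumed_gpu_cost / subscription_per_month
print(f"Break even after ~{months_to_break_even:.0f} months")  # ~12 months, i.e. around a year
```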

 

Moving on. Thinking about running my own GPU, a few factors come into play here.

1. If you have your own GPU, there are no cost limits on token usage (besides the natural limit of how much you can compute in a given day based on processing power, etc.). There is no guessing how much things will cost, or whether you will be rate limited per your plan.

2. Privacy. I think more people are slowly moving towards increased privacy. I am sure in a moment of experimentation I've asked ChatGPT things I wish I hadn't, so moving all my future conversations, or potentially sensitive conversations, off those platforms is in my best interest.

3. Ads. Everyone is doing it, and ultimately it will come to the paid/private models. Though I can't say for sure whether local models will, I am willing to claim there will be a market for someone to build the equivalent of Pi-hole for your LLM: something that either retrains a model to remove ads, or processes the results inline, spots the ads, and removes them (toy sketch below).
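And just to make that Pi-hole-for-LLMs idea concrete, here's a toy sketch of the inline-filter version. Everything here is hypothetical: the [sponsored] marker format is made up, and real ad detection would need to be much smarter than a regex.

```python
import re

# Toy sketch of an inline ad filter sitting between the model and the user.
# The [sponsored]...[/sponsored] marker format is entirely hypothetical;
# actual injected ads (if they ever show up) would need real detection.
AD_PATTERN = re.compile(r"\[sponsored\].*?\[/sponsored\]", re.IGNORECASE | re.DOTALL)

def strip_ads(model_output: str) -> str:
    """Remove hypothetical sponsored segments from a model response."""
    return AD_PATTERN.sub("", model_output).strip()

print(strip_ads("Here's your recipe. [sponsored]Try MegaCola with dinner![/sponsored]"))
```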

I think that the exploration and discovery of my youth is here, and while I want to be cautious and safe, I think the ability to discover and explore like that is here for my kids. There will be dangers and pitfalls, and the AI in my GPU might just turn out to be a 40-year-old living in its parents' basement, but I will end up buying a GPU, just like my parents made the life-changing decision to buy a PC when I was a child. Hopefully it will be life-changing for my children like it was for me (in a good way).

What do you think? Shoot me an email at ai@onaclovtech.com

Friday, January 16, 2026

the inattention economy

I have been noticing something recently: while everything is trying to get our attention, I've noticed I ignore a lot more. Oh, an ad popped up? I jump to another device or task and blank it out. Or I pop an earbud out and look away a bit, *trying* to forget what I heard in the hopes it doesn't earworm itself into my brain.

 

I wonder if all the inattention in other parts of my life is a result of trying so hard to avoid ads.

 

I think I need to move away from anything with ads. Not sure if I can, but dang, I want to badly.

Saturday, January 3, 2026

ease of use

I think I've come to realize that the easier things are to use, the more likely you are to use them. Of course this is obvious when it comes to computer interfaces, but I've been realizing it more and more when it comes to even things like my house.

I have a leaf blower, but I almost never use it because it's a pain in the butt to get to. I have a Cricut; I use it occasionally, but not often enough, because it's a pain to get to: I have to take it off of a shelf, move it to a location that has the space, and so on.

 

I'm looking into being more organized this coming year so hopefully I can do that.