I agree with you. I just think “they” won’t take that fact and sit with it. I think “they” will do everything they can to get multiple backdoors in there (and I use the term ‘backdoor’ loosely to mean anything that can programmatically circumvent the encryption). There are more of them, in terms of power and funding, than there are of us. They will eventually succeed, if only briefly each time. That’s why I wrote that the solution is a chat revolution. I don’t know what that will look like, but we need something they can’t successfully attack.
This is a really great use of an LLM! Seriously, great job! Once it’s fully self-hostable (including the LLM itself), I will absolutely find it space on the home server. Maybe using Rupeshs’ fastsdcpu as the model and generation backend could work. I don’t remember what his license is, though.
Edit: added link.