

more development on offline services
There is absolutely massive development on open-weight models that can be used offline/privately. MiniMax M2, the most recent one, has benchmark scores comparable to the closed US megatech models at roughly 1/12th the cost and with higher token throughput. Qwen, GLM, and DeepSeek have models comparable to M2, plus smaller models that run easily on very modest hardware (see the sketch below).
The closed megatech datacenter AI strategy is a partnership with the US government/military for oppressive control of humanity. Spending 12x more per token while empowering big tech and the US empire to steal from and oppress you is not worth a small fractional improvement in benchmarks/quality.
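To give a sense of how little is needed to try one of those smaller open-weight models locally, here is a minimal sketch using the Hugging Face transformers library. The model ID, prompt, and generation settings are just illustrative placeholders; any small Qwen/GLM/DeepSeek checkpoint that fits your RAM/VRAM would work, and nothing leaves your machine after the one-time weight download.

    # Minimal sketch: chat with a small open-weight model entirely on local hardware.
    # The model ID is only an example; substitute any small checkpoint that fits your
    # machine. Weights are downloaded once, then cached and usable offline.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "Qwen/Qwen2.5-1.5B-Instruct"  # example small checkpoint

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)  # CPU by default; fine at this size

    messages = [{"role": "user", "content": "Give me one sentence on why local inference is useful."}]
    prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=128)

    # Drop the prompt tokens and print only the generated reply.
    reply = outputs[0][inputs["input_ids"].shape[-1]:]
    print(tokenizer.decode(reply, skip_special_tokens=True))

Heavier M2-class checkpoints obviously need more serious hardware or a quantized build (llama.cpp, vLLM, etc.), but the basic workflow looks the same.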
Any chance this can be done through your router/modem, where your phone app connects to the external IP of the router and that acts as the “server endpoint” for your doorbell?
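In case it helps picture that setup: below is a minimal sketch, assuming the doorbell exposes some TCP service (an RTSP or HTTP stream, say) on the LAN. A tiny relay runs on an always-on box inside the network, the router port-forwards one WAN port to it, and the phone app talks only to the router's external IP. Every address and port here is a hypothetical placeholder, and a plain port-forward straight to the doorbell gives the same shape; the relay just makes the flow explicit and gives you somewhere to add TLS/auth later.

    # Minimal sketch of the "router as the endpoint" idea: the router port-forwards one
    # WAN port to this relay on an always-on LAN box, and the relay shuttles bytes to the
    # doorbell. All IPs/ports below are hypothetical placeholders.
    import socket
    import threading

    DOORBELL = ("192.168.1.50", 554)   # hypothetical doorbell address/port on the LAN
    LISTEN = ("0.0.0.0", 8443)         # port the router forwards external traffic to

    def pipe(src: socket.socket, dst: socket.socket) -> None:
        """Copy bytes one way until either side closes."""
        try:
            while True:
                data = src.recv(4096)
                if not data:
                    break
                dst.sendall(data)
        except OSError:
            pass
        finally:
            src.close()
            dst.close()

    def handle(client: socket.socket) -> None:
        """Connect to the doorbell and relay traffic in both directions."""
        upstream = socket.create_connection(DOORBELL)
        threading.Thread(target=pipe, args=(client, upstream), daemon=True).start()
        threading.Thread(target=pipe, args=(upstream, client), daemon=True).start()

    def main() -> None:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
            srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            srv.bind(LISTEN)
            srv.listen()
            while True:
                client, _ = srv.accept()
                handle(client)

    if __name__ == "__main__":
        main()

The catch is whether the doorbell's own phone app lets you point it at a custom endpoint like that at all, rather than only at the vendor's cloud.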