I have a strong feeling that AI models and agents require a different operating-system (OS) paradigm, one that is data-centric rather than file-system-centric, for more efficient, effective, and trustworthy operation. This new OS should work natively and seamlessly with data across different processors, for example CPUs, GPUs, TPUs, NPUs, accelerators, etc.
For a working example, see TabulaROSA (Tabular Operating System Architecture), proposed by an MIT team. Instead of normal OS system calls, it uses data-based operations with D4M, which can work mathematically via associative arrays over structured or unstructured data [1],[2].
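To make the associative-array idea concrete, here is a minimal sketch of a D4M-style associative array: a sparse mapping from (row key, column key) to value that supports algebra-like operations. The class name `Assoc` and its methods are my own illustration, not D4M's actual API, and the real system is far richer.

```python
# Minimal sketch of a D4M-style associative array: a sparse mapping
# from (row_key, col_key) -> value supporting algebraic operations.
# Illustrative only; not the real D4M interface.
from collections import defaultdict

class Assoc:
    def __init__(self, triples=()):
        self.data = {(r, c): v for r, c, v in triples}

    def __add__(self, other):
        # Element-wise sum over the union of keys.
        out = defaultdict(int)
        for k, v in list(self.data.items()) + list(other.data.items()):
            out[k] += v
        return Assoc((r, c, v) for (r, c), v in out.items())

    def matmul(self, other):
        # Associative-array "matrix multiply": join on matching
        # column/row keys, sum the products.
        out = defaultdict(int)
        for (r, k1), v1 in self.data.items():
            for (k2, c), v2 in other.data.items():
                if k1 == k2:
                    out[(r, c)] += v1 * v2
        return Assoc((r, c, v) for (r, c), v in out.items())

# Row and column keys are arbitrary strings, so structured and
# unstructured data share one representation:
A = Assoc([("doc1", "word:os", 2), ("doc1", "word:data", 1)])
B = Assoc([("word:os", "topic:systems", 1)])
print(A.matmul(B).data)  # {('doc1', 'topic:systems'): 2}
```

The point is that "system calls" become algebra on data: joins, sums, and products over keyed values, which map naturally onto parallel and heterogeneous hardware.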
With the advent of CPU acceleration for fully homomorphic encryption, as demonstrated by Intel, AI models and agents can even analyze data without ever decrypting it [3],[4].
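As a toy illustration of computing on encrypted data: textbook RSA is multiplicatively homomorphic, so a "server" can multiply two ciphertexts without the private key. Real FHE schemes (the kind Intel's chip accelerates) support addition and arbitrary circuits as well; this sketch uses tiny insecure parameters and only shows the core property.

```python
# Toy homomorphic property of textbook RSA (insecure, demo only):
# Enc(a) * Enc(b) mod n decrypts to a * b, so computation happens
# on ciphertexts without ever decrypting them.
p, q = 61, 53                 # small primes, demo only
n = p * q                     # public modulus
phi = (p - 1) * (q - 1)
e = 17                        # public exponent, coprime to phi
d = pow(e, -1, phi)           # private exponent (Python 3.8+)

def enc(m):                   # "client" encrypts
    return pow(m, e, n)

def dec(c):                   # "client" decrypts
    return pow(c, d, n)

# "Server" multiplies ciphertexts without the private key:
c = (enc(3) * enc(5)) % n
print(dec(c))                 # -> 15 == 3 * 5
```

An AI agent sitting in the "server" role could run such operations over user data it can never read, which is the trust property the Intel demo [3],[4] is about.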
I don’t get it. It says nothing leaves your computer, but it’s sending things to OpenRouter, not running models locally. Perhaps I am dumb (and I always feel dumb after reading an AI-generated README for yet another AI tool, tbf).
Yes, it appears your personal data IS being sent to OpenRouter and the model provider here. The problem, I think, is that a lot of people (especially in the openclaw community) mistake “I run it on my mac mini” to mean their data is private. Meanwhile, all data is being shipped off for training to Anthropic via OpenRouter, and both of those parties see everything.
I guess you could theoretically plug in a local model here, but of course the README should be more precise when talking about privacy.
> Yes. OpenYak is local-first. Your conversations and files are stored only on your machine. When using cloud models, only API calls to LLM providers leave your computer.
So it’s local-first, and it still uploads files to cloud models if you configure it.
It sends data to OpenRouter if you choose to use OpenRouter. You can use Ollama. Idk how to get more local than that? Any tool will be non-local when you do something explicitly non-local.
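For what "actually local" means here: Ollama exposes an HTTP API on localhost port 11434, so a client pointed there sends nothing off the machine. This is a sketch against Ollama's documented `/api/generate` endpoint; the function names are mine, and it assumes an Ollama server is already running with the named model pulled.

```python
# Sketch: talk to a local Ollama server instead of a cloud provider.
# Nothing leaves the machine; the endpoint and payload fields
# (model, prompt, stream) are Ollama's documented API.
import json
import urllib.request

def build_payload(prompt, model="llama3"):
    """Request body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

def local_generate(prompt, model="llama3", host="http://localhost:11434"):
    """Send a prompt to a local Ollama server and return its response text."""
    data = json.dumps(build_payload(prompt, model)).encode()
    req = urllib.request.Request(f"{host}/api/generate", data=data,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Swapping `host` for an OpenRouter URL is exactly the step where data starts leaving your computer, which is the distinction the README should spell out.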
Not to be too conspiratorial here, but since the founder of OpenClaw was snatched up, there seems to be a rush of “open source” AI projects desperately bidding to be alternatives. That can generate huge returns if one of the major players decides that “they also need a cowork-style product”.
So it’s uniquely viable to be a sellout here and attempt to clone a major lab’s product on the off-chance you get acquired later.
https://news.ycombinator.com/item?id=47560380#47560381
When it's clear he is one of the major contributors to the project?
https://github.com/openyak/desktop/graphs/contributors
[1] TabulaROSA: Tabular Operating System Architecture for Massively Parallel Heterogeneous Compute Engines
https://dspace.mit.edu/handle/1721.1/126114
[2] D4M: Dynamic Distributed Dimensional Data Model:
https://d4m.mit.edu/
[3] Intel Demos Chip to Compute with Encrypted Data (121 comments):
https://news.ycombinator.com/item?id=47322815
[4] Intel Demos Chip to Compute With Encrypted Data: Fully homomorphic encryption chip speeds operations 5,000-fold:
https://spectrum.ieee.org/fhe-intel
> I guess you could theoretically plug in a local model here but of course the readme should be more precise here when talking about privacy
I agree that someone may misunderstand their phrasing though
> So local-first and still upload files to cloud models if you configure it.
Given the software’s broad appeal, I’d rephrase to make it clearer that every word/file you send would leave your computer.
Where does it say that?
> It sends to OpenRouter if you chose to use OpenRouter. Can use Ollama. Idk how to get more local than that? Any tool will be non-local, when you do something explicitly non-local.
Are you saying this part is a lie?
Here are the prompts I use for my AI environment, though it's changed a bunch since the last snapshot
https://github.com/rbren/personal-ai-devbox
What do you mean by interfaces in "These interfaces can do literally anything on the host machine. You're responsible for your own security"?
Also, your backdooring image links to a 404.
> So its uniquely viable to be a sellout here and attempt to clone a major lab’s attempt on the off-chance you get acquired later
Just when I thought it couldn't get worse than OpenClaw, someone proposes this, in all seriousness. I see a stellar future for them at OpenAI.