Trading billions worth of idle compute, in exchange for a high-strike call option on the #3 player in the most-promising-vertical for AI, plus (presumably) some access to their data, starts to sound like not a bad trade. Especially if you're pre-committed to betting your entire rocket company on winning in AI, and you're currently in sixth or seventh place.
SpaceX has invested only a small amount relative to its own valuation in xAI, and could survive the loss of that investment.
Also, he owned the Miss Universe org (including Miss USA and Miss Teen USA) for decades, and he was known to walk into the dressing rooms of teen contestants as young as 15 while they were undressed. [0]
Also, he bragged about molesting women, and a court of law found that he sexually assaulted E Jean Carroll.
I haven't proven the case that Trump had sex with a minor, but there's way more than enough probable cause to believe it's more likely than not.
[0] https://web.archive.org/web/20200111171647/https://www.rolli...
Not really relevant to the thread, but there are simple answers to the "eViDeNcE??" question. You may have already known this.
yes
> When deepfake generators are capable of creating convincing imagery of flawless ideal fake humans, why do you suppose there’s so many real humans who report being non-consensual subjects of deepfake porn?
?
You say this so casually, as though it were a normal thing to know, or as if a normal person would know it. Does that actually seem true where you live right now?
And how do you know that, anyway, Harsh? I mean, all those "unblocked" games you stole to give away and that you also put on Github, that's one thing. But this...
Maybe you can explain why it is that, whenever lately I'm less than perfectly accurate on the technical requirements of using AI to generate kiddie porn, an entire legion of creepy anons comes pouring out of the woodwork to well-actually and bikeshed and bullshit about it? Are you really so anxious to prove your empiricism superior in this?
OpenAI tried to acquire Windsurf last year for $3B and couldn't.
1) A gamble on Cursor's compute constraint. 2) If 1) plays out, he can purchase Cursor with SpaceX's overvalued shares, at a fixed price even if the valuation increases.
Wild conjecture.
not that it isn't wild regardless
The main frenzy with Cursor started when you could access Anthropic models practically for free.
Otherwise it is just VS Code.
This is a bit simplistic. It's the VS Code fork that everyone used before Claude Code came to town. Real devs, on real projects. All that data they collected is worth a lot more than "just VS Code". Their composer2 is better than kimi2.5, and it's just a finetune on that data.
xAI had a decent model in grok4 (it was even SotA on a bunch of benchmarks for a few weeks), but they didn't have great coding models (code-fast was ok-ish but nothing to write home about, certainly nowhere near SotA). Now that they've been banned from using Claude, they'll get Cursor's expertise + data to build a coding model on top of whatever grok5 will be, plus their cluster for compute.
It doesn't sound like a bad plan to me, financial shenanigans or not.
> If you enable “Privacy Mode” in Cursor’s settings: zero data retention will be enabled for our model providers. Cursor may store some code data to provide extra features. None of your code will ever be trained on by us or any third-party.
Note the "may store some code data" and "none of your code will ever be trained on". In general you never want to include actual customer code in the training data, because of leaks you may not want: say someone has a secret hash somewhere, and your model autocompletes that hash. Bad. But that's not to say you couldn't train a reward model on pairs of prompts + completions. You have "some code data" (which could be acceptance rate) and use that; you just need to store the acceptance rate. Later, when you train new models, you check against that reward model: does my new model reply close enough to score higher? If so, you're going in the right direction.
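To make that concrete, here's a minimal sketch of the idea (entirely hypothetical, not Cursor's actual pipeline): you persist only acceptance signals keyed by a hash of (prompt, completion), never the plaintext, and later use those signals to score a new model's completions.

```python
import hashlib

# Hypothetical acceptance log: keyed by a hash of (prompt, completion),
# storing only (times_shown, times_accepted) -- no plaintext code retained.
acceptance_log = {}

def _key(prompt, completion):
    return hashlib.sha256((prompt + "\x00" + completion).encode()).hexdigest()

def record(prompt, completion, accepted):
    """Log whether the user accepted a suggested completion."""
    shown, accepts = acceptance_log.get(_key(prompt, completion), (0, 0))
    acceptance_log[_key(prompt, completion)] = (shown + 1, accepts + int(accepted))

def acceptance_rate(prompt, completion):
    """Historical acceptance rate for this exact pair, or None if unseen."""
    entry = acceptance_log.get(_key(prompt, completion))
    if entry is None:
        return None
    shown, accepts = entry
    return accepts / shown

def score_model(candidates):
    """Average known acceptance rate over a new model's (prompt, completion)
    pairs. A higher score suggests the model reproduces what users accepted."""
    rates = [r for p, c in candidates
             if (r := acceptance_rate(p, c)) is not None]
    return sum(rates) / len(rates) if rates else 0.0
```

This is a toy exact-match lookup; a real reward model would generalize to unseen completions, but the stored signal is the same kind of "some code data".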
> If you choose to turn off “Privacy Mode”: we may use and store codebase data, prompts, editor actions, code snippets, and other code data and actions to improve our AI features and train our models.
Self-explanatory.
> Even if you use your API key, your requests will still go through our backend!
They are collecting data even if you BYOK.
> If you choose to index your codebase, Cursor will upload your codebase in small chunks to our server to compute embeddings, but all plaintext code for computing embeddings ceases to exist after the life of the request. The embeddings and metadata about your codebase (hashes, file names) may be stored in our database.
They don't store (nor need to store) plain text, but they may store embeddings and metadata. Again, you can use those to train other things, not necessarily models. You can use metadata to check if you're going in the right direction.
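A rough sketch of what that indexing flow implies (my own illustration, with a toy stand-in for the embedding model): the plaintext chunk exists only for the life of the request, and only the embedding plus metadata gets persisted.

```python
import hashlib

def embed(chunk):
    # Stand-in embedding: a deterministic toy vector derived from the chunk's
    # hash. A real system would call an embedding model here instead.
    digest = hashlib.sha256(chunk.encode()).digest()
    return [b / 255.0 for b in digest[:8]]

def index_chunk(path, chunk):
    """Return only what would be persisted: embedding + metadata (hash, file
    name). The plaintext `chunk` is dropped when this function returns."""
    return {
        "file": path,
        "hash": hashlib.sha256(chunk.encode()).hexdigest(),
        "embedding": embed(chunk),
    }
```

Note the stored record can't reconstruct the code, but the hash still lets you check whether a chunk changed, and the embeddings are exactly the kind of derived data you could reuse for other training signals.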
Cursor needs their own 1st party backend model.
Sounds like a match made in heaven.
SpaceX spending $1B a month on various AI services seems ~plausible
(EDIT - Or maybe it's an IP transfer, or maybe it's over a longer time horizon. Idk but SpaceX clearly expects value from 'our work together' even if they don't exercise.)
And on the AI development side they're the ones providing compute in the form of a "million H100 equivalent Colossus training supercomputer"... On top of the cash.
But I agree that it's hard to articulate what Cursor services you could blow this much money on.
Maybe it is all just an option! Or maybe they get a bunch of IP either way?
I didn't say it was Wise.
I said it seems within possibility for this, very particular, corporation.
The cluster’s already paid for, so likely in the $2B range for operating cash needs. Not more than $5B.
If I imagine bringing in Cursor’s team to build a frontier model, ideally combined with Grok (which has one of the few truly proprietary data feeds available to it), then with a much larger custom model Cursor can solidify a place, and I get to do a stock swap to buy it. That sounds like a bet worth making.
Upshot - I bet there’s an MS/oAI deal on IP on the back of this; meanwhile the cluster goes brrr.