News & Current Events Apr 21, 2026 at 4:43 PM

Florida's attorney general announces criminal investigation into OpenAI

Posted by Remarkable-Pea4889


NBC News
James Uthmeier said OpenAI’s ChatGPT offered detailed advice to the suspect in the 2025 FSU mass shooting, including what type of gun to use.


57 Comments

mixmasterADD 1 day ago +1013
I’m gonna assume that someone in Florida with power has an interest in a competitor, because that’s how the government works now
ProgrammerOk1400 1 day ago +262
Yep. Looking at Musk
shredemdown 19 hr ago +9
Musk was right about the reasons, and Sam Altman is also a known liar who has been able to spin PR in his own favor. He tells you what you want to hear in the moment and that's it. It's basically just two scumbags, and one of them seems to have a legitimate grievance.
monkeypickle8 1 day ago +41
Isn't he one of the founding investors of OpenAI?
LastPhoton 1 day ago +141
He was, but they had a nasty falling out and he hates the OpenAI CEO with a passion.
dirtyshits 1 day ago +47
Because he refused to bend his knee to Musk. Musk has always been hated by anyone he partners with. He was hated even before the PayPal days.
Confident-Beyond6857 22 hr ago +14
Some of the issues Musk had with them are legit concerns for everyone. They completely abandoned their original charter and headed down the exact path they said they started the non-profit to prevent. There's also the issue of all the money which was donated and is now locked away, only to be used charitably at the discretion of the now for-profit OpenAI. You can bet your ass that will be abused. I'm not absolving Musk of anything here, he definitely has multiple motives and most of them are not good for the rest of us, but he brought up some real issues with OpenAI.
st-shenanigans 22 hr ago +27
I'd be more willing to believe musk doesn't give a f*** about those issues, they're just convenient for his agenda
DrXaos 11 hr ago +3
given that he started a fully commercial AI competitor with filth and no safety
Stuntingonthesehoes 19 hr ago +13
He's just mad he's not the one to do it
CondescendingShitbag 1 day ago +36
Sure, but now they're competition to Grok, and he can't be having that.
Yuukiko_ 1 day ago +8
He has his own AI now
colemon1991 23 hr ago +12
And, like his children, it also doesn't like him.
Acceptable_Mushroom 19 hr ago +6
All five of his children from his first wife not only don't like him but hate him.
Mistrblank 1 day ago +30
Or someone just looking for a kickback to make it go away.
Alantsu 1 day ago +16
The creation of openAI occurred with Epstein in the room. Just a reminder that this will go nowhere.
Consistent-Throat130 23 hr ago +5
My theory is that they're going after them with *a deliberately weak-ass case* because they're trying to set precedent that defends AI corporations.

The kind of things being "asked" of the AI could be accomplished with, like, a Google search or two (which, tbf, is also AI-powered these days) and a maps application. There are numerous gun blogs that'll cover which weapon is effective in which circumstances, which ammo to use, etc. And Google Maps gladly tells you how busy a location is at different times. I'm assuming Apple Maps is the same. If **a non-AI app would have produced the same outcome**, this is an absurd investigation.

**So what's AI-specific here?** "How would the nation react to a shooting" is the only really "freeform, requires AI interpretation" question of the bunch in the article. Unless we're expecting the AIs to go above and beyond the search engines/mapping apps of old and somehow flag these inquiries? Even then, how would you structure an intervention here?
Joe18067 1 day ago +13
It's all about greasing the right palms, something that OpenAI apparently neglected to do.
Sup3rT4891 1 day ago +3
Probably lobbying for Palantir and Anthropic to move there? Or just plain old corruption / insider trading.
hitsujiTMO 1 day ago +15
Nah. This is to do with the FSU shooting. They are legitimately going after them. Not politically motivated.
GuestGulkan 23 hr ago +14
Guns don't kill people, AI does. So, yes it's politically motivated.
ConsistentDay5620 23 hr ago +2
Bezos' new secretive AI company did just announce the close of its funding round.
metalflygon08 23 hr ago +1
Didn't Disney invest a lot in OpenAI?
Paxoro 23 hr ago +1
Don't underestimate the likelihood that someone told Trump to hate OpenAI, so this is DeSantis's sad attempt to still cozy up to Trump. Now that he's fucked up and isn't running for office, his only chance to remain in power is a job in Trump's cabinet. Especially given the stories that DeSantis is basically begging for a job in the administration.
Aazadan 23 hr ago +1
OpenAI is digging in with the feds and trying to establish itself as too big to fail.
rgumai 1 day ago +136
Headline made me think he did something useful for a change. Reading the content... yeah, this is just as dumb as his other bullshit.
Confident-Beyond6857 1 day ago +93
From the article:

> “I’m a big believer in limited government. I believe government should only interfere in business activities when you have significant harm to our people,” he added. “This is that.”

I'd love to know how he feels about Hegseth vs Anthropic.
Floreat_democratia 15 hr ago +11
Conservatives haven't believed in limited government for 50 years. Reagan expanded the government, as did Trump. It's too bad there aren't any functioning media outlets who can call them out on their constant stream of lies. The reality is that "limited government" means less regulation for corporations and no taxes for billionaires.
526mb 1 day ago +83
I'm no lover of tech and AI, but this is f****** stupid and hypocritical. If the theory is that manufacturers or providers of tools must be held criminally liable when those tools/services are used in the furtherance of criminal activity, then is the State of Florida going to investigate and charge the weapons manufacturer? The seller of the weapon? AI may have made it easier for the shooter to plan his crime, but you know what was essential? The f****** gun.
cyclemonster 1 day ago +29
Even better, [there's a federal law that explicitly immunizes gun manufacturers from such liability.](https://en.wikipedia.org/wiki/Protection_of_Lawful_Commerce_in_Arms_Act)
chef-nom-nom 1 day ago +9
> but you know what was essential?
> The f****** gun.

100%. That said, LLMs and "AIs" are not exactly like other tools that can be misused. A hammer can't talk you into hammering your wife to death with it. A few recent stories of people dying by su*cide after chatbots convinced them are more examples of it being atypical toolery. These aren't normal tools. The "move fast and break things" mindset doesn't end well. Even with example after example of these things being dangerous, it seems the people running the companies are being very c**** with the guard rails. But that's the thing, they want to "see what happens" when they let these networks *just go*.

To know for sure, we'd need the full chat log between GPT and the shooter to see if it was responding like a disconnected search result or with the kind of encouragement LLMs are known for.

This I do agree with:

> Uthmeier announced his office was issuing subpoenas seeking information about OpenAI’s policies and internal training materials regarding user threats of harm to themselves and others from March 2024 until this month. The subpoenas will also request information from the same time frame on all policies and internal training materials regarding how OpenAI cooperates with and reports crime to law enforcement agencies.

With the widespread adoption of these services, the training materials and methods should be made public knowledge. Or rather, they should be "Open" to inspection by anyone.

But as a commenter above noted:

> I’m gonna assume that someone in Florida with power has an interest in a competitor, because that’s how the government works now

So who the heck knows?
oldteen 1 day ago +6
They probably should've considered this before the toothpaste left the tube. Regardless, since its inception, the internet has had freely available useful tools which can "do good" or can potentially cause "harm". It all depends on the user's intent. By their logic, owners of rocks are now potentially liable for third-party crimes.
Svennis79 23 hr ago +1
It would set a fun legal precedent that will get a few people nervous!
Girion47 1 day ago -6
Gun manufacturers should absolutely be subject to liability from misuse of their product.
NerdyGuy117 1 day ago +8
> misuse of their product

That prevents liability for everything. You don't sue Toyota when someone purposely uses the vehicle to hit someone.
tri_wine 1 day ago +1
IANAL, so I'm just talkin here, but I think it's reasonable to suggest that Toyota *could* be sued if it could be shown that some feature of their vehicles makes them particularly good for hitting people and they knew that and decided not to do anything about it. Companies that produce inherently dangerous products should be expected to take reasonable steps to limit the misuse of their products where possible. I don't think *any* industry should be given blanket immunity from liability - that just invites abuse.
max_vette 1 day ago -1
You would if they advertised their new car as being the perfect weapon to hit someone. 
Strong-Log-7095 1 day ago +2
Bingo. Car manufacturers are certainly liable if they sell cars that, in addition to doing normal car things, don't have adequate safety features or have intentionally removed safety features that protect the users, passengers, or pedestrians/other drivers.

If Toyota adds a new feature to the Camry where you press a button and a big heavy stick comes out the side door to kneecap pedestrians, the lawsuit would look a lot better than if some guy just drove his Camry up and down the sidewalk. Toyota can't design a car to prevent that (with current technology), but they sure as heck can prevent the Knee-Capper option from being installed. Also, not having the Knee-Capper doesn't prevent the car from being, you know, useful for its intended purpose.

We know the technology to make guns safer exists. Fingerprint ID as a mandatory requirement, where the gun will only fire if the registered user is holding it, has been around for decades. Gun manufacturers are not adding it specifically because they are exempt from liability. Can an enterprising person get around that? Sure. But it objectively would reduce misuse of the product, and not making it legally required is insane. It has nothing to do with the Second Amendment, it isn't an undue burden on the gun manufacturer, nor does it prevent the user from using the gun for its intended purpose.
NerdyGuy117 1 day ago +3
They advertise their guns for self defense, not cold blooded murder.
max_vette 1 day ago -8
> They advertise their guns for self defense, not cold blooded murder

I didn't realize Toyota made guns.
NerdyGuy117 1 day ago +3
Sigh… it’s obvious you do not want a discussion.
max_vette 1 day ago -4
I was replying to your notion of suing Toyota. You obviously don't want to have a discussion of corporate liabilities. 
____Manifest____ 20 hr ago
You’re comparing apples to oranges. A Toyota’s sole purpose is not to kill.
0
Subietoy78 1 day ago +15
I mean Bondi is a low bar to get over and yet this idiot somehow slides under it every time.
Exciting_Farmer6395 1 day ago +14
You'd figure the manufacturer of the actual smoking gun would have a bigger liability exposure than an AI provider... But hey 🤷‍♂️
cyclemonster 1 day ago +2
Is any of that advice different from what a clerk at a gun store would have told him?
A_Nonny_Muse 1 day ago +5
Why? Is it too 'woke' for him? I read the article. F*** Florida anyway.
WolfWraithPress 1 day ago +6
ITT: A.I. glazers.

You can hate two things at the same time. Guns AND A.I. should be stopped.
angiosperms- 1 day ago +7
Yeah, the comments confuse me. It's hypocritical, sure, but I also don't think AI should help someone plan a mass shooting.... There are a lot of prompts LLMs should answer with "oh hell no" and a report to the authorities, and they don't. This isn't even the only scenario like this.
Remarkable-Pea4889 1 day ago +1
Apparently this sub hates Florida more than it hates AI and they are obligated to take sides. If Florida says X, they must say anti-X.
_head_ 1 day ago +2
AI should not be stopped, that's naive. We should put guardrails in place rather than letting it expand unchecked. 
WolfWraithPress 21 hr ago +2
I curse you to a datacenter in your back yard.
_head_ 21 hr ago +2
I'd rather have a data center next door than the shitty neighbors I have currently. 
Hayduke_2030 16 hr ago +3
Let us know how that 24/7/365 hum is treating your mental health in a year. Not to mention your resource access and pricing.
LordKnt 13 hr ago +1
until you see how much power it needs and how it gets it
Ok_Mathematician938 1 day ago +1
So, according to "The attorney general, [James Uthmeier](https://www.nytimes.com/2025/07/28/us/james-uthmeier-florida-attorney-general-desantis.html), a Republican," OpenAI is not a person. Does that mean we're deciding right now that it has no rights?
wishlish 1 day ago
Hey, we're the Florida government. Let's make it so no company wants to move here. I mean, I don't even like AI companies, but this, on top of the Disney thing, feels stupid.
0