this post was submitted on 14 Jun 2024
359 points (98.6% liked)

[–] cerement@slrpnk.net 141 points 5 months ago (2 children)
[–] neo@lemy.lol 27 points 5 months ago (1 children)

But don't worry, those are one-way mirrors

[–] unipadfox@pawb.social 6 points 5 months ago (1 children)

Yeah, so they can see inside...

[–] higgsboson@dubvee.org 8 points 5 months ago

you have correctly identified the joke.

[–] Zacryon@lemmy.wtf 10 points 5 months ago

"If you don't have anything to hide, you won't mind us looking, would you?"

[–] AlexanderESmith@social.alexanderesmith.com 50 points 5 months ago (1 children)

I was wondering what that ominous music was when I woke up this morning

[–] GluWu@lemm.ee 11 points 5 months ago

I thought I shit myself again but this is actually worse

[–] cyberpunk007@lemmy.ca 49 points 5 months ago (3 children)

Not sure what to make of this

[–] edwardbear@lemmy.world 135 points 5 months ago (1 children)

No reason to be concerned, citizen. The former head of the largest surveillance agency in the world just joined the board of the largest data scraping company in the world.

[–] cyberpunk007@lemmy.ca 44 points 5 months ago (1 children)

Ok that makes sense. I don't like this.

[–] WhatAmLemmy@lemmy.world 26 points 5 months ago (2 children)

You will tell the AI all of your most private thoughts and feelings. The AI will be your closest friend, lover, and confidant.

If you refuse to let the AI know everything about you, you will be considered a terrorist pedophile... a TERROR-PEDO!

[–] edwardbear@lemmy.world 11 points 5 months ago (1 children)

How dare you have secrets? What are you hiding there? Why are you trying to have privacy? How dare you?

[–] homesweethomeMrL@lemmy.world 11 points 5 months ago

Remain calm. Assume the position. Your patience is appreciated. A legally authorized operative will be with you shortly. Stop resisting. Or else it gets the hose again.

[–] Etterra@lemmy.world 7 points 5 months ago

Remember that Friend Computer loves you. Returning that love TO Friend Computer is MANDATORY, and failure to comply will be considered treason and thus grounds for IMMEDIATE TERMINATION. Thank you citizen and have a mandatorily happy day!

[–] ricecake@sh.itjust.works 21 points 5 months ago (2 children)

It's a bit of a non-story, beyond basic press release fodder.

In addition to its role as "digital panopticon", the NSA also has a legitimate role in cyber security assurance, and they're perfectly good at it. The guy in question was the head of not just the world's largest surveillance entity, but also the world's largest cyber security entity.
Opinions on the organization aside, that's solid experience managing a security organization.
If OpenAI wants to make the case that they take security seriously, a former head of the NSA, Cyber Command, and the Central Security Service (who was also a department director at one university, a trustee at another, and holds a couple of master's degrees) isn't a bad way to try to send that message.

Other comments said OpenAI is the biggest scraping entity on the planet, but that title pretty handily goes to Google, or more likely to the actual NSA, given the whole "digital panopticon" thing and "Google can't FISA warrant the phone company".

Joining boards so they can write memos to the CEO/dean/regent/chancellor is just what former high ranking government people do. The job aggressively selects for overactive Leslie Knope types who can't sit still and feel the need to keep contributing, for good or bad, in whatever way they think is important.

If the US wanted to influence OpenAI in some way, they'd just pay them. The Feds' budget is big enough that bigger companies will absolutely prostrate themselves for a sample of it. Or if they just wanted influence, they'd... pay them.
They wouldn't do anything weird with retired or "retired" officers when a pile of money is much easier and less ambiguous.

At worst it's OpenAI trying to buy some access to the security apparatus to get contracts. That seems less likely to me, since I don't actually think they have anything valuable for that sector.

[–] lemmyvore@feddit.nl 8 points 5 months ago (2 children)

Lol. There are tons of security experts out there they could've hired. As Snowden said, there's only one reason you hire from the NSA: to work with the NSA.

[–] ricecake@sh.itjust.works 5 points 5 months ago (1 children)

Yeah, there are a ton of security experts. But none of them are the former head of the NSA.

Snowden is not exactly a font of expertise in this area, so I'm not sure that his opinion is particularly relevant. His only actual relevance is that he had access to classified data. He had no role in policy, and never had anything to do with business hiring practices.

[–] lemmyvore@feddit.nl 4 points 5 months ago (1 children)

there are a ton of security experts. But none of them are the former head of the NSA.

That doesn't make the point you think it makes. 🙂

Look at it this way. You can get the same expertise elsewhere, in any area you'd care to name: hiring, security, etc.

What this guy is uniquely positioned to do, what you can't get from anybody else, is oversight of integration with NSA surveillance. And that's where the smell comes from.

[–] ricecake@sh.itjust.works 2 points 5 months ago

Well, I'd contend that the same expertise isn't just readily available. Yes, he's uniquely positioned for connection to the surveillance apparatus, but the reputation of being the federal government's head of security is also a unique credential.

[–] ealoe@ani.social 0 points 5 months ago

Wtf would a low level IT contractor turned spy know about that? Quoting Snowden just makes you look like a moron

[–] exanime@lemmy.today 0 points 5 months ago (1 children)

At worst it's open AI trying to buy some access to the security apparatus to get contracts. Seems less likely to me, since I don't actually think they have anything valuable for that sector.

Didn't you also just say this in the same post?

The Feds budget is big enough that bigger companies will absolutely prostrate themselves for a sample of it

[–] ricecake@sh.itjust.works 2 points 5 months ago (2 children)

Those aren't contradictory. The Feds have an enormous budget for security, even just "traditional" security like everyone else uses for their systems, not just the "offensive security" we think of when we think "federal security agencies". Companies like Amazon, Microsoft, and Cisco will change products, build out large infrastructure, or even share the source code for their systems to persuade the Feds to spend their money. They'll do this because they have products that are valuable to the Feds in general, like AWS, or because they already have security products and services that are demonstrably valuable to the civil security sector.

OpenAI does not have a security product; they have a security problem. The same security problem as everyone else, one that the NSA is in large part responsible for managing for significant parts of the government.
The government certainly has an interest in AI technology, but OpenAI has productized their solutions with a different focus. The government has already bought, from Palantir, what everyone thinks OpenAI wants to build.

So while it's entirely possible that they are making a play to open those lines of communication to government decision makers for sales purposes, it seems more likely that they're aiming to leverage the message that "the guy who oversaw implementation of security protocols for the military and key government services is now overseeing implementation of our security protocols; aren't we secure and able to be trusted with your sensitive corporate data?"
If they were aiming for security productization and building ties for that side of things, someone like Krebs would be more suitable, since CISA is a bit better positioned for those ties to turn into early information about product recommendations and such.

So yeah, both of those statements are true. This is a non-event with bad optics if you're looking for it to be bad.

[–] lemmyvore@feddit.nl 1 points 5 months ago* (last edited 5 months ago) (1 children)

I've always kinda assumed that government, surveillance and analytics would be OpenAI's main goals, and that consumer stuff is just for marketing and a good image. There's no money and no point in enabling Jimmy Random to use GPT to find out if Africa exists, and the commercial applications of the models they produce can be better leveraged differently (black boxed TPU hardware for example).

That's also what I assume Google's been doing with all the data they collect. The location data alone they collect from billions of phones is an analyst's wet dream.

If it turns out they are NOT selling all that data to be mined by evil overlords I'm gonna be disappointed.

[–] ricecake@sh.itjust.works 3 points 5 months ago

Oh, to me it just doesn't remotely look like they're interested in surveillance type stuff or significant analytics.

We're already seeing growing commercial interest in using LLMs for stuff like replacing graphic designers, which is folly in my opinion, or for building better gateways and interpretive tools for existing knowledge bases or complex UIs, which could potentially have some merit.

ChatGPT isn't the type of model that's helpful for surveillance because while it could tell you what's happening in a picture, it can't look at a billion sets of tagged GPS coordinates and tell you which one is doing some shenanigans, or look at every bit of video footage from an area and tell you which times depict certain behaviors.

Looking to make OpenAI, who seem to me to be very clearly making a play for business-to-business knowledge-management AI as a service, into a wannabe player for ominous government work seems like a stretch when we already have very clear-cut cases of AI companies doing exactly that and even more. Like, Palantir's advertisements openly boast about how they can help your drone kill people more accurately.

I just don't think we need to make OpenAI into Palantir when we already have Palantir, and OpenAI has their own distinct brand of shit they're trying to bring into the world.

Google doesn't benefit by selling their data; they benefit by selling conclusions from their data, or by being able to use the data effectively. If they sell the data, people can use it as often as they want. If they sell the conclusions or the impact, they can charge each time.
While the FBI does sometimes buy aggregated location data, they can more easily subpoena the data if they have a specific need, and the NSA can do that without it even being public, directly from the phone company.
The biggest customer doesn't need to pay, so targeting them for sales doesn't fit, whereas knowing where you are and where you go so they can charge Arby's $2 to get you to buy some cheese beef is a solid, recurring revenue stream.

It's a boring dystopia where the second largest surveillance system on the planet is largely focused on giving soap companies an incremental edge in targeted freshness.

[–] exanime@lemmy.today -1 points 5 months ago (1 children)

So you are speculating this is all good and innocent, while I'm speculating they hired this guy to aim their data harvesting in a way the government would want to pay tons for... Yet your speculation is apparently more valid than mine because, checks notes, reasons.

[–] ricecake@sh.itjust.works 1 points 5 months ago

Yes, neither of us is responsible for hiring someone for the OpenAI board of directors, making anything we think speculation.

I suppose you could dismiss any thought or reasoning behind an argument for a belief as "reasons" to try to minimize it, but that's kind of a weak position to argue from. You might consider instead justifying your beliefs, or saying why you disagree, instead of just "yeah, well, that's just, like, your opinion, man".

[–] doylio@lemmy.ca 5 points 5 months ago

I don't like it

[–] Etterra@lemmy.world 38 points 5 months ago

I'm sure there's nothing nefarious going on here.

[–] Sibbo@sopuli.xyz 27 points 5 months ago (2 children)
[–] Mojave@lemmy.world 3 points 5 months ago (1 children)

Timothy D. Haugh is the current head. Nakasone is the former director of the NSA.

[–] Eximius@lemmy.world 5 points 5 months ago* (last edited 5 months ago)

Thanks for the official figures. However, his comment is still valid.

[–] 1984@lemmy.today 21 points 5 months ago (1 children)

Retired Army general as well. Such a nice person... Not.

[–] woodytrombone@lemmy.world 1 points 5 months ago

I've heard stories. Decent DIRNSA, proud father, but a real dick sometimes.

[–] conditional_soup@lemm.ee 13 points 5 months ago

Those tech bros are up to something

[–] gmtom@lemmy.world 13 points 5 months ago* (last edited 5 months ago)

Lmao my dyslexic ass read this as "formedlr head of NASA" and was confused why everyone was so doomy about it.

[–] toynbee@lemmy.world 10 points 5 months ago

I'm not alarmed. You're alarmed.

[–] phoneymouse@lemmy.world 1 points 5 months ago

Helen Toner was right

[–] mechoman444@lemmy.world -1 points 5 months ago (1 children)

We should do this for literally all people getting jobs: former Waffle House line cook gets job at Amazon sorting center.

[–] EncryptKeeper@lemmy.world 2 points 5 months ago (1 children)

Or maybe just the ones with interesting connotations behind them, like this one.

[–] mechoman444@lemmy.world -2 points 5 months ago (1 children)

There is no interesting connotation associated with this.

A high-up employee of the NSA is going to go work for a tech firm associated with AI, and y'all are sitting there looking at it going 🤔🤔🤔, "I wonder what that means."

Come on. Grow up.

[–] EncryptKeeper@lemmy.world 4 points 5 months ago (1 children)

Not a “high up employee”, the director of the NSA, the United States’ premier agency for gathering information on American citizens. And he’s not “going to work for” OpenAI; he’s joining as a board member of one of the United States’ premier private companies for gathering information from American citizens.

That’s a whole hell of a lot of overlap, especially considering your suspicious attempt to downplay not only this connection, but the role of the man himself.

[–] mechoman444@lemmy.world 0 points 4 months ago (1 children)

And what exactly do you think this situation is going to do? Is this dude going to dead-drop microfilm under a park bench for the NSA to pick up?

Pick literally any other company that's working on AI right now (which is pretty much all of them) and have them hire this guy. Google, Apple, Microsoft... I bet you wouldn't be as "worried".

Furthermore, if you think that the US government and its various agencies, branches, and affiliates haven't already been using or exploiting AI for nefarious reasons, you're really naive.

[–] EncryptKeeper@lemmy.world 0 points 4 months ago* (last edited 4 months ago) (1 children)

You don’t think a “former” agent of a U.S. government agency with the explicit purpose of information gathering, having a position of control in a private company with the explicit purpose of information gathering, might have a vested interest in that position beyond paying the bills? I suppose you think the corporate telco/ISP lobbyists getting jobs at the FCC is all on the up and up as well.

Geez talk about naive. What is it that you think can’t happen?

I love the ongoing illogical downplaying you keep doing too. “Dead drop microfilm” lmao. How overdramatic.

[–] mechoman444@lemmy.world 0 points 4 months ago

The fact that you put "former" in quotation marks pretty much tells me everything I need to know.

People work in the government. They work in high-end positions. They even become directors, senators, and congressmen, and then they leave those positions and get other jobs. That doesn't mean that they are still somehow spying on that company for the government.

The government doesn't need to implant people into private companies; that is completely unnecessary.

You need to get off this sub and go back to r/conspiracy with your unsubstantiated, anecdotal, conspiratorial nonsense.