this post was submitted on 13 Aug 2023
69 points (80.0% liked)

[–] Aesthesiaphilia@kbin.social 103 points 1 year ago (1 children)

> How did Netflix know I was gay before I did?

Honey

EVERYONE knew

[–] Madison_rogue@kbin.social 73 points 1 year ago* (last edited 1 year ago) (1 children)

Seriously though, she picked a show the algorithm happened to surface, watched it, and more content of that type was suggested to her by the algorithm.

This isn't exactly rocket science.

[–] reflex@kbin.social 31 points 1 year ago* (last edited 1 year ago) (5 children)

> and more content of that type was suggested

That, or they might have figured it out from her search patterns alone—like how Target figured out that one woman was gregnant before she did.

[–] Legolution@feddit.uk 15 points 1 year ago

It was never proven that the baby was Greg's.

[–] shinjiikarus@mylem.eu 14 points 1 year ago (1 children)

Has this story ever been confirmed by Target directly? As this happened in America and her father was outraged about it, it would have been awfully convenient to “blame” the algorithm for “discovering” she was pregnant. It takes quite a data analyst to figure out trends before someone even knows they are pregnant; it doesn’t take a genius to spot a pattern for someone who knows they are pregnant and is just hiding it from their dad.

[–] what_is_a_name@lemmy.world 17 points 1 year ago (1 children)

Yes. It was many years ago, but this was confirmed. Target still does their targeting, but now scatters unrelated items into the ads to hide what they know.

[–] NotSpez@lemm.ee 9 points 1 year ago

> target still does their targeting

Awesome sentence

[–] Madison_rogue@kbin.social 8 points 1 year ago (2 children)

They didn't figure anything out. There's no sentience in the algorithm, only in its creators. It just chose content based on input, so it all revolves around the choices the article's author made.

Same thing with the pregnant woman: the algorithm made suggestions based on her purchase history. It made the connection that product A was also bought by pregnant women, so the shopper might be interested in product B, something an expecting mother would buy.
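
That “bought product A, might want product B” step is just co-occurrence counting over purchase baskets. Here's a minimal Python sketch of the idea (the products and baskets below are invented for illustration, not Target's actual method):

```python
from collections import Counter, defaultdict
from itertools import combinations

# Hypothetical purchase baskets; a real retailer would have millions of these.
baskets = [
    {"unscented lotion", "prenatal vitamins", "cotton balls"},
    {"unscented lotion", "prenatal vitamins", "diapers"},
    {"prenatal vitamins", "diapers", "baby wipes"},
    {"diet coke", "chips"},
]

# Count how often each pair of products shows up in the same basket.
pair_counts = Counter()
for basket in baskets:
    for a, b in combinations(sorted(basket), 2):
        pair_counts[(a, b)] += 1

def also_bought(product, min_support=2):
    """Rank products that co-occur with `product` at least `min_support` times."""
    scores = defaultdict(int)
    for (a, b), n in pair_counts.items():
        if a == product:
            scores[b] += n
        elif b == product:
            scores[a] += n
    return sorted(
        (item for item, n in scores.items() if n >= min_support),
        key=lambda item: -scores[item],
    )

print(also_bought("prenatal vitamins"))
# ['unscented lotion', 'diapers'] on this toy data
```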

[–] reflex@kbin.social 15 points 1 year ago (1 children)

> They didn't figure anything out.

Ugh, I was agreeing with you, and you go pedant. Come on, you should know "figure out" doesn't necessarily imply sentience. It can also be used synonymously with "determine."

[–] Madison_rogue@kbin.social 7 points 1 year ago

Sorry, I misunderstood your tone. I apologize for going all pedantic…it’s a character flaw.

[–] ExLisper@linux.community 4 points 1 year ago

I believe in the case of the pregnant woman, she was offered diapers and the like, based on the food she bought. So it's not simply "you bought diet coke, maybe try diet chocolate?". In the case of Netflix there's no "show only gay people watch" category, so her complaints are silly.

[–] EnderWi99in@kbin.social 40 points 1 year ago

Because you watched stuff that a lot of gay people watched and then watched more stuff the algorithm suggested based on your previous watch history. It's not magic or anything.
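
Put differently, it's neighborhood-style collaborative filtering: find users whose histories overlap with yours, then suggest what they watched and you haven't. A toy sketch under that assumption; the usernames, titles, and overlap measure are all made up here, and a real system like Netflix's is far more elaborate:

```python
# Invented watch-history data for illustration only.
watch_history = {
    "user_a": {"Heartstopper", "Queer Eye", "Stranger Things"},
    "user_b": {"Heartstopper", "Queer Eye", "Sex Education"},
    "user_c": {"Top Gear", "The Grand Tour"},
}

def jaccard(s1, s2):
    """Overlap between two histories (0 = nothing shared, 1 = identical)."""
    return len(s1 & s2) / len(s1 | s2)

def recommend(user, k=1):
    """Suggest titles the `k` most similar users watched but `user` hasn't."""
    mine = watch_history[user]
    peers = sorted(
        (u for u in watch_history if u != user),
        key=lambda u: jaccard(mine, watch_history[u]),
        reverse=True,
    )
    suggestions = []
    for peer in peers[:k]:
        suggestions.extend(watch_history[peer] - mine)
    return suggestions

print(recommend("user_a"))  # ['Sex Education'] on this toy data
```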

[–] autotldr@lemmings.world 13 points 1 year ago

This is the best summary I could come up with:


"Big data is this vast mountain," says former Netflix executive Todd Yellin in a video for the website Future of StoryTelling.

Facebook had been keeping track of other websites I'd visited, including a language-learning tool and hotel listings sites.

Netflix told me that what a user has watched and how they've interacted with the app is a better indication of their tastes than demographic data, such as age or gender.

"No one is explicitly telling Netflix that they're gay," says Greg Serapio-Garcia, a PhD student at the University of Cambridge specialising in computational social psychology.

According to Greg, one possibility is that watching certain films and TV shows which are not specifically LGBTQ+ can still help the algorithm predict "your propensity to like queer content".

For me, it's a matter of curiosity, but in countries where homosexuality is illegal, Greg thinks that it could potentially put people in danger.


I'm a bot and I'm open source!

[–] fubo@lemmy.world 11 points 1 year ago* (last edited 1 year ago) (2 children)

This sort of thing is just gonna happen with recommendation systems. There was a case over a decade ago where Target, the store, figured out that a teenager was pregnant before she told her family, and sent relevant mailings.

[–] Clav64@lemmy.ml 9 points 1 year ago

It's possible that this story didn't happen. The points raised here highlight some areas where we should remain skeptical:

https://medium.com/@colin.fraser/target-didnt-figure-out-a-teen-girl-was-pregnant-before-her-father-did-a6be13b973a5

[–] BastingChemina@slrpnk.net 1 point 1 year ago

The one that creeped me out: my parents received a sample package of baby products from Nestlé, under my name, a week before my wife gave birth.

And that was even though we were living in another country, said nothing on social media, and didn't go to any medical appointments in my parents' country.