The best Hacker News stories from the past week

Latest posts:

Pirate Library Mirror: Preserving 7TB of books (that are not in Libgen)

How much health insurers pay for almost everything is about to go public

YouTube removes criticism of dangerous fractal wood burning, but leaves up tips

Atlassian is 20 years old and unprofitable

Give up GitHub: The time has come

Coinbase is reportedly selling geolocation data to ICE

Supreme Court limits EPA’s power to cut emissions

Thunderbird 102

Why America can’t build

Ask HN: GPT-3 reveals my full name – can I do anything?

Alternatively: what's the current status of personally identifying information and language models?

I try to hide my real name whenever possible, out of an abundance of caution. You can still find it if you search carefully, but on today's hostile internet I see this kind of soft pseudonymity as my digital personal space, and I expect to have it respected.

When playing around in GPT-3, I tried making sentences with my username. Imagine my surprise when it spat out my (globally unique, unusual) full name!

Looking around, I found a paper that says language models leaking personal information is a real problem [1], a Google blog post that says there's not much that can be done about it [2], and an article that says OpenAI might automatically redact phone numbers in the future, but that other types of PII are harder to remove [3]. I found nothing, however, on what is *actually* being done.

If I had found my personal information in Google search results, or on Facebook, I could ask for it to be removed, but GPT-3 seems to have no such mechanism. Are we supposed to accept that large language models may reveal private information, with no recourse?

I don't care much about my *name* being public, but I don't know what else the model might have memorized (political affiliations? Sexual preferences? Posts from 13-year-old me?). In the age of the GDPR, this feels like an enormous regression in privacy.

EDIT: a small thank-you to everybody commenting so far for not directly linking to specific results or actually writing out my name, however easy that might be.

If my request for pseudonymity sounds strange given my lax infosec:

- I'm more worried about the consequences of language models in general than about my own case, and
- people have done a lot more for a lot less name information [4].

[1]: https://arxiv.org/abs/2012.07805
[2]: https://ai.googleblog.com/2020/12/privacy-considerations-in-large.html
[3]: https://www.theregister.com/2021/03/18/openai_gpt3_data/
[4]: https://en.wikipedia.org/wiki/Slate_Star_Codex#New_York_Times_controversy

Life is not short

What happened to the lab-leak hypothesis?
