Hacker News | new | past | comments | ask | show | jobs | submit | puppycodes's comments

All overdoses and accidents like this are tragic but I struggle with the whole insinuation that we should blame an app like ChatGPT for the decisions folks make with their own bodies.

ChatGPT is a hell of a lot safer than the friends I had in high school.


I definitely think more tools like this are needed, but not open sourcing it is a mistake.

You will be quickly replaced by a friendlier competitor.


There already are open source extensions. Visor is one I remember off the top of my head. https://marketplace.visualstudio.com/items?itemName=sidhants...

We will make it open source soon! Follow me on X (@Davellele) to be updated when we do :)

That's good to hear, but what are you waiting for?

Twitter/X is a big nope.

For a while now, traveling to the UK should be treated like visiting China or similar countries.

Leave your devices at home and expect zero privacy rights.



It’s almost like there’s a war on, or something, the way these major powers are acting…

How does this work over time?

Do you have to keep submitting this every month as they recollect your info from databases in other states?

Seems great in concept but I am skeptical this will change much.

Data doesn't respect state lines.


I would assume so. It's sort of a catch 22 because if they delete your data, they have no way of knowing about you when they buy another batch of data. To have some sort of no track list, they have to keep your data.

I'm also skeptical it will have any real effect. The law requires them to process deletion requests at a 45 day interval:

> Data brokers are required to process deletion requests at least once every 45 days beginning August 1, 2026.

But what if Broker A (based in CA) has a contract with Broker B, who doesn't do business in CA, to sync data once a day. Now Broker A will have your data on 44 out of 45 days and still be fully compliant with the law. Furthermore, it's not difficult to figure out when that 45 day interval comes up, so I would expect customers to figure that out and time their purchases accordingly.
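The timing argument above can be sketched with a toy simulation. The daily sync is the hypothetical from the comment, not anything specified in the law; the function names and intervals are my own illustration:

```python
def days_with_data(cycle_days=45, sync_delay=1):
    """Count days in one deletion cycle on which the broker holds the
    record, assuming the mandated deletion runs on day 0 and a partner
    sync restores the record sync_delay days later."""
    has_record = False  # deletion processed on day 0
    held = 0
    for day in range(cycle_days):
        if day >= sync_delay:
            has_record = True  # daily partner sync restores the record
        held += has_record
    return held

print(days_with_data())  # 44: the record is present 44 of every 45 days
```

With a daily sync the record is absent only on the deletion day itself, which is the "44 out of 45 days" figure above.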


> I would assume so. It's sort of a catch 22 because if they delete your data, they have no way of knowing about you when they buy another batch of data. To have some sort of no track list, they have to keep your data.

They could store a normalised, hashed version of your data and use it to filter any incoming datasets. But, of course, why would they?


That wouldn't really work because the hash key has to be both specific enough to be unique to you and also general enough to cover any incomplete data set that matches you.

It would work in many cases, though not all. You would not hash everything together. Instead, you hash normalized identifiers independently, such as email address, phone number, or physical address. An incoming dataset would only need to match one of these to be excluded.
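A minimal sketch of that hash-and-filter idea. The normalization rules and sample identifiers here are my own invention, purely to show why hashing each identifier independently lets a partial incoming record still match:

```python
import hashlib

def normalize(value: str) -> str:
    # Toy normalization: lowercase, keep only letters and digits.
    return "".join(ch for ch in value.lower() if ch.isalnum())

def fingerprint(value: str) -> str:
    # Store only the hash, never the raw identifier.
    return hashlib.sha256(normalize(value).encode()).hexdigest()

# Suppression list built from one deletion request: each identifier
# (email, phone, address) is hashed independently.
suppression = {fingerprint(v) for v in
               ["alice@example.com", "+1 555-0100", "1 Main St, Anytown"]}

def should_exclude(record: dict) -> bool:
    """Exclude an incoming record if any one of its fields matches."""
    return any(fingerprint(v) in suppression for v in record.values())

print(should_exclude({"email": "Alice@Example.com"}))  # True
print(should_exclude({"email": "bob@example.com"}))    # False
```

An incoming dataset that contains only a phone number, or only an email, still hits the suppression list, which is the property the comment describes.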

> physical address

Not unique to a person

> email address, phone number

Also often not unique to a person, although email addresses probably tend to have much longer lifespans as identifiers than phone numbers.

If the idea is to have a true opt-out system, it's really really difficult to implement given how these systems work.

If you look at the data provided by services like Accurint, you'll frequently see the same SSNs used for decades by multiple different individuals, often with IDs from different states with the same name and DoB despite obviously being different people. With how the system works in the US, it can often be impossible for anyone to determine which physical person the SSN was actually originally assigned to.

Same obviously applies to other identifiers you suggested, but even the seemingly good ones are not very good at uniquely identifying people.


You could of course key on things like SSNs, but data brokers wouldn't be very happy about that because there are lots of SSNs tied to multiple different people.

Won't somebody think of the data brokers!?

The government will, given that they're a fairly integral part of how the US economy works.

Every single financial institution relies on these data brokers. U-Haul needs data brokers to verify your driver's license, and the TSA needs data brokers to let you on a flight without an ID. There are simply countless reasons why you wouldn't want to break this system for people who haven't opted in to breakage.


It is a deletion request. Your future behavior may change, and that is on you. So if you never consent, there's nothing to delete.

That isn't how the collection of data works.

It's not like brokers wait around for you to sign up for something new.

Old data is resold, merged with new data, mixed, stolen, discovered, reformatted... etc...

Your actions of course do have an impact, but does changing your behavior prevent the outcome of your data being collected?

Not even close.


But you did consent, every time you agreed to some TOS you didn't read. This is, of course, stretching the definition of consent, but legally you did.

You can see this in action today, if you make the effort to manually remove yourself from data brokers.

Some of the brokers do offer an easy removal process and will handle your request right away, but then your record will reappear after some amount of time, obviously purchased from another broker.

I would not be surprised to discover that these individual brokers are, in fact, owned by the same entity and they merely exchange records periodically.

This is the reason that I choose to use Optery. They have the bandwidth and tools to chase my records on my behalf, for as long as I pay them.


> I would assume so. It's sort of a catch 22 because if they delete your data, they have no way of knowing about you when they buy another batch of data. To have some sort of no track list, they have to keep your data.

If I ever stumble upon such an obvious oversight/loophole, I find it's best to not immediately stop, but to ask: "How do they intend to solve this?"

In this case, the first part of the terms of use solves your conundrum:

> By submitting a deletion request through DROP, you consent to disclosure of your personal information to data brokers for purposes of processing your deletion request pursuant to Civil Code section 1798.99.80 et seq. unless or until you cancel your deletion request. Additionally, you acknowledge that data brokers receiving your deletion request will delete any non-exempt "personal information," as defined in Civil Code section 1798.140(v), which pertains to you and was collected from third parties or from you in a non-"first party" capacity (i.e., through an interaction where you did not intend or expect to interact with the data broker).


What happens when you trigger the model's censorship function?

Does it comment that it can't comment on that?


this reminds me of learning cursive


long live anna's archive.

a true gift to humanity.


AI is getting way too much credit in this article.

There are much much bigger forces that impact society in the way the author describes.


Go on.


I'm certain the users of the Ruby app don't care how "serious" your programming language is.


Do the users of _any_ language care how serious the language is? Nowadays there are so many options to pick from. I love Ruby, I use it all the time, but I also like Python, Elixir, and systems languages like C++.


No, they don't, that's my point ;) Use what works for you, everything else is just noise.


Another law that will only apply to the poor.

Politicians and the rich will be exempt.


They literally applied for exemptions for themselves and law enforcement xD

