Hi all. The whitepaper mentions that access to data is a challenge for AI developers, and that contributors can effectively sell data to Agents and stipulate their own privacy restrictions. My question is: what is to prevent companies from selling customer data they've collected over the years, data that customers never intended to become a permanent part of a general AI? And what is to prevent malicious use of that data outside the stipulations set forth by the provider? Thanks!
This is already happening! This is part of what blockchain will help to solve in combination with homomorphic encryption and other technologies that will put people’s data back in their control. Privacy will be a much bigger issue in the future as AIs become far more intelligent.
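Homomorphic encryption is the piece that makes "analyze without exposing" possible: you can compute on ciphertexts and only the key holder sees the result. A minimal sketch in Python of the Paillier scheme, which is additively homomorphic (multiplying ciphertexts adds the plaintexts). The primes here are tiny and the code is purely illustrative, not secure:

```python
import math
import random

# Toy Paillier keypair with tiny, insecure primes -- illustration only.
p, q = 17, 19
n = p * q              # public modulus
n2 = n * n
lam = math.lcm(p - 1, q - 1)   # Carmichael function lambda(n)
mu = pow(lam, -1, n)           # valid because we use g = n + 1

def encrypt(m):
    """E(m) = (1 + n)^m * r^n mod n^2, with random r coprime to n."""
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(1 + n, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    """D(c) = L(c^lam mod n^2) * mu mod n, where L(u) = (u - 1) // n."""
    u = pow(c, lam, n2)
    return ((u - 1) // n) * mu % n

a, b = 42, 58
ca, cb = encrypt(a), encrypt(b)
# Whoever runs this multiplication never learns a or b, yet the
# key holder can decrypt the product to recover the sum a + b.
assert decrypt(ca * cb % n2) == (a + b) % n
```

In the grocery-store scenario, this is the shape of the answer to "whoever analyzed it would have to have access to it": with schemes like this, the analyst holds only ciphertexts.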
Right, and I’m wondering how that applies here. Say you’re a grocery store with a loyalty card that tracks purchases for all your customers, including Bob. Bob was cool with the store tracking his purchases as long as the data was protected and not shared. Then the store decides to bring in an AI to analyze the data, or to monetize it by contributing it to a data pool on SNet. What if Bob didn’t want his data accessible to the whole world? Once it’s out of the store’s isolated silo and into the public domain, who knows what could happen to it. And even if it were only analyzed, whoever analyzed it would have to have access to it, and what’s to prevent them from acting maliciously?
I think that Bob would be willing to give up a bit of his privacy in order to find out when he might be pregnant, don’t you think (assuming you’re referencing Bob from Fight Club, who had all sorts of adaptations as a result of extensive steroid use)? As long as stores employ centralized, proprietary data silos with questionable privacy “agreements” or contracts with said customers, the exchange will mostly be one-sided, aside from the ancillary benefit of pregnancy determination without Bob having to go out and buy a test.
This is what blockchain aims to solve: the safety of your data and, eventually, the ability to control it. You would be able to revoke access to your data. For instance, with medical records stored on the blockchain, medical professionals could be granted access to your history only temporarily, with those privileges automatically revoked after treatment.
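The automatic-revocation idea above can be sketched as simple expiry-based access control, the kind of logic a smart contract could enforce on-chain. This is a hypothetical Python illustration; the class and method names are mine, not from any SNet API:

```python
import time

class MedicalRecordAccess:
    """Hypothetical sketch: time-limited grants that expire on their own."""

    def __init__(self):
        self._grants = {}  # professional_id -> expiry timestamp (epoch seconds)

    def grant(self, professional_id, duration_seconds):
        """Patient grants temporary access; it lapses automatically."""
        self._grants[professional_id] = time.time() + duration_seconds

    def revoke(self, professional_id):
        """Patient can also revoke early, before the expiry."""
        self._grants.pop(professional_id, None)

    def has_access(self, professional_id):
        """Access holds only while an unexpired grant exists."""
        expiry = self._grants.get(professional_id)
        return expiry is not None and time.time() < expiry

records = MedicalRecordAccess()
records.grant("dr_smith", duration_seconds=3600)  # one hour for treatment
assert records.has_access("dr_smith")
records.revoke("dr_smith")                        # treatment finished early
assert not records.has_access("dr_smith")
```

The point is that no one has to remember to take access away; it is the default outcome, and the patient keeps an early-revoke option on top.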
The big deal here is that you will control your data and where it goes and who has access to it. It is unclear how this will all play out in terms of adoption and execution, but what is clear is that higher levels of privacy and control over one’s data will play a huge role in our near future.
The lesson right now is: if this exchange of private information concerns you, don’t trade it away for a tiny savings at the checkout. For me, it is not worth it.
The issue with the idea of controlling your own data is that, to some extent, you already can, but people don’t. Why? Because it’s complex, people are somewhat lazy, and we are spectacularly poor at assessing this sort of risk. If you’re making scores of personal transactions per day, it’s an hour’s work each evening to review what will likely be VCR-instruction-level privacy controls.
The example, though… seems very, very familiar. Someone else used it, but I can’t put my finger on it. OP?