Data Privacy and Homomorphic Encryption

[background fluff] I am currently enrolled in a “Software for Society” class at UCSC. It’s an upper-division CS elective, where we discuss topics relating to non-profits, charities, and social good in relation to software (or computers in general), as well as write a piece of software requested by a non-profit in teams of five. A bit over a week ago, we had a guest speaker from the cryptography industry. Well, that might be a bit of a generous description. The speaker runs a business that makes claims like “Patent pending steganography and cryptography technology for increased security.” Patent pending, you say? Hoo boy, sign me up! Snideness aside, while I was pretty disappointed with the presenter’s technical knowledge and solutions, that’s not what I’m writing about. And indeed, that isn’t what I wrote about. When I went into that presentation, I was hoping for an interesting discussion on the relationship between privacy, encryption, and services whose profit models rely on user data. Instead, I was bogged down in technical details. So a few days later, I made a post on the class’s internal forums in an attempt to start the conversation I wanted to have. Unfortunately, the class doesn’t seem to be terribly interested in these things. But this is something I think is important, so I decided to move the content of that forum post somewhere more public. So, below lies an edited version of the post I made to the class forums. Opinions and discussion are more than welcome. [/background fluff]


Right now, companies are using data mining techniques to learn everything they can about us, so as to better market services, products, and other advertisements to us. This leads to a few problems, such as companies having more data on us than we feel comfortable with, and the algorithms they use being able to manipulate our behavior. Currently, the responses to these problems fall into two largely incompatible approaches: let the corporation have the data it wants so it can better provide its services to us, or don’t let the corporation have the data it wants, so as to protect privacy (either through encryption or by simply not using the service). However, there is a new form of encryption, known as fully homomorphic encryption (FHE), that stands somewhere in between these solutions. What fully homomorphic encryption allows is for a function to take an encrypted input, run any arbitrary calculation on it, and produce a correct encrypted output – all without the algorithm being able to tell what that input or output actually is.[1] So a company could take in your personal information in encrypted form (e.g. your age, gender, race, sexual orientation, salary, whatever), run the algorithms it currently runs (e.g. to determine what you would probably like to buy, who you should be friends with, etc.), and give you an encrypted output only you can decrypt. The company never knows what the inputs or outputs are; it just knows that they work.
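To make that property concrete, here is a minimal, deliberately insecure sketch in Python using the Paillier cryptosystem, which is only additively homomorphic (a true FHE scheme extends this to arbitrary computation). The tiny primes and the “age”/“salary_bracket” inputs are purely illustrative assumptions on my part, not anything a real deployment would use.

```python
# Toy Paillier encryption: additively homomorphic.
# NOT secure -- tiny hard-coded primes, for intuition only.
import math
import random

p, q = 293, 433                 # toy primes; real keys use ~1024-bit primes
n = p * q
n_sq = n * n
g = n + 1                       # standard choice of generator
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)            # modular inverse of lambda mod n

def encrypt(m):
    """Encrypt an integer m with 0 <= m < n."""
    while True:
        r = random.randrange(1, n)
        if math.gcd(r, n) == 1:
            break
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(c):
    """Recover the plaintext from a ciphertext."""
    x = pow(c, lam, n_sq)
    return ((x - 1) // n) * mu % n

# The "service" multiplies ciphertexts, which adds the hidden plaintexts.
# It never sees the values 29 and 4, only the encrypted blobs.
age, salary_bracket = 29, 4
c_sum = (encrypt(age) * encrypt(salary_bracket)) % n_sq
assert decrypt(c_sum) == age + salary_bracket
print(decrypt(c_sum))           # 33, computed entirely on encrypted data
```

A full FHE scheme lets the service chain additions and multiplications arbitrarily, which is what would let it run its existing recommendation algorithms without ever decrypting your data.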


Fully homomorphic encryption is a very new field, and there are very, very few implementations of it. However, they are starting to appear. For example, https://hcrypt.com/ is a brand new open source implementation of some FHE schemes. As it becomes more and more feasible, the question then becomes: is this ethical? While it solves the problem of a company’s employees knowing personal details about us, it does not prevent these algorithms from being used in ways we may not feel comfortable with. For example, using FHE, Target could provide its frequent-buyer cards without keeping track of what we are buying. However, it would not have stopped the incident where Target started recommending baby products to a teenage girl whose family didn’t even know she was pregnant (http://www.forbes.com/sites/kashmirhill/2012/02/16/how-target-figured-out-a-teen-girl-was-pregnant-before-her-father-did/). It could prevent the owners of a bank from going through loan applications and deliberately discriminating by race, but it would make it easier for an algorithm to discriminate by race if there happened to be some correlation between race and successful loans. Is this something we should be okay with?


When it comes to matters like this, I generally prefer technical solutions over social ones. I trust AES to keep my information secure more than I trust the people I’m giving it to not to read it, regardless of what the law says. However, in this case, I think it might be a situation where social solutions are, at least currently, more viable than technical ones. That is, we need to ensure our algorithms don’t discriminate, and we need to avoid giving these services information we wouldn’t want them to have. This is an unacceptable solution. We can’t honestly expect the services that hold our information to follow these guidelines, and we can’t be expected to constantly discern what is and isn’t appropriate information for them to have. Just as a single malicious app on your phone with the right (wrong?) set of permissions is as bad as having ten, a single slip-up on our part can lead to unforeseen disaster in our lives. What, then, are we to do? Homomorphic encryption provides some mitigation of these problems, but it is far from the panacea we seek.


[1] For those who are interested, the Wikipedia article gives decent coverage of how this works, but the basic idea is that it uses encryption schemes that preserve both addition and multiplication, thereby preserving the ring structure that the plaintexts (and ciphertexts) live in.
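As a companion to the additive sketch above, here is the multiplicative half in the same toy spirit: unpadded (“textbook”) RSA preserves multiplication of plaintexts. Again, these parameters are my own assumptions chosen only for readability and are wildly insecure; the point of FHE is that a single scheme preserves both ring operations at once, which is what lets it evaluate arbitrary computations.

```python
# Toy textbook RSA: multiplicatively homomorphic. NOT secure.
p, q = 293, 433
n = p * q
e = 65537
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent

def rsa_encrypt(m):
    return pow(m, e, n)

def rsa_decrypt(c):
    return pow(c, d, n)

a, b = 6, 7
c_prod = (rsa_encrypt(a) * rsa_encrypt(b)) % n   # multiply the ciphertexts...
assert rsa_decrypt(c_prod) == a * b              # ...and the hidden plaintexts multiply too
print(rsa_decrypt(c_prod))                       # 42
```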
