You still need to play a bit more to see your results! Keep playing
Copyright © 2019 Guard
The digital services we use are constantly gathering information about us, and the mere existence of this data makes us vulnerable in ways we can’t even anticipate yet. The worst part is that we constantly agree to their terms because we don’t really understand what privacy policies say – and neither do computers. Guard is an AI that reads privacy policies for you, but like any other AI, it needs to be taught. This site is in part an experiment to teach computers the meaning of human privacy, so they can protect us.
This experiment aims to gather a human perspective on moral decisions made by machine intelligence. This means we need to teach machines what's acceptable and what's not with regard to privacy.
To do that, we’ll show you the actual privacy practices of some of the world’s best-known services, and you will choose which of two given sentences looks more ethical and more respectful of human privacy.
This is the first research experiment of its kind in the world. Teaching artificial intelligences the meaning of ethics in privacy is something that's never been done before, so join us and become a pioneer!
These surveys are meant to be ethical dilemmas: given two sentences, it may be that neither is better than the other, or that both are neutral or equally bad. You should still select the one that looks more ethical or more privacy-friendly based on your own judgement. There is no right or wrong answer.
The results of this experiment will be used to train an Artificial Intelligence to build and ensure a better, privacy-friendly future for the internet. The data collected will be treated in the most private and ethical way humanly possible.
Let’s teach machines what’s acceptable and what’s not.
Sign up for free to contribute and become part of a privacy-centered community of thousands of people from all around the world.
This feature is not ready yet. We're building it as fast as we can – please check back in a few days.
You did it!
Before showing you your results, getting to know you helps me ensure there are no underrepresented minorities in the data I learn from – so I can be sure I’m being taught in the most open and ethical way humanly possible.
You can, of course, skip this step.
This data will be kept completely confidential, unidentifiable, and anonymized, and treated in the most private and ethical way possible. The only reason we ask for so much data is to have a wide range of precise, balanced demographics for a solid, ethical, and valid academic study. This is a privacy-first project – your data will be used for study purposes only and will never leave Guard.