Do you remember when you were a child? You probably had a favorite toy that was a constant companion, like Christopher Robin had Winnie the Pooh, and your imagination fueled endless adventures. What could be more innocent than that? Well, let me introduce you to my friend Cayla.
Cayla was voted toy of the year in countries around the world. She connects to the internet and uses speech recognition technology to answer your child's questions and respond just like a friend. But the power doesn't lie with your child's imagination. It actually lies with the company harvesting masses of personal information while your family is innocently chatting away in the safety of their home, a dangerously false sense of security. This case sounded alarm bells for me, as it is my job to protect consumers' rights in my country. And with billions of devices such as cars, energy meters and even vacuum cleaners expected to come online by 2020, we thought this was a case worth investigating further.
Because what was Cayla doing with all the interesting things she was learning? Did she have another friend she was loyal to and shared her information with? Yes, you guessed right. She did. In order to play with Cayla, you need to download an app to access all her features. Parents must consent to the terms being changed without notice. The recordings of the child, her friends and family, can be used for targeted advertising. And all this information can be shared with unnamed third parties.
Enough? Not quite. Anyone with a smartphone can connect to Cayla within a certain distance. When we confronted the company that made and programmed Cayla, they issued a series of statements claiming that one had to be an IT expert in order to breach the security. Shall we fact-check that statement and live hack Cayla together? Here she is. Cayla is equipped with a Bluetooth device which can transmit up to 60 feet, a bit less if there's a wall in between. That means I, or any stranger, can connect to the doll while standing outside the room where Cayla and her friends are. And to illustrate this, I'm going to turn Cayla on now. Let's see, one, two, three. There. She's on. And I asked a colleague to stand outside with his smartphone, and he's connected, and to make this a bit creepier...
let's see what kids could hear Cayla say in the safety of their room.
Hi. My name is Cayla. What is yours?
Uh, Finn.
Is your mom close by?
Uh, no, she's in the store.
Ah. Do you want to come out and play with me?
That's a great idea.
Ah, great.
I'm going to turn Cayla off now.
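To give a sense of how low the technical bar is, here is a minimal sketch of that kind of unauthenticated Bluetooth connection. It assumes a classic Bluetooth adapter and the PyBluez library; the device-name check and the RFCOMM channel are illustrative assumptions, not details taken from our report.

```python
# A minimal sketch, assuming PyBluez (pip install pybluez) and a classic
# Bluetooth adapter. The name filter and RFCOMM channel below are
# illustrative assumptions, not details from the report.
import bluetooth

# Discover nearby classic Bluetooth devices and read their advertised names.
nearby = bluetooth.discover_devices(duration=8, lookup_names=True)

for addr, name in nearby:
    print(f"Found {name} at {addr}")
    if "cayla" in name.lower():
        # Open an RFCOMM connection. Note what is missing here: no PIN,
        # no pairing prompt, no authentication step of any kind.
        sock = bluetooth.BluetoothSocket(bluetooth.RFCOMM)
        sock.connect((addr, 1))  # channel 1 assumed for illustration
        print("Connected without any password or pairing.")
        sock.close()
```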
We needed no password, and we did not have to circumvent any other type of security to do this. We published a report in 20 countries around the world, exposing this significant security flaw and many other problematic issues. So what happened? Cayla was banned in Germany, taken off the shelves by Amazon and Wal-Mart, and she's now peacefully resting at the German Spy Museum in Berlin.
However, Cayla was also for sale in stores around the world for more than a year after we published our report. What we uncovered is that there are few rules to protect us and the ones we have are not being properly enforced. We need to get the security and privacy of these devices right before they enter the market, because what is the point of locking a house with a key if anyone can enter it through a connected device?
You may well think, "This will not happen to me. I will just stay away from these flawed devices." But that won't keep you safe, because simply by connecting to the internet, you are put in an impossible take-it-or-leave-it position.
Let me show you. Like most of you, I have dozens of apps on my phone, and used properly, they can make our lives easier, more convenient and maybe even healthier. But have we been lulled into a false sense of security? It starts simply by ticking a box. Yes, we say, I've read the terms. But have you really read the terms? Are you sure they didn't look too long and your phone was running out of battery, and the last time you tried they were impossible to understand, and you needed to use the service now? And now, the power imbalance is established, because we have agreed to our personal information being gathered and used on a scale we could never imagine.
This is why my colleagues and I decided to take a deeper look at this. We set out to read the terms of popular apps on an average phone. And to show the world how unrealistic it is to expect consumers to actually read the terms, we printed them, more than 900 pages, and sat down in our office and read them out loud ourselves, streaming the experiment live on our websites. As you can see, it took quite a long time. It took us 31 hours, 49 minutes and 11 seconds to read the terms on an average phone. That is longer than a movie marathon of the "Harry Potter" movies and the "Godfather" movies combined.
And reading is one thing. Understanding is another story. That would have taken us much, much longer. And this is a real problem, because companies have argued for 20 to 30 years against regulating the internet better, because users have consented to the terms and conditions.
As we've shown with this experiment, achieving informed consent is close to impossible. Do you think it's fair to put the burden of responsibility on the consumer? I don't. I think we should demand less take-it-or-leave-it and more understandable terms before we agree to them.
Thank you.
Now, I would like to tell you a story about love. Some of the world's most popular apps are dating apps, an industry now worth close to three billion dollars a year. And of course, we're OK sharing our intimate details with our other half. But who else is snooping, saving and sharing our information while we are baring our souls? My team and I decided to investigate this. And in order to understand the issue from all angles and to truly do a thorough job, I realized I had to download one of the world's most popular dating apps myself.
So I went home to my wife...
who I had just married. "Is it OK if I establish a profile on a very popular dating app for purely scientific purposes?"
This is what we found. Hidden behind the main menu was a preticked box that gave the dating company access to all my personal pictures on Facebook, in my case more than 2,000 of them, and some were quite personal. And to make matters worse, when we read the terms and conditions, we discovered the following, and I'm going to need to take out my reading glasses for this one. And I'm going to read it for you, because this is complicated. All right.
"By posting content"—and content refers to your pictures, chat and other interactions in the dating service—"as a part of the service, you automatically grant to the company, its affiliates, licensees and successors an irrevocable"—which means you can't change your mind—"perpetual"—which means forever—"nonexclusive, transferrable, sublicensable, fully paid-up, worldwide right and license to use, copy, store, perform, display, reproduce, record, play, adapt, modify and distribute the content, prepare derivative works of the content, or incorporate the content into other works and grant and authorize sublicenses of the foregoing in any media now known or hereafter created."
That basically means that all your dating history and everything related to it can be used for any purpose for all time. Just imagine your children seeing your sassy dating photos in a birth control ad 20 years from now.
But seriously, though—
what might these commercial practices mean to you? For example, financial loss: based on your web browsing history, algorithms might decide whether you will get a mortgage or not. Subconscious manipulation: companies can analyze your emotions based on your photos and chats, targeting you with ads when you are at your most vulnerable. Discrimination: a fitness app can sell your data to a health insurance company, preventing you from getting coverage in the future. All of this is happening in the world today.
But of course, not all uses of data are malign. Some are just flawed or need more work, and some are truly great. And there is some good news as well. The dating companies changed their policies globally after we filed a legal complaint. But organizations such as mine that fight for consumers' rights can't be everywhere. Nor can consumers fix this on their own, because if we know that something innocent we said will come back to haunt us, we will stop speaking. If we know that we are being watched and monitored, we will change our behavior. And if we can't control who has our data and how it is being used, we have lost control of our lives.
The stories I have told you today are not random examples. They are everywhere, and they are a sign that things need to change. And how can we achieve that change? Well, companies need to realize that by prioritizing privacy and security, they can build trust and loyalty with their users. Governments must create a safer internet by ensuring enforcement and up-to-date rules. And we, the citizens? We can use our voice to remind the world that technology can only truly benefit society if it respects basic rights.
Thank you so much.