Originally posted at The Daily Dot


Steffan Heuer and Pernille Tranberg are authors of the book Fake It: A Guide to Digital Self-Defense. They cover technology and privacy issues in San Francisco and Copenhagen. In this series, Digital Self-Defense, Heuer and Tranberg report with updates from the digital identity wars and teach us how to defend our privacy in the great data grab going on all around us. Follow them at @FakeIt_Book.

Show me your search patterns, and I’ll tell you who you are. Your vacation dreams. Your health worries. Your job situation, your hobbies, if you are gay, hetero, single, or married. Your political orientation, shopping behavior, financial capabilities, food and alcohol cravings, drug habits, and all kinds of other interests and preferences.

Your searches are probably the No. 1 way to collect data about your private life. Forget about the encryption debate that’s been raging since Edward Snowden started leaking. Privacy starts with search.

Are you among the many who googled “Amphetamine mixed salts”? It’s one of the top three medications searched for, according to Google Trends charts. “Amphetamine mixed salts” is a drug used to treat attention deficit hyperactivity disorder and narcolepsy. Searching for these keywords and clicking on particular links is an indication that you or a family member may suffer from one of these conditions. Searches such as these contain highly sensitive information a data broker, pharma company, or insurer would love to get their hands on.

The more of these tell-tale bits you leave in the open, the easier it becomes to target ads and offers to a condition—or even craft a costlier health insurance policy, tailored just for you. And this isn’t doomsday worrying; it’s happening now. One health insurance manager recently told us his company is regularly approached by data brokers who offer to sell transaction data from outlets such as fast food chains in order to tie them to individual clients. Thankfully, his company currently declines such overtures. But do they all? And how long before a desire for higher profits overtakes morality?

Search engine providers such as Google tie every search to every user to allow for user profiling and more targeted ads. And they will hand those search histories, plus IP or location data, over to government agencies when, not if, they come calling. Google’s not the only one. According to Robert Beens, CEO of the anonymous search engine Startpage.com, most search engines record your search data like an OCD librarian writing down every step you take in a library.

Beens tells us:

“They capture your IP address and use tracking cookies to make a record of your search terms, the time of your visit, and the links you choose—then they store that information in a giant database. Those searches reveal a shocking amount of personal information about you.”

He continues,

“Searching is mostly done in the privacy of your home and you ‘confide’ in your favorite search engine as a friend—disclosing your innermost thoughts and interests. Aggregated together over several years, your searches reveal who you are. It completes the whole huge puzzle that you are, piece by piece.”

Add to that the fact that most people use Google & Co. without SSL encryption, and everybody else can listen in: your service provider, your mobile operator, your smartphone manufacturer, etc.

But it doesn’t have to be that way. There are alternatives—ways of preventing search engines and trackers from mining all of your personal data (but, fair warning: A new method called “device fingerprinting” makes it harder to hide your digital tracks).

These are, at the moment, mostly niche players that have received huge boosts in user traffic since Snowden blew the whistle. You can do truly anonymous searches on Startpage and its older sister site Ixquick, as well as with their U.S. competitor DuckDuckGo. They’re all breaking traffic records since the NSA revelations, with search numbers nearly doubling.

Some will object that nothing comes close to Google’s massive index, but search engines such as Startpage pool your search terms, strip them of identifiable information, and then anonymously submit them to Google. You get the full stack of answers without being tracked. What’s more, you don’t get supposedly personalized results based on what they know about you. Escape the “filter bubble” and try clean, non-filtered results for a change. You’ll be surprised at what comes up when it isn’t edited. You can also view pages via a proxy, reading results in anonymity since the search engine pulls up the link and presents it to you privately.
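The mechanics of such a pass-through search are simple to sketch. This illustrative Python snippet is our own toy model, not Startpage’s actual code (the function and field names are assumptions): the proxy discards everything that could identify the user and forwards only the bare search terms.

```python
def anonymize_request(request):
    """Strip identifying fields from a raw search request.

    Only the search terms survive; the IP address, cookies, and
    browser details are discarded before the query is forwarded.
    """
    return {"query": request["query"]}

def forward_search(request, upstream):
    """Send the stripped-down query to the upstream engine."""
    clean = anonymize_request(request)
    return upstream(clean["query"])

# Example: the upstream engine never sees who asked.
raw = {
    "query": "amphetamine mixed salts",
    "ip": "203.0.113.7",            # dropped by the proxy
    "cookies": {"uid": "abc123"},   # dropped by the proxy
    "user_agent": "Mozilla/5.0",    # dropped by the proxy
}
results = forward_search(raw, upstream=lambda q: f"results for '{q}'")
```

The design point is that the upstream index only ever receives the query string itself, so there is nothing to tie a search history to a person.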

From Cookies to Fingerprinting

People are finally getting sick and tired of having their curiosities and intentions tracked and logged. Here’s the explanation that Gabriel Weinberg, founder and CEO of DuckDuckGo, gave us:

“Aside from government access, there are many other reasons. Perhaps the most obvious is all those ads that follow you around the Internet now. In-between are a bunch of things that most people don’t know about yet, like being charged different prices based on your data profile or having your online info start to show up off-line, for example insurers are now testing such data for use in risk profiles.”

(DuckDuckGo and Startpage, by the way, recently released nifty iOS apps for mobile searches that let you skirt the data-sniffers from Google & Co.—with Android versions in the works.)

But digital self-defense is an arms race. As soon as somebody develops a block or tool, someone else comes up with new ways to get around it and keep tabs on every user online. Today, most advertisers and other parties on the Web still rely on a wide array of cookies. But more and more users block cookies or routinely delete them, as a new study by the Pew Internet & American Life Project just revealed. Mobile phones do not use cookies, but give off other valuable signals, such as device IDs.

Because cookies are becoming less effective, tracking companies are now turning to “fingerprinting,” a technique allowing a website to look at the unique characteristics of the computer or device you’re using. They analyze installed plugins, software, screen resolution, time zone, fonts, and other features. Combine that info with other data points, and companies can infer which smartphone, tablet, and laptop belong to the same person.
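To see why these seemingly innocuous traits identify a machine, consider this minimal sketch (the attribute names and values are illustrative assumptions, not any real tracker’s code): each trait on its own is common, but hashed together they form a near-unique identifier that survives even when cookies are cleared.

```python
import hashlib
import json

def device_fingerprint(traits: dict) -> str:
    """Hash a set of device traits into a stable identifier.

    No single trait is unique, but the combination often is--
    and unlike a cookie, the user cannot simply delete it.
    """
    # Canonical serialization so the same traits always hash the same way.
    canonical = json.dumps(traits, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

laptop = {
    "screen": "1440x900",
    "timezone": "UTC-8",
    "fonts": ["Helvetica", "Georgia", "Menlo"],
    "plugins": ["Flash 11.9", "QuickTime"],
}
fp1 = device_fingerprint(laptop)   # first visit
fp2 = device_fingerprint(laptop)   # later visit: same device, same ID
fp3 = device_fingerprint({**laptop, "timezone": "UTC+1"})  # one trait changed
```

The same logic explains why fingerprints are brittle in one direction only: changing a single trait (say, traveling to a new time zone) breaks the match, but a user who changes nothing is recognizable indefinitely.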

Fingerprinting—like the kind announced last week by Apple—may prove to be a more robust tracking technology than cookies, and it’s very hard to avoid from a user’s point of view. “Today, to prevent fingerprinting in a meaningful way, there is a big impact on the user experience,” says Andrew Sudbury, CTO and founder of Abine, a provider of privacy tools based in Boston. This has created a catch-22 for the average user. Take JavaScript and Flash: they have become standards for delivering rich online experiences, but they are also the backdoors through which tracking code is planted on your machine for fingerprinting.

Abine, which offers tools like DeleteMe and DoNotTrackMe, is one company among several that have been working on a solution to the fingerprinting problem—but even they admit we’re not there yet. The only simple step you can take to defend yourself is to use proxy services that change and therefore hide your real IP address—the No. 1 identifier in the absence of cookies. However, that alone is not enough, as trackers have become much better at identifying users without their IP address. So the arms race will continue.

In the meantime, here are a few tips on how to keep your searches for problems, pet peeves, pills, and porn from becoming a matter of public record:

  • Use an anonymous search engine that doesn’t track you, like startpage.com (or eu.startpage.com, which only uses EU servers), duckduckgo.com, or ixquick.com. Startpage, by the way, is a Dutch company and thus, unlike DuckDuckGo, not under U.S. jurisdiction—which means it cannot be forced to turn over the meager records it might have.
  • If you insist on using Google or another one of the big guys, at least make sure you’re logged out from your Google or Gmail account in the same browser. Better yet, conduct searches only in a separate browser and use a VPN service such as PandaPow or Overplay to hide your IP address, plus use a blocking extension for your browser such as Disconnect or Ghostery to prevent hundreds of third parties from following you around the web from searches to results.
  • Consider using Tor (and Torbutton), which has elements designed specifically to prevent fingerprinting.

You don’t have to stand for search engine tracking. Use these tools and practice good digital self-defense to protect your anonymity online.


Published first in The Daily Dot

Are you an avid user of Snapchat, that kind of person who likes bits that go poof? If you or your kids are under 25, chances are you do. The wildly popular photo-sharing app—which promises that pictures self-destruct after 10 seconds—is a clear sign that many people long for privacy. Snapchat claims to have finally delivered to us a medium that does not forever save and store our data. We want that! Of course we do.

Unfortunately, Snapchat doesn’t quite transport us to the land of data privacy. It sounds too good to be true, and it is. In much the same way that Facebook deludes people into the belief that they can actually be private, it gives us a false sense of security. Twitter makes no bones about privacy; it is, and claims to be, an open public network where you are public by default. With the likes of Snapchat and Facebook—and most of social media, in fact—many mistakenly believe in the illusion of being private.

The specific problem with Snapchat, of course, is that while a photo message disappears from the recipient’s phone after a few seconds, nothing prevents a nimble-fingered receiver from taking a screenshot. If that happens you get an alert, but what good does that really do? It certainly doesn’t prevent the screenshot from being shared with others, as happened at this New Jersey high school. There’s another hack to work around that alert. And last but far from least, a U.S.-based company, Decipher Forensics, told the Guardian that it figured out how to recover photos from the Android version of Snapchat in a matter of days. The company is now trying to recover photos from the iOS version as well. The photo app has been downloaded more than five million times in the Android marketplace and has been at the top of the Apple App Store for quite a while.

There are just as many problems, if not more, with Facebook’s privacy settings. The biggest is that the company constantly changes them with the perverse effect of eroding its users’ privacy. It forces millions of people to waste precious time to understand the changes and how to plug gaping new privacy holes. (Check out this brilliant visualization of how Facebook keeps moving the privacy goalposts.)

The bottom line is that your friends tend to reveal your supposedly private data, both intentionally and accidentally—for example, by sharing your “private” pictures via their own, wide-open accounts. Of course, it’s worth noting that privacy in itself contradicts Facebook’s raison d’être of virality.

So why do services even attempt to create this notion of privacy for users? Because there is a growing demand for services that don’t violate your privacy. If data, or its analytic essence, is the up-and-coming asset for players such as Facebook, dangling a false or misleading sense of privacy in front of users gives us all an incentive to share even more.

It’s a con game. But don’t take our word for it—ask Alessandro Acquisti, a behavioral economist at Carnegie Mellon University in Pittsburgh. He created quite a stir at this year’s SXSW conference, where he presented his latest research, entitled “Misplaced Confidences: Privacy and the Control Paradox.” It documents the fact that making people feel more in control of their data can lead them to share more sensitive information with strangers.

Acquisti asked two groups the same questions, some mundane, some very personal: Have you ever been fired? Have you ever used drugs? Have you lied about your age, had cosmetic surgery, or had sex in a public venue? He found that the group that thought their answers were treated anonymously or privately were willing to tell much more about themselves than the group that knew their answers could become public.

Companies that make money by first luring you in and then selling out your behavior know how to work it. Acquisti and CMU colleague Fred Stutzman showed that Facebook users have been engaged in a losing battle with the social network to keep their private data under wraps. In their study “Silent Listeners,” the researchers analyzed more than 5,000 Facebook users over the course of six years, from 2005 through 2011. They discovered that people increasingly tried to restrict what they shared, but Facebook generated so many new settings and loopholes, particularly in the past couple of years, that its users ended up revealing even more than before. In that sense, the expression “privacy settings” sounds like Orwellian Newspeak.

And speaking of silent listeners—how many are there, really? One Stanford study tried its hand at “Quantifying the Invisible Audience in Social Networks.” People generally have no idea how many parties are siphoning off their supposedly private feed. On average, the audience of a Facebook user is four times the size people assume. In other words: post something to 100 so-called “friends,” and it’s skimmed off by 300 strangers, including app developers and advertisers.

In a recent portrait in the New York Times, Acquisti summarized his findings this way:  “What worries me is that transparency and control are empty words that are used to push responsibility to the user for problems that are being created by others.”

Don’t fall for the privacy honeypots set by current social media services. Chances are they will game you with promises of control and ephemeral content. Once your digital tidbits are out there, you can almost never take them back.

Take a look instead at the new and emerging class of services that let you encrypt and wipe multimedia communications between phones, and even posts on social networks: Silent Circle by PGP encryption guru Phil Zimmermann; or new plug-ins like Privly and scrambls. Apps like these make sure that ultimately, you’re the one in control of your data, not someone else. Because do you really want to trust what they tell you is private? Or do you want to determine your own privacy on your own terms?


The German weekly der Freitag has named the German edition of “Fake It!” its book of the week, wrapped in a very comprehensive overview of the topics of data protection, privacy, and digital self-defense.

You can check out Freitag’s coverage, including videos of Pernille and Steffan, on their site.

Data brokering is a very lucrative business.

Here, on the Finnish television channel TV1 (8:05 minutes in), I explain that one post on social media or one search on a search engine is usually not the problem for your digital identity. The real problem is the puzzle about you assembled by the growing, lucrative industry of data brokers.

Data brokers know a lot about you, such as your vacation dreams, your health worries, and your shopping habits. You can read much more about data brokers in the brilliant roundup from the Electronic Frontier Foundation, which explains that

“Data brokers are companies that trade in information on people.”

Data brokers are poorly regulated, as the New York Times explains here. The authorities obviously know very little about what the brokers collect about us, and in July last year some members of the House of Representatives sent letters to nine major data brokers, Acxiom, Epsilon, Equifax, Experian, Harte-Hanks, Intelius, FICO, Merkle, and Meredith Corp. (why not Datalogix?), asking what they collect. The letters and their vague responses can be read here.

Acxiom, for example, disclosed that it collects date of birth/age, race, ethnicity, religious affiliation, language preference, length of residence, home value, home characteristics, marital status, presence of children and number of residents in the household, education, occupation, and political party. And according to CNN, Acxiom defended its collection, saying it is legal (in the U.S.) and that it serves customers with better advertising.

But their answers were far from enough for the authorities. The FTC started an investigation last December.

Our advice:

  • Use some of our suggested tools, like pseudonyms, on social media where you are not strictly professional.
  • Use blockers like DNTMe from abine.com or disconnect.me.
  • Use VPN services to hide your IP address.
  • Think before you post anything on social media, as everything you post is public.




TEDxOxford University invited me to Oxford to speak in front of 600 people about our book “Fake It!”

Here I introduced the phenomenon of the “datasexual” – a word that inspired us in the Sept 12 issue of Wired.

Most of the audience really got the meaning of the word, as they tweeted about it a lot during and after my talk. “Are you a Datasexual?” they asked, as if it were something they really did not want to be themselves.

The expression is now a part of the online dictionary Wordspy, which also refers to our column about it in The Daily Dot.











Our column in The Daily Dot:

Search results can help predict flu outbreaks. Location data can prove that malaria spreads through human travel rather than mosquitos. Crime stats can help pinpoint which prison parolees are likely to commit murder. Piracy and terrorism can be prevented. Credit card and insurance fraud get caught in real time. Taxis can save fuel and not drive to the airport if there is a long line of other taxis already waiting for passengers. And companies can optimize their services and give us relevant recommendations.

We spend a lot of time in this column warning about encroachments on our privacy, but it’s true: big data—all the unstructured data online and on company servers—can do a lot of good and help us find and use convenient services, if the data sets are structured and treated properly. That’s the case in all the above examples.

We are at the brink of a new era in which data, generated by humans and connected devices large and small, becomes our new currency. And as with any valuable asset that people and organizations are eager to trade in and profit from, precaution is vital. Otherwise these resources will be plundered and monopolized by the robber barons of our time. Data will be smuggled and laundered to conceal its origin, fakes will circulate and hurt the entire economy. Ultimately, trust will fall victim to the widespread data rush.

And a data rush it is. The current environment is reminiscent of an earlier rush some 160 years ago, when unsuspecting, credulous people—some indigent, some greedy explorers—rushed to the supposed gold mines. This new kind of gold rush, for data, is going on all around us, intensifying every day. What does that mean for the everyday user of web services and smartphone apps? The one giving the data away?

Big data can be used for good, or it can be abused and exploited. The players involved who are using our data need to be open and transparent about what they use the data for, and what they don’t use it for. Otherwise, we risk the modern-day equivalents of specious finds and scams, deadly landslides, and toxic sludge. As advocates for privacy and digital self-defense, we absolutely want to be a part of this emerging digital society, so we do entrust our information to various service providers we consider responsible: telcos, social media sites, banks and credit card companies, even some apps for our smartphones.

The emphasis, of course, is on the word “consider,” since every week we discover how companies and government entities are abusing and betraying our trust in them. Most providers can do much more when it comes to informing us what they do with our data: with whom they share it, from whom they buy additional data, how long they store it, and if they ever delete it—let alone revealing what treasures they have built on our trust.

A new report from the World Economic Forum, entitled “Unlocking the Value of Personal Data: From Collection to Usage,” looks at the issue of data governance. It compellingly concludes that we need a new approach: stop focusing on protecting individuals from all possible risks, and instead identify risks and facilitate responsible uses of personal data. Because, the report states, the failure to use data can also lead to bad outcomes.

The WEF report underlines that we still need security for our personal data against those organizations mining and using it. But we need to rethink key principles such as “notice and consent” and “single use.” This is because individuals play a role as both producers and consumers of data; because new and beneficial uses of data are often discovered long after it has been collected; and because the sheer volume of data being collected is staggering.

The report goes one step further. It says that the current approach to providing transparency through lengthy and complex legalistic privacy policies overwhelms individuals, rather than informs them. Yes, absolutely: scientists have estimated that it would take the average person 25 full work days to read just the legalese for the typical websites they visit in a year.
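A back-of-envelope calculation shows how an estimate of that order comes about. The figures below are illustrative assumptions of our own, not the researchers’ actual inputs:

```python
# All three inputs are assumed, round numbers for illustration.
policies_per_year = 1200   # sites with privacy policies visited in a year
words_per_policy = 2500    # typical length of one policy
reading_speed = 250        # words read per minute
workday_hours = 8

total_minutes = policies_per_year * words_per_policy / reading_speed
work_days = total_minutes / 60 / workday_hours
print(work_days)  # 25.0 full work days just reading the legalese
```

Tweak any input and the total shifts, but under any plausible assumptions the conclusion stands: nobody actually reads what they consent to.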

But the report points to a hopeful new trend. There are indications that a “data literacy movement” is beginning to emerge in North America and Europe—a movement which could help cultivate real understanding.

For example, certain companies are aiming to develop simple explanations of their approach to data use in plain language, so that an individual can quickly understand the main elements of how data is being used. The company Intuit has established and shared its “Data Stewardship Principles,” which are at the heart of how it deals with personal data. These principles set out in clear, simple language what Intuit stands for, what it will do, and what it will not do: Intuit will not sell, publish, or share data that identifies any person, but it will use data to help customers improve their financial lives and to operate its business.

BT.com has also moved in this direction, implementing the recent update to the EU e-privacy directive, the so-called “cookie law.” The cookie law requires companies to obtain the consent of their website users before placing cookies on their computers. Whereas most companies put in place a pop-up box asking users to click to consent, BT implemented a clear, easy-to-understand practice for visitors to its website. A pop-up screen allows users to quickly see which cookies are truly necessary for the site to operate properly.


Privacyscore is a tool that helps enable this new data literacy movement. It analyzes the privacy policies of companies along four clear criteria and gives each website a color-coded rating and score, from 0 to 100, with anything below 80 falling in the dangerous red zone. An updated browser extension, Privacyfix, scans for privacy issues based on your Facebook and Google settings and takes you instantly to the settings that you need to fix.
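The rating logic described can be sketched in a few lines. Only the red line at 80 comes from the description above; the other threshold and labels are our own guesses, not Privacyscore’s published cutoffs:

```python
def rating_color(score: int) -> str:
    """Map a 0-100 privacy score to a color band.

    Below 80 is the dangerous red zone (per the article);
    the green cutoff is an assumed value for illustration.
    """
    if not 0 <= score <= 100:
        raise ValueError("score must be between 0 and 100")
    if score < 80:
        return "red"
    if score < 90:      # assumed boundary
        return "yellow"
    return "green"
```

The point of such a scheme is compression: a single glanceable color replaces pages of legalese, which is exactly what the data literacy movement is after.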

Mozilla has proposed another tool to help us “read” privacy: a symbols-based guide to the presentation of legal terms in icons that signal, for example: how long data is retained; whether data is used by third parties; if and how data is shared with advertisers; and whether law enforcement can access the data.

This movement toward data literacy is very positive, because in the end it’s our job as consumers and citizens to hold companies and government entities accountable, and to demand they use our information in an open, transparent, and responsible fashion. And if you can’t consider your web service or app responsible—if you harbor any doubts about whether it uses your data responsibly—there’s still the “delete my account” button.

Fake It got a lot of coverage when it was launched in Finland on March 25, 2013. One of the interesting discussions of the book took place on a talk show called Bettinas. The host had invited a famous anchor, Baba Lybeck, who is pretty private about her life online (apart from the fact that she is preparing for a triathlon), and the famous politician Mikael Jungner, who is pretty open about his private life online. And, yes, then me as an expert and author of the book.

Not only did the staff from the show film, behind the scenes at Bettinas, the first time I saw the book, as we discussed what a datasexual (an oversharer) is. They also showed some of Mikael Jungner’s pretty private videos with his now-former girlfriend, from when they were madly in love (the videos are now gone from YouTube).

But the most interesting thing was the discussion about being public. They asked me if I would ever share the same kind of video as Jungner did. I said no, but explained that our book is not targeted at people like Jungner. I am pretty sure that he knows what he is doing; he has been a public person for many, many years, and he uses social media to build his brand. Our book is mostly written for people who are not used to being public. Because that is what most of us need to learn: almost everything we do online, and definitely on most social media sites, is not private. It is public. It can never be deleted; you can always go back and find it. This was part of the discussion, and I hope the main message came through: always regard social media as a public forum (Facebook’s so-called privacy settings are misleading), and before you post, think: Would I say this on national television? If yes, go and post, and you probably won’t be toast.

Here is a teaser for Bettina’s show, and here is the actual show.


Use your real name when you are professional, e.g., on Twitter and LinkedIn. And fake it on services that ask for your very private data (e.g., sexual orientation), like Facebook, Pinterest, and Instagram.

By Pernille Tranberg

Ruth Carter from Carter Law Firm attended our session at South by Southwest last Sunday. She wrote a blog post about it in which she writes that she likes the idea of maintaining privacy with an alter ego, and she raised one interesting question:

“Now, does using a fake name violate the terms of service of social media sites that require you to use your real name or have a policy against one person having multiple accounts?”

Yes, she says, and adds: But if no one reports you, how will they ever know?

Of course she is right, because when you sign up with Facebook you also agree to its Terms of Service (TOS). But what do you risk? That they shut down your account. That, I believe, is by far the worst risk. I don’t believe they will sue you, but you never know.

So why don’t you just NOT sign up for Facebook, some would argue. Facebook is a private company and has a right to have its own rules.

My answer: Facebook has been such a huge success that so many people have adopted the service. Many municipalities and public services are using it, and I would claim that it is pretty much impossible to take an active part in society today without a Facebook profile, at least in Denmark, where people obviously love the service.

Privacy is a human right. In the analogue world, we can claim the right to be anonymous, so I believe that FB is actually violating our right to privacy by maintaining a real-name policy.



Pernille Tranberg at SXSW 2013 Photo: Rune Michelsen

Speaking at SXSW 2013 about our book Fake It was an amazing experience. At a conference where curiosity rules and scepticism is for scientists, NGOs, journalists, and the like, it is pretty challenging to deliver a critical message. I embrace social media and new gadgets, but I always want to flip the coin and check what’s behind it. And among all these new fun gadgets coming out here at SXSW, there is so much that tracks our behavior and mines our personal data, often without people knowing about it, let alone discussing it.

Trying to take control of your own data was the theme in Ballroom G on Sunday. I spoke about my different identities, both my pseudonyms and my real identity, about some of the best tools to protect your privacy, and of course about the risks you take if you are an oversharer – a datasexual. Then I read some paragraphs from the book and had a great Q&A in a 20-minute slot, which ended up being much too short, as the questioning and commenting was intense. What an engaging audience! Thanks for that.

Our book in English is completely updated and expanded, available in paperback and as an eBook here.


Fake It in the bookstore, though it is mainly an ebook



Ballroom G, Sunday, 12.30.





Onlinecollegecourses.com (thanks) did this great visualization of the growing (but still low) number of people getting sick and tired of the world’s largest social network. There are some interesting numbers here, e.g., that those who don’t use Facebook spend 88% more time studying than those who do.

Why are they quitting? Boredom. Wasted time. Parents being there. Not privacy worries.

Facebook Fatigue Infographic