Over the holidays my pre-teen sister was very concerned that our parents were being too nosy in her digital affairs as they scrolled through her brand new phone’s early usage. She quickly deferred to big brother to come to her defense, which resulted in another rebuff. I’m of the personal belief that digital access should be both monitored and limited at every access point for kids until they are mature enough to consume it responsibly.
Then again, that age probably never arrives for a great many individuals who make anything but sound digital decisions. Granted, it’s probably not as detrimental in most cases as driving under the influence or the other irresponsible things adults do, but it is still bad.
As part of my conversation with my sister about her phone usage, we talked not only about the usual parental responsibility to ‘invade privacy’ but also about the other invasions of privacy that she doesn’t see happening even though they occur.
It doesn’t take much to demonstrate this to a child, even one a little savvy in her digital skills. Awing them with what sites they’ve visited, and what they probably were doing on them, is pretty straightforward on most devices if you know where to look, and cell providers itemize calls, texts, and data usage on the bill. However, if you stop to think about this for a brief second: if this information is out there for me to access for myself, where else is it being accessed or stored? I posed this question to her (and my family) and was frightened to realize they didn’t know how much they were indeed being “spied” on.
Let’s get one thing out of the way first: as long as there’s been a way to track consumption, someone’s been tracking and analyzing it. That being said, most consumers assumed that unless they’d provided explicit permission to attribute usage to themselves, it would be anonymous, and this idea applied in both the brick-and-mortar and digital worlds. In time, with the digital world in particular, end-users traded identity for attribution as logins became more common. Even then, there was still an implied expectation that individual usage was masked and the results were amalgamated for analysis.
In 2011 internet usage tracking took some interesting and ugly privacy turns. Turns that will define our experiences from here on out, both as consumers and managers. Ads are now everywhere, plastered on every inch of digital and physical real estate we come in contact with, and there’s more of this space than ever. Where there are ads there’s some kind of targeting, and where there’s targeting, interactions are being tracked. This is the inescapable reality we are faced with.
Early on, tracking was done in the name of delivering better user experiences driven by product teams. It was believed that if you see content customized to your expected needs or wants, you are more likely to have a positive experience and continue to interact with it. This is usually true. Algorithms were then developed to predict which content to display to whom. We’ll leave the accuracy of algorithmic prediction for another post and just go on the notion that it’s relevant to this example. The foundation for creating the content and the algorithms lies in the accuracy of the metrics compiled through tracking and the trends derived from that information.
This user experience idea was applied to advertising, where the more relevant the ad, the more likely a positive response to its call to action. Again, we’ll leave the assumption of this theory’s effectiveness aside and go on its relevance to the example. Ad creation and delivery were heavily influenced by tracking information and the inferences it provided about those being tracked.
Pressure to perfect the algorithms and induce the most qualified interactions meant more accurate tracking, which meant more variables to process, which meant collecting more personalized information by any means necessary. As the ability to collect this information changed, the change wasn’t always divulged to those being tracked.
Enter the new era of digital fingerprinting your every move. Much of it was done by content and interaction providers who wanted to keep the end-user experience free by delivering effective ads for the advertisers. Simply clearing the cookies in one’s browser and dumping the temporary files wasn’t enough to eliminate one’s digital footprint post browsing. Browsing logged out or blocking access by browsing “incognito” does little to deter the problem. Tracking pixels using Flash cookies and other files were being set on computers that followed and reported back usage long after the standard digital footprint was assumed erased, and the vast majority of offenders were doing it without any warning to their consumers before it happened. It becomes even more complex when you include OAuth and common logins facilitated through things like Facebook Connect, as more parties are simultaneously collecting and sharing information.
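To make the mechanism concrete, here is a minimal, hypothetical sketch of how a tracking pixel ties page views to a persistent identifier. The names (`pixel_url`, `log_hit`, the `tracker.example.com` endpoint) are invented for illustration; real ad networks are far more sophisticated and use many more signals than this.

```python
from urllib.parse import urlencode, urlparse, parse_qs

# Hypothetical tracker endpoint serving a 1x1 transparent image.
PIXEL_HOST = "https://tracker.example.com/p.gif"

def pixel_url(visitor_id: str, page: str) -> str:
    """Build the image URL a page would embed; every load of the
    'image' tells the tracker which visitor viewed which page."""
    return PIXEL_HOST + "?" + urlencode({"vid": visitor_id, "page": page})

def log_hit(request_url: str, log: dict) -> None:
    """Server side: record the page view against the visitor id."""
    qs = parse_qs(urlparse(request_url).query)
    vid, page = qs["vid"][0], qs["page"][0]
    log.setdefault(vid, []).append(page)

# The same visitor id reappears across unrelated sites, so the tracker
# can assemble a browsing profile even if the user never logged in.
log = {}
log_hit(pixel_url("abc123", "news-site/article"), log)
log_hit(pixel_url("abc123", "shopping-site/cart"), log)
```

The key point is that the identifier lives server-side in the tracker’s logs, which is why clearing cookies or browsing incognito on your own machine does nothing to erase it.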
It hasn’t ended with just the web sites and how they manipulate your browser-based experience, either. A number of mobile applications infiltrated the user experience throughout the device, in some cases without even disclosing the full permissions being granted in the installation (though most of those were addressed immediately by the distributing market).
Outrage has hardly had an impact, and by now the practice is more than commonplace across the web. Where at one point we feared how our search history might be attributed to our IP address and someone might inadvertently find out what we were researching, now our every move online is being stored in intimate detail somewhere.
The illusion of privacy becomes even more complicated when the access points are also collecting this information. No one probably ever expected they weren’t; after all, almost all access points are paid in some way, and billing requires records. However, the consumer’s understanding was that their interaction with the access point was limited to the automated.
Most common internet access points, such as broadband providers, educational institutions, etc., were known to collect and store the data because of their reticence over the years to turn it over to authorities, particularly in consumer-level individual privacy cases. The educational institutions’ situation was brought even further to light due to the additional tracking they install on devices connected to their networks, which captures all kinds of usage, including keystroke tracking and remote recording of actions (through webcams, microphones, etc.).
None of that seemed to prepare some consumers for what mobile carriers were doing in the data collection realm. Carrier IQ took what educational institutions were doing to a select segment of a very specific usage population and blanket-tossed it over the entire GPS- and web-enabled device-consuming public. To say it was extreme tracking of usage is an understatement, but the revelation was under-consumed by the mass populace, and among those who did perceive it, the abandoning of device usage was negligible at best. Perhaps part of the reason it hasn’t had much of an impact is that it isn’t visibly affecting anyone either. When content or ad targeting occurs, it is painfully obvious it is happening beyond our initiation, and we’re probably cognizant of how our interaction is being tracked by how the rest of the interactive experience responds to us. But with this, we’re not seeing any of those things happen in the same sequence, and we’re not aware of how the information is being used.
Of course, this is all fine in the existing structure as we know it. Suppose the Patriot Act were tweaked to allow Government Agencies greater unfettered access to the information? Suppose SOPA were to become law? Suppose these records could be sold or were to become public knowledge? Suppose the information were retroactively applied and knowledge of one’s actions could be used to determine one’s insurability or employability, or one’s access to education or credit and financing? It might sound like something straight out of a Huxley or Orwell novel from the last century, but it is a very real possibility in a very uncertain technological future where the best corporate intentions in delivering a good user experience could be used very nefariously under a similar ‘what’s best for us’ guise.
Privacy as we know it is long gone, but our understanding of what this new lack of privacy means is only just beginning. It’s a trade-off where convenience requires providing personal information, and that knowledge is the power to provide uniquely tailored, time-and-money-saving experiences. Given that we as a society have become narcissistic enough to self-publish so many of our personal actions for the world to see, from the bedroom to the bathroom, who knows what we’ll be capable of allowing strangers and whomever else to know.
This brings me back to my original point: the average person isn’t really responsible enough to make knowledgeable decisions in the first place, which is how you end up with Anthony Weiner and a majority of high-school kids displaying their genitals across the digital space, all desperately hoping the only person seeing it was the intended receiver…