Wednesday, October 21, 2009

Private Browsing

Ben Casbon

Socio-Technical System: The Web

Problem Space: User Tracking

Direct stakeholders: Internet users

Indirect stakeholders: Insurance and financial institutions

Socio-Technical System: The Web


The World Wide Web is an immense system comprising billions of publicly available sites that not only serve up information to users, but also allow remote users to interact with the system, or even with other users of the same system. In general, most users will see the same information on the same impartial system.

Problem Space: User Tracking


Browsing the internet is made smoother by the use of ‘cookies’. Cookies are small pieces of data that a web site asks the browser to store on the user’s computer. Since the web’s underlying protocol, HTTP, is stateless, some mechanism is needed to ‘personalize’ a user’s interaction and make the site they are visiting ‘theirs’ for the duration of the visit.
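
As a minimal sketch of that round-trip (using Python’s standard http.cookies module; the site and cookie names here are purely illustrative), a site’s Set-Cookie header and the browser’s echoing of the same value on later requests are what give a stateless protocol the appearance of a continuous, personalized session.

# Minimal sketch of the cookie round-trip; names are illustrative only.
from http.cookies import SimpleCookie

# 1. The server's first response carries a Set-Cookie header.
server_cookie = SimpleCookie()
server_cookie["session_id"] = "abc123"      # an opaque identifier, not code
server_cookie["session_id"]["path"] = "/"
print(server_cookie.output())               # Set-Cookie: session_id=abc123; Path=/

# 2. The browser stores that value and sends it back on every later
#    request to the same site as a "Cookie: session_id=abc123" header,
#    letting the server 'recognize' the returning user.
browser_copy = SimpleCookie()
browser_copy.load("session_id=abc123")
print(browser_copy["session_id"].value)     # abc123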

While there is nothing inherently wrong with cookies, they are a technology that can be, and frequently has been, abused. Re-entering personal information such as usernames and passwords can be an annoyance to users, so cookies are also used to let a user stay ‘logged in’ to sites they have visited. This can prove hazardous if the computer may at some time be operated by another user, such as a library or campus machine, or if the user’s data lacks adequate physical protection.
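
The difference is easy to see in miniature. In the sketch below (again Python’s http.cookies; the cookie names and lifetime are hypothetical), a plain session cookie carries no expiration and disappears when the browsing session ends, while a persistent ‘keep me logged in’ cookie carries a Max-Age and lingers for anyone who later uses the same machine.

from http.cookies import SimpleCookie

cookies = SimpleCookie()

# A session cookie: no Expires/Max-Age attribute, so the browser
# discards it when the browsing session ends.
cookies["session_id"] = "abc123"

# A persistent 'keep me logged in' cookie: Max-Age asks the browser to
# keep it for 30 days, so a later user of the same machine may inherit
# the logged-in state.
cookies["remember_me"] = "token-xyz"
cookies["remember_me"]["max-age"] = 30 * 24 * 60 * 60   # 30 days, in seconds

print(cookies.output())
# Set-Cookie: remember_me=token-xyz; Max-Age=2592000
# Set-Cookie: session_id=abc123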

Example: Facebook Beacon


On November 6, 2007, Facebook launched a ‘service’ called ‘Beacon’ (http://www.facebook.com/press/releases.php?p=9166). Beacon was billed as a “… core element of the Facebook Ads system for connecting business with users and targeting advertising to the audiences they want.” In an ambiguous press release, Facebook hailed the benefits of ‘sharing’ information between itself and 44 other online companies.

According to Facebook, “Facebook Beacon is a way for you to bring actions you take online into Facebook. Beacon works by allowing affiliate websites to send stories about actions you take to Facebook.” (http://www.facebook.com/beacon/faq.php) A Beacon-participating web site would detect a user’s Facebook identity, whether or not they were logged in to Facebook, by reading the latent Facebook cookie on the user’s machine. One prominent example of the Beacon system was the Blockbuster online integration with Facebook, which would update a user’s Facebook wall with the movies they added to their ‘queue’ while on the Blockbuster site.
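
Mechanically, Beacon relied on ordinary third-party cookie behavior: when a page on a participating site embeds a resource served from facebook.com, the browser attaches whatever facebook.com cookies it already holds to that request, regardless of which site the user is actually visiting. The toy model below illustrates that behavior with a cookie jar keyed by domain; the domains and cookie values are illustrative only, not Facebook’s actual implementation.

# Toy model of third-party cookie behavior, the mechanism Beacon relied on.
# Domains and cookie values are illustrative only.

# The browser's cookie jar, keyed by the domain that set each cookie.
cookie_jar = {
    "facebook.com": {"user_id": "fb-12345"},   # left over from an earlier Facebook login
}

def request(page_domain: str, resource_domain: str) -> dict:
    """Simulate the browser fetching a resource embedded in a page.

    The browser sends the cookies it holds for the resource's domain,
    no matter which page the user is actually viewing.
    """
    sent = cookie_jar.get(resource_domain, {})
    print(f"Page on {page_domain} loads a resource from {resource_domain}; "
          f"cookies sent: {sent}")
    return sent

# The user browses a participating site; the embedded Facebook resource
# still receives the latent Facebook cookie, identifying the user.
request("blockbuster.com", "facebook.com")   # cookies sent: {'user_id': 'fb-12345'}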

Almost immediately after the launch of the Beacon service, MoveOn.org (http://www.moveon.org) created a Facebook group and online petition demanding that Facebook cease violating users’ privacy without receiving their informed consent. In December of 2007, Facebook issued a ‘mea culpa’ to the privacy activists about the Beacon service and stated on the Facebook blog (http://blog.facebook.com/blog.php?post=7584397130):
“At first we tried to make it (Beacon) very lightweight so people wouldn't have to touch it for it to work. The problem with our initial approach of making it an opt-out system instead of opt-in was that if someone forgot to decline to share something, Beacon still went ahead and shared it with their friends.”

Facebook changed Beacon from an opt-out system to an opt-in system in December 2007, but the change came too late. In April 2008, Dallas County resident Cathryn Elaine Harris filed a class-action lawsuit against Blockbuster Inc. over the company’s participation in the Facebook Beacon system (Computerworld: http://www.computerworld.com/s/article/9078938/Blockbuster_sued_over_Facebook_Beacon_information_sharing?taxonomyId=146&taxonomyName=standards_and_legal_issues).

Value: Privacy


Facebook and its partners had violated users’ privacy by exploiting the already-available architecture of cookies to track users’ movements online without their express consent. While Facebook and its partner companies bear the bulk of the responsibility for the violation, the prevailing web-browsing architecture was also partly to blame.

The current architecture not only allows web sites to leave cookies on a user’s computer; by default, it does so without informing the user. Users who do not clearly understand the implications of the ‘keep me logged in’ checkbox leave their information available to sometimes unscrupulous advertisers.

A solution: Incognito mode


In a surprise move in September of 2008, Google released a brand-new web browser. The new browser, dubbed “Chrome”, featured many innovations in the interface and behind the scenes. Chrome had the benefit of being developed from the ground up for a more mature web, and its designers paid special attention to users’ desire for speed, flexibility and privacy.

Google’s chief concession to users’ privacy was the incorporation of an ‘Incognito’ mode in Chrome. Incognito mode allows the user to easily start a browsing session that exists only in a temporary space. Instead of relying on users to manage their privacy by periodically deleting their cookies, Incognito mode lets them simply launch an ‘off the record’ session.
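
One plausible way to picture the design (a sketch of the general idea only, not Chrome’s actual code) is a per-session cookie store that lives purely in memory and is discarded when the Incognito session ends, while the normal profile’s cookies continue to be saved for the next launch.

# Sketch of the idea behind an 'off the record' session; not Chrome's
# actual implementation, just the general shape of the design.

class CookieStore:
    """Cookies for one browsing profile or session."""
    def __init__(self, persistent: bool):
        self.persistent = persistent
        self.cookies = {}

    def set(self, domain: str, name: str, value: str) -> None:
        self.cookies.setdefault(domain, {})[name] = value

    def close_session(self) -> None:
        if self.persistent:
            print("Normal session: cookies kept for the next launch:", self.cookies)
        else:
            # Incognito: nothing is written anywhere; the store is discarded.
            self.cookies.clear()
            print("Incognito session: cookie store discarded, no trace left.")

normal = CookieStore(persistent=True)
incognito = CookieStore(persistent=False)

normal.set("example.com", "session_id", "abc123")
incognito.set("example.com", "session_id", "temp-999")

normal.close_session()      # cookies survive past the browser window
incognito.close_session()   # the 'off the record' session leaves nothing behind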

Incognito mode radically changes the web-browsing privacy equation. Whenever a knowledgeable user browses sensitive information, they experience the nagging doubt of whether traces of their personal information are being left behind, like inadvertent versions of Hansel and Gretel’s breadcrumbs. Incognito mode allows users to launch a browsing session that is blissfully temporary. Having a browsing session persist past the termination of the browser is an essentially unnatural mapping for the user to comprehend: why would you still be virtually ‘logged in’ to a site if you no longer have the window open?

Chrome’s Incognito mode is in keeping with one of the general methodologies for preserving privacy that Friedman and Kahn describe in The Human-Computer Interaction Handbook (Friedman & Kahn, 2007): it empowers the user to control what information remote sites are able to lodge on the user’s computer.

Chrome was not the first browser to offer private browsing, but its version of the feature was initially the most accessible. Safari offered a Private Browsing feature earlier, but it has been fraught with embarrassing bugs in which the user’s private information was found stored on the system even after the user explicitly entered private mode.

The success of Incognito mode lit a fire under the other major browser producers, the Mozilla Corporation and Microsoft. Internet Explorer 8, released earlier this year, has an ‘InPrivate’ mode, and Firefox 3.5 incorporated a private browsing mode at its release in July of 2009. Once a feature has been added to a browser, it tends to remain in the feature set of future versions of the software. At this point, it is likely that all future internet browsers will offer some form of private browsing to users.

Direct stakeholders: Internet users


Millions of people use the internet across the globe, and there are as many uses for the information online as there are users of it. Each individual’s need for privacy makes them a direct stakeholder: users’ financial well-being as well as their reputations are at stake on the internet. A February 2007 survey by Javelin Strategy & Research (http://www.privacyrights.org/ar/idtheftsurveys.htm) determined that privacy violation in the form of identity theft had compromised the identities of 8.4 million people, with a mean resolution time of 25 hours per victim. Victims reported $5 billion in out-of-pocket losses in 2003 (http://www.ftc.gov/os/2003/09/synovatereport.pdf).

Indirect stakeholders: Insurance and financial institutions


According to the same Federal Trade Commission study, identity theft in 2003 accounted for $47.6 billion worth of losses, and a shocking 11.6% of all identity theft occurred online. Companies that fail to protect their customers’ privacy, through negligence or deliberate exploitation, stand to lose business and potentially face prosecution in civil court, as in the Facebook Beacon case.

A lack of respect for users’ privacy is almost inextricably linked to the users’ loss of trust in the organization. Customers universally hate being treated as a commodity, and the relationship between provider and consumer is easily soured when the provider extends that relationship to another company without the consumer’s explicit permission.


Works Cited

Friedman, B., & Kahn, P. H., Jr. (2007). Human values, ethics, and design. In Sears, A. & Jacko, J. (Eds.). The Human-Computer Interaction Handbook: Fundamentals, Evolving Technologies and Emerging Applications, 2nd Edition. (pp. 1241-1266). Lawrence Erlbaum.
