Sunday, April 24, 2011

Lessig's Code 2.0 and Anonymity

                In Lessig’s Code 2.0, we learn that our behavior online and in virtual worlds is not without consequences in the real world. What was once the wild and uncontrolled frontier of cyberspace is now a world where we realize that controls have to be applied, and those ‘controls’ come in the form of code. Because cyberspace is now a place where problems or disagreements can be coded away, solutions should be as simple as two parties agreeing on what goes into the code. But things are never that simple.
                As Lessig discusses the remaking of cyberspace under this coding control, he adds that “Values originally considered fundamental will not survive” (p. 5). What if anonymity is the ‘fundamental’ value that does not survive?
                He covers this ground by discussing Julie Cohen’s “Cohen Theorem” – basically, “…protecting a right to read anonymously – that if they monitor, they must be constructed so that they preserve anonymity…[she] identifies a value yielded by an old architecture but now threatened by a new architecture, and then argues in favor of an affirmative right to protect the original value” (p. 192). But while Lessig favors Cohen’s position, he also argues that the other side has a legitimate reason for tracking. Simply put, the technology was not there before; it is here now, so why punish someone who wants to use it? His solution is to “architect cyberspaces to ensure anonymity – or more precisely, pseudonymity – first” (p. 192).
                Citizens of Spain are currently participating in a “Right to be Forgotten” campaign against Google, so that their anonymity and privacy online can be protected and old references in Google searches “wiped away.”
                As the article states:
                “… Google regularly receives pleas asking that it remove links to embarrassing information from its search index or at least ensure the material is buried in the back pages of its results. The company, based in Mountain View, Calif., almost always refuses in order to preserve the integrity of its index.”
                But while Google has been reluctant to make changes to satisfy the masses, it does make exceptions and is sometimes legally compelled to change things. On April 4, 2011, the Swiss Federal Administrative Court ordered Google to obscure all faces and license plates in its Street View images of Switzerland.
                And the story in Spain is expected to gain more notoriety, “because the European Commission this year is expected to craft controversial legislation to give people more power to delete personal information they previously posted online.”
                And I know of several requests from high-powered executives who have asked that their homes be removed from Google’s Street View feature – and they were removed. So it does seem that those who know the code and have access to it, or those with the power to obtain legal changes, will be the ones who benefit. While people familiar with how the online world is structured can settle arguments by modifying the ‘natural laws’ of the virtual world, the bottom line is that the problems that can be coded away will not be coded away for everyone.
                As Lessig says: “There is no middle ground. There is no choice that does not include some kind of building.” One of Lessig's themes for a solution to the anonymity/privacy problem is to get ahead of it enough to be sure that we don't have to backtrack and change things in the future. It “may take more planning to ensure that privacy is protected. But if those rules are embedded up front, the cost would not be terribly high. It is far cheaper to architect privacy protections now rather than retrofit them later” (p. 198).

But I think the technology grew too fast and stayed in the hands of those who made their own rules for too long, so we will be doing a lot of backtracking for a long time.
