Deep Thoughts, Identity and Expression, Privacy and Protection, Second Life, Virtual World Platforms

Kids on the Grid? Card ‘em. Governance, Trust and Identity in Virtual Worlds

“I will not abuse report residents solely based on them saying they’re less than 18 years old, as long as I can not prove this to be true.”

Blog post.

Who Runs Virtual Worlds

I was mulling over the issue of who makes decisions: who decides how content is copied or not, how perms are managed, how virtual worlds are governed. I was trying to sidestep the issues of Creative Commons and m/c/t for now; that's not the kind of thing for a Sunday.

Then, I ran across this post about abuse reporting, and it put me in mind of a discussion led off by Ren Reynolds on Terra Nova about putting the regulation of virtual worlds in the hands of national governments rather than transnational bodies.

Now, here’s the thing: I have this idea that the impact of code is far more subtle than we realize, and that policy is embedded in code – there’s no way to enforce policy, and no way to report abuse, without code. I also have this idea that virtuality, and digital personas in general, bring with them challenges related to identity and trust, and that these challenges are leading to the emergence of a tribal morality. This isn’t necessarily a bad thing, but it creates a tension between the common good as often expressed in open source – or in the freedom to copy content towards establishing a creative commons – and the pursuit of individual craft.

Now, I believe that this emerging morality, or social conscience, whatever you want to call it, arises partly from the need to establish trust where the code can’t perform that task. Again, this isn’t to say that the code necessarily should perform that function, just that in a digital world where anonymity, for example, can be a default, trust is established loosely and work-arounds are found, while at the other end of the spectrum people take advantage of the lack of a trust structure, giving rise to griefers, cons and thieves. Griefing, in some views, is a gift in a very odd sense, because it provides insight into what structures are missing: it highlights what’s allowed and what isn’t.

In this fragile framework of trust – where the code doesn’t establish things like identity toggles, for example, or where your ability to “go invisible” isn’t embedded effectively in the social applications – we see the emergence of sub-cultures and communities who work within the confines of policy and software limitations to establish tribes. The norms and behaviors of these tribes arise as much from the platform on which sociality is performed as from human instinct.

Because of these things, we can’t look at changes to the code solely in terms of technical possibilities; they have implications for wider policy and governance, affecting as they do a site for sociality that’s growing, expanding, and providing a new range of tools for expression.

If You’re Underage, Prove It
Linden Lab has made policy decisions over the years, some of which had unintended consequences, and some of which were ineffectively executed. One of those policy decisions was related to Residents of Second Life who chose to have child avatars. Another was related to age verification. Both policy decisions were supposed to build a framework through which the range of choice for personal expression remained as broad as possible, while the ability of underage individuals to access the Main Grid was restricted.

The main change to the code was the implementation of age verification which, as Dandelion and others have pointed out, was seemingly abandoned as quickly as it was adopted.

Against this backdrop, the community of child avatars responded to their exclusion from the SL5B events with a parallel birthday celebration. The call for inclusion included a wide range of arguments, although the main one perhaps was that child avatars are not children.

Now, however, the challenges of age verification and perceived flaws in abuse reporting have some in the kids’ community advocating that, because the system is broken, use of the abuse reporting system should itself result in sanctions against those using it. Abuse reporting was the topic of one of Torley’s neon-clad videos and of follow-up resident discussion, including on this issue.

Within a section of the child avatar community, Residents can be banned from sims by the estate manager for reporting that another resident is underage. The rationale is that because someone could jokingly claim to be underage, or “could have been drunk, could have made fun. Heck, it could be a typo” that unless the resident in question provides ID proving that they’re underage, there’s no grounds for filing an abuse report, and therefore the filer is griefing rather than protecting the Grid from minors:

“An abuse report is a measure not to be taken lightly. Residents banned from SL lose their land, their Linden Dollars, their creative work and probably their friends. I will abuse report kids for griefing, and for running around naked or violating the rules of my land or community. I will not abuse report residents solely based on them saying they’re less than 18 years old, as long as I can not prove this to be true.”

It’s explained that there’s an opportunity for error: someone says that they’re 14, or whatever, as a joke. Someone reports the joke, and the Lab suspends the account pending age verification. The process for age verification is cumbersome and lengthy, and in failing to verify (because the Resident doesn’t want to provide sensitive documentation, for example), they risk losing their account and assets.

There’s supposedly a second wrong that the ban on abuse reporters is meant to address, however: an age bias, along with the claim that science backs up the idea that separating individuals by age creates mental illness:

“I abhor discrimination based on age. Psychologists and psychotherapists all over the world plead for a society where adults and teenagers, kids and old people mix up. Seperation based on age creates social stupidity, ignorance and mental illness. Where generations learn from each other, social intelligence rises. This is the state of science today.”

The execution of policy on excluding minors from the Grid is therefore left in the hands of a group (Linden Lab) that the author claims can’t execute it. In the absence of effective age verification, it’s the people who REPORT other Residents as being underage who should be sanctioned, rather than those who say (mistakenly or truthfully) that they are, because in this author’s view there’s a greater tendency to use abuse reports for griefing than as a genuine protective measure against violations of the TOS. And if a few kids slip in, so be it – at least now they know on which sims a policy has been established to protect them against reporting.


Governance Starts Here

This example is illustrative, I think, not so much about issues of age verification or avatar identity (although it contains those issues), but rather of the question of where governance lies in virtual worlds and the Web.

These decisions have been actively promoted in-world, on blogs, and on popular podcasts. They form a level of governance at the sim level which, while possibly unintended by Linden Lab, happens because of the perceived failings of Linden Lab’s system for age verification and abuse reporting: once again, the code has influenced (subtly or not) our behaviors towards protecting communities.

The argument was put forward by another blogger:

“Perhaps if there were a system of sanctions against filers of false abuse reports, a faster turnaround time requiring less sensitive personal RL documentation to clear one’s name, benchmarks to show that the system is working or not, greater communication and sympathy from Linden Lab, or if they would at least work on weekends for crying out loud, I would change my mind about this.”

In the absence of global policy, clear mechanisms of enforcement, and a process of appeals, governance devolves to sub-communities, having been abandoned by what once used to be called “Game Gods” but who now call themselves “platform owners” in an attempt to stand clear of the need to play a role in social policy.

This leads to the arguments on Terra Nova that the governance of virtual worlds should be considered separate from that of the wider Net, and that platform owners should defer to the laws of the nation state while attempting to avoid global policy. Ren argues that virtual worlds are different from the Net in general because of the centralization of support functions:

“Management of virtual worlds, in terms of technical control and customer support, falls into three rough models: Global (EvE, There, SL etc); Regional (WoW, LOTRO); National (Habbo). Virtual worlds have identifiable and self-identifying communities.

Given the above, I would suggest that the key differences between the net and virtual worlds – when looking from a very high perspective of global governance structures, are:

- the net derives its nature and key benefits from being interconnected, global and ‘neutral’ (in terms of connection, content, protocol)

- virtual worlds derive their key benefits from creating ‘spaces’ with varying degrees of structure (often those structures being ones intended to generate game play) in which communities form.”

What’s left unstated is that the ’spaces’ in which communities are formed will tend towards self-governance and self-policing in the absence of effective platform governance.

Morality and Tribes

I previously wrote:

Synthetic worlds magnify, focus, and contain our explorations until they cycle back to catch our “real selves” and to challenge our long-held assumptions. Alts, for example, seem like separate questions – divorced from reality, and ‘not possible in real life’, until we awaken and discover that the very questions about avatars and alts are really questions about trust, self-confidence, faith, our personal moralities and how we view our ethics in the context of the broader “tribe” or “world”, and our yearning to have a purpose and place in life.

Today’s issues are less about age verification than they are about content protection, interoperability, and establishing domains of trust and tools for identity management. As these decisions are made, they’re not solely about what the code can or can’t do; they’re decisions about governance, policy, and transparency, and about balancing the tensions between the ‘tribe’ or wider social good, the individual’s right to create and the decisions needed to best protect that right, and the tendency of policy to devolve to sub-communities in the absence of thoughtful governance and its supporting code.

As all the mini grids pop up – each with its own game god, each with its own blog post establishing policy, and each of those policies based on its own morality, whether in support of age inclusiveness or of free-to-copy content – the tools we have as individual users for making sense of the rules and philosophies underlying these spaces remain, for now, in the hands of the people crafting those spaces.

I would argue for a right to transparency: an information overlay that both allows control over my identity and gives me fair access to an understanding of the policies, code and enforcement that will underlie the worlds I enter.


Creative Commons License
This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 3.0 Unported License.