Who Runs Virtual Worlds
I was mulling over the issue of who makes decisions: who decides how content is copied or not, how perms are managed, and how virtual worlds are governed. I was trying to sidestep the issues of Creative Commons and m/c/t for now; it’s not the kind of thing for a Sunday.
Then, I ran across this post about abuse reporting, and it put me in mind of a discussion led off by Ren Reynolds on Terra Nova about putting the regulation of virtual worlds in the hands of national governments rather than transnational bodies.
Now, here’s the thing: I have this idea, and the idea is that the impact of code is far more subtle than we realize. And that policy is embedded in code – there’s no ability to enforce policy and there’s no way to report abuse without code. I also have this idea that virtuality, and digital personas in general, bring with them challenges related to identity and trust, and that these challenges are leading to the emergence of a tribal morality. This isn’t a bad thing, necessarily, but it creates a tension between the common good – as often expressed in open source, or in the freedom to copy content towards establishing a creative commons – and the pursuit of individual craft.
Now, I believe that this emerging morality, or social conscience, whatever you want to call it, arises partly from the need to establish trust in the absence of code able to perform that task. Again, this isn’t to say that the code necessarily should perform that function, just that in a digital world where anonymity, for example, can be a default, trust is established loosely and work-arounds are found, while at the other end of the spectrum people take advantage of the lack of a trust structure, giving rise to griefers, cons or thieves. Griefing, in some views, is a gift in a very odd sense, because it provides insight into what structures are missing and highlights what’s allowed and what isn’t.
In this fragile framework of trust – where the code doesn’t establish things like identity toggles, for example, or where your ability to “go invisible” isn’t embedded effectively in the social applications – we see the emergence of sub-cultures and communities that work within the confines of policy and software limitations to establish tribes. The norms and behaviors of these tribes arise as much from the platform on which sociality is performed as from human instinct.
Because of these things, we can’t look at changes to the code solely in terms of technical possibilities; they have implications for wider policy and governance, impacting as they do a site for sociality that’s growing, expanding, and providing a new range of tools for expression.
If You’re Underage, Prove It
Linden Lab has made policy decisions over the years, some of which had unintended consequences, and some of which were ineffectively executed. One of those policy decisions was related to Residents of Second Life who chose to have child avatars. Another was related to age verification. Both policy decisions were supposed to build a framework through which the range of choice for personal expression remained as broad as possible, while the ability of underage individuals to access the Main Grid was restricted.
The main change to the code was the implementation of age verification which, as Dandelion and others have pointed out, was seemingly abandoned as quickly as it was adopted.
Against this backdrop, the community of child avatars responded to their exclusion from the SL5B events with a parallel birthday celebration. The call for inclusion included a wide range of arguments, although the main one perhaps was that child avatars are not children.
Now, however, the challenges of age verification and perceived flaws in abuse reporting have some in the kids’ community advocating that, because the system is broken, use of the abuse reporting system itself should result in sanctions against those who file reports. The subject of abuse reporting was the topic of one of Torley’s neon-clad videos and follow-up resident discussion, including on this issue.
Within a section of the child avatar community, Residents can be banned from sims by the estate manager for reporting that another resident is underage. The rationale is that because someone could jokingly claim to be underage, or “could have been drunk, could have made fun. Heck, it could be a typo” that unless the resident in question provides ID proving that they’re underage, there’s no grounds for filing an abuse report, and therefore the filer is griefing rather than protecting the Grid from minors:
“An abuse report is a measure not to be taken lightly. Residents banned from SL lose their land, their Linden Dollars, their creative work and probably their friends. I will abuse report kids for griefing, and for running around naked or violating the rules of my land or community. I will not abuse report residents solely based on them saying they’re less than 18 years old, as long as I can not prove this to be true.”
It’s explained that there’s an opportunity for error: someone says that they’re 14 or whatever as a joke. Someone reports the joke, and the Lab suspends the account pending age verification. The process for age verification is cumbersome and lengthy, and if Residents fail to verify (because they don’t want to provide sensitive documentation, for example), they risk losing their account and assets.
There’s supposedly a second injustice that the ban on abuse reporters is meant to address, however: an age bias, backed by the claim that science supports the idea that separating individuals by age creates mental illness:
“I abhor discrimination based on age. Psychologists and psychotherapists all over the world plead for a society where adults and teenagers, kids and old people mix up. Seperation based on age creates social stupidity, ignorance and mental illness. Where generations learn from each other, social intelligence rises. This is the state of science today.”
The execution of policy on the exclusion of minors from the Grid is left, therefore, in the hands of a group (Linden Lab) that the author claims can’t execute. In the absence of the ability to execute age verification, it’s the people who REPORT other Residents as underage who should be sanctioned, rather than those who say (mistakenly or truthfully) that they are, because in this author’s view there’s a greater tendency to use ARs for griefing than as a true protective measure against violations of the TOS. And if a few kids slip in, so be it – at least now they know on which sim a policy has been established to protect them against reporting.
Governance Starts Here
This example is illustrative, I think, not so much about issues of age verification or avatar identity (although it contains those issues), but rather of the question of where governance lies in virtual worlds and the Web.
These decisions have been actively promoted in-world, on blogs, and on popular podcasts. They form a level of governance at the sim level which, while possibly unintended by Linden Lab, happens because of the perceived failings of the Lab’s system for age verification and abuse reporting: once again, the code has influenced (subtly or not) our behaviors towards protecting communities.
The argument was put forward by another blogger:
“Perhaps if there were a system of sanctions against filers of false abuse reports, a faster turnaround time requiring less sensitive personal RL documentation to clear one’s name, benchmarks to show that the system is working or not, greater communication and sympathy from Linden Lab, or if they would at least work on weekends for crying out loud, I would change my mind about this.”
In the absence of global policy, clear mechanisms of enforcement, and a process of appeals, governance devolves to sub-communities, having been abandoned by what used to be called “Game Gods” but who now call themselves “platform owners” in an attempt to stand clear of the need to play a role in social policy.
This leads to the arguments on Terra Nova that the governance of virtual worlds should be considered separate from that of the wider Net, and that platform owners should defer to the laws of the nation state while attempting to avoid global policy. Ren argues that virtual worlds are different from the Net in general because of the centralization of support functions:
“Management of virtual worlds, in terms of technical control and customer support, falls into three rough models: Global (EvE, There, SL etc); Regional (WoW, LOTRO); National (Habbo). Virtual worlds have identifiable and self-identifying communities.
Given the above, I would suggest that the key differences between the net and virtual worlds – when looking from a very high perspective of global governance structures, are:
- the net derives its nature and key benefits from being interconnected, global and ‘neutral’ (in terms of connection, content, protocol)
- virtual worlds derive their key benefits from creating ‘spaces’ with varying degrees of structure (often those structures being ones intended to generate game play) in which communities form.”
What’s left unstated is that the ’spaces’ in which communities are formed will tend towards self-governance and self-policing in the absence of effective platform governance.
Morality and Tribes
I previously wrote:
Synthetic worlds magnify, focus, and contain our explorations until they cycle back to catch our “real selves” and to challenge our long-held assumptions. Alts, for example, seem like separate questions – divorced from reality, and ‘not possible in real life’, until we awaken and discover that the very questions about avatars and alts are really questions about trust, self-confidence, faith, our personal moralities and how we view our ethics in the context of the broader “tribe” or “world”, and our yearning to have a purpose and place in life.
Today’s issues are less about age verification than they are about content protection, interoperability, and establishing domains of trust and tools for identity management. As these decisions are made, they’re not solely about what the code can or can’t do, they’re decisions about governance, policy, transparency, and creating a balance between the tension of the ‘tribe’ or the wider social good, the individual’s right to create and the decisions that need to be made on how to best protect that right, and the tendency of policy to devolve to sub-communities in the absence of thoughtful governance and its supporting code.
As all the mini grids pop up, each with its own game god, each with its own blog post establishing policy, and each of those policies based on its own morality – whether in support of age inclusiveness or of free-to-copy content – the tools we have as individual users for making sense of the rules and philosophies underlying the spaces we enter are in the hands, for now, of the people crafting those spaces.
I would argue for a right to transparency: an information overlay that both allows control over my identity and gives me fair access to an understanding of the policies, code and enforcement that will underlie the worlds I enter.
Excellent analysis. This leads directly into how states interact with coding authorities/platform owners, particularly when the two might be at odds over issues of national or transnational legislation. As virtual world assets increasingly have real world value (to put it very bluntly) will states feel the need to intervene in virtual worlds if, for example, the inworld assets of a large corporation with lobbying powers are compromised by inworld ‘violent’ actors? Will intervention powers be devolved/outsourced? What thresholds exist? And a thousand more questions …
Lots of good questions, Dusan. We don’t really have any good models of fair and incorruptible governing bodies anywhere in the world either, I guess.
I was playing with ideas of where things could evolve….
We had some talk about avatars that carry with them lots of information about our preferences and other aspects of that virtual world identity and history… we talked about the Kurzweil observation of how change is happening at an exponential rate, so we can imagine huge amounts of computing, memory, and bandwidth at our disposal…
so in that context I hatched an idea (while flipping through the programs on my TiVo’s Now Playing list and seeing ‘My Name is Earl’):
Virtual Karma
See, with real karma, we rely on comeuppance being meted out by some mechanical aspects of the Universe that are not currently measurable with our science; with sufficient technology, however, karma can be automated. Avatar Karma can be just the start. The karma of every entity can be quantified and monitored.
Massive amounts of computing resources would collect and analyze data in a neutral and sanctified cloud computing complex, fact checking and making available what needs to be known while maintaining scrupulous adherence to privacy.
We could fairly and exhaustively monitor whether Google really is holding up its promise to “not be evil”; we could instantly see any taint of improper corporate influence in our governments, see the provenance of our food as it is being served, see the true colors of a policeman’s conduct on his sleeve, and hold cell phone companies accountable for treating customers like scum… you get the idea.
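Just for fun, here’s a minimal sketch of what an “automated karma” ledger might look like – everything here (the class name, the actions, the weights) is my own invention for illustration, not anything Linden Lab or anyone else has actually built:

```python
from collections import defaultdict


class KarmaLedger:
    """Toy ledger: each entity accumulates signed karma events."""

    def __init__(self):
        # entity name -> list of (action, weight) events
        self._events = defaultdict(list)

    def record(self, entity, action, weight):
        """Log an action with a signed weight
        (positive = helpful, negative = harmful)."""
        self._events[entity].append((action, weight))

    def score(self, entity):
        """Net karma: the sum of all event weights for the entity."""
        return sum(w for _, w in self._events[entity])

    def history(self, entity):
        """Full audit trail, so judgments stay transparent."""
        return list(self._events[entity])


ledger = KarmaLedger()
ledger.record("AvatarEarl", "helped newcomer", +5)
ledger.record("AvatarEarl", "griefed a sim", -10)
print(ledger.score("AvatarEarl"))  # prints -5
```

The interesting design questions aren’t in the bookkeeping, of course – they’re in who assigns the weights, who audits the history, and how “scrupulous adherence to privacy” survives a system whose whole point is total visibility.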
*Notes that my Plurk karma has dropped steeply over the past few days.*
I’ve known kids in SL on the main grid, and I’ve helped a few and AR’d one for being an idiot. The only real problem with a kid in SL is that people outside of SL may accuse you of being a pedophile over a relationship that develops while the child wasn’t being totally honest about their age. It makes me afraid to interact with those I suspect to be underage. In real life, you can at least look at them and make a reasonable judgement of age. That ability doesn’t exist in SL or on the internet.