My research became a little more complicated in November. And by ‘complicated’, I always mean ‘interesting and fun’.
I was having a conversation with Nina Funnell about my work on gendered spaces and how these influence practices of social engagement. The idea is that enforced gender declaration, a limited range of responses and the imposed prominence of this attribute together create issues for equality of participation. Users with perceived feminine profiles are often marginalised, their voices weakened through gender stereotyping and through experiences of harassment, whether direct or observed (one study found “female usernames received 25 times more threatening and/or sexually explicit private messages than those with male or ambiguous usernames”1).
Gender, I find, is a helpful example for explaining this issue of marginalisation and fairness of participation within communities. But it is one illustration of a broader problem: individuals are not treated as equal in practice, whether through silencing or through others internally delegitimising their voices via stereotyping, and individuals have little control over diminishing the negative stereotypes that work against them.
One clear solution is to give individuals more control over deciding which aspects of themselves they wish to reveal. Removing the mandatory gendering of social media spaces and allowing pseudonyms, for example, is a good step in the right direction. Users ‘a1’ and ‘a2’ (with no other declared attributes) are arguably far more equal than users ‘alison’ and ‘ben’. The more attributes that are added to the basic user ‘skeleton’, the less equal users become, depending on how viewers understand these attributes within a social context and on their processes of stereotyping. I argue that digital systems exercise too much control through the mandatory enforcement of declaring information that becomes publicly attributable to users.
Putting it simply, publicly categorising users within communities, such as gendering spaces through mandatory declaration, harms equality of participation.
Giving users more control over their public appearance may seem like a simple solution that fosters equality in community engagement. In fact, this was the direction my argument had been taking for much of the year. However! A real solution is much more complex.
Nina mentioned a concept called ‘stick-figure sexism’. We are shown a simple stick figure and asked to describe the type of person we believe it represents. More often than not, the response outlines a middle-aged, able-bodied, white male. This, then, is considered a ‘normal’ person (in stick-figure land), and any divergence from it is represented through ‘add-ons’ such as long hair, coloured-in heads representing different skin shades, walking canes, and so on.
The gethen blog has a fun, short post on stick-figure sexism, which describes xkcd comics as a “serial offender”2. On a related note, I’d like to share my new favourite xkcd comic, as published in that post.
Remixed by gethenhome from the xkcd original.
Hearing about this concept, I recognised important implications for my own work. If I advocate the removal of mandatory categorical fields within public user profiles, it is conceivable that some communities may be no better off, or even worse off. If we remove our focus on gender, say, we may simply default to assuming users belong to whatever gender we imagine is most likely to participate in those communities. We may assume that all (or the vast majority of) ungendered, pseudonymous users are young, white, male Americans, and in doing so destroy the sense of diversity we are led to experience in real-world situations.
All in all, I believe the negative consequences of this (let’s give it a name) ‘profile sexism’ will be small in practice, especially when set against the far more apparent positive consequences. However, it’s certainly important for me to address in my work. To find a reasonable solution we need to look at both the technical and the social gendering of spaces.
An early observation I noted was that, through many years’ engagement with communities on livejournal, I have personally experienced many situations where assumptions have not followed this idea of profile sexism. In fact, many communities lead me to perceive a large variation of cultures (I use the term very loosely) coming together to discuss a topic of shared interest. One reason for this is that I’ve been participating within these communities for so long that of course it would have sunk in that users come from different locations and account for a variety of cultural demographics. This suggests that a sense of multiculturalism and the acceptance of various views is learned over time. However, the strength of this effect is hard to gauge, because first impressions are difficult to recall and compare against present understandings.
Another reason for the sense of inclusiveness I register from these communities is that there is something about them, some aspect of the design – influenced by the livejournal system, the community moderators and the members themselves – that may facilitate it. It may be that some element of these communities’ appearance suggests they are more welcoming and inclusive than, say, the feeling I get reading YouTube and reddit comments, or user responses to articles on smh.com.
In truth, it’s probably a mix of both: a learned understanding through previous interaction, and particular design elements that help inform a sense of inclusiveness. In effect, what I’m now looking at (though this is only a small part of my research) is how to determine better system design for various communities, based on the kind of interaction desired. This argues against the common privacy demarcation between Zuckerberg’s and Schmidt’s “communities are better when everything is public [also: we can make more money from it]” position and my previous call for users to have full control over their public profiles and to be discouraged from publicising anything that’s not relevant.
I still hold the latter view, of course. But I seek to determine which elements of system design facilitate healthy interaction between users with different backgrounds and social identifications. This can’t be answered simply by discussing ‘privacy’.
Oh, look, I appear to have summarised my thesis in a few sentences.
1. “Female-Name Chat Users Get 25 Times More Malicious Messages”, 9 May 2006, physorg.com, <http://www.physorg.com/news66401288.html>
2. “Stick-Figure Sexism”, 29 December 2009, gethenhome.wordpress.com, <http://gethenhome.wordpress.com/2009/12/29/stick-figure-sexism/>