Tag Archives: unsw

Postgraduate symposium abstract – Stranded Deviations

I’ve just submitted a finalised abstract for a twenty-minute paper I’ll be giving at the UNSW postgraduate symposium on Monday September 3. (Specific time and location TBA.)

The symposium theme is ‘Making Tracks’ so, naturally, I’ll be using plenty of dinosaurs in my presentation.

Title and abstract are copied below.

I might actually blog about some of this stuff one day, though the rest of the year sees me quite busy writing other things so it may take a while =/


Stranded deviations: Big Data and the contextually marginalised

Knowingly and otherwise, we all leave traces when we use digital technologies. As social and practical interactions have moved to the digital realm, facilitated by technological breakthroughs and social pressures, many have become understandably concerned about user privacy. With the increased scale and complexity of stored information, commonly referred to as ‘Big Data’, the potential grows for another person to scrutinise our personal information in a way that makes us uncomfortable.

However, it can also be argued that, because there is so much personal data stored in various digital systems, our privacy is retained ‒ we all become lost in the noise. Attention is a finite resource, so it becomes unlikely that we will experience a privacy breach by a real person. In practice our traces are most often treated as data, computationally analysed, rather than as content, scrutinised by biological eyes.

‘Security through obscurity’ may appear to be an inadequate concept here because privacy breaches occur regularly. However, ‘cyber attacks’ are directed at targets who stand out from the noise, chosen based on some form of profiling. Therefore, within any context, certain individuals become disproportionately targeted. Those regularly contextually marginalised have the most to lose from participating in a culture of Big Data, raising issues of equal access.

In this paper I bring these ideas together to argue that the privacy discourse should focus not only on the potential for scrutiny of personal data, but also on the systems in place, both social and technological, that facilitate an environment where some users are safer than others.

Postgraduate symposium abstract

I’ve just submitted a finalised abstract for a paper I’ll be giving at the UNSW postgraduate symposium in September.  I thought I’d post it here =)

It was difficult choosing between this and my other, similar topic, which focused more on temporality.  (I think this one was easier to find useful examples for.)  I figure both topics will end up as chapters in my eventual thesis, so I don’t feel too bad about it yet.

Time to have lunch, and begin making plans to refamiliarise myself with Deleuze!


* Required fields: human rights issues in the digital influence of identity

As the number of people taking advantage of the convenience and social aspects of digital communications technology grows, the public discourse on privacy has become louder and more urgent. While privacy policies and regulatory organisations guide data holders toward responsible practices and aim to reassure users about their online safety, engagement with digital technologies may have unexpected, negative consequences for the nature of identity.

When signing up to digital services, we provide personal information for practical purposes and in order to personalise our experience. But what if you don’t identify with any of the options allowed in a required field? What if you feel certain mandatory details are irrelevant to the context? What if you wish to omit information due to concerns over personal safety? When digital services decide what identifiable information is relevant about us, and make declaring that information a requirement of legitimate participation both within our existing social circles and in the wider public sphere, our identities are altered and our potential for expression is diminished.

It has been said that access to the internet, as it facilitates our freedom of expression, has become a human rights issue. In this presentation I will argue that, when technology imposes features of identification some users are uncomfortable with, when it erases minority voices from the conversation and when such experience leads us to tread more carefully with our digital footprints, external rules that manage identities in our digital environment also introduce issues of human rights. This presentation will then outline some of the ways we, as a society, may begin to address this issue and reclaim some of this lost control over our personal identity.