Censorship

It's long been a problem for those who run shops to strike a balance between catering to the widest possible market and running the risk of causing offence to others.

Most recently, it's been Apple who've been in the news for trying to strike this balance: in one case banning The Sun newspaper, and in another refusing a pop group's entry to the App Store.

And of course, before that, there was the case of the application which involved shaking your phone until the baby stopped crying.

Whilst one could question the mind of someone who'd write such an application, we shouldn't let emotion get in the way of deciding whether it's the place of the App Store to dictate the content of the applications being sold.

This isn't an Apple-specific problem, of course; it's just that they have provided the most recent examples - maybe by virtue of having the highest-profile app store for mobile.

Those who take a conservative line on the censorship of applications sold online would point to the fact that adult books are not sold in a children's book store. But the point is that a children's book store is designed for children, and so it would not only be inappropriate but wouldn't make sense to stock adult books. Maybe a more striking example is how newsagents manage to stock jars of sweets, freezers full of ice-cream and comic books - and a top shelf full of adult magazines.

It's not only a metaphor to say that the children don't see the adult magazines in this case; it's literally true. They are available, but only in a place where those who should not access them cannot see them.

Of course, an application store (or a video store, or a music shop) is not designed solely for children, and so there will inevitably be material therein which is deemed inappropriate for children to see. So how to deal with that? And whose problem is it?

In recent years, here in the UK, there has been a discussion around what it means to be a publisher in the digital age. First it was child pornography that was the catalyst for the discussion; more recently it's been terrorist information. But the argument was the same in both cases: if an ISP (or server hosting company, or whatever) has material on their servers which is inappropriate (or even illegal), are they acting as the publisher of that information, and hence are they responsible for its content?

But I believe that the answer lies in the top-shelf approach, where the magazines which may cause offence are wrapped in brown paper and kept out of the way of those who don't want to see them. At some point, we have to allow users to decide the content they want on their phones rather than have a set of morals mandated by the store from which they download their applications.

Of course, the example of a baby-shaking application is extreme, but it would be easy to extend the argument to less extreme - but still contentious - issues. There are the issues they say you should never discuss with strangers - religion and politics - which are guaranteed to polarise opinion. If a political party created an application to run on your phone, then should this be included in the application store? What if that party was the British National Party? What if a religious organisation created an application to be run on mobile devices? Would which religion the organisation belonged to influence your thinking on whether the application should be published or not?

Knowing where to draw the line is difficult. Almost everyone could say where they would draw it, but that line would fall in a different place from the line of the next person asked, which would in turn fall in a different place from the line of the person after that. No matter where you draw the line between "acceptable" and "unacceptable", you will find yourself attacked by those who feel it's in the wrong place.

But why draw the line at all? It's a decision to be made early in the process of opening a shop - be it online or offline - as randomly applied censorship can generate worse publicity than taking either a very liberal or very conservative approach (as Apple are discovering at the moment). Is it acceptable for an online store to make it clear that they take no responsibility for the content of the applications they are selling?

Morally, I think it is acceptable - but I don't think it works quite so neatly in a commercial world. There are competing pressures here. The market for violent/adult computer games is enormous, yet hanging a sign saying "some of the things I sell are suitable for adults only" on the front door of your shop may turn away customers who would otherwise have come in to buy more innocent wares. Again, I believe in the top-shelf approach: in pure commercial terms, the business lost from those who refuse to patronise your shop because of some of your wares, no matter how well you segregate them, will be outweighed by the extra business you gain from stocking those items.

So, to return finally to the moral stance - is it acceptable to make money from material which is deemed offensive? Ultimately the answer is "no" - in my mind anyway - yet it's important to remember that whilst almost everyone would agree that an application which involves shaking a baby to stop it crying is on the wrong side of the line, the exact positioning of the line will be different for almost everyone you talk to...
