The Moral Operating System

We (those of us involved in software development, in its broadest sense) are rarely, if ever, asked about the moral implications of ‘our’ software. Sure, most of us have heard the privacy discussion about information stored in browsers, in the cloud, or with other organisations. But morality rarely figures as part of that discussion.

It was about ten years ago that I first encountered talk of (mobile) software’s ability to track a user’s location and ‘contextualise’ behaviour – e.g., displaying ads for the movie showing at the cinema you happen to walk past. Foursquare, Google Latitude, and Facebook Places are three such services available today – albeit for other purposes, but stay tuned for that movie ad.

However, the privacy discussion is not just about privacy settings (easy to use or not). Damon Horowitz certainly asked some interesting questions about morality and technology in his recent TED talk on the ‘moral operating system’. It is akin to the moral discussions found in science – the hydrogen bomb, for instance, or the question of whether the Weismann barrier is permeable: should we (still) apply gene therapy if we also risk a heritable change to our DNA?

Developing software hasn’t so far been associated with these kinds of moral questions, but it seems our technology has reached a critical mass where we might need to get used to discussing them. More isn’t just more . . .
