Lecture 10 – Software

So, we’ve discussed hardware; now for software. As I said in one of my previous posts, one is useless without the other – technology requires hardware and software working together. In this lecture we looked at the beginnings of software as a concept, the ecology and epistemology of software, and the idea of the meta-medium.

From Small Beginnings

Most people will never have heard of Margaret Hamilton, and yet within the realm of software she’s one of the most important people there is. Hamilton pioneered the concept of software engineering and helped NASA in its quest to get humanity to the Moon in the 1960s. Thanks to her, notions of what software was, what it could do and how we could use it became something to focus on. She envisioned a world where hardware and software could work together and be used by all, for the benefit of all, rather than as something exclusive and expensive. In those early days, software engineering took years to establish as a discipline, and Hamilton has spent much of her life watching software spread across the globe from its beginnings at NASA.


As I discussed in my blog posts on the ecology classes, no medium works alone – it takes a combination of hardware and software, multiple programmes and networks, working together to create and use technology. Without this ecology, web services, the internet, mobile applications, online gaming and more would not be possible. You could still have a TV, mobile phone or laptop, as that’s just the physical hardware built from components – it just wouldn’t be able to do much.

One of the leading voices on the ecology of software in media is Lev Manovich. He’s written extensively about how our technological ecology is made up of countless parts, some of which are software based. I’ll be talking more about Manovich and his book, Software Takes Command, in the seminar 10 blog post.


Keeping it simple, epistemology is the study of knowledge and understanding. It covers not only the information we have but how we perceive and make sense of that information. So, you might know that the Moon orbits the Earth, but this knowledge means little if you don’t understand what it means, how it happens, why it happens and so on. We have to justify and rationalise knowledge in order to put it into perspective and make it useful.

Our brains can be compared to the storage drives in computers and other pieces of hardware. Information is gathered by our senses – touch, taste, sight and so on – and stored in the brain; a computer receives information from the user and stores it on the hard drive. Both the human brain and the hard drive have a maximum amount of data they can store – a capacity – and both have to be able to process and make sense of the data stored within them, otherwise there’s no use in storing it. If you saved a Microsoft Word document onto your PC but didn’t have Microsoft Office installed, and therefore couldn’t open the document, what would you do? You’d delete the file.

In a world so reliant on software and so full of data and information, we are now in an age where we cannot make sense of all the information we have unaided. We rely on computer algorithms to process data and display it in a way we can understand – usually visually. If you collected the traffic data for an entire city for a week, there’s no way you’d be able to spot patterns and correlations in that huge amount of data by hand, or express them in a useful way. This is why we rely on software to store and analyse the data and on visual techniques to view it – a computer can carry out millions of calculations and make decisions based on the information and settings it is programmed with. The human brain is a marvellous thing, but it can’t compete with that.
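As a toy illustration of that kind of aggregation (everything here is made up – the data is synthetic and the function names are my own), a few lines of code can collapse a week of hourly traffic readings into a daily pattern that a person could never reliably extract by eye at city scale:

```python
from collections import defaultdict

def weekly_counts():
    """Synthetic hourly vehicle counts for one week at a single junction:
    a baseline flow plus morning and evening rush-hour peaks, with
    quieter weekends. (Invented numbers, purely for illustration.)"""
    counts = []  # list of (day, hour, vehicles)
    for day in range(7):
        for hour in range(24):
            base = 100
            rush = 400 if hour in (8, 17) else 0
            weekend = -50 if day >= 5 else 0
            counts.append((day, hour, base + rush + weekend))
    return counts

def average_by_hour(counts):
    """Collapse 168 readings into 24 hourly averages - the sort of
    summary we would then hand to a chart for visual inspection."""
    by_hour = defaultdict(list)
    for _, hour, vehicles in counts:
        by_hour[hour].append(vehicles)
    return {hour: sum(v) / len(v) for hour, v in by_hour.items()}

averages = average_by_hour(weekly_counts())
peak_hour = max(averages, key=averages.get)
print(peak_hour)  # the busiest hour in the synthetic pattern
```

The computer doesn’t “understand” traffic any more than the hard drive understands a Word file; it just applies the rules it was programmed with, at a scale and speed we can’t match, and leaves the interpretation to us.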


The computer is not a medium on its own, as we usually understand media to be; rather, it’s one of the first meta-mediums. As Manovich describes it, the computer is a “combination of existing, new and yet to be invented media”. When we use a computer we use pieces of code that were written decades ago by companies like IBM and Microsoft; they have since been given new cases and many tweaks, but some of the principles remain, such as using code to write programmes that can be accessed from a main screen – the desktop. Technologies such as web services, sound, cameras and disc drives are contained within the software and hardware of a computer, making it a distinct meta-medium – an individual ecology that links into the wider media ecology.


I think that’s enough for today; I’ve covered the most thought-provoking points raised in this week’s lecture. Check out my seminar 10 blog post to read a little more about Manovich and his thoughts on software!

