When the iPhone 4S came out, I was truly amazed by the overall disappointment that people expressed. Personally, I went on a whirlwind journey that I'm calling 'Siri's Three Phases of Enlightenment', and I'm here to share it.
1st Phase – Siri Redefines Human-Machine Interaction!
My first reaction after seeing Siri on the iPhone 4S was that Apple's at it again, redefining human-machine interaction. They started the touch-based device movement a few years back, and Microsoft, Google, Amazon and others followed. Touch-based devices have become ubiquitous, and the first thing consumers do when trying out a new gadget is touch the screen. Microsoft has gone so far as to build that technology into the upcoming Windows 8. With Siri, Apple is moving us beyond touch-based interaction and into a more natural means of interaction: spoken language. Apple didn't invent touch-based technology, nor did it invent voice recognition, but it is a master at implementing and integrating technology into its products. So, as I watched the online presentation of the iPhone 4S release and the unveiling of Siri, I saw the dawn of a new movement. A movement that will again redefine human-machine interaction, just like its predecessor.
2nd Phase – Siri and its workings
As consumers got their hands on the iPhone 4S and reports started flowing in about Siri, my curiosity led me to investigate how Siri did its magic. After some digging around, I realized that there's more to Siri than meets the eye. It's not only meant to redefine human-machine interaction; it's also meant to target Google services through non-generic search. There's been a lot of talk lately about the impact on Google search, and I think this PCWorld article, "Siri as Google Challenger: Sorting Fact From Fiction", handles it well. Typically, Siri uses Google as its default search engine for generic questions, but when it comes to non-generic search, like location-based questions or calculations, Siri turns to other online resources such as Wolfram Alpha or Yelp. In short, I agree with the article linked above that Siri isn't taking Google out of the search business, but when it comes to non-generic search, it sure seems to be changing the dynamic.
3rd Phase – Levels of abstraction
Siri, set my alarm for tomorrow at 5am. Siri, what time is it in Berlin? … The more we interact with Siri and ask it to handle tasks for us, the less we will interact with the underlying layers. Siri will slowly establish itself as our intermediary, our go-to assistant. The longer it keeps doing our bidding to our satisfaction, the less we will care about its workings and the layers/services it interacts with to achieve those results. I believe the natural progression of this is that we will lose the ability and know-how to handle those simple tasks on our own. Apps will be designed with APIs for Siri interaction and will slowly shed the now-unnecessary end-user interface. In addition, the amount of data, and more importantly the multitude of interactions between different systems/apps needed to accomplish a specific task, will be so complex that it'll be best to let the 'computers do their thing'. This reminds me of a 60 Minutes segment, "How Speed Traders Are Changing Wall Street", where we see the results and conclusions but have no idea how we've reached them. In other words, Siri is out not only to redefine human-computer interaction, but also to bring about computer-to-computer (and app-to-app) interaction, setting itself up as our chosen intermediary for any future online interactions. With an offer to assist, Siri inserts itself between us and everything/everyone we digitally interact with. So, 'Siri, turn up the heat' might become a possibility as home appliances go online, and 'Siri, if I'm out of beer, remind me to buy some after work' is yet another one.
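To make the "apps expose APIs to the assistant instead of a UI" idea concrete, here's a minimal toy sketch of what that intermediary pattern could look like. Everything here is hypothetical and purely illustrative — these are not Apple's actual Siri interfaces, just a simple intent registry where an assistant dispatches a parsed command to whichever app handler claims it.

```python
# Purely illustrative toy of an assistant-as-intermediary:
# apps register "intent" handlers, and the assistant dispatches
# a parsed voice command to the matching handler — no UI involved.
# All names here are hypothetical, not Apple's real Siri APIs.

from typing import Callable, Dict

# Registry of intents that apps expose to the assistant.
HANDLERS: Dict[str, Callable[[dict], str]] = {}

def intent(name: str):
    """Decorator an app uses to register a handler for a named intent."""
    def register(fn: Callable[[dict], str]) -> Callable[[dict], str]:
        HANDLERS[name] = fn
        return fn
    return register

@intent("set_alarm")
def set_alarm(slots: dict) -> str:
    # A clock app's API: sets an alarm from the parsed time slot.
    return f"Alarm set for {slots['time']}"

@intent("world_clock")
def world_clock(slots: dict) -> str:
    # A world-clock app's API: looks up the time for a city slot.
    return f"Looking up the time in {slots['city']}"

def dispatch(name: str, slots: dict) -> str:
    """The assistant routes a parsed command to the owning app's handler."""
    handler = HANDLERS.get(name)
    if handler is None:
        return "Sorry, I can't do that yet."
    return handler(slots)

print(dispatch("set_alarm", {"time": "5am"}))       # Alarm set for 5am
print(dispatch("world_clock", {"city": "Berlin"}))  # Looking up the time in Berlin
```

The point of the sketch is the layering: the user only ever talks to `dispatch`, and the apps behind it become invisible plumbing — exactly the abstraction-by-intermediary effect described above.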
Siri, over and out!