The digital master of intelligence

Two talks convinced me that automation will play an increasing role in our lives. From an engineering perspective, Amit Kapur and Jeff Bonforte explained the powerful robot applications that run within our phones, our cars, and our houses. From a design perspective, former Cooperista Golden Krishna shared the design principles that might shuttle us toward more interface-less interactions.

Now, three scenarios to highlight the difference between the human, the machine, and the automaton:

The human: It's 2000. You're on vacation. You make your way through the airport following the sign of a car beneath a hovering key. A uniformed gentleman greets you, arranges the transaction, and offers you a key.

The machine: It's 2010. You're at work. You navigate to Kayak and input your dates, times, locations, and ultimately payment details. You arrive at the airport, and your name is displayed on the big yellow-lettered board. You hop in, drive off, and enjoy.

The automaton: It's 2020. You're asleep. The mound of data collected on your purchase behavior rivals all human knowledge from the beginning of recorded time – and that's only your Safeway club card. Your agentive automaton is busy running vacation scenarios based on your predicted travel patterns and your wife's Pinterest preferences. You wake to an alert that recommends taking time off on July 8th, with a full itinerary of dance classes, ping-pong sessions, popup dinners, and, of course, your self-driving rental car.

Let's consider another example: the brakes in the car you've just rented.

The human: It's 2000. You're in Toronto. Suddenly a raccoon waddles in front of your car, but it's icy. Remembering the lesson from your dad, you pump the brakes to keep the wheels from seizing. The raccoon continues his waddle, and you slow to a stop.

The machine: It's 2010. You've got a Toyota. You're hurtling along at 60 mph when the raccoon returns. It's still icy in Toronto.
You hit the brakes, and you hear a fast thumping noise coming from the car. Your gut tells you something is wrong, then your brain recalls the newest feature of your car, anti-lock brakes. You suppress your instinct to pump the brakes. All is well.
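The anti-lock behavior can be thought of as a fast control loop that does automatically what the 2000 driver did by hand: release brake pressure whenever a wheel starts to lock, then reapply it. A minimal sketch (function names and the slip threshold are my own, purely for illustration, not a real ABS controller):

```python
# Illustrative sketch of anti-lock brake logic: back off the brake
# pressure when the wheel is slipping, apply it fully when gripping.

def wheel_slip(car_speed, wheel_speed):
    """Slip ratio: 0.0 = rolling freely, 1.0 = fully locked."""
    if car_speed == 0:
        return 0.0
    return (car_speed - wheel_speed) / car_speed

def abs_brake_pressure(car_speed, wheel_speed, requested_pressure,
                       slip_threshold=0.2):
    """Modulate pressure many times per second, producing the thumping."""
    if wheel_slip(car_speed, wheel_speed) > slip_threshold:
        return requested_pressure * 0.5  # release: let the wheel spin back up
    return requested_pressure            # grip is fine: apply full pressure
```

The rapid alternation between the two branches is the thumping you feel through the pedal.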
Q: How does a designer screw in a light bulb? A: Does it have to be a light bulb?

The automaton: It's 2020. You've got a Toyota. Your car is driving you down the 101 at a comfortable 58.5 mph while you read your retinal subscription to the New York Times. What you don't see is your agentive automaton checking the traffic ahead, the driving record of the person tailgating you, and the collision statistics over the past 10 years at this interchange. You do wonder why you've shifted to the passing lane, and why you've rerouted to take a later exit. In a fleeting moment, while watching the news-tube, you count yourself lucky to have avoided a three-car pile-up in the morning rush hour. No brakes, no locking, no luck: a bit of math and lots of data.

As a designer, you may feel a bit of empathy for this agentive automaton. What we do as designers is try to understand the bigger goal our target persona is trying to achieve. In the case of car safety, the goal is to avoid accidents in the first place. This is what intelligence helps us with, and it's why AI is so disruptive: what we're talking about is a new way of designing.

Soon the Watson AI that won Jeopardy and the Google AI for automated driving will fit into your phone. For inputs, that phone is connected to a bunch of APIs with years of your historical behaviors and preferences. Even today's phones carry many sensors: camera, NFC, GSM, Bluetooth, wifi, GPS, light, touch, and more.
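The automaton's lane change can be sketched as data fusion: live signals plus historical statistics feeding one decision. A minimal sketch, where every name, signal, and threshold is hypothetical and chosen only to mirror the driving scenario above:

```python
# Illustrative sketch (all names and thresholds are invented): an agentive
# automaton fusing live sensor feeds with historical data to act before
# the driver notices anything is wrong.

from dataclasses import dataclass

@dataclass
class Signals:
    traffic_ahead: float      # congestion, 0.0 (clear) to 1.0 (stopped)
    tailgater_risk: float     # risk score from the tailgater's driving record
    interchange_risk: float   # 10-year collision rate at this interchange

def decide(signals: Signals, risk_budget: float = 1.0) -> list:
    """Return the actions the automaton takes on the driver's behalf."""
    actions = []
    total_risk = (signals.traffic_ahead
                  + signals.tailgater_risk
                  + signals.interchange_risk)
    if signals.tailgater_risk > 0.5:
        actions.append("shift to the passing lane")
    if total_risk > risk_budget:
        actions.append("reroute to a later exit")
    return actions
```

A calm commute (`Signals(0.1, 0.1, 0.1)`) produces no actions; a risky one quietly changes your lane and your exit.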
What does this mean for design?

Artificial Intelligence + Natural Language Processing + Machine Learning + Distinct Ontologies = Your Next Design Intern

Designers need to embrace AI as a design partner. Interaction design has been concerned with visual interfaces because they offer a vehicle to politely achieve users' goals. Now AI offers a way to be much more granular about goals, but it requires an even more detailed definition of domain-specific context. This means the designer is responsible not primarily for visuals, but rather for codifying personality.
At SxSW, Amit Kapur and Jeff Bonforte offered a framework for thinking about how to implement automation:

1. Learn (collect data)
2. Adapt (preferences)
3. Implicit (assume)
4. Proactive (predict)
5. Personalize (use data)

With intelligence and automation in abundance, we're going to need mental models that can represent these ongoing processes. We don't understand gravity, but we can use it. Like taxes and etiquette, it is a force we inherently grasp: we have mental models to deal with social and physical realities. Automation becomes a layer on top of those realities, and we need a mental model that makes room for its presence. Cooper's Chris Noessel and Stefan Klocek offer the Jinni as one way to look at this new type of connected intelligence. Golden encourages us to fight our urge to "slap a UI on it." What's becoming clear is that our utopian vision of screens everywhere is giving way to an augmented reality.
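Kapur and Bonforte's five steps can be sketched as a single loop. A minimal sketch, where the class, its methods, and the frequency-count heuristic are my own inventions for illustration:

```python
# Illustrative sketch of the five-step automation framework:
# learn, adapt, implicit, proactive, personalize.

class AutomationAgent:
    def __init__(self):
        self.history = []      # 1. Learn: collect raw behavioral data
        self.preferences = {}  # 2. Adapt: distill data into preferences

    def learn(self, event):
        self.history.append(event)

    def adapt(self):
        for event in self.history:
            kind = event["kind"]
            self.preferences[kind] = self.preferences.get(kind, 0) + 1

    def implicit_default(self):
        # 3. Implicit: assume the most frequent choice without asking
        if not self.preferences:
            return None
        return max(self.preferences, key=self.preferences.get)

    def proactive_suggestion(self):
        # 4. Proactive: predict and offer before being asked
        default = self.implicit_default()
        return "Shall I book {} again?".format(default) if default else None

    def personalize(self, template):
        # 5. Personalize: apply the learned data to new output
        return template.format(choice=self.implicit_default())
```

Feed it a few "dance class" events and it starts assuming, suggesting, and personalizing around dance classes, which is roughly the itinerary-building automaton from the vacation scenario.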
Golden offered this set of principles to begin to design for it:

1. Eliminate interfaces to embrace natural processes.
2. Leverage computers instead of catering to them.
3. Create a system that adapts for people.

Interfaces will always be with us, and sometimes are us. The screen may not be the medium of interaction, but the network will be ever present. Google Glass is all about persistence, and with it comes a new set of design principles to consider. Building on those principles, we can improve other interfaces that, though not on our face, are increasingly in our face. Google's talking shoe was a proof of concept; the Leap Motion gestural peripheral is for real. Packs ensure a digital copy of our behavior.