Tuesday, March 05, 2013

Cybersecurity, Human Factors & User Experience - Part 2


Cybersecurity, Human Factors & User Experience is a multi-part series by Stephen Ruiz examining the impact of User Experience Design on cybersecurity. Over the next few months we'll be addressing the following:

Part 1 - Cyber War and Human Error
Part 2 - Information Disasters (Tufte, Space Shuttles & Design Lessons that Save Lives)
Part 3 - The Iceberg Principle
Part 4 - A Common Visual Vocabulary
Part 5 - The Peaceful Data Warrior (Designing for Healthcare, Finance & Cybersecurity) 


Part 2 - Information Disasters (Tufte, Space Shuttles & Design Lessons that Save Lives)

 

"Good displays of data help to reveal knowledge relevant to understanding mechanism, process and dynamics, cause and effect."

- Edward Tufte

When the Space Shuttle Challenger exploded on January 28, 1986, most observers believed that seven people lost their lives due to poorly functioning O-rings on a booster rocket. If you're well versed in the theory and practice of Visual Design, you probably know the work of Edward R. Tufte, a professor emeritus at Yale University and an expert in the field. Dr. Tufte has long argued that misleading visual explanations (also known as bad design) were really at the heart of what went wrong.

As evidence, Tufte cites charts given to NASA by Thiokol, the maker of the solid-rocket booster; the charts in essence failed to communicate vital information about temperature and damage to the O-rings on previous flights. Tufte then designed his own chart, which demonstrated how the same data could have been presented in a way that showed the relationship between temperature and O-ring damage more clearly.


(Chart designed by Thiokol showing the relationship between temperature and O-ring failure. Temperatures run vertically, O-ring problems are drawn as squiggles, and each launch is analyzed with two separate pictures.)

  


(Chart designed by Tufte. His solution uses two dimensions, horizontal and vertical. As the temperature dips lower on the left, the damage to O-rings goes up.)
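To make the design principle concrete, here is a minimal sketch of the kind of display Tufte proposed, written in Python with matplotlib (our illustration, not part of Tufte's or Thiokol's materials). The numbers are hypothetical placeholders rather than the actual flight records; the point is simply to put the suspected cause on one axis and the observed effect on the other.

```python
# A minimal sketch of the redesign principle: plot the suspected cause
# (launch temperature) against the observed effect (O-ring damage) so the
# trend is visible at a glance. Values are hypothetical placeholders,
# not the actual flight records.
import matplotlib.pyplot as plt

launch_temps_f = [53, 57, 63, 66, 70, 70, 75, 76, 79, 81]  # degrees Fahrenheit
damage_index = [11, 4, 2, 0, 1, 0, 0, 0, 0, 0]             # severity of observed damage

fig, ax = plt.subplots()
ax.scatter(launch_temps_f, damage_index)
ax.set_xlabel("Launch temperature (°F)")
ax.set_ylabel("O-ring damage index")
ax.set_title("O-ring damage vs. launch temperature")
plt.show()
```

Plotted this way, the trend, and the danger of launching at a temperature colder than any previous flight, is hard to miss. That clarity, not the raw data, is what the original charts failed to deliver.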



This topic has generated many opinions and inspired exhaustive research. Our goal isn't to toss yet another theory into the fray. The real takeaway here is that an interface-level design flaw resulted in disaster. This is a User Interface problem. It's a tragic (and very public) example of the impact of information design upon our best and brightest. Remember, the charts were presented to the top experts at NASA, who quite literally were rocket scientists.

If rocket scientists couldn't make a life-or-death decision based on a low-tech chart in 1986, how can we expect better results from average citizens in the immersive, high-tech environment of 2013?  

The truth is there should be no such expectation. The rapid evolution of technology has outpaced our ability to contain all possible negative outcomes. Since the Challenger disaster in 1986, digital cellular phones have been invented, the World Wide Web has been created, the CD has replaced the LP, the MP3 has replaced the CD, and the world has become hyper-connected via social networks.

In short, we're all increasingly submerged in data. Private information about our finances, our health, and our personal relationships is regularly transmitted electronically. As more aspects of our lives come to depend on our own personal digital ecosystems, Cybersecurity threats are bound to have a greater impact on everyday citizens.

For maximum safety and security, today's average smart phone users should also be Cybersecurity experts. But herein lies the problem. Most people simply don't have the time or the training to safeguard their data properly. So what's the solution? This expertise should instead be baked into the products, software and apps they use every day. Simple but effective security measures must be a basic aspect of the design.

For those of us in the business of designing these systems, this calls for a fundamental shift in how we do our jobs. The urgency has never been greater. We must start thinking not in terms of a "better interface," but in terms of a new visual vocabulary. We must learn to design for people, not for machines. 

In August 2012, a San Francisco design and strategy firm called Cooper posted a piece by Golden Krishna titled "The best interface is no interface." The premise is that a good user interface is one that's so seamlessly designed into the product that it's as though it's not even there.

This is a perfect segue into our next topic. Check back next week for an exploration of "The Iceberg Principle" as it applies to user interface design.