Monday, July 01, 2013

The Iceberg Principle (pt.3)

In 1965, Gordon Moore predicted that the number of transistors on a chip, and with it computing power, would double roughly every two years. Moore, who co-founded Intel, was surprisingly accurate in his rough estimation. What does this imply?  Your current computer is likely to be tens of millions of times faster than a computer from 50 years ago (25 doublings over 50 years works out to roughly 33 million), and that's if you have a slow and outdated machine.  
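The arithmetic behind that figure is easy to check. Below is a minimal sketch, in Python, of the doubling calculation; the 1965 starting point and the two-year doubling period are taken from the paragraph above, and the years being compared are just illustrative.

    # Rough Moore's-law arithmetic: one doubling every two years.
    # The 1965 starting point and two-year period come from the text above.
    def speedup_since(start_year, end_year, doubling_period=2):
        """Cumulative doubling factor between two years."""
        doublings = (end_year - start_year) // doubling_period
        return 2 ** doublings

    print(speedup_since(1965, 2013))  # 16,777,216 (24 doublings)
    print(speedup_since(1965, 2015))  # 33,554,432 (25 doublings)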

Yet it’s also a deceptively simple picture of the evolution of technology. As Niklaus Wirth observed in 1995, "Software is getting slower more rapidly than hardware becomes faster." Google founder Larry Page restated the observation in 2009, a credible endorsement in the tech world, and it was eventually dubbed 'Page's Law.' The trend has proven as durable as Moore's Law. Software obviously isn't getting slower, per se, but in relation to its hardware counterpart, it lags.  

This stems partially from the fact that as software grows, the number of bugs tends to grow faster than linearly. Bugs beget more bugs, and their relationships to one another become more cryptic as the codebase expands. Debugging is a huge part of developing software, often the single most expensive line item in a project's overall cost.

The bugs and loose ends that haunt R&D beneath the surface are usually a hacker’s secret back door. It’s the things we don’t
know we don’t know that threaten us the most. 

As computer programming continues to evolve and becomes increasingly complicated for its creators, new vulnerabilities keep cropping up for end users as well. We're standing on the shoulders of giants every time we boot up a computer, tablet or smartphone. Our user experience is more complicated and overlapping than ever before, with social media, banking, networking, photo and video sites, and email all interconnected. Yet we only see about 10% of what comprises a user interface. While the gears under the hood purr obediently, we take security and functionality for granted.

Think of the cyber landscape as an iceberg, with 90% of its mass beneath the surface, hidden from view and difficult to measure. Unfortunately, many of us make decisions based on the 10% of the data we have readily available. This can be deceptive: symptoms are not necessarily root problems, and causation can be hard to decipher.

While software engineers and architects grapple with a behemoth set of bugs, holes and vulnerabilities to keep things secure on the back end, smart users should take on some of this responsibility as well. Vigilance and caution are the keys to avoiding hacker attacks. A system is only as strong as its weakest link. If we bite a worm with a hook in it, we compromise not only ourselves but also our co-workers, friends and employers. We might even inadvertently expose sensitive or classified data, threatening anything from a small business to our national security.   

As consumers, we must stay abreast of new scams, bogus apps, and other potential threats that might be introduced into a system due to our clumsiness. All encryption, password protection, even voice and facial recognition safety measures fly out the window once a human user overrides these measures with an approving mouse click. A recent study showed 93% of computers have antivirus software installed, yet only 23% of smartphone users said they intend to install such software. On top of that, only half of all mobile users even lock their phones.

These are just basic front-line measures. With smartphone threats on the rise, we're sure to see new, prolific viruses making their way through the mobile OS world. The idea that everything is safe as long as you have antivirus software is outdated. A zero-day virus (one that exploits a flaw unknown to antivirus vendors, and is therefore invisible to their software) can find its way onto a machine through a human mistake, through a technical weakness, or through a combination of the two.

One of the most common ways humans betray their own security is by falling for ‘spear phishing’ schemes, many of which rely on a genuine-looking but utterly counterfeit UI. The ruse can take many forms, like a Facebook icon saying a new friend has invited you to do something, or that your password needs to be or has already been changed. Or, you may receive a fake eBay or Amazon confirmation email with an authentic-looking logo warning you that your credit card has just been charged. Fake emails that appear to come from a coworker or boss are common as well.
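One low-tech defense against a counterfeit UI is to compare where a link says it goes with where it actually points. The sketch below (Python; the domains and the helper name are hypothetical examples, not a real filter) illustrates the mismatch that gives many of these emails away.

    from urllib.parse import urlparse

    def link_looks_suspicious(display_text, actual_href):
        """Flag a link whose visible text names one domain but whose href points elsewhere.

        A toy heuristic with hypothetical inputs; real phishing filters use many more signals.
        """
        shown = display_text if "//" in display_text else "//" + display_text
        shown_host = urlparse(shown).hostname or ""
        real_host = urlparse(actual_href).hostname or ""
        # Suspicious unless the real destination is the shown domain or one of its subdomains.
        return not (real_host == shown_host or real_host.endswith("." + shown_host))

    # A link that reads "www.amazon.com" but resolves somewhere else entirely:
    print(link_looks_suspicious("www.amazon.com", "http://confirm-payment.example.net/login"))       # True
    print(link_looks_suspicious("www.amazon.com", "https://www.amazon.com/gp/css/order-history"))    # False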

We take for granted the everyday user experiences we have with websites and applications we trust. Phishing schemes play on that very familiarity and sense of comfort. Familiar icons and logos automatically register in our brains as safe.  

Cyber security could also be described as 'cyber vulnerability.' Since the Web went mainstream in the mid-1990s, we have gone from a simple boxy PC plugged into a phone jack to an era of smartphones, 4G tablets, laptops, and cloud computing. As consumers and end users, we face many more dangers in the digital wild than ever before. What we see on the surface is far from the whole picture.     

Our experience as users should be one of utility and convenience; this lies at the heart of UX and UI design. We just need to remember to use caution and skepticism as well when we navigate the potentially perilous open seas.

Tuesday, March 05, 2013

Cybersecurity, Human Factors & User Experience - Part 2


Cybersecurity, Human Factors & User Experience is a multi-part series by Stephen Ruiz examining the impact of User Experience Design on cybersecurity. Over the next few months we'll be addressing the following:

Part 1 - Cyber War and Human Error
Part 2 - Information Disasters (Tufte, Space Shuttles & Design Lessons that Save Lives)
Part 3 - The Iceberg Principle
Part 4 - A Common Visual Vocabulary
Part 5 - The Peaceful Data Warrior (Designing for Healthcare, Finance & Cybersecurity) 


Part 2 - Information Disasters (Tufte, Space Shuttles & Design Lessons that Save Lives)

 

"Good displays of data help to reveal knowledge relevant to understanding mechanism, process and dynamics, cause and effect."

- Edward Tufte

When the Space Shuttle Challenger exploded on January 28, 1986, most observers believed that seven people lost their lives due to poorly functioning O-rings on a booster rocket. If you're well versed in the theory and practice of Visual Design, you probably know the work of Edward R. Tufte, a professor emeritus at Yale University and an expert in the field. Dr. Tufte has long argued that misleading visual explanations (also known as bad design) were really at the heart of what went wrong.

As evidence, Tufte cites charts given to NASA by Thiokol, the maker of the solid-rocket boosters; the charts, in essence, failed to communicate vital information about temperature and damage to the O-rings on previous flights. Tufte then designed his own chart, which demonstrated how the same data could have been presented in a way that showed the relationship between temperature and O-ring damage far more clearly.


(Chart designed by Thiokol showing the relationship between temperature and O-ring failure. Temperatures are vertical, O-ring problems are squiggles. Two pictures per launch analysis. )

  


(Chart designed by Tufte. His solution uses two dimensions, horizontal and vertical. As the temperature dips lower on the left, the damage to O-rings goes up.)
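To make that point concrete, here is a minimal sketch of the two-dimensional view described above, written in Python with matplotlib; the file name "launches.csv" and its column names are hypothetical placeholders for the historical launch records, not the actual dataset.

    # Minimal sketch of the two-dimensional view described above:
    # launch temperature on the x-axis, O-ring damage on the y-axis.
    # "launches.csv" and its column names are hypothetical placeholders.
    import csv
    import matplotlib.pyplot as plt

    temps, damage = [], []
    with open("launches.csv") as f:
        for row in csv.DictReader(f):
            temps.append(float(row["temperature_f"]))
            damage.append(float(row["damage_index"]))

    plt.scatter(temps, damage)
    plt.xlabel("Launch temperature (°F)")
    plt.ylabel("O-ring damage index")
    plt.title("O-ring damage vs. launch temperature")
    plt.show()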



This topic has generated a number of opinions and inspired an exhaustive amount of research. Our goal isn't to toss yet another theory into the fray. The real takeaway here is that an interface-level design flaw resulted in disaster. This is a User Interface problem. It's a tragic (and very public) example of the impact of information design upon our best and brightest. Remember, the charts were presented to the top experts at NASA, who quite literally were rocket scientists.

If rocket scientists couldn't make a life-or-death decision based on a low-tech chart in 1986, how can we expect better results from average citizens in the immersive, high-tech environment of 2013?  

The truth is there should be no such expectation. The rapid evolution of technology has outpaced our ability to contain all possible negative outcomes. Since the Challenger disaster in 1986, digital cellular phones have been invented, the World Wide Web has been created, the CD replaced the LP, the MP3 replaced the CD, and the world is now hyper-connected via social networks.  

In short, we're all increasingly submerged in data. Private information about our finances, our health, and our personal relationships is regularly transmitted electronically. As more aspects of our lives come to depend on our own personal digital ecosystems, cybersecurity threats are bound to have a greater impact on everyday citizens. 

For maximum safety and security, today's average smartphone users would also need to be cybersecurity experts. But herein lies the problem. Most people simply don't have the time or the training to safeguard their data properly. So what's the solution? This expertise should instead be baked into the products, software and apps they use every day. Simple but effective security measures must be a basic aspect of the design.

For those of us in the business of designing these systems, this calls for a fundamental shift in how we do our jobs. The urgency has never been greater. We must start thinking not in terms of a "better interface," but in terms of a new visual vocabulary. We must learn to design for people, not for machines. 

In August 2012, Cooper, a San Francisco design and strategy firm, posted a piece by Golden Krishna titled "The best interface is no interface." The premise is that a good user interface is one so seamlessly designed into the product that it's as though it's not even there. 

This is a perfect segue into our next topic. Check back next week for an exploration of "The Iceberg Principle" as it applies to user interface design. 

Wednesday, February 13, 2013

Cybersecurity, Human Factors & User Experience

Cybersecurity, Human Factors & User Experience is a multi-part series by Stephen Ruiz examining the impact of User Experience Design on cybersecurity. Over the next few months we'll be addressing the following:

Part 1 - Cyber War and Human Error
Part 2 - Information Disasters (Tufte, Space Shuttles & Design Lessons that Save Lives)
Part 3 - The Iceberg Principle
Part 4 - A Common Visual Vocabulary
Part 5 - The Peaceful Data Warrior (Designing for Healthcare, Finance & Cybersecurity) 


 Part 1 - Cyber War and Human Error


"Far from being an alternative to conventional war, cyber war may actually increase the likelihood of the more traditional combat with explosives, bullets, and missiles. If we could put this genie back in the bottle, we should—but we can't. Therefore, we need to understand what cyber war is, to learn how and why it works, to analyze the risks, to prepare for it, and to think about how to control and deter it."

- Richard A. Clarke, Counter-terrorism adviser to Presidents Bill Clinton and George W. Bush.

Besides the usual rhetoric about "the need for peace talks" and "a long-term solution to the problem," the most recent flare-up in the decades-old conflict between Israel and Hamas this past November may have broken new ground in its use of cyber war tactics. While cyber war is not a new topic in the nation's zeitgeist, the number of reported incidents has grown sharply over the past few years. This video from CNN describes the cyber attacks related to the conflict.

   



With some security analysts predicting that 2013 is the year nation-sponsored cyber-warfare will go mainstream, it's no wonder that leaders from several western nations have made cybersecurity a top priority within their governing agendas. In his 2013 State of the Union address, President Barack Obama outlined his executive order addressing cybersecurity: Improving Critical Infrastructure Cybersecurity. This is no longer just an issue for those in the industry. The topic has moved from the realm of tech-savvy people and digital professionals and onto our national stage. Cybersecurity has gone mainstream.

Now that cybersecurity, along with the vulnerability of critical infrastructure (power, water, and nuclear systems), is an integral part of Homeland Security's focus, two things have become abundantly clear. The first is that with the increase in attacks (and the proliferation of connected devices like smartphones, tablets, computers and even household appliances), the responsibility for keeping data and systems secure will require more and more input from people who are not technical professionals. The second is that we need to place a greater emphasis on User Centered Design, not just for ease of use but for the sake of our safety. 

The very real (and very scary) truth is that human error accounts for a staggering number of security breaches. Here are some frightening statistics from a chiefexecutive.net article from 2011:
  • In October 2010, Microsoft blamed human error after two computers on its network were hacked and then misused by spammers to promote more than 1000 questionable online pharmaceutical websites. 
  • In April 2011, the State of Texas discovered that the personal and confidential data of 3.5 million teachers, state workers, retirees and recipients of unemployment checks had been left unprotected on the Internet for nearly a year. According to Gartner, Inc., more than 99 percent of firewall breaches are caused by misconfigurations rather than firewall flaws. 
  • The State Department's 2008 breach of the passport system was the result of under-configured access control and a defendant's "idle curiosity," piqued by the simple discovery that he could.

Hypothetically, one could have the most sophisticated technology imaginable, but if it isn't intuitive or designed to account for human error, it simply won't be effective. So what can public and private-sector organizations do to address the problem of human error? The first step is to design for humans.

As we'll see in the next installment, Part 2 - Information Disasters (Tufte, Space Shuttles & Design Lessons that Save Lives), smart, user-focused design can literally mean life or death.

Friday, June 15, 2012

Stephen Ruiz speaking at the KJLH Digital Marketing Conference at Microsoft Century City

I had an opportunity to speak at the KJLH Digital Marketing Conference held at Microsoft Century City on SEO/SEM and User Experience.


Wednesday, January 12, 2011

Los Angeles UX Planning Meeting


The Los Angeles "meeting of the UX families" happened last night at Pete's in DTLA. We had representatives from the Los Angeles chapter of the Usability Professionals' Association, IxDA Los Angeles, the UX Book Club, and our trusted leadership from the LA UX Meetup Group. It was a great night of planning out our respective groups' activities and floating ideas about what we would like to see happen as a community.

Some of the more fundamental issues we dealt with were the overlap of group agendas and, in some cases, redundancy in programming. We managed to come up with some really great ideas for events and somehow worked out a tentative calendar for the first half of 2011.


Some of the topics that were floated were as follows:

  • A UX Debate: An event that would be in the debate format

  • User Experience and Movies: Exploring the intersection between film-making, digital technology and user experience.

  • Artificial Intelligence and User Experience: With rapidly evolving technology, where does User Experience fit into the self-actualized system?

  • P.E.T. Design and Application: What is designing for P.E.T. or Persuasion, Emotion, and Trust?

  • Prototyping: The 2011 Refresher and evolved methods of...

  • Eye-tracking





Some of the established events for the first part of the year include:

  • A tentative meetup at Coloft in Santa Monica on January 25th, where the topic will be eye-tracking. (*See the LA UX Meetup Group page for the latest information on schedules.)

  • Hopefully keeping the IxDA meeting on the calendar for February.

  • An Interaction Conference recap, first week of March.

  • Combined UX meetings scheduled for the last Tuesday of every month.

This was from my personal notes for the meeting, so if there's anything incorrect here, please feel free to email me any corrections.

Monday, December 06, 2010

Social Media Revolution

This is an interesting set of facts about Social Media. Done in a very hyper-media way, but the core of the message is interesting. Now, how do we monetize that? :)