Archive for December, 2007

It’s not what you do, it’s HOW you do it

Glenn on AnyGeo posted an entry about an InfoWorld reporter’s love of the iPhone and how it drastically changed her life. Reminds me of the “desirability” factor we were discussing before the holidays set in … here’s my great prediction (to agree with Glenn): applications, products, web sites, and other stuff that are not only usable, but delightful to use will be the ones that excel in 2008 (and beyond). Because, as Elizabeth discovered with the iPhone, it’s not WHAT you do that’s important, it’s HOW you do it. We’ve all had experiences with products and features we thought we desired, only to find them less than fun to use once we actually got them. The products and services we really love are the ones that get better the more we use them, the ones that have desirability and usability factored in.

Well, okay, it’s not a great prediction–it’s only common sense. (But wait, if it’s common sense, why isn’t it common?)

This goes for making geo-stuff “sticky” (and again) as well as for e-commerce web sites, mobile devices, toys, soda machines, cars, gadgets, and, well, um, everything people use! Make something that people love to use and they will use it. DUH!

How do you know if they’ll love to use it, though?

I think it goes back to “desirability testing.” When you do your user field tests (hopefully starting early in the design phase), notice what the users love and what frustrates them about your whatsitz. Remove the frustration points, enhance the delightful points, sprinkle in a little innovation and user-oriented thinking, and you will have a winning thingy. Step into your users’ shoes and look for the little extras that will make their lives easier, better, happier (but steer clear of over-featurizing!). Strike a balance between desirable features and simplicity of use.

Excellent, simple, intuitive interface

No, it’s not a web site–it’s driving directions–or rather, a red line, projected directly onto the windshield of a car. Frank on VerySpatial posted this about Virtual Cable, and it is so incredibly simple, it makes you wonder why nobody ever thought of it before. As described, it doesn’t sound very usable, but when you see it, you intuitively understand.

Virtual Cable

Frank referred to an article on Engadget about it.

Yesterday, Virtual Cable had some videos up on their web site showing how it works, but they’ve disabled them for the time being. I recommend checking back there. It’s really slick!

Now, the big question I have is where are they getting the navigation data? Will it have the same problems as the outdated map on my Nissan Quest GPS system? I can just imagine the frustration I’d feel if the red line tried to take me across a chasm where there used to be a road (before the earthquake).

Or will it be updated in real-time, more like Dash? The two together would be very powerful, indeed. Wouldn’t you like your red cable to route you around traffic? Or tell you where the nearest coffee shop is?

The value of good requirements

Over on the iRise blog, there’s a great article about measuring the hidden return on investment (ROI) of good requirements definition. David Walker speaks of how companies are focusing on “reduction in rework” and points out that the after-effect of reducing rework is accelerated time to market, which has several significant (monetary) benefits. David also points out:

Rework isn’t the only place to look for time-to-market advantages. Focusing on requirements definition not only reduces re-work – it also reduces WORK work. At an agile development conference in 2006, Mary Poppendieck (author of Lean Software Development: An Agile Toolkit for Software Development Managers) shared that 45% of code that’s written is NEVER EXECUTED! Another 20% is only rarely executed. It is, however, coded, documented, tested, trained, supported, maintained, etc. All of which absorbs resources – and adds time & cost – to projects. Getting the requirements right interactively and up-front with end users helps make sure that unnecessary functionality never gets built. What if your development teams could get 45% additional bandwidth simply by not working on stuff that will never be used? Less stuff to build & test means, you guessed it, shortened time to market.

Even though I currently work in the government sector, where time to market isn’t relevant (or at least doesn’t get much attention), I can sure relate to the frustration of rework. We seem to be reinventing the wheel almost every day, as one group goes out to accomplish some great goal that another group has already been working on for months.

How do you develop good requirements?

Rapid prototyping and user testing are key. Understanding your customers and their goals and motivations is critical. Sound business analysis focused on the right metrics, direct user interaction, and attention to usability from the start will give successful results. Using rapid prototyping/requirements definition tools should help as well.

Of course, good vision, communication, and organizational structure are going to be critical. Make sure all your people are marching towards the same goal and everyone knows what the other groups are doing, so you don’t have a whole lot of re-solving of the same old problems. That’s the kind of rework we don’t much talk about or measure, but in any fairly large organization, it happens.

More Tools for Building the User Experience

Prompted by a comment from Amit on my Faster Prototypes, Better Specs entry where I discussed Lucid Spec, I went and looked for some other UI-prototyping tools. I looked at GUI Design Studio, as Amit suggested, and also iRise. At first glance, they all look very similar. Initial thoughts:

  • I like the “whiteboard” in iRise Studio where you can model page interactions. They also have Masters and Templates that allow you to standardize your page designs and quick-start any new page design.
  • GUI Design Studio lets you put in anything you can draw as a control–you’re not limited to standard controls.

I’m wondering: has anyone done a comparison study of these types of UI prototyping/requirements tools? Are there others I’m missing?

… On my quick journey investigating these prototyping tools, I stopped off at the iRise blog, where I found a really interesting article on the adoptability of a product depending on where it is in the continuum from functional to usable to desirable, where:

  • Functional = A user can finish what someone could describe as a functional task, but it doesn’t necessarily meet their needs or goals as a user.
  • Usable = A user can meet needs and goals without frustration.
  • Desirable = The satisfaction of needs and goals is done in such a way that a user builds a positive emotional association with the product (i.e. positive product equity).

Dave showed this chart in the article:
[Chart: adoption plotted against relative value, with a dot moving from no adoption, to adoption with negative product equity, to adoption with positive product equity]

and explains the chart like so: 

What this shows on the lower right side is that yes, there are software products that can be adopted that are almost purely functional if they provide a huge amount of relative value.  However, even if they are adopted in a temporary fashion, the negative product equity associated with them means that they are easily and enthusiastically displaced by competitors.  The dot shows a product that moves from no adoption, to adoption with negative product equity, to adoption with positive product equity.

What a great chart! And a great way to explain the human factors of design. Dave also talks about “desirability testing” and says this isn’t currently done. Shouldn’t that be part of any good usability test? In all my usability tests I try to gauge the user’s emotional response to the interface, in addition to asking satisfaction questions of the “would you use this again?” and “would you recommend it to your friends?” variety. That’s desirability, is it not?

But I really like the name. Perhaps we shouldn’t call it a usability test–perhaps we should call it a desirability test? Would that make more sense to the world? Every time I mention “usability” to someone outside the field, they get this deer-in-the-headlights look like they can’t begin to imagine what I’m talking about.

But then “desirability testing” might be even worse–I can just imagine the snide comments I’d get from guys outside the UI design field … (or even worse, those within!)

Verizon “open”? Ha!

When I first heard that Verizon was opening its network up, I was encouraged–but then disappointed when I read exactly what they were opening and on what terms. They are opening up their network to “any application and device” by the end of the year. Verizon states “the company will publish the technical standards the development community will need to design products to interface with the Verizon Wireless network,” and that “devices will be tested and approved in a $20 million state-of-the-art testing lab.”

This morning Glenn on AnyGeo pointed to an article in the NY Times that provides some fantastic insights and eloquently expresses exactly what I was thinking about Verizon’s move to be more open:

This is not “open.” It’s just a little less closed. A true open platform like the Internet doesn’t have certification of trusted devices or applications. Developers get to do anything they want, with the marketplace as their only judge and jury.

Both the personal computer and the Internet flourished in an environment of free-market competition. Tim Berners-Lee did not have to submit his idea for the World Wide Web in 1991 to a “state-of-the-art testing lab.” All that he needed to unleash a revolution was a single other user willing to install his new Web server software. And the Web spread organically from there.

There’s a lesson here for Verizon and other cellphone companies. Like the open architecture of the personal computer, the open architecture of the Internet didn’t mean the end of competitive advantage.

Read the full article >>>

Return on Investment for Web Sites

Sandra Niehaus recently e-mailed me asking if I would read and review the book she co-authored, Web Design for ROI. I am a strong advocate of using metrics and business analysis along with usability testing to drive decisions on software/web design, so I found this book a breath of fresh air.

In this book, Sandra and Lance Loveday make several powerful points:

  1. There is a science to shopping. Provide a pleasant shopping experience for customers and they buy more. Retailers learned this long ago, but many organizations haven’t begun to understand this simple principle when it comes to their web sites. This applies even if you’re not really “selling” anything–even government and nonprofit organization sites have a purpose, and they would be much more effective at it if they’d pay attention to the user experience.
  2. Your web site is an investment. Treat it like one. Evaluate it objectively with appropriate metrics, just like you do every other business investment. Know the business case and rationale for your decisions. Don’t rely on personal design preferences–chances are, the designers and managers making the decisions aren’t anything like the users who will use the site.
  3. Focus on conversion and you will have a competitive advantage. Many organizations focus on buying more traffic for their site, rather than converting their existing visitors to customers (conversion). Optimizing the user experience will increase conversion, which will pay off more over time than buying traffic. Sure, you can still buy traffic–just do it AFTER you improve usability. (See the back-of-the-envelope sketch after this list.)
  4. Follow these key principles for successfully managing web sites for ROI:
  • Know what you want. What are you trying to accomplish? How can you use your web site to accomplish organizational objectives?
  • Know your audience. Who’s it for? As Sandra and Lance say in their book, “It isn’t about you. It’s about the audience.” User testing is hands-down the most effective tool for understanding your audience.
  • Treat your web site like a business. That means you need a site strategy. “If you don’t know where you’re going, it doesn’t really matter which direction you choose.”
  • Create a web site strategy. List objectives, target audiences (with profiles), assess the competition and traffic sources, then define your strategy for accomplishing your objectives. Your objectives are the what, your strategy is the how.
  • Measure the RIGHT metrics. Use business metrics (the same ones you use to track your business’ success–revenue, transactions, profit, etc.), site metrics (web site analytics–these tell you what users are doing on your site and where there may be weak points), and user metrics (user testing, user surveys, customer service inquiries). 
  • Prioritize design efforts intelligently. Use analytics to find problem areas (places with high dropoff rates) and focus on getting the customer through the entire sales process. Focus less on things that won’t make a big difference in meeting your organizational objectives. Estimate your ROI for a specific change, then track it.
  • Test, learn, repeat. You’re never done. Keep refining your site as technology, competition, and user expectations change.
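
Since I like to see the numbers, here’s a quick back-of-the-envelope sketch of the conversion-versus-traffic point from item 3 above. Every figure (visitors, conversion rate, order value) is a hypothetical placeholder I made up for illustration, not anything from the book; the point is just the arithmetic.

```python
# Hypothetical numbers to illustrate "improve conversion before buying
# traffic" -- none of these figures come from the book.
visitors = 100_000        # monthly visitors
conversion_rate = 0.02    # 2% of visitors become customers
revenue_per_order = 80.0  # average order value, in dollars

baseline = visitors * conversion_rate * revenue_per_order

# Option A: buy 25% more traffic at the same conversion rate (a recurring cost)
bought_traffic = (visitors * 1.25) * conversion_rate * revenue_per_order

# Option B: improve usability so conversion rises from 2.0% to 2.5%
# (paid for once, on existing traffic)
better_conversion = visitors * 0.025 * revenue_per_order

# Option C: do B first, THEN buy the same 25% traffic bump
both = (visitors * 1.25) * 0.025 * revenue_per_order

print(f"baseline monthly revenue:    ${baseline:>9,.0f}")
print(f"A: buy traffic:              ${bought_traffic:>9,.0f}")
print(f"B: improve conversion:       ${better_conversion:>9,.0f}")
print(f"C: conversion, then traffic: ${both:>9,.0f}")
```

Options A and B produce the same monthly lift in this toy example, but B keeps paying off on every future visitor, including any traffic you buy later (option C). That’s exactly why improving usability first makes the bought traffic worth more.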

I really like this analogy presented in the book: Treat your web site like a science experiment–set a hypothesis, try it out, measure results, repeat.

In the book, Sandra and Lance also explore the design issues and metrics associated with several different types of pages: Landing pages, Home pages, Category pages, Detail pages, Forms, and Checkout. Each of these sections of the book is a very helpful guide for designing and measuring the effectiveness of that type of page.

I loved the book, but found myself wanting more information in some places, so I was pleased to find that Sandra and Lance included a resource section at the end of the book. It’s set up with thumbnail images of each resource and a short, but very helpful description of each. Very readable, scannable, and a place to turn to whenever you are looking for a little assistance in your design efforts.

What government agencies oughta do …

I chanced upon a couple of interesting blog entries on Ogle Earth today about “closed” geo-portals offered by government geo-agencies–like France’s IGN Geoportail and the UK Ordnance Survey’s Explore. France’s portal offers 2D and 3D views, while the UK’s Explore portal allows users to share routes.

[Screenshot of Explore, from Mapperz.com]

A quote from OgleEarth (Stefan Geens):

Géoportail certainly is much more impressive than the UK Ordnance Survey’s “outreach” effort, but both are just as closed in a time when everything online is moving towards open, interoperable, mashable standards. KML is now an OGC standard, most recently embraced by Microsoft. Where is the support by IGN and OS? Why can’t I export anything to mash up? Where are the APIs? The USGS, on the other hand, gets it. …

… National GIS agencies should concentrate on getting the best GIS content, acting as a repository for it, and making it accessible to all. Competing with Google and Microsoft to provide end-user services based on this content is a waste of public resources, especially as Google and Microsoft will always do it better.

Stefan has a good point: have the government agencies considered what their people want to do with the data? Should government agencies focus on creating new user portals for viewing data, providing collaboration portals, or should they be trying to make their data more open and “mashable”, taking advantage of Open Geospatial Consortium (OGC) standards and best practices? Also, do government agencies inevitably produce inferior interfaces? Should they even waste taxpayer money on developing custom mapping interfaces?

I would argue that both openly accessible data distributed via open methods like KML and a usable government access portal (that includes open features like GeoRSS feeds) are needed. There are developers and dataheads who will want to mash up data with other data sources, but the vast majority of the world doesn’t have a clue what a mashup is, so they need that friendly portal. 
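
To make “open and mashable” concrete, here’s a minimal sketch of what publishing a single point of interest as KML could look like, using only Python’s standard library. The placemark name and coordinates are hypothetical placeholders, not real IGN or Ordnance Survey data; a GeoRSS feed would be the same idea with different XML elements.

```python
# A minimal, hypothetical example of "mashable" output: one placemark
# serialized as KML (the OGC standard mentioned above), using only the
# Python standard library. Real agency data would be far richer.
import xml.etree.ElementTree as ET

KML_NS = "http://www.opengis.net/kml/2.2"
ET.register_namespace("", KML_NS)  # emit KML as the default namespace

def placemark_kml(name: str, lon: float, lat: float) -> str:
    """Build a one-placemark KML document any KML-aware client can load."""
    kml = ET.Element(f"{{{KML_NS}}}kml")
    doc = ET.SubElement(kml, f"{{{KML_NS}}}Document")
    pm = ET.SubElement(doc, f"{{{KML_NS}}}Placemark")
    ET.SubElement(pm, f"{{{KML_NS}}}name").text = name
    point = ET.SubElement(pm, f"{{{KML_NS}}}Point")
    # KML coordinates are ordered lon,lat (optionally ,altitude)
    ET.SubElement(point, f"{{{KML_NS}}}coordinates").text = f"{lon},{lat}"
    return ET.tostring(kml, encoding="unicode")

# A made-up placemark, just to show the output format
print(placemark_kml("Example trailhead", -2.3, 53.5))
```

Once data is exposed this way, anyone (including the agencies themselves) can layer it into Google Earth, Virtual Earth, or a friendly web portal.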

I agree with Stefan that Microsoft and Google are always going to “do it better”–that doesn’t mean government geo-agencies shouldn’t provide a public portal to access the data. What it does mean is that they should take advantage of APIs and other resources offered by Google et al. in building those portals. By using commercially developed (free), industry-standard mapping APIs and by paying attention to usability, perhaps they can avoid experiences like the one Stefan described when he tried out the UK’s Explore tool:

A couple of things quickly became evident. The mapping area is really small. The maximum scale is much lower than what we’re used to elsewhere. Map drawing tools are very rudimentary, and you can’t edit submitted routes. You can’t import routes. You can’t export routes. By-now standard web-map conventions such as using the scroll wheel to zoom aren’t supported. Mapperz has his own list of limitations.

In sum, if this were a private initiative, I’d refrain from reviewing it, as it would compare unfavorably to the competition. But this is tax-payers’ money at work, so the larger question needs to be asked: What is a government agency doing entering a market niche that is serviced much better and for free by the private sector?

Good question, Stefan.