Thursday, 29 November 2012

QCon San Francisco 2012: Co-making Great Products

Here is my last entry about the QCon San Francisco 2012 Lean UX track.  I found Jeff Patton's talk on Co-making Great Products very eye-opening.  Jeff argues that a surprisingly large percentage of the software we build does not end up being useful.

Being agile or lean is not enough, and having a high velocity might just mean that we can produce more crap at a faster pace!  Dividing the business and the developers often only helps us blame each other when things go wrong.  We should instead take joint ownership of the product and both bring our ideas to the table.

I can't find Jeff's slides, but his website has 12 emerging best practices for agile UX as well as a lot of good related content that echoes ideas discussed in Jeff's great presentation.  You can also check out the comaking website, which has good content including Jeff's articles from above.

Thursday, 15 November 2012

QCon: Fast Impactful UX Research

Here is another post about the QCon San Francisco 2012 Lean UX track.  This one is about Tomer Sharon's talk on High Quality Impactful Fast UX Research for Engineers.

Here are the slides.

The talk was very interesting and really clarified some concepts about usability analysis and research.  One of the key takeaways was: "Don't listen to users!  Observe their behaviour."

In less than 50 minutes, Tomer managed to explain the psychology behind his guiding principles and then discussed three techniques that could be used:
1. High-quality noticeability test
2. Impactful A/B usability study
3. Fast, colourful collaboration tool

I thought all three techniques were very practical and effective.  All in all, I found the presentation's delivery engaging and its content eye-opening.  Good job, Tomer!

QCon Lean UX Talk

I really enjoyed the Lean UX track at QCon San Francisco 2012.  The track opened with Jeff Gothelf's talk: Better Product Definition with Lean UX and Design Thinking.

I had a bit of a realization ("aha!") when Jeff clarified that requirements should be viewed as assumptions.  Jeff then explained how these assumptions should be proved or disproved in the cheapest way possible.  In some cases that means building a minimum viable product (MVP); in other cases mock-ups or paper prototypes might do the trick!

Another practical take away was how we could write user stories in the following style:
"We believe that if we implement feature XYZ,
our traders will be able to use it so
they can make more trades per day."

Overall, I thought the presentation made it clear that UX and lean startup concepts and techniques could work well together. 

Here are Jeff's slides if you are interested.  I am not sure the slides alone do justice to Jeff's great presentation, so stay tuned for when the recording becomes available on infoq.com!

Friday, 24 August 2012

Data-driven tests with JUnit

I recently discovered a simple way to create data-driven tests using JUnit.  It's so simple that I am surprised I missed it before.

Here is a code snippet:
import junit.framework.Test;
import junit.framework.TestCase;
import junit.framework.TestSuite;

public class DataDrivenTestExample extends TestCase {

    private final String expected;
    private final String actual;
 
    // must be named suite() for the JUnit Runner to pick it up
    public static Test suite() {
        TestSuite suite = new TestSuite();
        suite.addTest(new DataDrivenTestExample("One", "answer", "answer"));
        suite.addTest(new DataDrivenTestExample("Two", "result", "fail?"));
        suite.addTest(new DataDrivenTestExample("Three", "run-all-tests!", "run-all-tests!"));
        return suite;
    }
 
    protected DataDrivenTestExample(String name, String expected, String actual) {
        super(name);
        this.expected = expected;
        this.actual = actual;
    }

    /**
     * override this; default impl tries to reflectively find methods matching {@link TestCase#getName()}
     */
    @Override
    protected void runTest() throws Throwable {
        assertEquals(expected, actual);
    }
} 
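
A nice detail: the name passed to super(name) ("One", "Two", "Three") is what shows up in the test report, so each data point is individually identifiable when it fails.  (The "Two" case looks deliberately mismatched, presumably to demonstrate a named failure.)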

The first time I used JUnit for data-driven tests, my code would iterate through a directory of files within a single test method.  Of course, this does not provide very good defect localization when a test fails...

Then I thought: why not use a parameterized test?
This almost works, except that JUnit does not let you specify the name of each test!  So when you run it in your IDE or when you review a test report from your CI server, you get:
  • testMethod[0]
  • testMethod[1]
and so on as test names (see the sketch below)...
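
For reference, here is a minimal sketch of that parameterized style; the class and method names are mine, not from an actual project:

import static org.junit.Assert.assertEquals;

import java.util.Arrays;
import java.util.Collection;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.Parameterized;
import org.junit.runners.Parameterized.Parameters;

@RunWith(Parameterized.class)
public class ParameterizedExample {

    // each Object[] becomes one invocation of the constructor and test method
    @Parameters
    public static Collection<Object[]> data() {
        return Arrays.asList(new Object[][] {
                { "answer", "answer" },
                { "result", "fail?" },
                { "run-all-tests!", "run-all-tests!" },
        });
    }

    private final String expected;
    private final String actual;

    public ParameterizedExample(String expected, String actual) {
        this.expected = expected;
        this.actual = actual;
    }

    @Test
    public void valuesMatch() {
        // reported as valuesMatch[0], valuesMatch[1], ... -- the actual
        // data never appears in the test name
        assertEquals(expected, actual);
    }
}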

An alternative is to use a JUnit add-on that provides named parameterized tests (it does a toString() of the parameters).  This is pretty good, but it requires an extra third-party library.

I just finished the first few hundred pages of xUnit Test Patterns and I feel like I finally understand JUnit (beyond the annotations!).  There is even a Data-Driven Test pattern, which explains how it can be implemented by creating a test suite that interprets the data as test cases.  The example above does just that!

It would be fairly straightforward to write a FileDrivenTestCase that builds a test suite from the files contained in a directory.  Then all you have to do is drop in new input files with their expected outputs.  It's also handy that setUp() and tearDown() still get called when you override runTest().
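
Here is a rough sketch of what that could look like; the testdata directory, the .input/.expected naming convention, and the trivial processInput() stand-in are all hypothetical:

import java.io.File;
import java.io.FileReader;
import java.io.IOException;

import junit.framework.Test;
import junit.framework.TestCase;
import junit.framework.TestSuite;

public class FileDrivenTestCase extends TestCase {

    private final File inputFile;
    private final File expectedFile;

    // builds one test per *.input file, so each file shows up as a named test
    public static Test suite() {
        TestSuite suite = new TestSuite("FileDrivenTestCase");
        File dir = new File("testdata"); // hypothetical data directory
        File[] files = dir.listFiles();  // null if the directory is missing
        if (files != null) {
            for (File input : files) {
                if (input.getName().endsWith(".input")) {
                    File expected = new File(dir,
                            input.getName().replace(".input", ".expected"));
                    suite.addTest(new FileDrivenTestCase(
                            input.getName(), input, expected));
                }
            }
        }
        return suite;
    }

    protected FileDrivenTestCase(String name, File inputFile, File expectedFile) {
        super(name); // the file name becomes the reported test name
        this.inputFile = inputFile;
        this.expectedFile = expectedFile;
    }

    @Override
    protected void runTest() throws Throwable {
        // processInput() stands in for whatever the system under test does
        assertEquals(read(expectedFile), processInput(read(inputFile)));
    }

    private String processInput(String input) {
        return input; // placeholder implementation
    }

    private static String read(File file) throws IOException {
        StringBuilder text = new StringBuilder();
        FileReader in = new FileReader(file);
        try {
            int c;
            while ((c = in.read()) != -1) {
                text.append((char) c);
            }
        } finally {
            in.close();
        }
        return text.toString();
    }
}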

Happy data-driven testing!