March 18th, 2013 Agile Experience Design: User Research for Agile Teams

http://www.meetup.com/Agile-Experience-Design/events/105172602/

On Monday, March 18, 2013, OLC attended Agile Experience Design's User Research for Agile Teams meetup, featuring Andres Glusman and Brenna Lynch of Meetup, Anna Cory-Watson and Cameron Burley of AppNexus, and Jodi Leo of CapTap.


http://www.meetup.com/ | http://www.appnexus.com/ | https://www.captap.com/

Jodi Leo demoed first, introducing CapTap as "building a flexible lending platform for small businesses." She outlined her presentation in three parts: logistics, tools, and time. "User research used to be hours of time-consuming logistics," Leo said. "It was the same thing in 1963 and even in 2003. Since 2007, there has been a ton of tools to figure out how people are using your product," she said.

There are five types of tools for watching users interact with a product: moderated, self-moderated, automated, automated static, and user analytics. "You can look at your research as ammunition," Leo said. "You get deep behavioral insights toward the moderated end and more opinions toward web analytics," she said. "Mobile is not synchronous on D-Scout, which lets you take pictures of certain procedures. No tool can fix a lack of a test plan or clear testing goals. With your tools, mix and match," she said.

From there, Leo discussed Ethnio, a platform built by researchers for researchers. "Time is everything," she said. "With Ethnio, you're testing on the user's timeline, not your timeline. You're catching them with web-based screen intercepts, capturing them two to three seconds before they actually interact with your platform. Using Ethnio is pretty easy," Leo said. "Creating a new screener is the place to start. You can customize a lot of things: upload your logo, choose a template, and customize the questions. It comes with preloaded questions, but you can create your own. There are options to customize thank-you notes, and after publishing you can share the screener on Twitter, place the JavaScript on your web pages, or send out a direct link. If you launch two lines of JavaScript, you can get a lot of user feedback. There's no need to do Starbucks drive-bys. This is real-time user research," Leo said.
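Leo's "two lines of JavaScript" refers to the kind of asynchronous embed snippet intercept tools typically provide. As a rough sketch only (the script URL and screener ID below are hypothetical placeholders, not Ethnio's actual snippet), such an embed does little more than load the vendor's script, which then decides which visitors see the recruiting invite:

```typescript
// Hypothetical intercept embed; the domain, path, and screener ID are
// illustrative placeholders, not Ethnio's real snippet.
const SCREENER_ID = "example-screener-123";

// Load the vendor script asynchronously so the page render is not blocked.
const script = document.createElement("script");
script.async = true;
script.src = `https://research.example.com/intercept.js?screener=${SCREENER_ID}`;
document.head.appendChild(script);
```

The vendor script then handles sampling visitors, showing the screener questions, and routing qualified respondents to the researcher, which is what lets Leo catch people "two to three seconds before they actually interact with your platform."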

Anna Cory-Watson and Cameron Burley of AppNexus presented their approach to user research in Scrum. "Scrum is all about doing, but research is all about thinking about doing," Cory-Watson said. "Scrum is a form of agile development," she added. "There really isn't a way that research is incorporated into Scrum. It's the responsibility of the UX designer to put research in. When AppNexus started, UX wasn't at the top of the list, but we've been working hard to change that. UX is research. One of the ways we've incorporated research is through discovery: interviews with end users and internal stakeholders covering use cases and edge cases, workflows, workarounds, and other tools. Next is ideation, where we collaborate with users and internal stakeholders on A/B testing and product design workshops. Validation follows, where we do early testing with end users and internal stakeholders: rapid research with lightweight prototypes and paper prototypes. Finally, there's finalization, which we test with end users through alpha/beta testing and usability testing on the live site," Cory-Watson said.

"We used to have a 'User Wednesday Program' with a weekly cadence: test designs in production, questions, and dissemination of results. It has since been rebranded as the AppNexus Allies Research Project to make it more product-friendly, with monthly release notes and new tools like rapid research programs. We prototype with Verify and Solidify, which is great for internal testing. Rapid research programs allow for prototyping at early design stages and can be appropriate for end users," she said.

From there, Cameron Burley took over and discussed getting started with Sprint 0. "You have to have a checklist when you start to do research," Burley said. "Think about the impact on stakeholders, the impact on business procedures, dependencies and assumptions, constraints, and epics. Without it, your client will not have a design that is functional," he said. Burley described UX Staggered as working one to two sprints ahead of the rest of the team so that the team can hit the ground running. "It also gives ample time to explore the backlog," he said. He then described UX Spike as "an experimental test, exploratory, a chance or opportunity to think about how to solve the problem and what else is needed. Think about research alternatives, socialize alternatives, and mock or ideate alternatives," he said.

Andres Glusman and Brenna Lynch took the stage to present on design and UX testing. "We all experience the Malkovich Bias," Glusman said. "That's the tendency to believe that everyone uses technology the way you do. The cure is to watch people use the stuff you build. It shifts the conversation from 'how I would use the website' to 'how this person uses the website,'" he said. The core reason Meetup invests in usability testing is to shorten the time between wondering how someone would react to something and actually watching someone react to it.

Brenna Lynch briefly talked about testing at Meetup. "We test with a lot of people. We have up to 400 sessions a year," she said. "Everything user-facing at Meetup is tested, and we test four times a week. Tests are set up before we even know what we're going to test, and we recruit participants from the community, but from Craigslist as well. Anyone from any team can request to have something tested," Lynch said. "When we do test, we use GoToMeeting to broadcast the sessions to designers, developers, and product managers. We follow each session with discussions, videos, and notes," she said. Glusman added, "IM conversations are vital to testing. They take the moderator's questions out of the equation, and you learn great things too."

Glusman switched gears to talk about design, stating that design is central to the makeup of Meetup.com. "Our sign-up process used to be five steps," he said. "It took five steps to create a Meetup group. One of our designers came up with a prototype along the lines of, 'What if it wasn't constrained by anything?' When we tested it on users, they were generally confused, but they were really engaged with the interface. Our developers iterated until they felt it was clear enough for end users. We carved it into a series of experiments, and when we were confident enough, we ripped out the old UI and plugged in what we tested. We learned what was good and bad through this, and it resulted in a 50 percent lift in groups. Usability testing liberates designers," Glusman said. "It's all about problem solving."
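Glusman's point about carving the redesign into a series of experiments describes a standard rollout pattern: deterministically bucket each member into the old or new flow, expose only a small percentage at first, and widen as confidence grows. Here is a minimal sketch of that kind of bucketing, with entirely hypothetical names (nothing below comes from Meetup's actual codebase):

```typescript
import { createHash } from "crypto";

// Hypothetical experiment bucketing: hashing the user ID keeps assignment
// stable across visits, so a member always sees the same version of the flow.
function inNewGroupCreationFlow(userId: string, rolloutPercent: number): boolean {
  const digest = createHash("sha256")
    .update(`group-creation-redesign:${userId}`)
    .digest();
  const bucket = digest.readUInt32BE(0) % 100; // 0-99
  return bucket < rolloutPercent;
}

// Start with a small slice of traffic, compare group-creation rates against
// the old UI, and raise the percentage as each experiment builds confidence.
console.log(inNewGroupCreationFlow("member-42", 10));
```

Once the new flow wins on the metric that matters (here, groups created), the old UI can be removed entirely, which is the "ripped out the old UI" step Glusman describes.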