Team:Wellesley HCI/Methodology


Wellesley HCI: User-Centered Design


User-Centered Design

Overview

To build a successful software tool, it is our responsibility as designers to cater to the needs of the user. Throughout our projects, we applied a user-centered design (UCD) process to our MS Surface and web-based tools as well as to our outreach program. In UCD, user input is crucial throughout all stages of the design process. The goal of UCD is to create tools that enhance the users' existing intuitive behaviors and practices rather than forcing users to change their behaviors to adapt to our software. We therefore pay extensive attention to the feedback and opinions of our potential users, synthetic biologists, and in each iteration of our software design we make improvements that address the questions and problems they raise.

Design Process

The user-centered design process we followed this year can be divided into four steps: analysis, design, implementation, and evaluation. Below, we describe the key activities we employed in each stage.

Analysis



In the analysis phase we visited a variety of potential users for our software to understand their profiles: their needs, requirements, and visions for the software we are to develop. In this phase we also conducted extensive background research on competing products and began envisioning potential scenarios in which synthetic biologists might use our software in practice.

Design



In the design phase we sit down with users to brainstorm concepts for the design of the software and then create low-fidelity prototypes. Collaborating with potential users, we then begin testing the usability of higher-fidelity prototypes of the design.
  • We took the first week of our summer research program to brainstorm design concepts at Wellesley, and invited several collaborators to the brainstorming sessions. Students presented their findings on related work to the group, we brainstormed design concepts, and we developed walkthroughs of those concepts through the creation of paper prototypes. After creating design sketches for our projects, we built low-fidelity prototypes, the first iterations of our programs.
  • During the design phase, we also conducted usability testing of the low-fidelity MoClo Planner prototype with biology students at Wellesley College. Their feedback allowed us to quickly iterate on the design of our program. After testing SynFlo with students from Upward Bound, we were also able to iterate on its design. As SynBio Search was created towards the end of the summer session, we are still in the design and preliminary testing stage for this program.


Implementation



With feedback from our users, we then start the implementation stage, where we conduct further usability tests with synthetic biologists and evaluate the visual design, performance, and efficiency of our tools.
  • During this phase, we conducted ongoing heuristic evaluations by testing our software with BU's and MIT's iGEM teams. We iterated on the visual design of our MoClo Planner program and improved performance by making the search option more efficient, among many other small changes that improved subject satisfaction with our software.
  • We continued to interview experts in the areas of synthetic biology, and started considering questions about safety, collaboration, and information sharing.

Evaluation



Finally, in the evaluation stage we deploy our refined software tools for use in the wet-lab and evaluate overall user satisfaction with them.
  • Usability and usefulness: we conducted testing with the BU and MIT teams as well as with Wellesley biology students. We used various quantitative measures (e.g. time on task, subjective satisfaction) and qualitative indicators (e.g. user collaboration and problem-solving styles); a sketch of how such measures can be compared is shown below. See results from the evaluation of SynBio Search, MoClo Planner, and SynFlo.
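
To illustrate how quantitative measures such as time on task and subjective satisfaction can be summarized and compared, here is a minimal sketch in Python. The file name, column names, and the two-condition comparison are assumptions for illustration only, not our actual study setup; it simply shows descriptive statistics per condition and a Welch's t-test on time on task.

  import pandas as pd
  from scipy import stats

  # Hypothetical usability-test log. Assumed columns:
  # participant, condition ("low_fi" or "hi_fi"),
  # time_on_task_sec, satisfaction (1-7 Likert rating).
  df = pd.read_csv("usability_sessions.csv")  # hypothetical file name

  # Descriptive statistics per condition for each quantitative measure.
  summary = df.groupby("condition")[["time_on_task_sec", "satisfaction"]].agg(
      ["mean", "std", "count"]
  )
  print(summary)

  # Independent-samples (Welch's) t-test: did time on task differ
  # between the two prototype versions?
  low_fi = df[df["condition"] == "low_fi"]["time_on_task_sec"]
  hi_fi = df[df["condition"] == "hi_fi"]["time_on_task_sec"]
  t_stat, p_value = stats.ttest_ind(low_fi, hi_fi, equal_var=False)
  print(f"time on task: t = {t_stat:.2f}, p = {p_value:.3f}")

Qualitative indicators such as collaboration and problem-solving styles are not captured by a script like this; we report those through observation notes and interview feedback on the individual tool pages linked above.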