reducing pain points in experimental documentation


Project background

Laboratories of all sizes are responsible for keeping impeccable documentation to prove their methodology is sound. Unfortunately, despite technological advancements in many areas, laboratories still routinely use paper notebooks to document their work. For many scientists, managing a paper-based lab notebook is a necessary pain. Paper notebooks make data hard to trace and provide little opportunity for efficiency gains. Designing an electronic laboratory notebook (ELN) that encourages scientists to perform concurrent documentation while leveraging data in the system could greatly benefit scientists of all kinds.

This work would initially serve users of our Biologics product, which targets scientists in biological-molecule-based research. This project was part R&D effort and part contract project with a client group, so the team needed to balance the client’s requirements with feature needs we knew would attract a broader audience for our product. It needed to ship before the expiration date of the client’s existing ELN contract.

ELN summary@2x.jpg
 
ELN - banner 2@2x.png
 

goals

Our ELN project team needed to do the following for this to be a success for the business and our product users:

  • Reduce scientists’ dependence on paper notebooks

  • Improve the value of existing user data contained in the parent product

  • Balance client requirements with a broader product strategy for a potential standalone product

 

WHO IS THIS HELPING?

While this should help all types of scientists and users of our Biologics product, there were three main functional roles we would need to serve:

Bench scientists

These scientists perform the bulk of the documentation efforts, and need a system that is easy to use and provides efficiency gains for repeat processes. Current ELNs don’t provide these scientists with easy methods of documenting work concurrently.

Laboratory managers

Lab managers act as important gatekeepers of information in a lab. These managers perform scientific work, but are also responsible for managing documentation templates and reviewing others’ work before it is approved. Managers need a streamlined way to receive work for vetting and approval.

LEGAL

Downstream of the experimental output lie the lawyers. Eventually, all of the data generated must be used by the organization’s legal team to show proper discovery and ownership. While these users wouldn’t see much of the digital product, the need to stay within certain legal obligations made them de facto users who would impact the design.

 

THE PROCESS

Conducting user Research

Our group worked closely with members of a scientific project group to become familiar with their needs and workflows. This group provided key insight throughout the design process by acting as evaluators and interview subjects.

Several ELNs exist on the market, but none seem to have captured a wide audience. Only a handful of these products offer a modern design sense or broad functionality. When asked about using an ELN, most in this group reacted with a groan. It was clear that performing this documentation was seen as a necessary evil more than anything. The problem that began with paper notebooks has not gone away as science has moved into the digital age. Our solution would need to look beyond acting as a basic text editor and bring in deep data connections in order to make this process feel less cumbersome.

 

CONSTRAINTS

The primary constraint on this project was the timeline. We would need to move quickly to deliver a system that could accomplish our client’s needs before their current contract lapsed. On the other hand, we knew that this product should be built to be marketable beyond just one client, which put pressure on us to broadly understand the market needs and push back on requests as warranted.

Complicating this timeline was the fact that there were no existing components or design system for this new product. As such, many components would need to be quickly designed and implemented to support this work. This included the core rich text editor.

 

Design workshop

After the project kicked off, I felt it important to get our team together to establish a vision for the product. We ran a one-hour “brain sprint” to consolidate ideas and brainstorm on this goal:

How might we create a notebook page that allows scientists to quickly document information and couple it with external or internal data in Biologics?

As part of the homework for the session, I asked participants to bring examples of products that inspired them or that they felt solved a similar problem well, to use for “lightning demos.” Several great examples of using well-formatted “blocks” of content to build a document were presented, but one example really stood out: the ability to quickly access commands and inputs via a keyboard menu, as Atlassian’s Confluence supports. Those two concepts would form the backbone of our design going forward. The output of this convergent design exercise, shown below, gave our team a shared vision of what our solution would look like and how we could differentiate it.

The “Big Idea” - our team’s convergent design output from the condensed design sprint workshop


 

visualizing the user flows

The early designs would focus on addressing two core user scenarios that make up a large part of the notebook experience:

  • Finding internal system data to connect to the notebook

  • Participating in a review of another team member’s work, or having your work reviewed by a team member

Referencing items@2x.png

Providing our scientists with a method of quickly bringing existing data in the system into a notebook was an important part of our solution. If done right, it should help the existing data feel more valuable, and reduce the effort needed to perform documentation and review it. By allowing users to reference an item (a sample, experiment, assay data, biological entity, or other notebook), we could create quick links between these items in a centralized location, all supported by the scientific procedure and methods used during the experiment.

 
Review and Sign@2x.png

Performing the documentation is just one part of creating a scientific notebook. However, just as it takes two to tango, it takes two scientists to certify work. Our solution should provide users with a way to streamline the review process. Instead of review “pizza parties” with lab groups and red pens, our ELN would let scientists send their work to colleagues for review and signature. All feedback would be captured and certified directly in the ELN. This enhances the audit trail for notebooks and provides organizations and their legal teams with a reliable timeline of events for their IP filings.

 

Wireframing and concept testing

With these flows and the initial design direction agreed upon by the larger team, I created a wireframe prototype to put in front of a number of our client users. I used these sessions to get feedback on the product’s homepage design, the process and data needed to create a notebook, reviewing notebooks, and referencing internal data. In addition, these sessions provided a good avenue to probe our potential users about what they typically look for when reviewing someone else’s work and what happens when a notebook needs to be amended by its author.

While the flows were important to validate, this was also the first time we saw our envisioned text editor at any level of fidelity. I spent a lot of time working with our engineers to learn the finer points of various open-source rich text editor libraries. The chosen library would need to be highly customizable and support building components on top of it that would allow us to explore our planned data reference tool.

An early look at a notebook’s format


 
What our scientists would see before proceeding to sign a notebook for completion


 
The first low fidelity vision of our data referencing tool


“Where the power in this is where it’s all interwoven.”

The quote above came from a user in our wireframe tests. We left very confident in our structure and navigation, and fully planned to dive into the data referencing tool to ensure that things were “interwoven.”

Among other things, we learned that the ability to connect these notebooks to a project was critical for success. Up to this point, we had no notion of projects in our solution. Additionally, we learned that our users had expectations built up around commenting during the notebook review process. Google Docs was a heavy influence on our testers. They wanted the ability to highlight and comment on a certain bit of text, but noted that being able to comment on a block of text (what we referred to as an “entry”) would be sufficient.

 

high fidelity evaluation

After iterating on the learnings we took from the first round of testing, I took a higher-fidelity prototype to a number of users from our client group. This round would focus on how comments form the basis of a review process and how they would inform the decision to sign or not sign a notebook. Notably, this round of designs included changes to the comment structure that would allow users to add comments to a selected notebook entry during the review process. These comments would then be bundled and returned to the author to address if the notebook was denied, or included in the notebook record if approved. This relied on a two-step review process, where users switch the document view into a “review mode,” add comments, and decide to sign or return. Comments were anchored in the sidebar of the document. This drew inspiration from both Google Docs’ commenting functionality and Adobe’s e-sign experience.

 
Reviewing notebook - preview 3@2x.png

The decision to move comments into the sidebar without an ability to highlight text blocks caused confusion for our user group. In addition, they noted that, most often, they already knew whether they were going to sign or reject a notebook before moving to review mode. As such, a dedicated “review mode” was seen as unnecessary, provided we gave users a way to collaborate in the live document as either an author or a reviewer. This resulted in comment threads embedded at the bottom of each entry, as well as the removal of the initial review step, allowing users to move more quickly to the signature.
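As a minimal sketch of this final direction, the model below shows how per-entry comment threads and a single sign-or-return decision could fit together. All names and shapes here are illustrative, not our production data model:

```typescript
// Hypothetical model: each notebook entry carries its own comment thread,
// and review is a single decision rather than a separate "review mode".
type Comment = { author: string; text: string; timestamp: number };

type Entry = { id: string; body: string; comments: Comment[] };

type NotebookStatus = "draft" | "in_review" | "approved" | "returned";

interface Notebook {
  id: string;
  entries: Entry[];
  status: NotebookStatus;
  reviewer?: string;
}

// Reviewers (or authors) comment directly on the live document...
function addComment(nb: Notebook, entryId: string, comment: Comment): void {
  const entry = nb.entries.find((e) => e.id === entryId);
  if (!entry) throw new Error(`No entry ${entryId}`);
  entry.comments.push(comment);
}

// ...then sign or return in one step. Comments stay attached to their
// entries either way, preserving the audit trail.
function decide(nb: Notebook, reviewer: string, approve: boolean): Notebook {
  return { ...nb, reviewer, status: approve ? "approved" : "returned" };
}
```

Keeping comments on the entry itself, rather than in a detached sidebar, is what lets the same thread serve both live collaboration and the final signed record.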

 

THE SOLUTION

Our ELN launched into early access for a small number of client groups and is preparing for a full release in the coming months.

 

building a web of data

ELN - References@2x.png

The data reference tool is the key differentiator for our ELN versus other available solutions. By leveraging existing data, we can provide biological scientists with quick links to the sample and instrument data that their organization has assembled within our Biologics product. This tool is a hotkey menu embedded in the text editor. By initiating the menu, scientists can walk a tree of data classes, search by referenced item type, and see the most recent items for each class to attach to the notebook. Once referenced, notebook users can hover over the referenced item to get a quick view of information about the connected item.
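The menu behavior described above can be sketched roughly as follows. The class names, fields, and item IDs are hypothetical stand-ins, not the actual Biologics schema; the point is the walkable tree of data classes with recent items, filtered by what the user types:

```typescript
// Illustrative reference tree: data classes with their most recent items.
interface DataClass {
  name: string;            // e.g. "Sample", "Assay Data" (assumed names)
  recentItems: string[];   // most recent item IDs for quick attachment
  children?: DataClass[];  // nested classes form the walkable tree
}

// Flatten the tree and keep classes whose name matches the typed query,
// mimicking the hotkey menu's type-to-filter behavior.
function filterClasses(root: DataClass, query: string): DataClass[] {
  const q = query.toLowerCase();
  const hits: DataClass[] = [];
  const walk = (node: DataClass) => {
    if (node.name.toLowerCase().includes(q)) hits.push(node);
    node.children?.forEach(walk);
  };
  walk(root);
  return hits;
}
```

A selected result would then be inserted into the editor as a reference node, which is what powers the hover preview of the connected item.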

 

write once, use forever

Authoring a single notebook on an experimental process can be a daunting, if necessary, task. Authoring a repeated set of notebooks on the same topic? Painful. By enabling templates for notebooks, we allow scientists to create a framework they can quickly duplicate for new experiments and refine over time. This makes for easier, and faster, reproducibility of process and results, which is a win for efficiency gains and paramount for the experimental process. Templates include all rich data connections and formatting from the source notebook.
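At its core, instantiating a template is a deep copy that carries formatting and referenced-item links over to the new notebook while giving it its own identity. A rough illustration, using hypothetical shapes rather than the product’s real data model:

```typescript
// Hypothetical template entry: formatted text plus IDs of referenced items.
interface TemplateEntry { body: string; referencedItemIds: string[] }

interface NotebookTemplate { title: string; entries: TemplateEntry[] }

// Create a fresh notebook from a template: formatting and data references
// carry over; the copy is independent of the source template.
function fromTemplate(tpl: NotebookTemplate, title: string) {
  return {
    title,
    entries: tpl.entries.map((e) => ({
      body: e.body,
      referencedItemIds: [...e.referencedItemIds], // copy, don't share
    })),
  };
}
```

Copying (rather than sharing) the reference lists matters: refining one experiment’s notebook must never silently rewrite the template or its siblings.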

 
ELN - Review comments@2x.png

modern collaboration for modern science

Scientific tools are slow to come into their own and move beyond sharing Excel sheets over OneDrive. Our ELN pushes further into the lab space by integrating what was formerly a manual portion of the documentation process: reviewing and signing each other’s notebooks. Our solution brings the full review lifecycle to where the user’s data lives. We provide a way for scientists to control and document this process by selecting specific reviewers, deadlines for review, comments for fast feedback, and an approval or denial function that dictates what happens next with the notebook. Gone are the days of work pizza parties where groups of scientists would pass notebooks around a circle for review.

 

Takeaways

diary studies excel for hard-to-observe audiences

Scientists are a busy and difficult-to-study user group. Getting into the lab to observe is difficult in the best of times; in COVID-19 times, it’s a non-starter. Scientific documentation can be exceptionally lengthy for long-running experiments. As such, a method of feedback that could capture longitudinal events was a much better fit than one-off user tests. This made a diary study an excellent choice. Over the course of a week, our product team was able to capture rich feedback on a list of trigger events that helped us hone in on the most critical items to address in our next phase.

It also made for a rewarding experience for the team to “hear” our users react to discovering features. Here is how one user reacted to learning how to use the data referencing feature:

“SO EASY. Literally said ‘wow!’”

going shallow for deeper understanding

Throughout our research and early design work, we kept hearing about the importance of metadata for organization. I designed various “metadata labels,” “notebook properties,” and “key fields” on the notebooks. None of these seemed to stick in our expert interviews, but the demand was still there. Given the lack of clarity in the user scenarios, we opted for a generic custom fields listing that could be used to track a label and a value, giving users a way to categorize notebooks or track important variables.

Surprisingly, our diary study results indicated that users didn’t find a use case for their variables. This made our decision to go shallow and trim scope from this area a wise one. We may learn more as the ELN is used over the longer term, but this decision freed us up to work on other features that we felt were more important, despite the client demands.

 

NEXT STEPS

Bringing it all together

Our users reacted positively to being able to quickly link to the related data used for their notebook. One thing the team heard from our users in the diary study was that they wanted more data related to their experimental workflows in particular. These experiments represent milestones in project related work. By adding more types of data references and project organization, we can better mimic the organizational structure and approach that laboratories use. We’re actively working on tying this functionality into experimental workflows to bring in even more data to these notebooks seamlessly.

meeting expectations

One of the challenges of designing features that are core to other apps, but outside your software’s domain, is that users have set expectations about how they should work. For a text editor that offers a review workflow, Google Docs has clearly set the bar high for how comments and collaboration should work. Unfortunately, standing this up from a non-existent tech stack is a huge effort, and one that has to be taken in small steps. We’ll need to approach features like integrating inline commenting into an approval workflow pragmatically, while acknowledging those expectations.