Grove

(User Research Study + Product/Design Strategy + Platform Support)

Grove is a content management system (CMS) publishing platform launched by NPR for itself and its member stations. I joined the platform support team as a UX Researcher, working with a product manager and a product designer to study existing users and better understand the support requests the team was receiving.

Context

The previous research study had taken place about a year prior, before COVID-19 and the many changes it brought to news publishing. When I joined the team, the Grove platform was about to be opened to a wider set of users, and support tickets were growing in complexity. There was simply a need for fresh UX research to inform decision making.

Goal

To better understand the present workflows and broader context in which users are creating content in Grove.

 

Research Questions

  • What is most important to stations?

  • What are the base expectations for the CMS?

  • Are there any problems that aren’t being reported?

  • What workarounds are users relying on?

Our team set the intention to focus on workflows in Grove, not to compare the present experience with previous CMSs or to review existing tickets the participants may have submitted.

Research Planning

Participants

We sought 6 participants from 6 different stations of varying sizes and regions across the country. We worked closely with stakeholders from different departments at NPR to develop our screener and scripts. One department, whose specialty is working with member stations, helped us finalize the screener and recruit the participants.

In the end, our participant pool drew from a mix of East Coast, Southern, and Midwestern stations with varying content focuses, and participants’ roles ranged from editorial to IT.

Methodology

We conducted interviews over Zoom about the participants’ roles, along with contextual inquiry into their recent workflows.

In these 45-minute sessions, participants shared their screens, workarounds, and thoughts about Grove, and described how the platform fits into the current initiatives at their stations.

 

Outcomes

I presented the study’s findings in multiple research readouts to the platform support team, stakeholders, and members of the wider organization. We also provided the participants and member stations with an update on the outcome of the study.

What’s Important to Stations?

Participants shared their stations’ goals and how Grove could help meet them, along with examples of features they were looking forward to. I presented these side by side to show the team the shifting nature of what was presently important to stations.

Awaited Features

  • Social Sharing Integration means disseminating local news faster through social channels.

  • More Video Support in embeds and thumbnails helps users of all levels publish videos more easily.

  • Dynamic Ads and other automated features help users focus on content creation.

  • Secondary Audio support means being able to experiment with how audio and text work together on a page.

Common Priorities

  • Local Story Showcase is key to station goals of serving their local audience.

  • Diversifying Content Types means being able to reach a wider audience.

  • Ease of Use lowers the onboarding time for new users and means more content.

  • Having Options lets stations experiment with their best way to cover stories.

 

Base Expectations and Feedback

After using Grove, participants had formed base expectations for what the CMS was capable of:

  • It’s easy to make content look good in Grove.

  • Search and embed features work.

  • There is always something new to learn.

My intention with the research readout was to foster a constructive conversation about shifting priorities. So during the presentation, it was important to share these parts of the feedback to give the team broader context on its overall positive tone, particularly as we moved into the more critical topics.

Training Retention

A common pain point among all participants was that the learning curve for Grove often extends past launch. These issues, usually attributed to training retention, were more likely to go unreported through the conventional support channels.

During our interview sessions, I teased this feedback out by having the participants walk me through moments when they felt this pain point. One participant explained that they didn’t consider these to be issues, because they were able to make workarounds; instead, they referred to them as “tricky pickles.”

Tricky Pickles and Workarounds

Given that all participants considered certain difficulties worth reporting as tickets while others weren’t, I presented the team with a list of the common “tricky pickles” and their workarounds.

Workarounds

  1. Starting at the bottom of the page and filling in fields from the bottom up, which helped some pay closer attention.

  2. Uploading media in batches before content creation.

  3. Making two separate stories in the system despite potential liabilities.

  4. Using a custom browser bookmark to go straight to the live story.

  5. Making custom lists and manually changing the content that appears in them after every publication.

  6. Creating a new tag rather than reusing old ones.

Tricky Pickles

  1. The information architecture on the page-builder screen was confusing for some.

  2. It takes a lot of clicks to upload a given piece of content (audio/graphic/video) in the middle of editing a story.

  3. The process for allowing radio stories to convert to podcast stories automatically was ambiguous.

  4. Previewing posts after hitting publish had to be done manually.

  5. Prioritizing content on the homepages was neither straightforward nor fully automated.

  6. Legacy tags and attributions from deleted stories were difficult for the average user to update or edit.

These were helpful to discuss because they gave support team members insight into how someone working in editorial or as a front-end developer might approach the same task differently.

 

User Journey

I distilled the average editorial user’s workflow into this chart.

Story Journey

I also noticed that when I asked participants about their work inside and outside of Grove, they described the ways a story moved in and out of Grove itself. I made this chart to show the team how users see their content moving between themselves, other contributors, Grove, and their audiences.

Station Voices

After presenting the feedback and charts, I shared some longer quotes from participants in both text and recordings. Again, this was meant to give team members tonal context for the feedback and an opportunity to hear directly from users they might not otherwise interact with.

Recommendations

Based on what we learned, our team’s recommendations were to:

  • Reprioritize Support Requests with a logic similar to the one used to lodge them.

  • Stabilize Current Features and Prioritize New Features; stations are working to incorporate new media and features to expand their reach.

  • Ask More Questions, as we learned about other themes to investigate.

Prioritizing support requests with a logic similar to the one used to lodge them: a 3-Tier Request Priority extrapolated from our participants’ feedback.

 

Next Steps

After the readout, our team shared the research document more widely to gather feedback. We reviewed what we had learned and worked closely with the team to move forward on the recommendations.

Takeaways

I learned that sometimes nothing needs to be overtly wrong for there to be a need for fresh research.