Trifacta

Job Results and Monitoring

[Image: job profile view]

Trifacta is the global leader in data preparation. The company was founded in October 2012 and primarily develops data wrangling software for data exploration and self-service data preparation on cloud and on-premises data platforms. The company’s initial vision was to radically simplify the way people work with data, and for a long time the product was marketed as the ultimate data transformer. Since 2017, the company has shifted its focus toward automation and scale.

The company’s platform is designed for analysts to explore, transform, and enrich raw data into clean and structured formats. Trifacta utilizes techniques in machine learning, data visualization, human-computer interaction, and parallel processing so non-technical users can work with large datasets. Trifacta has tens of thousands of users at more than 8,000 companies around the globe, including leading brands like Deutsche Boerse, Google, Kaiser Permanente, New York Life, and PepsiCo.

Context

Successful analysis relies on accurate, well-structured data that has been formatted for the specific task at hand. Data preparation is the process of transitioning raw data source inputs into prepared outputs for use in analysis, reporting, and data science.

Analysis and understanding

Trifacta customers' needs sit at the head of the table when it comes to product direction. Trifacta's Customer Success Management department (Customer Success Managers, Solutions Architects, and Technical Support) plays an important role in providing valuable insights from the field. The UX team relies heavily on CSMs' customer sessions and recordings to collect most of its qualitative insights.

Shadowing CSM sessions is a kind of qualitative research and observation the UX team performs frequently at Trifacta. The main goal is to observe participants’ natural behavior without interrupting or influencing them. This information helps us constantly improve the product and create intuitive software, since observation provides accurate information about people, their tasks, habits, needs, and pain points.

My main goal during shadowing sessions was to find opportunities for system automation and operationalization. The job execution and results-surfacing interface turned out to be a bottleneck in the customers' otherwise seamless experience and revealed many pain points:

  • Users cannot find where to download job results output (Wrangler Free, Cloud Dataprep).
  • Users accidentally click “create a dataset” when trying to download or publish output.
  • Confusion about ad-hoc publishing functionality vs. run-job publishing functionality.
  • Job results cannot be found for jobs run without profiling.
  • Lack of transparency on job status for publishing.
  • Poor (if any) system feedback.
  • Job logs are hard to read for non-technical customers.
  • When a job fails, the failure reasons need better surfacing.

Since the issues were troublesome, a set of interviews followed. Open questions included:

  • What do our customers want to know about their jobs while they’re running?
  • Once finished, what information is the most useful to them about their jobs?
  • What are they trying to accomplish when they go to the job results page?

Findings from shadowing sessions, recording analysis, interviews, and field feedback from the Customer Success team were organized into common patterns around Job Results.

This is how the Job Results project was born.

Problem

Due to the lack of transparency on job status (when, how, and why), there was a significant trust issue with job results, especially for jobs run without profiling. Data analysts experienced undue friction when completing complex data transformation projects. While a “Job Profile” (if available) provides basic visualization, analysts could not easily locate assets, had to tediously inspect recipes of uncertain length, and could not put their results in context.

My responsibilities

Conduct research by mapping user journeys, analyzing the customer feedback database, conducting user interviews, organizing feedback, and prioritizing problems. Explore, set, and communicate the product vision to stakeholders. Support a team of 6 developers and 1 product manager by testing click-through prototypes with users, delivering mockups in Sketch, and providing detailed documentation in Dropbox Paper and Confluence.

Key goals and possible solution

With the field’s feedback in mind, the primary focus of this project is to address the monitoring and navigation needs, laying the foundation for further expansion. 

  • Structural visualization: notify key stakeholders of important trends and feedback.
  • Storytelling with cross-linking (not a real drill-down): make information discoverable.
  • Job execution/results management (delete, publish, download).

The concept of a monitoring dashboard responds to users' need to both monitor job status and access relevant information from a single screen.

My assumption is that monitoring/overview can be used for different purposes (perhaps even by different audiences):

  • As a management tool to monitor and evaluate the progress of a job execution;
  • As a communication tool to support learning (alerts belong here as well) and improvement by providing regular updates on progress; this information can be regularly reviewed and used to inform any necessary changes.
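
To make these purposes concrete, here is a minimal sketch in TypeScript of the information such a view would need to serve both audiences. Every name and field below is a hypothetical illustration, not Trifacta's actual schema:

```typescript
// Hypothetical data model for the monitoring/overview view.
// All names here are illustrative, not Trifacta's actual schema.

type JobStatus = "queued" | "running" | "completed" | "failed" | "canceled";

interface JobStageProgress {
  stage: "ingest" | "transform" | "profile" | "publish";
  status: JobStatus;
  startedAt?: Date;
  finishedAt?: Date;
}

interface JobOverview {
  jobId: string;
  flowName: string;
  status: JobStatus;          // management view: is it done, did it fail?
  stages: JobStageProgress[]; // transparency on when, how, and why
  inputs: string[];           // data sources used by the run
  outputs: string[];          // publishing destinations
  failureReason?: string;     // surfaced directly instead of raw logs
  alerts: string[];           // communication view: updates for learning
}
```
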
[Image: whiteboard sketches]

Form meets function

Design is a quest to find the best possible match between the mental model in the user’s mind and the conceptual model you present to them through your product. In the case of Trifacta, there is always a technical component to it.

As a product designer, I really enjoy discovering and working around specific functional constraints or uncertainties, because doing so brings another level of conceptual model validation. Below you can see a small part of my workshops with developers to understand how the future design should map to job run functionality.

[Images: workshop notes with developers]

Layout

I imagined a visual display of the most important information needed to achieve one or more objectives, consolidated and arranged on a single screen so the information can be monitored at a glance. Ideally, Monitoring/Overview would be used to guide decision-making and action.

Dashboard-like design:

  • Lets the user directly compare and draw conclusions from the data at a glance, which is not possible when the data is split across several screens.
  • Shows only the most important pieces to communicate status, monitor progress, or evaluate success: effective performance tracking in an engaging manner.
  • Allows the user to drill down to perform timely interventions.
  • Allows the user to drill down into historical data through historical snapshots.

There are two states of this view:

  • Dynamic, with information regularly refreshed and updated in ‘real time’ (see the polling sketch after this list).
  • Static, to display the big picture in a simple manner and tell a story/highlight data patterns.
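
As a rough illustration of the dynamic state, a client could poll a status endpoint until the job reaches a terminal state, then settle into the static view. The route and the fetchJobOverview helper below are hypothetical, and the types come from the earlier sketch:

```typescript
// Minimal polling loop for the dynamic state of the overview.
// Reuses the hypothetical JobOverview/JobStatus types sketched earlier;
// the /api route is an assumption, not Trifacta's actual endpoint.

const TERMINAL: JobStatus[] = ["completed", "failed", "canceled"];

async function fetchJobOverview(jobId: string): Promise<JobOverview> {
  const res = await fetch(`/api/jobs/${jobId}/overview`); // hypothetical route
  if (!res.ok) throw new Error(`Status request failed: ${res.status}`);
  return res.json();
}

async function monitorJob(
  jobId: string,
  onUpdate: (overview: JobOverview) => void,
  intervalMs = 5000,
): Promise<JobOverview> {
  for (;;) {
    const overview = await fetchJobOverview(jobId);
    onUpdate(overview); // dynamic state: refresh the dashboard
    if (TERMINAL.includes(overview.status)) return overview; // job is final: static state
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
}
```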

Overview design concepts:

[Images: overview design concepts]

For the chosen overview layout, another important consideration is how the customer's experience changes over time. Customers did not engage with running jobs because there was no mechanism to monitor and analyze runs. My main goal was to fix the poor storytelling and structural visualization for all job statuses: to display the big picture in a simple manner and to tell a story/highlight data patterns.

Final deliverables

After a job is launched, detailed monitoring permits customers to track the progress of the job during all phases of execution. Status, job stats, inputs, outputs, and a flow snapshot are available through the Trifacta application.
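
Job status can also be tracked outside the application through Trifacta's REST API. The sketch below assumes the v4 jobGroups status endpoint as documented for Trifacta Wrangler Enterprise; treat the exact path and response shape as assumptions to verify against your deployment's API reference:

```typescript
// Hedged sketch: checking a job group's status via Trifacta's v4 REST API.
// The deployment URL is a placeholder; verify the endpoint path and the
// response shape against your deployment's API documentation.

const BASE_URL = "https://example.trifacta.net"; // hypothetical deployment

async function getJobGroupStatus(jobGroupId: number, token: string): Promise<string> {
  const res = await fetch(`${BASE_URL}/v4/jobGroups/${jobGroupId}/status`, {
    headers: { Authorization: `Bearer ${token}` },
  });
  if (!res.ok) throw new Error(`Status request failed: ${res.status}`);
  return res.json(); // e.g. "Complete", "Failed", or an in-progress status
}
```
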

[Images: overview (running and completed), output destinations, profile, data sources, dependency graph, parameters, design system]

Designed with love in San Francisco

Let's chat. Say Hello!