Trifacta

Job Results and Monitoring


Trifacta, a leader in data preparation, helps users transform raw data into structured insights. As the company evolved toward automation and scalability, efficient data pipeline monitoring became crucial. This case study explores how we tackled real-time data transformation monitoring—enhancing transparency and making it easier for users to identify and resolve issues before they impact operations.

Context

Successful analysis relies on accurate, well-structured data formatted for the specific task at hand. Data preparation is the process of transforming raw data source inputs into prepared outputs for use in analysis, reporting, and data science.


Analysis and understanding

Customer needs sit at the head of the table when it comes to Trifacta's product direction. Trifacta's Customer Success Management department (Customer Success Managers, Solutions Architects, and Technical Support) plays an important role in bringing back insights from the field. The UX team relies heavily on CSMs' customer sessions and session recordings to collect most of its qualitative insights.

Shadowing CSM sessions is a form of qualitative observational research the UX team performs regularly at Trifacta. The goal is to observe participants' natural behavior without interrupting or influencing them. These observations yield accurate information about people, their tasks, habits, needs, and pain points, which helps us continuously improve the product and keep the software intuitive.

My main focus during shadowing sessions was to find opportunities for system automation and operationalization. The job execution and results interface emerged as a bottleneck in the customer experience and revealed many pain points:

  • Cannot find where to download job results output (Wrangler Free, Cloud Dataprep).
  • Accidentally clicking “create a dataset” when trying to download/publish output.
  • Confusion about ad-hoc publishing functionality vs. run-job publishing functionality.
  • Cannot find job results for jobs run without profiling.
  • Lack of transparency on job status for publishing.
  • Poor (if any) system feedback.
  • Job logs are hard to read for non-technical customers.
  • When a job fails, the reason for the failure is poorly surfaced.

Since these issues were troublesome, a round of interviews followed. Open questions included:

  • What do our customers want to know about their jobs while they’re running?
  • Once finished, what information is the most useful to them about their jobs?
  • What are they trying to accomplish when they go to the job results page?

Findings from shadowing sessions, recording analysis, interviews, and field feedback from the Customer Success team were organized into common patterns around Job Results.


This is how the Job Results project was born.

Problem

Due to the lack of transparency around job status (when, how, and why), there was a major trust issue with job results, especially for jobs run without profiling. Data analysts experienced undue friction completing complex data transformation projects: while a “Job Profile” (when available) provided basic visualization, analysts couldn't easily locate assets, had to tediously inspect recipes of unpredictable length, and couldn't put their results in context.

My responsibilities

Conduct research by mapping user journeys, analyzing the customer feedback database, conducting user interviews, organizing feedback, and prioritizing problems. Explore, define, and communicate the product vision to stakeholders. Support a team of 6 developers and 1 product manager by testing click-through prototypes with users, delivering mockups in Sketch, and providing detailed documentation in Dropbox Paper and Confluence.

Key goals and possible solution

With the field’s feedback in mind, the primary focus of this project is to address the monitoring and navigation needs, laying the foundation for further expansion. 

  • Structural visualization: notify key stakeholders of important trends and feedback.
  • Storytelling with cross-linking (not a real drill-down): make information discoverable.
  • Job execution/results management (delete, publish, download).

The concept of a monitoring dashboard responds to users' need to both monitor job status and access relevant information from a single screen.

My assumption was that the monitoring overview could be used for different purposes (perhaps even by different audiences):

  • As a management tool to monitor and evaluate the progress of a job execution;
  • As a communication tool to support learning and improvement (alerts belong here as well): regular progress updates can be reviewed and used to inform any necessary changes.

Form meets function

Design is a quest to find the best possible match between the mental model users hold in their minds and the conceptual model your product presents to them. In the case of Trifacta, there is always a technical component to it.

As a product designer, I really enjoy discovering and working around specific functional constraints or uncertainties, because it brings another level of conceptual-model validation. Below is a small sample from my workshops with developers on how the future design should accommodate job run functionality.

[Whiteboard notes from workshops with developers on job run functionality]

Layout

I envisioned a visual display that consolidates key information onto a single screen, enabling users to monitor data at a glance and take action when needed. The goal is to facilitate decision-making through clear, real-time insights.

Dashboard-Like Design Principles:

  • Enables direct comparison of data without switching between screens.
  • Focuses only on critical information for status updates, progress tracking, and performance evaluation.
  • Supports drill-down capabilities for timely interventions and historical data retrieval.
  • Engages users with an intuitive, performance-tracking experience.

Two Key States:

  • Dynamic View – Continuously updates in real time for live monitoring.
  • Static View – Provides a high-level snapshot to highlight trends and patterns.
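
To make the distinction concrete, here is a minimal sketch of how the two states might be wired up. Everything here is hypothetical and for illustration only: the endpoint, field names, and polling interval are my assumptions, not Trifacta's actual API.

type JobStatus = 'queued' | 'running' | 'completed' | 'failed';

interface JobSnapshot {
  id: string;
  status: JobStatus;
  progress: number;        // fraction of work finished, 0..1
  failureReason?: string;  // surfaced directly, per the pain points above
}

// Hypothetical status endpoint; illustration only.
async function fetchSnapshot(jobId: string): Promise<JobSnapshot> {
  const res = await fetch(`/api/jobs/${jobId}/status`);
  return res.json() as Promise<JobSnapshot>;
}

// Static view: a single high-level snapshot for reviewing a finished job.
async function staticView(jobId: string): Promise<JobSnapshot> {
  return fetchSnapshot(jobId);
}

// Dynamic view: poll until the job reaches a terminal state, invoking a
// callback on every update so the dashboard refreshes in near real time.
async function dynamicView(
  jobId: string,
  onUpdate: (snap: JobSnapshot) => void,
  intervalMs = 3000,
): Promise<JobSnapshot> {
  let snap = await fetchSnapshot(jobId);
  while (snap.status === 'queued' || snap.status === 'running') {
    onUpdate(snap);
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
    snap = await fetchSnapshot(jobId);
  }
  onUpdate(snap); // final, terminal state
  return snap;
}

A push channel (WebSockets or server-sent events) would serve the same purpose; polling is simply the most compact way to express the dynamic/static split.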

Overview Design Concepts: 

[Early overview layout explorations]

Another key consideration for the chosen overview layout was how the customer experience evolved over time. Users lacked engagement with running jobs due to the absence of monitoring and analysis tools. My main goal was to improve storytelling and structural visualization across all job statuses—presenting the big picture in a clear, intuitive way while ensuring timely insights into patterns and trends.


Final deliverables

After a job is launched, detailed monitoring lets customers track its progress through every phase of execution. Status, job statistics, inputs, outputs, and a flow snapshot are all available in the Trifacta application.

[Final screens: job overview (running and completed), output destinations, profile, data sources, dependency graph, parameters, and design system components]

Designed with love in San Francisco

Let's chat. Say Hello!