Synthesizing Sanity with, and in Spite of, Synthetic Monitoring

Monday, March 18, 2024 - 2:40 pm–3:25 pm

Daniel O'Dea, Atlassian

Abstract: 

Synthetic monitoring, particularly browser-based monitoring, is hard to do well. When tests pass, synthetic monitoring provides a uniquely intuitive kind of psychological safety: human-like, verified confidence, compared to other forms of monitoring. When tests fail, synthetic monitoring is often blamed as flaky, misconfigured, or unreliable. If not properly implemented, it can be not only financially, mentally, and organisationally draining, but also damaging to real customer experience.

This talk is a conceptual and technical story of four years working with Atlassian's in-house synthetic monitoring solution, as the owning developer of a tool actively used by 30–40 internal teams to build and manage synthetic monitoring for Jira. How can we make synthetic monitoring better serve its purpose of providing useful signal?


Daniel O'Dea is part of the Jira Site Reliability Engineering team at Atlassian, where he leads database improvements, drives incident resolution, and builds tools used by many teams. Daniel is also a classical pianist, composer, and artist. He previously spoke at SREcon22 APAC about high-cardinality monitoring (and AI-generated ice cream).

BibTeX
@conference{295025,
  author = {Daniel O{\textquoteright}Dea},
  title = {Synthesizing Sanity with, and in Spite of, Synthetic Monitoring},
  year = {2024},
  address = {San Francisco, CA},
  publisher = {USENIX Association},
  month = mar
}