LinkedIn Series

All posts in one place. Read in order or jump to what you need.

Overview & How to Use

This page collects my LinkedIn series in a playlist-style format. Each series has a short description and a numbered list of posts with direct links to LinkedIn. Start at Week 1 to follow the full narrative, or skim the summaries and jump into a topic that matches your needs.

Follow me on LinkedIn for new posts.

Experimentation in Airlines

Playlist
Week 1 · Why airline experiments are hard
November 19, 2025 | Read on LinkedIn

The post outlines why airline pricing experiments are especially difficult. Pricing affects demand, competitors, and capacity, so flights are not independent. Booking data are autocorrelated and noisy, which makes detecting small effects hard. Despite this, experimentation is essential because it provides clean causal evidence and even small revenue gains matter at scale. Building this capability strengthens pricing decisions and lays the foundation for broader experimentation across the airline.
Week 2 · When A/B testing meets airline reality
November 26, 2025 | Read on LinkedIn

The post shows why standard A/B testing breaks in airlines. Treatment and control cannot stay independent because customers share the same seats, routes influence each other, competitors react, and booking curves create timing effects. To get reliable results, airlines use designs that limit these spillovers, such as route-level randomization, pods, switchbacks, and models that account for network effects.
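The spillover-limiting designs mentioned here can be made concrete with a small sketch. This is an illustrative example, not code from the post: it randomizes at the route (market) level with a stable hash so both directions of a market land in the same arm. The route codes and the `salt` value are hypothetical.

```python
import hashlib

def market_key(route: str) -> str:
    """Canonical market id: both directions of a route map to the same key,
    so e.g. JFK-LHR and LHR-JFK always share one experimental arm."""
    a, b = route.split("-")
    return "-".join(sorted([a, b]))

def assign_arm(route: str, salt: str = "pricing-exp-1") -> str:
    """Deterministic hash-based assignment at the market level rather than
    the booking level, which limits spillovers between treatment and control."""
    digest = hashlib.sha256(f"{salt}:{market_key(route)}".encode()).hexdigest()
    return "treatment" if int(digest, 16) % 2 == 0 else "control"

arms = {r: assign_arm(r) for r in ["JFK-LHR", "LHR-JFK", "SFO-NRT"]}
```

Randomizing on the market key rather than the raw route string is the key design choice: it prevents the two legs of the same market from ending up in different arms.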

State Space Models

Playlist
Week 1 · What is a State Space Model?
August 13, 2025 | Read on LinkedIn

Introduces state space models as a way to separate signal from noise, combining interpretable components (trend, seasonalities, cycles, regressors) with Kalman filtering for estimation.
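The signal-versus-noise idea can be sketched in a few lines. This is a minimal illustration of the simplest state space model (a local level), not code from the post; the sample size and noise variances are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
# State equation: an unobserved level (the signal) evolves as a random walk.
level = np.cumsum(rng.normal(0.0, 0.1, n))
# Observation equation: we only ever see the level buried in measurement noise.
y = level + rng.normal(0.0, 1.0, n)
```

Everything else in the series (trend, seasonalities, regressors, the Kalman filter) builds on this two-equation structure: a hidden state that evolves, and noisy observations of it.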

Read on LinkedIn


Compares SSMs with ARIMA, highlighting SSM strengths for trends, breaks, and missing values, and noting that ARIMA itself can be written in state space form.
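As a hedged illustration of that last point — not code from the post — here is the simplest case, an AR(1), written in state space notation; the parameter values are arbitrary.

```python
import numpy as np

# AR(1): y_t = phi * y_{t-1} + eps_t, cast in state space form with
# state alpha_t = y_t, transition T = phi, observation Z = 1,
# state noise variance Q = sigma^2, and no observation noise (H = 0).
phi, sigma = 0.8, 1.0
T, Z, Q, H = phi, 1.0, sigma ** 2, 0.0

rng = np.random.default_rng(1)
alpha, ys = 0.0, []
for _ in range(100):
    alpha = T * alpha + rng.normal(0.0, np.sqrt(Q))    # state transition
    ys.append(Z * alpha + rng.normal(0.0, np.sqrt(H)))  # observation
```

Higher-order ARMA models follow the same pattern with a vector state; the payoff is that the whole Kalman machinery (filtering, missing data, likelihood) then applies unchanged.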

Read on LinkedIn

Shows how SSMs are built like Lego: combine trend, seasonalities, cycles, and regressors to decompose and explain complex time series.

Read on LinkedIn

Explains the Kalman filter’s prediction–update logic and why it yields optimal state estimates given uncertainty in data and model.
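The prediction–update logic fits in a dozen lines for a local level model. This is a minimal sketch, not the post's code; the noise variances `q`, `r`, the diffuse initialization, and the toy data are illustrative assumptions.

```python
import numpy as np

def kalman_filter(y, q=0.1, r=1.0):
    """Kalman filter for a local level model.
    q = state (level) noise variance, r = observation noise variance."""
    a, p = 0.0, 1e6                # diffuse initial state mean / variance
    states, variances = [], []
    for obs in y:
        # Predict: the level is a random walk, so uncertainty grows by q.
        a_pred, p_pred = a, p + q
        # Update: blend prediction and observation via the Kalman gain.
        k = p_pred / (p_pred + r)
        a = a_pred + k * (obs - a_pred)
        p = (1 - k) * p_pred
        states.append(a)
        variances.append(p)
    return np.array(states), np.array(variances)

filtered, var = kalman_filter(np.array([1.0, 1.2, 0.9, 1.1]))
```

The gain `k` is exactly the "how much do I trust this observation" dial: near 1 when the state is uncertain, near 0 when observations are noisy relative to the state.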

Read on LinkedIn

Shows how the smoother uses the full dataset to refine past state estimates, revealing clearer trends and breaks.
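A sketch of the idea, under the same local-level assumptions as before (illustrative `q`, `r`, and data — not the post's code): filter forward, then run a backward pass so every state estimate uses the full dataset.

```python
import numpy as np

def kalman_smooth(y, q=0.1, r=1.0):
    """Forward Kalman filter followed by a backward (RTS-style) smoothing
    pass for a local level model, so past states are refined with all data."""
    n = len(y)
    a_f = np.zeros(n); p_f = np.zeros(n)   # filtered mean / variance
    a_p = np.zeros(n); p_p = np.zeros(n)   # one-step predicted mean / variance
    a, p = 0.0, 1e6
    for t in range(n):
        a_p[t], p_p[t] = a, p + q          # predict
        k = p_p[t] / (p_p[t] + r)          # update
        a = a_p[t] + k * (y[t] - a_p[t])
        p = (1 - k) * p_p[t]
        a_f[t], p_f[t] = a, p
    a_s = a_f.copy()
    for t in range(n - 2, -1, -1):         # backward pass
        g = p_f[t] / p_p[t + 1]            # smoother gain (transition = 1)
        a_s[t] = a_f[t] + g * (a_s[t + 1] - a_p[t + 1])
    return a_s

smoothed = kalman_smooth(np.array([1.0, 1.2, 0.9, 1.1, 1.0]))
```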

Read on LinkedIn

Demonstrates how the filter naturally handles gaps by propagating the model forward, with uncertainty widening until data resumes.
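The gap-handling mechanism is a one-line change to the filter: skip the update when an observation is missing. A minimal sketch under the usual local-level assumptions (not the post's code; parameters and data are illustrative):

```python
import numpy as np

def filter_with_gaps(y, q=0.1, r=1.0):
    """Local-level Kalman filter that skips the update step for missing
    observations (NaN): the prediction is carried forward and the state
    variance keeps growing by q until data resumes."""
    a, p = 0.0, 1e6
    means, variances = [], []
    for obs in y:
        p = p + q                       # predict (always)
        if not np.isnan(obs):           # update only when data exists
            k = p / (p + r)
            a = a + k * (obs - a)
            p = (1 - k) * p
        means.append(a)
        variances.append(p)
    return np.array(means), np.array(variances)

m, v = filter_with_gaps(np.array([1.0, np.nan, np.nan, 1.2]))
```

Through the gap the mean stays flat while the variance widens each step — exactly the "uncertainty fanning out" behavior described above.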
Week 7 · Forecasting with State Space Models
September 24, 2025 | Read on LinkedIn

Covers multi-step forecasting by treating future points as missing, inspecting component-wise forecasts and uncertainty.
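The "future points as missing" trick can be sketched directly: filter the observed sample, then keep running prediction steps with no updates. Illustrative local-level code, not the post's own (`q`, `r`, and data are assumptions):

```python
import numpy as np

def forecast(y, steps, q=0.1, r=1.0):
    """Multi-step forecasting by treating future points as missing:
    filter the data, then predict without updating. Returns forecast
    means and their (growing) variances."""
    a, p = 0.0, 1e6
    for obs in y:                  # filter the observed sample
        p = p + q
        k = p / (p + r)
        a = a + k * (obs - a)
        p = (1 - k) * p
    means, variances = [], []
    for _ in range(steps):         # future = missing: predict only
        p = p + q
        means.append(a)
        variances.append(p + r)    # forecast variance includes obs noise
    return np.array(means), np.array(variances)

means, variances = forecast(np.array([1.0, 1.1, 0.9]), steps=3)
```

For a random-walk level the point forecast is flat while the uncertainty band widens with the horizon — the component-wise picture the post inspects.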

Read on LinkedIn

Explains how to include covariates as deterministic regressors (state variance set to zero), making the betas interpretable, with standard errors identical to those from recursive least squares.
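A hedged sketch of that equivalence for a single covariate (illustrative, not the post's code): with zero state noise, the Kalman update for the coefficient reduces to recursive least squares, and the final state variance yields beta's standard error.

```python
import numpy as np

def recursive_least_squares(x, y, r=1.0):
    """Treat the regression coefficient beta as a state with zero state
    variance: the Kalman recursion then IS recursive least squares."""
    beta, p = 0.0, 1e6             # diffuse prior on beta
    for xt, yt in zip(x, y):
        # No state noise, so the predict step leaves beta and p unchanged.
        f = xt * p * xt + r        # prediction error variance
        k = p * xt / f             # gain
        beta = beta + k * (yt - xt * beta)
        p = (1 - k * xt) * p
    return beta, np.sqrt(p)        # estimate and its standard error

x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x                        # noise-free toy data: true beta = 2
beta, se = recursive_least_squares(x, y)
```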

Read on LinkedIn

Defines the three residual types and why one-step prediction errors are the right diagnostic for model adequacy (ACF flatness, zero mean, homoskedasticity).
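The ACF-flatness check is easy to sketch. This is an illustrative diagnostic helper, not the post's code; the white-noise series stands in for a well-behaved model's one-step prediction errors.

```python
import numpy as np

def acf(residuals, max_lag=10):
    """Sample autocorrelation of one-step prediction errors. For an
    adequate model the ACF should be near zero at every lag, with the
    errors also zero-mean and homoskedastic."""
    e = residuals - residuals.mean()
    denom = (e ** 2).sum()
    return np.array([(e[lag:] * e[:-lag]).sum() / denom
                     for lag in range(1, max_lag + 1)])

rng = np.random.default_rng(0)
errors = rng.normal(size=1000)     # stand-in for one-step prediction errors
rho = acf(errors)
```

Any lag whose autocorrelation clearly exceeds roughly ±2/√n signals structure the model failed to capture.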

Read on LinkedIn

Connects prediction errors and their variance to the likelihood; shows how maximizing likelihood tunes parameters (e.g., signal-to-noise ratio) for the best fit.
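The prediction-error decomposition can be sketched for the local level model: the filter's one-step errors v_t and their variances f_t assemble the Gaussian log-likelihood, which an optimizer would then maximize over q and r. Illustrative code, not the post's own; the initialization and simulated data are assumptions.

```python
import numpy as np

def log_likelihood(y, q, r):
    """Gaussian log-likelihood of a local level model via the prediction
    error decomposition: sum over t of -0.5*(log(2*pi*f_t) + v_t^2/f_t)."""
    a, p = y[0], r                 # simple initialization at the first point
    ll = 0.0
    for obs in y[1:]:
        p = p + q                  # predict
        v, f = obs - a, p + r      # one-step error and its variance
        ll += -0.5 * (np.log(2 * np.pi * f) + v ** 2 / f)
        k = p / f                  # update
        a = a + k * v
        p = (1 - k) * p
    return ll

rng = np.random.default_rng(0)
level = np.cumsum(rng.normal(0.0, np.sqrt(0.1), 100))
y = level + rng.normal(0.0, 1.0, 100)
```

Evaluating this at different (q, r) pairs is how maximum likelihood tunes the signal-to-noise ratio: the true-ish parameters score far better than a grossly mismatched pair.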

Read on LinkedIn

The Kalman filter’s closed-form beauty relies on linear and Gaussian assumptions; the Oxford–Cambridge Boat Race reveals what happens when those assumptions break.

Missing a post? Let me know and I’ll add it.