Service Reliability Measurement using Oyster Data


Service Reliability Measurement using Oyster Data - A Framework for the London Underground
David L. Uniman, MIT / TfL
January 29

Introduction
Research Objective: To develop a framework for quantifying reliability from the perspective of passengers, using Oyster data, that is useful for improving service quality on the Underground.
How reliable is the Underground?
- How do we think about reliability?
- How do we quantify it?
- How do we understand its causes?
- How do we improve it?

Background - What is Service Reliability?
Reliability means the degree of predictability of service attributes, including comfort, safety, and especially travel times. Passengers are concerned with average travel times, but also with the certainty of on-time arrival.
[Figure: distribution of trips by travel time, showing the probability of late arrival when only an average-travel-time buffer is allowed between the departure time and the desired arrival time. Adapted from Abkowitz (1978).]

Framework - Reliability Buffer Time Metric
Criteria for a reliability measure:
- Representative of the passenger experience
- Straightforward to estimate and interpret
- Usefulness and applicability compatible with the JTM (Journey Time Metric)
Propose the following measure: Reliability Buffer Time (RBT) Metric - the amount of time above the typical duration of a journey required to arrive on-time at one's destination with 95% certainty.
RBT = (95th percentile - 5th percentile) travel time, estimated per O-D pair, AM Peak, LUL Period, with a sample size of at least 20 journeys.
[Figure: histogram of observed travel times (actual distribution) with the 5th and 95th percentiles marked; the RBT is the gap between them.]
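As a concrete illustration of the calculation, the sketch below computes the RBT for one O-D pair and time band as the 95th minus the 5th percentile of observed journey times. This is only an illustrative sketch, not TfL's implementation; the function name, the use of numpy, and the 20-journey minimum are assumptions made here.

```python
import numpy as np

def reliability_buffer_time(travel_times_min, min_sample=20):
    """Reliability Buffer Time (RBT) for one O-D pair / time band / period.

    travel_times_min: observed Oyster journey times in minutes (gate-to-gate)
    for a single O-D pair, e.g. AM Peak journeys over one LUL four-week Period.

    Returns the 95th minus the 5th percentile travel time, or None if the
    sample is too small to give a stable percentile estimate.
    """
    times = np.asarray(list(travel_times_min), dtype=float)
    if times.size < min_sample:
        return None  # too few journeys for a reliable 95th percentile
    p05, p95 = np.percentile(times, [5, 95])
    return p95 - p05

# Hypothetical example: journey times (minutes) for one O-D pair in the AM Peak.
sample = [9, 10, 10, 11, 11, 12, 12, 12, 13, 13, 14, 14, 15, 15, 16, 17, 18, 20, 25, 31]
print(reliability_buffer_time(sample))  # RBT in minutes
```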

Framework - Separating Causes of Unreliability
Two types of factors influence reliability and affect the applicability and usefulness of the measure:
1. Chan (2007) found evidence for the effects of service characteristics on travel time variability, which has implications for how the measure is aggregated (e.g. a Line-level measure).
2. In this study, reliability was observed to be sensitive to the performance of a few (3-4) days each Period, which showed large and non-recurring delays, believed to be Incident-related.
[Figures: daily 95th percentile travel time, Waterloo to Piccadilly Circus (Bakerloo NB), weekdays in February, AM Peak; and a comparison of normalized travel time distributions (% of journeys vs. travel time in minutes) for February 5th and February 14th.]

Framework - Classification of Performance
Propose to classify performance into two categories along two dimensions: degree of recurrence and magnitude of delays. Relate these to reliability factors and to strategies for addressing them.
Recurrent Reliability
- Day-to-day (systematic) performance
- Includes the effects of service characteristics and other repeatable factors (e.g. demand)
- Can be considered the Underground's potential reliability under typical conditions
Incident-Related Reliability
- Unpredictable or random (unsystematic) delays
- Unreliability caused by severe disruptions, additional to inherent levels of travel time variation
- Together with performance under typical conditions, makes up the actual passenger experience
Methodology: use a classification approach based on a stepwise regression to automate the process.

Framework - Classification of Performance
Bakerloo Line example: Waterloo to Piccadilly Circus, AM Peak, Feb. 2007.
[Figure: daily 95th percentile travel times (weekdays, 4 Feb - 2 Mar) and the actual travel time distribution decomposed into recurrent and incident-related distributions.]
T.T. Actual = P[No Disruption] × T.T. Recurrent + P[Disruption] × T.T. Incident-related
P[No Disruption] = 17/20 days = 85%; P[Disruption] = 3/20 days = 15%
RBT Actual = 4 min; RBT Recurrent = 3 min; RBT Incident-related = 1 min
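The study automates the recurrent/incident split with a stepwise regression; the sketch below substitutes a much simpler outlier rule on the daily 95th percentile (an assumption made here, not the thesis method) just to show how the actual RBT, the recurrent RBT, and the probability of disruption could be extracted from day-by-day Oyster samples.

```python
import numpy as np

def split_recurrent_incident(daily_times, outlier_factor=1.5):
    """Split a Period of daily journey-time samples into recurrent and incident days.

    daily_times: dict mapping date -> array of journey times (minutes) for one
    O-D pair and time band on that day.

    A day is flagged as incident-related when its 95th percentile travel time
    exceeds the median of the daily 95th percentiles by more than
    `outlier_factor` times; a crude stand-in for the stepwise regression.
    """
    daily_p95 = {d: np.percentile(t, 95) for d, t in daily_times.items()}
    typical_p95 = np.median(list(daily_p95.values()))
    incident_days = {d for d, p in daily_p95.items() if p > outlier_factor * typical_p95}
    recurrent_days = set(daily_times) - incident_days
    return recurrent_days, incident_days

def rbt(times):
    """RBT = 95th minus 5th percentile travel time."""
    p05, p95 = np.percentile(np.asarray(times, dtype=float), [5, 95])
    return p95 - p05

def decompose_rbt(daily_times):
    """Return (RBT actual, RBT recurrent, P[disruption]) for one Period."""
    recurrent_days, incident_days = split_recurrent_incident(daily_times)
    all_times = np.concatenate([np.asarray(t, float) for t in daily_times.values()])
    recurrent_times = np.concatenate(
        [np.asarray(daily_times[d], float) for d in recurrent_days])
    p_disruption = len(incident_days) / len(daily_times)
    return rbt(all_times), rbt(recurrent_times), p_disruption
```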

Framework - Validation with Incident Log Data
Validation of non-recurrent performance against Incident Log data (from the NACHs system) confirmed Incident-related disruptions as the primary cause.
[Figures: journey times by departure time and travel time distributions, Brixton to Oxford Circus (Victoria NB), AM Peak, for February 7 and February 14.]
Incident log extracts:
Date | Cause | Result | Indicative NACHs | Incident Start | Incident End
February 7 | Fleet | Train Withdrawal | 2.6124 | 7:2am | 8:2am
February 7 | Customer - PEA | Train Delay | 3.5756 | 9:16am | 9:19am
February 14 | Customer - PEA | Train Delay | 9.883 | 8:3am | 8:6am
February 14 | Signals | Train Delay | 47.843 | 9:5am | 9:16am
February 14 | Signals | Partial Line Suspension | 45.473 | 9:57am | 11:19am

Excess Reliability Buffer Time Metric
Using these two performance categories, we can extend the reliability measure by comparing actual performance with a baseline value.
Propose the following: Excess Reliability Buffer Time (ERBT) Metric - the amount of buffer time that passengers need to allow to arrive on-time with 95% certainty, in excess of what it would have been under disruption-free conditions.
ERBT = RBT Actual - RBT Recurrent, estimated per O-D pair, AM Peak, LUL Period, sample size of at least 20, with a cumulative baseline.
[Figure: actual (I) and recurrent (II) travel time distributions; the gap between the 95th percentiles of (I) and (II) is the ERBT, and the journeys falling between them are excess unreliable journeys.]
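With the actual and recurrent RBTs in hand, the ERBT is a simple difference; the snippet below just restates the formula, using the Bakerloo example values from the earlier slide.

```python
def excess_rbt(rbt_actual_min, rbt_recurrent_min):
    """Excess Reliability Buffer Time: extra buffer attributable to disruptions."""
    return rbt_actual_min - rbt_recurrent_min

# Bakerloo example (Waterloo to Piccadilly Circus, AM Peak, Feb. 2007):
# RBT Actual = 4 min, RBT Recurrent = 3 min -> ERBT = 1 min
print(excess_rbt(4, 3))  # 1
```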

Application #1 - JTM Reliability Addition
Use the measure for routine monitoring and evaluation of service quality; propose to include it within the JTM as an additional component.
The RBT's form is compatible with JTM units (min, pax-min), aggregation (Period, AM Peak, O-D), and estimation (Actual, Scheduled, Excess & Weighted).
JTM components: TPT, AEI, PWT, IVTT, C&I, plus the proposed RBT.
Apply the RBT measure to the Victoria Line, AM Peak, Feb. & Nov. 2007.
[Chart: actual RBT of 8.55 min (February) and 7.1 min (November), each split into baseline (B_RBT) and excess (E_RBT) components, shown against a median travel time of 16.71 min.]

Application #1 - JTM Reliability Addition
Actual and weighted RBT value estimation; contribution to service quality (i.e. perceived performance).
[Chart: comparison of actual Reliability Buffer Time and median journey time, Victoria Line, AM Peak, Feb/Nov 2007: RBT of 8.55 min (February) and 7.1 min (November) against a median travel time of 16.71 min.]
Compare the contribution of RBT to the other JTM components through VOT weights.
[Pie charts, Feb. 2007: shares of perceived journey time by component (AEI, PWT, OTT, CLRS, RBT) under unweighted proportions, unweighted JTM proportions (VOT of RBT = 1.0), and weighted JTM proportions (VOT of RBT = 0.6); totals may not sum to 100% due to rounding.]

Application #2 - Journey Planner Reliability Addition
Better information reduces uncertainty by closing the gap between expectations and reality, improving the reliability of the service as experienced. Propose more COMPLETE information through the Journey Planner.
Simple example: David's morning commute, Bow Road to St. James's Park.
- Probability of on-time arrival using the Journey Planner time = 0.2
- Access/egress and wait times must be guessed: around 17% of the trip
- The Journey Planner time must be increased by a FACTOR of 1.64 to be 95% reliable
[Figure: cumulative probability of Oyster travel times, with the Journey Planner time, the Oyster median, and the Oyster 95th percentile marked. Courtesy of Transport for London. Used with permission.]
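The on-time probability and the 1.64 factor compare the Journey Planner's predicted time with the distribution of actual Oyster journey times; the sketch below shows that comparison. The journey times and the 25-minute planner estimate are made-up values for illustration only, not data from the study.

```python
import numpy as np

def journey_planner_reliability(oyster_times_min, planner_time_min):
    """Compare a Journey Planner prediction against observed Oyster journey times.

    Returns:
      on_time_prob - share of observed journeys completed within the planner time
      factor_95    - multiplier needed on the planner time to cover the Oyster
                     95th percentile (i.e. to be "95% reliable")
    """
    times = np.asarray(list(oyster_times_min), dtype=float)
    on_time_prob = np.mean(times <= planner_time_min)
    factor_95 = np.percentile(times, 95) / planner_time_min
    return on_time_prob, factor_95

# Hypothetical inputs: gate-to-gate Oyster times (min) and a planner estimate of 25 min.
oyster = [24, 26, 27, 28, 28, 29, 30, 31, 32, 33, 35, 38, 41]
prob, factor = journey_planner_reliability(oyster, planner_time_min=25)
print(round(prob, 2), round(factor, 2))
```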

Application #2 - Journey Planner Reliability Addition
Assessment: Journey Planner information is INCOMPLETE in two ways:
1. The Journey Planner consistently underpredicts Oyster journey times, possibly because AET & PWT (access/egress and platform wait times) are missing, leaving around 30-50% of the journey to chance.
2. Expected journey times are not helpful for passengers concerned with on-time arrival (e.g. commuters).
[Charts: difference between Oyster (median and 95th percentile) travel times and Journey Planner times, and the probability of on-time arrival using the Journey Planner time, for the 50 largest O-D pairs, AM Peak, Nov. 2007.]

Application #2 - Journey Planner Reliability Addition
Possible representation of the new journey information. Simple example: David's morning commute, Bow Road to St. James's Park.
Chose to | Route | Depart | Expected Arrival | Latest Arrival | Duration (up to) | Interchanges
Depart at | 1 | 8:27 | 8:57 | 9:11 | 0:30 (0:41) | View
Arrive by | 2 | 8:19 | 8:49 | 9:00 | 0:30 (0:41) | View
Courtesy of Transport for London. Used with permission.

Conclusions & Recommendations
1. Reliability is an important part of service quality, alongside average performance, and should be accounted for explicitly.
→ Monitor and evaluate reliability through a JTM extension.
2. Incidents have a large impact on service quality through unreliability, which may be underestimated by a focus on average performance.
→ Use Oyster data and the reliability framework to improve the monitoring and understanding provided through NACHs (measurement vs. estimate).
3. The impacts of unreliability on passengers can be mitigated through better information.
→ Update travel information and include a reliability alternative in the Journey Planner.

Conclusions & Recommendations
4. In order to manage performance, we need to be able to measure it first.
→ The framework contributes to making this possible, and sheds light on some of the broader questions.
How reliable is the Underground?
- How do we think about reliability?
- How do we quantify it?
- How do we understand its causes?
- How do we improve it?

Thank You
Special thanks to the people at TfL and LUL who made this research possible and a memorable experience.

MIT OpenCourseWare
http://ocw.mit.edu
1.258J / 11.541J / ESD.226J Public Transportation Systems, Spring 2010
For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms