Published in: BMC Medical Informatics and Decision Making 1/2006

Open Access 01-12-2006 | Research article

Accounting for seasonal patterns in syndromic surveillance data for outbreak detection

Authors: Tom Burr, Todd Graves, Richard Klamann, Sarah Michalak, Richard Picard, Nicolas Hengartner


Abstract

Background

Syndromic surveillance (SS) can potentially contribute to outbreak detection capability by providing timely, novel data sources. One SS challenge is that some syndrome counts vary with season in a manner that is not identical from year to year.
Our goal is to evaluate the impact of inconsistent seasonal effects on performance assessments (false and true positive rates) in the context of detecting anomalous counts in data that exhibit seasonal variation.

Methods

To evaluate the impact of inconsistent seasonal effects, we injected synthetic outbreaks into real data and into data simulated from each of two models fit to the same real data. Using real respiratory syndrome counts collected in an emergency department from 2/1/94–5/31/03, we varied the length of training data from one to eight years, applied a sequential test to the forecast errors arising from each of eight forecasting methods, and evaluated their detection probabilities (DP) on the basis of 1000 injected synthetic outbreaks. We did the same for each of two corresponding simulated data sets. The less realistic, nonhierarchical model's simulated data set assumed that "one season fits all," meaning that each year's seasonal peak has the same onset, duration, and magnitude. The more realistic simulated data set used a hierarchical model to capture violation of the "one season fits all" assumption.
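The pipeline described above (a forecaster whose one-step-ahead errors feed a sequential test, with detection probability estimated from injected synthetic outbreaks) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the moving-average window, the CUSUM parameters `k` and `h`, and the function names are assumptions chosen for clarity.

```python
import numpy as np

def moving_average_forecast(counts, window=28):
    """One-step-ahead forecast: trailing moving average of the last `window` counts.
    The first `window` entries have no forecast and are left as NaN."""
    counts = np.asarray(counts, dtype=float)
    preds = np.full(counts.shape, np.nan)
    for t in range(window, len(counts)):
        preds[t] = counts[t - window:t].mean()
    return preds

def cusum_alarms(errors, k=0.5, h=4.0):
    """One-sided CUSUM-style sequential test on (standardized) forecast errors.
    `k` is the reference value, `h` the decision threshold; returns a boolean
    alarm indicator for each time step."""
    s = 0.0
    alarms = np.zeros(len(errors), dtype=bool)
    for t, e in enumerate(errors):
        s = max(0.0, s + e - k)   # accumulate positive drift, reset at zero
        alarms[t] = s > h
    return alarms
```

In this framing, a detection-probability estimate is the fraction of injected synthetic outbreaks for which `cusum_alarms` fires during the outbreak period, after standardizing the forecast errors by their training-period standard deviation.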

Results

This experiment demonstrated an optimistic bias in DP estimates for some of the methods when data simulated from the nonhierarchical model were used for DP estimation, suggesting that at least for some real data sets and methods, it is not adequate to assume that "one season fits all."

Conclusion

For the data we analyze, the "one season fits all" assumption is violated, and DP performance claims based on simulated data that assume "one season fits all" tend to be optimistic for the forecast methods considered, except for moving average methods. Moving average methods based on relatively short amounts of training data are competitive on all three data sets, but are particularly competitive on the real data and on data from the hierarchical model, which are the two data sets that violate the "one season fits all" assumption.
Metadata
Publisher
BioMed Central
Electronic ISSN: 1472-6947
DOI
https://doi.org/10.1186/1472-6947-6-40
