An Evaluation of the Influenza Hospitalization Surveillance Network
===================================================================

W. Jon Windsor, Colorado School of Public Health

Rodney E. Rohde, Texas State University

**Address for Correspondence:** W. Jon Windsor
, Colorado School of Public Health, wjwindsor58{at}yahoo.com

## ABSTRACT

The Influenza Hospitalization Surveillance Network (IHSN or FluSurv-NET) was evaluated using the Centers for Disease Control and Prevention's (CDC) guidelines for evaluating a public health surveillance system. The IHSN was evaluated for usefulness, simplicity, flexibility, data quality, acceptability, sensitivity, positive predictive value (PPV), representativeness, timeliness, and stability. The IHSN was found to use a broad range of sources for influenza surveillance that can be openly accessed via the CDC's "FluView" online application. The IHSN is highly adaptable, with the capacity to accommodate additional data sources when needed. The overinclusiveness of different laboratory diagnostic methodologies was found to be detrimental to the overall data quality of the IHSN, in the form of variable sensitivity and PPV measures among the CDC's acceptable testing methods. Overall, the IHSN is a very robust system that allows public health officials timely access to influenza data. However, the inclusivity of the IHSN causes it to fall short on consistency in data collection practices, and it fails to take into account several factors that could artificially increase or decrease case counts. We recommend the IHSN integrate a more streamlined and reliable data collection process and standardize its expectations across all of its reporting sites.

ABBREVIATIONS:

* CDC - Centers for Disease Control and Prevention
* DFA - direct fluorescent antibody
* DOB - date of birth
* EIP - Emerging Infections Program
* FDA - Food and Drug Administration
* FN - false negative
* FP - false positive
* ID - identification
* IFA - indirect fluorescent antibody
* IHSN - Influenza Hospitalization Surveillance Network
* NCHS - National Center for Health Statistics
* PPV - positive predictive value
* RIDT - rapid influenza diagnostic test
* RT-PCR - reverse transcription-polymerase chain reaction
* TN - true negative
* TP - true positive
* WHO - World Health Organization

INDEX TERMS:

* influenza, human
* public health surveillance
* evaluation studies as topic

## STAKEHOLDERS

The stakeholders of the Influenza Hospitalization Surveillance Network (IHSN) include the Emerging Infections Program (EIP) and all its affiliates, the United States Centers for Disease Control and Prevention (CDC), the World Health Organization (WHO), local and state health departments, educators, healthcare officials, and the public.

## SYSTEM DESCRIPTION

### Importance

Annually, influenza disseminates worldwide, causing widespread illness and, in severe cases, death. In the United States, laboratory-confirmed influenza-associated hospitalizations reached approximately 65 cases per 100,000 persons in the 2014–15 season, 30 in 2015–16, 60 in 2016–17, and 102 in 2017–18.1 Influenza-associated hospitalization cases are organized by age, underlying medical conditions, virus subtype, and cumulative/weekly rates.1,2 Severity is indexed by accumulating influenza-associated hospitalization case counts and calculating cumulative and weekly (unadjusted) incidence rates, using population estimates from the National Center for Health Statistics (NCHS) to estimate hospitalization rates in the United States.1 The burden of influenza infection also results in time away from work and other societal obligations.
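To make the rate calculation concrete, the following is a minimal sketch of how cumulative and weekly unadjusted rates per 100,000 persons would be derived; the case counts and catchment population below are hypothetical placeholders, not actual IHSN case data or NCHS estimates.

```python
def rate_per_100k(case_count: int, population: int) -> float:
    """Unadjusted incidence rate per 100,000 persons."""
    return case_count / population * 100_000

# Hypothetical weekly case counts for one surveillance catchment area
weekly_cases = [120, 210, 340, 290]
catchment_population = 2_700_000  # hypothetical NCHS-style population estimate

weekly_rates = [rate_per_100k(c, catchment_population) for c in weekly_cases]
cumulative_rate = rate_per_100k(sum(weekly_cases), catchment_population)

print([round(r, 1) for r in weekly_rates])  # per-week unadjusted rates
print(round(cumulative_rate, 1))            # season-to-date cumulative rate
```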
The economic losses from influenza are considerable, and the cost of influenza-related hospitalization is substantial. A study published in June 2018 estimated the average annual total economic burden of influenza to the healthcare system and society at $11.2 billion, with direct medical costs estimated at $3.2 billion and indirect costs at $8.0 billion.3

Influenza infection can be largely, but not completely, prevented by vaccination. The CDC's 2017–18 influenza season vaccine effectiveness study showed that children between 6 months and 8 years old who were vaccinated had 68% less incidence of influenza (subtype A or B) than those unvaccinated, while the vaccinated elderly population (>65 years old) showed only a 17% reduction compared to those unvaccinated.4 The contents (or viral subtype targets) of influenza vaccines are based on recommendations by the WHO, which carefully analyzes sentinel surveillance of viral genotyping each year.5 Vaccination is the only preventive measure for influenza; there is no cure for the infection beyond physician-prescribed antiviral drugs and basic symptom management. Influenza surveillance benefits the public by outlining the severity of each influenza season in near real time, helping drive public health entities' intervention strategies within the United States.

### Purpose

The purpose of the IHSN within the EIP of the CDC is to conduct population-based surveillance for laboratory-confirmed influenza-associated hospitalizations.5 The objectives of the IHSN are to determine when and where influenza activity is occurring, track influenza-related illness, determine which influenza virus subgroups are circulating, detect influenza virus mutation events, and measure the influence influenza has on hospitalizations and deaths in the US population.4 IHSN-gathered data is used to estimate age-specific hospitalization rates on a weekly basis and to display characteristics of persons hospitalized with influenza. Cases are identified by reviewing hospital laboratory and admission databases and infection control logs for patients hospitalized during the influenza season with a documented positive influenza test (ie, viral culture, direct/indirect fluorescent antibody assay [DFA/IFA], rapid influenza diagnostic test [RIDT], or molecular assays, including reverse transcription-polymerase chain reaction [RT-PCR]).4 There is no legal requirement to submit influenza-associated hospitalization data to the CDC because influenza is not a nationally notifiable disease;7 however, participation is a condition for each participating state to receive funding from the CDC. The IHSN facilitates integration with other systems by aggregating data collected from individual EIP state surveillance systems (Figure 1).

Figure 1. The IHSN data flow from site location to the CDC, where the data is then inputted into FluView for public use.
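The flow in Figure 1 amounts to a weekly roll-up from hospital sites to the state EIP and then to the CDC. The sketch below illustrates that aggregation step only; the site names, counts, and data shapes are hypothetical placeholders, and the actual IHSN submission format is not specified here.

```python
from collections import defaultdict

# (state, hospital site) -> weekly laboratory-confirmed case count;
# all names and counts are hypothetical placeholders
site_reports = {
    ("Colorado", "Hospital A"): 14,
    ("Colorado", "Hospital B"): 9,
    ("Oregon", "Hospital C"): 11,
}

# State EIP level: each state aggregates reports from its participating sites
state_totals = defaultdict(int)
for (state, _site), count in site_reports.items():
    state_totals[state] += count

# CDC level: national aggregate that would feed an application like FluView
national_total = sum(state_totals.values())

print(dict(state_totals))  # {'Colorado': 23, 'Oregon': 11}
print(national_total)      # 34
```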
Additional information from laboratory-confirmed influenza cases provided to the CDC includes patient ID number, surveillance site, hospital admission date, patient DOB, influenza test methodology, and identified influenza subtype (A or B).8

Figure 2. An example of a 2 × 2 table used to calculate sensitivity and PPV. Test 1 is the method of interest and Test 2 is the method used for reference. The sensitivity calculation is TP/(TP + FN). The PPV calculation is TP/(TP + FP).25 Note: FN, false negative; FP, false positive; TN, true negative; TP, true positive.

The IHSN conducts surveillance on the individual populations of the 10 EIP-participating states. Data is collected annually and published weekly, beginning in early October and ending as late as May. Each of the EIP states has designated counties that contribute data to the IHSN.4 Among the 10 states, there are approximately 70 counties whose hospitals contribute data to the IHSN. The IHSN accumulates data from 267 acute care hospitals and laboratories in counties of varying socioeconomic status within the 10 EIP sites. All sites within the EIP are geographically distributed throughout the United States and encompass approximately 27 million people.8

Surveillance officers (usually through EIP-participating public health departments) are trained to collect laboratory-confirmed influenza cases from laboratory logs, infection control practitioner logs, weekly calls to data collection sites (hospitals), or (depending on the state) state-reportable condition logs.6 Data is then compiled and sent on a weekly basis to the CDC for analysis and eventual input into the FluView application.1,2 Patient information is recorded with each case in all EIP-participating states because, in contrast to the CDC's notifiable conditions, laboratory-confirmed influenza (subtype A) is a reportable condition in all EIP states (Table 1), and that same information is required for use at the CDC (Figure 1). However, unique patient information (name, date of birth [DOB], patient identification [ID] number) is encrypted and securely sent; it is not published in weekly surveillance reports, nor is it inputted into the FluView application.

Table 1. The 10 EIP reporting sites and their varying requirements for influenza reporting. "Influenza reportable?" indicates whether influenza is required to be reported to the state health department. "Reporting window" indicates the state-allowable timeframe for reporting before a penalty is incurred. "Isolate sent?" indicates whether laboratories that identified a positive case of influenza are required to send a specimen to the state health department for confirmation testing.11-20

### Resources Used

The IHSN is primarily financed by core funding for operation and personnel training provided to the EIP by the CDC.8,9

## EVALUATION DESIGN

The overall purpose is to evaluate the performance of the IHSN (FluSurv-NET) by assessing the reliability of its surveillance of laboratory-confirmed influenza-related hospitalizations in the United States. The previously mentioned stakeholders can take the evaluation under consideration and use it to drive improvement or to reinforce the IHSN's strengths.
Information gathered by the evaluation can be used to highlight noted strengths and weaknesses of the IHSN and to improve overall quality assurance of data collection. An evaluation of the IHSN will consider whether the data collection methods require improvement, determine the efficiency of case report flow, identify any discrepancies between the 10 EIP-participating sites, and determine any implications of variable state-level data accumulation. The IHSN will be assessed by determining its overall usefulness for detecting trends and associations of influenza occurrences and how they can be used to prompt further research and prevention efforts. The IHSN will also be assessed by investigating each individual system attribute and its contribution to the overall performance of the IHSN. System attributes will include simplicity (structure and ease of operation), flexibility (adaptability to evolving information and public needs), data quality (validity of gathered data), acceptability (participation rate of EIP states), sensitivity (ability to identify cases and monitor changes), positive predictive value (PPV) (confidence that reported cases are "actual" cases), representativeness (accuracy of influenza occurrence and population distribution), timeliness (turnaround time between data collection steps), and stability (overall reliability of the IHSN).

## CREDIBLE EVIDENCE

### Usefulness

Through the FluView Interactive application, the IHSN uses laboratory, hospital admission database, and infection control logs to capture hospitalized cases with a documented positive influenza test result during the regular influenza season.1,2 This is a comprehensive approach to accumulating data. The IHSN addresses the variability of testing methods by outlining acceptable Food and Drug Administration (FDA)-cleared or Clinical Laboratory Improvement Amendments-waived influenza testing methods, which include, but are not limited to, viral culture, DFA/IFA, RIDT, and nucleic acid-detecting molecular assays.2

## SYSTEM ATTRIBUTES

### Simplicity

The FluView application allows for real-time data access and can differentiate cumulative rates based on age group, EIP state, and influenza season. Data is gathered through weekly reports to the CDC Influenza Division by each EIP-participating state (Figure 1). The 10 states participating in the EIP that contribute data to the IHSN FluView application are California, Colorado, Connecticut, Georgia, Maryland, Minnesota, New Mexico, New York, Oregon, and Tennessee. Georgia, Maryland, and Tennessee only require that influenza subtype A be reported to the state health department. All other states require all hospital-confirmed influenza cases (subtypes A and B) to be reported to their state health department authorities.11-20

### Flexibility

Influenza can undergo *antigenic drift*, mutational changes to its varying subtypes. Because of antigenic drift, previous vaccination targets (subtypes) become less effective at preventing infection in the population, making influenza difficult to control each year.21 Considering the unpredictable nature of influenza, the IHSN has a high degree of flexibility between influenza seasons. The IHSN can adjust to each influenza season by adding additional reporting sites outside of the EIP states (sites);6 the 2009–10 H1N1 pandemic prompted this change in the IHSN's surveillance capacity. Additionally, the IHSN can also remove sites as needed.
This has the potential to compromise the longitudinal validity of data gathering and analysis. Each EIP-participating state has its own unique criteria for reportable conditions (Table 1), which can also compromise the validity of IHSN data. However, aggregation of data at the CDC level is simplified because of the CDC's strict criteria for each case report (Figure 1).8

### Data Quality

Consistent surveillance officer training at EIP sites mitigates variability in the data accumulation process at the state level. The IHSN uses NCHS data to form the population estimates used when calculating weekly and cumulative influenza-associated hospitalization rates.1 However, each test method outlined within the CDC's "Information for Clinicians on Influenza Virus Testing" has variable sensitivity and PPV measures (Table 2).22 This variability has the potential to compromise the overall reliability of the rate calculations used in the FluView application through underreporting caused by inaccurate test results (false negatives).

Table 2. Comparison of the turnaround times (test time), methodologies, analytical sensitivities, and positive predictive values (separated by influenza A and B subtypes) of 6 test methods randomly selected from the CDC's "Available FDA-Cleared Rapid Influenza Diagnostic Tests"22 and "FDA-Cleared Nucleic Acid Detection Based Tests for Influenza Viruses"24 tables found on the CDC website. Sensitivity and positive predictive values for each test were calculated individually using package insert clinical study data for each methodology.26-31

### Acceptability

For the IHSN EIP sites to receive funding from the CDC, they are required to comply with basic reporting standards for the CDC's national notifiable conditions. Having trained surveillance officers collect relevant information (and paying them to do so) allows EIP sites to participate in the IHSN and ensures that as much data as possible is provided. Apart from 3 participating sites (Table 1), laboratory-confirmed influenza (A and B subtypes) is a state-reportable condition, ensuring compliance at the site level. Failure to report a "reportable" or "notifiable" condition subjects a hospital or physician office to potential revocation of individual medical licenses or of the operating licenses of the institutions (hospitals) at fault.23

### Sensitivity and Positive Predictive Value

Table 2 includes a compilation of 3 tests each selected from the "Available FDA-Cleared Rapid Influenza Diagnostic Tests (Antigen Detection Only)" and the "FDA-Cleared Nucleic Acid Detection Based Tests for Influenza Viruses" pages on the CDC's website,22,24 along with the sensitivity/PPV calculations for each test. Test selections were made by numbering each test in each table and submitting the numbers to a random number generator. Calculations were performed using "nasopharyngeal swab" sample data. The clinical sensitivity of all 3 nucleic acid testing methodologies ranges from 90% to 100%, while the antigen detection methods range from approximately 84% to 97% for influenza subtype A. For nucleic acid testing methods, the confidence that a detected positive is truly positive (PPV) is almost universally 100%, whereas antigen detection tests showed only approximately 75%–93% confidence in positive values for influenza subtype A.
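The Table 2 values follow the Figure 2 definitions, where sensitivity = TP/(TP + FN) and PPV = TP/(TP + FP). A minimal sketch of the calculation, using illustrative 2 × 2 counts rather than any specific package insert's data:

```python
def sensitivity(tp: int, fn: int) -> float:
    """Proportion of true cases the test detects: TP / (TP + FN)."""
    return tp / (tp + fn)

def ppv(tp: int, fp: int) -> float:
    """Confidence that a positive result is a true case: TP / (TP + FP)."""
    return tp / (tp + fp)

# Illustrative 2 x 2 counts (test of interest vs. reference method);
# not taken from any specific package insert
tp, fp, fn, tn = 92, 7, 8, 393

print(f"sensitivity = {sensitivity(tp, fn):.1%}")  # 92.0%
print(f"PPV = {ppv(tp, fp):.1%}")                  # 92.9%
```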
The IHSN is heavily reliant on the accuracy of influenza testing methods at the individual laboratories within the EIP states' participating counties. To address this at the IHSN level, sensitivity and positive predictive values were determined for individual testing methods. There are currently no criteria for confirming positive influenza tests within the IHSN; confirmation testing for positive results is left to the discretion of the EIP-participating states. Table 1 indicates that only 3 EIP-participating state health departments require confirmation testing on all positive influenza tests. The lack of confirmation testing could lead to an inflation of false positive results from methods with a lower positive predictive value. Table 2 outlines the differences in sensitivity and positive predictive values between the 6 selected tests; there is considerable variability in sensitivity and PPV among the different test types.

### Representativeness

The IHSN has a high degree of representativeness in terms of the geographic distribution of counties within the EIP-participating states and of the EIP states themselves. This allows for a stratified approach to IHSN data collection, which helps published data be more generalizable to the rest of the United States. A key challenge is accurate representation of a grossly underreported disease like influenza.32,33 The CDC has struggled for decades to adjust and refine its models for determining epidemic thresholds and seasonal severity because of changes in diagnostic technology, access to diagnostics, and modeling techniques.34-37 It is important to note that population-based estimates of influenza rely on census data, which is itself based on statistical models that have also evolved over the decades. More reported cases may stimulate media coverage, which in turn stimulates patient demand, which in turn prompts healthcare providers to order influenza testing. Because of the increase in molecular influenza testing options, physicians' greater access to testing can lead to overscreening, which can artificially inflate positive influenza cases that may or may not be contributing to patient hospitalizations.38 The IHSN counts all hospitalizations that have a laboratory-confirmed positive influenza test. Artificial inflation of positive cases through overscreening, combined with the IHSN case definition, can lead to a misrepresentation of the population's influenza-associated hospitalization rates. This raises concerning questions about the scientific basis upon which we claim severity: is it based on antigenic shift (ie, a pandemic), or on more accurate statistics for an underreported disease?

### Timeliness

Each EIP IHSN state has variable reporting conditions and timelines for influenza (Table 1). All participating states require laboratory-confirmed influenza cases to be reported to the state health department. The reporting timeframe for influenza in each state ranges from immediate to "within 7 days" (Table 1). The CDC estimates a median 7-day lag from the time a case is identified to when the CDC receives the report for the IHSN.6 It is unclear whether the IHSN inputs influenza cases using the identification date at the laboratory level or the date the CDC received the data.
However, a 7-day lag time between identification and reporting to the CDC is fairly rapid considering the geographical distribution of EIP sites and the frequency of influenza cases.

### Stability

There have been no significant events or available evidence suggesting that the stability of the IHSN and its FluView application has ever been compromised. The IHSN provides weekly updates, and there had been no notable delays in updates as of 2018.

## CONCLUSIONS/RECOMMENDATIONS

The IHSN uses a broad range of sources to identify influenza-associated hospitalization cases. This, combined with a narrow case definition, affords the IHSN the benefit of having reliable sources of data collection.13 The added benefit of each EIP state having at least some degree of required reporting for influenza (Table 1) and near-identical reporting requirements (Figure 1) indicates that some effort has been made to mitigate underreporting from participating EIP states. The FluView application is user-friendly and easily accessed by the public, ensuring widespread use of IHSN-accumulated data.13 The adaptability of the IHSN allows for timely and appropriate reactions to the constant shifts in influenza activity between seasons.

The IHSN data quality can be either effective or ineffective, depending on which data points are being considered. The stability of the IHSN has proven adequate in the past, but vigilance is required to maintain that reliability. Using NCHS data to determine population estimates for each participating county within the EIP states allows for consistent population estimates in rate calculations.12 However, laboratory testing methodologies and individual physician testing behaviors are not universal. Each reporting laboratory uses different testing methodologies that vary in sensitivity and PPV (Table 2). Certain testing methodologies are more reliable than others in terms of sensitivity: methodologies with lower sensitivity can artificially decrease case counts, while testing platforms with a lower PPV can artificially increase case counts. All of this can potentially confound site-specific data and lead to inaccurate predictions or comparisons when used for research. Lower rates in certain areas could be a product of less accurate testing methods (eg, RIDT) and not an accurate reflection of the status of influenza in that area. Molecular testing has proven to be one of the most reliable methods of identifying influenza.4 By incentivizing hospital laboratories to adopt more molecular testing for influenza identification, the IHSN can ensure a higher degree of accuracy in its data sources. Furthermore, state health departments can address artificial increases in case counts by implementing more confirmation testing on positive influenza samples from methods that do not exceed a certain PPV threshold.

The IHSN ensures EIP state participation by making weekly influenza case reporting a condition for the receipt of funding from the CDC.26 This further diminishes the likelihood of cases not being reported to the state health departments for IHSN use. Population-specific socioeconomic status and demographics are well represented in the IHSN dataset.
This is because of the wide geographic distribution of participating counties and EIP states.1,2 However, the IHSN fails to take into account individual hospital policies on screening patients for influenza, which are made possible by the increasing number of affordable influenza testing methods on the market.38 Policies that favor overscreening can artificially increase case counts, deteriorating the quality of IHSN rate estimates. This can potentially be addressed by narrowing the case definition so that laboratory-confirmed influenza-associated hospitalizations encompass only hospitalizations that are a result of influenza. Each EIP state has varying reporting timeframes for influenza, which can result in reporting delays and lower weekly case counts. This can be addressed by proposing a more universal reporting timeframe among the EIP states. Even so, the IHSN is still able to provide weekly updates to the FluView application, which is fairly rapid considering the scope of the IHSN (Table 1). The variability of influenza each year requires that the United States be vigilant in evaluating and improving influenza-associated hospitalization surveillance to adapt to the ever-changing severity, morbidity, and mortality of influenza.

## LESSONS LEARNED

Overall, the IHSN provides a fairly reliable data source when considering its flexibility, usefulness, and timeliness. The IHSN's ability to add states to its data pool based on need makes it highly adaptable to the unpredictability of the influenza virus, but at the cost of introducing more variability into its dataset. IHSN data can be used to establish incidence rates and trends over time. The FluView application that uses IHSN data can stratify data based on age, underlying conditions, and viral subtype to help determine measures of association during each influenza season. Data is updated on a weekly basis, allowing analysts and public health officials to implement control and prevention measures in a timely manner. The IHSN is extremely stable and experiences little to no (noticeable) system outages.

The IHSN data collection process requires a more streamlined and reliable approach. Coupled with a lack of confirmation testing, variability in the clinical sensitivity and positive predictive values of each test method deteriorates the overall reliability of the data. Measures that ensure confirmation testing for positive influenza results obtained by analytically unreliable tests are paramount to enhancing the overall quality of data. The representativeness of IHSN data could be more accurately determined in a future study by comparing the influenza screening policies of individual hospital-based laboratories to differentiate testing volume and potentially rule out overtesting as a source of case inflation. The question remains of how to manage communications in the context of increased accuracy in representing a historically underreported disease like influenza. There are ethical considerations when interpreting data amid continually changing data collection processes and assessment methods, in the context of ongoing vaccine skepticism. On the one hand, we are improving awareness of the importance of influenza as a potentially serious disease for which early treatment can reduce cost of care, morbidity, and mortality. On the other hand, overcalling severity without providing key disclaimers regarding changes made over time to improve surveillance may impair credibility with patients and providers.
## ACKNOWLEDGMENTS

A special thank you to Ian Wallace for providing unique clinical laboratory testing insights and addressing the implications of the variable sensitivities among different testing methods. Thank you, Dr. Dawn Comstock, for providing the education that facilitated this evaluation and for the timely feedback on earlier drafts. Thank you, Dr. James Wilson, for providing invaluable insights into health security intelligence and your unique perspective on the impact surveillance systems can have on driving public health responses. And thank you to Alicia Cronquist, whose willingness to discuss her experience with electronic disease reporting systems provided great direction when discussing the strengths and limitations of the IHSN.

The opinions expressed by authors contributing to this article do not necessarily reflect the opinions of the CDC or the institutions with which the authors are affiliated.

* Received July 10, 2019.
* Accepted September 23, 2019.

American Society for Clinical Laboratory Science

## References

1. FluView: Influenza Hospitalization Surveillance Network. Centers for Disease Control and Prevention. https://gis.cdc.gov/GRASP/Fluview/FluHospRates.html
2. FluView: Influenza Hospitalization Surveillance Network. Centers for Disease Control and Prevention. https://gis.cdc.gov/grasp/fluview/FluHospChars.html
3. Putri WCWS, Muscatello DJ, Stockwell MS, Newall AT. Economic burden of seasonal influenza in the United States. Vaccine. 2018;36(27):3960–3966. doi:10.1016/j.vaccine.2018.05.057
4. Influenza (Flu). Centers for Disease Control and Prevention. October 19, 2018. www.cdc.gov/flu/weekly/overview.htm
5. WHO Consultation and Information Meeting on the Composition of Influenza Virus Vaccines for Use in the 2019 Southern Hemisphere Influenza Season. World Health Organization.
6. Hadler JL, et al. Emerging Infections Program: state health department perspective. Emerg Infect Dis. 2015;21(9). wwwnc.cdc.gov/eid/article/21/9/15-0428_article#r3
7. 2018 National Notifiable Conditions. Centers for Disease Control and Prevention. www.cdc.gov/nndss/conditions/notifiable/2018/
8. Chaves SS, Lynfield R, Lindegren ML, Bresee J, Finelli L. The US Influenza Hospitalization Surveillance Network. Emerg Infect Dis. 2015;21(9):1543–1550. doi:10.3201/eid2109.141912
9. Pinner RW, et al. Cultivation of an adaptive domestic network for surveillance and evaluation of emerging infections. Emerg Infect Dis. 2015;21(9). www.cdc.gov/eid/article/21/9/15-0619_article#r13
10. Division of Preparedness and Emerging Infections (DPEI). Centers for Disease Control and Prevention. October 15, 2018. www.cdc.gov/ncezid/dpei/eip/eip-about.html
11. California Code of Regulations (CCR) 2500, 2593 … Title 17. 2016. www.cdph.ca.gov/Programs/CID/DCDC/CDPHDocumentLibrary/ReportableDiseases.pdf
12. Colorado: Report a Disease. 2018. www.colorado.gov/cdphe/report-a-disease
13. Connecticut Reportable Diseases, Emergency Illnesses and … Connecticut Epidemiologist. 2018. portal.ct.gov/-/media/Departments-and-Agencies/DPH/dph/infectious_diseases/pdf_forms_/ReportableDiseases.pdf?la=en
14. Disease Reporting. Georgia Department of Public Health. 2016. dph.georgia.gov/disease-reporting
15. Maryland Reportable Diseases. Diseases, Conditions, Outbreaks, & Unusual Manifestations Reportable by Maryland Health Care Providers. 2008. phpa.health.maryland.gov/IDEHASharedDocuments/what-to-report/ReportableDisease_HCP.pdf
16. Minnesota: Infectious Disease Reporting. Minnesota Dept. of Health. 2018. www.health.state.mn.us/divs/idepc/dtopics/reportable/disease.html
17. New Mexico: Notifiable Conditions. Notifiable Diseases or Conditions in New Mexico. 2013. nmhealth.org/publication/view/regulation/372/
18. New York Reportable Diseases. New York State Department of Health. 2018. www.health.ny.gov/professionals/diseases/reporting/communicable/
19. Oregon: Reportable Diseases. Oregon Public Health Division Reporting for Laboratories. 2018. www.oregon.gov/oha/ph/DiseasesConditions/CommunicableDisease/ReportingCommunicableDisease/Documents/ReportingPosters/poster-laboratory.pdf
20. Tennessee: Reportable Diseases. 2018 List of Reportable Diseases in Tennessee for Laboratories. 2018. www.tnpcaeducation.org/misc/2018LaboratoryListandGuidance.pdf
21. Influenza. World Health Organization. November 15, 2018. www.who.int/biologicals/vaccines/influenza/en/
22. Influenza (Flu): Diagnosis Table RIDT. Centers for Disease Control and Prevention. February 20, 2018. www.cdc.gov/flu/professionals/diagnosis/table-ridt.html
23. Public Health Professionals Gateway. Centers for Disease Control and Prevention. April 13, 2018. www.cdc.gov/phlp/index.html
24. Influenza (Flu): Nucleic Acid Detection. Centers for Disease Control and Prevention. March 26, 2018. www.cdc.gov/flu/professionals/diagnosis/table-nucleic-acid-detection.html
25. Baron EJ. Flu Season 2011-2012 (Table 2). Cepheid. www.cepheid.com/us/healthcare-impact/emagazine/item/20-flu-season-2011-2012
26. Novak-Weekley SM, Marlowe EM, Poulter M, et al. Evaluation of the Cepheid Xpert Flu Assay for rapid identification and differentiation of influenza A, influenza A 2009 H1N1, and influenza B viruses. J Clin Microbiol. 2012;50(5):1704–1710. doi:10.1128/JCM.06520-11
27. Leber AL, Everhart K, Daly JA, et al. Multicenter evaluation of BioFire FilmArray Respiratory Panel 2 for detection of viruses and bacteria in nasopharyngeal swab samples. J Clin Microbiol. 2018;56(6):e01945-17. doi:10.1128/JCM.01945-17
28. CDC Human Influenza Virus Real-Time RT-PCR Diagnostic Panel [package insert]. Atlanta, GA: CDC; 2014.
29. Sofia Influenza A+B FIA [package insert]. San Diego, CA: Quidel; 2011.
30. Veritor System [package insert]. Franklin Lakes, NJ: Becton Dickinson; 2018.
31. Alere BinaxNOW [package insert]. Lake Bluff, IL: Abbott; 2017.
32. Biggerstaff M, Kniss K, Jernigan DB, et al. Systematic assessment of multiple routine and near real-time indicators to classify the severity of influenza seasons and pandemics in the United States, 2003-2004 through 2015-2016. Am J Epidemiol. 2018;187(5):1040–1050. doi:10.1093/aje/kwx334
33. Birger R, Morita H, Comito D, et al. Correction for Birger et al., "Asymptomatic shedding of respiratory virus among an ambulatory population across seasons". mSphere. 2018;3(6):e00667-18. doi:10.1128/mSphere.00667-18
34. Ip DK, et al. Viral shedding and transmission potential of asymptomatic and pauci-symptomatic influenza virus infections in the community. Clin Infect Dis. 2015. doi:10.1093/cid/ciw841
35. Reed C, Chaves SS, Daily Kirley P, et al. Estimating influenza disease burden from population-based surveillance data in the United States. PLoS One. 2015;10(3):e0118369. doi:10.1371/journal.pone.0118369
36. Thompson WW, Shay DK, Weintraub E, et al. Mortality associated with influenza and respiratory syncytial virus in the United States. JAMA. 2003;289(2):179–186. doi:10.1001/jama.289.2.179
37. Estimates of deaths associated with seasonal influenza: United States, 1976-2007. MMWR Morb Mortal Wkly Rep. 2010;59(33).
38. Yarbrough ML, Burnham CD, et al. Influence of molecular testing on influenza diagnosis. Clin Chem. 2018;64(11):1560. clinchem.aaccjnls.org/content/64/11/1560