To be effective, pandemic risk prevention work has to take place well before pandemics through the three Ps: Planning, Preparedness and Practice [1]. Global influenza surveillance work has been led by WHO since the early 1950s, and although pandemics occurred at irregular intervals in the 20th century and before (Fig. 1), formal national and international pandemic planning only started late in the 20th century. The first published national pandemic plans appeared in Europe in the 1990s, some stimulated by the emergence of a potentially pandemic influenza A(H5N1) in Hong Kong in 1997 [2]. The first WHO pandemic planning guidance appeared in 1999 but was based on limited international consultation [3]. More considered work began in early 2002 with the formal development of a global influenza agenda including pandemic planning [4, 5], leading to a resolution at the World Health Assembly in 2003; this also contained the first targets for seasonal influenza immunisation [6]. The experience of SARS in 2003, another albeit considerably different acute respiratory viral infection, gave stimulus to this work, as did the development of the first comprehensive International Health Regulations (IHR 2005) that could be used when declaring a pandemic [7]. The first proper global guidance on pandemic planning was adopted by WHO in 2005, along with a seminal checklist [8, 9].

Fig. 1 Pandemics and interpandemic human influenzas, 1889–2010

In Europe the precursors to the European Influenza Surveillance Network (EISN) had been active since the late 1980s. In 2001 the European Commission convened the first European Union conference on pandemic preparedness (Tab. 1). Particular stimulus for strengthening pandemic plans and preparedness came from the re-emergence of influenza A(H5N1) in China and South East Asia in 2003, which went on to affect birds and some humans in the rest of Asia and in Europe in 2004–2005. When the European Commission (EC) and the WHO Regional Office for Europe (WHO-EURO) convened the first European Pandemic Preparedness Workshop in March 2005, the European Member States as a whole became more serious about pandemic planning. This included a review of EU/EEA Member States’ paper pandemic plans, and later that year the EC issued a Communication on pandemic planning for the European Union countries [10].

Tab. 1 European pandemic preparedness—a 21st century timeline

The European Centre for Disease Prevention and Control (ECDC) opened in May 2005 in the midst of this accelerated interest. Its Director made pandemic preparedness its first disease-specific priority, alongside establishing the basic infrastructure. The ECDC did this by drawing on the expertise of its four Technical Units (Health Communications; Preparedness and Response, which led the pandemic preparedness activities; Scientific Advice, which hosted the coordination of the influenza work; and Surveillance). Working in support of Member States and the European Commission, the ECDC initially developed a simple assessment procedure to strengthen pandemic preparedness in the EU and European Economic Area (EU/EEA) countries and started a series of visits (Tab. 1). It did so in collaboration with the EC and WHO-EURO. At the same time the ECDC developed close linkages with WHO Headquarters, since it holds the main influenza expertise globally and leads the work of pandemic preparedness and response.

The objective of this paper is to describe the ECDC assessment work and procedure, and how and why it evolved with experience with the EU/EEA Member States (Footnote 1). The paper also reports the resources that were developed for Europe from needs expressed in the assessments. It shows how momentum was maintained and European capacity drawn upon. The overall results of the assessments are described, and finally the important gaps identified through the real test of the 2009 pandemic are listed.

The initial procedure

The central component of the ECDC’s work on pandemic preparedness was a standard procedure for assisting EU and EEA countries to assess and improve their national and local pandemic preparedness. This was based on WHO’s 2005 guidance, and especially its checklist, but had to operate within the limited mandate of EU bodies in general, and the ECDC in particular, in relation to human health issues [11]. The ECDC has no regulatory function and can only issue guidance (that is, documents on a topic offering options with their pros and cons, operational aspects and the relevant scientific evidence). It can rarely offer recommendations or directions. Also, it can only become involved in a country’s preparedness when invited. However, with strong leadership from the European Health Commissioner (Mr Kyprianou, 2005–2008), the European Commission and successive EU Presidency countries (notably the UK, Austria, France and Sweden), all EU and EEA countries welcomed and supported the ECDC’s approach, to the extent of seeking an ECDC assessment.

The initial assessment procedure was simply a short country visit using a set of indicators. This was developed by ECDC staff in its Preparedness and Response Unit in 2005, using the WHO planning guidance, and especially its 2005 checklist, as reference points [8, 9]. Following piloting with the Swedish authorities, the visits began in the summer of 2005 (Tab. 1). The initial procedure was a classical external assessment relying on a visiting team (usually, but not always, led by a senior member of ECDC or WHO-EURO staff, with some members from the EC and WHO-EURO). The team visited countries for 3–4 days, checking plans against the WHO templates and making a limited number of visits to a few convenient national institutions. A standard questionnaire, based on the WHO checklist, was completed during the visit, and those questions became the first ECDC Indicators of Pandemic Preparedness. This followed the broad structure recommended by the WHO documents, with the five WHO guidance categories: planning and coordination, situation monitoring and assessment, prevention and reduction of transmission, health system response, and communication. Following the visit, the ECDC would send the country a written report, which remained unpublished.

Evolution and development of the procedure

Responding to avian influenza A(H5N1) (“bird flu”)

The autumn of 2005 and the winter of 2005/2006 provided a stimulus for EU/EEA countries to prepare for a severe pandemic, with human cases appearing in a neighbouring country (Turkey); birds in many EU countries were also then infected. Strenuous work to improve preparedness was led by WHO (human health), the European Commission and the animal health agencies (FAO and OIE) [12, 13]. Many countries secured stockpiles of antiviral agents, principally oseltamivir [14]. In response, items on preparedness for avian influenza outbreaks were added to the ECDC indicators, and joint work was sought with Ministries of Health and Agriculture during the visits. However, “bird flu” was almost a distraction for ECDC in that it had to develop a suite of risk assessment and guidance documents for what was an animal influenza that could occasionally infect humans [15, 16]. While human infections with A(H5N1) often had lethal consequences, to date no such infections have occurred in the EU/EEA countries, and the virus has as yet failed to fully adapt to humans and transmit efficiently from person to person [16]. With hindsight, this severe threat led most Member States, ECDC and WHO to focus on planning for a much more severe pandemic than transpired in 2009. Preparing for a severe threat was a very defensible first step, but it should of course have moved on to planning for mild as well as severe eventualities [17, 18].

Development of a fuller procedure

As the country visits proceeded, a number of limitations in the 2005 procedure became clear, and it rapidly evolved, with significant revisions in 2006 and 2007 (Tab. 1). In order to provide detailed guidance for the visits, ECDC (in collaboration with the European Commission and WHO-EURO) led the further development of an assessment tool that addressed the planning process, discussion points for the meetings, and report writing. An internal guide was developed for team leaders. The tool evolved substantially during the first year of application and latterly became available in a version that focused more on local preparedness; intersectoral work beyond the health sector; consistency of preparedness with neighbouring countries (interoperability); measures around seasonal influenza, especially influenza vaccination; laboratory preparedness; antiviral strategies; exercises; and communication aspects [19, 20, 21]. The most important developments (Tab. 2) and some of the resources will now be described in more detail.

Tab. 2 Major developments in the assessment procedure, 2005–2007 (re-published in July 2006 and March 2007)

Evolution from short external assessment to facilitated self-assessments

The most important development was the move from the classical short external assessment visit to a more demanding self-assessment owned and enacted by the country. The assessment then became a collaborative effort, with an in-country lead and all the national agencies involved in influenza preparedness as essential players. It was also appreciated that the scope of pandemic preparedness was becoming more and more complex. While the visits remained central and still lasted only 4 days, they became part of a longer procedure lasting 4 or more months, orchestrated by the country’s lead agency for pandemic preparedness, usually, but not always, the Ministry of Health (Tab. 2, [19]). Standard aims and ground rules were agreed (Tab. 3). The central visit always involved work with the Ministry of Health and the national public health agency, but also variously Ministries of Agriculture, institutions involved in general civil emergency preparedness, the national influenza committee, the national influenza reference laboratory and the national surveillance centre. The visits also increasingly included some local representatives as the emphasis focused more on local preparations, including a visit to at least one set of local preparedness structures. The visits resulted in a report with a list of recommendations and actions prepared by ECDC and the national authority. Finally, in the spirit of openness, and to make it more likely that recommendations would be acted upon, countries were latterly encouraged to publish their reports, and five of the countries undertaking self-assessments did so (Finland, Ireland, the Netherlands, Norway and Sweden). Each visit concluded with the identification of gaps that needed filling, often calling on ECDC to do this. This in turn led to the development of a suite of resources which were made available to all Member States through publication on ECDC’s website (Tab. 4).

Tab. 3 Aims and ground rules of national pandemic preparedness self-assessments
Tab. 4 Pandemic preparedness resources and documents developed by ECDC and its partners

Indicators to monitor preparedness—evolution over time

The list of key and subsidiary indicators also evolved to a point where they could be used by the countries and the EU to monitor their planning progress. A special list of indicators was developed for communication preparedness. A working group had been set up by WHO-EURO, under a project funded by the European Commission, to develop similar indicators and make them more SMART, i.e. specific, measurable, achievable, realistic and timely. The ECDC was invited to join this group to ensure that there were not competing sets of indicators. A proposed set of primary and linked secondary indicators was field-tested during two of the assessment visits. The in-country teams found them unacceptably complex, so their completion was not insisted upon. Instead, the secondary indicators were retained to assist Member States in deciding whether or not they had completed a primary indicator.

From plans to preparedness and practice

The first European Pandemic Preparedness Workshop, held in March 2005 in Luxembourg, focused on paper plans. The second Workshop, hosted by WHO-EURO in Copenhagen later in 2005, established that all EU and EEA countries had plans, but that these were only starting to move on to preparedness (Tab. 1). The philosophy therefore developed that published national plans were essential but not sufficient in preparing for a pandemic [23]. Published pandemic plans play vital roles, and their analysis and comparison against standards can, by their omissions, reveal important gaps and inconsistencies [24, 25]. However, it was also appreciated that plans alone could not reflect preparedness, and there was a danger that national authorities would stop at producing a well-written pandemic plan without developing the operational aspects underneath.

In November 2005 the European Commission carried out a pandemic exercise, “Common Ground”, involving every EU and EEA government, all relevant EU Agencies, WHO and the substantial European pharmaceutical industry (which accounts for a significant proportion of global influenza vaccine production). The exercise was highly successful in that many countries combined the EU-level event with national exercises [26]. These exposed many gaps and weaknesses at both the EU and national levels. This experience, and the further national exercises stimulated by Common Ground, led to an appreciation that the assessments had to include Practice, the third P joining Planning and Preparedness. It became standard in the assessments to request details of exercises that had been undertaken, or to recommend that they be carried out [19, 21].

Strengthening local preparedness—the ECDC “acid tests”

An inherent weakness of pandemic preparedness is that, though it has to start centrally in countries, for the individual citizen the countermeasures have to be available and delivered locally [1]. If that is not the case, the preparations can be viewed as a waste of time by the citizens. After the second European workshop it was appreciated that, in addition to evaluating written plans, the self-assessments needed to determine the extent to which the plans had been translated into preparedness at different levels in the countries, for example whether or not complementary standard operating procedures and contingency arrangements had been developed, staff educated and trained, equipment and supplies ordered, and local business continuity planning undertaken. These facets were added to the procedure and indicators [20, 21]. Particular examples are the delivery of antivirals and vaccines. While a country may have acquired a national stockpile of oseltamivir and have an Advance Purchase Agreement for specific pandemic vaccines, these will be to no effect if plans are not in place, and practised, for getting the drugs and vaccines to local practitioners and then on to the public. To assist in this, ECDC developed what it called “acid tests” (Footnote 2) for countries to self-apply in order to convince themselves that they could deliver countermeasures to citizens (Tab. 5, [27]).

Tab. 5 ECDC acid tests for local preparedness

From exclusively health sector plans to cross-government and whole-country plans

Initially, European pandemic plans focused exclusively on the health care sector, and the first ECDC procedure and indicators reflected this. Partly this followed the WHO lead, which primarily concerned the health sector and did not contain a multi-sectoral component until 2009, following the creation of a United Nations System Influenza Coordinator (UNSIC) [28]. The importance of non-health sectors was appreciated for two reasons. Firstly, a pandemic could temporarily remove up to 20% of working adults, through illness or having to care for others, and so threaten the functioning of core activities well beyond the health sector. Pandemic business continuity planning was needed, for example in the power industry and for food and fuel supplies. Secondly, while there was agreement that certain public health measures might constitute significant countermeasures to reduce peak illness prevalence, those in the social distancing category required cross-sectoral preparation and action [29]. For example, proactive school closures needed close coordination between education, health and other sectors at national and local levels [30]. Therefore, as they were updated, the ECDC procedures increasingly required health sector planners to involve other sectors, business and civil society. At the same time, some EU countries started to publish cross-government or whole-of-society plans, and eventually the French Presidency’s Angers Workshop in 2008 led to formal EU Health Council Recommendations on the need for multi-sectoral planning in pandemic preparedness [20, 21].

Maintaining momentum and using European capacity

Pandemic preparedness workshops

After the initial European pandemic preparedness workshops in Luxembourg and Copenhagen, two further workshops were held, in May 2006 in Uppsala (Tab. 1) and in September 2007 in Luxembourg. Alternating with the European-level workshops, and following a recommendation of the Uppsala workshop, the ECDC started in 2006 to organise smaller, regional-level and topic-based workshops. These workshops, each of up to 10 countries with, if possible, common borders, addressed specific operational issues within and among Member States. An additional topic-specific workshop on communications was also held in 2006, leading to the formation of a network of communicators under the Health Security Committee.

Peer review and innovations

As capacity developed in the EU, ECDC and WHO-EURO (which was undertaking its own assessments of countries outside the EU/EEA area) increasingly drew on those countries that had undertaken early assessments to contribute to later assessments. The fourth and final European workshop, held in 2007 in Luxembourg, made a special feature of the innovations from Member States, and ECDC started publishing these on its website [31].

Results of the assessments

The fact that the indicators were standard meant that it was possible to compare a country against a norm. At the request of the Health Commissioner, two cross-sectional surveys were undertaken by ECDC with EU/EEA Member States, in 2006 and 2007, asking them to indicate their performance on the indicators. A sensitive question has been whether performance on preparedness indicators should be centrally monitored in the EU. A number of Member States made it clear to the ECDC that the country-specific results should only be known to the country, that they should not pass beyond the ECDC and that, specifically, there would be no “league tables”. This followed an earlier academic exercise in which preparedness plans found on the web were analysed and countries were placed in a “league table” without validation or the countries being informed [24, 25]. Instead, in 2006 the ECDC used the indicators as they then stood to gather country-specific data but did not publish individual country results. Rather, normative data were passed back to countries so that they could determine how they compared to other EU/EEA countries. All countries participated in a repeat of the survey using a core of the same indicators in 2007, which allowed the ECDC to determine to what extent countries’ preparedness had improved after 12 months of work [32]. Member States were also shown the report for comment before publication, and the ECDC made adjustments when suggestions and requests were received. Hence the ECDC was seen as an honest “broker”. The combined results revealed that much progress was being made, while at the same time there were a number of weaknesses, especially in the fields of delivering preparedness locally and multi-sectoral work. This led to the development of a standard “three-dimensional” model of pandemic preparedness [22]. In preparation for an intended repeat survey in 2009 (Member States had asked for a moratorium on surveys in 2008), the EU Health Security Committee (the EU body with oversight of pandemic preparedness) pointed out that if Member States were going to be judged by the indicators, they ought to approve them. A process of consultation was undertaken with the HSC Influenza Section members, leading to formal publication of the indicators, though not until after the 2009 pandemic [22].

Future developments—learning from the 2009 pandemic

The 2009 pandemic provided a real test of pandemic preparedness. While there was generally a strong response to a pandemic that was itself almost optimal for Europe [18, 32], it is also agreed that many weaknesses were revealed and lessons needed to be learnt [17, 33]—perhaps the most important being to prepare for different kinds of pandemics, especially around that elusive parameter “severity” [34]. The last European status report on pandemic preparedness, in 2007 [30], had concluded that, in spite of the progress found, a further 2–3 years of sustained effort and investment was needed by the EU and its Member States to achieve the level of preparedness needed to respond well to a pandemic. That can now be seen to have been an optimistic estimate. Dealing well with the challenges of any pandemic requires complex adjustments that may be very different from the experiences in other crises. Most problems occurred in the stress on hospital intensive care and paediatric services, and around risk communication, vaccination, maintaining the confidence of the professionals and delivering local interventions. These cannot be measured with indicators alone, and arguably the ECDC “acid tests” are the most useful of the indicators on hand ([27], Tab. 5). Consensus on the surveillance difficulties was reached at a meeting of the European Influenza Surveillance Network (EISN) in 2010 [34]; the difficulties identified included establishing surveillance in hospitals, undertaking seroepidemiology, estimating severity and sharing analyses. Finally, it is important to appreciate that the 2009 pandemic was unusually benign (it had a low case fatality rate and did not stress essential services outside the health sector) [17, 18, 33], i.e. countries should not neglect preparations for more severe pandemics.

The commitment to visit all 30 EU/EEA Member States in 2 years placed a considerable strain on the ECDC and its partner organizations. They will not have the capacity to re-visit the countries frequently, and follow-ups have been so far limited to addressing progress in pandemic preparedness during visits for other reasons. Therefore, it remains challenging for the ECDC and partner organizations to assist the Member States over a longer term in keeping the momentum in strengthening preparedness though there is a Council Conclusion to do so which ECDC will support [35]. However countries will need to do more self-monitoring and assessment in the future, supported by the new indicators, and complemented by annual regional meetings for direct communication about their activities, and continued, but less frequent assessment visits. The natural“kick-off” for this in Europe was the Belgian EU Presidency meeting in July 2010 and the Council Conclusions though in parallel it is hoped that there will also be a new WHO guidance developed after the 2011 World Health Assembly and the report of the Fineberg Committee [35, 36].