Dataset: 11.1K articles from the COVID-19 Open Research Dataset (PMC Open Access subset)
All articles are made available under a Creative Commons or similar license. Specific licensing information for individual articles can be found in the PMC source and CORD-19 metadata.
Deep Learning Technology: Sebastian Arnold, Betty van Aken, Paul Grundmann, Felix A. Gers and Alexander Löser. Learning Contextualized Document Representations for Healthcare Answer Retrieval. The Web Conference 2020 (WWW'20)
Funded by The Federal Ministry for Economic Affairs and Energy; Grant: 01MD19013D, Smart-MD Project, Digital Technologies
The financing, provision, and quality of healthcare systems; the availability of vaccines, antivirals, and antibiotics; and compliance with treatment protocols are all important determinants of infectious disease transmission. Although the correlation between healthcare system financing and efficacy is not perfect, recent budget cuts to healthcare are an important consideration when anticipating infectious disease risk. In part related to the global economic crisis, it has been reported that many high-income governments have introduced policies to lower spending, for example by cutting the prices of medical products and through budget restrictions and wage cuts in hospitals (54). There are many direct and indirect pathways through which budget cuts could affect disease transmission; to provide just one example, it has been estimated that 20–30% of healthcare-associated infections are preventable with intensive hygiene and control programmes (2) – should investments in this area diminish, then healthcare-acquired infections could become an even more problematic issue. There are currently roughly 4.1 million healthcare-associated infections each year in the EU alone (3).
A broader issue related to healthcare provision is population mobility, for both healthcare professionals and patients who might increasingly seek work or healthcare in other countries – the provision of cross-border healthcare and the mitigation of cross-border health threats will necessitate collaboration across borders (55, 56) and solutions for the brain drain of medical personnel from resource-poor countries (57). Also related to healthcare provision and practice is the over-prescription or overuse of antibiotics. In combination with a lag in pharmaceutical innovation, rapid transmission, and poor infection control measures, this has driven resistance in organisms such as methicillin-resistant Staphylococcus aureus, producers of extended-spectrum beta-lactamases, and carbapenemase-producing gram-negative bacteria, such as those producing Klebsiella pneumoniae carbapenemase (KPC) (58). Antimicrobial resistance is currently one of the major health risks facing society (59).
Food production systems remain a persistent source of human infectious diseases. Attempts are underway to estimate the global burden of food-borne disease (60), which is likely substantial. Many factors in food production affect human health. A vast range of familiar human pathogens can be acquired through the consumption of animal products, and other disease drivers, such as global travel, further exacerbate this (61). In addition to farmed animals, the hunting and slaughtering of wild animals has led to the emergence of more exotic pathogens: SARS originated in wildlife markets and restaurants in southern China (62), and HIV and Ebola have both been linked to the hunting or slaughtering of primates and other wild animals (33, 63, 64). The density and health of livestock, meanwhile, have been linked to disease in humans (65, 66). Although inconclusive, there is some evidence to suggest that livestock production may lead to increased antibiotic resistance in human pathogens. There are certainly many pathways by which drug-resistant pathogens could transmit from livestock to humans, including environmental contamination by excreted veterinary antibiotics (33, 67, 68).
Social and demographic contexts can significantly influence the transmission of infectious disease, while also creating increased vulnerabilities for some population subgroups. The elderly are at greater risk of many infectious diseases, and the ageing trend in many high-income countries could increase the challenges related to nosocomial (hospital-acquired) and nursing-home acquired infections. An additional challenge related to population ageing is that the share of employed workers in a country decreases. The combination of more people to care for and fewer tax-related revenues may challenge publicly financed public health and disease control programmes (7).
When persons from regions with high endemicity of a given disease move to regions with lower endemicity, new challenges for public health are created. In addition, migrant communities can be highly vulnerable to certain infectious diseases. In the EU, for example, approximately 37% of HIV cases reported in 2011 were among people born abroad, and the equivalent figure for tuberculosis was 25% (40). Similarly, migrants suffer a higher burden of chronic hepatitis B infection (41).
It is widely established that socially and economically disadvantaged groups suffer disproportionately from disease (42). This applies to infectious disease burdens in both high- and low-income settings (43, 44). Income inequalities are generally widening globally, and this appears to have been exacerbated in many countries by the global economic crisis (45). Rising unemployment and the prospect of public health budget cuts can increase the risk of infectious disease transmission (44, 46), a prominent example being an outbreak of HIV among people who inject drugs (PWID) in Greece (see ‘Measles among Roma in Bulgaria and HIV among PWID in Greece: the impact of socioeconomic contexts’ section) (47, 48). In a similar fashion, it has been speculated that tuberculosis rates could rise in some countries in Central and Eastern Europe (49).
Social trends and behaviours can also play a significant role in infectious disease transmission. The most notable example would be vaccine hesitancy, the phenomenon through which vaccination coverage rates remain suboptimal due to the varying and complicated reasons that individuals may have for not getting vaccinated (50, 51). In some cases, this might be related to misconceptions about the safety or efficacy of vaccines (50, 52), whereas in others this may be related to religious or cultural beliefs (53).
Hepatitis B is found in virtually every region of the globe. Of the more than 2 billion people who are or have been infected, 350 to 400 million are carriers of the chronic disease; the remainder undergo spontaneous recovery and production of protective antibodies. Nearly 100% of infected infants (that is, those born to HBV-infected mothers) become chronically infected. The risk of developing a chronic infection decreases with age.
At least 30% of those with chronic HBV infection experience significant morbidity or mortality, including cirrhosis and hepatocellular carcinoma. Most people do not know they are infected until they present with symptoms of advanced liver disease, which means that infected individuals can spread the infection unknowingly, sometimes for many years. Although oral antiviral therapies are effective at stopping HBV replication, they do not cure the disease. Therefore, therapy is usually lifelong. Treatment is also complicated by the development of drug resistance and side effects. A vaccine against HBV is safe and effective in 90 to 95% of people; however, the individuals who are most at risk of becoming infected are often those with limited access to the vaccine, such as marginalized populations or people living in resource-limited countries.
There is substantial evidence that an individual's likelihood of recovering from an acute HBV infection or developing severe sequelae from infection is influenced, in part, by genes [39–45]. Candidate gene and genome-wide association studies have identified variants associated with HBV-related disease progression or hepatocellular carcinoma in various populations [46–52]. Treatment response to interferon (IFN)-α has been associated in some, but not all, studies with IFNλ3 polymorphisms. Finally, specific gene variants (HLA and non-HLA alleles) have been associated with vaccine response and non-response [54–57].
Emergence and reemergence of infectious diseases occur over time. Prior to causing an epidemic, infectious disease agents go through various stages of adaptation in order to acquire pathogenic characteristics in a new host. Specific processes such as gene mutation, genetic recombination, or reassortment, as well as factors that compel microbial agents to change reservoir hosts, constitute opportunities for infectious agents to evolve, adapt to new hosts in new ecological niches, and spread easily [18, 19]. A number of factors contribute to this adaptation and consequent disease emergence. The complex interactions between infectious agents, hosts, and the environment are key.
Specifically, factors affecting the environment include deforestation, the expansion and modernization of agricultural practices, and natural disasters such as floods. These can change microbial ecological niches and fuel microbial adaptation to human hosts [20, 21]. Sociodemographic factors such as increasing population density, falling living standards, declining infrastructure, human travel, conflict and social instability, and the killing of wild animals for meat all increase host–microbe contact, which facilitates infections in humans [22–25]. There are also pathogens whose emergence results from deliberate human action, namely those deployed as biological weapons.
Besides host and environmental factors, changes or mutations in a pathogen's genome, which can occur as a result of exposure to chemicals and antimicrobial agents (e.g., antibiotics), may lead to the emergence of drug-resistant pathogen variants that could cause new disease. Thus, human, microbial, and environmental factors together constitute the major causes of infectious disease emergence, and virulence or pathogenic potential depends on a complex combination of these factors. In general, however, emerging infectious diseases caused by viral pathogens account for the greatest proportion of the EID threat, having caused about two-thirds of the infectious disease burden, often in the form of large epidemics. Examples are the filoviruses Ebola and Marburg [28, 29].
The studied communities were located within 95 km (severe neurological infectious disease) and 62 km (fatal respiratory infectious disease) of a surveillance hospital. In these communities, 76 of 426 severe neurological disease cases (18%, 95% CI 14%–22%) and 234 of 1,630 fatal respiratory disease cases (14%, 95% CI 13%–16%) attended a surveillance hospital. Adjusting for distance, the case detection probability was nearly twice as high among severe neurological disease cases as among fatal respiratory disease cases (risk ratio 1.8, 95% CI 1.4–2.3; p < 0.001). At 10 km distance, an estimated 26% (95% CI 18%–33%) of severe neurological disease cases and 18% (95% CI 16%–21%) of fatal respiratory disease cases were detected by the hospital-based surveillance. The detection probability decreased with distance from the surveillance hospital, and the decline was faster for fatal respiratory disease than for severe neurological disease. A 10 km increase in distance resulted in a 12% (95% CI 4%–19%; p = 0.003) relative reduction in case detection probability for severe neurological disease but a 36% (95% CI 29%–43%; p < 0.001) relative reduction for fatal respiratory disease (Fig 2C). Including more complex functional forms of distance in the log-binomial regression models did not improve model fit based on AIC (Table A and Figs. B and C in S1 Text).
The probability of detecting an outbreak of exactly three cases (if a single detected case was considered an outbreak) dropped below 50% at distances greater than 26 km for severe neurological disease and at distances greater than 7 km for fatal respiratory disease (Fig 3A). Fig 3B and 3C show the minimum number of cases required for surveillance to detect outbreaks with a probability of ≥90% if different outbreak thresholds are applied. For outbreaks defined as detection of at least one case, we found that an outbreak of fatal respiratory disease required 12 cases (95% CI 11–13) to be detected with 90% probability at 10 km from a surveillance hospital, but 30 cases (95% CI 24–39) to be detected at 30 km. In contrast, the impact of distance on the outbreak size requirement was much more limited for severe neurological disease: eight cases (95% CI 6–12) at 10 km and 11 cases (95% CI 9–14) at 30 km. For outbreaks defined as detection of at least two cases, 14 severe neurological disease cases (95% CI 11–20) and 20 fatal respiratory disease cases (95% CI 18–23) would be necessary for an outbreak to be detected at 10 km distance, and 19 severe neurological disease cases (95% CI 15–24) and 51 fatal respiratory disease cases (95% CI 41–66) at 30 km. The necessary outbreak sizes increased further when a five-case threshold was applied, so that 28 severe neurological disease cases (95% CI 21–39) and 39 fatal respiratory disease cases (95% CI 35–44) would need to occur for an outbreak to be detected at 10 km distance, and 36 (95% CI 30–46) and 97 (95% CI 79–128) cases, respectively, at 30 km.
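If each community case independently attends the surveillance hospital with probability p, an outbreak of n cases is detected (at least one case captured) with probability 1 − (1 − p)^n, and the minimum outbreak size for a target detection probability follows directly. The sketch below reproduces the point estimates above under this simplifying assumption; the study's estimates and confidence intervals come from log-binomial regression, and the distance decay in the final example is a simplified reading of the reported 12% relative reduction per 10 km.

```python
import math

def min_outbreak_size(p_detect_case, target=0.90):
    """Smallest outbreak size n with detection probability >= target,
    assuming cases are detected independently:
        1 - (1 - p)^n >= target  =>  n >= log(1 - target) / log(1 - p)
    """
    return math.ceil(math.log(1.0 - target) / math.log(1.0 - p_detect_case))

# Point estimates from the text at 10 km from a surveillance hospital:
print(min_outbreak_size(0.26))  # severe neurological disease: 8 cases
print(min_outbreak_size(0.18))  # fatal respiratory disease: 12 cases

# Applying the reported 12% relative reduction per additional 10 km
# for severe neurological disease gives p ~= 0.26 * 0.88**2 at 30 km,
# i.e. 11 cases, in line with the reported estimate (95% CI 9-14).
print(min_outbreak_size(0.26 * 0.88**2))  # 11 cases
```

Both 10 km values match the reported point estimates (8 and 12 cases), suggesting the published outbreak-size thresholds follow this independent-detection calculation.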
Surveillance hospital attendance among community cases varied by case characteristics, leading sometimes to biased disease statistics among surveillance cases (Table B in S1 Text). For severe neurological disease, individuals aged <5 y represented 48% of community cases but only 29% of surveillance cases (p < 0.001). Additionally, the proportion of cases in the lowest socioeconomic group was lower among surveillance cases than among community cases (43% versus 57%; p = 0.012), while the proportion of individuals aged 15–59 y was higher (43% versus 29%; p = 0.005) (Fig 4A). For fatal respiratory disease, the proportion of individuals aged ≥60 y (47% versus 62%; p < 0.001) was lower among surveillance cases than among community cases, while the proportion of individuals aged <5 y (24% versus 18%; p = 0.020), individuals aged 15–59 y (27% versus 18%; p < 0.001), and cases in the highest socioeconomic group (43% versus 37%; p = 0.022) was higher (Fig 4B). We observed a slight difference in the proportion of females for fatal respiratory disease (34% among surveillance cases versus 38% among community cases; p = 0.108), but not for severe neurological disease (39% versus 40%; p = 0.861). Results were consistent in sensitivity analyses with age as a continuous variable and socioeconomic status classified into quintiles (Figs. D and E in S1 Text).
A substantial proportion of cases (severe neurological disease 42% [95% CI 38%–47%]; fatal respiratory disease 26% [95% CI 24%–28%]) visited multiple healthcare providers during their illness. Forty-eight percent (95% CI 44%–53%) of severe neurological disease cases and 31% (95% CI 29%–34%) of fatal respiratory disease cases attended any hospital, including surveillance hospitals (Fig 5). Including other hospitals that were attended by cases in the surveillance system could have increased the overall case detection probability by 31% (absolute increase) for severe neurological disease cases and 17% for fatal respiratory disease cases. The capacity to detect outbreaks would have increased, so that outbreaks containing four severe neurological or eight fatal respiratory disease cases would have been detected with ≥90% probability for any distance in the range 0–40 km from the original surveillance hospital, compared to 13 and 47 cases, respectively, with the current system (Fig. F in S1 Text). However, since individuals who attended any hospital had similar characteristics in terms of sex, age, and socioeconomic status as those attending surveillance hospitals (Fig. G in S1 Text), this expansion would not have increased disease detection in key groups such as the lowest socioeconomic group. Only with the informal sector incorporated in the surveillance system would cases in such groups be detected.
This work has made it possible to assess the risk of entry of different infectious diseases simultaneously, and through different routes of entry, into a large geographical area.
The use of spreadsheets for the development of the probabilistic formulation has been of vital importance for the collection and analysis of data, although its validity depends on the confidence in and quality of the available information. In this case, complete information was available for only five countries: Morocco, Algeria, Tunisia, Egypt, and Saudi Arabia.
A model of vector introduction via wind flow has been established, confirming the potential entry by this pathway of some vector-borne diseases, bluetongue and epizootic haemorrhagic disease, from Morocco, Algeria, and Tunisia.
Of all the diseases analyzed in this study, Newcastle disease and avian influenza have the highest risk of entry into the European Union. The most relevant pathway for the entry of these diseases is wild bird migration.
The diseases with a moderate risk of entry are bluetongue, epizootic haemorrhagic disease, and foot-and-mouth disease. These diseases share a possible entry route through wind dispersion: for the vector-borne diseases, via dispersal of vectors in wind currents; for foot-and-mouth disease, via spread of the virus itself through wind currents.
Due to the absence of live dromedary movement to Europe, the most likely route of entry for Middle East respiratory syndrome is the movement of infected people from Saudi Arabia, Kuwait, Qatar, and Oman.
Contagious bovine pleuropneumonia is the only disease with no risk of introduction into the European Union, owing to the absence of cattle movement from the countries affected by this disease: Chad, Niger, Mali, and Mauritania.
Guinea pigs and golden hamsters were found to be relatively resistant to MPXV (West African clade Copenhagen strain) infection by multiple routes. Guinea pigs were challenged via intracardial, intranasal (IN), oral, or foot pad (FP) inoculation with no observable symptoms of disease except for edema at the FP inoculation site. Golden hamsters were also resistant to MPXV infection via several routes of infection, with no observable signs of disease even with large dosages of virus (1.5–5.9 × 10^7).
Rabbits have also been considered as a possible animal model for the study of MPXV [2–3]; susceptibility depended greatly on the method of inoculation and the age of the animals. In adult rabbits challenged orally with MPXV (West African clade Copenhagen strain), no signs of disease were seen. However, if virus was delivered by the intravenous route, acute illness with a generalized rash was observed. Young rabbits (10 days old) inoculated via the IN or oral route developed severe illness; two-day-old rabbits were highly susceptible to infection by intracutaneous inoculation or skin scarification. The intracutaneous route led to the development of discrete white translucent lesions. Another study found that intracerebral inoculation was 100% fatal and that intratesticular or intracorneal inoculation with MPXV was also pathogenic in rabbits.
Several rat species, including white rats, cotton rats, and multimammate rats, have been challenged with MPXV. Adult white rats inoculated with 10^1 to 10^3 plaque forming units (pfu) of West African MPXV were not susceptible to infection via intravenous, IN, or cutaneous routes. However, newborn white rats (1–3 days old) challenged intranasally with MPXV developed adynamia leading to death in 5–8 days. Cotton rats and multimammate rats were both found to be highly susceptible to MPXV infection. When cotton rats were challenged with 10^5 pfu via an intravenous route of infection, 100% mortality was seen 4–5 days post infection (p.i.). The infection was characterized by difficulty breathing, cough, sneezing, cyanosis, rhinitis, purulent conjunctivitis, and progressive emaciation. An IN MPXV challenge in cotton rats caused 50% mortality, with a clinical picture similar to that seen with the intravenous route. Multimammate rats were highly sensitive to both IN and intraperitoneal (IP) inoculation.
Marennikova et al. challenged adult common squirrels (Sciurus vulgaris) with 10^6 pfu of MPXV Z-249 (Congo Basin clade) via IN, oral, or scarification routes of infection. Disease progression occurred earlier in animals infected IN or orally than in those infected via scarification. Skin lesions did not develop on any animals; symptoms of disease included fever, inactivity, inappetence, rhinitis, cough, and difficulty breathing. Infection was 100% lethal by day 7 or 8 p.i. regardless of inoculation route. Shelukhina et al. challenged six African squirrel species (including members of the genera Funisciurus, Protexerus and Heliosciurus) with Congo Basin MPXV via an IN infection (10^5 or 10^6 pfu/0.1 mL). All squirrel species were highly susceptible to Congo Basin MPXV challenge and developed an acute, generalized infection that was 100% lethal. However, some variation in susceptibility among the squirrel species was seen with lower dosages of virus. Cutaneous inoculation of squirrels resulted in a thick, red papule at the inoculation site. Skin lesions (restricted to non-fur-bearing areas of the skin or to the borders of the skin and mucous membranes of the nose and lip) occurred in only a few squirrels that had been infected by the oral or IN route with small (nonlethal) doses of virus. Most often the rash appeared in the later stages of disease (16–25 days p.i.). Transmission studies were also conducted with squirrels, and the authors found that infected animals were able to transmit the disease to naïve animals via airborne and direct contact.
Ground squirrels (Spermophilus tridecemlineatus) are very susceptible to MPXV infection. Tesh et al. challenged adult ground squirrels with the West African MPXV clade either IP or IN with 10^5.1 pfu. In both groups, symptoms of disease included anorexia and lethargy within 4–5 days of infection, with no other observable symptoms. Weight loss was not measured for these animals. Animals in the IP group died within 6–7 days p.i.; those challenged IN all died within 8–9 days. A follow-up study compared the pathogenesis of the two MPXV clades in the ground squirrel model. Inoculation of 100 pfu by a subcutaneous route of infection was 100% lethal for both MPXV clades. However, the authors noted that the onset of severe respiratory distress was more rapid and uniform for the Congo Basin MPXV challenged animals. Additionally, animals challenged with the Congo Basin MPXV began to die earlier than West African challenged animals. However, LD50 values were similar for the two strains in the ground squirrel MPXV model (0.35 pfu for Congo Basin and 0.46 pfu for West African MPXV).
Prairie dogs have also been investigated as a possible MPXV animal model. Xiao et al. challenged prairie dogs with 10^5.1 pfu of West African MPXV via an IP or IN route of infection and observed 100% and 60% mortality, respectively. Hutson et al. challenged prairie dogs with either the Congo Basin or West African MPXV (10^4.5 pfu) via an IN or scarification route of infection. Animals were asymptomatic until days 9–12, when a generalized rash was observed on challenged animals. Signs of disease included lethargy, inappetence, nasal discharge, respiratory distress, and diarrhea; both morbidity and mortality were noticeably greater for the Congo Basin MPXV challenged animals. A follow-up study, utilizing an IN route of infection, found the LD50 for the prairie dog MPXV model to be roughly 20-fold lower for the Congo Basin clade than for the West African clade (5.9 × 10^3 and 1.29 × 10^5 pfu, respectively). Weight loss occurred in 2/4 West African MPXV challenged dosage groups and 3/4 Congo Basin MPXV challenged dosage groups; for both viral strains, the highest percent weight loss occurred in the highest viral inoculum group. A trend of increasing viral titers in oropharyngeal swabs with increasing viral inoculum dose was apparent for both MPXV strains, and when all mean values were combined, Congo Basin challenged animals had statistically higher levels of virus. Furthermore, MPXV DNA and viral shedding tended to appear earlier, attain higher levels, and persist longer in Congo Basin challenged animals. Symptoms were also more numerous and severe in Congo Basin MPXV infected prairie dogs.
Schultz et al. challenged African dormice, Graphiurus kelleni, with 1.4 × 10^4 pfu of a Congo Basin clade of MPXV via the FP route and observed 92% mortality. The authors further developed the model by infecting dormice with various dosages of Congo Basin MPXV by the IN route and calculated the LD50 as 12 pfu. Animals became symptomatic at day 3 (conjunctivitis and dehydration), and animals that succumbed to disease had a mean time to death of 7.9 ± 1 to 12.3 ± 5 days (depending on dose). Morbidity in those animals that succumbed to disease included decreased activity, hunched posture, unkempt hair coat, dehydration, and conjunctivitis; lesions did not develop on any animals. Weight loss was highest with 2,000 pfu (the highest dosage given), but was also seen in animals given 200 or 20 pfu. Weight loss was not observed in the lowest dosage groups (2 and 0.2 pfu). Disease pathogenesis was described as localized inflammation, viral replication, and hemorrhage in the nasal mucosa, followed by dissemination around day 3 with subsequent necrosis of liver, spleen, lung, and gastrointestinal tract tissues. When the West African strain of MPXV was used to challenge dormice by IN infection, days until death and mortality rates were similar to those of the Congo Basin MPXV challenged animals.
As is the case for many pathogens, mice have been utilized numerous times for the study of MPXV. Results have varied greatly depending on the type of mice used (i.e., wild strains or laboratory strains). In early studies, white mice were challenged via intracerebral, IN, IP, FP, oral, or intradermal (ID) inoculation with a West African strain of MPXV and found to be highly sensitive to most inoculation routes. Intracerebral inoculation was 100% fatal in adult white mice, as was IN inoculation of suckling mice. Inoculation of eight-day-old mice via the FP, IP, or IN route resulted in 100% mortality; ID or oral inoculation caused 50% and 40% mortality, respectively. Oral inoculation of 12-day-old white mice caused only 14% mortality; however, IN inoculation of 15-day-old animals led to 100% mortality. Inbred laboratory mouse strains have also been studied by several groups. Hutson et al. compared the two clades of virus (10^5 pfu) in immunocompetent BALB/c and C57BL/6 laboratory mouse strains via an IN or FP route of infection. Localized signs in the FP challenged animals included edema at the inoculation site, while the Congo Basin IN route of infection led to weight loss. However, symptoms were minimal and all animals survived infection. Osorio et al. compared both clades of virus by IP inoculation in either BALB/c or severe combined immune deficient (SCID) BALB/c mice. Biophotonic imaging was used to visualize disease progression. BALB/c mice developed rough coats and decreased activity but cleared the infection within 10 days p.i. In contrast, SCID BALB/c mice developed similar symptoms but suffered 100% mortality by day 9 p.i. (Congo Basin MPXV) or day 11 p.i. (West African MPXV). Stabenow et al. utilized a laboratory mouse strain lacking STAT1 (C57BL/6 Stat1-/-), an important protein involved in Type I and Type II IFN signaling. Mice were challenged with dosages between 4.7 and 4,700 pfu via an IN route of infection.
Weight loss was seen with all dosages given except the lowest (4.7 pfu). Mortality was 25–50% at the 47 pfu dose (12–21 days p.i.); 100% mortality occurred by day 9 p.i. with the highest dose given (4,700 pfu). The calculated LD50 for the Congo Basin clade in the C57BL/6 Stat1-/- mice was 213 pfu for females and 47 pfu for males. Americo et al. screened 38 inbred mouse strains and identified three that are highly susceptible to MPXV. Of these three strains, the CAST/EiJ was developed as a model. Signs of morbidity in moribund animals included ruffled fur, hunched posture, and lethargy; no animals developed lesions. Animals challenged IN with the highest dosages (10^5 or 10^6 pfu Congo Basin MPXV) lost up to 28% of their starting body weight, and 100% died between days 5–8. Animals challenged with 10^4 pfu all died between days 8–10. Animals challenged with 10^3 pfu had an even longer delay in weight loss and death, and 40% of the animals recovered. No animals perished in the 10^2 pfu challenge group. The calculated LD50 for the Congo Basin clade in CAST/EiJ mice given an IN challenge was 680 pfu. Animals were even more sensitive to IP Congo Basin MPXV infection, with a calculated LD50 of 14 pfu. Challenging mice with 10^5 or 10^6 pfu of a West African MPXV strain resulted in rapid weight loss and 100% mortality by day 8 p.i. Lower dosages of West African MPXV resulted in less weight loss and lower mortality than observed for the Congo Basin MPXV; the calculated LD50 was 7,600 pfu, more than a log higher than for the Congo Basin clade.
In terms of the burden of congenital infections in newborns, almost all of the burden (97%) was attributable to toxoplasmosis, listeriosis, and rubella (Table 3).
With all the route-specific probabilities calculated, we can calculate the total probability of entry of disease j from country i into the European Union, taking into account all the routes of entry already evaluated (PIij).
To do this, we calculate the probability of the opposite case – that disease j is not introduced by any of the k routes of entry – and subtract it from one:

PIij = 1 − ∏k (1 − Pijk)

With the same type of formula, the likelihood of entry of disease j into the European Union as a whole is estimated.
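Assuming the routes act independently, this complement-based calculation can be sketched as follows (the route probabilities shown are illustrative, not values from the study):

```python
import math

def total_entry_probability(route_probs):
    """Total probability PI_ij that disease j enters from country i
    by at least one route, given per-route probabilities P_ijk and
    assuming the routes are independent:
        PI_ij = 1 - prod_k (1 - P_ijk)
    """
    prob_no_entry = math.prod(1.0 - p for p in route_probs)
    return 1.0 - prob_no_entry

# Three hypothetical routes (e.g. animal movement, wind-borne vectors,
# human travel) with entry probabilities of 10%, 2% and 5%:
print(total_entry_probability([0.10, 0.02, 0.05]))  # 1 - 0.90*0.98*0.95
```

The same combination rule applied over countries rather than routes yields the overall likelihood of entry into the European Union.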
High, moderate, and low risks of introduction of infectious diseases from the different countries have been estimated based on the 75th and 90th percentiles (P75 and P90) of the final probabilities for each route of entry: results above P75 are classified as moderate risk, and results above P90 as high risk.
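This thresholding step can be reproduced with a nearest-rank percentile; the study does not state which percentile method was used, so that choice, like the probabilities below, is an assumption for illustration:

```python
import math

def percentile(values, q):
    """Nearest-rank percentile: the smallest value such that at least
    q percent of the data lies at or below it."""
    ordered = sorted(values)
    rank = max(1, math.ceil(q * len(ordered) / 100))
    return ordered[rank - 1]

def classify_entry_risk(prob, all_probs):
    """Classify one final entry probability against the P75/P90
    thresholds computed over all route-level probabilities."""
    p75 = percentile(all_probs, 75)
    p90 = percentile(all_probs, 90)
    if prob > p90:
        return "high"
    if prob > p75:
        return "moderate"
    return "low"

# Illustrative probabilities (not values from the study): 0.01 .. 0.20
probs = [i / 100.0 for i in range(1, 21)]
print(classify_entry_risk(0.19, probs))  # high (above P90 = 0.18)
print(classify_entry_risk(0.16, probs))  # moderate (above P75 = 0.15)
print(classify_entry_risk(0.10, probs))  # low
```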
The diseases with the highest number of DALYs per case, which represents the individual burden and to a certain extent the severity of the disease, were rabies and variant Creutzfeldt–Jakob disease, which are ultimately fatal conditions. HIV/AIDS, invasive meningococcal disease, listeriosis, TB, IHID, Legionnaires’ disease, HBV infection, IPD, congenital toxoplasmosis, tetanus and diphtheria followed, with DALYs per case ranging from 6.03 to 1.16. Diseases determined to have a high individual and population burden were Legionnaires’ disease, IPD, HIV/AIDS and TB, while influenza was determined to have a low individual but high population burden (Figure 4).
Since the outbreak of SARS in 2003, the public health system in China has improved, owing to increased investment in disease control and prevention by all levels of government. The health status of the Chinese people has also improved. However, infectious diseases remain a major cause of morbidity and mortality, and this may be exacerbated by rapid urbanization and unprecedented impacts from climate change. Although infectious diseases in China have been significantly reduced over recent decades, China’s current capacity to manage emerging and re-emerging infectious disease outbreaks faces formidable challenges. A timely, streamlined, well-funded, and efficient disease reporting and surveillance system is essential to monitor the threat of potential epidemics, which may not only affect population health in China but may also have wider implications for global health. In order to deal with future infectious disease threats effectively and promptly, comprehensive prevention and response strategies, which integrate a variety of complementary actions and measures, are needed. More research on infectious diseases, urbanization, climate change, and changing demographics should be conducted to support efforts to build China’s capacity to control and prevent the spread of emerging and re-emerging infectious diseases in the future.
By the 1970s, the human burden of infectious diseases in the developed world was substantially diminished from historical levels, largely due to improved sanitation and the development of effective vaccines and antimicrobial drugs. The emergence of a series of novel diseases in the 1970s and 1980s (e.g. toxic shock syndrome, Legionnaires’ disease), culminating with the global spread of HIV/AIDS, however, led to infectious disease rising back up the health policy and political agendas. Public concern about emerging infectious diseases (EIDs) has been heightened because of the perception that infectious diseases were previously under control, because of their often rapid spread (e.g. severe acute respiratory syndrome; SARS), because they often have high case fatality rates (e.g. Ebola virus disease) and because the development of drugs and vaccines to combat some of these (e.g. HIV/AIDS) has been slow and costly. By the 1990s, authors had begun to review similarities among these diseases and identify patterns in their origins and emergence. Similarities included a skew towards zoonotic pathogens originating in wildlife in tropical regions (e.g. Ebola virus), and that emergence was associated with environmental or human behavioural change and human interaction with wildlife (e.g. HIV/AIDS) or with domestic animals which had interactions with wildlife (e.g. Nipah virus) [5–7]. Emergence was found to be exacerbated by increasing volumes and rates of human travel and globalized trade.
By the end of the 1990s, the study of EIDs was a staple of most schools of public health, a key focus of national health agencies, a book topic and the title of a scientific journal. Novel diseases continued to emerge, often from unexpected reservoirs and via new pathways. For example, between 1994 and 1998, three new zoonotic viruses (Hendra, Menangle and Nipah viruses) emerged from pteropodid bats in Australia and southeast Asia. Each of these was transmitted via livestock (horses or pigs), and each belonged to the Paramyxoviridae. Around this time, emerging diseases were identified in a series of well-reported die-offs in wildlife, including canine distemper in African lions (Panthera leo) in the Serengeti, chytridiomycosis in amphibians globally, pilchard herpesvirus disease in Australasia and West Nile virus in corvids and other birds in New York [10–13]. Pathogens were also implicated for the first time in species extinctions, or near-extinctions, e.g. canine distemper in the black-footed ferret (Mustela nigripes), chytridiomycosis in the sharp-snouted day frog (Taudactylus acutirostris) and steinhausiosis in the Polynesian tree snail, Partula turgida [14–16]. Novel diseases and their emergence in people and wildlife were reviewed, and commonalities in the underlying causes of emergence discussed, in a paper published at the end of the decade. Here, we re-examine some of the key conclusions of that paper, review how the field has progressed 17 years on and identify some of the remaining challenges to understanding and mitigating the impacts of disease emergence in and from wildlife.1
To date, four MPXV small animal models have been used to test the antiviral drugs cidofovir, CMX001 and ST-246 (tecovirimat). Herein we summarize those studies and their efficacy data, and discuss the advantages and limitations of the animal models used.
Sbrana et al. utilized ground squirrels to test the efficacy of ST-246 against MPXV challenge. The authors used 100 pfu of MPX-ZAI-1970 (200 × LD50) via a subcutaneous route of inoculation. Squirrels (8–9 per group) were divided into five treatment groups, with drug initiated at 0, 24, 48, 72 or 96 hours post-infection (p.i.); 100 mg/kg of drug was given once a day for 14 days. Two animals in each group were sacrificed at day 7 to measure objective morbidity; the remainder of the animals were used to calculate survival rates. Animals in the placebo group, which were not given ST-246, showed signs of illness beginning on day 4 and all died between days 6–9. Signs of disease included lethargy, anorexia, nosebleeds, and terminal respiratory distress. At day 7, a sampling of placebo-treated animals exhibited significant leukocytosis, transaminitis, and coagulopathy; almost 10⁵ pfu/mL of infectious monkeypox virus was found in blood, and between 10⁷ and 10⁸ pfu/mL of infectious MPXV was observed in 10% organ homogenates of liver, spleen and lung. Animals treated at 0, 24, 48 or 72 hours p.i., before symptomatic disease onset, all survived infection and showed no signs of disease. At day 7, in a sampling of animals treated at 0, 24, 48 or 72 hours p.i., no virus was found in the liver, spleen, lung, or blood; although some abnormal values were recorded, no clear trends in leukocytosis, transaminitis or coagulopathy were noted with delay in treatment onset. In animals initiating treatment at 96 hours p.i., concurrent with symptomatic disease onset, 67% of animals survived infection, and 2/4 survivors showed signs of disease. In those animals that succumbed to infection, ST-246 prolonged the time to death: the mean time to death was day 7 for animals receiving placebo and day 13 for those receiving ST-246 in the 96-hour p.i. treatment group.
The sampling of animals at day 7 that had initiated ST-246 at 96 hours p.i. demonstrated lower levels of viremia (∼3 log decrease) and ∼5 logs less virus in liver, spleen and lungs than seen in the placebo-treated animals at day 7. Although some evidence of transaminitis was present, leukocytosis and coagulopathy were not observed in this treatment group. Pathologic examination of tissues in general showed greater tissue necrosis in animals treated at later times p.i. This study demonstrated a survival benefit in animals treated prior to, or at the onset of, disease symptoms, in a disease model with a time course compressed relative to that seen in human disease.
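The "log decrease" figures quoted in these studies are base-10 differences between titers, i.e. each log corresponds to a tenfold drop. A small sketch of the arithmetic (the titer values below are illustrative, not the study's measurements):

```python
import math

def log10_reduction(titer_control, titer_treated):
    """Base-10 log reduction of viral titer in treated vs control animals."""
    return math.log10(titer_control) - math.log10(titer_treated)

# Illustrative values: a ~3-log decrease in viremia, as described in the
# text, corresponds to a 1000-fold drop in pfu/mL.
print(log10_reduction(1e5, 1e2))  # 3.0

# A ~5-log difference in organ titer (e.g. 10^8 vs 10^3 pfu/mL)
# corresponds to a 100,000-fold difference.
print(log10_reduction(1e8, 1e3))  # 5.0
```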
Schultz et al. infected African dormice with a lethal challenge of the Congo Basin clade virus MPXV-ZAI-79 via an intranasal (IN) route of infection to evaluate the efficacy of cidofovir as post-exposure prophylaxis. Four hours post intranasal infection with 75, 4 × 10³, or 5 × 10³ pfu of MPXV, animals were intraperitoneally administered 100 mg/kg cidofovir (the calculated LD50 for the dormouse MPXV model was 12 pfu). Aggregate data from all challenges showed that animals treated with cidofovir had a mortality rate of 19% (7/36), whereas vehicle-treated animals all (41/41) succumbed to disease. Treatment initiation at later times p.i. was not evaluated; effects on viral load and histopathologic changes were not reported.
As inbred mice have historically shown little disease symptomatology or pathogenesis post monkeypox infection, Stabenow et al. utilized a laboratory mouse strain lacking STAT1 (C57BL/6 stat1-/-), which has been found to be sensitive to a range of viruses including SARS coronavirus, murine norovirus 1, respiratory viruses, dengue virus and MPXV [19,22–25]. These animals are deficient in their ability to transcribe many of the type I and type II interferon receptor response genes. The authors used the Congo Basin clade virus MPX-ZAI-79 to evaluate disease and the protective efficacy of CMX001 and ST-246. In untreated mice, 0% mortality was observed with a 4.7 pfu challenge, 90% mortality with 470 pfu of virus and 100% mortality with 4,700 pfu. Over 25% total body weight loss, and mortality, was observed on or prior to day 10 p.i. in untreated animals. Animals in the treatment studies were subsequently challenged with 5,000 pfu via an IN infection. Animals were then treated with 10 mg/kg of CMX001 by gastric gavage on the day of challenge, followed every other day by 2.5 mg/kg until day 14 p.i. All C57BL/6 stat1-/- mice that were treated with drug survived infection, demonstrated <10% body weight loss between days 10 and 20, and developed a serologic response to monkeypox. Similarly, mice treated daily with 100 mg/kg of ST-246 for 10 days, starting on the day of virus challenge, also survived infection and manifested <10% body weight loss between days 10 and 20. In this system, antiviral-treated animals rechallenged with monkeypox at day 38 post initial infection (at least 10 days post reinitiation of steady weight gain) manifested 20% mortality. The model, again one with a short disease course, is useful for demonstrating immediate post-exposure efficacy of antiviral treatment in the absence of a functioning interferon response system.
Additionally, in this animal model system, perhaps due to the immune defect, a protective immune response to monkeypox was not elicited in all animals receiving antiviral treatment. This observation merits further investigation in other animal model systems.
Smith et al. tested the efficacy of ST-246 in a prairie dog MPXV model. MPXV-challenged prairie dogs have previously been shown to have an asymptomatic period followed by symptoms of disease including lethargy, nasal discharge, inappetence, weight loss and systemic lesion development, most commonly between days 9–12. In the current study, animals were inoculated via an IN challenge with the Congo Basin clade virus ROC-2003-358; this strain differs from those used in the previously described studies but also belongs to the Congo Basin clade. The challenge dose was 3.8 × 10⁵ pfu, equal to 65 × LD50 for the prairie dog model. Animals were divided into three treatment groups: prophylactic (day 0), post-exposure (day 3) and therapeutic (varying day based on rash onset), plus a vehicle-treated control group. ST-246 was formulated at 30 mg/mL and administered daily, by oral gavage, for 14 days. Animals initiating treatment at day 0 or 3 were protected from death and apparent signs of illness. Animals treated at rash onset had symptoms similar to the placebo control group; however, symptoms were less severe in the treated animals. Although all animals treated at rash onset survived infection, they lost 10–24% of body weight and did develop generalized rash (lesions, however, resolved more quickly than in untreated prairie dogs in previous studies). Although asymptomatic, animals in the prophylaxis and post-exposure groups shed viable virus sporadically (from two oropharyngeal samples in the day 0 prophylaxis group, and five samples in the day 3 post-exposure group). More sustained virus shedding was detected in the oropharyngeal samplings of the animals in the therapeutic treatment group, but levels were lower than in the untreated group. Only 1/4 sham-treated animals survived infection, and signs of disease and viral titers were all increased in this group compared to the animals treated with ST-246.
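Challenge doses in these studies are reported as multiples of each model's LD50, so the underlying LD50 can be recovered by a simple ratio. A sketch of that arithmetic using the figures given in the text:

```python
def ld50_multiple(dose_pfu, ld50_pfu):
    """Express a challenge dose as a multiple of the model's LD50."""
    return dose_pfu / ld50_pfu

# Prairie dog model (from the text): a dose of 3.8e5 pfu stated as
# 65 x LD50 implies an LD50 of roughly 5.8e3 pfu.
implied_ld50 = 3.8e5 / 65
print(round(implied_ld50))  # 5846

# Ground squirrel model (from the text): 100 pfu stated as 200 x LD50
# implies an LD50 of 0.5 pfu.
assert ld50_multiple(100, 0.5) == 200.0
```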
This is the first small animal study in which a treatment and survival benefit has been demonstrated when animals are treated at later stages of illness. Initiation of treatment at rash onset is similar to the expectations of a human treatment regimen. The observation of virus shedding after treatment cessation in the prophylactically or post-exposure treated animals merits further study to assess whether this reflects viral resistance or a blunted and delayed immune recognition and ultimate clearance of virus.
Acute viral infections such as influenza also have profound impacts on global health. In contrast to the yearly epidemics caused by seasonal influenza, a pandemic can occur when a new virus emerges in a naive population and is readily transmitted from person to person. The US Centers for Disease Control and Prevention (CDC) estimates that the H1N1 2009 pandemic resulted in 41 to 84 million infections, 183,000 to 378,000 hospitalizations, and nearly 285,000 deaths worldwide. Although the morbidity and mortality of that pandemic were lower than feared, public health professionals continuously monitor for the emergence of more virulent strains.
As an airborne infection, influenza is transmitted easily and quickly, and its effects can be acute, although there is wide variability in response to infection. Much of the heterogeneity in the severity of seasonal influenza infections has been attributed to the degree of acquired immunity in the population affected, patient co-morbidities and the virulence of the strain. Also, influenza epidemics and pandemics are often caused by the introduction of novel viruses for which most people have limited acquired immunity. The emergence of new strains, and the lack of cross-protection by existing vaccines, does not leave much time for vaccine development. In pandemics, including the H1N1 2009 influenza pandemic, healthy young individuals with no co-morbidities have comprised a significant proportion of fatal and severe cases. These pandemics have provided an opportunity to evaluate the host innate immune response among populations without underlying background immunity.
Research has identified genetic factors associated with severity of illness due to influenza [63–65] and death from severe influenza. Genetic information about the immune response to influenza could inform vaccine development and distribution, as well as disease treatment strategies. Several candidate gene studies suggest that variations in HLA class I and other genes contribute to differences in antibody response to influenza vaccines. Ongoing experience with vaccine use has provided opportunities to learn about the potential role of genetics in vaccine safety and efficacy.
An emerging infectious disease (EID) can be defined as ‘an infectious disease whose incidence is increasing following its first introduction into a new host population or whose incidence is increasing in an existing host population as a result of long-term changes in its underlying epidemiology'.1 EID events may also be caused by a pathogen expanding into an area in which it has not previously been reported, or which has significantly changed its pathological or clinical presentation.2
Most infectious disease emergence in humans is caused by pathogens of animal origin, the so-called zoonoses.2,3,4,5 Likewise, cross-over events may occur between non-human species, including between domestic animals and wildlife, and such events also involve transmission from a reservoir population into a novel host population (spill-over).5,6,7 Emergence in a novel host, which includes spill-over/zoonoses, has been extensively studied. An elaborate framework featuring the subsequent stages in the emergence process of a species jump has already been developed, describing how an established animal pathogen, through stages of spill-over and lengthening of the transmission chain in the novel host, may evolve all the way up to an established and genetically consolidated pathogenic agent.8,9,10 However, as implied by the broader definition of EID above, other categories can be distinguished in addition to emergence in a novel host, including disease outbreaks in an existing host or the emergence of a disease complex beyond its normal geographic range.
Here, we will argue that changes in host range, in pathogen traits displayed in the same host, and in the geographic distribution of a disease complex form three distinct sets of complementary and only slightly intersecting disease emergence scenarios. Together, these scenarios present the full picture and range of possible disease emergence dynamics. Hence, we categorize EIDs into three main groups, with emergence of (i) a pathogen in a novel host; (ii) a pathogen with novel traits within the same host; and (iii) a disease complex moving into a novel geographic area. Human actions that modulate the interplay between pathogens, hosts and environment are at the basis of almost all EID events, although the exact drivers and mechanisms differ. For each of the three groups, we will argue how the emergence process is driven by specific sets of causal factors, discuss the changes in disease ecology and transmission, and elaborate on the invasion dynamics and on the characteristics of pathogens that are dominant in each group. Such structuring of the myriad of EID events on the basis of changes in the interplay between pathogens, hosts and environment will assist in better understanding specific EID events and in designing tailored measures for prevention and prediction. Moreover, the framework contributes to understanding the effects of human actions that pave the way for the three distinct emergence scenarios. We propose that the resulting framework applies not just to pathogens affecting humans and animals in agriculture and natural ecosystems; it may also be usefully applied to pest and disease emergence in aquaculture, plant production and insect rearing.
Environmental factors, specifically climate conditions, are the seasonal drivers that have received the most attention. This may be because they often covary with seasonal disease incidence. Environmental drivers are abiotic conditions that influence transmission via their effects on hosts and/or parasites; classic examples are temperature and rainfall, which influence a variety of infectious diseases, but other examples include seasonal nonclimatic abiotic environmental conditions, such as water salinity, which may impact water-borne pathogens. Environmental factors can impact pathogen survival during transitions between hosts. Transitions can take place during short time windows (e.g., for droplet-transmitted infections) or long time windows (e.g., for parasites with environmental life stages). In addition to their impact on pathogens, environmental drivers can also influence host susceptibility to infection or vector population dynamics.
As for host susceptibility, environmental conditions can impact the host immune response and increase cells' susceptibility to infection or pose seasonal challenges (such as food limitations) that leave hosts vulnerable to infection or pathology, which has been proposed to influence disease progression in individuals infected with HIV. For directly transmitted infections, environmental conditions can be major drivers of cycles in incidence, with influenza and cholera transmission being notable examples (e.g., see [3, 80]). The effects of climate on flu transmission have been studied using population-level data coupled with transmission models, as well as empirical animal studies, to demonstrate the effects of temperature and humidity on transmission. Although climate conditions undoubtedly play a direct role in several directly transmitted infections, they may play a more nuanced role in vector-borne disease systems in which they modulate vector population dynamics and subsequently disease transmission. For example, in the case of African sleeping sickness (Table 1), the rainy season is hypothesized to modify tsetse fly distribution, which results in changes in human–tsetse fly contact and subsequently African sleeping sickness incidence; in this case, we can classify the seasonal driver as (1) vector seasonality alone or as (2) seasonal climate influencing vector seasonality and vector seasonality having a downstream effect on seasonal exposure. Abiotic and biotic seasonal drivers are therefore interconnected and not mutually exclusive.
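One common way to make the link between an abiotic seasonal driver and cycles in incidence concrete is a sinusoidally forced SIR model, in which the transmission rate β(t) varies with the season. The sketch below is a generic textbook-style illustration, not a model from the studies cited here, and all parameter values are invented:

```python
import math

def seasonal_sir(beta0=0.3, eps=0.2, gamma=0.1, days=2 * 365, dt=1.0):
    """Discrete-time SIR with sinusoidal seasonal forcing of transmission.

    beta(t) = beta0 * (1 + eps * cos(2*pi*t/365)) stands in for an abiotic
    driver (e.g. temperature or humidity) that raises transmission in one
    season and lowers it in another. Returns daily incidence (fractions).
    """
    s, i, r = 0.99, 0.01, 0.0  # initial susceptible/infected/recovered
    incidence = []
    for t in range(int(days / dt)):
        beta = beta0 * (1 + eps * math.cos(2 * math.pi * t * dt / 365))
        new_inf = beta * s * i * dt     # new infections this step
        recov = gamma * i * dt          # recoveries this step
        s, i, r = s - new_inf, i + new_inf - recov, r + recov
        incidence.append(new_inf)
    return incidence

inc = seasonal_sir()
print(f"peak daily incidence: {max(inc):.4f}")
```

The same skeleton extends naturally to the vector-borne case discussed above by letting the forcing act on vector abundance rather than directly on β, which is one way to see why the two classifications of the tsetse fly example are hard to disentangle.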
Since the earliest documented epidemics of plague, leptospirosis, viral hemorrhagic fevers, and rabies, we have known that humans and our domestic animals can become ill after contact with other animals. Most animal pathogens can infect multiple host species, and pathogen spillover from one host species to another is common. Pathogen spillover has been defined as scenarios in which disease occurrence in a focal population depends on a distinct reservoir source that maintains the pathogen indefinitely. Thus, controlling spillover diseases is complicated by the need to manage not only cases in the target population but also transmission interfaces and reservoir populations. Broadly, a disease reservoir is the source of new cases in a target population. A better understanding of the types of species that form reservoirs will therefore facilitate the management of many emerging infectious diseases.
However, what precisely constitutes a reservoir has been the subject of debate [4–6]. Haydon et al. defined a reservoir as “one or more epidemiologically connected populations or environments in which the pathogen can be permanently maintained and from which infection is transmitted to the defined target population.” Following this definition, one pathogen may cause disease in multiple target populations, and the reservoir for each target population can be different. To identify reservoirs, researchers must find evidence of natural infection and spillover. Here, we follow the Haydon et al. definition to study reservoirs of spillover pathogens.
A number of recent studies have attempted to answer the question, “Are reservoirs special?”, with the goal of identifying likely disease sources. Ungulates, carnivores, and rodents are the source of most zoonoses, yet recent outbreaks suggest that bats may be a uniquely important reservoir of human pathogens. Reservoir species with fast life history characteristics appear to have higher reservoir competence and contribute disproportionately to cases of some zoonotic diseases. In a study of reservoirs for tick-borne zoonotic diseases, the mammalian species with the fastest life history traits also exhibited the highest reservoir competence. Species with high reproductive output produce a large number of naïve host individuals that can sustain pathogens, even those that induce long-lasting host immunity. For example, the persistence of classical swine fever in pigs and wild boar has been attributed, in part, to high birth rates that maintain a large population of susceptible individuals. Further, because of trade-offs between pace of life and the immune system investment, fast-lived species may exhibit greater reservoir competence through a proneness to acquire, maintain, and transmit pathogens.
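The persistence mechanism described above, in which high reproductive output replenishes the pool of naïve susceptibles, can be illustrated with an SIR model that includes births and deaths: with demographic turnover, infection can persist endemically rather than burning out. A generic sketch with invented parameters (not fitted to classical swine fever or any real system):

```python
def sir_with_demography(mu, beta=0.5, gamma=0.2, years=100, dt=0.1):
    """SIR with per-capita birth/death rate mu (all rates per day).

    Newborns enter the susceptible class; a higher birth rate replenishes
    susceptibles faster and supports a higher endemic prevalence.
    Returns the infected fraction after `years` of simulation.
    """
    s, i, r = 0.99, 0.01, 0.0
    steps = int(years * 365 / dt)
    for _ in range(steps):
        ds = mu - beta * s * i - mu * s
        di = beta * s * i - gamma * i - mu * i
        dr = gamma * i - mu * r
        s, i, r = s + ds * dt, i + di * dt, r + dr * dt
    return i  # approximate endemic prevalence

low_birth = sir_with_demography(mu=1 / (10 * 365))   # ~10-year host lifespan
high_birth = sir_with_demography(mu=1 / (2 * 365))   # ~2-year, fast-lived host
print(high_birth > low_birth)  # faster host turnover sustains more infection
```

At equilibrium the infected fraction is i* = μ(1 − s*)/(γ + μ) with s* = (γ + μ)/β, so prevalence scales roughly with the birth rate μ, which is the quantitative version of the fast-life-history argument in the text.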
To summarize characteristics of reservoirs and determine what makes reservoirs special, we assembled and analyzed a database of pathogens, their targets, and their known reservoirs to address the following questions: 1) What are the most represented taxa among reservoirs of spillover pathogens? 2) What are the characteristics and common taxa of reservoirs for pathogens that are zoonotic “high priority” (defined below) or have epidemic potential? And 3) Are mammalian reservoir species distinct in their life history traits compared to all mammals? Given the strong research bias in favor of human and domestic animal pathogens, we expect that most known disease reservoirs will include mammals. We also hypothesize that, more often than not, reservoirs will include multiple species and wildlife. Because past studies suggested that faster-lived species may be more competent hosts, we predict that reservoir species will exhibit faster life histories.
A “disease” is any condition that impairs the normal function of a body organ and/or system, of the psyche, or of the organism as a whole, and which is associated with specific signs and symptoms. Factors that lead to impairment of organ and/or system function may be intrinsic or extrinsic. Intrinsic factors arise from within the host and may be due to the genetic features of an organism or any disorder within the host that interferes with the normal functional processes of a body organ and/or system. An example is the genetic disease sickle cell anaemia, characterized by pain leading to organ damage due to defective haemoglobin in red blood cells, which results from a single base substitution in the gene encoding one of the protein chains of haemoglobin. Extrinsic factors are those that access the host's system when the host contacts an agent from outside. An example is the bite of a mosquito of the genus Anopheles that transmits the parasite Plasmodium falciparum, which causes malaria. A disease that occurs through the invasion of a host by a foreign agent whose activities harm or impair the normal functioning of the host's organs and/or systems is referred to as an infectious disease [1–3].
Infectious diseases are generally caused by microorganisms. They derive their importance from the type and extent of damage their causative agents inflict on organs and/or systems when they gain entry into a host. Entry into the host is mostly by routes such as the mouth, eyes, genital openings, nose, and the skin. Damage to tissues mainly results from the growth and metabolic processes of infectious agents, intracellularly or within body fluids, and from the production and release of toxins or enzymes that interfere with the normal functions of organs and/or systems. These products may be distributed to, and cause damage in, other organs and/or systems, or may so impair function that the pathogen consequently invades more organs and/or systems.
Naturally, the host's elaborate defence mechanism, the immune system, fights infectious agents and eliminates them. Infectious disease results or emerges when the immune system fails to eliminate pathogenic infectious agents. Thus, all infectious diseases emerge at some point in time in a given population and in a given context or environment. By understanding the dynamics of a disease and the means of contracting it, methods of fighting, preventing, and controlling it are developed [2, 5, 6]. However, some pathogens, after apparent elimination and a period of dormancy, are able to acquire properties that enable them to reinfect their original or new hosts, usually in increasingly alarming proportions.
Understanding how once dominant diseases are reappearing is critical to controlling the damage they cause. The world is constantly faced with challenges from infectious diseases, some of which, though having pandemic potential, either receive less attention or are neglected. There is a need for constant awareness of infectious diseases and advances in control efforts to help engender appropriate public health responses [7, 8].
Prior to 2000, wildlife diseases were mostly studied to improve zoo animal survival and welfare, with little published on the diseases of free-living wildlife unless they affected heavily hunted species (e.g. deer in North America) or were considered a threat to livestock health (e.g. tuberculosis, rinderpest). While non-infectious diseases had been widely recognized as important drivers of species declines (e.g. DDT poisoning of raptors), only a small number of researchers investigated infectious disease as a factor in, often covert, wildlife population regulation. The role of infectious diseases in mass mortality events or population declines was often considered controversial or secondary to other factors, and their role in species extinctions often disputed. The first definitive identification of disease as a cause of species extinction was published in 1996 following the demise of the last population of the Polynesian tree snail P. turgida due to a microsporidian infection. This added to evidence that infectious agents had caused the extinction in the wild of the black-footed ferret, the extinction of around one-third of Hawaiian honeycreepers and the slime mould-induced decline of eelgrass (Zostera marina) beds in the USA, leading to extinction of the eelgrass limpet (Lottia alveus) [14,25–27]. During the 1990s, wildlife mortality events caused by infectious diseases were reported in zoos, in wildlife translocation programmes and in other conservation programmes [28–32]. Perhaps the most important of these was the discovery of amphibian chytridiomycosis, caused by the chytrid fungal pathogen Batrachochytrium dendrobatidis, which was first recognized in the 1990s and has since been implicated in the decline or extinction of over 200 species of amphibian. 
This disease continues to threaten amphibians globally and has been described as ‘the worst infectious disease ever recorded among vertebrates in terms of the number of species impacted, and its propensity to drive them to extinction’.
Amphibian chytridiomycosis appears to have emerged contemporaneously in Australia and Central America, associated with large-scale die-offs and extinction events, although in retrospect it might have been causing amphibian mortalities and declines in North America prior to this. Proving that a disease is a cause of population declines in wildlife requires longitudinal population and pathogen data, which are often very difficult to collect. Thus, a series of papers disputing the role of chytridiomycosis in amphibian declines ensued, with most suggesting that this disease either emerged secondarily to other factors, or that it was not the cause of declines/extinctions [37–40]. Long-term datasets have since been published which provide convincing evidence that amphibian chytridiomycosis alone can cause mass mortalities leading to population declines. Policy measures to control amphibian chytridiomycosis, however, have been slow to be enacted, with the first international policy measure (listing of chytridiomycosis by the World Organisation for Animal Health) occurring in 2010 and with the implementation of measures recognized to mitigate the spread of this disease still not being enacted by the international community.
Public and political reaction to the more recent emergence of white nose syndrome (WNS) in North American bats provides evidence that the conservation implications of wildlife EIDs are becoming more widely accepted. The causative agent of WNS is the fungus Pseudogymnoascus destructans, which colonizes the skin of a range of temperate-zone bats, often causing death during hibernation. Only 1 year after the initial discovery of the disease in the USA in January 2007, visitors to bat caves across the country were being advised to reduce visits and to implement biosecurity measures, and by 2009, caves in over 20 states were closed to the public. The disease has been the focus of a series of grants, the formation of multi-disciplinary research partnerships and significant efforts to identify pathogenesis, transmission pathways and potential control measures.
Although there is growing recognition of the impact of pathogens on wildlife, the significance of infectious disease as a cause of historical extinctions is likely underestimated, owing to a previous relative lack of infectious disease focus and diagnostic capability. Collaboration among ecologists, conservation biologists and veterinary pathologists is relatively recent, and increased pathological and epidemiological involvement in studies of the causes of wildlife declines is critically needed to identify and understand disease threats to wildlife and how to mitigate them.
There are several theories regarding the cause of the first reported case of BSE in the mid-1980s; some insist that the BSE pathogen (PrPSc) formed naturally, while others claim that the disease was caused by cow feed made from sheep infected with scrapie. Extensive epidemiologic investigation identified the main cause of BSE as meat and bone meal (MBM) made from the discarded bones and intestines of slaughtered cows and sheep. In the UK, in particular, cow intestines had been used in MBM as a protein supplement since 1972, which accelerated the increase in the occurrence of BSE.
BSE has also occurred in European countries that imported MBM from the UK; according to statistics from the World Organisation for Animal Health (Office International des Epizooties; OIE), there had been 190,628 BSE cases in 25 countries worldwide as of August 30, 2012 (http://www.oie.int). Most reported cases are from the UK, where the epidemic peaked in 1992; in other countries it peaked in 2002 or 2003, and the number of cases has decreased sharply since.
BSE is a chronic degenerative neurological disease of cattle; part of the brain becomes sponge-like, and the animal exhibits many different kinds of neurological symptoms and paralysis, eventually leading to death. In BSE, nerve cells and central nervous tissues take on a sponge-like form. After approximately 2–5 years of incubation, the animal dies within approximately 2 weeks to 6 months of the onset of disease. Clinical symptoms include extreme sensitivity to external stimuli such as light and sound, behavioural changes (depression and nervousness), positional imbalance, inability to stand straight or move, paralysis of the hind legs, and paralysis of the whole body before death.
At present, BSE is under surveillance by the OIE; in Korea, it is classified in the second category of animal epidemics, along with scrapie and CWD. BSE has no effect on cattle younger than 7 months; by the time cattle reach 24 months of age, many variant prions are present in the body. Most occurrences of BSE are in cattle older than 36 months. Therefore, the OIE examines the occurrence of BSE in cattle from 24 months of age.
In the UK, more than 184,000 cases of BSE have been reported, and more than 3 million cattle were destroyed to stop the spread of the disease; the UK also strictly banned MBM. Owing to these efforts, the occurrence of BSE was dramatically reduced. Since the 2000s, however, the disease has been reported worldwide, including in the USA, Japan, Israel, and various African countries. Determining the precise number of occurrences is challenging, as some infected animals do not exhibit any particular symptoms; without total inspection and surveillance, it is difficult to establish the actual status of the disease.
Therefore, the EU places much emphasis on active monitoring and surveillance, including total inspection, thorough removal of specified risk material (SRM; where 99% of the pathogenic prions reside), banning MBM, and monitoring animal feed. Through such actions BSE has become manageable, but it has still not been eradicated. The USA has also begun to emphasize the development of an effective animal-monitoring system out of concern for human health.
However, some BSE cases have been reported even after stricter surveillance was put in place, which suggests that the disease cannot be controlled by monitoring animal feed alone. There is scientific evidence bearing on this: pathogenic prions shed in the feces of TSE-infected animals can be absorbed into the soil, bind to soil minerals, and become stable. Although BSE does not appear to be transmitted horizontally within species, such findings suggest that epidemiologic investigations should take more precautionary approaches, including studying the possibility of transmission via a contaminated environment.
BSE-infected cattle point to the possibility of self-mutation of the BSE prion, since a mutated form of the prion gene that causes vCJD in humans was found in the brains of affected cattle. This implies that wide-ranging monitoring at the DNA and/or protein level is necessary in addition to a strict animal feed policy. With regard to the transmission of BSE to humans, controlling SRM is the most important step. Based on recent findings on the relationship between SRM and the occurrence of the disease, the EU issued guidelines in April 2008 for its member countries regarding SRM. According to these guidelines, the tonsils, whole intestine, and mesentery are classified as SRM at all ages, while the brain, eyes, spinal cord, and skull are considered SRM in cattle older than 12 months.
Some older cattle have an atypical form of BSE (BASE), which differs from typical BSE in its molecular and biochemical properties; it appears to be a sporadic form of BSE, although more precise etiological studies are needed for confirmation. BASE has recently attracted attention [47–49] because of its infectivity and its relation to vCJD.
PrPSc in infected animals is concentrated in specific tissues, collectively called specified risk material (SRM): the brain, eyes, spinal cord, skull, vertebral column, tonsils, and distal ileum. These are the most crucial targets for disease management and control. The disease is transmissible via surgical tools that have come into contact with SRM and via blood transfusion. Because blood rarely contains the prion, it was considered safe until a death from vCJD caused by a blood transfusion was reported in the UK, alarming the public. Following this case, the UK spent more than £200 million on processes to protect surgical tools against prion transmission. This shows that occurrences of BSE or vCJD, although infrequent, can incur enormous indirect expenses.
The species barrier makes it difficult for infectious diseases to be transmitted from one species to another. Based on studies of BSE zoonosis, the species barrier for prion disease transmission between cattle and humans has been estimated at 4,000. A precautionary principle, however, assumes a species barrier of 1 between humans and cattle, i.e., that the same dose that causes the disease in cattle may affect humans in the same way. Experimentally, a single oral dose of as little as 0.001 g of SRM can induce disease, and 10 g leads to BSE in all cattle so dosed. The amount required to induce the disease is thus very small; a scientific report submitted to the British Council in 2001 states that an amount as small as one speck of pepper may cause the disease. In a primate (cynomolgus macaque), 5 g of an oral inoculum of brain homogenate from a BSE-infected cow resulted in a vCJD-like neurological disease 60 months after exposure.
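The contrast between the estimated species barrier and the precautionary assumption can be made concrete with a short calculation. The figures below are taken from the text; the scaling itself is a simplification for illustration, not an established risk model.

```python
# Illustrative arithmetic for the species-barrier figures quoted above.
# Values come from the text; the linear scaling is a simplification.

CATTLE_MIN_ORAL_DOSE_G = 0.001   # smallest oral SRM dose reported to induce BSE in cattle
SPECIES_BARRIER = 4000           # estimated cattle-to-human species barrier

# If the barrier holds, the human-equivalent minimum dose scales up accordingly:
human_dose_with_barrier = CATTLE_MIN_ORAL_DOSE_G * SPECIES_BARRIER   # 4.0 g

# Under the precautionary assumption of a barrier of 1, the human dose
# is taken to equal the cattle dose:
human_dose_precautionary = CATTLE_MIN_ORAL_DOSE_G * 1                # 0.001 g

print(human_dose_with_barrier, human_dose_precautionary)
```

The four-order-of-magnitude gap between these two assumptions is precisely why the precautionary interpretation drives SRM policy.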
Since such a small dose can cause disease in both humans and animals, and there is currently no cure, experiments on PrPSc agents transmissible to humans must be performed at Biosafety Level 3, in the same manner as other highly dangerous infectious agents (e.g., the anthrax bacterium, the SARS virus, and West Nile virus). Notably, studies of pathogenic prions have revealed many different strains of the protein particles, which remain under investigation. The discovery of new strains bears on the species barrier and has many implications for disease control in relation to the adaptation and progression of TSE.
Accordingly, the EU, which has conducted extensive research on BSE and vCJD, defines any beef that has had contact with any SRM as SRM itself, and advises against using any cosmetic or food items containing SRM, although transmission via cosmetics or food has not yet been reported. In BSE-infected cattle, PrPSc is also found in peripheral nerves; consequently, the whole body of an infected animal is disposed of under EU regulations.
It is noteworthy that Peyer's patch tissue, the principal site of PrPSc absorption, is located mostly in the ileum in humans, whereas in cattle similar tissues are found throughout the intestine, including the mesentery. The EU therefore defines the whole bovine intestine as SRM, a classification whose evidence base is reviewed annually. Recently, Switzerland submitted a request (EFSA-Q-2009-00226) to the European Food Safety Authority (EFSA) to reassess the use of bovine intestines for sausage casings. The request was declined, demonstrating the cautious approach of international bodies to the consumption of bovine intestine. Koreans likewise need to take precautionary action in consuming bovine intestines.
There is growing awareness that drivers of disease emergence modulate the interplay between pathogens, hosts and environment.2,4,20,21,22,23 An EID event can be considered as a shift in the pathogen–host–environment interplay characteristics. Changes in the host–environment and the disease ecology are key to creating novel transmission patterns and selection of novel pathogens with fitter genetic traits. This process will finally result in a novel steady state pathogen–host–environment interplay. Yet, it is difficult to tell what this new pattern will look like until it materializes.
A key factor influencing both the likelihood and outcome of disease emergence is pathogen invasiveness, i.e., the ability of a pathogen to emerge. Such invasiveness is determined by a combination of pathogen traits, including opportunism and evolvability.20,24,25 Notably, RNA viruses with an inherently high mutation rate, bacteria capable of acquiring genetic material, and pathogens infecting multiple hosts are more likely to turn into emerging disease agents.2,4,8,24,26 On the other hand, emergence is influenced by the invasibility of the host–environment, in terms of the individual body, host population structure, host community composition and mosaic, and the associated landscape and its resilience towards pathogen invasion.25,27 The role of the environment also extends to temperature and humidity in pathogen environmental survival and transmission, seasonality in the abundance and distribution of arthropod vectors, and geographic, physical, or chemical barriers.
Based on the collected healthcare utilization data, we evaluated how the sensitivity and representativeness of a surveillance system may be improved by integrating other healthcare providers. We classified healthcare providers as (i) surveillance hospitals, (ii) other hospitals (government and private clinics), (iii) qualified private practitioners, and (iv) the informal sector (unqualified practitioners such as traditional healers, village doctors, homeopaths, and pharmacies). We estimated the proportion of cases attending each healthcare provider class, with exact binomial confidence intervals, and estimated outbreak detection probabilities based on proportions attending the surveillance hospital plus (i) other hospitals, (ii) qualified private practitioners, or (iii) informal healthcare providers. Furthermore, we compared the proportion of cases with each characteristic (sex, age, and socioeconomic group) among community cases to the proportion among those attending each healthcare provider class and quantified absolute differences in proportions with 95% CIs and p-values using bootstrapping (2,000 bootstrap iterations).
All statistical analyses and graphics were implemented in the R computing environment; maps were created using QGIS software.
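The two key estimation steps described above (exact binomial confidence intervals for provider-attendance proportions, and bootstrapped differences in proportions between community cases and provider attendees) can be sketched as follows. The original analysis was done in R; this is an illustrative Python translation on simulated data, with the Clopper-Pearson interval implemented directly from the beta-quantile formula. All variable names and the simulated samples are hypothetical.

```python
import numpy as np
from scipy.stats import beta

rng = np.random.default_rng(0)

def clopper_pearson(k, n, alpha=0.05):
    """Exact (Clopper-Pearson) binomial CI for k successes in n trials."""
    lo = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lo, hi

def bootstrap_prop_diff(x, y, n_boot=2000, alpha=0.05):
    """Percentile bootstrap CI for the difference in proportions of two binary samples."""
    x, y = np.asarray(x), np.asarray(y)
    diffs = np.empty(n_boot)
    for b in range(n_boot):
        # resample each group with replacement and take the difference in means
        diffs[b] = rng.choice(x, x.size).mean() - rng.choice(y, y.size).mean()
    lo, hi = np.percentile(diffs, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return x.mean() - y.mean(), lo, hi

# Hypothetical binary indicator (e.g., sex) among community cases vs.
# cases attending the surveillance hospital:
community = rng.integers(0, 2, 400)
hospital = rng.integers(0, 2, 150)

print(clopper_pearson(community.sum(), community.size))
print(bootstrap_prop_diff(community, hospital))
```

With 2,000 bootstrap iterations, as in the study, the percentile interval gives the 95% CI for the absolute difference in proportions; a two-sided p-value can be read off analogously from the bootstrap distribution.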
The spread of infectious diseases strongly depends on how habitat characteristics shape patterns of between-host interactions. In particular, habitat heterogeneity influences patterns of between-individual contacts and, hence, disease dynamics. For example, "habitat hotspots", sites that attract individuals or social groups over long distances, can be visited by a large subset of a population. Around hotspots, between-individual contact rates often increase in frequency, which amplifies disease transmission. In humans, schools and workplaces are typical examples of hotspots and have been shown to accelerate the spread of measles, influenza, and SARS. Thus, limiting transmission at hotspots has become a promising strategy for mitigating epidemics (e.g., influenza), although the efficiency of such strategies also depends on the role hotspots play relative to other sources of local transmission.
In wild animal populations, high-quality feeding spots (e.g., fruit trees), breeding sites, waterholes, and sleeping sites can exacerbate direct physical contacts. Empirical and theoretical studies on the epidemiological importance of habitat hotspots have mainly focused on how the spatial aggregation of animals favors disease transmission at the hotspot itself. For example, the aggregation of wild boar at watering sites significantly increases the transmission of tuberculosis-like lesions. However, inter-individual contacts do not always increase significantly at the hotspot itself. This is the case, for example, for habitat hotspots that some animal species visit only occasionally, such as some mineral licks. Also, animals present at the same time at a particularly large hotspot, such as a large forest clearing or waterhole, may not be close enough to each other to transmit infectious diseases. Finally, species such as primates and ungulates might avoid defecating in hotspots of high food resources, limiting the transmission of fecal-oral parasites there.
When disease transmission does not occur at the hotspot itself, it can still occur at some distance from it, a phenomenon that has received little attention so far. Specifically, infective contacts may occur when infectious individuals travel to the hotspot and cross the territory of susceptible individuals and, conversely, when susceptible individuals cross the territory of infectious individuals. This second type of transmission may be prominent when the disease reduces the mobility of sick individuals (i.e., sickness behavior). For example, sick humans often stay home, which alters disease dynamics, and sick wild animals commonly reduce their rate of search for food or water. Such transmission may particularly apply to parasites that can survive in the environment (e.g., gastrointestinal parasites), for which the spatial overlap of the home ranges of sympatric hosts favors transmission.
To investigate these transmission mechanisms, we developed an agent-based model exploring patterns of disease spread in a large closed population composed of territorial social groups, in which one or more hotspots influence group movement patterns but direct disease transmission at the hotspot itself is negligible. Our hypothesis is that terrestrial animals necessarily cross conspecifics' home ranges on their way to a hotspot, which modifies the contact network of the population and may subsequently alter disease transmission. We assumed that between-group disease transmission can occur both between groups with neighbouring territories and between groups travelling to a hotspot and the groups whose territories they cross en route. We also assumed that only groups whose territory lies within a certain distance of the hotspot (hereafter the "radius of attraction") can visit it, and that their visitation rate decreases as this distance increases.
The relationship between the radius of attraction and disease dynamics was then investigated under two scenarios: (i) groups containing sick individuals do not travel to the hotspot, and (ii) these groups still travel to the hotspot. The first scenario corresponds to virulent parasites that strongly decrease the mobility of infected individuals, such as Ebola virus in western lowland gorillas, whereas the second applies to pathogens that do not strongly modify the behavior of their host, such as some gastrointestinal macro-parasites and bacteria. Under both scenarios, we investigated the relationship between the disease attack rate and the hotspot radius of attraction, identified the groups in the population with the highest risk of infection, and explored the relationship between the number of hotspots and the magnitude of an epidemic.
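The model structure described above can be sketched in miniature. The toy simulation below is not the authors' model; it is a minimal illustration of the stated assumptions: groups on a territorial lattice, neighbour-to-neighbour transmission, en-route transmission for groups travelling toward a central hotspot within a radius of attraction (with visit rate decreasing with distance), no transmission at the hotspot cell itself, and a switch for whether sick groups still travel. All parameter values are arbitrary.

```python
import random

random.seed(1)

GRID = 15                           # groups on a GRID x GRID territory lattice
HOTSPOT = (GRID // 2, GRID // 2)    # single hotspot at the centre
RADIUS = 6.0                        # hotspot "radius of attraction"
P_NEIGHBOR = 0.05                   # per-step transmission between adjacent territories
P_ROUTE = 0.05                      # per-step transmission along travel routes
P_RECOVER = 0.1                     # per-step group recovery probability

def dist(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def route(g):
    """Territories crossed en route from group g to the hotspot (hotspot cell excluded)."""
    cells, (x, y) = [], g
    while (x, y) != HOTSPOT:
        x += (HOTSPOT[0] > x) - (HOTSPOT[0] < x)
        y += (HOTSPOT[1] > y) - (HOTSPOT[1] < y)
        if (x, y) != HOTSPOT:
            cells.append((x, y))
    return cells

def simulate(sick_travel):
    """SIR dynamics at group level; returns the final attack rate."""
    state = {(i, j): "S" for i in range(GRID) for j in range(GRID)}
    state[(0, 0)] = "I"   # seed the epidemic in a corner group
    for _ in range(1000):
        infectious = [g for g, s in state.items() if s == "I"]
        if not infectious:
            break
        new_inf = set()
        # transmission between neighbouring territories
        for g in infectious:
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                n = (g[0] + dx, g[1] + dy)
                if state.get(n) == "S" and random.random() < P_NEIGHBOR:
                    new_inf.add(n)
        # en-route transmission for groups travelling to the hotspot
        for g, s in state.items():
            d = dist(g, HOTSPOT)
            if d > RADIUS or s == "R" or (s == "I" and not sick_travel):
                continue
            if random.random() > 1 - d / RADIUS:   # visit rate falls with distance
                continue
            for cell in route(g):
                if s == "I" and state.get(cell) == "S" and random.random() < P_ROUTE:
                    new_inf.add(cell)
                if s == "S" and state.get(cell) == "I" and random.random() < P_ROUTE:
                    new_inf.add(g)
        for g in infectious:
            if random.random() < P_RECOVER:
                state[g] = "R"
        for g in new_inf:
            if state[g] == "S":
                state[g] = "I"
    return sum(s != "S" for s in state.values()) / len(state)

print("attack rate, sick groups stay home:  ", simulate(False))
print("attack rate, sick groups still travel:", simulate(True))
```

Comparing the two calls mirrors the paper's two scenarios: the second run adds infectious travellers to the contact network, so the hotspot's en-route effect on the attack rate can be isolated by varying `RADIUS`.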
The third and final step is to design intervention approaches. The goal is to design barriers to interrupt critical pathways at critical times and locations. These interventions may include: (1) sustainable engineering technologies for human and animal water/wastewater/waste management, (2) medical and veterinary interventions to manage infections, and (3) education of local communities and governance to modify human behavior, current practices and policy based on relationships between environmental health, human health and animal health.
The foremost interventions for waterborne viruses are drinking water and wastewater treatment facilities. Both types of treatment plant provide the most immediate barrier between a drinking water source and consumption, and both use several unit processes, such as filtration and disinfection, to remove pathogens, including viruses, from water. Both have been shown to be effective at reducing human virus concentrations from influent to effluent, although wastewater treatment plants may still release viruses in effluent [14,75]. Interventions also exist to prevent non-point-source pollution of water sources; one such intervention is the watershed protection plans set by the states. Similarly, stormwater management is implemented in several states based on multiple strategies.
Numerous policy measures in the United States and abroad can be considered permanent implementations of such interventions. Since the Safe Drinking Water Act of 1974, a number of additional policies have been put into effect to strengthen water quality and prevent disease. The EPA sets Maximum Contaminant Levels (MCLs) for several contaminants, including viruses; drinking water treatment facilities are required to achieve a 4-log reduction in viral concentration to meet the MCL for viruses. Another example, the Ground Water Rule, took effect in 2006 and requires regular surveillance of groundwater sources used for drinking water to ensure that MCLs for pathogens are met. Internationally, the Guidelines for Drinking-water Quality issued by the WHO are used as a basis for setting regulations. There remains a need, however, for the regulation of animal waste products, especially in rural areas in which animal waste is a critical pathway for viral transport.
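The "4-log reduction" requirement mentioned above is a ratio on a base-10 logarithmic scale: each log corresponds to a ten-fold reduction, so 4 logs means 99.99% removal. A minimal sketch, using hypothetical influent and effluent concentrations:

```python
import math

def log_reduction(influent, effluent):
    """Log10 reduction between influent and effluent pathogen concentrations."""
    return math.log10(influent / effluent)

# Hypothetical concentrations (viral particles per litre) chosen so the
# plant exactly meets the 4-log requirement:
influent = 1e6
effluent = 1e2

print(log_reduction(influent, effluent))   # 4.0
print(1 - effluent / influent)             # 0.9999 -> 99.99% removed
```

The same formula applies to any unit process in the treatment train; log reductions of successive processes add, which is why plants combine filtration and disinfection to reach the overall target.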
The modification of human behavior is also imperative to minimize the transmission of viral disease along pathways where interventions cannot be performed for reasons of cost, capability, or convenience. The primary method of altering behavior is education: of medical professionals, both physicians and veterinarians, and of environmental professionals in One Health approaches. It is also critical to educate the public to prevent situations in which people leave themselves vulnerable to disease transmission. Especially in impoverished, high-risk areas, robust measures should be taken to educate the public about the critical pathways of viral disease transmission.