Dataset: 11.1K articles from the COVID-19 Open Research Dataset (PMC Open Access subset)
All articles are made available under a Creative Commons or similar license. Specific licensing information for individual articles can be found in the PMC source and CORD-19 metadata.
More datasets: Wikipedia | CORD-19
Made by DATEXIS (Data Science and Text-based Information Systems) at Beuth University of Applied Sciences Berlin
Deep Learning Technology: Sebastian Arnold, Betty van Aken, Paul Grundmann, Felix A. Gers and Alexander Löser. Learning Contextualized Document Representations for Healthcare Answer Retrieval. The Web Conference 2020 (WWW'20)
Funded by The Federal Ministry for Economic Affairs and Energy; Grant: 01MD19013D, Smart-MD Project, Digital Technologies
Common bean, Phaseolus vulgaris, represents a great source of nutrition for millions of people and is the second most important legume crop, after soybean. It is the target of multiple pests and diseases that cause substantial losses. For example, on susceptible bean cultivars, bean rust, caused by Uromyces appendiculatus, may cause yield reductions from 18 to 100% under favorable environmental conditions, such as high moisture and temperatures between 17 and 27°C. Among the five stages of the bean rust life cycle (basidia, pycnia, aecia, uredinia, and telia), the most devastating on bean is the uredinial stage. The latent period between the germination of a urediniospore and the formation of a sporulating pustule can be as short as 7 days. Signs of infection by Uromyces appendiculatus include the presence of uredinia, or spore-producing pustules, on the surface of the leaf. The identification of fungal proteins from quiescent and germinating urediniospores has enhanced our understanding of the infection process of this fungus.
Based upon mapping and quantitative trait loci (QTL) analysis, several genes involved in Colletotrichum lindemuthianum (Co; anthracnose) resistance and other resistance genes for bean common mosaic virus (BCMV), bean golden yellow mosaic virus (BGYMV), common bacterial blight, and bean rust are clustered. The large number of resistance (R) genes for bean rust may correlate with the high pathogen population diversity, with 90 different races identified. The locus Ur-3 confers resistance to 44 out of the 89 U. appendiculatus races present in the USA. Besides the Ur-3 locus, a number of other R genes have been identified in bean, such as locus Ur-4 for race 49, locus Ur-11, epistatic to Ur-4, for race 67, and locus Ur-13, mapped to linkage group B8. To date, no large-scale transcriptomic analysis of bean rust infection has been performed to better understand the mechanism of resistance. All of these Ur genes are effective against one specific rust strain, following the gene-for-gene resistance theory. Consequently, gene pyramiding was used to produce cultivars carrying multiple resistance genes. Unfortunately, such resistance may prove effective in the field for only a short time because the fungus adapts to overcome plant defenses. Consequently, unraveling the mechanisms downstream of these R genes is a key research goal to circumvent the adaptation of the fungus to plant resistance.
We investigated the Phaseolus vulgaris–Uromyces appendiculatus pathosystem at the transcriptional level for a better understanding of the plant response to fungal infection. In this study, we developed a suppression subtractive hybridization (SSH) library from the common bean cultivar Early Gallatin, which is susceptible to U. appendiculatus race 41 (virulent strain) but resistant to U. appendiculatus race 49 (avirulent strain). The resistance to U. appendiculatus is conferred by the Ur-4 gene in this cultivar, which leads to a hypersensitive response (HR) in the presence of pathogen race 49. This bean cDNA library was enriched in expressed sequence tags (ESTs) that are potentially up-regulated in the compatible and incompatible interactions. More than 20,000 clones from the SSH library were sequenced and assembled into contigs. A total of 10,221 P. vulgaris sequences and 360 U. appendiculatus sequences were added to the NCBI database, significantly increasing the number of ESTs available for common bean. The regulation of 90 genes was confirmed by quantitative real-time polymerase chain reaction (qRT-PCR), revealing 3 main expression patterns and highlighting gene regulation that occurs downstream of R protein activation.
The median probability (PContBatch) that a batch of blood collected at the abattoir abroad is contaminated with PEDV, due to the slaughtering of at least one infected pig, was 6.0 % (90 % Prediction Interval: 0.3 %; 22.6 %).
We found that if the batch of blood collected at the abattoir could be tested before the spray-drying process, the PPlasma of scenario II (where storage is not sufficient for complete removal of all remaining viable PEDV) would be reduced from 4.7 to 0.3 % (90 % P.I.: 0.004 %; 6.0 %). The latter would correspond to at least 1 introduction every 333 years, on average.
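The conversion from an annual introduction probability to an average waiting time is simple arithmetic, and the median with a 90 % prediction interval can be summarized from Monte Carlo draws. The sketch below is illustrative only: the Beta distribution standing in for the risk model's output is an assumption, not the study's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the risk model's Monte Carlo output: draw
# per-batch contamination probabilities from an assumed Beta distribution
# (parameters are illustrative, not taken from the study).
p_cont_batch = rng.beta(0.5, 10.0, size=100_000)

median = np.median(p_cont_batch)
pi_low, pi_high = np.percentile(p_cont_batch, [5, 95])  # 90 % prediction interval

# Converting an annual introduction probability into an average waiting
# time between introductions: T = 1 / p_annual.
p_plasma_annual = 0.003                    # 0.3 % per year, from the text
mean_years_between = 1 / p_plasma_annual   # about 333 years
```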
RNA extraction and cDNA synthesis from leaf tissues of common bean cv. Early Gallatin infected with either U. appendiculatus race 41 (virulent strain) or race 49 (avirulent strain), and from soybean tissues infected by P. pachyrhizi, were performed as described by Libault et al., 2008. Briefly, RNAs were extracted from ground frozen tissues using TRIzol® reagent (Invitrogen, Carlsbad, CA) and purified by two phenol/chloroform extractions. The RNAs were treated with TURBO DNA-free enzyme (Ambion) to remove all DNA contaminants. cDNA was synthesized from 5 μg of RNA using MMLV reverse transcriptase (Promega, Madison, WI).
From 2004 to 2013, there were 77.76 thousand newly notified pulmonary TB cases in Wuhan, China (Table 1). Figure 1 shows the original time series of pulmonary TB cases from January 2004 to December 2013. Wuhan is located in Hubei Province, in central China, and has a subtropical wet monsoon climate. Based on the temperature data and the climatic seasonal division method, seasons in Wuhan were defined as spring (March–May), summer (June–September), autumn (October–November), and winter (December–February). There was a dominant spring peak (March) and a second summer peak (September), along with a trough in December.
Figures 2, 3 and 4 display the three components produced by the TRAMO-SEATS program: the trend component, the seasonal component, and the irregular noise component. There was an upward trend from 2004 to 2005, a decreasing trend from 2005 to 2010, and then a slow upward trend from 2010 to 2013. From the isolated seasonal component, it was found that the seasonal amplitude decreased each year from 2004 to 2013.
Table 2 presents spring seasonal amplitudes for the subgroups of each group and the comparisons of spring seasonal amplitudes. An annual mean of 56.81% (95% CI, 41.93%–71.69%) more pulmonary TB cases were notified in the spring peak month (March) than in the trough month (December) from 2004 to 2013. The spring seasonal amplitude in 2004–2008 (73.17%) was higher than that in 2009–2013 (40.45%), and the difference between them was significant (P<0.05). There were no statistical differences in spring seasonal amplitude between males and females, among age groups, between center and far districts, or between smear-positive and smear-negative pulmonary TB (P>0.05). There were significant differences in spring seasonal amplitude by occupation, with amplitudes ranging from 59.37% to 113.22% (P<0.05). The most common spring peak month was March; the others were April and May. The trough month for all groups and subgroups was December.
Table 3 summarizes summer seasonal amplitudes for the subgroups of each group and the comparisons of summer seasonal amplitudes. An annual mean of 43.40% (95% CI, 28.88%–57.92%) more pulmonary TB cases were notified in the summer peak month (September) than in the trough month (December) from 2004 to 2013. The summer seasonal amplitude in 2004–2008 (60.11%) was higher than that in 2009–2013 (26.68%), and the difference between them was significant (P<0.05). There was no statistical difference in summer seasonal amplitude between males and females, between center and far districts, or between smear-positive and smear-negative pulmonary TB (P>0.05). There were significant differences in summer seasonal amplitude by age, with amplitudes ranging from 36.05% to 100.09% (P<0.05), and by occupation, with amplitudes ranging from 43.40% to 109.88% (P<0.05). The most common summer peak month was September; the others were July and August. The trough month for all groups and subgroups was December.
Table 4 shows the median and the lower and upper quartiles of the spring–summer seasonal amplitude difference in Wuhan from 2004 to 2013. The medians for the 0–14 years age group and the student group were negative, indicating that their summer seasonal amplitude was higher than their spring seasonal amplitude.
Firstly, the TRAMO-SEATS program was applied to the raw monthly case counts. The time series of total pulmonary TB cases was decomposed into a trend component, a seasonal component, and an irregular noise component. A decomposition of monthly case counts was also obtained for the groups of interest, according to time period, sex, age, occupation, district, and sputum smear result. If the TRAMO-SEATS program judged the seasonality of a population to be identifiable, we calculated the mean spring peak month, mean summer peak month, mean trough month, the annual spring seasonal amplitude and annual summer seasonal amplitude with 95% confidence intervals (CI), and the annual spring–summer amplitude difference with median and upper and lower quartiles for the years 2004 to 2013. The annual spring and summer seasonal amplitudes were calculated from the isolated seasonal factor and defined as fractions whose numerator is the spring-peak-to-trough or summer-peak-to-trough difference (between the month with the highest spring or summer case count and the month with the lowest case count) and whose denominator is the mean monthly case count for that year. The annual spring–summer amplitude difference was defined as a fraction whose numerator is the difference between the highest spring and the highest summer monthly case counts and whose denominator is the mean monthly case count for that year.
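The amplitude definitions above are peak-to-trough differences scaled by the yearly mean. A minimal sketch of that arithmetic, applied here to raw monthly counts for simplicity (the study computed it from the isolated seasonal factor, and the counts below are illustrative, not the Wuhan data):

```python
import numpy as np

def seasonal_amplitudes(monthly_counts, spring_peak, summer_peak, trough):
    """Annual amplitudes as defined in the text:
    (peak-month count - trough-month count) / mean monthly count.
    `monthly_counts` holds 12 values for one year; months are indexed 0-11."""
    counts = np.asarray(monthly_counts, dtype=float)
    mean = counts.mean()
    spring_amp = (counts[spring_peak] - counts[trough]) / mean
    summer_amp = (counts[summer_peak] - counts[trough]) / mean
    # Spring-summer amplitude difference: (spring peak - summer peak) / mean.
    diff = (counts[spring_peak] - counts[summer_peak]) / mean
    return spring_amp, summer_amp, diff

# Illustrative counts: peak in March (index 2), secondary peak in
# September (index 8), trough in December (index 11).
counts = [700, 760, 900, 820, 780, 740, 720, 730, 850, 700, 650, 600]
spring, summer, diff = seasonal_amplitudes(counts, 2, 8, 11)
```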
Secondly, the amplitudes of the spring and summer seasonal fluctuations were compared within groups. Student's t-test for two independent samples was used to compare the seasonal amplitudes of two subgroups. The Bonferroni method for one-way analysis of variance was used to compare all pairwise seasonal amplitudes of three or more subgroups under conditions of equal variance and normal distribution, and the Kruskal-Wallis test was used under conditions of unequal variance or skewed distribution. A P-value <0.05 was considered statistically significant.
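The test-selection rule above can be sketched as follows. The Shapiro-Wilk and Levene checks used here to assess normality and equal variance are assumptions, since the text does not state how those conditions were evaluated, and the amplitude samples are simulated:

```python
import numpy as np
from scipy import stats

def compare_amplitudes(groups, alpha=0.05):
    """Pick the comparison test as described in the text: t-test for two
    subgroups; one-way ANOVA (to be followed by Bonferroni pairwise tests)
    for 3+ subgroups when variances are equal and data are normal;
    Kruskal-Wallis otherwise."""
    normal = all(stats.shapiro(g).pvalue > alpha for g in groups)
    equal_var = stats.levene(*groups).pvalue > alpha
    if len(groups) == 2:
        return stats.ttest_ind(groups[0], groups[1], equal_var=equal_var)
    if normal and equal_var:
        return stats.f_oneway(*groups)  # follow with Bonferroni pairwise tests
    return stats.kruskal(*groups)

# Illustrative amplitude samples for two subgroups.
rng = np.random.default_rng(1)
g1 = rng.normal(0.6, 0.1, 10)
g2 = rng.normal(0.4, 0.1, 10)
res = compare_amplitudes([g1, g2])
```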
Simulations showed that the case-fatality rate in the absence of intervention was about 0.91 (Additional file 2: Figure S2), which falls within the range of literature estimates of 0.4 to 0.91. Note that the simulations assumed worst-case scenarios in which no interventions other than vaccination were implemented. Furthermore, the simulation results showed that all the intervention strategies mentioned previously reduced the case-fatality rate. These results highlight a benefit of vaccination programs even when they start late: reducing disease severity in subjects newly infected after the introduction of the vaccination program. As such, relying on R0 as the sole indicator for evaluating intervention programs would overlook this life-saving aspect.
One thousand one hundred and seventy-eight CPV-positive cases were identified by the CPV colloidal gold test strip. PCR analysis confirmed that 1169 (99.24 %) of these cases were CPV-positive. Of the 1169 partial VP2 sequences, 33 were submitted to the NCBI GenBank database under accession numbers KF772192–KF772224. The morbidity of CPV infection between 2009 and 2014 was 5.87 % (1169/19907). The annual morbidity rate ranged from 4.16 to 8.06 %, with the highest morbidity recorded in 2009 and the lowest in 2012 (Fig. 1), but differences in morbidity were not significant (P > 0.05) between March 2009 and December 2014. In addition, we observed a significant influence of age on morbidity (P < 0.05, Fig. 2). Morbidity was relatively high among puppies during the first 4 months of life, with the highest rate occurring in the third month; afterward, the morbidity rate gradually decreased. Seasonal morbidity ranged from 4.89 to 7.32 %, with the highest rate during spring (Fig. 3). However, as a whole, differences in morbidity according to season were not significant (P > 0.05). There was no significant difference in morbidity rates between male and female dogs (643/10937, 5.88 % vs. 526/8970, 5.86 %, respectively, P > 0.05). The CPV-positive rate was significantly greater among purebred dogs than among other dogs (1031/1169, 88.20 % vs. 138/1169, 11.80 %, respectively, P < 0.05), and was highest among unvaccinated dogs (680/1169, 58.17 %); this rate was not significantly higher (P = 0.069) than that of intermittently vaccinated dogs (463/1169, 39.61 %), but was significantly higher (P < 0.05) than that of completely vaccinated dogs (26/1169, 2.22 %).
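The sex comparison above can be re-checked from the counts given in the text. The sketch below reproduces that arithmetic with a chi-square test of independence; the specific test the authors used is not stated, so the choice of chi-square here is an assumption:

```python
from scipy.stats import chi2_contingency

# Counts copied from the text: 643 of 10,937 males positive,
# 526 of 8,970 females positive.
table = [[643, 10937 - 643],
         [526, 8970 - 526]]
chi2, p, dof, expected = chi2_contingency(table)

morbidity_male = 643 / 10937    # about 5.88 %
morbidity_female = 526 / 8970   # about 5.86 %
```

With nearly identical rates in the two groups, the test is far from significance, matching the reported P > 0.05.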
ARI is one of the most common human diseases, predominantly caused by different respiratory viruses [41, 42]. One of these viruses, HBoV1, causes global epidemics, carries a high public health burden and circulates with different patterns in different areas [3–7, 43]. In general, the prevalence of viruses varies because of factors such as geographical location, climatic conditions, population and social activity. The epidemiology of HBoV1 in temperate regions has been described in more detail, where a high incidence of infection has been observed in children under the age of 2 years in winter and spring [15, 16, 39, 44].
To describe the epidemiology of HBoV1 in Guangzhou, we collected throat swabs from 11,399 children (≤14 years old), hospitalized with ARI and monitored HBoV1 and other common respiratory pathogens over a 7-year period (Table 1).
In the current study, 86.5% (9857/11399) of patients were under the age of 5 years, with a median age of 1.75 years, indicating that infants and young children were most at risk of ARI, consistent with previous reports [45, 46]. Overall, 49.2% (5606/11399) of patients tested positive for one or more respiratory pathogens, and 2.2% (248/11399) of patients tested positive for HBoV1 (Table 1). A higher prevalence of HBoV1 was detected in male patients than in female patients (p = 0.019), consistent with previous reports [15, 16, 39, 44].
Co-infection with HBoV1 and other pathogens is common [14, 15]. In our study, 45.2% (112/248) of HBoV1-positive patients also tested positive for other pathogens (Table 1). This may be partly caused by coinciding epidemics of HBoV1 and other pathogens. In our study, the seasonal distribution of HBoV1 and the distribution of total positive pathogens were consistent, supporting this inference (Fig. 2). Current research shows that HBoV1 infection can lead to the collapse of the first line of defense of the airway epithelium [35–37], which may increase susceptibility to other pathogens and explain the high rate of co-infection. Whether co-infection leads to more severe disease is currently unknown and more research is needed to determine this. The characteristics of HBoV1 infection are likely to make it a good model for studying the effects of co-infections.
In this study, there was a significant difference in prevalence of HBoV1 in patients of different ages (p < 0.001). The majority of HBoV1 infections occurred in patients under 2 years old and the peak frequency of HBoV1 infection occurred in patients aged 7–12 months (Fig. 1), consistent with previous serological and epidemiological reports on the virus [8–11, 15, 16, 39, 44]. This might be because children’s immune systems are still under development and maternal antibodies gradually disappear in this age group. The distribution of HBoV1 in patients of different ages will provide important reference for future vaccines and new drug research and development, as well as providing important data for disease prevention and control.
Many factors affect the epidemiology of pathogens, such as geographical location and local climate. Guangzhou, a central city and main transport hub in southern China, is located in a subtropical region. Guangzhou is hot, with high annual rainfall, long summers and short winters, and its annual precipitation and high temperatures fall almost in the same period (Fig. 3). In this study, two HBoV1 peaks were observed (Fig. 2). The large peaks of HBoV1 infection occurred between June and September of each year, the summer months in Guangzhou, with mean temperatures higher than 25 °C (Fig. 3). Small peaks of HBoV1 infection occurred in winter, between November and December of each year. This seasonal distribution is similar to the prevalence previously reported in subtropical regions, but differs from HBoV1 epidemics in temperate regions, which mostly occur in winter and spring [15, 16, 39, 44], and from those in tropical regions, such as India, where no obvious epidemic season has been found.
To analyze the correlation between HBoV1 prevalence and meteorological conditions, multiple linear regression analysis was performed, with HBoV1 monthly prevalence as the dependent variable and mean temperature (or mean temperature in the preceding month), mean relative humidity, mean wind speed and sunshine duration as the independent variables (Table 2). Both regression models were established (p < 0.001), and the adjusted R2 value (0.373) of the 1-month temperature-lag model was greater than that (0.231) of the current-month temperature model, indicating that the lagged model had better explanatory power. Both models showed that the prevalence of HBoV1 was significantly correlated with temperature and relative humidity (Table 2). Specifically, HBoV1 prevalence was positively correlated with temperature, which is consistent with previous reports [47, 49]. Conversely, HBoV1 prevalence was negatively correlated with relative humidity; this differed from a previous report in Suzhou and may be related to Guangzhou's high humidity (mean monthly relative humidity was 77.2 ± 7.3%) (Fig. 3). It is common for pathogen prevalence to fluctuate over time because of a variety of factors. In this study, HBoV1 prevalence was relatively low from 2013 to 2014, which might be partly related to the relatively higher mean relative humidity during this period (Fig. 3). Climate conditions may affect the survival and spread of respiratory viruses; however, no significant linear relationship between HBoV1 infection and wind speed or sunshine duration was found in this study (p > 0.05) (Table 2).
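The regression setup described above can be sketched with a minimal ordinary-least-squares fit. The data below are simulated, not the Guangzhou measurements, and the normal-equations solver stands in for whatever statistical package the authors actually used (which the text does not specify):

```python
import numpy as np

def fit_ols(X, y):
    """Ordinary least squares via lstsq; returns coefficients
    (intercept first) and the adjusted R^2."""
    X = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    ss_res = resid @ resid
    ss_tot = ((y - y.mean()) ** 2).sum()
    n, k = X.shape
    r2 = 1 - ss_res / ss_tot
    adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k)
    return beta, adj_r2

# Simulated monthly predictors for 7 years (84 months).
rng = np.random.default_rng(2)
n = 84
temp_lag1 = rng.uniform(10, 30, n)    # mean temperature, preceding month (C)
humidity = rng.uniform(60, 90, n)     # mean relative humidity (%)
wind = rng.uniform(1, 4, n)           # mean wind speed (m/s)
sunshine = rng.uniform(80, 220, n)    # sunshine duration (h)
# Simulated prevalence: rises with lagged temperature, falls with humidity.
prev = 0.01 + 0.002 * temp_lag1 - 0.0005 * humidity + rng.normal(0, 0.005, n)

X = np.column_stack([temp_lag1, humidity, wind, sunshine])
beta, adj_r2 = fit_ols(X, prev)
```

The fitted signs (positive for lagged temperature, negative for humidity) mirror the correlations reported in the text, because the simulation was built that way.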
Some limitations of this study should be noted. First, because our study mainly focused on HBoV1 circulation in hospitalized patients with ARI, HBoV1 in outpatients and the asymptomatic population were not included. Second, many factors can affect virus epidemics, meteorological data analysis alone may not serve as a final conclusive interpretation. Third, the study was only conducted in three hospitals and may not be representative of the overall population.
Theoretical analyses of stochastic epidemic models have shown that when R0 is larger than unity, the final size of an epidemic converges to a bimodal distribution: either the epidemic dies out with a small number of infected cases or it takes off to a normal distribution with a high number of cases. Our simulation results recreated this epidemic behavior (Fig. 6). Without intervention, EBOV had an approximately 50% probability of infecting more than half the population. The introduction of vaccination programs at the two previously mentioned coverages and at any of the vaccination time points under assessment scaled down the epidemic size (Fig. 6). The earlier the vaccination programs were deployed, the more closely the epidemic size distribution resembled a unimodal distribution centered at a low infected fraction. The high-coverage strategy effectively eliminated the possibility of a major outbreak infecting a large proportion of the population; this was achieved when the vaccination programs were deployed at any time point from one week to five months before time zero.
Figure 6 also shows that a random vaccination program covering 33% of the population one week before time zero reduced the final size by more than 100 times compared with a no-intervention scenario. However, the low-coverage strategy still left a small probability that the epidemic would become major, whereas the high-coverage strategies did not. Vaccination programs deployed during the epidemic also substantially reduced its size: the program conducted one month after time zero still reduced the final size by more than ten times. Furthermore, these interventions not only reduced the final size but also increased the epidemic's extinction probability (Fig. 6).
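The bimodal final-size behavior discussed above can be reproduced with a generic stochastic SIR simulation. This is a sketch of the general phenomenon, not the study's EBOV model; the population size and rates are illustrative:

```python
import numpy as np

def sir_final_size(n, beta, gamma, i0, rng):
    """One stochastic SIR outbreak simulated event by event; returns the
    final fraction ever infected. R0 = beta / gamma."""
    s, i, r = n - i0, i0, 0
    while i > 0:
        rate_inf = beta * s * i / n   # rate of new infections
        rate_rec = gamma * i          # rate of recoveries
        if rng.random() < rate_inf / (rate_inf + rate_rec):
            s -= 1; i += 1            # infection event
        else:
            i -= 1; r += 1            # recovery event
    return r / n

rng = np.random.default_rng(3)
# With R0 = 2 and one initial case, final sizes cluster either near zero
# (early extinction) or near the deterministic final size (~0.8 for R0 = 2).
sizes = np.array([sir_final_size(500, 2.0, 1.0, 1, rng) for _ in range(200)])
major = sizes > 0.5   # flag "major" outbreaks
```

For R0 = 2 with a single index case, theory puts the extinction probability at 1/R0 = 0.5, so roughly half the runs end in the low mode and half near the high mode.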
Antananarivo is located in the central highlands of Madagascar, at an altitude of 1280 m. It has a subtropical climate with a cold and dry season from May to October (average minimum temperature around 11°C) and a hot and rainy season from November to April (average maximum temperature around 27°C) (Figure 1). December, January and February are the 3 months with the highest rainfall, and they also correspond to the lean season. Rice is by far the most consumed staple food in Madagascar; it furnishes more than 50% of the country's average calorie ration. The cropping calendar varies greatly according to rice species and regional climate conditions, but about 70% of the rice produced in the country is harvested between April and June. Because of its predominance in agriculture and diet, the seasonal production of rice drives seasonal movements in food prices and overall food consumption.
Since the 1950s, climate change has led to a rise in temperatures in Madagascar, particularly in the dry season. The rainy season is also being delayed. The frequency of extreme events such as cyclones, floods and droughts is increasing. All these changes are likely to modify seasonal patterns of mortality.
Antananarivo has undergone a major epidemiological transition in the last decades, only interrupted by a mortality crisis in the mid-1980s caused by the combination of the resurgence of malaria and food shortages. Life expectancy first declined from 56 in 1976 to 47 in 1986, before increasing steadily to reach 64 years in 2015 (Figure S1). This progress was mostly driven by a decline in under-five mortality, which fell to 34 deaths per thousand live births in 2015, from 116‰ in 1976 (a 70% decline). By comparison, there has been virtually no improvement in survival changes in adults: the risk of dying between ages 15 and 60 was 280 deaths per thousand in 1976, peaked at 473‰ in 1986, at the height of the crisis, and declined again to 283‰ in 2015. The contrast between child and adult mortality suggests that survival gains were for the most part achieved through public health interventions targeting diarrheal and vaccine-preventable diseases. Demographic and Health Surveys show increases in the percentage of children who received all 8 basic vaccinations (BCG, DPT1-3, Polio 1–3 and measles), from 43% in 1992 to 62% in 2008–2009 in the country. Chronic malnutrition has been slightly reduced; the prevalence of stunting in children under age 5 declined from 60% in 1992 to 50% in 2008–2009 (47% in the capital). In contrast, there has been little progress in skilled attendance at birth, access to improved water sources and sanitation. According to the 2008–2009 DHS, only 14% of the population of the capital lived in households with improved, non-shared toilet facilities. Seventy percent of the population had access to water from public taps or standpipes. Overall, the country’s health situation remains exceptionally fragile, as illustrated by recent outbreaks of plague (in 2014 and 2017) and measles (in 2019). Madagascar has one of the lowest levels of per capita health spending in the world, and more than three quarters of the population live in extreme poverty. 
Still, the epidemiological transition is well underway. As a result of population ageing and changes in risk factors, the distribution of causes of death has changed considerably. In the period 1976–1980, 54% of deaths registered in Antananarivo were due to communicable, maternal, neonatal and nutritional conditions, but this proportion had dropped to 21% in 2011–2015 (Figures S2 and S3).
Our study has provided a better understanding of the epidemiology of HBoV1 in subtropical regions, specifically correlations with climate data; these data will be helpful for future control and prevention of HBoV1 infections.
Statistical analysis to assess significant differences of morbidity was performed using SPSS version 17.0 software. A p-value <0.05 was considered statistically significant.
In many parts of the world, deaths exhibit strong seasonal variation, especially among children and the elderly. The underlying drivers are varied. Many aspects of human biology relevant to health status are seasonal. For example, vitamin D metabolism and sunlight exposure have been suggested as important drivers of seasonality in immune function. Food intake can also vary substantially across seasons, from fluctuations in access to fruit and vegetables to the emergence of a 'hungry season' in the most severe cases, where poor rural families are unable to maintain body weight and function throughout the year. Both non-infectious and infectious causes of mortality are modulated by such underlying biology. For non-infectious causes of mortality, seasonal fluctuations in temperature may modulate associated risk factors (such as the effect of temperature on stroke), and seasonal fluctuations in behavior may alter psychological conditions (e.g. depression) or exposure to pollutants. For infectious diseases, climatic variables may drive additional seasonality for a range of pathogens, via their effects on vector life-cycles, on how quickly infectious particles fall out of the air for directly transmitted pathogens, or on how flooding shapes transmission of water-borne infections. Seasonal patterns of human behavior have also been shown to be a key driver of infections, with seasonal aggregation due to school terms or seasonal migration increasing the magnitude of measles transmission.
The drivers of seasonal variation in mortality are also subject to change over time, due to (i) improvements in socioeconomic conditions, (ii) epidemiological shifts, and (iii) the effects of climate change. All three might directly alter the dominant causes of death or shift their distribution over the course of the year. Taking each in turn: first, in western countries, there is some evidence of a reduction in the seasonality of mortality, partly because of the spread of central heating and improvements in housing [9–11]. Second, the epidemiological transition has led to a reorganization of the hierarchy of causes of death. This can drive trends in the seasonality of mortality because seasonal variation is larger for some diseases than for others. For example, seasonal variation is characteristic of many infectious and parasitic diseases (including malaria), cardiovascular diseases, respiratory diseases, and acute gastroenteritis, but rare for neonatal disorders and neoplasms. Third, climate change can also affect the seasonality of mortality, e.g. via increases in summertime mortality due to the increased frequency of heatwaves, or via effects on pathogen life history.
With few exceptions, the literature on seasonal variation in mortality in Sub-Saharan Africa is patchy, especially when it comes to analyzing cause-specific mortality or long-term changes. This is because analyzing seasonal patterns requires statistical series of deaths tabulated by months, for which death registers are the preferred source of data. However, few countries in Sub-Saharan Africa have a comprehensive system of death registration in place. Often less than half of all deaths are registered, and causes of death are rarely established. As a result, few countries in Sub-Saharan Africa have high-quality data on causes of death, apart from geographically defined populations monitored in Health and Demographic Surveillance Systems (HDSS). Some studies based on HDSS data in Africa have highlighted strong associations between temperature or rainfall and all-cause mortality but they were limited by the relatively short length of the periods covered, the absence of disaggregation by cause of death and their concentration in rural areas [12,13,17,18,20–22].
In Antananarivo, the capital city of Madagascar, a unique data source provides the opportunity to examine seasonality in cause-specific mortality over a long period. The notification of deaths to the health system was introduced in 1921 in the Municipal Hygiene Office (henceforth BMH for Bureau Municipal d’Hygiène), in response to a plague epidemic. The BMH issues a death verification form that is required to obtain a burial permit or to move the body outside the city to reach the family tomb. All records covering the period from 1976 to 2015 were transcribed from registers maintained by the BMH. Previous research has shown that this death notification system can be considered complete since the mid-1970s.
In this study, we capture the seasonal patterns of mortality for infants, children aged 1–4, older children and adults aged 5–59 years, and the population aged 60 and above. We evaluate whether these seasonal patterns have changed over a 40-year period. We hypothesize that the seasonality of deaths in childhood has attenuated because the share of communicable diseases has reduced, and the burden of neonatal disorders has increased. Based on the literature related to high-income countries, we anticipate a reduction in the seasonality of mortality among older adults. We also examine whether changes in seasonal variation can be ascribed to changes in the seasonality of cause-specific mortality or shifts in the hierarchy of causes of death.
Middle East respiratory syndrome coronavirus (MERS-CoV), first detected in the Kingdom of Saudi Arabia (KSA) in 2012, causes severe acute respiratory tract infection in humans, with a high case fatality rate (CFR) (1–4). Dromedary camels are believed to be important reservoir hosts or vectors for human infection; bats may also be implicated (5–8). As of 17 July 2015, 1,368 laboratory-confirmed cases of human infection with MERS-CoV had been reported to the World Health Organization (WHO), including at least 490 deaths, corresponding to a CFR as high as 35.45% (9). Recent MERS clusters in South Korea are thought to constitute the largest outbreak outside the Middle East (10). As of 25 July 2015, 186 laboratory-confirmed cases of MERS-CoV infection had been confirmed (including 36 deaths) in South Korea (9). A South Korean man who was a relative of some of the laboratory-confirmed cases traveled to Guangdong Province (10) and was diagnosed as the first imported MERS-CoV case in China by molecular detection of MERS-CoV (11, 12). The rapid spread of disease in South Korea raised concerns that the imported virus had evolved to become more transmissible. Here, we report a comprehensive phylogenetic analysis of the complete MERS-CoV genome sequence of the first Chinese imported case of MERS (ChinaGD01); the results indicate its probable origin and show evidence of genetic recombination.
On 23 January 2020, the government of China announced a lockdown of all public transportation departing from Wuhan, including airports, trains, and buses. This was the first such lockdown in Wuhan's history. Wuhan is home to 11 million people and is considered the epicenter of an outbreak of a novel strain of coronavirus (nCoV) that has not previously been identified in humans.
This crisis, and the subsequent lockdown, raises an important question: was a transportation lockdown in Wuhan necessary to control Wuhan pneumonia? It is important to note that 23 January came just before the Chinese Lunar New Year (the Year of the Rat; comparable to Thanksgiving in the USA), when family members travel from other cities to celebrate at home. The Chinese Lunar New Year is the most important time for families to come together. This spring festival usually lasts for 7 days, during which Chinese citizens visit their relatives and friends; afterwards, people return to their cities and resume work. Both the travel and the family visits during this celebration period increase the likelihood of transmission of respiratory disease.
Although the WHO has not formally listed China as a dangerous area for Wuhan pneumonia, Thailand, Japan, the USA, and Singapore have already reported cases imported from China. Because the incubation period of 2019-nCoV is around 14 days, there might be another outbreak peaking around 6 February 2020, driven by potentially infected people who left Wuhan by private transportation before or on 23 January 2020. Those infected people might not have had symptoms at the time and therefore could have spread the virus.
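The 6 February estimate follows from simple date arithmetic: a roughly 14-day incubation period added to 23 January, the last day of unrestricted travel out of Wuhan. A minimal check, using the dates stated in the text:

```python
from datetime import date, timedelta

# Last day of unrestricted travel out of Wuhan
lockdown = date(2020, 1, 23)
# Approximate incubation period of 2019-nCoV cited in the text
incubation = timedelta(days=14)

projected_peak = lockdown + incubation
print(projected_peak.isoformat())  # 2020-02-06
```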
A major challenge in controlling Wuhan pneumonia caused by 2019-nCoV is identifying the original source of the outbreak. To date, it remains unknown which animal is the reservoir of the coronavirus. The Huanan wholesale seafood market in Wuhan has been suspected as the source of the outbreak, but this has not been confirmed. In 2013, during the outbreak of the Asian lineage avian influenza A(H7N9) virus, a suspected poultry market was shut down. Likewise, the wholesale seafood market has been closed since 1 January 2020.
A significant difference between H7N9 and Wuhan pneumonia is that H7N9 was found to be transmitted only from poultry to humans, whereas 2019-nCoV has been found to also be transmitted from human to human. The secondary cases identified indicate that human-to-human transmission is occurring, which partly explains the rapid increase in 2019-nCoV cases. The original cases could be due to animal-to-human transmission, strongly suspected to have happened in the Huanan wholesale seafood market, while the more recent, rapidly accumulating cases could stem from human-to-human infection. Despite the closure of the market, the number of new cases continues to grow. According to one prediction model, the number of cases is expected to reach around 19,000 by 4 February 2020; as of 29 January 2020, there were 5,974 confirmed cases in China. Hence, pinpointing the origin of the latest coronavirus may not be the priority in controlling the disease.
Compared with the Severe Acute Respiratory Syndrome outbreak in 2003, the government is handling this outbreak much more quickly and transparently, although there is room for improvement. Isolating the possible source has been shown to be an effective way to control infectious diseases, and quarantining infected people has also proved effective in cutting off transmission chains. In contrast to quarantining patients at home, which was applied in the case of H7N9 and other infectious diseases, a transportation lockdown and roadblocks were deemed necessary in the case of Wuhan.
We propose to look back into public health's history, to the 19th century, when John Snow successfully traced the outbreak of cholera in London and stopped the transmission of the pathogen. Today, we instead tend to identify the pathogen first and then address how to handle and stop it. Given technological advances in science, this is often a successful and expedient way to address an outbreak of an infectious agent. However, with greater and faster movement of people around the globe, diseases can now also spread more widely in a shorter period. As a result, public health efforts should be re-evaluated to determine the best way to stop the spread of pathogens. In the case of Wuhan, was a transportation lockdown the most effective way to reduce the risk of spreading the infectious agent? This question warrants evaluation so that the lessons can be applied to similar situations in the future.
When John Snow made the decision to shut down the Broad Street water pump in the Soho district in 1854, the people of London doubted whether the measure would be effective; only later was it recognized as a decisive intervention and as the beginning of modern epidemiology. Likewise, it may be too early to evaluate the effectiveness of Wuhan's lockdown now. There is little doubt that the lockdown will greatly reduce 2019-nCoV cases by reducing secondary cases in the community. However, long-term limitations on the movement of traffic and information pose a great challenge to the stability of a social system.
Nevertheless, this is the first time that an intervention on such a national scale has been conducted to stop the spread of a new emerging infectious disease. By now, several other cities in Hubei province have also been locked down. These innovative national intervention measures provide a unique experience for fighting emerging infectious diseases in the modern era.
Other potentially confounding factors include changes in socioeconomic variables, temperature and traffic density occurring in the Oakville area over the same time as the refinery closure. However, using census data from 2001, 2006 and 2011, we found no corresponding change in median income, sex, education status, labor force status, or industry of main employment, nor was any such change reported by community organizations. For vehicular traffic, the time period of interest shows a steady increase in traffic volume and density, which we cannot plausibly link to a sudden decrease in respiratory-related hospitalizations.
This study demonstrated that air quality and population health improved in relation to the closure of an oil refinery. As a natural experiment, it provides evidence supporting a link between refinery emissions and adverse health effects.
Without interventions, Ni = 87% of the population become infected during the course of the epidemic and the cumulative number of outpatients reaches No = 29%, reflecting the assumption that approximately one third of infected individuals becomes sufficiently sick to seek medical help. These outcomes remain surprisingly stable even for interventions assuming optimistic resources (cf. footnotes to Figures 1, 2, 3, 4, 5). For instance, immediate and unlimited availability of antivirals reduces these fractions only to Ni = 72% and No = 24% (Figure 2). This minor effect has three reasons: only about one third of cases seeks medical help and will receive antiviral treatment, many infections are passed on before cases seek medical help and antiviral treatment does not fully prevent further transmission. These disadvantages do not apply to contact reduction measures. For instance, a reduction of 20% of contacts reduces these fractions to Ni = 68% and No = 22% (Figures 4A, B). A combination of antiviral treatment and contact reduction can further reduce these values to Ni = 53% and No = 18% (Figure 5).
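The reported outpatient fractions track the assumed care-seeking rate of roughly one third. A quick sanity check over the four scenarios, using the Ni/No pairs quoted above:

```python
# (Ni, No) pairs reported for each scenario, in percent of the population
scenarios = {
    "no intervention": (87, 29),
    "antiviral treatment": (72, 24),
    "20% contact reduction": (68, 22),
    "antivirals + contact reduction": (53, 18),
}

for name, (ni, no) in scenarios.items():
    ratio = no / ni
    print(f"{name}: No/Ni = {ratio:.3f}")
    # Each ratio stays close to 1/3, matching the assumption that roughly
    # one third of infected individuals become sick enough to seek care.
    assert 0.31 < ratio < 0.35
```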
Livestock diseases are associated not only with economic losses caused by reduced populations, reduced productivity, and veterinary care costs, but also pose a threat to human health. Because of the increased demand for food driven by the growing human population, humans are increasingly exposed to pathogens present in the production environment and to the food poisonings associated with them.
The best way to limit the incidence of zoonoses in humans is to control and prevent the dissemination of pathogens in the animals constituting the main reservoir of infection. In industrialised countries, animals are examined before slaughter, vaccination and monitoring programmes are in place, and the hygienic conditions of production facilities are controlled (physical decontamination). However, those methods often prove ineffective in developing countries. For these reasons, education in areas such as microbiology, sanitation, hygiene, food science, and good agricultural and manufacturing practices, as well as the implementation of risk assessment through hazard analysis and critical control points, should be considered essential.
There are 17 species in the genus Yersinia, which belongs to the family Enterobacteriaceae. Three of them are pathogenic for humans: Y. pestis (transmitted by fleas or airborne droplets, and the cause of plague), Y. pseudotuberculosis (the aetiological agent of rodentiosis), and Y. enterocolitica (the aetiological agent of yersiniosis); the latter two cause diseases of the gastrointestinal tract in humans. Infections associated with Y. pseudotuberculosis are rather rare.
The bacteria belonging to the genus Yersinia are nonsporulating rods or cocci, 0.5–0.8 μm wide and 1–3 μm long. These microbes are motile at temperatures between 22 and 30 °C, owing to a flagellum positioned at a pole of the cell, but are not motile at 37 °C. Y. enterocolitica, the cause of the zoonosis called yersiniosis, is a Gram-negative, facultatively anaerobic, catalase-positive, nonsporulating, psychrotrophic enteropathogen. These bacteria are able to grow at temperatures ranging between 0 and 45 °C, with optimum growth between 25 and 32 °C. Y. enterocolitica can survive in the viable but nonculturable (VBNC) state and can proliferate and produce a thermostable toxin under refrigeration conditions (4–8 °C), which makes them a serious problem for food safety. These pathogens grow at an environmental pH below 9, and the water activity must not fall below 0.96. Low pH and the presence of organic acids, including acetic acid, lactic acid, and citric acid, may inhibit the growth of Y. enterocolitica.
There are six biotypes of Y. enterocolitica reported in the literature, namely 1A, 1B, 2, 3, 4, and 5, classified on the basis of biochemical reactions: production of indole; production of acids from xylose, trehalose, saccharose, sorbose, and sorbitol; hydrolysis of salicin and esculin; decarboxylation of ornithine and hydrolysis of o-nitrophenyl-β-d-galactopyranoside; the Voges-Proskauer reaction; and reduction of nitrates. Eleven of the serotypes belonging to those biotypes have been associated with the development of yersiniosis in humans. The highest incidence of yersiniosis in Europe is caused by bacteria belonging to biotype 4, serotype O:3. Biotype 1A is common in the environment and, given the absence of the majority of virulence markers (chromosomal and plasmid pYV), is considered nonpathogenic. Serotypes of Y. enterocolitica are classified based on the structure of the lipopolysaccharide O-antigen, which forms part of the superficial lipopolysaccharide layer of the cell. The most pathogenic bioserotypes are 1B/O:8, 2/O:5,27, 2/O:9, 3/O:3, and 4/O:3, with the last being responsible for the highest incidence of yersiniosis among humans in Europe, Canada, Japan, and China.
The virulence factors of Y. enterocolitica are encoded both on the chromosome (the more stable virulence markers) and on the Yersinia virulence plasmid pYV, which is approximately 70 kilobase pairs (kb) long. After consumption of contaminated food or water, Y. enterocolitica reach the distal part of the small intestine and the proximal part of the large intestine, where they begin to proliferate and colonise the environment, leading to infection of the host organism. The mechanisms of virulence of Y. enterocolitica are presented in Table 4.
The infective dose of pathogenic Y. enterocolitica, at 10⁸–10⁹ cells, is higher than that of other pathogenic bacteria present in food. The incubation time is typically 3–7 days but may range between 1 and 11 days. Symptoms of the disease may be mild but may also take the form of severe gastritis and enteritis, resolving within 1–3 weeks. Yersiniosis affects all age groups, but children under the age of 5 years, people with reduced immunity, and the elderly are most at risk. The disease is manifested by fever, stomach ache, and diarrhea (often bloody), and in adult patients often resembles appendicitis. Complications may involve erythema nodosum, osteoarthritis, bacteraemia, purulent hepatitis, splenitis or nephritis, and myocarditis, and less often sepsis and endocarditis. Infection with Y. enterocolitica is often associated with nonspecific symptoms resembling diseases of other aetiology, and for that reason many cases of yersiniosis are misdiagnosed. In infected individuals with reduced immunity, antibiotic therapy is recommended; according to WHO recommendations, the therapy is based on tetracyclines, chloramphenicol, gentamycin, or co-trimoxazole. Owing to mutations occurring in the bacterial chromosome, resistance to fluoroquinolones is observed in Y. enterocolitica, as well as to ampicillin, ticarcillin, amoxicillin/clavulanate, cefazolin, and cefalotin.
According to estimates presented by the US Centers for Disease Control and Prevention (CDC) in 2016, yersiniosis affects nearly 117,000 people a year in the United States, including 640 cases requiring hospitalisation and 35 resulting in death. In the EU, a decreasing trend in confirmed cases of yersiniosis was observed between 2005 and 2016: the number of confirmed cases fell from 9,630 in 2005 to 6,861 in 2016. The incidence of Y. enterocolitica infections is carefully monitored in developed countries, but sufficient diagnostics are not available in Africa and the Middle East, and the exact number of cases in those areas is unknown.
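The magnitude of the EU decline can be quantified from the two endpoint figures quoted above; a simple percent-change calculation, taking the 2005 and 2016 counts as endpoints:

```python
cases_2005 = 9630  # confirmed EU yersiniosis cases in 2005
cases_2016 = 6861  # confirmed EU yersiniosis cases in 2016

decline_pct = (cases_2005 - cases_2016) / cases_2005 * 100
print(f"Decline 2005-2016: {decline_pct:.1f}%")  # Decline 2005-2016: 28.8%
```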
The main source of yersiniosis in humans is food, in particular raw or undercooked pork, but also fresh and pasteurised milk and other dairy products, contaminated plants, seafood, and drinking water. Food may be contaminated at the source or through contact with an infected surface or piece of equipment. Although pigs are the leading reservoir of Y. enterocolitica, these bacteria are abundant in the environment and are also isolated from other animals (including poultry, cattle, sheep, and goats) and from wild animals such as rodents, deer, and boars, as well as cats and dogs. Pigs carrying Y. enterocolitica show no symptoms of infection; the pathogens occupy their tongues, oral cavities, tonsils, lymph nodes, and intestines and are present in their faeces. During slaughter and meat processing, Y. enterocolitica may be transferred from infected tissues onto other meat; meat from the areas close to the head and sternum is the most exposed.
Contact reduction measures, comprising social distancing and the isolation of cases, can be an effective part of mitigation strategies; they have the advantage over antiviral treatment that they are not inherently limited, i.e. they can be continued for a sufficiently long period of time. Figure 4 examines the effect of isolation of cases and social distancing measures (see figure caption for details) in the absence of antiviral treatment. The peak of the epidemic is delayed by about one day for every percent of contact reduction if the intervention starts immediately after the introduction of the infection. Thus, a peak shift can be achieved not only by early action but also by the degree of contact reduction. If contact reduction is initiated later, the peak shift diminishes, but the proportionality remains: for example, if the intervention starts three weeks after the introduction of the infection, the peak of the epidemic is delayed by only about half a day per 1% contact reduction (Figure 4B). Premature cessation of contact reduction measures restores the infection rates to their pre-intervention values, which fuels the epidemic; this can lead to a delayed course and a higher total number of infections, involving a plateau or even a second peak of the epidemic (Figure 4C).
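The peak-delaying and peak-flattening effect of contact reduction can be illustrated with a minimal discrete-time SIR model. This is a sketch with illustrative parameters, not the simulation model used in the study:

```python
def simulate_sir(beta, gamma=0.25, n=10_000, i0=1, days=300):
    """Discrete-time SIR model; returns the daily number of infectious people."""
    s, i, r = n - i0, float(i0), 0.0
    series = []
    for _ in range(days):
        new_infections = beta * s * i / n
        new_recoveries = gamma * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        series.append(i)
    return series

baseline = simulate_sir(beta=0.5)        # illustrative R0 = 2.0
reduced = simulate_sir(beta=0.5 * 0.8)   # same model with 20% contact reduction

peak_base = baseline.index(max(baseline))
peak_red = reduced.index(max(reduced))
print(f"peak day: {peak_base} -> {peak_red}")

# Contact reduction both delays and flattens the epidemic peak.
assert peak_red > peak_base
assert max(reduced) < max(baseline)
```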
We downloaded the protein and mRNA sequences of five ebolaviruses from the NCBI [22, 23]: Bundibugyo ebolavirus (BDBV) (Accession Number FJ217161), Reston ebolavirus (RESTV) strain Pennsylvania (Accession Number AF522874), Sudan ebolavirus (SUDV) strain Gulu (Accession Number AF086833), Tai Forest ebolavirus (TAFV) (Accession Number FJ217162), and Ebola virus - Mayinga, Zaire, 1976 (EBOV-May) (Accession Number AF086833). These genomes are 18–19 kb in size with 40–42% GC content. Seven genes are present in EBOV, with two variants of the glycoprotein (GP) gene annotated in RESTV and SUDV and three splice variants annotated in the other species. The splice variants of GP are: (i) GP, the spike protein; (ii) sGP (soluble GP), the default 7A variant; and (iii) ssGP (small soluble GP), a -1A (6A) variant. The alignment file for each protein/mRNA across the different species was generated using ClustalW [24].
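The reported 40–42% GC content is a simple per-genome statistic: the percentage of G and C bases in the nucleotide sequence. A minimal helper (a hypothetical illustration, not part of the study's pipeline) demonstrates the computation on a toy fragment:

```python
def gc_content(seq: str) -> float:
    """Percentage of G and C bases in a nucleotide sequence."""
    seq = seq.upper()
    return 100 * sum(base in "GC" for base in seq) / len(seq)

# Toy fragment only; the actual ebolavirus genomes are 18-19 kb long.
print(gc_content("ATGGCGCATT"))  # 50.0
```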
Four patients, two of each sex, had six episodes of CVC-related infection during the study (two patients had two episodes each). All were younger than 65 years old, none was in the ICU, and three had a totally implantable CVC. The pathogens identified were Acinetobacter junii, methicillin-resistant Staphylococcus aureus, and Staphylococcus warneri. Approximately 5% (n=15) of the observations were collected within a window from three days before to one day after positive blood cultures were drawn. After discarding two observations (outliers) and comparing the remaining measurements with the observations not related to CRBSI, higher values were found in patients with infection for both temperature subtractions: catheter area minus contralateral region (95%CI: -0.17 to +0.33 versus -0.33 to -0.20 °C; p=0.02; statistical power of 0.82) and catheter area minus forehead (95%CI: -0.02 to +0.55 versus -0.22 to -0.10 °C; p<0.01; statistical power of 0.88), respectively. Moreover, the temperature of the catheter area was higher in patients with CRBSI (95%CI: 36.6-37.5 versus 36.3-36.5 °C; p<0.01; statistical power of 0.94) (Table 3).
Among the 308 observations not associated with positive blood cultures, 16 were collected in the presence of clinical signs suggestive of infection, between three days before and one day after withdrawal of negative blood cultures. When the measurements related to negative and positive blood cultures were compared, the temperature around the CVC insertion area was higher in those with CRBSI (p=0.03; statistical power of 0.58). Higher mean values were also found in patients with infection for both temperature subtractions: catheter area minus forehead (p=0.03; statistical power of 0.85) and catheter area minus contralateral region (p=0.02; statistical power of 0.45), respectively (Table 4).
How to cite this article: Dhanda, S. K. et al. A web-based resource for designing therapeutics against Ebola Virus. Sci. Rep. 6, 24782; doi: 10.1038/srep24782 (2016).