Table 2 displays the results of the univariable logistic regressions used to screen variables for inclusion in the mixed-effects logistic regression model.
Table 3 displays the results of the mixed-effects logistic regression model. The overall model was significant, Wald χ2(5) = 57.6, p < 0.01. However, the mixed-effects logistic model represented no significant improvement in fit over a regular logistic model, χ2(1) = 1.9, p = 0.09. Because Stata’s boxtid command is not supported within mixed-effects regression, the regular logistic model was used to assess the assumption of a linear relationship between the explanatory variables and the log odds of URI. None of the variables departed significantly from linearity, upholding this assumption of the model. The mean variance inflation factor (VIF) was 1.1 and tolerance was between 0.9 and 1.0 for all explanatory variables, upholding the assumption of no substantial multicollinearity.
Age, time in transport and length of stay at the destination agency were significant factors (p < 0.05) associated with URI in this model (Table 3). Younger age, longer time in transport and longer length of stay were associated with increased URI prevalence at the destination agency.
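To make the modeling workflow above concrete, the following is a minimal R sketch of the screening-then-mixed-model approach, shown with lme4 for illustration only; the original analysis was run in Stata, and the data frame and variable names (cats, uri, age, transport_hours, los_days, source_id) are hypothetical placeholders.

library(lme4)   # mixed-effects logistic regression
library(car)    # vif() for multicollinearity diagnostics

# Univariable screening: one logistic regression per candidate predictor
candidates <- c("age", "transport_hours", "los_days")
screen <- lapply(candidates, function(v)
  glm(reformulate(v, response = "uri"), data = cats, family = binomial))

# Mixed-effects model with a random intercept for source agency,
# compared with a plain logistic model by likelihood ratio
m_glm  <- glm(uri ~ age + transport_hours + los_days, data = cats, family = binomial)
m_glmm <- glmer(uri ~ age + transport_hours + los_days + (1 | source_id),
                data = cats, family = binomial)
lr <- as.numeric(-2 * (logLik(m_glm) - logLik(m_glmm)))
pchisq(lr, df = 1, lower.tail = FALSE)  # does the random effect improve fit?

vif(m_glm)  # mean VIF near 1 indicates no substantial multicollinearity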
Relocation programs allow for the transfer of animals from source agencies with limited resources to destination agencies with a greater capacity to support and provide positive outcomes for animals. This study reveals that highly contagious feline infectious diseases that can result in high mortality and morbidity of cats, such as FPV and dermatophytosis, were uncommon following transport in one established relocation program. The development of URI in shelters, which has been closely associated with stress, is a predictable concern when relocating cats, considering that transport itself, as well as the introduction to a new environment, are stressful events. Factors associated with URI in cats following transport include younger age, longer time in transport and increased length of stay at the destination agency. The prevalence of infectious disease noted in this study is based on the transport of overtly healthy animals in an established relocation program. Although these data were collected from a single established relocation program over one period of time, they can be used to improve the understanding of factors associated with disease and provide direction and focus for future research. Importantly, this information can help shelters develop protocols around relocated animals to reduce disease transmission within the shelter. Documented strategies that can minimize the risk of developing URI in shelters should be followed for relocated cats at destination agencies, including the provision of larger housing units, minimization of movement of cats within and between cages, utilization of appropriate sanitation methods, minimization of length of stay within the shelter and reduction of shelter population density.
Starting in 1999 and over the subsequent 2 decades, human and animal cases of Cryptococcus gattii have been identified in the North American Pacific Northwest (PNW) (1). Initial environmental studies found widespread fungal presence within the coastal forests and parks of Vancouver Island, near Victoria, Canada (2). Numerous studies have since found the presence of the three unrelated outbreak clones (known as VGIIa, VGIIb, and VGIIc) in the larger PNW region, including the American states of Washington and Oregon (3–5), with VGIIc largely restricted to the Willamette Valley in Oregon (6). In this essay, we use the nomenclature for the C. gattii species complex (7) and major molecular type (i.e., VG) for consistency with the referenced studies in this report, while being aware that the members of the species complex have been proposed to be designated separate species (8). The PNW disease emergence was unexpected, as C. gattii was thought to be restricted primarily to tropical and subtropical zones, namely, in South America, Africa, Asia, and Australia (9). The hunt to find the cause of the dispersal to this temperate zone soon followed the first human cases, leading to a number of causal hypotheses, including shipment of eucalyptus trees (thought to be a possible dispersal mechanism of C. gattii to parts of central and lower California and Mediterranean countries and even from Australia to South America) or agricultural products (11), ecological niche changes due to global warming (2, 12), ocean and wind currents, movements of animals, and human/mechanical transmission (e.g., contaminated tires, shoes, and crates) (13). Multiple causes may have occurred concurrently or sequentially. Multiple groups (11, 12, 14) have proposed the possibility that C. gattii was dispersed to the region decades prior to the outbreak by some natural or anthropogenic means, which was followed by an unknown niche disturbance that led to subsequent human infections.
We previously described a hypothetical transport of C. gattii from eastern South American port cities (e.g., Recife, Brazil) to the PNW (including British Columbia, Washington, and Oregon) via contaminated ballast water, due to early shipping between the regions following the 1914 opening of the Panama Canal (15). The timing of this comports with the molecular clock analyses that suggest that the clonal populations of the three C. gattii subtypes identified in the PNW are between 66 and 88 years old (15) (Fig. 1). This would have established populations of C. gattii in the coastal waters of the PNW (where C. gattii has been found and is known to survive for long periods). We now further hypothesize that the PNW coastal forests became contaminated with C. gattii nearly simultaneously from the tsunami waves immediately following the 27 March 1964 earthquake in Prince William Sound, AK.
Western honey bees (Apis mellifera) are highly social insects that live in colonies of ∼30,000 individuals. Honey bees are essential pollinators of agriculturally important crops including apples, almonds, alfalfa, and citrus. Current agricultural practices, such as large-scale monocultures, demand a seasonal abundance of honey bees in geographic locations incapable of maintaining sufficient pollinator populations year-round. Migratory beekeeping operations fulfill this need. For example, each February in the Central Valley of California, 1.3 million honey bee colonies (∼50% of the U.S. honey bee population) are required for almond pollination. Pollination of this and other U.S. crops is valued at ∼$15 billion annually.
There are numerous threats facing honey bee populations, and the recent losses of honey bee colonies in the United States, Canada, and Europe are alarming. In the U.S., annual honey bee colony losses increased from 17–20% to 32% during the winter of 2006/07, with some operations losing 90% of their hives. Average annual losses have remained high, averaging 32.6% from 2007–2010. One factor contributing to increased losses is Colony Collapse Disorder (CCD), an unexplained loss of honey bee colonies fitting a defined set of criteria. While factors such as pesticide exposure, transportation stress, genetic diversity, and nutrition affect colony health, the most significant CCD-associated variable characterized to date is increased pathogen incidence. Although greater pathogen incidence correlates with CCD, the cause is unknown, in part due to insufficient knowledge of the pathogenic and commensal organisms associated with honey bees.
Parasitic threats to honey bee colonies include viruses, Nosema, bacteria, and Crithidia. The majority of honey bee-infecting viruses are positive-sense single-stranded RNA viruses of the order Picornavirales. They include acute bee paralysis virus (ABPV), black queen cell virus (BQCV), Israeli acute bee paralysis virus (IAPV), Kashmir bee virus (KBV), deformed wing virus (DWV), sacbrood virus (SBV), and chronic bee paralysis virus (CBPV) (reviewed in Chen and Siede, 2007). Several DNA viruses that infect honey bees have also been described. Viral infections in bees can remain asymptomatic or cause deformities, paralysis and/or death. Symptoms associated with specific viruses include wing deformities (DWV); hairless, dark, shiny bees (CBPV); and swollen yellow larvae and/or dark-brown larval carcasses in the cells of worker bees (SBV) or queen bees (BQCV); however, accurate diagnosis requires molecular biology techniques, as asymptomatic bees frequently test positive for one or more viruses. In addition to viral infections, honey bees are also readily parasitized by the microsporidia Nosema [19]. Historically, U.S. honey bees were predominantly infected by Nosema apis, but recently Nosema ceranae infections dominate. The effects of Nosema infection on individual bee and colony health are unclear. Some reports suggest infections decrease longevity and may lead to collapse, but since Nosema is widespread and often detected in healthy colonies, its role in colony health requires further investigation. Another fungal pathogen, Ascosphaera apis, the causative agent of Chalkbrood disease, kills infected larvae but does not typically cause colony loss. Bacterial pathogens of honey bees include Paenibacillus larvae and Melissococcus plutonius, the causative agents of American and European Foulbrood disease, respectively. In addition to microbial infections, mite infestation (Acarapis woodi, Tropilaelaps sp., and Varroa destructor) also weakens and kills honey bee colonies. The introduction of V. destructor mites, which feed on the hemolymph of developing honey bees and transmit viruses (DWV, KBV, IAPV), in the late 1980s was devastating to the U.S. honey bee population. Notably, the restricted genetic diversity of the U.S. honey bee population may make it particularly susceptible to catastrophic and episodic losses.
To gain a more complete understanding of the spectrum of infectious agents and potential threats found in commercially managed migratory honey bee colonies, we conducted a 10-month prospective investigation. Our broad-scale analysis incorporated a suite of molecular tools (custom microarray, polymerase chain reaction (PCR), quantitative PCR (qPCR) and deep sequencing) enabling rapid detection of the presence (or absence) of all previously identified honey bee pathogens as well as facilitating the detection of novel pathogens. This study provides a comprehensive temporal characterization of honey bee pathogens and offers a baseline for understanding current and emerging threats to this critical component of U.S. agriculture.
Cockroaches are an important indoor allergen source and a major risk factor for the development and exacerbation of asthma. German cockroaches are dominant in temperate regions, whereas American cockroaches (Periplaneta americana) are prevalent in warm and humid tropical and subtropical regions [1–4]. The prevalence of American cockroach allergy as determined by skin test ranges from 40–57% in Taiwan [1, 5], 44–60% in Thailand, 55–79% in Brazil, and 53% in India. Previous studies have characterized 12 groups of allergenic components, namely Per a 1–12, from P. americana [9–12]. We previously reported that allergens from American cockroaches exhibit varying levels of pathogenicity. Sensitization to Per a 2 correlated with a more severe clinical phenotype and greater inflammation among patients with airway allergy. Furthermore, we found that Per a 2 protein is abundant in roach feces and resistant to decomposition for up to one year, making it more likely to persist in the environment.
It has been reported that a reduction in cockroach allergens below clinically relevant thresholds might be beneficial [13–15]. However, complete eradication of cockroach infestations in workplaces and homes is extremely challenging. In addition to traditional antihistamines and glucocorticoids with various delivery systems, a number of novel targeted biologic agents for the treatment of allergic inflammation have emerged. However, allergen-specific immunotherapy remains the only approach that can establish specific immune tolerance and potentially cure the disease [16, 17]. Traditional immunotherapy for cockroach allergy is based on crude extracts containing a wide variety of undesirable proteins and may involve a long and cumbersome treatment course, lasting up to 5 years or more [17, 18].
Plant systems capable of producing vaccine antigens, either through a transgenic approach or with a plant viral vector, have the advantages of low cost, easily scalable production, and safety. Over the past two decades, several studies have proven the efficacy of orally delivered plant-made vaccines against infectious diseases in animal models and in human clinical trials [19–25]. This approach has also been extended to the field of allergen expression, and the feasibility of allergen-specific immunotherapy using plant-derived vaccines has been demonstrated recently [26–28]. However, cockroach immunotherapy with a plant-expressed allergen vaccine has yet to be studied. Turnip mosaic virus (TuMV) belongs to the genus Potyvirus, one of the most important groups of positive-sense ssRNA plant viruses, infecting a vast array of crops worldwide. Several infectious clones of TuMV have been generated as useful molecular tools for studying the viral expression of heterologous open reading frames in plants [29, 30].
We previously identified three linear IgE-binding epitopes on Per a 2. On this basis, we generated a recombinant hypoallergenic clone 372 of Per a 2, which contains only one IgE-binding epitope (Per a 2 sequence 200–211), as the candidate for a low-risk immunotherapy vaccine. In this study, we examined whether an edible plant vaccine encoding the Per a 2 full-length clone 996 or its hypoallergenic clone 372 could exert a prophylactic effect in roach-allergic asthmatic mice.
The 1964 Great Alaskan Earthquake was the largest ever recorded in the Northern Hemisphere, registering a moment magnitude of 9.2, second only to the M9.5 1960 earthquake in Chile (NOAA, National Centers for Environmental Information, https://www.ngdc.noaa.gov/nndc/struts/results?bt_0=1964&st_0=1964&type_7=Like&query_7=prince&d=7&t=101650&s=7). The Alaskan Earthquake was felt as far as 4,500 km away, with tidal effects recorded on the Hawaiian Islands. The tidal waves in nearby Shoup Bay, AK, were reported to be 67 m (220 ft) high, causing significant shoreline devastation. Further to the south, the tsunamis caused significant water surges along western Vancouver Island, most notably in Port Alberni, where dozens of homes were destroyed or washed away. The tsunami continued south, affecting much of the coastline of western North America and even causing several deaths on the beaches of northern California (USGS, Earthquake Hazards Program, https://web.archive.org/web/20141011013757/http://earthquake.usgs.gov/earthquakes/states/events/1964_03_28.php).
It stands to reason (see below) that there has been, and continues to be, a continual presence of C. gattii in the PNW coastal marine environment, again possibly originating from contaminated ballast water from ports in other locales where the organism is endemic. However, the next dispersal leap, the large-scale contamination of the PNW forests, is less obvious. If the coastal marine environment was contaminated first, then there must have been a broad-scale or continual mechanism for aquatic C. gattii to invade the coastal forests of Vancouver Island and other areas of the PNW. If, on the other hand, the ocean-first hypothesis is incorrect and the coastal lands were contaminated first, leading to subsequent coastal and ocean contamination, then we must account for large-scale contamination of the environmental region. Previous hypotheses of original contamination by transport of goods and materials (including plants and trees) require a mechanism for broad dispersal from such material to the larger landscape, including the coastal forests and waters. Ocean environments might be contaminated by snow melt and rain runoff or by infected sea birds (2), although neither of these phenomena has previously been documented, except for a single report of a C. gattii-positive blue heron (17). It is important to note that, unlike Cryptococcus neoformans, which is classically associated with bird guano, C. gattii, while known to infect multiple bird species, is not shed or otherwise found in guano (18). An alternative hypothesis has been the contact of the ocean with contaminated air, as early air sampling studies identified the presence of ambient fungi (likely desiccated yeast cells) in multiple locales on Vancouver Island (2). However, to date, no feasible hypothesis has been put forward that accounts for the initial large-scale land-based contamination of the PNW.
We note that tsunamis have been associated with an increase in fungal diseases (19). Tsunami water can carry pathogenic fungi, as evidenced by case reports of invasive fungal skin and pulmonary disease (“tsunami lung”) in survivors of near-drowning episodes (20–23). Anecdotal evidence for the ability of a tsunami to transport C. gattii comes from a case of cutaneous VGII infection in a survivor of the 2004 Indonesian tsunami in Thailand in which skin injuries presumably became infected by contaminated water (23). These clinical experiences establish that tsunamis can move pathogenic fungi in water flows and provide support for the hypothesis that C. gattii in marine estuaries and other coastal waters in the Pacific Northwest may have reached the land through a tidal wave.
We therefore propose an ocean-first hypothesis: essentially, that PNW marine coastlines developed fairly wide-scale marine C. gattii contamination in the decades after one or more initial ballast water-dumping events. Then, on 27 March 1964, the PNW coastal forests became contaminated by the tsunami water surges following the great Alaskan earthquake on that date. This one event, like no other in recent history, caused a massive push of ocean water into the coastal forests of the PNW. Such an event may have caused a simultaneous forest C. gattii exposure up and down the regional coasts, including those of Vancouver Island, BC, Canada, Washington, and Oregon. Local marine populations of select C. gattii strains would have caused local and wide-scale forest contamination events that would subsequently spread more naturally in wooded environments, via other proposed mechanisms both natural (e.g., airborne yeast movements) and anthropogenic (e.g., fomite-contaminated vehicle tires and shoes) (12, 13, 17). Natural water cycling mechanisms (e.g., snow melt and rain runoff) may account for continued transport of fungi back to the ocean environment, causing cycling of local endemicity. Furthermore, we posit that transport to land in 1964 was followed by a period of soil and tree colonization during which C. gattii was exposed to biological and physical selection that possibly increased its infectiousness and virulence for animals, leading to the PNW outbreak 3 decades later.
Multiple pieces of evidence support the ocean-first/tsunami dispersal hypothesis, including (i) evidence of phylogenetic diversification about 50 years ago; (ii) the prevalence of environmental C. gattii predominantly only in coastal forests, rather than further inland, suggesting a connection to the shoreline; (iii) the presence of C. gattii in soils on the Gulf Islands between Vancouver Island and mainland British Columbia; (iv) the presence of C. gattii in humans, mammals, and the forested environment near Port Alberni in central Vancouver Island, an area of the island that was greatly affected by the 1964 tsunami; and (v) the fact that the earliest known case of PNW C. gattii occurred nearly 30 years prior to the 1999 PNW outbreak, establishing a historical record in the region that matches a terrestrial emergence in the 1960s.
(i) Phylogenetic evidence. Phylogenetic analysis of the PNW clones (VGIIa, VGIIb, and VGIIc) provides fairly conclusive evidence of a single introduction event into the PNW followed by local evolution for both VGIIa and VGIIc, whereas VGIIb has a dominant PNW clade resulting from a single primary introduction with possibly one or more additional introductions, with limited evolution and spread (24). The only non-PNW case belonging to the PNW VGIIa clade (besides cases in nonresident travelers to the PNW) was a single isolate obtained from Recife, Brazil, in 1983; all other VGIIa isolates outside the PNW clade originated from Brazil or elsewhere in the region (24). While not definitively established as originating from Brazil, both VGIIb and VGIIc are most closely related to other VGII lineages from Brazil (and VGIIb has been identified in multiple geographic regions, including Brazil). These data support the "out of Amazon" theory for these subtypes (14, 24). All three PNW clades have been roughly estimated to be 70 to 90 years old using Bayesian statistical inference (Fig. 2) (15), lending support to the "Teddy Roosevelt effect" hypothesis, which links the opening of the Panama Canal and subsequent shipping from Brazilian ports to the PNW as a driver of dispersal to the region (15). While specific exported materials that may have been contaminated with C. gattii have not been identified, shipping most certainly would have carried ballast water (and likely microbial contaminants) between the regions. While such molecular clock analyses can provide only relative dating accuracy, closer analysis of the phylogenetic trees from these studies suggests that secondary diversification events, which define much of the current population structure, appear to have initiated approximately 1 to 2 decades following the introduction of each of the clones (Fig. 1). The 1964 tsunami (54 years prior to the above-described analyses) may account for the timing of these population diversification events. Limited tertiary structure suggests limited subsequent evolution beyond these secondary events; therefore, some phenomenon (e.g., simultaneous seeding of the terrestrial landscape) seems to have established multiple individual sublineages that have had restricted subsequent diversification.
(ii) Environmental evidence.
(a) Trees and soils. The early environmental analyses of VGIIa in British Columbia found that most C. gattii-positive environmental collections (soils and trees) were in the coastal Douglas fir forests and in coastal western hemlock forests bordering the coastal Douglas fir forest (16) (Fig. 1). While these studies were limited in geographic scope, the identified contaminated landscapes did encompass the known locations of human and animal cases. Additionally, further ecological analyses have identified higher levels of soil and tree contamination at low-lying elevations close to sea level (25). This is the expected pattern of a tsunami-caused dispersal from contaminated coastal waters.
(b) Water.
C. gattii from the PNW has been shown to demonstrate long-term survival (at least 1 year) in ocean water, and immediate environmental studies found multiple positive ocean samples near the Vancouver Island coastline (16). Additionally, dozens of infected cetaceans have been found along the PNW coasts, including on the shores of Vancouver Island and the Gulf Islands (2), since the outset of the 1999 outbreak (1, 26, 27); more recently, infected pinnipeds have been documented in the region (28), all suggesting large-scale ongoing contamination of the marine environment. Numerous Cryptococcus species are adapted to and live in marine water (29–32). Interestingly, C. gattii (and C. neoformans) has an intrinsic mechanism to support cell buoyancy; specifically, its ability to increase capsule production decreases cell density and supports buoyancy at marine salinity levels (33). Nearly all marine mammal infections have been pulmonary, suggesting infection via inhalation. As marine mammals breathe at the surface of the water, it is possible that C. gattii survives in the sea surface microlayer, as described for other Cryptococcus species (34, 35), and is subsequently inhaled during breathing episodes. For this report, we also undertook an initial scan of available metagenomic data sets collected in the PNW area for other purposes and identified a recent metagenomic study of marine microbial communities from expanding oxygen minimum zones in the Saanich Inlet off southeast Vancouver Island (36); multiple C. gattii VGII-specific k-mer sequences were present in several of the collected marine sample metagenomes (data not shown). These findings further support the concept of long-term ocean survival of C. gattii in the region. Alternatively, these findings may represent continued seeding of coastal waters by contaminated soil runoff and/or by dispersal of airborne yeast cells from nearby contaminated forests, as suggested previously (2).
(iii) Geographic evidence. One of the regions on Vancouver Island to receive the most significant tsunami damage was the town of Port Alberni, situated on an inlet on the west side of the island. Several hours after the Great Alaskan Earthquake struck, multiple waves flowed up the Alberni Inlet, cresting at 8 m and striking the Port Alberni region, washing away 55 homes and damaging nearly 400 others (37). Kidd et al. (16) sampled the Port Alberni region, which is approximately 40 km from the eastern edge of the island, and found C. gattii in the nearby forest. Multiple environmental samples were positive in that study, and human and terrestrial animal cases were identified in this locale (Fig. 1). Infected sea mammals have also been found in the Alberni Inlet (25). Human and terrestrial and marine animal cases have also been reported along the western coast of Vancouver Island, suggesting that coastline contamination has occurred beyond Vancouver's port region and that contamination of the Port Alberni region may be due to its coastal contamination (i.e., from the 1964 tsunami event), rather than to terrestrial dispersal from the eastern side of the island.
(iv) Patient evidence. One confounder for a more recent emergence of C. gattii in the PNW (i.e., <50 years ago) is that the earliest known case of C. gattii VGIIa occurred nearly 30 years prior to the 1999 PNW outbreak, in a Seattle patient in 1971 (2). Unfortunately, no epidemiologic details exist for the case (e.g., it remains unknown whether the patient had a history of travel to Vancouver Island); however, this isolate clearly belongs to the PNW VGIIa clade (24), and other cases, human and veterinary, have subsequently been found in the Puget Sound region (38). A scouring of medical records and archived samples has not identified any cases or records of possible C. gattii infection occurring in the region prior to 1970. While only circumstantial, this finding comports with the timeline of environmental seeding from the 1964 tsunami. Additional infections may have gone undetected prior to the 1999 outbreak, especially in light of the fact that cryptococcal infection can remain dormant after acquisition (39–41).
Much of the above discussion has focused on the dispersal of the VGIIa clone in the Vancouver/Puget Sound region. It is worth noting that the Oregon cases (and all findings of the VGIIc clone) are largely restricted to the Willamette Valley (4, 6, 42, 43), a large river valley south of the Columbia River, which is the primary waterway for shipping to the Pacific Ocean (Fig. 1). The Columbia River inlet of the Pacific is not thought to have been as heavily affected by the 1964 tsunami, with the largest impacts occurring on coastlines nearest the mouth of the inlet (44). Nevertheless, the international shipping port of Portland, further up the Columbia inlet at the mouth of the Willamette River, may still have acted as a gateway for waterborne C. gattii. The tsunami wave was recorded up to 145 km (90 miles) upriver, including nearly 5-foot waves at the junction of the Columbia and Willamette Rivers (45), which may have transported contaminated estuarine waters upriver. A second fluvial event in 1964, a major flood, may then have further contributed to the establishment of C. gattii in the area. The 240-km (150-mile)-long Willamette Valley is transected from Eugene to Portland by the Willamette River, which flows north and empties into the Columbia River at Portland Harbor. Nearly 62,000 ha (153,000 acres) of the Willamette Valley was flooded in December 1964 in a devastating "hundred-year flood" (46), causing significant inundation, saturation, and likely contamination of the surrounding region, possibly acting as a similar, yet distinct, water-to-land dispersal mechanism in this southern region of the Pacific Northwest, coincidentally in the same year as the Alaskan tsunami.
Over the course of the experiment, HF1995 successfully established an infection in the tracheal mucosae of all 15 house finches inoculated, whereas Rlow established an infection in 12 of 15 (80%) inoculated birds (χ2 = 1.5, df = 1, P = 0.22). There was, however, a difference in the timing of the establishment of infection. At 2 days postinfection (dpi), HF1995-inoculated birds were significantly more likely to test positive for infection. M. gallisepticum could be detected in 12 out of the 15 (80%) birds inoculated with HF1995 but in only 6 of the 15 (40%) birds inoculated with Rlow (logistic regression: z = −2.2, df = 1, P = 0.03). By 7 dpi, however, all the birds that became infected (i.e., 15 HF1995 birds and 12 Rlow birds) tested positive for M. gallisepticum.
When considering only the birds that became infected, we found that HF1995-infected finches reached higher peak bacterial loads than Rlow-infected finches (Mann-Whitney U test: Wilcoxon test statistic (W) = 170, P < 0.0001) (Fig. 1). The number of days between inoculation and peak bacterial loads, however, did not significantly differ between treatments (Mann-Whitney U test: W = 104, P = 0.43; mean ± standard deviation: Rlow = 9.3 ± 4.6 days, HF1995 = 11.2 ± 6.4 days). By the end of the experiment (56 dpi), all 12 Rlow-infected finches had cleared the infection, whereas 3 of 15 (20%) HF1995-infected birds remained positive for M. gallisepticum (χ2 = 14.2, df = 1, P < 0.0002). Additionally, Rlow-infected birds cleared the infection significantly faster than HF1995-infected ones (linear model: t = −4.5, P < 0.001; mean ± standard deviation: Rlow = 22.8 ± 13.5 days, HF1995 = 41.5 ± 5.3 days).
While 14 out of 15 finches (93%) inoculated with HF1995 developed clinical symptoms (i.e., conjunctivitis), only 5 out of 12 (42%) Rlow-infected individuals exhibited conjunctivitis. This difference was significant: Rlow-infected individuals had a significantly lower probability of developing clinical symptoms than those inoculated with HF1995 (logistic regression: z = −2.5, P = 0.01). Furthermore, when we considered symptomatic birds only, birds infected with HF1995 developed significantly more severe conjunctivitis than birds infected with Rlow (Mann-Whitney U test: W = 69, P < 0.001) (Fig. 2).
Overall, there was a significant quadratic relationship between the production of M. gallisepticum-specific antibodies and time (linear mixed model; time: F(1,53.9) = 35.8, P < 0.0001; time^2: F(1,54.5) = 8.2, P < 0.0001) (Fig. 3). However, the strength and pattern of this relationship differed significantly between the two treatment groups, leading to a significant treatment-by-time interaction (F(1,53.9) = 8.2, P = 0.045). For example, while HF1995 triggered antibody responses that increased at a rate of 0.06 enzyme-linked immunosorbent assay (ELISA) units (EU)/ml between 7 and 14 dpi, Rlow did so at a rate of 0.01 EU/ml over the same period, generating a 130% increase in the amplitude of the response of HF1995-inoculated birds relative to Rlow-inoculated ones (treatment: F(1,58.8) = 18.6, P = 0.022).
The Arthropod Pathogen Microarray (APM) is a custom DNA microarray capable of detecting over 200 arthropod-associated viruses, microbes, and metazoans. This DNA microarray includes oligonucleotides representing every arthropod-infecting virus with a published nucleic acid sequence in the International Committee on Taxonomy of Viruses database as of November 2008. Design principles used for APM oligonucleotides (70-mers) were based on previous pan-viral microarrays using ArrayOligoSelector (AOS). In addition, non-viral pathogens, including Nosema (microsporidia), Crithidia (trypanosomatid), Varroa (mite), Tropilaelaps (mite) and Acarapis (tracheal mite), as well as the bacterial species Paenibacillus larvae and Melissococcus plutonius, were represented on the microarray (Table 1). This new diagnostic tool is composed of 1536 oligonucleotides, including viral, non-viral and positive control targets (Table 1). Array analysis is performed computationally using e-predict. The sensitivity of the APM was estimated to be 1.9 × 10^5 viral genome copies (1 pg of Drosophila C virus in vitro transcribed genomic RNA) in an A. mellifera RNA (1 µg) background (see Materials and Methods). Array specificity was confirmed by performing pathogen-specific PCRs in conjunction with nucleic acid sequencing. Test samples included honey bees from managed and feral colonies, Vespula sp. (yellow jackets), and Bombus sp. (bumble bees) (Table S1). A sample from a collapsed colony in Montana tested positive for the highest number of viruses (BQCV, DWV, KBV, IAPV), documenting the array's ability to simultaneously detect multiple pathogens. Analysis of symptomatic honey bees, such as hairless, shiny bees and bees with deformed wings, confirmed the presence of CBPV and DWV, respectively. Likewise, analysis of Varroa destructor RNA validated the array's ability to detect mites and their associated viruses (DWV). Interestingly, pathogens normally associated with honey bees, DWV and ABPV, were also detected in a yellow jacket sample (Vespula sp.) obtained near a hive entrance from which the honey bees also tested positive for ABPV and DWV. We used the APM to detect several pathogens (BQCV, DWV, SBV and Nosema) in CCD-affected colony samples from an Oklahoma-based migratory beekeeping operation (Feb. 2009). In total, we detected and sequence-confirmed ten previously characterized honey bee pathogens using the array, including CBPV, IAPV, DWV, ABPV, BQCV, SBV, KBV, Nosema apis, N. ceranae and Varroa destructor.
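As a rough, back-of-envelope check of the stated sensitivity, the mass-to-copy-number conversion can be reproduced in a few lines of R; the genome length (~9.3 kb for Drosophila C virus) and the average ribonucleotide mass (~340 g/mol) used below are approximate assumptions not given in the text.

genome_nt <- 9264     # approximate DCV genome length in nucleotides (assumption)
mw_per_nt <- 340      # approximate average mass of one ribonucleotide, g/mol (assumption)
mass_g    <- 1e-12    # 1 pg of in vitro transcribed genomic RNA
avogadro  <- 6.022e23
copies <- mass_g / (genome_nt * mw_per_nt) * avogadro
copies                # ~1.9e5 genome copies, consistent with the stated sensitivity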
All statistical analyses were conducted in R (http://www.R-project.org/). We tested for differences in the abilities of HF1995 and Rlow to establish an infection using a chi-square test. Differences in the probabilities that birds were infected at 2 dpi were determined using logistic regression, with infection status (infected/not infected) as the response variable and treatment (HF1995 or Rlow) as the explanatory term. Differences in peak bacterial loads and in times to clearance of the infection were modeled only in infected individuals using Mann-Whitney U tests, with either peak bacterial load or date of clearance as the response variable and treatment as the explanatory term. We tested for differences in the abilities of HF1995- and Rlow-inoculated birds to clear the infection using a chi-square test. Differences in the probabilities of developing clinical symptoms were modeled using logistic regression, with clinical symptoms (0/1) as the response variable and treatment as the explanatory term. We then investigated differences in the severity of conjunctivitis as a function of treatment in symptomatic individuals only, using a Mann-Whitney U test with peak conjunctivitis as the response variable and treatment as the explanatory term. To test for differences in circulating levels of anti-M. gallisepticum antibodies over time (i.e., between 7 and 28 dpi) in infected individuals, we used lme4 to perform a generalized linear mixed-model analysis, with antibody concentration as the response term, time, treatment, and their interaction as explanatory terms, and individual identity as the random effect (54). All figures were made using ggplot2 (55).
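The following is an illustrative R sketch of this analysis pipeline, not the authors' code; the data frames and column names (birds, serology, status_2dpi, infected, peak_load, clearance_day, antibody, dpi, bird_id, treatment) are hypothetical placeholders.

library(lme4)      # mixed models (lme4 is cited in the text)
library(ggplot2)   # figures (ggplot2 is cited in the text)

# Probability of testing positive at 2 dpi, by treatment (logistic regression)
fit_2dpi <- glm(status_2dpi ~ treatment, data = birds, family = binomial)

# Peak bacterial load and day of clearance, infected birds only (Mann-Whitney U)
wilcox.test(peak_load ~ treatment, data = subset(birds, infected == 1))
wilcox.test(clearance_day ~ treatment, data = subset(birds, infected == 1))

# Antibody levels between 7 and 28 dpi: time, treatment, and their interaction,
# with a random intercept for individual identity (shown here as a linear mixed model)
fit_ab <- lmer(antibody ~ dpi * treatment + (1 | bird_id), data = serology)
summary(fit_ab)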
Six-week-old female BALB/c mice were purchased from the National Laboratory Animal Center, Taiwan, and raised under specific pathogen-free conditions. All animal experiments were reviewed and approved by the Institutional Animal Care and Use Committee of Taichung Veterans General Hospital.
Given the present state of health care technologies, veterinary medicine is entering a phase of new and remarkable transformations. Nanotechnology has recently made significant contributions to technological development in human and veterinary medicine, particularly by creating new knowledge and translating it into the field. Nanotechnology is one of the key technologies of the 21st century and could offer numerous benefits to humans as well as animals through the development of various competitive devices across a wide range of sectors, directly supporting human welfare. Nanotechnology is a multidisciplinary approach drawing on principles from various subjects, including physics, chemistry, materials science, biology, engineering, and medicine. The term nanoparticle strictly refers to particles 1–100 nm in size. Nanotechnology and nanomedicine have recently made significant contributions to clinical therapeutics through biocompatible nanoscale drug carriers, such as liposomes, micelle nanoparticles, dendrimer nanoparticles, polymeric nanoparticles, and metal nanoparticles, enabling more efficient and safer delivery of various anticancer drugs. Nanoparticle-mediated drug delivery provides longer circulation half-lives, improved pharmacokinetics, and fewer undesired side effects. Nano-sized materials have been employed to improve pharmacological therapies and to provide novel modalities for treatment and diagnosis. Nanomedicine increases delivery efficacy and potential in cancer treatment, particularly by narrowing the biodistribution of a drug, thereby reducing off-target side effects while increasing drug exposure of target cells. Nanoparticle-based therapy could improve the balance between the efficacy and the toxicity of systemic therapeutic interventions. Nanoparticles can carry high drug payloads, overcoming a limitation of small-molecule drugs, which are susceptible to transmembrane diffusion. Nanomedicine offers several nanoparticulate platforms, such as liposomes, liposome composites, lipid micelles, polymer micelles, polymer-drug conjugates, dendrimers, protein carriers, biologically synthesized nanoparticles, and inorganic nanoparticles, as drug delivery vehicles; in some cases the nanoparticles themselves act as cytotoxic or bio-imaging agents. Veterinary medicine plays an important role in safeguarding health in the dairy industry and mitigating significant economic losses.
Domestic livestock are a primary source of livelihood and revenue for more than 600 million farmers in developing countries and contribute about 30–35 per cent of agricultural gross domestic product. Devastating outbreaks of new diseases and the re-emergence of old infections in animals lead to substantial loss of income for livestock keepers. Delays in detecting and controlling zoonoses allow infections to spread to entire herds and to humans. Therefore, it is essential to explore novel technologies to prevent and treat diseases caused by various microorganisms in veterinary animals. Recent developments in nanoscience and nanotechnology support several areas of research and application in veterinary medicine, including new diagnostic tools and new forms of treatment, which help increase the longevity and improve the quality of life of veterinary animals. Nanotechnology is specifically defined as the design, characterization, and application of structures, devices, and systems whose shape and size are controlled at the nanometer scale (1 to 100 nm); nanomedicine offers a more strategic approach, because such small particles can encapsulate and deliver drugs in ways that dramatically enhance their effectiveness. Associating a therapeutic molecule with a nanoparticle can enhance its solubility by orders of magnitude, allowing hydrophobic drugs to be carried more easily through the bloodstream. This could help to address a serious challenge in pharmacology, since an estimated 40 per cent of new drugs are poorly soluble in biological fluids. In addition, nanoparticles can enable controlled release of a drug or deliver two different drugs simultaneously to provide a more powerful combination therapy. Most studies of either beneficial or cytotoxic effects have been performed in rodent in vivo models, owing to the similarity of their biochemical and physiological pathways to human metabolism, whereas only limited studies are available in veterinary animals. Clinical studies have been performed both as primary research focused on the treatment and diagnosis of veterinary diseases and as translational research in which spontaneous diseases in animals serve as models of human diseases. Taking the literature into account, this review focuses on the basic principles behind the most commonly used nanoparticles as antimicrobial agents and in drug delivery, diagnostics, vaccine formulation, feed additives, reproductive aids, animal growth, and animal production. We further discuss the clinical applications and limitations, providing the reader with a realistic synopsis of the practical applications of nanoparticles in veterinary medicine at present and in the near future.
Nanotechnology is an important emerging industry, with a projected annual market of around one trillion US dollars by 2015. It creates novel materials with a variety of useful functions, including many that could be exceptionally beneficial in medicine. However, concerns are growing that it may have toxic effects, particularly damage to the lungs. The application of nanotechnology in medicine therefore requires special attention to the toxicology of nanoparticles and nanostructures, and dedicated nanotoxicology studies, as a subcategory of toxicology, are warranted. Classical categories are required for toxicological risk assessment of the use of nanoparticles, including hazard identification, hazard characterization, exposure assessment, and risk calculation (Luther, 2004). Since practically no toxicology studies are available on the emerging applications of nanotechnology in medical technology, studies addressing these toxicological risks are urgently required. Furthermore, knowledge of the fate of ingested nanoparticles in the human body is lacking; it is essential to investigate routes of exposure and to establish basic knowledge of their absorption, distribution, metabolism, and excretion. Finally, a risk management strategy should be implemented for all medical products and medical technology applications that use nanoparticles.
Before 2012, the subfamily Coronavirinae included three genera (Alphacoronavirus, Betacoronavirus and Gammacoronavirus). In 2012, however, an emerging genus, Deltacoronavirus, was identified in many animal species, including swine, in Hong Kong. At present, more than five different coronaviruses have been described in swine populations. Among them, porcine epidemic diarrhea virus (PEDV), transmissible gastroenteritis virus (TGEV), and porcine respiratory coronavirus (PRCV) belong to the genus Alphacoronavirus, whereas porcine hemagglutinating encephalomyelitis virus (PHEV) and porcine deltacoronavirus (PDCoV) are assigned to the genus Betacoronavirus and the genus Deltacoronavirus, respectively. Numerous studies have shown that more than half of the porcine coronaviruses (including PDCoV) are enteropathogenic, causing acute diarrhea and vomiting in pigs and resulting in huge economic losses for the global swine industry [2–6].
Currently, PDCoV has been reported in Hong Kong, North America, Mexico, South Korea, Thailand and some provinces of China [1, 7–19]. Despite recent progress, little is known about the prevalence and epidemiology of PDCoV in southern China (including Guangdong province, Hainan province, and the Guangxi autonomous region), where a large share of the country's swine production takes place. Therefore, the aim of this study was to investigate the prevalence and sequence properties of PDCoV in this region.
Porcine reproductive and respiratory syndrome (PRRS) is widely accepted as one of the most economically important diseases affecting the swine industry. In 2006, there was an unparalleled large-scale outbreak of the so-called high fever disease in most provinces of China that affected more than 2,000,000 pigs, raising concerns within the global swine industry and for public health [2–4]. In March 2007, the disease was identified in the Hai Duong province of Vietnam, and it spread countrywide, affecting more than 65,000 pigs [5, 6]. The outbreaks caused extensive concern worldwide. Studies demonstrated that highly pathogenic porcine reproductive and respiratory syndrome virus (HP-PRRSV) was the major causative pathogen of the so-called high fever disease. Genetic analysis indicated that the HP-PRRSVs isolated from China and Vietnam shared a discontinuous deletion of 30 aa in nonstructural protein 2 (NSP2) as compared with the North American type PRRSV strains (NA PRRSV) [2, 5, 8]. Since 2006, HP-PRRSV and classical North American type PRRSV strains have coexisted in China. The PRRS epidemic situation in China is now very complicated, with HP-PRRSV as the predominant form. Rapid differential detection of the two strains of PRRSV is very important for effective PRRS control. It is therefore imperative to develop an assay for simultaneous detection and strain identification of HP-PRRSV and PRRSV.
Current immunoassays, such as immunohistochemistry and serological methods, cannot differentiate between the two strains of PRRSV. Conventional RT-PCR is time-consuming, poorly sensitive, and prone to contamination. The development of real-time RT-PCR technology offers the opportunity for more rapid, sensitive, and specific detection of the virus. The two current major genotypes, the European (EU) and North American (US) strains, can be rapidly identified by SYBR Green-based or TaqMan probe-based real-time RT-PCR assays [9–11]. A specific TaqMan probe real-time RT-PCR has been developed for assaying HP-PRRSV, but it is not able to detect HP-PRRSV and classical PRRSV differentially.
In this study, real-time RT-PCR assays for simultaneous detection and differentiation of HP-PRRSV and PRRSV, using both SYBR Green and TaqMan probe chemistries, were developed and validated. These two methods provide alternative diagnostic assays for diverse PRRSV epidemiological circumstances.
The total number of prescribed capsules (768.6 ± 30.9, P = 0.141) and the total number of capsules taken (727.4 ± 55.8, P = 0.888) did not differ significantly between the two groups. Compliance, defined as taking more than 75% of the medication, was high in both groups: 95.2 ± 4.7% in the KRG group and 94.0 ± 7.2% in the placebo group. No subject failed to complete the study because of poor compliance (Table 2). Blinding was also maintained adequately during the study period: after the study, 26.7% of those taking KRG and 21.3% of those taking placebo thought they had been given the KRG.
Human cultural practices have drastically modified environmental conditions and behaviors, promoting rapid and substantial genomic changes often associated with positive selection and adaptation (gene-culture dynamics). In the history of Homo sapiens sapiens, a particularly important event that triggered a new and striking gene-culture-coevolution cycle was the development of agriculture and animal domestication during the Neolithic period (∼10,000 years ago). Furthermore, the human gene-culture coevolution mediated by the domestication of plants and animals has been argued to provide some of the clearest and most spectacular examples of niche construction. Niche Construction Theory can be defined as a branch of evolutionary biology that emphasizes the ability of organisms to modify the pressures of natural selection in their environment and thereby act as co-directors of their own evolution, as well as that of other directly associated species [6].
Although more than 100 regions/genes have been identified as likely targets of recent positive selection resulting from cultural pressures in newly constructed niches, well-documented examples are scarce. One of the best-known cases of gene-culture coevolution is lactase persistence (LP; the ability of adult humans to digest the lactose found in fresh milk) and dairying. High frequencies of LP are generally observed in traditional pastoralist populations. For example, LP reaches ∼64% in Beni Amir pastoralists from Sudan, whereas its frequency in a neighboring non-pastoralist community is only ∼20%. In Europe, LP varies from 15–54% in eastern and southern regions, 62–86% in central and western regions, and 89–96% in northern regions [13]. Multiple independent mutations have been associated with this characteristic, some of which are located in an intron of the MCM6 gene, a region fundamental to lactase expression. The alleles that led to lactase persistence in Europe, such as MCM6 13,910*T, first underwent selection among dairying farmers around 7,500 years ago, possibly in association with the dissemination of the Neolithic Linearbandkeramik culture over Central Europe. The high copy number variation of the amylase gene and the spread of the corresponding alleles in agricultural societies are another well-studied example. Additionally, West African Kwa-speaking agriculturalists cut and clear the forest to grow yams, increasing the amount of standing water after rain and thereby providing better breeding grounds for malaria-carrying mosquitoes; this favors the HbS allele, which confers protection against malaria in heterozygous individuals.
America was the last continent colonized by modern humans in prehistoric times. In less than 15,000 years before present (YBP), these first migrants had to adapt to an immensely wide variety of environments. In some regions during this evolutionary trajectory, as in Mesoamerica and the Andes, hunter-gatherer/forager societies gave rise to agriculturalist and urban communities, while others remained with a hunter-gatherer/forager subsistence system until the time of contact with Europeans or even until the present day. Thus, studies with Native American populations can provide useful information for better understanding gene-culture coevolution and the niche construction processes.
Based on studies of blood groups and other classical genetic polymorphisms, J. V. Neel and F. M. Salzano were pioneers in identifying complex population processes highly dependent on cultural factors in Native Americans (e.g., the fission-fusion dynamic). Other examples are related to the coevolution of genes and languages, but only two more recently reported examples might be associated with positive selection: (1) Tovo-Rodrigues et al. [21] investigated the distribution of D4 dopamine receptor (DRD4) alleles in several South Amerindian populations and found a significant difference in the allelic distributions between hunter-gatherers and agriculturalists, with an increase of the 7R allele among the former; and (2) Acuña-Alonzo et al. showed that the 230Cys allele (Arg230Cys, rs9282541) of the ATP-binding cassette transporter A1 (ABCA1) gene, which was previously associated with low HDL-cholesterol levels and obesity-related comorbidities, was exclusively present in Native American and mestizo individuals. These authors verified that cells expressing the ABCA1*230Cys allele showed a 27% reduction in cholesterol efflux, confirming that this autochthonous Native American variant has a functional effect in vitro. Other investigations have shown that the presence of ABCA1*230Cys explains almost 4% of the variation in plasma HDL-C concentrations in Mexican admixed populations. This was the largest effect on HDL-C concentration associated with a single nucleotide polymorphism (SNP) among different continental populations in these genome-wide association studies, corroborating its functionality.
Acuña-Alonzo et al. [22] demonstrated that 230Cys resides in a haplotype that was the target of an ongoing directional selective sweep, suggesting that 230Cys conferred an advantage during periods of food deprivation in the past. On the other hand, under the current modern lifestyle, 230Cys may have become a major susceptibility allele for low HDL levels and has been correlated with metabolic diseases. This study provides an example of the "thrifty" genotype hypothesis, which postulates that variants that increase the efficiency of energy use and storage during periods of famine would have been positively selected in prehistoric times but can be associated with diseases of affluence in contemporary societies, where food is usually abundant.
Here, we expand the investigations of the Arg230Cys polymorphism in Native Americans and integrate the thrifty genotype concept with the gene-culture coevolution process, considering the human ability to create new ecological niches that may lead to the selection of genetic variants.
Among the 34 subjects who experienced ARI, the symptom duration was 5.2 ± 2.3 days in the KRG group and 6.3 ± 5.0 days in the placebo group (P = 0.475). The symptom duration was shorter in the KRG group, but the difference between the two groups was not statistically significant (Table 3). The symptom score was lower in the KRG group than in the placebo group (KRG 9.5 ± 4.5 vs placebo 17.6 ± 23.1), but this difference was also not statistically significant (P = 0.241). Among the individual symptoms, the frequency of cough was significantly lower in the KRG group than in the placebo group (P = 0.045). Nasal congestion (P = 0.064) and headache (P = 0.059) were also less frequent in the KRG group than in the placebo group, with marginal significance.
Fifty-four ministries (26 MOH, 25 MAg, and 3 combined responses) in 31 (93.9%) of the 33 target LAC countries responded to the survey. Respondents within the Ministries were mostly zoonoses programme managers. Responses from national zoonoses groups with personnel from both ministries were labelled as ‘combined’ for the purposes of this analysis. Responses were received from: Argentina, The Bahamas, Barbados, Bermuda, Brazil, Bolivia, Cayman Islands, Chile, Colombia, Cuba, Dominican Republic, Ecuador, El Salvador, Grenada, Guatemala, Guyana, Haiti, Honduras, Jamaica, Mexico, Nicaragua, Panama, Paraguay, Peru, St. Lucia, St. Vincent and the Grenadines, Suriname, Uruguay, Trinidad and Tobago, The Turks and Caicos Islands, and Venezuela.
Twenty-two (85%) MOH, 5 (20%) MAg, and 2 (67%) combined entities indicated they had specialized zoonoses units. The proportion of South American countries with zoonoses units in their Ministries was smaller (12 ministries, 60%) than that of Central American countries and Mexico (9, 82%), but larger than for Caribbean countries (8 ministries, 35%). The annual budget available for zoonotic diseases programmes at the national level ranged from 69,000 to 6,683,000 USD with a median of 447,000 USD, and was correlated with country GDP (MOH r = .69, p = 0.009; MAg r = .54, p = 0.048). The median number of full-time senior management staff working on the national zoonoses programmes was 1 (range 0–12), 4 for technical staff (range 0–150), and 1.5 for administrative/supporting staff (range 0–57).
We can now examine some hypotheses in an attempt to explain the results and to draw the evolutionary scenario associated with the pattern of diversity of the ABCA1*Arg230Cys polymorphism.
Maize is considered the most important native crop of the Americas [53]. Several lines of evidence indicate that the Mesoamerican village lifestyle began with maize domestication [51]. Originating in the Mexican southwestern lowlands, maize journeyed southwards, traveling hand-in-hand with pottery and bringing sedentary life to the Andes, although the date of its entry, as well as the dispersion pattern of this crop into and throughout South America, remain controversial.
Other crops, such as squash and beans, were also present in the pre-Columbian Mesoamerican civilizations, but maize was the dietary base for most of them. For example, Benedict and Steggerda showed that 75% of the calories consumed by the Mayas were derived from maize. In addition, Mesoamerica was the only region in the world where an ancient civilization lacked a domesticated herbivore. Protein from domesticated animal sources would therefore have been scarce in pre-Hispanic Mesoamerica in comparison to other parts of the ancient urbanized world, including the Andes. As a whole, these studies demonstrate that the diet of the first Mesoamerican sedentary communities was extremely dependent on maize. These early farmers, however, suffered periods of crop loss, calling into question the common assumption that farming and a sedentary lifestyle brought increased dietary stability and health. Several studies indicate that health homeostasis would have declined with sedentary farming, and bioarcheologists and paleopathologists have detected a deterioration in Mesoamerican health indices from ∼8,000 to ∼500 years before present (YBP) ([63] and references therein). Domestic crops are more vulnerable than wild ones, crowding promotes crop diseases, and storage systems often fail (estimates suggest that as much as 30% of stored food is lost even in modern, sophisticated systems). In another study, based on molecular analysis of dietary diversity in three archaic Native Americans, Poinar et al. found evidence that the diet of hunter-gatherers was more varied and nutritionally sound than that of individuals dependent on agriculture. Clearly, a diet based on one or only a few crops would have been deleterious to health in the pre-Columbian era. These different lines of evidence illustrate that the incipient farming niches of Mesoamerica, established when communities of hunters/gatherers/foragers started to cultivate and domesticate wild plants, could have been remarkably unstable, like those of other pre-industrial societies.
Based on what was discussed above, as well as on our results (allele age and neutrality/selection tests), it is reasonable to suppose that ABCA1*230Cys has an American origin and could have conferred a selective advantage during the periods of food scarcity experienced by Mesoamericans during the establishment of a sedentary, maize-based lifestyle. The strong correlation between the spread of maize culture and 230Cys frequencies in this region reinforces this suggestion, even if the advantage of the allele may have been lost after technological innovations were implemented and agricultural production stabilized. Peng et al. [66] presented evidence for a similar case of gene-culture coevolution, suggesting that positive selection for the ADH1B*47His allele was driven by the emergence and expansion of rice domestication in East Asia.
Notably, other environmental factors may also have been involved in the distribution of the ABCA1 alleles, since cholesterol plays an important role in various infectious processes, such as the entry and replication of Dengue virus type 2 and flaviviral infection. Additionally, the ABCA1 transporter participates in infectious and/or thrombotic disorders involving vesiculation, and homozygous ABCA1 gene deletions confer complete resistance against cerebral malaria in mice. These findings can be considered additional causal factors, alongside agricultural development, for the ABCA1*230Cys selective sweep. A sedentary village lifestyle, with a corresponding growth in local population density, can increase mortality rates, particularly in children under 5 years of age. For example, archeological and paleoecological evidence from Europe showed that during the Neolithic demographic transition the causes of increased infant mortality would have included the lack of safe drinking water, fecal contamination, the emergence of highly virulent zoonoses, and an increase in the prevalence of other pathogens such as Rotavirus and Coronavirus (causing diarrhea, one of the main killers of children under 5 years of age), Streptococcus, Staphylococcus, Plasmodium (P. falciparum and P. vivax, which are believed to have emerged more recently), and Herpesvirus [70]. However, establishing the real impact of the ABCA1*230Cys variant on these infectious processes will require additional functional studies.
In agreement with this historical scenario, the genetic variation at Arg230Cys presented a worse fit to neutrality than loci known to be neutral, indicating that selective mechanisms are necessary to explain the genetic diversity at this locus, especially when the Mesoamerican agriculturalist subdivision is considered in the analysis. This result also supports our hypothesis that maize domestication in Mesoamerica led to changes in the gene pool of the natives of that region.
South America presents much more diversity in habitats, peoples, and cultures than Mesoamerica. For instance, maize reached South America, but the level of consumption seen in Mesoamerica/Central America was apparently rarely attained there. Archaeological data indicate that maize consumption became important only during the establishment and expansion of the Inca Empire (800–500 YBP), and even then it was not comparable to that of Mesoamerica/Central America [73]. Additionally, South Amerindian hunter-gatherers/foragers present lower intrapopulation genetic variation and higher levels of population structure than Andean populations, a tendency also observed in the present study. These results indicate low levels of gene flow between villages/populations and low effective population sizes, favoring the role of genetic drift, whereas the Andean groups show the opposite characteristics. These findings correlate well with the distinct patterns of gene flow and historical effective sizes in these indigenous populations, with cultural differences, and with paleoclimatic and environmental changes in their habitats. Therefore, the significant role of random processes and/or the more heterogeneous cultural and ecological scenarios make it difficult to define a particular pattern associated with the Arg230Cys polymorphism in South American groups, a situation different from that in Mesoamerica.
In conclusion, our analyses demonstrate for the first time a robust correlation between a constructed niche and a selected Native American autochthonous allele. The 230Cys allele, with a probable origin in the Americas, seems to have been the target of an ongoing directional selective sweep driven by the origin and spread of maize culture in ancient Mesoamerica.
Panax ginseng Meyer is a well-known medicinal plant worldwide. Ginseng is a deciduous perennial belonging to the family Araliaceae and the genus Panax. The genus name, Panax, is derived from the Greek pan (all) and akos (cure), meaning “cure-all” or “all healing,” which reflects the traditional belief that ginseng can heal all aspects of the body. The name ginseng comes from the Chinese words “Jen Sheng,” meaning “man-herb,” because of the humanoid shape of the root or rhizome, which is the part of the plant most commonly used for extraction. About 13 different species of ginseng have been identified worldwide. Among them, the most commonly used are Asian ginseng (P. ginseng Meyer, Renshen) and American ginseng (Panax quinquefolius L., Xiyangshen), both of which belong to the genus Panax of the family Araliaceae. Asian ginseng has been used for thousands of years as a tonic to improve overall health, restore the body to balance, help the body heal itself, and reduce stress, and American ginseng has been used by Native Americans for at least hundreds of years. Ginseng is prepared and used in several ways: as fresh ginseng (sliced and eaten, or brewed in a tea), white ginseng (peeled and dried), red ginseng (peeled, steamed, and dried), extract (tincture or boiled extract), powder, tea, tablet, or capsule. Ginseng has been reported to exhibit a wide range of beneficial pharmacological effects, including immunomodulatory, antitumor, antioxidant, antidepressant, and hypoglycemic activities, inhibition of gastric lesions, attenuation of leptin-induced cardiac hypertrophy, protection of the heart against ischemia and reperfusion injury, prevention of glucose-induced oxidative stress, and prevention of diabetic nephropathy, retinopathy, and cardiomyopathy. This broad spectrum of biological activity originates from ginseng’s multiple bioactive components, namely ginsenosides, polysaccharides (PSs), peptides, polyacetylenic alcohols, and gintonin.
During the last decade, awareness of the intimate links between human and animal health has increased rapidly in the context of disease emergence. Indeed, approximately 80% of recently emerged infectious diseases were zoonotic. The role of wildlife in transmitting emerging pathogens to humans and domestic animals has been pointed out in many cases [5]. Conversely, pathogen transmission from domestic animals to wildlife has received far less attention, although the importance of this issue has often been mentioned. Indeed, contacts between wildlife and livestock or their environment sometimes result in wildlife diseases with conservation implications. In addition, the release of hand-reared animals into the wild, for either conservation or exploitation purposes, represents a particular case in which hand-reared individuals eventually share natural habitats with their wild congeners. In both cases, such releases can dramatically influence disease dynamics in the surrounding wild animal populations [14].
In the present study we focused on the case of game restocking, which involves the release of millions of individuals worldwide each year. Birds are the most frequently involved, with millions of individuals released annually in Europe alone. For example, more than 3 million red-legged partridges are released annually in Spain, and ca. 1.4 million Mallards in France. The Camargue region, a complex network of wetlands in the Rhone delta, is a major wintering area for ducks and a central place for wildfowl hunting in France. Hunting is also among the most important economic activities in the area, which is one of the reasons for the massive Mallard releases in the Camargue: at least 30,000 hand-reared individuals are released annually in the region. Maximum Mallard numbers in the wild are reached in September, after the beginning of the hunting season, with 56,500 individuals on average over the last seven years (Gauthier-Clerc, unpubl. data); these numbers certainly include a mix of wild and released Mallards. Given its central position on the flyway of many European migratory species, the Camargue is also a potential hotspot for the introduction and transmission of bird-borne pathogens.
For this reason, avian influenza viruses (AIV) have been studied in the area since 2004. These negative-sense, single-stranded RNA viruses of the Orthomyxoviridae family are commonly characterized by the combination of their surface proteins, hemagglutinin (HA) and neuraminidase (NA). AIVs are highly variable and undergo continuous genetic evolution via two mechanisms: i) accumulation of point mutations at each replication cycle, and ii) reassortment involving gene segment exchanges when a cell is co-infected by different viruses. These mechanisms contribute to the emergence of new variants with the ability to transmit to new hosts and/or with epidemic or even pandemic potential. Aquatic birds, particularly Anseriformes (ducks, geese and swans) and Charadriiformes (gulls, terns and shorebirds), constitute their major natural reservoir. The AIV circulating in wild birds are usually low pathogenic (LPAIV). LPAIV generally have little impact on their host, although some studies have reported a possible influence on migration capacities. However, when LPAIV of the H5 or H7 subtypes are transmitted from wild birds to domestic ones reared in artificial environments, their virulence can evolve to high pathogenicity. Highly pathogenic avian influenza viruses (HPAIV), such as the HP H5N1 strains currently circulating in Asia and Africa, remain of great economic concern, notably due to the cost of preventive actions including vaccination and massive bird culling. Moreover, HPAIV infections represent a threat to human health, since 603 HPAIV H5N1 human infections, including 356 fatal cases, have been reported worldwide since 2003.
A relatively high prevalence of AIV was regularly detected in the wintering Mallard population of the Camargue (e.g. 5.4% during the 2006–2007 hunting season). Moreover, a seasonal infection pattern was identified in Mallards during autumn and winter, with higher infection rates in early fall. Mallards hence represent a focal study species in AIV research: wild Mallards are one of the main natural reservoir hosts of low pathogenic AIV and have proven to be healthy carriers of some H5N1 HPAIV strains. However, to our knowledge, no study has investigated the potential role of hand-reared Mallards released for hunting in the epidemiology of AIV, despite the very large number of ducks released into the wild annually.
To fill this gap, we conducted a 2-year study in the Camargue to investigate the potential influence of hand-reared Mallard releases on AIV dynamics in the surrounding wildlife. We first hypothesized that, owing to high-density rearing conditions and to their genetic uniformity, hand-reared Mallards should be highly susceptible to AIV infections and could play an amplification role in AIV dynamics. A similar phenomenon has been reported in red-legged partridges (Alectoris rufa) reared for hunting in Spain, where Escherichia coli prevalence was much higher in hand-reared populations before their release than in wild ones. To test this assumption, we collected cloacal swabs from Mallards reared for hunting in several game bird facilities (GBF) in 2009 and 2010 and analysed these samples to measure AIV prevalence. Second, we hypothesized that AIV exchange occurs between wild and hand-reared Mallards, potentially leading either to the circulation of new strains in wild populations or to the amplification and dispersal of wild strains. Indeed, no barrier prevents AIV exchange between wild birds and hand-reared ducks in the GBF, since water flows exist between pens and ponds used by wild birds. Moreover, as the GBF roofs are made of nets, wild birds can deposit feces in the pens. Finally, hand-reared ducks are in direct contact with wild ones after their release. To investigate these issues, we tested shot waterfowl before and after the hand-reared Mallards were sampled. Notably, genetic analyses suggest that 76% of hunted Mallards in the Camargue have a captive origin. Considering the low annual survival of released Mallards (0.8–15.9% depending on the release site), the individuals we tested from the Camargue hunting bags certainly included a large proportion of ducks released some months before being shot. These released ducks cannot be differentiated morphologically from wild ones. We therefore analyzed shot ducks as a whole, since they are a representative sample of the Mallard population wintering in the Camargue, which is composed of individuals of both wild and captive origin that share habitats and can thus be considered a single epidemiological unit.
Our third hypothesis was that any AIV strain found in captive-reared Mallards might present genetic characteristics linked to its circulation in domestic populations. We therefore performed a molecular study of the identified strains to look for such characteristics, and in particular to test for the presence of mutations known to be associated with increased virulence, since artificial environments have been shown to favor the appearance of such mutations in birds. We also searched for mutations linked with transmission to other species, including humans, since some studies have shown that such mutations can be acquired during transmission among birds. Third, we tested for the presence of mutations conferring resistance to common antiviral drugs, since such resistance has recently been recorded in wild birds, notably in Sweden. Lastly, full sequence analysis of some strains was performed to gain insight into their geographic origin through a phylogenetic study and to determine their relatedness to strains that have caused human infections in the past.
Live animal markets (LAMs) represent a traditional place for congregation and commerce, particularly in developing countries. Owing to their role as a source of affordable, live or freshly slaughtered animals, LAMs can act as a hub for the transmission of pathogens, especially viruses.1, 2, 3 Spread of a virus within the market is often enhanced by high-density, close-contact animal housing, increasing the risk of zoonotic and anthroponotic transmission.4 Animals remain in the LAM for extended periods of time until sold and can consequently transform these markets into viral reservoirs. As new animals are introduced to the LAM, infected animals easily transmit viruses to these naïve hosts, thus perpetuating and amplifying viral circulation.5 In addition, LAMs are often part of a larger marketplace ecosystem, potentially exposing people to zoonotic pathogens with little to no direct contact with infected animals.1 This is especially true for avian influenza viruses (AIV).
LAMs in many parts of the world harbor highly pathogenic as well as low pathogenic AIV, which can spread asymptomatically through poultry and are difficult to detect without routine surveillance.6, 7, 8, 9 Although AIV has been detected in North American and Caribbean LAMs, there is no information about South American LAMs,10, 11 likely due to minimal surveillance.12 During active surveillance at the largest LAM in Medellin, Colombia, we isolated two H11N2 viruses from asymptomatic birds. At the peak of the occurrence, 17.0% of the birds in the market tested positive. Genetically, the circulating virus was most similar to viruses isolated from North American migratory birds and to viruses isolated in 2013 from Chilean shorebirds. H11 viruses are distributed worldwide,10, 13, 14, 15 primarily in wild ducks and shorebirds,16, 17 but are rarely found in domestic poultry.9, 10, 18, 19, 20 Given this unique occurrence and the fact that H11 viruses have been reported to cause human infections,21, 22 we characterized the viruses in vitro and in vivo. The Colombian H11 viruses displayed no molecular markers associated with increased virulence in birds or mammals and had an α2,3-sialic acid binding preference. They replicated and transmitted effectively in chickens, explaining the spread throughout the market, but caused little morbidity in BALB/c mice. The genetic similarity to H11 viruses isolated from South American shorebirds suggests that the LAM occurrence may have resulted from a spillover from wild birds to domestic poultry. These findings highlight the need for enhanced AIV surveillance in South America, especially in high-risk areas such as LAMs.
Porcine hemagglutinating encephalomyelitis coronavirus (PHEV), a member of the Coronaviridae family, causes porcine encephalomyelitis. PHEV predominantly affects 1-3 week-old piglets [1], with vomiting, exhaustion, and obvious neurological signs as the main clinical features; the mortality rate ranges from 20% to 100% [2]. Since the disease first broke out in the Canadian province of Ontario in 1958 [3], it has been reported in many countries, and serological surveys indicate that PHEV infection in pigs is very common and that the virus may be distributed worldwide [4,5]. In August 2006, the disease broke out on pig farms in Argentina, resulting in 1,226 deaths and a morbidity rate of up to 52.6% [1]. In China, PHEV infection was reported on a pig farm in Beijing as early as 1985, followed by reports from Jilin, Liaoning, Shandong, Taiwan, and elsewhere. A large-scale epidemic in Taiwan in 1994 had a fatality rate of almost 100% [6], resulting in serious economic losses.
Coronaviruses are usually divided into three groups based on genetic and serological relationships. PHEV, together with murine hepatitis virus (MHV), bovine coronavirus (BCoV), human coronavirus OC43 (HCoV-OC43), and rat coronavirus (RCoV), belongs to group 2 [8].
In this report, the clinical and neuropathologic features of a spontaneous PHEV infection are described. Coronavirus-like particles were detected in the supernatant of brain samples by electron microscopy, and one coronavirus strain (isolate PHEV-JLsp09) was isolated from the piglets. The hemagglutinin-esterase (HE) gene of PHEV strains was amplified by reverse transcription-polymerase chain reaction (RT-PCR) and sequenced. Homology and phylogenetic analyses based on the HE gene sequence were then performed against group 2 coronavirus and influenza C virus strains downloaded from GenBank.
Thirty-eight ministries (70%), comprising 20 (77%) MOHs, 16 (64%) MAgs, and 2 (66%) combined entities, replied that they perform prioritization exercises for endemic diseases or provided the frequency with which the process is performed. When asked how often they prioritize, 22 (58%) prioritize annually, 2 (5%) every 2–3 years, and 5 (13%) every 5–10 years. Four (11%) ministries responded that they prioritize after an outbreak or at an undefined time before the next prioritization, and five (13%) responded that they prioritize but did not specify a time frame.
Ministries were asked to list their three priority endemic zoonoses, resulting in 145 responses comprising 25 unique diseases (Table 1). The disease category “rabies” (including rabies (n = 24), bovine rabies (n = 5), and canine rabies (n = 1)) was reported by 30 ministries (55%), followed by leptospirosis (including leptospirosis (24) and Leptospira icterohaemorrhagiae (1)) (25, 46%), brucellosis (including brucellosis (14), bovine brucellosis (4), and caprine brucellosis (1)) (19, 35%), tuberculosis (including tuberculosis (7) and bovine tuberculosis (6)) (13, 24%), and Salmonella (including Salmonella (9), Salmonella enterica (1), and Salmonella Enteritidis (1)) (11, 20%). The most frequently reported priority endemic zoonoses by MOHs were leptospirosis (18, 69%), rabies (15, 58%), brucellosis (4, 15%), and salmonellosis (4, 15%). For MAgs, the most frequently reported were brucellosis (15, 60%), rabies (13, 52%), and tuberculosis (10, 40%) (Fig 1).
The number of criteria used by the ministries to prioritize endemic zoonoses ranged from 0 to 14, with 50 ministries (93%) reporting the use of at least one criterion; on average, ministries used 4.72 criteria per zoonosis. The criteria most frequently reported by the MOHs in their prioritization exercises were, in order of frequency, human disease incidence, human disease severity, human disease mortality, and human disease prevalence. For MAgs, the most frequent criteria were economic impact, animal disease prevalence, human disease incidence, and animal disease incidence. Animal welfare, public opinion, and DALYs were the criteria least frequently used by both ministry types. Respondents were asked to describe the methodologies used to aggregate impact assessments across criteria: most ministries reported using expert opinion (19, 59%), eight (25%) used multi-criteria decision techniques, five (16%) used epidemiological values, and three (9%) reported other approaches (producer demands, an analysis of indicators, and meetings regarding control and prevention).
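As an illustration of the kind of multi-criteria aggregation mentioned above, the sketch below combines per-criterion scores with a simple weighted sum. The criteria, weights, and scores are hypothetical and are not drawn from the survey; real multi-criteria decision techniques often use more elaborate weighting and elicitation schemes.

```python
# Hedged sketch of a simple weighted-sum multi-criteria prioritization.
# Criterion names, weights, and scores are illustrative placeholders.
criteria_weights = {
    "human_incidence": 0.30,
    "human_severity": 0.25,
    "economic_impact": 0.25,
    "animal_prevalence": 0.20,
}

# Scores on a common 0-10 scale for each candidate zoonosis (hypothetical).
zoonosis_scores = {
    "rabies":        {"human_incidence": 3, "human_severity": 10, "economic_impact": 5, "animal_prevalence": 4},
    "leptospirosis": {"human_incidence": 7, "human_severity": 6,  "economic_impact": 4, "animal_prevalence": 6},
    "brucellosis":   {"human_incidence": 4, "human_severity": 5,  "economic_impact": 8, "animal_prevalence": 7},
}

def weighted_score(scores: dict, weights: dict) -> float:
    """Aggregate per-criterion scores into a single priority value."""
    return sum(weights[c] * scores[c] for c in weights)

ranking = sorted(
    zoonosis_scores,
    key=lambda z: weighted_score(zoonosis_scores[z], criteria_weights),
    reverse=True,
)
print(ranking)  # diseases ordered by aggregated priority score
```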
For emerging zoonoses, 31 (57%) ministries (16 (64%) MAgs, 14 (54%) MOHs, and 1 (33%) combined entity) completed formal prioritization exercises. Each ministry was asked to list its three priority emerging zoonoses, resulting in 130 responses comprising 25 diseases (Table 2). The most frequently reported emerging zoonoses were avian influenza (AI) (avian influenza (21), H5N1 (3), H7N9 (1), and highly pathogenic avian influenza (6)) (31 ministries, 57%), Ebola virus disease (EVD) (19, 35%), and bovine spongiform encephalopathy (BSE) (15, 28%). The most frequently reported emerging zoonoses by MOHs were EVD (14, 54%), AI (11, 42%), and Chikungunya (8, 31%). For MAgs, the most frequently reported were AI (18, 72%), BSE (10, 40%), and West Nile virus (WNV) and rabies (7, 28% each).
Of the 31 ministries that completed prioritization exercises for emerging zoonoses, 30 considered the probability of introduction of the emerging condition into their countries as a criterion in their prioritizations, and 29 considered the impact of such an introduction. Impacts on public health, society, the economy, and the environment were the most frequently reported by both ministry types. Other impacts considered included those on tourism, animal health and agriculture, international relations, and food security. Ministries also considered criteria such as international sanitary regulations and the probability of rapid transmission in their prioritization of emerging zoonoses. Only 18 (33%) of the ministries (6 MAg (24%) and 12 MOH (46%)) considered equity as a criterion in their prioritization and allocation of resources for the control of zoonoses, whether endemic or emerging.
Ministries were asked to select, from a range spanning very probable (0.81–1.00) to very unlikely (<0.20), the probability of introduction of their priority emerging zoonoses into their countries. Approximately 85% of the ministries rated the introduction of Chikungunya as moderately to very probable; 71% rated the introduction of AI as moderately probable to probable; 56% rated WNV as moderately probable to probable; 36% rated EVD as moderately to very probable; and 25% rated BSE as moderately probable to probable. Chikungunya and EVD were the only two conditions whose introduction was rated very probable, by 43% and 7% of respondents, respectively (Fig 2). Although ministries were asked about the expected timeline of introduction, the majority did not know or chose not to answer this question.
The infection rates in the different GBF were determined by real-time RT-PCR targeting the conserved M gene of AIV (Table 1). A very high infection rate (99%) was observed in the single GBF sampled in 2009. In 2010 the infection rates ranged between 0 and 24%, and were 8% in the farm that had been infected in 2009. Initial subtyping of positive samples by real-time RT-PCR, searching for H5, H7, H9, N1, and N7, identified an H10N7 strain in the GBF in 2009, and further real-time RT-PCR testing for H10 showed that the outbreak involved a single H10N7 virus (referred to as H10N7 Camargue below). The viral strains involved in the 2010 infections were LPAIV; they were screened for the H5, H7, H9, H10, and N7 subtypes, but none was detected.
In shot ducks, mean AIV infection rates were 15% and 5% during the 2008/2009 and 2009/2010 hunting seasons, respectively (Table 2). During both periods, the maximum monthly infection rates (20% and 16%, respectively) were observed in September (Table 2). All the strains involved were LPAIV, but a substantial proportion were of the H5 subtype (2008/2009: 17%; 2009/2010: 30%). No H7 or H10N7 subtype was detected among the AIV found in shot ducks.
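For context, point prevalences such as those above are simple proportions of positive over tested samples. The sketch below shows how one such estimate could be reported with a 95% Wilson confidence interval; the counts are hypothetical, since the denominators are not given in this passage.

```python
# Hedged sketch: prevalence as positives/tested, with a 95% Wilson CI.
# The counts below are hypothetical placeholders, not study data.
from statsmodels.stats.proportion import proportion_confint

positives, tested = 30, 200  # hypothetical counts for one hunting season
prevalence = positives / tested
low, high = proportion_confint(positives, tested, alpha=0.05, method="wilson")
print(f"prevalence = {prevalence:.1%} (95% CI {low:.1%}-{high:.1%})")
```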