934 results for Microbial infections
Abstract:
Mucosal adjuvants are important to overcome the state of immune tolerance normally associated with mucosal delivery and to enhance adaptive immunity to often weakly immunogenic subunit vaccine antigens. Unfortunately, adverse side effects of many experimental adjuvants limit the number of adjuvants approved for vaccination. Lipid C is a novel, non-toxic, lipid oral vaccine-delivery formulation, developed originally for oral delivery of the live Mycobacterium bovis Bacille Calmette-Guerin (BCG) vaccine. In the present study, murine models of chlamydial respiratory and genital tract infections were used to determine whether transcutaneous immunization (TCI) with Lipid C-incorporated protein antigens could elicit protective immunity at the genital and respiratory mucosae. BALB/c mice were immunized transcutaneously with Lipid C containing the chlamydial major outer membrane protein (MOMP), with and without the addition of cholera toxin and CpG-ODN 1826 (CT/CpG). Both vaccine combinations induced mixed cell-mediated and mucosal antibody immune responses. Immunization with Lipid C-incorporated MOMP (Lipid C/MOMP), either alone or with CT/CpG, resulted in partial protection following live challenge with Chlamydia muridarum, as evidenced by a significant reduction in recoverable Chlamydia from both genital secretions and lung tissue. Protection induced by immunization with Lipid C/MOMP alone was not further enhanced by the addition of CT/CpG. These results highlight the potential of Lipid C as a novel mucosal adjuvant capable of targeting multiple mucosal surfaces following TCI. Protection at both the respiratory and genital mucosae was achieved without the requirement for potentially toxic adjuvants, suggesting that Lipid C may provide a safe, effective mucosal adjuvant for human vaccination.
Abstract:
China's National Health and Family Planning Commission announced 3 deaths caused by avian-origin influenza A(H7N9) virus in March, which was the first time that the H7N9 strain had been found in humans [1]. This is of major public health significance and raises urgent questions and global concerns [2, 3]. To explore the epidemic characteristics of human infections with H7N9 virus, data on individual cases from 19 February 2013 (onset date of the first case) to 14 April 2013 were collected from the China Information System for Disease Control and Prevention, which included information about sex; age; occupation; residential address; and day of symptom onset, diagnosis, and outcome for each case. The definition of an unconfirmed probable H7N9 case is a patient with epidemiologic evidence of contact …
Abstract:
It was widely anticipated that after the introduction of silicone hydrogel lenses, the risk of microbial keratitis would be lower than with hydrogel lenses because of the reduction in hypoxic effects on the corneal epithelium. Large-scale epidemiological studies have confirmed that the absolute and relative risk of microbial keratitis is unchanged with overnight use of silicone hydrogel materials. The key findings include the following: (1) The risk of infection with 30 nights of silicone hydrogel use is equivalent to 6 nights of hydrogel extended wear; (2) Occasional overnight lens use is associated with a greater risk than daily lens use; (3) The rate of vision loss due to corneal infection with silicone hydrogel contact lenses is similar to that seen in hydrogel lenses; (4) The spectrum of causative organisms is similar to that seen in hydrogel lenses, and the material type does not impact the corneal location of presumed microbial keratitis; and (5) Modifiable risk factors for infection include overnight lens use, the degree of exposure, failing to wash hands before lens handling, and storage case hygiene practice. The lack of change in the absolute risk of disease suggests that exposure to large numbers of pathogenic organisms can overcome any advantages obtained from eliminating the hypoxic effects of contact lenses. Epidemiological studies remain important in the assessment of new materials and modalities. Consideration of an early adopter effect in studies involving new materials and modalities, and further investigation of the impact of second-generation silicone hydrogel materials, are warranted.
Abstract:
Contact lenses are a successful and popular means to correct refractive error and are worn by just under 700,000 Australians [1] and approximately 125 million people worldwide. The most serious complication of contact lens wear is microbial keratitis, a potentially sight-threatening corneal infection most often caused by bacteria. Gram-negative bacteria, in particular Pseudomonas species, account for the majority of severe bacterial infections. Pathogens such as fungi or amoebae, which feature less often, are associated with significant morbidity. These unusual pathogens have come into the spotlight in recent times because of an apparent association with specific lens cleaning solutions...
Abstract:
The contact lens industry has evolved and now provides many choices, including continuous wear, overnight orthokeratology, frequent-replacement lenses, daily-disposable lenses, and many alternatives in systems of care and maintenance. Epidemiologic studies to date have shown that how a lens is worn, particularly if worn overnight, can increase the risk of microbial keratitis. However, the risk of silicone hydrogel contact lenses worn on a continuous-wear basis has been evaluated only recently. This article summarizes the recent research data on extended-wear silicone hydrogel lenses and discusses the challenges of early evaluations of silicone hydrogel lens safety. Finally, the relevance of this information to practitioners and contact lens wearers making choices about the risks and benefits of different products and how they are used is discussed.
Abstract:
Chlamydia pecorum is a significant pathogen of domestic livestock and wildlife. We have developed a C. pecorum-specific multilocus sequence analysis (MLSA) scheme to examine the genetic diversity of, and relationships between, Australian sheep, cattle, and koala isolates. An MLSA of seven concatenated housekeeping gene fragments was performed using 35 isolates, including 18 livestock isolates (11 Australian sheep, one Australian cow, and six U.S. livestock isolates) and 17 Australian koala isolates. Phylogenetic analyses showed that the koala isolates formed a distinct clade, with limited clustering with C. pecorum isolates from Australian sheep. We identified 11 MLSA sequence types (STs) among Australian C. pecorum isolates, 10 of them novel, with koala and sheep sharing at least one identical ST (designated ST2013Aa). ST23, previously identified in global C. pecorum livestock isolates, was observed here in a subset of Australian bovine and sheep isolates. Most notably, ST23 was found in association with multiple disease states and hosts, providing insights into the transmission of this pathogen between livestock hosts. The complexity of the epidemiology of this disease was further highlighted by the observation that at least two sheep were infected with different C. pecorum STs in the eye and the gastrointestinal tract. We have demonstrated the feasibility of our MLSA scheme for understanding the host relationships that exist between Australian C. pecorum strains and provide the first molecular epidemiological data on infections in Australian livestock hosts.
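In an MLSA/MLST scheme like the one described above, an isolate's sequence type (ST) is determined by the combination of alleles it carries at the housekeeping loci; isolates with identical allele profiles share an ST. The sketch below illustrates only this bookkeeping step in Python; the locus names, fragment sequences, and allele numbering are hypothetical placeholders and do not reproduce the published C. pecorum scheme.

```python
# Illustrative sketch of MLSA sequence-type (ST) assignment.
# Locus names and sequences are hypothetical placeholders.
from typing import Dict

LOCI = ["locusA", "locusB", "locusC", "locusD", "locusE", "locusF", "locusG"]  # hypothetical

def allele_profile(fragments: Dict[str, str],
                   allele_db: Dict[str, Dict[str, int]]) -> tuple:
    """Map each locus fragment to an allele number, registering new alleles as they appear."""
    profile = []
    for locus in LOCI:
        seq = fragments[locus].upper()
        alleles = allele_db.setdefault(locus, {})
        if seq not in alleles:
            alleles[seq] = len(alleles) + 1  # next available allele number
        profile.append(alleles[seq])
    return tuple(profile)

def sequence_type(profile: tuple, st_db: Dict[tuple, int]) -> int:
    """Isolates with identical allele profiles receive the same ST."""
    if profile not in st_db:
        st_db[profile] = len(st_db) + 1
    return st_db[profile]

# Example: two isolates with identical fragments at every locus get the same ST.
allele_db, st_db = {}, {}
isolate_a = {locus: "ATGC" * 10 for locus in LOCI}  # hypothetical 40 bp fragments
isolate_b = {locus: "ATGC" * 10 for locus in LOCI}
print(sequence_type(allele_profile(isolate_a, allele_db), st_db),
      sequence_type(allele_profile(isolate_b, allele_db), st_db))  # -> 1 1
```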
Abstract:
OBJECTIVE: To synthesise the available evidence and estimate the comparative efficacy of control strategies to prevent total hip replacement (THR)-related surgical site infections (SSIs) using a mixed treatment comparison. DESIGN: Systematic review and mixed treatment comparison. SETTING: Hospital and other healthcare settings. PARTICIPANTS: Patients undergoing THR. PRIMARY AND SECONDARY OUTCOME MEASURES: The number of THR-related SSIs occurring following the surgical operation. RESULTS: 12 studies involving 123 788 THRs and 9 infection control strategies were identified. The strategy of 'systemic antibiotics+antibiotic-impregnated cement+conventional ventilation' significantly reduced the risk of THR-related SSI compared with the referent strategy (no systemic antibiotics+plain cement+conventional ventilation), OR 0.13 (95% credible interval (CrI) 0.03-0.35), and had the highest probability (47-64%) and highest median rank of being the most effective strategy. There was some evidence to suggest that 'systemic antibiotics+antibiotic-impregnated cement+laminar airflow' could potentially increase infection risk compared with 'systemic antibiotics+antibiotic-impregnated cement+conventional ventilation', OR 1.96 (95% CrI 0.52-5.37). There was no high-quality evidence that antibiotic-impregnated cement without systemic antibiotic prophylaxis was effective in reducing infection compared with plain cement with systemic antibiotics, OR 1.28 (95% CrI 0.38-3.38). CONCLUSIONS: We found no convincing evidence in favour of the use of laminar airflow over conventional ventilation for prevention of THR-related SSIs, yet laminar airflow is costly and widely used. Antibiotic-impregnated cement without systemic antibiotics may not be effective in reducing THR-related SSIs. The combination with the highest confidence for reducing SSIs was 'systemic antibiotics+antibiotic-impregnated cement+conventional ventilation'. Our evidence synthesis underscores the need to review current guidelines based on the available evidence, and to conduct further high-quality double-blind randomised controlled trials to better inform the current clinical guidelines and practice for prevention of THR-related SSIs.
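For readers unfamiliar with the quantity being reported, the sketch below shows how a pairwise odds ratio and its confidence interval are computed from infection counts under two strategies. The counts are hypothetical, and this simple frequentist calculation only illustrates what an odds ratio measures; it is not the Bayesian mixed treatment comparison used in the review.

```python
# Hypothetical 2x2 example: infections vs. no infections under two strategies.
import math

# strategy A: systemic antibiotics + antibiotic-impregnated cement + conventional ventilation
# strategy B (referent): no systemic antibiotics + plain cement + conventional ventilation
a, b = 5, 995    # strategy A: infected, not infected (hypothetical counts)
c, d = 38, 962   # strategy B: infected, not infected (hypothetical counts)

odds_ratio = (a / b) / (c / d)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)             # standard error of log(OR)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)   # 95% CI lower bound
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)   # 95% CI upper bound
print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

An OR below 1 indicates lower odds of infection under the strategy of interest than under the referent strategy.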
Abstract:
Background: Through clinical observation, nursing staff of an inpatient rehabilitation unit identified a link between incontinence and undiagnosed urinary tract infections (UTIs). Further, clinical observation and structured continence management led to the realisation that urinary incontinence often improved, or resolved completely, after treatment with antibiotics. In 2009 a small study found that 30% of admitted rehabilitation patients had an undiagnosed UTI, with the majority admitted post-orthopaedic fracture. We suspected that the frequent use of indwelling urinary catheters (IDCs) in the orthopaedic environment may have been a contributing factor. Therefore, a second, more thorough, study was commenced in 2010 and completed in 2011. Aim: The aim of this study was to identify what proportion of patients were admitted to one rehabilitation unit with an undiagnosed UTI over a 12-month period. We wanted to identify and highlight the presence of known risk factors associated with UTI and determine whether urinary incontinence was associated with the presence of UTI. Methods: Data were collected from every patient admitted over a 12-month period (n=140). The majority of patients were over the age of 65 and had an orthopaedic fracture (36.4%) or stroke (27.1%). Mid-stream urine (MSU) samples, routinely collected and sent for culture and sensitivity as part of the standard admission procedure, were used by the treating medical officer to detect the presence of UTI. A data collection sheet was developed, reviewed and trialled before official data collection commenced. Data were collected as part of usual practice and collated by a research assistant. Inferential statistics were used to analyse the data. Results: This study found that 25 (17.9%) of the 140 patients admitted to rehabilitation had an undiagnosed UTI, with a statistically significant association between the prior presence of an IDC and the diagnosis of UTI. Urinary incontinence improved after the completion of treatment with antibiotics. Results further demonstrated a significant association between the confirmation of a UTI on culture and sensitivity and the absence of symptoms usually associated with UTI, such as burning or stinging on urination. Overall, this study suggests that careful monitoring of urinary symptoms in patients admitted to rehabilitation, especially in patients with a prior IDC, is warranted.
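The association reported above between a prior IDC and an undiagnosed UTI is the kind of relationship typically assessed with a test of association on a 2x2 table. The sketch below illustrates such a test with Fisher's exact test in scipy; the cell counts are hypothetical and do not reproduce the study's data.

```python
# Hypothetical 2x2 table: prior IDC (yes/no) vs. UTI detected on admission (yes/no).
from scipy.stats import fisher_exact

table = [[14, 36],   # prior IDC:    UTI, no UTI (hypothetical counts)
         [11, 79]]   # no prior IDC: UTI, no UTI (hypothetical counts)

odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
```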
Abstract:
Mycobacterium fortuitum is a rapidly growing mycobacterium associated with community-acquired and nosocomial wound, soft tissue, and pulmonary infections. It has been postulated that water is the source of infection, especially in the hospital setting. The aim of this study was to determine whether municipal water may be the source of community-acquired or nosocomial infections in the Brisbane area. Between 2007 and 2009, 20 strains of M. fortuitum were recovered from municipal water and 53 patient isolates were submitted to the reference laboratory. A wide variation in strain types was identified using repetitive element sequence-based PCR, with 13 clusters of ≥2 indistinguishable isolates and 28 patterns consisting of individual isolates. The clusters could be grouped into seven similar groups (>95% similarity). Municipal water and clinical isolates collected during the same time period and from the same geographical area consisted of different strain types, making municipal water an unlikely source of sporadic human infection.
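Grouping rep-PCR fingerprints at a similarity cut-off (here, >95% similarity) is commonly done by hierarchical clustering of a pairwise similarity matrix. The sketch below shows the general idea with scipy; the similarity values are hypothetical and are not derived from the study's isolates.

```python
# Hierarchical clustering of isolates from a hypothetical pairwise similarity matrix (%).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

similarity = np.array([
    [100.0,  97.0,  60.0,  55.0],
    [ 97.0, 100.0,  58.0,  54.0],
    [ 60.0,  58.0, 100.0,  96.0],
    [ 55.0,  54.0,  96.0, 100.0],
])  # hypothetical values for four isolates

distance = 100.0 - similarity                  # convert % similarity to a distance
condensed = squareform(distance)               # condensed form expected by linkage()
tree = linkage(condensed, method="average")    # UPGMA-style clustering
clusters = fcluster(tree, t=5.0, criterion="distance")  # cut at 95% similarity (distance 5)
print(clusters)  # isolates sharing a label fall in the same >95% similarity group
```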
Abstract:
Mycobacterium kansasii is a pulmonary pathogen that has been grown readily from municipal water but rarely isolated from natural waters. A definitive link between water exposure and disease has not been demonstrated, and the environmental niche for this organism is poorly understood. Strain typing of clinical isolates has revealed seven subtypes, with Type 1 being highly clonal and responsible for most infections worldwide. The prevalence of other subtypes varies geographically. In this study, 49 water isolates were compared with 72 patient isolates from the same geographical area (Brisbane, Australia), using automated repetitive unit PCR (Diversilab) and ITS RFLP. The clonality of the dominant clinical strain type was again demonstrated, but with rep-PCR, strain variation within this group was evident, comparable with that seen using other reported methods. There was significant heterogeneity among the water isolates, and very few were similar or related to the clinical isolates. This suggests that if water or aerosol transmission is the mode of infection, then point source contamination likely occurs from an alternative environmental source.
Abstract:
The catalytic action of putrescine-specific amine oxidases acting in tandem with 4-aminobutyraldehyde dehydrogenase is explored as a degradative pathway in Rhodococcus opacus. By limiting the nitrogen source, increased catalytic activity was induced, leading to a coordinated response in the oxidative deamination of putrescine to 4-aminobutyraldehyde and subsequent dehydrogenation to 4-aminobutyrate. Isolating the dehydrogenase by ion exchange chromatography and gel filtration revealed that the enzyme acts principally on linear aliphatic aldehydes possessing an amino moiety. Michaelis-Menten kinetic analysis delivered a Michaelis constant (KM = 0.014 mM) and maximum rate (Vmax = 11.2 μmol/min/mg) for the conversion of 4-aminobutyraldehyde to 4-aminobutyrate. The dehydrogenase, identified by MALDI-TOF mass spectrometric analysis (E value = 0.031, 23% coverage), belongs to a functionally related genomic cluster that includes the amine oxidase, suggesting their association in a directed cell response. Key regulatory, stress and transport encoding genes have been identified, along with candidate dehydrogenases and transaminases for the further conversion of 4-aminobutyrate to succinate. Genomic analysis has revealed highly similar metabolic gene clustering among members of the Actinobacteria, providing insight into putrescine degradation, notably among Micrococcaceae, Rhodococci and Corynebacterium, by a pathway that was previously uncharacterised in bacteria.
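Given the reported constants (KM = 0.014 mM, Vmax = 11.2 μmol/min/mg), the Michaelis-Menten equation v = Vmax·[S]/(KM + [S]) predicts the reaction velocity at any substrate concentration. The short sketch below simply evaluates that relationship; the substrate concentrations chosen are arbitrary illustration values.

```python
# Michaelis-Menten rate prediction using the constants reported for the
# 4-aminobutyraldehyde dehydrogenase (KM = 0.014 mM, Vmax = 11.2 umol/min/mg).
KM = 0.014    # mM
VMAX = 11.2   # umol/min/mg

def rate(substrate_mM: float) -> float:
    """Reaction velocity v = Vmax*[S] / (KM + [S])."""
    return VMAX * substrate_mM / (KM + substrate_mM)

for s in (0.005, 0.014, 0.05, 0.5):  # arbitrary substrate concentrations (mM)
    print(f"[S] = {s:5.3f} mM  ->  v = {rate(s):5.2f} umol/min/mg")
# At [S] = KM the predicted rate is Vmax/2, as expected.
```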
Abstract:
The impact of acid rock drainage (ARD) and eutrophication on microbial communities in stream sediments above and below an abandoned mine site in the Adelaide Hills, South Australia, was quantified by PLFA analysis. Multivariate analysis of water quality parameters, including anions, soluble heavy metals, pH, and conductivity, as well as total extractable metal concentrations in sediments, produced clustering of sample sites into three distinct groups. These groups corresponded with levels of nutrient enrichment and/or concentrations of pollutants associated with ARD. Total PLFA concentration, which is indicative of microbial biomass, was reduced by >70% at sites along the stream between the mine site and as far as 18 km downstream. Further downstream, however, recovery of microbial abundance was apparent, possibly reflecting a dilution effect from downstream tributaries. Total PLFA was >40% higher at, and immediately below, the mine site (0-0.1 km) compared with sites further downstream (2.5-18 km), even after accounting for differences in the specific surface area of different sediment samples. The increased microbial population in the proximity of the mine source may be associated with the presence of a thriving iron-oxidizing bacterial community as a consequence of optimal conditions for these organisms, while the lower microbial population further downstream corresponded with greater sediment metal concentrations. PCA of relative abundances revealed a number of PLFAs that were most influential in discriminating between ARD-polluted sites and the rest of the sites. These PLFAs included the hydroxy fatty acids 2OH12:0, 3OH12:0, and 2OH16:0; the fungal marker 18:2ω6; the sulfate-reducing bacteria marker 10Me16:1ω7; and the saturated fatty acids 12:0, 16:0, and 18:0. Partial constrained ordination revealed that the environmental parameters with the greatest bearing on the PLFA profiles included pH, soluble aluminum, total extractable iron, and zinc. The study demonstrated the successful application of PLFA analysis to rapidly assess the toxicity of ARD-affected waters and sediments and to differentiate this response from the effects of other pollutants, such as increased nutrients and salinity.
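The PCA step described above ordinates samples by their relative PLFA abundances so that the fatty acids driving separation between ARD-polluted and unpolluted sites can be read from the component loadings. A minimal sketch of that kind of analysis with scikit-learn is shown below; the data matrix is entirely hypothetical.

```python
# PCA of hypothetical relative-abundance PLFA profiles (rows = sites, columns = PLFAs).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

plfa_names = ["2OH12:0", "3OH12:0", "18:2w6", "10Me16:1w7", "16:0", "18:0"]
profiles = np.array([
    [0.02, 0.01, 0.10, 0.01, 0.40, 0.20],   # hypothetical unpolluted site
    [0.03, 0.02, 0.12, 0.02, 0.38, 0.18],   # hypothetical unpolluted site
    [0.10, 0.08, 0.03, 0.06, 0.30, 0.25],   # hypothetical ARD-affected site
    [0.12, 0.09, 0.02, 0.07, 0.28, 0.26],   # hypothetical ARD-affected site
])

scaled = StandardScaler().fit_transform(profiles)   # centre and scale each PLFA
pca = PCA(n_components=2)
scores = pca.fit_transform(scaled)                  # site ordination (PC scores)
loadings = pca.components_                          # PLFA contributions to each axis

print("explained variance:", pca.explained_variance_ratio_)
for name, load in zip(plfa_names, loadings[0]):
    print(f"{name:>10}  PC1 loading = {load:+.2f}")
```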
Abstract:
One DDT-contaminated soil and two uncontaminated soils were used to enumerate DDT-resistant microbes (bacteria, actinomycetes and fungi) using soil dilution agar plates in media either with 150 μg DDT ml⁻¹ or without DDT at different temperatures (25, 37 and 55°C). Microbial populations in this study were significantly (p<0.001) affected by DDT in the growth medium. However, the numbers of microbes in the long-term contaminated and uncontaminated soils were similar, presumably indicating that DDT-resistant microbes had developed over long-term exposure. The tolerance of isolated soil microbes to DDT varied in the order fungi>actinomycetes>bacteria. Bacteria from the contaminated soil were more resistant to DDT than bacteria from the uncontaminated soils. Microbes isolated at different temperatures also demonstrated varying degrees of DDT resistance. For example, bacteria and actinomycetes isolated at all incubation temperatures were sensitive to DDT. Conversely, fungi isolated at all temperatures were unaffected by DDT.
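Enumeration by soil dilution plating, as used above, converts a plate colony count back to colony-forming units per gram of soil by scaling for the dilution, the volume plated, and the initial soil suspension. A brief illustration follows; all numbers and the helper function are hypothetical and only show the standard arithmetic.

```python
# Colony-forming units (CFU) per gram of soil from a dilution plate count.
# All values below are hypothetical illustration numbers.

def cfu_per_gram(colonies: int, dilution_factor: float, volume_plated_ml: float,
                 suspension_ml: float, soil_mass_g: float) -> float:
    """CFU/g = (colonies * dilution factor / volume plated) * suspension volume / soil mass."""
    cfu_per_ml = colonies * dilution_factor / volume_plated_ml  # CFU per ml of initial suspension
    return cfu_per_ml * suspension_ml / soil_mass_g

# e.g. 42 colonies on a plate spread with 0.1 ml of the 10^-4 dilution,
# starting from 1 g of soil suspended in 10 ml.
print(f"{cfu_per_gram(42, 1e4, 0.1, 10.0, 1.0):.2e} CFU/g")
```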
Abstract:
The cotton strip assay (CSA) is an established technique for measuring soil microbial activity. The technique involves burying cotton strips and measuring their tensile strength after a certain time. This gives a measure of the rotting rate, R, of the cotton strips; R is then a measure of soil microbial activity. This paper examines properties of the technique and indicates how the assay can be optimised. Humidity conditioning of the cotton strips before measuring their tensile strength reduced the within- and between-day variance and enabled the distribution of the tensile strength measurements to approximate normality. The test data came from a three-way factorial experiment (two soils, two temperatures, three moisture levels). The cotton strips were buried in the soil for intervals of time ranging up to 6 weeks. This enabled the rate of loss of cotton tensile strength with time to be studied under a range of conditions. An inverse cubic model accounted for greater than 90% of the total variation within each treatment combination. This offers support for summarising the decomposition process by a single parameter, R. The approximate variance of the decomposition rate was estimated from a function incorporating the variance of tensile strength and the differential of the function for the rate of decomposition, R, with respect to tensile strength. This variance function has a minimum when the measured strength is approximately 2/3 that of the original strength. The estimates of R are almost unbiased and relatively robust against the cotton strips being left in the soil for more or less than the optimal time. We conclude that the rotting rate R should be measured using the inverse cubic equation, and that the cotton strips should be left in the soil until their strength has been reduced to about 2/3 of the original.
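The variance approximation described above is the standard first-order (delta-method) result: if the rotting rate is some function R = f(S) of the measured tensile strength S, then, to first order (a general sketch; the specific inverse cubic form of f is not reproduced here):

```latex
% First-order (delta-method) variance approximation for R = f(S)
\operatorname{Var}(R) \;\approx\; \left( \frac{\mathrm{d}f}{\mathrm{d}S} \right)^{2} \operatorname{Var}(S)
```

As the abstract notes, this variance expression is minimised when the measured strength has fallen to roughly two-thirds of the original strength, which is what motivates the recommended retrieval point for the strips.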