950 results for infection rate
Abstract:
There are approximately 92 million new chlamydial infections of the genital tract in humans diagnosed each year, costing health care systems billions of dollars in treatment not only of acute infections, but also of associated inflammatory sequelae, such as pelvic inflammatory disease (PID) and ectopic pregnancy. These numbers are increasing at a steady rate and, due to the asymptomatic nature of infections, the incidence may be underestimated and the costs of treatment therefore higher. Over the previous few decades there has been a large amount of research into the development of an efficacious vaccine against genital tract chlamydial infections. The majority of this research has focused on females, due to the high rate of development of associated diseases, including PID, which can lead to ectopic pregnancy and infertility. In light of the increasing infection rates that have occurred despite the availability of antibiotics, and the asymptomatic nature of chlamydial infections, it is imperative that an efficacious vaccine that protects against infection and associated pathology be developed.
Abstract:
Background: Room ventilation is a key determinant of airborne disease transmission. Despite this, ventilation guidelines in hospitals are not founded on robust scientific evidence related to prevention of airborne transmission. Methods: We sought to assess the effect of ventilation rates on influenza, tuberculosis (TB) and rhinovirus infection risk within three distinct rooms in a major urban hospital: a Lung Function Laboratory, an Emergency Department (ED) Negative-pressure Isolation Room and an Outpatient Consultation Room. Air exchange rate measurements were performed in each room using CO2 as a tracer. Gammaitoni and Nucci’s model was employed to estimate infection risk. Results: Current outdoor air exchange rates in the Lung Function Laboratory and ED Isolation Room limited infection risks to between 0.1 and 3.6%. Influenza risk for individuals entering an Outpatient Consultation Room after an infectious individual departed ranged from 3.6 to 20.7%, depending on the duration for which each person occupied the room. Conclusions: Given the absence of definitive ventilation guidelines for hospitals, air exchange measurements combined with modelling afford a useful means of assessing, on a case-by-case basis, the suitability of room ventilation for preventing airborne disease transmission.
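For readers unfamiliar with the risk model cited above, the sketch below implements a non-steady-state Wells-Riley-type calculation of the Gammaitoni-Nucci form, in which inhaled "quanta" of infection accumulate while an infector shares a ventilated room with a susceptible occupant. All parameter values (quanta emission rate, breathing rate, room volume, air exchange rates) are illustrative assumptions, not the study's measured inputs.

```python
import numpy as np

def gammaitoni_nucci_risk(n_infectors, quanta_per_hour, breathing_rate_m3h,
                          room_volume_m3, air_changes_per_hour, exposure_hours):
    """Non-steady-state Wells-Riley-type infection risk (Gammaitoni-Nucci form).

    Assumes the infector and the susceptible share the room from t = 0 with no
    quanta initially in the air; all inputs are illustrative assumptions.
    """
    N, t = air_changes_per_hour, exposure_hours
    # Time-integrated airborne quanta concentration as it builds towards
    # its steady state I*q / (N*V).
    integrated_conc = (n_infectors * quanta_per_hour / (N * room_volume_m3)) \
                      * (t - (1.0 - np.exp(-N * t)) / N)
    dose = breathing_rate_m3h * integrated_conc  # quanta inhaled by the susceptible
    return 1.0 - np.exp(-dose)                   # Poisson probability of infection

# Illustrative comparison: one infector in a 40 m^3 consultation room for one
# hour at 2 vs 6 outdoor air changes per hour (all values assumed, not measured).
for ach in (2, 6):
    risk = gammaitoni_nucci_risk(1, quanta_per_hour=15, breathing_rate_m3h=0.5,
                                 room_volume_m3=40, air_changes_per_hour=ach,
                                 exposure_hours=1.0)
    print(f"{ach} ACH: {100 * risk:.1f}% infection risk")
```

Raising the air exchange rate lowers both the build-up and the steady-state quanta concentration, which is how measured exchange rates in each room translate into risk estimates of the kind quoted in the abstract.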
Abstract:
Objective: To examine the risk factors for Mycobacterium tuberculosis infection (MTI) among Greenlandic children for the purpose of identifying those at highest risk of infection. Methods: Between 2005 and 2007, 1797 Greenlandic schoolchildren in five different areas were tested for MTI with an interferon gamma release assay (IGRA) and a tuberculin skin test (TST). Parents or guardians were surveyed using a standardized self-administered questionnaire to obtain data on crowding in the household, parents’ educational level and the child’s health status. Demographic data for each child – i.e. parents’ place of birth, number of siblings, age gap to the next younger and next older sibling, birth order and mother’s age when the child was born – were also extracted from a public registry. Logistic regression was used to test for associations between these variables and MTI, and all results were expressed as odds ratios (ORs) and 95% confidence intervals (CIs). Children were considered to have MTI if they tested positive on both the IGRA assay and the TST. Findings: The overall prevalence of MTI was 8.5% (152/1797). MTI was diagnosed in 26.7% of the children with a known TB contact, as opposed to 6.4% of the children without such contact. Overall, the MTI rate was higher among Inuit children (OR: 4.22; 95% CI: 1.55–11.5) and among children born less than one year after the birth of the next older sibling (OR: 2.48; 95% CI: 1.33–4.63). Self-reported TB contact modified the profile to include household crowding and low maternal education. Children who had an older MTI-positive sibling were much more likely to test positive for MTI themselves (OR: 14.2; 95% CI: 5.75–35.0) than children without an infected older sibling. Conclusion: Ethnicity, sibling relations, number of household residents and maternal level of education are factors associated with the risk of TB infection among children in Greenland. The strong household clustering of MTI suggests that family sources of exposure are important.
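The abstract reports its associations as odds ratios with 95% confidence intervals from logistic regression. The sketch below shows one common way to obtain such estimates; the data frame, variable names and simulated values are hypothetical stand-ins, not the study's actual dataset or coding.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500

# Hypothetical child-level data frame; variable names and values are
# illustrative stand-ins, not the study's actual dataset or coding.
df = pd.DataFrame({
    "mti":              rng.binomial(1, 0.085, n),  # MTI outcome (IGRA+ and TST+)
    "inuit":            rng.binomial(1, 0.8, n),
    "sibling_gap_lt1y": rng.binomial(1, 0.2, n),    # next older sibling born < 1 year earlier
    "household_size":   rng.poisson(4, n),
})

X = sm.add_constant(df[["inuit", "sibling_gap_lt1y", "household_size"]])
fit = sm.Logit(df["mti"], X).fit(disp=0)

# Exponentiated coefficients give odds ratios with 95% confidence intervals,
# the form of result reported in the abstract.
ci = fit.conf_int()
print(pd.DataFrame({"OR": np.exp(fit.params),
                    "CI_low": np.exp(ci[0]),
                    "CI_high": np.exp(ci[1])}).round(2))
```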
Abstract:
The 'open window' theory is characterised by short-term suppression of the immune system following an acute bout of endurance exercise. This window of opportunity may allow for an increase in susceptibility to upper respiratory illness (URI). Many studies have indicated a decrease in immune function in response to exercise. However, many studies do not track immune function beyond 2 hours after the completion of exercise, and consequently fail to determine whether immune cell numbers, or more importantly their function, return to resting levels before the start of another bout of exercise. Ten male 'A' grade cyclists (age 24.2 +/- 5.3 years; body mass 73.8 +/- 6.5 kg; VO(2peak) 65.9 +/- 7.1 mL.kg(-1).min(-1)) exercised for two hours at 90% of their second ventilatory threshold. Blood samples were collected pre-, immediately post-, 2 hours, 4 hours, 6 hours, 8 hours, and 24 hours post-exercise. Immune variables examined included total leukocyte counts, neutrophil function (oxidative burst and phagocytic function), lymphocyte subset counts (CD4(+), CD8(+), and CD16(+)/56(+)), natural killer cell activity (NKCA), and NK phenotypes (CD56(dim)CD16(+) and CD56(bright)CD16(-)). There was a significant increase in total lymphocyte numbers from pre- to immediately post-exercise (p<0.01), followed by a significant decrease at 2 hours post-exercise (p<0.001). CD4(+) T-cell counts significantly increased from pre-exercise to 4 hours post-exercise (p<0.05) and 6 hours post-exercise (p<0.01). However, NK (CD16(+)/56(+)) cell numbers decreased significantly from pre-exercise to 4 hours post-exercise (p<0.05), 6 hours post-exercise (p<0.05), and 8 hours post-exercise (p<0.01). In contrast, CD56(bright)CD16(-) NK cell counts significantly increased from pre-exercise to immediately post-exercise (p<0.01). Neutrophil oxidative burst activity did not significantly change in response to exercise, while neutrophil cell counts significantly increased from pre-exercise to immediately post-exercise (p<0.05) and 2 hours post-exercise (p<0.01), and remained significantly above pre-exercise levels to 8 hours post-exercise (p<0.01). Neutrophil phagocytic function significantly decreased from 2 hours post-exercise to 6 hours post-exercise (p<0.05) and 24 hours post-exercise (p<0.05). Finally, eosinophil cell counts significantly increased from 2 hours post-exercise to 6 hours post-exercise (p<0.05) and 8 hours post-exercise (p<0.05). This is the first study to show changes in immunological variables up to 8 hours post-exercise, including significant NK cell suppression, NK cell phenotype changes, a significant increase in total lymphocyte counts, and a significant increase in eosinophil cell counts, all at 8 hours post-exercise. Suppression of total lymphocyte counts, NK cell counts and neutrophil phagocytic function following exercise may be important in the increased rate of URI in response to regular intense endurance training.
Abstract:
Infection control practitioners (ICPs) work across the full spectrum of health care settings and carry out a broad range of practice activities. Whilst several studies have reported on the role of the ICP, there has been little investigation of the scope of infection control practice. This knowledge is essential to inform the professional, legal, educational and financial implications of this specialist role. One hundred and thirteen ICPs from a range of health care settings across Queensland were surveyed. Respondents were asked to rate the extent to which they were and should be engaging in the range of practices identified by Gardner, Jones & Olesen (1999). Significant differences were evident between what ICPs said was their actual practice versus what they thought they should be doing. Overall, the respondents consistently reported that they should be engaging in more of the range of infection control activities than they were, particularly with regard to management practices. A number of differences were found according to the context in which the practitioners worked, such as the type and size of facility and their employment status. The results of this study indicate that the scope of infection control practice has clearly moved beyond those practices that are confined by the hospital wall and defined by surveillance activities.
Abstract:
Objectives: To report the quarterly incidence of hospital-identified Clostridium difficile infection (HI-CDI) in Australia, and to estimate the burden ascribed to hospital-associated (HA) and community-associated (CA) infections. Design, setting and patients: Prospective surveillance of all cases of CDI diagnosed in hospital patients from 1 January 2011 to 31 December 2012 in 450 public hospitals in all Australian states and the Australian Capital Territory. All patients admitted to inpatient wards or units in acute public hospitals, including psychiatry, rehabilitation and aged care, were included, as well as those attending emergency departments and outpatient clinics. Main outcome measures: Incidence of HI-CDI (primary outcome); proportion and incidence of HA-CDI and CA-CDI (secondary outcomes). Results: The annual incidence of HI-CDI increased from 3.25/10 000 patient-days (PD) in 2011 to 4.03/10 000 PD in 2012. Poisson regression modelling demonstrated a 29% increase (95% CI, 25% to 34%) per quarter between April and December 2011, with a peak of 4.49/10 000 PD in the October–December quarter. The incidence plateaued in January–March 2012 and then declined by 8% (95% CI, −11% to −5%) per quarter to 3.76/10 000 PD in July–September 2012, after which the rate rose again by 11% (95% CI, 4% to 19%) per quarter to 4.09/10 000 PD in October–December 2012. Trends were similar for HA-CDI and CA-CDI. A subgroup analysis determined that 26% of cases were CA-CDI. Conclusions: A significant increase in both HA-CDI and CA-CDI identified through hospital surveillance occurred in Australia during 2011–2012. Studies are required to further characterise the epidemiology of CDI in Australia.
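The quarterly trends above come from Poisson regression of case counts on time with patient-days as the exposure. A minimal sketch of that style of model is shown below, using made-up quarterly counts rather than the surveillance data; the exponentiated slope is the per-quarter rate ratio analogous to the percentage changes quoted in the abstract.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical quarterly surveillance counts (cases and patient-days); the
# numbers below are made up purely to illustrate the modelling approach.
quarters = pd.DataFrame({
    "quarter_index": np.arange(8),                    # 2011 Q1 .. 2012 Q4
    "cases":         [980, 1100, 1280, 1400, 1390, 1300, 1210, 1320],
    "patient_days":  [3.0e6] * 8,
})

# Poisson regression of case counts with log(patient-days) as an offset, so
# the exponentiated slope is the multiplicative change in incidence per quarter.
model = sm.GLM(quarters["cases"],
               sm.add_constant(quarters["quarter_index"]),
               family=sm.families.Poisson(),
               offset=np.log(quarters["patient_days"]))
fit = model.fit()

rate_ratio = np.exp(fit.params["quarter_index"])
ci = np.exp(fit.conf_int().loc["quarter_index"])
print(f"Change per quarter: {100 * (rate_ratio - 1):+.1f}% "
      f"(95% CI {100 * (ci[0] - 1):+.1f}% to {100 * (ci[1] - 1):+.1f}%)")
```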
Abstract:
Introduction: Risk factor analyses for nosocomial infections (NIs) are complex. First, due to competing events for NI, the association between risk factors and NI as measured using hazard rates may not coincide with the association measured using cumulative probability (risk). Second, patients from the same intensive care unit (ICU) who share the same environmental exposure are likely to be more similar with regard to risk factors predisposing to an NI than patients from different ICUs. We aimed to develop an analytical approach to account for both features and to use it to evaluate associations between patient- and ICU-level characteristics with both rates of NI and competing risks and with the cumulative probability of infection. Methods: We considered a multicenter database of 159 intensive care units containing 109,216 admissions (813,739 admission-days) from the Spanish HELICS-ENVIN ICU network. We analyzed the data using two models: an etiologic model (rate based) and a predictive model (risk based). In both models, random effects (shared frailties) were introduced to assess heterogeneity. Death and discharge without NI were treated as competing events for NI. Results: There was a large heterogeneity across ICUs in NI hazard rates, which remained after accounting for multilevel risk factors, meaning that there remain unobserved ICU-specific factors that influence NI occurrence. Heterogeneity across ICUs in terms of cumulative probability of NI was even more pronounced. Several risk factors had markedly different associations in the rate-based and risk-based models. For some, the associations differed in magnitude. For example, high Acute Physiology and Chronic Health Evaluation II (APACHE II) scores were associated with modest increases in the rate of nosocomial bacteremia, but large increases in the risk. Others differed in sign; for example, respiratory (vs cardiovascular) diagnostic categories were associated with a reduced rate of nosocomial bacteremia, but an increased risk. Conclusions: A combination of competing risks and multilevel models is required to understand direct and indirect risk factors for NI and distinguish patient-level from ICU-level factors.
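The distinction drawn above between rate-based (etiologic) and risk-based (predictive) associations can be made concrete with cause-specific hazard models and cumulative incidence estimation under competing risks. The sketch below assumes a hypothetical admission-level dataset and uses the lifelines package as one possible tool; the ICU-level shared frailty (random effect) used in the paper is omitted here and would require a frailty-capable survival package.

```python
import pandas as pd
from lifelines import AalenJohansenFitter, CoxPHFitter

# Hypothetical admission-level data: days until first event and event type
# (1 = nosocomial infection, 2 = death/discharge without NI, 0 = censored).
# Values and the single covariate are illustrative only.
df = pd.DataFrame({
    "days":      [3, 7, 12, 5, 20, 9, 15, 4, 11, 6],
    "event":     [1, 2, 1, 2, 0, 2, 1, 2, 1, 2],
    "apache_ii": [24, 12, 30, 26, 18, 15, 14, 9, 22, 20],
})

# Etiologic (rate-based) view: cause-specific Cox model for the NI hazard,
# with the competing event treated as censoring.
cox_df = df.assign(ni_event=(df["event"] == 1).astype(int))
cph = CoxPHFitter().fit(cox_df[["days", "ni_event", "apache_ii"]],
                        duration_col="days", event_col="ni_event")
cph.print_summary()

# Predictive (risk-based) view: cumulative incidence of NI in the presence of
# the competing event, via the Aalen-Johansen estimator.
ajf = AalenJohansenFitter().fit(df["days"], df["event"], event_of_interest=1)
print(ajf.cumulative_density_.tail())
```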
Abstract:
In preparation for the introduction of human papillomavirus (HPV) vaccine, we investigated awareness and knowledge of HPV/HPV vaccine and potential acceptability of HPV vaccine among mothers with a teenage daughter in Weihai, Shandong, China. A cross-sectional survey was conducted in 2013 with a sample of 1850 mothers who had a daughter (aged 9–17 years) attending primary, junior and senior high schools. In the final sample (N = 1578, response rate 85.30%), awareness of HPV was reported by 305 (19.32%) mothers. Awareness varied significantly by daughter’s age (P<0.01), mother’s education level (P<0.01), mother’s occupation (P<0.01), household income (P<0.01) and residence type (P<0.01). Knowledge about HPV/HPV vaccine was poor, with a mean total score of 3.56 (SD = 2.40) out of a possible score of 13. Mothers with a higher education level reported higher levels of knowledge (P = 0.02). Slightly more than one-fourth (26.49%) of mothers expressed their potential acceptability of HPV vaccine for their daughters. Acceptability increased with daughters’ age (P<0.01), household income (P<0.01) and knowledge level (P<0.01). Housewives and unemployed mothers had the highest acceptability (P<0.01). The most common reasons for not accepting HPV vaccination were “My daughter is too young to be at risk of cervical cancer” (30.95%), “The vaccine has not been widely used, and the decision will be made after it is widely used” (24.91%) and “Worry about the safety of the vaccine” (22.85%). Awareness and knowledge of HPV/HPV vaccines are poor and HPV vaccine acceptability is low among these Chinese mothers. These results may help inform appropriate health education programs in this population.
Abstract:
Rabbit haemorrhagic disease is a major tool for the management of introduced, wild rabbits in Australia. However, new evidence suggests that rabbits may be developing resistance to the disease. Rabbits sourced from wild populations in central and southeastern Australia, and domestic rabbits for comparison, were experimentally challenged with a low oral dose (60 ID50) of commercially available Czech CAPM 351 virus, the original strain released in Australia. Levels of resistance to infection in wild populations were generally higher than in unselected domestic rabbits and also differed (0–73% infection rates) between populations. Resistance was lower in populations from cooler, wetter regions and also low in arid regions, with the highest resistance seen in zones of moderate rainfall. These findings suggest that external influences, such as non-pathogenic calicivirus in cooler, wetter areas and poor recruitment in arid populations, may affect the rate at which resistance develops in Australia.
Abstract:
Q fever is a vaccine-preventable disease; despite this, high annual notification numbers are still recorded in Australia. We have previously shown that seroprevalence in Queensland metropolitan regions is approaching that of rural areas. This study investigated the presence of nucleic acid from Coxiella burnetii, the agent responsible for Q fever, in a number of animal and environmental samples collected throughout Queensland, to identify potential sources of human infection. Samples were collected from 129 geographical locations and included urine, faeces and whole blood from 22 different animal species; 45 ticks were removed from two species, canines and possums; 151 soil samples; 72 atmospheric dust samples collected from two locations and 50 dust swabs collected from domestic vacuum cleaners. PCR testing was performed targeting the IS1111 and COM1 genes for the specific detection of C. burnetii DNA. There were 85 detections from 1318 animal samples, giving a detection rate for each sample type ranging from 2.1 to 6.8%. Equine samples produced a detection rate of 11.9%, whilst feline and canine samples showed detection rates of 7.8% and 5.2%, respectively. Native animals had varying detection rates: pooled urines from flying foxes had 7.8%, whilst koalas had 5.1%, and 6.7% of ticks screened were positive. The soil and dust samples showed the presence of C. burnetii DNA at rates ranging from 2.0 to 6.9%. These data show that specimens from a variety of animal species and the general environment provide a number of potential sources for C. burnetii infections of humans living in Queensland. These previously unrecognized sources may account for the high seroprevalence rates seen in putative low-risk communities, including Q fever patients with no direct animal contact and those subjects living in a low-risk urban environment.
Abstract:
The object of this study is the tailless, internal membrane-containing bacteriophage PRD1. It has a dsDNA genome with covalently bound terminal proteins required for replication. The uniqueness of the structure makes this phage a desirable object of research. PRD1 has been studied for some 30 years, during which time a great deal of information has accumulated on its structure and life-cycle. The two least characterised steps of the PRD1 life-cycle, genome packaging and virus release, are investigated here. PRD1 shares the main principles of virion assembly (DNA packaging in particular) and host cell lysis with other dsDNA bacteriophages. However, this phage has some fascinating individual peculiarities, such as DNA packaging into a membrane vesicle inside the capsid, absence of apparent portal protein, holin inhibitor and procapsid expansion. In the course of this study we have identified the components of the DNA packaging vertex of the capsid, and determined the function of protein P6 in packaging. We managed to purify the procapsids for an in vitro packaging system, optimise the reaction and significantly increase its efficiency. We developed a new method to determine DNA translocation and were able to quantify the efficiency and the rate of packaging. A model for PRD1 DNA packaging was also proposed. Another part of this study covers the lysis of the host cell. Like other dsDNA bacteriophages, PRD1 has been proposed to utilise a two-component lysis system. The existence of this lysis system in PRD1 has been proven by experiments using recombinant proteins, and the multi-step nature of the lysis process has been established.
Abstract:
The purpose of the present study was to examine the outcome of pregnancies among HIV-infected women in Helsinki, use of the levonorgestrel-releasing intrauterine system (LNG-IUS) among HIV-infected women and the prevalence and risk factors of cytological and histologically proven cervical lesions in this population. Between 1993 and 2003, a total of 45 HIV-infected women delivered 52 singleton infants. HIV infection was diagnosed during pregnancy in 40% of the mothers. Seventeen of the mothers received antiretroviral (ARV) medication prior to pregnancy, and in 34 cases the medication was started during pregnancy. A good virological response (i.e. HIV RNA load <1000/mL during the last trimester) to ARV medication was achieved in 36/40 (90%) of the patients in whom HIV viral load measurements were performed. Of the infants, 92% were born at term, and their mean (±SD) birth weight was 3350±395 g. The Caesarean section rate was low (25%). All newborns received ARV medication and none of the infants born to mothers with pre-delivery diagnosis of maternal HIV infection were infected. The safety and advantages of the LNG-IUS were studied prospectively (n=12) and retrospectively (n=6). The LNG-IUS was well tolerated and no cases of PID or pregnancy were noted. Menstrual bleeding was reduced significantly during use of the LNG-IUS; this was associated with a slight increase in haemoglobin levels. Serum oestradiol concentrations remained in the follicular range in all subjects. The key finding was that genital shedding of HIV RNA did not change after the insertion of the LNG-IUS. The mean annual prevalence of low-grade squamous intraepithelial lesions (SIL) was 15% and that of high-grade SIL was 5% among 108 systematically followed HIV-infected women during 1989–2003. A reduced CD4 lymphocyte count was associated with an increased prevalence of SIL, whereas duration of HIV infection, use of ARV medication and HIV viral load were not. The cumulative risk of any type of SIL was 17% after one year and 48% after five years among patients with initially normal Pap smears. The risk of developing SIL was associated with young age and a high initial HIV viral load. During the follow-up, 51 of 153 subjects displayed cervical intraepithelial neoplasia (CIN) (16% CIN 1 and 18% CIN 2-3). Only one case of cancer of the uterine cervix was detected. Pap smears were reliable in screening for CIN. Both nulliparity (p<0.01) and bacterial vaginosis (p<0.04) emerged as significant risk factors for CIN. In conclusion, a combination of universal antenatal screening and multidisciplinary management allows individualized treatment and prevents vertical transmission of HIV. Use of the LNG-IUS is safe among HIV-infected women and cervicovaginal shedding of HIV RNA is not affected by use of the LNG-IUS. The risk of cervical pre-malignant lesions is high among HIV-infected women despite systematic follow-up.
Abstract:
Objective: To identify key stakeholder preferences and priorities when considering a national healthcare-associated infection (HAI) surveillance programme through the use of a discrete choice experiment (DCE). Setting: Australia does not have a national HAI surveillance programme. An online web-based DCE was developed and made available to participants in Australia. Participants: A sample of 184 purposively selected healthcare workers based on their senior leadership role in infection prevention in Australia. Primary and secondary outcomes: A DCE requiring respondents to select 1 HAI surveillance programme over another based on 5 different characteristics (or attributes) in repeated hypothetical scenarios. Data were analysed using a mixed logit model to evaluate preferences and identify the relative importance of each attribute. Results: A total of 122 participants completed the survey (response rate 66%) over a 5-week period. After excluding 22 respondents who gave inconsistent answers to a duplicate choice scenario, analysis was conducted on 100 responses. The key findings included: 72% of stakeholders exhibited a preference for a surveillance programme with continuous mandatory core components (mean coefficient 0.640 (p<0.01)), 65% for a standard surveillance protocol where patient-level data are collected on infected and non-infected patients (mean coefficient 0.641 (p<0.01)), and 92% for hospital-level data that are publicly reported on a website and not associated with financial penalties (mean coefficient 1.663 (p<0.01)). Conclusions: The use of the DCE has provided a unique insight into key stakeholder priorities when considering a national HAI surveillance programme. The application of a DCE offers a meaningful method to explore and quantify preferences in this setting.
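For context on how DCE responses of this kind are analysed, the sketch below simulates paired choice tasks and fits a conditional (McFadden) logit, which with two alternatives per task reduces to a binary logit on attribute differences with no intercept. The attribute names, coding and "true" coefficients are illustrative only, and the sketch deliberately omits the random coefficients that distinguish the mixed logit model used in the study.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_tasks = 600   # hypothetical respondents x choice scenarios

# Differences in attribute levels between the two hypothetical programmes shown
# in each task (+1 = only programme A has the feature, -1 = only programme B).
# Attribute names and the 'true' weights are illustrative, loosely echoing the
# abstract's reported coefficients.
X_diff = pd.DataFrame({
    "mandatory_core":   rng.integers(-1, 2, n_tasks),
    "patient_level":    rng.integers(-1, 2, n_tasks),
    "public_reporting": rng.integers(-1, 2, n_tasks),
})
true_beta = np.array([0.64, 0.64, 1.66])
p_choose_A = 1.0 / (1.0 + np.exp(-(X_diff.values @ true_beta)))
chose_A = (rng.random(n_tasks) < p_choose_A).astype(int)

# With two alternatives per task, the conditional (McFadden) logit is equivalent
# to a binary logit on attribute differences with no intercept.
fit = sm.Logit(chose_A, X_diff).fit(disp=0)
print(fit.params.round(2))   # estimated preference weights per attribute
```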
Abstract:
Most women acquire genital high-risk human papillomavirus (HPV) infection during their lifetime, but only seldom does the infection persist and lead to cervical cancer. However, it is currently not possible to identify the women who will develop HPV-mediated cervical cancer, which often results in large-scale follow-up and overtreatment of infections that would likely regress spontaneously. Thus, it is important to obtain more information on the course of HPV infection and to find markers that could help to identify HPV-infected women at risk of progression of cervical lesions and ultimately cancer. Nitric oxide is a free radical gas that takes part in both immune responses and carcinogenesis. Nitric oxide is also produced by cervical cells, and it is therefore possible that cervical nitric oxide affects HPV infection as well. In the present study, which included 801 women from the University of Helsinki between 2006 and 2011, the association between HPV and cervical nitric oxide was evaluated. The levels of nitric oxide were measured as its metabolites nitrate and nitrite (NOx) by spectrophotometry, and the expression of the nitric oxide-producing enzymes endothelial and inducible nitric oxide synthases (eNOS, iNOS) by Western blotting. Women infected with HPV had two-fold higher cervical fluid NOx levels compared with non-infected women. The expression levels of both eNOS and iNOS were higher in HPV-infected women than in non-infected women. Infection with Chlamydia trachomatis, another sexually transmitted pathogen and an independent risk factor for cervical cancer, was also accompanied by elevated NOx levels, whereas the vaginal infections bacterial vaginosis and candida did not have any effect on NOx levels. The significance of the elevated HPV-related cervical nitric oxide was evaluated in a 12-month follow-up study. High baseline cervical fluid NOx levels favored HPV persistence with an OR of 4.1. However, low sensitivity (33%) and a high false negative rate (67%) restrict the clinical use of the current NOx test. This study indicated that nitric oxide favors HPV persistence and thus appears to be one of the cofactors associated with a risk of carcinogenesis.
Abstract:
Ecoepidemiology is a well-developed branch of theoretical ecology, which explores the interplay between trophic interactions and disease spread. In most ecoepidemiological models, however, the authors assume the predator to be a specialist, which consumes only a single prey species. In the few existing papers in which the predator was assumed to be a generalist, the alternative food supply was always considered to be constant. This is obviously a simplification of reality, since predators can often choose between a number of different prey. Consumption of these alternative prey can dramatically change their densities and strongly influence the model predictions. In this paper, we try to bridge the gap and explore a generic ecoepidemiological system with a generalist predator, where the densities of all prey are dynamical variables. The model consists of two prey species, one of which is subject to an infectious disease, and a predator, which consumes both prey species. We investigate two main scenarios of infection transmission mode: (i) the disease transmission rate is predator independent and (ii) the transmission rate is a function of predator density. For both scenarios we carry out an extensive bifurcation analysis. We show that including a second dynamical prey in the system can drastically change the dynamics of the single-prey case. In particular, the presence of a second prey impedes disease spread by decreasing the basic reproduction number and can result in a substantial drop in disease prevalence. We demonstrate that with efficient consumption of the second prey species by the predator, predator-dependent disease transmission cannot destabilize the interactions, as it does in the case of a specialist predator. Interestingly, even if the population of the second prey eventually vanishes and only one prey species finally remains, the system with two prey species may exhibit properties different from those of the single-prey system.
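As a concrete illustration of the kind of system described, the sketch below numerically integrates a two-prey, one-predator ecoepidemiological model with logistic prey growth, mass-action transmission within the first prey and linear functional responses. The functional forms and parameter values are illustrative assumptions rather than the paper's exact model; scenario (ii) would replace the constant transmission coefficient with a function of predator density.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative two-prey, one-predator ecoepidemiological model: prey 1 is split
# into susceptible (S) and infected (I) classes, prey 2 (U) is disease-free, and
# a generalist predator (P) consumes all three. All values below are assumed.
r1, r2, K1, K2 = 1.0, 0.8, 10.0, 8.0   # prey growth rates and carrying capacities
beta0 = 0.3                            # transmission coefficient (scenario i)
a1, a2, a3 = 0.1, 0.15, 0.08           # predation rates on S, I and U
e, d, mu = 0.3, 0.4, 0.2               # conversion efficiency, predator death, disease-induced mortality

def rhs(t, y):
    S, I, U, P = y
    beta = beta0   # scenario (ii) would make beta a function of P, e.g. beta0 / (1 + c * P)
    dS = r1 * S * (1 - (S + I) / K1) - beta * S * I - a1 * S * P
    dI = beta * S * I - mu * I - a2 * I * P
    dU = r2 * U * (1 - U / K2) - a3 * U * P
    dP = e * (a1 * S + a2 * I + a3 * U) * P - d * P
    return [dS, dI, dU, dP]

sol = solve_ivp(rhs, (0, 200), [5.0, 0.5, 4.0, 1.0], dense_output=True)
print("Final densities (S, I, U, P):", np.round(sol.y[:, -1], 3))
```

Varying the predation and conversion parameters in a sketch like this is the numerical counterpart of the bifurcation analysis described in the abstract, showing how consumption of the second prey can alter disease prevalence in the first.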