317 results for QRS DURATION


Relevance: 10.00%

Publisher:

Abstract:

The preferential invasion of particular red blood cell (RBC) age classes may offer a mechanism by which certain species of Plasmodium regulate their population growth. Asexual reproduction of the parasite within RBCs exponentially increases the number of circulating parasites; limiting this explosion in parasite density may be key to providing sufficient time for the parasite to reproduce, and for the host to develop a specific immune response. It is critical that the role of preferential invasion in infection is properly understood in order to model the within-host dynamics of different Plasmodium species. We develop a simulation model to show that limiting the range of RBC age classes available for invasion is a credible mechanism for restricting parasite density, one that is as important as the maximum parasite replication rate and the duration of the erythrocytic cycle. Different species of Plasmodium that regularly infect humans exhibit different preferences for RBC invasion, with all species except P. falciparum appearing to exhibit a combination of characteristics able to self-regulate parasite density.
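The regulation mechanism described above can be illustrated with a minimal sketch (hypothetical parameters, not the paper's model): restricting invasion to a narrow RBC age window caps the parasitised fraction regardless of the burst size.

```python
# Minimal within-host sketch (hypothetical parameters): merozoites can only
# invade RBCs whose age falls inside `age_window` (days); with an RBC
# lifespan of ~120 days and a uniform age distribution, that window bounds
# the invadable fraction of the RBC population.
def simulate(days=30, cycle_len=2, burst_size=8, age_window=(0, 120)):
    lo, hi = age_window
    susceptible_frac = (hi - lo) / 120.0  # fraction of RBCs invadable
    parasitised = 1e-6                    # initial parasitised fraction
    for _ in range(days // cycle_len):
        # each erythrocytic cycle every parasite bursts; invasion
        # saturates once the susceptible age classes are exhausted
        parasitised = min(parasitised * burst_size, susceptible_frac)
    return parasitised

unrestricted = simulate(age_window=(0, 120))  # all ages invadable
restricted = simulate(age_window=(0, 14))     # young-RBC preference
```

Even with an identical replication rate and cycle duration, the narrow age window self-limits parasite density well below the unrestricted case.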


Background Standard operating procedures state that police officers should not drive while interacting with their mobile data terminal (MDT), which provides in-vehicle information essential to police work. Such interactions do, however, occur in practice and represent a potential source of driver distraction. The MDT comprises visual output with manual input via touch screen and keyboard. This study investigated the potential for alternative input and output methods to mitigate driver distraction, with specific focus on eye movements. Method Nineteen experienced drivers of police vehicles (one female) from the NSW Police Force completed four simulated urban drives. Three drives included a concurrent secondary task: an imitation licence plate search using an emulated MDT. Three different interface methods were examined: Visual-Manual, Visual-Voice and Audio-Voice (“Visual” and “Audio” = output modality; “Manual” and “Voice” = input modality). During each drive, eye movements were recorded using FaceLAB™ (Seeing Machines Ltd, Canberra, ACT), and gaze direction and glances on the MDT were assessed. Results The Visual-Manual and Visual-Voice interfaces resulted in significantly more glances towards the MDT than Audio-Voice or Baseline. For longer duration glances (>2s and 1-2s), the Visual-Manual interface resulted in significantly more fixations than Baseline or Audio-Voice. Short duration glances (<1s) were significantly more frequent for both Visual-Voice and Visual-Manual compared with Baseline and Audio-Voice. There were no significant differences between Baseline and Audio-Voice. Conclusion An Audio-Voice interface has the greatest potential to decrease visual distraction to police drivers. However, it is acknowledged that audio output may have limitations for information presentation compared with visual output.
The Visual-Voice interface offers an environment where the capacity to present information is sustained, whilst distraction to the driver is reduced (compared to Visual-Manual) by enabling adaptation of fixation behaviour.
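The glance categories reported above (short <1 s, medium 1-2 s, long >2 s) can be sketched as a simple binning routine; this is illustrative only, not the study's analysis code.

```python
def bin_glances(durations):
    """Count glances on the MDT per duration category used in the study:
    short (<1 s), medium (1-2 s) and long (>2 s). `durations` is a list
    of glance durations in seconds."""
    bins = {"<1s": 0, "1-2s": 0, ">2s": 0}
    for d in durations:
        if d < 1.0:
            bins["<1s"] += 1
        elif d <= 2.0:
            bins["1-2s"] += 1
        else:
            bins[">2s"] += 1
    return bins
```

Long glances are the safety-critical category: at typical urban speeds a >2 s glance corresponds to tens of metres travelled without looking at the road.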


Migraine is classified by the World Health Organization (WHO) as one of the top 20 most debilitating diseases. According to the neurovascular hypothesis, neuroinflammation may promote the activation and sensitisation of meningeal nociceptors, inducing the persistent throbbing headache characteristic of migraine. The tumor necrosis factor (TNF) gene cluster, made up of TNFα, lymphotoxin α (LTA) and lymphotoxin β (LTB), has been implicated in influencing the intensity and duration of local inflammation. It is thought that sterile inflammation mediated by LTA, LTB and TNFα contributes to the threshold of brain excitability and the propagation of neuronal hyperexcitability, and thus to the initiation and maintenance of a migraine attack. Previous studies have investigated variants within the TNF gene cluster region in relation to migraine susceptibility, with largely conflicting results. The aim of this study was to expand on previous research and utilize a large case-control cohort and a range of variants within the TNF gene cluster to investigate the role of the cluster in migraine. Nine single nucleotide polymorphisms (SNPs) were selected for investigation: rs1800683, rs2229094, rs2009658, rs2071590, rs2239704, rs909253, rs1800630, rs1800629 and rs3093664. No significant association with migraine susceptibility was found for any of the SNPs tested, and further testing by migraine subtype and gender also showed no association with disease risk. Haplotype analysis showed that none of the tested haplotypes were significantly associated with migraine.
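Case-control association at a single SNP of the kind tested above is commonly assessed with a Pearson chi-square on a 2x2 allele-count table; a minimal sketch with illustrative counts (not the study's data):

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 allele-count table:
    a, b = case minor/major allele counts; c, d = control minor/major.
    Values above 3.84 are significant at p < .05 with 1 df."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# A null result (equal allele frequencies in cases and controls) gives 0;
# a strongly skewed table gives a large statistic.
null_stat = chi2_2x2(10, 10, 10, 10)
skewed_stat = chi2_2x2(30, 10, 10, 30)
```

For nine SNPs, a multiple-testing correction (e.g. Bonferroni, threshold 0.05/9) would normally be applied before declaring any single-SNP association significant.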


It has been reported that poor nutritional status, in the form of weight loss and resulting body mass index (BMI) changes, is an issue in people with Parkinson's disease (PWP). The symptoms resulting from Parkinson's disease (PD) and the side effects of PD medication have been implicated in the aetiology of nutritional decline. However, the evidence on which these claims are based is, on one hand, contradictory, and on the other, restricted primarily to otherwise healthy PWP. Despite the claims that PWP suffer from poor nutritional status, evidence is lacking to inform nutrition-related care for the management of malnutrition in PWP. The aims of this thesis were to better quantify the extent of poor nutritional status in PWP, determine the important factors differentiating the well-nourished from the malnourished and evaluate the effectiveness of an individualised nutrition intervention on nutritional status.

Phase DBS: Nutritional status in people with Parkinson's disease scheduled for deep-brain stimulation surgery. The pre-operative rate of malnutrition in a convenience sample of PWP scheduled for deep-brain stimulation (DBS) surgery was determined. Poorly controlled PD symptoms may result in a higher risk of malnutrition in this sub-group of PWP. Fifteen patients (11 male, median age 68.0 (42.0 – 78.0) years, median PD duration 6.75 (0.5 – 24.0) years) participated and data were collected during hospital admission for the DBS surgery. The scored PG-SGA was used to assess nutritional status, anthropometric measures (weight, height, mid-arm circumference, waist circumference, BMI) were taken, and body composition was measured using bioelectrical impedance spectroscopy (BIS). Six (40%) of the participants were malnourished (SGA-B) while 53% reported significant weight loss following diagnosis. BMI was significantly different between SGA-A and SGA-B (25.6 vs 23.0 kg/m2, p<.05).
There were no differences in any other variables, including PG-SGA score and the presence of non-motor symptoms. The conclusion was that malnutrition in this group is higher than that in other studies reporting malnutrition in PWP, and it is under-recognised. As poorer surgical outcomes are associated with poorer pre-operative nutritional status in other surgeries, it might be beneficial to identify patients at nutritional risk prior to surgery so that appropriate nutrition interventions can be implemented.

Phase I: Nutritional status in community-dwelling adults with Parkinson's disease. The rate of malnutrition in community-dwelling adults (>18 years) with Parkinson's disease was determined. One hundred twenty-five PWP (74 male, median age 70.0 (35.0 – 92.0) years, median PD duration 6.0 (0.0 – 31.0) years) participated. The scored PG-SGA was used to assess nutritional status, and anthropometric measures (weight, height, mid-arm circumference (MAC), calf circumference, waist circumference, BMI) were taken. Nineteen (15%) of the participants were malnourished (SGA-B). All anthropometric indices were significantly different between SGA-A and SGA-B (BMI 25.9 vs 20.0 kg/m2; MAC 29.1 vs 25.5 cm; waist circumference 95.5 vs 82.5 cm; calf circumference 36.5 vs 32.5 cm; all p<.05). The PG-SGA score was also significantly higher in the malnourished (8 vs 2, p<.05). The nutrition impact symptoms which differentiated between well-nourished and malnourished were no appetite, constipation, diarrhoea, problems swallowing and feeling full quickly. This study concluded that malnutrition in community-dwelling PWP is higher than that documented in community-dwelling elderly (2 – 11%), yet is likely to be under-recognised. Nutrition impact symptoms play a role in reduced intake. Appropriate screening and referral processes should be established for early detection of those at risk.
Phase I: Nutrition assessment tools in people with Parkinson's disease. There are a number of validated and reliable nutrition screening and assessment tools available for use; none of these tools had been evaluated in PWP. In the sample described above, the World Health Organisation (WHO) BMI cut-off (≤18.5 kg/m2), age-specific BMI cut-offs (≤18.5 kg/m2 for under 65 years, ≤23.5 kg/m2 for 65 years and older) and the revised Mini-Nutritional Assessment short form (MNA-SF) were evaluated as nutrition screening tools. The PG-SGA (including the SGA classification) and the MNA full form were evaluated as nutrition assessment tools, using the SGA classification as the gold standard. For screening, the MNA-SF performed best, with sensitivity (Sn) of 94.7% and specificity (Sp) of 78.3%. For assessment, the PG-SGA with a cut-off score of 4 (Sn 100%, Sp 69.8%) performed better than the MNA (Sn 84.2%, Sp 87.7%). As the MNA has been recommended more for use as a nutrition screening tool, the MNA-SF might be more appropriate and takes less time to complete. The PG-SGA might be useful to inform and monitor nutrition interventions.

Phase I: Predictors of poor nutritional status in people with Parkinson's disease. A number of assessments were conducted as part of the Phase I research, including those for the severity of PD motor symptoms, cognitive function, depression, anxiety, non-motor symptoms, constipation, freezing of gait and the ability to carry out activities of daily living. A higher score in all of these assessments indicates greater impairment. In addition, information about medical conditions, medications, age, age at PD diagnosis and living situation was collected. These were compared between those classified as SGA-A and as SGA-B, and regression analysis was used to identify which factors were predictive of malnutrition (SGA-B).
Differences between the groups included disease severity (more severe disease in 4% of SGA-A vs 21% of SGA-B, p<.05), activities of daily living score (13 SGA-A vs 18 SGA-B, p<.05), depressive symptom score (8 SGA-A vs 14 SGA-B, p<.05) and gastrointestinal symptoms (4 SGA-A vs 6 SGA-B, p<.05). Significant predictors of malnutrition according to SGA were age at diagnosis (OR 1.09, 95% CI 1.01 – 1.18), amount of dopaminergic medication per kg body weight (mg/kg) (OR 1.17, 95% CI 1.04 – 1.31), more severe motor symptoms (OR 1.10, 95% CI 1.02 – 1.19), less anxiety (OR 0.90, 95% CI 0.82 – 0.98) and more depressive symptoms (OR 1.23, 95% CI 1.07 – 1.41). Significant predictors of a higher PG-SGA score included living alone (β=0.14, 95% CI 0.01 – 0.26), more depressive symptoms (β=0.02, 95% CI 0.01 – 0.02) and more severe motor symptoms (β=0.01, 95% CI 0.01 – 0.02). More severe disease is associated with malnutrition, and this may be compounded by lack of social support.

Phase II: Nutrition intervention. Nineteen of the people identified in Phase I as requiring nutrition support were included in Phase II, in which a nutrition intervention was conducted. Nine participants were in the standard care group (SC), which received an information sheet only, and the other 10 participants were in the intervention group (INT), which received individualised nutrition information and weekly follow-up. The INT group gained 2.2% of starting body weight over the 12-week intervention period, resulting in significant increases in weight, BMI, mid-arm circumference and waist circumference. The SC group gained 1% of starting weight over the 12 weeks, which did not result in any significant changes in anthropometric indices. Energy and protein intake increased in both groups (18.3 vs 3.8 kJ/kg and 0.3 vs 0.15 g/kg). The increase in protein intake was only significant in the SC group, and the changes in intake did not differ between the groups.
There were no significant changes in any motor or non-motor symptoms or in "off" times or dyskinesias in either group. Aspects of quality of life improved over the 12 weeks as well, especially emotional well-being. This thesis makes a significant contribution to the evidence base for the presence of malnutrition in Parkinson's disease as well as for the identification of those who would potentially benefit from nutrition screening and assessment. The nutrition intervention demonstrated that a traditional high protein, high energy approach to the management of malnutrition resulted in improved nutritional status and anthropometric indices with no effect on the presence of Parkinson's disease symptoms and a positive effect on quality of life.
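The sensitivity and specificity figures used above to compare screening tools against the SGA gold standard reduce to counts of true and false positives; a minimal sketch with hypothetical classifications:

```python
def sens_spec(screen, gold):
    """Sensitivity and specificity of a screening classification against a
    gold standard. `screen` and `gold` are parallel lists of booleans
    (True = classified malnourished). Hypothetical data, not the thesis
    sample."""
    tp = sum(1 for s, g in zip(screen, gold) if s and g)
    fn = sum(1 for s, g in zip(screen, gold) if not s and g)
    fp = sum(1 for s, g in zip(screen, gold) if s and not g)
    tn = sum(1 for s, g in zip(screen, gold) if not s and not g)
    return tp / (tp + fn), tn / (tn + fp)

sn, sp = sens_spec([True, True, False, True, False],
                   [True, True, True, False, False])
```

For a screening tool, high sensitivity is usually prioritised (missing a malnourished patient is costlier than a false referral), which is consistent with preferring the MNA-SF for screening and the PG-SGA cut-off of 4 for assessment.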


The objective of exercise training is to initiate desirable physiological adaptations that ultimately enhance physical work capacity. Optimal training prescription requires an individualized approach, with an appropriate balance of training stimulus and recovery and optimal periodization. Recovery from exercise involves integrated physiological responses. The cardiovascular system plays a fundamental role in facilitating many of these responses, including thermoregulation and delivery/removal of nutrients and waste products. As a marker of cardiovascular recovery, cardiac parasympathetic reactivation following a training session is highly individualized. It appears to parallel the acute/intermediate recovery of the thermoregulatory and vascular systems, as described by the supercompensation theory. The physiological mechanisms underlying cardiac parasympathetic reactivation are not completely understood. However, changes in cardiac autonomic activity may provide a proxy measure of the changes in autonomic input into organs and (by default) the blood flow requirements to restore homeostasis. Metaboreflex stimulation (e.g. muscle and blood acidosis) is likely a key determinant of parasympathetic reactivation in the short term (0–90 min post-exercise), whereas baroreflex stimulation (e.g. exercise-induced changes in plasma volume) probably mediates parasympathetic reactivation in the intermediate term (1–48 h post-exercise). Cardiac parasympathetic reactivation does not appear to coincide with the recovery of all physiological systems (e.g. energy stores or the neuromuscular system). However, this may reflect the limited data currently available on parasympathetic reactivation following strength/resistance-based exercise of variable intensity. 
In this review, we quantitatively analyse post-exercise cardiac parasympathetic reactivation in athletes and healthy individuals following aerobic exercise, with respect to exercise intensity and duration, and fitness/training status. Our results demonstrate that the time required for complete cardiac autonomic recovery after a single aerobic-based training session is up to 24 h following low-intensity exercise, 24–48 h following threshold-intensity exercise and at least 48 h following high-intensity exercise. Based on limited data, exercise duration is unlikely to be the greatest determinant of cardiac parasympathetic reactivation. Cardiac autonomic recovery occurs more rapidly in individuals with greater aerobic fitness. Our data lend support to the concept that in conjunction with daily training logs, data on cardiac parasympathetic activity are useful for individualizing training programmes. In the final sections of this review, we provide recommendations for structuring training microcycles with reference to cardiac parasympathetic recovery kinetics. Ultimately, coaches should structure training programmes tailored to the unique recovery kinetics of each individual.
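The pooled recovery windows reported above can be encoded as a simple lookup for structuring training microcycles; this is a sketch, and the scheduling rule is an assumption rather than the review's algorithm.

```python
# Pooled recovery windows from the review, as (earliest, latest) hours to
# complete cardiac parasympathetic recovery after a single aerobic session.
RECOVERY_WINDOW_H = {
    "low": (0, 24),         # complete within 24 h
    "threshold": (24, 48),  # complete within 24-48 h
    "high": (48, None),     # at least 48 h; no upper bound reported
}

def autonomic_recovery_complete(intensity, hours_since_session):
    """True once the reported upper recovery bound has elapsed; for 'high'
    intensity only the 48 h lower bound is available, so this is optimistic
    for that category (an assumption of this sketch)."""
    lo, hi = RECOVERY_WINDOW_H[intensity]
    return hours_since_session >= (hi if hi is not None else lo)
```

In practice such a table would be individualised, since the review notes recovery is faster in fitter athletes; daily HRV measures would refine the static bounds.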


The current approach for protecting the receiving water environment from urban stormwater pollution is the adoption of structural measures commonly referred to as Water Sensitive Urban Design (WSUD). The treatment efficiency of WSUD measures closely depends on the design of the specific treatment units. As stormwater quality can be influenced by rainfall characteristics, the selection of appropriate rainfall events for treatment design is essential to ensure the effectiveness of WSUD systems. Based on extensive field investigation of four urban residential catchments and computer modelling, this paper details a technically robust approach for the selection of rainfall events for stormwater treatment design using a three-component model. The modelling outcomes indicate that selecting smaller average recurrence interval (ARI) events with high intensity-short duration as the threshold for the treatment system design is the most feasible since these events cumulatively generate a major portion of the annual pollutant load compared to the other types of rainfall events, despite producing a relatively smaller runoff volume. This implies that designs based on small and more frequent rainfall events rather than larger rainfall events would be appropriate in the context of efficiency in treatment performance, cost-effectiveness and possible savings in land area needed.
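The event-selection argument above amounts to asking what share of the annual pollutant load is generated by events at or below a candidate ARI threshold; a minimal sketch with illustrative event data (not the paper's catchment records):

```python
def load_fraction(events, max_ari):
    """Fraction of annual pollutant load generated by rainfall events at or
    below an average recurrence interval (ARI) threshold. `events` is a
    list of (ari_years, pollutant_load) pairs — illustrative values only."""
    total = sum(load for _, load in events)
    small = sum(load for ari, load in events if ari <= max_ari)
    return small / total

# Hypothetical annual record: frequent small events carry most of the load,
# the rare large event comparatively little.
events = [(0.25, 40), (0.5, 30), (1, 20), (10, 10)]
frac = load_fraction(events, max_ari=1)
```

If small, frequent events carry ~90% of the load, sizing the treatment system for them (rather than for large, rare events) captures most of the pollutant while keeping the required land area and cost down, which is the paper's design argument.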


Purpose Obstructive sleep apnoea (OSA) patients effectively treated by and compliant with continuous positive airway pressure (CPAP) occasionally miss a night’s treatment. The purpose of this study was to use a real-car interactive driving simulator to assess the effects of such an occurrence on the next day’s driving, including the extent to which these drivers are aware of increased sleepiness. Methods Eleven long-term compliant CPAP-treated 50–75-year-old male OSA participants completed a 2-h afternoon, simulated, realistic monotonous drive in an instrumented car, twice, following one night of: (1) normal sleep with CPAP and (2) nil CPAP. Drifting out of the road lane (‘incidents’), subjective sleepiness every 200 s and continuous electroencephalogram (EEG) activities indicative of sleepiness and compensatory effort were monitored. Results Withdrawal of CPAP markedly increased sleep disturbance and led to significantly more incidents, a shorter ‘safe’ driving duration, increased alpha and theta EEG power and greater subjective sleepiness. However, increased EEG beta activity indicated that more compensatory effort was being applied. Importantly, under both conditions, there was a highly significant correlation between subjective and EEG measures of sleepiness, to the extent that participants were well aware of the effects of nil CPAP. Conclusions Patients should be aware that compliance with treatment every night is crucial for safe driving.
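The lane-drift ‘incidents’ counted above can be sketched as excursions of lateral position beyond the lane edge, with consecutive out-of-lane samples grouped into one incident; the lane half-width is an illustrative assumption, not the simulator's scoring code.

```python
def count_lane_incidents(lateral_pos, lane_half_width=1.8):
    """Count drifting-out-of-lane 'incidents' in a lateral-position trace
    (metres from lane centre). A run of consecutive samples beyond the
    lane half-width counts as a single incident. The 1.8 m half-width is
    a hypothetical value for illustration."""
    incidents = 0
    out_of_lane = False
    for x in lateral_pos:
        if abs(x) > lane_half_width:
            if not out_of_lane:
                incidents += 1  # new excursion begins
            out_of_lane = True
        else:
            out_of_lane = False
    return incidents
```

Grouping consecutive samples matters: at typical simulator sampling rates a single drift would otherwise be counted dozens of times.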


This dissertation seeks to define and classify potential forms of Nonlinear structure and explore the possibilities they afford for the creation of new musical works. It provides the first comprehensive framework for the discussion of Nonlinear structure in musical works and a detailed overview of the rise of nonlinearity in music during the 20th century. Nonlinear events are shown to emerge through significant parametrical discontinuity at the boundaries between regions of relatively strong internal cohesion. The dissertation situates Nonlinear structures in relation to linear structures and unstructured sonic phenomena and provides a means of evaluating Nonlinearity in a musical structure through the consideration of the degree to which the structure is integrated, contingent, compressible and determinate as a whole. It is proposed that Nonlinearity can be classified within a three-dimensional space described by three continuums: the temporal continuum, encompassing sequential and multilinear forms of organization; the narrative continuum, encompassing processual, game structure and developmental narrative forms; and the referential continuum, encompassing stylistic allusion, adaptation and quotation. The use of spectrograms of recorded musical works is proposed as a means of evaluating Nonlinearity in a musical work through the visual representation of parametrical divergence in pitch, duration, timbre and dynamic over time. Spectral and structural analysis of repertoire works is undertaken as part of an exploration of musical nonlinearity and the compositional and performative features that characterize it. The contribution of cultural, ideological, scientific and technological shifts to the emergence of Nonlinearity in music is discussed, and a range of compositional factors that contributed to the emergence of musical Nonlinearity is examined.
The evolution of notational innovations from the mobile score to the screen score is plotted and a novel framework for the discussion of these forms of musical transmission is proposed. A computer coordinated performative model is discussed, in which a computer synchronises screening of notational information, provides temporal coordination of the performers through click-tracks or similar methods and synchronises the audio processing and synthesized elements of the work. It is proposed that such a model constitutes a highly effective means of realizing complex Nonlinear structures. A creative folio comprising 29 original works that explore nonlinearity is presented, discussed and categorised utilising the proposed classifications. Spectrograms of these works are employed where appropriate to illustrate the instantiation of parametrically divergent substructures and examples of structural openness through multiple versioning.
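Parametrical discontinuity between adjacent regions, which the dissertation reads off spectrograms, can be approximated computationally as spectral flux between successive analysis frames; peaks in the flux mark candidate boundaries between internally cohesive regions. A minimal sketch (not the author's method):

```python
def spectral_flux(frames):
    """Euclidean distance between successive magnitude-spectrum frames.
    `frames` is a list of equal-length lists of spectral magnitudes;
    large flux values indicate the parametrical discontinuities that
    delimit candidate Nonlinear regions."""
    flux = []
    for a, b in zip(frames, frames[1:]):
        flux.append(sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5)
    return flux

# Two identical frames followed by an abrupt spectral change:
# flux is zero across the cohesive region, large at the boundary.
flux = spectral_flux([[1, 0], [1, 0], [0, 1]])
```

In practice the frames would come from a short-time Fourier transform of the recording, and flux could be computed per parameter (pitch, timbre, dynamic) to match the dissertation's multi-parameter notion of divergence.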


Chlamydia trachomatis infections of the male and female reproductive tracts are the world's leading sexually transmitted bacterial disease, and can lead to damaging pathology, scarring and infertility. The resolution of chlamydial infection requires the development of adaptive immune responses to infection, including cell-mediated and humoral immunity. Whilst cluster of differentiation (CD)4+ T cells are known to be essential in clearance of infection [1], they are also associated with immune cell infiltration, autoimmunity and infertility in the testes [2-3]. Conversely, antibodies are less associated with inflammation, are readily transported into the reproductive tracts, and can offer luminal neutralization of chlamydiae prior to infection. Antibodies, or immunoglobulins (Ig), play a supportive role in the resolution of chlamydial infections, and this thesis sought to define the function of IgA and IgG against a variety of chlamydial antigens expressed during the intracellular and extracellular stages of the chlamydial developmental cycle. Transport of IgA and IgG into the mucosal lumen is facilitated by receptor-mediated transcytosis, yet the expression profile (under normal conditions and during urogenital chlamydial infection) of the polymeric immunoglobulin receptor (pIgR) and the neonatal Fc receptor (FcRn) remained unknown. The expression of pIgR and FcRn in the murine male reproductive tract was found to be polarized to the lower and upper reproductive tract tissues, respectively. This demonstrates that the two receptors have a tissue tropism, which must be considered when targeting pathogens that colonize different sites. In contrast, the expression of pIgR and FcRn in the female mouse was distributed across both the upper and lower reproductive tracts. When urogenitally infected with Chlamydia muridarum, both male and female reproductive tracts up-regulated expression of pIgR and down-regulated expression of FcRn.
Unsurprisingly, the up-regulation of pIgR increased the concentration of IgA in the lumen. However, down-regulation of FcRn prevented IgG uptake and led to an increase, or pooling, of IgG in luminal secretions. As previous studies have identified the importance of pIgR-mediated delivery of IgA, as well as the potential of IgA to bind and neutralize intracellular pathogens, IgA against a variety of chlamydial antigens was investigated. The protection afforded by IgA against the extracellular antigen major outer membrane protein (MOMP) was found to be dependent on pIgR expression in vitro and in vivo; in the absence of pIgR, no protection was afforded to mice previously immunized with MOMP. Polyclonal IgA against the intracellular chlamydial antigens inclusion membrane protein A (IncA), inclusion membrane proteins (IncMem) and secreted chlamydial protease-like activity factor (CPAF) was produced and investigated in vitro. Antigen-specific intracellular IgA was found to bind to the respective antigen within the infected cell, but did not significantly reduce inclusion formation (p > 0.05). This suggests that whilst IgA specific for the selected antigens was transported by pIgR to the chlamydial inclusion, it was unable to prevent growth. Similarly, immunization of male mice with intracellular chlamydial antigens (IncA or IncMem), followed by depletion of CD4+ T cells and subsequent urogenital C. muridarum challenge, provided minimal pIgR-mediated protection. Wild type male mice immunized with IncA showed a 57% reduction (p < 0.05), and mice deficient in pIgR a 35% reduction (p < 0.05), in reproductive tract chlamydial burden compared to control antigen, in the absence of CD4+ T cells. This suggests that pIgR and secretory IgA (SIgA) were playing a protective role (21% pIgR-mediated) in unison with another antigen-specific immune mechanism (36%).
Interestingly, IgA generated during a primary respiratory C. muridarum infection did not provide a significant amount of protection against secondary urogenital C. muridarum challenge. Together, these data suggest that IgA specific for an extracellular antigen (MOMP) can play a strong protective role in chlamydial infections, and that IgA targeting intracellular antigens is also effective but dependent on pIgR expression in tissues. However, whilst not investigated here, IgA targeting and blocking other intracellular chlamydial antigens, ones more essential for replication or type III secretion, may be more efficacious in subunit vaccines. Recently, studies have demonstrated that IgG can neutralize influenza virus by trafficking IgG-bound virus to lysosomes [4]. We sought to determine if this process could also traffic chlamydial antigens for degradation by lysosomes, despite Chlamydia spp. actively inhibiting fusion with the host endocytic pathway. As observed in pIgR-mediated delivery of anti-IncA IgA, FcRn similarly transported IgG specific for IncA, which bound the inclusion membrane. Interestingly, FcRn-mediated delivery of anti-IncA IgG significantly decreased inclusion formation by 36% (p < 0.01), and induced aberrant inclusion morphology. This suggests that unlike IgA, IgG can facilitate additional host cellular responses which affect the intracellular niche of chlamydial growth. Fluorescence microscopy revealed that IgG also bound the inclusion but, unlike in the influenza studies, did not induce the recruitment of lysosomes. Notably, anti-IncA IgG recruited sequestosomes to the inclusion membrane, markers of the ubiquitin/proteasome pathway and major histocompatibility complex (MHC) class I loading. To determine if the protection against C. muridarum infection afforded by IncA IgG in vitro translated in vivo, wild type mice and mice deficient in functional FcRn and MHC-I were immunized, depleted of CD4+ T cells, and urogenitally infected with C. muridarum.
Unlike in pIgR-deficient mice, the protection afforded by IncA immunization was completely abrogated in mice lacking functional FcRn and MHC-I/CD8+. Thus, both anti-IncA IgA and IgG can bind the inclusion in a pIgR- and FcRn-mediated manner, respectively. However, only IgG mediates a higher reduction in chlamydial infection in vitro and in vivo, suggesting more than steric blocking of IncA had occurred. Unlike anti-MOMP IgA, which reduced chlamydial infection of epithelial cells and male mouse tissues, IgG was found to enhance infectivity in vitro and in vivo. Opsonization of EBs with MOMP-IgG enhanced inclusion formation in epithelial cells in a MOMP-IgG dose-dependent and FcRn-dependent manner. When MOMP-IgG opsonized EBs were inoculated into the vagina of female mice, a small but non-significant (p > 0.05) enhancement of cervicovaginal C. muridarum shedding was observed three days post infection in mice with functional FcRn. Interestingly, infection with opsonized EBs reduced the intensity of the peak of infection (day six) but protracted the duration of infection by 60% in wild type mice only. Infection with EBs opsonized in IgG also significantly increased (p < 0.05) hydrosalpinx formation in the oviducts and induced lymphocyte infiltration of the uterine horns. As MOMP is an immunodominant antigen and is widely used in vaccines, the ability of IgG specific to extracellular chlamydial antigens to enhance infection and induce pathology needs to be considered. Together, these data suggest that immunoglobulins play a dichotomous role in chlamydial infections, dependent on antigen specificity and on FcRn and pIgR expression. FcRn was found to be highly expressed in the upper male reproductive tract, whilst pIgR was dominantly expressed in the lower reproductive tract. Conversely, female mice expressed FcRn and pIgR in both the lower and upper reproductive tracts.
In response to a normal chlamydial infection, pIgR is up-regulated, increasing secretory IgA release, but FcRn is down-regulated, preventing IgG uptake. Similarly to other studies [5-6], we demonstrate that IgA and IgG generated during primary chlamydial infections play a minor role in recall immunity, and that antigen-specific subunit vaccines can offer more protection. We also show that both IgA and IgG can be used to target intracellular chlamydial antigens, but that IgG is more effective. Finally, IgA against the extracellular antigen MOMP can afford protection, whilst IgG plays a deleterious role by increasing infectivity and inducing damaging immunopathology. Further investigations with additional antigens or combination subunit vaccines will enhance our understanding of the protection afforded by antibodies against intracellular and extracellular pathogenic antigens, and help improve the development of an efficacious chlamydial vaccine.


Purpose/Objective: The basis for poor outcomes in some patients post transfusion remains largely unknown. Despite leukodepletion, there is still evidence of immunomodulatory effects of transfusion that require further study. In addition, there is evidence that the age of blood components transfused significantly affects patient outcomes. Myeloid dendritic cell (DC) and monocyte immune function were studied utilising an in vitro whole blood model of transfusion. Materials and methods: Freshly collected (‘recipient’) whole blood was cultured with ABO-compatible leukodepleted packed red blood cells (PRBC) at 25% blood replacement volume (6 hrs). PRBC were assayed at Day (D) 2, 14, 28 and 42 (date of expiry). In parallel, LPS or zymosan (Zy) were added to mimic infection. Recipients were maintained for the duration of the time course (2 recipients, 4 PRBC units, n = 8). Recipient DC and monocyte intracellular cytokines and chemokines (IL-6, IL-10, IL-12, TNF-α, IL-1α, IL-8, IP-10, MIP-1α, MIP-1β, MCP-1) were measured using flow cytometry. Changes in immune response were calculated by comparison to a parallel no-transfusion control (Wilcoxon matched pairs). The influence of storage age was calculated using ANOVA. Results: Significant suppression of DC and monocyte inflammatory responses was evident. DC and monocyte production of IL-1α was reduced following exposure to PRBC regardless of storage age (P < 0.05 at all time points). Storage-independent PRBC-mediated suppression of DC and monocyte IL-1α was also evident in cultures co-stimulated with Zy. In cultures co-stimulated with either LPS or Zy, significant suppression of DC and monocyte TNF-α and IL-6 was also evident. PRBC storage attenuated monocyte TNF-α production when co-cultured with LPS (P < 0.01, ANOVA). DC and monocyte production of MIP-1α was significantly reduced following exposure to PRBC (DC: P < 0.05 at D2, 28, 42; monocyte: P < 0.05 at all time points).
In cultures co-stimulated with LPS and zymosan, a similar suppression of MIP-1a production was also evident, and production of both DC and monocyte MIP-1b and IP-10 were also significantly reduced. Conclusions: The complexity of the transfusion context was reflected in the whole blood approach utilised. Significant suppression of these key DC and monocyte immune responses may contribute to patient outcomes, such as increased risk of infection and longer hospital stay, following blood transfusion.
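The statistical design described above can be sketched in a few lines: each transfused culture is compared against its own parallel no-transfusion control with a Wilcoxon matched-pairs test, and the effect of PRBC storage age is tested with a one-way ANOVA. The cytokine values below are invented for illustration only; the study's actual data are not shown.

```python
# Hedged sketch of the paired analysis: Wilcoxon matched pairs against a
# parallel no-transfusion control, then one-way ANOVA across storage ages.
# All readings (% cytokine-positive cells) are hypothetical.
from scipy import stats

control = [12.1, 9.8, 14.3, 11.0, 13.5, 10.2, 12.8, 9.5]   # no-transfusion control, n = 8
transfused = [6.2, 5.1, 7.9, 5.8, 7.0, 4.9, 6.5, 5.3]      # PRBC-exposed, same cultures

stat, p = stats.wilcoxon(control, transfused)
print(f"Wilcoxon matched pairs: W={stat:.1f}, p={p:.4f}")

# Influence of storage age: responses grouped by PRBC storage day (hypothetical)
day2 = [6.2, 5.1, 7.9, 5.8]
day14 = [6.0, 5.5, 7.1, 5.2]
day42 = [5.9, 4.8, 6.8, 5.0]
f, p_anova = stats.f_oneway(day2, day14, day42)
print(f"One-way ANOVA across storage ages: F={f:.2f}, p={p_anova:.3f}")
```

With all eight paired differences in the same direction, the exact Wilcoxon test reaches significance even at this small sample size, which is consistent with the paired design the abstract relies on.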

Relevância:

10.00%

Publicador:

Resumo:

The Macroscopic Fundamental Diagram (MFD) has been shown to exist in large urban road and freeway networks, both theoretically and with real data from cities. However, hysteresis and scatter have also been observed in both motorway and urban road networks. This paper investigates how incident variables affect the scatter and shape of the MFD using both simulated data and real data collected from the Pacific Motorway M3 in Brisbane, Australia. Three key components of an incident are investigated using the simulated data: incident location, incident duration and traffic demand. Results based on the simulated data indicate that the MFD shape is a property not only of the network itself but also of the incident characteristics. MFDs for three types of real incidents (crash, hazard and breakdown) are explored separately. The results based on the empirical data are consistent with the simulated results. The hysteresis phenomenon occurs both upstream and downstream of the incident location, but with opposite hysteresis loops. When traffic demand is off-peak, the gradient of the MFD upstream of the incident site is greater than that downstream.
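As an illustration of what an MFD point is (not the paper's actual pipeline), each point is typically a length-weighted network average of link flows and link densities for one time interval; plotting these points over the course of an incident reveals the hysteresis loops discussed above. The three-link network below is hypothetical.

```python
# Minimal sketch: one MFD point as length-weighted average flow and density
# over all links in the network for a single time interval.
def mfd_point(link_flows, link_densities, link_lengths):
    """Return (network flow in veh/h, network density in veh/km)."""
    total_len = sum(link_lengths)
    q = sum(f * l for f, l in zip(link_flows, link_lengths)) / total_len
    k = sum(d * l for d, l in zip(link_densities, link_lengths)) / total_len
    return q, k

# Hypothetical 3-link network at one time interval
flows = [1800, 1200, 600]   # veh/h per link
densities = [20, 35, 60]    # veh/km per link
lengths = [1.0, 0.5, 0.8]   # km
q, k = mfd_point(flows, densities, lengths)
print(q, k)
```

Repeating this calculation for consecutive intervals during an incident, separately for upstream and downstream links, would trace the two opposite hysteresis loops the abstract reports.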

Relevância:

10.00%

Publicador:

Resumo:

Emergency management and climate change adaptation will increasingly challenge all levels of government because of three main factors. First, Australia is extremely vulnerable to the impacts of climate change, particularly through the increasing frequency, duration and/or intensity of disasters such as floods and bushfires. Second, the system of government that divides powers by function and level can often act as a barrier to a well-integrated response. Third, policymaking processes struggle to cope with such complex inter-jurisdictional issues. This paper discusses these factors and explores the nature of the challenge for Australian governments. Investigations into the 2009 Victorian bushfires, the 2011 Perth Hills bushfires, and the 2011 Brisbane floods offer an indication of the challenges ahead, and it is argued that there is a need to: improve community engagement and communication; refocus attention on resilience; improve interagency communication and collaboration; and develop institutional arrangements that support continual improvement and policy learning. These findings offer an opportunity for improving responses as well as a starting point for integrating disaster risk management and climate change adaptation policies. The paper is based on the preliminary findings of an NCCARF-funded research project, The Right Tool for the Job: Achieving climate change adaptation outcomes through improved disaster management policies, planning and risk management strategies, involving Griffith University and RMIT. It should be noted from the outset that the purpose of this research project is not to criticise the actions of emergency service workers and volunteers, who do an incredible job under extreme circumstances, often risking their own lives in the process. The aim is simply to offer emergency management agencies the opportunity to step back and rethink their overall approach to the challenge they face in light of the impacts of climate change.

Relevância:

10.00%

Publicador:

Resumo:

Design Semi-structured interviews. Setting 2 open, acute care units of a large tertiary mental health facility in Queensland, Australia. Patients 12 patients (58% men) who were 18–52 years of age and were secluded in the previous 7 days (mean duration 3.4 h). Methods Semi-structured, thematically organised interviews were audiotaped and transcribed. Transcripts were checked for errors against the audiotaped versions and were analysed using the process of meaning categorisation. Themes were identified and coded to produce categories. All members of the research team agreed on the final categorisations. These broad categories were further analysed, and themes were used to reflect patients' experiences of seclusion. Main findings 5 recurrent themes emerged. (1) Patients described the use of seclusion. Some patients thought that seclusion was used inappropriately and that the seclusion period was of more benefit to …

Relevância:

10.00%

Publicador:

Resumo:

Objective Bronchiolitis, one of the most common reasons for hospitalisation in young children, is particularly problematic in Indigenous children. Macrolides may be beneficial in settings where children have high rates of nasopharyngeal bacterial carriage and frequent prolonged illness. The aim of our double-blind, placebo-controlled, randomised trial was to determine whether a large single dose of azithromycin (compared to placebo) reduced length of stay (LOS), duration of oxygen (O2) requirement and respiratory readmissions within 6 months in children hospitalised with bronchiolitis. We also determined the effect of azithromycin on nasopharyngeal microbiology. Methods Children aged ≤18 months were randomised to receive a single large dose (30 mg/kg) of either azithromycin or placebo within 24 hrs of hospitalisation. Nasopharyngeal swabs were collected at baseline and 48 hrs later. Primary endpoints (LOS, O2) were monitored every 12 hrs. Hospitalised respiratory readmissions within 6 months of discharge were recorded. Results 97 children were randomised (n = 50 azithromycin, n = 47 placebo). Median LOS was similar in both groups: azithromycin = 54 hours, placebo = 58 hours (difference between groups of 4 hours, 95% CI -8, 13, p = 0.6). O2 requirement was not significantly different between groups: azithromycin = 35 hrs, placebo = 42 hrs (difference 7 hours, 95% CI -9, 13, p = 0.7). The number of children re-hospitalised was similar: 10 per group (OR = 0.9, 95% CI 0.3, 2, p = 0.8). At least one virus was detected in 74% of children. The azithromycin group had reduced nasopharyngeal bacterial carriage (p = 0.01) but no difference in viral detection at 48 hours. Conclusion Although a single dose of azithromycin reduces carriage of bacteria, it is unlikely to be beneficial in reducing LOS, duration of O2 requirement or readmissions in children hospitalised with bronchiolitis.
It remains uncertain whether an earlier and/or longer duration of azithromycin improves clinical and microbiological outcomes for these children.
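The primary-endpoint comparison above (a difference in median LOS with a 95% confidence interval) can be sketched with a bootstrap percentile interval. The LOS values below are invented for illustration; the trial's patient-level data are not published in the abstract.

```python
# Hedged sketch: difference in median length of stay between trial arms,
# with a percentile bootstrap 95% CI. All LOS values are hypothetical.
import random

random.seed(1)
azithro = [30, 42, 48, 54, 54, 60, 66, 80]   # hypothetical LOS in hours
placebo = [36, 46, 52, 58, 58, 64, 72, 90]

def median(x):
    s = sorted(x)
    n = len(s)
    return s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2

obs = median(placebo) - median(azithro)  # observed difference in medians

# Percentile bootstrap: resample each arm with replacement
boot = []
for _ in range(5000):
    a = [random.choice(azithro) for _ in azithro]
    p = [random.choice(placebo) for _ in placebo]
    boot.append(median(p) - median(a))
boot.sort()
lo, hi = boot[int(0.025 * len(boot))], boot[int(0.975 * len(boot))]
print(f"median difference = {obs} h, 95% CI ({lo}, {hi})")
```

A CI for a difference in medians that spans zero, as in the trial's reported (-8, 13), is what underlies the "unlikely to be beneficial" conclusion.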

Relevância:

10.00%

Publicador:

Resumo:

Background & aims Depression has a complex association with cardiometabolic risk, both directly as an independent factor and indirectly through mediating effects on other risk factors such as BMI, diet, physical activity, and smoking. Since changes to many cardiometabolic risk factors involve behaviour change, the rise in depression prevalence as a major global health issue may present further challenges to the long-term behaviour change needed to reduce such risk. This study investigated associations between depression scores and participation in a community-based weight management intervention trial. Methods A group of 64 overweight (BMI > 27), otherwise healthy adults were recruited and randomised to follow either their usual diet, or an isocaloric diet in which saturated fat was replaced with monounsaturated fat (MUFA), to a target of 50% of total fat, by adding macadamia nuts to the diet. Subjects were assessed for depressive symptoms at baseline and at ten weeks using the Beck Depression Inventory (BDI-II). Both control and intervention groups received advice on the National Guidelines for Physical Activity and adhered to the same protocol for food diary completion and trial consultations. Anthropometric and clinical measurements (cholesterol, inflammatory mediators) were also taken at baseline and 10 weeks. Results During the recruitment phase, pre-existing diagnosed major depression was one of a range of reasons for initial exclusion of volunteers from the trial. Amongst enrolled participants, there was a significant negative correlation (R = −0.38, p < 0.05) between BDI-II scores at baseline and duration of participation in the trial. Subjects with a baseline BDI-II score ≥10 (moderate to severe depression symptoms) were more likely to drop out of the trial before week 10 (p < 0.001). BDI-II scores in the intervention (MUFA) diet group decreased, but increased in the control group, over the 10-week period.
Univariate analysis of variance confirmed these observations (adjusted R2 = 0.257, p = 0.01). Body weight remained static over the 10-week period in the intervention group, corresponding to a relative increase in the control group (adjusted R2 = 0.097, p = 0.064). Conclusions Depression symptoms have the potential to affect enrolment in and adherence to diet-based risk reduction interventions, and may consequently influence the generalisability of such trials. Depression scores may therefore be useful for characterising, screening and allocating subjects to appropriate treatment pathways.
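The baseline association reported above is a Pearson correlation between BDI-II score and duration of trial participation. A minimal sketch follows; the paired values are hypothetical and deliberately simple, whereas the study itself reported R = −0.38.

```python
# Hedged sketch: Pearson correlation between baseline depression score and
# weeks of trial participation. All paired values are hypothetical.
from scipy import stats

bdi_baseline = [2, 5, 8, 9, 11, 12, 15, 18, 21, 25]   # hypothetical BDI-II scores
weeks_in_trial = [10, 10, 10, 9, 8, 10, 6, 4, 3, 2]   # hypothetical weeks retained

r, p = stats.pearsonr(bdi_baseline, weeks_in_trial)
print(f"R = {r:.2f}, p = {p:.3f}")
```

A negative R here means higher baseline depression scores go with shorter participation, which is the direction of the dropout effect the abstract describes.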