Abstract:
Background During a global influenza pandemic, the vaccine requirements of developing countries can surpass their supply capabilities, if these exist at all, compelling them to rely on developed countries for stocks that may not be available in time. There is thus a need for developing countries in general to produce their own pandemic and possibly seasonal influenza vaccines. Here we describe the development of a plant-based platform for producing influenza vaccines locally, in South Africa. Plant-produced influenza vaccine candidates are quicker to develop and potentially cheaper than egg-produced influenza vaccines, and their production can be rapidly upscaled. In this study, we investigated the feasibility of producing a vaccine to the highly pathogenic avian influenza A subtype H5N1 virus, one of the most virulent influenza viruses identified to date. Two variants of the haemagglutinin (HA) surface glycoprotein gene were synthesised for optimum expression in plants: these were the full-length HA gene (H5) and a truncated form lacking the transmembrane domain (H5tr). The genes were cloned into a panel of Agrobacterium tumefaciens binary plant expression vectors in order to test HA accumulation in different cell compartments. The constructs were transiently expressed in tobacco by means of agroinfiltration. Stable transgenic tobacco plants were also generated to provide seed for stable storage of the material as a pre-pandemic strategy. Results For both transient and transgenic expression systems the highest accumulation of full-length H5 protein occurred in the apoplastic spaces, while the highest accumulation of H5tr was in the endoplasmic reticulum. The H5 proteins were produced at relatively high concentrations in both systems. Following partial purification, haemagglutination and haemagglutination inhibition tests indicated that the conformation of the plant-produced HA variants was correct and the proteins were functional.
The immunisation of chickens and mice with the candidate vaccines elicited HA-specific antibody responses. Conclusions After synthesising two versions of a single gene, we produced, by both transient and transgenic expression in plants, two variants of a highly pathogenic avian influenza virus HA protein with vaccine potential. This is a proof of principle of plant-produced influenza vaccines as a feasible pandemic response strategy for South Africa and other developing countries.
Abstract:
BACKGROUND/OBJECTIVES: To describe the diet quality of a national sample of Australian women with a recent history of gestational diabetes mellitus (GDM) and determine factors associated with adherence to national dietary recommendations. SUBJECTS/METHODS: A postpartum lifestyle survey with 1499 Australian women diagnosed with GDM ≤3 years previously. Diet quality was measured using the Australian recommended food score (ARFS) and weighted by demographic and diabetes management characteristics. Multinomial logistic regression analysis was used to determine the association between diet quality and demographic characteristics, health-seeking behaviours and diabetes-related risk factors. RESULTS: Mean (±s.d.) ARFS was 30.9±8.1 from a possible maximum score of 74. Subscale component scores demonstrated that the nuts/legumes, grains and fruit components were the most poorly scored. Factors associated with being in the highest compared with the lowest ARFS quintile included age (odds ratio (OR) per 5-year increase=1.40; 95% confidence interval (CI): 1.16–1.68), tertiary education (OR=2.19; 95% CI: 1.52–3.17), speaking only English (OR=1.92; 95% CI: 1.19–3.08), being sufficiently physically active (OR=2.11; 95% CI: 1.46–3.05), returning for postpartum blood glucose testing (OR=1.75; 95% CI: 1.23–2.50) and receiving risk-reduction advice from a health professional (OR=1.80; 95% CI: 1.24–2.60). CONCLUSIONS: Despite an increased risk of type 2 diabetes, women in this study had an overall poor diet quality as measured by the ARFS. Women with GDM should be targeted for interventions aimed at achieving a postpartum diet consistent with the guidelines for chronic disease prevention. Encouraging women to return for follow-up and providing risk-reduction advice may be positive initial steps to improve diet quality, but additional strategies need to be identified.
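Odds ratios like those reported above are conventionally obtained by exponentiating logistic-regression coefficients, with a Wald interval on the log scale. A minimal sketch of that arithmetic follows; the coefficient and standard error are hypothetical values chosen only to land near the reported age effect, not data from the study:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient and its standard
    error into an odds ratio with a 95% Wald confidence interval."""
    point = math.exp(beta)
    lower = math.exp(beta - z * se)
    upper = math.exp(beta + z * se)
    return point, lower, upper

# Hypothetical: beta = 0.336 per 5-year age increase, SE = 0.095,
# which reproduces an odds ratio near the reported OR = 1.40.
or_, lo, hi = odds_ratio_ci(0.336, 0.095)
```

The same transformation applies to every OR/CI pair in the abstract; only the fitted coefficients differ.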
Abstract:
Decades of research has now produced a rich description of the destruction child sexual assault (CSA) can cause in an individual’s life. Post-Traumatic Stress Disorder (PTSD), Dissociative Identity Disorder, Borderline Personality Disorder, depression, anxiety, Panic Disorder, intimacy issues, substance abuse, self-harm, and suicidal ideation and attempts, are some of the negative outcomes that have been attributed to this type of traumatic experience. Psychology's tendency to dwell within a pathological paradigm, along with popular media that espouse a similar rhetoric, would lead to the belief that once exposed to CSA, an individual is forever at the mercy of dealing with a massive array of accompanying negative effects. While the possibility of these outcomes in those who have experienced CSA is not at all denied, it is also timely to consider an alternative paradigm that up until now has received a paucity of attention in the sexual assault literature. That is to say, not only do people have the ability to work through the painful and personal impacts of CSA, but for some people the process of recovery may provide a catalyst for positive life changes that have been termed post-traumatic growth (Tedeschi & Calhoun, 1995). This chapter begins by discussing the negative sequelae of childhood sexual assault. Inherent to this discussion are questions of measurement and definitions of sexual assault. The chapter highlights ways in which the term CSA has been defined and hence operationalised in research, and the myriad problems, confusions, and inconclusive findings that have plagued the sexual assault literature. Following this is a review of the sparse literature that has conceptualised CSA from a more salutogenic (Antonovsky, 1979) theoretical orientation.
It is argued that a salutogenic approach to intervention and research in this area provides a more useful way of promoting healing and the gaining of wisdom, while importantly not negating the very real distress that may accompany growth. The chapter then presents a case study to elucidate the theoretical and empirical literature discussed, using the words of a survivor. Finally, the chapter concludes with implications for therapeutic practice, including some practical ways to promote adaptation to life within the context of having survived this insidious crime.
Abstract:
LIP emplacement is linked to the timing and evolution of supercontinental break-up. LIP-related break-up produces volcanic rifted margins, new and large (up to 10⁸ km²) ocean basins, and new, smaller continents that undergo dispersal and potentially reassembly (e.g., India). However, not all continental LIPs lead to continental rupture. We analysed the <330 Ma continental LIP record (following final assembly of Pangea) to find relationships between LIP event attributes (e.g., igneous volume, extent, distance from pre-existing continental margin) and ocean basin attributes (e.g., length of new ocean basin/rifted margin) and how these varied during the progressive break-up of Pangea. No correlation exists between LIP magnitude and size of the subsequent ocean basin or rifted margin. Our review suggests a three-phased break-up history of Pangea: 1) “Preconditioning” phase (∼330–200 Ma): LIP events (n=7) occurred largely around the supercontinental margin, clustering today in Asia, with a low (<20%) rifting success rate. The Panjal Traps at ∼280 Ma may represent the first continental rupturing event of Pangea, resulting in continental ribboning along the Tethyan margin; 2) “Main Break-up” phase (∼200–100 Ma): numerous large LIP events (n=10) in the supercontinent interior, resulting in highly successful fragmentation (90%) and large, new ocean basins (e.g., Central/South Atlantic, Indian, >3000 km long); 3) “Waning” phase (∼100–0 Ma): declining LIP magnitudes (n=6), with greater proximity to continental margins (e.g., Madagascar, North Atlantic, Afro-Arabia, Sierra Madre) producing smaller ocean basins (<2600 km long). How Pangea broke up may thus have implications for earlier supercontinent reconstructions and the LIP record.
Abstract:
Early preterm birth (<32 weeks) is associated with in utero infection and inflammation. We used an ovine model of in utero infection to ask if exposure to Ureaplasma serovar 3 (UP) modulated the response of the fetal skin to LPS.
Abstract:
Post-disaster reconstruction projects are often considered ineffectual or unproductive because on many occasions in the past they have performed extremely poorly during post-contract occupation, or have failed altogether to deliver acceptable outcomes. In some cases, these projects have failed even before their completion, leading many sponsor aid organisations to hold them up as examples of how not to deliver housing reconstruction. Research into previous unsuccessful projects has revealed that a lack of adequate knowledge regarding the context and complexity involved in implementing these projects is often responsible for their failure. Post-disaster reconstruction projects are certainly very complex in nature, often very context-specific, and they can vary widely in magnitude. Despite such complexity, reconstruction projects can still have a high likelihood of success if adequate consideration is given to the factors known to positively influence reconstruction efforts. Good outcomes can be achieved when planners and practitioners ensure best practices are embedded in the design of reconstruction projects at the time they are first instigated. This paper outlines and discusses factors that significantly contribute to the successful delivery of post-disaster housing reconstruction projects.
Abstract:
Penalties and sanctions to deter risky/illegal behaviours are important components of traffic law enforcement. Sanctions can be applied to the vehicle (e.g., impoundment), the person (e.g., remedial programs or jail), or the licence (e.g., disqualification). For licence sanctions, some offences attract automatic suspension while others attract demerit points which can indirectly lead to licence loss. In China, a licence is suspended when a driver accrues twelve demerit points within one year. When this occurs, the person must undertake a one-week retraining course at their own expense and successfully pass an examination to become relicensed. Little is known about the effectiveness of this program. A pilot study was conducted in Zhejiang Province to examine basic information about participants of a retraining course. The aim was to gather baseline data for future comparison. Participants were recruited at a driver retraining centre in a large city in Zhejiang Province. In total, 239 suspended drivers completed an anonymous questionnaire which included demographic information, driving history, and crash involvement. Overall, 87% were male, with a mean age of 35.02 years (SD=8.77; range 21-60 years). A large proportion (83.3%) of participants owned a vehicle. Commuting to work was reported by 64% as their main reason for driving, while 16.3% reported driving for work. Only 6.4% reported holding a licence for 1 year or less (M=8.14 years, SD=6.5, range 1-31 years), and participants reported driving an average of 18.06 hours/week (SD=14.4, range 1-86 hours). This represents a relatively experienced group, especially given the increase in new drivers in China. The number of infringements reportedly received in the previous year ranged from 2 to 18 (M=4.6, SD=3.18); one third of participants reported having received 5 or more infringements. Approximately one third also reported having received infringements in the previous year but not paid them.
Various strategies for avoiding penalties were reported. The most commonly reported traffic violations were: drink driving (DUI; 0.02–0.08 g/100 ml), with 61.5% reporting 1 such violation; and speeding (47.7% reported 1-10 violations). Only 2.2% of participants reported the more serious drunk driving violation (DWI; above 0.08 g/100 ml). Other violations included disobeying traffic rules, using an inappropriate licence, and licence plate destroyed/not displayed. Two-thirds of participants reported no crash involvement in the previous year while 14.2% reported involvement in 2-5 crashes. The relationship between infringements and crashes was limited; however, there was a small but significant positive correlation between crashes and speeding infringements (r=.2, p=.004). Overall, these results indicate the need for improved compliance with the law among this sample of traffic offenders. For example, lower level drink driving (DUI) and speeding were the most commonly reported violations, with some drivers having committed a large number in the previous year. It is encouraging that the more serious offence of drunk driving (DWI) was rarely reported. The effectiveness of this driver retraining program and the demerit point penalty system in China is currently unclear. Future research, including longitudinal follow-up of drivers, is recommended to determine program effectiveness to enhance road safety in China.
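The crash–infringement association reported above (r=.2, p=.004) is a Pearson product-moment correlation. A minimal sketch of that computation follows; the counts used are hypothetical illustrations, not the study's data:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # Sum of cross-products and sums of squared deviations.
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical per-driver counts of speeding infringements and crashes:
speeding = [0, 1, 2, 3, 4, 5]
crashes = [0, 0, 1, 0, 1, 1]
r = pearson_r(speeding, crashes)
```

In practice the accompanying p-value would come from a t-test on r with n-2 degrees of freedom, as library routines such as `scipy.stats.pearsonr` report.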
Abstract:
The objective of exercise training is to initiate desirable physiological adaptations that ultimately enhance physical work capacity. Optimal training prescription requires an individualized approach, with an appropriate balance of training stimulus and recovery and optimal periodization. Recovery from exercise involves integrated physiological responses. The cardiovascular system plays a fundamental role in facilitating many of these responses, including thermoregulation and delivery/removal of nutrients and waste products. As a marker of cardiovascular recovery, cardiac parasympathetic reactivation following a training session is highly individualized. It appears to parallel the acute/intermediate recovery of the thermoregulatory and vascular systems, as described by the supercompensation theory. The physiological mechanisms underlying cardiac parasympathetic reactivation are not completely understood. However, changes in cardiac autonomic activity may provide a proxy measure of the changes in autonomic input into organs and (by default) the blood flow requirements to restore homeostasis. Metaboreflex stimulation (e.g. muscle and blood acidosis) is likely a key determinant of parasympathetic reactivation in the short term (0–90 min post-exercise), whereas baroreflex stimulation (e.g. exercise-induced changes in plasma volume) probably mediates parasympathetic reactivation in the intermediate term (1–48 h post-exercise). Cardiac parasympathetic reactivation does not appear to coincide with the recovery of all physiological systems (e.g. energy stores or the neuromuscular system). However, this may reflect the limited data currently available on parasympathetic reactivation following strength/resistance-based exercise of variable intensity. 
In this review, we quantitatively analyse post-exercise cardiac parasympathetic reactivation in athletes and healthy individuals following aerobic exercise, with respect to exercise intensity and duration, and fitness/training status. Our results demonstrate that the time required for complete cardiac autonomic recovery after a single aerobic-based training session is up to 24 h following low-intensity exercise, 24–48 h following threshold-intensity exercise and at least 48 h following high-intensity exercise. Based on limited data, exercise duration is unlikely to be the greatest determinant of cardiac parasympathetic reactivation. Cardiac autonomic recovery occurs more rapidly in individuals with greater aerobic fitness. Our data lend support to the concept that in conjunction with daily training logs, data on cardiac parasympathetic activity are useful for individualizing training programmes. In the final sections of this review, we provide recommendations for structuring training microcycles with reference to cardiac parasympathetic recovery kinetics. Ultimately, coaches should structure training programmes tailored to the unique recovery kinetics of each individual.
Abstract:
Background Acute respiratory illness, a leading cause of cough in children, accounts for a substantial proportion of childhood morbidity and mortality worldwide. In some children acute cough progresses to chronic cough (>4 weeks' duration), impacting on morbidity and decreasing quality of life. Despite the importance of chronic cough as a cause of substantial childhood morbidity and associated economic, family and social costs, data on the prevalence, predictors, aetiology and natural history of the symptom are scarce. This study aims to comprehensively describe the epidemiology, aetiology and outcomes of cough during and after acute respiratory illness in children presenting to a tertiary paediatric emergency department. Methods/design A prospective cohort study of children aged <15 years attending the Royal Children's Hospital Emergency Department, Brisbane, for a respiratory illness that includes parent-reported cough (wet or dry) as a symptom. The primary objective is to determine the prevalence and predictors of chronic cough (>4 weeks' duration) post presentation with acute respiratory illness. Demographic, epidemiological, risk factor, microbiological and clinical data are completed at enrolment. Subjects complete daily cough diaries and weekly follow-up contacts for 28 (±3) days to ascertain cough persistence. Children who continue to cough for 28 days post enrolment are referred to a paediatric respiratory physician for review. Primary analysis will be the proportion of children with persistent cough at day 28 (±3). Multivariate analyses will be performed to evaluate variables independently associated with chronic cough at day 28 (±3). Discussion Our protocol will be the first to comprehensively describe the natural history, epidemiology, aetiology and outcomes of cough during and after acute respiratory illness in children.
The results will contribute to studies leading to the development of evidence-based clinical guidelines to improve the early detection and management of chronic cough in children during and after acute respiratory illness.
Abstract:
OBJECTIVE: We present and analyze long-term outcomes following multimodal therapy for esophageal cancer, in particular the relative impact of histomorphologic tumor regression and nodal status. PATIENTS AND METHODS: A total of 243 patients (adenocarcinoma, n = 170; squamous cell carcinoma, n = 73) treated with neoadjuvant chemoradiotherapy in the period 1990 to 2004 were followed prospectively with a median follow-up of 60 months. Pathologic stage and tumor regression grade (TRG) were documented, the site of first failure was recorded, and Kaplan-Meier survival curves were plotted. RESULTS: Thirty patients (12%) did not undergo surgery due to disease progression or deteriorated performance status. Forty-one patients (19%) had a complete pathologic response (pCR), and there were 31 (15%) stage I, 69 (32%) stage II, and 72 (34%) stage III cases. The overall median survival was 18 months, and the 5-year survival was 27%. The 5-year survival of patients achieving a pCR was 50% compared with 37% in non-pCR patients who were node-negative (P = 0.86). Histomorphologic tumor regression was not associated with pre-CRT cTN stage but was significantly (P < 0.05) associated with ypN stage. By multivariate analysis, ypN status (P = 0.002) was more predictive of overall survival than TRG (P = 0.06) or ypT stage (P = 0.39). CONCLUSION: Achieving a node-negative status is the major determinant of outcome following neoadjuvant chemoradiotherapy. Histomorphologic tumor regression is less predictive of outcome than pathologic nodal status (ypN), and the need to include a primary site regression score in a new staging classification is unclear. © 2007 Lippincott Williams & Wilkins, Inc.
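The Kaplan-Meier survival curves plotted in studies like this one are built with the product-limit estimator, which steps the survival probability down at each observed death while dropping censored patients from the risk set without a step. A minimal sketch, using hypothetical follow-up times in months rather than the study's data:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit survival estimate.
    times  : follow-up time for each patient
    events : 1 if death was observed, 0 if the patient was censored
    Returns a list of (time, survival probability) steps."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        # Deaths and total observations (deaths + censorings) at time t.
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        ties = sum(1 for tt, _ in data if tt == t)
        if deaths:
            survival *= (at_risk - deaths) / at_risk
            curve.append((t, survival))
        at_risk -= ties
        i += ties
    return curve

# Hypothetical cohort: deaths at 6, 18 and 24 months; two patients
# still alive (censored) at 60 months contribute no downward step.
curve = kaplan_meier([6, 18, 24, 60, 60], [1, 1, 1, 0, 0])
```

Production analyses would normally use a vetted routine (e.g. `lifelines.KaplanMeierFitter`) that also supplies confidence bands and log-rank comparisons.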
Abstract:
Background. This study evaluated the time course of recovery of transverse strain in the Achilles and patellar tendons following a bout of resistance exercise. Methods. Seventeen healthy adults underwent sonographic examination of the right patellar (n = 9) or Achilles (n = 8) tendons immediately prior to and following 90 repetitions of weight-bearing exercise. Quadriceps and gastrocnemius exercise were performed against an effective resistance of 175% and 250% body weight, respectively. Sagittal tendon thickness was determined 20 mm from the tendon enthesis and transverse strain was repeatedly monitored over a 24-hour recovery period. Results. Resistance exercise resulted in an immediate decrease in Achilles (t7 = 10.6, P<.01) and patellar (t8 = 8.9, P<.01) tendon thickness, resulting in an average transverse strain of 0.14 ± 0.04 and 0.18 ± 0.05, respectively. While the average strain was not significantly different between tendons, older age was associated with a reduced transverse strain response (r=0.63, P<.01). Recovery of transverse strain, in contrast, was prolonged compared with the duration of loading and exponential in nature. The mean primary recovery time was not significantly different between Achilles (6.5 ± 3.2 hours) and patellar (7.1 ± 3.2 hours) tendons, and body weight accounted for 62% and 64% of the variation in recovery time, respectively. Discussion. Despite structural and biochemical differences between the Achilles and patellar tendons [1], the mechanisms underlying transverse creep-recovery in vivo appear similar and are highly time dependent. Primary recovery required about 7 hours in healthy tendons, with full recovery requiring up to 24 hours. These in vivo recovery times are similar to those reported for axial creep recovery of the vertebral disc in vitro [2], and may be used clinically to guide physical activity to rest ratios in healthy adults.
Optimal ratios for high-stress tendons in clinical populations, however, remain unknown and require further attention in light of the knowledge gained in this study.
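The transverse strain values reported above amount to a fractional decrease in sagittal tendon thickness between the pre- and post-exercise scans. A minimal sketch of that calculation, using hypothetical thickness measurements rather than values from the study:

```python
def transverse_strain(thickness_pre, thickness_post):
    """Transverse (compressive) strain of a tendon: the fractional
    decrease in sagittal thickness after loading."""
    return (thickness_pre - thickness_post) / thickness_pre

# Hypothetical: a tendon measuring 4.6 mm before exercise and
# 4.0 mm immediately afterwards, giving a strain of about 0.13,
# within the range reported for the Achilles (0.14 +/- 0.04).
strain = transverse_strain(4.6, 4.0)
```

The exponential recovery described in the abstract would then be tracked by applying the same calculation to each follow-up scan against the pre-exercise baseline.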
Abstract:
Cryotherapy is currently used in various clinical, rehabilitative, and sporting settings. However, very little is known regarding the impact of cooling on the microcirculatory response. Objectives: The present study sought to examine the influence of two commonly employed modalities of cryotherapy, whole body cryotherapy (WBC; -110°C) and cold water immersion (CWI; 8 ± 1°C), on skin microcirculation in the mid-thigh region. Methods: The skin area examined was a 3 × 3 cm region located between the most anterior aspect of the inguinal fold and the patella. Following 10 minutes of rest, 5 healthy, active males were exposed to either WBC for 3 minutes or CWI for 5 minutes in a randomised order. Volunteers lay supine after treatment while the variation in red blood cell (RBC) concentration in the region of interest was monitored for a duration of 40 minutes. Microcirculation response was assessed using a non-invasive, portable instrument known as a Tissue Viability imaging system. After a minimum of seven days, the protocol was repeated. Subjective assessment of the volunteers' thermal comfort and thermal sensation was also recorded. Results: RBC was altered following exposure to both WBC and CWI but appeared to stabilise approximately 35 minutes after treatments. Both WBC and CWI affected thermal sensation (p < 0.05); however, no between-group differences in thermal comfort or sensation were recorded (p > 0.05). Conclusions: As both WBC and CWI altered RBC, further study is necessary to examine the mechanism for this alteration during whole body cooling.
Abstract:
Impaction bone grafting for reconstitution of bone stock in revision hip surgery has been used for nearly 30 years. We used this technique, in combination with a cemented acetabular component, in the acetabula of 304 hips in 292 patients revised for aseptic loosening between 1995 and 2001. The only additional supports used were stainless steel meshes placed against the medial wall or laterally around the acetabular rim to contain the graft. All Paprosky grades of defect were included. Clinical and radiographic outcomes were collected in surviving patients at a minimum of 10 years following the index operation. Mean follow-up was 12.4 years (SD 1.5; range 10.0-16.0). Kaplan-Meier survivorship with revision for aseptic loosening as the endpoint was 85.9% (95% CI 81.0 to 90.8%) at 13.5 years. Clinical scores for pain relief remained satisfactory, and there was no difference in clinical scores between cups that appeared stable and those that appeared loose radiographically.
Abstract:
This paper will identify and discuss the major occupational health and safety (OHS) hazards and risks for clean-up and recovery workers. The lessons learned from previous disasters, including the Exxon Valdez oil spill, the World Trade Centre (WTC) terrorist attack, Hurricane Katrina and the Deepwater Horizon Gulf of Mexico oil spill, will be discussed. The case for an increased level of preparation and planning to mitigate the health risks for clean-up and recovery workers will be presented, based on recurring themes identified in the peer-reviewed literature. There are a number of important issues pertaining to the occupational health and safety of workers who are engaged in clean-up and recovery operations following natural and technological disasters. These workers are often exposed to a wide range of occupational health and safety hazards, some of which may be unknown at the time. It is well established that clean-up and recovery operations involve risks of physical injury, for example, from manual handling, mechanical equipment, extreme temperatures, slips, trips and falls. In addition to these well-established physical injury risks, there are now an increasing number of studies which highlight the risks of longer-term or chronic health effects arising from clean-up and recovery work. In particular, follow-up studies from the Exxon Valdez oil spill, Hurricane Katrina and the World Trade Centre (WTC) terrorist attack have documented the longer-term health consequences of these events. These health effects include respiratory symptoms and musculoskeletal disorders, as well as post-traumatic stress disorder (PTSD).
In large-scale operations many of the workers and supervisors involved have not had any specific occupational health and safety (OHS) training and may not have access to the necessary instruction, personal protective equipment or other appropriate equipment. This is especially true when volunteers form part of the clean-up and recovery workforce. In general, first responders are better equipped and trained than clean-up and recovery workers, and some of the training approaches used for traditional first responders would be relevant for clean-up and recovery workers.
Abstract:
The following research reports the emergence of Leptospira borgpetersenii serovar Arborea as the dominant infecting serovar following the summer of disasters and the ensuing clean-up in Queensland, Australia during 2011. For the 12-month period (1 January to 31 December), L. borgpetersenii serovar Arborea accounted for over 49% of infections. In response to a flooding event, public health officials need to issue community-wide announcements warning the population about the dangers of leptospirosis and other waterborne diseases. Communication with physicians working in the affected community should also be increased to update physicians with information such as the clinical presentation of leptospirosis and other waterborne diseases. These recommendations will furnish public health officials with considerations for disease management when dealing with future disaster management programs.