857 results for E-Inclusion Research Network
Abstract:
Background Abstractor training is a key element in creating valid and reliable data collection procedures. The choice between in-person vs. remote or simultaneous vs. sequential abstractor training has considerable consequences for time and resource utilization. We conducted a web-based (webinar) abstractor training session to standardize training across six individual Cancer Research Network (CRN) sites for a study of breast cancer treatment effects in older women (BOWII). The goals of this manuscript are to describe the training session, its participants and the participants' evaluation of webinar technology for abstraction training. Findings A webinar was held for all six sites with the primary purpose of simultaneously training staff and ensuring consistent abstraction across sites. The training session involved sequential review of over 600 data elements outlined in the coding manual in conjunction with the display of data entry fields in the study's electronic data collection system. Post-training evaluation was conducted via SurveyMonkey©. Inter-rater reliability assessments for abstractors within each site were conducted three months after the commencement of data collection. Ten of the 16 people who participated in the training completed the online survey. Almost all (90%) of the 10 trainees had previous medical record abstraction experience, and nearly two-thirds reported over 10 years of experience. Half of the respondents had previously participated in a webinar, among whom three had participated in a webinar for training purposes. All rated the knowledge and information delivered through the webinar as useful and reported that it adequately prepared them for data collection. Moreover, all participants would recommend this platform for multi-site abstraction training. Consistent with participant-reported training effectiveness, within-site inter-rater agreement ranged from 89% to 98%, with a weighted average of 95% agreement across sites. Conclusions Conducting training via web-based technology was an acceptable and effective approach to standardizing medical record review across multiple sites for this group of experienced abstractors. Given the substantial time and cost savings achieved with the webinar, coupled with participants' positive evaluation of the training session, researchers should consider this instructional method as part of training efforts to ensure high-quality data collection in multi-site studies.
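As a worked illustration of the summary figure above, a weighted average of within-site agreement can be computed by weighting each site's percent agreement by the volume of records it abstracted. A minimal Python sketch with invented site values (the abstract reports only the 89-98% range and the 95% weighted average, not per-site counts):

```python
# Illustrative only: weighted average of within-site percent agreement.
# Site labels, agreement values and chart counts are invented.
site_agreement = {"A": 0.89, "B": 0.93, "C": 0.95, "D": 0.96, "E": 0.97, "F": 0.98}
charts_abstracted = {"A": 40, "B": 55, "C": 60, "D": 70, "E": 80, "F": 95}

total_charts = sum(charts_abstracted.values())
weighted_avg = sum(site_agreement[s] * charts_abstracted[s] for s in site_agreement) / total_charts
print(f"Weighted average agreement across sites: {weighted_avg:.1%}")
```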
Abstract:
This is a European cohort study of predictors of spinal injury in adult (≥16 years) major trauma patients, using prospectively collected data from the Trauma Audit and Research Network from 1988 to 2009. Predictors of spinal fractures/dislocations or spinal cord injury were determined using univariate and multivariate logistic regression analysis. 250,584 patients were analysed. 24,000 patients (9.6%) sustained spinal fractures/dislocations alone and 4,489 (1.8%) sustained spinal cord injury with or without fractures/dislocations. Spinal injury patients had a median age of 44.5 years (IQR = 28.8-64.0) and an Injury Severity Score of 9 (IQR = 4-17). 64.9% were male. 45% of patients suffered associated injuries to other body regions. Age <45 years (≥45 years OR 0.83-0.94), Glasgow Coma Score (GCS) 3-8 (OR 1.10, 95% CI 1.02-1.19), falls >2 m (OR 4.17, 95% CI 3.98-4.37), sports injuries (OR 2.79, 95% CI 2.41-3.23) and road traffic collisions (RTCs) (OR 1.91, 95% CI 1.83-2.00) were predictors of spinal fractures/dislocations. Age <45 years (≥45 years OR 0.78-0.90), male gender (female OR 0.78, 95% CI 0.72-0.85), GCS <15 (OR 1.36-1.93), associated chest injury (OR 1.10, 95% CI 1.01-1.20), sports injuries (OR 3.98, 95% CI 3.04-5.21), falls >2 m (OR 3.60, 95% CI 3.21-4.04), RTCs (OR 2.20, 95% CI 1.96-2.46) and shooting (OR 1.91, 95% CI 1.21-3.00) were predictors of spinal cord injury. Multilevel injury was found in 10.4% of fracture/dislocation and in 1.3% of cord injury patients. As spinal trauma occurred in >10% of major trauma patients, aggressive evaluation of the spine is warranted, especially in males, patients <45 years, those with a GCS <15, concomitant chest injury and/or dangerous injury mechanisms (falls >2 m, sports injuries, RTCs and shooting). Diagnostic imaging of the whole spine and a diligent search for associated injuries are essential.
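The odds ratios above come from univariate and multivariable logistic regression. A minimal Python sketch of that modelling step with statsmodels on a synthetic data frame (the column names are placeholders, not TARN variable names):

```python
# Illustrative only: multivariable logistic regression for spinal cord injury.
# The data frame and column names are hypothetical placeholders, not TARN fields.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "sci": rng.integers(0, 2, n),        # spinal cord injury (0/1)
    "age_ge_45": rng.integers(0, 2, n),  # age >= 45 years
    "male": rng.integers(0, 2, n),
    "gcs_lt_15": rng.integers(0, 2, n),
    "fall_gt_2m": rng.integers(0, 2, n),
    "sports": rng.integers(0, 2, n),
    "rtc": rng.integers(0, 2, n),
})

model = smf.logit("sci ~ age_ge_45 + male + gcs_lt_15 + fall_gt_2m + sports + rtc",
                  data=df).fit(disp=False)
odds_ratios = np.exp(model.params)      # exponentiated coefficients = odds ratios
conf_int = np.exp(model.conf_int())     # 95% CIs on the odds-ratio scale
print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))
```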
Abstract:
Semi-natural grasslands, biodiversity hotspots in Central Europe, suffer from the cessation of traditional land-use. The amount and intensity of these changes challenge current monitoring frameworks, which are typically based on classic indicators such as selected target species or diversity indices. Indicators based on plant functional traits provide an interesting extension since they reflect ecological strategies at the individual level and ecological processes at the community level. They typically show convergent responses to gradients of land-use intensity across scales and regions, are more directly related to environmental drivers than diversity components themselves, and enable detecting directional changes in whole-community dynamics. However, probably due to their labor- and cost-intensive assessment in the field, they have rarely been applied as indicators so far. Here we suggest overcoming these limitations by calculating indicators with plant traits derived from online accessible databases. Aiming to provide a minimal trait set to monitor effects of land-use intensification on plant diversity, we investigated relationships between 12 community mean traits, 2 diversity indices and 6 predictors of land-use intensity within grassland communities of 3 different regions in Germany (part of the German ‘Biodiversity Exploratories’ research network). By standardization of traits and diversity measures and the use of null models and linear mixed models we confirmed (i) strong links between functional community composition and plant diversity, (ii) that traits are closely related to land-use intensity, and (iii) that functional indicators are equally or even more sensitive to land-use intensity than traditional diversity indices. The deduced trait set consisted of 5 traits, i.e., specific leaf area (SLA), leaf dry matter content (LDMC), seed release height, leaf distribution, and onset of flowering. These database-derived traits enable the early detection of changes in community structure indicative of future diversity loss. As an addition to current monitoring measures, they allow environmental drivers to be linked more directly to the processes controlling community dynamics.
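The "community mean traits" used as functional indicators here are community-weighted means (CWM): plot-level averages of species trait values weighted by species abundance. A minimal Python sketch with invented cover values and database-style trait values:

```python
# Illustrative community-weighted mean (CWM): abundance-weighted mean trait per plot.
# Species, cover values and SLA trait values are invented; in practice the traits
# would come from an online trait database and the cover from vegetation surveys.
import pandas as pd

abundance = pd.DataFrame(                 # rows = plots, columns = species (% cover)
    {"sp1": [30, 5], "sp2": [50, 70], "sp3": [20, 25]},
    index=["plot_A", "plot_B"],
)
sla = pd.Series({"sp1": 18.0, "sp2": 25.0, "sp3": 31.0}, name="SLA")  # mm2 per mg

rel_abundance = abundance.div(abundance.sum(axis=1), axis=0)  # relative cover per plot
cwm_sla = rel_abundance.mul(sla, axis=1).sum(axis=1)          # one CWM value per plot
print(cwm_sla)
```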
Abstract:
Emphasizing the global and regional importance of mountain ecosystem services and referring to the anticipated future environmental changes affecting the provision of these services, this chapter takes a closer look at the Carpathian Mountains. In addition to climate change and the general effects of globalization, rapid socioeconomic transformations after the fall of the Iron Curtain pose an extra challenge to the sustainable development of the region. Describing the early efforts of organizing mountain science through programs such as UNESCO MAB and UNEP at the global scale, this chapter focuses on the recent history of research coordination for the European mountains, in particular on the activities of the Carpathian Convention and the European Program of the Mountain Research Initiative, which were among the main driving factors for the initiation of the Science for the Carpathians (S4C) network. This regional mountain research network was established in 2008 to foster scientific collaboration and communication and to promote applied research and capacity building, which in turn would support sustainable development in the Carpathian Mountains. Forum Carpaticum, a biennial open science conference, has become a central activity of the S4C network, which today counts more than 400 members.
Abstract:
Today, the densely populated Latin American metropolitan areas are characterized by a complex dynamic of urban social structure. Spatial and sociocultural segregation processes, triggered by political and economic restructuring and reinforced by the influences of globalization, have led to a pronounced fragmentation of urban space. Social exclusion on the one hand and deliberate self-segregation on the other reveal, within the individual and increasingly separated neighborhoods, a drive towards homogenization and the concentration of population groups of similar socioeconomic status. In the course of these urban transformations, identification with a particular place within the city takes on ever greater importance. This spatial behavior is part of a lifestyle understood as a holistic and multilayered phenomenon (Auer 2007: 11), which is expressed not only through attachment to place but also through sociocultural behavioral patterns, consumption behavior and, finally, language use. Linguistic varieties or features, as semiotic elements, together with other factors constitute the lifestyle of social groups in space and thus form both a central component and a medium of socio-spatial identity construction. Differences in language use are perceived as part of a (spatial) identity, are evaluated in relation to the speakers' representations and mental images of urban space, and are ultimately stylized. They therefore play a central role in expressing demarcation from or belonging to a social group and, against the historical and geopolitical background of a city, shed light on its social structure. Starting from this constructivist understanding of space as a multidimensional social product in the sense of Lefebvre (1974), theoretical considerations and methodological approaches for investigating the relationship between linguistic variation and socio-spatial segregation are presented using the example of Buenos Aires, which is paradigmatic for the development of Latin American cities.
Abstract:
BACKGROUND: Decisions regarding whether to administer intensive care to extremely premature infants are often based on gestational age alone. However, other factors also affect the prognosis for these patients. METHODS: We prospectively studied a cohort of 4446 infants born at 22 to 25 weeks' gestation (determined on the basis of the best obstetrical estimate) in the Neonatal Research Network of the National Institute of Child Health and Human Development to relate risk factors assessable at or before birth to the likelihood of survival, survival without profound neurodevelopmental impairment, and survival without neurodevelopmental impairment at a corrected age of 18 to 22 months. RESULTS: Among study infants, 3702 (83%) received intensive care in the form of mechanical ventilation. Among the 4192 study infants (94%) for whom outcomes were determined at 18 to 22 months, 49% died, 61% died or had profound impairment, and 73% died or had impairment. In multivariable analyses of infants who received intensive care, exposure to antenatal corticosteroids, female sex, singleton birth, and higher birth weight (per each 100-g increment) were each associated with reductions in the risk of death and the risk of death or profound or any neurodevelopmental impairment; these reductions were similar to those associated with a 1-week increase in gestational age. At the same estimated likelihood of a favorable outcome, girls were less likely than boys to receive intensive care. The outcomes for infants who underwent ventilation were better predicted with the use of the above factors than with use of gestational age alone. CONCLUSIONS: The likelihood of a favorable outcome with intensive care can be better estimated by consideration of four factors in addition to gestational age: sex, exposure or nonexposure to antenatal corticosteroids, whether single or multiple birth, and birth weight. (ClinicalTrials.gov numbers NCT00063063 and NCT00009633.)
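Conceptually, the five factors are combined into a single estimated likelihood of a favorable outcome, as in a logistic risk model. A minimal Python sketch of how such a score works in principle; the coefficients are invented for illustration and do not reproduce the published NICHD Neonatal Research Network estimator:

```python
# Illustrative only: a logistic risk score combining gestational age with the four
# additional factors. Coefficients are invented for demonstration and do NOT
# reproduce the NICHD Neonatal Research Network outcome estimator.
import math

def favorable_outcome_probability(gestational_age_weeks, female, antenatal_steroids,
                                  singleton, birth_weight_g):
    logit = (-18.0
             + 0.55 * gestational_age_weeks
             + 0.45 * female
             + 0.50 * antenatal_steroids
             + 0.30 * singleton
             + 0.004 * birth_weight_g)   # hypothetical log-odds contributions
    return 1.0 / (1.0 + math.exp(-logit))

print(favorable_outcome_probability(24, female=1, antenatal_steroids=1,
                                    singleton=1, birth_weight_g=650))
```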
Evolution of capital cities economies: Towards a knowledge intensive and thus more resilient economy
Abstract:
The aim of this study was to improve cage systems for maintaining adult honey bee (Apis mellifera L.) workers under in vitro laboratory conditions. To achieve this goal, we experimentally evaluated the impact of different cages, developed by scientists of the international research network COLOSS (Prevention of honey bee COlony LOSSes), on the physiology and survival of honey bees. We identified three cages that promoted good survival of honey bees. The bees from cages that exhibited greater survival had relatively lower titers of deformed wing virus, suggesting that deformed wing virus is a significant marker reflecting stress level and health status of the host. We also determined that a leak- and drip-proof feeder was an integral part of a cage system and that a feeder modified from a 20-ml plastic syringe displayed the best result in providing a steady food supply to bees. Finally, we also demonstrated that the addition of protein to the bees' diet could significantly increase the level of vitellogenin gene expression and improve bees' survival. This international collaborative study represents a critical step toward improvement of cage designs and feeding regimes for honey bee laboratory experiments.
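Cage comparisons of this kind typically rest on survival curves and a log-rank comparison. A minimal Python sketch with the lifelines package on invented per-bee survival data (cage labels, durations and censoring flags are hypothetical):

```python
# Illustrative Kaplan-Meier comparison of worker survival in two cage types.
# Durations (days) and event flags are invented for demonstration.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

cage_a = pd.DataFrame({"days": [5, 8, 12, 15, 20, 21, 21], "dead": [1, 1, 1, 1, 1, 0, 0]})
cage_b = pd.DataFrame({"days": [3, 4, 6, 7, 9, 11, 14], "dead": [1, 1, 1, 1, 1, 1, 1]})

kmf = KaplanMeierFitter()
kmf.fit(cage_a["days"], event_observed=cage_a["dead"], label="cage A")
print(kmf.median_survival_time_)

result = logrank_test(cage_a["days"], cage_b["days"],
                      event_observed_A=cage_a["dead"], event_observed_B=cage_b["dead"])
print(result.p_value)
```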
Abstract:
BACKGROUND Long-term hormone therapy has been the standard of care for advanced prostate cancer since the 1940s. STAMPEDE is a randomised controlled trial using a multiarm, multistage platform design. It recruits men with high-risk, locally advanced, metastatic or recurrent prostate cancer who are starting first-line long-term hormone therapy. We report primary survival results for three research comparisons testing the addition of zoledronic acid, docetaxel, or their combination to standard of care versus standard of care alone. METHODS Standard of care was hormone therapy for at least 2 years; radiotherapy was encouraged for men with N0M0 disease to November, 2011, then mandated; radiotherapy was optional for men with node-positive non-metastatic (N+M0) disease. Stratified randomisation (via minimisation) allocated men 2:1:1:1 to standard of care only (SOC-only; control), standard of care plus zoledronic acid (SOC + ZA), standard of care plus docetaxel (SOC + Doc), or standard of care with both zoledronic acid and docetaxel (SOC + ZA + Doc). Zoledronic acid (4 mg) was given for six 3-weekly cycles, then 4-weekly until 2 years, and docetaxel (75 mg/m²) for six 3-weekly cycles with prednisolone 10 mg daily. There was no blinding to treatment allocation. The primary outcome measure was overall survival. Pairwise comparisons of research versus control had 90% power at 2·5% one-sided α for a hazard ratio (HR) of 0·75, requiring roughly 400 control arm deaths. Statistical analyses were undertaken with standard log-rank-type methods for time-to-event data, with hazard ratios (HRs) and 95% CIs derived from adjusted Cox models. This trial is registered at ClinicalTrials.gov (NCT00268476) and ControlledTrials.com (ISRCTN78818544). FINDINGS 2962 men were randomly assigned to four groups between Oct 5, 2005, and March 31, 2013. Median age was 65 years (IQR 60-71). 1817 (61%) men had M+ disease, 448 (15%) had N+/X M0, and 697 (24%) had N0M0. 165 (6%) men were previously treated with local therapy, and median prostate-specific antigen was 65 ng/mL (IQR 23-184). Median follow-up was 43 months (IQR 30-60). There were 415 deaths in the control group (347 [84%] due to prostate cancer). Median overall survival was 71 months (IQR 32 to not reached) for SOC-only, not reached (32 to not reached) for SOC + ZA (HR 0·94, 95% CI 0·79-1·11; p=0·450), 81 months (41 to not reached) for SOC + Doc (0·78, 0·66-0·93; p=0·006), and 76 months (39 to not reached) for SOC + ZA + Doc (0·82, 0·69-0·97; p=0·022). There was no evidence of heterogeneity in treatment effect (for any of the treatments) across prespecified subsets. Grade 3-5 adverse events were reported for 399 (32%) patients receiving SOC, 197 (32%) receiving SOC + ZA, 288 (52%) receiving SOC + Doc, and 269 (52%) receiving SOC + ZA + Doc. INTERPRETATION Zoledronic acid showed no evidence of survival improvement and should not be part of standard of care for this population. Docetaxel chemotherapy, given at the time of long-term hormone therapy initiation, showed evidence of improved survival accompanied by an increase in adverse events. Docetaxel treatment should become part of standard of care for adequately fit men commencing long-term hormone therapy. FUNDING Cancer Research UK, Medical Research Council, Novartis, Sanofi-Aventis, Pfizer, Janssen, Astellas, NIHR Clinical Research Network, Swiss Group for Clinical Cancer Research.
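The hazard ratios reported above come from log-rank-type analyses with adjusted Cox models. A minimal Python sketch of the Cox step with the lifelines package on a synthetic data set (variables and values are placeholders and do not reflect STAMPEDE data):

```python
# Illustrative only: Cox proportional hazards model for overall survival by treatment.
# The data frame is synthetic and does not reflect STAMPEDE data or results.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "months": rng.exponential(60, n),     # follow-up time in months
    "death": rng.integers(0, 2, n),       # event indicator
    "docetaxel": rng.integers(0, 2, n),   # 1 = SOC + Doc, 0 = SOC only
    "metastatic": rng.integers(0, 2, n),  # adjustment covariate
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="death")
cph.print_summary()   # hazard ratios (exp(coef)) with 95% CIs
```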
Abstract:
The ATLS program of the American College of Surgeons is probably the most important globally active training organization dedicated to improving trauma management. Detection of acute haemorrhagic shock belongs to the key issues in clinical practice and thus also in medical teaching. In this issue of the journal, William Schulz and Ian McConachrie critically review the ATLS shock classification (Table 1), which has been criticized after several attempts at validation have failed [1]. The main problem is that distinct ranges of heart rate are related to ranges of uncompensated blood loss, and that the heart rate decrease observed in severe haemorrhagic shock is ignored [2].

Table 1. Estimated blood loss based on the patient's initial presentation (ATLS Student Course Manual, 9th Edition, American College of Surgeons 2012); a code sketch of this lookup appears after the editorial.
Blood loss (ml): Class I up to 750; Class II 750–1500; Class III 1500–2000; Class IV >2000
Blood loss (% blood volume): Class I up to 15%; Class II 15–30%; Class III 30–40%; Class IV >40%
Pulse rate (bpm): Class I <100; Class II 100–120; Class III 120–140; Class IV >140
Systolic blood pressure: Class I normal; Class II normal; Class III decreased; Class IV decreased
Pulse pressure: Class I normal or increased; Class II decreased; Class III decreased; Class IV decreased
Respiratory rate: Class I 14–20; Class II 20–30; Class III 30–40; Class IV >35
Urine output (ml/h): Class I >30; Class II 20–30; Class III 5–15; Class IV negligible
CNS/mental status: Class I slightly anxious; Class II mildly anxious; Class III anxious, confused; Class IV confused, lethargic
Initial fluid replacement: Class I crystalloid; Class II crystalloid; Class III crystalloid and blood; Class IV crystalloid and blood

In a retrospective evaluation of the Trauma Audit and Research Network (TARN) database, blood loss was estimated according to the injuries in nearly 165,000 adult trauma patients and each patient was allocated to one of the four ATLS shock classes [3]. Although heart rate increased and systolic blood pressure decreased from class I to class IV, respiratory rate and GCS were similar. The median heart rate in class IV patients was substantially lower than the value of 140 min−1 postulated by ATLS. Moreover, deterioration of the different parameters does not necessarily proceed in parallel as suggested by the ATLS shock classification [4] and [5]. In all these studies, injury severity score (ISS) and mortality increased with increasing shock class [3] and with increasing heart rate and decreasing blood pressure [4] and [5]. This supports the general concept that the higher the heart rate and the lower the blood pressure, the sicker the patient. A prospective study attempted to validate a shock classification derived from the ATLS shock classes [6]. The authors used a combination of heart rate, blood pressure, clinically estimated blood loss and response to fluid resuscitation to classify trauma patients (Table 2) [6]. In their initial assessment of 715 predominantly blunt trauma patients, 78% were classified as normal (Class 0), 14% as Class I, 6% as Class II and only 1% as Class III and Class IV, respectively. This corresponds to the results from the previous retrospective studies [4] and [5]. The main endpoint used in the prospective study was therefore the presence or absence of significant haemorrhage, defined as chest tube drainage >500 ml, evidence of >500 ml of blood loss in the peritoneum, retroperitoneum or pelvic cavity on CT scan, or requirement of any blood transfusion or >2000 ml of crystalloid. Because of the low prevalence of class II or higher grades, statistical evaluation was limited to a comparison between Class 0 and Class I–IV combined.
As in the retrospective studies, Lawton did not find a statistically significant difference in heart rate or blood pressure among the five groups either, although there was a tendency towards a higher heart rate in Class II patients. Apparently, classification during the primary survey did not rely on vital signs but considered the rather soft criterion of "clinical estimation of blood loss" and the requirement for fluid substitution. This suggests that allocation of an individual patient to a shock class was probably more an intuitive decision than an objective calculation based on the shock classification. Nevertheless, it was a significant predictor of ISS [6].

Table 2. Shock grade categories in the prospective validation study (Lawton, 2014) [6].
Normal (no haemorrhage): vital signs normal; response to fluid bolus (1000 ml) not applicable; estimated blood loss none
Class I (mild): vital signs normal; responds to fluid bolus, no further fluid required; estimated blood loss up to 750 ml
Class II (moderate): HR >100 with SBP >90 mmHg; responds to fluid bolus, no further fluid required; estimated blood loss 750–1500 ml
Class III (severe): SBP <90 mmHg; requires repeated fluid boluses; estimated blood loss 1500–2000 ml
Class IV (moribund): SBP <90 mmHg or imminent arrest; declining SBP despite fluid boluses; estimated blood loss >2000 ml

What does this mean for clinical practice and medical teaching? All these studies illustrate the difficulty of validating a useful and accepted general physiologic concept of the response of the organism to fluid loss: a decrease in cardiac output, an increase in heart rate and a decrease in pulse pressure occurring first, with hypotension and bradycardia occurring only later. Increasing heart rate, increasing diastolic blood pressure or decreasing systolic blood pressure should make any clinician consider hypovolaemia first, because it is treatable and deterioration of the patient is preventable. This is true for the patient on the ward, the sedated patient in the intensive care unit or the anaesthetized patient in the OR. We will therefore continue to teach this typical pattern but will continue to mention the exceptions and pitfalls at a second stage. The shock classification of ATLS is primarily used to illustrate the typical pattern of acute haemorrhagic shock (tachycardia and hypotension) as opposed to the Cushing reflex (bradycardia and hypertension) in severe head injury and intracranial hypertension, or to neurogenic shock in acute tetraplegia or high paraplegia (relative bradycardia and hypotension). Schulz and McConachrie nicely summarize the various confounders and exceptions to the general pattern and explain why, in clinical reality, patients often do not present with the "typical" pictures of our textbooks [1]. ATLS refers to the pitfalls in the signs of acute haemorrhage as well: advanced age, athletes, pregnancy, medications and pacemakers, and explicitly states that individual subjects may not follow the general pattern. Obviously the ATLS shock classification, which is the basis for a number of questions in the written test of the ATLS student course and which has been used for decades, probably needs modification and cannot be applied literally in clinical practice. The European Trauma Course, another important trauma training program, uses the same parameters to estimate blood loss together with clinical examination and laboratory findings (e.g. base deficit and lactate) but does not use a shock classification related to absolute values. In conclusion, the typical physiologic response to haemorrhage as illustrated by the ATLS shock classes remains an important issue in clinical practice and in teaching.
The estimation of the severity of haemorrhage in the initial assessment of trauma patients is not (and never was) based solely on vital signs, but includes the pattern of injuries, the requirement for fluid substitution and potential confounders. Vital signs are not obsolete, especially in the course of treatment, but must be interpreted in view of the clinical context. Conflict of interest None declared. Member of the Swiss national ATLS core faculty.
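Read as a lookup, ATLS Table 1 maps an estimated blood loss (or a corresponding set of vital signs) onto one of four classes. A minimal Python sketch of that mapping by estimated blood loss alone, purely for illustration; as the editorial stresses, real assessment must weigh vital signs, injury pattern, fluid response and the confounders discussed above:

```python
# Illustrative mapping of estimated blood loss onto the ATLS class of Table 1.
# Real assessment combines vital signs, injury pattern, fluid response and the
# confounders discussed in the editorial; this lookup ignores all of them.
def atls_class_from_blood_loss(blood_loss_ml: float) -> str:
    if blood_loss_ml <= 750:
        return "Class I"
    if blood_loss_ml <= 1500:
        return "Class II"
    if blood_loss_ml <= 2000:
        return "Class III"
    return "Class IV"

for loss in (500, 1200, 1800, 2500):
    print(loss, atls_class_from_blood_loss(loss))
```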
Abstract:
OBJECTIVE To illustrate an approach to compare CD4 cell count and HIV-RNA monitoring strategies in HIV-positive individuals on antiretroviral therapy (ART). DESIGN Prospective studies of HIV-positive individuals in Europe and the USA in the HIV-CAUSAL Collaboration and The Center for AIDS Research Network of Integrated Clinical Systems. METHODS Antiretroviral-naive individuals who initiated ART and became virologically suppressed within 12 months were followed from the date of suppression. We compared three CD4 cell count and HIV-RNA monitoring strategies: once every (1) 3 ± 1 months, (2) 6 ± 1 months, and (3) 9-12 ± 1 months. We used inverse-probability weighted models to compare these strategies with respect to clinical, immunologic, and virologic outcomes. RESULTS In 39,029 eligible individuals, there were 265 deaths and 690 AIDS-defining illnesses or deaths. Compared with the 3-month strategy, the mortality hazard ratios (95% CIs) were 0.86 (0.42 to 1.78) for the 6-month strategy and 0.82 (0.46 to 1.47) for the 9-12-month strategy. The respective 18-month risk ratios (95% CIs) of virologic failure (HIV-RNA >200 copies/mL) were 0.74 (0.46 to 1.19) and 2.35 (1.56 to 3.54), and the 18-month mean CD4 differences (95% CIs) were -5.3 (-18.6 to 7.9) and -31.7 (-52.0 to -11.3) cells/μL. The estimates for the 2-year risk of AIDS-defining illness or death were similar across strategies. CONCLUSIONS Our findings suggest that the monitoring frequency of virologically suppressed individuals can be decreased from every 3 months to every 6, 9, or 12 months with respect to clinical outcomes. Because the effects of different monitoring strategies could take years to materialize, longer follow-up is needed to fully evaluate this question.
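The strategy comparison relies on inverse-probability weighting to adjust for the factors that determine which monitoring strategy a person follows. A minimal Python sketch of the basic weighting step at a single time point, with a synthetic data frame (column names are assumptions, not the HIV-CAUSAL/CNICS variables):

```python
# Illustrative inverse-probability weighting at a single time point: weight each
# person by the inverse of the estimated probability of the monitoring strategy
# they actually followed. Synthetic data; not the HIV-CAUSAL/CNICS analysis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({
    "six_monthly": rng.integers(0, 2, n),      # 1 = monitored every 6 months, 0 = every 3
    "age": rng.normal(40, 10, n),
    "baseline_cd4": rng.normal(500, 150, n),
})

ps_model = smf.logit("six_monthly ~ age + baseline_cd4", data=df).fit(disp=False)
ps = ps_model.predict(df)                      # estimated probability of 6-monthly strategy
p_marginal = df["six_monthly"].mean()
df["stabilized_weight"] = np.where(df["six_monthly"] == 1,
                                   p_marginal / ps,
                                   (1 - p_marginal) / (1 - ps))
print(df["stabilized_weight"].describe())
```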
Abstract:
production, during the summer of 2010. This farm is integrated into the Spanish research network for sugar beet development (AIMCRA), which, regarding irrigation, focuses on maximizing water saving and cost reduction. According to AIMCRA's perspective on promoting irrigation best practices, it is essential to understand the soil response to irrigation, i.e. the maximum irrigation length for each soil infiltration capacity. The use of humidity sensors provides a foundation for addressing the soil's behavior during irrigation events and, therefore, for establishing the boundaries regarding irrigation length and irrigation interval. In order to understand to what extent the farmer's performance at the Tordesillas farm could have been potentially improved, this study aims to determine suitable irrigation lengths and intervals for the given soil properties and evapotranspiration rates. For this purpose, several humidity sensors were installed: (1) a Frequency Domain Reflectometry (FDR) EnviroScan probe taking readings at 10, 20, 40 and 60 cm depth and (2) different Time Domain Reflectometry (TDR) Echo 2 and Cr200 probes buried in a 50 cm x 30 cm x 50 cm pit and placed along the walls at 10, 20, 30 and 40 cm depth. Moreover, in order to define soil properties, a textural analysis of the Tordesillas farm soil was conducted. Data from the Tordesillas meteorological station were also utilized.
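One simple way to turn such moisture readings into an irrigation-length recommendation is to estimate the depth of water needed to return the monitored profile to field capacity. A minimal Python sketch under assumed field-capacity, reading and layer-thickness values (all numbers are illustrative and not calibrated for the Tordesillas soil):

```python
# Illustrative calculation of the net irrigation depth (mm) needed to refill the
# monitored profile to field capacity from volumetric moisture readings.
# Field capacity, readings and layer thicknesses are assumed values.
field_capacity = 0.28                                        # volumetric water content (m3/m3)
current_theta = {10: 0.18, 20: 0.20, 40: 0.23, 60: 0.26}     # depth (cm) -> reading (m3/m3)
layer_thickness_cm = {10: 15, 20: 15, 40: 20, 60: 20}        # soil layer each sensor represents

deficit_mm = sum(
    max(field_capacity - current_theta[d], 0.0) * layer_thickness_cm[d] * 10  # cm -> mm
    for d in current_theta
)
print(f"Net irrigation depth to return to field capacity: {deficit_mm:.1f} mm")
```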