47 results for Multiple-use forestry
Abstract:
OBJECTIVE: To determine the accuracy of magnetic resonance imaging criteria for the early diagnosis of multiple sclerosis in patients with suspected disease. DESIGN: Systematic review. DATA SOURCES: 12 electronic databases, citation searches, and reference lists of included studies. REVIEW METHODS: Studies of diagnostic accuracy that compared magnetic resonance imaging, or diagnostic criteria incorporating such imaging, with a reference standard for the diagnosis of multiple sclerosis. RESULTS: 29 studies (18 cohort studies, 11 other designs) were included. On average, studies of other designs (mainly diagnostic case-control studies) produced higher estimated diagnostic odds ratios than did cohort studies. Among 15 studies of higher methodological quality (cohort design, clinical follow-up as reference standard), those with longer follow-up produced higher estimates of specificity and lower estimates of sensitivity. Only two such studies followed patients for more than 10 years. Even in the presence of many lesions (> 10 or > 8), magnetic resonance imaging could not accurately rule multiple sclerosis in (likelihood ratio of a positive test result 3.0 and 2.0, respectively). Similarly, the absence of lesions was of limited utility in ruling out a diagnosis of multiple sclerosis (likelihood ratio of a negative test result 0.1 and 0.5). CONCLUSIONS: Many evaluations of the accuracy of magnetic resonance imaging for the early detection of multiple sclerosis have produced inflated estimates of test performance owing to methodological weaknesses. Use of magnetic resonance imaging to confirm multiple sclerosis on the basis of a single attack of neurological dysfunction may lead to over-diagnosis and over-treatment.
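As a reading aid (not part of the study), the likelihood ratios reported above can be converted into post-test probabilities with Bayes' theorem in odds form; the worked example below assumes an illustrative pre-test probability of 50%:

\[
\text{post-test odds} = \text{pre-test odds} \times LR, \qquad \text{pre-test odds} = \frac{0.5}{1-0.5} = 1
\]
\[
LR^{+} = 3.0:\quad \text{post-test probability} = \frac{1 \times 3.0}{1 + 1 \times 3.0} = 0.75, \qquad LR^{-} = 0.1:\quad \frac{1 \times 0.1}{1 + 1 \times 0.1} \approx 0.09
\]

Even the strongest positive result reported (LR+ = 3.0) moves a 50% prior only to 75%, which is why the authors conclude that imaging alone cannot reliably rule multiple sclerosis in.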
Abstract:
BACKGROUND: Epidemiological data for south Asian children in the United Kingdom are contradictory, showing a lower prevalence of wheeze, but a higher rate of medical consultations and admissions for asthma compared with white children. These studies have not distinguished different asthma phenotypes or controlled for varying environmental exposures. OBJECTIVE: To compare the prevalence of wheeze and related health-service use in south Asian and white preschool children in the United Kingdom, taking into account wheeze phenotype (viral and multiple wheeze) and environmental exposures. METHODS: A postal questionnaire was completed by parents of a population-based sample of 4366 white and 1714 south Asian children aged 1-4 years in Leicestershire, UK. Children were classified as having viral wheeze or multiple trigger wheeze. RESULTS: The prevalence of current wheeze was 35.6% in white and 25.5% in south Asian 1-year-olds (P<0.001), and 21.9% and 20.9%, respectively, in children aged 2-4 years. Odds ratios (ORs) (95% confidence interval) for multiple wheeze and for viral wheeze, comparing south Asian with white children, were 2.21 (1.19-4.09) and 1.43 (0.77-2.65) in 2-4-year-olds after controlling for socio-economic conditions, environmental exposures and family history. In 1-year-olds, the respective ORs for multiple and viral wheeze were 0.66 (0.47-0.92) and 0.81 (0.64-1.03). Reported GP consultation rates for wheeze and hospital admissions were greater in south Asian children aged 2-4 years, even after adjustment for severity, but the use of inhaled corticosteroids was lower. CONCLUSIONS: South Asian 2-4-year-olds are more likely than white children to have multiple wheeze (a condition with many features of chronic atopic asthma), after taking into account ethnic differences in exposure to some environmental agents. Undertreatment with inhaled corticosteroids might partly explain their greater use of health services.
Abstract:
The IkappaB kinase (IKK) complex controls processes such as inflammation, immune responses, cell survival and the proliferation of both normal and tumor cells. By activating NF-kappaB, the IKK complex contributes to the G1/S transition, and first evidence has been presented that IKKalpha also regulates entry into mitosis. At what stage IKK is required, and whether IKK also contributes to progression through mitosis and cytokinesis, has not yet been determined, however. In this study, we use BMS-345541, a potent allosteric small-molecule inhibitor of IKK, to inhibit IKK specifically during G2 and during mitosis. We show that BMS-345541 affects several mitotic cell cycle transitions, including mitotic entry, prometaphase-to-anaphase progression and cytokinesis. Adding BMS-345541 to cells released from an S-phase arrest blocked the activation of Aurora A, B and C, the activation of Cdk1 and the phosphorylation of histone H3. Additionally, treatment of mitotic cells with BMS-345541 resulted in precocious cyclin B1 and securin degradation, defective chromosome separation and improper cytokinesis. BMS-345541 was also found to override the spindle checkpoint in nocodazole-arrested cells. In vitro kinase assays using BMS-345541 indicate that these effects are not primarily due to a direct inhibitory effect of BMS-345541 on mitotic kinases such as Cdk1, Aurora A or B, Plk1 or NEK2. This study points towards a new potential role of IKK in cell cycle progression. Since deregulation of the cell cycle is one of the hallmarks of tumor formation and progression, the newly discovered level of BMS-345541 function could be useful for cell cycle control studies and may provide valuable clues for the design of future therapeutics.
Abstract:
BACKGROUND: A growing number of case reports have described tenofovir (TDF)-related proximal renal tubulopathy and impaired calculated glomerular filtration rates (cGFR). We assessed TDF-associated changes in cGFR in a large observational HIV cohort. METHODS: We compared treatment-naive patients, or patients with treatment interruptions > or = 12 months, starting either a TDF-based combination antiretroviral therapy (cART) (n = 363) or a TDF-sparing regimen (n = 715). The predefined primary endpoint was the time to a 10 ml/min reduction in cGFR, based on the Cockcroft-Gault equation, confirmed by a follow-up measurement at least 1 month later. In sensitivity analyses, secondary endpoints, including calculations based on the Modification of Diet in Renal Disease (MDRD) formula, were considered. Endpoints were modelled using pre-specified covariates in a multiple Cox proportional hazards model. RESULTS: Two-year event-free probabilities were 0.65 (95% confidence interval [CI] 0.58-0.72) and 0.80 (95% CI 0.76-0.83) for patients starting TDF-containing or TDF-sparing cART, respectively. In the multiple Cox model, diabetes mellitus (hazard ratio [HR] = 2.34 [95% CI 1.24-4.42]), higher baseline cGFR (HR = 1.03 [95% CI 1.02-1.04] by 10 ml/min), TDF use (HR = 1.84 [95% CI 1.35-2.51]) and boosted protease inhibitor use (HR = 1.71 [95% CI 1.30-2.24]) significantly increased the risk for reaching the primary endpoint. Sensitivity analyses showed high consistency. CONCLUSION: There is consistent evidence for a significant reduction in cGFR associated with TDF use in HIV-infected patients. Our findings call for strict monitoring of renal function in long-term TDF users with tests that distinguish between glomerular dysfunction and proximal renal tubulopathy, a known adverse effect of TDF.
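For orientation only, the Cockcroft-Gault calculation underlying cGFR, and a toy check of the study's confirmed-decline endpoint, can be sketched in Python as follows; the function names and the measurement format are our own assumptions, not the cohort's analysis code:

def cockcroft_gault(age_years, weight_kg, creatinine_mg_dl, female):
    """Creatinine clearance (ml/min) by the Cockcroft-Gault equation."""
    crcl = (140 - age_years) * weight_kg / (72.0 * creatinine_mg_dl)
    return 0.85 * crcl if female else crcl

def first_confirmed_decline(cgfr_by_day, baseline, drop=10.0, confirm_days=30):
    """Day of the first cGFR reduction >= `drop` ml/min from baseline that is
    confirmed by a second reduced value at least `confirm_days` later."""
    lows = sorted(day for day, value in cgfr_by_day.items() if baseline - value >= drop)
    for i, day in enumerate(lows):
        if any(later - day >= confirm_days for later in lows[i + 1:]):
            return day
    return None

# Example: a 45-year-old, 70 kg male with creatinine 1.0 mg/dl -> ~92 ml/min.
print(cockcroft_gault(45, 70, 1.0, female=False))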
Abstract:
Research in autophagy continues to accelerate,(1) and as a result many new scientists are entering the field. Accordingly, it is important to establish a standard set of criteria for monitoring macroautophagy in different organisms. Recent reviews have described the range of assays that have been used for this purpose.(2,3) There are many useful and convenient methods that can be used to monitor macroautophagy in yeast, but relatively few in other model systems, and there is much confusion regarding acceptable methods to measure macroautophagy in higher eukaryotes. A key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers of autophagosomes versus those that measure flux through the autophagy pathway; thus, a block in macroautophagy that results in autophagosome accumulation needs to be differentiated from fully functional autophagy that includes delivery to, and degradation within, lysosomes (in most higher eukaryotes) or the vacuole (in plants and fungi). Here, we present a set of guidelines for the selection and interpretation of the methods that can be used by investigators who are attempting to examine macroautophagy and related processes, as well as by reviewers who need to provide realistic and reasonable critiques of papers that investigate these processes. This set of guidelines is not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to verify an autophagic response.
Abstract:
OBJECTIVE: To test the feasibility of and interactions among three software-driven critical care protocols. DESIGN: Prospective cohort study. SETTING: Intensive care units in six European and American university hospitals. PATIENTS: 174 cardiac surgery and 41 septic patients. INTERVENTIONS: Application of software-driven protocols for cardiovascular management, sedation, and weaning during the first 7 days of intensive care. MEASUREMENTS AND RESULTS: All protocols were used simultaneously in 85% of the cardiac surgery and 44% of the septic patients, and any one of the protocols was in use for 73% and 44% of the study duration, respectively. Protocol use was discontinued in 12% of patients by the treating clinician and in 6% for technical/administrative reasons. The number of protocol steps per unit of time was similar in the two diagnostic groups (n.s. for all protocols). Initial hemodynamic stability (a protocol target) was achieved in 26 +/- 18 min (mean +/- SD) in cardiac surgery and in 24 +/- 18 min in septic patients. Sedation targets were reached in 2.4 +/- 0.2 h in cardiac surgery and in 3.6 +/- 0.2 h in septic patients. The weaning protocol was started in 164 (94%; 154 extubated) cardiac surgery and in 25 (60%; 9 extubated) septic patients. The median (interquartile range) time from starting weaning to extubation (a protocol target) was 89 (44-154) min for the cardiac surgery patients and 96 (56-205) min for the septic patients. CONCLUSIONS: Multiple software-driven treatment protocols can be applied simultaneously with high acceptance and rapid achievement of primary treatment goals. Time to reach these primary goals may provide a performance indicator.
Abstract:
Cell death is essential for a plethora of physiological processes, and its deregulation characterizes numerous human diseases. Thus, the in-depth investigation of cell death and its mechanisms constitutes a formidable challenge for fundamental and applied biomedical research, and has tremendous implications for the development of novel therapeutic strategies. It is, therefore, of utmost importance to standardize the experimental procedures that identify dying and dead cells in cell cultures and/or in tissues, from model organisms and/or humans, in healthy and/or pathological scenarios. Thus far, dozens of methods have been proposed to quantify cell death-related parameters. However, no guidelines exist regarding their use and interpretation, and nobody has thoroughly annotated the experimental settings for which each of these techniques is most appropriate. Here, we provide a nonexhaustive comparison of methods to detect cell death with apoptotic or nonapoptotic morphologies, their advantages and pitfalls. These guidelines are intended for investigators who study cell death, as well as for reviewers who need to constructively critique scientific reports that deal with cellular demise. Given the difficulties in determining the exact number of cells that have passed the point-of-no-return of the signaling cascades leading to cell death, we emphasize the importance of performing multiple, methodologically unrelated assays to quantify dying and dead cells.
Abstract:
Through the concerted evaluations of thousands of commercial substances for the qualities of persistence, bioaccumulation, and toxicity as a result of the United Nations Environment Program's Stockholm Convention, it has become apparent that fewer empirical data are available on bioaccumulation than on other endpoints and that bioaccumulation models were not designed to accommodate all chemical classes. Given the number of chemicals that may require further assessment, in vivo testing is cost prohibitive and discouraged because of the large number of animals needed. Although in vitro systems are less developed and characterized for fish, multiple high-throughput in vitro assays have been used to explore the dietary uptake and elimination of pharmaceuticals and other xenobiotics by mammals. While similar processes determine bioaccumulation in mammalian species, a review of methods to measure chemical bioavailability in fish screening systems, such as chemical biotransformation or metabolism in tissue slices, perfused tissues, fish embryos, primary and immortalized cell lines, and subcellular fractions, suggests that quantitative and qualitative differences between fish and mammals exist. Using in vitro data in assessments for whole organisms or populations requires certain considerations and assumptions to scale data from a test tube to a fish, and across fish species. Also, different models may incorporate the predominant site of metabolism, such as the liver, and significant presystemic metabolism by the gill or gastrointestinal system to help accurately convert in vitro data into representative whole-animal metabolism and subsequent bioaccumulation potential. The development of animal-alternative tests for fish bioaccumulation assessment is framed in the context of in vitro data requirements for regulatory assessments in Europe and Canada.
Abstract:
BACKGROUND A newly developed collagen matrix (CM) of porcine origin has been shown to represent a potential alternative to palatal connective tissue grafts (CTG) for the treatment of single Miller Class I and II gingival recessions when used in conjunction with a coronally advanced flap (CAF). However, at present it remains unknown to what extent CM may represent a valuable alternative to CTG in the treatment of Miller Class I and II multiple adjacent gingival recessions (MAGR). The aim of this study was to compare the clinical outcomes following treatment of Miller Class I and II MAGR using the modified coronally advanced tunnel technique (MCAT) in conjunction with either CM or CTG. METHODS Twenty-two patients with a total of 156 Miller Class I and II gingival recessions were included in this study. Recessions were randomly treated according to a split-mouth design by means of MCAT + CM (test) or MCAT + CTG (control). The following measurements were recorded at baseline (i.e. prior to surgery) and at 12 months: Gingival Recession Depth (GRD), Probing Pocket Depth (PD), Clinical Attachment Level (CAL), Keratinized Tissue Width (KTW), Gingival Recession Width (GRW) and Gingival Thickness (GT). GT was measured 3 mm apical to the gingival margin. Patient acceptance was recorded using a Visual Analogue Scale (VAS). The primary outcome variable was Complete Root Coverage (CRC); secondary outcomes were Mean Root Coverage (MRC), change in KTW, GT, patient acceptance and duration of surgery. RESULTS Healing was uneventful in both groups. No adverse reactions at any of the sites were observed. At 12 months, both treatments resulted in statistically significant improvements of CRC, MRC, KTW and GT compared with baseline (p < 0.05). CRC was found at 42% of test sites and at 85% of control sites (p < 0.05). MRC measured 71 ± 21% at test sites versus 90 ± 18% at control sites (p < 0.05). Mean KTW measured 2.4 ± 0.7 mm at test sites versus 2.7 ± 0.8 mm at control sites (p > 0.05). At test sites, GT values changed from 0.8 ± 0.2 to 1.0 ± 0.3 mm, and at control sites from 0.8 ± 0.3 to 1.3 ± 0.4 mm (p < 0.05). Duration of surgery and patient morbidity were statistically significantly lower in the test group than in the control group (p < 0.05). CONCLUSIONS The present findings indicate that the use of CM may represent an alternative to CTG by reducing surgical time and patient morbidity, but yielded lower CRC than CTG in the treatment of Miller Class I and II MAGR when used in conjunction with MCAT.
Abstract:
Wireless Multimedia Sensor Networks (WMSNs) promise a wide scope of emerging potential applications in both civilian and military areas, which require visual and audio information to enhance the level of collected information. The transmission of multimedia content requires a minimal video quality level from the user’s perspective. However, links in WMSN communications are typically unreliable, as they often experience fluctuations in quality and weak connectivity, and thus the routing protocol must evaluate routes using end-to-end link quality information to increase the packet delivery ratio. Moreover, the use of multiple paths together with key video metrics can enhance the video quality level. In this paper, we propose a video-aware multiple path hierarchical routing protocol for efficient multimedia transmission over WMSN, called video-aware MMtransmission. This protocol finds node-disjoint multiple paths and implements an end-to-end link quality estimation with minimal overhead to score the paths. Thus, our protocol assures multimedia transmission with Quality of Experience (QoE) and energy-efficiency support. The simulation results show the benefits of video-aware MMtransmission for disseminating video content by means of energy-efficiency and QoE analysis.
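The abstract does not include code; as a hypothetical sketch of the two ingredients it names (node-disjoint multipath selection scored by end-to-end link quality), one might score each candidate path by the product of assumed per-hop packet-reception ratios and greedily keep node-disjoint paths:

def path_quality(path, prr):
    """End-to-end delivery probability: product of per-hop reception ratios."""
    quality = 1.0
    for u, v in zip(path, path[1:]):
        quality *= prr[(u, v)]
    return quality

def select_disjoint_paths(candidates, prr, k=2):
    """Greedily keep up to k paths whose interior nodes do not overlap."""
    chosen, used = [], set()
    for path in sorted(candidates, key=lambda p: path_quality(p, prr), reverse=True):
        interior = set(path[1:-1])
        if interior.isdisjoint(used):
            chosen.append(path)
            used.update(interior)
            if len(chosen) == k:
                break
    return chosen

# Example: two source-to-sink paths with assumed per-link reception ratios.
prr = {("s", "a"): 0.9, ("a", "t"): 0.9, ("s", "b"): 0.8, ("b", "t"): 0.95}
print(select_disjoint_paths([["s", "a", "t"], ["s", "b", "t"]], prr))

The actual protocol additionally weighs key video metrics and its hierarchical cluster structure; this sketch captures only the link-quality scoring.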
Abstract:
The identification of plausible causes of water body status deterioration will be much easier if it can build on available, reliable, extensive and comprehensive biogeochemical monitoring data (preferably aggregated in a database). A plausible identification of such causes is a prerequisite for well-informed decisions on which mitigation or remediation measures to take. In this chapter, a rationale for an extended monitoring programme is first provided and compared with the one required by the Water Framework Directive (WFD). This proposal includes a list of relevant parameters that are needed for an integrated, a priori status assessment. Secondly, a few sophisticated statistical tools are described that subsequently allow for the estimation of the magnitude of impairment as well as the likely relative importance of different stressors in a multiply stressed environment. The advantages and restrictions of these rather complicated analytical methods are discussed. Finally, the use of Decision Support Systems (DSS) is advocated with regard to the specific WFD implementation requirements.
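The chapter stops short of a concrete implementation; as one illustrative example of the kind of tool described, a multiple regression on standardized monitoring variables can be used to gauge the relative importance of stressors (a minimal sketch with hypothetical variable names, not the chapter's method):

import numpy as np

def relative_importance(X, y):
    """Standardized OLS coefficients as a rough gauge of stressor importance.
    X: (n_samples, n_stressors) monitoring data; y: a biological status metric."""
    Xz = (X - X.mean(axis=0)) / X.std(axis=0)
    yz = (y - y.mean()) / y.std()
    design = np.column_stack([np.ones(len(yz)), Xz])
    beta, *_ = np.linalg.lstsq(design, yz, rcond=None)
    return beta[1:]  # one standardized coefficient per stressor

Coefficients near zero suggest a stressor contributes little to the observed impairment; strongly correlated stressors, common in monitoring data, make such attributions unstable, which is one of the restrictions such methods carry.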
Abstract:
Atrial fibrillation (AF) is associated with an increased risk of thromboembolism and is the most prevalent risk factor for cardioembolic stroke. Vitamin K antagonists (VKAs) have been the standard of care for stroke prevention in patients with AF since the early 1990s. They are very effective for the prevention of cardioembolic stroke, but are limited by factors such as drug-drug interactions, food interactions, slow onset and offset of action, haemorrhage and the need for routine anticoagulation monitoring to maintain a therapeutic international normalised ratio (INR). Multiple new oral anticoagulants have been developed as potential replacements for VKAs for stroke prevention in AF. Most are small synthetic molecules that target thrombin (e.g. dabigatran etexilate) or factor Xa (e.g. rivaroxaban, apixaban, edoxaban, betrixaban, YM150). These drugs have predictable pharmacokinetics that allow fixed dosing without routine laboratory monitoring. Dabigatran etexilate, the first of these new oral anticoagulants to be approved by the United States Food and Drug Administration and the European Medicines Agency for stroke prevention in patients with non-valvular AF, represents an effective and safe alternative to VKAs. Under the auspices of the Regional Anticoagulation Working Group, a multidisciplinary group of experts in thrombosis and haemostasis from Central and Eastern Europe, an expert panel convened to discuss practical, clinically important issues related to the long-term use of dabigatran for stroke prevention in non-valvular AF. The practical information reviewed in this article will help clinicians make appropriate use of this new therapeutic option in daily clinical practice.
Abstract:
Atmospheric concentrations of the three important greenhouse gases (GHGs) CO2, CH4 and N2O are mediated by processes in the terrestrial biosphere that are sensitive to climate and CO2. This leads to feedbacks between climate and land and has contributed to the sharp rise in atmospheric GHG concentrations since pre-industrial times. Here, we apply a process-based model to reproduce the historical atmospheric N2O and CH4 budgets within their uncertainties and apply future scenarios for climate, land-use change and reactive nitrogen (Nr) inputs to investigate future GHG emissions and their feedbacks with climate in a consistent and comprehensive framework. Results suggest that in a business-as-usual scenario, terrestrial N2O and CH4 emissions increase by 80% and 45%, respectively, and the land becomes a net source of C by AD 2100. N2O and CH4 feedbacks imply an additional warming of 0.4–0.5 °C by AD 2300, on top of the 0.8–1.0 °C caused by terrestrial carbon cycle and albedo feedbacks. The land biosphere represents an increasingly positive feedback to anthropogenic climate change and amplifies equilibrium climate sensitivity by 22–27%. Strong mitigation limits the increase of terrestrial GHG emissions and prevents the land biosphere from acting as an increasingly strong amplifier of anthropogenic climate change.
Abstract:
Agricultural and forest productive diversification depends on multiple socioeconomic drivers, such as knowledge, migration, productive capacity, and market access, that shape productive strategies and influence their ecological impacts. Our comparison of indigenous peoples and settlers allows a better understanding of how societies develop different diversification strategies in similar ecological contexts and of how the related socioeconomic aspects of diversification are associated with land cover change. Our results suggest that although indigenous people cause less deforestation and diversify more, diversification is not a direct driver of deforestation reduction. A multidimensional approach linking sociocognitive, economic, and ecological patterns of diversification helps explain this apparent contradiction.
Abstract:
The jatropha plant produces seeds containing 25–40% oil by weight. This oil can be made into biodiesel. During the recent global fuel crisis, the price of crude oil peaked at over USD 130 per barrel. Jatropha attracted huge interest – it was touted as a wonder crop that could generate biodiesel oil on “marginal lands” in semi-arid areas. Its promise appeared especially great in East Africa. Today, however, jatropha’s value in East Africa appears to lie primarily in its multipurpose use by small-scale farmers, not in large-scale biofuel production.