968 results for STANDARDIZATION
Abstract:
Semi-natural grasslands, biodiversity hotspots in Central Europe, suffer from the cessation of traditional land-use. The amount and intensity of these changes challenge current monitoring frameworks, which are typically based on classic indicators such as selected target species or diversity indices. Indicators based on plant functional traits provide an interesting extension, since they reflect ecological strategies at the individual level and ecological processes at the community level. They typically show convergent responses to gradients of land-use intensity across scales and regions, are more directly related to environmental drivers than diversity components themselves, and enable the detection of directional changes in whole-community dynamics. However, probably due to their labor- and cost-intensive assessment in the field, they have rarely been applied as indicators so far. Here we suggest overcoming these limitations by calculating indicators with plant traits derived from online accessible databases. Aiming to provide a minimal trait set for monitoring the effects of land-use intensification on plant diversity, we investigated relationships between 12 community mean traits, 2 diversity indices and 6 predictors of land-use intensity within grassland communities of 3 different regions in Germany (part of the German ‘Biodiversity Exploratories’ research network). Using standardized traits and diversity measures, null models and linear mixed models, we confirmed (i) strong links between functional community composition and plant diversity, (ii) that traits are closely related to land-use intensity, and (iii) that functional indicators are equally, or even more, sensitive to land-use intensity than traditional diversity indices. The deduced trait set consisted of 5 traits, i.e., specific leaf area (SLA), leaf dry matter content (LDMC), seed release height, leaf distribution, and onset of flowering.
These database-derived traits enable the early detection of changes in community structure indicative of future diversity loss. As an addition to current monitoring measures, they make it possible to better link environmental drivers to the processes controlling community dynamics.
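Community mean traits of the kind used above are commonly computed as community-weighted means (CWM): abundance-weighted averages of database-derived species trait values. A minimal sketch under that assumption (the function name and the example values are hypothetical, not taken from the study):

```python
def community_weighted_mean(abundances, trait_values):
    """Community-weighted mean (CWM) of a trait: the abundance-weighted
    average of species-level trait values (e.g., SLA from a trait database)."""
    total = sum(abundances)
    return sum(a * t for a, t in zip(abundances, trait_values)) / total

# Hypothetical plot: relative abundances of three species and their
# database-derived specific leaf area (SLA, mm^2/mg)
abundances = [0.5, 0.3, 0.2]
sla_values = [20.0, 15.0, 30.0]
cwm_sla = community_weighted_mean(abundances, sla_values)  # 20.5
```

Computed per plot and regressed against land-use intensity, such CWM values are what lets a trait like SLA act as a community-level indicator.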
Abstract:
OBJECTIVES The aim of the current Valve Academic Research Consortium (VARC)-2 initiative was to revisit the selection and definitions of transcatheter aortic valve implantation (TAVI) clinical endpoints to make them more suitable to the present and future needs of clinical trials. In addition, this document is intended to expand the understanding of patient risk stratification and case selection. BACKGROUND A recent study confirmed that VARC definitions have already been incorporated into clinical and research practice and represent a new standard for consistency in reporting clinical outcomes of patients with symptomatic severe aortic stenosis (AS) undergoing TAVI. However, as the clinical experience with this technology has matured and expanded, certain definitions have become unsuitable or ambiguous. METHODS AND RESULTS Two in-person meetings (held in September 2011 in Washington, DC, and in February 2012 in Rotterdam, The Netherlands) involving VARC study group members, independent experts (including surgeons, interventional and noninterventional cardiologists, imaging specialists, neurologists, geriatric specialists, and clinical trialists), the US Food and Drug Administration (FDA), and industry representatives, provided much of the substantive discussion from which this VARC-2 consensus manuscript was derived. This document provides an overview of risk assessment and patient stratification that need to be considered for accurate patient inclusion in studies. Working groups were assigned to define the following clinical endpoints: mortality, stroke, myocardial infarction, bleeding complications, acute kidney injury, vascular complications, conduction disturbances and arrhythmias, and a miscellaneous category including relevant complications not previously categorized. Furthermore, comprehensive echocardiographic recommendations are provided for the evaluation of prosthetic valve (dys)function. Definitions for the quality of life assessments are also reported. 
These endpoints formed the basis for several recommended composite endpoints. CONCLUSIONS This VARC-2 document has provided further standardization of endpoint definitions for studies evaluating the use of TAVI, which will lead to improved comparability and interpretability of study results, supplying a growing body of evidence with respect to TAVI and/or surgical aortic valve replacement. This initiative and document can furthermore serve as a model in current endeavors to apply definitions to other transcatheter valve therapies (for example, mitral valve repair).
Abstract:
A large body of published work shows that proton (hydrogen 1 [¹H]) magnetic resonance (MR) spectroscopy has evolved from a research tool into a clinical neuroimaging modality. Herein, the authors present a summary of brain disorders in which MR spectroscopy has an impact on patient management, together with a critical consideration of common data acquisition and processing procedures. The article documents the impact of ¹H MR spectroscopy in the clinical evaluation of disorders of the central nervous system. The clinical usefulness of ¹H MR spectroscopy has been established for brain neoplasms, neonatal and pediatric disorders (hypoxia-ischemia, inherited metabolic diseases, and traumatic brain injury), demyelinating disorders, and infectious brain lesions. The growing list of disorders for which ¹H MR spectroscopy may contribute to patient management extends to neurodegenerative diseases, epilepsy, and stroke. To facilitate expanded clinical acceptance and standardization of MR spectroscopy methodology, guidelines are provided for data acquisition and analysis, quality assessment, and interpretation. Finally, the authors offer recommendations to expedite the use of robust MR spectroscopy methodology in the clinical setting, including incorporation of technical advances on clinical units. © RSNA, 2014. Online supplemental material is available for this article.
Abstract:
Internet of Things-based systems are anticipated to gain widespread use in industrial applications. Standardization efforts, like 6LoWPAN and the Constrained Application Protocol (CoAP), have made the integration of wireless sensor nodes possible using Internet technology and web-like access to data (RESTful service access). While there are still some open issues, the interoperability problem in the lower layers can now be considered solved from an enterprise software vendor's point of view. One possible next step towards integrating real-world objects into enterprise systems, and solving the corresponding interoperability problems at higher levels, is to use semantic web technologies. We introduce an abstraction of real-world objects, called Semantic Physical Business Entities (SPBE), using Linked Data principles. We show that this abstraction fits well into enterprise systems, as SPBEs allow a business-object-centric view of real-world objects, instead of a purely device-centric view. The interdependencies between how services in an enterprise system are currently used and how this can be done in a semantic real-world-aware enterprise system are outlined, arguing for the need for semantic services and semantic knowledge repositories. We introduce a lightweight query language, which we use to perform a quantitative analysis of our approach to demonstrate its feasibility.
Abstract:
This article describes the outcome and follow-up discussions of an expert group meeting (Amsterdam, October 9, 2009) on the applicability of toxicity profiling for diagnostic environmental risk assessment. A toxicity profile was defined as a toxicological "fingerprint" of a sample, ranging from a pure compound to a complex mixture, obtained by testing the sample or its extract for its activity toward a battery of biological endpoints. The expert group concluded that toxicity profiling is an effective first-tier tool for screening the integrated hazard of complex environmental mixtures with known and unknown toxicologically active constituents. In addition, toxicity profiles can be used for prioritization of sampling locations, for identification of hot spots, and, in combination with effect-directed analysis (EDA) or toxicity identification and evaluation (TIE) approaches, for establishing cause-effect relationships by identifying emerging pollutants responsible for the observed toxic potency. Small-volume in vitro bioassays are especially applicable for these purposes, as they are relatively cheap and fast, with costs comparable to those of chemical analyses, and the results are toxicologically more relevant and more suitable for realistic risk assessment. For regulatory acceptance in the European Union, toxicity profiling terminology should stay as close as possible to the European Water Framework Directive (WFD) terminology, and validation, standardization, statistical analyses, and other quality aspects of toxicity profiling should be further elaborated.
Abstract:
This study describes the development and validation of a gas chromatography-mass spectrometry (GC-MS) method to identify and quantify phenytoin in brain microdialysate, saliva and blood from human samples. A solid-phase extraction (SPE) was performed with a nonpolar C8-SCX column. The eluate was evaporated with nitrogen (50°C) and derivatized with trimethylsulfonium hydroxide before GC-MS analysis. 5-(p-Methylphenyl)-5-phenylhydantoin was used as the internal standard. The MS was run in scan mode and identification was made using three ion fragment masses. All peaks were identified with MassLib. Spiked phenytoin samples showed a recovery after SPE of ≥94%. The calibration curve (phenytoin 50 to 1,200 ng/mL, n = 6, at six concentration levels) showed good linearity and correlation (r² > 0.998). The limit of detection was 15 ng/mL; the limit of quantification was 50 ng/mL. Dried extracted samples were stable within a 15% deviation range for ≥4 weeks at room temperature. The method met International Organization for Standardization standards and was able to detect and quantify phenytoin in different biological matrices and patient samples. The GC-MS method with SPE is specific, sensitive, robust and highly reproducible, and is therefore an appropriate candidate for the pharmacokinetic assessment of phenytoin concentrations in different human biological samples.
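The calibration step described above (six levels, r² > 0.998) amounts to an ordinary least-squares fit of detector response against concentration. A sketch of that linearity check in plain Python; the response values below are hypothetical, not the study's data:

```python
def linear_fit(x, y):
    """Ordinary least-squares fit y = a + b*x; returns (intercept a,
    slope b, correlation coefficient r)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    syy = sum((yi - my) ** 2 for yi in y)
    b = sxy / sxx
    a = my - b * mx
    r = sxy / (sxx * syy) ** 0.5
    return a, b, r

# Hypothetical six-level calibration: phenytoin (ng/mL) vs. peak-area ratio
conc = [50, 100, 200, 400, 800, 1200]
resp = [0.11, 0.20, 0.41, 0.79, 1.62, 2.41]
a, b, r = linear_fit(conc, resp)  # acceptance would require r**2 > 0.998
```

Unknowns are then quantified by inverting the fitted line, (response − a) / b, within the validated 50 to 1,200 ng/mL range.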
Abstract:
Multimodal therapy concepts have been successfully implemented in the treatment of locally advanced gastrointestinal malignancies. The effects of neoadjuvant chemo- or radiochemotherapy, such as scarring fibrosis or resorptive changes and inflammation, can be determined by histopathological investigation of the subsequent resection specimen. Tumor regression grading (TRG) systems, which aim to categorize the extent of regressive changes after cytotoxic treatment, mostly refer to the amount of therapy-induced fibrosis in relation to residual tumor, or to the estimated percentage of residual tumor in relation to the previous tumor site. Commonly used TRGs include, e.g., the Mandard and Becker grading systems for upper gastrointestinal carcinomas, the Dworak and Rödel grading systems for rectal cancer, and other systems which follow similar definitions. For gastro-esophageal carcinomas in particular, these TRGs provide important prognostic information, since complete or subtotal tumor regression has been shown to be associated with better patient outcome. The prognostic value of TRG may even exceed that of currently used staging systems (e.g., TNM staging) for tumors treated by neoadjuvant therapy. Some limitations have been described regarding interobserver variability, especially in borderline cases; these may be improved by standardization of the work-up of resection specimens and better training in the histopathologic determination of regressive changes. It is highly recommended that TRG be included in every histopathological report on neoadjuvant-treated gastrointestinal carcinomas. The aim of this review is to highlight the relevance of histomorphological TRG for achieving optimal therapy for patients with gastrointestinal carcinomas.
Abstract:
INTRODUCTION In patients with metastatic colorectal cancers, multimodal management and the use of biological agents such as monoclonal antibodies have had major positive effects on survival. The ability to predict which patients may be at 'high risk' of distant metastasis could have major implications on patient management. Histomorphological, immunohistochemical or molecular biomarkers are currently being investigated in order to test their potential value as predictors of metastasis. AREAS COVERED Here, the author reviews the clinical and functional data supporting the investigation of three novel promising biomarkers for the prediction of metastasis in patients with colorectal cancer: tumor budding, Raf1 kinase inhibitor protein (RKIP) and metastasis-associated in colon cancer-1 (MACC1). EXPERT OPINION The lifespan of most potential biomarkers is short as evidenced by the rare cases that have successfully made their way into daily practice such as KRAS or microsatellite instability (MSI) status. Although the three biomarkers reviewed herein have the potential to become important predictive biomarkers of metastasis, they have similar hurdles to overcome before they can be implemented into clinical management: standardization and validation in prospective patient cohorts.
Abstract:
BACKGROUND While the assessment of analytical precision within medical laboratories has received much attention in scientific enquiry, the degree of variation between laboratories, as well as its sources, remains incompletely understood. In this study, we quantified the variance components when performing coagulation tests with identical analytical platforms in different laboratories and computed intraclass correlation coefficients (ICC) for each coagulation test. METHODS Data from eight laboratories measuring fibrinogen twice in twenty healthy subjects with one of 3 different platforms, and single measurements of prothrombin time (PT) and coagulation factors II, V, VII, VIII, IX, X, XI and XIII, were analysed. By platform, the variance components of (i) the subjects, (ii) the laboratory and the technician and (iii) the total variance were obtained for fibrinogen, as well as (i) and (iii) for the remaining factors, using ANOVA. RESULTS The variability of fibrinogen measurements within a laboratory ranged from 0.02 to 0.04; the variability between laboratories ranged from 0.006 to 0.097. Between the platforms, the ICC ranged from 0.37 to 0.66 for fibrinogen and from 0.19 to 0.80 for PT. For the remaining factors, the ICCs ranged from 0.04 (FII) to 0.93 (FVIII). CONCLUSIONS Variance components that could be attributed to technicians or laboratory procedures were substantial, led to disappointingly low intraclass correlation coefficients for several factors, and were pronounced for some of the platforms. Our findings call for sustained efforts to raise the level of standardization of the structures and procedures involved in the quantification of coagulation factors.
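The ICC reported above is the share of total variance attributable to true between-subject differences, computed from the ANOVA variance components. A minimal sketch (the variance components below are hypothetical, chosen only to illustrate the formula):

```python
def icc_from_variance_components(var_subject, var_lab_technician, var_residual):
    """Intraclass correlation coefficient: proportion of total variance
    due to between-subject differences rather than laboratory/technician
    effects or residual measurement error."""
    return var_subject / (var_subject + var_lab_technician + var_residual)

# Hypothetical fibrinogen variance components, (g/L)^2
icc = icc_from_variance_components(0.04, 0.03, 0.02)
# ~0.44: substantial lab/technician variance drags the ICC well below 1
```

An ICC near 1 means laboratories are interchangeable for that assay; the low values reported for some factors indicate they are not.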
Abstract:
A complex of interrelated factors, including minority status, poverty, education, health status, and others, determines the general welfare of children in America, particularly in heavily diverse states such as Texas. Although racial/ethnic status is clearly only a concomitant factor in that determination, it is a factor for which future projections are available and for which the relationships with the other factors in the complex can be assessed. After examining the nature of the interrelationships between these factors, we utilize direct standardization techniques to examine how the future diversification of the United States and Texas will affect the number of children in poverty, the educational status of householders in households in which children in poverty live, and the health status of children in 2040, assuming that the current relationships between minority status and these socioeconomic factors continue into the future. In the analyses, the total population of the United States and Texas in 2040 is assumed, in the first simulation scenario, to have the race/ethnicity characteristics of 2008 and, in the second, those projected for 2040 by the U.S. Census Bureau for the nation and by the Texas State Data Center for Texas. The results show that the diversification of the population could increase the number of children in poverty in the United States by nearly 1.8 million more than would occur with the lower levels of diversification evident in 2008. In addition, poverty would become increasingly concentrated among minority children, with minority children accounting for 76.2 percent of all children in poverty by 2040 and Hispanic children accounting for nearly half of the children in poverty by 2040.
Results for educational attainment show an increasing concentration of minority children in households with householders with very low levels of education, such that by 2040, 85.2 percent of the increase in the number of children in poverty would be in households with a householder with less than a high school education. Finally, the results related to several health status factors show that children in poverty will have a higher prevalence of nearly all health conditions. For example, the number of children with untreated dental conditions could increase to more than 4 million in the United States and to nearly 500,000 in Texas. The results clearly show that improving the welfare of children in America will require concerted efforts to change the poverty, educational, and health status characteristics associated with minority status, and particularly Hispanic status. Failing to do so will lead to a future in which America’s children are increasingly impoverished, more poorly educated, and less healthy, and which, as a result, is an America with a more tentative future.
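Direct standardization, as used above, holds group-specific rates fixed and applies them to alternative population compositions, so that any difference in the outcome reflects compositional change alone. A sketch with made-up numbers (the groups, rates, and counts are illustrative, not the study's projections):

```python
def directly_standardized_count(rates_by_group, population_by_group):
    """Direct standardization: apply fixed group-specific rates to a
    given population composition, isolating the compositional effect
    on the total count."""
    return sum(rates_by_group[g] * population_by_group[g]
               for g in population_by_group)

# Hypothetical child-poverty rates and two population-mix scenarios
rates = {"group_a": 0.12, "group_b": 0.30}
mix_2008 = {"group_a": 50_000_000, "group_b": 25_000_000}
mix_2040 = {"group_a": 40_000_000, "group_b": 35_000_000}

poor_2008_mix = directly_standardized_count(rates, mix_2008)
poor_2040_mix = directly_standardized_count(rates, mix_2040)
extra_poor = poor_2040_mix - poor_2008_mix  # effect of diversification alone
```

Because the rates are identical in both scenarios, `extra_poor` measures only what the shift in population composition contributes, which is the logic behind the study's 2008-versus-2040 comparison.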
Abstract:
After the introduction of liberal-democratic constitutions in the Swiss cantons in the first half of the 1830s, the grid of existing schools was systematized and broadly expanded. The school systems have ever since been characterized by one key element: a special local authority type called "Schulkommission" or "Schulpflege". These take the form of committees of laymen appointed by democratic elections, like all the other executive bodies at the different federal levels in Switzerland. In their obligations and activities, these community-level school committees correspond closely to the school boards in the American and Canadian school systems. They are accountable for the selection and supervision of teachers. They approve decisions about the school careers of pupils and about curricular matters such as the choice of school books. Normally their members are elected by the local voters for four-year terms of office (reelection remains possible), and with regard to pedagogy they are normally non-professionals. The board members are responsible for the classes and teachers assigned to them and must visit them periodically. These visitations, together with the monthly board meetings with the teachers, enable the board members to gain a deep insight into what happens in their schools over the course of their term of office. As laymen, however, they are confronted with a professional teaching staff and with educational experts in the public administration. Nevertheless, this form of executive power exercised by non-professionals is constitutive for state governance in the Swiss as well as in other national political environments. It corresponds to the principles of subsidiarity and militia and therefore allows for a strong accentuation of liberty and the right of self-determination, two axioms at the very base of democratic federalist ideology.
This governance architecture, with its strong accent on local anchorage, offers substantial advantages for the legitimacy and acceptability of political and administrative decisions. This is especially relevant in the educational area, because the rearing of the offspring is a project of hope and, besides, quite costly. In public opinion, such supervision bodies staffed by laymen seem to have certain credibility advantages in comparison with the professional administration. They are credited with being capable of preventing the waste of common financial resources while also guaranteeing the protection and fostering of the community’s children. Especially because of their non-professional character, they are trusted to be reliably immune to organizational blindness, and they seem able to defend the interests of the local community against the standardization and centralization aspirations originating from the administrative expertocracy. In this paper, these common rationales are underpinned by the results of a comprehensive historical analysis of the session protocols of three Bernese school commissions from 1835 to 2005.
Abstract:
To improve our understanding of the Asian monsoon system, we developed a hydroclimate reconstruction in a marginal monsoon shoulder region for the period prior to the industrial era. Here, we present the first moisture-sensitive tree-ring chronology, spanning 501 years, for the Dieshan Mountain area, a boundary region of the Asian summer monsoon in the northeastern Tibetan Plateau. This reconstruction was derived from 101 cores of 68 old-growth Chinese pine (Pinus tabulaeformis) trees. We introduce a Hilbert–Huang Transform (HHT)-based standardization method to develop the tree-ring chronology, which has the advantage of excluding non-climatic disturbances from individual tree-ring series. Based on the reliable portion of the chronology, we reconstructed the annual (prior July to current June) precipitation history since 1637 for the Dieshan Mountain area, explaining 41.3% of the variance. The extremely dry years in this reconstruction were also found in historical documents and are associated with El Niño episodes. Dry periods were reconstructed for 1718–1725, 1766–1770 and 1920–1933, whereas 1782–1788 and 1979–1985 were wet periods. The spatial signatures of these events were supported by data from other marginal regions of the Asian summer monsoon. Over the past four centuries, out-of-phase relationships between hydroclimate variations in the Dieshan Mountain area and far western Mongolia were observed during the 1718–1725 and 1766–1770 dry periods and the 1979–1985 wet period.
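Tree-ring standardization in general divides each raw ring-width series by a fitted growth trend, yielding unitless indices; the study's HHT-based method instead derives that trend from empirical mode decomposition. A conventional sketch with a simple linear trend (illustrative only, not the paper's HHT procedure):

```python
def standardize_series(ring_widths):
    """Conventional tree-ring standardization: fit a deterministic growth
    trend (here a straight line; the study uses a Hilbert-Huang Transform
    instead) and divide raw widths by the trend to get unitless indices."""
    n = len(ring_widths)
    xs = range(n)
    mean_x = (n - 1) / 2
    mean_w = sum(ring_widths) / n
    slope = (sum((x - mean_x) * (w - mean_w) for x, w in zip(xs, ring_widths))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_w - slope * mean_x
    trend = [intercept + slope * x for x in xs]
    return [w / t for w, t in zip(ring_widths, trend)]

# A declining raw series (mm); standardized indices vary around 1.0
indices = standardize_series([2.0, 1.7, 1.9, 1.4, 1.3])
```

Averaging such indices across the 101 cores would then give the site chronology from which precipitation is reconstructed.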
Abstract:
This paper presents a survey on the usage, opportunities and pitfalls of semantic technologies in the Internet of Things. The survey was conducted in the context of a semantic enterprise integration platform. In total, we surveyed sixty-one individuals from industry and academia on their views and current usage of IoT technologies in general, and semantic technologies in particular. Our semantic enterprise integration platform aims for interoperability at the service level as well as at the protocol level; therefore, questions regarding the use of application layer protocols, network layer protocols and management protocols were also included in the survey. The survey suggests that there is still a lot of heterogeneity in IoT technologies, but first indications of the use of standardized protocols exist. Semantic technologies are recognized as potentially useful, mainly in the management of things and services. Nonetheless, the participants still see many obstacles hindering the widespread use of semantic technologies: firstly, a lack of training, as traditional embedded programmers are not well aware of semantic technologies; secondly, a lack of standardization in ontologies, which would enable interoperability; and thirdly, a lack of good tooling support.
Abstract:
OBJECTIVE To review systematic reviews and meta-analyses of integrated care programmes in chronically ill patients, with a focus on methodological quality, elements of integration assessed and effects reported. DESIGN Meta-review of systematic reviews and meta-analyses identified in Medline (1946-March 2012), Embase (1980-March 2012), CINAHL (1981-March 2012) and the Cochrane Library of Systematic Reviews (issue 1, 2012). MAIN OUTCOME MEASURES Methodological quality assessed by the 11-item Assessment of Multiple Systematic Reviews (AMSTAR) checklist; elements of integration assessed using a published list of 10 key principles of integration; effects on patient-centred outcomes, process quality, use of healthcare and costs. RESULTS Twenty-seven systematic reviews were identified; conditions included chronic heart failure (CHF; 12 reviews), diabetes mellitus (DM; seven reviews), chronic obstructive pulmonary disease (COPD; seven reviews) and asthma (five reviews). The median number of AMSTAR checklist items met was five: few reviewers searched for unpublished literature or described the primary studies and interventions in detail. Most reviews covered comprehensive services across the care continuum or standardization of care through inter-professional teams, but organizational culture, governance structure or financial management were rarely assessed. A majority of reviews found beneficial effects of integration, including reduced hospital admissions and re-admissions (in CHF and DM), improved adherence to treatment guidelines (DM, COPD and asthma) or quality of life (DM). Few reviews showed reductions in costs. CONCLUSIONS Systematic reviews of integrated care programmes were of mixed quality, assessed only some components of integration of care, and showed consistent benefits for some outcomes but not others.
Abstract:
Histomorphometric evaluation of the buccal aspects of periodontal tissues in rodents requires reproducible alignment of maxillae and highly precise sections containing central sections of buccal roots; this is a cumbersome and technically sensitive process due to the small specimen size. The aim of the present report is to describe and analyze a method to transfer virtual sections of micro-computed tomography (micro-CT)-generated image stacks to the microtome for undecalcified histological processing, and to describe the anatomy of the periodontium in rat molars. A total of 84 undecalcified sections of all buccal roots of seven untreated rats were analyzed. The accuracy of section-coordinate transfer from the virtual micro-CT slice to the histological slice, right-left side differences, and the measurement error for linear and angular measurements on micro-CT and on histological micrographs were calculated using the Bland-Altman method, the intraclass correlation coefficient and the method of moments estimator. In addition, manual alignment of the micro-CT-scanned rat maxilla was compared with multiplanar computer-reconstructed alignment. The supra-alveolar rat anatomy is rather similar to human anatomy, whereas the alveolar bone is of the compact type and the keratinized gingival epithelium bends apically to join the junctional epithelium. The high degree of methodological standardization presented herein ensures retrieval of histological slices with excellent display of anatomical microstructures in a reproducible manner, minimizes random errors, and may thereby contribute to reducing the number of animals needed.
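The Bland-Altman method mentioned above quantifies agreement between two measurement routes (here, micro-CT versus histology) as the mean paired difference (bias) and its 95% limits of agreement. A minimal sketch (the function name and sample values are hypothetical):

```python
def bland_altman(method_a, method_b):
    """Bland-Altman agreement: bias (mean paired difference) and the
    95% limits of agreement, bias +/- 1.96 * SD of the differences."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = (sum((d - bias) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired linear measurements (micrometres)
micro_ct = [105.0, 98.0, 110.0, 102.0]
histology = [103.0, 99.0, 108.0, 101.0]
bias, lower, upper = bland_altman(micro_ct, histology)
```

If the limits of agreement are narrow enough for the anatomical question at hand, the two measurement routes can be treated as interchangeable.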