884 results for Validation of analytical methodology
Abstract:
In the U.S., coal-fired power plants produce over 136 million tons of coal combustion residuals (CCRs) annually. CCRs are enriched in toxic elements, and their leachates can have significant impacts on water quality. Here we report the boron and strontium isotopic ratios of leaching experiments on CCRs from a variety of coal sources (Appalachian, Illinois, and Powder River Basins). CCR leachates had a mostly negative δ(11)B, ranging from -17.6 to +6.3‰, and (87)Sr/(86)Sr ranging from 0.70975 to 0.71251. Additionally, we utilized these isotopic ratios for tracing CCR contaminants in different environments: (1) waters affected by the 2008 Tennessee Valley Authority (TVA) coal ash spill; (2) CCR effluents from power plants in Tennessee and North Carolina; (3) lakes and rivers affected by CCR effluents in North Carolina; and (4) porewater extracted from sediments in lakes affected by CCRs. The boron isotopes measured in these environments had a distinctive negative δ(11)B signature relative to background waters. In contrast, (87)Sr/(86)Sr ratios in CCRs did not always differ distinctly from background values, limiting their use as a CCR tracer. This investigation demonstrates the validity of the combined geochemical and isotopic approach as a unique and practical identification method for delineating and evaluating the environmental impact of CCRs.
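The tracing approach the abstract describes relies on the contrast between the CCR and background endmember δ(11)B signatures. A minimal sketch of the underlying two-endmember mass balance, assuming the boron concentrations of the two endmembers are similar enough that δ values mix approximately linearly (the function name and the numerical values below are illustrative, not taken from the study):

```python
def ccr_boron_fraction(delta_mix, delta_ccr, delta_bg):
    """Fraction of boron derived from the CCR endmember in a mixed water
    sample, from a linear two-endmember delta-11B mass balance:
        delta_mix = f * delta_ccr + (1 - f) * delta_bg
    Solving for f gives the mixing fraction below."""
    return (delta_mix - delta_bg) / (delta_ccr - delta_bg)

# Illustrative values: background +10 permil, CCR leachate -10 permil;
# a sample measuring 0 permil would then be about half CCR-derived.
print(ccr_boron_fraction(0.0, -10.0, 10.0))  # 0.5
```

In practice the mixing is weighted by boron mass, so a concentration-weighted balance would be needed when the endmember boron concentrations differ substantially.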
Abstract:
BACKGROUND: Administrative or quality improvement registries may or may not contain the elements needed for investigations by trauma researchers. International Classification of Diseases Program for Injury Categorisation (ICDPIC), a statistical program available through Stata, is a powerful tool that can extract injury severity scores from ICD-9-CM codes. We conducted a validation study for use of the ICDPIC in trauma research. METHODS: We conducted a retrospective cohort validation study of 40,418 patients with injury using a large regional trauma registry. ICDPIC-generated AIS scores for each body region were compared with trauma registry AIS scores (gold standard) in adult and paediatric populations. A separate analysis was conducted among patients with traumatic brain injury (TBI) comparing the ICDPIC tool with ICD-9-CM embedded severity codes. Performance in characterising overall injury severity, by the ISS, was also assessed. RESULTS: The ICDPIC tool generated substantial correlations in thoracic and abdominal trauma (weighted κ 0.87-0.92), and in head and neck trauma (weighted κ 0.76-0.83). The ICDPIC tool captured TBI severity better than ICD-9-CM code embedded severity and offered the advantage of generating a severity value for every patient (rather than having missing data). Its ability to produce an accurate severity score was consistent within each body region as well as overall. CONCLUSIONS: The ICDPIC tool performs well in classifying injury severity and is superior to ICD-9-CM embedded severity for TBI. Use of ICDPIC demonstrates substantial efficiency and may be a preferred tool in determining injury severity for large trauma datasets, provided researchers understand its limitations and take caution when examining smaller trauma datasets.
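The agreement statistic the study reports (weighted κ between ICDPIC-generated and registry AIS scores) can be sketched from first principles. A minimal pure-Python implementation, with illustrative data rather than the study's registry scores:

```python
def weighted_kappa(a, b, categories, weights="quadratic"):
    """Weighted Cohen's kappa for two ordinal ratings of the same cases,
    e.g. tool-generated vs. registry AIS scores. Disagreement weights are
    squared rank distance ("quadratic") or absolute distance ("linear")."""
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    n = len(a)
    # observed joint frequencies
    obs = [[0.0] * k for _ in range(k)]
    for x, y in zip(a, b):
        obs[idx[x]][idx[y]] += 1.0 / n
    # marginals -> expected frequencies under independence
    pa = [sum(obs[i][j] for j in range(k)) for i in range(k)]
    pb = [sum(obs[i][j] for i in range(k)) for j in range(k)]
    def w(i, j):
        d = abs(i - j)
        return d * d if weights == "quadratic" else d
    num = sum(w(i, j) * obs[i][j] for i in range(k) for j in range(k))
    den = sum(w(i, j) * pa[i] * pb[j] for i in range(k) for j in range(k))
    return 1.0 - num / den

# Perfect agreement between the two raters yields kappa = 1.0
print(weighted_kappa([1, 2, 3, 2, 1], [1, 2, 3, 2, 1], [1, 2, 3, 4, 5, 6]))  # 1.0
```

Values near 1 indicate near-perfect agreement, which is the scale on which the reported 0.87-0.92 (thoracic/abdominal) and 0.76-0.83 (head/neck) ranges should be read.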
Abstract:
Thin-layer and high-performance thin-layer chromatography (TLC/HPTLC) methods for assaying compound(s) in a sample must be validated to ensure that they are fit for their intended purpose and, where applicable, meet the strict regulatory requirements for controlled products. Two validation approaches are identified in the literature, i.e. the classic approach and the alternative approach using accuracy profiles. Detailed procedures for the two approaches are discussed based on the validation of methods for pharmaceutical analysis, an area considered to have particularly strict requirements. Estimation of the measurement uncertainty from the validation approach using accuracy profiles is also described. Examples of HPTLC methods, developed and validated to assay sulfamethoxazole and trimethoprim on the one hand and lamivudine, stavudine, and nevirapine on the other, in their fixed-dose combination tablets, are further elaborated.
Abstract:
This paper describes an experience carried out with pre-service mathematics teachers on the role that new technologies can play in carrying out processes of demonstration and proof in the secondary school classroom.
Abstract:
In this paper, the buildingEXODUS (V1.1) evacuation model is described and discussed, and attempts at qualitative and quantitative model validation are presented. The data set used for the validation is the Tsukuba pavilion evacuation data. This data set is of particular interest as the evacuation was influenced by external conditions, namely inclement weather. As part of the validation exercise, the sensitivity of the buildingEXODUS predictions to a range of variables and conditions is examined, including: exit flow capacity, occupant response times and the impact of external conditions on the developing evacuation. The buildingEXODUS evacuation model was found to be able to produce good qualitative and quantitative agreement with the experimental data.
Abstract:
In this paper, the buildingEXODUS (V1.1) evacuation model is described and discussed and attempts at qualitative and quantitative model validation are presented. The data sets used for validation are the Stapelfeldt and Milburn House evacuation data. As part of the validation exercise, the sensitivity of the buildingEXODUS predictions to a range of variables is examined, including: occupant drive, occupant location, exit flow capacity, exit size, occupant response times and geometry definition. An important consideration that has been highlighted by this work is that any validation exercise must be scrutinised to identify both the results generated and the considerations and assumptions on which they are based. During the course of the validation exercise, both data sets were found to be less than ideal for the purpose of validating complex evacuation models. However, the buildingEXODUS evacuation model was found to be able to produce reasonable qualitative and quantitative agreement with the experimental data.
Abstract:
Magnetic fields are used in a number of processes related to the extraction of metals, the production of alloys and the shaping of metal components. Computational techniques have an increasingly important role to play in the simulation of such processes, since it is often difficult or very costly to conduct experiments in the high-temperature conditions encountered, and the complex interaction of fluid flow, heat transfer and magnetic fields means simple analytic models are often far removed from reality. In this paper an overview of the computational activity at the University of Greenwich in this area is given, covering the past ten years. The overview is given from the point of view of the modeller and, within the space limitations imposed by the format, it covers the numerical methods used and attempts at validation against experiments or analytic procedures; it highlights successes, but also some failures. A broad range of models is covered in the review (and accompanying lecture), used to simulate (a) A-C field applications: induction melting, magnetic confinement and levitation, casting; and (b) D-C field applications such as arc welding and aluminium electrolysis. Most of these processes involve phase change of the metal (melting or solidification), the presence of a dynamic free surface and turbulent flow. These issues affect accuracy and need to be addressed by the modeller.
Abstract:
A finite volume computer model of the continuous casting process for steel flat products has been developed. In this first stage, the model concentrates on the hydrodynamic aspects of the process and in particular the dynamic behavior of the metal/slag interface. The model was validated against experimental measurements obtained in a water model apparatus.
Abstract:
Today, the key to commercial success in manufacturing is the timely development of new products that are not only functionally fit for purpose but offer high performance and quality throughout their entire lifecycle. In principle, this demands the introduction of a fully developed and optimised product from the outset. To accomplish this, manufacturing companies must leverage existing knowledge in their current technical, manufacturing and service capabilities. This is especially true in the field of tolerance selection and application, the subject area of this research. Tolerance knowledge must be readily available and deployed as an integral part of the product development process. This paper describes a methodology and framework, currently under development in a UK manufacturer, to achieve this objective.