943 results for INTERLABORATORY COMPARISONS
Abstract:
Since the 1990s, regular interlaboratory comparisons of gamma-ray spectrometry have been organized in Switzerland to improve laboratories' ability to measure radioactivity in the environment and foodstuffs at typical routine levels. Each year, the main required test results remained the activity concentration of the test samples and the evaluation of the associated uncertainties. Over the years, the comparisons used certified reference solutions as well as environmental samples. The aim of this study is to investigate the effect of the comparisons on measurement quality. An analysis of the last seven interlaboratory comparisons revealed that Swiss measurement capability is up to date. In addition, the results showed that the participants now have an improved evaluation of the uncertainties associated with their measurements.
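A common way to check whether a participant's reported uncertainty is realistic, as this abstract discusses, is the zeta score used in proficiency-testing statistics (ISO 13528 style). The sketch below is illustrative only; the numbers are hypothetical, not taken from the Swiss comparisons.

```python
import math

def zeta_score(x_lab, u_lab, x_ref, u_ref):
    """Zeta score: difference between a laboratory result and the reference
    value, scaled by the combined standard uncertainty. |zeta| <= 2 suggests
    the result and its stated uncertainty are consistent with the reference;
    |zeta| > 3 flags a problem."""
    return (x_lab - x_ref) / math.sqrt(u_lab**2 + u_ref**2)

# Hypothetical Cs-137 activity concentration (Bq/kg) in a test sample
z = zeta_score(x_lab=52.1, u_lab=1.8, x_ref=50.0, u_ref=1.0)
```

With these hypothetical inputs the zeta score is about 1.02, i.e. the laboratory's result and uncertainty estimate would be judged consistent with the reference value.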
Abstract:
The increasing importance of noise pollution has led to the creation of many new noise testing laboratories in recent years. For this reason, and because of the legal implications that noise reporting may have, it is necessary to create procedures intended to guarantee the quality of the testing and its results. For instance, the ISO/IEC 17025:2005 standard specifies general requirements for the competence of testing laboratories. In this standard, interlaboratory comparisons are one of the main measures that must be applied to guarantee the quality of laboratories when applying specific testing methodologies. In the specific case of environmental noise, round robin tests are usually difficult to design, as it is hard to find scenarios that remain available and controlled while the participants carry out the measurements. Monitoring and controlling the factors that can influence the measurements (source emissions, propagation, background noise…) is not usually affordable, so the most widespread solution is to create very simple scenarios, from which most of the factors that can influence the results are excluded (sampling, processing of results, background noise, source detection…). The new approach described in this paper only requires the organizer to make actual measurements (or prepare virtual ones). Applying and interpreting a common reference document (standard, regulation…), the participants must analyze these input data independently to provide the results, which are then compared among the participants. The measurement costs are greatly reduced for the participants, there is no need to monitor the scenario conditions, and almost any relevant factor can be included in this methodology.
Abstract:
The growing number of publications on the use of isotope ratio mass spectrometry (IRMS) in forensic science reflects the enthusiasm and attraction generated by this technology. IRMS has demonstrated its potential to distinguish chemically identical compounds coming from different sources. Despite the numerous applications of IRMS to a wide range of forensic materials, its implementation in a forensic framework is less straightforward than it appears. In addition, each laboratory has developed its own strategy of analysis regarding calibration, sequence design, standards utilisation and data treatment, without a clear consensus. Through the experience acquired from research undertaken in different forensic fields, we propose a methodological framework for the whole process of applying IRMS methods. We emphasize the importance of considering isotopic results as part of a whole approach when applying this technology to a particular forensic issue. The process is divided into six different steps, which should be considered for a thoughtful and relevant application. The dissection of this process into fundamental steps, further detailed, enables a better understanding of the essential, though not exhaustive, factors that have to be considered in order to obtain results of sufficient quality and robustness to proceed to retrospective analyses or interlaboratory comparisons.
Abstract:
To date, the issue of quality in forensic science has been approached through a quality management policy whose tenets are ruled by market forces. Despite some obvious advantages of the standardization of methods, which allows interlaboratory comparisons and the implementation of databases, this approach suffers from a serious lack of consideration for forensic science as a science. A critical study of its principles and foundations, which constitute its culture, makes it possible to consider the matter of scientific quality in a new dimension. A better understanding of what pertains to forensic science ensures a better application and improves elementary actions within the investigative and intelligence processes as well as the judicial process. This leads to focusing attention on the core of the subject matter: the physical remnants of criminal activity, namely, the traces that produce information for understanding this activity. Adapting practices to the detection and recognition of relevant traces relies on an apprehension of the processes underlying forensic science tenets (Locard, Kirk, the relevancy issue) and a structured management of circumstantial information (direct/indirect information). This is influenced by forensic science education and training. However, the lack of homogeneity with regard to the scientific nature and culture of the discipline among forensic science practitioners and partners represents a real challenge. A sound and critical reconsideration of the forensic science practitioner's roles (investigator, evaluator, intelligence provider) and objectives (prevention, strategies, evidence provision) within the criminal justice system is a means to strengthen the understanding and application of forensic science. Indeed, the whole philosophy is aimed at ensuring a high degree of excellence, namely, a dedicated scientific quality.
Abstract:
The use of interlaboratory test comparisons to determine the performance of individual laboratories for specific tests or calibrations [ISO/IEC Guide 43-1, 1997. Proficiency testing by interlaboratory comparisons - Part 1: Development and operation of proficiency testing schemes] is called proficiency testing (PT). In this paper we propose the use of the generalized likelihood ratio test to compare the performance of a group of laboratories for specific tests relative to the assigned value, and we illustrate the procedure using actual data from a PT program in the area of volume. The proposed test extends the test criteria in use, allowing one to test the consistency of the group of laboratories. Moreover, the class of elliptical distributions is considered for the obtained measurements.
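To illustrate the idea of testing group consistency against an assigned value, the sketch below works out the generalized likelihood ratio test in the simplest setting: i.i.d. normal errors with known standard deviation, where −2 log Λ reduces to n(x̄ − μ₀)²/σ² and is compared with a χ²(1) quantile. This is a minimal didactic sketch, not the paper's method, which handles elliptical distributions; the data values are hypothetical.

```python
import statistics

def glrt_group_consistency(measurements, assigned_value, sigma, alpha=0.05):
    """GLRT of H0: the common group mean equals the assigned value,
    assuming i.i.d. normal errors with known standard deviation sigma.
    Returns the test statistic, the critical value, and the verdict."""
    n = len(measurements)
    xbar = sum(measurements) / n
    # Under these assumptions, -2 log(likelihood ratio) reduces to
    # n * (xbar - mu0)^2 / sigma^2, which is chi-square(1) under H0.
    stat = n * (xbar - assigned_value) ** 2 / sigma ** 2
    # The chi-square(1) upper quantile equals the squared standard-normal
    # quantile z_{1-alpha/2}^2, available from the stdlib.
    critical = statistics.NormalDist().inv_cdf(1 - alpha / 2) ** 2
    return stat, critical, stat <= critical  # True -> consistent with mu0

# Hypothetical volumes (mL) reported by five labs, assigned value 10.000 mL
stat, crit, consistent = glrt_group_consistency(
    [10.002, 9.998, 10.001, 10.003, 9.999],
    assigned_value=10.000, sigma=0.002)
```

Here the statistic (0.45) falls well below the 5% critical value (about 3.84), so this hypothetical group of laboratories would be judged consistent with the assigned value.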
Abstract:
Biomarkers are currently best used as mechanistic "signposts" rather than as "traffic lights" in the environmental risk assessment of endocrine-disrupting chemicals (EDCs). In field studies, biomarkers of exposure [e.g., vitellogenin (VTG) induction in male fish] are powerful tools for tracking single substances and mixtures of concern. Biomarkers also provide linkage between field and laboratory data, thereby playing an important role in directing the need for and design of fish chronic tests for EDCs. It is the adverse effect end points (e.g., altered development, growth, and/or reproduction) from such tests that are most valuable for calculating an adverse NOEC (no observed effect concentration) or adverse EC10 (effective concentration for a 10% response) and subsequently deriving predicted no effect concentrations (PNECs). With current uncertainties, biomarker NOEC or biomarker EC10 data should not be used in isolation to derive PNECs. In the future, however, there may be scope to increasingly use biomarker data in environmental decision making, if plausible linkages can be made across levels of organization such that adverse outcomes might be envisaged relative to biomarker responses. For biomarkers to fulfil their potential, they should be mechanistically relevant and reproducible (as measured by interlaboratory comparisons of the same protocol). VTG is a good example of such a biomarker in that it provides insight into the mode of action (estrogenicity) that is vital to fish reproductive health. Interlaboratory reproducibility data for VTG are also encouraging; recent comparisons (using the same immunoassay protocol) have provided coefficients of variation (CVs) of 38-55% (comparable to published CVs of 19-58% for fish survival and growth end points used in regulatory test guidelines).
While concern over environmental xenoestrogens has led to the evaluation of reproductive biomarkers in fish, it must be remembered that many substances act via diverse mechanisms of action such that the environmental risk assessment for EDCs is a broad and complex issue. Also, biomarkers such as secondary sexual characteristics, gonadosomatic indices, plasma steroids, and gonadal histology have significant potential for guiding interspecies assessments of EDCs and designing fish chronic tests. To strengthen the utility of EDC biomarkers in fish, we need to establish a historical control database (also considering natural variability) to help differentiate between statistically detectable versus biologically significant responses. In conclusion, as research continues to develop a range of useful EDC biomarkers, environmental decision-making needs to move forward, and it is proposed that the "biomarkers as signposts" approach is a pragmatic way forward in the current risk assessment of EDCs.
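The interlaboratory coefficient of variation quoted above for VTG is simply the across-laboratory standard deviation expressed as a percentage of the mean. A minimal sketch, with hypothetical VTG values rather than data from the cited comparisons:

```python
import statistics

def interlab_cv(values):
    """Percent coefficient of variation (CV) across laboratories:
    sample standard deviation divided by the mean, times 100."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical VTG results (ng/mL) for one sample measured by six labs
cv = interlab_cv([820, 1150, 990, 1420, 760, 1080])
```

For these hypothetical values the CV is roughly 23%, which would compare favourably with the 38-55% range reported in the abstract.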
Abstract:
Dissertation (Master's degree in Nuclear Technology)
Abstract:
Universidade Estadual de Campinas, Faculdade de Educação Física
Abstract:
The metrological principles of neutron activation analysis are discussed. It has been demonstrated that this method can provide elemental amount-of-substance values fully traceable to the SI. The method has been used by several laboratories worldwide in a number of CCQM key comparisons - interlaboratory comparison tests at the highest metrological level - supplying results equivalent to values from other methods for elemental or isotopic analysis in complex samples, without the need to perform chemical destruction and dissolution of these samples. The CCQM therefore accepted in April 2007 the claim that neutron activation analysis should have similar status to the methods originally listed by the CCQM as `primary methods of measurement`. Analytical characteristics and the scope of application are given.
Abstract:
In this paper, results of tests on 32 concrete-filled steel tubular columns under axial load are reported. The test parameters were the concrete compressive strength, the column slenderness (L/D) and the wall thickness (t). The test results were compared with predictions from the codes NBR 8800:2008 and EN 1994-1-1:2004 (EC4). The columns had length-to-diameter ratios (L/D) of 3, 5, 7 and 10 and were tested with 30 MPa, 60 MPa, 80 MPa and 100 MPa concrete compressive strengths. The ultimate strengths predicted by the codes showed good agreement with the experimental results. The NBR 8800 predictions were the most conservative, and EC4 gave the best results on average, although it was not conservative for usual concrete-filled short columns.
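As background to the code comparison above, the starting point of both design codes is the plastic resistance of the composite cross-section. The sketch below shows only that simplified sum, deliberately ignoring the confinement factors and slenderness (buckling) reduction that the actual code checks apply; the tube dimensions and material grades are hypothetical, not the specimens tested in the paper.

```python
import math

def cfst_plastic_resistance(D, t, fy, fck):
    """Simplified plastic resistance (in N, for mm and MPa inputs) of a
    circular concrete-filled steel tube cross-section:
    N_pl = A_s * fy + A_c * fck. This omits the confinement enhancement
    and buckling reduction that EC4 and NBR 8800 apply."""
    A_s = math.pi / 4 * (D**2 - (D - 2 * t) ** 2)  # steel annulus area, mm^2
    A_c = math.pi / 4 * (D - 2 * t) ** 2           # concrete core area, mm^2
    return A_s * fy + A_c * fck

# Hypothetical 114.3 mm x 3.6 mm tube, fy = 275 MPa steel, 30 MPa concrete
n_pl_kN = cfst_plastic_resistance(D=114.3, t=3.6, fy=275, fck=30) / 1e3
```

For this hypothetical section the cross-section resistance comes out around 615 kN; the member resistance compared against test loads would then be reduced for slenderness, which is why L/D was a test parameter.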
Abstract:
The primary objective of this study was to assess the lingual kinematic strategies used by younger and older adults to increase rate of speech. It was hypothesised that the strategies used by the older adults would differ from the young adults either as a direct result of, or in response to a need to compensate for, age-related changes in the tongue. Electromagnetic articulography was used to examine the tongue movements of eight young (M = 26.7 years) and eight older (M = 67.1 years) females during repetitions of /ta/ and /ka/ at a controlled moderate rate and then as fast as possible. The younger and older adults were found to significantly reduce consonant durations and increase syllable repetition rate by similar proportions. To achieve these reduced durations both groups appeared to use the same strategy, that of reducing the distances travelled by the tongue. Further comparisons at each rate, however, suggested a speed-accuracy trade-off and increased speech monitoring in the older adults. The results may assist in differentiating articulatory changes associated with normal aging from pathological changes found in disorders that affect the older population.
Abstract:
Matthiessen's ratio (distance from the centre of the lens to the retina : lens radius) was measured in developing black bream, Acanthopagrus butcheri (Sparidae, Teleostei). The value decreased over the first 10 days post-hatch from 3.6 to 2.3 along the nasal axis and from 4.0 to 2.6 along the temporal axis. Coincidentally, there was a decrease in the focal ratio of the lens (focal length : lens radius). Morphologically, the accommodatory retractor lentis muscle appeared to become functional between 10 and 12 days post-hatch. The results suggest that a higher focal ratio compensates for the relatively high Matthiessen's ratio brought about by the constraints of small eye size during early development. Combined with differences in axial length, this provides a means for larval fish to focus images from different distances prior to the ability to accommodate.
Abstract:
Dual-energy X-ray absorptiometry (DXA) is a widely used method for measuring bone mineral in the growing skeleton. Because scan analysis in children offers a number of challenges, we compared DXA results using six analysis methods at the total proximal femur (PF) and five methods at the femoral neck (FN). In total we assessed 50 scans (25 boys, 25 girls) from two separate studies for cross-sectional differences in bone area, bone mineral content (BMC), and areal bone mineral density (aBMD), and for percentage change over the short term (8 months) and long term (7 years). At the proximal femur, for the short-term longitudinal analysis, there was an approximately 3.5% greater change in bone area and BMC when the global region of interest (ROI) was allowed to increase in size between years than when the global ROI was held constant. Trend analysis showed a significant (p < 0.05) difference between scan analysis methods for bone area and BMC across 7 years. At the femoral neck, cross-sectional analysis using a narrower (from default) ROI, without change in location, resulted in 12.9% and 12.6% smaller bone area and BMC, respectively (both p < 0.001). Changes in FN area and BMC over 8 months were significantly greater (2.3%, p < 0.05) using a narrower FN ROI rather than the default ROI. Similarly, the 7-year longitudinal data revealed that differences between scan analysis methods were greatest when the narrower FN ROI was maintained across all years (p < 0.001). For aBMD there were no significant differences in group means between analysis methods at either the PF or the FN. Our findings show the need to standardize the analysis of proximal femur DXA scans in growing children.