877 results for Measuring standard
Abstract:
This paper proposes the establishment of a second diameter measuring standard at 30 cm shoot extension ('diam30') as input variable for allometric biomass estimation of small and mid-sized plant shoots. This diameter standard is better suited than the diameter at breast height (DBH, i.e. diameter at 1.30 m shoot extension) for adequate characterization of plant dimensions in low bushy vegetation or in primary forest undergrowth. The relationships between the two diameter standards are established based on a dataset of 8645 tree, liana and palm shoots in secondary and primary forests of central Amazonia (1-150 mm DBH). DBH can be predicted from diam30 with high precision; the error introduced by diameter transformation is only 2-3% for trees and palms and 5% for lianas, which is acceptable for most field-study purposes. Relationships deviate slightly from linearity and differ between growth forms. Relationships were markedly similar for different vegetation types (low secondary regrowth vs. primary forests), soils, and selected genera or species. This points to a general validity and applicability of diameter transformations for other field studies. This study provides researchers with a tool for the allometric estimation of biomass in low or structurally heterogeneous vegetation. Rather than applying a uniform diameter standard, the measuring position that best represents the respective plant can be decided shoot by shoot. Plant diameters measured at 30 cm height can be transformed to DBH for subsequent allometric biomass estimation. We recommend the use of these diameter transformations only for plants extending well beyond the theoretical minimum shoot length (i.e., >2 m height). This study also prepares the ground for the comparability and compatibility of future allometric equations specifically developed for small- to mid-sized vegetation components (i.e., bushes, undergrowth) which are based on the diam30 measuring standard.
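The abstract describes transforming diameters measured at 30 cm into DBH for allometric use. As a minimal sketch of how such a transformation could be fitted and applied, assuming a simple power-law (log-log) form fitted separately per growth form (the abstract only states that the relationships deviate slightly from linearity, so this functional form is an assumption):

```python
import numpy as np

def fit_diameter_transform(diam30_mm, dbh_mm):
    """Fit a power-law relationship dbh = a * diam30**b on a log-log scale.

    A log-log linear fit is one simple way to capture the slight
    deviation from linearity reported for these relationships.
    """
    log_d30 = np.log(diam30_mm)
    log_dbh = np.log(dbh_mm)
    b, log_a = np.polyfit(log_d30, log_dbh, 1)  # slope, intercept
    return np.exp(log_a), b

def diam30_to_dbh(diam30_mm, a, b):
    """Predict DBH (mm) from the diameter measured at 30 cm shoot extension."""
    return a * diam30_mm ** b

# Hypothetical usage: fit one transformation per growth form
# (trees, palms, lianas), since the relationships differ between them.
# a, b = fit_diameter_transform(diam30_measured, dbh_measured)
# dbh_pred = diam30_to_dbh(new_diam30, a, b)
```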
Abstract:
The continuous development of instruments and equipment used as tools for torque measurement in industry demands more accurate techniques in the use of this kind of instrumentation, including the development of metrological characteristics in torque measurement. The same applies to the needs in calibration services. There is a diversity of hand torque tools on the market with different measuring ranges, but without compliance with technical standards in terms of quality and reliability requirements. However, there is currently no torque measuring standard that fulfils, at low cost, the needs for the calibration of hand torque tools over a large number of ranges. The objective of this thesis is to present the development and evaluation of a torque measuring standard device conceived to allow the calibration of hand torque tools at three torque levels with a single instrument, reducing calibration costs and time while offering reliability in the evaluation of torque measuring instruments. To meet the demand for the calibration of hand torque tools, calibration laboratories currently need a large collection of torque measuring standards to fulfil customers' needs, which is very costly. The development of this type of torque measuring standard proved technically and economically viable, making possible the calibration of hand torque tools over different nominal ranges with a single measurement system that is versatile, efficient and easy to operate.
Abstract:
Advances in information technology and global data availability have opened the door for assessments of sustainable development at a truly macro scale. It is now fairly easy to conduct a study of sustainability using the entire planet as the unit of analysis; this is precisely what this work set out to accomplish. The study began by examining some of the best known composite indicator frameworks developed to measure sustainability at the country level today. Most of these were found to value human development factors and a clean local environment, but to gravely overlook consumption of (remote) resources in relation to nature’s capacity to renew them, a basic requirement for a sustainable state. Thus, a new measuring standard is proposed, based on the Global Sustainability Quadrant approach. In a two‐dimensional plot of nations’ Human Development Index (HDI) vs. their Ecological Footprint (EF) per capita, the Sustainability Quadrant is defined by the area where both dimensions satisfy the minimum conditions of sustainable development: an HDI score above 0.8 (considered ‘high’ human development), and an EF below the fair Earth‐share of 2.063 global hectares per person. After developing methods to identify those countries that are closest to the Quadrant in the present‐day and, most importantly, those that are moving towards it over time, the study tackled the question: what indicators of performance set these countries apart? To answer this, an analysis of raw data, covering a wide array of environmental, social, economic, and governance performance metrics, was undertaken. The analysis used country rank lists for each individual metric and compared them, using the Pearson Product Moment Correlation function, to the rank lists generated by the proximity/movement relative to the Quadrant measuring methods. The analysis yielded a list of metrics which are, with a high degree of statistical significance, associated with proximity to – and movement towards – the Quadrant; most notably: Favorable for sustainable development: use of contraception, high life expectancy, high literacy rate, and urbanization. Unfavorable for sustainable development: high GDP per capita, high language diversity, high energy consumption, and high meat consumption. A momentary gain, but a burden in the long‐run: high carbon footprint and debt. These results could serve as a solid stepping stone for the development of more reliable composite index frameworks for assessing countries’ sustainability.
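As an illustration of the Quadrant-based measure described above, the sketch below flags a country's distance to the Sustainability Quadrant from the HDI and EF thresholds given in the abstract and correlates a metric's country ranks with ranks of proximity to the Quadrant. The distance definition and all names are illustrative assumptions, not the study's exact procedure.

```python
import numpy as np
from scipy.stats import pearsonr, rankdata

HDI_MIN = 0.8   # 'high' human development threshold
EF_MAX = 2.063  # fair Earth-share, global hectares per person

def distance_to_quadrant(hdi, ef):
    """Distance from a country's (HDI, EF) point to the Sustainability
    Quadrant; zero if already inside. The exact distance definition used
    in the study is not given in the abstract, so this is an illustrative
    Euclidean choice."""
    d_hdi = np.maximum(HDI_MIN - hdi, 0.0)
    d_ef = np.maximum(ef - EF_MAX, 0.0)
    return np.hypot(d_hdi, d_ef)

def rank_correlation_with_proximity(metric_values, hdi, ef):
    """Correlate country ranks on a performance metric with ranks of
    proximity to the Quadrant (Pearson correlation applied to rank lists,
    as described in the abstract)."""
    proximity_rank = rankdata(distance_to_quadrant(hdi, ef))
    metric_rank = rankdata(metric_values)
    r, p = pearsonr(metric_rank, proximity_rank)
    return r, p
```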
Abstract:
We have developed a highly sensitive cytolysis test, the fluorolysis assay, as a simple, nonradioactive and inexpensive alternative to the standard 51Cr-release assay. P815 cells were stably transfected with a plasmid expressing the enhanced green fluorescent protein (EGFP) gene. These target cells were coated with or without cognate peptide or anti-CD3 Ab and then incubated with CD8(+) T cells to allow antigen-specific or nonspecific lysis. The degree of target cell lysis was measured using flow cytometry to count the percentage of viable propidium iodide(-) EGFP(+) cells, whose numbers were standardized to a reference number of fluorochrome-linked beads. By using small numbers of target cells (200-800 per reaction) and extended incubation times (up to 2 days), the antigen-specific cytolytic activity of one to two activated CD8(+) T cells of a CTL line could be detected. The redirected fluorolysis assay also measured the activity of very few (≥6) primary CD8(+) T cells following polyclonal activation. Importantly, antigen-specific lysis by small numbers (≥25) of primary CD8(+) T cells could be directly measured ex vivo. This exquisite sensitivity of the fluorolysis assay, at least 8- to 33-fold higher than an optimized 51Cr-release assay, allows in vitro and ex vivo studies of immune responses that would otherwise not be possible due to low CTL numbers or frequencies. © 2002 Elsevier Science B.V. All rights reserved.
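A minimal sketch of how bead-standardized target counts might translate into percent specific lysis; the formula is the conventional cytotoxicity calculation and is assumed here, since the abstract does not spell out the computation.

```python
def standardized_target_count(viable_egfp_pos_events, bead_events,
                              beads_added_per_tube):
    """Absolute count of viable EGFP+ target cells, standardized to a
    known number of fluorochrome-linked reference beads per tube."""
    return viable_egfp_pos_events * beads_added_per_tube / bead_events

def percent_specific_lysis(targets_with_ctl, targets_without_ctl):
    """Specific lysis relative to target-only control tubes.

    This is the usual cytotoxicity formula; the authors' exact
    computation is not given in the abstract.
    """
    return 100.0 * (1.0 - targets_with_ctl / targets_without_ctl)

# Hypothetical example: 120 standardized viable EGFP+ targets remaining
# with CTL versus 400 in the control tube -> 70% specific lysis.
# lysis = percent_specific_lysis(120, 400)
```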
Abstract:
Regional commodity forecasts are being used increasingly in agricultural industries to enhance their risk management and decision-making processes. These commodity forecasts are probabilistic in nature and are often integrated with a seasonal climate forecast system. The climate forecast system is based on a subset of analogue years drawn from the full climatological distribution. In this study we sought to measure forecast quality for such an integrated system. We investigated the quality of a commodity (i.e. wheat and sugar) forecast based on a subset of analogue years in relation to a standard reference forecast based on the full climatological set. We derived three key dimensions of forecast quality for such probabilistic forecasts: reliability, distribution shift, and change in dispersion. A measure of reliability was required to ensure no bias in the forecast distribution. This was assessed via the slope of the reliability plot, which was derived from examination of probability levels of forecasts and associated frequencies of realizations. The other two dimensions related to changes in features of the forecast distribution relative to the reference distribution. The relationship of 13 published accuracy/skill measures to these dimensions of forecast quality was assessed using principal component analysis in case studies of commodity forecasting using seasonal climate forecasting for the wheat and sugar industries in Australia. There were two orthogonal dimensions of forecast quality: one associated with distribution shift relative to the reference distribution and the other associated with relative distribution dispersion. Although the conventional quality measures aligned with these dimensions, none measured both adequately. We conclude that a multi-dimensional approach to assessment of forecast quality is required and that simple measures of reliability, distribution shift, and change in dispersion provide a means for such assessment. The analysis presented was also relevant to measuring quality of probabilistic seasonal climate forecasting systems. The importance of retaining a focus on the probabilistic nature of the forecast and avoiding simplifying, but erroneous, distortions was discussed in relation to applying this new forecast quality assessment paradigm to seasonal climate forecasts. © 2003 Royal Meteorological Society.
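A hedged sketch of the three proposed quality dimensions, assuming simple summary statistics (reliability-plot slope, difference in means for distribution shift, ratio of standard deviations for dispersion change); the study's exact definitions may differ.

```python
import numpy as np

def reliability_slope(forecast_probs, outcomes, n_bins=10):
    """Slope of the reliability plot: forecast probability levels vs.
    observed relative frequencies (a slope near 1 indicates no bias)."""
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    idx = np.clip(np.digitize(forecast_probs, bins) - 1, 0, n_bins - 1)
    levels, freqs = [], []
    for b in range(n_bins):
        mask = idx == b
        if mask.any():
            levels.append(forecast_probs[mask].mean())
            freqs.append(outcomes[mask].mean())
    slope, _ = np.polyfit(levels, freqs, 1)
    return slope

def distribution_shift(forecast_sample, reference_sample):
    """Shift of the forecast distribution relative to the full
    climatological reference (difference in means, illustrative choice)."""
    return np.mean(forecast_sample) - np.mean(reference_sample)

def dispersion_change(forecast_sample, reference_sample):
    """Change in dispersion relative to the reference distribution
    (ratio of standard deviations, illustrative choice)."""
    return np.std(forecast_sample, ddof=1) / np.std(reference_sample, ddof=1)
```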
Abstract:
The use of adhesive joints has increased in recent decades owing to their competitive features compared with traditional joining methods. This work aims to estimate the tensile critical strain energy release rate (GIC) of adhesive joints by the Double-Cantilever Beam (DCB) test. The J-integral is used since it enables obtaining the tensile Cohesive Zone Model (CZM) law. An optical measuring method was developed for assessing the crack tip opening (δn) and adherend rotation (θo). The proposed CZM laws were best approximated by a triangular shape for the brittle adhesive and a trapezoidal shape for the two ductile adhesives.
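The cohesive law follows from differentiating the J-integral with respect to the crack-tip opening, t_n(δn) = dJ_I/dδn. A minimal numerical sketch, assuming measured (δn, J_I) pairs as input and a polynomial fit chosen purely for illustration:

```python
import numpy as np

def cohesive_law_from_J(delta_n, J_I, poly_degree=6):
    """Recover the tensile cohesive (traction-separation) law from
    J-integral measurements: t_n(delta_n) = dJ_I / d(delta_n).

    A polynomial is fitted to the measured J_I(delta_n) curve and then
    differentiated analytically; degree 6 is an arbitrary illustrative
    choice, and smoothing-spline fits are a common alternative.
    """
    coeffs = np.polyfit(delta_n, J_I, poly_degree)
    dcoeffs = np.polyder(coeffs)
    return np.polyval(dcoeffs, delta_n)

# Hypothetical usage with optically measured crack-tip openings (mm) and
# the corresponding J_I values (N/mm):
# t_n = cohesive_law_from_J(delta_n_measured, J_I_measured)
# GIC can then be taken as the plateau (steady-state) value of J_I.
```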
Abstract:
Doxorubicin is an antineoplastic agent active against sarcoma pulmonary metastasis, but its clinical use is hampered by its myelotoxicity and its cumulative cardiotoxicity when administered systemically. This limitation may be circumvented using the isolated lung perfusion (ILP) approach, wherein a therapeutic agent is infused locoregionally after vascular isolation of the lung. The influence of the mode of infusion (anterograde (AG): through the pulmonary artery (PA); retrograde (RG): through the pulmonary vein (PV)) on doxorubicin pharmacokinetics and lung distribution was unknown. Therefore, a simple, rapid and sensitive high-performance liquid chromatography method has been developed to quantify doxorubicin in four different biological matrices (infusion effluent, serum, tissues with low or high levels of doxorubicin). The related compound daunorubicin was used as internal standard (I.S.). Following a single-step protein precipitation of 500 µl samples with 250 µl acetone and 50 µl of 70% zinc sulfate aqueous solution, the supernatant obtained was evaporated to dryness at 60 °C for exactly 45 min under a stream of nitrogen and the solid residue was dissolved in 200 µl of purified water. A 100 µl volume was subjected to HPLC analysis on a Nucleosil 100-5 µm C18 AB column equipped with a guard column (Nucleosil 100-5 µm C6H5 (phenyl) end-capped) using gradient elution of acetonitrile and 1-heptanesulfonic acid 0.2% pH 4: 15/85 at 0 min → 50/50 at 20 min → 100/0 at 22 min → 15/85 at 24 min → 15/85 at 26 min, delivered at 1 ml/min. The analytes were detected by fluorescence with excitation and emission wavelengths set at 480 and 550 nm, respectively. The calibration curves were linear over the range of 2-1000 ng/ml for effluent and plasma matrices, and 0.1-750 µg/g for tissue matrices. The method is precise, with inter-day and intra-day relative standard deviations between 0.5 and 6.7%, and accurate, with inter-day and intra-day deviations between -5.4 and +7.7%. The in vitro stability in all matrices and in processed samples was studied at -80 °C for 1 month and at 4 °C for 48 h, respectively. During initial studies, heparin used as anticoagulant was found to profoundly influence the measurements of doxorubicin in effluents collected from animals under ILP. Moreover, the strong matrix effect observed with tissue samples indicates that it is mandatory to prepare doxorubicin calibration standard samples in biological matrices that best reflect the composition of the samples to be analyzed. This method was successfully applied in animal studies for the analysis of effluent, serum and tissue samples collected from pigs and rats undergoing ILP.
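As a sketch of the internal-standard calibration implied by the method (peak-area ratio of doxorubicin to the daunorubicin I.S. versus nominal concentration, fitted linearly and used to back-calculate unknowns), with hypothetical variable names:

```python
import numpy as np

def fit_calibration(concentrations, analyte_areas, istd_areas):
    """Linear internal-standard calibration: peak-area ratio
    (doxorubicin / daunorubicin) versus nominal concentration.

    Standards should be prepared in the same biological matrix as the
    samples, given the strong matrix effect reported for tissues.
    """
    ratios = np.asarray(analyte_areas) / np.asarray(istd_areas)
    slope, intercept = np.polyfit(concentrations, ratios, 1)
    return slope, intercept

def back_calculate(analyte_area, istd_area, slope, intercept):
    """Concentration of an unknown sample from its peak-area ratio."""
    return (analyte_area / istd_area - intercept) / slope
```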
Abstract:
The objectives of this study were to develop a computerized method to screen for potentially avoidable hospital readmissions using routinely collected data, and a prediction model to adjust rates for case mix. We studied hospital information system data for a random sample of 3,474 inpatients discharged alive in 1997 from a university hospital and the medical records of those (1,115) readmitted within 1 year. The gold standard was set on the basis of the hospital data and medical records: all readmissions were classified as foreseen readmissions, unforeseen readmissions for a new condition, or unforeseen readmissions for a previously known condition. The latter category was submitted to a systematic medical record review to identify the main cause of readmission. Potentially avoidable readmissions were defined as a subgroup of unforeseen readmissions for a previously known condition occurring within an appropriate interval, set to maximize the chance of detecting avoidable readmissions. The computerized screening algorithm was strictly based on routine statistics: diagnosis and procedure coding and admission mode. The prediction was based on a Poisson regression model. There were 454 (13.1%) unforeseen readmissions for a previously known condition within 1 year. Fifty-nine readmissions (1.7%) were judged avoidable, most of them occurring within 1 month, which was the interval used to define potentially avoidable readmissions (n = 174, 5.0%). The intra-sample sensitivity and specificity of the screening algorithm both reached approximately 96%. A higher risk of potentially avoidable readmission was associated with previous hospitalizations, a high comorbidity index, and long length of stay; a lower risk was associated with surgery and delivery. The model offers satisfactory predictive performance and good medical plausibility. The proposed measure could be used as an indicator of inpatient care outcomes. However, the instrument should be validated using other sets of data from various hospitals.
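A minimal sketch of a Poisson regression for case-mix adjustment using the predictors named above; the formula, column names, and use of statsmodels are illustrative assumptions, not the study's actual model specification.

```python
import statsmodels.api as sm
import statsmodels.formula.api as smf

def fit_case_mix_model(df):
    """Poisson regression predicting potentially avoidable readmissions
    from case-mix variables named in the study; column names are
    hypothetical placeholders for the actual routine data items."""
    model = smf.glm(
        "avoidable_readmission ~ prior_hospitalizations + comorbidity_index"
        " + length_of_stay + surgery + delivery",
        data=df,
        family=sm.families.Poisson(),
    )
    return model.fit()

# Expected counts from the fitted model can then be compared with observed
# counts to produce a case-mix-adjusted indicator per hospital or service.
# result = fit_case_mix_model(discharge_data)
# expected = result.predict(discharge_data)
```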
Abstract:
A simple tool to quantify discrepancies between knowledge, concern and fear regarding HIV and AIDS is presented. This tool is based on standard questions available in health surveys. Some results using recent Swiss data are presented, and the method is discussed.
Abstract:
Quality inspection and assurance is a very important step when today's products are sold to markets. As products are produced in vast quantities, the interest in automating quality inspection tasks has increased correspondingly. Quality inspection tasks usually require the detection of deficiencies, defined as irregularities in this thesis. Objects containing regular patterns appear quite frequently in certain industries and sciences, e.g. half-tone raster patterns in the printing industry, crystal lattice structures in solid state physics, and solder joints and components in the electronics industry. In this thesis, the problem of regular patterns and irregularities is described in analytical form and three different detection methods are proposed. All the methods are based on the ability of the Fourier transform to represent regular information compactly. The Fourier transform enables the separation of the regular and irregular parts of an image, but the three methods presented are shown to differ in generality and computational complexity. The need to detect fine and sparse details is common in quality inspection tasks, e.g., locating small fractures in components in the electronics industry or detecting tearing in paper samples in the printing industry. In this thesis, a general definition of such details is given by defining sufficient statistical properties in the histogram domain. The analytical definition allows a quantitative comparison of methods designed for detail detection. Based on the definition, the use of existing thresholding methods is shown to be well motivated. Comparison of thresholding methods shows that minimum error thresholding outperforms the other standard methods. The results are successfully applied to a paper printability and runnability inspection setup. Missing dots from a repeating raster pattern are detected from Heliotest strips and small surface defects from IGT picking papers.
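As a generic illustration of Fourier-based separation of regular and irregular image content (not one of the thesis's three specific methods), the sketch below removes the dominant spectral peaks, which compactly represent the repeating pattern, and inverse-transforms the remainder so that residual energy emphasises defects.

```python
import numpy as np

def irregularity_map(image, keep_fraction=0.01):
    """Suppress the strongest Fourier components of a regular pattern and
    return the magnitude of the inverse transform of what remains.

    `keep_fraction` controls how many of the largest spectral magnitudes
    are treated as the regular part; the value here is arbitrary.
    """
    spectrum = np.fft.fft2(image)
    magnitude = np.abs(spectrum)
    threshold = np.quantile(magnitude, 1.0 - keep_fraction)
    regular_mask = magnitude >= threshold          # dominant periodic peaks
    residual_spectrum = np.where(regular_mask, 0.0, spectrum)
    return np.abs(np.fft.ifft2(residual_spectrum))
```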
Abstract:
This paper describes a measurement system designed to register the displacement of the legs using a two-dimensional laser range sensor with a scanning plane parallel to the ground and to extract gait parameters. In the proposed methodology, the position of the legs is estimated by fitting two circles to the laser points that define their contour, and the gait parameters are extracted by applying a step-line model to the estimated displacement of the legs to reduce uncertainty in the determination of the stance and swing phases of the gait. Results obtained at ranges up to 8 m show that the systematic error in the location of one static leg is lower than 10 mm, with a standard deviation lower than 8 mm; this deviation increases to 11 mm in the case of a moving leg. The proposed measurement system has been applied to estimate the gait parameters of six volunteers in a preliminary walking experiment.
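A minimal sketch of the circle-fitting step using an algebraic (Kasa) least-squares fit, applied once per leg contour; the paper does not state which fitting method was used, so this particular formulation is an assumption.

```python
import numpy as np

def fit_circle(points):
    """Algebraic (Kasa) least-squares circle fit to 2-D laser points.

    Solves x^2 + y^2 = 2*a*x + 2*b*y + c for (a, b, c); the centre is
    (a, b) and the radius is sqrt(c + a^2 + b^2). One such fit per leg
    contour yields the leg position used for gait-parameter extraction.
    """
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    target = x ** 2 + y ** 2
    (a, b, c), *_ = np.linalg.lstsq(A, target, rcond=None)
    radius = np.sqrt(c + a ** 2 + b ** 2)
    return np.array([a, b]), radius
```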
Abstract:
The functional method is a new test theory using a new scoring method that assumes complexity in the test structure and thus takes into account every correlation between factors and items. The main distinctive feature of the functional method is that it models test scores by multiple regression instead of estimating them with simplistic sums of points. To do so, the functional method requires the creation of a hyperspherical measurement space, in which item responses are expressed by their correlation with orthogonal factors. This method has three main qualities. First, measures are expressed in the absolute metric of correlations; therefore, items, scales and persons are expressed in the same measurement space using the same single metric. Second, factors are systematically orthogonal and without errors, which is optimal for predicting other outcomes. Such predictions can be performed to estimate how one would answer other tests, or even to model one's response strategy if it were perfectly coherent. Third, the functional method provides measures of individuals' response validity (i.e., control indices). Herein, we propose a standard procedure to identify whether test results are interpretable and to exclude invalid results caused by various response biases, based on control indices.
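One possible reading of regression-based scoring, sketched under the assumption that item loadings on orthogonal factors are available; the names and the least-squares formulation are illustrative, not the method's actual estimation procedure.

```python
import numpy as np

def regression_scores(responses, loadings):
    """Estimate factor scores by multiple regression instead of sum scores.

    responses: (n_persons, n_items) standardized item responses.
    loadings:  (n_items, n_factors) correlations of items with orthogonal
               factors (the hyperspherical representation).
    With orthogonal, error-free factors, an ordinary least-squares fit per
    person gives the score estimates.
    """
    scores, *_ = np.linalg.lstsq(loadings, responses.T, rcond=None)
    return scores.T  # (n_persons, n_factors)
```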
Abstract:
The RFLP/PCR approach (restriction fragment length polymorphism/polymerase chain reaction) to genotypic mutation analysis described here measures mutations in restriction recognition sequences. Wild-type DNA is restricted before the resistant, mutated sequences are amplified by PCR and cloned. We tested the capacity of this experimental design to isolate a few copies of a mutated sequence of the human c-Ha-ras1 gene from a large excess of wild-type DNA. For this purpose we constructed a 272 bp fragment with 2 mutations in the PvuII recognition sequence 1727-1732 and studied the rescue by RFLP/PCR of a few copies of this 'PvuII mutant standard'. Following amplification with Taq polymerase and cloning into lambda gt10, plaques containing wild-type sequence, PvuII mutant standard or Taq polymerase-induced bp changes were quantitated by hybridization with specific oligonucleotide probes. Our results indicate that 10 PvuII mutant standard copies can be rescued from 10^8 to 10^9 wild-type sequences. Taq polymerase errors originating from unrestricted, residual wild-type DNA were sequence dependent and consisted mostly of transversions originating at G:C bp. In contrast to a doubly mutated 'standard', the capacity to rescue single bp mutations by RFLP/PCR is limited by Taq polymerase errors. Therefore, we assessed the capacity of our protocol to isolate a G to T transversion mutation at base pair 1698 of the MspI site 1695-1698 of the c-Ha-ras1 gene from excess wild-type ras1 DNA. We found that 100 copies of the mutated ras1 fragment could be readily rescued from 10^8 copies of wild-type DNA.
Abstract:
This review presents the evolution of steroid analytical techniques, including gas chromatography coupled to mass spectrometry (GC-MS), immunoassay (IA) and targeted liquid chromatography coupled to mass spectrometry (LC-MS), and it evaluates the potential of extended steroid profiles by a metabolomics-based approach, namely steroidomics. Steroids regulate essential biological functions including growth and reproduction, and perturbations of the steroid homeostasis can generate serious physiological issues; therefore, specific and sensitive methods have been developed to measure steroid concentrations. GC-MS measuring several steroids simultaneously was considered the first historical standard method for analysis. Steroids were then quantified by immunoassay, allowing a higher throughput; however, major drawbacks included the measurement of a single compound instead of a panel and cross-reactivity reactions. Targeted LC-MS methods with selected reaction monitoring (SRM) were then introduced for quantifying a small steroid subset without the problems of cross-reactivity. The next step was the integration of metabolomic approaches in the context of steroid analyses. As metabolomics tends to identify and quantify all the metabolites (i.e., the metabolome) in a specific system, appropriate strategies were proposed for discovering new biomarkers. Steroidomics, defined as the untargeted analysis of the steroid content in a sample, was implemented in several fields, including doping analysis, clinical studies, in vivo or in vitro toxicology assays, and more. This review discusses the current analytical methods for assessing steroid changes and compares them to steroidomics. Steroids, their pathways, their implications in diseases and the biological matrices in which they are analysed will first be described. Then, the different analytical strategies will be presented with a focus on their ability to obtain relevant information on the steroid pattern. The future technical requirements for improving steroid analysis will also be presented.