891 results for Units of measurement.


Relevance:

100.00%

Publisher:

Abstract:

Purpose A better understanding of vertebral mechanical properties can help improve the diagnosis of vertebral fractures. As bone mechanical competence depends not only on bone mineral density (BMD) but also on bone quality, the goal of the present study was to investigate the anisotropic indentation moduli of the different sub-structures of the healthy human vertebral body and of spondylophytes by means of microindentation. Methods Six human vertebral bodies and five osteophytes (spondylophytes) were collected and prepared for microindentation testing. Indentations were performed on bone structural units of the cortical shell (along the axial, circumferential and radial directions), of the endplates (along the antero-posterior and lateral directions), of the trabecular bone (along the axial and transverse directions) and of the spondylophytes (along the axial direction). A total of 3164 indentations down to a maximum depth of 2.5 µm were performed, and the indentation modulus was computed for each measurement. Results The cortical shell showed orthotropic behavior (indentation modulus, Ei, higher along the axial direction, 14.6±2.8 GPa, than along the circumferential, 12.3±3.5 GPa, and radial, 8.3±3.1 GPa, directions). The cortical endplates (similar Ei along the antero-posterior, 13.0±2.9 GPa, and lateral, 12.0±3.0 GPa, directions) and the trabecular bone (Ei = 13.7±3.4 GPa along the axial direction versus Ei = 10.9±3.7 GPa along the transverse one) showed transversely isotropic behavior. Furthermore, the spondylophytes showed the lowest mechanical properties, measured along the axial direction (Ei = 10.5±3.3 GPa). Conclusions The original results presented in this study improve our understanding of vertebral biomechanics and can help define the material properties of the vertebral substructures in computational models such as FE analysis.
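The abstract does not list its analysis equations, but indentation moduli of this kind are commonly extracted with an Oliver-Pharr-type analysis. The sketch below uses entirely hypothetical stiffness and contact-area values and standard diamond-indenter constants; it is an illustration of the usual two steps, not the study's actual procedure.

```python
# Oliver-Pharr-style extraction of an indentation modulus (illustrative only).
# S  = unloading contact stiffness (N/m), Ac = projected contact area (m^2);
# both values below are hypothetical, chosen to land in a bone-like range.
import math

def reduced_modulus(S, Ac, beta=1.0):
    """Er = sqrt(pi) * S / (2 * beta * sqrt(Ac))."""
    return math.sqrt(math.pi) * S / (2.0 * beta * math.sqrt(Ac))

def indentation_modulus(Er, E_indenter=1141e9, nu_indenter=0.07):
    """Remove the diamond indenter's compliance: 1/Er = 1/M + (1 - nu_i^2)/E_i."""
    return 1.0 / (1.0 / Er - (1.0 - nu_indenter**2) / E_indenter)

Er = reduced_modulus(S=2e5, Ac=1e-10)
print(round(indentation_modulus(Er) / 1e9, 1))  # plane-strain modulus in GPa
```

With these made-up inputs the result is of the same order (tens of GPa) as the moduli reported above, which is the point of the sanity check.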


PURPOSE: Awareness of being monitored can influence participants' habitual physical activity (PA) behavior. This reactivity effect may threaten the validity of PA assessment. Reports on reactivity when measuring the PA of children and adolescents have been inconsistent. The aim of this study was to investigate whether PA outcomes measured by accelerometer devices differ from measurement day to measurement day and whether the day of the week and the day on which measurement started influence these differences. METHODS: Accelerometer data (counts per minute [cpm]) of children and adolescents (n = 2081), pooled from eight studies in Switzerland with at least 10 h of daily valid recording, were investigated for effects of measurement day, day of the week, and start day using mixed linear regression. RESULTS: The first measurement day was the most active day: counts per minute were significantly higher than on the second to sixth days, but not the seventh. Differences in the age-adjusted means between the first and consecutive days ranged from 23 to 45 cpm (3.6%-7.1%). In preschool children, the differences almost reached 10%. The start day significantly influenced PA outcome measures. CONCLUSIONS: Reactivity to accelerometer measurement of PA is likely to be present to an extent of approximately 5% on the first day and may introduce a relevant bias into accelerometer-based studies. The effects are larger in preschool children than in elementary and secondary schoolchildren. As the day of the week and the start day significantly influence PA estimates, researchers should plan for at least one familiarization day in school-age children and randomly assign start days.


BACKGROUND AND OBJECTIVES Multiple-breath washout (MBW) is an attractive test to assess ventilation inhomogeneity, a marker of peripheral lung disease. Standardization of MBW is hampered because few data exist on possible measurement bias. We aimed to identify potential sources of measurement bias arising from MBW software settings. METHODS We used unprocessed data from nitrogen (N2) MBW (Exhalyzer D, Eco Medics AG) applied in 30 children aged 5-18 years: 10 with CF, 10 born preterm, and 10 healthy controls. This setup calculates the tracer gas N2 mainly from the measured O2 and CO2 concentrations. The following software settings for MBW signal processing were changed by at least 5 units or >10% in both directions, or completely switched off: (i) environmental conditions, (ii) apparatus dead space, (iii) O2 and CO2 signal correction, and (iv) signal alignment (delay time). The primary outcome was the change in lung clearance index (LCI) compared to LCI calculated with the recommended settings. A change in LCI exceeding 10% was considered relevant. RESULTS Changes in both environmental and dead space settings resulted in uniform but modest LCI changes, exceeding 10% in only two measurements. Changes in signal alignment and O2 signal correction had the most relevant impact on LCI. Decreasing the O2 delay time by 40 ms (7%) led to a mean LCI increase of 12%, with >10% LCI change in 60% of the children. Increasing the O2 delay time by 40 ms resulted in a mean LCI decrease of 9%, with LCI changing >10% in 43% of the children. CONCLUSIONS Accurate LCI results depend crucially on signal processing settings in MBW software. Incorrect signal delay times in particular are a possible source of erroneous LCI measurements. Algorithms for signal processing and signal alignment should thus be optimized to avoid susceptibility of MBW measurements to this significant measurement bias.
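The alignment sensitivity can be illustrated in a toy form. The sketch below uses synthetic step signals, a hypothetical 5-ms sampling interval, and a deliberately simplified tracer estimate (N2 = 1 - O2 - CO2, ignoring argon); it is not the Exhalyzer D algorithm. Delaying only the O2 channel changes the N2 signal derived from it.

```python
# Toy demonstration: a delay error on the O2 channel biases the derived N2 signal.
import numpy as np

t = np.arange(0.0, 2.0, 0.005)                 # 5-ms samples (hypothetical)
o2 = 0.21 + 0.58 * (t > 1.0)                   # O2 steps up as it washes in
co2 = np.where(t <= 1.0, 0.05, 0.0)
n2 = 1.0 - o2 - co2                            # simplified tracer estimate

def delayed(signal, n):
    """Shift a channel n samples later, padding with its first value."""
    return np.concatenate([np.full(n, signal[0]), signal[:-n]])

n2_misaligned = 1.0 - delayed(o2, 8) - co2     # 8 samples = 40 ms delay error
print(n2.sum() * 0.005, n2_misaligned.sum() * 0.005)  # apparent N2 integrals
```

The 40-ms misalignment inflates the integrated N2 signal, which is the mechanism by which a delay error propagates into LCI.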


When compared to other types of occupational injuries, radiation overexposure events are relatively rare, so health care providers may not be familiar with the clinical care to be provided when such an event occurs. Radiation overexposure treatment decisions are predicated on the amount of radiation dose received, a value many health care providers may not have the knowledge or expertise to calculate or even estimate. Even the different units of measure for radiation exposure and dose received can be a source of confusion. The prompt treatment of radiation overexposure victims could be enhanced and facilitated through a single, simple protocol that brings together the various means of dose measurement and estimation, correlated with the corresponding appropriate clinical care measures. This culminating experience assembles essential information currently maintained in disparate references into a single, simplified protocol to facilitate the treatment of victims of acute external radiation overexposure.
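As one example of the unit confusion mentioned above, SI and legacy dose units differ by fixed factors. The sketch below encodes only those conversion factors (no clinical guidance): 1 Gy = 100 rad for absorbed dose, 1 Sv = 100 rem for equivalent dose, and equivalent dose = absorbed dose × radiation weighting factor.

```python
# Conversions among common radiation dose units (SI vs. legacy).
def rad_to_gray(rad):
    return rad / 100.0          # absorbed dose: 1 Gy = 100 rad

def rem_to_sievert(rem):
    return rem / 100.0          # equivalent dose: 1 Sv = 100 rem

def absorbed_to_equivalent(gray, weighting_factor=1.0):
    """Equivalent dose (Sv) = absorbed dose (Gy) x radiation weighting factor.
    w_R = 1 for photons and electrons; higher for neutrons and alpha particles."""
    return gray * weighting_factor

print(rad_to_gray(450))                   # 4.5 (Gy)
print(absorbed_to_equivalent(2.0, 20.0))  # 40.0 (Sv, alpha radiation w_R = 20)
```

A protocol of the kind proposed here would pair such dose values with the corresponding care measures; the code only shows why the unit arithmetic must come first.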


Bulk dissolution rates for sediment from ODP Site 984A in the North Atlantic are determined using the 234U/238U activity ratios of pore water, bulk sediment, and leachates. Site 984A is one of only a few sites where closely spaced pore-water samples were obtained from the upper 60 meters of the core; the sedimentation rate is high (11-15 cm/ka), hence the sediments in the upper 60 meters are less than 500 ka old. The sediment is clayey silt composed mostly of detritus derived from Iceland with a significant component of biogenic carbonate (up to 30%). The pore-water 234U/238U activity ratios are higher than seawater values, in the range of 1.2 to 1.6, while the bulk-sediment 234U/238U activity ratios are close to 1.0. The 234U/238U of the pore water reflects a balance between the mineral dissolution rate and the supply rate of excess 234U to the pore fluid by α-recoil injection of 234Th. The fraction of 238U decays that result in α-recoil injection of 234U into the pore fluid is estimated to be 0.10 to 0.20, based on the 234U/238U of insoluble residue fractions. The calculated bulk dissolution rates are in the range of 4 × 10^-7 to 2 × 10^-6 g/g/yr (equivalently, 1/yr). There is significant down-hole variability in pore-water 234U/238U activity ratios (and hence dissolution rates) on a scale of ca. 10 m. The inferred bulk dissolution rate constants are 100 to 1000 times slower than laboratory-determined rates, 100 times faster than rates inferred for older sediments based on Sr isotopes, and similar to weathering rates determined for terrestrial soils of similar age. The results of this study suggest that U isotopes can be used to measure in situ dissolution rates in fine-grained clastic materials.
The rate estimates for sediments from ODP Site 984 confirm the strong dependence of reactivity on the age of the solid material: the bulk dissolution rate R_d of soils and deep-sea sediments can be approximately described by the expression R_d ≈ 0.1/age (1/yr) for ages spanning 1000 to 500,000,000 yr. The age of the material, which subsumes the grain size, surface area, and other chemical factors that contribute to the rate of dissolution, appears to be a much stronger determinant of dissolution rate than any single physical or chemical property of the system.
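The age-dependence relation reported above can be checked numerically; a minimal sketch (the two ages below are chosen within the <500 ka sediment column, not taken from the paper):

```python
# R_d ~ 0.1 / age (1/yr): bulk dissolution rate as a function of material age.
def bulk_dissolution_rate(age_yr):
    return 0.1 / age_yr

for age in (5e4, 2.5e5):
    print(f"age = {age:.0e} yr -> R_d = {bulk_dissolution_rate(age):.1e} 1/yr")
```

Ages of 50 ka and 250 ka give rates of 2 × 10^-6 and 4 × 10^-7 1/yr, bracketing the measured range for the Site 984A sediments, which is what makes the single-parameter expression a useful summary.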


Large-scale transport infrastructure projects such as high-speed rail (HSR) produce significant effects on the spatial distribution of accessibility. These effects, commonly known as territorial cohesion effects, are receiving increasing attention in the research literature. However, there is little empirical research into the sensitivity of these cohesion results to methodological issues such as the definition of the limits of the study area or the zoning system. In a previous paper (Ortega et al., 2012), we investigated the influence of scale issues by comparing the cohesion results obtained at four different planning levels. This paper makes an additional contribution by investigating the influence of zoning issues. We analyze the extent to which changes in the size of the units of analysis influence the measurement of spatial inequalities. The methodology is tested by application to the Galician (north-western) HSR corridor, with a length of nearly 670 km, included in the Spanish PEIT (Strategic Transport and Infrastructure Plan) 2005-2020. We calculated the accessibility indicators for the Galician HSR corridor and assessed their corresponding territorial distribution, using five alternative zoning systems that differ in the method of data representation (vector or raster) and the level of detail (cartographic accuracy or cell size). Our results suggest that the choice between a vector-based and a raster-based system has important implications: the vector system produces a higher mean accessibility value and a more polarized accessibility distribution than the raster systems, and increasing the pixel size of raster-based systems tends to give higher mean accessibility values and a more balanced accessibility distribution. Our findings strongly encourage spatial analysts to acknowledge that the results of their analyses may vary widely according to the definition of the units of analysis.


This paper presents an approach to create what we have called a Unified Sentiment Lexicon (USL). The approach aims at aligning, unifying, and expanding the set of sentiment lexicons available on the web in order to increase the robustness of their coverage. One problem in automatically unifying the scores of different sentiment lexicons is that there are multiple lexical entries whose classification as positive, neutral, or negative {P, Z, N} depends on the unit of measurement used in the annotation methodology of the source lexicon. Our USL approach computes the unified strength of polarity of each lexical entry based on the Pearson correlation coefficient, which measures how correlated lexical entries are, with a value between -1 and 1: 1 indicates that the entries are perfectly correlated, 0 indicates no correlation, and -1 means they are perfectly inversely correlated; the UnifiedMetrics procedure is implemented for both CPU and GPU. Another problem is the high processing time required to compute all the lexical entries in the unification task. The USL approach therefore computes a subset of lexical entries on each of the 1,344 GPU cores and uses parallel processing to unify 155,802 lexical entries. The resulting USL has 95,430 lexical entries, of which 35,201 are classified as positive, 22,029 as negative, and 38,200 as neutral. The runtime was 10 minutes for the 95,430 entries, a threefold reduction in computation time for UnifiedMetrics.
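A minimal sketch of the correlation step described above, assuming two source lexicons score a shared set of entries on a [-1, 1] polarity scale (the entries and scores below are hypothetical, and this is plain Pearson's r, not the full UnifiedMetrics procedure):

```python
# Pearson's r between the polarity scores two lexicons assign to shared entries.
import math

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

lexicon_a = [0.9, -0.7, 0.1, 0.6, -0.8]   # hypothetical scores, lexicon A
lexicon_b = [0.8, -0.6, 0.0, 0.7, -0.9]   # same entries scored by lexicon B
print(round(pearson(lexicon_a, lexicon_b), 3))  # close to +1: well aligned
```

An r close to +1 signals that the two sources agree up to scale, so their scores can be unified directly; an r near -1 would mean one source's scale is inverted relative to the other.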


The human adult α-globin locus consists of three pairs of homology blocks (X, Y, and Z) interspersed with three nonhomology blocks (I, II, and III), and three Alu family repeats, Alu1, Alu2, and Alu3. It has been suggested that an ancient primate α-globin-containing unit was ancestral to the X, Y, and Z and the Alu1/Alu2 repeats. However, the evolutionary origin of the three nonhomologous blocks has remained obscure. We have now analyzed the sequence organization of the entire adult α-globin locus of gibbon (Hylobates lar). DNA segments homologous to human block I occur in both duplication units of the gibbon α-globin locus. Detailed interspecies sequence comparisons suggest that nonhomologous blocks I and II, as well as another sequence, IV, were all part of the ancestral α-globin-containing unit prior to its tandem duplication. However, sometime thereafter, block I was deleted from the human α1-globin-containing unit, and block II was also deleted from the α2-globin-containing unit in both human and gibbon. These were probably independent events both mediated by independent illegitimate recombination processes. Interestingly, the end points of these deletions coincide with potential insertion sites of Alu family repeats. These results suggest that the shaping of DNA segments in eukaryotic genomes involved the retroposition of repetitive DNA elements in conjunction with simple DNA recombination processes.


A reduced set of measurement geometries allows the spectral reflectance of special effect coatings to be predicted for any other geometry. A physical model based on flake-related parameters has been used to determine nonredundant measurement geometries for the complete description of the spectral bidirectional reflectance distribution function (BRDF). The analysis of experimental spectral BRDF was carried out by means of principal component analysis. From this analysis, a set of nine measurement geometries was proposed to characterize special effect coatings. It was shown that, for two different special effect coatings, these geometries provide a good prediction of their complete color shift.
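The analysis pipeline can be illustrated on synthetic data (the spectra below are made up, not the measured BRDF of any coating): stack one reflectance spectrum per measurement geometry into a matrix, run PCA, and check how few components carry the variance. Few dominant components is what justifies a reduced set of geometries.

```python
# PCA on a (geometries x wavelengths) reflectance matrix via SVD (illustrative).
import numpy as np

rng = np.random.default_rng(0)
wavelengths = np.linspace(400, 700, 31)             # nm, visible range
# Synthetic spectra: each geometry mixes two underlying "flake" spectra.
basis = np.stack([np.exp(-((wavelengths - 480) / 60) ** 2),
                  np.exp(-((wavelengths - 620) / 60) ** 2)])
weights = rng.random((20, 2))                       # 20 hypothetical geometries
spectra = weights @ basis + 0.01 * rng.standard_normal((20, 31))

centered = spectra - spectra.mean(axis=0)
_, s, _ = np.linalg.svd(centered, full_matrices=False)
explained = (s ** 2) / (s ** 2).sum()               # variance per component
print(explained[:3])  # the first components dominate -> few geometries suffice
```

In the paper's setting the same logic, applied to measured spectral BRDF, led to the nine-geometry proposal.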



The reliability of measurement refers to unsystematic error in observed responses. Investigations of the prevalence of random error in stated estimates of willingness to pay (WTP) are important to understanding why tests of validity in contingent valuation (CV) can fail. However, published reliability studies have tended to adopt empirical methods that have practical and conceptual limitations when applied to WTP responses. This contention is supported by a review of contingent valuation reliability studies that demonstrates important limitations of existing approaches to WTP reliability. It is argued that empirical assessments of the reliability of contingent values may be better dealt with by using multiple indicators to measure the latent WTP distribution. This latent variable approach is demonstrated with data obtained from a WTP study for stormwater pollution abatement. Attitude variables were employed as a way of assessing the reliability of open-ended WTP (with benchmarked payment cards) for stormwater pollution abatement. The results indicated that participants' decisions to pay were reliably measured, but not the magnitude of the WTP bids. This finding highlights the need to better discern what is actually being measured in WTP studies. © 2003 Elsevier B.V. All rights reserved.


Despite current imperatives to measure client outcomes, social workers have expressed frustration with the limited ability of traditional quantitative methods to engage with complexity, individuality and meaning. This paper argues that the inclusion of a meaning-based, as opposed to a function-based, approach to quality of life (QOL) may offer a quantitative means of measurement that is congruent with social-work values and practice.


As the Intensive Care Unit (ICU) is one of the vital areas of a hospital providing clinical care, the quality of service rendered must be monitored and measured quantitatively. It is therefore essential to know the performance of an ICU in order to identify any deficits and enable the service providers to improve the quality of service. Although there have been many attempts to do this with the help of illness severity scoring systems, the relative lack of success of these methods has led to the search for a form of measurement that encompasses all the different aspects of an ICU in a holistic manner. The Analytic Hierarchy Process (AHP), a multiple-attribute decision-making technique, is utilised in this study to develop a system for measuring the performance of ICU services reliably. The tool was applied to a surgical ICU in Barbados; we recommend AHP as a valuable tool to quantify the performance of an ICU. Copyright © 2004 Inderscience Enterprises Ltd.
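A minimal sketch of the AHP machinery the abstract refers to, with entirely hypothetical criteria and judgments (not the paper's ICU model): priority weights are taken from the principal eigenvector of a pairwise comparison matrix, and a consistency ratio checks the judgments.

```python
# AHP: priority vector and consistency ratio from a pairwise comparison matrix.
import numpy as np

# Hypothetical 3-criteria comparison on Saaty's 1-9 scale, e.g. clinical
# outcome vs. resource use vs. patient/family satisfaction.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)                 # principal (Perron) eigenvalue
w = np.abs(vecs[:, k].real)
w /= w.sum()                             # priority vector, sums to 1

n = A.shape[0]
ci = (vals.real[k] - n) / (n - 1)        # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]      # Saaty's random index
print(w, ci / ri)                        # CR < 0.1 => acceptably consistent
```

With these judgments the first criterion dominates and the consistency ratio falls well under 0.1; in the study, such weights would be combined with sub-criterion scores to yield a single ICU performance figure.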


In the present paper we numerically study the instrumental impact on the statistical properties of a quasi-CW Raman fiber laser using a simple model of multimode laser radiation. The effects with the most influence are the limited electrical bandwidth of the measurement equipment and noise. To check this influence, we developed a simple model of multimode quasi-CW generation with exponential statistics (i.e., uncorrelated modes). We found that the area near zero intensity in the probability density function (PDF) is strongly affected by both factors; for example, both lead to the formation of a negative wing in the intensity distribution. The far-wing slope of the PDF is not affected by noise and, for moderate mismatch between optical and electrical bandwidth, is only slightly affected by bandwidth limitation. The generation spectrum often becomes broader at higher power in experiments, so the spectral/electrical bandwidth mismatch increases with power, which can lead to an artificial dependence of the PDF slope on power. It was also found that both effects influence the ACF background level: noise decreases it, while limited bandwidth increases it. © 2014 Society of Photo-Optical Instrumentation Engineers (SPIE).
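The modelling idea can be reproduced in toy form (synthetic data; a moving average stands in for the limited electrical bandwidth, and added Gaussian noise for detector noise): the intensity of a field with many uncorrelated modes is exponentially distributed, and both instrumental effects distort the measured distribution.

```python
# Toy model: exponential intensity statistics distorted by bandwidth and noise.
import numpy as np

rng = np.random.default_rng(1)
n_samples = 200_000
# Field with complex Gaussian statistics (limit of many uncorrelated modes).
field = rng.standard_normal(n_samples) + 1j * rng.standard_normal(n_samples)
intensity = np.abs(field) ** 2 / 2                 # exponential PDF, mean 1

# Limited electrical bandwidth, modelled as an 8-sample moving average.
filtered = np.convolve(intensity, np.ones(8) / 8, mode="same")
# Additive detector noise.
noisy = intensity + 0.2 * rng.standard_normal(n_samples)

print(intensity.min() >= 0, noisy.min() < 0)   # noise creates a negative wing
print(intensity.var(), filtered.var())         # filtering narrows the PDF
```

The true intensity is never negative, yet the noisy record is, reproducing the negative wing; the reduced variance after filtering is the bandwidth-limitation distortion of the PDF discussed above.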


As the largest source of dimensional measurement uncertainty, addressing the challenges of thermal variation is vital to ensure product and equipment integrity in the factories of the future. While it is possible to closely control room temperature, this is often not practical or economical to realise in all cases where inspection is required. This article reviews recent progress and trends in seven key commercially available industrial temperature measurement sensor technologies primarily in the range of 0 °C–50 °C for invasive, semi-invasive and non-invasive measurement. These sensors will ultimately be used to measure and model thermal variation in the assembly, test and integration environment. The intended applications for these technologies are presented alongside some consideration of measurement uncertainty requirements with regard to the thermal expansion of common materials. Research priorities are identified and discussed for each of the technologies as well as temperature measurement at large. Future developments are briefly discussed to provide some insight into which direction the development and application of temperature measurement technologies are likely to head.
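The measurement-uncertainty requirement mentioned above follows from the linear thermal expansion relation ΔL = α·L·ΔT, which links a temperature uncertainty to a dimensional one. The coefficients below are typical handbook values per kelvin, not figures from this article.

```python
# Dimensional uncertainty from temperature uncertainty: dL = alpha * L * dT.
ALPHA = {"aluminium": 23e-6, "steel": 12e-6, "invar": 1.2e-6}  # 1/K, typical

def expansion_um(material, length_m, delta_t_k):
    """Length change in micrometres for a temperature change delta_t_k."""
    return ALPHA[material] * length_m * delta_t_k * 1e6

# A 1 K uncertainty on a 1 m aluminium part gives roughly 23 um of length
# uncertainty -- far larger than micrometre-level assembly tolerances.
print(round(expansion_um("aluminium", 1.0, 1.0), 1))
print(round(expansion_um("invar", 1.0, 1.0), 1))
```

This is why sub-0.1 K temperature measurement (or low-expansion materials such as Invar) is needed before micrometre-level dimensional claims are meaningful outside a controlled metrology lab.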