913 results for Physiochemical measurements


Relevance: 20.00%

Abstract:

With the twin aims of monitoring social change and improving social measurement, the European Social Survey (ESS) is now closing its third round. This paper shows how the accumulated experience of the first two rounds has been used to validate the questionnaire, better adapt the sampling design to country characteristics and carry out fieldwork in Spain efficiently. For example, the dynamic character of today's population makes it necessary to estimate design effects at each round from the data of the previous round. The paper also demonstrates how, starting from a 52% response rate in the first round, a 66% response rate was achieved in the third round thanks to extensive quality control conducted by the polling agency and the ESS national team, based on a detailed analysis of the non-response cases and of the incidents recorded by interviewers in the contact forms.
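Design effects of this kind are typically estimated with the Kish approximation; the following sketch is illustrative only (the cluster sizes, intra-cluster correlation and sample size are invented, not the ESS Spain figures):

```python
import numpy as np

def kish_design_effect(cluster_sizes, rho):
    """Kish approximation deff = 1 + (b_bar - 1) * rho, with b_bar the
    mean cluster size and rho the intra-cluster correlation, both
    estimated from the previous round's data."""
    b_bar = np.mean(cluster_sizes)
    return 1.0 + (b_bar - 1.0) * rho

# Invented values: ~4 interviews per sampling point, rho = 0.05.
deff = kish_design_effect([4, 5, 3, 4], rho=0.05)
print(f"deff = {deff:.3f}, effective n for 2000 interviews = {2000 / deff:.0f}")
```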

Relevance: 20.00%

Abstract:

We consider two fundamental properties in the analysis of two-way tables of positive data: the principle of distributional equivalence, one of the cornerstones of correspondence analysis of contingency tables, and the principle of subcompositional coherence, which forms the basis of compositional data analysis. For an analysis to be subcompositionally coherent, it suffices to analyse the ratios of the data values. The usual approach to dimension reduction in compositional data analysis is to perform principal component analysis on the logarithms of ratios, but this method does not obey the principle of distributional equivalence. We show that by introducing weights for the rows and columns, the method achieves this desirable property. This weighted log-ratio analysis is theoretically equivalent to spectral mapping, a multivariate method developed almost 30 years ago for displaying ratio-scale data from biological activity spectra. The close relationship between spectral mapping and correspondence analysis is also explained, as well as their connection with association modelling. The weighted log-ratio methodology is applied here to frequency data in linguistics and to chemical compositional data in archaeology.
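A minimal sketch of weighted log-ratio analysis as described (row and column weights taken from the table margins, followed by an SVD of the weighted double-centred log matrix); the toy table and variable names are illustrative, not the paper's data:

```python
import numpy as np

def weighted_lra(N):
    """Weighted log-ratio analysis of a two-way table of positive
    values N: row/column weights from the margins restore the
    principle of distributional equivalence."""
    P = N / N.sum()
    r = P.sum(axis=1)                     # row masses
    c = P.sum(axis=0)                     # column masses
    L = np.log(P)
    Lc = (L - (r @ L)[None, :]            # remove weighted column means
            - (L @ c)[:, None]            # remove weighted row means
            + r @ L @ c)                  # restore weighted grand mean
    S = np.sqrt(r)[:, None] * Lc * np.sqrt(c)[None, :]
    U, s, Vt = np.linalg.svd(S, full_matrices=False)
    rows = U * s / np.sqrt(r)[:, None]    # principal row coordinates
    return rows, s**2 / (s**2).sum()      # coordinates, variance shares

# Toy compositional table (e.g., chemical compositions of 4 samples).
N = np.array([[10., 20., 30.],
              [15., 25., 20.],
              [30., 10., 15.],
              [20., 20., 25.]])
coords, var_share = weighted_lra(N)
print(np.round(var_share, 3))
```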

Relevance: 20.00%

Abstract:

Given the adverse impact of image noise on the perception of important clinical details in digital mammography, routine quality control measurements should include an evaluation of noise. The European Guidelines, for example, employ a second-order polynomial fit of pixel variance as a function of detector air kerma (DAK) to decompose noise into quantum, electronic and fixed-pattern (FP) components and to assess the DAK range in which quantum noise dominates. This work examines the robustness of the polynomial method against an explicit noise decomposition method. The two methods were applied to variance and noise power spectrum (NPS) data from six digital mammography units. Twenty homogeneously exposed images were acquired with PMMA blocks for target DAKs ranging from 6.25 to 1600 µGy. For both methods, the effects of data weighting and of squared fit coefficients during curve fitting were explored, together with the influence of the additional filter material (2 mm Al versus 40 mm PMMA) and of noise de-trending. Finally, the spatial stationarity of the noise was assessed. Data weighting improved noise model fitting over large DAK ranges, especially at low detector exposures. The polynomial and explicit decompositions generally agreed for quantum and electronic noise, but the FP noise fraction was consistently underestimated by the polynomial method. Noise decomposition as a function of position in the image showed limited noise stationarity, especially for FP noise; thus the position of the region of interest (ROI) used for noise decomposition may influence the fractional noise composition. The ROI area and position used in the Guidelines offer an acceptable estimation of the noise components. While the polynomial model has limitations, when used with care and with appropriate data weighting it offers a simple and robust means of examining the detector noise components as a function of detector exposure.
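The second-order polynomial model referred to can be written as variance = e + q·DAK + f·DAK², with the constant, linear and quadratic terms giving the electronic, quantum and fixed-pattern components. A sketch with synthetic data (the noise coefficients and scatter are invented):

```python
import numpy as np

rng = np.random.default_rng(0)

# Target detector air kerma levels (uGy), plus synthetic pixel
# variances built from an assumed noise model:
# variance = electronic + quantum*DAK + fixed_pattern*DAK^2.
dak = np.array([6.25, 12.5, 25., 50., 100., 200., 400., 800., 1600.])
e_true, q_true, f_true = 5.0, 0.8, 4e-4
var = (e_true + q_true * dak + f_true * dak**2) \
      * rng.normal(1.0, 0.02, dak.size)        # 2% measurement scatter

# Weighted second-order polynomial fit; 1/variance weights keep the
# low-exposure points from being swamped, which is the data-weighting
# issue the text raises.
f_fit, q_fit, e_fit = np.polyfit(dak, var, deg=2, w=1.0 / var)

d = 100.0                                       # evaluate at 100 uGy
parts = np.array([e_fit, q_fit * d, f_fit * d**2])
print("electronic/quantum/fixed-pattern fractions at 100 uGy:",
      np.round(parts / parts.sum(), 3))
```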

Relevance: 20.00%

Abstract:

Calibration of TDR measurements for determining the moisture content of cultivated peat soils.
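TDR calibration of this kind is commonly expressed as a third-order polynomial linking apparent dielectric permittivity to volumetric water content, in the spirit of Topp et al. (1980); peat soils generally need their own coefficients, which is presumably the point of the recalibration. A sketch with hypothetical calibration pairs:

```python
import numpy as np

# Hypothetical calibration pairs for a cultivated peat soil: apparent
# dielectric permittivity (from TDR) vs volumetric water content
# (from oven-dried reference samples).
eps   = np.array([10., 18., 26., 35., 45., 58.])
theta = np.array([0.25, 0.38, 0.48, 0.57, 0.65, 0.74])

# Third-order polynomial fit, theta = a + b*eps + c*eps^2 + d*eps^3,
# the functional form used by Topp et al. for mineral soils.
coeffs = np.polyfit(eps, theta, deg=3)
predict = np.poly1d(coeffs)
print(f"theta at eps = 30: {predict(30.0):.3f}")
```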

Relevance: 20.00%

Abstract:

Canadian healthcare is changing. Over the course of the past decade, the Health Care in Canada Survey (HCIC) has annually measured the reactions of the public and professional stakeholders to many of these change forces. In HCIC 2008, for the first time, the public's perception of their health status and all stakeholders' views of the burden and effective management of chronic diseases were sought. Overall, Canadians perceive themselves as healthy, with 84% of adults reporting good-to-excellent health. However, good health decreased with age as the occurrence of chronic illness rose, from 12% in the age group 18-24 to 65% in the population aged ≥65 years. More than 70% of all stakeholders were strongly or somewhat supportive of the implementation of coordinated care, or disease management programs, to improve the care of patients with chronic illnesses. Concordant support was also expressed for key disease management components, including coordinated interventions to improve home, community and self-care; increased wellness promotion; and increased use of clinical measurements and feedback to all stakeholders. However, there were also important areas of non-concordance. For example, the public and doctors consistently expressed less support than other stakeholders for the value of team care, including the use of non-physician professionals to provide patient care; increased patient involvement in decision-making; and the use of electronic health records to facilitate communication. Actual participation in disease management programs averaged 34% for professionals and 25% for the public. We conclude that chronic diseases are common, age-related and burdensome in Canada. Disease management, or coordinated intervention, often delivered by teams, is also relatively common, despite its less-than-universal acceptance by all stakeholders. Further insights are needed, particularly into the variable perceptions of the value and efficacy of team-delivered healthcare and its important components.

Relevance: 20.00%

Abstract:

BACKGROUND: Hyperoxaluria is a major risk factor for kidney stone formation. Although urinary oxalate measurement is part of any basic stone risk assessment, there is no standardized method for this measurement. METHODS: Urine samples from 24-h urine collections covering a broad range of oxalate concentrations were aliquoted and sent, in duplicate, to six blinded international laboratories for oxalate, sodium and creatinine measurement. In a second set of experiments, ten pairs of native urine and urine spiked with 10 mg/L of oxalate were sent for oxalate measurement. Three laboratories used a commercially available oxalate oxidase kit, two laboratories used a high-performance liquid chromatography (HPLC)-based method and one laboratory used both methods. RESULTS: Intra-laboratory reliability for oxalate measurement, expressed as the intraclass correlation coefficient (ICC), varied between 0.808 [95% confidence interval (CI): 0.427-0.948] and 0.998 (95% CI: 0.994-1.000), with lower values for the HPLC-based methods. Acidification of urine samples prior to analysis led to significantly higher oxalate concentrations. The ICC for inter-laboratory reliability varied between 0.745 (95% CI: 0.468-0.890) and 0.986 (95% CI: 0.967-0.995). Recovery of the 10 mg/L oxalate-spiked samples varied between 8.7 ± 2.3 and 10.7 ± 0.5 mg/L. Overall, the HPLC-based methods showed more variability than the oxalate oxidase kit-based methods. CONCLUSIONS: Significant variability was noted in the quantification of urinary oxalate concentration by different laboratories, which may partially explain the differences in hyperoxaluria prevalence reported in the literature. Our data stress the need for standardization of oxalate measurement.
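The intra-laboratory reliability quoted here is the one-way random-effects intraclass correlation coefficient; a minimal sketch of how it is computed from duplicate results (the duplicate values below are invented):

```python
import numpy as np

def icc_oneway(x):
    """One-way random-effects ICC(1) for an n-samples-by-k-replicates
    array, the usual reliability measure for duplicate lab results."""
    x = np.asarray(x, dtype=float)
    n, k = x.shape
    grand = x.mean()
    msb = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    msw = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Illustrative duplicate oxalate results (mg/L) from one laboratory.
duplicates = [[28.1, 27.6], [45.3, 44.9], [12.4, 13.0],
              [60.2, 59.5], [33.8, 34.4]]
print(f"ICC = {icc_oneway(duplicates):.3f}")
```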

Relevance: 20.00%

Abstract:

Glioma cell lines are an important tool for research in basic and translational neuro-oncology. Documentation of their genetic identity has become a requirement for scientific journals and grant applications, to exclude the cross-contamination and misidentification that lead to misinterpretation of results. Here, we report the standard 16-marker short tandem repeat (STR) DNA fingerprints for a panel of 39 widely used glioma cell lines as a reference. Comparison of the fingerprints among themselves and with the large DSMZ database, which comprises 9-marker STR profiles for 2278 cell lines, uncovered 3 misidentified cell lines and confirmed previously known cross-contaminations. Furthermore, 2 glioma cell lines exhibited identity scores of 0.8, which is proposed as the cutoff for detecting cross-contamination. Additional characteristics, comprising the lack of a B-raf mutation in one line and a similarity score of 1 with the original tumor tissue in the other, excluded a cross-contamination. Subsequent simulation procedures suggested that, when using DNA fingerprints comprising only 9 STR markers, the commonly used similarity score of 0.8 is not sufficiently stringent to unambiguously differentiate the origin. DNA fingerprints are confounded by frequent genetic alterations in cancer cell lines, particularly loss of heterozygosity, which reduce the informativeness of STR markers and thereby the overall power of distinction. The similarity score depends on the number of markers measured; thus, more markers or additional cell line characteristics, such as information on specific mutations, may be necessary to clarify the origin.
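The similarity score referred to is commonly computed as twice the number of shared alleles divided by the total number of alleles in the two profiles (definitions vary slightly between the DSMZ, Tanabe and Masters algorithms); a sketch with a hypothetical three-marker excerpt:

```python
from collections import Counter

def str_similarity(profile_a, profile_b):
    """Similarity between two STR profiles: 2 * shared alleles /
    total alleles in both. Profiles map marker name -> allele list.
    A score of 0.8 is the conventional cross-contamination cutoff
    discussed in the text."""
    shared = 0
    for marker in set(profile_a) & set(profile_b):
        a, b = Counter(profile_a[marker]), Counter(profile_b[marker])
        shared += sum((a & b).values())
    total = sum(len(v) for v in profile_a.values()) \
          + sum(len(v) for v in profile_b.values())
    return 2 * shared / total

# Hypothetical 3-marker excerpt of two cell-line profiles.
p1 = {"D5S818": [11, 12], "TH01": [7, 9.3], "TPOX": [8, 8]}
p2 = {"D5S818": [11, 12], "TH01": [7, 7],   "TPOX": [8, 11]}
print(f"similarity = {str_similarity(p1, p2):.2f}")
```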

Relevance: 20.00%

Abstract:

Redshifts for 100 galaxies in 10 clusters of galaxies are presented, based on data obtained between March 1984 and March 1985 at Calar Alto, La Palma and ESO, and on data from Mauna Kea. Data for individual galaxies are given, and the accuracy of the velocities from the four instruments is discussed. Comparison with published data shows the present velocities to be shifted by +4.0 km/s on average, with a standard deviation of the differences of 89.7 km/s, consistent with the rms errors of the individual redshift measurements, which range from 50 to 100 km/s.
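The consistency claim can be checked directly: two independent measurements with rms errors in the 50-100 km/s range should differ with a standard deviation of sqrt(s1² + s2²), i.e. roughly 71-141 km/s, which brackets the observed 89.7 km/s:

```python
import math
# Expected std of the difference of two independent measurements,
# at the low and high ends of the quoted 50-100 km/s rms range.
print(math.hypot(50, 50), math.hypot(100, 100))   # ~70.7 and ~141.4 km/s
```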

Relevance: 20.00%

Abstract:

Relationships between porosity and hydraulic conductivity tend to be strongly scale- and site-dependent and are thus very difficult to establish. As a result, hydraulic conductivity distributions inferred from geophysically derived porosity models must be calibrated using some measurement of aquifer response. This type of calibration is potentially very valuable as it may allow for transport predictions within the considered hydrological unit at locations where only geophysical measurements are available, thus reducing the number of well tests required and thereby the costs of management and remediation. Here, we explore this concept through a series of numerical experiments. Considering the case of porosity characterization in saturated heterogeneous aquifers using crosshole ground-penetrating radar and borehole porosity log data, we use tracer test measurements to calibrate a relationship between porosity and hydraulic conductivity that allows the best prediction of the observed hydrological behavior. To examine the validity and effectiveness of the obtained relationship, we examine its performance at alternate locations not used in the calibration procedure. Our results indicate that this methodology allows us to obtain remarkably reliable hydrological predictions throughout the considered hydrological unit based on the geophysical data only. This was also found to be the case when significant uncertainty was considered in the underlying relationship between porosity and hydraulic conductivity.
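A deliberately simplified sketch of the calibration idea, assuming pure advective transport under Darcy flow and a log-linear porosity-conductivity link (the porosities, arrival times, path length and gradient are all invented; the paper's actual forward model is far more complete):

```python
import numpy as np

# Hypothetical calibration set: GPR-derived mean porosity along three
# flow paths and observed mean tracer arrival times at the wells.
phi = np.array([0.28, 0.31, 0.25])       # porosity (-)
t_obs = np.array([3.5, 2.8, 4.4])        # arrival times (days)
L, grad = 10.0, 0.01                     # path length (m), head gradient

def t_pred(a, b):
    """Advective travel time with log10(K) = a + b*phi (K in m/day):
    pore velocity v = K*grad/phi, so t = L/v."""
    K = 10.0 ** (a + b * phi)
    return L * phi / (K * grad)

# Calibrate (a, b) by brute-force search so predicted arrival times
# match the tracer observations -- the calibration step of the paper,
# stripped to its simplest form.
A, B = np.meshgrid(np.linspace(-1, 2, 61), np.linspace(0, 10, 51))
misfit = np.array([((t_pred(a, b) - t_obs) ** 2).sum()
                   for a, b in zip(A.ravel(), B.ravel())])
a_hat, b_hat = A.ravel()[misfit.argmin()], B.ravel()[misfit.argmin()]
print(f"calibrated relation: log10(K) = {a_hat:.2f} + {b_hat:.2f} * phi")
```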

Relevance: 20.00%

Abstract:

The activity of radiopharmaceuticals in nuclear medicine is measured before patient injection with radionuclide calibrators. In Switzerland, the general requirements for quality controls are defined in a federal ordinance and a directive of the Federal Office of Metrology (METAS), which require each instrument to be verified. A set of three gamma sources (Co-57, Cs-137 and Co-60) is used to verify the response of radionuclide calibrators across the gamma energy range of their use. A beta source, a mixture of Sr-90 and Y-90 in secular equilibrium, is used as well. Manufacturers are responsible for the calibration factors. The main goal of the study was to monitor the validity of the calibration factors by using two sources: a Sr-90/Y-90 source and an F-18 source. The three types of commercial radionuclide calibrators tested have no calibration factor for the mixture, only one for Y-90. Activity measurements of a Sr-90/Y-90 source with the Y-90 calibration factor were therefore performed in order to correct for the extra contribution of Sr-90. The value of the correction factor was found to be 1.113, whereas Monte Carlo simulations of the radionuclide calibrators estimate it to be 1.117. Measurements with F-18 sources in a specific geometry were also performed. Since this radionuclide is widely used in Swiss hospitals equipped with PET and PET-CT, the metrology of F-18 is very important. The F-18 response normalized to the Cs-137 response shows that the difference from a reference value does not exceed 3% for the three types of radionuclide calibrators.
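Applying the reported correction in practice is simple arithmetic; in this sketch the display value is hypothetical and the direction of the correction (dividing the reading) is an assumption, while the factors 1.113 and 1.117 are from the text:

```python
# A Sr-90/Y-90 source in secular equilibrium, read out with the Y-90
# calibration factor, over-reads because the Sr-90 betas also ionise
# the chamber; dividing by the correction factor recovers the Y-90
# activity (direction assumed, factors from the text).
reading = 250.0                   # MBq shown by the calibrator (hypothetical)
k_meas, k_mc = 1.113, 1.117       # measured vs Monte Carlo correction factor

print(f"corrected Y-90 activity: {reading / k_meas:.1f} MBq")
print(f"measured vs simulated factor: {abs(k_meas - k_mc) / k_mc:.2%}")
```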