997 results for emission measurements
Abstract:
We consider two fundamental properties in the analysis of two-way tables of positive data: the principle of distributional equivalence, one of the cornerstones of correspondence analysis of contingency tables, and the principle of subcompositional coherence, which forms the basis of compositional data analysis. For an analysis to be subcompositionally coherent, it suffices to analyse the ratios of the data values. The usual approach to dimension reduction in compositional data analysis is to perform principal component analysis on the logarithms of ratios, but this method does not obey the principle of distributional equivalence. We show that by introducing weights for the rows and columns, the method achieves this desirable property. This weighted log-ratio analysis is theoretically equivalent to spectral mapping, a multivariate method developed almost 30 years ago for displaying ratio-scale data from biological activity spectra. The close relationship between spectral mapping and correspondence analysis is also explained, as well as their connection with association modelling. The weighted log-ratio methodology is applied here to frequency data in linguistics and to chemical compositional data in archaeology.
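As a rough illustration of the weighted log-ratio analysis described above, the following sketch (in Python, assuming row and column weights taken from the table margins, as in correspondence analysis) double-centres the log-transformed table with those weights and decomposes it with a weighted SVD; the small count table is hypothetical, not data from the paper.

```python
import numpy as np

def weighted_logratio_analysis(N):
    """Weighted log-ratio analysis (spectral-map style) of a positive table N."""
    P = N / N.sum()                          # table of proportions
    r = P.sum(axis=1)                        # row weights (masses)
    c = P.sum(axis=0)                        # column weights (masses)
    L = np.log(P)                            # log-transform (data must be > 0)
    L = L - (L @ c)[:, None]                 # subtract weighted row means
    L = L - (r @ L)[None, :]                 # subtract weighted column means
    S = np.sqrt(r)[:, None] * L * np.sqrt(c)[None, :]
    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    F = (U / np.sqrt(r)[:, None]) * sv       # row principal coordinates
    G = (Vt.T / np.sqrt(c)[:, None]) * sv    # column principal coordinates
    return F, G, sv

# hypothetical 4x3 table of positive counts
N = np.array([[12., 5., 8.],
              [ 3., 9., 6.],
              [10., 2., 7.],
              [ 6., 6., 6.]])
F, G, sv = weighted_logratio_analysis(N)
```

Replacing the margin weights with uniform weights recovers the unweighted log-ratio PCA that, as the abstract notes, does not satisfy distributional equivalence.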
Abstract:
Given the adverse impact of image noise on the perception of important clinical details in digital mammography, routine quality control measurements should include an evaluation of noise. The European Guidelines, for example, employ a second-order polynomial fit of pixel variance as a function of detector air kerma (DAK) to decompose noise into quantum, electronic and fixed pattern (FP) components and assess the DAK range where quantum noise dominates. This work examines the robustness of the polynomial method against an explicit noise decomposition method. The two methods were applied to variance and noise power spectrum (NPS) data from six digital mammography units. Twenty homogeneously exposed images were acquired with PMMA blocks for target DAKs ranging from 6.25 to 1600 µGy. Both methods were explored for the effects of data weighting and squared fit coefficients during the curve fitting, the influence of the additional filter material (2 mm Al versus 40 mm PMMA) and noise de-trending. Finally, spatial stationarity of noise was assessed. Data weighting improved noise model fitting over large DAK ranges, especially at low detector exposures. The polynomial and explicit decompositions generally agreed for quantum and electronic noise, but the FP noise fraction was consistently underestimated by the polynomial method. Noise decomposition as a function of position in the image showed limited noise stationarity, especially for FP noise; thus the position of the region of interest (ROI) used for noise decomposition may influence fractional noise composition. The ROI area and position used in the Guidelines offer an acceptable estimation of noise components. While there are limitations to the polynomial model, when used with care and with appropriate data weighting, the method offers a simple and robust means of examining the detector noise components as a function of detector exposure.
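A minimal sketch of the second-order polynomial decomposition referred to above, with pixel variance modelled as the sum of electronic (constant), quantum (linear in DAK) and fixed-pattern (quadratic in DAK) terms; the variance values and the specific weighting choice are hypothetical illustrations, not the paper's data.

```python
import numpy as np

# hypothetical detector air kerma values (µGy) and measured pixel variances
dak = np.array([6.25, 12.5, 25, 50, 100, 200, 400, 800, 1600])
variance = np.array([2.1, 3.8, 7.0, 13.5, 27.0, 55.0, 118.0, 265.0, 640.0])

# variance model: var = e + q*DAK + f*DAK**2
# (electronic, quantum and fixed-pattern terms, respectively);
# np.polyfit takes 1/sigma-style weights, and weighting down the
# high-exposure points keeps the low-DAK data from being swamped in the fit
coeffs = np.polyfit(dak, variance, deg=2, w=1.0 / variance)
f, q, e = coeffs                      # np.polyfit returns highest order first

def noise_fractions(d):
    """Fractional contribution of each noise component at DAK d (µGy)."""
    total = e + q * d + f * d ** 2
    return {"electronic": e / total,
            "quantum": q * d / total,
            "fixed_pattern": f * d ** 2 / total}

print(noise_fractions(100.0))
```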
Abstract:
Calibration of TDR measurements for measuring the moisture of cultivated peat soils
Abstract:
To perform a meta-analysis of the diagnostic performance of FDG-PET in large-vessel vasculitis (giant cell arteritis (GCA), associated or not with polymyalgia rheumatica (PMR), and Takayasu arteritis). Materials and methods: MEDLINE, the Cochrane Library and Embase were searched for relevant original articles describing FDG-PET for vasculitis assessment, using the MeSH terms ("Giant Cell Arteritis or Vasculitis" AND "PET"). Criteria for inclusion were: (1) FDG-PET used for the diagnosis of vasculitis; (2) American College of Rheumatology criteria as the reference standard; (3) a control group. After data extraction, analyses were performed using a random-effects model. Results: Of 184 citations (database search and reference screening), 70 articles were reviewed, from which 12 eligible studies were extracted (sensitivity ranging from 32% to 97%). Seven studies fulfilled all inclusion criteria. Owing to overlapping populations, 1 study was excluded. Statistical heterogeneity justified the random-effects model. Pooled analysis of 6 studies (116 vasculitis, 224 controls) showed 81% sensitivity (95% CI: 70-89%), 89% specificity (95% CI: 77-95%), 85% PPV (95% CI: 63-95%), 90% NPV (95% CI: 79-95%), a positive LR of 7.1 (95% CI: 3.4-14.9), a negative LR of 0.2 (95% CI: 0.14-0.35) and a DOR of 90.1 (95% CI: 18.6-437). Conclusion: FDG-PET has good diagnostic performance in the detection of large-vessel vasculitis. Its promising role could be extended to the follow-up of patients under treatment, but further studies are needed to confirm this possibility.
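For illustration only, a random-effects pooling of per-study sensitivities on the logit scale (a DerSimonian-Laird estimator) might look like the sketch below; the study counts are hypothetical and the paper's exact pooling procedure may differ.

```python
import numpy as np

def pooled_logit(events, totals):
    """DerSimonian-Laird random-effects pooling of proportions on the logit scale."""
    p = events / totals
    y = np.log(p / (1 - p))                     # logit-transformed proportions
    v = 1 / events + 1 / (totals - events)      # approximate within-study variances
    w = 1 / v                                   # fixed-effect weights
    y_fe = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fe) ** 2)             # Cochran's Q (heterogeneity)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)     # between-study variance
    w_re = 1 / (v + tau2)                       # random-effects weights
    y_re = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1 / np.sum(w_re))
    expit = lambda x: 1 / (1 + np.exp(-x))
    return expit(y_re), (expit(y_re - 1.96 * se), expit(y_re + 1.96 * se))

# hypothetical true positives and numbers of vasculitis patients per study
tp = np.array([18., 25., 14., 20., 22., 17.])
n = np.array([22., 30., 18., 24., 26., 20.])
print(pooled_logit(tp, n))      # pooled sensitivity and 95% CI
```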
Abstract:
Canadian healthcare is changing. Over the course of the past decade, the Health Care in Canada Survey (HCIC) has annually measured the reactions of the public and professional stakeholders to many of these change forces. In HCIC 2008, for the first time, the public's perception of their health status and all stakeholders' views of the burden and effective management of chronic diseases were sought. Overall, Canadians perceive themselves as healthy, with 84% of adults reporting good-to-excellent health. However, good health decreased with age as the occurrence of chronic illness rose, from 12% in the age group 18-24 to 65% for the population ≥65 years. More than 70% of all stakeholders were strongly or somewhat supportive of the implementation of coordinated care, or disease management programs, to improve the care of patients with chronic illnesses. Concordant support was also expressed for key disease management components, including coordinated interventions to improve home, community and self-care; increased wellness promotion; and increased use of clinical measurements and feedback to all stakeholders. However, there were also important areas of non-concordance. For example, the public and doctors consistently expressed less support than other stakeholders for the value of team care, including the use of non-physician professionals to provide patient care; increased patient involvement in decision-making; and the use of electronic health records to facilitate communication. The actual participation in disease management programs averaged 34% for professionals and 25% for the public. We conclude that chronic diseases are common, age-related and burdensome in Canada. Disease management or coordinated intervention, often delivered by teams, is also relatively common, despite its less-than-universal acceptance by all stakeholders. Further insights are needed, particularly into the variable perceptions of the value and efficacy of team-delivered healthcare and its important components.
Abstract:
BACKGROUND: Hyperoxaluria is a major risk factor for kidney stone formation. Although urinary oxalate measurement is part of all basic stone risk assessment, there is no standardized method for this measurement. METHODS: Urine samples from 24-h urine collections covering a broad range of oxalate concentrations were aliquoted and sent, in duplicate, to six blinded international laboratories for oxalate, sodium and creatinine measurement. In a second set of experiments, ten pairs of native urine and urine spiked with 10 mg/L of oxalate were sent for oxalate measurement. Three laboratories used a commercially available oxalate oxidase kit, two laboratories used a high-performance liquid chromatography (HPLC)-based method and one laboratory used both methods. RESULTS: Intra-laboratory reliability for oxalate measurement, expressed as the intraclass correlation coefficient (ICC), varied between 0.808 [95% confidence interval (CI): 0.427-0.948] and 0.998 (95% CI: 0.994-1.000), with lower values for HPLC-based methods. Acidification of urine samples prior to analysis led to significantly higher oxalate concentrations. The ICC for inter-laboratory reliability varied between 0.745 (95% CI: 0.468-0.890) and 0.986 (95% CI: 0.967-0.995). Recovery of the 10 mg/L oxalate-spiked samples varied between 8.7 ± 2.3 and 10.7 ± 0.5 mg/L. Overall, HPLC-based methods showed more variability than the oxalate oxidase kit-based methods. CONCLUSIONS: Significant variability was noted in the quantification of urinary oxalate concentration by different laboratories, which may partially explain the differences in hyperoxaluria prevalence reported in the literature. Our data stress the need for standardization of oxalate measurement methods.
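As a sketch of how intra-laboratory reliability can be quantified, the following computes a one-way random-effects intraclass correlation coefficient from duplicate measurements; the oxalate values are hypothetical, and the study may have used a different ICC form (e.g. a two-way model).

```python
import numpy as np

def icc_oneway(data):
    """One-way random-effects ICC, ICC(1): rows are samples, columns are replicates."""
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    sample_means = data.mean(axis=1)
    ms_between = k * np.sum((sample_means - data.mean()) ** 2) / (n - 1)
    ms_within = np.sum((data - sample_means[:, None]) ** 2) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# hypothetical duplicate oxalate measurements (mg/L) for six urine samples
oxalate = [[28.1, 27.5], [41.0, 42.3], [15.2, 14.8],
           [33.6, 35.0], [22.4, 21.9], [50.1, 49.0]]
print(round(icc_oneway(oxalate), 3))
```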
Abstract:
Glioma cell lines are an important tool for research in basic and translational neuro-oncology. Documentation of their genetic identity has become a requirement for scientific journals and grant applications to exclude cross-contamination and misidentification that lead to misinterpretation of results. Here, we report the standard 16-marker short tandem repeat (STR) DNA fingerprints for a panel of 39 widely used glioma cell lines as reference. Comparison of the fingerprints among themselves and with the large DSMZ database, which comprises 9-marker STR profiles for 2278 cell lines, uncovered 3 misidentified cell lines and confirmed previously known cross-contaminations. Furthermore, 2 glioma cell lines exhibited identity scores of 0.8, which is proposed as the cutoff for detecting cross-contamination. Additional characteristics, comprising the lack of a B-raf mutation in one line and a similarity score of 1 with the original tumor tissue in the other, excluded cross-contamination. Subsequent simulation procedures suggested that, when using DNA fingerprints comprising only 9 STR markers, the commonly used similarity score of 0.8 is not sufficiently stringent to unambiguously differentiate the origin. DNA fingerprints are confounded by frequent genetic alterations in cancer cell lines, particularly loss of heterozygosity, which reduce the informativeness of STR markers and, thereby, the overall power for distinction. The similarity score depends on the number of markers measured; thus, more markers or additional cell line characteristics, such as information on specific mutations, may be necessary to clarify the origin.
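For illustration, an allele-sharing similarity score of the kind discussed above (twice the number of shared alleles divided by the total number of alleles compared) can be computed as in the sketch below; the two 4-marker profiles are hypothetical, and real comparisons would use 9-16 markers.

```python
def str_similarity(profile_a, profile_b):
    """Allele-sharing similarity between two STR profiles.

    Each profile maps an STR marker name to the set of observed alleles;
    the score is 2 * shared / (alleles in A + alleles in B) over markers
    typed in both profiles.
    """
    markers = set(profile_a) & set(profile_b)
    shared = sum(len(profile_a[m] & profile_b[m]) for m in markers)
    total = sum(len(profile_a[m]) + len(profile_b[m]) for m in markers)
    return 2 * shared / total

# hypothetical profiles sharing most, but not all, alleles
line_1 = {"D5S818": {11, 12}, "D13S317": {8, 11}, "TH01": {6, 9.3}, "TPOX": {8}}
line_2 = {"D5S818": {11, 12}, "D13S317": {8, 12}, "TH01": {6, 9.3}, "TPOX": {8, 11}}
print(str_similarity(line_1, line_2))   # 0.8 in this toy comparison
```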
Abstract:
The development and tests of an iterative reconstruction algorithm for emission tomography based on Bayesian statistical concepts are described. The algorithm uses the entropy of the generated image as a prior distribution, can be accelerated by the choice of an exponent, and converges uniformly to feasible images by the choice of one adjustable parameter. A feasible image has been defined as one that is consistent with the initial data (i.e. it is an image that, if truly a source of radiation in a patient, could have generated the initial data by the Poisson process that governs radioactive disintegration). The fundamental ideas of Bayesian reconstruction are discussed, along with the use of an entropy prior with an adjustable contrast parameter, the use of likelihood with data increment parameters as conditional probability, and the development of the new fast maximum a posteriori with entropy (FMAPE) algorithm by the successive substitution method. It is shown that in the maximum likelihood estimator (MLE) and FMAPE algorithms, the only correct choice of initial image for the iterative procedure in the absence of a priori knowledge about the image configuration is a uniform field.
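As background for the MLE and FMAPE algorithms discussed above, a minimal sketch of the basic maximum-likelihood EM update for Poisson emission data is shown below, starting from the uniform initial image the abstract recommends; the system matrix and counts are toy values, and the entropy prior and acceleration that distinguish FMAPE are not included.

```python
import numpy as np

def mlem(A, y, n_iter=50):
    """Maximum-likelihood EM reconstruction for emission tomography.

    A : (n_bins, n_pixels) system matrix, A[i, j] = probability that a decay
        in pixel j is detected in bin i.
    y : measured counts per detector bin (Poisson data).
    """
    sens = A.sum(axis=0)                             # per-pixel sensitivity
    x = np.full(A.shape[1], y.sum() / sens.sum())    # uniform initial image
    for _ in range(n_iter):
        proj = A @ x                                 # forward projection
        ratio = np.divide(y, proj, out=np.zeros_like(y), where=proj > 0)
        x = x / sens * (A.T @ ratio)                 # multiplicative EM update
    return x

# hypothetical 3-bin, 2-pixel toy problem
A = np.array([[0.9, 0.1],
              [0.5, 0.5],
              [0.1, 0.9]])
y = np.array([45.0, 50.0, 55.0])
print(mlem(A, y))
```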
Abstract:
The possible association between the microquasar LS 5039 and the EGRET source 3EG J1824-1514 suggests that microquasars could also be sources of high energy gamma-rays. In this paper, we explore, with a detailed numerical model, whether this system can produce the emission detected by EGRET (>100 MeV) through inverse Compton (IC) scattering. Our numerical approach considers a population of relativistic electrons entrained in a cylindrical inhomogeneous jet, interacting with both the radiation and the magnetic fields, taking into account the Thomson and Klein-Nishina regimes of interaction. The computed spectrum reproduces the observed spectral characteristics at very high energies.
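The Thomson versus Klein-Nishina distinction mentioned above is governed by the photon energy seen in the electron rest frame; purely as an illustration, the sketch below evaluates the standard closed-form total Klein-Nishina cross section for a few hypothetical rest-frame photon energies.

```python
import numpy as np

SIGMA_T = 6.6524e-25      # Thomson cross section [cm^2]
MEC2_EV = 0.511e6         # electron rest energy [eV]

def klein_nishina(e_photon_ev):
    """Total Klein-Nishina cross section for a photon of the given energy
    (in eV) in the electron rest frame, via the standard closed form."""
    x = e_photon_ev / MEC2_EV
    term1 = (1 + x) / x**3 * (2 * x * (1 + x) / (1 + 2 * x) - np.log(1 + 2 * x))
    term2 = np.log(1 + 2 * x) / (2 * x)
    term3 = (1 + 3 * x) / (1 + 2 * x) ** 2
    return 0.75 * SIGMA_T * (term1 + term2 - term3)

# hypothetical rest-frame photon energies, from the Thomson limit
# (sigma ~ sigma_T) to the deep Klein-Nishina regime (suppressed sigma)
for e in [1.0, 1e3, 1e5, 1e7]:    # eV
    print(f"{e:9.1e} eV  sigma/sigma_T = {klein_nishina(e) / SIGMA_T:.3f}")
```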
Abstract:
We present multiepoch Very Large Array (VLA) observations at 1.4 GHz, 4.9 GHz, 8.5 GHz and 14.9 GHz for a sample of eight RS CVn binary systems. Circular polarization measurements of these systems are also reported. Most of the observed fluxes are consistent with incoherent emission from mildly relativistic electrons. Several systems show an increase in the degree of circular polarization with increasing frequency in the optically thin regime, in conflict with predictions of gyrosynchrotron models. We observed a reversal in the sense of circular polarization with increasing frequency in three non-eclipsing systems: EI Eri, DM UMa and HD 8358. We find clear evidence for coherent plasma emission at 1.4 GHz in the quiescent spectrum of HD 8358 during the helicity reversal. The degrees of polarization of the other two systems could also be accounted for by a coherent emission process. The observations of ER Vul revealed two U-shaped flux spectra at the highest frequencies. The U-shape of the spectra may be accounted for by an optically thin gyrosynchrotron source at low frequencies, whereas the high-frequency part is dominated by a thermal emission component.