54 results for Energy-Based Method
Abstract:
Debris flow hazard modelling at medium (regional) scale has been the subject of various studies in recent years. In this study, hazard zonation was carried out incorporating information about debris flow initiation probability (spatial and temporal) and the delimitation of the potential runout areas. Debris flow hazard zonation was carried out in the area of the Consortium of Mountain Municipalities of Valtellina di Tirano (Central Alps, Italy). The complexity of the phenomenon, the scale of the study, the variability of local conditioning factors, and the lack of data limited the use of process-based models for the runout zone delimitation. First, a map of hazard initiation probabilities was prepared for the study area, based on the available susceptibility zoning information and the analysis of two sets of aerial photographs for the temporal probability estimation. The hazard initiation map was then used as one of the inputs for an empirical GIS-based model (Flow-R), developed at the University of Lausanne (Switzerland). An estimation of the debris flow magnitude was not attempted, as the main aim of the analysis was to prepare a debris flow hazard map at medium scale. A digital elevation model with a 10 m resolution was used together with land use, geology and debris flow hazard initiation maps as inputs of the Flow-R model to restrict potential areas within each hazard initiation probability class to locations where debris flows are most likely to initiate. Runout areas were then calculated using multiple flow direction and energy-based algorithms. Maximum probable runout zones were calibrated using documented past events and aerial photographs. Finally, two debris flow hazard maps were prepared. The first simply delimits five hazard zones, while the second incorporates information about debris flow spreading direction probabilities, showing areas more likely to be affected by future debris flows. Limitations of the modelling arise mainly from the models applied and the analysis scale, which neglect local controlling factors of debris flow hazard. The presented approach to debris flow hazard analysis, associating automatic detection of the source areas with a simple assessment of the debris flow spreading, provided results suitable for subsequent hazard and risk studies. However, more testing is needed to validate the approach and to transfer the parameters and results to other study areas.
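The Flow-R approach combines a multiple flow direction spreading algorithm with energy-based runout limits. Below is a minimal Python sketch of those two ingredients, assuming a Holmgren-style slope-weighted spreading and a simplified energy-line friction model; the grid size, exponent and friction coefficient are illustrative assumptions, not Flow-R's actual parameters.

```python
import numpy as np

CELL = 10.0  # grid resolution in metres, matching the 10 m DEM of the study

def spread_step(dem, prob, i, j, exponent=4.0):
    """Distribute the probability at cell (i, j) to its lower neighbours,
    weighted by slope**exponent (Holmgren-style multiple flow direction).
    Illustrative only; Flow-R offers several such spreading functions."""
    weights = {}
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            ni, nj = i + di, j + dj
            if (di, dj) == (0, 0) or not (
                0 <= ni < dem.shape[0] and 0 <= nj < dem.shape[1]
            ):
                continue
            drop = dem[i, j] - dem[ni, nj]
            if drop > 0:  # only downslope neighbours receive probability
                dist = np.hypot(di, dj) * CELL
                weights[(ni, nj)] = (drop / dist) ** exponent
    total = sum(weights.values())
    for (ni, nj), w in weights.items():
        prob[ni, nj] = max(prob[ni, nj], prob[i, j] * w / total)

def step_velocity(v_prev, dz, dx, mu=0.15, g=9.81):
    """Simplified energy-line runout check: kinetic energy gained from the
    elevation drop dz minus friction loss over distance dx. The flow stops
    where velocity falls to zero (mu is a hypothetical friction coefficient)."""
    v2 = v_prev**2 + 2.0 * g * (dz - mu * dx)
    return np.sqrt(max(v2, 0.0))
```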
Abstract:
Medical expenditure risk can pose a major threat to living standards. We derive decomposable measures of catastrophic medical expenditure risk from reference-dependent utility with loss aversion. We propose a quantile regression-based method of estimating risk exposure from cross-section data containing information on the means of financing health payments. We estimate medical expenditure risk in seven Asian countries and find it is highest in Laos and China and lowest in Malaysia. Exposure to risk is generally higher for households that have less recourse to self-insurance, lower income, wealth and education, and suffer from chronic illness.
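The quantile regression idea can be illustrated generically: a minimal sketch using statsmodels' QuantReg to estimate an upper conditional quantile of out-of-pocket payments as a crude proxy for expenditure risk. The synthetic data, variable names, and the 90th-percentile choice are assumptions, not the paper's specification.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
# Hypothetical cross-section: household income and a chronic-illness indicator
df = pd.DataFrame({
    "income": rng.lognormal(8, 0.5, n),
    "chronic": rng.integers(0, 2, n),
})
# Hypothetical out-of-pocket payments, heavier-tailed for chronically ill households
df["oop"] = 0.02 * df["income"] + 300 * df["chronic"] * rng.pareto(3, n)

X = sm.add_constant(df[["income", "chronic"]])
# Upper conditional quantile as a crude proxy for exposure to expenditure risk
fit = sm.QuantReg(df["oop"], X).fit(q=0.9)
print(fit.params)
```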
Abstract:
BACKGROUND: Measuring syringe availability and coverage is essential in the assessment of HIV/AIDS risk reduction policies. Estimates of syringe availability and coverage were produced for the years 1996 and 2006, based on all relevant available national-level aggregated data from published sources. METHODS: We defined availability as the total monthly number of syringes provided by the harm reduction system divided by the estimated number of injecting drug users (IDU), and coverage as the proportion of injections performed with a new syringe at the national level (total supply over total demand). Estimates of the supply of syringes were derived from the national monitoring system, including needle and syringe programmes (NSP), pharmacies, and medically prescribed heroin programmes. Estimates of syringe demand were based on the number of injections performed by IDU, derived from surveys of low-threshold facilities for drug users (LTF) with NSP, combined with the number of IDU. This number was estimated by two methods combining estimates of heroin users (multiple estimation method) with (a) the number of IDU in methadone treatment (MT) (non-injectors) or (b) the proportion of injectors amongst LTF attendees. Central estimates and ranges were obtained for availability and coverage. RESULTS: The estimated number of IDU decreased markedly according to both methods. The MT-based method (from 14,818 to 4809) showed a much greater decrease and a smaller IDU population than the LTF-based method (from 24,510 to 12,320). Availability and coverage estimates are higher with the MT-based method. For 1996, central estimates of syringe availability were 30.5 and 18.4 syringes per IDU per month; for 2006, they were 76.5 and 29.9. There were four central estimates of coverage: for 1996 they ranged from 24.3% to 43.3%, and for 2006 from 50.5% to 134.3%. CONCLUSION: Although the 2006 estimates overlap the 1996 estimates, the results suggest a shift to improved syringe availability and coverage over time.
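The availability and coverage definitions reduce to simple ratios; a minimal sketch follows, with hypothetical input figures (the paper reports only the resulting estimates, not the raw supply and demand inputs).

```python
def availability(syringes_per_month, n_idu):
    """Syringes supplied per IDU per month (supply-side indicator)."""
    return syringes_per_month / n_idu

def coverage(syringes_per_month, injections_per_idu_month, n_idu):
    """Proportion of injections performed with a new syringe: total supply
    over total demand. Values above 1.0 mean supply exceeds demand, as in
    the paper's 134.3% estimate for 2006."""
    demand = injections_per_idu_month * n_idu
    return syringes_per_month / demand

# Hypothetical figures for illustration only:
print(availability(370_000, 12_320))   # syringes per IDU per month
print(coverage(370_000, 60, 12_320))   # share of injections covered
```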
Abstract:
Because of their beneficial impact on forest ecosystems, European red wood ants (Formica rufa group) are protected by law in many European countries and are considered to be among the most reliable bioindicators of forest stability. However, their taxonomy has been much debated and, unfortunately, it is too often neglected, mainly because the morphology-based method for species delimitation requires considerable time and experience. We therefore employed 9 microsatellite loci and mitochondrial DNA (COI gene) to verify the power of genetic markers for red wood ant species delimitation and to investigate the cryptic diversity of these ants within the Eastern Swiss Alps. We analyzed 83 nests belonging to all red wood ant species that occur in the Swiss National Park area. Genetic data indicated that these species represent different genetic pools. Moreover, the results showed that Formica aquilonia YARROW, 1955 and F. paralugubris SEIFERT, 1996 often hybridize within the Park, confirming that these two species are genetically very close and could have diverged only recently. Nevertheless, microsatellites also revealed that one entire population, located in the Minger Valley and morphologically identified as F. lugubris ZETTERSTEDT, 1838, is genetically different from all other analyzed F. lugubris populations found within the same area and from the other red wood ant species. These findings, confirmed by mitochondrial DNA analyses, suggest the existence of a new cryptic species within the Eastern Swiss Alps, provisionally named F. lugubris-A2. These results are of great importance for future conservation plans, monitoring and evolutionary studies on these protected ants.
Abstract:
This paper deals with the problem of spatial data mapping. A new method based on wavelet interpolation and geostatistical prediction (kriging) is proposed. The method, wavelet analysis residual kriging (WARK), is developed to address the problems arising with highly variable data in the presence of spatial trends, where stationary prediction models have very limited applicability. Wavelet analysis is used to model large-scale structures, and kriging of the remaining residuals focuses on small-scale peculiarities. WARK is thus able to model spatial patterns that feature multiscale structure. In the present work, WARK is applied to rainfall data and the validation results are compared with those obtained from neural network residual kriging (NNRK), another residual-based method, which uses an artificial neural network to model large-scale non-linear trends. The comparison demonstrates the high-quality performance of WARK in predicting hot spots and in reproducing the global statistical characteristics of the distribution and the spatial correlation structure.
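A minimal sketch of the residual-kriging idea behind WARK, using PyWavelets for the large-scale wavelet trend and a Gaussian-process regressor as a stand-in for kriging of the residuals; the 1-D synthetic rainfall transect, wavelet family, decomposition level, and kernel are illustrative assumptions, not the paper's setup.

```python
import numpy as np
import pywt
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Hypothetical 1-D rainfall transect: a smooth trend plus local variability
x = np.linspace(0, 10, 256)
y = 10 * np.sin(x) + np.random.default_rng(1).normal(0, 1.5, x.size)

# 1) Large-scale trend: keep only the wavelet approximation coefficients
coeffs = pywt.wavedec(y, "db4", level=4)
coeffs_trend = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
trend = pywt.waverec(coeffs_trend, "db4")[: y.size]

# 2) Small-scale residuals: Gaussian-process regression as a stand-in
#    for ordinary kriging of the residual field
resid = y - trend
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1.0)
gp.fit(x.reshape(-1, 1), resid)

# Final prediction = wavelet trend + "kriged" residual
y_hat = trend + gp.predict(x.reshape(-1, 1))
```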
Abstract:
The decision to revascularize a patient with stable coronary artery disease should be based on the detection of myocardial ischemia. While this decision can be straightforward for clearly significant or non-significant stenoses, it is far more difficult for intermediate stenoses, which require invasive measurement of the functional impact of the stenosis on maximal blood flow (fractional flow reserve = FFR). A recent computer-based method has been developed that can estimate FFR from data acquired during a standard coronary CT scan (FFRcT). Two recent clinical studies (DeFACTO and DISCOVER-FLOW) show that FFRcT improves diagnostic accuracy over the standard coronary CT scan for the detection of myocardial ischemia, although FFRcT needs further development.
Abstract:
Accurate determination of subpopulation sizes in bimodal populations remains problematic, yet it represents a powerful way to compare cellular heterogeneity under different environmental conditions. So far, most studies have relied on qualitative descriptions of population distribution patterns, on population-independent descriptors, or on arbitrary placement of thresholds distinguishing biological ON from OFF states. We found that all these methods fall short of accurately describing small subpopulation sizes in bimodal populations. Here we propose a simple, statistics-based method for the analysis of small subpopulation sizes for use in the free software environment R and test this method on real as well as simulated data. Four so-called population splitting methods were designed with different algorithms that can estimate subpopulation sizes from bimodal populations. All four methods proved more precise than previously used methods when analyzing subpopulation sizes of transfer-competent cells arising in populations of the bacterium Pseudomonas knackmussii B13. The methods' resolving powers were further explored by bootstrapping and simulations. Two of the methods were not severely limited by the proportions of subpopulations they could estimate correctly, whereas the two others only allowed accurate subpopulation quantification when the subpopulation amounted to less than 25% of the total population. In contrast, only one method remained sufficiently accurate with subpopulations smaller than 1% of the total population. This study proposes a number of rational approximations for quantifying small subpopulations and offers an easy-to-use protocol for their implementation in the open-source statistical software environment R.
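The paper's four population splitting methods are implemented in R and are not reproduced here; as a generic analogue, a two-component Gaussian mixture can estimate the size of a small subpopulation in a bimodal distribution. The synthetic data and the 5% ON fraction below are assumptions for illustration.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Hypothetical bimodal data: a 5% "ON" subpopulation within an "OFF" majority
rng = np.random.default_rng(42)
off = rng.normal(2.0, 0.3, 9500)   # e.g. log-fluorescence of OFF cells
on = rng.normal(4.0, 0.4, 500)     # e.g. log-fluorescence of ON cells
data = np.concatenate([off, on]).reshape(-1, 1)

# Two-component Gaussian mixture; the weight of the upper component
# estimates the ON-subpopulation fraction without an arbitrary threshold
gm = GaussianMixture(n_components=2, random_state=0).fit(data)
on_idx = int(np.argmax(gm.means_.ravel()))
print(f"estimated ON fraction: {gm.weights_[on_idx]:.3f}")  # ~0.05
```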
Abstract:
BACKGROUND: Hyperoxaluria is a major risk factor for kidney stone formation. Although urinary oxalate measurement is part of all basic stone risk assessments, there is no standardized method for this measurement. METHODS: Urine samples from 24-h urine collections covering a broad range of oxalate concentrations were aliquoted and sent, in duplicate, to six blinded international laboratories for oxalate, sodium and creatinine measurement. In a second set of experiments, ten pairs of native urine and urine spiked with 10 mg/L of oxalate were sent for oxalate measurement. Three laboratories used a commercially available oxalate oxidase kit, two laboratories used a high-performance liquid chromatography (HPLC)-based method, and one laboratory used both methods. RESULTS: Intra-laboratory reliability for oxalate measurement, expressed as the intraclass correlation coefficient (ICC), varied between 0.808 [95% confidence interval (CI): 0.427-0.948] and 0.998 (95% CI: 0.994-1.000), with lower values for HPLC-based methods. Acidification of urine samples prior to analysis led to significantly higher oxalate concentrations. The ICC for inter-laboratory reliability varied between 0.745 (95% CI: 0.468-0.890) and 0.986 (95% CI: 0.967-0.995). Recovery of the 10 mg/L oxalate-spiked samples varied between 8.7 ± 2.3 and 10.7 ± 0.5 mg/L. Overall, HPLC-based methods showed more variability than the oxalate oxidase kit-based methods. CONCLUSIONS: Significant variability was noted in the quantification of urinary oxalate concentration by different laboratories, which may partially explain the differences in hyperoxaluria prevalence reported in the literature. Our data stress the need for standardization of oxalate measurement.
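The intraclass correlation coefficient used to express reliability can be computed directly from duplicate measurements; below is a minimal sketch of the one-way random-effects form, ICC(1,1), which may differ from the exact ICC variant the authors used. The duplicate values are hypothetical.

```python
import numpy as np

def icc_oneway(ratings):
    """One-way random-effects ICC(1,1) for an (n_samples, k_replicates)
    matrix of duplicate measurements: (MSB - MSW) / (MSB + (k-1) * MSW)."""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)
    msb = k * ((row_means - grand) ** 2).sum() / (n - 1)               # between samples
    msw = ((ratings - row_means[:, None]) ** 2).sum() / (n * (k - 1))  # within samples
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical duplicate oxalate measurements (mg/L) from one laboratory
dup = [[12.1, 12.4], [30.2, 29.8], [8.9, 9.3], [45.0, 44.1], [21.7, 22.0]]
print(round(icc_oneway(dup), 3))
```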
Abstract:
Purpose: To perform in vivo imaging of the cerebellum with an in-plane resolution of 120 μm to observe its cortical granular and molecular layers, taking advantage of the high signal-to-noise ratio and the increased magnetic susceptibility-related contrast available at high magnetic field strengths such as 7 T. Materials and Methods: The study was approved by the institutional review board, and all participants provided written consent. Three healthy persons (two men, one woman; mean age, 30 years; age range, 28-31 years) underwent MR imaging with a 7-T system. Gradient-echo images (repetition time msec/echo time msec, 1000/25) of the human cerebellum were acquired with a nominal in-plane resolution of approximately 120 μm and a section thickness of 1 mm. Results: Structures with dimensions as small as 240 μm, such as the granular and molecular layers in the cerebellar cortex, were detected in vivo. The detection of these structures was confirmed by comparing the contrast obtained on T2*-weighted and phase images with that obtained on images of rat cerebellum acquired at 14 T with 30 μm in-plane resolution. Conclusion: In vivo cerebellar imaging at near-microscopic resolution is feasible at 7 T. Such detailed observation of an anatomic area that can be affected by a number of neurologic and psychiatric diseases, such as stroke, tumors, autism, and schizophrenia, could potentially provide new markers for diagnosis and follow-up in patients with such pathologic conditions. © RSNA, 2010.
Abstract:
INTRODUCTION: Currently, there is no reliable method to differentiate acute from chronic carotid occlusion. We propose a novel CTA-based method to differentiate acute from chronic carotid occlusions that could potentially aid the clinical management of patients. METHODS: We examined 72 patients with 89 spontaneously occluded extracranial internal carotid arteries with CT angiography (CTA). All occlusions were confirmed by another imaging modality and classified as acute (imaging <1 week of presumed occlusion) or chronic (imaging >4 weeks), based on circumstantial clinical and radiological evidence. A neuroradiologist and a neurologist blinded to clinical information determined the site of occlusion on axial CTA sections. They also looked for (a) hypodensity in the carotid artery (thrombus), (b) contrast within the carotid wall (vasa vasorum), (c) the site of the occluded carotid, and (d) the "carotid ring sign" (defined as the presence of a and/or b). RESULTS: Of the 89 occluded carotids, 24 were excluded because of insufficient circumstantial evidence to determine the timing of occlusion, 4 because of insufficient image quality, and 3 because of subacute timing of occlusion. Among the remaining 45 acute and 13 chronic occlusions, inter-rater agreement (kappa) was 0.88 for the site of proximal occlusion, 0.45 for distal occlusion, 0.78 for luminal hypodensity, 0.82 for wall contrast, and 0.90 for the carotid ring sign. The carotid ring sign had 88.9% sensitivity, 69.2% specificity, and 84.5% accuracy for diagnosing acute occlusion. CONCLUSION: The carotid ring sign helps to differentiate acute from chronic carotid occlusion. If further confirmed, this information may be helpful in studying ischemic symptoms and selecting treatment strategies in patients with carotid occlusions.
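Sensitivity, specificity, and accuracy follow directly from a 2×2 confusion table; a minimal sketch, with hypothetical counts chosen to be consistent with the 45 acute and 13 chronic occlusions and the reported figures.

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity and accuracy from a 2x2 confusion table."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    acc = (tp + tn) / (tp + fn + tn + fp)
    return sens, spec, acc

# Hypothetical counts: 40 of 45 acute occlusions showing the ring sign
# (true positives) and 9 of 13 chronic occlusions without it (true negatives)
print(diagnostic_metrics(tp=40, fn=5, tn=9, fp=4))  # ≈ (0.889, 0.692, 0.845)
```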
Abstract:
Despite the development of novel typing methods based on whole genome sequencing, most laboratories still rely on classical molecular methods for outbreak investigation or surveillance. Reference methods for Clostridium difficile include ribotyping and pulsed-field gel electrophoresis, band-comparing methods that are often difficult to establish and require reference strain collections. Here, we present the double locus sequence typing (DLST) scheme as a tool to analyse C. difficile isolates. Using a collection of clinical C. difficile isolates recovered during a 1-year period, we evaluated the performance of DLST and compared the results to multilocus sequence typing (MLST), a sequence-based method that has been used to study the structure of bacterial populations and highlight major clones. DLST had a higher discriminatory power than MLST (Simpson's index of diversity of 0.979 versus 0.965) and successfully identified all isolates of the study (100% typeability). Previous studies showed that the discriminatory power of ribotyping was comparable to that of MLST; thus, DLST might be more discriminatory than ribotyping. DLST is easy to establish and provides several advantages, including no need for DNA extraction [polymerase chain reaction (PCR) is performed directly on colonies], no specific instrumentation, low cost, and an unambiguous definition of types. Moreover, the implementation of a DLST typing scheme on an Internet database, as previously done for Staphylococcus aureus and Pseudomonas aeruginosa ( http://www.dlst.org ), will allow users to easily obtain the DLST type by directly submitting sequencing files and will avoid problems associated with multiple databases.
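Simpson's index of diversity, the discriminatory-power statistic compared above, is straightforward to compute from type counts (the Hunter-Gaston form); a minimal sketch with hypothetical typing results follows.

```python
from collections import Counter

def simpson_diversity(type_assignments):
    """Simpson's index of diversity, D = 1 - sum n_i(n_i - 1) / (N(N - 1)):
    the probability that two isolates drawn at random have different types."""
    counts = Counter(type_assignments).values()
    n_total = sum(counts)
    return 1 - sum(n * (n - 1) for n in counts) / (n_total * (n_total - 1))

# Hypothetical typing results: more distinct types -> higher discrimination
print(simpson_diversity(["A", "A", "B", "C", "C", "D", "E", "F"]))
```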
Abstract:
X-ray medical imaging is increasingly becoming three-dimensional (3-D). The dose to the population and its management are of special concern in computed tomography (CT). Task-based methods with model observers to assess the dose-image quality trade-off are promising tools, but they still need to be validated for real volumetric images. The purpose of the present work is to evaluate anthropomorphic model observers in 3-D detection tasks for low-contrast CT images. We scanned a low-contrast phantom containing four types of signals at three dose levels and used two reconstruction algorithms. We implemented a multislice model observer based on the channelized Hotelling observer (msCHO) with anthropomorphic channels and investigated different internal noise methods. We found a good correlation for all tested model observers. These results suggest that the msCHO can be used as a relevant task-based method to evaluate low-contrast detection for CT and optimize scan protocols to lower dose in an efficient way.
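A minimal sketch of a channelized Hotelling observer in channel space, the building block of the msCHO: the Hotelling template is the inverse channel covariance applied to the mean signal difference, and detectability is the SNR of the decision variable. The channelized data here are synthetic, and this single-slice form omits the paper's multislice pooling and internal noise methods.

```python
import numpy as np

def hotelling_detectability(channels_signal, channels_noise):
    """Channel-space Hotelling observer: template w = K^-1 (mean_s - mean_n),
    detectability = SNR of the decision variable t = w.T v."""
    ds = channels_signal.mean(axis=0) - channels_noise.mean(axis=0)
    k = 0.5 * (np.cov(channels_signal.T) + np.cov(channels_noise.T))
    w = np.linalg.solve(k, ds)
    t_s = channels_signal @ w
    t_n = channels_noise @ w
    return (t_s.mean() - t_n.mean()) / np.sqrt(0.5 * (t_s.var() + t_n.var()))

# Hypothetical channelized images: 200 samples x 10 channels per class,
# with a weak low-contrast signal added to the signal-present class
rng = np.random.default_rng(3)
noise = rng.normal(0, 1, (200, 10))
signal = rng.normal(0, 1, (200, 10)) + 0.3
print(hotelling_detectability(signal, noise))
```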
Abstract:
The trabecular bone score (TBS) is a gray-level textural metric that can be extracted from the two-dimensional lumbar spine dual-energy X-ray absorptiometry (DXA) image. TBS is related to bone microarchitecture and provides skeletal information that is not captured from the standard bone mineral density (BMD) measurement. Based on experimental variograms of the projected DXA image, TBS has the potential to discern differences between DXA scans that show similar BMD measurements. An elevated TBS value correlates with better skeletal microstructure; a low TBS value correlates with weaker skeletal microstructure. Lumbar spine TBS has been evaluated in cross-sectional and longitudinal studies. The following conclusions are based upon publications reviewed in this article: 1) TBS gives lower values in postmenopausal women and in men with previous fragility fractures than their nonfractured counterparts; 2) TBS is complementary to data available by lumbar spine DXA measurements; 3) TBS results are lower in women who have sustained a fragility fracture but in whom DXA does not indicate osteoporosis or even osteopenia; 4) TBS predicts fracture risk as well as lumbar spine BMD measurements in postmenopausal women; 5) efficacious therapies for osteoporosis differ in the extent to which they influence the TBS; 6) TBS is associated with fracture risk in individuals with conditions related to reduced bone mass or bone quality. Based on these data, lumbar spine TBS holds promise as an emerging technology that could well become a valuable clinical tool in the diagnosis of osteoporosis and in fracture risk assessment. © 2014 American Society for Bone and Mineral Research.
Abstract:
Rockfall hazard zoning is usually achieved using a qualitative estimate of hazard rather than an absolute scale. In Switzerland, danger maps, which correspond to a hazard zoning that depends on the intensity of the considered phenomenon (e.g. kinetic energy for rockfalls), are replacing hazard maps. Basically, the danger grows with the mean frequency and with the intensity of the rockfall. This intensity-threshold principle may also be applied with threshold values other than those used in the Swiss rockfall hazard zoning method, i.e. danger mapping. In this paper, we explore the effect of slope geometry and rockfall frequency on rockfall hazard zoning. First, the transition from 2D zoning to 3D zoning based on rockfall trajectory simulation is examined; then, its dependency on slope geometry is emphasized. The spatial extent of the hazard zones is examined, showing that their limits may vary widely depending on the rockfall frequency. This approach is especially suited to highly populated regions, where the hazard zoning has to be very fine in order to delineate the greatest possible territory containing acceptable risks.
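A toy illustration of the intensity-frequency principle behind Swiss danger mapping: the danger class grows with kinetic energy and with mean frequency. The 30/300 kJ cuts follow the intensity classes commonly cited for rockfall in the Swiss guidelines, while the frequency bins and the combination rule here are illustrative assumptions only.

```python
def danger_class(energy_kj, return_period_yr):
    """Toy Swiss-style danger matrix: classify a location from rockfall
    intensity (kinetic energy) and mean frequency (return period)."""
    intensity = 2 if energy_kj > 300 else (1 if energy_kj > 30 else 0)
    frequency = 2 if return_period_yr < 30 else (1 if return_period_yr < 100 else 0)
    if intensity == 2 or intensity + frequency >= 3:
        return "red (high danger)"
    if intensity + frequency >= 1:
        return "blue (moderate danger)"
    return "yellow (low danger)"

print(danger_class(500, 300))  # high intensity -> red even for rare events
print(danger_class(20, 20))    # low intensity but frequent -> blue
```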