199 results for Robust methods


Relevance:

20.00%

Publisher:

Abstract:

We propose a segmentation method based on the geometric representation of images as 2-D manifolds embedded in a higher dimensional space. The segmentation is formulated as a minimization problem, where the contours are described by a level set function and the objective functional corresponds to the surface of the image manifold. In this geometric framework, both data-fidelity and regularity terms of the segmentation are represented by a single functional that intrinsically aligns the gradients of the level set function with the gradients of the image and results in a segmentation criterion that exploits the directional information of image gradients to overcome image inhomogeneities and fragmented contours. The proposed formulation combines this robust alignment of gradients with attractive properties of previous methods developed in the same geometric framework: 1) the natural coupling of image channels proposed for anisotropic diffusion and 2) the ability of subjective surfaces to detect weak edges and close fragmented boundaries. The potential of such a geometric approach lies in the general definition of Riemannian manifolds, which naturally generalizes existing segmentation methods (the geodesic active contours, the active contours without edges, and the robust edge integrator) to higher dimensional spaces, non-flat images, and feature spaces. Our experiments show that the proposed technique improves the segmentation of multi-channel images, images subject to inhomogeneities, and images characterized by geometric structures like ridges or valleys.
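As a hedged sketch of the geometric idea (the formula below is the standard Beltrami-framework surface area, written here for a single gray-level channel; the symbol β and the exact form of the paper's functional are assumptions, not reproduced from the article): an image I(x, y) is embedded as the 2-D manifold (x, y, βI(x, y)) in R³, whose surface area is

```latex
S(I) = \int \sqrt{1 + \beta^{2}\,\lvert \nabla I \rvert^{2}}\; dx\, dy
```

Minimizing a functional of this type over a level-set representation of the contour is what allows a single expression to act as both the data-fidelity and the regularity term, since the area shrinks where the level-set gradients align with the image gradients.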


This book gives a general view of sequence analysis, the statistical study of successions of states or events. It includes innovative contributions on life course studies, transitions into and out of employment, contemporaneous and historical careers, and political trajectories. The approach presented in this book is now central to the life-course perspective and the study of social processes more generally. This volume promotes dialogue between approaches to sequence analysis that developed separately, in traditions that differ across regions and disciplines. It includes the latest developments in sequential concepts, coding, atypical datasets and time patterns, optimal matching and alternative algorithms, survey optimization, and visualization. Field studies include original sequential material related to parenting in 19th-century Belgium, higher education and work in Finland and Italy, family formation before and after German reunification, French Jews persecuted in occupied France, long-term trends in electoral participation, and regime democratization. Overall, the book reassesses the classical uses of sequences and promotes new ways of collecting, formatting, representing and processing them. The introduction provides basic sequential concepts and tools, as well as a history of the method. Chapters are presented in a way that is both accessible to the beginner and informative to the expert.


OBJECTIVE: Postmortem investigations are becoming increasingly sophisticated. CT and MRI are already being used in pathology and forensic medicine. In this context, the impact of postmortem angiography is increasing because it allows rapid evaluation of organ-specific vascular patterns, vascular alterations under pathologic and physiologic conditions, and tissue changes induced by artificial and unnatural causes. CONCLUSION: In this article, the advantages and disadvantages of former and current techniques and contrast agents are reviewed.


OBJECTIVE: To assess total free-living energy expenditure (EE) in Gambian farmers with two independent methods, and to determine the most realistic free-living EE and physical activity in order to establish energy requirements for rural populations in developing countries. DESIGN: In this cross-sectional study, two methods were applied at the same time. SETTING: Three rural villages and the MRC Dunn Nutrition Centre, Keneba, The Gambia. SUBJECTS: Eight healthy male subjects were recruited from three rural Gambian villages in the sub-Sahelian area (age: 25 +/- 4 y; weight: 61.2 +/- 10.1 kg; height: 169.5 +/- 6.5 cm; body mass index: 21.2 +/- 2.5 kg/m²). INTERVENTION: We assessed free-living EE with two inconspicuous and independent methods: the first used doubly labeled water (DLW, ²H₂¹⁸O) over a period of 12 days, whereas the second was based on continuous heart rate (HR) measurements on two to three days, using individual regression lines (HR vs EE) established by indirect calorimetry in a respiration chamber. Isotopic dilution of deuterium (²H₂O) was also used to assess total body water and hence fat-free mass (FFM). RESULTS: EE assessed by DLW was found to be 3880 +/- 994 kcal/day (16.2 +/- 4.2 MJ/day). Expressed per unit body weight, EE averaged 64.2 +/- 9.3 kcal/kg/d (269 +/- 38 kJ/kg/d). These results were consistent with the EE results assessed by HR: 3847 +/- 605 kcal/d (16.1 +/- 2.5 MJ/d), or 63.4 +/- 8.2 kcal/kg/d (265 +/- 34 kJ/kg/d). The physical activity index, expressed as a multiple of basal metabolic rate (BMR), averaged 2.40 +/- 0.41 (DLW) or 2.40 +/- 0.28 (HR). CONCLUSIONS: These findings suggest an extremely high level of physical activity in Gambian men during intense agricultural work (wet season). This contrasts with the relative food shortage previously reported during the harvesting period. We conclude that the assessment of EE during the agricultural season in non-industrialized countries needs further investigation in order to obtain information on the energy requirements of these populations. For this purpose, the DLW and HR methods have been shown to be useful and complementary.
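The HR method's calibration step lends itself to a small numerical sketch. The snippet below (all calibration values, the simulated day of heart-rate data, and the BMR figure are invented for illustration; none of the numbers come from the study) fits an individual HR-vs-EE regression line and converts a day of heart-rate readings into total EE and a physical activity index:

```python
import numpy as np

# Hypothetical calibration pairs from a respiration chamber for one subject:
# heart rate (bpm) against energy expenditure (kcal/min).
hr_calib = np.array([60.0, 75.0, 90.0, 110.0, 130.0])
ee_calib = np.array([1.1, 1.9, 2.8, 4.1, 5.6])

# Individual regression line (EE vs HR), as used in the HR method.
slope, intercept = np.polyfit(hr_calib, ee_calib, 1)

# Simulated minute-by-minute free-living heart rate over one day.
rng = np.random.default_rng(0)
hr_day = rng.normal(95.0, 15.0, size=24 * 60).clip(50.0, 160.0)

ee_per_min = slope * hr_day + intercept  # kcal/min for every minute
total_ee = ee_per_min.sum()              # kcal/day

# Physical activity index expressed as a multiple of BMR (assumed value).
bmr = 1600.0  # kcal/day
pal = total_ee / bmr
print(round(total_ee), round(pal, 2))
```

With real data, one regression line would be fitted per subject, which is what makes the HR method individual-specific.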


International conservation organisations have identified priority areas for biodiversity conservation. These global-scale prioritisations affect the distribution of funds for conservation interventions. As each organisation has a different focus, each prioritisation scheme is determined by different decision criteria and the resultant priority areas vary considerably. However, little is known about how the priority areas will respond to the impacts of climate change. In this paper, we examined the robustness of eight global-scale prioritisations to climate change under various climate predictions from seven global circulation models. We developed a novel metric of climate stability for 803 ecoregions based on a recently introduced method to estimate the overlap of climate envelopes. The relationships between the decision criteria and the robustness of the global prioritisation schemes were statistically examined. We found that decision criteria related to level of endemism and landscape fragmentation were strongly correlated with areas predicted to be robust to a changing climate. Hence, policies that prioritise intact areas due to the likely cost efficiency, and assumptions related to the potential to mitigate the impacts of climate change, require further examination. Our findings will help determine where additional management is required to enable biodiversity to adapt to the impacts of climate change.
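To make the notion of climate-envelope overlap concrete, here is a deliberately simplified toy version (a per-variable interval overlap averaged across variables; the actual metric in the paper is more sophisticated, and all envelope values below are invented):

```python
import numpy as np

def envelope_overlap(present, future):
    """Fraction of the present climate envelope still covered by a future
    projection: per-variable interval overlap, averaged over variables.
    A toy stand-in for a climate-stability metric, not the paper's method."""
    overlaps = []
    for (p_lo, p_hi), (f_lo, f_hi) in zip(present, future):
        lo, hi = max(p_lo, f_lo), min(p_hi, f_hi)
        overlaps.append(max(0.0, hi - lo) / (p_hi - p_lo))
    return float(np.mean(overlaps))

# Hypothetical (min, max) envelopes: temperature (°C) and rainfall (mm).
present = [(12.0, 22.0), (400.0, 900.0)]
future = [(15.0, 25.0), (350.0, 700.0)]
stability = envelope_overlap(present, future)
print(stability)  # 0.65
```

An ecoregion whose future envelope barely intersects its present one would score near 0, flagging it as a priority area that may not be robust to climate change.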


This paper presents and discusses the use of Bayesian procedures - introduced through the use of Bayesian networks in Part I of this series of papers - for 'learning' probabilities from data. The discussion will relate to a set of real data on characteristics of black toners commonly used in printing and copying devices. Particular attention is drawn to the incorporation of the proposed procedures as an integral part in probabilistic inference schemes (notably in the form of Bayesian networks) that are intended to address uncertainties related to particular propositions of interest (e.g., whether or not a sample originates from a particular source). The conceptual tenets of the proposed methodologies are presented along with aspects of their practical implementation using currently available Bayesian network software.
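The core of 'learning' a probability from data can be illustrated with conjugate Bayesian updating (a minimal sketch; the counts are invented and the paper's procedures, applied to black toner characteristics, are richer than a single Beta-Binomial update):

```python
from fractions import Fraction

# Uniform Beta(1, 1) prior over an unknown proportion.
alpha, beta = 1, 1

# Hypothetical data: a toner characteristic observed in 18 of 20 samples.
matches, trials = 18, 20

# Conjugacy gives posterior Beta(alpha + matches, beta + trials - matches);
# its mean is the updated probability to feed into a Bayesian network node.
posterior_mean = Fraction(alpha + matches, alpha + beta + trials)
print(posterior_mean)  # 19/22
```

The point of such updating is that node probabilities in an inference scheme need not be fixed by expert judgment alone; they can be revised coherently as casework data accumulate.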


Objectives: Phenytoin (PHT), valproic acid (VPA), and levetiracetam (LEV) are commonly used as second-line treatment of status epilepticus (SE), but comparative studies are not available to date. Methods: In our tertiary care hospital, among 279 SE episodes prospectively collected over four years in adults, we identified 187 episodes in which PHT, VPA or LEV was prescribed after benzodiazepines. Patients with post-anoxic SE were not included. Demographics, clinical SE features, failure of second-line treatment to control SE, new handicap and mortality at hospital discharge were assessed. Uni- and multivariable statistical analyses were applied to compare the three agents. Results: Each compound was used in about one third of episodes. VPA failed to control SE in 25.4%, PHT in 41.4% and LEV in 48.3% of the episodes in which they were prescribed as second-line agents. After adjustment for known SE outcome predictors, LEV failed more often than VPA (OR 2.69; 95% CI 1.19-6.08); in other words, 16.8% (95% CI 6.0-31.4%) of second-line treatment failures could be attributed to prescription of LEV instead of VPA. PHT was not statistically different from the other two compounds. At discharge, second-line treatment did not influence new handicap and mortality, while etiology and severity of the SE episode were robust independent predictors. Conclusions: Even without significant differences in outcome at discharge, LEV seems less efficacious than VPA in controlling SE after benzodiazepines. A prospective comparative trial is needed to address this potentially concerning finding. A second interesting finding is that outcome seems more influenced by the characteristics of the SE episode than by the treatment.


Purpose: To investigate the effect of incremental increases in intraocular straylight on threshold measurements made by three modern forms of perimetry: Standard Automated Perimetry (SAP) using Octopus (Dynamic, G-Pattern), Pulsar Perimetry (PP) (TOP, 66 points) and the Moorfields Motion Displacement Test (MDT) (WEBS, 32 points). Methods: Four healthy young observers were recruited (mean age 26 yrs [25 yrs, 28 yrs]; refractive correction [+2 D, -4.25 D]). Five white opacity filters (WOF), each scattering light by a different amount, were used to create incremental increases in intraocular straylight (IS). Resultant IS values were measured with each WOF and at baseline (no WOF) for each subject using a C-Quant Straylight Meter (Oculus, Wetzlar, Germany). A 25-year-old has an IS value of ~0.85 log(s); an increase of 40% in IS, to 1.2 log(s), corresponds to the physiological value of a 70-year-old. Each WOF created an increase in IS of 10-150% over baseline, ranging from effects similar to normal aging to those found with considerable cataract. Each subject underwent 6 test sessions over a 2-week period; each session consisted of the 3 perimetric tests using one of the five WOFs or baseline (both instrument and filter were randomised). Results: The reduction in sensitivity from baseline was calculated. A two-way ANOVA on mean change in threshold (with subjects treated as rows of the block and each increment in filter treated as a column) was used to examine the effect of incremental increases in straylight. Both SAP (p<0.001) and Pulsar (p<0.001) were significantly affected by increases in straylight. The MDT (p=0.35) remained comparatively robust to increases in straylight. Conclusions: The Moorfields MDT measurement of threshold is robust to the effects of additional straylight as compared with SAP and PP.


Arsenic contamination of natural waters is a worldwide concern, as the drinking water supplies for large populations can have high concentrations of arsenic. Traditional techniques to detect arsenic in natural water samples can be costly and time-consuming; therefore, robust and inexpensive methods to detect arsenic in water are highly desirable. Additionally, methods for detecting arsenic in the field have been greatly sought after. This article focuses on the use of bacteria-based assays as an emerging method that is both robust and inexpensive for the detection of arsenic in groundwater both in the field and in the laboratory. The arsenic detection elements in bacteria-based bioassays are biosensor-reporter strains: genetically modified strains of, e.g., Escherichia coli, Bacillus subtilis, Staphylococcus aureus, and Rhodopseudomonas palustris. In response to the presence of arsenic, such bacteria produce a reporter protein, the amount or activity of which is measured in the bioassay. Some of these bacterial biosensor-reporters have been successfully utilized for comparative in-field analyses through the use of simple solution-based assays, but future methods may concentrate on miniaturization using fiberoptics or microfluidics platforms. Additionally, there are other potential emerging bioassays for the detection of arsenic in natural waters, including nematodes and clams.


Optimal robust M-estimates of a multidimensional parameter are described using Hampel's infinitesimal approach. The optimal estimates are derived by minimizing a measure of efficiency under the model, subject to a bounded measure of infinitesimal robustness. For this purpose, we define measures of efficiency and infinitesimal sensitivity based on the Hellinger distance. We show that these two measures coincide with similar ones defined by Yohai using the Kullback-Leibler divergence, and therefore the corresponding optimal estimates coincide too. We also give an example in which we fit a negative binomial distribution to a real dataset of "days of stay in hospital" using the optimal robust estimates.
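The defining behavior of bounded-influence M-estimates can be shown with a generic example. The sketch below uses a Huber location estimate computed by iteratively reweighted means; this is a standard textbook estimator chosen only to illustrate how M-estimates down-weight outliers, not the Hellinger-optimal estimator of the paper, and the data are invented:

```python
import numpy as np

def huber_location(x, k=1.345, tol=1e-8, max_iter=100):
    """Huber M-estimate of location via iteratively reweighted means.
    Observations with large standardized residuals get weight k/|r| < 1,
    so a single gross outlier cannot drag the estimate arbitrarily far."""
    mu = np.median(x)
    scale = np.median(np.abs(x - mu)) / 0.6745  # MAD-based scale estimate
    for _ in range(max_iter):
        r = (x - mu) / scale
        w = np.minimum(1.0, k / np.maximum(np.abs(r), 1e-12))  # bounded weights
        mu_new = np.sum(w * x) / np.sum(w)
        if abs(mu_new - mu) < tol:
            break
        mu = mu_new
    return mu

data = np.array([2.1, 1.9, 2.3, 2.0, 2.2, 15.0])  # one gross outlier
print(huber_location(data))  # stays near the bulk of the data, ~2.16
```

The sample mean of the same data is 4.25, pulled far from the bulk by the single outlier, which is exactly the sensitivity that the infinitesimal-robustness bound is designed to control.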


Circadian cycles and cell cycles are two fundamental periodic processes with a period in the range of 1 day. Consequently, coupling between such cycles can lead to synchronization. Here, we estimated the mutual interactions between the two oscillators by time-lapse imaging of single mammalian NIH3T3 fibroblasts over several days. The analysis of thousands of circadian cycles in dividing cells clearly indicated that both oscillators tick in a 1:1 mode-locked state, with cell divisions tightly clustered about 5 h before the peak in circadian Rev-Erbα-YFP reporter expression. In principle, such synchrony may be caused by either unidirectional or bidirectional coupling. While gating of cell division by the circadian cycle has been most studied, our data combined with stochastic modeling unambiguously show that the reverse coupling is predominant in NIH3T3 cells. Moreover, temperature, genetic, and pharmacological perturbations showed that the two interacting cellular oscillators adopt a synchronized state that is highly robust over a wide range of parameters. These findings have implications for circadian function in proliferative tissues, including epidermis, immune cells, and cancer.
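Mode locking of two oscillators with slightly different periods can be demonstrated with a toy phase model (a generic Kuramoto-style sketch with invented periods and coupling strength, not the stochastic model used in the study; the direction of coupling here is arbitrary):

```python
import numpy as np

def simulate(coupling, hours=24 * 30, dt=0.01):
    """Euler simulation of two phase oscillators with periods 24 h and 22 h.
    Returns the spread of the wrapped phase difference over the second half
    of the run: near zero when the oscillators lock, large when they drift."""
    w_a, w_b = 2 * np.pi / 24.0, 2 * np.pi / 22.0  # rad/h
    phi_a, phi_b = 0.0, 0.0
    gaps = []
    for _ in range(int(hours / dt)):
        # one-way coupling: oscillator b pulls the phase of oscillator a
        phi_a += (w_a + coupling * np.sin(phi_b - phi_a)) * dt
        phi_b += w_b * dt
        gaps.append(np.angle(np.exp(1j * (phi_b - phi_a))))
    return float(np.std(gaps[len(gaps) // 2:]))

print(simulate(0.0) > simulate(0.5))  # True: coupling locks the phases
```

With zero coupling the phase difference winds around the circle; with coupling well above the frequency detuning it settles at a fixed offset, the 1:1 mode-locked state with a stable phase relationship that the single-cell data exhibit.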


Background: The variety of DNA microarray formats and datasets presently available offers an unprecedented opportunity to perform insightful comparisons of heterogeneous data. Cross-species studies, in particular, have the power of identifying conserved, functionally important molecular processes. Validation of discoveries can now often be performed in readily available public data, which frequently requires cross-platform studies. Cross-platform and cross-species analyses require matching probes on different microarray formats. This can be achieved using the information in microarray annotations and additional molecular biology databases, such as orthology databases. Although annotations and other biological information are stored using modern database models (e.g., relational), they are very often distributed and shared as tables in text files, i.e. flat-file databases. This common flat database format thus provides a simple and robust solution to flexibly integrate various sources of information and a basis for the combined analysis of heterogeneous gene expression profiles. Results: We provide annotationTools, a Bioconductor-compliant R package to annotate microarray experiments and integrate heterogeneous gene expression profiles using annotation and other molecular biology information available as flat-file databases. First, annotationTools contains a specialized set of functions for mining this widely used database format in a systematic manner. It thus offers a straightforward solution for annotating microarray experiments. Second, building on these basic functions and relying on the combination of information from several databases, it provides tools to easily perform cross-species analyses of gene expression data. Here, we present two example applications of annotationTools that are of direct relevance for the analysis of heterogeneous gene expression profiles, namely a cross-platform mapping of probes and a cross-species mapping of orthologous probes using different orthology databases. We also show how to perform an explorative comparison of disease-related transcriptional changes in human patients and in a genetic mouse model. Conclusion: The R package annotationTools provides a simple solution to handle microarray annotation and orthology tables, as well as other flat molecular biology databases. Thereby, it allows easy integration and analysis of heterogeneous microarray experiments across different technological platforms or species.
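The underlying operation is a join across flat-file tables. The annotationTools package itself is written in R; the sketch below mirrors the idea in Python with invented probe IDs and gene symbols, mapping probes from one platform to orthologous probes on another via an orthology table:

```python
import csv
import io

# Toy tab-separated annotation tables, mimicking how microarray annotations
# are distributed as flat files. All probe IDs and symbols are invented.
human_annot = "probe\tsymbol\nA_1\tTP53\nA_2\tBRCA1\nA_3\tEGFR\n"
mouse_annot = "probe\tsymbol\nm1\tTrp53\nm2\tBrca1\n"
orthologs = "human\tmouse\nTP53\tTrp53\nBRCA1\tBrca1\n"

def read_table(text):
    """Parse a tab-separated flat-file table into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text), delimiter="\t"))

# Chain the lookups: human probe -> human symbol -> mouse symbol -> mouse probe.
human2sym = {r["probe"]: r["symbol"] for r in read_table(human_annot)}
ortho = {r["human"]: r["mouse"] for r in read_table(orthologs)}
sym2mouse = {r["symbol"]: r["probe"] for r in read_table(mouse_annot)}

mapping = {p: sym2mouse.get(ortho.get(s)) for p, s in human2sym.items()}
print(mapping)  # {'A_1': 'm1', 'A_2': 'm2', 'A_3': None}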

Relevância:

20.00% 20.00%

Publicador:

Resumo:

A short overview is given on the most important analytical body composition methods. Principles of the methods and advantages and limitations of the methods are discussed also in relation to other fields of research such as energy metabolism. Attention is given to some new developments in body composition research such as chemical multiple-compartment models, computerized tomography or nuclear magnetic resonance imaging (tissue level), and multifrequency bioelectrical impedance. Possible future directions of body composition research in the light of these new developments are discussed.

Relevância:

20.00% 20.00%

Publicador:

Resumo:

Over thirty years ago, Leamer (1983) - among many others - expressed doubts about the quality and usefulness of empirical analyses for the economic profession by stating that "hardly anyone takes data analyses seriously. Or perhaps more accurately, hardly anyone takes anyone else's data analyses seriously" (p.37). Improvements in data quality, more robust estimation methods and the evolution of better research designs seem to make that assertion no longer justifiable (see Angrist and Pischke (2010) for a recent response to Leamer's essay). The economic profes- sion and policy makers alike often rely on empirical evidence as a means to investigate policy relevant questions. The approach of using scientifically rigorous and systematic evidence to identify policies and programs that are capable of improving policy-relevant outcomes is known under the increasingly popular notion of evidence-based policy. Evidence-based economic policy often relies on randomized or quasi-natural experiments in order to identify causal effects of policies. These can require relatively strong assumptions or raise concerns of external validity. In the context of this thesis, potential concerns are for example endogeneity of policy reforms with respect to the business cycle in the first chapter, the trade-off between precision and bias in the regression-discontinuity setting in chapter 2 or non-representativeness of the sample due to self-selection in chapter 3. While the identification strategies are very useful to gain insights into the causal effects of specific policy questions, transforming the evidence into concrete policy conclusions can be challenging. Policy develop- ment should therefore rely on the systematic evidence of a whole body of research on a specific policy question rather than on a single analysis. 
In this sense, this thesis cannot and should not be viewed as a comprehensive analysis of specific policy issues but rather as a first step towards a better understanding of certain aspects of a policy question. The thesis applies new and innovative identification strategies to policy-relevant and topical questions in the fields of labor economics and behavioral environmental economics. Each chapter relies on a different identification strategy. In the first chapter, we employ a difference- in-differences approach to exploit the quasi-experimental change in the entitlement of the max- imum unemployment benefit duration to identify the medium-run effects of reduced benefit durations on post-unemployment outcomes. Shortening benefit duration carries a double- dividend: It generates fiscal benefits without deteriorating the quality of job-matches. On the contrary, shortened benefit durations improve medium-run earnings and employment possibly through containing the negative effects of skill depreciation or stigmatization. While the first chapter provides only indirect evidence on the underlying behavioral channels, in the second chapter I develop a novel approach that allows to learn about the relative impor- tance of the two key margins of job search - reservation wage choice and search effort. In the framework of a standard non-stationary job search model, I show how the exit rate from un- employment can be decomposed in a way that is informative on reservation wage movements over the unemployment spell. The empirical analysis relies on a sharp discontinuity in unem- ployment benefit entitlement, which can be exploited in a regression-discontinuity approach to identify the effects of extended benefit durations on unemployment and survivor functions. I find evidence that calls for an important role of reservation wage choices for job search be- havior. This can have direct implications for the optimal design of unemployment insurance policies. 
The third chapter - while thematically detached from the other chapters - addresses one of the major policy challenges of the 21st century: climate change and resource consumption. Many governments have recently put energy efficiency on top of their agendas. While pricing instru- ments aimed at regulating the energy demand have often been found to be short-lived and difficult to enforce politically, the focus of energy conservation programs has shifted towards behavioral approaches - such as provision of information or social norm feedback. The third chapter describes a randomized controlled field experiment in which we discuss the effective- ness of different types of feedback on residential electricity consumption. We find that detailed and real-time feedback caused persistent electricity reductions on the order of 3 to 5 % of daily electricity consumption. Also social norm information can generate substantial electricity sav- ings when designed appropriately. The findings suggest that behavioral approaches constitute effective and relatively cheap way of improving residential energy-efficiency.

Relevância:

20.00% 20.00%

Publicador:

Resumo:

Introduction: According to guidelines, patients with coronary artery disease (CAD) should undergo revascularization if myocardial ischemia is present. While coronary angiography (CXA) allows the morphological assessment of CAD, the fractional flow reserve (FFR) has proved to be a complementary invasive test to assess the functional significance of CAD, i.e. to detect ischemia. Perfusion Cardiac Magnetic Resonance (CMR) has turned out to be a robust non-invasive technique to assess myocardial ischemia. The objective: is to compare the cost-effectiveness ratio - defined as the costs per patient correctly diagnosed - of two algorithms used to diagnose hemodynamically significant CAD in relation to the pretest likelihood of CAD: 1) aCMRto assess ischemia before referring positive patients to CXA (CMR + CXA), 2) a CXA in all patients combined with a FFR test in patients with angiographically positive stenoses (CXA + FFR). Methods: The costs, evaluated from the health care system perspective in the Swiss, German, the United Kingdom (UK) and the United States (US) contexts, included public prices of the different tests considered as outpatient procedures, complications' costs and costs induced by diagnosis errors (false negative). The effectiveness criterion wasthe ability to accurately identify apatient with significantCAD.Test performancesused in the model were based on the clinical literature. Using a mathematical model, we compared the cost-effectiveness ratio for both algorithms for hypothetical patient cohorts with different pretest likelihood of CAD. Results: The cost-effectiveness ratio decreased hyperbolically with increasing pretest likelihood of CAD for both strategies. CMR + CXA and CXA + FFR were equally costeffective at a pretest likelihood of CAD of 62% in Switzerland, 67% in Germany, 83% in the UK and 84% in the US with costs of CHF 5'794, Euros 1'472, £ 2'685 and $ 2'126 per patient correctly diagnosed. 
Below these thresholds, CMR + CXA showed lower costs per patient correctly diagnosed than CXA + FFR. Implications for the health care system/professionals/patients/society These results facilitate decision making for the clinical use of new generations of imaging procedures to detect ischemia. They show to what extent the cost-effectiveness to diagnose CAD depends on the prevalence of the disease.