992 results for Gradient methods


Relevance:

20.00%

Publisher:

Abstract:

Advanced kernel methods for remote sensing image classification
Devis Tuia, Institut de Géomatique et d'Analyse du Risque, September 2009

Abstract: The technical developments of recent years have brought the quantity and quality of digital information to an unprecedented level, and enormous archives of satellite images are now available to users. Even as these advances open more and more possibilities for the use of digital imagery, however, they also raise problems of storage and processing. The latter is the subject of this Thesis: the processing of very high spatial and spectral resolution images is addressed with data-driven approaches relying on kernel methods. In particular, the problem of image classification, i.e. the categorization of an image's pixels into a reduced number of classes reflecting the spectral and contextual properties of the objects they represent, is studied through the different models presented. The accent is put on algorithmic efficiency and on the simplicity of the proposed approaches, so as to avoid overly complex models that practitioners would not adopt. The major challenge of the Thesis is to remain close to concrete remote sensing problems without losing the methodological interest of the proposed methods from the machine learning viewpoint: in this sense, the work aims at building a bridge between the machine learning and remote sensing communities, and all the models have been developed with this synergy in mind. Four models are proposed. The first addresses the high dimensionality and collinearity (redundancy) of image features with an adaptive model that learns the relevant features of each image: a ranking of the variables (the spectral bands) is optimized jointly with the base classifier, so that only the features relevant to the problem are used and an accurate classifier is obtained automatically. The scarcity and unreliability of labeled information are the common root of the second and third models, based respectively on active learning and semi-supervised learning: the former improves the quality of the training set through direct interaction between the user and the machine, while the latter uses the unlabeled pixels to improve the description of the available data and the robustness of the model. Finally, the last model considers the more theoretical issue of structured outputs: integrating the similarity between outputs, a source of information so far never considered in remote sensing, opens new challenges and opportunities for remote sensing image processing.
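The active learning model lends itself to a compact illustration. The sketch below shows pool-based uncertainty sampling with an SVM in scikit-learn; the synthetic data, batch size, and margin-based query rule are illustrative assumptions, not the specific algorithm developed in the thesis.

```python
# Minimal pool-based active-learning loop (uncertainty sampling with an SVM).
# Illustrative only: data, batch size, and query rule are assumptions, not
# the thesis's actual method.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=2000, n_features=20, n_classes=3,
                           n_informative=10, random_state=0)
labeled = list(range(30))                          # small initial training set
pool = [i for i in range(len(y)) if i not in labeled]

clf = SVC(kernel="rbf", gamma="scale")
for _ in range(10):                                # ten user-interaction rounds
    clf.fit(X[labeled], y[labeled])
    # Query the pool samples closest to a decision boundary (least certain).
    uncertainty = np.abs(clf.decision_function(X[pool])).min(axis=1)
    query = [pool[i] for i in np.argsort(uncertainty)[:10]]
    labeled += query                               # the "oracle" labels them
    pool = [i for i in pool if i not in query]
print("final training-set size:", len(labeled))
```

Each round mimics the user-machine interaction described above: the classifier asks for labels only where it is least certain, so the training set grows where it matters most.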

Relevance:

20.00%

Publisher:

Abstract:

In comparison with other micronutrients, the levels of nickel (Ni) available in soils and plant tissues are very low, making quantification difficult. The objective of this paper is to present optimized methods for the determination of available Ni in soils by extractants and of total Ni content in plant tissues, suitable for routine commercial laboratory analyses. Samples of natural and agricultural soils were processed and analyzed by Mehlich-1 and DTPA extraction. To quantify Ni in plant tissues, samples were digested with nitric acid in a closed system in a microwave oven. Measurement was performed by inductively coupled plasma optical emission spectrometry (ICP-OES). There was a positive and significant correlation between the levels of available Ni in the soils subjected to Mehlich-1 and DTPA extraction, while for plant tissue samples the Ni recoveries were high and similar to the reference materials. The available Ni in some of the natural soil and plant tissue samples was below the limit of quantification. Concentrations of this micronutrient were higher in the soil samples to which Ni had been applied. Nickel concentration differed among the plant parts analyzed, with the highest levels in soybean grains. Grain concentrations, in comparison with shoot and leaf concentrations, were better correlated with the soil available levels for both extractants. The methods described in this article were efficient in quantifying Ni and can be used for routine laboratory analysis of soils and plant tissues.
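As a small illustration of the extractant comparison, the snippet below computes the Pearson correlation between Mehlich-1 and DTPA readings; the concentration values are invented for illustration, and only the statistic mirrors the analysis reported above.

```python
# Hypothetical illustration of the Mehlich-1 vs DTPA comparison: the Ni
# concentrations below are invented; only the correlation test mirrors the
# analysis reported in the abstract.
from scipy.stats import pearsonr

mehlich1_ni = [0.12, 0.35, 0.80, 1.40, 2.10, 3.60]   # mg/dm3, assumed values
dtpa_ni     = [0.09, 0.30, 0.70, 1.20, 1.90, 3.10]   # mg/dm3, assumed values

r, p = pearsonr(mehlich1_ni, dtpa_ni)
print(f"Pearson r = {r:.3f}, p = {p:.4f}")           # positive and significant
```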

Relevance:

20.00%

Publisher:

Abstract:

A simple and sensitive LC-MS method was developed and validated for the simultaneous quantification of aripiprazole (ARI), atomoxetine (ATO), duloxetine (DUL), clozapine (CLO), olanzapine (OLA), sertindole (STN), venlafaxine (VEN) and their active metabolites dehydroaripiprazole (DARI), norclozapine (NCLO), dehydrosertindole (DSTN) and O-desmethylvenlafaxine (OVEN) in human plasma. The compounds and the internal standard (remoxipride) were extracted from 0.5 mL of plasma by solid-phase extraction (mixed-mode support). The analytical separation was carried out by reversed-phase liquid chromatography at basic pH (pH 8.1) in gradient mode. All analytes were monitored by MS detection in single-ion-monitoring mode, and the method was validated over the corresponding therapeutic ranges: 2-200 ng/mL for DUL, OLA and STN; 4-200 ng/mL for DSTN; 5-1000 ng/mL for ARI and DARI; and 2-1000 ng/mL for ATO, CLO, NCLO, VEN and OVEN. For all investigated compounds, good performance in terms of recovery, selectivity, stability, repeatability, intermediate precision, trueness and accuracy was obtained. Real patient plasma samples were then successfully analysed.
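The validated calibration ranges map naturally onto a small lookup, sketched below; the ranges come from the abstract, while the helper function and the test concentrations are hypothetical.

```python
# Validated calibration ranges (ng/mL) quoted in the abstract; the helper and
# the sample concentrations are hypothetical.
RANGES_NG_ML = {
    "DUL": (2, 200), "OLA": (2, 200), "STN": (2, 200),
    "DSTN": (4, 200),
    "ARI": (5, 1000), "DARI": (5, 1000),
    "ATO": (2, 1000), "CLO": (2, 1000), "NCLO": (2, 1000),
    "VEN": (2, 1000), "OVEN": (2, 1000),
}

def within_validated_range(analyte: str, conc_ng_ml: float) -> bool:
    """True if a measured concentration lies inside the validated range."""
    lo, hi = RANGES_NG_ML[analyte]
    return lo <= conc_ng_ml <= hi

print(within_validated_range("CLO", 350.0))  # True: within 2-1000 ng/mL
print(within_validated_range("OLA", 450.0))  # False: above the 200 ng/mL limit
```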

Relevance:

20.00%

Publisher:

Abstract:

Debris accumulation on bridge piers is an ongoing national problem that can obstruct the waterway openings at bridges and result in significant erosion of stream banks and scour at abutments and piers. In some cases, the accumulation of debris can adversely affect the operation of the waterway opening or cause failure of the structure. In addition, removal of debris accumulations is difficult, time consuming, and expensive for maintenance programs. This research involves a literature search of publications, products, and pier design recommendations that provide cost-effective methods to mitigate debris accumulation at bridges. In addition, a nationwide survey was conducted to determine the state of the practice, and the results are presented herein.

Relevance:

20.00%

Publisher:

Abstract:

Traffic safety engineers are among the early adopters of Bayesian statistical tools for analyzing crash data. As in many other areas of application, empirical Bayes methods were their first choice, perhaps because they represent an intuitively appealing yet relatively easy-to-implement alternative to purely classical approaches. With the enormous progress in numerical methods made in recent years and with the availability of free, easy-to-use software that permits implementing a fully Bayesian approach, however, there is now ample justification for moving to fully Bayesian analyses of crash data. The fully Bayesian approach, in particular as implemented via multi-level hierarchical models, has many advantages over the empirical Bayes approach. In a full Bayesian analysis, prior information and all available data are seamlessly integrated into posterior distributions on which practitioners can base their inferences. All uncertainties are thus accounted for in the analyses, and there is no need to pre-process data to obtain Safety Performance Functions and other such prior estimates of the effect of covariates on the outcome of interest. In this light, fully Bayesian methods may well be less costly to implement and may yield safety estimates with more realistic standard errors. In this manuscript, we present the full Bayesian approach to analyzing traffic safety data and focus on highlighting the differences between the empirical Bayes and full Bayes approaches. We use an illustrative example to discuss a step-by-step Bayesian analysis of the data and to show some of the types of inferences that are possible within the full Bayesian framework.
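To make the contrast concrete, here is a minimal sketch of the kind of multi-level hierarchical crash-count model the manuscript advocates, written in PyMC as a stand-in for whichever package one prefers; the simulated counts, the single exposure covariate, and the priors are all assumptions, not the paper's actual example.

```python
# Minimal full-Bayes sketch of a multi-level crash-count model in PyMC.
# Data, priors, and the single covariate (traffic volume) are illustrative
# assumptions, not the manuscript's actual example.
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
n_sites = 40
log_aadt = rng.normal(9.0, 0.5, n_sites)           # log traffic volume
crashes = rng.poisson(np.exp(-6.0 + 0.7 * log_aadt))

with pm.Model() as model:
    alpha = pm.Normal("alpha", 0.0, 10.0)          # intercept
    beta = pm.Normal("beta", 0.0, 1.0)             # exposure effect
    sigma = pm.HalfNormal("sigma", 1.0)            # between-site spread
    eps = pm.Normal("eps", 0.0, sigma, shape=n_sites)  # site random effects
    mu = pm.math.exp(alpha + beta * log_aadt + eps)
    pm.Poisson("y", mu=mu, observed=crashes)
    idata = pm.sample(1000, tune=1000, chains=2, random_seed=0)

# The posterior carries all uncertainty; no pre-processed SPF is needed.
print(idata.posterior["beta"].mean().item())
```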

Relevance:

20.00%

Publisher:

Abstract:

DnaSP is a software package for the analysis of DNA polymorphism data. The present version introduces several new modules and features which, among other options, allow: (1) handling big data sets (~5 Mb per sequence); (2) conducting a large number of coalescent-based tests by Monte Carlo computer simulation; (3) extensive analyses of genetic differentiation and gene flow among populations; (4) analysing the evolutionary pattern of preferred and unpreferred codons; (5) generating graphical outputs for easy visualization of results. Availability: the software package, including complete documentation and examples, is freely available to academic users from http://www.ub.es/dnasp

Relevance:

20.00%

Publisher:

Abstract:

Background: Molecular tools may help to uncover closely related and still diverging species from a wide variety of taxa and provide insight into the mechanisms, pace and geography of marine speciation. There is some controversy over the phylogeography and speciation modes of species groups with an Eastern Atlantic-Western Indian Ocean distribution, with previous studies suggesting that older (Miocene) events and/or more recent (Pleistocene) oceanographic processes could have influenced the phylogeny of marine taxa. The spiny lobster genus Palinurus allows testing among speciation hypotheses, since it has a particular distribution with two groups of three species each in the Northeastern Atlantic (P. elephas, P. mauritanicus and P. charlestoni) and in the Southeastern Atlantic and Southwestern Indian Oceans (P. gilchristi, P. delagoae and P. barbarae). In the present study, we obtain a more complete understanding of the phylogenetic relationships among these species through a combined dataset with both nuclear and mitochondrial markers, testing alternative hypotheses on both the mutation rate and the tree topology under recently developed approximate Bayesian computation (ABC) methods. Results: Our analyses support a North-to-South speciation pattern in Palinurus, with all the South African species forming a monophyletic clade nested within the Northern Hemisphere species. Coalescent-based ABC methods allowed us to reject the previously proposed hypothesis of a Middle Miocene speciation event related to the closure of the Tethyan Seaway. Instead, divergence times obtained for Palinurus species using the combined mtDNA-microsatellite dataset and standard mutation rates for mtDNA agree with known glaciation-related processes occurring during the last 2 My. Conclusion: The Palinurus speciation pattern is a typical example of a series of rapid speciation events occurring within a group, with very short branches separating the different species. Our results support the hypothesis that recent climate-change-related oceanographic processes have influenced the phylogeny of marine taxa, with most Palinurus species originating during the last two million years. The present study highlights the value of new coalescent-based statistical methods such as ABC for testing different speciation hypotheses using molecular data.
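A toy version of the ABC machinery used here is sketched below: divergence times are drawn from a prior, a summary statistic is simulated under an assumed mtDNA clock, and draws close to the observed value are retained. All numbers (clock rate, observed divergence, tolerance) are invented for illustration.

```python
# Toy ABC rejection sampler in the spirit of the coalescent-based tests above.
# The clock rate, "observed" divergence, and tolerance are invented.
import numpy as np

rng = np.random.default_rng(1)
obs_div = 0.028              # assumed observed pairwise mtDNA divergence
mu = 0.007                   # substitutions/site/My, assumed clock rate

prior_t = rng.uniform(0.1, 20.0, 200_000)        # divergence time (My) prior
sims = rng.normal(2.0 * mu * prior_t, 0.005)     # simulated divergence + noise
accepted = prior_t[np.abs(sims - obs_div) < 0.002]

print(f"posterior mean ~ {accepted.mean():.2f} My, "
      f"acceptance rate {accepted.size / prior_t.size:.1%}")
```

With these made-up numbers the accepted draws concentrate near 2 My, echoing the Pleistocene timescale the study reports.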

Relevance:

20.00%

Publisher:

Abstract:

Accurate modeling of flow instabilities requires computational tools able to deal with several interacting scales, from the scale at which fingers are triggered up to the scale at which their effects need to be described. The Multiscale Finite Volume (MsFV) method offers a framework to couple fine- and coarse-scale features by solving a set of localized problems which are used both to define a coarse-scale problem and to reconstruct the fine-scale details of the flow. The MsFV method can be seen as an upscaling-downscaling technique, computationally more efficient than standard discretization schemes and more accurate than traditional upscaling techniques. We show that, although the method has proven accurate in modeling density-driven flow under stable conditions, its accuracy deteriorates for unstable flow, and an iterative scheme is required to control the localization error. To avoid the large computational overhead of the iterative scheme, we suggest several adaptive strategies for both flow and transport. In particular, the concentration gradient is used to identify a front region where instabilities are triggered and an accurate (iteratively improved) solution is required; outside the front region the problem is upscaled, and both flow and transport are solved only at the coarse scale. This adaptive strategy yields very accurate solutions at roughly the same computational cost as the non-iterative MsFV method. In many circumstances, however, an accurate description of flow instabilities requires a refinement of the computational grid rather than a coarsening. For these problems, we propose a modified iterative MsFV that can be used as a downscaling method (DMsFV). Compared to other grid refinement techniques, the DMsFV clearly separates the computational domain into refined and non-refined regions, which can be treated separately and matched later. This gives great flexibility to employ different physical descriptions in different regions, where different equations could be solved, offering an excellent framework to construct hybrid methods.
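The front-detection criterion can be sketched in a few lines: compute the concentration gradient on the fine grid and flag cells above a threshold as the region that keeps the iteratively improved solve. The grid, concentration field, and threshold below are assumptions for illustration, not the paper's actual test case.

```python
# Sketch of the adaptive criterion: cells with a steep concentration gradient
# form the "front region" needing the iteratively improved fine-scale solve;
# everything else is treated at the coarse scale. Field and threshold assumed.
import numpy as np

nx, ny, L = 100, 100, 1.0
x = np.linspace(0.0, L, nx)
c = np.tile(1.0 / (1.0 + np.exp((x - 0.4 * L) / 0.02)), (ny, 1))  # sharp front

dcdy, dcdx = np.gradient(c, L / (ny - 1), L / (nx - 1))
grad_mag = np.hypot(dcdx, dcdy)
front = grad_mag > 0.2 * grad_mag.max()       # assumed adaptivity threshold

print(f"front (iterative, fine-scale) cells: {front.sum()} of {front.size}; "
      "the remainder is solved only at the coarse scale")
```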

Relevance:

20.00%

Publisher:

Abstract:

There is a renewal of interest among psychotherapy researchers and psychotherapists in psychotherapy case studies. This article presents two paradigms that have greatly influenced this growing interest: the pragmatic case study paradigm and the theory-building case study paradigm. The origins, developments and key concepts of both paradigms are presented, as well as their methodological and ethical specificities. Examples of case studies, along with the models developed, are cited. The differential influence of the post-modern schools on both paradigms is presented, and their contribution to the field of psychotherapy case study methods is discussed and assessed in terms of relevance for the researcher and the psychotherapist.

Relevance:

20.00%

Publisher:

Abstract:

Chronic hepatitis C is a major healthcare problem. The response to antiviral therapy in patients with chronic hepatitis C has previously been defined biochemically and by PCR. However, changes in the hepatic venous pressure gradient (HVPG) may be considered an adjunctive end point for the therapeutic evaluation of antiviral therapy in chronic hepatitis C. HVPG measurement is a validated technique that is safe, well tolerated, well established, and reproducible. Serial HVPG measurements may be the best way to evaluate response to therapy in chronic hepatitis C.

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND/AIMS/METHODS: During hepatic vein catheterisation, in addition to measurement of the hepatic venous pressure gradient (HVPG), wedged retrograde portography with iodinated contrast can easily be obtained. However, it rarely allows correct visualisation of the portal vein. Recently, CO2 has been suggested to allow better angiographic demonstration of the portal vein than iodine. In this study we investigated the efficacy of CO2 compared with iodinated contrast medium for portal vein imaging and its role in the evaluation of portal hypertension in a series of 100 patients undergoing hepatic vein catheterisation, 71 of whom had liver cirrhosis. RESULTS: In the overall series, CO2 venography was markedly superior to iodine, allowing correct visualisation of the different segments of the portal venous system. In addition, CO2, but not iodine, visualised portal-systemic collaterals in 34 patients. In cirrhosis, non-visualisation of the portal vein on CO2 venography occurred in 11 cases; four had portal vein thrombosis and five had communications between different hepatic veins. Among non-cirrhotics, lack of portal vein visualisation had 90% sensitivity, 88% specificity, 94% negative predictive value, and 83% positive predictive value for the diagnosis of pre-sinusoidal portal hypertension. CONCLUSIONS: Visualisation of the portal venous system by CO2 venography is markedly superior to iodine. CO2 wedged portography is a useful and safe complementary procedure during hepatic vein catheterisation which may help to detect portal thrombosis. Also, lack of demonstration of the portal vein in non-cirrhotic patients strongly suggests the presence of pre-sinusoidal portal hypertension.
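The diagnostic figures quoted for non-cirrhotic patients follow from standard confusion-matrix arithmetic, sketched below; the counts are invented to roughly reproduce the reported percentages.

```python
# Standard diagnostic arithmetic behind the figures quoted above (90%
# sensitivity, 88% specificity, 94% NPV, 83% PPV). The counts are invented
# to roughly reproduce those percentages.
def diagnostics(tp: int, fn: int, tn: int, fp: int) -> dict:
    return {
        "sensitivity": tp / (tp + fn),   # true positives among diseased
        "specificity": tn / (tn + fp),   # true negatives among non-diseased
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

print(diagnostics(tp=9, fn=1, tn=15, fp=2))
# -> sensitivity 0.90, specificity ~0.88, ppv ~0.82, npv ~0.94
```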

Relevance:

20.00%

Publisher:

Abstract:

Background and aims: Previous clinical trials suggest that adding non-selective beta-blockers improves the efficacy of endoscopic band ligation (EBL) in the prevention of recurrent bleeding, but no study has evaluated whether EBL improves the efficacy of beta-blockers + isosorbide-5-mononitrate. The present study evaluated this issue in a multicentre randomised controlled trial (RCT) and correlated changes in the hepatic venous pressure gradient (HVPG) during treatment with clinical outcomes. Methods: 158 patients with cirrhosis, admitted because of variceal bleeding, were randomised to receive nadolol + isosorbide-5-mononitrate alone (Drug; n=78) or combined with EBL (Drug+EBL; n=80). HVPG measurements were performed at randomisation and after 4-6 weeks on medical therapy. Results: Median follow-up was 15 months. The one-year probability of recurrent bleeding was similar in both groups (33% vs 26%; p=0.3). There were no significant differences in survival or need for rescue shunts. Overall adverse events, and those requiring hospital admission, were significantly more frequent in the Drug+EBL group. Recurrent bleeding was significantly more frequent in HVPG non-responders than in responders (responders: HVPG reduction ≥20% from baseline or to ≤12 mm Hg). Among non-responders, recurrent bleeding was similar in patients treated with Drug or Drug+EBL. Conclusions: Adding EBL to pharmacological treatment did not reduce recurrent bleeding, the need for rescue therapy, or mortality, and was associated with more adverse events. Furthermore, associating EBL with drug therapy did not reduce the high rebleeding risk of HVPG non-responders.
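The haemodynamic response rule used in the trial reduces to a one-line check, sketched below; the example pressures are invented.

```python
# HVPG response rule used above: responder if the gradient falls by >=20%
# from baseline or drops to <=12 mm Hg. Example values are invented.
def hvpg_responder(baseline_mmhg: float, on_treatment_mmhg: float) -> bool:
    relative_fall = (baseline_mmhg - on_treatment_mmhg) / baseline_mmhg
    return relative_fall >= 0.20 or on_treatment_mmhg <= 12.0

print(hvpg_responder(20.0, 15.0))  # True: 25% fall
print(hvpg_responder(18.0, 16.0))  # False: ~11% fall and still above 12 mm Hg
```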