773 results for minimization


Relevance: 10.00%

Abstract:

Objective: Authors interested in the relationship between denial, minimization, and cognitive distortions have all used different methods and definitions to describe these concepts, resulting in substantial variability in the findings. The present research therefore aims to clarify the measurement of denial, minimization, and cognitive distortions. Method: Participants were 313 male inmates who completed the Correctional Service of Canada's national sex offender treatment program between 2000 and 2004. These individuals completed a battery of psychometric tests before and after participating in the program, including the SOARS and the Bumby scales. Data analysis followed the construct-validation process established by Nunnally and Bernstein (1994). Results: The statistical analyses indicate that the Sex Offender Acceptance of Responsibility Scales (SOARS; Peacock, 2000) does not effectively measure the construct of denial and minimization; its psychometric properties are questionable. Reducing the instrument to ten variables, however, improves the measurement. The resulting scale comprises two factors, "acceptance of sexual harm" and "acceptance of sexual intent." These two factors were related to the factors of the Bumby scales in order to explore the similarities among the concepts of denial, minimization, and cognitive distortion. Despite weak-to-moderate correlations, the variables do not converge on any common factor in a factor analysis, and the SOARS variables correlate very little with the scale total, suggesting that these are distinct concepts.
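
For readers unfamiliar with the construct-validation steps cited above (Nunnally and Bernstein, 1994), the following minimal sketch shows two of the routine computations involved, internal consistency (Cronbach's alpha) and an exploratory factor analysis, on synthetic item responses. The data, item counts, and factor structure are illustrative placeholders, not the actual SOARS or Bumby items.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)

# Synthetic responses: 313 respondents x 10 items driven by two latent factors
# (purely illustrative; not the study's data).
n_respondents, n_items = 313, 10
latent = rng.normal(size=(n_respondents, 2))
loadings = rng.uniform(0.4, 0.9, size=(2, n_items))
items = latent @ loadings + rng.normal(scale=0.5, size=(n_respondents, n_items))

def cronbach_alpha(x):
    """Internal consistency of a set of items (rows = respondents)."""
    k = x.shape[1]
    item_var = x.var(axis=0, ddof=1).sum()
    total_var = x.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

print(f"Cronbach's alpha: {cronbach_alpha(items):.2f}")

# Exploratory factor analysis with two factors, echoing the reduced two-factor scale.
fa = FactorAnalysis(n_components=2, random_state=0).fit(items)
print("Factor loadings (items x factors):")
print(np.round(fa.components_.T, 2))
```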

Relevance: 10.00%

Abstract:

The need for reliable predictions of the solar activity cycle motivates the development of dynamo models incorporating a representation of surface processes sufficiently detailed to allow assimilation of magnetographic data. In this series of papers we present one such dynamo model, and document its behavior and properties. This first paper focuses on one of the model's key components, namely surface magnetic flux evolution. Using a genetic algorithm, we obtain best-fit parameters of the transport model by least-squares minimization of the differences between the associated synthetic synoptic magnetogram and real magnetographic data for activity cycle 21. Our fitting procedure also returns Monte Carlo-like error estimates. We show that the range of acceptable surface meridional flow profiles is in good agreement with Doppler measurements, even though the latter are not used in the fitting process. Using a synthetic database of bipolar magnetic region (BMR) emergences reproducing the statistical properties of observed emergences, we also ascertain the sensitivity of global cycle properties, such as the strength of the dipole moment and timing of polarity reversal, to distinct realizations of BMR emergence, and on this basis argue that this stochasticity represents a primary source of uncertainty for predicting solar cycle characteristics.
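
As a rough illustration of fitting transport-model parameters by least-squares minimization with a genetic-style global optimizer, the sketch below fits a toy model to noisy synthetic "observations" using SciPy's differential evolution. The model, parameter names, and data are placeholders, not the surface flux-transport model or the cycle 21 magnetograms used in the paper.

```python
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(1)

# Toy "observations": a damped oscillation standing in for synoptic-magnetogram data.
t = np.linspace(0.0, 10.0, 200)
true_params = (1.5, 0.3, 2.0)            # amplitude, damping, frequency (illustrative)

def model(params, t):
    a, d, w = params
    return a * np.exp(-d * t) * np.cos(w * t)

obs = model(true_params, t) + rng.normal(scale=0.05, size=t.size)

# Least-squares objective minimized by the evolutionary optimizer.
def chi2(params):
    return np.sum((model(params, t) - obs) ** 2)

result = differential_evolution(chi2, bounds=[(0.1, 5.0), (0.0, 1.0), (0.5, 5.0)], seed=1)
print("best-fit parameters:", np.round(result.x, 3))
print("residual sum of squares:", round(result.fun, 4))
```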

Relevance: 10.00%

Abstract:

One of the major concerns of scoliotic patients undergoing spinal correction surgery is the trunk's external appearance after the surgery. This paper presents a novel incremental approach for simulating postoperative trunk shape in scoliosis surgery. Preoperative and postoperative trunk shape data were obtained using three-dimensional medical imaging techniques for seven patients with adolescent idiopathic scoliosis. Results of qualitative and quantitative evaluations, based on the comparison of the simulated and actual postoperative trunk surfaces, showed adequate accuracy of the method. Our approach provides a candidate simulation tool for use in a clinical environment during the surgical planning process.
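
A quantitative comparison between a simulated and an actual postoperative surface of the kind mentioned above typically reduces to point-to-surface distances. The sketch below computes root-mean-square nearest-neighbour distances between two point clouds with a k-d tree; the point clouds are synthetic placeholders, not the patient data from the study.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(2)

# Two synthetic "trunk surfaces" as point clouds (illustrative only).
actual = rng.uniform(size=(5000, 3))
simulated = actual + rng.normal(scale=0.01, size=actual.shape)  # small simulation error

# For each simulated point, distance to the nearest point of the actual surface.
tree = cKDTree(actual)
dists, _ = tree.query(simulated)

print(f"RMS surface distance: {np.sqrt(np.mean(dists**2)):.4f}")
print(f"max surface distance: {dists.max():.4f}")
```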

Relevance: 10.00%

Abstract:

A bivariate semi-Pareto distribution is introduced and characterized using geometric minimization. Autoregressive minification models for bivariate random vectors with bivariate semi-Pareto and bivariate Pareto distributions are also discussed. Multivariate generalizations of the distributions and the processes are briefly indicated.
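
To make the notion of an autoregressive minification model concrete, here is a minimal simulation of a generic first-order minification recursion X_n = K * min(X_{n-1}, eps_n) with Pareto-type innovations. The constants and the innovation law are illustrative only and do not reproduce the bivariate semi-Pareto construction of the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

def pareto_innovations(n, alpha=2.0, scale=1.0):
    # Classical Pareto(alpha) variates on [scale, inf): scale * U**(-1/alpha).
    return scale * rng.uniform(size=n) ** (-1.0 / alpha)

def minification_series(n, K=1.3, alpha=2.0, x0=1.0):
    """Generic first-order minification recursion X_n = K * min(X_{n-1}, eps_n)."""
    x = np.empty(n)
    x[0] = x0
    eps = pareto_innovations(n, alpha=alpha)
    for i in range(1, n):
        x[i] = K * min(x[i - 1], eps[i])
    return x

series = minification_series(10_000)
print("sample mean:", round(series.mean(), 3))
print("sample 95th percentile:", round(np.quantile(series, 0.95), 3))
```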

Relevance: 10.00%

Abstract:

In this thesis, nonoverlapping domain decomposition methods are, on the one hand, generalized with respect to the problem classes to be solved and, on the other hand, studied in contexts not previously investigated. The focus is on functional-analytic questions of well-definedness, unique solvability, and convergence. The first part treats linear elliptic Dirichlet boundary value problems, allowing not only problems with a dominant principal part but also problems with a singular perturbation of it, such as convection- or reaction-dominated problems. The second part deals with (uniformly) monotone coercive quasilinear elliptic Dirichlet boundary value problems. In both cases the Lipschitz domain is decomposed into finitely many Lipschitz subdomains; in particular, cross points and subdomains without exterior boundary are allowed. Transmission problems with freely chosen $L^{\infty}$ parameter functions are then derived, with the conormal derivatives interpreted as functionals on suitable function spaces over the interfaces ($H_{00}^{1/2}(\Gamma)$). The iterative solution of these transmission problems with an approach due to Deng leads to a substructuring method with Robin-type transmission conditions in which, thanks to a judicious update of the Robin data, no evaluation of the conormal derivatives is necessary (in particular, the well-known Robin-Robin method of Lions is included as a special case). Convergence with respect to a partitioned $H^1$ norm is shown for both problem classes, without regularity requirements on the solutions beyond $H^1$ and without additional smoothness assumptions on the domains. The last chapter studies nonmonotone coercive quasilinear problems, with the underlying domain decomposed into only two Lipschitz subdomains. The associated nonlinear transmission problem is transformed by a Kirchhoff transformation into linear subproblems with nonlinear coupling conditions. An optimization-based solution approach, which minimizes a suitable distance between the back-transformed Dirichlet data of the linear subproblems on the interfaces, leads to an optimal control problem. The resulting regularized unconstrained minimization problems are solved with a gradient method under minimal smoothness requirements on the nonlinearities. Under additional smoothness assumptions on the nonlinearities and further technical assumptions on the solution of the original quasilinear problem, quadratic convergence of Newton's method can also be guaranteed.
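
The Robin-data update that avoids evaluating conormal derivatives can be seen in a one-dimensional toy problem: the sketch below applies a Lions-type Robin-Robin iteration to -u'' = 1 on (0,1) with two subdomains and a simple finite-difference discretization. It is only a minimal linear model problem, not the quasilinear setting or the $H_{00}^{1/2}(\Gamma)$ framework analysed in the thesis.

```python
import numpy as np

# -u'' = 1 on (0,1), u(0)=u(1)=0; exact solution u(x) = x(1-x)/2.
# Two subdomains (0,0.5) and (0.5,1), Robin transmission conditions with parameter lam.
h, lam = 0.01, 1.0
x1 = np.arange(0.0, 0.5 + h / 2, h)      # nodes of subdomain 1 (interface = last node)
x2 = np.arange(0.5, 1.0 + h / 2, h)      # nodes of subdomain 2 (interface = first node)

def solve_left(g1):
    """Solve -u''=1 on (0,0.5), u(0)=0, u'(0.5) + lam*u(0.5) = g1."""
    n = len(x1) - 1                      # unknowns u[1..n]
    A = np.zeros((n, n)); b = np.full(n, 1.0)
    for i in range(n - 1):               # interior finite-difference rows
        if i > 0: A[i, i - 1] = -1 / h**2
        A[i, i] = 2 / h**2
        A[i, i + 1] = -1 / h**2
    A[n - 1, n - 2] = -1 / h             # one-sided Robin condition at the interface
    A[n - 1, n - 1] = 1 / h + lam
    b[n - 1] = g1
    return np.concatenate(([0.0], np.linalg.solve(A, b)))

def solve_right(g2):
    """Solve -u''=1 on (0.5,1), -u'(0.5) + lam*u(0.5) = g2, u(1)=0."""
    n = len(x2) - 1                      # unknowns u[0..n-1]
    A = np.zeros((n, n)); b = np.full(n, 1.0)
    A[0, 0] = 1 / h + lam                # one-sided Robin condition at the interface
    A[0, 1] = -1 / h
    b[0] = g2
    for i in range(1, n):
        A[i, i - 1] = -1 / h**2
        A[i, i] = 2 / h**2
        if i + 1 < n: A[i, i + 1] = -1 / h**2
    return np.concatenate((np.linalg.solve(A, b), [0.0]))

# Robin-Robin iteration: the Robin data are updated from interface *values* only,
# so conormal derivatives never have to be evaluated.
g1 = g2 = 0.0
for _ in range(50):
    u1 = solve_left(g1)
    g2 = 2 * lam * u1[-1] - g1           # new Robin datum for the right subdomain
    u2 = solve_right(g2)
    g1 = 2 * lam * u2[0] - g2            # new Robin datum for the left subdomain

exact = lambda x: x * (1 - x) / 2
print("interface mismatch:", abs(u1[-1] - u2[0]))
print("max error vs exact:", max(np.abs(u1 - exact(x1)).max(), np.abs(u2 - exact(x2)).max()))
```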

Relevance: 10.00%

Abstract:

This thesis investigates a method for human-robot interaction (HRI) that maintains the productivity of industrial robots, for example by minimizing operation time, while ensuring human safety, for example through collision avoidance. To solve such problems, an online motion planning approach for robotic manipulators with HRI is proposed. The approach is based on model predictive control (MPC) with embedded mixed-integer programming. The planning strategies considered in the thesis operate directly in the workspace, where obstacles are easy to represent. The non-convex optimization problem is approximated by a mixed-integer program (MIP), which is further reformulated so that the number of binary variables and the number of feasible integer solutions are drastically decreased. Safety-relevant regions, which are potentially occupied by the human operators, can be generated online by a proposed method based on hidden Markov models. In contrast to previous approaches, which derive predictions in the form of single points (such as most likely or expected human positions) from probability density functions, the proposed method computes safety-relevant subsets of the workspace as regions possibly occupied by the human at future instances of time. The method is further enhanced by reachability analysis to increase the prediction accuracy. These safety-relevant regions subsequently serve as safety constraints when the motion is planned by optimization. This yields motion plans that are safe, i.e. plans that avoid collision with a probability no less than a predefined threshold. The developed methods have been successfully applied to a demonstrator in which an industrial robot works in the same space as a human operator. The task of the industrial robot is to drive its end-effector through a nominal sequence of gripping-motion-releasing operations while avoiding collision with a human arm.
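
The key modelling step, turning the non-convex "stay outside the occupied region" requirement into mixed-integer constraints, can be illustrated with a single point and a rectangular safety region. The sketch below uses a standard big-M disjunction and SciPy's MILP solver (scipy.optimize.milp, SciPy 1.9+); it is a toy reformulation in the spirit of the thesis, not the actual MPC formulation or the HMM-based region prediction.

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# Safety-relevant region (to be avoided) and a target that happens to lie inside it.
xmin, xmax, ymin, ymax = 0.0, 1.0, 0.0, 1.0
tx, ty = 0.4, 0.6
M = 10.0                                    # big-M constant, large enough for the workspace

# Variables: x, y, ex, ey, b1, b2, b3, b4.
# ex, ey bound the L1 distance to the target; the binaries b_i activate the four
# "outside" half-planes of the disjunction.
c = np.array([0, 0, 1, 1, 0, 0, 0, 0])      # minimize ex + ey

A, lb, ub = [], [], []
def add(row, lo, hi):
    A.append(row); lb.append(lo); ub.append(hi)

# |x - tx| <= ex  and  |y - ty| <= ey
add([ 1, 0, -1, 0, 0, 0, 0, 0], -np.inf, tx)
add([-1, 0, -1, 0, 0, 0, 0, 0], -np.inf, -tx)
add([0,  1, 0, -1, 0, 0, 0, 0], -np.inf, ty)
add([0, -1, 0, -1, 0, 0, 0, 0], -np.inf, -ty)

# Disjunction "point outside the box": an active b_i enforces its half-plane.
add([ 1, 0, 0, 0, M, 0, 0, 0], -np.inf, xmin + M)    # b1 = 1  ->  x <= xmin
add([-1, 0, 0, 0, 0, M, 0, 0], -np.inf, -xmax + M)   # b2 = 1  ->  x >= xmax
add([0,  1, 0, 0, 0, 0, M, 0], -np.inf, ymin + M)    # b3 = 1  ->  y <= ymin
add([0, -1, 0, 0, 0, 0, 0, M], -np.inf, -ymax + M)   # b4 = 1  ->  y >= ymax
add([0, 0, 0, 0, 1, 1, 1, 1], 1, 4)                  # at least one disjunct active

res = milp(
    c=c,
    constraints=LinearConstraint(np.array(A), lb, ub),
    integrality=np.array([0, 0, 0, 0, 1, 1, 1, 1]),
    bounds=Bounds([-5, -5, 0, 0, 0, 0, 0, 0], [5, 5, 10, 10, 1, 1, 1, 1]),
)
x, y = res.x[:2]
print(f"closest safe position to the target: ({x:.2f}, {y:.2f}), L1 distance {res.fun:.2f}")
```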

Relevance: 10.00%

Abstract:

The work described in this thesis began as an inquiry into the nature and use of optimization programs based on "genetic algorithms." That inquiry led, eventually, to three powerful heuristics that are broadly applicable in gradient-ascent programs: First, remember the locations of local maxima and restart the optimization program at a place distant from previously located local maxima. Second, adjust the size of probing steps to suit the local nature of the terrain, shrinking when probes do poorly and growing when probes do well. And third, keep track of the directions of recent successes, so as to probe preferentially in the direction of most rapid ascent. These algorithms lie at the core of a novel optimization program that illustrates the power to be had from deploying them together. The efficacy of this program is demonstrated on several test problems selected from a variety of fields, including De Jong's famous test-problem suite, the traveling salesman problem, the problem of coordinate registration for image guided surgery, the energy minimization problem for determining the shape of organic molecules, and the problem of assessing the structure of sedimentary deposits using seismic data.
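
A compact way to see the three heuristics working together is a simple stochastic hill climber: it remembers previously found maxima and restarts far from them, adapts its step size to recent success, and biases probes toward the direction of recent improvement. The sketch below does this for a small multimodal test function; it is only an illustration of the heuristics, not the thesis's actual optimizer or test suite.

```python
import numpy as np

rng = np.random.default_rng(4)

def f(x):
    # A small multimodal test function on [-5, 5]^2 (illustrative only).
    return np.sin(3 * x[0]) * np.cos(2 * x[1]) - 0.05 * (x[0] ** 2 + x[1] ** 2)

def restart_point(known_maxima, dim=2, tries=100):
    """Heuristic 1: restart far away from the maxima already found."""
    candidates = rng.uniform(-5, 5, size=(tries, dim))
    if not known_maxima:
        return candidates[0]
    dists = np.min(np.linalg.norm(candidates[:, None] - np.array(known_maxima), axis=2), axis=1)
    return candidates[np.argmax(dists)]

def climb(x, steps=400):
    step = 0.5                      # Heuristic 2: step size adapts to success/failure.
    momentum = np.zeros_like(x)     # Heuristic 3: remember the direction of recent success.
    best = f(x)
    for _ in range(steps):
        probe = x + step * rng.normal(size=x.shape) + momentum
        val = f(probe)
        if val > best:
            momentum = 0.5 * momentum + 0.5 * (probe - x)
            x, best = probe, val
            step *= 1.2             # grow the step after a success
        else:
            step *= 0.8             # shrink it after a failure
            momentum *= 0.5
    return x, best

known_maxima, results = [], []
for _ in range(5):                  # a few restarts, each started far from known maxima
    x0 = restart_point(known_maxima)
    x_best, val = climb(x0)
    known_maxima.append(x_best)
    results.append(val)

print("best value found:", round(max(results), 4))
```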

Relevance: 10.00%

Abstract:

This paper presents an adaptive learning model for market-making under the reinforcement learning framework. Reinforcement learning is a learning technique in which agents aim to maximize the long-term accumulated rewards. No knowledge of the market environment, such as the order arrival or price process, is assumed. Instead, the agent learns from real-time market experience and develops explicit market-making strategies, achieving multiple objectives including the maximization of profits and the minimization of the bid-ask spread. The simulation results show initial success in bringing learning techniques to the building of market-making algorithms.
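
A minimal flavour of the approach, an agent that learns a quoting policy purely from simulated market feedback, is sketched below with tabular Q-learning on a deliberately crude toy market. The state (an inventory bucket), the actions (candidate half-spreads), and the reward are invented for illustration and are far simpler than the market-making model in the paper.

```python
import numpy as np

rng = np.random.default_rng(5)

spreads = [0.01, 0.05, 0.10]          # actions: candidate half-spreads (illustrative)
n_states = 5                          # states: inventory bucketed into 5 levels
q = np.zeros((n_states, len(spreads)))
alpha, gamma, eps = 0.1, 0.95, 0.1    # learning rate, discount, exploration rate

def step(inv_bucket, spread):
    """Toy market response: tighter quotes fill more often but earn less per trade."""
    fill_prob = 0.9 - 4.0 * spread
    filled = rng.random() < fill_prob
    side = rng.choice([-1, 1])                        # buy or sell fill
    inventory = inv_bucket - (n_states // 2)          # signed inventory
    reward = (spread if filled else 0.0) - 0.002 * inventory ** 2  # P&L minus inventory penalty
    new_inv = np.clip(inv_bucket + (side if filled else 0), 0, n_states - 1)
    return new_inv, reward

state = n_states // 2
for t in range(50_000):
    action = rng.integers(len(spreads)) if rng.random() < eps else int(np.argmax(q[state]))
    new_state, reward = step(state, spreads[action])
    # Standard Q-learning update toward the bootstrapped long-term return.
    q[state, action] += alpha * (reward + gamma * q[new_state].max() - q[state, action])
    state = new_state

print("learned half-spread per inventory bucket:",
      [spreads[int(a)] for a in q.argmax(axis=1)])
```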

Relevance: 10.00%

Abstract:

In the first part of this paper we show a similarity between the principle of Structural Risk Minimization (SRM) (Vapnik, 1982) and the idea of Sparse Approximation, as defined by Chen, Donoho and Saunders (1995) and Olshausen and Field (1996). We then focus on two specific (approximate) implementations of SRM and Sparse Approximation which have been used to solve the problem of function approximation. For SRM we consider the Support Vector Machine technique proposed by V. Vapnik and his team at AT&T Bell Labs, and for Sparse Approximation we consider a modification of the Basis Pursuit De-Noising algorithm proposed by Chen, Donoho and Saunders (1995). We show that, under certain conditions, these two techniques are equivalent: they give the same solution and they require the solution of the same quadratic programming problem.
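
The parallel drawn above can be probed numerically: an epsilon-insensitive SVM regressor (an approximate SRM implementation) and a Lasso fit (the convex program used in Basis Pursuit De-Noising) both produce sparse solutions on a 1-D function-approximation task. The sketch below uses scikit-learn on synthetic data; it only illustrates the sparsity analogy, not the formal equivalence conditions derived in the paper.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.linear_model import Lasso
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(6)

# Noisy samples of a smooth 1-D target function.
x = np.linspace(-3, 3, 120).reshape(-1, 1)
y = np.sinc(x).ravel() + rng.normal(scale=0.05, size=x.shape[0])

# SVM regression with an epsilon-insensitive loss: sparsity appears as few support vectors.
svr = SVR(kernel="rbf", gamma=1.0, C=10.0, epsilon=0.05).fit(x, y)

# Basis Pursuit De-Noising-style fit: Lasso over a dictionary of Gaussian bumps centred
# at the data points; sparsity appears as few non-zero coefficients.
dictionary = rbf_kernel(x, x, gamma=1.0)
lasso = Lasso(alpha=0.005, max_iter=50_000).fit(dictionary, y)

print("SVR: support vectors used:", len(svr.support_), "of", len(x))
print("Lasso: non-zero coefficients:", int(np.sum(lasso.coef_ != 0)), "of", len(x))
```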

Relevance: 10.00%

Abstract:

The Support Vector Machine (SVM) is a new and very promising classification technique developed by Vapnik and his group at AT&T Bell Labs. This new learning algorithm can be seen as an alternative training technique for Polynomial, Radial Basis Function and Multi-Layer Perceptron classifiers. An interesting property of this approach is that it is an approximate implementation of the Structural Risk Minimization (SRM) induction principle. The derivation of Support Vector Machines, their relationship with SRM, and their geometrical insight are discussed in this paper. Training an SVM is equivalent to solving a quadratic programming problem with linear and box constraints in a number of variables equal to the number of data points. When the number of data points exceeds a few thousand, the problem becomes very challenging because the quadratic form is completely dense, so the memory needed to store the problem grows with the square of the number of data points. Training problems arising in some real applications with large data sets are therefore impossible to load into memory and cannot be solved using standard non-linear constrained optimization algorithms. We present a decomposition algorithm that can be used to train SVMs over large data sets. The main idea behind the decomposition is the iterative solution of sub-problems and the evaluation of optimality conditions, which are used both to improve the current iterate and to establish stopping criteria for the algorithm. We present previous approaches, as well as results and important details of our implementation of the algorithm, which uses a second-order variant of the Reduced Gradient Method as the solver of the sub-problems. As an application of SVMs, we present preliminary results obtained by applying SVMs to the problem of detecting frontal human faces in real images.
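
To make concrete the claim that training an SVM amounts to a quadratic program with box and linear constraints in as many variables as data points, here is a minimal dense solve of the dual for a tiny linearly separable 2-D set using a general-purpose constrained optimizer. It is the naive formulation whose memory cost motivates decomposition, not the decomposition algorithm itself, and the data are synthetic.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)

# Tiny 2-D, linearly separable training set.
n = 40
X = np.vstack([rng.normal([-2, -2], 0.6, size=(n // 2, 2)),
               rng.normal([ 2,  2], 0.6, size=(n // 2, 2))])
y = np.array([-1.0] * (n // 2) + [1.0] * (n // 2))

C = 10.0
Q = (y[:, None] * y[None, :]) * (X @ X.T)   # dense kernel matrix: memory grows as n^2

# Dual objective (to minimize): 0.5 a^T Q a - sum(a), with 0 <= a_i <= C, sum(a_i y_i) = 0.
def dual(a):
    return 0.5 * a @ Q @ a - a.sum()

res = minimize(
    dual,
    x0=np.zeros(n),
    jac=lambda a: Q @ a - 1.0,
    bounds=[(0.0, C)] * n,
    constraints={"type": "eq", "fun": lambda a: a @ y, "jac": lambda a: y},
    method="SLSQP",
)

a = res.x
w = (a * y) @ X                              # primal weight vector
sv = (a > 1e-6) & (a < C - 1e-6)             # margin support vectors
b = np.mean(y[sv] - X[sv] @ w)               # bias from the margin support vectors
print("support vectors:", int((a > 1e-6).sum()), "of", n)
print("training accuracy:", np.mean(np.sign(X @ w + b) == y))
```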

Relevance: 10.00%

Abstract:

Application of a DAOM (Diagnosi Ambiental d'Oportunitats de Minimització, an environmental diagnosis of minimization opportunities) at the Banyoles town council (Ajuntament de Banyoles). A DAOM is a tool developed by the Centre per a l'Empresa i el Medi Ambient that consists of evaluating an activity or process in order to identify possible opportunities for preventing and reducing pollution at the source and to propose technically and economically viable courses of action.

Relevance: 10.00%

Abstract:

One of the most effective techniques for offering QoS routing is minimum interference routing. However, it is complex in terms of computation time and is not oriented toward improving the network protection level. In order to include better levels of protection, new minimum interference routing algorithms are necessary. Minimizing the failure recovery time is also a complex process involving different failure recovery phases. Some of these phases depend completely on correct routing selection, such as minimizing the failure notification time. The level of protection also involves other aspects, such as the amount of resources used; in this case, shared backup techniques should be considered. Therefore, minimum interference techniques should also be modified to include the sharing of protection resources among their objectives. These aspects are reviewed and analyzed in this article, and a new proposal combining minimum interference with fast protection using shared segment backups is introduced. Results show that the proposed method improves both the minimization of the request rejection ratio and the percentage of bandwidth allocated to backup paths in networks with low and medium protection requirements.
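
The core idea of minimum interference routing, steering a new request away from links that are critical for other ingress-egress pairs, can be sketched with a simple link-weighting scheme on a small graph using networkx. The crude criticality measure below (counting how many other pairs' shortest paths use each link) is a simplified, MIRA-flavoured illustration, not the algorithm proposed in the article.

```python
import networkx as nx

# Small example topology with residual capacities (illustrative).
g = nx.Graph()
edges = [("a", "b", 10), ("b", "c", 10), ("a", "d", 10), ("d", "c", 10),
         ("b", "d", 5), ("c", "e", 10), ("d", "e", 10)]
for u, v, cap in edges:
    g.add_edge(u, v, capacity=cap, interference=0)

# Other ingress-egress pairs whose future traffic we do not want to interfere with.
other_pairs = [("a", "c"), ("b", "e")]

# Crude criticality measure: count how many other pairs' shortest paths use each link.
for s, t in other_pairs:
    path = nx.shortest_path(g, s, t)
    for u, v in zip(path, path[1:]):
        g[u][v]["interference"] += 1

# Route the new request on the path minimizing total interference (tie-broken by hop count).
def weight(u, v, data):
    return 1 + 10 * data["interference"]

new_request = ("a", "e")
route = nx.shortest_path(g, *new_request, weight=weight)
print("selected path for", new_request, ":", route)
```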

Relevance: 10.00%

Abstract:

In this paper, a method for enhancing current QoS routing methods by means of QoS protection is presented. In an MPLS network, the segments (links) to be protected are predefined and an LSP request involves, apart from establishing a working path, creating a specific type of backup path (local, reverse or global). Different QoS parameters, such as network load balancing, resource optimization and minimization of LSP request rejection, should be considered. QoS protection is defined as a function of QoS parameters such as packet loss, restoration time, and resource optimization. A framework to add QoS protection to many of the current QoS routing algorithms is introduced. A backup decision module to select the most suitable protection method is formulated, and different case studies are analyzed.

Relevance: 10.00%

Abstract:

This thesis presents the Kou model, a double exponential jump-diffusion, for valuing European call options on oil prices as the underlying asset. The numerical computations behind the formulation of analytical expressions are presented; these expressions are solved by implementing efficient numerical algorithms that yield the theoretical prices of the options under evaluation. The advantages of using methods such as the Fourier transform are then discussed, given the relative simplicity of its programming compared with the development of other numerical techniques. This method is used together with a non-parametric regularized calibration exercise which, by minimizing the squared errors subject to a penalty based on the concept of relative entropy, yields prices for call options on oil, with the model showing an improved ability to assign fair prices relative to those traded in the market.
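
As a rough, self-contained illustration of pricing a European call under Kou's double-exponential jump-diffusion, the sketch below uses plain Monte Carlo simulation rather than the Fourier-transform method discussed in the thesis; all parameter values are placeholders, not calibrated oil-market parameters.

```python
import numpy as np

rng = np.random.default_rng(8)

# Illustrative (not calibrated) parameters of Kou's double-exponential jump-diffusion.
S0, K, r, sigma, T = 100.0, 100.0, 0.03, 0.25, 1.0
lam = 1.0                  # jump intensity
p = 0.4                    # probability of an upward jump
eta1, eta2 = 10.0, 5.0     # rates of the upward/downward exponential log-jump sizes (eta1 > 1)

# Risk-neutral drift correction: kappa = E[e^V - 1] for the double-exponential log-jump V.
kappa = p * eta1 / (eta1 - 1) + (1 - p) * eta2 / (eta2 + 1) - 1

n_paths = 100_000
z = rng.standard_normal(n_paths)
n_jumps = rng.poisson(lam * T, n_paths)

# Sum of double-exponential log-jumps per path.
jump_sum = np.zeros(n_paths)
for i in np.nonzero(n_jumps)[0]:
    up = rng.random(n_jumps[i]) < p
    sizes = np.where(up,
                     rng.exponential(1 / eta1, n_jumps[i]),
                     -rng.exponential(1 / eta2, n_jumps[i]))
    jump_sum[i] = sizes.sum()

ST = S0 * np.exp((r - 0.5 * sigma**2 - lam * kappa) * T + sigma * np.sqrt(T) * z + jump_sum)
call_price = np.exp(-r * T) * np.maximum(ST - K, 0.0).mean()
print(f"Monte Carlo price of the European call: {call_price:.3f}")
```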

Relevance: 10.00%

Abstract:

In image processing, segmentation algorithms constitute one of the main focuses of research. In this paper, new image segmentation algorithms based on a hard version of the information bottleneck method are presented. The objective of this method is to extract a compact representation of a variable, considered as the input, with minimal loss of mutual information with respect to another variable, considered as the output. First, we introduce a split-and-merge algorithm based on the definition of an information channel between a set of regions (input) of the image and the intensity histogram bins (output). From this channel, the maximization of the mutual information gain is used to optimize the image partitioning. Then, the merging process of the regions obtained in the previous phase is carried out by minimizing the loss of mutual information. From the inversion of the above channel, we also present a new histogram clustering algorithm based on the minimization of the mutual information loss, where now the input variable represents the histogram bins and the output is given by the set of regions obtained from the above split-and-merge algorithm. Finally, we introduce two new clustering algorithms which show how the information bottleneck method can be applied to the registration channel obtained when two multimodal images are correctly aligned. Different experiments on 2-D and 3-D images show the behavior of the proposed algorithms.
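
The merging phase described above, choosing at each step the pair of regions whose fusion loses the least mutual information between the region variable and the intensity-histogram variable, can be sketched directly from a joint count table. The example below runs on a tiny synthetic image with an arbitrary initial partition; it illustrates the information-channel bookkeeping only, not the full split-and-merge or registration algorithms of the paper.

```python
import numpy as np

rng = np.random.default_rng(9)

def mutual_information(joint):
    """Mutual information (bits) of a joint count table."""
    p = joint / joint.sum()
    px, py = p.sum(axis=1, keepdims=True), p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(p[nz] / (px @ py)[nz])))

# Tiny synthetic image: two bright quadrants, two dark ones, plus noise.
img = np.zeros((64, 64))
img[:32, :32] = img[32:, 32:] = 0.8
img = np.clip(img + rng.normal(scale=0.05, size=img.shape), 0, 1)

# Arbitrary initial partition into a 4x4 grid of regions (the "input" variable);
# quantized intensities serve as the histogram bins (the "output" variable).
regions = (np.arange(64) // 16)[:, None] * 4 + (np.arange(64) // 16)[None, :]
bins = np.digitize(img, np.linspace(0, 1, 9)[1:-1])
joint = np.zeros((16, 8))
np.add.at(joint, (regions.ravel(), bins.ravel()), 1)

# Greedy merging: repeatedly fuse the pair of regions with minimal mutual-information loss.
while joint.shape[0] > 2:
    base = mutual_information(joint)
    best = None
    for i in range(joint.shape[0]):
        for j in range(i + 1, joint.shape[0]):
            merged = np.delete(joint, j, axis=0)
            merged[i] = joint[i] + joint[j]
            loss = base - mutual_information(merged)
            if best is None or loss < best[0]:
                best = (loss, merged)
    joint = best[1]

print("final joint table (2 merged regions x 8 intensity bins):")
print(joint.astype(int))
```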