466 results for maximization


Relevance: 10.00%

Abstract:

BACKGROUND: The 2013 AHA/ACC guidelines on the treatment of cholesterol advised tailoring high-intensity statin therapy after ACS, whereas the previous ATP-III guidelines recommended titrating statins to reach low-density lipoprotein cholesterol (LDL-C) targets. We simulated the impact of this change of paradigm on the achievement of recommended targets. METHODS: In a prospective cohort study of consecutive patients hospitalized for ACS from 2009 to 2012 at four Swiss university hospitals, we analyzed 1602 patients who survived one year after recruitment. Targets under the previous guidelines approach were defined as (1) achievement of an LDL-C target < 1.8 mmol/l, (2) reduction of LDL-C ≥ 50%, or (3) intensification of statin therapy in patients who did not reach LDL-C targets. Targets under the 2013 AHA/ACC guidelines approach were defined as the maximization of statin therapy: high-intensity statin in patients aged ≤75 years and moderate- or high-intensity statin in patients >75 years. RESULTS: 1578 (99%) patients were prescribed a statin at discharge, 1120 (70%) at high intensity. 1507 patients (94%) reported taking a statin at one year, 909 (57%) at high intensity. Among 482 patients discharged on sub-maximal statin therapy, intensification was observed in only 109 patients (23%). 773 (47%) patients reached the previous LDL-C targets, while 1014 (63%) reached the 2013 AHA/ACC guidelines targets one year after ACS (p value < 0.001). CONCLUSION: The application of the new 2013 AHA/ACC guidelines criteria would substantially increase the proportion of patients achieving recommended lipid targets one year after ACS. Clinical trial number: NCT01075868.
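The two target definitions above are simple decision rules; as a minimal sketch, they can be encoded as predicates. This is a hypothetical illustration assuming per-patient LDL-C values and statin intensity are available; the function names and inputs are not from the study:

```python
def meets_previous_targets(ldl_1y, ldl_baseline, statin_intensified):
    """Previous (ATP-III-style) criteria as stated above: LDL-C < 1.8 mmol/l,
    or >= 50% LDL-C reduction, or statin intensification when targets
    were not reached (all values assumed available; illustrative only)."""
    return (ldl_1y < 1.8
            or ldl_1y <= 0.5 * ldl_baseline   # >= 50% reduction
            or statin_intensified)

def meets_2013_ahaacc_target(age, intensity):
    """2013 AHA/ACC criterion as stated above: high-intensity statin if
    age <= 75, moderate- or high-intensity statin otherwise."""
    if age <= 75:
        return intensity == "high"
    return intensity in ("moderate", "high")
```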

Relevance: 10.00%

Abstract:

This work deals with parallel optimization of expensive objective functions which are modelled as sample realizations of Gaussian processes. The study is formalized as a Bayesian optimization problem, or continuous multi-armed bandit problem, where a batch of q > 0 arms is pulled in parallel at each iteration. Several algorithms have been developed for choosing batches by trading off exploitation and exploration. As of today, the maximum Expected Improvement (EI) and Upper Confidence Bound (UCB) selection rules appear to be the most prominent approaches to batch selection. Here, we build upon recent work on the multipoint Expected Improvement criterion, for which an analytic expansion relying on Tallis' formula was recently established. Since the computational burden of this selection rule is still an issue in applications, we derive a closed-form expression for the gradient of the multipoint Expected Improvement, which facilitates its maximization using gradient-based ascent algorithms. Substantial computational savings are shown in application. In addition, our algorithms are tested numerically and compared to state-of-the-art UCB-based batch-sequential algorithms. Combining starting designs based on UCB with gradient-based EI local optimization finally appears to be a sound option for batch design in distributed Gaussian process optimization.
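For intuition, the classical q = 1 case already shows the structure: EI under a Gaussian process posterior has a closed form, and so does its gradient, which can be handed to a gradient-based optimizer. Below is a minimal sketch of that single-point case (the paper's multipoint expressions via Tallis' formula are not reproduced); the posterior mean/std and their gradients at a point are assumed to be given as inputs:

```python
import numpy as np
from scipy.stats import norm

def ei_and_grad(mu, sigma, dmu, dsigma, f_min):
    """Expected Improvement (minimization convention) and its analytic
    gradient for q = 1.

    mu, sigma   : GP posterior mean and standard deviation at a point x
    dmu, dsigma : their gradients w.r.t. x, arrays of shape (d,)
    f_min       : best objective value observed so far
    """
    z = (f_min - mu) / sigma
    ei = (f_min - mu) * norm.cdf(z) + sigma * norm.pdf(z)
    # dEI/dx = -Phi(z) * dmu/dx + phi(z) * dsigma/dx
    grad = -norm.cdf(z) * dmu + norm.pdf(z) * dsigma
    return ei, grad
```

In practice one would wrap this to return (-EI, -gradient) and pass it to scipy.optimize.minimize with jac=True and method='L-BFGS-B', which performs the gradient-based ascent the abstract refers to.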

Relevance: 10.00%

Abstract:

Currently there is no general method to study the impact of population admixture within families on the assumptions of random mating and, consequently, on Hardy-Weinberg equilibrium (HWE) and linkage equilibrium (LE), or on the inference obtained from traditional linkage analysis. First, through simulation, the effect of admixture of two populations on the log of the odds (LOD) score was assessed, using prostate cancer as the typical disease model. Comparisons between simulated mixed and homogeneous families were performed. LOD scores under both models of admixture (within families and within a data set of homogeneous families) were closest to the scores of homogeneous families from the population with the highest mixing proportion. Random sampling of families or ascertainment of families by disease affection status did not affect this observation, nor did the mode of inheritance (dominant/recessive) or the sample size. Second, after establishing the effect of admixture on the LOD score and on inference for linkage, the presence of disequilibria induced by population admixture within families was studied and an adjustment procedure was developed. The adjustment did not force all disequilibria to disappear, but because the families were adjusted for the population admixture, the replicates where disequilibria exist are no longer affected by them when the likelihood is maximized for linkage. Furthermore, the adjustment was able to exclude uninformative families, or families with such a high departure from HWE and/or LE that their LOD scores were not reliable. Together these observations imply that the presence of families of mixed population ancestry impacts linkage analysis, in terms of both the LOD score and the estimate of the recombination fraction.
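As background for the LOD-score comparisons, the two-point LOD score contrasts the likelihood at a candidate recombination fraction θ with the no-linkage value θ = 0.5. A toy sketch for phase-known meioses (a textbook simplification, not the full pedigree likelihoods simulated in this study):

```python
import numpy as np

def lod_score(theta, n_recombinant, n_total):
    """Two-point LOD score for phase-known meioses:
    LOD(theta) = log10[ theta^r * (1-theta)^(n-r) / 0.5^n ]
    """
    r, n = n_recombinant, n_total
    return (r * np.log10(theta)
            + (n - r) * np.log10(1.0 - theta)
            - n * np.log10(0.5))

# The MLE of the recombination fraction is r/n;
# e.g. 2 recombinants among 20 meioses:
print(lod_score(2 / 20, 2, 20))   # ~3.2, above the classical threshold of 3
```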

Relevance: 10.00%

Abstract:

Prevalent sampling is an efficient and focused approach to studying the natural history of disease. Right-censored time-to-event data observed in prospective prevalent cohort studies are often subject to left-truncated sampling. Left-truncated samples are not randomly selected from the population of interest and carry a selection bias. Extensive studies have focused on estimating the unbiased distribution from left-truncated samples. In many applications, however, the exact date of disease onset is not observed. For example, in an HIV infection study, the exact HIV infection time is not observable, but the infection date is known to lie between two observable dates. Meeting these challenges motivated our study. We propose parametric models to estimate the unbiased distribution of left-truncated, right-censored time-to-event data with uncertain onset times. We first consider data from length-biased sampling, a special case of left-truncated sampling, and then extend the proposed method to general left-truncated sampling. With a parametric model, we construct the full likelihood given a biased sample with unobservable onset of disease. The parameters are estimated by maximizing the constructed likelihood, adjusting for the selection bias and the unobserved exact onset. Simulations are conducted to evaluate the finite-sample performance of the proposed methods. We apply the proposed method to an HIV infection study, estimating the unbiased survival function and covariate coefficients.
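The conditional-likelihood construction for left truncation can be sketched in its simplest, fully observed form: given entry (truncation) time a and follow-up to time t, each subject contributes f(t)/S(a) if an event occurred and S(t)/S(a) if right-censored. A minimal Weibull sketch under these assumptions (the paper's additional correction for interval-censored onset is omitted here):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

def neg_loglik(params, a, t, delta):
    """Left-truncated, right-censored Weibull negative log-likelihood.

    a     : left-truncation (entry) times
    t     : event or censoring times, with t >= a
    delta : 1 if event observed, 0 if right-censored
    """
    shape, scale = np.exp(params)   # log-parameterization keeps both > 0
    logf = weibull_min.logpdf(t, shape, scale=scale)
    logS_t = weibull_min.logsf(t, shape, scale=scale)
    logS_a = weibull_min.logsf(a, shape, scale=scale)
    # Conditional on surviving past entry: f(t)/S(a) or S(t)/S(a)
    ll = np.where(delta == 1, logf, logS_t) - logS_a
    return -ll.sum()

# res = minimize(neg_loglik, x0=np.zeros(2), args=(a, t, delta))
```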

Relevance: 10.00%

Abstract:

This article analyzes the invisible-hand passage in An Inquiry into the Nature and Causes of the Wealth of Nations. It shows that the outcome expected from the action of the invisible hand, the maximization of the total annual product, is at times an obstacle to the defence of nations. To fully grasp this point, we first examine some questions related to the invisible-hand passage.

Relevance: 10.00%

Abstract:

The aim of this research is to determine whether a forensic analysis of the traffic forecast studies for tolled highways can bring to light the reasons behind their lack of accuracy. The methodology is empirical and is based on an ex-post-facto comparison of the actual traffic figures with those forecast for the tolled highways Radial 3 and Radial 5. The main features of tolled highways are first presented in an introductory chapter. A broad bibliographic review follows, from both a global and a Spanish perspective; from it, a list of the main causes behind the systematic inaccuracy is drawn up, together with measures found to improve future traffic forecasting exercises. The core of the research focuses on the ratios of actual to forecast traffic for the tolled highways Radial 3 and Radial 5 on the outskirts of Madrid. The methodology and the inputs and hypotheses used in the traffic studies are analysed from a critical perspective. In a further step, the trip assignment stage is scrutinised to quantify the influence of the inputs, on the one hand, and of the assignment model or diversion curve itself, on the other, on the accuracy of the traffic studies; this exercise is based on the year 2006. Finally, the assignment model used for allocating traffic between alternative routes in urban settings is criticised: its underlying assumptions, the rational-agent and expected-utility-maximization theories, are questioned in the light of the theory of decision under risk proposed by Kahneman and Tversky (prospect theory).
To overcome these limitations of the assignment model, a new semi-empirical diversion curve is proposed that links the proportion of traffic using the tolled highway to the average speed on the toll-free alternative highway.
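The general shape of such a diversion curve can be illustrated with a simple logistic sketch: the share captured by the toll road falls as the free alternative flows faster. The functional form and parameters below are hypothetical placeholders, not the curve fitted in this thesis:

```python
import numpy as np

def toll_share(v_free_kmh, v0=70.0, k=0.12):
    """Illustrative diversion curve (hypothetical parameters).

    Returns the proportion of traffic choosing the tolled highway as a
    logistic function of the average speed on the toll-free alternative.
    v0 : speed (km/h) at which the market would split 50/50 (assumed)
    k  : sensitivity of the split to speed (assumed)
    """
    return 1.0 / (1.0 + np.exp(k * (np.asarray(v_free_kmh) - v0)))

# Congested free road (40 km/h) vs free-flowing (100 km/h):
print(toll_share([40.0, 100.0]))   # high toll share when congested, low otherwise
```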

Relevance: 10.00%

Abstract:

We have developed a new projector model specifically tailored for fast list-mode tomographic reconstruction in positron emission tomography (PET) scanners with parallel planar detectors. The model provides an accurate estimate of the probability distribution of coincidence events defined by pairs of scintillating crystals. This distribution is parameterized with 2D elliptical Gaussian functions defined in planes perpendicular to the main axis of the tube of response (TOR). The parameters of these Gaussian functions were obtained by fitting Monte Carlo simulations that include positron range, acolinearity of the gamma rays, and detector attenuation and scatter effects. The proposed model has been applied efficiently in list-mode reconstruction algorithms. Evaluation with Monte Carlo simulations of a rotating high-resolution PET scanner indicates that this model yields a better recovery-to-noise ratio in OSEM (ordered-subsets, expectation-maximization) reconstruction than list-mode reconstruction with a symmetric circular Gaussian TOR model, or histogram-based OSEM with a precalculated system matrix using Monte Carlo simulated models and symmetries.
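For context, the EM update at the core of OSEM has a compact form: each voxel is scaled by the back-projected ratio of measured to estimated counts. A minimal MLEM sketch (one subset; OSEM cycles this update over ordered subsets of lines of response), where the system matrix A would carry the weights of a projector model such as the elliptical Gaussian TOR above:

```python
import numpy as np

def mlem(A, y, n_iter=20):
    """Minimal MLEM reconstruction sketch.

    A : (n_lors, n_voxels) system matrix (projector model weights)
    y : measured counts per line of response (LOR)
    """
    x = np.ones(A.shape[1])           # uniform initial image
    sens = A.sum(axis=0)              # sensitivity image s_j = sum_i a_ij
    sens[sens == 0] = 1.0             # avoid division by zero for dead voxels
    for _ in range(n_iter):
        proj = A @ x                  # forward projection (A x)_i
        proj[proj == 0] = 1e-12       # guard against empty LORs
        x *= (A.T @ (y / proj)) / sens    # multiplicative EM update
    return x
```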

Relevance: 10.00%

Abstract:

In this paper, several computational schemes are presented for the optimal tuning of the global behavior of nonlinear dynamical systems. Specifically, the maximization of the size of domains of attraction associated with invariants in parametrized dynamical systems is addressed. Cell Mapping (CM) techniques are used to estimate the size of the domains, and this size is then maximized via different optimization tools. First, a genetic algorithm is tested, whose performance proves good at finding global maxima, at the expense of a high computational cost. Second, an iterative scheme based on a stochastic approximation procedure (the Kiefer-Wolfowitz algorithm) is evaluated, showing acceptable performance at low cost. Finally, several schemes combining neural-network-based estimation and optimization procedures are addressed, with promising results. The performance of the methods is illustrated with two applications: first on the well-known van der Pol equation with standard parametrization, and second on the tuning of a controller for saturated systems.
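The Kiefer-Wolfowitz iteration itself is compact: a finite-difference slope of noisy objective evaluations drives an ascent with decaying gains. A one-dimensional sketch under the standard gain sequences, where f_noisy stands in for a noisy estimate such as a cell-mapping measurement of basin size (the name is illustrative):

```python
import numpy as np

def kiefer_wolfowitz(f_noisy, theta0, n_iter=200, a=0.5, c=0.5):
    """Kiefer-Wolfowitz stochastic approximation for maximization (1-D).

    Gain sequences a_k = a/k and c_k = c/k**0.25 satisfy the classical
    convergence conditions.
    """
    theta = float(theta0)
    for k in range(1, n_iter + 1):
        a_k = a / k
        c_k = c / k ** 0.25
        slope = (f_noisy(theta + c_k) - f_noisy(theta - c_k)) / (2.0 * c_k)
        theta += a_k * slope          # ascent step on the finite-difference slope
    return theta

# Toy usage: a noisy concave objective with maximum at theta = 2.
rng = np.random.default_rng(0)
f = lambda t: -(t - 2.0) ** 2 + 0.1 * rng.standard_normal()
print(kiefer_wolfowitz(f, theta0=0.0))   # converges near 2.0
```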

Relevance: 10.00%

Abstract:

Most human-designed environments present specific geometrical characteristics: polygonal, rectangular and circular shapes are common, with a series of typical relations between different elements of the environment. Introducing this kind of knowledge into the mapping process of mobile robots can notably improve the quality and accuracy of the resulting maps, and can also make them more suitable for higher-level reasoning applications. When mapping is formulated in a Bayesian probabilistic framework, a complete specification of the problem requires a prior over the environment. This prior over the structure of the environment can be applied in several ways; this dissertation presents two different frameworks, one based on geometric features and another employing a dense representation close to the measurement space. A feature-based approach implicitly imposes a prior on the environment; in this sense, feature-based graph SLAM was a first step towards a new mapping solution for structured scenarios.
In the first framework, the prior is inferred by the system from a wide collection of generic feature-based priors, following an Expectation-Maximization approach to obtain the most probable structure and the most probable map. The representation of the structure of the environment is based on a hierarchical model with different levels of abstraction for the geometrical elements describing it. Various experiments were conducted to show the versatility and the good performance of the proposed method. In the second framework, different structure priors can be defined by the user as sets of local constraints and energies between consecutive points in a range scan of a given environment. The set of constraints applied to each group of points depends on the topology, which is inferred by the system itself; in this way, new generic structure priors can be incorporated easily and flexibly. Several tests were carried out to demonstrate the flexibility and the good results of the proposed approach.
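As a toy illustration of the second framework's local energies, the sketch below scores how well consecutive scan points fit a "straight wall" prior: each interior point is penalized by its distance to the line through its two neighbours. The constraint sets, topology inference, and the actual energy families of the dissertation are not reproduced; this only shows the flavour of a per-point structure energy:

```python
import numpy as np

def local_structure_energy(points, sigma=0.05):
    """Toy collinearity energy for consecutive 2-D range-scan points.

    Low total energy means the scan is locally collinear, matching a
    'straight wall' structure prior; sigma sets the tolerance (metres).
    """
    pts = np.asarray(points, dtype=float)
    energy = 0.0
    for i in range(1, len(pts) - 1):
        a, b, c = pts[i - 1], pts[i], pts[i + 1]
        d = c - a
        n = np.linalg.norm(d)
        if n == 0:
            continue
        # Perpendicular distance from b to the line a-c (2-D cross product)
        dist = abs(d[0] * (b[1] - a[1]) - d[1] * (b[0] - a[0])) / n
        energy += (dist / sigma) ** 2
    return energy
```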

Relevance: 10.00%

Abstract:

The future economic development trajectory of India is likely to result in rapid and accelerated growth in energy demand, with shortages expected. Many of its current policies and strategies are aimed at improving and, where possible, maximizing energy production from the renewable sector. It is also clear that while energy conservation and energy efficiency can make an important contribution to the national energy strategy, renewable energies will be essential to the solution and are likely to play an increasingly important role in growing grid power, providing energy access, reducing consumption of fossil fuels, and helping India pursue its low-carbon progressive pathway. However, most of the states in India, like the northernmost state of Jammu and Kashmir (J&K), have experienced an energy crisis over a sustained period of time. As India intends to be one of the emerging powers of the 21st century, it has to address these pressing issues in a more sustainable manner and accordingly initiate various renewable energy projects within these states. This paper provides a broad-spectrum view of the energy situation in J&K and highlights the current policies along with future strategies for the optimal utilization of renewable energy resources.

Relevance: 10.00%

Abstract:

The future economic growth of India is likely to result in a rapid and accelerated surge in energy demand, with expected shortages in supply. Many of its current policies and strategies are aimed at improving and, where possible, maximizing energy production from the renewable sector. It is also clear that while energy conservation and energy efficiency can make an important contribution, renewable energies will be essential to the solution and are likely to play an increasingly important role in providing enhanced energy access, reducing consumption of fossil fuels, and helping India pursue its low-carbon progressive pathway. However, most of the states in India, like the northernmost state of Jammu and Kashmir, have experienced an energy crisis over a sustained period of time, and the governments at both the central and state level have to address these pressing issues in a more sustainable manner and accordingly initiate various renewable energy projects within these states. This paper provides a broad-spectrum view of the energy situation in Jammu and Kashmir and highlights the current policies along with future strategies for the optimal utilization of renewable energy resources.

Relevance: 10.00%

Abstract:

ATM, SDH or satellite links were used in the last century as broadcasters' contribution networks. However, the attractive price of IP networks has been changing this infrastructure over the last decade. Nowadays, IP networks are widely used, but their characteristics do not offer the level of performance required to carry high-quality video under certain circumstances. Data transmission is always subject to errors on the line. In the case of streaming, correction is attempted at the destination, while in file transfer, retransmissions are conducted and a reliable copy of the file is obtained. In the latter case, reception time is penalized because of the low priority this type of traffic usually has on the networks. While in streaming the image quality is adapted to the line speed and line errors result in a decrease of quality at the destination, in the file copy the difference between coding speed and line speed, together with transmission errors, is reflected in an increase of transmission time. The way news or audiovisual programs are transferred from a remote office to the production centre depends on the time window and the type of line available; in many cases it must be done in real time (streaming), with the resulting image degradation. The main purpose of this work is workflow optimization and image-quality maximization. For that reason, a transmission model for multimedia files adapted to JPEG2000 is described, based on combining the advantages of file transfer with those of streaming while avoiding the disadvantages of each. The method is based on two patents and consists of the reliable transfer of the headers and of the data considered vital for reproduction; the rest of the data is sent by streaming, allowing recovery operations and error concealment. Using this model, image quality is maximized for the available time window. In this paper, we first give a brief overview of broadcasters' requirements and the solutions offered by IP networks. We then focus on a different solution for video file transfer. We take the example of a broadcast centre with mobile units (unidirectional video link) and regional headends (bidirectional link), and we also present a video file transfer method that satisfies the broadcasters' requirements.
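The split the method describes, vital data sent reliably and the remainder streamed, can be sketched generically. The following is a hypothetical illustration, not the patented implementation: real use would parse the JPEG2000 codestream to identify the main header and base quality layer, and the channel callbacks stand in for actual reliable and best-effort transports:

```python
from dataclasses import dataclass

@dataclass
class Jp2kFile:
    header: bytes        # main header and vital markers (must arrive intact)
    base_layer: bytes    # minimum-quality layer (must arrive intact)
    enhancement: bytes   # further quality layers (loss-tolerant)

def transfer(file: Jp2kFile, reliable_send, stream_send):
    """reliable_send/stream_send are channel callbacks supplied by the caller."""
    reliable_send(file.header + file.base_layer)   # retransmitted until intact
    stream_send(file.enhancement)                  # best effort, no retransmit

def decode_at_deadline(vital: bytes, received_enhancement: bytes) -> bytes:
    # JPEG2000 is quality-scalable: truncating the codestream after any
    # complete layer still yields a decodable, lower-quality image, so the
    # receiver concatenates whatever enhancement data arrived by the deadline.
    return vital + received_enhancement
```

The design choice this mirrors is the one stated in the abstract: the guaranteed part fixes a quality floor, while the streamed part converts any remaining time window and line quality into extra image quality rather than extra delay.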