932 results for Convex combination


Relevance: 20.00%

Abstract:

The design of nuclear power plants has to follow a number of regulations aimed at limiting the risks inherent in this type of installation. The goal is to prevent and to limit the consequences of any possible incident that might threaten the public or the environment. To verify that the safety requirements are met, a safety assessment process is followed. Safety analysis is a key component of a safety assessment, which incorporates both probabilistic and deterministic approaches. The deterministic approach attempts to ensure that the various situations, and in particular accidents, that are considered to be plausible have been taken into account, and that the monitoring systems and engineered safety and safeguard systems will be capable of ensuring the safety goals. On the other hand, probabilistic safety analysis tries to demonstrate that the safety requirements are met for potential accidents both within and beyond the design basis, thus identifying vulnerabilities not necessarily accessible through deterministic safety analysis alone. Probabilistic safety assessment (PSA) methodology is widely used in the nuclear industry and is especially effective in the comprehensive assessment of the measures needed to prevent accidents with small probability but severe consequences. Still, the trend towards risk-informed regulation (RIR) has demanded a more extended use of risk assessment techniques, with a significant need to further extend PSA's scope and quality. Here is where the theory of stimulated dynamics (TSD) intervenes, as it is the mathematical foundation of the integrated safety assessment (ISA) methodology developed by the CSN (Consejo de Seguridad Nuclear) branch of Modelling and Simulation (MOSI). This methodology attempts to extend classical PSA by including accident dynamic analysis, an assessment of the damage associated with the transients, and a computation of the damage frequency.
The application of this ISA methodology requires a computational framework called SCAIS (Simulation Code System for Integrated Safety Assessment). SCAIS provides accident dynamic analysis support through simulation of nuclear accident sequences and operating procedures. Furthermore, it includes probabilistic quantification of fault trees and sequences, and integration and statistical treatment of risk metrics. SCAIS relies on intensive use of code coupling techniques to join typical thermal-hydraulic analysis, severe accident and probability calculation codes. The integration of accident simulation in the risk assessment process, which requires the use of complex nuclear plant models, is what makes it so powerful, yet at the cost of an enormous increase in complexity. As the complexity of the process is primarily concentrated in the accident simulation codes, the question arises of whether it is possible to reduce the number of required simulations, which is the focus of the present work. This document presents the work done on the investigation of more efficient techniques applied to the risk assessment process within the mentioned ISA methodology. These techniques therefore have the primary goal of decreasing the number of simulations needed for an adequate estimation of the damage probability. As the methodology and tools are relatively recent, little work has been done along this line of investigation, making it a difficult but necessary task, and because of time limitations the scope of the work had to be reduced. Therefore, some assumptions were made in order to work in simplified scenarios best suited for an initial approximation to the problem. The following section explains in detail the process followed to design and test the developed techniques. The next section then introduces the general concepts and formulae of the TSD theory, which are at the core of the risk assessment process.
Afterwards, a description of the simulation framework requirements and design is given, followed by an introduction to the developed techniques, with full detail of their mathematical background and procedures. Later, the test case used is described and the results from the application of the techniques are shown. Finally, the conclusions are presented and future lines of work are outlined.
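As an illustration of why the number of simulations matters, a crude Monte Carlo estimate of the damage probability needs many runs of the accident simulator to converge. The sketch below is a toy stand-in, not the ISA/TSD machinery: simulate_transient and the sampled actuation time are hypothetical placeholders for an expensive coupled-code simulation.

```python
import random

def estimate_damage_probability(simulate_transient, n_samples, seed=0):
    """Crude Monte Carlo estimate of the damage probability.

    `simulate_transient` stands in for an expensive accident-sequence
    simulation: it takes a randomly sampled parameter (here, a
    normalized operator actuation time) and returns True if the
    transient ends in damage.
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        actuation_time = rng.uniform(0.0, 1.0)  # sampled stimulus time
        if simulate_transient(actuation_time):
            hits += 1
    return hits / n_samples

# Toy stand-in: damage occurs when actuation comes later than a threshold.
toy_model = lambda t: t > 0.7
p = estimate_damage_probability(toy_model, n_samples=10_000)
```

Every sample is one full simulator run, so halving the samples needed for a given accuracy halves the dominant cost; that is the gap the developed techniques aim at.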

Abstract:

Background and aims: The high metal bioavailability and the poor conditions of mine soils yield a low plant biomass, limiting the application of phytoremediation techniques. A greenhouse experiment was performed to evaluate the effects of organic amendments on metal stabilization and the potential of Brassica juncea L. for phytostabilization in mine soils. Methods: Plants were grown in pots filled with soils collected from two mine sites located in Central Spain, mixed with 0, 30 and 60 t ha⁻¹ of pine bark compost and horse- and sheep-manure compost. Plant biomass and metal concentrations in roots and shoots were measured. Metal bioavailability was assessed using a rhizosphere-based method (rhizo), which consists of a mixture of low-molecular-weight organic acids to simulate root exudates. Results: Manure reduced metal concentrations in shoots (10–50 % reduction of Cu and 40–80 % of Zn in comparison with non-amended soils), bioconcentration factor (10–50 % of Cu and 40–80 % of Zn) and metal bioavailability in soil (40–50 % of Cu and 10–30 % of Zn) due to the high pH and the contribution of organic matter. Manure improved soil fertility and was also able to increase plant biomass (5–20 times in shoots and 3–30 times in roots), which resulted in a greater amount of metals removed from soil and accumulated in roots (an increase of 2–7 times for Cu and Zn). Plants grown in pine bark treatments and in non-amended soils showed a limited biomass and high metal concentrations in shoots. Conclusions: The addition of manure could be effective for the stabilization of metals and for enhancing the phytostabilization ability of B. juncea in mine soils. In this study, this species proved to be a potential candidate for phytostabilization in combination with manure, differing from previous results, in which B. juncea had been recognized as a phytoextraction plant.

Abstract:

We propose a new method to automatically refine a facial disparity map obtained with standard cameras under conventional illumination conditions, using a smart combination of traditional computer vision and 3D graphics techniques. Our system takes as input two stereo images acquired with standard (calibrated) cameras and uses dense disparity estimation strategies to obtain a coarse initial disparity map, and SIFT to detect and match several feature points in the subject's face. We then use these points as anchors to modify the disparity in the facial area by building a Delaunay triangulation of their convex hull and interpolating their disparity values inside each triangle. We thus obtain a refined disparity map providing a much more accurate representation of the subject's facial features. This refined facial disparity map may be easily transformed, through the camera calibration parameters, into a depth map to be used, also automatically, to improve the facial mesh of a 3D avatar to match the subject's real facial features.
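The anchor-based refinement step can be illustrated with the barycentric interpolation used inside each triangle. The sketch below is a minimal single-triangle version with made-up coordinates and disparity values; the full pipeline would apply this across the whole Delaunay triangulation of the matched SIFT points.

```python
import numpy as np

def barycentric_disparity(p, tri, d):
    """Interpolate disparity at pixel `p` inside triangle `tri`.

    `tri` is a (3, 2) array of anchor pixel coordinates (e.g. matched
    SIFT keypoints) and `d` their three disparity values; inside the
    triangle the disparity is their barycentric (convex) combination.
    """
    a, b, c = tri
    # Solve p = a + u*(b - a) + v*(c - a) for (u, v).
    m = np.column_stack((b - a, c - a))
    u, v = np.linalg.solve(m, p - a)
    w = np.array([1.0 - u - v, u, v])  # barycentric weights, sum to 1
    return float(w @ d)

# Toy triangle and anchor disparities (hypothetical values).
tri = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
d = np.array([2.0, 4.0, 6.0])
val = barycentric_disparity(np.array([5.0, 5.0]), tri, d)
```

Because the weights are non-negative and sum to one inside the triangle, the interpolated disparity never over- or undershoots the anchor values.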

Abstract:

We show the existence of sets with n points (n ≥ 4) for which every convex decomposition contains more than (35/32)n − 3/2 polygons, which refutes the conjecture that for every set of n points there is a convex decomposition with at most n + C polygons. For sets having exactly three extreme points we show that more than n + √(2(n − 3)) − 4 polygons may be necessary to form a convex decomposition.

Abstract:

This is an account of some aspects of the geometry of Kähler affine metrics based on considering them as smooth metric measure spaces and applying the comparison geometry of Bakry-Émery Ricci tensors. Such techniques yield a version for Kähler affine metrics of Yau's Schwarz lemma for volume forms. By a theorem of Cheng and Yau, there is a canonical Kähler affine Einstein metric on a proper convex domain, and the Schwarz lemma gives a direct proof of its uniqueness up to homothety. The potential for this metric is a function canonically associated to the cone, characterized by the property that its level sets are hyperbolic affine spheres foliating the cone. It is shown that for an n-dimensional cone, a rescaling of the canonical potential is an n-normal barrier function in the sense of interior point methods for conic programming. It is explained also how to construct from the canonical potential Monge-Ampère metrics of both Riemannian and Lorentzian signatures, and a mean curvature zero conical Lagrangian submanifold of the flat para-Kähler space.

Abstract:

This work proposes an automatic methodology for modeling complex systems. Our methodology is based on the combination of Grammatical Evolution and classical regression to obtain an optimal set of features that form a linear, convex model. This technique provides both Feature Engineering and Symbolic Regression in order to infer accurate models with no effort or designer expertise required. As advanced Cloud services become mainstream, the contribution of data centers to the overall power consumption of modern cities is growing dramatically. These facilities consume from 10 to 100 times more power per square foot than typical office buildings. Modeling the power consumption of these infrastructures is crucial to anticipate the effects of aggressive optimization policies, but accurate and fast power modeling is a complex challenge for high-end servers not yet satisfied by analytical approaches. For this case study, our methodology minimizes error in power prediction. This work has been tested using real Cloud applications, resulting in an average error in power estimation of 3.98%. Our work improves the prospects of deriving energy-efficient policies in Cloud data centers and is applicable to other computing environments with similar characteristics.
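The idea of a model that stays linear (and hence convex) in its coefficients while the features themselves are nonlinear can be sketched as follows. This is a toy stand-in, not the Grammatical Evolution engine: the candidate features are hand-written here, where a grammar would generate them, and the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "server telemetry" standing in for real utilization metrics.
cpu = rng.uniform(0.0, 1.0, 200)
mem = rng.uniform(0.0, 1.0, 200)
power = 50.0 + 30.0 * cpu + 10.0 * cpu * mem + rng.normal(0.0, 0.1, 200)

# Candidate features a grammar could have generated; the model remains
# linear in the coefficients, so fitting it is an ordinary least-squares
# (convex) problem.
features = {
    "cpu": cpu,
    "mem": mem,
    "cpu*mem": cpu * mem,
    "cpu**2": cpu ** 2,
}
X = np.column_stack([np.ones_like(cpu)] + list(features.values()))
coef, *_ = np.linalg.lstsq(X, power, rcond=None)
pred = X @ coef
mape = float(np.mean(np.abs((pred - power) / power)) * 100.0)
```

A search procedure (here omitted) would score feature sets by such an error metric and evolve the grammar-derived expressions accordingly.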

Abstract:

This paper presents an innovative background modeling technique that is able to accurately segment foreground regions in RGB-D imagery (RGB plus depth). The technique is based on a Bayesian framework that efficiently fuses different sources of information to segment the foreground. In particular, the final segmentation is obtained by considering a prediction of the foreground regions, carried out by a novel Bayesian Network with a depth-based dynamic model, and by considering two independent depth- and color-based mixture-of-Gaussians background models. The efficient Bayesian combination of all these data reduces the noise and uncertainties introduced by the color and depth features and the corresponding models. As a result, more compact segmentations and refined foreground object silhouettes are obtained. Experimental results with different databases suggest that the proposed technique outperforms existing state-of-the-art algorithms.
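Assuming the color and depth observations are conditionally independent given the foreground/background label, a per-pixel Bayesian fusion of the two models' outputs can be sketched as below. This is a minimal naive-Bayes illustration, not the paper's full Bayesian Network with its dynamic model.

```python
def fuse_foreground(p_color, p_depth, prior=0.5):
    """Fuse per-pixel foreground probabilities from two independent
    background models (color- and depth-based), assuming each was
    computed with an equal foreground/background prior.

    Returns the combined posterior probability of foreground.
    """
    num = prior * p_color * p_depth
    den = num + (1.0 - prior) * (1.0 - p_color) * (1.0 - p_depth)
    return num / den

# Two confident, agreeing models reinforce each other:
p_agree = fuse_foreground(0.8, 0.8)
# An uninformative model (0.5) leaves the other's verdict unchanged:
p_neutral = fuse_foreground(0.5, 0.9)
```

The reinforcement effect (two 0.8 votes fusing to above 0.9) is what sharpens silhouettes where the individual color or depth model alone is uncertain.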

Abstract:

We propose and experimentally demonstrate a scalable and reconfigurable optical scheme to generate high-order UWB pulses. First, various ultra-wideband doublets are created through a process of phase-to-intensity conversion by means of phase modulation and a dispersive medium. In a second stage, the doublets are combined in an optical processing unit that allows the reconfiguration of high-order UWB pulses. Experimental results in both the time and frequency domains are presented, showing good performance in terms of the fractional bandwidth and spectral efficiency parameters.
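The second-stage combination can be sketched numerically: a weighted, delayed sum of doublets yields a higher-order pulse with additional zero crossings. The Ricker-style pulse shape, widths and delays below are illustrative assumptions, not the experimental waveforms.

```python
import numpy as np

t = np.linspace(-1e-9, 1e-9, 2001)  # 2 ns observation window
tau = 60e-12                         # assumed doublet width parameter

def doublet(t, delay, polarity=1.0):
    """Gaussian doublet (second derivative of a Gaussian, up to sign),
    the kind of pulse that phase-to-intensity conversion in a dispersive
    medium produces from a phase-modulated carrier."""
    x = (t - delay) / tau
    return polarity * (1.0 - x ** 2) * np.exp(-x ** 2 / 2.0)

# Higher-order pulse: combination of delayed doublets with opposite
# polarities, as a reconfigurable processing unit could synthesize.
pulse = doublet(t, -100e-12) - doublet(t, 100e-12)
```

Changing the delays and polarity weights reshapes the pulse spectrum, which is the reconfigurability the scheme exploits; the combined pulse stays zero-mean, as required for efficient radiation.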

Abstract:

Video analytics play a critical role in most recent traffic monitoring and driver assistance systems. In this context, the correct detection and classification of surrounding vehicles through image analysis has been the focus of extensive research in recent years. Most of the work reported on image-based vehicle verification makes use of supervised classification approaches and resorts to techniques such as histograms of oriented gradients (HOG), principal component analysis (PCA), and Gabor filters, among others. Unfortunately, existing approaches are lacking in two respects: first, comparison between methods using a common body of work has not been addressed; second, no study of the combination potential of popular features for vehicle classification has been reported. In this study, the performance of the different techniques is first reviewed and compared using a common public database. Then, the combination capabilities of these techniques are explored and a methodology is presented for the fusion of classifiers built upon them, also taking into account the vehicle pose. The study unveils the limitations of single-feature based classification and makes clear that fusion of classifiers is highly beneficial for vehicle verification.
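The benefit of fusion can be seen even in a toy setting: averaging the scores of two independently noisy single-feature classifiers reduces score variance and raises accuracy. The Gaussian score model below is a made-up stand-in for, e.g., HOG- and Gabor-based classifier outputs, not the paper's methodology.

```python
import numpy as np

rng = np.random.default_rng(1)
labels = rng.integers(0, 2, 5000)  # 1 = vehicle, 0 = background

# Toy scores from two single-feature classifiers: each is the true
# label corrupted by independent noise.
score_a = labels + rng.normal(0.0, 1.0, labels.size)
score_b = labels + rng.normal(0.0, 1.0, labels.size)

def accuracy(score):
    """Accuracy of thresholding a score at the midpoint 0.5."""
    return float(np.mean((score > 0.5) == (labels == 1)))

acc_single = accuracy(score_a)
acc_fused = accuracy(0.5 * (score_a + score_b))  # score-level fusion
```

Averaging two independent noise sources shrinks their standard deviation by a factor of about 1/sqrt(2), which is why the fused accuracy exceeds the single-feature one.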

Abstract:

Low-cost RGB-D cameras such as the Microsoft Kinect or the Asus Xtion Pro are completely changing the computer vision world, as they are being successfully used in several applications and research areas. Depth data are particularly attractive and suitable for applications based on moving-object detection through foreground/background segmentation approaches; the RGB-D applications proposed in the literature generally employ state-of-the-art foreground/background segmentation techniques based on the depth information, without taking the color information into account. The novel approach that we propose is based on a combination of classifiers that improves background subtraction accuracy with respect to state-of-the-art algorithms by jointly considering color and depth data. In particular, the combination of classifiers is based on a weighted average that adaptively modifies the support of each classifier in the ensemble by considering foreground detections in the previous frames and the depth and color edges. In this way, it is possible to reduce false detections due to critical issues that cannot be tackled by the individual classifiers, such as shadows and illumination changes, color and depth camouflage, moved background objects and noisy depth measurements. Moreover, to the best of the authors' knowledge, we propose the first publicly available RGB-D benchmark dataset with hand-labeled ground truth of several challenging scenarios to test background/foreground segmentation algorithms.
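A minimal sketch of a weighted-average ensemble with adaptive support follows; the exponential reweighting and the error measure are illustrative assumptions, not necessarily the paper's update rule.

```python
import numpy as np

def fuse_masks(p_color, p_depth, w_color, w_depth):
    """Weighted-average combination of two per-pixel foreground
    probability maps; the weights form a convex combination."""
    return w_color * p_color + w_depth * p_depth

def update_weights(w, errors, lr=0.2):
    """Shift support towards the classifier with the lower recent error
    (e.g. disagreement with the final segmentation in previous frames),
    keeping the weights normalized to sum to one."""
    w = np.asarray(w, dtype=float) * np.exp(-lr * np.asarray(errors))
    return w / w.sum()

w = np.array([0.5, 0.5])  # (color, depth), initially equal support
# Suppose the depth classifier erred more often (e.g. depth camouflage):
w = update_weights(w, errors=[0.1, 0.6])
fused = fuse_masks(1.0, 0.0, w[0], w[1])  # one toy pixel
```

Because the weights always renormalize to one, the fused output stays a valid probability, and a classifier crippled by camouflage or noise is gradually muted rather than hard-switched off.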

Abstract:

The aim of this thesis is the characterization of thermal generation representative of that existing in reality, in order to subsequently model and simulate it within a standard electrical network and carry out economic-environmental multiobjective optimization studies. To that end, the current energy and electricity context is first analysed, focusing on the peninsular Spanish system, where, the fuel-oil plants having been retired, only combined-cycle plants and coal plants of different ranks remain. Next, an analysis is carried out of the main environmental impacts of combustion-based power plants, represented above all by their emissions of CO2, SO2 and NOx, of the measures for their control and mitigation, and of the applicable regulations. Then, from the fuel characteristics and the heat-rate information, the thermal units are characterized with respect to the relevant functions that define their energy, economic and environmental behaviour, in terms of hourly output functions depending on load. The possibility of denitrification and desulfurization is taken into account. Since the objective functions are multiple and in conflict with one another, multiobjective methods were chosen, capable of identifying the set of optimal points, or Pareto front, in which, given one solution, no other exists that improves it in one objective without worsening it in another. Several multiobjective optimization methods were analysed and the ε-constraint method was selected, which is able to find non-convex fronts and whose strict optimality can be verified. A balanced representation of anthracite, national and imported bituminous coal, lignite and combined-cycle plants was integrated into the IEEE-57 test network, which can accommodate seven plants without distorting too much the real nominal powers of the units, and the solution of AC optimal power flows with the integrated multiobjective method was programmed in Matlab. The Pareto fronts of the combinations of cost with each of the three emission types, as well as of the four objectives together, are identified, yielding the optimal system costs over the whole range of emissions. The cost to the system of reducing an additional tonne of any type of emission by shifting towards cleaner generation mixes is evaluated. The points found guarantee that, for given emissions, they cannot be improved economically, or that, at a given cost, the system's emissions cannot be reduced any further. It is also shown how to use the Pareto fronts to plot optimal production strategies in response to hourly load changes.
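The ε-constraint idea (optimize one objective while bounding the other, then sweep the bound to trace the Pareto front) can be sketched on a two-unit toy dispatch problem. The costs, emission factors and brute-force grid search below are illustrative assumptions, far simpler than the thesis's AC optimal power flow on IEEE-57.

```python
import numpy as np

# Two-unit toy dispatch: unit 1 is cheap but dirty, unit 2 clean but costly.
demand = 100.0                               # MW to be served
cost = lambda p1, p2: 2.0 * p1 + 5.0 * p2    # toy linear costs
emis = lambda p1, p2: 0.9 * p1 + 0.2 * p2    # toy emission rates (t/h)

def eps_constraint(eps, grid=1001):
    """Minimize cost subject to emissions <= eps.

    One objective is optimized while the other becomes a constraint;
    sweeping eps traces the Pareto front, including non-convex parts
    that a weighted-sum scalarization would miss.
    Returns (best_cost, p1, p2), or None if eps is infeasible.
    """
    best = None
    for p1 in np.linspace(0.0, demand, grid):
        p2 = demand - p1
        if emis(p1, p2) <= eps:
            c = cost(p1, p2)
            if best is None or c < best[0]:
                best = (c, p1, p2)
    return best

# Sweep the emission cap to trace the cost-emissions Pareto front.
front = [(eps, eps_constraint(eps)[0]) for eps in (30.0, 50.0, 70.0, 90.0)]
```

As the cap tightens, the optimal cost rises; the slope between neighbouring front points is exactly the "cost of one additional tonne avoided" discussed above.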

Abstract:

An engineering modification of blade element/momentum theory is applied to describe the vertical autorotation of helicopter rotors. A full non‐linear aerodynamic model is considered for the airfoils, taking into account the dependence of lift and drag coefficients on both the angle of attack and the Reynolds number. The proposed model, which has been validated in previous work, has allowed the identification of different autorotation modes, which depend on the descent velocity and the twist of the rotor blades. These modes present different radial distributions of driven and driving blade regions, as well as different radial upwash/downwash patterns. The number of blade sections with zero tangential force, the existence of a downwash region in the rotor disk, the stability of the autorotation state, and the overall rotor autorotation efficiency, are all analyzed in terms of the flight velocity and the characteristics of the rotor. It is shown that, in vertical autorotation, larger blade twist leads to smaller values of descent velocity for a given thrust generated by the rotor in the autorotational state.
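The driven/driving decomposition of the blade can be illustrated with a stripped-down blade element calculation. The linear lift slope, constant profile drag, untwisted zero-pitch blade and the chosen rotor parameters below are toy assumptions; the validated model uses full non-linear, Reynolds-dependent airfoil data.

```python
import numpy as np

def tangential_coefficient(r, v_descent, omega, theta, cd0=0.011):
    """In-plane (tangential) force coefficient at each blade station.

    Positive values mark the driving region of the blade (sections that
    accelerate the rotor), negative values the driven region. Toy
    assumptions: linear lift slope 2*pi and constant profile drag cd0.
    """
    phi = np.arctan2(v_descent, omega * r)  # inflow angle, upward flow
    alpha = theta + phi                      # section angle of attack
    cl = 2.0 * np.pi * alpha                 # linear lift slope (toy)
    return cl * np.sin(phi) - cd0 * np.cos(phi)

r = np.linspace(0.5, 5.0, 200)               # blade stations [m]
# Untwisted blade at zero collective pitch, steady vertical autorotation:
ct = tangential_coefficient(r, v_descent=5.0, omega=40.0, theta=0.0)
```

With these numbers the inboard stations (large inflow angle) come out driving and the outboard ones driven, with a zero-tangential-force station in between, reproducing qualitatively the radial pattern described above.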

Abstract:

We present an approach for evaluating the efficacy of combination antitumor agent schedules that accounts for order and timing of drug administration. Our model-based approach compares in vivo tumor volume data over a time course and offers a quantitative definition for additivity of drug effects, relative to which synergism and antagonism are interpreted. We begin by fitting data from individual mice receiving at most one drug to a differential equation tumor growth/drug effect model and combine individual parameter estimates to obtain population statistics. Using two null hypotheses: (i) combination therapy is consistent with additivity or (ii) combination therapy is equivalent to treating with the more effective single agent alone, we compute predicted tumor growth trajectories and their distribution for combination treated animals. We illustrate this approach by comparing entire observed and expected tumor volume trajectories for a data set in which HER-2/neu-overexpressing MCF-7 human breast cancer xenografts are treated with a humanized, anti-HER-2 monoclonal antibody (rhuMAb HER-2), doxorubicin, or one of five proposed combination therapy schedules.
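A minimal version of the additivity reference can be sketched with an exponential growth/kill model integrated by Euler's method. The growth and kill rates below are invented, and the actual model fitted in the paper is richer; the point is only that "additive" is defined as the combination's effect rates summing.

```python
def tumor_volume(days, growth=0.08, kill_rates=(), v0=100.0, dt=0.01):
    """Euler integration of a toy growth/drug-effect model:

        dV/dt = (growth - sum(kill_rates)) * V

    Additivity of a combination is defined as the kill rates summing;
    observed trajectories below (synergy) or above (antagonism) this
    reference would be interpreted accordingly.
    """
    rate = growth - sum(kill_rates)
    v = v0
    for _ in range(int(days / dt)):
        v += rate * v * dt
    return v

# Untreated control, two single agents, and their additive reference:
v_none = tumor_volume(28)
v_a = tumor_volume(28, kill_rates=(0.05,))
v_b = tumor_volume(28, kill_rates=(0.02,))
v_additive = tumor_volume(28, kill_rates=(0.05, 0.02))
```

Comparing an observed combination trajectory against v_additive, rather than against the better single agent alone, is what separates the two null hypotheses stated above.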

Abstract:

Exposure of HIV-1 mutant strains containing non-nucleoside reverse transcriptase inhibitor (NNRTI)-specific mutations in their reverse transcriptase (RT) to 3TC easily selected for double-mutant viruses that had acquired the characteristic 184-Ile mutation in their RT in addition to the NNRTI-specific mutations. Conversely, exposure of 3TC-resistant 184-Val mutant HIV-1 strains to nine different NNRTIs resulted in the rapid emergence of NNRTI-resistant virus strains, no later than when wild-type HIV-1 (IIIB) was exposed to the same compounds. The RTs of these resistant virus strains had acquired the NNRTI-characteristic mutations in addition to the preexisting 184-Val mutation. Surprisingly, when the 184-Ile mutant HIV-1 was exposed to a variety of NNRTIs, the 188-His mutation invariably occurred concomitantly with the 184-Ile mutation in the HIV-1 RT. Breakthrough of this double-mutant virus was markedly accelerated compared with the mutant virus selected from the wild-type or 184-Val mutant HIV-1 strain. The double (184-Ile + 188-His) mutant virus showed a much more profound resistance profile against the NNRTIs than the 188-His HIV-1 mutant. In contrast to sequential chemotherapy, concomitant combination treatment of HIV-1-infected cells with 3TC and a variety of NNRTIs resulted in a dramatic delay of virus breakthrough and resistance development.