43 results for exceedance probabilities

at Universidad Politécnica de Madrid


Relevance:

20.00%

Publisher:

Abstract:

Transition probabilities and oscillator strengths of 176 spectral lines of astrophysical interest, arising from the 5d^10 ns (n = 7, 8), 5d^10 np (n = 6, 7), 5d^10 nd (n = 6, 7), 5d^10 5f, 5d^10 5g, 5d^10 nh (n = 6, 7, 8), 5d^9 6s^2, and 5d^9 6s6p configurations, together with radiative lifetimes for 43 levels of Pb IV, have been calculated. These values were obtained in intermediate coupling (IC) and with relativistic Hartree–Fock calculations that include core-polarization effects. For the IC calculations, we used the standard method of least-squares fitting to experimental energy levels by means of the Cowan computer code. Including the 5d^10 7p and 5d^10 5f configurations in these calculations has made possible a complete assignment of the energy levels of Pb IV. The transition probabilities, oscillator strengths, and radiative lifetimes obtained are generally in good agreement with the experimental data.

Relevance:

20.00%

Publisher:

Abstract:

We have determined matrix elements for all experimental configurations of Ca III, including the 3s3p^6 3d configuration. These values have been obtained in intermediate coupling (IC). For the IC calculations, we used the standard method of least-squares fitting to the experimental energy levels, using the computer code developed by Robert Cowan. In this paper, using these matrix elements, we report calculated Stark widths and shifts for 148 Ca III spectral lines, transition probabilities for 56 Ca III spectral lines, and radiative lifetimes for eight Ca III levels. The Stark widths and shifts, calculated using the Griem semi-empirical approach, are presented for an electron density of 10^17 cm^-3 and temperatures T = 1.0–10.0 × 10^4 K. The theoretical trends of the Stark broadening parameters versus temperature are presented for transitions of astrophysical interest. There is good agreement between our calculated transition probabilities and radiative lifetimes and the experimental values reported in the literature. We have not been able to find any previously published values for the Stark parameters to compare with.

Relevance:

20.00%

Publisher:

Abstract:

We present improved experimental transition probabilities for the optical Ca I 4s4p–4s4d and 4s4p–4p^2 multiplets. The values were determined with an absolute uncertainty of 10%. Transition probabilities were obtained from branching ratios derived from relative line intensities emitted by a laser-induced plasma (LIP). The line intensities were recorded with the target (lead–calcium) placed in an argon atmosphere at 6 Torr, at a delay of 2.5 µs after the laser pulse, which provides appropriate measurement conditions, and were analysed between 350.0 and 550.0 nm. They are measured once the plasma reaches local thermodynamic equilibrium (LTE). The plasma is characterized by an electron temperature (T) of 11400 K and an electron number density (Ne) of 1.1 × 10^16 cm^-3. The influence of self-absorption has been estimated for every line, and plasma homogeneity has been checked. The values obtained were compared with previous experimental values in the literature. The method of measuring transition probabilities using a laser-induced plasma as the spectroscopic source has thereby been verified.

Relevance:

10.00%

Publisher:

Abstract:

Nonparametric belief propagation (NBP) is a well-known particle-based method for distributed inference in wireless networks. NBP has a large number of applications, including cooperative localization. However, in loopy networks NBP suffers from the same problems as standard BP, such as over-confident beliefs and possible non-convergence. Tree-reweighted NBP (TRW-NBP) can mitigate these problems, but it does not easily lead to a distributed implementation because the required so-called edge appearance probabilities are non-local. In this paper, we propose a variation of TRW-NBP suitable for cooperative localization in wireless networks. Our algorithm uses a fixed edge appearance probability for every edge and can outperform standard NBP in dense wireless networks.
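
The algorithm in the paper is particle-based (nonparametric), so the discrete-state sketch below only illustrates the mechanism the abstract refers to: a tree-reweighted BP update in which every edge shares one fixed edge appearance probability rho (rho = 1 recovers standard BP). The toy graph, the potentials and the chosen rho are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def trw_bp(node_pot, edge_pot, edges, rho=0.7, iters=50):
    """Tree-reweighted loopy BP with one uniform edge appearance probability rho.

    node_pot: {node: (K,) array}, edge_pot: {(i, j): (K, K) array}, edges: list of (i, j).
    rho = 1.0 recovers standard belief propagation."""
    nodes = list(node_pot)
    K = node_pot[nodes[0]].size
    nbrs = {n: set() for n in nodes}
    msg = {}
    for i, j in edges:
        nbrs[i].add(j)
        nbrs[j].add(i)
        msg[(i, j)] = np.ones(K) / K
        msg[(j, i)] = np.ones(K) / K

    def pot(i, j):  # pairwise potential oriented as (x_i, x_j)
        return edge_pot[(i, j)] if (i, j) in edge_pot else edge_pot[(j, i)].T

    for _ in range(iters):
        new = {}
        for (i, j) in msg:
            # product over neighbours k of msg_{k->i}^rho, then remove j's share and
            # apply the extra 1/msg_{j->i}^(1-rho) TRW factor: together a division by msg_{j->i}
            prod = node_pot[i] * np.prod([msg[(k, i)] ** rho for k in nbrs[i]], axis=0)
            prod = prod / msg[(j, i)]
            m = (pot(i, j) ** (1.0 / rho)).T @ prod
            new[(i, j)] = m / m.sum()
        msg = new

    beliefs = {}
    for n in nodes:
        b = node_pot[n] * np.prod([msg[(k, n)] ** rho for k in nbrs[n]], axis=0)
        beliefs[n] = b / b.sum()
    return beliefs

# Tiny loopy example: a 3-node cycle with binary variables; rho = 2/3 is the uniform
# edge appearance probability of the spanning trees of a triangle.
attract = np.array([[2.0, 1.0], [1.0, 2.0]])
b = trw_bp(node_pot={0: np.array([0.7, 0.3]), 1: np.array([0.5, 0.5]), 2: np.array([0.4, 0.6])},
           edge_pot={(0, 1): attract, (1, 2): attract, (0, 2): attract},
           edges=[(0, 1), (1, 2), (0, 2)], rho=2.0 / 3.0)
print(b)
```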

Relevance:

10.00%

Publisher:

Abstract:

In this paper we propose a new method for the automatic detection and tracking of road traffic signs using a single on-board camera. The method aims to increase the reliability of the detections so that it can boost the performance of any traffic sign recognition scheme. The proposed approach exploits a combination of different features, such as color, appearance, and tracking information. This information is fed into a recursive Bayesian decision framework in which prior probabilities are dynamically adapted to the tracking results. This decision scheme yields a number of candidate regions in the image, selected according to their HS (hue-saturation) values. Finally, a Kalman filter with adaptive noise tuning provides the required temporal and spatial coherence to the estimates. Results show that the proposed method achieves high detection rates in challenging scenarios, including illumination changes, rapid motion, and significant perspective distortion.
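
The abstract does not spell out the update equations, so the following is only a minimal sketch of the idea of dynamically adapted priors in a recursive Bayesian decision: the prior that a candidate HS-colour region contains a sign is raised wherever the tracker predicted one. The likelihood-ratio input, the numeric values and the function name are hypothetical.

```python
def sign_posterior(color_likelihood_ratio, base_prior=0.05,
                   predicted_by_tracker=False, track_boost=0.6):
    """Posterior probability that a candidate HS-colour region is a traffic sign.

    color_likelihood_ratio = p(HS features | sign) / p(HS features | background),
    estimated from colour/appearance cues of the region. The prior is dynamically
    raised when the tracker predicted a sign at this location (0.05 and 0.6 are
    illustrative values, not those of the paper)."""
    prior = track_boost if predicted_by_tracker else base_prior
    odds = color_likelihood_ratio * prior / (1.0 - prior)   # Bayes' rule in odds form
    return odds / (1.0 + odds)

# Example: a strongly red, saturated region that the tracker also predicted
print(sign_posterior(color_likelihood_ratio=8.0, predicted_by_tracker=True))
```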

Relevance:

10.00%

Publisher:

Abstract:

Hail is a serious concern for agriculture on the Iberian Peninsula. Hailstorms affect crop yield and/or quality to a degree that depends on the crop species and the phenological stage. Within Europe, Spain is one of the countries with relatively high agricultural losses related to hailstorms. It is therefore of great interest to study models that can support calculations of the probability of economic losses due to hail damage and of the trend of such losses over time. Studies carried out in France and the Netherlands show that the summer mean temperature was highly correlated with a yearly hail severity index derived from hail-related parameters collected for insurance purposes, whereas studies in the USA point out that such a highly significant correlation cannot be found because of high climatic variability. The aim of this work is to test the correlation between average minimum temperatures and hail damage intensity over the Spanish Iberian Peninsula. For this purpose, correlation analyses of both variables were performed for the 47 Spanish provinces (individually and as a single set), for all crops together, and for four individual crops: grapes, wheat, barley and winter grains. Suitable crop insurance data are available from 1981 to 2007, and temperature data were obtained for the same period. This study does not confirm the results previously obtained for France and the Netherlands relating observed hail damage to the average minimum temperature. The reason for this difference and the nature of the cases observed are discussed.
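
As a rough illustration of the kind of analysis described (not the authors' code or data), a per-province Pearson correlation between average minimum temperature and an insurance-based hail damage intensity index could be computed as follows; the file and column names are hypothetical.

```python
import pandas as pd
from scipy.stats import pearsonr

# Hypothetical table: one row per province and year (1981-2007) with the
# average minimum temperature and an insurance-based hail damage intensity index.
df = pd.read_csv("hail_damage_by_province.csv")  # columns: province, year, t_min_avg, damage_index

# Correlation for each of the 47 provinces individually ...
for prov, grp in df.groupby("province"):
    r, p = pearsonr(grp["t_min_avg"], grp["damage_index"])
    print(f"{prov}: r = {r:+.2f}, p = {p:.3f}")

# ... and for all provinces pooled as a single set
r_all, p_all = pearsonr(df["t_min_avg"], df["damage_index"])
print(f"All provinces: r = {r_all:+.2f}, p = {p_all:.3f}")
```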

Relevance:

10.00%

Publisher:

Abstract:

In this paper, the presynaptic rule, a classical rule for Hebbian learning, is revisited. It is shown that the presynaptic rule exhibits relevant synaptic properties such as synaptic directionality and metaplasticity of the long-term potentiation (LTP) threshold. With slight modifications, the presynaptic model also exhibits metaplasticity of the long-term depression (LTD) threshold, making it consistent with Artola, Brocher and Singer's (ABS) influential model. Two asymptotically equivalent versions of the presynaptic rule were adopted for this analysis: the first uses an incremental equation, while the second uses conditional probabilities. Despite their simplicity, both types of presynaptic rule exhibit sophisticated biological properties, especially the probabilistic version.
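
The abstract does not reproduce the equations, so the sketch below shows one common way to write a presynaptically gated incremental update and why it is asymptotically equivalent to a conditional probability version: the weight converges to P(post active | pre active). The exact formulation used in the paper may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

# Binary pre/post activity with P(post=1 | pre=1) = 0.7 (the ground truth to recover)
pre = rng.random(50_000) < 0.5
post = np.where(pre, rng.random(pre.size) < 0.7, rng.random(pre.size) < 0.2).astype(int)

# Incremental version: update only when the presynaptic neuron fires
w, alpha = 0.0, 0.01
for x_pre, x_post in zip(pre, post):
    if x_pre:                      # presynaptic gating
        w += alpha * (x_post - w)  # move w toward the postsynaptic activity

# Conditional probability version: estimate P(post | pre) directly
w_prob = post[pre].mean()

print(w, w_prob)   # both approach 0.7
```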

Relevance:

10.00%

Publisher:

Abstract:

This article proposes a method for calibrating discontinuity sets in rock masses. We present a novel approach for the calibration of stochastic discontinuity network parameters based on genetic algorithms (GAs). To validate the approach, examples of its application to cases with known parameters of the original Poisson discontinuity network are presented. Parameters of the model are encoded as chromosomes using a binary representation, and such chromosomes evolve as successive generations of a randomly generated initial population, subjected to the GA operations of selection, crossover and mutation. The back-calculated parameters are used to assess the inference capabilities of the model using different objective functions and different probabilities of crossover and mutation. Results show that the predictive capabilities of GAs depend significantly on the type of objective function considered; they also show that the calibration capabilities of the genetic algorithm can be acceptable for practical engineering applications, since in most cases it can be expected to provide parameter estimates with relatively small errors for those parameters of the network (such as intensity and mean size of discontinuities) that have the strongest influence on many engineering applications.
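
A minimal, generic GA of the kind described (binary chromosomes evolved by selection, crossover and mutation) applied to back-calculating two network parameters such as intensity and mean discontinuity size might look as follows; the objective function, parameter ranges and GA settings are placeholders, not those of the paper.

```python
import random
random.seed(1)

BITS = 16                       # bits per parameter
BOUNDS = [(0.5, 5.0),           # discontinuity intensity (placeholder range)
          (1.0, 10.0)]          # mean discontinuity size (placeholder range)
TARGET = [2.3, 4.1]             # "true" parameters of a synthetic network

def decode(chrom):
    vals = []
    for k, (lo, hi) in enumerate(BOUNDS):
        g = int(chrom[k * BITS:(k + 1) * BITS], 2)
        vals.append(lo + (hi - lo) * g / (2 ** BITS - 1))
    return vals

def fitness(chrom):             # placeholder objective: misfit to the known network
    return -sum((v - t) ** 2 for v, t in zip(decode(chrom), TARGET))

def evolve(pop_size=60, gens=80, p_cross=0.8, p_mut=0.02):
    length = BITS * len(BOUNDS)
    pop = ["".join(random.choice("01") for _ in range(length)) for _ in range(pop_size)]
    for _ in range(gens):
        # tournament selection
        parents = [max(random.sample(pop, 3), key=fitness) for _ in range(pop_size)]
        nxt = []
        for a, b in zip(parents[::2], parents[1::2]):
            if random.random() < p_cross:                 # one-point crossover
                cut = random.randrange(1, length)
                a, b = a[:cut] + b[cut:], b[:cut] + a[cut:]
            nxt += [a, b]
        # bit-flip mutation
        pop = ["".join(c if random.random() > p_mut else "10"[int(c)] for c in ind)
               for ind in nxt]
    return decode(max(pop, key=fitness))

print(evolve())   # should land close to TARGET
```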

Relevance:

10.00%

Publisher:

Abstract:

This paper focuses on the railway rolling stock circulation problem in rapid transit networks, in which frequencies are high and distances are relatively short. Although the distances are not very large, service times are long due to the large number of intermediate stops required to allow proper passenger flow. The main complicating issue is that the available capacity at depot stations is very low, and both this capacity and the rolling stock are shared between different train lines. This forces the introduction of empty train movements and rotation maneuvers to ensure sufficient station capacity and rolling stock availability. However, these shunting operations may sometimes be difficult to perform and can easily malfunction, causing localized incidents that could propagate throughout the entire network due to cascading effects. This type of operation is penalized with the goal of selectively avoiding it and mitigating its high malfunction probability. Critical trains, defined as train services that pass through stations with a large number of passengers arriving at the platform during rush hours, are also introduced. We illustrate our model using computational experiments drawn from RENFE (the main Spanish operator of suburban passenger trains) in Madrid, Spain. The results of the model, obtained in approximately one minute, have been received positively by RENFE planners.
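
As a toy illustration only (the authors' model also handles timetables, depot capacities, rotations and critical trains), the idea of penalizing failure-prone empty movements can be written as a small mixed-integer program; the line names, the data and the penalty weight below are invented, and the sketch assumes the pulp package.

```python
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpInteger

lines = ["C3", "C4"]                     # two hypothetical suburban lines
demand = {"C3": 5, "C4": 7}              # train units required per line in the peak
initial = {"C3": 6, "C4": 4}             # units initially parked at each line's depot
FLEET = 12                               # total rolling stock available
PENALTY = 3.0                            # weight penalising each empty movement

prob = LpProblem("rolling_stock_toy", LpMinimize)
assign = LpVariable.dicts("assign", lines, lowBound=0, cat=LpInteger)  # units serving each line
empty = LpVariable.dicts("empty", lines, lowBound=0, cat=LpInteger)    # empty movements into the line

# operating cost per assigned unit plus a penalty for every (failure-prone) empty movement
prob += lpSum(assign[l] for l in lines) + PENALTY * lpSum(empty[l] for l in lines)

for l in lines:
    prob += assign[l] >= demand[l]                   # cover the planned services
    prob += assign[l] <= initial[l] + empty[l]       # extra units must arrive as empty movements
prob += lpSum(assign[l] for l in lines) <= FLEET     # shared fleet

prob.solve()
print({l: (assign[l].value(), empty[l].value()) for l in lines})
```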

Relevance:

10.00%

Publisher:

Abstract:

The aim of this paper is to propose an integrated planning model that adapts the offered capacity and system frequencies to meet increased passenger demand and traffic congestion around urban and suburban areas. Railway capacity is usually studied at the line-planning stage; however, the planned frequencies are obtained without accounting for rolling stock flows through the rapid transit network. In order to give the problem more freedom in deciding rolling stock flows, and therefore to better adjust these flows to passenger demand, a new integrated model is proposed in which frequencies are readjusted. The railway timetable and the rolling stock assignment are then also calculated, taking shunting operations into account. These operations may sometimes malfunction, causing localized incidents that could propagate throughout the entire network due to cascading effects. This type of operation is penalized with the goal of selectively avoiding it and mitigating its high malfunction probability. Swapping operations are also made possible by using homogeneous rolling stock material and by ensuring parking capacity at strategic stations. We illustrate our model using computational experiments drawn from RENFE (the main Spanish operator of suburban passenger trains) in Madrid, Spain. The results show that a greater degree of robustness can be obtained through this integrated approach.

Relevance:

10.00%

Publisher:

Abstract:

Safety assessment of historic masonry structures is an open problem. The material is heterogeneous and anisotropic, the previous state of stress is difficult to know, and the boundary conditions are uncertain. In the early 1950s it was proven that limit analysis is applicable to this kind of structure, and it has been considered a suitable tool ever since. In cases where no sliding occurs, the standard limit analysis theorems constitute a formidable tool because of their simplicity and robustness: it is not necessary to know the actual state of stress; it is enough to find any equilibrium solution that satisfies the limit conditions of the material, in the certainty that its load will be equal to or lower than the actual load at the onset of collapse. Moreover, this collapse-onset load is unique (uniqueness theorem) and can be obtained as the optimum of either of a pair of dual convex mathematical programs. However, when collapse mechanisms involving sliding may exist, any solution must satisfy both the static and the kinematic constraints, as well as a special kind of disjunctive constraints linking the two, which can be stated as complementarity constraints. In this latter case the existence of a unique solution is not guaranteed, so other methods are needed to treat the uncertainty associated with its multiplicity. In recent years, research has focused on finding an absolute minimum below which collapse is impossible. This approach is easy to state mathematically but computationally intractable, because the complementarity constraints y·z = 0, y ≥ 0, z ≥ 0 are neither convex nor smooth: the resulting decision problem is NP-complete and the corresponding global optimization problem is NP-hard. Nevertheless, obtaining a solution (with no guarantee of success) is an affordable problem. This thesis proposes to solve the problem by Successive Linear Programming, exploiting the special structure of the complementarity constraints, which in bilinear form read y·z = 0, y ≥ 0, z ≥ 0, and exploiting the fact that the complementarity error (in bilinear form) is an exact penalty function. But when it comes to finding the worst solution, the equivalent global optimization problem is intractable (NP-hard). Furthermore, until a maximum or minimum principle is demonstrated, it is questionable whether the effort spent in approximating this minimum is justified. In Chapter 5, the frequency distribution of the load factor over all possible collapse-onset solutions is obtained for a simple example. To this end, solutions are sampled by the Monte Carlo method, using an exact polytope-computation method as a check. The ultimate goal is to establish to what extent the search for the absolute minimum is justified and to propose an alternative, probability-based approach to safety assessment. The frequency distributions of the load factors obtained for the case studied show that both the maximum and the minimum load factors are very infrequent, and the more so the more perfect and continuous the contact is. These results confirm the interest of developing new probabilistic methods. Chapter 6 proposes such a method, based on obtaining multiple solutions from random starting points and qualifying the results by means of order statistics. The purpose is to determine, for each solution, the probability of the onset of collapse. The method is applied (following the expectation reduction proposed by Ordinal Optimization) to obtain a solution that lies within a given percentage of the worst ones. Finally, Chapter 7 proposes hybrid methods, incorporating metaheuristics, for cases in which the search for the global minimum is justified.
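
A worked version of the order-statistics argument behind Chapter 6, with illustrative numbers: if collapse-onset solutions are sampled from independent random starting points, the probability that at least one of n samples falls within the worst alpha fraction of load factors is 1 - (1 - alpha)^n, which also gives the number of samples needed for a prescribed confidence.

```python
from math import log, ceil

def prob_hit_worst(alpha, n):
    """P(at least one of n independent samples lies in the worst alpha fraction)."""
    return 1.0 - (1.0 - alpha) ** n

def samples_needed(alpha, confidence):
    """Smallest n with prob_hit_worst(alpha, n) >= confidence."""
    return ceil(log(1.0 - confidence) / log(1.0 - alpha))

# e.g. to be 99% sure of having sampled a solution among the worst 5% of load factors:
print(samples_needed(0.05, 0.99))        # 90 random starting points
print(prob_hit_worst(0.05, 90))          # ~0.99
```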

Relevance:

10.00%

Publisher:

Abstract:

Current trends in the fields of artificial intelligence and expert systems are moving towards the exciting possibility of reproducing and simulating human expertise and expert behaviour in a knowledge base, coupled with an appropriate, partially ‘intelligent’, computer code. This paper deals with quality level prediction in concrete structures using the assistance of an expert system, QL-CONST1, which is able to reason about this specific field of structural engineering. Evidence, hypotheses and factors related to this field of human knowledge have been codified into a knowledge base. The knowledge base is expressed in terms of the probabilities of the presence of either hypotheses or evidence and of the conditional presence of both. Human experts in the fields of structural engineering and structural safety contributed their invaluable knowledge and assistance to the construction of the knowledge base. Some illustrative examples for the validation of the expert system's behaviour are included.
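
The abstract describes a knowledge base of prior and conditional probabilities linking evidence and hypotheses; since QL-CONST1's inference engine is not detailed here, the following is only a generic sketch of how a hypothesis probability would be revised when a piece of evidence is observed, with purely illustrative numbers.

```python
def update_hypothesis(p_h, p_e_given_h, p_e_given_not_h):
    """Bayes' rule: revised probability of hypothesis H after observing evidence E."""
    p_e = p_e_given_h * p_h + p_e_given_not_h * (1.0 - p_h)
    return p_e_given_h * p_h / p_e

# Hypothesis: "the structural element reaches the specified quality level".
# Evidence: "curing was properly controlled on site" (numbers purely illustrative).
p = 0.60
p = update_hypothesis(p, p_e_given_h=0.90, p_e_given_not_h=0.40)
print(round(p, 3))   # prior 0.60 -> posterior ~0.77
```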

Relevance:

10.00%

Publisher:

Abstract:

Natural regeneration is a key ecological process that makes plant persistence possible and, consequently, constitutes an essential element of sustainable forest management. In this respect, natural regeneration in even-aged stands of Pinus pinea L. located on the Spanish Northern Plateau has not always been achieved successfully despite over a century of pine-nut-based management. As a result, natural regeneration has recently become a major concern for forest managers at a time when investment in silviculture is being rationalized. The present dissertation aims to provide forest managers with answers on this topic through the development of an integral, multistage regeneration model for P. pinea stands in the region. From this model, recommendations for natural regeneration-based silviculture can be derived under present and future climate scenarios. The model structure also makes it possible to detect the likely bottlenecks affecting the process. The integral model consists of five submodels, one for each of the subprocesses linking the stages involved in natural regeneration (seed production, seed dispersal, seed germination, seed predation and seedling survival). The outputs of the submodels represent the transitional probabilities between these stages as a function of climatic and stand variables, which in turn are representative of the ecological factors driving regeneration. At the subprocess level, the findings of this dissertation should be interpreted as follows. The scheduling of the shelterwood system currently applied to low-density stands leads to situations of dispersal limitation from the initial stages of the regeneration period onwards. Concerning predation, predator activity appears to be limited only by the occurrence of severe summer droughts and masting events, the summer being a favourable period for seed survival. Outside this time interval, predators were found to deplete seed crops almost totally. Given that P. pinea dissemination occurs in summer (i.e. the period safe from predation), the likelihood of a seed not being destroyed is conditional on germination occurring before predator activity intensifies. However, the optimal conditions for germination seldom occur, restricting emergence to a few days during the fall. The window to reach the seedling stage is therefore narrow. In addition, the seedling survival submodel predicts extremely high seedling mortality rates, so only some individuals from large cohorts will be able to persist. These facts, together with the strong climate-mediated masting habit exhibited by P. pinea, reveal that the overall probability of establishment is low. Against this background, current management (low final stand densities resulting from intense thinning and strict felling schedules) constrains the occurrence of enough favourable events to achieve natural regeneration within the current rotation time. Stochastic simulation and optimisation computed through the integral model confirm this circumstance, suggesting that more flexible and progressive regeneration fellings should be conducted. From an ecological standpoint, these results point to a reproductive strategy leading to uneven-aged stand structures, in full accordance with the medium shade tolerance of the species. Finally, stochastic simulations performed under a climate-change scenario show that regeneration of the species will not be strongly hampered in the future.
This resilient behaviour highlights the fundamental ecological role played by P. pinea in demanding areas where other tree species fail to persist.
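
The integral model chains the subprocess-level transitional probabilities, so the overall per-seed probability of establishment is (roughly) their product, which is why it ends up very low. The figures below are invented for illustration and are not outputs of the fitted submodels.

```python
# Hypothetical transition probabilities for one dispersed seed in a given year
stages = {
    "seed produced and dispersed to the spot": 0.30,
    "escapes predation before germinating":    0.15,
    "germinates (suitable autumn window)":     0.10,
    "seedling survives its first summers":     0.05,
}

p_establishment = 1.0
for stage, p in stages.items():
    p_establishment *= p
    print(f"{stage:45s} p = {p:.2f}")

print(f"overall per-seed establishment probability = {p_establishment:.6f}")  # ~2e-4
```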

Relevance:

10.00%

Publisher:

Abstract:

This work deals with quality level prediction in concrete structures through the assistance of an expert system which is able to apply reasoning to this field of structural engineering. Evidence, hypotheses and factors related to this field of human knowledge have been codified into a knowledge base in terms of the probabilities of the presence of either hypotheses or evidence, and of the conditional presence of both. Human experts in structural engineering and structural safety provided the invaluable knowledge and assistance necessary to construct this "computer knowledge body".

Relevance:

10.00%

Publisher:

Abstract:

The Cross-Entropy (CE) method is an efficient technique for the estimation of rare-event probabilities and for combinatorial optimization. This work presents a novel application of CE to the optimization of a soft-computing controller. A fuzzy controller was designed to command an unmanned aerial system (UAS) in a collision avoidance task. The only sensor used to accomplish this task was a forward-facing camera. CE is used to reach a near-optimal controller by modifying the scaling factors of the controller inputs. The optimization was carried out using the ROS-Gazebo simulation system. In order to evaluate the optimization, a large number of tests was carried out with a real quadcopter.
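
The CE loop for tuning a handful of controller scaling factors can be sketched generically as follows; the cost function (a stand-in for the ROS-Gazebo collision-avoidance evaluation), the parameter dimension and the CE settings are assumptions, not the values used in the paper.

```python
import numpy as np

def cross_entropy_tuning(cost, dim, iters=30, pop=50, elite_frac=0.2, seed=0):
    """Cross-Entropy optimisation of `dim` scaling factors: sample candidates from a
    Gaussian, keep the elite fraction, and refit the sampling distribution to the elite."""
    rng = np.random.default_rng(seed)
    mu, sigma = np.ones(dim), np.ones(dim)          # initial sampling distribution
    n_elite = max(1, int(elite_frac * pop))
    for _ in range(iters):
        cand = rng.normal(mu, sigma, size=(pop, dim))
        scores = np.array([cost(c) for c in cand])
        elite = cand[np.argsort(scores)[:n_elite]]  # lowest cost = best
        mu, sigma = elite.mean(axis=0), elite.std(axis=0) + 1e-6
    return mu

# Stand-in cost: distance of the scaling factors from a fictitious optimum.
best = cross_entropy_tuning(lambda x: np.sum((x - np.array([0.8, 1.5, 0.3])) ** 2), dim=3)
print(best)
```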