951 results for exceedance probabilities


Relevance:

10.00%

Publisher:

Abstract:

This study analyzes the relative importance of the factors that influence the decision to produce for foreign markets in the Chilean agricultural sector. Using data obtained from personal interviews with 368 farmers, the market/production decision was estimated using a multinomial logit model. Three market/production alternatives were analyzed: production aimed at the external market, production for the internal market but with expectations of being exported, and production targeted only at the internal market. Marginal effects, odds ratios and predicted probabilities were used to identify the relevance of each variable. The results showed that a producer who is male, has a higher educational level, rents the land rather than owning it, and whose farm has irrigation and is located in an area with a high concentration of exporting producers will have a high probability of producing exportables. However, the factor with the highest impact on producing for the external market is the geographic concentration of exporting producers, that is, an export spillover effect. Indeed, when the concentration changes from 0 to its maximum (0.26), the odds of producing exportables rather than traditional products increase by a factor of 70 (against a factor of 10 in the case of irrigation).
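A minimal sketch of this kind of estimation, using statsmodels' MNLogit on simulated stand-in data (all variable names and values below are hypothetical, not the survey data): odds ratios are obtained by exponentiating the fitted coefficients, while predicted probabilities and marginal effects follow directly from the fitted model.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated stand-in for the survey data (hypothetical variables and values).
rng = np.random.default_rng(0)
n = 368
df = pd.DataFrame({
    "male": rng.integers(0, 2, n),
    "education_years": rng.integers(4, 18, n),
    "rents_land": rng.integers(0, 2, n),
    "irrigation": rng.integers(0, 2, n),
    "exporter_concentration": rng.uniform(0.0, 0.26, n),
})
# Outcome: 0 = internal market only, 1 = internal with export expectations, 2 = export market.
y = rng.integers(0, 3, n)

X = sm.add_constant(df)
model = sm.MNLogit(y, X).fit(disp=False)

odds_ratios = np.exp(model.params)               # odds ratios relative to the base category
marginal_effects = model.get_margeff().summary_frame()
predicted_probs = model.predict(X)               # one column per market alternative
print(odds_ratios)
```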

Relevance:

10.00%

Publisher:

Abstract:

Although the climate development over the Holocene in the Northern Hemisphere is well known, palaeolimnological climate reconstructions reveal spatiotemporal variability in northern Eurasia. Here we present a multi-proxy study from north-eastern Siberia combining sediment geochemistry with diatom and pollen data from lake-sediment cores covering the last 38,000 cal. years. Our results show major changes in pyrite content and fragilarioid diatom species distributions, indicating prolonged seasonal lake-ice cover between ~13,500 and ~8,900 cal. years BP and possibly during the 8,200 cal. years BP cold event. A pollen-based climate reconstruction yielded a mean July temperature of 17.8°C during the Holocene Thermal Maximum (HTM) between ~8,900 and ~4,500 cal. years BP. Naviculoid diatoms appear in the late Holocene, indicating a shortening of the seasonal ice cover that continues today. Our results reveal a strong correlation between the applied terrestrial and aquatic indicators and natural seasonal climate dynamics in the Holocene. Planktonic diatoms show a strong response to changes in the lake ecosystem due to recent climate warming in the Anthropocene. We assess other palaeolimnological studies to infer the spatiotemporal pattern of the HTM and affirm that the timing of its onset, which differs by up to 3,000 years from north to south, can be well explained by climatic teleconnections: the westerlies brought cold air to this part of Siberia until the Laurentide ice sheet vanished 7,000 years ago. The apparent delayed ending of the HTM in the central Siberian record can be ascribed to the exceedance of ecological thresholds lagging behind increases in winter temperatures and the decreasing contrast in insolation between seasons during the mid to late Holocene, as well as to the lack of differentiation between summer and winter trends in palaeolimnological reconstructions.

Relevance:

10.00%

Publisher:

Abstract:

Snow cover has dramatic effects on the structure and functioning of Arctic ecosystems in winter. In the tundra, the subnivean space is the primary habitat of wintering small mammals and may be critical for their survival and reproduction. We have investigated the effects of snow cover and habitat features on the distributions of collared lemming (Dicrostonyx groenlandicus) and brown lemming (Lemmus trimucronatus) winter nests, as well as on their probabilities of reproduction and predation by stoats (Mustela erminea) and arctic foxes (Vulpes lagopus). We sampled 193 lemming winter nests and measured habitat features at all of these nests and at random sites at two spatial scales. We also monitored overwinter ground temperature at a subsample of nest and random sites. Our results demonstrate that nests were primarily located in areas with high micro-topography heterogeneity, steep slopes, deep snow cover providing thermal protection (reduced daily temperature fluctuations) and a high abundance of mosses. The probability of reproduction increased in collared lemming nests at low elevation and in brown lemming nests with high availability of some graminoid species. The probability of predation by stoats was density dependent and was higher in nests used by collared lemmings. Snow cover did not affect the probability of predation of lemming nests by stoats, but deep snow cover limited predation attempts by arctic foxes. We conclude that snow cover plays a key role in the spatial structure of wintering lemming populations and potentially in their population dynamics in the Arctic.

Relevance:

10.00%

Publisher:

Abstract:

Maritime accidents involving ships carrying passengers may pose a high risk with respect to human casualties. For effective risk mitigation, insight into the process of risk escalation is needed, which requires a proactive approach to risk modelling for maritime transportation systems. Most existing models are based on historical data on maritime accidents, and thus they can be considered reactive rather than proactive. This paper introduces a systematic, transferable and proactive framework for estimating the risk of maritime transportation systems, meeting the requirements stemming from the adopted formal definition of risk. The framework focuses on ship-ship collisions in the open sea, with a RoRo/Passenger ship (RoPax) considered as the struck ship. First, it identifies the events that follow a collision between two ships in the open sea, and, second, it evaluates the probabilities of these events, concluding by determining the severity of a collision. The risk framework is developed with the use of Bayesian Belief Networks and utilizes a set of analytical methods for the estimation of the risk model parameters. The model can be run with the GeNIe software package. Finally, a case study is presented in which the risk framework developed here is applied to a maritime transportation system operating in the Gulf of Finland (GoF). The results obtained are compared to historical data and available models in which a RoPax was involved in a collision, and good agreement with the available records is found.
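The kind of event chain described after a collision can be illustrated with a small discrete Bayesian network. The sketch below is only a schematic analogue built with the pgmpy library (hypothetical nodes and probabilities), not the authors' GeNIe model.

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Hypothetical post-collision event chain: hull breach -> flooding -> capsize risk.
model = BayesianNetwork([("HullBreach", "Flooding"), ("Flooding", "CapsizeRisk")])

cpd_breach = TabularCPD("HullBreach", 2, [[0.7], [0.3]])          # P(no), P(yes)
cpd_flood = TabularCPD("Flooding", 2,
                       [[0.95, 0.4],    # P(no flooding | breach = no, yes)
                        [0.05, 0.6]],
                       evidence=["HullBreach"], evidence_card=[2])
cpd_capsize = TabularCPD("CapsizeRisk", 2,
                         [[0.99, 0.7],
                          [0.01, 0.3]],
                         evidence=["Flooding"], evidence_card=[2])
model.add_cpds(cpd_breach, cpd_flood, cpd_capsize)
assert model.check_model()

infer = VariableElimination(model)
print(infer.query(["CapsizeRisk"], evidence={"HullBreach": 1}))
```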

Relevance:

10.00%

Publisher:

Abstract:

Probabilistic climate data have become available for the first time through the UK Climate Projections 2009, so that the risk of changes in tree growth can be quantified. We assess the drought risk spatially and temporally using drought probabilities and tree species vulnerabilities across Britain. We assessed the drought impact on the potential yield class of three major tree species (Picea sitchensis, Pinus sylvestris, and Quercus robur), which presently cover around 59% (400,700 ha) of state-managed forests, across lowland and upland sites. Here we show that drought impacts result mostly in reduced tree growth over the next 80 years under the B1, A1B and A1FI IPCC emissions scenarios. We found a maximum reduction of 94% but also a maximum increase of 56% in potential stand yield class in the 2080s relative to the baseline climate (1961-1990). Furthermore, potential production over the national forest estate for all three species in the 2080s may decrease due to drought by 42% in the lowlands and 32% in the uplands in comparison to the baseline climate. Our results reveal that potential tree growth and forest production on the national forest estate in Britain are likely to decline, and indicate where and when adaptation measures are required. Moreover, this paper demonstrates the value of probabilistic climate projections for an important economic and environmental sector.
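Probabilistic projections let such impacts be expressed as the probability of exceeding a threshold rather than as a single deterministic value. A minimal illustration with synthetic numbers (not the study's data or the UKCP09 sampler):

```python
import numpy as np

# Hypothetical ensemble of projected yield-class changes (%) for one site in the 2080s,
# drawn from an arbitrary distribution purely for illustration.
rng = np.random.default_rng(1)
yield_change = rng.normal(loc=-20.0, scale=25.0, size=10_000)

# Probability that potential yield class falls by more than 30% relative to 1961-1990.
p_exceed = np.mean(yield_change < -30.0)
print(f"P(yield reduction > 30%) ~ {p_exceed:.2f}")
```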

Relevance:

10.00%

Publisher:

Abstract:

To deliver sample estimates provided with the necessary probability foundation to permit generalization from the sample data subset to the whole target population being sampled, probability sampling strategies are required to satisfy three necessary but not sufficient conditions: (i) all inclusion probabilities must be greater than zero in the target population to be sampled; if some sampling units have an inclusion probability of zero, then a map accuracy assessment does not represent the entire target region depicted in the map to be assessed; (ii) the inclusion probabilities must be (a) knowable for nonsampled units and (b) known for those units selected in the sample, since the inclusion probability determines the weight attached to each sampling unit in the accuracy estimation formulas; if the inclusion probabilities are unknown, so are the estimation weights. This original work presents a novel (to the best of these authors' knowledge, the first) probability sampling protocol for quality assessment and comparison of thematic maps generated from spaceborne/airborne Very High Resolution (VHR) images, where: (I) an original Categorical Variable Pair Similarity Index (CVPSI, proposed in two different formulations) is estimated as a fuzzy degree of match between a reference and a test semantic vocabulary, which may not coincide, and (II) both symbolic pixel-based thematic quality indicators (TQIs) and sub-symbolic object-based spatial quality indicators (SQIs) are estimated with a degree of uncertainty in measurement in compliance with the well-known Quality Assurance Framework for Earth Observation (QA4EO) guidelines. Like a decision tree, any protocol (guidelines for best practice) comprises a set of rules, equivalent to structural knowledge, and an order of presentation of the rule set, known as procedural knowledge. The combination of these two levels of knowledge makes an original protocol worth more than the sum of its parts. The several degrees of novelty of the proposed probability sampling protocol are highlighted in this paper, at the levels of understanding of both structural and procedural knowledge, in comparison with related multi-disciplinary works selected from the existing literature. In the experimental session the proposed protocol is tested for accuracy validation of preliminary classification maps automatically generated by the Satellite Image Automatic Mapper™ (SIAM™) software product from two WorldView-2 images and one QuickBird-2 image provided by DigitalGlobe for testing purposes. In these experiments, the collected TQIs and SQIs are statistically valid, statistically significant, consistent across maps and in agreement with theoretical expectations, visual (qualitative) evidence and the quantitative quality indexes of operativeness (OQIs) claimed for SIAM™ in related papers. As a subsidiary conclusion, the statistically consistent and statistically significant accuracy validation of the SIAM™ pre-classification maps proposed in this contribution, together with the OQIs claimed for SIAM™ in related works, makes the operational (automatic, accurate, near real-time, robust, scalable) SIAM™ software product eligible for opening up new inter-disciplinary research and market opportunities in accordance with the visionary goal of the Global Earth Observation System of Systems (GEOSS) initiative and the QA4EO international guidelines.
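Condition (ii) matters because the estimation weight of each unit is the inverse of its inclusion probability. A minimal Horvitz-Thompson-style sketch of a weighted overall-accuracy estimate (generic illustration with made-up numbers, not the protocol's actual estimator):

```python
import numpy as np

# For each sampled unit: 1 if the map label agrees with the reference label, else 0,
# together with the unit's (known) inclusion probability. Values are made up.
correct = np.array([1, 1, 0, 1, 0, 1, 1, 1])
inclusion_prob = np.array([0.02, 0.02, 0.05, 0.05, 0.05, 0.10, 0.10, 0.10])

weights = 1.0 / inclusion_prob                 # each unit "represents" 1/pi population units
overall_accuracy = np.sum(weights * correct) / np.sum(weights)
print(f"Weighted overall accuracy ~ {overall_accuracy:.3f}")
```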

Relevance:

10.00%

Publisher:

Abstract:

The zip folder comprises a text file and a gzipped tar archive. 1) The text file contains individual genotype data for 90 SNPs, 9 microsatellites and the mitochondrial ND4 gene that were determined in deep-sea hydrothermal vent mussels from the Mid-Atlantic Ridge (genus Bathymodiolus). Mussel specimens are grouped according to the population (pop)/location from which they have been sampled (first column). The remaining columns contain the respective allele/haplotype codes for the different genetic loci (names in the header line). The data file is in CONVERT format and can be directly transformed into different input files for population genetic statistics. 2) The tar archive contains NetCDF files with larval dispersal probabilities for simulated annual larval releases between 1998 and 2007. For each simulated vent location (Menez Gwen, Lucky Strike, Rainbow, Vent 1-10) two NetCDF files are given, one for an assumed pelagic larval duration of 1 year and the other one for an assumed pelagic larval duration of 6 months (6m).
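A minimal sketch of how the NetCDF dispersal files could be inspected with xarray; the file name and the internal variable and dimension names are not specified in the description above, so they are placeholders and the code only opens a file and lists its contents.

```python
import xarray as xr

# Placeholder path; the actual file names in the tar archive may differ.
ds = xr.open_dataset("larval_dispersal_MenezGwen_1y.nc")
print(ds)            # lists dimensions, coordinates and data variables
print(ds.data_vars)  # dispersal-probability variables, whatever they are named
```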

Relevance:

10.00%

Publisher:

Abstract:

Nonparametric belief propagation (NBP) is a well-known particle-based method for distributed inference in wireless networks. NBP has a large number of applications, including cooperative localization. However, in loopy networks NBP suffers from the same problems as standard BP, such as overconfident beliefs and possible non-convergence. Tree-reweighted NBP (TRW-NBP) can mitigate these problems, but does not easily lead to a distributed implementation due to the non-local nature of the required so-called edge appearance probabilities. In this paper, we propose a variation of TRW-NBP suitable for cooperative localization in wireless networks. Our algorithm uses a fixed edge appearance probability for every edge and can outperform standard NBP in dense wireless networks.
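For reference, in discrete tree-reweighted BP the edge appearance probabilities enter the message update roughly as below, here written with the single uniform value ρ used for every edge as proposed above; this is the standard TRW form shown schematically, not the particle-based NBP implementation.

```latex
m_{j \to i}(x_i) \;\propto\; \sum_{x_j}
  \psi_{ij}(x_i, x_j)^{1/\rho}\, \psi_j(x_j)\,
  \frac{\prod_{k \in N(j)\setminus i} m_{k \to j}(x_j)^{\rho}}
       {m_{i \to j}(x_j)^{\,1-\rho}}
```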

Relevance:

10.00%

Publisher:

Abstract:

In this paper we propose a new method for the automatic detection and tracking of road traffic signs using an on-board single camera. This method aims to increase the reliability of the detections so that it can boost the performance of any traffic sign recognition scheme. The proposed approach exploits a combination of different features, such as color, appearance, and tracking information. This information is introduced into a recursive Bayesian decision framework, in which prior probabilities are dynamically adapted to tracking results. This decision scheme obtains a number of candidate regions in the image according to their HS (hue-saturation) values. Finally, a Kalman filter with adaptive noise tuning provides the required temporal and spatial coherence to the estimates. Results have shown that the proposed method achieves high detection rates in challenging scenarios, including illumination changes, rapid motion and significant perspective distortion.
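A minimal sketch of hue-saturation-based candidate extraction of the kind mentioned above, using OpenCV with illustrative thresholds for red signs; the file name and thresholds are placeholders, and this is not the paper's full Bayesian/Kalman pipeline.

```python
import cv2

frame = cv2.imread("frame.jpg")                     # placeholder input image
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

# Illustrative red-hue ranges with a minimum saturation; tune per camera and lighting.
mask = cv2.inRange(hsv, (0, 100, 50), (10, 255, 255)) | \
       cv2.inRange(hsv, (170, 100, 50), (180, 255, 255))

contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
candidates = [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 100]
print(f"{len(candidates)} candidate regions")
```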

Relevance:

10.00%

Publisher:

Abstract:

Hail is a serious concern for agriculture on the Iberian Peninsula. Hailstorms affect crop yield and/or quality to a degree that depends on the crop species and the phenological timing. In Europe, Spain is one of the countries that experience relatively high agricultural losses related to hailstorms. It is therefore of high interest to study models that can support calculations of the probabilities of economic losses due to hail damage and of the tendency of such losses over time. Studies carried out in France and the Netherlands show that the summer mean temperature was highly correlated with a yearly hail severity index developed from hail-related parameters obtained for insurance purposes, whereas other studies in the USA point out that a highly significant correlation between the two cannot be found because of high climatic variability. The aim of this work is to test the correlation between average minimum temperatures and hail damage intensity over the Spanish Iberian Peninsula. With this purpose, correlation analyses on both variables were performed for the 47 Spanish provinces (individually and as a single set) and for all crops together and four individual crops: grapes, wheat, barley and winter grains. Suitable crop insurance data are available from 1981 until 2007, and temperature data were obtained for this period. This study does not confirm the results previously obtained for France and the Netherlands that relate observed hail damage to the average minimum temperature. The reason for this difference and the nature of the cases observed are discussed.
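The core analysis is a simple correlation test between a yearly hail-damage index and an average temperature series. A minimal sketch with made-up numbers (scipy); the real study uses the 1981-2007 insurance series per province and crop.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical yearly series (the 1981-2007 period would give 27 values); numbers are made up.
mean_min_temp = np.array([8.1, 8.4, 7.9, 8.6, 8.8, 8.2, 9.0, 8.5])
hail_damage_index = np.array([0.12, 0.18, 0.10, 0.22, 0.25, 0.15, 0.28, 0.20])

r, p_value = pearsonr(mean_min_temp, hail_damage_index)
print(f"r = {r:.2f}, p = {p_value:.3f}")
```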

Relevance:

10.00%

Publisher:

Abstract:

In this paper, the presynaptic rule, a classical rule for Hebbian learning, is revisited. It is shown that the presynaptic rule exhibits relevant synaptic properties such as synaptic directionality and long-term potentiation (LTP) threshold metaplasticity. With slight modifications, the presynaptic model also exhibits metaplasticity of the long-term depression threshold, making it consistent with Artola, Brocher and Singer's (ABS) influential model. Two asymptotically equivalent versions of the presynaptic rule were adopted for this analysis: the first uses an incremental equation while the second uses conditional probabilities. Despite their simplicity, both types of presynaptic rules exhibit sophisticated biological properties, especially the probabilistic version.
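As an illustration of why an incremental formulation and a conditional-probability formulation can be asymptotically equivalent, the toy sketch below uses a generic presynaptically gated update whose weight converges to P(post | pre) estimated from the same data. It is a simplified analogue for intuition, not the exact rule analysed in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
pre = rng.integers(0, 2, 50_000)                                  # presynaptic activity (0/1)
post = (rng.random(50_000) < np.where(pre == 1, 0.7, 0.2)).astype(int)

# Incremental form: update only when the presynaptic neuron fires.
w, lr = 0.5, 0.01
for x, y in zip(pre, post):
    if x == 1:
        w += lr * (y - w)

# Conditional-probability form estimated directly from the same data.
p_post_given_pre = post[pre == 1].mean()
print(f"incremental weight ~ {w:.3f}, P(post|pre) ~ {p_post_given_pre:.3f}")
```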

Relevance:

10.00%

Publisher:

Abstract:

This article proposes a method for calibrating discontinuity sets in rock masses. We present a novel approach for the calibration of stochastic discontinuity network parameters based on genetic algorithms (GAs). To validate the approach, examples of application of the method to cases with known parameters of the original Poisson discontinuity network are presented. Parameters of the model are encoded as chromosomes using a binary representation, and such chromosomes evolve as successive generations of a randomly generated initial population, subjected to GA operations of selection, crossover and mutation. The back-calculated parameters are employed to assess the inference capabilities of the model using different objective functions with different probabilities of crossover and mutation. Results show that the predictive capabilities of GAs significantly depend on the type of objective function considered; they also show that the calibration capabilities of the genetic algorithm can be acceptable for practical engineering applications, since in most cases they can be expected to provide parameter estimates with relatively small errors for those parameters of the network (such as intensity and mean size of discontinuities) that have the strongest influence on many engineering applications.
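A minimal genetic-algorithm sketch of the kind of back-calculation described: a binary chromosome encodes a single parameter (here, a Poisson intensity), and selection, crossover and mutation evolve the population toward a synthetic target. This is a generic illustration under made-up settings, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(3)
TRUE_INTENSITY = 4.2                      # synthetic "field" value to recover
N_BITS, POP, GENS, P_CROSS, P_MUT = 16, 40, 100, 0.8, 0.02

def decode(bits):                          # map a bit string to an intensity in [0, 10]
    return int("".join(map(str, bits)), 2) / (2**N_BITS - 1) * 10.0

def fitness(bits):                         # objective: negative squared calibration error
    return -(decode(bits) - TRUE_INTENSITY) ** 2

pop = rng.integers(0, 2, (POP, N_BITS))
for _ in range(GENS):
    scores = np.array([fitness(ind) for ind in pop])
    # Tournament selection.
    parents = pop[[max(rng.integers(0, POP, 2), key=lambda i: scores[i]) for _ in range(POP)]]
    # One-point crossover.
    children = parents.copy()
    for i in range(0, POP - 1, 2):
        if rng.random() < P_CROSS:
            cut = rng.integers(1, N_BITS)
            children[i, cut:], children[i + 1, cut:] = parents[i + 1, cut:], parents[i, cut:]
    # Bit-flip mutation.
    flips = rng.random(children.shape) < P_MUT
    pop = np.where(flips, 1 - children, children)

best = max(pop, key=fitness)
print(f"calibrated intensity ~ {decode(best):.2f} (target {TRUE_INTENSITY})")
```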

Relevance:

10.00%

Publisher:

Abstract:

This paper focuses on the railway rolling stock circulation problem in rapid transit networks, in which frequencies are high and distances are relatively short. Although the distances are not very large, service times are high due to the large number of intermediate stops required to allow proper passenger flow. The main complicating issue is the fact that the available capacity at depot stations is very low, and both capacity and rolling stock are shared between different train lines. This forces the introduction of empty train movements and rotation maneuvers to ensure sufficient station capacity and rolling stock availability. However, these shunting operations may sometimes be difficult to perform and can easily malfunction, causing localized incidents that could propagate throughout the entire network due to cascading effects. This type of operation is penalized with the goal of selectively avoiding it and mitigating its high malfunction probability. Critical trains, defined as train services that pass through stations with a large number of passengers arriving at the platform during rush hours, are also introduced. We illustrate our model using computational experiments drawn from RENFE (the main Spanish operator of suburban passenger trains) in Madrid, Spain. The results of the model, obtained in approximately 1 minute, have been received positively by RENFE planners.
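Penalizing empty movements and rotation maneuvers amounts to adding weighted terms to the objective of the underlying integer program. A toy, schematic sketch with PuLP (hypothetical sets, costs and penalty weights; not the actual RENFE model):

```python
from pulp import LpProblem, LpMinimize, LpVariable, lpSum

# Hypothetical candidate shunting operations on a toy instance.
empty_moves = ["em1", "em2", "em3"]
rotations = ["rot1", "rot2"]

x = LpVariable.dicts("empty_move", empty_moves, cat="Binary")
y = LpVariable.dicts("rotation", rotations, cat="Binary")

prob = LpProblem("rolling_stock_toy", LpMinimize)
# Made-up operating costs plus penalties that discourage risky shunting operations.
operating_cost = lpSum(5 * x[e] for e in empty_moves) + lpSum(3 * y[r] for r in rotations)
penalty = 20 * lpSum(x[e] for e in empty_moves) + 40 * lpSum(y[r] for r in rotations)
prob += operating_cost + penalty

# A stand-in coverage constraint so that the toy model is not trivially all-zero.
prob += lpSum(x[e] for e in empty_moves) + lpSum(y[r] for r in rotations) >= 2
prob.solve()
print({v.name: v.value() for v in prob.variables()})
```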

Relevance:

10.00%

Publisher:

Abstract:

The aim of this paper is to propose an integrated planning model to match the offered capacity and system frequencies to increased passenger demand and traffic congestion around urban and suburban areas. Railway capacity is usually studied in line planning; however, the planned frequencies are obtained without accounting for rolling stock flows through the rapid transit network. In order to give the problem more freedom to decide rolling stock flows, and therefore better adjust these flows to passenger demand, a new integrated model is proposed in which frequencies are readjusted. The railway timetable and rolling stock assignment are then also calculated, taking shunting operations into account. These operations may sometimes malfunction, causing localized incidents that could propagate throughout the entire network due to cascading effects. This type of operation is penalized with the goal of selectively avoiding it and mitigating its high malfunction probability. Swapping operations are also enabled by using homogeneous rolling stock material and by ensuring parking at strategic stations. We illustrate our model using computational experiments drawn from RENFE (the main Spanish operator of suburban passenger trains) in Madrid, Spain. The results show that a greater degree of robustness can be obtained through this integrated approach.

Relevance:

10.00%

Publisher:

Abstract:

Safety assessment of historic masonry structures is an open problem. The material is heterogeneous and anisotropic, the previous state of stress is hard to know, and the boundary conditions are uncertain. In the early 1950s it was proven that limit analysis is applicable to this kind of structure, and it has been considered a suitable tool since then. In cases where no sliding occurs, the application of the standard limit analysis theorems constitutes an excellent tool due to its simplicity and robustness: it is enough to find any equilibrium solution that satisfies the limit conditions of the material, in the certainty that its load will be equal to or less than the actual load at the onset of collapse, so knowledge of the actual stress state is not necessary. Furthermore, this load at the onset of collapse is unique (uniqueness theorem), and it can be obtained as the optimum of either of a pair of dual convex mathematical programs. However, if the mechanisms at the onset of collapse involve sliding, any solution must satisfy both the static and the kinematic constraints, as well as a special kind of disjunctive constraints linking them, which can be formulated as complementarity constraints. In the latter case the existence of a single solution is not guaranteed, so other ways of treating the uncertainty associated with this multiplicity must be sought. In recent years, research has focused on finding an absolute minimum below which collapse is impossible. This method is easy to formulate from a mathematical point of view, but computationally intractable, owing to the complementarity constraints y·z = 0, y ≥ 0, z ≥ 0, which are neither convex nor smooth. The resulting decision problem is NP-complete (non-deterministic polynomial-time complete) and the corresponding global optimization problem is NP-hard. Nevertheless, obtaining a solution (without guarantee of success) is an affordable problem. This thesis proposes to solve the problem through Successive Linear Programming, taking advantage of the special characteristics of the complementarity constraints, which written in bilinear form are y·z = 0, y ≥ 0, z ≥ 0, and of the fact that the complementarity error (in bilinear form) is an exact penalty function. But when it comes to finding the worst solution, the equivalent global optimization problem is intractable (NP-hard). Furthermore, as long as no maximum or minimum principle has been demonstrated, it is questionable whether the effort spent in approximating this minimum is justified. In Chapter 5, it is proposed to find the frequency distribution of the load factor for all possible onset-of-collapse solutions on a simple example. 
For this purpose, a Monte Carlo sampling of solutions is performed, using an exact polytope-computation method as a contrast. The ultimate goal is to determine to what extent the search for the global minimum is justified, and to propose an alternative approach to safety assessment based on probabilities. The frequency distributions of the load factors obtained for the case study show that both the maximum and the minimum load factors are very infrequent, the more so the more perfect and continuous the contact is. The results confirm the interest of developing new probabilistic methods. In Chapter 6, such a method is proposed, based on multiple solutions obtained from random starting points and qualified by means of Order Statistics; the purpose is to determine the probability of onset of collapse for each solution. The method is applied (following the reduction of expectations proposed by Ordinal Optimization) to obtain a solution that lies within a given percentage of the worst ones. Finally, in Chapter 7, hybrid methods incorporating metaheuristics are proposed for the cases in which the search for the global minimum is justified.
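Schematically, the onset-of-collapse problem with sliding can be written as a load-factor optimization with complementarity constraints, and the bilinear complementarity error is the exact penalty referred to above. The notation below is generic rather than the thesis' exact formulation: λ is the load factor, and y, z are the paired non-negative variables linked by the complementarity condition.

```latex
\min_{\lambda,\, y,\, z} \ \lambda
\quad \text{s.t.} \quad
\text{equilibrium, yield and kinematic constraints},
\qquad y \ge 0,\quad z \ge 0,\quad y^{\top} z = 0 ,
```

where the sequential linearization scheme handles the last condition by driving the exact penalty term y'z toward zero.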