999 results for Gumbel Extreme Value Autoregressive


Relevance: 100.00%

Abstract:

Quadratic assignment problems (QAPs) are commonly solved by heuristic methods, where the optimum is sought iteratively. Heuristics are known to provide good solutions, but the quality of those solutions, i.e., a confidence interval for the solution, is unknown. This paper uses statistical optimum estimation techniques (SOETs) to assess the quality of genetic algorithm solutions for QAPs. We examine how different SOETs perform with respect to bias, coverage rate and interval length, and then compare the SOET lower bound with deterministic ones. The commonly used deterministic bounds are confined to only a few algorithms. We show that Jackknife estimators perform better than Weibull estimators, and that when the number of heuristic solutions is as large as 100, higher-order Jackknife estimators perform better than lower-order ones. Compared with the deterministic bounds, the SOET lower bound performs significantly better than most deterministic lower bounds and is comparable with the best deterministic ones.
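A minimal sketch, on synthetic data, of the two point-estimator families the abstract compares: a three-parameter Weibull fit whose location parameter stands in for the unknown optimum, and order-k Jackknife extrapolation from the smallest order statistics. The function names and data are placeholders, and the Jackknife coefficients are one standard form from the SOET literature; the paper's exact estimators may differ.

```python
import numpy as np
from math import comb
from scipy import stats

def weibull_min_estimate(values):
    """One simple Weibull-based estimate of the optimum: the location
    parameter of a 3-parameter Weibull fitted by maximum likelihood."""
    shape, loc, scale = stats.weibull_min.fit(values)
    return loc

def jackknife_min(values, order=2):
    """Order-k Jackknife estimate of the minimum from the k+1 smallest
    solution values; order 1 reduces to 2*x(1) - x(2)."""
    x = np.sort(np.asarray(values, dtype=float))[: order + 1]
    return sum((-1) ** i * comb(order + 1, i + 1) * x[i]
               for i in range(order + 1))

# synthetic "heuristic" objective values with a true optimum of 100
rng = np.random.default_rng(0)
sample = 100.0 + 5.0 * rng.weibull(2.0, size=100)
print(weibull_min_estimate(sample), jackknife_min(sample, order=3))
```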

Relevance: 100.00%

Abstract:

Solutions to combinatorial optimization problems, such as problems of locating facilities, frequently rely on heuristics to minimize the objective function. The optimum is sought iteratively, and a criterion is needed to decide when the procedure (almost) attains it. Pre-setting the number of iterations dominates in OR applications, which implies that the quality of the solution cannot be ascertained. A small, almost dormant, branch of the literature suggests using statistical principles to estimate the minimum and its bounds, as a tool for deciding when to stop and for evaluating the quality of the solution. In this paper we examine the functioning of statistical bounds obtained from four different estimators, using simulated annealing on p-median test problems taken from Beasley's OR-library. We find the Weibull estimator and the 2nd-order Jackknife estimator preferable, and the required sample size to be about 10, much less than the current recommendation. However, reliable statistical bounds are found to depend critically on a sample of high-quality heuristic solutions, and we give a simple statistic useful for checking that quality. We end the paper with an illustration of using statistical bounds in a problem of locating some 70 distribution centers of the Swedish Post in one Swedish region.
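One generic way to turn such a point estimate into a statistical bound is a percentile bootstrap around the chosen estimator; the paper's Weibull- and Jackknife-based bounds may be constructed differently, so treat this as an illustrative sketch. It reuses a 2nd-order Jackknife estimator and a sample of about 10 heuristic runs, the size the abstract reports as sufficient; the objective values are made up.

```python
import numpy as np
from math import comb

def jk_min(values, order=2):
    # order-k Jackknife extrapolation from the k+1 smallest values
    x = np.sort(np.asarray(values, dtype=float))[: order + 1]
    return sum((-1) ** i * comb(order + 1, i + 1) * x[i]
               for i in range(order + 1))

def bootstrap_bounds(values, estimator=jk_min, n_boot=2000, alpha=0.05, seed=0):
    """Percentile-bootstrap interval for the estimated minimum."""
    rng = np.random.default_rng(seed)
    values = np.asarray(values, dtype=float)
    ests = np.array([estimator(rng.choice(values, size=values.size))
                     for _ in range(n_boot)])
    return np.quantile(ests, [alpha / 2, 1 - alpha / 2])

# ten simulated-annealing objective values for one instance (illustrative)
runs = np.array([5820., 5847., 5811., 5903., 5832.,
                 5819., 5858., 5841., 5825., 5876.])
print(bootstrap_bounds(runs))
```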

Relevance: 100.00%

Abstract:

Solutions to combinatorial optimization problems, such as p-median problems of locating facilities, frequently rely on heuristics to minimize the objective function. The minimum is sought iteratively, and a criterion is needed to decide when the procedure (almost) attains it. However, pre-setting the number of iterations dominates in OR applications, which implies that the quality of the solution cannot be ascertained. A small branch of the literature suggests using statistical principles to estimate the minimum and to use the estimate either for stopping or for evaluating the quality of the solution. In this paper we take test problems from Beasley's OR-library and apply simulated annealing to these p-median problems. We do this for the purpose of comparing suggested methods of minimum estimation and, eventually, providing a recommendation for practitioners. The paper ends with an illustration: the problem of locating some 70 distribution centers of the Swedish Post in a region.
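For concreteness, a compact simulated-annealing sketch for the p-median problem on a toy instance (random points instead of an OR-library file). The move is the usual swap of one open facility for a closed one with Metropolis acceptance; all parameter values here are illustrative, not the settings used in the paper.

```python
import numpy as np

def pmedian_cost(dist, open_fac):
    # each client is served by its nearest open facility
    return dist[:, open_fac].min(axis=1).sum()

def sa_pmedian(dist, p, n_iter=20000, t0=1.0, cooling=0.9995, seed=0):
    """Simulated annealing with a swap neighbourhood: close one open
    facility, open one closed one, accept by the Metropolis rule."""
    rng = np.random.default_rng(seed)
    n = dist.shape[0]
    current = rng.choice(n, size=p, replace=False)
    cost = pmedian_cost(dist, current)
    best, best_cost, t = current.copy(), cost, t0
    for _ in range(n_iter):
        cand = current.copy()
        cand[rng.integers(p)] = rng.choice(np.setdiff1d(np.arange(n), current))
        c = pmedian_cost(dist, cand)
        if c < cost or rng.random() < np.exp((cost - c) / t):
            current, cost = cand, c
            if cost < best_cost:
                best, best_cost = current.copy(), cost
        t *= cooling
    return best, best_cost

# toy instance: 60 random demand points that double as candidate sites
pts = np.random.default_rng(1).random((60, 2))
dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
print(sa_pmedian(dist, p=6)[1])
```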

Relevance: 100.00%

Abstract:

This paper proposes a spatial-temporal downscaling approach to the construction of intensity-duration-frequency (IDF) relations at a local site in the context of climate change and variability. More specifically, the proposed approach combines a spatial downscaling method, which links large-scale climate variables given by General Circulation Model (GCM) simulations with daily extreme precipitation at a site, with a temporal downscaling procedure, which describes the relationship between daily and sub-daily extreme precipitation based on the scaling General Extreme Value (GEV) distribution. The feasibility and accuracy of the suggested method were assessed using rainfall data available at eight stations in Quebec (Canada) for the 1961-2000 period and climate simulations under four different climate change scenarios provided by the Canadian (CGCM3) and UK (HadCM3) GCMs. Results of this application indicate that it is feasible to link sub-daily extreme rainfalls at a local site with large-scale GCM-based daily climate predictors in order to construct IDF relations for present (1961-1990) and future (2020s, 2050s, and 2080s) periods under different climate change scenarios. In addition, annual maximum rainfalls downscaled from the HadCM3 displayed a smaller change in the future, while values estimated from the CGCM3 indicated a large increasing trend for future periods; this demonstrates the high uncertainty in climate simulations provided by different GCMs. In summary, the proposed spatial-temporal downscaling method provides an essential tool for estimating the extreme rainfalls required for various climate-related impact assessment studies in a given region.
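The temporal-downscaling step can be sketched with scipy's GEV: fit annual daily maxima, then scale quantiles to sub-daily durations under a simple-scaling assumption. The data, the scaling exponent eta and the base duration below are placeholders; in the paper eta would be estimated from the station's rainfall moments, and the spatial link to GCM predictors is not shown.

```python
from scipy import stats

# annual maxima of daily rainfall (mm) -- synthetic stand-in for station data
daily_max = stats.genextreme.rvs(c=-0.1, loc=50.0, scale=12.0, size=40,
                                 random_state=0)

# fit the GEV to the daily annual maxima (scipy's c is the negated shape)
c, loc, scale = stats.genextreme.fit(daily_max)

def idf_depth(return_period_y, duration_h, eta=0.7, base_h=24.0):
    """T-year rainfall depth at a sub-daily duration under simple scaling:
    quantiles obey q_d = (d / base)**eta * q_base, with eta assumed here."""
    q_base = stats.genextreme.ppf(1.0 - 1.0 / return_period_y, c,
                                  loc=loc, scale=scale)
    return (duration_h / base_h) ** eta * q_base

print(idf_depth(return_period_y=100, duration_h=1))  # 100-year, 1-hour depth
```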

Relevance: 100.00%

Abstract:

The Competitive Strategy literature predicts three different mechanisms of performance generation, distinguishing among firms that have competitive advantage, firms that have competitive disadvantage, and firms that have neither. Nonetheless, previous works in the field have fitted a single normal distribution to model firm performance. Here, we develop a new approach that distinguishes among performance-generating mechanisms and allows the identification of firms with competitive advantage or disadvantage. Theorizing on the positive feedback loops through which firms with competitive advantage gain easier access to new resources, we propose a distribution that we believe data on firm performance should follow. We illustrate our model by assessing its fit to data on firm performance, addressing its theoretical implications, and comparing it to previous works.
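A minimal illustration of the contrast the abstract draws: a single normal versus a model that separates three performance-generating regimes. A three-component Gaussian mixture is used here purely as a stand-in; the paper proposes its own, feedback-motivated distribution, and the data below are synthetic.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# synthetic ROA-like performance: disadvantage, parity, advantage regimes
rng = np.random.default_rng(0)
perf = np.concatenate([
    rng.normal(-0.05, 0.03, 150),   # competitive disadvantage
    rng.normal(0.03, 0.02, 600),    # neither (parity)
    rng.normal(0.15, 0.05, 250),    # competitive advantage
]).reshape(-1, 1)

one = GaussianMixture(n_components=1, random_state=0).fit(perf)
three = GaussianMixture(n_components=3, random_state=0).fit(perf)
print("BIC, single normal :", one.bic(perf))   # lower BIC = better fit
print("BIC, three regimes :", three.bic(perf))
regime = three.predict(perf)                    # per-firm regime assignment
```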

Relevance: 100.00%

Abstract:

Outliers are observations that appear inconsistent with the others. Also called atypical, extreme or aberrant values, these inconsistencies may be caused by policy changes or economic crises, unexpected cold or heat waves, or measurement and typing errors, among other causes. Outliers are not necessarily incorrect values, but when they stem from measurement or typing errors they can distort the results of an analysis and lead the researcher to mistaken conclusions. The goal of this work is to study and compare different methods for detecting abnormalities in price series of the Consumer Price Index (IPC), computed by the Brazilian Institute of Economics (IBRE) of the Getulio Vargas Foundation (FGV). The IPC measures the price variation of a fixed basket of goods and services that make up the habitual expenditures of households with income between 1 and 33 monthly minimum wages, and it is mainly used as a reference index for assessing consumer purchasing power. Besides the method currently used at IBRE by the price analysts, the methods considered in this study are: variations of the IBRE Method, the Boxplot Method, the SIQR Boxplot Method, the Adjusted Boxplot Method, the Resistant Fences Method, the Quartile Method, the Modified Quartile Method, the Median Absolute Deviation Method and Tukey's Algorithm. These methods were applied to data from the municipalities of Rio de Janeiro and São Paulo. To analyze the performance of each method, the true extreme values must be known in advance. Therefore, in this work, the analysis was carried out assuming that the prices discarded or altered by the analysts during the editing process are the true outliers. The IBRE Method is highly correlated with the prices altered or discarded by the analysts, so the assumption that these prices are the true extreme values may influence the results, favoring that method over the others. Nevertheless, this assumption makes it possible to compute two measures by which the methods are evaluated. The first is the method's hit rate, which reports the proportion of true outliers detected. The second is the number of false positives produced by the method, which reports how many values had to be flagged before a true outlier was detected. The larger the hit rate and the smaller the number of false positives, the better the method's performance. This allowed building a ranking of the methods' performance and identifying the best among those analyzed. For the municipality of Rio de Janeiro, some variations of the IBRE Method performed as well as or better than the original method, while for São Paulo the IBRE Method performed best. Future work is expected to test the methods on simulated data or on benchmark data sets widely used in the literature, so that the assumption that the prices discarded or altered by the analysts are the true outliers does not interfere with the results.
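Two of the generic rules compared in the study can be sketched compactly; the IBRE Method and its variants are specific to the institute and are not reproduced here. A minimal Python sketch of the classic boxplot (Tukey) rule and the median-absolute-deviation rule, on made-up price relatives:

```python
import numpy as np

def tukey_fences(x, k=1.5):
    """Classic boxplot rule: flag points outside [Q1 - k*IQR, Q3 + k*IQR]."""
    q1, q3 = np.percentile(x, [25, 75])
    iqr = q3 - q1
    return (x < q1 - k * iqr) | (x > q3 + k * iqr)

def mad_rule(x, k=3.0):
    """MAD rule with the usual normal-consistency constant 1.4826:
    flag points more than k robust SDs away from the median."""
    med = np.median(x)
    mad = 1.4826 * np.median(np.abs(x - med))
    return np.abs(x - med) > k * mad

# monthly price relatives (current/previous price) with two typos injected
prices = np.array([1.01, 0.99, 1.02, 1.00, 1.03, 0.98, 1.01, 5.10, 1.02, 0.10])
print(tukey_fences(prices))
print(mad_rule(prices))
```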

Relevance: 100.00%

Abstract:

The goal of this dissertation is to quantify the tail risk premium embedded in hedge funds' returns. Tail risk is the probability of extremely large losses. Although such losses are rare events, asset pricing theory suggests that investors demand compensation for holding assets sensitive to extreme market downturns. By definition, such events have a small likelihood of being represented in the sample, which poses a challenge to estimating the effects of tail risk by means of traditional approaches such as VaR. The results show that it is not sufficient to account for the tail risk stemming from equity markets. The active portfolio management employed by hedge funds demands a specific measure to estimate and control tail risk. Our proposed factor fills that void inasmuch as it presents explanatory power over both the time series and the cross-section of funds' returns.
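The point of departure is easy to make concrete: historical VaR and expected shortfall, the traditional tail measures the dissertation argues are insufficient for actively managed funds. A sketch on synthetic fat-tailed returns (the tail-risk factor itself is the dissertation's contribution and is not reconstructed here):

```python
import numpy as np

def var_es(returns, alpha=0.95):
    """Historical value-at-risk and expected shortfall at level alpha.
    Losses are the negated returns; ES averages the losses beyond VaR."""
    losses = -np.asarray(returns, dtype=float)
    var = np.quantile(losses, alpha)
    es = losses[losses >= var].mean()
    return var, es

# synthetic fat-tailed daily returns (Student-t), a stand-in for fund data
rets = np.random.default_rng(0).standard_t(df=3, size=2500) * 0.01
print(var_es(rets))
```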

Relevance: 100.00%

Abstract:

Present-day weather forecast models usually cannot provide realistic descriptions of local and particularly extreme weather conditions. However, for lead times of a few days, they provide reliable forecasts of the atmospheric circulation that encompasses the subscale processes leading to extremes. Hence, forecasts of extreme events can only be achieved through a combination of dynamical and statistical analysis methods, where a stable and significant statistical model, based on prior physical reasoning, establishes a statistical-dynamical relation between the local extremes and the large-scale circulation. Here we present the development and application of such a statistical model calibration on the basis of extreme value theory, in order to derive probabilistic forecasts of extreme local temperature. The downscaling is applied to the NCEP/NCAR reanalysis in order to derive estimates of daily temperature at weather stations in the northeastern region of Brazil.
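The extreme-value half of such a statistical-dynamical model can be sketched as follows: fit a GEV to station temperature maxima and convert it into an exceedance probability for a chosen extreme threshold. The circulation-based calibration against NCEP/NCAR predictors is the paper's contribution and is not shown; all numbers below are synthetic.

```python
from scipy import stats

# seasonal maxima of daily max temperature (deg C) at one station -- synthetic
tmax = stats.genextreme.rvs(c=0.1, loc=36.0, scale=1.5, size=45,
                            random_state=1)

c, loc, scale = stats.genextreme.fit(tmax)    # GEV calibration

threshold = 40.0                              # an "extreme" local temperature
p = stats.genextreme.sf(threshold, c, loc=loc, scale=scale)
print(f"P(seasonal max > {threshold} C) = {p:.3f}")
```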

Relevance: 100.00%

Abstract:

Graduate Program in Applied and Computational Mathematics (Pós-graduação em Matemática Aplicada e Computacional) - FCT

Relevance: 100.00%

Abstract:

Analysis of the effect of internal porosity on the fatigue limit of particular cast-iron castings. The study was developed using the probabilistic technique of extreme value analysis.

Relevance: 100.00%

Abstract:

The theory of dynamical systems studies the time evolution of physical and other systems. Although an initial condition cannot be assigned exactly (which means the dynamics of the system cannot be fully controlled), the tools of ergodic theory and of the study of the evolution of initial probability densities of the system's points (the Perron-Frobenius operator) allow us to compute the probability that a certain event E (which we define as a rare event) occurs, and in particular the probability that the first time E occurs is n. We studied the cases in which the event E is defined by a sequence of random variables (first in the i.i.d. case, then for Markov chains) and by a small region of phase space through which the points of the system can escape (i.e., a hole). From the mathematical studies of open systems carried out by Keller and Liverani, an explicit formula for the escape rate in terms of the size of the hole is obtained. We then applied this method to the case in which the event E is defined by the points of the space where certain observables take values greater than or equal to a given real number a, in order to derive the asymptotic behavior in n of the probability that E has not occurred by time n, to first order, as a tends to infinity.
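For orientation, the flavor of the Keller-Liverani first-order result referred to above, stated here from memory for a piecewise expanding interval map T with absolutely continuous invariant measure μ (an assumption of this summary; see their paper for the precise hypotheses):

```latex
% Escape rate \lambda(\varepsilon) through a small hole H_\varepsilon
% around a point z, to first order in the hole size (informal statement):
\lim_{\varepsilon \to 0} \frac{\lambda(\varepsilon)}{\mu(H_\varepsilon)} =
\begin{cases}
  1, & z \text{ not periodic}, \\[4pt]
  1 - \left|(T^{p})'(z)\right|^{-1}, & z \text{ periodic with period } p.
\end{cases}
```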

Relevance: 100.00%

Abstract:

In recent decades, extremely hazardous windstorms have caused enormous losses to buildings, infrastructure and forests in Switzerland. This has increased societal and scientific interest in the intensity and frequency of historical high-impact storms. However, high-resolution wind data and damage statistics mostly span recent decades only. For this study, we collected quantitative (e.g., volumes of windfall timber, losses relating to buildings) and descriptive (e.g., forestry or insurance reports) information on the impact of historical windstorms. To define windstorm severity, normalized and declustered quantitative data were processed by extreme value statistics. Descriptive information was classified using a conceptual guideline. Validation with independent damage information, as well as comparison with wind measurements and a reanalysis, indicates that the most hazardous winter storms are captured, while too few moderate windstorms are detected. Strong storms in the wind measurements and reanalysis are thus added to the catalog. The final catalog encompasses approximately 240 high-impact windstorms in Switzerland since 1859. It features three robust severity classes and contains eight extreme windstorms. Evidence of high winter storm activity in the early and late 20th century compared to the mid-20th century in both damage and wind data indicates a co-variability of hazard and related damage on decadal timescales.
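The declustering step mentioned above can be illustrated with a standard runs scheme: consecutive exceedances of a damage threshold separated by fewer than a fixed number of quiet days are merged into one storm, and only each cluster maximum is kept for the extreme-value fit. Threshold, run length and data below are illustrative, not the study's actual choices.

```python
import numpy as np

def decluster(series, threshold, run_length=3):
    """Runs declustering: merge exceedances separated by fewer than
    `run_length` sub-threshold observations; keep each cluster maximum."""
    peaks, cluster, gap = [], [], run_length
    for v in np.asarray(series, dtype=float):
        if v > threshold:
            cluster.append(v)
            gap = 0
        else:
            gap += 1
            if cluster and gap >= run_length:
                peaks.append(max(cluster))
                cluster = []
    if cluster:                       # flush a cluster open at the end
        peaks.append(max(cluster))
    return np.array(peaks)

# synthetic daily normalized-loss index; threshold at its 98th percentile
loss = np.random.default_rng(0).gamma(0.3, 1.0, size=5000)
events = decluster(loss, threshold=np.quantile(loss, 0.98))
print(events.size, "storm events kept for the extreme-value fit")
```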

Relevance: 100.00%

Abstract:

From November 2004 to December 2007, size-segregated aerosol samples were collected year-round at Dome C (East Antarctica) using PM10 and PM2.5 samplers and multi-stage impactors. The data set obtained from the chemical analysis provides the longest and most time-resolved record of sea spray aerosol (sea salt Na+) in inner Antarctica. Sea spray showed a sharp seasonal pattern: the highest values, measured in winter (Apr-Nov), were about ten times larger than those in summer (Dec-Mar). For the first time, a seasonal pattern in the size distribution was also shown: in winter, sea spray particles are mainly submicrometric, while their summer size mode is around 1-2 µm. Meteorological analysis on a synoptic scale allowed the definition of the atmospheric conditions that carry sea spray to Dome C. An extreme-value approach, along with specific environment-based criteria, was taken to yield stronger fingerprints linking atmospheric circulation (means and anomalies) to extreme sea spray events. Air mass back-trajectory analyses for some high sea spray events allowed the identification of two major air mass pathways, reflecting different size distributions: micrometric fractions for transport from the closer Indian-Pacific sector, and sub-micrometric particles for longer trajectories over the Antarctic Plateau. The seasonal pattern of the SO4^2-/Na+ ratio enabled the identification of a few events depleted in sulphate with respect to seawater composition. By using the methanesulphonic acid (MSA) profile to evaluate the biogenic SO4^2- contribution, a more reliable sea salt sulphate was calculated. In this way, a few events (mainly in April and September) were identified as probably originating from the "frost flower" source. A comparison with daily-collected superficial snow samples revealed a temporal shift between the aerosol and snow sea spray trends. This feature could imply more complex deposition processes of sea spray, involving significant contributions of wet and diamond-dust deposition, but further work has to be carried out to rule out the effect of wind redistribution and to achieve greater statistical significance.

Relevance: 100.00%

Abstract:

Seventeen sediment samples of Albian-Cenomanian to early Pliocene age from DSDP Hole 530A in the Angola Basin and six sediment samples of early Pliocene to late Pleistocene age from the Walvis Ridge were investigated by organic geochemical methods, including organic carbon determination, Rock-Eval pyrolysis, gas chromatography and combined gas chromatography/mass spectrometry of extractable hydrocarbons, and kerogen microscopy. The organic matter in all samples is strongly influenced by a terrigenous component from the nearby continent. The amount of marine organic matter present usually increases with the total organic carbon content, which reaches an extreme value of more than 10% in a Cenomanian black shale from Hole 530A. At Site 530 the extent of preservation of organic matter in the deep sea sediments is related to mass transport down the continental slope, whereas the high organic carbon contents in the sediments from Site 532 reflect both high bioproductivity in the Benguela upwelling regime and considerable supply of terrigenous organic matter. The maturation level of the organic matter is low in all samples.

Relevance: 100.00%

Abstract:

Most of the prestressed concrete structures built in the last 50 years have demonstrated excellent durability when constructed in accordance with the rules of good design, detailing and execution. This is due in large part to the respect commanded by stress corrosion cracking, the feared phenomenon typical of high-strength prestressing steel wires. Less attention, however, has been paid to the stress corrosion cracking susceptibility of anchorages for steel tendons for prestressing concrete, probably because few catastrophic failures have been reported. Damage tolerance and fracture mechanics concepts have recently started to be incorporated into some design and calculation rules for metallic structures in civil engineering; however, they are still far from being assimilated and routinely used by civil engineers in their calculations when the occasion requires it. This limited knowledge of the damage tolerance basis can lead to significant repair and maintenance costs. This work studies the applicability of fracture mechanics and damage tolerance concepts to the components of the post-tensioning systems used in civil engineering, applying them to assess the susceptibility of prestressing steel wires to stress corrosion cracking and the reduction of the load-bearing capacity of anchorage devices due to the presence of defects. For this purpose, experimental work and numerical techniques were combined. Surface defects in prestressing steel wires do not occur in isolation: they show a certain continuity in the axial direction and appear in significant numbers. Hence a statistical approach was adopted, which is more appropriate than a deterministic one. The use of statistical methods based on extreme value theory has allowed the characterisation of the surface condition of 5.2 mm-diameter wires. The stress corrosion cracking susceptibility of the wire was then assessed through an experimental testing programme in line with the current standards, which allowed a statistical characterisation of its behaviour. In the light of the results, it was possible to evaluate how the parameters that define the surface condition of the wire can determine the durability of the prestressing wires with regard to their resistance to stress corrosion cracking, as assessed by the tests specified in the standards. In the case of anchor heads for prestressing tendons, defects occur in isolation and originate from dents, scratches or corrosion pits that can be produced during manufacturing, transport, handling, assembly or service.
Given the nature of these defects, a deterministic approach is more appropriate here than a statistical one. Assessing the relevance of a defect in a structural component requires computing the local stress state that the defect generates, i.e., the stress intensity factors, which in turn allow evaluating whether the defect is critical or could become critical over time (through fatigue, corrosion, or a combination of both). In this work, defects were idealised as cracks so that the analysis stays on the safe side. The stress intensity factors were calculated with finite element models of the anchor head that simulate its real working conditions over its service life. These numerical models were used to analyse the influence on the rupture load of the anchorage of several factors, such as the anchorage geometry, the support conditions, the anchorage material, and the size, shape and position of the defect. The results of the numerical analysis were satisfactorily validated through an experimental campaign on scale models of anchor heads made of poly(methyl methacrylate), into which defects of various sizes and at various positions were artificially introduced.
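The extreme-value characterisation of the wire surface can be illustrated with the classic block-maxima recipe: fit a Gumbel distribution to the deepest defect per inspected segment and extrapolate a characteristic largest defect over many segments. The numbers and the segment notion below are placeholders, not the thesis's measurements.

```python
import numpy as np
from scipy import stats

# deepest surface defect per inspected wire segment (micrometres) -- synthetic
depths = np.random.default_rng(0).gumbel(loc=40.0, scale=8.0, size=30)

loc, scale = stats.gumbel_r.fit(depths)

# characteristic largest defect: depth exceeded once in N segments on average
N = 1000
d_N = stats.gumbel_r.ppf(1.0 - 1.0 / N, loc=loc, scale=scale)
print(f"depth exceeded on about 1 of {N} segments: {d_N:.1f} um")
```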