963 results for MODIFIED WEIBULL DISTRIBUTION
Abstract:
Serial correlation of extreme midlatitude cyclones observed at the storm track exits is explained by deviations from a Poisson process. To model these deviations, we apply fractional Poisson processes (FPPs) to extreme midlatitude cyclones, which are defined by the 850 hPa relative vorticity of the ERA-Interim reanalysis during boreal winter (DJF) and summer (JJA) seasons. Extremes are defined by a 99% quantile threshold in the grid-point time series. In general, FPPs are based on long-term memory and lead to non-exponential return time distributions. The return times are described by a Weibull distribution to approximate the Mittag-Leffler function in the FPPs. The Weibull shape parameter yields a dispersion parameter that agrees with results found for midlatitude cyclones. The memory of the FPP, which is determined by detrended fluctuation analysis, provides an independent estimate for the shape parameter. Thus, the analysis provides a concise framework linking the deviation from Poisson statistics (through a dispersion parameter), non-exponential return times and memory (correlation) on the basis of a single parameter. The results have potential implications for the predictability of extreme cyclones.
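The clustering described above can be illustrated with a small sketch: for a serially correlated series, the return times between threshold exceedances fitted to a Weibull give a shape parameter below 1, whereas a Poisson process gives an exponential (shape 1). The AR(1) surrogate series and all parameter values below are illustrative assumptions, not the paper's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# AR(1) surrogate series with persistence, so that exceedances of the 99%
# quantile cluster in time (mimicking serial correlation of extremes).
n, phi = 100_000, 0.8
x = np.empty(n)
x[0] = rng.standard_normal()
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.standard_normal()

threshold = np.quantile(x, 0.99)
returns = np.diff(np.flatnonzero(x > threshold))  # return times of extremes

# Fit a Weibull to the return times (location fixed at 0): a shape
# parameter below 1 signals clustering; shape 1 is the Poisson case.
shape, loc, scale = stats.weibull_min.fit(returns, floc=0)
print(f"Weibull shape: {shape:.2f}")
```

The fitted shape being below 1 is the overdispersion signature that a dispersion parameter would quantify.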
Abstract:
This work presents a characterization of the surface wind climatology over the Iberian Peninsula (IP). For this objective, an unprecedented observational database has been developed. The database covers a period of 6 years (2002-2007) and consists of hourly wind speed and wind direction data recorded at 514 automatic weather stations. The original observations underwent a quality control process to remove gross errors from the data set. As a first step, the annual and seasonal mean behaviour of the wind field is presented. This analysis shows the high spatial variability of the wind as a result of its interaction with the main orographic features of the IP. In order to simplify the characterization of the wind, a clustering procedure was applied to group the observational sites with similar temporal wind variability. A total of 20 regions are identified. These regions are strongly related to the main landforms of the IP. The wind behaviour of each region, characterized by the wind rose (WR), annual cycle (AC) and wind speed histogram, is explained as the response of each region to the main circulation types (CTs) affecting the IP. Results indicate that seasonal variability at the synoptic scale is related to the intra-annual variability of the WRs and is modulated by local features. The wind speed distribution does not always fit a unimodal Weibull distribution, a consequence of interactions at different atmospheric scales. This work contributes to a deeper understanding of the temporal and spatial variability of surface winds. Taken together, the wind database created, the methodology used and the conclusions extracted are a benchmark for future works based on wind behaviour.
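As a toy illustration of the unimodality point, one can compare how well a single Weibull fits a unimodal sample versus a two-regime mixture; the regimes and all parameter values below are invented for the sketch, not taken from the station data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Unimodal case: wind speeds drawn from a single Weibull (k=2, scale=6 m/s).
uni = stats.weibull_min.rvs(2.0, scale=6.0, size=5000, random_state=rng)

# Bimodal case: mixture of two hypothetical regimes (weak local winds plus
# strong synoptically driven winds), as the abstract suggests can occur.
bi = np.concatenate([
    stats.weibull_min.rvs(3.0, scale=3.0, size=2500, random_state=rng),
    stats.weibull_min.rvs(4.0, scale=10.0, size=2500, random_state=rng),
])

def ks_weibull(sample):
    """KS distance between a sample and its best single-Weibull fit."""
    k, loc, lam = stats.weibull_min.fit(sample, floc=0)
    return stats.kstest(sample, stats.weibull_min(k, loc, lam).cdf).statistic

ks_uni, ks_bi = ks_weibull(uni), ks_weibull(bi)
print(ks_uni, ks_bi)  # the mixture fits a single Weibull visibly worse
```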
Abstract:
The determination of the size as well as the power of a test is a vital part of clinical trial design. This research focuses on the simulation of clinical trial data with time-to-event as the primary outcome. It investigates the impact of different recruitment patterns and time-dependent hazard structures on the size and power of the log-rank test. A non-homogeneous Poisson process is used to simulate entry times according to the different accrual patterns. A Weibull distribution is employed to simulate survival times according to the different hazard structures. The current study utilizes simulation methods to evaluate the effect of different recruitment patterns on size and power estimates of the log-rank test. The size of the log-rank test is estimated by simulating survival times with identical hazard rates between the treatment and the control arm of the study, resulting in a hazard ratio of one. Powers of the log-rank test at specific values of the hazard ratio (≠1) are estimated by simulating survival times with different, but proportional, hazard rates for the two arms of the study. Different shapes (constant, decreasing, or increasing) of the hazard function of the Weibull distribution are also considered to assess the effect of the hazard structure on the size and power of the log-rank test.
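A bare-bones version of such a power simulation (no censoring, no staggered accrual, illustrative sample sizes) might look like the following sketch; the relation scale2 = scale1 * HR**(-1/shape) encodes proportional hazards for the Weibull:

```python
import numpy as np

rng = np.random.default_rng(1)

def weibull_times(n, shape, scale):
    # Inverse-CDF sampling of Weibull survival times.
    return scale * (-np.log(rng.uniform(size=n))) ** (1.0 / shape)

def logrank_stat(t1, t2):
    """Two-sample log-rank chi-square statistic (no censoring, for brevity)."""
    times = np.unique(np.concatenate([t1, t2]))
    O1 = E1 = V = 0.0
    for t in times:
        n1, n2 = np.sum(t1 >= t), np.sum(t2 >= t)  # at risk in each arm
        n = n1 + n2
        d1 = np.sum(t1 == t)
        d = d1 + np.sum(t2 == t)                    # deaths at time t
        if n > 1:
            O1 += d1                                # observed in arm 1
            E1 += d * n1 / n                        # expected in arm 1
            V += d * (n1 / n) * (n2 / n) * (n - d) / (n - 1)
    return (O1 - E1) ** 2 / V

# Power at hazard ratio 2 under an increasing-hazard Weibull (shape 1.5).
shape, scale1, hr = 1.5, 1.0, 2.0
scale2 = scale1 * hr ** (-1.0 / shape)  # proportional-hazards scale for arm 2
reject = 0
for _ in range(200):
    s = logrank_stat(weibull_times(50, shape, scale1),
                     weibull_times(50, shape, scale2))
    reject += s > 3.841  # chi-square(1) critical value at alpha = 0.05
power = reject / 200
print("estimated power:", power)
```

Setting hr = 1.0 in the same loop would instead estimate the size of the test.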
Abstract:
The study of the reliability of components and systems is of great importance in several fields of engineering, and very specifically in computing. When analysing the lifetimes of the items in a sample, one must take into account the items that do not fail during the experiment, or those that fail from causes other than the one under study. New types of sampling have therefore arisen to cover these cases. The most general of them, censored sampling, is the one considered in our work. In this sampling scheme, both the time until the component fails and the censoring time are random variables. Under the hypothesis that both times are exponentially distributed, Professor Hurt studied the asymptotic behaviour of the maximum likelihood estimator of the reliability function. In principle it seems attractive to use Bayesian methods in reliability studies because they incorporate into the analysis the prior information normally available in real problems. We have therefore considered two Bayes estimators of the reliability of an exponential distribution: the mean and the mode of the posterior distribution. We have calculated the asymptotic expansion of the mean, variance and mean square error of both estimators when the censoring distribution is exponential. We have also obtained the asymptotic distribution of the estimators for the more general case in which the censoring distribution is Weibull. Two types of large-sample confidence intervals have been proposed for each estimator. The results have been compared with those of the maximum likelihood estimator and with those of two non-parametric estimators, the product-limit and a Bayesian one, with one of our estimators showing superior behaviour.
Finally, we have verified by simulation that our estimators are robust with respect to the assumed censoring distribution, and that one of the proposed confidence intervals is valid for small samples. This study has also served to confirm the better behaviour of one of our estimators. SETTING OUT AND SUMMARY OF THE THESIS: When we study the lifetime of components it is necessary to take into account the elements that do not fail during the experiment, or those that fail for reasons which it is desirable to exclude from consideration. The model of random censorship is very useful for analysing these data. In this model the time to failure and the censoring time are random variables. We obtain two Bayes estimators of the reliability function of an exponential distribution based on randomly censored data. We have calculated the asymptotic expansion of the mean, variance and mean square error of both estimators when the censoring distribution is exponential. We have also obtained the asymptotic distribution of the estimators for the more general case of a Weibull censoring distribution. Two large-sample confidence bands have been proposed for each estimator. The results have been compared with those of the maximum likelihood estimator, and with those of two non-parametric estimators: product-limit and Bayesian. One of our estimators has the best behaviour. Finally, we have shown by simulation that our estimators are robust against the assumed censoring distribution, and that one of our intervals does well in small-sample situations.
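A posterior-mean estimator of exponential reliability under random censoring can be sketched with a conjugate gamma prior on the failure rate; the prior parameters, censoring rate and evaluation time below are illustrative assumptions, not the thesis' choices:

```python
import numpy as np

rng = np.random.default_rng(7)

# Randomly censored exponential lifetimes: we observe min(T, C) and a
# failure indicator. Prior on the failure rate lam: Gamma(a0, rate b0).
lam_true, n = 0.5, 400
T = rng.exponential(1 / lam_true, n)   # true failure times
C = rng.exponential(2.0, n)            # exponential censoring times
t_obs = np.minimum(T, C)
delta = (T <= C).astype(int)           # 1 = failure observed, 0 = censored

a0, b0 = 1.0, 1.0
a_post = a0 + delta.sum()              # conjugate Gamma posterior update
b_post = b0 + t_obs.sum()

# Bayes (posterior-mean) estimate of R(t) = exp(-lam * t): for
# lam ~ Gamma(a, rate b), E[exp(-lam * t)] = (b / (b + t))**a.
t0 = 1.0
R_bayes = (b_post / (b_post + t0)) ** a_post
print(R_bayes, np.exp(-lam_true * t0))  # estimate vs. true reliability
```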
Abstract:
Quasi-monocrystalline silicon wafers have appeared as a critical innovation in the PV industry, combining the most favourable characteristics of the conventional substrates: the higher solar cell efficiencies of monocrystalline Czochralski-Si (Cz-Si) wafers and the lower cost and full square shape of the multicrystalline ones. However, quasi-mono ingot growth can lead to a different defect structure than the typical Cz-Si process. Thus, the properties of the brand-new quasi-mono wafers have been studied for the first time from a mechanical point of view, comparing their strength with that of both Cz-Si mono and typical multicrystalline materials. The study has been carried out employing the four-line bending test and simulating it by means of FE models. For the analysis, failure stresses were fitted to a three-parameter Weibull distribution. High mechanical strength was found in all cases. Interestingly, the low-quality quasi-mono wafers did not exhibit critical strength values for the PV industry, despite their noticeable density of extended defects.
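Fitting failure stresses to a three-parameter Weibull (shape, threshold, scale) can be sketched as follows; the synthetic stresses and parameter values are invented for illustration, not wafer measurements:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Synthetic wafer failure stresses (MPa): three-parameter Weibull with a
# threshold (location) below which no failures occur.
shape_t, loc_t, scale_t = 2.5, 120.0, 80.0
stress = stats.weibull_min.rvs(shape_t, loc=loc_t, scale=scale_t,
                               size=300, random_state=rng)

# Fit all three parameters (shape, threshold, scale) by maximum likelihood.
shape, loc, scale = stats.weibull_min.fit(stress)
print(f"shape={shape:.2f} threshold={loc:.1f} MPa scale={scale:.1f} MPa")
```

The fitted threshold must lie below the smallest observed failure stress, which is a quick sanity check on the fit.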
Abstract:
Production of back contact solar cells requires the generation of holes in the wafers to place both the positive and negative contacts on the back side of the cell. This drilling process weakens the wafer mechanically, owing to the presence of the holes and to the damage, in the form of microcracks, introduced during the process. In this study, several chemical treatments were applied to drilled wafers in order to eliminate or reduce the damage generated during this fabrication step. The treatments analysed are the following: alkaline etching for 1, 3 and 5 minutes, acid etching for 2 and 4 minutes, and texturisation. To determine the mechanical strength of the samples, a standard mechanical study was carried out, testing the samples with the Ring-on-Ring bending test and obtaining the stress state at the moment of failure by FE simulation. Finally, the results obtained for each treatment were fitted to a three-parameter Weibull distribution.
Abstract:
This study goes in depth into the estimation of forest attributes from LiDAR data in the Fuenfría Valley (Cercedilla, Madrid). Two LiDAR flights, carried out in 2002 and 2011, were available, and in the winter of 2013 an inventory of 60 field plots was conducted. First, six stand attributes (volume, basal area, total biomass, dominant height, density and quadratic mean diameter) were estimated for 2013, at the pixel, stand and forest levels. Multiple linear regression models were built that allowed these attributes to be estimated accurately. Second, different methods for estimating the diameter distribution were tested: on the one hand, the percentile prediction method and, on the other, the parameter prediction method. The latter was tested with a simple Weibull function, a double Weibull function and a combination of both, according to the distribution that best fitted each plot. However, none of the methods proved sufficiently valid for predicting the diameter distribution. Finally, volume and basal area growth were estimated by comparing the 2002 and 2011 flights. Even though the LiDAR technology differed and only one complete inventory, carried out in 2013, was available, the fitted models show good goodness of fit. Likewise, growth at the pixel level was found to be statistically significantly related to the mean slope, aspect and elevation of the pixel. ABSTRACT This project goes in depth on the estimation of forest attributes by means of LiDAR data in Fuenfria's Valley (Cercedilla, Madrid). The available information was two LiDAR flights (2002 and 2011) and a forest inventory consisting of 60 plots (2013).
First, six different dasometric attributes (volume, basal area, total aboveground biomass, top height, density and quadratic mean diameter) were estimated in 2013 at the pixel, stand and forest levels. The models were developed using multiple linear regression and were good enough to predict these attributes with great accuracy. Second, the measured diameter distribution at each plot was fitted to a simple and a double Weibull distribution and different methods for its estimation were tested. Neither the parameter prediction method nor the percentile prediction method was able to account for the diameter distribution. Finally, volume and top height growth were estimated comparing the 2011 LiDAR flight with the 2002 LiDAR flight. Even though the LiDAR technology was not the same and there was just one forest inventory with sample plots, the models properly explain the growth. Besides, growth at each pixel is significantly related to its average slope, orientation and altitude.
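The "double Weibull" here is a two-component Weibull mixture; a minimal maximum-likelihood sketch (with invented bimodal diameter data, and starting values placed at the generating parameters) can compare it against a single Weibull fit:

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(5)

# Bimodal diameter sample (cm): two cohorts, as in a two-storied stand.
d = np.concatenate([
    stats.weibull_min.rvs(4.0, scale=12.0, size=300, random_state=rng),
    stats.weibull_min.rvs(5.0, scale=30.0, size=200, random_state=rng),
])

# Log-likelihood of the best single Weibull (location fixed at 0).
k1, _, s1 = stats.weibull_min.fit(d, floc=0)
ll_single = stats.weibull_min.logpdf(d, k1, scale=s1).sum()

def nll_double(p):
    """Negative log-likelihood of a two-component Weibull mixture."""
    w = 1 / (1 + np.exp(-p[0]))        # mixing weight constrained to (0, 1)
    ka, sa, kb, sb = np.exp(p[1:])     # positive shapes and scales
    pdf = (w * stats.weibull_min.pdf(d, ka, scale=sa)
           + (1 - w) * stats.weibull_min.pdf(d, kb, scale=sb))
    return -np.sum(np.log(pdf + 1e-300))

res = optimize.minimize(nll_double,
                        x0=[0.0, np.log(4), np.log(12), np.log(5), np.log(30)],
                        method="Nelder-Mead", options={"maxiter": 5000})
ll_double = -res.fun
print(ll_single, ll_double)  # the mixture should fit the bimodal data better
```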
Abstract:
The Amazon forest plays an important environmental, social and economic role for the region, the country and the world. Logging techniques that aim to reduce the impacts caused to the forest are therefore essential. The objective of this thesis is to compare Reduced-Impact Logging with Conventional Logging in the Brazilian Amazon through empirical individual-tree growth and yield models. The experiment was installed on the Agrossete farm, located in Paragominas, PA. In 1993, three areas of this farm were selected for harvesting. In the first area, 105 hectares were logged using Reduced-Impact Logging. In the second area, 75 hectares underwent Conventional Logging. Finally, the third area was kept as a control. Diameter at breast height was measured and species were identified within a 24.5-hectare plot, randomly established in each area, in 1993 (before harvest), 1994 (six months after harvest), 1995, 1996, 1998, 2000, 2003, 2006 and 2009. The three areas were then compared by fitting a diameter increment model, in which the stochastic effect was allowed to follow four other distributions besides the normal, a mortality probability model and a recruitment probability model. The behaviour of the diameter increment indicated that the logged areas behave in the same way for almost all species groups, with the exception of the intermediate species group. Trees subjected to logging show greater diameter growth than those in the unlogged area. Moreover, assuming a Weibull distribution for the stochastic effect improved the fit of the models.
Regarding the probability of mortality, the logged areas again behave similarly to each other but differently from the unlogged area, with trees in the logged areas having a higher probability of death than those in the unlogged area. The recruitment probability models indicated differences only between the logged areas and the control area, the logged areas showing a higher recruitment rate than the unlogged area. Therefore, the individual behaviour of trees after logging is the same under Conventional Logging and Reduced-Impact Logging.
Abstract:
Modeling of self-similar traffic is performed for a queuing system of G/M/1/K type using the Weibull distribution. To study the self-similar traffic, a simulation model is developed using the SIMULINK software package in the MATLAB environment, and the self-similar traffic is approximated on the basis of spline functions. Self-similar traffic is modeled for a QS of W/M/1/K type using the Weibull distribution. The initial data are: Hurst parameter H = 0.65, shape parameter of the distribution curve α ≈ 0.7 and distribution parameter β ≈ 0.0099. Considering that self-similar traffic is characterized by the presence of "splashes" (bursts) and long-term dependence between the moments of request arrivals, in this study, under the given initial data, it is reasonable to use linear interpolation splines.
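The burstiness that Weibull interarrivals with shape below 1 produce can be sketched outside SIMULINK with a plain renewal simulation; the shape 0.7 follows the abstract, while the scale and window are arbitrary illustrative choices (the exact parameterization behind β ≈ 0.0099 is not specified, so it is not reused):

```python
import numpy as np

rng = np.random.default_rng(9)

# Renewal arrival stream with Weibull interarrival times. Shape < 1 gives
# bursty ("splashy") traffic; shape = 1 recovers the Poisson case.
def arrival_counts(shape, scale, n_arrivals, window):
    # Inverse-CDF sampling of Weibull gaps, then counts per time window.
    gaps = scale * (-np.log(rng.uniform(size=n_arrivals))) ** (1.0 / shape)
    t = np.cumsum(gaps)
    return np.histogram(t, bins=np.arange(0.0, t[-1], window))[0]

window = 50.0
bursty = arrival_counts(0.7, 1.0, 100_000, window)
poisson = arrival_counts(1.0, 1.0, 100_000, window)

# Index of dispersion of counts: ~1 for Poisson, > 1 for bursty traffic.
idc_bursty = bursty.var() / bursty.mean()
idc_poisson = poisson.var() / poisson.mean()
print(idc_bursty, idc_poisson)
```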
Abstract:
On the basis of aerial photographs of sea ice floes in the marginal ice zone (MIZ) of Prydz Bay acquired from December 2004 to February 2005 during the 21st Chinese National Antarctic Research Expedition, image processing techniques are employed to extract geometric parameters of floes from two merged transects covering the whole MIZ. Variations of these parameters with the distance into the MIZ are then obtained. Different parameters of floe size, namely area, perimeter, and mean caliper diameter (MCD), follow three similar stages of increasing, flat and increasing again with distance from the open ocean. Floe shape parameters (roundness and the ratio of perimeter to MCD), however, show less significant variations than those of floe size. Then, to account for the deviation of the cumulative floe size distribution from the ideal power law, an upper truncated power-law function and a Weibull function are used, and four calculated parameters of the above functions are found to be important descriptors of the evolution of the floe size distribution in the MIZ. Among them, Lr of the upper truncated power-law function indicates the upper limit of floe size and roughly equals the maximum floe size in each square sample area. L0 of the Weibull distribution shows an increasing proportion of larger floes in squares farther from the open ocean and roughly equals the mean floe size. D of the upper truncated power-law function is closely associated with the degree of confinement during ice breakup; its decrease with the distance into the MIZ indicates the weakening of confinement conditions on floes owing to wave attenuation. The gamma of the Weibull distribution characterizes the degree of homogeneity in a data set; it also decreases with distance into the MIZ, implying that floe size distributions increase in range. Finally, a statistical test on floe size is performed to divide the whole MIZ into three distinct zones made up of floes with quite different characteristics.
This zonal structure of floe size also agrees well with the trends of floe shape and floe size distribution, and is believed to be a straightforward result of wave-ice interaction in the MIZ.
Abstract:
We consider the problem of estimating P(Y_1 + ... + Y_n > x) by importance sampling when the Y_i are i.i.d. and heavy-tailed. The idea is to exploit the cross-entropy method as a tool for choosing good parameters in the importance sampling distribution; in doing so, we use the asymptotic description that, given {Y_1 + ... + Y_n > x}, n - 1 of the Y_i have distribution F and one has the conditional distribution of Y given Y > x. We show in some specific parametric examples (Pareto and Weibull) how this leads to precise answers which, as demonstrated numerically, are close to being variance minimal within the parametric class under consideration. Related problems for M/G/1 and GI/G/1 queues are also discussed.
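The asymptotic description invoked here (one summand becomes large while the rest stay typical) implies the "single big jump" approximation P(Y_1 + ... + Y_n > x) ≈ n P(Y > x) for subexponential Y. A crude Monte Carlo check for the Pareto case, with purely illustrative parameter values:

```python
import numpy as np

rng = np.random.default_rng(11)

# Pareto(alpha) on [1, inf): P(Y > y) = y**(-alpha). The sum exceeds a
# large x essentially because a single summand does.
alpha, n, x = 1.5, 4, 100.0

reps = 1_000_000
Y = rng.uniform(size=(reps, n)) ** (-1.0 / alpha)  # inverse-CDF Pareto draws
mc = np.mean(Y.sum(axis=1) > x)                    # naive Monte Carlo
asym = n * x ** (-alpha)                           # single-big-jump value
print(mc, asym)  # the two values should be of the same order
```

Naive Monte Carlo needs very many replications precisely because the event is rare, which is what motivates the importance sampling scheme of the abstract.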
Abstract:
The estimation of P(S_n > u) by simulation, where S_n is the sum of independent, identically distributed random variables Y_1, ..., Y_n, is of importance in many applications. We propose two simulation estimators based upon the identity P(S_n > u) = n P(S_n > u, M_n = Y_n), where M_n = max(Y_1, ..., Y_n). One estimator uses importance sampling (for Y_n only), and the other uses conditional Monte Carlo, conditioning upon Y_1, ..., Y_{n-1}. Properties of the relative error of the estimators are derived, and a numerical study is given in terms of the M/G/1 queue, in which n is replaced by an independent geometric random variable N. The conclusion is that the new estimators compare extremely favorably with previous ones. In particular, the conditional Monte Carlo estimator is the first heavy-tailed example of an estimator with bounded relative error. Further improvements are obtained in the random-N case by incorporating control variates and stratification techniques into the new estimation procedures.
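The conditional Monte Carlo estimator follows directly from the stated identity: conditioning on Y_1, ..., Y_{n-1}, the event {S_n > u, M_n = Y_n} becomes {Y_n > max(M_{n-1}, u - S_{n-1})}, whose probability is the known tail of Y. A sketch for fixed n and Pareto tails, with illustrative parameters:

```python
import numpy as np

rng = np.random.default_rng(13)

alpha, n, u = 1.5, 4, 100.0

def fbar(y):
    """Pareto(alpha) tail on [1, inf): P(Y > y) = y**(-alpha) for y >= 1."""
    y = np.maximum(y, 1.0)  # the tail is 1 below the left endpoint
    return y ** (-alpha)

reps = 200_000
Y = rng.uniform(size=(reps, n - 1)) ** (-1.0 / alpha)  # Y_1, ..., Y_{n-1}
# Z = n * Fbar(max(M_{n-1}, u - S_{n-1})), integrating Y_n out analytically.
Z = n * fbar(np.maximum(Y.max(axis=1), u - Y.sum(axis=1)))
est = Z.mean()
rel_err = Z.std() / (est * np.sqrt(reps))
print(est, rel_err)  # small relative error, in line with the abstract's claim
```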
Abstract:
The recurrence interval statistics for regional seismicity follow a universal distribution function, independent of the tectonic setting or average rate of activity (Corral, 2004). The universal function is a modified gamma distribution with power-law scaling of recurrence intervals shorter than the average rate of activity and exponential decay for larger intervals. We employ the method of Corral (2004) to examine the recurrence statistics of a range of cellular automaton earthquake models. The majority of the models have an exponential distribution of recurrence intervals, the same as that of a Poisson process. One model, the Olami-Feder-Christensen automaton, has recurrence statistics consistent with regional seismicity for a certain range of the conservation parameter of that model. For conservation parameters in this range, the event size statistics are also consistent with regional seismicity. Models whose dynamics are dominated by characteristic earthquakes do not appear to display universality of recurrence statistics.
Abstract:
2000 Mathematics Subject Classification: 60G70, 60F12, 60G10.
Abstract:
Survival models deal with the modelling of time-to-event data. In certain situations, a share of the population can no longer experience the event of interest. In this context, cure fraction models emerged. Among the models that incorporate a cured fraction, one of the best known is the promotion time model. In the present study we discuss hypothesis testing in the promotion time model with a Weibull distribution for the failure times of susceptible individuals. Hypothesis testing in this model may be performed based on likelihood ratio, gradient, score or Wald statistics. The critical values are obtained from asymptotic approximations, which may result in size distortions in finite sample sizes. This study proposes bootstrap corrections to the aforementioned tests and a bootstrap Bartlett correction to the likelihood ratio statistic in the Weibull promotion time model. Using Monte Carlo simulations we compared the finite-sample performance of the proposed corrections with that of the usual tests. The numerical evidence favors the proposed corrected tests. At the end of the work an empirical application is presented.
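The bootstrap idea can be sketched in a deliberately simplified setting, swapping the full promotion time cure model for a plain test of exponentiality against a Weibull alternative: simulate from the null fit to calibrate the likelihood ratio statistic in finite samples, instead of relying on the chi-square approximation. All sample sizes and parameters below are illustrative:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(17)

def lr_stat(x):
    """LR statistic for H0: Weibull shape = 1 (exponential) vs free shape."""
    k, _, s = stats.weibull_min.fit(x, floc=0)          # unrestricted MLE
    ll1 = stats.weibull_min.logpdf(x, k, scale=s).sum()
    ll0 = stats.expon.logpdf(x, scale=x.mean()).sum()   # H0 MLE (scale only)
    return 2 * (ll1 - ll0)

# Parametric bootstrap: resample under the fitted null to obtain the
# finite-sample null distribution of the LR statistic.
x = rng.exponential(2.0, size=30)  # small sample, H0 true here
obs = lr_stat(x)
boot = np.array([lr_stat(rng.exponential(x.mean(), size=x.size))
                 for _ in range(200)])
p_boot = np.mean(boot >= obs)
print(f"LR = {obs:.2f}, bootstrap p-value = {p_boot:.2f}")
```

The same resample-under-the-null recipe carries over to the promotion time model, with the cure fraction and Weibull parameters re-estimated in each bootstrap replicate.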