973 results for Weibull distribution function


Relevance:

90.00%

Publisher:

Abstract:

2000 Mathematics Subject Classification: 60G70, 60F12, 60G10.

Relevance:

90.00%

Publisher:

Abstract:

2000 Mathematics Subject Classification: 62E16,62F15, 62H12, 62M20.

Relevance:

90.00%

Publisher:

Abstract:

We examined the anatomy of expanding, mature, and senescing leaves of tropical plants for the presence of red pigments: anthocyanins and betacyanins. We studied 463 species in total, from 370 genera belonging to 94 families. This included 21 species from five families in the Caryophyllales, where betacyanins are the basis for red color. We also included 14 species of ferns and gymnosperms in seven families, and 29 species with undersurface coloration at maturity. We analyzed 399 angiosperm species (74 families) for factors (especially developmental and evolutionary) influencing anthocyanin production during expansion and senescence. During expansion, 44.9% of species produced anthocyanins, but only 13.5% did so during senescence. At both stages, relatively few patterns of tissue distribution developed, primarily in the mesophyll, and very few taxa produced anthocyanins in dermal and ground tissue simultaneously. Of the 35 species producing anthocyanins both in development and senescence, most had similar cellular distributions. Anthocyanin distributions were identical in different developing leaves of three heteroblastic taxa. Phylogeny has influenced the distribution of anthocyanins in the epidermis and mesophyll of expanding leaves and in the palisade parenchyma during senescence, although these influences are not strong. Betacyanins appear to have similar distributions in leaves of taxa within the Caryophyllales and, perhaps, similar functions. The presence of anthocyanins in the mesophyll of so many species is inconsistent with the hypothesis of protection against UV damage or fungal pathogens, and the differing tissue distributions indicate that the pigments may function in different ways, such as photoprotection and free-radical scavenging.

Relevance:

90.00%

Publisher:

Abstract:

Extreme stock price movements are of great concern to both investors and the entire economy. For investors, a single negative return, or a combination of several smaller returns, can possibly wipe out so much capital that the firm or portfolio becomes illiquid or insolvent. If enough investors experience this loss, it could shock the entire economy. An example of such a case is the stock market crash of 1987. Furthermore, there has been a lot of recent interest regarding the increasing volatility of stock prices. This study presents an analysis of extreme stock price movements. The data utilized were the daily returns for the Standard and Poor's 500 index from January 3, 1978 to May 31, 2001. Research questions were analyzed using the statistical models provided by extreme value theory. One of the difficulties in examining stock price data is that there is no consensus regarding the correct shape of the distribution function generating the data. An advantage of extreme value theory is that no detailed knowledge of this distribution function is required to apply the asymptotic theory; we focus on the tail of the distribution. Extreme value theory allows us to estimate a tail index, which we use to derive bounds on the returns for very low probabilities of an excess. Such information is useful in evaluating the volatility of stock prices. There are three possible limit laws for the maximum: Gumbel (thin-tailed), Fréchet (heavy-tailed), or Weibull (bounded tail). Results indicated that extreme returns during the time period studied follow a Fréchet distribution. Thus, this study finds that extreme value analysis is a valuable tool for examining stock price movements and can be more efficient than the usual variance in measuring risk.
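The tail-index estimation step that the abstract describes can be sketched with a Hill estimator. This is a minimal illustration of the general EVT machinery on synthetic Pareto data, not the study's own code; the sample size, order-statistic count `k`, and seed are arbitrary choices:

```python
import numpy as np

def hill_tail_index(data, k):
    """Hill estimator of the tail index alpha from the k largest observations."""
    x = np.sort(np.asarray(data, dtype=float))[::-1]  # descending order statistics
    gamma = np.mean(np.log(x[:k] / x[k]))             # mean log-excess over the (k+1)-th largest
    return 1.0 / gamma                                # alpha = 1 / gamma

rng = np.random.default_rng(0)
alpha_true = 3.0
# Synthetic heavy-tailed "extreme losses": classical Pareto with tail index 3
sample = rng.pareto(alpha_true, 20000) + 1.0
alpha_hat = hill_tail_index(sample, k=500)
```

A heavy (Fréchet-type) tail corresponds to a finite positive tail index; the estimate stabilizes only for a suitable range of `k`, which is why such studies typically inspect a Hill plot over many values of `k`.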

Relevance:

90.00%

Publisher:

Abstract:

This work presents a computational code, called MOMENTS, developed for use in process control to determine a characteristic transfer function of industrial units when radiotracer techniques are applied to study the unit's performance. The methodology is based on measuring the residence time distribution (RTD) function and calculating the first and second temporal moments of the tracer data obtained by two NaI scintillation detectors positioned to register the complete tracer movement inside the unit. A nonlinear regression technique was used to fit various mathematical models, and a statistical test was used to select the best result for the transfer function. Using the MOMENTS code, twelve different models can be fitted to a curve to calculate technical parameters of the unit.
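The first and second temporal moments of a tracer curve can be sketched as below. The exponential curve is a synthetic stand-in for real detector counts (it is the RTD of an ideal mixer with a 5 s mean residence time), not data from the MOMENTS study:

```python
import numpy as np

def _trapz(y, x):
    """Trapezoidal integration (kept explicit for portability)."""
    return float(np.sum((y[:-1] + y[1:]) * np.diff(x)) / 2.0)

def rtd_moments(t, c):
    """First and second temporal moments of a tracer response curve.
    Returns the mean residence time tau and the RTD variance sigma^2."""
    e = c / _trapz(c, t)                    # normalize counts to E(t)
    tau = _trapz(t * e, t)                  # first moment: mean residence time
    sigma2 = _trapz((t - tau) ** 2 * e, t)  # second central moment
    return tau, sigma2

# Simulated detector curve: exponential RTD of an ideal mixer, tau = 5 s
t = np.linspace(0.0, 60.0, 2001)
c = np.exp(-t / 5.0)
tau, sigma2 = rtd_moments(t, c)
```

For this ideal-mixer curve the theoretical values are tau = 5 s and sigma^2 = 25 s^2, which the numerical moments reproduce; model fitting then amounts to matching these moments against candidate transfer functions.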

Relevance:

90.00%

Publisher:

Abstract:

Complexity analysis of a given time series is executed using various measures of irregularity, the most commonly used being approximate entropy (ApEn), sample entropy (SampEn), and fuzzy entropy (FuzzyEn). However, the dependence of these measures on the critical tolerance parameter 'r' leads to precarious results, owing to random selections of r. Attempts to eliminate the use of r in entropy calculations introduced a new measure of entropy, namely distribution entropy (DistEn), based on the empirical probability distribution function (ePDF). DistEn completely avoids the use of a variance-dependent parameter like r and replaces it with a parameter M, which corresponds to the number of bins used in the histogram from which it is calculated. When tested on synthetic data, M has been observed to have a minimal effect on DistEn compared to the effect of r on other entropy measures. DistEn is also reported to be relatively stable under data length (N) variations, as far as synthetic data is concerned. However, these claims have not been analyzed for physiological data. Our study evaluates the effect of data length N and bin number M on the performance of DistEn using both synthetic and physiological time series data. Synthetic logistic data of 'periodic' and 'chaotic' levels of complexity, and 40 RR-interval time series belonging to two groups of a healthy aging population (young and elderly), have been used for the analysis. The stability and consistency of DistEn as a complexity measure as well as a classifier have been studied. Experiments show that the parameters N and M are more influential in deciding the efficacy of DistEn in the case of physiological data than synthetic data. Therefore, a generalized random selection of M for a given data length N may not always be an appropriate combination to yield good performance of DistEn for physiological data.
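A minimal sketch of the DistEn calculation, assuming the usual embedding/Chebyshev-distance formulation. The logistic-map data mirrors the 'periodic' and 'chaotic' synthetic series mentioned above, but the embedding dimension, bin count M, data length, and map parameters here are arbitrary illustrative choices:

```python
import numpy as np

def dist_en(x, m=2, M=512):
    """Distribution entropy: Shannon entropy of the M-bin histogram of all
    pairwise Chebyshev distances between m-dimensional embedding vectors,
    normalized by log2(M) so the result lies in [0, 1]."""
    x = np.asarray(x, dtype=float)
    emb = np.lib.stride_tricks.sliding_window_view(x, m)       # (n-m+1, m) embeddings
    d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), -1)  # Chebyshev distance matrix
    dists = d[np.triu_indices(len(emb), k=1)]                  # upper triangle only
    p, _ = np.histogram(dists, bins=M)
    p = p[p > 0] / p.sum()                                     # empirical PDF of distances
    return float(-np.sum(p * np.log2(p)) / np.log2(M))

def logistic(r, n, x0=0.4, burn=200):
    """Logistic map x -> r*x*(1-x), discarding a transient."""
    x = x0
    for _ in range(burn):
        x = r * x * (1.0 - x)
    out = np.empty(n)
    for i in range(n):
        x = r * x * (1.0 - x)
        out[i] = x
    return out

de_periodic = dist_en(logistic(3.5, 400))  # period-4 orbit: few distinct distances
de_chaotic = dist_en(logistic(4.0, 400))   # chaotic regime: spread-out distances
```

The periodic series concentrates its distance histogram in a handful of bins and so yields a lower DistEn than the chaotic series, which is the qualitative separation the measure is designed to capture.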

Relevance:

90.00%

Publisher:

Abstract:

When the distribution of a process characterized by a profile is non-normal, process capability analysis under the normality assumption often leads to erroneous interpretations of process performance. Profile monitoring is a relatively new set of techniques in quality control, used in situations where the state of a product or process is represented by a function of two or more quality characteristics. Such profiles can be modeled using linear or nonlinear regression models. In some applications it is assumed that the quality characteristics follow a normal distribution; however, in certain applications this assumption may fail to hold and may yield misleading results. In this article, we consider process capability analysis of non-normal linear profiles. We investigate and compare five methods to estimate a non-normal process capability index (PCI) in profiles. Three of the methods require an estimate of the cumulative distribution function (cdf) of the process in order to analyze process capability in profiles. To estimate the cdf of the process, we use a Burr XII distribution as well as empirical distributions. However, the resulting PCI based on the estimated cdf is sometimes far from its true value, so we also apply an artificial neural network with supervised learning, which allows the estimation of PCIs in profiles without the need to estimate the cdf of the process. A Box-Cox transformation technique is also developed to deal with non-normal situations. Finally, a comparison study is performed through simulation of Gamma, Weibull, Lognormal, Beta, and Student's t data.
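The empirical-cdf route to a non-normal PCI can be sketched with a Clements-style percentile construction, where the 6-sigma spread is replaced by the empirical 0.135th-99.865th percentile range. This is a generic illustration on synthetic Gamma data with invented specification limits, not the article's exact estimator:

```python
import numpy as np

def percentile_pci(x, lsl, usl):
    """Clements-style percentile capability indices for non-normal data:
    the 6-sigma spread becomes the empirical 0.135th-99.865th percentile
    range, and the process center becomes the median."""
    p_lo, med, p_hi = np.percentile(x, [0.135, 50.0, 99.865])
    cp = (usl - lsl) / (p_hi - p_lo)
    cpk = min((usl - med) / (p_hi - med), (med - lsl) / (med - p_lo))
    return cp, cpk

rng = np.random.default_rng(1)
x = rng.gamma(shape=4.0, scale=0.5, size=50000)  # skewed (Gamma) process data
cp, cpk = percentile_pci(x, lsl=0.0, usl=8.0)    # hypothetical spec limits
```

Because the Gamma data is right-skewed, the upper and lower one-sided indices differ noticeably, which is exactly the asymmetry a normal-theory PCI would miss.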

Relevance:

80.00%

Publisher:

Abstract:

The recent development of indoor wireless local area network (WLAN) standards at 2.45 GHz and 5 GHz has led to increased interest in propagation studies at these frequency bands. Within the indoor environment, human-body effects can strongly reduce the quality of wireless communication systems, causing temporal variations and shadowing due to pedestrian movement and antenna-body interaction with portable terminals. This book presents a statistical characterisation, based on measurements, of human-body effects on indoor narrowband channels at 2.45 GHz and at 5.2 GHz. A novel cumulative distribution function (CDF) that models the 5 GHz narrowband channel in populated indoor environments is proposed; this CDF describes the received envelope in terms of pedestrian traffic. In addition, a novel channel model for the populated indoor environment is proposed for the Multiple-Input Multiple-Output (MIMO) narrowband channel in the presence of pedestrians at 2.45 GHz. Results suggest that practical MIMO systems must be sufficiently adaptive if they are to benefit from the capacity enhancement caused by pedestrian movement.

Relevance:

80.00%

Publisher:

Abstract:

Survival probability prediction using a covariate-based hazard approach is a well-established statistical methodology in engineering asset health management. We have previously reported the semi-parametric Explicit Hazard Model (EHM), which incorporates three types of information for hazard prediction: population characteristics; condition indicators; and operating environment indicators. This model assumes that the baseline hazard has the form of the Weibull distribution. To avoid this assumption, this paper presents the non-parametric EHM, a distribution-free covariate-based hazard model. An application of the non-parametric EHM is demonstrated via a case study, in which the survival probabilities of a set of resistance elements estimated using the non-parametric EHM are compared with the Weibull proportional hazard model and the traditional Weibull model. The results show that the non-parametric EHM can effectively predict asset life using the condition indicator, operating environment indicator, and failure history.
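The distribution-free idea behind the non-parametric approach can be illustrated with a Kaplan-Meier estimator on made-up failure/censoring data. Note this is a generic survival sketch, not the EHM itself, which additionally conditions the hazard on condition and operating-environment covariates:

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier estimator: a distribution-free survival curve.
    times: failure or censoring times; events: 1 = failure, 0 = censored.
    Returns [(t, S(t))] at each distinct failure time."""
    t = np.asarray(times, dtype=float)
    e = np.asarray(events, dtype=int)
    s, curve = 1.0, []
    for u in np.unique(t[e == 1]):            # distinct failure times
        d = int(np.sum((t == u) & (e == 1)))  # failures at time u
        n_risk = int(np.sum(t >= u))          # units still at risk just before u
        s *= 1.0 - d / n_risk                 # product-limit update
        curve.append((u, s))
    return curve

# Hypothetical lifetimes of ten units, two of them right-censored
curve = kaplan_meier(
    times=[2, 3, 3, 5, 7, 8, 8, 9, 10, 12],
    events=[1, 1, 0, 1, 1, 1, 0, 1, 1, 1],
)
```

No distributional shape (Weibull or otherwise) is assumed anywhere: the survival probability drops only at observed failure times, in proportion to the number of units still at risk.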

Relevance:

80.00%

Publisher:

Abstract:

Maintenance activities in a large-scale engineering system are usually scheduled according to the lifetimes of various components in order to ensure the overall reliability of the system. Lifetimes of components can be deduced from the corresponding probability distributions, with parameters estimated from past failure data. When failure data for the components are not readily available, engineers must rely on primitive information from the manufacturers, such as the mean and standard deviation of lifetime, to plan maintenance activities. In this paper, the moment-based piecewise polynomial model (MPPM) is proposed to estimate the parameters of the reliability probability distribution of a product when only the mean and standard deviation of the product lifetime are known. This method employs a group of polynomial functions to estimate the two parameters of the Weibull distribution, based on the mathematical relationship between the shape parameter of the two-parameter Weibull distribution and the ratio of mean to standard deviation. Tests are carried out to evaluate the validity and accuracy of the proposed method, with a discussion of its suitability for application. The proposed method is particularly useful for reliability-critical systems, such as railway and power systems, in which maintenance activities are scheduled according to the expected lifetimes of the system components.

Relevance:

80.00%

Publisher:

Abstract:

Prognostics and asset life prediction is a promising research area in engineering asset health management. We previously developed the Explicit Hazard Model (EHM) to effectively and explicitly predict asset life using three types of information: population characteristics; condition indicators; and operating environment indicators. We have formerly studied the application of both the semi-parametric EHM and the non-parametric EHM to survival probability estimation in the reliability field. The survival time in these models depends not only on the age of the monitored asset, but also on the condition and operating environment information obtained. This paper is a further study of the semi-parametric and non-parametric EHMs applied to hazard and residual life prediction for a set of resistance elements. The resistance elements were used as corrosion sensors for measuring the atmospheric corrosion rate in a laboratory experiment. In this paper, the hazard of the resistance elements estimated using the semi-parametric EHM and the non-parametric EHM is compared to the traditional Weibull model and the Aalen Linear Regression Model (ALRM), respectively. Because the semi-parametric EHM assumes a Weibull distribution for the baseline hazard, the hazard estimated with this model is compared to the traditional Weibull model, while the hazard estimated with the non-parametric EHM is compared to ALRM, a well-known non-parametric covariate-based hazard model. Finally, the residual life of the resistance elements predicted by both EHMs is compared to the actual life data.

Relevance:

80.00%

Publisher:

Abstract:

Pedestrian movement is known to cause significant effects on indoor MIMO channels. In this paper, a statistical characterization of the indoor MIMO-OFDM channel subject to pedestrian movement is reported. The experiment used four transmitting and four receiving antennas and 114 sub-carriers at 5.2 GHz. Measurement scenarios varied from zero to ten pedestrians walking randomly between the transmitter (Tx) and receiver (Rx) arrays. The empirical cumulative distribution function (CDF) of the received fading envelope fits the Ricean distribution, with K factors ranging from 7 dB to 15 dB for the ten-pedestrian and vacant scenarios, respectively. In general, as the number of pedestrians increases, the CDF slope tends to decrease proportionally. Furthermore, as the number of pedestrians increases, increasing the multipath contribution, the dynamic range of channel capacity increases proportionally. These results are consistent with measurements obtained in controlled scenarios for a fixed narrowband Single-Input Single-Output (SISO) link at 5.2 GHz in previous work. The described empirical characterization provides insight into the prediction of human-body shadowing effects for indoor MIMO-OFDM channels at 5.2 GHz.
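The Ricean K-factor fitting step can be sketched with a standard moment-based estimator on a synthetic envelope. The estimator choice, the 10 dB target, and the sample parameters are illustrative assumptions, not the paper's measurement procedure:

```python
import numpy as np

def ricean_k_moment(envelope):
    """Moment-based Ricean K-factor estimator:
    g = Var(R^2) / E[R^2]^2, then K = sqrt(1-g) / (1 - sqrt(1-g))."""
    p = np.asarray(envelope, dtype=float) ** 2   # instantaneous power
    g = np.var(p) / np.mean(p) ** 2
    s = np.sqrt(max(1.0 - g, 0.0))
    return s / (1.0 - s)

rng = np.random.default_rng(2)
k_true = 10.0 ** (10.0 / 10.0)        # K = 10 dB, in linear units
sigma = 1.0                           # scatter (NLOS) std per I/Q component
nu = np.sqrt(2.0 * k_true) * sigma    # LOS amplitude giving K = nu^2 / (2 sigma^2)
n = 200000
env = np.hypot(nu + rng.normal(0.0, sigma, n), rng.normal(0.0, sigma, n))
k_hat_db = 10.0 * np.log10(ricean_k_moment(env))
```

A falling K (stronger scatter relative to the line-of-sight component) is what flattens the envelope CDF as more pedestrians enter the channel.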

Relevance:

80.00%

Publisher:

Abstract:

Since the availability of 3D full-body scanners and the associated software systems for operations with large point clouds, 3D anthropometry has been marketed as a breakthrough and milestone in ergonomic design. The assumptions made by the representatives of the 3D paradigm need to be critically reviewed, though. 3D anthropometry has advantages as well as shortfalls, which need to be carefully considered. While it is apparent that the measurement of a full-body point cloud allows for easier storage of raw data and improves quality control, the difficulties in calculating standardized measurements from the point cloud are widely underestimated. Early studies that used 3D point clouds to derive anthropometric dimensions showed unacceptable deviations from the standardized results measured manually. While 3D human point clouds provide a valuable tool to replicate specific individuals for further virtual studies, or to personalize garments, their use in ergonomic design must be critically assessed. Ergonomic, volumetric problems are defined by their two-dimensional boundaries or one-dimensional sections; a 1D/2D approach is therefore sufficient to solve an ergonomic design problem. As a consequence, all modern 3D human manikins are defined by the underlying anthropometric girths (2D) and lengths/widths (1D), which can be measured efficiently using manual techniques. Traditionally, ergonomists have taken a statistical approach to design for generalized percentiles of the population rather than for a single user. The underlying method is based on the distribution function of meaningful one- and two-dimensional anthropometric variables. Compared to these variables, the distribution of human volume has no ergonomic relevance. On the other hand, if volume is to be seen as a two-dimensional integral or distribution function of length and girth, the calculation of combined percentiles (a common ergonomic requirement) is undefined.
Consequently, we suggest critically reviewing the cost and use of 3D anthropometry. We also recommend making proper use of the widely available one- and two-dimensional anthropometric data in ergonomic design.