13 results for HALF-NORMAL DISTRIBUTION

in Digital Commons at Florida International University


Relevance:

100.00%

Publisher:

Abstract:

Let (X, Y) be a bivariate normal random vector representing the responses to Treatment 1 and Treatment 2. Statistical inference about the bivariate normal distribution parameters is considered when both treatment samples contain missing data. Assuming the correlation coefficient ρ of the bivariate population is known, the MLEs of the population means and variance (ξ, η, and σ²) are obtained, and inferences about these parameters are presented. Procedures for constructing a confidence interval for the difference of population means ξ – η and for testing hypotheses about ξ – η are established. The performance of the new estimators and testing procedure is compared numerically with the method proposed in Looney and Jones (2003) through extensive Monte Carlo simulation. The simulation studies indicate that the testing power of the method proposed in this thesis is higher.
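
As an illustration of the data structure this abstract describes, the sketch below simulates paired responses in which some observations are missing in one coordinate. The parameter values and the simple available-case estimator of ξ – η shown here are illustrative assumptions, not the thesis's ρ-based MLE.

```python
import numpy as np

rng = np.random.default_rng(0)
rho, xi, eta, sigma = 0.6, 1.0, 0.5, 1.0   # assumed true parameters

# n complete pairs, plus m1 extra X-only and m2 extra Y-only observations
n, m1, m2 = 50, 20, 20
cov = sigma**2 * np.array([[1.0, rho], [rho, 1.0]])
pairs = rng.multivariate_normal([xi, eta], cov, size=n)
x_extra = rng.normal(xi, sigma, size=m1)   # Y missing for these units
y_extra = rng.normal(eta, sigma, size=m2)  # X missing for these units

# available-case estimate of xi - eta (a baseline, not the thesis's MLE,
# which additionally exploits the known rho)
xi_hat = np.concatenate([pairs[:, 0], x_extra]).mean()
eta_hat = np.concatenate([pairs[:, 1], y_extra]).mean()
diff_hat = xi_hat - eta_hat
```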

Relevance:

100.00%

Publisher:

Abstract:

The multivariate normal distribution is commonly encountered in every field, and missing values are a frequent issue in practice. The purpose of this research was to estimate the parameters of the three-dimensional permutation-symmetric normal distribution with complete data and with all possible patterns of incomplete data. In this study, MLEs with missing data were derived, and the properties of the MLEs as well as their sampling distributions were obtained. A Monte Carlo simulation study was used to evaluate the performance of the considered estimators both when ρ was known and when it was unknown. All results indicated that, compared with estimators obtained by omitting observations with missing data, the estimators derived in this article performed better. Furthermore, when ρ was unknown, using an estimate of ρ led to the same conclusion.

Relevance:

90.00%

Publisher:

Abstract:

The lognormal distribution has abundant applications in various fields. In the literature, most inferences on its two parameters are based on Type-I censored sample data. However, exact measurements are not always attainable, especially when an observation lies below or above the detection limits; instead, only the numbers of measurements falling into predetermined intervals can be recorded. This is so-called grouped data. In this paper, we show the existence and uniqueness of the maximum likelihood estimators of the two parameters of the underlying lognormal distribution with Type-I censored data and with grouped data. The proof is first established for the normal distribution and then extended to the lognormal distribution through the invariance property. The results are applied to estimate the median and mean of the lognormal population.
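
A minimal sketch of maximum likelihood for a Type-I (left-)censored lognormal sample, under assumed parameter values and an assumed detection limit; a crude grid search stands in for a proper optimizer, and the median/mean recovery uses the invariance property mentioned above.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(1)
mu_true, sigma_true, L = 0.0, 0.5, 0.5       # L: assumed detection limit
x = rng.lognormal(mu_true, sigma_true, size=500)
obs = x[x >= L]                              # exact measurements
n_cens = int(np.sum(x < L))                  # only the count below L is kept

def loglik(mu, s):
    # exact observations contribute the lognormal log-density;
    # censored ones contribute log P(X < L)
    z = (np.log(obs) - mu) / s
    ll_obs = np.sum(-0.5 * z**2 - np.log(s) - np.log(obs)) \
             - obs.size * 0.5 * np.log(2 * np.pi)
    zc = (np.log(L) - mu) / s
    Phi = 0.5 * (1 + erf(zc / sqrt(2)))      # standard normal CDF at the limit
    return ll_obs + n_cens * np.log(Phi)

# crude grid search stands in for a proper optimizer
mus = np.linspace(-0.5, 0.5, 101)
sigmas = np.linspace(0.2, 1.0, 81)
grid = [(loglik(m, s), m, s) for m in mus for s in sigmas]
_, mu_hat, sigma_hat = max(grid)
median_hat = np.exp(mu_hat)                   # lognormal median
mean_hat = np.exp(mu_hat + sigma_hat**2 / 2)  # lognormal mean
```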

Relevance:

90.00%

Publisher:

Abstract:

Suppose two or more variables are jointly normally distributed. If there is a common relationship among these variables, it is important to quantify it by the correlation coefficient, which measures its strength; the coefficient can then be used to develop a prediction equation and, ultimately, to draw testable conclusions about the parent population. This research focused on the correlation coefficient ρ for the bivariate and trivariate normal distributions under equal variances and equal covariances. In particular, we derived the maximum likelihood estimators (MLEs) of the distribution parameters, assuming all of them unknown, and studied the properties and asymptotic distribution of the MLE ρ̂. Using this asymptotic normality, we were able to construct confidence intervals for the correlation coefficient ρ and to test hypotheses about ρ. With a series of simulations, the performance of the new estimators was studied and compared with that of estimators already in the literature. The results indicated that the MLE performs as well as or better than the others.
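
Under the equal-variance, equal-covariance (compound-symmetry) structure this abstract studies, the MLE of ρ can be sketched by averaging the diagonal and off-diagonal entries of the 1/n-normalized sample covariance matrix. The trivariate parameter values below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
p, n = 3, 1000
rho_true, sigma2 = 0.4, 2.0
mu = np.array([1.0, 2.0, 3.0])
# compound-symmetry covariance: equal variances, equal covariances
Sigma = sigma2 * ((1 - rho_true) * np.eye(p) + rho_true * np.ones((p, p)))
X = rng.multivariate_normal(mu, Sigma, size=n)

# MLE under compound symmetry: average the diagonal and off-diagonal
# entries of the 1/n-normalized sample covariance matrix
S = np.cov(X, rowvar=False, bias=True)
sigma2_hat = np.mean(np.diag(S))
off = S[np.triu_indices(p, k=1)]
rho_hat = np.mean(off) / sigma2_hat
```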

Relevance:

80.00%

Publisher:

Abstract:

Prices of U.S. Treasury securities vary over time and across maturities. When the market in Treasurys is sufficiently complete and frictionless, these prices may be modeled by a function of time and maturity. A cross-section of this function with time held fixed is called the yield curve; the aggregate of these sections is the evolution of the yield curve. This dissertation studies aspects of this evolution.

There are two complementary approaches to the study of yield curve evolution here: principal components analysis and wavelet analysis. In both approaches the time and maturity variables are discretized. In principal components analysis, the vectors of yield curve shifts are viewed as observations from a multivariate normal distribution. Their covariance matrix is diagonalized, and the resulting eigenvalues and eigenvectors (the principal components) are used to draw inferences about the yield curve evolution.

In wavelet analysis, the vectors of shifts are resolved into hierarchies of localized fundamental shifts (wavelets) that leave specified global properties invariant (average change and duration change). The hierarchies reflect the degree of localization, with movements restricted to a single maturity at the base and general movements at the apex. Second-generation wavelet techniques allow better adaptation of the model to economic observables. Statistically, the wavelet approach is inherently nonparametric, while the wavelets themselves are better adapted to describing a complete market.

Principal components analysis provides information on the dimension of the yield curve process. While there is no clear demarcation between operative factors and noise, the top six principal components pick up 99% of total interest rate variation 95% of the time. An economically justified basis for this process is hard to find; for example, a simple linear model will not suffice for the first principal component, and the shape of this component is nonstationary.

Wavelet analysis works more directly with yield curve observations than principal components analysis. In fact, the complete process from bond data to multiresolution is presented, including the dedicated Perl programs and the details of the portfolio metrics and specially adapted wavelet construction. The result is more robust statistics, which balance the more fragile principal components analysis.
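
The principal-components step can be sketched as follows; the synthetic "level plus slope" shift data is a stand-in assumption, not the dissertation's bond data.

```python
import numpy as np

rng = np.random.default_rng(3)
# synthetic stand-in for daily yield-curve shifts at 8 maturities:
# a dominant "level" factor, a weaker "slope" factor, plus noise
n_days, n_mats = 500, 8
t = np.linspace(0, 1, n_mats)
level = rng.normal(0, 0.10, size=(n_days, 1)) * np.ones((1, n_mats))
slope = rng.normal(0, 0.04, size=(n_days, 1)) * (t - t.mean())
shifts = level + slope + rng.normal(0, 0.01, size=(n_days, n_mats))

# principal components: diagonalize the covariance of the shift vectors
C = np.cov(shifts, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(C)        # eigenvalues in ascending order
explained = eigvals[::-1] / eigvals.sum()   # variance share per component
top2 = explained[:2].sum()                  # share captured by the top two
```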

Relevance:

80.00%

Publisher:

Abstract:

Road pricing has emerged as an effective means of managing road traffic demand while simultaneously raising additional revenue for transportation agencies. Research on the factors that govern travel decisions has shown that user preferences may be a function of the demographic characteristics of individuals and of perceived trip attributes. However, it is not clear which trip attributes are actually considered in the travel decision-making process, how these attributes are perceived by travelers, and how the set of trip attributes changes by time of day or from day to day. In this study, operational Intelligent Transportation Systems (ITS) archives are mined and the aggregated preferences for a priced system are extracted at a fine time-aggregation level for an extended number of days. The resulting information is related to corresponding time-varying trip attributes such as travel time, travel time reliability, charged toll, and other parameters. The time-varying user preferences and trip attributes are linked by a binary choice (logit) model with a linear utility function on the trip attributes. The trip-attribute weights in the utility function are then dynamically estimated for each time of day by means of an adaptive, limited-memory discrete Kalman filter (ALMF). The relationship between traveler choices and travel time is assessed using different rules to capture the logic that best represents traveler perception and the effect of real-time information on the observed preferences. The impact of travel time reliability on traveler choices is investigated under its multiple definitions. Based on the results, the ALMF algorithm allows robust estimation of the time-varying weights in the utility function at fine time-aggregation levels. High correlations among the trip attributes severely constrain the simultaneous estimation of their weights. Despite the data limitations, the ALMF algorithm can provide stable estimates of the choice parameters for some periods of the day. Finally, the daily variation of user sensitivities for different periods of the day resembles a well-defined normal distribution.
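
A rough sketch of the filtering idea: a binary logit whose utility weights follow a random walk, updated by a linearized (extended) Kalman filter. The data-generating coefficients, noise levels, and the plain EKF shown here are illustrative assumptions; the dissertation's ALMF additionally adapts and limits the filter's memory.

```python
import numpy as np

rng = np.random.default_rng(4)

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

# synthetic data: choice probability depends on toll and time saved, with
# a slowly drifting toll sensitivity (all coefficients are assumptions)
T = 2000
toll = rng.uniform(0.5, 3.0, T)
saved = rng.uniform(2.0, 15.0, T)
beta_toll = -1.0 + 0.5 * np.sin(np.linspace(0, 2 * np.pi, T))
beta_saved = 0.15
y = rng.random(T) < sigmoid(beta_toll * toll + beta_saved * saved)

# extended Kalman filter on the two utility weights (random-walk state)
b = np.zeros(2)                  # state: [toll weight, time-saved weight]
P = np.eye(2)                    # state covariance
Q = np.eye(2) * 1e-4             # process noise: lets the weights drift
est = np.empty((T, 2))
for t in range(T):
    x = np.array([toll[t], saved[t]])
    P = P + Q                    # predict step (random-walk state)
    p = sigmoid(b @ x)           # predicted choice probability
    H = p * (1 - p) * x          # gradient of p w.r.t. the weights
    S = H @ P @ H + p * (1 - p)  # innovation variance (Bernoulli)
    K = P @ H / S                # Kalman gain
    b = b + K * (y[t] - p)       # update with the observed choice
    P = P - np.outer(K, H @ P)
    est[t] = b
```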

Relevance:

80.00%

Publisher:

Abstract:

This dissertation examines the quality of hazard mitigation elements in a coastal, hazard-prone state. I answer two questions. First, in a state with a strong mandate for hazard mitigation elements in comprehensive plans, does plan quality differ among county governments? Second, if such variation exists, what drives it? My research focuses primarily on Florida's 35 coastal counties, which are all at risk for hurricane and flood hazards and all fall under Florida's mandate to have a comprehensive plan that includes a hazard mitigation element. Research methods included document review to rate the hazard mitigation elements of all 35 coastal county plans and subsequent analysis against demographic and hazard-history factors. Following this, I conducted an electronic, nationwide survey of planning professionals and academics, informed by interviews of planning leaders in Florida counties. I found that hazard mitigation element quality varied widely among the 35 Florida coastal counties but was close to a normal distribution; no plans were of exceptionally high quality. Overall, historical hazard effects did not correlate with hazard mitigation element quality, but some demographic variables associated with urban populations did. The variance in hazard mitigation element quality indicates that while state law may mandate, and even prescribe, hazard mitigation in local comprehensive plans, not all plans will result in equal, or even adequate, protection for people. Furthermore, the mixed correlations with demographic variables representing social and disaster vulnerability show that, at least at the county level, vulnerability to hazards does not have a strong effect on hazard mitigation element quality. From a theory perspective, my research is significant because it compares assumptions about vulnerability based on hazard history and demographics to plan quality. The only vulnerability-related variables that appeared to correlate, and then only mildly, with hazard mitigation element quality were those typically representing more urban areas. In terms of Neo-Institutionalism and theories of learning organizations, my research shows that planning departments appear to have set norms and rules of operating that preclude both significant public involvement and learning from prior hazard events.

Relevance:

80.00%

Publisher:

Abstract:

Goodness-of-fit tests have been studied by many researchers. Among them, an alternative statistical test for uniformity was proposed by Chen and Ye (2009). The test was used by Xiong (2010) to test normality in the case where both the location and scale parameters of the normal distribution are known. The purpose of the present thesis is to extend that result to the case where the parameters are unknown. A table of critical values of the test statistic is obtained by Monte Carlo simulation. The performance of the proposed test is compared with the Shapiro-Wilk test and the Kolmogorov-Smirnov test. Monte Carlo simulation results show that the proposed test performs better than the Kolmogorov-Smirnov test in many cases. The Shapiro-Wilk test remains the most powerful, although in some cases the test proposed in the present research performs better.
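
The Monte Carlo tabulation of critical values can be sketched as below. The statistic shown is the familiar Kolmogorov-Smirnov distance with estimated parameters (a stand-in, since Chen and Ye's statistic is not reproduced here); the sample size, replication count, and significance levels are assumptions.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(5)

def norm_cdf(z):
    # standard normal CDF via the error function
    return np.array([0.5 * (1 + erf(v / sqrt(2))) for v in z])

def ks_stat(x):
    # KS distance between the empirical CDF and a normal CDF fitted to
    # the same sample (both parameters estimated from the data)
    n = x.size
    z = np.sort((x - x.mean()) / x.std(ddof=1))
    F = norm_cdf(z)
    i = np.arange(1, n + 1)
    return max(np.max(i / n - F), np.max(F - (i - 1) / n))

# Monte Carlo table of critical values under the null (normal data)
n, reps = 30, 2000
stats = np.array([ks_stat(rng.normal(size=n)) for _ in range(reps)])
crit = {a: np.quantile(stats, 1 - a) for a in (0.10, 0.05, 0.01)}
```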


Relevance:

40.00%

Publisher:

Abstract:

Total soil-mercury and phosphorus concentrations were determined at 64 sites in the southern half of Water Conservation Area 3A, an area of approximately 500 km². Surface soil-Hg concentrations ranged from 117 to 300 ng·g⁻¹; total phosphorus concentrations ranged from 350 to 850 µg·g⁻¹. No consistent north-south or east-west trends were found in the mercury or phosphorus surface concentrations when normalized to soil bulk density. Nine sites were used to determine the vertical distribution of soil mercury. Vertical profiles of soil Hg showed concentrations decreasing with depth and correlated well with the phosphorus profiles. The mercury concentrations in soil profiles may be interpreted as an increase in the rate of mercury deposition in the region in recent decades and/or as post-depositional mobilization of mercury to the surface layers.
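
The bulk-density normalization mentioned above amounts to converting a mass-based concentration to a volume basis; the numbers below are hypothetical, chosen only to show the arithmetic.

```python
# converting a mass-based soil-Hg concentration (ng per g of soil) to a
# volume basis (ng per cm^3) using bulk density; values are hypothetical
hg_ng_per_g = 200.0        # within the 117-300 ng/g range reported
bulk_density_g_cm3 = 0.15  # assumed peat bulk density
hg_ng_per_cm3 = hg_ng_per_g * bulk_density_g_cm3
```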

Relevance:

30.00%

Publisher:

Abstract:

A novel biocompatible and biodegradable polymer, termed poly(glycerol malate co-dodecanedioate) (PGMD), was prepared by a thermal condensation method and used for the fabrication of nanoparticles (NPs). PGMD NPs were prepared using the single-oil-emulsion technique and loaded with an imaging/hyperthermia agent (IR820) and a chemotherapeutic agent (doxorubicin, DOX). The sizes of the void PGMD NPs, IR820-PGMD NPs, and DOX-IR820-PGMD NPs were approximately 90 nm, 110 nm, and 125 nm, respectively. An acidic environment (pH 5.0) induced higher DOX and IR820 release than pH 7.4. DOX release was also enhanced by laser exposure, which increased the temperature to 42°C. Cytotoxicity of DOX-IR820-PGMD NPs was comparable in MES-SA cells but higher in Dx5 cells compared to free DOX plus IR820 (p<0.05). The combination of hyperthermia (HT) and chemotherapy improved cytotoxicity in both cell lines. We also explored the cellular response after rapid, short-term, low-thermal-dose (laser/Dye/NP) heating and compared it with slow, long-term, high-thermal-dose cell incubator heating by investigating reactive oxygen species (ROS) levels and the expression of hypoxia-inducible factor-1α (HIF-1α) and vascular endothelial growth factor (VEGF). The cytotoxicity of IR820-PGMD NPs after laser/Dye/NP HT resulted in greater cancer cell killing than incubator HT. ROS levels and HIF-1α and VEGF expression were elevated under incubator HT but maintained at baseline under laser/Dye/NP HT. In vivo mouse studies showed that the NP formulation significantly improved the plasma half-life of IR820 after tail-vein injection, and significantly lower IR820 content was observed in the kidney with DOX-IR820-PGMD NP treatment than with free IR820 treatment in our biodistribution studies (p<0.05). In conclusion, both IR820-PGMD NPs and DOX-IR820-PGMD NPs were successfully developed and used for both imaging and therapeutic purposes. Rapid, short-term laser/Dye/NP HT with a low thermal dose did not up-regulate HIF-1α or VEGF expression, whereas slow, long-term incubator HT with a high thermal dose enhanced the expression of both.

Relevance:

30.00%

Publisher:

Abstract:

Theoretical research and specific-surface-area analysis of nitrogen adsorption have indicated that many structural micropores exist in sepiolite mineral fibers. However, the micropore size, form, and the distribution relationship between the microporous structures had not been established. In this paper, cross-sectional TEM samples of the nanofibers were prepared using the metal embedding and cutting technique, and the inner structure of the sepiolite nanofibers was observed by TEM. The results showed that sepiolite fibers have a multilayer structure resembling concentric circles, and that many micropores about 2-5 nm in size lie normal and parallel to the -axis. The reason for this phenomenon was explained using BET analysis and X-ray diffraction results.
