9 results for mean-variance estimation

in Digital Commons at Florida International University


Relevance: 80.00%

Abstract:

Since the seminal works of Markowitz (1952), Sharpe (1964), and Lintner (1965), numerous studies on portfolio selection and performance measurement have been based on the mean-variance framework. However, several researchers (e.g., Arditti (1967, 1971), Samuelson (1970), and Rubinstein (1973)) argue that the higher moments cannot be neglected unless there is reason to believe that (i) asset returns are normally distributed and the investor's utility function is quadratic, or (ii) the empirical evidence demonstrates that higher moments are irrelevant to the investor's decision. Based on this argument, this dissertation investigates the impact of the higher moments of return distributions on three issues concerning 14 international stock markets.

First, portfolio selection with skewness is determined using Polynomial Goal Programming, in which investor preferences for skewness can be incorporated. The empirical findings suggest that the return distributions of international stock markets are not normally distributed, and that incorporating skewness into an investor's portfolio decision causes a major change in the construction of the optimal portfolio. The evidence also indicates that an investor will trade expected portfolio return for skewness. Moreover, when short sales are allowed, investors are better off, as they attain higher expected return and skewness simultaneously.

Second, the performance of the international stock markets is evaluated using two types of performance measures: (i) the two-moment performance measures of Sharpe (1966) and Treynor (1965), and (ii) the higher-moment performance measures of Prakash and Bear (1986) and Stephens and Proffitt (1991). The empirical evidence indicates that the higher moments of return distributions are significant and relevant to the investor's decision, so the higher-moment performance measures are more appropriate for evaluating the performance of international stock markets. The evidence also indicates that the various measures provide vastly different performance rankings of the markets, albeit in the same direction.

Finally, the intertemporal stability of the international stock markets is investigated using the Parhizgari and Prakash (1989) algorithm for the Sen and Puri (1968) test, which accounts for non-normality of return distributions. The empirical findings provide strong evidence for stability in international stock market movements. However, when the Anderson test, which assumes normality of return distributions, is employed, stability in the correlation structure is rejected. This suggests that non-normality of the return distribution is an important factor that cannot be ignored in investigating the intertemporal stability of international stock markets.
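The three moments that drive the portfolio decision described above can be computed directly from a return series. The sketch below uses synthetic returns and arbitrary weights, not the dissertation's 14-market data, and shows how a candidate portfolio's mean, variance, and skewness would be evaluated inside such a selection procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic monthly returns for three hypothetical markets.
returns = rng.normal(0.01, 0.05, size=(120, 3))
w = np.array([0.5, 0.3, 0.2])  # candidate portfolio weights (sum to 1)

port = returns @ w             # portfolio return series
mean = port.mean()
variance = port.var(ddof=1)
# Skewness: third central moment standardized by the cube of the std dev.
skewness = np.mean((port - mean) ** 3) / port.std() ** 3

print(mean, variance, skewness)
```

A Polynomial Goal Programming formulation would then trade the variance objective against the skewness objective according to the investor's preference weights.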

Relevance: 30.00%

Abstract:

The contributions of this dissertation are the development of two new, interrelated approaches to video data compression: (1) a level-refined motion estimation and subband compensation method for effective motion estimation and motion compensation, and (2) a shift-invariant sub-decimation decomposition method that overcomes the deficiency of the decimation process in estimating motion, a deficiency that stems from the shift-variant property of the wavelet transform.

The enormous amount of data generated by digital video calls for efficient video compression techniques to conserve storage space and minimize bandwidth utilization. The main idea of video compression is to reduce the interpixel redundancies within and between video frames by applying motion estimation and motion compensation (MEMC) in combination with spatial transform coding. To locate the global minimum of the matching criterion function reasonably, hierarchical motion estimation with coarse-to-fine resolution refinement using the discrete wavelet transform is applied, owing to its intrinsic multiresolution and scalability properties.

Because most of the energy is concentrated in the low-resolution subbands and decreases in the high-resolution subbands, a new approach called the level-refined motion estimation and subband compensation (LRSC) method is proposed. It exploits possible intrablocks in the subbands for lower-entropy coding while keeping the computational load of motion estimation low, as in the level-refined method, thus achieving both temporal compression quality and computational simplicity.

Since circular convolution is applied in the wavelet transform to obtain the decomposed subframes without coefficient expansion, a symmetric-extended wavelet transform is designed for the finite-length frame signals, allowing more accurate motion estimation without discontinuous boundary distortions.

Although wavelet-transformed coefficients still contain spatial-domain information, motion estimation in the wavelet domain is not as straightforward as in the spatial domain because of the shift-variant property of the decimation process of the wavelet transform. A new approach called the sub-decimation decomposition method is proposed, which maintains motion consistency between the original frame and the decomposed subframes, consequently improving wavelet-domain video compression through shift-invariant motion estimation and compensation.
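The energy concentration that motivates starting motion estimation in the low-resolution subband can be illustrated with a one-level Haar analysis. This is a minimal sketch on a synthetic smooth signal; the dissertation's symmetric-extended transform and LRSC method are not reproduced here.

```python
import numpy as np

# One-level Haar analysis of a smooth (low-frequency) signal: the
# approximation (low-pass) subband captures almost all of the energy,
# which is why hierarchical motion estimation starts at the coarse level.
x = np.sin(np.linspace(0, 4 * np.pi, 64))
pairs = x.reshape(-1, 2)
low = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2)   # approximation coefficients
high = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2)  # detail coefficients

e_low, e_high = np.sum(low ** 2), np.sum(high ** 2)
# An orthogonal transform preserves total energy.
assert np.isclose(e_low + e_high, np.sum(x ** 2))
print(e_low / (e_low + e_high))  # close to 1 for smooth signals
```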

Relevance: 30.00%

Abstract:

Crash reduction factors (CRFs) are used to estimate the number of traffic crashes expected to be prevented by investment in safety improvement projects. The method used to develop CRFs in Florida has been based on the commonly used before-and-after approach, which suffers from a widely recognized problem known as regression-to-the-mean (RTM). The Empirical Bayes (EB) method has been introduced as a means of addressing the RTM problem. This method requires information from both the treatment and reference sites in order to predict the expected number of crashes had the safety improvement projects at the treatment sites not been implemented. The information from the reference sites is estimated from a safety performance function (SPF), a mathematical relationship that links crashes to traffic exposure.

The objective of this dissertation was to develop SPFs for the different functional classes of the Florida State Highway System. Crash data from 2001 through 2003, along with traffic and geometric data, were used in the SPF model development. SPFs for both rural and urban roadway categories were developed. The modeling data were based on one-mile segments with homogeneous traffic and geometric conditions within each segment; segments involving intersections were excluded. Scatter plots of the data show that the relationship between crashes and traffic exposure is nonlinear: crashes increase with traffic exposure at an increasing rate. Four regression models, namely Poisson (PRM), Negative Binomial (NBRM), zero-inflated Poisson (ZIP), and zero-inflated Negative Binomial (ZINB), were fitted to the one-mile segment records for the individual roadway categories. The best model was selected for each category based on a combination of the likelihood ratio test, the Vuong statistical test, and Akaike's Information Criterion (AIC).

The NBRM was found to be appropriate for only one category, while the ZINB was found to be more appropriate for six other categories. The overall results show that the Negative Binomial model generally provides a better fit to the data than the Poisson model, and that the ZINB model gives the best fit when the count data exhibit excess zeros and overdispersion, as they do for most of the roadway categories. While model validation shows that most data points fall within the 95% prediction intervals of the models developed, the Pearson goodness-of-fit measure does not show statistical significance. This is expected, as traffic volume is only one of many factors contributing to the overall crash experience, and the SPFs are to be applied in conjunction with Accident Modification Factors (AMFs) to further account for the safety impacts of major geometric features before arriving at the final crash prediction. With improved traffic and crash data quality, however, the crash prediction power of SPF models may be further improved.
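The EB combination of an SPF prediction with a site's observed count can be sketched in a few lines. The weighting below is the standard negative-binomial-based EB form (weight w = 1/(1 + k·μ)); the numbers are hypothetical and are not taken from the Florida data.

```python
def eb_expected_crashes(spf_pred, observed, overdispersion, years=1.0):
    """Empirical Bayes estimate combining an SPF prediction (crashes/year)
    with the observed crash count at a site, using the standard weighting
    based on the negative binomial overdispersion parameter k."""
    # Weight on the SPF prediction shrinks as the predicted total grows.
    w = 1.0 / (1.0 + overdispersion * spf_pred * years)
    return w * spf_pred * years + (1.0 - w) * observed

# Hypothetical segment: SPF predicts 4 crashes/yr, 9 observed in 1 yr, k = 0.5.
est = eb_expected_crashes(4.0, 9.0, 0.5)
print(round(est, 3))  # falls between the SPF prediction and the observed count
```

The EB estimate pulls the observed count back toward the SPF prediction, which is exactly how the method corrects for RTM at high-crash treatment sites.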

Relevance: 30.00%

Abstract:

Hurricanes are among the deadliest and costliest natural hazards affecting the Gulf coast and Atlantic coast areas of the United States. An effective way to minimize hurricane damage is to strengthen structures and buildings. The investigation of surface-level hurricane wind behavior and the resultant wind loads on structures aims to provide structural engineers with the information on hurricane wind characteristics required for the design of safe structures. Information on mean wind profiles, gust factors, turbulence intensity, integral scales, and turbulence spectra and co-spectra is essential for developing realistic models of wind pressure and wind loads on structures. The research performed for this study was motivated by the fact that considerably fewer data and validated models are available for tropical than for extratropical storms.

Using the surface wind measurements collected by the Florida Coastal Monitoring Program (FCMP) during hurricane passages over coastal areas, this study presents comparisons of surface roughness length estimates obtained by several estimation methods, along with estimates of the mean wind and turbulence structure of hurricane winds over coastal areas under neutral stratification conditions. In addition, a program has been developed and tested to systematically analyze Wall of Wind (WoW) data, which will make it possible to analyze the baseline characteristics of flows obtained in the WoW. This program can be used in future research to compare WoW data with FCMP data, as gust and turbulence generator systems and other flow management devices will be used to create WoW flows that match real hurricane wind conditions as closely as possible.

Hurricanes are defined as tropical cyclones in which the maximum 1-minute sustained surface wind speed exceeds 74 mph. The FCMP data include data for tropical cyclones with lower sustained speeds. However, the wind speeds analyzed in this study were sufficiently high to ensure that neutral stratification prevailed, and hence that the characteristics of those winds are similar to those prevailing in hurricanes. For this reason, the terms tropical cyclone and hurricane are used interchangeably in this study.
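Roughness length estimation from a mean wind profile commonly relies on the logarithmic law for neutral stratification, U(z) = (u*/κ)·ln(z/z0). The sketch below fits synthetic, illustrative profile data (not FCMP measurements) and recovers z0 from the intercept of a linear fit of U against ln z.

```python
import numpy as np

kappa = 0.4                                       # von Karman constant
z = np.array([5.0, 10.0, 20.0, 40.0])             # measurement heights (m)
u_star_true, z0_true = 1.2, 0.03                  # hypothetical friction velocity, z0
U = (u_star_true / kappa) * np.log(z / z0_true)   # synthetic mean speeds (m/s)

# U = (u*/kappa) * ln z - (u*/kappa) * ln z0, so a linear fit of U on ln z
# gives u* from the slope and z0 from the intercept.
slope, intercept = np.polyfit(np.log(z), U, 1)
u_star = slope * kappa
z0 = np.exp(-intercept / slope)
print(round(u_star, 3), round(z0, 4))
```

With real anemometer data the fit would use measured mean speeds at several heights, and the scatter of the residuals indicates how well the log-law assumption holds.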

Relevance: 30.00%

Abstract:

This dissertation aimed to improve travel time estimation for transportation planning by developing a travel time estimation method that incorporates the effects of signal timing plans, which have been difficult to consider in planning models. For this purpose, an analytical model was developed, with parameters calibrated against data from CORSIM microscopic simulation under signal timing plans optimized with the TRANSYT-7F software. The independent variables in the model are link length, free-flow speed, and the traffic volumes of the competing turning movements.

The developed model has three advantages over traditional link-based or node-based models. First, it considers the influence of signal timing plans for a variety of traffic volume combinations without requiring signal timing information as input. Second, it describes the non-uniform spatial distribution of delay along a link, and is thus able to estimate the impact of queues at different locations upstream of an intersection and to attribute delays to the subject link and the upstream link. Third, it shows promise of improving the accuracy of travel time prediction. The mean absolute percentage error (MAPE) of the model is 13% on a set of field data from the Minnesota Department of Transportation (MDOT), close to the 11% MAPE of the uniform delay in the HCM 2000 method. The HCM is the industry-accepted analytical model in the existing literature, but it requires signal timing information as input for calculating delays. The developed model also outperforms the HCM 2000 method on a set of Miami-Dade County data representing congested traffic conditions, with a MAPE of 29% compared to 31% for the HCM 2000 method. These advantages make the proposed model feasible for application to a large network without the burden of signal timing input, while improving the accuracy of travel time estimation. An assignment model using the developed travel time estimation method has been implemented in a South Florida planning model, where it improved the assignment results.
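The MAPE figures quoted above follow the usual definition; a minimal sketch with hypothetical link travel times (not the MDOT or Miami-Dade data):

```python
import numpy as np

def mape(observed, predicted):
    """Mean absolute percentage error, the accuracy measure quoted above."""
    observed = np.asarray(observed, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return 100.0 * np.mean(np.abs((observed - predicted) / observed))

# Hypothetical observed vs. predicted link travel times in seconds.
obs = [120.0, 95.0, 210.0]
pred = [132.0, 90.0, 200.0]
print(round(mape(obs, pred), 2))
```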

Relevance: 30.00%

Abstract:

This dissertation aims to improve the performance of existing assignment-based dynamic origin-destination (O-D) matrix estimation models, so that Intelligent Transportation Systems (ITS) strategies for traffic congestion relief and dynamic traffic assignment (DTA) can be applied successfully in transportation network modeling. The methodology framework has two advantages over existing assignment-based dynamic O-D matrix estimation models. First, it incorporates an initial O-D estimation model into the estimation process to provide a high-confidence initial input for the dynamic O-D estimation model, which has the potential to improve the final estimation results and reduce the associated computation time. Second, the proposed framework can automatically convert traffic volume deviation to traffic density deviation in the objective function under congested traffic conditions. Traffic density is a better indicator of traffic demand than traffic volume under congested conditions, so the conversion can improve estimation performance.

The proposed method performs better than a typical assignment-based estimation model (Zhou et al., 2003) in several case studies. In the case study for I-95 in Miami-Dade County, Florida, the proposed method produces a good result in seven iterations, with a root mean square percentage error (RMSPE) of 0.010 for traffic volume and an RMSPE of 0.283 for speed. In contrast, Zhou's model requires 50 iterations to obtain an RMSPE of 0.023 for volume and an RMSPE of 0.285 for speed. In the case study for Jacksonville, Florida, the proposed method reaches a convergent solution in 16 iterations, with an RMSPE of 0.045 for volume and an RMSPE of 0.110 for speed, while Zhou's model needs 10 iterations to obtain its best solution, with an RMSPE of 0.168 for volume and an RMSPE of 0.179 for speed. The successful application of the proposed framework to real road networks demonstrates its ability to provide results with satisfactory accuracy within a reasonable time, establishing its potential usefulness for supporting dynamic traffic assignment modeling, ITS, and other strategies.
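The RMSPE used to compare the two models follows the usual definition; a minimal sketch with hypothetical detector volumes (not the I-95 or Jacksonville data):

```python
import numpy as np

def rmspe(observed, estimated):
    """Root mean square percentage error, as reported for volume and speed."""
    observed = np.asarray(observed, dtype=float)
    estimated = np.asarray(estimated, dtype=float)
    return float(np.sqrt(np.mean(((estimated - observed) / observed) ** 2)))

# Hypothetical observed vs. estimated detector volumes (veh/hr).
obs = np.array([1800.0, 2400.0, 900.0])
est = np.array([1820.0, 2380.0, 910.0])
print(round(rmspe(obs, est), 4))
```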

Relevance: 30.00%

Abstract:

This research addresses the problem of cost estimation for product development in engineer-to-order (ETO) operations. An ETO operation starts the product development process with a product specification and ends with the delivery of a rather complicated, highly customized product. ETO operations are practiced in industries such as engineering tooling, factory plants, industrial boilers, pressure vessels, shipbuilding, bridges, and buildings. ETO views each product as a delivery item in an industrial project and needs an accurate estimate of its development cost at the bidding and/or planning stage, before any design or manufacturing activity starts.

Many ETO practitioners rely on an ad hoc approach to cost estimation, using past projects as references and adapting them to the new requirements. This process is often carried out case by case and in a non-procedural fashion, limiting its applicability to other industry domains and its transferability to other estimators. In addition to being time-consuming, this approach usually does not lead to an accurate cost estimate: errors vary from 30% to 50%.

This research proposes a generic cost modeling methodology applicable to ETO operations across industry domains. Using the proposed methodology, a cost estimator can develop a cost estimation model for a chosen ETO industry in a more expeditious, systematic, and accurate manner. The methodology was developed by following the meta-methodology outlined by Thomann. Deploying the methodology, cost estimation models were created in two industry domains (building construction and steel milling equipment manufacturing). The models were then applied to real cases; the resulting cost estimates are significantly more accurate than the actual estimates, with a mean absolute error rate of 17.3%.

This research fills an important need for quick and accurate cost estimation across various ETO industries. It differs from existing approaches in that a methodology is developed for quickly customizing a cost estimation model to a chosen application domain. Beyond more accurate estimation, its major contributions are its transferability to other users and its applicability to different ETO operations.
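As context for the 30% to 50% error of ad hoc estimation, one common ad hoc baseline is analogy with a past project scaled by a capacity exponent (the power-sizing rule). The sketch below is generic and illustrative; it is not the methodology developed in this research, and every number in it is hypothetical.

```python
def power_sizing_estimate(ref_cost, ref_size, new_size, exponent=0.6):
    """Power-sizing (capacity-exponent) analogy estimate: scale a reference
    project's cost by the size ratio raised to an empirical exponent.
    Shown only as a contrast to a systematic cost modeling methodology."""
    return ref_cost * (new_size / ref_size) ** exponent

# Hypothetical reference project: $2.0M for a 500-ton vessel; estimate 800 tons.
est = power_sizing_estimate(2.0e6, 500, 800, exponent=0.6)
print(round(est / 1e6, 3))  # estimate in millions of dollars
```

The exponent and the choice of reference project are exactly the judgment calls that make this kind of estimation hard to transfer between estimators, which is the gap the proposed methodology targets.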

Relevance: 30.00%

Abstract:

Given the importance of color processing in computer vision and computer graphics, estimating and rendering the illumination and spectral reflectance of image scenes is important for advancing a large class of applications, such as scene reconstruction, rendering, surface segmentation, object recognition, and reflectance estimation. Accordingly, this dissertation proposes effective methods for separating and rendering reflection components in single scene images. Based on the dichromatic reflectance model, a novel decomposition technique, the Mean-Shift Decomposition (MSD) method, is introduced to separate the specular from the diffuse reflectance components. This technique provides direct access to surface shape information through the isolation of diffuse shading pixels. More importantly, the process does not require any local color segmentation, which distinguishes it from traditional methods that operate by aggregating color information along each image plane.

Exploiting the merits of the MSD method, a scene illumination rendering technique is designed to estimate the relative contribution of the specular reflectance attributes of a scene image. The targeted image feature subset provides direct access to the surface illumination information, while a newly introduced, efficient rendering method reshapes the dynamic range distribution of the specular reflectance components over each image color channel. This image enhancement technique renders the scene illumination reflection effectively without altering the scene's surface diffuse attributes, contributing to realistic rendering effects.

As an ancillary contribution, an effective color constancy algorithm based on the dichromatic reflectance model was also developed. This algorithm selects image highlights in order to extract the prominent surface reflectance that reproduces the exact illumination chromaticity; the evaluation uses a novel voting scheme based on histogram analysis.

For each of the three main contributions, empirical evaluations were performed on synthetic and real-world image scenes taken from three different color image datasets. The experimental results show over 90% accuracy in illumination estimation, contributing to near-real-world illumination rendering effects.
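The dichromatic reflectance model underlying all three contributions expresses each pixel color as a linear combination of a diffuse (body) color and the illuminant (specular) color. The sketch below builds a synthetic pixel and recovers the two scale factors by least squares; it illustrates the model only and does not reproduce the MSD method.

```python
import numpy as np

# Dichromatic model: pixel = m_d * diffuse_color + m_s * illuminant_color.
diffuse = np.array([0.7, 0.3, 0.1])   # hypothetical body color (RGB)
illum = np.array([1.0, 1.0, 1.0])     # hypothetical white illuminant
m_d, m_s = 0.8, 0.4                   # geometric scale factors

pixel = m_d * diffuse + m_s * illum   # synthetic observed pixel

# With the illuminant known, the specular part can be separated by solving
# the overdetermined 3x2 linear system pixel = [diffuse | illum] @ [m_d, m_s].
A = np.column_stack([diffuse, illum])
coef, *_ = np.linalg.lstsq(A, pixel, rcond=None)
print(np.round(coef, 3))  # recovers [0.8, 0.4]
```

In practice neither the body color nor the illuminant is known per pixel, which is why decomposition methods such as MSD work from the statistics of many pixels rather than from a single linear solve.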
