948 results for Random Coefficient Autoregressive Model (RCAR(1))
Abstract:
Many destination marketing organizations in the United States and elsewhere are facing budget retrenchment for tourism marketing, especially for advertising. This study evaluates a three-stage model using a Random Coefficient Logit (RCL) approach, which controls for correlations between non-independent alternatives and accommodates heterogeneity in individuals' responses to advertising. The results indicate that the proposed RCL model fits significantly better than traditional logit models, and that tourism advertising significantly influences tourist decisions, with several variables (age, income, distance and Internet access) moderating these decisions differently depending on decision stage and product type. These findings suggest that this approach provides a better foundation for assessing, and in turn designing, more effective advertising campaigns.
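As a hedged illustration of the core mechanic of such a model, the sketch below simulates mixed-logit (random coefficient) choice probabilities with normally distributed coefficients; the covariates, means and spreads are hypothetical and not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def rcl_choice_prob(X, beta_mean, beta_sd, n_draws=500):
    """Simulated mixed-logit choice probabilities for one individual.
    X is (alternatives x covariates); the coefficient vector is drawn
    from N(beta_mean, beta_sd^2) on each draw, so taste heterogeneity
    is averaged out by simulation."""
    probs = np.zeros(X.shape[0])
    for _ in range(n_draws):
        beta = rng.normal(beta_mean, beta_sd)   # one random-coefficient draw
        u = X @ beta                            # systematic utilities
        e = np.exp(u - u.max())                 # numerically stable softmax
        probs += e / e.sum()
    return probs / n_draws

# Hypothetical alternatives described by (ad exposure, travel distance):
X = np.array([[1.0, 2.0], [0.5, 1.0], [0.0, 0.5]])
print(rcl_choice_prob(X, beta_mean=[0.8, -0.3], beta_sd=[0.4, 0.1]))
```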
Abstract:
In this paper we present a novel method for emulating a stochastic, or random output, computer model and show its application to a complex rabies model. The method is evaluated both in terms of accuracy and computational efficiency on synthetic data and the rabies model. We address the issue of experimental design and provide empirical evidence on the effectiveness of utilizing replicate model evaluations compared to a space-filling design. We employ the Mahalanobis error measure to validate the heteroscedastic Gaussian-process-based emulator predictions for both the mean and (co)variance. The emulator allows efficient screening to identify important model inputs and a better understanding of the complex behaviour of the rabies model.
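A minimal sketch of the validation measure named above, assuming the emulator returns a predictive mean vector and covariance matrix for a set of held-out simulator runs (all names hypothetical):

```python
import numpy as np

def mahalanobis_error(y, mu, cov):
    """Mahalanobis error between held-out simulator outputs y and the
    emulator's predictive mean mu and covariance cov; for a well-
    calibrated emulator it is close to len(y) in expectation."""
    r = y - mu
    return float(r @ np.linalg.solve(cov, r))
```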
Estimation of productivity in Korean electric power plants: a semiparametric smooth coefficient model
Abstract:
This paper analyzes the impact of load factor, facility and generator types on the productivity of Korean electric power plants. In order to capture important differences in the effect of load policy on power output, we use a semiparametric smooth coefficient (SPSC) model that captures heterogeneous performance across power plants and over time by allowing the underlying technologies to be heterogeneous. The SPSC model accommodates both continuous and discrete covariates. Various specification tests are conducted to assess the performance of the SPSC model. Using a unique generator-level panel dataset spanning the period 1995-2006, we find that the impact of load factor, generator and facility types on power generation varies substantially in magnitude and significance across plant characteristics. The results have strong implications for generation policy in Korea, as outlined in this study.
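A hedged sketch of how a smooth coefficient beta(z) can be estimated, assuming a single continuous smoothing covariate z and a Gaussian kernel; bandwidth and names are illustrative, not the paper's estimator.

```python
import numpy as np

def smooth_coef(z0, y, X, z, h=0.5):
    """Kernel-weighted least-squares estimate of beta(z0) in the
    smooth coefficient model y_i = x_i' beta(z_i) + e_i
    (Gaussian kernel, bandwidth h)."""
    w = np.exp(-0.5 * ((z - z0) / h) ** 2)   # kernel weights around z0
    Xw = X * w[:, None]
    return np.linalg.solve(X.T @ Xw, Xw.T @ y)
```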
Abstract:
The use of human brain electroencephalography (EEG) signals for automatic person identification has been investigated for a decade. The performance of an EEG-based person identification system depends heavily on which features are extracted from the multi-channel EEG signals. Linear methods such as Power Spectral Density and the Autoregressive Model have been used to extract EEG features; however, these methods assume that EEG signals are stationary. In fact, EEG signals are complex, non-linear, non-stationary, and random in nature. In addition, other factors such as brain condition or human characteristics may affect performance, yet these factors have not been investigated and evaluated in previous studies. In the literature, entropy is used to measure the randomness of non-linear time series data, and also to measure the level of chaos in brain-computer interface systems. This thesis therefore studies the role of entropy in non-linear analysis of EEG signals to discover new features for EEG-based person identification. Five different entropy methods, namely Shannon Entropy, Approximate Entropy, Sample Entropy, Spectral Entropy, and Conditional Entropy, are proposed to extract entropy features, which are used to evaluate the performance of EEG-based person identification systems and the impact of epilepsy, alcohol, age and gender characteristics on these systems. Experiments were performed on the Australian EEG and Alcoholism datasets. The experimental results show that, in most cases, the proposed entropy features yield very fast person identification with comparable accuracy, because the feature dimension is low; in real-life security operations, timely response is critical. The results also show that epilepsy, alcohol, age and gender characteristics affect EEG-based person identification systems.
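Of the five methods listed, Sample Entropy has a compact definition that a short sketch can make concrete. This is a generic implementation of the standard formula, not the thesis's code; the default tolerance r = 0.2 * SD is a common convention assumed here.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample Entropy of a 1-D signal: -ln(A/B), where B counts template
    pairs of length m within Chebyshev tolerance r, and A the same for
    length m + 1. r defaults to the common 0.2 * SD convention."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()
    def pairs(mm):
        emb = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        d = np.max(np.abs(emb[:, None] - emb[None, :]), axis=2)
        return (np.sum(d <= r) - len(emb)) / 2   # drop self-matches
    return -np.log(pairs(m + 1) / pairs(m))
```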
Abstract:
In this study, the production efficiency of catfish farming in Cross River State was determined. Data were obtained from 120 fish farmers randomly selected from the Cross River Agricultural Zones using a multistage random sampling technique. Multiple regression analysis was the main tool of data analysis, with several functional forms tried. The results indicated that the Cobb-Douglas production function gave the best fit in explaining the relationship between catfish output and the inputs used; the coefficient of multiple determination (R² = 0.61) indicates that sixty-one percent of the variability in catfish output is explained by the independent variables. The results also indicate that farmers' educational level positively influences their efficiency in catfish production in the study area. The F-value of 16.427 indicates the overall significance of the model at the 1 percent level, implying a significant linear relationship between the independent variables taken together and the yield of catfish produced in Cross River State. The marginal value products of fish pond size (farm size), labour and feed (diet) were N67.50, N178.13 and N728.00 respectively, while the allocative efficiency ratios for farm size, labour and feed were 0.09 (over-utilized), 2.85 (under-utilized) and 0.99 (over-utilized), respectively. Since allocative inefficiency existed, there is high potential for catfish farmers to increase their yields and income. Based on the findings of this study, it is recommended that fish farmers expand their farms, improve production efficiency and adopt new technologies. Regular awareness campaigns about new technologies in fish farming should be embarked upon by extension agents so that fish farmers appreciate the importance of adopting them. KEYWORDS: Production efficiency, Catfish, Cobb-Douglas, Production function, Cross River State
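A hedged sketch of the estimation logic behind such figures, on entirely illustrative data: the Cobb-Douglas function is linear in logs, and the marginal value product of input j is b_j * (y / x_j) * p_y, which divided by the input's unit price gives the allocative efficiency ratio reported above (< 1 over-utilized, > 1 under-utilized).

```python
import numpy as np

# Entirely illustrative data for n farms; none of these numbers come
# from the study.
rng = np.random.default_rng(1)
n = 120
pond, labour, feed = rng.lognormal(size=(3, n))
output = np.exp(0.5 + 0.1 * np.log(pond) + 0.4 * np.log(labour)
                + 0.3 * np.log(feed) + 0.2 * rng.standard_normal(n))

# Cobb-Douglas is linear in logs: ln y = ln A + sum_j b_j ln x_j + e.
Z = np.column_stack([np.ones(n), np.log(pond), np.log(labour), np.log(feed)])
b, *_ = np.linalg.lstsq(Z, np.log(output), rcond=None)

# MVP of input j at the means, with a hypothetical output price p_y;
# MVP divided by the input price is the allocative efficiency ratio.
p_y = 1.0
mvp_labour = b[2] * output.mean() / labour.mean() * p_y
print(np.round(b, 2), round(mvp_labour, 2))
```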
Abstract:
The papers included in this thesis deal with a few aspects of insurance economics that have seldom been dealt with in the applied literature. In the first paper I apply, for the first time, the tools of the economics of crime to study the determinants of fraud, using data on Italian provinces. The contributions to the literature are manifold: the price of insurance is positively correlated with the propensity to defraud; social norms constrain fraudulent behavior, but their strength is curtailed in economic downturns; and I apply a simple extension of the Random Coefficient model which allows for time-invariant covariates and asymmetries in the impact of the regressors. The second paper assesses how the evolution of macro-prudential regulation of insurance companies has been reflected in their equity prices. I employ a standard event study methodology, deriving the definition of the "control" and "treatment" groups from what is implied by the regulatory framework. The main results are that markets care about the evolution of the legislation, with their perception shifting from an initially positive assessment of a possible implicit "too big to fail" subsidy to a more negative one related to its cost in terms of stricter capital requirements, and that the size of this phenomenon is positively related to the leverage, size and geographical location of the insurance companies. The third paper introduces a novel methodology to forecast non-life insurance premiums and profitability as a function of macroeconomic variables, using the simultaneous equation framework traditionally employed in macroeconometric models and a simple theoretical model of insurance pricing to derive a long-term relationship between premiums, claims expenses and short-term rates. The model is shown to provide better forecasts of premiums and profitability than the single-equation specifications commonly used in applied analysis.
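A minimal sketch of the standard event study machinery mentioned in the second paper (not the author's code; the window lengths and data are hypothetical): abnormal returns come from a market-model fit over an estimation window.

```python
import numpy as np

def abnormal_returns(r_stock, r_market, est, event):
    """Market-model event study: fit r_i = a + b * r_m over the estimation
    window, then compute abnormal returns AR_t = r_i,t - (a + b * r_m,t)
    over the event window and their cumulative sum (CAR)."""
    b, a = np.polyfit(r_market[est], r_stock[est], 1)
    ar = r_stock[event] - (a + b * r_market[event])
    return ar, ar.cumsum()

# Hypothetical usage: 250-day estimation window, 11-day event window.
rng = np.random.default_rng(2)
rm = 0.01 * rng.standard_normal(300)
ri = 0.8 * rm + 0.005 * rng.standard_normal(300)
ar, car = abnormal_returns(ri, rm, slice(0, 250), slice(250, 261))
```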
Abstract:
Universidade Estadual de Campinas. Faculdade de Educação Física
Abstract:
OBJECTIVES: The complexity and heterogeneity of human bone, as well as ethical issues, frequently hinder the development of clinical trials. The purpose of this in vitro study was to determine the modulus of elasticity of an isotropic experimental polyurethane model via tension tests, comparing the results to those reported in the literature for mandibular bone, in order to validate the use of such a model in lieu of mandibular bone in biomechanical studies. MATERIAL AND METHODS: Forty-five polyurethane test specimens were divided into 3 groups of 15 specimens each, according to the ratio (A/B) of polyurethane reagents (PU-1: 1/0.5, PU-2: 1/1, PU-3: 1/1.5). RESULTS: Tension tests were performed on each experimental group and the moduli of elasticity found were 192.98 MPa (SD=57.20) for PU-1, 347.90 MPa (SD=109.54) for PU-2 and 304.64 MPa (SD=25.48) for PU-3. CONCLUSION: The reagent ratio of choice for building the experimental model was 1/1.
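For context, the quantity measured here is the slope of the stress-strain curve in its linear region; a minimal sketch, assuming force-elongation readings from a tension test (all parameter names hypothetical):

```python
import numpy as np

def youngs_modulus(force_N, elong_mm, area_mm2, gauge_mm):
    """Modulus of elasticity from a tension test: the slope of the
    stress-strain curve (stress = F/A, strain = dL/L0) in its linear
    region. N/mm^2 is numerically equal to MPa."""
    stress = np.asarray(force_N) / area_mm2
    strain = np.asarray(elong_mm) / gauge_mm
    return np.polyfit(strain, stress, 1)[0]
```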
Abstract:
The aim of this study was to examine the effects of low carbohydrate (CHO) availability on heart rate variability (HRV) responses during moderate- and severe-intensity exercise until exhaustion. Six healthy males (age, 26.5 +/- 6.7 years; body mass, 78.4 +/- 7.7 kg; body fat, 11.3 +/- 4.5%; VO2max, 39.5 +/- 6.6 mL kg-1 min-1) volunteered for this study. All tests were performed in the morning, after 8-12 h overnight fasting, at a moderate intensity corresponding to 50% of the difference between the first (LT1) and second (LT2) lactate breakpoints and at a severe intensity corresponding to 25% of the difference between the maximal power output and LT2. Forty-eight hours before each experimental session, the subjects performed a 90-min cycling exercise followed by a 5-min rest period and subsequent 1-min cycling bouts at 125% VO2max (with 1-min rest periods) until exhaustion, in order to deplete muscle glycogen. A diet providing 10% (CHO-low) or 65% (CHO-control) of energy as carbohydrates was consumed for the following 2 days until the experimental test. Poincaré plots (standard deviations 1 and 2: SD1 and SD2, respectively) and a spectral autoregressive model (low frequency, LF, and high frequency, HF) were applied to obtain HRV parameters. CHO availability had no effect on the HRV parameters or ventilation during moderate-intensity exercise. However, the SD1 and SD2 parameters, taken at exhaustion during the severe-intensity exercise, were significantly higher in CHO-low than in CHO-control (P < 0.05). The HF and LF power (ms2) were also significantly higher in CHO-low than in CHO-control (P < 0.05). In addition, ventilation measured at 5 and 10 min during the severe-intensity exercise was higher in CHO-low (70.0 +/- 3.6 and 79.6 +/- 5.1 L min-1, respectively) than in CHO-control (62.5 +/- 4.4 and 74.8 +/- 6.5 L min-1, respectively; P < 0.05). These results suggest that CHO availability alters HRV parameters during severe- but not moderate-intensity exercise, and this was associated with an increase in ventilation volume.
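The Poincaré indices used here have standard closed-form expressions; a minimal sketch on a series of RR intervals (generic formulas, not the study's software):

```python
import numpy as np

def poincare_sd(rr_ms):
    """SD1/SD2 of a Poincare plot of successive RR intervals:
    SD1 is the spread perpendicular to the identity line (short-term
    variability), SD2 the spread along it (long-term variability)."""
    rr_ms = np.asarray(rr_ms, dtype=float)
    x, y = rr_ms[:-1], rr_ms[1:]
    sd1 = np.std((y - x) / np.sqrt(2), ddof=1)
    sd2 = np.std((y + x) / np.sqrt(2), ddof=1)
    return sd1, sd2
```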
Abstract:
We used an event-related fMRI design to study the BOLD response in Huntington's disease (HD) patients during performance of a Simon interference task. We hypothesised that HD patients would demonstrate significantly slower RTs than controls, and that there would be significant differences in the pattern of brain activation between groups. Seventeen HD patients and 15 age- and sex-matched controls were scanned using a 3T GE scanner (FOV = 24 cm2; TE = 40 ms; TR = 3 s; FA = 60°; slice thickness = 6 mm; in-plane resolution = 1.88x1.88 mm2). The task involved two activation conditions, namely congruent (for example, a left-pointing arrow appearing on the left side of the screen) and incongruent (for example, a left-pointing arrow appearing on the right side of the screen), and a baseline condition. Each stimulus was presented for 2500 ms followed by a blank screen for 500 ms. Subjects were instructed to press a button using the same hand as indicated by the direction of the arrow head and were given 3000 ms to respond. Data analysis was performed using SPM2 with a random effects analysis model. For each subject, parameter estimates for the combined task conditions (congruent and incongruent) were calculated. Comparisons such as these, based on block designs, have superior statistical power for detecting subtle changes in the BOLD response anywhere in the brain. The activations reported are significant at PFDR_corr
Abstract:
Purpose: To evaluate rates of visual field progression in eyes with optic disc hemorrhages and the effect of intraocular pressure (IOP) reduction on these rates. Design: Observational cohort study. Participants: The study included 510 eyes of 348 patients with glaucoma who were recruited from the Diagnostic Innovations in Glaucoma Study (DIGS) and followed for an average of 8.2 years. Methods: Eyes were followed annually with clinical examination, standard automated perimetry visual fields, and optic disc stereophotographs. The presence of optic disc hemorrhages was determined on the basis of masked evaluation of optic disc stereophotographs. Evaluation of rates of visual field change during follow-up was performed using the visual field index (VFI). Main Outcome Measures: The evaluation of the effect of optic disc hemorrhages on rates of visual field progression was performed using random coefficient models. Estimates of rates of change for individual eyes were obtained by best linear unbiased prediction (BLUP). Results: During follow-up, 97 (19%) of the eyes had at least 1 episode of disc hemorrhage. The overall rate of VFI change in eyes with hemorrhages was significantly faster than in eyes without hemorrhages (-0.88%/year vs. -0.38%/year, respectively, P < 0.001). The difference in rates of visual field loss pre- and post-hemorrhage was significantly related to the reduction of IOP in the post-hemorrhage period compared with the pre-hemorrhage period (r = -0.61; P < 0.001). Each 1 mmHg of IOP reduction was associated with a difference of 0.31%/year in the rate of VFI change. Conclusions: There was a beneficial effect of treatment in slowing rates of progressive visual field loss in eyes with optic disc hemorrhage. Further research should elucidate the reasons why some patients with hemorrhages respond well to IOP reduction and others seem to continue to progress despite a significant reduction in IOP levels. Ophthalmology 2010;117:2061-2066.
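A hedged sketch of a random coefficient model of this general form, assuming hypothetical column names and long-format data (one row per eye per visit); the modeling tool is my choice for illustration, not necessarily the study's software. In statsmodels, the eye-specific BLUPs fall out of the fitted mixed model:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data with columns vfi (visual field index),
# years of follow-up, a hemorrhage indicator, and eye_id. File name and
# columns are assumptions, not the DIGS data.
df = pd.read_csv("digs_vfi.csv")

# Random coefficient model: every eye gets its own intercept and slope
# (random effects); the hemorrhage interaction shifts the population-
# average rate of VFI change.
model = smf.mixedlm("vfi ~ years * hemorrhage", df,
                    groups=df["eye_id"], re_formula="~years")
fit = model.fit()

# Eye-specific deviations from the average intercept/slope are the BLUPs.
blups = fit.random_effects  # dict: eye_id -> random-effect estimates
```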
Abstract:
Liver steatosis is mainly a textural abnormality of the hepatic parenchyma due to fat accumulation in the hepatic vesicles. Today, the assessment is performed subjectively by visual inspection. Here, a classifier based on features extracted from ultrasound (US) images is described for the automatic diagnosis of this pathology. The proposed algorithm estimates the original ultrasound radio-frequency (RF) envelope signal, from which the noiseless anatomic information and the textural information encoded in the speckle noise are extracted. The features characterizing the textural information are the coefficients of the first-order autoregressive model that describes the speckle field. A binary Bayesian classifier was implemented and the Bayes factor was calculated. The classification revealed an overall accuracy of 100%. The Bayes factor could be helpful in the graphical display of the quantitative results for diagnostic purposes.
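For a first-order model, the AR coefficient has a closed-form Yule-Walker estimate; a minimal sketch on one demeaned envelope scan line (generic, not the paper's pipeline):

```python
import numpy as np

def ar1_coefficient(envelope_line):
    """First-order AR coefficient of an RF-envelope scan line via the
    Yule-Walker relation a1 = r(1) / r(0) on the demeaned signal."""
    x = np.asarray(envelope_line, dtype=float)
    x = x - x.mean()
    return float(x[1:] @ x[:-1] / (x @ x))
```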
Abstract:
This paper presents a spatial econometric analysis of the number of road accidents with victims in the smallest administrative divisions of Lisbon, taking as a baseline a log-Poisson model of environmental factors. Spatial correlation is investigated both in the data alone and in the residuals of the baseline model, without and with spatially autocorrelated and spatially lagged terms. In all cases, no spatial autocorrelation was detected.
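A common screening statistic for this kind of check (not necessarily the one the paper used) is Moran's I on the model residuals; a minimal sketch, assuming an illustrative neighbourhood weight matrix W:

```python
import numpy as np

def morans_i(z, W):
    """Moran's I statistic of residuals z under spatial weight matrix W
    (w_ij > 0 when divisions i and j are neighbours). Values near zero
    indicate no spatial autocorrelation."""
    z = z - z.mean()
    n, s0 = len(z), W.sum()
    return (n / s0) * (z @ W @ z) / (z @ z)
```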
Abstract:
A Work Project, presented as part of the requirements for the Award of a Master's Degree in Finance from the NOVA – School of Business and Economics