959 results for empirical testing


Relevance:

30.00%

Publisher:

Abstract:

The future of health care delivery is becoming more citizen-centred, as today’s user is more active, better informed and more demanding. The European Commission is promoting online health services and, therefore, member states will need to boost deployment and use of online services. This makes e-health adoption an important field to be studied and understood. This study applied the extended unified theory of acceptance and use of technology (UTAUT2) to explain patients’ individual adoption of e-health. An online questionnaire was administered in Portugal, using largely the same instrument as UTAUT2, adapted to the e-health context. We collected 386 valid answers. Performance expectancy, effort expectancy, social influence, and habit had the most significant explanatory power over behavioural intention, while habit and behavioural intention had the most explanatory power over technology use. The model explained 52% of the variance in behavioural intention and 32% of the variance in technology use. Our research helps to understand the technology characteristics desired of e-health. By testing an information technology acceptance model, we are able to determine what patients value most when deciding whether or not to adopt e-health systems.
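
A minimal sketch of how such a path structure could be checked, assuming a respondent-level data set with one averaged score per UTAUT2 construct. The CSV file and column names are hypothetical, and plain OLS here stands in for the structural equation modelling the authors used:

```python
# Minimal sketch, not the authors' code: the UTAUT2 path structure approximated by two
# OLS regressions on averaged construct scores. File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ehealth_survey.csv")   # hypothetical: one row per respondent

# Behavioural intention on the seven UTAUT2 predictors (the paper reports R^2 = 0.52)
bi_model = smf.ols(
    "behavioural_intention ~ performance_expectancy + effort_expectancy"
    " + social_influence + facilitating_conditions + hedonic_motivation"
    " + price_value + habit",
    data=df,
).fit()

# Reported use on behavioural intention and habit (the paper reports R^2 = 0.32)
use_model = smf.ols("use_behaviour ~ behavioural_intention + habit", data=df).fit()

print(bi_model.summary())
print(use_model.summary())
```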

Relevance:

30.00%

Publisher:

Abstract:

In this article, we develop a specification technique for building the multiplicative time-varying GARCH models of Amado and Teräsvirta (2008, 2013). The variance is decomposed into an unconditional and a conditional component such that the unconditional variance component is allowed to evolve smoothly over time. This nonstationary component is defined as a linear combination of logistic transition functions with time as the transition variable. The appropriate number of transition functions is determined by a sequence of specification tests. For that purpose, a coherent modelling strategy based on statistical inference is presented. It relies heavily on Lagrange multiplier-type misspecification tests, which are easily implemented as they are entirely based on auxiliary regressions. Finite-sample properties of the strategy and tests are examined by simulation. The modelling strategy is illustrated in practice with two real examples: an empirical application to daily exchange rate returns and another to daily coffee futures returns.
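
A short simulation sketch of the multiplicative decomposition described above, with the variance written as h_t·g_t, where h_t is a standard GARCH(1,1) component and g_t is a smooth baseline built from a logistic transition function of rescaled time. All parameter values are made up for illustration, not estimates from the article:

```python
# Illustrative simulation of sigma_t^2 = h_t * g_t with one logistic transition in g_t.
import numpy as np

def logistic_transition(s, gamma, c):
    # G(s; gamma, c) in (0, 1): slope gamma, location c, transition variable s = t/T
    return 1.0 / (1.0 + np.exp(-gamma * (s - c)))

def unconditional_component(T, deltas, gammas, locations):
    # g_t = 1 + sum_l delta_l * G(t/T; gamma_l, c_l); the number of transitions is what
    # the sequence of specification tests in the article is designed to determine
    s = np.arange(1, T + 1) / T
    g = np.ones(T)
    for delta, gamma, c in zip(deltas, gammas, locations):
        g = g + delta * logistic_transition(s, gamma, c)
    return g

T = 2000
z = np.random.standard_normal(T)
g = unconditional_component(T, deltas=[0.8], gammas=[25.0], locations=[0.5])

h = np.empty(T)
h[0] = 0.05 / (1 - 0.08 - 0.90)
for t in range(1, T):                          # GARCH(1,1) recursion for h_t
    h[t] = 0.05 + 0.08 * h[t - 1] * z[t - 1] ** 2 + 0.90 * h[t - 1]

returns = np.sqrt(h * g) * z                   # returns with a smoothly shifting variance level
```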

Relevance:

30.00%

Publisher:

Abstract:

This paper tests the Entrepreneurial Intention Model, which is adapted from the Theory of Planned Behavior, on a sample of 533 individuals from two quite different countries: one European (Spain) and the other East Asian (Taiwan). A newly developed Entrepreneurial Intention Questionnaire (EIQ) has been used, which tries to overcome some of the limitations of previous instruments. Structural equation techniques were used in the empirical analysis. Results are generally satisfactory, indicating that the model is probably adequate for studying entrepreneurship. Support for the model was found not only in the combined sample, but also in each of the national ones. However, some differences arose that may indicate that demographic variables contribute differently to the formation of perceptions in each culture.
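
As an illustration only, a structural model of this kind could be specified with the open-source semopy package. The item and construct names below follow the Theory of Planned Behavior; the data file, column names, and the choice of semopy are assumptions, not the authors' setup:

```python
# Hedged sketch: a TPB-style measurement and structural model for EIQ responses.
import pandas as pd
import semopy

desc = """
attitude =~ att1 + att2 + att3
subjective_norm =~ sn1 + sn2 + sn3
perceived_control =~ pbc1 + pbc2 + pbc3
intention =~ int1 + int2 + int3
intention ~ attitude + subjective_norm + perceived_control
"""

data = pd.read_csv("eiq_responses.csv")   # hypothetical: one row per respondent
model = semopy.Model(desc)
model.fit(data)
print(model.inspect())                    # loadings, path coefficients, p-values
```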

Relevance:

30.00%

Publisher:

Abstract:

ABSTRACT: BACKGROUND: Local adaptation can drive the divergence of populations but identification of the traits under selection remains a major challenge in evolutionary biology. Reciprocal transplant experiments are ideal tests of local adaptation, yet rarely used for higher vertebrates because of the mobility and potential invasiveness of non-native organisms. Here, we reciprocally transplanted 2500 brown trout (Salmo trutta) embryos from five populations to investigate local adaptation in early life history traits. Embryos were bred in a full-factorial design and raised in natural riverbeds until emergence. Customized egg capsules were used to simulate the natural redd environment and allowed tracking the fate of every individual until retrieval. We predicted that 1) within sites, native populations would outperform non-natives, and 2) across sites, populations would show higher performance at 'home' compared to 'away' sites. RESULTS: There was no evidence for local adaptation but we found large differences in survival and hatching rates between sites, indicative of considerable variation in habitat quality. Survival was generally high across all populations (55% +/- 3%), but ranged from 4% to 89% between sites. Average hatching rate was 25% +/- 3% across populations ranging from 0% to 62% between sites. CONCLUSION: This study provides rare empirical data on variation in early life history traits in a population network of a salmonid, and large-scale breeding and transplantation experiments like ours provide powerful tests for local adaptation. Despite the recently reported genetic and morphological differences between the populations in our study area, local adaptation at the embryo level is small, non-existent, or confined to ecological conditions that our experiment could not capture.

Relevance:

30.00%

Publisher:

Abstract:

This paper discusses the role of deterministic components in the DGP and in the auxiliary regression model which underlies the implementation of the Fractional Dickey-Fuller (FDF) test for I(1) against I(d) processes with d ∈ [0, 1). This is an important test in many economic applications because I(d) processes with d < 1 are mean-reverting although, when 0.5 ≤ d < 1, like I(1) processes, they are nonstationary. We show how simple the implementation of the FDF test is in these situations, and argue that it has better properties than LM tests. A simple testing strategy entailing only asymptotically normally distributed tests is also proposed. Finally, an empirical application is provided where the FDF test allowing for deterministic components is used to test for long memory in the per capita GDP of several OECD countries, an issue that has important consequences for discriminating between growth theories, and on which there is some controversy.
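
A rough sketch of the basic FDF regression, without the deterministic-component refinements that are the subject of the paper: Δy_t is regressed on the fractionally differenced lag Δ^d y_{t-1}, and the t-ratio on that regressor is compared with normal critical values.

```python
# Illustrative FDF-type regression; simplified relative to the paper's treatment.
import numpy as np

def frac_diff(x, d):
    """Fractional difference Delta^d x via expanding binomial weights."""
    n = len(x)
    w = np.zeros(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return np.array([np.dot(w[:t + 1][::-1], x[:t + 1]) for t in range(n)])

def fdf_tstat(y, d_alt):
    """t-statistic on phi in: Delta y_t = c + phi * Delta^{d_alt} y_{t-1} + error."""
    dy = np.diff(y)                       # Delta y_t
    z = frac_diff(y, d_alt)[:-1]          # Delta^{d_alt} y_{t-1}
    X = np.column_stack([np.ones_like(z), z])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    sigma2 = resid @ resid / (len(dy) - X.shape[1])
    se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
    return beta[1] / se

y = np.cumsum(np.random.standard_normal(500))   # a random walk: I(1) under the null
print(fdf_tstat(y, d_alt=0.4))                  # compare with the left tail of N(0, 1)
```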

Relevance:

30.00%

Publisher:

Abstract:

This paper discusses the role of deterministic components in the DGP and in the auxiliary regression model which underlies the implementation of the Fractional Dickey-Fuller (FDF) test for I(1) against I(d) processes with d ∈ [0, 1). This is an important test in many economic applications because I(d) processes with d < 1 are mean-reverting although, when 0.5 ≤ d < 1, like I(1) processes, they are nonstationary. We show how simple the implementation of the FDF test is in these situations, and argue that it has better properties than LM tests. A simple testing strategy entailing only asymptotically normally distributed tests is also proposed. Finally, an empirical application is provided where the FDF test allowing for deterministic components is used to test for long memory in the per capita GDP of several OECD countries, an issue that has important consequences for discriminating between growth theories, and on which there is some controversy.

Relevance:

30.00%

Publisher:

Abstract:

We obtain minimax lower and upper bounds for the expected distortion redundancy of empirically designed vector quantizers. We show that the mean squared distortion of a vector quantizer designed from $n$ i.i.d. data points using any design algorithm is at least $\Omega(n^{-1/2})$ away from the optimal distortion for some distribution on a bounded subset of ${\cal R}^d$. Together with existing upper bounds, this result shows that the minimax distortion redundancy for empirical quantizer design, as a function of the size of the training data, is asymptotically on the order of $n^{-1/2}$. We also derive a new upper bound for the performance of the empirically optimal quantizer.
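
For intuition, a small simulation sketch of empirical quantizer design: an 8-point quantizer is fitted to n training samples (scikit-learn's k-means standing in for Lloyd-type design), and its excess test distortion is compared against a quantizer trained on a much larger sample. The uniform source below is an arbitrary choice; the Ω(n^{-1/2}) lower bound is a worst-case statement over distributions, so the decay observed for this particular source may be faster.

```python
# Empirical quantizer design and its distortion redundancy, estimated by simulation.
import numpy as np
from sklearn.cluster import KMeans

def test_distortion(centers, X):
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)   # squared distances
    return d2.min(axis=1).mean()                                    # nearest-center MSE

rng = np.random.default_rng(0)
k, dim = 8, 2
X_test = rng.uniform(-1, 1, size=(200_000, dim))          # stand-in for the source distribution

# approximate the optimal distortion with a quantizer trained on a very large sample
ref = KMeans(n_clusters=k, n_init=10, random_state=0).fit(rng.uniform(-1, 1, size=(200_000, dim)))
d_opt = test_distortion(ref.cluster_centers_, X_test)

for n in (200, 800, 3200, 12800):
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(rng.uniform(-1, 1, size=(n, dim)))
    redundancy = test_distortion(km.cluster_centers_, X_test) - d_opt
    print(n, redundancy)
```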

Relevance:

30.00%

Publisher:

Abstract:

One plausible mechanism through which financial market shocks may propagate across countries is through the impact that past gains and losses may have on investors' risk aversion and behavior. This paper presents a stylized model illustrating how heterogeneous changes in investors' risk aversion affect portfolio allocation decisions and stock prices. Our empirical findings suggest that when funds' returns are below average, they adjust their holdings toward the average (or benchmark) portfolio. In so doing, funds tend to sell the assets of countries in which they were overweight, increasing their exposure to countries in which they were underweight. Based on this insight, the paper constructs an index of financial interdependence which reflects the extent to which countries share overexposed funds. The index helps explain the pattern of stock market comovement across countries. Moreover, a comparison of this interdependence measure to indices of trade or commercial bank linkages indicates that our index can improve predictions about which countries are more likely to be affected by contagion from crisis centers.
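
A stylized construction, not the paper's exact formula: measure each fund's overexposure to a country as its portfolio weight minus the benchmark weight, then score the interdependence of two countries by how much overexposure they share across funds. All weights below are made-up illustrative numbers.

```python
# Stylized country-interdependence index based on shared fund overexposure.
import numpy as np
import pandas as pd

# hypothetical data: rows = funds, columns = countries, entries = portfolio weights
weights = pd.DataFrame(
    [[0.30, 0.50, 0.20],
     [0.10, 0.60, 0.30],
     [0.45, 0.35, 0.20]],
    index=["fund_A", "fund_B", "fund_C"],
    columns=["BRA", "MEX", "KOR"],
)
benchmark = pd.Series({"BRA": 0.25, "MEX": 0.45, "KOR": 0.30})

overexposure = (weights - benchmark).clip(lower=0.0)      # keep only overweight positions

# interdependence of countries i and j: summed co-overexposure across funds
interdependence = overexposure.T @ overexposure
np.fill_diagonal(interdependence.values, 0.0)
print(interdependence)
```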

Relevance:

30.00%

Publisher:

Abstract:

Recent theoretical models of economic growth have emphasised the role of external effects on the accumulation of factors of production. Although most of the literature has considered the externalities across firms within a region, in this paper we go a step further and consider the possibility that these externalities cross the barriers of regional economies. We assess the role of these external effects in explaining growth and economic convergence. We present a simple growth model, which includes externalities across economies, developing a methodology for testing their existence and estimating their strength. In our view, spatial econometrics is naturally suited to an empirical consideration of these externalities. We obtain evidence on the presence of significant externalities both across Spanish and European regions.
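
A minimal sketch with hypothetical variable and file names: a convergence-type growth regression in which neighbouring regions' human capital enters through a row-standardized spatial lag, so that externalities can cross regional borders. Plain OLS is used here for illustration; a full spatial-econometric treatment would also allow for spatially lagged dependent variables or errors.

```python
# Growth regression with a spatial lag of human capital as the cross-region externality.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("regions.csv")          # one row per region: growth, log_y0, human_capital, investment
W = np.load("contiguity_matrix.npy")     # n x n spatial weights matrix, zero diagonal
W = W / W.sum(axis=1, keepdims=True)     # row-standardize

X = df[["log_y0", "human_capital", "investment"]].copy()
X["W_human_capital"] = W @ df["human_capital"].to_numpy()   # externality from neighbouring regions

ols = sm.OLS(df["growth"], sm.add_constant(X)).fit()
print(ols.summary())   # a significant W_human_capital coefficient points to cross-region spillovers
```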


Relevance:

30.00%

Publisher:

Abstract:

Preface: The starting point for this work, and eventually the subject of the whole thesis, was the question of how to estimate the parameters of affine stochastic volatility jump-diffusion models. These models are very important for contingent claim pricing. Their major advantage, the availability of analytical solutions for characteristic functions, made them the models of choice for many theoretical constructions and practical applications. At the same time, estimation of the parameters of stochastic volatility jump-diffusion models is not a straightforward task. The problem comes from the variance process, which is not observable. There are several estimation methodologies that deal with the estimation of latent variables. One appeared to be particularly interesting. It proposes an estimator that, in contrast to the other methods, requires neither discretization nor simulation of the process: the Continuous Empirical Characteristic Function (ECF) estimator based on the unconditional characteristic function. However, the procedure was derived only for stochastic volatility models without jumps. Thus, it became the subject of my research. This thesis consists of three parts. Each one is written as an independent and self-contained article. At the same time, the questions answered by the second and third parts of this work arise naturally from the issues investigated and the results obtained in the first one. The first chapter is the theoretical foundation of the thesis. It proposes an estimation procedure for stochastic volatility models with jumps both in the asset price and variance processes. The estimation procedure is based on the joint unconditional characteristic function of the stochastic process. The major analytical result of this part, as well as of the whole thesis, is the closed-form expression for the joint unconditional characteristic function of the stochastic volatility jump-diffusion models. The empirical part of the chapter suggests that, besides stochastic volatility, jumps both in the mean and the volatility equation are relevant for modelling returns of the S&P500 index, which has been chosen as a general representative of the stock asset class. Hence, the next question is what jump process to use to model the returns of the S&P500. The decision about the jump process in the framework of the affine jump-diffusion models boils down to defining the intensity of the compound Poisson process, a constant or some function of state variables, and to choosing the distribution of the jump size. While the jump in the variance process is usually assumed to be exponential, there are at least three distributions of the jump size which are currently used for asset log-prices: normal, exponential and double exponential. The second part of this thesis shows that normal jumps in the asset log-returns should be used if we are to model the S&P500 index by a stochastic volatility jump-diffusion model. This is a surprising result: the exponential distribution has fatter tails, and for this reason either the exponential or the double exponential jump size was expected to provide the best fit of the stochastic volatility jump-diffusion models to the data. The idea of testing the efficiency of the Continuous ECF estimator on simulated data had already appeared when the first estimation results of the first chapter were obtained. In the absence of a benchmark or any ground for comparison, there is no way to be sure that our parameter estimates and the true parameters of the models coincide.
The conclusion of the second chapter provides one more reason to carry out that kind of test. Thus, the third part of this thesis concentrates on the estimation of the parameters of stochastic volatility jump-diffusion models on the basis of asset price time series simulated from various "true" parameter sets. The goal is to show that the Continuous ECF estimator based on the joint unconditional characteristic function is capable of finding the true parameters, and the third chapter demonstrates that our estimator indeed has the ability to do so. Once it is clear that the Continuous ECF estimator based on the unconditional characteristic function works, the next question immediately arises: can the computational effort be reduced without affecting the efficiency of the estimator, or can the efficiency of the estimator be improved without dramatically increasing the computational burden? The efficiency of the Continuous ECF estimator depends on the number of dimensions of the joint unconditional characteristic function used in its construction. Theoretically, the more dimensions there are, the more efficient the estimation procedure is. In practice, however, this relationship is not so straightforward, owing to increasing computational difficulties. The second chapter, for example, in addition to the choice of the jump process, discusses the possibility of using the marginal, i.e. one-dimensional, unconditional characteristic function in the estimation instead of the joint, bi-dimensional, unconditional characteristic function. As a result, the preference for one or the other depends on the model to be estimated; the computational effort can thus be reduced in some cases without affecting the efficiency of the estimator. Improving the estimator's efficiency by increasing its dimensionality faces more difficulties. The third chapter of this thesis, in addition to what was discussed above, compares the performance of the estimators with bi- and three-dimensional unconditional characteristic functions on simulated data. It shows that the theoretical efficiency of the Continuous ECF estimator based on the three-dimensional unconditional characteristic function is not attainable in practice, at least for the moment, due to the limitations of the computer power and optimization toolboxes available to the general public. Thus, the Continuous ECF estimator based on the joint, bi-dimensional, unconditional characteristic function has every reason to exist and to be used for the estimation of the parameters of stochastic volatility jump-diffusion models.
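
A sketch of the generic machinery of an ECF-type estimator, demonstrated on a deliberately simple stand-in model (i.i.d. Gaussian returns) whose joint unconditional characteristic function is known in closed form. In the thesis, the Gaussian CF below would be replaced by the closed-form joint CF of the stochastic volatility jump-diffusion model, and the finite evaluation grid by an integral with a weight function:

```python
# Matching the empirical joint CF of (r_t, r_{t+1}) to a closed-form model CF.
import numpy as np
from scipy.optimize import minimize

def empirical_joint_cf(r, U, V):
    # empirical CF of the pairs (r_t, r_{t+1}) on the grid U x V
    x, y = r[:-1], r[1:]
    return np.array([[np.mean(np.exp(1j * (u * x + v * y))) for v in V] for u in U])

def model_joint_cf(U, V, mu, sigma):
    # joint CF of two independent N(mu, sigma^2) returns (the stand-in model)
    u, v = U[:, None], V[None, :]
    return np.exp(1j * mu * (u + v) - 0.5 * sigma**2 * (u**2 + v**2))

def objective(theta, emp_cf, U, V):
    mu, log_sigma = theta
    diff = emp_cf - model_joint_cf(U, V, mu, np.exp(log_sigma))
    return np.sum(np.abs(diff) ** 2)        # discretized distance criterion

rng = np.random.default_rng(1)
returns = rng.normal(0.0005, 0.01, size=5000)
U = V = np.linspace(-150.0, 150.0, 11)      # CF arguments scaled to the return magnitude
emp_cf = empirical_joint_cf(returns, U, V)

fit = minimize(objective, x0=[0.0, np.log(0.05)], args=(emp_cf, U, V), method="Nelder-Mead")
print(fit.x[0], np.exp(fit.x[1]))           # estimated mean and volatility of the stand-in model
```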

Relevance:

30.00%

Publisher:

Abstract:

The resilient modulus (MR) input parameters in the Mechanistic-Empirical Pavement Design Guide (MEPDG) program have a significant effect on the projected pavement performance. The MEPDG program uses three different levels of inputs depending on the desired level of accuracy. The primary objective of this research was to develop a laboratory testing program utilizing the Iowa DOT servo-hydraulic machine system for evaluating typical Iowa unbound materials and to establish a database of input values for MEPDG analysis. This was achieved by carrying out a detailed laboratory testing program designed in accordance with the AASHTO T307 resilient modulus test protocol using common Iowa unbound materials. The program included laboratory tests to characterize the basic physical properties of the unbound materials, specimen preparation, and repeated load triaxial tests to determine the resilient modulus. The MEPDG resilient modulus input parameter library for typical Iowa unbound pavement materials was established from the repeated load triaxial MR test results. This library includes the non-linear, stress-dependent resilient modulus model coefficient values for level 1 analysis, the unbound material property values correlated to resilient modulus for level 2 analysis, and the typical resilient modulus values for level 3 analysis. The resilient modulus input parameter library can be utilized when designing low-volume roads in the absence of any basic soil testing. Based on the results of this study, the use of level 2 analysis for the MEPDG resilient modulus input is recommended, since the repeated load triaxial test for level 1 analysis is complicated, time consuming, expensive, and requires sophisticated equipment and skilled operators.
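
A sketch of how the level 1 coefficients can be derived from repeated load triaxial results by fitting the MEPDG stress-dependent ("universal") resilient modulus model, MR = k1·pa·(θ/pa)^k2·(τ_oct/pa + 1)^k3. The stress states and measured moduli below are placeholder numbers, not results from this study:

```python
# Nonlinear fit of the MEPDG universal resilient modulus model to triaxial data.
import numpy as np
from scipy.optimize import curve_fit

PA = 101.325  # atmospheric pressure, kPa

def mepdg_mr(stresses, k1, k2, k3):
    theta, tau_oct = stresses            # bulk stress and octahedral shear stress, kPa
    return k1 * PA * (theta / PA) ** k2 * (tau_oct / PA + 1.0) ** k3

theta   = np.array([83.0, 124.0, 207.0, 276.0, 414.0])     # kPa, example stress states
tau_oct = np.array([9.8, 19.5, 29.3, 48.8, 68.3])           # kPa
mr_meas = np.array([62e3, 71e3, 85e3, 90e3, 104e3])         # kPa, measured resilient moduli

(k1, k2, k3), _ = curve_fit(mepdg_mr, (theta, tau_oct), mr_meas, p0=(800.0, 0.5, -0.1))
print(k1, k2, k3)   # level 1 coefficients for the MEPDG input library
```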

Relevance:

30.00%

Publisher:

Abstract:

The objective of this research is to determine whether the nationally calibrated performance models used in the Mechanistic-Empirical Pavement Design Guide (MEPDG) provide a reasonable prediction of actual field performance, and whether the desired accuracy or correspondence exists between predicted and monitored performance for Iowa conditions. A comprehensive literature review was conducted to identify the MEPDG input parameters and the MEPDG verification/calibration process. Sensitivities of MEPDG input parameters to predictions were studied using different versions of the MEPDG software. Based on the literature review and sensitivity analysis, a detailed verification procedure was developed. A total of sixteen different types of pavement sections across Iowa, not used for national calibration in NCHRP 1-47A, were selected. A database of MEPDG inputs and the actual pavement performance measures for the selected pavement sites was prepared for verification. The accuracy of the MEPDG performance models for Iowa conditions was statistically evaluated. The verification testing showed promising results in terms of the MEPDG's performance prediction accuracy for Iowa conditions. Recalibrating the MEPDG performance models for Iowa conditions is recommended to improve the accuracy of predictions.
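
A sketch of the kind of predicted-versus-monitored comparison such a verification can use: bias, standard error, and a paired t-test for each distress measure. The CSV file and column names are hypothetical, not the study's actual database:

```python
# Statistical comparison of MEPDG-predicted and field-measured distresses per section.
import pandas as pd
from scipy import stats

df = pd.read_csv("iowa_sections.csv")   # one row per section: measured_rutting, predicted_rutting, ...

for measure in ["rutting", "iri", "transverse_cracking"]:
    meas, pred = df[f"measured_{measure}"], df[f"predicted_{measure}"]
    resid = pred - meas
    bias = resid.mean()
    see = resid.pow(2).mean() ** 0.5              # standard error of the estimate
    t_stat, p_val = stats.ttest_rel(pred, meas)   # paired t-test: H0 of no systematic error
    print(measure, round(bias, 3), round(see, 3), round(p_val, 3))
```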

Relevance:

30.00%

Publisher:

Abstract:

Asphalt pavements suffer various failures due to insufficient quality within their design lives. The American Association of State Highway and Transportation Officials (AASHTO) Mechanistic-Empirical Pavement Design Guide (MEPDG) has been proposed to improve pavement quality through quantitative performance prediction. Evaluation of the actual performance (quality) of pavements requires in situ nondestructive testing (NDT) techniques that can accurately measure the most critical, objective, and sensitive properties of pavement systems. The purpose of this study is to assess existing as well as promising new NDT technologies for quality control/quality assurance (QC/QA) of asphalt mixtures. Specifically, this study examined field measurements of density via the PaveTracker electromagnetic gage, shear-wave velocity via surface-wave testing methods, and dynamic stiffness via the Humboldt GeoGauge for five representative paving projects covering a range of mixes and traffic loads. The in situ tests were compared against laboratory measurements of core density and dynamic modulus. The in situ PaveTracker density had a low correlation with laboratory density and was not sensitive to variations in temperature or asphalt mix type. The in situ shear-wave velocity measured by surface-wave methods was most sensitive to variations in temperature and asphalt mix type. The in situ density and in situ shear-wave velocity were combined to calculate an in situ dynamic modulus, which is a performance-based quality measurement. The in situ GeoGauge stiffness measured on hot asphalt mixtures several hours after paving had a high correlation with the in situ dynamic modulus and the laboratory density, whereas the stiffness measurement of asphalt mixtures cooled with dry ice or at ambient temperature one or more days after paving had a very low correlation with the other measurements. To transform the in situ moduli from surface-wave testing into quantitative quality measurements, a QC/QA procedure was developed to first correct the in situ moduli measured at different field temperatures to the moduli at a common reference temperature based on master curves from laboratory dynamic modulus tests. The corrected in situ moduli can then be compared against the design moduli for an assessment of the actual pavement performance. A preliminary study of microelectromechanical systems- (MEMS)-based sensors for QC/QA and health monitoring of asphalt pavements was also performed.
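
A sketch of the two calculations described above, with illustrative coefficients that are not values from this study: the in situ small-strain modulus from density and shear-wave velocity, E = 2·ρ·Vs²·(1 + ν), followed by a correction from the field temperature to a reference temperature using a sigmoidal master curve with an assumed quadratic time-temperature shift function.

```python
# In situ modulus from surface-wave testing and a temperature correction via a master curve.
import numpy as np

def in_situ_modulus(rho_kg_m3, vs_m_s, poisson=0.35):
    # small-strain Young's modulus (Pa): E = 2 * G * (1 + nu), with G = rho * Vs^2
    return 2.0 * rho_kg_m3 * vs_m_s**2 * (1.0 + poisson)

def master_curve_modulus(freq_hz, temp_c, t_ref=21.0,
                         delta=1.5, alpha=2.9, beta=-1.0, gamma=-0.5,
                         a1=-0.08, a2=0.0002):
    # |E*| (MPa) from a sigmoidal master curve evaluated at the reduced frequency
    log_shift = a1 * (temp_c - t_ref) + a2 * (temp_c - t_ref) ** 2
    log_fr = np.log10(freq_hz) + log_shift
    return 10.0 ** (delta + alpha / (1.0 + np.exp(beta + gamma * log_fr)))

# field measurement shortly after paving: density and surface-wave shear velocity at 45 C
E_field = in_situ_modulus(rho_kg_m3=2350.0, vs_m_s=1000.0)             # Pa

# scale by the ratio of master-curve moduli at the reference and field temperatures
ratio = master_curve_modulus(25.0, temp_c=21.0) / master_curve_modulus(25.0, temp_c=45.0)
E_ref = E_field * ratio        # in situ modulus corrected to the 21 C reference temperature
print(E_field / 1e9, E_ref / 1e9)   # GPa
```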

Relevance:

30.00%

Publisher:

Abstract:

Cold in-place recycling (CIR) has become an attractive method for rehabilitating asphalt roads that have good subgrade support and are suffering distress related to non-structural aging and cracking of the pavement layer. Although CIR is widely used, its use could be expanded if its performance were more predictable. Transportation officials have observed roads that were recycled under similar circumstances perform very differently for no clear reason. Moreover, a rational mix design has not yet been developed, design assumptions regarding the structural support of the CIR layer remain empirical and conservative, and there is no clear understanding of the cause-effect relationships between the choices made during the design/construction process and the resulting performance. The objective of this project is to investigate these relationships, especially concerning the age of the recycled pavement, cumulative traffic volume, support conditions, aged engineering properties of the CIR materials, and road performance. Twenty-four CIR asphalt roads constructed in Iowa from 1986 to 2004 were studied: 18 were selected from a sample of roads studied in a previous research project (HR-392), and 6 were selected from newer CIR projects constructed after 1999. This report describes the results of comprehensive field and laboratory testing for these CIR asphalt roads. The results indicate that the modulus of the CIR layer and the air voids of the CIR asphalt binder were the most important factors affecting CIR pavement performance for high-traffic roads. For low-traffic roads, the wet indirect tensile strength significantly affected pavement performance. The results of this research can help identify changes that should be made with regard to design, material selection, and construction in order to improve the performance and cost-effectiveness of future recycled roads.