976 results for "empirical testing"


Relevance: 30.00%

Abstract:

This paper explores Rizvi and Lingard's (2010) idea of the "local vernacular" of the global education policy trend of using high-stakes testing to increase accountability and transparency, and by extension quality, within schools and education systems in Australia. The first part of the paper gives a brief account of the policy trajectory of the National Assessment Program – Literacy and Numeracy (NAPLAN) in Australia. In the second part, empirical evidence drawn from a survey of teachers in Western Australia (WA) and South Australia (SA) is used to explore teacher perceptions of the impacts a high-stakes testing regime is having on student learning, relationships with parents and pedagogy in specific sites. After the 2007 Australian Federal election, one of Labor's policy objectives was to deliver an "Education Revolution" designed to improve both equity and excellence in the Australian school system (Rudd & Gillard, 2008). This reform agenda aims to "deliver real changes" through "raising the quality of teaching in our schools" and "improving transparency and accountability of schools and school systems" (Rudd & Gillard, 2008, p. 5). Central to this linking of accountability, the transparency of schools and school systems, and raising teaching quality was the creation of a testing regime (NAPLAN) that would generate data about the attainment of basic literacy and numeracy skills by students in Australian schools.

Relevance: 30.00%

Abstract:

The aim of the present study was to advance the methodology and use of time series analysis to quantify dynamic structures in psychophysiological processes and thereby to produce information on spontaneously coupled physiological responses and their behavioral and experiential correlates. Series of analyses using both simulated and empirical cardiac interbeat interval (IBI), electrodermal (EDA), and facial electromyographic (EMG) data indicated that, despite potential autocorrelated structures, smoothing increased the reliability of detecting response coupling from an interindividual distribution of intraindividual measures, and that the measures of covariance in particular produced accurate information on the extent of coupled responses. This methodology was applied to analyze spontaneously coupled IBI, EDA, and facial EMG responses and vagal activity in their relation to emotional experience and personality characteristics in a group of middle-aged men (n = 37) during the administration of the Rorschach testing protocol. The results revealed new characteristics in the relationship between phasic end-organ synchronization and vagal activity, on the one hand, and individual differences in emotional adjustment to novel situations on the other. Specifically, it appeared that the vagal system is intimately related to emotional and social responsivity. It was also found that a lack of spontaneously synchronized responses is related to decreased energetic arousal (e.g., depression, mood). These findings indicate that the present process analysis approach has many advantages for use in both experimental and applied research, and that it is a useful new paradigm in psychophysiological research.

Keywords: Autonomic Nervous System; Emotion; Facial Electromyography; Individual Differences; Spontaneous Responses; Time Series Analysis; Vagal System
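
A minimal sketch of the smoothing-plus-covariance coupling measure described above, assuming two equally sampled response series per person; the function and argument names are ours, not the paper's:

```python
import pandas as pd

def response_coupling(ibi: pd.Series, eda: pd.Series, window: int = 5) -> float:
    """Smooth both series with a centred moving average, then take the
    covariance of the smoothed series as an intraindividual measure of
    spontaneous response coupling."""
    s1 = ibi.rolling(window, center=True).mean()
    s2 = eda.rolling(window, center=True).mean()
    return s1.cov(s2)
```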

Relevance: 30.00%

Abstract:

This paper provides a first look at the acceptance of Accountable-eHealth (AeH) systems, a new genre of eHealth systems designed to manage the information privacy concerns that hinder the proliferation of eHealth. The underlying concept of AeH systems is appropriate use of information through after-the-fact accountability for intentional misuse of information by healthcare professionals. An online questionnaire survey was utilised for data collection from three educational institutions in Queensland, Australia. A total of 23 hypotheses relating to 9 constructs were tested using a structural equation modelling technique. The moderation effects on the hypotheses were also tested, based on six moderation factors, to understand their role in the research model. A total of 334 valid responses were received. The cohort consisted of medical, nursing and other health-related students studying at various levels in both undergraduate and postgraduate courses. Hypothesis testing provided sufficient evidence to support 7 of the hypotheses. The empirical research model developed was capable of predicting 47.3% of healthcare professionals' perceived intention to use AeH systems. All six moderation factors showed a significant influence on the research model. A validation of this model with a wider survey cohort is recommended as future work.
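
The paper fits a full structural equation model; as a simpler stand-in, the core idea of a moderation test can be sketched with an interaction term in a regression. All file and column names below are hypothetical:

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("survey_responses.csv")  # hypothetical survey data

# A significant interaction coefficient indicates that the moderator
# (here, hypothetically, clinical experience) changes the strength of
# the predictor's effect on intention to use.
model = smf.ols("intention ~ usefulness * clinical_experience", data=df).fit()
print(model.summary())
```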

Relevance: 30.00%

Abstract:

A test for time-varying correlation is developed within the framework of a dynamic conditional score (DCS) model for both Gaussian and Student t-distributions. The test may be interpreted as a Lagrange multiplier test and modified to allow for the estimation of models for time-varying volatility in the individual series. Unlike standard moment-based tests, the score-based test statistic includes information on the level of correlation under the null hypothesis and local power arguments indicate the benefits of doing so. A simulation study shows that the performance of the score-based test is strong relative to existing tests across a range of data generating processes. An application to the Hong Kong and South Korean equity markets shows that the new test reveals changes in correlation that are not detected by the standard moment-based test.
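
For contrast with the paper's score-based DCS test, here is a sketch of the standard moment-based benchmark it is compared against: regress the product of the standardized series on its own lags; under constant correlation the lags are uninformative and T·R² is asymptotically chi-squared. Function and variable names are ours:

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import chi2

def lm_test_constant_correlation(x, y, lags=5):
    """Moment-based LM-type check for time-varying correlation."""
    zx = (x - x.mean()) / x.std()
    zy = (y - y.mean()) / y.std()
    s = zx * zy                     # proxy for the period-t correlation
    n = len(s)
    Y = s[lags:]
    X = np.column_stack([s[lags - k : n - k] for k in range(1, lags + 1)])
    res = sm.OLS(Y, sm.add_constant(X)).fit()
    stat = len(Y) * res.rsquared    # T * R^2, asymptotically chi2(lags)
    return stat, chi2.sf(stat, lags)
```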

Relevance: 30.00%

Abstract:

Mikael Juselius' doctoral dissertation covers a range of significant issues in modern macroeconomics by empirically testing a number of important theoretical hypotheses. The first essay presents indirect evidence, within the framework of the cointegrated VAR model, on the elasticity of substitution between capital and labor, using Finnish manufacturing data. Instead of estimating the elasticity of substitution from the first-order conditions, he develops a new approach that embeds a CES production function in a model with a three-stage decision process: investment in the long run, wage bargaining in the medium run, and price and employment decisions in the short run. He estimates the elasticity of substitution to be below one. The second essay tests the restrictions implied by the core equations of the New Keynesian Model (NKM) in a vector autoregressive (VAR) model, using both euro area and U.S. data. Both the New Keynesian Phillips curve and the aggregate demand curve are estimated and tested. The restrictions implied by the core equations of the NKM are rejected on both U.S. and euro area data. These results are important for further research. The third essay is methodologically similar to the second, but it concentrates on Finnish macro data and adopts an open-economy theoretical framework. Juselius' results suggest that the open-economy NKM framework is too stylized to provide an adequate explanation of Finnish inflation. The final essay provides a macroeconometric model of Finnish inflation and associated explanatory variables, and estimates the relative importance of different inflation theories. His main finding is that Finnish inflation is primarily determined by excess demand in the product market and by changes in the long-term interest rate. This study is part of the research agenda carried out by the Research Unit of Economic Structure and Growth (RUESG). The aim of RUESG is to conduct theoretical and empirical research on important issues in industrial economics, real option theory, game theory, organization theory and the theory of financial systems, as well as to study problems in labor markets, macroeconomics, natural resources, taxation and time series econometrics. RUESG was established at the beginning of 1995 and is one of the National Centers of Excellence in research selected by the Academy of Finland. It is financed jointly by the Academy of Finland, the University of Helsinki, the Yrjö Jahnsson Foundation, the Bank of Finland and the Nokia Group. This support is gratefully acknowledged.
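
A minimal sketch of the kind of cointegrated-VAR starting point used in the first essay: a Johansen trace test for the cointegration rank of a small macro system. The DataFrame of level series is assumed; lag order and deterministic terms are illustrative:

```python
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import coint_johansen

def cointegration_rank(levels: pd.DataFrame, det_order: int = 0, k_ar_diff: int = 2):
    """Print Johansen trace statistics against 5% critical values."""
    res = coint_johansen(levels, det_order, k_ar_diff)
    for r, (trace, cv95) in enumerate(zip(res.lr1, res.cvt[:, 1])):
        print(f"H0: rank <= {r}: trace = {trace:.2f}, 5% critical value = {cv95:.2f}")
    return res
```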

Relevance: 30.00%

Abstract:

The brittle-to-ductile transition temperature (BDTT) of free-standing Pt-aluminide (PtAl) coating specimens, i.e. stand-alone coating specimens without any substrate, was determined by a micro-tensile testing technique. The effect of Pt content, expressed in terms of the thickness of the initial electrodeposited Pt layer, on the BDTT of the coating was evaluated and an empirical correlation drawn. Increasing the electrodeposited Pt layer thickness from nil to 10 μm was found to raise the BDTT of the coating by about 100 °C.
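
Illustrative only: fitting a linear empirical correlation of this kind. The (thickness, BDTT) pairs below are hypothetical placeholders, not measurements from the paper; the abstract reports only the roughly 100 °C rise from nil to 10 μm:

```python
import numpy as np

thickness_um = np.array([0.0, 2.5, 5.0, 7.5, 10.0])     # hypothetical
bdtt_c = np.array([700.0, 721.0, 752.0, 778.0, 801.0])  # hypothetical
slope, intercept = np.polyfit(thickness_um, bdtt_c, 1)
print(f"BDTT ~ {intercept:.0f} degC + {slope:.1f} degC/um * thickness")
```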

Relevance: 30.00%

Abstract:

We consider a visual search problem studied by Sripati and Olson, where the objective is to identify an oddball image embedded among multiple distractor images as quickly as possible. We model this visual search task as an active sequential hypothesis testing (ASHT) problem. Chernoff (1959) proposed a policy whose expected delay to decision is asymptotically optimal, the asymptotics being that of vanishing error probabilities. We first prove a stronger property on the moments of the delay until a decision, under the same asymptotics. Applying the result to the visual search problem, we then propose a "neuronal metric" on the measured neuronal responses that captures the discriminability between images. From an empirical study we obtain a remarkable correlation (r = 0.90) between the proposed neuronal metric and the speed of discrimination between the images. Although this correlation is lower than with the L1 metric used by Sripati and Olson, the proposed metric has the advantage of being firmly grounded in formal decision theory.
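
A compact sketch of Chernoff's procedure for active sequential hypothesis testing, under the simplifying assumption of finitely many hypotheses and controls with known outcome distributions; `lik[u]` is an (n_hypotheses, n_outcomes) array of p(y | h, control u), and all names are ours:

```python
import numpy as np

def chernoff_control(post, lik):
    """Choose the control maximizing the worst-case KL divergence between
    the current maximum-likelihood hypothesis and its closest alternative."""
    h_ml = int(np.argmax(post))

    def worst_case_kl(table):
        p = table[h_ml]
        return min(np.sum(p * np.log(p / table[h]))
                   for h in range(len(post)) if h != h_ml)

    return max(range(len(lik)), key=lambda u: worst_case_kl(lik[u]))

def bayes_update(post, lik_u, y):
    """Posterior update after observing outcome y under control u."""
    post = post * lik_u[:, y]
    return post / post.sum()

# Stop once max(post) exceeds 1 - delta and declare the ML hypothesis.
```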

Relevance: 30.00%

Abstract:

In the quest for a descriptive theory of decision-making, the rational actor model in economics imposes rather unrealistic expectations and abilities on human decision makers. The further we move from idealized scenarios, such as perfectly competitive markets, and ambitiously extend the reach of the theory to describe everyday decision-making situations, the less sense these assumptions make. Behavioural economics has instead proposed models based on assumptions that are more psychologically realistic, with the aim of gaining more precision and descriptive power. Increased psychological realism, however, comes at the cost of a greater number of parameters and model complexity. There is now a plethora of models, based on different assumptions and applicable in differing contextual settings, and selecting the right model to use tends to be an ad hoc process. In this thesis, we develop optimal experimental design methods and evaluate different behavioural theories against evidence from lab and field experiments.

We look at evidence from controlled laboratory experiments. Subjects are presented with choices between monetary gambles or lotteries. Different decision-making theories evaluate the choices differently and would make distinct predictions about the subjects' choices. Theories whose predictions are inconsistent with the actual choices can be systematically eliminated. Behavioural theories can have multiple parameters, requiring complex experimental designs with a very large number of possible choice tests. This imposes computational and economic constraints on using classical experimental design methods. We develop a methodology of adaptive tests, Bayesian Rapid Optimal Adaptive Designs (BROAD), that sequentially chooses the "most informative" test at each stage and, based on the response, updates its posterior beliefs over the theories, which in turn informs the next most informative test to run. BROAD utilizes the Equivalence Class Edge Cutting (EC2) criterion to select tests. We prove that the EC2 criterion is adaptively submodular, which allows us to prove theoretical guarantees against the Bayes-optimal testing sequence even in the presence of noisy responses. In simulated ground-truth experiments, we find that the EC2 criterion recovers the true hypotheses with significantly fewer tests than more widely used criteria such as Information Gain and Generalized Binary Search. We show, theoretically as well as experimentally, that, surprisingly, these popular criteria can perform poorly in the presence of noise or subject errors. Furthermore, we use the adaptive submodularity of EC2 to implement an accelerated greedy version of BROAD, which leads to orders-of-magnitude speedups over other methods.
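
A hedged sketch of the EC2 idea in the noiseless case (the thesis handles noisy responses via posterior reweighting). Hypotheses carry priors and a class label (the theory they support); `outcome[t, h]` in {0, 1} is the response hypothesis h predicts for test t. All names are ours:

```python
import itertools

def expected_cut_weight(t, alive, prior, cls, outcome):
    """Expected weight of inter-class edges cut by running test t. An edge
    (h, g) between hypotheses in different classes, weighted
    prior[h] * prior[g], is cut once the observed outcome rules out h or g."""
    total = 0.0
    mass = sum(prior[h] for h in alive)
    for y in (0, 1):
        p_y = sum(prior[h] for h in alive if outcome[t, h] == y) / mass
        cut = sum(prior[h] * prior[g]
                  for h, g in itertools.combinations(alive, 2)
                  if cls[h] != cls[g]
                  and (outcome[t, h] != y or outcome[t, g] != y))
        total += p_y * cut
    return total

def next_test(tests, alive, prior, cls, outcome):
    # Greedy step; adaptive submodularity of EC2 is what makes this
    # near-optimal relative to the Bayes-optimal testing sequence.
    return max(tests, key=lambda t: expected_cut_weight(t, alive, prior, cls, outcome))
```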

We use BROAD to perform two experiments. First, we compare the main classes of theories for decision-making under risk, namely expected value, prospect theory, constant relative risk aversion (CRRA) and moments models. Subjects are given an initial endowment and sequentially presented with choices between two lotteries, with the possibility of losses. The lotteries are selected using BROAD, and 57 subjects from Caltech and UCLA are incentivized by randomly realizing one of the lotteries chosen. Aggregate posterior probabilities over the theories show limited evidence in favour of the CRRA and moments models. Classifying the subjects into types shows that most subjects are described by prospect theory, followed by expected value. Adaptive experimental design raises the possibility that subjects could engage in strategic manipulation, i.e. subjects could mask their true preferences and choose differently in order to obtain more favourable tests in later rounds, thereby increasing their payoffs. We pay close attention to this problem; strategic manipulation is ruled out since it is infeasible in practice, and also since we do not find any signatures of it in our data.

In the second experiment, we compare the main theories of time preference: exponential discounting, hyperbolic discounting, the "present bias" models (quasi-hyperbolic (α, β) discounting and fixed-cost discounting), and generalized-hyperbolic discounting. 40 subjects from UCLA were given choices between two options: a smaller but more immediate payoff versus a larger but later payoff. We found very limited evidence for the present bias models and hyperbolic discounting, and most subjects were classified as generalized-hyperbolic discounting types, followed by exponential discounting.
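
For reference, the textbook forms of the discount functions being compared; the parameter values are illustrative, and we use the common (beta, delta) notation for the quasi-hyperbolic model rather than the thesis's (α, β):

```python
import numpy as np

def exponential(t, delta=0.98):
    return delta ** t

def hyperbolic(t, k=0.10):
    return 1.0 / (1.0 + k * t)

def quasi_hyperbolic(t, beta=0.80, delta=0.99):
    # "Present bias": full weight at t = 0, uniform discount beta afterwards.
    return np.where(t == 0, 1.0, beta * delta ** t)

def generalized_hyperbolic(t, alpha=1.0, beta=0.50):
    # Loewenstein-Prelec form; nests hyperbolic as alpha -> beta * k.
    return (1.0 + alpha * t) ** (-beta / alpha)
```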

In these models the passage of time is linear. We instead consider a psychological model where the perception of time is subjective. We prove that when the biological (subjective) time is positively dependent, it gives rise to hyperbolic discounting and temporal choice inconsistency.

We also test the predictions of behavioural theories in the "wild". We focus on prospect theory, which emerged as the dominant theory in our lab experiments on risky choice. Loss aversion and reference dependence predict that consumers will behave in ways distinct from what the standard rational model predicts. Specifically, loss aversion predicts that when an item is offered at a discount, the demand for it will be greater than can be explained by its price elasticity. Even more importantly, when the item is no longer discounted, demand for its close substitute will increase excessively. We tested this prediction using a discrete choice model with a loss-averse utility function on data from a large eCommerce retailer. Not only did we identify loss aversion, but we also found that the effect decreased with consumers' experience. We outline the policy implications of consumer loss aversion and strategies for competitive pricing.
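
A sketch of a reference-dependent utility inside a logit demand model, in the spirit of the field study above; the functional form and parameters are illustrative, not the thesis's estimated specification (lam = 2.25 is the canonical Tversky-Kahneman loss-aversion coefficient):

```python
import numpy as np

def utility(price, ref_price, alpha=1.0, lam=2.25):
    gain = np.maximum(ref_price - price, 0.0)  # discount relative to reference
    loss = np.maximum(price - ref_price, 0.0)  # increase relative to reference
    return -alpha * price + gain - lam * loss  # losses weighted more than gains

def choice_probabilities(prices, ref_prices):
    v = utility(np.asarray(prices), np.asarray(ref_prices))
    e = np.exp(v - v.max())                    # numerically stable softmax
    return e / e.sum()
```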

In future work, BROAD can be widely applied to testing different behavioural models, e.g. in social preference and game theory, and in different contextual settings. Additional measurements beyond choice data, including biological measurements such as skin conductance, can be used to more rapidly eliminate hypotheses and speed up model comparison. Discrete choice models also provide a framework for testing behavioural models with field data, and encourage combined lab-field experiments.

Relevance: 30.00%

Abstract:

This paper confirms the presence of a GARCH(1,1) effect in stock return time series from Vietnam's nascent stock market. We performed tests on four different time series, namely market returns (VN-Index) and the return series of the first four individual stocks listed on the Vietnamese exchange (the Ho Chi Minh City Securities Trading Center) since August 2000. The results are broadly consistent with previously reported empirical studies of other markets.
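
A minimal sketch of the estimation reported above using the `arch` package, assuming `returns` is a pandas Series of (percentage) daily returns such as the VN-Index:

```python
from arch import arch_model

am = arch_model(returns, mean="Constant", vol="GARCH", p=1, q=1)
res = am.fit(disp="off")
print(res.summary())                 # omega, alpha[1], beta[1] estimates
alpha, beta = res.params["alpha[1]"], res.params["beta[1]"]
print("volatility persistence:", alpha + beta)
```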

Relevance: 30.00%

Abstract:

This paper studies the multiplicity-correction effect of standard Bayesian variable-selection priors in linear regression. Our first goal is to clarify when, and how, multiplicity correction happens automatically in Bayesian analysis, and to distinguish this correction from the Bayesian Ockham's-razor effect. Our second goal is to contrast empirical-Bayes and fully Bayesian approaches to variable selection through examples, theoretical results and simulations. Considerable differences between the two approaches are found. In particular, we prove a theorem that characterizes a surprising asymptotic discrepancy between fully Bayes and empirical Bayes. This discrepancy arises from a different source than the failure to account for hyperparameter uncertainty in the empirical-Bayes estimate. Indeed, even at the extreme, when the empirical-Bayes estimate converges asymptotically to the true variable-inclusion probability, the potential for a serious difference remains.
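
A sketch of the automatic multiplicity correction discussed above: with a uniform prior on the prior inclusion probability p, the implied prior mass on any particular model with k of m candidate variables is Beta-Binomial, so adding spurious candidates automatically penalizes inclusion (an empirical-Bayes plug-in of a point estimate of p does not carry the same guarantee at the boundary):

```python
from math import comb

def model_prior(k: int, m: int) -> float:
    """P(model) = integral of p^k (1-p)^(m-k) dp = 1 / ((m + 1) * C(m, k))."""
    return 1.0 / ((m + 1) * comb(m, k))

for m in (5, 20, 100):
    # Prior mass on any given one-variable model shrinks as m grows.
    print(m, model_prior(1, m))
```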

Relevance: 30.00%

Abstract:

This paper addresses the market's call for a new style of business management that ensures the presence of women in decision-making in response to new social needs. To that end, it analyzes the influence of the gender diversity of directors on profitability and the level of debt for a sample of 5,199 Spanish cooperatives. Unlike capitalist firms, these organizations have a number of peculiarities in their governance, in that the members are at once principals, agents and customers. The study focuses on the Spanish context, where, as in other countries, there is an open debate on the importance of women in business management, driven by the proliferation of legislation on gender equality; Spain is, in addition, a pioneer in having specific legislation on the Social Economy. The results show that cooperatives with greater female representation on their boards have higher profitability. Moreover, boards with a higher percentage of women show a lower level of indebtedness.

Relevance: 30.00%

Abstract:

This paper examines the stability of the benefit transfer function across 42 recreational forests in the British Isles. A working definition of reliable function transfer is put forward, and a suitable statistical test is provided. A novel split-sample method is used to test the sensitivity of the models' log-likelihood values to the removal of contingent valuation (CV) responses collected at individual forest sites. We find that a stable function improves our measure of transfer reliability, but not by much. We conclude that, in empirical studies of transferability, considerations of function stability are secondary to the availability and quality of site attribute data. Modellers can weigh the advantages of transfer function stability against the value of additional information on recreation site attributes.
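
A hedged sketch of one common form of transferability test in this spirit (not necessarily the paper's exact statistic): a likelihood-ratio comparison of a pooled, transferable model against site-specific fits of the same CV responses. The log-likelihood inputs are assumed to come from maximized fits:

```python
from scipy.stats import chi2

def pooling_lr_test(ll_pooled: float, ll_sites: list[float], extra_params: int):
    """extra_params = free parameters gained by fitting each site separately."""
    stat = 2.0 * (sum(ll_sites) - ll_pooled)
    return stat, chi2.sf(stat, extra_params)
```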

Relevance: 30.00%

Abstract:

We examined the relationship between Individualism/Collectivism and generalized social trust across 31 European nations participating in the European Social Survey. Using multi-level regression analyses, the current study provides the first empirical investigation of the effects of cultural norms of Individualism/Collectivism on generalized social trust while accounting for individuals' own cultural orientations within the same analysis. The results provide clear support for Yamagishi and Yamagishi's (1994) emancipation theory of trust, showing a significant and positive relationship between Individualism/Collectivism and generalized social trust, over and above the effect of a country's political history of communism and its ethnic heterogeneity. Having controlled for the individual-level effects of Individualism/Collectivism, it is clear that the results of the current analysis cannot be reduced to an individual-level explanation, but must be interpreted within the context of macrosocial processes. We conclude by discussing potential mechanisms that could explain why national individualism is more likely to foster trust among people than collectivism.
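
A sketch of a two-level model in the spirit of the analysis above: individual trust regressed on country-level individualism and the respondent's own cultural orientation, with a random intercept per country. The data file and column names are hypothetical:

```python
import pandas as pd
import statsmodels.formula.api as smf

ess = pd.read_csv("ess_trust.csv")  # hypothetical individual-level data

md = smf.mixedlm(
    "trust ~ individualism_country + individualism_individual",
    data=ess,
    groups=ess["country"],          # random intercept per country
)
print(md.fit().summary())
```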

Relevance: 30.00%

Abstract:

This paper tests a simple market fraction asset pricing model with heterogeneous agents. By selecting a set of structural parameters of the model through a systematic procedure, we show that the autocorrelations (of returns, absolute returns and squared returns) of the market fraction model share the same pattern as those of the DAX 30. By conducting econometric analysis via Monte Carlo simulations, we characterize these power-law behaviours and find that estimates of the power-law decay indices, the (FI)GARCH parameters, and the tail index of the selected market fraction model closely match those of the DAX 30. The results strongly support the explanatory power of the heterogeneous agent models.
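
A sketch of the stylized-fact comparison above, assuming `r` is a return series (e.g. DAX 30 returns or simulated output of the market fraction model): slow decay in the autocorrelations of |r| and r², but not of r itself, is the power-law signature being matched:

```python
import numpy as np
from statsmodels.tsa.stattools import acf

for series, name in ((r, "returns"), (np.abs(r), "|returns|"), (r**2, "returns^2")):
    rho = acf(series, nlags=100, fft=True)
    print(name, np.round(rho[1:6], 3))  # compare decay across the three series
```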

Relevance: 30.00%

Abstract:

Side-channel analysis of cryptographic systems can allow an adversary to recover secret information even where the underlying algorithms have been shown to be provably secure. This is achieved by exploiting the unintentional leakages inherent in the implementation of the algorithm in software or hardware. Within this field of research, a class of attacks known as profiling attacks, or more specifically, as used here, template attacks, has been shown to be extremely efficient at extracting secret keys. Template attacks assume a strong adversarial model, in that the attacker has an identical device with which to profile the power consumption of various operations; this profile can then be used to attack the target device efficiently. Inherent in this assumption is that the power consumption across the devices under test is somewhat similar. This central tenet of the attack is largely unexplored in the literature, with the research community generally performing the profiling stage on the same device as is being attacked. This is beneficial for evaluation or penetration testing, as it is essentially the best-case scenario for an attacker, where the model built during the profiling stage matches that of the target device exactly; however, it does not necessarily reflect how the attack will work in practice. In this work, a large-scale evaluation of this assumption is performed, comparing key recovery performance across 20 identical smart cards when performing a profiling attack.
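
A minimal template-attack sketch under the strong adversarial model described above (profiling on an identical device): fit a Gaussian template per key-dependent class from profiling traces, then score target traces by log-likelihood. Names and array shapes are ours, not from the paper:

```python
import numpy as np
from scipy.stats import multivariate_normal

def build_templates(traces, labels):
    """traces: (n, d) array of selected power-trace points from the
    profiling device; labels: key-dependent class per trace."""
    return {c: (traces[labels == c].mean(axis=0),
                np.cov(traces[labels == c], rowvar=False))
            for c in np.unique(labels)}

def classify(trace, templates):
    """Maximum-likelihood class for one target-device trace."""
    return max(templates, key=lambda c: multivariate_normal.logpdf(
        trace, mean=templates[c][0], cov=templates[c][1], allow_singular=True))
```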