127 results for Test data generation
Abstract:
There is large uncertainty about the magnitude of warming and how rainfall patterns will change in response to any given scenario of future changes in atmospheric composition and land use. The models used for future climate projections were developed and calibrated using climate observations from the past 40 years. The geologic record of environmental responses to climate changes provides a unique opportunity to test model performance outside this limited climate range. Evaluation of model simulations against palaeodata shows that models reproduce the direction and large-scale patterns of past changes in climate, but tend to underestimate the magnitude of regional changes. As part of the effort to reduce model-related uncertainty and produce more reliable estimates of twenty-first century climate, the Palaeoclimate Modelling Intercomparison Project is systematically applying palaeoevaluation techniques to simulations of the past run with the models used to make future projections. This evaluation will provide assessments of model performance, including whether a model is sufficiently sensitive to changes in atmospheric composition, as well as providing estimates of the strength of biosphere and other feedbacks that could amplify the model response to these changes and modify the characteristics of climate variability.
Abstract:
Developing models to predict the effects of social and economic change on agricultural landscapes is an important challenge. Model development often involves making decisions about which aspects of the system require detailed description and which are reasonably insensitive to the assumptions. However, important components of the system are often left out because parameter estimates are unavailable. In particular, the relative influence of different objectives, such as risk aversion or environmental management, on farmer decision making has proven difficult to quantify. We describe a model that can make predictions of land use on the basis of profit alone or with the inclusion of explicit additional objectives. Importantly, our model is specifically designed to use parameter estimates for additional objectives obtained via farmer interviews. By statistically comparing the outputs of this model with a large farm-level land-use data set, we show that cropping patterns in the United Kingdom contain a significant contribution from farmers’ preference for objectives other than profit. In particular, we found that risk aversion had an effect on the accuracy of model predictions, whereas preference for a particular number of crops grown was less important. While nonprofit objectives have frequently been identified as factors in farmers’ decision making, our results take this analysis further by demonstrating the relationship between these preferences and actual cropping patterns.
Abstract:
This chapter introduces the latest practices and technologies in the interactive interpretation of environmental data. With environmental data becoming ever larger, more diverse and more complex, there is a need for a new generation of tools that provides new capabilities over and above those of the standard workhorses of science. These new tools aid the scientist in discovering interesting new features (and also problems) in large datasets by allowing the data to be explored interactively using simple, intuitive graphical tools. In this way, new discoveries are made that are commonly missed by automated batch data processing. This chapter discusses the characteristics of environmental science data, common current practice in data analysis and the supporting tools and infrastructure. New approaches are introduced and illustrated from the points of view of both the end user and the underlying technology. We conclude by speculating as to future developments in the field and what must be achieved to fulfil this vision.
Abstract:
Details are given of the development and application of a 2D depth-integrated, conformal boundary-fitted, curvilinear model for predicting the depth-mean velocity field and the spatial concentration distribution in estuarine and coastal waters. A numerical method for conformal mesh generation, based on a boundary integral equation formulation, has been developed. By this method a general polygonal region with curved edges can be mapped onto a regular polygonal region with the same number of horizontal and vertical straight edges, and a multiply connected region can be mapped onto a regular region with the same connectivity. A stretching transformation on the conformally generated mesh has also been used to provide greater detail where it is needed close to the coast, with larger mesh sizes further offshore, thereby minimizing the computing effort whilst maximizing accuracy. The curvilinear hydrodynamic and solute model has been developed based on a robust rectilinear model. The hydrodynamic equations are approximated using the ADI finite difference scheme with a staggered grid, and the solute transport equation is approximated using a modified QUICK scheme. Three numerical examples have been chosen to test the curvilinear model, with an emphasis placed on complex practical applications.
Abstract:
This paper assesses the performance of a vocabulary test designed to measure second language productive vocabulary knowledge. The test, Lex30, uses a word association task to elicit vocabulary, and uses word frequency data to measure the vocabulary produced. Here we report firstly on the reliability of the test as measured by a test-retest study, a parallel test forms experiment and an internal consistency measure. We then investigate the construct validity of the test by looking at changes in test performance over time, analyses of correlations with scores on similar tests, and comparison of spoken and written test performance. Finally, we examine the theoretical bases of the two main test components: eliciting vocabulary and measuring vocabulary. Interpretations of our findings are discussed in the context of test validation research literature. We conclude that the findings reported here present a robust argument for the validity of the test as a research tool, and encourage further investigation of its validity in an instructional context.
Abstract:
In the past decade, the analysis of data has faced the challenge of dealing with very large and complex datasets and the real-time generation of data. Technologies to store and access these complex and large datasets are in place. However, robust and scalable analysis technologies are needed to extract meaningful information from these datasets. The research field of Information Visualization and Visual Data Analytics addresses this need. Information visualization and data mining are often used to complement each other. Their common goal is the extraction of meaningful information from complex and possibly large data. However, whereas data mining focuses on the use of silicon hardware, visualization techniques also aim to harness the powerful image-processing capabilities of the human brain. This article highlights research on data visualization and visual analytics techniques. Furthermore, we survey existing visual analytics techniques, systems, and applications, including a perspective on the field from the chemical process industry.
Abstract:
In financial research, the sign of a trade (or identity of the trade aggressor) is not always available in the transaction dataset, and it can be estimated using a simple set of rules called the tick test. In this paper we investigate the accuracy of the tick test from an analytical perspective by providing a closed formula for the performance of the prediction algorithm. By analyzing the derived equation, we provide formal arguments for the use of the tick test by proving that it is guaranteed to perform better than chance (50/50) and that the set of rules from the tick test provides an unbiased estimator of the trade signs. On the empirical side of the research, we compare the values from the analytical formula against the empirical performance of the tick test for fifteen heavily traded stocks in the Brazilian equity market. The results show that the formula is quite realistic in assessing the accuracy of the prediction algorithm in a real data situation.
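The tick-test rules mentioned above are simple enough to sketch directly. The following is a minimal illustration (not the authors' implementation): an uptick classifies a trade as buyer-initiated (+1), a downtick as seller-initiated (−1), and a zero tick inherits the sign of the last non-zero tick.

```python
def tick_test(prices):
    """Classify trades as buyer- (+1) or seller-initiated (-1) via the tick test.

    Uptick -> +1, downtick -> -1; on a zero tick (or the first trade,
    which has no reference price) the last known sign is reused (0 if none).
    """
    signs = []
    last_sign = 0
    prev = None
    for p in prices:
        if prev is None or p == prev:
            signs.append(last_sign)  # zero tick or first trade: carry sign forward
        elif p > prev:
            last_sign = 1
            signs.append(1)
        else:
            last_sign = -1
            signs.append(-1)
        prev = p
    return signs
```

For example, the price path 10.0, 10.1, 10.1, 10.0 yields signs 0 (no reference), +1 (uptick), +1 (zero tick carries the buy sign), −1 (downtick).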
Abstract:
We compare hypothetical and observed (experimental) willingness to pay (WTP) for a gradual improvement in the environmental performance of a marketed good (an office table). First, following usual practices in marketing research, subjects’ stated WTP for the improvement is obtained. Second, the same subjects participate in a real reward experiment designed to replicate the scenario valued in the hypothetical question. Our results show that, independently of the degree of the improvement, there are no significant median differences between stated and experimental data. However, subjects reporting extreme values of WTP (low or high) exhibit a more moderate behavior in the experiment.
Abstract:
We conducted 2 longitudinal mediational studies to test an integrative model of goals, stress and coping, and well‐being. Study 1 documented avoidance personal goals as an antecedent of life stressors and life stressors as a partial mediator of the relation between avoidance goals and longitudinal change in subjective well‐being (SWB). Study 2 fully replicated Study 1 and likewise validated avoidance goals as an antecedent of avoidance coping and avoidance coping as a partial mediator of the relation between avoidance goals and longitudinal change in SWB. It also showed that avoidance coping partially mediates the link between avoidance goals and life stressors and validated a sequential mediational model involving both avoidance coping and life stressors. The aforementioned results held when controlling for social desirability, basic traits, and general motivational dispositions. The findings are discussed with regard to the integration of various strands of research on self‐regulation.
Abstract:
It has been reported that the ability to solve syllogisms is highly g-loaded. In the present study, using a self-administered shortened version of a syllogism-solving test, the BAROCO Short, we examined whether robust findings generated by previous research regarding IQ scores were also applicable to BAROCO Short scores. Five syllogism-solving problems were included in a questionnaire as part of a postal survey conducted by the Keio Twin Research Center. Data were collected from 487 pairs of twins (1021 individuals) who were Japanese junior high or high school students (ages 13–18) and from 536 mothers and 431 fathers. Four findings related to IQ were replicated: 1) The mean level increased gradually during adolescence, stayed unchanged from the 30s to the early 50s, and subsequently declined after the late 50s. 2) The scores for both children and parents were predicted by the socioeconomic status of the family. 3) The genetic effect increased, although the shared environmental effect decreased during progression from adolescence to adulthood. 4) Children's scores were genetically correlated with school achievement. These findings further substantiate the close association between syllogistic reasoning ability and g.
Abstract:
A number of methods of evaluating the validity of interval forecasts of financial data are analysed, and illustrated using intraday FTSE100 index futures returns. Some existing interval forecast evaluation techniques, such as the Markov chain approach of Christoffersen (1998), are shown to be inappropriate in the presence of periodic heteroscedasticity. Instead, we consider a regression-based test, and a modified version of Christoffersen's Markov chain test for independence, and analyse their properties when the financial time series exhibit periodic volatility. These approaches lead to different conclusions when interval forecasts of FTSE100 index futures returns generated by various GARCH(1,1) and periodic GARCH(1,1) models are evaluated.
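The interval-forecast evaluation discussed above builds on Christoffersen's (1998) likelihood-ratio test of unconditional coverage. A minimal sketch of that test (an illustration, not the code used in the paper): given a 0/1 "hit" sequence indicating whether each observation fell inside its forecast interval, the LR statistic compares the nominal coverage p against the empirical hit rate.

```python
import math

def lr_unconditional_coverage(hits, p):
    """Christoffersen (1998) LR test of unconditional coverage.

    hits : sequence of 0/1 indicators (1 = observation inside the interval)
    p    : nominal coverage probability of the forecast interval

    Returns the LR statistic, asymptotically chi-squared with 1 df
    under the null that the true coverage equals p.
    """
    n1 = sum(hits)            # number of hits
    n0 = len(hits) - n1       # number of misses
    pi_hat = n1 / (n0 + n1)   # empirical coverage
    if pi_hat in (0.0, 1.0):  # degenerate sample: likelihood ratio diverges
        return float("inf")
    log_null = n0 * math.log(1 - p) + n1 * math.log(p)
    log_alt = n0 * math.log(1 - pi_hat) + n1 * math.log(pi_hat)
    return -2.0 * (log_null - log_alt)
```

When the empirical hit rate equals the nominal coverage the statistic is zero; as the two diverge it grows, and values beyond the chi-squared(1) critical value reject correct unconditional coverage. Christoffersen's independence test extends this idea to the Markov transition structure of the hit sequence, which is the component shown in the paper to be sensitive to periodic heteroscedasticity.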
Abstract:
Human ICT implants, such as RFID implants, cochlear implants, cardiac pacemakers, Deep Brain Stimulation, bionic limbs connected to the nervous system, and networked cognitive prostheses, are becoming increasingly complex. With ever-growing data processing functionalities in these implants, privacy and security become vital concerns. Electronic attacks on human ICT implants can cause significant harm, both to implant subjects and to their environment. This paper explores the vulnerabilities which human implants pose to crime victimisation in light of recent technological developments, and analyses how the law can deal with emerging challenges of what may well become the next generation of cybercrime: attacks targeted at technology implanted in the human body. After a state-of-the-art description of relevant types of human implants and a discussion of how these implants challenge existing perceptions of the human body, we describe how various modes of attacks, such as sniffing, hacking, data interference, and denial of service, can be committed against implants. Subsequently, we analyse how these attacks can be assessed under current substantive and procedural criminal law, drawing on examples from UK and Dutch law. The possibilities and limitations of cybercrime provisions (eg, unlawful access, system interference) and bodily integrity provisions (eg, battery, assault, causing bodily harm) to deal with human-implant attacks are analysed. Based on this assessment, the paper concludes that attacks on human implants are not only a new generation in the evolution of cybercrime, but also raise fundamental questions on how criminal law conceives of attacks. Traditional distinctions between physical and non-physical modes of attack, between human bodies and things, and between the exterior and interior of the body need to be re-interpreted in light of developments in human implants.
As the human body and technology become increasingly intertwined, cybercrime legislation and body-integrity crime legislation will also become intertwined, posing a new puzzle that legislators and practitioners will sooner or later have to solve.
Abstract:
This paper considers the effect of using a GARCH filter on the properties of the BDS test statistic, as well as a number of other issues relating to the application of the test. It is found that, for certain values of the user-adjustable parameters, the finite sample distribution of the test is far removed from asymptotic normality. In particular, when data generated from some completely different model class are filtered through a GARCH model, the frequency of rejection of iid falls, often substantially. The implication of this result is that it might be inappropriate to use non-rejection of iid of the standardised residuals of a GARCH model as evidence that the GARCH model ‘fits’ the data.
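The GARCH filtering step referred to above produces the standardised residuals whose iid property is then examined with the BDS test. A minimal sketch of a GARCH(1,1) filter, with parameter values assumed already known (e.g. from a prior maximum-likelihood fit); this is an illustration, not the paper's implementation:

```python
def garch_filter(returns, omega, alpha, beta):
    """Filter a return series through a GARCH(1,1) model and return
    the standardised residuals r_t / sigma_t, where

        sigma2_t = omega + alpha * r_{t-1}**2 + beta * sigma2_{t-1}.

    The recursion is started at the unconditional variance
    omega / (1 - alpha - beta), a common convention (requires
    alpha + beta < 1).
    """
    sigma2 = omega / (1.0 - alpha - beta)  # unconditional variance as start-up value
    std_resid = []
    for r in returns:
        std_resid.append(r / sigma2 ** 0.5)
        sigma2 = omega + alpha * r * r + beta * sigma2  # update for next period
    return std_resid
```

If the GARCH model is an adequate description of the volatility dynamics, the standardised residuals should be approximately iid; the paper's point is that failing to reject iid on these residuals is weaker evidence of fit than is often assumed.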
Abstract:
This paper presents and implements a number of tests for non-linear dependence and a test for chaos using transactions prices on three LIFFE futures contracts: the Short Sterling interest rate contract, the Long Gilt government bond contract, and the FTSE 100 stock index futures contract. While previous studies of high frequency futures market data use only those transactions which involve a price change, we use all of the transaction prices on these contracts whether they involve a price change or not. Our results indicate irrefutable evidence of non-linearity in two of the three contracts, although we find no evidence of a chaotic process in any of the series. We are also able to provide some indications of the effect of the duration of the trading day on the degree of non-linearity of the underlying contract. The trading day for the Long Gilt contract was extended in August 1994, and prior to this date there is no evidence of any structure in the return series. However, after the extension of the trading day we do find evidence of a non-linear return structure.
Abstract:
Pollen data from China for 6000 and 18,000 ¹⁴C yr BP were compiled and used to reconstruct palaeovegetation patterns, using complete taxon lists where possible and a biomization procedure that entailed the assignment of 645 pollen taxa to plant functional types. A set of 658 modern pollen samples spanning all biomes and regions provided a comprehensive test for this procedure and showed convincing agreement between reconstructed biomes and present natural vegetation types, both geographically and in terms of the elevation gradients in mountain regions of north-eastern and south-western China. The 6000 ¹⁴C yr BP map confirms earlier studies in showing that the forest biomes in eastern China were systematically shifted northwards and extended westwards during the mid-Holocene. Tropical rain forest occurred on mainland China at sites characterized today by either tropical seasonal or broadleaved evergreen/warm mixed forest. Broadleaved evergreen/warm mixed forest occurred further north than today, and at higher elevation sites within the modern latitudinal range of this biome. The northern limit of temperate deciduous forest was shifted c. 800 km north relative to today. The 18,000 ¹⁴C yr BP map shows that steppe and even desert vegetation extended to the modern coast of eastern China at the last glacial maximum, replacing today’s temperate deciduous forest. Tropical forests were excluded from China and broadleaved evergreen/warm mixed forest had retreated to tropical latitudes, while taiga extended southwards to c. 43°N.