15 results for THRESHOLD
in CentAUR: Central Archive, University of Reading - UK
Abstract:
In this review, we consider three possible criteria by which knowledge might be regarded as implicit or inaccessible: It might be implicit only in the sense that it is difficult to articulate freely, or it might be implicit according to either an objective threshold or a subjective threshold. We evaluate evidence for these criteria in relation to artificial grammar learning, the control of complex systems, and sequence learning, respectively. We argue that the convincing evidence is not yet in, but construing the implicit nature of implicit learning in terms of a subjective threshold is most likely to prove fruitful for future research. Furthermore, the subjective threshold criterion may demarcate qualitatively different types of knowledge. We argue that (1) implicit, rather than explicit, knowledge is often relatively inflexible in transfer to different domains, (2) implicit, rather than explicit, learning occurs when attention is focused on specific items and not underlying rules, and (3) implicit learning and the resulting knowledge are often relatively robust.
Abstract:
Recent studies into price transmission have recognized the important role played by transport and transaction costs. Threshold models are one approach to accommodating such costs. We develop a generalized Threshold Error Correction Model to test for the presence and form of threshold behavior in price transmission that is symmetric around equilibrium. We use monthly wheat, maize, and soya prices from the United States, Argentina, and Brazil to demonstrate this model. Classical estimation of these generalized models can present challenges, but Bayesian techniques avoid many of these problems. Evidence for thresholds is found in three of the five commodity price pairs investigated.
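The core mechanism can be conveyed in a few lines. Below is a minimal Python sketch of one step of a two-regime, symmetric threshold error correction, assuming a single band around the long-run equilibrium p1 = beta * p2; the names (tecm_step, alpha_out) and the band form are illustrative assumptions, not the paper's estimated model.

    def tecm_step(p1, p2, beta, threshold, alpha_out):
        """One step of a symmetric two-regime threshold error correction:
        prices adjust only when the deviation from the long-run equilibrium
        exceeds the band, mimicking transport/transaction costs."""
        ect = p1 - beta * p2              # error correction term (deviation)
        if abs(ect) <= threshold:
            return 0.0                    # inside the band: no adjustment
        return -alpha_out * ect           # outside: revert toward equilibrium

    # Inside the band nothing happens; outside, prices revert:
    # tecm_step(100.0, 98.0, 1.0, 5.0, 0.3) -> 0.0
    # tecm_step(110.0, 98.0, 1.0, 5.0, 0.3) -> -3.6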
Abstract:
Globally, there have been a number of concerns about the development of genetically modified crops, many of which relate to the implications of gene flow at various levels. In Europe, these concerns have led the European Union (EU) to promote the concept of 'coexistence', allowing the freedom to plant conventional and genetically modified (GM) varieties while minimising the presence of transgenic material within conventional crops. Should a premium for non-GM varieties emerge on the market, the presence of transgenes would generate a 'negative externality' for conventional growers. The establishment of a maximum tolerance level for the adventitious presence of GM material in conventional crops produces a threshold effect in the external costs. The existing literature suggests that, apart from the biological characteristics of the plant under consideration (e.g. self-pollination rates, entomophilous species, anemophilous species, etc.), gene flow at the landscape level is affected by the relative size of the source and sink populations and the spatial arrangement of the fields in the landscape. In this paper, we take genetically modified herbicide-tolerant oilseed rape (GM HT OSR) as a model crop. Starting from an individual pollen dispersal function, we develop a spatially explicit numerical model to assess the effect of the size of the source/sink populations and the degree of spatial aggregation on the extent of gene flow into conventional OSR varieties under two alternative settings. We find that when the transgene presence in conventional produce is detected at the field level, the external cost increases with the size of the source area and with the level of spatial disaggregation. On the other hand, when the transgene presence is averaged among all conventional fields in the landscape (e.g. because of grain mixing before detection), the external cost depends only on the relative size of the source area. The model could readily be incorporated into an economic evaluation of policies to regulate adoption of GM HT OSR.
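As a rough illustration of the kind of spatially explicit calculation involved, the Python sketch below computes the GM pollen share reaching each conventional cell of a gridded landscape under a simple exponential dispersal kernel; the kernel form, decay rate, and function names are illustrative assumptions, not the paper's fitted dispersal function.

    import numpy as np

    def transgene_presence(is_gm, cell_size=10.0, decay=0.05):
        """Fraction of pollen of GM origin reaching each conventional cell
        of a square landscape, using an illustrative exp(-decay * distance)
        dispersal kernel summed over all source cells."""
        n = is_gm.shape[0]
        ys, xs = np.mgrid[0:n, 0:n]
        coords = np.stack([ys.ravel(), xs.ravel()], axis=1) * cell_size
        d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
        kernel = np.exp(-decay * d)
        gm = is_gm.ravel().astype(float)
        share = (kernel @ gm) / kernel.sum(axis=1)
        return np.where(is_gm, np.nan, share.reshape(n, n))  # conventional cells only

    # Illustrative landscape: a 20x20 grid with a compact 5x5 GM block
    layout = np.zeros((20, 20), dtype=bool)
    layout[0:5, 0:5] = True
    share = transgene_presence(layout)
    field_level = np.nanmax(share)    # worst single conventional cell
    landscape = np.nanmean(share)     # after grain mixing across all fields

The two final lines correspond to the two detection settings in the abstract: field-level detection compares each conventional cell against the tolerance threshold, while grain mixing amounts to averaging the shares across the landscape.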
Abstract:
Light Detection And Ranging (LIDAR) is an important modality in terrain and land surveying for many environmental, engineering and civil applications. This paper presents the framework for a recently developed unsupervised classification algorithm called Skewness Balancing for object and ground point separation in airborne LIDAR data. The main advantages of the algorithm are that it is threshold-free and independent of LIDAR data format and resolution, while preserving object and terrain details. In this contribution, the Skewness Balancing framework is extended with a prediction model in which unknown LIDAR tiles can be categorised as “hilly” or “moderate” terrain. Accuracy assessment of the model is carried out using cross-validation, with an overall accuracy of 95%. An extension to the algorithm is developed to address the over-classification issue on hilly terrain. For moderate terrain, the results show that detached objects (buildings and vegetation) and attached objects (bridges and motorway junctions) are separated from bare earth (ground, roads and yards) in the classified tiles, which makes Skewness Balancing ideal for integration into geographic information system (GIS) software packages.
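The basic idea behind Skewness Balancing is compact enough to sketch. The Python below peels off the highest elevation returns until the remaining distribution is no longer positively skewed, which is the core of the algorithm as commonly described; the prediction model and the hilly-terrain extension from this paper are not reproduced here.

    import numpy as np
    from scipy.stats import skew

    def skewness_balancing(elevations):
        """Separate object points from ground points by removing the highest
        elevations until the remaining sample is no longer positively skewed
        (a minimal sketch of the core idea)."""
        z = np.sort(np.asarray(elevations, dtype=float))
        ground_end = z.size
        while ground_end > 3 and skew(z[:ground_end]) > 0:
            ground_end -= 1                      # drop the current highest point
        return z[:ground_end], z[ground_end:]    # (ground, objects)

    # ground, objects = skewness_balancing(tile_elevations)

Because the stopping rule is a property of the distribution itself, no elevation threshold has to be supplied, which is the "threshold-freedom" the abstract refers to.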
Abstract:
A new approach is presented to identify the number of incoming signals in antenna array processing. The new method exploits the inherent properties of the noise eigenvalues of the covariance matrix of the array output. A single threshold is established that incorporates information about signal and noise strength, data length, and array size. When subspace-based algorithms are adopted, the additional computational cost of the signal number detector is almost negligible. The performance of the threshold is robust against low SNR and short data lengths.
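A simplified version of eigenvalue-based signal number detection is easy to sketch in Python. The code below counts sample covariance eigenvalues that stand clear of the noise-eigenvalue cluster; the scale factor is an illustrative placeholder, not the single threshold derived in the paper.

    import numpy as np

    def count_signals(snapshots, noise_floor_factor=1.5):
        """Estimate the number of incoming signals from the eigenvalues of
        the sample covariance matrix: signal eigenvalues stand out above
        the cluster of noise eigenvalues near the noise power."""
        # snapshots: (array_size, num_snapshots) complex array output
        R = snapshots @ snapshots.conj().T / snapshots.shape[1]
        eigs = np.sort(np.linalg.eigvalsh(R))[::-1]   # real, descending
        noise_est = eigs[-1]                          # smallest ~ noise power
        return int(np.sum(eigs > noise_floor_factor * noise_est))

Since subspace methods such as MUSIC already require this eigendecomposition, counting eigenvalues against a threshold adds essentially no extra computation, which is the point made in the abstract.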
Abstract:
Area-wide development viability appraisals are undertaken to determine the economic feasibility of policy targets in relation to planning obligations. Essentially, a development viability appraisal consists of a series of residual valuations of hypothetical development sites across a local authority area at a particular point in time. The valuations incorporate the estimated financial implications of the proposed level of planning obligations. To determine viability, the output land values are benchmarked against threshold land value, so the basis on which this threshold is established and the level at which it is set are critical to development viability appraisal at the policy-setting (area-wide) level. Essentially, threshold land value is an estimate of the value at which a landowner would be prepared to sell. If the estimated site values are higher than the threshold land value, the policy target is considered viable. This paper investigates the effectiveness of existing methods of determining threshold land value, testing them against the relationship between development value and costs. Modelling reveals that threshold land value that is not related to shifts in development value renders marginal sites unviable and fails to collect proportionate planning obligations from high-value/low-cost sites. Testing the model against national average house prices and build costs reveals the high degree of volatility in residual land values over time and underlines the importance of making threshold land value relative to the main driver of this volatility, namely development value.
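The arithmetic behind the viability test is worth making explicit. The Python sketch below is a stylised residual valuation with the benchmark comparison; the flat profit-on-GDV assumption and the figures are illustrative simplifications, not the paper's appraisal model.

    def residual_land_value(gdv, build_cost, profit_rate, obligations):
        """Residual valuation: land value is what remains of gross
        development value (GDV) after build costs, developer's profit,
        and the policy's planning obligations."""
        return gdv - build_cost - profit_rate * gdv - obligations

    def policy_is_viable(residual, threshold_land_value):
        """The policy target is viable on a site if the residual land
        value clears the landowner's assumed sale threshold."""
        return residual >= threshold_land_value

    # e.g. GDV 5.0m, build cost 3.2m, 20% profit on GDV, 0.4m obligations:
    # residual_land_value(5.0e6, 3.2e6, 0.20, 0.4e6) -> 400000.0

Because the residual is the small difference between large value and cost figures, modest shifts in GDV produce large proportional swings in land value, which is the volatility the abstract highlights.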
Abstract:
This mixed-method study tracked social interaction and adaptation among 20 international postgraduates on a 1-year programme in the UK, examining assumptions that language proficiency and interactional engagement directly underpin sociocultural adaptation. Participants remained frustrated by a perceived ‘threshold’ barring successful interaction with English speakers, while reporting reluctance to take up available opportunities, independent of language proficiency and sociocultural adaptation. We challenge linear models of adaptation and call for assistance to international students in crossing the threshold to successful interaction.
Abstract:
We test the expectations theory of the term structure of U.S. interest rates in nonlinear systems. These models allow the response of the change in short rates to past values of the spread to depend upon the level of the spread. The nonlinear system is tested against a linear system, and the results of testing the expectations theory in the two models are contrasted. We find that the results of tests of the implications of the expectations theory depend on the size and sign of the spread. The long-maturity spread predicts future changes in the short rate only when the spread is high.
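A stripped-down two-regime version of such a response can be written directly. In the Python sketch below, the short rate's reaction to the lagged spread switches with the level of the spread; the single-threshold form and parameter names are illustrative assumptions, not the paper's estimated system.

    import numpy as np

    def short_rate_response(spread_lag, gamma_low, gamma_high, tau):
        """Predicted change in the short rate as a regime-dependent response
        to the lagged long-short spread: the coefficient switches when the
        spread crosses the threshold tau."""
        gamma = np.where(spread_lag > tau, gamma_high, gamma_low)
        return gamma * spread_lag

    # With gamma_low = 0 and gamma_high > 0, only a high spread predicts
    # future changes in the short rate, as in the abstract's finding.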
Abstract:
Although financial theory rests heavily upon the assumption that asset returns are normally distributed, value indices of commercial real estate display significant departures from normality. In this paper, we apply and compare the properties of two recently proposed regime switching models for value indices of commercial real estate in the US and the UK, both of which relax the assumption that observations are drawn from a single distribution with constant mean and variance. Statistical tests of the models' specification indicate that the Markov switching model is better able to capture the non-stationary features of the data than the threshold autoregressive model, although both describe the data better than models that allow for only one state. Our results have several implications for theoretical models and empirical research in finance.
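To make the regime switching idea concrete, the Python sketch below simulates returns from a two-state Markov switching model in which both mean and variance change with a persistent latent state; all parameters are illustrative. The threshold autoregressive alternative differs in that its regime is set by an observed lagged value rather than a hidden state.

    import numpy as np

    def simulate_markov_switching(n, means, sigmas, p_stay, seed=None):
        """Simulate returns from a two-state Markov switching model: each
        period the latent state persists with probability p_stay[state],
        and the observation is drawn from that state's own mean and
        standard deviation (relaxing the single-distribution assumption)."""
        rng = np.random.default_rng(seed)
        state = 0
        out = np.empty(n)
        for t in range(n):
            if rng.random() > p_stay[state]:
                state = 1 - state
            out[t] = rng.normal(means[state], sigmas[state])
        return out

    # e.g. a calm state and a volatile state:
    # r = simulate_markov_switching(500, means=(0.01, -0.02),
    #                               sigmas=(0.02, 0.08), p_stay=(0.95, 0.90))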
Abstract:
This paper combines and generalizes a number of recent time series models of daily exchange rate series by using a SETAR model that also allows the variance equation of a GARCH specification for the error terms to be drawn from more than one regime. An application of the model to the French Franc/Deutschmark exchange rate demonstrates that out-of-sample forecasts of exchange rate volatility are also improved when the restriction that the data are drawn from a single regime is removed. This result highlights the importance of considering both types of regime shift (i.e. thresholds in variance as well as in mean) when analysing financial time series.
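A minimal simulation makes the double regime shift explicit. The Python sketch below draws from a two-regime SETAR process whose GARCH(1,1) variance parameters also switch with the lagged level; the parameter names are illustrative, and the paper's fitted model is richer than this.

    import numpy as np

    def simulate_setar_garch(n, tau, phi, omega, alpha, beta, seed=None):
        """Simulate a two-regime SETAR-GARCH process: both the AR coefficient
        and the GARCH(1,1) variance parameters depend on whether the lagged
        observation is above or below the threshold tau (thresholds in mean
        and in variance). phi, omega, alpha, beta are 2-element sequences."""
        rng = np.random.default_rng(seed)
        y = np.zeros(n)
        eps = np.zeros(n)
        h = np.zeros(n)
        h[0] = omega[0] / (1.0 - alpha[0] - beta[0])   # unconditional variance, regime 0
        for t in range(1, n):
            r = 0 if y[t - 1] <= tau else 1            # regime set by the lagged level
            h[t] = omega[r] + alpha[r] * eps[t - 1] ** 2 + beta[r] * h[t - 1]
            eps[t] = np.sqrt(h[t]) * rng.standard_normal()
            y[t] = phi[r] * y[t - 1] + eps[t]
        return y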
Abstract:
Taste and smell detection threshold measurements are frequently time consuming, especially when the method involves reversing the concentrations presented in order to replicate and improve the accuracy of results. These multiple replications are likely to cause sensory and cognitive fatigue, which may be more pronounced in elderly populations. A new rapid detection threshold methodology was developed that quickly locates the likely position of each individual's sensory detection threshold and then refines this by presenting multiple concentrations around that point. This study evaluates the reliability and validity of the method. Findings indicate that the new rapid detection threshold methodology is appropriate for identifying differences in sensory detection thresholds between different populations and has the benefit of providing a shorter assessment of detection thresholds. The results indicate that the method is appropriate for determining individual as well as group detection thresholds.
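The coarse-then-refine logic can be sketched as follows in Python, assuming a callable respond(c) that returns True when the panellist correctly detects concentration c; this two-stage structure is a plausible reading of the method described, and the specific step rules here are illustrative rather than the authors' protocol.

    def rapid_threshold(respond, coarse_series, refine_steps=5):
        """Two-stage rapid threshold sketch: a quick ascending pass over
        widely spaced concentrations locates the approximate detection
        point; a second pass presents concentrations clustered around it."""
        # Stage 1: coarse ascending scan
        approx = coarse_series[-1]
        for c in coarse_series:
            if respond(c):
                approx = c
                break
        # Stage 2: a short geometric series around the coarse estimate
        lo, hi = approx / 2.0, approx * 2.0
        fine = [lo * (hi / lo) ** (i / (refine_steps - 1))
                for i in range(refine_steps)]
        detected = [c for c in fine if respond(c)]
        return min(detected) if detected else hi

By spending trials only near each individual's likely threshold, far fewer presentations are needed than in a full reversed-series design, which is the source of the time saving.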
Abstract:
Sensory thresholds are often collected through ascending forced-choice methods. Group thresholds are important for comparing stimuli or populations; yet the method has two problems: an individual may guess the correct answer by chance at any concentration step, and may detect correctly at low concentrations but become adapted or fatigued at higher concentrations. The survival analysis method deals with both issues. Individual sequences of incorrect and correct answers are adjusted, taking into account the group performance at each concentration; the adjustment reduces the influence of chance runs of consecutive correct answers. Adjusted sequences are then submitted to survival analysis to determine group thresholds. The technique was applied to an aroma threshold study and a taste threshold study, and produced group thresholds similar to those from ASTM or logarithmic regression procedures. Significant differences in taste thresholds between younger and older adults were detected. The approach provides a more robust technique than previous estimation methods.
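The survival analysis step can be illustrated with a hand-rolled Kaplan-Meier estimate over concentration steps, shown below in Python. Each panellist's adjusted individual threshold is treated as an "event", panellists who never detected are right-censored, and reading off the 0.5 crossing is one simple way to define the group threshold; the paper's exact adjustment and estimator are not reproduced here.

    import numpy as np

    def km_group_threshold(thresholds, censored):
        """Kaplan-Meier sketch over concentrations: 'survival' is the
        probability of still not detecting. The group threshold is the
        concentration at which that probability first falls below 0.5."""
        t = np.asarray(thresholds, dtype=float)
        c = np.asarray(censored, dtype=bool)
        order = np.lexsort((c, t))   # by concentration; detections first at ties
        surv = 1.0
        at_risk = len(t)
        for conc, is_censored in zip(t[order], c[order]):
            if not is_censored:
                surv *= 1.0 - 1.0 / at_risk   # KM step at a detection
            at_risk -= 1
            if surv < 0.5:
                return conc
        return None   # more than half of the group never detected

Censoring is what lets the estimator use panellists who never detected within the tested range, instead of discarding them or assigning them an arbitrary concentration.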
Abstract:
Objectives: This study provides the first large-scale analysis of the age at which adolescents in medieval England entered and completed the pubertal growth spurt. This new method has implications for expanding our knowledge of adolescent maturation across different time periods and regions. Methods: In total, 994 adolescent skeletons (10-25 years) from four urban sites in medieval England (AD 900-1550) were analysed for evidence of pubertal stage using new osteological techniques developed from the clinical literature (i.e. hamate hook development, CVM, canine mineralisation, iliac crest ossification, radial fusion). Results: Adolescents began puberty at a similar age to modern children, at around 10-12 years, but the onset of menarche in girls was delayed by up to 3 years, occurring around 15 years for most in the study sample and 17 years for females living in London. Modern European males usually complete their maturation by 16-18 years; medieval males took longer, with the deceleration stage of the growth spurt extending as late as 21 years. Conclusions: This research provides the first attempt to directly assess the age of pubertal development in adolescents during the tenth to seventeenth centuries. Poor diet, infections, and physical exertion may have contributed to delayed development in the medieval adolescents, particularly those living in the city of London. This study sheds new light on the nature of adolescence in the medieval period, highlighting an extended period of physical and social transition.