946 results for asset pricing tests


Relevance:

20.00%

Abstract:

We introduce and explore an approach to estimating statistical significance of classification accuracy, which is particularly useful in scientific applications of machine learning where high dimensionality of the data and the small number of training examples render most standard convergence bounds too loose to yield a meaningful guarantee of the generalization ability of the classifier. Instead, we estimate statistical significance of the observed classification accuracy, or the likelihood of observing such accuracy by chance due to spurious correlations of the high-dimensional data patterns with the class labels in the given training set. We adopt permutation testing, a non-parametric technique previously developed in classical statistics for hypothesis testing in the generative setting (i.e., comparing two probability distributions). We demonstrate the method on real examples from neuroimaging studies and DNA microarray analysis and suggest a theoretical analysis of the procedure that relates the asymptotic behavior of the test to the existing convergence bounds.
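A minimal sketch of the permutation-testing procedure the abstract describes (illustrative only: the classifier, the cross-validation scheme, and all names here are assumptions, not the authors' implementation):

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import LinearSVC

def permutation_pvalue(X, y, n_permutations=1000, seed=0):
    """Estimate how likely the observed cross-validated accuracy is
    under the null hypothesis of no association between X and y."""
    rng = np.random.default_rng(seed)
    clf = LinearSVC()  # illustrative choice; any classifier fits here
    observed = cross_val_score(clf, X, y, cv=5).mean()
    null_scores = np.empty(n_permutations)
    for i in range(n_permutations):
        y_perm = rng.permutation(y)  # destroy any real X-y association
        null_scores[i] = cross_val_score(clf, X, y_perm, cv=5).mean()
    # One-sided p-value; the +1 terms keep the estimate strictly positive.
    p_value = (1 + np.sum(null_scores >= observed)) / (n_permutations + 1)
    return observed, p_value
```

The null distribution of accuracies over permuted labels captures exactly the "accuracy by chance due to spurious correlations" the abstract refers to, which is what makes the approach usable when sample sizes are too small for standard convergence bounds to bind.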

Relevance:

20.00%

Abstract:

This thesis describes two programs for generating tests for digital circuits that exploit several kinds of expert knowledge not used by previous approaches. First, many test generation problems can be solved efficiently using operation relations, a novel representation of circuit behavior that connects internal component operations with directly executable circuit operations. Operation relations can be computed efficiently by searching traces of simulated circuit behavior. Second, experts write test programs rather than test vectors because programs are more readable and compact. Test programs can be constructed automatically by merging program fragments using expert-supplied goal-refinement rules and domain-independent planning techniques.

Relevance:

20.00%

Abstract:

In practice, piles are most often modelled as "Beams on Non-Linear Winkler Foundation" (also known as the "p-y spring" approach), where the soil is idealised as p-y springs. These p-y springs are obtained through a semi-empirical approach using element test results for the soil. For liquefied soil, a reduction factor (the so-called p-multiplier approach) is applied to a standard p-y curve for the non-liquefied condition to obtain the p-y curve for the liquefied condition. This paper presents a methodology for obtaining p-y curves for liquefied soil based on element testing of liquefied soil, considering physically plausible mechanisms. The proposed p-y curves are validated through back-analysis of physical model tests.
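For orientation, a toy sketch of the conventional p-multiplier idealisation that the paper sets out to improve on (the backbone shape, parameter values, and function names are illustrative assumptions, not the paper's proposed curves):

```python
import numpy as np

def py_backbone(y, p_ult, y50):
    """Illustrative non-liquefied p-y backbone: soil resistance p (kN/m)
    mobilised at pile deflection y (m); hyperbolic shape as a placeholder."""
    return p_ult * np.tanh(y / y50)

def py_liquefied(y, p_ult, y50, m_p=0.1):
    """p-multiplier approach: scale the non-liquefied curve by a
    reduction factor m_p (a small fraction, e.g. ~0.1, is commonly quoted)."""
    return m_p * py_backbone(y, p_ult, y50)

# Example: resistance mobilised at increasing deflection
y = np.linspace(0.0, 0.1, 5)
print(py_backbone(y, p_ult=100.0, y50=0.02))
print(py_liquefied(y, p_ult=100.0, y50=0.02))
```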

Relevance:

20.00%

Abstract:

ap Gwilym, Owain, McManus, Ian, and Thomas, Stephen, 'Fractional versus decimal pricing: Evidence from the UK Long Gilt futures market', Journal of Futures Markets (2005) 25(5), pp. 419-442. RAE2008

Relevance:

20.00%

Abstract:

The Google AdSense Program is a successful internet advertising program in which Google places contextual adverts on third-party websites and shares the resulting revenue with each publisher. Advertisers have budgets and bid on ad slots, while publishers set reserve prices for the ad slots on their websites. Following previous modelling efforts, we model the program as a two-sided market with advertisers on one side and publishers on the other. We show a reduction from the Generalised Assignment Problem (GAP) to the problem of computing the revenue-maximising allocation and pricing of publisher slots under a first-price auction; GAP is APX-hard, but a (1-1/e) approximation is known. We then compute truthful, revenue-maximising prices and an allocation of ad slots to advertisers under a second-price auction, where the auctioneer's revenue is within a factor of (1-1/e) of the second-price optimal.
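As a toy, single-slot illustration of the second-price rule with a publisher reserve price (the paper's actual setting has budgets and many slots; the simplifications and names below are mine):

```python
def second_price_allocation(bids, reserve):
    """Single-slot second-price auction with a publisher reserve.
    bids: dict advertiser -> bid. Returns (winner, price), or (None, 0.0)
    when no bid meets the reserve."""
    eligible = {a: b for a, b in bids.items() if b >= reserve}
    if not eligible:
        return None, 0.0
    ranked = sorted(eligible.items(), key=lambda ab: ab[1], reverse=True)
    winner = ranked[0][0]
    # The winner pays the larger of the reserve and the second-highest bid,
    # which is what makes truthful bidding a dominant strategy.
    price = max(reserve, ranked[1][1]) if len(ranked) > 1 else reserve
    return winner, price

# Example: two advertisers competing for one slot with a 0.5 reserve
print(second_price_allocation({"adv1": 2.0, "adv2": 1.2}, reserve=0.5))
# -> ('adv1', 1.2)
```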

Relevance:

20.00%

Abstract:

The desire to obtain competitive advantage is a motivator for implementing Enterprise Resource Planning (ERP) systems (Adam & O'Doherty, 2000). However, while it is accepted that Information Technology (IT) in general may contribute to the improvement of organisational performance (Melville, Kraemer, & Gurbaxani, 2004), the nature and extent of that contribution is poorly understood (Jacobs & Bendoly, 2003; Ravichandran & Lertwongsatien, 2005). Accordingly, Henderson and Venkatraman (1993) assert that it is the application of business and IT capabilities to develop and leverage a firm's IT resources for organisational transformation, rather than the acquired technological functionality, that secures competitive advantage for firms. Applying the Resource-Based View of the firm (Wernerfelt, 1984) and Dynamic Capabilities Theory (DCT; Teece & Pisano, 1998, in particular) may yield insights into whether or not the use of Enterprise Systems enhances organisations' core capabilities and thereby secures competitive advantage, sustainable or otherwise (Melville et al., 2004). An operational definition of Core Capabilities that is independent of the construct of Sustained Competitive Advantage is formulated.

This study proposes and utilises an applied Dynamic Capabilities framework to facilitate the investigation of the role of Enterprise Systems. The objective of this research is to investigate the role of Enterprise Systems in the Core Dynamic Capabilities of Asset Lifecycle Management. The study explores the activities of Asset Lifecycle Management, the Core Dynamic Capabilities inherent in Asset Lifecycle Management, and the footprint of Enterprise Systems on those Dynamic Capabilities. Additionally, the study explains the mechanisms by which Enterprise Systems sustain the Exploitability and Renewability of those Core Dynamic Capabilities. The study finds that Enterprise Systems contribute directly to the Value, Exploitability and Renewability of Core Dynamic Capabilities, and indirectly to their Inimitability and Non-substitutability. The study concludes by presenting an applied Dynamic Capabilities framework, which integrates Alter's (1992) definition of Information Systems with Teece and Pisano's (1998) model of Dynamic Capabilities to provide a robust diagnostic for determining the sustained value-generating contributions of Enterprise Systems. These frameworks are used in the conclusions to frame the findings of the study. The conclusions go on to assert that these frameworks are free-standing and analytically generalisable, per Siggelkow (2007) and Yin (2003).

Relevance:

20.00%

Abstract:

We first examine the model of Hobson and Rogers for the volatility of a financial asset such as a stock or share. The main feature of this model is the specification of volatility in terms of past price returns. The volatility process and the underlying price process share the same source of randomness, and so the model is said to be complete. Complete models are advantageous as they allow a unique, preference-independent price for options on the underlying price process. One of the main objectives of the model is to reproduce the 'smiles' and 'skews' seen in market implied volatilities, and this model produces the desired effect.

In the first main piece of work we numerically calibrate the model of Hobson and Rogers for comparison with the existing literature. We also develop parameter estimation methods based on the calibration of a GARCH model. We examine alternative specifications of the volatility and show an improved fit to market data based on these specifications. We also show how to process market data in order to take account of inter-day movements in the volatility surface.

In the second piece of work, we extend the Hobson and Rogers model in a way that better reflects market structure, taking into account both first- and second-order effects. We derive and numerically solve the PDE which describes the price of options under this extended model, and show that this extension allows a better fit to the market data. Finally, we analyse the parameters of the extended model in order to understand intuitively their role in the volatility surface.
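For reference, the Hobson and Rogers dynamics in their usual published form (reconstructed from the literature rather than quoted from this thesis, so the notation is an assumption): the discounted log-price Z_t drives an exponentially weighted "offset" of past returns, and the volatility is a deterministic function of that offset.

```latex
% Order-m offset of past returns and the resulting price dynamics:
\[
  S^{(m)}_t = \int_0^{\infty} \lambda e^{-\lambda u}\,
              \bigl(Z_t - Z_{t-u}\bigr)^{m}\,du ,
  \qquad
  dZ_t = \mu\bigl(S^{(1)}_t\bigr)\,dt + \sigma\bigl(S^{(1)}_t\bigr)\,dW_t .
\]
```

Because the volatility depends only on the price path itself, no second source of randomness enters and the market is complete, as the abstract states; the extension to "first and second order effects" presumably lets the volatility depend jointly on S^(1) and S^(2).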

Relevance:

20.00%

Abstract:

BACKGROUND: Serologic methods have been used widely to test for celiac disease and have gained importance in diagnostic definition and in new epidemiologic findings. However, there is no standardization, and there are no reference protocols or materials. METHODS: The European Working Group on Serological Screening for Celiac Disease has defined robust noncommercial test protocols for immunoglobulin (Ig)G and IgA gliadin antibodies and for IgA autoantibodies against endomysium and tissue transglutaminase. Standard curves were linear in the decisive range, and intra-assay coefficients of variation were below 5% to 10%. Calibration was performed with a group reference serum, and joint cutoff limits were used. Seven laboratories took part in the final collaborative study on 252 randomized sera classified by histology (103 pediatric and adult patients with active celiac disease, 89 disease control subjects, and 60 blood donors). RESULTS: IgA autoantibodies against endomysium and tissue transglutaminase showed superior sensitivity (90% and 93%, respectively) and specificity (99% and 95%, respectively) compared with IgA and IgG gliadin antibodies. Tissue transglutaminase antibody testing also showed superior receiver operating characteristic performance compared with gliadin antibodies. The kappa values for interlaboratory reproducibility were highest for IgA endomysium antibodies (0.93), compared with tissue transglutaminase antibodies (0.83) and gliadin antibodies (0.82 for IgG, 0.62 for IgA). CONCLUSIONS: Basic criteria of standardization and quality assessment must be fulfilled by any test protocol proposed for the serologic investigation of celiac disease. The working group has produced robust test protocols and reference materials that are available for standardization, to further improve the reliability of serologic testing for celiac disease.

Relevance:

20.00%

Abstract:

Pigeons and other animals soon learn to wait (pause) after food delivery on periodic-food schedules before resuming the food-rewarded response. Under most conditions the steady-state duration of the average waiting time, t, is a linear function of the typical interfood interval. We describe three experiments designed to explore the limits of this process. In all experiments, t was associated with one key color and the subsequent food delay, T, with another. In the first experiment, we compared the relation between t (waiting time) and T (food delay) under two conditions: when T was held constant, and when T was an inverse function of t. The pigeons could maximize the rate of food delivery under the first condition by setting t to a consistently short value; optimal behavior under the second condition required a linear relation with unit slope between t and T. Despite this difference in optimal policy, the pigeons in both cases showed the same linear relation, with slope less than one, between t and T. This result was confirmed in a second parametric experiment that added a third condition, in which T + t was held constant. Linear waiting appears to be an obligatory rule for pigeons. In a third experiment we arranged for a multiplicative relation between t and T (positive feedback), and produced either very short or very long waiting times as predicted by a quasi-dynamic model in which waiting time is strongly determined by the just-preceding food delay.
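Stated compactly (a restatement of the abstract's claim, with a and b as fitted intercept and slope), the obligatory "linear waiting" rule is:

```latex
\[
  t = a + b\,T, \qquad 0 < b < 1 .
\]
```

Rate maximization in the condition where T varied inversely with t would instead require b = 1; the fact that the fitted slope stayed below one in every condition is what marks linear waiting as obligatory rather than optimal.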