121 results for Analytic functions
Abstract:
Multivariate volatility forecasts are an important input in many financial applications, in particular portfolio optimisation problems. Given the number of models available and the range of loss functions used to discriminate between them, selecting the optimal forecasting model is clearly challenging. The aim of this thesis is to thoroughly investigate how effective commonly used statistical (MSE and QLIKE) and economic (portfolio variance and portfolio utility) loss functions are at discriminating between competing multivariate volatility forecasts. An analytical investigation of the loss functions is performed to determine whether they identify the correct forecast as the best forecast. This is followed by an extensive simulation study that examines the ability of the loss functions to consistently rank forecasts, and their statistical power within tests of predictive ability. For the tests of predictive ability, the model confidence set (MCS) approach of Hansen, Lunde and Nason (2003, 2011) is employed. An empirical study then investigates whether the simulation findings hold in a realistic setting. In light of these earlier studies, a major empirical study seeks to identify the set of superior multivariate volatility forecasting models from 43 models that use either daily squared returns or realised volatility to generate forecasts. This study also assesses how the choice of volatility proxy affects the ability of the statistical loss functions to discriminate between forecasts. Analysis of the loss functions shows that QLIKE, MSE and portfolio variance can discriminate between multivariate volatility forecasts, while portfolio utility cannot. An examination of the effective loss functions shows that all of them can identify the correct forecast at a point in time; however, their ability to discriminate between competing forecasts varies. QLIKE is identified as the most effective loss function, followed by portfolio variance and then MSE. The major empirical analysis reports that the optimal set of multivariate volatility forecasting models includes forecasts generated from both daily squared returns and realised volatility. Furthermore, it finds that the volatility proxy affects the statistical loss functions' ability to discriminate between forecasts in tests of predictive ability. These findings deepen our understanding of how to choose between competing multivariate volatility forecasts.
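As context for the statistical loss functions named above, the sketch below gives one common multivariate form of MSE and QLIKE from the volatility-forecast-evaluation literature; the thesis may use different parameterisations, so treat the exact definitions and the toy matrices here as assumptions.

```python
import numpy as np

def mse_loss(sigma_proxy, H_forecast):
    """Matrix MSE loss: squared Frobenius norm of the forecast error.

    sigma_proxy : (N, N) volatility proxy (e.g. realised covariance).
    H_forecast  : (N, N) forecast covariance matrix.
    """
    diff = sigma_proxy - H_forecast
    return np.sum(diff * diff)

def qlike_loss(sigma_proxy, H_forecast):
    """Multivariate QLIKE loss: log|H| + tr(H^{-1} Sigma).

    Minimised in expectation when H equals the true conditional
    covariance, which is why QLIKE is a 'robust' loss in this literature.
    """
    sign, logdet = np.linalg.slogdet(H_forecast)
    return logdet + np.trace(np.linalg.solve(H_forecast, sigma_proxy))

# Toy comparison: the forecast closer to the proxy should score lower.
proxy = np.array([[1.0, 0.3], [0.3, 2.0]])
good = np.array([[1.1, 0.25], [0.25, 1.9]])
bad = np.array([[2.0, -0.5], [-0.5, 0.8]])
for name, loss in [("MSE", mse_loss), ("QLIKE", qlike_loss)]:
    print(name, loss(proxy, good), loss(proxy, bad))
```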
Abstract:
In this study we set out to dissociate the developmental time course of automatic symbolic number processing and cognitive control functions in British primary school children in grades 1-3. Event-related potential (ERP) and behavioral data were collected in a physical size discrimination numerical Stroop task. Task-irrelevant numerical information was already processed automatically in grade 1. Weakening interference and strengthening facilitation indicated the parallel development of general cognitive control and automatic number processing. Relationships among ERP and behavioral effects suggest that control functions play a larger role in younger children and that automaticity of number processing increases from grade 1 to 3.
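For readers unfamiliar with how interference and facilitation are quantified in a numerical Stroop task, the sketch below computes the two standard reaction-time contrasts; the condition names and numbers are hypothetical, and the study's exact scoring may differ.

```python
# Hypothetical mean reaction times (ms) per condition for one child.
rt = {
    "congruent": 620.0,    # physically larger digit is also numerically larger
    "neutral": 655.0,      # digits equal in numerical value
    "incongruent": 710.0,  # physical and numerical size conflict
}

# Standard Stroop contrasts: interference is expected to weaken (control
# develops) and facilitation to strengthen (number processing becomes
# more automatic) from grade 1 to grade 3.
interference = rt["incongruent"] - rt["neutral"]
facilitation = rt["neutral"] - rt["congruent"]
print(f"interference = {interference} ms, facilitation = {facilitation} ms")
```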
Abstract:
This paper illustrates the damage identification and condition assessment of a three-storey bookshelf structure using a new frequency response function (FRF) based damage index and Artificial Neural Networks (ANNs). A major obstacle to using measured frequency response function data is the large number of input variables presented to the ANNs. This problem is overcome by applying a data reduction technique called principal component analysis (PCA). In the proposed procedure, ANNs, with their powerful pattern recognition and classification ability, are used to extract damage information such as damage locations and severities from measured FRFs. Simple neural network models are developed and trained by Back Propagation (BP) to associate the FRFs with the damaged or undamaged state of the structure, the damage locations, and the severity of the damage. Finally, the effectiveness of the proposed method is illustrated and validated using real data provided by the Los Alamos National Laboratory, USA. The results show that the PCA-based artificial neural network method is suitable and effective for damage identification and condition assessment of building structures. In addition, it is clearly demonstrated that the accuracy of the proposed damage detection method can be improved by increasing the number of baseline datasets and the number of principal components of the baseline dataset.
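The pipeline described above (PCA for dimensionality reduction, then a backpropagation-trained network on the reduced FRFs) can be sketched as follows; the synthetic data, component count and network size are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for measured FRF magnitudes: 200 records,
# each a 1024-point FRF, labelled with one of 4 damage states.
n_records, n_freqs = 200, 1024
labels = rng.integers(0, 4, size=n_records)           # 0 = undamaged, 1-3 = damage cases
frfs = rng.normal(size=(n_records, n_freqs))
frfs += labels[:, None] * np.linspace(0, 1, n_freqs)  # label-dependent shift

# Stage 1: PCA reduces each 1024-point FRF to a handful of scores,
# addressing the 'too many ANN inputs' problem noted in the abstract.
pca = PCA(n_components=10)
scores = pca.fit_transform(frfs)

# Stage 2: a small backpropagation-trained network maps PCA scores
# to damage state (location/severity class).
X_tr, X_te, y_tr, y_te = train_test_split(scores, labels, random_state=0)
net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
net.fit(X_tr, y_tr)
print("held-out accuracy:", net.score(X_te, y_te))
```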
Abstract:
Damage detection in structures has become increasingly important in recent years. While a number of damage detection and localization methods have been proposed, very few attempts have been made to detect structural damage with noise-polluted data, even though noise is unavoidable in the real world. Measurement data are contaminated by noise from the test environment as well as from electronic devices, and this noise tends to produce erroneous results from structural damage identification methods. It is therefore important to develop a method that performs well with noise-polluted data. This paper introduces a new damage index using principal component analysis (PCA) for damage detection of building structures that accepts noise-polluted frequency response functions (FRFs) as input. The FRF data are obtained from the MATLAB function datagen, which is available on the website of the IASC-ASCE (International Association for Structural Control - American Society of Civil Engineers) Structural Health Monitoring (SHM) Task Group. The proposed method involves a five-stage process: calculation of FRFs; calculation of damage index values using the proposed algorithm; development of the artificial neural networks; introduction of the damage indices as input parameters; and damage detection of the structure. The method is applied to a benchmark problem sponsored by the IASC-ASCE SHM Task Group, which was developed to facilitate the comparison of various damage identification methods, and this paper briefly describes the methodology and the results obtained in detecting damage in all six cases of the benchmark study with different noise levels. The results show that the PCA-based algorithm is effective for structural health monitoring with noise-polluted FRFs, which are a common occurrence when dealing with industrial structures.
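One common way to build a PCA-based damage index of the kind described above is to fit principal components to baseline (healthy) FRFs and score new FRFs by their residual outside that subspace; the sketch below shows this generic novelty-detection form on synthetic data, and the paper's actual index may be defined differently.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)

# Synthetic baseline: 50 noise-polluted FRFs from the healthy structure.
n_baseline, n_freqs = 50, 512
healthy_shape = np.sin(np.linspace(0, 20, n_freqs))
baseline = healthy_shape + 0.1 * rng.normal(size=(n_baseline, n_freqs))

# Fit the baseline principal subspace.
pca = PCA(n_components=5).fit(baseline)

def damage_index(frf):
    """Residual norm of an FRF outside the baseline principal subspace.

    Larger values indicate a larger departure from the healthy state
    (a generic PCA novelty index, used here purely for illustration).
    """
    recon = pca.inverse_transform(pca.transform(frf[None, :]))[0]
    return np.linalg.norm(frf - recon)

healthy_test = healthy_shape + 0.1 * rng.normal(size=n_freqs)
damaged_test = 0.8 * healthy_shape + 0.1 * rng.normal(size=n_freqs)  # stiffness loss shifts the FRF
print("healthy index:", damage_index(healthy_test))
print("damaged index:", damage_index(damaged_test))
```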
Abstract:
This study examined the effect that temporal order within the entrepreneurial discovery-exploitation process has on the outcomes of venture creation. Consistent with sequential theories of discovery and exploitation, the general flow of venture creation was found to be directed from discovery toward exploitation in a random sample of nascent ventures. However, venture creation attempts that strictly follow this sequence produce poor outcomes. Moreover, simultaneous discovery-exploitation was the most prevalent temporal order observed, and venture attempts that proceed in this manner are more likely to become operational. These findings suggest that venture creation is a multi-scale phenomenon that is at once directional in time and simultaneously driven by symbiotically coupled discovery and exploitation.
Abstract:
Radial Hele-Shaw flows are treated analytically using conformal mapping techniques. The geometry of interest has a doubly-connected annular region of viscous fluid surrounding an inviscid bubble that is either expanding or contracting due to a pressure difference caused by injection or suction of the inviscid fluid. The zero-surface-tension problem is ill-posed for both bubble expansion and contraction, as both scenarios involve viscous fluid displacing inviscid fluid. Exact solutions are derived by tracking the location of singularities and critical points in the analytic continuation of the mapping function. We show that, by tracking the critical points, it is easy to observe finite-time blow-up, and that the evolution equations may be written in exact form using complex residues. We present solutions that start with cusps on one interface and end with cusps on the other, as well as solutions in which the bubble contracts to a point. For the latter solutions, the bubble shape approaches an ellipse at extinction.
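For context on the conformal-mapping formulation, the display below recalls the classical Polubarinova-Galin equation for the simply connected, zero-surface-tension radial Hele-Shaw problem with a source of strength Q; the doubly-connected annular geometry studied in the paper abstracted above generalises this setup, so this is background rather than the paper's own evolution equation.

```latex
% Classical zero-surface-tension radial Hele-Shaw problem (background):
% z = f(\zeta, t) maps the unit disc conformally onto the fluid region,
% and on |\zeta| = 1 the map evolves by the Polubarinova-Galin equation
\[
  \operatorname{Re}\!\left[
    \partial_t f(\zeta,t)\,
    \overline{\zeta\,\partial_\zeta f(\zeta,t)}
  \right]
  = \frac{Q}{2\pi},
  \qquad |\zeta| = 1,
\]
% where Q > 0 corresponds to injection and Q < 0 to suction. Cusps
% (finite-time blow-up) form when a critical point of f, i.e. a zero of
% \partial_\zeta f in the analytic continuation, reaches |\zeta| = 1.
```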
Abstract:
We present a formalism for the analysis of sensitivity of nuclear magnetic resonance pulse sequences to variations of pulse sequence parameters, such as radiofrequency pulses, gradient pulses or evolution delays. The formalism enables the calculation of compact, analytic expressions for the derivatives of the density matrix and the observed signal with respect to the parameters varied. The analysis is based on two constructs computed in the course of modified density-matrix simulations: the error interrogation operators and the error commutators. The approach presented is consequently named the Error Commutator Formalism (ECF). It is used to evaluate the sensitivity of the density matrix to parameter variation based on the simulations carried out for the ideal parameters, obviating the need for finite-difference calculations of signal errors. The ECF analysis therefore carries a computational cost comparable to a single density-matrix or product-operator simulation. Its application is illustrated using a number of examples from basic NMR spectroscopy. We show that the strength of the ECF is its ability to provide analytic insights into the propagation of errors through pulse sequences and the behaviour of signal errors under phase cycling. Furthermore, the approach is algorithmic and easily amenable to implementation in the form of program code. It is envisaged that it could be incorporated into standard NMR product-operator simulation packages.
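The core identity that makes analytic sensitivity expressions of this kind possible can be illustrated on a single spin-1/2: for an ideal x-pulse of flip angle theta, the derivative of the density matrix is a commutator, d(rho)/d(theta) = -i[Ix, rho], so the signal derivative comes from one extra matrix product rather than a finite difference. This is a minimal sketch of that underlying identity, not an implementation of the ECF itself.

```python
import numpy as np
from scipy.linalg import expm

# Spin-1/2 operators.
Ix = 0.5 * np.array([[0, 1], [1, 0]], dtype=complex)
Iy = 0.5 * np.array([[0, -1j], [1j, 0]], dtype=complex)
Iz = 0.5 * np.array([[1, 0], [0, -1]], dtype=complex)
Ip = Ix + 1j * Iy  # detection operator I+

def rho_after_pulse(theta, rho0=Iz):
    """Density matrix after an ideal x-pulse of flip angle theta."""
    U = expm(-1j * theta * Ix)
    return U @ rho0 @ U.conj().T

def signal(theta):
    return np.trace(Ip @ rho_after_pulse(theta))

theta = 0.4 * np.pi

# Analytic sensitivity: d(rho)/d(theta) = -i [Ix, rho(theta)], hence
# d(signal)/d(theta) = Tr(I+ . (-i)[Ix, rho]). One extra commutator,
# computed from the single ideal-parameter simulation.
rho = rho_after_pulse(theta)
drho = -1j * (Ix @ rho - rho @ Ix)
dsig_analytic = np.trace(Ip @ drho)

# Finite-difference check (the repeated simulation the analytic route avoids).
h = 1e-6
dsig_fd = (signal(theta + h) - signal(theta - h)) / (2 * h)
print(dsig_analytic, dsig_fd)  # should agree to high precision
```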
Abstract:
Traditional analytic models for power system fault diagnosis are usually formulated as unconstrained 0–1 integer programming problems. The key issue in these models is to seek the fault hypothesis that minimizes the discrepancy between the actual and the expected states of the protective relays and circuit breakers concerned. The temporal information in alarm messages has not been well utilized in these methods, and as a result the diagnosis results may not be unique, and hence may be indefinite, especially when complicated and multiple faults occur. To solve this problem, this paper presents a novel analytic model that employs the temporal information of alarm messages along with the concept of the related path. The temporal relationships among the actions of protective relays and circuit breakers, and the different protection configurations in a modern power system, can be reasonably represented by the developed model; the diagnosed results are therefore more definite under different fault circumstances. Finally, an actual power system fault is used to verify the proposed method.
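The classical unconstrained 0–1 formulation that the paper builds on can be sketched with a deliberately tiny brute-force example: each hypothesis assigns a fault status to each section, the expected relay states are derived from the hypothesis, and the hypothesis minimizing the discrepancy with the observed states wins. The network, protection logic and alarm snapshot below are hypothetical toys; real models, including this paper's temporal extension, are far richer.

```python
from itertools import product

# Toy system: 3 sections, each protected by one main relay that should
# trip (1) exactly when its section is faulted.
n_sections = 3
observed_relays = (1, 0, 1)  # hypothetical alarm snapshot

def expected_relays(hypothesis):
    """Expected relay states if 'hypothesis' (0/1 per section) were true."""
    return hypothesis  # main protection only: relay trips iff section faulted

def discrepancy(hypothesis):
    expected = expected_relays(hypothesis)
    return sum(o != e for o, e in zip(observed_relays, expected))

# Unconstrained 0-1 programme solved by exhaustive enumeration.
best = min(product((0, 1), repeat=n_sections), key=discrepancy)
print("most plausible fault hypothesis:", best)  # -> (1, 0, 1)
```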
Abstract:
This paper documents the use of bibliometrics as a methodology that brings a structured, systematic and rigorous way to analyse and evaluate a range of literature. While starting out and reading broadly for my doctoral studies, I found that one article by Trigwell and Prosser (1996b) led me to reflect on my level of comprehension, as the content, concepts and methodology did not resonate with my epistemology. A disconnection between our paradigms emerged. Further reading unveiled the work of Doyle (1987), who categorised research in teaching and teacher education into three main areas: teacher characteristics, methods research and teacher behaviour. My growing concern that there were gaps in the knowledge also exposed the difficulty of documenting those gaps. As an early-career researcher who required support to locate myself in the field and to find my research voice, I identified bibliometrics (Budd, 1988; Yeoh & Kaur, 2007) as an appropriate methodology to add value and rigour in three ways. Firstly, the application of bibliometrics to analyse articles is systematic, builds a picture from the characteristics of the literature, and offers a way to elicit themes within the categories. Secondly, systematic analysis creates occasion to identify gaps within the body of work, limitations in methodology, or areas in need of further research. Finally, extending and adapting the bibliometrics methodology beyond citation or content analysis, to investigate the merit of methodology, participants and instruments as determinants of research worth, allowed me to build confidence and contribute new knowledge to the field. This paper therefore frames research in the pedagogic field of Higher Education through teacher characteristics, methods research and teacher behaviour; visually represents the literature analysis; and locates my research self within methods research. Through my research voice I present the bibliometrics methodology and its outcomes, and document the landscape of pedagogy in the field of Higher Education.
Abstract:
The research described in this paper forms part of an in-depth investigation of safety culture in one of Australia's largest construction companies. The research builds on a previous qualitative study with organisational safety leaders and further investigates how safety culture is perceived and experienced by organisational members, as well as how this relates to their safety behaviour and related outcomes at work. Participants were 2273 employees of the case study organisation, with 689 from the Construction function and 1584 from the Resources function. The results of several analyses revealed some interesting organisational variance on key measures. Specifically, the Construction function scored significantly higher than the Resources function on all key measures: safety climate, safety motivation, safety compliance, and safety participation. The results are discussed in terms of their relevance in an applied research context.
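A group comparison of the kind reported above (two functions of unequal size compared on mean scale scores) is commonly tested with an independent-samples t-test; the sketch below uses Welch's variant on synthetic scores, since the study's actual analyses and data are not reproduced here.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(42)

# Hypothetical 1-5 safety climate scale scores for the two functions,
# mirroring only the reported sample sizes (689 vs 1584).
construction = rng.normal(4.1, 0.6, size=689).clip(1, 5)
resources = rng.normal(3.9, 0.6, size=1584).clip(1, 5)

# Welch's t-test: does not assume equal variances or equal group sizes.
t, p = ttest_ind(construction, resources, equal_var=False)
print(f"t = {t:.2f}, p = {p:.4f}")
```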
Abstract:
This study makes the case for the use of the Conversation Analytic (CA) method as a research approach that might both explicate and chronicle the features of the journalism interview. It seeks to encourage such research to help inform understanding of this form and to provide further lessons as to the nature of journalism practice. Such studies might follow many paths, but this paper focuses in particular on the outcomes for the debate over the continued relevance of "objectivity" in informing professional journalism practice. To make the case for CA as a means through which the conduct of journalism practice might be explored, the paper: examines the theories of the interaction order that gave rise to the CA method; outlines the key features of the journalism interview as explicated through the CA approach; and outlines the implications of such research for establishing the standing of "objectivity". It concludes by considering the wider relevance of such studies of journalism practice for a fracturing journalism field, which suffers from a lack of benchmarks to measure the public benefit of the range of forms that now proliferate on the internet.
Abstract:
In 1980, Alltop produced a family of cubic phase sequences that nearly meet the Welch bound for maximum non-peak correlation magnitude. This family of sequences was shown by Wootters and Fields to be useful for quantum state tomography. Alltop's construction used a function that is not planar, but whose difference function is planar. In this paper we show that Alltop-type functions cannot exist in fields of characteristic 3, and that for a known class of planar functions, x^3 is the only Alltop-type function.
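To make the construction concrete: over a prime N >= 5, the Alltop family consists of the cubic-phase sequences u_m(k) = e^{2*pi*i*(k+m)^3/N} / sqrt(N), and any two distinct members have inner-product magnitude exactly 1/sqrt(N), close to the Welch bound. The sketch below verifies this numerically; it illustrates the 1980 construction, not the characteristic-3 results of this paper.

```python
import numpy as np

N = 7  # any prime N >= 5 (characteristic coprime to 3)

# Alltop family: cubic-phase sequence and its N cyclic phase shifts.
k = np.arange(N)
U = np.array([np.exp(2j * np.pi * ((k + m) ** 3 % N) / N) for m in range(N)])
U /= np.sqrt(N)  # unit-norm rows

# Cross-correlations: |<u_m, u_m'>| should equal 1/sqrt(N) for m != m'.
G = np.abs(U @ U.conj().T)
off_peak = G[~np.eye(N, dtype=bool)]
print("max off-peak correlation:", off_peak.max())
print("1/sqrt(N)               :", 1 / np.sqrt(N))
```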