969 results for Fisher, Jim
Abstract:
We extended an earlier study (Vision Research, 45, 1967–1974, 2005) in which we investigated the limits at which induced blur of letter targets becomes noticeable, troublesome and objectionable. Here we used a deformable adaptive optics mirror to vary spherical defocus under three conditions: a white background with correction of astigmatism; a white background with reduction of all aberrations other than defocus; and a monochromatic background with reduction of all aberrations other than defocus. We used seven cyclopleged subjects, lines of three high-contrast letters as targets, 3–6 mm artificial pupils, and 0.1–0.6 logMAR letter sizes. Subjects used a method of adjustment to control the defocus component of the mirror to set the 'just noticeable', 'just troublesome' and 'just objectionable' defocus levels. For the white-no adaptive optics condition combined with the 0.1 logMAR letter size, mean 'noticeable' blur limits were ±0.30, ±0.24 and ±0.23 D at 3, 4 and 6 mm pupils, respectively. The white-adaptive optics and monochromatic-adaptive optics conditions reduced blur limits by 8% and 20%, respectively. Increasing pupil size from 3 to 6 mm decreased blur limits by 29%, and increasing letter size increased blur limits by 79%. Ratios of troublesome to noticeable, and of objectionable to noticeable, blur limits were 1.9 and 2.7, respectively. The study shows that the deformable mirror can be used to vary defocus in vision experiments. Overall, the results for noticeable, troublesome and objectionable blur agreed well with those of the previous study. Attempting to reduce higher-order or chromatic aberrations reduced blur limits only to a small extent.
Abstract:
Financial processes may possess long memory, and their probability densities may display heavy tails. Many models have been developed to deal with this tail behaviour, which reflects the jumps in the sample paths. On the other hand, the presence of long memory, which contradicts the efficient market hypothesis, is still an issue for further debate. These difficulties pose challenges for memory detection and for modelling the co-presence of long memory and heavy tails. This PhD project aims to respond to these challenges. The first part aims to detect memory in a large number of financial time series on stock prices and exchange rates using their scaling properties. Since financial time series often exhibit stochastic trends, a common form of nonstationarity, strong trends in the data can lead to false detection of memory. We take advantage of a technique known as multifractal detrended fluctuation analysis (MF-DFA) that can systematically eliminate trends of different orders. This method is based on the identification of the scaling of the q-th-order moments and is a generalisation of the standard detrended fluctuation analysis (DFA), which uses only the second moment, that is, q = 2. We also consider the rescaled range (R/S) analysis and the periodogram method to detect memory in financial time series and compare their results with those of the MF-DFA. An interesting finding is that short memory is detected for stock prices of the American Stock Exchange (AMEX), while long memory is found in the time series of two exchange rates, namely the French franc and the Deutsche mark. Electricity price series of the five states of Australia are also found to possess long memory. For these electricity price series, heavy tails are also pronounced in their probability densities. The second part of the thesis develops models to represent short-memory and long-memory financial processes as detected in Part I.
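The detrending-and-scaling procedure summarised above can be sketched in a few lines; this is a minimal illustration of (MF-)DFA assuming numpy, with function names and scale choices that are ours rather than the thesis's. The series is integrated into a profile, locally detrended by a polynomial fit in windows of size s, and the q-th-order fluctuation F_q(s) is computed; the generalised Hurst exponent h(q) is the slope of log F_q(s) against log s (q = 2 recovers standard DFA).

```python
import numpy as np

def mfdfa_fluctuation(x, scale, q=2, order=1):
    """q-th-order detrended fluctuation F_q(s) for window size `scale`."""
    y = np.cumsum(x - np.mean(x))            # profile (integrated series)
    n_seg = len(y) // scale
    f2 = np.empty(n_seg)
    t = np.arange(scale)
    for i in range(n_seg):
        seg = y[i * scale:(i + 1) * scale]
        trend = np.polyval(np.polyfit(t, seg, order), t)  # local polynomial trend
        f2[i] = np.mean((seg - trend) ** 2)               # variance about the trend
    if q == 0:                               # q -> 0 limit uses a log-average
        return np.exp(0.5 * np.mean(np.log(f2)))
    return np.mean(f2 ** (q / 2)) ** (1.0 / q)

def hurst_exponent(x, scales, q=2):
    """Slope of log F_q(s) vs log s estimates the generalised Hurst exponent h(q)."""
    logf = [np.log(mfdfa_fluctuation(x, s, q)) for s in scales]
    slope, _ = np.polyfit(np.log(np.asarray(scales)), logf, 1)
    return slope
```

For uncorrelated noise the estimated exponent is close to 0.5; values above 0.5 indicate the kind of long-range persistence discussed above, values below 0.5 indicate antipersistence (short memory in the sense used here).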
These models take the form of continuous-time AR(∞)-type equations whose kernel is the Laplace transform of a finite Borel measure. By imposing appropriate conditions on this measure, short memory or long memory in the dynamics of the solution will result. A specific form of the models, which has a good MA(∞)-type representation, is presented for the short-memory case. Parameter estimation for this type of model is performed via least squares, and the models are applied to the stock prices in the AMEX, which were established in Part I to possess short memory. By selecting the kernel in the continuous-time AR(∞)-type equations to have the form of the Riemann-Liouville fractional derivative, we obtain a fractional stochastic differential equation driven by Brownian motion. This type of equation is used to represent financial processes with long memory, whose dynamics are described by the fractional derivative in the equation. These models are estimated via quasi-likelihood, namely via a continuous-time version of the Gauss-Whittle method. The models are applied to the exchange rates and the electricity prices of Part I with the aim of confirming their possible long-range dependence established by MF-DFA. The third part of the thesis provides an application of the results established in Parts I and II to characterise and classify financial markets. We pay attention to the New York Stock Exchange (NYSE), the American Stock Exchange (AMEX), the NASDAQ Stock Exchange (NASDAQ) and the Toronto Stock Exchange (TSX). The parameters from MF-DFA and those of the short-memory AR(∞)-type models are employed in this classification. We propose the Fisher discriminant algorithm to find a classifier in the two- and three-dimensional spaces of the data sets and then use cross-validation to verify discriminant accuracies. This classification is useful for understanding and predicting the behaviour of different processes within the same market.
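The classifier mentioned above, the two-class Fisher linear discriminant, can be sketched as follows; this is a generic illustration assuming numpy and synthetic two-dimensional features, not the thesis's actual MF-DFA/AR(∞) feature sets or its cross-validation scheme. The discriminant direction maximises between-class separation relative to within-class scatter, and a sample is classified by thresholding its projection.

```python
import numpy as np

def fisher_discriminant(X1, X2):
    """Two-class Fisher linear discriminant: direction w and projection threshold c."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    # within-class scatter matrix S_w = S1 + S2
    S1 = (X1 - m1).T @ (X1 - m1)
    S2 = (X2 - m2).T @ (X2 - m2)
    w = np.linalg.solve(S1 + S2, m1 - m2)    # w is proportional to S_w^{-1}(m1 - m2)
    c = 0.5 * (w @ m1 + w @ m2)              # threshold: midpoint of projected class means
    return w, c

def classify(X, w, c):
    """Label 1 if the projection exceeds the threshold, else 2."""
    return np.where(X @ w > c, 1, 2)
```

Cross-validation of the kind described would repeatedly refit `fisher_discriminant` on a training split and score `classify` on the held-out split.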
The fourth part of the thesis investigates the heavy-tailed behaviour of financial processes which may also possess long memory. We consider fractional stochastic differential equations driven by stable noise to model financial processes such as electricity prices. The long memory of electricity prices is represented by a fractional derivative, while the stable noise input models their non-Gaussianity via the tails of their probability density. A method using the empirical densities and MF-DFA is provided to estimate all the parameters of the model and simulate sample paths of the equation. The method is then applied to analyse daily spot prices for five states of Australia. Comparisons with the results obtained from the R/S analysis, the periodogram method and MF-DFA are provided. The results from the fractional SDEs agree with those from MF-DFA, which are based on multifractal scaling, while those from the periodograms, which are based on second moments, seem to underestimate the long-memory dynamics of the process. This highlights the need for, and usefulness of, fractal methods in modelling non-Gaussian financial processes with long memory.
Abstract:
The paper describes Personal Access Tutor (PAT), an Intelligent Tutoring System that helps students learn how to create forms and reports in MS Access. We present the architecture and components of PAT, as well as the services that PAT provides to students. Results of an external (system) evaluation of PAT (both qualitative and quantitative data) are presented and discussed.
Abstract:
This paper considers the implications of the permanent/transitory decomposition of shocks for identification of structural models in the general case where the model might contain more than one permanent structural shock. It provides a simple and intuitive generalization of the influential work of Blanchard and Quah [1989. The dynamic effects of aggregate demand and supply disturbances. The American Economic Review 79, 655–673], and shows that structural equations with known permanent shocks cannot contain error correction terms, thereby freeing up the latter to be used as instruments in estimating their parameters. The approach is illustrated by a re-examination of the identification schemes used by Wickens and Motto [2001. Estimating shocks and impulse response functions. Journal of Applied Econometrics 16, 371–387], Shapiro and Watson [1988. Sources of business cycle fluctuations. NBER Macroeconomics Annual 3, 111–148], King et al. [1991. Stochastic trends and economic fluctuations. American Economic Review 81, 819–840], Gali [1992. How well does the IS-LM model fit postwar US data? Quarterly Journal of Economics 107, 709–735; 1999. Technology, employment, and the business cycle: Do technology shocks explain aggregate fluctuations? American Economic Review 89, 249–271] and Fisher [2006. The dynamic effects of neutral and investment-specific technology shocks. Journal of Political Economy 114, 413–451].
Abstract:
Rapid advances in information and communications technology (ICT), particularly the development of online technologies, have transformed the nature of economic, social and cultural relations across the globe. In the context of higher education in post-industrial societies, technological change has had a significant impact on university operating environments. In a broad sense, technological advancement has contributed significantly to the increasing complexity of global economies and societies, which is reflected in the rise of lifelong learning discourses with which universities are engaging. More specifically, the ever-expanding array of ICT available within the university sector has generated new management and pedagogical imperatives for higher education in the information age.
Abstract:
Today more than ever, generating and managing knowledge is an essential source of competitive advantage for every organization, and particularly for multinational corporations (MNCs). However, despite the undisputed agreement about the importance of creating and managing knowledge, there is still a large number of corporations that act unethically or illegally. Clearly, too little attention has been paid to the management of ethical knowledge in organizations. This paper refers to value-based knowledge as the process of recognising and managing those values that stand at the heart of decision-making and action in organizations. In order to support MNCs in implementing the value-based knowledge process, the managerial ethical profile (MEP) is presented as a valuable tool to facilitate the knowledge management process at both the intra-organizational and the inter-organizational network levels.
Abstract:
In an automotive environment, the performance of a speech recognition system is affected by environmental noise if the speech signal is acquired directly from a microphone. Speech enhancement techniques are therefore necessary to improve the speech recognition performance. In this paper, a field-programmable gate array (FPGA) implementation of dual-microphone delay-and-sum beamforming (DASB) for speech enhancement is presented. As the first step towards a cost-effective solution, the implementation described in this paper uses a relatively high-end FPGA device to facilitate the verification of various design strategies and parameters. Experimental results show that the proposed design can produce output waveforms close to those generated by a theoretical (floating-point) model with modest usage of FPGA resources. Speech recognition experiments are also conducted on enhanced in-car speech waveforms produced by the FPGA in order to compare recognition performance with the floating-point representation running on a PC.
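The core operation of the dual-microphone delay-and-sum beamformer described above can be sketched in floating point; this is an illustrative model only (the paper's design is an FPGA datapath, and the function name and integer-sample steering here are our simplifying assumptions). One channel is delayed so the desired source aligns across both microphones, and the channels are averaged so that coherent speech adds while uncorrelated noise partially cancels.

```python
import numpy as np

def delay_and_sum(mic1, mic2, delay_samples):
    """Two-microphone delay-and-sum: shift mic2 by `delay_samples`, then average.

    delay_samples > 0 delays mic2; delay_samples < 0 advances it. Integer-sample
    steering only, mimicking a hardware channel delay line (e.g. a FIFO).
    """
    if delay_samples >= 0:
        aligned = np.concatenate([np.zeros(delay_samples), mic2])[:len(mic2)]
    else:
        aligned = np.concatenate([mic2[-delay_samples:], np.zeros(-delay_samples)])
    return 0.5 * (mic1 + aligned)            # coherent speech sums, noise averages down
```

For two channels with uncorrelated noise of equal power, this averaging yields roughly a 3 dB improvement in signal-to-noise ratio for a source on the steered direction, which is what makes the enhanced waveforms more amenable to speech recognition.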
Abstract:
This collection of essays honouring the late Emeritus Keith Jackson addresses the public interest in New Zealand. This subject is of increasing importance at a time when politicians are grappling with serious issues that call into question the boundaries between the private and public spheres. The essays, by leading scholars and acknowledged experts in their field, reflect Keith's own preoccupations with institutional politics and with communication.
Abstract:
This review chapter provides an overview of English language literacy education in the contexts of cultural and economic globalisation. Drawing on case study examples from India and China, the authors outline three complementary models: the development paradigm, the hegemony paradigm and the new literacies paradigm. The analysis focuses on the effects of the spread of English on vernacular languages and the non-synchronous issues raised by digital production cultures. Noting the difficulties of education systems in contending with new literacies, the chapter argues for the reframing of transnational relations, global material conditions and new communications technologies as the objects of critical literacy education.
Abstract:
Objective: To demonstrate properties of the International Classification of the External Cause of Injury (ICECI) as a tool for use in injury prevention research. Methods: The Childhood Injury Prevention Study (CHIPS) is a prospective longitudinal follow-up study of a cohort of 871 children aged 5–12 years, with a nested case-crossover component. The ICECI is the latest tool in the International Classification of Diseases (ICD) family and has been designed to improve the precision of coding injury events. The details of all injury events recorded in the study, as well as all measured injury related exposures, were coded using the ICECI. This paper reports a substudy on the utility and practicability of using the ICECI in the CHIPS to record exposures. Interrater reliability was quantified for a sample of injured participants using the Kappa statistic to measure concordance between codes independently assigned by two research staff. Results: There were 767 diaries collected at baseline, with event details from 563 injuries and exposure details from the injury crossover periods. There were no event, location, or activity details which could not be coded using the ICECI. Kappa statistics for concordance between raters within each of the dimensions ranged from 0.31 to 0.93 for the injury events and were 0.94 and 0.97 for activity and location in the control periods. Discussion: This study represents the first detailed account of the properties of the ICECI revealed by its use in a primary analytic epidemiological study of injury prevention. The results of this study provide considerable support for the ICECI and its further use.
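The interrater concordance measure used above, Cohen's kappa, corrects observed agreement for the agreement expected by chance from each rater's marginal code frequencies. A generic sketch (not the study's actual coding pipeline):

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa: chance-corrected agreement between two raters' code lists."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    # expected agreement from the two raters' marginal code frequencies
    expected = sum(c1[k] * c2.get(k, 0) for k in c1) / n ** 2
    return (observed - expected) / (1 - expected)
```

A kappa of 1 indicates perfect agreement and 0 indicates chance-level agreement, so the 0.94–0.97 values reported for activity and location represent near-perfect concordance, while 0.31 for some injury-event dimensions is only fair.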
Abstract:
Children with early and continuously treated phenylketonuria (ECT-PKU) remain at risk of developing executive function (EF) deficits. There is some evidence that a high phenylalanine to tyrosine ratio (phe:tyr) is more strongly associated with impaired EF development than high phenylalanine alone. This study examined EF in a sample of 11 adolescents against concurrent and historical levels of phenylalanine, phe:tyr, and tyrosine. Lifetime measures of phe:tyr were more strongly associated with EF than phenylalanine-only measures. Children with a lifetime phe:tyr less than 6 demonstrated normal EF, whereas children who had a lifetime phe:tyr above 6, on average, demonstrated clinically impaired EF.
Abstract:
South Africa's modern architecture is not confined to the cities, but the ideas of the movement were mostly disseminated by architects and academics in Johannesburg, Pretoria, Durban and Cape Town, its four major urban centres. The layout of significant areas of each city was also influenced by international modernist plans. In outlining the achievements and innovative designs of architects in these cities between the 1930s and 1970s, this article draws a picture of the importance of modernism in South African urban space, and of its diversity. It also draws attention to the political nature of the South African landscape and space, where modernist design was used for racial purposes, and to past and present conservation ideologies. The second part of the article concerns the conservation of modern buildings in these centres; it cites bibliographies and lists the registers, both existing and under construction. It concludes with an overview of the conservation legislation in place and the challenges of conservation in a context of changing cultural values.