937 results for q-Analysis
Abstract:
Migraine is a painful disorder for which the etiology remains obscure. Diagnosis is largely based on International Headache Society criteria. However, no feature occurs in all patients who meet these criteria, and no single symptom is required for diagnosis. Consequently, this definition may not accurately reflect the phenotypic heterogeneity or genetic basis of the disorder. Such phenotypic uncertainty is typical for complex genetic disorders and has encouraged interest in multivariate statistical methods for classifying disease phenotypes. We applied three popular statistical phenotyping methods—latent class analysis, grade of membership, and “fuzzy” clustering (Fanny)—to migraine symptom data, and compared heritability and genome-wide linkage results obtained using each approach. Our results demonstrate that different methodologies produce different clustering structures and non-negligible differences in subsequent analyses. We therefore urge caution in the use of any single approach and suggest that multiple phenotyping methods be used.
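As a minimal illustration of how disagreement between clustering structures can be quantified, the sketch below compares two partitions of simulated symptom data with the adjusted Rand index. KMeans and a Gaussian mixture are stand-ins chosen for availability, since the methods used in the study (latent class analysis, grade of membership, Fanny) have no direct scikit-learn equivalents, and the data are simulated rather than the study's.

```python
# Minimal sketch (not the authors' pipeline): quantify how much two clustering
# methods disagree on the same symptom matrix. KMeans and a Gaussian mixture
# stand in for LCA / GoM / Fanny; the binary symptom data are simulated.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture
from sklearn.metrics import adjusted_rand_score

rng = np.random.default_rng(0)
X = rng.binomial(1, 0.4, size=(500, 10)).astype(float)  # 500 cases x 10 binary symptoms

labels_a = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
labels_b = GaussianMixture(n_components=3, random_state=0).fit(X).predict(X)

# ARI = 1 means identical partitions; values near 0 mean chance-level agreement.
print("Adjusted Rand index:", adjusted_rand_score(labels_a, labels_b))
```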
Abstract:
Purpose: This study explored the spatial distribution of notified cryptosporidiosis cases and identified major socioeconomic factors associated with the transmission of cryptosporidiosis in Brisbane, Australia. Methods: We obtained computerized data sets on notified cryptosporidiosis cases and key socioeconomic factors by statistical local area (SLA) in Brisbane for the period 1996 to 2004 from the Queensland Department of Health and the Australian Bureau of Statistics, respectively. We used spatial empirical Bayes rate smoothing to estimate the spatial distribution of cryptosporidiosis cases. A spatial classification and regression tree (CART) model was developed to explore the relationship between socioeconomic factors and the incidence rates of cryptosporidiosis. Results: Spatial empirical Bayes analysis revealed that cryptosporidiosis infections were primarily concentrated in the northwest and southeast of Brisbane. The spatial CART model showed that the relative risk of cryptosporidiosis transmission was 2.4 when the value of the Socio-Economic Index for Areas (SEIFA) exceeded 1028 and the proportion of residents with low educational attainment in an SLA exceeded 8.8%. Conclusions: There was remarkable variation in the spatial distribution of cryptosporidiosis infections in Brisbane. The spatial pattern of cryptosporidiosis appears to be associated with the SEIFA and the proportion of residents with low educational attainment.
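A minimal sketch of empirical Bayes rate smoothing, using made-up case counts and populations rather than the study's data. This is the simpler global version, which shrinks each small area's raw rate toward the overall mean; the spatial variant used in the study shrinks toward a neighbourhood mean instead.

```python
# Minimal sketch (illustrative numbers) of empirical Bayes rate smoothing:
# unstable rates from small populations are pulled toward the overall mean.
import numpy as np

cases = np.array([3, 0, 12, 5, 1], dtype=float)            # notified cases per SLA (illustrative)
pop = np.array([8000, 1500, 40000, 12000, 2500], float)    # population at risk per SLA

raw = cases / pop
m = cases.sum() / pop.sum()                 # overall mean rate
# Moment estimate of the between-area rate variance (Marshall-style), truncated at zero.
s2 = max(np.average((raw - m) ** 2, weights=pop) - m / pop.mean(), 0.0)

w = s2 / (s2 + m / pop)                     # small populations get small weights ...
smoothed = w * raw + (1 - w) * m            # ... and are pulled harder toward m

print(np.column_stack([raw, smoothed]))
```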
Abstract:
Precise, up-to-date and increasingly detailed road maps are crucial for advanced road applications such as lane-level vehicle navigation and advanced driver assistance systems. With very high resolution (VHR) imagery available from digital airborne sensors, data acquisition and map updating would be greatly facilitated if road details could be extracted automatically from the aerial images. In this paper we propose an effective approach for detecting road lane information from aerial images using object-oriented image analysis. The proposed algorithm starts by constructing a digital surface model (DSM) and true orthophotos from the stereo images. Road lane details are then detected using an object-oriented, rule-based image classification approach. Because other objects with similar spectral and geometric attributes can be confused with lane markings, the extracted road lanes are filtered using the road surface obtained by a progressive two-class decision classifier. The generated road network is evaluated using data sets provided by the Queensland Department of Main Roads. The evaluation shows completeness values ranging between 76% and 98% and correctness values ranging between 82% and 97%.
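The completeness and correctness measures used in the evaluation can be illustrated with a short sketch; the lengths below are made-up numbers, not the study's data.

```python
# Minimal sketch (illustrative numbers) of the road-extraction quality measures:
# completeness = fraction of the reference network that was found,
# correctness  = fraction of the extracted network that is actually road.
matched_reference_m = 7600.0   # length of reference lanes matched by the extraction
total_reference_m = 10000.0    # total length of reference lanes
matched_extraction_m = 7600.0  # length of extracted lanes that match the reference
total_extraction_m = 8800.0    # total length of extracted lanes

completeness = matched_reference_m / total_reference_m    # 0.76 -> 76%
correctness = matched_extraction_m / total_extraction_m   # ~0.86 -> 86%
print(f"completeness={completeness:.0%}, correctness={correctness:.0%}")
```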
Abstract:
Programs written in languages of the Oberon family usually contain runtime tests on the dynamic type of variables. In some cases it may be desirable to reduce the number of such tests. Typeflow analysis is a static method for determining bounds on the types that objects may possess at runtime. We show that this analysis is able to reduce the number of tests in certain plausible circumstances. Furthermore, the same analysis is able to detect at compile time certain program errors that would normally be detected only during program execution. This paper introduces the concepts of typeflow analysis and details its use in reducing runtime overhead in Oberon-2.
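A minimal sketch of the idea behind typeflow analysis, using a toy type hierarchy and Python sets rather than the paper's Oberon-2 formulation: a runtime type test is redundant when the statically tracked set of possible types already guarantees the tested type, and a test that can never succeed can be reported as an error at compile time.

```python
# Minimal sketch (not the paper's algorithm) of typeflow reasoning:
# track the set of dynamic types a variable may hold and decide whether a
# runtime type test is still needed.

# Toy hierarchy: Figure <- Circle, Figure <- Rectangle.
SUBTYPES = {
    "Figure": {"Figure", "Circle", "Rectangle"},
    "Circle": {"Circle"},
    "Rectangle": {"Rectangle"},
}

def needs_runtime_test(possible_types, tested_type):
    """A test 'v IS tested_type' can be removed when every type the analysis
    says v may hold is already a subtype of tested_type."""
    return not possible_types <= SUBTYPES[tested_type]

# After 'v := NewCircle()', the analysis knows v can only be a Circle.
possible = {"Circle"}
print(needs_runtime_test(possible, "Circle"))     # False -> test can be removed
print(needs_runtime_test(possible, "Rectangle"))  # True, but it can never succeed:
print(possible.isdisjoint(SUBTYPES["Rectangle"])) # True -> report error at compile time
```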
A discrete-trial approach to the functional analysis of aggressive behaviour in two boys with autism
Abstract:
Intervention to reduce challenging behaviour may be enhanced when based on a prior functional analysis. The present study describes a discrete-trial approach for the functional analysis of aggressive behaviour in two boys with autism. Twenty brief assessment trials were conducted in the classroom by the teacher under each of three conditions (i.e., attention, task and tangible). The results showed a clear pattern to each child's aggressive behaviour and suggested logical intervention strategies, although the study is limited because it involved only two children. The discrete-trial approach would appear to represent a practical and ecologically valid technique for conducting a functional analysis of challenging behaviour in applied settings.
Abstract:
Financial processes may possess long memory and their probability densities may display heavy tails. Many models have been developed to deal with this tail behaviour, which reflects jumps in the sample paths. On the other hand, the presence of long memory, which contradicts the efficient market hypothesis, is still an issue for further debate. These difficulties present challenges for the problems of memory detection and of modelling the co-presence of long memory and heavy tails. This PhD project aims to respond to these challenges.

The first part aims to detect memory in a large number of financial time series on stock prices and exchange rates using their scaling properties. Since financial time series often exhibit stochastic trends, a common form of nonstationarity, strong trends in the data can lead to false detection of memory. We take advantage of a technique known as multifractal detrended fluctuation analysis (MF-DFA), which can systematically eliminate trends of different orders. This method is based on the identification of the scaling of the q-th-order moments and is a generalisation of the standard detrended fluctuation analysis (DFA), which uses only the second moment; that is, q = 2. We also consider the rescaled range (R/S) analysis and the periodogram method to detect memory in financial time series and compare their results with those of MF-DFA. An interesting finding is that short memory is detected for stock prices of the American Stock Exchange (AMEX), while long memory is found in the time series of two exchange rates, namely the French franc and the Deutsche mark. Electricity price series of the five states of Australia are also found to possess long memory. For these electricity price series, heavy tails are also pronounced in their probability densities.

The second part of the thesis develops models to represent the short-memory and long-memory financial processes detected in Part I. These models take the form of continuous-time AR(∞)-type equations whose kernel is the Laplace transform of a finite Borel measure. By imposing appropriate conditions on this measure, short memory or long memory in the dynamics of the solution will result. A specific form of the models, which has a good MA(∞)-type representation, is presented for the short-memory case. Parameter estimation for this type of model is performed via least squares, and the models are applied to the stock prices in the AMEX, which were established in Part I to possess short memory. By selecting the kernel in the continuous-time AR(∞)-type equations to have the form of a Riemann-Liouville fractional derivative, we obtain a fractional stochastic differential equation driven by Brownian motion. This type of equation is used to represent financial processes with long memory, whose dynamics are described by the fractional derivative in the equation. These models are estimated via quasi-likelihood, namely via a continuous-time version of the Gauss-Whittle method. The models are applied to the exchange rates and the electricity prices of Part I with the aim of confirming their possible long-range dependence established by MF-DFA.

The third part of the thesis provides an application of the results established in Parts I and II to characterise and classify financial markets. We pay attention to the New York Stock Exchange (NYSE), the American Stock Exchange (AMEX), the NASDAQ Stock Exchange (NASDAQ) and the Toronto Stock Exchange (TSX). The parameters from MF-DFA and those of the short-memory AR(∞)-type models are employed in this classification. We propose the Fisher discriminant algorithm to find a classifier in the two- and three-dimensional spaces of the data sets and then provide cross-validation to verify discriminant accuracies. This classification is useful for understanding and predicting the behaviour of different processes within the same market.

The fourth part of the thesis investigates the heavy-tailed behaviour of financial processes which may also possess long memory. We consider fractional stochastic differential equations driven by stable noise to model financial processes such as electricity prices. The long memory of electricity prices is represented by a fractional derivative, while the stable noise input models their non-Gaussianity via the tails of their probability density. A method using the empirical densities and MF-DFA is provided to estimate all the parameters of the model and simulate sample paths of the equation. The method is then applied to analyse daily spot prices for five states of Australia. Comparisons with the results obtained from the R/S analysis, the periodogram method and MF-DFA are provided. The results from the fractional SDEs agree with those from MF-DFA, which are based on multifractal scaling, while those from the periodograms, which are based on the second moment, seem to underestimate the long-memory dynamics of the process. This highlights the need for and usefulness of fractal methods in modelling non-Gaussian financial processes with long memory.
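A minimal sketch of the MF-DFA fluctuation function in its standard textbook form (not the thesis code): the generalised Hurst exponent h(q) is estimated as the slope of log F_q(s) against log s over a range of scales s, and q = 2 recovers ordinary DFA.

```python
# Minimal sketch of the q-th order MF-DFA fluctuation function F_q(s).
import numpy as np

def fluctuation(x, s, q=2.0, order=1):
    y = np.cumsum(x - np.mean(x))              # profile of the series
    n_seg = len(y) // s
    f2 = np.empty(n_seg)
    t = np.arange(s)
    for v in range(n_seg):                     # detrend each segment of length s
        seg = y[v * s:(v + 1) * s]
        coeff = np.polyfit(t, seg, order)
        f2[v] = np.mean((seg - np.polyval(coeff, t)) ** 2)
    return np.mean(f2 ** (q / 2.0)) ** (1.0 / q)

rng = np.random.default_rng(1)
x = rng.standard_normal(4096)                  # white noise: expect h(2) near 0.5
scales = np.array([16, 32, 64, 128, 256, 512])
fq = np.array([fluctuation(x, s, q=2.0) for s in scales])
h2 = np.polyfit(np.log(scales), np.log(fq), 1)[0]   # slope of the log-log fit
print("estimated h(2):", round(h2, 2))
```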
Abstract:
This paper aims to develop an effective numerical simulation technique for the dynamic deflection analysis of nanotube-based nanoswitches. The nanoswitch is simplified to a continuum structure, and some key material parameters are extracted from typical molecular dynamics (MD) simulations. An advanced local meshless formulation is applied to obtain the discretized dynamic equations for the numerical solution. The developed numerical technique is first validated by static deflection analyses of nanoswitches, and then the fundamental dynamic properties of nanoswitches are analyzed. A parametric comparison with results in the literature and from experiments shows that the developed modelling approach is accurate, efficient and effective.
Abstract:
Malcolm Shepherd Knowles was a key writer and theorist in the field of adult education in the United States. He died in 1997 and left a large legacy of books and journal articles. This thesis traced the development of his thinking over the 46-year period from 1950 to 1995. It examined the 25 works authored, co-authored, edited, reissued and revised by him during that period. The writings were scrutinised using a literature research methodology to expose the theoretical content, and a history of thought lens to identify and account for the development of major ideas. The methodology enabled a gradual unfolding of the history. A broadly-consistent and sequential pattern of thought focusing on the notion of andragogy emerged. The study revealed that after the initial phases of exploratory thinking, Knowles developed a practical-theoretical framework he believed could function as a comprehensive theory of adult learning. As his thinking progressed, his theory developed into a unified framework for human resource development and, later, into a model for the development of self-directed lifelong learners. The study traced the development of Knowles’ thinking through the phases of thought, identified the writings that belonged within each phase and produced a series of diagrammatic representations showing the evolution of his conceptual framework. The production of a history of the development of Knowles’ thought is the major outcome of the study. In addition to plotting the narrative sequence of thought-events, the history helps to explicate the factors and conditions that influenced Knowles’ thinking and to show the interrelationships between ideas. The study should help practitioners in their use and appreciation of Knowles’ works.
Abstract:
Multicarrier code division multiple access (MC-CDMA) is a very promising candidate for the multiple access scheme in fourth generation wireless communication systems. During asynchronous transmission, multiple access interference (MAI) is a major challenge for MC-CDMA systems and significantly affects their performance. The main objectives of this thesis are to analyze the MAI in asynchronous MC-CDMA and to develop robust techniques to reduce the MAI effect.

The focus is first on the statistical analysis of MAI in asynchronous MC-CDMA. A new statistical model of MAI is developed. In the new model, the derivation of MAI can be applied to different distributions of timing offset, and the MAI power is modelled as a Gamma-distributed random variable. By applying the new statistical model of MAI, a new computer simulation model is proposed. This model is based on modelling a multiuser system as a single-user system followed by an additive noise component representing the MAI, which enables the new simulation model to significantly reduce the computational load of computer simulations.

MAI reduction using the slow frequency hopping (SFH) technique is the topic of the second part of the thesis. Two subsystems are considered. The first subsystem involves subcarrier frequency hopping as a group, which is referred to as GSFH/MC-CDMA. In the second subsystem, the condition of group hopping is dropped, resulting in a more general system, namely individual subcarrier frequency hopping MC-CDMA (ISFH/MC-CDMA). This research found that with the introduction of SFH, both the GSFH/MC-CDMA and ISFH/MC-CDMA systems generate less MAI power than the basic MC-CDMA system during asynchronous transmission. Because of this, both SFH systems are shown to outperform MC-CDMA in terms of BER. This improvement, however, comes at the expense of spectral widening.

In the third part of this thesis, base station polarization diversity, as another MAI reduction technique, is introduced to asynchronous MC-CDMA. The combined system is referred to as Pol/MC-CDMA. In this part a new optimum combining technique, namely maximal signal-to-MAI ratio combining (MSMAIRC), is proposed to combine the signals at two base station antennas. With the application of MSMAIRC, and in the absence of additive white Gaussian noise (AWGN), the resulting signal-to-MAI ratio (SMAIR) is not only maximized but also independent of the cross polarization discrimination (XPD) and antenna angle. When AWGN is present, the performance of MSMAIRC is still affected by the XPD and antenna angle, but to a much lesser degree than the traditional maximal ratio combining (MRC). Furthermore, this research found that the BER performance of Pol/MC-CDMA can be further improved by changing the angle between the two receiving antennas. Hence the optimum antenna angles for both MSMAIRC and MRC are derived and their effects on BER performance are compared. With the derived optimum antenna angle, the Pol/MC-CDMA system is able to achieve the lowest BER for a given XPD.
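A minimal sketch of the simulation idea described above, with assumed parameter values rather than the thesis's: the multiuser system is replaced by a single BPSK user plus an additive noise term whose per-symbol power is drawn from a Gamma distribution, and the bit error rate (BER) is measured directly.

```python
# Minimal sketch (illustrative parameters) of the reduced-complexity simulation
# model: interfering users are replaced by additive noise with Gamma-distributed power.
import numpy as np

rng = np.random.default_rng(2)
n_bits = 100_000
shape, scale = 2.0, 0.05           # assumed Gamma parameters of the MAI power
noise_var = 0.1                    # assumed AWGN variance

bits = rng.integers(0, 2, n_bits)
symbols = 2.0 * bits - 1.0                               # BPSK mapping: 0 -> -1, 1 -> +1
mai_power = rng.gamma(shape, scale, n_bits)              # per-symbol MAI power
mai = rng.standard_normal(n_bits) * np.sqrt(mai_power)   # MAI as zero-mean noise
awgn = rng.standard_normal(n_bits) * np.sqrt(noise_var)

received = symbols + mai + awgn
ber = np.mean((received > 0).astype(int) != bits)        # threshold detector
print("simulated BER:", ber)
```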
Abstract:
The aim of this exploratory study was to gain insight into Asian and Western public relations practices by investigating them through job advertisements, and thus to reflect on what organisations expect of public relations professionals. Grunig's (1984) four models of public relations and the concept of relationship management were used as the foundation for this study. Australia was used to represent the Western region and India was used to represent the Asian region. Sample sets of public relations recruitment advertisements from both countries were examined against Grunig's one-way communication, two-way communication and relationship management attributes.
Abstract:
The present study used a university sample to assess the test-retest reliability and validity of the Australian Propensity for Angry Driving Scale (Aus-PADS). The scale demonstrated stability over time, and convergent validity was established, as Aus-PADS scores correlated significantly with established anger and impulsivity measures. Discriminant validity was also established, as Aus-PADS scores did not correlate with Venturesomeness scores. The Aus-PADS also demonstrated criterion validity, as scores were correlated with behavioural measures such as yelling at other drivers, gesturing at other drivers, and feeling angry but not doing anything. Aus-PADS scores reliably predicted the frequency of these behaviours over and above other study variables. No significant relationship between aggressive driving and crash involvement was observed. It was concluded that the Aus-PADS is a reliable and valid tool appropriate for use in Australian research, and that the potential relationship between aggressive driving and crash involvement warrants further investigation with a more representative (and diverse) driver sample.
Abstract:
We describe the design and evaluation of a platform for networks of cameras in low-bandwidth, low-power wireless sensor networks (WSNs). In our work to date we have investigated two different DSP hardware/software platforms for undertaking the tasks of compression and of object detection and tracking. We compare the relative merits of each hardware and software platform in terms of both performance and energy consumption. Finally, we discuss what we believe are the ongoing research questions for image processing in WSNs.