35 results for Hierarchical Bayesian space-time models
at University of Queensland eSpace - Australia
Abstract:
We investigate whether relative contributions of genetic and shared environmental factors are associated with an increased risk of melanoma. Data from the Queensland Familial Melanoma Project comprising 15,907 subjects arising from 1912 families were analyzed to estimate the additive genetic, common and unique environmental contributions to variation in the age at onset of melanoma. Two complementary approaches for analyzing correlated time-to-onset family data were considered: the generalized estimating equations (GEE) method, in which one can estimate relationship-specific dependence simultaneously with regression coefficients that describe the average population response to changing covariates; and a subject-specific Bayesian mixed model, in which heterogeneity in regression parameters is explicitly modeled and the different components of variation may be estimated directly. The proportional hazards and Weibull models were utilized, as both provide natural frameworks for estimating relative risks while adjusting for simultaneous effects of other covariates. A simple Markov chain Monte Carlo method for imputation of missing covariate data was used, and the implementation of the Bayesian model was based on Gibbs sampling using the freeware package BUGS. In addition, we used a Bayesian model to investigate the relative contribution of genetic and environmental effects on the expression of naevi and freckles, which are known risk factors for melanoma.
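The Weibull proportional-hazards structure used in the analysis can be illustrated with a minimal sketch. This is not code from the study; the shape, scale, and log hazard ratio values below are hypothetical placeholders.

```python
import math

def weibull_hazard(t, shape, scale, log_hr, x):
    """Weibull proportional-hazards rate h(t | x) = h0(t) * exp(log_hr * x),
    with baseline h0(t) = (shape / scale) * (t / scale) ** (shape - 1)."""
    h0 = (shape / scale) * (t / scale) ** (shape - 1)
    return h0 * math.exp(log_hr * x)

# Under proportional hazards, the relative risk between covariate levels
# x = 1 and x = 0 is exp(log_hr) at every time t.
log_hr = 0.7  # hypothetical log hazard ratio
rr = (weibull_hazard(2.0, 1.5, 10.0, log_hr, 1.0)
      / weibull_hazard(2.0, 1.5, 10.0, log_hr, 0.0))
```

The time-invariance of this ratio is what makes the proportional hazards form convenient for reporting relative risks adjusted for other covariates.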
Abstract:
This paper derives the performance union bound of space-time trellis codes in orthogonal frequency division multiplexing systems (STTC-OFDM) over quasi-static frequency-selective fading channels based on the distance spectrum technique. The distance spectrum is the enumeration of the codeword difference measures and their multiplicities, obtained by exhaustive search through all possible error event paths. The exhaustive search approach can be used for low memory order STTCs with small frame sizes. However, with moderate memory order STTCs and moderate frame sizes, the computational cost of exhaustive search increases exponentially and may become impractical for high memory order STTCs. This calls for advanced computational techniques such as genetic algorithms (GAs). In this paper, a GA with the sharing function method is used to locate the multiple solutions of the distance spectrum for high memory order STTCs. Simulations evaluate the performance union bound and compare the complexity of non-GA-aided and GA-aided distance spectrum techniques. They show that the union bound gives a close performance measure at high signal-to-noise ratio (SNR), and that the GA-based distance spectrum technique with the sharing function method requires much less computational time than the exhaustive search approach while maintaining satisfactory accuracy.
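Once a distance spectrum is in hand, the union bound is a weighted sum of pairwise error terms over the enumerated distances. A minimal sketch follows, using a generic AWGN-style pairwise bound rather than the paper's quasi-static fading expression; the distances and multiplicities are hypothetical.

```python
import math

def q_function(x):
    """Gaussian tail Q(x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def union_bound(distance_spectrum, snr_linear):
    """Union bound on error-event probability from a distance spectrum
    given as {squared_distance: multiplicity}, summing pairwise error
    probabilities Q(sqrt(d^2 * SNR / 2)) weighted by multiplicity."""
    return sum(mult * q_function(math.sqrt(d2 * snr_linear / 2))
               for d2, mult in distance_spectrum.items())

spectrum = {4.0: 1, 8.0: 2}  # hypothetical spectrum
p_bound = union_bound(spectrum, snr_linear=10.0)
```

At high SNR the smallest distance dominates the sum, which is why the bound becomes a close performance measure in that regime.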
Abstract:
We compare Bayesian methodology utilizing the freeware package BUGS (Bayesian Inference Using Gibbs Sampling) with the traditional structural equation modelling approach based on another freeware package, Mx. Dichotomous and ordinal (three category) twin data were simulated according to different additive genetic and common environment models for phenotypic variation. Practical issues are discussed in using Gibbs sampling as implemented by BUGS to fit subject-specific Bayesian generalized linear models, where the components of variation may be estimated directly. The simulation study (based on 2000 twin pairs) indicated that there is a consistent advantage in using the Bayesian method to detect a correct model under certain specifications of additive genetic and common environmental effects. For binary data, both methods had difficulty in detecting the correct model when the additive genetic effect was low (between 10 and 20%) or of moderate range (between 20 and 40%). Furthermore, neither method could adequately detect a correct model that included a modest common environmental effect (20%) even when the additive genetic effect was large (50%). Power was significantly improved with ordinal data for most scenarios, except for the case of low heritability under a true ACE model. We illustrate and compare both methods using data from 1239 twin pairs over the age of 50 years, who were registered with the Australian National Health and Medical Research Council Twin Registry (ATR) and presented symptoms associated with osteoarthritis occurring in joints of the hand.
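The kind of twin data simulated in such studies can be sketched with an ACE liability-threshold generator. This is illustrative only, not the study's actual simulation code, and the variance shares passed in below are placeholders.

```python
import random

def simulate_twin_pair(a2, c2, r_genetic, threshold=0.0, rng=random):
    """Simulate one twin pair's binary phenotypes under an ACE liability
    model with variance shares a2 (additive genetic), c2 (common
    environment) and e2 = 1 - a2 - c2 (unique environment).
    r_genetic is 1.0 for MZ twins, 0.5 for DZ twins."""
    e2 = 1.0 - a2 - c2
    shared_g = rng.gauss(0, 1)   # genetic component shared in proportion r
    c = rng.gauss(0, 1)          # common environment, fully shared
    phenos = []
    for _ in range(2):
        g = (r_genetic ** 0.5) * shared_g + ((1 - r_genetic) ** 0.5) * rng.gauss(0, 1)
        liability = (a2 ** 0.5) * g + (c2 ** 0.5) * c + (e2 ** 0.5) * rng.gauss(0, 1)
        phenos.append(int(liability > threshold))  # dichotomize at threshold
    return tuple(phenos)
```

Fitting either the Mx structural model or the BUGS hierarchical model to many such pairs is then a matter of recovering a2 and c2 from the MZ/DZ concordance pattern.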
Abstract:
Three kinds of integrable Kondo impurity additions to one-dimensional q-deformed extended Hubbard models are studied by means of the boundary Z(2)-graded quantum inverse scattering method. The boundary K matrices depending on the local magnetic moments of the impurities are presented as nontrivial realisations of the reflection equation algebras in an impurity Hilbert space. The models are solved by using the algebraic Bethe ansatz method, and the Bethe ansatz equations are obtained.
Abstract:
Subsequent to the influential paper of [Chan, K.C., Karolyi, G.A., Longstaff, F.A., Sanders, A.B., 1992. An empirical comparison of alternative models of the short-term interest rate. Journal of Finance 47, 1209-1227], the generalised method of moments (GMM) has been a popular technique for estimation and inference relating to continuous-time models of the short-term interest rate. GMM has been widely employed to estimate model parameters and to assess the goodness-of-fit of competing short-rate specifications. The current paper conducts a series of simulation experiments to document the bias and precision of GMM estimates of short-rate parameters, as well as the size and power of the J-test of over-identifying restrictions [Hansen, L.P., 1982. Large sample properties of generalised method of moments estimators. Econometrica 50, 1029-1054]. While the J-test appears to have appropriate size and good power in sample sizes commonly encountered in the short-rate literature, GMM estimates of the speed of mean reversion are shown to be severely biased. Consequently, it is dangerous to draw strong conclusions about the strength of mean reversion using GMM. In contrast, the parameter capturing the levels effect, which is important in differentiating between competing short-rate specifications, is estimated with little bias. (c) 2006 Elsevier B.V. All rights reserved.
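The mean-reversion speed at issue is the kappa in short-rate dynamics of the form dr = kappa(theta - r)dt + sigma dW. The sketch below simulates a Vasicek path and recovers kappa from a simple moment-based (OLS) regression on the discretized dynamics; it is not the paper's GMM setup, and all parameter values are hypothetical.

```python
import random

def simulate_vasicek(kappa, theta, sigma, r0, dt, n, rng):
    """Euler-discretized Vasicek short rate: dr = kappa*(theta - r)dt + sigma dW."""
    r = [r0]
    for _ in range(n):
        r.append(r[-1] + kappa * (theta - r[-1]) * dt
                 + sigma * (dt ** 0.5) * rng.gauss(0, 1))
    return r

def estimate_kappa(rates, dt):
    """Moment-based (OLS) estimate of mean-reversion speed from the
    discretized regression r[t+1] - r[t] = a - kappa*dt*r[t] + noise,
    so the OLS slope on r[t] equals -kappa*dt."""
    x = rates[:-1]
    y = [b - a for a, b in zip(rates[:-1], rates[1:])]
    mx, my = sum(x) / len(x), sum(y) / len(y)
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return -slope / dt
```

In short samples such slope-based kappa estimates are upward biased, which is the flavor of the bias the paper documents for GMM.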
Abstract:
The paper presents investigations into multiple input multiple output (MIMO) wireless communication systems, carried out from an electromagnetic perspective. The first part of the paper focuses on signal propagation models, which can be used for determining the MIMO system capacity or its performance when various space-time coding schemes are applied. Two types of models are considered. In the first model, array antennas are treated in an exact electromagnetic manner but interactions with scattering objects are incorporated using an approximate single-bounce scattering approach. The other model is a simple but exact electromagnetic (EM) model, which takes into account EM interactions between antennas and scatterers. In this model, parallel wire dipoles represent antennas as well as scatterers. The second part of the paper reports on investigations into two types of MIMO testbeds. The first is a simple transmit/receive diversity testbed while the other is a full MIMO testbed. The paper briefly describes the results obtained during these investigations.
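The capacity such propagation models feed into is the standard log-det formula C = log2 det(I + (SNR/Nt) H H^H). A minimal sketch for a fixed 2x2 channel with equal power allocation follows; the channel matrix is a hypothetical example, not one produced by the paper's EM models.

```python
import math

def capacity_2x2(H, snr):
    """Capacity of a fixed 2x2 MIMO channel with equal power allocation:
    C = log2 det(I + (snr/2) * H * H^H), with H a 2x2 complex matrix
    given as nested lists. det is expanded by hand for the 2x2 case."""
    # Gram matrix G = H H^H (2x2 Hermitian)
    g11 = sum(abs(h) ** 2 for h in H[0])
    g22 = sum(abs(h) ** 2 for h in H[1])
    g12 = H[0][0] * H[1][0].conjugate() + H[0][1] * H[1][1].conjugate()
    a = snr / 2.0
    det = (1 + a * g11) * (1 + a * g22) - (a ** 2) * abs(g12) ** 2
    return math.log2(det)

# Identity channel: two parallel subchannels, capacity 2*log2(1 + snr/2)
H_id = [[1 + 0j, 0j], [0j, 1 + 0j]]
c = capacity_2x2(H_id, snr=10.0)
```

Scattering-induced correlation between the entries of H reduces this determinant, which is how the electromagnetic models translate into capacity predictions.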
Abstract:
Arriving in Brisbane some six years ago, I could not help being impressed by what may be prosaically described as its atmospheric amenity resources. Perhaps this in part was due to my recent experiences in major urban centres in North America, but since that time, that sparkling quality and the blue skies seem to have progressively diminished. Unfortunately, there is also objective evidence available to suggest that this apparent deterioration is not merely the result of habituation of the senses. Air pollution data for the city show trends of increasing concentrations of those very substances that have destroyed the attractiveness of major population centres elsewhere, with climates initially as salubrious. Indeed, present figures indicate that photochemical smog in unacceptably high concentrations is rapidly becoming endemic also over Brisbane. These regrettable developments should come as no surprise. The society at large has not been inclined to respond purposefully to warnings of impending environmental problems, despite the experiences and publicity from overseas and even from other cities within Australia. Nor, up to the present, have certain politicians and government officials displayed stances beyond those necessary for the maintenance of a decorum of concern. At this stage, there still exists the possibility for meaningful government action without the embarrassment of losing political favour with the electorate. To the contrary, there is every chance that such action may be turned to advantage with increased public enlightenment. It would be more than a pity to miss perhaps the final remaining opportunity: Queensland is one of the few remaining places in the world with sufficient resources to permit both rational development and high environmental quality. 
The choice appears to be one of making a relatively minor investment now for a large financial and social gain in the near future, or permitting Brisbane to degenerate gradually into just another stagnated Los Angeles or Sydney. The present monograph attempts to introduce the problem by reviewing the available research on air quality in the Brisbane area. It also tries to elucidate some seemingly obvious, but so far unapplied, management approaches. By necessity, such a broad treatment needs to make inroads into extensive ranges of subject areas, from political and legal practices to public perceptions, and from scientific measurement and statistical analysis to the dynamics of air flow. Clearly, it does not pretend to be definitive in any of these fields, but it does try to emphasize those adjustable facets of the human use system of natural resources, too often neglected in favour of air pollution control technology. The crossing of disciplinary boundaries, however, needs no apology: air quality problems are ubiquitous, touching upon space, time and human interaction.
Abstract:
Algorithms for explicit integration of structural dynamics problems with multiple time steps (subcycling) are investigated. Only one such algorithm, due to Smolinski and Sleith, has proved to be stable in a classical sense. A simplified version of this algorithm that retains its stability is presented. However, as with the original version, it can be shown to sacrifice accuracy to achieve stability. Another algorithm in use is shown to be only statistically stable, in that a probability of stability can be assigned if appropriate time step limits are observed. This probability improves rapidly with the number of degrees of freedom in a finite element model. The stability problems are shown to be a property of the central difference method itself, which is modified to give the subcycling algorithm. A related problem is shown to arise when a constraint equation in time is introduced into a time-continuous space-time finite element model. (C) 1998 Elsevier Science S.A.
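The stability property of the central difference method underlying these subcycling algorithms can be illustrated on a single-degree-of-freedom oscillator, where the classical critical step is dt < 2/omega. This is a minimal sketch of the base integrator only, not of any subcycling scheme from the paper.

```python
def central_difference(omega, dt, n_steps, u0=1.0, v0=0.0):
    """Explicit central-difference integration of u'' + omega^2 * u = 0.
    The scheme u[n+1] = 2*u[n] - u[n-1] - (dt*omega)**2 * u[n] is stable
    only when dt < 2/omega; beyond that, the solution grows without bound."""
    # Standard starting value u[-1] from a Taylor expansion at t = 0
    u_prev = u0 - dt * v0 + 0.5 * dt ** 2 * (-(omega ** 2) * u0)
    u = u0
    for _ in range(n_steps):
        u_next = 2 * u - u_prev - (dt * omega) ** 2 * u
        u_prev, u = u, u_next
    return u
```

In a multi-rate (subcycled) setting, the effective critical step varies across the mesh, which is why stability becomes the central question for such algorithms.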
Abstract:
A causally well-behaved solution of the localization problem for the free electron is given, with natural space-time transformation properties, in terms of Dirac's position operator x. It is shown that, although x is not an observable in the usual sense, and has no positive-energy (generalized) eigenstates, the four-vector density (rho(x, t), j(x, t)/c) is observable, and can be localized arbitrarily precisely about any point in space, at any instant of time, using only positive energy states. A suitable spin operator can be diagonalized at the same time.
Abstract:
Genetic algorithms (GAs) are known to locate the global optimal solution provided sufficient population and/or generations are used. Practically, a near-optimal satisfactory result can be found by GAs with a limited number of generations. In wireless communications, the exhaustive searching approach is widely applied to many techniques, such as maximum likelihood decoding (MLD) and distance spectrum (DS) techniques. The complexity of the exhaustive searching approach in the MLD or the DS technique is exponential in the number of transmit antennas and the size of the signal constellation for multiple-input multiple-output (MIMO) communication systems. If a large number of antennas and large signal constellations, e.g. PSK and QAM, are employed in MIMO systems, the exhaustive searching approach becomes impractical and time consuming. In this paper, GAs are applied to the MLD and DS techniques to provide near-optimal performance with reduced computational complexity for MIMO systems. Two different GA-based efficient searching approaches are proposed for the MLD and DS techniques, respectively. The first proposed approach is based on a GA with the sharing function method, which is employed to locate the multiple solutions of the distance spectrum for Space-time Trellis Coded Orthogonal Frequency Division Multiplexing (STTC-OFDM) systems. The second approach is a GA-based MLD that attempts to find the closest point to the transmitted signal. The proposed approach can return a satisfactory result provided a good initial signal vector is supplied to the GA. Simulation results show that the proposed GA-based efficient searching approaches can achieve near-optimal performance with a lower searching complexity compared with the original MLD and DS techniques for MIMO systems.
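A minimal sketch of the kind of GA search described follows, on a generic bit-string fitness problem rather than the MLD or distance spectrum objective; the population size, tournament size, and mutation rate are arbitrary choices for illustration.

```python
import random

def ga_search(fitness, n_bits, pop_size=20, generations=40, p_mut=0.05, rng=random):
    """Minimal genetic algorithm over bit strings: tournament selection,
    single-point crossover, bit-flip mutation, with the best individual
    retained across generations. Returns the best bit string found."""
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        new_pop = []
        for _ in range(pop_size):
            a = max(rng.sample(pop, 3), key=fitness)   # tournament of 3
            b = max(rng.sample(pop, 3), key=fitness)
            cut = rng.randrange(1, n_bits)             # single-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (rng.random() < p_mut) for bit in child]  # mutation
            new_pop.append(child)
        pop = new_pop
        best = max(pop + [best], key=fitness)
    return best
```

For the DS application a sharing function would additionally penalize fitness by crowding, so the population spreads over multiple distance-spectrum solutions instead of converging to one; that refinement is omitted here.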
Abstract:
This paper uses the international education sector in Australia as a case study to argue against understanding globalization as an exogenous force. It introduces the notion of globalization as a governmentality and discusses alternative interpretations which take into account notions of subjectivity, positionality and space/time. The paper examines the types of global imaginaries used to govern international education. A discourse of cultural hybridity is mobilized to construct Australia as a safe multicultural study destination. The expressions of hybridity which are sanctioned within the international university are scripted by a neoliberal text, limiting the possibilities for more sophisticated intellectual engagements with the global.
Abstract:
The use of a fully parametric Bayesian method for analysing single patient trials based on the notion of treatment 'preference' is described. This Bayesian hierarchical modelling approach allows for full parameter uncertainty, use of prior information and the modelling of individual and patient sub-group structures. It provides updated probabilistic results for individual patients, and groups of patients with the same medical condition, as they are sequentially enrolled into individualized trials using the same medication alternatives. Two clinically interpretable criteria for determining a patient's response are detailed and illustrated using data from a previously published paper under two different prior information scenarios. Copyright (C) 2005 John Wiley & Sons, Ltd.
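The idea of updating a patient's 'preference' probability as trial cycles accrue can be sketched with a conjugate Beta-Binomial model. This is an illustrative simplification of the sequential updating, not the paper's hierarchical model, and the uniform Beta(1, 1) prior is a placeholder.

```python
import math

def beta_posterior(prefers_a, n_cycles, alpha_prior=1.0, beta_prior=1.0):
    """Conjugate Beta-Binomial update: each completed cycle yields a binary
    indicator of whether the patient preferred treatment A. Returns the
    posterior Beta parameters and posterior mean preference probability."""
    alpha = alpha_prior + prefers_a
    beta = beta_prior + n_cycles - prefers_a
    return alpha, beta, alpha / (alpha + beta)

def prob_above_half(alpha, beta, n_grid=10000):
    """P(theta > 0.5) under Beta(alpha, beta), by midpoint-rule integration
    of the density over (0.5, 1)."""
    log_norm = math.lgamma(alpha + beta) - math.lgamma(alpha) - math.lgamma(beta)
    total = 0.0
    for i in range(n_grid // 2, n_grid):
        t = (i + 0.5) / n_grid
        total += math.exp(log_norm + (alpha - 1) * math.log(t)
                          + (beta - 1) * math.log(1 - t))
    return total / n_grid
```

In the full hierarchical version, the prior for a new patient would itself be informed by previously enrolled patients with the same condition, rather than fixed in advance.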
Abstract:
The aim of this study was to apply multifailure survival methods to analyze time to multiple occurrences of basal cell carcinoma (BCC). Data from 4.5 years of follow-up in the Nambour Skin Cancer Prevention Trial (1992-1996), a randomized controlled trial of skin cancer prevention, were used to assess the influence of sunscreen application on the time to first BCC and the time to subsequent BCCs. Three different approaches to time to ordered multiple events were applied and compared: the Andersen-Gill, Wei-Lin-Weissfeld, and Prentice-Williams-Peterson models. Robust variance estimation approaches were used for all multifailure survival models. Sunscreen treatment was not associated with time to first occurrence of a BCC (hazard ratio = 1.04, 95% confidence interval: 0.79, 1.45). Time to subsequent BCC tumors using the Andersen-Gill model resulted in a lower estimated hazard among the daily sunscreen application group, although statistical significance was not reached (hazard ratio = 0.82, 95% confidence interval: 0.59, 1.15). Similarly, both the Wei-Lin-Weissfeld marginal-hazards and the Prentice-Williams-Peterson gap-time models revealed trends toward a lower risk of subsequent BCC tumors among the sunscreen intervention group. These results demonstrate the importance of conducting multiple-event analyses for recurring events, as risk factors for a single event may differ from those where repeated events are considered.
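The Andersen-Gill and Prentice-Williams-Peterson models both consume recurrent-event data in counting-process (start, stop, event) form. A minimal sketch of that data expansion for one subject follows; the identifier and times are hypothetical.

```python
def to_counting_process(subject_id, event_times, followup_end):
    """Expand one subject's recurrent-event times into counting-process
    (id, start, stop, event) rows: one row per observed event interval,
    plus a final censored row if follow-up extends past the last event."""
    rows, start = [], 0.0
    for t in sorted(event_times):
        rows.append((subject_id, start, t, 1))   # interval ending in an event
        start = t
    if start < followup_end:
        rows.append((subject_id, start, followup_end, 0))  # censored tail
    return rows

# A subject with BCCs at 1.2 and 3.0 years, followed to 4.5 years
rows = to_counting_process("p01", [1.2, 3.0], followup_end=4.5)
```

The models then differ in how risk sets are formed over these rows (common baseline for Andersen-Gill, event-specific strata and gap times for Prentice-Williams-Peterson), with robust variance estimation accounting for within-subject correlation.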
Abstract:
An inverse methodology for the design of biologically loaded radio-frequency (RF) coils for magnetic resonance imaging applications is described. Free-space time-harmonic electromagnetic Green's functions and de-emphasized B-1 target fields are used to calculate the current density on the coil cylinder. In theory, with the B-1 field de-emphasized in the middle of the RF transverse plane, the calculated current distribution can generate an internal magnetic field that can reduce the central overemphasis effect caused by field/tissue interactions at high frequencies. The current distribution of a head coil operating at 4 T (170 MHz) is calculated using an inverse methodology with de-emphasized B-1 target fields. An in-house finite-difference time-domain routine is employed to evaluate the B-1 field and signal intensity inside a homogeneous cylindrical phantom and then a complete human head model. A comparison with a conventional RF birdcage coil is carried out and demonstrates that this method can help in decreasing the normal bright region caused by field/tissue interactions in head images at 170 MHz and higher field strengths.