959 results for pretest probability


Relevance: 10.00%

Abstract:

A novel multiple regression method (RM) is developed to predict identity-by-descent probabilities at a locus L (IBD_L) among individuals without pedigree, given information on surrounding markers and population history. These IBD_L probabilities are a function of the increase in linkage disequilibrium (LD) generated by drift in a homogeneous population over generations. Three parameters are sufficient to describe population history: effective population size (Ne), number of generations since foundation (T), and marker allele frequencies among founders (p). The IBD_L probabilities are used in a simulation study to map a quantitative trait locus (QTL) via variance component estimation. RM is compared to a coalescent method (CM) in terms of power and robustness of QTL detection. Differences between RM and CM are small but significant. For example, RM is more powerful than CM in dioecious populations, but not in monoecious populations. Moreover, RM is more robust than CM when marker phases are unknown, when there is complete LD among founders, or when Ne is wrong, and less robust when p is wrong. CM utilises all marker haplotype information, whereas RM utilises the information contained in each individual marker and in all possible marker pairs, but not in higher-order interactions. RM consists of a family of models encompassing four different population structures and two ways of using marker information, in contrast with the single CM model, which must cater for all possible evolutionary scenarios.
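The abstract does not give implementation details; as a rough illustration of how locus-specific IBD probabilities feed a variance-component QTL test, the sketch below models the phenotypic covariance as sigma2_q·IBD_L + sigma2_e·I and tests the QTL variance with a likelihood-ratio statistic. The IBD matrix, the phenotypes and the omission of a polygenic term are all assumptions made for illustration, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import multivariate_normal

def vc_loglik(params, y, ibd):
    """Gaussian log-likelihood of phenotypes y with covariance
    V = sigma2_q * IBD_L + sigma2_e * I (polygenic term omitted for brevity)."""
    sigma2_q, sigma2_e = np.exp(params)          # log-parameterised to stay positive
    V = sigma2_q * ibd + sigma2_e * np.eye(len(y))
    return multivariate_normal(mean=np.zeros(len(y)), cov=V).logpdf(y)

def lrt_qtl(y, ibd):
    """Likelihood-ratio statistic for a QTL variance component at locus L."""
    full = minimize(lambda p: -vc_loglik(p, y, ibd), x0=[0.0, 0.0])
    null = minimize(lambda p: -vc_loglik([-30.0, p[0]], y, ibd), x0=[0.0])
    return 2.0 * (null.fun - full.fun)           # ~0.5*chi2_0 + 0.5*chi2_1 under H0

# Toy example with a hypothetical 4-individual IBD_L matrix and phenotypes
ibd = np.array([[1.0, 0.5, 0.1, 0.0],
                [0.5, 1.0, 0.2, 0.1],
                [0.1, 0.2, 1.0, 0.4],
                [0.0, 0.1, 0.4, 1.0]])
y = np.array([0.3, 0.1, -0.2, -0.4])
print(lrt_qtl(y, ibd))
```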

Relevance: 10.00%

Abstract:

A new deterministic method for predicting simultaneous inbreeding coefficients at three and four loci is presented. The method involves calculating the conditional probability of IBD (identical by descent) at one locus given IBD at other loci, and multiplying this probability by the prior probability of the latter loci being simultaneously IBD. The conditional probability is obtained by applying a novel regression model, and the prior probability from the theory of digenic measures of Weir and Cockerham. The model was validated for a finite monoecious population mating at random, with a constant effective population size and with or without selfing, and also for an infinite population with a constant intermediate proportion of selfing; discrete generations were assumed throughout. Deterministic predictions were very accurate when compared with simulation results, and robust to alternative forms of implementation. These simultaneous inbreeding coefficients were more sensitive to changes in effective population size than to changes in marker spacing. Extensions to predict simultaneous inbreeding coefficients at more than four loci are now possible.
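As a concrete reading of the prediction rule described above, a three-locus simultaneous inbreeding coefficient can be written as the product of a conditional IBD probability (from the regression model) and a prior (from the digenic measures); the notation below is illustrative rather than the paper's own, with the four-locus case applying the same decomposition recursively.

```latex
% Simultaneous IBD at loci A, B, C (and A, B, C, D), written as a conditional
% probability times the prior probability that the remaining loci are
% simultaneously IBD; notation is illustrative.
F_{ABC}  = P(\mathrm{IBD}_A \mid \mathrm{IBD}_B, \mathrm{IBD}_C) \times F_{BC},
\qquad
F_{ABCD} = P(\mathrm{IBD}_A \mid \mathrm{IBD}_B, \mathrm{IBD}_C, \mathrm{IBD}_D) \times F_{BCD}.
```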

Relevance: 10.00%

Abstract:

The power of testing for a population-wide association between a biallelic quantitative trait locus and a linked biallelic marker locus is predicted both empirically and deterministically for several tests. The tests were based on the analysis of variance (ANOVA) and on a number of transmission disequilibrium tests (TDT). Deterministic power predictions made use of family information and were functions of population parameters including linkage disequilibrium, allele frequencies, and recombination rate. Deterministic power predictions were very close to the empirical power from simulations in all scenarios considered in this study. The different TDTs had very similar power, intermediate between one-way and nested ANOVAs. One-way ANOVA was the only test that was not robust against spurious disequilibrium. Our general framework for deterministic power prediction can also be applied to other association tests. Deterministic power calculations are a powerful tool for researchers to plan and evaluate experiments, obviating the need for elaborate simulation studies.
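The deterministic predictions themselves are not reproduced here, but once a noncentrality parameter has been derived from the population parameters (linkage disequilibrium, allele frequencies, recombination rate, sample size), the power calculation reduces to a tail probability of a noncentral distribution. A minimal sketch for a chi-square-type test statistic, with the noncentrality parameter taken as a given input (an F-based ANOVA test would use a noncentral F in the same way):

```python
from scipy.stats import chi2, ncx2

def association_power(ncp, df=1, alpha=0.05):
    """Power of a chi-square (or asymptotically equivalent) association test,
    given its noncentrality parameter under the alternative hypothesis."""
    crit = chi2.ppf(1.0 - alpha, df)          # critical value under H0
    return 1.0 - ncx2.cdf(crit, df, ncp)      # P(reject H0 | H1)

# Example: a noncentrality parameter of 10 gives roughly 89% power at alpha = 0.05
print(association_power(ncp=10.0))
```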

Relevance: 10.00%

Abstract:

Background: Efficient and effective child product safety (PS) responses require data on hazards, injury severity and injury probability. PS responses in Australia largely rely on reports from manufacturers/retailers, other jurisdictions/regulators, or consumers. The extent to which these reactive responses reflect actual child injury priorities is unknown. Aims/objectives/purpose: This research compared PS issues for children identified from PS regulatory data with those identified from health data sources in Queensland, Australia. Methods: PS regulatory documents describing issues affecting children in Queensland in 2008–2009 were compiled and analysed to identify frequent products and hazards. Three health data sources (ED, injury surveillance and hospital data) were analysed to identify frequent products and hazards. Results/outcomes: Projectile toys/squeeze toys were the priority products for PS regulators, as these toys have the potential to release small parts presenting choking hazards. However, across all health datasets, falls were the most common mechanism of injury, and several of the products identified were not subject to a PS system response. While some incidents may not require a response, a manual review of injury description text identified child poisonings and burns as common mechanisms of injury in the health data with substantial documentation of product involvement, yet only 10% of PS system responses focused on these two mechanisms combined. Significance/contribution to the field: Regulatory data focused on products that fail compliance checks and have the 'potential' to cause harm, whereas health data identified actual harm, resulting in different prioritisation of products/mechanisms. Work is needed to better integrate health data into PS responses in Australia.

Relevance: 10.00%

Abstract:

In March 2008, the Australian Government announced its intention to introduce a national Emissions Trading Scheme (ETS), now expected to start in 2015. This impending development provides an ideal setting to investigate the impact an ETS in Australia will have on the market valuation of Australian Securities Exchange (ASX) firms. This is the first empirical study of the pricing effects of the ETS in Australia. Primarily, we hypothesize that firm value will be negatively related to a firm's carbon intensity profile. That is, there will be a greater impact on firm value for high carbon emitters in the period prior to the introduction of the ETS (2007), whether for reasons relating to the existence of unbooked liabilities associated with future compliance and/or abatement costs, or for reasons relating to reduced future earnings. Using a sample of 58 Australian listed firms (constrained by the current availability of emissions data), which comprises larger, more profitable and less risky listed Australian firms, we first undertake an event study focusing on five distinct information events argued to affect the probability of the proposed ETS being enacted. Here, we find direct evidence that the capital market is indeed pricing the proposed ETS. Second, using a modified version of the Ohlson (1995) valuation model, we undertake a valuation analysis designed not only to complement the event study results, but more importantly to provide insights into the capital market's assessment of the magnitude of the economic impact of the proposed ETS as reflected in market capitalization. Here, our results show that the market assigns the most carbon-intensive sample firms a market value decrement, relative to other sample firms, of between 7% and 10% of market capitalization. Further, based on the carbon emission profiles of the sample firms, we infer an implied 'future carbon permit price' of between AUD$17 and AUD$26 per tonne of carbon dioxide emitted. This estimate is more precise than those in industry reports, which set a carbon price of between AUD$15 and AUD$74 per tonne.
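The exact specification is not given in the abstract; a stylised form of a modified Ohlson (1995) valuation regression with a carbon-intensity term, written only to illustrate the type of model being estimated, is:

```latex
% Stylised modified-Ohlson valuation regression; MV = market value,
% BV = book value, NI = net income, CARBON = carbon-intensity measure.
% Variable names and functional form are illustrative assumptions.
MV_{it} = \alpha_0 + \alpha_1\, BV_{it} + \alpha_2\, NI_{it}
        + \alpha_3\, \mathit{CARBON}_{it} + \varepsilon_{it}
```

In such a specification, a negative and significant coefficient on the carbon term for the most carbon-intensive firms would correspond to the 7–10% market-capitalization decrement reported above.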

Relevance: 10.00%

Abstract:

Context: Anti-Müllerian hormone (AMH) concentration reflects ovarian aging and is argued to be a useful predictor of age at menopause (AMP). It is hypothesized that AMH falling below a critical threshold corresponds to follicle depletion, which results in menopause. With this threshold, theoretical predictions of AMP can be made. Comparisons of such predictions with observed AMP from population studies support the role for AMH as a forecaster of menopause. Objective: The objective of the study was to investigate whether previous relationships between AMH and AMP are valid using a much larger data set. Setting: AMH was measured in 27 563 women attending fertility clinics. Study Design: From these data a model of age-related AMH change was constructed using a robust regression analysis. Data on AMP from subfertile women were obtained from the population-based Prospect-European Prospective Investigation into Cancer and Nutrition (Prospect-EPIC) cohort (n = 2249). By constructing a probability distribution of the age at which AMH falls below a critical threshold and fitting this to Prospect-EPIC menopausal age data using maximum likelihood, such a threshold was estimated. Main Outcome: The main outcome was conformity between observed and predicted AMP. Results: To get a distribution of AMH-predicted AMP that fit the Prospect-EPIC data, we found the critical AMH threshold should vary among women in such a way that women with low age-specific AMH would have lower thresholds, whereas women with high age-specific AMH would have higher thresholds (mean 0.075 ng/mL; interquartile range 0.038–0.15 ng/mL). Such a varying AMH threshold for menopause is a novel and biologically plausible finding. AMH became undetectable (<0.2 ng/mL) approximately 5 years before the occurrence of menopause, in line with a previous report. Conclusions: The conformity of the observed and predicted distributions of AMP supports the hypothesis that declining population averages of AMH are associated with menopause, making AMH an excellent candidate biomarker for AMP prediction. Further research will help establish the accuracy of AMH levels to predict AMP within individuals.
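The fitting procedure is only described in outline above; the sketch below shows the general shape of such a maximum-likelihood fit. It assumes a hypothetical log-linear decline of AMH with age and a Gaussian spread of menopausal ages around the threshold-crossing age; the coefficients and data are placeholders, not values from the study.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Hypothetical population-level AMH-age model: log10(AMH) = A + B * age
A, B = 1.5, -0.045            # placeholder intercept and slope (B < 0)

def neg_loglik(params, observed_amp):
    """Negative log-likelihood of observed ages at menopause, assuming they
    scatter normally around the age at which AMH crosses the threshold."""
    log10_thr, log_sd = params
    crossing_age = (log10_thr - A) / B          # age where the trajectory hits the threshold
    return -np.sum(norm.logpdf(observed_amp, loc=crossing_age, scale=np.exp(log_sd)))

# Placeholder menopause ages standing in for the Prospect-EPIC data
amp = np.random.default_rng(0).normal(50.0, 3.5, size=500)
fit = minimize(neg_loglik, x0=[-1.0, np.log(3.0)], args=(amp,))
print("fitted AMH threshold (ng/mL):", 10 ** fit.x[0])
```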

Relevance: 10.00%

Abstract:

Modelling how a word is activated in human memory is an important requirement for determining the probability of recall of a word in an extra-list cueing experiment. Previous research assumed a quantum-like model in which the semantic network was modelled as entangled qubits; however, the level of activation was clearly being overestimated. This paper explores three variations of this model, each of which is distinguished by a scaling factor designed to compensate for the overestimation.

Relevance: 10.00%

Abstract:

Background: There is a well-developed literature investigating the relationship between various driving behaviours and road crash involvement. However, this research has predominantly been conducted in developed economies dominated by western cultural environments. To date, no research has been published that empirically investigates this relationship within the context of emerging economies such as Oman. Objective: The present study aims to investigate driving behaviour, as indexed by the Driving Behaviour Questionnaire (DBQ), among a group of Omani university students and staff. Methods: A convenience non-probability self-selection sampling approach was utilized with Omani university students and staff. Results: A total of 1003 Omani students (n=632) and staff (n=371) participated in the survey. Factor analysis of the DBQ revealed four main factors: errors, speeding violations, lapses and aggressive violations. In the multivariate backward logistic regression analysis, the following factors were identified as significant predictors of being involved in causing at least one crash: driving experience, history of offences and two DBQ components, namely errors and aggressive violations. Conclusion: This study indicates that errors and aggressive violations of traffic regulations, as well as a history of traffic offences, are major risk factors for road traffic crashes in this sample. While previous international research has demonstrated that speeding is a primary cause of crashes, in the current context the results indicate that an array of factors is associated with crashes. Further research using more rigorous methodology is warranted to inform the development of road safety countermeasures in Oman that improve overall traffic safety culture.
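As an illustration of the kind of analysis reported, the sketch below fits a logistic model for crash involvement and drops non-significant predictors one at a time (backward elimination); the column names and data are hypothetical, not the study's dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def backward_logit(df, outcome, predictors, p_remove=0.05):
    """Backward elimination for a logistic regression: repeatedly drop the
    predictor with the largest p-value above the removal threshold."""
    kept = list(predictors)
    while True:
        model = sm.Logit(df[outcome], sm.add_constant(df[kept])).fit(disp=0)
        pvals = model.pvalues.drop("const")
        worst = pvals.idxmax()
        if pvals[worst] <= p_remove or len(kept) == 1:
            return model
        kept.remove(worst)

# Hypothetical data: DBQ factor scores plus experience and offence history
rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "errors": rng.normal(size=n),
    "aggressive_violations": rng.normal(size=n),
    "lapses": rng.normal(size=n),
    "speeding_violations": rng.normal(size=n),
    "driving_experience": rng.normal(size=n),
    "prior_offences": rng.integers(0, 2, size=n),
})
logit_p = -1.0 + 0.8 * df["errors"] + 0.6 * df["aggressive_violations"]
df["crash"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

final = backward_logit(df, "crash", df.columns.drop("crash"))
print(final.summary())
```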

Relevance: 10.00%

Abstract:

OBJECTIVE There has been a dramatic increase in vitamin D testing in Australia in recent years, prompting calls for targeted testing. We sought to develop a model to identify people most at risk of vitamin D deficiency. DESIGN AND PARTICIPANTS This is a cross-sectional study of 644 60- to 84-year-old participants, 95% of whom were Caucasian, who took part in a pilot randomized controlled trial of vitamin D supplementation. MEASUREMENTS Baseline 25(OH)D was measured using the Diasorin Liaison platform. Vitamin D insufficiency and deficiency were defined using 50 and 25 nmol/l as cut-points, respectively. A questionnaire was used to obtain information on demographic characteristics and lifestyle factors. We used multivariate logistic regression to predict low vitamin D and calculated the net benefit of using the model compared with 'test-all' and 'test-none' strategies. RESULTS The mean serum 25(OH)D was 42 (SD 14) nmol/l. Seventy-five per cent of participants were vitamin D insufficient and 10% were deficient. Serum 25(OH)D was positively correlated with time outdoors, physical activity, vitamin D intake and ambient UVR, and inversely correlated with age, BMI and poor self-reported health status. These predictors explained approximately 21% of the variance in serum 25(OH)D. The area under the ROC curve for predicting vitamin D deficiency was 0.82. Net benefit for the prediction model was higher than that for the 'test-all' strategy at all probability thresholds and higher than that for the 'test-none' strategy for threshold probabilities up to 60%. CONCLUSION Our model could predict vitamin D deficiency with reasonable accuracy, but it needs to be validated in other populations before being implemented.
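The net-benefit comparison follows the usual decision-curve logic: at a chosen probability threshold, the benefit of true positives is weighed against the harm of false positives, and the model is compared with testing everyone ('test-all') and testing no one (net benefit zero). A minimal sketch with hypothetical predictions and outcomes:

```python
import numpy as np

def net_benefit(y_true, p_pred, threshold):
    """Decision-curve net benefit of testing only people whose predicted
    probability of deficiency exceeds the threshold."""
    n = len(y_true)
    treat = p_pred >= threshold
    tp = np.sum(treat & (y_true == 1))
    fp = np.sum(treat & (y_true == 0))
    return tp / n - fp / n * threshold / (1.0 - threshold)

def net_benefit_test_all(y_true, threshold):
    """'Test-all' strategy: everyone is tested regardless of predicted risk."""
    prevalence = np.mean(y_true)
    return prevalence - (1.0 - prevalence) * threshold / (1.0 - threshold)

# Hypothetical predictions for 1000 people with 10% true deficiency
rng = np.random.default_rng(2)
y = rng.binomial(1, 0.10, size=1000)
p = np.clip(0.10 + 0.5 * (y - 0.10) + rng.normal(0, 0.1, size=1000), 0.01, 0.99)
for t in (0.05, 0.10, 0.20):
    print(t, net_benefit(y, p, t), net_benefit_test_all(y, t))
```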

Relevance: 10.00%

Abstract:

X-ray microtomography (micro-CT) with micron resolution enables new ways of characterizing microstructures and opens pathways for forward calculations of multiscale rock properties. A quantitative characterization of the microstructure is the first step in this challenge. We developed a new approach to extract scale-dependent characteristics of porosity, percolation, and anisotropic permeability from 3-D microstructural models of rocks. The Hoshen-Kopelman algorithm of percolation theory is employed for a standard percolation analysis. The anisotropy of permeability is calculated by means of the star volume distribution approach. The local porosity distribution and local percolation probability are obtained using local porosity theory. Additionally, the local anisotropy distribution is defined and analyzed through two empirical probability density functions, the isotropy index and the elongation index. For such high-resolution data sets, the typical data sizes of the CT images are on the order of gigabytes to tens of gigabytes, so an extremely large number of calculations is required. To resolve this large-memory problem, parallelization with OpenMP was used to optimally harness the shared-memory infrastructure of cache-coherent Non-Uniform Memory Access architecture machines such as the iVEC SGI Altix 3700Bx2 supercomputer. We see adequate visualization of the results as an important element of this first, pioneering study.
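The characterization pipeline is described only at a high level above; the sketch below shows the first two steps, porosity and a percolation check on a segmented 3-D image, using a generic connected-component labelling routine (scipy.ndimage.label) in place of a dedicated Hoshen-Kopelman implementation. The synthetic volume is a placeholder for a micro-CT segmentation.

```python
import numpy as np
from scipy import ndimage

def porosity(pore):
    """Fraction of voxels segmented as pore space (pore == True)."""
    return pore.mean()

def percolates(pore, axis=0):
    """True if any single pore cluster spans the volume along the given axis."""
    labels, _ = ndimage.label(pore)              # face-connected labelling by default
    first = np.unique(labels.take(0, axis=axis))
    last = np.unique(labels.take(-1, axis=axis))
    spanning = np.intersect1d(first, last)
    return np.any(spanning > 0)                  # label 0 is the solid phase

# Synthetic 64^3 binary microstructure as a stand-in for a segmented micro-CT image
rng = np.random.default_rng(3)
pore = rng.random((64, 64, 64)) < 0.35
print("porosity:", porosity(pore))
print("percolates along z:", percolates(pore, axis=0))
```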

Relevance: 10.00%

Abstract:

Dose kernels may be used to calculate dose distributions in radiotherapy (as described by Ahnesjo et al., 1999). Their calculation requires the use of Monte Carlo methods, usually by forcing interactions to occur at a point. The Geant4 Monte Carlo toolkit provides a capability to force interactions to occur in a particular volume. We have modified this capability and created a Geant4 application to calculate dose kernels in cartesian, cylindrical, and spherical scoring systems. The simulation considers monoenergetic photons incident at the origin of a 3 m × 3 m × 3 m water volume. Photons interact via Compton scattering, the photoelectric effect, pair production, and Rayleigh scattering. By default, Geant4 models photon interactions by sampling a physical interaction length (PIL) for each process; the process returning the smallest PIL is then considered to occur. In order to force the interaction to occur within a given length, L_FIL, we scale each PIL according to the formula PIL_forced = L_FIL × (1 - exp(-PIL/PIL_0)), where PIL_0 is a constant. This ensures that the process occurs within L_FIL, whilst correctly modelling the relative probability of each process. Dose kernels were produced for incident photon energies of 0.1, 1.0, and 10.0 MeV. In order to benchmark the code, dose kernels were also calculated using the EGSnrc Edknrc user code. Identical scoring systems were used, namely the collapsed cone approach of the Edknrc code. Relative dose difference images were then produced. Preliminary results demonstrate the ability of the Geant4 application to reproduce the shape of the dose kernels; median relative dose differences of 12.6%, 5.75%, and 12.6% were found for incident photon energies of 0.1, 1.0, and 10.0 MeV, respectively.
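The forcing transformation quoted above is simple enough to state in code; the sketch below applies it to sampled physical interaction lengths and is an illustration of the scaling formula only, not an extract from the Geant4 application (the values are arbitrary).

```python
import numpy as np

def forced_pil(pil, l_fil, pil0):
    """Scale a sampled physical interaction length so that the interaction is
    guaranteed to occur within l_fil: PIL_forced = L_FIL * (1 - exp(-PIL/PIL_0)).
    The mapping is monotonic in PIL, so the process with the smallest sampled
    PIL still produces the smallest forced PIL."""
    return l_fil * (1.0 - np.exp(-np.asarray(pil) / pil0))

# Example: three competing processes with sampled PILs of 5, 20 and 80 cm,
# forced to interact within a 10 cm region (PIL_0 = 30 cm is an arbitrary constant)
print(forced_pil([5.0, 20.0, 80.0], l_fil=10.0, pil0=30.0))
```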

Relevance: 10.00%

Abstract:

Completing a PhD on time is a complex process, influenced by many interacting factors. In this paper we take a Bayesian Network approach to analyzing the factors perceived to be important in achieving this aim. Focusing on a single research group in Mathematical Sciences, we develop a conceptual model to describe the factors considered to be important to students and then quantify the network based on five individual perspectives: the students, a supervisor and a university research students centre manager. The resultant network comprised 37 factors and 40 connections, with an overall probability of timely completion of between 0.6 and 0.8. Across all participants, the four factors that were considered to most directly influence timely completion were personal aspects, the research environment, the research project, and incoming skills.
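The full 37-factor network cannot be reproduced from the abstract; the sketch below shows, with two hypothetical parent factors and made-up probabilities, how a completion probability is obtained from a Bayesian network by marginalising over the parents.

```python
from itertools import product

# Hypothetical parent factors of "timely completion" and their marginal
# probabilities of being in a favourable state (placeholders, not elicited values)
p_parent = {"research_environment": 0.7, "incoming_skills": 0.6}

# Conditional probability of timely completion given each parent configuration
p_completion = {
    (True, True): 0.85,
    (True, False): 0.65,
    (False, True): 0.60,
    (False, False): 0.35,
}

def p_timely_completion():
    """Marginalise the completion node over all parent configurations."""
    total = 0.0
    for env_ok, skills_ok in product([True, False], repeat=2):
        p_env = p_parent["research_environment"] if env_ok else 1 - p_parent["research_environment"]
        p_skl = p_parent["incoming_skills"] if skills_ok else 1 - p_parent["incoming_skills"]
        total += p_completion[(env_ok, skills_ok)] * p_env * p_skl
    return total

print(p_timely_completion())   # with these placeholder numbers, falls in the 0.6-0.8 band reported above
```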

Relevance: 10.00%

Abstract:

In multi-area ATC (Available Transfer Capacity) decision-making in an electricity market environment, the existing transmission network resources should be optimally dispatched and employed in a coordinated way, on the premise that secure system operation is maintained and the associated risk is controllable. Non-sequential Monte Carlo simulation is used to determine the ATC probability density distribution of specified areas under the influence of several uncertainty factors. Based on this, a coordinated probabilistic optimal decision-making model for multi-area ATC is developed, with maximal risk benefit as its objective. NSGA-II is applied to calculate the ATC of each area, taking into account the risk cost caused by the relevant uncertainty factors and the synchronous coordination among areas. The essential characteristics of the developed model and the employed algorithm are illustrated using the IEEE 118-bus test system. Simulation results show that the risk of multi-area ATC decision-making is influenced by the uncertainties in power system operation and by the relative importance of the different areas.
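As a stripped-down illustration of the non-sequential Monte Carlo step, the sketch below samples independent component states and base flows, evaluates a toy transfer-capacity function, and builds the empirical ATC distribution from which risk quantities could then be derived; all numbers and the capacity function are placeholders, not the model in the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical tie-line capacities (MW) and forced-outage rates between two areas
line_capacity = np.array([400.0, 300.0, 250.0])
outage_rate = np.array([0.02, 0.03, 0.05])

def sample_atc(n_samples=100_000):
    """Non-sequential Monte Carlo: each sample draws component states and a
    base flow independently (no chronology), then evaluates a toy ATC."""
    # Each line is available with probability (1 - forced-outage rate)
    available = rng.random((n_samples, line_capacity.size)) > outage_rate
    transfer_limit = available.astype(float) @ line_capacity   # in-service capacity
    existing_commitments = rng.normal(350.0, 40.0, n_samples)  # placeholder base flow
    return np.maximum(transfer_limit - existing_commitments, 0.0)

atc = sample_atc()
print("mean ATC (MW):", atc.mean())
print("5th percentile (a simple risk indicator):", np.percentile(atc, 5))
```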

Relevance: 10.00%

Abstract:

Language has been of interest to numerous economists since the late 20th century, with the majority of studies focusing on its effects on immigrants' labour market outcomes, earnings in particular. However, language is an endogenous variable, which, along with its susceptibility to measurement error, causes bias in ordinary least squares estimates. The instrumental variables method overcomes the shortcomings of ordinary least squares in modelling endogenous explanatory variables. In this dissertation, age at arrival combined with country of origin forms an instrument creating a difference-in-differences scenario, which addresses endogeneity and attenuation error in language proficiency. The first half of the study investigates the extent to which the English speaking ability of immigrants improves their labour market outcomes and social assimilation in Australia, using the 2006 Census. The findings provide evidence that supports earlier studies. As expected, immigrants in Australia with better language proficiency earn higher incomes, attain higher levels of education, have a higher probability of completing tertiary studies, and work more hours per week. Language proficiency also improves social integration, leading to a higher probability of marriage to a native and of obtaining citizenship. The second half of the study investigates whether language proficiency has similar effects on a migrant's physical and mental wellbeing, health care access and lifestyle choices, using three National Health Surveys. However, only limited evidence has been found for the hypothesised causal relationship between language and health for Australian immigrants.
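The instrumental-variables step can be illustrated with a hand-rolled two-stage least squares: the first stage regresses language proficiency on the instrument (an age-at-arrival by country-of-origin interaction) and the exogenous controls, and the second stage uses the fitted values in the earnings equation. Variable names and data below are hypothetical, not the Census or health-survey variables.

```python
import numpy as np

def ols_fit(X, y):
    """OLS coefficients via least squares."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

def two_stage_least_squares(y, endogenous, instrument, controls):
    """Manual 2SLS: replace the endogenous regressor with its first-stage
    fitted values before estimating the outcome equation."""
    n = len(y)
    ones = np.ones((n, 1))
    first_stage_X = np.column_stack([ones, instrument, controls])
    fitted = first_stage_X @ ols_fit(first_stage_X, endogenous)
    second_stage_X = np.column_stack([ones, fitted, controls])
    return ols_fit(second_stage_X, y)   # [const, language effect, controls...]

# Hypothetical data: English proficiency instrumented by an age-at-arrival
# x non-English-speaking-origin interaction (a difference-in-differences style IV)
rng = np.random.default_rng(5)
n = 2000
young_arrival = rng.binomial(1, 0.5, n)
nesb_origin = rng.binomial(1, 0.6, n)             # non-English-speaking background
instrument = young_arrival * nesb_origin
ability = rng.normal(size=n)                      # unobserved confounder
english = 0.8 * instrument + 0.5 * ability + rng.normal(size=n)
log_income = 0.3 * english + 0.6 * ability + rng.normal(size=n)
controls = np.column_stack([young_arrival, nesb_origin])

print(two_stage_least_squares(log_income, english, instrument, controls))
```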

Relevance: 10.00%

Abstract:

In cooperative communication systems, several wireless communication terminals collaborate to form a virtual multiple-antenna array and exploit spatial diversity to achieve better performance. This thesis proposes a practical slotted protocol for cooperative communication systems in which each terminal has a single half-duplex antenna. The performance of the proposed slotted cooperative communication protocol is evaluated in terms of the pairwise error probability and the bit error rate. With a novel relay ordering and scheduling strategy, the proposed protocol achieves the multiple-input single-output performance bound.