150 results for Probability of choice


Relevance:

90.00%

Publisher:

Abstract:

Messenger RNAs (mRNAs) can be repressed and degraded by small non-coding RNA molecules. In this paper, we formulate a coarse-grained Markov-chain description of the post-transcriptional regulation of mRNAs by either small interfering RNAs (siRNAs) or microRNAs (miRNAs). We calculate the probability of an mRNA escaping from its domain before it is repressed by siRNAs/miRNAs via calculation of the mean time to threshold: when the number of bound siRNAs/miRNAs exceeds a certain threshold value, the mRNA is irreversibly repressed. In some cases, the analysis can be reduced to counting certain paths in a reduced Markov model. We obtain explicit expressions when the small RNAs bind irreversibly to the mRNA, and we also discuss the reversible binding case. We apply our models to the study of RNA interference in the nucleus, examining the probability of mRNAs escaping via small nuclear pores before being degraded by siRNAs. Using the same modelling framework, we further investigate the effect of small, decoy RNAs (decoys) on the process of post-transcriptional regulation by studying regulation of the tumor suppressor gene PTEN: decoys are able to block binding sites on PTEN mRNAs, thereby reducing the number of sites available to siRNAs/miRNAs and helping to protect them from repression. We calculate the probability of a cytoplasmic PTEN mRNA translocating to the endoplasmic reticulum before being repressed by miRNAs. We support our results with stochastic simulations.
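In the irreversible-binding case, the escape-before-threshold probability reduces to a race at each state of the chain. A minimal sketch of that reduction (the rates and threshold below are illustrative assumptions, not the paper's values):

```python
# Sketch (assumed rates, not the paper's): an mRNA escapes its domain at rate
# k_esc and irreversibly binds one more siRNA at rate k_bind; it is repressed
# once n_threshold siRNAs are bound. With irreversible binding, each state is
# a race between escape and binding, so
#   P(escape) = 1 - (k_bind / (k_bind + k_esc)) ** n_threshold.
def escape_probability(k_esc: float, k_bind: float, n_threshold: int) -> float:
    """Probability the mRNA escapes before n_threshold irreversible bindings."""
    p_bind_first = k_bind / (k_bind + k_esc)  # probability next event is a binding
    return 1.0 - p_bind_first ** n_threshold

# Example: equal rates and a threshold of 3 bound siRNAs
print(round(escape_probability(1.0, 1.0, 3), 4))  # 1 - 0.5**3 = 0.875
```

This is the "counting paths" simplification mentioned above: the only path to repression is n_threshold consecutive binding events, so its probability factorises.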

Relevance:

90.00%

Publisher:

Abstract:

Derailments are a significant cost to the Australian sugar industry, with damage to rail infrastructure and rolling stock in excess of $2M per annum. Many factors can contribute to cane rail derailments; the more prevalent factors are discussed. Derailment statistics on likely causes for cane rail derailments are presented, with the case of empty wagons on the main line being the highest contributor to business cost. Historically, the lateral-to-vertical wheel load ratio, termed the derailment ratio, has been used to indicate the derailment probability of rolling stock. When the derailment ratio reaches the Nadal limit of 0.81 for cane rail operations, there is a high probability that a derailment will occur. Contributing factors for derailments include the operating forces, the geometric variables of the rolling stock and the geometric deviations of the railway track. Combined, these have the capacity to affect the risk of derailment for a cane rail transport operating system. The derailment type responsible for creating the most damage to assets and causing mill stops is the flange climb derailment, as these derailments usually occur at speed with a full rake of empty wagons. The typical forces that contribute to the flange climb derailment case for cane rail operations are analysed, and a practical derailment model is developed to enable operators to better appreciate the most significant contributing factors to this type of derailment. The paper aims to: (a) improve awareness of the significance of physical operating parameters so that these principles can be included in locomotive driver training; and (b) improve awareness of track and wagon variables related to the risk of derailment so that maintainers of the rail system can allocate funds for maintenance more effectively.
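The derailment-ratio check described above can be sketched with the classical Nadal criterion. The flange angle and friction coefficient below are illustrative assumptions, not the cane-rail values behind the quoted 0.81 limit:

```python
import math

# Sketch of the Nadal flange-climb criterion (illustrative parameters):
# the limiting lateral-to-vertical wheel load ratio is
#   (L/V)_limit = (tan(delta) - mu) / (1 + mu * tan(delta))
# where delta is the flange contact angle and mu the wheel-rail friction
# coefficient. A measured L/V at or above the limit indicates a high
# probability of flange climb.
def nadal_limit(flange_angle_deg: float, mu: float) -> float:
    t = math.tan(math.radians(flange_angle_deg))
    return (t - mu) / (1.0 + mu * t)

def flange_climb_risk(lateral: float, vertical: float, limit: float = 0.81) -> bool:
    """True when the derailment ratio L/V reaches the operating limit."""
    return lateral / vertical >= limit

print(round(nadal_limit(60.0, 0.35), 3))          # limit for these assumed inputs
print(flange_climb_risk(lateral=34.0, vertical=40.0))  # L/V = 0.85 >= 0.81: True
```

In practice the measured wheel loads would come from instrumented wheelsets or be inferred from the operating-force model the paper develops.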

Relevance:

90.00%

Publisher:

Abstract:

Lentiviral vectors pseudotyped with vesicular stomatitis virus glycoprotein (VSV-G) are emerging as the vectors of choice for in vitro and in vivo gene therapy studies. However, the current method for harvesting lentivectors relies upon ultracentrifugation at 50 000 g for 2 h. At this ultra-high speed, rotors currently in use generally have small volume capacity. Therefore, preparations of large volumes of high-titre vectors are time-consuming and laborious to perform. In the present study, viral vector supernatant harvests from vector-producing cells (VPCs) were pre-treated with various amounts of poly-L-lysine (PLL) and concentrated by low-speed centrifugation. Optimal conditions were established when 0.005% of PLL (w/v) was added to vector supernatant harvests, followed by incubation for 30 min and centrifugation at 10 000 g for 2 h at 4°C. Direct comparison with ultracentrifugation demonstrated that the new method consistently produced larger volumes (6 ml) of high-titre viral vector at 1 × 10^8 transduction units (TU)/ml (from about 3000 ml of supernatant) in one round of concentration. Electron microscopic analysis showed that PLL/viral vector formed complexes, which probably facilitated easy precipitation at low-speed concentration (10 000 g), a speed which does not usually precipitate viral particles efficiently. Transfection of several cell lines in vitro and transduction in vivo in the liver with the lentivector/PLL complexes demonstrated efficient gene transfer without any significant signs of toxicity. These results suggest that the new method provides a convenient means for harvesting large volumes of high-titre lentivectors, facilitating gene therapy experiments in large animals or human gene therapy trials, in which large amounts of lentiviral vectors are a prerequisite.

Relevance:

90.00%

Publisher:

Abstract:

We derive a new method for determining size-transition matrices (STMs) that eliminates probabilities of negative growth and accounts for individual variability. STMs are an important part of size-structured models, which are used in the stock assessment of aquatic species. The elements of STMs represent the probability of growth from one size class to another, given a time step. The growth increment over this time step can be modelled with a variety of methods, but when a population construct is assumed for the underlying growth model, the resulting STM may contain entries that predict negative growth. To solve this problem, we use a maximum likelihood method that incorporates individual variability in the asymptotic length, relative age at tagging, and measurement error to obtain von Bertalanffy growth model parameter estimates. The statistical moments for the future length given an individual's previous length measurement and time at liberty are then derived. We moment-match the true conditional distributions with skew-normal distributions and use these to accurately estimate the elements of the STMs. The method is investigated with simulated tag-recapture data and with tag-recapture data gathered from the Australian eastern king prawn (Melicertus plebejus).
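The construction of an STM row can be sketched as follows. This is a simplification under stated assumptions: a plain normal distribution stands in for the paper's skew-normal, and the von Bertalanffy parameters and size bins are invented for illustration:

```python
import math

# Sketch (normal in place of the paper's skew-normal; assumed parameters):
# each STM entry is the probability that an animal in size class i grows into
# class j over one time step. Future length is centred on the von Bertalanffy
# prediction, and negative growth is excluded by truncating the distribution
# at the current length.
def norm_cdf(x, mu, sd):
    return 0.5 * (1.0 + math.erf((x - mu) / (sd * math.sqrt(2.0))))

def stm_row(l_now, bins, l_inf=180.0, k=0.6, dt=0.25, sd=4.0):
    """Transition probabilities from the current length into each size class."""
    # von Bertalanffy expected length after time dt
    mu = l_now + (l_inf - l_now) * (1.0 - math.exp(-k * dt))
    z = 1.0 - norm_cdf(l_now, mu, sd)  # probability mass on non-negative growth
    probs = []
    for lo, hi in zip(bins[:-1], bins[1:]):
        lo_t = max(lo, l_now)  # no transitions below the current length
        p = max(norm_cdf(hi, mu, sd) - norm_cdf(lo_t, mu, sd), 0.0) / z
        probs.append(p if hi > l_now else 0.0)
    return probs

bins = [100, 110, 120, 130, 140, 150]  # hypothetical size-class edges (mm)
row = stm_row(l_now=115.0, bins=bins)
print([round(p, 3) for p in row])
```

By construction every entry below the current size class is zero, which is exactly the "no negative growth" property the method guarantees.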

Relevance:

90.00%

Publisher:

Abstract:

Speed is recognised as a key contributor to crash likelihood and severity, and to road safety performance in general. Its fundamental role has been recognised by making Safe Speeds one of the four pillars of the Safe System. In this context, impact speeds above which humans are likely to sustain fatal injuries have been accepted as a reference in many Safe System infrastructure policy and planning discussions. To date, however, no relationships have been proposed for impact speeds above which humans are likely to sustain fatal or serious (severe) injury, a more relevant Safe System measure. A research project on Safe System intersection design required a critical review of published literature on the relationship between impact speed and probability of injury. This raised a number of questions about the origins, accuracy and appropriateness of the currently accepted impact speed-fatality probability relationships (Wramborg 2005) in many policy documents. The literature review identified alternative, more recent and more precise relationships derived from the US crash reconstruction databases (NASS/CDS). The paper proposes for discussion a set of alternative relationships between vehicle impact speed and probability of MAIS3+ (fatal and serious) injury for selected common crash types. Critical Safe System impact speed values are also proposed for use in road infrastructure assessment. The paper presents the methodology and assumptions used in developing these relationships, and identifies further research needed to confirm and refine them. Such relationships would form valuable inputs into future road safety policies in Australia and New Zealand.
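Relationships of this kind are typically logistic in impact speed. A minimal sketch, with purely illustrative coefficients (not the paper's NASS/CDS-fitted values), shows how a critical impact speed falls out of such a curve:

```python
import math

# Purely illustrative logistic form (coefficients are assumptions, not the
# paper's fitted values): probability of MAIS3+ injury rises with impact
# speed v (km/h) as P(v) = 1 / (1 + exp(-(a + b*v))).
def p_mais3plus(v_kmh: float, a: float = -6.0, b: float = 0.1) -> float:
    return 1.0 / (1.0 + math.exp(-(a + b * v_kmh)))

# A "critical impact speed" is the speed where P crosses a chosen risk
# threshold, found here by simple bisection.
def critical_speed(threshold: float = 0.1, lo: float = 0.0, hi: float = 200.0) -> float:
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if p_mais3plus(mid) < threshold:
            lo = mid
        else:
            hi = mid
    return hi

print(round(p_mais3plus(50.0), 3))   # injury probability at 50 km/h
print(round(critical_speed(0.1), 1))  # speed at which risk reaches 10%
```

Different crash types (e.g. side impact versus head-on) would be represented by different coefficient pairs, giving a family of curves and crash-type-specific critical speeds.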

Relevance:

90.00%

Publisher:

Abstract:

We investigate whether Nobel laureates' collaborative activities undergo a negative change following prize reception, using the publication records of 198 Nobel laureates and analyzing their coauthorship patterns before and after the Nobel Prize. The results overall indicate less collaboration with new coauthors post-award than pre-award. Nobel laureates are more loyal to collaborations that started before the Prize: looking at coauthorship drop-out rates, we find that these differ significantly between coauthorships that started before the Prize and those that started after it. We also find that the greater the intensity of pre-award cooperation and the longer the period of pre-award collaboration, the higher the probability of staying in the coauthor network after the award, implying a higher loyalty to the Nobel laureate.

Relevance:

90.00%

Publisher:

Abstract:

It is well known that new particle formation (NPF) in the atmosphere is inhibited by pre-existing particles in the air, which act as condensation sinks to decrease the concentration and, thus, the supersaturation of precursor gases. In this study, we investigate the effects of two parameters, atmospheric visibility (expressed as the particle back-scatter coefficient, BSP) and PM10 particulate mass concentration, on the occurrence of NPF events in an urban environment where the majority of precursor gases originate from motor vehicle and industrial sources. This is the first attempt to derive direct relationships between each of these two parameters and the occurrence of NPF. NPF events were identified from data obtained with a neutral cluster and air ion spectrometer over 245 days within a calendar year. Bayesian logistic regression was used to determine the probability of observing NPF as a function of BSP and PM10. We show that the BSP at 08:00 on a given day is a reliable indicator of an NPF event later that day. The posterior median probability of observing an NPF event was greater than 0.5 (95%) when the BSP at 08:00 was less than 6.8 Mm^-1.
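The posterior-median probability quoted above comes from pushing MCMC draws of the logistic-regression coefficients through the logistic transform. A sketch with simulated posterior draws (the intercept and slope distributions are assumptions, not the study's fitted posterior):

```python
import math
import random
import statistics

# Sketch of the probability-of-NPF calculation (simulated posterior draws,
# not the study's fitted posterior): given MCMC samples of a logistic
# regression's intercept and slope, the posterior of P(NPF | BSP) is the
# logistic transform applied draw by draw.
random.seed(1)
draws = [(random.gauss(2.0, 0.3), random.gauss(-0.35, 0.05)) for _ in range(4000)]

def posterior_median_p(bsp_mm1: float) -> float:
    """Posterior median of P(NPF) at a given morning back-scatter coefficient."""
    ps = [1.0 / (1.0 + math.exp(-(a + b * bsp_mm1))) for a, b in draws]
    return statistics.median(ps)

# Clean mornings (low back-scatter) should give a high NPF probability.
print(round(posterior_median_p(3.0), 2))
print(posterior_median_p(3.0) > posterior_median_p(12.0))  # True: P falls with BSP
```

The same draws also give credible intervals for the probability at any BSP value, simply by taking quantiles of `ps` instead of the median.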

Relevance:

90.00%

Publisher:

Abstract:

Purpose: The purpose of this paper is to discuss the relation between dissatisfaction with housing conditions and considering moving among residents of Finnish rental multifamily buildings. The paper examines physical attributes, socioeconomic factors, and subjective opinions related to housing conditions and satisfaction with housing.

Design/methodology/approach: Logistic regression analysis is used to examine survey data to analyse which factors contribute to dissatisfaction with the housing unit and the apartment building, and whether dissatisfaction is related to consideration of moving.

Findings: The findings indicate that dissatisfaction with the building and individual housing unit are associated with a greater probability of considering moving. Satisfaction with the kitchen, living room, storage, and building age are the most important indicators of satisfaction with the housing unit, and satisfaction with the living room, bathroom, storage, and building age are associated with satisfaction with the apartment building. These are the areas in which landlords could invest in renovations to increase satisfaction in an attempt to reduce turnover.

Research limitations/implications: The study is conducted with Finnish data only. The sample is not a representative sample of the Finnish population. A longitudinal study would be needed to determine whether dissatisfied residents intending to move actually change residence.

Originality/value: This study is the first of its kind in the Finnish housing market. It tests a general model that has been suggested to be customized to local conditions. In addition, much of the research on this topic is more than 20 years old. Examination of the model under current housing and socioeconomic conditions is necessary to determine whether relationships have changed over time.

Relevance:

90.00%

Publisher:

Abstract:

The male-to-female sex ratio at birth is constant across world populations with an average of 1.06 (106 male to 100 female live births) for populations of European descent. The sex ratio is considered to be affected by numerous biological and environmental factors and to have a heritable component. The aim of this study was to investigate the presence of modest effects of common alleles at autosomal and chromosome X variants that could explain the observed sex ratio at birth. We conducted a large-scale genome-wide association scan (GWAS) meta-analysis across 51 studies, comprising overall 114 863 individuals (61 094 women and 53 769 men) of European ancestry and 2 623 828 common (minor allele frequency >0.05) single-nucleotide polymorphisms (SNPs). Allele frequencies were compared between men and women for directly typed and imputed variants within each study. Forward-time simulations for unlinked, neutral, autosomal, common loci were performed under the demographic model for European populations with a fixed sex ratio and a random mating scheme to assess the probability of detecting significant allele frequency differences. We do not detect any genome-wide significant (P < 5 × 10^-8) common SNP differences between men and women in this well-powered meta-analysis. The simulated data provided results entirely consistent with these findings. This large-scale investigation across ~115 000 individuals shows no detectable contribution from common genetic variants to the observed skew in the sex ratio. The absence of sex-specific differences is useful in guiding genetic association study design, for example when using mixed controls for sex-biased traits.
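The per-SNP comparison underlying such a scan can be sketched as a two-proportion test of allele frequency between men and women, judged against the genome-wide threshold. The counts below are invented for illustration, not the meta-analysis data:

```python
import math

# Sketch of the per-SNP test (illustrative counts, not the meta-analysis
# data): compare the minor-allele frequency between men and women with a
# two-proportion z-test and judge it against the genome-wide significance
# threshold P < 5e-8.
def norm_sf(z):
    """Upper tail of the standard normal, via the complementary error function."""
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def sex_diff_pvalue(k_m, n_m, k_w, n_w):
    """Two-sided p-value for a male-female allele-frequency difference.

    k_* = minor-allele counts, n_* = total allele counts (2 x individuals)."""
    p_m, p_w = k_m / n_m, k_w / n_w
    p = (k_m + k_w) / (n_m + n_w)  # pooled frequency under the null
    se = math.sqrt(p * (1 - p) * (1 / n_m + 1 / n_w))
    z = abs(p_m - p_w) / se
    return 2.0 * norm_sf(z)

GENOME_WIDE = 5e-8
pval = sex_diff_pvalue(k_m=32000, n_m=107538, k_w=36500, n_w=122188)
print(pval < GENOME_WIDE)  # False: no genome-wide significant difference
```

In the actual meta-analysis, per-study statistics would be combined across the 51 cohorts before applying the threshold, but the frequency comparison itself has this form.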

Relevance:

90.00%

Publisher:

Abstract:

For zygosity diagnosis in the absence of genotypic data, in the recruitment phase of a twin study where only single twins from same-sex pairs are being screened, or as a test for sample duplication leading to the false identification of a dizygotic pair as monozygotic, the appropriate analysis of respondents' answers to questions about zygosity is critical. Using data from a young adult Australian twin cohort (N = 2094 complete pairs and 519 singleton twins from same-sex pairs with complete responses to all zygosity items), we show that application of latent class analysis (LCA), fitting a 2-class model, yields results that show good concordance with traditional methods of zygosity diagnosis, but with certain important advantages. These include the ability, in many cases, to assign zygosity with specified probability on the basis of the responses of a single informant (advantageous when one zygosity type is being oversampled), and the ability to quantify the probability of misassignment of zygosity, allowing prioritization of cases for genotyping as well as identification of cases of probable laboratory error. Out of 242 twins (from 121 same-sex pairs) for whom genotypic data were available for zygosity confirmation, only a single case of incorrect zygosity assignment by the latent class algorithm was identified. Zygosity assignment for that single case was identified by the LCA as uncertain (probability of being a monozygotic twin only 76%), and the co-twin's responses clearly identified the pair as dizygotic (probability of being dizygotic 100%). In the absence of genotypic data, or as a safeguard against sample duplication, application of LCA for zygosity assignment or confirmation is strongly recommended.

Relevance:

90.00%

Publisher:

Abstract:

Objectives: To determine the cost-effectiveness of the MobileMums intervention. MobileMums is a 12-week programme which assists mothers with young children to be more physically active, primarily through the use of personalised SMS text-messages. Design: A cost-effectiveness analysis using a Markov model to estimate and compare the costs and consequences of MobileMums and usual care. Setting: This study considers the cost-effectiveness of MobileMums in Queensland, Australia. Participants: A hypothetical cohort of over 36 000 women with a child under 1 year old is considered. These women are expected to be eligible and willing to participate in the intervention in Queensland, Australia. Data sources: The model was informed by the effectiveness results from a 9-month two-arm community-based randomised controlled trial undertaken in 2011 and registered retrospectively with the Australian Clinical Trials Registry (ACTRN12611000481976). Baseline characteristics for the model cohort, treatment effects and resource utilisation were all informed by this trial. Main outcome measures: The incremental cost per quality-adjusted life year (QALY) of MobileMums compared with usual care. Results: The intervention is estimated to lead to an increase of 131 QALYs for an additional cost to the health system of 1.1 million Australian dollars (AUD). The expected incremental cost-effectiveness ratio for MobileMums is 8608 AUD per QALY gained. MobileMums has a 98% probability of being cost-effective at a cost-effectiveness threshold of 64 000 AUD. Varying modelling assumptions has little effect on this result. Conclusions: At a cost-effectiveness threshold of 64 000 AUD, MobileMums would likely be a cost-effective use of healthcare resources in Queensland, Australia. Trial registration number: Australian Clinical Trials Registry; ACTRN12611000481976.
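The headline result is an incremental cost-effectiveness ratio. A minimal sketch of that calculation, using the rounded figures quoted in the abstract (the published ICER of 8608 AUD/QALY comes from unrounded model outputs, so the rounded inputs give a slightly different number):

```python
# Minimal sketch of the incremental cost-effectiveness calculation using the
# rounded figures quoted in the abstract (1.1 million AUD for 131 QALYs).
def icer(delta_cost: float, delta_qaly: float) -> float:
    """Incremental cost per QALY gained versus usual care."""
    return delta_cost / delta_qaly

cost_per_qaly = icer(delta_cost=1.1e6, delta_qaly=131)
print(round(cost_per_qaly))   # ~8397 from the rounded inputs
print(cost_per_qaly < 64000)  # True: cost-effective at the 64 000 AUD threshold
```

The 98% probability of cost-effectiveness reported above comes from repeating this ratio over the Markov model's probabilistic sensitivity-analysis draws, not from the point estimate alone.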

Relevance:

90.00%

Publisher:

Abstract:

Much of our understanding and management of ecological processes requires knowledge of the distribution and abundance of species. Reliable abundance or density estimates are essential for managing both threatened and invasive populations, yet are often challenging to obtain. Recent and emerging technological advances, particularly in unmanned aerial vehicles (UAVs), provide exciting opportunities to overcome these challenges in ecological surveillance. UAVs can provide automated, cost-effective surveillance and offer repeat surveys for pest incursions at an invasion front. They can capitalise on manoeuvrability and advanced imagery options to detect species that are cryptic due to behaviour, life-history or inaccessible habitat. UAVs may also cause less disturbance, in magnitude and duration, for sensitive fauna than other survey methods such as transect counting by humans or sniffer dogs. The surveillance approach depends upon the particular ecological context and the objective. For example, animal, plant and microbial target species differ in their movement, spread and observability. Lag-times may exist between a pest species presence at a site and its detectability, prompting a need for repeat surveys. Operationally, however, the frequency and coverage of UAV surveys may be limited by financial and other constraints, leading to errors in estimating species occurrence or density. We use simulation modelling to investigate how movement ecology should influence fine-scale decisions regarding ecological surveillance using UAVs. Movement and dispersal parameter choices allow contrasts between locally mobile but slow-dispersing populations, and species that are locally more static but invasive at the landscape scale. We find that low and slow UAV flights may offer the best monitoring strategy to predict local population densities in transects, but that the consequent reduction in overall area sampled may sacrifice the ability to reliably predict regional population density. Alternative flight plans may perform better, but this is also dependent on movement ecology and the magnitude of relative detection errors for different flight choices. Simulated investigations such as this will become increasingly useful to reveal how spatio-temporal extent and resolution of UAV monitoring should be adjusted to reduce observation errors and thus provide better population estimates, maximising the efficacy and efficiency of unmanned aerial surveys.

Relevance:

90.00%

Publisher:

Abstract:

A central tenet in the theory of reliability modelling is the quantification of the probability of asset failure. In general, reliability depends on asset age and the maintenance policy applied. Usually, failure and maintenance times are the primary inputs to reliability models. However, for many organisations, different aspects of these data are often recorded in different databases (e.g. work order notifications, event logs, condition monitoring data, and process control data). These recorded data cannot be interpreted individually, since they typically do not have all the information necessary to ascertain failure and preventive maintenance times. This paper presents a methodology for the extraction of failure and preventive maintenance times using commonly-available, real-world data sources. A text-mining approach is employed to extract keywords indicative of the source of the maintenance event. Using these keywords, a Naïve Bayes classifier is then applied to attribute each machine stoppage to one of two classes: failure or preventive. The accuracy of the algorithm is assessed and the classified failure time data are then presented. The applicability of the methodology is demonstrated on a maintenance data set from an Australian electricity company.
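The classification step can be sketched with a small hand-rolled Naive Bayes. The work-order snippets below are invented for illustration, not the utility's data:

```python
import math
from collections import Counter

# Toy sketch of the classification step (invented work-order snippets, not
# the utility's data): a multinomial Naive Bayes with Laplace smoothing
# assigns each stoppage description to "failure" or "preventive".
train = [
    ("bearing seized unexpected trip", "failure"),
    ("motor burnt out breakdown", "failure"),
    ("scheduled inspection and lubrication", "preventive"),
    ("planned overhaul routine service", "preventive"),
]

word_counts = {"failure": Counter(), "preventive": Counter()}
class_counts = Counter()
for text, label in train:
    class_counts[label] += 1
    word_counts[label].update(text.split())
vocab = {w for c in word_counts.values() for w in c}

def classify(text: str) -> str:
    scores = {}
    for label in word_counts:
        # log prior for the class
        score = math.log(class_counts[label] / sum(class_counts.values()))
        total = sum(word_counts[label].values())
        for w in text.split():
            # Laplace smoothing keeps unseen words from zeroing the score
            score += math.log((word_counts[label][w] + 1) / (total + len(vocab)))
        scores[label] = score
    return max(scores, key=scores.get)

print(classify("unexpected bearing breakdown"))   # failure
print(classify("routine scheduled lubrication"))  # preventive
```

In the methodology above, the features would be the mined keywords rather than raw tokens, and the classified stoppage times then feed the reliability model as failure or preventive-maintenance events.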

Relevance:

90.00%

Publisher:

Abstract:

Predicting temporal responses of ecosystems to disturbances associated with industrial activities is critical for their management and conservation. However, prediction of ecosystem responses is challenging due to the complexity and potential non-linearities stemming from interactions between system components and multiple environmental drivers. Prediction is particularly difficult for marine ecosystems due to their often highly variable and complex natures and the large uncertainties surrounding their dynamic responses. Consequently, current management of such systems often relies on expert judgement and/or complex quantitative models that consider only a subset of the relevant ecological processes. Hence there exists an urgent need for the development of whole-of-system predictive models to support decision and policy makers in managing complex marine systems in the context of industry-based disturbances. This paper presents Dynamic Bayesian Networks (DBNs) for predicting the temporal response of a marine ecosystem to anthropogenic disturbances. The DBN provides a visual representation of the problem domain in terms of factors (parts of the ecosystem) and their relationships. These relationships are quantified via Conditional Probability Tables (CPTs), which estimate the variability and uncertainty in the distribution of each factor. The combination of qualitative visual and quantitative elements in a DBN facilitates the integration of a wide array of data, published and expert knowledge, and other models. Such multiple sources are often essential, as a single source of information is rarely sufficient to cover the diverse range of factors relevant to a management task. Here, a DBN model is developed for tropical, annual Halophila and temperate, persistent Amphibolis seagrass meadows to inform dredging management and help meet environmental guidelines. Specifically, the impacts of capital (e.g. new port development) and maintenance (e.g. maintaining channel depths in established ports) dredging are evaluated with respect to the risk of permanent loss, defined as no recovery within 5 years (Environmental Protection Agency guidelines). The model is developed using expert knowledge, existing literature, statistical models of environmental light, and experimental data. The model is then demonstrated in a case study through the analysis of a variety of dredging, environmental and seagrass ecosystem recovery scenarios. In spatial zones significantly affected by dredging, such as the zone of moderate impact, shoot density has a very high probability of being driven to zero by capital dredging due to the duration of such dredging. Here, fast-growing Halophila species can recover; however, the probability of recovery depends on the presence of seed banks. On the other hand, slow-growing Amphibolis meadows have a high probability of suffering permanent loss. However, in the maintenance dredging scenario, due to the shorter duration of dredging, Amphibolis is better able to resist the impacts of dredging. For both types of seagrass meadows, the probability of loss was strongly dependent on the biological and ecological status of the meadow, as well as environmental conditions post-dredging. The ability to predict the ecosystem response under cumulative, non-linear interactions across a complex ecosystem highlights the utility of DBNs for decision support and environmental management.

Relevance:

90.00%

Publisher:

Abstract:

The aim of this paper is to provide a Bayesian formulation of the so-called magnitude-based inference approach to quantifying and interpreting effects, and in a case study example provide accurate probabilistic statements that correspond to the intended magnitude-based inferences. The model is described in the context of a published small-scale athlete study which employed a magnitude-based inference approach to compare the effect of two altitude training regimens (live high-train low (LHTL) and intermittent hypoxic exposure (IHE)) on running performance and blood measurements of elite triathletes. The posterior distributions, and corresponding point and interval estimates, for the parameters and associated effects and comparisons of interest, were estimated using Markov chain Monte Carlo simulations. The Bayesian analysis was shown to provide more direct probabilistic comparisons of treatments and to be able to identify small effects of interest. The approach avoided asymptotic assumptions and overcame issues such as multiple testing. Bayesian analysis of unscaled effects showed a probability of 0.96 that LHTL yields a substantially greater increase in hemoglobin mass than IHE, a 0.93 probability of a substantially greater improvement in running economy and a greater than 0.96 probability that both IHE and LHTL yield a substantially greater improvement in maximum blood lactate concentration compared to a Placebo. The conclusions are consistent with those obtained using a 'magnitude-based inference' approach that has been promoted in the field. The paper demonstrates that a fully Bayesian analysis is a simple and effective way of analysing small effects, providing a rich set of results that are straightforward to interpret in terms of probabilistic statements.
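Probabilistic statements like "a probability of 0.96 of a substantially greater increase" are computed directly from posterior draws. A sketch with simulated draws (the posterior distribution and the smallest-worthwhile-change threshold are assumptions, not the triathlete study's MCMC output):

```python
import random

# Sketch of the probabilistic statements used in place of magnitude-based
# inference (simulated posterior draws, not the study's MCMC output): the
# probability that LHTL beats IHE by more than a smallest worthwhile change
# is just the fraction of posterior draws exceeding that change.
random.seed(42)
# assumed posterior for the LHTL-minus-IHE difference in hemoglobin mass (%)
diff_draws = [random.gauss(3.0, 1.2) for _ in range(20000)]

smallest_worthwhile = 1.0  # assumed threshold for a "substantial" effect
p_substantial = sum(d > smallest_worthwhile for d in diff_draws) / len(diff_draws)
print(round(p_substantial, 2))
```

Because the quantity is a simple posterior tail probability, no asymptotic approximation or multiplicity correction is needed, which is exactly the advantage over the magnitude-based inference procedure that the paper emphasises.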