989 results for "sequential methods"


Relevance:

30.00%

Publisher:

Abstract:

Contents: Sequential Design; Molecular Weight Range; Functional Monomers: Possibilities, Limits, and Challenges; Block Copolymers: Combinations, Block Lengths, and Purities; Modular Design; End-Group Chemistry; Ligation Protocols; Conclusions.


Objective To evaluate methods for monitoring monthly aggregated hospital adverse event data that display clustering, non-linear trends and possible autocorrelation. Design Retrospective audit. Setting The Northern Hospital, Melbourne, Australia. Participants 171,059 patients admitted between January 2001 and December 2006. Measurements The analysis is illustrated with 72 months of patient fall injury data using a modified Shewhart U control chart, and charts derived from a quasi-Poisson generalised linear model (GLM) and a generalised additive mixed model (GAMM) that included an approximate upper control limit. Results The data were overdispersed and displayed a downward trend and possible autocorrelation. The downward trend was followed by a relatively stable period after December 2003. The GLM-estimated incidence rate ratio was 0.98 (95% CI 0.98 to 0.99) per month. The GAMM-fitted count fell from 12.67 (95% CI 10.05 to 15.97) in January 2001 to 5.23 (95% CI 3.82 to 7.15) in December 2006 (p<0.001). The corresponding values for the GLM were 11.9 and 3.94. Residual plots suggested that the GLM underestimated the rate at the beginning and end of the series and overestimated it in the middle. The data suggested a more rapid rate fall before 2004 and a steady state thereafter, a pattern reflected in the GAMM chart. The approximate upper two-sigma equivalent control limit in the GLM and GAMM charts identified 2 months that showed possible special-cause variation. Conclusion Charts based on GAMM analysis are a suitable alternative to Shewhart U control charts with these data.
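The Shewhart U chart referred to above has a simple closed form: with the overall rate u_bar = (total events) / (total exposure), the limits for a month with exposure n are u_bar +/- 3*sqrt(u_bar/n). A minimal sketch with invented monthly counts and exposures (the paper's fall-injury data are not reproduced here):

```python
# Sketch of a Shewhart U control chart for monthly adverse-event counts
# with varying exposure (e.g. thousands of occupied bed-days).
# All counts and exposures below are invented for illustration.
import math

counts = [12, 9, 14, 8, 11, 7, 10, 6, 9, 5, 7, 4]       # events per month
exposure = [3.1, 2.9, 3.2, 3.0, 3.1, 2.8,
            3.0, 2.9, 3.1, 3.0, 2.8, 2.9]               # 1000 bed-days

u_bar = sum(counts) / sum(exposure)                      # centre line (rate)

def limits(n, sigma=3.0):
    """Lower/upper control limits for a month with exposure n."""
    half_width = sigma * math.sqrt(u_bar / n)
    return max(u_bar - half_width, 0.0), u_bar + half_width

rates = [c / n for c, n in zip(counts, exposure)]
# months whose rate falls outside the limits signal special-cause variation
signals = [i for i, (r, n) in enumerate(zip(rates, exposure))
           if not (limits(n)[0] <= r <= limits(n)[1])]
```

With these invented numbers no month signals; overdispersed real data, as the abstract notes, is exactly where this plain U chart becomes unreliable.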


A computationally efficient sequential Monte Carlo algorithm is proposed for the sequential design of experiments for the collection of block data described by mixed effects models. The difficulty in applying a sequential Monte Carlo algorithm in such settings is the need to evaluate the observed data likelihood, which is typically intractable for all but linear Gaussian models. To overcome this difficulty, we propose to unbiasedly estimate the likelihood, and to perform inference and make decisions based on an exact-approximate algorithm. Two estimates are proposed: one using quasi-Monte Carlo methods, and one using the Laplace approximation with importance sampling. Both of these approaches can be computationally expensive, so we propose exploiting parallel computational architectures to ensure designs can be derived in a timely manner. We also extend our approach to allow for model uncertainty. This research is motivated by important pharmacological studies related to the treatment of critically ill patients.
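The exact-approximate idea rests on replacing an intractable observed-data likelihood with an unbiased estimate of it. A toy sketch of the simplest such estimator, averaging the conditional likelihood over prior draws of the random effect for an invented Poisson mixed model; this is plain Monte Carlo, not the paper's quasi-Monte Carlo or Laplace importance-sampling estimators:

```python
# Toy unbiased Monte Carlo estimate of an observed-data likelihood for a
# Poisson mixed-effects model: p(y) = E_b[ p(y | b) ], b ~ N(0, sigma_b^2).
# The model and all values are invented for illustration.
import math
import random

random.seed(1)

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

def likelihood_estimate(y, mu, sigma_b, n_draws=5000):
    """Average p(y | b_i) over prior draws b_i; unbiased for p(y)."""
    total = 0.0
    for _ in range(n_draws):
        b = random.gauss(0.0, sigma_b)            # draw the random effect
        lam = math.exp(mu + b)
        total += math.prod(poisson_pmf(k, lam) for k in y)
    return total / n_draws

y_block = [2, 3, 1, 4]                            # one subject's block of counts
est = likelihood_estimate(y_block, mu=1.0, sigma_b=0.3)
```

In an exact-approximate scheme this noisy estimate is simply plugged in wherever the likelihood is needed; unbiasedness is what keeps the resulting inference targeting the correct posterior.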


The purpose of this study was to improve individual and organisational performance in primary health care (PHC) by identifying the relationship between organisational culture, leadership behaviour and job satisfaction. The study used a sequential explanatory mixed methods design to investigate the relationships between organisational culture, leadership behaviour and job satisfaction among 550 PHC professionals in Saudi Arabia. The survey results highlighted the importance of human caring qualities, including praise and recognition, consideration and support, to the professionals' perceptions of job satisfaction, leadership behaviour and organisational culture. As a consequence, a management framework was proposed to address these issues.


We aim to design strategies for sequential decision making that adjust to the difficulty of the learning problem. We study this question both in the setting of prediction with expert advice, and for more general combinatorial decision tasks. We are not satisfied with just guaranteeing minimax regret rates, but we want our algorithms to perform significantly better on easy data. Two popular ways to formalize such adaptivity are second-order regret bounds and quantile bounds. The underlying notions of 'easy data', which may be paraphrased as "the learning problem has small variance" and "multiple decisions are useful", are synergetic. But even though there are sophisticated algorithms that exploit one of the two, no existing algorithm is able to adapt to both. In this paper we outline a new method for obtaining such adaptive algorithms, based on a potential function that aggregates a range of learning rates (which are essential tuning parameters). By choosing the right prior we construct efficient algorithms and show that they reap both benefits by proving the first bounds that are both second-order and incorporate quantiles.


This paper describes the use of exploratory focus groups to inform the development of a survey instrument in a sequential phase mixed methods study investigating differences in secondary students' career choice capability. Five focus groups were conducted with 23 Year 10 students in the state of New South Wales (NSW), Australia. Analysis of the focus group data informed the design of the instrument for the second phase of the research project: a large-scale cross-sectional survey. In this paper, we discuss the benefits of using sequential phase mixed methods approaches when inquiring into complex phenomena such as human capability.


This thesis introduces a new way of using prior information in a spatial model and develops scalable algorithms for fitting this model to large imaging datasets. These methods are employed for image-guided radiation therapy and satellite-based classification of land use and water quality. A pre-computation step achieves a hundredfold improvement in the elapsed runtime for model fitting, making it much more feasible to apply these models to real-world problems and enabling full Bayesian inference for images with a million or more pixels.


Introduction Clinically, the Cobb angle method measures the overall scoliotic curve in the coronal plane but does not measure individual vertebra and disc wedging. The contributions of the vertebrae and discs in the growing scoliotic spine were measured to investigate coronal plane deformity progression with growth. Methods A 0.49 mm isotropic 3D MRI technique was developed to investigate the level-by-level changes that occur in the growing spine of a group of Adolescent Idiopathic Scoliosis (AIS) patients, who received two to four sequential scans (spaced 3-12 months apart). The coronal plane wedge angles of each vertebra and disc in the major curve were measured to capture any changes that occurred during the adolescent growth phase. Results Seventeen patients had at least two scans. Mean patient age was 12.9 years (SD 1.5 years). Sixteen were classified as right-sided major thoracic Lenke Type 1 (one left-sided). Mean standing Cobb angle at initial presentation was 31° (SD 12°). Six patients received two scans, nine received three and two received four, with 65% showing a Cobb angle progression of 5° or more between scans. Overall, there was no clear pattern of deformity progression of individual vertebrae and discs, nor any difference between patients who progressed and those who did not. There were measurable changes in the wedging of the vertebrae and discs in all patients. In sequential scans, changes in the direction of wedging were also seen. In several patients, reverse wedging in the discs counteracted increased wedging of the vertebrae such that no change in overall Cobb angle was seen. Conclusion Sequential MRI data showed complex patterns of deformity progression. Changes to the wedging of individual vertebrae and discs may occur in patients who have no increase in Cobb angle measure; the Cobb method alone may be insufficient to capture the complex mechanisms of deformity progression.
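The masking effect described in the results can be illustrated with a toy calculation: treating the coronal Cobb angle over the major curve as approximately the sum of the vertebral and disc wedge angles, reverse disc wedging can exactly offset increased vertebral wedging. All angles below (in degrees) are invented:

```python
# Toy wedge-angle bookkeeping: overall coronal curve angle taken as the
# sum of vertebral and disc wedge angles in the major curve.
# All angles (degrees) are invented for illustration.
vertebra_wedges_t1 = [3.0, 5.0, 6.0, 5.0, 3.0]   # first scan
disc_wedges_t1 = [2.0, 2.0, 2.0, 2.0]
cobb_t1 = sum(vertebra_wedges_t1) + sum(disc_wedges_t1)

# second scan: vertebrae wedge further, discs wedge in the reverse direction
vertebra_wedges_t2 = [4.0, 6.0, 7.0, 6.0, 4.0]
disc_wedges_t2 = [1.0, 0.5, 0.5, 1.0]
cobb_t2 = sum(vertebra_wedges_t2) + sum(disc_wedges_t2)

vertebral_change = sum(vertebra_wedges_t2) - sum(vertebra_wedges_t1)
cobb_change = cobb_t2 - cobb_t1   # zero: disc changes cancel vertebral change
```

Here the vertebrae wedge a further 5 degrees in total while the overall angle is unchanged, which is exactly why a Cobb measurement alone can miss level-by-level progression.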



Summary. Interim analysis is important in a large clinical trial for ethical and cost considerations. Sometimes, an interim analysis needs to be performed at an earlier than planned time point. In that case, methods using stochastic curtailment are useful in examining the data for early stopping while controlling the inflation of type I and type II errors. We consider a three-arm randomized study of treatments to reduce perioperative blood loss following major surgery. Owing to slow accrual, an unplanned interim analysis was required by the study team to determine whether the study should be continued. We distinguish two different cases: when all treatments are under direct comparison and when one of the treatments is a control. We used simulations to study the operating characteristics of five different stochastic curtailment methods. We also considered the influence of timing of the interim analyses on the type I error and power of the test. We found that the type I error and power between the different methods can be quite different. The analysis for the perioperative blood loss trial was carried out at approximately a quarter of the planned sample size. We found that there is little evidence that the active treatments are better than a placebo and recommended closure of the trial.
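Stochastic curtailment is usually operationalised through conditional power: the probability of a significant final result given the interim statistic, computed under an assumed drift. A hedged sketch for a one-sided normal test using the B-value decomposition B(t) = Z(t)*sqrt(t); the interim values are invented, not the blood-loss trial's data:

```python
# Conditional power for a one-sided normal test via the B-value
# decomposition: B(1) = B(t) + increment with mean drift*(1-t), sd sqrt(1-t).
# Interim values are invented for illustration.
import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def conditional_power(z_t, t, drift, z_crit=1.96):
    """P(Z(1) > z_crit | Z(t) = z_t) when E[Z(1)] = drift."""
    b_t = z_t * math.sqrt(t)
    mean_rest = drift * (1.0 - t)
    sd_rest = math.sqrt(1.0 - t)
    return 1.0 - phi((z_crit - b_t - mean_rest) / sd_rest)

# interim look at a quarter of the planned information, weak observed effect
cp_null = conditional_power(z_t=0.3, t=0.25, drift=0.0)   # current-trend drift
cp_alt = conditional_power(z_t=0.3, t=0.25, drift=2.8)    # design-alternative drift
```

A futility rule of the stochastic-curtailment type stops the trial when conditional power under the design alternative falls below a preset threshold; the choice of drift assumption is one of the ways the five methods compared in the paper differ.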


Suppose two treatments with binary responses are available for patients with some disease and that each patient will receive one of the two treatments. In this paper we consider the interests of patients both within and outside a trial using a Bayesian bandit approach and conclude that equal allocation is not appropriate for either group of patients. It is suggested that Gittins indices should be used (using an approach called dynamic discounting by choosing the discount rate based on the number of future patients in the trial) if the disease is rare, and the least failures rule if the disease is common. Some analytical and simulation results are provided.
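The least failures rule named above is simple to state: allocate the next patient to the treatment with the fewest observed failures so far. A minimal sketch with invented success probabilities and a deterministic tie-break (ties go to the less-tried arm, one of several possible conventions):

```python
# Sketch of the "least failures" allocation rule for two binary treatments.
# Success probabilities are invented; tie-breaking convention is assumed.
import random

random.seed(7)

def least_failures_arm(failures, successes):
    """Arm with fewest failures; ties broken in favour of the less-tried arm."""
    return min(range(len(failures)),
               key=lambda a: (failures[a], successes[a]))

def run_trial(p_success, n_patients):
    arms = len(p_success)
    failures, successes = [0] * arms, [0] * arms
    for _ in range(n_patients):
        a = least_failures_arm(failures, successes)
        if random.random() < p_success[a]:
            successes[a] += 1
        else:
            failures[a] += 1
    return failures, successes

failures, successes = run_trial(p_success=[0.8, 0.4], n_patients=100)
```

The rule concentrates allocation on the arm that is failing least, which is why the paper recommends it for common diseases, where the in-trial patients dominate; Gittins-index allocation instead weighs in-trial patients against the discounted stream of future ones.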


Between-subject and within-subject variability is ubiquitous in biology and physiology, and understanding and dealing with it is one of the biggest challenges in medicine. At the same time, it is difficult to investigate this variability by experiments alone. A recent modelling and simulation approach, known as population of models (POM), allows this exploration to take place by building a mathematical model consisting of multiple parameter sets calibrated against experimental data. However, finding such sets within a high-dimensional parameter space of complex electrophysiological models is computationally challenging. By placing the POM approach within a statistical framework, we develop a novel and efficient algorithm based on sequential Monte Carlo (SMC). We compare the SMC approach with Latin hypercube sampling (LHS), a method commonly adopted in the literature for obtaining the POM, in terms of efficiency and output variability in the presence of a drug block, through an in-depth investigation of the Beeler-Reuter cardiac electrophysiological model. We show improved efficiency via SMC, and that it produces similar responses to LHS when making out-of-sample predictions in the presence of a simulated drug block.
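Latin hypercube sampling, the baseline compared against above, stratifies each parameter's range into n equal slices and draws one point per slice, pairing slices randomly across dimensions so every one-dimensional margin is evenly covered. A small self-contained sketch with invented parameter bounds (not the Beeler-Reuter model's):

```python
# Minimal Latin hypercube sampler: one point per stratum in each dimension,
# with strata randomly paired across dimensions. Bounds are invented.
import random

random.seed(3)

def latin_hypercube(n_samples, bounds):
    """Return n_samples points; bounds is a list of (lo, hi) per dimension."""
    dims = len(bounds)
    samples = [[0.0] * dims for _ in range(n_samples)]
    for d, (lo, hi) in enumerate(bounds):
        strata = list(range(n_samples))
        random.shuffle(strata)             # pair strata across dimensions
        for i, s in enumerate(strata):
            u = (s + random.random()) / n_samples   # uniform within stratum s
            samples[i][d] = lo + u * (hi - lo)
    return samples

# candidate parameter sets for a population of models, to be accepted or
# rejected against experimental calibration ranges
pom_candidates = latin_hypercube(50, bounds=[(0.5, 2.0), (0.0, 1.0)])
```

In a POM workflow each candidate would then be simulated and kept only if its outputs fall within the experimental ranges; the SMC alternative instead moves samples adaptively toward the calibrated region.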


Aim: The requirement for an allied health workforce is expanding as the global burden of disease increases internationally. To safely meet the demand for an expanded workforce of orthotist/prosthetists in Australia, up-to-date, evidence-based competency standards are required. The aims of this study were to determine the minimum level for entry into the orthotic/prosthetic profession; to develop entry-level competency standards for the profession; and to validate the developed entry-level competency standards within the profession nationally, using an evidence-based approach. Methods: A mixed-methods research design was applied, using a three-step sequential exploratory design: step 1 involved collecting and analysing qualitative data from two focus groups; step 2 involved exploratory instrument development and testing, producing the draft competency standards; and step 3 involved quantitative data collection and analysis via a Delphi survey. In stage 1 (steps 1 and 2), the two focus groups, one of expert and one of recently graduated Australian orthotist/prosthetists, were led by an experienced facilitator to identify gaps in the current competency standards and then to outline a key purpose, work roles and tasks for the profession. The resulting domains and activities of the first draft of the competency standards were synthesised using thematic analysis. In stage 2 (step 3), the draft competency standards were circulated to a purposive sample of the membership of the Australian Orthotic Prosthetic Association over three rounds of a Delphi survey. A project reference group of orthotist/prosthetists reviewed the results of both stages. Results: In stage 1, the expert (n = 10) and the new graduate (n = 8) groups separately identified work roles and tasks, which formed the initial draft of the competency standards. Further drafts were refined and performance criteria added by the project reference group, resulting in the final draft competency standards. In stage 2, the final draft competency standards were circulated to 56 members (n = 44 in the final round) of the Association, who agreed on the key purpose, 6 domains, 18 activities and 68 performance criteria of the final competency standards. Conclusion: This study outlines a rigorous, evidence-based mixed-methods approach for developing and endorsing professional competency standards, which is representative of the views of the profession of orthotist/prosthetists.


The quantification and characterisation of soil phosphorus (P) is of agricultural and environmental importance, and different extraction methods are widely used to assess the bioavailability of P and to characterise soil P reserves. However, the large variety of extractants, pre-treatments and sample preparation procedures complicates the comparison of published results. In order to improve our understanding of the behaviour and cycling of P in soil, it is crucial to know the scientific relevance of the methods used for various purposes. Knowledge of the factors affecting the analytical outcome is a prerequisite for justified interpretation of the results. The aim of this thesis was to study the effects of sample preparation procedures on soil P and to determine the dependence of the recovered P pool on the chemical nature of the extractants. Sampling is a critical step in soil testing, and sampling strategy depends on the land-use history and the purpose of sampling. This study revealed that pre-treatments changed soil properties: air-drying was found to affect soil P, particularly extractable organic P, by disrupting organic matter. This was evidenced by an increase in water-extractable small-sized (<0.2 µm) P that, at least partly, took place at the expense of the large-sized (>0.2 µm) P. However, freezing induced only insignificant changes; thus, freezing can be taken to be a suitable method for storing soils from the boreal zone, which naturally undergo periodic freezing. The results demonstrated that the chemical nature of the extractant affects its sensitivity to detect changes in soil P solubility. Buffered extractants obscured the alterations in P solubility induced by pH changes; water extraction, by contrast, though sensitive to physicochemical changes, can be used to reveal short-term changes in soil P solubility. As for organic P, the analysis was found to be sensitive to the sample preparation procedures: filtering may leave a large proportion of extractable organic P undetected, whereas the outcome of centrifugation was found to be affected by the ionic strength of the extractant. Widely used sequential fractionation procedures proved able to detect land-use-derived differences in the distribution of P among fractions of different solubilities. However, interpretation of the results from extraction experiments requires a better understanding of the biogeochemical function of the recovered P fraction in the P cycle in differently managed soils under dissimilar climatic conditions.


Sequential firings with fixed time delays are frequently observed in simultaneous recordings from multiple neurons. Such temporal patterns are potentially indicative of underlying microcircuits and it is important to know when a repeatedly occurring pattern is statistically significant. These sequences are typically identified through correlation counts. In this paper we present a method for assessing the significance of such correlations. We specify the null hypothesis in terms of a bound on the conditional probabilities that characterize the influence of one neuron on another. This method of testing significance is more general than the currently available methods since under our null hypothesis we do not assume that the spiking processes of different neurons are independent. The structure of our null hypothesis also allows us to rank order the detected patterns. We demonstrate our method on simulated spike trains.
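One way to formalise the counting idea is as follows: if the null hypothesis bounds by p0 the conditional probability that the second neuron fires at the fixed delay after a reference spike, then the number of delayed coincidences out of N reference spikes is stochastically dominated by a Binomial(N, p0), and a one-sided binomial tail gives a conservative p-value. A sketch with invented counts; the paper's actual test and its ranking of patterns are more general than this simplification:

```python
# Conservative binomial-tail significance test for a delayed-coincidence
# count under a bound p0 on the conditional firing probability.
# Counts below are invented for illustration.
import math

def binom_tail(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(math.comb(n, i) * p ** i * (1 - p) ** (n - i)
               for i in range(k, n + 1))

n_ref_spikes = 200      # spikes of the reference neuron
coincidences = 38       # times the target neuron fired at the fixed delay
p0 = 0.1                # null bound on the conditional probability

p_value = binom_tail(coincidences, n_ref_spikes, p0)
significant = p_value < 0.01
```

Because the null is a bound on conditional probabilities rather than an independence assumption, a rejection here does not require the two spiking processes to be independent, which is the generality the abstract emphasises.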