990 results for sequential data


Relevance: 30.00%

Abstract:

Reliability of the performance of biometric identity verification systems remains a significant challenge. Individual biometric samples of the same person (identity class) are not identical at each presentation, and performance degradation arises from intra-class variability and inter-class similarity. These limitations lead to false accepts and false rejects that are dependent: it is therefore difficult to reduce the rate of one type of error without increasing the other. The focus of this dissertation is to investigate a method based on classifier fusion techniques to better control the trade-off between the verification errors, using text-dependent speaker verification as the test platform. A sequential classifier fusion architecture that integrates multi-instance and multi-sample fusion schemes is proposed. This fusion method enables a controlled trade-off between false alarms and false rejects. For statistically independent classifier decisions, analytical expressions for each type of verification error are derived using the base classifier performances. As this assumption may not always be valid, these expressions are modified to incorporate the correlation between statistically dependent decisions from clients and impostors. The architecture is evaluated empirically for text-dependent speaker verification, using Hidden Markov Model based digit-dependent speaker models in each stage with multiple attempts for each digit utterance. The trade-off between the verification errors is controlled by two parameters, the number of decision stages (instances) and the number of attempts at each decision stage (samples), fine-tuned on an evaluation (tuning) set. The derived error estimates are statistically validated on test data. The performance of the sequential method is further shown to depend on the order in which the digits (instances) are combined and on the nature of the repeated attempts (samples). The false rejection and false acceptance rates for the proposed fusion are estimated using the base classifier performances, the variance in correlation between classifier decisions, and a sequence of classifiers with favourable dependence selected using the 'Sequential Error Ratio' criterion. The error rates are better estimated by incorporating user-dependent information (such as speaker-dependent thresholds and speaker-specific digit combinations) and class-dependent information (such as client-impostor dependent favourable combinations and class-error based threshold estimation). The proposed architecture is desirable in most speaker verification applications, such as remote authentication and telephone and internet shopping. Tuning the two parameters, the number of instances and samples, serves both the security and the user-convenience requirements of speaker-specific verification. The architecture investigated here is also applicable to verification using other biometric modalities such as handwriting, fingerprints and keystrokes.
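
As a hedged illustration of the independence-based error expressions described above, the sketch below computes overall error rates for one plausible cascade rule (a stage accepts if any of its n sample attempts accepts, and the claimant must pass all stages). The dissertation's exact combination rule, correlation corrections and 'Sequential Error Ratio' ordering are not reproduced; the function and numbers are illustrative only.

```python
def cascade_error_rates(stage_far, stage_frr, n_attempts):
    """Overall FAR/FRR of a sequential multi-instance, multi-sample cascade,
    assuming statistically independent classifier decisions.

    Assumed rule (one plausible instantiation, not the dissertation's exact
    scheme): within a stage the claimant is accepted if ANY of the n_attempts
    attempts accepts; overall acceptance requires passing ALL stages."""
    overall_far = 1.0      # impostor must be falsely accepted at every stage
    client_pass_all = 1.0  # client must be correctly accepted at every stage
    for far, frr in zip(stage_far, stage_frr):
        stage_impostor_pass = 1.0 - (1.0 - far) ** n_attempts  # any attempt falsely accepts
        stage_client_pass = 1.0 - frr ** n_attempts            # not all attempts falsely reject
        overall_far *= stage_impostor_pass
        client_pass_all *= stage_client_pass
    return overall_far, 1.0 - client_pass_all

# Three digit stages (instances) with two attempts (samples) per stage:
far, frr = cascade_error_rates([0.05, 0.04, 0.06], [0.08, 0.07, 0.09], n_attempts=2)
print(f"overall FAR = {far:.4f}, overall FRR = {frr:.4f}")
```

Under this rule, adding stages drives the false accept rate down while pushing false rejects up, and extra attempts per stage do the reverse; this is exactly the controllable trade-off the abstract describes.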

Relevance: 30.00%

Abstract:

Dose-finding trials are a form of clinical data-collection process whose primary objective is to estimate the optimum dose of an investigational new drug when given to a patient. This thesis develops and explores three novel dose-finding design methodologies. All of the methodologies presented are pragmatic: they use statistical models, incorporate clinicians' prior knowledge efficiently, and allow a trial to be stopped early for safety or futility reasons. Designing actual dose-finding trials using these methodologies will minimize practical difficulties, improve the efficiency of dose estimation, provide the flexibility to stop early, and reduce possible patient discomfort or harm.

Relevance: 30.00%

Abstract:

Objective: To evaluate methods for monitoring monthly aggregated hospital adverse-event data that display clustering, non-linear trends and possible autocorrelation. Design: Retrospective audit. Setting: The Northern Hospital, Melbourne, Australia. Participants: 171,059 patients admitted between January 2001 and December 2006. Measurements: The analysis is illustrated with 72 months of patient fall-injury data using a modified Shewhart U control chart, and charts derived from a quasi-Poisson generalised linear model (GLM) and a generalised additive mixed model (GAMM) that included an approximate upper control limit. Results: The data were overdispersed and displayed a downward trend and possible autocorrelation. The downward trend was followed by a stable, predictable period after December 2003. The GLM-estimated incidence rate ratio was 0.98 (95% CI 0.98 to 0.99) per month. The GAMM-fitted count fell from 12.67 (95% CI 10.05 to 15.97) in January 2001 to 5.23 (95% CI 3.82 to 7.15) in December 2006 (p<0.001); the corresponding values for the GLM were 11.9 and 3.94. Residual plots suggested that the GLM underestimated the rate at the beginning and end of the series and overestimated it in the middle. The data suggested a more rapid fall in the rate before 2004 and a steady state thereafter, a pattern reflected in the GAMM chart. The approximate upper two-sigma equivalent control limit in the GLM and GAMM charts identified two months that showed possible special-cause variation. Conclusion: Charts based on GAMM analysis are a suitable alternative to Shewhart U control charts with these data.
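
A minimal sketch of the quasi-Poisson GLM chart idea, assuming statsmodels is available and using synthetic counts in place of the fall-injury data: fit a log-linear trend, estimate the overdispersion from the Pearson chi-square, and flag months above an approximate upper two-sigma limit. The GAMM variant would replace the linear trend with a penalised smooth.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
months = np.arange(72)                             # Jan 2001 .. Dec 2006
counts = rng.poisson(12 * np.exp(-0.02 * months))  # synthetic stand-in data

# Quasi-Poisson GLM: Poisson fit, dispersion estimated via Pearson chi-square.
X = sm.add_constant(months)
fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit(scale="X2")
mu = fit.fittedvalues                              # fitted monthly rate
phi = fit.scale                                    # overdispersion estimate

# Approximate upper two-sigma control limit around the fitted rate.
ucl = mu + 2.0 * np.sqrt(phi * mu)
print("months above the upper limit:", np.where(counts > ucl)[0])
```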

Relevance: 30.00%

Abstract:

A computationally efficient sequential Monte Carlo algorithm is proposed for the sequential design of experiments for the collection of block data described by mixed effects models. The difficulty in applying sequential Monte Carlo in such settings is the need to evaluate the observed-data likelihood, which is typically intractable for all but linear Gaussian models. To overcome this difficulty, we propose to estimate the likelihood unbiasedly, and to perform inference and make decisions via an exact-approximate algorithm. Two estimators are proposed: one based on quasi-Monte Carlo methods and one based on the Laplace approximation with importance sampling. Both approaches can be computationally expensive, so we propose exploiting parallel computational architectures to ensure that designs can be derived in a timely manner. We also extend our approach to allow for model uncertainty. This research is motivated by important pharmacological studies related to the treatment of critically ill patients.
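
The following sketch illustrates the second of the two proposed estimators in a deliberately simplified setting (a single Gaussian random intercept in a Poisson model): a Laplace step locates the mode of the integrand, and importance sampling around that mode yields an unbiased estimate of the observed-data likelihood, the quantity an exact-approximate SMC algorithm would plug into its weights. The model and parameter values are illustrative, not taken from the pharmacological application.

```python
import numpy as np
from scipy import optimize, stats

def loglik_est(y, beta, sigma_b, n_draws=256, rng=np.random.default_rng(0)):
    """Log of an unbiased importance-sampling estimate of the observed-data
    likelihood p(y | beta, sigma_b) for a Poisson model with one random
    intercept b:  y_j | b ~ Poisson(exp(beta + b)),  b ~ N(0, sigma_b^2).
    The proposal is a Gaussian centred at the Laplace mode of p(y, b)."""
    def neg_log_joint(b):
        return -(stats.poisson.logpmf(y, np.exp(beta + b)).sum()
                 + stats.norm.logpdf(b, 0.0, sigma_b))
    # Laplace step: mode and curvature of the integrand in b.
    b_hat = optimize.minimize_scalar(neg_log_joint).x
    h = 1e-4
    hess = (neg_log_joint(b_hat + h) - 2 * neg_log_joint(b_hat)
            + neg_log_joint(b_hat - h)) / h**2
    prop_sd = 1.0 / np.sqrt(hess)
    # Importance sampling around the Laplace mode.
    b = rng.normal(b_hat, prop_sd, n_draws)
    log_w = (stats.poisson.logpmf(y[:, None], np.exp(beta + b)).sum(axis=0)
             + stats.norm.logpdf(b, 0.0, sigma_b)
             - stats.norm.logpdf(b, b_hat, prop_sd))
    m = log_w.max()
    return m + np.log(np.mean(np.exp(log_w - m)))  # log of the unbiased estimate

y = np.array([3, 5, 4, 6])  # one subject's block of counts
print(loglik_est(y, beta=1.2, sigma_b=0.5))
```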

Relevance: 30.00%

Abstract:

A new transdimensional Sequential Monte Carlo (SMC) algorithm called SMCVB is proposed. In an SMC approach, a weighted sample of particles is generated from a sequence of probability distributions which ‘converge’ to the target distribution of interest, in this case a Bayesian posterior distribution. The approach is based on the use of variational Bayes to propose new particles at each iteration of the SMCVB algorithm in order to target the posterior more efficiently. The variational-Bayes-generated proposals are not limited to a fixed dimension. This means that the weighted particle sets that arise can have varying dimensions, thereby allowing us the option to also estimate an appropriate dimension for the model. This novel algorithm is outlined within the context of finite mixture model estimation. This provides a less computationally demanding alternative to using reversible jump Markov chain Monte Carlo kernels within an SMC approach. We illustrate these ideas in a simulated data analysis and in applications.
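
To make the role of the proposal step concrete, here is a generic tempered-SMC skeleton in which the particle-refresh proposal is a placeholder Gaussian fitted to the current cloud; in SMCVB that step would instead draw from a variational Bayes fit, possibly of a different dimension. This is a fixed-dimension sketch only, and the transdimensional bookkeeping is not shown.

```python
import numpy as np
from scipy import stats

def smc_sampler(prior_rvs, log_prior, log_lik, n_particles=500, n_temps=20,
                rng=np.random.default_rng(0)):
    """Generic tempered SMC: move a particle cloud from the prior to the
    posterior through targets prior * likelihood**gamma_t. In SMCVB the
    refresh step below would draw from a variational Bayes fit to the
    current weighted particles rather than a moment-matched Gaussian."""
    gammas = np.linspace(0.0, 1.0, n_temps)
    theta = prior_rvs(n_particles, rng)
    log_w = np.zeros(n_particles)
    for g_prev, g in zip(gammas[:-1], gammas[1:]):
        log_w += (g - g_prev) * log_lik(theta)           # incremental weights
        w = np.exp(log_w - log_w.max()); w /= w.sum()
        idx = rng.choice(n_particles, n_particles, p=w)  # multinomial resampling
        theta, log_w = theta[idx], np.zeros(n_particles)
        # Refresh via independence Metropolis-Hastings with a fitted proposal.
        mu, sd = theta.mean(), theta.std() + 1e-9
        prop = rng.normal(mu, sd, n_particles)
        log_ratio = ((log_prior(prop) + g * log_lik(prop)
                      - stats.norm.logpdf(prop, mu, sd))
                     - (log_prior(theta) + g * log_lik(theta)
                        - stats.norm.logpdf(theta, mu, sd)))
        accept = np.log(rng.uniform(size=n_particles)) < log_ratio
        theta = np.where(accept, prop, theta)
    return theta

# Example: N(0, 1) prior, one observation y = 1.0 with sd 0.5.
post = smc_sampler(lambda n, r: r.normal(0, 1, n),
                   lambda t: stats.norm.logpdf(t, 0, 1),
                   lambda t: stats.norm.logpdf(1.0, t, 0.5))
print(post.mean(), post.std())  # roughly 0.8 and 0.45
```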

Relevance: 30.00%

Abstract:

Introduction: Clinically, the Cobb angle method measures the overall scoliotic curve in the coronal plane but does not measure individual vertebra and disc wedging. The contributions of the vertebrae and discs in the growing scoliotic spine were measured to investigate coronal-plane deformity progression with growth. Methods: A 0.49 mm isotropic 3D MRI technique was developed to investigate the level-by-level changes that occur in the growing spine of a group of Adolescent Idiopathic Scoliosis (AIS) patients who received two to four sequential scans (spaced 3-12 months apart). The coronal-plane wedge angles of each vertebra and disc in the major curve were measured to capture any changes that occurred during the adolescent growth phase. Results: Seventeen patients had at least two scans. Mean patient age was 12.9 years (SD 1.5 years). Sixteen were classified as right-sided major thoracic Lenke Type 1 (one was left-sided). Mean standing Cobb angle at initial presentation was 31° (SD 12°). Six patients received two scans, nine received three and two received four, with 65% showing a Cobb angle progression of 5° or more between scans. Overall, there was no clear pattern of deformity progression of individual vertebrae and discs, nor between patients who progressed and those who did not. There were measurable changes in the wedging of the vertebrae and discs in all patients. In sequential scans, changes in the direction of wedging were also seen. In several patients there was reverse wedging in the discs that counteracted increased wedging of the vertebrae, such that no change in overall Cobb angle was seen. Conclusion: Sequential MRI data showed complex patterns of deformity progression. Changes to the wedging of individual vertebrae and discs may occur in patients who have no increase in Cobb angle measure; the Cobb method alone may be insufficient to capture the complex mechanisms of deformity progression.

Relevance: 30.00%

Abstract:

Clinically, the Cobb angle method measures the overall scoliotic curve in the coronal plane but does not measure individual vertebra and disc wedging. The contributions of the vertebrae and discs in the growing scoliotic spine were measured to investigate coronal plane deformity progression with growth. Sequential MRI data in this project showed complex patterns of deformity progression. Changes to the wedging of individual vertebrae and discs may occur in patients who have no increase in Cobb angle measure; the Cobb method alone may be insufficient to capture the complex mechanisms of deformity progression.

Relevance: 30.00%

Abstract:

Background: Haemodialysis nurses form long-term relationships with patients in a technologically complex work environment. Previous studies have highlighted that haemodialysis nurses face stressors related to the nature of their work and their work environments, leading to reported high levels of burnout. Using Kanter's (1997) Structural Empowerment Theory as a guiding framework, the aim of this study was to explore the factors contributing to satisfaction with the work environment, job satisfaction, job stress and burnout in haemodialysis nurses. Methods: Using a sequential mixed-methods design, the first phase involved an online survey comprising demographic and work characteristics, the Brisbane Practice Environment Measure (B-PEM), the Index of Work Satisfaction (IWS), the Nursing Stress Scale (NSS) and the Maslach Burnout Inventory (MBI). The second phase involved conducting eight semi-structured interviews, with the data thematically analyzed. Results: Of the 417 nurses surveyed, the majority were female (90.9%) and aged over 41 years (74.3%), and 47.4% had worked in haemodialysis for more than 10 years. Overall, the work environment was perceived positively and there was a moderate level of job satisfaction. However, levels of stress and emotional exhaustion (burnout) were high. Two themes, ability to care and feeling successful as a nurse, provided clarity to the level of job satisfaction found in phase 1, while two further themes, patients as quasi-family and intense working teams, explained why working as a haemodialysis nurse was both satisfying and stressful. Conclusions: Nurse managers can use these results to identify issues being experienced by haemodialysis nurses working in the units they supervise.

Relevance: 30.00%

Abstract:

Interim analysis is important in a large clinical trial for ethical and cost considerations. Sometimes an interim analysis needs to be performed earlier than the planned time point; in that case, methods using stochastic curtailment are useful for examining the data for early stopping while controlling the inflation of type I and type II errors. We consider a three-arm randomized study of treatments to reduce perioperative blood loss following major surgery. Owing to slow accrual, an unplanned interim analysis was required by the study team to determine whether the study should be continued. We distinguish two cases: when all treatments are under direct comparison, and when one of the treatments is a control. We used simulations to study the operating characteristics of five different stochastic curtailment methods, and also considered the influence of the timing of the interim analyses on the type I error and power of the test. We found that the type I error and power can differ considerably between the methods. The analysis for the perioperative blood loss trial was carried out at approximately a quarter of the planned sample size. We found little evidence that the active treatments are better than a placebo and recommended closure of the trial.
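
The quantity at the heart of stochastic curtailment is conditional power. As a hedged sketch (not necessarily one of the five methods compared in the study), the standard B-value calculation below gives the probability of eventual success given the interim z-statistic, under either the current trend or a supplied drift.

```python
from math import sqrt
from scipy.stats import norm

def conditional_power(z_interim, info_frac, alpha=0.025, drift=None):
    """Conditional power for a one-sided test at information fraction t.

    Brownian-motion (B-value) formulation: B(t) = Z_t * sqrt(t) and
    B(1) | B(t) ~ N(B(t) + theta * (1 - t), 1 - t) under drift theta.
    If drift is None, theta is estimated from the current trend Z_t/sqrt(t)."""
    t, b = info_frac, z_interim * sqrt(info_frac)
    theta = z_interim / sqrt(t) if drift is None else drift
    z_crit = norm.ppf(1 - alpha)
    return norm.sf((z_crit - b - theta * (1 - t)) / sqrt(1 - t))

# Interim look at roughly a quarter of the planned information, weak signal:
print(conditional_power(z_interim=0.5, info_frac=0.25))  # low CP -> consider stopping
```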

Relevance: 30.00%

Abstract:

This thesis is an empirical study of how two words in Icelandic, "nú" and "núna", are used in contemporary Icelandic conversation. My aims in this study are, first, to explain the differences between the temporal functions of "nú" and "núna" and, second, to describe the non-temporal functions of "nú". In the analysis, the focus is on comparing the sequential placement of the two words, their syntactic distribution, and their prosodic realization. The empirical data comprise 14 hours and 11 minutes of naturally occurring conversation recorded between 1996 and 2003. The selected conversations represent a wide range of interactional contexts, including informal dinner parties, institutional and non-institutional telephone conversations, radio programs for teenagers, phone-in programs and, finally, a political debate on television. The theoretical and methodological framework is interactional linguistics, which can be described as linguistically oriented conversation analysis (CA). A comparison of "nú" and "núna" shows that the two words have different syntactic distributions. "Nú" has a clear tendency to occur in the front field, before the finite verb, while "núna" typically occurs in the end field, after the object. It is argued that this syntactic difference reflects a functional difference between "nú" and "núna". A sequential analysis of "núna" shows that the word refers to an unspecified period of time which includes the utterance time as well as some time in the past and in the future. This temporal relation is referred to as reference time. "Nú", by contrast, is mainly used in three different environments: 1) in temporal comparisons, 2) in transitions, and 3) when the speaker is taking an affective stance. The non-temporal functions of "nú" are divided into three categories: 1) "nú" as a tone particle, 2) "nú" as an utterance particle, and 3) "nú" as a dialogue particle. "Nú" as a tone particle is syntactically integrated and can occur in two syntactic positions: pre-verbally and post-verbally. I argue that these instances are employed in utterances in which a speaker is foregrounding information or marking it as particularly important. The study shows that, although these instances are typically prosodically non-prominent and unstressed, they are in some cases delivered with stress and with a higher pitch than the surrounding talk. "Nú" as an utterance particle occurs turn-initially and is syntactically non-integrated. By using "nú", speakers show continuity between turns and link new turns to prior ones. These instances initiate either continuations by the same speaker or new turns after speaker shifts. "Nú" as a dialogue particle occurs as a turn of its own. The study shows that these instances register informings in prior turns as unexpected or as departures from the normal state of affairs. "Nú" as a dialogue particle is often delivered with a prolonged vowel and a recognizable intonation contour. A comparative sequential and prosodic analysis shows that in these cases there is a correlation between the function of "nú" and the intonation contour with which it is delivered. Finally, I argue that despite the many functions of "nú", all the instances can be said to have a common denominator: displaying attention towards the present moment and the utterances produced before or after "nú". Instead of anchoring the utterances in external time or reference time, these instances position the utterance in discourse-internal time, or discourse time.

Relevance: 30.00%

Abstract:

Wheat grain is Australia's second-largest agricultural export commodity, and there is increasing demand from industry for accurate, objective and near real-time crop production information. The advent of the Moderate Resolution Imaging Spectroradiometer (MODIS) satellite platform has augmented the capability of satellite-based applications to capture reflectance over large areas at acceptable pixel scale, cost and accuracy. The use of multi-temporal MODIS enhanced vegetation index (EVI) imagery to determine crop area was investigated in this article. Here, the rigour of the harmonic analysis of time series (HANTS) and early-season metric approaches was assessed when extrapolating over the entire Queensland (QLD) cropping region for the 2005 and 2006 seasons. Early-season crop area estimates, made at least 4 months before harvest, produced high accuracy at pixel and regional scales, with percent errors of -8.6% and -26% for the 2005 and 2006 seasons, respectively. In discriminating among crops at pixel and regional scale, the HANTS approach showed high accuracy: the errors for specific area estimates for wheat, barley and chickpea were 9.9%, -5.2% and 10.9% (for 2005) and -2.8%, -78% and 64% (for 2006), respectively. Area estimates of total winter crop, wheat, barley and chickpea resulted in coefficient of determination (R²) values of 0.92, 0.89, 0.82 and 0.52 when contrasted against the actual shire-scale data. A significantly high coefficient of determination (0.87) was achieved for total winter crop area estimates in August across all shires for the 2006 season. Furthermore, the HANTS approach showed high accuracy in discriminating cropping area from non-cropping area, and highlighted the need for accurate and up-to-date land use maps. The extrapolability of these approaches for determining total and specific winter crop area estimates, well before flowering, showed good utility across larger areas and seasons. Hence, it is envisaged that this technology might be transferable to different regions across Australia.
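
The core of the HANTS approach is a harmonic regression of each pixel's EVI time series. The sketch below fits a small set of annual harmonics by least squares to a synthetic series; the full algorithm additionally iterates, rejecting cloud-affected low outliers and refitting, which is omitted here.

```python
import numpy as np

def harmonic_fit(t, evi, period=46, n_harmonics=2):
    """Least-squares harmonic reconstruction of a MODIS EVI time series
    (46 composites per year). Simplified HANTS: the iterative rejection of
    cloud-contaminated outliers used by the full algorithm is omitted."""
    cols = [np.ones_like(t, dtype=float)]
    for k in range(1, n_harmonics + 1):
        cols.append(np.cos(2 * np.pi * k * t / period))
        cols.append(np.sin(2 * np.pi * k * t / period))
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, evi, rcond=None)
    return A @ coef, coef  # smoothed series and harmonic coefficients

t = np.arange(46)
evi = (0.3 + 0.2 * np.sin(2 * np.pi * (t - 10) / 46)
       + np.random.default_rng(0).normal(0, 0.02, 46))
smooth, coef = harmonic_fit(t, evi)
# The amplitude and phase of the fitted harmonics feed the crop/non-crop
# discrimination and the early-season area metrics described above.
```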

Relevance: 30.00%

Abstract:

Whether a statistician wants to complement a probability model for observed data with a prior distribution and carry out fully probabilistic inference, or to base the inference only on the likelihood function, may be a fundamental question in theory, but in practice it may well be of less importance if the likelihood contains much more information than the prior. Maximum likelihood inference can be justified as a Gaussian approximation at the posterior mode, using flat priors. However, in situations where the parametric assumptions of standard statistical models would be too rigid, more flexible model formulation, combined with fully probabilistic inference, can be achieved using hierarchical Bayesian parametrization. This work includes five articles, all of which apply probability modeling to various problems involving incomplete observation. Three of the papers apply maximum likelihood estimation and two of them hierarchical Bayesian modeling. Because maximum likelihood may be presented as a special case of Bayesian inference, but not the other way round, in the introductory part of this work we present a framework for probability-based inference using only Bayesian concepts. We also re-derive some results presented in the original articles using the toolbox developed herein, to show that they are also justifiable under this more general framework. The assumption of exchangeability and de Finetti's representation theorem are applied repeatedly to justify the use of standard parametric probability models with conditionally independent likelihood contributions, and it is argued that the same reasoning applies under sampling from a finite population. The main emphasis is on probability-based inference under incomplete observation due to study design, illustrated using a generic two-phase cohort sampling design as an example. The alternative approaches presented for the analysis of such a design are full likelihood, which utilizes all observed information, and conditional likelihood, which is restricted to a completely observed set, conditioning on the rule that generated that set. Conditional likelihood inference is also applied to a joint analysis of prevalence and incidence data, a situation subject to both left censoring and left truncation. Other topics covered are model uncertainty and causal inference using posterior predictive distributions. We formulate a non-parametric monotonic regression model for one or more covariates together with a Bayesian estimation procedure, and apply the model in the context of optimal sequential treatment regimes, demonstrating that inference based on posterior predictive distributions is feasible in this case as well.

Relevance: 30.00%

Abstract:

Background: Ankylosing spondylitis (AS) is an immune-mediated arthritis that particularly targets the spine and pelvis and is characterised by inflammation, osteoproliferation and, frequently, ankylosis. Current treatments that predominately target inflammatory pathways have disappointing efficacy in slowing disease progression. Thus, a better understanding of the causal association and pathological progression from inflammation to bone formation, and particularly of whether inflammation directly initiates osteoproliferation, is required. Methods: The proteoglycan-induced spondylitis (PGISp) mouse model of AS was used to histopathologically map the progressive axial disease events, assess molecular changes during disease progression and define disease progression using unbiased clustering of semi-quantitative histology. PGISp mice were followed over a 24-week time course. Spinal disease was assessed using a novel semi-quantitative histological scoring system that independently evaluated the breadth of pathological features associated with PGISp axial disease, including inflammation, joint destruction and excessive tissue formation (osteoproliferation). Matrix components were identified using immunohistochemistry. Results: Disease initiated with inflammation at the periphery of the intervertebral disc (IVD) adjacent to the longitudinal ligament, reminiscent of enthesitis, and was associated with upregulated tumor necrosis factor and metalloproteinases. After a lag phase, established inflammation was temporospatially associated with destruction of IVDs, cartilage and bone. At later time points, advanced disease was characterised by substantially reduced inflammation, excessive tissue formation and ectopic chondrocyte expansion. These distinct features differentiated affected mice into early, intermediate and advanced disease stages. Excessive tissue formation was observed in vertebral joints only if the IVD had been destroyed as a consequence of the early inflammation. The ectopic excessive tissue was predominantly chondroidal, with chondrocyte-like cells embedded within a collagen type II- and X-rich matrix. This corresponded with upregulation of mRNA for the cartilage markers Col2a1, Sox9 and Comp. Osteophytes, though infrequent, were more prevalent in later disease. Conclusions: Inflammation-driven IVD destruction was shown to be a prerequisite for axial disease progression to osteoproliferation in the PGISp mouse. Osteoproliferation led to vertebral body deformity and fusion but was never seen concurrently with persistent inflammation, suggesting a sequential process. The findings support the view that early intervention with anti-inflammatory therapies will be needed to limit destructive processes and consequently prevent progression of AS.

Relevance: 30.00%

Abstract:

We consider the problem of detecting statistically significant sequential patterns in multineuronal spike trains. These patterns are characterized by ordered sequences of spikes from different neurons with specific delays between spikes. We have previously proposed a data-mining scheme to efficiently discover such patterns when they occur sufficiently often in the data. Here we propose a method to determine the statistical significance of such repeating patterns. The novelty of our approach is that we use a compound null hypothesis that includes not only models of independent neurons but also models in which neurons have weak dependencies. The strength of interaction among the neurons is represented in terms of certain pair-wise conditional probabilities, and we specify our null hypothesis by putting an upper bound on all such conditional probabilities. We construct a probabilistic model that captures the counting process and use it to derive a test of significance for rejecting such a compound null hypothesis. The structure of our null hypothesis also allows us to rank-order different significant patterns. We illustrate the effectiveness of our approach using spike trains generated with a simulator.
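
To convey the flavour of such a test, the sketch below bounds the per-window probability of an L-neuron pattern under a simplified version of the compound null (the first neuron's firing probability bounded by first_rate, each subsequent conditional probability bounded by eps) and treats the occurrence count as binomial. The paper's counting-process model is more refined, so this p-value should be read as a conservative illustration; all names and numbers are hypothetical.

```python
from scipy.stats import binom

def pattern_p_value(n_occurrences, n_windows, first_rate, eps, pattern_len):
    """Conservative p-value for a repeating sequential spike pattern.

    Simplified compound null: the first neuron fires in a window with
    probability at most first_rate, and each subsequent neuron follows its
    predecessor at the prescribed delay with conditional probability at
    most eps. The per-window probability of the full pattern is then at
    most p0, and the occurrence count is treated as binomial."""
    p0 = first_rate * eps ** (pattern_len - 1)
    return binom.sf(n_occurrences - 1, n_windows, p0)  # P(X >= n_occurrences)

# A 4-neuron pattern seen 12 times in 5000 windows, with eps = 0.1:
print(pattern_p_value(12, 5000, first_rate=0.2, eps=0.1, pattern_len=4))
```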