944 results for sequential coalescence
Abstract:
The objective of this study was to investigate the effects of circularity, comorbidity, prevalence and presentation variation on the accuracy of differential diagnoses made in optometric primary care using a modified form of naïve Bayesian sequential analysis. No such investigation has been reported before. Data were collected for 1422 cases seen over one year. Positive test outcomes were recorded for case history (ethnicity, age, symptoms and ocular and medical history) and clinical signs in relation to each diagnosis. For this reason, only positive likelihood ratios were used for this modified form of Bayesian analysis, which was carried out with Laplacian correction and Chi-square filtration. Accuracy was expressed as the percentage of cases for which the diagnosis made by the clinician appeared at the top of a list generated by Bayesian analysis. Preliminary analyses were carried out on 10 diagnoses and 15 test outcomes. Accuracy of 100% was achieved in the absence of presentation variation but dropped by 6% when variation existed. Circularity artificially elevated accuracy by 0.5%. Surprisingly, removal of Chi-square filtering increased accuracy by 0.4%. Decision tree analysis showed that accuracy was influenced primarily by prevalence, followed by presentation variation and comorbidity. Analysis of 35 diagnoses and 105 test outcomes followed. This explored the use of positive likelihood ratios, derived from the case history, to recommend signs to look for. Accuracy of 72% was achieved when all clinical signs were entered. The drop in accuracy, compared to the preliminary analysis, was attributed to the fact that some diagnoses lacked strong diagnostic signs; the accuracy increased by 1% when only recommended signs were entered. Chi-square filtering improved recommended test selection. Decision tree analysis showed that accuracy was again influenced primarily by prevalence, followed by comorbidity and presentation variation. Future work will explore the use of likelihood ratios based on positive and negative test findings prior to considering naïve Bayesian analysis as a form of artificial intelligence in optometric practice.
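A minimal sketch of the kind of ranking described above, with all diagnoses, case counts and positive-outcome counts invented for illustration (they are not the study's data): each diagnosis's prevalence acts as a prior, every observed positive test outcome multiplies in a Laplacian-corrected conditional rate, and diagnoses are ranked by the resulting score.

    # Illustrative naive-Bayes ranking of diagnoses from positive test
    # outcomes. Diagnoses, case counts and positive-outcome counts below
    # are hypothetical, not the study's data.

    def laplace_rate(positives, total):
        """Add-one (Laplacian) corrected estimate of a positive-outcome rate."""
        return (positives + 1) / (total + 2)

    # cases per diagnosis, and positive outcomes per (diagnosis, test)
    cases = {"dry eye": 300, "glaucoma suspect": 40, "cataract": 160}
    positives = {
        ("dry eye", "burning sensation"): 210,
        ("dry eye", "reduced tear break-up time"): 240,
        ("glaucoma suspect", "raised IOP"): 30,
        ("cataract", "lens opacity"): 150,
    }
    total_cases = sum(cases.values())

    def rank_diagnoses(observed_positive_tests):
        """Rank diagnoses by prevalence times the product of outcome rates."""
        scores = {}
        for dx, n_dx in cases.items():
            score = n_dx / total_cases            # prior = prevalence
            for test in observed_positive_tests:  # multiply in P(test+ | dx)
                score *= laplace_rate(positives.get((dx, test), 0), n_dx)
            scores[dx] = score
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

    print(rank_diagnoses(["burning sensation", "reduced tear break-up time"]))

Accuracy in the abstract's sense would then be the fraction of cases whose clinician-recorded diagnosis appears first in such a list.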
Abstract:
When designing a practical swarm robotics system, self-organized task allocation is key to making best use of resources. Current research in this area focuses on task allocation that is either distributed (tasks must be performed at different locations) or sequential (tasks are complex and must be split into simpler sub-tasks and processed in order). In practice, however, swarms will need to deal with tasks that are both distributed and sequential. In this paper, a classic foraging problem is extended to incorporate both distributed and sequential tasks. The problem is analysed theoretically, absolute limits on performance are derived, and a set of conditions for a successful algorithm is established. It is shown empirically that an algorithm which meets these conditions, by causing emergent cooperation between robots, can achieve consistently high performance under a wide range of settings without the need for communication. © 2013 IEEE.
Abstract:
Supported by the Serbian Scientific Foundation, grant No. 04M01.
Abstract:
Sequential pattern mining is an important subject in data mining with broad applications in many different areas. However, previous sequential pattern mining algorithms mostly aimed to calculate the number of occurrences (the support) without regard to the degree of importance of different data items. In this paper, we propose to explore the search space of subsequences with normalized weights. We are interested not only in the number of occurrences of the sequences (supports of sequences), but also in the importance of sequences (weights). When generating subsequence candidates, we use both the support and the weight of the candidates while maintaining the downward closure property of these patterns, which allows us to accelerate the process of candidate generation.
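A toy sketch of the pruning idea, on invented sequences, weights and threshold: one standard way to keep the downward closure property when weights are involved is to prune with an optimistic bound, support times the maximum item weight, which can only shrink as a pattern grows. The paper's normalized-weight scheme may differ in detail.

    # Toy weighted sequential pattern search with a downward-closure-safe
    # prune: weighted support cannot exceed support * max item weight, and
    # support never grows as a pattern is extended. All data are invented.

    sequences = [list("abcab"), list("acbc"), list("abbc"), list("bcab")]
    weights = {"a": 0.9, "b": 0.5, "c": 0.7}
    MAX_W = max(weights.values())
    MIN_WSUP = 1.5  # weighted-support threshold

    def support(pattern, seqs):
        """Sequences containing `pattern` as a (non-contiguous) subsequence."""
        def contains(seq):
            it = iter(seq)
            return all(item in it for item in pattern)
        return sum(contains(s) for s in seqs)

    def weighted_support(pattern, seqs):
        avg_w = sum(weights[x] for x in pattern) / len(pattern)
        return support(pattern, seqs) * avg_w

    # Apriori-style growth: extend only candidates whose bound survives.
    frontier, patterns = [[x] for x in weights], []
    while frontier:
        next_frontier = []
        for p in frontier:
            if support(p, sequences) * MAX_W < MIN_WSUP:
                continue  # no extension of p can reach the threshold: prune
            if weighted_support(p, sequences) >= MIN_WSUP:
                patterns.append("".join(p))
            next_frontier.extend(p + [x] for x in weights)
        frontier = next_frontier
    print(patterns)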
Abstract:
Heterogeneous datasets arise naturally in most applications due to the use of a variety of sensors and measuring platforms. Such datasets can be heterogeneous in terms of the error characteristics and sensor models. Treating such data is most naturally accomplished using a Bayesian or model-based geostatistical approach; however, such methods generally scale rather badly with the size of the dataset, and require computationally expensive Monte Carlo based inference. Recently within the machine learning and spatial statistics communities many papers have explored the potential of reduced rank representations of the covariance matrix, often referred to as projected or fixed rank approaches. In such methods the covariance function of the posterior process is represented by a reduced rank approximation which is chosen such that there is minimal information loss. In this paper a sequential Bayesian framework for inference in such projected processes is presented. The observations are considered one at a time, which avoids the need for the high dimensional integrals typically required in a Bayesian approach. A C++ library, gptk, which is part of the INTAMAP web service, is introduced; it implements projected, sequential estimation and adds several novel features. In particular the library includes the ability to use a generic observation operator, or sensor model, to permit data fusion. It is also possible to cope with a range of observation error characteristics, including non-Gaussian observation errors. Inference for the covariance parameters is explored, including the impact of the projected process approximation on likelihood profiles. We illustrate the projected sequential method in application to synthetic and real datasets. Limitations and extensions are discussed. © 2010 Elsevier Ltd.
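A minimal sketch of a sequential projected-process update on synthetic data, with a kernel, inducing points and noise level chosen purely for illustration (gptk itself is a C++ library; this analogue is in Python and much simpler): the posterior over the inducing values is refreshed one observation at a time by a rank-one Kalman step, so no high-dimensional integral is needed.

    import numpy as np

    # Illustrative sequential update for a projected (reduced-rank) GP.
    # Kernel, inducing points and noise level are invented for the sketch.

    def rbf(a, b, ell=0.2):
        return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

    rng = np.random.default_rng(0)
    Z = np.linspace(0, 1, 10)                  # inducing inputs
    Kuu = rbf(Z, Z) + 1e-6 * np.eye(len(Z))    # jitter for stability
    Kuu_inv = np.linalg.inv(Kuu)
    m, S = np.zeros(len(Z)), Kuu.copy()        # prior on u: N(0, Kuu)
    noise = 0.05                               # observation noise variance

    X = rng.uniform(0, 1, 200)
    y = np.sin(2 * np.pi * X) + rng.normal(0, np.sqrt(noise), X.size)

    for xi, yi in zip(X, y):
        a = Kuu_inv @ rbf(Z, np.array([xi]))[:, 0]  # projection weights
        h = S @ a
        g = h / (a @ h + noise)                     # Kalman gain
        m = m + g * (yi - a @ m)                    # mean update
        S = S - np.outer(g, h)                      # covariance downdate

    Xs = np.linspace(0, 1, 5)                       # predictive mean at test points
    print(((Kuu_inv @ rbf(Z, Xs)).T @ m).round(2))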
Abstract:
This paper addresses a problem with an argument in Kranich, Perea, and Peters (2005) supporting their definition of the Weak Sequential Core and their characterization result. We also provide the remedy, a modification of the definition, to rescue the characterization.
Abstract:
This paper uses self-efficacy to predict the success of women in introductory physics. We show how sequential logistic regression demonstrates the predictive ability of self-efficacy, and reveals variations with type of physics course. Also discussed are the sources of self-efficacy that have the largest impact on predictive ability.
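For readers unfamiliar with the technique, a small sketch of sequential (block-wise) logistic regression on synthetic data, with invented predictor blocks and effect sizes: a background block enters first, a self-efficacy score enters second, and a likelihood-ratio test measures the added predictive ability.

    import numpy as np
    import statsmodels.api as sm
    from scipy.stats import chi2

    # Sequential logistic regression sketch; variables and effect sizes
    # are invented, not the study's data.

    rng = np.random.default_rng(1)
    n = 500
    background = rng.normal(size=(n, 2))   # e.g. preparation measures
    self_eff = rng.normal(size=(n, 1))     # self-efficacy score
    logit = 0.4 * background[:, 0] + 0.9 * self_eff[:, 0]
    success = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    X1 = sm.add_constant(background)                         # step 1
    X2 = sm.add_constant(np.hstack([background, self_eff]))  # step 2

    fit1 = sm.Logit(success, X1).fit(disp=0)
    fit2 = sm.Logit(success, X2).fit(disp=0)

    lr = 2 * (fit2.llf - fit1.llf)   # likelihood-ratio statistic, 1 df
    print(f"LR = {lr:.2f}, p = {chi2.sf(lr, df=1):.4g}")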
Abstract:
Bicellar lipid mixture dispersions progressively coalesce to larger structures on warming. This phase behaviour is particularly sensitive to interactions that perturb bilayer properties. In this study, ²H NMR was used to study the perturbation of bicellar lipid mixtures by two peptides (SP-B₆₃₋₇₈, a lung surfactant protein fragment, and Magainin 2, an antimicrobial peptide) which are structurally similar. Particular attention was paid to the relation between peptide-induced perturbation and lipid composition. In bicellar dispersions containing only zwitterionic lipids (DMPC-d₅₄/DMPC/DHPC (3:1:1)) both peptides had little to no effect on the temperature at which coalescence to larger structures occurred. Conversely, in mixtures containing anionic lipids (DMPC-d₅₄/DMPG/DHPC (3:1:1)), both peptides modified bicellar phase behaviour. In mixtures containing SP-B₆₃₋₇₈, the presence of peptide decreased the temperature of the ribbon-like to extended lamellar phase transition. The addition of Magainin 2 to DMPC-d₅₄/DMPG/DHPC (3:1:1) mixtures, in contrast, increased the temperature of this transition and yielded a series of spectra resembling DMPC/DHPC (4:1) mixtures. Additional studies of lipid dispersions containing deuterated anionic lipids were done to determine whether the observed perturbation involved a peptide-induced separation of zwitterionic and anionic lipids. Comparison of DMPC/DMPG-d₅₄/DHPC (3:1:1) and DMPC-d₅₄/DMPG/DHPC (3:1:1) mixtures showed that DMPC and DMPG occupy similar environments in the presence of SP-B₆₃₋₇₈, but different lipid environments in the presence of Magainin 2. This might reflect the promotion of anionic lipid clustering by Magainin 2. These results demonstrate the variability of mechanisms of peptide-induced perturbation and suggest that lipid composition is an important factor in the peptide-induced perturbation of lipid structures.
Abstract:
Acknowledgments: Y.Y. acknowledges the financial support of the “973” Program (2012CB721006) and the National Natural Science Foundation of China (31570033). R.E., K.K., H.D., and M.J. acknowledge the financial support of the Leverhulme Trust-Royal Society Africa Award (AA090088).
Abstract:
The authors would like to express their gratitude to organizations and people that supported this research. Piotr Omenzetter’s work within the Lloyd’s Register Foundation Centre for Safety and Reliability Engineering at the University of Aberdeen is supported by Lloyd’s Register Foundation. The Foundation helps to protect life and property by supporting engineering-related education, public engagement and the application of research. Ben Ryder of Aurecon and Graeme Cummings of HEB Construction assisted in obtaining access to the bridge and information for modelling. Luke Williams and Graham Bougen, undergraduate research students, assisted with testing.
Abstract:
Advances in the three related areas of state-space modeling, sequential Bayesian learning, and decision analysis are addressed, together with the statistical challenges of scalability and associated dynamic sparsity. The key theme that ties the three areas together is Bayesian model emulation: solving challenging analytical and computational problems using creative model emulators. This idea defines theoretical and applied advances in non-linear, non-Gaussian state-space modeling, dynamic sparsity, decision analysis and statistical computation, across the linked contexts of multivariate time series and dynamic network studies. Examples and applications in financial time series and portfolio analysis, macroeconomics, and internet studies from computational advertising demonstrate the utility of the core methodological innovations.
Chapter 1 summarizes the three areas/problems and the key idea of emulation in those areas. Chapter 2 discusses the sequential analysis of latent threshold models, using emulating models that allow for analytical filtering to enhance the efficiency of posterior sampling. Chapter 3 examines the emulator model in decision analysis, or the synthetic model, that is equivalent to the loss function in the original minimization problem, and shows its performance in the context of sequential portfolio optimization. Chapter 4 describes a method for modeling streaming count data observed on a large network that relies on emulating the whole, dependent network model by independent, conjugate sub-models customized to each set of flows. Chapter 5 reviews these advances and makes concluding remarks.
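A loose illustration of Chapter 4's decoupling idea, under the assumption that each flow's counts are filtered by its own conjugate Gamma-Poisson sub-model (the discount factor and data below are invented, not the dissertation's model):

    # Each network flow gets its own Gamma-Poisson filter for its count
    # rate, updated in closed form as counts stream in; dependence across
    # flows is deliberately ignored, which is the emulation step here.
    # The discount factor and the stream are invented for illustration.

    DISCOUNT = 0.95  # forgets old counts so rates can drift over time

    class FlowRate:
        def __init__(self, a=1.0, b=1.0):
            self.a, self.b = a, b                # Gamma(a, b) posterior on the rate

        def update(self, count):
            self.a = DISCOUNT * self.a + count   # discounted conjugate update
            self.b = DISCOUNT * self.b + 1.0
            return self.a / self.b               # posterior mean rate

    flows = {}
    stream = [("u1", "site_a", 3), ("u2", "site_a", 1), ("u1", "site_a", 5)]
    for src, dst, count in stream:
        rate = flows.setdefault((src, dst), FlowRate()).update(count)
        print(src, "->", dst, "estimated rate:", round(rate, 2))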
Abstract:
The files accompanying my document were produced with the Mathematica software.
Abstract:
Phosphorus is an essential nutrient for life. In the ocean, phosphorus burial regulates marine primary production [1,2]. Phosphorus is removed from the ocean by sedimentation of organic matter, and the subsequent conversion of organic phosphorus to phosphate minerals such as apatite, and ultimately phosphorite deposits [3,4]. Bacteria are thought to mediate these processes [5], but the mechanism of sequestration has remained unclear. Here, we present results from laboratory incubations in which we labelled organic-rich sediments from the Benguela upwelling system, Namibia, with a ³³P radiotracer, and tracked the fate of the phosphorus. We show that under both anoxic and oxic conditions, large sulphide-oxidizing bacteria accumulate ³³P in their cells, and catalyse the nearly instantaneous conversion of phosphate to apatite. Apatite formation was greatest under anoxic conditions. Nutrient analyses of Namibian upwelling waters and sediments suggest that the rate of phosphate-to-apatite conversion beneath anoxic bottom waters exceeds the rate of phosphorus release during organic matter mineralization in the upper sediment layers. We suggest that bacterial apatite formation is a significant phosphorus sink under anoxic bottom-water conditions. Expanding oxygen minimum zones are projected in simulations of future climate change [6], potentially increasing sequestration of marine phosphate, and restricting marine productivity.