24 results for COMPLEX METHOD

in Aston University Research Archive


Relevance:

40.00%

Abstract:

Objectives: To develop a tool for the accurate reporting and aggregation of findings from each of the multiple methods used in a complex evaluation in an unbiased way. Study Design and Setting: We developed a Method for Aggregating The Reporting of Interventions in Complex Studies (MATRICS) within a gastroenterology study [Evaluating New Innovations in (the delivery and organisation of) Gastrointestinal (GI) endoscopy services by the NHS Modernisation Agency (ENIGMA)]. We subsequently tested it on a different gastroenterology trial [Multi-Institutional Nurse Endoscopy Trial (MINuET)]. We created three layers to define the effects, methods, and findings from ENIGMA. We assigned numbers to each effect in layer 1 and letters to each method in layer 2. We then assigned an alphanumeric code based on layers 1 and 2 to every finding in layer 3, linking the aims, methods, and findings. We illustrated analogous findings by assigning more than one alphanumeric code to a finding. We also showed that more than one effect or method could report the same finding. We presented contradictory findings by listing them in adjacent rows of the MATRICS. Results: MATRICS was useful for the effective synthesis and presentation of findings of the multiple methods from ENIGMA. We subsequently successfully tested it by applying it to the MINuET trial. Conclusion: MATRICS is effective for synthesizing the findings of complex, multiple-method studies.
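
The layered alphanumeric coding can be illustrated with a minimal sketch in Python; the effects, methods, and findings below are hypothetical placeholders, not the actual ENIGMA content:

```python
# Minimal sketch of MATRICS-style alphanumeric coding (hypothetical content).
# Layer 1: effects are numbered; layer 2: methods are lettered;
# layer 3: each finding carries one or more layer-1 + layer-2 codes.

effects = {1: "Waiting times", 2: "Patient satisfaction"}            # layer 1
methods = {"A": "Routine data audit", "B": "Patient questionnaire"}  # layer 2

# A finding tagged "1A" links effect 1 with method A; multiple codes on one
# finding mark analogous results reported by more than one effect or method.
findings = [
    {"codes": ["1A"], "text": "Median waiting time fell after redesign."},
    {"codes": ["2B", "1B"], "text": "Patients reported shorter perceived waits."},
]

for f in findings:
    for code in f["codes"]:
        effect, method = int(code[:-1]), code[-1]
        print(f"{code}: [{effects[effect]} / {methods[method]}] {f['text']}")
```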

Relevance:

30.00%

Abstract:

This thesis presents an investigation of synchronisation and causality, motivated by problems in computational neuroscience. The thesis addresses both theoretical and practical signal processing issues regarding the estimation of interdependence from a set of multivariate data generated by a complex underlying dynamical system. This topic is driven by a series of problems in neuroscience, which represent the principal motivation behind the material in this work. The underlying system is the human brain, and the data are generated by modern electromagnetic neuroimaging methods. In this thesis, the brain's underlying functional mechanisms are modelled using the recent mathematical formalism of dynamical systems on complex networks. This is justified principally on the grounds of the complex hierarchical and multiscale nature of the brain, and it offers new methods of analysis for modelling its emergent phenomena. A fundamental approach to studying neural activity is to investigate the connectivity pattern developed by the brain's complex network. Three types of connectivity are important to study: 1) anatomical connectivity, referring to the physical links forming the topology of the brain network; 2) effective connectivity, concerned with the way the neural elements communicate with each other through the brain's anatomical structure, via phenomena of synchronisation and information transfer; and 3) functional connectivity, an epistemic concept referring to the interdependence between data measured from the brain network. The main contribution of this thesis is to present, apply, and discuss novel functional connectivity algorithms designed to extract different specific aspects of interaction between the underlying generators of the data. Firstly, a univariate statistic is developed to allow indirect assessment of synchronisation in the local network from a single time series. This approach is useful for inferring the coupling in a local cortical area as observed by a single measurement electrode. Secondly, different existing methods of phase synchronisation are considered from the perspective of experimental data analysis and inference of coupling from observed data. These methods are designed to address the estimation of medium- to long-range connectivity, and their differences are particularly relevant in the context of volume conduction, which is known to produce spurious detections of connectivity. Finally, an asymmetric temporal metric is introduced to detect the direction of the coupling between different regions of the brain; the method developed in this thesis is based on a machine-learning extension of the well-known concept of Granger causality. The discussion is developed alongside examples of synthetic and real experimental data. The synthetic data are simulations of complex dynamical systems intended to mimic the behaviour of simple cortical neural assemblies, and are used to test the techniques developed in this thesis. The real datasets illustrate the problem of brain connectivity in the case of important neurological disorders such as epilepsy and Parkinson's disease. The methods of functional connectivity in this thesis are applied to intracranial EEG recordings in order to extract features that characterise the underlying spatiotemporal dynamics before, during, and after an epileptic seizure, and to predict seizure location and onset prior to conventional electrographic signs. The methodology is also applied to a MEG dataset containing healthy, Parkinson's, and dementia subjects, with the aim of distinguishing pathological from physiological patterns of connectivity.
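
As one concrete instance of the phase-synchronisation methods discussed, the phase-locking value (PLV) between two signals can be estimated from their instantaneous phases via the Hilbert transform. A minimal sketch on synthetic signals (not the thesis's datasets) follows:

```python
# Phase-locking value (PLV): a standard phase-synchronisation measure.
# Synthetic example: two noisy 10 Hz oscillations with a fixed phase lag.
import numpy as np
from scipy.signal import hilbert

fs = 500                              # sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)          # 10 s of data
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
y = np.sin(2 * np.pi * 10 * t + 0.8) + 0.5 * rng.standard_normal(t.size)

# Instantaneous phases from the analytic (Hilbert-transformed) signals.
phi_x = np.angle(hilbert(x))
phi_y = np.angle(hilbert(y))

# PLV = modulus of the mean unit phasor of the phase difference;
# 1 means perfect phase locking, 0 means no consistent phase relation.
plv = np.abs(np.mean(np.exp(1j * (phi_x - phi_y))))
print(f"PLV = {plv:.2f}")
```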

Relevance:

30.00%

Abstract:

Computer simulated trajectories of bulk water molecules form complex spatiotemporal structures at the picosecond time scale. This intrinsic complexity, which underlies the formation of molecular structures at longer time scales, has been quantified using a measure of statistical complexity. The method estimates the information contained in the molecular trajectory by detecting and quantifying temporal patterns present in the simulated data (velocity time series). Two types of temporal patterns are found. The first, defined by the short-time correlations corresponding to the velocity autocorrelation decay times (≈0.1 ps), remains asymptotically stable for time intervals longer than several tens of nanoseconds. The second is caused by previously unknown longer-time correlations (found at longer than nanosecond time scales), leading to a value of statistical complexity that slowly increases with time. A direct measure is introduced, based on the notion of statistical complexity, that describes how the trajectory explores the phase space and is independent of the particular molecular signal used as the observed time series. © 2008 The American Physical Society.
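
The paper's specific statistical-complexity estimator is defined in the original text; as a rough illustration of detecting temporal patterns in a velocity time series, the sketch below uses permutation entropy, a related but simpler pattern-based measure (a stand-in, not the authors' method):

```python
# Permutation entropy of a (synthetic) velocity time series: a simple
# stand-in for pattern-based complexity measures, not the paper's estimator.
import math
from itertools import permutations
import numpy as np

def permutation_entropy(x, order=3):
    # Relative frequency of each ordinal pattern of length `order`,
    # normalised so that white noise scores near 1.
    counts = {p: 0 for p in permutations(range(order))}
    for i in range(len(x) - order + 1):
        counts[tuple(map(int, np.argsort(x[i:i + order])))] += 1
    p = np.array([c for c in counts.values() if c > 0], dtype=float)
    p /= p.sum()
    return -np.sum(p * np.log2(p)) / math.log2(math.factorial(order))

rng = np.random.default_rng(0)
velocity = rng.standard_normal(10_000)  # stand-in for a velocity time series
print(f"{permutation_entropy(velocity):.3f}")  # ~1.0; patterned signals score lower
```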

Relevance:

30.00%

Abstract:

A recently proposed colour-based tracking algorithm has been shown to track objects in real-world conditions [Zivkovic, Z., Krose, B. 2004. An EM-like algorithm for color-histogram-based object tracking. In: Proc. IEEE Conf. on Computer Vision and Pattern Recognition, pp. 798-803]. To improve the performance of this technique in complex scenes, in this paper we propose a new algorithm for optimally adapting the ellipse outlining the objects of interest. This paper presents a Lagrangian-based method to integrate a regularising component into the covariance matrix to be computed. Technically, we intend to reduce the residuals between the estimated probability distribution and the expected one. We argue that, by doing this, the shape of the ellipse can be properly adapted in the tracking stage. Experimental results show that the proposed method performs favourably in shape adaptation and object localisation.
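
The paper's Lagrangian formulation is specific to its tracker; the general idea of regularising an estimated covariance so the outlining ellipse stays well conditioned can be sketched as a simple shrinkage step (an illustrative stand-in under that assumption, not the paper's exact update):

```python
# Shrinkage-style regularisation of a sample covariance: blend the raw
# estimate with a scaled identity so the ellipse axes stay well conditioned.
import numpy as np

def regularise_cov(samples, lam=0.1):
    """lam in [0, 1] weights the regularising (identity) component."""
    cov = np.cov(samples, rowvar=False)
    target = np.trace(cov) / cov.shape[0] * np.eye(cov.shape[0])
    return (1 - lam) * cov + lam * target

# Pixel coordinates of candidate object pixels (hypothetical tracker output).
rng = np.random.default_rng(1)
pixels = rng.multivariate_normal([40, 60], [[25, 12], [12, 9]], size=200)
cov = regularise_cov(pixels)

# Eigen-decomposition of `cov` gives the ellipse orientation and axes.
axes, directions = np.linalg.eigh(cov)
print(np.sqrt(axes))   # semi-axis lengths (1 sigma) of the adapted ellipse
```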

Relevance:

30.00%

Abstract:

Experiments combining different groups or factors and which use ANOVA are a powerful method of investigation in applied microbiology. ANOVA enables not only the effect of individual factors to be estimated but also their interactions; information which cannot be obtained readily when factors are investigated separately. In addition, combining different treatments or factors in a single experiment is more efficient and often reduces the sample size required to estimate treatment effects adequately. Because of the treatment combinations used in a factorial experiment, the degrees of freedom (DF) of the error term in the ANOVA is a more important indicator of the ‘power’ of the experiment than the number of replicates. A good method is to ensure, where possible, that sufficient replication is present to achieve 15 DF for the error term of the ANOVA testing effects of particular interest. Finally, it is important to always consider the design of the experiment because this determines the appropriate ANOVA to use. Hence, it is necessary to be able to identify the different forms of ANOVA appropriate to different experimental designs and to recognise when a design is a split-plot or incorporates a repeated measure. If there is any doubt about which ANOVA to use in a specific circumstance, the researcher should seek advice from a statistician with experience of research in applied microbiology.
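
For instance, the error DF of a two-factor factorial ANOVA can be checked directly. A minimal sketch using Python's statsmodels follows; the data are synthetic and the factor names (medium, temperature) are hypothetical:

```python
# Two-factor factorial ANOVA on synthetic data; the residual row's DF is
# the 'error DF' the text recommends keeping at around 15 or more.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(0)
# 2 media x 3 temperatures x 4 replicates = 24 observations.
df = pd.DataFrame([(m, t, rng.normal(5 + (m == "B") + 0.5 * t))
                   for m in "AB" for t in (20, 30, 37) for _ in range(4)],
                  columns=["medium", "temp", "growth"])

# Full factorial model: main effects plus the medium x temp interaction.
model = ols("growth ~ C(medium) * C(temp)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
# Error (residual) DF = 24 - 6 model parameters = 18, comfortably above 15.
```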

Relevance:

30.00%

Abstract:

Quantum dots (Qdots) are fluorescent nanoparticles that have great potential as detection agents in biological applications. Their optical properties, including photostability and narrow, symmetrical emission bands with large Stokes shifts, and the potential for multiplexing of many different colours, give them significant advantages over traditionally used fluorescent dyes. Here, we report the straightforward generation of stable, covalent quantum dot-protein A/G bioconjugates that can bind to almost any IgG antibody and can therefore be used in many applications. An additional advantage is that the requirement for a secondary antibody is removed, simplifying experimental design. To demonstrate their use, we show their application in multiplexed western blotting. The sensitivity of Qdot conjugates is found to be superior to fluorescent dyes, and comparable to, or potentially better than, enhanced chemiluminescence. We show a true biological validation using a four-colour multiplexed western blot against a complex cell lysate background, and have significantly reduced the previously reported non-specific binding of the Qdots to cellular proteins.

Relevance:

30.00%

Abstract:

In this paper we present a novel method for emulating a stochastic, or random-output, computer model and show its application to a complex rabies model. The method is evaluated in terms of both accuracy and computational efficiency on synthetic data and on the rabies model. We address the issue of experimental design and provide empirical evidence on the effectiveness of utilizing replicate model evaluations compared to a space-filling design. We employ the Mahalanobis error measure to validate the heteroscedastic Gaussian-process-based emulator predictions for both the mean and (co)variance. The emulator allows efficient screening to identify important model inputs and a better understanding of the complex behaviour of the rabies model.
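
The Mahalanobis error measure used for validation compares held-out model runs with the emulator's predictive mean and covariance. A minimal sketch follows; the predictions are hypothetical stand-ins, not the rabies emulator itself:

```python
# Mahalanobis distance between validation outputs and emulator predictions.
# For a well-calibrated Gaussian emulator, D^2 is roughly chi-squared with
# n degrees of freedom, so values near n indicate a valid emulator.
import numpy as np

def mahalanobis_sq(y_valid, mean, cov):
    resid = y_valid - mean
    return float(resid @ np.linalg.solve(cov, resid))

n = 20                                   # number of validation points
rng = np.random.default_rng(0)
mean = rng.normal(size=n)                # hypothetical emulator predictive mean
cov = 0.1 * np.eye(n)                    # hypothetical predictive covariance
y_valid = rng.multivariate_normal(mean, cov)   # stand-in for held-out model runs

d2 = mahalanobis_sq(y_valid, mean, cov)
print(f"D^2 = {d2:.1f} (expect ~{n} for a valid emulator)")
```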

Relevance:

30.00%

Abstract:

Numerical techniques have been finding increasing use in all aspects of fracture mechanics, and often provide the only means for analyzing fracture problems. The work presented here is concerned with the application of the finite element method to cracked structures. The present work was directed towards the establishment of a comprehensive two-dimensional finite element, linear elastic, fracture analysis package. Significant progress has been made to this end, and features which can now be studied include multi-crack tip mixed-mode problems involving partial crack closure. The crack tip core element was refined, and special local crack tip elements were employed to reduce the element density in the neighbourhood of the core region. The work builds upon experience gained by previous research workers and, as part of the general development, the program was modified to incorporate the eight-node isoparametric quadrilateral element. Also, a more flexible solving routine was developed, which provided a very compact method of solving large sets of simultaneous equations stored in a segmented form. To complement the finite element analysis programs, an automatic mesh generation program has been developed, which enables complex problems involving fine element detail to be investigated with a minimum of input data. The scheme has proven to be versatile and reasonably easy to implement. Numerous examples are given to demonstrate the accuracy and flexibility of the finite element technique.

Relevance:

30.00%

Abstract:

We present experimental studies and numerical modeling, based on a combination of the bidirectional beam propagation method and finite element modeling, that completely describe the wavelength spectra of point-by-point femtosecond-laser-inscribed fiber Bragg gratings, showing excellent agreement with experiment. We have investigated the dependence of different spectral parameters, such as insertion loss and all dominant cladding and ghost modes, and their shape relative to the position of the fiber Bragg grating in the core of the fiber. Our model is validated by comparing its predictions with experimental data and allows for predictive modeling of the gratings. We expand our analysis to more complicated structures, where we introduce symmetry breaking; this highlights the importance of centered gratings and how maintaining symmetry contributes to the overall spectral quality of the inscribed Bragg gratings. Finally, the numerical modeling is applied to superstructure gratings, and a comparison with experimental results reveals a capability for dealing with complex grating structures that can be designed with particular wavelength characteristics.
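
The paper's bidirectional BPM/FEM model is needed to capture cladding and ghost modes; for orientation, the core-mode reflection spectrum of a uniform grating follows from standard coupled-mode theory, sketched below (a textbook weak-grating approximation with hypothetical parameters, not the authors' full model):

```python
# Core-mode reflectivity of a uniform fibre Bragg grating from standard
# coupled-mode theory (textbook result; not the bidirectional BPM/FEM model).
import numpy as np

n_eff, period, L = 1.447, 535.5e-9, 5e-3     # hypothetical grating parameters
dn = 1e-4                                    # index-modulation amplitude
lam = np.linspace(1548e-9, 1552e-9, 2000)    # wavelength scan (m)

lam_bragg = 2 * n_eff * period               # Bragg condition
kappa = np.pi * dn / lam                     # AC coupling coefficient
detune = 2 * np.pi * n_eff * (1 / lam - 1 / lam_bragg)   # detuning
gamma = np.sqrt(kappa**2 - detune**2 + 0j)   # complex sqrt covers both regimes

# R(lambda) = |kappa sinh(gamma L)|^2 / |detune sinh(gamma L) + i gamma cosh(gamma L)|^2
R = (np.abs(kappa * np.sinh(gamma * L))**2 /
     np.abs(detune * np.sinh(gamma * L) + 1j * gamma * np.cosh(gamma * L))**2)
print(f"Peak reflectivity {R.max():.3f} at {lam[R.argmax()] * 1e9:.2f} nm")
```

At the Bragg wavelength this reduces to the familiar peak reflectivity tanh²(κL), a useful sanity check on the computed spectrum.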

Relevance:

30.00%

Abstract:

Thirteen experiments investigated the dynamics of stream segregation. Experiments 1-6b used a similar method, in which a same-frequency induction sequence (usually 10 repetitions of an identical pure tone) promoted segregation in a subsequent, briefer test sequence (of alternating low- and high-frequency tones). Experiments 1-2 measured streaming using a direct report of perception and a temporal-discrimination task, respectively. Creating a single deviant by altering the final inducer (e.g. in level, or by replacement with silence) reduced segregation, often substantially. As the prior inducers remained unaltered, it is proposed that the single change actively reset build-up. The extent of resetting varied gradually with the size of a frequency change, once noticeable (experiments 3a-3b). By manipulating the serial position of a change, experiments 4a-4b demonstrated that resetting only occurred when the final inducer was replaced with silence, as build-up is very rapid during a same-frequency induction sequence; therefore, the observed resetting cannot be explained by fewer inducers being presented. Experiment 5 showed that resetting caused by a single deviant did not increase when prior inducers were made unpredictable in frequency (four-semitone range). Experiments 6a-6b demonstrated that actual and perceived continuity have a similar effect on subsequent streaming judgements, promoting either integration or segregation depending on listening context. Experiment 7 found that same-frequency inducers were considerably more effective at promoting segregation than an alternating-frequency inducer, and that a trend for deviant-tone resetting was only apparent for the same-frequency case. Using temporal-order judgments, experiments 8-9 demonstrated the stream segregation of pure-tone-like percepts, evoked by sudden changes in amplitude or interaural time difference for individual components of a complex tone. Active resetting was observed when a deviant was inserted into a sequence of these percepts (experiment 10). Overall, these experiments offer new insight into the segregation-promoting effect of induction sequences, and the factors which can reset this effect.
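
The induction-plus-test paradigm can be made concrete with a stimulus-synthesis sketch: ten identical low-frequency inducer tones followed by a brief alternating low-high test sequence. The parameter values below are illustrative, not the experiments' exact settings:

```python
# Synthesise an induction sequence (10 identical pure tones) followed by a
# short alternating low/high test sequence; parameter values illustrative.
import numpy as np

fs = 44_100  # sampling rate (Hz)

def tone(freq, dur=0.1, gap=0.02):
    t = np.arange(int(fs * dur)) / fs
    ramp = np.minimum(1, np.minimum(t, t[::-1]) / 0.01)  # 10 ms on/off ramps
    return np.concatenate([np.sin(2 * np.pi * freq * t) * ramp,
                           np.zeros(int(fs * gap))])

low, high = 500.0, 750.0                      # ~7-semitone separation
induction = np.concatenate([tone(low)] * 10)  # same-frequency inducers
test = np.concatenate([tone(f) for f in [low, high] * 3])  # alternating LHLHLH
stimulus = np.concatenate([induction, test])
print(f"{stimulus.size / fs:.2f} s stimulus")
# Replacing the final inducer with silence models the 'deviant' that resets
# build-up:  induction[-tone(low).size:] = 0.0
```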

Relevance:

30.00%

Abstract:

The DNA-binding fusion protein LacI-His6-GFP, together with the conjugate PEG-IDA-Cu(II) (10 kDa), was evaluated as a dual affinity system for pUC19 plasmid extraction from an alkaline bacterial cell lysate in poly(ethylene glycol) (PEG)/dextran (DEX) aqueous two-phase systems (ATPS). In a PEG 600-DEX 40 ATPS containing 0.273 nmol of LacI fusion protein and 0.14% (w/w) of the functionalised PEG-IDA-Cu(II), more than 72% of the plasmid DNA partitioned to the PEG phase, without RNA or genomic DNA contamination as evaluated by agarose gel electrophoresis. In a second extraction stage, the elution of pDNA from the LacI binding complex proved difficult using either dextran or phosphate buffer as the second phase, though more than 75% of the overall protein was removed in both systems. A maximum recovery of approximately 27% of the pUC19 plasmid was achieved using the PEG-dextran system as a second extraction system, with 80-90% of pDNA partitioning to the bottom phase. This represents about 7.4 µg of pDNA extracted per 1 mL of pUC19 desalted lysate.

Relevance:

30.00%

Abstract:

Background - Lung cancer is the commonest cause of cancer in Scotland and is usually advanced at diagnosis. Median time between symptom onset and consultation is 14 weeks, so an intervention to prompt earlier presentation could support earlier diagnosis and enable curative treatment in more cases. Aim - To develop and optimise an intervention to reduce the time between onset and first consultation with symptoms that might indicate lung cancer. Design and setting - Iterative development of a complex healthcare intervention according to the MRC Framework, conducted in Northeast Scotland. Method - The study produced a complex intervention to promote early presentation of lung cancer symptoms. An expert multidisciplinary group developed the first draft of the intervention based on theory and existing evidence. This was refined following focus groups with health professionals and high-risk patients. Results - First-draft intervention components included: information communicated persuasively, demonstrations of early consultation and its benefits, behaviour change techniques, and involvement of spouses/partners. Focus groups identified patient engagement, achieving behavioural change, and conflict at the patient–general practice interface as challenges, and measures were incorporated to tackle these. Final intervention delivery included a detailed self-help manual and an extended consultation with a trained research nurse at which specific action plans were devised. Conclusion - The study has developed an intervention that appeals to patients and health professionals and has theoretical potential for benefit. It now requires evaluation.

Relevance:

30.00%

Abstract:

Renewable energy project development is highly complex and success is by no means guaranteed. Decisions are often made with approximate or uncertain information, yet the methods currently employed by decision-makers do not necessarily accommodate this. Levelised energy cost (LEC) is one such commonly applied measure utilised within the energy industry to assess the viability of potential projects and inform policy. This research proposes a method for accommodating such uncertainty by enhancing the traditional discounting LEC measure with fuzzy set theory. Furthermore, the research develops the fuzzy LEC (F-LEC) methodology to incorporate the cost of financing a project from debt and equity sources. Applied to an example bioenergy project, the research demonstrates the benefit of incorporating fuzziness for project viability, optimal capital structure, and key-variable sensitivity analysis decision-making. The proposed method contributes by incorporating uncertain and approximate information into the widely utilised LEC measure, and is applicable to a wide range of energy project viability decisions. © 2013 Elsevier Ltd. All rights reserved.
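
The underlying LEC calculation discounts lifetime costs and energy output. Representing an uncertain input as a triangular fuzzy number and evaluating the ratio at the low, modal, and high points gives a simple flavour of the F-LEC idea; the sketch below is an illustrative simplification with hypothetical figures, not the paper's full methodology:

```python
# Levelised energy cost with triangular-fuzzy annual costs: evaluate the
# discounted cost/energy ratio at low, modal, and high cost scenarios.
# (Illustrative simplification of F-LEC; parameter values hypothetical.)
import numpy as np

years, rate = 20, 0.08
energy = np.full(years, 12_000.0)            # MWh generated per year
cost_tri = (900e3, 1_000e3, 1_200e3)         # (low, mode, high) annual cost, GBP
discount = 1 / (1 + rate) ** np.arange(1, years + 1)

def lec(annual_cost):
    # LEC = sum of discounted costs / sum of discounted energy output
    return (annual_cost * discount).sum() / (energy * discount).sum()

low, mode, high = (lec(np.full(years, c)) for c in cost_tri)
print(f"F-LEC ~ triangular({low:.1f}, {mode:.1f}, {high:.1f}) GBP/MWh")
```

Because LEC is monotonic in cost here, the three scenario evaluations map directly onto the corners of the resulting triangular fuzzy LEC.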

Relevance:

30.00%

Abstract:

Quantitative structure-activity relationship (QSAR) analysis is a cornerstone of modern informatics. Predictive computational models of peptide-major histocompatibility complex (MHC)-binding affinity based on QSAR technology have now become important components of modern computational immunovaccinology. Historically, such approaches have been built around semiqualitative classification methods, but these are now giving way to quantitative regression methods. We review three methods: a 2D-QSAR additive partial least squares (PLS) method and a 3D-QSAR comparative molecular similarity index analysis (CoMSIA) method, which can identify the sequence dependence of peptide-binding specificity for various class I MHC alleles from the reported binding affinities (IC50) of peptide sets; and an iterative self-consistent (ISC) PLS-based additive method, a recently developed extension to the additive method for the affinity prediction of class II peptides. The QSAR methods presented here have established themselves as immunoinformatic techniques complementary to existing methodology, useful in the quantitative prediction of binding affinity: current methods for the in silico identification of T-cell epitopes (which form the basis of many vaccines, diagnostics, and reagents) rely on the accurate computational prediction of peptide-MHC affinity. We have reviewed various human and mouse class I and class II allele models. Studied alleles comprise HLA-A*0101, HLA-A*0201, HLA-A*0202, HLA-A*0203, HLA-A*0206, HLA-A*0301, HLA-A*1101, HLA-A*3101, HLA-A*6801, HLA-A*6802, HLA-B*3501, H2-K(k), H2-K(b), H2-D(b), HLA-DRB1*0101, HLA-DRB1*0401, HLA-DRB1*0701, I-A(b), I-A(d), I-A(k), I-A(S), I-E(d), and I-E(k). In this chapter we provide a step-by-step guide to making such predictions and assessing their reliability, and show that the resulting models represent an advance on existing methods. The peptides used in this study are available from the AntiJen database (http://www.jenner.ac.uk/AntiJen). The PLS method is available commercially in the SYBYL molecular modeling software package. The resulting models, which can be used for accurate T-cell epitope prediction, are freely available online at http://www.jenner.ac.uk/MHCPred.
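
The additive PLS approach regresses binding affinity on per-position amino acid contributions. A minimal sketch with scikit-learn's PLSRegression on randomly generated peptides follows; this is a toy stand-in illustrating the setup, not the reviewed models:

```python
# Toy additive-QSAR setup: one-hot encode each position of a 9-mer peptide
# and fit PLS regression to synthetic affinity values, as a stand-in sketch.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
amino_acids, pep_len, n_pep = 20, 9, 300
seqs = rng.integers(0, amino_acids, size=(n_pep, pep_len))

# Additive descriptor: 9 positions x 20 amino acids = 180 binary features.
X = np.zeros((n_pep, pep_len * amino_acids))
for i, seq in enumerate(seqs):
    X[i, np.arange(pep_len) * amino_acids + seq] = 1.0

# Synthetic affinities from hidden per-position contributions plus noise.
true_w = rng.normal(size=X.shape[1])
y = X @ true_w + 0.5 * rng.standard_normal(n_pep)

pls = PLSRegression(n_components=10).fit(X, y)
print(f"R^2 on training data: {pls.score(X, y):.2f}")
```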

Relevance:

30.00%

Abstract:

MEG beamformer algorithms work by making the assumption that correlated and spatially distinct local field potentials do not develop in the human brain. Despite this assumption, images produced by such algorithms concur with those from other non-invasive and invasive estimates of brain function. In this paper we set out to develop a method that can be applied to raw MEG data to test this assumption explicitly. We show that a promax rotation of MEG channel data can be used as an approximate estimator of the number of spatially distinct correlated sources in any frequency band.
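
A rough sketch of the idea follows, assuming the third-party factor_analyzer package for the promax rotation and synthetic channel data; the paper's own pipeline may differ:

```python
# Sketch: PCA of MEG channel data, then a promax (oblique) rotation of the
# retained loadings; the number of retained components approximates the
# number of spatially distinct correlated sources in the analysed band.
# Assumes the third-party `factor_analyzer` package for the rotation.
import numpy as np
from factor_analyzer.rotator import Rotator

rng = np.random.default_rng(0)
n_channels, n_samples = 30, 5000
sources = rng.standard_normal((3, n_samples))          # 3 hidden sources
mixing = rng.standard_normal((n_channels, 3))
data = mixing @ sources + 0.5 * rng.standard_normal((n_channels, n_samples))

# PCA: keep components above a simple mean-eigenvalue threshold
# (a crude retention rule used here purely for illustration).
cov = np.cov(data)
evals, evecs = np.linalg.eigh(cov)
keep = evals > evals.mean()
loadings = evecs[:, keep] * np.sqrt(evals[keep])

rotated = Rotator(method="promax").fit_transform(loadings)
print(f"Retained {keep.sum()} components after promax rotation")
```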