960 results for new methods


Relevance:

60.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Relevance:

60.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-04

Relevance:

60.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Relevance:

60.00%

Publisher:

Abstract:

The new methods of laser microdissection microscopy have received wide acceptance in biology and have been applied in a small number of parasitology investigations. Here, the techniques and applications of laser microdissection microscopy are reviewed with suggestions of how the systems might be used to explore applied questions in parasite molecular biology and host-parasite interactions.

Relevance:

60.00%

Publisher:

Abstract:

The development of new methods of producing hypersonic wind-tunnel flows at increasing velocities during the last few decades is reviewed, with attention to airbreathing propulsion, hypervelocity aerodynamics and superorbital aerodynamics. The role of chemical reactions in these flows leads to the use of a binary scaling simulation parameter, which can be related to the Reynolds number and which demands that smaller wind tunnels use higher reservoir pressure levels for simulation of flight phenomena. The use of combustion-heated vitiated wind tunnels for propulsive research is discussed, as well as the use of reflected shock tunnels for the same purpose. A flight experiment validating shock-tunnel results is described, and relevant developments in shock-tunnel instrumentation are outlined. The use of shock tunnels for hypervelocity testing is reviewed, noting the role of driver-gas contamination in determining test time and presenting examples of air dissociation effects on model flows. Extending the hypervelocity testing range into the superorbital regime with useful test times is seen to be possible by the use of expansion tubes/tunnels with a free-piston driver.
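
As a brief aside on the scaling argument (a standard binary-scaling result assumed here rather than quoted from the review): for flows dominated by two-body dissociation reactions, the product of freestream density and model length is held fixed,

```latex
% Binary scaling for flows dominated by two-body reactions,
% and its relation to the Reynolds number:
\rho_\infty L = \text{const.}, \qquad
\mathrm{Re} = \frac{\rho_\infty u_\infty L}{\mu_\infty}.
```

At matched flight velocity and temperature (hence matched viscosity), matching the product of density and length also matches the Reynolds number; halving the model scale then requires doubling the freestream density, which in a shock tunnel translates into a higher reservoir pressure.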

Relevance:

60.00%

Publisher:

Abstract:

Insects have a much smaller repertoire of voltage-gated calcium (Ca(v)) channels than vertebrates. Drosophila melanogaster harbors only a single ortholog of each of the vertebrate Ca(v)1, Ca(v)2, and Ca(v)3 subtypes, although its basal inventory is expanded by alternative splicing and editing of Ca(v) channel transcripts. Nevertheless, there appears to be little functional plasticity within this limited panel of insect Ca(v) channels, since severe loss-of-function mutations in genes encoding the pore-forming α1 subunits in Drosophila are embryonic lethal. Since the primary role of spider venom is to paralyze or kill insect prey, it is not surprising that most, if not all, spider venoms contain peptides that potently modify the activity of these functionally critical insect Ca(v) channels. Unfortunately, it has proven difficult to determine the precise ion channel subtypes recognized by these peptide toxins, since insect Ca(v) channels have significantly different pharmacology from their vertebrate counterparts, and cloned insect Ca(v) channels are not available for electrophysiological studies. However, biochemical and genetic studies indicate that some of these spider toxins might ultimately become the defining pharmacology for certain subtypes of insect Ca(v) channels. This review focuses on peptidic spider toxins that specifically target insect Ca(v) channels. In addition to providing novel molecular tools for ion channel characterization, some of these toxins are being used as leads to develop new methods for controlling insect pests.

Relevance:

60.00%

Publisher:

Abstract:

Infection of the external structures of the eye is one of the commonest types of eye disease worldwide. In addition, although the eye is relatively impermeable to microorganisms, infection within it can result from trauma, surgery or systemic disease. This article reviews the general biology of viruses, bacteria, fungi and protozoa, and the major ocular infections that they cause. The effectiveness of the various antimicrobial agents in controlling ocular disease is also discussed. Because of changes in the normal ocular flora, continuous monitoring of the microbiology of the eye will remain important in predicting future types of eye infection. Basic research is also needed into the interactions of microbes at the ocular surface. There is increasing microbial resistance to the antimicrobial agents used to treat ocular infections; hence, new antimicrobial agents will continue to be needed, together with new methods of drug delivery to increase the effectiveness of existing agents.

Relevance:

60.00%

Publisher:

Abstract:

Aim: To undertake a national study of teaching, learning and assessment in UK schools of pharmacy.

Design: Triangulation of course documentation, 24 semi-structured interviews with 29 representatives of the schools, and a survey of all final-year students (n = 1,847) in the 15 UK schools during 2003–04.

Subjects and setting: All established UK pharmacy schools and final-year MPharm students.

Outcome measures: Data were combined and analysed under the topics of curriculum, teaching and learning, assessment, multi-professional teaching and learning, placement education and research projects.

Results: Professional accreditation was the main driver for curriculum design, but links to preregistration training were poor. Curricula were consistent but offered little student choice. On average, half the curriculum was science-based; staff supported the science content but students did so less. Courses were didactic, though schools were experimenting with new methods of learning. Examinations were the principal form of assessment, but the contribution of practice to the final degree varied considerably (21–63%). Most students considered the assessment load to be about right, but with too much emphasis upon knowledge. Assessment of professional competence was focused upon dispensing and pharmacy law. All schools undertook placement teaching in hospitals, but there was little in community/primary care, and little inter-professional education. Resources and logistics were the major limiting factors.

Conclusions: There is a need for an integrated review of the accreditation process for the MPharm and preregistration training, and a redefinition of professional competence at undergraduate level.

Relevance:

60.00%

Publisher:

Abstract:

This thesis presents an investigation of synchronisation and causality, motivated by problems in computational neuroscience. The thesis addresses both theoretical and practical signal processing issues regarding the estimation of interdependence from a set of multivariate data generated by a complex underlying dynamical system. This topic is driven by a series of problems in neuroscience, which represent the principal background motive behind the material in this work. The underlying system is the human brain, and the generative process of the data is based on modern electromagnetic neuroimaging methods. In this thesis, the underlying functional mechanisms of the brain are described using the recent mathematical formalism of dynamical systems on complex networks. This is justified principally on the grounds of the complex hierarchical and multiscale nature of the brain, and it offers new methods of analysis with which to model its emergent phenomena.

A fundamental approach to studying neural activity is to investigate the connectivity pattern developed by the brain's complex network. Three types of connectivity are important to study: 1) anatomical connectivity, referring to the physical links forming the topology of the brain network; 2) effective connectivity, concerned with the way the neural elements communicate with each other through the brain's anatomical structure, via phenomena of synchronisation and information transfer; and 3) functional connectivity, an epistemic concept which refers to the interdependence between data measured from the brain network.

The main contribution of this thesis is to present, apply and discuss novel algorithms for functional connectivity, which are designed to extract different specific aspects of interaction between the underlying generators of the data. Firstly, a univariate statistic is developed to allow indirect assessment of synchronisation in the local network from a single time series. This approach is useful for inferring the coupling in a local cortical area as observed by a single measurement electrode. Secondly, different existing methods of phase synchronisation are considered from the perspective of experimental data analysis and inference of coupling from observed data. These methods are designed to address the estimation of medium- to long-range connectivity, and their differences are particularly relevant in the context of volume conduction, which is known to produce spurious detections of connectivity. Finally, an asymmetric temporal metric is introduced in order to detect the direction of the coupling between different regions of the brain. The method developed in this thesis is based on a machine learning extension of the well-known concept of Granger causality (a minimal illustration of the basic idea is sketched below).

The thesis discussion is developed alongside examples of synthetic and experimental real data. The synthetic data are simulations of complex dynamical systems intended to mimic the behaviour of simple cortical neural assemblies; they are helpful for testing the techniques developed in this thesis. The real datasets illustrate the problem of brain connectivity in the case of important neurological disorders such as epilepsy and Parkinson's disease. The methods of functional connectivity in this thesis are applied to intracranial EEG recordings in order to extract features which characterise the underlying spatiotemporal dynamics before, during and after an epileptic seizure, and to predict seizure location and onset prior to conventional electrographic signs. The methodology is also applied to an MEG dataset containing healthy, Parkinson's and dementia subjects, with the aim of distinguishing pathological from physiological patterns of connectivity.
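
As a minimal illustration of the underlying idea (a plain linear Granger test, not the machine-learning extension developed in the thesis; the signals and parameters below are synthetic):

```python
import numpy as np

def granger_score(x, y, order=5):
    """Crude Granger-causality score for x -> y: compare the residual
    variance of an AR model of y against one that also uses lags of x.
    A score well above zero suggests x's past helps predict y."""
    n = len(y)
    target = y[order:]
    # Lagged design matrices: y's own past, then y's and x's past.
    own = np.column_stack([y[order - k:n - k] for k in range(1, order + 1)])
    both = np.column_stack([own] +
                           [x[order - k:n - k] for k in range(1, order + 1)])
    res_own = target - own @ np.linalg.lstsq(own, target, rcond=None)[0]
    res_both = target - both @ np.linalg.lstsq(both, target, rcond=None)[0]
    return np.log(res_own.var() / res_both.var())

# Synthetic example: y is driven by x with a one-step delay.
rng = np.random.default_rng(0)
x = rng.standard_normal(2000)
y = np.zeros_like(x)
y[1:] = 0.6 * x[:-1]
y += 0.5 * rng.standard_normal(2000)
print(granger_score(x, y))  # clearly positive
print(granger_score(y, x))  # near zero
```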

Relevance:

60.00%

Publisher:

Abstract:

Over the last 30 years, the field of problem structuring methods (PSMs) has been pioneered by a handful of 'gurus', the most visible of whom have contributed other viewpoints to this special issue. As this generation slowly retires, it is opportune to survey the field and their legacy. We focus on the progress the community made up to 2000, as work that started afterwards is ongoing and its impact on the field will probably only become apparent in 5–10 years' time. We believe that up to 2000, research into PSMs was stagnating, partly because of a lack of new researchers entering what we call the 'grass-roots community': the community which takes an active role in developing the theory and application of problem structuring. Evidence for this stagnation (or lack of development) is that, in 2000, many PSMs still relied heavily on the same basic methods proposed by the originators nearly 30 years earlier, perhaps supported by computer software as the only sign of development. Furthermore, no new methods had been integrated into the literature, which suggests that revolutionary development, at least by academics, had stalled. Judging from the papers in this double special issue on PSMs, we are pleased to suggest that this trend is over: new authors report new PSMs and extend existing PSMs in new directions. Despite these recent developments, it is important to examine why this apparent stagnation took place. In the following sections, we identify and elaborate a number of reasons for it, and we consider the trends, challenges and opportunities that the PSM community will continue to face. Our aim is to evaluate the pre-2000 PSM community in order to encourage its revolutionary development post-2006 and to offer directions for the long-term sustainability of the field.

Relevance:

60.00%

Publisher:

Abstract:

This thesis is concerned with approximate inference in dynamical systems, from a variational Bayesian perspective. When modelling real-world dynamical systems, stochastic differential equations appear as a natural choice, mainly because of their ability to model the noise of the system by adding a variant of some stochastic process to the deterministic dynamics. Hence, inference in such processes has drawn much attention. Here two new extended frameworks are derived and presented that are based on basis-function expansions and local polynomial approximations of a recently proposed variational Bayesian algorithm. It is shown that the new extensions converge to the original variational algorithm and can be used for state estimation (smoothing). However, the main focus is on estimating the (hyper-) parameters of these systems (i.e. drift parameters and diffusion coefficients). The new methods are numerically validated on a range of different systems which vary in dimensionality and non-linearity. These are the Ornstein-Uhlenbeck process, for which the exact likelihood can be computed analytically; the univariate, highly non-linear stochastic double well; and the multivariate chaotic stochastic Lorenz '63 (3-dimensional) model. The algorithms are also applied to the 40-dimensional stochastic Lorenz '96 system. In this investigation these new approaches are compared with a variety of other well-known methods, such as the ensemble Kalman filter/smoother, a hybrid Monte Carlo sampler, the dual unscented Kalman filter (for jointly estimating the system's states and model parameters) and full weak-constraint 4D-Var. An empirical analysis of their asymptotic behaviour as the observation density or the length of the time window increases is also provided.
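
For concreteness, here is a minimal sketch (my own illustration, not the thesis code) of simulating the Ornstein-Uhlenbeck benchmark with an Euler-Maruyama scheme; the parameter values and grid are arbitrary choices:

```python
import numpy as np

def euler_maruyama_ou(theta, sigma, x0, dt, n_steps, rng):
    """Simulate dX = -theta * X dt + sigma dW on a uniform time grid."""
    x = np.empty(n_steps + 1)
    x[0] = x0
    for i in range(n_steps):
        x[i + 1] = (x[i] - theta * x[i] * dt
                    + sigma * np.sqrt(dt) * rng.standard_normal())
    return x

rng = np.random.default_rng(42)
path = euler_maruyama_ou(theta=2.0, sigma=1.0, x0=0.0,
                         dt=0.01, n_steps=5000, rng=rng)
```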

Relevance:

60.00%

Publisher:

Abstract:

The modelling of mechanical structures using finite element analysis has become an indispensable stage in the design of new components and products. Once the theoretical design has been optimised, a prototype may be constructed and tested. What can the engineer do if the measured and theoretically predicted vibration characteristics of the structure are significantly different? This thesis considers the problems of changing the parameters of the finite element model to improve the correlation between a physical structure and its mathematical model. Two new methods are introduced to perform the systematic parameter updating. The first uses the measured modal model to derive the parameter values with minimum variance. The user must provide estimates of the variance of the theoretical parameter values and the measured data. Previous authors using similar methods have assumed that the estimated parameters and measured modal properties are statistically independent; this will generally be the case during the first iteration, but not subsequently. The second method updates the parameters directly from the frequency response functions. The order of the finite element model of the structure is reduced as a function of the unknown parameters, and a method related to a weighted equation error algorithm is used to update them. After each iteration the weighting changes so that, on convergence, the output error is minimised. The suggested methods are extensively tested using simulated data, and an H-frame is then used to demonstrate the algorithms on a physical structure.
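
A minimal sketch of the kind of iterative update involved (a generic weighted least-squares step on natural frequencies with a finite-difference sensitivity; the predict function and toy numbers are assumptions, not the thesis's exact formulation):

```python
import numpy as np

def update_parameters(theta, f_measured, predict, n_iter=10, eps=1e-6):
    """Iteratively adjust FE model parameters so the predicted natural
    frequencies approach the measured ones.

    predict(theta) -> array of natural frequencies from the FE model
    (a stand-in for a full modal solution)."""
    theta = np.asarray(theta, dtype=float)
    for _ in range(n_iter):
        f_model = predict(theta)
        residual = f_measured - f_model
        # Finite-difference sensitivity matrix S = d f / d theta.
        S = np.empty((len(f_model), len(theta)))
        for j in range(len(theta)):
            step = np.zeros_like(theta)
            step[j] = eps
            S[:, j] = (predict(theta + step) - f_model) / eps
        # Weighted least-squares update; W would normally encode the
        # assumed variances of the measurements and parameters.
        W = np.eye(len(f_model))
        theta = theta + np.linalg.solve(S.T @ W @ S, S.T @ W @ residual)
    return theta

# Toy usage: a fictitious two-parameter model, linear in theta.
predict = lambda th: np.array([th[0] + 0.1 * th[1], 2.0 * th[1]])
print(update_parameters([1.0, 1.0], np.array([1.5, 4.0]), predict))
```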

Relevance:

60.00%

Publisher:

Abstract:

A multi-chromosome GA (Multi-GA) was developed, based upon concepts from the natural world, allowing improved flexibility in a number of areas including representation, genetic operators, their parameter rates, and real-world multi-dimensional applications. A series of experiments was conducted comparing the performance of the Multi-GA to a traditional GA on a number of recognised and increasingly complex test optimisation surfaces, with promising results. Further experiments demonstrated the Multi-GA's flexibility through the use of non-binary chromosome representations and its applicability to dynamic parameterisation. A number of alternative and new methods of dynamic parameterisation were investigated, in addition to a new non-binary 'Quotient crossover' mechanism. Finally, the Multi-GA was applied to two real-world problems, demonstrating its ability to handle mixed-type chromosomes within an individual, the limited use of a chromosome-level fitness function, the introduction of new genetic operators for structural self-adaptation, and its viability as a serious real-world analysis tool. The first problem involved the optimum placement of computers within a building, allowing the Multi-GA to use multiple chromosomes with different type representations and different operators in a single individual. The second problem, commonly associated with Geographical Information Systems (GIS), required a spatial analysis to locate the optimum number and distribution of retail sites over two different population grids. In applying the Multi-GA, two new genetic operators (addition and deletion) were developed and explored, resulting in the definition of a mechanism for self-modification of genetic material within the Multi-GA structure and a study of this behaviour.
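
To illustrate the central idea of mixed-type chromosomes within one individual, here is a minimal sketch (illustrative names and operators, not the thesis design):

```python
import random

class Chromosome:
    """One chromosome of a multi-chromosome individual: it owns its
    representation and its mutation operator, so different chromosomes
    in the same individual can use different types and operators."""
    def __init__(self, genes, mutate):
        self.genes = genes
        self.mutate = mutate

def mutate_bits(genes, rate=0.05):
    return [g ^ 1 if random.random() < rate else g for g in genes]

def mutate_reals(genes, rate=0.05, step=0.1):
    return [g + random.gauss(0.0, step) if random.random() < rate else g
            for g in genes]

class Individual:
    def __init__(self, chromosomes):
        self.chromosomes = chromosomes

    def mutated(self):
        # Each chromosome is mutated by its own operator.
        return Individual([Chromosome(c.mutate(c.genes), c.mutate)
                           for c in self.chromosomes])

# A single individual mixing a binary and a real-valued chromosome.
ind = Individual([
    Chromosome([0, 1, 1, 0], mutate_bits),
    Chromosome([3.2, -1.5, 0.7], mutate_reals),
])
child = ind.mutated()
print([c.genes for c in child.chromosomes])
```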

Relevance:

60.00%

Publisher:

Abstract:

This work is concerned with approximate inference in dynamical systems, from a variational Bayesian perspective. When modelling real-world dynamical systems, stochastic differential equations appear as a natural choice, mainly because of their ability to model the noise of the system by adding a variant of some stochastic process to the deterministic dynamics. Hence, inference in such processes has drawn much attention. Here a new extended framework is derived that is based on a local polynomial approximation of a recently proposed variational Bayesian algorithm. The paper begins by showing that the new extension of this variational algorithm can be used for state estimation (smoothing) and converges to the original algorithm. However, the main focus is on estimating the (hyper-) parameters of these systems (i.e. drift parameters and diffusion coefficients). The new approach is validated on a range of different systems which vary in dimensionality and non-linearity. These are the Ornstein–Uhlenbeck process, the exact likelihood of which can be computed analytically; the univariate, highly non-linear stochastic double well; and the multivariate chaotic stochastic Lorenz '63 (3D) model. As a special case the algorithm is also applied to the 40-dimensional stochastic Lorenz '96 system. In our investigation we compare this new approach with a variety of other well-known methods, such as hybrid Monte Carlo, the dual unscented Kalman filter and the full weak-constraint 4D-Var algorithm, and we empirically analyse their asymptotic behaviour as the observation density or the length of the time window increases. In particular, we show that we are able to estimate parameters in both the drift (deterministic) and the diffusion (stochastic) parts of the model evolution equations using our new methods.
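
Since the abstract notes that the Ornstein–Uhlenbeck likelihood is available in closed form, here is a hedged sketch (my own illustration, not the paper's method) of recovering the drift and diffusion parameters by maximising that exact likelihood:

```python
import numpy as np
from scipy.optimize import minimize

def ou_neg_log_lik(params, x, dt):
    """Exact negative log-likelihood of dX = -theta*X dt + sigma dW,
    using the Gaussian transition density of the OU process."""
    theta, sigma = params
    if theta <= 0 or sigma <= 0:
        return np.inf
    mean = x[:-1] * np.exp(-theta * dt)
    var = sigma**2 * (1 - np.exp(-2 * theta * dt)) / (2 * theta)
    resid = x[1:] - mean
    return 0.5 * np.sum(np.log(2 * np.pi * var) + resid**2 / var)

# Simulate a path with the exact transitions, then recover the parameters.
rng = np.random.default_rng(1)
dt, theta_true, sigma_true = 0.1, 2.0, 1.0
x = np.zeros(2000)
for i in range(1, len(x)):
    m = x[i - 1] * np.exp(-theta_true * dt)
    v = sigma_true**2 * (1 - np.exp(-2 * theta_true * dt)) / (2 * theta_true)
    x[i] = m + np.sqrt(v) * rng.standard_normal()

fit = minimize(ou_neg_log_lik, x0=[1.0, 0.5], args=(x, dt),
               method="Nelder-Mead")
print(fit.x)  # should be close to (theta_true, sigma_true)
```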

Relevance:

60.00%

Publisher:

Abstract:

There has been much recent research into extracting useful diagnostic features from the electrocardiogram, with numerous studies claiming impressive results. However, the robustness and consistency of the methods employed in these studies are rarely, if ever, mentioned. Hence, we propose two new methods: a biologically motivated time series derived from consecutive P-wave durations, and a mathematically motivated regularity measure. We investigate the robustness of these two methods when compared with current corresponding methods. We find that the new time series performs admirably as a complement to the current method, and that the new regularity measure consistently outperforms the current measure in numerous tests on real and synthetic data.
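
The abstract does not define its regularity measure; as a generic illustration of how such a measure is computed on a duration series, here is a minimal sample-entropy sketch (a standard measure used as a stand-in, not necessarily the authors' metric; the duration values are invented):

```python
import numpy as np

def sample_entropy(series, m=2, r=None):
    """Sample entropy of a 1-D series: lower values = more regular.
    Counts length-m template matches (within tolerance r) that remain
    matches when the templates are extended to length m+1."""
    x = np.asarray(series, dtype=float)
    if r is None:
        r = 0.2 * x.std()
    def count_matches(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        c = 0
        for i in range(len(templates)):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            c += np.sum(d <= r)
        return c
    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

# e.g. applied to a series of consecutive P-wave durations (in ms):
p_wave_durations = [102, 98, 101, 99, 103, 100, 97, 102, 98, 101] * 10
print(sample_entropy(p_wave_durations))
```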