13 results for QUANTIFYING LEISHMANIA

in Aston University Research Archive


Relevance:

20.00%

Publisher:

Abstract:

The judicial interest in ‘scientific’ evidence has driven recent work to quantify results for forensic linguistic authorship analysis. Through a methodological discussion and a worked example, this paper examines the issues which complicate attempts to quantify results in such work. The solution suggested to some of these difficulties is a sampling and testing strategy which helps to identify potentially useful, valid and reliable markers of authorship. An important feature of the sampling strategy is that markers identified as being generally valid and reliable are retested for use in specific authorship analysis cases. The suggested approach for drawing quantified conclusions combines discriminant function analysis and Bayesian likelihood measures. The worked example starts with twenty comparison texts for each of three potential authors and then uses a progressively smaller comparison corpus, reducing to fifteen, ten, five and finally three texts per author. This worked example demonstrates how reducing the amount of data affects the way conclusions can be drawn. With greater numbers of reference texts, quantified and safe attributions are shown to be possible, but as the number of reference texts is reduced the analysis shows that the conclusion which should be reached is that no attribution can be made. At no point does the testing process result in a misattribution.
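
A minimal sketch of how discriminant function analysis combined with a Bayesian-style likelihood ratio could be set up in code; the marker features and author labels below are simulated for illustration and are not the paper's data or protocol.

```python
# Illustrative sketch only: hypothetical feature vectors (e.g. rates of candidate
# authorship markers per text) are simulated with NumPy; the paper's actual
# marker selection and retesting strategy is not reproduced here.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n_texts_per_author = 20          # reduce to 15, 10, 5, 3 to mimic the worked example
n_markers = 6                    # hypothetical authorship markers

# Simulate comparison corpora for three potential authors, A, B and C.
means = rng.normal(0.0, 1.0, size=(3, n_markers))
X = np.vstack([rng.normal(m, 0.5, size=(n_texts_per_author, n_markers)) for m in means])
y = np.repeat(["A", "B", "C"], n_texts_per_author)

# Discriminant function analysis over the comparison texts.
lda = LinearDiscriminantAnalysis().fit(X, y)

# A questioned text, here simulated as coming from author A.
questioned = rng.normal(means[0], 0.5, size=(1, n_markers))
posterior = lda.predict_proba(questioned)[0]

# Bayesian-style likelihood ratio: support for the best candidate against the
# combined alternative candidates.
best = np.argmax(posterior)
lr = posterior[best] / (1.0 - posterior[best])
print(dict(zip(lda.classes_, posterior.round(3))), "LR for", lda.classes_[best], "=", round(lr, 2))
```

Rerunning the sketch with fewer comparison texts per author shows the posterior support (and hence the likelihood ratio) becoming weaker and less stable, which is the behaviour the worked example documents.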

Relevance:

20.00%

Publisher:

Abstract:

The use of quantitative methods has become increasingly important in the study of neurodegenerative disease. Disorders such as Alzheimer's disease (AD) are characterized by the formation of discrete, microscopic, pathological lesions which play an important role in pathological diagnosis. This article reviews the advantages and limitations of the different methods of quantifying the abundance of pathological lesions in histological sections, including estimates of density, frequency, coverage, and the use of semiquantitative scores. The major sampling methods by which these quantitative measures can be obtained from histological sections, including plot or quadrat sampling, transect sampling, and point-quarter sampling, are also described. In addition, the data analysis methods commonly used to analyse quantitative data in neuropathology, including analyses of variance (ANOVA) and principal components analysis (PCA), are discussed. These methods are illustrated with reference to particular problems in the pathological diagnosis of AD and dementia with Lewy bodies (DLB).
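
A minimal sketch of two of the steps reviewed above: converting lesion counts made in sample fields of known area into density estimates, and comparing regions with a one-way ANOVA. The counts and field area are hypothetical.

```python
# Hypothetical counts of a lesion (e.g. neurofibrillary tangles) made in sample
# fields ("quadrats") of known area in three cortical regions, converted to
# densities and compared by one-way ANOVA. Not data from the review.
import numpy as np
from scipy import stats

field_area_mm2 = 0.25                       # assumed area of one sample field
counts = {
    "frontal":   np.array([3, 5, 4, 6, 2, 5]),
    "temporal":  np.array([8, 7, 9, 6, 10, 8]),
    "occipital": np.array([2, 1, 3, 2, 2, 1]),
}

# Density estimate: lesions per mm^2 in each field.
densities = {region: c / field_area_mm2 for region, c in counts.items()}
for region, d in densities.items():
    print(region, "mean density =", round(d.mean(), 1), "per mm^2")

# One-way ANOVA testing whether mean lesion density differs between regions.
f_stat, p_value = stats.f_oneway(*densities.values())
print("ANOVA: F =", round(f_stat, 2), "p =", round(p_value, 4))
```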

Relevance:

20.00%

Publisher:

Abstract:

Signal integration determines cell fate on the cellular level, affects cognitive processes and affective responses on the behavioural level, and is likely to be involved in psychoneurobiological processes underlying mood disorders. Interactions between stimuli may be subjected to time effects. Time-dependencies of interactions between stimuli typically lead to complex cell responses and complex responses on the behavioural level. We show that both three-factor models and time series models can be used to uncover such time-dependencies. However, we argue that for short longitudinal data the three-factor modelling approach is more suitable. In order to illustrate both approaches, we re-analysed previously published short longitudinal data sets. We found that in human embryonic kidney (HEK) 293 cells the interaction effect in the regulation of extracellular signal-regulated kinase (ERK) 1 signalling activation by insulin and epidermal growth factor is subjected to a time effect and dramatically decays at peak values of ERK activation. In contrast, we found that the interaction effect induced by hypoxia and tumour necrosis factor-alpha on the transcriptional activity of the human cyclo-oxygenase-2 promoter in HEK293 cells is time invariant, at least in the first 12-h time window after stimulation. Furthermore, we applied the three-factor model to previously reported animal studies. In these studies, memory storage was found to be subjected to an interaction effect of the beta-adrenoceptor agonist clenbuterol and certain antagonists acting on the alpha-1-adrenoceptor / glucocorticoid-receptor system. Our model-based analysis suggests that the interaction effect is relevant only if the antagonist drug is administered within a critical time window.
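
A minimal sketch of the three-factor approach on simulated data (not the published measurements): two stimuli and time are treated as crossed factors, and the three-way term tests whether the stimulus-by-stimulus interaction changes across time points.

```python
# Simulated responses with a stimulus interaction that decays at later times;
# the s1:s2:time ANOVA term then detects the time dependence of the interaction.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
rows = []
for t in [5, 30, 60]:                      # hypothetical minutes after stimulation
    for s1 in [0, 1]:                      # e.g. insulin absent/present
        for s2 in [0, 1]:                  # e.g. EGF absent/present
            interaction = 2.0 * s1 * s2 * np.exp(-t / 30.0)   # decays with time
            for _ in range(4):             # replicates
                rows.append(dict(time=t, s1=s1, s2=s2,
                                 response=1 + s1 + s2 + interaction + rng.normal(0, 0.3)))
df = pd.DataFrame(rows)

# Three-factor ANOVA: the C(s1):C(s2):C(time) term tests whether the stimulus
# interaction is itself time dependent.
fit = smf.ols("response ~ C(s1) * C(s2) * C(time)", data=df).fit()
print(sm.stats.anova_lm(fit, typ=2))
```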

Relevance:

20.00%

Publisher:

Abstract:

A framework that connects computational mechanics and molecular dynamics has been developed and described. As the key parts of the framework, the problem of symbolising a molecular trajectory and the associated interrelation between microscopic phase space variables and macroscopic observables of the molecular system are considered. Following Shalizi and Moore, it is shown that causal states, the constituent parts of the main construct of computational mechanics, the ε-machine, define areas of the phase space that are optimal in the sense of transferring information from the micro-variables to the macro-observables. We have demonstrated that, based on the decay of their Poincaré return times, these areas can be divided into two classes that characterise the separation of the phase space into resonant and chaotic areas. The first class is characterised by predominantly short return times, typical of quasi-periodic or periodic trajectories; this class includes a countable number of areas corresponding to resonances. The second class includes trajectories with chaotic behaviour, characterised by the exponential decay of return times in accordance with the Poincaré theorem.
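
A toy illustration of the two ingredients named above, symbolisation by coarse-graining and return-time statistics, on simple one-dimensional trajectories; the ε-machine reconstruction itself is not reproduced here, and the maps used are stand-ins rather than molecular dynamics.

```python
# Two toy trajectories (one quasi-periodic, one chaotic) are coarse-grained into
# cells, and the distribution of return times to a chosen cell is compared:
# short, regular returns for the quasi-periodic case, broadly spread returns for
# the chaotic case.
import numpy as np

def trajectory_chaotic(n, r=4.0, x0=0.2):
    """Logistic map in its chaotic regime."""
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = r * x[i - 1] * (1 - x[i - 1])
    return x

def trajectory_periodic(n):
    """A simple quasi-periodic signal on [0, 1]."""
    return 0.5 + 0.4 * np.sin(0.3 * np.arange(n))

def return_times(traj, n_cells=10, cell=3):
    """Symbolise by binning into n_cells and compute times between visits to one cell."""
    symbols = np.minimum((traj * n_cells).astype(int), n_cells - 1)
    visits = np.flatnonzero(symbols == cell)
    return np.diff(visits)

for name, traj in [("chaotic", trajectory_chaotic(20000)),
                   ("periodic", trajectory_periodic(20000))]:
    rt = return_times(traj)
    print(name, "mean return time:", round(rt.mean(), 1), "std:", round(rt.std(), 1))
```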

Relevance:

20.00%

Publisher:

Abstract:

When making predictions with complex simulators it can be important to quantify the various sources of uncertainty. Errors in the structural specification of the simulator, for example due to missing processes or incorrect mathematical specification, can be a major source of uncertainty, but are often ignored. We introduce a methodology for inferring the discrepancy between the simulator and the system in discrete-time dynamical simulators. We assume a structural form for the discrepancy function, and show how to infer the maximum-likelihood parameter estimates using a particle filter embedded within a Monte Carlo expectation maximization (MCEM) algorithm. We illustrate the method on a conceptual rainfall-runoff simulator (logSPM) used to model the Abercrombie catchment in Australia. We assess the simulator and discrepancy model on the basis of their predictive performance using proper scoring rules. This article has supplementary material online. © 2011 International Biometric Society.
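
A minimal sketch, not the paper's MCEM algorithm or rainfall-runoff model: a bootstrap particle filter estimates the log-likelihood of observations under a toy discrete-time simulator with an additive discrepancy (bias) term, and the likelihood is then scanned over candidate bias values. All model choices below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

def simulator_step(x):
    """Toy deterministic simulator dynamics."""
    return 0.8 * x + 1.0

def particle_log_likelihood(y, bias, n_particles=500, proc_sd=0.5, obs_sd=0.5):
    """Bootstrap particle filter log-likelihood with discrepancy `bias` added each step."""
    particles = rng.normal(0.0, 1.0, n_particles)
    log_lik = 0.0
    for obs in y:
        particles = simulator_step(particles) + bias + rng.normal(0, proc_sd, n_particles)
        log_w = -0.5 * ((obs - particles) / obs_sd) ** 2 - np.log(obs_sd * np.sqrt(2 * np.pi))
        log_lik += np.log(np.mean(np.exp(log_w - log_w.max()))) + log_w.max()
        w = np.exp(log_w - log_w.max())
        particles = rng.choice(particles, size=n_particles, p=w / w.sum())  # resample
    return log_lik

# Synthetic observations generated with a true discrepancy of +0.7.
x, y_obs = 0.0, []
for _ in range(50):
    x = simulator_step(x) + 0.7 + rng.normal(0, 0.5)
    y_obs.append(x + rng.normal(0, 0.5))

for bias in [0.0, 0.35, 0.7, 1.05]:
    print("bias =", bias, "log-likelihood =", round(particle_log_likelihood(y_obs, bias), 1))
```

In the paper this kind of filter sits inside an EM loop that maximises over the discrepancy parameters; the grid scan above only illustrates that the particle-filter likelihood favours values near the true bias.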

Relevance:

20.00%

Publisher:

Abstract:

The rapid global loss of biodiversity has led to a proliferation of systematic conservation planning methods. In spite of their utility and mathematical sophistication, these methods only provide approximate solutions to real-world problems where there is uncertainty and temporal change. The consequences of errors in these solutions are seldom characterized or addressed. We propose a conceptual structure for exploring the consequences of input uncertainty and oversimplified approximations to real-world processes for any conservation planning tool or strategy. We then present a computational framework based on this structure to quantitatively model species representation and persistence outcomes across a range of uncertainties. These include factors such as land costs, landscape structure, species composition and distribution, and temporal changes in habitat. We demonstrate the utility of the framework using several reserve selection methods including simple rules of thumb and more sophisticated tools such as Marxan and Zonation. We present new results showing how outcomes can be strongly affected by variation in problem characteristics that are seldom compared across multiple studies. These characteristics include number of species prioritized, distribution of species richness and rarity, and uncertainties in the amount and quality of habitat patches. We also demonstrate how the framework allows comparisons between conservation planning strategies and their response to error under a range of conditions. Using the approach presented here will improve conservation outcomes and resource allocation by making it easier to predict and quantify the consequences of many different uncertainties and assumptions simultaneously. Our results show that without more rigorously generalizable results, it is very difficult to predict the amount of error in any conservation plan. These results imply the need for standard practice to include evaluating the effects of multiple real-world complications on the behavior of any conservation planning method.
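
An illustrative sketch (not Marxan, Zonation, or the authors' framework) of the kind of experiment described: a greedy "most new species per unit cost" reserve selection on a simulated landscape, followed by a check of how many species remain represented when the assumed occurrence data contain errors.

```python
import numpy as np

rng = np.random.default_rng(3)
n_sites, n_species, budget = 50, 30, 10.0

occurrence = rng.random((n_sites, n_species)) < 0.15   # assumed presence/absence
cost = rng.uniform(0.5, 2.0, n_sites)                  # assumed land costs

# Greedy rule of thumb: repeatedly add the affordable site with the best
# (new species covered) / cost ratio.
selected, covered, spent = [], np.zeros(n_species, bool), 0.0
while True:
    gains = np.array([np.sum(occurrence[i] & ~covered) / cost[i]
                      if i not in selected and spent + cost[i] <= budget else -1.0
                      for i in range(n_sites)])
    best = int(np.argmax(gains))
    if gains[best] <= 0:
        break
    selected.append(best)
    covered |= occurrence[best]
    spent += cost[best]

# Input uncertainty: flip 10% of the assumed occurrences to mimic survey error,
# then re-count how many species are actually represented in the chosen reserves.
flip = rng.random(occurrence.shape) < 0.10
true_occurrence = occurrence ^ flip
represented_true = np.any(true_occurrence[selected], axis=0).sum()
print("sites chosen:", len(selected), "| species assumed covered:", int(covered.sum()),
      "| species covered under 'true' occurrences:", int(represented_true))
```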

Relevance:

20.00%

Publisher:

Abstract:

Counts of Pick bodies (PB), Pick cells (PC), senile plaques (SP) and neurofibrillary tangles (NFT) were made in the frontal and temporal cortex from patients with Pick's disease (PD). Lesions were stained histologically with hematoxylin and eosin (HE) and the Bielschowsky silver impregnation method and labeled immunohistochemically with antibodies raised to ubiquitin and tau. The greatest numbers of PB were revealed by immunohistochemistry. Counts of PB revealed by ubiquitin and tau were highly positively correlated which suggested that the two antibodies recognized virtually identical populations of PB. The greatest numbers of PC were revealed by HE followed by the anti-ubiquitin antibody. However, the correlation between counts was poor, suggesting that HE and ubiquitin revealed different populations of PC. The greatest numbers of SP and NFT were revealed by the Bielschowsky method indicating the presence of Alzheimer-type lesions not revealed by the immunohistochemistry. In addition, more NFT were revealed by the anti-ubiquitin compared with the anti-tau antibody. The data suggested that in PD: (i) the anti-ubiquitin and anti-tau antibodies were equally effective at labeling PB; (ii) both HE and anti-ubiquitin should be used to quantitate PC; and (iii) the Bielschowsky method should be used to quantitate SP and NFT.

Relevance:

20.00%

Publisher:

Abstract:

The pH and counter-ion response of a microphase separated poly(methyl methacrylate)-block-poly(2-(diethylamino)ethyl methacrylate)-block-poly(methyl methacrylate) hydrogel has been investigated using laser light scattering on an imprinted micron scale topography. A quartz diffraction grating was used to create a micron-sized periodic structure on the surface of a thin film of the polymer and the resulting diffraction pattern used to calculate the swelling ratio of the polymer film in situ. A potentiometric titration and a sequence of counter ion species, taken from the Hofmeister series, have been used to compare the results obtained using this novel technique against small angle X-ray scattering (nanoscopic) and gravimetric studies of bulk gel pieces (macroscopic). For the first time, the technique has been proven to be an inexpensive and effective analytical tool for measuring hydrogel response on the microscopic scale.
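
One plausible route from the measured diffraction angles to an in-plane swelling ratio, assuming the standard grating equation applies to the imprinted surface pattern; this relation is given for illustration and the paper's exact analysis may differ.

```latex
% Illustrative relation only, assuming first-order diffraction from the
% imprinted periodic structure.
\[
  d\,\sin\theta_m = m\lambda
  \quad\Longrightarrow\quad
  d = \frac{m\lambda}{\sin\theta_m},
\]
\[
  S_{\parallel} = \frac{d_{\text{swollen}}}{d_{\text{dry}}}
                = \frac{\sin\theta_{m,\text{dry}}}{\sin\theta_{m,\text{swollen}}},
\]
% where $d$ is the imprinted period, $\lambda$ the laser wavelength, $\theta_m$
% the angle of the $m$-th diffraction order, and $S_{\parallel}$ the in-plane
% swelling ratio of the film.
```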

Relevance:

20.00%

Publisher:

Abstract:

Background: Parkinson’s disease (PD) is an incurable neurological disease with approximately 0.3% prevalence. The hallmark symptom is gradual movement deterioration. Current scientific consensus about disease progression holds that symptoms will worsen smoothly over time unless treated. Accurate information about symptom dynamics is of critical importance to patients, caregivers, and the scientific community for the design of new treatments, clinical decision making, and individual disease management. Long-term studies characterize the typical time course of the disease as an early linear progression gradually reaching a plateau in later stages. However, symptom dynamics over durations of days to weeks remain unquantified. Currently, there is a scarcity of objective clinical information about symptom dynamics at intervals shorter than 3 months stretching over several years, but Internet-based patient self-report platforms may change this. Objective: To assess the clinical value of online self-reported PD symptom data recorded by users of the health-focused Internet social research platform PatientsLikeMe (PLM), in which patients quantify their symptoms on a regular basis on a subset of the Unified Parkinson’s Disease Rating Scale (UPDRS). By analyzing these data, we aim for a scientific window on the nature of symptom dynamics for assessment intervals shorter than 3 months over durations of several years. Methods: Online self-reported data were validated against the gold standard Parkinson’s Disease Data and Organizing Center (PD-DOC) database, containing clinical symptom data at intervals greater than 3 months. The data were compared visually using quantile-quantile plots, and numerically using the Kolmogorov-Smirnov test. By using a simple piecewise linear trend estimation algorithm, the PLM data were smoothed to separate random fluctuations from continuous symptom dynamics. Subtracting the trends from the original data revealed random fluctuations in symptom severity. The average magnitude of fluctuations versus time since diagnosis was modeled by using a gamma generalized linear model. Results: Distributions of ages at diagnosis and UPDRS in the PLM and PD-DOC databases were broadly consistent. The PLM patients were systematically younger than the PD-DOC patients and showed increased symptom severity in the PD off state. The average fluctuation in symptoms (UPDRS Parts I and II) was 2.6 points at the time of diagnosis, rising to 5.9 points 16 years after diagnosis. These fluctuations exceed the estimated minimal and moderate clinically important differences, respectively. Not all patients conformed to the current clinical picture of gradual, smooth changes: many patients had regimes where symptom severity varied in an unpredictable manner, or underwent large rapid changes in an otherwise more stable progression. Conclusions: This information about short-term PD symptom dynamics contributes new scientific understanding of disease progression, currently very costly to obtain without self-administered Internet-based reporting. This understanding should have implications for the optimization of clinical trials into new treatments and for the choice of treatment decision timescales.
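
A sketch of the two analysis steps described in the Methods, run on simulated scores rather than the PLM data: segment-wise linear trend estimation to separate slow progression from short-term fluctuation, then a gamma generalized linear model relating fluctuation size to time since diagnosis. All parameter choices are illustrative.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
years = np.sort(rng.uniform(0, 16, 400))                 # time since diagnosis
trend_true = 10 + 2.0 * years                            # slow UPDRS progression
updrs = trend_true + rng.normal(0, 1 + 0.3 * years)      # fluctuations grow over time

# Simple piecewise-linear trend: independent straight-line fits in fixed segments.
segments = np.array_split(np.arange(len(years)), 8)
trend = np.empty_like(updrs)
for idx in segments:
    slope, intercept = np.polyfit(years[idx], updrs[idx], 1)
    trend[idx] = slope * years[idx] + intercept

fluctuation = np.abs(updrs - trend) + 1e-6               # keep strictly positive for Gamma

# Gamma GLM (log link): average fluctuation magnitude as a function of years since diagnosis.
df = pd.DataFrame({"fluct": fluctuation, "years": years})
fit = smf.glm("fluct ~ years", data=df,
              family=sm.families.Gamma(link=sm.families.links.Log())).fit()
print(fit.summary().tables[1])
```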

Relevance:

20.00%

Publisher:

Abstract:

Resistance to pentavalent antimonial (Sb(V)) agents such as sodium stibogluconate (SSG) is creating a major problem in the treatment of visceral leishmaniasis. In the present study the in vivo susceptibilities of Leishmania donovani strains, typed as SSG resistant (strain 200011) or SSG sensitive (strain 200016) on the basis of their responses to a single SSG dose of 300 mg of Sb(V)/kg of body weight, to other antileishmanial drugs were determined. In addition, the role of glutathione in SSG resistance was investigated by determining how concomitant treatment with SSG and a nonionic surfactant vesicle formulation of buthionine sulfoximine (BSO), a specific inhibitor of gamma-glutamylcysteine synthetase, an enzyme involved in glutathione biosynthesis, affected the efficacy of SSG treatment. The SSG-resistant (strain 200011) and SSG-sensitive (strain 200016) L. donovani strains were equally susceptible to in vivo treatment with miltefosine, paromomycin and amphotericin B (Fungizone and AmBisome) formulations. Combined treatment with SSG and vesicular BSO significantly increased the in vivo efficacy of SSG against both the 200011 and the 200016 L. donovani strains. However, joint treatment that included high SSG doses was unexpectedly associated with toxicity. Measurement of glutathione levels in the spleens and livers of treated mice showed that the ability of the combined therapy to inhibit glutathione levels was also dependent on the SSG dose used and that the combined treatment exhibited organ-dependent effects. The SSG resistance exhibited by the L. donovani strains was not associated with cross-resistance to other classes of compounds and could be reversed by treatment with an inhibitor of glutathione biosynthesis, indicating that clinical resistance to antimonial drugs should not affect the antileishmanial efficacies of alternative drugs. In addition, it should be possible to identify a treatment regimen that could reverse antimony resistance.

Relevance:

20.00%

Publisher:

Abstract:

The simulated classical dynamics of a small molecule exhibiting self-organizing behavior via a fast transition between two states is analyzed by calculation of the statistical complexity of the system. It is shown that the complexities of molecular descriptors such as atom coordinates and dihedral angles have different values before and after the transition. This provides a new tool for identifying metastable states during molecular self-organization. The highly concerted collective motion of the molecule is revealed. The dynamics of low-dimensional subspaces is found to be sensitive to processes in the whole, high-dimensional phase space of the system. © 2004 Wiley Periodicals, Inc.
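
The statistical complexity used in the paper is built from causal-state reconstruction and is not reproduced here; as a crude illustrative stand-in, the sketch below compares the block entropy of a symbolised, simulated "dihedral angle" descriptor before and after a fast transition between two states.

```python
import numpy as np

rng = np.random.default_rng(5)

def block_entropy(symbols, block_len=3):
    """Shannon entropy (bits) of overlapping symbol blocks of length block_len."""
    blocks = [tuple(symbols[i:i + block_len]) for i in range(len(symbols) - block_len + 1)]
    _, counts = np.unique(np.array(blocks), axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

# Simulated descriptor: noisy oscillation about one value, then a fast
# transition to noisier dynamics about another value.
before = 60 + 5 * np.sin(0.2 * np.arange(5000)) + rng.normal(0, 2, 5000)
after = 180 + rng.normal(0, 15, 5000)
signal = np.concatenate([before, after])

# Symbolise by binning the dihedral-like angle into 12 cells of 30 degrees.
symbols = (signal // 30).astype(int) % 12

print("block entropy before transition:", round(block_entropy(symbols[:5000]), 2), "bits")
print("block entropy after transition: ", round(block_entropy(symbols[5000:]), 2), "bits")
```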

Relevance:

20.00%

Publisher:

Abstract:

The sheer volume of citizen weather data collected and uploaded to online data hubs is immense. However, as with any citizen data, it is difficult to assess the accuracy of the measurements. Within this project we quantify just how much data is available, where it comes from, the frequency at which it is collected, and the types of automatic weather stations being used. We also list the numerous possible sources of error and uncertainty within citizen weather observations before showing evidence of such effects in real data. A thorough intercomparison field study was conducted, testing popular models of citizen weather stations. From this study we were able to parameterise key sources of bias. Most significantly, the project develops a complete quality control system through which citizen air temperature observations can be passed. The structure of this system was heavily informed by the results of the field study. Using a Bayesian framework, the system learns and updates its estimates of the calibration and radiation-induced biases inherent to each station. We then show the benefit of correcting for these learnt biases over using the original uncorrected data. The system also attaches an uncertainty estimate to each observation, giving applications that choose to incorporate such observations a measure on which to base their confidence in the data. The system relies on interpolated temperature and radiation observations from neighbouring professional weather stations, for which a Bayesian regression model is used. We recognise some of the assumptions and flaws of the developed system and suggest further work needed to bring it to an operational setting. Such a system will hopefully allow applications to leverage the additional value citizen weather data brings to long-standing professional observing networks.
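
A minimal sketch, not the project's full quality-control system: a conjugate Gaussian update of a single station's calibration bias against an interpolated reference temperature, with a simple uncertainty attached to each corrected observation. The priors, noise levels, and simulated readings are illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)

# Prior belief about this station's bias (degrees C) and the noise level of a
# single (station - reference) residual.
prior_mean, prior_var = 0.0, 1.0 ** 2
obs_var = 0.8 ** 2

# Simulated data: the station reads 1.2 C too warm relative to the reference.
reference = 15 + rng.normal(0, 3, 200)      # interpolated neighbouring-station estimate
station = reference + 1.2 + rng.normal(0, 0.8, 200)

mean, var = prior_mean, prior_var
for ref, obs in zip(reference, station):
    residual = obs - ref                    # evidence about the station bias
    post_var = 1.0 / (1.0 / var + 1.0 / obs_var)
    mean = post_var * (mean / var + residual / obs_var)
    var = post_var

print("learnt bias:", round(mean, 2), "+/-", round(float(np.sqrt(var)), 2), "C")

# Correct a new observation and attach an uncertainty estimate to it.
new_obs = 18.4
corrected = new_obs - mean
uncertainty = float(np.sqrt(var + obs_var))
print("corrected observation:", round(corrected, 2), "C, sigma =", round(uncertainty, 2), "C")
```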