38 results for Potential methods

in Aston University Research Archive


Relevance:

40.00%

Publisher:

Abstract:

The timeline imposed by recent worldwide chemical legislation is not amenable to conventional in vivo toxicity testing, requiring the development of rapid, economical in vitro screening strategies with acceptable predictive capacities. When acquiring regulatory neurotoxicity data, it is essential to distinguish whether a toxic agent affects neurons and/or astrocytes. This study evaluated neurofilament (NF) and glial fibrillary acidic protein (GFAP) directed single-cell (S-C) ELISA and flow cytometry as methods for distinguishing cell-specific cytoskeletal responses, using the established human NT2 neuronal/astrocytic (NT2.N/A) co-culture model and a range of neurotoxic (acrylamide, atropine, caffeine, chloroquine, nicotine) and non-neurotoxic (chloramphenicol, rifampicin, verapamil) test chemicals. NF and GFAP directed flow cytometry was able to identify several of the test chemicals as being specifically neurotoxic (chloroquine, nicotine) or astrocytoxic (atropine, chloramphenicol) via quantification of cell death in the NT2.N/A model at cytotoxic concentrations, using the resazurin cytotoxicity assay. Neurotoxicants with low associated cytotoxicity are the most significant in terms of potential hazard to the human nervous system. The NF and GFAP directed S-C ELISA data predominantly demonstrated that the known neurotoxicants affected the neuronal and/or astrocytic cytoskeleton in the NT2.N/A cell model only at concentrations below those affecting cell viability. This report concludes that NF and GFAP directed S-C ELISA and flow cytometric methods may prove valuable additions to an in vitro screening strategy for differentiating cytotoxicity from specific neuronal and/or astrocytic toxicity. Further work using the NT2.N/A model and a broader array of toxicants is needed to confirm the applicability of these methods.

Relevance:

30.00%

Publisher:

Abstract:

Objectives: Ecstasy is a recreational drug whose active ingredient, 3,4-methylenedioxymethamphetamine (MDMA), acts predominantly on the serotonergic system. Although MDMA is known to be neurotoxic in animals, the long-term effects of recreational Ecstasy use in humans remain controversial, but one commonly reported consequence is mild cognitive impairment, particularly affecting verbal episodic memory. Although event-related potentials (ERPs) have made significant contributions to our understanding of human memory processes, until now they have not been applied to study the long-term effects of Ecstasy. The aim of this study was to examine the effects of past Ecstasy use on recognition memory for both verbal and non-verbal stimuli using ERPs. Methods: We compared the ERPs of 15 Ecstasy/polydrug users with those of 14 cannabis users and 13 non-illicit drug users as controls. Results: Despite equivalent memory performance, Ecstasy/polydrug users showed an attenuated late positivity over left parietal scalp sites, a component associated with the specific memory process of recollection. Conclusions: This effect was found only in the word recognition task, which is consistent with evidence that left-hemisphere cognitive functions are disproportionately affected by Ecstasy, probably because the serotonergic system is laterally asymmetrical. Experimentally, decreasing central serotonergic activity through acute tryptophan depletion also selectively impairs recollection, and this too suggests the importance of the serotonergic system. Overall, our results suggest that Ecstasy users, who also use a wide range of other drugs, show a durable abnormality in a specific ERP component thought to be associated with recollection.

Relevance:

30.00%

Publisher:

Abstract:

Bayesian techniques have been developed over many years in a range of different fields, but have only recently been applied to the problem of learning in neural networks. As well as providing a consistent framework for statistical pattern recognition, the Bayesian approach offers a number of practical advantages including a potential solution to the problem of over-fitting. This chapter aims to provide an introductory overview of the application of Bayesian methods to neural networks. It assumes the reader is familiar with standard feed-forward network models and how to train them using conventional techniques.
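
To make the over-fitting point concrete, here is a minimal sketch (not taken from the chapter) of the Bayesian treatment for a linear-in-parameters model: a Gaussian prior over the weights turns maximum likelihood into MAP estimation, whose solution is the familiar weight-decay (ridge) regularizer, and the weight posterior also yields predictive error bars. All parameter values below are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch: Bayesian regression with a Gaussian prior on the
# weights. The posterior mean is the weight-decay solution, showing how
# the Bayesian approach controls over-fitting.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 20)
t = np.sin(np.pi * x) + rng.normal(0, 0.1, x.size)   # noisy targets

Phi = np.vander(x, 10, increasing=True)              # 9th-order polynomial basis
alpha, beta = 1e-2, 100.0                            # prior and noise precisions

# Posterior over weights: N(m, S) with S^-1 = alpha*I + beta*Phi^T Phi
S_inv = alpha * np.eye(Phi.shape[1]) + beta * Phi.T @ Phi
m = beta * np.linalg.solve(S_inv, Phi.T @ t)         # posterior mean = MAP weights

# Predictive variance at a new input combines noise and weight uncertainty
phi_new = np.vander([0.5], 10, increasing=True)
var = 1.0 / beta + (phi_new @ np.linalg.solve(S_inv, phi_new.T)).item()
print(m[:3], var)
```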

Relevance:

30.00%

Publisher:

Abstract:

Background: To evaluate the accuracy of an open-field autorefractor compared with subjective refraction in pseudophakes and hence its ability to assess objective eye focus with intraocular lenses (IOLs). Methods: Objective refraction was measured at 6 m using the Shin-Nippon NVision-K 5001/Grand Seiko WR-5100K open-field autorefractor (five repeats) and by subjective refraction on 141 eyes implanted with a spherical (Softec1, n=53), aspherical (SoftecHD, n=37) or accommodating (1CU, n=22; Tetraflex, n=29) IOL. Autorefraction was repeated 2 months later. Results: The autorefractor prescription was similar (average difference: 0.09±0.53 D; p=0.19) to that found by subjective refraction, with ~71% within ±0.50 D. The horizontal cylindrical components were similar (difference: 0.00±0.39 D; p=0.96), although the oblique (J45) autorefractor cylindrical vector was slightly more negative (by -0.06±0.25 D; p=0.06) than the subjective refraction. The results were similar for each of the IOL designs, except that for the spherical IOL the mean spherical equivalent difference between autorefraction and subjective refraction was more hypermetropic than for the Tetraflex accommodating IOL (F=2.77, p=0.04). The intrasession repeatability was
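
For readers unfamiliar with the components quoted above (the spherical equivalent and the J45 cylindrical vector), the sketch below shows the standard power-vector decomposition commonly used when comparing autorefraction with subjective refraction. The prescriptions are hypothetical, not data from the study.

```python
import numpy as np

def power_vectors(sphere, cyl, axis_deg):
    """Convert a sphere/cylinder/axis prescription into power-vector
    components M (spherical equivalent), J0 and J45 (Thibos notation),
    as typically used when comparing refraction techniques."""
    ax = np.radians(axis_deg)
    M = sphere + cyl / 2.0
    J0 = -(cyl / 2.0) * np.cos(2 * ax)
    J45 = -(cyl / 2.0) * np.sin(2 * ax)
    return M, J0, J45

# Hypothetical paired readings (dioptres): autorefractor vs subjective
auto = power_vectors(-1.00, -0.75, 170)
subj = power_vectors(-1.25, -0.50, 175)

# Difference in the spherical equivalent component for one eye
diff_M = auto[0] - subj[0]
print(f"SE difference: {diff_M:+.2f} D")
```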

Relevance:

30.00%

Publisher:

Abstract:

In recent work we have developed a novel variational inference method for partially observed systems governed by stochastic differential equations. In this paper we provide a comparison of the Variational Gaussian Process Smoother with an exact solution computed using a Hybrid Monte Carlo approach to path sampling, applied to a stochastic double-well potential model. It is demonstrated that the variational smoother provides a very accurate estimate of the mean path, while the conditional variance is slightly underestimated. We conclude with some remarks as to the advantages and disadvantages of the variational smoother. © 2008 Springer Science + Business Media LLC.
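
As a concrete picture of the benchmark system, here is a hedged Euler-Maruyama simulation of a stochastic double-well model. The potential U(x) = (x² - 1)² and all parameters are our own assumptions for illustration, not the paper's exact settings.

```python
import numpy as np

# Euler-Maruyama simulation of a double-well SDE: dx = -U'(x) dt + sigma dW,
# with the assumed potential U(x) = (x^2 - 1)^2. Smoothers of the kind
# compared above would condition on sparse noisy observations of such a path.
rng = np.random.default_rng(1)
dt, T, sigma = 0.01, 20.0, 0.5
n = int(T / dt)
x = np.empty(n)
x[0] = -1.0                                  # start in the left well
for k in range(n - 1):
    drift = -4.0 * x[k] * (x[k] ** 2 - 1.0)  # -dU/dx for U = (x^2 - 1)^2
    x[k + 1] = x[k] + drift * dt + sigma * np.sqrt(dt) * rng.normal()

# Sparse noisy observations, as a smoother would see them
obs_idx = np.arange(0, n, 200)
obs = x[obs_idx] + rng.normal(0, 0.2, obs_idx.size)
print(obs[:5])
```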

Relevance:

30.00%

Publisher:

Abstract:

In recent years, technologically advanced methodologies such as Translog have gained a lot of ground in translation process research. However, in this paper it will be argued that quantitative research methods can be supplemented by ethnographic qualitative ones so as to enhance our understanding of what underlies the translation process. Although translation studies scholars have sometimes applied an ethnographic approach to the study of translation, this paper offers a different perspective and considers the potential of ethnographic research methods for tapping cognitive and behavioural aspects of the translation process. A number of ethnographic principles are discussed and it is argued that process researchers aiming to understand translators’ perspectives and intentions, how these shape their behaviours, as well as how translators reflect on the situations they face and how they see themselves, would undoubtedly benefit from adopting an ethnographic framework for their studies on translation processes.

Relevance:

30.00%

Publisher:

Abstract:

The work underlying this thesis focused on the exploitation and investigation of photosensitivity mechanisms in optical fibres and planar waveguides for the fabrication of advanced integrated optical devices for telecoms and sensing applications. One major aim was the improvement of grating fabrication specifications by introducing new writing techniques and using advanced characterisation methods for grating testing. For the first time, the polarisation control method for advanced grating fabrication was successfully converted to apodised planar waveguide fabrication, and the development of a holographic method for the inscription of chirped gratings at arbitrary wavelength is presented. The latter resulted in the fabrication of gratings for pulse-width suppression and wavelength selection in diode lasers. In co-operation with research partners, a number of samples were tested using optical frequency domain and optical low coherence reflectometry to gain better insight into the limitations of grating writing techniques. Using a variety of different fabrication methods, custom apodised and chirped fibre Bragg gratings were written for use as filter elements for multiplexer-demultiplexer devices, as well as for short pulse generation and wavelength selection in telecommunication transmission systems. Long period grating based devices in standard, speciality and tapered fibres are presented, showing great potential for multi-parameter sensing. One particular focus is the development of vectorial curvature and refractive index sensors with potential for medical, chemical and biological sensing. In addition, the design of an optically tunable Mach-Zehnder based multiwavelength filter is introduced. The discovery of the Type IA grating through overexposure of hydrogen-loaded standard and Boron-Germanium co-doped fibres strengthened the view that UV photosensitivity is a highly non-linear process. Gratings of this type show significantly lower thermal sensitivity than standard gratings, which makes them useful for sensing applications. An Oxford Lasers copper-vapour laser operating at 255 nm in pulsed mode was used for their inscription, in contrast to previous work using CW argon-ion lasers, which contributes to differences in the photorefractive index change process.
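
As background to the grating work described above, the following back-of-envelope sketch applies two standard coupled-mode results for a uniform fibre Bragg grating: the Bragg condition and the tanh²(κL) peak reflectivity. The values are typical assumptions, not parameters from the thesis; apodised and chirped designs instead vary the index modulation and period along the grating length.

```python
import numpy as np

# Standard coupled-mode results for a *uniform* fibre Bragg grating
n_eff = 1.447          # effective index of the fibre mode (assumed)
period = 535.6e-9      # grating period (m), chosen to land near 1550 nm
delta_n = 1e-4         # UV-induced index modulation (typical magnitude)
eta = 0.8              # overlap of the mode with the photosensitive core
L = 10e-3              # grating length (m)

lambda_B = 2 * n_eff * period                  # Bragg condition
kappa = np.pi * delta_n * eta / lambda_B       # AC coupling coefficient
R_peak = np.tanh(kappa * L) ** 2               # peak reflectivity, tanh^2(kL)
print(f"Bragg wavelength: {lambda_B * 1e9:.1f} nm, peak R: {R_peak:.3f}")
```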

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Brain stem death can elicit a potentially manipulable cardiotoxic proinflammatory cytokine response. We investigated the prevalence of this response, the impact of donor management with tri-iodothyronine (T3) and methylprednisolone (MP) administration, and the relationship of biomarkers to organ function and transplant suitability. METHODS: In a prospective, randomized, double-blinded, factorially designed study of T3 and MP therapy, we measured serum levels of interleukin-1 and -6 (IL-1 and IL-6), tumor necrosis factor-alpha (TNF-alpha), C-reactive protein (CRP), and procalcitonin (PCT) in 79 potential heart or lung donors. Measurements were performed before and after 4 hr of algorithm-based donor management to optimize cardiorespiratory function, with or without hormone treatment. Donors were assigned to receive T3, MP, both drugs, or placebo. RESULTS: Initial IL-1 was elevated in 16% of donors, IL-6 in 100%, TNF-alpha in 28%, CRP in 98%, and PCT in 87%. Overall biomarker concentrations did not change between initial and later measurements, and neither T3 nor MP effected any change. Both PCT (P=0.02) and TNF-alpha (P=0.044) levels were higher in donor hearts with marginal hemodynamics at initial assessment. Higher PCT levels were related to worse cardiac index and right and left ventricular ejection fractions, and a PCT level greater than 2 ng/mL may attenuate any improvement in cardiac index gained by donor management. No differences were observed between initially marginal and nonmarginal donor lungs. A PCT level less than or equal to 2 ng/mL, but not other biomarkers, predicted transplant suitability following management. CONCLUSIONS: There is a high prevalence of a proinflammatory environment in the organ donor that is not affected by T3 or MP therapy. High PCT and TNF-alpha levels are associated with donor heart dysfunction. © 2009 Lippincott Williams & Wilkins, Inc.

Relevance:

30.00%

Publisher:

Abstract:

Visualization of high-dimensional data has always been a challenging task. Here we discuss and propose variants of non-linear data projection methods (Generative Topographic Mapping (GTM) and GTM with simultaneous feature saliency (GTM-FS)) that are adapted to be effective on very high-dimensional data. The adaptations use log-space values at certain steps of the Expectation Maximization (EM) algorithm and during the visualization process. We have tested the proposed algorithms by visualizing electrostatic potential data for Major Histocompatibility Complex (MHC) class-I proteins. The experiments show that the adapted versions of GTM and GTM-FS work successfully with data of more than 2000 dimensions, and we compare the results with other linear/non-linear projection methods: Principal Component Analysis (PCA), Neuroscale (NSC) and the Gaussian Process Latent Variable Model (GPLVM).
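
The log-space adaptation mentioned above can be illustrated with a short sketch: in thousands of dimensions the Gaussian likelihoods in the EM E-step underflow to zero, so responsibilities are kept as log-probabilities throughout and normalised with log-sum-exp. The function below is illustrative, not the authors' implementation.

```python
import numpy as np
from scipy.special import logsumexp

def log_responsibilities(X, centres, beta):
    """X: (N, D) data; centres: (K, D) mixture centres on the latent grid;
    beta: shared inverse variance. Returns log R with rows summing to 1
    in probability space."""
    D = X.shape[1]
    # log of the isotropic Gaussian density for every data/centre pair
    sq = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)  # (N, K)
    log_p = 0.5 * D * np.log(beta / (2 * np.pi)) - 0.5 * beta * sq
    return log_p - logsumexp(log_p, axis=1, keepdims=True)

X = np.random.default_rng(2).normal(size=(5, 2000))   # >2000 dimensions
centres = np.random.default_rng(3).normal(size=(25, 2000))
log_R = log_responsibilities(X, centres, beta=1.0)
print(np.exp(logsumexp(log_R, axis=1)))               # each row sums to 1
```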

Relevance:

30.00%

Publisher:

Abstract:

Multiple regression analysis is a complex statistical method with many potential uses. It has also become one of the most abused of all statistical procedures, since anyone with a database and suitable software can carry it out. An investigator should always have a clear hypothesis in mind before carrying out such a procedure, as well as knowledge of the limitations of each aspect of the analysis. In addition, multiple regression is probably best used in an exploratory context, identifying variables that might profitably be examined by more detailed studies. Where there are many variables potentially influencing Y, they are likely to be intercorrelated and to account for relatively small amounts of the variance. Any analysis in which R squared is less than 50% should be suspect as probably not indicating the presence of significant variables. A further problem relates to sample size. It is often stated that the number of subjects or patients must be at least 5-10 times the number of variables included in the study [5]. This advice should be taken only as a rough guide, but it does indicate that the variables included should be selected with great care, as inclusion of an obviously unimportant variable may have a significant impact on the sample size required.
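
The two rules of thumb above (treat R squared below 50% with suspicion, and aim for at least 5-10 subjects per variable) are easy to automate. The sketch below checks both on synthetic data; the thresholds are guidance, not hard rules.

```python
import numpy as np

# Fit a multiple regression by least squares and flag the two warning signs
# discussed above (synthetic data: only one predictor is truly useful).
rng = np.random.default_rng(4)
n, p = 60, 8                                   # 60 subjects, 8 predictors
X = rng.normal(size=(n, p))
y = 0.4 * X[:, 0] + rng.normal(size=n)

Xd = np.column_stack([np.ones(n), X])          # add intercept
beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
resid = y - Xd @ beta
r2 = 1 - resid.var() / y.var()                 # R^2 = 1 - SSE/SST

print(f"R^2 = {r2:.2f}" + ("  <- suspect (<0.50)" if r2 < 0.5 else ""))
print(f"subjects per variable = {n / p:.1f}"
      + ("  <- below the 5-10x guideline" if n / p < 5 else ""))
```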

Relevance:

30.00%

Publisher:

Abstract:

The use of oligonucleotides directed against the mRNA of HIV promises site-specific inhibition of viral replication. In this work, the effect of aralkyl substituents on oligonucleotide duplex stability was studied using model oligonucleotide sequences in an attempt to improve targeting of oligonucleotides to viral mRNA. Aralkyl-substituted oligonucleotides were made by solid-phase synthesis, either using the appropriate aralkyl-substituted phosphoramidite or by post-synthetic substitution of a pentafluorophenoxy substituent with N-methylphenethylamine. The presence of phenethyl or benzoyl substituents invariably resulted in thermodynamic destabilisation of all duplexes studied. The methods developed for the synthesis of nucleoside intermediates for oligonucleotide applications were also used to prepare a series of nucleoside analogues derived from uridine, 2'-deoxyuridine and AZT. Crystal structures of six compounds were successfully determined. Anti-HIV activity was observed for most compounds in the series, although none were without cytotoxicity. The most active compound of the series was the ribose nucleoside 1-β-D-erythro-pentofuranosyl-4-pentafluorophenoxy-pyrimidine-2(1H)-one 95, derived directly from uridine. The same series of compounds also displayed very modest anti-cancer activity. To enable synthesis of pro-oligonucleotides and analogues for possible antisense applications, the properties of a new Silyl-Linked Controlled Pore Glass solid support were investigated. Synthesis of the sequences d(Tp)7T, d(Tps)7T and the base-sensitive d(Tp)3(CBzp)2(Tp)2T was achieved using the silyl-linked solid support in a fluoride-induced cleavage/deprotection strategy.
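
As context for the duplex stability measurements, the sketch below applies the standard two-state (van't Hoff) relation between duplex enthalpy/entropy and melting temperature, showing how a destabilising substituent depresses Tm. The thermodynamic values are placeholders, not measurements from this work.

```python
import numpy as np

R_GAS = 1.987e-3                       # gas constant, kcal / (mol K)

def tm_two_state(dH, dS, ct=1e-6):
    """Tm (deg C) for a non-self-complementary duplex at total strand
    concentration ct (M): Tm = dH / (dS + R ln(ct/4))."""
    return dH / (dS + R_GAS * np.log(ct / 4.0)) - 273.15

# Placeholder thermodynamics: the substituted duplex binds less favourably
unmodified = tm_two_state(dH=-65.0, dS=-0.18)      # kcal/mol, kcal/(mol K)
substituted = tm_two_state(dH=-60.0, dS=-0.17)
print(f"dTm = {substituted - unmodified:+.1f} deg C")
```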

Relevance:

30.00%

Publisher:

Abstract:

Contrary to previously held beliefs, it is now known that bacteria exist not only on the surface of the skin but are also distributed at varying depths beneath the skin surface. Hence, in order to sterilise the skin, antimicrobial agents are required to penetrate across the skin and eliminate the bacteria residing at all depths. Chlorhexidine is the antimicrobial agent most widely used for skin sterilisation. However, due to its poor permeation rate across the skin, sterilisation of the skin cannot be achieved and the remaining bacteria can act as a source of infection during an operation or insertion of catheters. The underlying theme of this study is to enhance the permeation of this antimicrobial agent in the skin by employing chemical (enhancers and supersaturated systems) or physical (iontophoresis) techniques. The hydrochloride salt of chlorhexidine (CHX), a poorly soluble salt, was used throughout this study. The effect of ionisation on the in vitro permeation rate across excised human epidermis was investigated using Franz-type diffusion cells. Saturated solutions of CHX were used as the donor and the variable studied was vehicle pH. The permeation rate increased with increasing vehicle pH; this pH effect was not related to the level of ionisation of the drug. The effect of the donor vehicle was also studied using saturated solutions of CHX in 10% and 20% ethanol as the donor solutions. Permeation of CHX was enhanced by increasing the concentration of ethanol, which could be due to the higher concentration of CHX in the donor phase and the effect of ethanol itself on the membrane. The interplay between drug diffusion and enhancer pretreatment of the epidermis was studied. Pretreatment of the membrane with 10% Azone/PG produced the highest diffusion rate, followed by 10% oleic acid/PG pretreatment, compared with the other pretreatment regimens (ethanol, dimethyl sulfoxide (DMSO), propylene glycol (PG), sodium dodecyl sulphate (SDS) and dodecyl trimethyl ammonium bromide (DTAB)). Differential Scanning Calorimetry (DSC) was also employed to study the mode of action of these enhancers. The potential of supersaturated solutions for enhancing percutaneous absorption of CHX was investigated. Various anti-nucleating polymers were screened in order to establish the most effective agent. Polyvinylpyrrolidone (PVP, K30) was found to be a better candidate than its lower molecular weight counterpart (K25) and hydroxypropyl methylcellulose (HPMC). The permeation studies showed an increase in diffusion rate with increasing degree of saturation. Iontophoresis is a physical means of transdermal drug delivery enhancement that increases the penetration of molecules into or through the skin by the application of an electric field. This technique was employed in conjunction with chemical enhancers to assess the effect on CHX permeation across the human epidermis. An improved, pH-dependent transport of CHX was observed upon application of the current. Combined use of iontophoresis and chemical enhancers further increased CHX transport, indicating a synergistic effect; pretreatment of the membrane with 10% Azone/PG demonstrated the greatest effect.
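
For readers unfamiliar with how Franz-cell permeation data are worked up, the sketch below shows the usual calculation: fit the steady-state portion of the cumulative-amount-versus-time curve, take the slope per unit area as the flux J, and divide by the donor concentration for the permeability coefficient kp. All numbers are illustrative, not results from this thesis.

```python
import numpy as np

# Steady-state work-up of (hypothetical) Franz-cell data
t = np.array([4.0, 6.0, 8.0, 10.0, 12.0])        # h, steady-state window
Q = np.array([2.1, 3.4, 4.6, 5.9, 7.1])          # cumulative amount (ug)
area = 1.76                                       # diffusion area (cm^2)
c_donor = 800.0                                   # donor concentration (ug/mL)

slope, intercept = np.polyfit(t, Q, 1)            # ug/h over the linear region
J = slope / area                                  # flux, ug cm^-2 h^-1
kp = J / c_donor                                  # permeability coefficient, cm/h
lag = -intercept / slope                          # lag time from the x-intercept
print(f"J = {J:.3f} ug/cm^2/h, kp = {kp:.2e} cm/h, lag = {lag:.1f} h")
```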

Relevance:

30.00%

Publisher:

Abstract:

Objective: Qualitative research is increasingly valued as part of the evidence for policy and practice, but how it should be appraised is contested. Various appraisal methods, including checklists and other structured approaches, have been proposed but rarely evaluated. We aimed to compare three methods for appraising qualitative research papers that were candidates for inclusion in a systematic review of evidence on support for breast-feeding. Method: A sample of 12 research papers on support for breast-feeding was appraised by six qualitative reviewers using three appraisal methods: unprompted judgement, based on expert opinion; a UK Cabinet Office quality framework; and CASP, a Critical Appraisal Skills Programme tool. Following appraisal, papers were assigned to one of five categories, which were dichotomized to indicate whether or not papers should be included in a systematic review. Patterns of agreement in the categorization of papers were assessed quantitatively using κ statistics and qualitatively using cross-case analysis. Results: Agreement in categorizing papers across the three methods was slight (κ = 0.13; 95% CI 0.06-0.24). Structured approaches did not appear to yield higher agreement than unprompted judgement. Qualitative analysis revealed reviewers' dilemmas in deciding between the potential impact of findings and the quality of the research execution or reporting practice. Structured instruments appeared to make reviewers more explicit about the reasons for their judgements. Conclusions: Structured approaches may not produce greater consistency of judgements about whether to include qualitative papers in a systematic review. Future research should address how appraisals of qualitative research should be incorporated in systematic reviews. © The Royal Society of Medicine Press Ltd 2007.
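
As a reminder of how the agreement statistic reported above is computed, here is a minimal Cohen's kappa sketch for two reviewers' dichotomised include/exclude decisions on 12 papers. The labels are made up; the study itself assessed agreement across six reviewers and three methods.

```python
import numpy as np

def cohens_kappa(a, b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    a, b = np.asarray(a), np.asarray(b)
    p_o = np.mean(a == b)                          # observed agreement
    p_e = sum(np.mean(a == c) * np.mean(b == c)    # chance agreement
              for c in np.unique(np.concatenate([a, b])))
    return (p_o - p_e) / (1 - p_e)

rev1 = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0]       # 1 = include in review
rev2 = [1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 1, 1]
print(f"kappa = {cohens_kappa(rev1, rev2):.2f}")
```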

Relevance:

30.00%

Publisher:

Abstract:

The aim of the research project was to gain a complete and accurate accounting of the needs and deficiencies of materials selection and design data, with particular attention given to the feasibility of a computerised materials selection system that would include application analysis, property data and screening techniques. The project also investigates and integrates the three major aspects of materials resources, materials selection and materials recycling. Consideration of the materials resource base suggests that, though our discovery potential has increased, geologic availability is the ultimate determinant, and several metals may well become scarce at the same time, thus compounding the problem of substitution. With around 2 to 20 million units of engineering materials data, the use of a computer is the only logical answer for scientific selection of materials. The system developed at Aston is used for data storage, mathematical computation and output, and enables programs to be run in batch and interactive (on-line) modes. With modification, the program can also handle such variables as the quantity of mineral resources, the energy cost of materials, and the depletion and utilisation rates of strategic materials. The work also carries out an in-depth study of copper recycling in the U.K. and concludes that somewhere in the region of 2 million tonnes of copper is missing from the recycling cycle. It also sets out guidelines on product design and conservation policies from the recyclability point of view.
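
To illustrate the screening step such a computerised selection system performs, here is a toy sketch that filters a small materials table on property thresholds and ranks the survivors. The data, thresholds and ranking metric are invented for illustration; the Aston system ran on a far larger engineering-materials database.

```python
# Toy property-based materials screening and ranking
materials = [
    {"name": "Al 6061",    "density": 2.70, "yield_MPa": 276, "cost_kg": 2.5},
    {"name": "Ti-6Al-4V",  "density": 4.43, "yield_MPa": 880, "cost_kg": 20.0},
    {"name": "Mild steel", "density": 7.85, "yield_MPa": 250, "cost_kg": 0.8},
]

# Screening: keep candidates meeting minimum strength and maximum density
candidates = [m for m in materials
              if m["yield_MPa"] >= 260 and m["density"] <= 5.0]

# Ranking: e.g. specific strength per unit cost, highest first
candidates.sort(key=lambda m: m["yield_MPa"] / (m["density"] * m["cost_kg"]),
                reverse=True)
for m in candidates:
    print(m["name"])
```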