9 results for RAPID METHODS
in Aston University Research Archive
Abstract:
The rapid global loss of biodiversity has led to a proliferation of systematic conservation planning methods. In spite of their utility and mathematical sophistication, these methods only provide approximate solutions to real-world problems where there is uncertainty and temporal change. The consequences of errors in these solutions are seldom characterized or addressed. We propose a conceptual structure for exploring the consequences of input uncertainty and oversimplified approximations to real-world processes for any conservation planning tool or strategy. We then present a computational framework based on this structure to quantitatively model species representation and persistence outcomes across a range of uncertainties. These include factors such as land costs, landscape structure, species composition and distribution, and temporal changes in habitat. We demonstrate the utility of the framework using several reserve selection methods including simple rules of thumb and more sophisticated tools such as Marxan and Zonation. We present new results showing how outcomes can be strongly affected by variation in problem characteristics that are seldom compared across multiple studies. These characteristics include number of species prioritized, distribution of species richness and rarity, and uncertainties in the amount and quality of habitat patches. We also demonstrate how the framework allows comparisons between conservation planning strategies and their response to error under a range of conditions. Using the approach presented here will improve conservation outcomes and resource allocation by making it easier to predict and quantify the consequences of many different uncertainties and assumptions simultaneously. Our results show that without more rigorously generalizable results, it is very difficult to predict the amount of error in any conservation plan.
These results imply the need for standard practice to include evaluating the effects of multiple real-world complications on the behavior of any conservation planning method.
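The "simple rules of thumb" for reserve selection mentioned in this abstract can be illustrated with a minimal sketch. The code below is a hypothetical greedy heuristic (not the authors' framework, and far simpler than Marxan or Zonation): it repeatedly buys the site adding the most unrepresented species per unit cost until the budget runs out. The site names, species lists, and costs are invented for illustration.

```python
def greedy_reserve_selection(species_by_site, costs, budget):
    """Greedy rule of thumb: repeatedly pick the affordable site that adds
    the most not-yet-covered species per unit cost, until none qualifies."""
    selected, covered = [], set()
    while True:
        best, best_score = None, 0.0
        for site, species in species_by_site.items():
            if site in selected or costs[site] > budget:
                continue
            gain = len(set(species) - covered)       # new species this site adds
            score = gain / costs[site]               # benefit per unit cost
            if score > best_score:
                best, best_score = site, score
        if best is None:                             # nothing affordable adds species
            break
        selected.append(best)
        covered |= set(species_by_site[best])
        budget -= costs[best]
    return selected, covered

# Hypothetical landscape: four candidate sites with species lists and land costs
sites = {"A": ["sp1", "sp2"], "B": ["sp2", "sp3", "sp4"],
         "C": ["sp5"], "D": ["sp1", "sp5"]}
costs = {"A": 2.0, "B": 3.0, "C": 1.0, "D": 2.5}
picked, covered = greedy_reserve_selection(sites, costs, budget=4.0)
```

Perturbing `costs` across repeated runs is one simple way to probe how sensitive such a rule of thumb is to input uncertainty, which is the kind of question the abstract's framework addresses systematically.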
Abstract:
The timeline imposed by recent worldwide chemical legislation is not amenable to conventional in vivo toxicity testing, requiring the development of rapid, economical in vitro screening strategies which have acceptable predictive capacities. When acquiring regulatory neurotoxicity data, distinction on whether a toxic agent affects neurons and/or astrocytes is essential. This study evaluated neurofilament (NF) and glial fibrillary acidic protein (GFAP) directed single-cell (S-C) ELISA and flow cytometry as methods for distinguishing cell-specific cytoskeletal responses, using the established human NT2 neuronal/astrocytic (NT2.N/A) co-culture model and a range of neurotoxic (acrylamide, atropine, caffeine, chloroquine, nicotine) and non-neurotoxic (chloramphenicol, rifampicin, verapamil) test chemicals. NF and GFAP directed flow cytometry was able to identify several of the test chemicals as being specifically neurotoxic (chloroquine, nicotine) or astrocytotoxic (atropine, chloramphenicol) via quantification of cell death in the NT2.N/A model at cytotoxic concentrations using the resazurin cytotoxicity assay. Those neurotoxicants with low associated cytotoxicity are the most significant in terms of potential hazard to the human nervous system. The NF and GFAP directed S-C ELISA data predominantly demonstrated the known neurotoxicants only to affect the neuronal and/or astrocytic cytoskeleton in the NT2.N/A cell model at concentrations below those affecting cell viability. This report concluded that NF and GFAP directed S-C ELISA and flow cytometric methods may prove to be valuable additions to an in vitro screening strategy for differentiating cytotoxicity from specific neuronal and/or astrocytic toxicity. Further work using the NT2.N/A model and a broader array of toxicants is appropriate in order to confirm the applicability of these methods.
Abstract:
Objectives: To determine the sensitivity and specificity of a novel ELISA for the serodiagnosis of surgical site infection (SSI) due to staphylococci following median sternotomy. Methods: Twelve patients with a superficial sternal SSI and 19 with a deep sternal SSI due to Staphylococcus aureus were compared with 37 control patients who also underwent median sternotomy for cardiac surgery but exhibited no microbiological or clinical symptoms of infection. A further five patients with sternal SSI due to coagulase-negative staphylococci (CoNS) were studied. An ELISA incorporating a recently recognised exocellular short chain form of lipoteichoic acid (lipid S), recovered from CoNS, was used to determine serum levels of anti-lipid S IgG in all patient groups. Results: Serum anti-lipid S IgG titres of patients with sternal SSI due to S. aureus were significantly higher than those of the control patients (P<0.0001). In addition, patients with deep sternal SSI had significantly higher serum anti-lipid S IgG titres than patients with superficial sternal SSI (P=0.03). Serum anti-lipid S IgG titres of patients with sternal SSI due to CoNS were significantly higher than those of the control patients (P=0.001). Conclusion: The lipid S ELISA may facilitate the diagnosis of sternal SSI due to S. aureus and could also be of value with infection due to CoNS. © 2005 Published by Elsevier Ltd. on behalf of The British Infection Society.
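The sensitivity and specificity this abstract sets out to determine reduce to simple counting once a titre cutoff is chosen. The sketch below shows the standard calculation; the cutoff and titre values are invented for illustration and are not the study's data.

```python
def sensitivity_specificity(case_titres, control_titres, cutoff):
    """Call a serum positive when its anti-lipid S IgG titre exceeds the
    cutoff, then compute sensitivity and specificity from the 2x2 counts."""
    tp = sum(t > cutoff for t in case_titres)        # infected, test positive
    fn = len(case_titres) - tp                       # infected, test negative
    tn = sum(t <= cutoff for t in control_titres)    # uninfected, test negative
    fp = len(control_titres) - tn                    # uninfected, test positive
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Illustrative titres only: 5 SSI patients vs 5 uninfected controls
cases = [800, 1200, 250, 3000, 95]
controls = [40, 10, 150, 60, 30]
sens, spec = sensitivity_specificity(cases, controls, cutoff=100)
```

In practice the cutoff would be chosen from the data, e.g. by sweeping it across observed titres and reading off the ROC curve.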
Abstract:
Oxidation of proteins has received considerable attention in recent decades because oxidized proteins have been shown to accumulate in, and to be implicated in the progression and pathophysiology of, several diseases such as Alzheimer's disease and coronary heart disease. As a result, researchers have become increasingly concerned with accurately measuring the level of oxidized protein in biological materials and with determining the precise site of oxidative attack on the protein, in order to gain insight into the molecular mechanisms involved in the progression of disease. Several methods for measuring protein carbonylation have been implemented in laboratories around the world. However, to date no method prevails as the most accurate, reliable and robust. The present paper gives an overview of the common methods used to determine protein carbonylation in biological material and highlights their limitations and potential. The ultimate goal is to offer quick tips for rapid decision making when a method has to be selected, taking into consideration the advantages and drawbacks of each method.
Abstract:
The evolution of cognitive neuroscience has been spurred by the development of increasingly sophisticated investigative techniques to study human cognition. In Methods in Mind, experts examine the wide variety of tools available to cognitive neuroscientists, paying particular attention to the ways in which different methods can be integrated to strengthen empirical findings and how innovative uses for established techniques can be developed. The book will be a uniquely valuable resource for the researcher seeking to expand his or her repertoire of investigative techniques. Each chapter explores a different approach. These include transcranial magnetic stimulation, cognitive neuropsychiatry, lesion studies in nonhuman primates, computational modeling, psychophysiology, single neurons and primate behavior, grid computing, eye movements, fMRI, electroencephalography, imaging genetics, magnetoencephalography, neuropharmacology, and neuroendocrinology. As mandated, authors focus on convergence and innovation in their fields; chapters highlight such cross-method innovations as the use of the fMRI signal to constrain magnetoencephalography, the use of electroencephalography (EEG) to guide rapid transcranial magnetic stimulation at a specific frequency, and the successful integration of neuroimaging and genetic analysis. Computational approaches depend on increased computing power, and one chapter describes the use of distributed or grid computing to analyze massive datasets in cyberspace. Each chapter author is a leading authority in the technique discussed.
Abstract:
Aim: To develop and evaluate a rapid enzyme linked immunosorbent assay (ELISA) for the diagnosis of intravascular catheter related sepsis caused by coagulase negative staphylococci. Methods: Forty patients with a clinical and microbiological diagnosis of intravascular catheter related sepsis and positive blood cultures, caused by coagulase negative staphylococci, and 40 control patients requiring a central venous catheter as part of their clinical management were recruited into the study. Serum IgG responses to a previously undetected exocellular antigen produced by coagulase negative staphylococci, termed lipid S, were determined in the patient groups by a rapid ELISA. Results: There was a significant difference (p < 0.0001) in serum IgG to lipid S between patients with catheter related sepsis and controls. The mean antibody titre in patients with sepsis caused by coagulase negative staphylococci was 10 429 (range, no detectable serum IgG antibody to 99 939), whereas serum IgG was not detected in the control group of patients. Conclusions: The rapid ELISA offers a simple, economical, and rapid diagnostic test for suspected intravascular catheter related sepsis caused by coagulase negative staphylococci, which can be difficult to diagnose clinically. This may facilitate treatment with appropriate antimicrobials and may help prevent the unnecessary removal of intravascular catheters.
Abstract:
This paper describes work carried out to develop methods of verifying that machine tools are capable of machining parts to within specification, immediately before carrying out critical material removal operations, and with negligible impact on process times. A review of machine tool calibration and verification technologies identified that current techniques were not suitable due to requirements for significant time and skilled human intervention. A 'solution toolkit' is presented consisting of a selection of circular tests and artefact probing which are able to rapidly verify the kinematic errors, and in some cases also dynamic errors, for different types of machine tool, as well as supplementary methods for tool and spindle error detection. A novel artefact probing process is introduced which simplifies data processing so that the process can be readily automated using only the native machine tool controller. Laboratory testing and industrial case studies are described which demonstrate the effectiveness of this approach.
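The circular tests mentioned above boil down to probing points around a nominal circle and analysing the radial deviations. The sketch below is a deliberately naive illustration (not the paper's toolkit): it takes the centroid of the probed points as the circle centre, which is only a reasonable estimate when the points are spread roughly evenly around the circle, and reports the mean radius and out-of-roundness band. The probe readings are invented.

```python
import math

def circular_test_error(points):
    """Centroid circle fit: use the mean of the probed (x, y) points as the
    centre, then report mean radius and radial deviation band (max - min).
    Valid only for points roughly evenly spaced around the circle."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    radii = [math.hypot(x - cx, y - cy) for x, y in points]
    mean_r = sum(radii) / len(radii)
    roundness = max(radii) - min(radii)              # out-of-roundness band
    return mean_r, roundness

# Hypothetical probe readings (mm) around a nominal 50 mm artefact,
# with a small error deliberately injected at the fourth point
pts = [(50.0, 0.0), (0.0, 50.0), (-50.0, 0.0), (0.0, -50.02)]
r, err = circular_test_error(pts)
```

A production implementation would use a proper least-squares circle fit and many more probe points, but the pass/fail logic, comparing the deviation band against a tolerance, has the same shape.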
Abstract:
Oxidative post-translational modifications (oxPTMs) can alter the function of proteins, and are important in the redox regulation of cell behaviour. The most informative technique to detect and locate oxPTMs within proteins is mass spectrometry (MS). However, proteomic MS data are usually searched against theoretical databases using statistical search engines, and the occurrence of unspecified or multiple modifications, or other unexpected features, can lead to failure to detect the modifications and erroneous identifications of oxPTMs. We have developed a new approach for mining data from accurate mass instruments that allows multiple modifications to be examined. Accurate mass extracted ion chromatograms (XIC) for specific reporter ions from peptides containing oxPTMs were generated from standard LC-MS/MS data acquired on a rapid-scanning high-resolution mass spectrometer (ABSciex 5600 Triple TOF). The method was tested using proteins from human plasma or isolated LDL. A variety of modifications including chlorotyrosine, nitrotyrosine, kynurenine, oxidation of lysine, and oxidized phospholipid adducts were detected. For example, the use of a reporter ion at 184.074 Da/e, corresponding to phosphocholine, was used to identify for the first time intact oxidized phosphatidylcholine adducts on LDL. In all cases the modifications were confirmed by manual sequencing. ApoB-100 containing oxidized lipid adducts was detected even in healthy human samples, as well as in LDL from patients with chronic kidney disease. The accurate mass XIC method gave a lower false positive rate than normal database searching using statistical search engines, and identified more oxidatively modified peptides. A major advantage was that additional modifications could be searched for after data collection, and multiple modifications on a single peptide identified. The oxPTMs present on albumin and ApoB-100 have potential as indicators of oxidative damage in ageing or inflammatory diseases.
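The core of the accurate-mass XIC approach described above is a narrow m/z window around a diagnostic reporter ion, applied across all spectra. The sketch below shows that filtering step on toy data (the spectra are invented; the 184.074 value is the phosphocholine reporter ion from the abstract); it is a minimal illustration, not the authors' pipeline.

```python
def xic_for_reporter(spectra, target_mz, ppm_tol=10.0):
    """Build an extracted ion chromatogram: for each spectrum, sum the
    intensity of peaks whose m/z lies within a ppm window of the target."""
    window = target_mz * ppm_tol / 1e6               # absolute tolerance in Da
    xic = []
    for rt, peaks in spectra:                        # (retention time, [(mz, intensity), ...])
        signal = sum(i for mz, i in peaks if abs(mz - target_mz) <= window)
        xic.append((rt, signal))
    return xic

# Toy spectra: only the first peak falls inside a 10 ppm window of 184.074
spectra = [
    (12.1, [(184.0739, 5000.0), (300.2, 900.0)]),
    (12.2, [(184.0900, 4000.0)]),                    # ~87 ppm away -> excluded
]
xic = xic_for_reporter(spectra, 184.074)
```

Spectra whose XIC signal is non-zero would then be taken forward for peptide identification and, as in the abstract, confirmation by manual sequencing.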
Abstract:
Purpose: To determine whether the 'through-focus' aberrations of patients implanted with multifocal or accommodating intraocular lenses (IOLs) can be used to provide rapid and reliable measures of their subjective range of clear vision. Methods: Eyes that had been implanted with a concentric (n = 8), segmented (n = 10) or accommodating (n = 6) intraocular lens (mean age 62.9 ± 8.9 years; range 46-79 years) for over a year underwent simultaneous monocular subjective (electronic logMAR test chart at 4 m with letters randomised between presentations) and objective (Aston open-field aberrometer) defocus curve testing for levels of defocus between +1.50 and -5.00 DS in -0.50 DS steps, in a randomised order. Pupil size and ocular aberration (a combination of the patient's and the defocus-inducing lens aberrations) at each level of blur were measured by the aberrometer. Visual acuity was measured subjectively at each level of defocus to determine the traditional defocus curve. Objective acuity was predicted using image quality metrics. Results: The range of clear focus differed between the three IOL types (F = 15.506, P = 0.001) as well as between subjective and objective defocus curves (F = 6.685, P = 0.049). There was no statistically significant difference between subjective and objective defocus curves in the segmented or concentric ring MIOL groups (P > 0.05). However, a difference was found between the two measures in the accommodating IOL group (P < 0.001). Mean delta logMAR (predicted minus measured logMAR) across all target vergences was -0.06 ± 0.19. Predicted logMAR defocus curves for the multifocal IOLs did not show a near vision addition peak, unlike the subjective measurement of visual acuity. However, there was a strong positive correlation between measured and predicted logMAR for all three IOLs (Pearson's correlation: P < 0.001).
Conclusions: Current subjective procedures are lengthy and do not enable important additional measures, such as defocus curves under different luminance or contrast levels, to be assessed, which may limit our understanding of MIOL performance in real-world conditions. In general, objective aberrometry measures correlated well with the subjective assessment, indicating the relative robustness of this technique in evaluating post-operative success with segmented and concentric ring MIOLs.
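The "range of clear vision" extracted from a defocus curve is typically the dioptric span over which acuity stays at or better than some criterion. The sketch below shows one common way to compute it, linear interpolation at the crossings of the criterion line; the curve data and the 0.3 logMAR criterion are illustrative assumptions, not values from this study.

```python
def range_of_clear_vision(defocus_curve, criterion=0.3):
    """Estimate the dioptric range over which logMAR acuity is at least as
    good as the criterion (lower logMAR = better), by linearly interpolating
    the defocus levels where the curve crosses the criterion line."""
    crossings = []
    for (d1, a1), (d2, a2) in zip(defocus_curve, defocus_curve[1:]):
        if (a1 - criterion) * (a2 - criterion) < 0:  # criterion line crossed
            crossings.append(d1 + (criterion - a1) * (d2 - d1) / (a2 - a1))
    if len(crossings) >= 2:
        return max(crossings) - min(crossings)
    return 0.0

# Illustrative (defocus in DS, logMAR acuity) pairs, best acuity at 0.00 DS
curve = [(1.0, 0.5), (0.5, 0.2), (0.0, 0.0),
         (-1.0, 0.1), (-2.0, 0.2), (-3.0, 0.5)]
rng = range_of_clear_vision(curve, criterion=0.3)
```

Running the same function on subjective and on aberrometry-predicted curves gives directly comparable ranges, which is the kind of comparison the study's F-tests quantify.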