982 results for TOTAL ANALYSIS SYSTEMS


Relevance: 80.00%
Publisher:
Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Relevance: 80.00%
Publisher:
Abstract:

The authors describe a reversed-phase high-performance liquid chromatography-electrospray tandem mass spectrometry method for the measurement of nicotine in human plasma. Samples (500 µL), with deuterium-labelled d3-nicotine added as an internal standard (IS), were treated with a two-step process of ether extraction (6 mL) followed by back-extraction into 0.1% formic acid (50 µL). Chromatography was performed on a phenyl Novapak column with a mobile phase of 10 mM ammonium formate (pH 3.3) and acetonitrile (50:50, vol/vol). A flow rate of 0.2 mL/min resulted in a total analysis time of 5 minutes per sample. Mass spectrometric detection used selected reaction monitoring (nicotine m/z 163.2 → 130.2; IS m/z 166.2 → 87.2). The assay was linear from 0.5 to 100 µg/L (r > 0.993, n = 9). The accuracy and imprecision of the method for quality control samples were 87.5% to 113% and < 10.2%, respectively. Interday accuracy and imprecision at the limit of quantification (0.5 µg/L) were 113% and 7.2% (n = 4). The process efficiency for nicotine in plasma was > 75%. The method has good process efficiency, stabilizes nicotine, avoids concentration steps and, most importantly, minimizes potential contamination. Further, we established that water-based standards and controls are interchangeable with plasma-based samples. The method was used successfully to measure the pharmacokinetic profiles of subjects involved in the development of an aerosol inhalation drug delivery system.
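The internal-standard quantification scheme described above can be sketched as follows: the analyte/IS peak-area ratio is fitted against known standard concentrations, and unknowns are read off the fitted line. All peak areas and calibration values below are illustrative, not from the study.

```python
# Sketch of internal-standard calibration for an LC-MS/MS assay.
# All peak areas and concentrations are hypothetical, not from the study.

def fit_line(xs, ys):
    """Ordinary least-squares fit y = slope*x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# Calibration standards spanning the assay's linear range (0.5-100 µg/L).
conc = [0.5, 1, 5, 10, 50, 100]                  # nicotine, µg/L
ratio = [0.011, 0.021, 0.10, 0.20, 1.01, 2.0]    # analyte/IS peak-area ratio

slope, intercept = fit_line(conc, ratio)

def quantify(analyte_area, is_area):
    """Convert a sample's peak areas to a concentration via the calibration."""
    return ((analyte_area / is_area) - intercept) / slope

print(quantify(5200, 10400))  # area ratio 0.5 -> roughly 25 µg/L
```

Ratioing against the co-eluting labelled IS is what makes the assay tolerant of extraction losses, which is also why water-based standards can stand in for plasma-based ones.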

Relevance: 80.00%
Publisher:
Abstract:

The effect of methyl jasmonate treatment on gene expression in sugarcane roots, and on signalling between roots and shoots, was studied. A collection of 829 ESTs was obtained from sugarcane roots treated with the defence regulator methyl jasmonate (MJ). A subset of 747 of these was combined with 4793 sugarcane ESTs obtained from stem tissues in a cDNA microarray, and experiments were undertaken to identify genes that were induced in roots 24-120 h following treatment with MJ. Two data analysis systems (t-statistic and tRMA) were used to analyse the microarray results; these methods identified a common set of 21 ESTs corresponding to transcripts significantly induced by MJ in roots, and 23 whose expression was reduced following MJ treatment. The induction of six transcripts identified in the microarray analysis was tested and confirmed using northern blotting. Homologues of genes encoding lipoxygenase and PR-10 proteins were induced 8-24 h after MJ treatment, while the other four selected transcripts were induced at later time points. Following treatment of roots with MJ, the lipoxygenase homologue, but not the PR-10 homologue, was induced in untreated stem and leaf tissues. The PR-10 homologue and a PR-1 homologue, but not the lipoxygenase homologue, were induced in untreated tissues after the application of salicylic acid (SA) to roots. Repeated foliar application of MJ had no apparent effects on plant growth and was demonstrated to increase lipoxygenase transcripts in roots, but did not increase transcript levels of the other genes tested. These results lay a foundation for further studies of induced pest and disease resistance in sugarcane roots. (C) 2004 Elsevier Ireland Ltd. All rights reserved.
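A t-statistic screen of the kind mentioned above can be sketched on synthetic data: for each transcript, replicate intensities in treated and control roots are compared, and transcripts with a large |t| are flagged as candidate MJ-induced genes. The expression values and the cutoff below are illustrative, not from the study.

```python
# Illustrative per-transcript t-statistic screen for a microarray experiment
# (synthetic log2 intensities; not data from the study).
import math

def t_statistic(a, b):
    """Welch's t-statistic between two small groups of log-expression values."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    return (ma - mb) / math.sqrt(va / na + vb / nb)

# Hypothetical replicate log2 intensities: MJ-treated vs. control roots.
treated = {"LOX": [8.1, 8.3, 8.0], "PR-10": [7.6, 7.9, 7.7], "EST-x": [5.1, 5.0, 5.2]}
control = {"LOX": [6.0, 6.2, 6.1], "PR-10": [6.1, 6.0, 6.3], "EST-x": [5.0, 5.2, 5.1]}

# Flag transcripts with a large |t| as candidate MJ-induced genes.
induced = [g for g in treated if t_statistic(treated[g], control[g]) > 4]
print(induced)  # the lipoxygenase and PR-10 stand-ins pass; EST-x does not
```

In practice the cutoff would be set from the t-distribution (or by permutation) rather than the fixed value used here, and candidates would be confirmed by northern blotting as in the study.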

Relevance: 80.00%
Publisher:
Abstract:

In this work, we analyse and compare the continuous-variable tripartite entanglement available from the use of two concurrent or cascaded χ(2) nonlinearities. We examine both idealized travelling-wave models and more experimentally realistic intracavity models, showing that tripartite entangled outputs are readily producible. These may be a useful resource for applications such as quantum cryptography and teleportation.

Relevance: 80.00%
Publisher:
Abstract:

The paper provides evidence that spatial indexing structures offer faster resolution of Formal Concept Analysis queries than B-Tree/hash methods. We show that many Formal Concept Analysis operations, such as computing contingent and extent sizes and listing the matching objects, enjoy improved performance with the use of spatial indexing structures such as the RD-Tree. Speed improvements of up to eighty times were observed, depending on the data and query. The motivation for our study is the application of Formal Concept Analysis to Semantic File Systems, where millions of formal objects must be dealt with. Spatial indexing was also found to provide an effective indexing technique for more general-purpose applications requiring scalability in Formal Concept Analysis systems. The coverage and benchmarking are presented with general applications in mind.
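The extent query at the heart of these operations is a set-containment query: find every object whose attribute set contains all the query attributes. A minimal sketch, using a toy formal context and a naive linear scan over attribute bitmasks standing in for the RD-Tree index (the file names and attributes are invented for illustration):

```python
# Sketch of the extent query that a spatial index such as an RD-Tree accelerates.
# A naive scan over attribute bitmasks stands in for the index here
# (toy formal context; not code or data from the paper).

attrs = ["readable", "writable", "executable", "hidden"]
bit = {a: 1 << i for i, a in enumerate(attrs)}

# Formal context: each object mapped to its attribute set, stored as a bitmask.
objects = {
    "a.txt":  bit["readable"] | bit["writable"],
    "run.sh": bit["readable"] | bit["executable"],
    "key":    bit["readable"] | bit["hidden"],
    "log":    bit["readable"] | bit["writable"] | bit["hidden"],
}

def extent(query_attrs):
    """Objects possessing every attribute in the query (a containment query)."""
    q = 0
    for a in query_attrs:
        q |= bit[a]
    return sorted(o for o, m in objects.items() if m & q == q)

print(extent(["readable", "writable"]))  # ['a.txt', 'log']
print(len(extent(["readable"])))         # extent size: 4
```

An RD-Tree stores such set signatures hierarchically, with each inner node holding the union of its children's sets, so whole subtrees that cannot contain the query set are pruned instead of scanned.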

Relevance: 80.00%
Publisher:
Abstract:

This review will discuss the use of manual grading scales, digital photography, and automated image analysis in the quantification of fundus changes caused by age-related macular disease. Digital imaging permits processing of images for enhancement, comparison, and feature quantification, and these techniques have been investigated for automated drusen analysis. The accuracy of automated analysis systems has been enhanced by the incorporation of interactive elements, such that the user is able to adjust the sensitivity of the system, or manually add and remove pixels. These methods capitalize on both computer and human image feature recognition and the advantage of computer-based methodologies for quantification. The histogram-based adaptive local thresholding system is able to extract useful information from the image without being affected by the presence of other structures. More recent developments involve compensation for fundus background reflectance, which has most recently been combined with the Otsu method of global thresholding. This method is reported to provide results comparable with manual stereo viewing. Developments in this area are likely to encourage wider use of automated techniques. This will make the grading of photographs easier and cheaper for clinicians and researchers. © 2007 Elsevier Inc. All rights reserved.
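The Otsu method of global thresholding mentioned above picks the grey level that maximises the variance between the two resulting pixel classes. A minimal pure-Python sketch on a synthetic patch (the image values are invented for illustration; real systems would first compensate for fundus background reflectance):

```python
# Sketch of Otsu's global thresholding method (illustrative, pure Python).

def otsu_threshold(pixels):
    """Grey level (0-255) that maximises between-class variance."""
    hist = [0] * 256
    for v in pixels:
        hist[v] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w0 = 0      # pixel count of class 0 (levels <= t)
    sum0 = 0    # intensity sum of class 0
    for t in range(256):
        w0 += hist[t]
        sum0 += t * hist[t]
        if w0 == 0 or w0 == total:
            continue  # one class empty: variance undefined
        mu0 = sum0 / w0
        mu1 = (sum_all - sum0) / (total - w0)
        var_between = w0 * (total - w0) * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_t, best_var = t, var_between
    return best_t

# Synthetic patch: dark background (level 40) with a brighter drusen-like blob (200).
img = [40] * 3996 + [200] * 100
t = otsu_threshold(img)
print(t, sum(1 for v in img if v > t))  # threshold 40; 100 candidate pixels
```

The interactive systems described in the review would then let the user adjust this automatically chosen threshold, or add and remove pixels from the resulting mask.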

Relevance: 80.00%
Publisher:
Abstract:

The principal theme of this thesis is the advancement and expansion of ophthalmic research via collaboration between professional engineers and professional optometrists. The aim has been to develop new and novel approaches and solutions to contemporary problems in the field. The work is subdivided into three areas of investigation: 1) high-technology systems, 2) modification of current systems to increase functionality, and 3) development of smaller, more portable and cost-effective systems. High-Technology Systems: A novel high-speed Optical Coherence Tomography (OCT) system with integrated simultaneous high-speed photography was developed, achieving better operational speed than is currently available commercially. The mechanical design of the system featured a novel 8-axis alignment system. A full set of capture, analysis and post-processing software was developed, providing custom analysis systems for ophthalmic OCT imaging and expanding the current capabilities of the technology. A large clinical trial was undertaken to test the dynamics of contact lens edge interaction with the cornea in vivo. The interaction between lens edge design, lens base curvature, post-insertion times and edge positions was investigated. A novel method for correction of optical distortion when assessing lens indentation was also demonstrated. Modification of Current Systems: A commercial autorefractor, the WAM-5500, was modified with the addition of extra hardware and a custom software and firmware solution to produce a system capable of measuring dynamic accommodative response to various stimuli in real time. A novel software package to control the data capture process was developed, allowing real-time monitoring of data by the practitioner and adding considerable functionality beyond the standard system.
The device was used to assess the differences in accommodative response between subjects who had worn UV-blocking contact lenses for 5 years and a control group that had not. While standard static measurement of accommodation showed no differences between the two groups, the UV-blocking group did show better (faster) accommodative rise and fall times, demonstrating the benefits of modifying this commercially available instrumentation. Portable and Cost-Effective Systems: A new instrument was developed to expand the capability of the now defunct Keeler Tearscope. The device provided a similar capability, allowing observation of the reflected mires from the tear film surface, but with the added advantage of being able to record the observations. It was tested comparatively against the Tearscope and other tear film break-up techniques, demonstrating its potential. In Conclusion: This work has demonstrated the advantages of interdisciplinary research between engineering and optometry, providing new and novel instrumented solutions as well as adding to the sum of scientific understanding in the ophthalmic field.

Relevance: 80.00%
Publisher:
Abstract:

The research presented in this thesis was developed as part of DIBANET, an EC-funded project aiming to develop an energetically self-sustainable process for the production of diesel-miscible biofuels (i.e. ethyl levulinate) via acid hydrolysis of selected biomass feedstocks. Three thermal conversion technologies, pyrolysis, gasification and combustion, were evaluated in the present work with the aim of recovering the energy stored in the acid hydrolysis solid residue (AHR). Mainly consisting of lignin and humins, the AHR can contain up to 80% of the energy in the original feedstock. Pyrolysis of AHR proved unsatisfactory, so attention focussed on gasification and combustion with the aim of producing heat and/or power to supply the energy demanded by the ethyl levulinate production process. A thermal processing rig consisting of a Laminar Entrained Flow Reactor (LEFR) equipped with solid and liquid collection and online gas analysis systems was designed and built to explore pyrolysis, gasification and air-blown combustion of AHR. The maximum liquid yield for pyrolysis of AHR was 30 wt% with volatile conversion of 80%. Gas yield for AHR gasification was 78 wt%, with 8 wt% tar yield and conversion of volatiles close to 100%. In combustion, 90 wt% of the AHR was transformed into gas, with volatile conversions above 90%. Gasification in 5 vol% O2 / 95 vol% N2 resulted in a nitrogen-diluted, low heating value gas (2 MJ/m3). Steam and oxygen-blown gasification of AHR were additionally investigated in a batch gasifier at KTH in Sweden. Steam promoted the formation of hydrogen (25 vol%) and methane (14 vol%), improving the gas heating value to 10 MJ/m3, below the typical value for steam gasification due to equipment limitations. Arrhenius kinetic parameters were calculated using data collected with the LEFR to provide reaction rate information for process design and optimisation.
Activation energy (EA) and pre-exponential factor (k0, in s-1) for pyrolysis (EA = 80 kJ/mol, ln k0 = 14), gasification (EA = 69 kJ/mol, ln k0 = 13) and combustion (EA = 42 kJ/mol, ln k0 = 8) were calculated after linearly fitting the data using the random pore model. Kinetic parameters for pyrolysis and combustion were also determined by dynamic thermogravimetric analysis (TGA), including studies of the original biomass feedstocks for comparison. Results obtained by differential and integral isoconversional methods for activation energy determination were compared. Activation energy calculated by the Vyazovkin method was 103-204 kJ/mol for pyrolysis of untreated feedstocks and 185-387 kJ/mol for AHRs. Combustion activation energy was 138-163 kJ/mol for biomass and 119-158 kJ/mol for AHRs. The non-linear least squares method was used to determine the reaction model and pre-exponential factor. Pyrolysis and combustion of biomass were best modelled by a combination of third-order reaction and 3-dimensional diffusion models, while AHR decomposed following the third-order reaction model for pyrolysis and the 3-dimensional diffusion model for combustion.
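The fitted Arrhenius parameters quoted above translate directly into rate constants via k = k0 exp(-EA/RT). A small sketch using those values (the evaluation temperature of 1173 K is chosen for illustration and is not taken from the thesis):

```python
# Arrhenius rate constants from the fitted parameters quoted above
# (EA in J/mol, ln k0 with k0 in s-1); the temperature chosen is illustrative.
import math

R = 8.314  # gas constant, J/(mol K)

params = {                     # (EA [J/mol], ln k0)
    "pyrolysis":    (80e3, 14),
    "gasification": (69e3, 13),
    "combustion":   (42e3, 8),
}

def rate_constant(process, T):
    """k = k0 * exp(-EA / (R T)), in s-1."""
    ea, ln_k0 = params[process]
    return math.exp(ln_k0) * math.exp(-ea / (R * T))

for p in params:
    print(p, f"{rate_constant(p, 1173):.3g} s-1")  # evaluated at 1173 K
```

Comparing the three at a common temperature shows how the lower activation energy of combustion makes its rate far less sensitive to temperature than that of pyrolysis or gasification.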

Relevance: 80.00%
Publisher:
Abstract:

This paper is dedicated to modelling network maintenance based on a live example, the maintenance of an ATM banking network, where any problem means money loss. A full analysis is made in order to separate valuable from non-valuable parameters, based on complex analysis of the available data. Correlation analysis helps to evaluate the provided data and to produce a comprehensive solution for increasing network maintenance effectiveness.
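A correlation screen of the kind described can be sketched as follows: each maintenance parameter is correlated against the money-losing outcome (here, downtime), and only strongly correlated parameters are retained. The figures and the 0.8 cutoff below are synthetic illustrations, not data from the paper.

```python
# Illustrative correlation screen of maintenance parameters against downtime
# (synthetic ATM-network figures; not data from the paper).
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

downtime = [5, 9, 4, 12, 7, 10]  # hours lost per month
params = {
    "failed_transactions": [50, 95, 42, 130, 70, 101],  # tracks downtime closely
    "terminal_age_years":  [3, 3, 2, 3, 2, 3],          # weak relationship
}

# Keep only parameters strongly correlated with money-losing downtime.
valuable = [p for p, xs in params.items() if abs(pearson(xs, downtime)) > 0.8]
print(valuable)
```

In a real deployment the cutoff would be justified statistically, and the retained parameters would feed the maintenance-effectiveness model the paper proposes.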

Relevance: 80.00%
Publisher:
Abstract:

We propose a novel template matching approach for the discrimination of handwritten and machine-printed text. We first pre-process the scanned document images by performing denoising, circle/line exclusion and word-block-level segmentation. We then align and match characters in a flexibly sized gallery with the segmented regions, using parallelised normalised cross-correlation. Experimental results over the Pattern Recognition & Image Analysis Research Lab-Natural History Museum (PRImA-NHM) dataset show remarkably high robustness of the algorithm in classifying cluttered, occluded and noisy samples, in addition to those with significant missing data. The algorithm, which achieves an 84.0% classification rate with a false positive rate of 0.16 over the dataset, does not require training samples and generates compelling results compared with the training-based approaches that have used the same benchmark.
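The normalised cross-correlation at the core of the matcher can be sketched on tiny synthetic patches (the glyph patterns below are invented for illustration): both patch and template are mean-centred and scaled to unit norm, so the score is invariant to brightness and contrast changes.

```python
# Sketch of zero-mean normalised cross-correlation (NCC) for template matching,
# on flattened 3x3 synthetic patches (not code or data from the paper).
import math

def ncc(a, b):
    """Zero-mean normalised cross-correlation of two equal-sized patches."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = (math.sqrt(sum((x - ma) ** 2 for x in a))
           * math.sqrt(sum((y - mb) ** 2 for y in b)))
    return num / den

# A template, the same glyph under a gain/offset change, and an inverted patch.
template = [0, 255, 0, 255, 255, 255, 0, 255, 0]  # a "+" shape
shifted = [v * 0.5 + 40 for v in template]        # same shape, dimmer
other = [255 - v for v in template]               # inverted pattern

print(round(ncc(template, shifted), 3))  # 1.0: NCC ignores gain and offset
print(round(ncc(template, other), 3))    # -1.0: perfectly anti-correlated
```

In the full system this score is computed in parallel for every gallery character against every segmented region, and the best-scoring matches drive the handwritten/printed decision.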

Relevance: 80.00%
Publisher:
Abstract:

The purpose of this research is to investigate potential methods to produce an ion-exchange membrane that can be integrated directly into a polydimethylsiloxane (PDMS) Lab-on-a-Chip or Micro-Total-Analysis System. The majority of microfluidic membranes are based on microporous structures, because this allows flexibility in the choice of material so that it can match the material of the microfluidic chip. This cohesion between the materials of the chip and the membrane is important to prevent bonding difficulties, which can lead to leaking and other practical problems. However, none of the materials commonly used to manufacture microfluidic chips provides ion-exchange capability. The DuPont product Nafion™ was chosen as the ion-exchange membrane: a copolymer with high conductivity and selectivity to cations, suitable for many applications such as the electrolysis of water and the chlor-alkali process. The use of such an ion-exchange membrane in microfluidics could have multiple advantages, but no reversible or irreversible bonding occurs between PDMS and Nafion™. In this project, multiple methods of physical entrapment of the ion-exchange material inside a film of PDMS were attempted. Exploiting the inherent properties of PDMS, very inexpensive granulated sugar can be used to make a cheap membrane mould that does not interfere with the PDMS cross-linking process. After the sacrificial mould material is dissolved away, Nafion™ is solidified in the irregular granulate holes, where it is confined by the irregular shape of the PDMS openings. The outer structure of the membrane is entirely PDMS and can be attached easily and securely to any PDMS-based microfluidic device through reversible or irreversible PDMS/PDMS bonding. Through impedance measurements, the effectiveness of these integrated membranes is compared against plain Nafion™ films in simple sodium chloride solutions.

Relevance: 80.00%
Publisher:
Abstract:

Knowledge organization, in the context of Information Science, has as its essence information and duly documented or registered knowledge. Knowledge organization as a process involves both the physical description of informational objects and the description of their contents, and the product of this descriptive process is the representation of the attributes of an object or set of objects. Representations are built with languages elaborated specifically for the purposes of organization in information systems: languages that describe the document (the physical support of the object) and languages that describe the information (the contents). Starting from this premise, the general objective of this investigation is to analyse institutional information and knowledge management systems, principally those that propose using the professor's curriculum vitae as the sole source for gathering, measuring and representing the information and knowledge of an organization. Among the main results are: the importance of using the personal curriculum as a reliable and standardized source of information; a synthesis of the main curricular systems that exist at the international and regional levels; a diagram of the data model of the case study; and, finally, a proposal for the use of ontologies as the main tool for the semantic organization of information in an information and knowledge management system.

Relevance: 80.00%
Publisher:
Abstract:

The development of an ultrasensitive biosensor for the low-cost, on-site detection of pathogenic DNA could transform detection capabilities in food safety, environmental monitoring and clinical diagnosis. Herein, we present an innovative approach exploiting endonuclease-controlled aggregation of plasmonic gold nanoparticles (AuNPs) for label-free and ultrasensitive detection of bacterial DNA. The method utilizes RNA-functionalized AuNPs, which form DNA-RNA heteroduplex structures through specific hybridization with the target DNA. Once formed, the DNA-RNA heteroduplex is susceptible to RNase H enzymatic cleavage of the RNA probe, allowing the target DNA to be liberated and to hybridize with another RNA probe. This continues until all of the RNA probes have been cleaved, leaving the nanoparticles unprotected and thus prone to aggregation upon exposure to a high-electrolyte medium. The assay is ultrasensitive, allowing the detection of target DNA at the femtomolar level by simple spectroscopic analysis (40.7 fM and 2.45 fM as measured by UV-vis and dynamic light scattering (DLS), respectively). Target DNA spiked into a food matrix (chicken meat) was also successfully detected, at a concentration of 1.2 pM (by UV-vis) or 18.0 fM (by DLS). In addition to the ultra-high sensitivity, the total analysis time of the assay is less than 3 hours, demonstrating its practicality for food analysis.

Relevance: 80.00%
Publisher:
Abstract:

Malware is a foundational component of cybercrime that enables an attacker to modify the normal operation of a computer or access sensitive digital information. Despite the extensive research performed to identify such programs, existing schemes fail to detect evasive malware, an increasingly popular class of malware that can alter its behavior at run time, making it difficult to detect using today's state-of-the-art malware analysis systems. In this thesis, we present DVasion, a comprehensive strategy that exposes such evasive behavior through a multi-execution technique. DVasion successfully detects behavior that would have been missed by traditional single-execution approaches, while addressing the limitations of previously proposed multi-execution systems. We demonstrate the accuracy of our system through strong parallels with existing work on evasive malware, and uncover hidden behavior within 167 of 1,000 samples.