857 results for "Detection and segmentation"
Abstract:
A simple, sensitive, and mild method for the determination of amino compounds, based on a condensation reaction with 1-ethyl-3-(3-dimethylaminopropyl)carbodiimide hydrochloride (EDC·HCl) as the dehydrant and fluorescence detection, has been developed. Amines were derivatized to their acid amides with the labeling reagent 2-(2-phenyl-1H-phenanthro[9,10-d]imidazol-1-yl)-acetic acid (PPIA). Studies of the derivatization conditions indicated that the coupling reaction proceeded rapidly and smoothly in acetonitrile in the presence of a base catalyst, giving strongly fluorescent derivatives with an excitation maximum at λex = 260 nm and an emission maximum at λem = 380 nm. The labeled derivatives were stable enough to be efficiently analyzed by high-performance liquid chromatography. Derivatives were identified by online post-column mass spectrometry (LC/APCI-MS/MS) and showed an intense protonated molecular ion, [M+H]+, under APCI in positive-ion mode. The fluorescence properties of the derivatives in various solvents and at different temperatures were also investigated. The method, in conjunction with gradient elution, offered baseline resolution of the common amine derivatives on a reversed-phase Eclipse XDB-C8 column. LC separation of the derivatized amines showed good reproducibility with acetonitrile-water as the mobile phase. Detection limits, calculated from a 0.78 pmol injection at a signal-to-noise ratio of 3, were 3.1-18.2 fmol. The mean intra- and inter-assay precisions for all amine levels were <3.85% and <2.11%, respectively. Excellent linear responses were observed, with correlation coefficients >0.9996. The established method was satisfactory for the determination of aliphatic amines in real wastewater and biological samples. (c) 2006 Elsevier B.V. All rights reserved.
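The detection-limit calculation described above (scaling a measured signal-to-noise ratio down to S/N = 3) can be sketched as follows. The 0.78 pmol injection is from the abstract; the measured S/N value of 250 is a hypothetical illustration, not a figure from the paper.

```python
def detection_limit_fmol(injected_pmol, measured_sn, target_sn=3.0):
    """Scale the injected amount down to the amount that would give
    S/N = target_sn, assuming signal is proportional to amount injected."""
    injected_fmol = injected_pmol * 1000.0  # 1 pmol = 1000 fmol
    return target_sn * injected_fmol / measured_sn

# Example: a 0.78 pmol injection measured at S/N = 250 (hypothetical)
lod = detection_limit_fmol(0.78, 250.0)
print(round(lod, 2))  # 9.36 fmol, within the reported 3.1-18.2 fmol range
```

Under this proportionality assumption, the 3.1-18.2 fmol spread across analytes simply reflects different measured S/N values for the same injected amount.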
Abstract:
A pre-column derivatization method for the sensitive determination of amino acids and peptides, using the tagging reagent 1,2-benzo-3,4-dihydrocarbazole-9-ethyl chloroformate (BCEOC) followed by high-performance liquid chromatography with fluorescence detection, has been developed. Derivatives were identified by liquid chromatography/electrospray ionization mass spectrometry (LC/ESI-MS/MS). Replacing the chromophore of the 2-(9-carbazole)-ethyl chloroformate (CEOC) reagent with a 1,2-benzo-3,4-dihydrocarbazole group yielded BCEOC, a sensitive fluorescence derivatizing reagent that easily and rapidly labels peptides and amino acids. The derivatives are stable enough to be efficiently analyzed by high-performance liquid chromatography. They showed an intense protonated molecular ion, [M+H]+, under electrospray ionization (ESI) in positive-ion mode, with the exception of Tyr, which was detected in negative-ion mode. Collision-induced dissociation of the protonated molecular ion formed a product at m/z 246.2, corresponding to cleavage of the C-O bond of the BCEOC molecule. Derivatization studies demonstrate excellent derivative yields over pH 9.0-10.0; maximal yields close to 100% are observed with a 3-4-fold molar excess of reagent. The derivatives exhibit strong fluorescence, and extraction of the derivatization solution with n-hexane/ethyl acetate (10:1, v/v) allows direct injection with no significant interference from the major fluorescent degradation by-products of the reagent, such as 1,2-benzo-3,4-dihydrocarbazole-9-ethanol (BDC-OH, the major by-product), mono-1,2-benzo-3,4-dihydrocarbazole-9-ethyl carbonate (BCEOC-OH) and bis-(1,2-benzo-3,4-dihydrocarbazole-9-ethyl) carbonate ((BCEOC)2). In addition, the detection responses of BCEOC derivatives were compared with those of CEOC, previously synthesized in our laboratory.
Ratios of relative fluorescence response, AC(BCEOC)/AC(CEOC), of 2.05-6.51 were observed. Separation of the derivatized peptides and amino acids was optimized on a Hypersil BDS C18 column. Detection limits, calculated from a 1.0 pmol injection at a signal-to-noise ratio of 3, ranged from 6.3 fmol (Lys) to 177.6 fmol (His). The mean interday accuracy ranged from 92 to 106% for fluorescence detection, with a mean CV <7.5%. The mean interday precision for all standards was within 10% of the expected concentration. Excellent linear responses were observed, with correlation coefficients >0.9999. Good compositional data could be obtained from the analysis of derivatized protein hydrolysates containing as little as 50.5 ng of sample. The facile BCEOC derivatization, coupled with mass spectrometry, therefore allowed the development of a highly sensitive and specific method for the quantitative analysis of trace levels of amino acids and peptides in biological and natural environmental samples. (c) 2005 Elsevier B.V. All rights reserved.
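Interday precision figures of the kind reported above are conventionally expressed as a coefficient of variation (%CV): the sample standard deviation of replicate measurements as a percentage of their mean. A minimal sketch, using made-up replicate peak areas rather than data from the paper:

```python
import statistics

def percent_cv(values):
    """Coefficient of variation: sample standard deviation
    expressed as a percentage of the mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical peak areas for one amino acid standard over five days
areas = [1020.0, 985.0, 1001.0, 1012.0, 994.0]
print(round(percent_cv(areas), 2))  # 1.39
```

A %CV below 7.5, as reported, would mean the day-to-day scatter of the fluorescence response stays under 7.5% of its mean level.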
Abstract:
Robots must plan and execute tasks in the presence of uncertainty. Uncertainty arises from sensing errors, control errors, and uncertainty in the geometry of the environment. The last, which is called model error, has received little previous attention. We present a framework for computing motion strategies that are guaranteed to succeed in the presence of all three kinds of uncertainty. The motion strategies comprise sensor-based gross motions, compliant motions, and simple pushing motions.
Abstract:
Y. Zhu, S. Williams and R. Zwiggelaar, 'Computer technology in detection and staging of prostate carcinoma: a review', Medical Image Analysis 10 (2), 178-199 (2006)
Abstract:
R. Zwiggelaar, S.M. Astley, C.J. Taylor and C.R.M. Boggis, 'Linear structures in mammographic images: detection and classification', IEEE Transactions on Medical Imaging 23 (9), 1077-1086 (2004)
Abstract:
A method for deformable shape detection and recognition is described. Deformable shape templates are used to partition the image into a globally consistent interpretation, determined in part by the minimum description length principle. Statistical shape models enforce the prior probabilities on global, parametric deformations for each object class. Once trained, the system autonomously segments deformed shapes from the background, while not merging them with adjacent objects or shadows. The formulation can be used to group image regions based on any image homogeneity predicate; e.g., texture, color, or motion. The recovered shape models can be used directly in object recognition. Experiments with color imagery are reported.
Abstract:
In this project we design and implement a centralized hash table in the snBench sensor network environment. We discuss the feasibility of this approach and compare and contrast it with a distributed hashing architecture, with particular attention to the conditions under which a centralized architecture makes sense. Numerous computational tasks require persistence of data in a sensor network environment. To help motivate the need for data storage in snBench, we demonstrate a practical application of the technology in which a video camera monitors a room, detects the presence of a person, and sends an alert to the appropriate authorities.
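The centralized alternative to distributed hashing can be pictured as a single service that owns the entire key space, so every read and write consults one authority. The class and key names below are purely illustrative; this is not the actual snBench interface.

```python
class CentralStore:
    """A single-node hash table: all gets/puts go through one authority,
    trading scalability and fault tolerance for trivially simple
    consistency (no key partitioning, no routing)."""

    def __init__(self):
        self._table = {}

    def put(self, key, value):
        self._table[key] = value

    def get(self, key, default=None):
        return self._table.get(key, default)

store = CentralStore()
store.put("camera1/alert", "person detected")
print(store.get("camera1/alert"))  # person detected
```

In a distributed hash table, by contrast, the key space would be partitioned across nodes, which is exactly the trade-off the project weighs.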
Abstract:
Ribosome profiling (ribo-seq) is a recently developed technique that provides genome-wide information on protein synthesis (GWIPS) in vivo. The high resolution of ribo-seq is one of the exciting properties of this technique. In Chapter 2, I present a computational method that utilises the sub-codon precision and triplet periodicity of ribosome profiling data to detect transitions in the translated reading frame. Application of this method to ribosome profiling data generated for human HeLa cells allowed us to detect several human genes where the same genomic segment is translated in more than one reading frame. Since the initial publication of the ribosome profiling technique in 2009, there has been a proliferation of studies that have used the technique to explore various questions with respect to translation. A review of the many uses and adaptations of the technique is provided in Chapter 1. Indeed, owing to the increasing popularity of the technique and the growing number of published ribosome profiling datasets, we have developed GWIPS-viz (http://gwips.ucc.ie), a ribo-seq dedicated genome browser. Details on the development of the browser and its usage are provided in Chapter 3. One of the surprising findings of ribosome profiling of initiating ribosomes, carried out in three independent studies, was the widespread use of non-AUG codons as translation initiation start sites in mammals. Although initiation at non-AUG codons in mammals has been documented for some time, the extent of non-AUG initiation reported by these ribo-seq studies was unexpected. In Chapter 4, I present an approach for estimating the strength of initiating codons based on the leaky scanning model of translation initiation. Application of this approach to ribo-seq data illustrates that initiation at non-AUG codons is inefficient compared to initiation at AUG codons.
In addition, our approach provides a probability of initiation score for each start site that allows its strength of initiation to be evaluated.
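The triplet periodicity exploited in Chapter 2 can be illustrated with a toy computation: given sub-codon-resolution footprint counts along a transcript, the translated frame is the one accumulating the most counts, and a change in the winning frame between windows suggests a frame transition. This is a deliberately simplified sketch, not the thesis's actual statistical method.

```python
def dominant_frame(counts, start, end):
    """Return the reading frame (0, 1 or 2) with the most footprint
    counts in counts[start:end], taking position mod 3 as the frame."""
    totals = [0, 0, 0]
    for pos in range(start, end):
        totals[pos % 3] += counts[pos]
    return totals.index(max(totals))

# Toy profile: frame 0 dominates the first half, frame 2 the second,
# mimicking a frame transition within one genomic segment
counts = [9, 1, 1] * 10 + [1, 1, 9] * 10
print(dominant_frame(counts, 0, 30), dominant_frame(counts, 30, 60))  # 0 2
```

Real data would require windowed scanning and a significance test for the frame change, but the periodicity signal itself is this simple.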
Abstract:
The electroencephalogram (EEG) is a medical technology that is used in the monitoring of the brain and in the diagnosis of many neurological illnesses. Although coarse in its precision, the EEG is a non-invasive tool that requires minimal set-up times, and is suitably unobtrusive and mobile to allow continuous monitoring of the patient, either in clinical or domestic environments. Consequently, the EEG is the current tool-of-choice with which to continuously monitor the brain where temporal resolution, ease-of-use and mobility are important. Traditionally, EEG data are examined by a trained clinician who identifies neurological events of interest. However, recent advances in signal processing and machine learning techniques have allowed the automated detection of neurological events for many medical applications. In doing so, the burden of work on the clinician has been significantly reduced, improving the response time to illness, and allowing the relevant medical treatment to be administered within minutes rather than hours. However, as typical EEG signals are of the order of microvolts (μV), contamination by signals arising from sources other than the brain is frequent. These extra-cerebral sources, known as artefacts, can significantly distort the EEG signal, making its interpretation difficult, and can dramatically degrade the classification performance of automated neurological event detection. This thesis therefore contributes to the further improvement of automated neurological event detection systems by identifying some of the major obstacles to deploying these EEG systems in ambulatory and clinical environments, so that EEG technologies can emerge from the laboratory towards real-world settings, where they can have a real impact on the lives of patients.
In this context, the thesis tackles three major problems in EEG monitoring, namely: (i) the problem of head-movement artefacts in ambulatory EEG, (ii) the high numbers of false detections in state-of-the-art, automated, epileptiform activity detection systems and (iii) false detections in state-of-the-art, automated neonatal seizure detection systems. To accomplish this, the thesis employs a wide range of statistical, signal processing and machine learning techniques drawn from mathematics, engineering and computer science. The first body of work outlined in this thesis proposes a system to automatically detect head-movement artefacts in ambulatory EEG, using supervised machine learning classifiers. The resulting head-movement artefact detection system is the first of its kind and offers accurate detection of head-movement artefacts in ambulatory EEG. Subsequently, additional physiological signals, in the form of gyroscopes, are used to detect head-movements and, in doing so, bring additional information to the head-movement artefact detection task. A framework for combining EEG and gyroscope signals is then developed, offering improved head-movement artefact detection. The artefact detection methods developed for ambulatory EEG are subsequently adapted for use in an automated epileptiform activity detection system. Information from support vector machine classifiers used to detect epileptiform activity is fused with information from artefact-specific detection classifiers in order to significantly reduce the number of false detections in the epileptiform activity detection system. By this means, epileptiform activity detection which compares favourably with other state-of-the-art systems is achieved.
Finally, the problem of false detections in automated neonatal seizure detection is approached in an alternative manner: blind source separation techniques, complemented with information from additional physiological signals, are used to remove respiration artefact from the EEG. In utilising these methods, some encouraging advances have been made in detecting and removing respiration artefacts from the neonatal EEG, and in doing so, the performance of the underlying diagnostic technology is improved, bringing its deployment in the real-world, clinical domain one step closer.
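The idea of using an additional physiological reference signal to remove an artefact can be illustrated with simple least-squares regression: subtract from the EEG channel its projection onto the reference (e.g. respiration) channel. The thesis uses blind source separation; this sketch is a deliberately simplified stand-in that assumes zero-mean signals and a linear, instantaneous coupling.

```python
def remove_artefact(eeg, ref):
    """Subtract the least-squares projection of eeg onto ref:
    beta = <eeg, ref> / <ref, ref>, clean = eeg - beta * ref."""
    num = sum(e * r for e, r in zip(eeg, ref))
    den = sum(r * r for r in ref)
    beta = num / den
    return [e - beta * r for e, r in zip(eeg, ref)]

# Toy example: "brain" activity plus 0.5x a respiration reference.
# brain is chosen orthogonal to ref, so the projection removes the
# artefact exactly and recovers brain.
ref = [1.0, -1.0, 2.0, -2.0]
brain = [0.3, 0.3, 0.2, 0.2]
eeg = [b + 0.5 * r for b, r in zip(brain, ref)]
clean = remove_artefact(eeg, ref)
```

When brain activity correlates with the reference, regression removes some genuine signal too, which is one motivation for the blind source separation approach taken in the thesis.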
Abstract:
The contribution of buildings towards total worldwide energy consumption in developed countries is between 20% and 40%. Heating, Ventilation and Air Conditioning (HVAC), and more specifically Air Handling Unit (AHU), energy consumption accounts on average for 40% of a typical medical device manufacturing or pharmaceutical facility's energy consumption. Studies have indicated that 20-30% energy savings are achievable by recommissioning HVAC systems, and more specifically AHU operations, to rectify faulty operation. Automated Fault Detection and Diagnosis (AFDD) is a process concerned with partially or fully automating the commissioning process through the detection of faults. An expert system is a knowledge-based system which employs Artificial Intelligence (AI) methods to replicate the knowledge of a human subject matter expert in a particular field, such as engineering, medicine, finance or marketing. This thesis details the research and development work undertaken in the development and testing of a new AFDD expert system for AHUs which can be installed with minimal set-up time on a large cross section of AHU types, in a manner that is neutral with respect to the building management system vendor. Both simulated and extensive field testing were undertaken against a widely available and industry-known expert rule set, the Air Handling Unit Performance Assessment Rules (APAR) (and a later, more developed version known as APAR_extended), in order to prove its effectiveness. Specifically, in tests against a dataset of 52 simulated faults, this new AFDD expert system identified all 52 derived issues whereas the APAR ruleset identified just 10. In tests using actual field data from 5 operating AHUs in 4 manufacturing facilities, the newly developed AFDD expert system for AHUs was shown to identify four individual fault case categories that the APAR method did not, as well as showing improvements made in the area of fault diagnosis.
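Rule-based AFDD of the APAR type reduces to per-operating-mode threshold checks on AHU sensor readings. The check below paraphrases one such heating-mode idea (supply air should leave warmer than mixed air once the fan's temperature rise is accounted for); the function name and the numeric thresholds are illustrative, not taken from the APAR rule set.

```python
def heating_mode_fault(t_supply, t_mixed, fan_rise=1.1, tol=1.0):
    """APAR-style check: in heating mode the supply air temperature
    should exceed the mixed air temperature plus the fan temperature
    rise, within a tolerance. Returns True when a fault is flagged.
    fan_rise and tol are illustrative values in degrees C."""
    return t_supply < t_mixed + fan_rise - tol

print(heating_mode_fault(18.0, 20.0))  # True: supply colder than mixed air
print(heating_mode_fault(25.0, 20.0))  # False: coil is adding heat as expected
```

An expert system of the kind described in the thesis layers diagnosis (which component is at fault, and why) on top of many such detection rules.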
Abstract:
The computational detection of regulatory elements in DNA is a difficult but important problem impacting our progress in understanding the complex nature of eukaryotic gene regulation. Attempts to utilize cross-species conservation for this task have been hampered both by evolutionary changes of functional sites and poor performance of general-purpose alignment programs when applied to non-coding sequence. We describe a new and flexible framework for modeling binding site evolution in multiple related genomes, based on phylogenetic pair hidden Markov models which explicitly model the gain and loss of binding sites along a phylogeny. We demonstrate the value of this framework for both the alignment of regulatory regions and the inference of precise binding-site locations within those regions. As the underlying formalism is a stochastic, generative model, it can also be used to simulate the evolution of regulatory elements. Our implementation is scalable in terms of numbers of species and sequence lengths and can produce alignments and binding-site predictions with accuracy rivaling or exceeding current systems that specialize in only alignment or only binding-site prediction. We demonstrate the validity and power of various model components on extensive simulations of realistic sequence data and apply a specific model to study Drosophila enhancers in as many as ten related genomes and in the presence of gain and loss of binding sites. Different models and modeling assumptions can be easily specified, thus providing an invaluable tool for the exploration of biological hypotheses that can drive improvements in our understanding of the mechanisms and evolution of gene regulation.
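The gain-and-loss component of such a phylogenetic model can be illustrated by a two-state continuous-time Markov chain on {absent, present}, with a gain rate and a loss rate; the probability that a binding site present in an ancestor is still present after branch length t has a standard closed form. This sketch shows only the transition probabilities, not the full phylogenetic pair hidden Markov model.

```python
import math

def transition_probs(gain, loss, t):
    """Transition probabilities of a two-state CTMC on {absent, present}
    with rates gain (absent->present) and loss (present->absent),
    evaluated after branch length t."""
    total = gain + loss
    decay = math.exp(-total * t)
    pi_present = gain / total           # stationary prob. of 'present'
    pi_absent = loss / total
    return {
        ("absent", "present"): pi_present * (1.0 - decay),
        ("absent", "absent"): pi_absent + pi_present * decay,
        ("present", "present"): pi_present + pi_absent * decay,
        ("present", "absent"): pi_absent * (1.0 - decay),
    }

# Short branch: sites mostly retain their state; long branch: the chain
# forgets its start and converges to the stationary distribution.
p = transition_probs(gain=0.2, loss=0.8, t=1.0)
```

Embedding these state-change probabilities in the hidden states of a pair HMM is what lets the framework align regulatory regions while explicitly allowing sites to appear and disappear along the phylogeny.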