967 results for Electrochemical detectors
Abstract:
The aim of the research work described in this thesis was to investigate the interrogation of fibre optic sensors using "off-the-shelf" optical components and equipment developed mainly for the telecommunications industry. This provides a cost-effective way of bringing fibre optic sensor systems to within the price range of their electro-mechanical counterparts. The research work focuses on the use of an arrayed waveguide grating, an acousto-optic tuneable filter and low-coherence interferometry to measure dynamic strain and displacement using fibre Bragg grating and interferometric sensors. Based on the intrinsic properties of arrayed waveguide gratings and acousto-optic tuneable filters, used in conjunction with interferometry, fibre Bragg gratings and interferometric sensors, a number of novel fibre optic sensor interrogation systems have been realised. Special single-mode fibre, namely high-birefringence fibre, has been employed to implement a dual-beam interrogating interferometer. The first interrogation scheme is based on an optical channel monitor, which is an arrayed waveguide grating with integral photo-detectors providing a number of amplified electrical outputs; it is used to interrogate fibre Bragg grating and interferometric sensors. Using the polarisation-maintaining properties of high-birefringence fibre, an interrogating interferometer was realised by winding a length of the fibre around a piezoelectric modulator to generate a low-frequency carrier signal. The system was used to interrogate both fibre Bragg grating and interferometric sensors. Finally, an acousto-optic tuneable filter is employed to interrogate fibre Bragg gratings; the device is used to generate a very high frequency carrier signal at the output of an optical interferometer.
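As a rough illustration of how an arrayed-waveguide-grating channel monitor can track a Bragg wavelength, the sketch below uses the ratio of the powers seen by two adjacent channels, assuming Gaussian passbands; the channel spacing, bandwidth and wavelengths are made-up values, and this is not necessarily the exact scheme implemented in the thesis.

```python
import numpy as np

# Sketch of ratiometric FBG interrogation with two adjacent AWG channels.
# Channel passbands are assumed Gaussian; all wavelengths, spacings and
# bandwidths below are made-up values for illustration only.

def channel_power(bragg_wl, centre_wl, bandwidth):
    """Fraction of the FBG reflection coupled into one Gaussian AWG channel."""
    return np.exp(-0.5 * ((bragg_wl - centre_wl) / bandwidth) ** 2)

def bragg_wavelength_from_ratio(p1, p2, c1, c2, bandwidth):
    """Recover the Bragg wavelength from the log-ratio of two channel powers.

    For Gaussian passbands, log(p1/p2) is linear in the Bragg wavelength,
    so the ratio can be inverted in closed form."""
    return 0.5 * (c1 + c2) - bandwidth ** 2 * np.log(p1 / p2) / (c2 - c1)

# Example: two channels 0.8 nm apart around 1550 nm (assumed values).
c1, c2, bw = 1549.6, 1550.4, 0.4        # nm
true_wl = 1550.1                        # nm, FBG reflection peak
p1 = channel_power(true_wl, c1, bw)
p2 = channel_power(true_wl, c2, bw)
print(bragg_wavelength_from_ratio(p1, p2, c1, c2, bw))   # ~1550.1
```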
Abstract:
We consider the detection of biased information sources in the ubiquitous code-division multiple-access (CDMA) scheme. We propose a simple modification to both the popular single-user matched-filter detector and a recently introduced near-optimal message-passing-based multiuser detector. This modification allows for detecting modulated biased sources directly with no need for source coding. Analytical results and simulations with excellent agreement are provided, demonstrating substantial improvement in bit error rate in comparison with the unmodified detectors and the alternative of source compression. The robustness of error-performance improvement is shown under practical model settings, including bias estimation mismatch and finite-length spreading codes. © 2007 IOP Publishing Ltd.
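To make the bias modification concrete, the following sketch shows the simplest version of the idea for a single user: shift the matched-filter decision threshold according to the known source bias. The prior, noise level and scalar channel are illustrative assumptions, not the paper's CDMA model, and the message-passing multiuser detector is not reproduced here.

```python
import numpy as np

# Sketch of the core idea for a single user: shift the matched-filter
# decision threshold according to the known source bias. Parameters below
# are illustrative assumptions only.

rng = np.random.default_rng(0)
p_plus = 0.8          # assumed prior probability of transmitting +1
sigma = 0.8           # assumed noise standard deviation at the filter output
n_bits = 100_000

bits = rng.choice([+1, -1], size=n_bits, p=[p_plus, 1 - p_plus])
y = bits + sigma * rng.standard_normal(n_bits)     # matched-filter outputs

# Conventional detector: hard decision at zero.
ber_plain = np.mean(np.sign(y) != bits)

# Bias-aware detector: MAP threshold shifted by the log prior ratio.
threshold = 0.5 * sigma ** 2 * np.log((1 - p_plus) / p_plus)
ber_aware = np.mean(np.where(y > threshold, 1, -1) != bits)

print(f"BER with conventional threshold: {ber_plain:.4f}")
print(f"BER with bias-aware threshold:   {ber_aware:.4f}")
```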
Abstract:
The aim of this study was to use the transformation of anionic to metathesis polymerization to produce block co-polymers of styrene-b-pentenylene using WCl6/PStLi and WCl6/PStLi/AlEtCl2 catalyst systems. Analysis of the products using SEC and 1H and 13C NMR spectroscopy enabled mechanisms for metathesis initiation reactions to be proposed. The initial work involved preparation of the constituent homo-polymers. Solutions of polystyryllithium in cyclohexane were prepared and diluted so that [PStLi]₀ < 2 × 10⁻³ M. The dilution produced initial rapid decay of the active species, followed by slower spontaneous decay within a period of days. This was investigated using UV/visible spectrophotometry, and the wavelength of maximum absorbance of the PStLi was found to change with the decay from an initial value of 328 nm to a λmax of approximately 340 nm after 4-7 days. SEC analysis of solutions of polystyrene, using RI and UV/visible (set at 254 nm) detectors, showed that the UV:RI peak area ratio was constant for a range of polystyrene samples of different molecular weight. Samples of polypentenylene were prepared and analysed using SEC. Unexpectedly, the solutions showed an absorbance at 254 nm, which had to be considered when this technique was used subsequently to analyse polymer samples to determine their styrene/pentenylene co-polymer composition. Cyclohexane was found to be a poor solvent for these ring-opening metathesis polymerizations of cyclopentene. Attempts to produce styrene-b-pentenylene block co-polymers, using a range of co-catalyst systems, were generally unsuccessful, as the products were shown to be mainly homopolymers. The character of the polymers did suggest that several catalytic species are present in these systems, and mechanisms have been suggested for the formation of initiating carbenes. Evidence of some low molecular weight product with co-polymer character has been obtained. Further investigation indicated that this is most likely to be ABA block copolymer, which led to a mechanism being proposed for the termination of the polymerization.
Abstract:
Vision must analyze the retinal image over both small and large areas to represent fine-scale spatial details and extensive textures. The long-range neuronal convergence that this implies might lead us to expect that contrast sensitivity should improve markedly with the contrast area of the image. But this is at odds with the orthodox view that contrast sensitivity is determined merely by probability summation over local independent detectors. To address this puzzle, I aimed to assess the summation of luminance contrast without the confounding influence of area-dependent internal noise. I measured contrast detection thresholds for novel Battenberg stimuli that had identical overall dimensions (to clamp the aggregation of noise) but were constructed from either dense or sparse arrays of micro-patterns. The results unveiled a three-stage visual hierarchy of contrast summation involving (i) spatial filtering, (ii) long-range summation of coherent textures, and (iii) pooling across orthogonal textures. Linear summation over local energy detectors was spatially extensive (as much as 16 cycles) at Stage 2, but the resulting model is also consistent with earlier classical results of contrast summation (J. G. Robson & N. Graham, 1981), where co-aggregation of internal noise has obscured these long-range interactions.
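The contrast between the orthodox account and extensive linear pooling can be made explicit with a toy Minkowski-summation calculation; the numbers below are purely illustrative and are not the model fits reported in the study.

```python
import numpy as np

# Toy comparison of linear (m = 1) versus probability-summation-like
# (Minkowski, m = 4) pooling of local detector responses as more
# micro-patches are stimulated. Numbers are illustrative only.

def minkowski_pool(responses, m):
    """Pool local responses with Minkowski exponent m."""
    return np.sum(np.abs(responses) ** m) ** (1.0 / m)

for n_patches in (1, 4, 16):
    local = np.ones(n_patches)                 # equal unit response per patch
    linear = minkowski_pool(local, m=1)        # grows in proportion to n
    fourth_root = minkowski_pool(local, m=4)   # grows only as n ** 0.25
    print(f"{n_patches:2d} patches: linear sum = {linear:5.1f}, "
          f"4th-root sum = {fourth_root:4.2f}")
```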
Abstract:
In this paper, we demonstrate the integration of a 3D hydrogel matrix within a hollow core photonic crystal fibre (HC-PCF). In addition, we show the fluorescence of Cy5-labelled DNA molecules immobilized within the hydrogel formed in two different types of HC-PCF. The 3D hydrogel matrix is designed to bind with the amino groups of biomolecules using an appropriate cross-linker, providing higher sensitivity and selectivity than standard 2D surface coverage by enabling a greater number of probe molecules to be available per unit area. The HC-PCFs, on the other hand, can be designed to maximize the capture of fluorescence to improve sensitivity and provide longer interaction lengths. This could enable the development of fibre-based point-of-care and remote systems, where the enhanced sensitivity would relax the constraints placed on sources and detectors. We also discuss the formation of such polyethylene glycol diacrylate (PEGDA) hydrogels within a HC-PCF, including their optical properties such as light propagation and auto-fluorescence.
Abstract:
This article describes the use of evolutionary methods to generate ambiguous images---images that the brain or a computer may interpret in multiple ways. Object detectors often falsely identify the presence of objects in these images, even when they are not visible to human observers.
Abstract:
I was recently part of a small committee looking at higher qualifications in contact lens practice and the discussion turned to future technologies. There was mention of different materials and different applications of contact lenses. Drug delivery with contact lenses was discussed, as this has been talked about in the literature for a while. The first paper I could find that talked about using contact lenses for drug delivery dates back over 40 years. There was a review paper in CLAE in 2008 that looked specifically at this too [1]. However, where are these products? Why are we not seeing them in the marketplace? Maybe the technology is not quite there yet, or maybe patents are prohibiting usage, or maybe the market is not big enough to develop such products? We do have lenses on the market with slow release of lubricating agents, but not therapeutic agents used for ocular or systemic conditions. Contact lenses with pathogen detectors may be part of our contact lens armoury of the future, and again we can already see papers in the literature that have trialled this technology for glucose monitoring in diabetics or lactate concentration in the tear film. Future contact lenses may incorporate better optics based on aberration control, and we see this starting to emerge with aspheric designs intended to minimise spherical aberration. Irregular corneas can be fitted with topography-based designs, and again this technology exists and is being used by some manufacturers in their designs already. Moreover, the topography-based fitting of irregular corneas is certainly something we see a lot of today, and CLAE has seen many articles related to this over the last decade or so. What about further into the future? Well, one interesting area must be 3-dimensional contact lenses, or contact lenses with electronic devices built in that simulate a display screen, a little like the virtual display spectacles that are already sold by electronics companies. It does not take much of a stretch of the imagination to see a large electronic company taking this technology on and making it viable. Will we see people on the train watching movies on these electronic virtual reality contact lenses? I think we will, but when is harder to know.
Abstract:
This work explores the creation of ambiguous images, i.e., images that may induce multistable perception, by evolutionary means. Ambiguous images are created using a general-purpose approach, composed of an expression-based evolutionary engine and a set of object detectors, which are trained in advance using Machine Learning techniques. Images are evolved using Genetic Programming, and object detectors are used to classify them. The information gathered during classification is used to assign fitness. In the first stage, the system is used to evolve images that resemble a single object. In the second stage, the discovery of ambiguous images is promoted by combining pairs of object detectors. The analysis of the results highlights the ability of the system to evolve ambiguous images and the differences between computational and human ambiguous images.
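A minimal sketch of how fitness might be assigned when two object detectors are combined is given below; the detector interface (a callable returning a confidence in [0, 1]) and the product rule are assumptions for illustration, not the exact scheme used in the paper.

```python
# Sketch of a fitness assignment that promotes ambiguity by combining two
# pre-trained object detectors. The detector interface and the product rule
# are illustrative assumptions.

def ambiguity_fitness(image, detector_a, detector_b):
    """Reward images that both detectors classify as containing their target."""
    score_a = detector_a(image)   # confidence that object A is present
    score_b = detector_b(image)   # confidence that object B is present
    # The product is high only when both detectors respond strongly,
    # penalising images that satisfy a single detector.
    return score_a * score_b

# Toy usage with stub detectors standing in for trained classifiers.
print(ambiguity_fitness(None, lambda img: 0.9, lambda img: 0.7))   # 0.63
```

In a Genetic Programming loop, each evolved expression would be rendered to an image and a score of this kind would drive selection toward images that both detectors respond to.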
Abstract:
ACM Computing Classification System (1998): J.2.
Abstract:
Polymer optical fibre (POF) is a relatively new and novel technology that presents an innovative approach for ultrasonic endoscopic applications. Currently, piezoelectric transducers are the typical detectors of choice, albeit possessing a limited bandwidth due to their resonant nature and a sensitivity that decreases proportionally to their size. Optical fibres provide immunity from electromagnetic interference, and POF in particular boasts more suitable physical characteristics than silica optical fibre. The most important of these are a lower acoustic impedance, a reduced Young's modulus and a higher acoustic sensitivity than single-mode silica fibre at both 1 MHz and 10 MHz. POF therefore offers an interesting alternative to existing technology. Intrinsic fibre structures such as Bragg gratings and Fabry-Perot cavities may be inscribed into the fibre core using UV lasers. These gratings are a modulation of the refractive index of the fibre core and provide the advantages of high reflectivity, customisable bandwidth and point detection. We present a compact in-fibre ultrasonic point detector based upon a POF Bragg grating (POFBG) sensor. We demonstrate that the detector is capable of leaving a laboratory environment by using connectorised fibre sensors, and we make a case for endoscopic ultrasonic detection through the use of a mounting structure that better mimics the environment of an endoscopic probe. We measure the effects of water immersion upon POFBGs and analyse the ultrasonic response at 1, 5 and 10 MHz.
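For reference, the behaviour of such a grating sensor follows the standard fibre Bragg grating relations; the expressions below are generic textbook forms, with symbols not tied to values from this work:

\[
\lambda_B = 2\, n_{\mathrm{eff}}\, \Lambda,
\qquad
\frac{\Delta\lambda_B}{\lambda_B} = (1 - p_e)\,\varepsilon + (\alpha + \xi)\,\Delta T,
\]

where \(n_{\mathrm{eff}}\) is the effective core index, \(\Lambda\) the grating period, \(p_e\) the effective photoelastic coefficient, \(\varepsilon\) the axial strain, \(\alpha\) the thermal expansion coefficient and \(\xi\) the thermo-optic coefficient. An incident ultrasonic wave modulates \(\varepsilon\), and hence shifts \(\lambda_B\), which is what the point detector reads out.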
Abstract:
The kaon electroproduction reaction H(e, e′K+)Λ was studied as a function of the four-momentum transfer, Q², for different values of the virtual photon polarization parameter. Electrons and kaons were detected in coincidence in two High Resolution Spectrometers (HRS) at Jefferson Lab. Data were taken at electron beam energies ranging from 3.4006 to 5.7544 GeV. The kaons were identified using combined time-of-flight information and two aerogel Čerenkov detectors used for particle identification. For different values of Q² ranging from 1.90 to 2.35 (GeV/c)², the center-of-mass cross sections for the Λ hyperon were determined for 20 kinematics, and the longitudinal, σL, and transverse, σT, terms were separated using the Rosenbluth separation technique. Comparisons between available models and the data have been studied. The comparison supports the t-channel dominance behavior for kaon electroproduction. All models seem to underpredict the transverse cross section. An estimate of the kaon form factor has been explored by determining the sensitivity of the separated cross sections to variations of the kaon EM form factor. From the comparison between models and data we can conclude that interpreting the data using the Regge model is quite sensitive to the particular choice of EM form factors. The data from the E98-108 experiment extend the range of the available kaon electroproduction cross section data to a previously unexplored region of Q² where no separations have ever been performed.
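For context, the Rosenbluth separation referred to above relies on the standard decomposition of the virtual-photoproduction cross section (generic form, not the experiment-specific parameterisation):

\[
\frac{d\sigma}{d\Omega_K^{*}}
= \sigma_T + \epsilon\,\sigma_L
+ \epsilon\,\sigma_{TT}\cos 2\phi
+ \sqrt{2\epsilon(1+\epsilon)}\,\sigma_{LT}\cos\phi .
\]

Measuring the φ-independent combination σT + ε σL at two or more values of the virtual-photon polarisation ε and fitting a straight line in ε yields σL as the slope and σT as the intercept.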
Abstract:
With the rapid growth of the Internet, computer attacks are increasing at a fast pace and can easily cause millions of dollars in damage to an organization. Detecting these attacks is an important issue in computer security. There are many types of attacks, and they fall into four main categories: Denial of Service (DoS) attacks, Probe attacks, User to Root (U2R) attacks, and Remote to Local (R2L) attacks. Of these, DoS and Probe attacks appear with high frequency within a short period of time when they attack a system; they are different from normal traffic data and can be easily separated from normal activities. By contrast, U2R and R2L attacks are embedded in the data portions of the packets and normally involve only a single connection, which makes it difficult to achieve satisfactory accuracy in detecting them. Therefore, we focus on studying the ambiguity problem between normal activities and U2R/R2L attacks. The goal is to build a detection system that can accurately and quickly detect these two attacks. In this dissertation, we design a two-phase intrusion detection approach. In the first phase, a correlation-based feature selection algorithm is proposed to improve detection speed. Features with poor ability to predict attack signatures, and features inter-correlated with one or more other features, are considered redundant; such features are removed so that only the indispensable information about the original feature space remains. In the second phase, we develop an ensemble intrusion detection system to achieve accurate detection performance. The proposed method includes multiple feature-selecting intrusion detectors and a data-mining intrusion detector. The former consist of a set of detectors, each of which uses a fuzzy clustering technique and belief theory to solve the ambiguity problem. The latter applies data-mining techniques to automatically extract computer users' normal behavior from training network traffic data. The final decision is a combination of the outputs of the feature-selecting and data-mining detectors. The experimental results indicate that our ensemble approach not only significantly reduces the detection time but also effectively detects U2R and R2L attacks that contain degrees of ambiguous information.
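A minimal sketch of a correlation-based redundancy filter in the spirit of the first phase is shown below; the thresholds, scoring rule and toy data are illustrative assumptions, not the dissertation's exact algorithm.

```python
import numpy as np

# Correlation-based feature selection sketch: keep features correlated with
# the class label, drop features strongly correlated with already-kept ones.
# Thresholds and toy data are illustrative assumptions.

def select_features(X, y, relevance_min=0.1, redundancy_max=0.9):
    n_features = X.shape[1]
    # absolute Pearson correlation of each feature with the class label
    relevance = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(n_features)])
    kept = []
    for j in np.argsort(relevance)[::-1]:           # most relevant first
        if relevance[j] < relevance_min:
            break                                    # remaining features too weak
        redundant = any(
            abs(np.corrcoef(X[:, j], X[:, k])[0, 1]) > redundancy_max for k in kept
        )
        if not redundant:
            kept.append(int(j))
    return kept

# Toy usage: 200 connections, 5 features, binary attack label.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5))
X[:, 1] = 0.98 * X[:, 0] + 0.02 * rng.standard_normal(200)   # redundant copy
y = (X[:, 0] + 0.3 * rng.standard_normal(200) > 0).astype(float)
print(select_features(X, y))   # the near-copy of the strongest feature is dropped
```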
Abstract:
Traffic incidents are non-recurring events that can cause a temporary reduction in roadway capacity. They have been recognized as a major contributor to traffic congestion on our nation's highway systems. To alleviate their impacts on capacity, automatic incident detection (AID) has been applied as an incident management strategy to reduce the total incident duration. AID relies on an algorithm to identify the occurrence of incidents by analyzing real-time traffic data collected from surveillance detectors. Significant research has been performed to develop AID algorithms for incident detection on freeways; however, similar research on major arterial streets remains largely at the initial stage of development and testing. This dissertation research aims to identify design strategies for the deployment of an Artificial Neural Network (ANN) based AID algorithm for major arterial streets. A section of the US-1 corridor in Miami-Dade County, Florida was coded in the CORSIM microscopic simulation model to generate data for both model calibration and validation. To better capture the relationship between the traffic data and the corresponding incident status, Discrete Wavelet Transform (DWT) and data normalization were applied to the simulated data. Multiple ANN models were then developed for different detector configurations, historical data usage, and selections of traffic flow parameters. To assess the performance of different design alternatives, the model outputs were compared based on both detection rate (DR) and false alarm rate (FAR). The results show that the best models were able to achieve a high DR of between 90% and 95%, a mean time to detect (MTTD) of 55-85 seconds, and a FAR below 4%. The results also show that a detector configuration including only the mid-block and upstream detectors performs almost as well as one that also includes a downstream detector. In addition, DWT was found to improve model performance, and the use of historical data from previous time cycles improved the detection rate. Speed was found to have the most significant impact on the detection rate, while volume was found to contribute the least. The results from this research provide useful insights into the design of AID for arterial street applications.
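The two headline measures can be computed per analysis cycle as in the brief sketch below; the 0/1 arrays are synthetic placeholders rather than CORSIM output, and DR is counted per cycle here rather than per incident.

```python
import numpy as np

# Detection rate (DR) and false alarm rate (FAR) computed per analysis cycle.
# The incident/alarm arrays below are synthetic placeholders.

def dr_far(incident, alarm):
    """incident, alarm: 0/1 arrays with one entry per analysis cycle."""
    incident = np.asarray(incident, dtype=bool)
    alarm = np.asarray(alarm, dtype=bool)
    dr = np.mean(alarm[incident])       # share of incident cycles flagged
    far = np.mean(alarm[~incident])     # share of clear cycles falsely flagged
    return dr, far

incident = np.array([0, 0, 1, 1, 1, 0, 0, 1, 1, 0])
alarm    = np.array([0, 0, 0, 1, 1, 0, 1, 1, 1, 0])
print(dr_far(incident, alarm))   # (0.8, 0.2): 4 of 5 incident cycles, 1 false alarm
```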
Abstract:
Providing transportation system operators and travelers with accurate travel time information allows them to make more informed decisions, yielding benefits for individual travelers and for the entire transportation system. Most existing advanced traveler information systems (ATIS) and advanced traffic management systems (ATMS) use instantaneous travel time values estimated from current measurements, assuming that traffic conditions remain constant in the near future. For more effective applications, it has been proposed that ATIS and ATMS should use travel times predicted for short-term future conditions rather than instantaneous travel times measured or estimated for current conditions. This dissertation research investigates short-term freeway travel time prediction using Dynamic Neural Networks (DNN) based on traffic detector data collected by radar traffic detectors installed along a freeway corridor. DNN comprise a class of neural networks that are particularly suitable for predicting variables like travel time, but they have not been adequately investigated for this purpose. Before this investigation, it was necessary to identify methods for data imputation to account for the missing data usually encountered when collecting data using traffic detectors. It was also necessary to identify a method to estimate the travel time on the freeway corridor based on data collected using point traffic detectors. A new travel time estimation method, referred to as the Piecewise Constant Acceleration Based (PCAB) method, was developed and compared with other methods reported in the literature. The results show that one of the simple travel time estimation methods (the average speed method) can work as well as the PCAB method, and both of them outperform other methods. This study also compared the travel time prediction performance of three different DNN topologies with different memory setups. The results show that one DNN topology (the time-delay neural network) outperforms the other two DNN topologies for the investigated prediction problem. This topology also performs slightly better than the simple multilayer perceptron (MLP) neural network topology that has been used in a number of previous studies for travel time prediction.
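The tapped-delay-line idea behind a time-delay network can be sketched as follows; the synthetic series, window length and network size are illustrative assumptions, not the corridor data or the topology tuned in the dissertation.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Time-delay idea: the last k readings form a tapped delay line feeding a
# small feed-forward network that predicts the next travel time.
# Synthetic data and sizes are placeholders only.

def make_delay_inputs(series, k=6):
    """Stack the k most recent values as the input for each prediction step."""
    X = np.array([series[t - k:t] for t in range(k, len(series))])
    y = series[k:]
    return X, y

rng = np.random.default_rng(0)
t = np.arange(600)
travel_time = 120 + 30 * np.sin(2 * np.pi * t / 96) + 5 * rng.standard_normal(600)

X, y = make_delay_inputs(travel_time)
split = 480                                   # train on the first 480 steps
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X[:split], y[:split])
rmse = np.sqrt(np.mean((model.predict(X[split:]) - y[split:]) ** 2))
print(f"test RMSE (s): {rmse:.1f}")
```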
Abstract:
Today, over 15,000 Ion Mobility Spectrometry (IMS) analyzers are employed at worldwide security checkpoints to detect explosives and illicit drugs. Current portal IMS instruments and other electronic nose technologies detect explosives and drugs by analyzing samples containing the headspace air and loose particles residing on a surface. Canines can outperform these systems at sampling and detecting the low vapor pressure explosives and drugs, such as RDX, PETN, cocaine, and MDMA, because these biological detectors target the volatile signature compounds available in the headspace rather than the non-volatile parent compounds of explosives and drugs. In this dissertation research, volatile signature compounds available in the headspace over explosive and drug samples were detected using solid-phase microextraction (SPME) as a headspace sampling tool coupled to an IMS analyzer. A Genetic Algorithm (GA) technique was developed to optimize the operating conditions of a commercial IMS (GE Itemizer 2), leading to the successful detection of plastic explosives (Detasheet, Semtex H, and C-4) and illicit drugs (cocaine, MDMA, and marijuana). Short sampling times (between 10 s and 5 min) were adequate to extract and preconcentrate sufficient analytes (> 20 ng) representing the volatile signatures in the headspace of a 15 mL glass vial or a quart-sized can containing ≤ 1 g of the bulk explosive or drug. Furthermore, a research-grade IMS with flexibility for changing operating conditions and physical configurations was designed and fabricated to accommodate future research into different analytes or physical configurations. The design and construction of the FIU-IMS were facilitated by computer modeling and simulation of ion behavior within an IMS. The simulation method developed uses SIMION/SDS and was evaluated with experimental data collected using a commercial IMS (PCP Phemto Chem 110). The FIU-IMS instrument has performance comparable to the GE Itemizer 2 (average resolving power of 14, resolution of 3 between two drugs and two explosives, and LODs ranging from 0.7 to 9 ng). The results from this dissertation further advance the concept of targeting volatile components to presumptively detect the presence of concealed bulk explosives and drugs by SPME-IMS, and the new FIU-IMS provides a flexible platform for future IMS research projects.
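A stripped-down genetic algorithm for tuning two hypothetical IMS operating parameters is sketched below; the parameters, their ranges and the peaked response surface are placeholders, not the GE Itemizer 2 settings or measured signal intensities.

```python
import numpy as np

# Stripped-down genetic algorithm (selection, arithmetic crossover, Gaussian
# mutation) tuning two hypothetical IMS operating parameters. The parameter
# choices, ranges and peaked "response" surface are placeholders only.

rng = np.random.default_rng(0)
BOUNDS = np.array([[100.0, 250.0],   # drift-tube temperature, deg C (assumed)
                   [1.0, 20.0]])     # desorption time, s (assumed)

def response(params):
    """Stand-in objective with a single peak, mimicking a signal-intensity map."""
    temp, desorb = params
    return np.exp(-((temp - 180.0) / 40.0) ** 2 - ((desorb - 8.0) / 5.0) ** 2)

def genetic_optimise(pop_size=30, generations=40, mutation=0.05):
    lo, hi = BOUNDS[:, 0], BOUNDS[:, 1]
    pop = rng.uniform(lo, hi, size=(pop_size, 2))
    for _ in range(generations):
        fitness = np.array([response(ind) for ind in pop])
        parents = pop[np.argsort(fitness)[-pop_size // 2:]]     # keep fittest half
        n_children = pop_size - len(parents)
        a = parents[rng.integers(len(parents), size=n_children)]
        b = parents[rng.integers(len(parents), size=n_children)]
        alpha = rng.random((n_children, 1))
        children = alpha * a + (1.0 - alpha) * b                # arithmetic crossover
        children += mutation * (hi - lo) * rng.standard_normal(children.shape)
        pop = np.clip(np.vstack([parents, children]), lo, hi)
    return pop[np.argmax([response(ind) for ind in pop])]

print(genetic_optimise())   # settles near the assumed optimum of (180 C, 8 s)
```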