976 results for Optical multi-channel analyzer


Abstract:

We prove that, under certain conditions, the capacity of an optical communication channel with in-line, nonlinear filtering (regeneration) elements can be higher than the Shannon capacity for the corresponding linear Gaussian white noise channel. © 2012 Optical Society of America.
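For context, the linear Gaussian baseline referred to above is the classical Shannon capacity of an additive white Gaussian noise (AWGN) channel. As a reminder (a standard formula, not part of the abstract itself):

```latex
% Shannon capacity of a band-limited AWGN channel:
% B = bandwidth, P = average signal power, N_0 = noise power spectral density.
C = B \log_2\!\left(1 + \frac{P}{N_0 B}\right) \quad \text{bits per second}
```

The paper's claim is that in-line nonlinear regeneration can, under certain conditions, push the achievable rate above this linear-channel benchmark.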

Abstract:

The problem of cancer diagnosis from multi-channel images using neural networks is investigated. The goal of this work is to classify the different tissue types that are used to determine cancer risk. Radial basis function networks and backpropagation neural networks are used for classification, and the results of the experiments are presented.
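As a hedged illustration of the two classifier families the abstract names, the sketch below trains a backpropagation network and an RBF-style network (Gaussian features on k-means centres with a linear readout) on synthetic multi-channel features; the channel count, labels and network sizes are illustrative assumptions, not the paper's setup.

```python
# Minimal sketch (not the paper's code): comparing a backpropagation MLP
# with an RBF-style classifier on synthetic multi-channel image features.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import RidgeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_channels = 600, 4            # e.g. 4 spectral channels per pixel block
X = rng.normal(size=(n_samples, n_channels))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # stand-in for tissue type

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Backpropagation network
mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
mlp.fit(X_tr, y_tr)

# RBF network approximated as Gaussian features on k-means centres + linear readout
centers = KMeans(n_clusters=10, n_init=10, random_state=0).fit(X_tr).cluster_centers_

def rbf(X, c, gamma=1.0):
    d2 = ((X[:, None, :] - c[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rbf_clf = RidgeClassifier().fit(rbf(X_tr, centers), y_tr)

print("MLP accuracy:", mlp.score(X_te, y_te))
print("RBF accuracy:", rbf_clf.score(rbf(X_te, centers), y_te))
```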

Abstract:

Online marketing is widely viewed as a set of new communication tools on the internet, while it is often forgotten that this new medium also presents companies with a range of opportunities, and at the same time challenges, in both pricing and sales channel policy. In the online environment, potential customers and buyers can follow quite different patterns of consumer behaviour from those in the offline context, which may require a different approach from companies. The many differences raise the question, however, of whether it is possible to integrate the new channel with the existing sales channels, and whether such integration is needed at all. The paper attempts to present the strategic directions that companies operating multi-channel sales systems can follow, to identify the problems that accompany them, and to point out the barriers to implementing them successfully.


Abstract:

This paper examines the reliability and efficacy of hotel guest e-mail questionnaires compared with paper questionnaires in the Asia-Pacific context. Conducted in Perth, Singapore and Penang, cities with mature hospitality and tourism industries and a representation of chain and independent deluxe hotels, this exploratory qualitative study examines hotelier views of e-mail guest communication, derived from content analysis of guest questionnaire format and content and from in-depth interviews with senior hoteliers. The findings indicate that e-questionnaires manifested as e-mails, as a direct replacement for the paper questionnaire, appear to be premature, given divergent hotelier views and shortcomings in e-mail response administration. If properly executed, e-mail can play an increasingly important adjunct role to the paper guest questionnaire as part of a multi-channel approach. The balance between 'high tech' and 'high touch' needs to be maintained: the former can enhance the latter but should not undermine it.

Abstract:

This dissertation develops a new mathematical approach that overcomes the effect of a data processing phenomenon known as "histogram binning" inherent to flow cytometry data. A real-time procedure is introduced to prove the effectiveness and fast implementation of such an approach on real-world data. The histogram binning effect is a dilemma posed by two seemingly antagonistic developments: (1) flow cytometry data in its histogram form is extended in its dynamic range to improve its analysis and interpretation, and (2) the inevitable dynamic range extension introduces an unwelcome side effect, the binning effect, which skews the statistics of the data, undermining as a consequence the accuracy of the analysis and the eventual interpretation of the data.

Researchers in the field have contended with this dilemma for many years, resorting either to hardware approaches, which are rather costly and carry inherent calibration and noise effects, or to software techniques based on filtering out the binning effect, which have not succeeded in preserving the statistical content of the original data.

The mathematical approach introduced in this dissertation is appealing enough that a patent application has been filed. The contribution of this dissertation is an incremental scientific innovation based on a mathematical framework that allows researchers in the field of flow cytometry to improve the interpretation of data, knowing that its statistical meaning has been faithfully preserved for optimized analysis. Furthermore, the same mathematical foundation provides a proof of the origin of this inherent artifact.

These results are unique in that new mathematical derivations are established to define and solve the critical problem of the binning effect faced at the experimental assessment level, providing a data platform that preserves its statistical content.

In addition, a novel method for accumulating the log-transformed data was developed. This new method uses the properties of the transformation of statistical distributions to accumulate the output histogram in a non-integer and multi-channel fashion. Although the mathematics of this new mapping technique seems intricate, the concise nature of the derivations allows for an implementation procedure that lends itself to real-time execution using lookup tables, a task that is also introduced in this dissertation.
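The dissertation's actual algorithm is not given in the abstract (and is the subject of a patent application), but the general idea it describes, accumulating log-transformed events into non-integer bin positions through a precomputed lookup table, can be sketched as follows; the table size, dynamic range and weighting scheme here are illustrative assumptions:

```python
# Hedged sketch of non-integer histogram accumulation via a lookup table.
# Each linear ADC value maps to a fractional position on a log axis; the
# event is split between the two neighbouring bins so that the statistics
# of the log-transformed distribution are preserved rather than quantised.
import numpy as np

ADC_MAX = 1024          # assumed linear dynamic range
N_LOG_BINS = 256        # assumed output histogram resolution

# Precompute the lookup table once: fractional log-bin position per ADC value.
adc = np.arange(1, ADC_MAX + 1)
pos = (np.log(adc) / np.log(ADC_MAX)) * (N_LOG_BINS - 1)
LUT_LO = np.floor(pos).astype(int)       # lower bin index
LUT_W = pos - LUT_LO                     # weight assigned to the upper bin

def accumulate(events, hist=None):
    """Accumulate linear-scale events into a log-binned histogram in real time."""
    if hist is None:
        hist = np.zeros(N_LOG_BINS)
    for v in events:
        lo = LUT_LO[v - 1]
        w = LUT_W[v - 1]
        hist[lo] += 1.0 - w              # fractional (non-integer) counts
        hist[min(lo + 1, N_LOG_BINS - 1)] += w
    return hist

h = accumulate(np.random.default_rng(0).integers(1, ADC_MAX + 1, size=10_000))
print(h.sum())   # total mass equals the number of events
```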

Abstract:

Ticket distribution channels for live music events have been revolutionised by the increased take-up of internet technologies, and the music supply chain has evolved into a multi-channel value network. The assumption that this creates increased consumer autonomy and improved service quality is explored here through a case study of the ticket pre-sale for the US leg of the Depeche Mode 2005–06 World Tour, which utilised an innovative virtual channel strategy promoted as a service to loyal fans. A multi-method analysis, adopting Kozinets' (2002) netnography methodology, is employed to map the responses of the band's serious fan base on an internet message board (IMB) throughout the tour pre-sale. The analysis focuses on concerns of pricing, ethics, scope of the offer, use of technology, service quality and perceived brand-performance fit of channel partners. Findings indicate that fans' behaviour is unpredictable in response to channel partners' performance, and that such offers need careful management to avoid alienating loyal consumers.

Abstract:

Research into biosensing approaches as alternative techniques for food diagnostics, for the detection of chemical contaminants and foodborne pathogens, has increased over the last twenty years. The key component of such tests is the biorecognition element, where polyclonal and monoclonal antibodies still dominate the market. Traditionally, the screening of sera or cell culture media for the selection of polyclonal or monoclonal candidate antibodies, respectively, has been performed by enzyme immunoassays. For niche toxin compounds, enzyme immunoassays can be expensive and/or prohibitive methodologies for antibody production because of limitations in the toxin supply needed for conjugate production. Automated, self-regenerating, chip-based biosensors proven in food diagnostics may be utilised as rapid screening tools for antibody candidate selection. This work describes the use of both single-channel and multi-channel surface plasmon resonance (SPR) biosensors for the selection and characterisation of antibodies, and their evaluation in shellfish tissue against standard techniques, for the detection of domoic acid as a model toxin compound. The key advantages of these biosensor techniques for screening hybridomas in monoclonal antibody production were the real-time observation of molecular interaction and the rapid turnaround time compared to enzyme immunoassays: the multichannel prototype instrument was superior, with 96 analyses completed in 2 h compared to 12 h for the single-channel instrument and over 24 h for the ELISA immunoassay. Antibodies of high sensitivity for the detection of domoic acid in a 1 min analysis time were selected, with IC50 values ranging from 4.8 to 6.9 ng/mL for monoclonal and 2.3 to 6.0 ng/mL for polyclonal antibodies. Although biosensor technology is progressing towards low-cost, multiplexed portable diagnostics for the food industry, there remains a place for laboratory-based SPR instrumentation in antibody development for food diagnostics, as shown herein.
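IC50 values such as those quoted above are conventionally read off a fitted inhibition calibration curve. A minimal sketch of the standard four-parameter logistic (4PL) fit is given below; the abstract does not state which curve model was used, and the data here are synthetic, not the paper's measurements:

```python
# Hedged sketch: fitting a four-parameter logistic (4PL) inhibition curve
# to calibration data and extracting the IC50. All data are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, top, bottom, ic50, slope):
    """Assay response as a function of analyte concentration x (ng/mL)."""
    return bottom + (top - bottom) / (1.0 + (x / ic50) ** slope)

conc = np.array([0.1, 0.5, 1, 2, 5, 10, 20, 50])       # ng/mL, illustrative
resp = four_pl(conc, 100, 5, 5.0, 1.2)                 # simulated responses
resp += np.random.default_rng(1).normal(0, 1.0, conc.size)

params, _ = curve_fit(four_pl, conc, resp, p0=[100, 0, 5, 1])
print(f"Estimated IC50: {params[2]:.1f} ng/mL")
```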

Abstract:

Over recent decades, work on infrared sensor applications has advanced considerably worldwide. A difficulty remains, however: objects are not always clear enough, or cannot always be easily distinguished, in the image obtained of the observed scene. Infrared image enhancement has played an important role in the development of infrared computer vision, image processing and non-destructive testing, among other technologies. This thesis addresses infrared image enhancement in two respects: the processing of a single infrared image in the hybrid space-frequency domain, and the fusion of infrared and visible images using the non-subsampled contourlet transform (NSCT). Image fusion can be seen as a continuation of the single-infrared-image enhancement model: it combines infrared and visible images into a single image in order to represent and enhance all the useful information and features of the source images, since a single image cannot contain all the relevant or available information owing to the restrictions of any single imaging sensor. We review the development of infrared image enhancement techniques, and then focus on single-infrared-image enhancement, proposing a hybrid-domain enhancement scheme with an improved fuzzy threshold evaluation method that achieves higher image quality and improves human visual perception. The infrared and visible image fusion techniques are built on an accurate registration of the source images acquired by the different sensors. The SURF-RANSAC algorithm is applied for registration throughout this research, which yields very accurately registered images and greater benefits for the fusion processing. For the fusion of infrared and visible images, a series of advanced and efficient approaches is proposed. A standard multi-channel NSCT-based fusion method is presented as a reference for the fusion approaches that follow. A joint fusion approach involving the adaptive-Gaussian NSCT and the wavelet transform (WT) is proposed, leading to fusion results that are better than those obtained with general non-adaptive methods. An NSCT-based fusion approach employing compressed sensing (CS) and total variation (TV), which sparsely samples the coefficients and accurately reconstructs the fused coefficients, is proposed; it obtains much better fusion results by pre-enhancing the infrared image and reducing the redundant information in the fusion coefficients. Finally, an NSCT-based fusion procedure using a fast iterative-shrinking compressed sensing (FISCS) technique is proposed to compress the decomposed coefficients and reconstruct the fused coefficients during the fusion process, leading to better results faster and more efficiently.
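A minimal sketch of the feature-based registration step described above (keypoint matching followed by RANSAC homography estimation) is given below. SURF itself is patent-encumbered and only available in OpenCV's contrib build, so ORB is substituted here as a freely available detector; the file names are placeholders:

```python
# Hedged sketch of feature-based registration of an infrared/visible image
# pair, in the spirit of the SURF-RANSAC step described in the thesis.
# ORB replaces SURF (which requires the opencv-contrib "nonfree" build).
import cv2
import numpy as np

ir = cv2.imread("infrared.png", cv2.IMREAD_GRAYSCALE)    # placeholder paths
vis = cv2.imread("visible.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=2000)
kp1, des1 = orb.detectAndCompute(ir, None)
kp2, des2 = orb.detectAndCompute(vis, None)

# Brute-force Hamming matching with a ratio test to keep distinctive matches.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
matches = [m for m, n in matcher.knnMatch(des1, des2, k=2)
           if m.distance < 0.75 * n.distance]

src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

# RANSAC rejects outlier matches while estimating the homography.
H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
registered = cv2.warpPerspective(ir, H, (vis.shape[1], vis.shape[0]))
```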

Abstract:

The use of human brain electroencephalography (EEG) signals for automatic person identification has been investigated for a decade. It has been found that the performance of an EEG-based person identification system depends strongly on the features extracted from the multi-channel EEG signals. Linear methods such as Power Spectral Density and the Autoregressive Model have been used to extract EEG features; however, these methods assume that EEG signals are stationary. In fact, EEG signals are complex, non-linear, non-stationary, and random in nature. In addition, other factors such as brain condition or human characteristics may affect performance, but these factors have not been investigated and evaluated in previous studies. It has been found in the literature that entropy is used to measure the randomness of non-linear time series data, and also to measure the level of chaos in brain-computer interface systems. This thesis therefore studies the role of entropy in the non-linear analysis of EEG signals to discover new features for EEG-based person identification. Five different entropy methods, Shannon Entropy, Approximate Entropy, Sample Entropy, Spectral Entropy, and Conditional Entropy, are proposed to extract entropy features that are used to evaluate the performance of EEG-based person identification systems and the impacts of epilepsy, alcohol, age and gender characteristics on these systems. Experiments were performed on the Australian EEG and Alcoholism datasets. The experimental results have shown that, in most cases, the proposed entropy features yield very fast person identification with comparable accuracy because the feature dimension is low; in real-life security operation, timely response is critical. The experimental results have also shown that epilepsy, alcohol, age and gender characteristics have impacts on EEG-based person identification systems.
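As an illustration of the kind of feature extraction the thesis studies, the sketch below computes two of the named measures, Shannon entropy (of the amplitude histogram) and spectral entropy (of the Welch power spectrum), per EEG channel; the channel count, sampling rate and signal are stand-ins, not the Australian EEG or Alcoholism data:

```python
# Hedged sketch: per-channel Shannon and spectral entropy features from
# multi-channel EEG. The signal here is synthetic; a real study would use
# recorded EEG.
import numpy as np
from scipy.signal import welch

def shannon_entropy(x, bins=32):
    p, _ = np.histogram(x, bins=bins)
    p = p[p > 0] / p.sum()
    return -np.sum(p * np.log2(p))

def spectral_entropy(x, fs):
    _, psd = welch(x, fs=fs)
    p = psd / psd.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p)) / np.log2(p.size)   # normalised to [0, 1]

fs, n_channels, n_samples = 256, 8, 4096               # illustrative values
eeg = np.random.default_rng(2).normal(size=(n_channels, n_samples))

features = np.array([[shannon_entropy(ch), spectral_entropy(ch, fs)]
                     for ch in eeg])
print(features.shape)   # (channels, 2) -> a low-dimensional feature vector
```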

Abstract:

The purpose of this paper is to survey and assess the state of the art in automatic target recognition for synthetic aperture radar imagery (SAR-ATR). The aim is not to develop an exhaustive survey of the voluminous literature, but rather to capture in one place the various approaches for implementing the SAR-ATR system. This paper is meant to be as self-contained as possible, and it approaches the SAR-ATR problem from a holistic end-to-end perspective. A brief overview of the breadth of the SAR-ATR challenges is given; this is couched in terms of a single-channel SAR, and it is extendable to multi-channel SAR systems. Stages pertinent to the basic SAR-ATR system structure are defined, and the motivations for the requirements and constraints on the system constituents are addressed. For each stage in the SAR-ATR processing chain, a taxonomization methodology for surveying the numerous methods published in the open literature is proposed, and carefully selected works from the literature are presented under the proposed taxa. Novel comparisons, discussions, and comments are provided throughout the paper. A two-fold benchmarking scheme for evaluating existing SAR-ATR systems and motivating new system designs is proposed and applied to the works surveyed here. Finally, a discussion is presented in which various interrelated issues, such as standard operating conditions, extended operating conditions, and target-model design, are addressed. This paper is a contribution toward fulfilling an objective of end-to-end SAR-ATR system design.

Abstract:

The Pianosa Contourite Depositional System (CDS) is located in the Corsica Trough (Northern Tyrrhenian Sea), a confined basin dominated by mass transport and contour currents on the eastern flank and by turbidity currents on the western flank. The morphologic and stratigraphic characterisation of the Pianosa CDS is based on multibeam bathymetry, seismic reflection data (multi-channel high-resolution mini GI gun, single-channel sparker and CHIRP), sediment cores and ADCP data. The Pianosa CDS is located at shallow to intermediate water depths (170 to 850 m) and is formed under the influence of the Levantine Intermediate Water (LIW). It is 120 km long, has a maximum width of 10 km and is composed of different types of muddy sediment drifts: plastered drift, separated mounded drift, sigmoid drift and multicrested drift. The reduced tectonic activity in the Corsica Trough since the early Pliocene makes it possible to recover a sedimentary record of the contourite depositional system that is influenced only by climate fluctuations. Contourites started to develop in the Middle-Late Pliocene, but their growth was enhanced after the Middle Pleistocene Transition (0.7–0.9 Ma). Although the general circulation of the LIW, flowing northwards in the Corsica Trough, remained active throughout the history of the system, contourite drift formation changed, controlled by sediment influx and bottom-current velocity. During periods of sea level fall, fast bottom currents often eroded the drift crest on the middle and upper slope; at those times the proximity of the coast to the shelf edge favoured the formation of bioclastic sand deposits winnowed by bottom currents. Higher accumulation of mud in the drifts occurred during periods of fast bottom currents and high sediment availability (i.e. high activity of turbidity currents), coincident with sea level lowstands. Condensed sections formed during sea level highstands, when bottom currents were more sluggish and the turbidite system was disconnected, resulting in a lower sediment influx.

Abstract:

Two Pleistocene mass transport deposits (MTDs), with volumes of thousands of km³, have been identified from multi-channel seismic data in the abyssal plain at the front of the Barbados accretionary prism. The estimated sediment volumes for these MTDs are likely underestimates because of the limited seismic coverage. In this work we suggest that these MTDs are comparable in size to the large submarine landslides reported in the literature. The MTDs lie in the vicinity of two major oceanic ridges, the Barracuda Ridge and the Tiburon Rise. We also suggest that the MTDs result from seismicity associated with the formation of the Barracuda Ridge or the Barbados accretionary prism; however, the triggering mechanisms involved in their formation remain uncertain. The present study discusses the potential causal factors accounting for the formation of these MTDs.

Abstract:

The TOMO-ETNA experiment was devised to image the crust underlying the volcanic edifice and, possibly, its plumbing system using passive and active refraction/reflection seismic methods. The experiment included activities both on land and offshore, with the main objective of obtaining a new high-resolution seismic tomography to improve knowledge of the crustal structures beneath Etna volcano and northeastern Sicily up to the Aeolian Islands. The TOMO-ETNA experiment was divided into two phases. The first phase started on June 15, 2014 and ended on July 24, 2014 with the withdrawal of two removable seismic networks (a short-period network of 80 stations and a broadband network of 20 stations) deployed at Etna volcano and the surrounding areas. During this first phase the oceanographic research vessel “Sarmiento de Gamboa” and the hydro-oceanographic vessel “Galatea” performed the offshore activities, which included the deployment of ocean bottom seismometers (OBS), air-gun shooting for wide-angle seismic refraction (WAS), multi-channel seismic (MCS) reflection surveys, magnetic surveys and ROV (remotely operated vehicle) dives. This phase finished with the recovery of the short-period seismic network. In the second phase the broadband seismic network remained operative until October 28, 2014, and the R/V “Aegaeo” performed additional MCS surveys during November 19-27, 2014. Overall, the information deriving from the TOMO-ETNA experiment could help resolve many uncertainties that have arisen in exploiting the large amount of data provided by the cutting-edge monitoring systems of Etna volcano and the seismogenic area of eastern Sicily.

Abstract:

A NOx reduction efficiency higher than 95% with NH3 slip of less than 30 ppm is desirable for heavy-duty diesel (HDD) engines using selective catalytic reduction (SCR) systems to meet the US EPA 2010 NOx standard and the 2014-2018 fuel consumption regulation. SCR performance therefore needs to be improved through experimental and modeling studies. In this research, a high-fidelity, global-kinetic, 1-dimensional, 2-site SCR model with mass transfer, heat transfer and global reaction mechanisms was developed for a Cu-zeolite catalyst. The model simulates SCR performance for engine exhaust conditions including NH3 maldistribution and aging effects, and the details are presented.

SCR experimental data were collected for model development, calibration and validation from a reactor at Oak Ridge National Laboratory (ORNL) and from an engine test setup at Michigan Technological University (MTU) with a Cummins 2010 ISB engine. The model was calibrated separately to the reactor and engine data. The experimental setup, the test procedures, including a surrogate HD-FTP cycle developed for transient studies, and the model calibration process are described. Differences in the model parameters were determined between the calibrations developed from the reactor and the engine data; it was determined that the SCR inlet NH3 maldistribution is one of the causes of these differences. The model calibrated to the engine data served as a basis for developing a reduced-order SCR estimator model.

The effect of the SCR inlet NO2/NOx ratio on SCR performance was studied through simulations using the surrogate HD-FTP cycle. The cumulative outlet NOx and the overall NOx conversion efficiency of the cycle are highest at a NO2/NOx ratio of 0.5, while the outlet NH3 is lowest for NO2/NOx ratios greater than 0.6.

A combined engine experimental and simulation study was performed to quantify the NH3 maldistribution at the SCR inlet and its effects on SCR performance and kinetics. The uniformity index (UI) of the SCR inlet NH3 and NH3/NOx ratio (ANR) was determined to be below 0.8 for the production system; the UI improved to 0.9 after installation of a swirl mixer in the SCR inlet cone. A multi-channel model was developed to simulate the maldistribution effects. The results showed that reducing the UI of the inlet ANR from 1.0 to 0.7 caused a 5-10% decrease in NOx reduction efficiency and a 10-20 ppm increase in NH3 slip. Simulations of the steady-state engine data with the multi-channel model confirmed that NH3 maldistribution is a factor in the differences between the calibrations developed from the engine and the reactor data.

Reactor experiments were performed at ORNL using a Spaci-IR technique to study thermal aging effects. The results showed that thermal aging (at 800°C for 16 hours) caused a 30% reduction in the NH3 stored on the catalyst under NH3 saturation conditions and different axial concentration profiles under SCR reaction conditions. The kinetics analysis showed that thermal aging caused a reduction in total NH3 storage capacity (94.6 compared to 138 gmol/m³), different NH3 adsorption/desorption properties, and decreases in the activation energy and pre-exponential factor for NH3 oxidation and for the standard and fast SCR reactions. Both the reduction in storage capability and the change in kinetics of the major reactions contributed to the changes in the axial storage and concentration profiles observed in the experiments.
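The uniformity index (UI) used above to quantify NH3 maldistribution is conventionally defined from the channel-to-channel spread of the local NH3/NOx ratio. A minimal sketch of that calculation over the channels of a multi-channel model follows; the definition shown is one common form, not necessarily the exact one used in the dissertation, and the numbers are illustrative:

```python
# Hedged sketch: uniformity index (UI) of the local NH3/NOx ratio (ANR)
# across the channels of a multi-channel SCR model. One common definition is
#   UI = 1 - sum(|ANR_i - mean|) / (2 * n * mean)
# which equals 1.0 for a perfectly uniform distribution.
import numpy as np

def uniformity_index(anr):
    anr = np.asarray(anr, dtype=float)
    return 1.0 - np.abs(anr - anr.mean()).sum() / (2.0 * anr.size * anr.mean())

uniform = np.full(10, 1.0)                      # perfectly mixed inlet
skewed = np.array([1.3, 1.2, 1.1, 1.0, 1.0, 1.0, 0.9, 0.9, 0.8, 0.8])

print(uniformity_index(uniform))   # 1.0
print(uniformity_index(skewed))    # 0.94 -> mild maldistribution
```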