856 results for chiroptical switches, data processing, enantiospecificity, photochromism, steric hindrance
Abstract:
The design and development of a Bottom Pressure Recorder for a Tsunami Early Warning System are described here. The special requirements it must satisfy for the specific application of deployment on the ocean bed and pressure monitoring of the water column above are discussed; high-resolution data digitization and low circuit power consumption are typical examples. The implementation details of the data sensing and acquisition subsystem designed to meet these requirements are also presented. The data processing part centres on a tsunami detection algorithm that must detect an event of significance against a background of periodic and aperiodic noise signals. Such an algorithm and its simulation are presented. Further, the results of sea trials carried out on the system off the Chennai coast are reported. The high quality and fidelity of the data show that the system design is robust despite its low cost and, with suitable augmentations, is ready for a full-fledged deployment on the ocean bed. (C) 2013 Elsevier Ltd. All rights reserved.
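The detection step described in this abstract can be sketched as a simple residual test: predict the next bottom-pressure sample from a short, de-tided history and flag an event when the observed value departs from the prediction by more than a preset threshold. This is only an illustrative sketch in the spirit of such algorithms, not the authors' implementation; the window length and the 3 cm threshold are assumed values.

```python
# Illustrative residual-based detection step (assumed window/threshold values):
# fit a linear trend to a short history of bottom-pressure samples (a crude
# de-tiding), extrapolate one step ahead, and flag an event when the new
# observation departs from the prediction by more than `threshold` metres.

def detect_event(history, observed, threshold=0.03):
    """history: recent pressure samples (m of water); observed: newest sample."""
    n = len(history)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(history) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, history))
             / sum((x - mean_x) ** 2 for x in xs))
    predicted = mean_y + slope * (n - mean_x)   # one-step-ahead extrapolation
    residual = observed - predicted
    return abs(residual) > threshold, residual

# A slowly rising tide with a sudden 10 cm jump should trigger detection.
tide = [1000.000 + 0.001 * i for i in range(20)]
event, residual = detect_event(tide, tide[-1] + 0.001 + 0.10)
```

Real detection algorithms must also reject periodic (tidal) and aperiodic (ship, seismic) noise; the linear fit here stands in for whatever de-tiding model the deployed system uses.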
Abstract:
This report is a detailed description of the data processing of NOAA/MLML spectroradiometry data. It introduces the MLML_DBASE programs, describes the assembly of diverse data files, and explains the general algorithms and how individual routines are used. Definitions of data structures are presented in the appendices. [PDF contains 48 pages]
Abstract:
This report outlines the NOAA spectroradiometer data processing system implemented by the MLML_DBASE programs, presenting the algorithms together with graphs showing the effect of each step. [PDF contains 32 pages]
Abstract:
Commercially available software packages for IBM PC compatibles are evaluated for use in data acquisition and processing work. Moss Landing Marine Laboratories (MLML) has acquired computers since 1978 for shipboard data acquisition (i.e. CTD, radiometric, etc.) and data processing. Hewlett-Packard desktops were used first, followed by a transition to DEC VAXstations, with software developed mostly by the author and others at MLML (Broenkow and Reaves, 1993; Feinholz and Broenkow, 1993; Broenkow et al., 1993). IBM PCs were at first very slow and limited in available software, so they were not used in the early days. Improved technology, such as higher-speed microprocessors and a wide range of commercially available software, makes use of the PC more reasonable today. MLML is making a transition towards using PCs for data acquisition and processing; the advantages are portability and available outside support.
Abstract:
Until mid 2006, SCIAMACHY data processors for the operational retrieval of nitrogen dioxide (NO2) column data were based on the historical version 2 of the GOME Data Processor (GDP). On top of known problems inherent to GDP 2, ground-based validations of SCIAMACHY NO2 data revealed issues specific to SCIAMACHY, like a large cloud-dependent offset occurring at Northern latitudes. In 2006, the GDOAS prototype algorithm of the improved GDP version 4 was transferred to the off-line SCIAMACHY Ground Processor (SGP) version 3.0. In parallel, the calibration of SCIAMACHY radiometric data was upgraded. Before operational switch-on of SGP 3.0 and public release of upgraded SCIAMACHY NO2 data, we have investigated the accuracy of the algorithm transfer: (a) by checking the consistency of SGP 3.0 with prototype algorithms; and (b) by comparing SGP 3.0 NO2 data with ground-based observations reported by the WMO/GAW NDACC network of UV-visible DOAS/SAOZ spectrometers. This delta-validation study concludes that SGP 3.0 is a significant improvement with respect to the previous processor IPF 5.04. For three particular SCIAMACHY states, the study reveals unexplained features in the slant columns and air mass factors, although the quantitative impact on SGP 3.0 vertical columns is not significant.
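For readers unfamiliar with the DOAS quantities named above, the slant column S, air mass factor M, and vertical column V are related by V = S / M, which is why a feature appearing consistently in both the slant columns and the air mass factors can largely cancel in the vertical columns. The numbers below are illustrative only, not SCIAMACHY values:

```python
# Relation between the DOAS quantities discussed above (illustrative numbers):
# vertical column V = slant column S / air mass factor M, so a common relative
# feature in S and M cancels in V.

def vertical_column(slant_column, amf):
    """Vertical column from slant column and air mass factor (AMF must be > 0)."""
    if amf <= 0:
        raise ValueError("air mass factor must be positive")
    return slant_column / amf

v_ref = vertical_column(6.0e15, 2.0)               # molec/cm^2, made-up values
v_pert = vertical_column(6.0e15 * 1.1, 2.0 * 1.1)  # same +10% feature in S and M
```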
Abstract:
In the past year [1], Angiolini and co-workers synthesized and investigated methacrylic polymers bearing in the side chain the chiral cyclic (S)-3-hydroxypyrrolidine moiety, interposed between the main chain and the trans-azoaromatic chromophore, substituted or not in the 4' position by an electron-withdrawing group. In these materials, the presence of a rigid chiral moiety of one prevailing absolute configuration favours the establishment of a chiral conformation of one prevailing helical handedness, at least within chain segments of the macromolecules, which can be observed by circular dichroism (CD). The simultaneous presence of the azoaromatic and chiral functionalities allows the polymers to display both the properties typical of dissymmetric systems (optical activity, exciton splitting of dichroic absorptions) and the features typical of photochromic materials (photorefractivity, photoresponsiveness, NLO properties). The first part of this research was to synthesize analogous homopolymers and copolymers based on a bisazoaromatic moiety and to compare their properties with those of the above-mentioned derivatives bearing only one azoaromatic chromophore in the side chain. We also focused attention on the effects induced on the thermal and chiroptical behaviour by the insertion of particular achiral comonomers characterized by different side-chain mobility and steric hindrance (MMA, tert-BMA and TrMA). Carbazole-containing polymers [2], on the other hand, have attracted much attention because of their unique features. The use of these materials in advanced micro- and nanotechnologies spans many different applications, such as photoconductive and photorefractive polymers, electroluminescent devices, programmable optical interconnections, data storage, chemical photoreceptors, NLO, surface relief gratings, blue-emitting materials and holographic memory.
The second part of the work focused on the synthesis and characterization of polymeric derivatives bearing in the side chain carbazole or phenylcarbazole moieties linked to the (S)-2-hydroxysuccinimide or the (S)-3-hydroxypyrrolidinyl ring as chiral groups, covalently bound to the main chain through ester bonds. The final objective of this research was to design, synthesize, and characterize multifunctional methacrylic homopolymers and copolymers bearing three distinct functional groups (i.e. azoaromatic, carbazole and a chiral group of one single configuration) directly linked in the side chain. These polymeric derivatives could be of potential interest for several advanced application fields, such as optical storage, waveguides, chiroptical switches, chemical photoreceptors, NLO, surface relief gratings, photoconductive materials, etc.
Abstract:
The sea level variation (SLVtotal) is the sum of two major contributions: steric and mass-induced. The steric contribution, SLVsteric, results from thermal and salinity changes in a given water column; it involves only volume change and hence has no gravitational effect. The mass-induced contribution, SLVmass, on the other hand, arises from adding or subtracting water mass to or from the water column and has a direct gravitational signature. We examine the closure of the seasonal SLV budget and estimate the relative importance of the two contributions in the Mediterranean Sea as a function of time. We use ocean altimetry data (from the TOPEX/Poseidon, Jason-1, ERS, and ENVISAT missions) to estimate SLVtotal; temperature and salinity data (from the Estimating the Circulation and Climate of the Ocean ocean model) to estimate SLVsteric; and time-variable gravity data (from the Gravity Recovery and Climate Experiment (GRACE) project, April 2002 to July 2004) to estimate SLVmass. We find that the annual cycle of SLVtotal in the Mediterranean is mainly driven by SLVsteric but moderately offset by SLVmass. The agreement between the seasonal SLVmass estimations from SLVtotal − SLVsteric and from GRACE is quite remarkable; the annual cycle of SLVmass reaches its maximum in mid-February, almost half a cycle later than SLVtotal or SLVsteric, which peak by mid-October and mid-September, respectively. Thus, when sea level is rising (falling), the Mediterranean Sea is actually losing (gaining) mass. Furthermore, as SLVmass is balanced by vertical (precipitation minus evaporation, P−E) and horizontal (exchange of water with the Atlantic, the Black Sea, and river runoff) mass fluxes, we compare it with the P−E determined from meteorological data to estimate the annual cycle of the horizontal flux.
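The budget closure described above can be illustrated with synthetic annual cycles, recovering the mass term as the residual SLVtotal − SLVsteric. The amplitudes and peak days below are invented, chosen only to mimic the reported phase relationship (steric peaking in September, mass about half a cycle later, in February), not the paper's estimates.

```python
import math

# Synthetic closure check for SLV_total = SLV_steric + SLV_mass: the mass term
# is recovered as the residual total - steric. Amplitudes (cm) and peak days
# are made-up values, not the paper's estimates.

def annual_cycle(day, amplitude, peak_day):
    return amplitude * math.cos(2 * math.pi * (day - peak_day) / 365.25)

days = range(365)
steric = [annual_cycle(d, 10.0, 258) for d in days]   # peaks mid-September
mass = [annual_cycle(d, 3.0, 46) for d in days]       # peaks mid-February
total = [s + m for s, m in zip(steric, mass)]

residual = [t - s for t, s in zip(total, steric)]     # estimate of SLV_mass
peak_day = max(days, key=lambda d: residual[d])       # recovers day 46
```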
Abstract:
This dissertation develops a new mathematical approach that overcomes the effect of a data processing phenomenon known as "histogram binning" inherent to flow cytometry data. A real-time procedure is introduced to prove the effectiveness and fast implementation of such an approach on real-world data. The histogram binning effect is a dilemma posed by two seemingly antagonistic developments: (1) flow cytometry data in its histogram form is extended in its dynamic range to improve its analysis and interpretation, and (2) the inevitable dynamic range extension introduces an unwelcome side effect, the binning effect, which skews the statistics of the data, undermining as a consequence the accuracy of the analysis and the eventual interpretation of the data. Researchers in the field contended with this dilemma for many years, resorting either to hardware approaches, which are rather costly and carry inherent calibration and noise effects, or to software techniques based on filtering the binning effect, which did not successfully preserve the statistical content of the original data. The mathematical approach introduced in this dissertation is so appealing that a patent application has been filed. The contribution of this dissertation is an incremental scientific innovation based on a mathematical framework that will allow researchers in the field of flow cytometry to improve the interpretation of data, knowing that its statistical meaning has been faithfully preserved for optimized analysis. Furthermore, with the same mathematical foundation, proof of the origin of this inherent artifact is provided. These results are unique in that new mathematical derivations are established to define and solve the critical problem of the binning effect faced at the experimental assessment level, providing a data platform that preserves its statistical content. In addition, a novel method for accumulating the log-transformed data was developed. This new method uses the properties of the transformation of statistical distributions to accumulate the output histogram in a non-integer, multi-channel fashion. Although the mathematics of this new mapping technique seems intricate, the concise nature of the derivations allows for an implementation procedure that lends itself to real-time implementation using lookup tables, a task that is also introduced in this dissertation.
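The general idea of non-integer, multi-channel accumulation can be sketched as follows: each linear input channel maps through a logarithm to a fractional span of output channels, and its counts are spread over that span in proportion to overlap, with the mapping precomputed as a lookup table for real-time use. This is a hypothetical reconstruction of the concept, not the dissertation's patented method; the channel counts and log range are assumed.

```python
import math

# Hypothetical sketch of non-integer, multi-channel accumulation: each linear
# input channel [i, i+1) maps through log10 to a fractional span of output
# channels, and its counts are spread over that span in proportion to overlap.
# The mapping is precomputed once as a lookup table for real-time use.

def build_lut(n_in, n_out, decades):
    """For each input channel i >= 1, list the (output_channel, weight) pairs."""
    lut = []
    for i in range(1, n_in):                      # channel 0 has no log image
        lo = math.log10(i) / decades * n_out      # fractional output position
        hi = math.log10(i + 1) / decades * n_out
        span = hi - lo
        pairs = []
        j = int(lo)
        while j < hi:
            overlap = min(hi, j + 1) - max(lo, j)
            pairs.append((min(j, n_out - 1), overlap / span))
            j += 1
        lut.append((i, pairs))
    return lut

def accumulate(counts, lut, n_out):
    """Spread linear-channel counts into the log-spaced output histogram."""
    out = [0.0] * n_out
    for i, pairs in lut:
        for j, w in pairs:
            out[j] += counts[i] * w
    return out

# 1024 linear channels mapped onto 256 log channels; weights per input channel
# sum to one, so total counts are preserved.
counts = [0.0] * 1024
counts[10], counts[500] = 100.0, 7.0
lut = build_lut(1024, 256, decades=math.log10(1024))
hist = accumulate(counts, lut, 256)
```

Because each input channel's weights sum to one, the statistical content (total counts and their log-position) survives the mapping, which is the property the dissertation emphasizes.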
Abstract:
The advancement of GPS technology has made it possible to use GPS devices not only as orientation and navigation tools but also as tools for tracking spatiotemporal information. GPS tracking data can be broadly applied in location-based services, such as the spatial distribution of economic activity, transportation routing and planning, traffic management and environmental control. Knowledge of how to process the data from a standard GPS device is therefore crucial for further use. Previous studies have considered various individual issues of such data processing. This paper, however, aims to outline a general procedure for processing GPS tracking data. The procedure is illustrated step by step using real-world GPS data of car movements in Borlänge, in the centre of Sweden.
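A minimal sketch of the kind of processing steps such a procedure typically involves (not the paper's own pipeline): compute great-circle distances between consecutive fixes with the haversine formula, derive speeds, and drop fixes that imply implausible speeds. The 60 m/s cutoff and the (time, lat, lon) record layout are assumptions.

```python
import math

EARTH_RADIUS_M = 6371000.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two (lat, lon) fixes in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def clean_track(fixes, max_speed_ms=60.0):
    """fixes: list of (t_seconds, lat, lon); drop fixes implying speed > max_speed_ms."""
    cleaned = [fixes[0]]
    for t, lat, lon in fixes[1:]:
        t0, lat0, lon0 = cleaned[-1]
        dt = t - t0
        if dt > 0 and haversine_m(lat0, lon0, lat, lon) / dt <= max_speed_ms:
            cleaned.append((t, lat, lon))
    return cleaned

# A short car track near Borlänge with one implausible jump (> 100 km in 1 s).
track = [
    (0, 60.4800, 15.4300),
    (1, 60.4801, 15.4301),
    (2, 61.5000, 16.5000),   # GPS glitch: removed by the speed filter
    (3, 60.4802, 15.4302),
]
cleaned = clean_track(track)
```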
Abstract:
Wavelet transforms (WTs) are a powerful tool for extracting localized variations in non-stationary signals, and applications in traffic engineering have been introduced; however, some important theoretical fundamentals are still lacking. In particular, little guidance is available on selecting an appropriate WT across potential transport applications. The research described in this paper contributes to the literature by first describing a numerical experiment that demonstrates the shortcomings of commonly used data processing techniques in traffic engineering (i.e., averaging, moving averaging, second-order differencing, oblique cumulative curves, and the short-time Fourier transform). It then mathematically describes the WT's ability to detect singularities in traffic data. Next, the selection of a suitable WT for a particular research topic in traffic engineering is discussed in detail by objectively and quantitatively comparing candidate wavelets' performance in a numerical experiment. Finally, based on several case studies using both loop detector data and vehicle trajectories, it is shown that the choice of wavelet largely depends on the specific research topic, and that the Mexican hat wavelet generally gives satisfactory performance in detecting singularities in traffic and vehicular data.
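The singularity-detection idea can be illustrated with a hand-rolled Mexican hat (Ricker) wavelet: convolving a series containing an abrupt change with the wavelet produces a strong response near the breakpoint and a near-zero response over the smooth parts. The wavelet width, window length, and synthetic "flow" values below are illustrative, not taken from the paper's case studies.

```python
import math

# Mexican hat (Ricker) wavelet and a same-length convolution with
# edge-replication padding. The wavelet has zero mean, so constant regions of
# the signal give a near-zero response while the abrupt drop stands out.

def ricker(points, a):
    """Mexican hat wavelet sampled at `points` positions, width parameter `a`."""
    norm = 2.0 / (math.sqrt(3.0 * a) * math.pi ** 0.25)
    return [norm * (1 - ((i - (points - 1) / 2.0) / a) ** 2)
            * math.exp(-((i - (points - 1) / 2.0) ** 2) / (2.0 * a * a))
            for i in range(points)]

def wavelet_response(signal, a, points=51):
    """Same-length convolution of `signal` with the Ricker wavelet."""
    w = ricker(points, a)
    half = points // 2
    padded = [signal[0]] * half + list(signal) + [signal[-1]] * half
    return [sum(padded[n + k] * wk for k, wk in enumerate(w))
            for n in range(len(signal))]

# A flow-like series with an abrupt drop (e.g. a breakdown) at index 100.
signal = [100.0] * 100 + [60.0] * 100
resp = wavelet_response(signal, a=8)
peak = max(range(len(resp)), key=lambda n: abs(resp[n]))
```

Note that for a step-like singularity the Ricker response peaks roughly one wavelet width to either side of the breakpoint, which is why width selection matters in the comparisons the paper describes.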
Abstract:
This paper describes a safety data recording and analysis system developed to capture safety occurrences, including precursors, using high-definition forward-facing video from train cabs and data from other train-borne systems. The paper describes the data processing model and how events detected through data analysis are related to an underlying socio-technical model of accident causation. The integrated approach to safety data recording and analysis ensures that systemic factors which condition, influence or potentially contribute to an occurrence are captured both for safety occurrences and for precursor events, providing a rich tapestry of antecedent causal factors that can significantly improve learning around accident causation. This can ultimately benefit railways through the development of targeted and more effective countermeasures, better risk models and more effective use and prioritization of safety funds. Level crossing occurrences are a key focus in this paper, with data analysis scenarios describing causal factors around near-miss occurrences. The paper concludes with a discussion of how the system can also be applied to other types of railway safety occurrences.