965 results for parallel optical data storage
Abstract:
Tight regulation of the MAP kinase Hog1 is crucial for survival under changing osmotic conditions. Interestingly, we found that Hog1 phosphorylates multiple upstream components, implying feedback regulation within the signaling cascade. Taking advantage of an unexpected link between glucose availability and Hog1 activity, we used quantitative single-cell measurements and computational modeling to unravel feedback regulation operating in addition to the well-known adaptation feedback triggered by glycerol accumulation. Indeed, we found that Hog1 phosphorylates its activating kinase Ssk2 on several sites, and cells expressing a non-phosphorylatable Ssk2 mutant are partially defective for feedback regulation and proper control of basal Hog1 activity. Together, our data suggest that Hog1 activity is controlled by intertwined regulatory mechanisms operating with varying kinetics, which jointly tune the Hog1 response to balance basal Hog1 activity against its steady-state level after adaptation to high osmolarity.
Abstract:
Dual-trap optical tweezers are often used in high-resolution measurements in single-molecule biophysics. Such measurements can be hindered by extraneous noise sources, the most prominent of which is the coupling of fluctuations along different spatial directions, which may affect any optical tweezers setup. In this article, we analyze, from both the theoretical and the experimental points of view, the most common source of these couplings in dual-trap optical-tweezers setups: the misalignment of traps and tether. We give criteria to distinguish different kinds of misalignment, to estimate their quantitative relevance, and to include them in the data analysis. The experimental data are obtained in a (to our knowledge) novel dual-trap optical-tweezers setup that directly measures forces. For the case in which misalignment is negligible, we provide a method to measure the stiffness of traps and tether based on variance analysis. This method can be seen as a calibration technique valid beyond the linear trap region. Our analysis is then employed to measure the persistence length of dsDNA tethers of three different lengths spanning two orders of magnitude. The effective persistence length of such tethers is shown to decrease with the contour length, in accordance with previous studies.
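A note on the variance-based stiffness calibration: the abstract gives no formulas, but the standard route is the equipartition theorem, under the assumption of a harmonic trap in thermal equilibrium. A minimal sketch with synthetic, illustrative numbers:

```python
import numpy as np

# Equipartition: for a bead in a harmonic trap at temperature T,
# (1/2) k <x^2> = (1/2) k_B T, so the stiffness is k = k_B T / Var(x).
KB = 1.380649e-23  # Boltzmann constant, J/K

def trap_stiffness(positions_m, temperature_k=298.0):
    """Estimate trap stiffness (N/m) from a bead position trace (metres)."""
    return KB * temperature_k / np.var(positions_m)

# Synthetic example: a 1 pN/um trap produces Gaussian position fluctuations
# with variance k_B T / k at room temperature.
rng = np.random.default_rng(0)
k_true = 1e-6  # N/m (= 1 pN/um)
x = rng.normal(0.0, np.sqrt(KB * 298.0 / k_true), size=100_000)
print(f"estimated k = {trap_stiffness(x):.3e} N/m (true {k_true:.1e})")
```

In practice, position detector calibration and instrumental drift, which are not modeled here, dominate the accuracy of such estimates.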
Abstract:
We consider the numerical treatment of the optical flow problem by evaluating the performance of the trust region method versus the line search method. To the best of our knowledge, this is the first time the trust region method has been studied for variational optical flow computation. Four different optical flow models are used to test the performance of the proposed algorithm, combining linear and nonlinear data terms with quadratic and TV regularization. We show that trust region often performs better than line search, especially in the presence of non-linearity and non-convexity in the model.
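The abstract does not detail the trust-region variant used. For orientation, a generic trust-region iteration looks like the sketch below, with the Cauchy point as a deliberately simple subproblem solver; actual variational optical flow solvers would use proper curvature models and dogleg or CG subproblem solvers.

```python
import numpy as np

def trust_region_minimize(f, grad, x0, radius=1.0, max_radius=10.0,
                          eta=0.1, tol=1e-8, max_iter=500):
    """Generic trust-region iteration with the Cauchy point as the step.

    The local model is m(s) = f(x) + g.s + 0.5*s.s (identity Hessian);
    production solvers use real curvature and dogleg/CG subproblem solvers.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        # Cauchy point: steepest-descent step clipped to the trust radius.
        step = -g * min(1.0, radius / gnorm)
        predicted = -(g @ step) - 0.5 * (step @ step)   # m(0) - m(s) > 0
        rho = (f(x) - f(x + step)) / predicted           # agreement ratio
        if rho < 0.25:
            radius *= 0.25                               # model poor: shrink
        elif rho > 0.75 and np.isclose(np.linalg.norm(step), radius):
            radius = min(2.0 * radius, max_radius)       # model good: grow
        if rho > eta:
            x = x + step                                 # accept the step
    return x

# Tiny smooth test problem (not an optical flow energy).
f = lambda x: (x[0] - 3.0) ** 2 + 10.0 * (x[1] + 1.0) ** 2
grad = lambda x: np.array([2.0 * (x[0] - 3.0), 20.0 * (x[1] + 1.0)])
print(trust_region_minimize(f, grad, np.zeros(2)))   # ~ [3, -1]
```

The acceptance ratio rho, comparing actual to predicted decrease, is what distinguishes trust region from line search: the step length is decided before the direction is committed to, which is what helps on non-convex energies.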
Abstract:
Coherent anti-Stokes Raman scattering (CARS) is a powerful laser spectroscopy method in which significant successes have been achieved. However, the non-linear nature of CARS complicates the analysis of the measured spectra. The objective of this Thesis is to develop a new phase retrieval algorithm for CARS. It utilizes the maximum entropy method and a new wavelet approach for the spectroscopic background correction of the phase function. The method was developed to be easily automated and used on a large number of spectra of different substances. The algorithm was successfully tested on experimental data.
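The thesis's exact algorithm is not specified in the abstract, but the wavelet idea for background correction can be sketched as follows: discard the detail coefficients and keep the coarse approximation as the slowly varying background estimate (the wavelet choice and decomposition level below are illustrative).

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_background(signal, wavelet="sym8", level=6):
    """Estimate a slowly varying background by zeroing all wavelet
    detail coefficients and reconstructing from the coarse approximation."""
    coeffs = pywt.wavedec(signal, wavelet, level=level, mode="smooth")
    coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet, mode="smooth")[: len(signal)]

# Synthetic test: sharp Raman-like lines on a broad curved background.
x = np.linspace(0.0, 1.0, 1024)
background = 2.0 + 1.5 * x - 1.0 * x**2
lines = (0.8 * np.exp(-((x - 0.3) / 0.005) ** 2)
         + 0.5 * np.exp(-((x - 0.7) / 0.008) ** 2))
spectrum = background + lines
corrected = spectrum - wavelet_background(spectrum)
```

Real spectroscopic pipelines refine this, for example by iterating the estimate so that sharp peaks do not leak into the coarse approximation.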
Abstract:
This thesis briefly presents the basic operation and use of centrifugal pumps and parallel pumping applications. The characteristics of parallel pumping applications are compared to electrical circuits in order to find an analogy between these technical fields. The purpose of studying circuit theory is to find out whether common software tools for solving circuit behaviour could be used to study parallel pumping applications. The empirical part of the thesis introduces a simulation environment for parallel pumping systems, which is based on circuit components of the Matlab Simulink software. The created simulation environment enables the study of variable-speed-controlled parallel pumping systems under different control methods. The introduced simulation environment was evaluated by building a simulation model of an actual parallel pumping system at Lappeenranta University of Technology. The simulated performance of the parallel pumps was compared to measured values from the actual system. The gathered information shows that, if the initial data on the system and pump performance are adequate, the circuit-based simulation environment can be exploited to study parallel pumping systems. The introduced simulation environment can represent the actual operation of parallel pumps with reasonable accuracy. Thereby, the circuit-based simulation can be used as a research tool to develop new control methods for parallel pumps.
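To illustrate the kind of computation such a simulation environment performs, here is a minimal sketch of finding the operating point of identical pumps in parallel, with quadratic pump and system curves. All coefficients are illustrative, not taken from the Lappeenranta system.

```python
import numpy as np
from scipy.optimize import brentq

# Quadratic pump curve H = H0 - a*Q^2 (H in m, Q in m^3/h); illustrative values.
H0, a = 40.0, 0.004          # single-pump shut-off head and curve coefficient
H_static, k = 10.0, 0.002    # system static head and friction coefficient

def pump_flow(H, n_pumps):
    """Combined flow of n identical parallel pumps at head H: flows add."""
    return n_pumps * np.sqrt(max(H0 - H, 0.0) / a)

def system_flow(H):
    """Flow the system curve H = H_static + k*Q^2 allows at head H."""
    return np.sqrt(max(H - H_static, 0.0) / k)

# The operating point is where the combined pump curve meets the system curve.
for n in (1, 2):
    H_op = brentq(lambda H: pump_flow(H, n) - system_flow(H), H_static, H0)
    print(f"{n} pump(s): H = {H_op:.1f} m, Q = {system_flow(H_op):.1f} m^3/h")
```

Note that two identical pumps deliver less than twice the single-pump flow at the operating point (here 100 vs. 71 m^3/h), which is precisely why the choice of control method for parallel pumps matters.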
Abstract:
We present a polarimetric-based optical encoder for image encryption and verification. A system for generating random polarized vector keys, based on a Mach-Zehnder configuration combined with translucent liquid crystal displays in each path of the interferometer, is developed. Polarization information of the encrypted signal is retrieved by taking advantage of the information provided by the Stokes parameters. Moreover, a photon-counting model is used in the encryption process, which provides data sparseness and a nonlinear transformation to enhance security. An authorized user with access to the polarization keys and the optical design variables can retrieve and validate the photon-counting plaintext. Optical experimental results demonstrate the feasibility of the encryption method.
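The photon-counting step can be illustrated with the Poisson imaging model commonly used in photon-counting encryption work; the paper's exact formulation is not given in the abstract, so the sketch below is only indicative: a fixed photon budget is distributed over pixels in proportion to intensity, yielding a sparse, nonlinearly transformed record.

```python
import numpy as np

def photon_counting(image, n_photons, rng=None):
    """Photon-limited version of an intensity image: each pixel receives
    a Poisson-distributed count with mean proportional to its intensity.
    (A model commonly used in photon-counting encryption; the paper's
    exact formulation is not stated in the abstract.)"""
    rng = rng if rng is not None else np.random.default_rng()
    probs = image / image.sum()
    return rng.poisson(n_photons * probs)

rng = np.random.default_rng(1)
img = rng.random((64, 64))            # stand-in for an encrypted intensity map
sparse = photon_counting(img, n_photons=5_000, rng=rng)
print("nonzero pixels:", np.count_nonzero(sparse), "of", sparse.size)
```

With a small photon budget most pixels record zero counts, which is the data sparseness the abstract refers to; the Poisson statistics supply the nonlinear transformation.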
Abstract:
The Large Hadron Collider, constructed at the European Organization for Nuclear Research (CERN), is the largest single measuring instrument ever built, and currently the most powerful particle accelerator in existence. The Large Hadron Collider includes six different experiment stations, one of which is the Compact Muon Solenoid (CMS). The main purpose of the CMS is to track and study the particles produced in proton-proton collisions. The primary detectors utilized in the CMS are resistive plate chambers (RPCs). To obtain data from these detectors, a link system has been designed. The main idea of the link system is to receive data from the detector front-end electronics in parallel form and to transmit it onwards in serial form via an optical fiber. The system is mostly ready and in place. However, a problem has occurred with the innermost RPC detectors, located in the sector labeled RE1/1: the transmission lines for parallel data suffer from signal integrity issues over long distances. As a solution, a new version of the link system has been devised, one that fits into a smaller space and can be located within the CMS, closer to the detectors. So far this RE1/1 link system has been completed only partially, with just the mechanical design and casing done. In this thesis, the link system electronics for the RE1/1 sector have been designed by modifying the existing link system concept to better meet the requirements of the RE1/1 sector. In addition to completing the prototype of the RE1/1 link system electronics, some testing has been done to ensure the functionality of the design.
Abstract:
The purpose of the work was to realize a high-speed digital data transfer system for RPC muon chambers in the CMS experiment at CERN's new LHC accelerator. This large-scale system took many years and many stages of prototyping to develop, and required the participation of tens of people. The system interfaces to the Frontend Boards (FEBs) at the 200,000-channel detector and to the trigger and readout electronics in the control room of the experiment. The distance between the two is about 80 metres, and the speed required for the optical links was pushing the limits of available technology when the project was started. Here, as in many other aspects of the design, it was assumed that the features of readily available commercial components would develop in the course of the design work, as indeed they did. By choosing a high speed it was possible to multiplex the data from some of the chambers into the same fibres to reduce the number of links needed. Further reduction was achieved by employing zero suppression and data compression, and a total of only 660 optical links were needed. Another requirement, which conflicted somewhat with choosing the components as late as possible, was that the design needed to be tolerant to an ionizing radiation dose of 100 Gy and to have a moderate tolerance to Single Event Effects (SEEs). This required some radiation test campaigns, and eventually led to ASICs being chosen for some of the critical parts. The system was made to be as reconfigurable as possible. The reconfiguration needs to be done from a distance, as the electronics is not accessible except during short and rare service breaks once the accelerator starts running. Therefore reconfigurable logic is used extensively, and the firmware development for the FPGAs constituted a sizable part of the work. Some special techniques were needed there too, to achieve the required radiation tolerance. The system has been demonstrated to work in several laboratory and beam tests, and now we are waiting to see it in action when the LHC starts running in the autumn of 2008.
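Of the two payload reductions mentioned, zero suppression is the simpler to illustrate. The toy sketch below shows the principle only, not the actual on-detector data format:

```python
from typing import List, Tuple

def zero_suppress(frame: List[int]) -> List[Tuple[int, int]]:
    """Keep only fired channels as (channel index, value) pairs.
    Detector frames are mostly zeros, so this shrinks the payload a lot."""
    return [(ch, v) for ch, v in enumerate(frame) if v != 0]

def expand(pairs: List[Tuple[int, int]], n_channels: int) -> List[int]:
    """Inverse mapping, as used on the receiving end."""
    frame = [0] * n_channels
    for ch, v in pairs:
        frame[ch] = v
    return frame

frame = [0] * 96
frame[17], frame[42] = 1, 1          # two hits in a 96-channel frame
packed = zero_suppress(frame)
assert expand(packed, 96) == frame   # lossless round trip
print(f"{len(frame)} channels -> {len(packed)} (channel, value) pairs")
```

Because typical occupancies are low, such encodings let several chambers share a fibre, which is how the link count was driven down to 660.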
Abstract:
The aim of this master's thesis is to study which processes increase the auxiliary power consumption in carbon capture and storage, and whether it is possible to reduce this consumption with variable speed drives. The cost of carbon capture and storage is also studied. Data about auxiliary power consumption in carbon capture are gathered from studies and estimates made by various research centres. Based on these studies, a view is presented of how the auxiliary power consumption is divided between the different processes in carbon capture. In a literature study, the operation of three basic carbon capture systems is described. Different methods of transporting carbon dioxide and options for storing it are also described in this section. At the end of the thesis, the processes that consume most of the auxiliary power are identified and possibilities to reduce this consumption are evaluated. The costs of carbon capture, transport, and storage are also evaluated at this point, including the case in which carbon capture and storage systems are fully deployed. From the results it can be estimated in which processes variable speed drives can be used and what kind of cost and power consumption reductions could be achieved. The results also show how large a project carbon capture and storage would be if fully deployed.
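The reason variable speed drives can cut auxiliary power on centrifugal pumps, fans, and compressors is the affinity-law scaling of shaft power with the cube of speed. The sketch below is the idealized picture, ignoring static head, drive losses, and efficiency shifts:

```python
# Affinity laws for centrifugal machines: flow ~ n, head ~ n^2, power ~ n^3.
# A rough illustration of why running at reduced speed (instead of throttling
# at full speed) can save a large fraction of the auxiliary power.
def relative_power(speed_fraction: float) -> float:
    return speed_fraction ** 3

for frac in (1.0, 0.9, 0.8, 0.7, 0.5):
    print(f"{frac:>4.0%} speed -> {relative_power(frac):5.1%} of rated shaft power")
```

Even a 20% speed reduction cuts the ideal shaft power roughly in half, which is why processes dominated by large fans, pumps, and compressors are the natural candidates for variable speed drives.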
Abstract:
The optical, mechanical, and microstructural properties of MgF2 single layers grown by ion beam sputtering have been investigated by spectrophotometric measurements, film stress characterization, x-ray photoelectron spectroscopy (XPS), x-ray diffraction, and transmission electron microscopy. The deposition conditions, with or without fluorine reactive gas, have been found to greatly influence the optical absorption and the stress of the films as well as their microstructure. The layers grown with fluorine compensation exhibit a regular columnar microstructure and a UV optical absorption that can be very low, either as deposited or after thermal annealing at very low temperatures. In contrast, layers grown without fluorine compensation exhibit a less regular microstructure and a high ultraviolet absorption that is particularly hard to cure. On the basis of calculations, it is shown that F centers are responsible for this absorption, whereas all the films were found to be stoichiometric within the limits of XPS sensitivity. Analyzing our experimental curves against external data taken from the literature, we propose possible diffusion mechanisms that could explain the behavior of the coatings.
Abstract:
Asphaltenes are fractions of crude oils that can precipitate, and one of the parameters used to predict the conditions under which this phenomenon occurs is the Hildebrand solubility parameter. In this work, the propagation of uncertainty in the experimental determination of the solubility parameter of different crude oils was evaluated, calculated from data on asphaltene precipitation induced by the addition of n-heptane and detected by optical microscopy. It was verified that the solubility parameter of an oil and its associated uncertainty are specific to that oil, and it is recommended that, whenever feasible, both be determined in parallel, lending greater credibility to the results.
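First-order (Gauss) uncertainty propagation is the standard tool for such an evaluation. The sketch below is generic; the volume-fraction mixing rule in the example is a hypothetical stand-in, not the paper's actual determination formula, and all numbers are illustrative:

```python
import numpy as np

def propagate(f, x, sigma, h=1e-6):
    """First-order (Gauss) uncertainty propagation:
    sigma_f^2 = sum_i (df/dx_i)^2 * sigma_i^2,
    with partial derivatives taken by central finite differences."""
    x = np.asarray(x, dtype=float)
    grads = np.empty_like(x)
    for i in range(x.size):
        dx = np.zeros_like(x)
        dx[i] = h * max(abs(x[i]), 1.0)
        grads[i] = (f(x + dx) - f(x - dx)) / (2 * dx[i])
    return float(np.sqrt(np.sum((grads * np.asarray(sigma)) ** 2)))

# Hypothetical example: a solubility parameter taken as a volume-fraction
# weighted mean of oil and n-heptane contributions, p = (phi, d_oil, d_hep).
delta = lambda p: p[0] * p[1] + (1 - p[0]) * p[2]
x = [0.6, 19.0, 15.3]          # MPa^0.5; illustrative values only
sigma = [0.02, 0.4, 0.1]       # standard uncertainties of each input
print(f"delta = {delta(np.array(x)):.2f} +/- "
      f"{propagate(delta, x, sigma):.2f} MPa^0.5")
```

The point the abstract makes follows directly: since the sensitivities (the partial derivatives) depend on the oil, the propagated uncertainty is oil-specific and should be reported alongside the parameter itself.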
Abstract:
The synthesis of gold nanoparticles (Au NPs) 15, 26, and 34 nm in diameter, followed by the investigation of their size-dependent optical and catalytic properties, is described herein as an undergraduate-level experiment. The proposed experiment covers concepts on the synthesis, stabilization, and characterization of Au NPs, their size-dependent optical and catalytic properties at the nanoscale, chemical kinetics, and the role of a catalyst. The experiment should be performed by groups of two or three students in three lab sessions of 3 h each, organized as follows: i) synthesis of Au NPs of different sizes and investigation of their optical properties; ii) evaluation of their catalytic activity; and iii) data analysis and discussion. We believe that this activity enables students to integrate these multidisciplinary concepts in a single experiment as well as to become familiar with an active research field and the current literature in the areas of nanoparticle synthesis and catalysis.
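The kinetics part of the data analysis typically reduces to a pseudo-first-order fit. The abstract does not name the model reaction (4-nitrophenol reduction followed by UV-vis absorbance is a common choice in Au NP catalysis labs); the absorbance data below are made up for illustration:

```python
import numpy as np

# Pseudo-first-order kinetics: A_t = A_0 * exp(-k_app * t), so a plot of
# ln(A_t / A_0) against t is a straight line with slope -k_app.
t = np.array([0, 60, 120, 180, 240, 300], dtype=float)       # time, s
absorbance = np.array([1.00, 0.74, 0.55, 0.41, 0.30, 0.22])  # arbitrary units

slope, _ = np.polyfit(t, np.log(absorbance / absorbance[0]), 1)
print(f"apparent rate constant k_app = {-slope:.2e} 1/s")
```

Comparing k_app across the 15, 26, and 34 nm batches, normalized by particle surface area, is the natural way for students to connect particle size to catalytic activity.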
Abstract:
Modern warfare is seeing the active development of a new trend: robotic warfare. One of the critical elements of robotic warfare systems is an automatic target recognition system that recognizes objects based on data received from sensors. This work considers aspects of the optical realization of such a system by means of NIR target scanning at fixed wavelengths. An algorithm was designed, an experimental setup was built, and samples of various modern gear and apparel materials were tested. For pattern testing, camouflage samples of armies actively engaged in armed conflict were chosen. Tests were performed both in a clear atmosphere and in an artificial, extremely humid and hot atmosphere to simulate field conditions.
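A minimal sketch of the recognition idea: match a reflectance signature measured at a few fixed NIR wavelengths against a material library by nearest neighbour. All wavelengths, material names, and reflectance values below are hypothetical:

```python
import numpy as np

# Hypothetical reflectance signatures at three fixed NIR wavelengths
# (say 1050, 1300, and 1550 nm); values are illustrative, not measured.
library = {
    "cotton fabric":    np.array([0.62, 0.55, 0.48]),
    "polyamide fabric": np.array([0.45, 0.52, 0.58]),
    "foliage":          np.array([0.70, 0.48, 0.35]),
}

def classify(measurement):
    """Nearest-neighbour match of a measured signature; signatures are
    normalised first to reduce sensitivity to overall illumination."""
    m = measurement / np.linalg.norm(measurement)

    def dist(name):
        ref = library[name] / np.linalg.norm(library[name])
        return np.linalg.norm(m - ref)

    return min(library, key=dist)

print(classify(np.array([0.50, 0.55, 0.60])))  # -> "polyamide fabric"
```

The appeal of fixed-wavelength scanning is that a handful of well-chosen bands can separate synthetic fibres from natural materials even when their visible-band camouflage patterns match.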
Abstract:
Visual data mining (VDM) tools employ information visualization techniques in order to represent large amounts of high-dimensional data graphically and to involve the user in exploring data at different levels of detail. The users look for outliers, patterns and models – in the form of clusters, classes, trends, and relationships – in different categories of data, e.g., financial or business information. The focus of this thesis is the evaluation of multidimensional visualization techniques, especially from the business user's perspective. We address three research problems. The first problem is the evaluation of projection-based visualizations with respect to their effectiveness in preserving the original distances between data points and the clustering structure of the data. In this respect, we propose the use of existing clustering validity measures. We illustrate their usefulness in evaluating five visualization techniques: Principal Components Analysis (PCA), Sammon's Mapping, the Self-Organizing Map (SOM), Radial Coordinate Visualization and Star Coordinates. The second problem concerns evaluating different visualization techniques as to their effectiveness in visual data mining of business data. For this purpose, we propose an inquiry evaluation technique and conduct the evaluation of nine visualization techniques. The visualizations under evaluation are Multiple Line Graphs, Permutation Matrix, Survey Plot, Scatter Plot Matrix, Parallel Coordinates, Treemap, PCA, Sammon's Mapping and the SOM. The third problem is the evaluation of the quality of use of VDM tools. We provide a conceptual framework for evaluating the quality of use of VDM tools and apply it to the evaluation of the SOM. In the evaluation, we use an inquiry technique for which we developed a questionnaire based on the proposed framework. The contributions of the thesis consist of three new evaluation techniques and the results obtained by applying them. The thesis provides a systematic approach to the evaluation of various visualization techniques. In this respect, first, we performed and described the evaluations in a systematic way, highlighting the evaluation activities and their inputs and outputs. Second, we integrated the evaluation studies into the broad framework of usability evaluation. The results of the evaluations are intended to help developers and researchers of visualization systems to select appropriate visualization techniques in specific situations. The results also contribute to the understanding of the strengths and limitations of the evaluated visualization techniques, and further to their improvement.
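The thesis's own clustering validity measures are not reproduced in the abstract; as a rough stand-in, off-the-shelf measures in scikit-learn convey the flavour of the first research problem, checking how well a projection preserves neighbourhoods and clustering structure:

```python
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.decomposition import PCA
from sklearn.manifold import trustworthiness
from sklearn.metrics import silhouette_score

# Clustered high-dimensional data and its 2-D PCA projection.
X, labels = make_blobs(n_samples=600, n_features=20, centers=5, random_state=0)
X2 = PCA(n_components=2).fit_transform(X)

# How well does the projection preserve local neighbourhoods?
print("trustworthiness:", trustworthiness(X, X2, n_neighbors=10))
# Is the clustering structure still visible in the projected space?
print("silhouette (original): ", silhouette_score(X, labels))
print("silhouette (projected):", silhouette_score(X2, labels))
```

Running the same check for Sammon's Mapping, the SOM, or any other projection gives a numeric basis for comparing techniques before turning to the user-centred inquiry evaluations the thesis proposes.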