904 results for 100602 Input Output and Data Devices


Relevance: 100.00%

Abstract:

This paper presents the analysis and implementation of a new drive system applied to refrigeration systems, complying with the restrictions imposed by the IEC standards (harmonic, flicker, and EMI - electromagnetic interference restrictions), in order to obtain high efficiency, high power factor, reduced harmonic distortion in the input current and reduced electromagnetic interference, with excellent performance in the temperature control of a refrigeration prototype system (automatic control, precision and high dynamic response). The proposal is to replace the single-phase motor of the conventional refrigeration system with a three-phase motor. In this way, a proper control technique can be applied, using closed-loop feedback control, which allows accurate adjustment of the desired temperature. The proposed refrigeration prototype uses a 0.5 hp three-phase motor and an open (belt-drive) Bitzer IY type compressor. The input rectifier stage's features include reduced input current ripple, reduced output voltage ripple, the use of low-stress devices, low volume for the EMI input filter, high input power factor (PF), and low total harmonic distortion (THD) in the input current, in compliance with the IEC 61000-3-2 standard. The digital controller for the output three-phase inverter stage has been developed using conventional voltage-frequency control (scalar V/f control) and a simplified stator-oriented vector control, in order to verify the feasibility and performance of the proposed digital controls for continuous temperature control applied to the refrigerator prototype. ©2008 IEEE.
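
The scalar V/f control mentioned for the inverter stage keeps the stator flux roughly constant by holding the voltage-to-frequency ratio fixed. A minimal sketch, with illustrative ratings that are assumptions, not the prototype's actual values:

```python
# Constant V/f (scalar) control sketch: the inverter voltage command scales
# linearly with the requested frequency so stator flux stays roughly constant.
# All ratings below are illustrative assumptions, not taken from the paper.

RATED_VOLTAGE = 220.0   # V, assumed motor rating
RATED_FREQ = 60.0       # Hz, assumed motor rating
MIN_VOLTAGE = 15.0      # V, low-speed boost to overcome stator resistance

def vf_setpoint(freq_hz: float) -> float:
    """Return the inverter voltage command for a requested frequency."""
    v = RATED_VOLTAGE * freq_hz / RATED_FREQ
    # Clamp: boost at low speed, never exceed the rated voltage.
    return min(RATED_VOLTAGE, max(MIN_VOLTAGE, v))
```

Above the rated frequency the voltage saturates at the rated value, which is why scalar drives enter a field-weakening region there.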

Relevance: 100.00%

Abstract:

Includes bibliography

Relevance: 100.00%

Abstract:

Postgraduate Program in Electrical Engineering - FEB

Relevance: 100.00%

Abstract:

There are several electrophysiological systems available commercially. Usually, control groups are required to compare their results, due to the differences between display types. Our aim was to examine the differences between CRT and LCD/TFT stimulators used for pattern VEP responses performed according to the ISCEV standards. We also aimed to check different contrast values toward thresholds. In order to obtain more precise results, we intended to measure the intensity and temporal response characteristics of the monitors with photometric methods. To record VEP signals, a Roland RetiPort electrophysiological system was used. The pattern VEP tests were carried out according to ISCEV protocols on a CRT and a TFT monitor consecutively. An achromatic checkerboard pattern was used at three different contrast levels (maximal, 75%, and 25%) using 1° and 15' check sizes. Both CRT and TFT displays were luminance- and contrast-matched, according to the gamma functions based on measurements at several DAC values. Monitor-specific luminance parameters were measured by means of spectroradiometric instruments. Temporal differences between the displays' electronic and radiometric signals were measured with a device specifically built for the purpose. We tested six healthy control subjects with visual acuity of at least 20/20. The tests were performed on each subject three times on different days. We found significant temporal differences between the CRT and the LCD monitors at all contrast levels and spatial frequencies. On average, the latency times were 9.0 ms (+/- 3.3 ms) longer with the TFT stimulator. This value is in accordance with the average of the measured TFT input-output temporal difference values (10.1 +/- 2.2 ms).
According to our findings, by measuring the temporal parameters of the TFT monitor with an adequately calibrated measurement setup and correcting the VEP data with the resulting values, the VEP signals obtained with different display types can be made comparable.
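
The proposed correction amounts to subtracting the measured display delay from the recorded latency. The 10.1 ms value is the study's mean TFT input-output difference; the function and names below are ours, for illustration only:

```python
# Correct TFT-recorded VEP latencies by the display's own measured delay.
# 10.1 ms is the mean TFT input-output temporal difference reported in the
# abstract; the function itself is an illustrative sketch, not the study's code.

TFT_DISPLAY_DELAY_MS = 10.1

def correct_latency(measured_ms: float,
                    display_delay_ms: float = TFT_DISPLAY_DELAY_MS) -> float:
    """Subtract the display delay from a measured VEP latency (ms)."""
    return measured_ms - display_delay_ms
```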

Relevance: 100.00%

Abstract:

The Italian radio telescopes are currently undergoing a major upgrade in response to the growing demand for deep radio observations, such as surveys over large sky areas or observations of vast samples of compact radio sources. The optimised employment of the Italian antennas, originally constructed mainly for VLBI activities and provided with a control system (FS - Field System) not tailored to single-dish observations, required important modifications, in particular to the guiding software and data acquisition system. The production of a completely new control system called ESCS (Enhanced Single-dish Control System) for the Medicina dish started in 2007, in synergy with the software development for the forthcoming Sardinia Radio Telescope (SRT). The aim is to produce a system optimised for single-dish observations in continuum, spectrometry and polarimetry. ESCS is also planned to be installed at the Noto site. A substantial part of this thesis work consisted in designing and developing subsystems within ESCS, in order to provide the software with tools to carry out large maps, spanning from the implementation of On-The-Fly fast scans (following both conventional and innovative observing strategies) to the production of standard single-dish output files and the realisation of tools for the quick-look of the acquired data. The test period coincided with the commissioning phase of two devices temporarily installed on the Medicina antenna while waiting for the SRT to be completed: an 18-26 GHz 7-feed receiver and the 14-channel analogue backend developed for its use. It is worth stressing that this is the only K-band multi-feed receiver currently available worldwide. The commissioning of the overall hardware/software system constituted a considerable section of the thesis work.
Tests were carried out to verify the system's stability and capabilities, down to sensitivity levels that had never been reached at Medicina with the previous observing techniques and hardware devices. The aim was also to assess the scientific potential of the multi-feed receiver for the production of wide maps, exploiting its temporary availability on a mid-sized antenna. Dishes like the 32-m antennas at Medicina and Noto in fact offer the best conditions for large-area surveys, especially at high frequencies, as they provide a suitable compromise between beam sizes large enough to cover large areas of the sky quickly (typical of small telescopes) and sensitivity (typical of large telescopes). The KNoWS (K-band Northern Wide Survey) project aims at a full-northern-sky survey at 21 GHz; its pilot observations, performed using the new ESCS tools and a peculiar observing strategy, constituted an ideal test-bed for ESCS itself and for the multi-feed/backend system. The KNoWS group, of which I am part, supported the commissioning activities, also providing map-making and source-extraction tools in order to complete the necessary data reduction pipeline and assess the overall scientific capabilities of the system. The K-band observations, carried out in several sessions over the December 2008-March 2010 period, were accompanied by a 5 GHz test survey during the summertime, which is not suitable for high-frequency observations. This activity was conceived to check the new analogue backend separately from the multi-feed receiver, and simultaneously to produce original scientific data (the 6-cm Medicina Survey, 6MS, a polar cap survey to complete PMN-GB6 and provide all-sky coverage at 5 GHz).

Relevance: 100.00%

Abstract:

The aims of this work were 1) the development and validation of sensitive, substance-specific methods for the quantitative determination of anionic, non-ionic and amphoteric surfactants and their metabolites in aqueous environmental samples using high-performance mass-spectrometric instruments; 2) the generation of aerobic, polar degradation products of surfactants in a laboratory fixed-bed bioreactor (FBBR) simulating real environmental conditions, whose biocoenosis originated from surface water; 3) to elucidate the degradation mechanism of surfactants by identifying and characterizing, by mass spectrometry, new metabolites obtained in 2), as well as following the primary and subsequent degradation; 4) to obtain information on the input and behaviour of surfactants and their degradation products in wastewater and surface water under different hydrological and climatic conditions through quantitative investigations; 5) to study the behaviour of persistent surfactant metabolites in waterworks treating contaminated surface water and to determine their occurrence in drinking water; 6) to assess possible harmful effects of newly discovered metabolites by means of ecotoxicological bioassays; 7) to demonstrate the environmental relevance of the degradation studies by comparing the field data with the laboratory results. The compounds investigated were selected on the basis of their production volume and their novelty on the surfactant market. They comprised the detergent ingredients linear alkylbenzene sulfonates (LAS), the surfactant with the highest production volume, and the two non-ionic surfactants alkyl glucamides (AG) and alkyl polyglucosides (APG), as well as the amphoteric surfactant cocamidopropyl betaine (CAPB). In addition, the polymeric dye-transfer inhibitor polyvinylpyrrolidone (PVP) was investigated.

Relevance: 100.00%

Abstract:

Among the experimental methods commonly used to define the behaviour of a full-scale system, dynamic tests are the most complete and efficient procedures. A dynamic test is an experimental process that defines a set of characteristic parameters of the dynamic behaviour of the system, such as the natural frequencies of the structure, the mode shapes and the corresponding modal damping values. An assessment of these modal characteristics can be used both to verify the theoretical assumptions of the design and to monitor the performance of the structural system during its operational use. The thesis is structured in the following chapters. The first, introductory chapter recalls some basic notions of structural dynamics, focusing the discussion on systems with multiple degrees of freedom (MDOF), which can represent a generic real system under study when it is excited with a harmonic force or in free vibration. The second chapter is entirely centred on the dynamic identification of a structure subjected to an experimental test in forced vibrations. It first describes the construction of the FRF through the classical FFT of the recorded signal. A different method, also in the frequency domain, is subsequently introduced; it allows the FRF to be computed accurately using the geometric characteristics of the ellipse that represents the direct input-output comparison. The two methods are compared, and attention is then focused on some advantages of the proposed methodology. The third chapter focuses on the study of real structures subjected to experimental tests where the force is not known, as in an ambient or impact test. In this analysis we decided to use the CWT, which allows a simultaneous investigation of a generic signal x(t) in the time and frequency domains. The CWT is first introduced to process free oscillations, with excellent results in terms of frequencies, damping values and vibration modes.
The application to the case of ambient vibrations yields accurate modal parameters of the system, although some important observations must be made regarding damping. The fourth chapter is still about post-processing data acquired after a vibration test, but this time through the application of the discrete wavelet transform (DWT). In the first part, the results obtained by the DWT are compared with those obtained by the application of the CWT. Particular attention is given to the use of the DWT as a tool for filtering the recorded signal, since in the case of ambient vibrations the signals are often affected by a significant level of noise. The fifth chapter focuses on another important aspect of the identification process: model updating. In this chapter, starting from the modal parameters obtained from environmental vibration tests performed by the University of Porto in 2008 and by the University of Sheffield on the Humber Bridge in England, an FE model of the bridge is defined, in order to determine which type of model captures the real dynamic behaviour of the bridge more accurately. The sixth chapter draws the conclusions of the presented research. They concern the application of a frequency-domain method for evaluating the modal parameters of a structure and its advantages, the advantages of applying a procedure based on wavelet transforms in the identification process for tests with unknown input, and finally the problem of 3D modelling of systems with many degrees of freedom and with different types of uncertainty.
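
The classical FFT-based FRF construction described for the second chapter can be sketched as the direct ratio of output to input spectra. This is the textbook estimate, not the thesis's ellipse-based method:

```python
import numpy as np

# Classical frequency-domain FRF estimate for a forced-vibration test:
# H(f) = FFT(output) / FFT(input), computed bin by bin. This is the plain
# direct ratio, an illustrative sketch rather than the thesis's procedure.

def frf(input_signal, output_signal):
    """Return the complex FRF as the ratio of output and input spectra."""
    X = np.fft.rfft(np.asarray(input_signal, dtype=float))
    Y = np.fft.rfft(np.asarray(output_signal, dtype=float))
    return Y / X
```

In practice averaged cross- and auto-spectra (H1/H2 estimators) are preferred to reduce the influence of measurement noise.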

Relevance: 100.00%

Abstract:

Synthetic biology has developed rapidly in recent years: many papers have been published and many applications have been presented, spanning from the production of biopharmaceuticals to the synthesis of bioenergetic substrates or industrial catalysts. Despite these advances, however, most applications are quite simple and do not fully exploit the potential of this discipline. This limitation in complexity has many causes, such as the incomplete characterization of some components or the intrinsic variability of biological systems, but one of the most important is the inability of the cell to sustain the additional metabolic burden introduced by a complex circuit. The objective of the project of which this work is part is to try to solve this problem through the engineering of a multicellular behaviour in prokaryotic cells. This system will introduce a cooperative behaviour that makes it possible to implement complex functionalities that cannot be obtained with a single cell. In particular, the goal is to implement Leader Election, a procedure first devised in the field of distributed computing to identify a single process as the organizer and coordinator of a series of tasks assigned to the whole population. The election of the Leader greatly simplifies the computation by providing centralized control. Furthermore, this system may even be useful for evolutionary studies that aim to explain how complex organisms evolved from unicellular systems. The work presented here describes, in particular, the design and experimental characterization of a component of the circuit that solves the Leader Election problem. This module, composed of a hybrid promoter and a gene, is activated in the non-leader cells after receiving the signal that a leader is present in the colony.
The most important element in this case is the hybrid promoter: it has been realized in different versions, applying the heuristic rules stated in [22], and their activity has been experimentally tested. The objective of the experimental characterization was to test the response of the genetic circuit to the introduction into the cellular environment of particular molecules, the inducers, which can be considered inputs of the system. The desired behaviour is similar to that of a logic AND gate, in which the output, represented by the luminous signal produced by a fluorescent protein, is high only in the presence of both inducers. The robustness and stability of this behaviour have been tested by changing the concentrations of the input signals and building dose-response curves. From these data it is possible to conclude that the analysed constructs show an AND-like behaviour over a wide range of inducer concentrations, even if many differences can be identified in the expression profiles of the different constructs. This variability reflects the fact that the input and output signals are continuous, so their binary representation cannot capture the complexity of the behaviour. The module of the circuit considered in this analysis has a fundamental role in the realization of the intercellular communication system that is necessary for the cooperative behaviour to take place. For this reason, the second phase of the characterization focused on the analysis of signal transmission. In particular, the interaction between this element and the one responsible for emitting the chemical signal has been tested. The desired behaviour is still similar to a logic AND, since, even in this case, the output signal is determined by the hybrid promoter's activity. The experimental results have demonstrated that the systems behave correctly, even if there is still substantial variability between them.
The dose-response curves highlighted that stricter constraints on the inducer concentrations need to be imposed in order to obtain a clear separation between the two levels of expression. In the concluding chapter, the DNA sequences of the hybrid promoters are analysed in an attempt to identify the regulatory elements that are most important in determining gene expression. Given the available data, it was not possible to draw definitive conclusions. Finally, a few considerations on promoter engineering and the realization of complex circuits are presented. This section briefly recalls some of the problems outlined in the introduction and provides a few possible solutions.
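
The AND-like dose-response behaviour can be sketched with a two-input Hill model, where expression is high only when both inducers are present. The parameters below are hypothetical, chosen only to reproduce the qualitative shape of such curves, not fitted to the constructs:

```python
# Two-input Hill model of an AND-like hybrid promoter: normalized output
# (0..1) is the product of two activation terms, one per inducer.
# All parameter values are hypothetical, for illustration only.

def hill(x: float, k: float, n: float) -> float:
    """Hill activation term: 0 at x=0, approaching 1 for x >> k."""
    return x**n / (k**n + x**n)

def promoter_output(inducer_a: float, inducer_b: float,
                    k_a: float = 1.0, k_b: float = 1.0,
                    n: float = 2.0) -> float:
    """Normalized fluorescent output of the AND-like promoter."""
    return hill(inducer_a, k_a, n) * hill(inducer_b, k_b, n)
```

Because the output is continuous, intermediate inducer concentrations give intermediate expression, which is exactly why the binary AND description only approximates the measured profiles.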

Relevance: 100.00%

Abstract:

Environmental computer models are deterministic models devoted to predicting environmental phenomena such as air pollution or meteorological events. Numerical model output is given in terms of averages over grid cells, usually at high spatial and temporal resolution. However, these outputs are often biased with unknown calibration and are not equipped with any information about the associated uncertainty. Conversely, data collected at monitoring stations are more accurate, since they essentially provide the true levels. Given the leading role played by numerical models, it is now important to compare model output with observations. Statistical methods developed to combine numerical model output and station data are usually referred to as data fusion. In this work, we first combine ozone monitoring data with ozone predictions from the Eta-CMAQ air quality model in order to forecast in real time the current 8-hour average ozone level, defined as the average of the previous four hours, the current hour, and predictions for the next three hours. We propose a Bayesian downscaler model based on first differences with a flexible coefficient structure and an efficient computational strategy to fit the model parameters. Model validation for the eastern United States shows a considerable improvement of our fully inferential approach compared with the current real-time forecasting system. Furthermore, we consider the introduction of temperature data from a weather forecast model into the downscaler, showing improved real-time ozone predictions. Finally, we introduce a hierarchical model to obtain the spatially varying uncertainty associated with numerical model output. We show how we can learn about such uncertainty through suitable stochastic data fusion modelling using some external validation data. We illustrate our Bayesian model by providing the uncertainty map associated with a temperature output over the northeastern United States.

Relevance: 100.00%

Abstract:

The Internet of Things (IoT) is the next industrial revolution: we will interact naturally with real and virtual devices as a key part of our daily life. This technology shift is expected to be greater than the Web and Mobile combined. As extremely different technologies are needed to build connected devices, the Internet of Things field is a junction between electronics, telecommunications and software engineering. Internet of Things application development happens in silos, often using proprietary and closed communication protocols. There is a common belief that only by solving the interoperability problem can we have a real Internet of Things. After a deep analysis of the IoT protocols, we identified a set of primitives for IoT applications. We argue that each IoT protocol can be expressed in terms of those primitives, thus solving the interoperability problem at the application protocol level. Moreover, the primitives are network- and transport-independent and make no assumption in that regard. This dissertation presents our implementation of an IoT platform: the Ponte project. Privacy issues follow the rise of the Internet of Things: it is clear that the IoT must ensure resilience to attacks, data authentication, access control and client privacy. We argue that it is not possible to solve the privacy issue without solving the interoperability problem: enforcing privacy rules implies the need to limit and filter the data delivery process. However, filtering data requires knowledge of the format and semantics of the data: after an analysis of the possible data formats and representations for the IoT, we identify JSON-LD and the Semantic Web as the best solution for IoT applications. Finally, this dissertation presents our approach to increasing the throughput of filtering semantic data by a factor of ten.

Relevance: 100.00%

Abstract:

Data sets describing the state of the earth's atmosphere are of great importance in the atmospheric sciences. Over the last decades, the quality and sheer amount of the available data have increased significantly, resulting in a rising demand for new tools capable of handling and analysing these large, multidimensional sets of atmospheric data. The interdisciplinary work presented in this thesis covers the development and application of practical software tools and efficient algorithms from the field of computer science, with the goal of enabling atmospheric scientists to analyse and gain new insights from these large data sets. For this purpose, our tools combine novel techniques with well-established methods from different areas such as scientific visualization and data segmentation. In this thesis, three practical tools are presented. Two of these tools are software systems (Insight and IWAL) for different types of processing and interactive visualization of data; the third is an efficient algorithm for data segmentation implemented as part of Insight. Insight is a toolkit for the interactive, three-dimensional visualization and processing of large sets of atmospheric data, originally developed as a testing environment for the novel segmentation algorithm. It provides a dynamic system for combining data from different sources at runtime, a variety of data processing algorithms, and several visualization techniques. Its modular architecture and flexible scripting support led to additional applications of the software, of which two examples are presented: the usage of Insight as a WMS (web map service) server, and the automatic production of image sequences for the visualization of cyclone simulations.
The core application of Insight is the provision of the novel segmentation algorithm for the efficient detection and tracking of 3D features in large sets of atmospheric data, as well as for the precise localization of the occurring genesis, lysis, merging and splitting events. Data segmentation usually leads to a significant reduction in the size of the considered data. This enables a practical visualization of the data, statistical analyses of the features and their events, and the manual or automatic detection of interesting situations for subsequent detailed investigation. The concepts of the novel algorithm, its technical realization, and several extensions for avoiding under- and over-segmentation are discussed. As example applications, this thesis covers the setup and results of segmenting upper-tropospheric jet streams and cyclones as full 3D objects. Finally, IWAL is presented, a web application providing easy interactive access to meteorological data visualizations, primarily aimed at students. As a web application, it avoids the need to retrieve all input data sets and to install and handle complex visualization tools on a local machine. The main challenge in providing customizable visualizations to large numbers of simultaneous users was to find an acceptable trade-off between the available visualization options and the performance of the application. Besides the implementation details, benchmarks and the results of a user survey are presented.
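
The basic step behind the segmentation described above, thresholding followed by connected-component labeling, can be sketched on a 2D grid. This is a toy illustration: the thesis works on 3D atmospheric fields, with feature tracking and event detection built on top:

```python
# Toy feature detection: threshold a 2D scalar field and count the
# 4-connected components above the threshold. Illustrative sketch only,
# not the thesis's 3D segmentation algorithm.

def label_features(grid, threshold):
    """Return the number of 4-connected components with values > threshold."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] > threshold and not seen[r][c]:
                count += 1                      # new feature found
                stack = [(r, c)]                # flood-fill its extent
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and grid[ny][nx] > threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
    return count
```

Tracking then amounts to matching labeled features between consecutive time steps, which is where genesis, lysis, merging and splitting events are identified.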

Relevance: 100.00%

Abstract:

OBJECTIVE: To determine fluid retention, glomerular filtration rate, and urine output in dogs anesthetized for a surgical orthopedic procedure. ANIMALS: 23 dogs treated with a tibial plateau leveling osteotomy. PROCEDURES: 12 dogs were used as a control group. Cardiac output was measured in 5 dogs, and 6 dogs received carprofen for at least 14 days. Dogs received oxymorphone, atropine, propofol, and isoflurane for anesthesia (duration, 4 hours). Urine and blood samples were obtained for analysis every 30 minutes. Lactated Ringer's solution was administered at 10 mL/kg/h. Urine output was measured and glomerular filtration rate was estimated. Fluid retention was measured by use of body weight, fluid balance, and bioimpedance spectroscopy. RESULTS: No difference was found among the control, cardiac output, and carprofen groups, so data were combined. Median urine output and glomerular filtration rate were 0.46 mL/kg/h and 1.84 mL/kg/min, respectively. Dogs retained a large amount of fluids during anesthesia, as indicated by increased body weight, positive fluid balance, increased total body water volume, and increased extracellular fluid volume. The PCV, total protein concentration, and esophageal temperature decreased in a linear manner. CONCLUSIONS AND CLINICAL RELEVANCE: Dogs anesthetized for a tibial plateau leveling osteotomy retained a large amount of fluids, had low urinary output, and had decreased PCV, total protein concentration, and esophageal temperature. Evaluation of urine output alone in anesthetized dogs may not be an adequate indicator of fluid balance.
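
The fluid-balance bookkeeping can be sketched from the rates reported above (10 mL/kg/h lactated Ringer's infusion, 0.46 mL/kg/h median urine output, 4 h of anesthesia). The function itself is an illustrative sketch, not the study's analysis, and it ignores insensible losses:

```python
# Simple fluid balance: fluids administered minus urine output over the
# anesthesia period. Default rates echo the abstract (10 mL/kg/h infusion,
# 0.46 mL/kg/h median urine output); insensible losses are ignored.

def fluid_balance_ml(weight_kg: float, hours: float,
                     infusion_rate: float = 10.0,   # mL/kg/h
                     urine_rate: float = 0.46) -> float:  # mL/kg/h
    """Return net fluid balance in mL (positive = retention)."""
    fluid_in = infusion_rate * weight_kg * hours
    urine_out = urine_rate * weight_kg * hours
    return fluid_in - urine_out
```

At these rates a dog excretes under 5% of the administered volume, which is consistent with the large fluid retention the study reports.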

Relevance: 100.00%

Abstract:

New designs of user input systems have resulted from developing technologies and specialized user demands. Conventional keyboard and mouse input devices still dominate in input speed, but other input mechanisms are demanded in special application scenarios. Touch screen and stylus input methods have been widely adopted by PDAs and smartphones. Reduced keypads are necessary for mobile phones. A new design trend explores the design space of applications requiring single-handed, even eyes-free, input on small mobile devices. This requires as few keys as possible on the input device to make it feasible to operate. But representing many characters with fewer keys can make the input ambiguous. Accelerometers embedded in mobile devices provide opportunities to combine device movements with keys for input signal disambiguation. Recent research has explored this design space for text input. In this dissertation, an accelerometer-assisted single-key positioning input system is developed. It utilizes input device tilt directions as input signals and maps their sequences to output characters and functions. A generic positioning model is developed as a guideline for designing positioning input systems. A calculator prototype and a text input prototype on the 4+1 (5 positions) and the 8+1 (9 positions) positioning input systems are implemented using accelerometer readings on a smartphone. Users operate with one physical key, and feedback is audible. Controlled experiments are conducted to evaluate the feasibility, learnability, and design space of the accelerometer-assisted single-key positioning input system. This research can provide inspiration and innovative references for researchers and practitioners in positioning user input designs, applications of accelerometer readings, and new developments of standard machine-readable sign languages.
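
The positioning idea, mapping sequences of tilt directions to characters, can be sketched as a lookup over position codes. The table below is hypothetical, for illustration only, not the dissertation's actual layout:

```python
# Sketch of the 4+1 positioning scheme: a single key plus a device tilt
# selects one of five positions, and sequences of positions decode to
# characters. The position codes and sequence table are hypothetical.

POSITIONS = {"neutral": 0, "left": 1, "right": 2, "forward": 3, "back": 4}

# Hypothetical two-position sequences -> output characters.
SEQUENCE_TABLE = {(1, 0): "a", (1, 1): "b", (2, 0): "c", (3, 4): "d"}

def decode(tilt_sequence):
    """Map a sequence of tilt direction names to a character ('?' if unmapped)."""
    key = tuple(POSITIONS[t] for t in tilt_sequence)
    return SEQUENCE_TABLE.get(key, "?")
```

With 5 positions, two-position sequences give 25 codes, enough for the Latin alphabet while still using a single physical key.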

Relevance: 100.00%

Abstract:

AIM: The purpose of this study was to systematically review the literature on the survival rates of palatal implants, Onplants®, miniplates and miniscrews. MATERIAL AND METHODS: An electronic MEDLINE search, supplemented by manual searching, was conducted to identify randomized clinical trials and prospective and retrospective cohort studies on palatal implants, Onplants®, miniplates and miniscrews with a mean follow-up time of at least 12 weeks and with at least 10 units per modality examined clinically at a follow-up visit. Assessment of studies and data abstraction were performed independently by two reviewers. Reported failures of the devices used were analyzed using random-effects Poisson regression models to obtain summary estimates and 95% confidence intervals (CI) of failure and survival proportions. RESULTS: The search up to January 2009 provided 390 titles and 71 abstracts, with full-text analysis of 34 articles, yielding 27 studies that met the inclusion criteria. In the meta-analysis, the failure rate was 17.2% for Onplants® (95% CI: 5.9-35.8%), 10.5% for palatal implants (95% CI: 6.1-18.1%), 16.4% for miniscrews (95% CI: 13.4-20.1%) and 7.3% for miniplates (95% CI: 5.4-9.9%). Miniplates and palatal implants, representing torque-resisting temporary anchorage devices (TADs), when grouped together showed a 1.92-fold (95% CI: 1.06-2.78) lower clinical failure rate than miniscrews. CONCLUSION: Based on the available evidence in the literature, palatal implants and miniplates showed comparable survival rates of ≥90% over a period of at least 12 weeks, and yielded superior survival rates to miniscrews. Palatal implants and miniplates for temporary anchorage provide reliable absolute orthodontic anchorage. If the intended orthodontic treatment requires the placement of multiple miniscrews to provide adequate anchorage, the reliability of such systems is questionable.
For patients who are undergoing extensive orthodontic treatment, force vectors may need to be varied or the roots of the teeth to be moved may need to slide past the anchors. In this context, palatal implants or miniplates should be the TADs of choice.
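
At their simplest, the failure proportions quoted in the meta-analysis are failures over units observed. The sketch below computes that naive proportion; the review itself fits random-effects Poisson regression models, which additionally account for between-study heterogeneity and follow-up time:

```python
# Naive failure proportion per device type: failures / units, as a percent.
# Illustrative only; the review's summary estimates come from random-effects
# Poisson regression, not from this simple ratio. Counts below are made up.

def failure_percent(failures: int, total: int) -> float:
    """Return the crude failure proportion as a percentage."""
    if total <= 0:
        raise ValueError("total units must be positive")
    return 100.0 * failures / total
```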