932 results for Read Out Driver, Data Acquisition, Electronics, FPGA, ATLAS, IBL, Pixel Detector, LHC, VME


Relevance:

100.00%

Publisher:

Abstract:

Supervisory Control and Data Acquisition (SCADA) systems are used by many industries because of their ability to manage sensors and control external hardware. The problem with commercially available systems is that they are restricted to a local network of users running proprietary software. No Internet development guide existed to give remote users outside the network control of, and access to, SCADA data and external hardware through simple user interfaces. To solve this problem, a server/client paradigm was implemented to make SCADAs available via the Internet. Two methods were applied and studied: polling of a text file as a low-end technology solution, and a Transmission Control Protocol (TCP/IP) socket connection. Users were allowed to log in to a website and remotely control a network of pumps and valves interfaced to a SCADA, enabling them to sample the water quality of different reservoir wells. The results were based on real-time performance, stability, and ease of use of the remote interface and its programming. They indicated that the most feasible server to implement is the TCP/IP connection. For the user interface, Java applets and ActiveX controls provide the same real-time access.
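The TCP/IP server/client paradigm described above can be sketched in a few lines. This is a minimal, hypothetical illustration (the command names and reply format are invented, not the paper's actual protocol): a server stands in for the SCADA-side gateway, and a client plays the role the web-facing applet would.

```python
import socket
import threading

def scada_server(host="127.0.0.1", port=0):
    """Minimal TCP server standing in for the SCADA-side gateway.

    Hypothetical command set: "READ pump1" returns a status line.
    """
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))
    srv.listen(1)
    port = srv.getsockname()[1]

    def serve_once():
        conn, _ = srv.accept()
        with conn:
            cmd = conn.recv(1024).decode().strip()
            if cmd.startswith("READ"):
                conn.sendall(b"OK pump1=ON\n")   # status of a monitored device
            else:
                conn.sendall(b"ERR unknown command\n")
        srv.close()

    threading.Thread(target=serve_once, daemon=True).start()
    return port

def scada_client(port, command):
    """Client side: what a remote user interface would issue over TCP/IP."""
    with socket.create_connection(("127.0.0.1", port)) as sock:
        sock.sendall(command.encode() + b"\n")
        return sock.recv(1024).decode().strip()
```

Compared with polling a text file, the socket approach delivers state changes as soon as they are requested, which matches the paper's conclusion that it is the more feasible server design.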

Relevance:

100.00%

Publisher:

Abstract:

This study arises in the context of physics teacher training and aims, from the discourse of teacher trainers, to identify possible pedagogical models and to characterize the thinking styles present in the licentiate course in physics at IFRN, using the epistemology of Ludwik Fleck. We classify our research as qualitative with an empirical nature, and for the analysis we chose discursive textual analysis, DTA (MORAES, 2003). The locus of our research is the licentiate course in physics at the Federal Institute of Education, Science and Technology of Rio Grande do Norte (IFRN), Natal-Central Campus, and the research subjects are a group of teacher trainers of this course. We interviewed ten teachers, six from the group dedicated to physics and four from the group dedicated to didactics and pedagogy. From this design, data acquisition consisted of: 1) semi-structured interviews; 2) document analysis. In the data analysis, supported by the pedagogical trends observed in our study and based on the similarities and differences between the ideas presented by the teachers about education and teaching, ideal teaching practice, the teacher's role, conceptions of learning and of the student, and the ideological thinking of these trainers about the professional profile of graduates, we found evidence of the presence of three distinct thinking styles that interrelate with each other in a considerably intense way. The relevance of the study lies in the understanding of the thinking styles that participate in the dynamics of the physics teacher-training course and, consequently, in the elucidation of a problem identified a priori as motivating the research: the difficulty of communicative interaction about educational practices among teacher trainers.
We bring Fleck's epistemology as a possibility for motivating dialogue and negotiation, thus establishing an instrument of real change towards meaningful physics teacher training.

Relevance:

100.00%

Publisher:

Abstract:

Respiration and ammonium excretion rates at different oxygen partial pressures were measured for calanoid copepods and euphausiids from the Eastern Tropical South Pacific and the Eastern Tropical North Atlantic. All specimens used for experiments were caught in the upper 400 m of the water column, and only animals appearing unharmed and fit were used. Specimens were sorted, identified and transferred into aquaria with filtered, well-oxygenated seawater immediately after the catch and maintained for 1 to 13 hours prior to physiological experiments at the respective experimental temperature. Maintenance and physiological experiments were conducted in darkness in temperature-controlled incubators at 11, 13 or 23 °C (±1). Before and during experiments, animals were not fed. Respiration and ammonium excretion rate measurements (both in µmol h-1 gDW-1) at varying oxygen concentrations were conducted in 12 to 60 mL gas-tight glass bottles. These were equipped with oxygen microsensors (ø 3 mm, PreSens Precision Sensing GmbH, Regensburg, Germany) attached to the inner wall of the bottles to monitor oxygen concentrations non-invasively. Read-out of oxygen concentrations was conducted using multi-channel fiber-optic oxygen transmitters (Oxy-4 and Oxy-10 mini, PreSens Precision Sensing GmbH, Regensburg, Germany) connected via optical fibers to the outside of the bottles, directly above the oxygen microsensor spots. Measurements were started at pre-adjusted oxygen and carbon dioxide levels. For this, seawater stocks with adjusted pO2 and pCO2 were prepared by equilibrating 3 to 4 L of filtered (0.2 µm Whatman GFF filter) and UV-sterilized (Aqua Cristal UV C 5 Watt, JBL GmbH & Co. KG, Neuhofen, Germany) water with premixed gases (certified gas mixtures from Air Liquide) for 4 hours at the respective experimental temperature. pCO2 levels were chosen to mimic the environmental pCO2 in the ETSP OMZ or the ETNA OMZ.
Experimental runs were conducted with 11 to 15 trial incubations (1 or 2 animals per incubation bottle and three different treatment levels) and three animal-free control incubations (one per experimental treatment). During each run, experimental treatments comprised 100% air saturation as well as one reduced air saturation level with and without CO2. Oxygen concentrations in the incubation bottles were recorded every 5 min using the fiber-optic microsensor system and data recording for respiration rate determination was started immediately after all animals were transferred. Respiration rates were calculated from the slope of oxygen decrease over selected time intervals. Chosen time intervals were 20 to 105 min long. No respiration rate was calculated for the first 20 to 60 min after animal transfer to avoid the impact of enhanced activity of the animal or changes in the bottle water temperature during initial handling on the respiration rates and oxygen readings. Respiration rates were obtained over a maximum of 16 hours incubation time and slopes were linear at normoxia to mild hypoxia. Respiration rates in animal-free control bottles were used to correct for microbial activity. These rates were < 2% of animal respiration rates at normoxia. Samples for the measurement of ammonium concentrations were taken after 2 to 10 hours incubation time. Ammonium concentration was determined fluorimetrically (Holmes et al., 1999). Ammonium excretion was calculated as the concentration difference between incubation and animal-free control bottles. Some specimens died during the respiration and excretion rate measurements, as indicated by a cessation of respiration. No excretion rate measurements were conducted in this case, but the oxygen level at which the animal died was noted.
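The rate calculation described above, a linear fit of the oxygen decrease over a chosen interval, corrected by the animal-free control and normalized to dry weight, can be sketched as follows. Function and parameter names are illustrative, not taken from the study.

```python
import numpy as np

def respiration_rate(t_h, o2_umol_l, volume_l, dry_weight_g,
                     control_slope=0.0):
    """Respiration rate (µmol O2 h-1 gDW-1) from an incubation time series.

    t_h          : time points in hours (excluding the initial handling period)
    o2_umol_l    : oxygen concentration (µmol L-1) at each time point
    volume_l     : incubation bottle volume (L)
    dry_weight_g : animal dry weight (g)
    control_slope: O2 slope (µmol L-1 h-1) in the animal-free control bottle,
                   used to correct for microbial activity
    """
    # Slope of the (assumed linear) oxygen decrease, in µmol L-1 h-1.
    slope, _intercept = np.polyfit(t_h, o2_umol_l, 1)
    corrected = slope - control_slope
    # Negative slope = consumption; scale by volume, normalize to dry weight.
    return -corrected * volume_l / dry_weight_g
```

As in the study, the fit window should exclude the first 20 to 60 minutes after animal transfer, and the linearity of the slope should be checked before accepting a rate.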

Relevance:

100.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance:

100.00%

Publisher:

Abstract:

One of the most challenging tasks underlying many hyperspectral imagery applications is spectral unmixing, which decomposes a mixed pixel into a collection of reflectance spectra, called endmember signatures, and their corresponding fractional abundances. Independent Component Analysis (ICA) has recently been proposed as a tool to unmix hyperspectral data. The basic goal of ICA is to find a linear transformation to recover independent sources (abundance fractions) given only sensor observations that are unknown linear mixtures of the unobserved independent sources. In hyperspectral imagery, the sum of the abundance fractions associated with each pixel is constant due to physical constraints in the data acquisition process; thus, the sources cannot be independent. This paper addresses hyperspectral data source dependence and its impact on ICA performance. The study considers simulated and real data. In simulated scenarios, hyperspectral observations are described by a generative model that takes into account the degradation mechanisms normally found in hyperspectral applications. We conclude that ICA does not unmix all sources correctly. This conclusion is based on a study of the mutual information. Nevertheless, some sources may be well separated, mainly if the number of sources is large and the signal-to-noise ratio (SNR) is high.
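The core obstacle the paper identifies, that the sum-to-one constraint makes abundance fractions statistically dependent, is easy to demonstrate numerically. The sketch below (sizes and endmember spectra are invented for illustration) builds the linear mixing model and shows that constrained abundances are negatively correlated, violating ICA's independence assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical endmember signatures: 3 materials, 50 spectral bands.
endmembers = rng.uniform(0.1, 0.9, size=(3, 50))

# Abundance fractions per pixel: Dirichlet samples sum to one, which is
# exactly the physical constraint that makes the "sources" dependent.
abundances = rng.dirichlet(alpha=[1.0, 1.0, 1.0], size=1000)

# Linear mixing model: each pixel is an abundance-weighted sum of
# endmember spectra (sensor noise omitted for clarity).
pixels = abundances @ endmembers

# Sum-to-one forces negative correlation between abundance fractions,
# so they cannot be statistically independent, as the paper argues.
corr = np.corrcoef(abundances, rowvar=False)
```

For Dirichlet-distributed fractions the pairwise correlations are strictly negative, which is why ICA, whose model assumes independent sources, cannot recover all abundances exactly.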

Relevance:

100.00%

Publisher:

Abstract:

Neuropeptides affect the activity of the myriad of neuronal circuits in the brain. They are under tight spatial and chemical control, and the dynamics of their release and catabolism directly modify neuronal network activity. Understanding neuropeptide functioning requires approaches to determine their chemical and spatial heterogeneity within neural tissue, but most imaging techniques do not provide the complete information desired. To provide chemical information, most imaging techniques used to study the nervous system require preselection and labeling of the peptides of interest; however, mass spectrometry imaging (MSI) detects analytes across a broad mass range without the need to target a specific analyte. When used with matrix-assisted laser desorption/ionization (MALDI), MSI detects analytes in the mass range of neuropeptides. MALDI MSI simultaneously provides spatial and chemical information, resulting in images that plot the spatial distributions of neuropeptides over the surface of a thin slice of neural tissue. Here a variety of approaches for neuropeptide characterization are developed. Specifically, several computational approaches are combined with MALDI MSI to create improved approaches that provide spatial distributions and neuropeptide characterizations. After successfully validating these MALDI MSI protocols, the methods are applied to characterize both known and unidentified neuropeptides from neural tissues. The methods are further adapted from tissue analysis to perform tandem MS (MS/MS) imaging on neuronal cultures to enable the study of network formation. In addition, MALDI MSI has been carried out over the time course of nervous system regeneration in planarian flatworms, resulting in the discovery of two novel neuropeptides that may be involved in planarian regeneration.
In addition, several bioinformatic tools are developed to predict final neuropeptide structures and associated masses that can be compared to experimental MSI data in order to make assignments of neuropeptide identities. The integration of computational approaches into the experimental design of MALDI MSI has allowed improved instrument automation and enhanced data acquisition and analysis. These tools also make the methods versatile and adaptable to new sample types.
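One step the bioinformatic tools above must perform is comparing predicted neuropeptide masses against experimental MSI peaks. A minimal sketch of that matching step, with invented peptide names and masses and a simple ppm-tolerance rule (not the dissertation's actual pipeline), might look like this:

```python
def match_peaks(observed_mz, predicted, tol_ppm=10.0):
    """Assign observed MSI peaks to predicted neuropeptide masses.

    observed_mz : list of experimental m/z values (assumed singly charged)
    predicted   : dict mapping peptide name -> predicted m/z
    tol_ppm     : match tolerance in parts per million
    """
    assignments = {}
    for mz in observed_mz:
        for name, mass in predicted.items():
            # Relative mass error in ppm against each candidate.
            if abs(mz - mass) / mass * 1e6 <= tol_ppm:
                assignments[mz] = name
    return assignments
```

Peaks with no candidate inside the tolerance remain unassigned, which is how unidentified neuropeptides, such as the two novel planarian peptides mentioned above, would surface for follow-up MS/MS characterization.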

Relevance:

100.00%

Publisher:

Abstract:

International audience

Relevance:

100.00%

Publisher:

Abstract:

This work presents the modeling and FPGA implementation of digital TIADC mismatch compensation systems. The development follows a top-down methodology. Following this methodology, a behavioral model of a two-channel TIADC and of its offset, gain and clock-skew mismatches was developed in Simulink, together with a behavioral model of the digital mismatch compensation system. For clock-skew mismatch compensation, fractional delay filters were used, more specifically the efficient Farrow structure. Defining which filter design methodology, and which Farrow structure, would be used required the study of various design methods presented in the literature. The digital compensation system models were converted to VHDL for FPGA implementation and validation. This validation was carried out using the FPGA-in-the-loop test methodology. The results obtained with the TIADC mismatch compensators show the high performance gain provided by these structures. Beyond this result, this work illustrates the potential of the design, implementation and FPGA test methodologies.
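The appeal of the Farrow structure is that the fractional delay d appears only in a final Horner evaluation, while the FIR subfilters stay fixed, which is what makes it hardware-friendly. The sketch below shows a cubic-Lagrange Farrow fractional delay in floating point as a behavioral reference (the thesis's actual filter design and VHDL are not reproduced here; this is one common textbook variant).

```python
import numpy as np

# Fixed FIR subfilter coefficients of a cubic-Lagrange Farrow structure.
# Rows are subfilters C0..C3; columns apply to taps x[n], x[n-1],
# x[n-2], x[n-3]. The output y = ((C3*d + C2)*d + C1)*d + C0 realizes a
# total delay of (1 + d) samples for 0 <= d < 1.
FARROW = np.array([
    [ 0.0,   1.0,  0.0,  0.0],   # C0
    [-1/3,  -1/2,  1.0, -1/6],   # C1
    [ 1/2,  -1.0,  1/2,  0.0],   # C2
    [-1/6,   1/2, -1/2,  1/6],   # C3
])

def farrow_delay(x, d):
    """Delay signal x by (1 + d) samples with a cubic Lagrange
    fractional delay filter in Farrow form."""
    y = np.zeros(len(x))
    for n in range(3, len(x)):
        taps = np.array([x[n], x[n-1], x[n-2], x[n-3]])
        c = FARROW @ taps                            # fixed subfilter outputs
        y[n] = ((c[3]*d + c[2])*d + c[1])*d + c[0]   # Horner evaluation in d
    return y
```

Because only d changes between channels, a clock-skew compensator can share the subfilter hardware and update a single multiplier chain, which is the efficiency the abstract alludes to.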

Relevance:

100.00%

Publisher:

Abstract:

Precise force measurement is required in many applications, namely the determination of the mechanical strength of materials, quality control during production, weighing, and personal safety. Given this widespread need, various techniques and instruments for measuring forces have been developed over time. Among them, force sensors, also called load cells, stand out for their simplicity, precision and versatility. The most common example is based on resistive strain gauges which, combined with a mechanical structure, form a load cell. This type of sensor has low sensitivity and a non-zero offset at rest, which makes its signal conditioning complex. This work presents a solution for the signal conditioning and data acquisition of load cells that, as far as was investigated, is innovative. The device performs signal conditioning, digitization and communication as a single unit. The idea follows the smart-sensor paradigm, in which a single electronic device associated with a load cell executes a set of signal processing and data transmission operations; in particular, it allows the creation of an ad-hoc network using the IIC communication protocol. The system is intended to be integrated into a load platform developed at the Escola Superior de Tecnologia e Gestão de Bragança, where it will be deployed. Because the platform was designed to read forces along three axes, it contains four load cells with two outputs each, totaling eight outputs.
The existing signal conditioning hardware is analog and requires a board of considerable dimensions for each output. From a functional point of view, it presents several problems, notably that gain and offset adjustment is done manually, making a circuit with better performance in handling an array of sensors of this type essential.
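One advantage of digitizing close to the sensor is that the manual gain and offset trimming criticized above can be replaced by a per-channel digital calibration. A minimal sketch, assuming a two-point (or multi-point) calibration against known reference loads (function names are illustrative):

```python
import numpy as np

def fit_channel(raw_counts, ref_forces):
    """Per-channel digital calibration replacing manual trimming:
    least-squares fit of force = gain * counts + offset, from ADC
    readings taken at known reference loads."""
    gain, offset = np.polyfit(raw_counts, ref_forces, 1)
    return gain, offset

def to_force(counts, gain, offset):
    """Convert a raw ADC reading to force using the fitted constants."""
    return gain * counts + offset
```

With eight outputs, the same routine runs once per channel and the eight (gain, offset) pairs are stored digitally, so no analog adjustment board per output is needed.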

Relevance:

100.00%

Publisher:

Abstract:

The State of Paraíba is one of the most dynamic states of Brazil. Strategically located in the northeast, together with the states of Rio Grande do Norte, Pernambuco and Alagoas, it is notable for its excellent potential for integrating different transportation modes. Port activity causes changes in the space where the port is installed, and the elements of this space suffer ever more direct or indirect influences as the flow through the port expands. The region has therefore become subject to accidental oil spills, because it experiences heavy traffic of ships of various sizes that can run aground or collide, causing accidental spill events. The study of the geomorphological and sedimentological composition of the seafloor becomes more important as more is known about the relationships between these parameters and the associated fauna, since it can identify their preferred habitats. The acoustic seafloor database collected along the proposed study area is a wealth of information, which was duly examined, cataloged and made available. Such information can serve as an important tool, providing a geomorphological and sedimentary survey of the studied area, and can flexibly support future decision making. Taking the Port of Cabedelo, Paraíba, Brazil as the study area, this research aimed to evaluate the influence of surface and bottom tidal currents in shaping the seabed, including the acquisition of information about the location and depth of submerged rocky bodies, which may turn out to be natural traps for oil in case of spills, and to establish the relationship between bed types and the hydrodynamic conditions present in the region.
In this context, bathymetric data (depth), physical oceanographic data (height of the water column, water temperature, intensity and direction of currents, waves and turbidity) and meteorological data (rainfall, air temperature, humidity, winds and barometric pressure) were collected in the access channel to the Port of Cabedelo/PB and its turning basin (where the cruise ships dock). The study also includes remote sensing tools (Landsat 7 ETM+, 2001), so that images and results are integrated into Geographic Information Systems and used to elaborate measures aimed at protecting the areas under the influence of these facilities, serving as input for a contingency plan in case of oil spills in the region. The main findings highlight the techniques of hydroacoustic data acquisition combining high- and low-frequency bathymetric surveys. From these, five bathymetric charts were prepared in the standard of the Directorate of Hydrography and Navigation (DHN), with depth in meters, at a scale of 1:2500 (access channel and turning basin of the Port of Cabedelo), where there are possibly extensive beachrocks that hinder the movement of vessels in the port area and can cause collisions, groundings and oil leaks. From the scatter diagram of the current vectors, it can be seen that the tidal stream undergoes a channeling effect, caused by the bidirectional effect of the tide (ebb and flood) in the turning basin of the Port of Cabedelo, in the NW-SE direction, and that the highest current speeds occur at ebb tide. The weather characterization for the period from 28/02 to 04/07/2010 showed values within the expected averages for the study region. The multidisciplinary integration of products (digital maps and remote sensing images) proved efficient for characterizing the underwater geomorphology of the study area, discriminating and enhancing submerged structures not previously visible in the images.
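Producing a bathymetric chart from soundings requires gridding scattered depth points onto a regular mesh. The sketch below uses simple inverse-distance weighting as a stand-in for that gridding step (the study's actual DHN-standard charting workflow is not described here; names and method are illustrative).

```python
import numpy as np

def idw_grid(x, y, depth, grid_x, grid_y, power=2.0):
    """Interpolate scattered depth soundings (x, y, depth) onto a
    regular grid by inverse-distance weighting."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    out = np.zeros_like(gx, dtype=float)
    for i in range(gx.shape[0]):
        for j in range(gx.shape[1]):
            d2 = (x - gx[i, j])**2 + (y - gy[i, j])**2
            if np.any(d2 == 0):
                # Grid node coincides with a sounding: use it directly.
                out[i, j] = depth[np.argmin(d2)]
            else:
                w = 1.0 / d2**(power / 2)
                out[i, j] = np.sum(w * depth) / np.sum(w)
    return out
```

On a gridded surface like this, shallow anomalies such as the beachrocks mentioned above stand out as local highs against the surrounding channel depths.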

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a monitoring system devoted to small-sized photovoltaic (PV) power plants. The system is characterized by a high level of integration; a low cost compared to the cost of the PV system to be monitored; and easy installation in the majority of PV plants with an installed power of a few kW. The system is able to collect, store, process and display the electrical and meteorological parameters that are crucial when monitoring PV facilities. The identification of failures in the PV system and the elaboration of performance analyses of such facilities are other important characteristics of the developed system. Access to the information about the monitored facilities is achieved through a web application, which was developed with a focus on mobile devices. In addition, there is the possibility of integration between the developed monitoring system and the central supervision system of Martifer Solar (a company focused on the development, operation and maintenance of PV systems).
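A common metric in the kind of performance analysis the paper mentions is the performance ratio, computed from exactly the electrical and meteorological parameters such a system collects. A minimal sketch (the paper does not state which metrics it implements; this is the standard textbook definition, with illustrative names):

```python
def performance_ratio(e_ac_kwh, p_rated_kw, h_poa_kwh_m2, g_stc_kw_m2=1.0):
    """Performance ratio of a PV plant over a period: the measured
    final yield (AC energy per kW of rated power) divided by the
    reference yield (in-plane irradiation over STC irradiance)."""
    final_yield = e_ac_kwh / p_rated_kw            # kWh per kW installed
    reference_yield = h_poa_kwh_m2 / g_stc_kw_m2   # equivalent sun-hours
    return final_yield / reference_yield
```

A sustained drop in this ratio, with irradiation unchanged, is a simple flag for the kind of failure detection the system is designed to support.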

Relevance:

100.00%

Publisher:

Abstract:

Supervisory Control And Data Acquisition (SCADA) systems are widely used in the management of critical infrastructure such as electricity and water distribution systems. Currently there is little understanding of how best to protect SCADA systems from malicious attacks. We review the constraints and requirements for SCADA security and propose a suitable architecture (SKMA) for secure SCADA communications. The architecture includes a proposed key management protocol (SKMP). We compare the architecture with a previous proposal from Sandia Labs.
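The abstract does not detail SKMP, but key management protocols of this kind typically derive per-session keys from a shared long-term key so that field devices never transmit long-term secrets. A generic sketch of that primitive (explicitly not the SKMP protocol itself; names are illustrative):

```python
import hashlib
import hmac

def derive_session_key(long_term_key: bytes, node_id: str, nonce: bytes) -> bytes:
    """Derive a per-session key from a shared long-term key with
    HMAC-SHA256, binding it to a node identity and a fresh nonce."""
    return hmac.new(long_term_key, node_id.encode() + nonce,
                    hashlib.sha256).digest()
```

Binding the derivation to a nonce means a compromised session key does not expose other sessions, one of the basic requirements any SCADA key management architecture must meet.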

Relevance:

100.00%

Publisher:

Abstract:

Due to the popularity of modern Collaborative Virtual Environments, there has been a related increase in their size and complexity. Developers therefore need visualisations that expose usage patterns from logged data, to understand the structures and dynamics of these complex environments. This chapter presents a new framework for the process of visualising virtual environment usage data. Major components, such as an event model, designer task model and data acquisition infrastructure are described. Interface and implementation factors are also developed, along with example visualisation techniques that make use of the new task and event model. A case study is performed to illustrate a typical scenario for the framework, and its benefits to the environment development team.
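The chapter's event model is not specified in this abstract, but the idea of turning logged usage data into visualisable aggregates can be sketched minimally as follows (field names and the aggregation are invented for illustration, not the chapter's actual model):

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Event:
    """One logged interaction in a collaborative virtual environment."""
    user: str
    action: str
    region: str
    timestamp: float

def usage_by_region(events):
    """Aggregate logged events into per-region counts, the kind of
    summary a usage-pattern visualisation would plot."""
    return Counter(e.region for e in events)
```

A designer task model would then map such aggregates onto views, e.g. a heat map of the counts over the environment's floor plan.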

Relevance:

100.00%

Publisher:

Abstract:

Process Control Systems (PCSs) or Supervisory Control and Data Acquisition (SCADA) systems have recently been added to the already wide collection of wireless sensor network applications. The PCS/SCADA environment is somewhat more amenable to the use of heavy cryptographic mechanisms, such as public key cryptography, than other sensor application environments. The sensor nodes in the environment, however, are still open to devastating attacks such as node capture, which makes designing a secure key management scheme challenging. In this paper, a key management scheme is proposed to defeat node capture attacks by offering both forward and backward secrecy. Our scheme overcomes the pitfalls from which Nilsson et al.'s scheme suffers, and is no more expensive than their scheme.
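A standard building block for the forward secrecy mentioned above is one-way key evolution: each node replaces its key with a hash of it at every update, so capturing a node reveals only the current key, not earlier ones. This is a generic sketch of that idea, not the paper's actual scheme (backward secrecy additionally requires fresh material from the key server, which a hash chain alone cannot provide):

```python
import hashlib

def evolve_key(key: bytes) -> bytes:
    """One-way key update: k_next = SHA-256(k). Because the hash cannot
    be inverted, an attacker who captures the current key cannot
    recover keys used before the capture."""
    return hashlib.sha256(key).digest()
```

In practice the node overwrites the old key in memory after each update; the server, holding the initial key, can recompute any epoch's key by iterating the same hash.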
