990 results for ELECTRIC MEASURING INSTRUMENTS
Investigation of the operating characteristics of the Iowa sediment concentration measuring system
Resumo:
"May 1976."
Resumo:
Mode of access: Internet.
Resumo:
"March 1977--Cover."
Resumo:
Cascaded multilevel inverter-based Static Var Generators (SVGs) are FACTS devices introduced for active and reactive power flow control. They eliminate the need for zigzag transformers and give a fast response. However, with regard to their application to flicker reduction for Electric Arc Furnaces (EAFs), existing multilevel inverter-based SVGs suffer from two disadvantages: (1) to control the reactive power, an off-line calculation of the Modulation Index (MI) is required to adjust the SVG output voltage, which slows the transient response to changes in reactive power; and (2) random active power exchange may unbalance the d.c.-link capacitor voltages of the H-bridge inverters (HBIs) when reactive power control is done by adjusting the power angle δ alone. To resolve these problems, a mathematical model of an 11-level cascaded SVG was developed, and a new control strategy involving both the MI and the power angle δ is proposed. A selective harmonic elimination method (SHEM) is used for the switching-pattern calculations. To shorten the response time and simplify the control system, feed-forward neural networks are used for on-line computation of the switching patterns instead of look-up tables. The proposed controller updates the MI and switching patterns once each line-cycle according to the sampled reactive power Qs. Meanwhile, the remaining reactive power not compensated by the MI, together with the reactive power variations during the line-cycle, is continuously compensated by adjusting the power angle δ. The scheme senses both variables, MI and δ, and acts through the inverter switching angles θi. As a result, the proposed SVG is expected to give a faster and more accurate response than present designs allow. In support of the proposal, a mathematical model for reactive power distribution and a sensitivity matrix for voltage-regulation assessment are developed, and MATLAB simulation results are provided to validate the proposed schemes.
The performance with non-linear, time-varying loads is analysed, and the work includes a general review of flicker, of methods for measuring flicker due to arc furnaces, and of means for its mitigation.
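The switching-pattern calculation referred to above can be illustrated in miniature. The sketch below is a hedged simplification, not the thesis implementation: it solves the two-angle (5-level) selective-harmonic-elimination system with Newton's method, setting the per-unit fundamental to a target m while driving the 5th harmonic to zero. The 11-level case adds more angles and more eliminated harmonics; the target value and starting guess here are illustrative.

```python
import math

def she_two_angle(m, theta=(0.4, 1.1), tol=1e-9, max_iter=50):
    """Solve the two-angle selective-harmonic-elimination system
         cos(t1) + cos(t2)     = m    (per-unit fundamental)
         cos(5*t1) + cos(5*t2) = 0    (5th harmonic eliminated)
    by Newton's method with an analytic 2x2 Jacobian."""
    t1, t2 = theta
    for _ in range(max_iter):
        f1 = math.cos(t1) + math.cos(t2) - m
        f2 = math.cos(5 * t1) + math.cos(5 * t2)
        if abs(f1) < tol and abs(f2) < tol:
            break
        # Jacobian entries of (f1, f2) w.r.t. (t1, t2)
        a, b = -math.sin(t1), -math.sin(t2)
        c, d = -5 * math.sin(5 * t1), -5 * math.sin(5 * t2)
        det = a * d - b * c
        # Newton step: solve J * dt = -f
        dt1 = (d * -f1 - b * -f2) / det
        dt2 = (a * -f2 - c * -f1) / det
        t1, t2 = t1 + dt1, t2 + dt2
    return t1, t2
```

A neural network trained on solutions of this system, as in the abstract, simply replaces the iteration with a single forward pass so the angles are available within one line-cycle.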
Resumo:
Weakly electric fish produce a dual-function electric signal that makes them ideal models for the study of sensory computation and signal evolution. This signal, the electric organ discharge (EOD), is used for communication and navigation. In some families of gymnotiform electric fish, the EOD is a dynamic signal that increases in amplitude during social interactions. Amplitude increase could facilitate communication by increasing the likelihood of being sensed by others or by impressing prospective mates or rivals. Conversely, by increasing its signal amplitude a fish might increase its sensitivity to objects by lowering its electrolocation detection threshold. To determine how EOD modulations elicited in the social context affect electrolocation, I developed an automated and fast method for measuring electroreception thresholds using a classical conditioning paradigm. This method employs a moving shelter tube, which these fish occupy at rest during the day, paired with an electrical stimulus. A custom-built and programmed robotic system presents the electrical stimulus to the fish, slides the shelter tube, requiring them to follow, and records video of their movements. Electric fish of the genus Sternopygus were trained to respond to a resistive stimulus on this apparatus in 2 days. The motion-detection algorithm correctly identifies the responses 91% of the time, with a false-positive rate of only 4%. This system allows for a large number of trials, decreasing the amount of time needed to determine behavioral electroreception thresholds. This novel method enables the evaluation of the evolutionary interplay between two conflicting sensory forces, social communication and navigation.
Resumo:
The Meals on Wheels (MOW) program is designed to help combat hunger in persons needing assistance. MOW has a duty not only to provide food but also to ensure that it reaches eligible clients safely. Given the population that MOW serves, transporting food safely takes on increased importance. This experiment focused on the major food safety issue of maintaining temperature integrity through the use of transport containers. For containers that did not contain electric heating elements, several factors influenced how fast the food temperature fell. Those factors included the U-value and size of the container as well as how many meals were in the container. As predicted, the smaller the U-value, the longer it took the temperature to fall. Larger containers did better at maintaining food temperatures, provided they were fully loaded. In general, fully loaded small and medium containers were better at maintaining food temperatures than larger containers loaded with the same number of meals.
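The dependence on U-value and load described above follows from a lumped-capacitance (Newton's law of cooling) model, in which cooling time has a closed form. The sketch below is illustrative only: the U-value, container area, ambient temperature, holding threshold, and per-meal heat capacity are assumed round numbers, not values from the experiment.

```python
import math

def minutes_above_threshold(u_value, area_m2, n_meals, t_food=74.0,
                            t_ambient=21.0, t_threshold=57.0,
                            meal_heat_capacity_j_per_k=2000.0):
    """Lumped-capacitance estimate of how long food in a passive container
    stays above a safe holding temperature.  With dT/dt = -U*A*(T - T_amb)/C,
    T(t) = T_amb + (T_food - T_amb)*exp(-t/tau), tau = C/(U*A), so the time
    to reach the threshold is tau * ln((T_food - T_amb)/(T_thr - T_amb))."""
    c_total = n_meals * meal_heat_capacity_j_per_k   # total heat capacity, J/K
    tau = c_total / (u_value * area_m2)              # cooling time constant, s
    t_seconds = tau * math.log((t_food - t_ambient) / (t_threshold - t_ambient))
    return t_seconds / 60.0
```

The model reproduces both findings: halving U doubles the holding time, and a fully loaded container (larger total heat capacity for the same losses) holds temperature longer than a lightly loaded one.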
Resumo:
Aims. To validate the Swedish version of the Sheffield Care Environment Assessment Matrix (S-SCEAM). The instrument's items measure environmental elements important for supporting the needs of older people, conceptualized within eight domains. Methods. Item relevance was assessed by a group of experts and measured using the content validity index (CVI). Test-retest and inter-rater reliability tests were performed. The domain structure was assessed by the inter-rater agreement of a second group of experts, measured using Fleiss' kappa. Results. All items attained a CVI above 0.78, the suggested criterion for excellent content validity. Test-retest reliability showed high stability (96% and 95% for two independent raters, respectively), and inter-rater reliability demonstrated high levels of agreement (95% and 94% on two separate rating occasions). Kappa values were very good for test-retest (κ = 0.903 and 0.869) and inter-rater reliability (κ = 0.851 and 0.832). Domain structure was good; Fleiss' kappa was 0.63 (range 0.45 to 0.75). Conclusion. The S-SCEAM, with 210 items and eight domains, showed good content and construct validity. The instrument is suggested for use in measuring the quality of the physical environment in residential care facilities for older persons.
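The CVI reported above is a simple proportion, which a short sketch makes concrete. This follows the common convention (item rated 3 or 4 on a 4-point relevance scale counts as "relevant", with 0.78 as the excellence criterion for small expert panels); the rating vectors below are invented for illustration.

```python
def item_cvi(ratings):
    """Item-level content validity index: the proportion of experts who
    rate the item 3 or 4 on a 4-point relevance scale."""
    relevant = sum(1 for r in ratings if r >= 3)
    return relevant / len(ratings)

def scale_cvi_average(items):
    """Scale-level CVI (averaging method): the mean of the item-level CVIs."""
    cvis = [item_cvi(r) for r in items]
    return sum(cvis) / len(cvis)

# Example: six experts rate one item; five of six rate it relevant.
# item_cvi([4, 4, 3, 3, 4, 2]) gives 5/6, just above the 0.78 criterion.
```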
Resumo:
The electric bass and double bass are two different instruments sharing a common function: they link harmony with rhythm, especially in jazz. The capacity of a bassist to fully support an ensemble can be achieved playing either electric or double bass. However, some bassists, despite the technical differences between the two instruments, choose to play both. Some of these performers are true masters, switching between electric and double bass according to the musical setting. It is possible to define similarities and differences between the electric and double bass, but is it viable to use similar approaches to both? To investigate this question, I focus my research on one exemplary player who combines all the qualities needed to play both electric and double bass: John Patitucci, an inspiration for bassists of all generations and a musician who synthesizes all the fundamental characteristics of an ideal bass player. This dissertation is inspired by Patitucci's example and by the urge to fill a gap in the specialized literature concerning the history and application of different left- and right-hand techniques on the electric and double bass. The main purpose of this study is to create the backbone of a bass program for teaching both instruments, using John Patitucci as an example. His technical approach on both instruments and his soloing vocabulary are the points of departure of this dissertation. I begin with the historical origins of Patitucci's techniques and end with the development of exercises created to teach his techniques and vocabulary to those who aspire to play electric and double bass.
Resumo:
One of the most exciting discoveries in astrophysics of the last decade is the sheer diversity of planetary systems. These include "hot Jupiters", giant planets so close to their host stars that they orbit once every few days; "Super-Earths", planets with sizes intermediate between those of Earth and Neptune, of which no analogs exist in our own solar system; multi-planet systems with planets from smaller than Mars to larger than Jupiter; planets orbiting binary stars; free-floating planets flying through the emptiness of space without any star; even planets orbiting pulsars. Despite these remarkable discoveries, the field is still young, and there are many areas about which precious little is known. In particular, we do not yet know what planets orbit the Sun-like stars nearest to our own solar system, and we know very little about the compositions of extrasolar planets. This thesis provides developments in both directions, through two instrumentation projects.
The first chapter of this thesis concerns detecting planets in the Solar neighborhood using precision stellar radial velocities, also known as the Doppler technique. We present an analysis determining the most efficient way to detect planets considering factors such as spectral type, wavelengths of observation, spectrograph resolution, observing time, and instrumental sensitivity. We show that G and K dwarfs observed at 400-600 nm are the best targets for surveys complete down to a given planet mass and out to a specified orbital period. Overall we find that M dwarfs observed at 700-800 nm are the best targets for habitable-zone planets, particularly when including the effects of systematic noise floors caused by instrumental imperfections. Somewhat surprisingly, we demonstrate that a modestly sized observatory, with a dedicated observing program, is up to the task of discovering such planets.
We present just such an observatory in the second chapter, called the "MINiature Exoplanet Radial Velocity Array," or MINERVA. We describe the design, which uses a novel multi-aperture approach to increase stability and performance through lower system etendue, as well as keeping costs and time to deployment down. We present calculations of the expected planet yield, and data showing the system performance from our testing and development of the system at Caltech's campus. We also present the motivation, design, and performance of a fiber coupling system for the array, critical for efficiently and reliably bringing light from the telescopes to the spectrograph. We finish by presenting the current status of MINERVA, operational at Mt. Hopkins observatory in Arizona.
The second part of this thesis concerns a very different method of planet detection, direct imaging, which involves discovery and characterization of planets by collecting and analyzing their light. Directly analyzing planetary light is the most promising way to study their atmospheres, formation histories, and compositions. Direct imaging is extremely challenging, as it requires a high performance adaptive optics system to unblur the point-spread function of the parent star through the atmosphere, a coronagraph to suppress stellar diffraction, and image post-processing to remove non-common path "speckle" aberrations that can overwhelm any planetary companions.
To this end, we present the "Stellar Double Coronagraph," or SDC, a flexible coronagraphic platform for use with the 200" Hale telescope. It has two focal and pupil planes, allowing for a number of different observing modes, including multiple vortex phase masks in series for improved contrast and inner working angle behind the obscured aperture of the telescope. We present the motivation, design, performance, and data reduction pipeline of the instrument. In the following chapter, we present some early science results, including the first image of a companion to the star delta Andromedae, which had been previously hypothesized but never seen.
A further chapter presents a wavefront control code developed for the instrument, using the technique of "speckle nulling," which can remove optical aberrations from the system using the deformable mirror of the adaptive optics system. This code allows for improved contrast and inner working angles, and was written in a modular style so as to be portable to other high contrast imaging platforms. We present its performance on optical, near-infrared, and thermal infrared instruments on the Palomar and Keck telescopes, showing how it can improve contrasts by a factor of a few in less than ten iterations.
One of the large challenges in direct imaging is sensing and correcting the electric field in the focal plane to remove scattered light that can be much brighter than any planets. In the last chapter, we present a new method of focal-plane wavefront sensing, combining a coronagraph with a simple phase-shifting interferometer. We present its design and implementation on the Stellar Double Coronagraph, demonstrating its ability to create regions of high contrast by measuring and correcting for optical aberrations in the focal plane. Finally, we derive how it is possible to use the same hardware to distinguish companions from speckle errors using the principles of optical coherence. We present results observing the brown dwarf HD 49197b, demonstrating the ability to detect it despite it being buried in the speckle noise floor. We believe this is the first detection of a substellar companion using the coherence properties of light.
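The phase-shifting idea behind this focal-plane sensing can be illustrated with a toy model: treat one speckle as a single unknown complex field and recover it from intensity-only measurements by adding a known probe at four phases. This is only a schematic of the technique, not the SDC pipeline; the probe amplitude and field values are invented.

```python
import cmath
import math

def measure_field(e_field, probe_amp=1.0,
                  phases=(0.0, math.pi / 2, math.pi, 3 * math.pi / 2)):
    """Recover an unknown complex field E from intensity-only measurements
         I_k = |E + c * exp(i*phi_k)|^2
    using the four-step phase-shifting combination:
         Re(E) = (I_0 - I_pi) / (4c),  Im(E) = (I_pi/2 - I_3pi/2) / (4c)."""
    intensities = [abs(e_field + probe_amp * cmath.exp(1j * p)) ** 2
                   for p in phases]
    i0, i1, i2, i3 = intensities
    re = (i0 - i2) / (4.0 * probe_amp)
    im = (i1 - i3) / (4.0 * probe_amp)
    return complex(re, im)
```

Once the field is known, the deformable mirror can inject its negative, driving the residual speckle intensity |E - E_est|^2 toward zero; a true companion, incoherent with the starlight, would not cancel, which is the coherence-based discrimination described above.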
Resumo:
This paper focuses on tests of photovoltaic systems in order to address two case studies, with silicon monocrystalline and silicon polycrystalline panels, respectively. The first case is an identification of the three parameters of the single-diode equivalent circuit for modelling photovoltaic systems, with conclusions about inevitable age degradation. A comparison between experimentally observed and computed I-V and P-V characteristic curves is carried out at standard test conditions. The second case is an experimental observation of a photovoltaic system connected to an electric grid, regarding the quality of the energy injected into the grid. A measurement of the harmonic content in the voltage and current waveforms at the terminals of the photovoltaic system is carried out in order to assess conformity with Standard EN 50160 and IEEE 519-1992, respectively.
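The three-parameter single-diode model mentioned above has a simple closed form when series and shunt resistance are neglected. The sketch below shows how such a model generates I-V and P-V curves; the parameter values (photocurrent, saturation current, ideality factor, 36 cells) are illustrative, not fitted to the panels in the paper.

```python
import math

# Three-parameter single-diode model (Rs and Rsh neglected):
#   I(V) = Iph - I0 * (exp(V / (n * Ns * Vt)) - 1)
K_BOLTZ, Q_ELEM = 1.380649e-23, 1.602176634e-19

def iv_point(v, iph, i0, n, ns=36, temp_k=298.15):
    """Panel current at voltage v for photocurrent iph, saturation
    current i0, ideality factor n, and ns series cells."""
    vt = K_BOLTZ * temp_k / Q_ELEM          # thermal voltage, ~25.7 mV
    return iph - i0 * (math.exp(v / (n * ns * vt)) - 1.0)

def open_circuit_voltage(iph, i0, n, ns=36, temp_k=298.15):
    """Voc follows by setting I(V) = 0 and solving for V."""
    vt = K_BOLTZ * temp_k / Q_ELEM
    return n * ns * vt * math.log(iph / i0 + 1.0)

def max_power_point(iph, i0, n, ns=36, steps=2000):
    """Scan the P-V curve from 0 to Voc and return (P_mpp, V_mpp)."""
    voc = open_circuit_voltage(iph, i0, n, ns)
    return max((v * iv_point(v, iph, i0, n, ns), v)
               for v in (voc * k / steps for k in range(steps)))
```

Fitting iph, i0 and n to a measured I-V curve, then comparing the computed and observed curves, is the identification exercise the first case study performs; a drop in fitted photocurrent over time is one signature of age degradation.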
Resumo:
With the increase in load demand across sectors, protection and safety of the network are key factors that have to be taken into consideration in the electric grid and distribution network. A Phasor Measurement Unit (PMU) is an intelligent electronic device that collects data in the form of real-time synchrophasors with a precise time tag using GPS (Global Positioning System) and transfers the data to the grid control centre for monitoring and assessment. The measurements made by a PMU have to be very precise to protect relays and measuring equipment, according to IEC/IEEE 60255-118-1 (2018). Because a physical PMU is an expensive device on which to research and develop new functionalities, there is a need for an alternative to work with. Hence, many open-source virtual libraries are available that replicate the function of a PMU in a software environment, allowing research on multiple objectives to continue while producing very small errors when verified. In this thesis, I carried out performance and compliance verification of a virtual PMU developed in MATLAB using an Interpolated Discrete Fourier Transform (I-DFT) C-class algorithm. A test environment was developed in MATLAB, and the virtual PMU was tested in both steady-state and dynamic conditions to verify compliance with the latest standard (IEC/IEEE 60255-118-1).
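The core of an I-DFT estimator is interpolating between DFT bins to resolve an off-nominal frequency. The pure-Python sketch below is not the thesis's MATLAB implementation: it uses a Hann window and the standard two-point interpolation formula δ = (2α - 1)/(α + 1), where α is the ratio of the larger neighbouring bin magnitude to the peak magnitude; the sampling rate, window length, and test frequency are illustrative.

```python
import cmath
import math

def ipdft_frequency(samples, fs):
    """Estimate the frequency of the dominant sinusoid in `samples`
    with a Hann-windowed, two-point interpolated DFT."""
    n = len(samples)
    win = [0.5 - 0.5 * math.cos(2 * math.pi * i / n) for i in range(n)]
    w = [s * h for s, h in zip(samples, win)]

    def dft_mag(k):
        # magnitude of one DFT bin, computed directly
        return abs(sum(w[i] * cmath.exp(-2j * math.pi * k * i / n)
                       for i in range(n)))

    # coarse peak search over the positive-frequency bins
    k_pk = max(range(1, n // 2 - 1), key=dft_mag)
    m_pk, m_lo, m_hi = dft_mag(k_pk), dft_mag(k_pk - 1), dft_mag(k_pk + 1)
    # Hann-window two-point interpolation toward the larger neighbour
    if m_hi >= m_lo:
        alpha = m_hi / m_pk
        delta = (2 * alpha - 1) / (alpha + 1)
    else:
        alpha = m_lo / m_pk
        delta = -(2 * alpha - 1) / (alpha + 1)
    return (k_pk + delta) * fs / n
```

A compliance test in the spirit of the standard then sweeps such signals over frequency, amplitude, and phase and checks that the estimation error stays within the class limits.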
Resumo:
Fluorescence Correlation Spectroscopy (FCS) is an optical technique that allows the measurement of the diffusion coefficient of molecules in a dilute sample. From the diffusion coefficient it is possible to calculate the hydrodynamic radius of the molecules. For colloidal quantum dots (QDs), the hydrodynamic radius is valuable information for studying interactions with other molecules or other QDs. In this chapter we describe the main aspects of the technique and how to use it to calculate the hydrodynamic radius of QDs.
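The step from diffusion coefficient to hydrodynamic radius is the Stokes-Einstein relation, which a few lines make explicit. The defaults below assume water at 25 °C, and the example diffusion coefficient is an illustrative value typical of a small colloidal QD, not a measurement from this chapter.

```python
import math

K_BOLTZMANN = 1.380649e-23  # Boltzmann constant, J/K

def hydrodynamic_radius(diff_coeff, temp_k=298.15, viscosity=8.9e-4):
    """Stokes-Einstein relation: R_h = k_B * T / (6 * pi * eta * D).
    diff_coeff in m^2/s, viscosity in Pa*s; returns R_h in metres.
    Defaults correspond to water at 25 C."""
    return K_BOLTZMANN * temp_k / (6.0 * math.pi * viscosity * diff_coeff)

# Example: D = 4e-11 m^2/s gives R_h of roughly 6 nm.
```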
Resumo:
Measurement instruments are an integral part of clinical practice, health evaluation and research. These instruments are only useful and able to produce scientifically robust results when they are properly developed and have appropriate psychometric properties. Despite the significant increase in the number of rating scales, the literature suggests that many have not been adequately developed and validated. The scope of this study was to conduct a narrative review of the process of developing new measurement instruments and to present some tools that can be used at certain stages of the development process. The steps described were: I - establishment of a conceptual framework and definition of the objectives of the instrument and the population involved; II - development of the items and response scales; III - selection and organization of the items and structuring of the instrument; IV - content validity; V - pre-test. The study also includes a brief discussion of the evaluation of psychometric properties, given their importance for instruments to be accepted and acknowledged in both scientific and clinical environments.