902 results for Distributed computer-controlled systems
Abstract:
Tooth shade results from the interaction between enamel color, enamel translucency and dentine color. A change in any of these parameters will change a tooth’s color. The objective of this study was to evaluate the changes occurring in enamel translucency during a tooth whitening process. Fourteen human tooth enamel fragments, with a mean thickness of 0.96 mm (± 0.3 mm), were subjected to a bleaching agent (10% carbamide peroxide) for 8 hours per day for 28 days. The translucency of the enamel fragments was measured with a computer-controlled spectrophotometer before and after the bleaching agent applications, in accordance with ANSI Z80.3-1986, American National Standard for Ophthalmics – Nonprescription Sunglasses and Fashion Eyewear Requirements. The measurements were statistically compared by the Mann-Whitney non-parametric test. A decrease was observed in the translucency of all specimens and, consequently, in the transmittance values of all samples. The bleaching procedure significantly changed the enamel translucency, making the enamel more opaque.
Abstract:
The aim of this work was to evaluate the performance of femtosecond laser-induced breakdown spectroscopy (fs-LIBS) for the determination of elements in animal tissues. Sample pellets were prepared from certified reference materials, such as liver, kidney, muscle, hepatopancreas, and oyster, after cryogenic-grinding-assisted homogenization. Individual samples were placed on a two-axis computer-controlled translation stage that moved in the plane orthogonal to a beam originating from a Ti:Sapphire chirped-pulse amplification (CPA) laser system operating at 800 nm and producing a train of 840 µJ, 40 fs pulses at 90 Hz. The plasma emission was coupled into the optical fiber of a high-resolution intensified charge-coupled device (ICCD)-echelle spectrometer. Time-resolved characteristics of the laser-produced plasmas showed that the best results were obtained with delay times between 80 and 120 ns. The data obtained indicate both that the sampling process is matrix-independent and that fs-LIBS can be used for the determination of Ca, Cu, Fe, K, Mg, Na, and P, but efforts must be made to obtain more appropriate detection limits for Al, Sr, and Zn.
Abstract:
A multi-pumping flow system exploiting prior assay is proposed for the sequential turbidimetric determination of sulphate and chloride in natural waters. Both methods are implemented in the same manifold, which provides facilities for: in-line sample clean-up with a Bio-Rex 70 mini-column with fluidized beads; addition of low amounts of sulphate or chloride ions to the reaction medium to improve supersaturation; analyte precipitation with Ba(2+) or Ag(+); and a real-time decision on the need for the next assay. The sample is initially run for chloride determination, and the analytical signal is compared with a preset value. If it is higher, the sample is run again, now for sulphate determination. This strategy may lead to an increased sample throughput. The proposed system is computer-controlled and presents enhanced figures of merit. About 10 samples are run per hour (about 60 measurements), and the results are reproducible and unaffected by the presence of potentially interfering ions at the concentration levels usually found in natural waters. Accuracy was assessed against ion chromatography. (C) 2008 Elsevier B.V. All rights reserved.
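The prior-assay decision described above (the chloride assay always runs; the sulphate assay runs only when the chloride signal exceeds the preset value) can be sketched as follows. The reader callbacks and threshold are illustrative placeholders, not the actual instrument interface:

```python
# Sketch of the prior-assay logic: run chloride first, then run sulphate
# only when the chloride signal exceeds a preset value.
# `read_chloride` / `read_sulphate` are hypothetical instrument callbacks.

def prior_assay(read_chloride, read_sulphate, chloride_preset):
    chloride = read_chloride()
    if chloride > chloride_preset:
        # signal above the preset: run the sample again, now for sulphate
        return {"chloride": chloride, "sulphate": read_sulphate()}
    # below the preset: skip the second assay, increasing sample throughput
    return {"chloride": chloride, "sulphate": None}
```

Skipping the second assay whenever it is not needed is what allows the reported throughput of about 10 samples per hour.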
Abstract:
Acoustic resonances are observed in high-pressure discharge lamps operated with ac input power modulated at frequencies in the kilohertz range. This paper describes an optical resonance detection method for high-intensity discharge lamps using computer-controlled cameras and image processing software. Experimental results showing acoustic resonances in high-pressure sodium lamps are presented.
Abstract:
We describe a method for multiple indicator dilution studies in the isolated perfused human placental lobule, developed to investigate the relationships between changes in pressure and flow and solute clearance. A peripheral lobule of a human placenta is perfused with a tissue-culture-based medium, and the perfusate oxygen tension, arterial and venous pressures, pH and perfusion temperature are continuously monitored by a computerized system. Flow rates are readily changed. Bolus injections of vascular, extracellular and water space markers, and of study compounds, can be made into either the maternal or the fetal circulation, and precisely timed outflow fractions can be collected with computer-controlled fraction collectors, allowing simultaneous determination of the concentration-time profiles of each marker. (C) 1997 Elsevier Science Inc.
Abstract:
Currently, power systems (PS) already accommodate a substantial penetration of distributed generation (DG) and operate in competitive environments. In the future, PS will have to deal with large-scale integration of DG and other distributed energy resources (DER), such as storage, and provide market agents with the means to ensure flexible and secure operation. This cannot be done with traditional PS operation. SCADA (Supervisory Control and Data Acquisition) is a vital infrastructure for PS. Current adaptations of SCADA to accommodate the new needs of future PS do not address all the requirements. In this paper we present a new conceptual design of an intelligent SCADA, with a more decentralized, flexible, and intelligent approach, adaptive to the context (context awareness). Once a situation is characterized, the data and control options available to each entity are redefined according to this context, taking into account operational norms and previously established contracts. The paper includes a case study in which future SCADA features use DER to deal with incident situations, preventing blackouts.
Abstract:
Dissertation submitted to obtain the Master's degree in Electrical Engineering, Automation and Industrial Electronics branch
Abstract:
Future distribution systems will have to deal with an intensive penetration of distributed energy resources, ensuring reliable and secure operation according to the smart grid paradigm. SCADA (Supervisory Control and Data Acquisition) is an essential infrastructure for this evolution. This paper proposes a new conceptual design of an intelligent SCADA with a decentralized, flexible, and intelligent approach, adaptive to the context (context awareness). This SCADA model is used to support the energy resource management undertaken by a distribution network operator (DNO). Resource management considers all the involved costs, power flows, and electricity prices, allowing the use of network reconfiguration and load curtailment. Locational Marginal Prices (LMP) are evaluated and used in specific situations to apply Demand Response (DR) programs on a global or a local basis. The paper includes a case study using a 114-bus distribution network and load demand based on real data.
Abstract:
Consider the problem of disseminating data from an arbitrary source node to all other nodes in a distributed computer system, like Wireless Sensor Networks (WSNs). We assume that wireless broadcast is used and nodes do not know the topology. We propose new protocols which disseminate data faster and use fewer broadcasts than the simple broadcast protocol.
Abstract:
Consider a distributed computer system in which every computer node can perform a wireless broadcast and, when it does so, all other nodes receive the message. The computer nodes take sensor readings, but the individual readings are not very important; what matters is computing aggregated quantities of these sensor readings. We show that a prioritized medium access control (MAC) protocol for wireless broadcast can compute simple aggregated quantities in a single transaction, and more complex quantities with many (but still a small number of) transactions. This leads to significant improvements in time complexity and, as a consequence, a similar reduction in energy consumption.
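One way a prioritized MAC can compute a simple aggregate such as MAX in a single transaction is bitwise dominance arbitration, where every node contends with its own sensor reading as the message priority. The simulation below sketches that idea under an assumed 16-bit reading width; it is an illustration of the mechanism, not the authors' protocol:

```python
# Simulated dominant-bit arbitration: each node transmits its sensor reading
# as a priority, most-significant bit first. A '1' is dominant: if any node
# sends '1', the medium carries '1' and nodes that sent '0' withdraw. After
# all bits, only the maximum reading remains -- one transaction computes MAX.

NUM_BITS = 16  # assumed width of a sensor reading

def arbitrate_max(readings):
    contenders = set(readings)
    for bit in reversed(range(NUM_BITS)):
        medium = any((r >> bit) & 1 for r in contenders)  # wired-OR medium
        if medium:
            # nodes that sent a recessive '0' while the medium was dominant back off
            contenders = {r for r in contenders if (r >> bit) & 1}
    return contenders.pop()  # the single surviving value is the maximum
```

The cost is one arbitration of `NUM_BITS` bit slots, independent of the number of nodes, which is where the time-complexity improvement over collecting every reading comes from.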
Abstract:
As electronic devices get smaller and more complex, dependability assurance is becoming fundamental for many mission-critical computer-based systems. This paper presents a case study on the possibility of using the on-chip debug infrastructures present in most current microprocessors to execute real-time fault injection campaigns. The proposed methodology is based on a debugger customized for fault injection and designed for maximum flexibility, and consists of injecting bit-flip faults into memory elements without modifying or halting the target application. The debugger design is easily portable and applicable to different architectures, providing a flexible and efficient mechanism for verifying and validating fault-tolerant components.
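The bit-flip fault model itself is simple enough to sketch in a few lines. Here a plain byte array stands in for the target memory element; the on-chip debug transport used to reach real registers or RAM is hardware-specific and is not modeled:

```python
# Minimal sketch of the bit-flip fault model: one randomly chosen bit of a
# memory element is inverted, without stopping whatever uses that memory.
# A bytearray stands in for the target memory of the real campaign.
import random

def inject_bit_flip(memory: bytearray, rng=None):
    """Flip one randomly chosen bit of `memory` in place; return its location."""
    rng = rng or random.Random()
    byte_index = rng.randrange(len(memory))
    bit_index = rng.randrange(8)
    memory[byte_index] ^= 1 << bit_index  # XOR with a mask inverts the bit
    return byte_index, bit_index
```

Because XOR is its own inverse, injecting the same fault twice restores the original contents, which is convenient when a campaign needs to undo a fault between experiments.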
Abstract:
Animal Cognition, vol. 6, pp. 259–267
Abstract:
Although the Navigation Satellite Timing and Ranging (NAVSTAR) Global Positioning System (GPS) is, de facto, the standard positioning system used in outdoor navigation, it does not provide, per se, all the features required to perform many outdoor navigational tasks. The accuracy of the GPS measurements is the most critical issue. The quest for higher position-reading accuracy led to the development, in the late nineties, of the Differential Global Positioning System (DGPS). The differential GPS method detects the range errors of the GPS satellites received and broadcasts them. DGPS/GPS receivers correlate the DGPS data with the GPS satellite data they are receiving, granting users increased accuracy. DGPS data is broadcast using terrestrial radio beacons, satellites and, more recently, the Internet. Our goal is to have access, within the ISEP campus, to DGPS correction data. To achieve this objective we designed and implemented a distributed system composed of two interconnected main modules: a distributed application responsible for the establishment of the data link over the Internet between the remote DGPS stations and the campus, and the campus-wide DGPS data server application. The DGPS data Internet link is provided by a two-tier client/server distributed application where the server side is connected to the DGPS station and the client side is located at the campus. The second unit, the campus DGPS data server application, diffuses the DGPS data received at the campus via the Intranet and via a wireless data link. The wireless broadcast is intended for DGPS/GPS portable receivers equipped with an air interface, and the Intranet link is provided for DGPS/GPS receivers with just an RS-232 DGPS data interface.
While the DGPS data Internet link servers receive the DGPS data from the DGPS base stations and forward it to the DGPS data Internet link client, the DGPS data Internet link client outputs the received DGPS data to the campus DGPS data server application. The distributed system is expected to provide adequate support for accurate (sub-metric) outdoor campus navigation tasks. This paper describes in detail the overall distributed application.
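The server-to-client leg of the DGPS data Internet link can be sketched as a minimal TCP relay. The host, port, and `read_correction` callback below are hypothetical stand-ins for the real base-station interface:

```python
# Sketch of the DGPS data Internet link, server side: read correction frames
# from the base station and forward each one over TCP to the campus client.
# An empty frame signals end of data. Host/port values are illustrative.
import socket

def relay_dgps(read_correction, campus_host, campus_port):
    with socket.create_connection((campus_host, campus_port)) as sock:
        while True:
            frame = read_correction()  # raw DGPS correction bytes
            if not frame:              # station stopped -> close the link
                break
            sock.sendall(frame)
```

On the campus side, a symmetric client would read from this connection and hand the frames to the campus DGPS data server application for diffusion over the Intranet and the wireless data link.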
Abstract:
OCEANS 2001, MTS/IEEE Conference and Exhibition (Volume 2)
Abstract:
Within the scope of the Thesis/Dissertation curricular unit of the 2nd year of the Master's in Electrical Engineering, Industrial Systems and Planning branch, of the Instituto Superior de Engenharia do Porto, this work describes a curricular internship carried out in an industrial improvement project in partnership with the Kaizen Institute, an operational consulting company. The project was developed at Firmo AVS – Papeis e Papelaria, S.A., a company that produces and redistributes stationery and office articles. The agreement between the Kaizen Institute and Firmo AVS was to promote and instill a culture of continuous improvement and of change in attitudes and behaviors among Firmo's employees; in an initial phase the project focused on the envelope production department, designated the pilot area, with the Kaizen methodology later extended to the remaining departments. The project aimed to implement elementary continuous-improvement concepts at Firmo, namely some pillars and tools of Total Flow Management (TFM) and of the Kaizen Management System (KMS), in order to reduce or eliminate waste, increase employee involvement, improve communication and teamwork, standardize production processes, create work standards, and use SMED tools to reduce unproductive times and increase productivity. Several difficulties were encountered in the field while implementing these objectives, but through the various tools and workshops carried out in the organization, the involvement of all employees was achieved and satisfactory results were obtained, namely in communication and teamwork, organization and cleanliness of workstations, standard work, reduction of lead time in the production processes, and a consequent increase in productivity.