976 results for Control software


Relevance:

60.00%

Publisher:

Abstract:

Process planning is a very important industrial activity, since it determines how a part or a product is manufactured. Process planning decisions include machine selection, tool selection, and the determination of cutting conditions, which makes it a complex activity. In the presence of unstable demand, flexibility has become a very important characteristic of today's successful industries, for which Flexible Manufacturing Systems (FMSs) have been proposed as a solution. However, we believe that FMS control software is not flexible enough to adapt to different manufacturing system conditions with the aim of increasing the system's efficiency. One means of overcoming this limitation is to include pre-planned alternatives in the process plan; planning decisions are then made by the control system in real time, selecting the most appropriate alternative according to the conditions of the shop floor. Advantages of this approach reported in the literature include a reduction in the number of tool setups and the selection of a replacement machine for executing an operation. To verify whether the presence of alternatives in process plans actually increases the efficiency of the manufacturing system, an investigation was carried out using simulation and design-of-experiments techniques for alternative plans on a single machine. The proposed methodology and the results are discussed in this paper.
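The real-time selection among pre-planned alternatives can be illustrated with a minimal dispatching rule — pick the alternative that would finish earliest given current machine availability. This is a sketch under assumptions, not the paper's method; the data layout, field names, and times are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Alternative:
    machine: str
    setup_time: float   # tool setup time on that machine
    run_time: float     # processing time for the operation

def pick_alternative(alternatives, machine_free_at, now=0.0):
    """Choose the pre-planned alternative that finishes earliest,
    given each machine's next-free time (a simple dispatching rule)."""
    def finish(alt):
        start = max(now, machine_free_at.get(alt.machine, now))
        return start + alt.setup_time + alt.run_time
    return min(alternatives, key=finish)

# M1 is nominally faster but busy until t=10; M2 is free now, so it wins.
alts = [Alternative("M1", setup_time=5, run_time=20),
        Alternative("M2", setup_time=2, run_time=25)]
best = pick_alternative(alts, {"M1": 10.0, "M2": 0.0})
```

With these numbers the M1 alternative would finish at t = 35 and the M2 alternative at t = 27, so the shop-floor state, not the nominal plan, decides the choice.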

Relevance:

60.00%

Publisher:

Abstract:

The evolution of digital circuit technology, leading to higher speeds and greater reliability, has allowed the development of machine controllers adapted to new production systems (e.g., Flexible Manufacturing Systems - FMS). Most controllers are developed according to the CNC technology of the corresponding machine tool manufacturer, and any alteration or adaptation of their components is difficult to implement. Machine designers face hardware and software restrictions such as the lack of interaction among the system's elements and the impossibility of adding new functions, owing to hardware incompatibility and to software that does not allow alterations to the source program. The introduction of the open architecture philosophy enabled a new generation of numeric controllers, bringing conventional CNC technology to the standard IBM PC microcomputer. As a consequence, the characteristics of the CNC (positioning) and of the microcomputer (ease of programming, system configuration, network communication, etc.) are combined. Some researchers have proposed a flexible software and hardware structure allowing changes to the basic hardware configuration and to all levels of control software. In this work, the development of open architecture controllers in the OSACA, OMAC, HOAM-CNC and OSEC architectures is described.

Relevance:

60.00%

Publisher:

Abstract:

In this doctoral dissertation, low-voltage direct current (LVDC) distribution system stability, supply security and power quality are evaluated by computational modelling and by measurements on an LVDC research platform. Computational models for LVDC network analysis are developed. Time-domain simulation models are implemented in the PSCAD/EMTDC simulation environment and applied to transient behaviour and power quality studies. The LVDC network power loss model is developed in a MATLAB environment and is capable of fast estimation of the network and component power losses; it integrates analytical equations that describe the power loss mechanisms of the network components with power flow calculations. For the LVDC network research platform, a monitoring and control software solution is developed and used to deliver measurement data for verification of the developed models and analysis of the modelling results. In the work, the power loss mechanisms of the LVDC network components and their main dependencies are described, and the energy loss distribution of the LVDC network components is presented. Power quality measurements and current spectra are provided, and harmonic pollution on the DC network is analysed. The transient behaviour of the network is verified through time-domain simulations. DC capacitor guidelines for an LVDC power distribution network are introduced. The power loss analysis results show that one of the main optimisation targets for an LVDC power distribution network should be the reduction of no-load losses and the improvement of converter efficiency at partial loads. Low-frequency spectra of the network voltages and currents are shown, and harmonic propagation is analysed. Power quality at the LVDC network point of common coupling (PCC) is discussed, and the power quality standard requirements are shown to be met by the LVDC network.
The network behaviour during transients is analysed by time-domain simulations. The network is shown to be transient stable during large-scale disturbances. Measurement results on the LVDC research platform proving this are presented in the work.
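The partial-load efficiency effect highlighted in the conclusions can be sketched with a standard quadratic converter loss model: a constant no-load term plus current-dependent conduction/switching terms. The coefficients and voltages below are illustrative assumptions, not values from the dissertation:

```python
def converter_loss(i_out, p0=5.0, k1=0.02, k2=0.005):
    """Quadratic converter loss model: constant no-load loss p0 plus
    linear (switching) and quadratic (conduction) terms in load current.
    All coefficients here are illustrative placeholders."""
    return p0 + k1 * i_out + k2 * i_out ** 2

def efficiency(v_out, i_out, **coeffs):
    """Efficiency = output power / (output power + losses)."""
    p_out = v_out * i_out
    return p_out / (p_out + converter_loss(i_out, **coeffs))

# The fixed no-load term weighs most at light load, which is why reducing
# no-load losses improves partial-load efficiency:
eta_partial = efficiency(350.0, 1.0)    # light load
eta_full = efficiency(350.0, 10.0)      # heavier load
```

The gap between the two efficiencies is driven almost entirely by the constant `p0` term, mirroring the dissertation's optimisation target.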

Relevance:

60.00%

Publisher:

Abstract:

Formal methods and software testing are tools to obtain and control software quality. Used together, they provide mechanisms for software specification, verification and error detection. Even though formal methods allow software to be mathematically verified, they are not enough to assure that a system is free of faults; software testing techniques are therefore necessary to complement the verification and validation of a system. Model-based testing techniques allow tests to be generated from other software artifacts such as specifications and abstract models. Using formal specifications as the basis for test creation, we can generate better quality tests, because these specifications are usually precise and free of ambiguity. Fernanda Souza (2009) proposed a method to define test cases from B Method specifications. This method used information from the machine's invariant and the operation's precondition to define positive and negative test cases for an operation, using techniques based on equivalence class partitioning and boundary value analysis. However, the method proposed in 2009 was not automated and had conceptual deficiencies; for instance, it did not fit into a well-defined coverage criteria classification. We started our work with a case study that applied the method to an industrial example of a B specification, and from this case study we gathered input to improve the method. In our work we evolved the proposed method, rewriting it and adding characteristics to make it compatible with a test classification used by the community. We also improved the method to support specifications structured in different components, to use information from the operation's behaviour in the test case generation process, and to use new coverage criteria. In addition, we implemented a tool to automate the method and submitted it to more complex case studies.
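The equivalence-partitioning and boundary-value idea behind the test generation can be sketched for the simplest case, an integer precondition of the form `lo <= x <= hi`. The helper names are hypothetical and this is only the classic textbook technique, not the tool described in the abstract:

```python
def boundary_values(lo, hi):
    """Boundary-value analysis for the precondition lo <= x <= hi:
    values just outside, on, and just inside each boundary."""
    return [lo - 1, lo, lo + 1, hi - 1, hi, hi + 1]

def partition_tests(lo, hi):
    """Split boundary values into positive tests (precondition holds,
    operation must succeed) and negative tests (precondition violated,
    operation must be rejected)."""
    vals = boundary_values(lo, hi)
    positive = [v for v in vals if lo <= v <= hi]
    negative = [v for v in vals if not (lo <= v <= hi)]
    return positive, negative

pos, neg = partition_tests(1, 10)
```

For the range 1..10 this yields positive inputs {1, 2, 9, 10} and negative inputs {0, 11}, which is exactly the positive/negative split the abstract describes deriving from an operation's precondition.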

Relevance:

60.00%

Publisher:

Abstract:

A linearly tunable ULV transconductor featuring excellent stability of the processed-signal common-mode voltage upon tuning, critical for very-low-voltage applications, is presented. Its employment in the synthesis of CMOS gm-C high-frequency and voiceband filters is discussed. SPICE data describe the filter characteristics. For a 1.3 V supply, the nominal passband frequencies are 1.0 MHz and 3.78 kHz, respectively, with tuning rates of 12.52 kHz/mV and 0.16 kHz/mV, input-referred noise spectral densities of 1.3 μV/√Hz and 5.0 μV/√Hz, and standby consumption of 0.87 mW and 11.8 μW. Large-signal distortion of THD = 1% corresponds to differential output swings of 360 mVpp and 480 mVpp, respectively. The common-mode voltage deviation is less than 4 mV over the tuning interval.
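The linear tuning behaviour implied by a constant kHz/mV rate, together with the standard gm-C pole relation f = gm / (2πC), can be sketched numerically. The function names are hypothetical; the rate and nominal frequency are the abstract's voiceband figures:

```python
import math

def gm_c_pole(gm, c):
    """Pole frequency of a gm-C integrator: f = gm / (2*pi*C)."""
    return gm / (2 * math.pi * c)

def tuned_pole(v_tune_mv, f0_hz, rate_hz_per_mv):
    """Linear tuning law implied by a constant tuning rate in Hz/mV."""
    return f0_hz + rate_hz_per_mv * v_tune_mv

# Voiceband filter: 3.78 kHz nominal, 0.16 kHz/mV; a +10 mV tuning step
# shifts the passband edge by 1.6 kHz under the linear-tuning assumption.
f_shifted = tuned_pole(10.0, 3.78e3, 0.16e3)
```

Because the tuning of gm (and hence f) is linear in the control voltage, the common-mode stability claim matters: the signal common mode stays put while f moves.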

Relevance:

60.00%

Publisher:

Abstract:

This paper presents the complete circuitry used to build a microcontroller-based pH meter. Key control software is also discussed. An industry-standard glass combination electrode is employed for pH detection, and the electrode parameter extraction procedure is presented. Good measurement results, with 1% error, have been attained. Copyright © 2006 by the International Measurement Confederation (IMEKO).
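The core conversion such a meter's firmware performs is the Nernst relation between electrode voltage and pH: roughly 59.16 mV per pH unit at 25 °C, with an offset found by calibration. The function and its defaults below are an illustrative sketch, not the paper's extracted electrode parameters:

```python
def ph_from_voltage(e_mv, e0_mv=0.0, slope_mv=59.16, ph_ref=7.0):
    """Convert glass-electrode voltage (mV) to pH via the Nernst relation
    E = E0 + slope * (pH_ref - pH), with slope ~ 59.16 mV/pH at 25 degC.
    e0_mv is the electrode offset determined during calibration."""
    return ph_ref - (e_mv - e0_mv) / slope_mv
```

An ideal electrode reads 0 mV in a pH 7 buffer and about +59.16 mV in a pH 6 buffer; the parameter-extraction step the abstract mentions amounts to solving for `e0_mv` and `slope_mv` from two or more buffer readings.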

Relevance:

60.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

60.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

60.00%

Publisher:

Abstract:

Pós-graduação em Biopatologia Bucal - ICT

Relevance:

60.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

60.00%

Publisher:

Abstract:

Core-collapse supernovae are accompanied by a massive burst of low-energy neutrinos. They are among the most energetic phenomena in the universe and are currently the only known source of extrasolar neutrinos. The detection of such a neutrino signature would lead to a deeper understanding of the stellar explosion mechanism, which is still insufficiently understood, and would furthermore provide new insights into particle physics and supernova modelling. The IceCube neutrino telescope, currently under construction at the geographic South Pole, will be completed in 2011. In its final configuration, IceCube consists of 5160 photomultipliers arranged in a grid at depths between 1450 m and 2450 m below the ice surface. By detecting Cherenkov photons in the Antarctic glacier, it is able to detect galactic supernovae through a collective rise of the noise rates in its photomultipliers. This thesis presents several studies on the implementation of an artificial dead time, which would suppress correlated noise and thus maximise the signal-to-background ratio. A further part of this dissertation consisted of integrating the supernova data acquisition into a new experiment control software. For the analysis part of the thesis, a Monte Carlo simulation for IceCube was developed, and neutrino oscillation mechanisms and a set of signal models were integrated. A likelihood hypothesis test was used to investigate the distinguishability of different supernova and neutrino oscillation scenarios. Furthermore, it was analysed to what extent shock excitations and a QCD phase transition can be detected in the course of the explosion process.
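The likelihood hypothesis test used in the analysis can be illustrated with a minimal Poisson sketch: compare the log-likelihood of binned photomultiplier counts under "background + signal" against "background only". The per-bin rates and counts below are hypothetical, not IceCube values:

```python
import math

def log_likelihood(counts, rate):
    """Poisson log-likelihood of per-bin counts for a constant expected
    rate (the factorial term is dropped: it cancels when comparing
    hypotheses on the same data)."""
    return sum(k * math.log(rate) - rate for k in counts)

def llr(counts, background, signal):
    """Log-likelihood ratio of 'background + signal' vs 'background only';
    positive values favour the signal hypothesis."""
    return (log_likelihood(counts, background + signal)
            - log_likelihood(counts, background))

# Hypothetical per-bin noise counts: a quiet stretch vs a burst-like excess.
quiet = [280, 290, 285]
burst = [340, 355, 350]
llr_quiet = llr(quiet, background=285.0, signal=60.0)
llr_burst = llr(burst, background=285.0, signal=60.0)
```

A collective rate excess drives the ratio positive; distinguishing supernova or oscillation scenarios then amounts to comparing ratios built from different signal models.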

Relevance:

60.00%

Publisher:

Abstract:

The rapid development of the computer industry through the steady shrinking of transistors is quickly approaching the limit of silicon technology, beyond which tunnelling processes in the transistors no longer permit further miniaturisation or higher density in processors. The future of computer technology lies in the processing of quantum information. For the development of quantum computers, the detection and targeted manipulation of individual spins in solids is of the utmost importance. Standard spin detection methods such as ESR, however, only allow the detection of spin ensembles. The idea that should make the readout of single spins possible is to separate the manipulation from the detection. The NV⁻ centre is a special lattice defect in diamond that can be used as an atomic, optically readable magnetic field sensor. By measuring its fluorescence, it should be possible to detect, via spin-spin coupling, the manipulation of other, optically undetectable "dark spins" in the immediate vicinity of the NV centre. The proposed model of the quantum computer is based on N@C60 enclosed in SWCNTs. These peapods, as the units of nitrogen-containing fullerenes packed into carbon nanotubes are called, are intended to form the basis of the computing units of a truly scalable quantum computer.
The computations carried out in them with the nitrogen electron spin are to be read out optically via near-surface NV centres (of diamond plates) above which the peapods are positioned. The primary goal of the present work was to optically detect, by means of ODMR coupling experiments, the coupling of near-surface single NV centres to the optically undetectable spins of radical molecules on the diamond surface, and thereby to take decisive steps towards the realisation of a quantum register. An ODMR setup still in its development stage was rebuilt, and its existing functionality was verified on commercial NV-centre-rich nanodiamonds. In the next step, the efficiency and mode of measurement were adapted to the detection and manipulation of single NV centres implanted close to the surface (< 7 nm depth) of diamond plates. A very large part of the work, which can only partly be described here, consisted of adapting the existing control software to the demands of the practical measurements. Subsequently, the correct function of all implemented pulse sequences and other software improvements was verified by measurements on near-surface implanted single NV centres. The setup was also extended by the components required for double-resonance measurements, such as a controllable electromagnet and an RF signal source. Taking the thermal stability of N@C60 into account, an optical cryostat was also planned, built, integrated into the setup and characterised for future experiments. The spin-spin coupling experiments were carried out with the oxygen-stable galvinoxyl radical as a model system for coupling, whereby the RF spectrum of the coupled radical spin was observed through its coupling to an NV centre.
A Rabi nutation of the coupled spin could also be recorded. Further aspects of the peapod measurement and of the surface implantation were considered as well. It was investigated whether NV detection is disturbed by the SWCNTs, peapods or fullerenes. It turned out that the components of the planned quantum computer, except for the C60 clusters, are not detectable in an ODMR measurement configuration and will not disturb the NV measurement. It was also considered which types of commercial diamond plates are suitable for surface implantation; a density of implanted NV centres suitable for the coupling measurements was estimated, and an implantation with the estimated density was examined.

Relevance:

60.00%

Publisher:

Abstract:

The Zimmerwald SLR station is operated in a monostatic mode with 532 nm laser pulses emitted at adjustable frequencies of 90-110 Hz with energies slightly below 10 mJ. A rotating shutter protects the CSPAD receiver from the backscatter of the transmit beam. These systems are located below the telescope in an operator room housed within the observatory building, with the laser system located in a separate, air-conditioned part of the room. All hardware components may be accessed automatically by the control software, and from remote if required. Thanks to the fully automatic and remotely controllable SLR operations, the Zimmerwald station is one of the most productive stations in the ILRS network. Key characteristics of the hardware are shown. Specialities such as the tracking of the full GLONASS constellation, one-way ranging to the Lunar Reconnaissance Orbiter, and photon reception from bi-static experiments with the Graz SLR station are highlighted as well.

Relevance:

60.00%

Publisher:

Abstract:

OBJECTIVE To evaluate the role of an ultra-low-dose dual-source CT coronary angiography (CTCA) scan with high pitch for delimiting the range of the subsequent standard CTCA scan. METHODS 30 patients with an indication for CTCA were prospectively examined using a two-scan dual-source CTCA protocol (2.0 × 64.0 × 0.6 mm; pitch, 3.4; rotation time, 280 ms; 100 kV): Scan 1 was acquired with one-fifth of the tube current suggested by the automatic exposure control software [CareDose 4D™ (Siemens Healthcare, Erlangen, Germany), using 100 kV and 370 mAs as a reference], with a scan length from the tracheal bifurcation to the diaphragmatic border. Scan 2 was acquired with the standard tube current and a reduced scan length based on Scan 1. Nine central coronary artery segments were analysed qualitatively on both scans. RESULTS Scan 2 (105.1 ± 10.1 mm) was significantly shorter than Scan 1 (127.0 ± 8.7 mm). Image quality scores were significantly better for Scan 2. However, in 5 of 6 (83%) patients with stenotic coronary artery disease, a stenosis was already detected in Scan 1, and in 13 of 24 (54%) patients with non-stenotic coronary arteries, a stenosis was already excluded by Scan 1. Using Scan 2 as reference, the positive and negative predictive values of Scan 1 were 83% (5 of 6 patients) and 100% (13 of 13 patients), respectively. CONCLUSION An ultra-low-dose CTCA planning scan enables a reliable scan length reduction of the following standard CTCA scan and allows for a correct diagnosis in a substantial proportion of patients. ADVANCES IN KNOWLEDGE Further dose reductions are possible owing to a change in the individual patient's imaging strategy, as a prior ultra-low-dose CTCA scan may already rule out the presence of a stenosis or may lead to a direct transferral to an invasive catheter procedure.
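The predictive values quoted in the results follow directly from a 2 × 2 confusion table; a minimal sketch (the function name is hypothetical, the counts are the abstract's figures — 5 of 6 positive calls correct, 13 of 13 negative calls correct):

```python
def predictive_values(tp, fp, tn, fn):
    """Positive and negative predictive values from a 2x2 confusion table:
    PPV = TP / (TP + FP), NPV = TN / (TN + FN)."""
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return ppv, npv

# Scan 1 vs the Scan 2 reference: one false positive, no false negatives.
ppv, npv = predictive_values(tp=5, fp=1, tn=13, fn=0)
```

This reproduces the reported 83% PPV and 100% NPV of the ultra-low-dose planning scan.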