912 results for Machine-tools - Numerical control


Relevance:

30.00%

Publisher:

Abstract:

While most data analysis and decision support tools use numerical aspects of the data, Conceptual Information Systems focus on their conceptual structure. This paper discusses how both approaches can be combined.

Relevance:

30.00%

Publisher:

Abstract:

The centralised control rooms of large industrial plants have separated people from the processes they are supposed to control. Perception is restricted mainly to the visual sense; only telephone or radio links provide narrow-band voice communication with maintenance personnel down in the plant. Multimedia equipment can perceptually bring the operator back into the plant while keeping him bodily in the comfortable and safe control room. This involves video and audio transmission from process components as well as sights and sounds artificially generated from measurements. Groupware systems support interaction between operators, engineers, and managers in different plants. With support from the German government, the state of Hessen, and industrial companies, the Laboratory for Systems Engineering and Human-Machine Systems at the University of Kassel is establishing an Experimental Multimedia Process Control Room. The core of this set-up is two high-performance graphics workstations linked to one of several process or vehicle simulators. The multimedia periphery includes video and teleconferencing equipment and a vibration and sound generation system.

Relevance:

30.00%

Publisher:

Abstract:

Measurements of feed intake, feeding time, and rumination time, summarized under the term feeding behavior, are helpful indicators for the early recognition of animals that show deviations in their behavior. The overall objective of this work was the development of an early warning system for inadequate feeding rations and for digestive and metabolic disorders, whose prevention constitutes the basis for health, performance, and reproduction. In a literature review, the current state of the art and the suitability of different measurement tools for determining the feeding behavior of ruminants were discussed. Five measurement methods based on different methodological approaches (visual observation, pressure transducers, electrical switches, electrical deformation sensors, and acoustic biotelemetry) and three selected measurement systems (the IGER Behavior Recorder, the Hi-Tag rumination monitoring system, and the RumiWatchSystem) were described, assessed, and compared within this review. In the second study, a new system for measuring the feeding behavior of dairy cows was evaluated. The system measures feeding behavior via electromyography (EMG). For validation, the feeding behavior of 14 cows was determined both by the EMG system and by visual observation. The high correlation coefficients indicate that the current system is a reliable and suitable tool for monitoring the feeding behavior of dairy cows. The aim of a further study was to compare the DairyCheck (DC) system with two additional systems for measuring rumination behavior, the Lely Qwes HR (HR) sensor and the RumiWatchSystem (RW), with respect to efficiency, reliability, and reproducibility. Agreement between RW and DC was high. The last study examined whether rumination time (RT) is affected by the onset of calving and whether it might be a useful indicator for predicting imminent birth.
Data analysis covered the final 72 h before the onset of calving, divided into twelve 6-h blocks. The results showed that RT was significantly reduced in the final 6 h before imminent birth.
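The validation step described above rests on correlating sensor output with visual observation. A minimal sketch of that comparison, with invented numbers rather than the study's data:

```python
# Hypothetical validation sketch: compare per-cow daily feeding time (min)
# recorded by an EMG-based sensor against visual observation, mirroring
# the study's validation step. All data values are invented.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

observed = [212, 198, 240, 225, 260, 231, 205]  # visual observation
sensor   = [208, 201, 236, 229, 255, 234, 199]  # EMG system

r = pearson_r(observed, sensor)
# A correlation near 1 supports the sensor as a substitute for observation.
print(round(r, 3))
```

A high coefficient on matched series is the kind of evidence the abstract summarizes as "high correlation coefficients".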

Relevance:

30.00%

Publisher:

Abstract:

The ongoing depletion of the coastal aquifer in the Gaza Strip due to groundwater overexploitation has led to seawater intrusion, which is becoming an increasingly serious problem as seawater invades further along many sections of the coastal shoreline. As a first step toward addressing the problem, an artificial neural network (ANN) model was applied as a new approach and an attractive tool to study and predict groundwater levels without requiring physically based hydrologic parameters, to improve the understanding of this complex groundwater system, and to show the effects of hydrologic, meteorological, and anthropogenic impacts on groundwater conditions. Predicting the future behaviour of the seawater intrusion process in the Gaza aquifer is thus of crucial importance for safeguarding the already scarce groundwater resources in the region. In this study, the coupled three-dimensional groundwater flow and density-dependent solute transport model SEAWAT, as implemented in Visual MODFLOW, is applied to the Gaza coastal aquifer system to simulate the location and dynamics of the saltwater-freshwater interface in the period 2000-2010. Very good agreement between simulated and observed TDS salinities is obtained, with correlation coefficients of 0.902 and 0.883 for the steady-state and transient calibrations, respectively. After successful calibration of the solute transport model, future management scenarios for the Gaza aquifer are simulated in order to get a more comprehensive view of whether the artificial recharge that has been planned in the Gaza Strip for some time can forestall, or even remedy, the presently adverse aquifer conditions, namely low groundwater heads and high salinity, by the end of the target simulation period, the year 2040.
To that end, numerous management scenarios are examined to sustain the groundwater system and to control the salinity distribution within the target period 2011-2040. In the first, pessimistic scenario it is assumed that pumping from the aquifer continues to increase in the near future to meet the rising water demand, and that there is no recharge to the aquifer beyond what natural precipitation provides. The second, optimistic scenario assumes that treated surficial wastewater can be used as a source of additional artificial recharge to the aquifer, which in principle should not only increase its sustainable yield but could, in the best case, even revert some of the adverse present-day conditions in the aquifer, i.e., seawater intrusion. This scenario is examined in three cases that differ in the locations and extents of the injection fields for the treated wastewater. The results obtained with the first (do-nothing) scenario indicate ongoing negative impacts on the aquifer, such as a higher propensity for strong seawater intrusion into the Gaza aquifer. This scenario illustrates that, compared with the 2010 situation of the baseline model, by the end of the simulation period, the year 2040, the amount of saltwater intrusion into the coastal aquifer will have increased by about 35%, and the salinity by 34%. In contrast, all three cases of the second (artificial recharge) scenario group can partly revert the present seawater intrusion. From the water-budget point of view, compared with the first (do-nothing) scenario, by the year 2040 the water added to the aquifer by artificial recharge reduces the amount of water entering the aquifer by seawater intrusion by 81, 77, and 72% for the three recharge cases, respectively. Meanwhile, the salinity in the Gaza aquifer is decreased by 15, 32, and 26% for the three cases, respectively.
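The percentage figures reported for the 2040 scenarios are ratios of water-budget terms against a baseline. A small sketch of the arithmetic, with hypothetical budget numbers rather than the model's actual output:

```python
# Illustrative water-budget bookkeeping (all numbers hypothetical, in
# million m^3/yr). Percent changes of the kind reported for the 2040
# scenarios are simple ratios of budget terms against a baseline.

def pct_change(new, baseline):
    """Percent change of `new` relative to `baseline`."""
    return 100.0 * (new - baseline) / baseline

intrusion_2010 = 40.0             # baseline seawater inflow
intrusion_2040_donothing = 54.0   # do-nothing scenario, year 2040

print(round(pct_change(intrusion_2040_donothing, intrusion_2010)))  # 35

# Reduction achieved by a recharge case relative to the do-nothing run:
intrusion_2040_recharge = 10.3
reduction = 100.0 * (1 - intrusion_2040_recharge / intrusion_2040_donothing)
print(round(reduction))  # 81
```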

Relevance:

30.00%

Publisher:

Abstract:

The accurate transport of an ion over macroscopic distances represents a challenging control problem due to the different length and time scales that enter and the experimental limitations on the controls that need to be accounted for. Here, we investigate the performance of different control techniques for ion transport in state-of-the-art segmented miniaturized ion traps. We employ numerical optimization of classical trajectories and quantum wavepacket propagation as well as analytical solutions derived from invariant-based inverse engineering and geometric optimal control. The applicability of each of the control methods depends on the length and time scales of the transport. Our comprehensive set of tools allows us to make a number of observations. We find that accurate shuttling can be performed with operation times below the trap oscillation period. The maximum speed is limited by the maximum acceleration that can be exerted on the ion. When using controls obtained from classical dynamics for wavepacket propagation, wavepacket squeezing is the only quantum effect that comes into play for a large range of trapping parameters. We show that this can be corrected by a compensating force derived from invariant-based inverse engineering, without a significant increase in the operation time.
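The observation that the maximum transport speed is limited by the achievable acceleration can be illustrated with a smooth shuttling ramp. The sketch below uses a standard fifth-order polynomial profile with zero velocity and acceleration at both endpoints; this is an assumed profile for illustration, not necessarily the authors' exact ansatz:

```python
# Hedged sketch of a smooth trap-center trajectory for ion shuttling.
# The 5th-order polynomial (a common minimal-jerk profile, assumed here
# for illustration) moves the trap center by distance d in time T with
# zero velocity and acceleration at start and end, so the ion begins
# and finishes at rest.

def trap_center(t, d, T):
    """Trap-center position at time t for a transport of distance d in time T."""
    tau = t / T
    return d * (10 * tau**3 - 15 * tau**4 + 6 * tau**5)

def acceleration(t, d, T):
    """Second time derivative of the trap-center trajectory."""
    tau = t / T
    return d / T**2 * (60 * tau - 180 * tau**2 + 120 * tau**3)

d, T = 280e-6, 4e-6   # 280 um transport in 4 us (illustrative numbers)
print(trap_center(0, d, T), trap_center(T, d, T))  # 0.0 at start, d at end

# The peak acceleration along the ramp is what bounds the transport speed:
a_peak = max(abs(acceleration(0.01 * k * T, d, T)) for k in range(101))
```

For a fixed distance d, shortening T raises `a_peak` as 1/T^2, which is the speed limit the abstract refers to.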

Relevance:

30.00%

Publisher:

Abstract:

We are currently at the cusp of a revolution in quantum technology that relies not just on the passive use of quantum effects, but on their active control. At the forefront of this revolution is the implementation of a quantum computer. Encoding information in quantum states as “qubits” makes it possible to use entanglement and quantum superposition to perform calculations that are infeasible on classical computers. The fundamental challenge in the realization of quantum computers is to avoid decoherence – the loss of quantum properties – due to unwanted interaction with the environment. This thesis addresses the problem of implementing entangling two-qubit quantum gates that are robust with respect to both decoherence and classical noise. It covers three aspects: the use of efficient numerical tools for the simulation and optimal control of open and closed quantum systems, the role of advanced optimization functionals in facilitating robustness, and the application of these techniques to two of the leading implementations of quantum computation, trapped atoms and superconducting circuits. After a review of the theoretical and numerical foundations, the central part of the thesis starts with the idea of using ensemble optimization to achieve robustness with respect to both classical fluctuations in the system parameters and decoherence. For the example of a controlled phase gate implemented with trapped Rydberg atoms, this approach is demonstrated to yield a gate that is at least one order of magnitude more robust than the best known analytic scheme. Moreover, this robustness is maintained even for gate durations significantly shorter than those obtained in the analytic scheme. Superconducting circuits are a particularly promising architecture for the implementation of a quantum computer. Their flexibility is demonstrated by performing optimizations for both diagonal and non-diagonal quantum gates.
In order to achieve robustness with respect to decoherence, it is essential to implement quantum gates in the shortest possible amount of time. This may be facilitated by using an optimization functional that targets an arbitrary perfect entangler, based on a geometric theory of two-qubit gates. For the example of superconducting qubits, it is shown that this approach leads to significantly shorter gate durations, higher fidelities, and faster convergence than optimization towards specific two-qubit gates. Performing optimization in Liouville space in order to properly take decoherence into account poses significant numerical challenges, as the dimension of Liouville space scales quadratically with that of Hilbert space. However, it can be shown that for a unitary target, the optimization requires propagation of at most three states, instead of a full basis of Liouville space. Both for the example of trapped Rydberg atoms and for superconducting qubits, the successful optimization of quantum gates is demonstrated, at a numerical cost significantly lower than previously thought possible. Together, the results of this thesis point towards a comprehensive framework for the optimization of robust quantum gates, paving the way for the future realization of quantum computers.
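The "at most three states" result matters because of the quadratic blow-up it avoids. In numbers (a simple illustration of the scaling, not taken from the thesis):

```python
# The quadratic blow-up mentioned above, in numbers: a d-dimensional
# Hilbert space has a d^2-dimensional Liouville space, so a naive
# optimization would propagate d^2 basis operators, while a unitary
# target needs only 3 propagated states, independent of d.
for n_qubits in (2, 4, 6):
    d = 2 ** n_qubits                 # Hilbert-space dimension
    print(n_qubits, d, d * d, 3)      # qubits, Hilbert dim, Liouville dim, states
```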

Relevance:

30.00%

Publisher:

Abstract:

Optimal control theory is a powerful tool for solving control problems in quantum mechanics, ranging from the control of chemical reactions to the implementation of gates in a quantum computer. Gradient-based optimization methods are able to find high fidelity controls, but require considerable numerical effort and often yield highly complex solutions. We propose here to employ a two-stage optimization scheme to significantly speed up convergence and achieve simpler controls. The control is initially parametrized using only a few free parameters, such that optimization in this pruned search space can be performed with a simplex method. The result, considered now simply as an arbitrary function on a time grid, is the starting point for further optimization with a gradient-based method that can quickly converge to high fidelities. We illustrate the success of this hybrid technique by optimizing a geometric phase gate for two superconducting transmon qubits coupled with a shared transmission line resonator, showing that a combination of Nelder-Mead simplex and Krotov’s method yields considerably better results than either one of the two methods alone.
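The two-stage scheme can be sketched on a toy problem: a coarse, derivative-free search over a few shape parameters, followed by gradient refinement of the full time-discretized control. The stages below stand in for the simplex and Krotov stages; the cost function and parametrization are invented for illustration:

```python
import math

# Toy illustration of the two-stage optimization described above (not the
# authors' actual gate-optimization problem). Stage 1: derivative-free
# scan over a single shape parameter A (standing in for Nelder-Mead).
# Stage 2: gradient descent on the full time grid (standing in for a
# gradient-based method such as Krotov's).

N = 50
ts = [k / (N - 1) for k in range(N)]
target = [math.sin(math.pi * t) ** 3 for t in ts]   # arbitrary goal curve

def cost(c):
    """Quadratic distance of control c from the goal curve."""
    return sum((ci - gi) ** 2 for ci, gi in zip(c, target))

# Stage 1: few-parameter ansatz c(t) = A * sin(pi * t), coarse scan over A.
best_A = min((a / 10 for a in range(21)),
             key=lambda A: cost([A * math.sin(math.pi * t) for t in ts]))
c = [best_A * math.sin(math.pi * t) for t in ts]

# Stage 2: treat c as a free function on the grid; here the gradient of
# the cost is analytic, so plain gradient descent converges quickly.
for _ in range(200):
    c = [ci - 0.2 * (ci - gi) for ci, gi in zip(c, target)]

print(cost(c) < 1e-4)  # the refined control reaches a much lower cost
```

The point of the hybrid: stage 1 supplies a simple, structured starting guess so stage 2 converges fast and stays close to a simple solution.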

Relevance:

30.00%

Publisher:

Abstract:

KAM is a computer program that can automatically plan, monitor, and interpret numerical experiments with Hamiltonian systems with two degrees of freedom. The program has recently helped solve an open problem in hydrodynamics. Unlike other approaches to qualitative reasoning about the dynamics of physical systems, KAM embodies a significant amount of knowledge about nonlinear dynamics. KAM's ability to control numerical experiments arises from the fact that it not only produces pictures for us to see, but also looks at (sic---in its mind's eye) the pictures it draws to guide its own actions. KAM is organized in three semantic levels: orbit recognition, phase space searching, and parameter space searching. Within each level, spatial properties and relationships that are not explicitly represented in the initial representation are extracted by iteratively applying three operations: (1) aggregation, (2) partition, and (3) classification.

Relevance:

30.00%

Publisher:

Abstract:

In early stages of architectural design, as in other design domains, the language used is often very abstract. In architectural design, for example, architects and their clients use experiential terms such as "private" or "open" to describe spaces. If we are to build programs that can help designers during this early-stage design, we must give those programs the capability to deal with concepts on the level of such abstractions. The work reported in this thesis sought to do that, focusing on two key questions: How are abstract terms such as "private" and "open" translated into physical form? How might one build a tool to assist designers with this process? The Architect's Collaborator (TAC) was built to explore these issues. It is a design assistant that supports iterative design refinement, and that represents and reasons about how experiential qualities are manifested in physical form. Given a starting design and a set of design goals, TAC explores the space of possible designs in search of solutions that satisfy the goals. It employs a strategy we've called dependency-directed redesign: it evaluates a design with respect to a set of goals, then uses an explanation of the evaluation to guide proposal and refinement of repair suggestions; it then carries out the repair suggestions to create new designs. A series of experiments was run to study TAC's behavior. Issues of control structure, goal set size, goal order, and modification operator capabilities were explored. In addition, TAC's use as a design assistant was studied in an experiment using a house in the process of being redesigned. TAC's use as an analysis tool was studied in an experiment using Frank Lloyd Wright's Prairie houses.
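The dependency-directed redesign strategy described above can be sketched generically: evaluate a design against goals, use the explanation of failures to propose repairs, apply them, and iterate. All function names here are hypothetical stand-ins, not TAC's actual interface:

```python
# Generic sketch of dependency-directed redesign (hypothetical stand-in
# functions, not TAC's API): the explanation of unmet goals drives the
# choice of repairs, which produce new candidate designs.

def redesign(design, goals, evaluate, propose_repairs, apply_repair, max_iters=10):
    for _ in range(max_iters):
        failures = evaluate(design, goals)      # explanation of unmet goals
        if not failures:
            return design                       # all goals satisfied
        for repair in propose_repairs(design, failures):
            design = apply_repair(design, repair)
    return design

# Toy instantiation: the "design" is a room-privacy score; the single
# goal is privacy >= 3, repaired by adding a partition (+1 each time).
result = redesign(
    design=0,
    goals=[3],
    evaluate=lambda d, g: [] if d >= g[0] else ["too open"],
    propose_repairs=lambda d, f: ["add partition"],
    apply_repair=lambda d, r: d + 1,
)
print(result)  # 3
```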

Relevance:

30.00%

Publisher:

Abstract:

The Kineticist's Workbench is a program that simulates chemical reaction mechanisms by predicting, generating, and interpreting numerical data. Prior to simulation, it analyzes a given mechanism to predict that mechanism's behavior; it then simulates the mechanism numerically; and afterward, it interprets and summarizes the data it has generated. In performing these tasks, the Workbench uses a variety of techniques: graph-theoretic algorithms (for analyzing mechanisms), traditional numerical simulation methods, and algorithms that examine simulation results and reinterpret them in qualitative terms. The Workbench thus serves as a prototype for a new class of scientific computational tools---tools that provide symbiotic collaborations between qualitative and quantitative methods.
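The three-phase structure described above (symbolic pre-analysis, numerical simulation, qualitative reinterpretation) can be sketched as a pipeline; the function names are hypothetical stand-ins, not the Workbench's actual interface:

```python
# Sketch of the analyze / simulate / interpret pipeline described above
# (hypothetical stand-in functions, not the program's actual interface).

def run_workbench(mechanism, analyze, simulate, interpret):
    predictions = analyze(mechanism)        # graph-theoretic pre-analysis
    data = simulate(mechanism)              # traditional numerical run
    summary = interpret(data, predictions)  # qualitative re-description
    return summary

# Toy instantiation: a "mechanism" whose concentration halves each step,
# so the qualitative summary should agree with the symbolic prediction.
summary = run_workbench(
    mechanism=[1.0, 0.5, 0.25, 0.125],
    analyze=lambda m: "monotone decay expected",
    simulate=lambda m: m,
    interpret=lambda d, p: ("decaying", p)
        if all(a > b for a, b in zip(d, d[1:])) else ("other", p),
)
print(summary)  # ('decaying', 'monotone decay expected')
```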

Relevance:

30.00%

Publisher:

Abstract:

Electroosmotic flow is a convenient mechanism for transporting polar fluid in a microfluidic device. The flow is generated through the application of an external electric field that acts on the free charges that exist in a thin Debye layer at the channel walls. The charge on the wall is due to the chemistry of the solid-fluid interface, and it can vary along the channel, e.g. due to modification of the wall. This investigation focuses on the simulation of the electroosmotic flow (EOF) profile in a cylindrical microchannel with a step change in zeta potential. The modified Navier-Stokes equation governing the velocity field and a non-linear two-dimensional Poisson-Boltzmann equation governing the electrical double-layer (EDL) field distribution are solved numerically using a finite control-volume method. Continuity of flow rate and electric current is enforced, resulting in a non-uniform electric field and pressure gradient distribution along the channel. At the junction of the step change in zeta potential, a parabolic velocity distribution, more typical of a pressure-driven flow profile, is obtained.
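The physics above can be illustrated in a simplified limit. The sketch below uses the linearized (Debye-Hückel) potential and the Helmholtz-Smoluchowski slip velocity, a back-of-the-envelope approximation rather than the paper's full two-dimensional nonlinear solver; all numerical values are illustrative:

```python
import math

# Back-of-the-envelope EOF sketch (not the paper's 2-D nonlinear model):
# in the thin-Debye-layer, low-potential limit the EDL potential decays
# as psi(y) = zeta * exp(-y / lambda_D) away from the wall, and the bulk
# plug velocity follows Helmholtz-Smoluchowski: u = -eps * zeta * E / mu.

eps = 80 * 8.854e-12   # permittivity of water [F/m]
mu = 1.0e-3            # dynamic viscosity [Pa s]
zeta = -0.05           # zeta potential [V] (illustrative)
E = 1.0e4              # applied axial field [V/m]
lambda_D = 10e-9       # Debye length [m]

def psi(y):
    """Debye-Hueckel EDL potential at distance y from the wall."""
    return zeta * math.exp(-y / lambda_D)

u_slip = -eps * zeta * E / mu   # electroosmotic slip velocity [m/s]
# A step change in zeta along the channel changes u_slip locally, which
# is what forces the internal pressure gradient described in the abstract.
print(u_slip > 0)
```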

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we address the problem of mitigating structural vibrations through the design of a semiactive controller based on mixed H2/H∞ control theory. The vibrations caused by seismic motions are mitigated by a semiactive damper installed at the bottom of the structure. By a semiactive damper we mean a device that can absorb but cannot inject energy into the system. Sufficient conditions for the design of a desired control are given in terms of linear matrix inequalities (LMIs). A controller that guarantees asymptotic stability and a mixed H2/H∞ performance is then developed. An algorithm is proposed to handle the semiactive nature of the actuator. The performance of the controller is experimentally evaluated in a real-time hybrid testing facility that consists of a physical specimen (a small-scale magnetorheological damper) and a numerical model (a large-scale three-story building).
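The semiactive constraint (absorb but never inject energy) is commonly handled by clipping the desired force. The sketch below shows that standard clipped strategy as an illustration of the constraint, not necessarily the paper's specific algorithm:

```python
# Hedged sketch of a "clipped" semiactive law: the damper can only
# dissipate energy, so the desired force from the nominal controller is
# applied only when it opposes the relative velocity across the device;
# otherwise the command is set to zero. (Illustration of the constraint
# described in the abstract, not the paper's exact algorithm.)

def semiactive_force(f_desired, velocity):
    """Return the realizable damper force.

    With f the force on the structure and v the relative velocity, the
    device is dissipative when the delivered power f * v is <= 0.
    """
    return f_desired if f_desired * velocity <= 0 else 0.0

print(semiactive_force(-50.0, 0.2))   # dissipative: applied -> -50.0
print(semiactive_force(30.0, 0.2))    # would inject energy -> 0.0
```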

Relevance:

30.00%

Publisher:

Abstract:

Lecture slides, handouts for tutorials, exam papers, and numerical examples for a third year course on Control System Design.

Relevance:

30.00%

Publisher:

Abstract:

During the 2008 global financial crisis, many organizations and financial markets had to end or rethink their operations because of the shocks that struck the well-being of their firms. Despite this grave situation, today one can find companies that recovered and emerged from the terrible panorama the crisis presented to them, even finding new business opportunities and strengthening their future. This capacity, which some organizations had and which enabled their victorious exit from the crisis, is called resilience: the ability to overcome the negative effects of internal or external shocks (Briguglio, Cordina, Farrugia & Vella, 2009). This work therefore studies this capacity in both the organization and its leaders, in order to identify factors that improve the performance of companies in crises such as the one that occurred in 2008-2009. First, the events and development of the 2008 subprime crisis are reviewed to gain a clear understanding of its antecedents, development, magnitude, and consequences. Next, an in-depth study is made of the theory of organizational resilience, resilience in the leader as an individual, and leadership styles. Finally, with a theoretical grounding in both the crisis and the concept of resilience, case studies are taken of companies that managed to endure the 2008 financial crisis and of companies that failed to survive, in order to identify characteristics of the leader and of leadership that can increase or impair the resilience capacity of organizations, with the objective of providing today's leaders with tools to manage companies efficiently and effectively in a world as complex and changeable as the present one.

Relevance:

30.00%

Publisher:

Abstract:

Implementing an MCS is a need that organizations face as they grow in size, but experience shows that this methodology has both success and failure cases, so it is important to identify and take into account the factors that influence implementation if the system is to be effective. This project aims to analyze the variables and tools for implementing an MCS in an organization. For this analysis, a broad review of the theoretical and practical literature was carried out. The final result was a definition of the determining factors for implementing an effective MCS in a company.