916 results for discrete event systems
Abstract:
An international seminar-workshop entitled "Facilitation of trade and transport in Latin America: situation and outlook" was held at the headquarters of the Economic Commission for Latin America and the Caribbean (ECLAC) on 29 and 30 November 2005, organized jointly by the ECLAC Division of International Trade and Integration and the United Nations Conference on Trade and Development (UNCTAD). The event was attended by about 50 persons involved in customs modernization and/or the implementation of single window systems for foreign trade in 20 Ibero-American countries. The main purpose of the seminar-workshop was to exchange ideas, opinions and proposals concerning the efficient implementation of trade facilitation instruments. The conclusions reached at this event point to the need to seek convergence among the existing trade agreements associated with trade facilitation in Latin America. Customs modernization requires the re-design of processes and procedures in order to achieve interoperability among the systems, and single window systems for foreign trade can only be implemented successfully if clear political leadership is established with broad participation from both public and private organizations.
Abstract:
The timed-initiation paradigm developed by Ghez and colleagues (1997) has revealed two modes of motor planning: continuous and discrete. Continuous responding occurs when targets are separated by less than 60° of spatial angle, and discrete responding occurs when targets are separated by greater than 60°. Although these two modes are thought to reflect the operation of separable strategic planning systems, a new theory of movement preparation, the Dynamic Field Theory, suggests that two modes emerge flexibly from the same system. Experiment 1 replicated continuous and discrete performance using a task modified to allow for a critical test of the single system view. In Experiment 2, participants were allowed to correct their movements following movement initiation (the standard task does not allow corrections). Results showed continuous planning performance at large and small target separations. These results are consistent with the proposal that the two modes reflect the time-dependent “preshaping” of a single planning system.
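For orientation, Dynamic Field Theory rests on Amari-type field dynamics; the following is a generic sketch of that equation, not necessarily the exact model used in the cited work:

\[
\tau\,\dot{u}(x,t) = -u(x,t) + h + S(x,t) + \int w(x-x')\, f\big(u(x',t)\big)\, dx'
\]

Here u(x,t) is the activation of the planning field over a movement parameter x (e.g., direction), h < 0 is the resting level, S(x,t) is target-related input, w is a local-excitation/lateral-inhibition kernel, and f is a sigmoidal output function. In this picture, "preshaping" is subthreshold activation accumulated at the target locations before the imperative signal; whether performance looks continuous or discrete then depends on whether the preshaped activation at the two targets merges into a single peak or competes as separate peaks.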
Abstract:
The aim of solving the Optimal Power Flow problem is to determine the optimal state of an electric power transmission system, that is, the voltage magnitudes, the phase angles, and the transformer tap ratios that optimize the performance of a given system while satisfying its physical and operating constraints. The Optimal Power Flow problem is modeled as a large-scale mixed-discrete nonlinear programming problem. This paper proposes a method for handling the discrete variables of the Optimal Power Flow problem based on a penalty function. With the penalty function included in the objective function, a sequence of nonlinear programming problems with only continuous variables is obtained, and the solutions of these problems converge to a solution of the mixed problem. The resulting nonlinear programming problems are solved by a Primal-Dual Logarithmic-Barrier Method. Numerical tests using the IEEE 14-, 30-, 118- and 300-bus test systems indicate that the method is efficient.
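As an illustration of the general penalty-plus-continuation idea (the abstract does not state the paper's specific penalty), the sketch below forces a transformer-tap-like variable onto a discrete grid with a sinusoidal penalty and solves a sequence of continuous subproblems. The toy objective, the 0.0125 tap step, and scipy's bounded solver standing in for the primal-dual logarithmic-barrier method are all assumptions made for demonstration.

```python
import numpy as np
from scipy.optimize import minimize

TAP_STEP = 0.0125          # assumed discrete step for a transformer tap ratio

def toy_cost(z):
    # Toy stand-in for an OPF objective: v is a voltage-like continuous variable,
    # t is a tap ratio that should end up on the discrete grid.
    v, t = z
    return (v - 1.02) ** 2 + 0.5 * (t - 0.987) ** 2

def discreteness_penalty(t, step=TAP_STEP):
    # Zero whenever t sits exactly on the discrete grid, positive in between.
    return np.sin(np.pi * t / step) ** 2

def solve_with_penalty(weights=(0.0, 1.0, 10.0, 100.0)):
    z = np.array([1.0, 1.0])                 # initial point
    for w in weights:                        # sequence of continuous subproblems
        res = minimize(lambda z: toy_cost(z) + w * discreteness_penalty(z[1]),
                       z, bounds=[(0.95, 1.05), (0.9, 1.1)])
        z = res.x
    # Snap the tap to the nearest grid point reached by the continuation.
    z[1] = round(z[1] / TAP_STEP) * TAP_STEP
    return z

print(solve_with_penalty())
```

With increasing penalty weight, the continuous subproblems push the tap toward the nearest admissible discrete value (here 0.9875) while the continuous variable stays free.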
Abstract:
In electronic commerce, systems development is based on two fundamental types of models: business models and process models. A business model is concerned with value exchanges among business partners, while a process model focuses on operational and procedural aspects of business communication. Thus, a business model defines the what of an e-commerce system, while a process model defines the how. Business process design can be facilitated and improved by a method for systematically moving from a business model to a process model. Such a method would provide support for traceability, evaluation of design alternatives, and a seamless transition from analysis to realization. This work proposes a unified framework that can be used as a basis for analyzing, interpreting and understanding the different concepts associated with the different stages of e-commerce system development. In this thesis, we illustrate how UN/CEFACT's recommended metamodels for business and process design can be analyzed, extended and then integrated into final solutions based on the proposed unified framework. As an application of the framework, we also demonstrate how process-modeling tasks can be facilitated in e-commerce system design. The proposed methodology, called BP3 (Business Process Patterns Perspective), uses a question-answer interface to capture different business requirements from the designers. It is based on pre-defined process patterns, and the final solution is generated by applying the captured business requirements, by means of a set of production rules, to complete the inter-process communication among these patterns.
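As a loose illustration of the question-answer/production-rule idea (not BP3's actual rule set), the sketch below matches captured answers against simple rules that select pre-defined process patterns; all question keys and pattern names are hypothetical.

```python
# Answers captured through a question-answer interface (hypothetical keys).
answers = {
    "payment_before_delivery": True,
    "goods_are_physical": True,
    "third_party_logistics": False,
}

# Each production rule: a condition on the answers -> a process pattern to instantiate.
rules = [
    (lambda a: a["payment_before_delivery"], "PrepaymentPattern"),
    (lambda a: not a["payment_before_delivery"], "InvoiceAfterDeliveryPattern"),
    (lambda a: a["goods_are_physical"] and a["third_party_logistics"], "OutsourcedShipmentPattern"),
    (lambda a: a["goods_are_physical"] and not a["third_party_logistics"], "DirectShipmentPattern"),
]

selected = [pattern for condition, pattern in rules if condition(answers)]
print("Process model assembled from patterns:", " -> ".join(selected))
```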
Abstract:
Motion control is a sub-field of automation in which the position and/or velocity of machines are controlled using some type of device. In motion control, the position, velocity, force, pressure, etc. profiles are designed so that the different mechanical parts work as a harmonious whole, in which perfect synchronization must be achieved. The real-time exchange of information in the distributed system that an industrial plant is today plays an important role in achieving ever better performance, effectiveness and safety. The network connecting field devices such as sensors and actuators, field controllers such as PLCs, regulators and drive controllers, and man-machine interfaces is commonly called a fieldbus. Since motion transmission is now a task of the communication system, and no longer of kinematic chains as in the past, the communication protocol must ensure that the desired profiles, and their properties, are correctly transmitted to the axes and then reproduced; otherwise the synchronization among the different parts is lost, with all the resulting consequences. This thesis addresses the problem of trajectory reconstruction in the case of an event-triggered communication system. The most important feature that a real-time communication system must have is the preservation of the following temporal and spatial properties: absolute temporal consistency, relative temporal consistency, and spatial consistency. Starting from the basic system composed of one master and one slave, and passing through systems made up of many slaves and one master, or many masters and one slave, the problems of profile reconstruction, preservation of temporal properties, and the synchronization of different profiles in networks adopting an event-triggered communication system are shown. These networks are characterized by the fact that a common knowledge of global time is not available; they are therefore non-deterministic networks. Each topology is analyzed, and the proposed solution, based on phase-locked loops and adopted for the basic master-slave case, is extended to cope with the other configurations.
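A minimal sketch of the phase-locked-loop idea for the basic one-master/one-slave case, assuming a sinusoidal master profile, a 20 ms event-triggered update period, and hand-tuned gains; none of these choices are taken from the thesis.

```python
import math

DT = 0.001                    # slave interpolation period [s]
UPDATE_EVERY = 20             # a master sample arrives every 20 control periods (assumed)
G_POS, G_VEL = 0.5, 10.0      # hand-tuned observer gains for this toy example

def master_position(t):
    # Assumed master profile: 0.5 Hz sinusoidal motion.
    return 0.5 * math.sin(2.0 * math.pi * 0.5 * t)

est_pos, est_vel = 0.0, 0.0
for step in range(5000):
    t = step * DT
    if step % UPDATE_EVERY == 0:            # event-triggered sample from the master
        err = master_position(t) - est_pos
        est_pos += G_POS * err              # phase (position) correction
        est_vel += G_VEL * err              # frequency (velocity) correction
    est_pos += est_vel * DT                 # free-run interpolation between samples

print(f"tracking error after 5 s: {master_position(5000 * DT) - est_pos:+.4f}")
```

The slave free-runs on its velocity estimate between samples and corrects position and velocity PLL-style whenever a new sample arrives, which is the essence of reconstructing a smooth profile from sporadic, event-triggered updates.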
Abstract:
The monitoring of cognitive functions aims at gaining information about the current cognitive state of the user by decoding brain signals. In recent years, this approach has made it possible to acquire valuable information about the cognitive aspects of the interaction between humans and the external world. Building on this, researchers started to consider passive applications of brain–computer interfaces (BCIs) in order to provide a novel input modality for technical systems based solely on brain activity. The objective of this thesis is to demonstrate how passive BCI applications can be used to assess the mental states of users in order to improve human-machine interaction. Two main studies are proposed. The first investigates whether morphological variations of Event-Related Potentials (ERPs) can be used to predict users' mental states (e.g. attentional resources, mental workload) during different reactive BCI tasks (e.g. P300-based BCIs), and whether this information can predict the subjects' performance in the tasks. In the second study, a passive BCI system able to estimate online the mental workload of the user by relying on the combination of EEG and ECG biosignals is proposed. The latter study was performed by simulating an operative scenario in which the occurrence of errors or lack of performance could have significant consequences. The results showed that the proposed system is able to estimate the mental workload of the subjects online, discriminating three different difficulty levels of the tasks with high reliability.
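To make the combined EEG/ECG workload-estimation idea concrete, here is a hedged sketch that classifies three difficulty levels from synthetic frontal-theta, parietal-alpha, and heart-rate features with a linear discriminant; the feature choices, the synthetic data, and the classifier are illustrative assumptions, not the thesis' pipeline.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
levels = [0, 1, 2]                     # easy / medium / hard task difficulty
X, y = [], []
for level in levels:
    n = 60
    theta = rng.normal(1.0 + 0.4 * level, 0.3, n)   # frontal theta power rises with workload
    alpha = rng.normal(1.5 - 0.3 * level, 0.3, n)   # parietal alpha power drops with workload
    hr    = rng.normal(70 + 5 * level, 4.0, n)      # mean heart rate rises with workload
    X.append(np.column_stack([theta, alpha, hr]))
    y.append(np.full(n, level))
X, y = np.vstack(X), np.concatenate(y)

clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy over three workload levels: {scores.mean():.2f}")
```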
Abstract:
Data sets describing the state of the earth's atmosphere are of great importance in the atmospheric sciences. Over the last decades, the quality and sheer amount of the available data increased significantly, resulting in a rising demand for new tools capable of handling and analysing these large, multidimensional sets of atmospheric data. The interdisciplinary work presented in this thesis covers the development and the application of practical software tools and efficient algorithms from the field of computer science, aiming at the goal of enabling atmospheric scientists to analyse and to gain new insights from these large data sets. For this purpose, our tools combine novel techniques with well-established methods from different areas such as scientific visualization and data segmentation. In this thesis, three practical tools are presented. Two of these tools are software systems (Insight and IWAL) for different types of processing and interactive visualization of data; the third tool is an efficient algorithm for data segmentation implemented as part of Insight.

Insight is a toolkit for the interactive, three-dimensional visualization and processing of large sets of atmospheric data, originally developed as a testing environment for the novel segmentation algorithm. It provides a dynamic system for combining at runtime data from different sources, a variety of different data processing algorithms, and several visualization techniques. Its modular architecture and flexible scripting support led to additional applications of the software, from which two examples are presented: the usage of Insight as a WMS (web map service) server, and the automatic production of a sequence of images for the visualization of cyclone simulations. The core application of Insight is the provision of the novel segmentation algorithm for the efficient detection and tracking of 3D features in large sets of atmospheric data, as well as for the precise localization of the occurring genesis, lysis, merging and splitting events. Data segmentation usually leads to a significant reduction of the size of the considered data. This enables a practical visualization of the data, statistical analyses of the features and their events, and the manual or automatic detection of interesting situations for subsequent detailed investigation. The concepts of the novel algorithm, its technical realization, and several extensions for avoiding under- and over-segmentation are discussed. As example applications, this thesis covers the setup and the results of the segmentation of upper-tropospheric jet streams and cyclones as full 3D objects.

Finally, IWAL is presented, a web application providing easy interactive access to meteorological data visualizations, primarily aimed at students. As a web application, it avoids the need to retrieve all input data sets and to install and handle complex visualization tools on a local machine. The main challenge in the provision of customizable visualizations to large numbers of simultaneous users was to find an acceptable trade-off between the available visualization options and the performance of the application. Besides the implementation details, benchmarks and the results of a user survey are presented.
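The following sketch shows the generic threshold-label-and-overlap approach to 3D feature segmentation and tracking, in which genesis, lysis, merging and splitting fall out of how labels map between time steps; it is an assumption-laden stand-in (random test field, arbitrary threshold), not Insight's algorithm.

```python
import numpy as np
from scipy import ndimage

def segment(field, threshold):
    """Label connected 3D regions where the field exceeds the threshold."""
    labels, n_features = ndimage.label(field > threshold)
    return labels, n_features

def forward_links(labels_prev, labels_curr):
    """For each feature at time t, list the features it overlaps at t+1.
    An empty target set indicates lysis, more than one target indicates splitting;
    merging shows up as several features sharing a target (or via the reverse map)."""
    links = {}
    for lab in range(1, int(labels_prev.max()) + 1):
        targets = set(int(v) for v in np.unique(labels_curr[labels_prev == lab])) - {0}
        links[lab] = targets
    return links

rng = np.random.default_rng(1)
field_t0 = ndimage.gaussian_filter(rng.normal(size=(20, 40, 40)), sigma=3)
field_t1 = np.roll(field_t0, shift=2, axis=2)      # crude stand-in for advected features
threshold = 1.5 * field_t0.std()
labels_t0, _ = segment(field_t0, threshold)
labels_t1, _ = segment(field_t1, threshold)
print(forward_links(labels_t0, labels_t1))
```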
Abstract:
A general approach is presented for implementing discrete transforms as a set of first-order or second-order recursive digital filters. Clenshaw's recurrence formulae are used to formulate the second-order filters. The resulting structure is suitable for efficient implementation of discrete transforms in VLSI or FPGA circuits. The general approach is applied to the discrete Legendre transform as an illustration.
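For concreteness, Clenshaw's recurrence applied to the Legendre three-term recurrence yields a second-order recursive structure; the sketch below evaluates a Legendre series this way (the synthesis direction), which illustrates the recurrence but is not the paper's filter-bank implementation.

```python
import numpy as np

def legendre_clenshaw(x, c):
    """Evaluate sum_k c[k] * P_k(x) via Clenshaw's recurrence, built on
    (k+1) P_{k+1}(x) = (2k+1) x P_k(x) - k P_{k-1}(x)."""
    n = len(c) - 1
    b1 = b2 = 0.0                       # b_{k+1}, b_{k+2}
    for k in range(n, 0, -1):
        alpha_k = (2 * k + 1) / (k + 1) * x
        beta_k1 = -(k + 1) / (k + 2)    # beta_{k+1}
        b1, b2 = c[k] + alpha_k * b1 + beta_k1 * b2, b1
    return c[0] + x * b1 - 0.5 * b2     # P_0 = 1, P_1 = x, beta_1 = -1/2

coeffs = [0.2, -0.5, 1.0, 0.3]
x = 0.37
print(legendre_clenshaw(x, coeffs))
print(np.polynomial.legendre.legval(x, coeffs))   # reference value; should agree
```

Each iteration of the loop is a second-order recursion in the running values b_{k+1} and b_{k+2}, which is what makes the approach suitable for mapping onto second-order recursive digital filters.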
Abstract:
Tropical Storm Lee produced 25-36 cm of rainfall in north-central Pennsylvania on September 4-8, 2011. Loyalsock Creek, Muncy Creek, and Fishing Creek experienced catastrophic flooding resulting in new channel formation, bank erosion, scour of chutes, deposition and reworking of point bars and chute bars, and reactivation of the floodplain. This study investigates aspects of both geomorphology and sedimentology by examining the well-exposed gravel deposits left by the flood, before these features are removed by humans or covered by vegetation. By recording the composition of gravel bars in the study area and creating lithofacies models, it is possible to understand the 2011 flooding. Surficial clasts on gravel bars are imbricated, but the lack of imbrication and high matrix content of sediments at depth suggest that surface imbrication of the largest clasts took place during hyperconcentrated flow (40-70% sediment concentration). The imbricated clasts on the surface are the largest observed within the bars. The lithofacies recorded are atypical for mixed-load stream lithofacies and more similar to glacial outburst flood lithofacies. This paper suggests that the accepted lithofacies model for mixed-load streams with gravel bedload may not always be useful for interpreting depositional systems. A flume study, which attempted to duplicate the stratigraphy recorded in the field, was run in order to better understand hyperconcentrated flows in the study area. Results from the study in the Bucknell Geology Flume Laboratory indicate that surficial imbrication is possible under hyperconcentrated conditions. After flooding the flume to entrain large amounts of sand and gravel, deposition of surficially imbricated gravel with massive or upward-coarsening sedimentology occurred. Imbrication was not observed at depth. These experimental flume deposits support our interpretation of the lithofacies observed in the field. The sizes of surficial gravel-bar clasts show clear differences between chute and point bars. On point bars, gravels fine with increasing distance from the channel. Fining also occurs at the downstream end of point bars. In chute deposits, dramatic fining occurs down the axis of the chute, and lateral grain sizes are nearly uniform. Measuring the largest sandstone clasts at 8-11 kilometer intervals on each river reveals anomalies in the downstream fining trends. Gravel inputs from bedrock outcrops, tributaries, and erosion of Pleistocene outwash terraces may explain the observed variations in grain size along streams either incised into the Appalachian Plateau or located near the Wisconsinan glacial boundary. Accelerator Mass Spectrometry (AMS) radiocarbon dating of sediment from recently scoured features on Muncy Creek and Loyalsock Creek returned respective ages of 500 BP and 2490 BP. These dates suggest that the recurrence interval of the 2011 flooding may be several hundred to several thousand years. This geomorphically derived recurrence interval is much longer than the 120-year interval calculated by the USGS using historical stream gauge records.