908 results for computer-based instrumentation


Relevance:

30.00%

Publisher:

Abstract:

The continuous improvement of Ethernet technologies is boosting the interest in extending their use to cover factory-floor distributed real-time applications. Indeed, a considerable amount of research work has been devoted to the timing analysis of Ethernet-based technologies in the past few years. The majority of those works, however, are restricted to the analysis of sub-sets of the overall computing and communication system, and thus do not address timeliness in a holistic fashion. To this end, we present a simulation-based approach for extracting temporal properties of commercial-off-the-shelf (COTS) Ethernet-based factory-floor distributed systems. This framework is applied to a specific COTS technology, Ethernet/IP. We reason about the modeling and simulation of Ethernet/IP-based systems, and about the use of statistical analysis techniques to provide useful results on timeliness. The approach is part of a wider framework related to the research project INDEPTH (INDustrial-Ethernet ProTocols under Holistic analysis).
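A minimal sketch of the kind of statistical post-processing such a simulation framework might apply to extract timeliness figures from simulated end-to-end response times; the sample values and class name below are purely illustrative and are not taken from the INDEPTH framework.

// Illustrative sketch: deriving simple timeliness statistics (mean and 99th
// percentile) from simulated end-to-end response times. Sample values and
// names are hypothetical, not INDEPTH results.
import java.util.Arrays;

public class TimelinessStats {
    public static void main(String[] args) {
        double[] responseTimesMs = { 1.8, 2.1, 1.9, 2.4, 3.0, 2.2, 1.7, 2.8, 2.0, 2.3 };
        Arrays.sort(responseTimesMs);

        double mean = Arrays.stream(responseTimesMs).average().orElse(Double.NaN);
        // Nearest-rank 99th percentile of the sorted sample.
        int rank = (int) Math.ceil(0.99 * responseTimesMs.length) - 1;
        double p99 = responseTimesMs[rank];

        System.out.printf("mean = %.2f ms, 99th percentile = %.2f ms%n", mean, p99);
    }
}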

Relevance:

30.00%

Publisher:

Abstract:

The aim of this article is to show how it is possible to integrate stories and ICT in Content and Language Integrated Learning (CLIL) for English as a foreign language (EFL) learning in bilingual schools. Two Units of Work are presented. One, for the second year of Primary, is based on a Science topic, ‘Materials’; the story used is ‘The three little pigs’ and the computer program is ‘JClic’. The other, for the sixth year of Primary, is based on a Science and Arts topic; the story used is ‘Charlotte’s Web’ and the computer program is ‘Atenex’.

Relevance:

30.00%

Publisher:

Abstract:

Traditional Real-Time Operating Systems (RTOS) are not designed to accommodate application-specific requirements. They address a general case, and the application must co-exist with any limitations imposed by such a design. For modern real-time applications this limits the quality of service offered to the end-user. Research in this field has shown that it is possible to develop dynamic systems where adaptation is the key to success. However, adaptation requires full knowledge of the system state. To overcome this, we propose a framework to gather data and interact with the operating system, extending the traditional POSIX trace model with a partial reflective model. Such a combination preserves the trace mechanism semantics while creating a powerful platform for developing new dynamic systems, with little impact on the system and without complex changes to the kernel source code.

Relevance:

30.00%

Publisher:

Abstract:

Constrained and unconstrained Nonlinear Optimization Problems often appear in many engineering areas. In some of these cases it is not possible to use derivative-based optimization methods, because the objective function is unknown, too complex, or non-smooth. In such cases, Direct Search Methods might be the most suitable optimization methods. An Application Programming Interface (API) including some of these methods was implemented using Java Technology. This API can be accessed either by applications running on the same computer where it is installed or remotely, through a LAN or the Internet, using web services. From the engineering point of view, the information needed from the API is the solution to the provided problem. From the point of view of researchers in optimization methods, however, the solution alone is not enough: additional information about the iterative process is also useful, such as the number of iterations, the value of the solution at each iteration, and the stopping criterion. This paper presents the features added to the API to give users access to the iterative process data, as illustrated in the sketch below.
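A minimal sketch of how such iterative-process data might be exposed alongside the final solution; the class and field names below (IterationRecord, OptimizationResult, and so on) are hypothetical illustrations and not the actual API.

// Hypothetical sketch of how an optimization API could return iterative-process
// data together with the final solution; names are illustrative, not the real API.
import java.util.List;

class IterationRecord {
    final int iteration;
    final double[] point;   // current iterate
    final double value;     // objective value at that iterate

    IterationRecord(int iteration, double[] point, double value) {
        this.iteration = iteration;
        this.point = point.clone();
        this.value = value;
    }
}

class OptimizationResult {
    final double[] solution;
    final double solutionValue;
    final String stoppingCriterion;        // e.g. "step size below tolerance"
    final List<IterationRecord> history;   // one record per iteration

    OptimizationResult(double[] solution, double solutionValue,
                       String stoppingCriterion, List<IterationRecord> history) {
        this.solution = solution.clone();
        this.solutionValue = solutionValue;
        this.stoppingCriterion = stoppingCriterion;
        this.history = history;
    }
}

An engineering client would read only the solution and its value, while an optimization researcher could iterate over the history to study convergence behaviour.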

Relevance:

30.00%

Publisher:

Abstract:

Nonlinear Optimization Problems are common in many engineering fields. Due to their characteristics, the objective functions of some problems might not be differentiable, or their derivatives may have complex expressions. There are even cases where an analytical expression of the objective function cannot be determined, either because of its complexity or because of its cost (monetary, computational, time, ...). In these cases Nonlinear Optimization methods must be used. An API, including several methods and algorithms to solve constrained and unconstrained optimization problems, was implemented. This API can be accessed not only in the traditional way, by installing it on the developer's and/or user's computer, but also remotely using Web Services. As long as there is a network connection to the server where the API is installed, applications always access the latest API version. A Web-based application using the proposed API was also developed. It is intended for users who do not want to integrate the methods into their own applications and simply want a tool to solve Nonlinear Optimization Problems.
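A minimal sketch of what remote access could look like on the client side, assuming the API is exposed as an HTTP web service; the endpoint URL and JSON payload below are hypothetical, not the actual service interface.

// Hypothetical client-side sketch: invoking a remotely hosted optimization
// service over HTTP. The endpoint and payload format are illustrative only.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RemoteSolverClient {
    public static void main(String[] args) throws Exception {
        String problem = "{\"objective\": \"rosenbrock\", \"initialPoint\": [-1.2, 1.0]}";
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://solver.example.org/api/solve")) // hypothetical endpoint
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(problem))
                .build();
        // The methods run on the server, so the client is always served by the
        // latest deployed version of the API without any local update.
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("Solution returned by the service: " + response.body());
    }
}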

Relevance:

30.00%

Publisher:

Abstract:

Structural health monitoring has long been identified as a prominent application of Wireless Sensor Networks (WSNs), as traditional wired solutions present inherent limitations such as installation/maintenance cost, scalability and visual impact. Nevertheless, there is a lack of ready-to-use, off-the-shelf WSN technologies able to fulfill the most demanding requirements of these applications, which can span from critical physical infrastructures (e.g. bridges, tunnels, mines, the energy grid) to historical buildings or even industrial machinery and vehicles. Low-power and low-cost yet extremely sensitive and accurate accelerometers and signal acquisition hardware, and stringent time synchronization of all sensor data, are just examples of the requirements imposed by most of these applications. This paper presents a prototype system for health monitoring of civil engineering structures that has been jointly conceived by a team of civil, electrical and computer engineers. It merges the benefits of standard commercial off-the-shelf (COTS) hardware and communication technologies with a minimum set of custom-designed signal acquisition hardware that is mandatory to fulfill all application requirements.

Relevance:

30.00%

Publisher:

Abstract:

In this paper a new simulation environment for a virtual laboratory for educational purposes is presented. The Logisim platform was adopted as the base digital simulation tool, since it has a modular implementation in Java. All the hardware devices used in the laboratory course were designed as components accessible from the simulation tool and integrated as a library. Moreover, this new library allows the user to access an external interface. This work was motivated by the need to achieve better learning times in co-design projects, based on hardware and software implementations, and to reduce laboratory time, decreasing the operational costs of engineering teaching. Furthermore, the use of virtual laboratories in educational environments allows students to perform functional tests before they go to a real laboratory. These functional tests speed up learning when a problem-based approach is adopted. © 2014 IEEE.

Relevance:

30.00%

Publisher:

Abstract:

Weblabs are spreading their influence in Science and Engineering (S&E) courses, providing a way to conduct real experiments remotely. Typically, they are implemented through different architectures and infrastructures, supported by Instruments and Modules (I&Ms) that can be remotely controlled and observed. Besides the lack of a standard solution for implementing weblabs, their reconfiguration is limited to a setup procedure that enables interconnecting a set of preselected I&Ms into an Experiment Under Test (EUT). Moreover, those I&Ms cannot be replicated or shared by different weblab infrastructures, since they are usually based on hardware platforms. To overcome these limitations, this paper proposes a standard solution that uses I&Ms embedded into Field-Programmable Gate Array (FPGA) devices. An architecture based on the IEEE 1451.0 Std. is presented, supported by an FPGA-based weblab infrastructure that can be remotely reconfigured with I&Ms, described through standard Hardware Description Language (HDL) files, using a Reconfiguration Tool (RecTool).

Relevance:

30.00%

Publisher:

Abstract:

15th International Conference on Mixed Design of Integrated Circuits and Systems, pp. 177–180, Poznan, Poland

Relevance:

30.00%

Publisher:

Abstract:

Dynamic and distributed environments are hard to model, since they suffer from unexpected changes, incomplete knowledge, and conflicting perspectives and, thus, call for appropriate knowledge representation and reasoning (KRR) systems. Such KRR systems must handle sets of dynamic beliefs, be sensitive to communicated and perceived changes in the environment and, consequently, may have to drop current beliefs in the face of new findings or disregard any new data that conflicts with stronger convictions held by the system. Not only do they need to represent and reason with beliefs, they must also perform belief revision to maintain the overall consistency of the knowledge base. One way of developing such systems is to use reason maintenance systems (RMS). In this paper we provide an overview of the most representative types of RMS, also known as truth maintenance systems (TMS), which are computational instances of the foundations-based theory of belief revision. An RMS module works together with a problem solver. The latter feeds the RMS with assumptions (core beliefs) and conclusions (derived beliefs), which are accompanied by their respective foundations. The role of the RMS module is to store the beliefs, associate with each belief (core or derived) the corresponding set of supporting foundations, and maintain the consistency of the overall reasoning by keeping, for each represented belief, the current supporting justifications. Two major approaches are used for reason maintenance: single- and multiple-context reasoning systems. In single-context systems, each belief is associated with the beliefs that directly generated it, as in the justification-based TMS (JTMS) or the logic-based TMS (LTMS); in the multiple-context counterparts, each belief is associated with the minimal set of assumptions from which it can be inferred, as in the assumption-based TMS (ATMS) or the multiple belief reasoner (MBR).
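A minimal sketch of the justification-based bookkeeping described above, assuming a heavily simplified JTMS with only monotonic justifications; the class and field names are illustrative, not an actual RMS implementation.

// Illustrative sketch of a simplified justification-based TMS (JTMS) node:
// a belief is labelled IN when at least one of its justifications has all
// antecedents IN. This is a simplification for illustration only.
import java.util.ArrayList;
import java.util.List;

class Belief {
    final String statement;
    final List<List<Belief>> justifications = new ArrayList<>(); // each entry is a set of antecedents
    boolean isAssumption;   // core belief fed in by the problem solver
    boolean in;             // current label: believed (IN) or not (OUT)

    Belief(String statement, boolean isAssumption) {
        this.statement = statement;
        this.isAssumption = isAssumption;
        this.in = isAssumption;
    }

    // Re-label this belief from its current justifications (one propagation step).
    boolean relabel() {
        if (isAssumption) return in;
        for (List<Belief> antecedents : justifications) {
            if (antecedents.stream().allMatch(b -> b.in)) {
                in = true;
                return true;
            }
        }
        in = false;
        return false;
    }
}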

Relevance:

30.00%

Publisher:

Abstract:

CO2 capture from gaseous effluents is one of the great challenges faced by chemical and environmental engineers, as the increase in CO2 levels in the Earth's atmosphere might be responsible for dramatic climate changes. Among the existing capture technologies, the only proven and mature one is chemical absorption using aqueous amine solutions. However, bearing in mind that this process is somewhat expensive, it is important to choose solvents that are the most efficient and, at the same time, the least expensive. For this purpose, a pilot test facility was assembled; it includes an absorption column, a stripping column, a heat exchanger between the two columns, a reboiler for the stripping column, pumping systems, surge tanks and all necessary instrumentation and control systems. Several aqueous amine solutions were tested in this facility and it was found that, from a set of six tested amines, diethanolamine turned out to be the most economical choice, as it showed a higher CO2 loading capacity (0.982 mol of CO2 per mol of amine) and the lowest price per litre (25.70 €/L), even when compared with monoethanolamine, the benchmark solvent, whose price per litre is 30.50 €/L.

Relevance:

30.00%

Publisher:

Abstract:

Thesis presented in partial fulfillment of the requirements for the degree of Doctor of Philosophy in the subject of Electrical and Computer Engineering

Relevance:

30.00%

Publisher:

Abstract:

Hyperspectral imaging has become one of the main topics in remote sensing. Hyperspectral images comprise hundreds of spectral bands at different (almost contiguous) wavelength channels over the same area, generating large data volumes of several GBs per flight. This high spectral resolution can be used for object detection and to discriminate between different objects based on their spectral characteristics. One of the main problems in hyperspectral analysis is the presence of mixed pixels, which arise when the spatial resolution of the sensor is not able to separate spectrally distinct materials. Spectral unmixing is one of the most important tasks for hyperspectral data exploitation. However, unmixing algorithms can be computationally very expensive, and even highly power-consuming, which compromises their use in applications under on-board constraints. In recent years, graphics processing units (GPUs) have evolved into highly parallel and programmable systems. Several hyperspectral imaging algorithms have been shown to benefit from this hardware, taking advantage of the extremely high floating-point processing performance, compact size, huge memory bandwidth, and relatively low cost of these units, which make them appealing for on-board data processing. In this paper, we propose a parallel implementation of an augmented Lagrangian-based method for unsupervised hyperspectral linear unmixing on GPUs using CUDA. The method, called simplex identification via split augmented Lagrangian (SISAL), aims to identify the endmembers of a scene, i.e., it is able to unmix hyperspectral data sets in which the pure pixel assumption is violated. The efficient implementation of the SISAL method presented in this work exploits the GPU architecture at a low level, using shared memory and coalesced memory accesses.
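For context, the standard linear mixing model underlying this kind of unmixing (a well-established formulation, not text from this paper) writes each observed pixel spectrum as a convex combination of endmember signatures:

\[
\mathbf{y} = \mathbf{M}\,\boldsymbol{\alpha} + \mathbf{n}, \qquad \boldsymbol{\alpha} \ge \mathbf{0}, \qquad \mathbf{1}^{\mathsf{T}}\boldsymbol{\alpha} = 1,
\]

where \(\mathbf{y}\) is the measured spectrum of one pixel, \(\mathbf{M}\) is the matrix of endmember signatures, \(\boldsymbol{\alpha}\) collects the abundance fractions and \(\mathbf{n}\) is acquisition noise; SISAL estimates \(\mathbf{M}\) even when no pure pixel (a pixel with a single nonzero abundance) is present in the scene.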

Relevance:

30.00%

Publisher:

Abstract:

Electricity markets are complex environments, involving a large number of different entities with specific characteristics and objectives, making their decisions and interacting in a dynamic scene. Game theory has been widely used to support decisions in competitive environments; therefore, its application to electricity markets can prove to be a high-potential tool. This paper proposes a new scenario analysis algorithm, which includes the application of game theory, to evaluate and preview different scenarios and give players the ability to react strategically, exhibiting the behavior that best fits their objectives. The model includes forecasts of competitor players' actions, used to build models of their behavior, in order to define the most probable expected scenarios. Once the scenarios are defined, game theory is applied to support the choice of the action to be performed, as sketched below. Our use of game theory is intended to support one specific agent, not to reach the market equilibrium. MASCEM (Multi-Agent System for Competitive Electricity Markets) is a multi-agent electricity market simulator that models market players and simulates their operation in the market. The scenario analysis algorithm has been tested within MASCEM, and our experimental findings with a case study based on real data from the Iberian Electricity Market are presented and discussed.
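A minimal sketch of the kind of best-response selection such a scenario-analysis step might perform, assuming payoffs have already been estimated for each combination of the supported player's actions and forecast competitor scenarios; the structure and names are illustrative and not MASCEM's actual implementation.

// Illustrative sketch: choose the action with the best expected payoff over a
// set of forecast competitor scenarios. Not MASCEM's actual algorithm; the
// payoff matrix and scenario probabilities are assumed to be given.
public class ScenarioBestResponse {
    /**
     * payoff[a][s] = estimated payoff of our action a under competitor scenario s;
     * probability[s] = estimated probability of scenario s (sums to 1).
     */
    static int chooseAction(double[][] payoff, double[] probability) {
        int best = 0;
        double bestExpected = Double.NEGATIVE_INFINITY;
        for (int a = 0; a < payoff.length; a++) {
            double expected = 0.0;
            for (int s = 0; s < probability.length; s++) {
                expected += probability[s] * payoff[a][s];
            }
            if (expected > bestExpected) {
                bestExpected = expected;
                best = a;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        double[][] payoff = { {10.0, 4.0}, {7.0, 8.0} };   // two actions, two scenarios
        double[] probability = { 0.6, 0.4 };
        System.out.println("Best action index: " + chooseAction(payoff, probability));
    }
}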

Relevance:

30.00%

Publisher:

Abstract:

Dissertation presented at the Faculdade de Ciências e Tecnologia da Universidade Nova de Lisboa to obtain the Master's degree in Electrical and Computer Engineering.