997 results for Deep architecture
Abstract:
This paper describes an architecture conceived to integrate Power Systems tools in a Power System Control Centre, based on the Ambient Intelligence (AmI) paradigm. This architecture is an instantiation of the generic architecture proposed in [1] for developing systems that interact with AmI environments, which itself resulted from a methodology for the inclusion of Artificial Intelligence in AmI environments (ISyRAmI - Intelligent Systems Research for Ambient Intelligence). The architecture presented in the paper integrates two applications in the control room of a power system transmission network. The first is the SPARSE expert system, used to diagnose incidents and to support power restoration. The second is an Intelligent Tutoring System (ITS) incorporating two training tools: the first trains operators to diagnose incidents, and the second trains them to perform restoration procedures.
Abstract:
This paper presents the proposal of an architecture for developing systems that interact with Ambient Intelligence (AmI) environments. The architecture results from a methodology for the inclusion of Artificial Intelligence in AmI environments (ISyRAmI - Intelligent Systems Research for Ambient Intelligence). The ISyRAmI architecture comprises several modules. The first deals with the acquisition of data, information and even knowledge; this data/information/knowledge concerns the AmI environment and can be acquired in different ways (from raw sensors, from the web, from experts). The second module deals with the storage, conversion, and handling of the data/information/knowledge, acknowledging that incorrectness, incompleteness, and uncertainty may be present. The third module performs intelligent operations on the data/information/knowledge of the AmI environment, including knowledge discovery systems, expert systems, planning, multi-agent systems, simulation, optimization, etc. The last module actuates in the AmI environment by means of automation, robots, intelligent agents and users.
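To make the four-module flow concrete, here is a minimal Python sketch of how such a pipeline could be wired together; every class and method name below is a hypothetical illustration, not an API from the paper.

```python
# Hypothetical sketch of the four ISyRAmI modules as a pipeline.
# None of these names come from the paper; they only illustrate the
# acquisition -> storage/handling -> reasoning -> actuation flow.
from dataclasses import dataclass, field
from typing import Any, Callable

@dataclass
class Acquisition:
    """Module 1: acquire data/information/knowledge (sensors, web, experts)."""
    sources: list[Callable[[], dict[str, Any]]] = field(default_factory=list)

    def read(self) -> list[dict[str, Any]]:
        return [source() for source in self.sources]

@dataclass
class Storage:
    """Module 2: store, convert and handle possibly incomplete/uncertain items."""
    items: list[dict[str, Any]] = field(default_factory=list)

    def ingest(self, readings: list[dict[str, Any]]) -> None:
        # Tag each reading with a crude confidence value to acknowledge
        # that uncertainty is present in the data/information/knowledge.
        for reading in readings:
            reading.setdefault("confidence", 0.5)
            self.items.append(reading)

class Reasoner:
    """Module 3: intelligent operation (expert systems, planning, etc.)."""
    def decide(self, storage: Storage) -> list[str]:
        # Toy rule: act on any reading deemed confident enough.
        return [f"handle {item['sensor']}" for item in storage.items
                if item.get("confidence", 0) >= 0.5]

class Actuator:
    """Module 4: actuation in the AmI environment (automation, robots, users)."""
    def run(self, actions: list[str]) -> None:
        for action in actions:
            print("actuating:", action)

# Wire the four modules together.
acquisition = Acquisition(sources=[lambda: {"sensor": "temperature", "value": 21.5}])
storage = Storage()
storage.ingest(acquisition.read())
Actuator().run(Reasoner().decide(storage))
```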
Abstract:
A novel high-throughput and scalable unified architecture for computing the transform operations of advanced video coding standards is presented in this paper. This structure can be used as a hardware accelerator in modern embedded systems to efficiently compute all the two-dimensional 4×4 and 2×2 transforms of the H.264/AVC standard. Moreover, its highly flexible design and hardware efficiency allow it to be easily scaled, in terms of performance and hardware cost, to meet the specific requirements of any given video coding application. Experimental results obtained using a Xilinx Virtex-5 FPGA demonstrated the superior performance and hardware efficiency of the proposed structure, whose throughput per unit of area is higher than that of other similar recently published designs targeting the H.264/AVC standard. The results also showed that, when integrated in a multi-core embedded system, this architecture provides speedups of about 120× over pure software implementations of the transform algorithms, thus allowing all the above-mentioned transforms to be computed in real time for Ultra High Definition Video (UHDV) sequences (4320×7680 @ 30 fps).
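For reference, the 4×4 forward integer core transform of H.264/AVC that such accelerators implement reduces to simple integer matrix arithmetic; the short sketch below omits the quantisation/scaling stage.

```python
# The 4x4 forward integer "core" transform of H.264/AVC (without the
# quantisation-stage scaling), shown as a plain reference computation.
# Hardware accelerators like the one described above realise exactly
# this matrix arithmetic using only additions and shifts.
import numpy as np

C = np.array([[1,  1,  1,  1],
              [2,  1, -1, -2],
              [1, -1, -1,  1],
              [1, -2,  2, -1]])

def forward_core_transform(block: np.ndarray) -> np.ndarray:
    """Y = C @ X @ C.T for a 4x4 residual block (integer arithmetic only)."""
    return C @ block @ C.T

residual = np.arange(16).reshape(4, 4)  # toy 4x4 residual block
print(forward_core_transform(residual))
```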
Abstract:
Deep Ocean Species. The little that is known mostly comes from collected specimens. In their Letter "Specimen collection: An essential tool" (23 May, 344: 814), L.A. Rocha et al. brilliantly discuss the importance of specimen collection and trace the evolution of collecting from the mid-19th century to our present strict codes of conduct. However, it is also important to emphasize that the vast majority of deep-ocean macro-organisms are known to us only because of collection, and this is a strong argument that should guide our actions as scientists. If the deep is the least known of Earth's habitats (explored at the level of 1% or so, according to recent estimates), what an awesome collection of yet-to-be-discovered species must still be there to be properly described. As the authors point out, citing (1), around 86% of species remain unknown. Voucher specimens are fundamental for the reasons given, and the vast depths of the world's oceans are perhaps the best example of that importance. The summary report of the 2010 Census of Marine Life (2) showed that, among the millions of specimens collected in both familiar and seldom-explored waters, the Census found more than 6,000 potentially new species and completed formal descriptions of more than 1,200 of them. It also found that a number of supposedly rare species are in fact common. Voucher specimens are essential and, again agreeing with the Letter of L.A. Rocha et al. (see above), the modern approach to collecting will not be a cause of extinctions but a valuable tool for knowledge and description, and even, as seen above, a way to find out that supposedly rare species may not be that rare and may in fact sustain abundant populations.
Abstract:
Master's dissertation, Integrated Ocean Studies, 25 July 2013, Universidade dos Açores.
Abstract:
The deep-sea environment is difficult to sample, and often only small quantities of material can be obtained when using methods less destructive than dredging. When working with marine animals that are difficult to sample, and with limited quantities of tissue from which to extract lipids, it is essential to ensure that the method used extracts the maximum possible quantity of lipids. This study evaluates the efficiency of modifications to the method originally described by Bligh & Dyer (1959). This lipid extraction method is broadly used with modifications, although these usually lack a proper description and an evaluation of the resulting increase in extracted lipids. Here we assess the improvement, in terms of the amount of lipids extracted, obtained by changing the method. Lipid content was determined by gravimetric measurements in eight deep-sea invertebrates, including deep-sea hydrothermal vent animals, using three different approaches. Results show increases of 14% to 30% in the lipid contents obtained from hydrothermal vent invertebrate tissues and whole animals when the samples are placed in methanol for 24 hours before applying the Bligh & Dyer mixture. The efficiency of extractions using frozen and freeze-dried samples was also compared. For large sponges, the use of lyophilized material yielded 3 to 7 times more lipids than extractions using frozen samples.
Abstract:
Doctoral thesis in Marine Sciences, specialty in Marine Ecology.
Abstract:
Doctoral thesis, Marine Sciences, specialty in Marine Biology, 18 December 2015, Universidade dos Açores.
Host-symbiont interactions in the deep-sea vent mussel Bathymodiolus azoricus: a molecular approach
Abstract:
Doctoral thesis, Marine Sciences, specialty in Marine Biology, 19 December 2015, Universidade dos Açores.
Abstract:
A new high-performance architecture for computing all the DCT operations adopted in the H.264/AVC and HEVC standards is proposed in this paper. In contrast to other dedicated transform cores, the presented multi-standard transform architecture is based on a completely configurable, scalable and unified structure, able to compute not only the forward and inverse 8×8 and 4×4 integer DCTs and the 4×4 and 2×2 Hadamard transforms defined in the H.264/AVC standard, but also the 4×4, 8×8, 16×16 and 32×32 integer transforms adopted in HEVC. Experimental results obtained using a Xilinx Virtex-7 FPGA demonstrated the superior performance and hardware efficiency of the proposed structure, which outperforms the most prominent related designs by at least 1.8 times. When integrated in a multi-core embedded system, this architecture allows all the transforms mentioned above to be computed in real time at resolutions as high as 8K Ultra High Definition Television (UHDTV) (7680×4320 @ 30 fps).
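As an illustration of the arithmetic such a unified core must cover, the sketch below applies the 4×4 forward integer transform matrix defined in HEVC; the intermediate scaling and rounding stages of the standard are omitted.

```python
# The smallest of the HEVC integer transforms mentioned above: the 4x4
# forward core transform, shown as plain matrix arithmetic. The
# standard's intermediate scaling/rounding between the two passes is
# omitted in this sketch.
import numpy as np

D4 = np.array([[64,  64,  64,  64],
               [83,  36, -36, -83],
               [64, -64, -64,  64],
               [36, -83,  83, -36]])

def hevc_forward_4x4(block: np.ndarray) -> np.ndarray:
    """Y = D @ X @ D.T for a 4x4 residual block (scaling stages omitted)."""
    return D4 @ block @ D4.T

print(hevc_forward_4x4(np.eye(4, dtype=int)))  # toy residual block
```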
Abstract:
Conference: IEEE 24th International Conference on Application-Specific Systems, Architectures and Processors (ASAP), June 5-7, 2013
Abstract:
The Robuter is a robotic mobile platform located in the "Hands-On" Laboratory of the IPP-Hurray! Research Group, at the School of Engineering of the Polytechnic Institute of Porto. Recently, the Robuter was the subject of an upgrade addressing two essential areas: the Hardware Architecture and the Software Architecture. This upgrade was triggered by technical problems on board the robot and by the fact that its hardware/software architecture had become obsolete. This Technical Report overviews the most important aspects of the new Hardware and Software Architectures of the Robuter. The document also presents the first steps towards using the Robuter platform, and provides some hints on future work that may be carried out with this mobile platform.
Abstract:
In Distributed Computer-Controlled Systems (DCCS), both real-time and reliability requirements are of major concern. Architectures for DCCS must be designed considering the integration of processing nodes and the underlying communication infrastructure, and such integration must be provided by appropriate software support services. In this paper, an architecture for DCCS is presented, its structure is outlined, and the services provided by the support software are described. These services are intended to guarantee the real-time and reliability requirements of current and future systems.
Abstract:
This paper presents an architecture (Multi-μ) being implemented to study and develop software-based fault-tolerance mechanisms for Real-Time Systems, using the Ada language (Ada 95) and Commercial Off-The-Shelf (COTS) components. Several issues regarding fault tolerance are presented, and mechanisms to achieve fault tolerance through software active replication in Ada 95 are discussed. The Multi-μ architecture, built around a specifically proposed Fault Tolerance Manager (FTManager), is then described. Finally, some considerations are made about the work in progress and essential future developments.
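The paper's mechanisms are written in Ada 95; purely as a language-neutral illustration of software active replication, the hypothetical sketch below runs a computation on several replicas and masks a faulty one by majority voting.

```python
# Hypothetical illustration of fault tolerance by software active
# replication: every replica computes the same result and a voter picks
# the majority value, masking a faulty replica. This is only a sketch of
# the general idea, not the Multi-u/FTManager implementation.
from collections import Counter
from typing import Callable

def vote(results: list[int]) -> int:
    """Majority voter: return the most common replica result."""
    value, count = Counter(results).most_common(1)[0]
    if count <= len(results) // 2:
        raise RuntimeError("no majority: too many faulty replicas")
    return value

def replicated_call(replicas: list[Callable[[int], int]], x: int) -> int:
    # Active replication: all replicas execute; the voter masks faults.
    return vote([replica(x) for replica in replicas])

square = lambda x: x * x
faulty = lambda x: x * x + 1  # a replica with a (simulated) fault
print(replicated_call([square, square, faulty], 7))  # -> 49
```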
Abstract:
In recent years, Software Architecture has attracted increased attention from academia and industry as the unifying concept for structuring the design of complex systems. One particular research area deals with the possibility of reconfiguring architectures to adapt the systems they describe to new requirements. Reconfiguration amounts to adding and removing components and connections, and may have to occur without stopping the execution of the system being reconfigured. This work contributes to the formal description of such a process. Taking as a premise that a single formalism hardly ever satisfies all requirements in every situation, we present three approaches, each with its own assumptions about the systems it can be applied to and with different advantages and disadvantages. Each approach builds on the work of other researchers and follows the aesthetic concern of changing the original formalism as little as possible, keeping its spirit. The first approach shows how a given reconfiguration can be specified in the same manner as the system it is applied to, in a way that can be executed efficiently. The second approach explores the Chemical Abstract Machine, a formalism for rewriting multisets of terms, to describe architectures, computations, and reconfigurations in a uniform way. The last approach uses a UNITY-like parallel programming design language to describe computations, represents architectures by diagrams in the sense of Category Theory, and specifies reconfigurations by graph transformation rules.
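To give a flavour of the second approach, here is a toy multiset-rewriting example in the spirit of the Chemical Abstract Machine; the term syntax and the rewrite rule are invented for illustration and are not the dissertation's formalism.

```python
# Toy illustration (not the dissertation's formalism) of describing a
# reconfiguration as a rewrite on a multiset of terms, in the spirit of
# the Chemical Abstract Machine: removing a component also removes its
# connections, without restarting the rest of the "solution".
from collections import Counter

# Architecture as a multiset of terms: components and connections.
solution = Counter({
    ("component", "client"): 1,
    ("component", "server"): 1,
    ("component", "logger"): 1,
    ("connect", "client", "server"): 1,
    ("connect", "server", "logger"): 1,
})

def remove_component(solution: Counter, name: str) -> Counter:
    """Rewrite rule: delete a component term and every connection touching it."""
    return Counter({term: n for term, n in solution.items()
                    if name not in term})

# Reconfigure at "runtime": the client/server terms are untouched.
solution = remove_component(solution, "logger")
for term in sorted(solution):
    print(term)
```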