892 results for multi-platform development
Abstract:
The derivation of a detailed sea-surface paleotemperature curve for the middle Miocene-Holocene (10-0 Ma) from ODP Site 811 on the Queensland Plateau, northeast Australia, has clarified the role of sea-surface temperature fluctuations as a control on the initiation and development of the extensive carbonate platforms of this region. This curve was derived from isotopic analyses of the planktonic foraminifer Globigerinoides ruber and converted to temperature using the surface-water paleotemperature equation, accounting for variations in global ice volume. The accuracy of these data was confirmed by derivation of paleotemperatures using the water-column isotopic gradient (Delta delta18O), corrected for salinity and variations in seafloor water-mass temperature. Results indicate that during this period surface-water temperatures were, on average, greater than the minimum required for tropical reef growth (20°C; Veron, 1986), with the exception of the late Miocene and earliest early Pliocene (10-4.9 Ma), when there were repeated intervals of temperatures between 18 and 20°C. Tropical reef growth on the Queensland Plateau was extensive from the early to early middle Miocene (~21-13 Ma), after which reef development began to decline. A lowstand near 11 Ma probably exposed shallower portions of the plateau; after re-immersion near 7 Ma, the areal extent of reef development was greatly reduced (by ~50%). Paleotemperature data from Site 811 indicate that decreased sea-surface temperatures were likely instrumental in reducing the area of active reef growth on the Queensland Plateau. Reduced reefal growth rates continued until the late Pliocene or Quaternary, despite the increase of average sea-surface paleotemperatures to 22-23°C. Studies of modern corals show that when sea-surface temperatures are below ~24°C, as they were from the late Miocene to the Pleistocene off northeast Australia, corals are stressed and growth rates are greatly reduced.
Consequently, when temperatures are in this range, corals have difficulty keeping pace with subsidence and changing environmental factors. In the late Pliocene, sedimentation rates increased due to increases in non-reefal carbonate production and falling sea levels. It was not until the mid-Quaternary (0.6-0.7 Ma) that sea-surface paleotemperatures rose above 24°C, as a result of the formation of a western Coral Sea warm-water pool. Because of age discrepancies, it is unclear exactly when an effective barrier developed on the central Great Barrier Reef; the formation of the warm-water pool likely either assisted the formation of this barrier, permitted increased coral growth rates, or both. Fluctuations in sea-surface temperature can account for much of the observed spatial and temporal variation in reef growth and carbonate platform distribution off northeast Australia; we therefore conclude that paleotemperature variations are a critical control on the development of carbonate platforms and must be considered an important cause of ancient platform "drowning".
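The abstract does not name the exact calibration used for the delta18O-to-temperature conversion; a minimal sketch of such a conversion, assuming the widely used Shackleton (1974) paleotemperature equation (the coefficients here are illustrative, not necessarily those applied at Site 811):

```python
# Sketch of a delta18O -> SST conversion, assuming the Shackleton (1974)
# calibration; the ice-volume correction enters through the seawater term.

def sst_from_d18o(d18o_calcite, d18o_seawater):
    """Convert foraminiferal calcite delta18O (per mil, PDB) to
    sea-surface temperature (degrees C), given an ice-volume-corrected
    seawater delta18O value."""
    delta = d18o_calcite - d18o_seawater
    return 16.9 - 4.38 * delta + 0.10 * delta ** 2

# Example: a calcite value of -1.5 per mil with seawater at 0.0 per mil
# implies roughly 23.7 degrees C, within the 22-23 degrees C range the
# abstract reports for the late Pliocene.
sst = sst_from_d18o(-1.5, 0.0)
```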
Abstract:
The Semantic Binary Data Model (SBM) is a viable alternative to the now-dominant relational data model. SBM would be especially advantageous for applications dealing with complex interrelated networks of objects, provided that a robust, efficient implementation can be achieved. This dissertation presents an implementation design method for SBM, algorithms, and their analytical and empirical evaluation. Our method allows building a robust and flexible database engine with a wider applicability range and improved performance. Extensions to SBM are introduced, and an implementation of these extensions is proposed that allows the database engine to efficiently support applications with a predefined set of queries. A new Record data structure is proposed, and the trade-offs of employing Fact, Record, and Bitmap data structures for storing information in a semantic database are analyzed. A clustering ID distribution algorithm and an efficient algorithm for object ID encoding are proposed. Mapping to an XML data model is analyzed, and a new XML-based XSDL language facilitating interoperability of the system is defined. Solutions to issues associated with making the database engine multi-platform are presented. An improvement to the atomic update algorithm, suitable for certain database recovery scenarios, is proposed. Specific guidelines are devised for implementing a robust and well-performing database engine based on the extended Semantic Data Model.
Abstract:
Reverberation is caused by the reflection of sound from surfaces near the sound source as it propagates to the listener. The impulse response of an environment represents its reverberation characteristics. Because it depends on the environment, reverberation conveys to the listener the character of the space where the sound originated, and its absence commonly does not sound “natural”. When recording sounds, the desirable reverberation characteristics of an environment are not always attainable, so methods for artificial reverberation have been developed, always seeking implementations that are more efficient and more faithful to real environments. This work presents an FPGA (Field Programmable Gate Array) implementation of a classic digital audio reverberation structure, based on a proposal of Manfred Schroeder, using sets of all-pass and comb filters. The developed system exploits reconfigurable hardware as a platform for the development and implementation of digital audio effects, focusing on modularity and reuse.
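The Schroeder structure named above (parallel comb filters feeding series all-pass filters) can be sketched in software; the delay lengths and gains below are illustrative choices, not the values of the FPGA implementation:

```python
# Minimal software sketch of a Schroeder reverberator: four parallel
# feedback comb filters, summed, followed by two all-pass stages.

def comb(x, delay, g):
    """Feedback comb filter: y[n] = x[n] + g * y[n - delay]."""
    y = [0.0] * len(x)
    for n in range(len(x)):
        fb = y[n - delay] if n >= delay else 0.0
        y[n] = x[n] + g * fb
    return y

def allpass(x, delay, g):
    """Schroeder all-pass: y[n] = -g*x[n] + x[n-delay] + g*y[n-delay]."""
    y = [0.0] * len(x)
    for n in range(len(x)):
        xd = x[n - delay] if n >= delay else 0.0
        yd = y[n - delay] if n >= delay else 0.0
        y[n] = -g * x[n] + xd + g * yd
    return y

def schroeder_reverb(x):
    # Parallel combs with mutually prime delays, then series all-passes.
    combs = [comb(x, d, 0.7) for d in (1116, 1188, 1277, 1356)]
    mixed = [sum(s) / len(combs) for s in zip(*combs)]
    out = allpass(mixed, 556, 0.5)
    return allpass(out, 225, 0.5)

# Feeding an impulse through produces a decaying tail that remains
# non-zero long after the input: the reverberation effect.
impulse = [1.0] + [0.0] * 4999
tail = schroeder_reverb(impulse)
```

In the FPGA realization each filter maps naturally to a delay line in block RAM plus a multiply-accumulate, which is what makes the structure attractive for modular hardware reuse.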
Abstract:
The SD card (Secure Digital Memory Card) is a widely used portable storage medium. Recent research on SD cards has focused mainly on SD card controllers based on FPGAs (Field Programmable Gate Arrays). Most rely on an API (Application Programming Interface), the AHB bus (Advanced High-performance Bus), etc., and are dedicated to achieving ultra-high-speed communication between the SD card and upper systems. Studies of SD card controllers play a vital role in the field of high-speed cameras and other specialized sub-areas. This design of an FPGA-based file system and SD2.0 IP (Intellectual Property core) not only exhibits a good transmission rate, but also achieves systematic management of files, while retaining strong portability and practicality. The design and implementation of the file system on an SD card covers three main innovation points. First, the combination and integration of the file system and the SD card controller makes the overall system highly integrated and practical. The popular SD2.0 protocol is implemented for the communication channels. A pure digital logic design in VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) integrates the SD card controller in the hardware layer with the FAT32 file system for the entire system. Second, the file management mechanism makes file processing more convenient, especially for batch processing of small files: it eases the pressure on the upper system to frequently access and process them, thereby enhancing overall system efficiency. Finally, the digital design ensures superior performance. For transmission security, a CRC (Cyclic Redundancy Check) algorithm protects data transmission. Each module is designed with platform-independent macro cells and keeps good portability, and custom integrated instructions and interfaces make the system easy to use.
The design was tested on multiple platforms, namely Xilinx and Altera FPGA development platforms, covering the timing simulation and debugging of each module. Test results show that the designed FPGA-based file system IP can support SD, TF, and Micro SD cards with the 2.0 protocol, successfully implements systematic management of stored files, and supports the SD bus mode. Data read and write rates on a Kingston class 10 card are approximately 24.27 MB/s and 16.94 MB/s, respectively.
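For the CRC protection mentioned above, SD data blocks use CRC-16/CCITT (generator polynomial x^16 + x^12 + x^5 + 1, i.e. 0x1021, initial value 0). The abstract's design implements the CRC in VHDL; this Python sketch only illustrates the algorithm itself:

```python
# Bit-serial CRC-16/CCITT (XMODEM variant), MSB first, as used for
# SD-card data blocks: polynomial 0x1021, initial value 0x0000.

def crc16_ccitt(data: bytes, crc: int = 0x0000) -> int:
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            if crc & 0x8000:
                crc = ((crc << 1) ^ 0x1021) & 0xFFFF
            else:
                crc = (crc << 1) & 0xFFFF
    return crc

# A receiver validates a block by appending the transmitted CRC
# (big-endian) and re-running the check: the result must be zero.
block = b"\xff" * 512           # one 512-byte data block
crc = crc16_ccitt(block)
check = crc16_ccitt(block + crc.to_bytes(2, "big"))
```

Running the same routine over the standard test message b"123456789" gives the published XMODEM check value 0x31C3. In hardware this bit-serial loop collapses to a 16-bit LFSR, one flip-flop chain with XOR taps at the polynomial positions.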
Abstract:
The main drivers for the development and evolution of Cyber Physical Systems (CPS) are the reduction of development costs and time, along with the enhancement of the designed products. The aim of this survey paper is to provide an overview of different types of systems and the associated transition process from mechatronics to CPS and cloud-based (IoT) systems. It further considers the requirement that methodologies for CPS design should be part of a multi-disciplinary development process within which designers should focus not only on the separate physical and computational components, but also on their integration and interaction. Challenges related to CPS design are therefore considered in the paper from the perspectives of the physical processes, computation, and integration respectively. Illustrative case studies are selected from different system levels, starting with a description of the overlaying concept of Cyber Physical Production Systems (CPPSs). The analysis and evaluation of the specific properties of a sub-system using a condition monitoring system, important for maintenance purposes, is then given for a wind turbine.
Abstract:
The publication of material in electronic form should ideally preserve, in a unified document representation, all of the richness of the printed document while maintaining enough of its underlying structure to enable searching and other forms of semantic processing. Until recently it has been hard to find a document representation which combined these attributes and which also stood some chance of becoming a de facto multi-platform standard. This paper sets out experience gained within the Electronic Publishing Research Group at the University of Nottingham in using Adobe Acrobat software and its underlying PDF (Portable Document Format) notation. The CAJUN project (CD-ROM Acrobat Journals Using Networks) began in 1993 and has used Acrobat software to produce electronic versions of journal papers for network and CD-ROM dissemination. The paper describes the project's progress so far and also gives a brief assessment of PDF's suitability as a universal document interchange standard.
Abstract:
This book examines an emerging and fast-evolving phenomenon: a growing number of people engage with two screens whilst watching television. It seems a simple concept – until we discover the important implications. In doing so, this book will move well beyond the study of online and multimedia content, and past the impact of mobile and multi-platform technology on the media. Instead it will examine how this new concept of second-screen interactivity changes the way we watch, produce, commission and monetise television programmes in the UK.
Abstract:
Copernicus is a European system for monitoring the Earth. COPERNICUS-CMEMS products and services are meant to serve all marine applications: Marine Resources, Maritime Safety, Coastal and Marine Environment, and Seasonal Forecast & Climate. The service is ambitious, as the ocean is complex and many processes are involved, from physical oceanography, biology, and geology to ocean-atmosphere fluxes, solar radiation, moon-induced tides, and anthropic activity. A multi-platform approach is essential, taking into account sea-level stations, coastal buoys, HF radars, river flows, drifting buoys, sea mammals or fish fitted with sensors, vessels, gliders, and floats.
Oceanic near-inertial internal waves: generation, propagation and interaction with mesoscale dynamics
Abstract:
Oceans play a key role in the climate system, being the largest heat sinks on Earth. Part of the energy balance of ocean circulation is driven by the Near-inertial internal waves (NIWs). Strong NIWs were observed during a multi-platform, multi-disciplinary and multi-scale campaign led by the NATO-STO CMRE in autumn 2017 in the Ligurian Sea (northwestern Mediterranean Sea). The objectives of this work are as follows: characterise the studied area at different scales; study the NIWs generation and their propagation; estimate the NIWs properties; study the interaction between NIWs and mesoscale structures. This work provides, to the author’s knowledge, the first characterization of NIWs in the Mediterranean Sea. The near-surface NIWs observed at the fixed moorings are locally generated by wind bursts, while the deeper waves originate in other regions and arrive at the moorings several days later. Most of the observed NIWs energy propagates downward with a mean vertical group velocity of (2.2±0.3)⋅10⁻⁴ m s⁻¹. On average, the NIWs have an amplitude of 0.13 m s⁻¹ and mean horizontal and vertical wavelengths of 43±25 km and 125±35 m, while shorter wavelengths are observed at the near-coastal mooring, 36±2 km and 33±2 m, respectively. Most of the observed NIWs are blue shifted and reach a value 9% higher than the local inertial frequency. Only two observed NIWs are characterised by a redshift (up to 3% lower than the local inertial frequency). In support of the in situ observations, a high resolution numerical model is implemented using NEMO (Madec et al., 2019). Results show that anticyclones (cyclones) shift the frequency of NIWs to lower (higher) frequencies with respect to the local inertial frequency. Anticyclones facilitate the downward propagation of NIW energy, while cyclones dampen it. The absence of NIW energy within an anticyclone is also investigated.
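The reported blue and red shifts are consistent with the standard effective-inertial-frequency approximation f_eff = f + ζ/2 (Kunze, 1985), where ζ is the relative vorticity of the mesoscale flow. A rough numerical sketch, with latitude and vorticity magnitudes chosen for illustration rather than taken from the campaign data:

```python
# Effective inertial frequency inside mesoscale eddies:
# f_eff = f + zeta/2, so cyclones (zeta > 0) blueshift NIWs and
# anticyclones (zeta < 0) redshift them.
import math

OMEGA = 7.2921e-5  # Earth's rotation rate (rad/s)

def inertial_frequency(lat_deg):
    """Local inertial (Coriolis) frequency f = 2*Omega*sin(latitude)."""
    return 2.0 * OMEGA * math.sin(math.radians(lat_deg))

f = inertial_frequency(43.5)        # roughly Ligurian Sea latitude
zeta = 0.2 * f                      # assumed vorticity magnitude

f_eff_cyclone = f + zeta / 2.0      # positive vorticity: blueshift
f_eff_anticyclone = f - zeta / 2.0  # negative vorticity: redshift
```

With these assumed values the shift is 10% of f either way, the same order as the 9% blueshift and 3% redshift observed at the moorings.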
Abstract:
Multi-agent approaches have been widely used to model complex systems of a distributed nature with a large number of interactions between the involved entities. Power systems are a reference case, mainly due to the increasing use of distributed energy sources, largely based on renewable sources, which have driven huge changes in the power systems sector. Dealing with such large-scale integration of intermittent generation sources has led to the emergence of several new players, as well as the development of new paradigms, such as the microgrid concept, and the evolution of demand response programs, which encourage the active participation of consumers. This paper presents a multi-agent based simulation platform which models a microgrid environment, considering several different types of simulated players. These players interact with real physical installations, creating a realistic simulation environment with results that can be observed directly in reality. A case study is presented considering players' responses to a demand response event, resulting in an intelligent increase of consumption in order to absorb a wind generation surplus.
Abstract:
LUDA is a research project of Key Action 4, "City of Tomorrow & Cultural Heritage", of the programme "Energy, Environment and Sustainable Development" within the Fifth Framework Programme of the European Commission.
Abstract:
The increasing number of players operating in power systems leads to more complex management. In this paper a new multi-agent platform is proposed which simulates the real operation of power system players: MASGriP, a Multi-Agent Smart Grid Simulation Platform. Several consumer and producer agents are implemented and simulated, considering real characteristics and different goals and actuation strategies. Aggregator entities, such as Virtual Power Players and Curtailment Service Providers, are also included. The integration of MASGriP agents into the MASCEM (Multi-Agent System for Competitive Electricity Markets) simulator allows the simulation of the technical and economic activities of several players. An energy resources management architecture used in microgrids is also explained.
Abstract:
In this work we derive an analytical solution, given by Bessel series, to the transient one-dimensional (1D) bioheat transfer equation in a multi-layer region with spatially dependent heat sources. Each region represents an independent biological tissue characterized by temperature-invariant physiological parameters and a linearly temperature-dependent metabolic heat generation. Moreover, 1D Cartesian, cylindrical or spherical coordinates are used to define the geometry, and temperature boundary conditions of the first, second and third kinds are assumed at the inner and outer surfaces. We present two examples of clinical applications for the developed solution. In the first, we investigate two different heat source terms to simulate the heating in a tumor and its surrounding tissue induced during a magnetic fluid hyperthermia technique used for cancer treatment. To obtain an accurate analytical solution, we determine the error associated with the truncated Bessel series that defines the transient solution. In the second application, we explore the potential of this model to study the effect of different environmental conditions in a multi-layered human head model (brain, bone and scalp). The convective heat transfer effect of a large blood vessel located inside the brain is also investigated. The results are further compared with a numerical solution obtained by the Finite Element Method and computed with COMSOL Multiphysics v4.1. (c) 2013 Elsevier Ltd. All rights reserved.
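The bioheat transfer equation referred to above is commonly the Pennes model; in 1D Cartesian coordinates, with a blood-perfusion term, a linearly temperature-dependent metabolic term, and a spatially dependent external source, one form reads (a sketch in standard notation, not necessarily the paper's):

```latex
\rho c \,\frac{\partial T}{\partial t}
  = k \,\frac{\partial^2 T}{\partial x^2}
  - \omega_b \rho_b c_b \,\left( T - T_a \right)
  + Q_m(T) + Q_s(x)
```

where ρ, c and k are the tissue density, specific heat and thermal conductivity, the ω_b ρ_b c_b (T − T_a) term is the heat sink due to blood perfusion relative to the arterial temperature T_a, Q_m(T) is the metabolic heat generation and Q_s(x) the imposed source (e.g. magnetic fluid hyperthermia heating). One such equation holds per tissue layer, coupled by continuity of temperature and heat flux at the interfaces.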
Abstract:
Modern multicore processors for the embedded market are often heterogeneous in nature. One feature often available is multiple sleep states, with varying transition costs for entering and leaving said sleep states. This research effort explores energy-efficient task mapping on such a heterogeneous multicore platform to reduce the overall energy consumption of the system. This is performed in the context of a partitioned scheduling approach and a very realistic power model, which improves over some of the simplifying assumptions often made in the state of the art. The developed heuristic consists of two phases: in the first phase, tasks are allocated to minimise their active energy consumption, while the second phase trades off a higher active energy consumption for an increased ability to exploit savings through more efficient sleep states. Extensive simulations demonstrate the effectiveness of the approach.
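A highly simplified sketch of the first phase described above, a greedy allocation of each task to the core where its active energy is lowest under a utilization cap. All task and power figures are illustrative assumptions, and the second phase (trading active energy for deeper sleep states) is not modelled:

```python
# Phase-one sketch: partitioned allocation minimising active energy on
# a heterogeneous multicore, under a per-core utilization cap.

def allocate(tasks, cores, cap=1.0):
    """tasks: list of (utilization, {core: active_power}) tuples.
    cores: list of core names. Returns {core: [task indices]}."""
    load = {c: 0.0 for c in cores}
    mapping = {c: [] for c in cores}
    # Place heavier tasks first, a common bin-packing heuristic.
    order = sorted(range(len(tasks)), key=lambda i: -tasks[i][0])
    for i in order:
        util, power = tasks[i]
        fits = [c for c in cores if load[c] + util <= cap]
        if not fits:
            raise ValueError("task set not schedulable under cap")
        # Pick the feasible core where this task's active energy
        # (power * time busy, proportional to power * util) is lowest.
        best = min(fits, key=lambda c: power[c] * util)
        mapping[best].append(i)
        load[best] += util
    return mapping

# Two heterogeneous cores: "big" is fast but power-hungry.
tasks = [(0.6, {"big": 2.0, "little": 1.0}),
         (0.5, {"big": 2.0, "little": 1.0}),
         (0.3, {"big": 1.5, "little": 0.9})]
m = allocate(tasks, ["big", "little"])
```

In this example the two largest tasks cannot share the efficient core, so one spills onto the power-hungry core; it is exactly this kind of outcome that the heuristic's second phase revisits when idle intervals could instead be consolidated into deeper sleep states.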
Abstract:
More than ever, economic globalization is creating the need to increase business competitiveness. Lean manufacturing is a management philosophy oriented to the elimination of activities that do not create any type of value and are thus considered a waste. One of the main differences from other management philosophies is its shop-floor focus and the operators' involvement; therefore, the training of all organization levels is crucial for the success of lean manufacturing. Universities should also participate actively in this process by developing students' lean management skills and promoting a better and faster integration of students into their future organizations. This paper proposes a single realistic manufacturing platform, involving production and assembly operations, to learn by playing many of the lean tools, such as VSM, 5S, SMED, poka-yoke, line balancing, TPM, Mizusumashi, plant layout, and JIT/kanban. This simulation game was built in tight cooperation with experienced lean companies under the international program “Lean Learning Academy” (http://www.leanlearningacademy.eu/), and its main aim is to make bachelor and master courses in applied sciences more attractive by integrating classic lectures with a simulated production environment, which could result in more motivated students and higher study yields. The simulation game results show that our approach is efficient in providing a realistic platform for the effective learning of lean principles, tools, and mindset, which can easily be included in course classes of less than two hours.