471 results for .NET Framework


Relevance:

30.00%

Publisher:

Abstract:

This work summarises the Intercalibration Exercise (IE) required for the Common Implementation Strategy of the Water Framework Directive (WFD; 2000/60/EC) that was carried out in Portugal and applied to a coastal region. The WFD aims to achieve good ecological status for all waters in the European Community by 2015. The Ecological Status of a water body is determined using a range of Hydromorphological and Physico-Chemical Quality Elements as well as Biological Quality Elements (BQE). In coastal waters, the Biological Elements include Phytoplankton, Other Aquatic Flora and Benthic Invertebrate Fauna. Good cooperation with the other Member States allowed the IE to proceed without a complete data set, and Portugal was able to intercalibrate and harmonise methods within the North East Atlantic Geographical Intercalibration Group for most of the BQE. The appropriate metrics and corresponding methods were agreed under the framework of the RECITAL (Reference Conditions and Intercalibration) project, funded by the Portuguese Water Institute, INAG. Some preliminary sampling was undertaken, but not sufficient to establish the Reference Conditions. The study area was a coastal lagoon in the southern part of Portugal. The focus was on the Phytoplankton Quality Element, but other BQE were also taken into account. Two sampling stations in the Ria Formosa coastal lagoon were considered in this exercise: Ramalhete and Ponte. The metrics adopted by the Intercalibration Exercise groups were applied, enabling the classification of the two stations as Good/High Status for the majority of the BQE parameters.

Relevance:

30.00%

Publisher:

Abstract:

A novel copper-substituted aluminophosphate (CuIST-2) has been synthesized using methylamine as a templating agent. X-ray diffraction studies provide evidence for an AEN topology similar to that of the parent Cu-free IST-2.

Relevance:

30.00%

Publisher:

Abstract:

We write to comment on the recently published paper “Defining phytoplankton class boundaries in Portuguese transitional waters: an evaluation of the ecological quality status according to the Water Framework Directive” (Brito et al., 2012). This paper presents an integrated methodology to analyse the ecological quality status of several Portuguese transitional waters using phytoplankton-related metrics. One of the systems analysed, the Guadiana estuary in southern Portugal, is considered the most problematic estuary, with its upstream water bodies classified as Poor in terms of ecological status. We strongly disagree with this conclusion, and we would like to draw attention to some methodological constraints that, in our opinion, are the basis of such misleading conclusions and should therefore not be neglected when using phytoplankton to assess the ecological status of natural waters.

Relevance:

30.00%

Publisher:

Abstract:

Websites are now fundamentally important to the tourism and hospitality industries. Hotels, for instance, spend large amounts of money improving their websites to show all the activities and amenities that they can provide. Studies exist that evaluate website performance based on functionality, usability and other factors; nevertheless, they are not exhaustive. This paper presents a framework for the characterization of hotel and resort websites, applied as a case study to the websites of 5-star hotels and resorts operating in the tourism region of the Algarve, Portugal. This framework allows us to identify a set of features for characterizing hotel and resort websites. From these features, we propose a set of comprehensive indicators, grouped into ten fundamental information dimensions, and show how applying these indicators and information dimensions yields quantitative and qualitative results.
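The indicator-and-dimension scoring described in the abstract could be sketched as follows; the dimension names, the use of binary indicators, and the unweighted averaging are all illustrative assumptions, not the paper's actual indicator set or weighting scheme.

```python
# Hypothetical sketch: score a hotel website against indicators grouped
# into information dimensions. Names and weights are made up for illustration.

def dimension_score(indicators):
    """Fraction of binary indicators present in one information dimension."""
    return sum(indicators.values()) / len(indicators)

def overall_score(dimensions):
    """Unweighted mean of the per-dimension scores."""
    return sum(dimension_score(d) for d in dimensions.values()) / len(dimensions)

site = {
    "Contacts":     {"phone": 1, "email": 1, "map": 0},
    "Reservations": {"online_booking": 1, "rate_display": 0},
}
print(round(overall_score(site), 2))  # mean of 2/3 and 1/2 -> 0.58
```

A real evaluation would likely weight dimensions differently and mix quantitative indicators with qualitative judgments, as the abstract suggests.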

Relevance:

30.00%

Publisher:

Abstract:

One of Revenue Management’s most cited definitions is probably “to sell the right accommodation to the right customer, at the right time and the right price, with optimal satisfaction for customers and hoteliers”. Smart Revenue Management (SRM) is a project that aims to develop smart automatic techniques for the efficient optimization of occupancy and rates of hotel accommodation, commonly referred to as revenue management. One of the objectives of this project is to demonstrate that the collection of Big Data, followed by an appropriate assembly of functionalities, makes it possible to generate the Data Warehouse necessary to produce high-quality business intelligence and analytics. This is achieved through the collection of data extracted from a variety of sources, including the web. This paper proposes a three-stage framework to develop the Big Data Warehouse for the SRM: first, the compilation of all available information, which in the present case focused only on the extraction of information from the web by a web crawler (raw data); second, the storage of that raw data in a primary NoSQL database; and third, the conception of a set of functionalities, rules, principles and semantics to select, combine and store, in a secondary relational database, the information meaningful for Revenue Management (the Big Data Warehouse). The last stage is the principal focus of the paper. In this context, clues are also given on how to compile information for Business Intelligence. All these functionalities contribute to a holistic framework that, in the future, will make it possible to anticipate customers’ and competitors’ behavior, fundamental elements in fulfilling the goals of Revenue Management.
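The three-stage flow could be sketched as below: raw crawled records land in a primary document store, and a selection rule moves only the revenue-relevant fields into a secondary relational store. The field names, the in-memory SQLite stand-in for the relational database, and the list-of-dicts stand-in for the NoSQL store are all assumptions for illustration.

```python
# Minimal sketch of the three-stage pipeline described above.
import sqlite3

# Stages 1-2: raw documents as gathered by a (hypothetical) web crawler,
# held in a primary schema-less store (a list of dicts stands in for NoSQL).
raw_store = [
    {"hotel": "A", "date": "2024-07-01", "rate": 180.0, "source": "web", "noise": "..."},
    {"hotel": "B", "date": "2024-07-01", "rate": 150.0, "source": "web", "noise": "..."},
]

# Stage 3: a secondary relational warehouse keeps only meaningful fields.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE rates (hotel TEXT, date TEXT, rate REAL)")
con.executemany("INSERT INTO rates VALUES (?, ?, ?)",
                [(d["hotel"], d["date"], d["rate"]) for d in raw_store])

# Business-intelligence queries then run on the cleaned relational data.
avg = con.execute("SELECT AVG(rate) FROM rates").fetchone()[0]
print(avg)  # competitor average rate for the day -> 165.0
```

The point of the split is that the noisy raw store can be re-processed under new selection rules without re-crawling.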

Relevance:

30.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2015

Relevance:

30.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2015-12

Relevance:

30.00%

Publisher:

Abstract:

Recently, several distributed video coding (DVC) solutions based on the distributed source coding (DSC) paradigm have appeared in the literature. Wyner-Ziv (WZ) video coding, a particular case of DVC where side information is made available at the decoder, enables a flexible distribution of the computational complexity between the encoder and decoder, promising to fulfill novel requirements from applications such as video surveillance, sensor networks and mobile camera phones. The quality of the side information at the decoder plays a critical role in determining the WZ video coding rate-distortion (RD) performance, notably in raising it as close as possible to the RD performance of standard predictive video coding schemes. Towards this target, efficient motion search algorithms for powerful frame interpolation are much needed at the decoder. In this paper, the RD performance of a Wyner-Ziv video codec is improved by using novel, advanced motion-compensated frame interpolation techniques to generate the side information. The development of this type of side information estimator is a difficult problem in WZ video coding, especially because the decoder only has some decoded reference frames available. Based on the regularization of the motion field, novel side information creation techniques are proposed in this paper, along with a new frame interpolation framework able to generate higher-quality side information at the decoder. To illustrate the RD performance improvements, this novel side information creation framework has been integrated into a transform-domain, turbo-coding-based Wyner-Ziv video codec. Experimental results show that the novel side information creation solution leads to better RD performance than available state-of-the-art side information estimators, with improvements up to 2 dB; moreover, it outperforms H.264/AVC Intra by up to 3 dB with a lower encoding complexity.
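To make the role of the side information concrete, the toy sketch below shows the simplest possible interpolation between two decoded reference frames: a per-pixel average, i.e. the degenerate zero-motion case. The paper's actual estimators are motion compensated and regularized; this baseline is only an illustration of what "generating side information at the decoder" means.

```python
# Toy baseline for side-information creation between two decoded reference
# frames: per-pixel averaging, equivalent to assuming zero motion everywhere.
def interpolate(frame_prev, frame_next):
    """Average two same-sized frames (lists of pixel rows) element-wise."""
    return [[(a + b) // 2 for a, b in zip(row_p, row_n)]
            for row_p, row_n in zip(frame_prev, frame_next)]

f0 = [[10, 20], [30, 40]]  # decoded frame before the WZ frame
f2 = [[20, 40], [10, 60]]  # decoded frame after the WZ frame
print(interpolate(f0, f2))  # [[15, 30], [20, 50]]
```

A motion-compensated estimator would first search for block displacements between `f0` and `f2` and average along the motion trajectories instead of at co-located pixels, which is where the quality gains reported in the paper come from.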

Relevance:

30.00%

Publisher:

Abstract:

The current regulatory framework for maintenance outage scheduling in distribution systems needs revision to face the challenges of future smart grids. In the smart grid context, generation units and the system operator perform new roles with different objectives, and efficient coordination between them becomes necessary. In this paper, the distribution system operator (DSO) of a microgrid receives the proposals for short-term (ST) planned outages from the generation and transmission side and decides the final outage plans, which the members must follow. The framework is based on a coordination procedure between the DSO and other market players. This paper addresses the resulting optimization problem in a smart grid where the operator faces uncertainty. The results show the effectiveness and applicability of the proposed regulatory framework on the modified IEEE 34-bus test system.
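The coordination step could be sketched as below: the DSO collects proposed outage windows and approves only non-overlapping ones. The greedy earliest-start rule and the (member, start, end) representation are simplifying assumptions; the paper formulates this as an optimization under uncertainty, not a greedy scan.

```python
# Illustrative sketch: the DSO receives short-term outage proposals as
# (member, start_hour, end_hour) and rejects any window overlapping an
# already-approved one. A real DSO would solve an optimization instead.
def decide_outages(proposals):
    approved = []
    for member, start, end in sorted(proposals, key=lambda p: p[1]):
        # a window is acceptable if it is disjoint from all approved ones
        if all(end <= s or start >= e for _, s, e in approved):
            approved.append((member, start, end))
    return approved

plans = [("gen1", 2, 6), ("gen2", 4, 8), ("line3", 6, 9)]
print(decide_outages(plans))  # [('gen1', 2, 6), ('line3', 6, 9)]
```

Here `gen2` is deferred because its window overlaps `gen1`'s approved outage, which mirrors the abstract's point that the final plan decided by the DSO is binding on the members.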

Relevance:

30.00%

Publisher:

Abstract:

Electricity markets are complex environments with very particular characteristics. A critical issue regarding these characteristics concerns the constant changes they are subject to. This is a result of the restructuring of electricity markets, which was performed to increase competitiveness but also greatly increased the complexity and unpredictability of those markets. The constant growth in market unpredictability resulted in an amplified need for market participants to foresee market behaviour. The need to understand the market mechanisms, and how the interaction of the involved players affects market outcomes, contributed to the growing use of simulation tools. Multi-agent based software is particularly well suited to analysing dynamic and adaptive systems with complex interactions among their constituents, such as electricity markets. This dissertation presents ALBidS – Adaptive Learning strategic Bidding System, a multi-agent system created to provide decision support to market negotiating players. This system is integrated with the MASCEM electricity market simulator, so that its advantage in supporting a market player can be tested using cases based on real market data. ALBidS considers several different methodologies based on very distinct approaches to provide alternative suggestions for the best actions the supported player can perform. The approach chosen as the player’s actual action is selected by reinforcement learning algorithms, which, for each situation, simulation circumstance and context, decide which proposed action has the highest probability of success. Some of the considered approaches are supported by a mechanism that creates profiles of competitor players.
These profiles are built according to the competitors’ observed past actions and reactions when faced with specific situations, such as success and failure. The system’s context awareness and its analysis of simulation circumstances, both in terms of results performance and execution-time adaptation, are complementary mechanisms that endow ALBidS with further adaptation and learning capabilities.
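The selection idea, several strategies propose actions and a reinforcement-learning layer learns which performs best, could be sketched with a simple epsilon-greedy running-average selector. This is only an illustration of the pattern; ALBidS itself uses richer, context-aware learning algorithms, and the strategy names below are hypothetical.

```python
# Sketch of reinforcement-learning strategy selection (epsilon-greedy with
# incremental running-average value estimates). Strategy names are made up.
import random

class StrategySelector:
    def __init__(self, strategies, epsilon=0.1):
        self.values = {s: 0.0 for s in strategies}   # estimated reward per strategy
        self.counts = {s: 0 for s in strategies}
        self.epsilon = epsilon

    def choose(self):
        if random.random() < self.epsilon:            # explore occasionally
            return random.choice(list(self.values))
        return max(self.values, key=self.values.get)  # otherwise exploit best so far

    def update(self, strategy, reward):
        """Incremental mean: v += (r - v) / n."""
        self.counts[strategy] += 1
        n = self.counts[strategy]
        self.values[strategy] += (reward - self.values[strategy]) / n

sel = StrategySelector(["naive", "regression", "game_theory"])
sel.update("regression", 1.0)  # the regression strategy's bid succeeded
```

After a few such updates, `choose()` mostly returns the strategy with the best observed track record while still sampling the others, which is the behaviour the dissertation's selection layer formalizes per context.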

Relevance:

30.00%

Publisher:

Abstract:

It is generally challenging to determine the end-to-end delays of applications when maximizing the aggregate system utility subject to timing constraints. Many practical approaches suggest the use of intermediate deadlines for tasks in order to control and upper-bound their end-to-end delays. This paper proposes a unified framework for different time-sensitive, global optimization problems and solves them in a distributed manner using Lagrangian duality. The framework uses global viewpoints to assign intermediate deadlines, taking resource contention among tasks into consideration. For soft real-time tasks, the proposed framework effectively addresses the deadline assignment problem while maximizing the aggregate quality of service. For hard real-time tasks, we show that existing heuristic solutions to the deadline assignment problem can be incorporated into the proposed framework, enriching their mathematical interpretation.
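One well-known heuristic of the kind the framework can incorporate is to split an end-to-end deadline across a task's stages in proportion to their execution demands. The sketch below shows that baseline only; the paper's Lagrangian-duality approach refines such assignments globally under resource contention.

```python
# Proportional deadline assignment: split an end-to-end deadline D over a
# pipeline of stages in proportion to each stage's execution time, returning
# the cumulative intermediate deadline of each stage.
def proportional_deadlines(exec_times, end_to_end_deadline):
    total = sum(exec_times)
    deadlines, acc = [], 0.0
    for c in exec_times:
        acc += end_to_end_deadline * c / total  # this stage's share of D
        deadlines.append(acc)
    return deadlines

# Three stages with demands 2, 1, 3 sharing an end-to-end deadline of 12:
print(proportional_deadlines([2, 1, 3], 12.0))  # [4.0, 6.0, 12.0]
```

The last intermediate deadline always equals the end-to-end deadline, so meeting every intermediate deadline upper-bounds the end-to-end delay, which is exactly the property the abstract relies on.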

Relevance:

30.00%

Publisher:

Abstract:

Embedded real-time applications increasingly present high computation requirements that must be completed within specific deadlines, but whose load follows highly variable patterns, depending on the set of data available at a given instant. The current trend towards parallel processing in the embedded domain provides higher processing power; however, it does not address the variability in the processing pattern. Dimensioning each device for its worst-case scenario implies lower average utilization and leaves processing capacity in the overall system available but unusable. A solution to this problem is to extend the parallel execution of the applications, allowing networked nodes to distribute the workload, in peak situations, to neighbour nodes. In this context, this report proposes a framework to develop parallel and distributed real-time embedded applications, transparently using OpenMP and the Message Passing Interface (MPI) within a programming model based on OpenMP. The technical report also devises an integrated timing model, which enables structured reasoning about the timing behaviour of these hybrid architectures.

Relevance:

30.00%

Publisher:

Abstract:

Stringent cost and energy constraints impose the use of low-cost and low-power radio transceivers in large-scale wireless sensor networks (WSNs). This fact, together with the harsh characteristics of the physical environment, requires a rigorous WSN design. Mechanisms for WSN deployment and topology control, MAC and routing, and resource and mobility management greatly depend on reliable link quality estimators (LQEs). This paper describes the RadiaLE framework, which enables the experimental assessment, design and optimization of LQEs. RadiaLE comprises (i) the hardware components of the WSN testbed and (ii) a software tool for setting up and controlling the experiments, automating the gathering of link measurements through packet-statistics collection, and analyzing the collected data, allowing for LQE evaluation. We also propose a methodology that allows (i) properly setting up different types of links and different types of traffic, (ii) collecting rich link measurements, and (iii) validating LQEs using a holistic and unified approach. To demonstrate the validity and usefulness of RadiaLE, we present two case studies: the characterization of low-power links and a comparison between six representative LQEs. We also extend the second study to evaluate the accuracy of the TOSSIM 2 channel model.
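As a concrete example of the kind of estimator such a framework evaluates, the sketch below computes the packet reception ratio (PRR), arguably the simplest LQE, from a window of per-packet reception flags. The window size is an arbitrary assumption, not a RadiaLE default.

```python
# Packet reception ratio (PRR): fraction of packets received over the most
# recent `window` transmissions on a link. One of the simplest LQEs.
def prr(received_flags, window=8):
    recent = received_flags[-window:]   # keep only the latest observations
    return sum(recent) / len(recent)

log = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1]   # 1 = packet ACKed, 0 = packet lost
print(prr(log))  # 6 of the last 8 packets received -> 0.75
```

More sophisticated LQEs of the kind compared in the paper combine such packet statistics with hardware indicators and smoothing, which is why a controlled testbed for collecting rich link measurements matters.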

Relevance:

30.00%

Publisher:

Abstract:

Wireless Sensor Networks (WSNs) are being used for a number of applications involving infrastructure monitoring, building energy monitoring and industrial sensing. The difficulty of programming individual sensor nodes and the associated overhead have encouraged researchers to design macro-programming systems, which help program the network as a whole or as a combination of subnets. Most current macro-programming schemes do not support multiple users seamlessly deploying diverse applications on the same shared sensor network. As WSNs become more common, it is important to provide such support, since it enables higher-level optimizations such as code reuse, energy savings, and traffic reduction. In this paper, we propose a macro-programming framework called Nano-CF which, in addition to supporting in-network programming, allows multiple applications written by different programmers to be executed simultaneously on a sensor networking infrastructure. This framework enables the use of a common sensing infrastructure for a number of applications without the users having to worry about the applications already deployed on the network. The framework also supports timing constraints and resource reservations using the Nano-RK operating system. Nano-CF is efficient at improving WSN performance by (a) combining multiple user programs, (b) aggregating packets for data delivery, and (c) satisfying timing and energy specifications using Rate-Harmonized Scheduling. Using representative applications, we demonstrate that Nano-CF achieves a 90% reduction in Source Lines-of-Code (SLoC) and 50% energy savings from aggregated data delivery.
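The aggregated-delivery idea in point (b) could be sketched as below: payloads queued by independent applications on a node are concatenated into as few radio frames as fit under a maximum frame size, cutting per-packet radio overhead. The frame limit and byte-string payloads are illustrative assumptions, not Nano-CF's actual packet format.

```python
# Cross-application packet aggregation: pack queued payloads into frames no
# larger than an assumed maximum frame size, preserving payload order.
MAX_FRAME = 100  # bytes; illustrative radio frame limit

def aggregate(payloads, max_frame=MAX_FRAME):
    frames, current = [], []
    for p in payloads:
        if sum(len(x) for x in current) + len(p) > max_frame:
            frames.append(b"".join(current))  # flush the full frame
            current = []
        current.append(p)
    if current:
        frames.append(b"".join(current))
    return frames

pkts = [b"a" * 40, b"b" * 40, b"c" * 40]  # payloads from three applications
print(len(aggregate(pkts)))  # 2 frames sent instead of 3 transmissions
```

Since radio start-up and header costs dominate for small sensor payloads, reducing the number of transmissions in this way is where the reported energy savings come from.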

Relevance:

30.00%

Publisher:

Abstract:

Several projects in the recent past have aimed at promoting Wireless Sensor Networks as an infrastructure technology, where several independent users can submit applications that execute concurrently across the network. Concurrent multiple applications cause significant energy-usage overhead on sensor nodes, which cannot be eliminated by traditional schemes optimized for single-application scenarios. In this paper, we outline two main optimization techniques for reducing power consumption across applications. First, we describe a compiler-based approach that identifies redundant sensing requests across applications and eliminates them. Second, we cluster radio transmissions together by concatenating packets from independent applications based on Rate-Harmonized Scheduling.
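The harmonization idea behind clustering transmissions could be sketched as below: each application's period is rounded down to the nearest power-of-two multiple of the shortest period, so wake-ups coincide and packets from different applications can be sent together. This rounding rule is one common way to harmonize rates and is an assumption here; the exact Rate-Harmonized Scheduling policy used by the paper may differ.

```python
# Harmonize application periods: round each period down to the largest
# power-of-two multiple of the shortest period that does not exceed it,
# so that transmission instants align and packets can be batched.
def harmonize(periods):
    base = min(periods)
    harmonized = []
    for p in periods:
        h = base
        while h * 2 <= p:   # double until the next step would overshoot p
            h *= 2
        harmonized.append(h)
    return harmonized

# Three applications sampling every 10, 25 and 70 time units:
print(harmonize([10, 25, 70]))  # [10, 20, 40]
```

With periods 10, 20 and 40, every wake-up of the slower applications coincides with one of the fastest application, so the radio can serve several applications per activation instead of waking independently for each.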