946 results for Prototype Verification System


Relevance: 30.00%

Publisher:

Abstract:

Part 6: Engineering and Implementation of Collaborative Networks

Relevance: 30.00%

Publisher:

Abstract:

On-time completion is an important temporal QoS (Quality of Service) dimension and one of the fundamental requirements for high-confidence workflow systems. In recent years, the workflow temporal verification framework, which generally consists of temporal constraint setting, temporal checkpoint selection, temporal verification, and temporal violation handling, has been the major approach to assuring high temporal QoS in workflow systems. Within this framework, effective temporal checkpoint selection, which aims to detect intermediate temporal violations along workflow execution in a timely fashion, plays a critical role. Temporal checkpoint selection has therefore become a major topic and has attracted significant research effort. In this paper, we present an overview of workflow temporal checkpoint selection for temporal verification. Specifically, we first introduce the throughput-based and response-time-based temporal consistency models for business and scientific cloud workflow systems, respectively. Then the corresponding benchmark checkpoint selection strategies that satisfy the property of "necessity and sufficiency" are presented. We also provide experimental results to demonstrate the effectiveness of our checkpoint selection strategies, and finally point out possible future issues in this research area.
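As a rough illustration of the "necessity and sufficiency" property described above, the following sketch selects a checkpoint only at activities where the accumulated actual execution time has already exceeded the accumulated planned duration, so temporal verification is triggered exactly where a violation can exist and nowhere else. The threshold model and all names (`select_checkpoints`, `planned`, `actual`) are simplified assumptions for illustration, not the paper's exact strategy.

```python
# Minimal sketch of a response-time based checkpoint selection loop.
# An activity becomes a checkpoint only when the accumulated actual
# duration exceeds the accumulated planned duration, so checkpoints
# are selected only where a temporal violation can exist (necessity)
# and at every such point (sufficiency).

def select_checkpoints(planned, actual):
    """planned, actual: per-activity durations along one execution path."""
    checkpoints = []
    run_planned = run_actual = 0.0
    for i, (p, a) in enumerate(zip(planned, actual)):
        run_planned += p
        run_actual += a
        if run_actual > run_planned:   # temporal consistency lost here
            checkpoints.append(i)      # verify (and possibly handle) now
    return checkpoints

print(select_checkpoints([3, 3, 3, 3], [2.5, 4.0, 2.0, 3.6]))  # -> [1, 3]
```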

Relevance: 30.00%

Publisher:

Abstract:

Due to long-term drought conditions coupled with the apparent influence of global warming, compounding water loss has been a very serious issue across the vast majority of the Australian continent. Under these drought conditions, evaporative loss outweighs the precipitation received on a year-to-year basis. Several methods have been introduced in recent history to inhibit evaporative loss from various types of water bodies, such as the application of thin-layer chemical films (monolayers). A series of solvent-, solid- and suspension-derived prototype monolayers, based on ethylene glycol monooctadecyl ether (C18E1), are examined in this study as an approach to eliminating the problems observed with previous types of monolayers. This research evaluates the fundamental effect of wind- and wave-based activity upon these prototype monolayers in an atmospherically controlled enclosure positioned over a large extended water tank, using real-time environmental measurements. Selected performance results for the prototype monolayers, as measured within the enclosed water tank, were compared to results measured from a control monolayer film based on a commonly used octadecanol suspension film. The results show that under varying wind and wave conditions the prototype monolayers inhibit evaporation at a level similar to or better than the octadecanol standard, even when delivered at lower raw dosages.

Relevance: 30.00%

Publisher:

Abstract:

Cloud computing has established itself as the latest computing paradigm in recent years. As doing science in the cloud becomes a reality, scientists are now able to access public cloud centers and employ high-performance computing resources to run scientific applications. However, due to the dynamic nature of the cloud environment, the usability of scientific cloud workflow systems can deteriorate significantly without effective service-quality assurance strategies. Specifically, workflow temporal verification, as the major approach for workflow temporal QoS (Quality of Service) assurance, plays a critical role in the on-time completion of large-scale scientific workflows. Great effort has been dedicated to the area of workflow temporal verification in recent years, and it is time to define the key research issues for scientific cloud workflows in order to keep research on the right track. In this paper, we systematically investigate this problem and present four key research issues based on the introduction of a generic temporal verification framework. State-of-the-art solutions for each research issue and open challenges are also presented. Finally, SwinDeW-V, an ongoing research project on temporal verification as part of our SwinDeW-C cloud workflow system, is demonstrated.

Relevance: 30.00%

Publisher:

Abstract:

Workflow temporal verification is conducted to guarantee on-time completion, which is one of the most important QoS (Quality of Service) dimensions for business processes running in the cloud. However, as today's business systems often need to handle a large number of concurrent customer requests, conventional response-time-based process monitoring strategies, conducted in a one-by-one fashion, cannot be applied efficiently to a large batch of parallel processes because of significant time overhead. Similar situations may also exist in software companies where multiple software projects are carried out at the same time by software developers. To address this problem, based on a novel runtime throughput consistency model, this paper proposes a QoS-aware throughput-based checkpoint selection strategy, which can dynamically select a small number of checkpoints along the system timeline to facilitate the temporal verification of throughput constraints and achieve the target on-time completion rate. Experimental results demonstrate that our strategy achieves the best efficiency and effectiveness compared with the state-of-the-art and other representative response-time-based checkpoint selection strategies.
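A minimal sketch of the batch-oriented idea described above, under the assumption of a linear throughput target (the paper's actual runtime throughput consistency model is more elaborate): with N parallel instances due by a common deadline, a checkpoint is placed at the first inspection time where the cumulative completion count falls below the required rate, so a single verification covers the whole batch instead of per-instance response-time checks.

```python
# Sketch of a throughput-based checkpoint (assumptions, not the paper's
# exact model): with `total` parallel process instances due at time
# `deadline`, the required throughput is total / deadline instances per
# unit time.  A checkpoint is placed at the first inspection time where
# the completed count falls below this linear target.

def throughput_checkpoint(completed_at, total, deadline, times):
    """completed_at: dict time -> cumulative completed instances."""
    required_rate = total / deadline
    for t in times:
        if completed_at[t] < required_rate * t:   # batch is falling behind
            return t                              # verify the batch here
    return None                                   # no checkpoint needed

done = {10: 260, 20: 510, 30: 740}                # cumulative completions
print(throughput_checkpoint(done, total=1000, deadline=40,
                            times=[10, 20, 30]))  # -> 30
```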

Relevance: 30.00%

Publisher:

Abstract:

Computer simulation programs are essential tools for scientists and engineers to understand a particular system of interest. As expected, the complexity of the software increases with the depth of the model used. In addition to the exigent demands of software engineering, verification of simulation programs is especially challenging because the models represented are complex and riddled with unknowns that developers discover in an iterative process. To manage such complexity, advanced verification techniques for continually matching the intended model to the implemented model are necessary. Therefore, the main goal of this research work is to design a useful verification and validation framework that is able to identify model representation errors and is applicable to generic simulators. The framework that was developed and implemented consists of two parts. The first part is the First-Order Logic Constraint Specification Language (FOLCSL), which enables users to specify the invariants of a model under consideration. From the first-order logic specification, the FOLCSL translator automatically synthesizes a verification program that reads the event trace generated by a simulator and signals whether all invariants are respected. The second part consists of mining the temporal flow of events using a newly developed representation called the State Flow Temporal Analysis Graph (SFTAG). While the first part seeks an assurance of implementation correctness by checking that the model invariants hold, the second part derives an extended model of the implementation and hence enables a deeper understanding of what was implemented. The main application studied in this work is the validation of the timing behavior of micro-architecture simulators. The study includes SFTAGs generated for a wide set of benchmark programs and their analysis using several artificial intelligence algorithms. This work improves computer architecture research and verification processes, as shown by the case studies and experiments that have been conducted.
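The abstract does not show FOLCSL's syntax, so the following is only a hypothetical sketch of the kind of verification program the FOLCSL translator could synthesize: a universally quantified invariant compiled into a pass over the simulator's event trace that signals the first violating event. The event fields (`issue_time`, `commit_time`) are invented for illustration.

```python
# Hypothetical sketch of a synthesized trace checker: an invariant such
# as "forall e: e.commit_time >= e.issue_time" becomes a predicate that
# is evaluated against every event in the trace, flagging the first
# violation.

from typing import Callable, Iterable

def verify_invariant(trace: Iterable[dict],
                     invariant: Callable[[dict], bool]) -> bool:
    for i, event in enumerate(trace):
        if not invariant(event):
            print(f"invariant violated at event {i}: {event}")
            return False
    return True          # all invariants respected over the whole trace

trace = [{"op": "add", "issue_time": 3, "commit_time": 7},
         {"op": "mul", "issue_time": 5, "commit_time": 4}]  # violating event
verify_invariant(trace, lambda e: e["commit_time"] >= e["issue_time"])
```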

Relevance: 30.00%

Publisher:

Abstract:

Modern power networks incorporate communications and information technology infrastructure into the electrical power system to create a smart grid in terms of control and operation. The smart grid enables real-time communication and control between consumers and utility companies, allowing suppliers to optimize energy usage based on price preference and system technical issues. The smart grid design aims to provide overall power system monitoring and to create protection and control strategies that maintain system performance, stability, and security. This dissertation contributed to the development of a unique and novel smart grid test-bed laboratory with integrated monitoring, protection, and control systems. This test-bed was used as a platform to test the smart grid operational ideas developed here. The implementation of this system in real-time software creates an environment for studying, implementing, and verifying the novel control and protection schemes developed in this dissertation. Phasor measurement techniques were developed using the available Data Acquisition (DAQ) devices in order to monitor all points in the power system in real time. This provides a practical view of system parameter changes, abnormal system conditions, and the system's stability and security information. These developments provide valuable measurements for power system operators in energy control centers. Phasor measurement technology is an excellent solution for improving system planning, operation, and energy trading, in addition to enabling advanced applications in Wide Area Monitoring, Protection and Control (WAMPAC). Moreover, a virtual protection system with integrated functionality for wide-area applications was developed and implemented in the smart grid laboratory. Experiments and procedures were developed to detect abnormal system conditions and apply proper remedies to heal the system. A DC microgrid was designed and integrated into the AC system with appropriate control capability. This system represents realistic hybrid AC/DC microgrid connectivity to the AC side and was used to study how such an architecture can help remedy abnormal system conditions. In addition, this dissertation explored the challenges and feasibility of implementing real-time system analysis features in order to monitor system security and stability measures. These indices were measured experimentally during the operation of the developed hybrid AC/DC microgrids. Furthermore, a real-time optimal power flow system was implemented to optimally manage power sharing between AC generators and DC-side resources. A study of a real-time energy management algorithm in hybrid microgrids was performed to evaluate the effects of using energy storage resources and their use in mitigating heavy load impacts on system stability and operational security.
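As a concrete, hedged illustration of the phasor measurement idea, the sketch below uses the textbook single-cycle DFT estimate (not necessarily the exact algorithm implemented on the DAQ devices): the fundamental phasor of N samples covering one nominal cycle is X = (sqrt(2)/N) * sum of x[n]*exp(-j*2*pi*n/N), giving an RMS magnitude and a phase angle.

```python
# Single-cycle DFT phasor estimate commonly used by phasor measurement
# devices (a textbook method, assumed here for illustration).

import cmath
import math

def phasor(samples):
    N = len(samples)                      # samples covering one cycle
    X = (math.sqrt(2) / N) * sum(
        x * cmath.exp(-2j * math.pi * n / N)
        for n, x in enumerate(samples))
    return abs(X), math.degrees(cmath.phase(X))  # RMS magnitude, angle

N = 64
wave = [100 * math.cos(2 * math.pi * n / N + math.radians(30))
        for n in range(N)]
print(phasor(wave))  # ~ (70.71, 30.0): 100 V peak = 70.71 V RMS at +30 deg
```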

Relevance: 30.00%

Publisher:

Abstract:

Two key solutions to reduce greenhouse gas emissions and increase overall energy efficiency are to maximize the utilization of renewable energy resources (RERs) to generate energy for load consumption and to shift to low- or zero-emission plug-in electric vehicles (PEVs) for transportation. The present U.S. aging and overburdened power grid infrastructure is under tremendous pressure to handle the issues involved in the penetration of RERs and PEVs. The future power grid should be designed for the effective utilization of distributed RERs and distributed generation, intelligently responding to varying customer demand, including PEVs, with a high level of security, stability, and reliability. This dissertation develops and verifies such a hybrid AC-DC power system. The system operates in a distributed manner, incorporates multiple components in both AC and DC styles, and works in both grid-connected and islanding modes. The verification was performed on a laboratory-based hybrid AC-DC power system testbed serving as a hardware/software platform. In this system, RER emulators, together with their maximum power point tracking technology and power electronics converters, were designed to test different energy harvesting algorithms. Energy storage devices, including lithium-ion batteries and ultra-capacitors, were used to optimize the performance of the hybrid power system. A lithium-ion battery smart energy management system with thermal and state-of-charge self-balancing was proposed to protect the energy storage system. A grid-connected DC PEV parking-garage emulator with five lithium-ion batteries was also designed, with smart charging functions that can emulate future vehicle-to-grid (V2G), vehicle-to-vehicle (V2V), and vehicle-to-house (V2H) services, including grid voltage and frequency regulation, spinning reserve, microgrid islanding detection, and energy resource support. The results show successful integration of the developed techniques for the control and energy management of future hybrid AC-DC power systems with high penetration of RERs and PEVs.
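The abstract does not detail the state-of-charge self-balancing scheme, so the following is only one plausible reading, labeled as an assumption: a total charging power budget is split across battery units in proportion to each unit's SOC deficit, so lagging units receive more power and the pack converges toward a common state of charge.

```python
# Assumed (not the dissertation's) SOC self-balancing rule: allocate a
# shared charging budget in proportion to each battery's SOC deficit.

def allocate_charge(socs, p_total):
    """socs: per-battery state of charge in [0, 1]; p_total: power budget."""
    deficits = [1.0 - s for s in socs]
    total = sum(deficits)
    if total == 0:                    # all units full: nothing to allocate
        return [0.0] * len(socs)
    return [p_total * d / total for d in deficits]

print(allocate_charge([0.9, 0.6, 0.5], p_total=10.0))  # -> [1.0, 4.0, 5.0]
```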

Relevance: 30.00%

Publisher:

Abstract:

Modern System-on-a-Chip (SoC) systems have grown rapidly in processing power while maintaining the size of the hardware circuit. The number of transistors on a chip continues to increase, but current SoC designs may not be able to exploit the potential performance, especially with energy consumption and chip area becoming two major concerns. Traditional SoC designs usually separate software and hardware, so improving system performance is a complicated task for both software and hardware designers. The aim of this research is to develop a hardware acceleration workflow for software applications, so that system performance can be improved within constraints on energy consumption and on-chip resource costs. The characteristics of software applications can be identified using profiling tools. Hardware acceleration can yield significant performance improvements for mathematically intensive calculations or frequently repeated functions, so the performance of SoC systems can be improved if hardware acceleration is applied to the element that incurs the performance overhead. The concepts presented in this study can be readily applied to a variety of sophisticated software applications. The contributions of SoC-based hardware acceleration in the hardware-software co-design platform include the following: (1) Software profiling methods are applied to an H.264 Coder-Decoder (CODEC) core. The hotspot function of the target application is identified using critical attributes such as cycles per loop and loop rounds. (2) A hardware acceleration method based on a Field-Programmable Gate Array (FPGA) is used to resolve system bottlenecks and improve system performance. The identified hotspot function is converted to a hardware accelerator and mapped onto the hardware platform. Two types of hardware acceleration methods, a central bus design and a co-processor design, are implemented for comparison in the proposed architecture. (3) System specifications, such as performance, energy consumption, and resource costs, are measured and analyzed, and the trade-off among these three factors is compared and balanced. Different hardware accelerators are implemented and evaluated based on system requirements. (4) The system verification platform is designed based on an Integrated Circuit (IC) workflow, with hardware optimization techniques used for higher performance and lower resource costs. Experimental results show that the proposed hardware acceleration workflow for software applications is an efficient technique: the system reaches a 2.8X performance improvement and saves 31.84% of energy consumption with the central-bus (Bus-IP) design, while the co-processor design achieves 7.9X performance and saves 75.85% of energy consumption.
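The reported system-level speedups can be sanity-checked with Amdahl's law, which relates the hotspot's share of runtime to the achievable overall improvement. The figures below are illustrative, not taken from the study: if profiling shows the hotspot consumes fraction f of total runtime and the FPGA accelerator speeds it up by factor s, the whole-system speedup is 1 / ((1 - f) + f/s).

```python
# Amdahl's-law sanity check for hotspot acceleration (illustrative
# numbers, not the study's measurements).

def system_speedup(f, s):
    """f: hotspot fraction of total runtime; s: accelerator speedup."""
    return 1.0 / ((1.0 - f) + f / s)

# A hotspot consuming 70% of runtime accelerated 20x gives ~3X overall,
# so a 7.9X result (as for the co-processor design) implies an even
# larger hotspot fraction or a faster accelerator.
print(round(system_speedup(0.70, 20.0), 2))   # -> 2.99
print(round(system_speedup(0.90, 50.0), 2))   # -> 8.47
```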

Relevance: 30.00%

Publisher:

Abstract:

A prototype for a Responsive Fisheries Management System (RFMS) was developed in the context of the European FP7 project EcoFishMan and tested on the Portuguese crustacean trawl fishery. Building on Results-Based Management principles, RFMS involves the definition of specific and measurable objectives for a fishery by the relevant authorities, but allows resource users the freedom to find ways to achieve the objectives and to provide adequate documentation. Taking into account the main goals of the new Common Fisheries Policy, such as sustainable utilization of resources and an end to discards and unwanted catches, a management plan for the Portuguese crustacean trawl fishery was developed in cooperation with the fishing industry, following the process and design laid out in the RFMS concept. The plan considers biological, social, and economic goals and assigns responsibility for increased data collection to the resource users. The performance of the plan with regard to selected indicators was evaluated through simulations. In this paper the process towards an RFMS is described, and the lessons learnt from the interaction with stakeholders in the development of an alternative management plan are discussed.

Relevance: 30.00%

Publisher:

Abstract:

Fluvial sediment transport is controlled by hydraulics, sediment properties and arrangement, and flow history across a range of time scales. This physical complexity has led to ambiguous definitions of the reference frame (Lagrangian or Eulerian) in which sediment transport is analysed. A general Eulerian-Lagrangian approach accounts for the inertial characteristics of particles in a Lagrangian (particle-fixed) frame, and for the hydrodynamics in an independent Eulerian frame. The necessary Eulerian-Lagrangian transformations are simplified under the assumption of an ideal Inertial Measurement Unit (IMU), rigidly attached at the centre of mass of a sediment particle. Real, commercially available IMU sensors can provide high-frequency data on the accelerations and angular velocities (hence forces and energy) experienced by grains during entrainment and motion, if adequately customized. IMUs are subject to significant error accumulation, but they can be used for statistical parametrisation of an Eulerian-Lagrangian model, for coarse sediment particles and over the temporal scale of individual entrainment events. In this thesis an Eulerian-Lagrangian model is introduced and evaluated experimentally. Absolute inertial accelerations were recorded at a 4 Hz frequency from a spherical instrumented particle (111 mm diameter, 2383 kg/m3 density) in a series of entrainment threshold experiments on a fixed idealised bed. The grain-top inertial acceleration entrainment threshold was approximately 44 and 51 mg for slopes of 0.026 and 0.037, respectively. The saddle inertial acceleration entrainment threshold was 32 and 25 mg for slopes of 0.044 and 0.057, respectively. For the evaluation of the complete Eulerian-Lagrangian model, two prototype sensors are presented: an idealised spherical sensor with a diameter of 90 mm, and an ellipsoidal sensor with axes of 100, 70 and 30 mm. Both are instrumented with a complete IMU, capable of sampling 3D inertial accelerations and 3D angular velocities at 50 Hz. After signal analysis, the results can be used to parametrize sediment movement, but they do not contain positional information. The two sensors were tested in a series of entrainment experiments, similar to the evaluation of the 111 mm prototype, for a slope of 0.02. The spherical sensor entrained at discharges of 24.8 ± 1.8 l/s, while the same threshold for the ellipsoidal sensor was 45.2 ± 2.2 l/s. Kinetic energy calculations were used to quantify the particle-bed energy exchange under fluvial (discharge at 30 l/s) and non-fluvial conditions. All the experiments suggest that the effect of the inertial characteristics of coarse sediments on their motion is comparable to the effect of hydrodynamic forces. The coupling of IMU sensors with advanced telemetric systems can lead to the tracking of Lagrangian particle trajectories at a frequency and accuracy that will permit the testing of diffusion/dispersion models across the range of particle diameters.
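A minimal sketch of the threshold test implied by the mg-valued entrainment thresholds above (assumed post-processing, not the thesis code): an entrainment event is flagged when the magnitude of the residual inertial acceleration, converted to milli-g, exceeds the calibrated threshold, e.g. 44 mg for the grain-top position on the 0.026 slope.

```python
# Assumed entrainment test: compare the magnitude of the IMU's residual
# inertial acceleration, expressed in milli-g, against a calibrated
# threshold from the entrainment experiments.

import math

G = 9.81  # gravitational acceleration, m/s^2

def entrained(accel_xyz, threshold_mg):
    """accel_xyz: (ax, ay, az) residual inertial acceleration in m/s^2."""
    magnitude_mg = math.sqrt(sum(a * a for a in accel_xyz)) / G * 1000.0
    return magnitude_mg > threshold_mg

print(entrained((0.30, 0.20, 0.25), threshold_mg=44))  # ~44.7 mg -> True
```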

Relevance: 30.00%

Publisher:

Abstract:

The production of olive oil generates several by-products that can be seen as an additional business opportunity. Among them are olive pits, already used for heat and/or electricity generation in some mills. They contain compounds that are commercially very interesting and whose recovery would contribute to the sustainability of the olive mills. The work presented in this paper is a preliminary evaluation of the economic feasibility of implementing a system, based on a 1 m³ batch prototype, for the extraction of high value-added bioactive molecules from the olive pits that are separated during the production of virgin olive oil. For the analysis, a small representative olive mill in Portugal was considered and the traditional Discounted Cash Flow method was applied. Based on the assumptions made, the simple payback for implementing a system for the extraction of value-added molecules from olive pits is around 7 years.
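To make the payback reasoning concrete, here is an illustrative calculation in the spirit of the Discounted Cash Flow method used in the paper; the investment and cash-flow figures are invented, not the paper's data. Simple payback is the first year in which cumulative net cash flow recovers the initial investment, while NPV discounts each year's flow at a chosen rate.

```python
# Illustrative payback and NPV calculation (invented figures, not the
# paper's data).

def simple_payback(investment, annual_cash_flows):
    """First year in which cumulative net cash flow recovers the investment."""
    cumulative = 0.0
    for year, cf in enumerate(annual_cash_flows, start=1):
        cumulative += cf
        if cumulative >= investment:
            return year
    return None  # not recovered within the horizon

def npv(rate, investment, annual_cash_flows):
    """Net present value of the project at the given discount rate."""
    return -investment + sum(
        cf / (1 + rate) ** t
        for t, cf in enumerate(annual_cash_flows, start=1))

flows = [15_000] * 10                    # hypothetical yearly net revenue
print(simple_payback(100_000, flows))    # -> 7 (consistent with ~7 years)
print(round(npv(0.05, 100_000, flows)))  # -> 15826 at a 5% discount rate
```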