275 results for Electricity generation


Relevance:

20.00%

Publisher:

Abstract:

This paper presents a model for the generation of a MAC tag using a stream cipher. The input message is used indirectly to control segments of the keystream that form the MAC tag. Several recent proposals can be considered as instances of this general model, as they all perform message accumulation in this way. However, they use slightly different processes in the message preparation and finalisation phases. We examine the security of this model for different options and against different types of attack, and conclude that the indirect injection model can be used to generate MAC tags securely for certain combinations of options. Careful consideration is required at the design stage to avoid combinations of options that result in susceptibility to forgery attacks. Additionally, some implementations may be vulnerable to side-channel attacks if used in Authenticated Encryption (AE) algorithms. We give design recommendations to provide resistance to these attacks for proposals following this model.
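The indirect-injection idea can be illustrated with a deliberately simplified sketch (all names are illustrative, and the hash-counter keystream is a stand-in for a real stream cipher): each message bit selects one of two precomputed keystream segments, which are XOR-accumulated into the tag, so the message never enters the keystream generator directly.

```python
import hashlib

def keystream(key: bytes, nbytes: int) -> bytes:
    # Stand-in keystream generator (hash in counter mode); a real design
    # would use a dedicated stream cipher.
    out = b""
    ctr = 0
    while len(out) < nbytes:
        out += hashlib.sha256(key + ctr.to_bytes(4, "big")).digest()
        ctr += 1
    return out[:nbytes]

def indirect_mac(key: bytes, message: bytes, tag_len: int = 8) -> bytes:
    # Indirect injection: message bit i selects one of two keystream
    # segments (2*i or 2*i+1), XOR-accumulated into the tag.
    nbits = len(message) * 8
    ks = keystream(key, 2 * nbits * tag_len)
    tag = bytearray(tag_len)
    for i in range(nbits):
        bit = (message[i // 8] >> (7 - i % 8)) & 1
        seg = ks[(2 * i + bit) * tag_len:(2 * i + bit + 1) * tag_len]
        for j in range(tag_len):
            tag[j] ^= seg[j]
    return bytes(tag)
```

Note that this toy accumulator is linear in the message bits and therefore forgeable, which is precisely the kind of weak option combination the abstract warns must be avoided at the design stage (via the message preparation and finalisation phases).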

Relevance:

20.00%

Publisher:

Abstract:

The occurrence of extreme movements in the spot price of electricity represents a significant source of risk to retailers. A range of models has been proposed for electricity prices; these, however, rely on time-series approaches that typically use restrictive decay schemes placing greater weight on more recent observations. This study develops an alternative, semi-parametric forecasting method that uses state-dependent weights derived from a kernel function. The forecasts obtained with this method are accurate and therefore potentially useful to electricity retailers for risk management.
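The contrast with recency-based decay can be seen in a minimal Nadaraya-Watson-style sketch (not the authors' estimator; the function name and one-dimensional state are assumptions): each past observation is weighted by how similar its state was to the current state, rather than by how recent it is.

```python
import numpy as np

def kernel_forecast(history, current_state, bandwidth=1.0):
    # Semi-parametric one-step forecast with state-dependent weights:
    # past steps whose state resembles the current one (Gaussian kernel)
    # dominate, regardless of how long ago they occurred.
    states = history[:-1]      # state observed at each past step
    next_vals = history[1:]    # value that followed that state
    w = np.exp(-0.5 * ((states - current_state) / bandwidth) ** 2)
    return float(np.sum(w * next_vals) / np.sum(w))
```

With a small bandwidth, a current state of 1 in an alternating series 1, 2, 1, 2, 1 yields a forecast near 2, because the weight concentrates on past steps whose state was also 1.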

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a series of operating schedules for Battery Energy Storage Companies (BESC) to provide peak shaving and spinning reserve services in electricity markets under increasing wind penetration. As individual market participants, BESC can bid in the ancillary services markets run by an Independent System Operator (ISO) and contribute towards frequency and voltage support in the grid. Recent developments in battery technologies and the availability of day-ahead spot market prices make BESC economically feasible. Profit maximization of BESC is achieved by determining the optimum capacity of Energy Storage Systems (ESS) required for meeting spinning reserve requirements as well as peak shaving. Historic spot market prices and frequency deviations from the Australian Energy Market Operator (AEMO) are used for numerical simulations, and the economic benefits of BESC are assessed across various aspects of Australia's National Electricity Market (NEM).

Relevance:

20.00%

Publisher:

Abstract:

Media education has been included as a mandatory component of the Arts within the new Australian national curriculum, which purports to set out a framework encompassing the core knowledge, understanding and skills critical to twenty-first century learning. This will position Australia as the only country to require media education as a compulsory aspect of Arts education, and one of the first to implement a sequenced national media education curriculum from pre-school to year 12. A broad framework has been outlined for what the Media Arts curriculum will encompass, and in this article we investigate the extent to which this framework is likely to provide media educators with the opportunity to broaden the scope of established media education and to effectively educate students about the ever-changing nature of media ecologies. The article outlines significant shifts occurring in the film and television industries to identify the types of knowledge students may need in order to understand these changes. This is followed by an analysis of existing state-based media curricula offered at years 11 and 12 in Australia, which demonstrates that the concepts of institutions and audiences are not currently approached in ways that reflect contemporary media ecologies.

Relevance:

20.00%

Publisher:

Abstract:

Integration of small-scale electricity generators, known as Distributed Generation (DG), into distribution networks has become increasingly popular. This tendency, together with the falling price of synchronous-type generators, gives DG a better chance of participating in the voltage regulation process alongside other devices already available in the system. The voltage control issue is a very challenging problem for distribution engineers, since existing control coordination schemes need to be reconsidered to take DG operation into account. In this paper, we propose a control coordination technique that utilizes the ability of DG to act as a voltage regulator while minimizing interaction with other active devices, such as On-load Tap Changing Transformers (OLTC) and voltage regulators. The technique has been developed based on the concepts of control zones and Line Drop Compensation (LDC), as well as the choice of controller parameters. Simulations carried out on an Australian system show that the technique is suitable and flexible for any system with multiple regulating devices, including DG.
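The Line Drop Compensation idea can be reduced to a toy per-unit calculation (an illustrative sketch with hypothetical names, not the proposed coordination scheme itself): the regulator estimates the voltage at a remote load centre from its local voltage and the measured line current, and regulates that estimate rather than the local bus voltage.

```python
def ldc_remote_voltage(v_local, i_line, r_set, x_set):
    # Line Drop Compensation: estimate the remote load-centre voltage by
    # subtracting the modelled line drop (compensator settings
    # R_set + jX_set) from the locally measured voltage (per-unit phasors).
    return v_local - i_line * complex(r_set, x_set)
```

For example, a local voltage of 1.05 pu with 0.5 pu of line current through compensator settings R = 0.02, X = 0.04 yields an estimated remote voltage of 1.04 - 0.02j pu.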

Relevance:

20.00%

Publisher:

Abstract:

A large number of methods have been published that aim to evaluate various components of multi-view geometry systems. Most of these have focused on the feature extraction, description and matching stages (the visual front end), since geometry computation can be evaluated through simulation. Many data sets are constrained to small-scale or planar scenes that are not challenging to new algorithms, or require special equipment. This paper presents a method for automatically generating geometry ground truth and challenging test cases from high spatio-temporal resolution video. The objective of the system is to enable data collection at any physical scale, in any location and in various parts of the electromagnetic spectrum. The data generation process consists of collecting high-resolution video, computing an accurate sparse 3D reconstruction, culling and downsampling video frames, and selecting test cases. The evaluation process consists of applying a test two-view geometry method to every test case and comparing the results to the ground truth. This system facilitates the evaluation of the whole geometry computation process, or any part thereof, against data compatible with a realistic application. A collection of example data sets and evaluations is included to demonstrate the range of applications of the proposed system.
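One standard way to score a test two-view geometry method against ground truth is the epipolar constraint residual; a minimal sketch (the fundamental matrix and points below are illustrative, not drawn from the paper's data sets):

```python
import numpy as np

def epipolar_residual(F, x1, x2):
    # For a correct fundamental matrix F and a true correspondence
    # (x1, x2), x2^T F x1 = 0; a non-zero residual measures how badly an
    # estimated F violates that constraint.
    x1h = np.append(np.asarray(x1, dtype=float), 1.0)  # homogeneous coords
    x2h = np.append(np.asarray(x2, dtype=float), 1.0)
    return float(x2h @ F @ x1h)
```

For a camera translating purely along the x-axis, true correspondences share a y-coordinate and give a zero residual, while a mismatched pair does not.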

Relevance:

20.00%

Publisher:

Abstract:

High-resolution, high-contrast, three-dimensional images of live cell and tissue architecture can be obtained using second harmonic generation (SHG), which comprises non-absorptive frequency changes in an excitation laser line. SHG does not require any exogenous antibody or fluorophore labeling, and can generate images from unstained sections of several key endogenous biomolecules, in a wide variety of species and from different types of processed tissue. Here, we examined normal control human skin sections and human burn scar tissues using SHG on a multi-photon microscope (MPM). Examination and comparison of normal human skin and burn scar tissue demonstrated a clear arrangement of fibers in the dermis, similar to dermal collagen fiber signals. Fluorescence-staining confirmed the MPM-SHG collagen colocalization with antibody staining for dermal collagen type-I but not fibronectin or elastin. Furthermore, we were able to detect collagen MPM-SHG signal in human frozen sections as well as in unstained paraffin embedded tissue sections that were then compared with hematoxylin and eosin staining in the identical sections. This same approach was also successful in localizing collagen in porcine and ovine skin samples, and may be particularly important when species-specific antibodies may not be available. Collectively, our results demonstrate that MPM SHG-detection is a useful tool for high resolution examination of collagen architecture in both normal and wounded human, porcine and ovine dermal tissue.

Relevance:

20.00%

Publisher:

Abstract:

The use of Mahalanobis squared distance–based novelty detection in statistical damage identification has become increasingly popular in recent years. The merit of the Mahalanobis squared distance–based method is that it is simple and requires little computational effort, enabling the use of a higher-dimensional damage-sensitive feature, which is generally more sensitive to structural changes. Mahalanobis squared distance–based damage identification is also believed to be one of the most suitable methods for modern sensing systems such as wireless sensors. Although possessing such advantages, this method is rather strict in its input requirements, as it assumes the training data to be multivariate normal; such data are not always available, particularly at an early monitoring stage. As a consequence, it may produce an ill-conditioned training model with erroneous novelty detection and damage identification outcomes. To date, there appears to be no study on how to systematically cope with such practical issues, especially in the context of a statistical damage identification problem. To address this need, this article proposes a controlled data generation scheme based on the Monte Carlo simulation methodology, with the addition of several controlling and evaluation tools to assess the condition of the output data. By evaluating the convergence of the data condition indices, the proposed scheme is able to determine the optimal setup for the data generation process and subsequently avoid unnecessarily excessive data. The efficacy of the scheme is demonstrated via application to data from a benchmark structure in the field.
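The quantity at the core of the method is simple to compute (a minimal sketch; the ill-conditioning the article targets arises when the covariance below is estimated from scarce or non-normal training data):

```python
import numpy as np

def mahalanobis_sq(X_train, x):
    # Squared Mahalanobis distance of feature vector x from the baseline
    # (training) distribution; unusually large values flag novelty/damage.
    mu = X_train.mean(axis=0)
    cov = np.cov(X_train, rowvar=False)  # ill-conditioned if data are scarce
    d = np.asarray(x, dtype=float) - mu
    return float(d @ np.linalg.inv(cov) @ d)
```

The distance is zero at the training mean and grows in units of standard deviations along each principal direction, which is what makes it usable as a one-dimensional novelty index over a high-dimensional feature.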

Relevance:

20.00%

Publisher:

Abstract:

Agent-based modelling (ABM), like other modelling techniques, is used to answer specific questions about real-world systems that could otherwise be expensive or impractical to study. Its recent gain in popularity can be attributed to some degree to its capacity to use information at a fine level of detail of the system, both geographically and temporally, and to generate information at a higher level, where emerging patterns can be observed. The technique is data-intensive, as explicit data at a fine level of detail is used, and computer-intensive, as many interactions between agents, which can learn and have a goal, are required. With the growing availability of data and the increase in computer power, these concerns are, however, fading. Nonetheless, being able to update or extend the model as more information becomes available can be problematic because of the tight coupling of the agents and their dependence on the data, especially when modelling very large systems. One large system to which ABM is currently applied is electricity distribution, where thousands of agents representing the network and the consumers' behaviours interact with one another. A framework that aims at answering a range of questions regarding the potential evolution of the grid has been developed and is presented here. It uses agent-based modelling to represent the engineering infrastructure of the distribution network and has been built with flexibility and extensibility in mind. What distinguishes the method presented here from usual ABMs is that it has been developed in a compositional manner. This encompasses not only the software tool, whose core is named MODAM (MODular Agent-based Model), but the model itself. Such an approach enables the model to be extended as more information becomes available, or modified as the electricity system evolves, leading to an adaptable model.
Two well-known modularity principles in the software engineering domain are information hiding and separation of concerns. These principles were used to develop the agent-based model on top of OSGi and Eclipse plugins, which have good support for modularity. Information regarding the model entities was separated into a) assets, which describe the entities' physical characteristics, and b) agents, which describe their behaviour according to their goal and previous learning experiences. This diverges from the traditional approach, where both aspects are often conflated. It has many advantages in terms of reusability of one or the other aspect for different purposes, as well as composability when building simulations. For example, the way an asset is used on a network can vary greatly while its physical characteristics remain the same; this is the case for two identical battery systems whose usage will vary depending on the purpose of their installation. While any battery can be described by its physical properties (e.g. capacity, lifetime, and depth of discharge), its behaviour will vary depending on who is using it and what their aim is. The model is populated using data describing both aspects (physical characteristics and behaviour) and can be updated as required depending on what simulation is to be run. For example, data can be used to describe the environment to which the agents respond (e.g. weather for solar panels), or to describe the assets and their relation to one another (e.g. the network assets). Finally, when running a simulation, MODAM calls on its module manager, which coordinates the different plugins, automates the creation of the assets and agents using factories, and schedules their execution, which can be done sequentially or in parallel for faster execution. Building agent-based models in this way has proven fast when adding new complex behaviours, as well as new types of assets.
Simulations have been run to understand the potential impact of changes on the network in terms of assets (e.g. installation of decentralised generators) or behaviours (e.g. responses to different management aims). While the platform has been developed within the context of a project focussing on the electricity domain, the core of the software, MODAM, can be extended to other domains such as transport, which is part of future work with the addition of electric vehicles.
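The asset/agent separation described above can be sketched roughly as follows (class and field names are illustrative, not MODAM's actual API): the asset carries only physical characteristics, while the agent supplies behaviour, so the same battery asset could be driven by a peak-shaving agent or, in another simulation, by a price-arbitrage agent.

```python
from dataclasses import dataclass

@dataclass
class BatteryAsset:
    # Physical characteristics only: reusable across simulations.
    capacity_kwh: float
    depth_of_discharge: float  # usable fraction of capacity
    charge_kwh: float = 0.0

    def usable_capacity(self):
        return self.capacity_kwh * self.depth_of_discharge

class PeakShavingAgent:
    # Behaviour only: one possible goal-driven use of the asset.
    def __init__(self, asset, peak_threshold_kw):
        self.asset = asset
        self.peak_threshold_kw = peak_threshold_kw

    def step(self, demand_kw):
        # Discharge only when demand exceeds the threshold (assuming
        # 1-hour steps so kW and kWh coincide); returns net demand.
        if demand_kw > self.peak_threshold_kw:
            discharge = min(demand_kw - self.peak_threshold_kw,
                            self.asset.charge_kwh)
            self.asset.charge_kwh -= discharge
            return demand_kw - discharge
        return demand_kw
```

Because the two concerns are separated, swapping the agent changes how the network evolves in a simulation without touching the asset data, which is the reusability and composability benefit described above.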

Relevance:

20.00%

Publisher:

Abstract:

Electric Energy Storage (EES) is considered one of the promising options for reducing the need for costly upgrades in distribution networks in Queensland (QLD). However, it is expected that the full potential of storage for distribution upgrade deferral cannot be realized due to the high cost of EES. On the other hand, EES used for distribution deferral can support a variety of complementary storage applications, such as energy price arbitrage, time-of-use (TOU) energy cost reduction, wholesale electricity market ancillary services, and transmission upgrade deferral. Aggregating the benefits of these complementary applications has the potential to increase the amount of EES that may be financially attractive for deferring distribution network augmentation in QLD. In this context, this paper analyzes the distribution upgrade deferral, energy price arbitrage, TOU energy cost reduction, and integrated solar PV-storage benefits of EES devices in QLD.
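One of the complementary value streams, energy price arbitrage, can be approximated very crudely as follows (an illustrative upper-bound sketch, not the paper's valuation model; it assumes one full cycle per day and ignores network charges and degradation):

```python
def daily_arbitrage_value(prices, capacity_kwh, round_trip_eff=0.9):
    # Buy one full charge at the day's cheapest price and sell it at the
    # most expensive one, discounting sold energy by round-trip efficiency.
    buy = min(prices)
    sell = max(prices)
    return capacity_kwh * (sell * round_trip_eff - buy)
```

Stacking this with deferral, TOU and ancillary-service benefits, as the paper does, requires care that the same stored energy is not counted toward two applications at once.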

Relevance:

20.00%

Publisher:

Abstract:

In this study, the mixed convection heat transfer and fluid flow behaviours in a lid-driven square cavity filled with a high Prandtl number fluid (Pr = 5400, ν = 1.2×10^-4 m^2/s, where ν is the kinematic viscosity of the fluid) at low Reynolds number are studied using the thermal Lattice Boltzmann method (TLBM). The LBM is built on the D2Q9 model and the single-relaxation-time method known as the Lattice-BGK (Bhatnagar-Gross-Krook) model. The effects of variations of the non-dimensional mixed convection parameter, the Richardson number (Ri), with and without a heat-generating source, on the thermal and flow behaviour of the fluid inside the cavity are investigated. The results are presented as velocity and temperature profiles, as well as stream function and temperature contours, for Ri ranging from 0.1 to 5.0, with the other controlling parameters as presented in this study. It is found that the LBM has good potential to simulate mixed convection heat transfer and fluid flow problems. Finally, the simulation results have been compared with previous numerical and experimental results and found to be in good agreement.
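The D2Q9/BGK machinery can be sketched in a few lines (isothermal collision step only, without streaming, boundaries or the thermal distribution; the weights and velocities are the standard D2Q9 values):

```python
import numpy as np

# Standard D2Q9 lattice: 9 discrete velocities and their weights.
W = np.array([4/9] + [1/9] * 4 + [1/36] * 4)
C = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])

def equilibrium(rho, u):
    # Standard D2Q9 equilibrium distribution for density rho, velocity u.
    cu = C @ u
    usq = u @ u
    return rho * W * (1 + 3 * cu + 4.5 * cu**2 - 1.5 * usq)

def bgk_collide(f, tau):
    # Single-relaxation-time (BGK) collision: relax the distribution f
    # toward its local equilibrium with relaxation time tau.
    rho = f.sum()
    u = (f @ C) / rho
    return f - (f - equilibrium(rho, u)) / tau
```

Two sanity checks characterise the scheme: an equilibrium distribution is a fixed point of the collision, and collision conserves mass (the zeroth moment) exactly.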

Relevance:

20.00%

Publisher:

Abstract:

This work presents a demand side response (DSR) model which assists small electricity consumers, through an aggregator, who are exposed to the market price, to proactively mitigate price and peak impacts on the electrical system. The proposed model allows consumers to manage air-conditioning as a function of possible price spikes. The main contribution of this research is to demonstrate how consumers can minimise the total expected cost by optimising air-conditioning to account for occurrences of a price spike in the electricity market. The model investigates how a pre-cooling method can be used to minimise energy costs when there is a substantial risk of an electricity price spike. The model was tested with Queensland electricity market data from the Australian Energy Market Operator and Brisbane temperature data from the Bureau of Statistics, during hot weekdays in the period 2011 to 2012.
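The expected-cost trade-off behind pre-cooling can be reduced to a back-of-the-envelope decision rule (an illustrative sketch, not the paper's optimisation model; every parameter name is hypothetical):

```python
def should_precool(p_spike, spike_price, normal_price,
                   avoided_kwh, precool_kwh):
    # Pre-cool when the certain cost of buying extra energy now, at the
    # normal price, is below the expected cost of the energy that
    # pre-cooling would avoid buying during a possible price spike.
    precool_cost = precool_kwh * normal_price
    expected_spike_cost = p_spike * spike_price * avoided_kwh
    return precool_cost < expected_spike_cost
```

Because Australian spot prices can spike by orders of magnitude above typical levels, even modest spike probabilities can justify buying extra cooling energy in advance, which is the intuition the model formalises.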

Relevance:

20.00%

Publisher:

Abstract:

Distributed generation (DG) resources are commonly used in electric power systems, and minimizing line losses in radial distribution systems is one of the benefits of DG. Studies have shown the importance of appropriate selection of the location and size of DG units. This paper proposes an analytical method for solving the optimal distributed generation placement (ODGP) problem to minimize line losses in radial distribution systems, using a loss sensitivity factor (LSF) based on the bus-injection to branch-current (BIBC) matrix. The proposed method is formulated and tested on 12-bus and 34-bus radial distribution systems. A classical grid search algorithm based on successive load flows is employed to validate the results. The main advantages of the proposed method, compared with other conventional methods, are its robustness and the fact that it does not require calculating or inverting large admittance or Jacobian matrices. Therefore, the simulation time and the amount of computer memory required for processing data, especially for large systems, decrease.
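The BIBC-based loss computation can be sketched for a toy two-bus feeder (an illustrative current-based simplification of the full power-flow formulation; function names are assumptions):

```python
import numpy as np

def branch_losses(bibc, inj, r):
    # BIBC maps bus injection currents to branch currents:
    # I_branch = BIBC @ I_inj; losses are sum of r * I^2 per branch.
    i_br = bibc @ inj
    return float(np.sum(r * i_br**2))

def loss_sensitivity(bibc, inj, r):
    # dLoss/dI_k = 2 * sum_b r_b * I_b * BIBC[b, k]; the bus with the
    # largest sensitivity is the best DG candidate, since DG injection
    # offsets load current there.
    i_br = bibc @ inj
    return 2.0 * (r * i_br) @ bibc
```

In a two-bus radial feeder where branch 1 carries both loads and branch 2 carries only bus 2's load, the sensitivity is larger at bus 2, matching the intuition that DG placed farther down the feeder relieves more of the line.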

Relevance:

20.00%

Publisher:

Abstract:

Synthetic hydrogels selectively decorated with cell adhesion motifs are rapidly emerging as promising substrates for 3D cell culture. When cells are grown in 3D they experience potentially more physiologically relevant cell-cell interactions and physical cues compared with traditional 2D cell culture on stiff surfaces. A newly developed polymer based on poly(2-oxazoline)s has been used for the first time to control attachment of fibroblast cells and is discussed here for its potential use in 3D cell culture with particular focus on cancer cells towards the ultimate aim of high throughput screening of anti-cancer therapies. Advantages and limitations of using poly(2-oxazoline) hydrogels are discussed and compared with more established polymers, especially polyethylene glycol (PEG).

Relevance:

20.00%

Publisher:

Abstract:

Next Generation Sequencing (NGS) has revolutionised molecular biology, resulting in an explosion of data sets and an increasing role in clinical practice. Such applications necessarily require rapid identification of the organism as a prelude to annotation and further analysis. NGS data consist of a substantial number of short sequence reads, given context through downstream assembly and annotation, a process requiring reads consistent with the assumed species or species group. Highly accurate results have been obtained for restricted sets using SVM classifiers, but such methods are difficult to parallelise and success depends on careful attention to feature selection. This work examines the problem at very large scale, using a mix of synthetic and real data with a view to determining the overall structure of the problem and the effectiveness of parallel ensembles of simpler classifiers (principally random forests) in addressing the challenges of large scale genomics.
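A minimal sketch of the kind of fixed-length features that let simple, parallelisable classifiers such as random forests operate on short reads (illustrative only; real pipelines use much larger k and trained forests over millions of reads):

```python
from collections import Counter
from itertools import product

def kmer_features(read, k=2, alphabet="ACGT"):
    # Count every length-k substring of the read and lay the counts out
    # in a fixed alphabet order, giving each read a vector of length
    # len(alphabet)**k regardless of read length.
    kmers = ["".join(p) for p in product(alphabet, repeat=k)]
    counts = Counter(read[i:i + k] for i in range(len(read) - k + 1))
    return [counts.get(km, 0) for km in kmers]
```

Because every read maps independently to the same fixed-length vector, feature extraction and the per-tree voting of a forest both parallelise trivially, in contrast to the SVM approaches the abstract notes are difficult to parallelise.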