Abstract:
Grid operators and electricity retailers in Ireland manage peak demand, power system balancing and grid congestion by offering incentives to consumers to reduce or shift their load. The need for active consumers using smart appliances in the home has never been greater, owing to increased variable renewable generation and grid constraints. In this paper, an aggregated model of a population of single-compressor fridge-freezers is developed. A price-based control strategy is examined to quantify and value demand response savings during a representative winter and summer week for Ireland in 2020. The results show an average reduction in fridge-freezer operating cost of 8.2% during winter, with significantly smaller savings during summer. A peak reduction of at least 68% of the average winter refrigeration load is achieved consistently during the week analysed using a staggered control mode. An analysis of current ancillary service payments confirms that they are insufficient to ensure widespread uptake by small consumers, and that new mechanisms are needed to make becoming an active consumer attractive. Demand response is proposed as the basis of a new ancillary service, ramping capability, since the need for this service will grow as renewable energy penetration on the power system increases.
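The staggered-recovery idea lends itself to a compact simulation. The sketch below models an aggregated population of two-state fridge-freezer thermostats that defers compressor starts during an assumed evening peak-price window and re-enables them in random tranches afterwards to avoid a rebound spike; all rates, window times, the safety limit and the 0.12 kW compressor rating are illustrative assumptions, not the paper's calibrated values.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 1000                                  # population size (assumed)
dt = 1 / 60                               # time step: one minute, in hours
P_rated = 0.12                            # compressor electrical power, kW (assumed)
T_low, T_high = 2.0, 6.0                  # thermostat deadband, degC
T_max = 10.0                              # food-safety override temperature (assumed)
warm_rate = rng.normal(4.0, 0.5, N)       # degC/h drift with compressor off (assumed)
cool_rate = rng.normal(8.0, 1.0, N)       # net degC/h pulldown with compressor on (assumed)

T = rng.uniform(T_low, T_high, N)                          # initial cabinet temperatures
on = rng.random(N) < warm_rate / (warm_rate + cool_rate)   # ~steady-state duty cycle

load = []
for minute in range(24 * 60):
    peak = 17 * 60 <= minute < 18 * 60            # assumed one-hour peak-price window
    if peak:
        allow = np.zeros(N, dtype=bool)           # defer all new compressor starts
    elif 18 * 60 <= minute < 18 * 60 + 20:
        # staggered recovery: re-enable starts for a growing random tranche
        allow = rng.random(N) < (minute - 18 * 60 + 1) / 20
    else:
        allow = np.ones(N, dtype=bool)
    T = T + dt * np.where(on, -cool_rate, warm_rate)
    starts = (T >= T_high) & ~on
    on = np.where(starts, allow, on)              # gate new starts by the price logic
    on = np.where(T >= T_max, True, on)           # safety override trumps price
    on = np.where(T <= T_low, False, on)          # normal thermostat switch-off
    load.append(on.sum() * P_rated)

print(f"peak aggregate load {max(load):.0f} kW, mean {np.mean(load):.0f} kW")
```

Gating only new starts, rather than interrupting running compressors, keeps cabinet temperatures inside the safety band for longer, which is why the sketch separates the thermostat switch-off from the price logic.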
Abstract:
We examined a remnant host plant (Primula veris L.) habitat network that was last inhabited by the rare butterfly Hamearis lucina L. in north Wales in 1943, to assess the relative contribution of several spatial parameters to its regional extinction. We first examined relationships between P. veris characteristics and H. lucina eggs in surviving H. lucina populations, and used these to predict the suitability and potential carrying capacity of the habitat network in north Wales, yielding an estimate of roughly 4500 eggs (ca. 227 adults). We developed a discrete-space, discrete-time metapopulation model to evaluate the relative contributions of dispersal distance, habitat and environmental stochasticity as possible causes of extinction. We simulated the potential persistence of the butterfly in the current network as well as in three artificial (historical and present) habitat networks that differed in habitat quantity (current and 3× current) and fragmentation (current and aggregated). We found that reduced habitat quantity and increased isolation would have increased the probability of regional extinction, in conjunction with environmental stochasticity and H. lucina's dispersal distance. This general trend did not change qualitatively when we modified the ability of dispersing females to find, and remain in, suitable habitat (by changing the size of the grid cells used in the model). Contrary to most metapopulation model predictions, system persistence declined with increasing migration rate, suggesting that the mortality of migrating individuals in fragmented landscapes may pose significant risks to system-wide persistence. Based on model predictions for the present landscape, we argue that a major programme of habitat restoration would be required for a re-established metapopulation to persist for >100 years.
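A patch-occupancy simulation of the kind described can be written in a few lines. The toy model below tracks annual extinction and colonisation on a set of patches, with a negative-exponential dispersal kernel and a shared lognormal shock standing in for environmental stochasticity; the landscape, kernel and rate parameters (alpha, c0, e0) are invented for illustration and are not the study's fitted values.

```python
import numpy as np

rng = np.random.default_rng(1)

n_patch = 40
xy = rng.uniform(0, 10, (n_patch, 2))           # patch coordinates, km (assumed)
area = rng.lognormal(0, 1, n_patch)             # patch quality / carrying-capacity proxy
alpha = 1.0                                     # inverse mean dispersal distance, 1/km
c0, e0 = 0.2, 0.15                              # colonisation and extinction scalers
years, reps = 100, 200

d = np.linalg.norm(xy[:, None] - xy[None, :], axis=2)
kernel = np.exp(-alpha * d) * (1 - np.eye(n_patch))   # no self-colonisation

persisted = 0
for _ in range(reps):
    occ = rng.random(n_patch) < 0.5             # initial occupancy
    for _ in range(years):
        env = rng.lognormal(0, 0.5)             # regionally shared environmental shock
        conn = kernel @ (occ * area)            # connectivity of each patch
        p_col = 1 - np.exp(-c0 * conn)          # colonisation of empty patches
        p_ext = np.minimum(1.0, e0 * env / area)  # extinction, rarer in large patches
        occ = np.where(occ, rng.random(n_patch) > p_ext,
                            rng.random(n_patch) < p_col)
        if not occ.any():
            break
    persisted += occ.any()

print(f"P(persist {years} yr) ~ {persisted / reps:.2f}")
```

Rerunning with a steeper kernel (larger alpha) or smaller patch areas shows the interplay of isolation, habitat quantity and stochastic shocks that the abstract describes.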
Abstract:
We investigate a collision-sensitive secondary network that opportunistically aggregates and utilizes spectrum of a primary network to achieve higher data rates. In opportunistic spectrum access with imperfect sensing of idle primary spectrum, secondary transmissions can collide with primary transmissions. When the secondary network aggregates more channels under imperfect sensing, collisions occur more often, limiting the performance gain from spectrum aggregation. In this context, we address a fundamental question: how much spectrum aggregation is worthwhile under imperfect sensing? We focus on two types of collision: one caused by asynchronous transmission and the other by imperfect spectrum sensing. We derive a closed-form expression for the collision probability in terms of the network parameters: primary traffic load, secondary-user transmission parameters, spectrum-sensing errors, and the number of aggregated sub-channels. In addition, we analyse the impact of spectrum aggregation on data rate under a collision-probability constraint. We then solve an optimal spectrum aggregation problem and propose a dynamic spectrum aggregation approach that increases the data rate subject to practical collision constraints. Our simulation results show clearly that the proposed approach outperforms a benchmark that passively aggregates sub-channels without collision awareness.
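The underlying trade-off is easy to see in a simplified model. Assuming each aggregated sub-channel independently suffers a collision with probability p (folding together primary traffic load, asynchronous returns and sensing errors), the whole transmission fails if any sub-channel collides, so aggregating more channels raises both the potential rate and the collision probability. The sketch below picks the best aggregation level under a collision cap; it illustrates the trade-off and is not the paper's closed-form analysis.

```python
def collision_prob(n, p):
    """Probability that an n-sub-channel aggregate transmission collides."""
    return 1 - (1 - p) ** n

def expected_rate(n, p, r_per_channel=1.0):
    """Throughput: n sub-channels pay off only if the whole transmission survives."""
    return n * r_per_channel * (1 - collision_prob(n, p))

def best_aggregation(p, eta, n_max=32):
    """Rate-maximising n subject to the collision constraint P_coll(n) <= eta."""
    feasible = [n for n in range(1, n_max + 1) if collision_prob(n, p) <= eta]
    return max(feasible, key=lambda n: expected_rate(n, p)) if feasible else 0

for p in (0.01, 0.05, 0.10):
    n = best_aggregation(p, eta=0.3)
    print(f"p={p:.2f}: best n={n:2d}, rate={expected_rate(n, p):5.2f}, "
          f"P_coll={collision_prob(n, p):.2f}")
```

With p = 0.05 and a 0.3 collision cap, the constraint rather than the rate curve limits aggregation, which mirrors the paper's point that passively aggregating sub-channels without collision awareness overshoots.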
Abstract:
Modification of citrate- and hydroxylamine-reduced Ag colloids with thiocholine bromide, a thiol-functionalized quaternary ammonium salt, creates particles whose zeta potential is switched from the normal value of ca. -50 mV to ca. +50 mV. These colloids are stable but can be aggregated with metal salts in much the same way as the parent colloids. They are excellent SERS substrates for the detection of anionic targets, since their positive zeta potentials promote adsorption of negatively charged ions. This is important because the vast majority of published SERS studies involve cationic or neutral targets. Moreover, because the modifier is a quaternary ammonium ion, the positive surface charge is maintained even at alkaline pH. The modified colloids can be used to detect compounds which cannot be detected using conventional negatively charged citrate- or hydroxylamine-reduced metal nanoparticles; for example, the detection limit was 5.0 × 10⁻⁵ M for perchlorate and
Abstract:
Cloud data centres are critical business infrastructures and among the fastest-growing service providers, so detecting anomalies in their operation is vital. Given the vast complexity of the data centre software stack, applications and workloads, anomaly detection is a challenging endeavour. Current anomaly detection tools often rely on machine learning techniques, application instance behaviours or system metrics distributions, which are complex to implement in Cloud computing environments because they require training, access to application-level data and complex processing. This paper presents LADT, a lightweight anomaly detection tool for Cloud data centres that uses rigorous correlation of system metrics, implemented by an efficient correlation algorithm that needs neither training nor a complex infrastructure setup. LADT is based on the hypothesis that, in an anomaly-free system, metrics from data centre host nodes and their virtual machines (VMs) are strongly correlated; an anomaly is detected whenever the correlation drops below a threshold value. We demonstrate and evaluate LADT in a Cloud environment, showing that host-node I/O operations per second (IOPS) are strongly correlated with the aggregated VM IOPS, and that this correlation vanishes when an application stresses the disk, indicating a node-level anomaly.
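The hypothesis translates directly into a sliding-window correlation check. The sketch below computes the Pearson correlation between host IOPS and the sum of per-VM IOPS over a moving window and flags windows where it falls below a threshold; the window length, the 0.8 threshold and the synthetic traces are assumptions for illustration, since the abstract does not give LADT's actual parameters.

```python
import numpy as np

rng = np.random.default_rng(2)

def ladt_flags(host_iops, vm_iops, window=30, threshold=0.8):
    """Return one boolean anomaly flag per sliding-window position."""
    agg = vm_iops.sum(axis=0)                    # aggregate the per-VM IOPS
    flags = []
    for t in range(window, len(host_iops)):
        h = host_iops[t - window:t]
        v = agg[t - window:t]
        r = np.corrcoef(h, v)[0, 1]              # Pearson correlation in the window
        flags.append(r < threshold)
    return np.array(flags)

# Synthetic demo: 3 VMs; host IOPS tracks their sum plus noise, until a
# host-only disk stressor decouples the two at t = 200.
T = 300
vms = rng.poisson(100, (3, T)).astype(float)
host = vms.sum(axis=0) + rng.normal(0, 5, T)
host[200:] += rng.normal(400, 150, 100)          # anomalous node-level I/O load

flags = ladt_flags(host, vms)
print("first anomalous window ends at t =", np.argmax(flags) + 30)
```

In the demo, the injected host-only load destroys the host-to-VM correlation, which is exactly the node-level signature LADT looks for.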
Abstract:
In many applications, especially those involving batch processes, a target scalar output of interest depends on one or more time series of data. With the exponential growth of data logging in modern industries, such time series are increasingly available for statistical modelling in soft-sensing applications. To exploit time series data for predictive modelling, the information they contain must be summarised as a set of features to use as model regressors. Typically this is done in an unsupervised fashion using simple techniques such as computing statistical moments, principal components or wavelet decompositions, often leading to significant information loss and hence suboptimal predictive models. In this paper, a functional learning paradigm is exploited in a supervised fashion to derive continuous, smooth estimates of time series data (yielding aggregated local information), while simultaneously estimating a continuous shape function that yields optimal predictions. The proposed Supervised Aggregative Feature Extraction (SAFE) methodology can be extended to support nonlinear predictive models by embedding the functional learning framework in a Reproducing Kernel Hilbert Space setting. SAFE has a number of attractive features, including a closed-form solution and the ability to explicitly incorporate first- and second-order derivative information. Using simulation studies and a practical semiconductor manufacturing case study, we highlight the strengths of the new methodology with respect to standard unsupervised feature extraction approaches.
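The core idea, predicting a scalar from a whole curve through a learned shape function, can be sketched as a basis-expansion functional regression. Below, each series x_i(t) is projected onto a Gaussian basis, and the coefficients of a continuous shape function beta(t) are obtained in closed form by ridge regression so that y_i ≈ ∫ beta(t) x_i(t) dt; the basis, the regularisation and the synthetic data are illustrative assumptions, not SAFE's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(3)

n, T, K = 200, 100, 12                    # series count, length, basis size (assumed)
t = np.linspace(0, 1, T)
centers = np.linspace(0, 1, K)
Phi = np.exp(-0.5 * ((t[:, None] - centers[None, :]) / 0.08) ** 2)  # T x K Gaussian basis

X = rng.normal(0, 1, (n, T)).cumsum(axis=1) / np.sqrt(T)  # synthetic batch profiles
beta_true = np.sin(2 * np.pi * t)                         # hidden shape function
y = X @ beta_true / T + rng.normal(0, 0.01, n)            # y_i = integral of beta * x_i + noise

# Features: projections of each series onto the basis, Z[i,k] ~ integral of x_i * phi_k
Z = X @ Phi / T
lam = 1e-4                                                # ridge regularisation (assumed)
c = np.linalg.solve(Z.T @ Z + lam * np.eye(K), Z.T @ y)   # closed-form solution
beta_hat = Phi @ c                                        # continuous estimate of beta(t)

resid = y - Z @ c
print(f"train RMSE {np.sqrt(np.mean(resid ** 2)):.4f}")
```

Because the shape function is fitted against the target rather than chosen beforehand, the extracted features are supervised, which is the contrast with moments, principal components or wavelets that the abstract draws.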
Abstract:
This letter investigates the uplink spectral efficiency (SE) of a two-tier cellular network in which massive multiple-input multiple-output (MIMO) macro base stations are overlaid with dense small cells. Macro user equipments (MUEs) and small cells, each serving a single user equipment, are uniformly scattered and modeled as two independent homogeneous Poisson point processes. Applying stochastic geometry, we analyze the SE of the multiuser uplink at a macro base station employing a zero-forcing detector, and we obtain a novel lower bound as well as its approximation. From this simple and near-exact analytical expression, we observe that the most effective way to improve the SE is to increase the MUE density and the number of base station antennas jointly rather than individually. Furthermore, a large path-loss exponent has a positive effect on the SE, owing to the reduced aggregate interference.
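These trends can be reproduced qualitatively with a small Monte Carlo experiment. The sketch below uses a textbook-style zero-forcing array-gain lower bound of (M - U) for M antennas and U scheduled MUEs, with out-of-cell interferers drawn from a Poisson point process; the bound and every numeric parameter (densities, cell radius, exponents) are standard stochastic-geometry assumptions, not the letter's exact expressions.

```python
import numpy as np

rng = np.random.default_rng(4)

def avg_sum_se(M, U, lam=1e-5, alpha=3.8, r_cell=500.0, r_max=3000.0, trials=500):
    """Average uplink sum SE (bit/s/Hz) over random interferer placements."""
    area = np.pi * (r_max ** 2 - r_cell ** 2)
    se = 0.0
    for _ in range(trials):
        # out-of-cell uplink interferers: PPP of density lam in an annulus
        n_i = rng.poisson(lam * area)
        r_i = np.sqrt(rng.uniform(r_cell ** 2, r_max ** 2, n_i))
        interference = np.sum(r_i ** -alpha)
        # U served MUEs uniformly placed in the cell (50 m exclusion radius)
        r_u = np.sqrt(rng.uniform(50.0 ** 2, r_cell ** 2, U))
        # assumed ZF array-gain lower bound: effective gain (M - U)
        sir = (M - U) * r_u ** -alpha / (interference + 1e-15)
        se += np.log2(1 + sir).sum()
    return se / trials

print(f"baseline           : {avg_sum_se(M=64, U=8):.1f} bit/s/Hz")
print(f"2x antennas only   : {avg_sum_se(M=128, U=8):.1f}")
print(f"2x MUE density only: {avg_sum_se(M=64, U=16, lam=2e-5):.1f}")
print(f"2x both            : {avg_sum_se(M=128, U=16, lam=2e-5):.1f}")
print(f"higher alpha (4.2) : {avg_sum_se(M=64, U=8, alpha=4.2):.1f}")
```

Doubling the MUE density alone also doubles the interferer density, which is why the density-only and antenna-only cases lag the joint scaling; raising the path-loss exponent attenuates far interference faster than the desired signal, matching the observation about aggregate interference.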