980 results for Grid code
Abstract:
Integrating renewable energy into built environments requires additional attention to the balancing of supply and demand due to their intermittent nature. Demand Side Response (DSR) has the potential to generate revenue for organisations as well as to support the System Operator as the generation mix changes. There is an opportunity to increase the use of existing technologies in order to manage demand. Company-owned standby generators are a rarely used resource; their maintenance schedule often accounts for the majority of their running hours. DSR encompasses a range of technologies and organisations; Sustainability First (2012) suggest that the System Operator (SO), energy supply companies, Distribution Network Operators (DNOs), Aggregators and Customers all stand to benefit from DSR. It is therefore important to consider the impact of DSR measures on each of these stakeholders. This paper assesses the financial implications of organisations using existing standby generation equipment for DSR in order to avoid peak electricity charges. It concludes that under the current GB electricity pricing structure, there are several regions where running diesel generators at peak times is financially beneficial to organisations. Issues such as fuel costs, Carbon Reduction Commitment (CRC) charges, maintenance costs and electricity prices are discussed.
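As a rough illustration of the kind of cost comparison the abstract describes, the following minimal sketch weighs the per-kWh cost of self-generation (fuel, maintenance and a CRC-style carbon charge) against the avoided peak-period grid cost. Every figure is a hypothetical placeholder, not a value from the paper.

```python
# Illustrative comparison of running a standby diesel generator at peak times
# versus importing from the grid. All numbers below are assumed placeholders.

def generation_cost_per_kwh(fuel_price_per_litre=0.70,   # GBP/litre (assumed)
                            fuel_litres_per_kwh=0.30,     # litres of diesel per kWh (assumed)
                            maintenance_per_kwh=0.01,     # GBP/kWh wear and servicing (assumed)
                            carbon_price_per_tonne=16.0,  # GBP/tCO2, CRC-style charge (assumed)
                            kg_co2_per_kwh=0.25):         # kgCO2 per kWh of diesel output (assumed)
    """Cost of self-generating one kWh during the peak period."""
    fuel = fuel_price_per_litre * fuel_litres_per_kwh
    carbon = carbon_price_per_tonne * kg_co2_per_kwh / 1000.0
    return fuel + maintenance_per_kwh + carbon

def avoided_grid_cost_per_kwh(energy_price=0.12,          # GBP/kWh baseline import price (assumed)
                              peak_charge=0.20):          # GBP/kWh extra peak-period charge (assumed)
    """Grid cost avoided by not importing one kWh during the peak period."""
    return energy_price + peak_charge

if __name__ == "__main__":
    gen, grid = generation_cost_per_kwh(), avoided_grid_cost_per_kwh()
    print(f"Self-generation: {gen:.3f} GBP/kWh, avoided grid cost: {grid:.3f} GBP/kWh")
    verdict = "financially beneficial" if gen < grid else "not beneficial"
    print(f"Running the generator at peak is {verdict} by {abs(grid - gen):.3f} GBP/kWh "
          "under these assumptions.")
```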
A benchmark-driven modelling approach for evaluating deployment choices on a multi-core architecture
Abstract:
The complexity of current and emerging architectures provides users with options about how best to use the available resources, but makes predicting performance challenging. In this work a benchmark-driven model is developed for a simple shallow water code on a Cray XE6 system, to explore how deployment choices such as domain decomposition and core affinity affect performance. The resource sharing present in modern multi-core architectures adds various levels of heterogeneity to the system. Shared resources often include cache, memory, network controllers and, in some cases, floating-point units (as in the AMD Bulldozer), which means that access times depend on the mapping of application tasks and on a core's location within the system. Heterogeneity increases further with the use of hardware accelerators such as GPUs and the Intel Xeon Phi, where many specialist cores are attached to general-purpose cores. This trend towards shared resources and non-uniform cores is expected to continue into the exascale era. The complexity of these systems means that various runtime scenarios are possible, and it has been found that under-populating nodes, altering the domain decomposition and non-standard task-to-core mappings can dramatically alter performance. Discovering this, however, is often a process of trial and error. To better inform this process, a performance model was developed for a simple regular grid-based kernel code, shallow. The code comprises two distinct types of work: loop-based array updates and nearest-neighbour halo exchanges. Separate performance models were developed for each part, both based on a similar methodology. Application-specific benchmarks were run to measure performance for different problem sizes under different execution scenarios. These results were then fed into a performance model that derives resource usage for a given deployment scenario, interpolating between results as necessary.
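A minimal sketch of how such a benchmark-driven model can work in practice: tabulated benchmark timings for the two work types (array updates and halo exchanges) are interpolated to predict the cost of a given decomposition. The benchmark numbers, halo-size formula and decomposition choices below are made up for illustration and are not taken from the paper.

```python
# Benchmark-driven performance prediction sketch: interpolate measured timings
# for compute and halo-exchange work to estimate per-step runtime.
import numpy as np

# Hypothetical benchmark results: local grid points per task -> seconds per step.
compute_points        = np.array([1e4, 1e5, 1e6, 1e7])
compute_seconds       = np.array([2e-4, 2.2e-3, 2.6e-2, 3.1e-1])  # fully populated node
compute_seconds_under = compute_seconds * 0.8                      # under-populated node (assumed)

# Hypothetical halo-exchange benchmarks: message size in bytes -> seconds per exchange.
halo_bytes   = np.array([1e3, 1e4, 1e5, 1e6])
halo_seconds = np.array([5e-6, 1.5e-5, 9e-5, 8e-4])

def predict_step_time(nx, ny, px, py, underpopulated=False):
    """Predict one timestep for an nx*ny grid decomposed into px*py tasks."""
    local_points = (nx / px) * (ny / py)
    comp_table = compute_seconds_under if underpopulated else compute_seconds
    t_compute = np.interp(local_points, compute_points, comp_table)
    halo_size = 8.0 * 2.0 * (nx / px + ny / py)   # two rows + two columns of doubles (simplified)
    t_halo = np.interp(halo_size, halo_bytes, halo_seconds)
    return t_compute + t_halo

if __name__ == "__main__":
    for px, py in [(8, 8), (16, 4), (32, 2)]:
        t = predict_step_time(4096, 4096, px, py)
        print(f"{px:>2} x {py:<2} decomposition: predicted {t*1e3:.2f} ms/step")
```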
Abstract:
Multi-model ensembles are frequently used to assess understanding of the response of ozone and methane lifetime to changes in emissions of ozone precursors such as NOx, VOCs (volatile organic compounds) and CO. When these ozone changes are used to calculate radiative forcing (RF) (and climate metrics such as the global warming potential (GWP) and global temperature-change potential (GTP)) there is a methodological choice, determined partly by the available computing resources, as to whether the mean ozone (and methane) concentration changes are input to the radiation code, or whether each model's ozone and methane changes are used as input, with the average RF computed from the individual model RFs. We use data from the Task Force on Hemispheric Transport of Air Pollution source–receptor global chemical transport model ensemble to assess the impact of this choice for emission changes in four regions (East Asia, Europe, North America and South Asia). We conclude that using the multi-model mean ozone and methane responses is accurate for calculating the mean RF, with differences of up to 0.6% for CO, 0.7% for VOCs and 2% for NOx. Differences of up to 60% for NOx, 7% for VOCs and 3% for CO are introduced into the 20 year GWP. The differences for the 20 year GTP are smaller than for the GWP for NOx, and similar for the other species. However, estimates of the standard deviation calculated from the ensemble-mean input fields (where the standard deviation at each point on the model grid is added to or subtracted from the mean field) are almost always substantially larger for the RF, GWP and GTP metrics than the true standard deviation, and can be larger than the model range for the short-lived ozone RF and for the 20 and 100 year GWP and 100 year GTP. The order of averaging has most impact on the metrics for NOx, as the net values of these quantities are the residuals of sums of terms of opposing sign. For example, the standard deviation of the 20 year GWP is 2–3 times larger using the ensemble-mean fields than using the individual models to calculate the RF. This effect is largely due to the construction of the input ozone fields, which overestimates the true ensemble spread. Hence, while the average of multi-model fields is normally appropriate for calculating the mean RF, GWP and GTP, it is not a reliable method for calculating the uncertainty in these metrics, and in general overestimates that uncertainty.
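The statistical point can be illustrated with a toy, fully synthetic example: for a linear (or nearly linear) forcing calculation the order of averaging barely matters, but the spread implied by perturbing the ensemble-mean field by the per-grid-point standard deviation adds up point by point and exceeds the true inter-model spread, in which anomalies partly cancel across the grid. The linear stand-in `rf()` and all numbers below are assumptions for illustration only.

```python
# Toy demonstration: order of averaging vs spread estimation from mean +/- std fields.
import numpy as np

rng = np.random.default_rng(0)
n_models, n_gridpoints = 10, 500

# Synthetic per-model ozone-change fields: a common mean pattern plus model noise.
mean_pattern = rng.normal(1.0, 0.3, n_gridpoints)
fields = mean_pattern + rng.normal(0.0, 0.5, (n_models, n_gridpoints))
weights = rng.uniform(0.5, 1.5, n_gridpoints)   # stand-in per-point radiative efficiency

def rf(field):
    """Linear stand-in for a radiative forcing calculation."""
    return np.sum(weights * field)

# Order of averaging (identical here because rf() is linear; differences in the
# paper come from weak non-linearities in the real calculation).
rf_per_model = np.array([rf(f) for f in fields])
print("mean of per-model RFs:", rf_per_model.mean())
print("RF of mean field     :", rf(fields.mean(axis=0)))

# Spread: true inter-model std vs std implied by the mean +/- per-point-std field.
true_std = rf_per_model.std(ddof=1)
implied_std = rf(fields.mean(axis=0) + fields.std(axis=0, ddof=1)) - rf(fields.mean(axis=0))
print("true inter-model std      :", true_std)
print("std from mean +/- std field:", implied_std)   # substantially larger
```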
Abstract:
Flavonoids reduce cardiovascular disease risk through anti-inflammatory, anti-coagulant and anti-platelet actions. One key flavonoid inhibitory mechanism is blocking kinase activity that drives these processes. Flavonoids attenuate the activities of kinases including phosphoinositide-3-kinase (PI3K), Fyn, Lyn, Src, Syk, PKC, PIM1/2, ERK, JNK, and PKA. X-ray crystallographic analyses of kinase-flavonoid complexes show that flavonoid ring systems and their hydroxyl substitutions are important structural features for their binding to kinases. A clearer understanding of the structural interactions of flavonoids with kinases is necessary to allow the construction of more potent and selective counterparts. We examined flavonoid (quercetin, apigenin and catechin) interactions with Src-family kinases (Lyn, Fyn and Hck) using the Sybyl docking algorithm and GRID. A homology model (Lyn) was used in our analyses to demonstrate that high quality predicted kinase structures are suitable for flavonoid computational studies. Our docking results revealed potential hydrogen bond contacts between flavonoid hydroxyls and kinase catalytic site residues. Identification of plausible contacts indicated that quercetin formed the most energetically stable interactions, apigenin lacked hydroxyl groups necessary for important contacts, and the non-planar structure of catechin could not support the predicted hydrogen bonding patterns. GRID analysis using a hydroxyl functional group supported the docking results. Based on these findings, we predicted that quercetin would inhibit the activities of Src-family kinases with greater potency than apigenin and catechin. We validated this prediction using in vitro kinase assays. We conclude that our study can be used as a basis to construct virtual flavonoid interaction libraries to guide drug discovery using these compounds as molecular templates.
Abstract:
Two varieties of Greek are spoken on the island of Cyprus: the local dialect, namely the Greek-Cypriot Dialect (GCD), and Standard Modern Greek (SMG). English is also influential, as Cyprus was a British colony until 1960. The dialect is rarely employed for everyday written purposes; however, it is now evident in computer-mediated communication (CMC). As a contribution to the field of code-switching in writing, this study examines how Greek-Cypriot internet users employ GCD, SMG, and English in their Facebook interactions. In particular, we investigate how identities (discursive and social) are performed and indexed through the linguistic choices of Greek-Cypriot internet users. The findings indicate that switches to GCD add a humorous tone and express solidarity and informality. SMG is mostly used for ‘official’ statements, and it is preferred by mature internet users, while English is used with expressions of affect and evaluative comments.
Abstract:
In this paper, we consider one particularly interesting feature of the Lieber Code, which is the fact that it was drawn up by the U.S. Government to regulate the conduct of its armed forces in a civil war. In so doing, we hope to explore the extent to which there may be links between the Lieber Code and the contemporary regulation of non-international armed conflicts. In particular, we explore some similarities and contrasts between the views on the regulation of civil war that existed at the time of the drafting of the Lieber Code and the position that exists today.
Abstract:
Smart grid research has tended to be compartmentalised, with notable contributions from economics, electrical engineering and science and technology studies. However, there is an acknowledged and growing need for an integrated systems approach to the evaluation of smart grid initiatives. The capacity to simulate and explore smart grid possibilities on various scales is key to such an integrated approach but existing models – even if multidisciplinary – tend to have a limited focus. This paper describes an innovative and flexible framework that has been developed to facilitate the simulation of various smart grid scenarios and the interconnected social, technical and economic networks from a complex systems perspective. The architecture is described and related to realised examples of its use, both to model the electricity system as it is today and to model futures that have been envisioned in the literature. Potential future applications of the framework are explored, along with its utility as an analytic and decision support tool for smart grid stakeholders.
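To make the idea of simulating interconnected social, technical and economic networks concrete, here is a deliberately simple, purely hypothetical sketch of the kind of multi-agent electricity-system model such a framework could host. None of the agent types, tariffs or behaviours below are taken from the framework described in the paper.

```python
# Hypothetical agent-based sketch: household agents with flexible demand
# respond to a half-hourly price signal from a supplier agent.
import random

class Household:
    def __init__(self, base_kw, flexible_kw):
        self.base_kw = base_kw          # inflexible demand (kW)
        self.flexible_kw = flexible_kw  # demand that can be shed or shifted (kW)

    def demand(self, price, threshold=0.15):
        # Shed flexible demand whenever the price signal exceeds a threshold.
        return self.base_kw + (0.0 if price > threshold else self.flexible_kw)

class Supplier:
    def price(self, period):
        # Simple deterministic tariff: evening peak (periods 34-40) costs more.
        return 0.25 if 34 <= period <= 40 else 0.10

def run_day(households, supplier):
    profile = []
    for period in range(48):            # 48 half-hour settlement periods
        p = supplier.price(period)
        profile.append(sum(h.demand(p) for h in households))
    return profile

if __name__ == "__main__":
    random.seed(1)
    homes = [Household(random.uniform(0.3, 0.8), random.uniform(0.5, 1.5)) for _ in range(100)]
    profile = run_day(homes, Supplier())
    print("peak-period maximum demand (kW):", round(max(profile[34:41]), 1))
    print("pre-peak maximum demand (kW):   ", round(max(profile[:34]), 1))
```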
Abstract:
Land cover data derived from satellites are commonly used to prescribe inputs to models of the land surface. Since such data inevitably contain errors, quantifying how uncertainties in the data affect a model’s output is important. To do so, a spatial distribution of possible land cover values is required to propagate through the model’s simulation. However, at large scales, such as those required for climate models, such spatial modelling can be difficult. Also, computer models often require land cover proportions at sites larger than the original map scale as inputs, and it is the uncertainty in these proportions that this article discusses. This paper describes a Monte Carlo sampling scheme that generates realisations of land cover proportions from the posterior distribution as implied by a Bayesian analysis that combines spatial information in the land cover map and its associated confusion matrix. The technique is computationally simple and has been applied previously to the Land Cover Map 2000 for the region of England and Wales. This article demonstrates the ability of the technique to scale up to large (global) satellite-derived land cover maps and reports its application to the GlobCover 2009 data product. The results show that, in general, the GlobCover data possess only small biases, with the largest belonging to non-vegetated surfaces. In vegetated surfaces, the most prominent area of uncertainty is Southern Africa, which represents a complex heterogeneous landscape. It is also clear from this study that greater resources need to be devoted to the construction of comprehensive confusion matrices.
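A minimal sketch of one way a Monte Carlo scheme of this general kind can be set up: given the observed class counts at a site and a confusion matrix, draw realisations of the true land cover proportions. The prior choices and the omission of any spatial structure are illustrative assumptions; the paper's exact scheme may differ.

```python
# Monte Carlo realisations of land cover proportions from a confusion matrix.
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical confusion matrix: rows = true class, columns = observed class.
confusion = np.array([
    [90,  8,  2],    # forest
    [10, 80, 10],    # cropland
    [ 5, 15, 80],    # bare / non-vegetated
], dtype=float)

observed_counts = np.array([400, 350, 250])   # observed pixels per class at one site

def sample_proportions(confusion, observed_counts, n_draws=1000):
    n_classes = confusion.shape[0]
    draws = np.empty((n_draws, n_classes))
    for k in range(n_draws):
        # Dirichlet draw per true class captures uncertainty in P(observed | true).
        p_obs_given_true = np.array([rng.dirichlet(confusion[t] + 1.0)
                                     for t in range(n_classes)])
        # Flat prior over true classes; Bayes' rule column-normalises to P(true | observed).
        p_true_given_obs = p_obs_given_true / p_obs_given_true.sum(axis=0, keepdims=True)
        # Reassign each observed pixel to a sampled true class and accumulate counts.
        true_counts = np.zeros(n_classes)
        for obs_class, n_pix in enumerate(observed_counts):
            true_counts += rng.multinomial(n_pix, p_true_given_obs[:, obs_class])
        draws[k] = true_counts / observed_counts.sum()
    return draws

if __name__ == "__main__":
    draws = sample_proportions(confusion, observed_counts)
    print("posterior mean proportions:", draws.mean(axis=0).round(3))
    print("posterior std:             ", draws.std(axis=0).round(3))
```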
Abstract:
Bilingualism is reported to re-structure executive control networks, but it remains unknown which aspects of the bilingual experience cause this modulation. This study explores the impact of three code-switching types on executive functions: (1) alternation of languages, (2) insertion of lexicon of one language into the grammar of another, (3) dense code-switching with co-activation of lexicon and grammar. Current models hypothesise that they challenge different aspects of the executive system because they vary in the extent and scope of language separation. Two groups of German-English bilinguals differing in dense code-switching frequency participated in a flanker task under conditions varying in the degree of trial-mixing and the resulting demands on conflict-monitoring. Bilinguals engaging in more dense code-switching showed inhibitory advantages in the condition requiring most conflict-monitoring. Moreover, dense code-switching frequency correlated positively with monitoring skills. This suggests that the management of co-activated languages during dense code-switching engages conflict-monitoring and that the consolidation processes taking place within co-activated linguistic systems involve local inhibition. Code-switching types requiring greater degrees of language separation may involve more global forms of inhibition. This study shows that dense code-switching is a key experience shaping bilinguals’ executive functioning and highlights the importance of controlling for participants’ code-switching habits in bilingualism research.
Abstract:
Classical nova remnants are important scenarios for improving photoionization modeling. This work describes the pseudo-three-dimensional code RAINY3D, which drives the photoionization code Cloudy as a subroutine. Photoionization simulations of old nova remnants are also presented and discussed. In these simulations we analyze the effect of condensation on the remnant spectra. The condensed mass fraction affects the Balmer lines by a factor greater than 4 when compared with homogeneous models, and this directly impacts the shell mass determination. The He II 4686/H beta ratio decreases by a factor of 10 in clumpy shells. These lines are also affected by the clump size and density distributions. The behavior of the strongest nebular line observed in nova remnants is also analyzed for heterogeneous shells. Gas diagnostics in nova ejecta are thought to be more accurate during the nebular phase, but we have determined that at this phase the matter distribution can strongly affect the derived shell physical properties and chemical abundances.
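A toy calculation (not the RAINY3D/Cloudy machinery itself) can illustrate why condensation changes recombination-line fluxes and hence shell-mass estimates: line emissivity scales roughly as density squared, so placing part of a fixed shell mass into dense clumps boosts the emission measure. The clump parameters below are arbitrary illustrative choices, not values from the paper.

```python
# Emission measure of a clumpy shell relative to a homogeneous shell of the
# same total mass and volume, assuming emissivity proportional to n^2.

def emission_measure_ratio(mass_fraction_in_clumps, clump_filling_factor):
    f_m, eps = mass_fraction_in_clumps, clump_filling_factor
    # Dense clumps occupy a fraction eps of the volume; the rest of the mass
    # is spread over the remaining (1 - eps) of the volume.
    return f_m**2 / eps + (1.0 - f_m)**2 / (1.0 - eps)

if __name__ == "__main__":
    for f_m, eps in [(0.3, 0.05), (0.5, 0.05), (0.7, 0.02)]:
        boost = emission_measure_ratio(f_m, eps)
        # A shell mass inferred from a Balmer-line flux under the homogeneous
        # assumption would be off by roughly the square root of this factor.
        print(f"clump mass fraction {f_m:.1f}, filling factor {eps:.2f}: "
              f"line flux x{boost:.1f}, inferred mass off by ~x{boost**0.5:.1f}")
```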
Abstract:
In 2006 the Route load balancing algorithm was proposed and compared with other techniques aimed at optimizing process allocation in grid environments. This algorithm schedules the tasks of parallel applications considering computer neighborhoods (where distance is defined by network latency). Route gives good results for large environments, although there are cases where the neighbors have neither enough computational capacity nor a communication system capable of serving the application. In those situations Route migrates tasks until they stabilize in a grid area with sufficient resources. This migration may take a long time, which reduces overall performance. To shorten this stabilization time, this paper proposes RouteGA (Route with Genetic Algorithm support), which considers historical information on parallel application behavior as well as computer capacities and loads to optimize the scheduling. This information is extracted by monitors and summarized in a knowledge base used to quantify the resource occupation of tasks. It is then used to parameterize a genetic algorithm responsible for optimizing the task allocation. Results confirm that RouteGA outperforms the load balancing carried out by the original Route, which had previously outperformed other scheduling algorithms from the literature.
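The sketch below shows a minimal genetic algorithm for mapping tasks to machines, in the general spirit of using historical per-task cost estimates and machine capacities in the fitness function. The encoding, operators, fitness (makespan) and parameters are illustrative assumptions and do not reproduce the RouteGA implementation.

```python
# Minimal GA sketch for task-to-machine allocation.
import random

random.seed(7)

task_cost = [random.uniform(1, 10) for _ in range(40)]   # stand-in historical work estimates
machine_speed = [1.0, 1.5, 2.0, 0.8]                      # relative machine capacities (assumed)

def makespan(assignment):
    """Completion time of the most loaded machine; lower is better."""
    load = [0.0] * len(machine_speed)
    for task, machine in enumerate(assignment):
        load[machine] += task_cost[task] / machine_speed[machine]
    return max(load)

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(a, rate=0.05):
    return [random.randrange(len(machine_speed)) if random.random() < rate else m for m in a]

def evolve(pop_size=60, generations=200):
    pop = [[random.randrange(len(machine_speed)) for _ in task_cost] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=makespan)
        elite = pop[: pop_size // 4]                       # keep the best quarter
        pop = elite + [mutate(crossover(*random.sample(elite, 2)))
                       for _ in range(pop_size - len(elite))]
    return min(pop, key=makespan)

if __name__ == "__main__":
    best = evolve()
    print(f"best makespan found: {makespan(best):.2f}")
```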
Abstract:
Neutron multiplicities for several targets, and spallation products of proton-induced reactions in thin targets of interest to accelerator-driven systems, obtained with the CRISP code are reported. This code is a Monte Carlo calculation that simulates the intranuclear cascade and the subsequent evaporation/fission competition. Results are compared with experimental data, and the agreement can be considered quite satisfactory over a very broad energy range of incident particles and for different targets.
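As a purely schematic illustration of Monte Carlo evaporation/fission competition (not the CRISP physics or its cross sections), the toy below de-excites a nucleus step by step, either emitting a neutron or terminating the chain by fission, and averages over histories to obtain a multiplicity. All probabilities and energies are arbitrary illustrative values.

```python
# Toy Monte Carlo of evaporation/fission competition during de-excitation.
import random

random.seed(3)

def deexcite(e_star, separation_energy=8.0, mean_kinetic=2.0, fission_prob=0.05):
    """Return the number of neutrons evaporated in one history (toy model)."""
    neutrons = 0
    while e_star > separation_energy:
        if random.random() < fission_prob:   # fission terminates the evaporation chain
            break
        # Each evaporation removes the separation energy plus a sampled kinetic energy.
        e_star -= separation_energy + random.expovariate(1.0 / mean_kinetic)
        neutrons += 1
    return neutrons

def multiplicity(e_star, histories=20000):
    return sum(deexcite(e_star) for _ in range(histories)) / histories

if __name__ == "__main__":
    for e in (50.0, 200.0, 800.0):           # residual excitation energies (MeV, assumed)
        print(f"E* = {e:5.0f} MeV -> mean neutron multiplicity {multiplicity(e):.2f}")
```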