37 results for high harmonics generation
Abstract:
PV generates electricity only during daylight hours and primarily over summer. In the UK, the carbon intensity of grid electricity is higher during the daytime and over winter. This work investigates whether the grid electricity displaced by PV is high or low carbon compared to the annual mean carbon intensity, using carbon factors at higher temporal resolutions (half-hourly and daily). UK policy for carbon reporting requires savings to be calculated using the annual mean carbon intensity of grid electricity; this work offers an insight into whether that approach is appropriate. Using half-hourly data on the generating plant supplying the grid from November 2008 to May 2010, carbon factors for grid electricity at half-hourly and daily resolution have been derived using technology-specific generation emission factors. Applying these factors to generation data from PV systems installed on schools makes it possible to assess how the estimated carbon savings from displacing grid electricity with PV vary with the temporal resolution of the carbon factors. The data have been analyzed for a period of 363 to 370 days and so cannot account for inter-year variations in the relationship between PV generation and the carbon intensity of the electricity grid. This analysis suggests that, when assessed with half-hourly carbon factors, PV displaces more carbon-intensive electricity than daily factors imply, but less than annual factors imply. A similar methodology could provide useful insights into other variable renewable and demand-side technologies, and into other countries where PV performance and grid behavior differ.
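The comparison described in this abstract reduces to weighting PV generation by carbon factors aggregated at different temporal resolutions. The sketch below is a minimal illustration of that calculation, not the authors' code; the column names, units, and use of pandas are assumptions.

```python
# Minimal sketch: carbon savings from PV generation using grid carbon factors
# at half-hourly, daily, and annual resolution. Column names are illustrative.
import pandas as pd

def carbon_savings(df: pd.DataFrame) -> dict:
    """df has a half-hourly DatetimeIndex with columns:
    'pv_kwh'        - PV generation per half hour (kWh)
    'ci_kg_per_kwh' - grid carbon intensity per half hour (kgCO2/kWh)
    """
    # Half-hourly factors: weight each half hour's generation by its own intensity.
    half_hourly = (df["pv_kwh"] * df["ci_kg_per_kwh"]).sum()

    # Daily factors: apply each day's mean intensity to that day's generation.
    daily_ci = df["ci_kg_per_kwh"].resample("D").mean()
    daily_gen = df["pv_kwh"].resample("D").sum()
    daily = (daily_gen * daily_ci).sum()

    # Annual factor: one mean intensity applied to total generation
    # (the approach required by UK carbon-reporting policy, per the abstract).
    annual = df["pv_kwh"].sum() * df["ci_kg_per_kwh"].mean()

    return {"half_hourly": half_hourly, "daily": daily, "annual": annual}
```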
Abstract:
Wind generation’s contribution to meeting extreme peaks in electricity demand is a key concern for the integration of wind power. In Great Britain (GB), robustly assessing this contribution directly from power system data (i.e. metered wind-supply and electricity demand) is difficult as extreme peaks occur infrequently (by definition) and measurement records are both short and inhomogeneous. Atmospheric circulation-typing combined with meteorological reanalysis data is proposed as a means to address some of these difficulties, motivated by a case study of the extreme peak demand events in January 2010. A preliminary investigation of the physical and statistical properties of these circulation types suggests that they can be used to identify the conditions that are most likely to be associated with extreme peak demand events. Three broad cases are highlighted as requiring further investigation. The high-over-Britain anticyclone is found to be generally associated with very low winds but relatively moderate temperatures (and therefore moderate peak demands, somewhat in contrast to the classic low-wind cold snap that is sometimes apparent in the literature). In contrast, both longitudinally extended blocking over Scotland/Scandinavia and latitudinally extended troughs over western Europe appear to be more closely linked to the very cold GB temperatures (usually associated with extreme peak demands). In both of these latter situations, wind resource averaged across GB appears to be more moderate.
Abstract:
Warfarin resistance was first discovered among Norway rat (Rattus norvegicus) populations in Scotland in 1958 and further reports of resistance, both in this species and in others, soon followed from other parts of Europe and the United States. Researchers quickly defined the practical impact of these resistance phenomena and developed robust methods by which to monitor their spread. These tasks were relatively simple because of the high degree of immunity to warfarin conferred by the resistance genes. Later, the second generation anticoagulants were introduced to control rodents resistant to the warfarin-like compounds, but resistance to difenacoum, bromadiolone and brodifacoum is now reported in certain localities in Europe and elsewhere. However, the adoption of test methods designed initially for use with the first generation compounds to identify resistance to compounds of the second generation has led to some practical difficulties in conducting tests and in establishing meaningful resistance baselines. In particular, the results of certain test methodologies are difficult to interpret in terms of the likely impact on practical control treatments of the resistance phenomena they seek to identify. This paper defines rodenticide resistance in the context of both first and second generation anticoagulants. It examines the advantages and disadvantages of existing laboratory and field methods used in the detection of rodent populations resistant to anticoagulants and proposes some improvements in the application of these techniques and in the interpretation of their results.
Abstract:
This article combines institutional and resource-based arguments to show that the institutional distance between the home and the host country, and the headquarters' financial performance, have a significant impact on the environmental standardization decision in multinational companies. Using a sample of 135 multinational companies in three different industries with headquarters and subsidiaries based in the USA, Canada, Mexico, France, and Spain, we find that a high environmental institutional distance between headquarters' and subsidiaries' countries deters the standardization of environmental practices. On the other hand, high-profit headquarters are willing to standardize their environmental practices, rather than taking advantage of countries with lax environmental protection to undertake more pollution-intensive activities. Finally, we show that headquarters' financial performance also has a moderating effect on the relationship between environmental institutional distance between countries and environmental standardization within the multinational company.
Abstract:
Various studies investigating the future impacts of integrating high levels of renewable energy make use of historical meteorological (met) station data to produce estimates of future generation. Hourly means of 10 m horizontal wind are extrapolated to a standard turbine hub height using the wind profile power law or the log law and used to simulate the hypothetical power output of a turbine at that location; repeating this procedure at many viable locations can produce a picture of future electricity generation. However, the estimate of hub-height wind speed depends on the choice of the wind shear exponent α or the roughness length z0, and requires a number of simplifying assumptions. This paper investigates the sensitivity of the estimated generation output to this choice using a case study of a met station in West Freugh, Scotland. The results show that the wind shear exponent is a particularly sensitive parameter, and its choice can lead to significant variation in the estimated hub-height wind speed and hence in the estimated future generation potential of a region.
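The extrapolation referred to here follows the wind profile power law, v(h) = v_ref (h/h_ref)^α, or the log law with roughness length z0. The sketch below, with illustrative heights, speeds, and exponent values that are not taken from the paper, shows how strongly the hub-height estimate depends on α:

```python
# Minimal sketch of 10 m -> hub-height wind speed extrapolation and its
# sensitivity to the shear exponent alpha. All numeric values are illustrative.
import numpy as np

def power_law(v_ref, h_hub, h_ref=10.0, alpha=1/7):
    """Wind profile power law: v(h) = v_ref * (h / h_ref)**alpha."""
    return v_ref * (h_hub / h_ref) ** alpha

def log_law(v_ref, h_hub, h_ref=10.0, z0=0.03):
    """Logarithmic wind profile: v(h) = v_ref * ln(h/z0) / ln(h_ref/z0)."""
    return v_ref * np.log(h_hub / z0) / np.log(h_ref / z0)

v10 = 6.0    # hourly-mean 10 m wind speed (m/s), illustrative
hub = 80.0   # assumed hub height (m)
for alpha in (0.10, 0.14, 0.20, 0.30):
    print(f"alpha={alpha:.2f}: v_hub = {power_law(v10, hub, alpha=alpha):.2f} m/s")
print(f"log law (z0=0.03 m): v_hub = {log_law(v10, hub):.2f} m/s")
```

Because turbine power varies roughly with the cube of wind speed below rated power, even modest differences in the extrapolated hub-height speed translate into large differences in estimated generation.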
Abstract:
Traditional vaccines, such as inactivated or live attenuated vaccines, are gradually giving way to more biochemically defined vaccines that are most often based on a recombinant antigen known to possess neutralizing epitopes. Such vaccines can offer improvements in speed, safety and manufacturing process, but an inevitable consequence of their high degree of purification is that immunogenicity is reduced through the lack of the innate triggering molecules present in more complex preparations. Targeting recombinant vaccines to antigen-presenting cells (APCs) such as dendritic cells, however, can improve immunogenicity by ensuring that antigen processing is as efficient as possible. Immune complexes, one of a number of routes of APC targeting, are mimicked by a recombinant approach, crystallizable fragment (Fc) fusion proteins, in which the target immunogen is linked directly to an antibody effector domain capable of interacting with receptors, FcR, on the APC surface. A number of virus Fc fusion proteins have been expressed in insect cells using the baculovirus expression system and shown to be efficiently produced and purified. Their use for immunization alongside non-Fc-tagged equivalents shows that they are powerfully immunogenic in the absence of added adjuvant and that immune stimulation is the result of the Fc-FcR interaction.
Abstract:
The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution is not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change are mainly through changes in the statistics of regional weather variations, the scientific and computational requirements to predict its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenges of developing the capabilities to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities, each with a computing capability of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have a sufficient scientific workforce to develop and maintain the software and data analysis infrastructure. Such facilities will make it possible to determine what horizontal and vertical resolution in atmospheric and ocean models is necessary for more confident predictions at the regional and local level; current limits on computing power have placed severe restrictions on such an investigation, which is now badly needed. These facilities will also provide the world's scientists with the computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure including hardware, software, and data analysis support, and the scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it. It will ultimately enable the climate community to provide society with climate predictions that are based on our best knowledge of science and the most advanced technology.
Abstract:
Meteorological (met) station data is used as the basis for a number of influential studies into the impacts of the variability of renewable resources. Real turbine output data is often not easy to acquire, whereas meteorological wind data, supplied at a standardised height of 10 m, is widely available. This data can be extrapolated to a standard turbine hub height using the wind profile power law and used to simulate the hypothetical power output of a turbine; utilising a number of met sites in this manner can produce a model of future wind generation output. However, the accuracy of this extrapolation is strongly dependent on the choice of the wind shear exponent α. This paper investigates the accuracy of the simulated generation output compared to reality using a wind farm in North Rhins, Scotland and a nearby met station in West Freugh. The results show that while a single annual average value for α may be selected to accurately represent the long-term energy generation from a simulated wind farm, there are significant differences between simulation and reality on an hourly power generation basis. This has implications for understanding the impact of the variability of renewables on short timescales, particularly for system balancing and for the way that conventional generation may be asked to respond to a high level of variable renewable generation on the grid in the future.
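As a hedged illustration of the kind of comparison this abstract describes, simulated hourly output (for example, extrapolated met-station winds passed through a turbine power curve) can match recorded annual energy closely while still differing substantially hour by hour. The power curve, array names, and metrics below are assumptions for illustration, not the paper's method.

```python
# Illustrative sketch: compare a simulated wind power series against a measured
# one, separating long-term energy bias from short-timescale (hourly) mismatch.
import numpy as np

def simple_power_curve(v, cut_in=3.5, rated_speed=13.0, cut_out=25.0, rated_kw=2000.0):
    """Piecewise turbine power curve: cubic ramp between cut-in and rated speed."""
    v = np.asarray(v, dtype=float)
    p = np.where((v >= cut_in) & (v < rated_speed),
                 rated_kw * ((v - cut_in) / (rated_speed - cut_in)) ** 3, 0.0)
    p = np.where((v >= rated_speed) & (v < cut_out), rated_kw, p)
    return p

def compare(simulated_kw, measured_kw):
    sim, obs = np.asarray(simulated_kw), np.asarray(measured_kw)
    annual_error = (sim.sum() - obs.sum()) / obs.sum()   # long-term energy bias
    hourly_rmse = np.sqrt(np.mean((sim - obs) ** 2))     # hourly mismatch
    corr = np.corrcoef(sim, obs)[0, 1]
    return {"annual_energy_error": annual_error,
            "hourly_rmse_kw": hourly_rmse,
            "corr": corr}
```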
Abstract:
High-resolution simulations over a large tropical domain (∼20°S–20°N, 42°E–180°E) using both explicit and parameterized convection are analyzed and compared to observations during a 10-day case study of an active Madden-Julian Oscillation (MJO) event. The parameterized convection model simulations at both 40 km and 12 km grid spacing have a very weak MJO signal and little eastward propagation. A 4 km explicit convection simulation using Smagorinsky subgrid mixing in the vertical and horizontal dimensions exhibits the best MJO strength and propagation speed. 12 km explicit convection simulations also perform much better than the 12 km parameterized convection run, suggesting that the convection scheme, rather than horizontal resolution, is key for these MJO simulations. Interestingly, a 4 km explicit convection simulation using the conventional boundary layer scheme for vertical subgrid mixing (but still using Smagorinsky horizontal mixing) completely loses the large-scale MJO organization, showing that relatively high resolution with explicit convection does not guarantee a good MJO simulation. Models with a good MJO representation have a more realistic relationship between lower-free-tropospheric moisture and precipitation, supporting the idea that moisture-convection feedback is a key process for MJO propagation. There is also increased generation of available potential energy and conversion of that energy into kinetic energy in models with a more realistic MJO, which is related to larger zonal variance in convective heating and vertical velocity, larger zonal temperature variance around 200 hPa, and larger correlations between temperature and ascent (and between temperature and diabatic heating) between 500 and 400 hPa.
Abstract:
Purpose: Increasing costs of health care, fuelled by demand for high-quality, cost-effective healthcare, have driven hospitals to streamline their patient care delivery systems. One such systematic approach is the adoption of Clinical Pathways (CPs) as a tool to increase the quality of healthcare delivery. However, most organizations still rely on paper-based pathway guidelines or specifications, which have limitations in process management and as a result can influence patient safety outcomes. In this paper, we present a method for generating clinical pathways based on organizational semiotics by capturing knowledge from the syntactic, semantic and pragmatic levels through to the social level. Design/methodology/approach: The proposed modeling approach to the generation of CPs adopts organizational semiotics and enables the generation of a semantically rich representation of CP knowledge. The Semantic Analysis Method (SAM) is applied to explicitly represent the semantics of the concepts, their relationships and patterns of behavior in terms of an ontology chart. The Norm Analysis Method (NAM) is adopted to identify and formally specify patterns of behavior and rules that govern the actions identified on the ontology chart. Information collected during semantic and norm analysis is integrated to guide the generation of CPs using best practice represented in BPMN, thus enabling the automation of CPs. Findings: This research confirms the necessity of taking social aspects into consideration when designing information systems and automating CPs. The complexity of healthcare processes can be best tackled by analyzing stakeholders, which we treat as social agents, their goals and patterns of action within the agent network. Originality/value: Current modeling methods describe CPs from a structural aspect comprising activities, properties and interrelationships. However, these methods lack a mechanism to describe possible patterns of human behavior and the conditions under which the behavior will occur. To overcome this weakness, a semiotic approach to the generation of clinical pathways is introduced. The CPs generated from SAM together with norms will enrich the knowledge representation of the domain through ontology modeling, which allows the recognition of human responsibilities and obligations and, more importantly, the ultimate power of decision making in exceptional circumstances.
Abstract:
Descent and spreading of high salinity water generated by salt rejection during sea ice formation in an Antarctic coastal polynya is studied using a hydrostatic, primitive equation three-dimensional ocean model called the Proudman Oceanographic Laboratory Coastal Ocean Modeling System (POLCOMS). The shape of the polynya is assumed to be a rectangle 100 km long and 30 km wide, and the salinity flux into the polynya at its surface is constant. The model has been run at high horizontal spatial resolution (500 m), and numerical simulations reveal a buoyancy-driven coastal current. The coastal current is a robust feature and appears in a range of simulations designed to investigate the influence of a sloping bottom, variable bottom drag, variable vertical turbulent diffusivities, higher salinity flux, and an offshore position of the polynya. It is shown that bottom drag is the main factor determining the current width. This coastal current has not been produced with other numerical models of polynyas, which may be because these models were run at coarser resolutions. The coastal current becomes unstable upstream of its front when the polynya is adjacent to the coast. When the polynya is situated offshore, an unstable current is produced from its outset owing to the capture of cyclonic eddies. The effect of a coastal protrusion and a canyon on the current motion is investigated. In particular, due to the convex shape of the coastal protrusion, the current sheds a dipolar eddy.
Abstract:
Epigenetic modification of the genome via cytosine methylation is a dynamic process that responds to changes in the growing environment. This modification can also be heritable. The combination of both properties means that there is the potential for the life experiences of the parental generation to modify the methylation profiles of their offspring and so potentially to ‘pre-condition’ them to better accommodate abiotic conditions encountered by their parents. We recently identified high vapor pressure deficit (vpd)-induced DNA methylation at two gene loci in the stomatal development pathway and an associated reduction in leaf stomatal frequency [1]. Here, we test whether this epigenetic modification pre-conditioned parents and their offspring to the more severe water stress of periodic drought. We found that three generations of high-vpd-grown plants were better able to withstand periodic drought stress over two generations. This resistance was not directly associated with de novo methylation of the target stomata genes, but was associated with the cmt3 mutant’s inability to maintain asymmetric sequence context methylation. If our finding applies widely, it could have significant implications for evolutionary biology and breeding for stressful environments.
Abstract:
MAGIC populations represent one of a new generation of crop genetic mapping resources combining high genetic recombination and diversity. We describe the creation and validation of an eight-parent MAGIC population consisting of 1091 F7 lines of winter-sown wheat (Triticum aestivum L.). Analyses based on genotypes from a 90,000-single nucleotide polymorphism (SNP) array find the population to be well-suited as a platform for fine-mapping quantitative trait loci (QTL) and gene isolation. Patterns of linkage disequilibrium (LD) show the population to be highly recombined; genetic marker diversity among the founders was 74% of that captured in a larger set of 64 wheat varieties, and 54% of SNPs segregating among the 64 lines also segregated among the eight founder lines. In contrast, a commonly used reference bi-parental population had only 54% of the diversity of the 64 varieties with 27% of SNPs segregating. We demonstrate the potential of this MAGIC resource by identifying a highly diagnostic marker for the morphological character "awn presence/absence" and independently validate it in an association-mapping panel. These analyses show this large, diverse, and highly recombined MAGIC population to be a powerful resource for the genetic dissection of target traits in wheat, and it is well-placed to efficiently exploit ongoing advances in phenomics and genomics. Genetic marker and trait data, together with instructions for access to seed, are available at http://www.niab.com/MAGIC/.
Abstract:
With a rapidly increasing fraction of electricity generation being sourced from wind, extreme wind power generation events, such as prolonged periods of low (or high) generation and ramps in generation, are a growing concern for the efficient and secure operation of national power systems. As extreme events occur infrequently, long and reliable meteorological records are required to accurately estimate their characteristics. Recent publications have begun to investigate the use of global meteorological “reanalysis” data sets for power system applications, many of which focus on long-term average statistics such as monthly-mean generation. Here we demonstrate that reanalysis data can also be used to estimate the frequency of relatively short-lived extreme events (including ramping on sub-daily time scales). Verification against 328 surface observation stations across the United Kingdom suggests that near-surface wind variability over spatiotemporal scales greater than around 300 km and 6 h can be faithfully reproduced using reanalysis, with no need for costly dynamical downscaling. A case study is presented in which a state-of-the-art, 33 year reanalysis data set (MERRA, from NASA-GMAO) is used to construct an hourly time series of nationally-aggregated wind power generation in Great Britain (GB), assuming a fixed, modern distribution of wind farms. The resultant generation estimates are highly correlated with recorded data from National Grid in the recent period, both for instantaneous hourly values and for variability over time intervals greater than around 6 h. This 33 year time series is then used to quantify the frequency with which different extreme GB-wide wind power generation events occur, as well as their seasonal and inter-annual variability. Several novel insights into the nature of extreme wind power generation events are described, including (i) that the number of prolonged low or high generation events is well approximated by a Poisson-like random process, and (ii) that, whilst in general there is large seasonal variability, the magnitude of the most extreme ramps is similar in both summer and winter. An up-to-date version of the GB case study data as well as the underlying model are freely available for download from our website: http://www.met.reading.ac.uk/~energymet/data/Cannon2014/.
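As an illustration of how such event statistics might be extracted from an hourly, nationally-aggregated generation series, the sketch below counts prolonged low-generation periods and finds the largest ramp over a fixed window. It is not the authors' model; the thresholds, window lengths, and variable names are arbitrary assumptions.

```python
# Hedged sketch: extract two extreme-event statistics from an hourly
# capacity-factor series cf (values in [0, 1]). Thresholds are illustrative.
import numpy as np

def prolonged_low_events(cf, threshold=0.1, min_hours=24):
    """Count runs of consecutive hours with capacity factor below `threshold`
    that last at least `min_hours` (each qualifying run counted once)."""
    below = np.asarray(cf) < threshold
    events, run = 0, 0
    for flag in below:
        run = run + 1 if flag else 0
        if run == min_hours:
            events += 1
    return events

def max_ramp(cf, window_hours=6):
    """Largest absolute change in capacity factor over any `window_hours` interval."""
    cf = np.asarray(cf)
    return np.max(np.abs(cf[window_hours:] - cf[:-window_hours]))
```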
Abstract:
Using data from the EISCAT (European Incoherent Scatter) VHF and CUTLASS (Co-operative UK Twin-Located Auroral Sounding System) HF radars, we study the formation of ionospheric polar cap patches and their relationship to the magnetopause reconnection pulses identified in the companion paper by Lockwood et al. (2005). It is shown that the poleward-moving, high-concentration plasma patches observed in the ionosphere by EISCAT on 23 November 1999, as reported by Davies et al. (2002), were often associated with corresponding reconnection rate pulses. However, not all such pulses generated a patch and only within a limited MLT range (11:00–12:00 MLT) did a patch result from a reconnection pulse. Three proposed mechanisms for the production of patches, and of the concentration minima that separate them, are analysed and evaluated: (1) concentration enhancement within the patches by cusp/cleft precipitation; (2) plasma depletion in the minima between the patches by fast plasma flows; and (3) intermittent injection of photoionisation-enhanced plasma into the polar cap. We devise a test to distinguish between the effects of these mechanisms. Some of the events repeat too frequently to apply the test. Others have sufficiently long repeat periods and mechanism (3) is shown to be the only explanation of three of the longer-lived patches seen on this day. However, effect (2) also appears to contribute to some events. We conclude that plasma concentration gradients on the edges of the larger patches arise mainly from local time variations in the subauroral plasma, via the mechanism proposed by Lockwood et al. (2000).