974 results for wind generation


Relevance: 20.00%

Abstract:

Traditional vaccines, such as inactivated or live attenuated preparations, are gradually giving way to more biochemically defined vaccines that are most often based on a recombinant antigen known to possess neutralizing epitopes. Such vaccines can offer improvements in speed, safety and manufacturing, but an inevitable consequence of their high degree of purification is that immunogenicity is reduced through the lack of the innate triggering molecules present in more complex preparations. Targeting recombinant vaccines to antigen-presenting cells (APCs), such as dendritic cells, can however improve immunogenicity by ensuring that antigen processing is as efficient as possible. Immune complexes, one of a number of routes of APC targeting, are mimicked by a recombinant approach, crystallizable fragment (Fc) fusion proteins, in which the target immunogen is linked directly to an antibody effector domain capable of interacting with receptors (FcR) on the APC surface. A number of viral Fc fusion proteins have been expressed in insect cells using the baculovirus expression system and shown to be efficiently produced and purified. Their use for immunization alongside non-Fc-tagged equivalents shows that they are powerfully immunogenic in the absence of added adjuvant and that immune stimulation is the result of the Fc-FcR interaction.

Relevance: 20.00%

Abstract:

The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution are not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenges of developing the capability to predict climate variability and change on both global and regional scales over the coming decades. Because of the complexity of the climate system, and because the regional manifestations of climate change appear mainly as changes in the statistics of regional weather variations, the scientific and computational requirements for reliable prediction are too great for any one nation to meet alone. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities with computer capability at each facility of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have a scientific workforce sufficient to develop and maintain the software and data analysis infrastructure.
Such facilities will enable investigation of what horizontal and vertical resolution in atmospheric and ocean models is necessary for more confident predictions at the regional and local level. Current limits on computing power have placed severe constraints on such an investigation, which is now badly needed. These facilities will also provide the world's scientists with the computational laboratories for fundamental research on weather-climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure, including hardware, software, and data analysis support, and the scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it. It will ultimately enable the climate community to provide society with climate predictions based on our best knowledge of science and the most advanced technology.

Relevance: 20.00%

Abstract:

Design summer years representing near-extreme hot summers have been used in the United Kingdom for the evaluation of thermal comfort and overheating risk. The years have been selected from measured weather data taken as representative of an assumed stationary climate. Recent developments have made available ‘morphed’ equivalents of these years, produced by shifting and stretching the measured variables using change factors from the UKCIP02 climate projections. The release of the latest, probabilistic, climate projections of UKCP09, together with the availability of a weather generator that can produce plausible daily or hourly sequences of weather variables, has opened up the opportunity to generate new design summer years for use in risk-based decision-making. There are many possible methods for the production of design summer years from UKCP09 output: in this article, the original concept of the design summer year is largely retained, but a number of alternative methodologies for generating the years are explored. An alternative, more robust measure of warmth (weighted cooling degree hours) is also employed. It is demonstrated that the UKCP09 weather generator is capable of producing years for the baseline period that are comparable with those in current use. Four methodologies for the generation of future years are described, and their output is related to the future (deterministic) years that are currently available. It is concluded that, in general, years produced from the UKCP09 projections are warmer than those generated previously. Practical applications: The methodologies described in this article will enable designers who have access to the output of the UKCP09 weather generator (WG) to generate Design Summer Year hourly files tailored to their needs. The files produced will differ according to the methodology selected, in addition to location, emissions scenario and timeslice.
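As an illustration of the ranking step, here is a minimal sketch of a weighted cooling degree hours metric and a near-extreme year selection. The base temperature, the squared-exceedance weighting, and the third-warmest-year rule are illustrative assumptions, not the exact definitions used with UKCP09.

```python
import numpy as np

def weighted_cdh(hourly_temps, base=22.0):
    """Weighted cooling degree hours: exceedances above a base
    temperature, weighted by their own magnitude (squared here, one
    plausible weighting) so that hotter hours count disproportionately."""
    excess = np.clip(np.asarray(hourly_temps, dtype=float) - base, 0.0, None)
    return float(np.sum(excess ** 2))

def pick_design_summer_year(years_to_temps):
    """Rank candidate years by weighted CDH and return the near-extreme
    (third warmest) year, echoing the traditional DSY choice of a
    mid-ranking warm year rather than the most extreme one."""
    ranked = sorted(years_to_temps,
                    key=lambda y: weighted_cdh(years_to_temps[y]),
                    reverse=True)
    return ranked[2]  # third warmest = near-extreme
```

In practice the candidate years would be weather-generator output for a chosen location, emissions scenario and timeslice, rather than the toy series used here.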

Relevance: 20.00%

Abstract:

The variability of renewable energy is widely recognised as a challenge for integrating high levels of renewable generation into electricity systems. However, to explore its implications effectively, variability itself should first be clearly understood. This is particularly true for national electricity systems with high planned penetration of renewables and limited interconnection such as the UK. Variability cannot be considered as a distinct resource property with a single measurable parameter, but is a multi-faceted concept best described by a range of distinct characteristics. This paper identifies relevant characteristics of variability, and considers their implications for energy research. This is done through analysis of wind, solar and tidal current resources, with a primary focus on the Bristol Channel region in the UK. The relationship with electricity demand is considered, alongside the potential benefits of resource diversity. Analysis is presented in terms of persistence, distribution, frequency and correlation between supply and demand. Marked differences are seen between the behaviours of the individual resources, and these give rise to a range of different implications for system integration. Wind shows strong persistence and a useful seasonal pattern, but also a high spread in energy levels at timescales beyond one or two days. The solar resource is most closely correlated with electricity demand, but is undermined by night-time zero values and an even greater spread of monthly energy delivered than wind. In contrast, the tidal resource exhibits very low persistence, but also much greater consistency in energy values assessed across monthly time scales. Whilst this paper focuses primarily on the behaviour of resources, it is noted that discrete variability characteristics can be related to different system impacts. 
Persistence and predictability are relevant for system balancing, whereas statistical distribution is more relevant when exploring issues of asset utilisation and energy curtailment. Areas of further research are also identified, including the need to assess the value of predictability in relation to other characteristics.
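Two of the characteristics named above, persistence and the correlation between supply and demand, can be sketched with simple statistics. Using lag autocorrelation as the persistence proxy is an assumption here, not necessarily the paper's exact measure.

```python
import numpy as np

def persistence(series, lag=1):
    """Lag autocorrelation as a simple persistence measure: values
    near 1 mean the resource tends to stay at its current level from
    one time step to the next (wind, in the text); values near 0 mean
    little carry-over."""
    x = np.asarray(series, dtype=float)
    return float(np.corrcoef(x[:-lag], x[lag:])[0, 1])

def supply_demand_correlation(supply, demand):
    """Pearson correlation between a generation time series and an
    electricity demand series (highest for solar, per the text)."""
    return float(np.corrcoef(supply, demand)[0, 1])
```

Real resource and demand series would replace the synthetic inputs; the same two numbers then allow the wind, solar and tidal behaviours described above to be compared directly.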

Relevance: 20.00%

Abstract:

Control and optimization of flavor is the ultimate challenge for the food and flavor industry. The major route to flavor formation during thermal processing is the Maillard reaction, a complex cascade of interdependent reactions initiated by the reaction between a reducing sugar and an amino compound. The complexity of the reaction means that researchers turn to kinetic modeling in order to understand the control points of the reaction and to manipulate the flavor profile. Studies of the kinetics of flavor formation have developed over the past 30 years from single-response empirical models of binary aqueous systems to sophisticated multi-response models in food matrices, based on the underlying chemistry, with the power to predict the formation of some key aroma compounds. This paper discusses in detail the development of kinetic models of thermal flavor generation and looks at the challenges involved in predicting flavor.
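A toy two-step kinetic scheme can give a feel for what a (much richer) multi-response Maillard model computes numerically. The rate constants, time step and Euler integration below are purely illustrative; a real model would fit the constants to several measured species simultaneously.

```python
def maillard_sketch(sugar0=1.0, amino0=1.0, k1=0.05, k2=0.02,
                    dt=0.1, steps=1000):
    """Hypothetical two-step scheme integrated with forward Euler:
      sugar + amino -> intermediate   (second order, rate k1)
      intermediate  -> aroma compound (first order,  rate k2)
    Returns final concentrations of all four species, i.e. a
    multi-response output rather than a single fitted curve."""
    s, a, i, f = sugar0, amino0, 0.0, 0.0
    for _ in range(steps):
        r1 = k1 * s * a   # condensation of sugar and amino compound
        r2 = k2 * i       # breakdown of intermediate to aroma compound
        s -= r1 * dt
        a -= r1 * dt
        i += (r1 - r2) * dt
        f += r2 * dt
    return {"sugar": s, "amino": a, "intermediate": i, "aroma": f}
```

Note that sugar-derived mass is conserved across the three pools (sugar, intermediate, aroma), which is the kind of constraint that makes multi-response fitting more robust than fitting one response in isolation.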

Relevance: 20.00%

Abstract:

Purpose: Increasing costs of health care, fuelled by demand for high-quality, cost-effective care, have driven hospitals to streamline their patient care delivery systems. One such systematic approach is the adoption of Clinical Pathways (CPs) as a tool to increase the quality of healthcare delivery. However, most organizations still rely on paper-based pathway guidelines or specifications, which have limitations in process management and can as a result influence patient safety outcomes. In this paper, we present a method for generating clinical pathways based on organizational semiotics, capturing knowledge from the syntactic, semantic and pragmatic levels up to the social level. Design/methodology/approach: The proposed modeling approach to the generation of CPs adopts organizational semiotics and enables the generation of a semantically rich representation of CP knowledge. The Semantic Analysis Method (SAM) is applied to represent explicitly the semantics of the concepts, their relationships and patterns of behavior in terms of an ontology chart. The Norm Analysis Method (NAM) is adopted to identify and formally specify patterns of behavior and the rules that govern the actions identified on the ontology chart. Information collected during semantic and norm analysis is integrated to guide the generation of CPs using best practice represented in BPMN, thus enabling the automation of CPs. Findings: This research confirms the necessity of taking social aspects into consideration when designing information systems and automating CPs. The complexity of healthcare processes is best tackled by analyzing stakeholders, which we treat as social agents, together with their goals and patterns of action within the agent network. Originality/value: Current modeling methods describe CPs from a structural aspect comprising activities, properties and interrelationships. However, these methods lack a mechanism to describe possible patterns of human behavior and the conditions under which the behavior will occur. To overcome this weakness, a semiotic approach to the generation of clinical pathways is introduced. The CP generated from SAM together with norms enriches the knowledge representation of the domain through ontology modeling, which allows the recognition of human responsibilities and obligations and, more importantly, of the ultimate power of decision making in exceptional circumstances.
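One way to picture a NAM-style norm is as a "whenever/if/then" rule attached to an agent and a deontic operator (obliged, permitted, prohibited). The field names and the example rule below are hypothetical illustrations, not the paper's actual schema.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Norm:
    """A behaviour norm in the whenever/if/then form used by Norm
    Analysis Method-style specifications (illustrative field names)."""
    whenever: str                       # triggering context
    condition: Callable[[dict], bool]   # predicate over pathway state
    agent: str                          # responsible social agent
    deontic: str                        # 'obliged' | 'permitted' | 'prohibited'
    action: str                         # the governed action

def fired_norms(norms, state):
    """Return (agent, deontic, action) triples whose condition holds in
    the current state: the responsibilities active at this pathway step."""
    return [(n.agent, n.deontic, n.action)
            for n in norms if n.condition(state)]
```

A rule set like this is one possible intermediate representation between the ontology chart and executable BPMN, keeping the agent responsibilities explicit.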

Relevance: 20.00%

Abstract:

Our aim is to reconstruct the brain-body loop of stroke patients via an EEG-driven robotic system. Once generation of a motor command is detected, the robotic arm should assist the patient's movement at the correct moment and in a natural way. In this study we performed EEG measurements on healthy subjects performing discrete spontaneous motions. An EEG analysis based on the temporal correlation of brain activity was employed to determine the onset of motor command generation for single motions.
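A simplified stand-in for correlation-based onset detection: slide a template over the signal and flag the first window whose correlation with the template exceeds a threshold. The paper's actual analysis of temporal correlation is more involved; this only illustrates the general shape of such a detector.

```python
import numpy as np

def detect_onset(signal, template, threshold=0.8):
    """Return the first sample index at which the sliding window of the
    signal correlates with the template above the threshold, or None if
    no window qualifies (a toy proxy for motor-command onset detection)."""
    n = len(template)
    for start in range(len(signal) - n + 1):
        window = signal[start:start + n]
        r = np.corrcoef(window, template)[0, 1]
        if r > threshold:
            return start
    return None
```

In a real EEG setting the template would be learned from pre-movement epochs and the threshold calibrated against rest data, since raw correlation on noisy channels produces false positives.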

Relevance: 20.00%

Abstract:

Nearly all chemistry–climate models (CCMs) have a systematic bias of a delayed springtime breakdown of the Southern Hemisphere (SH) stratospheric polar vortex, implying insufficient stratospheric wave drag. In this study the Canadian Middle Atmosphere Model (CMAM) and the CMAM Data Assimilation System (CMAM-DAS) are used to investigate the cause of this bias. Zonal wind analysis increments from CMAM-DAS reveal systematic negative values in the stratosphere near 60°S in winter and early spring. These are interpreted as indicating a bias in the model physics, namely, missing gravity wave drag (GWD). The negative analysis increments remain at a nearly constant height during winter and descend as the vortex weakens, much like orographic GWD. This region is also where current orographic GWD parameterizations have a gap in wave drag, which is suggested to be unrealistic because of missing effects in those parameterizations. These findings motivate a pair of free-running CMAM simulations to assess the impact of extra orographic GWD at 60°S. The control simulation exhibits the cold-pole bias and delayed vortex breakdown seen in the CCMs. In the simulation with extra GWD, the cold-pole bias is significantly reduced and the vortex breaks down earlier. Changes in resolved wave drag in the stratosphere also occur in response to the extra GWD, which reduce stratospheric SH polar-cap temperature biases in late spring and early summer. Reducing the dynamical biases, however, results in degraded Antarctic column ozone. This suggests that CCMs that obtain realistic column ozone in the presence of an overly strong and persistent vortex may be doing so through compensating errors.
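The analysis-increment diagnostic used here reduces to a simple computation: the time mean of analysis minus background forecast. A systematically negative zonal-wind increment then points to a westward force missing from the model physics, such as GWD. A minimal sketch, with hypothetical array shapes (time, latitude):

```python
import numpy as np

def mean_increment(analyses, backgrounds):
    """Time-mean analysis increment: analysis minus the model's
    background forecast, averaged over analysis cycles. A persistent
    nonzero mean flags a systematic bias in the model physics at that
    location rather than random forecast error."""
    inc = np.asarray(analyses, dtype=float) - np.asarray(backgrounds, dtype=float)
    return inc.mean(axis=0)
```

Applied to zonal wind near 60°S, a time-mean increment below zero is the signature interpreted in the text as missing gravity wave drag.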

Relevance: 20.00%

Abstract:

The goal of the Chemistry-Climate Model Validation (CCMVal) activity is to improve understanding of chemistry-climate models (CCMs) through process-oriented evaluation and to provide reliable projections of stratospheric ozone and its impact on climate. An appreciation of the details of model formulations is essential for understanding how models respond to the changing external forcings of greenhouse gases and ozone-depleting substances, and hence for understanding the ozone and climate forecasts produced by the models participating in this activity. Here we introduce and review the models used for the second round (CCMVal-2) of this intercomparison, with regard to the implementation of chemical, transport, radiative, and dynamical processes. In particular, we review the advantages and problems associated with the approaches used to model processes relevant to stratospheric dynamics and chemistry. Furthermore, we state the definitions of the reference simulations performed and describe the forcing data used in these simulations. We identify some developments in chemistry-climate modeling that make models more physically based or more comprehensive, including the introduction of an interactive ocean, online photolysis, troposphere-stratosphere chemistry, and non-orographic gravity-wave deposition linked to tropospheric convection. These relatively new developments indicate that stratospheric CCM modeling is becoming more consistent with our physically based understanding of the atmosphere.

Relevance: 20.00%

Abstract:

Analysis of the variability of equatorial ozone profiles in the Satellite Aerosol and Gas Experiment-corrected Solar Backscatter Ultraviolet data set demonstrates a strong seasonal persistence of interannual ozone anomalies, revealing a seasonal dependence to equatorial ozone variability. In the lower stratosphere (40–25 hPa) and in the upper stratosphere (6–4 hPa), ozone anomalies persist from approximately November until June of the following year, while ozone anomalies in the layer between 16 and 10 hPa persist from June to December. Analysis of zonal wind fields in the lower stratosphere and temperature fields in the upper stratosphere reveals a similar seasonal persistence of the zonal wind and temperature anomalies associated with the quasi-biennial oscillation (QBO). Thus, the persistence of interannual ozone anomalies in the lower and upper equatorial stratosphere, which are mainly associated with the well-known QBO ozone signal through the QBO-induced meridional circulation, is related to a newly identified seasonal persistence of the QBO itself. The upper-stratospheric QBO ozone signal is argued to arise from a combination of QBO-induced temperature and NOx perturbations, with the former dominating at 5 hPa and the latter at 10 hPa. Ozone anomalies in the transition zone between dynamical and photochemical control of ozone (16–10 hPa) are less influenced by the QBO signal and show a quite different seasonal persistence compared with the regions above and below.
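The seasonal-persistence analysis rests on two steps that are easy to sketch: remove the mean annual cycle to isolate interannual anomalies, then correlate a given calendar month's anomalies with a later month's across years. The years-by-12-months layout below is an assumption for illustration, not the paper's processing chain.

```python
import numpy as np

def interannual_anomalies(monthly, period=12):
    """Remove the mean seasonal cycle from a monthly series (length a
    multiple of 12); rows of the result are per-year interannual
    anomalies for each calendar month."""
    x = np.asarray(monthly, dtype=float).reshape(-1, period)
    return x - x.mean(axis=0)

def month_to_month_persistence(anoms, m0, m1):
    """Correlate anomalies of calendar month m0 with month m1 across
    years: values near 1 mean anomalies set up in m0 persist to m1,
    the sense in which November anomalies persist until June."""
    return float(np.corrcoef(anoms[:, m0], anoms[:, m1])[0, 1])
```

Mapping this diagnostic across all (m0, m1) pairs and pressure levels is what distinguishes the November-to-June regime from the June-to-December regime described above.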

Relevance: 20.00%

Abstract:

A climatology of the late summer stratospheric zonal wind turnaround phenomenon is presented, with a particular focus on the behaviour over the Meteorological Service of Canada's balloon-launching site at Vanscoy, Saskatchewan (52°N, 107°W). Turnaround refers to the change in sign of the zonal wind velocity and occurs twice each year at stratospheric mid-latitudes, in early spring and in late summer. The late summer turnaround is of particular interest to the high-altitude ballooning community because it offers the ideal conditions for launch, but it is also an interesting dynamical phenomenon in its own right. It is studied here using both the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) reanalysis and the United Kingdom Meteorological Office (MetO) analysis products, as well as climate simulation data from the Canadian Middle Atmosphere Model (CMAM). The phenomenon and its interannual variability are documented. The predictability of the late summer turnaround over Vanscoy is investigated using both statistical averages and autocorrelation analysis. From the statistical averages, it is found that during every year since 1993, the period from 26 August to 5 September has contained appropriate launch dates. From the autocorrelation analysis, it is found that stratospheric zonal wind anomalies can persist for a month or more during most of the summer, but there is a predictability horizon at the end of the summer, just before turnaround.
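Turnaround itself is just a sign change in the zonal wind, so the detection step is easy to sketch. A daily-resolution series and a simple zero-crossing test are assumptions here; in practice the two annual turnarounds are found this way in each year of analysis data.

```python
import numpy as np

def turnaround_indices(u):
    """Indices where the zonal wind series changes sign (westerly to
    easterly or back); at stratospheric mid-latitudes this happens
    twice a year, in early spring and in late summer."""
    u = np.asarray(u, dtype=float)
    return [i for i in range(1, len(u)) if u[i - 1] * u[i] < 0]
```

Applied to a year of daily stratospheric winds over the launch site, the second index of the year marks the late summer turnaround and hence the favourable launch window.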

Relevance: 20.00%

Abstract:

This chapter presents techniques used for the generation of 3D digital elevation models (DEMs) from remotely sensed data. Three methods are explored and discussed: optical stereoscopic imagery, Interferometric Synthetic Aperture Radar (InSAR), and LIght Detection and Ranging (LIDAR). For each approach, the state of the art presented in the literature is reviewed. Techniques involved in DEM generation are presented together with an evaluation of their accuracy, and results of DEMs reconstructed from remotely sensed data are illustrated. While DEM generation from satellite stereoscopic imagery is a good example of the passive, multi-view imaging technology discussed in Chap. 2 of this book, InSAR and LIDAR use different principles to acquire 3D information, and both are discussed in detail in order to convey their fundamentals.
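As a flavour of the LIDAR route, here is a minimal gridding sketch that bins (x, y, z) returns into cells and keeps the lowest return per cell as a crude bare-earth surface. Real DEM production involves ground filtering, interpolation of gaps, and accuracy assessment far beyond this; non-negative coordinates are assumed.

```python
import numpy as np

def lidar_to_dem(points, cell=1.0, nx=None, ny=None):
    """Grid a LIDAR point cloud (rows of x, y, z) into a DEM by keeping
    the lowest return per cell, a crude bare-earth approximation that
    discards vegetation hits; empty cells remain NaN."""
    pts = np.asarray(points, dtype=float)
    xi = (pts[:, 0] // cell).astype(int)
    yi = (pts[:, 1] // cell).astype(int)
    nx = nx or int(xi.max()) + 1
    ny = ny or int(yi.max()) + 1
    dem = np.full((ny, nx), np.nan)
    for cx, cy, z in zip(xi, yi, pts[:, 2]):
        if np.isnan(dem[cy, cx]) or z < dem[cy, cx]:
            dem[cy, cx] = z
    return dem
```

The choice of minimum rather than mean per cell is itself an assumption, standing in for the ground-filtering step of a production pipeline.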

Relevance: 20.00%

Abstract:

The redistribution of a finite amount of martian surface dust during global dust storms and in the intervening periods has been modelled in a dust lifting version of the UK Mars General Circulation Model. When using a constant, uniform threshold in the model’s wind stress lifting parameterisation and assuming an unlimited supply of surface dust, multiannual simulations displayed some variability in dust lifting activity from year to year, arising from internal variability manifested in surface wind stress, but dust storms were limited in size and formed within a relatively short seasonal window. Lifting thresholds were then allowed to vary at each model gridpoint, dependent on the rates of emission or deposition of dust. This enhanced interannual variability in dust storm magnitude and timing, such that model storms covered most of the observed ranges in size and initiation date within a single multiannual simulation. Peak storm magnitude in a given year was primarily determined by the availability of surface dust at a number of key sites in the southern hemisphere. The observed global dust storm (GDS) frequency of roughly one in every 3 years was approximately reproduced, but the model failed to generate these GDSs spontaneously in the southern hemisphere, where they have typically been observed to initiate. After several years of simulation, the surface threshold field—a proxy for net change in surface dust density—showed good qualitative agreement with the observed pattern of martian surface dust cover. The model produced a net northward cross-equatorial dust mass flux, which necessitated the addition of an artificial threshold decrease rate in order to allow the continued generation of dust storms over the course of a multiannual simulation. 
At standard model resolution, for the southward mass flux due to cross-equatorial flushing storms to offset the northward flux due to GDSs on a timescale of ∼3 years would require an increase in the former by a factor of 3–4. Results at higher model resolution and uncertainties in dust vertical profiles mean that quasi-periodic redistribution of dust on such a timescale nevertheless appears to be a plausible explanation for the observed GDS frequency.
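The variable-threshold lifting scheme can be caricatured at a single gridpoint: dust lifts when wind stress exceeds a local threshold and surface dust remains, the threshold rises as dust is removed, and an artificial slow decrease sustains future lifting, as in the scheme described above. All rates below are illustrative, not the model's tuned values.

```python
def step_threshold(threshold, wind_stress, dust_supply,
                   lift_rate=0.1, thr_gain=0.05, thr_decay=0.001):
    """One time step of a variable-threshold wind stress lifting scheme
    at a single gridpoint (illustrative rates). Returns the updated
    threshold, remaining surface dust, and the amount lifted."""
    lifted = 0.0
    if wind_stress > threshold and dust_supply > 0.0:
        # Lifting proportional to the stress excess, capped by supply.
        lifted = min(lift_rate * (wind_stress - threshold), dust_supply)
        dust_supply -= lifted
        # Net emission raises the threshold, shutting lifting down as
        # the readily available dust is exhausted.
        threshold += thr_gain * lifted
    else:
        # Slow relaxation, standing in for deposition plus the paper's
        # artificial decrease rate that sustains storm generation.
        threshold = max(0.0, threshold - thr_decay)
    return threshold, dust_supply, lifted
```

Iterating this rule at every gridpoint is what lets storm magnitude in a given year depend on how much surface dust earlier storms left behind at the key source sites.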