Abstract:
Expression of biologically active molecules as fusion proteins with antibody Fc can substantially extend the plasma half-life of the active agent but may also influence function. We have previously generated a number of fusion proteins comprising a complement regulator coupled to Fc and shown that the hybrid molecule has a long plasma half-life and retains biological activity. However, several of the fusion proteins generated had substantially reduced biological activity when compared with the native regulator or regulator released from the Fc following papain cleavage. We have taken advantage of this finding to engineer a prodrug with low complement regulatory activity that is cleaved at sites of inflammation to release active regulator. Two model prodrugs, comprising, respectively, the four short consensus repeats of human decay accelerating factor (CD55) linked to IgG4 Fc and the three NH2-terminal short consensus repeats of human decay accelerating factor linked to IgG2 Fc have been developed. In each, specific cleavage sites for matrix metalloproteinases and/or aggrecanases have been incorporated between the complement regulator and the Fc. These prodrugs have markedly decreased complement inhibitory activity when compared with the parent regulator in vitro. Exposure of the prodrugs to the relevant enzymes, either purified, or in supernatants of cytokine-stimulated chondrocytes or in synovial fluid, efficiently cleaved the prodrug, releasing active regulator. Such agents, having negligible systemic effects but active at sites of inflammation, represent a paradigm for the next generation of anti-C therapeutics.
Abstract:
A prerequisite for the enrichment of antibodies screened from phage display libraries is their stable expression on a phage during multiple selection rounds. Thus, if stringent panning procedures are employed, selection is simultaneously driven by antigen affinity, stability and solubility. To take advantage of robust pre-selected scaffolds of such molecules, we grafted single-chain Fv (scFv) antibodies, previously isolated from a human phage display library after multiple rounds of in vitro panning on tumor cells, with the specificity of the clinically established murine monoclonal anti-CD22 antibody RFB4. We show that a panel of grafted scFvs retained the specificity of the murine monoclonal antibody, bound to the target antigen with high affinity (6.4-9.6 nM), and exhibited exceptional biophysical stability with retention of 89-93% of the initial binding activity after 6 days of incubation in human serum at 37 °C. Selection of stable human scaffolds with high sequence identity to both the human germline and the rodent frameworks required only a small number of murine residues to be retained within the human frameworks in order to maintain the structural integrity of the antigen binding site. We expect this approach may be applicable for the rapid generation of highly stable humanized antibodies with low immunogenic potential.
Abstract:
Quantitative control of aroma generation during the Maillard reaction presents great scientific and industrial interest. Although there have been many studies conducted in simplified model systems, the results are difficult to apply to complex food systems, where the presence of other components can have a significant impact. In this work, an aqueous extract of defatted beef liver was chosen as a simplified food matrix for studying the kinetics of the Maillard reaction. Aliquots of the extract were heated under different time and temperature conditions and analyzed for sugars, amino acids, and methylbutanals, which are important Maillard-derived aroma compounds formed in cooked meat. Multiresponse kinetic modeling, based on a simplified mechanistic pathway, gave a good fit with the experimental data, but only when additional steps were introduced to take into account the interactions of glucose and glucose-derived intermediates with protein and other amino compounds. This emphasizes the significant role of the food matrix in controlling the Maillard reaction.
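The kind of multiresponse kinetic model the abstract describes can be sketched in a few lines. The pathway below (glucose + amino acid → intermediate → methylbutanal, with a competing step that binds intermediates to protein) and all rate constants are illustrative assumptions for exposition, not the fitted scheme or values from the study:

```python
# Simplified Maillard pathway, integrated by forward Euler.
# Species: G = glucose, A = amino acid, I = reactive intermediate,
# M = methylbutanal (aroma), P = protein amino groups (matrix sink).
# Rate constants k1..k3 are illustrative, not fitted values.

def simulate(k1, k2, k3, glucose0=10.0, amino0=10.0, protein0=5.0,
             dt=0.01, t_end=10.0):
    """Forward-Euler integration of the assumed simplified pathway."""
    G, A, I, M, P = glucose0, amino0, 0.0, 0.0, protein0
    t = 0.0
    while t < t_end:
        r1 = k1 * G * A          # glucose + amino acid -> intermediate
        r2 = k2 * I              # intermediate -> methylbutanal
        r3 = k3 * I * P          # intermediate captured by protein (matrix effect)
        G -= r1 * dt
        A -= r1 * dt
        I += (r1 - r2 - r3) * dt
        M += r2 * dt
        P -= r3 * dt
        t += dt
    return G, A, I, M, P

# Switching the protein sink off (k3 = 0) yields more aroma compound,
# mimicking the matrix interaction the study had to add to fit its data:
no_matrix = simulate(0.05, 0.2, 0.0)
with_matrix = simulate(0.05, 0.2, 0.1)
```

In a real multiresponse fit, the rate constants would be estimated by least squares against all measured responses (sugars, amino acids, methylbutanals) simultaneously.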
Abstract:
Waves with periods shorter than the inertial period exist in the atmosphere (as inertia-gravity waves) and in the oceans (as Poincaré and internal gravity waves). Such waves owe their origin to various mechanisms, but of particular interest are those arising either from local secondary instabilities or from spontaneous emission due to loss of balance. These phenomena have been studied in the laboratory, both in the mechanically-forced and the thermally-forced rotating annulus. However, their generation mechanisms, especially in the latter system, are not yet fully understood. Here we examine short-period waves in a numerical model of the rotating thermal annulus, and show how the results are consistent with those from earlier laboratory experiments. We then show how these waves are consistent with being inertia-gravity waves generated by a localised instability within the thermal boundary layer, the location of which is determined by regions of strong shear and downwelling at certain points within a large-scale baroclinic wave flow. The resulting instability launches small-scale inertia-gravity waves into the geostrophic interior of the flow. Their behaviour is captured in fully nonlinear numerical simulations in a finite-difference, 3D Boussinesq Navier-Stokes model. Such a mechanism has many similarities with those responsible for launching small- and meso-scale inertia-gravity waves in the atmosphere from fronts and local convection.
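The defining property quoted at the start, periods shorter than the inertial period, follows from the standard Boussinesq dispersion relation for inertia-gravity waves, whose intrinsic frequency is bounded between the Coriolis parameter f and the buoyancy frequency N. The parameter values below are typical midlatitude numbers assumed for illustration, not values from the annulus experiments:

```python
import math

# Boussinesq dispersion relation for inertia-gravity waves:
#   omega^2 = (N^2 * kh^2 + f^2 * m^2) / (kh^2 + m^2)
# with kh the horizontal and m the vertical wavenumber. The intrinsic
# frequency omega always satisfies f < omega < N, i.e. the wave period
# is shorter than the inertial period 2*pi/f.

def ig_wave_frequency(kh, m, f, N):
    """Intrinsic frequency (rad/s) of an inertia-gravity wave."""
    return math.sqrt((N**2 * kh**2 + f**2 * m**2) / (kh**2 + m**2))

f = 1.0e-4   # Coriolis parameter, s^-1 (assumed midlatitude value)
N = 1.0e-2   # buoyancy frequency, s^-1 (assumed typical stratification)

# Steep (kh ~ m) waves approach the buoyancy frequency;
# nearly-horizontal (kh << m) waves approach the inertial frequency.
omega_steep = ig_wave_frequency(kh=1e-3, m=1e-3, f=f, N=N)
omega_flat = ig_wave_frequency(kh=1e-6, m=1e-3, f=f, N=N)
```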
Abstract:
A new mild method has been devised for generating o-(naphtho)quinone methides via fluoride-induced desilylation of silyl derivatives of o-hydroxybenzyl (or 1-naphthylmethyl) nitrate. The reactive o-(naphtho)quinone methide intermediates were trapped by C, O, N and S nucleophiles and underwent "inverse electron-demand" hetero Diels-Alder reactions with dienophiles to give stable adducts. The method has potentially useful applications in natural product synthesis and drug research.
Abstract:
Military doctrine is one of the conceptual components of war. Its raison d'être is that of a force multiplier. It enables a smaller force to take on and defeat a larger force in battle. This article's departure point is the aphorism of Sir Julian Corbett, who described doctrine as 'the soul of warfare'. The second dimension to creating a force multiplier effect is forging doctrine with an appropriate command philosophy. The challenge for commanders is how, in unique circumstances, to formulate, disseminate and apply an appropriate doctrine and combine it with a relevant command philosophy. This can only be achieved by policy-makers and senior commanders successfully answering the Clausewitzian question: what kind of conflict are they involved in? Once an answer has been provided, a synthesis of these two factors can be developed and applied. Doctrine has implications for all three levels of war. Tactically, doctrine does two things: first, it helps to create a tempo of operations; second, it develops a transitory quality that will produce operational effect, and ultimately facilitate the pursuit of strategic objectives. At the tactical level its function is to provide both training and instruction; at the operational level, instruction and understanding are the critical functions; at the strategic level, it provides understanding and direction. Using John Gooch's six components of doctrine, it will be argued that there is a lacuna in the theory of doctrine, as these components can manifest themselves in very different ways at the three levels of war. They can in turn affect the transitory quality of tactical operations. Doctrine is pivotal to success in war. Without doctrine and the appropriate command philosophy, military operations cannot be successfully concluded against an active and determined foe.
Abstract:
This article combines institutional and resource-based arguments to show that the institutional distance between the home and the host country, and the headquarters' financial performance, have a relevant impact on the environmental standardization decision in multinational companies. Using a sample of 135 multinational companies in three different industries with headquarters and subsidiaries based in the USA, Canada, Mexico, France, and Spain, we find that a high environmental institutional distance between headquarters' and subsidiaries' countries deters the standardization of environmental practices. On the other hand, high-profit headquarters are willing to standardize their environmental practices, rather than taking advantage of countries with lax environmental protection to undertake more pollution-intensive activities. Finally, we show that headquarters' financial performance also exerts a moderating effect on the relationship between environmental institutional distance between countries and environmental standardization within the multinational company.
Abstract:
Models of root system growth emerged in the early 1970s, and were based on mathematical representations of root length distribution in soil. The last decade has seen the development of more complex architectural models and the use of computer-intensive approaches to study developmental and environmental processes in greater detail. There is a pressing need for predictive technologies that can integrate root system knowledge, scaling from molecular to ensembles of plants. This paper makes the case for more widespread use of simpler models of root systems based on continuous descriptions of their structure. A new theoretical framework is presented that describes the dynamics of root density distributions as a function of individual root developmental parameters such as rates of lateral root initiation, elongation, mortality, and gravitropism. The simulations resulting from such equations can be performed most efficiently in discretized domains that deform as a result of growth, and that can be used to model the growth of many interacting root systems. The modelling principles described help to bridge the gap between continuum and architectural approaches, and enhance our understanding of the spatial development of root systems. Our simulations suggest that root systems develop in travelling wave patterns of meristems, revealing order in otherwise spatially complex and heterogeneous systems. Such knowledge should assist physiologists and geneticists to appreciate how meristem dynamics contribute to the pattern of growth and functioning of root systems in the field.
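The continuous-density idea can be illustrated with a one-dimensional toy model: meristem density advected downward at the elongation rate, with branching and mortality terms. This is our own minimal sketch of the modelling principle (upwind finite differences, illustrative parameter values), not the paper's framework or code:

```python
# Toy continuous root-density model: rho(x, t) is meristem density with depth x.
# d(rho)/dt = -elong * d(rho)/dx + (branch - mort) * rho
# Discretized with a first-order upwind scheme; all parameters are assumed.

def step(rho, elong, branch, mort, dx, dt):
    new = rho[:]
    for i in range(1, len(rho)):
        advect = -elong * (rho[i] - rho[i - 1]) / dx   # downward elongation
        new[i] = rho[i] + dt * (advect + (branch - mort) * rho[i])
    new[0] = rho[0]  # fixed density at the root collar (boundary condition)
    return new

nx, dx, dt = 200, 0.05, 0.01
rho = [0.0] * nx
rho[0] = 1.0  # meristems supplied at the collar
for _ in range(500):
    rho = step(rho, elong=1.0, branch=0.1, mort=0.02, dx=dx, dt=dt)

# After t = 5, the density front has advanced roughly elong * t = 5 depth
# units (near grid index 100): a travelling-wave-like advance of meristems,
# qualitatively matching the patterns the abstract describes.
front = max(i for i, v in enumerate(rho) if v > 1e-3)
```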
Abstract:
The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution are not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change are mainly through changes in the statistics of regional weather variations, the scientific and computational requirements to predict its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenges of developing the capabilities to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities with computer capability at each facility of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have a sufficient scientific workforce to develop and maintain the software and data analysis infrastructure.
Such facilities will make it possible to determine what resolution, both horizontal and vertical, is necessary in atmospheric and ocean models for more confident predictions at the regional and local level. Current constraints on computing power have severely restricted such an investigation, which is now badly needed. These facilities will also provide the world's scientists with the computational laboratories for fundamental research on weather-climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure including hardware, software, and data analysis support, and the scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it. It will ultimately enable the climate community to provide society with climate predictions that are based on our best knowledge of science and the most advanced technology.
Abstract:
Design summer years representing near-extreme hot summers have been used in the United Kingdom for the evaluation of thermal comfort and overheating risk. The years have been selected from measured weather data essentially representative of an assumed stationary climate. Recent developments have made available 'morphed' equivalents of these years by shifting and stretching the measured variables using change factors produced by the UKCIP02 climate projections. The release of the latest, probabilistic, climate projections of UKCP09, together with the availability of a weather generator that can produce plausible daily or hourly sequences of weather variables, has opened up the opportunity for generating new design summer years which can be used in risk-based decision-making. There are many possible methods for the production of design summer years from UKCP09 output: in this article, the original concept of the design summer year is largely retained, but a number of alternative methodologies for generating the years are explored. An alternative, more robust measure of warmth (weighted cooling degree hours) is also employed. It is demonstrated that the UKCP09 weather generator is capable of producing years for the baseline period which are comparable with those in current use. Four methodologies for the generation of future years are described, and their output related to the future (deterministic) years that are currently available. It is concluded that, in general, years produced from the UKCP09 projections are warmer than those generated previously. Practical applications: The methodologies described in this article will enable designers who have access to the output of the UKCP09 weather generator (WG) to generate design summer year hourly files tailored to their needs. The files produced will differ according to the methodology selected, in addition to location, emissions scenario and timeslice.
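The warmth measure used to rank candidate years can be sketched briefly. The abstract does not give the exact weighting, so the quadratic exceedance below, the base temperature of 22 °C, and the toy hourly series are all assumptions for illustration only:

```python
# Sketch of a warmth metric in the spirit of weighted cooling degree hours
# (WCDH): hours above a base temperature contribute their squared exceedance,
# so very hot hours count disproportionately. The quadratic weighting and the
# base temperature are assumed, not taken from the article.

def weighted_cooling_degree_hours(hourly_temps, base=22.0):
    """Sum of squared exceedances (degC^2 . h) above `base` (assumed form)."""
    return sum((t - base) ** 2 for t in hourly_temps if t > base)

# Rank candidate years by WCDH, then pick a near-extreme (e.g. third-warmest)
# year as the design summer year:
years = {
    "year_a": [21.0, 23.0, 25.0, 24.0],   # toy hourly series, illustrative only
    "year_b": [20.0, 21.5, 22.5, 21.0],
    "year_c": [22.0, 26.0, 27.0, 25.0],
}
ranked = sorted(years, key=lambda y: weighted_cooling_degree_hours(years[y]))
```

In practice the series would be a full summer of hourly dry-bulb temperatures from the weather generator, and the ranking would run over many sampled years rather than three.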