47 results for HIGH-HARMONIC-GENERATION
Abstract:
A prerequisite for the enrichment of antibodies screened from phage display libraries is their stable expression on a phage during multiple selection rounds. Thus, if stringent panning procedures are employed, selection is simultaneously driven by antigen affinity, stability and solubility. To take advantage of robust pre-selected scaffolds of such molecules, we grafted single-chain Fv (scFv) antibodies, previously isolated from a human phage display library after multiple rounds of in vitro panning on tumor cells, with the specificity of the clinically established murine monoclonal anti-CD22 antibody RFB4. We show that a panel of grafted scFvs retained the specificity of the murine monoclonal antibody, bound to the target antigen with high affinity (6.4-9.6 nM), and exhibited exceptional biophysical stability with retention of 89-93% of the initial binding activity after 6 days of incubation in human serum at 37 °C. Selection of stable human scaffolds with high sequence identity to both the human germline and the rodent frameworks required only a small number of murine residues to be retained within the human frameworks in order to maintain the structural integrity of the antigen binding site. We expect this approach may be applicable for the rapid generation of highly stable humanized antibodies with low immunogenic potential.
Abstract:
This study was carried out to examine the effect of inulin (IN), fructooligosaccharide (FOS), polydextrose (POL) and isomaltooligosaccharides (ISO), alone and in combination, on gas production, gas composition and prebiotic effects. Static batch culture fermentation was performed with faecal samples from three healthy volunteers to study the volume and composition of gas generated and changes in bacterial populations. Four carbohydrates, alone or mixed with one another (50:50), were examined. A prebiotic index (PI) was calculated and used to compare the prebiotic effect. The high amount of gas produced by IN was reduced by mixing it with FOS. No reduction in gas generation was observed when POL and ISO were mixed with other substrates. The mixture of IN and FOS was effective in reducing the amount of gas produced while augmenting or maintaining the potential to support the growth of bifidobacteria in faecal batch culture, as the highest PI was achieved with FOS alone and with a mixture of FOS and IN. A high volume of gas was generated in the presence of POL and ISO, and these substrates had a lower prebiotic effect. The results of this study imply that a mixture of prebiotics could prove effective in reducing the amount of gas generated by the gut microflora. (c) 2007 Elsevier Ltd. All rights reserved.
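The abstract does not state which PI formula was used; a common definition (often attributed to Palframan et al., and assumed here purely for illustration) credits the fold-change of beneficial groups (bifidobacteria, lactobacilli), penalises bacteroides and clostridia, and normalises each by total bacterial growth. A minimal sketch with hypothetical counts:

```python
def prebiotic_index(counts_0, counts_t):
    """Prebiotic index, PI (an assumed definition; the abstract does not
    give the formula). Each term is a group's fold-change over the
    fermentation, normalised by the fold-change of total bacteria:

        PI = (Bif - Bac + Lac - Clos) / Total
    """
    fc = lambda group: counts_t[group] / counts_0[group]  # fold-change
    return (fc("bifidobacteria") - fc("bacteroides")
            + fc("lactobacilli") - fc("clostridia")) / fc("total")

# Hypothetical counts (cells/mL) at inoculation and after fermentation.
before = {"total": 1e8, "bifidobacteria": 1e7, "bacteroides": 1e7,
          "lactobacilli": 1e6, "clostridia": 1e6}
after = {"total": 2e8, "bifidobacteria": 4e7, "bacteroides": 1e7,
         "lactobacilli": 2e6, "clostridia": 1e6}
print(prebiotic_index(before, after))  # → 2.0
```

A PI above zero indicates that the substrate shifted growth towards the beneficial groups faster than the population as a whole grew, which is how substrates such as FOS and the IN/FOS mixture would be ranked above POL and ISO.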
Abstract:
Random number generation (RNG) is a functionally complex process that is highly controlled and therefore dependent on Baddeley's central executive. This study tests this framework by investigating whether its key predictions are compatible with empirical data. In Experiment 1, the effect of increasing task demands by increasing the rate of the paced generation was comprehensively examined. As expected, faster rates affected performance negatively because central resources were increasingly depleted. Next, the effects of participants' exposure were manipulated in Experiment 2 by providing increasing amounts of practice on the task. There was no improvement over 10 practice trials, suggesting that the high level of strategic control required by the task was constant and not amenable to any automatization gain with repeated exposure. Together, the results demonstrate that RNG performance is a highly controlled and demanding process sensitive to additional demands on central resources (Experiment 1) and is unaffected by repeated performance or practice (Experiment 2). These features render the easily administered RNG task an ideal and robust index of executive function that is highly suitable for repeated clinical use.
Abstract:
The atmospheric component of the United Kingdom’s new High-resolution Global Environmental Model (HiGEM) has been run with interactive aerosol schemes that include biomass burning and mineral dust. Dust emission, transport, and deposition are parameterized within the model using six particle size divisions, which are treated independently. The biomass is modeled in three nonindependent modes, and emissions are prescribed from an external dataset. The model is shown to produce realistic horizontal and vertical distributions of these aerosols for each season when compared with available satellite- and ground-based observations and with other models. Combined aerosol optical depths off the coast of North Africa exceed 0.5 both in boreal winter, when biomass is the main contributor, and also in summer, when the dust dominates. The model is capable of resolving smaller-scale features, such as dust storms emanating from the Bodélé and Saharan regions of North Africa and the wintertime Bodélé low-level jet. This is illustrated by February and July case studies, in which the diurnal cycles of model variables in relation to dust emission and transport are examined. The top-of-atmosphere annual mean radiative forcing of the dust is calculated and found to be globally quite small but locally very large, exceeding 20 W m⁻² over the Sahara, where inclusion of dust aerosol is shown to improve the model radiative balance. This work extends previous aerosol studies by combining complexity with increased global resolution and represents a step toward the next generation of models to investigate aerosol–climate interactions.
Abstract:
In recent years nonpolynomial finite element methods have received increasing attention for the efficient solution of wave problems. As with their close cousin the method of particular solutions, high efficiency comes from using solutions to the Helmholtz equation as basis functions. We present and analyze such a method for the scattering of two-dimensional scalar waves from a polygonal domain that achieves exponential convergence purely by increasing the number of basis functions in each element. Key ingredients are the use of basis functions that capture the singularities at corners and the representation of the scattered field towards infinity by a combination of fundamental solutions. The solution is obtained by minimizing a least-squares functional, which we discretize in such a way that a matrix least-squares problem is obtained. We give computable exponential bounds on the rate of convergence of the least-squares functional that are in very good agreement with the observed numerical convergence. Challenging numerical examples, including a nonconvex polygon with several corner singularities, and a cavity domain, are solved to around 10 digits of accuracy with a few seconds of CPU time. The examples are implemented concisely with MPSpack, a MATLAB toolbox for wave computations with nonpolynomial basis functions, developed by the authors. A code example is included.
Abstract:
We consider scattering of a time harmonic incident plane wave by a convex polygon with piecewise constant impedance boundary conditions. Standard finite or boundary element methods require the number of degrees of freedom to grow at least linearly with respect to the frequency of the incident wave in order to maintain accuracy. Extending earlier work by Chandler-Wilde and Langdon for the sound soft problem, we propose a novel Galerkin boundary element method, with the approximation space consisting of the products of plane waves with piecewise polynomials supported on a graded mesh with smaller elements closer to the corners of the polygon. Theoretical analysis and numerical results suggest that the number of degrees of freedom required to achieve a prescribed level of accuracy grows only logarithmically with respect to the frequency of the incident wave.
Abstract:
We consider the scattering of a time-harmonic acoustic incident plane wave by a sound soft convex curvilinear polygon with Lipschitz boundary. For standard boundary or finite element methods, with a piecewise polynomial approximation space, the number of degrees of freedom required to achieve a prescribed level of accuracy grows at least linearly with respect to the frequency of the incident wave. Here we propose a novel Galerkin boundary element method with a hybrid approximation space, consisting of the products of plane wave basis functions with piecewise polynomials supported on several overlapping meshes; a uniform mesh on illuminated sides, and graded meshes refined towards the corners of the polygon on illuminated and shadow sides. Numerical experiments suggest that the number of degrees of freedom required to achieve a prescribed level of accuracy need only grow logarithmically as the frequency of the incident wave increases.
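The graded meshes central to both of the Galerkin methods above can be illustrated with a simple polynomial grading towards a corner; the exponent and element count below are illustrative choices, not the parameters used in the papers (where the grading is tied to the degree of the piecewise polynomials and the corner singularity):

```python
import numpy as np

def corner_graded_mesh(n, q=3, length=1.0):
    """Mesh on [0, length] graded towards a corner at 0.

    Nodes at length * (j/n)**q, so elements shrink polynomially as they
    approach the corner, concentrating resolution where the solution is
    singular. q and n here are illustrative, not the papers' values.
    """
    j = np.arange(n + 1)
    return length * (j / n) ** q

mesh = corner_graded_mesh(8)
print(np.diff(mesh))  # element widths grow away from the corner
```

In the hybrid spaces described above, piecewise polynomials on such meshes are multiplied by plane-wave factors carrying the known oscillation, which is what keeps the required number of degrees of freedom growing only logarithmically with frequency.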
Abstract:
To ensure minimum loss of system security and revenue it is essential that faults on underground cable systems be located and repaired rapidly. Currently in the UK, the impulse current method is used to prelocate faults, prior to using acoustic methods to pinpoint the fault location. The impulse current method is heavily dependent on the engineer's knowledge and experience in recognising/interpreting the transient waveforms produced by the fault. The development of a prototype real-time expert system aid for the prelocation of cable faults is described. Results from the prototype demonstrate the feasibility and benefits of the expert system as an aid for the diagnosis and location of faults on underground cable systems.
Abstract:
This paper provides an overview of the reduction targets that Ireland has set in the context of decarbonising its electricity generation through the use of renewables. The main challenges associated with integrating high levels (>20% of installed capacity) of non-dispatchable renewable generation are identified. The rising complexity of the challenge as renewable penetration levels increase is highlighted. A list of relevant research questions is then proposed, and an overview is given of the previous work that has gone into answering some of them. In particular, studies into the Irish energy market are identified, the current knowledge gap is described, and areas of necessary future research are suggested.
Abstract:
PV only generates electricity during daylight hours and primarily generates over summer. In the UK, the carbon intensity of grid electricity is higher during the daytime and over winter. This work investigates whether the grid electricity displaced by PV is high or low carbon compared with the annual mean carbon intensity, using carbon factors at higher temporal resolutions (half-hourly and daily). UK policy for carbon reporting requires savings to be calculated using the annual mean carbon intensity of grid electricity; this work offers an insight into whether that approach is appropriate. Using half-hourly data on the generating plant supplying the grid from November 2008 to May 2010, carbon factors for grid electricity at half-hourly and daily resolution have been derived using technology-specific generation emission factors. Applying these factors to generation data from PV systems installed on schools makes it possible to assess how the carbon savings from displacing grid electricity with PV generation vary with the time resolution of the carbon factors. The data cover a period of 363 to 370 days and so cannot account for inter-year variations in the relationship between PV generation and the carbon intensity of the electricity grid. This analysis suggests that PV displaces more carbon-intensive electricity when assessed with half-hourly carbon factors than with daily factors, but less than with annual ones. A similar methodology could provide useful insights for other variable renewable and demand-side technologies, and in other countries where PV performance and grid behavior differ.
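The core calculation can be sketched with entirely synthetic numbers (the profiles below are invented stand-ins, not the study's data): weight each half-hour's PV generation by that period's carbon factor, and compare with the savings implied by a flat annual mean factor.

```python
import numpy as np

t = np.arange(17_520)        # one year of half-hourly periods
hour = (t % 48) / 2.0        # hour of day, 0 .. 23.5

# Synthetic PV output (kWh per half-hour): daylight hours only.
pv_gen = np.clip(np.sin((hour - 6) / 12 * np.pi), 0.0, None)

# Synthetic grid carbon intensity (kgCO2/kWh): higher in the daytime,
# mimicking the UK pattern described in the abstract.
carbon = 0.45 + 0.05 * np.sin((hour - 6) / 12 * np.pi)

# Savings assessed period by period vs. with the flat annual mean.
savings_half_hourly = np.sum(pv_gen * carbon)
savings_annual_mean = np.sum(pv_gen) * carbon.mean()

print(savings_half_hourly > savings_annual_mean)
```

Because the synthetic PV output coincides with above-average carbon intensity, the half-hourly assessment credits PV with more displaced carbon than the annual mean does, which is the qualitative direction the abstract reports for half-hourly versus annual factors.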
Abstract:
Wind generation’s contribution to meeting extreme peaks in electricity demand is a key concern for the integration of wind power. In Great Britain (GB), robustly assessing this contribution directly from power system data (i.e. metered wind-supply and electricity demand) is difficult as extreme peaks occur infrequently (by definition) and measurement records are both short and inhomogeneous. Atmospheric circulation-typing combined with meteorological reanalysis data is proposed as a means to address some of these difficulties, motivated by a case study of the extreme peak demand events in January 2010. A preliminary investigation of the physical and statistical properties of these circulation types suggests that they can be used to identify the conditions that are most likely to be associated with extreme peak demand events. Three broad cases are highlighted as requiring further investigation. The high-over-Britain anticyclone is found to be generally associated with very low winds but relatively moderate temperatures (and therefore moderate peak demands, somewhat in contrast to the classic low-wind cold snap that is sometimes apparent in the literature). In contrast, both longitudinally extended blocking over Scotland/Scandinavia and latitudinally extended troughs over western Europe appear to be more closely linked to the very cold GB temperatures (usually associated with extreme peak demands). In both of these latter situations, wind resource averaged across GB appears to be more moderate.
Abstract:
Warfarin resistance was first discovered among Norway rat (Rattus norvegicus) populations in Scotland in 1958 and further reports of resistance, both in this species and in others, soon followed from other parts of Europe and the United States. Researchers quickly defined the practical impact of these resistance phenomena and developed robust methods by which to monitor their spread. These tasks were relatively simple because of the high degree of immunity to warfarin conferred by the resistance genes. Later, the second generation anticoagulants were introduced to control rodents resistant to the warfarin-like compounds, but resistance to difenacoum, bromadiolone and brodifacoum is now reported in certain localities in Europe and elsewhere. However, the adoption of test methods designed initially for use with the first generation compounds to identify resistance to compounds of the second generation has led to some practical difficulties in conducting tests and in establishing meaningful resistance baselines. In particular, the results of certain test methodologies are difficult to interpret in terms of the likely impact on practical control treatments of the resistance phenomena they seek to identify. This paper defines rodenticide resistance in the context of both first and second generation anticoagulants. It examines the advantages and disadvantages of existing laboratory and field methods used in the detection of rodent populations resistant to anticoagulants and proposes some improvements in the application of these techniques and in the interpretation of their results.
Abstract:
This article combines institutional and resource-based arguments to show that the institutional distance between the home and the host country, and the headquarters’ financial performance, have a relevant impact on the environmental standardization decision in multinational companies. Using a sample of 135 multinational companies in three different industries with headquarters and subsidiaries based in the USA, Canada, Mexico, France, and Spain, we find that a high environmental institutional distance between headquarters’ and subsidiaries’ countries deters the standardization of environmental practices. On the other hand, high-profit headquarters are willing to standardize their environmental practices, rather than taking advantage of countries with lax environmental protection to undertake more pollution-intensive activities. Finally, we show that headquarters’ financial performance also exerts a moderating effect on the relationship between environmental institutional distance between countries and environmental standardization within the multinational company.
Abstract:
In this article we describe recent progress on the design, analysis and implementation of hybrid numerical-asymptotic boundary integral methods for boundary value problems for the Helmholtz equation that model time harmonic acoustic wave scattering in domains exterior to impenetrable obstacles. These hybrid methods combine conventional piecewise polynomial approximations with high-frequency asymptotics to build basis functions suitable for representing the oscillatory solutions. They have the potential to solve scattering problems accurately in a computation time that is (almost) independent of frequency and this has been realized for many model problems. The design and analysis of this class of methods requires new results on the analysis and numerical analysis of highly oscillatory boundary integral operators and on the high-frequency asymptotics of scattering problems. The implementation requires the development of appropriate quadrature rules for highly oscillatory integrals. This article contains a historical account of the development of this currently very active field, a detailed account of recent progress and, in addition, a number of original research results on the design, analysis and implementation of these methods.
Abstract:
Various studies investigating the future impacts of integrating high levels of renewable energy make use of historical meteorological (met) station data to produce estimates of future generation. Hourly means of 10 m horizontal wind are extrapolated to a standard turbine hub height using the wind-profile power law or log law and used to simulate the hypothetical power output of a turbine at that location; repeating this procedure at many viable locations can produce a picture of future electricity generation. However, the estimate of hub-height wind speed depends on the choice of the wind shear exponent α or the roughness length z0, and requires a number of simplifying assumptions. This paper investigates the sensitivity of the estimated generation output to these choices using a case study of a met station in West Freugh, Scotland. The results show that the wind shear exponent is a particularly sensitive parameter, whose choice can lead to significant variation in estimated hub-height wind speed and hence in the estimated future generation potential of a region.
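The power-law extrapolation step, and its sensitivity to the shear exponent α, can be sketched as follows (the 10 m wind speed, 80 m hub height, and the three α values are illustrative, not the paper's case-study numbers):

```python
def hub_height_wind(u10, hub_height, alpha=1/7, ref_height=10.0):
    """Power-law extrapolation of wind speed to hub height:

        u_hub = u10 * (z_hub / z_ref) ** alpha

    alpha is the site-dependent wind shear exponent; 1/7 is a common
    textbook default over open terrain, used here only as an example.
    """
    return u10 * (hub_height / ref_height) ** alpha

u10 = 6.0  # m/s measured at 10 m (illustrative)
for alpha in (0.10, 1/7, 0.20):
    u_hub = hub_height_wind(u10, 80.0, alpha)
    # Turbine power scales roughly with the cube of wind speed, so small
    # changes in alpha are amplified in the estimated generation.
    print(f"alpha={alpha:.3f}: u_hub={u_hub:.2f} m/s, "
          f"relative power={(u_hub / u10) ** 3:.2f}x")
```

The cubic dependence of power on speed is what makes the choice of α so consequential: the spread in estimated hub-height speed across plausible exponents translates into a much larger spread in estimated energy yield.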