949 results for Reduced-basis method
Abstract:
The possibilities and need for adaptation and mitigation depend on uncertain future developments with respect to socio-economic factors and the climate system. Scenarios are used to explore the impacts of different strategies under uncertainty. In this chapter, some scenarios used in the ADAM project for this purpose are presented. One scenario explores developments with no mitigation, and thus with a high temperature increase and high reliance on adaptation (leading to a 4 °C increase by 2100 compared to pre-industrial levels). A second scenario explores an ambitious mitigation strategy (leading to a 2 °C increase by 2100 compared to pre-industrial levels). In the latter scenario, stringent mitigation strategies effectively reduce the risks of climate change, but, given uncertainties in the climate system, a temperature increase of 3 °C or more cannot be excluded. The analysis shows that, in many cases, adaptation and mitigation are not trade-offs but supplements. For example, the number of people exposed to increased water resource stress due to climate change can be substantially reduced in the mitigation scenario, but even then adaptation will be required for the remaining large numbers of people exposed to increased stress. Another example is sea level rise, for which adaptation is more cost-effective than mitigation, but mitigation can help reduce damages and the cost of adaptation. For agriculture, finally, only the scenario based on a combination of adaptation and mitigation is able to avoid serious climate change impacts.
Abstract:
Background and purpose: Molecular mechanisms underlying the links between dietary intake of flavonoids and reduced cardiovascular disease risk are only partially understood. Key events in the pathogenesis of cardiovascular disease, particularly thrombosis, are inhibited by these polyphenolic compounds via mechanisms such as inhibition of platelet activation and associated signal transduction, attenuation of generation of reactive oxygen species, enhancement of nitric oxide production and binding to thromboxane A2 receptors. In vivo, effects of flavonoids are mediated by their metabolites, but the effects and modes of action of these compounds are not well-characterized. A good understanding of flavonoid structure–activity relationships with regard to platelet function is also lacking. Experimental approach: Inhibitory potencies of structurally distinct flavonoids (quercetin, apigenin and catechin) and plasma metabolites (tamarixetin, quercetin-3′-sulphate and quercetin-3-glucuronide) for collagen-stimulated platelet aggregation and 5-hydroxytryptamine secretion were measured in human platelets. Tyrosine phosphorylation of total protein, Syk and PLCγ2 (immunoprecipitation and Western blot analyses), and Fyn kinase activity were also measured in platelets. Internalization of flavonoids and metabolites in a megakaryocytic cell line (MEG-01 cells) was studied by fluorescence confocal microscopy. Key results: The inhibitory mechanisms of these compounds included blocking Fyn kinase activity and the tyrosine phosphorylation of Syk and PLCγ2 following internalization. Principal functional groups attributed to potent inhibition were a planar, C-4 carbonyl substituted and C-3 hydroxylated C ring in addition to a B ring catechol moiety. Conclusions and implications: The structure–activity relationship for flavonoids on platelet function presented here may be exploited to design selective inhibitors of cell signalling.
Abstract:
In recent years nonpolynomial finite element methods have received increasing attention for the efficient solution of wave problems. As with their close cousin the method of particular solutions, high efficiency comes from using solutions to the Helmholtz equation as basis functions. We present and analyze such a method for the scattering of two-dimensional scalar waves from a polygonal domain that achieves exponential convergence purely by increasing the number of basis functions in each element. Key ingredients are the use of basis functions that capture the singularities at corners and the representation of the scattered field towards infinity by a combination of fundamental solutions. The solution is obtained by minimizing a least-squares functional, which we discretize in such a way that a matrix least-squares problem is obtained. We give computable exponential bounds on the rate of convergence of the least-squares functional that are in very good agreement with the observed numerical convergence. Challenging numerical examples, including a nonconvex polygon with several corner singularities, and a cavity domain, are solved to around 10 digits of accuracy with a few seconds of CPU time. The examples are implemented concisely with MPSpack, a MATLAB toolbox for wave computations with nonpolynomial basis functions, developed by the authors. A code example is included.
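The core computational idea, matching the boundary condition in a least-squares sense with Helmholtz basis functions, can be illustrated outside MPSpack. Below is a minimal Python sketch of the method of fundamental solutions for a sound-soft disc; the wavenumber, point counts and circular geometry are illustrative assumptions, not the paper's polygonal examples.

```python
# Minimal sketch (not the authors' MPSpack code): the method of fundamental
# solutions for sound-soft scattering of a plane wave by the unit disc,
# written as a matrix least-squares problem.  Wavenumber, point counts and
# the circular geometry are illustrative assumptions.
import numpy as np
from scipy.special import hankel1

k = 10.0                                    # wavenumber
n_col, n_src = 200, 80                      # collocation points / source points
theta = np.linspace(0, 2*np.pi, n_col, endpoint=False)
bdry = np.exp(1j*theta)                     # boundary points as complex numbers
src = 0.7*np.exp(1j*np.linspace(0, 2*np.pi, n_src, endpoint=False))  # sources inside

# Fundamental solution of the 2-D Helmholtz equation: (i/4) H0^(1)(k|x - y|)
A = 0.25j*hankel1(0, k*np.abs(bdry[:, None] - src[None, :]))

u_inc = np.exp(1j*k*bdry.real)              # plane wave travelling in the +x direction
rhs = -u_inc                                # sound-soft condition: u_scat = -u_inc on the boundary

coeffs, *_ = np.linalg.lstsq(A, rhs, rcond=None)

# Boundary residual as a proxy for the least-squares functional.
resid = np.linalg.norm(A @ coeffs - rhs)/np.linalg.norm(rhs)
print(f"relative boundary residual: {resid:.2e}")
```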
Abstract:
Radial basis functions can be combined into a network structure that has several advantages over conventional neural network solutions. However, to operate effectively, the number and positions of the basis function centres must be carefully selected. Although no rigorous algorithm exists for this purpose, several heuristic methods have been suggested. In this paper a new method is proposed in which radial basis function centres are selected by the mean-tracking clustering algorithm. The mean-tracking algorithm is compared with k-means clustering, and it is shown that it achieves significantly better results in terms of radial basis function performance. As well as being computationally simpler, the mean-tracking algorithm in general selects better centre positions, thus providing the radial basis functions with better modelling accuracy.
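For context, the conventional pipeline that mean-tracking is benchmarked against can be sketched as follows: k-means selects the centres, and the output-layer weights are then fitted by linear least squares. The toy data set, Gaussian basis width and centre count below are assumptions; the mean-tracking algorithm itself is not reproduced.

```python
# Sketch of the conventional baseline: an RBF network whose centres come from
# k-means clustering, with output weights fitted by linear least squares.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(400, 1))
y = np.sin(X[:, 0]) + 0.05*rng.standard_normal(400)      # noisy target function

n_centres, width = 12, 0.8                                # assumed centre count and basis width
centres = KMeans(n_clusters=n_centres, n_init=10, random_state=0).fit(X).cluster_centers_

def design_matrix(X, centres, width):
    """Gaussian radial basis activations for every sample/centre pair."""
    d2 = ((X[:, None, :] - centres[None, :, :])**2).sum(axis=2)
    return np.exp(-d2/(2*width**2))

Phi = design_matrix(X, centres, width)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)               # output-layer weights

rmse = np.sqrt(np.mean((Phi @ w - y)**2))
print("training RMSE:", rmse)
```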
Abstract:
A sampling oscilloscope is one of the main units in an automatic pulse measurement system (APMS). Time jitter in waveform samplers is an important error source that affects the precision of data acquisition. In this paper, this kind of error is greatly reduced by using a deconvolution method. First, the probability density function (PDF) of the time jitter distribution is determined by a statistical approach; then this PDF is used as the convolution kernel to deconvolve the acquired, averaged waveform data. The result is waveform data from which the effect of time jitter has been removed, so the measurement precision of the APMS is greatly improved. In addition, computer simulations are presented that demonstrate the success of the method.
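A rough Python sketch of the underlying idea follows: the averaged waveform is modelled as the ideal waveform convolved with the jitter PDF, and a regularised frequency-domain deconvolution recovers an estimate of the ideal waveform. The pulse shape, jitter spread and regularisation constant are assumptions, not the paper's values.

```python
# Rough sketch of the jitter-removal idea (illustrative values, not the paper's data).
import numpy as np

dt = 1e-12                                   # 1 ps sample spacing (assumed)
t = np.arange(-2e-9, 2e-9, dt)
ideal = np.exp(-0.5*(t/100e-12)**2)          # ideal test pulse (assumed shape)

sigma_j = 30e-12                             # assumed RMS time jitter
pdf = np.exp(-0.5*(t/sigma_j)**2)
pdf /= pdf.sum()                             # jitter PDF used as the convolution kernel

H = np.fft.fft(np.fft.ifftshift(pdf))        # kernel spectrum (zero-phase)
blurred = np.real(np.fft.ifft(np.fft.fft(ideal)*H))
blurred += 1e-3*np.random.default_rng(0).standard_normal(t.size)  # residual noise after averaging

eps = 1e-3                                   # regularisation constant (assumed)
restored = np.real(np.fft.ifft(np.fft.fft(blurred)*np.conj(H)/(np.abs(H)**2 + eps)))

print("RMS error, jitter-blurred:", np.sqrt(np.mean((blurred - ideal)**2)))
print("RMS error, deconvolved  :", np.sqrt(np.mean((restored - ideal)**2)))
```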
Abstract:
A predominance of small, dense low-density lipoprotein (LDL) is a major component of an atherogenic lipoprotein phenotype, and a common, but modifiable, source of increased risk for coronary heart disease in the free-living population. While much of the atherogenicity of small, dense LDL is known to arise from its structural properties, the extent to which an increase in the number of small, dense LDL particles (hyper-apoprotein B) contributes to this risk of coronary heart disease is currently unknown. This study reports a method for the recruitment of free-living individuals with an atherogenic lipoprotein phenotype for a fish-oil intervention trial, and critically evaluates the relationship between LDL particle number and the predominance of small, dense LDL. In this group, volunteers were selected through local general practices on the basis of a moderately raised plasma triacylglycerol (triglyceride) level (>1.5 mmol/l) and a low concentration of high-density-lipoprotein cholesterol (<1.1 mmol/l). The screening of LDL subclasses revealed a predominance of small, dense LDL (LDL subclass pattern B) in 62% of the cohort. As expected, subjects with LDL subclass pattern B were characterized by higher plasma triacylglycerol and lower high-density lipoprotein cholesterol (<1.1 mmol/l) levels and, less predictably, by lower LDL cholesterol and apoprotein B levels (P<0.05; LDL subclass A compared with subclass B). While hyper-apoprotein B was detected in only five subjects, the relative percentage of small, dense LDL-III in subjects with subclass B showed an inverse relationship with LDL apoprotein B (r=-0.57; P<0.001), identifying a subset of individuals with plasma triacylglycerol above 2.5 mmol/l and a low concentration of LDL almost exclusively in a small and dense form. These findings indicate that a predominance of small, dense LDL and hyper-apoprotein B do not always co-exist in free-living groups. Moreover, if coronary risk increases with increasing LDL particle number, these results imply that the risk arising from a predominance of small, dense LDL may actually be reduced in certain cases when plasma triacylglycerol exceeds 2.5 mmol/l.
Abstract:
The background error covariance matrix, B, is often used in variational data assimilation for numerical weather prediction as a static and hence poor approximation to the fully dynamic forecast error covariance matrix, Pf. In this paper the concept of an Ensemble Reduced Rank Kalman Filter (EnRRKF) is outlined. In the EnRRKF the forecast error statistics in a subspace defined by an ensemble of states forecast by the dynamic model are found. These statistics are merged in a formal way with the static statistics, which apply in the remainder of the space. The combined statistics may then be used in a variational data assimilation setting. It is hoped that the nonlinear error growth of small-scale weather systems will be accurately captured by the EnRRKF, to produce accurate analyses and ultimately improved forecasts of extreme events.
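One plausible reading of this subspace merging can be sketched in a few lines: ensemble-derived covariances act inside the subspace spanned by the ensemble perturbations, while the static matrix B is retained in the orthogonal complement. The dimensions, matrices and exact blending below are toy assumptions, not the paper's formulation.

```python
# Toy sketch of combining ensemble statistics in a subspace with static B outside it.
import numpy as np

rng = np.random.default_rng(1)
n, m = 50, 8                                       # state dimension, ensemble size (assumed)

B = np.diag(1.0 + 0.1*rng.random(n))               # toy static background covariance
ens = rng.standard_normal((n, m))                  # toy ensemble of forecast states
anom = (ens - ens.mean(axis=1, keepdims=True))/np.sqrt(m - 1)

P_ens = anom @ anom.T                              # low-rank ensemble covariance
U, s, _ = np.linalg.svd(anom, full_matrices=False)
E = U[:, s > 1e-10]                                # orthonormal basis of the ensemble subspace
Pi = E @ E.T                                       # projector onto that subspace

I = np.eye(n)
P_combined = Pi @ P_ens @ Pi + (I - Pi) @ B @ (I - Pi)
print("rank of the ensemble part:", np.linalg.matrix_rank(Pi @ P_ens @ Pi))
```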
Abstract:
Scenarios are used to explore the consequences of different adaptation and mitigation strategies under uncertainty. In this paper, two scenarios are used to explore developments with (1) no mitigation leading to an increase of global mean temperature of 4 °C by 2100 and (2) an ambitious mitigation strategy leading to 2 °C increase by 2100. For the second scenario, uncertainties in the climate system imply that a global mean temperature increase of 3 °C or more cannot be ruled out. Our analysis shows that, in many cases, adaptation and mitigation are not trade-offs but supplements. For example, the number of people exposed to increased water resource stress due to climate change can be substantially reduced in the mitigation scenario, but adaptation will still be required for the remaining large numbers of people exposed to increased stress. Another example is sea level rise, for which, from a global and purely monetary perspective, adaptation (up to 2100) seems more effective than mitigation. From the perspective of poorer and small island countries, however, stringent mitigation is necessary to keep risks at manageable levels. For agriculture, only a scenario based on a combination of adaptation and mitigation is able to avoid serious climate change impacts.
Abstract:
The Routh-stability method is employed to reduce the order of discrete-time system transfer functions. It is shown that the Routh approximant is well suited to reduce both the denominator and the numerator polynomials, although alternative methods, such as Padé-Markov approximation, are also used to fit the model numerator coefficients.
Abstract:
Previous studies have reported that cheese curd syneresis kinetics can be monitored by dilution of chemical tracers, such as Blue Dextran, in whey. The objective of this study was to evaluate an improved tracer method to monitor whey volumes expelled over time during syneresis. Two experiments with different ranges of milk fat (0-5% and 2.3-3.5%) were carried out in an 11 L double-O laboratory-scale cheese vat. Tracer was added to the curd-whey mixture during the cutting phase of cheese making and samples were taken at 10 min intervals up to 75 min after cutting. The volume of whey expelled was measured gravimetrically and the dilution of tracer in the whey was measured by absorbance at 620 nm. The volumes of whey expelled were significantly reduced at higher milk fat levels. Whey yield was predicted with a standard error of prediction (SEP) ranging from 3.2 to 6.3 g whey/100 mL of milk and a coefficient of variation (CV) ranging from 2.03 to 2.7% at different milk fat levels.
Abstract:
The coarse spacing of automatic rain gauges complicates near-real-time spatial analyses of precipitation. We test the possibility of improving such analyses by considering, in addition to the in situ measurements, the spatial covariance structure inferred from past observations with a denser network. To this end, a statistical reconstruction technique, reduced space optimal interpolation (RSOI), is applied over Switzerland, a region of complex topography. RSOI consists of two main parts. First, principal component analysis (PCA) is applied to obtain a reduced space representation of gridded high-resolution precipitation fields available for a multiyear calibration period in the past. Second, sparse real-time rain gauge observations are used to estimate the principal component scores and to reconstruct the precipitation field. In this way, climatological information at higher resolution than the near-real-time measurements is incorporated into the spatial analysis. PCA is found to efficiently reduce the dimensionality of the calibration fields, and RSOI is successful despite the difficulties associated with the statistical distribution of daily precipitation (skewness, dry days). Examples and a systematic evaluation show substantial added value over a simple interpolation technique that uses near-real-time observations only. The benefit is particularly strong for larger-scale precipitation and prominent topographic effects. Small-scale precipitation features are reconstructed at a skill comparable to that of the simple technique. Stratifying the reconstruction by weather type yields little added skill. Apart from application in near real time, RSOI may also be valuable for enhancing instrumental precipitation analyses for the historic past, when direct observations were sparse.
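The two-part procedure can be sketched in a few lines of Python: PCA of an archive of gridded fields, then estimation of the leading principal-component scores from sparse station values and reconstruction of the full field. Synthetic data and plain least squares are used below; the paper's optimal interpolation additionally weights observations by their error statistics.

```python
# Minimal sketch of the two RSOI steps with synthetic data (illustrative dimensions).
import numpy as np

rng = np.random.default_rng(2)
n_grid, n_days, n_eof, n_stations = 500, 1000, 15, 40

# Step 1: PCA of a calibration archive of gridded fields (days x grid points).
gen = rng.standard_normal((n_eof, n_grid))                    # synthetic spatial patterns
archive = (rng.standard_normal((n_days, n_eof)) @ gen
           + 0.1*rng.standard_normal((n_days, n_grid)))
mean = archive.mean(axis=0)
_, _, Vt = np.linalg.svd(archive - mean, full_matrices=False)
eofs = Vt[:n_eof]                                             # leading EOFs (reduced space)

# Step 2: estimate the scores from sparse "station" values and reconstruct.
truth = mean + rng.standard_normal(n_eof) @ gen               # a new day's full field
stations = rng.choice(n_grid, n_stations, replace=False)      # sparse real-time gauges
H = eofs[:, stations].T                                       # maps scores to station anomalies
scores, *_ = np.linalg.lstsq(H, truth[stations] - mean[stations], rcond=None)
recon = mean + scores @ eofs

print("reconstruction RMSE:", np.sqrt(np.mean((recon - truth)**2)))
print("climatology RMSE   :", np.sqrt(np.mean((mean - truth)**2)))
```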
Abstract:
This paper explores the possibility of combining moderate vacuum frying with post-frying high-vacuum application during the oil drainage stage, with the aim of reducing the oil content of potato chips. Potato slices were initially vacuum fried under two operating conditions (140 °C, 20 kPa and 162 °C, 50.67 kPa) until the moisture content reached 10 and 15 % (wet basis), prior to holding the samples in the headspace under a high vacuum (1.33 kPa). This two-stage process was found to significantly lower the amount of oil taken up by potato chips, by as much as 48 %, compared with drainage at the same pressure as the frying pressure. Reducing the pressure to 1.33 kPa lowered the water saturation temperature (to 11 °C), causing the product to continuously lose moisture during the course of drainage. The continuous release of water vapour prevented the occluded surface oil from penetrating into the product structure and released it from the surface of the product. When frying and drainage occurred at the same pressure, the temperature of the product fell below the water saturation temperature soon after it was lifted out of the oil, which resulted in oil being sucked into the product. Thus, lowering the pressure after frying to a value well below the frying pressure is a promising method to reduce oil uptake by the product.
Abstract:
The elaboration of curli fimbriae by Escherichia coli is associated with the development of a lacy colony morphology when grown on colonisation factor antigen agar at 25 °C. Avian colisepticaemia E. coli isolates screened for curliation by this culture technique showed lacy and smooth colonial morphologies, and the genetic basis of the non-curliated smooth colonial phenotype was analysed. Two smooth E. coli O78:K80 isolates possessed about 40 copies of the IS1 element within their respective genomes, of which one copy insertionally inactivated the csgB gene, the nucleator gene for curli fibril formation. One of these two isolates also possessed a defective rpoS gene, which is a known regulator of curli expression. In the day-old chick model, both smooth isolates were as invasive as a known virulent O78:K80 isolate, as determined by the extent of liver and spleen colonisation post oral inoculation, but were less persistent in terms of caecal colonisation. (C) 1999 Federation of European Microbiological Societies. Published by Elsevier Science B.V. All rights reserved.
Abstract:
This article presents a reinterpretation of James Harrington's writings. It takes issue with J. G. A. Pocock's reading, which treats him as importing into England a Machiavellian ‘language of political thought’. This reading is the basis of Pocock's stress on the republicanism of eighteenth-century opposition values. Harrington's writings were in fact a most implausible channel for such ideas. His outlook owed much to Stoicism. Unlike the Florentine, he admired the contemplative life; was sympathetic to commerce; and was relaxed about the threat of ‘corruption’ (a concept that he did not understand). These views can be associated with his apparent aims: the preservation of a national church with a salaried but politically impotent clergy; and the restoration of the royalist gentry to a leading role in English politics. Pocock's hypothesis is shown to be conditioned by his method; its weaknesses reflect some difficulties inherent in the notion of ‘languages of thought’.
Abstract:
In this paper we propose and analyze a hybrid $hp$ boundary element method for the solution of problems of high frequency acoustic scattering by sound-soft convex polygons, in which the approximation space is enriched with oscillatory basis functions which efficiently capture the high frequency asymptotics of the solution. We demonstrate, both theoretically and via numerical examples, exponential convergence with respect to the order of the polynomials, moreover providing rigorous error estimates for our approximations to the solution and to the far field pattern, in which the dependence on the frequency of all constants is explicit. Importantly, these estimates prove that, to achieve any desired accuracy in the computation of these quantities, it is sufficient to increase the number of degrees of freedom in proportion to the logarithm of the frequency as the frequency increases, in contrast to the at least linear growth required by conventional methods.