925 results for Shifted Legendre polynomials
Abstract:
The Stern Review (2006) on the economics of climate change has changed the ground on which arguments over climate change are fought. In making the economic case for mitigation over adaptation, Stern has sought to undermine one of the primary arguments against decisive early action - that the immediate costs outweigh the long-term benefits. While this argument is made at the global level, it is appropriate to ask what implications Stern's arguments might have at the level of the construction sector. The implications of the Stern Review for construction can be divided into three main questions. How would construction be different if Stern's economics were applied? How would construction companies act differently if Stern's ethics were adopted? How will the political response to the Stern Review change construction's policy and regulatory landscape? The impact of the Review has shifted the debate from the natural science community into the public policy domain and onto an economic rationale. There seems to have been a pronounced shift away from the debate over the science and towards the economics of mitigation versus adaptation. In this context, the academic debate over Stern's methods is irrelevant - it is his findings, authority and use of the hegemonic power of economic argument that carry the day. This is likely to be Stern's true legacy, and through this will come his greatest impact on construction.
Abstract:
The rheological properties of dough and gluten are important for end-use quality of flour but there is a lack of knowledge of the relationships between fundamental and empirical tests and how they relate to flour composition and gluten quality. Dough and gluten from six breadmaking wheat qualities were subjected to a range of rheological tests. Fundamental (small-deformation) rheological characterizations (dynamic oscillatory shear and creep recovery) were performed on gluten to avoid the nonlinear influence of the starch component, whereas large-deformation tests were conducted on both dough and gluten. A number of variables derived from the various curves were subjected to a principal component analysis (PCA) to obtain an overview of the relationships among them. The first component represented variability in protein quality, associated with elasticity and tenacity in large deformation (large positive loadings for resistance to extension and initial slope of dough and gluten extension curves recorded by the SMS/Kieffer dough and gluten extensibility rig, and the tenacity and strain hardening index of dough measured by the Dobraszczyk/Roberts dough inflation system), the elastic character of the hydrated gluten proteins (large positive loading for elastic modulus [G'], large negative loadings for tan delta and steady state compliance [J(e)(0)]), the presence of high molecular weight glutenin subunits (HMW-GS) 5+10 vs. 2+12, and a size distribution of glutenin polymers shifted toward the high-end range. The second principal component was associated with flour protein content. Certain rheological data were influenced by protein content in addition to protein quality (area under dough extension curves and dough inflation curves [W]). The approach made it possible to bridge the gap between fundamental rheological properties, empirical measurements of physical properties, protein composition, and size distribution. The interpretation of this study gave indications of the molecular basis for differences in breadmaking performance.
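As an illustration of the kind of multivariate overview described above, the following is a minimal PCA sketch with entirely hypothetical variable names and random data standing in for the rheological measurements; it is not the study's analysis, only the general pattern of standardizing the variables and inspecting the component loadings.

```python
# Minimal PCA sketch with hypothetical data standing in for the rheological
# variables described above; variable names are illustrative only.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
variables = ["Rmax", "extensibility", "G_prime", "tan_delta",
             "strain_hardening_index", "protein_content"]
X = rng.normal(size=(6, len(variables)))      # six hypothetical flour samples

X_std = StandardScaler().fit_transform(X)     # centre and scale each variable
pca = PCA(n_components=2).fit(X_std)

print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 2))
for name, pc1, pc2 in zip(variables, pca.components_[0], pca.components_[1]):
    # Large loadings on PC1 would correspond to "protein quality" variables,
    # large loadings on PC2 to protein-content-related variables.
    print(f"{name:>22}: PC1 {pc1:+.2f}  PC2 {pc2:+.2f}")
```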
Abstract:
A finite element numerical study has been carried out on the isothermal flow of power-law fluids in lid-driven cavities with axial throughflow. The effects of the tangential flow Reynolds number (Re_U), the axial flow Reynolds number (Re_W), the cavity aspect ratio and the shear-thinning behaviour of the fluids on the tangential and axial velocity distributions and the frictional pressure drop are studied. Where comparison is possible, very good agreement is found between the current numerical results and published asymptotic and numerical results. For shear-thinning materials in long thin cavities in the tangential-flow-dominated regime, the numerical results show that the frictional pressure drop lies between two extreme conditions, namely the results for duct flow and the analytical results from lubrication theory. For shear-thinning materials in a lid-driven cavity, the interaction between the tangential and axial flows is complex because the flow depends on both Reynolds numbers and on the ratio of the average axial velocity to the lid velocity. For both Newtonian and shear-thinning fluids, the axial velocity peak shifts and the frictional pressure drop increases with increasing tangential flow Reynolds number. The results are highly relevant to industrial devices such as screw extruders and scraped-surface heat exchangers.
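For reference, the standard power-law (Ostwald-de Waele) model that defines the fluids considered above relates shear stress to shear rate as shown below; an exponent n < 1 gives the shear-thinning behaviour discussed, and n = 1 recovers a Newtonian fluid.

```latex
\tau = K\,\dot{\gamma}^{\,n},
\qquad
\eta_{\mathrm{eff}} = \frac{\tau}{\dot{\gamma}} = K\,\dot{\gamma}^{\,n-1},
\qquad
\begin{cases}
n < 1 & \text{shear thinning (effective viscosity falls with shear rate)}\\
n = 1 & \text{Newtonian}\\
n > 1 & \text{shear thickening}
\end{cases}
```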
Abstract:
Optical density measurements were used to estimate the effect of heat treatments on the single-cell lag times of Listeria innocua, which were fitted to a shifted gamma distribution. The single-cell lag time was subdivided into a repair time (the shift of the distribution, assumed to be uniform for all cells) and an adjustment time (varying randomly from cell to cell). After heat treatments in which all of the cells recovered (sublethal), the repair time and the mean and variance of the single-cell adjustment time increased with the severity of the treatment. When the heat treatments resulted in a loss of viability (lethal), the repair time of the survivors increased with the decimal reduction of the cell numbers independently of the temperature, while the mean and variance of the single-cell adjustment times remained the same irrespective of the heat treatment. Based on these observations and on modeling of the effect of the time and temperature of the heat treatment, we propose that the severity of a heat treatment can be characterized by the repair time of the cells whether the heat treatment is lethal or not, an extension of the F value concept to sublethal heat treatments. In addition, the repair time could be interpreted as the extent or degree of injury in a multiple-hit lethality model. Another implication of these results is that the distribution of the time for cells to reach unacceptable numbers in food is not affected by the particular time-temperature combination that produces a given decimal reduction.
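A minimal sketch of fitting a shifted (three-parameter) gamma distribution to hypothetical single-cell lag times is given below; the location parameter plays the role of the repair time and the gamma-distributed remainder the adjustment time. The numbers are invented for illustration and this is not the authors' optical-density procedure.

```python
# Fit a shifted gamma distribution to hypothetical single-cell lag times.
# loc = repair time (common shift); shape/scale describe the adjustment time.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
true_repair, true_shape, true_scale = 2.0, 3.0, 0.8   # hypothetical values (h)
lag_times = true_repair + rng.gamma(true_shape, true_scale, size=500)

shape, loc, scale = stats.gamma.fit(lag_times)        # scipy returns (a, loc, scale)
print(f"repair time (shift)    ~ {loc:.2f} h")
print(f"mean adjustment time   ~ {shape * scale:.2f} h")
print(f"variance of adjustment ~ {shape * scale ** 2:.2f} h^2")
```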
Abstract:
The potential of clarification questions (CQs) to act as a form of corrective input for young children's grammatical errors was examined. Corrective responses were operationalized as those occasions when child speech shifted from erroneous to correct (E -> C) contingent on a clarification question. It was predicted that E -> C sequences would prevail over shifts in the opposite direction (C -> E), as can occur in the case of non-error-contingent CQs. This prediction was tested via a standard intervention paradigm, whereby every 60 s a sequence of two clarification requests (either specific or general) was introduced into conversation with a total of 45 2- and 4-year-old children. For 10 categories of grammatical structure, E -> C sequences predominated over their C -> E counterparts, with levels of E -> C shifts increasing after two clarification questions. Children were also more reluctant to repeat erroneous forms than their correct counterparts following the intervention of CQs. The findings provide support for Saxton's prompt hypothesis, which predicts that error-contingent CQs have the potential to cue recall of previously acquired grammatical forms.
Abstract:
As terabyte datasets become the norm, the focus has shifted away from our ability to produce and store ever larger amounts of data and onto its utilization. It is becoming increasingly difficult to gain meaningful insights into the data produced, and many forms of the data we currently produce do not fit easily into traditional visualization methods. This paper presents a novel visualization technique based on the concept of a Data Forest. Our Data Forest has been designed to use virtual reality (VR) as its presentation method, since VR is a natural medium for investigating large datasets. Our approach can easily be adapted for use in a variety of settings, from a stand-alone single-user environment to large multi-user collaborative environments. A test application using multi-dimensional data is presented to demonstrate the concepts involved.
Abstract:
In this paper we consider bilinear forms of matrix polynomials and show that these polynomials can be used to construct solutions for the problems of solving systems of linear algebraic equations, matrix inversion and finding extremal eigenvalues. An Almost Optimal Monte Carlo (MAO) algorithm for computing bilinear forms of matrix polynomials is presented. Results are presented for the computational cost of a balanced algorithm for computing the bilinear form of a matrix power, i.e., an algorithm for which the probability and systematic errors are of the same order, and this cost is compared with the computational cost of a corresponding deterministic method.
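To make the idea of estimating a bilinear form concrete, here is a minimal sketch of a plain Markov-chain Monte Carlo estimator of v^T A^k h in Python. It is not the balanced MAO algorithm of the paper (which additionally controls the balance between stochastic and systematic error); the matrix and vectors are hypothetical.

```python
# Plain Monte Carlo estimator of the bilinear form v^T A^k h via random walks.
# Not the paper's MAO algorithm; a generic sketch with hypothetical data.
import numpy as np

def mc_bilinear_power(A, v, h, k, n_walks=20000, seed=0):
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    p0 = np.abs(v) / np.abs(v).sum()                       # initial-index probabilities
    P = np.abs(A) / np.abs(A).sum(axis=1, keepdims=True)   # transition probabilities
    total = 0.0
    for _ in range(n_walks):
        i = rng.choice(n, p=p0)
        w = v[i] / p0[i]                  # importance weight at the start index
        for _ in range(k):                # each step of the walk applies A once
            j = rng.choice(n, p=P[i])
            w *= A[i, j] / P[i, j]
            i = j
        total += w * h[i]                 # score against the end index
    return total / n_walks

A = np.array([[0.5, 0.2, 0.1],
              [0.1, 0.4, 0.2],
              [0.2, 0.1, 0.3]])
v = np.array([1.0, 2.0, 0.5])
h = np.array([0.3, 0.7, 1.0])
print("Monte Carlo estimate:", mc_bilinear_power(A, v, h, k=3))
print("Exact value:         ", v @ np.linalg.matrix_power(A, 3) @ h)
```

Since a matrix polynomial is a linear combination of matrix powers, a bilinear form of the polynomial can be assembled from such power estimates.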
Abstract:
This study probes the molecular interactions between model drugs and poloxamers that facilitate dissolution rate improvements in solid dispersions. Ibuprofen and ketoprofen solid dispersions were prepared at different mole ratios using poloxamers 407 and 188. The carbonyl stretching vibration of the ibuprofen dimer shifted to higher wavenumber in the infrared spectra of 2:1 drug:carrier mole ratio solid dispersions, indicating disruption of the ibuprofen dimer concomitant with hydrogen bond formation between the drug and carrier. Solid dispersions with mole ratios >2:1 drug:carrier (up to 29:1) showed both ibuprofen hydrogen-bonded to the poloxamer and excess drug present as dimers. X-ray diffraction studies confirmed these findings, with no evidence of crystalline drug in 2:1 mole ratio systems, whereas higher drug loadings retained crystalline ibuprofen. Similar results were found with ketoprofen-poloxamer solid dispersions. Thermal analysis of ibuprofen-poloxamer 407 solid dispersions and the resultant phase diagram suggested that solid solutions and a eutectic system were formed, depending on drug loading. Dissolution studies showed the fastest release from the solid solutions; dissolution rates from solid solutions were 12-fold greater than the dissolution of ibuprofen powder, whereas the eutectic system gave a 6-fold improvement over the powder. When designing solid dispersions to improve the delivery of poorly water-soluble drugs, the nature of the drug:carrier interactions, which is governed by the stoichiometry of the composition, can affect the dissolution rate improvement.
Abstract:
Based on the potential benefits to human health, there is interest in developing sustainable nutritional strategies to enhance the concentration of long-chain n-3 fatty acids in ruminant-derived foods. Four Aberdeen Angus steers fitted with rumen and duodenal cannulae were used in a 4 × 4 Latin square experiment with 21 d experimental periods to examine the potential of fish oil (FO) in the diet to enhance the supply of 20 : 5n-3 and 22 : 6n-3 available for absorption in growing cattle. Treatments consisted of total mixed rations based on maize silage fed at a rate of 85 g DM/kg live weight^0·75 per d and containing 0, 8, 16 and 24 g FO/kg diet DM. Supplements of FO linearly reduced (P < 0·01) DM intake and shifted (P < 0·01) rumen fermentation towards propionate at the expense of acetate and butyrate. FO in the diet linearly enhanced (P < 0·05) the flow of trans-16 : 1, trans-18 : 1, trans-18 : 2, 20 : 5n-3 and 22 : 6n-3, and linearly decreased (P < 0·05) the flow of 18 : 0 and 18 : 3n-3 at the duodenum. Increases in the flow of trans-18 : 1 were isomer dependent and were determined primarily by higher amounts of trans-11 reaching the duodenum. In conclusion, FO alters ruminal lipid metabolism of growing cattle in a dose-dependent manner consistent with an inhibition of ruminal biohydrogenation, and enhances the amount of long-chain n-3 fatty acids reaching the duodenum, but the increases are marginal due to extensive biohydrogenation in the rumen.
Abstract:
This paper presents a two-stage image restoration framework designed for a novel rectangular poor-pixels detector which, with its miniature size, light weight and low power consumption, has great value in micro vision systems. To meet the demand for fast processing, only a few measured images shifted at the subpixel level are needed for the fusion operation, fewer than are required in traditional approaches. A preliminary restored image is linearly interpolated by maximum likelihood estimation with a least squares method. After noise removal via Canny-operator-based level set evolution, the final high-quality restored image is obtained. Experimental results demonstrate the effectiveness of the proposed framework, a sensible step towards subsequent image understanding and object identification.
Abstract:
We consider scattering of a time harmonic incident plane wave by a convex polygon with piecewise constant impedance boundary conditions. Standard finite or boundary element methods require the number of degrees of freedom to grow at least linearly with respect to the frequency of the incident wave in order to maintain accuracy. Extending earlier work by Chandler-Wilde and Langdon for the sound soft problem, we propose a novel Galerkin boundary element method, with the approximation space consisting of the products of plane waves with piecewise polynomials supported on a graded mesh with smaller elements closer to the corners of the polygon. Theoretical analysis and numerical results suggest that the number of degrees of freedom required to achieve a prescribed level of accuracy grows only logarithmically with respect to the frequency of the incident wave.
Abstract:
We consider the scattering of a time-harmonic acoustic incident plane wave by a sound soft convex curvilinear polygon with Lipschitz boundary. For standard boundary or finite element methods, with a piecewise polynomial approximation space, the number of degrees of freedom required to achieve a prescribed level of accuracy grows at least linearly with respect to the frequency of the incident wave. Here we propose a novel Galerkin boundary element method with a hybrid approximation space, consisting of the products of plane wave basis functions with piecewise polynomials supported on several overlapping meshes: a uniform mesh on illuminated sides, and graded meshes refined towards the corners of the polygon on illuminated and shadow sides. Numerical experiments suggest that the number of degrees of freedom required to achieve a prescribed level of accuracy need only grow logarithmically as the frequency of the incident wave increases.
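A minimal sketch of the kind of algebraically graded mesh referred to in the two scattering abstracts above is given below; the side length, element count and grading exponent are hypothetical illustrative choices, not the values prescribed by the papers' analyses.

```python
# Mesh points on a side of length L, clustered towards the corner at x = 0
# by an algebraic grading x_j = L * (j/N)**q. Parameters are hypothetical.
import numpy as np

def graded_mesh(L, N, q=3.0):
    j = np.arange(N + 1)
    return L * (j / N) ** q

print(np.round(graded_mesh(L=1.0, N=8, q=3.0), 4))
# Elements shrink towards the corner, so the solution's corner behaviour is
# resolved without refining the whole side uniformly.
```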
Abstract:
A simple parameter-adaptive controller design methodology is introduced in which steady-state servo tracking properties provide the major control objective. This is achieved without cancellation of process zeros, and hence the underlying design can be applied to non-minimum phase systems. As with other self-tuning algorithms, the design (user-specified) polynomials of the proposed algorithm define the performance capabilities of the resulting controller. However, with the appropriate definition of these polynomials, the synthesis technique can be shown to admit different adaptive control strategies, e.g. self-tuning PID and self-tuning pole-placement controllers. The algorithm can therefore be thought of as an embodiment of other self-tuning design techniques. The performance of some of the resulting controllers is illustrated using simulation examples and on-line application to an experimental apparatus.
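As background for the self-tuning idea, the sketch below shows a standard recursive least-squares (RLS) parameter estimator of the kind such parameter-adaptive schemes are typically built around; it is not the paper's synthesis technique, and the second-order plant, forgetting factor and noise level are hypothetical.

```python
# Standard recursive least-squares (RLS) estimation of an ARX plant model,
# the identification stage that self-tuning controllers rely on.
# Hypothetical plant: y(t) + a1*y(t-1) + a2*y(t-2) = b1*u(t-1) + noise.
import numpy as np

rng = np.random.default_rng(0)
a1, a2, b1 = -1.5, 0.7, 0.5          # "true" (unknown) plant parameters
theta = np.zeros(3)                   # running estimate of [a1, a2, b1]
P = 1e3 * np.eye(3)                   # covariance of the estimate
lam = 0.98                            # forgetting factor

y1 = y2 = 0.0                         # y(t-1), y(t-2)
u1 = 0.0                              # u(t-1)
for t in range(500):
    y = -a1 * y1 - a2 * y2 + b1 * u1 + 0.01 * rng.standard_normal()
    phi = np.array([-y1, -y2, u1])    # regressor vector
    k = P @ phi / (lam + phi @ P @ phi)
    theta = theta + k * (y - phi @ theta)      # RLS parameter update
    P = (P - np.outer(k, phi @ P)) / lam
    y2, y1 = y1, y                    # shift past outputs
    u1 = rng.uniform(-1, 1)           # persistently exciting input

print("estimated [a1, a2, b1]:", np.round(theta, 3))
```

In a self-tuning loop, the design polynomials would then be applied to the estimated model at each step to update the controller.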
Abstract:
The problem of identifying a nonlinear dynamic system is considered. A two-layer neural network is used to solve the problem. Systems disturbed by unmeasurable noise are considered, where the disturbance is known to be a random piecewise-polynomial process. Absorption polynomials and nonquadratic loss functions are used to reduce the effect of this disturbance on the estimates of the optimal memory of the neural-network model.
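The sketch below is a generic illustration, not the authors' method: a two-layer (single hidden layer) network trained by gradient descent in plain NumPy to identify a simple hypothetical nonlinear system from past input-output data.

```python
# Two-layer neural network identifying a hypothetical nonlinear system
# y(t) = f(y(t-1), u(t-1)) + noise, trained by batch gradient descent.
import numpy as np

rng = np.random.default_rng(0)

# Simulate a hypothetical nonlinear dynamic system with small random noise.
T = 500
u = rng.uniform(-1, 1, T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.6 * np.tanh(y[t - 1]) + 0.4 * u[t - 1] ** 2 + 0.02 * rng.standard_normal()

X = np.column_stack([y[:-1], u[:-1]])   # regressors: previous output and input
d = y[1:]                               # one-step-ahead target

# Two-layer network: tanh hidden layer, linear output.
H, lr = 8, 0.05
W1 = 0.5 * rng.standard_normal((2, H))
b1 = np.zeros(H)
w2 = 0.5 * rng.standard_normal(H)
b2 = 0.0
for _ in range(2000):
    z = np.tanh(X @ W1 + b1)            # hidden activations
    e = z @ w2 + b2 - d                 # prediction error
    dz = np.outer(e, w2) * (1 - z ** 2) # backpropagated hidden-layer error
    W1 -= lr * X.T @ dz / len(d)
    b1 -= lr * dz.mean(axis=0)
    w2 -= lr * z.T @ e / len(d)
    b2 -= lr * e.mean()

z = np.tanh(X @ W1 + b1)
print("one-step-ahead MSE:", float(np.mean((z @ w2 + b2 - d) ** 2)))
```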
Abstract:
High-resolution satellite radar observations of erupting volcanoes can yield valuable information on rapidly changing deposits and geomorphology. Using the TerraSAR-X (TSX) radar, with a spatial resolution of about 2 m and a repeat interval of 11 days, we show how a variety of techniques were used to record some of the eruptive history of the Soufriere Hills Volcano, Montserrat, between July 2008 and February 2010. After a 15-month pause in lava dome growth, a vulcanian explosion occurred on 28 July 2008 whose vent was hidden by dense cloud. Using TSX change difference images, we were able to show the civil authorities that this explosion had not disrupted the dome sufficiently to warrant continued evacuation. Change difference images also proved valuable in mapping new pyroclastic flow deposits: the valley-occupying block-and-ash component tended to increase backscatter and the marginal surge deposits to reduce it, with the pattern reversing after the event. By comparing east- and west-looking images acquired 12 hours apart, the deposition of some individual pyroclastic flows can be inferred from change differences. Some of the narrow upper sections of the valleys draining the volcano received many tens of metres of rockfall and pyroclastic flow deposits over periods of a few weeks. By measuring the changing shadows cast by these valleys in TSX images, the changing depth of infill by deposits could be estimated. In addition to using the amplitude data from the radar images, we also used their phase information within the InSAR technique to calculate the topography during a period of no surface activity. This enabled areas of transient topography, crucial for directing future flows, to be captured.
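As a generic illustration of the change-difference idea (not the TerraSAR-X processing chain used in the study), the sketch below differences two co-registered backscatter intensity images in decibels, so that deposits that increase backscatter appear positive and those that reduce it appear negative; the arrays and the "deposits" are hypothetical.

```python
# Change-difference image between two co-registered SAR intensity images,
# expressed in decibels. Hypothetical data, not TerraSAR-X processing.
import numpy as np

def change_difference_db(before, after, eps=1e-6):
    return 10.0 * np.log10((after + eps) / (before + eps))

rng = np.random.default_rng(0)
before = rng.gamma(2.0, 1.0, size=(256, 256))   # speckled pre-event intensity
after = before.copy()
after[100:140, 60:200] *= 3.0    # hypothetical deposit that raises backscatter
after[180:200, 30:120] *= 0.5    # hypothetical surge deposit that lowers it

diff = change_difference_db(before, after)
print("change range (dB): %.1f to %.1f" % (diff.min(), diff.max()))
```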