931 results for Scalar Functions of one Variable


Relevance:

100.00%

Publisher:

Abstract:

A conceptually new approach is introduced for decomposing the molecular energy calculated at the density functional theory level into a sum of one- and two-atomic energy components, and is realized in the "fuzzy atoms" framework. (Fuzzy atoms mean that the three-dimensional physical space is divided into atomic regions having no sharp boundaries but exhibiting a continuous transition from one to another.) The new scheme uses the new concept of "bond order density" to calculate the diatomic exchange energy components and gives values unexpectedly close to those calculated with exact (Hartree-Fock) exchange for the same Kohn-Sham orbitals.
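The decomposition hinges on atomic weight functions w_A(r) that vary smoothly between atoms. As a rough illustration only, the sketch below uses Becke-style weights (a common "fuzzy atoms" choice, not necessarily the paper's) to split a grid-sampled one-electron integrand into one-atom contributions; the bond-order-density treatment of the diatomic exchange terms is not reproduced.

```python
# Hedged sketch: a "fuzzy atoms" partition of 3D space and its use to split a
# grid-sampled integrand into atomic contributions.  Becke-style weights are
# one common fuzzy-atom choice; the paper's own weights are not reproduced.
import numpy as np

def fuzzy_weights(points, atoms, k=3):
    """Return w[A, i]: weight of atom A at grid point i (sums to 1 over A)."""
    dists = np.linalg.norm(points[None, :, :] - atoms[:, None, :], axis=2)
    cell = np.ones((len(atoms), len(points)))
    for a in range(len(atoms)):
        for b in range(len(atoms)):
            if a == b:
                continue
            mu = (dists[a] - dists[b]) / np.linalg.norm(atoms[a] - atoms[b])
            for _ in range(k):                      # Becke's iterated smoothing
                mu = 1.5 * mu - 0.5 * mu ** 3
            cell[a] *= 0.5 * (1.0 - mu)             # soft "this side of A" factor
    return cell / cell.sum(axis=0)

def one_atom_components(values, quad_w, weights):
    """E_A = sum_i w_A(r_i) f(r_i) q_i; the E_A sum back to the full integral.
    Two-atom terms would use products w_A(r1) w_B(r2) of weights at two points."""
    return np.einsum('ai,i,i->a', weights, values, quad_w)
```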

Relevance:

100.00%

Publisher:

Abstract:

Most failures of structural elements are due to fatigue loading. Consequently, mechanical fatigue is a key factor in the design of mechanical components. In the case of laminated composite materials, the fatigue failure process involves different damage mechanisms that result in the degradation of the material. One of the most important damage mechanisms is delamination between the plies of the laminate. In the case of aeronautical components, composite plates are exposed to impacts, and delaminations appear easily in a laminate after an impact. Many composite components have curved shapes, ply overlaps and plies with different orientations, which cause a delamination to propagate under a mixed mode that depends on the size of the delamination. That is, delaminations generally propagate under varying mixed mode. It is therefore important to develop new methods to characterize the subcritical mixed-mode fatigue growth of delaminations. The main objective of this work is the characterization of the varying-mixed-mode growth of delaminations in laminated composites under fatigue loading. To this end, a new model for mixed-mode fatigue delamination growth is proposed. In contrast to existing models, the proposed model is formulated according to the non-monotonic variation of the propagation parameters with the mixed mode observed in different experimental results. In addition, an analysis is carried out of the mixed-mode end load split (MMELS) test, whose most important characteristic is the variation of the mixed mode as the delamination grows. For this analysis, two theoretical methods from the literature are considered. However, the resulting expressions for the MMELS test are not equivalent, and the differences between the two methods can be significant, up to a factor of 50. For this reason, an alternative, more accurate analysis of the MMELS test is carried out in this work in order to establish a comparison. This alternative analysis is based on the finite element method and the virtual crack closure technique (VCCT). Important aspects to be considered for the proper characterization of materials using the MMELS test emerge from this analysis. During the study, a test rig for the MMELS test was designed and built. For the experimental characterization of fatigue delamination propagation under varying mixed mode, different essentially unidirectional carbon/epoxy laminate specimens are used. A fractographic analysis of some of the delamination fracture surfaces is also carried out. The experimental results are compared with the predictions of the proposed model for the fatigue propagation of interlaminar cracks.
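As a point of reference for the kind of model discussed above, the sketch below implements a generic Paris-type mixed-mode fatigue delamination law with the coefficients interpolated in the mode mixity. The interpolation functions and parameter values are hypothetical placeholders; the thesis proposes a non-monotonic dependence on the mode mixity rather than the monotonic one shown here.

```python
# Hedged sketch: a generic Paris-type mixed-mode fatigue delamination law,
# da/dN = C(MR) * (G_max / G_c(MR))^m(MR), with the parameters interpolated in
# the mode mixity MR = G_II / (G_I + G_II).  All values are placeholders.
import numpy as np

def paris_coefficients(mode_ratio, C_I=1e-7, C_II=1e-9, m_I=6.0, m_II=10.0):
    """Linear interpolation between pure-mode-I and pure-mode-II parameters."""
    C = C_I + (C_II - C_I) * mode_ratio
    m = m_I + (m_II - m_I) * mode_ratio
    return C, m

def fracture_toughness(mode_ratio, G_Ic=0.3, G_IIc=1.0):
    """Mixed-mode toughness [kJ/m^2], linear interpolation (placeholder criterion)."""
    return G_Ic + (G_IIc - G_Ic) * mode_ratio

def growth_rate(G_max, mode_ratio):
    """Delamination growth per cycle da/dN [mm/cycle] for maximum ERR G_max [kJ/m^2]."""
    C, m = paris_coefficients(mode_ratio)
    return C * (G_max / fracture_toughness(mode_ratio)) ** m
```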

Relevance:

100.00%

Publisher:

Abstract:

Exhibiting is, or should be, a way of working against ignorance, especially against the most refractory ignorance of all: the preconceived idea, the stereotyped culture. To exhibit is to take a calculated risk of disorientation, in the etymological sense (to lose one's bearings); it disturbs the harmony, the self-evident and the consensus that constitute the commonplace (the banal). Needless to say, an exhibition that deliberately tries to scandalise creates an inverted perversion which results in an obscurantist pseudo-luxury culture... Between demagogy and provocation, one has to find visual communication's subtle itinerary, even though an intermediary route is not as stimulating: as Gaston Bachelard said, "All roads lead to Rome, except the roads of compromise." It is becoming ever more evident that museums have undergone changes that are noticeable in numerous areas. As well as the traditional functions of collecting, conserving and exhibiting objects, museums have tried to become a means of communication, open and aware of the concerns of modern society. In order to do this, they have started to use the modern technology now available, led by the hand of "marketing" and modern business management.

Relevance:

100.00%

Publisher:

Abstract:

The variogram is essential for local estimation and mapping of any variable by kriging. The variogram itself must usually be estimated from sample data. The sampling density is a compromise between precision and cost, but it must be sufficiently dense to encompass the principal spatial sources of variance. A nested, multi-stage sampling scheme with separating distances increasing in geometric progression from stage to stage will do that. The data may then be analyzed by a hierarchical analysis of variance to estimate the components of variance for every stage, and hence for every lag. By accumulating the components starting from the shortest lag one obtains a rough variogram for modest effort. For balanced designs the analysis of variance is optimal; for unbalanced ones, however, these estimators are not necessarily the best, and analysis by residual maximum likelihood (REML) will usually be preferable. The paper summarizes the underlying theory and illustrates its application with data from three surveys: one in which the design had four stages and was balanced, and two implemented with unbalanced designs to economize when there were more stages. A Fortran program is available for the analysis of variance, and code for the REML analysis is listed in the paper. (c) 2005 Elsevier Ltd. All rights reserved.
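The accumulation step described above is straightforward once the components of variance have been estimated (by hierarchical ANOVA for a balanced design, or REML otherwise). A minimal sketch, with hypothetical stage lags and component values:

```python
# Hedged sketch: turning variance components from a nested (hierarchical)
# sampling design into a rough variogram.  The components are assumed to have
# been estimated already; the lags and values below are hypothetical.
import numpy as np

# Separating distances of the sampling stages, coarsest to finest (metres).
lags = np.array([1000.0, 100.0, 10.0, 1.0])
# Estimated component of variance contributed by each stage.
components = np.array([0.8, 0.5, 0.3, 0.1])

# Accumulate from the shortest lag upward: the semivariance at a given lag is
# the sum of the components for that stage and all finer stages.
order = np.argsort(lags)                # finest first
gamma = np.cumsum(components[order])    # rough variogram estimates
rough_variogram = dict(zip(lags[order], gamma))
print(rough_variogram)   # {1.0: 0.1, 10.0: 0.4, 100.0: 0.9, 1000.0: 1.7}
```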

Relevance:

100.00%

Publisher:

Abstract:

In this article, we use the no-response test idea, introduced in Luke and Potthast (2003) and Potthast (preprint) for the inverse obstacle problem, to identify the interface of discontinuity of the coefficient γ in the equation ∇ · γ(x)∇ + c(x), with piecewise regular γ and bounded function c(x). We use infinitely many Cauchy data as measurement and give a reconstructive method to localize the interface. We base this multiwave version of the no-response test on two different proofs. The first contains a pointwise estimate as used by the singular sources method. The second is built on an energy (or integral) estimate, which is the basis of the probe method. As a consequence, the probe and singular sources methods are equivalent regarding their convergence, and the no-response test can be seen as a unified framework for these methods. As a further contribution, we provide a formula to reconstruct the values of the jump of γ(x), x ∈ ∂D, at the boundary. A second consequence of this formula is that the blow-up rate of the indicator functions of the probe and singular sources methods at the interface is given by the order of the singularity of the fundamental solution.
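For readability, the divergence-form problem referred to above can be written out in full. This is a plausible rendering consistent with the abstract; the domain Ω, the field u and the homogeneous right-hand side are assumptions, not quotations from the paper.

```latex
% Assumed divergence-form boundary value problem behind the operator above:
% \gamma is piecewise regular and discontinuous across the interface \partial D,
% c is bounded; Cauchy data (u, \gamma \partial_\nu u) are taken on \partial\Omega.
\nabla \cdot \bigl( \gamma(x)\, \nabla u(x) \bigr) + c(x)\, u(x) = 0
  \quad \text{in } \Omega, \qquad D \subset\subset \Omega .
```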

Relevance:

100.00%

Publisher:

Abstract:

The skill of numerical Lagrangian drifter trajectories in three numerical models is assessed by comparing these numerically obtained paths to the trajectories of drifting buoys in the real ocean. The skill assessment is performed using the two-sample Kolmogorov–Smirnov statistical test. To demonstrate the assessment procedure, it is applied to three different models of the Agulhas region. The test can either be performed using crossing positions of one-dimensional sections, in order to test model performance in specific locations, or using the total two-dimensional data set of trajectories. The test yields four quantities: a binary decision of model skill, a confidence level which can be used as a measure of goodness-of-fit of the model, a test statistic which can be used to determine the sensitivity of the confidence level, and cumulative distribution functions that aid in the qualitative analysis. The ordering of models by their confidence levels is the same as the ordering based on the qualitative analysis, which suggests that the method is suited for model validation. Only one of the three models, a 1/10° two-way nested regional ocean model, might have skill in the Agulhas region. The other two models, a 1/2° global model and a 1/8° assimilative model, might have skill only on some sections in the region.
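A minimal sketch of the section-crossing comparison using the two-sample Kolmogorov–Smirnov test as implemented in scipy.stats; the crossing positions and the 5% threshold are illustrative, not the study's data or exact decision rule.

```python
# Hedged sketch: comparing observed drifter crossings of a section with the
# crossings of numerical trajectories using the two-sample KS test.  Positions
# are hypothetical, e.g. latitudes at which trajectories cross a chosen section.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
obs_crossings = rng.normal(loc=-34.0, scale=1.5, size=200)    # real drifters
model_crossings = rng.normal(loc=-33.5, scale=2.0, size=500)  # model drifters

stat, p_value = ks_2samp(obs_crossings, model_crossings)
has_skill = p_value > 0.05   # fail to reject "same distribution" at the 5% level
print(f"KS statistic = {stat:.3f}, confidence level = {p_value:.3f}, skill: {has_skill}")
```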

Relevance:

100.00%

Publisher:

Abstract:

Orthogonal internal coordinates are defined which have useful properties for constructing the potential energy functions of triatomic molecules with two or three minima on the surface. The coordinates are used to obtain ground state potentials of ClOO and HOF, both of which have three minima.

Relevance:

100.00%

Publisher:

Abstract:

An adaptive tuned vibration absorber (ATVA) with a smart variable stiffness element is capable of retuning itself in response to a time-varying excitation frequency, enabling effective vibration control over a range of frequencies. This paper discusses novel methods of achieving variable stiffness in an ATVA by changing shape, as inspired by biological paradigms. It is shown that considerable variation in the tuned frequency can be achieved by actuating a shape change, provided that this is within the limits of the actuator. A feasible design for such an ATVA is one in which the device offers low resistance to the required shape change actuation while not being restricted to low values of the effective stiffness of the vibration absorber. Three such original designs are identified: (i) a pinned-pinned arch beam with a fixed profile of slight curvature and variable preload through an adjustable natural curvature; (ii) a vibration absorber with a stiffness element formed from parallel curved beams of adjustable curvature vibrating longitudinally; (iii) a vibration absorber with a variable-geometry linkage as stiffness element. The experimental results from demonstrators based on two of these designs show good correlation with the theory.
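The retuning principle rests on the tuned frequency of a mass-spring absorber, f = sqrt(k_eff/m)/(2π), so any shape change that alters the effective stiffness retunes the device. A minimal sketch with hypothetical mass and stiffness range:

```python
# Hedged sketch: the basic retuning relation behind a stiffness-based ATVA.
# The mass and the stiffness range achievable by shape change are hypothetical.
import numpy as np

m = 0.5                                       # absorber mass [kg]
k_eff = np.linspace(2.0e4, 8.0e4, 5)          # effective stiffness range [N/m]
f_tuned = np.sqrt(k_eff / m) / (2.0 * np.pi)  # tuned frequencies [Hz]
print(np.round(f_tuned, 1))                   # roughly 31.8 Hz up to 63.7 Hz
```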

Relevance:

100.00%

Publisher:

Abstract:

Background: There is little information about the relation between the fatty acid composition of human immune cells and the function of those cells over the habitual range of fatty acid intakes. Objective: The objective of the study was to determine the relation between the fatty acid composition of human peripheral blood mononuclear cell (PBMC) phospholipids and the functions of human immune cells. Design: One hundred fifty healthy adult subjects provided a fasting blood sample. The phagocytic and oxidative burst activities of monocytes and neutrophils were measured in whole blood. PBMCs were isolated and used to measure lymphocyte proliferation in response to the T cell mitogen concanavalin A and the production of cytokines in response to concanavalin A or bacterial lipopolysaccharide. The fatty acid composition of plasma and PBMC phospholipids was determined. Results: Wide variations in the fatty acid composition of PBMC phospholipids and in immune cell functions were identified among the subjects. The proportions of total polyunsaturated fatty acids (PUFAs), of total n-6 and n-3 PUFAs, and of several individual PUFAs in PBMC phospholipids were positively correlated with phagocytosis by neutrophils and monocytes, neutrophil oxidative burst, lymphocyte proliferation, and interferon gamma production. The ratios of saturated fatty acids to PUFAs and of n-6 to n-3 PUFAs were negatively correlated with these same immune functions. The relation of PBMC fatty acid composition to monocyte oxidative burst was the reverse of its relation to monocyte phagocytosis and neutrophil oxidative burst. Conclusion: Variations in the fatty acid composition of PBMC phospholipids account for some of the variability in immune cell functions among healthy adults.

Relevance:

100.00%

Publisher:

Abstract:

Solutions of a two-dimensional dam break problem are presented for two tailwater/reservoir height ratios. The numerical scheme used is an extension of one previously given by the author [J. Hyd. Res. 26(3), 293–306 (1988)], and is based on numerical characteristic decomposition. Thus approximate solutions are obtained via linearised problems, and the method of upwind differencing is used for the resulting scalar problems, together with a flux limiter for obtaining a second order scheme which avoids non-physical, spurious oscillations.
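A minimal sketch of the scalar building block referred to above: a first-order upwind update with a limited anti-diffusive correction (minmod limiter) for linear advection. The limiter choice and setup are illustrative, not the paper's exact scheme.

```python
# Hedged sketch: limited upwind differencing for a scalar advection problem,
# u_t + a u_x = 0 with a > 0 on a periodic domain.  Second-order accuracy is
# recovered away from extrema while the minmod limiter suppresses spurious
# oscillations; parameters are illustrative.
import numpy as np

def minmod(a, b):
    return np.where(a * b > 0.0, np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

def step(u, a, dx, dt):
    """One limited-upwind time step (a > 0, periodic boundaries)."""
    slope = minmod(u - np.roll(u, 1), np.roll(u, -1) - u)   # limited cell slopes
    nu = a * dt / dx                                        # Courant number (<= 1)
    flux = a * (u + 0.5 * (1.0 - nu) * slope)               # upwind flux + correction
    return u - (dt / dx) * (flux - np.roll(flux, 1))

# usage: advect a square wave one step
x = np.linspace(0.0, 1.0, 200, endpoint=False)
u = np.where((x > 0.3) & (x < 0.5), 1.0, 0.0)
u = step(u, a=1.0, dx=x[1] - x[0], dt=0.4 * (x[1] - x[0]))
```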

Relevance:

100.00%

Publisher:

Abstract:

The climate belongs to the class of non-equilibrium forced and dissipative systems, for which most results of quasi-equilibrium statistical mechanics, including the fluctuation-dissipation theorem, do not apply. In this paper we show for the first time how the Ruelle linear response theory, developed for studying rigorously the impact of perturbations on general observables of non-equilibrium statistical mechanical systems, can be applied with great success to analyze the climatic response to general forcings. The crucial value of the Ruelle theory lies in the fact that it allows one to compute the response of the system in terms of expectation values of explicit and computable functions of the phase space averaged over the invariant measure of the unperturbed state. We choose as test bed a classical version of the Lorenz 96 model, which, in spite of its simplicity, has a well-recognized prototypical value as it is a spatially extended one-dimensional model and presents the basic ingredients, such as dissipation, advection and the presence of an external forcing, of the actual atmosphere. We recapitulate the main aspects of the general response theory and propose some new general results. We then analyze the frequency dependence of the response of both local and global observables to perturbations having localized as well as global spatial patterns. We derive analytically several properties of the corresponding susceptibilities, such as asymptotic behavior, validity of Kramers-Kronig relations, and sum rules, whose main ingredient is the causality principle. We show that all the coefficients of the leading asymptotic expansions as well as the integral constraints can be written as linear functions of parameters that describe the unperturbed properties of the system, such as its average energy. Some newly obtained empirical closure equations for such parameters allow these properties to be defined as explicit functions of the unperturbed forcing parameter alone for a general class of chaotic Lorenz 96 models. We then verify the theoretical predictions from the outputs of the simulations up to a high degree of precision. The theory is used to explain differences in the response of local and global observables, to define the intensive properties of the system, which do not depend on the spatial resolution of the Lorenz 96 model, and to generalize the concept of climate sensitivity to all time scales. We also show how to reconstruct the linear Green function, which maps perturbations of general time patterns into changes in the expectation value of the considered observable for finite as well as infinite time. Finally, we propose a simple yet general methodology to study general climate change problems on virtually any time scale by resorting only to well-selected simulations, and by taking full advantage of ensemble methods. The specific case of the globally averaged surface temperature response to a general pattern of change of the CO2 concentration is discussed. We believe that the proposed approach may constitute a mathematically rigorous and practically very effective way to approach the problem of climate sensitivity, climate prediction, and climate change from a radically new perspective.
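A minimal sketch of the test bed: the Lorenz 96 model integrated with RK4, plus a crude finite-difference estimate of the static response of a global observable (here the total energy) to a change in the forcing F. The resolution, forcing, run lengths and observable are illustrative choices, not the paper's setup.

```python
# Hedged sketch: Lorenz 96 dynamics and a finite-difference response estimate.
import numpy as np

def l96_tendency(x, F):
    """dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F, with periodic indices."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def rk4_step(x, F, dt):
    k1 = l96_tendency(x, F)
    k2 = l96_tendency(x + 0.5 * dt * k1, F)
    k3 = l96_tendency(x + 0.5 * dt * k2, F)
    k4 = l96_tendency(x + dt * k3, F)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def mean_energy(F, n=40, dt=0.01, spinup=2000, steps=20000, seed=0):
    """Long-time average of the global observable E = 0.5 * sum(x_i**2)."""
    x = np.random.default_rng(seed).standard_normal(n)
    for _ in range(spinup):
        x = rk4_step(x, F, dt)
    acc = 0.0
    for _ in range(steps):
        x = rk4_step(x, F, dt)
        acc += 0.5 * np.sum(x ** 2)
    return acc / steps

# Crude estimate of the static response dE/dF around the standard forcing F = 8.
dF = 0.5
response = (mean_energy(8.0 + dF) - mean_energy(8.0 - dF)) / (2.0 * dF)
```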

Relevance:

100.00%

Publisher:

Abstract:

In this paper the stability of one-step-ahead predictive controllers based on non-linear models is established. It is shown that, under conditions which can be fulfilled by most industrial plants, the closed-loop system is robustly stable in the presence of plant uncertainties and input–output constraints. There is no requirement that the plant should be open-loop stable, and the analysis is valid for general forms of non-linear system representation, including the constraint-free case. The effectiveness of controllers designed according to the algorithm analyzed in this paper is demonstrated on a recognized benchmark problem and on a simulation of a continuous stirred-tank reactor (CSTR). In both examples a radial basis function neural network is employed as the non-linear system model.
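A minimal sketch of the ingredients named above: a radial basis function model used for one-step-ahead prediction, and a controller that picks the bounded input minimising the predicted one-step error to the setpoint. The model structure, the grid search and all parameters are illustrative, not the paper's algorithm.

```python
# Hedged sketch: one-step-ahead predictive control with an RBF plant model
# y(k+1) = f(y(k), u(k)).  Centres, widths and weights are assumed given
# (e.g. from offline identification); input bounds are hypothetical.
import numpy as np

class RBFModel:
    def __init__(self, centres, widths, weights):
        self.c, self.s, self.w = centres, widths, weights   # (M,2), (M,), (M,)

    def predict(self, y, u):
        """Predicted next output for current output y and candidate input u."""
        z = np.array([y, u])
        phi = np.exp(-np.sum((self.c - z) ** 2, axis=1) / (2.0 * self.s ** 2))
        return float(phi @ self.w)

def one_step_controller(model, y, setpoint, u_min=-1.0, u_max=1.0, n_grid=201):
    """Grid search for the bounded input minimising the predicted one-step error."""
    candidates = np.linspace(u_min, u_max, n_grid)
    errors = [(model.predict(y, u) - setpoint) ** 2 for u in candidates]
    return candidates[int(np.argmin(errors))]
```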

Relevance:

100.00%

Publisher:

Abstract:

Background: Serine proteases are a major component of viper venoms and are thought to disrupt several distinct elements of the blood coagulation system of envenomed victims. A detailed understanding of the functions of these enzymes is important both for acquiring a fuller understanding of the pathology of envenoming and because these venom proteins have shown potential in treating blood coagulation disorders. Methodology/Principal Findings: In this study a novel, highly abundant serine protease, which we have named rhinocerase, has been isolated and characterised from the venom of Bitis gabonica rhinoceros using liquid-phase isoelectric focusing and gel filtration. Like many viper venom serine proteases, this enzyme is glycosylated; the estimated molecular mass of the native enzyme is approximately 36 kDa, which reduces to 31 kDa after deglycosylation. The partial amino acid sequence shows similarity to other viper venom serine proteases, but is clearly distinct from the sequence of the only other sequenced serine protease from Bitis gabonica. Other viper venom serine proteases have been shown to exert distinct biological effects, and our preliminary functional characterisation suggests that rhinocerase is multifunctional. It is capable of degrading the α and β chains of fibrinogen, dissolving plasma clots and hydrolysing a kallikrein substrate. Conclusions/Significance: A novel multifunctional viper venom serine protease has been isolated and characterised. The activities of the enzyme are consistent with the known in vivo effects of Bitis gabonica envenoming, including bleeding disorders, clotting disorders and hypotension. This study will form the basis for future research to understand the mechanisms of serine protease action, and to examine the potential for rhinocerase to be used clinically to reduce the risk of human haemostatic disorders such as heart attacks and strokes.

Relevance:

100.00%

Publisher:

Abstract:

Transient epileptic amnesia (TEA) is characterized by deficits in autobiographical memory (AM). One of the functions of AM is to maintain the self, suggesting that the self may undergo changes as a result of memory loss in temporal lobe epilepsy. To examine this, we used a modification of a task used to assess the relationship between self and memory (the IAM task) in a single case, E.B. Despite complaints of AM loss, E.B. had no difficulty in producing a range of self-images (e.g., I am a husband) and collections of self-defining AMs in support of these statements. E.B. produced fewer episodic memories at times of self-formation, but this did not seem to impact on the maintenance of self. The results support recent work suggesting the self may be maintained in the absence of episodic memory. The application of tasks such as that used here will further elucidate AM impairment in temporal lobe epilepsy. (C) 2011 Elsevier Inc. All rights reserved.