483 results for Indefinite extensibility
Abstract:
The no response test is a new scheme in inverse problems for partial differential equations which was recently proposed in [D. R. Luke and R. Potthast, SIAM J. Appl. Math., 63 (2003), pp. 1292–1312] in the framework of inverse acoustic scattering problems. The main idea of the scheme is to construct special probing waves which are small on some test domain. Then the response for these waves is constructed. If the response is small, the unknown object is assumed to be a subset of the test domain. The response is constructed from one, several, or many particular solutions of the problem under consideration. In this paper, we investigate the convergence of the no response test for the reconstruction of information about inclusions D from the Cauchy values of solutions to the Helmholtz equation on an outer surface $\partial\Omega$ with $\overline{D} \subset \Omega$. We show that the one-wave no response test provides a criterion to test the analytic extensibility of a field. In particular, we investigate the construction of approximations for the set of singular points $N(u)$ of the total field u from one given pair of Cauchy data. Thus, the no response test solves a particular version of the classical Cauchy problem. Also, if an infinite number of fields is given, we prove that a multifield version of the no response test reconstructs the unknown inclusion D. This is the first convergence analysis that has been achieved for the no response test.
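To make the scheme concrete, a schematic form of the one-wave indicator is sketched below in the reciprocity-gap style common to related sampling methods; the symbols ($G$ for the test domain, $k$ for the wave number, $I_\epsilon$ for the response) are assumptions for illustration and the paper's precise definition may differ.

$$ I(v) = \int_{\partial\Omega} \Big( u\,\frac{\partial v}{\partial \nu} - v\,\frac{\partial u}{\partial \nu} \Big)\, ds, \qquad I_\epsilon(G) = \sup\Big\{\, |I(v)| \;:\; \Delta v + k^{2} v = 0,\ \|v\|_{C(\overline{G})} \le \epsilon \,\Big\}. $$

If $I_\epsilon(G)$ remains small, the test domain $G$ is accepted: the field is taken to extend analytically into $\Omega \setminus \overline{G}$, so the singular set $N(u)$ (and hence the inclusion D) is sought inside $G$.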
Abstract:
The mechanisms underlying the increase in stress for large mechanical strains of a polymer glass, quantified by the strain-hardening modulus, are still poorly understood. In the present paper we aim to elucidate this matter and present new mechanisms. Molecular-dynamics simulations of two polymers with very different strain-hardening moduli (polycarbonate and polystyrene) have been carried out. Nonaffine displacements occur because of steric hindrances and connectivity constraints. We argue that it is not necessary to introduce the concept of entanglements to understand strain hardening, but that hardening is rather coupled with the increase in the rate of nonaffine particle displacements. This rate increases faster for polycarbonate, which has the higher strain-hardening modulus. Also, more nonaffine chain stretching is present for polycarbonate. It is shown that the inner distances of such a nonaffinely deformed chain can be well described by the inner distances of the worm-like chain, but with an effective stiffness length (equal to the Kuhn length for an infinite worm-like chain) that increases during deformation. This increase originates from the finite extensibility of the chain. In this way the increase in nonaffine particle displacement can be understood as resulting from an increase in the effective stiffness length of the perturbed chain during deformation, so that at larger strains a higher rate of plastic events in terms of nonaffine displacement is necessary, causing in turn the observed strain hardening in polymer glasses.
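For reference, the worm-like-chain expression appealed to above can be written as follows; $L$ denotes the contour length between two monomers, $l_p$ the persistence length, and $b$ the effective stiffness (Kuhn) length. The notation is mine, not the paper's.

$$ \langle R^{2}(L) \rangle = 2 l_p L - 2 l_p^{2}\left(1 - e^{-L/l_p}\right) \;\longrightarrow\; bL \quad (L \gg l_p), \qquad b = 2 l_p. $$

In this reading, the proposed hardening mechanism corresponds to $b$ (and hence the amount of nonaffine rearrangement needed to accommodate further stretching) growing as the chain is deformed toward its finite-extensibility limit.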
Abstract:
We consider the application of the conjugate gradient method to the solution of large, symmetric indefinite linear systems. Special emphasis is put on the use of constraint preconditioners and a new factorization that can reduce the number of flops required by the preconditioning step. Results concerning the eigenvalues of the preconditioned matrix and its minimum polynomial are given. Numerical experiments validate these conclusions.
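The standard constraint-preconditioning setting can be sketched as follows (notation assumed here, not quoted from the paper): for a saddle-point matrix $K$ with $A \in \mathbb{R}^{m\times n}$ of full rank, the preconditioner $P$ reproduces the constraint blocks exactly and replaces $H$ by a cheaper approximation $G$:

$$ K = \begin{pmatrix} H & A^{T} \\ A & 0 \end{pmatrix}, \qquad P = \begin{pmatrix} G & A^{T} \\ A & 0 \end{pmatrix}. $$

A well-known consequence (Keller, Gould and Wathen) is that $P^{-1}K$ has the eigenvalue 1 with multiplicity $2m$, while the remaining $n-m$ eigenvalues solve $Z^{T}HZ\,v = \lambda\,Z^{T}GZ\,v$, where the columns of $Z$ span the null space of $A$; results of this type underpin eigenvalue and minimum-polynomial statements such as those mentioned above.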
Abstract:
We consider conjugate-gradient-like methods for solving block symmetric indefinite linear systems that arise from saddle-point problems or, in particular, regularizations thereof. Such methods require preconditioners that preserve certain sub-blocks from the original systems but allow considerable flexibility for the remaining blocks. We construct a number of families of implicit factorizations that are capable of reproducing the required sub-blocks and (some of) the remainder. These generalize known implicit factorizations for the unregularized case. Improved eigenvalue clustering is possible if, additionally, some of the noncrucial blocks are reproduced. Numerical experiments confirm that these implicit-factorization preconditioners can be very effective in practice.
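Schematically, and with notation assumed rather than taken from the paper, the regularized saddle-point systems and the implicit-factorization idea look like this: the preconditioner is built as a product whose factors are chosen so that the crucial blocks ($A$ and, where required, $C$) are reproduced exactly, while the leading block $G$ is only defined implicitly.

$$ K = \begin{pmatrix} H & A^{T} \\ A & -C \end{pmatrix}, \qquad P = M\,B\,M^{T} = \begin{pmatrix} G & A^{T} \\ A & -C \end{pmatrix}, $$

so that $P^{-1}$ is applied through solves with the cheap, structured factors $M$ and $B$, without ever forming $G$.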
Abstract:
Most suspension-feeding trichopterans spin a fine-silk capture net that is used to remove suspended matter from the water. The efficiency of these nets has previously been studied by considering the geometry of the web structure, but the material from which the nets are constructed has received little attention. We report measurements of the tensile strength and extensibility of net silk from Hydropsyche siltalai. These measurements place caddisfly silk as one of the weakest natural silks so far reported, with a mean tensile strength of 221 ± 22 meganewtons (MN)/m². We also show that H. siltalai silk can more than double in length before catastrophic breakage, and that the silk is at least 2 orders of magnitude stronger than the maximum force estimated to act upon it in situ. Possible reasons for this disparity include constraints of evolutionary history and safety margins to prevent net failure or performance reduction.
Abstract:
Xyloglucan-acting enzymes are believed to have effects on type I primary plant cell wall mechanical properties. In order to get a better understanding of these effects, a range of enzymes with different in vitro modes of action were tested against cell wall analogues (bio-composite materials based on Acetobacter xylinus cellulose and xyloglucan). Tomato pericarp xyloglucan endotransglycosylase (tXET) and nasturtium seed xyloglucanase (nXGase) were produced heterologously in Pichia pastoris. Their action against the cell wall analogues was compared with that of a commercial preparation of Trichoderma endo-glucanase (EndoGase). Both 'hydrolytic' enzymes (nXGase and EndoGase) were able to depolymerise not only the cross-link xyloglucan fraction but also the surface-bound fraction. Consequent major changes in cellulose fibril architecture were observed. In mechanical terms, removal of xyloglucan cross-links from composites resulted in increased stiffness (at high strain) and decreased visco-elasticity with similar extensibility. On the other hand, true transglycosylase activity (tXET) did not affect the cellulose/xyloglucan ratio. No change in composite stiffness or extensibility resulted, but a significant increase in creep behaviour was observed in the presence of active tXET. These results provide direct in vitro evidence for the involvement of cell wall xyloglucan-specific enzymes in mechanical changes underlying plant cell wall re-modelling and growth processes. Mechanical consequences of tXET action are shown to be complementary to those of cucumber expansin.
Abstract:
The rheological properties of dough and gluten are important for end-use quality of flour, but there is a lack of knowledge of the relationships between fundamental and empirical tests and how they relate to flour composition and gluten quality. Dough and gluten from six breadmaking wheat qualities were subjected to a range of rheological tests. Fundamental (small-deformation) rheological characterizations (dynamic oscillatory shear and creep recovery) were performed on gluten to avoid the nonlinear influence of the starch component, whereas large-deformation tests were conducted on both dough and gluten. A number of variables from the various curves were considered and subjected to a principal component analysis (PCA) to get an overview of relationships between the various variables. The first component represented variability in protein quality, associated with elasticity and tenacity in large deformation (large positive loadings for resistance to extension and initial slope of dough and gluten extension curves recorded by the SMS/Kieffer dough and gluten extensibility rig, and the tenacity and strain hardening index of dough measured by the Dobraszczyk/Roberts dough inflation system), the elastic character of the hydrated gluten proteins (large positive loading for the elastic modulus $G'$, large negative loadings for tan $\delta$ and the steady-state compliance $J_e^0$), the presence of high molecular weight glutenin subunits (HMW-GS) 5+10 vs. 2+12, and a size distribution of glutenin polymers shifted toward the high-end range. The second principal component was associated with flour protein content. Certain rheological data were influenced by protein content in addition to protein quality (area under dough extension curves and dough inflation curves, W). The approach made it possible to bridge the gap between fundamental rheological properties, empirical measurements of physical properties, protein composition, and size distribution. The interpretation of this study gave indications of the molecular basis for differences in breadmaking performance.
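As an illustration of the PCA overview step described above, a minimal sketch in Python follows; the variable names (Rmax, slope, G_prime, tan_delta, Je0, W, protein) are placeholders standing in for the rheological parameters mentioned in the abstract, and the data are synthetic rather than the study's.

    # Minimal sketch: PCA overview of rheological variables
    # (synthetic data, hypothetical variable names; not the study's dataset).
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(1)
    variables = ["Rmax", "slope", "G_prime", "tan_delta", "Je0", "W", "protein"]
    X = rng.normal(size=(6, len(variables)))         # six flours x rheological variables

    pca = PCA(n_components=2)
    scores = pca.fit_transform(StandardScaler().fit_transform(X))
    print(pca.explained_variance_ratio_)             # variance captured by PC1 and PC2
    print(dict(zip(variables, pca.components_[0])))  # PC1 loadings per variable

Large loadings on the first component would correspond to the "protein quality" axis described above, and loadings on the second to protein content.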
Abstract:
Three large-deformation rheological tests, the Kieffer dough extensibility system, the D/R dough inflation system and the 2 g mixograph test, were carried out on doughs made from a large number of winter wheat lines and cultivars grown in Poland. These lines and cultivars represented a broad spread in baking performance in order to assess their suitability as predictors of baking volume. The parameters most closely associated with baking volume were strain hardening index, bubble failure strain, and mixograph bandwidth at 10 min. Simple correlations with baking volume indicate that bubble failure strain and strain hardening index give the highest correlations, whilst the use of best subsets regression, which selects the best combination of parameters, gave increased correlations with R² = 0.865 for dough inflation parameters, R² = 0.842 for Kieffer parameters and R² = 0.760 for mixograph parameters.
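Since best subsets regression simply searches over combinations of predictors, a hedged sketch of that selection step is given below (Python, synthetic data; the predictors are placeholders, and adjusted R² is used as the comparison criterion, which is one common choice rather than necessarily the authors').

    # Best-subsets regression sketch: pick the predictor combination with the
    # highest adjusted R^2 (synthetic data, hypothetical predictors).
    from itertools import combinations
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    n, p = 40, 5                                   # e.g. strain hardening index, failure strain, ...
    X = rng.normal(size=(n, p))
    y = 1.2 * X[:, 0] + 0.8 * X[:, 1] + rng.normal(scale=0.3, size=n)  # synthetic "loaf volume"

    best = (None, -np.inf)
    for k in range(1, p + 1):
        for cols in combinations(range(p), k):
            idx = list(cols)
            r2 = LinearRegression().fit(X[:, idx], y).score(X[:, idx], y)
            adj = 1 - (1 - r2) * (n - 1) / (n - k - 1)   # adjusted R^2 penalizes extra predictors
            if adj > best[1]:
                best = (cols, adj)
    print(best)                                    # chosen subset and its adjusted R^2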
Abstract:
Climate modeling is a complex process, requiring accurate and complete metadata in order to identify, assess and use climate data stored in digital repositories. The preservation of such data is increasingly important given the development of ever-more-complex models to predict the effects of global climate change. The EU METAFOR project has developed a Common Information Model (CIM) to describe climate data and the models and modelling environments that produce this data. There is a wide degree of variability between different climate models and modelling groups. To accommodate this, the CIM has been designed to be highly generic and flexible, with extensibility built in. METAFOR describes the climate modelling process simply as "an activity undertaken using software on computers to produce data." This process has been described as separate UML packages (and, ultimately, XML schemas). This fairly generic structure can be paired with more specific "controlled vocabularies" in order to restrict the range of valid CIM instances. The CIM will aid digital preservation of climate models as it will provide an accepted standard structure for the model metadata. Tools to write and manage CIM instances, and to allow convenient and powerful searches of CIM databases, are also under development. Community buy-in of the CIM has been achieved through a continual process of consultation with the climate modelling community, and through the METAFOR team’s development of a questionnaire that will be used to collect the metadata for the Intergovernmental Panel on Climate Change’s (IPCC) Coupled Model Intercomparison Project Phase 5 (CMIP5) model runs.
Abstract:
Limnologists had an early preoccupation with lake classification. It gave a necessary structure to the many chemical and biological observations that were beginning to form the basis of one of the earliest truly environmental sciences. August Thienemann was the doyen of such classifiers, and his concept with Einar Naumann of oligotrophic and eutrophic lakes remains central to the world-view that limnologists still have. Classification fell into disrepute, however, as it became clear that there would always be lakes that deviated from the prescriptions that the classifiers made for them. Continua became the de rigueur concept and lakes were seen as varying along many chemical, biological and geographic axes. Modern limnologists are comfortable with this concept. That all lakes are different guarantees an indefinite future for limnological research. For those who manage lakes and the landscapes in which they are set, however, it is not very useful. There may be as many as 300 000 standing water bodies in England and Wales alone, and maybe as many again in Scotland. More than 80 000 are sizable (> 1 ha). Some classification scheme to cope with these numbers is needed and, as human impacts on them increase, a system of assessing and monitoring change must be built into such a scheme. Although ways of classifying and monitoring running waters are well developed in the UK, the same is not true of standing waters. Sufficient understanding of what determines the nature and functioning of lakes exists to create a system which has intellectual credibility as well as practical usefulness. This paper outlines the thinking behind a system which will be workable on a north European basis and presents some early results.
Abstract:
New nonlinear stability theorems are derived for disturbances to steady basic flows in the context of the multilayer quasi-geostrophic equations. These theorems are analogues of Arnol’d's second stability theorem, the latter applying to the two-dimensional Euler equations. Explicit upper bounds are obtained on both the disturbance energy and disturbance potential enstrophy in terms of the initial disturbance fields. An important feature of the present analysis is that the disturbances are allowed to have non-zero circulation. While Arnol’d's stability method relies on the energy–Casimir invariant being sign-definite, the new criteria can be applied to cases where it is sign-indefinite because of the disturbance circulations. A version of Andrews’ theorem is established for this problem, and uniform potential vorticity flow is shown to be nonlinearly stable. The special case of two-layer flow is treated in detail, with particular attention paid to the Phillips model of baroclinic instability. It is found that the short-wave portion of the marginal stability curve found in linear theory is precisely captured by the new nonlinear stability criteria.
Abstract:
Arnol'd's second hydrodynamical stability theorem, proven originally for the two-dimensional Euler equations, can establish nonlinear stability of steady flows that are maxima of a suitably chosen energy-Casimir invariant. The usual derivations of this theorem require an assumption of zero disturbance circulation. In the present work an analogue of Arnol'd's second theorem is developed in the more general case of two-dimensional quasi-geostrophic flow, with the important feature that the disturbances are allowed to have non-zero circulation. New nonlinear stability criteria are derived, and explicit bounds are obtained on both the disturbance energy and potential enstrophy which are expressed in terms of the initial disturbance fields. While Arnol'd's stability method relies on the second variation of the energy-Casimir invariant being sign-definite, the new criteria can be applied to cases where the second variation is sign-indefinite because of the disturbance circulations. A version of Andrews' theorem is also established for this problem.
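For orientation, the classical two-dimensional Euler prototype that both results above extend can be sketched as follows (sign conventions vary between authors, so this is schematic rather than the papers' exact statement). For a steady flow with streamfunction $\Psi$ and vorticity $Q = \nabla^{2}\Psi$ satisfying a functional relation $Q = Q(\Psi)$, the second variation of the energy-Casimir invariant evaluated on a disturbance $(\psi', q')$ is, up to a constant factor,

$$ \delta^{2}\mathcal{A} \;=\; \int_{\mathcal{D}} \Big( |\nabla \psi'|^{2} + \frac{d\Psi}{dQ}\, q'^{2} \Big)\, dA, $$

and Arnol'd's second theorem obtains nonlinear stability by making this negative definite: using the Poincaré inequality $\int |\nabla\psi'|^{2}\,dA \le \lambda_{1}^{-1}\int q'^{2}\,dA$, with $\lambda_{1}$ the smallest Dirichlet eigenvalue of $-\nabla^{2}$ on $\mathcal{D}$, it suffices that $-\,d\Psi/dQ \ge c > \lambda_{1}^{-1}$ for some constant $c$. The classical argument assumes zero disturbance circulation so that these manipulations are legitimate; the work described above removes exactly that restriction and still obtains explicit bounds even when $\delta^{2}\mathcal{A}$ is sign-indefinite.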
Abstract:
We consider a new class of non-self-adjoint matrices that arise from an indefinite self-adjoint linear pencil of matrices, and obtain the asymptotics of the spectra as the size of the matrices diverges to infinity. We prove that the spectrum is qualitatively different when a certain parameter c equals 0 and when it is non-zero, and that certain features of the spectrum depend on Diophantine properties of c.
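As a point of orientation (my notation, not necessarily the authors'): one standard way such matrices arise is from a pencil $\lambda \mapsto A - \lambda B$ with $A = A^{*}$ and $B = B^{*}$ but $B$ indefinite. Whenever $B$ is invertible,

$$ A x = \lambda B x \quad\Longleftrightarrow\quad B^{-1}A\, x = \lambda x, $$

so the generalized eigenvalue problem reduces to an ordinary one for $B^{-1}A$, which is self-adjoint only with respect to the indefinite inner product induced by $B$ and is, in general, non-self-adjoint in the usual sense, so its spectrum need not be real.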
Abstract:
There are three key components for developing a metadata system: a container structure laying out the key semantic issues of interest and their relationships; an extensible controlled vocabulary providing possible content; and tools to create and manipulate that content. While metadata systems must allow users to enter their own information, the use of a controlled vocabulary both imposes consistency of definition and ensures comparability of the objects described. Here we describe the controlled vocabulary (CV) and metadata creation tool built by the METAFOR project for use in the context of describing the climate models, simulations and experiments of the fifth Coupled Model Intercomparison Project (CMIP5). The CV and resulting tool chain introduced here are designed for extensibility and reuse and should find applicability in many more projects.
Abstract:
Background: In many experimental pipelines, clustering of multidimensional biological datasets is used to detect hidden structures in unlabelled input data. Taverna is a popular workflow management system that is used to design and execute scientific workflows and aid in silico experimentation. The availability of fast unsupervised methods for clustering and visualization in the Taverna platform is important to support data-driven scientific discovery in complex and explorative bioinformatics applications. Results: This work presents a Taverna plugin, the Biological Data Interactive Clustering Explorer (BioDICE), that performs clustering of high-dimensional biological data and provides a nonlinear, topology-preserving projection for the visualization of the input data and their similarities. The core algorithm in the BioDICE plugin is the Fast Learning Self Organizing Map (FLSOM), which is an improved variant of the Self Organizing Map (SOM) algorithm. The plugin generates an interactive 2D map that allows the visual exploration of multidimensional data and the identification of groups of similar objects. The effectiveness of the plugin is demonstrated on a case study related to chemical compounds. Conclusions: The number and variety of available tools, together with its extensibility, have made Taverna a popular choice for the development of scientific data workflows. This work presents a novel plugin, BioDICE, which adds a data-driven knowledge discovery component to Taverna. BioDICE provides an effective and powerful clustering tool, which can be adopted for the explorative analysis of biological datasets.
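To indicate the kind of topology-preserving projection BioDICE builds on, a minimal classical SOM training loop is sketched below in Python; this is the textbook SOM update rule, not the FLSOM variant actually used by the plugin, and all sizes and decay schedules are arbitrary choices for illustration.

    # Minimal classical self-organizing map (SOM) sketch: project 10-D data
    # onto an 8x8 grid while preserving neighbourhood topology.
    import numpy as np

    rng = np.random.default_rng(0)
    data = rng.normal(size=(500, 10))                         # 500 samples, 10 features
    grid = np.array([(i, j) for i in range(8) for j in range(8)], dtype=float)
    weights = rng.normal(size=(64, 10))                       # one prototype per grid node

    for t, x in enumerate(rng.permutation(data)):
        lr = 0.5 * np.exp(-t / len(data))                     # decaying learning rate
        sigma = 3.0 * np.exp(-t / len(data))                  # shrinking neighbourhood width
        bmu = np.argmin(((weights - x) ** 2).sum(axis=1))     # best-matching unit
        d2 = ((grid - grid[bmu]) ** 2).sum(axis=1)            # squared grid distance to BMU
        h = np.exp(-d2 / (2 * sigma ** 2))[:, None]           # Gaussian neighbourhood function
        weights += lr * h * (x - weights)                     # pull neighbours toward the sample

Each sample is then mapped to its best-matching unit, giving a 2D map on which similar objects land in nearby cells, which is the style of interactive view the plugin exposes.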