91 results for extensibility
Abstract:
A review of the literature shows that no paper has yet been published on using non-parametric stability statistics (NPSSs) to evaluate genotypic stability in the dough properties of wheat. Accordingly, the effects of genotype (G), environment (E) and GE interaction (GEI) on alveograph parameters, i.e. dough baking strength (W) and the tenacity (P)/extensibility (L) ratio, of 18 wheat (T. aestivum L.) genotypes were studied under irrigated field conditions in an 8-year trial (2006-2014) in central Turkey. Furthermore, genotypic stability for W and P/L was determined using 8 NPSSs, viz. RM (rank mean), RSD (rank standard deviation), RS (rank sum), TOP (top ranking) and the Si(1), Si(2), Si(3) and Si(6) rank statistics. The ANOVA revealed that W and P/L were primarily controlled by E, although G and GEI also had significant effects. Among the 8 NPSSs, only the RM, RS and TOP statistics were suitable for detecting genotypes combining high stability with good bread-making quality (e.g. G1 and G17). In conclusion, the RM, RS and TOP statistics are advisable for selecting for dough quality in wheat multi-environment trials (METs).
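For illustration, rank-based stability statistics of this kind can be computed from a genotype-by-environment matrix along the following lines (a minimal sketch with simulated data; the simple forms used here for RM, RSD, RS and TOP are common textbook definitions and may differ in detail from those used in the paper):

```python
import numpy as np
from scipy.stats import rankdata

# Hypothetical genotype-by-environment matrix of alveograph W values
# (rows: 18 genotypes, columns: 8 years); simulated for illustration.
rng = np.random.default_rng(0)
W = rng.normal(250, 40, size=(18, 8))

# Rank genotypes within each environment (rank 1 = highest W).
ranks = rankdata(-W, axis=0)

RM = ranks.mean(axis=1)            # rank mean across environments
RSD = ranks.std(axis=1, ddof=1)    # standard deviation of the ranks
RS = RM + RSD                      # one common form of the rank-sum statistic
TOP = (ranks <= 3).mean(axis=1)    # share of environments ranked in the top 3

# Low RM, RSD and RS (and high TOP) flag stable, high-performing genotypes.
print("best three genotypes by RS:", np.argsort(RS)[:3])
```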
Abstract:
One of the fundamental features distinguishing plant cells from animal cells is the presence of the cell wall surrounding the protoplast. The cell wall (1) plays a key role in protecting the protoplast, (2) is involved in filtration mechanisms and (3) is the site of many biochemical reactions required to regulate the metabolism and the mechanical properties of the cell. The local elasticity, extensibility, plasticity and hardness of the wall components determine the geometry and shape of cells during differentiation and morphogenesis. The goal of my thesis is to understand the roles the various wall components play in shaping the geometry and controlling the growth of plant cells. To this end, the cell model I used is the pollen tube, or male gametophyte. The pollen tube is a cellular protuberance that forms from the pollen grain upon contact with the stigma. Its function is to deliver the sperm cells to the ovary to carry out double fertilization. The pollen tube is a tip-growing cell, characterized by the simple composition of its wall and by a growth rate that is the fastest in the plant kingdom. These unique properties make the pollen tube an ideal model for studying the short-term effects of stress on cell growth and metabolism as well as on the mechanical properties of the wall. The pollen tube wall is composed of three polysaccharide components, pectins, cellulose and callose, along with a multitude of proteins. To understand the roles these components play in regulating pollen tube growth, I studied the effects of mutations, enzymatic treatments, hyper-gravity and omni-directional gravity on the pollen tube wall. Using mathematical modelling combined with molecular biology, fluorescence microscopy and high-resolution electron microscopy, I showed that (1) the regulation of pectin chemistry is essential for controlling the growth rate and shape of the tube and that (2) cellulose determines the diameter of the pollen tube in the sub-apical region. In addition, I examined the role of a group of pectin-digesting enzymes expressed during pollen tube development: the pectate lyases. I showed that these enzymes are required at the initiation of pollen germination. In particular, I provided direct evidence that pectate lyases are secreted by the pollen tube to facilitate its penetration through the style.
Abstract:
The no response test is a new scheme for inverse problems for partial differential equations, recently proposed in [D. R. Luke and R. Potthast, SIAM J. Appl. Math., 63 (2003), pp. 1292–1312] in the framework of inverse acoustic scattering problems. The main idea of the scheme is to construct special probing waves which are small on some test domain. The response to these waves is then constructed. If the response is small, the unknown object is assumed to be a subset of the test domain. The response is constructed from one, several, or many particular solutions of the problem under consideration. In this paper, we investigate the convergence of the no response test for the reconstruction of information about inclusions D from the Cauchy values of solutions to the Helmholtz equation on an outer surface $\partial\Omega$ with $\overline{D} \subset \Omega$. We show that the one-wave no response test provides a criterion to test the analytic extensibility of a field. In particular, we investigate the construction of approximations for the set of singular points $N(u)$ of the total fields u from one given pair of Cauchy data. Thus, the no response test solves a particular version of the classical Cauchy problem. Moreover, if an infinite number of fields is given, we prove that a multifield version of the no response test reconstructs the unknown inclusion D. This is the first convergence analysis to be achieved for the no response test.
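Schematically, the structure of the test can be written as follows (a sketch with assumed notation, not the paper's exact formulation): probing solutions v of the Helmholtz equation are required to be small on a test domain G, and the response pairs them with the given Cauchy data on $\partial\Omega$.

```latex
% Schematic form of the no response test (notation assumed for illustration).
% Probing waves small on the test domain G:
\[
  V_\epsilon(G) := \bigl\{ v :\ \Delta v + \kappa^2 v = 0,\ 
    \|v\|_{C(\overline{G})} \le \epsilon \bigr\}.
\]
% Response functional built from the Cauchy data (u, \partial_\nu u)
% on \partial\Omega:
\[
  I_\epsilon(G) := \sup_{v \in V_\epsilon(G)}
    \left| \int_{\partial\Omega}
      \bigl( u\,\partial_\nu v - v\,\partial_\nu u \bigr)\,\mathrm{d}s \right|.
\]
% Test: if the response is small, the singular points of u are taken to
% lie in \overline{G}; intersecting test domains with small response
% yields an approximation of the singular set N(u).
```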
Abstract:
The mechanisms underlying the increase in stress at large mechanical strains of a polymer glass, quantified by the strain-hardening modulus, are still poorly understood. In the present paper we aim to elucidate this matter and present new mechanisms. Molecular-dynamics simulations of two polymers with very different strain-hardening moduli (polycarbonate and polystyrene) have been carried out. Nonaffine displacements occur because of steric hindrances and connectivity constraints. We argue that it is not necessary to introduce the concept of entanglements to understand strain hardening; rather, hardening is coupled with the increase in the rate of nonaffine particle displacements. This rate increases faster for polycarbonate, which has the higher strain-hardening modulus. More nonaffine chain stretching is also present for polycarbonate. It is shown that the inner distances of such a nonaffinely deformed chain can be well described by the inner distances of the worm-like chain, but with an effective stiffness length (equal to the Kuhn length for an infinite worm-like chain) that increases during deformation. This originates from the finite extensibility of the chain. In this way the increase in nonaffine particle displacement can be understood as resulting from an increase in the effective stiffness length of the perturbed chain during deformation, so that at larger strains a higher rate of plastic events, in terms of nonaffine displacement, is necessary, causing in turn the observed strain hardening in polymer glasses.
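For reference, the worm-like-chain expression invoked here takes the standard form below (symbols assumed: $L_c$ is the contour length between two inner monomers and $\Lambda$ the effective stiffness length, which equals the Kuhn length for an infinite chain):

```latex
% Mean-square internal distance of a worm-like chain between monomers
% separated by contour length L_c, with effective stiffness length \Lambda:
\[
  \langle R^2(L_c) \rangle
    = \Lambda L_c - \frac{\Lambda^2}{2}\left(1 - e^{-2L_c/\Lambda}\right).
\]
% Limits: rigid rod \langle R^2 \rangle \to L_c^2 for L_c \ll \Lambda;
% random coil \langle R^2 \rangle \to \Lambda L_c for L_c \gg \Lambda.
% A growing \Lambda during deformation pushes the chain's inner distances
% toward the rod limit, consistent with the observed strain hardening.
```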
Abstract:
Most suspension-feeding trichopterans spin a fine silk capture net that is used to remove suspended matter from the water. The efficiency of these nets has previously been studied by considering the geometry of the web structure, but the material from which the nets are constructed has received little attention. We report measurements of the tensile strength and extensibility of net silk from Hydropsyche siltalai. These measurements place caddisfly silk among the weakest natural silks so far reported, with a mean tensile strength of 221 ± 22 meganewtons per square metre (MN/m^2). We also show that H. siltalai silk can more than double in length before catastrophic breakage, and that the silk is at least two orders of magnitude stronger than the maximum force estimated to act upon it in situ. Possible reasons for this disparity include constraints of evolutionary history and safety margins to prevent net failure or performance reduction.
Abstract:
Xyloglucan-acting enzymes are believed to affect the mechanical properties of type I primary plant cell walls. To better understand these effects, a range of enzymes with different in vitro modes of action were tested against cell wall analogues (bio-composite materials based on Acetobacter xylinus cellulose and xyloglucan). Tomato pericarp xyloglucan endotransglycosylase (tXET) and nasturtium seed xyloglucanase (nXGase) were produced heterologously in Pichia pastoris. Their action against the cell wall analogues was compared with that of a commercial preparation of Trichoderma endo-glucanase (EndoGase). Both 'hydrolytic' enzymes (nXGase and EndoGase) were able to depolymerise not only the cross-link xyloglucan fraction but also the surface-bound fraction. Consequent major changes in cellulose fibril architecture were observed. In mechanical terms, removal of xyloglucan cross-links from composites resulted in increased stiffness (at high strain) and decreased visco-elasticity with similar extensibility. On the other hand, true transglycosylase activity (tXET) did not affect the cellulose/xyloglucan ratio. No change in composite stiffness or extensibility resulted, but a significant increase in creep behaviour was observed in the presence of active tXET. These results provide direct in vitro evidence for the involvement of cell wall xyloglucan-specific enzymes in the mechanical changes underlying plant cell wall remodelling and growth. The mechanical consequences of tXET action are shown to be complementary to those of cucumber expansin.
Abstract:
The rheological properties of dough and gluten are important for the end-use quality of flour, but there is a lack of knowledge of the relationships between fundamental and empirical tests and how they relate to flour composition and gluten quality. Dough and gluten from six breadmaking wheat qualities were subjected to a range of rheological tests. Fundamental (small-deformation) rheological characterizations (dynamic oscillatory shear and creep recovery) were performed on gluten to avoid the nonlinear influence of the starch component, whereas large-deformation tests were conducted on both dough and gluten. A number of variables from the various curves were considered and subjected to a principal component analysis (PCA) to get an overview of the relationships between them. The first component represented variability in protein quality, associated with elasticity and tenacity in large deformation (large positive loadings for resistance to extension and initial slope of dough and gluten extension curves recorded by the SMS/Kieffer dough and gluten extensibility rig, and for the tenacity and strain hardening index of dough measured by the Dobraszczyk/Roberts dough inflation system), the elastic character of the hydrated gluten proteins (large positive loading for the elastic modulus G', large negative loadings for tan delta and the steady-state compliance J_e^0), the presence of high molecular weight glutenin subunits (HMW-GS) 5+10 vs. 2+12, and a size distribution of glutenin polymers shifted toward the high-end range. The second principal component was associated with flour protein content. Certain rheological data were influenced by protein content in addition to protein quality (area under the dough extension curves and the dough inflation curves, W). The approach made it possible to bridge the gap between fundamental rheological properties, empirical measurements of physical properties, protein composition, and size distribution. The interpretation of this study gave indications of the molecular basis for differences in breadmaking performance.
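The overview step described here corresponds to a standard PCA workflow, sketched below with placeholder data (not the authors' dataset or variable list):

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

# Hypothetical table: rows = samples (six wheat qualities), columns =
# rheological variables (e.g. G', tan delta, J_e^0, strain hardening
# index, W, ...); simulated stand-in values for illustration.
rng = np.random.default_rng(1)
X = rng.normal(size=(6, 10))

# Standardize so every variable contributes on the same scale, then project.
Xs = StandardScaler().fit_transform(X)
pca = PCA(n_components=2).fit(Xs)

print("explained variance ratios:", pca.explained_variance_ratio_)
# Loadings show which variables drive each component (in the study,
# PC1 ~ protein quality, PC2 ~ protein content).
print("PC1 loadings:", pca.components_[0])
```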
Abstract:
Three large-deformation rheological tests, the Kieffer dough extensibility system, the D/R dough inflation system and the 2 g mixograph test, were carried out on doughs made from a large number of winter wheat lines and cultivars grown in Poland. These lines and cultivars represented a broad spread in baking performance, in order to assess the suitability of the test parameters as predictors of baking volume. The parameters most closely associated with baking volume were strain hardening index, bubble failure strain, and mixograph bandwidth at 10 min. Simple correlations with baking volume indicate that bubble failure strain and strain hardening index give the highest correlations, whilst best subsets regression, which selects the best combination of parameters, gave increased correlations, with R^2 = 0.865 for dough inflation parameters, R^2 = 0.842 for Kieffer parameters and R^2 = 0.760 for mixograph parameters.
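Best subsets regression of this kind exhaustively scores every combination of candidate predictors; a generic sketch follows (placeholder data, not the study's measurements):

```python
import numpy as np
from itertools import combinations
from sklearn.linear_model import LinearRegression

# Placeholder data: rows = wheat lines, columns = candidate rheological
# predictors (e.g. five dough inflation parameters); y = baking volume.
rng = np.random.default_rng(2)
n = 40
X = rng.normal(size=(n, 5))
y = 0.8 * X[:, 0] - 0.5 * X[:, 2] + rng.normal(size=n)

# For each subset size, keep the predictor combination with the best R^2.
best = {}
for k in range(1, X.shape[1] + 1):
    for cols in combinations(range(X.shape[1]), k):
        r2 = LinearRegression().fit(X[:, cols], y).score(X[:, cols], y)
        if r2 > best.get(k, ((), -np.inf))[1]:
            best[k] = (cols, r2)

for k, (cols, r2) in sorted(best.items()):
    print(f"size {k}: predictors {cols}, R^2 = {r2:.3f}")
```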
Abstract:
Climate modeling is a complex process, requiring accurate and complete metadata in order to identify, assess and use climate data stored in digital repositories. The preservation of such data is increasingly important given the development of increasingly complex models to predict the effects of global climate change. The EU METAFOR project has developed a Common Information Model (CIM) to describe climate data and the models and modelling environments that produce it. There is a wide degree of variability between different climate models and modelling groups. To accommodate this, the CIM has been designed to be highly generic and flexible, with extensibility built in. METAFOR describes the climate modelling process simply as "an activity undertaken using software on computers to produce data." This process has been described as separate UML packages (and, ultimately, XML schemas). This fairly generic structure can be paired with more specific "controlled vocabularies" in order to restrict the range of valid CIM instances. The CIM will aid the digital preservation of climate models by providing an accepted standard structure for model metadata. Tools to write and manage CIM instances, and to allow convenient and powerful searches of CIM databases, are also under development. Community buy-in of the CIM has been achieved through a continual process of consultation with the climate modelling community, and through the METAFOR team’s development of a questionnaire that will be used to collect the metadata for the Intergovernmental Panel on Climate Change’s (IPCC) Coupled Model Intercomparison Project Phase 5 (CMIP5) model runs.
Abstract:
There are three key components for developing a metadata system: a container structure laying out the key semantic issues of interest and their relationships; an extensible controlled vocabulary providing possible content; and tools to create and manipulate that content. While metadata systems must allow users to enter their own information, the use of a controlled vocabulary both imposes consistency of definition and ensures comparability of the objects described. Here we describe the controlled vocabulary (CV) and metadata creation tool built by the METAFOR project for use in describing the climate models, simulations and experiments of the fifth Coupled Model Intercomparison Project (CMIP5). The CV and the resulting tool chain introduced here are designed for extensibility and reuse, and should find applicability in many more projects.
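The way a controlled vocabulary constrains valid metadata instances can be shown with a minimal sketch (the field names and vocabulary terms below are hypothetical illustrations, not the actual CIM vocabularies):

```python
# Minimal controlled-vocabulary check for model metadata records.
# Field names and terms are hypothetical, not CIM content.
CONTROLLED_VOCAB = {
    "model_family": {"AGCM", "AOGCM", "ESM"},
    "calendar": {"gregorian", "noleap", "360_day"},
}

def validate(record: dict) -> list[str]:
    """Return a list of controlled-vocabulary violations in a record."""
    errors = []
    for field, allowed in CONTROLLED_VOCAB.items():
        value = record.get(field)
        if value not in allowed:
            errors.append(f"{field}={value!r} not in {sorted(allowed)}")
    return errors

# A record with one invalid term: consistent definitions are what make
# records from different modelling groups comparable.
print(validate({"model_family": "AOGCM", "calendar": "julian"}))
```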
Abstract:
Background: In many experimental pipelines, clustering of multidimensional biological datasets is used to detect hidden structures in unlabelled input data. Taverna is a popular workflow management system that is used to design and execute scientific workflows and aid in silico experimentation. The availability of fast unsupervised methods for clustering and visualization in the Taverna platform is important to support data-driven scientific discovery in complex and explorative bioinformatics applications. Results: This work presents a Taverna plugin, the Biological Data Interactive Clustering Explorer (BioDICE), that performs clustering of high-dimensional biological data and provides a nonlinear, topology-preserving projection for the visualization of the input data and their similarities. The core algorithm in the BioDICE plugin is the Fast Learning Self Organizing Map (FLSOM), an improved variant of the Self Organizing Map (SOM) algorithm. The plugin generates an interactive 2D map that allows the visual exploration of multidimensional data and the identification of groups of similar objects. The effectiveness of the plugin is demonstrated on a case study related to chemical compounds. Conclusions: The number and variety of available tools, together with its extensibility, have made Taverna a popular choice for the development of scientific data workflows. This work presents a novel plugin, BioDICE, which adds a data-driven knowledge discovery component to Taverna. BioDICE provides an effective and powerful clustering tool, which can be adopted for the explorative analysis of biological datasets.
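The SOM algorithm at the heart of this kind of plugin can be conveyed in a few lines (a generic SOM sketch with simulated data, not the FLSOM variant or the BioDICE implementation):

```python
import numpy as np

def train_som(data, grid=(10, 10), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    """Generic self-organizing map: map high-dimensional samples onto a
    2D grid whose nodes preserve the topology of the input space."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.normal(size=(h, w, data.shape[1]))
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w),
                                  indexing="ij"), axis=-1)
    n_steps, step = epochs * len(data), 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            t = step / n_steps
            lr, sigma = lr0 * (1 - t), sigma0 * (1 - t) + 0.5
            # Best matching unit: grid node with the closest weight vector.
            d = ((weights - x) ** 2).sum(axis=-1)
            bmu = np.unravel_index(d.argmin(), d.shape)
            # Pull the BMU and its grid neighbours toward the sample.
            g = np.exp(-((coords - bmu) ** 2).sum(axis=-1) / (2 * sigma**2))
            weights += lr * g[..., None] * (x - weights)
            step += 1
    return weights

data = np.random.default_rng(1).normal(size=(200, 4))  # stand-in dataset
som = train_som(data)  # each node's weight vector is a cluster prototype
```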
Abstract:
Existing distributed hydrologic models are complex and computationally demanding for use as a rapid-forecasting policy-decision tool, or even as a classroom educational tool. In addition, platform dependence, specific input/output data structures and non-dynamic data interaction with pluggable software components inside the existing proprietary frameworks restrict these models to specialized user groups. RWater is a web-based hydrologic analysis and modeling framework that utilizes the commonly used R software within the HUBzero cyberinfrastructure of Purdue University. RWater is designed as an integrated framework for distributed hydrologic simulation, along with subsequent parameter optimization and visualization schemes. RWater provides a platform-independent web-based interface, flexible data integration capacity, grid-based simulations, and user extensibility. RWater uses RStudio to simulate hydrologic processes on raster-based data obtained through conventional GIS pre-processing. The program integrates the Shuffled Complex Evolution (SCE) algorithm for parameter optimization. Moreover, RWater enables users to produce descriptive statistics and visualizations of the outputs at different temporal resolutions. The applicability of RWater is demonstrated by application to two watersheds in Indiana for multiple rainfall events.
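Calibration of the kind described here fits model parameters by maximizing a goodness-of-fit score such as the Nash-Sutcliffe efficiency; a toy sketch follows (the reservoir model is invented for illustration, and SciPy's differential evolution stands in for the SCE algorithm):

```python
import numpy as np
from scipy.optimize import differential_evolution

def simulate(params, rain):
    """Toy linear-reservoir rainfall-runoff model (illustrative only)."""
    k, loss = params
    storage, flow = 0.0, []
    for p in rain:
        storage += max(p - loss, 0.0)  # effective rainfall enters storage
        q = k * storage                # linear-reservoir outflow
        storage -= q
        flow.append(q)
    return np.array(flow)

def neg_nse(params, rain, observed):
    """Negative Nash-Sutcliffe efficiency (minimize -> best fit)."""
    sim = simulate(params, rain)
    sse = ((observed - sim) ** 2).sum()
    sst = ((observed - observed.mean()) ** 2).sum()
    return sse / sst - 1.0

rng = np.random.default_rng(3)
rain = rng.gamma(2.0, 5.0, size=100)                 # synthetic storm series
observed = simulate((0.3, 2.0), rain) + rng.normal(0, 0.5, size=100)

# Global optimizer as a stand-in for Shuffled Complex Evolution (SCE).
result = differential_evolution(neg_nse, bounds=[(0.01, 1.0), (0.0, 5.0)],
                                args=(rain, observed), seed=3)
print("calibrated parameters:", result.x, "NSE:", -result.fun)
```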
Abstract:
The spread of the Web boosted the dissemination of Web-based Information Systems (IS). To support the implementation of these systems, several technologies emerged or evolved for this purpose, notably programming languages. The Technology Acceptance Model (TAM) (Davis, 1986) was conceived to evaluate the acceptance and use of information technologies by their users. Many studies and applications have used the TAM; however, no mention was found in the literature of the use of this model in relation to programming languages. This study investigates which factors influence developers' use of programming languages in the development of Web systems, applying an extension of the TAM proposed in this work. To this end, a survey was conducted with Web developers in two Yahoo groups, java-br and python-brasil, in which 26 Java questionnaires and 39 Python questionnaires were fully answered. The questionnaire contained general questions and questions measuring intrinsic and extrinsic factors of the programming languages, perceived usefulness, perceived ease of use, attitude toward use, and programming language use. Most respondents were men, university-educated, between 20 and 30 years old, and working in the southeast and south regions. The research was descriptive in its objectives. Descriptive statistics, principal component analysis and linear regression were used for the data analysis. The main results were: Java and Python offer machine independence, extensibility, generality and reliability; Java and Python are used more by corporations and international organizations than supported by the government or educational institutions; there are more Java programmers than Python programmers; perceived usefulness is influenced by perceived ease of use; generality and extensibility are intrinsic factors of programming languages that influence perceived ease of use; and perceived ease of use influences the attitude toward use of the programming language.
Abstract:
Middleware platforms have been widely used as an underlying infrastructure for the development of distributed applications. They provide distribution and heterogeneity transparency and a set of services that ease the construction of distributed applications. Nowadays, middleware accommodates an increasing variety of requirements to satisfy distinct application domains. This broad range of application requirements increases the complexity of the middleware, due to the introduction of many cross-cutting concerns in the architecture, which are not properly modularized by traditional programming techniques, resulting in tangling and spreading of these concerns in the middleware code. The presence of these cross-cutting concerns limits middleware scalability, and the aspect-oriented paradigm has been used successfully to improve the modularity, extensibility and customization capabilities of middleware. This work presents AO-OiL, an aspect-oriented (AO) middleware architecture based on the AO middleware reference architecture. This middleware follows the philosophy that middleware functionality must be driven by application requirements. AO-OiL consists of an AO refactoring of the OiL (Orb in Lua) middleware to separate basic and crosscutting concerns. The proposed architecture was implemented in Lua and RE-AspectLua. To evaluate the impact of the refactoring on the middleware architecture, this paper presents a comparative performance analysis of AO-OiL and OiL.
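The core benefit of the aspect-oriented refactoring, keeping cross-cutting concerns out of the basic middleware code, can be illustrated with a minimal analogue (a Python decorator standing in for an AspectLua aspect; the function names are hypothetical):

```python
import functools
import time

def timing_aspect(func):
    """Cross-cutting timing/logging concern kept out of the core code;
    a Python analogue of what AO-OiL expresses as AspectLua aspects."""
    @functools.wraps(func)
    def advice(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        print(f"{func.__name__} took {time.perf_counter() - start:.6f}s")
        return result
    return advice

@timing_aspect                       # the aspect is woven in declaratively
def invoke_remote(method, payload):
    """Core invocation concern only; no logging or timing code here."""
    return f"dispatched {method} with {payload!r}"

print(invoke_remote("echo", {"msg": "hi"}))
```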
Abstract:
Multi-agent system designers need to determine the quality of their systems in the earliest phases of the development process. Agent architectures are part of the design of these systems and therefore also need their quality evaluated. Motivated by the important role that emotions play in our daily lives, embodied-agents researchers have aimed to create agents capable of affective and natural interaction with users that produces a beneficial or desirable result. To this end, several studies proposing architectures of agents with emotions have appeared, without accompanying methods for assessing these architectures. The objective of this study is to propose a methodology for evaluating emotional agent architectures that assesses the quality attributes of the architectural design and, through human-computer interaction evaluation, the effects on the subjective experience of users of applications that implement the architecture. The methodology is based on a model of well-defined metrics. In assessing the quality of the architectural design, the attributes assessed are extensibility, modularity and complexity. In assessing the effects on users' subjective experience, which involves implementing the architecture in an application (we suggest the domain of computer games), the metrics are enjoyment, felt support, warmth, caring, trust, cooperation, intelligence, interestingness, naturalness of emotional reactions, believability, reduction of frustration and likeability, as well as average time and average attempts. We experimented with this approach and evaluated five emotional agent architectures: BDIE, DETT, Camurra-Coglio, EBDI and Emotional-BDI. Two of the architectures, BDIE and EBDI, were implemented in a version of the game Minesweeper and evaluated for human-computer interaction. In the results, DETT stood out with the best architectural design. Users who played the version of the game with emotional agents performed better than those who played without agents. In assessing the subjective experience of users, the differences between the architectures were insignificant.