998 results for adaptive variability
Abstract:
The objects with which the hand interacts may significantly change the dynamics of the arm. How does the brain adapt control of arm movements to these new dynamics? We show that adaptation proceeds via composition of a model of the task's dynamics. By exploring the generalization capabilities of this adaptation we infer some of the properties of the computational elements with which the brain formed this model: the elements have broad receptive fields and encode the learned dynamics as a map structured in an intrinsic coordinate system closely related to the geometry of the skeletomusculature. The low-level nature of these elements suggests that they may represent a set of primitives with which a movement is represented in the CNS.
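As a purely illustrative sketch of this idea (none of the function names or parameters come from the paper), the learned dynamics can be modelled as a weighted sum of broad Gaussian receptive fields over joint-space state, with adaptation as an error-driven update of the weights:

```python
import numpy as np

# Illustrative sketch, not the authors' code: the learned model of the task
# dynamics is a weighted sum of broad Gaussian receptive fields defined over
# intrinsic (joint-space) state.

def gaussian_basis(q, centers, width=1.0):
    """Activation of each broad receptive field at joint state q."""
    d = np.linalg.norm(centers - q, axis=1)
    return np.exp(-(d / width) ** 2)

def learned_force(q, centers, weights, width=1.0):
    """Predicted compensatory torque: a map structured in joint coordinates."""
    return gaussian_basis(q, centers, width) @ weights

def adapt(q, torque_error, centers, weights, width=1.0, lr=0.1):
    """Error-driven update of the basis weights after one movement."""
    phi = gaussian_basis(q, centers, width)
    return weights + lr * np.outer(phi, torque_error)
```

Because the fields are broad, an update made at one arm state changes the prediction at neighbouring states as well, which is the kind of generalization the experiments probe.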
Abstract:
A promising technique for the large-scale manufacture of micro-fluidic and photonic devices is hot embossing of polymers such as PMMA. Micro-embossing is a deformation process in which the workpiece material is heated to permit easier material flow and then forced over a planar patterned tool. While considerable attention has been paid to process feasibility, very little effort has been put into production issues such as process capability and eventual process control. In this paper, we present initial studies aimed at identifying the origins and magnitude of variability when embossing features at the micron scale in PMMA. Test parts with features ranging from 3.5 to 630 µm wide and 0.9 µm deep were formed. Measurements at this scale proved very difficult, and only atomic force microscopy was able to provide resolution sufficient to identify process variations. It was found that standard deviations of widths at the 3-4 µm scale were on the order of 0.5 µm, leading to a coefficient of variation as high as 13%. Clearly, the transition from test to manufacturing for this process will require understanding the causes of this variation and devising control methods to minimize its magnitude over all types of parts.
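As a quick arithmetic check of the reported figure (the exact mean width is not given in this summary; a value near 3.8 µm within the stated 3-4 µm range is assumed):

```latex
\mathrm{CV} \;=\; \frac{\sigma_w}{\mu_w} \;\approx\; \frac{0.5\ \mu\mathrm{m}}{3.8\ \mu\mathrm{m}} \;\approx\; 13\%
```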
Abstract:
In the field of biologics production, productivity and stability of the transfected gene of interest are two very important attributes that dictate whether a production process is viable. To understand and improve these two traits, we need a better understanding of the factors affecting them, which include the integration site of the gene, gene copy number, cell phenotypic variation and cell environment. As these factors play different parts in the development process, they lead to variable productivity and stability of the transfected gene between clones, the well-known phenomenon of “clonal variation”. A study of this phenomenon and of how the various factors contribute to it will thus shed light on strategies to improve productivity and stability in the production cell line. Of the four factors, the site of gene integration appears to be one of the most important. Hence, it is proposed to study how different integration sites affect the productivity and stability of transfected genes in the development process. For the study to be more industrially relevant, it is proposed that the Chinese Hamster Ovary dhfr-deficient cell line, CHO-DG44, be used as the model system.
Abstract:
Developments in the statistical analysis of compositional data over the last two decades have made possible a much deeper exploration of the nature of variability, and of the possible processes associated with compositional data sets from many disciplines. In this paper we concentrate on geochemical data sets. First we explain how hypotheses of compositional variability may be formulated within the natural sample space, the unit simplex, including useful hypotheses of subcompositional discrimination and specific perturbational change. Then we develop through standard methodology, such as generalised likelihood ratio tests, statistical tools to allow the systematic investigation of a complete lattice of such hypotheses. Some of these tests are simple adaptations of existing multivariate tests but others require special construction. We comment on the use of graphical methods in compositional data analysis and on the ordination of specimens. The recent development of the concept of compositional processes is then explained together with the necessary tools for a staying-in-the-simplex approach, namely compositional singular value decompositions. All these statistical techniques are illustrated for a substantial compositional data set, consisting of 209 major-oxide and rare-element compositions of metamorphosed limestones from the Northeast and Central Highlands of Scotland. Finally we point out a number of unresolved problems in the statistical analysis of compositional processes.
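A minimal sketch of two of the basic operations mentioned above, the centred log-ratio (clr) transform and perturbation, using illustrative four-part compositions rather than the actual limestone data:

```python
import numpy as np

# Minimal sketch of two standard compositional-data operations; the sample
# values are placeholders, not taken from the Scottish limestone data set.

def closure(x):
    """Rescale a positive vector so its parts sum to 1 (a point in the simplex)."""
    x = np.asarray(x, dtype=float)
    return x / x.sum()

def clr(x):
    """Centered log-ratio transform: log of parts over their geometric mean."""
    x = closure(x)
    g = np.exp(np.mean(np.log(x)))
    return np.log(x / g)

def perturb(x, p):
    """Perturbation: the simplex analogue of translation, used to model change."""
    return closure(np.asarray(x, float) * np.asarray(p, float))

comp = closure([52.0, 14.0, 8.0, 26.0])      # e.g. four major-oxide parts
print(clr(comp))                             # clr coordinates sum to zero
print(perturb(comp, [1.1, 0.9, 1.0, 1.0]))   # a specific perturbational change
```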
Abstract:
In several computer graphics areas, a refinement criterion is often needed to decide whether to continue or to stop sampling a signal. When the sampled values are homogeneous enough, we assume that they represent the signal fairly well and no further refinement is needed; otherwise more samples are required, possibly with adaptive subdivision of the domain. For this purpose, a criterion which is very sensitive to variability is necessary. In this paper, we present a family of discrimination measures, the f-divergences, meeting this requirement. These convex functions have been well studied and successfully applied to image processing and several areas of engineering. Two applications to global illumination are shown: oracles for hierarchical radiosity and criteria for adaptive refinement in ray-tracing. We obtain significantly better results than with classic criteria, showing that f-divergences are worth further investigation in computer graphics. We also introduce a discrimination measure based on the entropy of the samples for refinement in ray-tracing. The recursive decomposition of entropy provides a natural method for handling the adaptive subdivision of the sampling region.
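A hedged sketch of how an f-divergence can act as a refinement oracle (the threshold and the choice of a uniform reference distribution are our assumptions, not the paper's):

```python
import numpy as np

# Hedged sketch, not the paper's code: normalized sample values in a region
# are compared against the uniform distribution; a region whose samples are
# close to uniform is considered homogeneous and is not refined further.

def f_divergence(p, q, f):
    """D_f(p||q) = sum_i q_i * f(p_i / q_i), taken over the support of q."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = q > 0
    return float(np.sum(q[mask] * f(p[mask] / q[mask])))

def hellinger(t):
    """Convex generator of the (squared) Hellinger distance."""
    return 0.5 * (np.sqrt(t) - 1.0) ** 2

def needs_refinement(samples, threshold=0.05):
    """True when the samples diverge enough from uniform to warrant refining."""
    p = np.asarray(samples, float)
    p = p / p.sum()                    # normalize the sample radiances
    q = np.full_like(p, 1.0 / p.size)  # uniform reference: homogeneous region
    return f_divergence(p, q, hellinger) > threshold

print(needs_refinement([0.9, 1.0, 1.1, 1.0]))  # nearly uniform -> False
print(needs_refinement([0.1, 2.5, 0.1, 0.1]))  # high variability -> True
```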
Abstract:
The chemical composition of sediments and rocks, as well as their distribution at the Martian surface, represents a long-term archive of the processes which have formed the planetary surface. A survey of chemical compositions by means of Compositional Data Analysis is a valuable tool for extracting direct evidence of weathering processes and makes it possible to quantify weathering and sedimentation rates. clr-biplot techniques are applied to visualize chemical relationships across the surface (“chemical maps”). The variability among individual suites of data is further analyzed by means of clr-PCA, in order to extract chemical alteration vectors between fresh rocks and their crusts and to assess the different source reservoirs accessible to soil formation. Both techniques are applied to elucidate the influence of remote weathering through combined analysis of several soil-forming branches. Vector analysis in the simplex provides the opportunity to study atmosphere-surface interactions, including the role and composition of volcanic gases.
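A sketch of clr-PCA via the singular value decomposition, with placeholder compositions standing in for the Martian data; the first principal direction plays the role of an alteration vector:

```python
import numpy as np

# Hedged sketch of clr-PCA (compositional PCA via SVD); the rows below are
# placeholder four-part compositions, not actual Martian surface analyses.

def clr(X):
    """Row-wise centered log-ratio transform of a composition matrix."""
    X = X / X.sum(axis=1, keepdims=True)
    L = np.log(X)
    return L - L.mean(axis=1, keepdims=True)

def clr_pca(X):
    """SVD of the column-centred clr matrix; rows of Vt are principal directions."""
    Z = clr(X)
    Zc = Z - Z.mean(axis=0)
    U, s, Vt = np.linalg.svd(Zc, full_matrices=False)
    return U * s, Vt, s

rocks = np.array([[45.0, 10.0, 18.0, 27.0],
                  [47.0,  9.0, 17.0, 27.0],
                  [44.0, 12.0, 19.0, 25.0],
                  [50.0,  7.0, 15.0, 28.0]])
scores, directions, s = clr_pca(rocks)
alteration_vector = directions[0]  # dominant clr direction, e.g. fresh rock -> crust
```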
Abstract:
Abstract taken from the publication.
Abstract:
This paper presents a first approach to an Evaluation Engine Architecture (EEA), proposed to support adaptive integral assessment in the context of a virtual learning environment. The goal of our research is to design an evaluation engine tool to assist in the whole assessment process within the A2UN@ project, linking that tool with the other key elements of a learning design (learning task, learning resources and learning support). The teachers would define the relation between knowledge, competencies, activities, resources and type of assessment. Given this relation, it is possible to obtain more accurate estimates of a student's knowledge for adaptive evaluations and future recommendations. The process is supported by the use of educational standards and specifications and by integral user modelling.
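Purely as an illustration of the relation the teachers would define (the field names are our assumptions, not the A2UN@ schema), one could model each link and the resulting knowledge estimate as follows:

```python
from dataclasses import dataclass, field

# Illustrative data model only: names are hypothetical, not the A2UN@ schema.
# It captures the teacher-defined relation between knowledge, competencies,
# activities, resources and type of assessment.

@dataclass
class AssessmentLink:
    knowledge_item: str
    competency: str
    learning_task: str
    resources: list[str] = field(default_factory=list)
    assessment_type: str = "formative"  # e.g. formative / summative / diagnostic
    weight: float = 1.0                 # contribution to the knowledge estimate

def estimate_knowledge(links, scores):
    """Weighted estimate of mastery for one knowledge item from task scores."""
    total = sum(l.weight for l in links) or 1.0
    return sum(l.weight * scores.get(l.learning_task, 0.0) for l in links) / total
```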
Abstract:
Abstract taken from the publication.
Abstract:
Learning content adaptation has been a subject of interest in research on adaptive hypermedia systems. Defining which variables and which standards can be used to model adaptive content delivery processes is one of the main challenges in pedagogical design for e-learning environments. In this paper we describe some specifications, architectures and technologies that can be used in content adaptation processes that take characteristics of the context into account, and we present a proposal to integrate some of these characteristics into the design of units of learning, using adaptation conditions in an IMS-Learning Design (IMS-LD) structure. The key contribution of this work is the generation of instructional designs that consider the context, which can be used in Learning Management Systems (LMSs) and on diverse mobile devices.
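The kind of if/then adaptation condition that IMS-LD expresses at Level B over learner and context properties can be paraphrased in code; the property names and resource identifiers below are illustrative only:

```python
# Sketch of an IMS-LD-style if/then adaptation condition, paraphrased in
# Python; property names and resource ids are illustrative assumptions,
# not taken from the IMS-LD specification.

def select_resource(context: dict) -> str:
    """Return the resource variant for a unit of learning given the context."""
    if context.get("device") == "mobile" and context.get("bandwidth_kbps", 0) < 256:
        return "res-text-summary"   # lightweight variant for constrained devices
    if context.get("device") == "mobile":
        return "res-mobile-video"
    return "res-full-multimedia"    # default variant for desktop LMS delivery

print(select_resource({"device": "mobile", "bandwidth_kbps": 128}))
```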
Abstract:
Hypermedia systems based on the Web for open distance education are becoming increasingly popular as tools for user-driven access to learning information. Adaptive hypermedia is a new research direction within the area of user-adaptive systems that aims to increase functionality by making it personalized [Eklu 96]. This paper sketches a general agent architecture to provide navigational adaptability and user-friendly processes which would guide and accompany the student during his/her learning on the PLAN-G hypermedia system (New Generation Telematics Platform to Support Open and Distance Learning), with the aid of computer networks and specifically WWW technology [Marz 98-1] [Marz 98-2]. The current PLAN-G prototype is successfully used with some informatics courses (this version has no agents yet). The proposed multi-agent system contains two different types of adaptive autonomous software agents: Personal Digital Agents (Interface), which interact directly with the student when necessary; and Information Agents (Intermediaries), which filter and discover information to learn and adapt the navigation space to a specific student.
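A toy sketch of the two agent roles (class and method names are ours, not the PLAN-G implementation):

```python
# Toy sketch of the two agent roles described above; all names are
# hypothetical, not taken from the PLAN-G system.

class InformationAgent:
    """Intermediary: filters and discovers material, adapts the navigation space."""
    def __init__(self, catalogue):
        self.catalogue = catalogue

    def adapt_navigation(self, student_profile):
        level = student_profile.get("level", "beginner")
        return [doc for doc in self.catalogue if doc["level"] == level]

class PersonalDigitalAgent:
    """Interface agent: interacts directly with the student when necessary."""
    def __init__(self, information_agent):
        self.information_agent = information_agent

    def suggest(self, student_profile):
        docs = self.information_agent.adapt_navigation(student_profile)
        return docs[:3]  # surface only the first few adapted links
```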
Abstract:
Engineering of negotiation models allows the development of effective heuristics for business intelligence. Digital ecosystems demand open negotiation models, and defining effective heuristics in advance is not compatible with this requirement of openness. The new challenge is to develop business intelligence through an adaptive approach: the idea is to learn the business strategy as new negotiation models arise in the e-market arena. In this paper we present how recommendation technology may be deployed in an open negotiation environment where the interaction protocol models are not known in advance. The solution we propose is delivered as part of the ONE Platform, open source software that implements a fully distributed open environment for business negotiation.
Abstract:
During the 2008 global financial crisis, many financial organizations and markets had to end their operations or rethink them because of the shocks that struck the health of their firms. Despite this grave situation, today one can find companies that recovered and emerged from the terrible scenario the crisis presented to them, even finding new business opportunities and strengthening their future. This capacity that some organizations had, which allowed them to come out of the crisis victorious, is called resilience: the ability to overcome the negative effects of internal or external shocks (Briguglio, Cordina, Farrugia & Vella 2009). This work therefore studies this capacity both in the organization and in its leaders, in order to identify factors that improve the performance of companies in crises such as that of 2008-2009. First, a study of the events and development of the 2008 subprime crisis is carried out to gain a clear understanding of its antecedents, development, magnitude and consequences. Next, an in-depth study is made of the theory of organizational resilience, resilience in the leader as an individual, and leadership styles. Finally, with theoretical grounding in both the crisis and the concept of resilience, case studies are taken of companies that managed to endure the 2008 financial crisis and of companies that did not survive, in order to identify characteristics of the leader and of leadership that can strengthen or impair the resilience of organizations, with the aim of giving today's leaders tools to manage companies efficiently and effectively in a world as complex and changing as the present one.
Abstract:
Introduces extended Crowds protocols. Optional reading.
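For orientation, a minimal sketch of the basic Crowds forwarding rule (Reiter and Rubin's original scheme; the extended variants introduced in the reading modify this rule and are not shown here):

```python
import random

# Minimal sketch of the basic Crowds forwarding rule: each member ("jondo")
# forwards the request to a randomly chosen member with probability
# p_forward, and otherwise submits it to the destination server.

def route(members, p_forward=0.75):
    """Build a random forwarding path through the crowd."""
    path = [random.choice(members)]          # initiator picks a random jondo
    while random.random() < p_forward:
        path.append(random.choice(members))  # forward to another random member
    return path                              # the last member contacts the server

print(route([f"jondo{i}" for i in range(8)]))
```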