783 results for open data value chain
Abstract:
"Polymer crystallization is therefore assumed, and in theories often described, to be a multi-step process influenced by many physico-chemical and structural aspects. Because of the chain structure itself, it is easy to understand that a process which is thermodynamically driven to increase its local ordering, yet geometrically hindered, cannot proceed to a final equilibrium state. As a result, non-equilibrium structures are usually formed, with different characteristics depending on temperature, pressure, shear and other physico-chemical parameters of the system." These words, recently written by Professor Bernhard Wunderlich, one of the most relevant physical chemists to have addressed the physical state of macromolecules in recent decades, anticipate what this report makes explicit and constitute the leitmotiv of this thesis. The mechanism of polymer crystallization is still under debate in the polymer physics community, and most experimental approaches are explained through the LH theory. This classical theory, due to Lauritzen and Hoffman (LH) and a generalization of the theory of crystallization of a small molecule from the vapor phase, describes many experimental observations satisfactorily, although it is far from explaining the complex phenomenon of polymer crystallization. In fact, the original formulation of this theory at the National Bureau of Standards, in the early 1970s, underwent several major reformulations during the 1980s to adapt it to the experimental findings. Thus was born crystallization regime III, which allows the creation of molecular niches on the crystal surface and which gave rise to the paradigm offered by Sadler et al. to account for the results obtained by neutron scattering and by other techniques, such as droplet experiments or rapid quenching. Above all, the great success of the theory lies in explaining the inverse dependence between the molecular fold length and the supercooling, the latter defined as the temperature interval between the equilibrium temperature and the crystallization temperature. The specific problem addressed in this thesis is the study of the ordering processes of polyolefins with different degrees of branching by means of numerical simulations. The copolymers studied in this thesis are considered model materials of high molecular homogeneity from the point of view of the distributions of sizes and of branches along the polymer chain. These polyolefins were chosen because of the great experimental interest in knowing how the physical properties of the materials change depending on the type and amount of comonomer used. In addition, they are models for which a vast amount of experimental information exists, which is always a concern when creating a virtual reality such as a simulation. The experience of the Biophym group is that simulation results should always have a more or less close experimental counterpart, and that argument is used throughout this report. Empirically, it is well known that the physical properties of polyolefins ultimately depend on the type and amount of branching present in the polymeric material.
However, as explained above, there are no adequate theoretical models that explain the mechanisms underlying the effects of branches. The report of this work is extensive because of the complexity of the subject. It begins with an extensive introduction to the basic concepts of a macromolecule that are relevant to understanding the rest of the report. The concepts of a flexible macromolecule, distributions and moments, and its behavior in solution and in the melt, together with the corresponding characteristic parameters, are defined. Special emphasis is placed on the concept of entanglement, considered key when dealing with macromolecules longer than the critical entanglement length. This introduction ends with a review of the state of the art in the simulation of crystallization processes. A second chapter presents in detail the methodology used in each group of cases. In the first results chapter, simulation studies in dilute solution for single-chain linear and branched systems are discussed. This simplest case depends clearly on the chosen torsion potential, as discussed throughout the text. The formation of the "baby nuclei" proposed by Muthukumar appears to be a consequence of the torsion potential, since it favors the most stable torsional states. The analysis of other commonly used potentials is therefore proposed, and the results obtained for crystallization are discussed accordingly. Next, in a second results chapter, linear and branched long-chain alkane molecules in a melt are studied by atomistic simulations as a model of polyethylene. The atomistic results, despite their great detail, do not fully capture the experimental effects observed in supercooled melts in the stage prior to the ordered state. For this reason, the third and fourth results chapters discuss systems of short and long chains using two coarse-grained models (CG-PVA and CG-PE). The CG-PE model was developed during the thesis. The use of coarse-grained models guarantees greater computational efficiency than atomistic models, and they are sufficient to reveal the phenomena at the scale relevant to crystallization. In all these studies, the evolution of the ordering and melting processes is followed in isothermal and non-isothermal relaxation simulations. From the simulation models, different physical properties have been evaluated, such as ordered-segment (stem) length, crystallinity and melting/crystallization temperatures, allowing a comparison with experimental results. It is clearly shown that branching delays and hinders the ordering of the polymer chain, and therefore the ordered crystalline regions shrink as branching grows. As a general conclusion, there appears to be a tendency to form locally ordered structures that grow as blocks to fill the crystallization space attainable at a given temperature and time scale. Finally, it should be noted that the observed effects agree with other theoretical/simulation and experimental results discussed throughout this report.
A summary is given in a chapter of conclusions and future research lines opened by this work. It should be mentioned that the pace of research has picked up markedly in the last year of work, partly because of the notable advantages afforded by the coarse-grained methodology, which, despite being very important for this report, does not translate easily into publishable papers. All this justifies that a large part of the results are still in the publication stage. Abstract: "Polymer crystallization is therefore assumed, and in theories often described, to be a multi-step process with many influencing aspects. Because of the chain structure, it is easy to understand that a process which is thermodynamically forced to increase local ordering but is geometrically hindered cannot proceed into a final equilibrium state. As a result, nonequilibrium structures with different characteristics are usually formed, which depend on temperature, pressure, shearing and other parameters." These words, recently written by Professor Bernhard Wunderlich, one of the most prominent researchers in polymer physics, capture the leitmotiv of this thesis. The crystallization mechanism of polymers is still under debate in the physics community, and most of the experimental findings are still explained by invoking the LH theory. This classical theory, initially formulated by Lauritzen and Hoffman (LH), is indeed a generalization of the crystallization theory for small molecules from the vapor phase. Even though it describes many experimental observations satisfactorily, it is far from explaining the complex phenomenon of polymer crystallization. The theory was first devised in the early 70s at the National Bureau of Standards and was successively reformulated during the 80s to fit the experimental findings. Thus, crystallization regime III was introduced into the theory in order to explain the results found in neutron scattering, droplet and quenching experiments. This concept describes the roughness of the crystallization surface and led to the paradigm proposed by Sadler et al. The great success of the theory is its ability to explain the inverse dependence of the molecular folding size on the supercooling, the latter defined as the temperature interval between the equilibrium temperature and the crystallization temperature. The main scope of this thesis is the study of ordering processes in polyolefins with different degrees of branching by using computer simulations. The copolymers studied in this work are considered materials of high molecular homogeneity, from the point of view of both size and branching distributions of the polymer chain. These polyolefins were selected due to the great interest in understanding their structure–property relationships. It is important to note that there is a vast amount of experimental data concerning these materials, which is essential when building a virtual reality such as a simulation. The Biophym research group has wide experience in correlating simulation data with experimental results, an idea that runs throughout this work. Empirically, it is well known that the physical properties of polyolefins depend on the type and amount of branches present in the polymeric material. However, there are no suitable models to explain the underlying mechanisms associated with branching. This report is extensive due to the complexity of the topic under study.
It begins with a general introduction to the basic concepts of macromolecular physics that are needed to understand the rest of the document. This section defines, among other concepts, the flexibility of macromolecules, size distributions and moments, and the behavior in solution and in the melt, together with the corresponding characteristic parameters. Special emphasis is placed on the concept of "entanglement", a key concept when dealing with macromolecules longer than the critical entanglement length. The introduction finishes with a review of the state of the art in the simulation of crystallization processes. The second chapter of the thesis describes, in detail, the computational methodology used in each study. In the first results section, we discuss the simulation studies in dilute solution for linear and short-chain-branched single-chain models. This simplest case clearly depends on the selected torsion potential, as discussed throughout the text. For example, the formation of baby nuclei proposed by Muthukumar seems to result from the effects of the torsion potential. Thus, we propose the analysis of other torsion potentials that are also used by other research groups, and the results obtained on crystallization processes are discussed accordingly. Then, in a second results section, we study linear and branched long-chain alkane molecules in a melt by atomistic simulations as a polyethylene-like model. In spite of the great detail given by atomistic simulations, they are not able to fully capture the experimental facts observed in supercooled melts, in particular the pre-ordered states. For this reason, the third and fourth results sections discuss short- and long-chain systems using two coarse-grained models (CG-PVA and CG-PE). The CG-PE model was developed during the thesis. The use of coarse-grained models ensures greater computational efficiency with respect to atomistic models and is sufficient to capture the phenomena at the scale relevant to crystallization. In all these analyses we follow the evolution of the ordering and melting processes by both isothermal and non-isothermal simulations. During this thesis we have obtained different physical properties such as stem length, crystallinity and melting/crystallization temperatures. We show that branches in the chains delay crystallization and hinder the ordering of the polymer chain; therefore, crystalline regions decrease in size as branching increases. As a general conclusion, there seems to be a tendency in the macromolecular systems to form ordered structures, which grow locally as blocks, filling the crystallization space attainable at a given temperature and time scale. Finally, it should be noted that the observed effects are consistent with other theoretical/simulation and experimental results. A summary is provided in the conclusions chapter, along with the future research lines opened by this work. It should be mentioned that the research work has accelerated markedly in the last year, in part because of the remarkable benefits of the coarse-grained methodology, which, despite being very important for this thesis, is not easily publishable by itself. All this explains why most of the results are still in the publication phase.
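For context, the inverse dependence of fold length on supercooling referred to in both summaries is usually written, in the Lauritzen-Hoffman framework, as follows (a standard textbook form, not quoted from the thesis itself):

```latex
% Lauritzen-Hoffman initial lamellar (fold) thickness: standard textbook form.
%   l*      : initial stem (fold) length selected at the growth front
%   sigma_e : fold-surface free energy
%   Dh_f    : heat of fusion per unit volume
%   DT      : supercooling, the interval between the equilibrium melting
%             temperature T_m^0 and the crystallization temperature T_c
l^{*} = \frac{2\,\sigma_e\,T_m^{0}}{\Delta h_f\,\Delta T} + \delta l,
\qquad \Delta T = T_m^{0} - T_c
```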
Abstract:
Innovation has been identified as the single most relevant element in fuelling corporations’ competitive advantage and ultimate value creation. Corporations no longer rely on a single, linear structure of innovation; the new paradigm of open innovation opens up new possibilities of organizing innovation within the ecosystem, thus giving rise to new drivers for value creation. These value drivers have an impact on the strategic position of the firm and have the ability to create superior financial performance. In this paper we explore the close relationship between open innovation and value creation and propose a framework to analyze this process as well as the most critical elements involved.
Abstract:
Providing descriptions of isolated sensors and sensor networks in natural language, understandable by the general public, is useful to help users find relevant sensors and analyze sensor data. In this paper, we discuss the feasibility of using geographic knowledge from public databases available on the Web (such as OpenStreetMap, Geonames, or DBpedia) to automatically construct such descriptions. We present a general method that uses such information to generate sensor descriptions in natural language. The results of the evaluation of our method in a hydrologic national sensor network showed that this approach is feasible and capable of generating adequate sensor descriptions with a lower development effort compared to other approaches. In the paper we also analyze certain problems that we found in public databases (e.g., heterogeneity, non-standard use of labels, or rigid search methods) and their impact on the generation of sensor descriptions.
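As a minimal sketch of the approach (assuming OpenStreetMap's public Nominatim reverse-geocoding service as the knowledge source; the sensor, template, and field choices are illustrative, not the paper's actual system):

```python
# Enrich a sensor's coordinates with public geographic knowledge (here the
# OpenStreetMap Nominatim reverse geocoder) and render a natural-language
# description from a simple template.
import requests

def describe_sensor(sensor_id: str, lat: float, lon: float) -> str:
    resp = requests.get(
        "https://nominatim.openstreetmap.org/reverse",
        params={"format": "jsonv2", "lat": lat, "lon": lon},
        headers={"User-Agent": "sensor-description-demo"},  # required by Nominatim's usage policy
        timeout=10,
    )
    resp.raise_for_status()
    address = resp.json().get("address", {})
    # Pick the most specific place labels available; labels vary by location,
    # one of the heterogeneity problems the paper discusses.
    place = address.get("village") or address.get("town") or address.get("city", "an unknown place")
    region = address.get("state", "an unknown region")
    return (f"Sensor {sensor_id} is located near {place}, "
            f"in {region}, at coordinates ({lat:.4f}, {lon:.4f}).")

print(describe_sensor("H-042", 40.4168, -3.7038))  # hypothetical hydrologic sensor
```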
Abstract:
In this paper, the authors introduce a novel mechanism for data management in a middleware for smart home control, where a relational database and semantic ontology storage are used simultaneously in a Data Warehouse. An annotation system has been designed to specify the storage format and location, to register new ontology concepts and, most importantly, to guarantee Data Consistency between the two storage methods. To ease the data persistence process, the Data Access Object (DAO) pattern is applied and optimized to enhance the Data Consistency assurance. This mechanism also simplifies the development of applications and their integration with BATMP. Finally, an application named "Parameter Monitoring Service" is given as an example for assessing the feasibility of the system.
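A minimal sketch of the dual-write idea behind such a DAO (class, table, and ontology names are hypothetical; BATMP's annotation system and Data Warehouse are not reproduced here):

```python
# One DAO keeps a relational row and an ontology triple in step: write to
# both stores and undo on failure, which is one simple way to preserve
# Data Consistency between the two storage methods.
import sqlite3
from rdflib import Graph, Literal, Namespace

SH = Namespace("http://example.org/smarthome#")  # hypothetical ontology namespace

class ParameterDAO:
    def __init__(self) -> None:
        self.db = sqlite3.connect(":memory:")
        self.db.execute("CREATE TABLE parameter (device TEXT, name TEXT, value REAL)")
        self.graph = Graph()

    def save(self, device: str, name: str, value: float) -> None:
        try:
            with self.db:  # transactional: commits on success, rolls back on error
                self.db.execute("INSERT INTO parameter VALUES (?, ?, ?)",
                                (device, name, value))
                self.graph.add((SH[device], SH[name], Literal(value)))
        except Exception:
            # Keep the stores consistent: drop the triple if the row failed.
            self.graph.remove((SH[device], SH[name], Literal(value)))
            raise

dao = ParameterDAO()
dao.save("thermostat1", "temperature", 21.5)
```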
Abstract:
Traditional Text-To-Speech (TTS) systems have been developed using specially designed, non-expressive scripted recordings. In order to develop a new generation of expressive TTS systems in the Simple4All project, real recordings from the media should be used for training new voices with a whole new range of speaking styles. However, to process this more spontaneous material, the new systems must be able to deal with imperfect data (multi-speaker recordings, background and foreground music and noise), filtering out low-quality audio segments and creating mono-speaker clusters. In this paper we compare several architectures for combining speaker diarization and music and noise detection that improve the precision and overall quality of the segmentation.
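The abstract does not spell out the architectures compared; as a rough structural sketch, with the music/noise detector and the diarizer as hypothetical stand-ins, one combination might look like this:

```python
# Skeleton of one possible pipeline for cleaning media recordings before TTS
# training: detect music/noise first, then diarize what remains.
# detect_music_noise() and diarize() are placeholders for the components
# compared in the paper, which are not named in this summary.
from dataclasses import dataclass

@dataclass
class Segment:
    start: float   # seconds
    end: float
    label: str     # e.g. "speech", "music", "noise", or a speaker id

def detect_music_noise(wav_path: str) -> list[Segment]:
    raise NotImplementedError  # assumption: some acoustic classifier

def diarize(wav_path: str, regions: list[Segment]) -> list[Segment]:
    raise NotImplementedError  # assumption: some speaker diarizer

def clean_for_tts(wav_path: str) -> dict[str, list[Segment]]:
    """Keep only pure-speech regions, then group them into mono-speaker clusters."""
    speech = [s for s in detect_music_noise(wav_path) if s.label == "speech"]
    clusters: dict[str, list[Segment]] = {}
    for seg in diarize(wav_path, speech):
        clusters.setdefault(seg.label, []).append(seg)
    return clusters  # one entry of usable segments per detected speaker
```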
Abstract:
A procedure has been proposed for measuring the overheating temperature (ΔT) of the p-n junction area, relative to the ambient temperature, in photovoltaic (PV) cells converting laser or solar radiation while connected to an electric load. The basis of the procedure is the measurement of the open-circuit voltage (V_OC) during the initial time period after fast disconnection of the external resistive load. Simultaneous temperature control on an external heated part of a PV module gives the means for determining the value of V_OC at ambient temperature; comparing it with the value measured after switching off the load makes the calculation of ΔT possible. Calibration data on the V_OC = f(T) dependences for single-junction AlGaAs/GaAs and triple-junction InGaP/GaAs/Ge PV cells are presented. The temperature dynamics in the PV cells has been determined under flash illumination and during fast commutation of the load. Temperature measurements were taken in two cases: converting continuous laser power with single-junction cells and converting solar power with triple-junction cells operating in concentrator modules.
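In outline, the procedure amounts to inverting a calibrated V_OC(T) dependence; one generic way to write it (the slope is a cell-specific calibration constant, not a value taken from this summary):

```latex
% Junction overheating from the open-circuit-voltage drop after load switch-off.
% V_OC(T) is close to linear in T with a negative slope, so with
% beta = |dV_OC/dT| taken from the V_OC = f(T) calibration:
\Delta T \approx \frac{V_{OC}(T_{amb}) - V_{OC}(T_{amb} + \Delta T)}{\beta},
\qquad \beta = \left|\frac{dV_{OC}}{dT}\right|
```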
Abstract:
We propose a general procedure for solving incomplete data estimation problems. The procedure can be used to find the maximum likelihood estimate or to solve estimating equations in difficult cases such as estimation with the censored or truncated regression model, the nonlinear structural measurement error model, and the random effects model. The procedure is based on the general principle of stochastic approximation and the Markov chain Monte Carlo method. Applying the theory of adaptive algorithms, we derive conditions under which the proposed procedure converges. Simulation studies also indicate that the proposed procedure consistently converges to the maximum likelihood estimate for the structural measurement error logistic regression model.
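Schematically, the procedure combines a Robbins-Monro step with an MCMC draw of the incomplete data; a generic form of the recursion (notation assumed here rather than quoted from the paper):

```latex
% theta_k : parameter iterate;  y : observed data
% X_k     : draw from an MCMC kernel targeting p(x | y, theta_k)
% H       : complete-data score or estimating function
% gamma_k : gain sequence satisfying the Robbins-Monro conditions
\theta_{k+1} = \theta_k + \gamma_k\, H(\theta_k, X_k),
\qquad \sum_k \gamma_k = \infty, \quad \sum_k \gamma_k^{2} < \infty
```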
Abstract:
I attempt to reconcile apparently conflicting factors and mechanisms that have been proposed to determine the rate constant for two-state folding of small proteins, on the basis of general features of the structures of transition states. Φ-Value analysis implies a transition state for folding that resembles an expanded and distorted native structure, which is built around an extended nucleus. The nucleus is composed predominantly of elements of partly or well-formed native secondary structure that are stabilized by local and long-range tertiary interactions. These long-range interactions give rise to connecting loops, frequently containing the native loops that are poorly structured. I derive an equation that relates differences in the contact order of a protein to changes in the length of linking loops, which, in turn, is directly related to the unfavorable free energy of the loops in the transition state. Kinetic data on loop extension mutants of CI2 and α-spectrin SH3 domain fit the equation qualitatively. The rate of folding depends primarily on the interactions that directly stabilize the nucleus, especially those in native-like secondary structure and those resulting from the entropy loss from the connecting loops, which vary with contact order. This partitioning of energy accounts for the success of some algorithms that predict folding rates, because they use these principles either explicitly or implicitly. The extended nucleus model thus unifies the observations of rate depending on both stability and topology.
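One standard way to quantify the loop penalty invoked above (a generic Jacobson-Stockmayer-type polymer-entropy estimate; not necessarily the exact equation derived in the paper):

```latex
% The entropic cost of closing a disordered loop of n residues scales as
% (3/2) ln n, so extending a linking loop from n_1 to n_2 residues
% destabilizes the transition state by roughly
\Delta\Delta G_{loop} \approx \tfrac{3}{2}\, R T \,\ln\!\left(\frac{n_2}{n_1}\right)
% tying loop length (and hence contact order) to the folding rate.
```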
Abstract:
We describe the use of singular value decomposition in transforming genome-wide expression data from genes × arrays space to reduced diagonalized “eigengenes” × “eigenarrays” space, where the eigengenes (or eigenarrays) are unique orthonormal superpositions of the genes (or arrays). Normalizing the data by filtering out the eigengenes (and eigenarrays) that are inferred to represent noise or experimental artifacts enables meaningful comparison of the expression of different genes across different arrays in different experiments. Sorting the data according to the eigengenes and eigenarrays gives a global picture of the dynamics of gene expression, in which individual genes and arrays appear to be classified into groups of similar regulation and function, or similar cellular state and biological phenotype, respectively. After normalization and sorting, the significant eigengenes and eigenarrays can be associated with observed genome-wide effects of regulators, or with measured samples, in which these regulators are overactive or underactive, respectively.
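A minimal NumPy sketch of the decompose-filter-reconstruct step described above (the data are synthetic and the choice of which eigengene to filter is purely illustrative):

```python
# Decompose a genes x arrays matrix, drop eigengenes taken to represent
# noise or experimental artifacts, and reconstruct the normalized data.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 14))          # toy data: 2000 genes x 14 arrays

U, s, Vt = np.linalg.svd(X, full_matrices=False)
# Rows of Vt are "eigengenes"; columns of U are "eigenarrays".
fraction = s**2 / np.sum(s**2)           # expression captured by each eigengene

keep = np.ones_like(s, dtype=bool)
keep[0] = False                          # filter an eigengene judged to be artifact
X_norm = U[:, keep] @ np.diag(s[keep]) @ Vt[keep, :]

print(np.round(fraction[:3], 3))         # relative significance of top eigengenes
```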
Abstract:
Ribozymes of hepatitis delta virus have been proposed to use an active-site cytosine as an acid-base catalyst in the self-cleavage reaction. In this study, we have examined the role of cytosine in more detail with the antigenomic ribozyme. Evidence that proton transfer in the rate-determining step involved cytosine 76 (C76) was obtained from examining cleavage activity of the wild-type and imidazole buffer-rescued C76-deleted (C76Δ) ribozymes in D2O and H2O. In both reactions, a similar kinetic isotope effect and shift in the apparent pKa indicate that the buffer is functionally substituting for the side chain in proton transfer. Proton inventory of the wild-type reaction supported a mechanism of a single proton transfer at the transition state. This proton transfer step was further characterized by exogenous base rescue of a C76Δ mutant with cytosine and imidazole analogues. For the imidazole analogues that rescued activity, the apparent pKa of the rescue reaction, measured under k_cat/K_M conditions, correlated with the pKa of the base. From these data a Brønsted coefficient (β) of 0.51 was determined for the base-rescued reaction of C76Δ. This value is consistent with that expected for proton transfer in the transition state. Together, these data provide strong support for a mechanism where an RNA side chain participates directly in general acid or general base catalysis of the wild-type ribozyme to facilitate RNA cleavage.
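The Brønsted analysis referred to is the standard linear free-energy relation; schematically (notation assumed):

```latex
% log of the rescue rate constant (k_cat/K_M conditions) versus the pKa of
% the exogenous base; the slope beta gauges the extent of proton transfer
% at the transition state (0 = none, 1 = complete). Here beta ~ 0.51.
\log k_{rescue} = \beta\,\mathrm{p}K_a + C, \qquad \beta \approx 0.51
```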
Abstract:
Comparative genomics offers unparalleled opportunities to integrate historically distinct disciplines, to link disparate biological kingdoms, and to bridge basic and applied science. Cross-species, cross-genera, and cross-kingdom comparisons are proving key to understanding how genes are structured, how gene structure relates to gene function, and how changes in DNA have given rise to the biological diversity on the planet. The application of genomics to the study of crop species offers special opportunities for innovative approaches for combining sequence information with the vast reservoirs of historical information associated with crops and their evolution. The grasses provide a particularly well developed system for the development of tools to facilitate comparative genetic interpretation among members of a diverse and evolutionarily successful family. Rice provides advantages for genomic sequencing because of its small genome and its diploid nature, whereas each of the other grasses provides complementary genetic information that will help extract meaning from the sequence data. Because of the importance of the cereals to the human food chain, developments in this area can lead directly to opportunities for improving the health and productivity of our food systems and for promoting the sustainable use of natural resources.
Abstract:
Information technologies (IT) currently represent 2% of CO2 emissions. In recent years, a wide variety of IT solutions have been proposed, focused on increasing the energy efficiency of network data centers. Monitoring is one of the fundamental pillars of these systems, providing the information necessary for adequate decision making. However, today’s monitoring systems (MSs) are partial, specific and highly coupled solutions. This study proposes a model for monitoring data centers that serves as a basis for energy saving systems, offered as a value-added service embedded in a device with low cost and power consumption. The proposal is general in nature, comprehensive, scalable and focused on heterogeneous environments, and it allows quick adaptation to the needs of changing and dynamic environments. Further, a prototype of the system has been implemented in several devices, which has allowed validation of the proposal in addition to identification of the minimum hardware profile required to support the model.
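As a toy illustration of the kind of lightweight, embeddable monitor the model calls for (psutil is assumed here as one convenient metric source; the paper's actual device, metric set, and transport are not specified in this summary):

```python
# Periodically sample host metrics and emit them as structured records,
# a stand-in for publishing to the central monitoring service.
import json
import time

import psutil

def sample() -> dict:
    return {
        "ts": time.time(),
        "cpu_percent": psutil.cpu_percent(interval=1),   # blocks ~1 s to measure
        "mem_percent": psutil.virtual_memory().percent,
    }

def run(period_s: float = 60.0, iterations: int = 3) -> None:
    for _ in range(iterations):              # bounded loop for the demo
        print(json.dumps(sample()))          # stand-in for a publish step
        time.sleep(max(0.0, period_s - 1))   # cpu_percent already waited ~1 s

run(period_s=2.0)
```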
Abstract:
This layer is a georeferenced raster image of the historic paper map entitled: Base map of the District of Columbia showing public and zoning areas, base prepared in the Office of the Surveyor, D.C., by direction of the Engineer Commissioner, D.C. It was published by Engineer Commissioner in 1936. Scale [ca. 1:19,200]. Base map "complete to June 13, 1933." The image inside the map neatline is georeferenced to the surface of the earth and fit to the Maryland State Plane Coordinate System Meters NAD83 (Fipszone 1900). All map collar and inset information is also available as part of the raster image, including any inset maps, profiles, statistical tables, directories, text, illustrations, index maps, legends, or other information associated with the principal map. This map shows features such as residential areas, open spaces, commercial and industrial areas, alley dwelling areas, roads, block numbers, railroads and stations, drainage, selected public buildings and points of interest, parks, cemeteries, and more. This layer is part of a selection of digitally scanned and georeferenced historic maps from The Harvard Map Collection as part of the Imaging the Urban Environment project. Maps selected for this project represent major urban areas and cities of the world, at various time periods. These maps typically portray both natural and manmade features at a large scale. The selection represents a range of regions, originators, ground condition dates, scales, and purposes.
Abstract:
The revelation of the top-secret US intelligence-led PRISM Programme has triggered wide-ranging debates across Europe. Press reports have shed new light on the electronic surveillance ‘fishing expeditions’ of the US National Security Agency and the FBI into the world’s largest electronic communications companies. This Policy Brief by a team of legal specialists and political scientists addresses the main controversies raised by the PRISM affair and the policy challenges that it poses for the EU. Two main arguments are presented: First, the leaks over the PRISM programme have undermined the trust that EU citizens have in their governments and the European institutions to safeguard and protect their privacy; and second, the PRISM affair raises questions regarding the capacity of EU institutions to draw lessons from the past and to protect the data of its citizens and residents in the context of transatlantic relations. The Policy Brief puts forward a set of policy recommendations for the EU to follow and implement a robust data protection strategy in response to the affair.