875 results for "modeling of data sources"
Abstract:
Existing seismic isolation systems are based on well-known and accepted physical principles, but they still have some functional drawbacks. In an attempt at improvement, the Roll-N-Cage (RNC) isolator has recently been proposed. It is designed to achieve a balance between controlling isolator displacement demands and structural accelerations. It provides in a single unit all the necessary functions of vertical rigid support, horizontal flexibility with enhanced stability, resistance to low service loads and minor vibration, and hysteretic energy dissipation. It is characterized by two unique features: a self-braking (buffer) mechanism and a self-recentering mechanism. This paper presents an advanced representation of the main and unique features of the RNC isolator using an available finite element code, SAP2000. The validity of the obtained SAP2000 model is then checked against experimental, numerical and analytical results. The paper then investigates the merits and demerits of activating the built-in buffer mechanism with respect to both structural pounding mitigation and isolation efficiency. It addresses the problem of passively alleviating possible inner pounding within the RNC isolator, which may arise from the activation of its self-braking mechanism under severe excitations such as near-fault earthquakes. The results show that the obtained finite element model can closely match and accurately predict the overall behavior of the RNC isolator with small errors. Moreover, the inherent buffer mechanism of the RNC isolator can mitigate or even eliminate direct structure-to-structure pounding under severe excitation when separation gaps between adjacent structures are limited. In addition, increasing the inherent hysteretic damping of the RNC isolator can efficiently limit its peak displacement together with the severity of any inner pounding that develops and, therefore, alleviate or even eliminate the negative effects the buffer mechanism might otherwise have on the overall RNC-isolated structural responses.
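As a rough illustration of the kind of behavior described above (and not the authors' SAP2000 representation), the following Python sketch simulates a single-degree-of-freedom isolated mass with Bouc-Wen hysteresis plus a stiff buffer spring that engages beyond a gap, mimicking the self-braking idea; all parameter values and the ground-motion pulse are invented for the example.

import numpy as np

m, c, k = 1.0e5, 2.0e4, 4.0e5           # mass [kg], damping [N*s/m], stiffness [N/m] (assumed)
alpha = 0.15                            # post- to pre-yield stiffness ratio (assumed)
A, beta, gamma = 1.0, 50.0, 50.0        # Bouc-Wen shape parameters (assumed)
gap, k_buf = 0.15, 4.0e7                # buffer gap [m] and buffer stiffness [N/m] (assumed)

def ground_accel(t):                    # idealized near-fault-like acceleration pulse
    return 4.0 * np.sin(2.0 * np.pi * t) if t < 1.0 else 0.0

dt, steps = 1.0e-3, 10000
x = xd = z = peak = 0.0
for i in range(steps):
    t = i * dt
    F_hys = alpha * k * x + (1.0 - alpha) * k * z        # hysteretic restoring force
    F_buf = k_buf * (abs(x) - gap) * np.sign(x) if abs(x) > gap else 0.0
    xdd = -(c * xd + F_hys + F_buf) / m - ground_accel(t)
    xd += xdd * dt                                       # semi-implicit Euler step
    x += xd * dt
    z += (A * xd - beta * abs(xd) * z - gamma * xd * abs(z)) * dt
    peak = max(peak, abs(x))

print(f"peak isolator displacement: {peak:.3f} m (buffer gap = {gap} m)")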
Abstract:
Lately, the mobile data market has moved into a growth stage, triggered by two facts: the affordability of mobile broadband and the availability of data-friendly devices. At this stage, market growth is no longer dependent on push strategies from suppliers; on the contrary, demand is now driving the market. However, it will not be easy for mobile operating companies to cope with the demand to come in the near future. The infrastructure needed to support that demand is far from complete, and operators are forced to make heavy investments to upgrade and expand their networks. To decide how to handle present and upcoming demand, they need to identify and understand the characteristics of the scenarios they face. This is precisely the aim of this article, which provides figures on the consequences for mobile infrastructures of a generalised mobile media uptake. Data from the Spanish mobile deployment case have been used to arrive at practical figures and illustrate the results, but the conclusions are easily extended to other countries and regions.
Abstract:
Data centers are easily found in every sector of the worldwide economy. They are composed of thousands of servers, serving millions of users globally, 24/7. In recent years, e-Science applications such as e-Health or Smart Cities have experienced significant development. The need to deal efficiently with the computational needs of next-generation applications, together with the increasing demand for higher resources in traditional applications, has driven the rapid proliferation and growth of data centers. A drawback to this capacity growth has been the rapid increase in the energy consumption of these facilities. In 2010, data center electricity represented 1.3% of all the electricity used in the world. In 2012 alone, global data center power demand grew 63% to 38 GW, and a further rise of 17% to 43 GW was estimated for 2013. Moreover, data centers are responsible for more than 2% of total carbon dioxide emissions.
Abstract:
RDF streams are sequences of timestamped RDF statements or graphs, which can be generated by several types of data sources (sensors, social networks, etc.). They may provide data at high volumes and rates, and be consumed by applications that require real-time responses. Hence it is important to publish and interchange them efficiently. In this paper, we exploit a key feature of RDF data streams, which is the regularity of their structure and data values, proposing a compressed, efficient RDF interchange (ERI) format, which can reduce the amount of data transmitted when processing RDF streams. Our experimental evaluation shows that our format produces state-of-the-art streaming compression, remaining efficient in performance.
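A minimal sketch of the underlying idea (not the actual ERI format, whose encoding details differ): triples are grouped by the structural signature of their subject, so predicate sets are stored once in a dictionary and only the values are streamed. The data and serialization below are invented; compression gains appear at realistic stream sizes rather than on this toy input.

import zlib, json
from collections import defaultdict

triples = [
    ("sensor1", "temp", "21.3"), ("sensor1", "hum", "40"),
    ("sensor2", "temp", "21.4"), ("sensor2", "hum", "41"),
]

by_subject = defaultdict(dict)
for s, p, o in triples:
    by_subject[s][p] = o

structures, blocks = {}, []
for s, po in by_subject.items():
    key = tuple(sorted(po))                       # the subject's structural signature
    sid = structures.setdefault(key, len(structures))
    blocks.append((s, sid, [po[p] for p in sorted(po)]))  # values only, predicates implied

payload = json.dumps({"dict": list(structures), "blocks": blocks})
baseline = json.dumps(triples)
print(len(zlib.compress(baseline.encode())), "->", len(zlib.compress(payload.encode())))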
Abstract:
In recent years, society has been experiencing a series of changes. One of these changes is datafication. This term can be defined as the systematic transformation of aspects of people's everyday lives into computer-processed data. Every day, every minute and every second, whenever someone uses a digital device, data are being stored somewhere. It may be the content of an email, but it may also be the number of steps that person has walked or their medical history. Merely storing data provides no added value by itself. Extracting knowledge from data, and thereby giving it value, requires data analysis. Data science, together with data analytics, is becoming increasingly popular. Today, millions of statistical web APIs can be found; these APIs offer the possibility of analyzing trends or sentiments present in social networks or on the internet in general. One of the most popular social networks, Twitter, is public: every message, or tweet, published can be seen by anyone in the world with an internet connection. This makes Twitter an interesting medium for analyzing social habits or consumption profiles. It is in this context that this project is framed. This work, combining statistical data analysis and content analysis, tries to extract knowledge from public Twitter tweets. In particular, it tries to establish whether gender is an influential factor in the relationships between Twitter users. To that end, a database containing almost 2,000 tweets is analyzed. First, the gender of the users is determined by means of web APIs. Second, hypothesis testing is used to determine whether gender influences how users relate to other users. Finally, a statistical model is built to predict the behavior of Twitter users in relation to their gender.
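As a hedged sketch of the hypothesis-testing step (the thesis does not publish its exact code), a chi-square test of independence between author gender and interaction behavior could look as follows; the counts in the contingency table are hypothetical, not the study's data.

from scipy.stats import chi2_contingency

#                 interacts  does-not-interact
table = [[420, 380],    # female authors (hypothetical counts)
         [510, 290]]    # male authors   (hypothetical counts)

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
if p < 0.05:
    print("reject H0: interaction behaviour appears to depend on gender")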
3-D modeling of perimeter recombination in GaAs diodes and its influence on concentrator solar cells
Abstract:
This paper describes a complete model of the perimeter recombination of GaAs diodes which resolves most unknowns and overcomes the limitations of previous models. Because of the three-dimensional nature of the implemented model, it is able to simulate real devices. GaAs diodes are manufactured on two epiwafers with different base doping levels, in several sizes and in two geometries, namely square and circular. The model is validated by fitting the experimental dark I-V curves of the manufactured GaAs diodes. A comprehensive 3-D description of the phenomena affecting perimeter recombination is supplied with the help of the model. Finally, the model is applied to concentrator GaAs solar cells to assess the impact of their doping level, size and geometry on perimeter recombination.
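For orientation, a common zero-order description (far simpler than the paper's 3-D model) splits the dark current into bulk terms scaling with the area A and a perimeter term scaling with the perimeter P:

I(V) = A J_{01}\left(e^{qV/kT} - 1\right) + A J_{02}\left(e^{qV/2kT} - 1\right) + P J_{02,\mathrm{per}}\left(e^{qV/2kT} - 1\right)

so that, per unit area, the perimeter contribution grows with the P/A ratio. This is why device size and geometry (square versus circular at equal area) matter at a given doping level.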
Abstract:
Perceptual voice evaluation according to the GRBAS scale is modelled using a linear combination of acoustic parameters calculated after a filter-bank analysis of the recorded voice signals. Modelling results indicate that, for breathiness and asthenia, more than 55% of the variance of the perceptual rates can be explained by such a model with only 4 latent variables. Moreover, the greatest part of the explained variance can be attributed to only one or two latent variables, similarly weighted by all 5 listeners involved in the experiment. Correlation coefficients of around 0.6 between actual rates and model predictions are obtained.
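As an illustrative stand-in for such a model (the abstract does not name the exact estimator), a linear model with 4 latent variables can be fitted with partial least squares; the data below are random placeholders, not the recorded voice corpus.

import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 20))             # 60 voices x 20 filter-bank parameters (synthetic)
y = X[:, :4] @ rng.normal(size=4) + 0.5 * rng.normal(size=60)  # synthetic perceptual rates

pls = PLSRegression(n_components=4)       # 4 latent variables, as in the abstract
pls.fit(X, y)
r = np.corrcoef(pls.predict(X).ravel(), y)[0, 1]
print(f"explained variance R^2 = {r**2:.2f}, correlation = {r:.2f}")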
Abstract:
Neuronal morphology is hugely variable across brain regions and species, and classification strategies are a matter of intense debate in neuroscience. GABAergic cortical interneurons have been a particular challenge because it is difficult to find a set of morphological properties that clearly defines neuronal types. A group of 48 neuroscience experts around the world was asked to classify a set of 320 cortical GABAergic interneurons according to the main features of their three-dimensional morphological reconstructions. A methodology for building a model which captures the opinions of all the experts was proposed. First, one Bayesian network was learned for each expert, and an algorithm was proposed for clustering the Bayesian networks corresponding to experts with similar behaviors. Then, a Bayesian network representing the opinions of each group of experts was induced. Finally, a consensus Bayesian multinet modeling the opinions of the whole group of experts was built. A thorough analysis of the consensus model identified different behaviors among the experts when classifying the interneurons in the experiment. A set of characterizing morphological traits for the neuronal types was defined by performing inference in the Bayesian multinet. These findings were used to validate the model and to gain some insights into neuron morphology.
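A much-simplified stand-in for the clustering step (the paper clusters learned Bayesian networks, which is not reproduced here): represent each expert by the distribution of labels they assigned across the 320 neurons and group experts with similar behavior hierarchically. All data below are synthetic, and the distance and cluster count are arbitrary choices.

import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
n_experts, n_types = 48, 6
labels = rng.integers(0, n_types, size=(n_experts, 320))    # synthetic expert labelings

profiles = np.stack([np.bincount(row, minlength=n_types) / 320 for row in labels])
D = pdist(profiles, metric="jensenshannon")                 # behavioural distance
groups = fcluster(linkage(D, method="average"), t=3, criterion="maxclust")
print("expert group sizes:", np.bincount(groups)[1:])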
Abstract:
In recent years, Ge has regained attention for integration into existing microelectronic technologies. Even though it is not considered a feasible full replacement for Si in the near future, it will likely serve as an excellent complement to enhance the electrical properties of future devices, especially due to its high carrier mobilities. This integration requires a significant upgrade of the state of the art in manufacturing processes. Simulation techniques, such as kinetic Monte Carlo (KMC) algorithms, provide an appealing environment for research and innovation in the field, especially in terms of time and funding costs. In the present study, KMC techniques are used, for the first time, to understand Ge front-end processing, specifically the damage accumulation and amorphization produced by ion implantation and the Solid Phase Epitaxial Regrowth (SPER) of the amorphized layers. First, Binary Collision Approximation (BCA) simulations are used to calculate the damage caused by every ion. The evolution of this damage over time is simulated using non-lattice, or Object, KMC (OKMC), in which only the defects are considered. SPER is simulated through a Lattice KMC (LKMC) approach, which is able to follow the evolution of the lattice atoms forming the amorphous/crystalline interface. With the amorphization model developed in this work, implemented in a multi-material process simulator, all these processes can be simulated. It has been possible to understand damage accumulation, from point defect generation up to the formation of full amorphous layers. This accumulation occurs in three distinct regimes: it starts with a slow formation rate of damage regions, is followed by a fast local relaxation of certain areas into the amorphous phase while the crystalline and amorphous phases coexist, and ends in the full amorphization of extended layers, where the accumulation rate saturates. The transition occurs when the damage concentration overcomes a certain threshold value, which is independent of the implantation conditions. When ions are implanted at relatively high temperatures, dynamic annealing takes place, healing the previously induced damage and establishing a competition between damage generation and its dissolution. These effects become especially important for light ions, such as B, for which the created damage is more dilute, smaller and differently distributed than that caused by implanting heavier ions, such as Ge. This description successfully reproduces the damage quantities and the extensions of the amorphous layers caused by ion implantation reported in the literature. The recrystallization velocity of a previously amorphized sample depends strongly on the substrate orientation. The presented LKMC model is able to explain these differences between orientations through a simple model dominated by a single activation energy, with different prefactors for the SPER rates depending on the neighboring configurations of the recrystallizing atoms. The formation of twin defects appears as a consequence of this description and is predominant in (111)-oriented Ge substrates. The model reproduces the experimental results for different orientations, temperatures and evolution times of the amorphous/crystalline interface reported by different authors. Preliminary parameterizations of the activation strain tensors also provide a good match between simulations and reported experimental SPER velocities at different temperatures under applied hydrostatic pressure. The studies presented in this thesis have helped to achieve a greater understanding of the damage generation, damage evolution, amorphization and SPER mechanisms in Ge, and also provide a useful tool to continue research in this field.
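As a toy illustration of the damage-accumulation picture summarized above (not the multi-material OKMC simulator developed in the thesis), the following sketch deposits BCA-like damage into depth bins, removes part of it through dynamic annealing, and marks a bin amorphous once its damage exceeds a fixed threshold; every number is invented.

import numpy as np

rng = np.random.default_rng(42)
bins = np.zeros(100)                   # damage per depth bin (arbitrary units)
amorphous = np.zeros(100, dtype=bool)
THRESHOLD, ANNEAL_P = 12.0, 0.004      # assumed threshold and per-step anneal fraction

for ion in range(20000):
    hit = rng.integers(0, 100, size=5)             # BCA-like damage cascade locations
    np.add.at(bins, hit, rng.exponential(1.0, 5))  # deposit damage
    bins[~amorphous] *= (1.0 - ANNEAL_P)           # dynamic annealing in crystal only
    amorphous |= bins > THRESHOLD                  # local collapse to the amorphous phase
    if ion % 5000 == 0:
        print(f"dose {ion:6d}: amorphous fraction = {amorphous.mean():.2f}")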
Abstract:
Ontology-Based Data Access (OBDA) allows accessing different kinds of data sources (traditionally databases) using a more abstract model provided by an ontology. Query rewriting uses such an ontology to rewrite a query into a rewritten query that can be evaluated on the data source. The rewritten queries retrieve the answers that are entailed by the combination of the data explicitly stored in the data source, the original query and the ontology. Since it works only on the queries, query rewriting enables OBDA over any data source that can be queried, regardless of the possibilities for modifying it. However, producing and evaluating the rewritten queries are both costly processes that generally become more complex as the expressiveness and size of the ontology and the queries increase. In this thesis we explore several optimisations that can be performed both in the rewriting process and in the rewritten queries to improve the applicability of OBDA in realistic contexts. Our main technical contribution is a query rewriting system that implements the optimisations presented in this thesis. These optimisations are the core contributions of the thesis and can be grouped into three different groups: optimisations that can be applied when considering which predicates in the ontology are actually mapped to the data sources; engineering optimisations that can be applied by handling the query rewriting process in a way that reduces the computational load of generating the rewritten queries; and optimisations that can be applied when considering additional metainformation about the characteristics of the ABox. In this thesis we provide formal proofs of the correctness and completeness of the proposed optimisations, and an empirical evaluation of their impact. As an additional contribution, as part of this empirical approach, we propose a benchmark for the evaluation of query rewriting systems, and we provide some guidelines for the creation and expansion of this kind of benchmark.
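A hedged sketch of the first family of optimisations (the predicate names and the flat rewriting structure are illustrative, not the thesis system): rewritten queries that use ontology predicates with no mapping to the data source can be pruned, since they cannot retrieve any answers.

mapped = {"Person", "worksFor"}                    # predicates with mappings (assumed)

rewritings = [
    ["Person(x)", "worksFor(x, y)"],               # fully mapped: keep
    ["Employee(x)", "worksFor(x, y)"],             # Employee unmapped: prune
]

def predicate(atom: str) -> str:
    return atom.split("(")[0]

useful = [q for q in rewritings
          if all(predicate(a) in mapped for a in q)]
print(useful)   # only the first rewriting survives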
Abstract:
Over the last few years, the data center market has grown exponentially, and this tendency continues today. As a direct consequence of this trend, the industry is pushing the development and implementation of new technologies to improve the energy efficiency of data centers. An adaptive dashboard allows the user to monitor the most important parameters of a data center in real time. For that reason, monitoring companies work with IoT big data filtering tools and cloud computing systems to handle the amounts of data obtained from the sensors placed in a data center. Analyzing the market trends in this field, we can affirm that the study of predictive algorithms has become an essential area for competitive IT companies. Complex algorithms are used to forecast risk situations based on historical data and warn the user in case of danger. Considering that several different users will interact with this dashboard, from IT experts or maintenance staff to accounting managers, it is vital to personalize it automatically. Following that line of thought, the dashboard should only show metrics relevant to the user, in different formats such as overlaid maps or representative graphs, among others. These maps show all the information needed in a visual and easy-to-evaluate way. To sum up, this dashboard allows the user to visualize and control a wide range of variables. Monitoring essential factors such as average temperature, gradients or hotspots, as well as energy and power consumption and savings by rack or building, allows clients to understand how their equipment is behaving, helping them optimize the energy consumption and efficiency of the racks. It also helps them prevent possible damage to the equipment with predictive algorithms.
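As an illustration of the kind of predictive check such a dashboard could run (the text does not specify its algorithms), a simple exponentially weighted forecast of rack inlet temperature with an alert threshold might look like this; the readings and the 30 C limit are invented.

temps = [22.1, 22.4, 23.0, 24.1, 25.9, 27.6, 29.0]   # recent readings [C] (invented)
ALPHA, LIMIT = 0.5, 30.0                             # smoothing factor and alert limit

forecast = temps[0]
for t in temps[1:]:
    forecast = ALPHA * t + (1 - ALPHA) * forecast    # EWMA update

trend = temps[-1] - temps[-2]                        # naive one-step trend
next_temp = forecast + trend
print(f"forecast next reading: {next_temp:.1f} C")
if next_temp > LIMIT:
    print("ALERT: projected hotspot, notify maintenance staff")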
Abstract:
The performance of tandem stacks of Group III-V multijunction solar cells continues to improve rapidly, both through improved performance of the individual cells in the stack and through an increase in the number of stacked cells. As the radiative efficiency of these individual cells increases, radiative coupling between the stacked cells becomes an increasingly important factor not only in cell design, but also in accurate efficiency measurement and in determining the performance of cells and systems under varying spectral conditions in the field. Past modeling has concentrated on electroluminescent coupling between the cells, although photoluminescent coupling is shown to be important for cells operating at or below their maximum power point voltage, or when junction defect recombination is significant. An extension of earlier models is proposed to allow this non-negligible component of luminescent coupling to be included. The refined model is validated by measurements of the closely related external emission from both single- and double-junction cells.
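For context, electroluminescent coupling is commonly written to zero order (a simpler form than the refined model proposed here) as an extra photocurrent delivered to subcell j by the radiative dark current of the subcell i above it:

J_{L,j} = J_{\mathrm{ph},j} + \eta_{ij} J_{0,\mathrm{rad},i}\left(e^{qV_i/kT} - 1\right)

where \eta_{ij} is a coupling efficiency. The paper's point is that photoluminescent re-emission adds a further contribution that this expression misses when V_i is low.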
Abstract:
An integrated understanding of molecular and developmental biology must consider the large number of molecular species involved and the low concentrations of many species in vivo. Quantitative stochastic models of molecular interaction networks can be expressed as stochastic Petri nets (SPNs), a mathematical formalism developed in computer science. Existing software can be used to define molecular interaction networks as SPNs and solve such models for the probability distributions of molecular species. This approach allows biologists to focus on the content of models and their interpretation, rather than their implementation. The standardized format of SPNs also facilitates the replication, extension, and transfer of models between researchers. A simple chemical system is presented to demonstrate the link between stochastic models of molecular interactions and SPNs. The approach is illustrated with examples of models of genetic and biochemical phenomena where the UltraSAN package is used to present results from numerical analysis and the outcome of simulations.
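To make the link concrete, here is a minimal Gillespie-style simulation of the kind of simple chemical system an SPN encodes (a single reaction A + B -> C); tools such as UltraSAN solve the corresponding SPN numerically rather than by this naive loop, and the rate constant and molecule counts are illustrative.

import random

a, b, c, t = 50, 40, 0, 0.0
K = 0.01                              # stochastic rate constant (assumed)

while a > 0 and b > 0:
    propensity = K * a * b            # firing rate of the SPN transition A + B -> C
    t += random.expovariate(propensity)
    a, b, c = a - 1, b - 1, c + 1     # fire the transition once

print(f"all of B consumed by t = {t:.2f} (arbitrary units); C produced = {c}")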
Abstract:
The function of a protein is generally determined by its three-dimensional (3D) structure. Thus, it would be useful to know the 3D structures of the thousands of protein sequences that are emerging from the many genome projects. To this end, fold assignment, comparative protein structure modeling, and model evaluation were completely automated. As an illustration, the method was applied to the proteins in the Saccharomyces cerevisiae (baker's yeast) genome. It resulted in all-atom 3D models for substantial segments of 1,071 (17%) of the yeast proteins, only 40 of which have had their 3D structure determined experimentally. Of the 1,071 modeled yeast proteins, 236 were clearly related to a protein of known structure for the first time; 41 of these had not previously been characterized at all.
Abstract:
The ligand binding domain of the human vitamin D receptor (VDR) was modeled based on the crystal structure of the retinoic acid receptor. The ligand binding pocket of our VDR model is spacious at the helix 11 site and confined at the β-turn site. The ligand 1α,25-dihydroxyvitamin D3 was assumed to be anchored in the ligand binding pocket with its side chain heading to helix 11 (site 2) and the A-ring toward the β-turn (site 1). Three residues forming hydrogen bonds with the functionally important 1α- and 25-hydroxyl groups of 1α,25-dihydroxyvitamin D3 were identified and confirmed by mutational analysis: the 1α-hydroxyl group is forming pincer-type hydrogen bonds with S237 and R274 and the 25-hydroxyl group is interacting with H397. Docking potential for various ligands to the VDR model was examined, and the results are in good agreement with our previous three-dimensional structure-function theory.