964 results for Data Migration Processes Modeling
Abstract:
Geochemical investigation of 18 sediment samples from Site 434 involved determining the content of organic carbon, of bitumoid A (the chloroform A-chl and alcohol-benzene A-alb extracts) and its various fractions, and of individual hydrocarbons, as well as the structural group composition of resins. We identified certain samples that differed sharply from the rest in their increased bitumen content and relatively low-molecular-weight hydrocarbons, and in that their resinous components were more neutral and aliphatic in composition. The distribution of bitumoid and its components seems to reflect migration processes in operation during the early stages of the transformation of organic matter.
Abstract:
Palynological investigations in northeastern Bavaria (Bavarian Vogtland, Fichtelgebirge, Steinwald) reveal the Late Glacial and Postglacial history of the regional vegetation. Radiocarbon data, compared with those from the neighbouring regions (Rhön, Oberpfälzer Wald, Bavarian Forests), show a time lag in the development of the arboreal vegetation due to migration processes. The Fichtelgebirge is the southernmost part of northeastern Bavaria where the early Alleröd period (pollen zone IIa) is characterised by a dominance of birch forests. Hazel reached maximal values around 8000 BP in the area from the Fichtelgebirge to the Bavarian Forests, i.e. about 600 years earlier than in the more northerly Rhön mountains. For spruce there is a considerable time lag between the Bavarian Forests and the Fichtelgebirge. Spruce spreading started in the Fichtelgebirge during the older part of the Atlantic period (pollen zone VI), when spruce was already the dominant tree in the Bavarian Forests. During the younger part of the Atlantic period (pollen zone VII), spruce and mixed oak forest tree species frequently occurred in the Fichtelgebirge; at the end of pollen zone VII, spruce came to dominance. At the same time, the immigration of beech started. During the Subboreal period (pollen zone VIII), spruce remained a dominant member of the forests, and at the end of pollen zone VIII, fir began to spread rapidly. During the first part of the Subatlantic period (pollen zone IX), spruce, beech, fir and pine formed the mountain forests of the Fichtelgebirge. In the area of the Bavarian Vogtland, however, fir was a dominant forest tree during pollen zone IX, while spruce and beech played a less important role. During the 12th century, human colonisation started in the area of the Fichtelgebirge, some 400 years later than in the area of the Rhön mountains. Indicators of earlier forest clearances are rare or absent.
Abstract:
Digital atlases of animal development provide a quantitative description of morphogenesis, opening the path toward process modeling. Prototypic atlases offer a data integration framework in which to gather information from cohorts of individuals with phenotypic variability. Relevant information for further theoretical reconstruction includes measurements in time and space of cell behaviors and gene expression. The latter, as well as data integration into a prototypic model, relies on image processing strategies. Developing the tools to integrate and analyze biological multidimensional data is highly relevant for assessing chemical toxicity or performing preclinical drug testing. This article surveys some of the most prominent efforts to assemble these prototypes, categorizes them according to salient criteria, and discusses the key questions in the field and the future challenges toward the reconstruction of multiscale dynamics in model organisms.
Abstract:
Data centers are easily found in every sector of the worldwide economy. They consist of tens of thousands of servers, serving millions of users globally, 24-7. In recent years, e-Science applications such as e-Health or Smart Cities have experienced significant development. The need to deal efficiently with the computational needs of next-generation applications, together with the increasing demand for more resources in traditional applications, has facilitated the rapid proliferation and growth of data centers. A drawback to this capacity growth has been the rapid increase in the energy consumption of these facilities. In 2010, data center electricity represented 1.3% of all electricity use in the world. In 2012 alone, global data center power demand grew 63% to 38 GW. A further rise of 17% to 43 GW was estimated for 2013. Moreover, data centers are responsible for more than 2% of total carbon dioxide emissions. This PhD thesis addresses the energy challenge by proposing proactive and reactive thermal- and energy-aware optimization techniques that contribute to placing data centers on a more scalable curve. This work develops energy models and uses knowledge about the energy demand of the workload to be executed and the computational and cooling resources available at the data center to optimize energy consumption. Moreover, data centers are considered a crucial element within their application framework, optimizing not only the energy consumption of the facility but the global energy consumption of the application. The main contributors to energy consumption in a data center are the computing power drawn by IT equipment and the cooling power needed to keep the servers within a temperature range that ensures safe operation. Because of the cubic relation of fan power with fan speed, solutions based on over-provisioning cold air to the server usually lead to inefficiencies. On the other hand, higher chip temperatures lead to higher leakage power because of the exponential dependence of leakage on temperature. Moreover, workload characteristics as well as allocation policies have an important impact on the leakage-cooling tradeoffs. The first key contribution of this work is the development of power and temperature models that accurately describe the leakage-cooling tradeoffs at the server level, and the proposal of strategies to minimize server energy via joint cooling and workload management from a multivariate perspective. When scaling to the data center level, a similar behavior in terms of leakage-temperature tradeoffs can be observed. As room temperature rises, the efficiency of the data room cooling units improves; however, CPU temperature rises as well, and so does leakage power. Moreover, the thermal dynamics of a data room exhibit unbalanced patterns due to both the workload allocation and the heterogeneity of the computing equipment. The second main contribution is the proposal of thermal- and heterogeneity-aware workload management techniques that jointly optimize the allocation of computation and cooling to servers. These strategies need to be backed by flexible room-level models, able to work at runtime, that describe the system from a high-level perspective. Within the framework of next-generation applications, decisions taken at this scope can have a dramatic impact on the energy consumption of lower abstraction levels, i.e., the data center facility. It is important to consider the relationships between all the computational agents involved in the problem, so that they can cooperate to achieve the common goal of reducing energy in the overall system. The third main contribution is the energy optimization of the overall application by evaluating the energy costs of performing part of the processing in any of the different abstraction layers, from the node to the data center, via workload management and off-loading techniques. In summary, the work presented in this PhD thesis makes contributions to leakage- and cooling-aware server modeling and optimization, data center thermal modeling and heterogeneity-aware data center resource allocation, and develops mechanisms for the energy optimization of next-generation applications from a multi-layer perspective.
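The leakage-cooling tradeoff described in this abstract can be made concrete with a toy model. The sketch below is a minimal illustration, not the thesis's calibrated models: only the cubic fan law and the exponential leakage-temperature dependence are taken from the abstract, while the thermal model and every constant are invented for illustration.

```python
# Toy leakage-vs-cooling tradeoff at the server level. Only the cubic
# fan law and exponential leakage dependence come from the abstract;
# all constants and the thermal model are illustrative assumptions.
import numpy as np

P_DYNAMIC = 80.0   # W, workload-dependent dynamic power (assumed)
T_AMBIENT = 25.0   # deg C, inlet air temperature (assumed)

def cpu_temperature(fan_speed):
    """Assumed thermal model: more airflow lowers steady-state CPU temp."""
    return T_AMBIENT + 60.0 / (0.5 + fan_speed)

def leakage_power(temp_c):
    """Leakage grows exponentially with temperature (per the abstract);
    coefficients here are placeholders."""
    return 10.0 * np.exp(0.02 * (temp_c - 25.0))

def fan_power(fan_speed):
    """Fan power scales with the cube of fan speed (per the abstract)."""
    return 5.0 * fan_speed ** 3

speeds = np.linspace(0.2, 3.0, 200)  # normalized fan speeds
total = [P_DYNAMIC + leakage_power(cpu_temperature(s)) + fan_power(s)
         for s in speeds]
best = speeds[int(np.argmin(total))]
print(f"tradeoff-optimal fan speed ~ {best:.2f} (normalized), "
      f"total power ~ {min(total):.1f} W")
```

Under-cooling inflates the exponential leakage term while over-cooling inflates the cubic fan term, so total power is minimized at an intermediate fan speed, which is the multivariate balance the thesis optimizes.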
Abstract:
There is a growing societal need to address the increasing prevalence of behavioral health issues, such as obesity, alcohol or drug use, and a general lack of treatment adherence for a variety of health problems. The statistics, worldwide and in the USA, are daunting. Excessive alcohol use is the third leading preventable cause of death in the United States (with 79,000 deaths annually) and is responsible for a wide range of health and social problems. On the positive side, though, these behavioral health issues (and associated possible diseases) can often be prevented with relatively simple lifestyle changes, such as losing weight through diet and/or physical exercise, or learning how to reduce alcohol consumption. Medicine has therefore started to move toward preventively promoting wellness rather than solely treating already established illness. Evidence-based, patient-centered Brief Motivational Interviewing (BMI) interventions have been found particularly effective in helping people find intrinsic motivation to change problem behaviors after short counseling sessions, and to maintain healthy lifestyles over the long term. Lack of locally available personnel well-trained in BMI, however, often limits access to successful interventions for people in need. To fill this accessibility gap, Computer-Based Interventions (CBIs) have started to emerge. The success of CBIs, however, critically relies on ensuring the engagement and retention of users so that they remain motivated to use these systems and return to them over the long term as necessary. Because of their text-only interfaces, current CBIs can only express limited empathy and rapport, which are the most important factors of health interventions. Fortunately, in the last decade, computer science research has progressed in the design of simulated human characters with anthropomorphic communicative abilities. Virtual characters interact using humans' innate communication modalities, such as facial expressions, body language, speech, and natural language understanding. By advancing research in Artificial Intelligence (AI), we can improve the ability of artificial agents to help us solve CBI problems. To facilitate successful communication and social interaction between artificial agents and human partners, it is essential that aspects of human social behavior, especially empathy and rapport, be considered when designing human-computer interfaces. Hence, the goal of the present dissertation is to provide a computational model of rapport to enhance an artificial agent's social behavior, and to provide an experimental tool for the psychological theories shaping the model. Parts of this thesis were already published in [LYL+12, AYL12, AL13, ALYR13, LAYR13, YALR13, ALY14].
Abstract:
Individuals and corporate users are persistently considering cloud adoption due to its significant benefits compared to traditional computing environments. Data and applications in the cloud are stored in an environment that is separated, managed, and maintained externally to the organisation. It is therefore essential for cloud providers to demonstrate and implement adequate security practices to protect the data and processes put under their stewardship. Security transparency in the cloud is likely to become the core theme underpinning the systematic disclosure of security designs and practices that enhance customer confidence in using cloud service and deployment models. In this paper, we present a framework that enables a detailed analysis of security transparency for cloud-based systems. In particular, we consider security transparency at three different levels of abstraction, i.e., the conceptual, organisational, and technical levels, and identify the relevant concepts within these levels. This allows us to elaborate the essential concepts at the core of transparency and analyse the means for implementing them from a technical perspective. Finally, an example from a real-world migration context is given to provide a solid discussion of the applicability of the proposed framework.
Abstract:
Decision-making in university libraries is of great importance; however, it faces complications such as the large number of data sources and the large volumes of data to be analyzed. University libraries routinely produce and collect a great deal of information about their data and services. Common data sources are the output of internal systems, online portals and catalogs, quality assessments, and surveys. Unfortunately, these data sources are only partially used for decision-making because of the wide variety of formats and standards, as well as the lack of efficient integration methods and tools. This thesis project presents the analysis, design, and implementation of a Data Warehouse, an integrated decision-making system for the Centro de Documentación Juan Bautista Vázquez. First, the requirements and the data analysis are presented on the basis of a methodology that incorporates key elements influencing a library decision, including process analysis, estimated quality, relevant information, and user interaction. Next, the architecture and design of the Data Warehouse are proposed, together with an implementation that supports data integration, processing, and storage. Finally, the stored data are analyzed with analytical processing tools and bibliomining techniques, helping the documentation center's managers make optimal decisions about their resources and services.
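A minimal sketch of the kind of star-schema roll-up such a warehouse supports, assuming hypothetical table and column names (the actual system integrates catalogs, portals, quality assessments, and surveys):

```python
# Star-schema roll-up in the spirit of the library warehouse described
# above. Table and column names are hypothetical placeholders.
import pandas as pd

# Fact table: one row per loan event (hypothetical schema).
loans = pd.DataFrame({
    "item_id": [1, 2, 1, 3, 2],
    "user_id": [10, 11, 10, 12, 13],
    "month":   ["2016-01", "2016-01", "2016-02", "2016-02", "2016-02"],
})

# Dimension table: bibliographic metadata (hypothetical).
items = pd.DataFrame({
    "item_id": [1, 2, 3],
    "subject": ["Engineering", "Literature", "Engineering"],
})

# OLAP-style aggregate: loans by subject and month, the kind of cube
# that analytical processing or bibliomining tools would consume.
cube = (loans.merge(items, on="item_id")
             .groupby(["subject", "month"])
             .size()
             .rename("loan_count")
             .reset_index())
print(cube)
```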
Abstract:
Dissertation (Master's)—Universidade de Brasília, Instituto de Ciências Humanas, Departamento de Geografia, 2016.
Abstract:
Until recently, hardly anyone could have predicted this course of GIS development. GIS is moving from the desktop to the cloud. Web 2.0 enabled people to contribute data to the web, and these data are increasingly geolocated. The resulting large volumes form what is called "Big Data", which scientists still do not fully know how to handle. Different Data Mining tools are used to try to extract useful information from this Big Data. In our study, we deal with one part of these data: User-Generated Geographic Content (UGGC). The Panoramio initiative allows people to upload photos and describe them with tags. These photos are geolocated, meaning they have an exact location on the Earth's surface in a certain spatial reference system. Using Data Mining tools, we investigate whether it is possible to extract land use information from Panoramio photo tags, and to what extent this information is accurate. Finally, we compared different Data Mining methods to determine which performs best for this kind of data, namely text. Our answers are quite encouraging: with more than 70% accuracy, we showed that extracting land use information is possible to some extent. We also found the Memory-Based Reasoning (MBR) method to be the most suitable for this kind of data in all cases.
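Memory-Based Reasoning is essentially a k-nearest-neighbour approach, so a tag-classification pipeline in that spirit can be sketched as below. The tags, labels, and pipeline details are illustrative assumptions, not the study's actual setup or data.

```python
# Sketch of tag-based land-use classification with a memory-based
# (k-nearest-neighbour) learner. Tags and labels below are invented;
# the real study used Panoramio tags with ground-truth land-use classes.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

tags = [
    "beach sea sand sunset",
    "office towers downtown skyline",
    "wheat field tractor harvest",
    "harbour boats pier",
    "apartment blocks street traffic",
    "cows pasture barn",
]
land_use = ["coastal", "urban", "agricultural",
            "coastal", "urban", "agricultural"]

# Vectorize the tag text, then classify by nearest stored examples.
model = make_pipeline(TfidfVectorizer(), KNeighborsClassifier(n_neighbors=3))
model.fit(tags, land_use)
print(model.predict(["fishing boats lighthouse coast"]))  # -> ['coastal']
```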
Abstract:
The environmental and socio-economic importance of coastal areas is widely recognized, but at present these areas face severe weaknesses and high-risk situations. Increased demand and growing human occupation of coastal zones have greatly contributed to exacerbating these weaknesses. Today, in all countries with coastal regions, episodes of wave overtopping and coastal flooding are frequent. These episodes are usually responsible for property losses and often put human lives at risk. The floods are caused by coastal storms, primarily through the action of very strong winds, whose propagation towards the coast induces high water levels. Climate change phenomena are expected to contribute to the intensification of coastal storms. In this context, estimating coastal flooding hazards is of paramount importance for the planning and management of coastal zones; consequently, running a series of storm scenarios and analyzing their impacts through numerical modeling is of prime interest to coastal decision-makers. Firstly, in this work, historical storm tracks and intensities are characterized for the northeastern United States coast in terms of probability of occurrence. Secondly, several storm events with high potential of occurrence are generated using a dedicated tool of the DelftDashboard interface for the Delft3D software. Hydrodynamic models are then used to generate ensemble simulations that assess the storms' effects on coastal water levels. A highly refined regional domain is considered around the area of The Battery, New York, situated in New York Harbor. Based on statistical analysis of the numerical modeling results, the impact of coastal storms on different locations within the study area is reviewed.
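The probability-of-occurrence characterization of storms can be illustrated with a generic extreme-value fit. The sketch below uses synthetic annual-maximum surge levels and a standard GEV return-level calculation; it is an assumed illustration, not the study's Delft3D workflow.

```python
# Illustrative storm hazard characterization: fit a generalized extreme
# value (GEV) distribution to annual-maximum surge levels and compute
# return levels. The data are synthetic placeholders.
from scipy.stats import genextreme

# Synthetic annual-maximum surge levels in meters (assumed values).
annual_max_surge = genextreme.rvs(c=-0.1, loc=1.5, scale=0.4,
                                  size=50, random_state=42)

# Fit a GEV and read off levels for common return periods.
shape, loc, scale = genextreme.fit(annual_max_surge)
for T in (10, 50, 100):  # return periods in years
    level = genextreme.ppf(1 - 1 / T, shape, loc=loc, scale=scale)
    print(f"{T:>3}-yr return level ~ {level:.2f} m")
```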
Abstract:
In this research we take a qualitative approach to training resources for social and labor-market insertion aimed at immigrants in the city of Barcelona. This training is part of the Spanish welfare system, characterized as Mediterranean (Esping-Andersen, 1990; Ferrara, 1996; Moreno, 2002), and follows the guidelines of the integration policies aimed at this group. The work adopts the methodological perspective of the Anthropology of Policy (Shore and Wright, 1997). The theoretical framework draws on contributions that emphasize the political role of nation-states in migration processes (Sayad, 2010), noting that immigration, far from being a process that simply "happens" to receiving societies, is a phenomenon shaped by them (Geddes, 2006). The profound transformation in the modes of social cohesion (Castel, 1997) in societies receiving immigrants constitutes the context in which integration policies operate. Through the conditions of access to training resources for social and labor-market insertion, and through the content taught and the ways in which it is taught, "desired" and "undesired" immigrants are constructed, with the resources functioning as "organizational borders". The results of the analysis indicate that immigrants are expected to be available and active subjects, and that training emerges less as a right that fosters and consolidates social cohesion or "integration" than as a resource that must be "deserved". At the same time, training takes shape as a device that, rather than serving social advancement, is a weak substitute for the employment so laboriously longed for by those who undertake the training processes.
Abstract:
We present a study of the continuous-time equations governing the dynamics of a susceptible-infected-susceptible model on heterogeneous metapopulations. These equations have recently been proposed as an alternative formulation for the spread of infectious diseases in metapopulations in a continuous-time framework. Individual-based Monte Carlo simulations of epidemic spread in uncorrelated networks are also performed, revealing good agreement with analytical predictions under the assumption of simultaneous transmission or recovery and migration processes.
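A minimal individual-based Monte Carlo sketch of SIS dynamics with migration on an uncorrelated metapopulation network is given below. All rates, sizes, and the simplified discrete-time update rule are placeholder assumptions, not a transcription of the paper's equations or simulation protocol.

```python
# Toy individual-based SIS simulation with migration between patches.
# Rates and network construction are illustrative assumptions.
import random

random.seed(1)
N_PATCH, N_IND = 50, 2000
BETA, MU, D = 0.4, 0.2, 0.1   # infection, recovery, migration rates
DT, STEPS = 0.1, 500

# Crude uncorrelated network: each patch linked to 4 random patches
# (self-links possible; acceptable for a toy model).
neighbors = {p: random.sample(range(N_PATCH), 4) for p in range(N_PATCH)}
patch = [random.randrange(N_PATCH) for _ in range(N_IND)]
infected = [random.random() < 0.05 for _ in range(N_IND)]

for _ in range(STEPS):
    # per-patch counts for frequency-dependent transmission
    n_inf = [0] * N_PATCH
    n_pop = [0] * N_PATCH
    for i in range(N_IND):
        n_pop[patch[i]] += 1
        if infected[i]:
            n_inf[patch[i]] += 1
    for i in range(N_IND):
        p = patch[i]
        if infected[i]:
            if random.random() < MU * DT:                     # recovery
                infected[i] = False
        elif random.random() < BETA * DT * n_inf[p] / n_pop[p]:
            infected[i] = True                                # transmission
        if random.random() < D * DT:                          # migration
            patch[i] = random.choice(neighbors[patch[i]])

print("final prevalence:", sum(infected) / N_IND)
```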
Abstract:
Ground-penetrating radar (GPR) and microgravimetric surveys have been conducted in the southern Jura mountains of western Switzerland in order to map subsurface karstic features. The study site, La Grande Rolaz cave, is an extensive system in which many portions have been mapped. By using small station spacing and careful processing for the geophysical data, and by modeling these data with topographic information from within the cave, accurate interpretations have been achieved. The constraints on the interpreted geologic models are better when combining the geophysical methods than when using only one of the methods, despite the general limitations of two-dimensional (2D) profiling. For example, microgravimetry can complement GPR methods for accurately delineating a shallow cave section approximately 10 × 10 m in size. Conversely, GPR methods can be complementary in determining cavity depths and in verifying the presence of off-line features and numerous areas of small cavities and fractures, which may be difficult to resolve in microgravimetric data.
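For reference, the standard forward model for the vertical gravity effect of a buried cavity approximated as a sphere (a textbook result in applied geophysics, not the authors' specific processing chain) is:

```latex
% Sphere approximation for the vertical gravity effect of a buried
% cavity; a standard forward model, not the authors' processing chain.
\[
  g_z(x) \;=\; \frac{4}{3}\,\pi\, G\,\Delta\rho\, R^{3}\,
               \frac{z}{\left(x^{2}+z^{2}\right)^{3/2}}
\]
% \Delta\rho : density contrast (negative for an air-filled cave)
% R          : cavity radius,  z : depth to the cavity center
% x          : horizontal offset along the profile
```

The anomaly minimum sits directly above the cavity, and its half-width scales with the depth to the center, which is how a microgravimetric profile constrains both the position and the depth of a cave section of the size mentioned above.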
Abstract:
The literature on fertilization for carrot growing usually recommends nutrient application rates for yield expectations lower than the yields currently obtained. Moreover, the recommendations only consider the results of soil chemical analysis and do not include effects such as crop residues or variations in yield levels. The aim of this study was to propose a fertilizer recommendation system for carrot cultivation (FERTICALC Carrot) that takes into account the nutrient supply from crop residues, variation in intended yield, soil chemical properties, and the growing season (winter or summer). To obtain the data necessary for modeling nutritional requirements, 210 carrot production stands were sampled in the region of Alto Paranaíba, State of Minas Gerais, Brazil. The dry matter content of the roots, the coefficient of biological utilization of nutrients in the roots, and the nutrient harvest index for summer and winter crops were determined for these samples. To model the nutrient supply by the soil, the literature on this theme was surveyed. A modeling system was developed for recommendation of macronutrients and B. For cationic micronutrients, the system only reports crop nutrient export and extraction. The FERTICALC system developed proved efficient for fertilizer recommendation in carrot cultivation. Its advantages over official fertilizer recommendation tables are the continuous variation of nutrient application rates according to soil properties and according to data on the extraction efficiency of modern, higher-yielding cultivars.
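The requirement-versus-supply balance at the heart of recommendation systems of this family can be sketched as below. The functional form and every coefficient are illustrative assumptions, not the published FERTICALC Carrot parameters.

```python
# Hedged sketch of a requirement-vs-supply nutrient balance in the
# spirit of FERTICALC-style systems. All numbers are invented.
def recommend_nutrient(intended_yield_t_ha, dm_fraction, cbu_kg_kg,
                       harvest_index, soil_supply_kg_ha,
                       residue_supply_kg_ha, recovery_eff):
    """Return a nutrient application rate (kg/ha).

    intended_yield_t_ha : fresh root yield target (t/ha)
    dm_fraction         : root dry matter content (kg DM / kg fresh)
    cbu_kg_kg           : coefficient of biological utilization
                          (kg DM produced per kg nutrient absorbed)
    harvest_index       : fraction of absorbed nutrient exported in roots
    """
    root_dm = intended_yield_t_ha * 1000 * dm_fraction       # kg DM/ha
    plant_requirement = (root_dm / cbu_kg_kg) / harvest_index
    deficit = plant_requirement - soil_supply_kg_ha - residue_supply_kg_ha
    return max(0.0, deficit / recovery_eff)   # correct for fertilizer losses

# Example call with invented numbers for an 80 t/ha winter crop:
rate = recommend_nutrient(80, 0.11, 220, 0.7,
                          soil_supply_kg_ha=40,
                          residue_supply_kg_ha=10,
                          recovery_eff=0.6)
print(f"recommended rate ~ {rate:.0f} kg/ha")
```

The continuous dependence of the output on intended yield and soil supply is what distinguishes this approach from fixed recommendation tables.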
Abstract:
The following report contains district-level numbers for the percentage of juveniles in Iowa's Juvenile Court System and the number who re-offended (subsequent complaint/s). These numbers represent the data available in the ICIS system for initial complaints in 2002 and subsequent complaints. The report also includes recidivism numbers for those counties that had more than 180 initial complaints. These numbers are representative, but are not state totals. All districts are represented, but not all counties. The 5th District only includes Polk County due to a data migration issue that, at this time, does not allow the remaining 5th District counties' data to be analyzed.
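The aggregation behind such a report can be sketched as below, with hypothetical column names and figures (the real source is the ICIS system):

```python
# Sketch of district-level recidivism aggregation with a county-level
# threshold of 180 initial complaints. All figures are hypothetical.
import pandas as pd

complaints = pd.DataFrame({
    "district":   [5, 1, 1, 2, 3],
    "county":     ["Polk", "Linn", "Johnson", "Story", "Scott"],
    "initial":    [400, 190, 120, 90, 250],   # 2002 initial complaints
    "reoffended": [120, 50, 30, 20, 70],      # with subsequent complaint(s)
})

# District-level recidivism rates.
by_district = complaints.groupby("district")[["initial", "reoffended"]].sum()
by_district["recidivism_rate"] = (by_district["reoffended"]
                                  / by_district["initial"])
print(by_district)

# County-level rates only where initial complaints exceed 180.
by_county = complaints.set_index(["district", "county"])
print(by_county[by_county["initial"] > 180])
```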