976 results for cloud computing, cloud federation, concurrent live migration, data center, qemu, kvm, libvirt
Resumo:
Wilbur Zelinsky formulated a Hypothesis of the Mobility Transition in 1971, in which he tried to relate all aspects of mobility to the Demographic Transition and modernisation. This dissertation applies the theoretical framework proposed by Zelinsky, extended to encompass a family of transitions, to understand migration patterns of city regions. The two city regions, Brisbane and Stockholm, are selected as case studies, representing important city regions of similar size but drawn from contrasting historical settings. A comparison of the case studies with the theoretical framework aims to determine how the relative contributions of net migration, the source areas of migrants, and the migration intensity change with modernisation. In addition, the research also aims to identify aspects of modernisation affecting migration. These aspects of migration are analysed with a "historical approach" and a "multivariate approach". An extensive investigation into the city regions' historical background provides the source from which evidence for a relationship between migration and modernisation is extracted. With this historical approach, similarities and differences in migration patterns are identified. The other research approach analyses multivariate data, from the last two decades, on migration flows and modernisation. Correlations between migration and key aspects of modernisation are tested with multivariate regression, based on an alternative version of a spatial interaction model. The project demonstrates that the changing functions of cities and structural modernisation are influential on migration. Similar patterns are found regarding the relative contributions of net migration and natural increase to population growth. The research finds links between these changes in the relative contribution of net migration and demographic modernisation. The findings on variations in urban and rural source areas of migrants to city regions do not contradict the expected pattern, but data limitations prevent definite conclusions from being drawn. The assessment of variations in migration intensity did not support the expected pattern. Based on Swedish data, the hypothesised increase in migration intensity is rejected. International migration data also show patterns different from those derived from the theoretical framework. The findings from both research approaches suggest that structural modernisation affected migration flows more than demographic modernisation. The findings lead to a formulation of hypothesised patterns for migration to city regions. The study provides an important research contribution by applying the two research approaches to city regions. It also combines the study of internal and international migration to address the research objectives within a framework of transitional change.
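For reference, a minimal sketch of the gravity-type form commonly used for spatial interaction models of migration (the dissertation uses an alternative version; the symbols and the log-linear regression form below are generic assumptions, not taken from the thesis):

```latex
\[
  M_{ij} = k \, P_i^{\alpha} \, P_j^{\beta} \, d_{ij}^{-\gamma}
  \quad\Longrightarrow\quad
  \ln M_{ij} = \ln k + \alpha \ln P_i + \beta \ln P_j - \gamma \ln d_{ij} + \varepsilon_{ij}
\]
```

Here M_ij is the migration flow from origin i to destination j, P_i and P_j are origin and destination populations, d_ij is the distance between them, and the coefficients are estimated by multivariate (log-linear) regression.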
Resumo:
Saipan, situated about 15° N. and 146° E., is one of the larger and more southerly of the Mariana Islands. The 15 small islands of this chain are strung along an eastwardly convex ridge for more than 400 miles north to south, midway between Honshu and New Guinea and about 1,200 miles east of the Philippines. Paralleling this ridge 60 to 100 miles further east is a deep submarine trench, beyond which lies the Pacific Basin proper. To the west is the Philippine Sea, generally deeper than 2,000 fathoms. The trench coincides with a zone of negative gravity anomalies, earthquake foci occur at increasing depths westward from it, and silica- and alumina-rich volcanic rocks characterize the emergent island chain itself. The contrast between these features and those of the Pacific Basin proper to the east is held to favor the conclusion that the Mariana island arc and trench define the structural and petrographic front of Asia.
Resumo:
Nowadays reinforcement learning has proved to be very effective in machine learning across a variety of fields, such as games, speech recognition and many others. We therefore decided to apply reinforcement learning to allocation problems, since they are a research area not yet studied with this technique, and because their formulation encompasses a broad set of sub-problems with similar characteristics, so that a solution for one of them extends to each of these sub-problems. In this project we built an application called Service Broker which, through reinforcement learning, learns how to distribute the execution of tasks over asynchronous, distributed workers. The analogy is that of a cloud data center, which owns internal resources, possibly distributed across the server farm, receives tasks from its clients and executes them on those resources. The goal of the application, and hence of the data center, is to allocate these tasks so as to minimize the execution cost. Furthermore, in order to test the reinforcement learning agents that were developed, an environment, a simulator, was created that makes it possible to concentrate on developing the components the agents need, instead of also having to deal with the implementation aspects required in a real data center, such as the communication with the various nodes and its latency. The results obtained confirmed the theory studied, achieving better performance than some of the classical task allocation methods.
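As an illustration of the approach described above, here is a minimal, hedged sketch of a cost-minimizing allocation agent in a toy simulator. The names (ToyBrokerEnv, train_broker) and the epsilon-greedy value-learning scheme are invented for this example; the actual Service Broker and its environment are more elaborate (distributed workers, task queues, richer state).

```python
import random

random.seed(0)

class ToyBrokerEnv:
    """Toy stand-in for the simulator: each worker has a hidden mean per-task
    cost, and assigning a task to it returns that cost plus noise."""
    def __init__(self, worker_costs, seed=0):
        self.worker_costs = worker_costs
        self.rng = random.Random(seed)

    def step(self, worker):
        return self.worker_costs[worker] + self.rng.gauss(0.0, 0.1)

def train_broker(env, n_workers, episodes=5000, eps=0.1, lr=0.05):
    """Epsilon-greedy learning of each worker's expected cost; tasks are then
    routed to the cheapest worker."""
    q = [0.0] * n_workers
    for _ in range(episodes):
        if random.random() < eps:
            w = random.randrange(n_workers)              # explore
        else:
            w = min(range(n_workers), key=lambda i: q[i])  # exploit cheapest estimate
        cost = env.step(w)
        q[w] += lr * (cost - q[w])                       # incremental mean of observed cost
    return q

env = ToyBrokerEnv([1.0, 0.4, 0.7])
print(train_broker(env, 3))   # the agent should learn that worker 1 is cheapest
```

With exploration, the estimated cost of each worker converges to its true mean, so the greedy policy routes tasks to the cheapest worker; this is the abstract's objective (minimizing execution cost) in its simplest possible form.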
Resumo:
Many organizations throughout the world own facilities of this kind; in Portugal we have the example of Portugal Telecom, which recently inaugurated its Data Center in Covilhã. The development of a Data Center therefore demands a very careful design, which, among other aspects, must guarantee the security of the information and the safety of the facilities themselves, particularly with regard to fire safety.
Resumo:
The goal of this project was the historical study of Patagonia, the southernmost human-settled area of the world. Our aim has been the analysis of those societies to understand how they maintained their socioeconomic system, that is to say, how social reproduction allowed them to resolve the basic contradiction between production and reproduction that is characteristic of all hunter-gatherer societies. New computational technologies have allowed a better integration of the large amount and diversity of data needed. We have analyzed the historical evolution of inter-group and intra-group social relationships among nomadic Patagonian societies. We have compared the different social responses to the radical change in their immediate reality produced by European colonization. We have exhaustively analyzed the results of archaeological excavations from different regions. We have also integrated information from anthropology, ethnohistory and geography. We are developing computer programs to integrate current geographical data and paleoenvironmental data about the dynamic nature of soil formation, vegetation, hydrography, etc., with information about the history of productive action and how products were distributed among different human groups.
Resumo:
Of the approximately 25,000 bridges in Iowa, 28% are classified as structurally deficient, functionally obsolete, or both. Because many Iowa bridges require repair or replacement with a relatively limited funding base, there is a need to develop new bridge materials that may lead to longer life spans and reduced life-cycle costs. In addition, new and effective methods for determining the condition of structures are needed to identify when the useful life has expired or other maintenance is needed. Due to its unique alloy blend, high-performance steel (HPS) has been shown to have better weldability, weathering capabilities, and fracture toughness than conventional structural steels. Since the development of HPS in the mid-1990s, numerous bridges using HPS girders have been constructed, and many have been economically built. The East 12th Street Bridge, which replaced a deteriorated box girder bridge, is Iowa’s first bridge constructed using HPS girders. The new structure is a two-span bridge that crosses I-235 in Des Moines, Iowa, providing one lane of traffic in each direction. A remote, continuous, fiber-optic based structural health monitoring (SHM) system for the bridge was developed using off-the-shelf technologies. In the system, sensors strategically located on the bridge collect raw strain data and then transfer the data via wireless communication to a gateway system at a nearby secure facility. The data are integrated and converted to text files before being uploaded automatically to a website that provides live strain data and a live video stream. A data storage/processing system at the Bridge Engineering Center in Ames, Iowa, permanently stores and processes the data files. Several processes are performed to check the overall system’s operation, eliminate temperature effects from the complete strain record, compute the global behavior of the bridge, and count strain cycles at the various sensor locations.
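As a rough illustration of two of the processing steps mentioned (removing temperature effects and counting strain cycles), here is a hedged Python sketch. The function names, the moving-average detrending, and the threshold-based half-cycle count are simplifications invented for this example; the actual system at the Bridge Engineering Center may use different methods (for instance full rainflow counting).

```python
import numpy as np

def remove_temperature_effects(strain, window=600):
    """Subtract a slow moving-average baseline from the raw strain record,
    leaving the short-term, traffic-induced response. A crude stand-in for
    the thermal-compensation step described in the abstract."""
    kernel = np.ones(window) / window
    baseline = np.convolve(strain, kernel, mode="same")
    return strain - baseline

def count_half_cycles(strain, threshold=5.0):
    """Simplified cycle counting: locate turning points (sign changes of the
    slope) and count a half-cycle whenever the range between successive
    turning points exceeds the threshold (e.g. in microstrain)."""
    d = np.sign(np.diff(strain))
    turns = np.where(np.diff(d) != 0)[0] + 1
    extremes = strain[np.r_[0, turns, len(strain) - 1]]
    return int(np.count_nonzero(np.abs(np.diff(extremes)) >= threshold))

# Synthetic record: slow thermal drift plus a few sharp truck-passage events.
t = np.arange(0, 86400, 1.0)
thermal = 30 * np.sin(2 * np.pi * t / 86400)
traffic = np.zeros_like(t)
traffic[::7200] = 40.0
record = thermal + traffic + np.random.default_rng(0).normal(0, 0.5, t.size)
live = remove_temperature_effects(record)
print(count_half_cycles(live, threshold=20.0))
```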
Resumo:
Numerical weather prediction and climate simulation have been among the computationally most demanding applications of high performance computing ever since they were started in the 1950's. Since the 1980's, the most powerful computers have featured an ever larger number of processors. By the early 2000's, this number is often several thousand. An operational weather model must use all these processors in a highly coordinated fashion. The critical resource in running such models is not computation, but the amount of necessary communication between the processors. The communication capacity of parallel computers often falls far short of their computational power. The articles in this thesis cover fourteen years of research into how to harness thousands of processors on a single weather forecast or climate simulation, so that the application can benefit as much as possible from the power of parallel high performance computers. The results attained in these articles have already been widely applied, so that currently most of the organizations that carry out global weather forecasting or climate simulation anywhere in the world use methods introduced in them. Some further studies extend parallelization opportunities into other parts of the weather forecasting environment, in particular to data assimilation of satellite observations.
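To make the communication argument concrete, here is a small back-of-the-envelope sketch (the grid size and processor counts are arbitrary examples, not figures from the articles): with a 2D domain decomposition, each processor updates its own subdomain but must exchange halo rows and columns with its neighbours every step, and the halo-to-interior ratio grows as the subdomains shrink.

```python
def comm_to_comp_ratio(nx, ny, px, py, halo=1):
    """Crude communication/computation proxy for a 2D domain decomposition:
    halo points exchanged with the four neighbours per step, divided by the
    interior points each processor updates per step."""
    sx, sy = nx // px, ny // py              # subdomain size on one processor
    halo_points = 2 * halo * (sx + sy)       # one halo row/column per side
    interior_points = sx * sy
    return halo_points / interior_points

# Hypothetical global grid of 2048 x 1024 points, square processor layouts.
for procs in (64, 1024, 16384):
    p = int(procs ** 0.5)
    print(f"{procs:6d} processors: ratio = {comm_to_comp_ratio(2048, 1024, p, p):.3f}")
```

In this toy setting the ratio grows by more than an order of magnitude between 64 and 16384 processors, which is one way of seeing why communication, rather than computation, becomes the critical resource at scale.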
Resumo:
BACKGROUND: The objectives of this study were to determine the proportions of psychiatric and substance use disorders suffered by emergency departments' (EDs') frequent users compared to the mainstream ED population, to evaluate how effectively these disorders were diagnosed in both groups of patients by ED physicians, and to determine if these disorders were predictive of a frequent use of ED services. METHODS: This study is a cross-sectional study with concurrent and retrospective data collection. Between November 2009 and June 2010, patients' mental health and substance use disorders were identified prospectively in face-to-face research interviews using a screening questionnaire (i.e. researcher screening). These data were compared to the data obtained from a retrospective medical chart review performed in August 2011, searching for mental health and substance use disorders diagnosed by ED physicians and recorded in the patients' ED medical files (i.e. ED physician diagnosis). The sample consisted of 399 eligible adult patients (≥18 years old) admitted to the urban, general ED of a University Hospital. Among them, 389 patients completed the researcher screening. Two hundred and twenty frequent users, defined by >4 ED visits in the previous twelve months, were included and compared to 169 patients with ≤4 ED visits in the same period (control group). RESULTS: Researcher screening showed that ED frequent users were more likely than members of the control group to have an anxiety disorder, a depressive disorder, or post-traumatic stress disorder (PTSD), or to suffer from alcohol or illicit drug abuse/addiction. Reviewing the ED physician diagnoses, we found that the proportions of mental health and substance use disorders diagnosed by ED physicians were low both among ED frequent users and in the control group. Using multiple logistic regression analyses to predict frequent ED use, we found that ED patients who screened positive for psychiatric disorders only and those who screened positive for both psychiatric and substance use disorders were more likely to be ED frequent users compared to ED patients with no disorder. CONCLUSIONS: This study found high proportions of screened mental health and/or substance use disorders in ED frequent users, but it showed low rates of detection of such disorders in day-to-day ED activities, which can be a cause for concern. Active screening for these disorders in this population, followed by an intervention and/or a referral for treatment by a case-management team, may constitute a relevant intervention for integration into a general ED setting.
Resumo:
Poster at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Resumo:
Currently, power generation is one of the most significant aspects of life for all of mankind. One can barely imagine our life without electricity and thermal energy. Thus, different technologies for producing those types of energy need to be used. Each of those technologies will always have its own advantages and disadvantages. Nevertheless, every technology must satisfy such requirements as efficiency, ecological safety and reliability. When power is generated using nuclear energy, these requirements must be maintained particularly strictly, especially since accidents at nuclear power plants may cause very long-term, deadly consequences. In order to prevent possible disasters related to accidents at nuclear power plants, strong and powerful algorithms have been developed in recent decades. Such algorithms are able to perform calculations of different physical processes and phenomena of real facilities. However, the results acquired by the computations must be verified against experimental data.
Resumo:
The integration of ICT has grown considerably in recent years, and researchers around the world attach ever-increasing importance to it; the subject of ICT in education has thus been widespread in the literature for several years now (Istance & Kools, 2013; Storz & Hoffman, 2013). In a world where technologies are omnipresent in most spheres of activity, the question is no longer whether technologies should be integrated into teaching and learning activities, but how they should be. Since ICT offers many advantages, notably regarding academic motivation and the reduction of the digital divide, the various stakeholders in education are generally aware of the importance of making good use of information and communication technologies (ICT) in education, but do not always know where to begin. This research focuses on a particular form of ICT integration in education: one-to-one laptop projects. Laptop projects are distinguished by the fact that the teacher and each student have their own laptop computer for pedagogical use. This doctoral thesis attempts to detail, in clear and accessible language, the challenges that can be encountered within such projects, as well as what can be done to limit their impact. In order to determine the conditions that can foster the overall success of laptop projects in Quebec and elsewhere, an exhaustive literature review identified four main categories of factors into which all of the identified challenges appear to fall: factors related to project management, factors internal to the teacher, factors related to the working environment, and factors related to infrastructure and equipment. These categories of factors are discussed in detail in the theoretical framework of this doctoral thesis. To meet the objectives, a questionnaire was developed, and more than 300 teachers from a school board running a large-scale laptop project responded to it. The mixed data (quantitative and qualitative) were analysed using specialised software, which made it possible to verify the relevance of the elements found in the literature review and to discover new ones. It was found that many challenges are likely to be encountered. The most important concern the quality of the equipment used, the importance of teacher training with regard to ICT, and the importance of developing a clear vision that ensures the full buy-in of teachers. It was also determined that teachers must have easy access to both pedagogical and technical support. Finally, it was found that the nature of large-scale projects makes it important to pay particular attention to teachers' local needs, which may vary according to their working context.
Resumo:
The service quality of any sector has two major aspects, namely technical and functional. Technical quality can be attained by maintaining the technical specifications decided by the organization. Functional quality refers to the manner in which the service is delivered to the customer, which can be assessed through customer feedback. A field survey was conducted based on the management tool SERVQUAL, by designing 28 constructs under 7 dimensions of service quality. Stratified sampling techniques were used to obtain 336 valid responses, and the gap scores between expectations and perceptions were analyzed using statistical techniques to identify the weakest dimension. To assess the technical aspect of availability, six months of live outage data from base transceiver stations were collected. Statistical and exploratory techniques were used to model the network performance. The failure patterns were modeled with competing-risk models, and the probability distributions of service outage and restoration were parameterized. Since the availability of the network is a function of the reliability and maintainability of the network elements, any service provider who wishes to keep up their service level agreements on availability should be aware of the variability of these elements and the effects of their interactions. The availability variations were studied by designing a discrete-time event simulation model with probabilistic input parameters. The probability distribution parameters derived from the live data analysis were used to design experiments to define the availability domain of the network under consideration. The availability domain can be used as a reference for planning and implementing maintenance activities. A new metric is proposed which incorporates a consistency index along with key service parameters and can be used to compare the performance of different service providers. The developed tool can be used for reliability analysis of mobile communication systems and assumes greater significance in the wake of mobile number portability. It also makes it possible to obtain a relative measure of the effectiveness of different service providers.
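As a hedged illustration of the availability simulation described above, here is a minimal Monte Carlo sketch of a single network element alternating between exponentially distributed up and down periods. The function name and the choice of exponential distributions are assumptions for this example; the thesis parameterizes the actual outage and restoration distributions from live data and models competing risks.

```python
import random

def simulate_availability(mtbf_hours, mttr_hours, horizon_hours=24 * 365, seed=1):
    """Monte Carlo sketch of one network element alternating between 'up'
    periods (exponential time to failure) and 'down' periods (exponential
    time to restore). Returns the simulated availability over the horizon."""
    rng = random.Random(seed)
    t, uptime, downtime = 0.0, 0.0, 0.0
    while t < horizon_hours:
        ttf = rng.expovariate(1.0 / mtbf_hours)    # time to next failure
        uptime += min(ttf, horizon_hours - t)
        t += ttf
        if t >= horizon_hours:
            break
        ttr = rng.expovariate(1.0 / mttr_hours)    # time to restore service
        downtime += min(ttr, horizon_hours - t)
        t += ttr
    return uptime / (uptime + downtime)

mtbf, mttr = 500.0, 4.0
print(simulate_availability(mtbf, mttr))           # simulated availability
print(mtbf / (mtbf + mttr))                        # analytic steady-state value
```

For exponential times the simulated value should approach the textbook steady-state availability MTBF / (MTBF + MTTR), which gives a quick sanity check before richer distributions and maintenance policies are introduced.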
Resumo:
The fast increase in the size and number of databases demands data mining approaches that are scalable to large amounts of data. This has led to the exploration of parallel computing technologies in order to perform data mining tasks concurrently using several processors. Parallelization seems to be a natural and cost-effective way to scale up data mining technologies. One of the most important of these data mining technologies is the classification of newly recorded data. This paper surveys advances in parallelization in the field of classification rule induction.
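As a toy illustration of data-parallel classification rule induction (horizontal partitioning, one candidate rule set induced per partition, then merged), here is a hedged Python sketch. The 1R-style "majority class per attribute value" rules and the merge-by-vote step are deliberately simplistic stand-ins, not any specific algorithm from the survey.

```python
from multiprocessing import Pool
from collections import Counter

def induce_rules(partition):
    """Toy rule induction on one data partition: for each (attribute, value)
    pair, predict the majority class observed with it (a 1R-style rule set)."""
    counts = {}
    for features, label in partition:
        for attr, value in enumerate(features):
            counts.setdefault((attr, value), Counter())[label] += 1
    return {key: cnt.most_common(1)[0][0] for key, cnt in counts.items()}

def merge_rules(rule_sets):
    """Merge per-partition rule sets by majority vote across partitions."""
    votes = {}
    for rules in rule_sets:
        for key, label in rules.items():
            votes.setdefault(key, Counter())[label] += 1
    return {key: cnt.most_common(1)[0][0] for key, cnt in votes.items()}

if __name__ == "__main__":
    data = [((0, "x"), "A"), ((1, "y"), "B"), ((0, "x"), "A"), ((1, "x"), "B")] * 1000
    partitions = [data[i::4] for i in range(4)]   # 4 horizontal partitions
    with Pool(4) as pool:                         # rules induced concurrently
        rule_sets = pool.map(induce_rules, partitions)
    print(merge_rules(rule_sets))
```

The pattern (partition the data, induce locally, combine globally) is the basic shape of the data-parallel approaches such a survey covers; real algorithms differ mainly in how rules are grown and in how the combination step preserves accuracy.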
Resumo:
With the prospect of exascale computing, computational methods requiring only local data become especially attractive. Consequently, the typical domain decomposition of atmospheric models means horizontally-explicit vertically-implicit (HEVI) time-stepping schemes warrant further attention. In this analysis, Runge-Kutta implicit-explicit schemes from the literature are analysed for their stability and accuracy using a von Neumann stability analysis of two linear systems. Attention is paid to the numerical phase to indicate the behaviour of phase and group velocities. Where the analysis is tractable, analytically derived expressions are considered. For more complicated cases, amplification factors have been numerically generated and the associated amplitudes and phase diagnosed. Analysis of a system describing acoustic waves has necessitated attributing the three resultant eigenvalues to the three physical modes of the system. To do so, a series of algorithms has been devised to track the eigenvalues across the frequency space. The result enables analysis of whether the schemes exactly preserve the non-divergent mode; and whether there is evidence of spurious reversal in the direction of group velocities or asymmetry in the damping for the pair of acoustic modes. Frequency ranges that span next-generation high-resolution weather models to coarse-resolution climate models are considered; and a comparison is made of errors accumulated from multiple stability-constrained shorter time-steps from the HEVI scheme with a single integration from a fully implicit scheme over the same time interval. Two schemes, “Trap2(2,3,2)” and “UJ3(1,3,2)”, both already used in atmospheric models, are identified as offering consistently good stability and representation of phase across all the analyses. Furthermore, according to a simple measure of computational cost, “Trap2(2,3,2)” is the least expensive.
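As a hedged illustration of the amplification-factor machinery used in a von Neumann analysis, here is a sketch for the simplest IMEX pairing, forward-backward Euler, applied to a linear two-frequency test equation. This is not one of the Runge-Kutta IMEX schemes analysed in the thesis (such as "Trap2(2,3,2)" or "UJ3(1,3,2)"); it only shows how amplitude and phase errors are diagnosed from an amplification factor.

```python
import numpy as np

def imex_euler_amplification(omega_slow, omega_fast, dt):
    """Amplification factor of one forward-backward (IMEX) Euler step applied
    to y' = i*omega_slow*y + i*omega_fast*y, treating the slow term explicitly
    and the fast term implicitly:
        y_new = y + dt * (i*omega_slow*y + i*omega_fast*y_new)."""
    return (1.0 + 1j * omega_slow * dt) / (1.0 - 1j * omega_fast * dt)

dt = 1.0
for ws, wf in [(0.1, 0.5), (0.1, 1.0), (0.2, 2.0)]:
    A = imex_euler_amplification(ws, wf, dt)
    exact_phase = (ws + wf) * dt   # exact solution advances by exp(i*(ws+wf)*dt)
    print(f"ws={ws:.1f} wf={wf:.1f} |A|={abs(A):.3f} "
          f"phase error={np.angle(A) - exact_phase:+.3f}")
```

The output shows the implicitly treated fast mode being damped (|A| < 1) while the numerical phase lags the exact solution; this is the kind of amplitude and phase behaviour the thesis diagnoses across frequency ranges for the full schemes.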
Resumo:
The purpose of this thesis is to present and describe which criteria, according to the system family Configuration Management, should be met when developing a CM-tool to handle migration data. ACT is a tool developed by Microsoft to gather information about, analyze, test and mitigate applications in a network when migrating the IT infrastructure of an organization to a new operating system. The organization that is being studied wants to present the data about the analyzed applications in such a way that a customer can choose what to mitigate and migrate. The goal is therefore to develop a prototype (CM-tool) that will present this data. The study has shown that ACT lacks certain requirements stated by the organization when it comes to presentation, but when it comes to the rest of the functions, ACT performs as expected. The investigation resulted in specifications and a technical solution for the new CM-tool. CM criteria for migration data were put forth, and parts of the prototype were also developed.