894 results for consolidamento algoritmo cloud macchine virtuali


Relevance: 20.00%

Abstract:

This article notes that the algorithm aims to fill an existing gap in sensitivity analysis for Linear Programming. Such analyses traditionally cover every coefficient of the system except the technical coefficients of the basic variables, owing to the difficulty of computing the inverse of the basis once a parameter has been introduced into one of its elements.
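
As background (not part of the abstract), the computational obstacle it mentions has a standard workaround: when a parameter theta perturbs a single element (i, j) of the basis matrix B, the perturbed inverse follows from the Sherman-Morrison formula without refactorizing. A minimal Python sketch, with illustrative values:

    import numpy as np

    def perturbed_basis_inverse(B_inv, theta, i, j):
        # Sherman-Morrison: (B + theta * e_i e_j^T)^{-1}
        #   = B^{-1} - theta * (B^{-1} e_i)(e_j^T B^{-1}) / (1 + theta * B^{-1}[j, i])
        denom = 1.0 + theta * B_inv[j, i]
        if abs(denom) < 1e-12:
            raise ValueError("perturbation makes the basis singular")
        return B_inv - theta * np.outer(B_inv[:, i], B_inv[j, :]) / denom

    # Quick check against direct inversion (illustrative 3x3 basis).
    B = np.array([[2.0, 1.0, 0.0], [0.0, 3.0, 1.0], [1.0, 0.0, 2.0]])
    theta, i, j = 0.5, 0, 2                  # perturb element B[0, 2]
    B_pert = B.copy()
    B_pert[i, j] += theta
    fast = perturbed_basis_inverse(np.linalg.inv(B), theta, i, j)
    assert np.allclose(fast, np.linalg.inv(B_pert))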

Relevance: 20.00%

Abstract:

Marine stratocumulus clouds are generally optically thick and shallow, exerting a net cooling influence on climate. Changes in atmospheric aerosol levels alter cloud microphysics (e.g., droplet size) and cloud macrophysics (e.g., liquid water path, cloud thickness), thereby affecting cloud albedo and Earth's radiative balance. To understand aerosol-cloud-precipitation interactions and to explore their dynamical effects, three-dimensional large-eddy simulations (LES) with detailed bin-resolved microphysics are performed to examine the diurnal variation of marine stratocumulus clouds under different aerosol levels and environmental conditions. It is shown that marine stratocumulus cloud albedo is sensitive to aerosol perturbations under clean background conditions, and to environmental conditions such as the large-scale divergence rate and free tropospheric humidity.
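
For orientation (not from the abstract), the sensitivity to aerosol perturbation under clean conditions is often summarized by the Twomey albedo susceptibility at fixed liquid water path; a minimal sketch:

    def albedo_susceptibility(A, N):
        # Twomey susceptibility at fixed liquid water path:
        # dA/dN = A * (1 - A) / (3 * N)   (Platnick & Twomey, 1994)
        return A * (1.0 - A) / (3.0 * N)

    # A cloud with albedo 0.5 and 50 drops/cm^3 brightens by ~0.0017 per
    # added drop/cm^3; clean (low-N) clouds are thus the most susceptible.
    print(albedo_susceptibility(A=0.5, N=50.0))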

Based on the in-situ Eastern Pacific Emitted Aerosol Cloud Experiment (E-PEACE) during July and August 2011, and on A-Train satellite observations of 589 individual ship tracks during June 2006 to December 2009, an analysis of cloud albedo responses in ship tracks is presented. It is found that the albedo response in ship tracks depends on the mesoscale cloud structure, the free tropospheric humidity, and the cloud top height. Under a closed-cell structure (i.e., cloud cells ringed by a perimeter of clear air), with sufficiently dry air above cloud tops and/or higher cloud top heights, the cloud albedo can become lower in ship tracks. Based on the satellite data, nearly 25% of ship tracks exhibited a decreased albedo. The cloud macrophysical responses are crucial in determining both the strength and the sign of the cloud albedo response to aerosols.

To understand aerosol indirect effects on global marine warm clouds, multi-sensor satellite observations (CloudSat, MODIS, CALIPSO, AMSR-E, and CERES), together with ECMWF and NCEP meteorological analyses, have been applied to study the sensitivity of cloud properties to aerosol levels and to large-scale environmental conditions. With an estimate of the anthropogenic aerosol fraction, the global aerosol indirect radiative forcing has been assessed.

As the coupling among aerosol, cloud, precipitation, and meteorological conditions in the marine boundary layer is complex, the integration of LES modeling, in-situ aircraft measurements, and global multi-sensor satellite data analyses improves our understanding of this complex system.

Relevance: 20.00%

Abstract:

This thesis describes a compositional framework for developing situation awareness applications: applications that provide ongoing information about a user's changing environment. It describes how the framework is used to develop a situation awareness application for earthquakes. The applications are implemented as Cloud computing services connected to sensors and actuators. The architecture and design of the Cloud services are described, and measurements of performance metrics are provided. The thesis includes results of experiments on earthquake monitoring conducted over a year. The applications developed with the framework are (1) the CSN, the Community Seismic Network, which uses relatively low-cost sensors deployed by members of the community, and (2) SAF, the Situation Awareness Framework, which integrates data from multiple sources: the CSN; the CISN, the California Integrated Seismic Network, a network of high-quality seismometers deployed carefully by professionals in the CISN organization across Southern California; and prototypes of multi-sensor platforms that include carbon monoxide, methane, dust, and radiation sensors.
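
As an illustration of the sensor-to-cloud pattern described (a sketch under assumed details, not the CSN's actual API; the endpoint and field names are hypothetical), a low-cost sensor client might detect events locally with a short-term/long-term average trigger and report picks to a cloud aggregation service:

    import json
    import urllib.request

    import numpy as np

    PICK_URL = "https://csn-example.appspot.com/api/picks"  # hypothetical endpoint

    def sta_lta(accel, fs, sta_s=1.0, lta_s=30.0):
        # Classic short-term-average / long-term-average event detector.
        power = np.asarray(accel, dtype=float) ** 2
        sta = np.convolve(power, np.ones(int(sta_s * fs)) / (sta_s * fs), "same")
        lta = np.convolve(power, np.ones(int(lta_s * fs)) / (lta_s * fs), "same")
        return sta / np.maximum(lta, 1e-12)

    def report_pick(sensor_id, t_pick, ratio):
        # Send one pick to the cloud aggregator; the heavy lifting (event
        # association, magnitude estimation, alerting) happens server-side.
        body = json.dumps({"sensor": sensor_id, "t": t_pick, "ratio": ratio}).encode()
        req = urllib.request.Request(PICK_URL, data=body,
                                     headers={"Content-Type": "application/json"})
        urllib.request.urlopen(req, timeout=5)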

Relevance: 20.00%

Abstract:

This thesis focuses on improving the simulation skill and theoretical understanding of the subtropical low-cloud response to climate change.

First, an energetically consistent forcing framework is designed and implemented for large eddy simulation (LES) of the low-cloud response to climate change. The three representative present-day subtropical low cloud regimes of cumulus (Cu), cumulus-over-stratocumulus, and stratocumulus (Sc) are all well simulated with this framework, and the results are comparable to the conventional fixed-SST approach. However, the cumulus response to climate warming subject to energetic constraints differs significantly from the conventional approach with fixed SST. Under the energetic constraint, the subtropics warm less than the tropics, since longwave (LW) cooling is more efficient in the drier subtropical free troposphere. The surface latent heat flux (LHF) also increases only weakly subject to the surface energetic constraint. Both factors contribute to an increased estimated inversion strength (EIS) and a decreased inversion height. The decreased Cu depth contributes to a decrease in liquid water path (LWP) and a weakly positive cloud feedback. The conventional fixed-SST approach instead simulates a strong increase in LHF and a deepening of the Cu layer, leading to a weakly negative cloud feedback. This illustrates the importance of energetic constraints to simulating and understanding the sign and magnitude of low-cloud feedback.
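
For reference (the definition is due to Wood and Bretherton (2006) and is not restated in the abstract), the estimated inversion strength is

    LTS = \theta_{700} - \theta_{0}, \qquad
    EIS = LTS - \Gamma_m^{850}\,(z_{700} - z_{LCL})

where \theta_{700} and \theta_{0} are the potential temperatures at 700 hPa and the surface, \Gamma_m^{850} is the moist-adiabatic lapse rate at 850 hPa, z_{700} is the height of the 700 hPa level, and z_{LCL} is the lifting condensation level.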

Second, an extended eddy-diffusivity mass-flux (EDMF) closure for the unified representation of sub-grid scale (SGS) turbulence and convection processes in general circulation models (GCMs) is presented. The inclusion of prognostic terms and the elimination of the infinitesimal-updraft-fraction assumption make it more flexible for implementation in models across different scales. The framework can be consistently extended to multiple updrafts and downdrafts, as well as variances and covariances. It has been verified against LES in different boundary layer regimes in the current climate, and further development and implementation of this closure may help to improve simulation skill and the understanding of low-cloud feedback in GCMs.
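
For context (the standard EDMF form, e.g. Siebesma et al. (2007), not restated in the abstract), the closure decomposes the sub-grid vertical flux of a scalar \phi into a local diffusive part and a non-local updraft part:

    \overline{w'\phi'} \approx -K\,\frac{\partial \bar{\phi}}{\partial z} + M\,(\phi_u - \bar{\phi})

where K is an eddy diffusivity, M the convective mass flux, and \phi_u the updraft value; the extension described sums such mass-flux terms over multiple updrafts and downdrafts.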

Relevance: 20.00%

Abstract:

This work presents an automated algorithm whose output is the set of optimal gains for the control loop of a parallel kinematic mechanism. It has been applied specifically to the 5R mechanism, although the method is valid for any other mechanism once the corresponding mechatronic model is supplied. It also provides a procedure for choosing, in the future, the most appropriate motor and gearbox combination for a given mechanism, avoiding oversized purchases such as the one made for the mechanism in question.
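
To make the idea concrete (a minimal sketch under assumed dynamics, not the paper's mechatronic model of the 5R mechanism; all values are illustrative), the gains of a PD loop on a second-order joint model can be tuned automatically by minimizing a step-response cost:

    import numpy as np
    from scipy.optimize import minimize

    # Illustrative second-order joint model: J*q'' + b*q' = tau.
    J, b = 0.05, 0.2      # inertia [kg m^2], viscous friction [N m s/rad]
    dt, T = 1e-3, 2.0     # integration step and horizon [s]
    q_ref = 1.0           # step reference [rad]

    def itae_cost(gains):
        # Simulate a PD-controlled step response; return the ITAE criterion.
        kp, kd = gains
        q, qd, cost = 0.0, 0.0, 0.0
        for k in range(int(T / dt)):
            e = q_ref - q
            tau = kp * e - kd * qd        # PD control law
            qd += (tau - b * qd) / J * dt
            q += qd * dt
            cost += (k * dt) * abs(e) * dt
        return cost

    res = minimize(itae_cost, x0=[5.0, 0.5], method="Nelder-Mead")
    print("tuned gains kp, kd:", res.x)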

Relevance: 20.00%

Abstract:

STEEL, the Caltech-created nonlinear large-displacement analysis software, is currently used by a large number of researchers at Caltech. However, due to its complexity and lack of visualization tools (such as pre- and post-processing capabilities), rapid creation and analysis of models with this software was difficult. SteelConverter was created as a means to facilitate model creation through the use of the industry-standard finite element solver ETABS. This software allows users to create models in ETABS and intelligently convert model information such as geometry, loading, releases, and fixity into a format that STEEL understands. Models that would take several days to create and verify now take several hours or less. The productivity of the researcher, as well as the level of confidence in the model being analyzed, is greatly increased.

It has always been a major goal of Caltech to spread the knowledge created here to other universities. However, due to the complexity of STEEL it was difficult for researchers or engineers from other universities to conduct analyses. While SteelConverter did help researchers at Caltech improve their research, sending SteelConverter and its documentation to other universities was less than ideal. Issues of version control, individual computer requirements, and the difficulty of releasing updates made a more centralized solution preferable. This is where the idea for Caltech VirtualShaker was born. Through the creation of a centralized website where users can log in, submit, analyze, and process models in the cloud, all of the major concerns associated with the use of SteelConverter were eliminated. Caltech VirtualShaker allows users to create profiles in which defaults associated with their most commonly run models are saved, and to submit multiple jobs to an online virtual server to be analyzed and post-processed. The creation of this website not only allowed more rapid distribution of the tool, but also gave engineers and researchers without access to powerful computer clusters a means to run computationally intensive analyses without the excessive cost of building and maintaining such a cluster.

In order to increase confidence in the use of STEEL as an analysis system, as well as to verify the conversion tools, a series of comparisons was made between STEEL and ETABS. Six models of increasing complexity, ranging from a cantilever column to a twenty-story moment frame, were analyzed to determine the ability of STEEL to accurately calculate basic model properties, such as elastic stiffness and damping through a free-vibration analysis, as well as more complex structural properties, such as overall structural capacity through a pushover analysis. These analyses showed very strong agreement between the two programs on every aspect of each analysis. However, they also showed the ability of the STEEL analysis algorithm to converge at significantly larger drifts than ETABS when using the more computationally expensive and structurally realistic fiber hinges. Following the ETABS analysis, it was decided to repeat the comparisons in software more capable of highly nonlinear analysis, Perform. These analyses again showed very strong agreement between the two programs in every aspect of each analysis up through instability. However, due to some limitations in Perform, free-vibration analyses for the three-story one-bay chevron-brace frame, the two-bay chevron-brace frame, and the twenty-story moment frame could not be conducted. With the current trend towards ultimate-capacity analysis, the ability to use fiber-based models allows engineers to gain a better understanding of a building's behavior under these extreme load scenarios.
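
As a side note on the free-vibration analyses mentioned above (a generic textbook method, not the thesis procedure), damping can be extracted from two successive peaks of a free-vibration record via the logarithmic decrement:

    import math

    def damping_ratio(x1, x2):
        # Logarithmic decrement between two successive positive peaks.
        delta = math.log(x1 / x2)
        return delta / math.sqrt(4.0 * math.pi ** 2 + delta ** 2)

    print(damping_ratio(1.00, 0.85))  # ~0.026, i.e. about 2.6% of critical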

Following this, a final study was done on Hall's U20 structure [1], in which the structure was analyzed in all three programs and the results compared. The pushover curves from each program were compared and the differences caused by variations in software implementation explained. From this, conclusions can be drawn on the effectiveness of each analysis tool when attempting to analyze structures through the point of geometric instability. The analyses show that while ETABS was capable of accurately determining the elastic stiffness of the model, following the onset of inelastic behavior the analysis tool failed to converge. However, for the small number of time steps during which the ETABS analysis was converging, its results exactly matched those of STEEL, leading to the conclusion that ETABS is not an appropriate analysis package for analyzing a structure through the point of collapse when fiber elements are used throughout the model. The analyses also showed that while Perform was capable of calculating the response of the structure accurately, restrictions in the material model resulted in a pushover curve that did not exactly match that of STEEL, particularly post-collapse. Such problems could, however, be alleviated by choosing a simpler material model.

Relevance: 20.00%

Abstract:

This report addresses the problem of finding an algorithm that constructs a matching between two groups, where a matching means assigning to each individual of each group an individual of the other group. The starting situation of the problem is as follows. There are two groups, the proposers and the proposed, each formed by n individuals, where n is the dimension of the problem. The group of proposers is in charge of making the proposals when the matching is built; the group of the proposed is in charge of receiving and managing those proposals. Each individual of each group orders the individuals of the other group in a list, in decreasing order of preference for being matched with them; we call this the individual's preference list, with remaining single considered the least preferred of the acceptable options. The goal of the problem is to create a matching in which every pair is satisfactory to the individuals who form it, based on the preferences of each one.
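
The setting described is the classic stable matching problem, for which the Gale-Shapley deferred-acceptance procedure is the standard reference algorithm (shown here as a sketch with illustrative names; the report's own algorithm may differ):

    def gale_shapley(proposer_prefs, reviewer_prefs):
        # proposer_prefs: dict proposer -> list of reviewers, best first
        # reviewer_prefs: dict reviewer -> list of proposers, best first
        # Returns a stable matching as dict reviewer -> proposer.
        rank = {r: {p: i for i, p in enumerate(prefs)}
                for r, prefs in reviewer_prefs.items()}
        free = list(proposer_prefs)              # proposers not yet matched
        next_choice = {p: 0 for p in proposer_prefs}
        match = {}                               # reviewer -> proposer
        while free:
            p = free.pop()
            r = proposer_prefs[p][next_choice[p]]
            next_choice[p] += 1
            if r not in match:
                match[r] = p                     # first proposal accepted
            elif rank[r][p] < rank[r][match[r]]: # reviewer prefers newcomer
                free.append(match[r])
                match[r] = p
            else:
                free.append(p)                   # rejected, try next choice
        return match

    pairs = gale_shapley(
        {"a": ["x", "y"], "b": ["y", "x"]},
        {"x": ["a", "b"], "y": ["b", "a"]},
    )
    print(pairs)  # {'y': 'b', 'x': 'a'}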

Relevance: 20.00%

Abstract:

This work studies the feasibility of a parallel implementation of the scale invariant feature transform (SIFT) algorithm for iris identification. The code was implemented using the compute unified device architecture (CUDA) for parallel computing and the OpenGL shading language (GLSL). The algorithm was tested on three eye and iris databases: the noisy visible wavelength iris image database (UBIRIS), Michal-Libor, and CASIA. Tests were carried out to measure the processing time needed to verify whether an individual is present in a database, to determine the efficiency of the search algorithms implemented in GLSL and CUDA, and to find calibration values that improve the placement and distribution of keypoints in the region of interest (the iris) and the robustness of the final program.
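
For comparison with the GPU versions studied (a CPU reference sketch using OpenCV, not the thesis code; the file names are hypothetical), SIFT-based iris matching can be prototyped as keypoint detection plus a ratio-test match count:

    import cv2

    # Hypothetical input files: two segmented iris images to compare.
    img1 = cv2.imread("iris_enrolled.png", cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread("iris_probe.png", cv2.IMREAD_GRAYSCALE)

    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2, None)

    # Lowe's ratio test keeps only distinctive keypoint matches.
    matcher = cv2.BFMatcher()
    good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
            if m.distance < 0.75 * n.distance]
    print(f"{len(good)} good matches")  # a threshold on this count decides identity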

Relevance: 20.00%

Abstract:

Imaging based on computed tomography has revolutionized the diagnosis of diseases in medicine and is widely used in many areas of scientific research. As part of the process of obtaining three-dimensional tomographic images, a set of radiographs is processed by a computational algorithm; the most widely used today is the Feldkamp, Davis and Kress (FDK) algorithm. The use of parallel processing to accelerate computational algorithms, with the different technologies available on the market, has proven useful in reducing processing times. This work presents a parallelization of the FDK three-dimensional image reconstruction algorithm using graphics processing units (GPUs) and the CUDA-C language. GPUs are presented as a viable option for parallel computing, and the introductory concepts of computed tomography, GPUs, CUDA-C, and parallel processing are covered. The parallel version of the FDK algorithm running on the GPU is compared with a serial version of the same algorithm and shows higher processing speed. Performance tests were run on two GPUs of different capacities: an NVIDIA GeForce 9400 GT card (16 cores) and an NVIDIA Quadro 2000 card (192 cores).
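
For orientation (a schematic NumPy sketch, not the thesis implementation), FDK is essentially a weighted filtered backprojection: each projection is cosine-weighted, ramp filtered, and backprojected along the cone-beam rays. Both the filtering and the voxel-wise backprojection are independent per projection or per voxel, which is what maps well to GPUs. The filtering step alone:

    import numpy as np

    def ramp_filter(projection):
        # Apply the Ram-Lak (ramp) filter to one 2D detector array;
        # filtering runs along each detector row (axis 1).
        n = projection.shape[1]
        ramp = np.abs(np.fft.fftfreq(n))      # |f| in cycles/sample
        spectrum = np.fft.fft(projection, axis=1) * ramp
        return np.real(np.fft.ifft(spectrum, axis=1))

    # In FDK, each cosine-weighted projection is filtered like this and then
    # backprojected; each projection (and each voxel) is independent, hence
    # the large GPU speedups reported.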

Relevance: 20.00%

Abstract:

Migrating to cloud computing is one of the current enterprise challenges. This technology provides a new paradigm based on "on-demand payment" for information and communication technologies. In this sense, small and medium-sized enterprises are expected to be the most interested, since initial investments are avoided and the technology allows gradual implementation. However, even if the characteristics and capacities have been widely discussed, entry into the cloud still lacks practical, real-world frameworks. This paper aims to fill this gap by presenting a real tool, already implemented and tested, that can be used to support the cloud computing adoption decision. The tool uses a diagnosis based on specific questions to gather the required information and subsequently provides the user with valuable information for deploying the business within the cloud, specifically in the form of Software as a Service (SaaS) solutions. This information allows decision makers to generate their particular Cloud Road. A pilot study has been carried out with enterprises at the local level, with a two-fold objective: to ascertain the degree of knowledge of cloud computing and to identify the most interesting business areas and their related tools for this technology. As expected, the results show high interest in and low knowledge of this subject, and the tool presented aims to redress this mismatch insofar as possible.
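
Purely to illustrate the diagnose-then-recommend pattern the paper describes (the questions, weights, and scoring below are hypothetical, not those of the actual tool):

    # Hypothetical diagnostic questions and weights, illustrating the
    # question-based scoring behind a cloud-adoption decision tool.
    QUESTIONS = {
        "Is your workload demand variable or seasonal?": 3,
        "Do you lack dedicated IT staff for infrastructure?": 2,
        "Are your core applications available as SaaS offerings?": 3,
        "Can your data be stored off-premises (no legal restriction)?": 2,
    }

    def saas_readiness(answers):
        # Score the 'yes' answers; answers maps question -> bool.
        score = sum(w for q, w in QUESTIONS.items() if answers.get(q))
        return score / sum(QUESTIONS.values())  # fraction of max readiness

    answers = {q: True for q in QUESTIONS}
    print(f"readiness: {saas_readiness(answers):.0%}")  # 100% for all-yes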

Relevance: 20.00%

Abstract:

Concerns about land use have permeated numerous scientific studies, national and international, devoted to assessing the environmental impacts caused by agricultural activities. Some processes of the hydrological cycle, such as evapotranspiration, undergo considerable change due to constant shifts in land use. This work therefore highlights the problem of the rapid and intense land-use changes arising from the expansion of agricultural activity, and their impacts on the environment, especially on regional evapotranspiration, in the Sul Goiano mesoregion, a typical cerrado region located in central-western Brazil. The application of the Surface Energy Balance Algorithm for Land (SEBAL) formed the core of the methodology, aimed at estimating energy fluxes and evapotranspiration at the regional scale from the surface energy balance equation, complemented by air temperature and wind speed data acquired at meteorological stations (PCDs) installed in the study area. Data from the MODIS/TERRA sensor for the years 2006, 2007, 2008, 2009, and 2010 were used. The algorithm was tested in its classical form and in a modified form in which the criteria for selecting the anchor pixels used to estimate the sensible heat flux were changed. It can be concluded that changing the criteria positively influenced the results, and that the evapotranspiration values in the study region indicate the potential of this methodology for the systematic monitoring of the components of the energy balance at the regional scale.
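
For reference (the standard SEBAL balance, with illustrative values), the latent heat flux LE is obtained as the residual of the surface energy balance Rn = G + H + LE:

    def latent_heat_flux(rn, g, h):
        # rn: net radiation [W/m^2]; g: soil heat flux [W/m^2];
        # h: sensible heat flux [W/m^2], which SEBAL calibrates with hot/cold
        # anchor pixels (the selection criteria modified in this work).
        return rn - g - h

    print(latent_heat_flux(rn=600.0, g=80.0, h=220.0))  # 300.0 W/m^2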