916 results for clouds
Abstract:
This document describes in detail the work carried out to achieve all of the objectives set for this Final Degree Project, whose main goal is the development of a configurable management and administration dashboard for OpenStack instances. OpenStack is a free and open-source platform used as an Infrastructure as a Service (IaaS) solution in both public clouds, which charge for usage time or for the resources consumed, and private clouds operated exclusively within a company's environment. The OpenStack project began as a collaboration between NASA and Rackspace and is now maintained by the most powerful companies in the technology sector through the OpenStack Foundation. OpenStack provides access to its services through a Command Line Interface (CLI), a RESTful API and a web dashboard; the latter is offered by the Horizon service, which provides a graphical interface to access, manage and automate cloud-based services. Horizon's dashboard has some problems: it only supports configuration through Python code, so the user has no configuration capabilities and the administrator is forced to interact directly with the code, and it has no support for multiple regions that would let a user distribute resources across data centers in different locations as convenient. This Final Degree Project, the initial stage of the FI-Dash project, aims to solve these problems by developing a catalog of widgets for the WireCloud platform that give the user all the features offered by Horizon while adding configuration capabilities and features not present in Horizon, such as support for multiple regions. As a preliminary step to the development of the widget catalog, a study was conducted of the technologies and services offered by OpenStack, as well as of the tools that might be needed to carry out the work. The development process was divided into phases matching the different components of the dashboard, each of them responsible for managing a different type of resource. The remaining phases covered the full integration of the dashboard into the WireCloud platform and the design of a usable and attractive graphical interface.
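Since the abstract above mentions that OpenStack exposes its services through a RESTful API, a minimal hedged sketch of that access path may help: the snippet below obtains a Keystone token and lists Nova servers. The endpoint URLs, project name and credentials are placeholders, and this is an illustration of the public OpenStack API rather than code from the FI-Dash widgets.

```python
# Minimal sketch (not project code): authenticate against Keystone v3 and
# list Nova servers over OpenStack's RESTful API. Endpoints and credentials
# are hypothetical placeholders.
import requests

KEYSTONE = "http://controller:5000/v3"      # assumed Identity endpoint
NOVA = "http://controller:8774/v2.1"        # assumed Compute endpoint

auth = {
    "auth": {
        "identity": {
            "methods": ["password"],
            "password": {"user": {"name": "demo",
                                  "domain": {"id": "default"},
                                  "password": "secret"}},
        },
        "scope": {"project": {"name": "demo", "domain": {"id": "default"}}},
    }
}

resp = requests.post(f"{KEYSTONE}/auth/tokens", json=auth)
resp.raise_for_status()
token = resp.headers["X-Subject-Token"]     # Keystone returns the token in a header

servers = requests.get(f"{NOVA}/servers",
                       headers={"X-Auth-Token": token}).json()
for srv in servers.get("servers", []):
    print(srv["id"], srv["name"])
```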
Abstract:
A paradigm shift in the conception of digital terrain models is taking place in geodesy, moving from designing a model with the fewest possible points to models of hundreds of thousands or millions of points. This change is a consequence of the introduction of new technologies such as laser scanning, radar interferometry and image processing. The rapid acceptance of these technologies is mainly due to the great speed of data capture, to their accessibility as reflectorless techniques, and to the high degree of detail of the resulting models. Classic survey methods are based on discrete measurements of points which, taken as a whole, form a model; the precision of the model is therefore derived from the precision with which the individual points are measured. Terrestrial laser scanning (TLS) technology is a different approach to generating a model of the observed object. The point clouds produced by a TLS scan are treated as a whole by means of area-based analysis, so that the final model is not an aggregation of points but the surface that best fits the point cloud. When the precision of capturing single points with tachymetric methods is compared with that of TLS equipment, the inferiority of the latter is clear; however, it is in the treatment of the point clouds with area-based analysis methods that acceptable precisions have been obtained, making it possible to fully consider this technology for monitoring deformations and movements of structures. Notable TLS applications include the recording of cultural heritage, the recording of construction stages of industrial plants and structures, accident reporting, and the monitoring of ground movements and structural deformations. In dam monitoring, compared with the classical approach of observing specific points inside the dam, on its crest or on its face, having a continuous model of the downstream face opens the possibility of introducing surface deformation analysis methods and of building behaviour models that improve the understanding and forecasting of dam movements. However, the application of TLS to dam monitoring must be regarded as complementary to existing methods: pendulums and the more recent technique based on the differential global positioning system (DGPS) give continuous information on the movements of specific points of the dam, whereas TLS shows the seasonal evolution of the whole face and can detect potentially problematic zones. This work analyses the characteristics of TLS technology and the parameters that determine the final precision of the scans, and establishes the need to use equipment based on the direct measurement of the time of flight, also called pulsed scanners, for distances between 100 m and 300 m. The application of TLS to the modelling of structures and vertical walls is studied, and the factors that influence the final precision, such as the registration of point clouds, the type of targets, and the combined effect of scanning distance and angle of incidence, are analysed. The use of map-type plots relating precision or accuracy to scanning distance and angle of incidence is proposed and validated for the design of fieldwork; their application is shown in the preparation of a scanning campaign aimed at monitoring the movements of a dam, and recommendations are given for applying TLS to large structures. A map-type plot was produced for a specific mid-range TLS instrument from two field tests carried out under realistic working conditions, scanning over the instrument's full range of distances and angles of incidence; taking advantage of graphic semiology, the plot combines isolines with points whose size and grey level are proportional to the precision values they represent, and the precisions obtained under different field conditions were compared with the manufacturer's specifications. Two methods for assessing the precision of wall modelling and the detection of wall movements are analysed: the standard "plane of best fit" method and the proposed "simulated deformation" method, the latter showing improved performance. These results lead to a discussion of, and recommendations for, optimal TLS operation in civil engineering works. Finally, the seasonal movements of an arch-gravity dam recorded by its direct pendulums are compared with those obtained from TLS scans of the same dam. The results show differences of a few millimetres, the best being of the order of one millimetre. The methodology used is explained, and considerations are given regarding the density of the point clouds and the size of the triangular meshes.
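Because the abstract above leans on the "plane of best fit" approach for estimating modelling precision, a generic sketch may help illustrate the idea: the code below fits a plane to a point cloud by least squares via SVD and reports the RMS of the orthogonal residuals. It is an illustration with assumed synthetic data, not the thesis' own processing chain.

```python
# Generic sketch of a "plane of best fit" precision estimate for a scanned
# wall: fit a plane to the point cloud with SVD and report residual statistics.
import numpy as np

def plane_of_best_fit(points):
    """points: (N, 3) array of x, y, z coordinates from a TLS scan."""
    centroid = points.mean(axis=0)
    # Singular vectors of the centred cloud; the one with the smallest
    # singular value is the plane normal.
    _, _, vt = np.linalg.svd(points - centroid, full_matrices=False)
    normal = vt[-1]
    residuals = (points - centroid) @ normal   # signed orthogonal distances
    return normal, centroid, residuals

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic 10 m x 10 m wall patch with 2 mm Gaussian noise.
    xy = rng.uniform(0, 10, size=(5000, 2))
    z = 0.002 * rng.standard_normal(5000)
    cloud = np.column_stack([xy, z])
    _, _, res = plane_of_best_fit(cloud)
    print(f"RMS of residuals: {1000 * np.sqrt(np.mean(res**2)):.2f} mm")
```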
Abstract:
High-resolution video microscopy, image analysis, and computer simulation were used to study the role of the Spitzenkörper (Spk) in apical branching of ramosa-1, a temperature-sensitive mutant of Aspergillus niger. A shift to the restrictive temperature led to a cytoplasmic contraction that destabilized the Spk, causing its disappearance. After a short transition period, new Spk appeared where the two incipient apical branches emerged. Changes in cell shape, growth rate, and Spk position were recorded and transferred to the fungus simulator program to test the hypothesis that the Spk functions as a vesicle supply center (VSC). The simulation faithfully duplicated the elongation of the main hypha and the two apical branches. Elongating hyphae exhibited the growth pattern described by the hyphoid equation. During the transition phase, when no Spk was visible, the growth pattern was nonhyphoid, with consecutive periods of isometric and asymmetric expansion; the apex became enlarged and blunt before the apical branches emerged. Video microscopy images suggested that the branch Spk were formed anew by gradual condensation of vesicle clouds. Simulation exercises where the VSC was split into two new VSCs failed to produce realistic shapes, thus supporting the notion that the branch Spk did not originate by division of the original Spk. The best computer simulation of apical branching morphogenesis included simulations of the ontogeny of branch Spk via condensation of vesicle clouds. This study supports the hypothesis that the Spk plays a major role in hyphal morphogenesis by operating as a VSC—i.e., by regulating the traffic of wall-building vesicles in the manner predicted by the hyphoid model.
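The abstract above refers to growth following the hyphoid equation of the vesicle supply center (VSC) model. As a loosely hedged aid, that equation is usually quoted in the form y = x·cot(kx), a one-parameter family of tip-shaped curves whose single parameter k (presumably the ratio of the vesicle release rate to the VSC advance rate; the exact symbol convention should be checked against the original model) sets the width of the hyphal tube. The sketch below merely evaluates that profile and is not part of the study's simulation software.

```python
# Sketch only: evaluate a hyphoid-type tip profile y = x * cot(k * x).
# k is assumed here to be the single shape parameter of the hyphoid equation;
# check the original VSC publications for the exact symbol convention.
# The asymptotes at x = +/- pi/k set the width of the hyphal tube, and the
# value near x = 0 (the tip) is 1/k.
import numpy as np

def hyphoid_profile(k=1.0, n=400):
    x = np.linspace(-np.pi / k + 1e-3, np.pi / k - 1e-3, n)
    y = x / np.tan(k * x)          # x * cot(k * x); finite near x = 0
    return x, y

x, y = hyphoid_profile(k=2.0)
print(f"tip height ~ {y[np.argmin(np.abs(x))]:.3f} (expected 1/k = 0.5)")
```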
Abstract:
This paper deals with a luminous electric discharge that forms in the mesospheric region between thundercloud tops and the ionosphere at 90-km altitude. These cloud–ionosphere discharges (CIs), following visual reports dating back to the 19th century, were finally imaged by a low-light TV camera as part of the “SKYFLASH” program at the University of Minnesota in 1989. Many observations were made by various groups in the period 1993–1996. The characteristics of CIs are that they have a wide range of sizes from a few kilometers up to 50 km horizontally; they extend from 40 km to nearly 90 km vertically, with an intense region near 60–70 km and streamers extending down toward cloud tops; the CIs are partly or entirely composed of vertical luminous filaments of kilometer size. The predominant color is red. The TV images show that the CIs usually have a duration of less than one TV field (16.7 ms), but higher-speed photometric measurements show that they last about 3 ms and are delayed 3 ms after an initiating cloud–ground lightning stroke; 95% of these initiating strokes are found to be “positive”, i.e., they carry positive charges from clouds to ground. The preference for positive initiating strokes is not understood. Theories of the formation of CIs are briefly reviewed.
Abstract:
The Arp2/3 complex is implicated in actin polymerization-driven movement of Listeria monocytogenes. Here, we find that Arp2p and Arc15p, two subunits of this complex, show tight, actin-independent association with isolated yeast mitochondria. Arp2p colocalizes with mitochondria. Consistent with this result, we detect Arp2p-dependent formation of actin clouds around mitochondria in intact yeast. Cells bearing mutations in ARP2 or ARC15 genes show decreased velocities of mitochondrial movement, loss of all directed movement and defects in mitochondrial morphology. Finally, we observe a decrease in the velocity and extent of mitochondrial movement in yeast in which actin dynamics are reduced but actin cytoskeletal structure is intact. These results support the idea that the movement of mitochondria in yeast is actin polymerization driven and that this movement requires Arp2/3 complex.
Abstract:
The effect of desert dust on cloud properties and precipitation has so far been studied solely by using theoretical models, which predict that rainfall would be enhanced. Here we present observations showing the contrary; the effect of dust on cloud properties is to inhibit precipitation. Using satellite and aircraft observations we show that clouds forming within desert dust contain small droplets and produce little precipitation by drop coalescence. Measurement of the size distribution and the chemical analysis of individual Saharan dust particles collected in such a dust storm suggest a possible mechanism for the diminished rainfall. The detrimental impact of dust on rainfall is smaller than that caused by smoke from biomass burning or anthropogenic air pollution, but the large abundance of desert dust in the atmosphere renders it important. The reduction of precipitation from clouds affected by desert dust can cause drier soil, which in turn raises more dust, thus providing a possible feedback loop to further decrease precipitation. Furthermore, anthropogenic changes of land use exposing the topsoil can initiate such a desertification feedback process.
Abstract:
The forcings that drive long-term climate change are not known with an accuracy sufficient to define future climate change. Anthropogenic greenhouse gases (GHGs), which are well measured, cause a strong positive (warming) forcing. But other, poorly measured, anthropogenic forcings, especially changes of atmospheric aerosols, clouds, and land-use patterns, cause a negative forcing that tends to offset greenhouse warming. One consequence of this partial balance is that the natural forcing due to solar irradiance changes may play a larger role in long-term climate change than inferred from comparison with GHGs alone. Current trends in GHG climate forcings are smaller than in popular “business as usual” or 1% per year CO2 growth scenarios. The summary implication is a paradigm change for long-term climate projections: uncertainties in climate forcings have supplanted global climate sensitivity as the predominant issue.
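To make the comparison between forcing scenarios above more concrete, here is a small illustrative calculation using the widely used simplified expression ΔF ≈ 5.35 ln(C/C0) W m⁻² for CO2 forcing; the expression, the starting concentration and the growth rates are standard or assumed values chosen for illustration, not numbers taken from this paper.

```python
# Illustrative sketch (not from the paper): compare CO2 radiative forcing
# growth under a 1%/yr concentration increase versus a slower ~0.4%/yr trend,
# using the common simplified expression dF = 5.35 * ln(C/C0) W m^-2.
import math

C0 = 370.0          # assumed initial CO2 concentration, ppm
years = 50

def forcing(growth_rate):
    C = C0 * (1.0 + growth_rate) ** years
    return 5.35 * math.log(C / C0)

print(f"1%/yr scenario : {forcing(0.010):.2f} W/m^2 after {years} yr")
print(f"0.4%/yr trend  : {forcing(0.004):.2f} W/m^2 after {years} yr")
```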
Abstract:
With the advent of the new extragalactic deuterium observations, Big Bang nucleosynthesis (BBN) is on the verge of undergoing a transformation. In the past, the emphasis has been on demonstrating the concordance of the BBN model with the abundances of the light isotopes extrapolated back to their primordial values by using stellar and galactic evolution theories. As a direct measure of primordial deuterium is converged upon, the nature of the field will shift to using the much more precise primordial D/H to constrain the more flexible stellar and galactic evolution models (although the question of potential systematic error in 4He abundance determinations remains open). The remarkable success of the theory to date in establishing the concordance has led to the very robust conclusion of BBN regarding the baryon density. This robustness remains even through major model variations such as an assumed first-order quark-hadron phase transition. The BBN constraints on the cosmological baryon density are reviewed and demonstrate that the bulk of the baryons are dark and also that the bulk of the matter in the universe is nonbaryonic. A comparison of baryonic density arguments from Lyman-α clouds, X-ray gas in clusters, and the microwave anisotropy is made.
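As a worked-number aside on the baryon-density argument above, the snippet below converts a representative baryon-to-photon ratio into Ω_b using the standard approximate relation Ω_b h² ≈ η₁₀/274 and compares it with rough luminous-baryon and total-matter densities; all numerical values are illustrative assumptions, not results quoted in this paper.

```python
# Illustrative conversion (standard textbook relation, not from the paper):
# baryon-to-photon ratio eta -> baryon density parameter, Omega_b h^2 ~ eta10/274.
eta10 = 5.0          # assumed eta in units of 1e-10, representative of D/H-based BBN
h = 0.7              # assumed Hubble parameter, H0 / (100 km/s/Mpc)

omega_b = (eta10 / 274.0) / h**2
omega_lum = 0.004    # rough luminous-baryon estimate (illustrative)
omega_matter = 0.3   # rough total matter density (illustrative)

print(f"Omega_b      ~ {omega_b:.3f}")
print(f"Omega_lum    ~ {omega_lum:.3f}  -> most baryons are dark")
print(f"Omega_matter ~ {omega_matter:.2f}   -> most matter is non-baryonic")
```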
Abstract:
We present the results of additional observations of the high energy source GRS 1915+105, which produces ejecta with apparent superluminal motions. The observations reported here were carried out with the Very Large Array at 3.5 cm and 20 cm. The 3.5-cm observations made during 1994 May allowed us to continue following the proper motions of the bright 1994 March 19 ejecta, as well as those of a subsequent, fainter ejection. The proper motions of the 1994 March 19 ejecta continued to be ballistic (i.e., constant) over the period of about 75 days where they remained detectable. From the observations in 1994 March-May we have identified three ejections of pairs of plasma clouds moving ballistically in approximately the same direction on the sky with similar proper motions. The 20-cm observations made during 1994 November and December were used to search, yet unsuccessfully, for extended jets or lobes associated with GRS 1915+105.
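For readers unfamiliar with how proper motions translate into "apparent superluminal" speeds, the sketch below converts a proper motion and a distance into an apparent transverse speed and evaluates the standard relativistic expression β_app = β sinθ / (1 − β cosθ); the proper motion, distance, β and θ used are placeholders for illustration, not the measured values for GRS 1915+105.

```python
# Illustrative sketch: apparent transverse speed from proper motion and
# distance, plus the standard relativistic expression for apparent speed.
# The proper motion, distance, beta and theta below are placeholders,
# not the measured values for GRS 1915+105.
import math

MAS_TO_RAD = math.radians(1.0 / 3600.0) * 1e-3   # one milliarcsecond in radians
KPC_KM = 3.086e16                                 # kilometres per kiloparsec
C_KM_S = 2.998e5                                  # speed of light, km/s
DAY_S = 86400.0

mu_mas_per_day = 20.0      # assumed proper motion of an ejected cloud
distance_kpc = 12.0        # assumed source distance

v_app = mu_mas_per_day * MAS_TO_RAD * distance_kpc * KPC_KM / DAY_S
print(f"apparent transverse speed ~ {v_app / C_KM_S:.2f} c")

# Apparent speed of a blob moving with true speed beta*c at angle theta
# to the line of sight: beta_app = beta*sin(theta) / (1 - beta*cos(theta)).
beta, theta = 0.92, math.radians(70.0)
beta_app = beta * math.sin(theta) / (1.0 - beta * math.cos(theta))
print(f"beta_app for beta={beta}, theta=70 deg: {beta_app:.2f}")
```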
Abstract:
Context. During the course of a large spectroscopic survey of X-ray active late-type stars in the solar neighbourhood, we discovered four lithium-rich stars packed within just a few degrees on the sky. Although located in a sky area rich in CO molecular regions and dark clouds, the Cepheus-Cassiopeia complex, these very young stars are projected several degrees away from the clouds, in front of an area devoid of interstellar matter. As such, they are very good "isolated" T Tauri star candidates. Aims. We present optical observations of these stars conducted with 1-2 m class telescopes. We acquired high-resolution optical spectra as well as photometric data allowing us to investigate their nature and physical parameters in detail, with the aim of testing the "runaway" and "in-situ" formation scenarios. Their kinematical properties are also analyzed to investigate their possible connection to already known stellar kinematic groups. Methods. We use the cross-correlation technique and other tools developed by us to derive accurate radial and rotational velocities and perform an automatic spectral classification. The spectral subtraction technique is used to infer the chromospheric activity level in the Hα line core and to clean the spectra of photospheric lines before measuring the equivalent width of the lithium absorption line. Results. Both physical (lithium content, chromospheric and coronal activity) and kinematical indicators show that all the stars are very young, with ages probably in the range 10-30 Myr. In particular, the spectral energy distribution of TYC4496-780-1 displays a strong near- and far-infrared excess, typical of T Tauri stars still surrounded by an accretion disc. They also share the same Galactic motion, proving that they form a homogeneous moving group of stars with the same origin. Conclusions. The most plausible explanation of how these "isolated" T Tauri stars formed is the "in-situ" model, although accurate distances are needed to clarify their connection with the Cepheus-Cassiopeia complex. The discovery of this loose association of "isolated" T Tauri stars can help to shed light on atypical formation processes of stars and planets in low-mass clouds.
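Since the Methods above rely on the cross-correlation technique for radial velocities, a generic sketch may be useful; the code below resamples an observed spectrum and a template onto a common log-wavelength grid (so that a Doppler shift becomes a simple translation) and converts the cross-correlation peak into a velocity. It is an illustration with synthetic data, not the authors' own tools.

```python
# Generic sketch of radial velocity by cross-correlation (not the authors'
# pipeline): resample spectra on a log-lambda grid so a Doppler shift becomes
# a translation, then locate the peak of the cross-correlation function.
import numpy as np

C_KM_S = 2.998e5

def rv_by_cross_correlation(wave, flux, t_wave, t_flux, n=4096):
    log_grid = np.linspace(np.log(max(wave[0], t_wave[0])),
                           np.log(min(wave[-1], t_wave[-1])), n)
    f = np.interp(log_grid, np.log(wave), flux)
    t = np.interp(log_grid, np.log(t_wave), t_flux)
    f -= f.mean()
    t -= t.mean()
    ccf = np.correlate(f, t, mode="full")
    lag = np.argmax(ccf) - (n - 1)            # shift in log-lambda pixels
    dloglam = log_grid[1] - log_grid[0]
    return C_KM_S * lag * dloglam             # v ~ c * d(ln lambda)

# Toy test: a Gaussian absorption line shifted by ~30 km/s.
wave = np.linspace(6540.0, 6580.0, 2000)
template = 1.0 - 0.6 * np.exp(-0.5 * ((wave - 6563.0) / 0.5) ** 2)
shifted = 1.0 - 0.6 * np.exp(-0.5 * ((wave - 6563.0 * (1 + 30.0 / C_KM_S)) / 0.5) ** 2)
print(f"recovered RV ~ {rv_by_cross_correlation(wave, shifted, wave, template):.1f} km/s")
```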
Abstract:
In this thesis, we present the generation and study of a ⁸⁷Rb Bose-Einstein condensate (BEC) perturbed by an oscillatory excitation. The atoms are trapped in a harmonic magnetic trap where, after an evaporative cooling process, we produce the BEC. In order to study the effect caused by oscillatory excitations, a time-oscillatory quadrupole magnetic field is superimposed on the trapping potential. Through this perturbation, collective modes were observed. The dipole mode is excited even for low excitation amplitudes, but a minimum excitation energy is needed to excite the quadrupole mode of the condensate. Observing the excited cloud in time-of-flight (TOF) expansion, we note that, for excitation amplitudes at which the quadrupole mode is excited, the cloud expands without inverting its aspect ratio. Looking at these clouds after a long time of flight, it was possible to see vortices and, sometimes, a turbulent state in the condensed cloud. We calculated the momentum distribution of the perturbed BECs and observed a power-law behaviour, similar to that of Kolmogorov turbulence. Furthermore, we show that, using the method we developed to calculate the momentum distribution, the distribution curve (including the power-law exponent) depends on the quadrupole-mode oscillation of the cloud. The random distribution of peaks and depletions in the density distribution image of an expanded turbulent BEC is reminiscent of the intensity profile of a speckle light beam. The analogy between matter-wave speckle and light speckle is justified by showing the similarities in the spatial propagation (or time expansion) of the waves. In addition, the second-order correlation function is evaluated and the same dependence with distance is observed for both kinds of waves. This creates the possibility of understanding the properties of quantum matter in a disordered state. The propagation of a three-dimensional speckle field (such as the matter-wave speckle described here) creates an opportunity to investigate the speckle phenomenon in dimensions higher than 2D (the case of light speckle).
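As a generic illustration of extracting a power-law exponent from a momentum distribution like the one discussed above (the fitting range, synthetic data and exponent are assumptions, and this is not the method developed in the thesis), the sketch below fits n(k) ∝ k^(−γ) by linear regression in log-log space.

```python
# Generic sketch (not the thesis' method): fit a power law n(k) ~ k**(-gamma)
# to a momentum distribution over an assumed inertial range of k.
import numpy as np

def fit_power_law(k, nk, k_min, k_max):
    sel = (k >= k_min) & (k <= k_max) & (nk > 0)
    slope, intercept = np.polyfit(np.log(k[sel]), np.log(nk[sel]), 1)
    return -slope, np.exp(intercept)     # exponent gamma and prefactor

# Synthetic distribution with gamma = 2.9 plus noise, for demonstration only.
rng = np.random.default_rng(1)
k = np.logspace(-1, 1, 200)
nk = k ** (-2.9) * np.exp(0.05 * rng.standard_normal(k.size))
gamma, amp = fit_power_law(k, nk, k_min=0.3, k_max=5.0)
print(f"fitted exponent gamma ~ {gamma:.2f}")
```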
Abstract:
With the current industrial and technological development of society, the presence of flammable and/or toxic substances has increased significantly in a large number of activities. The possible dispersion of hazardous gases at storage facilities or during transport operations represents a major threat to health and to the environment. The characterisation of a flammable and/or toxic cloud is therefore a critical point in quantitative risk analysis. The main objective of this thesis was to provide new perspectives that could assist risk analysts dealing with dispersion in complex scenarios, for example scenarios with barriers or semi-confined scenarios. The literature review showed that, traditionally, empirical and integral models are used in the dispersion analysis of toxic/flammable substances, providing fast and generally reliable estimates when describing simple scenarios (for example, dispersion in unobstructed environments over flat terrain). Recently, however, the use of CFD tools to simulate dispersion has increased significantly. These tools make it possible to model more complex scenarios, such as those occurring in semi-confined spaces or in the presence of physical barriers. Among all the available CFD tools, the literature indicates that the FLACS® software performs well in simulating such scenarios; however, like other similar tools, it still needs to be fully validated. After a review of the field tests carried out over the years, some tests were selected for a preliminary performance assessment of the CFD tool used in this study. The possible sources of uncertainty were investigated in terms of reproducibility, grid dependence, and sensitivity to input variables and simulation parameters. The main results of this phase were framed as practical guidelines for risk analysts performing dispersion analysis in the presence of barriers with CFD tools. Although the literature review identified some experimental data available in the literature, none of the sources found include detailed studies on how to perform accurate CFD simulations, nor do they provide precise performance indicators. Therefore, new field tests were carried out in order to provide new data for more comprehensive validation studies. Field tests of propane cloud dispersion (with and without barriers obstructing the flow) were performed at the training field of the company Can Padró Segurança e Proteção (in Barcelona). Four tests were carried out, consisting of propane releases at flow rates of up to 0.5 kg/s, each lasting 40 seconds, over a discharge area of 700 m². The field tests contributed to the re-evaluation of the critical points mapped during the first phases of this study and provided experimental data to be used by the international community for dispersion studies and model validation. Simulations performed with the CFD tool were compared with the experimental data obtained in the field tests. In general terms, the simulator performed well with respect to the cloud concentration levels. It successfully reproduced the complex geometry and its effects on cloud dispersion, clearly showing the effect of the barrier on the distribution of concentrations. However, the simulations were not able to represent the full dynamics of the dispersion with respect to the effects of wind variation, since the simulated clouds diluted faster than the experimental ones.
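As an aside on how simulated and measured concentrations can be compared in this kind of validation exercise, the sketch below computes two performance metrics that are standard in the dispersion-modelling literature, fractional bias (FB) and geometric mean bias (MG); the abstract does not state which indicators were actually used, and the sensor values are hypothetical.

```python
# Generic sketch: two standard dispersion-model performance metrics,
# fractional bias (FB) and geometric mean bias (MG), for paired observed
# and predicted concentrations. These metrics are common in the dispersion
# literature; the abstract itself does not specify which indicators were used.
import numpy as np

def fractional_bias(c_obs, c_pred):
    c_obs, c_pred = np.asarray(c_obs, float), np.asarray(c_pred, float)
    return 2.0 * (c_obs.mean() - c_pred.mean()) / (c_obs.mean() + c_pred.mean())

def geometric_mean_bias(c_obs, c_pred):
    c_obs, c_pred = np.asarray(c_obs, float), np.asarray(c_pred, float)
    return np.exp(np.mean(np.log(c_obs) - np.log(c_pred)))

# Hypothetical sensor readings (% v/v propane) and simulated values.
observed  = [1.8, 0.9, 2.4, 0.5, 1.1]
predicted = [1.5, 1.0, 2.0, 0.7, 0.8]
print(f"FB = {fractional_bias(observed, predicted):+.2f}")
print(f"MG = {geometric_mean_bias(observed, predicted):.2f}")
```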
Abstract:
Science makes frequent use of computational resources to run experiments and scientific processes, which can be modelled as workflows that manipulate large volumes of data and execute actions such as selection, analysis and visualisation of those data according to a defined procedure. Scientific workflows have been used by scientists in several fields, such as astronomy and bioinformatics, and tend to be computationally intensive and strongly oriented towards handling large volumes of data, which requires high-performance execution platforms such as grids or clouds of computers. To execute workflows on this kind of platform, the available computational resources must be mapped to the workflow activities, a process known as scheduling. Cloud computing platforms have proved to be a viable alternative for running scientific workflows, but scheduling on this kind of platform generally has to consider specific constraints, such as a limited budget or the type of computational resource to be used in the execution. In this context, information such as the estimated execution time or time and cost limits (referred to here as scheduling support information) is important to ensure that scheduling is efficient and that the execution achieves the expected results. This work identifies the support information that can be added to scientific workflow models to assist scheduling and efficient execution on cloud computing platforms. A classification of this information is proposed, and its use in the main Scientific Workflow Management Systems (SGWC) is analysed. To evaluate the impact of using this information on scheduling, experiments were carried out with scientific workflow models carrying different support information, scheduled with algorithms adapted to take the added information into account. In these experiments, a reduction of up to 59% in the financial cost of executing the workflow in the cloud and a reduction in makespan of up to 8.6% were observed, compared with the execution of the same workflows scheduled without any support information available.
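To picture what "scheduling support information" attached to a workflow model might look like and how a scheduler could exploit it, here is a minimal sketch under assumed fields (estimated runtime, per-task deadline, overall budget) with a greedy cheapest-feasible-VM rule; it is an illustration only, not one of the algorithms adapted and evaluated in this work.

```python
# Minimal sketch, not the algorithms from the study: annotate workflow tasks
# with scheduling support information (estimated runtime, deadline, budget)
# and pick, for each task, the cheapest VM type that still meets its deadline.
from dataclasses import dataclass

@dataclass
class VmType:
    name: str
    speedup: float        # relative speed versus a baseline VM
    price_per_hour: float

@dataclass
class Task:
    name: str
    est_runtime_h: float  # estimated runtime on the baseline VM (support info)
    deadline_h: float     # time limit for this task (support info)

VM_TYPES = [VmType("small", 1.0, 0.05), VmType("medium", 2.0, 0.12),
            VmType("large", 4.0, 0.30)]

def schedule(tasks, budget):
    total_cost, plan = 0.0, []
    for t in tasks:
        feasible = [v for v in VM_TYPES if t.est_runtime_h / v.speedup <= t.deadline_h]
        if not feasible:
            raise ValueError(f"no VM type meets the deadline of {t.name}")
        vm = min(feasible, key=lambda v: (t.est_runtime_h / v.speedup) * v.price_per_hour)
        cost = (t.est_runtime_h / vm.speedup) * vm.price_per_hour
        total_cost += cost
        plan.append((t.name, vm.name, round(cost, 3)))
    if total_cost > budget:
        raise ValueError("plan exceeds the budget support information")
    return plan, total_cost

tasks = [Task("align", 4.0, 2.5), Task("analyse", 2.0, 2.0), Task("plot", 0.5, 1.0)]
print(schedule(tasks, budget=1.0))
```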
Abstract:
The research on the perceptions of residents of the Piracicaba region regarding environmental issues and the future of humanity on the planet was developed on the basis of a semi-structured questionnaire. The questions sought self-assessed data on profile, behaviour, mood, quality of life, economic condition and consumption habits, waste disposal practices, and citizenship initiatives in favour of socio-environmental sustainability, and finally asked about perceptions of the future and assessments of the respondents' socio-environmental context. The questionnaires were applied in a stratified random manner in neighbourhoods of the five regions of the city of Piracicaba: North, South, East, West and Centre. In this way, 655 questionnaires were obtained, which were systematised, tabulated and statistically analysed using frequency charts, the Kruskal-Wallis test, the Spearman and Kendall correlation tests, the chi-squared test and Fisher's exact test. Word clouds were also created with the online software "Wordle" (FEINBERG, 2014). The results obtained in this research and the analyses carried out indicate that 227 people, that is, approximately 35% of the respondents, have a pessimistic perception of the future of humanity on the planet. However, 493 people, equivalent to approximately 75% of all respondents, considered that, among the alternatives presented in the questionnaire (development of technologies; birth control; education and cultural change; taxation based on environmental impacts; state intervention), education and cultural change are fundamental to the process of social transformation and to overcoming environmental problems. It was also observed that the water crisis experienced at the time the questionnaires were applied influenced the social perception of the respondents, since the word "water" was mentioned 380 times. The work also made it possible to analyse the environmental behaviour of the respondents, showing that there is still a need to promote educational and communication activities that can encourage the adoption of habits and behaviours more committed to ideas of sustainability and that lead to more effective changes in the patterns of the relationship between society and the environment.
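Purely as an illustration of the statistical tests named above (applied to hypothetical responses, not the study's data), the sketch below runs the Kruskal-Wallis, Spearman, Kendall, chi-squared and Fisher exact tests with scipy.

```python
# Illustrative sketch (not the study's actual analysis): the statistical tests
# named in the abstract, applied to hypothetical survey responses with scipy.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Hypothetical ordinal "quality of life" scores for three city regions.
north, south, east = (rng.integers(1, 6, 40) for _ in range(3))
print("Kruskal-Wallis:", stats.kruskal(north, south, east))

# Hypothetical paired ordinal variables (e.g. income level vs. optimism score).
income, optimism = rng.integers(1, 6, 100), rng.integers(1, 6, 100)
print("Spearman:", stats.spearmanr(income, optimism))
print("Kendall:", stats.kendalltau(income, optimism))

# Hypothetical contingency table: region (rows) x pessimistic/optimistic (columns).
table = np.array([[30, 70], [45, 85], [25, 60]])
chi2, p, dof, _ = stats.chi2_contingency(table)
print(f"Chi-squared: chi2={chi2:.2f}, p={p:.3f}")
print("Fisher exact (2x2):", stats.fisher_exact(table[:2]))
```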
Abstract:
The center of our Galaxy hosts a supermassive black hole, Sagittarius (Sgr) A∗. Young, massive stars within 0.5 pc of Sgr A∗ are evidence of an episode of intense star formation near the black hole a few million years ago, which might have left behind a young neutron star traveling deep into Sgr A∗’s gravitational potential. On 2013 April 25, a short X-ray burst was observed from the direction of the Galactic center. With a series of observations with the Chandra and Swift satellites, we pinpoint the associated magnetar at an angular distance of 2.4 ± 0.3 arcsec from Sgr A∗, and refine the source spin period and its derivative (P = 3.7635537(2) s and Ṗ = 6.61(4) × 10⁻¹² s s⁻¹), confirmed by quasi-simultaneous radio observations performed with the Green Bank Telescope and the Parkes Radio Telescope, which also constrain the dispersion measure to DM = 1750 ± 50 pc cm⁻³, the highest ever observed for a radio pulsar. We have found that this X-ray source is a young magnetar at ≈0.07–2 pc from Sgr A∗. Simulations of its possible motion around Sgr A∗ show that it is likely (∼90% probability) in a bound orbit around the black hole. The radiation front produced by the past activity of the magnetar passing through the molecular clouds surrounding the Galactic center region might be responsible for a large fraction of the light echoes observed in the Fe fluorescence features.
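As a quick back-of-the-envelope check on the quoted offsets (using an assumed Galactic-centre distance of roughly 8.3 kpc, which the abstract does not state), the snippet below converts the 2.4 arcsec angular separation into a projected physical distance; the result of about 0.1 pc sits near the lower end of the quoted ≈0.07–2 pc range, as expected since the projected separation only bounds the true one from below.

```python
# Back-of-the-envelope check: convert the magnetar's angular offset from
# Sgr A* into a projected separation. The Galactic-centre distance is an
# assumed round value, not taken from the abstract.
import math

theta_arcsec = 2.4
distance_pc = 8300.0                          # assumed ~8.3 kpc to Sgr A*
theta_rad = math.radians(theta_arcsec / 3600.0)
projected_pc = distance_pc * theta_rad        # small-angle approximation
print(f"projected separation ~ {projected_pc:.2f} pc")   # ~0.1 pc
```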