Abstract:
This chapter introduces the latest practices and technologies in the interactive interpretation of environmental data. With environmental data becoming ever larger, more diverse and more complex, there is a need for a new generation of tools that provides new capabilities over and above those of the standard workhorses of science. These new tools aid the scientist in discovering interesting new features (and also problems) in large datasets by allowing the data to be explored interactively using simple, intuitive graphical tools. In this way, new discoveries are made that are commonly missed by automated batch data processing. This chapter discusses the characteristics of environmental science data, common current practice in data analysis and the supporting tools and infrastructure. New approaches are introduced and illustrated from the points of view of both the end user and the underlying technology. We conclude by speculating as to future developments in the field and what must be achieved to fulfil this vision.
Abstract:
The large scale urban consumption of energy (LUCY) model simulates all components of anthropogenic heat flux (QF) from the global to the individual city scale at 2.5 × 2.5 arc-minute resolution. This includes a database of different working patterns and public holidays, vehicle use and energy consumption in each country. The databases can be edited to include specific diurnal and seasonal vehicle and energy consumption patterns, local holidays and flows of people within a city. If better information about individual cities becomes available within this (open-source) database, the accuracy of the model can only improve, providing the community with data from the global scale, for climate modelling, down to the individual city scale. The results show that QF varied widely through the year, through the day, and between countries and urban areas. An assessment of the estimated heat emissions revealed that they are reasonably close to those produced by a global model and a number of small-scale city models, so results from LUCY can be used with a degree of confidence. From LUCY, the global mean urban QF has a diurnal range of 0.7–3.6 W m−2 and is greater on weekdays than at weekends. Heat release from buildings is the largest contributor to heat emissions globally (89–96%). Differences between months are greatest in the middle of the day (up to 1 W m−2 at 1 pm). December to February, the coldest months in the Northern Hemisphere, have the highest heat emissions; July and August are at the higher end; the least QF is emitted in May. The highest individual grid-cell heat fluxes in urban areas (in W m−2) were located in New York (577), Paris (261.5), Tokyo (178), San Francisco (173.6), Vancouver (119) and London (106.7).
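A simplified reading of the additive structure described above (not LUCY's actual code; the component magnitudes and diurnal profiles below are invented) composes QF for one grid cell from building, traffic and metabolic terms scaled by diurnal weights:

    # Simplified anthropogenic heat flux composition for one grid cell over a day.
    # Component magnitudes and diurnal profiles are invented, not LUCY values.
    import numpy as np

    hours = np.arange(24)
    # Assumed diurnal weight profiles (roughly normalised to a mean of 1)
    building = 1.0 + 0.4 * np.sin((hours - 8) / 24 * 2 * np.pi)
    traffic = np.where((hours >= 7) & (hours <= 19), 1.6, 0.4)
    metabolism = np.where((hours >= 6) & (hours <= 22), 1.2, 0.6)

    # Assumed mean component fluxes (W m^-2) for an example urban cell
    q_building, q_traffic, q_metabolism = 2.0, 0.5, 0.2

    qf = q_building * building + q_traffic * traffic + q_metabolism * metabolism
    print(f"diurnal range: {qf.min():.2f} to {qf.max():.2f} W m^-2")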
Abstract:
SOA (Service Oriented Architecture), workflow, the Semantic Web, and Grid computing are key enabling information technologies in the development of increasingly sophisticated e-Science infrastructures and application platforms. While the emergence of Cloud computing as a new computing paradigm has provided new directions and opportunities for e-Science infrastructure development, it also presents some challenges. Scientific research is increasingly finding that it is difficult to handle "big data" using traditional data processing techniques. Such challenges demonstrate the need for a comprehensive analysis of how the above-mentioned informatics techniques can be used to develop appropriate e-Science infrastructure and platforms in the context of Cloud computing. This survey paper describes recent research advances in applying informatics techniques to facilitate scientific research, particularly from the Cloud computing perspective. Our particular contributions include identifying associated research challenges and opportunities, presenting lessons learned, and describing our future vision for applying Cloud computing to e-Science. We believe our research findings can help indicate the future trend of e-Science, and can inform funding and research directions in how to more appropriately employ computing technologies in scientific research. We point out open research issues, hoping to spark new development and innovation in the e-Science field.
Abstract:
In recent years, there has been increasing interest in the adoption of emerging ubiquitous sensor network (USN) technologies for instrumentation within a variety of sustainability systems. USN is emerging as a sensing paradigm newly considered by the sustainability management field as an alternative to traditional tethered monitoring systems. Researchers have been discovering that USN is an exciting technology that should not be viewed simply as a substitute for traditional tethered monitoring systems. In this study, we investigate how a movement-monitoring measurement system for a complex building is developed as a research environment for USN and related decision-supportive technologies. To address the apparent danger of building movement, agent-mediated communication concepts have been designed to autonomously manage large volumes of exchanged information. We additionally detail the design of the proposed system, including its principles, data processing algorithms, system architecture, and user interface specifics. Results of the test and case study demonstrate the effectiveness of the USN-based data acquisition system for real-time monitoring of movement operations.
Abstract:
This special issue is focused on the assessment of algorithms for the observation of Earth’s climate from environmental satellites. Climate data records derived by remote sensing are increasingly a key source of insight into the workings of and changes in Earth’s climate system. Producers of data sets must devote considerable effort and expertise to maximise the true climate signals in their products and minimise effects of data processing choices and changing sensors. A key choice is the selection of algorithm(s) for classification and/or retrieval of the climate variable. Within the European Space Agency Climate Change Initiative, science teams undertook systematic assessment of algorithms for a range of essential climate variables. The papers in the special issue report some of these exercises (for ocean colour, aerosol, ozone, greenhouse gases, clouds, soil moisture, sea surface temperature and glaciers). The contributions show that assessment exercises must be designed with care, considering issues such as the relative importance of different aspects of data quality (accuracy, precision, stability, sensitivity, coverage, etc.), the availability and degree of independence of validation data and the limitations of validation in characterising some important aspects of data (such as long-term stability or spatial coherence). As well as requiring a significant investment of expertise and effort, systematic comparisons are found to be highly valuable. They reveal the relative strengths and weaknesses of different algorithmic approaches under different observational contexts, and help ensure that scientific conclusions drawn from climate data records are not influenced by observational artifacts, but are robust.
Abstract:
One of the most pervasive classes of services needed to support e-Science applications is that responsible for the discovery of resources. We have developed a solution to the problem of service discovery in a Semantic Web/Grid setting. We do this in the context of bioinformatics, which is the use of computational and mathematical techniques to store, manage, and analyse the data from molecular biology in order to answer questions about biological phenomena. Our specific application is myGrid (www.mygrid.org.uk), which is developing open source, service-based middleware upon which bioinformatics applications can be built. myGrid is specifically targeted at developing open source high-level service Grid middleware for bioinformatics.
Abstract:
Drinking water distribution networks are exposed to the risk of malicious or accidental contamination. Several levels of response are conceivable. One of them consists of installing a sensor network to monitor the system in real time. Once contamination has been detected, it is also important to take appropriate counter-measures. In the SMaRT-OnlineWDN project, this relies on modelling to predict both hydraulics and water quality. Online use of the model makes it possible to identify the contaminant source and to simulate the contaminated area. The objective of this paper is to present the SMaRT-OnlineWDN experience and research results for hydraulic state estimation with a sampling frequency of a few minutes. A least squares problem with bound constraints is formulated to adjust demand class coefficients to best fit the observed values at a given time. The criterion is a Huber function, to limit the influence of outliers. A Tikhonov regularization is introduced to take prior information on the parameter vector into account. The Levenberg-Marquardt algorithm, which uses derivative information to limit the number of iterations, is then applied. Confidence intervals for the state prediction are also given. The results are presented and discussed on real networks in France and Germany.
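As a generic sketch of the estimation scheme described above (bounded robust least squares with a Tikhonov prior), not the project's actual implementation: the model, measurements, prior and weights below are invented, and SciPy's trust-region reflective solver stands in for Levenberg-Marquardt, since SciPy's LM mode accepts neither bounds nor robust losses.

    # Sketch: bounded robust least squares with a Tikhonov prior term.
    # All data are invented; in this simplification the Huber loss also
    # acts on the prior residuals, not only on the measurement misfit.
    import numpy as np
    from scipy.optimize import least_squares

    H = np.array([[1.0, 0.5], [0.3, 1.2], [0.8, 0.8]])  # toy sensor sensitivities
    y_obs = np.array([1.4, 1.6, 1.7])                   # observed sensor values
    x_prior = np.array([1.0, 1.0])                      # prior demand class coefficients
    lam = 0.1                                           # Tikhonov weight

    def residuals(x):
        # Stack the measurement misfit with the regularisation term, so the
        # solver minimises a robustified ||H x - y||^2 + lam ||x - x_prior||^2.
        return np.concatenate([H @ x - y_obs, np.sqrt(lam) * (x - x_prior)])

    sol = least_squares(residuals, x_prior, bounds=(0.0, 5.0),
                        loss="huber", f_scale=0.5, method="trf")
    print(sol.x)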
Numerical Simulation of Sediment Transport and Bed Morphology around a Hydraulic Structure on a River
Abstract:
Scour around hydraulic structures is a critical problem in hydraulic engineering. Underprediction of scour depth may lead to costly failures of the structure, while overprediction might result in unnecessary costs. Unfortunately, up-to-date empirical scour prediction formulas are based on laboratory experiments that are not always able to reproduce field conditions, owing to the complicated geometry of rivers and the temporal and spatial scales of a physical model. However, computational fluid dynamics (CFD) tools can operate with real field dimensions and operating conditions to predict sediment scour around hydraulic structures. In Korea, after completion of the Four Major Rivers Restoration Project, several new weirs were built across the Han, Nakdong, Geum and Yeongsan Rivers. Consequently, sediment deposition and bed erosion around such structures have become a major issue in these four rivers. In this study, an open source CFD software package, TELEMAC-MASCARET, is applied to simulate sediment transport and bed morphology around Gangjeong weir, the largest multipurpose weir built on the Nakdong River. The real bathymetry of the river and the geometry of the weir have been implemented in the numerical model. The numerical simulation is carried out with a real hydrograph at the upstream boundary. The bed morphology obtained from the numerical results has been validated against field observation data, and the maximum simulated scour depth is compared with the results obtained from the empirical formulas of Hoffmans. Agreement between numerical computations, observed data and empirical formulas is judged to be satisfactory on all major comparisons. The outcome of this study not only points out the locations where deposition and erosion might take place depending on the weir gate operation, but also analyzes the mechanism of formation and evolution of scour holes downstream of the weir gates.
Abstract:
Driven by Web 2.0 technology and the almost ubiquitous presence of mobile devices, Volunteered Geographic Information (VGI) is experiencing unprecedented growth. These notable technological advancements have opened fruitful perspectives in the field of water management and protection as well, raising the demand for a reconsideration of policies that also takes into account the emerging trend of VGI. This research investigates the opportunity of leveraging such technology to involve citizens equipped with common mobile devices (e.g. tablets and smartphones) in a campaign to report water-related phenomena. The work is carried out in collaboration with ADBPO - Autorità di bacino del fiume Po (Po river basin Authority), i.e. the entity responsible for the environmental planning and protection of the basin of the river Po. This is the longest Italian river, spreading over eight of the twenty Italian Regions and characterized by complex environmental issues. To enrich the ADBPO official database with user-generated content, a FOSS (Free and Open Source Software) architecture was designed that allows not only user field-data collection, but also data publication on the Web through standard protocols. The Open Data Kit suite allows users to collect georeferenced multimedia information using mobile devices equipped with location sensors (e.g. GPS). Users can report a number of environmental emergencies, problems or simple points of interest related to the Po river basin, taking pictures of them and providing other contextual information. Field-registered data is sent to a server and stored in a PostgreSQL database with the PostGIS spatial extension. GeoServer then provides data dissemination on the Web, while specific OpenLayers-based viewers were built to optimize data access on both desktop computers and mobile devices. Besides proving the suitability of FOSS in the frame of VGI, the system represents a successful prototype for the exploitation of users' local, real-time information aimed at managing and protecting water resources.
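As a minimal sketch of the server side of such a stack (the table schema, credentials and values below are assumptions, not details from the paper), a field report could be stored as a PostGIS point that GeoServer then publishes:

    # Minimal sketch: store one georeferenced field report in PostGIS so that
    # GeoServer can expose it as a WMS/WFS layer. Schema and values are invented.
    import psycopg2

    conn = psycopg2.connect(dbname="vgi", user="vgi_user",
                            password="secret", host="localhost")  # hypothetical credentials
    with conn, conn.cursor() as cur:
        # Hypothetical table: po_reports(category, description, photo_path, geom)
        cur.execute(
            """
            INSERT INTO po_reports (category, description, photo_path, geom)
            VALUES (%s, %s, %s, ST_SetSRID(ST_MakePoint(%s, %s), 4326))
            """,
            ("bank_erosion", "eroded levee near bridge",
             "photos/report_001.jpg", 9.70, 45.05),  # lon, lat in EPSG:4326
        )
    conn.close()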
Abstract:
A three-dimensional, time-dependent hydrodynamic and heat transport model of Lake Binaba, a shallow and small dam reservoir in Ghana, has been developed, emphasizing the simulation of dynamics and thermal structure. Most numerical studies of temperature dynamics in reservoirs are based on one- or two-dimensional models. These models are not applicable to reservoirs characterized by complex flow patterns and unsteady heat exchange between the atmosphere and the water surface. Continuity, momentum and temperature transport equations have been solved. Proper assignment of boundary conditions, especially surface heat fluxes, has been found crucial in simulating the lake's hydrothermal dynamics. The model is based on the Reynolds-averaged Navier-Stokes equations, using a Boussinesq approach, with a standard k − ε turbulence closure to solve the flow field. The thermal model includes a heat source term that takes into account the short-wave radiation as well as heat convection at the free surface, which is a function of air temperature, wind velocity and the stability conditions of the atmospheric boundary layer over the water surface. The governing equations of the model have been solved with OpenFOAM, an open source, freely available CFD toolbox. At its core, OpenFOAM has a set of efficient C++ modules that are used to build solvers. It uses collocated, polyhedral numerics that can be applied on unstructured meshes and can easily be extended to run in parallel. A new solver has been developed to solve the hydrothermal model of the lake. The simulated temperature was compared against a 15-day field data set. Simulated and measured temperature profiles at the probe locations show reasonable agreement. The model might be able to compute the total heat storage of water bodies to estimate evaporation from the water surface.
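As a rough illustration of the surface convection term described above (not the authors' actual OpenFOAM source term), a bulk-transfer estimate of the sensible heat flux at the free surface might look like the following; the transfer coefficient and all input values are assumed example numbers.

    # Illustrative bulk-transfer estimate of sensible heat flux at a lake surface.
    # Coefficient and inputs are assumed example values, not taken from the paper.
    RHO_AIR = 1.2    # air density, kg m^-3
    CP_AIR = 1005.0  # specific heat of air, J kg^-1 K^-1

    def sensible_heat_flux(wind_speed, t_air, t_surface, c_h=1.3e-3):
        """Bulk formula: positive flux warms the water (air warmer than surface)."""
        return RHO_AIR * CP_AIR * c_h * wind_speed * (t_air - t_surface)

    # Example: 3 m/s wind, air at 30 degC, water at 27 degC -> ~14 W m^-2
    print(sensible_heat_flux(3.0, 30.0, 27.0))

In a full model this term would be combined with short-wave, long-wave and latent heat fluxes, with the transfer coefficient adjusted for atmospheric stability, as the abstract indicates.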
Abstract:
Guides for mineral exploration are usually based on conceptual deposit models. These guides are typically grounded in the experience of geologists, in descriptive data and in genetic data. Numerical modelling, both probabilistic and non-probabilistic, for estimating the occurrence of mineral deposits is a new procedure whose use and acceptance by the geological community grows every day. This thesis applies recent methodologies for the generation of mineral favorability maps. The so-called Rivera Crystalline Island, an erosional window of the Paraná Basin located in the northern portion of Uruguay, was chosen as the case study for applying these methodologies. The mineral favorability maps were built from the following types of data, information and prospecting results: 1) orbital imagery; 2) geochemical prospecting; 3) airborne geophysical prospecting; 4) geological-structural mapping and 5) altimetry. This information was selected and processed on the basis of a mineral deposit model (conceptual model) developed from the San Gregorio Gold Mine. The conceptual model (San Gregorio model) included descriptive and genetic characteristics of the San Gregorio Mine, which encompasses the significant characteristic elements of the other known mineral occurrences in the Rivera Crystalline Island. Generating the mineral favorability maps involved building a database, processing the data, and integrating the data. The construction and processing stages comprised collecting, selecting and treating the data so as to constitute the so-called Information Layers. These Information Layers were generated and processed in organized groupings so as to constitute the Integration Factors for favorability mapping in the Rivera Crystalline Island. The data were integrated using two different methodologies: 1) Weights of Evidence (data-driven) and 2) Fuzzy Logic (knowledge-driven). The mineral favorability maps resulting from the two integration methodologies were first analysed and interpreted individually. A comparative analysis of the results followed. Both methodologies succeeded in identifying, as areas of high favorability, the known mineralized areas, as well as other areas not yet worked. The favorability maps from the two methodologies coincided with respect to the areas of highest favorability. The Weights of Evidence methodology produced the more conservative favorability map in terms of areal extent, but the more optimistic one in terms of favorability values, compared with the maps resulting from the Fuzzy Logic methodology. New targets for mineral exploration were identified and should be investigated in detail.
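For readers unfamiliar with the data-driven method named above, a minimal generic sketch of the Weights of Evidence calculation for one binary evidence layer follows (standard formulation, not the thesis code; the counts are invented):

    # Generic weights of evidence for one binary evidence layer B and deposits D.
    # Counts are invented: 8 of 10 deposit cells fall on an anomaly covering
    # 100 of 10000 cells in total.
    import math

    def weights_of_evidence(n_b_and_d, n_b, n_d, n_total):
        p_b_d = n_b_and_d / n_d                         # P(B | D)
        p_b_nd = (n_b - n_b_and_d) / (n_total - n_d)    # P(B | not D)
        w_plus = math.log(p_b_d / p_b_nd)               # weight where B is present
        w_minus = math.log((1 - p_b_d) / (1 - p_b_nd))  # weight where B is absent
        return w_plus, w_minus

    print(weights_of_evidence(8, 100, 10, 10000))  # ~ (4.46, -1.60)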
Abstract:
This paper provides an examination of the emergence of open business models — entrepreneurial strategies that take advantage of the ease of digital reproduction to distribute free content, while earning money from the sale of related products and services. Locating the origins of open business in the open source software phenomenon, the authors suggest that the business strategies innovated there have broader economic relevance. Through a case study of the tecnobrega music scene in Belém, the paper illustrates how open business models can be applied to the production of cultural materials more generally.
Abstract:
The combined use of algorithms for topic discovery in document collections with techniques oriented to visualizing the evolution of those topics over time allows thematic patterns in large corpora to be explored through compact visual representations. The research presented here investigated the visualization requirements of the data on the thematic composition of documents obtained through topic modelling, which is sparse and multi-attribute, at different levels of detail, both by developing a purpose-built visualization technique and by using an open source data visualization library, in a comparative fashion. For the topic-flow visualization problem under study, conflicting visualization requirements were observed at different data resolutions, which led to a detailed investigation of how the data is manipulated and displayed. From this investigation, the hypothesis defended was that the integrated use of more than one visualization technique, chosen according to the resolution of the data, broadens the possibilities for exploring the object under study relative to what would be obtained with a single technique. Mapping the limits of these techniques according to the resolution at which the data is explored is the main contribution of this work, intended to support the development of new applications.
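As a generic illustration of the pipeline this work builds on (not the thesis's own technique or the open source library, which the abstract does not name), one can fit a topic model and view topic prevalence over time as a stacked stream; the corpus, years and parameters below are invented.

    # Generic topic-flow sketch: LDA over a toy corpus, then a stacked area
    # plot of mean topic weight per year. Corpus and parameters are invented.
    import numpy as np
    import matplotlib.pyplot as plt
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    docs = ["river flood sensor data", "open source gis software",
            "sensor network monitoring", "gis map open data",
            "flood model simulation", "software library visualization"]
    years = np.array([2010, 2010, 2011, 2011, 2012, 2012])

    X = CountVectorizer().fit_transform(docs)
    doc_topic = LatentDirichletAllocation(n_components=3,
                                          random_state=0).fit_transform(X)

    # Average topic composition per year, then draw the topic "flow".
    uniq = np.unique(years)
    flow = np.vstack([doc_topic[years == y].mean(axis=0) for y in uniq]).T
    plt.stackplot(uniq, flow)
    plt.xlabel("year"); plt.ylabel("mean topic weight")
    plt.show()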
Abstract:
This work aims to develop a methodology for image analysis based on overlay, which assists in identifying microstructural features in regions of titanium that may be associated with its biological response. To that end, titanium surfaces heat treated in eight different ways were subjected to a cell culture test. A relationship was sought between the grain size, texture and grain shape of the (etched) titanium surfaces and the processes of cell proliferation and adhesion. Open source software was used to count the cells adhered to the titanium surface. The juxtaposition of the images taken before and after cell culture was achieved with the aid of micro-hardness indentations made on the surface of the samples. From the overlaid image, it is possible to study a possible relationship between cell growth and the microstructural characteristics of the titanium surface. The methodology proved efficient in describing a set of procedures that are useful in the analysis of titanium surfaces subjected to cell culture.
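The abstract does not name the cell-counting software used; as a generic sketch of how such counting is commonly automated (threshold, remove specks, label connected regions), using scikit-image with assumed parameters and a hypothetical file name:

    # Generic cell-counting sketch with scikit-image: threshold the micrograph,
    # drop objects below an assumed minimum area, and count labelled regions.
    from skimage import filters, io, measure, morphology

    image = io.imread("titanium_culture.png", as_gray=True)  # hypothetical file
    binary = image > filters.threshold_otsu(image)           # global Otsu threshold
    binary = morphology.remove_small_objects(binary, min_size=50)  # assumed cutoff

    labels = measure.label(binary)  # connected components = candidate cells
    print(len(measure.regionprops(labels)), "cells detected")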