928 results for Geo-environmental systems
Abstract:
ETL (Extract, Transform, Load) tools make it possible to model data flows, facilitating the automatic execution of repetitive processes. Exchanging information between two heterogeneous data models is a clear example of the kind of task that can be tackled with ETL software. The Kettle project is an ETL tool released under the LGPL (Library General Public License) that uses grid computing techniques (parallel and distributed execution) to process large amounts of data in a short time. Kettle combines powerful server-side execution with an intuitive desktop tool for modelling processes and configuring execution parameters. GeoKettle is an extension of Kettle that adds the ability to handle data with a geographic component, although it is limited to vector data and to a few very specific spatial operations. The European Topic Centre on Land Use and Spatial Information (ETC-LUSI) is promoting a complementary project, called BeETLe, which aims to dramatically extend GeoKettle's spatial analysis and transformation capabilities. For this purpose the Sextante project has been chosen, a spatial analysis library that includes more than two hundred raster and vector algorithms. The aim of the BeETLe project is to integrate the full set of Sextante algorithms into GeoKettle so that they become available as GeoKettle transformations. The main features of the BeETLe tool include: automation of spatial analysis processes and of repetitive transformations of spatial data, parallel and distributed execution (grid computing), the ability to process large amounts of data without memory limitations, and support for both raster and vector data. Current Sextante users will find that BeETLe offers them a simple and intuitive way of working, adding to Sextante all the power that ETL tools provide for processing and transforming information in databases.
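As a rough illustration of the kind of repetitive spatial transformation BeETLe is meant to automate, the sketch below chains extract, transform and load steps in plain Python; it is not GeoKettle's actual Java tooling, and the file names and the trivial coordinate-shift "algorithm" are assumptions standing in for a real Sextante operation.

```python
# Minimal ETL sketch: extract vector features, apply a stand-in spatial
# transform, and load the result. File names and the operation are invented.
import json

def extract(path):
    """Extract: read vector features from a (hypothetical) GeoJSON file."""
    with open(path) as f:
        return json.load(f)["features"]

def transform(features, dx=0.01, dy=0.01):
    """Transform: placeholder for a Sextante-style vector algorithm
    (here, a trivial coordinate shift applied to point features)."""
    out = []
    for feat in features:
        geom = feat["geometry"]
        if geom["type"] == "Point":
            x, y = geom["coordinates"][:2]
            geom = {"type": "Point", "coordinates": [x + dx, y + dy]}
        out.append({**feat, "geometry": geom})
    return out

def load(features, path):
    """Load: write the transformed features back out as GeoJSON."""
    with open(path, "w") as f:
        json.dump({"type": "FeatureCollection", "features": features}, f)

if __name__ == "__main__":
    load(transform(extract("input.geojson")), "output.geojson")
```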
Abstract:
When publishing information on the web, one expects it to reach all the people that could be interested in it. This is mainly achieved with general-purpose indexing and search engines such as Google, currently the most widely used. In the particular case of the geographic information (GI) domain, exposing content to mainstream search engines is a complex task that needs specific actions. On many occasions it is convenient to provide a web site with a specially tailored search engine. Such is the case for on-line dictionaries (Wikipedia, WordReference), stores (Amazon, eBay), and generally all sites holding thematic databases. Due to the proliferation of these engines, A9.com proposed a standard interface called OpenSearch, used by modern web browsers to manage custom search engines. Geographic information can also benefit from the use of specific search engines. We can distinguish two main approaches in GI information retrieval efforts: classical OGC standardization on the one hand (CSW, WFS filters), which is very complex for the mainstream user, and the neogeographer's approach on the other, usually in the form of specific APIs lacking a common query interface and standard geographic formats. A draft 'geo' extension for OpenSearch has been proposed. It adds geographic filtering for queries and recommends a set of simple standard geographic response formats, such as KML, Atom and GeoRSS. This proposal enables standardization while keeping simplicity, thus covering a wide range of use cases in both the OGC and neogeography paradigms. In this article we analyze the OpenSearch geo extension and its use cases in detail, demonstrating its applicability to both the SDI and the geoweb. Open-source implementations are presented as well.
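A minimal sketch of the kind of query the draft 'geo' extension enables, assuming a hypothetical OpenSearch endpoint that returns an Atom feed. The endpoint URL and the exact parameter names ("q", "bbox") are assumptions; a real client would read them from the service's OpenSearch description document.

```python
# Issue a free-text search constrained by a geo:box filter and list the
# titles of the Atom entries returned. Endpoint and parameter names assumed.
from urllib.parse import urlencode
from urllib.request import urlopen
import xml.etree.ElementTree as ET

ENDPOINT = "https://example.org/opensearch"  # hypothetical service

def geo_search(terms, west, south, east, north):
    """Query with searchTerms plus a bounding box; expect an Atom feed back."""
    params = urlencode({
        "q": terms,                                # searchTerms
        "bbox": f"{west},{south},{east},{north}",  # geo:box (W,S,E,N)
        "format": "atom",
    })
    with urlopen(f"{ENDPOINT}?{params}") as resp:
        feed = ET.parse(resp)
    ns = {"atom": "http://www.w3.org/2005/Atom"}
    return [entry.findtext("atom:title", namespaces=ns)
            for entry in feed.findall("atom:entry", ns)]

# Example: entries mentioning "flood" inside a box covering the Ebro delta.
# print(geo_search("flood", 0.5, 40.5, 1.0, 41.0))
```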
Abstract:
The software world is changing. The growth of the Internet and of data connections means that people are connected practically anywhere. The maturity of certain technologies and the shift in the user profile from content consumer to content generator are some of the pillars of this change. Content Management Systems (CMS) are platforms that provide the basis for building collaborative websites easily and without requiring extensive prior knowledge, and they are responsible for a good part of this development. One possibility that has not yet been sufficiently exploited in these systems is the georeferencing of content. This gives rise to a new category of semantic links based on spatial relationships. Given the current state of the art, the power of spatial databases can be harnessed to manage georeferenced content and its spatial relationships, yet practically no CMS takes advantage of this. This project focuses on developing a module for the Drupal CMS that provides truly spatial support and a map-based graphical interface through which content can be georeferenced. The module is independent of the map provider because it uses the open-source map abstraction library IDELab Mapstraction Interactive. In this way, technological independence is combined with truly spatial management of content.
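A hedged illustration, not the Drupal module's actual PHP code, of the new category of spatial link the abstract mentions: relating georeferenced content items by proximity. The item data and the 10 km radius are invented; in practice the module would delegate this kind of filter to a spatial database query rather than a Python loop.

```python
# Find content items within a given radius of a reference item.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def nearby(items, ref, radius_km=10.0):
    """Return content items spatially related to `ref` by proximity."""
    return [it for it in items
            if it is not ref
            and haversine_km(ref["lat"], ref["lon"], it["lat"], it["lon"]) <= radius_km]

posts = [
    {"title": "Castle visit report", "lat": 39.47, "lon": -0.38},
    {"title": "City beach photos",   "lat": 39.46, "lon": -0.33},
    {"title": "Pyrenees hike",       "lat": 42.60, "lon":  0.70},
]
print([p["title"] for p in nearby(posts, posts[0])])
```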
Abstract:
The prediction of extratropical cyclones by the European Centre for Medium-Range Weather Forecasts (ECMWF) and the National Centers for Environmental Prediction (NCEP) Ensemble Prediction Systems (EPS) has been investigated using an objective feature-tracking methodology to identify and track the cyclones along the forecast trajectories. Overall, the results show that the ECMWF EPS has a slightly higher level of skill than the NCEP EPS in the northern hemisphere (NH). However, in the southern hemisphere (SH), NCEP has higher predictive skill than ECMWF for the intensity of the cyclones. The results from both EPSs indicate a higher level of predictive skill for the position of extratropical cyclones than for their intensity, and show that there is a larger spread in intensity than in position. Further analysis shows that the predicted propagation speed of cyclones is generally too slow for the ECMWF EPS, and there is a slight bias for the intensity of the cyclones to be overpredicted. This is also true for the NCEP EPS in the SH. For the NCEP EPS in the NH, the intensity of the cyclones is underpredicted. There is a small bias in both EPSs for the cyclones to be displaced towards the poles. For each ensemble forecast of each cyclone, the predictive skill of the ensemble member that best predicts the cyclone's position and intensity was computed. The results are very encouraging, showing that the predictive skill of the best ensemble member is significantly higher than that of the control forecast in terms of both the position and intensity of the cyclones. The prediction of cyclones before they are identified as 850 hPa vorticity centers in the analysis cycle was also considered. It is shown that an indication of extratropical cyclones can be given by at least one ensemble member 7 days before they are identified in the analysis. Further analysis of the ECMWF EPS shows that the ensemble mean has a higher level of skill than the control forecast, particularly for the intensity of the cyclones, from day 3 of the forecast. There is a higher level of skill in the NH than in the SH, and the spread in the SH is correspondingly larger. The difference between the ensemble mean error and the spread is very small for the position of the cyclones, but the spread of the ensemble is smaller than the ensemble mean error for the intensity of the cyclones in both hemispheres. Results also show that the ECMWF control forecast has ½ to 1 day more skill than the perturbed members, for both the position and intensity of the cyclones, throughout the forecast.
Abstract:
Virtual globe technology holds many exciting possibilities for environmental science. These easy-to-use, intuitive systems provide means for simultaneously visualizing four-dimensional environmental data from many different sources, enabling the generation of new hypotheses and driving greater understanding of the Earth system. Through the use of simple markup languages, scientists can publish and consume data in interoperable formats without the need for technical assistance. In this paper we give, with examples from our own work, a number of scientific uses for virtual globes, demonstrating their particular advantages. We explain how we have used Web Services to connect virtual globes with diverse data sources and enable more sophisticated usage such as data analysis and collaborative visualization. We also discuss the current limitations of the technology, with particular regard to the visualization of subsurface data and vertical sections.
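As an example of the "simple markup languages" the paper refers to, the sketch below generates a single KML Placemark that a virtual globe can display; the placemark name, coordinates and observation value are invented for illustration.

```python
# Build a minimal KML document containing one point Placemark.
import xml.etree.ElementTree as ET

KML_NS = "http://www.opengis.net/kml/2.2"

def placemark_kml(name, lon, lat, description=""):
    """Return a KML string describing a single named point."""
    ET.register_namespace("", KML_NS)
    kml = ET.Element(f"{{{KML_NS}}}kml")
    pm = ET.SubElement(kml, f"{{{KML_NS}}}Placemark")
    ET.SubElement(pm, f"{{{KML_NS}}}name").text = name
    ET.SubElement(pm, f"{{{KML_NS}}}description").text = description
    point = ET.SubElement(pm, f"{{{KML_NS}}}Point")
    ET.SubElement(point, f"{{{KML_NS}}}coordinates").text = f"{lon},{lat},0"
    return ET.tostring(kml, encoding="unicode")

# Invented observation, purely illustrative.
print(placemark_kml("Buoy 42", -24.5, 57.1, "Sea surface temperature: 12.3 C"))
```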
Abstract:
GODIVA2 is a dynamic website that provides visual access to several terabytes of physically distributed, four-dimensional environmental data. It allows users to explore large datasets interactively without the need to install new software or download and understand complex data. Through the use of open international standards, GODIVA2 maintains a high level of interoperability with third-party systems, allowing diverse datasets to be mutually compared. Scientists can use the system to search for features in large datasets and to diagnose the output from numerical simulations and data processing algorithms. Data providers around Europe have adopted GODIVA2 as an INSPIRE-compliant dynamic quick-view system for providing visual access to their data.
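The abstract cites open international standards without naming them; the OGC Web Map Service (WMS) is one such standard widely used for visual access to gridded data, and the sketch below composes a GetMap request against a hypothetical endpoint. The server URL, layer name, bounding box and time value are assumptions, not details taken from the paper.

```python
# Compose a WMS 1.3.0 GetMap URL returning a PNG rendering of one layer.
from urllib.parse import urlencode

def getmap_url(base, layer, bbox, width=512, height=512, time=None):
    """Build a GetMap request for a single layer over a lon/lat bounding box."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "CRS": "CRS:84",                       # lon/lat axis order
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    if time:
        params["TIME"] = time                  # e.g. an ISO 8601 timestamp
    return f"{base}?{urlencode(params)}"

print(getmap_url("https://example.org/wms", "ocean/sea_water_temperature",
                 (-30.0, 40.0, 10.0, 65.0), time="2008-06-15T00:00:00Z"))
```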
Abstract:
Soils represent a large carbon pool, approximately 1500 Gt, which is equivalent to almost three times the quantity stored in terrestrial biomass and twice the amount stored in the atmosphere. Any modification of land use or land management can induce variations in soil carbon stocks, even in agricultural systems that are perceived to be in a steady state. Tillage practices often induce soil aerobic conditions that are favourable to microbial activity and may lead to a degradation of soil structure. As a result, mineralisation of soil organic matter increases in the long term. The adoption of no-tillage systems and the maintenance of a permanent vegetation cover using the Direct-seeding Mulch-based Cropping system (DMC) may increase carbon levels in the topsoil. In Brazil, no-tillage practices (mainly DMC) were introduced approximately 30 years ago in the south, in Paraná state, primarily as a means of reducing erosion. Subsequently, research has begun to study the management of crop waste products and their effects on soil fertility, whether in terms of phosphorus management, as a means of controlling soil acidity, or in determining how manures can be applied in a more localised manner. The spread of no-till in Brazil has involved a large amount of extension work. The area under no-tillage is still increasing in the centre and north of the country and currently occupies ca. 20 million hectares, covering a diversity of environmental conditions, cropping systems and management practices. Most studies of Brazilian soils give rates of carbon storage in the top 40 cm of the soil of 0.4 to 1.7 t C ha⁻¹ per year, with the highest rates in the Cerrado region. However, caution must be taken when analysing DMC systems in terms of carbon sequestration. Comparisons should include changes in trace gas fluxes and should not be limited to a consideration of carbon storage in the soil alone if the full implications for global warming are to be assessed.
Abstract:
This paper describes a new bio-indicator method for assessing wetland ecosystem health; as such, the study is particularly relevant to current legislation such as the EU Water Framework Directive, which provides a baseline of the current status of surface waters. Seven wetland sites were monitored across northern Britain, with model construction data for predicting eco-hydrological relationships collected from five sites during 1999. Two new sites and one repeat site were monitored during 2000 to provide model test data. The main growing season for the vegetation, and hence the sampling period, was May-August during both years. Seasonal mean concentrations of nitrate (NO3-) in surface and soil water samples during 1999 ranged from 0.01 to 14.07 mg N l⁻¹, with a mean value of 1.01 mg N l⁻¹. During 2000, concentrations ranged from trace level (<0.01 mg N l⁻¹) to 9.43 mg N l⁻¹, with a mean of 2.73 mg N l⁻¹. Surface and soil-water nitrate concentrations did not influence plant species composition significantly across representative tall herb fen and mire communities. Predictive relationships were found between nitrate concentrations and structural characteristics of the wetland vegetation, and a model was developed which predicted nitrate concentrations from measures of plant diversity, canopy structure and density of reproductive structures. Two further models, which predicted stem density and density of reproductive structures respectively, utilised nitrate concentration as one of the independent predictor variables. Where appropriate, the models were tested using data collected during 2000. This approach is complementary to species-based monitoring, representing a useful and simple tool for assessing ecological status in target wetland systems, and has potential for bio-indication purposes.
Abstract:
This paper describes the results and conclusions of the INCA (Integrated Nitrogen Model for European CAtchments) project and sets the findings in the context of the ELOISE (European Land-Ocean Interaction Studies) programme. The INCA project was concerned with the development of a generic model of the major factors and processes controlling nitrogen dynamics in European river systems, thereby providing a tool (a) to aid the scientific understanding of nitrogen transport and retention in catchments and (b) for river-basin management and policy-making. The findings of the study highlight the heterogeneity of the factors and processes controlling nitrogen dynamics in freshwater systems. Nonetheless, the INCA model was able to simulate the in-stream nitrogen concentrations and fluxes observed at annual and seasonal timescales in Arctic, Continental and Maritime-Temperate regimes. This result suggests that the data requirements and structural complexity of the INCA model are appropriate for simulating nitrogen fluxes across a wide range of European freshwater environments. This is a major requirement for the production of coupled river-estuary-coastal shelf models for the management of our aquatic environment. With regard to river-basin management, the model simulations suggest that, to achieve an efficient reduction in nutrient fluxes from the land to the estuarine and coastal zone, management options must be adaptable to the prevailing environmental and socio-economic factors in individual catchments: 'blanket approaches' to environmental policy appear too simple.
Abstract:
The technique of linear responsibility analysis is used for a retrospective case study of a private industrial development consisting of an engineering factory and offices. A multi-disciplinary professional practice was used to manage and design the project. The organizational structure adopted on the project is analysed using concepts from systems theory which are included in Walker's theoretical model of the structure of building project organizations (Walker, 1981). This model proposes that the process of building provision can be viewed as systems and sub-systems which are differentiated from each other at decision points. Further to this, the sub-system analysis of the relationship between the contributors gives a quantitative assessment of the efficiency of the organizational structure used. There was a high level of satisfaction with the completed project, and this is reflected in the way in which the organizational structure corresponded to the model's proposition. However, the project was subject to strong environmental forces which the project organization was not capable of entirely overcoming.
Abstract:
The management of a public sector project is analysed using a model developed from systems theory. Linear responsibility analysis is used to identify the primary and key decision structure of the project and to generate quantitative data regarding differentiation and integration of the operating system, the managing system and the client/project team. The environmental context of the project is identified. Conclusions are drawn regarding the ability of the project organization structure to cope with the prevailing environmental conditions. It is found that the complexity of the managing system imposed on the project prevented it from coping with those conditions and created serious deficiencies in the outcome of the project.
Abstract:
The systems used for the procurement of buildings are organizational systems. They involve people in a series of strategic decisions and a pattern of roles, responsibilities and relationships that combine to form the organizational structure of the project. To ensure the effectiveness of the building team, this organizational structure needs to be contingent upon the environment within which the construction project takes place. In addition, a changing environment means that the organizational structure within a project needs to be responsive and dynamic. These needs are often not satisfied in the construction industry, owing to the lack of analytical tools with which to analyse the environment and to design appropriate temporary organizations. This paper presents two techniques. The first is "Environmental Complexity Analysis", which identifies the key variables in the environment of the construction project. These are classified as Financial, Legal, Technological, Aesthetic and Policy. It is proposed that their identification will set the parameters within which the project has to be managed. This provides a basis for the project managers to define the relevant set of decision points that will be required for the project. The Environmental Complexity Analysis also identifies the project's requirements for control systems concerning Budget, Contractual, Functional, Quality and Time control. Environmental scanning needs to be carried out at regular points during the procurement process to ensure that the organizational structure is adaptive to the changing environment. The second technique is "3R analysis", a graphical technique for describing and modelling Roles, Responsibilities and Relationships. A list of steps is introduced that explains the recommended procedure for setting up a flexible organizational structure that is responsive to the environment of the project. This is in contrast with the current trend towards predetermined procurement paths that may not always be in the best interests of the client.
Abstract:
This report forms part of a larger research programme on 'Reinterpreting the Urban-Rural Continuum', which conceptualises and investigates current knowledge and research gaps concerning 'the role that ecosystem services play in the livelihoods of the poor in regions undergoing rapid change'. The report aims to conduct a baseline appraisal of water-dependent ecosystem services, the roles they play within desakota livelihood systems and their potential sensitivity to climate change. The appraisal is conducted at three spatial scales: global, regional (four consortia areas) and meso scale (case studies within the four regions). At all three scales of analysis, water resources form the interweaving theme because water provides a vital provisioning service for people, supports all other ecosystem processes, and is forecast to be severely affected under climate change scenarios. This report, combined with an EndNote library of over 1100 scientific papers, provides an annotated bibliography of water-dependent ecosystem services, the roles they play within desakota livelihood systems and their potential sensitivity to climate change. After an introductory section, Section 2 of the report defines water-related ecosystem services and how these are affected by human activities. Current knowledge and research gaps are then explored in relation to global-scale climate and related hydrological changes (e.g. floods, droughts, flow regimes) (Section 3). The report then discusses the impacts of climate change on the ESPA regions, emphasising potential responses of biomes to the combined effects of climate change and human activities (particularly land use and management), and how these effects, coupled with the manipulation of water stores and flow regimes by humans, may affect the functioning of catchments and their ecosystem services (Section 4). Finally, at the meso scale, case studies are presented from within the ESPA regions to illustrate the close coupling of human activities and catchment performance in the context of environmental change (Section 5). At the end of each section, research needs are identified and justified. These research needs are then amalgamated in Section 6.
Abstract:
The Observing System Research and Predictability Experiment (THORPEX) Interactive Grand Global Ensemble (TIGGE) is a World Weather Research Programme project. One of its main objectives is to enhance collaboration on the development of ensemble prediction between operational centers and universities by increasing the availability of ensemble prediction system (EPS) data for research. This study analyzes the prediction of Northern Hemisphere extratropical cyclones by nine different EPSs archived as part of the TIGGE project for the 6-month time period of 1 February 2008–31 July 2008, which included a sample of 774 cyclones. An objective feature tracking method has been used to identify and track the cyclones along the forecast trajectories. Forecast verification statistics have then been produced [using the European Centre for Medium-Range Weather Forecasts (ECMWF) operational analysis as the truth] for cyclone position, intensity, and propagation speed, showing large differences between the different EPSs. The results show that the ECMWF ensemble mean and control have the highest level of skill for all cyclone properties. The Japan Meteorological Agency (JMA), the National Centers for Environmental Prediction (NCEP), the Met Office (UKMO), and the Canadian Meteorological Centre (CMC) have 1 day less skill for the position of cyclones throughout the forecast range. The relative performance of the different EPSs remains the same for cyclone intensity except for NCEP, which has larger errors than for position. NCEP, the Centro de Previsão de Tempo e Estudos Climáticos (CPTEC), and the Australian Bureau of Meteorology (BoM) all have faster intensity error growth in the earlier part of the forecast. They are also very underdispersive and significantly underpredict intensities, perhaps because the comparatively low spatial resolutions of these EPSs cannot accurately represent the tilted structure essential to cyclone growth and decay. There is very little difference between the levels of skill of the ensemble mean and control for cyclone position, but the ensemble mean provides an advantage over the control for all EPSs except CPTEC in cyclone intensity, and there is an advantage for propagation speed for all EPSs. ECMWF and JMA have an excellent spread–skill relationship for cyclone position. The EPSs are all much more underdispersive for cyclone intensity and propagation speed than for position, with ECMWF and CMC performing best for intensity and CMC performing best for propagation speed. ECMWF is the only EPS to consistently overpredict cyclone intensity, although the bias is small. BoM, NCEP, UKMO, and CPTEC significantly underpredict intensity and, interestingly, all the EPSs underpredict the propagation speed; that is, the cyclones move too slowly on average in all EPSs.
Abstract:
Near-isogenic lines (NILs) varying for genes for reduced height (Rht) and photoperiod insensitivity (Ppd-D1a) in a cv. Mercia background (rht (tall), Rht-B1b, Rht-D1b, Rht-B1c, Rht8c + Ppd-D1a, Rht-D1c, Rht12) were compared at one field site but within contrasting ('organic' vs. 'conventional') rotational and agronomic contexts, in each of 3 years. In the final year, further NILs (rht (tall), Rht-B1b, Rht-D1b, Rht-B1c, Rht-B1b + Rht-D1b, Rht-D1b + Rht-B1c) in both Maris Huntsman and Maris Widgeon backgrounds were added, together with 64 lines of a doubled haploid (DH) population [Savannah (Rht-D1b) x Renesansa (Rht8c + Ppd-D1a)]. Assessments included laboratory tests of germination and coleoptile length, and various field measurements of crop growth between emergence and pre-jointing [plant population, tillering, leaf length, ground cover (GC), interception of photosynthetically active radiation (PAR), crop dry matter (DM) and nitrogen accumulation (N), far-red:red reflectance ratio (FR:R), crop height, and weed dry matter]. All of the dwarfing alleles except Rht12 in the Mercia background and Rht8c in the DHs were associated with reduced coleoptile length. Most of the dwarfing alleles (depending on background) reduced seed viability. Severe dwarfing alleles (Rht-B1c, Rht-D1c and Rht12) were routinely associated with fewer plant numbers and reduced early crop growth (GC, PAR, DM, N, FR:R), and, in one year, increased weed DM. In the Mercia background and the DHs, the semi-dwarfing allele Rht-D1b was also sometimes associated with reductions in early crop growth; no such negative effects were associated with the marker for Rht8c. When significant interactions between cropping system and genotype did occur, it was because differences between lines were more exaggerated in the organic system than in the conventional system. Ppd-D1a was associated positively with plant numbers surviving the winter and early crop growth (GC, FR:R, DM, N, PAR, height), and was the most significant locus in a QTL analysis. We conclude that, within these environmental and system contexts, genes moderating development are likely to be more important in influencing early resource capture than using Rht8c as an alternative semi-dwarfing gene to Rht-D1b.