72 results for LCA methodology
Abstract:
The Twitter network has been labelled the most commonly used microblogging application around today. With an estimated 500 million registered users as of June 2012, Twitter has become a credible medium of sentiment/opinion expression. It is also a notable medium for information dissemination, including breaking news on diverse issues, since it was launched in 2007. Many organisations, individuals and even government bodies follow activities on the network in order to obtain knowledge on how their audience reacts to tweets that affect them. We can use postings on Twitter (known as tweets) to analyse patterns associated with events by detecting the dynamics of the tweets. A common way of labelling a tweet is by including a number of hashtags that describe its contents. Association Rule Mining can find the likelihood of co-occurrence of hashtags. In this paper, we propose the use of temporal Association Rule Mining to detect rule dynamics, and consequently dynamics of tweets. We coined our methodology Transaction-based Rule Change Mining (TRCM). A number of patterns are identifiable in these rule dynamics, including new rules, emerging rules, unexpected rules and 'dead' rules. The linkage between the different types of rule dynamics is also investigated experimentally in this paper.
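The rule-dynamics idea above can be sketched in a few lines of Python. This is a much-simplified illustration of the categories named in the abstract, not the paper's actual TRCM definitions (which rely on similarity measures over rule left- and right-hand sides); the example hashtags and confidence values are hypothetical.

```python
# Simplified sketch of rule-dynamics detection in the spirit of TRCM.
# A rule maps a set of hashtags (antecedent) to another set (consequent)
# with an associated confidence. The classifications below are illustrative
# approximations of the paper's categories, not its exact definitions.

def classify_rule_dynamics(old_rules, new_rules):
    """old_rules/new_rules: dict {(antecedent, consequent): confidence}."""
    old_lhs = {ante for ante, _ in old_rules}
    dynamics = {"new": [], "emerging": [], "unexpected": [], "dead": []}
    for rule, conf in new_rules.items():
        ante, cons = rule
        if rule in old_rules:
            if conf > old_rules[rule]:          # same rule, stronger confidence
                dynamics["emerging"].append(rule)
        elif ante in old_lhs:                   # known antecedent, new consequent
            dynamics["unexpected"].append(rule)
        else:                                   # antecedent never seen before
            dynamics["new"].append(rule)
    dynamics["dead"] = [r for r in old_rules if r not in new_rules]
    return dynamics

old = {(frozenset({"#olympics"}), frozenset({"#london"})): 0.6}
new = {
    (frozenset({"#olympics"}), frozenset({"#london"})): 0.8,   # emerging
    (frozenset({"#olympics"}), frozenset({"#medals"})): 0.5,   # unexpected
    (frozenset({"#euro2012"}), frozenset({"#spain"})): 0.7,    # new
}
print(classify_rule_dynamics(old, new))
```

Rules present in both windows with rising confidence are "emerging"; rules whose antecedent is known but whose consequent changed are "unexpected"; rules from the earlier window that vanish are "dead".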
Abstract:
We present a simple sieving methodology to aid the recovery of large cultigen pollen grains, such as maize (Zea mays L.), manioc (Manihot esculenta Crantz), and sweet potato (Ipomoea batatas L.), among others, for the detection of food production using fossil pollen analysis of lake sediments in the tropical Americas. The new methodology was tested on three large study lakes located next to known and/or excavated pre-Columbian archaeological sites in South and Central America. Five paired samples, one treated by sieving, the other prepared using standard methodology, were compared for each of the three sites. Using the new methodology, chemically digested sediment samples were passed through a 53 µm sieve, and the residue was retained, mounted in silicone oil, and counted for large cultigen pollen grains. The filtrate was mounted and analysed for pollen according to standard palynological procedures. Zea mays (L.) was recovered from the sediments of all three study lakes using the sieving technique, where no cultigen pollen had been previously recorded using the standard methodology. Confidence intervals demonstrate there is no significant difference in pollen assemblages between the sieved versus unsieved samples. Equal numbers of exotic Lycopodium spores added to both the filtrate and residue of the sieved samples allow for direct comparison of cultigen pollen abundance with the standard terrestrial pollen count. Our technique enables the isolation and rapid scanning for maize and other cultigen pollen in lake sediments, which, in conjunction with charcoal and pollen records, is key to determining land-use patterns and the environmental impact of pre-Columbian societies.
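The exotic-marker bookkeeping described above can be made concrete with the standard Lycopodium-spike calculation widely used in palynology. The counts and volumes below are hypothetical examples, not data from the study.

```python
# Hedged sketch: the standard exotic-marker (Lycopodium spike) calculation
# used to convert raw counts into concentrations. All numbers are
# hypothetical illustrations, not results from the paper.

def pollen_concentration(pollen_counted, spike_added, spike_counted, sample_volume_cm3):
    """Estimate pollen concentration (grains per cm^3) from a marker spike.

    pollen_counted    : fossil pollen grains counted
    spike_added       : exotic Lycopodium spores added to the sample
    spike_counted     : exotic spores encountered during counting
    sample_volume_cm3 : sediment volume processed
    """
    return pollen_counted * spike_added / (spike_counted * sample_volume_cm3)

# Because equal spikes go into both the >53 um residue and the filtrate,
# cultigen concentrations from the residue are directly comparable to the
# standard terrestrial pollen concentration from the filtrate.
residue_conc = pollen_concentration(12, 10000, 250, 1.0)    # e.g. Zea mays in residue
filtrate_conc = pollen_concentration(480, 10000, 200, 1.0)  # terrestrial pollen
print(residue_conc, filtrate_conc)
```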
Abstract:
Contrails, and especially their evolution into cirrus-like clouds, are thought to have very important effects on local and global radiation budgets, though they are generally not well represented in global climate models. The lack of contrail parameterisations is due to the limited availability of in situ contrail measurements, which are difficult to obtain. Here we present a methodology for successful sampling and interpretation of contrail microphysical and radiative data using both in situ and remote sensing instrumentation on board the FAAM BAe146 UK research aircraft as part of the COntrails Spreading Into Cirrus (COSIC) study.
Abstract:
The Wetland and Wetland CH4 Intercomparison of Models Project (WETCHIMP) was created to evaluate our present ability to simulate large-scale wetland characteristics and corresponding methane (CH4) emissions. A multi-model comparison is essential to evaluate the key uncertainties in the mechanisms and parameters leading to methane emissions. Ten modelling groups joined WETCHIMP to run eight global and two regional models with a common experimental protocol using the same climate and atmospheric carbon dioxide (CO2) forcing datasets. We reported the main conclusions from the intercomparison effort in a companion paper (Melton et al., 2013). Here we provide technical details for the six experiments, which included an equilibrium, a transient, and an optimized run plus three sensitivity experiments (temperature, precipitation, and atmospheric CO2 concentration). The diversity of approaches used by the models is summarized through a series of conceptual figures, and is used to evaluate the wide range of wetland extent and CH4 fluxes predicted by the models in the equilibrium run. We discuss relationships among the various approaches and patterns in consistencies of these model predictions. Within this group of models, there are three broad classes of methods used to estimate wetland extent: prescribed based on wetland distribution maps, prognostic relationships between hydrological states based on satellite observations, and explicit hydrological mass balances. A larger variety of approaches was used to estimate the net CH4 fluxes from wetland systems. Even though modelling of wetland extent and CH4 emissions has progressed significantly over recent decades, large uncertainties still exist when estimating CH4 emissions: there is little consensus on model structure or complexity due to knowledge gaps, different aims of the models, and the range of temporal and spatial resolutions of the models.
Abstract:
Many urban surface energy balance models now exist. These vary in complexity from simple schemes that represent the city as a concrete slab, to those which incorporate detailed representations of momentum and energy fluxes distributed within the atmospheric boundary layer. While many of these schemes have been evaluated against observations, with some models even compared with the same data sets, such evaluations have not been undertaken in a controlled manner to enable direct comparison. For other types of climate model, for instance the Project for Intercomparison of Land-Surface Parameterization Schemes (PILPS) experiments (Henderson-Sellers et al., 1993), such controlled comparisons have been shown to provide important insights into both the mechanics of the models and the physics of the real world. This paper describes the progress that has been made to date on a systematic and controlled comparison of urban surface schemes. The models to be considered, and their key attributes, are described, along with the methodology to be used for the evaluation.
Abstract:
We analyse by simulation the impact of model-selection strategies (sometimes called pre-testing) on forecast performance in both constant- and non-constant-parameter processes. Restricted, unrestricted and selected models are compared when either of the first two might generate the data. We find little evidence that strategies such as general-to-specific induce significant over-fitting, or thereby cause forecast-failure rejection rates to greatly exceed nominal sizes. Parameter non-constancies put a premium on correct specification, but in general, model-selection effects appear to be relatively small, and progressive research is able to detect the mis-specifications.
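The restricted/unrestricted/selected comparison can be illustrated with a toy Monte Carlo. The design below (a simple location model with a t-test-based pre-test) is our own minimal setup, not the paper's experimental design; it only shows the mechanics of comparing forecast mean squared errors across the three strategies.

```python
# Minimal Monte Carlo sketch of "pre-testing": the DGP is y_t = mu + e_t.
# The restricted model imposes mu = 0; the unrestricted model estimates mu;
# the selected model keeps the estimated mean only if a t-test rejects mu = 0.
# This toy setup is illustrative, not the paper's design.
import random
import statistics

def forecast_mse(mu, n=50, reps=2000, crit=1.96, seed=1):
    rng = random.Random(seed)
    mse = {"restricted": 0.0, "unrestricted": 0.0, "selected": 0.0}
    for _ in range(reps):
        sample = [mu + rng.gauss(0, 1) for _ in range(n)]
        y_next = mu + rng.gauss(0, 1)                 # out-of-sample observation
        mean = statistics.fmean(sample)
        se = statistics.stdev(sample) / n ** 0.5
        t = mean / se
        forecasts = {
            "restricted": 0.0,                        # impose the restriction mu = 0
            "unrestricted": mean,                     # always estimate mu
            "selected": mean if abs(t) > crit else 0.0,  # pre-test choice
        }
        for k, f in forecasts.items():
            mse[k] += (y_next - f) ** 2 / reps
    return mse

print(forecast_mse(mu=0.0))   # restriction true: imposing it tends to help
print(forecast_mse(mu=0.5))   # restriction false: estimation/selection pays off
```

When the restriction is false, the unrestricted and selected forecasts clearly beat the restricted one; when it is true, the penalty from selection is small, echoing the abstract's finding that pre-testing effects on forecasts are modest.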
Abstract:
Anticoagulant rodenticides have been known for over half a century as an effective and safe method of rodent control. However, anticoagulant resistance, first discovered in 1958, presents a serious problem for their long-term use. Laboratory tests provide the main method for identifying the different types of anticoagulant resistance, quantifying the magnitude of their effect, and helping to choose the best pest control strategy. The most important tests are the lethal feeding period (LFP) and blood clotting response (BCR) tests. These tests can now be used to quantify the likely effect of resistance on treatment outcome by providing an estimate of the 'resistance factor'. In 2004 the gene responsible for anticoagulant resistance (VKORC1) was identified and sequenced. As a result, a new molecular resistance-testing methodology has been developed, and a number of resistance mutations have been identified, particularly in Norway rats and house mice. Three mutations of the VKORC1 gene in Norway rats have been identified to date that confer a degree of resistance to bromadiolone and difenacoum sufficient to affect treatment outcome in the field.
Abstract:
This paper makes a theoretical case for using these two systems approaches together. The theoretical and methodological assumptions of system dynamics (SD) and soft systems methodology (SSM) are briefly described and a partial critique is presented. SSM generates and represents diverse perspectives on a problem situation and addresses the socio-political elements of an intervention. However, it is weak in ensuring 'dynamic coherence': consistency between the intuitive behaviour resulting from proposed changes and behaviour deduced from ideas on causal structure. Conversely, SD examines causal structures and dynamic behaviours. However, whilst emphasising the need for a clear issue focus, it has little theory for generating and representing diverse issues. Also, there is no theory for facilitating sensitivity to socio-political elements. A synthesis of the two, called 'Holon Dynamics', is proposed. After an SSM intervention, a second stage continues the socio-political analysis and also operates within a new perspective which values dynamic coherence of the mental construct - the holon - which is capable of expressing the proposed changes. A model of this holon is constructed using SD and the changes are thus rendered 'systemically desirable' in the additional sense that dynamic consistency has been confirmed. The paper closes with reflections on the proposal, and the need for theoretical consistency when mixing tools is emphasised.
Abstract:
This article reviews the experiences of a practising business consultancy division. It discusses the reasons for the failure of the traditional, expert consultancy approach and states the requirements for a more suitable consultancy methodology. An approach called ‘Modelling as Learning’ is introduced, its three defining aspects being: client ownership of all analytical work performed, consultant acting as facilitator and sensitivity to soft issues within and surrounding a problem. The goal of such an approach is set as the acceleration of the client's learning about the business. The tools that are used within this methodological framework are discussed and some case studies of the methodology are presented. It is argued that a learning experience was necessary before arriving at the new methodology but that it is now a valuable and significant component of the division's work.
Abstract:
The WFDEI meteorological forcing data set has been generated using the same methodology as the widely used WATCH Forcing Data (WFD) by making use of the ERA-Interim reanalysis data. We discuss the specifics of how changes in the reanalysis and processing have led to improvement over the WFD. We attribute improvements in precipitation and wind speed to the latest reanalysis basis data and improved downward shortwave fluxes to the changes in the aerosol corrections. Covering 1979–2012, the WFDEI will allow more thorough comparisons of hydrological and Earth System model outputs with hydrologically and phenologically relevant satellite products than using the WFD.
Abstract:
There is an ongoing debate on the environmental effects of genetically modified crops, to which this paper aims to contribute. First, data on the environmental impacts of genetically modified (GM) and conventional crops are collected from peer-reviewed journals; second, an analysis is conducted to examine which crop type is less harmful to the environment. Published data on environmental impacts are measured using an array of indicators, and their analysis requires normalisation and aggregation. Drawing on the composite-indicators literature, this paper builds composite indicators to measure the impact of GM and conventional crops in three dimensions: (1) non-target key species richness, (2) pesticide use, and (3) aggregated environmental impact. Comparing the three composite indicators for both crop types allows us to establish not only a ranking of which crop type is better for the environment, but also the probability that one crop type outperforms the other from an environmental perspective. Results show that GM crops tend to cause lower environmental impacts than conventional crops for the analysed indicators.
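The normalisation-and-aggregation step described above can be sketched as min-max normalisation followed by weighted summation, one common recipe in the composite-indicators literature. The indicator names, values and equal weights below are hypothetical illustrations, not the paper's data or its exact aggregation scheme.

```python
# Hedged sketch of building a composite indicator: min-max normalisation of
# heterogeneous impact indicators, then weighted aggregation. All names,
# values and weights are hypothetical examples.

def minmax(values):
    """Rescale a list of raw indicator values onto [0, 1]."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def composite(raw_by_indicator, weights):
    """raw_by_indicator: {indicator: [value_crop_A, value_crop_B, ...]}"""
    n_crops = len(next(iter(raw_by_indicator.values())))
    scores = [0.0] * n_crops
    for name, values in raw_by_indicator.items():
        for i, v in enumerate(minmax(values)):
            scores[i] += weights[name] * v
    return scores

raw = {                                   # higher raw value = more harm
    "pesticide_use_kg_ha": [1.2, 2.9],    # [GM, conventional]
    "nontarget_richness_loss": [0.10, 0.35],
}
weights = {"pesticide_use_kg_ha": 0.5, "nontarget_richness_loss": 0.5}
gm_score, conv_score = composite(raw, weights)
print(gm_score, conv_score)               # lower composite = lower impact
```

With only two alternatives, min-max normalisation degenerates to 0/1 scores; with more crops or study sites the same machinery yields a graded ranking.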
Abstract:
A sustainable intelligent building is a building that achieves the best combination of environmental, social, economic and technical values, and its sustainability assessment involves systems engineering methods and multi-criteria decision-making. Therefore, first, a wireless monitoring system for sustainability parameters of intelligent buildings is developed; second, the indicators and key issues for the sustainability of intelligent buildings, based on the whole life cycle, are investigated; third, a sustainability assessment model based on structure entropy and the fuzzy analytic hierarchy process is proposed.
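One standard building block of such analytic-hierarchy-process models is deriving criterion weights from a pairwise-comparison matrix. The sketch below shows the crisp geometric-mean approximation only; the paper's actual model combines structure entropy with fuzzy AHP, which this does not reproduce, and the three criteria and comparison values are hypothetical.

```python
# Hedged sketch: deriving criterion weights from a pairwise-comparison
# matrix via the geometric-mean (crisp AHP) approximation. The criteria
# and Saaty-scale judgements below are hypothetical examples.
import math

def ahp_weights(matrix):
    """matrix[i][j]: how much more important criterion i is than j (Saaty scale)."""
    n = len(matrix)
    geo = [math.prod(row) ** (1.0 / n) for row in matrix]  # row geometric means
    total = sum(geo)
    return [g / total for g in geo]                         # normalise to sum 1

# Hypothetical 3-criteria comparison: environmental vs social vs economic.
pairwise = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
weights = ahp_weights(pairwise)
print([round(w, 3) for w in weights])
```

The resulting weights could then feed a weighted scoring of the monitored sustainability parameters; a fuzzy variant replaces each crisp judgement with a fuzzy number before a similar aggregation.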