787 results for Green IT framework
Abstract:
Purpose – The aim of this paper is to investigate how values from within the Abrahamic religions could be adopted to improve corporate governance practices in liberal market economies (LMEs).
Design/methodology/approach – The concept of spiritual capitalism is explained from an Islamic perspective by adopting three universal Abrahamic values to critically analyse LMEs and offer an ethical alternative to current concerns about capitalism.
Findings – It is found that LMEs can be improved by considering all stakeholders, putting ethics before economics, and introducing shared risk/reward together with lower debt.
Originality/value – The paper compares LMEs, co-ordinated market economies (CMEs) and Islamic countries' economies (ICEs) within an ethical framework for LMEs.
Abstract:
Automatic generation of classification rules has become an increasingly popular technique in commercial applications such as Big Data analytics, rule-based expert systems and decision-making systems. However, a principal problem that arises with most methods for the generation of classification rules is overfitting of the training data. When Big Data is dealt with, this may result in the generation of a large number of complex rules, which may not only increase computational cost but also lower the accuracy in predicting further unseen instances. This has led to the necessity of developing pruning methods for the simplification of rules. In addition, once their generation is complete, classification rules are used to make predictions. Where efficiency is concerned, it is desirable to find the first rule that fires as quickly as possible when searching through a rule set, so a suitable structure is required to represent the rule set effectively. In this chapter, the authors introduce a unified framework for the construction of rule-based classification systems consisting of three operations on Big Data: rule generation, rule simplification and rule representation. The authors also review some existing methods and techniques used for each of the three operations and highlight their limitations, and they introduce some novel methods and techniques they have developed recently. These methods and techniques are discussed in comparison with existing ones with respect to efficient processing of Big Data.
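Since this abstract centres on rule representation and first-rule-that-fires prediction, a minimal sketch may help. The Rule/RuleSet classes and the linear scan below are illustrative assumptions, not the chapter's actual data structures:

```python
# Minimal sketch of first-rule-that-fires prediction over an ordered rule set.
# Rule/RuleSet and the linear scan are illustrative, not the chapter's design.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    conditions: list[Callable[[dict], bool]]  # each condition tests one attribute
    label: str                                # class predicted when all conditions hold

    def fires(self, instance: dict) -> bool:
        return all(cond(instance) for cond in self.conditions)

class RuleSet:
    def __init__(self, rules: list[Rule], default: str):
        self.rules = rules      # ordering matters: earlier rules are tried first
        self.default = default  # fallback class when no rule fires

    def predict(self, instance: dict) -> str:
        # Return the class of the first rule that fires; an efficient rule
        # representation aims to reach this rule with as few tests as possible.
        for rule in self.rules:
            if rule.fires(instance):
                return rule.label
        return self.default

# Usage with two toy rules over a numeric attribute "x" and a categorical "colour":
rules = RuleSet(
    rules=[
        Rule([lambda r: r["x"] > 10, lambda r: r["colour"] == "red"], "A"),
        Rule([lambda r: r["x"] <= 10], "B"),
    ],
    default="B",
)
print(rules.predict({"x": 12, "colour": "red"}))  # -> "A"
```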
Abstract:
Concern that European forest biodiversity is depleted and declining has provoked widespread efforts to improve management practices. To gauge the success of these actions, appropriate monitoring of forest ecosystems is paramount. Multi-species indicators are frequently used to assess the state of biodiversity and its response to implemented management, but generally applicable and objective methodologies for species' selection are lacking. Here we use a niche-based approach, underpinned by coarse quantification of species' resource use, to objectively select species for inclusion in a pan-European forest bird indicator. We identify both the minimum number of species required to deliver full resource coverage and the most sensitive species' combination, and explore the trade-off between two key characteristics, sensitivity and redundancy, associated with indicators comprising different numbers of species. We compare our indicator to an existing forest bird indicator selected on the basis of expert opinion and show it is more representative of the wider community. We also present alternative indicators for regional and forest type specific monitoring and show that species' choice can have a significant impact on the indicator and consequent projections about the state of the biodiversity it represents. Furthermore, by comparing indicator sets drawn from currently monitored species and the full forest bird community, we identify gaps in the coverage of the current monitoring scheme. We believe that adopting this niche-based framework for species' selection supports the objective development of multi-species indicators and that it has good potential to be extended to a range of habitats and taxa.
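The "minimum number of species required to deliver full resource coverage" step resembles a set-cover problem. The sketch below uses a greedy heuristic over invented species-resource data; the authors' actual selection procedure may differ:

```python
# Greedy set-cover sketch for selecting a minimal species set whose combined
# resource use covers all resources. Species/resource data are invented.
def min_cover(species_resources: dict[str, set[str]]) -> list[str]:
    uncovered = set().union(*species_resources.values())  # all resources to cover
    chosen: list[str] = []
    while uncovered:
        # Pick the species covering the most still-uncovered resources.
        best = max(species_resources,
                   key=lambda s: len(species_resources[s] & uncovered))
        chosen.append(best)
        uncovered -= species_resources[best]
    return chosen

birds = {
    "woodpecker":  {"deadwood", "canopy"},
    "warbler":     {"shrub", "canopy"},
    "treecreeper": {"bark", "deadwood"},
    "thrush":      {"ground", "shrub"},
}
print(min_cover(birds))  # species added until every resource is covered
```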
Abstract:
Geoengineering by stratospheric aerosol injection has been proposed as a policy response to warming from human emissions of greenhouse gases, but it may produce unequal regional impacts. We present a simple, intuitive risk-based framework for classifying these impacts according to whether geoengineering increases or decreases the risk of substantial climate change, with further classification by the level of existing risk from climate change from increasing carbon dioxide concentrations. This framework is applied to two climate model simulations of geoengineering counterbalancing the surface warming produced by a quadrupling of carbon dioxide concentrations, with one using a layer of sulphate aerosol in the lower stratosphere, and the other a reduction in total solar irradiance. The solar dimming model simulation shows less regional inequality of impacts compared with the aerosol geoengineering simulation. In the solar dimming simulation, 10% of the Earth’s surface area, containing 10% of its population and 11% of its gross domestic product, experiences greater risk of substantial precipitation changes under geoengineering than under enhanced carbon dioxide concentrations. In the aerosol geoengineering simulation the increased risk of substantial precipitation change is experienced by 42% of Earth’s surface area, containing 36% of its population and 60% of its gross domestic product.
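The risk accounting described (fractions of area, population and GDP experiencing increased risk) can be illustrated with a toy aggregation; all grid-cell values below are invented:

```python
# Toy version of the paper's risk accounting: for cells flagged as having
# greater risk of substantial precipitation change under geoengineering,
# sum the area, population and GDP they contain. Cell values are invented.
cells = [
    # (area_km2, population, gdp_billion_usd, increased_risk_under_geoengineering)
    (1.0e6, 5.0e7, 900.0, True),
    (2.0e6, 1.0e8, 400.0, False),
    (1.5e6, 2.0e7, 1200.0, True),
    (3.0e6, 8.0e7, 600.0, False),
]

def risk_shares(cells):
    totals = [sum(c[i] for c in cells) for i in range(3)]
    flagged = [sum(c[i] for c in cells if c[3]) for i in range(3)]
    return {name: flagged[i] / totals[i]
            for i, name in enumerate(["area", "population", "gdp"])}

# Fractions analogous to the 10%/10%/11% vs. 42%/36%/60% figures quoted above.
print(risk_shares(cells))
```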
Abstract:
The traditional forcing-feedback framework has provided an indispensable basis for discussing global climate changes. However, as analysis of model behavior has become more detailed, shortcomings and ambiguities in the framework have become more evident and physical effects unaccounted for by the traditional framework have become interesting. In particular, the new concept of adjustments, which are responses to forcings that are not mediated by the global mean temperature, has emerged. This concept, related to the older ones of climate efficacy and stratospheric adjustment, is a more physical way of capturing unique responses to specific forcings. We present a pedagogical review of the adjustment concept, why it is important, and how it can be used. The concept is particularly useful for aerosols, where it helps to organize what has become a complex array of forcing mechanisms. It also helps clarify issues around cloud and hydrological response, transient vs. equilibrium climate change, and geoengineering.
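For readers unfamiliar with the framework being discussed, the standard textbook notation (not reproduced from this paper) is:

```latex
% Schematic forcing-feedback framework with adjustments (standard notation):
% N: net top-of-atmosphere imbalance, F: forcing, \lambda: feedback parameter,
% \Delta T: global mean temperature response, A: rapid adjustments.
\begin{align}
  N &= F - \lambda\,\Delta T, \\
  F_{\mathrm{eff}} &= F_{\mathrm{inst}} + A
    \quad\text{(instantaneous forcing plus adjustments not mediated by } \Delta T\text{)}, \\
  \Delta T_{\mathrm{eq}} &= F_{\mathrm{eff}} / \lambda .
\end{align}
```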
Abstract:
Conceptualizing climate as a distinct variable limits our understanding of the synergies and interactions between climate change and the range of abiotic and biotic factors that influence animal health. Frameworks such as eco-epidemiology and the epi-systems approach, while more holistic, view climate and climate change as one of many discrete drivers of disease. Here, I argue for a new paradigmatic framework: climate-change syndemics. Climate-change syndemics begins from the assumption that climate change is one of many potential influences on infectious disease processes, but crucially is unlikely to act independently or in isolation; as such, it is the inter-relationship between factors that takes primacy in explorations of infectious disease and climate change. Equally importantly, as climate change will impact a wide range of diseases, the frame of analysis is at the collective rather than the individual level (for both human and animal infectious disease) across populations.
Abstract:
Organisations typically define and execute their selected strategy by developing and managing a portfolio of projects. The governance of this portfolio has proved to be a major challenge, particularly for large organisations. Executives and managers face even greater pressures when the nature of the strategic landscape is uncertain. This paper explores approaches for dealing with different levels of certainty in business IT projects and provides a contingent governance framework. Historically, business IT projects have relied on a structured sequential approach, also referred to as a waterfall method. There is a distinction between the development stages of a solution and the management stages of a project that delivers the solution, although these are often integrated in a business IT systems project. Prior research has demonstrated that the level of certainty varies between development projects: there can be uncertainty about what needs to be developed and also about how the solution should be developed. The move to agile development and management reflects a greater level of uncertainty, often on both dimensions, and this has led to the adoption of more iterative approaches. What has been less well researched is the impact of uncertainty on the governance of the change portfolio and the corresponding implications for business executives. This paper poses this research question and proposes a governance framework to address these aspects. The governance framework has been reviewed in the context of a major organisation, anonymised here as FinOrg. Findings are reported in this paper with a focus on the need to apply different approaches. In particular, the governance of uncertain business change is contrasted with the management approach for defined IT projects. Practical outputs from the paper include a consideration of some innovative approaches that can be used by executives. The paper also investigates the role of the business change portfolio group in evaluating and executing the appropriate level of governance. These results lead to recommendations for executives and to proposed further research.
Abstract:
Our digital universe is rapidly expanding: more and more daily activities are digitally recorded, data arrive in streams, need to be analyzed in real time and may evolve over time. In the last decade many adaptive learning algorithms and prediction systems, which can automatically update themselves with new incoming data, have been developed. The majority of those algorithms focus on improving predictive performance and assume that a model update is always desired as soon as possible and as frequently as possible. In this study we consider a potential model update as an investment decision, which, as in the financial markets, should be taken only if a certain return on investment is expected. We introduce and motivate a new research problem for data streams: cost-sensitive adaptation. We propose a reference framework for analyzing adaptation strategies in terms of costs and benefits. Our framework allows us to characterize and decompose the costs of model updates, and to assess and interpret the gains in performance due to model adaptation for a given learning algorithm on a given prediction task. Our proof-of-concept experiment demonstrates how the framework can aid in analyzing and managing adaptation decisions in the chemical industry.
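The "update as investment decision" idea can be sketched as a simple cost-benefit test; the function, the ROI form and the numbers below are invented placeholders rather than the paper's reference framework:

```python
# Toy sketch of cost-sensitive adaptation: retrain a streaming model only when
# the expected benefit of the update exceeds its cost. All values are invented.
def should_update(expected_error_reduction: float,
                  value_per_error_unit: float,
                  update_cost: float,
                  min_roi: float = 0.0) -> bool:
    benefit = expected_error_reduction * value_per_error_unit
    roi = (benefit - update_cost) / update_cost  # return on investment
    return roi > min_roi

# A 2% expected error reduction valued at 5000 per unit, against an update cost of 30:
print(should_update(expected_error_reduction=0.02,
                    value_per_error_unit=5000.0,
                    update_cost=30.0))  # True: benefit 100 exceeds cost 30
```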
Abstract:
Land cover maps at different resolutions and mapping extents contribute to modeling and support decision-making processes. Because land cover affects and is affected by climate change, it is listed among the 13 terrestrial essential climate variables. This paper describes the generation of a land cover map for Latin America and the Caribbean (LAC) for the year 2008. It was developed in the framework of the project Latin American Network for Monitoring and Studying of Natural Resources (SERENA), which has been developed within the GOFC-GOLD Latin American network of remote sensing and forest fires (RedLaTIF). The SERENA land cover map for LAC integrates: 1) the local expertise of SERENA network members to generate the training and validation data, 2) a methodology for land cover mapping based on decision trees using MODIS time series, and 3) class membership estimates to account for pixel heterogeneity issues. The discrete SERENA land cover product, derived from class memberships, yields an overall accuracy of 84% and includes an additional layer representing the estimated per-pixel confidence. The study demonstrates in detail the use of class memberships to better estimate the area of scarce classes with a scattered spatial distribution. The land cover map is already available as a printed wall map and will be released in digital format in the near future. The SERENA land cover map was produced with a legend and classification strategy similar to those used by the North American Land Change Monitoring System (NALCMS) to generate a land cover map of the North American continent, which will allow the two maps to be combined to generate consistent data across the Americas, facilitating continental monitoring and modeling.
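The step from class memberships to a discrete map plus a per-pixel confidence layer can be sketched as an argmax over per-pixel membership vectors; the legend and membership values below are invented, not SERENA data:

```python
import numpy as np

# Sketch: derive a discrete land-cover map and a confidence layer from
# class-membership estimates. Legend and membership values are invented.
classes = ["forest", "cropland", "grassland"]
# memberships has shape (rows, cols, n_classes); per-pixel values sum to ~1.
memberships = np.array([
    [[0.7, 0.2, 0.1], [0.4, 0.4, 0.2]],
    [[0.1, 0.8, 0.1], [0.3, 0.3, 0.4]],
])

discrete = memberships.argmax(axis=-1)  # most likely class per pixel
confidence = memberships.max(axis=-1)   # membership of that class

print([[classes[i] for i in row] for row in discrete])
print(confidence)  # low values flag heterogeneous/mixed pixels
```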
Abstract:
This paper describes the hydrochemistry of a lowland, urbanised river-system, The Cut in England, using in situ sub-daily sampling. The Cut receives effluent discharges from four major sewage treatment works serving around 190,000 people. These discharges consist largely of treated water, originally abstracted from the River Thames and returned via the water supply network, substantially increasing the natural flow. The hourly water quality data were supplemented by weekly manual sampling with laboratory analysis to check the hourly data and measure further determinands. Mean phosphorus and nitrate concentrations were very high, breaching standards set by EU legislation. Though 56% of the catchment area is agricultural, the hydrochemical dynamics were significantly impacted by effluent discharges which accounted for approximately 50% of the annual P catchment input loads and, on average, 59% of river flow at the monitoring point. Diurnal dissolved oxygen data demonstrated high in-stream productivity. From a comparison of high frequency and conventional monitoring data, it is inferred that much of the primary production was dominated by benthic algae, largely diatoms. Despite the high productivity and nutrient concentrations, the river water did not become anoxic and major phytoplankton blooms were not observed. The strong diurnal and annual variation observed showed that assessments of water quality made under the Water Framework Directive (WFD) are sensitive to the time and season of sampling. It is recommended that specific sampling time windows be specified for each determinand, and that WFD targets should be applied in combination to help identify periods of greatest ecological risk.
Abstract:
The GreenFeed (GF) system (C-Lock Inc., Rapid City, USA) is used to estimate total daily methane emissions of individual cattle using short-term measurements obtained over several days. Our objective was to compare measurements of methane emission by growing cattle obtained using the GF system with measurements using respiration chambers (RC) or the sulphur hexafluoride tracer technique (SF6). It was hypothesised that estimates of methane emission for individual animals and treatments would be similar for GF compared with RC or SF6 techniques. In experiment 1, maize or grass silage-based diets were fed to four growing Holstein heifers, whilst for experiment 2, four different heifers were fed four haylage treatments. Both experiments were 4 × 4 Latin square designs with 33-day periods. GreenFeed measurements of methane emission were obtained over 7 days (days 22–28) and compared to subsequent RC measurements over 4 days (days 29–33). For experiment 3, 12 growing heifers rotationally grazed three swards for 26 days, with simultaneous GF and SF6 measurements over two 4-day measurement periods (days 15–19 and days 22–26). Overall methane emissions (g/day and g/kg dry matter intake [DMI]) measured using GF in experiments 1 (198 and 26.6, respectively) and 2 (208 and 27.8, respectively) were similar to averages obtained using RC (218 and 28.3, respectively, for experiment 1; and 209 and 27.7, respectively, for experiment 2), but there was poor concordance between the two methods (0.1043 for experiments 1 and 2 combined). Overall, methane emissions measured using SF6 were higher (P<0.001) than GF during grazing (186 vs. 164 g/day), but there was significant (P<0.01) concordance between the two methods (0.6017). There were fewer methane measurements by GF under grazing conditions in experiment 3 (1.60/day) compared to indoor measurements in experiments 1 (2.11/day) and 2 (2.34/day). Significant treatment effects on methane emission measured using RC and SF6 were not evident for GF measurements, and the ranking of treatments and individual animals differed using the GF system. We conclude that under our conditions of use the GF system was unable to detect significant treatment and individual animal differences in methane emissions that were identified using both RC and SF6 techniques, in part due to the limited number and timing of measurements obtained. Our data suggest that successful use of the GF system is reliant on the number and timing of measurements obtained relative to diurnal patterns of methane emission.
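The concordance values quoted (0.1043 and 0.6017) are on the scale of Lin's concordance correlation coefficient; assuming that is the measure used, a sketch with invented paired values:

```python
import numpy as np

# Lin's concordance correlation coefficient (CCC) between two measurement
# methods. Assuming CCC is the concordance measure behind 0.1043/0.6017;
# the paired emission values below are invented, not the experimental data.
def lins_ccc(x: np.ndarray, y: np.ndarray) -> float:
    sxy = np.cov(x, y, bias=True)[0, 1]  # population covariance
    return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

gf = np.array([198.0, 210.0, 185.0, 220.0])  # GreenFeed, g CH4/day (invented)
rc = np.array([218.0, 205.0, 230.0, 215.0])  # respiration chamber, g CH4/day (invented)
print(round(lins_ccc(gf, rc), 4))
```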
Abstract:
Assessing the ways in which rural agrarian areas provide Cultural Ecosystem Services (CES) is proving difficult. This research has developed an innovative methodological approach, named the Multi-Scale Indicator Framework (MSIF), for capturing the CES embedded in rural agrarian areas. This framework reconciles a literature review with a trans-disciplinary participatory workshop. Both of these sources reveal that societal preferences diverge across judgemental criteria, which in turn relate to different visual concepts that can be drawn from analysing the attributes, elements, features and characteristics of rural areas. We contend that it is now possible to list a group of possible multi-scale indicators for stewardship, diversity and aesthetics. These results might also be of use for improving existing European indicator frameworks by including CES. This research carries major implications for policy at different levels of governance, as it makes it possible to target and monitor policy instruments in relation to physical rural settings so that cultural dimensions are adequately considered. There is still work to be done on region-specific values and thresholds for each criterion and its indicator set. In practical terms, by developing the conceptual design within a common framework as described in this paper, a considerable step can be made towards the inclusion of the cultural dimension in Europe-wide assessments.
Abstract:
The EU Water Framework Directive (WFD) requires that the ecological and chemical status of water bodies in Europe should be assessed, and action taken where possible to ensure that at least "good" quality is attained in each case by 2015. This paper is concerned with the accuracy and precision with which chemical status in rivers can be measured given certain sampling strategies, and how this can be improved. High-frequency (hourly) chemical data from four rivers in southern England were subsampled to simulate different sampling strategies for four parameters used for WFD classification: dissolved phosphorus, dissolved oxygen, pH and water temperature. These data subsets were then used to calculate the WFD classification for each site. Monthly sampling was less precise than weekly sampling, but the effect on WFD classification depended on the closeness of the range of concentrations to the class boundaries. In some cases, monthly sampling for a year could result in the same water body being assigned to three or four of the WFD classes with 95% confidence, due to random sampling effects, whereas with weekly sampling this was one or two classes for the same cases. In the most extreme case, the same water body could have been assigned to any of the five WFD quality classes. Weekly sampling considerably reduces the uncertainties compared to monthly sampling. The width of the weekly-sampled confidence intervals was about 33% of that of the monthly intervals for P species and pH, about 50% for dissolved oxygen, and about 67% for water temperature. For water temperature, which is assessed as the 98th percentile in the UK, monthly sampling biases the mean downwards by about 1 °C compared to the true value, due to problems of assessing high percentiles with limited data. Low-frequency measurements will generally be unsuitable for assessing standards expressed as high percentiles. Confining sampling to the working week compared to all 7 days made little difference, but a modest improvement in precision could be obtained by sampling at the same time of day within a 3 h time window, and this is recommended. For parameters with a strong diel variation, such as dissolved oxygen, the value obtained, and thus possibly the WFD classification, can depend markedly on when in the cycle the sample was taken. Specifying this in the sampling regime would be a straightforward way to improve precision, but there needs to be agreement about how best to characterise risk in different types of river. These results suggest that in some cases it will be difficult to assign accurate WFD chemical classes or to detect likely trends using current sampling regimes, even for these largely groundwater-fed rivers. A more critical approach to sampling is needed to ensure that management actions are appropriate and supported by data.
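The subsampling experiment described can be sketched as follows, with an invented hourly phosphorus series, invented class boundaries, and random draws standing in for monthly versus weekly sampling visits:

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented hourly phosphorus series (mg/l) for one year, with periodic
# structure plus noise; boundaries and class names are placeholders, not the
# real UK WFD standards.
hourly_p = (0.12 + 0.04 * np.sin(np.linspace(0, 20 * np.pi, 8760))
            + rng.normal(0, 0.03, 8760))
boundaries = [0.10, 0.13, 0.16]
labels = ["high", "good", "moderate", "poor"]

def classify(mean_p: float) -> str:
    return labels[int(np.searchsorted(boundaries, mean_p))]

# Resample 1000 simulated "years" at each visit frequency and tally the class
# assigned from the annual mean: sparser sampling spreads across more classes.
for n_visits, name in [(12, "monthly"), (52, "weekly")]:
    means = [hourly_p[rng.choice(8760, n_visits, replace=False)].mean()
             for _ in range(1000)]
    cls = [classify(m) for m in means]
    print(name, {c: cls.count(c) for c in labels})
```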
Abstract:
The Chiltern commons are typical of those in the south east of England: small and numerous, but with the potential to provide important natural green space whilst contributing to environmental sustainability. In order to keep commons in good heart, they need to be managed. However, as activities such as grazing and coppicing become unviable on the commons, owners need to find sustainable roles beyond traditional agricultural and silvicultural practices. This paper examines ways of making management pay. It begins by exploring the economic, social and environmental challenges of sustainable management within the context of contemporary life. Section 2 identifies the different ways in which revenue contributions might be made towards the management of commons. Section 3 examines the relevant legal and other restrictions and Section 4 offers insights into where management proposals might offer multiple positive benefits, but also where there is the potential to cause conflict with environmental and social interests. Section 5 explores alternative funding streams for commons. Finally, Section 6 concludes with practical tips for the owners and managers of commons in the Chilterns and identifies areas for further research. Full references, links and resources are provided in the footnotes and appendix.
Abstract:
The evaluation of forecast performance plays a central role both in the interpretation and use of forecast systems and in their development. Different evaluation measures (scores) are available, often quantifying different characteristics of forecast performance. The properties of several proper scores for probabilistic forecast evaluation are contrasted and then used to interpret decadal probability hindcasts of global mean temperature. The Continuous Ranked Probability Score (CRPS), the Proper Linear (PL) score, and IJ Good's logarithmic score (also referred to as Ignorance) are compared; although information from all three may be useful, the logarithmic score has an immediate interpretation and is not insensitive to forecast busts. Neither CRPS nor PL is local; this is shown to produce counter-intuitive evaluations by CRPS. Benchmark forecasts from empirical models like Dynamic Climatology place the scores in context. Comparing scores for forecast systems based on physical models (in this case HadCM3, from the CMIP5 decadal archive) against such benchmarks is more informative than internal comparisons of systems based on similar physical simulation models with each other. It is shown that a forecast system based on HadCM3 outperforms Dynamic Climatology in decadal global mean temperature hindcasts; Dynamic Climatology previously outperformed a forecast system based upon HadGEM2, and reasons for these results are suggested. Forecasts of aggregate data (5-year means of global mean temperature) are, of course, narrower than forecasts of annual averages due to the suppression of variance; while the average "distance" between the forecasts and a target may be expected to decrease, little if any discernible improvement in probabilistic skill is achieved.
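Two of the scores contrasted above have simple closed forms for a Gaussian forecast density (a convenience assumption made here, not the paper's setup); the sketch shows how Ignorance punishes a sharp forecast "bust" that CRPS barely distinguishes:

```python
import numpy as np
from scipy.stats import norm

# Ignorance (logarithmic score) and CRPS for a Gaussian forecast N(mu, sigma^2)
# evaluated at outcome y. Lower is better for both. The Gaussian form is an
# assumption for tractability, not the paper's hindcast distribution.
def ignorance(mu: float, sigma: float, y: float) -> float:
    # IJ Good's logarithmic score: -log2 of the forecast density at the outcome.
    return -np.log2(norm.pdf(y, mu, sigma))

def crps_gaussian(mu: float, sigma: float, y: float) -> float:
    # Closed form of the Continuous Ranked Probability Score for a Gaussian.
    z = (y - mu) / sigma
    return sigma * (z * (2 * norm.cdf(z) - 1) + 2 * norm.pdf(z)
                    - 1 / np.sqrt(np.pi))

# A sharp forecast that badly misses (a "bust") vs. a wider forecast:
print(ignorance(0.2, 0.1, 1.0), crps_gaussian(0.2, 0.1, 1.0))  # huge Ignorance
print(ignorance(0.2, 0.5, 1.0), crps_gaussian(0.2, 0.5, 1.0))  # modest on both
```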