926 results for Environmental information
Abstract:
Purpose: This paper explores the extent of site-specific and geographic segmental social, environmental and ethical reporting by mining companies operating in Ghana. We aim to: (i) establish a picture of corporate transparency relating to geographic segmentation of social, environmental and ethical reporting which is specific to operating sites and country of operation, and (ii) gauge the impact of the introduction of integrated reporting on site-specific social, environmental and ethical reporting. Methodology/Approach: We conducted an interpretive content analysis of the annual/integrated reports of mining companies for the years 2009, 2010 and 2011 in order to extract site-specific social, environmental and ethical information relating to the companies' mining operations in Ghana. Findings and Implications: We found that site-specific social, environmental and ethical reporting is extremely patchy and inconsistent across the companies' reports studied. We also found that there was no information relating to certain sites which, according to the Ghana Minerals Commission, were in operation. This could simply be because operations were not in progress. Alternatively, it could be that decisions are made concerning which site-specific information is reported according to a certain benchmark. One policy implication arising from this research is that IFRS should require geographic segmental reporting of material social, environmental and ethical information in order to bring IFRS into line with global developments in integrated reporting. Originality: Although there is a wealth of sustainability reporting research and an emergent literature on integrated reporting, there is currently no academic research exploring site-specific social, environmental and ethical reporting.
Abstract:
The financial crisis of 2008 led to new international regulatory controls for the governance, risk and compliance of financial services firms. Information systems play a critical role here as political, functional and social pressures may lead to the deinstitutionalization of existing structures, processes and practices. This research examines how an investment management system is introduced by a leading IT vendor across eight client sites in the post-crisis era. Using institutional theory, it examines changes in working practices occurring at the environmental and organizational levels and the ways in which technological interventions are used to apply disciplinary effects in order to prevent inappropriate behaviors. The results extend the constructs of deinstitutionalization and identify empirical predictors for the deinstitutionalization of compliance and trading practices within financial organizations.
Abstract:
This paper presents preliminary results from an ethnoarchaeological study of animal husbandry in the modern village of Bestansur, situated in the lower Zagros Mountains of Iraqi Kurdistan. This research explores how modern families use and manage their livestock within the local landscape and identifies traces of this use. The aim is to provide the groundwork for future archaeological investigations focusing on the nearby Neolithic site of Bestansur. This is based on the premise that modern behaviours can suggest testable patterns for past practices within the same functional and ecological domains. Semi-structured interviews conducted with villagers from several households provided large amounts of information on modern behaviours, which helped direct data collection and also illustrate notable shifts in practices and use of the local landscape over time. Strontium isotope analysis of modern plant material demonstrates that a measurable variation exists between the alluvial floodplain and the lower foothills, while analysis of modern dung samples shows clear variation between sheep/goat and cow dung, in terms of numbers of faecal spherulites. These results are specific to the local environment of Bestansur and can be used for evaluating and contextualising archaeological evidence as well as providing modern reference material for comparative purposes.
Abstract:
The archaeological site of Kharaneh IV in Jordan's Azraq Basin and its relatively near neighbour Jilat 6 show evidence of sustained, substantial occupation through the Early to Middle Epipalaeolithic (c. 24,000 - 15,000 cal BP). Here we review the geomorphological evidence for the environmental setting in which Kharaneh IV was established. The on-site stratigraphy is clearly differentiated from surrounding sediments, marked visually as well as by higher magnetic susceptibility values. Dating and analysis of off-site sediments show that a significant wetland existed at the site prior to and during early site occupation (~ 23,000 - 19,000 BP). This may explain why such a substantial site existed at this location. This wetland, dating to the Last Glacial Maximum, also provides important information on the palaeoenvironments and potential palaeoclimatic scenarios for today's eastern Jordanian desert, a region from which such evidence is scarce.
Abstract:
This thesis is an empirical study of the European Union's Emissions Trading Scheme (EU ETS) and its implications for corporate environmental and financial performance. The novelty of this study includes the extended scope of the data coverage, as most previous studies have examined only the power sector. The use of verified emissions data of ETS-regulated firms as the environmental compliance measure, and as a potential differentiating criterion in the stock-market valuation of EU ETS-exposed firms, is also an original aspect of this study. The study begins in Chapter 2 by introducing the background of the emissions trading system (ETS), focusing on (i) the adoption of ETS as an environmental management instrument and (ii) the adoption of ETS by the European Union as one of its central climate policies. Chapter 3 surveys four databases that provide carbon emissions data in order to determine the most suitable source of data for the later empirical chapters. The first empirical chapter, Chapter 4, investigates the determinants of the emissions compliance performance of EU ETS-exposed firms by constructing the best possible performance ratio from verified emissions data and self-configuring models for a panel regression analysis. Chapter 5 examines the impact on EU ETS-exposed firms in terms of their equity valuation, using customised portfolios and multi-factor market models. The research design includes the emissions allowance (EUA) price as an additional factor to control for the exposure, since it has the most direct association with the EU ETS. The final empirical chapter, Chapter 6, takes the investigation one step further by specifically testing the degree of ETS exposure facing different sectors, using sector-based portfolios and an extended multi-factor market model. The findings from the emissions performance ratio analysis show that the business model of firms significantly influences emissions compliance, as capital intensity is positively associated with an increasing emissions-to-emissions-cap ratio. Furthermore, different sectors show different degrees of sensitivity towards the determining factors. The production factor influences the performance ratio of the Utilities sector, but not the Energy or Materials sectors, and capital intensity has a more profound influence on the Utilities sector than on the Materials sector. With regard to the financial performance impact, ETS-exposed firms as aggregate portfolios experienced substantial underperformance during the 2001–2004 period, but not in the operating period of 2005–2011. The results of the sector-based portfolios again show the differentiating effect of the EU ETS on sectors: one sector is priced in line with its benchmark, three sectors show constant underperformance, and three sectors have altered outcomes.
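As a hedged illustration of the extended multi-factor market model mentioned in this abstract (the thesis's exact specification is not given here, so the benchmark factors below are assumptions in the Fama-French style), the excess return of an ETS-exposed portfolio could be written as

    R_{p,t} - R_{f,t} = \alpha_p + \beta_p (R_{m,t} - R_{f,t}) + s_p\, SMB_t + h_p\, HML_t + e_p\, EUA_t + \varepsilon_{p,t}

where EUA_t is the return on the emissions allowance price, included as the additional factor controlling for EU ETS exposure, and \alpha_p captures any abnormal under- or over-performance of the portfolio relative to its benchmark.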
Abstract:
Improving the environmental performance of non-domestic buildings is a complex and ‘wicked’ problem due to conflicting interests and incentives. This is particularly challenging in tenanted spaces, where landlord and tenant interactions are regulated through leases that traditionally ignore environmental considerations. ‘Green leasing’ is conceptualized as a form of ‘middle-out’ inter-organizational environmental governance that operates between organizations, alongside other drivers. This paper investigates how leases are evolving to become ‘greener’ in the UK and Australia, providing evidence from five varied sources on: (1) UK office and retail leases, (2) UK retail sector energy management, (3) a major UK retailer case study, (4) office leasing in Sydney, and (5) expert interviews on Australian retail leases. With some exceptions, the evidence reveals an increasing trend towards green leases in prime offices in both countries, but not in retail or sub-prime offices. Generally introduced by landlords, adopted green leases contain a variety of ambitions and levels of enforcement. As an evolving form of private–private environmental governance, green leases form a valuable framework for further tenant–landlord cooperation within properties and across portfolios. This increased cohesion could create new opportunities for polycentric governance, particularly at the interface of cities and the property industry.
Abstract:
Objective. To compare the nutritional value of meals provided by companies participating in the Workers' Meal Program in the city of Sao Paulo, Brazil, to the nutritional recommendations and guidelines established by the Ministry of Health for the Brazilian population. Methods. The 72 companies studied were grouped according to economic sector (industrial, services, or commerce), size (micro, small, medium, or large), meal preparation modality (prepared on-site by the company itself, on-site by a hired caterer, or off-site by a hired caterer), and supervision by a dietitian (yes or no). The per capita amount of food was determined based on the lunch, dinner, and supper menus for three days. The nutritional value of the meals was defined by the amount of calories, carbohydrates, protein, total fat, polyunsaturated fat, saturated fat, trans fat, sugars, cholesterol, and fruits and vegetables. Results. Most of the menus were deficient in the number of fruits and vegetables (63.9%) and amount of polyunsaturated fat (83.3%), but high in total fat (47.2%) and cholesterol (62.5%). Group 2, composed of mostly medium and large companies, supervised by a dietitian, belonging to the industrial and/or service sectors, and using a hired caterer, on average served meals with higher calorie content (P < 0.001), higher percentage of polyunsaturated fat (P < 0.001), more cholesterol (P = 0.015), and more fruits and vegetables (P < 0.001) than Group 1, which was composed of micro and small companies from the commercial sector that prepared their meals on-site and were not supervised by a dietitian. Regarding the nutrition guidelines set for the Brazilian population, Group 2 meals were better in terms of fruit and vegetable servings (P < 0.001). Group 1 meals were better in terms of cholesterol content (P = 0.05). Conclusions. More specific action is required targeting company officers and managers in charge of food and nutrition services, especially in companies without dietitian supervision.
Abstract:
Background: Genetic variation for environmental sensitivity indicates that animals are genetically different in their response to environmental factors. Environmental factors are either identifiable (e.g. temperature), and then called macro-environmental, or unknown, and then called micro-environmental. The objectives of this study were to develop a statistical method to estimate genetic parameters for macro- and micro-environmental sensitivities simultaneously, to investigate the bias and precision of the resulting estimates of genetic parameters, and to develop and evaluate the use of Akaike's information criterion based on h-likelihood to select the best-fitting model. Methods: We assumed that genetic variation in macro- and micro-environmental sensitivities is expressed as genetic variance in the slope of a linear reaction norm and in the environmental variance, respectively. A reaction norm model to estimate genetic variance for macro-environmental sensitivity was combined with a structural model for the residual variance to estimate genetic variance for micro-environmental sensitivity, using a double hierarchical generalized linear model in ASReml. Akaike's information criterion was constructed as a model selection criterion using the approximated h-likelihood. Populations of sires with large half-sib offspring groups were simulated to investigate the bias and precision of the estimated genetic parameters. Results: Designs with 100 sires, each with at least 100 offspring, are required to keep the standard deviations of estimated variances below 50% of the true value. When the number of offspring increased, the standard deviations of estimates across replicates decreased substantially, especially for the genetic variances of macro- and micro-environmental sensitivities. The standard deviations of estimated genetic correlations across replicates were quite large (between 0.1 and 0.4), especially when sires had few offspring. Practically no bias was observed for estimates of any of the parameters. Using Akaike's information criterion, the true genetic model was selected as the best statistical model in at least 90% of 100 replicates when the number of offspring per sire was 100. Application of the model to lactation milk yield in dairy cattle showed that genetic variance for micro- and macro-environmental sensitivities exists. Conclusion: The algorithm and model selection criterion presented here can contribute to a better understanding of the genetic control of macro- and micro-environmental sensitivities. Designs or datasets should have at least 100 sires, each with 100 offspring.
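A minimal sketch of the kind of model described above, assuming a linear reaction norm for the mean and an exponential (log-linear) model for the residual variance (the symbols and link function are assumptions, since the abstract does not give the full specification):

    y_{ij} = \mu + a_i + b_i x_j + e_{ij}, \qquad \operatorname{var}(e_{ij}) = \exp(\delta_0 + a_{v,i})

Here a_i is the additive genetic intercept of sire family i, b_i is its genetic slope on the macro-environmental covariate x_j (macro-environmental sensitivity), and a_{v,i} is a genetic effect on the log residual variance (micro-environmental sensitivity); the two parts are estimated simultaneously in the double hierarchical generalized linear model.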
Abstract:
Prior to 1970, state and federal agencies held exclusive enforcement responsibilities over violations of pollution control standards. However, recognizing that the government had neither the time nor the resources to provide full enforcement, Congress created citizen suits. Citizen suits, first added to the Clean Air Act in 1970, authorize citizens to act as private attorneys general and to sue polluters for violating the terms of their operating permits. Since that time, Congress has included citizen suits in 13 other federal statutes. The citizen suit phenomenon is sufficiently new that little is known about it. However, we do know that citizen suits have increased rapidly since the early 1980s. Between 1982 and 1986 the number of citizen suits jumped from 41 to 266. Obviously, they are becoming a widely used method of enforcing the environmental statutes. This paper will provide a detailed description, analysis and evaluation of citizen suits. It will begin with an introduction and will then provide some historic and descriptive background on such issues as how citizen suit powers are delegated, what limitations are placed on the citizens, what parties are on each side of the suit, what citizens can enforce against, and the types of remedies available. The following section of the paper will provide an economic analysis of citizen suits. It will begin with a discussion of non-profit organizations, especially non-profit environmental organizations, detailing the economic factors which instigate their creation and activities. Three models will be developed to investigate the evolution and effects of citizen suits. The first model will provide an analysis of the demand for citizen suits from the point of view of a potential litigator, showing how varying remedies, limitations and reimbursement procedures can affect both the level and types of activities undertaken. The second model shows how firm behavior could be expected to respond to citizen suits. Finally, a third model will look specifically at the issue of efficiency to determine whether the introduction of citizen enforcement leads to greater or lesser economic efficiency in pollution control. The database on which the analysis rests consists of 1205 cases compiled by the author. For the purposes of this project, this list of citizen suit cases and their attributes was computerized and used to test a series of hypotheses derived from three original economic models. The database includes information regarding plaintiffs, defendants, the date the notice and/or complaint was filed, and the statutes involved in the claim. The analysis focuses on six federal environmental statutes (the Clean Water Act, Resource Conservation and Recovery Act, Comprehensive Environmental Response Compensation and Liability Act, Clean Air Act, Toxic Substances Control Act, and Safe Drinking Water Act) because the majority of citizen suits have occurred under these statutes.
Abstract:
GCM outputs such as CMIP3 are available via network access to the PCMDI web site. Meteorological researchers are familiar with the use of GCM data, but most researchers outside meteorology, in fields such as agriculture and civil engineering, as well as the general public, are not. There are several difficulties in using GCM output: (1) downloading the enormous quantity of data, and (2) understanding the GCM methodology, parameters and grids. In order to provide quick access to GCM output, the Climate Change Information Database has been developed. The purpose of the database is to bridge users and meteorological specialists and to facilitate understanding of climate change. The resolution of the data is unified, and the amount of climate change, or the relevant factors, for each meteorological element is provided by the database. All data in the database are interpolated onto the same 80 km mesh. Available data comprise present-to-future projections from 27 GCMs, for 16 meteorological elements (precipitation, temperature, etc.) under 3 emission scenarios (A1B, A2, B1). We presented a summary of this database to residents of Toyama prefecture and, using an Internet questionnaire survey, measured the effect of this presentation and assessed their perception of climate change. Respondents who perceive climate change at present tend to expect further changes in the future. It is important to show citizens the monitoring results of climate change and to promote understanding of the change that has already occurred. The survey showed that a general picture of climate change promotes understanding of the need for mitigation, and that explaining changes that might occur in the future, even if they have not yet occurred, is important for wide recognition of the need for adaptation.
Abstract:
This article highlights the potential benefits of the Kohonen method for classifying rivers with similar characteristics when determining regional ecological flows using the ELOHA (Ecological Limits of Hydrologic Alteration) methodology. Currently, there are many methodologies for the classification of rivers; however, none of them offers the characteristics of the Kohonen method, which (i) provides the number of groups that actually underlie the information presented, (ii) can be used for variable-importance analysis, (iii) can display the classification process in two dimensions, and (iv) yields a clustering structure that remains stable regardless of the parameters used in the model. In order to evaluate the potential benefits of the Kohonen method, 174 flow stations distributed along the great river basin “Magdalena-Cauca” (Colombia) were analyzed. In each case, 73 variables were obtained for the classification process. Six trials were performed using different combinations of variables, and the results were validated against a reference classification obtained by Ingfocol in 2010, whose results were also framed using ELOHA guidelines. During validation, it was found that two of the tested models reproduced more than 80% of the reference classification in the first trial, meaning that more than 80% of the flow stations analyzed formed invariant groups of streams in both models.
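The abstract does not describe the authors' implementation, but as a hedged sketch, a Kohonen self-organizing map over the station-by-variable matrix could be fitted with the open-source minisom package; the grid size, learning rate and iteration count below are illustrative assumptions, not the values used in the study.

    # Illustrative only: cluster flow stations with a Kohonen self-organizing map.
    import numpy as np
    from minisom import MiniSom

    # Hypothetical input: one row per flow station, one column per standardized
    # hydrologic variable, e.g. a 174 x 73 matrix as described in the abstract.
    data = np.random.rand(174, 73)   # placeholder for the real variables

    som = MiniSom(x=6, y=6, input_len=data.shape[1],
                  sigma=1.0, learning_rate=0.5, random_seed=42)
    som.random_weights_init(data)
    som.train_random(data, num_iteration=5000)

    # Each station is assigned to the map node (cluster) of its best-matching unit.
    clusters = [som.winner(row) for row in data]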
Abstract:
The Short-term Water Information and Forecasting Tools (SWIFT) is a suite of tools for flood and short-term streamflow forecasting, consisting of a collection of hydrologic model components and utilities. Catchments are modeled using conceptual subareas and a node-link structure for channel routing. The tools comprise modules for calibration, model state updating, output error correction, ensemble runs and data assimilation. Given the combinatorial nature of the modelling experiments and the sub-daily time steps typically used for simulations, the volume of model configurations and time series data is substantial, and its management is not trivial. SWIFT is currently used mostly for research purposes but has also been used operationally, with intersecting but significantly different requirements. Early versions of SWIFT used mostly ad hoc text files handled via Fortran code, with limited use of netCDF for time series data. The configuration and data handling modules have since been redesigned. The model configuration now follows a design where the data model is decoupled from the on-disk persistence mechanism. For research purposes the preferred on-disk format is JSON, to leverage numerous software libraries in a variety of languages, while retaining the legacy option of custom tab-separated text formats when that is the preferred access arrangement for the researcher. By decoupling the data model from data persistence, it becomes much easier to use, for instance, relational databases interchangeably to provide stricter provenance and audit trail capabilities in an operational flood forecasting context. For the time series data, given the volume and required throughput, text-based formats are usually inadequate. A schema derived from the CF conventions has been designed to handle time series efficiently for SWIFT.
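A minimal Python sketch of the decoupling described above, with a configuration object that is independent of its on-disk persistence and interchangeable back ends; the class and field names are illustrative assumptions, not SWIFT's actual API.

    # Illustrative sketch: data model decoupled from on-disk persistence.
    import json
    from dataclasses import dataclass, asdict

    @dataclass
    class CatchmentConfig:        # hypothetical in-memory data model
        name: str
        subareas: list
        routing_links: list

    class JsonStore:              # one interchangeable persistence back end
        def save(self, cfg: CatchmentConfig, path: str) -> None:
            with open(path, "w") as f:
                json.dump(asdict(cfg), f, indent=2)

        def load(self, path: str) -> CatchmentConfig:
            with open(path) as f:
                return CatchmentConfig(**json.load(f))

A relational-database store exposing the same save/load interface could be swapped in without touching the data model, which is the property the redesign relies on for operational provenance and audit requirements.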
Abstract:
Driven by Web 2.0 technology and the almost ubiquitous presence of mobile devices, Volunteered Geographic Information (VGI) is experiencing unprecedented growth. These notable technological advancements have also opened fruitful perspectives in the field of water management and protection, raising the demand for a reconsideration of policies which takes into account the emerging trend of VGI. This research investigates the opportunity of leveraging such technology to involve citizens equipped with common mobile devices (e.g. tablets and smartphones) in a campaign of reporting of water-related phenomena. The work is carried out in collaboration with ADBPO - Autorità di bacino del fiume Po (Po river basin Authority), i.e. the entity responsible for the environmental planning and protection of the basin of the river Po. This is the longest Italian river, spreading over eight of the twenty Italian Regions and characterized by complex environmental issues. To enrich the ADBPO official database with user-generated content, a FOSS (Free and Open Source Software) architecture was designed which allows not only user field-data collection, but also data publication on the Web through standard protocols. The Open Data Kit suite allows users to collect georeferenced multimedia information using mobile devices equipped with location sensors (e.g. GPS). Users can report a number of environmental emergencies, problems or simple points of interest related to the Po river basin, taking pictures of them and providing other contextual information. Field-registered data are sent to a server and stored in a PostgreSQL database with the PostGIS spatial extension. GeoServer then provides data dissemination on the Web, while specific OpenLayers-based viewers were built to optimize data access on both desktop computers and mobile devices. Besides proving the suitability of FOSS in the frame of VGI, the system represents a successful prototype for the exploitation of users' local, real-time information aimed at managing and protecting water resources.
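As a hedged illustration of the storage step described above (the table and column names are assumptions, not the project's actual schema), a georeferenced report received by the server could be inserted into the PostGIS-enabled PostgreSQL database roughly as follows:

    # Illustrative only: store a volunteered, georeferenced report in PostGIS.
    import psycopg2

    conn = psycopg2.connect(dbname="vgi", user="vgi_user", password="***", host="localhost")
    with conn, conn.cursor() as cur:
        cur.execute(
            """
            INSERT INTO water_reports (description, photo_url, observed_at, geom)
            VALUES (%s, %s, %s, ST_SetSRID(ST_MakePoint(%s, %s), 4326))
            """,
            ("Bank erosion near bridge", "http://example.org/photo.jpg",
             "2013-05-01T10:30:00", 9.65, 45.05),   # longitude, latitude of the report
        )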
Abstract:
Recently, two international standards organizations, ISO and OGC, have carried out standardization work for GIS. Current standardization work for providing interoperability among GIS databases focuses on the design of open interfaces. However, this work has not considered procedures and methods for designing river geospatial data. As a result, river geospatial data has its own model, and when the data are shared through open interfaces among heterogeneous GIS databases, differences between models result in the loss of information. In this study, a plan was proposed both to respond to these changes in the information environment and to provide a future Smart River-based river information service, by reviewing the current state of the river geospatial data model and by improving and redesigning the database. Primary and foreign keys, which can distinguish attribute information and entity linkages, were redefined to increase usability. The database structure for attribute information and the entity-relationship diagram were redefined in order to redesign the linkages among tables from the perspective of a standard river database. In addition, this study was undertaken to expand the current supplier-oriented operating system into a demand-oriented one by establishing efficient management of river-related information and a utilization system capable of adapting to changes in the river management paradigm.
Abstract:
Libraries seek active ways to innovate amidst macroeconomic shifts, the growth of online education to help alleviate ever-growing schedule conflicts as students juggle jobs and course schedules, changing business models in publishing, and evolving information technologies. Patron-driven acquisition (PDA), also known as demand-driven acquisition (DDA), offers numerous strengths in supporting university curricula in the context of these significant shifts. PDA is a business model centered on short-term loans and subsequent purchases of ebooks resulting directly from patrons' natural use stemming from their discovery of the ebooks in library catalogs, where the ebooks' bibliographic records are loaded at regular intervals established between the library and the ebook supplier. Winthrop University's PDA plan went live in October 2011, and this article chronicles the philosophical and operational considerations, the in-library collaboration, and the technical preparations in concert with the library system vendor and ebook supplier. A short-term loan is invoked after a threshold is crossed, typically a number of pages viewed or time spent in the ebook. After a certain number of short-term loans negotiated between the library and the ebook supplier, the next short-term loan becomes an automatic purchase, after which the library owns the ebook in perpetuity. Purchasing options include single-user and multi-user licenses. Owing to high levels of need in college and university environments, Winthrop chose the multi-user license as the preferred default purchase. Only where multi-user licenses are unavailable does the automatic purchase occur with a single-user license. Data on initial use between October 2011 and February 2013 reveal that of all PDA ebooks viewed, only 30% crossed the threshold into short-term loans. Of all triggered short-term loans, Psychology was the highest-using discipline. Of all ebook views too brief to trigger short-term loans, Business was the highest-using area. Although the data are still too young to draw firm conclusions, thought-provoking usage differences between academic disciplines have begun to emerge. These differences should be considered in library plans for the best possible curricular support for each academic program. As higher education struggles with costs and course-delivery methods, libraries have an enduring lead role.
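The trigger mechanism described above can be summarized as a small piece of state logic; the sketch below is a hedged illustration in which the page threshold and loan count are hypothetical negotiated values, not Winthrop's actual terms.

    # Illustrative sketch of patron-driven acquisition (PDA) trigger logic.
    PAGE_THRESHOLD = 10         # hypothetical: pages viewed before a loan is triggered
    LOANS_BEFORE_PURCHASE = 3   # hypothetical: short-term loans before auto-purchase

    def handle_usage(ebook: dict, pages_viewed: int) -> str:
        """Return the billing event generated by one browsing session."""
        if ebook.get("owned"):
            return "no charge (owned in perpetuity)"
        if pages_viewed < PAGE_THRESHOLD:
            return "free browse (below threshold)"
        ebook["loans"] = ebook.get("loans", 0) + 1
        if ebook["loans"] > LOANS_BEFORE_PURCHASE:
            ebook["owned"] = True   # multi-user licence preferred where available
            return "automatic purchase"
        return "short-term loan"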