880 results for Value of complex use
Abstract:
There is concern that insect pollinators, such as honey bees, are currently declining in abundance and are under serious threat from habitat loss, climate change, the use of pesticides in intensive agriculture, and emerging diseases. This paper evaluates how much public support there would be for preventing further decline and maintaining the current number of bee colonies in the UK. The contingent valuation method (CVM) was used to obtain willingness to pay (WTP) for a theoretical pollinator protection policy. Respondents were asked whether they would be willing to pay to support such a policy and, if so, how much. Results show that the mean WTP to support the bee protection policy was £1.37/week/household. Based on there being 24.9 million households in the UK, this is equivalent to £1.77 billion per year. This total value can convey the importance of maintaining the overall pollination service to policy makers. We compare this total with estimates obtained using a simple market valuation of pollination for the UK.
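The aggregation from household to national level can be reproduced directly from the figures in the abstract; the 52-week annualisation is an assumption about how the per-week figure was scaled up.

```python
# Scale the mean weekly household WTP to a national annual total.
# Figures from the abstract: GBP 1.37/week/household, 24.9 million
# UK households; 52 weeks/year is an assumed annualisation.
WTP_PER_WEEK = 1.37      # GBP per household per week
HOUSEHOLDS = 24.9e6      # number of UK households
WEEKS_PER_YEAR = 52

annual_total = WTP_PER_WEEK * WEEKS_PER_YEAR * HOUSEHOLDS
print(f"{annual_total / 1e9:.2f} billion GBP/year")  # ~1.77
```

The product reproduces the £1.77 billion/year figure reported in the abstract.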
Abstract:
Tepe Pardis, a significant Neolithic–Chalcolithic site on the Tehran Plain in Iran, is, like many sites in the area, under threat from development. The site contains detailed evidence of (1) the Neolithic–Chalcolithic transition, (2) an Iron Age cemetery and (3) how the inhabitants adapted to an unstable fan environment through resource exploitation (of clay deposits for relatively large-scale ceramic production by c. 5000 BC, and importantly, possible cutting of artificial water channels). Given this significance, models have been produced to better understand settlement distribution and change in the region. However, these models must be tied into a greater understanding of the impact of the geosphere on human development over this period. Forming part of a larger project focusing on the transformation of simple, egalitarian Neolithic communities into more hierarchical Chalcolithic ones, the site has become the focus of a multidisciplinary project to address this issue. Through the combined use of sedimentary and limited pollen analysis, radiocarbon and optically stimulated luminescence dating (the application of the last still rare in Iran), a greater understanding of the impact of alluvial fan development on human settlement through alluviation and the development of river channel sequences is possible. Notably, the findings presented here suggest that artificial irrigation was occurring at the site as early as 6.7±0.4 ka (4300–5100 BC).
Abstract:
A manageable, relatively inexpensive model was constructed to predict the loss of nitrogen and phosphorus from a complex catchment to its drainage system. The model used an export coefficient approach, calculating the total nitrogen (N) and total phosphorus (P) load delivered annually to a water body as the sum of the individual loads exported from each nutrient source in its catchment. The export coefficient modelling approach permits scaling up from plot-scale experiments to the catchment scale, allowing application of findings from field experimental studies at a suitable scale for catchment management. The catchment of the River Windrush, a tributary of the River Thames, UK, was selected as the initial study site. The Windrush model predicted nutrient loading within 2% of observed total nitrogen load and 0.5% of observed total phosphorus load in 1989. The export coefficient modelling approach was then validated by application in a second research basin, the catchment of Slapton Ley, south Devon, which has markedly different catchment hydrology and land use. The Slapton model was calibrated within 2% of observed total nitrogen load and 2.5% of observed total phosphorus load in 1986. Both models proved sensitive to the impact of temporal changes in land use and management on water quality, and were therefore used to evaluate the potential impact of proposed pollution control strategies on the nutrient loading delivered to the River Windrush and Slapton Ley.
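The core of an export coefficient model is a single summation: annual load equals the export coefficient of each source multiplied by its extent, summed over all sources. A minimal sketch follows; all coefficients and extents are illustrative placeholders, not the calibrated Windrush or Slapton values.

```python
# Export coefficient model sketch: total annual nutrient load is the
# sum over sources of (export coefficient x extent). Area sources are
# kg/ha/yr x ha; population/livestock sources are kg/head/yr x head.
def total_load(sources):
    """sources: list of (export_coefficient, extent) pairs."""
    return sum(coeff * extent for coeff, extent in sources)

# Hypothetical nitrogen sources for an imaginary catchment:
nitrogen_sources = [
    (30.0, 5_000),    # arable land, kg N/ha/yr x ha
    (10.0, 8_000),    # permanent grassland, kg N/ha/yr x ha
    (2.5, 40_000),    # resident population, kg N/person/yr x persons
]
print(f"Total N load: {total_load(nitrogen_sources):,.0f} kg/yr")
```

Calibration then consists of adjusting the coefficients until the modelled load matches the observed load, which is how the 2% and 0.5% agreements cited above are obtained.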
Abstract:
Although most researchers recognise that the language repertoire of bilinguals can vary, few studies have tried to address variation in bilingual competence in any detail. This study aims to take a first step towards further understanding the way in which bilingual competencies can vary at the level of syntax by comparing the use of syntactic embeddings among three different groups of Turkish–German bilinguals. The approach of the present paper is new in that different groups of bilinguals are compared with each other, and not only with monolingual speakers, as is common in most studies in the field. The analysis focuses on differences in the use of embeddings in Turkish, which are generally considered to be one of the more complex aspects of Turkish grammar. The study shows that young Turkish–German bilingual adults who were born and raised in Germany use fewer and less complex embeddings than Turkish–German bilingual returnees who had lived in Turkey for eight years at the time of recording. The present study provides new insights into the nature of bilingual competence, as well as a new perspective on syntactic change in immigrant Turkish as spoken in Europe.
Abstract:
In recent years, the area of data mining has been experiencing considerable demand for technologies that extract knowledge from large and complex data sources. There has been substantial commercial interest, as well as active research, aiming to develop new and improved approaches for extracting information, relationships, and patterns from large datasets. Artificial neural networks (NNs) are popular biologically inspired intelligent methodologies whose classification, prediction, and pattern recognition capabilities have been utilized successfully in many areas, including science, engineering, medicine, business, banking, and telecommunication, among other fields. This paper highlights, from a data mining perspective, the implementation of NNs using supervised and unsupervised learning for pattern recognition, classification, prediction, and cluster analysis, and focuses the discussion on their usage in bioinformatics and financial data analysis tasks. © 2012 Wiley Periodicals, Inc.
Abstract:
The construction field is dynamic and dominated by complex, ill-defined problems for which myriad possible solutions exist. Teaching students to solve construction-related problems requires an understanding of the nature of these complex problems as well as the implementation of effective instructional strategies to address them. Traditional approaches to teaching construction planning and management have long been criticized for presenting students primarily with well-defined problems, an approach inconsistent with the challenges encountered in the industry. However, growing evidence suggests that innovative teaching approaches, such as interactive simulation games, offer more active, hands-on, problem-based learning opportunities for students to synthesize and test acquired knowledge in settings more closely aligned with real-life construction scenarios. Simulation games have demonstrated educational value in increasing student problem-solving skills and motivation through critical attributes such as interaction and feedback-supported active learning. Nevertheless, broad acceptance of simulation games in construction engineering education remains limited. While the benefits are recognized, research on the role of simulation games in educational settings lacks a unified approach to developing, implementing, and evaluating these games. To address this gap, this paper provides an overview of the challenges associated with evaluating the effectiveness of simulation games in construction education that still impede their wide adoption. An overview of the current status, together with results from the recently implemented Virtual Construction Simulator (VCS) game at Penn State, provides lessons learned and is intended to guide future efforts in developing interactive simulation games so that they reach their full potential.
Abstract:
Using monthly time-series data 1999-2013, the paper shows that markets for agricultural commodities provide a yardstick for real purchasing power, and thus a reference point for the real value of fiat currencies. The daily need for each adult to consume about 2800 food calories is universal; data from FAO food balance sheets confirm that the world basket of food consumed daily is non-volatile in comparison to the volatility of currency exchange rates, and so the replacement cost of food consumed provides a consistent indicator of economic value. Food commodities are storable for short periods, but ultimately perishable, and this exerts continual pressure for markets to clear in the short term; moreover, food calories can be obtained from a very large range of foodstuffs, and so most households are able to use arbitrage to select a near optimal weighting of quantities purchased. The paper proposes an original method to enable a standard of value to be established, definable in physical units on the basis of actual worldwide consumption of food goods, with an illustration of the method.
Abstract:
Paternal biocontainment methods (PBMs) act by preventing pollen-mediated transgene flow. They are compromised by transgene escape via the crop-maternal line. We therefore assess the efficacy of PBMs for transgenic rapeseed (Brassica napus) biocontainment across the United Kingdom by estimating crop-maternal hybridization with its two progenitor species. We used remote sensing, field surveys, agricultural statistics, and meta-analysis to determine the extent of sympatry between the crop and populations of riparian and weedy B. rapa and B. oleracea. We then estimated the incidence of crop-maternal hybridization across all settings to predict the efficacy of PBMs. Evidence of crop chloroplast capture by the progenitors was expanded to a national scale, revealing that crop-maternal gene flow occurs at widely variable rates and is dependent on both the recipient and setting. We use these data to explore the value that this kind of biocontainment can bring to genetic modification (GM) risk management in terms of reducing the impact that hybrids have on the environment rather than preventing or reducing hybrid abundance per se.
Abstract:
This study investigated the impact of thermoplastic extrusion on the nutritive quality of bovine rumen protein. Proximal composition, amino acid profile, and in vivo true protein digestibility in rats were determined for raw (RBR) and extruded (EBR) rumen. Raw and extruded bovine rumen presented high percentages of protein (more than 95% on a dry basis). Neither raw nor extruded proteins had any limiting amino acid, and the RBR and EBR amino acid scores were, respectively, 1.28 (leucine) and 1.25 (methionine plus cystine). Extrusion significantly reduced true protein digestibility from 97.7% to 93.1% (p < 0.001), but protein digestibility-corrected amino acid scores for both proteins (RBR and EBR) were 100%. Animal growth presented comparable profiles using raw and extruded rumen. In conclusion, thermoplastic extrusion did not affect the protein quality of bovine rumen, and this does not hinder the use of this material as a food ingredient. (C) 2009 Elsevier Ltd. All rights reserved.
Abstract:
Pulp and paper production is a very energy-intensive industry sector, and both Sweden and the U.S. are major pulp and paper producers. This report examines the energy use and CO2 emissions connected with the pulp and paper industry in the two countries from a life-cycle perspective.

New technologies make it possible to increase electricity production in the integrated pulp and paper mill through black liquor gasification and a combined cycle (BLGCC). That way, the mill can produce excess electricity, which can be sold and replace electricity produced in power plants. In this process, the by-products formed during pulp making are used as fuel to produce electricity. In today's pulp and paper mills, the technology for generating energy from the by-product in a Tomlinson boiler is not as efficient as the BLGCC technology. Scenarios have been designed to investigate the results of using the BLGCC technique in a life-cycle analysis. Two scenarios are represented by a 1994 mill in the U.S. and a 1994 mill in Sweden, based on the average energy intensity of pulp and paper mills as operating in 1994 in each country. The two other scenarios are constituted by a »reference mill« in the U.S. and Sweden using state-of-the-art technology. We investigate the impact of varying recycling rates on total energy use and CO2 emissions from the production of printing and writing paper. To economize on wood and thereby save trees, the trees displaced by recycling can be used in a biomass gasification combined cycle (BIGCC) to produce electricity in a power station. This produces extra electricity with a lower CO2 intensity than electricity generated by, for example, coal-fired power plants.

The life-cycle analysis in this thesis also includes waste treatment in the paper life cycle. Both Sweden and the U.S. recycle paper, yet a great deal of paper waste remains, and this paper forms part of each country's municipal solid waste (MSW). Much of the MSW is landfilled, but parts of it are incinerated to generate electricity. The thesis designs special scenarios for the use of MSW in the life-cycle analysis.

This report studies and compares two different countries and two different BLGCC efficiencies across four scenarios. This gives a wide survey and points to essential parameters to reflect on when making assumptions in a life-cycle analysis. The report shows that three key parameters have to be carefully considered when making a life-cycle analysis of wood from an energy and CO2-emission perspective in pulp and paper mills in the U.S. and Sweden: first, the energy efficiency of the pulp and paper mill; second, the efficiency of the BLGCC; and last, the CO2 intensity of the electricity displaced by BIGCC- or BLGCC-generated electricity. It also shows that, with current technology, it is possible to produce CO2-free paper with a waste paper content of up to 30%. The thesis discusses the system boundaries and the assumptions. Further and more detailed research, including among other things the system boundaries and forestry, is recommended for more specific answers.
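The crediting logic behind the third key parameter can be sketched in a few lines: excess electricity sold by the mill displaces grid power, and the avoided grid emissions are subtracted from the mill's own. All numbers below are hypothetical placeholders, not figures from the thesis.

```python
# Illustrative net-CO2 accounting for a BLGCC-equipped mill: emissions
# avoided by displacing grid electricity are credited against the
# mill's own emissions. Inputs are hypothetical.
def net_mill_co2(mill_emissions_t, excess_electricity_mwh, grid_intensity_t_per_mwh):
    """Net CO2 (tonnes/yr) after crediting displaced grid electricity."""
    return mill_emissions_t - excess_electricity_mwh * grid_intensity_t_per_mwh

# A mill emitting 100,000 t CO2/yr that exports 250,000 MWh/yr against
# a grid at 0.4 t CO2/MWh nets out to zero:
print(net_mill_co2(100_000, 250_000, 0.4))  # 0.0
```

The sketch makes plain why the displaced grid's CO2 intensity dominates the result: the same exported electricity yields a far larger credit against a coal-heavy grid than against a low-carbon one.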
Abstract:
In the field of Information and Communication Technologies for Development (ICT4D), ICT use in education is well studied. Education is often seen as a prerequisite for development, and ICTs are believed to aid education, e.g. by making it more accessible and increasing its quality. In this paper we study the access and use of ICT in a study circle (SC) education program on the south coast of Kenya. The study is qualitative, reporting results based on interviews and observations with SC participants, government officers, and SC coordinators and teachers. The study builds on the capability approach perspective of development, which focuses on individuals' opportunities and ability to live a life that they value. The aim of the study is to investigate the capability outcomes enabled through the capability inputs of access to and use of ICT in education, as well as the factors that enabled and/or restricted those outcomes. Findings show that many opportunities have been enabled, such as an increased ability to generate an income, learning benefits, community development, and basic human development (e.g. literacy and self-confidence). However, conversion factors such as a poorly developed infrastructure and poor IT literacy prevent many individuals from taking full advantage of the ICT and the opportunities it enables.
Abstract:
We analyze a dynamic principal–agent model where an infinitely-lived principal faces a sequence of finitely-lived agents who differ in their ability to produce output. The ability of an agent is initially unknown to both him and the principal. An agent's effort affects the information on ability that is conveyed by performance. We characterize the equilibrium contracts and show that they display short-term commitment to employment when the impact of effort on output is persistent but delayed. By providing insurance against early termination, commitment encourages agents to exert effort, and thus improves the principal's ability to identify their talent. We argue that this helps explain the use of probationary appointments in environments in which there exists uncertainty about individual ability.
Abstract:
The value of life methodology has been recently applied to a wide range of contexts as a means to evaluate welfare gains attributable to mortality reductions and health improvements. Yet, it suffers from an important methodological drawback: it does not incorporate into the analysis child mortality, individuals’ decisions regarding fertility, and their altruism towards offspring. Two interrelated dimensions of fertility choice are potentially essential in evaluating life expectancy and health related gains. First, child mortality rates can be very important in determining welfare in a context where individuals choose the number of children they have. Second, if altruism motivates fertility, life expectancy gains at any point in life have a twofold effect: they directly increase utility via increased survival probabilities, and they increase utility via increased welfare of the offspring. We develop a manageable way to deal with value of life valuations when fertility choices are endogenous and individuals are altruistic towards their offspring. We use the methodology developed in the paper to value the reductions in mortality rates experienced by the US between 1965 and 1995. The calculations show that, with a very conservative set of parameters, altruism and fertility can easily double the value of mortality reductions for a young adult, when compared to results obtained using the traditional value of life methodology.
Abstract:
Aim. Duplex scanning has been used in the evaluation of the aorta and proximal arteries of the lower extremities, but has limitations in evaluating the arteries of the leg. The utilization of ultrasonographic contrast (USC) may be helpful in improving the quality of the image in these arteries. The objective of the present study was to verify whether USC increases the diagnostic accuracy of patency of the leg arteries and whether it diminishes the time needed to perform duplex scanning. Methods. Twenty patients with critical ischemia (20 lower extremities) were examined by standard duplex scanning, duplex scanning with contrast, and digital subtraction arteriography (DSA). The 3 arteries of the leg were divided into 3 segments, for a total of 9 segments per limb. Each segment was evaluated for patency in order to compare the 3 diagnostic methods. Comparison was made between standard duplex scanning and duplex scanning with contrast in terms of quality of the color-coded Doppler signal and of the spectral curve, and also of the time to perform the exams. Results. Duplex scanning with contrast was similar to arteriography in relation to patency diagnosis (p>0.3) and even superior in some of the segments. Standard duplex scanning was inferior to arteriography and to duplex scanning with contrast (p<0.001). There were improvements of 70% in intensity of the color-coded Doppler signal and 76% in the spectral curve after the utilization of contrast. The time necessary to perform the examinations was 23.7 minutes for standard duplex scanning and 16.9 minutes for duplex scanning with contrast (p<0.001). Conclusion. The use of ultrasonographic contrast increased the accuracy of the diagnosis of patency of leg arteries and diminished the time necessary for the execution of duplex scanning.