96 results for Commodity exchanges


Relevance:

10.00%

Publisher:

Abstract:

• In current models, the ecophysiological effects of CO2 create both woody thickening and terrestrial carbon uptake, as observed now, and the increases in forest cover and terrestrial carbon storage that took place after the last glacial maximum (LGM). Here, we aimed to assess the realism of modelled vegetation and carbon storage changes between the LGM and the pre-industrial Holocene (PIH).
• We applied Land Processes and eXchanges (LPX), a dynamic global vegetation model (DGVM), with lowered CO2 and LGM climate anomalies from the Palaeoclimate Modelling Intercomparison Project (PMIP II), and compared the model results with palaeodata.
• Modelled global gross primary production was reduced by 27–36% and carbon storage by 550–694 Pg C compared with the PIH. Comparable reductions have been estimated from stable isotopes. The modelled areal reduction of forests is broadly consistent with pollen records. Despite reduced productivity and biomass, tropical forests accounted for a greater proportion of modelled land carbon storage at the LGM (28–32%) than at the PIH (25%).
• The agreement between palaeodata and model results for the LGM is consistent with the hypothesis that the ecophysiological effects of CO2 influence tree–grass competition and vegetation productivity, and suggests that these effects are also at work today.

Relevance:

10.00%

Publisher:

Abstract:

Given the decision to include small-scale sinks projects implemented by low-income communities in the clean development mechanism of the Kyoto Protocol, the paper explores some of the basic governance conditions that such carbon forestry projects will have to meet if they are to be successfully put into practice. To date there are no validated small-scale sinks projects, and investors have shown little interest in financing such projects, possibly due to the risks and uncertainties associated with sinks projects. Some suggest, however, that carbon has the potential to become a serious commodity on the world market, so governance over ownership, rights and responsibilities merits discussion. Drawing on the interdisciplinary development literature, as well as on the literature on livelihoods and democratic decentralization in forestry, the paper explores how to adapt forest carbon projects to the realities encountered in the local context. It also highlights the importance of capitalizing on synergies with other rural development strategies, ensuring stakeholder participation by working with accountable, representative local organizations, and creating flexible and adaptive project designs.

Relevance:

10.00%

Publisher:

Abstract:

In this paper we provide an alternative explanation for why illegal immigration can exhibit substantial fluctuation. We develop a model economy in which migrants make decisions in the face of uncertain border enforcement and lump-sum transfers from the host country. The uncertainty is extrinsic in nature, a sunspot, and arises as a result of ambiguity regarding the commodity price of money. Migrants are restricted from participating in state-contingent insurance markets in the host country, whereas host country natives are not. Volatility in migration flows stems from two distinct sources: first, the tension between transfers inducing migration and enforcement discouraging it; and second, the existence of a sunspot. Finally, we examine the impact of a change in the government's tax/transfer policies on migration.

Relevance:

10.00%

Publisher:

Abstract:

Although certified Fairtrade continues to use discourses of defetishization, its move into mainstream markets has acted to refetishize the consumer–producer relationship through the use of a standardized label, which acts as a substitute for engaged knowledges. Through Fairhills, a South African Fairtrade wine project, this paper explores the contextual complexity on the producer side of the commodity network. By incorporating the national discourse of Black Economic Empowerment into its operations, both in Fairhills and in South Africa in general, Fairtrade has adapted to this context, ensuring its relevance and credibility to stakeholders. However, in the UK, little more information than that commonly associated with Fairtrade is offered to Fairhills consumers. The particular market challenges facing Fairtrade wine in the UK make this negotiation between regulation and representation extremely pertinent. A productive way forward may be to conceptualize commodity fetishism as a continuum rather than a binary, particularly when considering the difficult balance required when adding complexity to the targeted message of the existing label. This strategy for the sustainability of Fairtrade may be enhanced by utilizing the micro-level dynamism and adaptability that this paper shows is inherent, and indeed essential, to the durability and transferability of the discourse of Fairtrade.

Relevance:

10.00%

Publisher:

Abstract:

So-called ‘radical’ and ‘critical’ pedagogy seems to be everywhere these days on the landscapes of geographical teaching praxis and theory. Part of the remit of radical/critical pedagogy involves a de-centring of the traditional ‘banking’ method of pedagogical praxis. Yet, how do we challenge this ‘banking’ model of knowledge transmission in both a large-class setting and around the topic of commodity geographies where the banking model of information transfer still holds sway? This paper presents a theoretically and pedagogically driven argument, as well as a series of practical teaching ‘techniques’ and tools—mind-mapping and group work—designed to promote ‘deep learning’ and a progressive political potential in a first-year large-scale geography course centred around lectures on the Geographies of Consumption and Material Culture. Here students are not only asked to place themselves within and without the academic materials and other media but are urged to make intimate connections between themselves and their own consumptive acts and the commodity networks in which they are enmeshed. Thus, perhaps pedagogy needs to be emplaced firmly within the realms of research practice rather than as simply the transference of research findings.

Relevance:

10.00%

Publisher:

Abstract:

We present a benchmark system for global vegetation models. This system provides a quantitative evaluation of multiple simulated vegetation properties, including primary production; seasonal net ecosystem production; vegetation cover, composition and height; fire regime; and runoff. The benchmarks are derived from remotely sensed gridded datasets and site-based observations. The datasets allow comparisons of annual average conditions and seasonal and inter-annual variability, and they allow the impact of spatial and temporal biases in means and variability to be assessed separately. Specifically designed metrics quantify model performance for each process, and are compared to scores based on the temporal or spatial mean value of the observations and a "random" model produced by bootstrap resampling of the observations. The benchmark system is applied to three models: a simple light-use efficiency and water-balance model (the Simple Diagnostic Biosphere Model: SDBM), and the Lund-Potsdam-Jena (LPJ) and Land Processes and eXchanges (LPX) dynamic global vegetation models (DGVMs). In general, the SDBM performs better than either of the DGVMs. It reproduces independent measurements of net primary production (NPP) but underestimates the amplitude of the observed CO2 seasonal cycle. The two DGVMs show little difference for most benchmarks (including the inter-annual variability in the growth rate and seasonal cycle of atmospheric CO2), but LPX represents burnt fraction demonstrably more accurately. Benchmarking also identified several weaknesses common to both DGVMs. The benchmarking system provides a quantitative approach for evaluating how adequately processes are represented in a model, identifying errors and biases, tracking improvements in performance through model development, and discriminating among models. Adoption of such a system would do much to improve confidence in terrestrial model predictions of climate change impacts and feedbacks.
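The scoring logic described above (comparing a model's metric value against scores for the observation mean and for a "random" model built by bootstrap resampling) can be sketched in a few lines. The NME-style metric, the data values, and the resampling scheme below are illustrative assumptions, not the paper's exact formulation.

```python
import random

def nme(obs, sim):
    """Normalized mean error: mean |sim - obs| scaled by the mean
    absolute deviation of the observations from their own mean."""
    mean_obs = sum(obs) / len(obs)
    denom = sum(abs(o - mean_obs) for o in obs) / len(obs)
    return (sum(abs(s - o) for s, o in zip(sim, obs)) / len(obs)) / denom

def random_model_score(obs, n_boot=1000, seed=42):
    """Average score of a 'random' model built by bootstrap
    resampling the observations themselves."""
    rng = random.Random(seed)
    scores = [nme(obs, [rng.choice(obs) for _ in obs]) for _ in range(n_boot)]
    return sum(scores) / n_boot

obs = [2.1, 3.4, 1.8, 4.0, 2.9, 3.3]   # illustrative observations
sim = [2.0, 3.1, 2.2, 3.7, 3.0, 3.5]   # illustrative model output
print(nme(obs, sim))                    # model score
print(random_model_score(obs))          # random-model benchmark score
```

By construction, a model that predicts the observation mean everywhere scores exactly 1.0 with this metric, so scores below 1 indicate skill beyond the mean benchmark.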

Relevance:

10.00%

Publisher:

Abstract:

City centres, characterised by spatial and temporal complexity, are challenging environments for micrometeorological research. This paper considers the impact of sensor location and of the heterogeneity of the urban surface on flux observations in the dense city centre of London, UK. Data gathered at two sites in close vicinity, but with different measurement heights, were analysed to investigate the influence of source area characteristics on long-term radiation and turbulent heat fluxes. Combining consideration of diffuse radiation and the effects of specular reflections, the non-Lambertian urban surface is found to affect measurements of surface albedo. Comparisons of observations from the two sites reveal that turbulent heat fluxes are similar under some flow conditions. However, the sites mostly observe processes at different scales due to their differing measurement heights, highlighting the critical impact of siting sensors in urban areas. A detailed source area analysis is presented to investigate the surface controls influencing the energy exchanges at the different scales.

Relevance:

10.00%

Publisher:

Abstract:

Natural mineral aerosol (dust) is an active component of the climate system and plays multiple roles in mediating physical and biogeochemical exchanges between the atmosphere, land surface and ocean. Changes in the amount of dust in the atmosphere are caused both by changes in climate (precipitation, wind strength, regional moisture balance) and changes in the extent of dust sources caused by either anthropogenic or climatically induced changes in vegetation cover. Models of the global dust cycle take into account the physical controls on dust deflation from prescribed source areas (based largely on soil wetness and vegetation cover thresholds), dust transport within the atmospheric column, and dust deposition through sedimentation and scavenging by precipitation. These models successfully reproduce the first-order spatial and temporal patterns in atmospheric dust loading under modern conditions. Atmospheric dust loading was as much as an order-of-magnitude larger than today during the last glacial maximum (LGM). While the observed increase in emissions from northern Africa can be explained solely in terms of climate changes (colder, drier and windier glacial climates), increased emissions from other regions appear to have been largely a response to climatically induced changes in vegetation cover and hence in the extent of dust source areas. Model experiments suggest that the increased dust loading in tropical regions had an effect on radiative forcing comparable to that of low glacial CO2 levels. Changes in land-use are already increasing the dust loading of the atmosphere. However, simulations show that anthropogenically forced climate changes substantially reduce the extent and productivity of natural dust sources. 
Positive feedbacks initiated by a reduction of dust emissions from natural source areas on both radiative forcing and atmospheric CO2 could substantially mitigate the impacts of land-use changes, and need to be considered in climate change assessments.
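As a caricature of the threshold controls on deflation described above, the sketch below emits dust only from cells that are dry and sparsely vegetated and where the wind exceeds a deflation threshold. The threshold values, the cubic wind-speed scaling and the constant `c` are hypothetical placeholders for illustration, not the parameters of any actual dust model.

```python
def dust_emission(wind, soil_wetness, veg_cover,
                  wet_thresh=0.3, veg_thresh=0.5, u_thresh=6.0, c=1.0e-9):
    """Schematic deflation scheme: a grid cell is an active source only
    when soil wetness and vegetation cover are below thresholds; the
    flux then scales with the cube of the wind speed."""
    if soil_wetness > wet_thresh or veg_cover > veg_thresh:
        return 0.0                      # cell is not an active source
    if wind <= u_thresh:
        return 0.0                      # wind below deflation threshold
    return c * wind**3 * (1.0 - veg_cover)

print(dust_emission(10.0, 0.1, 0.2))   # dry, sparse, windy: emits
print(dust_emission(10.0, 0.6, 0.2))   # too wet: no emission
```

The same structure makes the paper's point visible: expanding source areas (lower wetness or vegetation cover) raises emissions even with the wind field held fixed.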

Relevance:

10.00%

Publisher:

Abstract:

The animal gastrointestinal tract houses a large microbial community, the gut microbiota, that confers many benefits on its host, such as protection from pathogens and provision of essential metabolites. Metagenomic approaches have defined the chicken fecal microbiota in other studies, but here we wished to assess the correlation between the metagenome and the bacterial proteome in order to better understand the healthy chicken gut microbiota. We performed high-throughput sequencing of 16S rRNA gene amplicons and metaproteomics analysis of fecal samples to determine microbial gut composition and protein expression. 16S rRNA gene sequencing analysis identified Clostridiales, Bacteroidaceae, and Lactobacillaceae as the most abundant taxa in the gut. For metaproteomics analysis, peptides were generated by using the FASP (filter-aided sample preparation) method and subsequently fractionated by strong anion exchange. Metaproteomics analysis identified 3,673 proteins. Among the most frequently identified proteins, 380 belonged to Lactobacillus spp., 155 to Clostridium spp., and 66 to Streptococcus spp. The most frequently identified proteins were heat shock chaperones, including 349 GroEL proteins from many bacterial species, whereas the most abundant enzymes were pyruvate kinases, as judged by the number of peptides identified per protein (spectral counting). Gene ontology and KEGG pathway analyses revealed the functions and locations of the identified proteins. The findings of both the metaproteomics and the 16S rRNA sequencing analyses are discussed.
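Spectral counting, the abundance proxy mentioned above, simply tallies peptide-spectrum matches per protein. A minimal sketch follows; the protein and peptide strings are made up for illustration.

```python
from collections import Counter

def spectral_counts(peptide_hits):
    """Aggregate peptide-spectrum matches per protein (spectral
    counting), a simple proxy for relative protein abundance."""
    counts = Counter(protein for protein, _ in peptide_hits)
    return counts.most_common()          # (protein, count), most abundant first

# (protein, peptide) pairs as they might come out of a search engine
hits = [("GroEL", "AAVEEGIVPGGG"), ("GroEL", "LSVPCSDSK"),
        ("PyruvateKinase", "GLNVAR"), ("GroEL", "TVIIEQSWGSPK")]
print(spectral_counts(hits))             # GroEL ranked first with 3 spectra
```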

Relevance:

10.00%

Publisher:

Abstract:

The extent of the surface area that is sunlit is critical for radiative energy exchanges and therefore for a wide range of applications that require urban land surface models (ULSMs), ranging from human comfort to weather forecasting. Here a computationally demanding shadow-casting algorithm is used to assess the capability of a simple single-layer urban canopy model, which assumes an infinitely long rotating canyon (ILC), to reproduce sunlit areas on roofs and roads over central London. Results indicate that sunlit road areas are well represented but somewhat smaller using an ILC, while sunlit roof areas are consistently larger, especially for dense urban areas. The largest deviations from real-world sunlit areas are found for roofs during mornings and evenings. There are also indications that sunlit fractions on walls are overestimated using an ILC during mornings and evenings. The implications of these errors depend on the application targeted. For example, (independent of albedo) ULSMs used in numerical weather prediction applying an ILC representation of the urban form will overestimate outgoing shortwave radiation from roofs due to the overestimation of the sunlit fraction of the roofs. Complications of deriving height-to-width ratios from real-world data are also discussed.
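For intuition, the sunlit road fraction of an infinitely long canyon follows from simple geometry. The sketch below uses one common single-layer construction in which the shadow cast on the canyon floor has length H·tan(zenith)·sin(canyon–sun angle); this is an assumption for illustration, not necessarily the exact scheme evaluated in the paper.

```python
import math

def sunlit_road_fraction(hw, zenith_deg, canyon_sun_angle_deg):
    """Sunlit fraction of the road in an infinitely long canyon.
    hw: canyon height-to-width ratio H/W
    zenith_deg: solar zenith angle
    canyon_sun_angle_deg: angle between canyon axis and solar azimuth
    Wall shadow length on the floor: H * tan(zenith) * sin(angle)."""
    shadow = hw * math.tan(math.radians(zenith_deg)) * \
             abs(math.sin(math.radians(canyon_sun_angle_deg)))
    return max(0.0, 1.0 - shadow)        # clamp when the road is fully shaded

print(sunlit_road_fraction(1.0, 30.0, 90.0))  # high sun, perpendicular canyon
print(sunlit_road_fraction(1.0, 80.0, 90.0))  # low sun: road fully shaded
```

The clamping at zero illustrates why low-sun periods (mornings and evenings) are exactly where a simplified canyon geometry deviates most from the real surface.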

Relevance:

10.00%

Publisher:

Abstract:

The complexity of current and emerging architectures provides users with options about how best to use the available resources, but makes predicting performance challenging. In this work a benchmark-driven model is developed for a simple shallow water code on a Cray XE6 system, to explore how deployment choices such as domain decomposition and core affinity affect performance. The resource sharing present in modern multi-core architectures adds various levels of heterogeneity to the system. Shared resources often include cache, memory, network controllers and, in some cases, floating-point units (as in the AMD Bulldozer), which means that access time depends on the mapping of application tasks and on a core's location within the system. Heterogeneity increases further with the use of hardware accelerators such as GPUs and the Intel Xeon Phi, where many specialist cores are attached to general-purpose cores. This trend towards shared resources and non-uniform cores is expected to continue into the exascale era. The complexity of these systems means that various runtime scenarios are possible, and it has been found that under-populating nodes, altering the domain decomposition and non-standard task-to-core mappings can dramatically alter performance. Discovering this, however, is often a process of trial and error. To better inform this process, a performance model was developed for a simple regular grid-based kernel code, shallow. The code comprises two distinct types of work, loop-based array updates and nearest-neighbour halo exchanges, and separate performance models were developed for each part, both based on a similar methodology. Application-specific benchmarks were run to measure performance for different problem sizes under different execution scenarios. These results were then fed into a performance model that derives resource usage for a given deployment scenario, interpolating between results as necessary.
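The final step, interpolating measured benchmark results to predict a deployment scenario, might look like the following sketch. The two-component split (loop-based array updates plus halo exchange) follows the description above, but the timings, sizes and function names are invented for illustration.

```python
def interpolate_time(benchmarks, n):
    """Piecewise-linear interpolation of measured runtimes; benchmarks is
    a sorted list of (problem_size, seconds) pairs from application-
    specific benchmark runs."""
    sizes = [s for s, _ in benchmarks]
    if n <= sizes[0]:
        return benchmarks[0][1]
    if n >= sizes[-1]:
        return benchmarks[-1][1]
    for (s0, t0), (s1, t1) in zip(benchmarks, benchmarks[1:]):
        if s0 <= n <= s1:
            return t0 + (t1 - t0) * (n - s0) / (s1 - s0)

def predict(compute_bench, halo_bench, local_n, halo_n):
    """Total time for one deployment scenario = array-update cost on the
    local subdomain + nearest-neighbour halo-exchange cost."""
    return interpolate_time(compute_bench, local_n) + \
           interpolate_time(halo_bench, halo_n)

compute = [(128, 0.10), (256, 0.42), (512, 1.70)]   # illustrative measurements
halo    = [(128, 0.02), (256, 0.05), (512, 0.09)]
print(predict(compute, halo, 192, 192))
```

Sweeping `local_n` and `halo_n` over the sizes produced by candidate decompositions is one way to rank scenarios without the trial-and-error runs mentioned above.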

Relevance:

10.00%

Publisher:

Abstract:

Globalization, either directly or indirectly (e.g. through structural adjustment reforms), has called for profound changes in the previously existing institutional order. Some changes adversely impacted the production and market environment of many coffee producers in developing countries, resulting in more risky and less remunerative coffee transactions. This paper focuses on customization of a tropical commodity, fair-trade coffee (FTC), as an approach to mitigating the effects of worsened market conditions for small-scale coffee producers in less developed countries. Fair-trade labeling is viewed as a form of “de-commodification” of coffee through product differentiation on ethical grounds. This is significant not only as a solution to the market failure caused by pervasive information asymmetries along the supply chain, but also as a means of revitalizing the agricultural-commodity-based trade of less developed countries (LDCs) that has been languishing under globalization. More specifically, fair trade is an example of how the same strategy adopted by developed countries’ producers/processors (i.e. the sequence product differentiation – institutional certification – advertisement) can be used by LDC producers to increase the reputation content of their outputs by transforming them from mere commodities into “de-commodified” (i.e. customized and more reputed) goods. The resulting segmentation of the world coffee market makes it possible to meet the demand of consumers with a preference for this “(ethically) customized” coffee and to transfer a share of the accruing economic rents backward to the fair-trade coffee producers in LDCs. It should, however, be stressed that this outcome cannot be taken for granted, since investments are needed to promote the required institutional innovations. In Italy, FTC is a niche market with very few private brands selling this product. However, an increase in FTC market share could be a big commercial opportunity for farmers in LDCs and other economic agents involved along the international coffee chain. Hence, this research explores consumers’ knowledge of labels promoting quality products, coffee consumption habits, brand loyalty, willingness to pay, and market segmentation according to the heterogeneity of preferences for coffee products. The latter was assessed by developing a D-efficient design in which stimuli refinement was tested during two focus groups.
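For reference, the D-efficiency criterion mentioned above is commonly summarised by the D-error of a design matrix. The sketch below uses the linear-model simplification det((X'X)⁻¹)^(1/p); a choice-experiment version would use the multinomial-logit information matrix instead, and the coded designs here are made up for illustration.

```python
import numpy as np

def d_error(X):
    """D-error of a coded design matrix X (smaller is better):
    determinant of the inverse information matrix X'X, normalised
    by the number of parameters p."""
    p = X.shape[1]
    return np.linalg.det(np.linalg.inv(X.T @ X)) ** (1.0 / p)

# A full-factorial two-attribute design vs. a partially collinear one
X_good = np.array([[1, 1], [1, -1], [-1, 1], [-1, -1]], dtype=float)
X_poor = np.array([[1, 1], [1, 1], [-1, -1], [1, -1]], dtype=float)
print(d_error(X_good))   # 0.25
print(d_error(X_poor))   # larger: less efficient design
```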

Relevance:

10.00%

Publisher:

Abstract:

Climate controls fire regimes through its influence on the amount and types of fuel present and their dryness. CO2 concentration constrains primary production by limiting photosynthetic activity in plants. However, although fuel accumulation depends on biomass production, and hence on CO2 concentration, the quantitative relationship between atmospheric CO2 concentration and biomass burning is not well understood. Here a fire-enabled dynamic global vegetation model (the Land surface Processes and eXchanges model, LPX) is used to attribute glacial–interglacial changes in biomass burning to an increase in CO2, which would be expected to increase primary production and therefore fuel loads even in the absence of climate change, vs. climate change effects. Four general circulation models provided last glacial maximum (LGM) climate anomalies – that is, differences from the pre-industrial (PI) control climate – from the Palaeoclimate Modelling Intercomparison Project Phase 2, allowing the construction of four scenarios for LGM climate. Modelled carbon fluxes from biomass burning were corrected for the model's observed prediction biases in contemporary regional average values for biomes. With LGM climate and low CO2 (185 ppm) effects included, the modelled global flux at the LGM was in the range of 1.0–1.4 Pg C year⁻¹, about a third less than that modelled for PI time. LGM climate with pre-industrial CO2 (280 ppm) yielded unrealistic results, with global biomass burning fluxes similar to or even greater than in the pre-industrial climate. It is inferred that a substantial part of the increase in biomass burning after the LGM must be attributed to the effect of increasing CO2 concentration on primary production and fuel load. Today, by analogy, both rising CO2 and global warming must be considered as risk factors for increasing biomass burning. Both effects need to be included in models to project future fire risks.
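The bias-correction step described above, scaling modelled fluxes by the model's contemporary error per biome before summing a global total, can be sketched as follows. The biome names and flux values are invented for illustration and are not the paper's data.

```python
def bias_correct(modelled_lgm, modelled_pi, observed_pi):
    """Per-biome bias correction: scale each LGM flux by the ratio of
    observed to modelled present-day (PI) values, then sum globally."""
    corrected = {
        biome: flux * observed_pi[biome] / modelled_pi[biome]
        for biome, flux in modelled_lgm.items()
    }
    return corrected, sum(corrected.values())

# Illustrative fluxes in Pg C per year
modelled_pi  = {"tropical": 1.2, "temperate": 0.6, "boreal": 0.3}
observed_pi  = {"tropical": 1.0, "temperate": 0.5, "boreal": 0.3}
modelled_lgm = {"tropical": 0.9, "temperate": 0.3, "boreal": 0.1}
corrected, total = bias_correct(modelled_lgm, modelled_pi, observed_pi)
print(total)   # bias-corrected global LGM flux
```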

Relevance:

10.00%

Publisher:

Abstract:

We use both Granger-causality and instrumental variables (IV) methods to examine the impact of index fund positions on price returns for the main US grains and oilseed futures markets. Our analysis supports earlier conclusions that Granger-causal impacts are generally not discernible. However, market microstructure theory suggests trading impacts should be instantaneous. IV-based tests for contemporaneous causality provide stronger evidence of price impact. We find even stronger evidence that changes in index positions can help predict future changes in aggregate commodity price indices. This result suggests that changes in index investment are in part driven by information which predicts commodity price changes over the coming months.
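A minimal version of the Granger test mentioned above: regress the return series on its own lags (restricted model) and on lags of both itself and the candidate driver (unrestricted model), then form an F-statistic from the two residual sums of squares. The lag length, data-generating process and seed below are illustrative, not the paper's specification.

```python
import numpy as np

def granger_f(y, x, lags=2):
    """F-statistic for whether lags of x help predict y beyond
    lags of y alone (a minimal bivariate Granger-causality check)."""
    n = len(y)
    rows = range(lags, n)
    Y = np.array([y[t] for t in rows])
    X_r = np.array([[1.0] + [y[t - k] for k in range(1, lags + 1)]
                    for t in rows])                       # restricted
    X_u = np.array([[1.0] + [y[t - k] for k in range(1, lags + 1)]
                          + [x[t - k] for k in range(1, lags + 1)]
                    for t in rows])                       # unrestricted
    rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
    rss_r, rss_u = rss(X_r), rss(X_u)
    dof = len(Y) - X_u.shape[1]
    return ((rss_r - rss_u) / lags) / (rss_u / dof)

rng = np.random.default_rng(0)
x = rng.standard_normal(200)
y = np.zeros(200)
for t in range(2, 200):          # y is driven by lagged x
    y[t] = 0.8 * x[t - 1] + 0.1 * rng.standard_normal()
print(granger_f(y, x))           # large F: x Granger-causes y
print(granger_f(x, y))           # small F: no reverse causality
```

The IV-based contemporaneous test in the paper is a different estimator; this sketch only illustrates the lag-regression half of the comparison.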

Relevance:

10.00%

Publisher:

Abstract:

Carbon has been described as a ‘surreal commodity’. Whilst carbon trading, storage, sequestration and emissions have become a part of the contemporary climate lexicon, how carbon is understood, valued and interpreted by actors responsible for implementing carbon sequestration projects is still unclear. In this review paper, we are concerned with how carbon has come to take on a range of meanings, and in particular, we appraise what is known about the situated meanings that people involved in delivering, and participating in, carbon sequestration projects in the global South assign to this complex element. Whilst there has been some reflection on the new meanings conferred on carbon via the neoliberal processes of marketisation, and how these processes interact with historical and contemporary narratives of environmental change, less is known about how these meanings are (re)produced and (re)interpreted locally. We review how carbon has been defined both as a chemical element and as a tradable, marketable commodity, and discuss the implications these global meanings might have for situated understandings, particularly linked to climate change narratives, amongst communities in the global South. We consider how the concept of carbon capabilities, alongside theoretical notions of networks, assemblages and local knowledges of the environment and nature, might be useful in beginning to understand how communities engage with abstract notions of carbon. We discuss the implications of specific values attributed to carbon, and therefore to different ecologies, for wider conceptualisations of how nature is valued, and climate is understood, and particularly how this may impact on community interactions with carbon sequestration projects. Knowing more about how people understand, value and know carbon allows policies to be better informed and practices more effectively targeted at engaging local populations meaningfully in carbon-related projects.