977 results for Large bees
Abstract:
The human large intestine is a highly complex ecosystem that contains somewhere in the region of 400 different species of bacteria1. The vast majority of these bacteria are strict anaerobes and grow on a wide variety of substrates that have either escaped digestion in the small bowel or have been produced by the host2. In Western populations, 10–60 g of carbohydrate and 6–18 g of proteinaceous material are potentially available for fermentation each day, producing a total bacterial mass of approximately 90 g3.
Abstract:
This article critically explores the nature and purpose of relationships and inter-dependencies between stakeholders in the context of a parastatal chromite mining company in the Betsiboka Region of Northern Madagascar. An examination of the institutional arrangements at the interface between the mining company and local communities identified power hierarchies and dependencies in the context of a dominant paternalistic environment. The interactions, inter alia, limited social cohesion and intensified the fragility and weakness of community representation, which was further influenced by ethnic hierarchies between the varied community groups; namely, indigenous communities and migrants to the area from different ethnic groups. Moreover, dependencies and nepotism, which may exist at all institutional levels, can create civil society stakeholder representatives who are unrepresentative of the society they are intended to represent. Similarly, a lack of horizontal and vertical trust and reciprocity inherent in Malagasy society engenders a culture of low expectations regarding transparency and accountability, which further catalyses a cycle of nepotism and elite rent-seeking behaviour. On the other hand, leaders retain power with minimal vertical delegation or decentralisation of authority among levels of government and limit opportunities to benefit the elite, perpetuating rent-seeking behaviour within the privileged minority. Within the union movement, pluralism and the associated politicisation of individual unions restricts solidarity, which impacts on the movement’s capacity to act as a cohesive body of opinion and opposition. Nevertheless, the unions’ drive to improve their social capital has increased expectations of transparency and accountability, resulting in demands for greater engagement in decision-making processes.
Abstract:
In projections of twenty-first century climate, Arctic sea ice declines and at the same time exhibits strong interannual anomalies. Here, we investigate the potential to predict these strong sea-ice anomalies under a perfect-model assumption, using the Max-Planck-Institute Earth System Model in the same setup as in the Coupled Model Intercomparison Project Phase 5 (CMIP5). We study two cases of strong negative sea-ice anomalies: a 5-year-long anomaly for present-day conditions, and a 10-year-long anomaly for conditions projected for the middle of the twenty-first century. We treat these anomalies in the CMIP5 projections as the truth, and use exactly the same model configuration for predictions of this synthetic truth. We start ensemble predictions at different times during the anomalies, considering lagged-perfect and sea-ice-assimilated initial conditions. We find that the onset and amplitude of the interannual anomalies are not predictable. However, the further deepening of the anomaly can be predicted for typically 1 year lead time if predictions start after the onset but before the maximal amplitude of the anomaly. The magnitude of an extremely low summer sea-ice minimum is hard to predict: the skill of the prediction ensemble is not better than a damped-persistence forecast for lead times of more than a few months, and is not better than a climatology forecast for lead times of two or more years. Predictions of the present-day anomaly are more skillful than predictions of the mid-century anomaly. Predictions using sea-ice-assimilated initial conditions are competitive with those using lagged-perfect initial conditions for lead times of a year or less, but yield degraded skill for longer lead times. The results presented here suggest that there is limited prospect of predicting the large interannual sea-ice anomalies expected to occur throughout the twenty-first century.
Abstract:
A set of coupled ocean-atmosphere simulations using state-of-the-art climate models is now available for the Last Glacial Maximum and the Mid-Holocene through the second phase of the Paleoclimate Modeling Intercomparison Project (PMIP2). This study presents the large-scale features of the simulated climates and compares the new model results to those of the atmospheric models from the first phase of PMIP, for which sea surface temperature was prescribed or computed using simple slab ocean formulations. We consider the large-scale features of the climate change, pointing out some of the major differences between the different sets of experiments. We show in particular that systematic differences between the PMIP1 and PMIP2 simulations, such as the amplification of the African monsoon at the Mid-Holocene or the change in mid-latitude precipitation at the LGM, are due to the interactive ocean. The PMIP2 simulations are also generally in better agreement with data than the PMIP1 simulations.
Abstract:
Sensible heat fluxes (QH) are determined using scintillometry and eddy covariance over a suburban area. Two large aperture scintillometers provide spatially integrated fluxes across path lengths of 2.8 km and 5.5 km over Swindon, UK. The shorter scintillometer path spans newly built residential areas and has an approximate source area of 2–4 km², whilst the long path extends from the rural outskirts to the town centre and has a source area of around 5–10 km². These large-scale heat fluxes are compared with local-scale eddy covariance measurements. Clear seasonal trends are revealed by the long duration of this dataset and variability in monthly QH is related to the meteorological conditions. At shorter time scales the response of QH to solar radiation often gives rise to close agreement between the measurements, but during times of rapidly changing cloud cover spatial differences in the net radiation (Q*) coincide with greater differences between heat fluxes. For clear days QH lags Q*, thus the ratio of QH to Q* increases throughout the day. In summer the observed energy partitioning is related to the vegetation fraction through use of a footprint model. The results demonstrate the value of scintillometry for integrating surface heterogeneity and offer improved understanding of the influence of anthropogenic materials on surface-atmosphere interactions.
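The lag-driven rise in QH/Q* described above can be illustrated with a toy calculation. The sketch below is purely synthetic (idealised half-sine diurnal curves and an assumed 2-hour lag, not the Swindon measurements), but it shows why a QH curve delayed behind Q* implies a ratio that increases through the day:

```python
import math

def flux_ratio(hour, lag_hours=2.0):
    """Synthetic illustration (not the Swindon observations): model net
    radiation Q* as a half-sine that is positive between 06:00 and 18:00,
    and QH as the same shape delayed by lag_hours.
    Returns (qstar, qh, qh/qstar) for a given daytime hour."""
    qstar = max(math.sin(math.pi * (hour - 6.0) / 12.0), 0.0)
    qh = max(math.sin(math.pi * (hour - 6.0 - lag_hours) / 12.0), 0.0)
    ratio = qh / qstar if qstar > 0.0 else float("nan")
    return qstar, qh, ratio
```

With these idealised curves the ratio is below one in mid-morning and rises past one in late afternoon as Q* declines faster than the lagged QH, which is the qualitative behaviour the abstract reports for clear days.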
Abstract:
Global communication requirements and load imbalance of some parallel data mining algorithms are the major obstacles to exploit the computational power of large-scale systems. This work investigates how non-uniform data distributions can be exploited to remove the global communication requirement and to reduce the communication cost in iterative parallel data mining algorithms. In particular, the analysis focuses on one of the most influential and popular data mining methods, the k-means algorithm for cluster analysis. The straightforward parallel formulation of the k-means algorithm requires a global reduction operation at each iteration step, which hinders its scalability. This work studies a different parallel formulation of the algorithm where the requirement of global communication can be relaxed while still providing the exact solution of the centralised k-means algorithm. The proposed approach exploits a non-uniform data distribution which can be either found in real world distributed applications or can be induced by means of multi-dimensional binary search trees. The approach can also be extended to accommodate an approximation error which allows a further reduction of the communication costs.
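For reference, the straightforward parallel k-means formulation discussed above can be sketched as follows. This is a generic illustration rather than the article's implementation: sequential loops over data partitions stand in for actual parallel workers, and the commented reduction step is what would be an MPI_Allreduce in a real distributed system:

```python
def assign(points, centroids):
    """Label each point with the index of its nearest centroid
    (squared Euclidean distance)."""
    labels = []
    for p in points:
        dists = [sum((pi - ci) ** 2 for pi, ci in zip(p, c)) for c in centroids]
        labels.append(dists.index(min(dists)))
    return labels

def local_stats(points, labels, k, dim):
    """Per-worker partial statistics: per-cluster coordinate sums and counts."""
    sums = [[0.0] * dim for _ in range(k)]
    counts = [0] * k
    for p, lab in zip(points, labels):
        counts[lab] += 1
        for i, pi in enumerate(p):
            sums[lab][i] += pi
    return sums, counts

def kmeans_distributed(partitions, centroids, iters=10):
    """k-means on data split across workers. Each iteration ends with a
    global reduction of the partial sums and counts; removing or relaxing
    that step is the subject of the article."""
    k, dim = len(centroids), len(centroids[0])
    for _ in range(iters):
        total_sums = [[0.0] * dim for _ in range(k)]
        total_counts = [0] * k
        for part in partitions:  # each loop body would run on one worker
            sums, counts = local_stats(part, assign(part, centroids), k, dim)
            # global reduction step (MPI_Allreduce in a real implementation)
            for j in range(k):
                total_counts[j] += counts[j]
                for i in range(dim):
                    total_sums[j][i] += sums[j][i]
        centroids = [
            [s / total_counts[j] for s in total_sums[j]]
            if total_counts[j] else centroids[j]
            for j in range(k)
        ]
    return centroids
```

Because the reduction must complete before any worker can start the next iteration, it acts as a global synchronisation barrier at every step, which is the scalability obstacle the relaxed formulation targets.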
Abstract:
Wild bird feeding is popular in domestic gardens across the world. Nevertheless, there is surprisingly little empirical information on certain aspects of the activity and no year-round quantitative records of the amounts and nature of the different foods provided in individual gardens. We sought to characterise garden bird feeding in a large UK urban area in two ways. First, we conducted face-to-face questionnaires with a representative cross-section of residents. Just over half fed birds, the majority doing so year round and at least weekly. Second, a two-year study recorded all foodstuffs put out by households on every provisioning occasion. A median of 628 kcal/garden/day was given. Provisioning level was not significantly influenced by weather or season. Comparisons between the data sets revealed significantly less frequent feeding amongst these ‘keen’ feeders than the face-to-face questionnaire respondents, suggesting that one-off questionnaires may overestimate provisioning frequency. Assuming 100% uptake, the median provisioning level equates to sufficient supplementary resources across the UK to support 196 million individuals of a hypothetical average garden-feeding bird species (based on 10 common UK garden-feeding birds’ energy requirements). Taking the lowest provisioning level recorded (101 kcal/day) as a conservative measure, 31 million of these average individuals could theoretically be supported.
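The final scaling step in the abstract, from per-garden provisioning to a national number of birds supported, is a single ratio. The sketch below writes it out with deliberately hypothetical inputs; the article's actual household counts and the energy requirements it derives from 10 common UK garden-feeding species are not reproduced here:

```python
def birds_supported(kcal_per_garden_day, n_feeding_gardens, kcal_per_bird_day):
    """Individuals whose daily energy needs the provisioned food could meet,
    assuming 100% uptake: (total kcal provided per day across all feeding
    gardens) divided by (kcal one average bird needs per day). All argument
    values used with this function are hypothetical placeholders."""
    return kcal_per_garden_day * n_feeding_gardens / kcal_per_bird_day
```

The estimate scales linearly in both the provisioning level and the number of feeding gardens, which is why the conservative 101 kcal/day figure yields a proportionally smaller count than the 628 kcal/day median.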
Abstract:
Variational data assimilation is commonly used in environmental forecasting to estimate the current state of the system from a model forecast and observational data. The assimilation problem can be written simply in the form of a nonlinear least squares optimization problem. However the practical solution of the problem in large systems requires many careful choices to be made in the implementation. In this article we present the theory of variational data assimilation and then discuss in detail how it is implemented in practice. Current solutions and open questions are discussed.
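The nonlinear least squares form mentioned above can be made concrete in the linear (3D-Var) special case. The sketch below is a generic textbook formulation, not anything specific to this article: it computes the analytic minimiser of J(x) = ½(x−xb)ᵀB⁻¹(x−xb) + ½(y−Hx)ᵀR⁻¹(y−Hx) for background xb, background-error covariance B, observations y, linear observation operator H and observation-error covariance R, along with the gradient of J so the result can be checked:

```python
import numpy as np

def threedvar_analysis(xb, B, y, H, R):
    """Analytic minimiser of the linear 3D-Var cost function
    J(x) = 0.5 (x-xb)^T B^-1 (x-xb) + 0.5 (y-Hx)^T R^-1 (y-Hx):
    xa = xb + K (y - H xb), with gain K = B H^T (H B H^T + R)^-1."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return xb + K @ (y - H @ xb)

def cost_gradient(x, xb, B, y, H, R):
    """Gradient of J: B^-1 (x - xb) - H^T R^-1 (y - H x)."""
    return np.linalg.solve(B, x - xb) - H.T @ np.linalg.solve(R, y - H @ x)
```

In large systems the explicit inverse above is exactly what cannot be formed, which is why practical implementations rely on iterative minimisation, incremental formulations and carefully chosen preconditioners, the implementation choices the article goes on to discuss.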
Abstract:
A lattice Boltzmann method for simulating the viscous flow in large distensible blood vessels is presented by introducing a boundary condition for elastic and moving boundaries. The mass conservation for the boundary condition is tested in detail. The viscous flow in elastic vessels is simulated with a pressure-radius relationship similar to that of the pulmonary blood vessels. The numerical results for steady flow agree with the analytical prediction to very high accuracy, and the simulation results for pulsatile flow are comparable with those of the aortic flows observed experimentally. The model is expected to find many applications for studying blood flows in large distensible arteries, especially those affected by atherosclerosis, stenosis, aneurysms, etc.
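The coupling between the fluid and the elastic wall rests on the pressure-radius relationship mentioned above. A minimal sketch of such a closure, using a linearised tube law with illustrative parameter values (the article's actual pulmonary-vessel relationship is not reproduced here), would be:

```python
def radius_from_pressure(p, r0=1.0, p0=0.0, compliance=0.01):
    """Linearised tube law r(p) = r0 * (1 + compliance * (p - p0)):
    the vessel radius grows with transmural pressure. The default values
    of r0, p0 and compliance are illustrative placeholders, not the
    article's fitted pulmonary-vessel parameters."""
    return r0 * (1.0 + compliance * (p - p0))

def update_wall(pressures, **params):
    """New wall radius at each axial node of the distensible vessel, as
    would be fed back into the elastic moving-boundary condition."""
    return [radius_from_pressure(p, **params) for p in pressures]
```

In a lattice Boltzmann simulation, the local pressure recovered from the distribution functions would be passed through such a law each time step to reposition the moving boundary.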
Abstract:
Incomplete understanding of three aspects of the climate system—equilibrium climate sensitivity, rate of ocean heat uptake and historical aerosol forcing—and the physical processes underlying them leads to uncertainties in our assessment of the global-mean temperature evolution in the twenty-first century1,2. Explorations of these uncertainties have so far relied on scaling approaches3,4, large ensembles of simplified climate models1,2, or small ensembles of complex coupled atmosphere–ocean general circulation models5,6 which under-represent uncertainties in key climate system properties derived from independent sources7–9. Here we present results from a multi-thousand-member perturbed-physics ensemble of transient coupled atmosphere–ocean general circulation model simulations. We find that model versions that reproduce observed surface temperature changes over the past 50 years show global-mean temperature increases of 1.4–3 K by 2050, relative to 1961–1990, under a mid-range forcing scenario. This range of warming is broadly consistent with the expert assessment provided by the Intergovernmental Panel on Climate Change Fourth Assessment Report10, but extends towards larger warming than observed in ensembles-of-opportunity5 typically used for climate impact assessments. From our simulations, we conclude that warming by the middle of the twenty-first century that is stronger than earlier estimates is consistent with recent observed temperature changes and a mid-range ‘no mitigation’ scenario for greenhouse-gas emissions.