139 results for Arid regions agriculture.
Abstract:
Land use and agricultural practices can make important contributions to the global source strength of atmospheric nitrous oxide (N2O) and methane (CH4). However, knowledge of gas fluxes from irrigated agriculture is very limited. From April 2005 to October 2006, a study was conducted in the Aral Sea Basin, Uzbekistan, to quantify and compare emissions of N2O and CH4 in various annual and perennial land-use systems: irrigated cotton, winter wheat and rice crops, a poplar plantation and a natural Tugai (floodplain) forest. In the annual systems, average N2O emissions ranged from 10 to 150 μg N2O-N m−2 h−1, with the highest N2O emissions in the cotton fields, covering a range similar to that reported in previous studies of irrigated cropping systems. Emission factors (uncorrected for background emission), used to determine the fertilizer-induced N2O emission as a percentage of N fertilizer applied, ranged from 0.2% to 2.6%. Seasonal variations in N2O emissions were principally controlled by fertilization and irrigation management. Pulses of N2O emissions occurred after concomitant N-fertilizer application and irrigation. The unfertilized poplar plantation showed high N2O emissions over the entire study period (30 μg N2O-N m−2 h−1), whereas only negligible fluxes of N2O (<2 μg N2O-N m−2 h−1) occurred in the Tugai. Significant CH4 fluxes were determined only from the flooded rice field: fluxes were low, with a mean flux rate of 32 mg CH4 m−2 day−1 and a low seasonal total of 35.2 kg CH4 ha−1. The global warming potential (GWP) of the N2O and CH4 fluxes was highest under rice and cotton, with seasonal totals between 500 and 3000 kg CO2 eq. ha−1. The biennial cotton–wheat–rice crop rotation commonly practiced in the region would average a GWP of 2500 kg CO2 eq. ha−1 yr−1.
The analyses point out opportunities for reducing the GWP of these irrigated agricultural systems by (i) optimization of fertilization and irrigation practices and (ii) conversion of annual cropping systems into perennial forest plantations, especially on less profitable, marginal lands.
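The abstract's CO2-equivalent figures follow the standard GWP conversion, which can be sketched as below. The GWP factors and the N2O-N-to-N2O mass conversion are assumptions for illustration (IPCC AR4 100-year values), not numbers taken from the study.

```python
# Sketch: converting seasonal N2O and CH4 totals to CO2-equivalents.
# GWP factors (100-year horizon) are assumed IPCC AR4 values, not
# taken from the abstract.
GWP_N2O = 298   # kg CO2 eq. per kg N2O
GWP_CH4 = 25    # kg CO2 eq. per kg CH4

def gwp_co2_eq(n2o_n_kg_ha, ch4_kg_ha):
    """Seasonal GWP (kg CO2 eq. ha-1) from N2O-N and CH4 totals."""
    n2o_kg_ha = n2o_n_kg_ha * 44.0 / 28.0  # N2O-N mass to N2O mass
    return n2o_kg_ha * GWP_N2O + ch4_kg_ha * GWP_CH4

# e.g. the rice field's seasonal CH4 total of 35.2 kg CH4 ha-1 alone:
print(round(gwp_co2_eq(0.0, 35.2)))  # 880 kg CO2 eq. ha-1
```

Under these assumed factors, the rice field's CH4 alone contributes roughly 880 kg CO2 eq. ha−1, consistent in magnitude with the 500–3000 kg CO2 eq. ha−1 range reported above.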
Abstract:
Malaria has been eliminated from over 40 countries, with an additional 39 currently planning for, or committed to, elimination. Information on the likely impact of available interventions, and the time required, is urgently needed to help plan resource allocation. Mathematical modelling has been used to investigate the impact of various interventions; the strength of the conclusions is boosted when several models with differing formulations produce similar results. Here, using an individual-based stochastic simulation model of seasonal Plasmodium falciparum transmission, we predict that transmission can be interrupted and parasite reintroductions controlled in villages of 1,000 individuals where the entomological inoculation rate is <7 infectious bites per person per year, using chemotherapy and bed net strategies. Above this transmission intensity, bed nets and symptomatic treatment alone were not sufficient to interrupt transmission and control the importation of malaria for at least 150 days. Our model results suggest that 1) stochastic events affect the likelihood of successfully interrupting transmission, with large variability in the times required, 2) the relative reduction in morbidity caused by the interventions was age-group specific and changed over time, and 3) the post-intervention changes in morbidity were larger than the corresponding impact on transmission. These results generally agree with the conclusions from previously published models. However, the model also predicted changes in parasite population structure as a result of improved treatment of symptomatic individuals: the survival probability of introduced parasites was reduced, leading to an increase in the prevalence of sub-patent infections in semi-immune individuals. This novel finding requires further investigation in the field because, if confirmed, such a change would have a negative impact on attempts to eliminate the disease from areas of moderate transmission.
Abstract:
Agriculture is responsible for a significant proportion of total anthropogenic greenhouse gas emissions (perhaps 18% globally), and therefore has the potential to contribute to efforts to reduce emissions as a means of minimising the risk of dangerous climate change. The largest contributions to emissions are attributed to ruminant methane production and nitrous oxide from animal waste and fertilised soils. Further, livestock, including ruminants, are an important component of global and Australian food production, and there is a growing demand for animal protein sources. At the same time as governments and the community strengthen objectives to reduce greenhouse gas emissions, there are growing concerns about global food security. This paper provides an overview of a number of options for reducing methane and nitrous oxide emissions from ruminant production systems in Australia, while maintaining productivity to contribute to both objectives. Options include strategies for feed modification, animal breeding and herd management, rumen manipulation, and animal waste and fertiliser management. Using currently available strategies, some reductions in emissions can be achieved, but practical commercially available techniques for significant reductions in methane emissions, particularly from extensive livestock production systems, will require greater time and resource investment. Decreases in the emissions intensity of these ruminant systems (i.e., the amount of emissions per unit of product, such as meat) have already been achieved. However, the technology has not yet been developed for eliminating production of methane from the rumen of cattle and sheep digesting the cellulose and lignin-rich grasses that make up a large part of the diet of animals grazing natural pastures, particularly in arid and semi-arid grazing lands.
Nevertheless, the abatement that can be achieved will contribute significantly towards reaching greenhouse gas emissions reduction targets and research will achieve further advances.
Abstract:
This study compared proximal femoral morphology in patients living in soft and hard water regions. The proximal femoral morphology of two groups of 70 patients living in hard and soft water regions, with a mean age of 72.3 years (range 50 to 87), was measured using an antero-posterior radiograph of the non-operated hip with magnification adjusted. The medullary canal diameter at the level of the lesser trochanter (LT) was significantly wider in patients living in the hard water region (mean width 1.9 mm wider; p = 0.003). No statistically significant difference was found in the medullary canal width at 10 cm below the level of the LT, the Dorr index, or the Canal Bone Ratio (CBR). In conclusion, proximal femoral morphology does differ in patients living in soft and hard water areas. These results may have an important clinical bearing in patients undergoing total hip replacement surgery. Further research is needed to determine whether implant survivorship is affected in patients living in hard and soft water regions.
Abstract:
Social resilience concepts are gaining momentum in environmental planning through an emerging understanding of the socio-ecological nature of biophysical systems. There is a disconnect, however, between these concepts and the sociological and psychological literature related to social resilience. Moreover, neither school of thought is well connected to the concepts of social assessment (SA) and social impact assessment (SIA), the more standard tools supporting planning and decision-making. This raises questions as to how emerging social resilience concepts can translate into improved SA/SIA practices to inform regional-scale adaptation. Through a review of the literature, this paper suggests that more cross-disciplinary integration is needed if social resilience concepts are to have a genuine impact in helping vulnerable regions tackle climate change.
Abstract:
With urbanisation of the global population having risen above 50%, growing food in urban spaces is increasingly important, as it can contribute to food security, reduce food miles, and improve people’s physical and mental health. The task of growing food in urban environments is taken up by a mixture of residential growers and groups. Permablitz Brisbane is an event-centric grassroots community that organises daylong ‘working bee’ events, drawing on permaculture design principles in the planning and design process. Permablitz Brisbane provides a useful contrast to other, location-centric forms of urban agriculture communities (such as city farms or community gardens), as its aim is to help encourage urban residents to grow their own food. We present findings and design implications from a qualitative study with members of this group, using ethnographic methods to engage with and understand how the group operates. Our findings describe four themes that include opportunities, difficulties, and considerations for the creation of interventions by Human-Computer Interaction (HCI) designers.
Abstract:
Today, Australian agriculture is not where we hoped it would be. Despite being highly productive and the nation's only 'strongly competitive industry', it is struggling across the country. There are successes, as there always will be, but the bulk of our food and fibre production is from enterprises with minimal profitability and unstable or unsound finances. A debt-deflation spiral and subprime mortgage crisis are now being fuelled by property fire sales while leading bankers proclaim no problem and governments dance at the edges. However, it is not just the bush that has problems. National economic conditions are deteriorating with per capita incomes falling and real interest rates still high. Well-informed policy strategies and effective responses are needed quickly if Australians are to avoid needless losses of capacity and wealth destruction in the cities and the bush.
Abstract:
Three families of probe-foraging birds, Scolopacidae (sandpipers and snipes), Apterygidae (kiwi), and Threskiornithidae (ibises, including spoonbills), have independently evolved long, narrow bills containing clusters of vibration-sensitive mechanoreceptors (Herbst corpuscles) within pits in the bill-tip. These ‘bill-tip organs’ allow birds to detect buried or submerged prey via substrate-borne vibrations and/or interstitial pressure gradients. Shorebirds, kiwi and ibises are only distantly related, with the phylogenetic divide between kiwi and the other two taxa being particularly deep. We compared the bill-tip structure and associated somatosensory regions in the brains of kiwi and shorebirds to understand the degree of convergence of these systems between the two taxa. For comparison, we also included data from other taxa, including waterfowl (Anatidae), parrots (Psittaculidae and Cacatuidae), non-apterygid ratites, and other probe-foraging and non-probe-foraging birds, including non-scolopacid shorebirds (Charadriidae, Haematopodidae, Recurvirostridae and Sternidae). We show that bill-tip organ structure was broadly similar between the Apterygidae and Scolopacidae; however, some inter-specific variation was found in the number, shape and orientation of sensory pits between the two groups. Kiwi, scolopacid shorebirds, waterfowl and parrots all shared hypertrophy or near-hypertrophy of the principal sensory trigeminal nucleus. Hypertrophy of the nucleus basorostralis, however, occurred only in waterfowl, kiwi, three of the scolopacid species examined and a species of oystercatcher (Charadriiformes: Haematopodidae). Hypertrophy of the principal sensory trigeminal nucleus in kiwi, Scolopacidae, and other tactile specialists appears to have co-evolved alongside bill-tip specializations, whereas hypertrophy of nucleus basorostralis may be influenced to a greater extent by other sensory inputs.
We suggest that similarities between kiwi and scolopacid bill-tip organs and associated somatosensory brain regions are likely a result of similar ecological selective pressures, with inter-specific variations reflecting finer-scale niche differentiation.
Abstract:
One of the Department of Defense's most pressing environmental problems is the efficient detection and identification of unexploded ordnance (UXO). In regions of highly magnetic soils, magnetic and electromagnetic sensors often detect anomalies that are of geologic origin, adding significantly to remediation costs. In order to develop predictive models for magnetic susceptibility, it is crucial to understand modes of formation and the spatial distribution of different iron oxides. Most rock types contain iron, and their magnetic susceptibility is determined by the amount and form of iron oxides present. When rocks weather, the amount and form of the oxides change, producing concomitant changes in magnetic susceptibility. The type of iron oxide found in the weathered rock or regolith is a function of the duration and intensity of weathering, as well as the original content of iron in the parent material. The rate of weathering is controlled by rainfall and temperature; thus, knowing the climate zone, the amount of iron in the lithology and the age of the surface will help predict the amount and forms of iron oxide. We have compiled analyses of the types, amounts, and magnetic properties of iron oxides from soils over a wide climate range, from semi-arid grasslands to temperate regions and tropical forests. We find there is a predictable range of iron oxide type and magnetic susceptibility according to the climate zone, the age of the soil and the amount of iron in the unweathered regolith.
Abstract:
BACKGROUND Measuring disease and injury burden in populations requires a composite metric that captures both premature mortality and the prevalence and severity of ill-health. The 1990 Global Burden of Disease study proposed disability-adjusted life years (DALYs) to measure disease burden. No comprehensive update of disease burden worldwide incorporating a systematic reassessment of disease and injury-specific epidemiology has been done since the 1990 study. We aimed to calculate disease burden worldwide and for 21 regions for 1990, 2005, and 2010 with methods to enable meaningful comparisons over time. METHODS We calculated DALYs as the sum of years of life lost (YLLs) and years lived with disability (YLDs). DALYs were calculated for 291 causes, 20 age groups, both sexes, and for 187 countries, and aggregated to regional and global estimates of disease burden for three points in time with strictly comparable definitions and methods. YLLs were calculated from age-sex-country-time-specific estimates of mortality by cause, with each death weighted by the standardised lost life expectancy at the age of death. YLDs were calculated as the prevalence of 1160 disabling sequelae, by age, sex, and cause, weighted by new disability weights for each health state. Neither YLLs nor YLDs were age-weighted or discounted. Uncertainty around cause-specific DALYs was calculated incorporating uncertainty in levels of all-cause mortality, cause-specific mortality, prevalence, and disability weights. FINDINGS Global DALYs remained stable from 1990 (2·503 billion) to 2010 (2·490 billion). Crude DALYs per 1000 decreased by 23% (472 per 1000 to 361 per 1000). An important shift has occurred in DALY composition, with the contribution of deaths and disability among children (younger than 5 years of age) declining from 41% of global DALYs in 1990 to 25% in 2010.
YLLs typically account for about half of disease burden in more developed regions (high-income Asia Pacific, western Europe, high-income North America, and Australasia), rising to over 80% of DALYs in sub-Saharan Africa. In 1990, 47% of DALYs worldwide were from communicable, maternal, neonatal, and nutritional disorders, 43% from non-communicable diseases, and 10% from injuries. By 2010, this had shifted to 35%, 54%, and 11%, respectively. Ischaemic heart disease was the leading cause of DALYs worldwide in 2010 (up from fourth rank in 1990, increasing by 29%), followed by lower respiratory infections (top rank in 1990; 44% decline in DALYs), stroke (fifth in 1990; 19% increase), diarrhoeal diseases (second in 1990; 51% decrease), and HIV/AIDS (33rd in 1990; 351% increase). Major depressive disorder increased from 15th to 11th rank (37% increase) and road injury from 12th to 10th rank (34% increase). Substantial heterogeneity exists in rankings of leading causes of disease burden among regions. INTERPRETATION Global disease burden has continued to shift away from communicable to non-communicable diseases and from premature death to years lived with disability. In sub-Saharan Africa, however, many communicable, maternal, neonatal, and nutritional disorders remain the dominant causes of disease burden. The rising burden from mental and behavioural disorders, musculoskeletal disorders, and diabetes will impose new challenges on health systems. Regional heterogeneity highlights the importance of understanding local burden of disease and setting goals and targets for the post-2015 agenda taking such patterns into account. Because of improved definitions, methods, and data, these results for 1990 and 2010 supersede all previously published Global Burden of Disease results.
Abstract:
The dynamic nature of tissue temperature and of subcutaneous properties, such as blood flow, fatness, and metabolic rate, leads to variation in local skin temperature. Therefore, we investigated the effects of using multiple regions of interest when calculating weighted mean skin temperature from four local sites. Twenty-six healthy males completed a single trial in a thermoneutral laboratory (mean ± SD: 24.0 (1.2) °C; 56 (8)% relative humidity; <0.1 m/s air speed). Mean skin temperature was calculated from four local sites (neck, scapula, hand and shin) in accordance with International Standards, using digital infrared thermography. A 50 × 50 mm square, defined by strips of aluminium tape, created six unique regions of interest at each of the local sites: the top left, top right, bottom left, bottom right and centre quadrants, and the entire region of interest. The largest potential error in weighted mean skin temperature was calculated using a combination of a) the coolest and b) the warmest regions of interest at each of the local sites. Significant differences between the six regions of interest were observed at the neck (P < 0.01), scapula (P < 0.001) and shin (P < 0.05), but not at the hand (P = 0.482). The largest difference (± SEM) at each site was as follows: neck 0.2 (0.1) °C; scapula 0.2 (0.0) °C; shin 0.1 (0.0) °C and hand 0.1 (0.1) °C. The largest potential error (mean ± SD) in weighted mean skin temperature was 0.4 (0.1) °C (P < 0.001), and the associated 95% limits of agreement for these differences were 0.2 to 0.5 °C. Although we observed differences in local and mean skin temperature based on the region of interest employed, these differences were minimal and are not considered physiologically meaningful.
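The four-site weighted mean described above can be sketched as follows. The weighting coefficients are the commonly cited ISO 9886 four-site values (neck 0.28, scapula 0.28, hand 0.16, shin 0.28); they are an assumption here, as the abstract does not list them, and the local temperatures are illustrative.

```python
# Sketch: weighted mean skin temperature from four local sites.
# Weights are assumed ISO 9886 four-site coefficients, not given
# in the abstract; local temperatures are illustrative.
WEIGHTS = {"neck": 0.28, "scapula": 0.28, "hand": 0.16, "shin": 0.28}

def mean_skin_temperature(local_temps):
    """Weighted mean skin temperature (degC) from the four local sites."""
    return sum(WEIGHTS[site] * t for site, t in local_temps.items())

# Illustrative local temperatures, with per-site offsets matching the
# largest site differences reported above (neck/scapula 0.2, hand/shin 0.1):
cool = {"neck": 34.0, "scapula": 33.5, "hand": 31.0, "shin": 32.0}
warm = {site: t + off for (site, t), off in
        zip(cool.items(), [0.2, 0.2, 0.1, 0.1])}

error = mean_skin_temperature(warm) - mean_skin_temperature(cool)
print(round(error, 2))  # ~0.16 degC propagated difference
```

Propagating the reported per-site differences through these assumed weights gives an error of roughly 0.16 °C, the same order as the 0.4 °C largest potential error reported in the abstract.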