9 results for coupled natural and human systems
at Duke University
Abstract:
Environmental governance is more effective when the scales of ecological processes are well matched with the human institutions charged with managing human-environment interactions. The social-ecological systems (SESs) framework provides guidance on how to assess the social and ecological dimensions that contribute to sustainable resource use and management, but it has rarely, if ever, been operationalized for multiple localities in a spatially explicit, quantitative manner. Here, we use the case of small-scale fisheries in Baja California Sur, Mexico, to identify distinct SES regions and test key aspects of coupled SESs theory. Regions that exhibit greater potential for social-ecological sustainability in one dimension do not necessarily exhibit it in others, highlighting the importance of integrative, coupled system analyses when implementing spatial planning and other ecosystem-based strategies.
Abstract:
Periods of drought and low streamflow can have profound impacts on both human and natural systems. People depend on a reliable source of water for numerous reasons, including potable water supply and the production of economic value through agriculture or energy generation. Aquatic ecosystems also depend on water, and they provide economic benefits to society through ecosystem services. Given that periods of low streamflow may become more extreme and frequent in the future, it is important to study the factors that control water availability during these times. In the absence of precipitation, the slower hydrological response of groundwater systems plays an amplified role in water supply. Understanding the variability of the fraction of streamflow contributed by baseflow, or groundwater, during periods of drought provides insight into what future water availability may look like and how it can best be managed. The Mills River Basin in North Carolina is chosen as a case study to test this understanding. First, a physically meaningful estimate of baseflow is obtained from USGS streamflow data via computerized hydrograph analysis techniques. Time series analysis, including wavelet analysis, is then applied to highlight signals of non-stationarity and to evaluate changes in variance, in order to better understand the natural variability of baseflow and low flows. In addition to natural variability, human influence must be taken into account in order to accurately assess how the combined system reacts to periods of low flow. Defining a combined demand that consists of both natural and human demand allows a more rigorous assessment of the sustainable use of a shared resource, in this case water.
The analysis of baseflow variability can differ with regional location and local hydrogeology, but baseflow was found to vary on scales ranging from multiyear, such as those associated with ENSO (3.5 and 7 years), up to multidecadal, with most of the contributing variance coming from decadal or multiyear scales. The behavior of baseflow, and consequently water availability, was also found to depend strongly on overall precipitation, on the tracks of hurricanes or tropical storms and associated climate indices, and on physiography and hydrogeology. Using the Duke Combined Hydrology Model (DCHM), reasonably accurate estimates of streamflow during periods of low flow were obtained, in part due to the model's ability to capture subsurface processes. The ability to accurately simulate streamflow levels and subsurface interactions during periods of drought can be very valuable to water suppliers and decision makers, and ultimately benefits citizens. Knowledge of future droughts and periods of low flow, combined with tracking of customer demand, will allow water suppliers to adopt better management practices, such as knowing when to withdraw more water during a surplus so that stress on the system is minimized when water supply is not ample.
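The abstract does not name the specific hydrograph-separation technique. A common choice for computerized baseflow separation is the one-parameter Lyne-Hollick recursive digital filter, sketched below; the filter coefficient and the streamflow values are illustrative assumptions, not the dissertation's actual data or configuration.

```python
import numpy as np

def lyne_hollick_baseflow(q, alpha=0.925):
    """One pass of the Lyne-Hollick digital filter: split streamflow q
    into quickflow and baseflow. alpha is the filter parameter
    (0.925 is a commonly cited default for daily data)."""
    q = np.asarray(q, dtype=float)
    quick = np.zeros_like(q)
    for i in range(1, len(q)):
        quick[i] = alpha * quick[i - 1] + 0.5 * (1 + alpha) * (q[i] - q[i - 1])
        quick[i] = min(max(quick[i], 0.0), q[i])  # keep 0 <= quickflow <= q
    return q - quick  # baseflow

# Illustrative daily streamflow (m^3/s) around a small storm peak.
q = np.array([5.0, 5.2, 9.0, 14.0, 8.0, 6.5, 5.8, 5.4, 5.2, 5.1])
base = lyne_hollick_baseflow(q)
# Baseflow index (BFI): fraction of total flow contributed by baseflow.
bfi = base.sum() / q.sum()
```

In practice the filter is usually run in multiple forward and backward passes, and the resulting baseflow series would then feed the wavelet variance analysis described above.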
Abstract:
The growth and proliferation of invasive bacteria in engineered systems is an ongoing problem. While there are a variety of physical and chemical processes to remove and inactivate bacterial pathogens, there are many situations in which these tools are no longer effective or appropriate for the treatment of a microbial target. For example, certain strains of bacteria are becoming resistant to commonly used disinfection methods, such as chlorine and UV. Additionally, the overuse of antibiotics has contributed to the spread of antibiotic resistance, and there is concern that wastewater treatment processes are contributing to the spread of antibiotic-resistant bacteria.
Due to the continually evolving nature of bacteria, it is difficult to develop methods for universal bacterial control across a wide range of engineered systems, as many of our treatment processes are static in nature. Moreover, invasive bacteria are present in many natural and engineered systems where the application of broad-acting disinfectants is impractical, because their use may inhibit the original desired bioprocesses. Therefore, to better control the growth of treatment-resistant bacteria and to address limitations of current disinfection processes, novel tools that are both specific and adaptable need to be developed and characterized.
In this dissertation, two possible biological disinfection processes were investigated for use in controlling invasive bacteria in engineered systems. First, antisense gene silencing, which is the specific use of oligonucleotides to silence gene expression, was investigated. This work was followed by the investigation of bacteriophages (phages), which are viruses that are specific to bacteria, in engineered systems.
For the antisense gene silencing work, a computational approach was used to quantify the number of off-targets and to determine the effects of off-targets in prokaryotic organisms. For the organisms of
Regarding the work with phages, the disinfection rates of bacteria in the presence of phages was determined. The disinfection rates of
In addition to determining disinfection rates, the long-term bacterial growth inhibition potential was determined for a variety of phages with both Gram-negative and Gram-positive bacteria. It was determined that, on average, phages can be used to inhibit bacterial growth for up to 24 h, and that this effect was concentration dependent for various phages at specific time points. Additionally, it was found that a phage cocktail was no more effective at inhibiting bacterial growth over the long term than the best-performing phage in isolation.
Finally, for an industrial application, the use of phages to inhibit invasive
In conclusion, this dissertation improved the current methods for designing antisense gene silencing targets for prokaryotic organisms, and characterized phages from an engineering perspective. First, the current design strategy for antisense targets in prokaryotic organisms was improved through the development of an algorithm that minimized the number of off-targets. For the phage work, a framework was developed to predict the disinfection rates in terms of the initial phage and bacterial concentrations. In addition, the long-term bacterial growth inhibition potential of multiple phages was determined for several bacteria. In regard to the phage application, phages were shown to protect both final product yields and yeast concentrations during fermentation. Taken together, this work suggests that the rational design of phage treatment is possible and further work is needed to expand on this foundation.
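The abstract describes a framework that predicts disinfection rates from the initial phage and bacterial concentrations but does not give its form. As a hypothetical illustration only, a first-order, Chick-Watson-style model in which the inactivation rate scales with the initial phage concentration could be sketched as follows; the rate constant and concentrations are invented for the example and are not from the dissertation.

```python
import math

def log_inactivation(n0, p0, k, t):
    """First-order inactivation sketch: surviving bacteria after time t
    when the phage concentration p0 (PFU/mL) is roughly constant.
    n0 is the initial bacterial concentration (CFU/mL) and k is a
    rate constant in mL/(PFU*h). Returns the log10 reduction."""
    n_t = n0 * math.exp(-k * p0 * t)
    return math.log10(n0 / n_t)

# Illustrative numbers only: 1e6 CFU/mL bacteria, 1e8 PFU/mL phage, 4 h contact.
reduction = log_inactivation(n0=1e6, p0=1e8, k=1e-8, t=4.0)
```

Under this form the predicted log reduction grows linearly with both contact time and initial phage concentration, which is one simple way a rate framework could encode the concentration dependence the abstract reports.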
Abstract:
In recent years, most low- and middle-income countries have adopted different approaches to universal health coverage (UHC) to ensure equity and financial risk protection in accessing essential healthcare services. UHC-related policies and delivery strategies are largely based on existing healthcare systems, the result of gradual development shaped by local factors and priorities. Most countries have emphasized health financing and human resources for health (HRH) reform policies, drawing on good practices from several healthcare plans to deliver UHC for their populations.
Health financing and labor market frameworks were used to understand health financing and HRH dynamics, and to analyze key health policies implemented over the past decade in Kenya's effort to achieve UHC. Based on this understanding, policy options are proposed for Kenya, drawing lessons from health financing and HRH reform experiences in China. Data were collected using a mixed-methods approach, combining quantitative (document and literature review) and qualitative (in-depth interview) techniques.
The problems in Kenya are substantial: high levels of out-of-pocket health expenditure, slow progress in expanding health insurance among informal sector workers, inefficiencies in the pooling of healthcare revenues, inadequately deployed HRH, maldistribution of HRH, and inadequate quality measures in the training of health workers. The government has identified the critical role of strengthening primary health care and the National Hospital Insurance Fund (NHIF) in Kenya's move towards UHC. Strengthening primary health care requires redefining the role of hospitals and health insurance schemes, and training, deploying, and retaining primary care professionals according to the health needs of the population; these concepts have not been emphasized in Kenya's healthcare reforms or program design. Kenya's top leadership commitment is urgently needed to implement tougher reforms, and important lessons can be drawn from China's extensive health reforms of the past decade. Key lessons from China include expanding health insurance through rigorous research, monitoring, and evaluation; substantially increasing government health expenditure; innovative strengthening of primary healthcare; designing and implementing health policy reforms that are responsive to the population; and regional approaches to strengthening HRH.
Abstract:
Wetland restoration is a commonly used approach to reduce nutrient loading to freshwater and coastal ecosystems, with many wetland restoration efforts occurring in former agricultural fields. Restored wetlands are expected to be effective at retaining or removing both nitrogen and phosphorus (P), yet restoring wetland hydrology to former agricultural fields can lead to the release of legacy fertilizer P. Here, we examined P cycling and export following rewetting of the Timberlake Restoration Project, a 440 ha restored riverine wetland complex in the coastal plain of North Carolina. We also compared P cycling within the restored wetland to two minimally disturbed nearby wetlands and an adjacent active agricultural field. In the restored wetland we observed increased soluble reactive phosphorus (SRP) concentrations following initial flooding, consistent with our expectations that P bound to iron would be released under reducing conditions. SRP concentrations in spring were 2.5 times higher leaving the restored wetland than a forested wetland and an agricultural field. During two large-scale drawdown and rewetting experiments we decreased the water depth by 1 m in ∼10 ha of inundated wetland for 2 weeks, followed by reflooding. Rewetting following experimental drainage had no effect on SRP concentrations in winter, but SRP concentrations did increase when the experiment was repeated during summer. Our best estimates suggest that this restored wetland could release legacy fertilizer P for up to a decade following hydrologic restoration. The time lag between restoration and biogeochemical recovery should be incorporated into management strategies of restored wetlands. Copyright 2010 by the American Geophysical Union.
Abstract:
We provide evidence that college graduation plays a direct role in revealing ability to the labor market. Using the NLSY79, our results suggest that ability is observed nearly perfectly for college graduates, but is revealed to the labor market more gradually for high school graduates. Consequently, from the beginning of their careers, college graduates are paid in accordance with their own ability, while the wages of high school graduates are initially unrelated to their own ability. This view of ability revelation in the labor market has considerable power in explaining racial differences in wages, education, and returns to ability.
Abstract:
To make adaptive choices, individuals must sometimes exhibit patience, forgoing immediate benefits to acquire more valuable future rewards [1-3]. Although humans account for future consequences when making temporal decisions [4], many animal species wait only a few seconds for delayed benefits [5-10]. Current research thus suggests a phylogenetic gap between patient humans and impulsive, present-oriented animals [9, 11], a distinction with implications for our understanding of economic decision making [12] and the origins of human cooperation [13]. On the basis of a series of experimental results, we reject this conclusion. First, bonobos (Pan paniscus) and chimpanzees (Pan troglodytes) exhibit a degree of patience not seen in other animals tested thus far. Second, humans are less willing to wait for food rewards than are chimpanzees. Third, humans are more willing to wait for monetary rewards than for food, and show the highest degree of patience only in response to decisions about money involving low opportunity costs. These findings suggest that core components of the capacity for future-oriented decisions evolved before the human lineage diverged from apes. Moreover, the different levels of patience that humans exhibit might be driven by fundamental differences in the mechanisms representing biological versus abstract rewards.
Abstract:
Using A/J mice, which are susceptible to Staphylococcus aureus, we sought to identify genetic determinants of susceptibility to S. aureus, and evaluate their function with regard to S. aureus infection. One QTL region on chromosome 11 containing 422 genes was found to be significantly associated with susceptibility to S. aureus infection. Of these 422 genes, whole genome transcription profiling identified five genes (Dcaf7, Dusp3, Fam134c, Psme3, and Slc4a1) that were significantly differentially expressed in a) S. aureus-infected susceptible (A/J) vs. resistant (C57BL/6J) mice and b) humans with S. aureus bloodstream infection vs. healthy subjects. Three of these genes (Dcaf7, Dusp3, and Psme3) were down-regulated in susceptible vs. resistant mice at both pre- and post-infection time points by qPCR. siRNA-mediated knockdown of Dusp3 and Psme3 induced significant increases of cytokine production in S. aureus-challenged RAW264.7 macrophages and bone marrow-derived macrophages (BMDMs) through enhancing NF-κB signaling activity. Similar increases in cytokine production and NF-κB activity were also seen in BMDMs from CSS11 (C57BL/6J background with chromosome 11 from A/J), but not C57BL/6J. These findings suggest that Dusp3 and Psme3 contribute to S. aureus infection susceptibility in A/J mice and play a role in human S. aureus infection.
Abstract:
Credit scores are the most widely used instruments to assess whether or not a person is a financial risk. Credit scoring has been so successful that it has expanded beyond lending and into our everyday lives, even to inform how insurers evaluate our health. The pervasive application of credit scoring has outpaced knowledge about why credit scores are such useful indicators of individual behavior. Here we test if the same factors that lead to poor credit scores also lead to poor health. Following the Dunedin (New Zealand) Longitudinal Study cohort of 1,037 study members, we examined the association between credit scores and cardiovascular disease risk and the underlying factors that account for this association. We find that credit scores are negatively correlated with cardiovascular disease risk. Variation in household income was not sufficient to account for this association. Rather, individual differences in human capital factors—educational attainment, cognitive ability, and self-control—predicted both credit scores and cardiovascular disease risk and accounted for ∼45% of the correlation between credit scores and cardiovascular disease risk. Tracing human capital factors back to their childhood antecedents revealed that the characteristic attitudes, behaviors, and competencies children develop in their first decade of life account for a significant portion (∼22%) of the link between credit scores and cardiovascular disease risk at midlife. We discuss the implications of these findings for policy debates about data privacy, financial literacy, and early childhood interventions.
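As a toy illustration of the "share of the correlation accounted for" logic in this abstract, one can compare the credit-score coefficient on disease risk before and after adjusting for a human-capital factor. Every number and relationship below is fabricated for the sketch and bears no relation to the Dunedin data or analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
# Synthetic data: a single "human capital" factor drives both outcomes,
# loosely mimicking the confounding/mediation logic (numbers made up).
capital = rng.normal(size=n)
credit = 0.7 * capital + rng.normal(size=n)
cvd_risk = -0.5 * capital + rng.normal(size=n)

def slope(y, cols):
    """OLS slope of y on the first regressor in cols (with intercept)."""
    X = np.column_stack([np.ones(len(y))] + list(cols))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

b_simple = slope(cvd_risk, [credit])             # credit score alone
b_adjusted = slope(cvd_risk, [credit, capital])  # adding human capital
explained = 1 - b_adjusted / b_simple            # share accounted for
```

In these synthetic data the shared factor explains essentially all of the credit-risk association; the abstract's ~45% figure reflects the real data, where human capital is only one of several sources of the correlation.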