996 results for Michigan Tech


Relevance:

60.00%

Publisher:

Abstract:

This dissertation addresses the sustainability of the rapid provision of safe water and sanitation required to meet the Millennium Development Goals (MDGs). A review of health-related literature and global statistics demonstrates engineers' role in achieving the MDGs. This review is followed by analyses of the social, environmental, and health aspects of meeting MDG targets. Analysis of national indicators showed that inadequate investment and poor or nonexistent policies and governance are challenges to global sanitation coverage, in addition to lack of financial resources and gender disparity. Although water availability was not found to be a challenge globally, geospatial analysis demonstrated that it is a potentially significant barrier for up to 46 million people living in urban areas who rely on already degraded water resources for environmental income. A daily water balance model incorporating the Natural Resources Conservation Service (NRCS) curve number method, applied in Bolivian watersheds, showed that local water stress is linked to climate change through reduced recharge. Agricultural expansion in the region slightly exacerbates the recharge reductions. Although runoff changes will range from -17% to +14%, recharge rates will decrease under all climate scenarios evaluated (-14% to -27%). Increasing sewer coverage may place stress on the readily accessible natural springs, but the increased demand can be sustained if other sources of water supply are developed. This analysis provides a method for hydrological analysis in data-scarce regions: the data required for the model were either obtained from publicly available data products or collected through fieldwork using low-cost methods feasible for local participants. Lastly, a methodology was developed to evaluate the public health impacts of increased household water access resulting from domestic rainwater harvesting, incorporating knowledge of the water requirements of sanitation and hygiene technologies. In 37 West African cities, domestic rainwater harvesting implemented alone with 400 L of storage has the potential to reduce diarrheal disease burden by 9%; implemented in conjunction with point-of-use treatment, this reduction could increase to 16%. The methodology will contribute to cost-effectiveness evaluations of interventions, as well as to evaluations of the potential disease burden resulting from reduced water supply, such as the reductions observed in the Bolivian communities.
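
A minimal sketch of the runoff component at the heart of such a daily water balance model, assuming the standard NRCS curve number equation in metric units (the function name and unit choices are ours, not the dissertation's code, and the full model's recharge and storage terms are omitted):

```python
def curve_number_runoff(p_mm: float, cn: float) -> float:
    """Daily runoff depth (mm) from daily rainfall via the NRCS curve number method.

    p_mm : daily precipitation depth in millimetres
    cn   : curve number (0 < cn <= 100); higher values mean less permeable surfaces
    """
    s = 25400.0 / cn - 254.0   # potential maximum retention; metric form of S = 1000/CN - 10 (inches)
    ia = 0.2 * s               # initial abstraction, conventionally 20% of S
    if p_mm <= ia:
        return 0.0             # all rainfall abstracted; no runoff generated
    return (p_mm - ia) ** 2 / (p_mm - ia + s)
```

In a daily balance, rainfall not converted to runoff or lost to evapotranspiration becomes available for soil storage and recharge, which is how reduced rainfall under the climate scenarios translates into the recharge declines reported above.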

Relevance:

60.00%

Publisher:

Abstract:

As agricultural non-point source pollution (ANPSP) has become the most significant cause of water quality deterioration and lake eutrophication in China, more and more scientists and technologists are focusing on control countermeasures for, and the pollution mechanisms of, ANPSP. An unreasonable rural production structure and limited scientific management measures are the main reasons for China's acute ANPSP problems. At present, pollution control is hindered by a lack of specific regulations, which reduces the government's management efficiency. In view of these characteristics and problems, this paper puts forward corresponding policies. The status of ANPSP in China is analyzed, and a prevention and control model is proposed based on governance policy, environmental legislation, a technical system, and subsidy policy. Finally, a case analysis of Qiandao Lake is given, and an economic policy suited to its situation is adopted.

Relevance:

60.00%

Publisher:

Abstract:

Virtually every sector of business and industry that uses computing, including financial analysis, search engines, and electronic commerce, incorporates Big Data analysis into its business model. Sophisticated clustering algorithms are popular for deducing the nature of data by assigning labels to unlabeled data. We address two main challenges in Big Data. First, by definition, the volume of Big Data is too large to be loaded into a computer's memory (the threshold varies with the computer used or available, but there is always a data set too large for any computer). Second, in real-time applications, the velocity of new incoming data prevents historical data from being stored and future data from being accessed. Therefore, we propose our Streaming Kernel Fuzzy c-Means (stKFCM) algorithm, which significantly reduces both computational complexity and space complexity. The proposed stKFCM requires only O(n²) memory, where n is the (predetermined) size of the data subset (or chunk) processed at each time step, which makes the algorithm truly scalable (n can be chosen to fit the available memory). Furthermore, only 2n² elements of the full N × N kernel matrix (where N ≫ n) need to be calculated at each time step, reducing both the time spent producing kernel elements and the complexity of the FCM algorithm itself. Empirical results show that stKFCM, even with relatively small n, can cluster as accurately as kernel fuzzy c-means run on the entire data set, while achieving a significant speedup.
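
To make the per-chunk memory argument concrete, here is a minimal Python sketch of the kernel fuzzy c-means membership update applied to a single chunk's n × n kernel matrix. It illustrates only the kernelized FCM update; the stKFCM bookkeeping that carries weighted information from one chunk to the next is omitted, and all names are ours, not the paper's:

```python
import numpy as np

def kernel_fcm_chunk(K, c, m=2.0, n_iter=100, tol=1e-6, seed=0):
    """Kernel fuzzy c-means on one data chunk.

    K : (n, n) kernel matrix for the chunk's points
    c : number of clusters; m : fuzzifier (> 1)
    Returns the (n, c) fuzzy membership matrix U (rows sum to 1).
    """
    rng = np.random.default_rng(seed)
    n = K.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)
    diag_k = np.diag(K)
    for _ in range(n_iter):
        W = U ** m                         # fuzzified memberships
        s = W.sum(axis=0)                  # per-cluster weight totals
        # squared feature-space distance to each implicit prototype:
        # ||phi(x_i) - v_k||^2 = K_ii - 2*sum_j w_jk K_ij / s_k
        #                        + sum_{j,l} w_jk K_jl w_lk / s_k^2
        cross = (K @ W) / s
        quad = np.einsum('jk,jl,lk->k', W, K, W) / s**2
        d2 = np.maximum(diag_k[:, None] - 2.0 * cross + quad, 1e-12)
        U_new = d2 ** (-1.0 / (m - 1.0))   # standard FCM membership update
        U_new /= U_new.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < tol:
            return U_new
        U = U_new
    return U
```

Every quantity above involves only the chunk's n × n kernel matrix, so memory stays O(n²) no matter how long the stream N grows.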

Relevance:

60.00%

Publisher:

Abstract:

Laser speckle contrast imaging (LSCI) has the potential to be a powerful tool in medicine, but more research in the field is required before it can be used properly. To help advance Michigan Tech's research in the field, a graphical user interface (GUI) was designed in MATLAB to control the experimental instrumentation and to process raw speckle images into contrast images while they are being acquired. The design of the system was successful, and it is currently in use by Michigan Tech's Biomedical Engineering department. This thesis describes the development of the LSCI GUI and offers a full introduction to the history, theory, and applications of LSCI.
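
The thesis implements this processing in a MATLAB GUI; as a language-neutral illustration of the core conversion step, the following Python sketch computes the standard spatial speckle contrast K = σ/⟨I⟩ over a sliding window (the window size and function names are our choices):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def speckle_contrast(raw, win=7):
    """Spatial speckle contrast image: K = sigma / mean over a win x win window."""
    img = raw.astype(np.float64)
    mean = uniform_filter(img, size=win)            # local mean intensity
    mean_sq = uniform_filter(img * img, size=win)   # local mean of squared intensity
    var = np.maximum(mean_sq - mean**2, 0.0)        # clamp tiny negatives from rounding
    return np.sqrt(var) / np.maximum(mean, 1e-12)   # avoid divide-by-zero in dark regions
```

Lower contrast values indicate faster-moving scatterers (such as flowing blood), which is what makes the contrast image useful in biomedical applications.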

Relevance:

60.00%

Publisher:

Abstract:

The federally endangered Karner blue butterfly (Lycaeides melissa samuelis Nabokov) persists in rare oak/pine grassland communities spanning the Great Lakes region, where it relies on its host plant, wild blue lupine (Lupinus perennis). Conservation efforts since 1992 have led to the development of several programs that restore and monitor habitat. This study evaluates Karner blue habitat selection in the state of Wisconsin and develops high-resolution tools for use in conservation efforts. Spatial predictive models developed during the study accurately predicted potential habitat across state properties from soils and canopy cover, and identified ~51-100% of Karner blue occurrences from lupine cover, shrub/tree cover, and focal nectar plant abundance. When evaluated relative to American bison (Bison bison), Karner blues and lupine were more likely to occur in areas of low disturbance, but aggregated where bison had recently been present in areas of moderate/high disturbance. Lupine C:N ratio increased with shrub/tree cover and focal nectar plant abundance and decreased with ground-litter cover. Karner blue density increased with lupine C:N ratio, decreased with nitrogen content, and was not related to phenolic levels. We strongly suggest that areas of different soil textures be managed differently and that maintenance techniques aim for a mix of shrub/tree cover (10-45%) and ground-litter cover (~10-40%), maintain >5% lupine cover, and establish an abundance of focal nectar plants. This study provides unique tools for conservation use and should help focus management efforts and the recovery of this species.
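
The abstract does not state the form of the spatial predictive models; purely as a hedged illustration of an occurrence model built on the covariates named above, a logistic regression could be set up as follows (all data values here are invented):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical survey plots: lupine cover (%), shrub/tree cover (%), focal nectar abundance
X = np.array([
    [12.0, 30.0, 25.0],
    [ 1.0, 70.0,  2.0],
    [ 8.0, 15.0, 18.0],
    [ 0.5,  5.0,  1.0],
    [ 6.0, 40.0, 12.0],
    [ 2.0, 60.0,  4.0],
])
y = np.array([1, 0, 1, 0, 1, 0])   # 1 = Karner blue observed on the plot

model = LogisticRegression().fit(X, y)
# predicted occurrence probability for a new plot near the suggested habitat targets
print(model.predict_proba([[6.0, 30.0, 15.0]])[:, 1])
```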

Relevance:

60.00%

Publisher:

Abstract:

How can we calculate earthquake magnitudes when the signal is clipped or over-run? When a volcano is very active, the seismic record may saturate (i.e., the full amplitude of the signal is not recorded) or be over-run (i.e., the end of one event is covered by the start of the next). The duration, and sometimes the amplitude, of an earthquake signal is necessary for determining event magnitude; thus, it may be impossible to calculate earthquake magnitudes when a volcano is very active. This problem is most likely to occur at volcanoes with limited networks of short-period seismometers. This study outlines two methods for calculating earthquake magnitudes when events are clipped and over-run. The first models the shape of the earthquake coda as a power-law function and extrapolates duration from the decay of that function. The second draws relations between clipped duration (i.e., the length of time a signal is clipped) and full duration. These methods allow magnitudes to be determined to within 0.2 to 0.4 magnitude units. This error is within the range of analyst hand-picks and within the acceptable limits of uncertainty for quickly quantifying volcanic energy release during volcanic crises. Most importantly, these estimates can be made when data are clipped or over-run. The methods were developed with data from the initial stages of the 2004-2008 eruption of Mount St. Helens, a well-studied volcano with many instruments placed at varying distances from the vent, which makes that eruption a good setting in which to calibrate and refine methodologies that can then be applied to volcanoes with limited networks.
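
The first method lends itself to a compact sketch. Assuming the coda envelope decays as a power law A(t) = A0 * t^(-p), the unclipped portion can be fit in log-log space and extrapolated to the time at which the envelope rejoins the noise floor; a duration magnitude then follows from a network-calibrated relation of the form Md = a + b*log10(tau). The Python below is our illustration under those assumptions, not the study's implementation; the default coefficients follow the Lee et al. (1972) California relation (distance term omitted) and are placeholders only:

```python
import numpy as np

def extrapolate_duration(t, amp, noise_floor):
    """Extrapolate full event duration from an unclipped coda segment.

    t           : times (s) after event onset of usable envelope samples
    amp         : envelope amplitudes at those times
    noise_floor : pre-event background amplitude
    """
    # fit log10(A) = intercept + slope * log10(t); slope = -p for power-law decay
    slope, intercept = np.polyfit(np.log10(t), np.log10(amp), 1)
    # solve 10**intercept * tau**slope = noise_floor for the duration tau
    return 10.0 ** ((np.log10(noise_floor) - intercept) / slope)

def duration_magnitude(tau, a=-0.87, b=2.0):
    """Coda-duration magnitude Md = a + b*log10(tau); coefficients must be
    recalibrated for the local network (defaults are illustrative only)."""
    return a + b * np.log10(tau)
```

The second method, regressing full duration on clipped duration, reduces to a straightforward linear fit once a set of unclipped reference events is available for calibration.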