20 results for digital methods

in Digital Commons - Michigan Tech


Relevance: 30.00%

Abstract:

The amount and type of ground cover is an important characteristic to measure when collecting soil disturbance monitoring data after a timber harvest. Estimates of ground cover and bare soil can be used for tracking changes in invasive species, plant growth and regeneration, woody debris loadings, and the risk of surface water runoff and soil erosion. A new method of assessing ground cover and soil disturbance, the Forest Soil Disturbance Monitoring Protocol (FSDMP), was recently published by the U.S. Forest Service. This protocol uses the frequency of cover types in small circular (15 cm) plots to compare the ground surface in pre- and post-harvest conditions. While both frequency and percent cover are common methods of describing vegetation, frequency has rarely been used to measure ground surface cover. In this study, three methods for assessing ground cover percent (step-point, 15 cm dia. circular, and 1x5 m visual plot estimates) were compared to the FSDMP frequency method. Results show that the FSDMP method provides significantly higher estimates of ground surface condition for most soil cover types, except coarse wood. The three cover methods had similar estimates for most cover values. The FSDMP method also produced the highest value when bare soil estimates were used to model erosion risk. In a person-hour analysis, estimating ground cover percent in 15 cm dia. plots required the least sampling time, and provided standard errors similar to the other cover estimates even at low sampling intensities (n=18). If ground cover estimates are desired in soil monitoring, then a small plot size (15 cm dia. circle) or a step-point method can provide a more accurate estimate in less time than the current FSDMP method.
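The contrast between frequency and percent-cover estimation can be sketched numerically. The plot values below are hypothetical, and `presence_threshold` is an illustrative parameter, not part of the FSDMP protocol; the sketch only shows why a presence/absence frequency tends to sit above the mean areal cover:

```python
def frequency_estimate(plot_covers, presence_threshold=0.0):
    """FSDMP-style estimate: fraction of plots where the cover type is present."""
    present = [c > presence_threshold for c in plot_covers]
    return sum(present) / len(plot_covers)

def percent_cover_estimate(plot_covers):
    """Cover-based estimate: mean percent cover across plots."""
    return sum(plot_covers) / len(plot_covers)

# Hypothetical transect: percent bare soil in each of 18 15-cm plots.
plots = [5, 0, 10, 2, 0, 8, 0, 1, 3, 0, 12, 0, 4, 0, 6, 0, 2, 7]

freq = frequency_estimate(plots)           # 11 of 18 plots have any bare soil
pct = percent_cover_estimate(plots) / 100  # mean areal fraction

print(f"frequency estimate: {freq:.2f}")        # ~0.61
print(f"percent-cover estimate: {pct:.3f}")     # ~0.033
```

Any plot with even a trace of bare soil counts fully toward the frequency, which is consistent with the finding above that the FSDMP method yields higher estimates than the cover-based methods.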

Relevance: 30.00%

Abstract:

This research is a study of the use of capital budgeting methods for investment decisions. It uses both the traditional methods and the newly introduced approach called real options analysis to make a decision. The research elucidates how capital budgeting can be done when analysts encounter projects that are capital intensive and carry high uncertainty, for example oil and gas production. It then uses the oil and gas find in Ghana as a case study to support its argument. For a clear understanding, a thorough literature review was done, which highlights the advantages and disadvantages of both methods. The revenue that the project will generate and the costs of production were obtained from the predictions by analysts from GNPC and compared to other experts' opinions. Both the traditional and real option valuations were then applied to the oil and gas find in Ghana to determine the project's feasibility. Although there are some shortfalls in real options analysis that are presented in this research, it is still helpful in valuing projects that are capital intensive with high volatility, owing to the strategic flexibility management possesses in its decision making. It also suggests that traditional methods of evaluation should still be maintained and used to value projects that have no options, or those whose options do not have a significant impact on the project. The research points out the economic ripples the production of oil and gas will have on Ghana's economy should the project be undertaken. These ripples include economic growth, massive job creation, and reduction of the country's balance of trade deficit. The long-run effect is an eventual improvement in the life of the citizens. It is also believed that the production of gas specifically can be used to generate electricity in Ghana, which would enable the country to have a more stable and reliable power source necessary to attract more foreign direct investment.
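The gap between the two valuation approaches can be illustrated with a minimal binomial sketch. This is not the valuation performed in the research, and all figures are hypothetical; it only shows how a project rejected on NPV grounds can still carry deferral value when volatility is high:

```python
import math

def npv(project_value, investment):
    """Traditional rule: invest now iff PV of cash flows exceeds cost."""
    return project_value - investment

def defer_option_value(v, k, r, sigma, t, steps=100):
    """Binomial (Cox-Ross-Rubinstein) value of the option to invest at
    time t, treated as a European call on project value v with strike k."""
    dt = t / steps
    u = math.exp(sigma * math.sqrt(dt))   # up factor per step
    d = 1 / u                             # down factor per step
    p = (math.exp(r * dt) - d) / (u - d)  # risk-neutral up probability
    disc = math.exp(-r * dt)
    # payoff at expiry for each terminal node (j up-moves)
    values = [max(v * u**j * d**(steps - j) - k, 0.0) for j in range(steps + 1)]
    # backward induction to today
    for _ in range(steps):
        values = [disc * (p * values[j + 1] + (1 - p) * values[j])
                  for j in range(len(values) - 1)]
    return values[0]

# Hypothetical project: marginal by NPV, but volatility gives deferral value.
V, K = 100.0, 105.0   # PV of cash flows vs. capital cost, arbitrary units
print("NPV today:", npv(V, K))   # negative, so a strict NPV rule says reject
print("option to defer:", round(defer_option_value(V, K, 0.05, 0.40, 2.0), 2))
```

The same numbers that fail the NPV test yield a strictly positive deferral value, which is the strategic-flexibility argument summarized above.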

Relevance: 30.00%

Abstract:

Riparian zones are dynamic, transitional ecosystems between aquatic and terrestrial ecosystems with well-defined vegetation and soil characteristics. Because of their high variability, developing an all-encompassing definition for riparian ecotones is challenging. However, all riparian ecotones depend on two primary factors: the watercourse and its associated floodplain. Previous approaches to riparian boundary delineation have utilized fixed-width buffers, but this methodology has proven inadequate, as it only takes the watercourse into consideration and ignores critical geomorphology, associated vegetation, and soil characteristics. Our approach offers advantages over previously used methods by utilizing: the geospatial modeling capabilities of ArcMap GIS; a better sampling technique along the watercourse that can distinguish the 50-year floodplain, which is the optimal hydrologic descriptor of riparian ecotones; the Soil Survey Geographic (SSURGO) and National Wetland Inventory (NWI) databases to distinguish contiguous areas beyond the 50-year floodplain; and land use/cover characteristics associated with the delineated riparian zones. The model utilizes spatial data readily available from federal and state agencies and geospatial clearinghouses. An accuracy assessment was performed to assess the impact of varying the 50-year flood height, changing the DEM spatial resolution (1, 3, 5 and 10 m), and positional inaccuracies in the National Hydrography Dataset (NHD) streams layer on the boundary placement of the delineated variable-width riparian ecotone areas. The result of this study is a robust and automated GIS-based model attached to ESRI ArcMap software to delineate and classify variable-width riparian ecotones.
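The floodplain step of such a model can be caricatured in a few lines. This is only a flat-water threshold sketch (the actual model also incorporates the SSURGO, NWI, and land-cover layers), and all elevations are hypothetical:

```python
import numpy as np

def flood_extent(dem, stream_elev, flood_height):
    """Cells inundated when the water surface sits flood_height above the
    stream elevation: a crude stand-in for a 50-year floodplain query."""
    return dem <= stream_elev + flood_height

# Hypothetical 1 m DEM cross-section (elevations in m); the channel sits
# at 100.0 m and the assumed 50-year flood stage is 1.5 m above it.
dem = np.array([[103.0, 101.2, 100.0, 101.0, 104.0],
                [103.5, 101.4, 100.1, 101.6, 104.2]])
riparian = flood_extent(dem, stream_elev=100.0, flood_height=1.5)
print(riparian.sum())   # count of cells within the assumed floodplain
```

Varying `flood_height` here mirrors the accuracy assessment above, where the 50-year flood height was perturbed to test the sensitivity of the delineated boundary.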

Relevance: 30.00%

Abstract:

Understanding the canopy cover of an urban environment leads to better estimates of carbon storage and more informed management decisions by urban foresters. The most commonly used method for assessing urban forest cover type extent is ground surveys, which can be both time-consuming and expensive. The analysis of aerial photos is an alternative method that is faster, cheaper, and can cover a larger number of sites, but may be less accurate. The objectives of this paper were (1) to compare three methods of cover type assessment for Los Angeles, CA: hand-delineation of aerial photos in ArcMap, supervised classification of aerial photos in ERDAS Imagine, and ground-collected data using the Urban Forest Effects (UFORE) model protocol; (2) to determine how well remote sensing methods estimate carbon storage as predicted by the UFORE model; and (3) to explore the influence of tree diameter and tree density on carbon storage estimates. Four major cover types (bare ground, fine vegetation, coarse vegetation, and impervious surfaces) were determined from 348 plots (0.039 ha each) randomly stratified according to land use. Hand-delineation was better than supervised classification at predicting ground-based measurements of cover type and UFORE model-predicted carbon storage. Most error in supervised classification resulted from shadow, which was interpreted as unknown cover type. Neither tree diameter nor tree density per plot significantly affected the relationship between carbon storage and canopy cover. The efficiency of remote sensing rather than in situ data collection gives urban forest managers the ability to quickly assess a city and plan accordingly while also preserving their often-limited budget.

Relevance: 30.00%

Abstract:

As an important civil engineering material, asphalt concrete (AC) is commonly used to build road surfaces, airports, and parking lots. With traditional laboratory tests and theoretical equations, it is a challenge to fully understand such a random composite material. Based on the discrete element method (DEM), this research seeks to develop and implement computer models as research approaches for improving understanding of AC microstructure-based mechanics. In this research, three categories of approaches were developed or employed to simulate microstructures of AC materials, namely randomly-generated models, idealized models, and image-based models. The image-based models were recommended for accurately predicting AC performance, while the other models were recommended as research tools to obtain deep insight into the AC microstructure-based mechanics. A viscoelastic micromechanical model was developed to capture viscoelastic interactions within the AC microstructure. Four types of constitutive models were built to address the four categories of interactions within an AC specimen. Each of the constitutive models consists of three parts which represent three different interaction behaviors: a stiffness model (force-displacement relation), a bonding model (shear and tensile strengths), and a slip model (frictional properties). Three techniques were developed to reduce the computational time for AC viscoelastic simulations. It was found that the computational time for typical three-dimensional models was significantly reduced from years or months to days or hours. Dynamic modulus and creep stiffness tests were simulated, and methodologies were developed to determine the viscoelastic parameters. It was found that the DE models could successfully predict dynamic modulus, phase angles, and creep stiffness over a wide range of frequencies, temperatures, and time spans.
Mineral aggregate morphology characteristics (sphericity, orientation, and angularity) were studied to investigate their impacts on AC creep stiffness. It was found that aggregate characteristics significantly impact creep stiffness. Pavement responses and pavement-vehicle interactions were investigated by simulating pavement sections under a rolling wheel. It was found that wheel acceleration, steady rolling, and deceleration significantly impact contact forces. Additionally, a summary and recommendations were provided in the last chapter, and part of the computer code is provided in the appendices.
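One standard constitutive form for this kind of viscoelastic stiffness model is the Burger's model (a Maxwell and a Kelvin element in series), which the asphalt DEM literature commonly uses. The parameters below are hypothetical, and the sketch is illustrative rather than the dissertation's calibrated model:

```python
import math

def burgers_creep_strain(sigma, t, e1, eta1, e2, eta2):
    """Creep strain of a Burger's model under constant stress sigma:
    eps(t) = sigma * (1/E1 + t/eta1 + (1 - exp(-E2*t/eta2)) / E2),
    combining instantaneous (E1), viscous (eta1), and delayed-elastic
    (E2, eta2) responses."""
    return sigma * (1.0 / e1 + t / eta1
                    + (1.0 - math.exp(-e2 * t / eta2)) / e2)

# Hypothetical parameters (stress in MPa, times in s, moduli in MPa,
# viscosities in MPa*s); strain grows with loading time.
eps_1s = burgers_creep_strain(0.1, 1.0, 300.0, 5000.0, 100.0, 1000.0)
eps_10s = burgers_creep_strain(0.1, 10.0, 300.0, 5000.0, 100.0, 1000.0)
print(eps_1s, eps_10s)
# Creep stiffness S(t) = sigma / eps(t) therefore decreases with time.
```

In a DEM setting a law like this supplies the force-displacement part of each contact; the bonding and slip parts described above bound the forces it may transmit.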

Relevance: 30.00%

Abstract:

High resolution digital elevation models (DEMs) of Santiaguito and Pacaya volcanoes, Guatemala, were used to estimate volume changes and eruption rates between 1954 and 2001. The DEMs were generated from contour maps and aerial photography, which were analyzed in ArcGIS 9.0®. Because both volcanoes were growing substantially over the five-decade period, they provide a good data set for exploring effective methodology for estimating volume changes. The analysis shows that the Santiaguito dome complex grew by 0.78 ± 0.07 km³ (0.52 ± 0.05 m³ s⁻¹) over the 1954-2001 period, with nearly all the growth occurring on the El Brujo (1958-75) and Caliente (1971-2001) domes. Adding information from field data prior to 1954, the total volume extruded from Santiaguito since 1922 is estimated at 1.48 ± 0.19 km³. Santiaguito's growth rate is lower than that of most other volcanic domes, but it has been sustained over a much longer period and has undergone a change toward more exogenous and progressively slower extrusion with time. At Santiaguito, some of the material being added at the dome is subsequently transported downstream by block and ash flows, mudflows, and floods, creating channel shifting and areas of aggradation and erosion. At Pacaya volcano a total volume of 0.21 ± 0.05 km³ was erupted between 1961 and 2001, for an average extrusion rate of 0.17 ± 0.04 m³ s⁻¹. Both the Santiaguito and Pacaya eruption rate estimates reported here are minima, because they do not include estimates of materials transported downslope after eruption or data on ashfall, which may represent significant volumes of material spread over broad areas. Regular analysis of high resolution DEMs using the methods outlined here would help quantify the effects of fluvial changes on downstream populated areas, as well as assist in tracking hazards related to dome collapse and eruption.
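The core calculation behind such estimates reduces to differencing co-registered elevation grids and scaling by cell area. A minimal sketch, with hypothetical grids and numbers:

```python
import numpy as np

def volume_change(dem_old, dem_new, cell_size):
    """Net volume change between two co-registered DEMs, in m^3.
    cell_size is the grid spacing in metres."""
    dz = dem_new - dem_old                # elevation change per cell (m)
    return dz.sum() * cell_size ** 2      # m * m^2 = m^3

# Hypothetical 3x3 grids with 10 m cells: the dome grows 2 m everywhere.
old = np.zeros((3, 3))
new = old + 2.0
dv = volume_change(old, new, cell_size=10.0)
print(dv)   # 9 cells * 2 m * 100 m^2 = 1800 m^3

# Average extrusion rate over an inter-survey interval, in m^3/s.
years = 47.0
rate = dv / (years * 365.25 * 24 * 3600)
```

Real surveys add a co-registration step and propagate DEM error into the ± bounds quoted above; the arithmetic itself is this simple differencing.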

Relevance: 30.00%

Abstract:

A significant cost for foundations is the design and installation of piles when they are required due to poor ground conditions. Not only is it important that piles be designed properly, but also that the installation equipment and total cost be evaluated. To assist in the evaluation of piles, a number of methods have been developed. In this research three of these methods were investigated, which were developed by the Federal Highway Administration, the U.S. Army Corps of Engineers, and the American Petroleum Institute (API). The results from these methods were entered into the program GRLWEAP to assess pile drivability and to provide a standard basis for comparing the three methods. An additional element of this research was to develop Excel spreadsheets to implement these three methods. Currently the Army Corps and API methods do not have publicly available software and must be performed manually, which requires reading data from figures and tables and can introduce error in the prediction of pile capacities. Following development of the Excel spreadsheets, they were validated with both manual calculations and existing data sets to ensure that the data output is correct. To evaluate the three pile capacity methods, data were utilized from four project sites in North America. The data included site geotechnical data along with field-determined pile capacities. In order to achieve a standard comparison of the data, the pile capacities and geotechnical data from the three methods were entered into GRLWEAP. The sites consisted of both cohesive and cohesionless soils: one site was primarily cohesive, one was primarily cohesionless, and the other two consisted of inter-bedded cohesive and cohesionless soils. Based on this limited set of data, the results indicated that the U.S. Army Corps of Engineers method compared most closely with the field test data, followed by the API method to a lesser degree.
The DRIVEN program compared favorably in cohesive soils, but overpredicted capacities in cohesionless material.
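For the cohesive-soil part of the API method, the static capacity has a compact closed form: shaft friction f = α·su with the published correlation α = 0.5ψ^-0.5 for ψ ≤ 1.0 and α = 0.5ψ^-0.25 otherwise (capped at 1.0, ψ = su divided by effective overburden stress), plus end bearing q = 9·su. The sketch below uses hypothetical pile and soil values and uniform properties, not any of the four project sites:

```python
def alpha_api(su_kpa, sigma_v_kpa):
    """API alpha factor for cohesive soil, capped at 1.0."""
    psi = su_kpa / sigma_v_kpa
    alpha = 0.5 * psi ** -0.5 if psi <= 1.0 else 0.5 * psi ** -0.25
    return min(alpha, 1.0)

def api_pile_capacity_clay(su_kpa, sigma_v_kpa, shaft_area_m2, tip_area_m2):
    """Ultimate axial capacity (kN) of a pile in clay by the API method:
    alpha * su over the shaft plus 9 * su over the tip."""
    f = alpha_api(su_kpa, sigma_v_kpa) * su_kpa   # unit skin friction, kPa
    return f * shaft_area_m2 + 9.0 * su_kpa * tip_area_m2

# Hypothetical pile: su = 50 kPa, overburden 100 kPa, 30 m^2 shaft, 0.07 m^2 tip.
q_ult = api_pile_capacity_clay(50.0, 100.0, 30.0, 0.07)
print(round(q_ult, 1))
```

A spreadsheet implementation layers the same arithmetic over depth increments with depth-varying su and overburden, which is where manual figure-reading errors creep in.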

Relevance: 30.00%

Abstract:

Complex human diseases are a major challenge for biological research. The goal of my research is to develop effective methods for biostatistics in order to create more opportunities for the prevention and cure of human diseases. This dissertation proposes statistical technologies that can be adapted to sequencing data in family-based designs, and that account for joint effects as well as gene-gene and gene-environment interactions in GWA studies. The framework includes statistical methods for rare and common variant association studies. Although next-generation DNA sequencing technologies have made rare variant association studies feasible, the development of powerful statistical methods for rare variant association studies is still underway. Chapter 2 demonstrates two adaptive weighting methods for rare variant association studies based on family data for quantitative traits. The results show that both proposed methods are robust to population stratification, robust to the direction and magnitude of the effects of causal variants, and more powerful than methods using the weights suggested by Madsen and Browning [2009]. In Chapter 3, I extended the previously proposed test for Testing the effect of an Optimally Weighted combination of variants (TOW) [Sha et al., 2012] for unrelated individuals to TOW-F, TOW for family-based designs. Simulation results show that TOW-F can control for population stratification in a wide range of population structures, including spatially structured populations, is robust to the directions of effect of causal variants, and is relatively robust to the percentage of neutral variants. For GWA studies, this dissertation presents a two-locus joint effect analysis and a two-stage approach accounting for gene-gene and gene-environment interactions. Chapter 4 proposes a novel two-stage approach, which is promising for identifying joint effects, especially for monotonic models.
The proposed approach outperforms a single-marker method and a regular two-stage analysis based on the two-locus genotypic test. In Chapter 5, I proposed a gene-based two-stage approach to identify gene-gene and gene-environment interactions in GWA studies, which can include rare variants. The two-stage approach is applied to the GAW 17 dataset to identify the interaction between the KDR gene and smoking status.
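As a point of reference for the weighting schemes compared above, the Madsen and Browning [2009] weights w_j = 1/sqrt(n·q_j·(1 - q_j)) can be sketched as follows. The genotypes are hypothetical, and the allele frequency is estimated from the whole sample here for simplicity (the original estimates it from unaffected individuals only):

```python
import math

def madsen_browning_weights(genotypes):
    """Per-variant weights w_j = 1 / sqrt(n * q_j * (1 - q_j)), where
    q_j is the estimated allele frequency of variant j; a pseudo-count
    keeps the estimate away from 0 and 1."""
    n = len(genotypes)
    m = len(genotypes[0])
    weights = []
    for j in range(m):
        q = (sum(g[j] for g in genotypes) + 1) / (2 * n + 2)
        weights.append(1.0 / math.sqrt(n * q * (1 - q)))
    return weights

def weighted_burden_scores(genotypes):
    """Per-individual weighted sum of minor-allele counts."""
    w = madsen_browning_weights(genotypes)
    return [sum(wj * gj for wj, gj in zip(w, g)) for g in genotypes]

# Hypothetical genotypes: 4 individuals x 3 variants (0/1/2 allele counts).
G = [[0, 1, 0],
     [0, 0, 0],
     [1, 0, 0],
     [2, 0, 1]]
scores = weighted_burden_scores(G)
# Rarer variants receive larger weights, so their carriers score higher.
```

The adaptive methods in Chapter 2 replace these fixed frequency-based weights with data-driven ones, which is what makes them robust to the direction of effect.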

Relevance: 30.00%

Abstract:

Waste effluents from the forest products industry are sources of lignocellulosic biomass that can be converted to ethanol by yeast after pretreatment. However, the challenge of improving ethanol yields from a mixed pentose and hexose fermentation of a potentially inhibitory hydrolysate still remains. Hardboard manufacturing process wastewater (HPW) was evaluated as a potential feedstream for lignocellulosic ethanol production by native xylose-fermenting yeast. After screening of xylose-fermenting yeasts, Scheffersomyces stipitis CBS 6054 was selected as the ideal organism for conversion of the HPW hydrolysate material. The individual and synergistic effects of inhibitory compounds present in the hydrolysate were evaluated using response surface methodology. It was concluded that organic acids have an additive negative effect on fermentations. Fermentation conditions were also optimized in terms of aeration and pH. Methods for improving productivity and achieving higher ethanol yields were investigated. Adaptation to the conditions present in the hydrolysate through repeated cell sub-culturing was used. The objectives of this study were to adapt S. stipitis CBS 6054 to a dilute-acid-pretreated, lignocellulose-containing waste stream; compare the physiological, metabolic, and proteomic profiles of the adapted strain to those of its parent; quantify changes in protein expression/regulation, metabolite abundance, and enzyme activity; and determine the biochemical and molecular mechanism of adaptation. The adapted culture showed improvement in both substrate utilization and ethanol yields compared to the unadapted parent strain. The adapted strain also exhibited a distinct growth phenotype relative to its unadapted parent, based on its physiological and proteomic profiles. Several potential targets that could be responsible for strain improvement were identified.
These targets could have implications for metabolic engineering of strains for improved ethanol production from lignocellulosic feedstocks. Although this work focuses specifically on the conversion of HPW to ethanol, the methods developed can be used for any feedstock/product system that employs a microbial conversion step. The benefit of this research is that the organisms will be optimized for a company's specific system.

Relevance: 30.00%

Abstract:

With the development of genotyping and next-generation sequencing technologies, multi-marker testing in genome-wide association studies and rare variant association studies has become an active research area in statistical genetics. This dissertation contains three methodologies for association studies that explore different genetic data features, and demonstrates how to use those methods to test genetic association hypotheses. The methods can be categorized into three scenarios: 1) multi-marker testing for strong linkage disequilibrium regions, 2) multi-marker testing for family-based association studies, and 3) multi-marker testing for rare variant association studies. I also discuss the advantages of using these methods and demonstrate their power through simulation studies and applications to real genetic data.

Relevance: 30.00%

Abstract:

This project consists of a proposed curriculum for a semester-long, community-based workshop for LGBTQIA+ (lesbian, gay, bisexual, trans*, queer or questioning, intersex, asexual or ally, "+" indicating other identifications that deviate from heterosexual) youth ages 16-18. The workshop focuses on an exploration of LGBTQIA+ identity and community through discussion and collaborative rhetorical analysis of visual and social media. Informed by queer theory and history, studies on youth work, and visual media studies and incorporating rhetorical criticism as well as liberatory pedagogy and community literacy practices, the participation-based design of the workshop seeks to involve participants in selection of media texts, active analytical viewership, and multimodal response. The workshop is designed to engage participants in reflection on questions of individual and collective responsibility and agency as members and allies of various communities. The goal of the workshop is to strengthen participants' abilities to analyze the complex ways in which television, film, and social media influence their own and others’ perceptions of issues surrounding queer identities. As part of the reflective process, participants are challenged to consider how they can in turn actively and collaboratively respond to and potentially help to shape these perceptions. My project report details the theoretical framework, pedagogical rationale, methods of text selection and critical analysis, and guidelines for conduct that inform and structure the workshop.

Relevance: 30.00%

Abstract:

The persuasive power of music is often relegated to the dimension of pathos: that which moves us emotionally. Yet the music commodity is now situated in and around the liminal spaces of digitality. To think about how music functions, how it argues across media, and how it moves us, we must examine its material and immaterial realities as they present themselves to us and as we so create them. This dissertation rethinks the relationship between rhetoric and music by examining the creation, performance, and distribution of music in its material and immaterial forms to demonstrate its persuasive power. While both Plato and Aristotle understood music as a means to move men toward virtue, Plato tells us in his Laws, through the Athenian Stranger, that the very best kinds of music can help guide us to truth. From this starting point, I assess the historical problem of understanding the rhetorical potential of music as merely that which directs or imitates the emotions: that which "soothes the savage breast," as William Congreve writes. By furthering work by Vickers and Farnsworth, who suggest that the Baroque fascination with applying rhetorical figures to musical figures is an insufficient framework for assessing the rhetorical potential of music, I demonstrate the gravity of musical persuasion in its political weight, in its violence (the subjective violence of musical torture at Guantanamo and the objective, ideological violence of music), and in what Jacques Attali calls the prophetic nature of music. I argue that music has a significant function and, as a non-discursive form of argumentation, works on us beyond affect.
Moreover, with the emergence of digital music distribution and domestic digital recording technologies, the digital music commodity in its material and immaterial forms allows for ruptures in the former methods of musical composition, production, and distribution, and in the political potential of music, which Attali describes as being able to foresee new political realities. I thus suggest a new theoretical framework for thinking about rhetoric and music by expanding on Lloyd Bitzer's rhetorical situation, offering the idea of "openings" to the existing exigence, audience, and constraints. The prophetic and rhetorical power of music in the aleatoric moment can help provide openings from which new exigencies can be conceived. We must, therefore, reconsider the role of rhetorical-musical composition for the citizen, not merely as a tool for entertainment or emotional persuasion, but as an arena for engaging with the political.

Relevance: 30.00%

Abstract:

Over the past several decades, it has become apparent that anthropogenic activities have resulted in the large-scale enhancement of the levels of many trace gases throughout the troposphere. More recently, attention has been given to the transport pathway taken by these emissions as they are dispersed throughout the atmosphere. The transport pathway determines the physical characteristics of emissions plumes and therefore plays an important role in the chemical transformations that can occur downwind of source regions. For example, the production of ozone (O3) is strongly dependent upon the transport its precursors undergo. O3 can initially be formed within air masses while still over polluted source regions. These polluted air masses can experience continued O3 production or O3 destruction downwind, depending on the air mass's chemical and transport characteristics. At present, however, there are a number of uncertainties in the relationships between transport and O3 production in the North Atlantic lower free troposphere. The first phase of the study presented here used measurements made at the Pico Mountain observatory and model simulations to determine transport pathways for US emissions to the observatory. The Pico Mountain observatory was established in the summer of 2001 in order to address the need to understand the relationships between transport and O3 production. Measurements from the observatory were analyzed in conjunction with simulations from the Lagrangian particle dispersion model (LPDM) FLEXPART in order to determine the transport pathways for events observed at the Pico Mountain observatory during July 2003. A total of 16 events were observed, 4 of which were analyzed in detail. The transport time for these 16 events varied from 4.5 to 7 days, while the transport altitudes over the ocean ranged from 2-8 km, but were typically less than 3 km.
In three of the case studies, eastward advection and transport in a weak warm conveyor belt (WCB) airflow was responsible for the export of North American emissions into the FT, while transport in the FT was governed by easterly winds driven by the Azores/Bermuda High (ABH) and transient northerly lows. In the fourth case study, North American emissions were lofted to 6-8 km in a WCB before being entrained in the same cyclone's dry airstream and transported down to the observatory. The results of this study show that the lower marine FT may provide an important transport environment where O3 production may continue, in contrast to transport in the marine boundary layer, where O3 destruction is believed to dominate. The second phase of the study presented here focused on improving the analysis methods that are available with LPDMs. While LPDMs are popular and useful for the analysis of atmospheric trace gas measurements, identifying the transport pathway of emissions from their source to a receptor (the Pico Mountain observatory in our case) using the standard gridded model output can be difficult or impossible, particularly during complex meteorological scenarios. The transport study in phase 1 was limited to only 1 month out of more than 3 years of available data, and included only 4 case studies out of the 16 events, specifically due to this confounding factor. The second phase of this study addressed this difficulty by presenting a method to clearly and easily identify the pathway taken by only those emissions that arrive at a receptor at a particular time, by combining the standard gridded output from forward (i.e., concentrations) and backward (i.e., residence time) LPDM simulations, greatly simplifying similar analyses.
The ability of the method to successfully determine the source-to-receptor pathway, restoring this Lagrangian information that is lost when the data are gridded, is proven by comparing the pathway determined from this method with the particle trajectories from both the forward and backward models. A sample analysis is also presented, demonstrating that this method is more accurate and easier to use than existing methods using standard LPDM products. Finally, we discuss potential future work that would be possible by combining the backward LPDM simulation with gridded data from other sources (e.g., chemical transport models) to obtain a Lagrangian sampling of the air that will eventually arrive at a receptor.
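The combination step can be illustrated schematically. The element-wise product of the two gridded fields, renormalized, is an assumption about the mechanics for illustration only, not the paper's exact formulation, and both grids are hypothetical:

```python
import numpy as np

def pathway_field(forward_conc, backward_restime):
    """Schematic source-to-receptor pathway: only cells that hold both
    forward-simulated tracer and backward residence time survive the
    element-wise product; normalize so the field sums to 1."""
    product = forward_conc * backward_restime
    total = product.sum()
    return product / total if total > 0 else product

fwd = np.array([[0.0, 2.0, 0.0],
                [0.0, 1.0, 3.0],
                [0.0, 0.0, 1.0]])   # hypothetical tracer concentrations
bwd = np.array([[1.0, 0.0, 0.0],
                [0.0, 4.0, 2.0],
                [0.0, 0.0, 0.0]])   # hypothetical residence times

path = pathway_field(fwd, bwd)
print(path)   # nonzero only where both fields overlap
```

Cells visited by the forward plume but never sampled by the backward simulation (and vice versa) drop out, which is the sense in which the combination isolates only the emissions that actually reach the receptor.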

Relevance: 30.00%

Abstract:

As continued global funding and coordination are allocated toward the improvement of access to safe sources of drinking water, alternative solutions may be necessary to expand implementation to remote communities. This report evaluates two technologies used in a small water distribution system in a mountainous region of Panama: solar-powered pumping and flow-reducing discs. The two parts of the system function independently, but were both chosen for their ability to mitigate unique issues in the community. The design program NeatWork and flow-reducing discs were evaluated because they are tools taught to Peace Corps Volunteers in Panama. Even when ample water is available, mountainous terrain affects the pressure available throughout a water distribution system. Since the static head in the system only varies with the height of water in the tank, frictional losses from pipes and fittings must be exploited to balance out the inequalities caused by the uneven terrain. Reducing the maximum allowable flow to connections through the installation of flow-reducing discs can help to retain enough residual pressure in the main distribution lines to provide reliable service to all connections. NeatWork was calibrated to measured flow rates by changing the orifice coefficient (θ), resulting in a value of 0.68, which is 10-15% higher than typical values for manufactured flow-reducing discs. NeatWork was used to model various system configurations to determine whether a single-sized flow-reducing disc could provide equitable flow rates throughout an entire system. There is a strong correlation between the optimum single-sized flow-reducing disc and the average elevation change throughout a water distribution system: the larger the elevation change across the system, the smaller the recommended uniform orifice size. Renewable energy can bridge the infrastructure gap and provide basic services at a fraction of the cost and time required to install transmission lines.
Methods for the assessment of solar-powered pumping systems as a means for rural water supply are presented and evaluated. It was determined that manufacturer-provided product specifications can be used to appropriately design a solar pumping system, but care must be taken to ensure that sufficient water can be provided to the system despite variations in solar intensity.
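The hydraulic effect of a flow-reducing disc follows the standard orifice equation Q = θ·A·√(2gh). The sketch below uses the calibrated θ = 0.68 reported here, with a hypothetical orifice diameter and head:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def orifice_flow(theta, orifice_diameter_m, head_m):
    """Flow through a flow-reducing disc, Q = theta * A * sqrt(2*g*h),
    where theta is the orifice coefficient and h the head over the disc."""
    area = math.pi * (orifice_diameter_m / 2) ** 2
    return theta * area * math.sqrt(2 * G * head_m)

# Hypothetical connection: a 4 mm orifice under 30 m of residual head.
q = orifice_flow(theta=0.68, orifice_diameter_m=0.004, head_m=30.0)
print(f"{q * 1000:.3f} L/s")   # litres per second
```

Because Q scales only with the square root of head but with the square of orifice diameter, shrinking the disc is the effective lever for equalizing flows across large elevation differences, consistent with the correlation noted above.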

Relevance: 30.00%

Abstract:

Principal Component Analysis (PCA) is a popular method for dimension reduction that can be used in many fields, including data compression, image processing, and exploratory data analysis. However, the traditional PCA method has several drawbacks: it is not efficient for dealing with high-dimensional data, and it cannot compute sufficiently accurate principal components when handling a relatively large portion of missing data. In this report, we propose to use the EM-PCA method for dimension reduction of power system measurement data with missing values, and provide a comparative study of the traditional PCA and EM-PCA methods. Our extensive experimental results show that the EM-PCA method is more effective and more accurate for dimension reduction of power system measurement data than the traditional PCA method when dealing with a large portion of missing data.
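A minimal sketch of the EM-PCA idea, alternating between imputing missing entries from the current low-rank reconstruction and refitting the components. This is an illustrative implementation under simplified assumptions, not the exact algorithm evaluated in the report, and the data matrix is hypothetical:

```python
import numpy as np

def em_pca(X, n_components, n_iter=100, tol=1e-6):
    """EM-style PCA for data with missing entries (NaNs): alternate
    between filling missing values from the current rank-k reconstruction
    (E-step) and refitting the components by SVD (M-step)."""
    X = np.asarray(X, dtype=float)
    missing = np.isnan(X)
    # initialise missing entries with column means
    col_means = np.nanmean(X, axis=0)
    X_filled = np.where(missing, col_means, X)
    for _ in range(n_iter):
        mu = X_filled.mean(axis=0)
        Xc = X_filled - mu
        # M-step: principal components of the completed data
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        recon = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components] + mu
        # E-step: re-impute only the missing entries
        new_filled = np.where(missing, recon, X)
        if np.max(np.abs(new_filled - X_filled)) < tol:
            X_filled = new_filled
            break
        X_filled = new_filled
    return Vt[:n_components], X_filled

# Hypothetical rank-1 "measurement" matrix with one missing entry.
X = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, np.nan]])
components, completed = em_pca(X, n_components=1)
print(completed)
# The NaN is pulled upward toward the value implied by the rank-1 structure.
```

Traditional PCA would have to drop the incomplete row or freeze the mean-filled value; the EM iteration instead lets the fitted component inform the imputation, which is the advantage the report's comparison measures.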