9 results for Experience Sampling Methods
in Digital Commons - Michigan Tech
Abstract:
Proteins are linear chain molecules made of amino acids. Only when they fold to their native states do they become functional. This dissertation aims to model the solvent (environment) effect and to develop and implement enhanced sampling methods that enable a reliable study of the protein folding problem in silico. We have developed an enhanced solvation model based on the solution to the Poisson-Boltzmann equation in order to describe the solvent effect. Following the quantum mechanical Polarizable Continuum Model (PCM), we decomposed the net solvation free energy into three physical terms: polarization, dispersion, and cavitation. All the terms were implemented, analyzed, and parametrized individually to obtain a high level of accuracy. In order to describe the thermodynamics of proteins, their conformational space needs to be sampled thoroughly. Simulations of proteins are hampered by slow relaxation due to their rugged free-energy landscape, with the barriers between minima being higher than the thermal energy at physiological temperatures. To overcome this problem, a number of approaches have been proposed, of which the replica exchange method (REM) is the most popular. In this dissertation we describe a new variant of the canonical replica exchange method in the context of molecular dynamics simulation. The advantage of this new method is its easily tunable, high acceptance rate for replica exchange. We call our method Microcanonical Replica Exchange Molecular Dynamics (MREMD). We describe the theoretical framework, comment on its implementation, and present its application to the Trp-cage mini-protein in implicit solvent. Using this approach, we have been able to correctly predict the folding thermodynamics of this protein.
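For context, the standard canonical (temperature) REM that MREMD builds on accepts a swap between neighboring replicas with probability min(1, exp[(beta_i - beta_j)(E_i - E_j)]). The Python sketch below shows only this conventional criterion; the microcanonical variant described in the dissertation modifies the rule to obtain its tunable acceptance rate, and the temperatures, energies, and units used here are illustrative, not taken from the work.

import math
import random

def rem_swap_accepted(beta_i, beta_j, E_i, E_j, rng=random):
    # Metropolis criterion for a canonical replica-exchange swap:
    # accept with probability min(1, exp[(beta_i - beta_j) * (E_i - E_j)]).
    delta = (beta_i - beta_j) * (E_i - E_j)
    return delta >= 0 or rng.random() < math.exp(delta)

# Hypothetical temperature ladder and potential energies (kcal/mol).
kB = 0.0019872041  # Boltzmann constant in kcal/(mol K)
temps = [300.0, 330.0, 363.0, 400.0]
energies = [-210.0, -195.0, -181.0, -166.0]
for i in range(len(temps) - 1):
    beta_i, beta_j = 1.0 / (kB * temps[i]), 1.0 / (kB * temps[i + 1])
    if rem_swap_accepted(beta_i, beta_j, energies[i], energies[i + 1]):
        # Exchanging configurations is equivalent to exchanging their energies here.
        energies[i], energies[i + 1] = energies[i + 1], energies[i]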
Abstract:
Direct sampling methods are increasingly being used to solve the inverse medium scattering problem of estimating the shape of the scattering object. A simple direct method using one incident wave and multiple measurements was proposed by Ito, Jin and Zou. In this report, we perform analytic and numerical studies of the direct sampling method. The method was found to be effective in general; however, the investigation exposed a few exceptions. Analytic solutions in different situations were studied to verify the viability of the method, while numerical tests were used to validate its effectiveness.
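The abstract does not reproduce the indicator function, so the following Python sketch assumes the commonly used form for this class of methods in 2D: the normalized inner product of the measured scattered field with the Helmholtz fundamental solution, evaluated over a grid of sampling points. Details of Ito, Jin and Zou's formulation (measurement curve, normalization, dimension) may differ.

import numpy as np
from scipy.special import hankel1

def direct_sampling_index(us, receivers, grid_pts, k):
    # us        : complex scattered field measured at the receiver points (length M)
    # receivers : (M, 2) receiver coordinates on the measurement curve
    # grid_pts  : (N, 2) sampling points z at which the indicator is evaluated
    # k         : wavenumber
    # G(x, z) = (i/4) * H0^(1)(k |x - z|) is the 2D Helmholtz fundamental solution.
    index = np.empty(len(grid_pts))
    for n, z in enumerate(grid_pts):
        r = np.linalg.norm(receivers - z, axis=1)
        G = 0.25j * hankel1(0, k * r)
        index[n] = abs(np.vdot(G, us)) / (np.linalg.norm(G) * np.linalg.norm(us))
    return index  # larger values suggest z lies inside or near the scatterer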
Abstract:
Peatlands cover only ~3% of the global land area but store ~30% of the world's soil carbon. There are many different peat types that store different amounts of carbon. Most inventories of carbon storage in northern peatlands have been conducted in the expansive Sphagnum-dominated peatlands. However, northern white cedar (NW cedar, Thuja occidentalis L.) peatlands are also among the most common peatland types in the Great Lakes Region, occupying more than 2 million hectares. NW cedar swamps are understudied, due in part to the difficulties of collection methods. A general lack of rapid and consistent sampling methods has also contributed to the lack of carbon stock quantification for many peatlands. The main objectives of this thesis are: 1) to evaluate peat sampling methods, and 2) to quantify the amount of carbon stored and the rates of long-term carbon accumulation in NW cedar peatlands. We sampled 38 peatlands separated into four categories (black ash, NW cedar swamp, sedge, and Sphagnum) during the summers of 2011 and 2012 across northern Minnesota and the Upper Peninsula of Michigan. Basal dates of peat indicate that cedar peatlands were between 1970 and 7790 years old. Cedar peatlands are generally shallower than Sphagnum peatlands, but due to their higher bulk density they hold similar amounts of carbon, with our sites averaging ~800 Mg C ha-1. We estimate that NW cedar peatlands store over 1.7 Gt of carbon in the Great Lakes Region. Each of the six methods evaluated had a different level of accuracy and required varying levels of effort and resources. The depth-only and intermittent sampling methods were the most accurate methods of peatland sampling.
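As a rough consistency check on the regional figure (treating the ~800 Mg C ha-1 site average and the >2 million ha extent as the only inputs, which simplifies the thesis's actual scaling):

mean_stock = 800          # site average, Mg C per hectare (from the abstract)
cedar_area_ha = 2.0e6     # "more than 2 million hectares" in the Great Lakes Region
total_gt = mean_stock * cedar_area_ha / 1e9   # 1 Gt = 1e9 Mg
print(total_gt)           # ~1.6 Gt C, the same order as the ~1.7 Gt estimate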
Abstract:
Credible spatial information characterizing the structure and site quality of forests is critical to sustainable forest management and planning, especially given the increasing demands on and threats to forest products and services. Forest managers and planners are required to evaluate forest conditions over a broad range of scales, contingent on operational or reporting requirements. Traditionally, forest inventory estimates are generated via a design-based approach that involves generalizing sample plot measurements to characterize an unknown population across a larger area of interest. However, field plot measurements are costly, and as a consequence spatial coverage is limited. Remote sensing technologies have shown remarkable success in augmenting limited sample plot data to generate stand- and landscape-level spatial predictions of forest inventory attributes. Further enhancement of forest inventory approaches that couple field measurements with cutting-edge remotely sensed and geospatial datasets is essential to sustainable forest management. We evaluated a novel Random Forest-based k-Nearest Neighbors (RF-kNN) imputation approach to couple remote sensing and geospatial data with field inventory collected by different sampling methods to generate forest inventory information across large spatial extents. The forest inventory data collected by the FIA program of the US Forest Service were integrated with optical remote sensing and other geospatial datasets to produce biomass distribution maps for part of the Lake States and species-specific site index maps for the entire Lake States. Targeting small-area application of state-of-the-art remote sensing, LiDAR (light detection and ranging) data were integrated with field data collected by an inexpensive method, variable plot sampling, in Michigan Tech's Ford Forest to derive a standing volume map in a cost-effective way. The outputs of the RF-kNN imputation were compared with independent validation datasets and extant map products based on different sampling and modeling strategies. The RF-kNN modeling approach was found to be very effective, especially for large-area estimation, and produced results statistically equivalent to the field observations or the estimates derived from secondary data sources. The models are useful to resource managers for operational and strategic purposes.
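The abstract does not describe the RF-kNN implementation (software, distance definition, choice of k), so the following Python sketch assumes the commonly used proximity-based formulation: a Random Forest is trained on the reference plots, proximity between a target pixel and each plot is the fraction of trees in which they share a terminal node, and the attribute of the k most proximate plots is imputed. All names and settings are illustrative.

import numpy as np
from sklearn.ensemble import RandomForestRegressor

def rf_knn_impute(X_ref, y_ref, X_target, k=1, n_trees=500, seed=0):
    # X_ref, y_ref : predictors and measured attribute (e.g., biomass) at reference plots
    # X_target     : the same predictors at target pixels/stands to be imputed
    rf = RandomForestRegressor(n_estimators=n_trees, random_state=seed).fit(X_ref, y_ref)
    leaves_ref = rf.apply(X_ref)        # (n_ref, n_trees) terminal-node ids
    leaves_tgt = rf.apply(X_target)     # (n_target, n_trees)
    # Proximity = share of trees in which a target and a reference plot share a leaf.
    prox = (leaves_tgt[:, None, :] == leaves_ref[None, :, :]).mean(axis=2)
    nearest = np.argsort(-prox, axis=1)[:, :k]          # k most similar plots per target
    return np.asarray(y_ref)[nearest].mean(axis=1)      # impute their mean observed value

# For wall-to-wall rasters the target rows would be processed in chunks to keep
# the proximity matrix small.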
Abstract:
Over the past several decades, it has become apparent that anthropogenic activities have resulted in the large-scale enhancement of the levels of many trace gases throughout the troposphere. More recently, attention has been given to the transport pathway taken by these emissions as they are dispersed throughout the atmosphere. The transport pathway determines the physical characteristics of emissions plumes and therefore plays an important role in the chemical transformations that can occur downwind of source regions. For example, the production of ozone (O3) is strongly dependent upon the transport its precursors undergo. O3 can initially be formed within air masses while still over polluted source regions. These polluted air masses can experience continued O3 production or O3 destruction downwind, depending on the air mass's chemical and transport characteristics. At present, however, there are a number of uncertainties in the relationships between transport and O3 production in the North Atlantic lower free troposphere (FT). The first phase of the study presented here used measurements made at the Pico Mountain observatory and model simulations to determine transport pathways for US emissions to the observatory. The Pico Mountain observatory was established in the summer of 2001 in order to address the need to understand the relationships between transport and O3 production. Measurements from the observatory were analyzed in conjunction with simulations from the Lagrangian particle dispersion model (LPDM) FLEXPART in order to determine the transport pathways for events observed at the Pico Mountain observatory during July 2003. A total of 16 events were observed, 4 of which were analyzed in detail. The transport time for these 16 events varied from 4.5 to 7 days, while the transport altitudes over the ocean ranged from 2 to 8 km but were typically less than 3 km. In three of the case studies, eastward advection and transport in a weak warm conveyor belt (WCB) airflow were responsible for the export of North American emissions into the FT, while transport in the FT was governed by easterly winds driven by the Azores/Bermuda High (ABH) and transient northerly lows. In the fourth case study, North American emissions were lofted to 6-8 km in a WCB before being entrained in the same cyclone's dry airstream and transported down to the observatory. The results of this study show that the lower marine FT may provide an important transport environment where O3 production may continue, in contrast to transport in the marine boundary layer, where O3 destruction is believed to dominate. The second phase of the study presented here focused on improving the analysis methods that are available with LPDMs. While LPDMs are popular and useful for the analysis of atmospheric trace gas measurements, identifying the transport pathway of emissions from their source to a receptor (the Pico Mountain observatory in our case) using the standard gridded model output can be difficult or impossible, particularly during complex meteorological scenarios. The transport study in phase 1 was limited to only 1 month out of more than 3 years of available data, and included only 4 case studies out of the 16 events, specifically because of this confounding factor.
The second phase of this study addressed this difficulty by presenting a method to clearly and easily identify the pathway taken by only those emissions that arrive at a receptor at a particular time, by combining the standard gridded output from forward (i.e., concentration) and backward (i.e., residence time) LPDM simulations, greatly simplifying such analyses. The ability of the method to successfully determine the source-to-receptor pathway, restoring the Lagrangian information that is lost when the data are gridded, is demonstrated by comparing the pathway determined from this method with the particle trajectories from both the forward and backward models. A sample analysis is also presented, demonstrating that this method is more accurate and easier to use than existing methods based on standard LPDM products. Finally, we discuss potential future work that would be possible by combining the backward LPDM simulation with gridded data from other sources (e.g., chemical transport models) to obtain a Lagrangian sampling of the air that will eventually arrive at a receptor.
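The abstract does not state the combination rule explicitly; a minimal sketch, assuming the simplest case of an element-wise product of time-matched forward concentration and backward residence-time grids (so that only cells which both contain the tracer and contribute air to the receptor retain weight), might look like this in Python:

import numpy as np

def pathway_weight(forward_conc, backward_restime):
    # forward_conc    : gridded tracer concentration from the forward run,
    #                   shape (n_times, nz, ny, nx)
    # backward_restime: gridded residence time from the backward (receptor) run,
    #                   on the same grid and output times
    # The product is non-zero only where the plume overlaps air that later
    # arrives at the receptor, approximating the source-to-receptor pathway.
    w = forward_conc * backward_restime
    total = w.sum()
    return w / total if total > 0 else w   # normalize so the weights sum to 1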
Abstract:
The amount and type of ground cover is an important characteristic to measure when collecting soil disturbance monitoring data after a timber harvest. Estimates of ground cover and bare soil can be used for tracking changes in invasive species, plant growth and regeneration, woody debris loadings, and the risk of surface water runoff and soil erosion. A new method of assessing ground cover and soil disturbance, the Forest Soil Disturbance Monitoring Protocol (FSDMP), was recently published by the U.S. Forest Service. This protocol uses the frequency of cover types in small circular (15 cm) plots to compare the ground surface in pre- and post-harvest conditions. While both frequency and percent cover are common methods of describing vegetation, frequency has rarely been used to measure ground surface cover. In this study, three methods for assessing ground cover percent (step-point, 15 cm dia. circular plot, and 1 x 5 m visual plot estimates) were compared to the FSDMP frequency method. Results show that the FSDMP method provides significantly higher estimates of ground surface condition for most soil cover types, except coarse wood. The three cover methods had similar estimates for most cover values. The FSDMP method also produced the highest value when bare soil estimates were used to model erosion risk. In a person-hour analysis, estimating ground cover percent in 15 cm dia. plots required the least sampling time and provided standard errors similar to the other cover estimates, even at low sampling intensities (n=18). If ground cover estimates are desired in soil monitoring, then a small plot size (15 cm dia. circle) or a step-point method can provide a more accurate estimate in less time than the current FSDMP method.
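A toy illustration of why a frequency-based metric tends to sit above a mean percent-cover estimate (any occurrence within a plot counts fully toward frequency, however small its areal share); the numbers below are synthetic and not from the study:

import numpy as np

rng = np.random.default_rng(0)
present = rng.random(60) < 0.7                      # bare soil occurs in ~70% of 60 small plots
true_cover = present * rng.beta(0.5, 4.0, 60) * 100  # percent cover within each plot, skewed low

mean_percent_cover = true_cover.mean()               # areal-cover style estimate, %
frequency = (true_cover > 0).mean() * 100            # share of plots containing any bare soil, %
print(round(mean_percent_cover, 1), round(frequency, 1))  # frequency exceeds mean cover here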
Abstract:
In the realm of computer programming, the experience of writing a program is used to reinforce concepts and evaluate ability. This research uses three case studies to evaluate the introduction of testing through Kolb's Experiential Learning Model (ELM). We then analyze the impact of those testing experiences to determine methods for improving future courses. The first testing experience that students encounter is unit test reports in their early courses. This course demonstrates that automating and improving feedback can provide more ELM iterations. The JUnit Generation (JUG) tool also provided a positive experience for the instructor by reducing the overall workload. Later, undergraduate and graduate students have the opportunity to work together in a multi-role Human-Computer Interaction (HCI) course. The interactions use usability analysis techniques, with graduate students as usability experts and undergraduate students as design engineers. Students get experience testing the user experience of their product prototypes using methods varying from heuristic analysis to user testing. From this course, we learned the importance of the instructor's role in the ELM. As more roles were added to the HCI course, a desire arose to provide more complete, quality-assured software. This inspired the addition of unit testing experiences to the course. However, we learned that significant preparations must be made to apply the ELM when students are resistant. The research presented through these courses was driven by the recognition of a need for testing in a Computer Science curriculum. Our understanding of the ELM suggests the need for student experience when introducing testing concepts. We learned that experiential learning, when appropriately implemented, can provide benefits to the Computer Science classroom. When examined together, these course-based research projects provided insight into building strong testing practices into a curriculum.
Abstract:
Anthropogenic activities have increased phosphorus (P) loading in tributaries to the Laurentian Great Lakes, resulting in eutrophication in areas ranging from small bays to, most notably, Lake Erie. Changes to surface water quality from P loading have resulted in billions of dollars in damage and threaten the health of the world's largest freshwater resource. To understand the factors affecting P delivery under projected increases in urban land and biofuels expansion, two spatially explicit models were coupled. The coupled models predict that the majority of the basin will experience a significant increase in urban P sources while the agricultural intensity and forest sources of P will decrease. Changes in P loading across the basin will be highly variable spatially. Additionally, the impacts of climate change on high precipitation events across the Great Lakes were examined. Using historical regression relationships for phosphorus concentrations, key Great Lakes tributaries were found to face future changes including decreasing total loads and increases in high-flow loading events. The urbanized Cuyahoga watershed exhibits the most vulnerability to these climate-induced changes, with increases in total loading and storm loading, while the forested Au Sable watershed exhibits greater resilience. Finally, the monitoring network currently in place for sampling the amount of phosphorus entering the U.S. Great Lakes was examined through interviews, with a focus on the challenges to monitoring. Based on these interviews, the research identified three issues that policy makers interested in maintaining an effective phosphorus monitoring network in the Great Lakes should consider: first, that the policy objectives driving different monitoring programs vary, which results in different patterns of sampling design and frequency; second, that these differences complicate efforts to encourage collaboration; and third, that methods of funding sampling programs vary from agency to agency, further complicating efforts to generate sufficient long-term data to improve our understanding of phosphorus loading into the Great Lakes. The dissertation combines these three areas of research to present the potential future impacts of P loading in the Great Lakes as anthropogenic activities, climate, and monitoring change. These manuscripts report new data on future sources, loading, and climate impacts on phosphorus.
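The form of the historical regression is not given in the abstract; a minimal sketch, assuming the common log-log concentration-discharge rating curve used to project loads from flow, could look like this in Python (variable names and units are illustrative):

import numpy as np

def fit_rating_curve(Q, C):
    # Fit log10(C) = a + b * log10(Q) from paired discharge (Q) and
    # phosphorus concentration (C) observations.
    b, a = np.polyfit(np.log10(Q), np.log10(C), 1)
    return a, b

def estimate_loads(Q_future, a, b):
    # Load = concentration * discharge, in whatever consistent units the
    # caller supplies; retransformation bias correction is omitted here.
    C_hat = 10 ** (a + b * np.log10(Q_future))
    return C_hat * Q_future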
Abstract:
Chapter 1. The action research in this report focused on improving students' reading comprehension of expository text, specifically identifying the main idea and supporting details. Students were given an expository text to read and asked to identify the main idea and 2-3 supporting details as a pre-assessment. Students were then provided instruction and support in the DRTA (Directed Reading Thinking Activity) and SQ3R (Survey, Question, Read, Recite, Review) methodologies to identify the main idea and supporting details of a selected expository text for both the pre- and post-test. Results were compiled and analyzed for the effectiveness of the strategies, measured by overall student growth in accurately identifying the main idea and stating at least 2 supporting details. Analysis of the data shows that the methods were effective in improving middle school students' ability to read and extract the necessary information from expository text. Chapter 2 is a reflective essay on the MiTEP (Michigan Teacher Excellence Program) and its impact on my teaching practices, lesson delivery, and leadership development.