12 results for Digital mapping -- Case studies -- Congresses

in Digital Commons - Michigan Tech


Relevance: 100.00%

Abstract:

Forested wetlands throughout the world are valuable habitats; especially in relatively species-poor northern regions, they can be considered biological hotspots. Unfortunately, these areas have been degraded and destroyed. In recent years, however, the biological importance of wetlands has been increasingly recognized, resulting in the desire to restore disturbed habitats or to create new ones in place of those destroyed. Restoration work is taking place across the globe in a diversity of wetland types, and research must be conducted to determine successful techniques. Accordingly, two studies of the effects of wetland restoration and creation were conducted in forested wetlands in northern Michigan and southern Finland. In North America, northern white-cedar wetlands have been declining in area despite attempts to regenerate them. Improved methods for successfully establishing northern white-cedar are needed; therefore, the goal of the first study was to determine whether creating microtopography could benefit white-cedar recruitment and growth. In northern Europe, spruce swamp forests have become a threatened ecosystem due to extensive drainage for forestry. As part of the restoration of these habitats, i.e., rewetting through ditch blocking, Sphagnum mosses are considered a critical element to re-establish, yet an in-depth analysis of how Sphagnum responds to restoration in spruce swamp forests has not previously been conducted. Thus, the aim of the second study was to investigate the ecophysiological functioning of Sphagnum and feather mosses across a gradient of pristine, drained, and restored boreal spruce swamp forests.

Relevance: 100.00%

Abstract:

Former Finnish Minister of Foreign Affairs Rudolf Holsti ended his professional career as a professor at Stanford University. In spring 1941, he encountered a news article on Alexandra Kropotkina and was encouraged to send her a letter. In this letter, Holsti revealed his admiration for her father, the "anarchist prince" Pjotr Kropotkin. Holsti's letter to Alexandra Kropotkina further related that, as foreign minister, he had even sent food from the Finnish embassy in Moscow to Kropotkin while he was being held in custody by the Soviet authorities. The notion of an anarchist foreign minister is profoundly paradoxical, and the aim of my research is to identify Kropotkin's influence on Holsti's work and publications. Before entering politics, Holsti defended his PhD thesis at the University of Helsinki in 1913 on a rather anarchist theme, "The Relation of War to the Origin of the State." My paper and presentation will attempt to answer the question: how are Kropotkin's ideas present in Holsti's academic work? In addition, Holsti and Kropotkin are case studies that guide my interest in the correlation between the scientific revolution and social thinking in the 19th century.

Relevance: 100.00%

Abstract:

Sustainable management of solid waste is a global concern, as exemplified by the United Nations Millennium Development Goals (MDG) that 191 member states support. The seventh MDG indirectly advocates for municipal solid waste management (MSWM) by aiming to integrate environmental sustainability into countries' policies and programs and to reverse negative environmental impacts. Proper MSWM will likely result in relieving poverty, reducing child mortality, improving maternal health, and preventing disease, which are MDG goals one, four, five, and six, respectively (UNMDG, 2005). Solid waste production is increasing worldwide as the global society strives to obtain a decent quality of life. Several means exist by which the amount of solid waste going to a landfill can be reduced, such as incineration with energy production, composting of organic wastes, and material recovery through recycling, all of which are considered sustainable methods by which to manage municipal solid waste (MSW). In the developing world, composting is already a widely accepted method to reduce waste destined for the landfill, and incineration for energy recovery can be a costly capital investment for most communities. Therefore, this research focuses on recycling as a solution to the MSW production problem while considering the three dimensions of sustainability: environment, society, and economy. First, twenty-three developing country case studies were quantitatively and qualitatively examined for aspects of municipal solid waste management. The MSW generation and recovery rates, as well as the waste composition, were compiled and assessed. The average MSW generation rate was 0.77 kg/person/day, with recovery rates varying from 5% to 40%. The waste streams of nineteen of these case studies consisted of 0-70% recyclable material and 17-80% organic material. All twenty-three case studies were analyzed qualitatively by identifying any barriers or incentives to recycling, which justified the creation of twelve factors influencing sustainable MSWM in developing countries. The presence of regulations, enforcement of laws, and use of incentive schemes constitute the first factor, Government Policy. The cost of MSWM operations, the budget allocated to MSWM by local to national governments, and the stability and reliability of funds comprise the Government Finances factor influencing recycling in the developing world. Many case studies indicated that understanding features of a waste stream, such as the generation and recovery rates and the composition, is the first step in determining proper management solutions, which forms the third factor, Waste Characterization. The presence and efficiency of waste collection and segregation by scavengers, municipalities, or private contractors were commonly addressed by the case studies, which justified Waste Collection and Segregation as the fourth factor. Knowledge of MSWM and an understanding of the linkages between human behavior, waste handling, and health, sanitation, and the environment comprise the Household Education factor. Individuals' income influencing waste-handling behavior (e.g., reuse, recycling, and illegal dumping), the presence of waste collection/disposal fees, and residents' willingness to pay were seen as among the biggest incentives to recycling, which justified combining them into the Household Economics factor.
The MSWM Administration factor was formed following several references to the presence and effectiveness of private and/or public management of waste through collection, recovery, and disposal influencing recycling activity. Although the MSWM Personnel Education factor was recognized by only six of the twenty-two case studies, the lack of trained laborers and skilled professionals in MSWM positions was a barrier to sustainable MSWM in every case but one. The presence and effectiveness of a comprehensive, integrative, long-term MSWM strategy was highly encouraged by every case study that addressed the tenth factor, MSWM Plan. Although seemingly a subset of private MSWM administration, the existence and profitability of market systems relying on recycled-material throughput, and the involvement of small businesses, middlemen, and large industries/exporters, deserve their own factor, Local Recycled-Material Market. The availability and effective use of technology and/or a human workforce, and the safety considerations of each, were recurrent enough barriers and incentives to recycling to warrant the Technological and Human Resources factor. The Land Availability factor takes into consideration land attributes such as terrain, ownership, and development, which can oftentimes dictate MSWM. Understanding the relationships among the twelve factors influencing recycling in developing countries made apparent the collaborative nature required of sustainable MSWM. The factors requiring the greatest collaborative inputs include Waste Collection and Segregation, MSWM Plan, and Local Recycled-Material Market. Aligning each factor with the societal, environmental, and economic dimensions of sustainability revealed the motives behind the institutions contributing to each factor. A correlation between stakeholder involvement and sustainability existed, as supported by the fact that the only three factors driven by all three dimensions of sustainability were the same three that required the greatest collaboration with other factors. With increasing urbanization, the advocacy of improved health for all through the MDG, and changing consumption patterns resulting in larger and more complex waste streams, the collaboration web offered by this research is needed more than ever in the developing world. Through its use, the institutions associated with each of the twelve factors can achieve a better understanding of the collaboration necessary and beneficial for more sustainable MSWM.

Relevance: 100.00%

Abstract:

Information management is a key aspect of successful construction projects. Having inaccurate measurements and conflicting data can lead to costly mistakes, and vague quantities can ruin estimates and schedules. Building information modeling (BIM) augments a 3D model with a wide variety of information, which reduces many sources of error and can detect conflicts before they occur. Because new technology is often more complex, it can be difficult to integrate effectively with existing business practices. In this paper, we answer two questions: how can BIM add value to construction projects, and what lessons can be learned from other companies that use BIM or similar technology? Previous research focused on the technology as if it were simply a tool, observing problems that occurred while integrating new technology into existing practices. Our research instead looks at the flow of information through a company and its network, viewing all the actors as part of an ecosystem. Building upon this idea, we propose the metaphor of an information supply chain to illustrate how BIM can add value to a construction project. The paper concludes with two case studies. The first case study illustrates a failure in the flow of information that could have been prevented by using BIM. The second case study profiles a leading design firm that has used BIM products for many years and shows the real benefits of using this technology.

Relevance: 100.00%

Abstract:

This dissertation is a report on a collaborative project between the Computer Science and Humanities Departments to develop case studies that focus on issues of communication in the workplace, and on the results of their use in the classroom. My argument is that case study teaching simulates real-world experience in a meaningful way, essentially offering a teachable means of developing phronesis, the reasoned capacity to act for the good in public. In addition, the dissertation can be read as a "how-to" guide for educators who may wish to construct their own case studies. To that end, I have included a discussion of the ethnographic methodologies employed and how they were adapted to our more pragmatic ends. Finally, I present my overarching argument for a new appraisal of the concept of techné. This reappraisal emphasizes its productive activity, poiesis, rather than its knowledge, as has been the case in the past. I propose that focusing on the telos, the end outside the production, contributes to the diminishment, if not complete foreclosure, of a rich concept of techné.

Relevance: 100.00%

Abstract:

Writing center scholarship and practice have approached how issues of identity influence communication but have not fully considered ways of making identity a key feature of writing center research or practice. This dissertation suggests a new way to view identity -- through an experience of "multimembership," the consideration that each identity is constructed from the numerous community memberships that make up that identity. Etienne Wenger (1998) proposes that a fully formed identity is ultimately impossible, but it is through the work of reconciling memberships that important individual and community transformations can occur. Wenger also argues that reconciliation "is the most significant challenge" for those moving into new communities of practice (groups that "engage in a process of collective learning in a shared domain of human endeavor" (4)), yet this challenge often remains tacit. This dissertation therefore examines and makes explicit how this important work is done at two different research sites -- a university writing center (the Michigan Tech Multiliteracies Center) and a multinational corporation (Kimberly-Clark Corporation). Drawing extensively on qualitative ethnographic methods, including interview transcriptions, observations, and case studies, as well as on work from scholars in writing center studies (Grimm, Denney, Severino), literacy studies (New London Group, Street, Gee), composition (Horner and Trimbur, Canagarajah, Lu), rhetoric (Crowley), and identity studies (Anzaldua, Pratt), I argue that, based on evidence from the two sites, writing centers need to educate tutors not only to take identity into consideration but also to make individuals' reconciliation work more visible, as it will continue once students and tutors leave the university. Further, as my research at the Michigan Tech Multiliteracies Center and Kimberly-Clark will show, communities can (and should) change their practices in ways that account for reconciliation work, as identity, communication, and learning are inextricably bound up with one another.

Relevance: 100.00%

Abstract:

The problem of optimal design of multi-gravity-assist space trajectories with a free number of deep-space maneuvers (MGADSM) poses multi-modal cost functions. In the general form of the problem, the number of design variables is solution dependent. To handle global optimization problems in which the number of design variables varies from one solution to another, two novel genetic-based techniques are introduced: the hidden genes genetic algorithm (HGGA) and the dynamic-size multiple-population genetic algorithm (DSMPGA). In HGGA, a fixed length for the design variables is assigned to all solutions. The independent variables of each solution are divided into effective and ineffective (hidden) genes. Hidden genes are excluded from cost function evaluations. Full-length solutions undergo standard genetic operations. In DSMPGA, sub-populations of fixed-size design spaces are randomly initialized. Standard genetic operations are carried out for a stage of generations. A new population is then created by reproduction from all members based on their relative fitness. The resulting sub-populations have different sizes from their initial sizes. The process repeats, increasing the size of the sub-populations of more-fit solutions. Both techniques are applied to several MGADSM problems. They have the capability to determine the number of swing-bys, the planets to swing by, launch and arrival dates, and the number of deep-space maneuvers, as well as their locations, magnitudes, and directions, in an optimal sense. The results show that solutions obtained using the developed tools match known solutions for complex case studies. The HGGA is also used to obtain the asteroid sequence and the mission structure in the Global Trajectory Optimization Competition (GTOC) problem. As an application of GA optimization to Earth orbits, the problem of visiting a set of ground sites within a constrained time frame is solved. The J2 perturbation and zonal coverage are considered to design repeated Sun-synchronous orbits. Finally, a new set of orbits, the repeated shadow track orbits (RSTO), is introduced. The orbit parameters are optimized such that the shadow of a spacecraft on the Earth visits the same locations periodically, every desired number of days.
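
For illustration only, here is a minimal sketch of the hidden-genes idea described above, written in Python for a toy problem; the cost function, gene bounds, and the convention that the first gene encodes the number of effective genes are assumptions made for this example, not the dissertation's actual MGADSM formulation.

```python
import random

# Minimal illustrative hidden-genes GA (toy cost model, not the MGADSM problem).
# Each chromosome has a fixed length; the first gene encodes how many of the
# remaining genes are "effective" -- the rest are hidden and ignored by the cost.

CHROM_LEN = 8          # fixed number of design variables per solution
POP_SIZE = 40
GENERATIONS = 100

def cost(chrom):
    """Toy cost: sum of squares of the effective genes only (hidden genes excluded)."""
    n_effective = chrom[0]
    return sum(x * x for x in chrom[1:1 + n_effective])

def random_chrom():
    n_effective = random.randint(1, CHROM_LEN - 1)
    return [n_effective] + [random.uniform(-5.0, 5.0) for _ in range(CHROM_LEN - 1)]

def crossover(a, b):
    """Standard one-point crossover on the full-length chromosomes."""
    point = random.randint(1, CHROM_LEN - 1)
    return a[:point] + b[point:]

def mutate(chrom, rate=0.1):
    out = chrom[:]
    if random.random() < rate:
        out[0] = random.randint(1, CHROM_LEN - 1)   # the number of effective genes can evolve
    for i in range(1, CHROM_LEN):
        if random.random() < rate:
            # hidden genes still mutate; they only matter if they later become effective
            out[i] = random.uniform(-5.0, 5.0)
    return out

population = [random_chrom() for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    population.sort(key=cost)
    parents = population[:POP_SIZE // 2]            # simple truncation selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children

best = min(population, key=cost)
print("best cost:", cost(best), "effective genes:", best[0])
```

The point of the sketch is that crossover and mutation always act on the full-length chromosome, while the cost function simply ignores whichever genes are currently hidden.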

Relevance: 100.00%

Abstract:

In the realm of computer programming, the experience of writing a program is used to reinforce concepts and evaluate ability. This research uses three case studies to evaluate the introduction of testing through Kolb's Experiential Learning Model (ELM). We then analyze the impact of those testing experiences to determine methods for improving future courses. The first testing experience that students encounter is unit test reports in their early courses. This course demonstrates that automating and improving feedback can provide more ELM iterations. The JUnit Generation (JUG) tool also provided a positive experience for the instructor by reducing the overall workload. Later, undergraduate and graduate students have the opportunity to work together in a multi-role Human-Computer Interaction (HCI) course. The interactions use usability analysis techniques, with graduate students as usability experts and undergraduate students as design engineers. Students get experience testing the user experience of their product prototypes using methods ranging from heuristic analysis to user testing. From this course, we learned the importance of the instructor's role in the ELM. As more roles were added to the HCI course, a desire arose to provide more complete, quality-assured software. This inspired the addition of unit testing experiences to the course. However, we learned that significant preparations must be made to apply the ELM when students are resistant. The research presented through these courses was driven by the recognition of a need for testing in a Computer Science curriculum. Our understanding of the ELM suggests that students need hands-on experience when being introduced to testing concepts. We learned that experiential learning, when appropriately implemented, can provide benefits to the Computer Science classroom. When examined together, these course-based research projects provide insight into building strong testing practices into a curriculum.
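
As a loose illustration of the kind of automated unit-test feedback described above: the JUG tool itself generates JUnit tests in Java, but an analogous per-test report can be sketched with Python's unittest; the function under test and the assertions below are invented for this example.

```python
import unittest

def median(values):
    """Student-submitted function under test (hypothetical example)."""
    ordered = sorted(values)
    mid = len(ordered) // 2
    if len(ordered) % 2:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2

class MedianFeedbackTests(unittest.TestCase):
    """The kind of generated test report students would see on each iteration."""

    def test_odd_length(self):
        self.assertEqual(median([3, 1, 2]), 2)

    def test_even_length(self):
        self.assertEqual(median([4, 1, 3, 2]), 2.5)

    def test_single_element(self):
        self.assertEqual(median([7]), 7)

if __name__ == "__main__":
    # verbosity=2 prints one line per test, mirroring the per-assertion
    # feedback that drives repeated experiential-learning iterations
    unittest.main(verbosity=2)
```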

Relevance: 100.00%

Abstract:

Disturbances in power systems may lead to electromagnetic transient oscillations due to a mismatch between mechanical input power and electrical output power. Out-of-step conditions in a power system are common after disturbances in which the oscillations do not damp out and the system becomes unstable. Existing out-of-step detection methods are system-specific, as extensive off-line studies are required to set the relays. Most of the existing algorithms also require network reduction techniques in order to be applied to multi-machine power systems. To overcome these issues, this research applies Phasor Measurement Unit (PMU) data and Zubov's approximation stability boundary method, a modification of Lyapunov's direct method, to develop a novel out-of-step detection algorithm. The proposed out-of-step detection algorithm is tested on a single-machine infinite-bus system and on the IEEE 3-machine 9-bus and IEEE 10-machine 39-bus systems. Simulation results show that the proposed algorithm is capable of detecting out-of-step conditions in multi-machine power systems without using network reduction techniques, and a comparative study with an existing blinder method demonstrates that the decision times are faster. The simulation case studies also demonstrate that the proposed algorithm does not depend on power system parameters, hence it avoids the extensive off-line system studies required by other algorithms.
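
A minimal conceptual sketch of energy-based out-of-step screening on PMU samples is given below. It substitutes the classical single-machine infinite-bus energy function for the Zubov-approximated stability boundary, since the abstract does not give the actual boundary construction, and all parameter values are hypothetical.

```python
import math

# Hypothetical SMIB parameters (per unit / seconds); the proposed algorithm derives
# its stability boundary from Zubov's method rather than this classical energy function.
M = 0.1        # inertia constant 2H/omega_s
P_M = 0.8      # mechanical input power
P_MAX = 1.5    # maximum electrical power transfer
DELTA_S = math.asin(P_M / P_MAX)      # stable equilibrium angle
V_CR = 1.2     # assumed critical energy defining the stability boundary

def transient_energy(delta, omega):
    """Classical SMIB energy function: kinetic + potential energy relative to the SEP."""
    kinetic = 0.5 * M * omega ** 2
    potential = -P_M * (delta - DELTA_S) - P_MAX * (math.cos(delta) - math.cos(DELTA_S))
    return kinetic + potential

def out_of_step(pmu_samples):
    """Flag an out-of-step condition when the measured trajectory leaves the boundary.

    pmu_samples: iterable of (rotor_angle_rad, speed_deviation_rad_per_s) pairs,
    e.g. estimated from synchronized PMU voltage/current phasors.
    """
    for delta, omega in pmu_samples:
        if transient_energy(delta, omega) > V_CR:
            return True
    return False

# Example: a swing that stays inside the boundary vs. one that escapes it
stable_swing = [(DELTA_S + 0.3, 0.5), (DELTA_S + 0.5, 0.2)]
unstable_swing = [(DELTA_S + 1.2, 4.0)]
print(out_of_step(stable_swing), out_of_step(unstable_swing))
```

In the proposed algorithm the boundary would instead come from Zubov's approximation, but the decision logic sketched here -- flag out-of-step when the measured trajectory leaves the estimated stability region -- is the same in spirit.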

Relevance: 100.00%

Abstract:

In this report, we attempt to define the capabilities of the infrared satellite remote sensor on the Multifunctional Transport Satellite-2 (MTSAT-2), a geosynchronous instrument, in characterizing volcanic eruptive behavior in the highly active region of Indonesia. Sulfur dioxide data from NASA's Ozone Monitoring Instrument (OMI), a polar-orbiting instrument, are presented here to validate the processes interpreted from the thermal infrared datasets. Data from two case studies are analyzed specifically for eruptive products producing large thermal anomalies (e.g., lava flows and lava domes), volcanic ash, and SO2 clouds: three distinctly characteristic and abundant volcanic emissions. Two primary methods for detecting heat signatures are used and compared in this report: single-channel thermal radiance (4 µm) and the normalized thermal index (NTI) algorithm. For automated purposes, fixed thresholds must be determined for these methods. A base minimum detection limit (MDL) of 2.30 x 10^5 W m^-2 sr^-1 m^-1 for single-channel thermal radiance and of -0.925 for NTI generates false alarm rates of 35.78% and 34.16%, respectively. A spatial comparison method, developed here specifically for use in Indonesia and used as a second parameter for detection, is implemented to address the high false alarm rate. For the single-channel thermal radiance method, the spatial comparison method eliminated 100% of the false alarms while maintaining every true anomaly. The NTI algorithm showed similar results, with only two false alarms remaining. No definitive difference is observed between the two thermal detection methods for automated use; however, the single-channel thermal radiance method coupled with the SO2 mass abundance data can be used to interpret volcanic processes, including the identification of lava dome activity at Sinabung and the mechanism of dome emplacement (i.e., endogenous or exogenous). Only one technique, the brightness temperature difference (BTD) method, is used for the detection of ash. Trends of ash area, water/ice area, and their respective concentrations yield interpretations of increased ice formation, aggregation, and sedimentation processes that only a high-temporal-resolution instrument like MTSAT-2 can analyze. A conceptual model of a secondary zone of aggregation occurring in the migrating Kelut ash cloud, which decreases the distal fine-ash component and the hazards to flight paths, is presented in this report. Unfortunately, the SO2 data were unable to definitively reinforce the concept of a secondary zone of aggregation due to insufficient temporal resolution. However, a detailed study of the Kelut SO2 cloud is used to determine that there were no climatic impacts from this eruption, given the atmospheric residence time and e-folding rate of ~14 days for the SO2. This report exploits the complementary assets offered by a high-temporal-resolution and a high-spatial-resolution satellite, and it demonstrates that these two instruments can provide unparalleled observations of dynamic volcanic processes.
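
To make the two fixed-threshold screens concrete, the sketch below applies the standard NTI formulation from the hot-spot detection literature, NTI = (R_4um - R_12um) / (R_4um + R_12um), together with the thresholds quoted above. The neighbourhood test is only a stand-in for the thesis's spatial comparison method, whose exact form is not given in the abstract, and the synthetic scene values are hypothetical.

```python
import numpy as np

# Illustrative hot-pixel screening on a small gridded scene. The thresholds echo
# the abstract; the 1.5x local-background margin is an assumed placeholder for the
# thesis's spatial comparison method.

NTI_THRESHOLD = -0.925          # fixed NTI threshold quoted in the abstract
RAD_THRESHOLD = 2.30e5          # 4-um radiance MDL, W m^-2 sr^-1 m^-1
SPATIAL_MARGIN = 1.5            # hypothetical: pixel must exceed 1.5x its local background

def nti(rad_4um, rad_12um):
    return (rad_4um - rad_12um) / (rad_4um + rad_12um)

def local_background(field, i, j):
    """Mean of the surrounding pixels, used as a crude spatial reference."""
    window = field[max(i - 1, 0):i + 2, max(j - 1, 0):j + 2]
    return (window.sum() - field[i, j]) / (window.size - 1)

def detect_anomalies(rad_4um, rad_12um, method="radiance"):
    """Pixels passing the chosen fixed threshold AND the neighbourhood test.

    The abstract evaluates the two fixed-threshold methods separately; `method`
    selects "radiance" (single-channel 4 um) or "nti".
    """
    index = nti(rad_4um, rad_12um)
    hits = []
    rows, cols = rad_4um.shape
    for i in range(rows):
        for j in range(cols):
            if method == "radiance":
                passes_fixed = rad_4um[i, j] > RAD_THRESHOLD
            else:
                passes_fixed = index[i, j] > NTI_THRESHOLD
            passes_spatial = rad_4um[i, j] > SPATIAL_MARGIN * local_background(rad_4um, i, j)
            if passes_fixed and passes_spatial:
                hits.append((i, j))
    return hits

# Synthetic 5x5 scene with one hot pixel at (2, 2)
rad_4um = np.full((5, 5), 2.0e3)
rad_4um[2, 2] = 5.0e5
rad_12um = np.full((5, 5), 1.2e5)
print(detect_anomalies(rad_4um, rad_12um, method="radiance"))
print(detect_anomalies(rad_4um, rad_12um, method="nti"))
```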

Relevance: 100.00%

Abstract:

Over the past several decades, it has become apparent that anthropogenic activities have resulted in the large-scale enhancement of the levels of many trace gases throughout the troposphere. More recently, attention has been given to the transport pathway taken by these emissions as they are dispersed throughout the atmosphere. The transport pathway determines the physical characteristics of emission plumes and therefore plays an important role in the chemical transformations that can occur downwind of source regions. For example, the production of ozone (O3) is strongly dependent upon the transport its precursors undergo. O3 can initially be formed within air masses while they are still over polluted source regions. These polluted air masses can experience continued O3 production or O3 destruction downwind, depending on the air mass's chemical and transport characteristics. At present, however, there are a number of uncertainties in the relationships between transport and O3 production in the North Atlantic lower free troposphere (FT). The first phase of the study presented here used measurements made at the Pico Mountain observatory and model simulations to determine transport pathways for US emissions to the observatory. The Pico Mountain observatory was established in the summer of 2001 in order to address the need to understand the relationships between transport and O3 production. Measurements from the observatory were analyzed in conjunction with simulations from the Lagrangian particle dispersion model (LPDM) FLEXPART in order to determine the transport pathway for events observed at the Pico Mountain observatory during July 2003. A total of 16 events were observed, 4 of which were analyzed in detail. The transport time for these 16 events varied from 4.5 to 7 days, while the transport altitudes over the ocean ranged from 2 to 8 km but were typically less than 3 km. In three of the case studies, eastward advection and transport in a weak warm conveyor belt (WCB) airflow were responsible for the export of North American emissions into the FT, while transport in the FT was governed by easterly winds driven by the Azores/Bermuda High (ABH) and transient northerly lows. In the fourth case study, North American emissions were lofted to 6-8 km in a WCB before being entrained in the same cyclone's dry airstream and transported down to the observatory. The results of this study show that the lower marine FT may provide an important transport environment in which O3 production may continue, in contrast to transport in the marine boundary layer, where O3 destruction is believed to dominate. The second phase of the study presented here focused on improving the analysis methods available with LPDMs. While LPDMs are popular and useful for the analysis of atmospheric trace gas measurements, identifying the transport pathway of emissions from their source to a receptor (the Pico Mountain observatory in our case) using the standard gridded model output can be difficult or impossible, particularly during complex meteorological scenarios. The transport study in phase 1 was limited to only 1 month out of more than 3 years of available data, and included only 4 case studies out of the 16 events, specifically because of this confounding factor.
The second phase of this study addressed this difficulty by presenting a method to clearly and easily identify the pathway taken by only those emissions that arrive at a receptor at a particular time, by combining the standard gridded output from forward (i.e., concentration) and backward (i.e., residence time) LPDM simulations, greatly simplifying similar analyses. The ability of the method to successfully determine the source-to-receptor pathway, restoring the Lagrangian information that is lost when the data are gridded, is demonstrated by comparing the pathway determined from this method with the particle trajectories from both the forward and backward models. A sample analysis is also presented, demonstrating that this method is more accurate and easier to use than existing methods based on standard LPDM products. Finally, we discuss potential future work that would be possible by combining the backward LPDM simulation with gridded data from other sources (e.g., chemical transport models) to obtain a Lagrangian sampling of the air that will eventually arrive at a receptor.
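
The abstract states that the forward (concentration) and backward (residence-time) gridded fields are combined but does not say how; the sketch below assumes a simple element-wise product, normalized at each output time, as one plausible way to highlight only the grid cells that both contain the tracked emissions and contribute to the receptor. The grid layout and values are invented for the example.

```python
import numpy as np

# Illustrative combination of standard gridded LPDM outputs on a common
# (time, lat, lon) grid. The element-wise product is an assumption made here
# for illustration, not necessarily the method developed in the dissertation.

def pathway_field(forward_conc, backward_restime):
    """Highlight grid cells that are both filled by the tracked emissions (forward)
    and destined for the receptor (backward) at each output time."""
    combined = forward_conc * backward_restime
    totals = combined.sum(axis=(1, 2), keepdims=True)
    # avoid division by zero at time steps where the two fields do not overlap
    return np.divide(combined, totals, out=np.zeros_like(combined), where=totals > 0)

# Tiny synthetic example: 2 time steps on a 3x3 grid
forward = np.zeros((2, 3, 3))
backward = np.zeros((2, 3, 3))
forward[0, 0, 0] = 1.0      # plume over the source region...
backward[0, 0, 0] = 0.5     # ...which also contributes to the receptor
forward[1, 1, 1] = 2.0      # plume has moved downwind
backward[1, 1, 1] = 0.25
print(pathway_field(forward, backward)[0])
print(pathway_field(forward, backward)[1])
```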

Relevance: 100.00%

Abstract:

Computer simulation programs are essential tools for scientists and engineers seeking to understand a particular system of interest. As expected, the complexity of the software increases with the depth of the model used. In addition to the exigent demands of software engineering, verification of simulation programs is especially challenging because the models represented are complex and riddled with unknowns that are discovered by developers in an iterative process. To manage such complexity, advanced verification techniques for continually matching the intended model to the implemented model are necessary. Therefore, the main goal of this research is to design a useful verification and validation framework that is able to identify model representation errors and is applicable to generic simulators. The framework that was developed and implemented consists of two parts. The first part is the First-Order Logic Constraint Specification Language (FOLCSL), which enables users to specify the invariants of a model under consideration. From the first-order logic specification, the FOLCSL translator automatically synthesizes a verification program that reads the event trace generated by a simulator and signals whether all invariants are respected. The second part consists of mining the temporal flow of events using a newly developed representation called the State Flow Temporal Analysis Graph (SFTAG). While the first part seeks assurance of implementation correctness by checking that the model invariants hold, the second part derives an extended model of the implementation and hence enables a deeper understanding of what was implemented. The main application studied in this work is the validation of the timing behavior of micro-architecture simulators. The study includes SFTAGs generated for a wide set of benchmark programs and their analysis using several artificial intelligence algorithms. This work improves computer architecture research and verification processes, as shown by the case studies and experiments that have been conducted.
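
As an illustration of the kind of trace-checking program such a framework would synthesize, the sketch below checks one temporal invariant over a simulator event trace; the event format, the invariant, and the code itself are invented for this example and do not reflect the actual FOLCSL syntax or its generated output.

```python
# Illustrative trace checker of the kind a constraint-specification tool might
# synthesize; the event schema and invariant are hypothetical.

from dataclasses import dataclass

@dataclass
class Event:
    time: int
    kind: str       # e.g. "issue" or "commit"
    instr_id: int

def check_issue_before_commit(trace):
    """Example invariant: every 'commit' of an instruction is preceded by its 'issue'.
    Returns the list of violating events (an empty list means the invariant is respected)."""
    issued = set()
    violations = []
    for event in sorted(trace, key=lambda e: e.time):
        if event.kind == "issue":
            issued.add(event.instr_id)
        elif event.kind == "commit" and event.instr_id not in issued:
            violations.append(event)
    return violations

trace = [
    Event(time=1, kind="issue", instr_id=7),
    Event(time=3, kind="commit", instr_id=7),
    Event(time=4, kind="commit", instr_id=9),   # never issued: should be flagged
]
print(check_issue_before_commit(trace))
```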