996 results for Michigan Tech


Relevance: 60.00%

Abstract:

Smallholders in eastern Paraguay plant small stands of Eucalyptus grandis W. Hill ex Maiden intended for sale on the local market. They have been encouraged to plant E. grandis by local forestry extension agents who offer both forestry education and incentive programs, and smallholders who practice recommended forestry techniques geared towards growing large-diameter trees of good form are financially rewarded by the local markets, which desire saw-log-quality trees. The question was posed: are smallholders engaging in recommended silvicultural practices and producing reasonable volume yields? It was hypothesized that smallholders, having received forestry education and having financial incentives from the local market, would engage in silvicultural practices resulting in trees of good form and in volume yields reasonable for the local climate and soil characteristics. Volume yield results from this study support this hypothesis. Mean volume yield was estimated at 70 cubic meters per hectare at age four and 225 cubic meters per hectare at age eight. These volume yields compare favorably to those from other studies of E. grandis grown in similar climates with similar stocking levels and site qualities.
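For context, the reported volumes imply mean annual increments (a standard stand-level yield metric; the study's own comparison basis is not given here) that can be checked with a line of arithmetic, sketched in Python:

    # Mean annual increment (MAI) implied by the reported stand volumes.
    yields = {4: 70.0, 8: 225.0}  # age (years) -> volume (m^3/ha), from the abstract
    for age, volume in yields.items():
        print(f"age {age}: MAI = {volume / age:.1f} m^3/ha/yr")
    # age 4: MAI = 17.5 m^3/ha/yr
    # age 8: MAI = 28.1 m^3/ha/yr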

Relevance: 60.00%

Abstract:

Wind energy has been one of the fastest-growing sectors of the nation’s renewable energy portfolio for the past decade, and the same tendency is projected for the upcoming years, given aggressive governmental policies for the reduction of fossil fuel dependency. So-called Horizontal Axis Wind Turbine (HAWT) technologies have shown great technological promise and outstanding commercial penetration, and with this acceptance the size of wind turbines has grown exponentially over time. However, safety and economic concerns have emerged as a result of new design tendencies toward massive-scale wind turbine structures with high slenderness ratios and complex shapes, typically located in remote areas (e.g. offshore wind farms). In this regard, safe operation requires not only first-hand information on actual structural dynamic conditions under aerodynamic action, but also a deep understanding of the environmental factors in which these multibody rotating structures operate. Given the cyclo-stochastic patterns of the wind loading exerting pressure on a HAWT, a probabilistic framework is appropriate to characterize the risk of failure in terms of resistance and serviceability conditions at any given time. Furthermore, sources of uncertainty such as material imperfections, buffeting and flutter, aeroelastic damping, gyroscopic effects, and turbulence, among others, call for a more sophisticated mathematical framework that can properly handle all these sources of indetermination. The modeling complexity that arises from these characterizations demands a data-driven experimental validation methodology to calibrate and corroborate the model. To this end, System Identification (SI) techniques offer a spectrum of well-established numerical methods, appropriate for stationary, deterministic, data-driven numerical schemes, capable of predicting actual dynamic states (eigenrealizations) of traditional time-invariant dynamic systems. Consequently, a modified data-driven SI metric is proposed, based on so-called Subspace Realization Theory and adapted for stochastic, non-stationary, time-varying systems, as is the case for a HAWT’s complex aerodynamics. Simultaneously, this investigation explores the characterization of the turbine loading and response envelopes for critical failure modes of the wind turbine’s structural components. In the long run, the aerodynamic framework (theoretical model) and the system identification (experimental model) will be merged in a numerical engine for model updating, formulated as a search algorithm known as Adaptive Simulated Annealing (ASA). This iterative engine is driven by a set of function minimizations of a metric called the Modal Assurance Criterion (MAC). In summary, the Thesis is composed of four major parts: (1) development of an analytical aerodynamic framework that predicts interacting wind-structure stochastic loads on wind turbine components; (2) development of a novel tapered-swept-curved Spinning Finite Element (SFE) that includes damped gyroscopic effects and axial-flexural-torsional coupling; (3) a novel data-driven structural health monitoring (SHM) algorithm via stochastic subspace identification methods; and (4) a numerical search (optimization) engine based on ASA and MAC, capable of updating the SFE aerodynamic model.
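The Modal Assurance Criterion mentioned above has a standard closed form, MAC(φa, φb) = |φaᴴφb|² / ((φaᴴφa)(φbᴴφb)), a normalized correlation between two mode-shape vectors. As a rough illustration only (a sketch, not the thesis's implementation), in Python:

    import numpy as np

    def mac(phi_a, phi_b):
        """Modal Assurance Criterion between two mode-shape vectors.
        Returns a value in [0, 1]; 1 means perfectly correlated shapes."""
        numerator = np.abs(np.vdot(phi_a, phi_b)) ** 2  # vdot conjugates phi_a
        denominator = np.vdot(phi_a, phi_a).real * np.vdot(phi_b, phi_b).real
        return numerator / denominator

In a model-updating loop such as the ASA engine described above, minimizing 1 - MAC between identified and model-predicted mode shapes is a natural choice of objective.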

Relevance: 60.00%

Abstract:

Volcán Pacaya is one of three currently active volcanoes in Guatemala. Its volcanic activity originates from the local tectonic subduction of the Cocos plate beneath the Caribbean plate along Guatemala’s Pacific coast. Pacaya is characterized by generally strombolian-type activity, with occasional larger vulcanian-type eruptions approximately every ten years. One particularly large eruption occurred on May 27, 2010. Using GPS data collected for approximately eight years before this eruption and for an additional three years afterwards, surface movement covering the period of the eruption can be measured and used as a tool to help understand activity at the volcano. Initial positions were obtained from raw data using the Automatic Precise Positioning Service provided by the NASA Jet Propulsion Laboratory. Forward modeling of observed 3-D displacements for three time periods (before, covering, and after the May 2010 eruption) revealed that a plausible source for the deformation is a vertical dike or planar surface trending NNW-SSE through the cone. The best-fitting models describe the deformation of the volcano in three distinct time periods: 0.45 m of right-lateral movement and 0.55 m of tensile opening along the dike from October 2001 through January 2009 (pre-eruption); 0.55 m of left-lateral slip along the dike from January 2009 through January 2011 (covering the eruption); and -0.025 m of dip slip along the dike from January 2011 through March 2013 (post-eruption). In all best-fit models the dike is oriented with a 75° westward dip. The three modeled periods have respective RMS misfit values of 5.49 cm, 12.38 cm, and 6.90 cm. During the time period that includes the eruption, the volcano most likely experienced a combination of slip and inflation below the edifice, which created a large scar at the surface down the northern flank of the volcano. All models indicate that the dipping dike may be experiencing a combination of inflation and oblique slip below the edifice, which raises the possibility of a westward collapse in the future.
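The RMS misfit figures quoted above summarize the disagreement between observed and modeled station displacements. Conventions vary; a minimal Python sketch of one common definition (the study's exact convention is an assumption here):

    import numpy as np

    def rms_misfit(observed, modeled):
        """RMS misfit between observed and modeled 3-D displacements,
        given as (n_stations, 3) arrays of east/north/up components."""
        residuals = (observed - modeled).ravel()
        return np.sqrt(np.mean(residuals ** 2))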

Relevance: 60.00%

Abstract:

File system security is fundamental to the security of UNIX and Linux systems, since in these systems almost everything takes the form of a file. To protect system files and other sensitive user files from unauthorized access, organizations choose and use particular security schemes in their computer systems. A file system security model provides a formal description of a protection system. Each security model is associated with specified security policies which focus on one or more of the security principles: confidentiality, integrity, and availability. A security policy is not only about “who” can access an object, but also about “how” a subject can access an object. To enforce the security policies, each access request is checked against the specified policies to decide whether it is allowed or rejected. The current protection schemes in UNIX/Linux systems focus on access control. Besides the system’s basic access control scheme, which includes permission bits, the setuid/seteuid mechanisms, and the root account, there are other protection models, such as Capabilities, Domain Type Enforcement (DTE), and Role-Based Access Control (RBAC), supported and used in certain organizations. These models protect the confidentiality of data directly; the integrity of data is protected indirectly, by only allowing trusted users to operate on the objects. The access control decisions of these models depend either on the identity of the user or on the attributes of the process the user can execute, together with the attributes of the objects. Adoption of these sophisticated models has been slow, likely due to the enormous complexity of specifying controls over a large file system and the need for system administrators to learn a new paradigm for file protection. We propose a new security model: the file system firewall. It adapts the familiar network firewall protection model, used to control the data that flows between networked computers, to file system protection. This model can support access control decisions based on any system-generated attributes of the access requests, e.g., time of day. Access control decisions are not based on a single entity, such as the account in traditional discretionary access control or the domain name in DTE; in the file system firewall, access decisions are made upon situations involving multiple entities. A situation is programmable, with predicates on the attributes of the subject, the object, and the system, and the file system firewall specifies the appropriate actions for these situations. We implemented a prototype of the file system firewall on SUSE Linux. Preliminary results of performance tests on the prototype indicate that the runtime overhead is acceptable. We compared the file system firewall with TE in SELinux to show that the firewall model can accommodate many other access control models. Finally, we show the ease of use of the firewall model. When the firewall is restricted to a specified part of the system, all other resources are unaffected, which enables a relatively smooth adoption. This, together with the model’s familiarity to system administrators, will facilitate adoption and correct use. The user study we conducted on traditional UNIX access control, SELinux, and the file system firewall confirmed this: beginner users found the firewall easier to use and faster to learn than the traditional UNIX access control scheme and SELinux.
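As a toy illustration of the "programmable situations" idea (a Python sketch, not the SUSE Linux kernel prototype; the attribute names and policy are hypothetical), a rule pairs a predicate over subject, object, and system attributes with an action:

    from dataclasses import dataclass
    from typing import Callable

    # A situation is a predicate over subject, object, and system attributes.
    Situation = Callable[[dict, dict, dict], bool]

    @dataclass
    class Rule:
        situation: Situation
        action: str  # "allow" or "deny"

    def decide(rules: list[Rule], subject: dict, obj: dict, system: dict) -> str:
        """Return the action of the first matching rule; default deny."""
        for rule in rules:
            if rule.situation(subject, obj, system):
                return rule.action
        return "deny"

    # Hypothetical policy: deny writes under /etc outside business hours.
    rules = [
        Rule(lambda s, o, sys: o["path"].startswith("/etc")
             and o["mode"] == "write" and not 9 <= sys["hour"] < 17, "deny"),
        Rule(lambda s, o, sys: True, "allow"),
    ]
    print(decide(rules, {"uid": 1000},
                 {"path": "/etc/passwd", "mode": "write"}, {"hour": 22}))  # deny

Note that the decision turns on a system-generated attribute (time of day) that traditional permission bits cannot express.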

Relevance: 60.00%

Abstract:

This study investigated the effect that the video game Portal 2 had on students’ understanding of Newton’s Laws and on their attitudes towards learning science during a two-week afterschool program at a science museum. Using a pre/posttest and survey design, along with instructor observations, the results showed a statistically significant increase in understanding of Newton’s Laws (p = .02 < .05) but did not detect a significant change in attitude scores. The data and observations suggest that future research should pay attention to non-educational aspects of video games, be careful about the amount of time students spend in the game, and encourage positive relationships with game developers.
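The abstract does not state which significance test produced p = .02; for a pre/posttest design, a paired t-test is one standard choice. A minimal Python sketch with invented scores, purely to show the mechanics:

    import numpy as np
    from scipy import stats

    # Hypothetical pre/post scores; the study's actual data are not reproduced here.
    pre = np.array([4, 5, 3, 6, 5, 4, 7, 5])
    post = np.array([6, 6, 4, 7, 7, 5, 8, 6])

    t_stat, p_value = stats.ttest_rel(post, pre)  # paired (dependent) t-test
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # "significant" if p < .05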

Relevance: 60.00%

Abstract:

FEAST is a recently developed eigenvalue algorithm which computes selected interior eigenvalues of real symmetric matrices. It uses contour-integral, resolvent-based projections. A weakness is that the existing algorithm relies on accurate, reasoned estimates of the number of eigenvalues within the contour. Examining the singular values of the projections on moderately sized, randomly generated test problems motivates orthogonalization-based improvements to the algorithm; the singular value distributions provide experimentally robust estimates of the number of eigenvalues within the contour. The algorithm is modified to handle both Hermitian and general complex matrices. The original algorithm (based on circular contours and Gauss-Legendre quadrature) is extended to contours and quadrature schemes that are recursively subdividable. A general complex recursive algorithm is implemented on rectangular and diamond contours, and the accuracy of different quadrature schemes for various contours is investigated.
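At FEAST's core is a quadrature approximation of the spectral projector (1/2πi) ∮ (zI - A)⁻¹ Y dz applied to a block of trial vectors Y; the singular values of the result drop off sharply after the number of enclosed eigenvalues, which is the count estimate discussed above. A dense, illustrative Python sketch using a circular contour and Gauss-Legendre quadrature (an assumption-laden toy, not the reference implementation):

    import numpy as np
    from numpy.polynomial.legendre import leggauss

    def feast_projection(A, Y, center, radius, n_quad=8):
        """Approximate spectral projector (circular contour) applied to Y."""
        n = A.shape[0]
        nodes, weights = leggauss(n_quad)        # Gauss-Legendre on [-1, 1]
        Q = np.zeros(Y.shape, dtype=complex)
        for x, w in zip(nodes, weights):
            theta = np.pi * (x + 1.0)            # map node to [0, 2*pi]
            z = center + radius * np.exp(1j * theta)
            R = np.linalg.solve(z * np.eye(n) - A, Y)       # resolvent solve
            Q += 0.5 * w * radius * np.exp(1j * theta) * R  # folds in dz/(2*pi*i)
        return Q

    rng = np.random.default_rng(0)
    n, m = 200, 20
    A = rng.standard_normal((n, n))
    A = (A + A.T) / 2                            # random symmetric test matrix
    Q = feast_projection(A, rng.standard_normal((n, m)), center=0.0, radius=1.0)
    s = np.linalg.svd(Q, compute_uv=False)
    # Heuristic count: singular values fall off sharply after the number of
    # eigenvalues enclosed by the contour (threshold below is illustrative).
    print("estimated count inside contour:", int(np.sum(s > 0.5 * s[0])))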

Relevance: 60.00%

Abstract:

A distinguishing feature of the discipline of archaeology is its reliance upon sensory-dependent investigation. As perceived by all of the senses, the felt environment is a unique area of archaeological knowledge. It is generally accepted that the emergence of industrial processes in the recent past was accompanied by unprecedented sonic extremes. The work of environmental historians has provided ample evidence that the introduction of much of this unwanted sound, or "noise," was an area of contestation. More recent research in the history of sound has called for more nuanced distinctions than the noisy/quiet dichotomy. Acoustic archaeology tends to focus upon the reconstruction of sound-producing instruments and spaces, with a primary goal of ascertaining intentionality. Most archaeoacoustic research is focused on learning more about the sonic world of people within prehistoric timeframes, while some research has been done on historic sites. In this thesis, by way of a meditation on industrial sound and the physical remains of the Quincy Mining Company blacksmith shop (Hancock, MI) in particular, I argue for an acceptance and inclusion of sound as an artifact in and of itself. I introduce the concept of an individual sound-form, or sonifact: a reproducible, repeatable, representable physical entity created by tangible, perhaps even visible, host-artifacts. A sonifact is a sound that endures through time with negligible variability. Through the piecing together of historical and archaeological evidence, I present a plausible sonifactual assemblage at the blacksmith shop in April 1916 as it may have been experienced by an individual traversing the vicinity on foot: an 'historic soundwalk.' The sensory apprehension of abandoned industrial sites is multi-faceted. I hope to make the case for an acceptance of sound as a primary heritage value when thinking about the industrial past, and for an increased awareness and acceptance of sound and listening as a primary mode of perception.

Relevance: 60.00%

Abstract:

Mt. Etna's activity has increased during the last decade, with a tendency towards more explosive eruptions that produce paroxysmal lava fountains. From January 2011 to April 2012, 25 lava fountaining episodes took place at Etna's New South-East Crater (NSEC). Improved understanding of the mechanism driving these explosive basaltic eruptions is needed to reduce volcanic hazards. This type of activity produces high sulfur dioxide (SO2) emissions, associated with lava flows and ash fallout, but to date the SO2 emissions associated with Etna's lava fountains have been poorly constrained. The Ultraviolet (UV) Ozone Monitoring Instrument (OMI) on NASA's Aura satellite and the Atmospheric Infrared Sounder (AIRS) on Aqua were used to measure the SO2 loadings. Ground-based data from the Observatoire de Physique du Globe de Clermont-Ferrand (OPGC) L-band Doppler radar, VOLDORAD 2B, used in collaboration with the Italian National Institute of Geophysics and Volcanology in Catania (INGV-CT), also detected the associated ash plumes, giving precise timing and duration for the lava fountains. This study resulted in the first detailed analysis of the OMI and AIRS SO2 data for Etna's lava fountains during the 2011-2012 eruptive cycle. The HYSPLIT trajectory model is used to constrain the altitude of the observed SO2 clouds, and results show that the SO2 emission usually coincided with the lava fountain peak intensity as detected by VOLDORAD. The UV OMI and IR AIRS SO2 retrievals permit quantification of the SO2 loss rate in the volcanic SO2 clouds, many of which were tracked for several days after emission. A first attempt to quantitatively validate AIRS SO2 retrievals against OMI data revealed a good correlation for high-altitude SO2 clouds. Using estimates of the emitted SO2 at the time of each paroxysm, we observe a correlation with the inter-paroxysm repose time. We therefore suggest that our data set supports the collapsing foam (CF) model [1] as the driving mechanism for the paroxysmal events at the NSEC. Using VOLDORAD-based estimates of the erupted magma mass, we observe a large excess of SO2 in the eruption clouds. Satellite measurements indicate that SO2 emissions from Etnean lava fountains can reach the lower stratosphere and hence could pose a hazard to aviation. [1] Parfitt, E.A. (2004). A discussion of the mechanisms of explosive basaltic eruptions. J. Volcanol. Geotherm. Res. 134, 77-107.
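The SO2 loss rates referenced above can be estimated by fitting first-order decay, M(t) = M0 exp(-kt), to the retrieved SO2 mass at successive overpasses. A minimal Python sketch with invented numbers (the actual OMI/AIRS time series are not reproduced here):

    import numpy as np

    # Hypothetical SO2 mass retrievals (kt) at hours after emission.
    t = np.array([0.0, 12.0, 24.0, 48.0, 72.0])
    mass = np.array([10.0, 7.8, 6.1, 3.7, 2.3])

    # First-order loss M(t) = M0 * exp(-k t) is linear in log space.
    slope, intercept = np.polyfit(t, np.log(mass), 1)
    k = -slope  # loss rate, per hour
    print(f"k = {k:.4f} /h, e-folding time = {1.0 / k:.1f} h")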

Relevance: 60.00%

Abstract:

This dissertation serves as a call to geoscientists to share responsibility with K-12 educators for increasing Earth science literacy. When partnerships are created among K-12 educators and geoscientists, the synergy created can promote Earth science literacy in students, teachers, and the broader community. The research described here resulted in the development of tools that can support effective professional development for teachers. One tool is used during the planning stages to structure a professional development program, another set of tools supports measurement of the effectiveness of a development program, and a third tool supports the sustainability of professional development programs. The Michigan Teacher Excellence Program (MiTEP), a Math/Science Partnership project funded by the National Science Foundation, served as the test bed for developing and testing these tools. The planning tool is the Earth Science Literacy Principles (ESLP), which served to structure the two-week summer field courses of the MiTEP program. The ESLP, published in 2009, clearly describe what an Earth science literate person should know, and consist of nine big ideas and their supporting fundamental concepts. Using the ESLP to plan a professional development program helped both instructors and teacher-participants focus on important concepts throughout the professional development activity. The measurement tools were developed to measure changes in teachers’ Earth science content-area knowledge and in perceptions related to teaching and learning that result from participating in a professional development program. The first measurement tool, the Earth System Concept Inventory (ESCI), directly measures content-area knowledge through a succession of multiple-choice questions that are aligned with the content of the professional development experience. The second measurement tool, an exit survey, collects qualitative data from teachers regarding their impressions of the professional development. Both the ESCI and the exit survey were tested for validity and reliability. Lesson study is discussed here as a strategy for sustaining professional development in a school or a district after the end of a professional development activity. Lesson study, as described here, was offered as a formal course in which teachers worked collaboratively to design and test lessons that improve the teachers’ classroom practices. Data regarding the impact of the lesson study activity were acquired through surveys, written documents, and group interviews, and are interpreted to indicate that the lesson study process improved teacher quality and classroom practices. In the case described here, the lesson study process was adopted by the teachers’ district and currently serves as part of the district’s work in Professional Learning Communities, resulting in ongoing professional development throughout the district.

Relevance: 60.00%

Abstract:

Large quantities of pure synthetic oligodeoxynucleotides (ODNs) are important for preclinical research, drug development, and biological studies. These ODNs are synthesized on an automated synthesizer. Inevitably, the crude ODN product contains failure sequences, which are not easily removed because they have the same properties as the full-length ODNs. Current ODN purification methods such as polyacrylamide gel electrophoresis (PAGE), reversed-phase high performance liquid chromatography (RP-HPLC), anion exchange HPLC, and affinity purification can remove those impurities. However, they are not suitable for large-scale purification due to the expense of instrumentation, high solvent demand, and high labor costs. To solve these problems, two non-chromatographic ODN purification methods have been developed. In the first method, the full-length ODN was tagged with a phosphoramidite containing a methacrylamide group and a cleavable linker, while the failure sequences were not. The full-length ODN was incorporated into a polymer through radical acrylamide polymerization, whereas failure sequences and other impurities were removed by washing; pure full-length ODN was then obtained by cleaving it from the polymer. In the second method, the failure sequences were capped with a methacrylated phosphoramidite in each synthetic cycle. During purification, the failure sequences were separated from the full-length ODN by radical acrylamide polymerization, and the full-length ODN was obtained via water extraction. For both methods, excellent purification yields were achieved and the purity of the ODNs was very satisfactory. Thus, this new technology is expected to be beneficial for large-scale ODN purification.

Relevance: 60.00%

Abstract:

The purpose of this thesis is to analyze the evolution of an early 20th-century mining system in Spitsbergen as applied by the Boston-based Arctic Coal Company (ACC). This analysis will address the following questions: did the system evolve in a linear, technology-driven fashion, or was the progression more a product of interactions and negotiations with the natural and human landscapes present during the time of occupation? Answers to these questions will be sought through review of historical records and the material residues identified during the 2008 field examination on Spitsbergen. The Arctic Coal Company’s flagship mine, ACC Mine No. 1, will serve as the focus for this analysis. The mine was the company’s largest undertaking during its occupation of Longyear Valley and today exhibits a large collection of related features and artifacts. The study will emphasize the material record within an analysis of the technical, environmental, and social influences that guided the course of the mining system. The intent of this thesis is a better understanding of how a particular resource extraction industry took root in the Arctic.

Relevance: 60.00%

Abstract:

In the twenty-first century, the issue of privacy--particularly the privacy of individuals with regard to their personal information and effects--has become highly contested terrain, producing a crisis that affects both national and global social formations. This crisis, or problematic, characterizes a particular historical conjuncture I term the namespace. Using cultural studies and the theory of articulation, I map the emergent ways that the namespace articulates economic, juridical, political, cultural, and technological forces, materials, practices and protocols. The cohesive articulation of the namespace requires that privacy be reframed in ways that make its diminution seem natural and inevitable. In the popular media, privacy is often depicted as the price we pay as citizens and consumers for security and convenience, respectively. This discursive ideological shift supports and underwrites the interests of state and corporate actors who leverage the ubiquitous network of digitally connected devices to engender a new regime of informational surveillance, or dataveillance. The widespread practice of dataveillance represents a strengthening of the hegemonic relations between these actors--each shares an interest in promoting an emerging surveillance society, a burgeoning security politics, and a growing information economy--that further empowers them to capture and store the personal information of citizens/consumers. In characterizing these shifts and the resulting crisis, I also identify points of articulation vulnerable to rearticulation and suggest strategies for transforming the namespace in ways that might empower stronger protections for privacy and related civil rights.

Relevance: 60.00%

Abstract:

Rooted in critical scholarship, this dissertation is an interdisciplinary study which contends that having a history is a basic human right. Advocating a newly conceived framework/practice perspective, termed Solidarity-inspired History, the dissertation argues for and then delivers a restorative voice to working-class historical actors in the 1916 Minnesota Iron Ore Strike. Utilizing an interdisciplinary methodological framework, the dissertation combines research methods from the humanities and the social sciences to form a working-class history that is a corrective to standardized studies of labor in the late 19th and early 20th centuries. Oftentimes class interests and power relationships determine the dominant perspectives or voices established in history, disregarding people and organizations that run counter to customary or traditional American themes of patriotism, the Protestant work ethic, adherence to capitalist dogma, or United States exceptionalism. This dissertation counteracts these traditional narratives with a unique, perhaps even revolutionary, examination of the 1916 Minnesota Iron Ore Strike. The intention of this dissertation's critical perspective is to poke, prod, and prompt academics, historians, and the general public to rethink, and then think again, about the place of those who have been dislocated from, or altogether forgotten, misplaced, or underrepresented in, the historical record. Thus, the purpose of the dissertation is to give voice to historical actors in the dismembered past. Historical actors who have run counter to traditional American narratives often have their body of "evidence" disjointed or completely dislocated from the story of our nation. This type of disremembering creates an artificial recollection of our collective past, which de-articulates past struggles from contemporary groups seeking solidarity and social justice in the present. Class-conscious actors, immigrants, women, the GLBTQ community, and people of color have the right to be remembered on their own terms, using primary sources and resources they produced. Therefore, like the Wobblies' industrial union and its rank-and-file, this dissertation seeks to fan the flames of discontented historical memory by offering a working-class perspective on the 1916 Strike that interprets the actions, events, people, and places of the strike anew, thus restoring the voices of these marginalized historical actors.

Relevance: 60.00%

Abstract:

Acer saccharum Marsh. (sugar maple) is one of the most valuable trees in the northern hardwood forests. Severe dieback was recently reported by area foresters in the western Upper Great Lakes Region. Sugar maple has had a history of dieback over the last 100 years throughout its range, and different variables have been identified as the predisposing and inciting factors in different regions at different times. Some of the most common factors attributed to previous maple dieback episodes were insect defoliation outbreaks, inadequate precipitation, poor soils, atmospheric deposition, fungal pathogens, poor management, or a combination of these. The current sugar maple dieback was evaluated to determine its etiology, severity, and change over time on both industry and public lands. A network of 120 sugar maple health evaluation plots was established in the Upper Peninsula of Michigan, northern Wisconsin, and eastern Minnesota and evaluated annually from 2009 to 2012. Mean sugar maple crown dieback between 2009 and 2012 was 12.4% (ranging from 0.8% to 75.5%) across the region. Overall, mean dieback decreased by 5% during the sampling period, but individual plots and trees continued to decline. Relationships were examined between sugar maple dieback and growth, habitat conditions, ownership, climate, soil, foliage nutrients, and the maple pathogen sapstreak. The only statistically significant factor was a high level of forest floor impacts due to exotic earthworm activity. Sugar maple on soils with lower pH had less earthworm impact, less dieback, and higher growth rates than those on soils more favorable to earthworms. The nutritional status of foliage and soil was correlated with dieback and growth, suggesting that perturbation of nutrient cycling may be predisposing trees to dieback or contributing to it. The previous winter's snowfall totals, the length of time snow stayed on the ground, and the number of days with freezing temperatures had a significant positive relationship to sugar maple growth rates. Sapstreak disease, caused by Ceratocystis virescens, may be contributing to dieback in some stands but was not related to the amount of dieback across the region. The ultimate goal of this research is to help forest managers in the Great Lakes Region prevent, anticipate, reduce, and/or salvage stands with dieback and loss in the future. An improved understanding of the complex etiology associated with sugar maple dieback in the Upper Great Lakes Region is necessary to make appropriate silvicultural decisions. Forest health education helps increase awareness and proactive forest management in the face of changing forest ecosystems, and lessons are included to assist educators in incorporating forest health into standard biological disciplines in secondary school curricula.
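The abstract does not state which statistic was used to screen these relationships; as a generic illustration of testing one such plot-level relationship (the numbers and impact classes below are invented, not the study's 120-plot dataset), a Python sketch:

    import numpy as np
    from scipy import stats

    # Hypothetical plot-level values: earthworm impact class vs. % crown dieback.
    worm_impact = np.array([1, 2, 4, 1, 5, 1, 3, 4])
    dieback = np.array([5.2, 12.1, 30.5, 8.7, 45.0, 3.1, 18.2, 27.9])

    r, p = stats.pearsonr(worm_impact, dieback)  # one common choice of test
    print(f"r = {r:.2f}, p = {p:.3f}")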

Relevance: 60.00%

Abstract:

Heuristic optimization algorithms are of great importance for reaching solutions to various real-world problems. These algorithms have a wide range of applications, such as cost reduction, artificial intelligence, and medicine. By the term cost, one could mean, for instance, the value of a function of several independent variables. Often, when dealing with engineering problems, we want to minimize the value of a function in order to achieve an optimum, or to maximize another parameter which increases as the cost (the value of this function) decreases. Heuristic cost reduction algorithms work by finding the values of the independent variables for which the value of the function (the “cost”) is minimal. There is an abundance of heuristic cost reduction algorithms to choose from. We will start with a discussion of various optimization algorithms such as memetic algorithms, force-directed placement, and evolution-based algorithms. Following this initial discussion, we will take up the working of three algorithms and implement them in MATLAB. The focus of this report is to provide detailed information on the working of three different heuristic optimization algorithms, and to conclude with a comparative study of their performance when implemented in MATLAB. The three algorithms we will take into consideration are the non-adaptive simulated annealing algorithm, the adaptive simulated annealing algorithm, and the random-restart hill climbing algorithm. The algorithms are heuristic in nature; that is, the solutions they reach may not be the best of all possible solutions, but they provide a means of reaching a reasonably good solution quickly, without taking an indefinite amount of time.
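The report's implementations are in MATLAB; as a language-neutral illustration of the first algorithm, here is a minimal Python sketch of non-adaptive simulated annealing with a fixed geometric cooling schedule (all parameter values are illustrative):

    import math
    import random

    def simulated_annealing(cost, neighbor, x0, t0=1.0, cooling=0.95, n_iter=5000):
        """Non-adaptive SA: fixed geometric cooling, Metropolis acceptance."""
        x, fx = x0, cost(x0)
        best, fbest = x, fx
        t = t0
        for _ in range(n_iter):
            y = neighbor(x)
            fy = cost(y)
            # Always accept downhill moves; accept uphill moves with
            # Boltzmann probability exp(-(fy - fx) / t).
            if fy < fx or random.random() < math.exp(-(fy - fx) / t):
                x, fx = y, fy
                if fx < fbest:
                    best, fbest = x, fx
            t *= cooling  # "non-adaptive": the schedule ignores search progress
        return best, fbest

    # Toy usage: minimize a shifted 1-D quadratic.
    best, fbest = simulated_annealing(
        cost=lambda x: (x - 3.0) ** 2,
        neighbor=lambda x: x + random.gauss(0.0, 0.5),
        x0=0.0)
    print(best, fbest)

An adaptive variant would instead adjust the temperature (and often the step size) based on observed acceptance statistics, which is the key difference the report's comparison examines; random-restart hill climbing simply repeats a greedy descent from many random starting points and keeps the best result.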