8 results for Job Demands-Resources model
in Digital Commons - Michigan Tech
Abstract:
The demand for power generation from non-renewable resources, and the associated production costs, are increasing at an alarming rate. Solar energy is one of the renewable resources with the potential to curb this increase. Utilization of solar energy has been concentrated mainly on heating applications. Using solar energy in building cooling systems would contribute greatly to the goal of minimizing non-renewable energy use. The approaches of solar heating system research conducted by institutions such as the University of Wisconsin-Madison, and the building heat flow modeling research conducted by Oklahoma State University, can be used to develop and optimize a solar-cooled building system. This research uses two approaches to develop Graphical User Interface (GUI) software for an integrated solar absorption cooling building model, capable of simulating and optimizing an absorption cooling system that uses solar energy as the main energy source driving the cycle. The software was then put through a series of verification tests to check its integrity. These tests were conducted on building cooling system data sets from similar applications around the world, and the output of the software matched the established experimental results from those data sets. Software developed by other research efforts caters to advanced users; the software developed in this research is not only reliable in its code integrity but also, through its integrated approach, accessible to new users. Hence, this dissertation aims to correctly model a complete building with an absorption cooling system in an appropriate climate as a cost-effective alternative to a conventional vapor compression system.
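As a rough illustration of the energy balance such a simulation rests on (not the dissertation's GUI model), the sketch below relates the cooling delivered at the evaporator to the solar heat supplied to the generator of a single-effect absorption chiller. All numbers are assumed example values.

```python
# Minimal sketch of a solar absorption chiller energy balance; values are assumed.
Q_SOLAR = 40.0        # solar heat delivered to the generator (kW), assumed
COP_THERMAL = 0.7     # typical single-effect LiBr-water thermal COP, assumed
W_PUMP = 0.4          # solution pump work (kW), assumed

Q_COOLING = COP_THERMAL * Q_SOLAR             # cooling capacity at the evaporator (kW)
COP_OVERALL = Q_COOLING / (Q_SOLAR + W_PUMP)  # overall COP including pump work
print(f"Cooling delivered: {Q_COOLING:.1f} kW, overall COP: {COP_OVERALL:.2f}")
```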
Abstract:
The past decade has seen the energy consumption of servers and Internet Data Centers (IDCs) skyrocket. A recent survey estimated that worldwide spending on servers and cooling has risen above $30 billion and is likely to exceed spending on new server hardware. The rapid rise in energy consumption poses a serious threat to both energy resources and the environment, which makes green computing not only worthwhile but necessary. This dissertation tackles the challenges of reducing both the energy consumption of server systems and the cost to Online Service Providers (OSPs). Two distinct subsystems account for most of an IDC's power: the server system, which accounts for 56% of the total power consumption of an IDC, and the cooling and humidification systems, which account for about 30%. The server system dominates the energy consumption of an IDC, and its power draw can vary drastically with data center utilization. In this dissertation, we propose three models to achieve energy efficiency in web server clusters: an energy proportional model, an optimal server allocation and frequency adjustment strategy, and a constrained Markov model. The proposed models combine Dynamic Voltage/Frequency Scaling (DVFS) and Vary-On, Vary-Off (VOVF) mechanisms that work together for greater energy savings, and corresponding strategies are proposed to deal with the transition overheads. We further extend server energy management to the IDC's cost management, helping OSPs conserve energy, manage their electricity costs, and lower their carbon emissions. We have developed an optimal energy-aware load dispatching strategy that periodically maps more requests to the locations with lower electricity prices. A carbon emission limit is imposed, and the volatility of the carbon offset market is also considered. Two energy-efficient strategies are applied to the server system and the cooling system, respectively. With the rapid development of cloud services, we also carry out research to reduce server energy in cloud computing environments. In this work, we propose a new live virtual machine (VM) placement scheme that effectively maps VMs to Physical Machines (PMs) with substantial energy savings in a heterogeneous server cluster. A VM/PM mapping probability matrix is constructed, in which each VM request is assigned a probability of running on each PM. The matrix takes into account resource limitations, VM operation overheads, server reliability, and energy efficiency. The evolution of Internet Data Centers and the increasing demands of web services raise great challenges for improving the energy efficiency of IDCs, and we identify several potential areas for future research in each chapter.
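The sketch below is an illustrative, simplified take on a VM/PM mapping probability matrix (not the dissertation's actual formulation): each row is a VM request, each column a PM, and entry (i, j) is the probability of placing VM i on PM j, here driven only by an assumed efficiency score and a resource-feasibility mask.

```python
# Minimal sketch of a VM/PM mapping probability matrix; all values are assumed.
import numpy as np

vm_cpu_demand = np.array([2, 4, 1])              # cores requested by each VM (assumed)
pm_cpu_free   = np.array([8, 2, 16, 4])          # free cores on each PM (assumed)
pm_efficiency = np.array([0.9, 0.6, 1.0, 0.7])   # relative energy-efficiency score (assumed)

feasible = vm_cpu_demand[:, None] <= pm_cpu_free[None, :]  # resource limitation mask
scores = feasible * pm_efficiency[None, :]                 # favor efficient, feasible PMs
prob_matrix = scores / scores.sum(axis=1, keepdims=True)   # normalize each VM's row

print(np.round(prob_matrix, 2))  # row i gives VM i's placement probabilities over the PMs
```

A fuller model would also weight the scores by VM migration overheads and server reliability, as the abstract describes.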
Abstract:
Peru is a developing country with abundant fresh water resources, yet the lack of infrastructure leaves much of the population without access to safe water for domestic uses. The author of this report was a Peace Corps Volunteer in the water and sanitation sector in the district of Independencia, Ica, Peru. Independencia is located in the arid coastal region of the country, receiving on average 15 mm of rain annually. The district's water source is the Pisco River, which originates in the Andean highlands and flows into the Pacific Ocean near the town of Pisco, Peru. The objectives of this report are to assess the water supply and sanitation practices, model the existing water distribution system, and make recommendations for future expansion of the distribution system in the district of Independencia, Peru. The assessment of water supply is based on the results of community surveys done in the district, water quality testing done by a detachment of the U.S. Navy, and the results of a hydraulic model built in EPANET 2.0 to represent the distribution system. Sanitation practice assessments are based on the surveys as well as the author's observations while living in Peru. Recommendations for system expansion are made based on results from the EPANET model and the municipality's technical report for the existing distribution system. Household water use and sanitation surveys were conducted with 84 families in the district, revealing that upwards of 85% store their domestic water in regularly washed containers with lids. Over 80% of those surveyed drink water that is treated, mostly by boiling. Of those surveyed, over 95% reported washing their hands, and over 60% mentioned at least one critical time for hand washing when asked for specific instances. The surveys also showed that over 80% of houses properly dispose of excrement, in either latrines or septic tanks. Of the 43 families interviewed with children five years of age or under, just over 18% reported that the child had had a case of diarrhea in the month before the interview. Finally, from the surveys it was calculated that the average water use is about 22 liters per person per day. Water quality testing carried out by a detachment of the U.S. Navy revealed that the water intended for consumption in the houses surveyed was not suitable for consumption, with a median E. coli most probable number of 47/100 ml for the 61 houses sampled; the median total coliform count was 3,000 colony-forming units per 100 ml. EPANET was used to simulate the water delivery system and evaluate its performance. EPANET is designed for continuous water delivery systems, assuming all pipes are always flowing full; to account for the intermittent nature of the system, multiple EPANET network models were created to simulate how water is routed to the different parts of the system throughout the day. The models were created from interviews with the water technicians and a map of the system created using handheld GPS units. The purpose is to analyze the performance of the water system that serves approximately 13,276 people in the district of Independencia, Peru, and to provide recommendations for future growth and improvement of the service level. Performance evaluation of the existing system is based on meeting 25 liters per person per day while maintaining positive pressure at all nodes in the network.
The future performance is based on meeting a minimum pressure of 20 psi in the main line, as proposed by Chase (2000). The EPANET model results yield an average nodal pressure for all communities of 71 psi, ranging from 1.3 to 160 psi. Thus, if the current water delivery schedule obtained from the local municipality is followed, all communities should have sufficient pressure to deliver 25 l/p/d, with the exception of Los Rosales, which can only be supplied with 3.25 l/p/d. However, if the line to Los Rosales were increased from one to four inches, the system could supply this community with 25 l/p/d. The district of Independencia could greatly benefit from increasing the service level to 24-hour water delivery and a minimum of 50 l/p/d, so that communities without reliable access due to insufficient pressure would become equal beneficiaries of this invaluable resource. To evaluate the feasibility of this, EPANET was used to model the system over a range of population growth rates, system lifetimes, and demands. To meet a minimum pressure of 20 psi in the main line, the 6-inch diameter main line must be enlarged, and approximately two miles of trench must be excavated up to 30 feet deep. The sections of the main line that must be excavated are miles 0-1 and 1.5-2.5, and the first 3.4 miles of the main line must be increased from 6 to 16 inches, contracting to 10 inches for the remaining 5.8 miles. Doing this would allow 24-hour water delivery and provide 50 l/p/d over a range of population growth rates and system lifetimes. It is expected that improving the water delivery service would reduce morbidity and mortality from diarrheal diseases by decreasing recontamination of the water during transport and household storage, and by maintaining continuous pressure in the system to prevent infiltration of contaminated groundwater. However, this expansion must be carefully planned so as not to affect aquatic ecosystems or other districts utilizing water from the Pisco River. It is recommended that stream gaging of the Pisco River and precipitation monitoring of the surrounding watershed be initiated in order to begin a hydrological study that can be integrated into the district's water resource planning. It is also recommended that the district begin routine water quality testing, with the results made available to the public.
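The sketch below shows the simple arithmetic behind the report's two performance criteria, using assumed flow, schedule, and population values rather than figures from the study: per-capita supply under an intermittent delivery schedule, and the 20 psi minimum main-line pressure expressed as an equivalent head of water.

```python
# Minimal sketch of the per-capita supply and pressure-head arithmetic; inputs are assumed.
PSI_PER_M_HEAD = 1.422  # approximate conversion: 1 m of water column ~ 1.422 psi

def liters_per_person_day(flow_lps, hours_per_day, population):
    """Per-capita supply for a community fed flow_lps only during its scheduled hours."""
    return flow_lps * 3600 * hours_per_day / population

def head_required_m(min_psi=20.0):
    """Pressure head (m of water) corresponding to the 20 psi main-line criterion."""
    return min_psi / PSI_PER_M_HEAD

supply = liters_per_person_day(flow_lps=2.0, hours_per_day=3, population=800)  # assumed values
print(f"{supply:.1f} l/p/d (target 25); 20 psi ~ {head_required_m():.1f} m of head")
```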
Abstract:
Heterogeneous materials are ubiquitous in nature and as synthetic materials. These materials provide unique combinations of desirable mechanical properties emerging from their heterogeneities at different length scales. Future structural and technological applications will require the development of advanced lightweight materials with superior strength and toughness. Cost-effective design of advanced high-performance synthetic materials by tailoring their microstructure is the challenge facing the materials design community, and prior knowledge of structure-property relationships for these materials is imperative for optimal design. Thus, understanding such relationships for heterogeneous materials is of primary interest. Furthermore, computational burden is becoming a critical concern in several areas of heterogeneous materials design, so computationally efficient and accurate predictive tools are essential. In the present study, we focus mainly on the mechanical behavior of soft cellular materials and of a tough biological material, the mussel byssus thread. Cellular materials exhibit microstructural heterogeneity through an interconnected network of a single material phase, whereas the mussel byssus thread comprises two distinct material phases. A robust numerical framework is developed to investigate the micromechanisms behind the macroscopic response of both of these materials. Using this framework, the effect of microstructural parameters on the stress state of cellular specimens during split Hopkinson pressure bar tests is addressed. A Voronoi tessellation-based algorithm has been developed to generate the cellular microstructure. The micromechanisms (microinertia, microbuckling, and microbending) governing the macroscopic behavior of cellular solids are investigated thoroughly with respect to various microstructural and loading parameters. To understand the origin of the high toughness of the mussel byssus thread, a Genetic Algorithm (GA) based optimization framework has been developed; it shows that the thread's two different material phases (collagens) are optimally distributed along its length. These applications demonstrate that the presence of heterogeneity in a system demands substantial computational resources for simulation and modeling. Thus, a Higher Dimensional Model Representation (HDMR) based surrogate modeling concept is proposed to reduce computational complexity, and its applicability is demonstrated in failure envelope construction and in multiscale finite element techniques. It is observed that the surrogate-based model can capture the behavior of complex material systems with sufficient accuracy. The computational algorithms presented in this thesis will further pave the way for accurate prediction of the macroscopic deformation behavior of various classes of advanced materials from their measurable microstructural features at a reasonable computational cost.
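As a minimal sketch of the Voronoi-tessellation idea behind the cellular microstructure generation (not the thesis's own algorithm), the snippet below seeds random cell nuclei in a unit square and treats the finite Voronoi ridges between neighboring seeds as candidate cell walls.

```python
# Minimal sketch: 2D cellular microstructure from a Voronoi tessellation of random seeds.
import numpy as np
from scipy.spatial import Voronoi

rng = np.random.default_rng(0)
seeds = rng.uniform(0.0, 1.0, size=(50, 2))  # 50 random cell nuclei in a unit square (assumed)
vor = Voronoi(seeds)

# Each finite ridge (a vertex pair with no index of -1) is a candidate cell wall segment.
walls = [vor.vertices[r] for r in vor.ridge_vertices if -1 not in r]
print(f"{len(walls)} finite cell walls generated for {len(seeds)} cells")
```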
Abstract:
Madagascar's terrestrial and aquatic ecosystems have long supported a unique set of ecological communities, many of which are endemic to the tropical island. Those same ecosystems have been a source of valuable natural resources for some of the poorest people in the world. Nevertheless, with pride, ingenuity, and resourcefulness, the Malagasy people of the southwest coast, who are of Vezo identity, subsist with low-technology fishing techniques aimed at an increasingly threatened host of aquatic seascapes. The mangroves, seagrass beds, and coral reefs of the region are under increased pressure from the general populace for both food provisions and support of economic opportunity. Besides purveyors and extractors, the coastal waters are also subject to a number of natural stressors, including cyclones and invasive predator species of both flora and fauna. In addition, the aquatic ecosystems of the region are undergoing increased nutrient and sediment runoff due, in part, to Madagascar's heavy reliance on land for agricultural purposes (Scales, 2011). Moreover, its coastal waters, like so many throughout the world, have been warming at an alarming rate over the past few decades. Recognizing the intimate interconnectedness of the social and ecological systems, conservation organizations have undertaken a host of complementary conservation and social development efforts with the dual aim of preserving or restoring the health of both the coastal ecosystems and the people of the region. This paper first provides a way of thinking more holistically about the social-ecological system within a resilience frame of understanding. Secondly, it applies a platform known as state-and-transition modeling to give form to the process; a small illustrative encoding is sketched after this abstract. State-and-transition modeling is an iterative investigation into the physical makeup of a system of study as well as the boundaries of and influences on each state, and it has been used in restoration ecology for more than a decade. Lastly, that model is situated within an adaptive management scheme that provides a structured, cyclical, objective-oriented process for testing stakeholders' cognitive understanding of the ecosystem through pragmatic implementation and monitoring of a host of small-scale interventions developed as part of the adaptive management process. Throughout, evidence of the application of the theories and frameworks is offered, with every effort made to equip conservation-minded development practitioners with a comprehensive strategy for addressing the increasingly fragile social-ecological systems of southwest Madagascar. It is offered, in conclusion, that the seascapes of the region would be an excellent case study for future application of state-and-transition modeling and adaptive management as frameworks for conservation-minded development practitioners whose multiple projects, each with its own objectives, have been implemented with a single goal in mind: preserve and protect the state of the supporting environment while providing for the basic needs of the local Malagasy people.
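The snippet below is a purely illustrative encoding of a state-and-transition model for a coastal seascape: each named state lists the states it can transition to and the driver behind each transition. The states and drivers are assumed examples, not findings from the paper.

```python
# Minimal sketch of a state-and-transition model as a mapping; states and drivers are assumed.
stm = {
    "healthy reef":    {"degraded reef": "overfishing / sedimentation"},
    "degraded reef":   {"healthy reef": "gear restrictions + reduced runoff",
                        "algae-dominated": "continued nutrient loading"},
    "algae-dominated": {"degraded reef": "herbivore recovery"},
}

def reachable(state, model=stm):
    """Return the states reachable in one transition from `state`, with their drivers."""
    return model.get(state, {})

for target, driver in reachable("degraded reef").items():
    print(f"degraded reef -> {target}: {driver}")
```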
Abstract:
Two important and emerging technologies, microgrids and electricity generation from wind resources, are increasingly being combined. Various control strategies can be implemented, and droop control provides a simple option that does not require communication between microgrid components. Eliminating the communication system as a potential single point of failure is especially important in the remote, islanded microgrids considered in this work. However, traditional droop control does not allow the microgrid to utilize much of the power available from the wind. This dissertation presents a novel droop control strategy that implements a droop surface in a higher dimension than the traditional strategy: the droop control relationship depends on two variables, the dc microgrid bus voltage and the wind speed at the current time. An approach for optimizing this droop control surface to meet a given objective, for example utilizing all of the power available from a wind resource, is proposed and demonstrated. Various cases are used to test the proposed optimal high-dimension droop control method and demonstrate its function. First, the use of linear multidimensional droop control without optimization is demonstrated through simulation. Next, an optimal high-dimension droop control surface is implemented with a simple dc microgrid containing two sources and one load, and various cases of changing load and wind speed are investigated using simulation and hardware-in-the-loop techniques. Optimal multidimensional droop control is then demonstrated with a wind resource in a full dc microgrid example containing an energy storage device as well as multiple sources and loads. Finally, the optimal high-dimension droop control method is applied with a solar resource, using a load model developed for a military patrol base application. The operation of the proposed control is again investigated using simulation and hardware-in-the-loop techniques.
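The sketch below illustrates the general shape of a two-variable droop surface (it is not the dissertation's control law): the wind source's power set-point depends on both the dc bus voltage and the current wind speed, with all constants assumed.

```python
# Minimal sketch of a two-variable (voltage, wind speed) droop surface; constants are assumed.
import numpy as np

V_NOM, V_MIN = 380.0, 360.0        # assumed nominal and minimum dc bus voltages (V)
RHO, AREA, CP = 1.225, 12.6, 0.45  # air density, rotor area, power coefficient (assumed)

def available_wind_power(v_wind):
    """Power available from the rotor at a given wind speed (W)."""
    return 0.5 * RHO * AREA * CP * v_wind**3

def droop_setpoint(v_bus, v_wind):
    """Linear droop in bus voltage, scaled by the power currently available from the wind."""
    share = np.clip((V_NOM - v_bus) / (V_NOM - V_MIN), 0.0, 1.0)  # 0 at nominal, 1 at minimum voltage
    return share * available_wind_power(v_wind)

print(droop_setpoint(v_bus=372.0, v_wind=9.0))  # partial loading at a sagging bus voltage
```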
Abstract:
Wind energy has been one of the fastest-growing sectors of the nation's renewable energy portfolio for the past decade, and the same trend is projected for the coming years, given aggressive government policies to reduce fossil fuel dependency. So-called Horizontal Axis Wind Turbine (HAWT) technologies have shown great technological promise and outstanding commercial penetration, and given this acceptance, wind turbine size has grown exponentially over time. However, safety and economic concerns have emerged as a result of new design tendencies toward massive-scale wind turbine structures with high slenderness ratios and complex shapes, typically located in remote areas (e.g., offshore wind farms). In this regard, safe operation requires not only first-hand information on the actual structural dynamic conditions under aerodynamic action, but also a deep understanding of the environmental factors in which these multibody rotating structures operate. Given the cyclo-stochastic patterns of the wind loading exerting pressure on a HAWT, a probabilistic framework is appropriate to characterize the risk of failure in terms of resistance and serviceability conditions at any given time. Furthermore, sources of uncertainty such as material imperfections, buffeting and flutter, aeroelastic damping, gyroscopic effects, and turbulence, among others, call for a more sophisticated mathematical framework that can properly handle all these sources of indeterminacy. The modeling complexity that arises from these characterizations demands a data-driven experimental validation methodology to calibrate and corroborate the model. To this end, System Identification (SI) techniques offer a spectrum of well-established numerical methods appropriate for stationary, deterministic, data-driven schemes, capable of predicting the actual dynamic states (eigen-realizations) of traditional time-invariant dynamic systems. Consequently, a modified data-driven SI metric based on the so-called Subspace Realization Theory is proposed, adapted for stochastic, non-stationary, and time-varying systems such as the complex aerodynamics of HAWTs. Simultaneously, this investigation explores the characterization of the turbine loading and response envelopes for critical failure modes of the structural components of which the wind turbine is made. In the long run, both the aerodynamic framework (theoretical model) and the system identification (experimental model) will be merged in a numerical engine formulated as a search algorithm for model updating, known as the Adaptive Simulated Annealing (ASA) process. This iterative engine is based on a set of function minimizations computed with a metric called the Modal Assurance Criterion (MAC). In summary, the thesis is composed of four major parts: (1) development of an analytical aerodynamic framework that predicts interacting wind-structure stochastic loads on wind turbine components; (2) development of a novel tapered-swept-curved Spinning Finite Element (SFE) that includes damped-gyroscopic effects and axial-flexural-torsional coupling; (3) a novel data-driven structural health monitoring (SHM) algorithm via stochastic subspace identification methods; and (4) a numerical search (optimization) engine based on ASA and MAC capable of updating the SFE aerodynamic model.
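For reference, the Modal Assurance Criterion the model-updating engine relies on is the standard correlation measure between two mode shapes; the sketch below computes it for assumed example vectors, not results from the thesis.

```python
# Minimal sketch of the Modal Assurance Criterion (MAC); mode shape vectors are assumed.
import numpy as np

def mac(phi_a, phi_b):
    """MAC = |phi_a^H phi_b|^2 / ((phi_a^H phi_a)(phi_b^H phi_b)); 1 means identical shapes."""
    num = np.abs(np.vdot(phi_a, phi_b)) ** 2
    den = np.vdot(phi_a, phi_a).real * np.vdot(phi_b, phi_b).real
    return num / den

phi_model = np.array([0.12, 0.48, 0.91, 1.00])     # analytical (SFE) mode shape, assumed
phi_measured = np.array([0.10, 0.45, 0.95, 1.00])  # identified (SSI) mode shape, assumed
print(f"MAC = {mac(phi_model, phi_measured):.3f}") # a value near 1 indicates good correlation
```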
Abstract:
File system security is fundamental to the security of UNIX and Linux systems, since in these systems almost everything is in the form of a file. To protect the system files and other sensitive user files from unauthorized access, different organizations choose and use certain security schemes in their computer systems. A file system security model provides a formal description of a protection system. Each security model is associated with specified security policies that focus on one or more of the security principles: confidentiality, integrity, and availability. A security policy is not only about "who" can access an object, but also about "how" a subject can access an object. To enforce the security policies, each access request is checked against the specified policies to decide whether it is allowed or rejected. The current protection schemes in UNIX/Linux systems focus on access control. Besides the basic access control scheme of the system itself, which includes permission bits, the setuid/seteuid mechanisms, and the root account, there are other protection models, such as Capabilities, Domain Type Enforcement (DTE), and Role-Based Access Control (RBAC), supported and used in certain organizations. These models protect the confidentiality of data directly; the integrity of data is protected indirectly, by only allowing trusted users to operate on the objects. The access control decisions of these models depend on either the identity of the user or the attributes of the process the user can execute, and on the attributes of the objects. Adoption of these sophisticated models has been slow; this is likely due to the enormous complexity of specifying controls over a large file system and the need for system administrators to learn a new paradigm for file protection. We propose a new security model: the file system firewall. It adapts the familiar network firewall protection model, used to control the data that flows between networked computers, to file system protection. This model can base access control decisions on any system-generated attribute of an access request, e.g., the time of day. The access control decisions are not based on a single entity, such as the account in traditional discretionary access control or the domain in DTE; in the file system firewall, access decisions are made upon situations involving multiple entities. A situation is programmable with predicates on the attributes of the subject, the object, and the system, and the file system firewall specifies the appropriate actions for these situations. We implemented a prototype of the file system firewall on SUSE Linux. Preliminary results of performance tests on the prototype indicate that the runtime overhead is acceptable. We compared the file system firewall with TE in SELinux to show that the firewall model can accommodate many other access control models. Finally, we show the ease of use of the firewall model: when the firewall is restricted to a specified part of the system, all other resources are unaffected, which enables a relatively smooth adoption. This fact, and the model's familiarity to system administrators, will facilitate adoption and correct use. The user study we conducted on traditional UNIX access control, SELinux, and the file system firewall confirmed this: beginner users found the firewall easier to use and faster to learn than the traditional UNIX access control scheme and SELinux.
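The sketch below illustrates the firewall-style rule idea in miniature (it is not the SUSE Linux prototype): each rule pairs a "situation", a predicate over subject, object, and system attributes, with an action, and the first matching rule decides the access request. All attribute names and rules are assumptions for illustration.

```python
# Minimal sketch of situation-based, firewall-style access rules; attributes and rules are assumed.
from datetime import datetime

RULES = [
    # Deny writes under /etc outside assumed business hours, regardless of the user.
    (lambda subj, obj, system: obj["path"].startswith("/etc")
        and subj["op"] == "write"
        and not (9 <= system["hour"] < 17), "deny"),
    # Allow an assumed backup account to read anything.
    (lambda subj, obj, system: subj["user"] == "backup" and subj["op"] == "read", "allow"),
]

def check(subject, obj, system, default="allow"):
    """Return the action of the first situation whose predicate matches the request."""
    for predicate, action in RULES:
        if predicate(subject, obj, system):
            return action
    return default

request = ({"user": "alice", "op": "write"},
           {"path": "/etc/passwd"},
           {"hour": datetime.now().hour})
print(check(*request))
```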