6 results for model complexity

in Digital Commons - Michigan Tech


Relevance:

30.00%

Publisher:

Abstract:

Research on rehabilitation has shown that appropriate, repetitive mechanical movements can help spinal cord injured individuals restore functional standing and walking. The objective of this work was to achieve appropriate, repetitive joint movements and an approximately normal gait through the PGO by replicating normal walking, and to minimize energy consumption for both the patient and the device. A model-based experimental investigative approach is presented in this dissertation. First, a human model was created in Ideas and human walking was simulated in Adams. The main feature of this model was the foot-ground contact model, which had distributed contact points along the foot and varying viscoelasticity. The model was validated by comparing simulated results of normal walking with measured results from the literature. It was then used to simulate walking with the current PGO to investigate the real causes of its poor function, even though its joint movements were close to those of normal walking. The direct cause was that only one leg moves at a time, which resulted in a short step length and no clearance after toe-off. This cannot be solved by simply adding power at both hip joints. To find a better answer, a PGO mechanism model was used to investigate different walking mechanisms by locking or releasing selected joints. A trade-off among energy consumption, control complexity, and standing position was found. Finally, a foot-release PGO virtual model was created and simulated, and only the foot-release mechanism was developed into a prototype. Both the release mechanism and the foot-release design were validated experimentally by adding the foot release to the current PGO. This demonstrated an advance in improving the functional aspects of the current PGO, even without a complete physical foot-release PGO for comparison.
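The central modeling feature mentioned above is the foot-ground contact model with distributed contact points and varying viscoelasticity. The abstract does not give its formulation, so the following is only a minimal sketch of one common approach: a nonlinear spring-damper (Hunt-Crossley style) normal force evaluated at several points along the foot. The point names, stiffness, damping, and exponent values are all hypothetical.

```python
def contact_force(penetration, penetration_rate, k, c, n=1.5):
    """Normal ground-reaction force at a single contact point.

    Hunt-Crossley style nonlinear spring-damper: zero force when the
    point is above the ground (penetration <= 0). k, c, and n are
    hypothetical stiffness, damping, and exponent values.
    """
    if penetration <= 0.0:
        return 0.0
    return k * penetration**n + c * penetration**n * penetration_rate

# Distributed contact: several points along the foot, each with its own
# (hypothetical) viscoelastic parameters, e.g. stiffer under the heel.
foot_points = [
    {"name": "heel", "k": 2.0e5, "c": 1.0e3},
    {"name": "midfoot", "k": 1.2e5, "c": 8.0e2},
    {"name": "toe", "k": 1.5e5, "c": 9.0e2},
]

def total_ground_reaction(penetrations, rates):
    """Sum the normal forces over all contact points along the foot."""
    return sum(
        contact_force(d, v, p["k"], p["c"])
        for d, v, p in zip(penetrations, rates, foot_points)
    )

# Example: heel and midfoot in contact, toe off the ground.
print(total_ground_reaction([0.002, 0.001, 0.0], [0.01, -0.02, 0.0]))
```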

Relevance:

30.00%

Publisher:

Abstract:

To mitigate greenhouse gas (GHG) emissions and reduce U.S. dependence on imported oil, the United States (U.S.) is pursuing several options to create biofuels from renewable woody biomass (hereafter referred to as "biomass"). Because of the distributed nature of biomass feedstock, the cost and complexity of biomass recovery operations pose significant challenges that hinder increased biomass utilization for energy production. To facilitate the exploration of a wide variety of conditions that promise profitable biomass utilization and to tap unused forest residues, the development of biofuel supply chain models based on optimization and simulation approaches is proposed. The biofuel supply chain is structured around four components: biofuel facility locations and sizes, biomass harvesting/forwarding, transportation, and storage. A Geographic Information System (GIS) based approach is proposed as a first step for selecting potential facility locations for biofuel production from forest biomass, based on a set of evaluation criteria such as accessibility to biomass, the railway/road transportation network, water bodies, and workforce availability. The development of optimization and simulation models is also proposed. The results of the models will be used to determine (1) the number, location, and size of the biofuel facilities, and (2) the amounts of biomass to be transported between the harvesting areas and the biofuel facilities over a 20-year timeframe. The multi-criteria objective is to simultaneously minimize the weighted sum of delivered feedstock cost, energy consumption, and GHG emissions. Finally, a series of sensitivity analyses will be conducted to identify the sensitivity of the decisions, such as the optimal site selected for the biofuel facility, to changes in influential parameters, such as biomass availability and transportation fuel price.

Intellectual Merit: The proposed research will facilitate the exploration of a wide variety of conditions that promise profitable biomass utilization in the renewable biofuel industry. The GIS-based facility location analysis considers a series of factors that have not been considered simultaneously in previous research. Location analysis is critical to the financial success of producing biofuel. The modeling of woody biomass supply chains using both optimization and simulation, combined with the GIS-based approach as a precursor, has not been done to date. The optimization and simulation models can help ensure the economic and environmental viability and sustainability of the entire biofuel supply chain at both the strategic design level and the operational planning level.

Broader Impacts: The proposed models for biorefineries can be applied to other types of manufacturing or processing operations using biomass, because the biomass feedstock supply chain is similar, if not identical, for biorefineries, biomass-fired or co-fired power plants, and torrefaction/pelletization operations. Additionally, the results of this research will continue to be disseminated internationally through publications in journals such as Biomass and Bioenergy and Renewable Energy, and presentations at conferences such as the 2011 Industrial Engineering Research Conference. For example, part of the research work related to biofuel facility identification has been published: Zhang, Johnson and Sutherland [2011] (see Appendix A). There will also be opportunities for the Michigan Tech campus community to learn about the research through the Sustainable Future Institute.
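As an illustration of the weighted-sum, multi-criteria site-selection idea described in this abstract, the sketch below enumerates candidate facility subsets, assigns each harvest area to its nearest open facility, and scores each configuration by a weighted sum of cost, energy, and GHG emissions. The sites, harvest areas, distances, unit factors, and weights are hypothetical; the actual work proposes GIS screening followed by full optimization and simulation models rather than brute-force enumeration.

```python
from itertools import combinations

# Hypothetical candidate facility sites (annualized fixed cost, $) and
# harvest areas (annual biomass supply, tonnes), with road distances in km.
sites = {"A": 1.2e6, "B": 0.9e6, "C": 1.5e6}
areas = {"h1": 40_000, "h2": 25_000, "h3": 60_000}
dist = {("h1", "A"): 35, ("h1", "B"): 80, ("h1", "C"): 120,
        ("h2", "A"): 90, ("h2", "B"): 30, ("h2", "C"): 60,
        ("h3", "A"): 70, ("h3", "B"): 95, ("h3", "C"): 25}

# Hypothetical per tonne-km factors and weights for the weighted-sum objective.
COST_PER_TKM, ENERGY_PER_TKM, GHG_PER_TKM = 0.12, 1.3, 0.09
W_COST, W_ENERGY, W_GHG = 1.0, 0.05, 0.5

def evaluate(open_sites):
    """Weighted-sum objective when each area ships to its nearest open site."""
    cost = sum(sites[s] for s in open_sites)
    energy = ghg = 0.0
    for area, supply in areas.items():
        d = min(dist[(area, s)] for s in open_sites)
        cost += COST_PER_TKM * supply * d
        energy += ENERGY_PER_TKM * supply * d
        ghg += GHG_PER_TKM * supply * d
    return W_COST * cost + W_ENERGY * energy + W_GHG * ghg

# Enumerate all non-empty subsets of candidate sites (fine at this toy scale).
best = min((evaluate(set(c)), c)
           for r in range(1, len(sites) + 1)
           for c in combinations(sites, r))
print("best weighted objective:", round(best[0]), "open sites:", best[1])
```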

Relevance:

30.00%

Publisher:

Abstract:

Eutrophication is a persistent problem in many freshwater lakes. A delay in lake recovery following reductions in external loading of phosphorus, the limiting nutrient in freshwater ecosystems, is often observed. Models have been created to assist with lake remediation efforts; however, the application of management tools to sediment diagenesis is often neglected due to conceptual and mathematical complexity. SED2K (Chapra et al. 2012) is proposed as a "middle way", offering engineering rigor while remaining accessible to users. An objective of this research is to further support the development and application of SED2K for sediment phosphorus diagenesis and release to the water column of Onondaga Lake. SED2K has previously been applied to eutrophic Lake Alice in Minnesota. The more homogeneous sediment characteristics of Lake Alice, compared with the industrially polluted sediment layers of Onondaga Lake, allowed an invariant rate coefficient to be applied to describe first-order decay kinetics of phosphorus. When a similar approach was attempted on Onondaga Lake, an invariant rate coefficient failed to simulate the sediment phosphorus profile. Therefore, labile P was accounted for by progressive preservation after burial, and a rate coefficient that gradually decreased with depth was applied. In this study, profile sediment samples were chemically extracted into five operationally defined fractions: CaCO3-P, Fe/Al-P, Biogenic-P, Ca Mineral-P, and Residual-P. Chemical fractionation data from this study showed that preservation is not the only mechanism by which phosphorus may be maintained in a non-reactive state in the profile; sorption has been shown to contribute substantially to P burial within the profile. A new kinetic approach involving partitioning of P into process-based fractions is applied here. Results from this approach indicate that labile P (Ca Mineral and Organic P) is contributing to internal P loading to Onondaga Lake, through diagenesis and diffusion to the water column, while the sorbed P fraction (Fe/Al-P and CaCO3-P) remains stable. Sediment profile concentrations of labile and total phosphorus at the time of deposition were also modeled and compared with current labile and total phosphorus, to quantify the remaining phosphorus that will continue to contribute to internal P loading and influence the trophic status of Onondaga Lake. The results presented here also allowed estimation of the depth of the active sediment layer and the attendant response time, as well as the sediment burden of labile P and the associated efflux.
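The modeling change described above, replacing an invariant first-order rate coefficient with one that decreases with depth to represent progressive preservation after burial, can be illustrated with a small numerical sketch. The functional form of k(z), the steady-state burial formulation, and all parameter values below are hypothetical and are not taken from SED2K or this study.

```python
import numpy as np

# Hypothetical parameters: surface decay rate (1/yr), attenuation with depth
# (1/cm), burial velocity (cm/yr), and labile P at deposition (mg P / g sediment).
K0, BETA, BURIAL, P0 = 0.15, 0.08, 0.4, 1.2

def k(z):
    """First-order decay coefficient that decreases with depth z (cm)."""
    return K0 * np.exp(-BETA * z)

def labile_p_profile(z_max=30.0, dz=0.5):
    """Steady-state labile P vs. depth from burial * dP/dz = -k(z) * P."""
    z = np.arange(0.0, z_max + dz, dz)
    p = np.empty_like(z)
    p[0] = P0
    for i in range(1, len(z)):
        p[i] = p[i - 1] * (1.0 - k(z[i - 1]) * dz / BURIAL)  # explicit Euler step
    return z, p

z, p = labile_p_profile()
for depth in (0, 5, 10, 20, 30):
    print(f"z = {depth:4.1f} cm   labile P = {p[int(depth / 0.5)]:.3f}")
```

With an invariant coefficient the profile decays steadily toward zero; with the depth-decreasing coefficient the deep profile flattens, reflecting labile P that is progressively preserved after burial.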

Relevance:

30.00%

Publisher:

Abstract:

In this thesis, we consider Bayesian inference for the detection of variance change-points in models with scale mixtures of normal (SMN) distributions. This class of distributions is symmetric and heavy-tailed and includes as special cases the Gaussian, Student-t, contaminated normal, and slash distributions. The proposed models provide greater flexibility for analyzing practical data, which often exhibit heavy tails and may not satisfy the normality assumption. For the Bayesian analysis, we specify prior distributions for the unknown parameters of the variance change-point models with SMN distributions. Due to the complexity of the joint posterior distribution, we propose an efficient Gibbs-type sampling algorithm with Metropolis-Hastings steps for posterior Bayesian inference. Thereafter, following the idea of [1], we consider the problems of single and multiple change-point detection. The performance of the proposed procedures is illustrated and analyzed through simulation studies. A real application to U.S. stock market closing price data is analyzed for illustrative purposes.
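The abstract does not detail the sampler, so the sketch below illustrates the idea for the simplest special case only: a single variance change-point under a zero-mean Gaussian likelihood, with inverse-gamma priors on the two variances and a uniform prior on the change-point location, sampled by Gibbs steps. The prior settings and simulated data are hypothetical; the SMN extension would add latent mixing variables and the Metropolis-Hastings steps mentioned above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated series with a variance change at t = 120 (hypothetical example).
y = np.concatenate([rng.normal(0, 1.0, 120), rng.normal(0, 3.0, 80)])
n = len(y)

A, B = 2.0, 1.0          # hypothetical inverse-gamma prior hyperparameters
n_iter = 2000
tau = n // 2             # initial change-point guess
draws = []

for it in range(n_iter):
    # 1. Sample the two variances given the change-point (conjugate updates).
    s1 = np.sum(y[:tau] ** 2)
    s2 = np.sum(y[tau:] ** 2)
    sig1 = 1.0 / rng.gamma(A + tau / 2.0, 1.0 / (B + s1 / 2.0))
    sig2 = 1.0 / rng.gamma(A + (n - tau) / 2.0, 1.0 / (B + s2 / 2.0))

    # 2. Sample the change-point from its discrete full conditional.
    taus = np.arange(1, n)               # candidate change-points
    cum = np.cumsum(y ** 2)
    ss1 = cum[taus - 1]                  # sum of y^2 before the change
    ss2 = cum[-1] - ss1                  # sum of y^2 after the change
    logp = (-0.5 * taus * np.log(sig1) - ss1 / (2 * sig1)
            - 0.5 * (n - taus) * np.log(sig2) - ss2 / (2 * sig2))
    prob = np.exp(logp - logp.max())
    tau = rng.choice(taus, p=prob / prob.sum())

    if it >= n_iter // 2:                # keep draws after burn-in
        draws.append(tau)

print("posterior mean change-point:", np.mean(draws))
```

On this simulated series, the posterior draws of the change-point location should concentrate near the true change at t = 120.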

Relevance:

30.00%

Publisher:

Abstract:

Wind energy has been one of the fastest-growing sectors of the nation's renewable energy portfolio for the past decade, and the same trend is projected for the upcoming years given aggressive governmental policies for the reduction of fossil fuel dependency. So-called Horizontal Axis Wind Turbine (HAWT) technologies have shown great technological promise and outstanding commercial penetration. Given this broad acceptance, the size of wind turbines has increased dramatically over time. However, safety and economic concerns have emerged as a result of new design trends toward massive-scale wind turbine structures with high slenderness ratios and complex shapes, typically located in remote areas (e.g., offshore wind farms). In this regard, safe operation requires not only first-hand information on actual structural dynamic conditions under aerodynamic action, but also a deep understanding of the environmental factors in which these multibody rotating structures operate. Given the cyclo-stochastic patterns of the wind loading exerting pressure on a HAWT, a probabilistic framework is appropriate to characterize the risk of failure in terms of resistance and serviceability conditions at any given time. Furthermore, sources of uncertainty such as material imperfections, buffeting and flutter, aeroelastic damping, gyroscopic effects, and turbulence, among others, call for a more sophisticated mathematical framework that can properly handle all these sources of indeterminacy. The modeling complexity that arises from these characterizations demands a data-driven experimental validation methodology to calibrate and corroborate the model. To this end, System Identification (SI) techniques offer a spectrum of well-established numerical methods appropriate for stationary, deterministic, data-driven schemes, capable of predicting actual dynamic states (eigen-realizations) of traditional time-invariant dynamic systems. Consequently, a modified data-driven SI metric based on so-called Subspace Realization Theory is proposed, adapted for stochastic, non-stationary, and time-varying systems, as is the case for a HAWT's complex aerodynamics. Simultaneously, this investigation explores the characterization of the turbine loading and response envelopes for critical failure modes of the wind turbine's structural components. In the long run, both the aerodynamic framework (theoretical model) and the system identification (experimental model) will be merged in a numerical engine formulated as a search algorithm for model updating, known as the Adaptive Simulated Annealing (ASA) process. This iterative engine is based on a set of function minimizations computed with a metric called the Modal Assurance Criterion (MAC). In summary, the thesis is composed of four major parts: (1) development of an analytical aerodynamic framework that predicts interacting wind-structure stochastic loads on wind turbine components; (2) development of a novel tapered-swept-curved Spinning Finite Element (SFE) that includes damped gyroscopic effects and axial-flexural-torsional coupling; (3) a novel data-driven structural health monitoring (SHM) algorithm via stochastic subspace identification methods; and (4) a numerical search (optimization) engine based on ASA and MAC capable of updating the SFE aerodynamic model.
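The Modal Assurance Criterion mentioned above is a standard scalar measure of correlation between two mode-shape vectors, MAC(phi_a, phi_b) = |phi_a^H phi_b|^2 / ((phi_a^H phi_a)(phi_b^H phi_b)). A minimal sketch of its computation between a model-predicted and an identified mode shape is given below; the mode-shape vectors themselves are hypothetical.

```python
import numpy as np

def mac(phi_a, phi_b):
    """Modal Assurance Criterion between two mode-shape vectors.

    MAC = |phi_a^H phi_b|^2 / ((phi_a^H phi_a)(phi_b^H phi_b)),
    equal to 1 for perfectly correlated shapes and 0 for orthogonal ones.
    """
    phi_a, phi_b = np.asarray(phi_a), np.asarray(phi_b)
    num = np.abs(np.vdot(phi_a, phi_b)) ** 2
    den = np.vdot(phi_a, phi_a).real * np.vdot(phi_b, phi_b).real
    return num / den

# Hypothetical mode shapes: one from an analytical model, one identified from
# measured response data; the second is a scaled, slightly noisy copy.
phi_model = np.array([0.0, 0.31, 0.59, 0.81, 0.95, 1.00])
phi_identified = 2.3 * phi_model + np.random.default_rng(1).normal(0, 0.02, 6)

print("MAC =", round(mac(phi_model, phi_identified), 4))
```

Because the MAC is normalized, it is insensitive to the arbitrary scaling of identified mode shapes, which is what makes it a convenient objective for model-updating searches such as the ASA engine described above.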

Relevance:

30.00%

Publisher:

Abstract:

File system security is fundamental to the security of UNIX and Linux systems, since in these systems almost everything is in the form of a file. To protect system files and other sensitive user files from unauthorized access, organizations choose and deploy various security schemes in their computer systems. A file system security model provides a formal description of a protection system. Each security model is associated with specified security policies that focus on one or more of the security principles: confidentiality, integrity, and availability. A security policy is not only about "who" can access an object, but also about "how" a subject can access an object. To enforce the security policies, each access request is checked against the specified policies to decide whether it is allowed or rejected. The current protection schemes in UNIX/Linux systems focus on access control. Besides the basic access control scheme of the system itself, which includes permission bits, the setuid and seteuid mechanisms, and the root account, other protection models, such as Capabilities, Domain Type Enforcement (DTE), and Role-Based Access Control (RBAC), are supported and used in certain organizations. These models protect the confidentiality of data directly; the integrity of data is protected indirectly by allowing only trusted users to operate on the objects. The access control decisions of these models depend on either the identity of the user or the attributes of the process the user can execute, and on the attributes of the objects. Adoption of these sophisticated models has been slow; this is likely due to the enormous complexity of specifying controls over a large file system and the need for system administrators to learn a new paradigm for file protection. We propose a new security model: the file system firewall. It adapts the familiar network firewall protection model, used to control the data that flows between networked computers, to file system protection. This model can support access control decisions based on any system-generated attributes of the access request, e.g., time of day. Access control decisions are not based on a single entity, such as the account in traditional discretionary access control or the domain in DTE. In the file system firewall, access decisions are made based on situations involving multiple entities. A situation is programmable with predicates on the attributes of the subject, the object, and the system, and the file system firewall specifies the appropriate actions for these situations. We implemented a prototype of the file system firewall on SUSE Linux. Preliminary results of performance tests on the prototype indicate that the runtime overhead is acceptable. We compared the file system firewall with TE in SELinux to show that the firewall model can accommodate many other access control models. Finally, we show the ease of use of the firewall model. When the firewall system is restricted to a specified part of the system, all other resources are unaffected, which enables a relatively smooth adoption. This fact, together with the model's familiarity to system administrators, will facilitate adoption and correct use. The user study we conducted on traditional UNIX access control, SELinux, and the file system firewall confirmed this: beginner users found the firewall easier to use and faster to learn than the traditional UNIX access control scheme and SELinux.
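The "situation" concept described above, access decisions made from programmable predicates over attributes of the subject, the object, and the system (such as time of day), can be sketched as follows. This is an illustrative toy only; the rule format, attribute names, and example policies are hypothetical and do not reflect the prototype's actual interface.

```python
from datetime import datetime

# A firewall-style rule: a programmable predicate over the attributes of the
# subject, the object, and the system, plus the action to take when it matches.
# Rules are checked in order; the first match decides (all names hypothetical).
RULES = [
    # Deny writes to system configuration outside working hours.
    (lambda s, o, sys: o["path"].startswith("/etc/")
         and "write" in o["requested"]
         and not (9 <= sys["hour"] < 17),
     "deny"),
    # A backup daemon may read anything.
    (lambda s, o, sys: s["user"] == "backup" and o["requested"] == {"read"},
     "allow"),
    # Users may access files they own.
    (lambda s, o, sys: s["user"] == o["owner"], "allow"),
]
DEFAULT = "deny"

def decide(subject, obj, system):
    """Return the action of the first rule whose situation predicate matches."""
    for predicate, action in RULES:
        if predicate(subject, obj, system):
            return action
    return DEFAULT

request = {
    "subject": {"user": "alice"},
    "obj": {"path": "/etc/ssh/sshd_config", "owner": "root",
            "requested": {"write"}},
    "system": {"hour": datetime.now().hour},
}
print(decide(**request))
```

The point of the sketch is that the decision is a function of the whole situation, not of a single entity such as an account or a domain label, which is the distinction the abstract draws against traditional discretionary access control and DTE.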