961 results for Model-Based Design


Relevance: 100.00%

Abstract:

A very efficient learning algorithm for model subset selection is introduced, based on a new composite cost function that simultaneously optimizes the model's approximation ability, robustness, and adequacy. The model parameters are estimated via forward orthogonal least squares (OLS), while the subset-selection cost function includes a D-optimality design criterion that maximizes the determinant of the design matrix of the selected subset, ensuring the robustness, adequacy, and parsimony of the final model. Because the approach builds on the forward OLS algorithm, the new D-optimality-based cost function is constructed within the orthogonalization process to gain computational advantages, thereby retaining the inherent computational efficiency of the conventional forward OLS approach. Illustrative examples demonstrate the effectiveness of the new approach.
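The combined selection criterion can be pictured as a greedy forward procedure: each candidate regressor is orthogonalized against the columns already chosen, and its score adds a D-optimality-style log-determinant contribution to the usual error-reduction ratio. This is a minimal illustration, not the paper's exact cost function; the weight `beta` and the toy data are invented.

```python
import math

def forward_ols_dopt(X, y, n_select, beta=0.1):
    """Greedy forward selection of regressor columns.

    Each candidate column is orthogonalized (Gram-Schmidt) against the
    already-selected columns; the score combines the error-reduction
    ratio with a D-optimality-style term log(w'w), which for orthogonal
    regressors is the candidate's contribution to the log-determinant
    of the design matrix.
    """
    n = len(y)
    m = len(X[0])
    yy = sum(v * v for v in y)
    selected, basis = [], []            # chosen indices, orthogonal columns
    for _ in range(n_select):
        best = None
        for j in range(m):
            if j in selected:
                continue
            w = [X[i][j] for i in range(n)]
            for q in basis:             # orthogonalize against chosen columns
                qq = sum(v * v for v in q)
                proj = sum(w[i] * q[i] for i in range(n)) / qq
                w = [w[i] - proj * q[i] for i in range(n)]
            ww = sum(v * v for v in w)
            if ww < 1e-12:
                continue                # linearly dependent candidate, skip
            g = sum(w[i] * y[i] for i in range(n)) / ww
            err = g * g * ww / yy       # error-reduction ratio
            score = err + beta * math.log(ww)
            if best is None or score > best[0]:
                best = (score, j, w)
        selected.append(best[1])
        basis.append(best[2])
    return selected
```

With a strong predictor in column 0 and noise in column 1, the procedure picks column 0 first.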

Relevance: 100.00%

Abstract:

Construction planning plays a fundamental role in construction project management, and it requires teamwork among planners from a diverse range of disciplines who often work in geographically dispersed locations. Model-based four-dimensional (4D) computer-aided design (CAD) groupware, though considered a possible approach to supporting collaborative planning, still lacks effective collaborative mechanisms for teamwork owing to methodological, technological and social challenges. Targeting this problem, this paper proposes a model-based groupware solution that enables a group of multidisciplinary planners to perform real-time collaborative 4D planning across the Internet. In light of the interactive definition method and its computer-supported cooperative work (CSCW) design analysis, the paper discusses the realization of interactive collaborative mechanisms in terms of software architecture, application mode, and data exchange protocol. These mechanisms have been integrated into a groupware solution, which was validated by a planning team under genuinely geographically dispersed conditions. Analysis of the validation results revealed that the proposed solution makes real-time collaborative 4D planning feasible, yielding a robust construction plan through collaborative teamwork. The realization of this solution prompts further consideration of its enhancement for wider groupware applications.
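A real-time collaborative mechanism of this kind ultimately reduces to exchanging ordered edit events and enforcing locks on shared plan elements. The sketch below shows one plausible shape for such a data exchange protocol; the field names and lock semantics are assumptions, not the paper's actual protocol.

```python
import itertools
import json

_seq = itertools.count(1)

def make_edit_event(planner, action, task_id, payload):
    """Build one synchronization event for broadcast to all clients.

    The fields (seq, planner, action, task, payload) are illustrative,
    not the protocol used by the paper's groupware.
    """
    return json.dumps({
        "seq": next(_seq),      # total order used for conflict resolution
        "planner": planner,
        "action": action,       # "lock", "update", or "release"
        "task": task_id,
        "payload": payload,
    })

def apply_event(state, event_json):
    """Apply a received event to the local replica of the 4D plan;
    updates are accepted only from the planner holding the lock."""
    e = json.loads(event_json)
    task = state.setdefault(e["task"], {"locked_by": None})
    if e["action"] == "lock" and task["locked_by"] is None:
        task["locked_by"] = e["planner"]
    elif e["action"] == "update" and task["locked_by"] == e["planner"]:
        task.update(e["payload"])
    elif e["action"] == "release" and task["locked_by"] == e["planner"]:
        task["locked_by"] = None
    return state
```

An update from a planner who does not hold the lock is simply ignored, which keeps the replicas consistent under concurrent editing.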

Relevance: 100.00%

Abstract:

A three-stage treatment of domestic wastewater comprising anaerobic, anoxic and aerobic phases is employed in this study, with the clarifier unit replaced by a submerged membrane in the aerobic unit. The effects of the operational parameters of a pilot-scale submerged membrane bioreactor (SMBR), namely hydraulic retention time (HRT), return activated sludge ratio (QRS), internal recycle ratio (QIR), solids retention time (SRT) and dissolved oxygen (DO), on its performance are evaluated by simulation, using a hybrid model composed of the TUDP model, an oxygen transfer model, a biofouling model based on extracellular polymeric substances (EPS), and a turbulent shear model. The results showed that an anaerobic HRT of 3 hours, an anoxic HRT of 6 hours, a QRS of 20% and a QIR of 300% are satisfactory for obtaining high removal efficiency (>90%) of COD, NH4-N and PO4-P, as well as lower sludge production. An increase in sludge production causes an increase in EPS, which fouls the membrane surface and shortens the membrane cleaning cycle. Operating the SMBR system at 2 mg/L of DO and an SRT of 30 days can extend the membrane cleaning cycle dramatically. The membrane cleaning cycle, however, is strongly dependent on the initial and terminal specific fluxes and displays inverse power relationships to those fluxes.
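The final sentence states that the cleaning cycle follows inverse power relationships with the initial and terminal specific fluxes. A minimal sketch of such a relationship is shown below; the coefficients a, b and c are placeholders, not values fitted in the study.

```python
def cleaning_cycle_days(j_initial, j_terminal, a=1500.0, b=1.2, c=0.8):
    """Estimate the membrane cleaning cycle from the initial and
    terminal specific fluxes via an inverse power relationship,
    T = a * j_i**(-b) * j_t**(-c).

    The coefficients a, b, c are illustrative placeholders only.
    """
    return a * j_initial ** (-b) * j_terminal ** (-c)
```

Higher operating fluxes therefore predict a shorter interval between cleanings.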

Relevance: 100.00%

Abstract:

Dynamic variations in channel behavior are considered in the design of transmission power control for cellular radio systems. It is well known that power control increases system capacity, improves quality of service (QoS), and reduces multiuser interference. In this paper, an adaptive power control design based on identification of the underlying path-loss dynamics of the fading channel is presented. Formulating power control decisions from the measured received power levels allows the fading channel's path-loss dynamics to be modeled as a hidden Markov model (HMM). Applying an online HMM identification algorithm enables accurate estimation of the true path loss, ensuring efficient performance of the proposed power control scheme.
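The core of such a scheme is a forward (filtering) recursion over the hidden path-loss states, followed by a power decision based on the resulting state belief. The sketch below uses a two-state channel with invented transition probabilities, observation likelihoods and loss values; it illustrates the filtering step, not the paper's identification algorithm.

```python
def hmm_forward_step(belief, trans, obs_like):
    """One step of the HMM forward filter: predict through the
    transition matrix, weight by the observation likelihood, and
    renormalize.  States could be, e.g., 'low-loss'/'high-loss'
    path-loss regimes; all numbers here are illustrative."""
    n = len(belief)
    pred = [sum(belief[i] * trans[i][j] for i in range(n)) for j in range(n)]
    post = [pred[j] * obs_like[j] for j in range(n)]
    z = sum(post)
    return [p / z for p in post]

def tx_power_dbm(belief, loss_db=(80.0, 110.0), target_rx_dbm=-90.0):
    """Choose transmit power so the expected received power meets the
    target, given the belief over path-loss states."""
    expected_loss = sum(b * l for b, l in zip(belief, loss_db))
    return target_rx_dbm + expected_loss
```

An observation that strongly favors the high-loss state shifts the belief toward it, and the transmit power rises accordingly.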

Relevance: 100.00%

Abstract:

Pervasive computing is a user-centric mobile computing paradigm in which tasks should migrate across different platforms, shadow-like, as users move around. In this paper, we propose a context-sensitive task migration model that recovers program states and rebinds resources for task migration based on context semantics, by inserting resource-description and state-description sections into source programs. Based on this model, we design and develop a task migration framework, xMozart, which extends the Mozart platform with context awareness. Our approach can recover task states and rebind resources in a context-aware way, and also supports multi-modality I/O interactions. Extensive experiments demonstrate that our approach can migrate tasks by resuming them from the point of interruption, like shadows moving along with the users.
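The state-description and resource-description idea can be pictured as a checkpoint/resume pair in which resource references stay symbolic during migration and are rebound on the target platform. The structure below is a simplification of the paper's model, with invented field names.

```python
import json

def checkpoint(task):
    """Serialize a task's state section; resource references are kept
    symbolic (names of resource kinds, not platform handles) so they
    can be rebound on another platform."""
    return json.dumps({"state": task["state"], "resources": task["resources"]})

def resume(blob, local_bindings):
    """Rebuild the task on a new platform: restore the state section
    and rebind each symbolic resource to a platform-local equivalent."""
    snap = json.loads(blob)
    bound = {name: local_bindings[kind] for name, kind in snap["resources"].items()}
    return {"state": snap["state"], "resources": bound}
```

Migrating a task to a tablet, say, rebinds its "display" resource to the tablet's screen while the program state resumes unchanged.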

Relevance: 100.00%

Abstract:

Relaxed conditions for the stability of nonlinear continuous-time systems described by fuzzy models are presented. A theoretical analysis shows that the proposed method gives results that are better than, or at least equal to, those of methods presented in the literature; digital simulations exemplify this fact. The result is also used for the design of fuzzy regulators. The nonlinear systems are represented by the fuzzy models proposed by Takagi and Sugeno. The stability analysis and the controller design are described by linear matrix inequalities (LMIs), which can be solved efficiently using convex programming techniques.
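The basic (non-relaxed) Takagi-Sugeno stability condition asks for a common matrix P such that A_i^T P + P A_i is negative definite for every local model A_i; in practice this is posed as an LMI and solved by convex programming. The sketch below merely checks a candidate P for 2x2 local models via Sylvester's criterion — it is a verification step, not an LMI solver, and the matrices in the usage example are invented.

```python
def lyapunov_condition_holds(A_list, P):
    """Check the common quadratic Lyapunov condition for a
    Takagi-Sugeno model: M_i = A_i^T P + P A_i must be negative
    definite for every local model A_i (2x2 here, verified through
    leading principal minors).  This is the basic condition; the
    paper's relaxed conditions are less conservative."""
    for A in A_list:
        # M = A^T P + P A  (symmetric when P is symmetric)
        M = [[0.0, 0.0], [0.0, 0.0]]
        for i in range(2):
            for j in range(2):
                M[i][j] = sum(A[k][i] * P[k][j] + P[i][k] * A[k][j]
                              for k in range(2))
        # Sylvester: -M positive definite  <=>  m11 < 0 and det(M) > 0
        neg_def = M[0][0] < 0 and M[0][0] * M[1][1] - M[0][1] * M[1][0] > 0
        if not neg_def:
            return False
    return True
```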

Relevance: 100.00%

Abstract:

The use of non-pressure-compensating drip hose for horticultural and annual-cycle fruit crops is growing in Brazil. Here, the challenge for designers is to obtain longer lateral lines with high uniformity. The objective of this study was to develop a model for designing longer lateral lines using non-pressure-compensating drip hose. Using the developed model, the hypotheses evaluated were: a) using two different emitter spacings in the same lateral line allows a longer line; b) it is possible to obtain longer lateral lines under larger pressure variations along the line, provided the distribution uniformity stays within allowable limits. A computer program based on the model was developed in Delphi; it designs level lateral lines using non-pressure-compensating drip hose. The input data are: desired distribution uniformity (DU); initial and final pressure in the lateral line; coefficients of the relationship between emitter discharge and pressure head; hose internal diameter; pipe cross-sectional area at the dripper; and roughness coefficient for the Hazen-Williams equation. The program calculates the lateral line length in three ways: selecting two emitter spacings and defining the exchange point; using two pre-established emitter spacings and calculating the length of each section; or using a single emitter spacing. Results showed that using two sections with different emitter spacings did not allow a longer line, but gave better uniformity than a line with a single spacing. The adoption of two spacings increased the flow rate per meter in the final section, which represented approximately 80% of the total lateral line length, and this justifies their use. The software achieved DU above 90% with a pressure-head variation of 40% when two emitter spacings were used.
The developed model and software proved accurate, easy to use and useful for designing lateral lines with non-pressure-compensating drip hose.
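The hydraulic core of such a program is a stepwise walk along the lateral line: accumulate the emitter discharge q = k·h^x, add the Hazen-Williams friction loss over each emitter spacing, and stop when the allowable pressure variation is reached. A minimal sketch with generic coefficients (not the paper's calibrated values, and ignoring the dripper's local obstruction losses):

```python
def lateral_line(h_end, h_max, k, x, spacing, diameter, c_hw=140.0):
    """Back-step design of a level lateral line: start from the distal
    emitter at pressure head h_end (m) and walk upstream, accumulating
    emitter discharge q = k*h**x (m3/s) and Hazen-Williams friction
    loss, until the head would exceed h_max.  Returns the number of
    emitters and the head at each one."""
    h, q_total, heads = h_end, 0.0, []
    while True:
        q_total += k * h ** x               # this emitter's discharge
        # Hazen-Williams head loss over one emitter spacing (SI units)
        hf = 10.67 * spacing * (q_total / c_hw) ** 1.852 / diameter ** 4.871
        if h + hf > h_max:
            break                           # allowable variation reached
        heads.append(h)
        h += hf
    return len(heads), heads

def distribution_uniformity(heads, x):
    """DU approximated as the mean discharge of the low quarter over
    the overall mean discharge (emitter flow proportional to h**x)."""
    q = sorted(h ** x for h in heads)
    low = q[:max(1, len(q) // 4)]
    return (sum(low) / len(low)) / (sum(q) / len(q))
```

With a 16 mm hose, 0.5 m spacing and a 10-14 m head window, the walk yields a line of a few hundred emitters with DU above 90%.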

Relevance: 100.00%

Abstract:

A ligand-based drug design study was performed on acetaminophen regioisomers as analgesic candidates, employing quantum chemical calculations at the DFT/B3LYP level of theory with the 6-31G* basis set. Several molecular descriptors were used, such as the highest occupied molecular orbital, ionization potential, O-H bond dissociation energies, and spin densities, which may be related to the ability to quench the tyrosyl radical, giving N-acetyl-p-benzosemiquinone-imine through an initial electron withdrawal or hydrogen-atom abstraction. Based on this in silico work, the most promising molecule, orthobenzamol, was synthesized and tested. The results expected from the theoretical prediction were confirmed in vivo using mouse models of nociception, namely the writhing, paw-licking, and hot-plate tests. All biological results suggested an antinociceptive activity mediated by opioid receptors. Furthermore, at 90 and 120 min, the new compound had an effect comparable to that of morphine, the standard drug for this test. Finally, the pharmacophore model is discussed in terms of the electronic properties derived from the quantum chemistry calculations.

Relevance: 100.00%

Abstract:

In deterministic optimization, the uncertainties of the structural system (dimensions, model, material, loads, etc.) are not explicitly taken into account; hence the resulting optimal solutions may lead to reduced reliability levels. The objective of reliability-based design optimization (RBDO) is to optimize structures while guaranteeing that a minimum level of reliability, chosen a priori by the designer, is maintained. Since reliability analysis using the first-order reliability method (FORM) is itself an optimization procedure, RBDO in its classical version is a double-loop strategy: the reliability analysis (inner loop) and the structural optimization (outer loop). The coupling of these two loops leads to very high computational costs. To reduce the computational burden of FORM-based RBDO, several authors propose decoupling the structural optimization and the reliability analysis. These procedures may be divided into two groups: (i) serial single-loop methods and (ii) unilevel methods. The basic idea of serial single-loop methods is to decouple the two loops and solve them sequentially until some convergence criterion is achieved. Unilevel methods, on the other hand, employ different strategies to obtain a single optimization loop that solves the RBDO problem. This paper presents a review of such RBDO strategies. A comparison of the performance (computational cost) of the main strategies is presented for several variants of two benchmark problems from the literature and for a structure modeled using the finite element method.
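The inner reliability loop of classical RBDO can be illustrated by the standard Hasofer-Lind/Rackwitz-Fiessler (HL-RF) iteration for the FORM reliability index; the limit-state function in the usage example is illustrative only.

```python
def form_beta(g, grad_g, u0=(0.0, 0.0), iters=50):
    """HL-RF iteration for the FORM reliability index in standard
    normal space — the 'inner loop' that classical RBDO solves at
    every design point.  g and grad_g take a point u = (u1, u2)."""
    u = list(u0)
    for _ in range(iters):
        gv, gr = g(u), grad_g(u)
        gg = sum(c * c for c in gr)
        # HL-RF update: project onto the linearized limit state
        lam = (sum(gr[i] * u[i] for i in range(2)) - gv) / gg
        u = [lam * gr[i] for i in range(2)]
    return sum(c * c for c in u) ** 0.5   # beta = distance to the origin
```

For the linear limit state g(u) = 3 - u1 - u2 the iteration converges in one step to the exact beta = 3/sqrt(2), which is what makes it a convenient sanity check.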

Relevance: 100.00%

Abstract:

Background: Over recent years, a number of researchers have investigated how to improve the reuse of crosscutting concerns. New possibilities have emerged with the advent of aspect-oriented programming, and many frameworks have been designed around the abstractions provided by this paradigm. We call this type of framework a Crosscutting Framework (CF), as it usually encapsulates a generic and abstract design of one crosscutting concern. However, most of the proposed CFs employ white-box strategies in their reuse process, requiring two main technical skills: (i) knowing the syntax details of the programming language used to build the framework, and (ii) being aware of the architectural details of the CF and its internal nomenclature. Another problem is that the reuse process can only begin once development reaches the implementation phase, preventing it from starting earlier. Method: To solve these problems, we present a model-based approach for reusing CFs which shields application engineers from technical details, letting them concentrate on what the framework really needs from the application under development. To support our approach, two models are proposed: the Reuse Requirements Model (RRM) and the Reuse Model (RM). The former describes the framework structure, and the latter supports the reuse process. As soon as the application engineer has filled in the RM, the reuse code can be generated automatically. Results: We also present the results of two comparative experiments using two versions of a persistence CF: the original one, whose reuse process is based on writing code, and the new, model-based one. The first experiment evaluated productivity during the reuse process, and the second evaluated the effort of maintaining applications developed with each CF version.
The results show a 97% improvement in productivity; however, little difference was perceived in the effort of maintaining the resulting application. Conclusion: Using the approach presented herein, we conclude that (i) it is possible to automate the instantiation of CFs, and (ii) developer productivity improves when a model-based instantiation approach is used.
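The RM-to-code step can be pictured as a small template-driven generator: once the engineer fills in the model, instantiation code is emitted mechanically. The model fields and the aspect-like skeleton below are hypothetical, intended only to show the idea — they are not the paper's RM schema.

```python
def generate_reuse_code(reuse_model):
    """Generate framework-instantiation (reuse) code from a filled-in
    reuse model.  The emitted AspectJ-like skeleton is illustrative:
    it binds each listed application method to the crosscutting
    concern via a pointcut."""
    lines = [f"aspect {reuse_model['concern']}Binding {{"]
    for app_class, methods in reuse_model["join_points"].items():
        sig = " || ".join(f"execution(* {app_class}.{m}(..))" for m in methods)
        lines.append(f"    pointcut tracked(): {sig};")
    lines.append("}")
    return "\n".join(lines)
```

Filling in a concern name and a map of classes to methods is all the application engineer does; the syntax-level details live in the generator.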

Relevance: 100.00%

Abstract:

In this paper we present a new population-based method for the design of bone fixation plates. Standard pre-contoured plates are designed from the mean shape of a given population. We propose a computational process that designs implants so as to reduce the amount of intra-operative shaping required, thus reducing the mechanical stresses applied to the plate. A bending and torsion model was used to measure and minimize the necessary intra-operative deformation. The method was applied and validated on a population of 200 femurs, further augmented with a statistical shape model. The results showed a substantial reduction in the bending and torsion needed to shape the new design to any bone in the population, compared with standard mean-based plates.
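The benefit of optimizing over the population rather than taking the mean can be seen even with a single scalar shape parameter: to minimize the worst-case intra-operative deformation, the midrange beats the mean whenever the population is skewed. This stands in, very loosely, for the paper's full bending and torsion model; the curvature values are invented.

```python
def mean_based(curvatures):
    """Mean-shape plate: the average curvature of the population."""
    return sum(curvatures) / len(curvatures)

def minimax_based(curvatures):
    """Plate curvature minimizing the worst-case intra-operative
    bending over the population; for a single scalar parameter this
    is simply the midrange."""
    return (min(curvatures) + max(curvatures)) / 2.0

def worst_case_bend(plate, curvatures):
    """Largest deformation needed to fit the plate to any bone."""
    return max(abs(plate - c) for c in curvatures)
```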

Relevance: 100.00%

Abstract:

Research on rehabilitation has shown that appropriate, repetitive mechanical movements can help spinal cord injured individuals restore functional standing and walking. The objective of this work was to achieve appropriate, repetitive joint movements and an approximately normal gait through the PGO by replicating normal walking, while minimizing energy consumption for both the patient and the device. A model-based experimental investigative approach is presented in this dissertation. First, a human model was created in I-DEAS and human walking was simulated in ADAMS. The main feature of this model was the foot-ground contact model, which had distributed contact points along the foot and varying viscoelasticity. The model was validated by comparing simulated results for normal walking with measured results from the literature. It was then used to simulate walking with the current PGO and investigate the real causes of the device's poor function, even though its joint movements were close to those of normal walking. The direct cause was that only one leg moves at a time, which results in a short step length and no clearance after toe-off; this cannot be solved by simply adding power at both hip joints. To find a better answer, a PGO mechanism model was used to investigate different walking mechanisms by locking or releasing selected joints. A trade-off among energy consumption, control complexity, and standing position was found. Finally, a foot-release PGO virtual model was created and simulated, and the foot-release mechanism alone was developed into a prototype. Both the release mechanism and the foot-release design were validated experimentally by adding the foot release to the current PGO. This demonstrates an advance in improving the functional aspects of the current PGO, even without a complete physical foot-release PGO model for comparison.
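The distributed viscoelastic foot-ground contact model can be sketched as a set of one-sided spring-damper elements along the foot: each point produces force only in compression, proportional to penetration plus a damping term. The stiffness and damping values below are illustrative, not the dissertation's identified parameters.

```python
def contact_force(penetration, velocity, k=2.0e5, c=1.0e3):
    """Vertical force (N) at one foot-ground contact point: a
    spring-damper (viscoelastic) element active only in compression.
    penetration (m) is positive when the point is below ground;
    velocity (m/s) is positive downward."""
    if penetration <= 0.0:
        return 0.0                      # point is airborne: no force
    return max(0.0, k * penetration + c * velocity)

def total_ground_reaction(points):
    """Sum the distributed contact points along the foot:
    points = [(penetration_m, velocity_m_s), ...]."""
    return sum(contact_force(p, v) for p, v in points)
```

Varying k and c along the foot (stiffer heel, softer forefoot) reproduces the "varied viscoelasticity" feature of the model.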

Relevance: 100.00%

Abstract:

To mitigate greenhouse gas (GHG) emissions and reduce U.S. dependence on imported oil, the United States is pursuing several options for creating biofuels from renewable woody biomass (hereafter referred to as "biomass"). Because of the distributed nature of the biomass feedstock, the cost and complexity of biomass recovery operations pose significant challenges that hinder increased biomass utilization for energy production. To facilitate the exploration of the wide variety of conditions that promise profitable biomass utilization and tap unused forest residues, it is proposed to develop biofuel supply chain models based on optimization and simulation approaches. The biofuel supply chain is structured around four components: biofuel facility locations and sizes, biomass harvesting/forwarding, transportation, and storage. A Geographic Information System (GIS) based approach is proposed as a first step for selecting potential facility locations for biofuel production from forest biomass, using a set of evaluation criteria such as accessibility to biomass, the railway/road transportation network, water bodies, and workforce. The development of optimization and simulation models is also proposed. The results of the models will be used to determine (1) the number, locations, and sizes of the biofuel facilities, and (2) the amounts of biomass to be transported between the harvesting areas and the biofuel facilities over a 20-year timeframe. The multi-criteria objective is to minimize the weighted sum of delivered feedstock cost, energy consumption, and GHG emissions. Finally, a series of sensitivity analyses will be conducted to identify the sensitivity of the decisions, such as the optimal site selected for a biofuel facility, to changes in influential parameters such as biomass availability and transportation fuel price.
Intellectual Merit: The proposed research will facilitate the exploration of a wide variety of conditions that promise profitable biomass utilization in the renewable biofuel industry. The GIS-based facility location analysis considers a series of factors that have not been considered simultaneously in previous research; location analysis is critical to the financial success of biofuel production. Modeling woody biomass supply chains using both optimization and simulation, combined with the GIS-based approach as a precursor, has not been done to date. The optimization and simulation models can help ensure the economic and environmental viability and sustainability of the entire biofuel supply chain at both the strategic design level and the operational planning level. Broader Impacts: The proposed models for biorefineries can be applied to other types of manufacturing or processing operations using biomass, because the biomass feedstock supply chain is similar, if not the same, for biorefineries, biomass-fired or co-fired power plants, and torrefaction/pelletization operations. Additionally, the results of this research will continue to be disseminated internationally through publications in journals such as Biomass and Bioenergy and Renewable Energy, and through presentations at conferences such as the 2011 Industrial Engineering Research Conference. For example, part of the work related to biofuel facility identification has been published: Zhang, Johnson and Sutherland [2011] (see Appendix A). There will also be opportunities for the Michigan Tech campus community to learn about the research through the Sustainable Future Institute.
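The weighted-sum, multi-criteria screening of candidate facility sites can be sketched as a brute-force scoring loop over sites and harvest areas; all per-ton figures and weights below are made-up placeholders, not the proposal's data.

```python
def select_site(sites, supply, weights=(1.0, 0.05, 0.02)):
    """Weighted-sum screening of candidate biofuel facility sites.

    sites:  {site_name: {harvest_area: (cost_usd_t, energy_mj_t, ghg_kg_t)}}
    supply: {harvest_area: tons_available}
    weights monetize the energy and GHG criteria relative to cost.
    Returns (best_site, best_score)."""
    w_cost, w_energy, w_ghg = weights
    best_site, best_score = None, float("inf")
    for name, per_area in sites.items():
        score = 0.0
        for area, tons in supply.items():
            c, e, g = per_area[area]
            # tons shipped from this area, each ton weighted across criteria
            score += tons * (w_cost * c + w_energy * e + w_ghg * g)
        if score < best_score:
            best_site, best_score = name, score
    return best_site, best_score
```

In the full proposal this scoring would sit inside the optimization model, with facility sizes and multi-site choices as decision variables rather than a fixed enumeration.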

Relevance: 100.00%

Abstract:

A range of societal issues has been caused by fossil fuel consumption in the transportation sector of the United States (U.S.), including health-related air pollution, climate change, dependence on imported oil, and other oil-related national security concerns. Biofuel production from lignocellulosic biomass types such as wood, forest residues, and agricultural residues has the potential to replace a substantial portion of total fossil fuel consumption. This research focuses on locating biofuel facilities and designing the biofuel supply chain to minimize overall cost. For this purpose, an integrated methodology was proposed that combines GIS technology with simulation and optimization modeling. As a precursor to the simulation and optimization modeling, the GIS-based methodology was used to preselect potential biofuel facility locations for biofuel production from forest biomass by employing a series of decision factors; the resulting candidate sites served as inputs to the simulation and optimization models. Candidate locations were selected based on a set of evaluation criteria, including: county boundaries, the railroad transportation network, the state/federal road transportation network, water body (rivers, lakes, etc.) dispersion, city and village dispersion, population census data, biomass production, and no co-location with co-fired power plants. The simulation and optimization models were built around the key supply activities: biomass harvesting/forwarding, transportation, and storage. On-site storage serves the spring breakup period, when road restrictions are in place and truck transportation on certain roads is limited.
Both models were evaluated using multiple performance indicators: cost (comprising delivered feedstock cost and inventory holding cost), energy consumption, and GHG emissions. The impacts of energy consumption and GHG emissions were expressed in monetary terms for consistency with cost. Compared with the optimization model, the simulation model offers a more dynamic look at a 20-year operation by considering the impacts of building inventory at the biorefinery to address the limited availability of biomass feedstock during the spring breakup period. The number of trucks required per day was estimated, and the inventory level was tracked year-round. Through the exchange of information across the harvesting, transportation, and biomass feedstock processing procedures, a smooth flow of biomass from harvesting areas to a biofuel facility was implemented. The optimization model was developed to address the problem of locating multiple biofuel facilities simultaneously; the size of each potential biofuel facility is bounded between 30 MGY and 50 MGY. The optimization model is a static, Mathematical Programming Language (MPL)-based application that allows sensitivity analysis by changing inputs to evaluate different scenarios. It was found that annual biofuel demand and biomass availability affect the optimal biofuel facility locations and sizes.
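The role of pre-breakup inventory building can be illustrated with a one-year day loop: deliveries stop during the breakup window, so the delivery rate outside it must build enough stock to keep the plant running. All quantities are illustrative, not the dissertation's data.

```python
def simulate_inventory(daily_demand, daily_delivery, breakup_days):
    """One-year sketch of biorefinery feedstock inventory: trucks
    deliver daily_delivery tons except during the spring breakup
    window, while the plant consumes daily_demand tons every day.
    Returns the day-by-day inventory trace (negative = stockout)."""
    inventory, trace = 0.0, []
    for day in range(365):
        if day not in breakup_days:
            inventory += daily_delivery   # roads open: deliveries arrive
        inventory -= daily_demand         # plant keeps running regardless
        trace.append(inventory)
    return trace
```

A delivery rate that merely covers annual demand can still stock out mid-breakup; raising it builds the pre-breakup buffer the simulation model is meant to size.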

Relevance: 100.00%

Abstract:

Models are an effective tool for systems and software design: they allow software architects to abstract away non-relevant details. Those qualities are also useful for the technical management of networks, systems and software, such as those that compose service-oriented architectures. Models can provide a set of well-defined abstractions over the distributed, heterogeneous service infrastructure that enable its automated management. We propose using the managed system as a source of dynamically generated runtime models, and decomposing management processes into compositions of model transformations. We have created an autonomic service deployment and configuration architecture that obtains, analyzes, and transforms system models to apply the required actions, while remaining oblivious to low-level details. An instrumentation layer automatically builds these models and interprets the planned management actions onto the system. We illustrate these concepts with a distributed service update operation.
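The decomposition of a management process into model transformations can be sketched as a three-step pipeline: build a runtime model from the system, transform it into planned actions, and interpret those actions back onto the system. Names and model fields below are invented, not the paper's metamodel.

```python
def snapshot_model(nodes):
    """Instrumentation layer, upward direction: build a runtime model
    of the managed system (node name -> deployed service version)."""
    return {n["name"]: n["version"] for n in nodes}

def plan_update(model, target_version):
    """Model transformation: turn the runtime model into the list of
    management actions that bring every node to the target version."""
    return [("update", name, target_version)
            for name, v in model.items() if v != target_version]

def apply_actions(nodes, actions):
    """Instrumentation layer, downward direction: interpret the
    planned actions onto the managed system."""
    for kind, name, version in actions:
        for n in nodes:
            if kind == "update" and n["name"] == name:
                n["version"] = version
    return nodes
```

The planner only ever sees the model, never the nodes themselves, which is the "oblivious to low-level details" property the architecture aims for.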