869 results for Building Design Process
Abstract:
This paper presents both the theoretical and the experimental approaches to the development of a mathematical model to be used in multi-variable control system designs of an active suspension for a sport utility vehicle (SUV), in this case a light pickup truck. A complete seven-degree-of-freedom model is identified quickly and successfully, with very satisfactory results both in simulations and in real experiments conducted with the pickup truck. The novelty of the proposed methodology is the use of commercial software in the early stages of the identification to speed up the process and to minimize the need for a large number of costly experiments. The paper also presents major contributions to the identification of uncertainties in vehicle suspension models and to the development of identification methods using sequential quadratic programming, where an innovation regarding the calculation of the objective function is proposed and implemented. Results from simulations and from practical experiments with the real SUV are presented, analysed, and compared, showing the potential of the method.
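As a rough illustration of this kind of identification step, the sketch below fits suspension parameters of a quarter-car stand-in (not the paper's seven-degree-of-freedom model) to a measured response using SciPy's SQP solver (SLSQP). The road input, noise level, known body mass, parameter names and bounds are all assumptions made for the example.

```python
# Minimal sketch: identify suspension stiffness and damping of a quarter-car
# model by minimizing a sum-of-squared-errors objective with SLSQP (SQP).
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize

MS = 450.0                                   # body mass, assumed known (kg)

def quarter_car(t, x, ks, cs, road):
    # x = [body displacement, body velocity]; damper referenced to ground for simplicity
    z, zdot = x
    zddot = (-ks * (z - road(t)) - cs * zdot) / MS
    return [zdot, zddot]

def simulate(params, t, road):
    ks, cs = params
    sol = solve_ivp(quarter_car, (t[0], t[-1]), [0.0, 0.0],
                    t_eval=t, args=(ks, cs, road))
    return sol.y[0]

def objective(params, t, road, measured):
    # sum of squared errors between simulated and measured body displacement
    return np.sum((simulate(params, t, road) - measured) ** 2)

t = np.linspace(0.0, 5.0, 500)
road = lambda tt: 0.05 * np.sin(2 * np.pi * 1.5 * tt)       # synthetic road profile
measured = simulate((20000.0, 1500.0), t, road) \
           + np.random.normal(0.0, 1e-4, t.size)             # synthetic "experimental" data

res = minimize(objective, x0=(10000.0, 800.0), args=(t, road, measured),
               method="SLSQP", bounds=[(1e3, 1e5), (1e2, 1e4)])
print("identified stiffness and damping:", res.x)
```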
Abstract:
Compliant mechanisms can achieve a specified motion, as a mechanism would, without relying on joints and pins. They have broad application in precision mechanical devices and Micro-Electro-Mechanical Systems (MEMS) but may lose accuracy and produce undesirable displacements when subjected to temperature changes. These undesirable effects can be reduced by using sensors in combination with control techniques and/or by applying special design techniques that address them at the design stage, a process generally termed "design for precision". This paper describes a design for precision method based on a topology optimization method (TOM) for compliant mechanisms that includes thermal compensation features. The optimization problem emphasizes actuator accuracy and is formulated to yield optimal compliant mechanism configurations that maximize the desired output displacement when a force is applied, while minimizing undesirable thermal effects. To demonstrate the effectiveness of the method, two-dimensional compliant mechanisms are designed considering thermal compensation, and their performance is compared with compliant mechanism designs that do not consider thermal compensation. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
Tungsten carbide has a wide range of applications, mainly as cemented carbides made of WC and Co, used as wear-resistant materials. However, the high cost of WC-Co powders encourages the use of a substrate to manufacture a functionally graded material (FGM) tool made of WC-Co and a tool steel. These materials combine the high wear resistance of the cemented carbide with the toughness of the steel. This work studies the interaction of WC-Co and H13 steel in order to design a functionally graded material by means of spark plasma sintering (SPS). SPS, a novel sintering technique that consolidates powders at relatively low temperatures and short dwell times, is a promising materials-processing technique. In this study, WC, H13 steel, WC-Co, WC-H13 steel, and WC-Co-H13 steel bulk samples were investigated using scanning electron microscopy and X-ray diffraction to evaluate the phase transformations involved in the SPS consolidation process. Precipitation of W2C and W3Fe3C was identified after SPS consolidation of the WC and WC-H13 steel samples, respectively. Precipitation of W4Co2C was also identified in the WC-Co and WC-Co-H13 steel samples. The WC-H13 steel and WC-Co-H13 steel samples were also evaluated after heat treatments at 1100 degrees C for 9 h, which enhanced the chemical interaction and the precipitation of W3Fe3C and W4Co2C, respectively. (C) 2009 Elsevier Ltd. All rights reserved.
Diagnostic errors and repetitive sequential classifications in on-line process control by attributes
Abstract:
The procedure of on-line process control by attributes, known as Taguchi's on-line process control, consists of inspecting the m-th item (a single item) for every m produced items and deciding, at each inspection, whether the fraction of conforming items has been reduced or not. If the inspected item is non-conforming, the production is stopped for adjustment. As the inspection system can be subject to diagnostic errors, a probabilistic model is developed that classifies the examined item repeatedly until either a conforming or b non-conforming classifications have been observed. The first event that occurs (a conforming classifications or b non-conforming classifications) determines the final classification of the examined item. Properties of an ergodic Markov chain were used to obtain the expression for the average cost of the control system, which can be optimized over three parameters: the sampling interval of the inspections (m); the number of repeated conforming classifications (a); and the number of repeated non-conforming classifications (b). The optimum design is compared with two alternative approaches. The first consists of a simple preventive policy: the production system is adjusted after every n produced items (no inspection is performed). The second classifies the examined item repeatedly a fixed number of times, r, and considers it conforming if most of the classification results are conforming. Results indicate that the current proposal performs better than the procedure that fixes the number of repeated classifications and classifies the examined item as conforming if most classifications are conforming. On the other hand, depending on the degree of errors and the costs, the preventive policy can on average be the most economical alternative compared with those that require inspection. A numerical example illustrates the proposed procedure. (C) 2009 Elsevier B.V. All rights reserved.
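The repeated-classification rule can be made concrete with a short Monte Carlo sketch. The error rates alpha and beta, the thresholds a = b = 2 and the trial count below are hypothetical choices; the simulation only illustrates the decision rule itself, not the Markov-chain cost optimization of the paper.

```python
# Sketch of the rule: an item is classified repeatedly until either
# a "conforming" or b "non-conforming" classifications have accumulated;
# whichever total is reached first gives the final verdict.
import random

def final_classification(item_is_conforming, a, b, alpha, beta, rng=random):
    """Return True if the item ends up classified as conforming."""
    conf, nonconf = 0, 0
    while conf < a and nonconf < b:
        if item_is_conforming:
            observed_conforming = rng.random() >= alpha   # alpha: good item seen as bad
        else:
            observed_conforming = rng.random() < beta     # beta: bad item seen as good
        if observed_conforming:
            conf += 1
        else:
            nonconf += 1
    return conf == a

# Monte Carlo check of the rule's operating characteristics
trials = 100_000
p_accept_good = sum(final_classification(True, a=2, b=2, alpha=0.05, beta=0.10)
                    for _ in range(trials)) / trials
p_accept_bad = sum(final_classification(False, a=2, b=2, alpha=0.05, beta=0.10)
                   for _ in range(trials)) / trials
print(f"P(accept | conforming)     ~ {p_accept_good:.3f}")
print(f"P(accept | non-conforming) ~ {p_accept_bad:.3f}")
```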
Abstract:
The procedure for on-line process control by attributes consists of inspecting a single item for every m produced items. On the basis of the inspection result, it is decided whether the process is in control (the conforming fraction is stable) or out of control (the conforming fraction has decreased, for example). Most articles on on-line process control consider stopping the production process for an adjustment when the inspected item is non-conforming (production is then restarted in control; this is here denominated a corrective adjustment). Moreover, the articles related to this subject do not present semi-economical designs (which may yield high quantities of non-conforming items), as they do not include a policy of preventive adjustments (in which case no item is inspected), which can be more economical, mainly if the inspected item can be misclassified. In this article, the choice between a preventive and a corrective adjustment of the process is made at every m-th produced item. If a preventive adjustment is decided upon, no item is inspected. Otherwise, the m-th item is inspected; if it conforms, production goes on; if not, an adjustment takes place and the process restarts in control. This approach is economically feasible for some practical situations, and the parameters of the proposed procedure are determined by minimizing an average cost function subject to some statistical restrictions (for example, to assure a minimal level, fixed in advance, of conforming items in the production process). Numerical examples illustrate the proposal.
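A minimal simulation of the inspect-or-preventively-adjust policy is sketched below, assuming a simple shift model and placeholder probabilities and costs; the paper's actual cost function, misclassification model and statistical restrictions are not reproduced.

```python
# Monte Carlo sketch: at every m-th item either perform a preventive
# adjustment (no inspection) or inspect the item and adjust only if it is
# non-conforming. All probabilities and costs are hypothetical placeholders.
import random

def average_cost(m, preventive_every_k, n_items=200_000,
                 p_shift=0.005,      # chance per item of shifting out of control
                 p_conf_in=0.99,     # conforming fraction while in control
                 p_conf_out=0.90,    # conforming fraction while out of control
                 c_insp=1.0, c_adj=50.0, c_nc=20.0,
                 rng=random):
    cost, in_control, interval = 0.0, True, 0
    for i in range(1, n_items + 1):
        if in_control and rng.random() < p_shift:
            in_control = False
        p_conf = p_conf_in if in_control else p_conf_out
        conforming = rng.random() < p_conf
        if not conforming:
            cost += c_nc                      # cost of producing a defective item
        if i % m == 0:                        # decision point
            interval += 1
            if interval % preventive_every_k == 0:
                cost += c_adj                 # preventive adjustment, no inspection
                in_control = True
            else:
                cost += c_insp                # inspect the m-th item
                if not conforming:            # misclassification omitted for brevity
                    cost += c_adj             # corrective adjustment
                    in_control = True
    return cost / n_items

# Compare a few (m, k) designs; in the paper these parameters would come from
# minimizing the cost subject to a minimum conforming-fraction constraint.
for m, k in [(10, 5), (20, 5), (20, 10)]:
    print(m, k, round(average_cost(m, k), 3))
```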
Abstract:
Among the several sources of process variability, valve friction and inadequate controller tuning are thought to be two of the most prevalent. Friction quantification methods can be applied to the development of model-based compensators or to diagnose valves that need repair, whereas accurate process models can be used for controller retuning. This paper extends existing methods that jointly estimate the friction and process parameters, so that a nonlinear structure is adopted to represent the process model. The developed estimation algorithm is tested with three different data sources: a simulated first-order-plus-dead-time process, a hybrid setup (composed of a real valve and a simulated pH neutralization process), and three industrial datasets corresponding to real control loops. The results demonstrate that the friction is accurately quantified and that "good" process models are estimated in several situations. Furthermore, when a nonlinear process model is considered, the proposed extension presents significant advantages: (i) greater accuracy of friction quantification and (ii) reasonable estimates of the nonlinear steady-state characteristics of the process. (C) 2010 Elsevier Ltd. All rights reserved.
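In the spirit of the joint friction-and-process estimation described above, the sketch below fits a one-parameter deadband friction model in series with a first-order process to synthetic loop data using nonlinear least squares. The friction model, the process structure and the data are simplifying assumptions, not the paper's formulation.

```python
# Jointly estimate a valve friction band and first-order process parameters
# from controller-output / process-output data (all synthetic here).
import numpy as np
from scipy.optimize import least_squares

def valve_with_friction(u, d):
    """Deadband-style stiction: the stem moves only when the demand
    differs from the current position by more than d."""
    x = np.empty_like(u)
    pos = u[0]
    for k, uk in enumerate(u):
        if abs(uk - pos) > d:
            pos = uk - np.sign(uk - pos) * d
        x[k] = pos
    return x

def simulate(params, u):
    d, a, b = params
    x = valve_with_friction(u, d)
    y = np.zeros_like(u)
    for k in range(1, len(u)):                 # first-order discrete process
        y[k] = a * y[k - 1] + b * x[k - 1]
    return y

def residuals(params, u, y_meas):
    return simulate(params, u) - y_meas

# Synthetic "plant" data generated with friction band d = 0.5
rng = np.random.default_rng(0)
u = np.repeat(rng.uniform(-3, 3, 40), 25)      # piecewise-constant controller output
y_meas = simulate([0.5, 0.9, 0.1], u) + rng.normal(0, 0.01, u.size)

fit = least_squares(residuals, x0=[0.1, 0.5, 0.5], args=(u, y_meas),
                    bounds=([0.0, 0.0, 0.0], [2.0, 0.999, 5.0]))
print(fit.x)   # estimated friction band, process pole, process gain
```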
Abstract:
The optimization of the treatment process for residual waters from a brewery, operating as a combination of an anaerobic reactor and activated sludge, was studied in two phases. In the first stage, lasting six months, the characteristics and parameters of the plant operation were analyzed; during this stage a diversion rate of more than 50% to the aerobic treatment, the use of two aeration tanks, and a high sludge production prevailed. The second stage comprised four months during which the system worked under the proposed operational model, with the aim of improving the treatment: reduction of the diversion rate to 30% and use of only one aeration tank. At each stage, TSS, VSS, and COD were measured at the entrance and exit of the anaerobic reactor and the aeration tanks. The results were compared with the corresponding design specifications, and the conditions needed to reduce the diversion rate towards the aerobic process were applied by monitoring the volume and concentration of the influent, while making the strategic changes in reactor parameters needed to increase its efficiency. A reduction of the diversion rate from 53% to 34% was achieved, reducing the sludge discharge generated in the aerobic system from 3670 mg TSS/l with two aeration tanks down to 2947 mg TSS/l using one tank, while keeping the same VSS:TSS ratio (0.55) and a total removal efficiency of 98% in terms of COD.
Abstract:
The aim of this paper is to present an economical design of an X chart for short-run production. The process mean starts equal to mu(0) (in control, State I) and at a random time shifts to mu(1) > mu(0) (out of control, State II). The monitoring procedure consists of inspecting a single item for every m produced items. If the measurement of the quality characteristic does not fall within the control limits, the process is stopped and adjusted, and an additional (r - 1) items are inspected retrospectively. The probabilistic model was developed considering only shifts in the process mean. A direct search technique is applied to find the optimum parameters that minimize the expected cost function. Numerical examples illustrate the proposed procedure. (C) 2009 Elsevier B.V. All rights reserved.
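A hedged sketch of the direct-search step follows: a grid of candidate designs (sampling interval m, control-limit width k) is scored with a simplified renewal-cycle cost model. The shift size, cost figures and the cost expression itself are illustrative stand-ins, not the paper's cost function.

```python
# Direct (grid) search for the cheapest control-chart design under a
# simplified per-item cost model.
import math

def norm_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def cost_per_item(m, k, delta=1.5, p_shift=0.005,
                  c_insp=1.0, c_false=100.0, c_adj=200.0, c_nc=8.0):
    alpha = 2.0 * (1.0 - norm_cdf(k))                  # false alarm per in-control sample
    beta = norm_cdf(k - delta) - norm_cdf(-k - delta)  # miss per out-of-control sample
    items_in = 1.0 / p_shift                           # expected in-control run (items)
    items_out = m / (1.0 - beta)                       # items produced until the shift is signalled
    cycle = items_in + items_out
    samples_in = items_in / m
    cost = (c_insp * cycle / m                         # inspections over the cycle
            + c_false * alpha * samples_in             # false alarms while in control
            + c_nc * items_out                         # extra defect cost while shifted
            + c_adj)                                   # adjustment at detection
    return cost / cycle

grid = [(m, k) for m in range(5, 51, 5) for k in (2.0, 2.5, 3.0, 3.5)]
best = min(grid, key=lambda mk: cost_per_item(*mk))
print("best (m, k):", best, "expected cost/item:", round(cost_per_item(*best), 4))
```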
Abstract:
This article presents an evaluation of the effects of spouted bed design and operating conditions on system fluid dynamics and process performance during enteric coating of hard gelatine capsules. The design parameters studied were the column diameter (150 mm and 200 mm), the included angle of the conical base, gamma (60 degrees or 40 degrees), and the presence or absence of a Venturi inserted before the inlet air orifice. The process variables studied were the ratio of the feed flow rate of the coating suspension to the spouting gas flow rate (Ws/Wg), the mass of capsules loaded into the equipment (M0), and the ratio of the spouting gas flow rate to the gas flow rate at the minimum spouting condition (Q/Qms). The response variables were the rate of increase of the capsule mass (K1) and the adhesion efficiency (eta). The linear regression equation for the dependent variable K1 in terms of the independent variables adequately described the process, with an r^2 value of 0.872. Analysis of variance (ANOVA) revealed that increasing Ws/Wg, Q/Qms, and gamma significantly increased the adhesion efficiency. Adhesion efficiencies higher than 90% were achieved by selecting appropriate coating conditions, indicating the feasibility of the process for coating hard gelatine capsules. (C) 2008 Elsevier B.V. All rights reserved.
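For illustration only, the sketch below fits a linear regression of K1 on coded levels of Ws/Wg, Q/Qms and gamma by ordinary least squares and reports r^2. The design matrix and responses are placeholder values, not the experimental data of the study.

```python
# Fit K1 = b0 + b1*(Ws/Wg) + b2*(Q/Qms) + b3*gamma on coded (-1/+1) factors
# and compute the coefficient of determination r^2.
import numpy as np

# intercept column plus coded levels for Ws/Wg, Q/Qms, gamma
X = np.array([
    [1, -1, -1, -1],
    [1,  1, -1, -1],
    [1, -1,  1, -1],
    [1,  1,  1, -1],
    [1, -1, -1,  1],
    [1,  1, -1,  1],
    [1, -1,  1,  1],
    [1,  1,  1,  1],
], dtype=float)
K1 = np.array([0.8, 1.4, 1.0, 1.7, 0.9, 1.6, 1.2, 2.0])   # placeholder responses

beta, *_ = np.linalg.lstsq(X, K1, rcond=None)
pred = X @ beta
r2 = 1 - np.sum((K1 - pred) ** 2) / np.sum((K1 - K1.mean()) ** 2)
print("coefficients:", np.round(beta, 3), " r^2:", round(r2, 3))
```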
Abstract:
The results presented in this report form part of a larger global study on the major issues in BPM. Only one part of the larger study is reported here, viz. interviews with BPM experts. Interviews of BPM tool vendors, together with focus groups involving user organizations, are continuing in parallel and will set the groundwork for the identification of BPM issues on a global scale via a survey (including a Delphi study). Through this multi-method approach, we identify four distinct sets of outcomes. First, as is the focus of this report, we identify the BPM issues as perceived by BPM experts. Second, the research design allows us to gain insight into the opinions of organisations deploying BPM solutions. Third, an understanding of organizations' misconceptions of BPM technologies, as confronted by BPM tool vendors, is obtained. Last, we seek to gain an understanding of BPM issues on a global scale, together with knowledge of matters of concern. This final outcome is aimed at producing an industry-driven research agenda which will inform practitioners and, in particular, the research community worldwide about issues and challenges that are prevalent or emerging in BPM and related areas.
Abstract:
Business process design is primarily driven by process improvement objectives. However, the role of control objectives stemming from regulations and standards is becoming increasingly important for businesses in light of recent events that led to some of the largest scandals in corporate history. As organizations strive to meet compliance agendas, there is an evident need to provide systematic approaches that assist in the understanding of the interplay between (often conflicting) business and control objectives during business process design. In this paper, our objective is twofold. We will firstly present a research agenda in the space of business process compliance, identifying major technical and organizational challenges. We then tackle a part of the overall problem space, which deals with the effective modeling of control objectives and subsequently their propagation onto business process models. Control objective modeling is proposed through a specialized modal logic based on normative systems theory, and the visualization of control objectives on business process models is achieved procedurally. The proposed approach is demonstrated in the context of a purchase-to-pay scenario.
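As a toy illustration of propagating control objectives onto a process model, the sketch below attaches precedence obligations to the tasks of a purchase-to-pay sequence. The process, the objectives and the dictionary-style representation are assumptions for illustration and do not reproduce the paper's modal-logic formalization.

```python
# Attach control objectives (expressed as "task A must precede task B")
# to the tasks of a simple sequential process model and flag violations.
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    controls: list = field(default_factory=list)   # control objectives attached here

process = [Task("create purchase order"), Task("approve purchase order"),
           Task("receive goods"), Task("pay invoice")]

# obligations: (antecedent task, obliged task) -- antecedent must occur first
objectives = [("approve purchase order", "pay invoice"),
              ("receive goods", "pay invoice")]

names = [t.name for t in process]
for antecedent, obliged in objectives:
    task = process[names.index(obliged)]
    satisfied = names.index(antecedent) < names.index(obliged)
    task.controls.append((f"requires '{antecedent}' first", satisfied))

for t in process:
    print(t.name, "->", t.controls or "no controls")
```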
Abstract:
The development of large-scale solid-state fermentation (SSF) processes is hampered by the lack of simple tools for the design of SSF bioreactors. The use of semifundamental mathematical models to design and operate SSF bioreactors can be complex. In this work, dimensionless design factors are used to predict the effects of scale and of operational variables on the performance of rotating drum bioreactors. The dimensionless design factor (DDF) is the ratio of the rate of heat generation to the rate of heat removal at the time of peak heat production. It can be used to predict the maximum temperature reached within the substrate bed for given operational variables. Alternatively, given the maximum temperature that can be tolerated during the fermentation, it can be used to explore the combinations of operating variables that prevent that temperature from being exceeded. Comparison of the predictions of the DDF approach with literature data for the operation of rotating drums suggests that the DDF is a useful tool. The DDF approach was used to explore the consequences of three scale-up strategies on the required air flow rates and the maximum temperatures reached in the substrate bed as the bioreactor size was increased on the basis of geometric similarity. The first of these strategies was to keep the superficial flow rate of the process air through the drum constant. The second was to keep the ratio of the volume of air per volume of bioreactor constant. The third strategy was to adjust the air flow rate with increasing scale in such a manner as to keep constant the maximum temperature attained in the substrate bed during the fermentation. (C) 2000 John Wiley & Sons, Inc.
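A small numerical sketch of the DDF idea is given below: the peak rate of heat generation is compared with the rate at which the process air can remove heat, and the air flow needed to hold the factor (and hence the peak bed temperature) constant is computed across geometrically similar drum sizes. All property values and the simple energy balance are assumptions, not the correlations used in the paper.

```python
# Compare peak metabolic heat generation with convective heat removal by the
# process air, and size the air flow for geometrically similar rotating drums.
import math

RHO_AIR, CP_AIR = 1.15, 1006.0         # air density (kg/m3) and heat capacity (J/(kg K))
Q_PEAK = 80.0                          # W per kg of moist substrate at peak (assumed)
BED_DENSITY, FILL = 400.0, 0.30        # substrate bulk density (kg/m3), drum fill fraction
T_IN, T_MAX = 30.0, 40.0               # inlet air and allowed bed temperature (deg C)

def drum_volume(diameter, aspect=2.0):
    # cylindrical drum with length = aspect * diameter
    return math.pi * (diameter / 2) ** 2 * (aspect * diameter)

def ddf(air_flow_m3s, diameter):
    substrate_kg = BED_DENSITY * FILL * drum_volume(diameter)
    generation = Q_PEAK * substrate_kg                          # W at peak
    removal = RHO_AIR * CP_AIR * air_flow_m3s * (T_MAX - T_IN)  # W the air can carry away
    return generation / removal

def air_flow_for_ddf(diameter, target_ddf=1.0):
    # strategy 3: choose the flow so the factor (and hence the peak temperature) stays constant
    substrate_kg = BED_DENSITY * FILL * drum_volume(diameter)
    return Q_PEAK * substrate_kg / (target_ddf * RHO_AIR * CP_AIR * (T_MAX - T_IN))

for d in (0.2, 0.4, 0.8):              # geometric similarity: scale the diameter
    f = air_flow_for_ddf(d)
    print(f"D = {d:.1f} m  ->  air flow ~ {f*1000:.1f} L/s, "
          f"flow per drum volume ~ {f/drum_volume(d):.3f} 1/s")
```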
Abstract:
In this paper we propose a new framework for evaluating designs based on work domain analysis, the first phase of cognitive work analysis. We develop a rationale for a new approach to evaluation by describing the unique characteristics of complex systems and by showing that systems engineering techniques only partially accommodate these characteristics. We then present work domain analysis as a complementary framework for evaluation. We explain this technique by example, showing how the Australian Defence Force used work domain analysis to evaluate design proposals for a new system called Airborne Early Warning and Control. This case study also demonstrates that work domain analysis is a useful and feasible approach that complements standard techniques for evaluation and that promotes a central role for human factors professionals early in the system design and development process. Actual or potential applications of this research include the evaluation of designs for complex systems.