27 results for integrated-process model
Abstract:
This paper describes the use of liaisons to better integrate the product model and the assembly process model, so as to enable sharing of design and assembly process information in a common integrated form and to reason about it. A liaison can be viewed as a set, usually a pair, of features in proximity with which process information can be associated. Formally, a liaison is defined as a set of geometric entities on the parts being assembled, together with relations between these geometric entities. Liaisons have been defined for riveting, welding, bolt fastening, screw fastening, adhesive bonding (gluing) and blind fastening processes. A liaison captures process-specific information through attributes associated with it; the attributes carry process details at varying levels of abstraction. A data structure for the liaison has been developed that clusters its attributes by level of abstraction. As information about liaisons is not explicitly available in either the part model or the assembly model, algorithms have been developed for extracting liaisons from the assembly model. The use of liaisons is proposed both to enable construction of the process model as the product model is fleshed out, and to maintain the integrity of the product and process models as the inevitable changes happen to both the design and the manufacturing environment during the product lifecycle. Results from the aerospace and automotive domains are provided to illustrate and validate the use of liaisons. (C) 2014 Elsevier Ltd. All rights reserved.
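The liaison data structure described above, clustering process attributes by abstraction level, can be sketched roughly as below. All class names, field names, and attribute values are hypothetical illustrations, not the paper's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class Liaison:
    """Hypothetical liaison record: a set of geometric entities on the mating
    parts, relations between them, and process attributes clustered by
    abstraction level (coarse -> detailed)."""
    process: str                # e.g. "riveting", "welding"
    entities: tuple             # geometric entities on the parts being assembled
    relations: tuple            # relations between those entities
    attributes: dict = field(default_factory=dict)

    def attributes_at(self, level):
        """Return the process attributes stored at one abstraction level."""
        return self.attributes.get(level, {})

rivet = Liaison(
    process="riveting",
    entities=("hole_A", "hole_B"),
    relations=("coaxial", "contact"),
    attributes={
        "conceptual": {"joint_type": "lap"},
        "detailed": {"rivet_diameter_mm": 4.8, "pitch_mm": 25.0},
    },
)
print(rivet.attributes_at("detailed"))
```

Clustering by level lets early design reasoning read only the coarse attributes while downstream process planning reads the detailed ones.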
Abstract:
The term design in this paper refers particularly to the process (the verb) and less to the outcome or product. Design today comprises a complex set of activities involving both man and machine. Sustainability is a fundamental paradigm and carries significance in any process, natural or man-made, and in its outcome. In simple terms, sustainability implies a state of sustainable living, viz., health and continuity, nurtured by diversity and evolution (innovation) in an ever-changing world. Design, along similar lines, has been comprehensively investigated, and its current manifestations, including design aids (Computer Aided Design), have been evaluated in terms of sustainability. The paper investigates the rationale of sustainability for design as a whole: its purpose, its adoption in the natural world, its relevance to humankind and the technologies involved. Throughout its history, technology has been used to aid design. But in the current context of advanced algorithms and computational capacity, design no longer remains an exclusively animate faculty. Given this scenario, investigating sustainability in the light of advanced design aids such as CAD becomes pertinent. Considering that technology plays a part in design activities, the paper explores where technology must play a part, and to what degree, amongst the various activities that comprise design. The study includes an examination of the morphology of design and the development of a systems-thinking integrated forecasting model to evaluate the implications of CAD tools for design and sustainability. The results of the study, along with a broad range of recommendations, are presented. (c) 2012 Elsevier Ltd. All rights reserved.
Abstract:
The Government of India has announced the Greening India Mission (GIM) under the National Climate Change Action Plan. The Mission aims to restore and afforest about 10 Mha over the period 2010-2020 under different sub-missions covering moderately dense and open forests, scrub/grasslands, mangroves, wetlands, croplands and urban areas. Even though the main focus of the Mission is to address mitigation and adaptation in the context of climate change, the adaptation component is inadequately addressed. There is a need for increased scientific input in the preparation of the Mission. The mitigation potential is estimated by simply multiplying global default biomass growth rates by area; this is incomplete, as it does not account for all the carbon pools, phasing, differing growth rates, etc. The mitigation potential estimated for the GIM using the Comprehensive Mitigation Analysis Process model could offset 6.4% of the projected national greenhouse gas emissions for the year 2020, compared to the GIM's own estimate of only 1.5%, excluding any emissions due to harvesting or disturbances. The selection of potential locations for different interventions and the choice of species under the GIM must be based on modelling, remote sensing and field studies. The forest sector provides an opportunity to promote mitigation-adaptation synergy, which is not adequately addressed in the GIM. Since many of the proposed interventions are innovative and limited scientific knowledge exists, there is a need for an unprecedented level of collaboration between research institutions and implementing agencies such as the Forest Departments, which is currently non-existent. The GIM could propel systematic research into forestry and climate change issues and thereby provide global leadership in this new and emerging science.
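The "growth rate times area" estimate that the abstract criticises as incomplete amounts to a single multiplication. The sketch below uses invented numbers purely to show the arithmetic; it is not the GIM's or the CMAP model's calculation.

```python
# Illustrative only: a flat per-hectare growth rate applied uniformly to the
# whole restored area, ignoring carbon pools, phasing, and differing growth
# rates. All numbers are made up.
def simple_mitigation_potential(area_mha, growth_t_c_per_ha_yr):
    """Carbon uptake in Mt C/yr: area (million ha) x growth (t C/ha/yr)."""
    return area_mha * 1e6 * growth_t_c_per_ha_yr / 1e6  # t -> Mt cancels

print(simple_mitigation_potential(10, 1.5))  # 10 Mha at 1.5 t C/ha/yr
```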
Abstract:
Impoverishment of particles, i.e. of the discretely simulated sample paths of the process dynamics, poses a major obstacle to employing particle filters for large-dimensional nonlinear system identification. A known route to alleviating this impoverishment, namely using an ensemble whose size grows exponentially with the system dimension, remains computationally infeasible in most cases of practical importance. In this work, we explore the unscented transformation of Gaussian random variables, as incorporated within a scaled Gaussian sum stochastic filter, as a means of applying nonlinear stochastic filtering theory to higher-dimensional structural system identification problems. As an additional strategy to reconcile the evolving process dynamics with the observation history, the proposed filtering scheme also modifies the process model via the incorporation of gain-weighted innovation terms. The reported numerical work on the identification of structural dynamic models of dimension up to 100 is indicative of the potential of the proposed filter to realize the stated aim of successfully treating relatively large-dimensional filtering problems. (C) 2013 Elsevier Ltd. All rights reserved.
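As a minimal illustration of the unscented transformation the abstract builds on, the scalar sketch below propagates the mean and variance of a Gaussian through a nonlinearity via weighted sigma points. The scaling choice and test functions are illustrative; this is not the paper's scaled Gaussian sum filter.

```python
import math

def unscented_transform(mean, var, f, kappa=2.0):
    """Scalar unscented transform: deterministic sigma points through f.

    For n = 1 the three sigma points are the mean and mean +/- sqrt((n+k)P),
    with weights k/(n+k) and 1/(2(n+k)).
    """
    n = 1
    lam = kappa
    spread = math.sqrt((n + lam) * var)
    sigma = [mean, mean + spread, mean - spread]
    w0 = lam / (n + lam)
    wi = 1.0 / (2.0 * (n + lam))
    weights = [w0, wi, wi]
    ys = [f(x) for x in sigma]
    y_mean = sum(w * y for w, y in zip(weights, ys))
    y_var = sum(w * (y - y_mean) ** 2 for w, y in zip(weights, ys))
    return y_mean, y_var

# Through the identity map the input moments are recovered exactly.
m, v = unscented_transform(1.0, 0.25, lambda x: x)
print(m, v)
```

Unlike a particle ensemble, the sigma-point set has fixed, small size, which is what makes the approach attractive in higher dimensions.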
Abstract:
Two atmospheric inversions (one finely resolved and one process-discriminating) and a process-based model of land surface exchanges are brought together to analyse the variations of methane emissions from 1990 to 2009. The focus is on the role of natural wetlands and on the years 2000-2006, a period of stable atmospheric concentrations. From 1990 to 2000, the top-down and bottom-up visions agree on the time-phasing of global total and wetland emission anomalies. The process-discriminating inversion indicates that wetlands dominate the time-variability of methane emissions (90% of the total variability). The contribution of tropical wetlands to the anomalies is found to be large, especially during the post-Pinatubo years (global negative anomalies with minima between -41 and -19 Tg yr(-1) in 1992) and during the alternate 1997-1998 El Nino/1998-1999 La Nina events (maximal anomalies in tropical regions between +16 and +22 Tg yr(-1) for the inversions, and anomalies due to tropical wetlands between +12 and +17 Tg yr(-1) for the process-based model). Between 2000 and 2006, during the stagnation of methane concentrations in the atmosphere, the top-down and bottom-up approaches agree that South America is the main region contributing to anomalies in natural wetland emissions, but they disagree on the sign and magnitude of the flux trend in the Amazon basin. A negative trend (-3.9 +/- 1.3 Tg yr(-1)) is inferred by the process-discriminating inversion, whereas a positive trend (+1.3 +/- 0.3 Tg yr(-1)) is found by the process model. Although process-based models have their own caveats and may not take all processes into account, the positive trend found by the bottom-up approach is considered more likely, because it is a robust feature of the process-based model, consistent with analysed precipitation and the satellite-derived extent of inundated areas. By contrast, the surface-data-based inversions lack constraints for South America.
This result suggests the need for a re-interpretation of the large increase found in anthropogenic methane inventories after 2000.
Abstract:
In this research work, we introduce a novel approach to phase estimation from noisy reconstructed interference fields in digital holographic interferometry using an unscented Kalman filter. Unlike the conventionally used unwrapping algorithms and piecewise polynomial approximation approaches, this paper proposes, for the first time to the best of our knowledge, a signal-tracking approach to phase estimation. The state-space model derived in this approach uses a Taylor-series expansion of the phase function as the process model, and polar-to-Cartesian conversion as the measurement model. We have characterized the approach in simulations and validated its performance on experimental data (holograms) recorded under various practical conditions. Our study reveals that the proposed approach, when compared with the various phase estimation methods available in the literature, performs better at lower SNR values (especially in the range 0-20 dB). It is also demonstrated with experimental data that the proposed approach is a better choice for estimating rapidly varying phase with high dynamic range and noise. (C) 2014 Optical Society of America
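The state-space idea described above, a Taylor-series process model for the phase and a polar-to-Cartesian measurement model, can be sketched as below. The constant-rate propagation, unit amplitude, and noiseless demo are simplifying assumptions for illustration, not the paper's exact model.

```python
import math

def process_step(state, dt=1.0):
    """Taylor-series (constant phase rate) process model.

    state = (phase, phase_rate); a first-order truncation of the expansion
    phi(t + dt) = phi(t) + phi'(t) * dt + ...
    """
    phase, rate = state
    return (phase + rate * dt, rate)

def measure(state, amplitude=1.0):
    """Polar-to-Cartesian measurement model: real and imaginary fringe parts."""
    phase, _ = state
    return (amplitude * math.cos(phase), amplitude * math.sin(phase))

# Noiseless demo: the filter would track (phase, rate) from such measurements.
state = (0.0, 0.1)
for _ in range(5):
    state = process_step(state)
print(state[0], measure(state))
```

Tracking the phase directly through this model avoids a separate unwrapping step, since the state itself is the continuous (unwrapped) phase.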
Abstract:
Story understanding involves many perceptual and cognitive subprocesses, from perceiving individual words, to parsing sentences, to understanding the relationships among the story characters. We present an integrated computational model of reading that incorporates these and additional subprocesses, simultaneously discovering their fMRI signatures. Our model predicts the fMRI activity associated with reading arbitrary text passages, well enough to distinguish which of two story segments is being read with 74% accuracy. This approach is the first to simultaneously track diverse reading subprocesses during complex story processing and to predict the detailed neural representation of diverse story features, ranging from visual word properties to the mention of different story characters and the different actions they perform. We construct brain representation maps that replicate many results from a wide range of classical studies, each focused on one aspect of language processing, and that offer new insights into which types of information are processed by the different areas involved in language processing. Additionally, this approach is promising for studying individual differences: it can be used to create single-subject maps that may potentially be used to measure reading comprehension and diagnose reading disorders.
Abstract:
The broader goal of the research described here is to automatically acquire diagnostic knowledge from documents in the domain of manual and mechanical assembly of aircraft structures. These documents are treated as a discourse used by experts to communicate with others; it therefore becomes possible to use discourse analysis to enable machine understanding of the text. The research challenge addressed in this paper is to identify documents, or sections of documents, that are potential sources of knowledge. In a subsequent step, domain knowledge will be extracted from these segments. The segmentation task requires partitioning the document into relevant segments and understanding the context of each segment. In discourse analysis, the division of a discourse into segments is achieved through certain indicative clauses, called cue phrases, that signal changes in the discourse context. However, such language may not be used in formal documents. Hence the use of a domain-specific ontology and an assembly process model is proposed to segregate chunks of the text based on a local context. Elements of the ontology/model and their related terms serve as indicators of the current context of a segment and of changes in context between segments. Local contexts are aggregated over increasingly larger segments to identify whether the document (or portions of it) pertains to the topic of interest, namely assembly. Knowledge acquired through such processes enables the acquisition and reuse of knowledge during any part of the lifecycle of a product.
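The local-context idea, scoring text chunks against domain ontology terms to flag assembly-related segments, might be sketched as follows. The term list, threshold, and scoring rule are invented for illustration and are not the paper's ontology or algorithm.

```python
# Hypothetical assembly-domain terms standing in for ontology/model elements.
ASSEMBLY_TERMS = {"rivet", "fastener", "jig", "torque", "shim", "drill"}

def local_context_score(chunk):
    """Count distinct ontology terms appearing in a chunk of text."""
    words = {w.strip(".,;:").lower() for w in chunk.split()}
    return len(words & ASSEMBLY_TERMS)

def assembly_segments(chunks, threshold=2):
    """Keep chunks whose local context points at the assembly topic."""
    return [c for c in chunks if local_context_score(c) >= threshold]

doc = [
    "Apply torque to each fastener and check the rivet seating.",
    "The quarterly budget review is scheduled for Monday.",
]
print(assembly_segments(doc))
```

Aggregating such scores over progressively larger spans would give the document-level relevance decision the abstract describes.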
Abstract:
Current design models and frameworks describe various overlapping fragments of designing. However, little effort has gone into consolidating these fragments into an integrated model. We propose a model of designing that integrates the product and process facets of designing by combining activities, outcomes, requirements, and solutions. Validation of the model using video protocols of design sessions demonstrates that all the constructs are used naturally by designers, but often not to the expected level, which hinders the variety and resulting novelty of the concepts developed in these sessions. To resolve this, a prescriptive framework for supporting design for variety and novelty is proposed and plans for its implementation are created. DOI: 10.1115/1.3467011
Abstract:
There are essentially two different phenomenological models available to describe the interdiffusion process in binary systems in the solid state. The first, which is used more frequently, is based on the theory of flux partitioning. The second, developed much more recently, uses the theory of dissociation and reaction. Although the theory of flux partitioning has been widely used, we found that it does not account for the mobility of both species and is therefore unsuitable for most interdiffusion systems. We have first modified this theory to take into account the mobility of both species, and then further extended it to develop relations for the integrated diffusion coefficient and the ratio of the diffusivities of the species. The versatility of the two models is examined in the Co-Si system with respect to different end-member compositions. From our analysis, we found that the applicability of the theory of flux partitioning is rather limited, whereas the theory of dissociation and reaction can be used in any binary system.
Abstract:
Boron carbide is produced in a resistance heating furnace using boric oxide and petroleum coke as the raw materials. The product yield is very low. Heat transfer plays an important role in the formation of boron carbide; the temperature at the core reaches up to 2600 K. No experimental study is available in the open literature for this high-temperature process, particularly in terms of temperature measurement and heat transfer. Therefore, a laboratory-scale hot model of the process has been set up to measure temperatures under harsh conditions at different locations in the furnace, using various temperature measurement devices such as a pyrometer and various types of thermocouples. Particular attention was paid to the accuracy and reliability of the measured data. The recorded data were analysed to understand the heat transfer process inside the reactor and its effect on the formation of boron carbide.
Abstract:
The existing models describing electrochemical phase formation involving both adsorption and a nucleation/growth process are modified. The limiting cases leading to the existing models are discussed. The characteristic features of the potentiostatic transients are presented. A generalization of the Avrami ansatz is given for two or more competitive irreversibly growing phases.
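For reference, the classical Avrami relation and one common way of generalizing it to competing phases can be written as follows. The competitive form shown is a standard one from the nucleation/growth literature and is not necessarily the paper's exact generalization.

```latex
% Classical Avrami relation for a single irreversibly growing phase:
% actual coverage \theta vs. extended (overlap-ignoring) coverage.
\theta(t) = 1 - \exp\!\left[-\theta_{\mathrm{ext}}(t)\right],
\qquad \theta_{\mathrm{ext}}(t) = k\,t^{n}.

% One common generalization to m competing phases apportions the
% still-uncovered fraction among the phases:
\frac{\mathrm{d}\theta_i}{\mathrm{d}t}
  = \Bigl(1 - \sum_{j=1}^{m}\theta_j\Bigr)
    \frac{\mathrm{d}\theta_{\mathrm{ext},i}}{\mathrm{d}t},
\qquad i = 1,\dots,m.
```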
Abstract:
An integrated model is developed, based on seasonal inputs of reservoir inflow and rainfall in the irrigated area, to determine optimal reservoir release policies and irrigation allocations to multiple crops. The model is conceptually made up of two modules. Module 1 is an intraseasonal allocation model that maximizes the sum of the relative yields of all crops, for a given state of the system, using linear programming (LP); it takes into account reservoir storage continuity, soil moisture balance, and crop root growth over time. Module 2 is a seasonal allocation model that derives the steady-state reservoir operating policy using stochastic dynamic programming (SDP). Reservoir storage, seasonal inflow, and seasonal rainfall are the state variables in the SDP, whose objective is to maximize the expected sum of the relative yields of all crops in a year. The results of Module 1 and the transition probabilities of seasonal inflow and rainfall form the input to Module 2. The use of seasonal inputs, coupled with the LP-SDP solution strategy, helps relax the limitations of an earlier study while effecting additional improvements. The model is applied to an existing reservoir in Karnataka State, India.
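Module 2's backward recursion over discretised storage states can be sketched in miniature as below. The storage grid, inflow probabilities, and relative-yield payoffs are all invented; Module 1's LP is replaced by a fixed payoff table, and rainfall is omitted for brevity.

```python
# Toy SDP over reservoir storage: choose a release r each season to maximize
# expected cumulative relative yield, with random seasonal inflow q.
STORAGE = [0, 1, 2]                  # discretised storage states
INFLOW = {0: 0.3, 1: 0.7}            # inflow value -> probability
PAYOFF = {0: 0.0, 1: 0.6, 2: 1.0}    # relative yield from releasing r units

def sdp(horizon):
    """Backward recursion; returns start-of-horizon values and the policy."""
    value = {s: 0.0 for s in STORAGE}          # terminal value
    policy = {}
    for t in reversed(range(horizon)):
        new_value = {}
        for s in STORAGE:
            best = (-1.0, 0)
            for r in range(s + 1):             # feasible releases
                expected = sum(
                    p * value[min(s - r + q, STORAGE[-1])]  # storage continuity
                    for q, p in INFLOW.items()
                )
                total = PAYOFF[r] + expected
                if total > best[0]:
                    best = (total, r)
            new_value[s] = best[0]
            policy[(t, s)] = best[1]
        value = new_value
    return value, policy

value, policy = sdp(horizon=2)
print(policy[(0, 2)], round(value[2], 3))
```

With two seasons left a full reservoir releases only one unit, hedging against low inflow; in the last season it releases everything, which mirrors the hedging behaviour steady-state SDP policies exhibit.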
Abstract:
A one-dimensional, biphasic, multicomponent steady-state model based on phenomenological transport equations for the catalyst layer, diffusion layer, and polymeric electrolyte membrane has been developed for a liquid-feed solid polymer electrolyte direct methanol fuel cell (SPE-DMFC). The model has three important requisites: (i) analytical treatment of the nonlinear terms, to obtain a faster numerical solution and to make the iterative scheme easier to converge; (ii) an appropriate description of two-phase transport phenomena in the diffusive region of the cell, to account for flooding and water condensation/evaporation effects; and (iii) treatment of the polarization effects due to methanol crossover. An improved numerical solution has been achieved by coupling analytical integration of the kinetics and transport equations in the reaction layer, which explicitly includes the effect of concentration and pressure gradients on cell polarization within the bulk catalyst layer. In particular, the integrated kinetic treatment explicitly accounts for the non-homogeneous porous structure of the catalyst layer and the diffusion of reactants within and between the pores of the cathode. At the anode, the analytical integration of the electrode kinetics has been carried out under the assumption of a macrohomogeneous porous electrode structure, because methanol transport in a liquid-feed SPE-DMFC is essentially a single-phase process, owing to the high miscibility of methanol with water and its higher concentration relative to the gaseous reactants. A simple empirical model accounts for the effect of capillary forces on liquid-phase saturation in the diffusion layer. Consequently, the diffusive and convective flow equations, comprising the Nernst-Planck relation for solutes, Darcy's law for liquid water, and the Stefan-Maxwell equation for gaseous species, have been modified to include the capillary flow contribution to transport.
To understand fully the role of the model parameters in simulating the performance of the DMFC, we have carried out a parametric study. An experimental validation of the model has also been carried out. (C) 2003 The Electrochemical Society.
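A generic polarization-curve sketch (Tafel activation, ohmic, and mass-transport losses) illustrates the kind of current-voltage behaviour such a cell model simulates. The functional form is a textbook simplification and every parameter value is invented, not the paper's fitted DMFC data.

```python
import math

def cell_voltage(j, ocv=0.8, j0=1e-4, b=0.05, r_ohm=0.25, j_lim=0.4):
    """Cell voltage (V) at current density j (A/cm^2).

    Illustrative lumped losses: Tafel activation, linear ohmic drop, and a
    logarithmic mass-transport term diverging at the limiting current j_lim.
    """
    act = b * math.log(j / j0)               # activation (Tafel) loss
    ohm = r_ohm * j                          # ohmic loss
    conc = -b * math.log(1.0 - j / j_lim)    # mass-transport loss
    return ocv - act - ohm - conc

print(round(cell_voltage(0.1), 3))
```

A parametric study of the kind the abstract mentions would sweep parameters such as the exchange current density or ohmic resistance and compare the resulting curves against measured data.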