34 results for integrated model


Relevance: 30.00%

Abstract:

When constructing and using environmental models, it is typical that many of the inputs to the models will not be known perfectly. In some cases, it will be possible to use observations, or occasionally physics-based uncertainty propagation, to ascertain the uncertainty on these inputs. However, such observations are often unavailable or even impossible, and another approach to characterising the uncertainty on the inputs must be sought. Even when observations are available, if the analysis is being carried out within a Bayesian framework then prior distributions will have to be specified. One option for gathering, or at least estimating, this information is to employ expert elicitation. Expert elicitation is well studied within statistics and psychology and involves the assessment of the beliefs of a group of experts about an uncertain quantity (for example, an input or parameter within a model), typically in terms of obtaining a probability distribution. One of the challenges in expert elicitation is to minimise the biases that might enter into the judgements made by the individual experts, and then to come to a consensus decision within the group of experts. Effort is made in the elicitation exercise to prevent biases from clouding the judgements through well-devised questioning schemes. It is also important that, when reaching a consensus, the experts are exposed to the knowledge of the others in the group. Within the FP7 UncertWeb project (http://www.uncertweb.org/), there is a requirement to build a Web-based tool for expert elicitation. In this paper, we discuss some of the issues of building a Web-based elicitation system, covering both the technological aspects and the statistical and scientific issues. In particular, we demonstrate two tools: a Web-based system for the elicitation of continuous random variables and a system designed to elicit uncertainty about categorical random variables in the setting of landcover classification uncertainty. The first of these examples is a generic tool developed to elicit uncertainty about univariate continuous random variables. It is designed to be used within an application context and extends the existing SHELF method, adding a web interface and access to metadata. The tool is developed so that it can be readily integrated with environmental models exposed as web services. The second example was developed for the TREES-3 initiative, which monitors tropical landcover change through ground-truthing at confluence points. It allows experts to validate the accuracy of automated landcover classifications using site-specific imagery and local knowledge. Experts may provide uncertainty information at various levels: from a general rating of their confidence in a site validation to a numerical ranking of the possible landcover types within a segment. A key challenge in the web-based setting is the design of the user interface and the method of interaction between the problem owner and the problem experts. We show the workflow of the elicitation tool, and show how we can represent the final elicited distributions and confusion matrices using UncertML, ready for integration into uncertainty-enabled workflows. We also show how the metadata associated with the elicitation exercise is captured and can be referenced from the elicited result, providing crucial lineage information and thus traceability in the decision-making process.
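
To make the core elicitation step concrete, here is a minimal sketch, in the spirit of the SHELF method that the tool extends, of fitting a parametric distribution to an expert's quantile judgements. The values, and the choice of a normal distribution, are illustrative assumptions, not the UncertWeb implementation.

```python
# Minimal sketch of SHELF-style elicitation: fit a distribution to an
# expert's quantile judgements. Values and the normal family are
# illustrative assumptions, not the UncertWeb tool's actual code.
import numpy as np
from scipy import optimize, stats

# Hypothetical elicited judgements: the expert asserts P(X <= q) = p.
probs = np.array([0.25, 0.50, 0.75])
quantiles = np.array([12.0, 15.0, 19.0])

def quantile_misfit(params):
    mu, log_sigma = params
    fitted = stats.norm.ppf(probs, loc=mu, scale=np.exp(log_sigma))
    return np.sum((fitted - quantiles) ** 2)

res = optimize.minimize(quantile_misfit, x0=[15.0, 0.0])
mu, sigma = res.x[0], np.exp(res.x[1])
print(f"Elicited prior: Normal(mu={mu:.2f}, sigma={sigma:.2f})")
```

The fitted distribution would then be serialised, together with the elicitation metadata, for example in UncertML, ready for use in a downstream workflow.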

Relevance: 30.00%

Abstract:

Plasma or "dry" etching is an essential process for the production of modern microelectronic circuits. However, despite intensive research, many aspects of the etch process are not fully understood. The results of studies of the plasma etching of Si and SiO2 in fluorine-containing discharges, and of the complementary technique of plasma polymerisation, are presented in this thesis. Optical emission spectroscopy with argon actinometry was used as the principal plasma diagnostic. Statistical experimental design was used to model and compare Si and SiO2 etch rates in CF4 and SF6 discharges as a function of flow, pressure and power. Etch mechanisms in both systems, including the potential reduction of Si etch rates in CF4 due to fluorocarbon polymer formation, are discussed. Si etch rates in CF4/SF6 mixtures were successfully accounted for by the models produced. Si etch rates in CF4/C2F6 and CHF3 as a function of the addition of oxygen-containing additives (O2, N2O and CO2) are shown to be consistent with a simple competition between F, O and CFx species for Si surface sites. For the range of conditions studied, SiO2 etch rates were not dependent on F-atom concentration, but the presence of fluorine was essential in order to achieve significant etch rates. The influence of a wide range of electrode materials on the etch rate of Si and SiO2 in CF4 and CF4/O2 plasmas was studied. It was found that the Si etch rate in a CF4 plasma was considerably enhanced, relative to an anodised aluminium electrode, in the presence of soda glass or sodium- or potassium-"doped" quartz. The effect was even more pronounced in a CF4/O2 discharge; in the latter system, lead and copper electrodes also enhanced the Si etch rate. These results could not be accounted for by a corresponding rise in atomic fluorine concentration. Three possible etch enhancement mechanisms are discussed. Fluorocarbon polymer deposition was studied, both because of its relevance to etch mechanisms and for its intrinsic interest, as a function of fluorocarbon source gas (CF4, C2F6, C3F8 and CHF3), process time, RF power and percentage hydrogen addition. Gas-phase concentrations of F, H and CF2 were measured by optical emission spectroscopy, and the resultant polymer structure was determined by X-ray photoelectron spectroscopy and infrared spectroscopy. Thermal and electrical properties were also measured. Hydrogen additions are shown to have a dominant role in determining deposition rate and polymer composition. A qualitative description of the polymer growth mechanism is presented which accounts for changes in both growth rate and structure, and leads to an empirical deposition rate model.
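
As an illustration of the statistical experimental design mentioned above, the sketch below fits a two-level factorial (response-surface) model of etch rate against flow, pressure and power. The design points and etch-rate values are invented for illustration; they are not the thesis data.

```python
# Illustrative response-surface fit for etch rate vs. flow, pressure and
# RF power, using a 2^3 factorial design in coded (-1/+1) units.
# All numbers are made up for the example, not measured values.
import numpy as np

flow, pressure, power = np.meshgrid([-1, 1], [-1, 1], [-1, 1])
factors = np.column_stack([flow.ravel(), pressure.ravel(), power.ravel()])
etch_rate = np.array([120, 150, 135, 170, 180, 230, 200, 260], float)  # nm/min

# Design matrix: intercept, main effects and two-factor interactions.
f, p, w = factors.T
X = np.column_stack([np.ones(len(f)), f, p, w, f * p, f * w, p * w])
coeffs, *_ = np.linalg.lstsq(X, etch_rate, rcond=None)
names = ["const", "flow", "press", "power", "flow*press", "flow*power", "press*power"]
print(dict(zip(names, coeffs.round(1))))
```

A model of this form is what allows etch rates at untested settings, such as the CF4/SF6 mixtures mentioned above, to be predicted and compared with measurement.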

Relevance: 30.00%

Abstract:

This thesis presents a new approach to designing large organizational databases. The approach emphasizes the need for a holistic view of the design process, and its development was based on a comprehensive examination of the issues of relevance to the design and utilization of databases. Such issues include conceptual modelling, organization theory, and semantic theory. The conceptual modelling approach presented in this thesis is developed over three design stages, or model perspectives. In the semantic perspective, concept definitions are developed based on established semantic principles. Such definitions rely on meaning, provided by intension and extension, to determine intrinsic conceptual definitions. A tool, called meaning-based classification (MBC), is devised to classify concepts based on meaning. Concept classes are then integrated using concept definitions and a set of semantic relations which rely on concept content and form. In the application perspective, relationships are semantically defined according to the application environment. Relationship definitions include explicit relationship properties and constraints. The organization perspective introduces a new set of relations specifically developed to maintain conformity of conceptual abstractions with the nature of information abstractions implied by user requirements throughout the organization. Such relations are based on the stratification of work hierarchies, defined elsewhere in the thesis. Finally, an example application of the proposed approach is presented to illustrate the applicability and practicality of the modelling approach.
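
As a loose illustration of intension- and extension-based concept definitions (this is not the thesis's MBC tool; all names here are hypothetical), a concept can be represented by its defining properties and its known instances, with a semantic relation derived from intension containment:

```python
# Illustrative sketch: a concept defined by intension (defining
# properties) and extension (known instances), with specialization
# read off from intension containment. Names are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class Concept:
    name: str
    intension: frozenset                 # defining properties (meaning)
    extension: frozenset = frozenset()   # known instances

def specializes(sub: Concept, sup: Concept) -> bool:
    """sub carries all of sup's defining properties."""
    return sup.intension <= sub.intension

employee = Concept("Employee", frozenset({"has_id", "is_paid"}), frozenset({"alice"}))
manager = Concept("Manager", frozenset({"has_id", "is_paid", "supervises"}))
print(specializes(manager, employee))  # True: Manager is a kind of Employee
```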

Relevance: 30.00%

Abstract:

This is an Inter-Disciplinary Higher Degree (IHD) thesis about Water Pollution Control in the Iron and Steel Industry. After examining the compositions of, and various treatment methods for, the major effluent streams from a typical integrated iron and steel works, it was decided to concentrate investigative work on the activated-sludge treatment of coke-oven effluents. A mathematical model of this process was developed in an attempt to provide a tool for plant management that would enable improved performance and enhanced control of works units. The model differs from conventional models in that allowance is made for the presence of two genera of micro-organisms, each of which utilises a particular type of substrate as its energy source. Allowance is also made for the inhibitive effect of phenol on thiocyanate biodegradation, and for the self-toxicity of the bacteria when present in a high-substrate-concentration environment. The determination of the kinetic characteristics of the two groups of micro-organisms was shown to be of major importance. Laboratory experiments were instigated in an attempt to determine accurate values of these coefficients. The suspended-solids concentration was found to be too insensitive a measure of viable active mass. Other measures were investigated, and adenosine triphosphate concentration was chosen as the most effective measure of bacterial populations. Using this measure, a model was developed for phenol biodegradation from experimental results which indicated the possibility of storage of substrate prior to metabolism. A model for thiocyanate biodegradation was also developed, although the experimental results indicate that much work is still required in this area.
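
A plausible sketch of the kind of kinetics described is shown below: Haldane-type self-inhibition for phenol, and a Monod rate for thiocyanate suppressed by a phenol-inhibition term. These rate forms are standard in the biodegradation literature; the coefficient values are illustrative, not the thesis's fitted values.

```python
# Illustrative kinetics, not the thesis's fitted model:
# - phenol: Haldane form, capturing self-inhibition at high substrate levels
# - thiocyanate: Monod form, multiplied by a phenol-inhibition factor
# All coefficients below are placeholder values.
def phenol_rate(S_ph, mu_max=0.4, Ks=25.0, Ki=300.0):
    """Specific growth rate on phenol; falls off when S_ph is large."""
    return mu_max * S_ph / (Ks + S_ph + S_ph**2 / Ki)

def thiocyanate_rate(S_scn, S_ph, mu_max=0.2, Ks=40.0, Kp=10.0):
    """Monod rate on thiocyanate, suppressed by phenol concentration."""
    return mu_max * S_scn / (Ks + S_scn) * Kp / (Kp + S_ph)

# While phenol remains high, thiocyanate degradation is held back:
print(thiocyanate_rate(S_scn=100.0, S_ph=200.0))  # strongly inhibited
print(thiocyanate_rate(S_scn=100.0, S_ph=1.0))    # near-uninhibited
```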

Relevance: 30.00%

Abstract:

Web-based distributed modelling architectures are gaining increasing recognition as potentially useful tools for building holistic environmental models, combining individual components in complex workflows. However, existing web-based modelling frameworks currently offer no support for managing uncertainty. On the other hand, the rich array of modelling frameworks and simulation tools which support uncertainty propagation in complex and chained models typically lack the benefits of web-based solutions such as ready publication, discoverability and easy access. In this article, we describe the developments within the UncertWeb project which are designed to provide uncertainty support in the context of the proposed ‘Model Web’. We give an overview of uncertainty in modelling, review uncertainty management in existing modelling frameworks and consider the semantic and interoperability issues raised by integrated modelling. We describe the scope and architecture required to support uncertainty management as developed in UncertWeb. This includes tools which support elicitation, aggregation/disaggregation, visualisation and uncertainty/sensitivity analysis. We conclude by highlighting areas that require further research and development in UncertWeb, such as model calibration and inference within complex environmental models.
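
The propagation pattern at the heart of an uncertainty-enabled Model Web can be sketched as follows. The two component models stand in for chained web services, and all names and numbers are illustrative assumptions, not UncertWeb components.

```python
# Illustrative Monte Carlo propagation through a chain of component
# models. The two functions are stand-ins for web-service calls in a
# Model Web workflow; distributions and coefficients are made up.
import numpy as np

rng = np.random.default_rng(42)

def runoff_model(rainfall):      # stand-in for one service in the chain
    return 0.6 * rainfall + 2.0

def flood_model(runoff):         # stand-in for a downstream service
    return np.maximum(0.0, 0.1 * runoff - 1.0)

rainfall = rng.normal(loc=50.0, scale=8.0, size=10_000)  # uncertain input
flood = flood_model(runoff_model(rainfall))
print(f"Flood depth: mean={flood.mean():.2f}, "
      f"95% interval=({np.percentile(flood, 2.5):.2f}, "
      f"{np.percentile(flood, 97.5):.2f})")
```

In a web setting, the input and output samples (or fitted distributions) would be exchanged between services in a common encoding such as UncertML.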

Relevance: 30.00%

Abstract:

One of the issues in the innovation systems literature is the examination of the technological learning strategies of laggard nations. Two distinct bodies of literature have contributed to our insight into the forces driving learning and innovation: National Systems of Innovation (NSI) and technological learning. Although both literatures yield insights into the catch-up strategies of 'latecomer' nations, the explanatory power of each by itself is limited. In this paper, a possible way of linking the macro- and micro-level approaches, by incorporating enterprises as active learning entities into the learning and innovation system, is proposed. The proposed model has been used to develop research hypotheses and indicate research directions, and is relevant for investigating the learning strategies of firms in less technologically intensive industries outside East Asia.

Relevance: 30.00%

Abstract:

Large-scale disasters are constantly occurring around the world, and in many cases the evacuation of regions of a city is needed. Operational Research/Management Science (OR/MS) has been widely used in emergency planning for over five decades. Warning dissemination, evacuee transportation and shelter management are three ‘Evacuation Support Functions’ (ESFs) generic to many hazards. This thesis has adopted a case-study approach to illustrate the importance of an integrated approach to evacuation planning, and particularly the role of OR/MS models. In the warning dissemination phase, uncertainty in households’ behaviour as ‘warning informants’ has been investigated along with uncertainties in the warning system. An agent-based model (ABM) was developed for ESF-1, with households as agents and ‘warning informant’ behaviour as the agent behaviour. The model was used to study warning dissemination effectiveness under various conditions of the official channel. In the transportation phase, uncertainties in household behaviour such as departure time (a function of ESF-1), means of transport and destination have been investigated. Households could evacuate as pedestrians, by car or by evacuation bus. An ABM was developed to study the evacuation performance (measured in evacuation travel time). In this thesis, a holistic approach for planning public evacuation shelters, called the ‘Shelter Information Management System’ (SIMS), has been developed. A generic framework was developed to allocate available shelter capacity to the shelter demand by considering the evacuation travel time; this was formulated using integer programming, as sketched below. In the sheltering phase, the uncertainty in household shelter choices (nearest, allocated or convenient) has been studied for its impact on allocation policies using sensitivity analyses. Using analyses from the models, and a detailed examination of household states from ‘warning to safety’, it was found that the three ESFs, though sequential in time, have many interdependencies from the perspective of evacuation planning. This thesis has illustrated an OR/MS-based integrated approach that includes and goes beyond single-ESF preparedness. The developed approach will help in understanding the inter-linkages of the three evacuation phases and in preparing multi-agency-based evacuation plans.
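
A plausible integer-programming formulation consistent with the description above (the notation is assumed here, not taken from the thesis) allocates demand zones to shelters so as to minimise total evacuation travel time:

```latex
\begin{align*}
\min\ & \sum_{i \in I}\sum_{j \in J} d_i\, t_{ij}\, x_{ij} \\
\text{s.t.}\ & \sum_{j \in J} x_{ij} = 1 \qquad \forall i \in I \\
& \sum_{i \in I} d_i\, x_{ij} \le c_j \qquad \forall j \in J \\
& x_{ij} \in \{0, 1\} \qquad \forall i \in I,\ j \in J
\end{align*}
```

Here $d_i$ is the demand of zone $i$, $c_j$ the capacity of shelter $j$, $t_{ij}$ the evacuation travel time from zone $i$ to shelter $j$, and $x_{ij} = 1$ if zone $i$ is allocated to shelter $j$. The sheltering-phase sensitivity analyses then probe how deviations from the allocation (households choosing the nearest or a convenient shelter instead) degrade this optimum.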

Relevance: 30.00%

Abstract:

Purpose – The purpose of this research is to develop a holistic approach that maximizes the customer service level while minimizing the logistics cost, using an integrated multiple-criteria decision making (MCDM) method for the contemporary transshipment problem. Unlike the prevalent optimization techniques, this paper proposes an integrated approach which considers both quantitative and qualitative factors in order to maximize the benefits of service deliverers and customers under uncertain environments.

Design/methodology/approach – This paper proposes a fuzzy-based integer linear programming model, based on the existing literature and validated with an example case. The model integrates a fuzzy modification of the analytic hierarchy process (FAHP) and solves the multi-criteria transshipment problem.

Findings – This paper provides several novel insights into how to transform a company from a cost-based model to a service-dominated model by using an integrated MCDM method. It suggests that the contemporary customer-driven supply chain maintains and increases its competitiveness in two ways: optimizing cost and providing the best service simultaneously.

Research limitations/implications – This research used one illustrative industry case to exemplify the developed method. Given the complexity of the transshipment service network, more cases across multiple industries are necessary to generalize the findings and further enhance the validity of the research output.

Practical implications – The paper includes implications for the evaluation and selection of transshipment service suppliers, and for the construction and management of an optimal transshipment network.

Originality/value – The major advantages of this generic approach are that both quantitative and qualitative factors under a fuzzy environment are considered simultaneously, and that the viewpoints of both service deliverers and customers are taken into account. It is therefore believed to be useful and applicable for transshipment service network design.
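
For concreteness, the sketch below shows the crisp AHP core that a fuzzy modification such as FAHP builds on: criteria weights derived from a pairwise-comparison matrix by the geometric-mean method. The matrix is illustrative; the fuzzy variant would replace its entries with fuzzy numbers.

```python
# Illustrative AHP weight derivation (crisp core of FAHP) using the
# geometric-mean method. The comparison matrix is made up for the example.
import numpy as np

# Pairwise comparisons of three criteria, e.g. cost, service level, risk.
A = np.array([[1.0,  3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])

geo_means = A.prod(axis=1) ** (1.0 / A.shape[1])
weights = geo_means / geo_means.sum()
print("Criteria weights:", weights.round(3))  # roughly [0.648, 0.230, 0.122]
```

In the integrated method described above, weights of this kind would feed the qualitative side of the decision into the integer linear programming model.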

Relevance: 30.00%

Abstract:

In this paper, we present syllable-based duration modelling in the context of a prosody model for Standard Yorùbá (SY) text-to-speech (TTS) synthesis applications. Our prosody model is conceptualised around a modular holistic framework, implemented using Relational Tree (R-Tree) techniques. An important feature of our R-Tree framework is its flexibility: it facilitates the independent implementation of the different dimensions of prosody, i.e. duration, intonation and intensity, using different techniques, and their subsequent integration. We applied the Fuzzy Decision Tree (FDT) technique to model the duration dimension. In order to evaluate the effectiveness of FDT in duration modelling, we also developed a Classification And Regression Tree (CART) based duration model using the same speech data. Each of these models was integrated into our R-Tree based prosody model. We performed both quantitative (Root Mean Square Error (RMSE) and correlation (Corr)) and qualitative (intelligibility and naturalness) evaluations on the two duration models. The results show that CART models the training data more accurately than FDT. The FDT model, however, shows a better ability to extrapolate from the training data, since it achieved better accuracy on the test data set. Our qualitative evaluation results show that our FDT model produces synthesised speech that is perceived to be more natural than that of our CART model. In addition, we observed that the expressiveness of FDT is much better than that of CART, because the representation in FDT is not restricted to a set of piecewise or discrete constant approximations. We therefore conclude that the FDT approach is a practical approach for duration modelling in SY TTS applications. © 2006 Elsevier Ltd. All rights reserved.
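
The quantitative evaluation reduces to two familiar formulas. A minimal sketch, with illustrative stand-in durations rather than the SY speech data, is:

```python
# RMSE and correlation between predicted and observed syllable durations,
# as used to compare the FDT and CART duration models. The arrays below
# are illustrative stand-ins, not the paper's data.
import numpy as np

observed = np.array([0.18, 0.22, 0.31, 0.12, 0.27])   # seconds
predicted = np.array([0.20, 0.21, 0.28, 0.14, 0.30])  # e.g. FDT output

rmse = np.sqrt(np.mean((predicted - observed) ** 2))
corr = np.corrcoef(predicted, observed)[0, 1]
print(f"RMSE={rmse:.3f}s  Corr={corr:.3f}")
```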

Relevance: 30.00%

Abstract:

The Indian petroleum industry is passing through a very dynamic business environment due to liberalization. Effective project management for developing new infrastructure and maintaining existing facilities has been considered one of the means of remaining competitive, but current practices suffer from many shortcomings: time, cost and quality targets are missed in almost every project. This study focuses on identifying the specific causes of project failure, first by demonstrating the characteristics of projects in the Indian petroleum industry, and then by suggesting remedial measures for resolving these issues. The suggested project management model is integrated through an information management system and demonstrated through a case study.

Relevance: 30.00%

Abstract:

In the supply chain management literature, there has been little empirical investigation of purchasing consortium issues focusing on a detailed analysis of information and communication technology (ICT) based procurement strategies. Based on an exploration of the academic literature and two surveys among purchasing organisations as well as e-Marketplaces / procurement service providers (PSPs) in the automotive and electronics industry sectors, the research methodology follows a positivistic approach in order to assess the overall statement: ‘Effective participation in electronic purchasing consortia (EPC) can have the potential to enhance competitive advantage. Implementation therefore requires a clear and detailed understanding of the major process structures and drivers, based upon the technology-organisation-environment framework.’ Key factors and structures that affect the adoption and diffusion of EPC, and the performance impact of adoption, are investigated. The empirically derived model for EPC can be a valuable starting point for EPC research.

Relevance: 30.00%

Abstract:

Multiple transformative forces target marketing, many of which derive from new technologies that allow us to sample thinking in real time (i.e., brain imaging) or to look at large aggregations of decisions (i.e., big data). There has been an inclination to refer to the intersection of these technologies with the general topic of marketing as “neuromarketing”. There has not, however, been a serious effort to frame neuromarketing, which is the goal of this paper. Neuromarketing can be compared to neuroeconomics: neuroeconomics is generally focused on how individuals make “choices” and on representing distributions of choices. Neuromarketing, in contrast, focuses on how a distribution of choices can be shifted or “influenced”, which can occur at multiple “scales” of behavior (e.g., individual, group, or market/society). Given that influence can affect choice through many cognitive modalities, and not just the valuation of choice options, a science of influence also implies the need to develop a model of cognitive function integrating attention, memory, and reward/aversion function. The paper concludes with a brief description of three domains of neuromarketing application for studying influence, and their caveats.

Relevance: 30.00%

Abstract:

Measurement-assisted assembly (MAA) has the potential to facilitate a step change in assembly efficiency for large structures such as airframes, through the reduction of rework, manually intensive processes and expensive monolithic assembly tooling. It is shown how MAA can enable rapid part-to-part assembly, increased use of flexible automation, traceable quality assurance and control, reduced structure weight and improved aerodynamic tolerances. These advances will require the development of automated networks of measurement instruments, model-based thermal compensation, the automatic integration of 'live' measurement data into variation simulation, and algorithms to generate cutting paths for predictive shimming and drilling processes. This paper sets out an architecture for the digital systems which will enable this integrated approach to variation management. © 2013 The Authors.
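
Of the enablers listed, model-based thermal compensation is the most self-contained to sketch. Assuming, for illustration only, a uniform linear-expansion model with a single material coefficient:

```python
# Illustrative thermal compensation: scale coordinates measured at ambient
# temperature back to the 20 degC reference. A uniform expansion model and
# a single coefficient are simplifying assumptions for this sketch.
import numpy as np

ALPHA_AL = 23.1e-6  # /K, linear expansion coefficient of aluminium

def compensate(points_mm: np.ndarray, measured_temp_c: float,
               alpha: float = ALPHA_AL, ref_temp_c: float = 20.0) -> np.ndarray:
    """Return the coordinates the structure would have at ref_temp_c."""
    return points_mm / (1.0 + alpha * (measured_temp_c - ref_temp_c))

points = np.array([[0.0, 0.0, 0.0], [12_000.0, 1_500.0, 300.0]])  # mm
print(compensate(points, measured_temp_c=26.0))  # coordinates at 20 degC
```

A production system would replace this with a thermal model of the specific structure and instrument network, but the role in the architecture is the same: normalise 'live' measurements before they enter the variation simulation.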

Relevance: 30.00%

Abstract:

This paper describes how dimensional variation management could be integrated throughout design, manufacture and verification to improve quality while reducing cycle times and manufacturing cost in the Digital Factory environment. Initially, variation analysis is used to optimize tolerances during product and tooling design; this also results in the creation of a simplified representation of product key characteristics. This simplified representation can then be used to carry out measurability analysis and process simulation. The link established between the variation analysis model and measurement processes can subsequently be used throughout the production process to automatically update the variation analysis model in real time with measurement data. This ‘live’ simulation of variation during manufacture will allow early detection of quality issues and facilitate autonomous measurement-assisted processes such as predictive shimming. A study is described showing how these principles can be demonstrated using commercially available software combined with a number of prototype applications operating as discrete modules. The commercially available modules include Catia/Delmia for product and process design, 3DCS for variation analysis and Spatial Analyzer for measurement simulation; prototype modules are used to carry out measurability analysis and instrument selection. Realizing the full potential of metrology in the Digital Factory will require that these modules are integrated, and a software architecture to facilitate this is described. Crucially, this integration must facilitate the use of real-time metrology data describing the emerging assembly to update the digital model.
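
The 'live' update loop could look something like the following sketch. The class and method names are assumptions for illustration, not the API of the prototype applications described above.

```python
# Illustrative integration pattern: live measurement data flows into a
# variation-analysis model, which flags emerging quality issues during
# assembly. All names and the tolerance value are hypothetical.
from dataclasses import dataclass

@dataclass
class Measurement:
    feature_id: str
    deviation_mm: float  # measured deviation from nominal

class VariationModel:
    def __init__(self, tolerance_mm: float):
        self.tolerance_mm = tolerance_mm
        self.deviations: dict[str, float] = {}

    def update(self, m: Measurement) -> None:
        """Fold a live measurement into the simulation state."""
        self.deviations[m.feature_id] = m.deviation_mm

    def out_of_tolerance(self) -> list[str]:
        return [f for f, d in self.deviations.items()
                if abs(d) > self.tolerance_mm]

model = VariationModel(tolerance_mm=0.25)
model.update(Measurement("wing_rib_07", 0.31))  # streamed from an instrument
print(model.out_of_tolerance())  # ['wing_rib_07'] -> trigger predictive shimming
```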

Relevance: 30.00%

Abstract:

Recent changes to the legislation on chemicals and cosmetics testing call for a change in the paradigm regarding the current 'whole animal' approach for identifying chemical hazards, including the assessment of potential neurotoxins. Accordingly, since 2004, we have worked on the development of an integrated co-culture of post-mitotic, human-derived neurons and astrocytes (NT2.N/A) for use as an in vitro functional central nervous system (CNS) model, and we have used it successfully to investigate indicators of neurotoxicity. For this purpose, we used NT2.N/A cells to examine the effects of acute exposure to a range of test chemicals on the cellular release of brain-derived neurotrophic factor (BDNF). It was demonstrated that the release of this protective neurotrophin into the culture medium (above control levels) occurred consistently in response to sub-cytotoxic levels of known neurotoxic, but not non-neurotoxic, chemicals. These increases in BDNF release were quantifiable and statistically significant, and occurred at concentrations below those at which cell death was measurable, which potentially indicates specific neurotoxicity, as opposed to general cytotoxicity. The fact that the BDNF immunoassay is non-invasive, and that NT2.N/A cells retain their functionality for a period of months, may make this system useful for repeated-dose toxicity testing, which is of particular relevance to cosmetics testing without the use of laboratory animals. In addition, the production of NT2.N/A cells without the use of animal products, such as fetal bovine serum, is being explored, with the aim of producing a fully humanised cellular model.