984 results for knowledge modeling


Relevance: 30.00%

Abstract:

By alloying a metal with other materials, one can modify its characteristics or compose an alloy with desired properties that no pure metal possesses. The field is vast and complex, and the phenomena governing the behaviour of alloys are numerous. Theory alone cannot penetrate such complexity, and the scope of experiments is also limited. This is why the relatively young field of ab initio computational methods has much to offer: with these methods one can extend the understanding provided by theory, predict how systems might behave, and obtain information that is not accessible in physical experiments. This thesis seeks to contribute to the collective knowledge of the field through two case studies. The first part examines the oxidation of Ag/Cu, namely the adsorption dynamics and the oxygen-induced segregation of the surface. Our results demonstrate that the presence of Ag in the Cu(100) surface layer strongly inhibits dissociative adsorption. They also confirm that surface reconstruction does occur, as experiments predicted, and indicate that 0.25 ML of oxygen is enough for Ag to diffuse towards the bulk, under the copper oxide layer. The second part elucidates the complex interplay of the various energy and entropy contributions to the phase stability of paramagnetic duplex steel alloys. We produced a phase stability map from first principles that agrees rather well with experiments, and our results show that entropy contributions play a very important role in defining the phase stability. This is, to the author's knowledge, the first ab initio study on this subject.
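
As background on how such energy and entropy contributions are usually combined (the abstract does not give the thesis's exact expression, so this is only an assumed generic form), the relative stability of competing phases can be compared through a Helmholtz free energy of the form

F_{phase}(x, T) = E_0(x) - T [ S_{conf}(x) + S_{vib}(x, T) + S_{mag}(x, T) + S_{el}(x, T) ],

where E_0 is the first-principles total energy and the bracketed terms are the configurational, vibrational, magnetic and electronic entropies. At a given composition x and temperature T, the phase with the lowest F is predicted to be stable, which is how a phase stability map over (x, T) can be drawn.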

Relevance: 30.00%

Abstract:

Modification of metal surfaces with strongly adsorbed chiral organic molecules is possibly the most relevant technique known today for creating chiral surfaces. It can be exploited in the catalytic production of enantiomerically pure chiral compounds needed, for example, as pharmaceuticals and aroma chemicals. Despite the many advantages of asymmetric heterogeneous catalysis compared with other routes to chiral compounds, it has still not become a general tool for large-scale applications. This is due, among other things, to a lack of deeper knowledge of the catalytic reaction mechanisms and the origin of the asymmetric induction. In this study, molecular modeling techniques were used to study asymmetric heterogeneous catalytic systems, in particular the hydrogenation of prochiral carbonyl compounds to the corresponding chiral alcohols on cinchona-alkaloid-modified Pt catalysts. 1-Phenyl-1,2-propanedione (PPD) and some other compounds containing a prochiral C=O group were used as reactants. Conformations of the reactants and the cinchona alkaloids (referred to as modifiers), as well as hydrogen-bonded 1:1 complexes between them, were studied in the gas and solution phases with methods based on wave function theory and density functional theory (DFT). Highly accurate composite methods such as G2(MP2) were also used to calculate proton affinities. The relative populations of the modifier conformations varied as a function of the modifier, its protonation and the solvent. Several reactant–modifier interaction geometries were considered. Conclusions about the direction of stereoselectivity were based on the relative thermodynamic stability of the diastereomeric reactant–modifier complexes and on the energies of the π and π* orbitals of the reactive carbonyl group. Adsorption and reactions on the Pt(111) surface were treated with DFT. The regioselectivity in the hydrogenation of PPD and 2,3-hexanedione could be explained by molecule–surface interactions. The size and shape of the cluster used to describe the Pt surface affected not only the adsorption energies but also the relative stabilities of different adsorption structures of a molecule. The populations of the modifier conformations in the gas and solution phases did not correlate with the populations on the Pt surface or with the enantioselectivity of PPD hydrogenation on Pt–cinchona catalysts. Some modifier conformations and reactant–modifier interaction geometries were stable only on the metal surface. Theoretically calculated potential energy profiles for the hydrogenation of chiral α-hydroxy ketones on Pt implied a preference for a pairwise hydrogen addition mechanism and selectivities in agreement with experiment. The results obtained deepen the understanding of chiral heterogeneous catalytic systems and could therefore be exploited in the development of new, more active and selective chiral catalysts.
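
The abstract does not state how the relative conformer populations were obtained; a standard, minimal way to estimate them from computed relative (free) energies, mentioned here only as an assumption, is Boltzmann weighting:

p_i = \frac{\exp(-\Delta G_i / RT)}{\sum_j \exp(-\Delta G_j / RT)},

where \Delta G_i is the free energy of conformer i relative to the most stable conformer, R is the gas constant and T the temperature. Populations estimated this way shift with solvent and protonation state because the relative energies themselves shift.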

Relevance: 30.00%

Abstract:

The use of domain-specific languages (DSLs) has been proposed as an approach to cost-effectively develop families of software systems in a restricted application domain. Domain-specific languages, in combination with the accumulated knowledge and experience of previous implementations, can in turn be used to generate new applications with unique sets of requirements. For this reason, DSLs are considered to be an important approach for software reuse. However, the toolset supporting a particular domain-specific language is also domain-specific and is by definition not reusable. Therefore, creating and maintaining a DSL requires additional resources that could be even larger than the savings associated with using them. As a solution, different tool frameworks have been proposed to simplify and reduce the cost of development of DSLs. Developers of tool support for DSLs need to instantiate, customize or configure the framework for a particular DSL. There are different approaches for this. One approach is to use an application programming interface (API) and to extend the basic framework using an imperative programming language; an example of a tool based on this approach is Eclipse GEF. Another approach is to configure the framework using declarative languages that are independent of the underlying framework implementation. We believe this second approach can bring important benefits, as it shifts the focus to specifying what the tool should be like instead of writing a program that specifies how the tool achieves this functionality. In this thesis we explore this second approach. We use graph transformation as the basic approach to customize a domain-specific modeling (DSM) tool framework. The contributions of this thesis include a comparison of different approaches for defining, representing and interchanging software modeling languages and models, and a tool architecture for an open domain-specific modeling framework that efficiently integrates several model transformation components and visual editors. We also present several specific algorithms and tool components for the DSM framework. These include an approach for graph queries based on region operators and the star operator, and an approach for reconciling models and diagrams after executing model transformation programs. We exemplify our approach with two case studies, MICAS and EFCO. In these studies we show how our experimental modeling tool framework has been used to define tool environments for domain-specific languages.

Relevance: 30.00%

Abstract:

Formal software development processes and well-defined development methodologies are nowadays seen as the definitive way to produce high-quality software within time and budget limits. The variety of such high-level methodologies is huge, ranging from rigorous process frameworks like CMMI and RUP to more lightweight agile methodologies. The need to manage this variety, and the fact that practically every software development organization has its own unique set of development processes and methods, have created the profession of software process engineer. Different kinds of informal and formal software process modeling languages are essential tools for process engineers. They are used to define processes in a way that allows easy management of processes, for example process dissemination, process tailoring and process enactment. Process modeling languages are usually used as tools for process engineering, where the main focus is on the processes themselves. This dissertation has a different emphasis: it analyses modern software development process modeling from the software developers' point of view. The goal of the dissertation is to investigate whether software process modeling and software process models aid software developers in their day-to-day work, and what the main mechanisms for this are. The focus of the work is on the Software Process Engineering Metamodel (SPEM) framework, which is currently one of the most influential process modeling notations in software engineering. The research theme is elaborated through six scientific articles which represent the dissertation research done on process modeling during an approximately five-year period. The research follows the classical engineering research discipline, where the current situation is analyzed, a potentially better solution is developed, and finally its implications are analyzed. The research applies a variety of research techniques, ranging from literature surveys to qualitative studies done amongst software practitioners. The key finding of the dissertation is that software process modeling notations and techniques are usually developed in process engineering terms. As a consequence, the connection between process models and actual development work is loose. In addition, modeling standards like SPEM are partially incomplete when it comes to pragmatic process modeling needs, such as lightweight modeling and combining pre-defined process components. This leads to a situation where the full potential of process modeling techniques for aiding daily development activities cannot be achieved. Despite these difficulties, the dissertation shows that it is possible to use modeling standards like SPEM to aid software developers in their work. The dissertation presents a lightweight modeling technique which software development teams can use to quickly analyze their work practices in a more objective manner. It also shows how process modeling can be used to compare different software development situations more easily and to analyze their differences in a systematic way. Models also help to share this knowledge with others. A qualitative study done amongst Finnish software practitioners verifies the conclusions of the other studies in the dissertation. Although processes and development methodologies are seen as an essential part of software development, process modeling techniques are rarely used during daily development work. However, the potential of these techniques intrigues the practitioners. In conclusion, the dissertation shows that process modeling techniques, most commonly used as tools for process engineers, can also be used as tools for organizing daily software development work. This work presents theoretical solutions for bringing process modeling closer to ground-level software development activities. These solutions are shown to be feasible through several case studies in which the modeling techniques are used, for example, to find differences in the work methods of the members of a software team and to share process knowledge with a wider audience.

Relevance: 30.00%

Abstract:

Knowledge of slug flow characteristics is very important when designing pipelines and process equipment. When the intermittences typical of slug flow occur, the fluctuations of the flow variables bring additional concern to the designer. Focusing on this subject, the present work discloses experimental data on slug flow characteristics obtained in a large-size, large-scale facility. The results were compared with data provided by mechanistic slug flow models in order to verify their reliability in modelling actual flow conditions. Experiments were done with natural gas and either oil or water as the liquid phase. To compute the frequency and velocity of the slug cell and to calculate the lengths of the elongated bubble and the liquid slug, two pressure transducers were used, each measuring the pressure drop across the pipe diameter at a different axial location. A third pressure transducer measured the pressure drop between two axial locations 200 m apart. The experimental data were compared with results from Camargo's algorithm (1991, 1993), which uses the basics of Dukler & Hubbard's (1975) slug flow model, and with those calculated by the transient two-phase flow simulator OLGA.
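
The abstract does not detail how the transducer signals are processed. One common way to obtain the slug cell translational velocity from two axially separated pressure signals is cross-correlation, sketched below under that assumption; the function name, sampling rate and 0.2 m spacing are illustrative only, not values from the experiment.

import numpy as np

def slug_velocity(p_upstream, p_downstream, dt, spacing):
    """Estimate slug translational velocity from two pressure signals.

    p_upstream, p_downstream : 1-D arrays sampled every dt seconds at
    transducers separated by `spacing` metres along the pipe axis.
    The lag that maximizes the cross-correlation gives the transit time."""
    a = p_upstream - np.mean(p_upstream)
    b = p_downstream - np.mean(p_downstream)
    corr = np.correlate(b, a, mode="full")      # lag of downstream vs upstream
    lags = np.arange(-len(a) + 1, len(a))
    transit_time = lags[np.argmax(corr)] * dt   # seconds of delay
    return spacing / transit_time if transit_time > 0 else float("nan")

# Illustrative use: 1 kHz sampling, transducers 0.2 m apart (assumed values).
# v = slug_velocity(p1, p2, dt=1e-3, spacing=0.2)

The slug frequency and the bubble and slug lengths then follow from the durations of the pressure fluctuations at one transducer combined with this velocity.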

Relevance: 30.00%

Abstract:

The objective of the present master's thesis is the investigation of the high-pressure pretreatment process for gold leaching. Gold ores and concentrates that cannot easily be treated by a leaching process are called "refractory". These ores or concentrates often have a high content of sulfur and arsenic, which renders the precious metal inaccessible to the leaching agents. Since refractory ores account for a considerable share of gold production, the pressure oxidation method (autoclave method) is considered one of the possible ways to overcome the related problems. Mathematical modeling is the main approach used in this thesis to investigate the high-pressure oxidation process. For this task, the available information from the literature concerning this phenomenon, including chemistry, mass transfer and kinetics, reaction conditions, applied apparatus and applications, was collected and studied. The modeling part investigates pyrite oxidation kinetics in order to create a descriptive mathematical model. The following major steps were completed: creation of the process model using the available knowledge, estimation of the unknown parameters and determination of the goodness of fit, and study of the reliability of the model and its parameters.
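
The abstract names the modeling steps (model creation, parameter estimation, goodness of fit) but not the rate law or the fitting method. A minimal sketch of that workflow, assuming a simple nth-order conversion rate and least-squares fitting with scipy, is given below; the rate expression, parameter values and data points are invented for illustration and are not the thesis model or measurements.

import numpy as np
from scipy.optimize import curve_fit

def pyrite_conversion(t, k, n):
    """Illustrative rate law dX/dt = k * (1 - X)**n with X(0) = 0,
    integrated by simple Euler steps on the given time grid."""
    X = np.zeros_like(t, dtype=float)
    for i in range(1, len(t)):
        dt = t[i] - t[i - 1]
        X[i] = X[i - 1] + k * (1.0 - X[i - 1]) ** n * dt
    return X

# Synthetic "observations" for the sketch only (minutes, fractional conversion).
t_obs = np.array([0.0, 10.0, 20.0, 40.0, 60.0, 90.0])
x_obs = np.array([0.0, 0.18, 0.33, 0.55, 0.70, 0.84])

# Least-squares estimation of k and n, followed by a simple goodness-of-fit measure.
(k_hat, n_hat), cov = curve_fit(pyrite_conversion, t_obs, x_obs, p0=[0.02, 1.0])
residuals = x_obs - pyrite_conversion(t_obs, k_hat, n_hat)
r_squared = 1.0 - np.sum(residuals**2) / np.sum((x_obs - x_obs.mean())**2)
print(f"k = {k_hat:.4f} 1/min, n = {n_hat:.2f}, R^2 = {r_squared:.3f}")

The parameter covariance returned by curve_fit can then be used for the reliability analysis of the model and its parameters mentioned in the abstract.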

Relevance: 30.00%

Abstract:

The main purpose of the thesis is to improve the state of knowledge and understanding of the physical structure of the TMCS and its short-range prediction. The present study principally addresses the fine structure, dynamics and microphysics of severe convective storms. The structure and dynamics of tropical cloud clusters over the Indian region are not well understood. The observational cases discussed in the thesis are limited to temperature and humidity observations. We propose a mesoscale observational network along with all the available Doppler radars and other conventional and non-conventional observations. Simultaneous observations of the same cloud system with DWR, VHF and UHF radars will provide new insight into the dynamics and microphysics of the clouds. More cases have to be studied in detail to obtain a climatology of the storm types passing over the tropical Indian region. These observational data sets provide a wide variety of information to be assimilated into a mesoscale data assimilation system and can be used to force a CSRM. The gravity wave generation and stratosphere-troposphere exchange (STE) processes associated with convection have gained a great deal of attention from modern science and meteorologists. Round-the-clock observations using VHF and UHF radars, along with supplementary data sets such as DWR, satellite, GPS/radiosonde, meteorological rocket and aircraft observations, are needed to explore the role of convection and the associated energetics in detail.

Relevance: 30.00%

Abstract:

When simulation modeling is used for performance improvement studies of complex systems such as transport terminals, domain-specific conceptual modeling constructs can be used by modelers to create structured models. A two-stage procedure, comprising identification of the problem characteristics or cluster ('knowledge acquisition') and identification of standard models for the problem cluster ('model abstraction'), was found to be effective in creating structured models when applied to certain logistic terminal systems. In this paper we discuss some methods and examples related to the knowledge acquisition and model abstraction stages for the development of three different model categories of terminal systems.

Relevance: 30.00%

Abstract:

Land use is a crucial link between human activities and the natural environment and one of the main driving forces of global environmental change. Large parts of the terrestrial land surface are used for agriculture, forestry, settlements and infrastructure. Given the importance of land use, it is essential to understand the multitude of influential factors and the resulting land-use patterns. An essential methodology to study and quantify such interactions is provided by land-use models. By applying land-use models it is possible to analyze the complex structure of linkages and feedbacks and to determine the relevance of driving forces. Modeling land use and land-use change has a long tradition; in particular on the regional scale, a variety of models for different regions and research questions has been created. Modeling capabilities grow with steady advances in computer technology, which are driven on the one hand by increasing computing power and on the other hand by new methods in software development, e.g. object- and component-oriented architectures. In this thesis, SITE (Simulation of Terrestrial Environments), a novel framework for integrated regional land-use modeling, is introduced and discussed. Particular features of SITE are its notably extended capability to integrate models and the strict separation of application and implementation. These features enable efficient development, testing and use of integrated land-use models. On the system side, SITE provides generic data structures (grid, grid cells, attributes etc.) and takes over responsibility for their administration. By means of a scripting language (Python), extended with language features specific to land-use modeling, these data structures can be accessed and manipulated by modeling applications. The scripting language interpreter is embedded in SITE. The integration of sub-models can be achieved via the scripting language or by using a generic interface provided by SITE. Furthermore, functionality important for land-use modeling, such as model calibration, model tests and analysis support for simulation results, has been integrated into the generic framework. During the implementation of SITE, specific emphasis was laid on expandability, maintainability and usability. Along with the modeling framework, a land-use model for the analysis of the stability of tropical rainforest margins was developed in the context of the collaborative research project STORMA (SFB 552). In a research area in Central Sulawesi, Indonesia, the socio-environmental impacts of land-use changes were examined. SITE was used to simulate land-use dynamics in the historical period from 1981 to 2002. Analogously, a scenario that did not consider migration in the population dynamics was analyzed. For the calculation of crop yields and trace gas emissions, the DAYCENT agro-ecosystem model was integrated. In this case study it could be shown that land-use changes in the Indonesian research area were mainly characterized by the expansion of agricultural areas at the expense of natural forest. For this reason, the situation had to be interpreted as unsustainable, even though increased agricultural use implied economic improvements and higher farmers' incomes. Due to the importance of model calibration, it was explicitly addressed in the SITE architecture through the introduction of a specific component.
The calibration functionality can be used by all SITE applications and enables largely automated model calibration. Calibration in SITE is understood as a process that finds an optimal, or at least adequate, solution for a set of arbitrarily selectable model parameters with respect to an objective function. In SITE, an objective function is typically a map comparison algorithm capable of comparing a simulation result to a reference map. Several map optimization and map comparison methodologies are available and can be combined. The STORMA land-use model was calibrated using a genetic algorithm for optimization and the figure-of-merit map comparison measure as the objective function. The calibration period ranged from 1981 to 2002, and respective reference land-use maps were compiled for it. It could be shown that efficient automated model calibration with SITE is possible. Nevertheless, the selection of the calibration parameters required detailed knowledge about the underlying land-use model and cannot be automated. In another case study, decreases in crop yields and the resulting losses in income from coffee cultivation were analyzed and quantified under the assumption of four different deforestation scenarios. For this task, an empirical model describing the dependence of bee pollination, and the resulting coffee fruit set, on the distance to the closest natural forest was integrated. Land-use simulations showed that, depending on the magnitude and location of ongoing forest conversion, pollination services are expected to decline continuously. This results in a reduction of coffee yields of up to 18% and a loss of net revenues per hectare of up to 14%. However, the study also showed that ecological and economic values can be preserved if patches of natural vegetation are conserved in the agricultural landscape.
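
The abstract names the figure-of-merit map comparison measure as the calibration objective but does not define it. A commonly used, simplified two-class form of that measure, sketched here as an assumption rather than the thesis's exact formulation, scores agreement on land-use change only: correctly simulated changes divided by the union of observed and simulated change.

import numpy as np

def figure_of_merit(initial, reference, simulated):
    """Simplified figure-of-merit map comparison (the thesis may use a more
    detailed multi-category variant). All arguments are 2-D arrays of
    land-use class codes on the same grid."""
    obs_change = reference != initial
    sim_change = simulated != initial
    hits = np.sum(obs_change & (simulated == reference))        # correct change
    misses = np.sum(obs_change & ~sim_change)                    # missed change
    wrong_or_false = np.sum(sim_change & (simulated != reference))  # wrong class or false alarm
    denom = hits + misses + wrong_or_false
    return hits / denom if denom else 0.0

# In a genetic-algorithm calibration loop, 1 - figure_of_merit(...) could serve
# as the fitness to minimize for each candidate parameter set.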

Relevance: 30.00%

Abstract:

Among many other knowledge representation formalisms, ontologies and Formal Concept Analysis (FCA) aim at modeling 'concepts'. We discuss how these two formalisms may complement one another from an application point of view. In particular, we will see how FCA can be used to support ontology engineering, and how ontologies can be exploited in FCA applications. The interplay of FCA and ontologies is studied along the life cycle of an ontology: (i) FCA can support the building of the ontology as a learning technique. (ii) The established ontology can be analyzed and navigated by using techniques of FCA. (iii) Last but not least, the ontology may be used to improve an FCA application.
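
As background for readers unfamiliar with FCA (this is standard material, not part of the abstract): a formal context is a binary relation between objects and attributes, and a formal concept is a pair (extent, intent) in which the extent is exactly the set of objects sharing all attributes of the intent, and the intent is exactly the set of attributes shared by all objects of the extent. A minimal, naive sketch of enumerating all concepts of a toy context:

from itertools import combinations

# Toy formal context: objects mapped to the attributes they have (illustrative only).
context = {
    "dog":    {"mammal", "has_legs"},
    "whale":  {"mammal"},
    "spider": {"has_legs"},
}
objects = set(context)
attributes = set().union(*context.values())

def common_attributes(objs):
    return set.intersection(*(context[o] for o in objs)) if objs else set(attributes)

def common_objects(attrs):
    return {o for o in objects if attrs <= context[o]}

# Naive enumeration: close every object subset; each closure is a formal concept.
# Fine for toy contexts; real FCA tools use faster algorithms such as NextClosure.
concepts = set()
for r in range(len(objects) + 1):
    for objs in combinations(sorted(objects), r):
        intent = common_attributes(set(objs))
        extent = common_objects(intent)
        concepts.add((frozenset(extent), frozenset(intent)))

for extent, intent in sorted(concepts, key=lambda c: len(c[0])):
    print(sorted(extent), sorted(intent))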

Relevance: 30.00%

Abstract:

This paper introduces a new neurofuzzy model construction and parameter estimation algorithm for building models from observed finite data sets, based on a Takagi and Sugeno (T-S) inference mechanism and a new extended Gram-Schmidt orthogonal decomposition algorithm, for the modeling of a priori unknown dynamical systems in the form of a set of fuzzy rules. The first contribution of the paper is the introduction of a one-to-one mapping between a fuzzy rule base and a model matrix feature subspace using the T-S inference mechanism. This link enables the numerical properties associated with a rule-based matrix subspace, the relationships amongst these matrix subspaces, and the correlation between the output vector and a rule-based matrix subspace, to be investigated and extracted as rule-based knowledge to enhance model transparency. The matrix subspace spanned by a fuzzy rule is initially derived as the input regression matrix multiplied by a weighting matrix that consists of the corresponding fuzzy membership functions over the training data set. Model transparency is explored by deriving an equivalence between an A-optimality experimental design criterion of the weighting matrix and the average model output sensitivity to the fuzzy rule, so that rule bases can be effectively measured by their identifiability via the A-optimality experimental design criterion. The A-optimality experimental design criterion of the weighting matrices of the fuzzy rules is used to construct an initial model rule base. An extended Gram-Schmidt algorithm is then developed to estimate the parameter vector for each rule. This new algorithm decomposes the model rule bases via an orthogonal subspace decomposition approach, so as to enhance model transparency with the capability of interpreting the derived rule-base energy level. This new approach is computationally simpler than the conventional Gram-Schmidt algorithm for resolving high-dimensional regression problems, where it is computationally desirable to decompose complex models into a few submodels rather than a single model with a large number of input variables and the associated curse of dimensionality. Numerical examples are included to demonstrate the effectiveness of the proposed new algorithm.
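
The abstract describes scoring each rule's identifiability via an A-optimality criterion of its weighting matrix (the fuzzy memberships over the training data applied to the input regression matrix). The exact formulation of the paper is not reproduced here; the following is only a generic illustration of that idea, in which each rule's weighted regressor matrix is scored by the trace of the inverse of its information matrix (smaller is better in A-optimal design).

import numpy as np

def a_optimality_scores(X, memberships, ridge=1e-8):
    """Generic A-optimality score per fuzzy rule (illustrative sketch).

    X           : (N, d) input regression matrix over the training data
    memberships : (N, R) fuzzy membership of each sample in each rule
    Returns an array of length R; a smaller trace((P_k^T P_k)^-1) means the
    rule's parameters are better identified by the data."""
    N, d = X.shape
    scores = []
    for k in range(memberships.shape[1]):
        W = np.diag(memberships[:, k])        # weighting matrix of rule k
        P = W @ X                             # matrix subspace spanned by rule k
        info = P.T @ P + ridge * np.eye(d)    # information matrix (regularized)
        scores.append(np.trace(np.linalg.inv(info)))
    return np.array(scores)

# Rules could then be ranked by these scores to build an initial rule base,
# keeping, for example, the rules with the smallest scores.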

Relevance: 30.00%

Abstract:

A new robust neurofuzzy model construction algorithm has been introduced for the modeling of a priori unknown dynamical systems from observed finite data sets in the form of a set of fuzzy rules. Based on a Takagi-Sugeno (T-S) inference mechanism, a one-to-one mapping between a fuzzy rule base and a model matrix feature subspace is established. This link enables rule-based knowledge to be extracted from the matrix subspace to enhance model transparency. In order to achieve maximum model robustness and sparsity, a new robust extended Gram-Schmidt (G-S) method has been introduced via two effective and complementary approaches: regularization and D-optimality experimental design. Model rule bases are decomposed into orthogonal subspaces, so as to enhance model transparency with the capability of interpreting the derived rule-base energy level. A locally regularized orthogonal least squares algorithm, combined with a D-optimality criterion used for subspace-based rule selection, has been extended for fuzzy rule regularization and subspace-based information extraction. By using a weighting for the D-optimality cost function, the entire model construction procedure becomes automatic. Numerical examples are included to demonstrate the effectiveness of the proposed new algorithm.
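
The abstract combines local regularization with a weighted D-optimality cost to make rule selection automatic; the exact combined criterion is not reproduced here. The following is only a generic sketch of that idea, under the assumption that each candidate rule subspace is scored by its regularized residual plus a weighted D-optimality penalty (the negative log-determinant of its information matrix), with rules selected greedily while the combined cost decreases.

import numpy as np

def combined_cost(P, y, lam=1e-3, beta=1e-2):
    """Generic selection cost for one candidate rule subspace (illustration,
    not the paper's exact criterion).

    P    : (N, d) orthogonalized regressors spanning the rule's subspace
    y    : (N,) target output
    lam  : local regularization weight
    beta : weighting of the D-optimality term"""
    d = P.shape[1]
    info = P.T @ P + lam * np.eye(d)                    # regularized information matrix
    theta = np.linalg.solve(info, P.T @ y)              # regularized LS estimate
    residual = y - P @ theta
    sse = residual @ residual + lam * (theta @ theta)   # regularized error
    d_opt_penalty = -np.log(np.linalg.det(info))        # smaller when info is "large"
    return sse + beta * d_opt_penalty

# A forward selection loop would keep adding the rule whose inclusion yields the
# lowest combined cost and stop when the cost no longer decreases.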

Relevance: 30.00%

Abstract:

Purpose: The increasing cost of health care, fuelled by the demand for high-quality, cost-effective healthcare, has driven hospitals to streamline their patient care delivery systems. One such systematic approach is the adoption of Clinical Pathways (CPs) as a tool to increase the quality of healthcare delivery. However, most organizations still rely on paper-based pathway guidelines or specifications, which have limitations in process management and, as a result, can influence patient safety outcomes. In this paper, we present a method for generating clinical pathways based on organizational semiotics, by capturing knowledge from the syntactic, semantic and pragmatic levels up to the social level. Design/methodology/approach: The proposed modeling approach to the generation of CPs adopts organizational semiotics and enables the generation of a semantically rich representation of CP knowledge. The Semantic Analysis Method (SAM) is applied to explicitly represent the semantics of the concepts, their relationships and patterns of behavior in terms of an ontology chart. The Norm Analysis Method (NAM) is adopted to identify and formally specify patterns of behavior and the rules that govern the actions identified in the ontology chart. The information collected during semantic and norm analysis is integrated to guide the generation of CPs using best practice represented in BPMN, thus enabling the automation of CPs. Findings: This research confirms the necessity of taking social aspects into consideration when designing information systems and automating CPs. The complexity of healthcare processes can best be tackled by analyzing the stakeholders, which we treat as social agents, their goals and their patterns of action within the agent network. Originality/value: Current modeling methods describe CPs from a structural aspect comprising activities, properties and interrelationships. However, these methods lack a mechanism to describe possible patterns of human behavior and the conditions under which the behavior will occur. To overcome this weakness, a semiotic approach to the generation of clinical pathways is introduced. The CPs generated from SAM together with norms will enrich the knowledge representation of the domain through ontology modeling, which allows the recognition of human responsibilities and obligations and, more importantly, the ultimate power of decision making in exceptional circumstances.
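
Norms in norm analysis are commonly written in a "whenever <context>, if <state>, then <agent> is (obliged | permitted | prohibited) to <action>" pattern. The abstract does not give its norm syntax, so the structure below is only an illustrative encoding of that assumed pattern; the clinical example norm is hypothetical.

from dataclasses import dataclass

@dataclass
class Norm:
    """Illustrative encoding of a behavioural norm (assumed NAM-style pattern,
    not the paper's specification language)."""
    whenever: str    # triggering context
    if_state: str    # condition on the current state
    agent: str       # responsible social agent
    deontic: str     # "obliged", "permitted" or "prohibited"
    action: str      # action to perform

# Hypothetical example norm for a clinical pathway step.
norm = Norm(
    whenever="patient admitted with suspected stroke",
    if_state="CT scan not yet performed",
    agent="attending physician",
    deontic="obliged",
    action="order a CT scan within 30 minutes",
)
print(f"Whenever {norm.whenever}, if {norm.if_state}, "
      f"then {norm.agent} is {norm.deontic} to {norm.action}.")

Norms encoded this way can then be mapped onto gateway conditions and task assignments when the pathway is expressed in BPMN.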

Relevance: 30.00%

Abstract:

The growing energy consumption in the residential sector represents about 30% of global demand. This calls for Demand Side Management solutions that propel changes in the behavior of end consumers, with the aim of reducing overall consumption as well as shifting it to periods in which demand, and hence the cost of generating energy, is lower. Demand Side Management solutions require detailed knowledge about the patterns of energy consumption. The profile of electricity demand in the residential sector is highly correlated with the times of active occupancy of dwellings; therefore, in this study the occupancy patterns in Spanish properties were determined using the 2009–2010 Time Use Survey (TUS), conducted by the National Statistical Institute of Spain. The survey identifies three peaks in active occupancy, which coincide with morning, noon and evening. This information has been used as input to a stochastic model that generates active occupancy profiles of dwellings, with the aim of simulating domestic electricity consumption. TUS data were also used to identify which appliance-related activities could be considered for Demand Side Management solutions during the three occupancy peaks.
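
The abstract does not specify the form of the stochastic model. Occupancy simulators derived from time-use surveys are often built as first-order Markov chains with time-dependent transition probabilities estimated from the survey; the sketch below illustrates that idea with invented transition probabilities, not values estimated from the Spanish TUS.

import random

# Illustrative hourly transition probabilities: probability of being active at the
# next hour given the current state. Values are invented for the sketch only.
p_active_given_active = [0.6] * 7 + [0.8] * 7 + [0.9] * 9 + [0.3]            # 24 values
p_active_given_inactive = [0.05] * 7 + [0.4] * 5 + [0.3] * 7 + [0.5] * 4 + [0.1]

def simulate_occupancy(hours=24, start_active=False, rng=random.Random(42)):
    """Generate a binary active-occupancy profile with a first-order,
    time-inhomogeneous Markov chain (one step per hour)."""
    state = start_active
    profile = []
    for h in range(hours):
        p = p_active_given_active[h % 24] if state else p_active_given_inactive[h % 24]
        state = rng.random() < p
        profile.append(int(state))
    return profile

print(simulate_occupancy())

Profiles generated this way reproduce, on average, the occupancy peaks encoded in the transition probabilities and can then drive an appliance-level electricity demand simulation.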