904 results for Single-process Models
Abstract:
Motivation: Modelling the 3D structures of proteins can often be enhanced if more than one fold template is used during the modelling process. However, in many cases, this may also result in poorer model quality for a given target or alignment method. There is a need for modelling protocols that can both consistently and significantly improve 3D models and provide an indication of when models might not benefit from the use of multiple target-template alignments. Here, we investigate the use of both global and local model quality prediction scores produced by ModFOLDclust2 to improve the selection of target-template alignments for the construction of multiple-template models. Additionally, we evaluate clustering the resulting population of multi- and single-template models for the improvement of our IntFOLD-TS tertiary structure prediction method. Results: We find that using accurate local model quality scores to guide alignment selection is the most consistent way to significantly improve models for each of the sequence-to-structure alignment methods tested. In addition, using accurate global model quality scores to re-rank alignments prior to selection further improves the majority of multi-template modelling methods tested. Furthermore, subsequent clustering of the resulting population of multiple-template models significantly improves the quality of selected models compared with the previous version of our tertiary structure prediction method, IntFOLD-TS.
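To make the selection logic concrete, here is a minimal sketch of the general idea (not the actual IntFOLD/ModFOLDclust2 pipeline or its interfaces): candidate alignments are re-ranked by a predicted global quality score, and only those whose predicted local (per-residue) quality is acceptable are retained for multiple-template modelling. Scores, thresholds and field names are illustrative assumptions.

```python
# Hedged sketch: re-rank candidate target-template alignments by a
# predicted global quality score, then filter on mean predicted local
# (per-residue) quality before multi-template model building.
from statistics import mean

def select_alignments(candidates, top_k=5, min_local=0.4):
    """candidates: list of dicts with 'id', 'global_score' (0..1) and
    'local_scores' (per-residue predicted quality, 0..1)."""
    ranked = sorted(candidates, key=lambda c: c["global_score"], reverse=True)
    selected = [c for c in ranked if mean(c["local_scores"]) >= min_local]
    return selected[:top_k]

alignments = [
    {"id": "tmpl_A", "global_score": 0.71, "local_scores": [0.8, 0.6, 0.7]},
    {"id": "tmpl_B", "global_score": 0.55, "local_scores": [0.2, 0.3, 0.1]},
    {"id": "tmpl_C", "global_score": 0.64, "local_scores": [0.5, 0.6, 0.5]},
]
print([c["id"] for c in select_alignments(alignments)])  # ['tmpl_A', 'tmpl_C']
```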
Abstract:
To test the effectiveness of stochastic single-chain models in describing the dynamics of entangled polymers, we systematically compare one such model, the slip-spring model, to a multichain model solved using stochastic molecular dynamics (MD) simulations (the Kremer-Grest model). The comparison involves investigating whether the single-chain model can adequately describe both a microscopic dynamical and a macroscopic rheological quantity for a range of chain lengths. Choosing a particular chain length in the slip-spring model, the parameter values that best reproduce the mean-square displacement of a group of monomers are determined by fitting to MD data. Using the same set of parameters, we then test whether the predictions of the mean-square displacements for other chain lengths agree with the MD calculations. We follow this with a comparison of the time-dependent stress relaxation moduli obtained from the two models for a range of chain lengths. After identifying a limitation of the original slip-spring model in describing the static structure of the polymer chain as seen in MD, we remedy this by introducing a pairwise repulsive potential between the monomers in the chains. Poor agreement of the mean-square monomer displacements at short times is rectified by the use of generalized Langevin equations for the dynamics, which results in significantly improved agreement.
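As a concrete illustration of the microscopic quantity being matched between the two models, the following minimal sketch computes a monomer mean-square displacement g1(t) from a stored, unwrapped trajectory; the array layout and the synthetic random-walk data are assumptions for illustration, not the authors' code or simulation output.

```python
# Hedged sketch: mean-square displacement g1(t) of monomers, averaged
# over monomers and time origins, from an unwrapped trajectory.
import numpy as np

def mean_square_displacement(traj):
    """traj: array of shape (n_frames, n_monomers, 3), unwrapped coords."""
    n_frames = traj.shape[0]
    msd = np.zeros(n_frames)
    for lag in range(1, n_frames):
        disp = traj[lag:] - traj[:-lag]        # displacements at this lag
        msd[lag] = np.mean(np.sum(disp**2, axis=-1))
    return msd

# Example with a synthetic random-walk trajectory:
rng = np.random.default_rng(0)
traj = np.cumsum(rng.normal(size=(200, 50, 3)), axis=0)
g1 = mean_square_displacement(traj)  # grows ~linearly for a random walk
```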
Abstract:
Minimal representations are known to have no redundant elements and are therefore of great importance. Based on the notions of performance and size indices and measures for process systems, the paper proposes conditions for a process model to be minimal in a set of functionally equivalent models with respect to a size norm. Generalized versions of known procedures for obtaining minimal process models for a given modelling goal, model reduction based on sensitivity analysis and incremental model building, are proposed and discussed. The notions and procedures are illustrated and compared on a simple example, a nonlinear fermentation process with different modelling goals, and on a case study of heat exchanger modelling.
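For concreteness, a minimal sketch of model reduction via sensitivity analysis, one of the procedures the paper generalizes: rank parameters by a normalized output sensitivity and flag the least influential ones as candidates for elimination. The toy model, grid and parameter values are assumptions for illustration.

```python
# Hedged sketch: finite-difference, scaled parameter sensitivities used
# to shortlist parameters for removal when seeking a minimal model.
import numpy as np

def model_output(params, t):
    # illustrative process model: y(t) = p0*(1 - exp(-p1*t)) + p2*t
    p0, p1, p2 = params
    return p0 * (1.0 - np.exp(-p1 * t)) + p2 * t

def scaled_sensitivities(params, t, rel_step=1e-6):
    """Estimate of p_i * dy/dp_i, scaled so parameters of different
    magnitudes are comparable."""
    base = model_output(params, t)
    sens = []
    for i in range(len(params)):
        perturbed = params.copy()
        perturbed[i] *= 1.0 + rel_step
        # relative step => (y(p+dp) - y(p))/rel_step ~ p_i * dy/dp_i
        sens.append(np.linalg.norm((model_output(perturbed, t) - base) / rel_step))
    return np.array(sens)

t = np.linspace(0.0, 10.0, 101)
params = np.array([2.0, 0.5, 0.01])
sens = scaled_sensitivities(params, t)
# Parameters with sensitivity far below the rest (here p2) are
# candidates for removal for this modelling goal.
print(dict(zip(["p0", "p1", "p2"], sens.round(3))))
```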
Abstract:
Xerox Customer Engagement activity is informed by the "Go To Market" strategy and the "Intelligent Coverage" sales philosophy. The realisation of this philosophy necessitates a sophisticated level of Market Understanding, and the effective integration of the direct channels of Customer Engagement. Sophisticated Market Understanding requires the mapping and coding of the entire UK market at the DMU (Decision Making Unit) level, which in turn enables the creation of tailored coverage prescriptions. Effective Channel Integration is made possible by the organisation of Customer Engagement work according to a single, process-defined structure: the Selling Process. Organising by process facilitates the discipline of Task Substitution, which leads logically to the creation of Hybrid Selling models. Productive Customer Engagement requires Selling Process specialisation by industry sector, customer segment and product group. The research shows that Xerox's Market Database (MDB) plays a central role in delivering the Go To Market strategic aims. It is a tool for knowledge-based selling, enables productive SFA (Sales Force Automation) and, in sum, is critical to the efficient and effective deployment of Customer Engagement resources. Intelligent Coverage is not possible without the MDB. Analysis of the case evidence has resulted in the definition of 60 idiographic statements. These statements are about how Xerox organises and manages three direct channels of Customer Engagement: Face to Face, Telebusiness and Ebusiness. Xerox is shown to employ a process-oriented, IT-enabled, holistic approach to Customer Engagement productivity. The significance of the research is that it represents a detailed (perhaps unequalled) level of rich description of the interplay between IT and a holistic, process-oriented management philosophy.
Abstract:
Among several process variability sources, valve friction and inadequate controller tuning are considered two of the most prevalent. Friction quantification methods can be applied to the development of model-based compensators or to diagnose valves that need repair, whereas accurate process models can be used in controller retuning. This paper extends existing methods that jointly estimate the friction and process parameters, so that a nonlinear structure is adopted to represent the process model. The developed estimation algorithm is tested with three different data sources: a simulated first-order-plus-dead-time process, a hybrid setup (composed of a real valve and a simulated pH neutralization process) and three industrial datasets corresponding to real control loops. The results demonstrate that the friction is accurately quantified and that "good" process models are estimated in several situations. Furthermore, when a nonlinear process model is considered, the proposed extension presents significant advantages: (i) greater accuracy for friction quantification and (ii) reasonable estimates of the nonlinear steady-state characteristics of the process.
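To make the joint-estimation idea concrete, here is a minimal sketch (not the paper's algorithm): a smooth deadband nonlinearity stands in for valve friction, placed in series with a first-order process, and all three parameters are fitted simultaneously from input-output data. The deadband form, excitation signal and scipy-based fit are assumptions for illustration.

```python
# Hedged sketch: joint least-squares estimation of process parameters
# (gain K, time constant tau) and a single valve-friction parameter
# (friction band fb) from noisy loop data.
import numpy as np
from scipy.optimize import least_squares

def simulate(params, u, dt=1.0):
    K, tau, fb = params
    x = np.zeros_like(u)                # valve position
    y = np.zeros_like(u)                # process output
    for k in range(1, len(u)):
        e = u[k] - x[k - 1]
        # smooth deadband: valve barely moves while |e| < fb (stick),
        # then follows the controller output (slip)
        x[k] = x[k - 1] + e / (1.0 + np.exp(-(abs(e) - fb) / 0.02))
        y[k] = y[k - 1] + dt / tau * (K * x[k] - y[k - 1])
    return y

def residuals(params, u, y_meas):
    return simulate(params, u) - y_meas

rng = np.random.default_rng(1)
u = 1.0 + 0.5 * np.sin(0.05 * np.arange(400))        # slow OP excitation
y_meas = simulate([2.0, 8.0, 0.15], u) + 0.01 * rng.normal(size=u.size)

fit = least_squares(residuals, x0=[1.0, 5.0, 0.05],
                    bounds=([0.1, 0.5, 0.0], [10.0, 50.0, 1.0]),
                    args=(u, y_meas))
K, tau, fb = fit.x   # jointly estimated process and friction parameters
```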
Abstract:
This paper presents two strategies for upgrading set-up generation systems for tandem cold mills. Even though these mills have been modernized mainly due to quality requirements, their upgrades may also aim to replace pre-calculated reference tables. In this case, the Bryant and Osborn mill model without an adaptive technique is proposed. For a more demanding modernization, the Bland and Ford model including adaptation is recommended, although it requires more complex computational hardware. Advantages and disadvantages of the two systems are compared and discussed, and experimental results obtained from an industrial cold mill are shown.
Abstract:
In this paper, we present various diagnostic methods for polyhazard models. Polyhazard models are a flexible family for fitting lifetime data. Their main advantage over single-hazard models, such as the Weibull and log-logistic models, is that they accommodate a large variety of nonmonotone hazard shapes, such as bathtub and multimodal curves. Influence methods, such as the local influence and the total local influence of an individual, are derived, analyzed and discussed. The computation of the likelihood displacement and of the normal curvature in the local influence method is also discussed. Finally, an example with real data is given for illustration.
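For orientation, the standard construction behind such models (general notation, not necessarily this paper's): a polyhazard model superposes K latent competing risks, so the survival function factors and the hazards add,

```latex
S(t) = \prod_{k=1}^{K} S_k(t), \qquad h(t) = \sum_{k=1}^{K} h_k(t).
```

In the poly-Weibull case, with scale \alpha_k and shape \beta_k for each component,

```latex
h(t) = \sum_{k=1}^{K} \frac{\beta_k}{\alpha_k} \left( \frac{t}{\alpha_k} \right)^{\beta_k - 1},
```

and already K = 2 with \beta_1 < 1 < \beta_2 yields a bathtub-shaped hazard, which a single Weibull cannot produce.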
Abstract:
Business process design is primarily driven by process improvement objectives. However, the role of control objectives stemming from regulations and standards is becoming increasingly important for businesses in light of recent events that led to some of the largest scandals in corporate history. As organizations strive to meet compliance agendas, there is an evident need for systematic approaches that assist in understanding the interplay between (often conflicting) business and control objectives during business process design. In this paper, our objective is twofold. We first present a research agenda in the space of business process compliance, identifying major technical and organizational challenges. We then tackle a part of the overall problem space, which deals with the effective modeling of control objectives and subsequently their propagation onto business process models. Control objective modeling is proposed through a specialized modal logic based on normative systems theory, and the visualization of control objectives on business process models is achieved procedurally. The proposed approach is demonstrated in the context of a purchase-to-pay scenario.
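As a purely illustrative sketch of what a modelled control objective might look like (the paper's specialized logic will differ in its exact operators), a purchase-to-pay requirement such as "an invoice may only be paid after a successful three-way match" could be rendered with a deontic obligation operator O over hypothetical process predicates:

```latex
\mathit{invoice\_received} \;\rightarrow\; O\big(\mathit{three\_way\_match} \prec \mathit{pay\_invoice}\big)
```

where \prec is read "occurs before". Propagating such an objective onto the process model then amounts to checking that every path reaching the payment task first passes through the matching task.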
Abstract:
A lot of research in cognition and decision making suffers from a lack of formalism. The quantum probability program could help to improve this situation, but we wonder whether it would provide even more added value if its presumed focus on outcome models were complemented by process models that are, ideally, informed by ecological analyses and integrated into cognitive architectures.
Abstract:
Forensic science is generally defined as the application of science to address questions related to the law. Too often, this view restricts the contribution of science to one single process which eventually aims at bringing individuals to court while minimising the risk of miscarriage of justice. In order to go beyond this paradigm, we propose to refocus attention on traces themselves, as remnants of a criminal activity, and their information content. We postulate that traces contribute effectively to a wide variety of other informational processes that support decision making in many situations. In particular, they inform actors in new policing strategies that place the treatment of information and intelligence at the centre of their systems. This contribution of forensic science to these security-oriented models is still not well identified and captured. In order to create the best conditions for the development of forensic intelligence, we suggest a framework that connects forensic science to intelligence-led policing (part I). Crime scene attendance and processing can be envisaged within this view. This approach gives indications about how to structure the knowledge used by crime scene examiners in their effective practice (part II).
Abstract:
In this article, the author provides a framework to guide research in emotional intelligence. Studies conducted up to the present bear on a conception of emotional intelligence as pertaining to the domain of consciousness and investigate the construct with a correlational approach. As an alternative, the author explores processes underlying emotional intelligence, introducing the distinction between conscious and automatic processing as a potential source of variability in emotionally intelligent behavior. Empirical literature is reviewed to support the central hypothesis that individual differences in emotional intelligence may be best understood by considering the way individuals automatically process emotional stimuli. Providing directions for research, the author encourages the integration of experimental investigation of processes underlying emotional intelligence with correlational analysis of individual differences and fosters the exploration of the automaticity component of emotional intelligence.
Abstract:
The main outcome of the master's thesis is an innovative solution that can support the choice of a business process modeling methodology. Potential users of this tool are people with a background in business process modeling and the possibility to collect the required information about an organization's business processes. The thesis establishes the importance of business process modeling in implementing the strategic goals of an organization by revealing the place of the concept in Business Process Management (BPM) and its particular case, Business Process Reengineering (BPR). To support the theoretical outcomes of the thesis, a case study of the Northern Dimension Research Centre (NORDI) at Lappeenranta University of Technology was conducted. Using this example, several solutions are shown: how to apply business process modeling methodologies in practice; in which ways business process models can be useful for BPM and BPR initiatives; and how to apply the proposed innovative solution to the choice of a business process modeling methodology.
Abstract:
Formal software development processes and well-defined development methodologies are nowadays seen as the definitive way to produce high-quality software within time limits and budgets. The variety of such high-level methodologies is huge, ranging from rigorous process frameworks like CMMI and RUP to more lightweight agile methodologies. The need for managing this variety, and the fact that practically every software development organization has its own unique set of development processes and methods, have created a profession of software process engineers. Different kinds of informal and formal software process modeling languages are essential tools for process engineers. These are used to define processes in a way which allows easy management of processes, for example process dissemination, process tailoring and process enactment. Process modeling languages are usually used as tools for process engineering, where the main focus is on the processes themselves. This dissertation has a different emphasis: it analyses modern software development process modeling from the software developers' point of view. The goal of the dissertation is to investigate whether software process modeling and software process models aid software developers in their day-to-day work, and what the main mechanisms for this are. The focus of the work is on the Software Process Engineering Metamodel (SPEM) framework, which is currently one of the most influential process modeling notations in software engineering. The research theme is elaborated through six scientific articles which represent the dissertation research done on process modeling during an approximately five-year period. The research follows the classical engineering research discipline, where the current situation is analyzed, a potentially better solution is developed and, finally, its implications are analyzed. The research applies a variety of different research techniques, ranging from literature surveys to qualitative studies done amongst software practitioners. The key finding of the dissertation is that software process modeling notations and techniques are usually developed in process engineering terms. As a consequence, the connection between the process models and actual development work is loose. In addition, modeling standards like SPEM are partially incomplete when it comes to pragmatic process modeling needs, like lightweight modeling and combining pre-defined process components. This leads to a situation where the full potential of process modeling techniques for aiding daily development activities cannot be achieved. Despite these difficulties, the dissertation shows that it is possible to use modeling standards like SPEM to aid software developers in their work. The dissertation presents a lightweight modeling technique which software development teams can use to quickly analyze their work practices in a more objective manner. The dissertation also shows how process modeling can be used to more easily compare different software development situations and to analyze their differences in a systematic way. Models also help to share this knowledge with others. A qualitative study done amongst Finnish software practitioners verifies the conclusions of other studies in the dissertation. Although processes and development methodologies are seen as an essential part of software development, process modeling techniques are rarely used during daily development work.
However, the potential of these techniques intrigues the practitioners. As a conclusion, the dissertation shows that process modeling techniques, most commonly used as tools for process engineers, can also be used as tools for organizing daily software development work. This work presents theoretical solutions for bringing process modeling closer to ground-level software development activities. These theories are proven feasible through several case studies where the modeling techniques are used, e.g., to find differences in the work methods of the members of a software team and to share process knowledge with a wider audience.
Abstract:
The theoretical research of the study focused on business process management and business process modeling; the goal was to find a new business process modeling method for an electrical accessories manufacturing enterprise. The focus was to find a few candidate business process modeling methods from which the company could choose the best one for its needs. The study was carried out as qualitative research, with an action study and a case study as the most important ways to collect data. In the empirical part of the study, examples of the company's processes modeled with the new method, and the process modeling process itself, are presented. The new way of modeling processes especially improves the visual presentation of the processes and improves the understanding of how employees should work in the organizational interfaces of the process and in the interfaces between different processes. The result of the study is a new, unified way to model the company's processes, which makes it easier to understand and create the process models. This improved readability makes it possible to reduce the costs that arose from the unclear old process models.
Abstract:
The development of carbon capture and storage (CCS) has raised interest in novel fluidised bed (FB) energy applications. In these applications, limestone can be utilized for SO2 and/or CO2 capture. The conditions in the new applications differ from the traditional atmospheric and pressurised circulating fluidised bed (CFB) combustion conditions, in which limestone is successfully used for SO2 capture. In this work, a detailed physical single-particle model for limestone, with a description of the mass and energy transfer inside the particle, was developed. The novelty of this model was to take into account the simultaneous reactions, changing conditions, and the effect of advection. In particular, the capability to study the cyclic behaviour of limestone on both sides of the calcination-carbonation equilibrium curve is important in the novel conditions. The significance of including advection, or of assuming diffusion control, was studied for calcination; in particular, the effect of advection on the calcination reaction in the novel combustion atmosphere was shown. The model was tested against experimental data: sulphur capture was studied in a laboratory reactor under different fluidised bed conditions. Different conversion levels and sulphation patterns were examined in different atmospheres for one limestone type. The conversion curves were well predicted with the model, and the mechanisms leading to the conversion patterns were explained with the model simulations. This work also evaluated whether the transient environment has an effect on limestone behaviour compared with averaged conditions, and in which conditions the effect is largest. The difference between the averaged and transient conditions was notable only in conditions close to the calcination-carbonation equilibrium curve. The results of this study suggest that the development of a simplified particle model requires a proper understanding of the physical and chemical processes taking place in the particle during the reactions. The results will be needed when analysing complex limestone reaction phenomena or when developing the description of limestone behaviour in comprehensive 3D process models. In order to transfer the experimental observations to furnace conditions, the relevant mechanisms need to be understood before the important ones can be selected for a 3D process model. This study revealed the sulphur capture behaviour under transient oxy-fuel conditions, which is important when the oxy-fuel CFB process and process model are developed.
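For orientation, deciding which side of the calcination-carbonation equilibrium (CaCO3 <-> CaO + CO2) a particle experiences can be sketched as below; the equilibrium correlation is one commonly quoted in the fluidised bed literature, and the constants should be treated as assumptions rather than values from this work.

```python
# Hedged sketch: compare the local CO2 partial pressure against the
# CaO/CaCO3 equilibrium pressure to classify the local regime.
import math

def co2_equilibrium_pressure_atm(T_kelvin):
    """Equilibrium CO2 partial pressure over CaO/CaCO3 (atm);
    illustrative literature correlation, not from this study."""
    return 4.137e7 * math.exp(-20474.0 / T_kelvin)

def regime(p_co2_atm, T_kelvin):
    p_eq = co2_equilibrium_pressure_atm(T_kelvin)
    return "calcination" if p_co2_atm < p_eq else "carbonation"

# e.g. oxy-fuel-like conditions with a high CO2 partial pressure:
print(regime(p_co2_atm=0.8, T_kelvin=1123))  # ~850 C -> carbonation side
print(regime(p_co2_atm=0.8, T_kelvin=1223))  # ~950 C -> calcination side
```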