926 results for Process Models


Relevance: 30.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance: 30.00%

Publisher:

Abstract:

Influencing more environmentally friendly and sustainable behaviour is a current focus of many projects, ranging from government social marketing campaigns, education and tax structures to designers’ work on interactive products, services and environments. There is a wide variety of techniques and methods used, intended to work via different sets of cognitive and environmental principles. These approaches make different assumptions about ‘what people are like’: how users will respond to behavioural interventions, and why, and in the process reveal some of the assumptions that designers and other stakeholders, such as clients commissioning a project, make about human nature. This paper discusses three simple models of user behaviour – the pinball, the shortcut and the thoughtful – which emerge from user experience designers’ statements about users while focused on designing for behaviour change. The models are characterised using systems terminology and the application of each model to design for sustainable behaviour is examined via a series of examples.

Relevance: 30.00%

Publisher:

Abstract:

One of the important problems in machine learning is determining the complexity of the model to be learned. Too much complexity leads to overfitting, which corresponds to finding structures that do not actually exist in the data, while too little complexity leads to underfitting, meaning the model's expressiveness is insufficient to capture all the structures present in the data. For some probabilistic models, model complexity translates into the introduction of one or more hidden variables whose role is to explain the generative process of the data. Various approaches exist for identifying the appropriate number of hidden variables in a model. This thesis focuses on Bayesian nonparametric methods for determining the number of hidden variables to use as well as their dimensionality. The popularization of Bayesian nonparametric statistics within the machine learning community is fairly recent. Their main appeal is that they offer highly flexible models whose complexity scales with the amount of available data. In recent years, research on Bayesian nonparametric learning methods has focused on three main aspects: the construction of new models, the development of inference algorithms, and applications. This thesis presents our contributions to these three research topics in the context of learning latent variable models. First, we introduce the Pitman-Yor process mixture of Gaussians, a model for learning infinite mixtures of Gaussians. We also present an inference algorithm for discovering the hidden components of the model, which we evaluate on two concrete robotics applications.
Our results show that the proposed approach outperforms classical learning approaches in both performance and flexibility. Second, we propose the extended cascading Indian buffet process, a model serving as a prior probability distribution over the space of directed acyclic graphs. In the context of Bayesian networks, this prior makes it possible to identify both the presence of hidden variables and the network structure among them. A Markov chain Monte Carlo inference algorithm is used for evaluation on structure identification and density estimation problems. Finally, we propose the Indian chefs process, a model more general than the extended cascading Indian buffet process for learning graphs and orders. The advantage of the new model is that it admits connections between observable variables and takes the ordering of the variables into account. We present a reversible-jump Markov chain Monte Carlo inference algorithm for jointly learning graphs and orders. Evaluation is carried out on density estimation and independence testing problems. This model is the first Bayesian nonparametric model capable of learning Bayesian networks with completely arbitrary structures.
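The Pitman-Yor process mixture described above rests on the two-parameter Chinese restaurant process, which governs how data points are assigned to mixture components: an existing cluster of size n_k is chosen with weight n_k − d and a new cluster with weight θ + d·K. A minimal sketch of sampling a partition from that process (function name and defaults are illustrative, not taken from the thesis):

```python
import random

def pitman_yor_partition(n_points, discount=0.5, concentration=1.0, seed=0):
    """Sample a partition of n_points items from the two-parameter
    Chinese restaurant process underlying the Pitman-Yor process.
    Returns the list of cluster sizes."""
    rng = random.Random(seed)
    sizes = []  # number of items per cluster ("customers per table")
    for n in range(n_points):
        # weight of each existing cluster, discounted by `discount`
        weights = [s - discount for s in sizes]
        # weight of opening a new cluster
        weights.append(concentration + discount * len(sizes))
        r = rng.random() * sum(weights)
        for k, w in enumerate(weights):
            r -= w
            if r <= 0:
                break
        if k == len(sizes):
            sizes.append(1)  # new cluster
        else:
            sizes[k] += 1
    return sizes
```

The discount parameter controls the power-law growth of the number of clusters, which is what lets the model's complexity scale with the amount of data.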

Relevance: 30.00%

Publisher:

Abstract:

Water removal in paper manufacturing is an energy-intensive process. The dewatering process generally consists of four stages, of which the first three involve mechanical water removal through gravity filtration, vacuum dewatering and wet pressing. In the fourth stage, water is removed thermally, which is the most expensive stage in terms of energy use. In order to analyse water removal during vacuum dewatering, a numerical model was created using a Level-Set method. Various 2D structures of the paper model were created in MATLAB, with randomly positioned circular fibres of identical orientation. The model considers the influence of the forming fabric, which supports the paper sheet during dewatering, by using volume forces to represent flow resistance in the momentum equation. The models were used to estimate the dry content of the porous structure for various dwell times. The relation between dry content and dwell time was compared to laboratory data for paper sheets with basis weights of 20 and 50 g/m² exposed to vacuum levels between 20 kPa and 60 kPa. The comparison showed reasonable results for dewatering and air flow rates. The random positioning of the fibres influences the dewatering rate slightly. To achieve more accurate comparisons, the random orientation of the fibres needs to be considered, as well as the deformation and displacement of the fibres during dewatering.

Relevance: 30.00%

Publisher:

Abstract:

Business Process Management (BPM) can organize and frame a company with a focus on improving or assuring performance in order to gain competitive advantage. Although BPM is believed to improve various aspects of organizational performance, empirical evidence for this has been lacking. The present study aims to develop a model showing the impact of business process management on organizational performance. To accomplish that, the theoretical basis needed to identify the elements that constitute BPM and the measures that can evaluate BPM's effect on organizational performance is built through a systematic literature review (SLR). A research model is then proposed according to the SLR results. Empirical data will be collected from a survey of large and mid-sized industrial and service companies headquartered in Brazil. A quantitative analysis will be performed using structural equation modeling (SEM) to determine whether the direct effects of BPM on organizational performance can be considered statistically significant. Finally, these results and their managerial and scientific implications will be discussed. Keywords: Business process management (BPM). Organizational performance. Firm performance. Business models. Structural equation modeling. Systematic literature review.

Relevance: 30.00%

Publisher:

Abstract:

Adsorption of the food dyes acid blue 9 and food yellow 3 onto chitosan was optimized. Chitosan was obtained from shrimp wastes and characterized. A full factorial design was used to analyze the effects of pH, stirring rate and contact time on adsorption capacity. Under the optimal conditions, adsorption kinetics was studied and the experimental data were fitted with three kinetic models. The produced chitosan showed good characteristics for dye adsorption. The optimal conditions were: pH 3, 150 rpm and 60 min for acid blue 9, and pH 3, 50 rpm and 60 min for food yellow 3. Under these conditions, the adsorption capacities were 210 mg g⁻¹ and 295 mg g⁻¹ for acid blue 9 and food yellow 3, respectively. The Elovich kinetic model gave the best fit to the experimental data, indicating the chemical nature of dye adsorption onto chitosan.
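The Elovich model mentioned above takes the form q_t = (1/β)·ln(1 + αβt); for αβt ≫ 1 it linearizes to q_t ≈ (1/β)·ln(αβ) + (1/β)·ln t, so α and β can be recovered by ordinary least squares on (ln t, q_t). A minimal illustrative fit under that linearization (function name and data are hypothetical, not from the paper):

```python
import math

def fit_elovich(t, q):
    """Fit the linearized Elovich model
    q = (1/beta)*ln(alpha*beta) + (1/beta)*ln(t)
    by ordinary least squares on (ln t, q); valid when alpha*beta*t >> 1.
    Returns (alpha, beta)."""
    x = [math.log(ti) for ti in t]
    n = len(x)
    mx = sum(x) / n
    mq = sum(q) / n
    slope = sum((xi - mx) * (qi - mq) for xi, qi in zip(x, q)) / \
            sum((xi - mx) ** 2 for xi in x)
    intercept = mq - slope * mx
    beta = 1.0 / slope
    alpha = math.exp(intercept * beta) / beta
    return alpha, beta
```

On synthetic data generated with known α and β, the fit recovers the parameters closely as long as the αβt ≫ 1 assumption holds for all sampled times.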

Relevance: 30.00%

Publisher:

Abstract:

This paper describes two new techniques designed to enhance the performance of fire field modelling software. The two techniques are "group solvers" and automated dynamic control of the solution process, both of which are currently under development within the SMARTFIRE Computational Fluid Dynamics environment. The "group solver" is a derivation of common solver techniques used to obtain numerical solutions to the algebraic equations associated with fire field modelling. The purpose of "group solvers" is to reduce the computational overheads associated with traditional numerical solvers typically used in fire field modelling applications. In an example, discussed in this paper, the group solver is shown to provide a 37% saving in computational time compared with a traditional solver. The second technique is the automated dynamic control of the solution process, which is achieved through the use of artificial intelligence techniques. This is designed to improve the convergence capabilities of the software while further decreasing the computational overheads. The technique automatically controls solver relaxation using an integrated production rule engine with a blackboard to monitor and implement the required control changes during solution processing. Initial results for a two-dimensional fire simulation are presented that demonstrate the potential for considerable savings in simulation run-times when compared with control sets from various sources. Furthermore, the results demonstrate the potential for enhanced solution reliability due to obtaining acceptable convergence within each time step, unlike some of the comparison simulations.
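Solver relaxation, which the rule engine described above adjusts dynamically, blends each new iterate with the previous one to stabilise convergence. A minimal sketch with a fixed relaxation factor applied to a Jacobi iteration (the actual SMARTFIRE solvers and control rules are more elaborate; this only illustrates the relaxation mechanism being tuned):

```python
def relaxed_jacobi(A, b, alpha=0.7, tol=1e-8, max_iter=500):
    """Jacobi iteration with an under-relaxation factor alpha:
    x <- x + alpha * (x_jacobi - x).
    alpha is fixed here; a dynamic controller would raise or lower it
    based on the residual history."""
    n = len(b)
    x = [0.0] * n
    for _ in range(max_iter):
        # one plain Jacobi sweep
        x_new = []
        for i in range(n):
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            x_new.append((b[i] - s) / A[i][i])
        # relaxed update toward the Jacobi iterate
        x = [xi + alpha * (xni - xi) for xi, xni in zip(x, x_new)]
        # max-norm residual as the convergence monitor
        residual = max(abs(sum(A[i][j] * x[j] for j in range(n)) - b[i])
                       for i in range(n))
        if residual < tol:
            break
    return x
```

Lowering alpha slows convergence but damps oscillation; raising it does the opposite, which is exactly the trade-off an automated controller exploits per time step.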

Relevance: 30.00%

Publisher:

Abstract:

As part of a long-term project aimed at designing classroom interventions to motivate language learners, we have searched for a motivation model that could serve as a theoretical basis for the methodological applications. We have found that none of the existing models we considered were entirely adequate for our purpose for three reasons: (1) they did not provide a sufficiently comprehensive and detailed summary of all the relevant motivational influences on classroom behaviour; (2) they tended to focus on how and why people choose certain courses of action, while ignoring or playing down the importance of motivational sources of executing goal-directed behaviour; and (3) they did not do justice to the fact that motivation is not static but dynamically evolving and changing in time, making it necessary for motivation constructs to contain a featured temporal axis. Consequently, partly inspired by Heckhausen and Kuhl's 'Action Control Theory', we have developed a new 'Process Model of L2 Motivation', which is intended both to account for the dynamics of motivational change in time and to synthesise many of the most important motivational conceptualisations to date. In this paper we describe the main components of this model, also listing a number of its limitations which need to be resolved in future research.

Relevance: 30.00%

Publisher:

Abstract:

BACKGROUND: Evidence suggests that both the migration and the activation of neutrophils in the airway are of importance in pathological conditions such as pulmonary emphysema. In the present study, we describe in vivo models of lung neutrophil infiltration and activation in mice and hamsters. RESULTS: BALB/c and C57BL/6 mice were treated intranasally with lipopolysaccharide (0.3 mg/kg). Twenty-four hours later, animals were treated intranasally with N-Formyl-Met-Leu-Phe (0 to 5 mg/kg). Golden Syrian hamsters were treated intratracheally with 0.5 mg/kg of lipopolysaccharide. Twenty-four hours later, animals were treated intratracheally with 0.25 mg/kg of N-Formyl-Met-Leu-Phe. Both mice and hamsters were sacrificed two hours after the N-Formyl-Met-Leu-Phe application. In both BALB/c and C57BL/6 mice, neutrophil infiltration was observed after the sequential application of lipopolysaccharide and N-Formyl-Met-Leu-Phe. However, five times fewer neutrophils were found in C57BL/6 mice than in BALB/c mice. This was reflected in the neutrophil activation parameters measured (myeloperoxidase and elastase activities). Despite the presence of neutrophils and their activation status, no lung haemorrhage could be detected in either strain of mice. Compared with mice, the lung inflammation induced by the sequential application of lipopolysaccharide and N-Formyl-Met-Leu-Phe was much greater in the hamster. In parallel with this lung inflammation, significant lung haemorrhage was also observed. CONCLUSIONS: Both the mouse and the hamster can be used for pharmacological studies of new drugs or other therapeutic agents that aim to interfere with neutrophil activation. However, only the hamster model seems suitable for studying the haemorrhagic lung injury process.

Relevance: 30.00%

Publisher:

Abstract:

Visual recognition is a fundamental research topic in computer vision. This dissertation explores datasets, features, learning, and models used for visual recognition. In order to train visual models and evaluate different recognition algorithms, this dissertation develops an approach to collect object image datasets on web pages using an analysis of text around the image and of image appearance. This method exploits established online knowledge resources (Wikipedia pages for text; Flickr and Caltech data sets for images). The resources provide rich text and object appearance information. This dissertation describes results on two datasets. The first is Berg’s collection of 10 animal categories; on this dataset, we significantly outperform previous approaches. On an additional set of 5 categories, experimental results show the effectiveness of the method. Images are represented as features for visual recognition. This dissertation introduces a text-based image feature and demonstrates that it consistently improves performance on hard object classification problems. The feature is built using an auxiliary dataset of images annotated with tags, downloaded from the Internet. Image tags are noisy. The method obtains the text features of an unannotated image from the tags of its k-nearest neighbors in this auxiliary collection. A visual classifier presented with an object viewed under novel circumstances (say, a new viewing direction) must rely on its visual examples. This text feature may not change, because the auxiliary dataset likely contains a similar picture. While the tags associated with images are noisy, they are more stable when appearance changes. The performance of this feature is tested using PASCAL VOC 2006 and 2007 datasets. This feature performs well; it consistently improves the performance of visual object classifiers, and is particularly effective when the training dataset is small. 
With more and more training data collected, computational cost becomes a bottleneck, especially when training sophisticated classifiers such as kernelized SVMs. This dissertation proposes a fast training algorithm called the Stochastic Intersection Kernel Machine (SIKMA). The proposed training method will be useful for many vision problems, as it can produce a kernel classifier that is more accurate than a linear classifier and can be trained on tens of thousands of examples in two minutes. It processes training examples one by one in sequence, so memory cost is no longer the bottleneck for processing large-scale datasets. This dissertation applies the approach to train classifiers for Flickr groups, each with many training examples. The resulting Flickr group prediction scores can be used to measure the similarity between two images. Experimental results on the Corel dataset and a PASCAL VOC dataset show that the learned Flickr features perform better on image matching, retrieval, and classification than conventional visual features. Visual models are usually trained to best separate positive and negative training examples. However, when recognizing a large number of object categories, there may not be enough training examples for most objects, due to the intrinsic long-tailed distribution of objects in the real world. This dissertation proposes an approach that uses comparative object similarity. The key insight is that, given a set of object categories which are similar and a set of categories which are dissimilar, a good object model should respond more strongly to examples from similar categories than to examples from dissimilar categories. This dissertation develops a regularized kernel machine algorithm to exploit this category-dependent similarity regularization. Experiments on hundreds of categories show that our method yields significant improvements for categories with few or even no positive examples.
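The intersection kernel at the heart of SIKMA is K(x, y) = Σᵢ min(xᵢ, yᵢ), typically applied to histogram features. The sketch below pairs that kernel with a simple kernel perceptron as a generic stand-in for stochastic kernel-machine training; the actual SIKMA update rule is not reproduced here:

```python
def intersection_kernel(x, y):
    """Histogram intersection kernel K(x, y) = sum_i min(x_i, y_i)."""
    return sum(min(xi, yi) for xi, yi in zip(x, y))

def kernel_perceptron(X, y, kernel, epochs=5):
    """Train a kernel perceptron: process examples one at a time and bump
    the dual coefficient of any example the current model misclassifies.
    Illustrative only -- SIKMA's stochastic SVM update differs."""
    alphas = [0.0] * len(X)
    for _ in range(epochs):
        for i, (xi, yi) in enumerate(zip(X, y)):
            pred = sum(a * yj * kernel(xj, xi)
                       for a, xj, yj in zip(alphas, X, y))
            if yi * pred <= 0:      # mistake (or undecided): update
                alphas[i] += 1.0
    return alphas

def predict(alphas, X, y, kernel, x):
    """Sign of the kernel expansion over the training set."""
    s = sum(a * yj * kernel(xj, x) for a, xj, yj in zip(alphas, X, y))
    return 1 if s > 0 else -1
```

Like SIKMA, this processes examples sequentially, so only the dual coefficients need to be held in memory rather than a full kernel matrix.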

Relevance: 30.00%

Publisher:

Abstract:

This thesis defends the position that Eastern Orthodoxy has the potential to develop, on the basis of its core concepts and doctrines, a new political theology that is participatory, personalist and universalist. This participatory political theology, as I name it, endorses modern democracy and the values of civic engagement. It enhances the process of democracy-building and consolidation in the SEE countries by cultivating an ethos of participation and concern for the common good, together with recognition of the dignity and freedom of the person. This political-theological model is developed through a critical analysis of the traditional models of church-state relations (the symphonia model, corresponding to the medieval empire, and the Christian nation model, corresponding to the nation-state) as instrumentalized to serve the political goals of non-democratic regimes. The participatory political-theological model is seen as corresponding to the conditions of the constitutional democratic state. The research is justified by the fact that Eastern Orthodoxy has been a dominant religious-cultural force in the European South East for centuries, playing a significant role in the creation of the medieval and modern statehood of the SEE countries. The analysis combines comparative constitutional perspectives on democratic transition and consolidation in the SEE region with the theoretical approaches of political theology and Eastern Orthodox theology. The conceptual basis for the political-theological synthesis is found in the concepts and doctrines of Eastern Orthodoxy (theosis and synergy, ecclesia and Eucharist, conciliarity and catholicity, economy and eschatology) which emphasize the participatory, personalist and communal dimensions of Orthodox faith and practice.
The paradigms of revealing the political-theological potential of these concepts are the Eucharistic ecclesiology and the concept of divine-human communion as defining the body of Orthodox theology. The thesis argues that with its ethos of openness and engagement the participatory political theology presupposes political systems that are democratic, inclusive, and participatory, respecting the rights and the dignity of the person. The political theology developed here calls for a transformation and change of democratic systems towards better realization of their personalist and participatory commitments. In the context of the SEE countries the participatory political theology addresses the challenges posed by alternative authoritarian political theologies practiced in neighboring regions.

Relevance: 30.00%

Publisher:

Abstract:

Hydrometallurgical process modeling is the main objective of this Master's thesis. Three leaching processes, namely high-pressure pyrite oxidation, direct oxidative leaching of zinc concentrate (sphalerite) and gold chloride leaching using a rotating disc electrode (RDE), are modeled and simulated using the gPROMS process simulation program in order to evaluate its model-building capabilities. The leaching mechanism in each case is described in terms of a shrinking core model. The mathematical modeling comprised process model development based on the available literature, estimation of reaction kinetic parameters, and assessment of model reliability by checking the goodness of fit and the cross-correlation between the estimated parameters through the use of correlation matrices. The estimated parameter values in each case were compared with those obtained using the Modest simulation program. Further, based on the estimated reaction kinetic parameters, reactor simulation and modeling for direct oxidative zinc concentrate (sphalerite) leaching is carried out in Aspen Plus V8.6. The zinc leaching autoclave is based on the Cominco reactor configuration and is modeled as a series of continuous stirred-tank reactors (CSTRs). The sphalerite conversion is calculated and a sensitivity analysis is carried out to determine the optimum reactor operating temperature and oxygen mass flow rate. In this way, the implementation of reaction kinetic models in a process flowsheet simulation environment has been demonstrated.
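The shrinking core model referenced above relates particle conversion to time through the rate-controlling step. As an illustration, under surface-reaction control the relation is 1 − (1 − X)^(1/3) = t/τ, where τ is the time for complete conversion of a particle (the thesis does not state which controlling mechanism applies to each system, so this choice is an assumption):

```python
def shrinking_core_conversion(t, tau):
    """Fraction converted X(t) for a surface-reaction-controlled
    shrinking core model: 1 - (1 - X)**(1/3) = t / tau, so
    X = 1 - (1 - t/tau)**3 for t < tau, and X = 1 afterwards."""
    if t >= tau:
        return 1.0
    return 1.0 - (1.0 - t / tau) ** 3
```

For diffusion-through-product-layer control the left-hand side changes to 1 − 3(1 − X)^(2/3) + 2(1 − X), but the overall fitting workflow is the same.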

Relevance: 30.00%

Publisher:

Abstract:

In an industrial environment, knowing the process one is working with is crucial to ensure its proper functioning. In the present work, developed at the Prio Biocombustíveis S.A. facilities, the methanol recovery process was characterized using process data collected during this work together with historical process data, starting with the characterization of key process streams. Based on the information retrieved from the stream characterization, the Aspen Plus® process simulation software was used to replicate the process and perform a sensitivity analysis with the objective of assessing the relative importance of certain key process variables (reflux/feed ratio, reflux temperature, reboiler outlet temperature, and methanol, glycerol and water feed compositions). The work proceeded with the application of a set of statistical tools, starting with Principal Component Analysis (PCA), through which the interactions between process variables and their contributions to process variability were studied. Next, Design of Experiments (DoE) was to be used to acquire experimental data and, with it, create a model for the water content of the distillate; however, the conditions necessary to perform this method were not met, so it was abandoned. The Multiple Linear Regression (MLR) method was then applied to the available data, producing several empirical models for the water in the distillate, the best of which had an R² of 92.93% and an AARD of 19.44%. Although the AARD is still relatively high, the model is adequate for fast estimates of the distillate's quality. As for fouling, its presence was noticed many times during this work. Since fouling could not be measured directly, the reboiler inlet steam pressure was used as an indicator of fouling growth and of its variation with the amount of Used Cooking Oil incorporated in the overall process.
Comparing the steam cost associated with the reboiler's operation when fouling is low (steam pressure of 1.5 bar) with that when fouling is high (steam pressure of 3 bar), an increase of about 58% occurs as fouling builds up.
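The R² and AARD figures quoted for the distillate model are standard fit statistics; for reference, they can be computed as below (a generic sketch, not the code used in the thesis):

```python
def r_squared(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((yt - yp) ** 2 for yt, yp in zip(y_true, y_pred))
    ss_tot = sum((yt - mean) ** 2 for yt in y_true)
    return 1.0 - ss_res / ss_tot

def aard(y_true, y_pred):
    """Average absolute relative deviation, in percent:
    AARD = (100 / N) * sum |y_pred - y_true| / y_true."""
    return 100.0 / len(y_true) * sum(abs(yp - yt) / yt
                                     for yt, yp in zip(y_true, y_pred))
```

R² rewards capturing the variance of the data, while AARD penalises relative error per point, which is why a model can score a high R² yet still carry a sizeable AARD, as reported here.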

Relevance: 30.00%

Publisher:

Abstract:

Blogging is one of the most common forms of social media today. Blogs have become a powerful medium, and bloggers are established stakeholders for marketers. Commercialization of the blogosphere has enabled an increasing number of bloggers to professionalize and blog as a full-time occupation. The purpose of this study is to understand the professionalization process of a blogger from amateur to professional actor. The following sub-questions were used to further elaborate the topic: What have been the meaningful events and developments fostering professionalization? What are the prerequisites for popularity in blogging? Are there key success factors to acknowledge in order to be able to make a business out of a blog? The theoretical framework of this study was formed around the two chosen focus areas for professionalization: social drivers and business drivers. It is based on literature from the fields of marketing and the social sciences, as well as previous research on social media, blogging and professionalization. The study is a qualitative case study and the research data were collected in a semi-structured interview. The case chosen for this study is a lifestyle blog whose writer has been able to develop her blog to the point of becoming a full-time professional blogger. Based on the results, the professionalization process of a blogger is not a defined process, but instead comprises coincidental events as well as considered advancements. Success in blogging is based on the blogger's own motivation and passion for writing and self-expression in the form of a blog, rather than on the systematic construction of a successful blogging career. Networking with other bloggers as well as affiliates was seen as an important success factor. Popularity in the blogosphere and a high number of followers enable professionalization, as marketers actively seek to collaborate with popular bloggers with strong personal brands.
Bloggers with strong personal brands are especially attractive due to their opinion leadership in their reference group. A blogger can act professionally either as an entrepreneur or by blogging for a commercial webpage. According to the results of this study, it is beneficial for the blogger's professional development and career progress to act through different operating models.