908 results for dual-process model


Relevance:

100.00%

Publisher:

Abstract:

For the timber industry, the ability to simulate the drying of wood is invaluable for manufacturing high quality wood products. Mathematically, however, modelling the drying of a wet porous material, such as wood, is a difficult task due to its heterogeneous and anisotropic nature, and the complex geometry of the underlying pore structure. The well-developed macroscopic modelling approach involves writing down classical conservation equations at a length scale where physical quantities (e.g., porosity) can be interpreted as averaged values over a small volume (typically containing hundreds or thousands of pores). This averaging procedure produces balance equations that resemble those of a continuum, with the exception that effective coefficients appear in their definitions.

Exponential integrators are numerical schemes for initial value problems involving a system of ordinary differential equations. These methods differ from popular Newton-Krylov implicit methods (i.e., those based on the backward differentiation formulae (BDF)) in that they do not require the solution of a system of nonlinear equations at each time step; rather, they require the computation of matrix-vector products involving the exponential of the Jacobian matrix. Although originally appearing in the 1960s, exponential integrators have recently experienced a resurgence of interest, driven by advances in Krylov subspace methods for matrix function approximation. One of the simplest examples of an exponential integrator is the exponential Euler method (EEM), which requires, at each time step, the approximation of φ(A)b, where φ(z) = (e^z - 1)/z, A ∈ ℝ^(n×n) and b ∈ ℝ^n.

For drying in porous media, the most comprehensive macroscopic formulation is TransPore [Perre and Turner, Chem. Eng. J., 86: 117-131, 2002], which features three coupled, nonlinear partial differential equations. The focus of the first part of this thesis is the use of the exponential Euler method (EEM) for performing the time integration of the macroscopic set of equations featured in TransPore. In particular, a new variable-stepsize algorithm for EEM is presented within a Krylov subspace framework, which allows control of the error during the integration process. The performance of the new algorithm highlights the great potential of exponential integrators not only for drying applications but across all disciplines of transport phenomena. For example, when applied to well-known benchmark problems involving single-phase liquid flow in heterogeneous soils, the proposed algorithm requires half as many function evaluations as an equivalent (sophisticated) Newton-Krylov BDF implementation. Furthermore, for all drying configurations tested, the new algorithm always produces, in less computational time, a solution of higher accuracy than the existing backward Euler module featured in TransPore. Some new results relating to Krylov subspace approximation of φ(A)b are also developed in this thesis. Most notably, an alternative derivation of the approximation error estimate of Hochbruck, Lubich and Selhofer [SIAM J. Sci. Comput., 19(5): 1552-1574, 1998] is provided, which reveals why it performs well in the error control procedure. Two of the main drawbacks of the macroscopic approach outlined above are that the effective coefficients must be supplied to the model, and that it fails for some drying configurations where typical dual-scale mechanisms occur.
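To make the role of φ(A)b concrete, here is a minimal, self-contained Python sketch of a fixed-step exponential Euler loop. It evaluates φ(A)b exactly via the standard augmented-matrix identity and a dense matrix exponential, whereas the thesis uses a variable-stepsize scheme with Krylov subspace approximations; all function names and the small test problem below are illustrative only.

```python
import numpy as np
from scipy.linalg import expm

def phi_times_vector(M, v):
    """Return phi(M) @ v with phi(z) = (exp(z) - 1)/z, using the augmented-matrix
    identity expm([[M, v], [0, 0]]) = [[expm(M), phi(M) @ v], [0, 1]].
    Dense expm is used purely for illustration; a Krylov approximation would be
    needed for the large sparse Jacobians arising in drying simulations."""
    n = M.shape[0]
    aug = np.zeros((n + 1, n + 1))
    aug[:n, :n] = M
    aug[:n, n] = v
    return expm(aug)[:n, n]

def exponential_euler(f, jac, y0, t0, t1, steps):
    """Fixed-step exponential Euler method: y_{k+1} = y_k + h * phi(h*J_k) f(y_k)."""
    h = (t1 - t0) / steps
    y = np.asarray(y0, dtype=float)
    for _ in range(steps):
        y = y + h * phi_times_vector(h * jac(y), f(y))
    return y

# Tiny linear test problem y' = A y (EEM reproduces expm(A) y0 exactly here).
A = np.array([[-2.0, 1.0], [1.0, -2.0]])
y0 = np.array([1.0, 0.0])
print(exponential_euler(lambda y: A @ y, lambda y: A, y0, 0.0, 1.0, 10))
print(expm(A) @ y0)
```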
In the second part of this thesis, a new dual-scale approach for simulating wood drying is proposed that couples the porous medium (macroscale) with the underlying pore structure (microscale). The proposed model is applied to the convective drying of softwood at low temperatures and is valid in the so-called hygroscopic range, where hygroscopically held liquid water is present in the solid phase and water exists only as vapour in the pores. Coupling between scales is achieved by imposing the macroscopic gradient on the microscopic field using suitably defined periodic boundary conditions, which allows the macroscopic flux to be defined as an average of the microscopic flux over the unit cell. This formulation provides a first step for moving from the macroscopic formulation featured in TransPore to a comprehensive dual-scale formulation capable of addressing any drying configuration. Simulation results reported for a sample of spruce highlight the potential and flexibility of the new dual-scale approach. In particular, for a given unit cell configuration it is not necessary to supply the effective coefficients prior to each simulation.
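In generic homogenisation notation (illustrative only; the thesis's own symbols and equations may differ), the scale coupling described above amounts to solving a microscale problem on a unit cell with a periodic fluctuation superimposed on the imposed macroscopic gradient, and then recovering the macroscopic flux as a cell average:

```latex
% Microscale field on the unit cell \Omega: imposed macroscopic gradient
% plus an \Omega-periodic fluctuation \tilde{u}.
u_\mu(\mathbf{x},\mathbf{y}) = \nabla U(\mathbf{x}) \cdot \mathbf{y} + \tilde{u}(\mathbf{y}),
\qquad \tilde{u} \ \text{periodic on } \Omega

% Macroscopic flux recovered as the average of the microscopic flux over the cell.
\mathbf{q}(\mathbf{x}) = \frac{1}{|\Omega|} \int_{\Omega} \mathbf{q}_\mu(\mathbf{x},\mathbf{y}) \, \mathrm{d}\mathbf{y}
```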

Relevance:

100.00%

Publisher:

Abstract:

This paper advocates the effectiveness of a dual-technique model of interviewing, which combines narrative and depth interview techniques, within the case study method in a cross-cultural management research setting: an Australian MNC operating in China. The case study is acknowledged to be a highly appropriate method for gaining insight into the complicated area of cross-cultural management enquiry in order to generate new theories. In this context, we propose a model which combines the narrative and the depth interview techniques in the interview process, and illustrate its usefulness with material drawn from the China-Australia cross-cultural research interface.

After establishing the rationale for the model, the discussion focuses on the practicalities of applying it in interviews, in relation to the preparation, warm-up and trust-building phases, and to the exercise of personal interviewing skills in cross-cultural research, in this case the advantage of the interviewer being bilingual.

Relevance:

100.00%

Publisher:

Abstract:

The goal of the present article is to introduce dual-process theories – in particular the default-interventionist model – as an overarching framework for attention-related research in sports. Dual-process theories propose that two different types of processing guide human behavior. Type 1 processing is independent of available working memory capacity (WMC), whereas Type 2 processing depends on available WMC. We review the latest theoretical developments on dual-process theories and present evidence for their validity from various domains. We demonstrate how existing sport psychology findings can be integrated within the dual-process framework. We illustrate how future sport psychology research might benefit from adopting the dual-process framework as a meta-theoretical framework by arguing that the complex interplay between Type 1 and Type 2 processing has to be taken into account in order to gain a more complete understanding of the dynamic nature of attentional processing during sport performance at varying levels of expertise. Finally, we demonstrate that sport psychology applications might benefit from the dual-process perspective as well: dual-process theories are able to predict which behaviors can be more successfully executed when relying on Type 1 processing and which behaviors benefit from Type 2 processing.

Relevance:

100.00%

Publisher:

Abstract:

This paper deals with the problem of using data mining models in a real-world situation where the user cannot provide all the inputs with which the predictive model was built. A learning system framework, the Query Based Learning System (QBLS), is developed for improving the performance of predictive models in practice, where not all inputs are available for querying the system. An automatic feature selection algorithm called Query Based Feature Selection (QBFS) is developed to select a feature subset that balances minimising the number of features against maximising classification accuracy. The performance of the QBLS system and the QBFS algorithm is successfully demonstrated with a real-world application.
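The abstract does not spell out QBFS itself. As a rough, hypothetical illustration of the general idea (greedy selection restricted to the inputs a user can actually supply, trading subset size against accuracy), consider the following sketch; the function name, the stopping rule and the use of scikit-learn are assumptions, not the authors' method.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def greedy_query_based_selection(X, y, queryable, tol=0.005):
    """Greedily grow a feature subset, restricted to the feature indices in
    `queryable` (those the user can supply at query time), until the gain in
    cross-validated accuracy drops below `tol`."""
    selected, best_score = [], 0.0
    candidates = list(queryable)
    while candidates:
        scores = []
        for f in candidates:
            cols = selected + [f]
            acc = cross_val_score(LogisticRegression(max_iter=1000),
                                  X[:, cols], y, cv=5).mean()
            scores.append((acc, f))
        acc, f = max(scores)                # best single addition this round
        if acc - best_score < tol:          # accuracy gain too small: stop
            break
        selected.append(f)
        candidates.remove(f)
        best_score = acc
    return selected, best_score
```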

Relevance:

100.00%

Publisher:

Abstract:

This appendix describes the Order Fulfillment process followed by a fictitious company named Genko Oil. The process is freely inspired by the VICS (Voluntary Inter-industry Commerce Solutions) reference model and provides a demonstration of YAWL’s capabilities in modelling complex control-flow, data and resourcing requirements.

Relevance:

100.00%

Publisher:

Abstract:

While Business Process Management (BPM) is an established discipline, the increased adoption of BPM technology in recent years has introduced new challenges. One challenge concerns dealing with process model complexity in order to improve the understanding of a process model by stakeholders and process analysts. Features for dealing with this complexity can be classified into two categories: 1) those that are solely concerned with the appearance of the model, and 2) those that in essence change the structure of the model. In this paper we focus on the former category and present a collection of patterns that generalize and conceptualize various existing features. The paper concludes with a detailed analysis of the degree of support for these patterns offered by a number of state-of-the-art languages and language implementations.

Relevance:

100.00%

Publisher:

Abstract:

Background: Exercise could contribute to weight loss by altering the sensitivity of the appetite regulatory system. Objective: The aim of this study was to assess the effects of 12 wk of mandatory exercise on appetite control. Design: Fifty-eight overweight and obese men and women [mean (±SD) body mass index (in kg/m²) = 31.8 ± 4.5, age = 39.6 ± 9.8 y, and maximal oxygen uptake = 29.1 ± 5.7 mL · kg⁻¹ · min⁻¹] completed 12 wk of supervised exercise in the laboratory. The exercise sessions were designed to expend 2500 kcal/wk. Subjective appetite sensations and the satiating efficiency of a fixed breakfast were compared at baseline (week 0) and at week 12. An Electronic Appetite Rating System was used to measure subjective appetite sensations immediately before and after the fixed breakfast, in the immediate postprandial period, and across the whole day. The satiety quotient of the breakfast was determined by calculating the change in appetite scores relative to the breakfast's energy content. Results: Despite large variability, there was a significant reduction in mean body weight (3.2 ± 3.6 kg), fat mass (3.2 ± 2.2 kg), and waist circumference (5.0 ± 3.2 cm) after 12 wk. The analysis showed that a reduction in body weight and body composition was accompanied by an increase in fasting hunger and in average hunger across the day (P < 0.0001). Paradoxically, the immediate and delayed satiety quotient of the breakfast also increased significantly (P < 0.05). Conclusions: These data show that the effect of exercise on appetite regulation involves at least 2 processes: an increase in the overall (orexigenic) drive to eat and a concomitant increase in the satiating efficiency of a fixed meal.
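For readers unfamiliar with the measure, a satiety quotient is typically computed as the change in a rated appetite sensation per unit of energy consumed. The following one-line sketch uses a generic formulation and made-up numbers, not the study's data or exact scaling.

```python
def satiety_quotient(rating_before, rating_after, energy_kcal, per_kcal=100.0):
    """Change in an appetite rating (e.g., hunger on a 0-100 mm visual analogue
    scale) per `per_kcal` kcal of the meal; higher = more satiating per unit energy."""
    return (rating_before - rating_after) / energy_kcal * per_kcal

# Hypothetical example: hunger falls from 70 mm to 30 mm after a 500 kcal breakfast.
print(satiety_quotient(70, 30, 500))  # 8.0 mm per 100 kcal
```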

Relevance:

100.00%

Publisher:

Abstract:

Process models are used by information professionals to convey semantics about the business operations in a real world domain intended to be supported by an information system. The understandability of these models is vital to them actually being used. After all, what is not understood cannot be acted upon. Yet until now, understandability has primarily been defined as an intrinsic quality of the models themselves. Moreover, those studies that looked at understandability from a user perspective have mainly conceptualized users through rather arbitrary sets of variables. In this paper we advance an integrative framework to understand the role of the user in the process of understanding process models. Building on cognitive psychology, goal-setting theory and multimedia learning theory, we identify three stages of learning required to realize model understanding, these being Presage, Process, and Product. We define eight relevant user characteristics in the Presage stage of learning, three knowledge construction variables in the Process stage and three potential learning outcomes in the Product stage. To illustrate the benefits of the framework, we review existing process modeling work to identify where our framework can complement and extend existing studies.

Relevance:

100.00%

Publisher:

Abstract:

As a result of the growing adoption of Business Process Management (BPM) technology, different stakeholders need to understand and agree upon the process models that are used to configure BPM systems. However, BPM users have problems dealing with the complexity of such models. Therefore, the challenge is to improve the comprehension of process models. While a substantial amount of literature is devoted to this topic, there is no overview of the various mechanisms that exist for managing complexity in (large) process models. It is thus hard to obtain comparative insight into the degree of support offered for various complexity-reducing mechanisms by state-of-the-art languages and tools. This paper focuses on complexity reduction mechanisms that affect the abstract syntax of a process model, i.e., the structure of the process model. These mechanisms are captured as patterns, so that they can be described in their most general form and in a language- and tool-independent manner. The paper concludes with a comparative overview of the degree of support for these patterns offered by state-of-the-art languages and language implementations.

Relevance:

100.00%

Publisher:

Abstract:

This paper addresses the problem of constructing consolidated business process models out of collections of process models that share common fragments. The paper considers the construction of unions of multiple models (called merged models) as well as intersections (called digests). Merged models are intended for analysts who wish to create a model that subsumes a collection of process models - typically representing variants of the same underlying process - with the aim of replacing the variants with the merged model. Digests, on the other hand, are intended for analysts who wish to identify the most recurring fragments across a collection of process models, so that they can focus their efforts on optimizing these fragments. The paper presents an algorithm for computing merged models and an algorithm for extracting digests from a merged model. The merging and digest extraction algorithms have been implemented and tested against collections of process models taken from multiple application domains. The tests show that the merging algorithm produces compact models and scales up to process models containing hundreds of nodes. Furthermore, a case study conducted in a large insurance company has demonstrated the usefulness of the merging and digest extraction operators in a practical setting.
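As a toy illustration of the union/intersection idea (not the paper's algorithm, which operates on shared process fragments and records provenance in the merged model), one can think of each process variant as a set of labelled directed edges:

```python
from collections import Counter

# Each variant is modelled here as a set of (task, task) edges.  This is a
# deliberately naive sketch for intuition only.

def merged_model(variants):
    """Union: every edge that occurs in at least one variant."""
    edges = set()
    for variant in variants:
        edges |= variant
    return edges

def digest(variants, min_share=1.0):
    """Edges occurring in at least `min_share` of the variants;
    min_share=1.0 gives the plain intersection."""
    counts = Counter(edge for variant in variants for edge in variant)
    threshold = min_share * len(variants)
    return {edge for edge, count in counts.items() if count >= threshold}

v1 = {("Receive order", "Check stock"), ("Check stock", "Ship goods")}
v2 = {("Receive order", "Check stock"), ("Check stock", "Back-order")}
print(merged_model([v1, v2]))  # three distinct edges across both variants
print(digest([v1, v2]))        # only the shared ("Receive order", "Check stock") edge
```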

Relevance:

100.00%

Publisher:

Abstract:

As organizations reach higher levels of Business Process Management maturity, they tend to collect numerous business process models. Such models may be linked with each other or mutually overlap, supersede one another and evolve over time. Moreover, they may be represented at different abstraction levels depending on the target audience and modeling purpose, and may be available in multiple languages (e.g. due to company mergers). Thus, it is common that organizations struggle to keep track of their process models. This demonstration introduces AProMoRe (Advanced Process Model Repository), which aims to facilitate the management of (large) process model collections.