72 results for Interaction modeling. Model-based development. Interaction evaluation.


Relevance: 100.00%

Publisher:

Abstract:

Background: Reliable information on causes of death is a fundamental component of health development strategies, yet globally only about one-third of countries have access to such information. For countries currently without adequate mortality reporting systems there are useful models other than resource-intensive population-wide medical certification. Sample-based mortality surveillance is one such approach. This paper provides methods for addressing appropriate sample size considerations in relation to mortality surveillance, with particular reference to situations in which prior information on mortality is lacking. Methods: The feasibility of model-based approaches for predicting the expected mortality structure and cause composition is demonstrated for populations in which only limited empirical data are available. An algorithmic approach is then provided to derive the minimum person-years of observation needed to generate robust estimates for the rarest cause of interest in three hypothetical populations, each representing a different level of health development. Results: Modelled life expectancies at birth and cause-of-death structures were within expected ranges based on published estimates for countries at comparable levels of health development. The total person-years of observation required in each population could be more than halved by limiting the set of age, sex, and cause groups regarded as 'of interest'. Discussion: The methods proposed are consistent with the philosophy of establishing priorities across broad clusters of causes for which the public health response implications are similar. The examples provided illustrate the options available when considering the design of mortality surveillance for population health monitoring purposes.
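
A minimal sketch of the person-years calculation described above, assuming cause-specific deaths follow a Poisson distribution so that the relative standard error of a rate is roughly 1/sqrt(expected deaths); the mortality rate and precision target below are illustrative values, not figures from the paper:

```python
import math

def person_years_required(rate_per_1000, target_rse=0.2):
    """Minimum person-years of observation so that the expected number of deaths
    from a cause gives a relative standard error <= target_rse, assuming Poisson
    counts (RSE ~ 1 / sqrt(expected deaths))."""
    expected_deaths_needed = 1.0 / target_rse ** 2        # e.g. 25 deaths for a 20% RSE
    rate_per_person_year = rate_per_1000 / 1000.0
    return expected_deaths_needed / rate_per_person_year

# Hypothetical rarest cause of interest: 0.05 deaths per 1000 person-years
print(round(person_years_required(0.05, target_rse=0.2)))  # -> 500000 person-years
```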

Relevance: 100.00%

Publisher:

Abstract:

The leaching of elements from the surface of charged fly ash particles is known to be an unsteady process. The mass transfer resistance provided by the diffuse double layer has been quantified as one of the reasons for this delayed leaching. In this work, a model based on mass transfer principles for predicting the concentration of calcium hydroxide in the diffuse double layer is presented. The significant difference between the predicted calcium hydroxide concentration and the experimentally measured values is explained.
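
The abstract does not give the model's equations; as a rough, generic illustration of film-theory mass transfer into a boundary layer (not the authors' model), the concentration build-up can be written as dC/dt = (k_L A / V)(C_sat - C). All parameter values below are hypothetical:

```python
import math

def film_model_concentration(t, c_sat, k_l, area, volume, c0=0.0):
    """Generic film (boundary-layer) mass-transfer sketch:
    dC/dt = (k_l * area / volume) * (c_sat - C), which integrates to
    C(t) = c_sat - (c_sat - c0) * exp(-(k_l * area / volume) * t)."""
    k = k_l * area / volume
    return c_sat - (c_sat - c0) * math.exp(-k * t)

# Illustrative values only: saturation 0.02 mol/L, k_l = 1e-6 m/s,
# surface area 1e-3 m^2 acting on a double-layer volume of 1e-6 m^3
for t in (60, 600, 3600):  # seconds
    print(t, film_model_concentration(t, c_sat=0.02, k_l=1e-6, area=1e-3, volume=1e-6))
```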

Relevance: 100.00%

Publisher:

Abstract:

This paper describes a biventricular model, which couples the electrical and mechanical properties of the heart, and computer simulations of ventricular wall motion and deformation based on this model. In the constructed electromechanical model, the mechanical analysis was based on composite material theory and the finite-element method; the propagation of electrical excitation was simulated using an electrical heart model, and the resulting active forces were used to calculate ventricular wall motion. Regional deformation and Lagrangian strain tensors were calculated during the systole phase. Displacements, minimum principal strains and torsion angle were used to describe the motion of the two ventricles. The simulations showed that during systole, (1) the right ventricular free wall moves towards the septum, and at the same time, the base and middle of the free wall move towards the apex, which reduces the volume of the right ventricle; the minimum principal strain (E3) is largest at the apex, then at the middle of the free wall, and its direction is approximately that of the epicardial muscle fibres; (2) the base and middle of the left ventricular free wall move towards the apex and the apex remains almost static; the torsion angle is largest at the apex; the minimum principal strain E3 is largest at the apex and its direction on the surface of the middle wall of the left ventricle roughly follows the fibre orientation. These results are in good accordance with results obtained from MR tagging images reported in the literature. This study suggests that such an electromechanical biventricular model has the potential to be used to assess the mechanical function of the two ventricles, and could also improve the accuracy of ECG simulation when used in heart-torso model-based body surface potential simulation studies.
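
For context, the Lagrangian (Green-Lagrange) strain tensor and the minimum principal strain E3 mentioned above can be obtained from a deformation gradient F as in the small sketch below; the deformation gradient used here is invented for illustration and is not data from the study:

```python
import numpy as np

def lagrangian_strain(F):
    """Green-Lagrange strain tensor E = 0.5 * (F^T F - I) from a deformation gradient F."""
    return 0.5 * (F.T @ F - np.eye(3))

def min_principal_strain(E):
    """Minimum principal strain E3 and its direction (eigenvector of the smallest eigenvalue)."""
    vals, vecs = np.linalg.eigh(E)  # eigenvalues returned in ascending order
    return vals[0], vecs[:, 0]

# Hypothetical deformation gradient for one myocardial element during systole
F = np.array([[0.92, 0.05, 0.00],
              [0.00, 1.03, 0.02],
              [0.00, 0.00, 0.98]])
E3, direction = min_principal_strain(lagrangian_strain(F))
print(E3, direction)
```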

Relevance: 100.00%

Publisher:

Abstract:

Pervasive computing applications must be engineered to provide unprecedented levels of flexibility in order to reconfigure and adapt in response to changes in computing resources and user requirements. To meet these challenges, appropriate software engineering abstractions and infrastructure are required as a platform on which to build adaptive applications. In this paper, we demonstrate the use of a disciplined, model-based approach to engineer a context-aware Session Initiation Protocol (SIP) based communication application. This disciplined approach builds on our previously developed conceptual models and infrastructural components, which enable the description, acquisition, management and exploitation of arbitrary types of context and user preference information, thereby supporting adaptation to context changes.
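
As a loose illustration of the kind of context and preference exploitation described above (not the authors' infrastructure or an actual SIP stack), a context model plus preference rules driving an adaptation decision might be sketched as follows; all names and the example rule are hypothetical:

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class ContextFact:
    """A single piece of context information (kind, value), e.g. location or activity."""
    kind: str
    value: str

@dataclass
class Preference:
    """A user preference: when the condition holds over the current context, apply the action."""
    condition: Callable[[List[ContextFact]], bool]
    action: str  # e.g. the SIP-level adaptation to perform

def adapt(context: List[ContextFact], preferences: List[Preference]) -> List[str]:
    """Return the adaptation actions whose conditions are satisfied by the current context."""
    return [p.action for p in preferences if p.condition(context)]

# Hypothetical rule: redirect incoming calls to voicemail while the user is in a meeting
prefs = [Preference(lambda ctx: any(f.kind == "activity" and f.value == "meeting" for f in ctx),
                    "redirect-to-voicemail")]
print(adapt([ContextFact("activity", "meeting")], prefs))  # ['redirect-to-voicemail']
```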

Relevance: 100.00%

Publisher:

Abstract:

Model transformations are an integral part of model-driven development. Incremental updates are a key execution scenario for transformations in model-based systems, and are especially important for the evolution of such systems. This paper presents a strategy for the incremental maintenance of declarative, rule-based transformation executions. The strategy involves recording dependencies of the transformation execution on information from source models and from the transformation definition. Changes to the source models or the transformation itself can then be directly mapped to their effects on transformation execution, allowing changes to target models to be computed efficiently. This particular approach has many benefits. It supports changes to both source models and transformation definitions, it can be applied to incomplete transformation executions, and a priori knowledge of volatility can be used to further increase the efficiency of change propagation.
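
A toy sketch of the dependency-recording idea, with a one-to-one rule and per-element dependency sets; the actual strategy covers declarative rule sets, changes to the transformation definition itself, and volatility hints, which this simplification omits:

```python
from collections import defaultdict

class IncrementalTransformation:
    """Minimal sketch of dependency recording for a rule-based transformation:
    each target element remembers which source elements it was derived from,
    so a source-model change re-executes only the affected rule applications."""

    def __init__(self, rule):
        self.rule = rule                      # rule: source element -> target element
        self.targets = {}                     # source id -> target element
        self.dependencies = defaultdict(set)  # source id -> recorded source dependencies

    def run(self, source_model):
        for sid, element in source_model.items():
            self.targets[sid] = self.rule(element)
            self.dependencies[sid].add(sid)
        return self.targets

    def propagate_change(self, source_model, changed_ids):
        """Recompute only targets whose recorded dependencies include a changed source element."""
        affected = {sid for sid, deps in self.dependencies.items() if deps & set(changed_ids)}
        for sid in affected:
            self.targets[sid] = self.rule(source_model[sid])
        return affected

# Hypothetical usage: a class-to-table rule
rule = lambda cls: {"table": cls["name"].lower()}
tx = IncrementalTransformation(rule)
source = {1: {"name": "Customer"}, 2: {"name": "Order"}}
tx.run(source)
source[2]["name"] = "PurchaseOrder"
print(tx.propagate_change(source, [2]))  # only element 2 is re-transformed
```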

Relevance: 100.00%

Publisher:

Abstract:

Multiple sampling is widely used in vadose zone percolation experiments to investigate the extent to which soil structure heterogeneities influence the spatial and temporal distributions of water and solutes. In this note, a simple, robust, mathematical model, based on the beta statistical distribution, is proposed as a method of quantifying the magnitude of heterogeneity in such experiments. The model relies on fitting two parameters, alpha and zeta, to the cumulative elution curves generated in multiple-sample percolation experiments. The model does not require knowledge of the soil structure. A homogeneous or uniform distribution of a solute and/or soil water is indicated by alpha = zeta = 1. Using these parameters, a heterogeneity index (HI) is defined as √3 times the ratio of the standard deviation to the mean. Uniform or homogeneous flow of water or solutes is indicated by HI = 1, and heterogeneity is indicated by HI > 1. A large value of this index may indicate preferential flow. The heterogeneity index relies only on knowledge of the elution curves generated from multiple-sample percolation experiments and is, therefore, easily calculated. The index may also be used to describe and compare the differences in solute and soil-water percolation from different experiments. The use of this index is discussed for several different leaching experiments. (C) 1999 Elsevier Science B.V. All rights reserved.
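
Since the index depends only on the fitted beta parameters, it can be computed directly from alpha and zeta; the second parameter pair below is purely illustrative:

```python
import math

def heterogeneity_index(alpha, zeta):
    """HI = sqrt(3) * (standard deviation / mean) of a Beta(alpha, zeta) distribution
    fitted to the cumulative elution curves; alpha = zeta = 1 (uniform flow) gives HI = 1."""
    mean = alpha / (alpha + zeta)
    var = alpha * zeta / ((alpha + zeta) ** 2 * (alpha + zeta + 1))
    return math.sqrt(3) * math.sqrt(var) / mean

print(heterogeneity_index(1.0, 1.0))  # 1.0 -> homogeneous flow
print(heterogeneity_index(0.4, 2.5))  # > 1 -> heterogeneous, possibly preferential, flow
```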

Relevance: 100.00%

Publisher:

Abstract:

Test templates and a test template framework are introduced as useful concepts in specification-based testing. The framework can be defined using any model-based specification notation and used to derive tests from model-based specifications; in this paper, it is demonstrated using the Z notation. The framework formally defines test data sets and their relation to the operations in a specification and to other test data sets, providing structure to the testing process. Flexibility is preserved, so that many testing strategies can be used. Important application areas of the framework are discussed, including refinement of test data, regression testing, and test oracles.
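
The framework itself is defined over Z specifications; as a very loose analogue in code, the idea of partitioning an operation's valid input space into test templates, instantiating them with concrete data, and checking an oracle derived from the specification might look like the following sketch (the operation and partitions are hypothetical):

```python
def withdraw(balance, amount):
    """Operation under test: withdraw the amount if funds are sufficient, else leave unchanged."""
    return balance - amount if 0 < amount <= balance else balance

# Templates partition the input space; each is instantiated with concrete test data.
templates = {
    "valid_amount": [(100, 40), (100, 100)],  # 0 < amount <= balance
    "zero_amount":  [(100, 0)],               # boundary: nothing withdrawn
    "overdraw":     [(100, 150)],             # amount > balance: state unchanged
}

def oracle(balance, amount, result):
    """Expected behaviour as stated by the (hypothetical) specification."""
    expected = balance - amount if 0 < amount <= balance else balance
    return result == expected

for name, cases in templates.items():
    for balance, amount in cases:
        assert oracle(balance, amount, withdraw(balance, amount)), (name, balance, amount)
print("all template-derived tests passed")
```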

Relevance: 100.00%

Publisher:

Abstract:

Weiss and Isen have provided many supportive comments about the multi-level perspective, but also found limitations. Isen noted the importance of integrating affect, cognition, and motivation. Weiss commented similarly that the model lacked an integrating “thread.” He suggested that, to be truly multilevel, each level should constrain processes at other levels, and also provide guidance for the development of new concepts. Weiss also noted that the focus on biological processes was a strength of the model. I respond by suggesting that these very biological processes may constitute the “missing” thread. To illustrate this, I discuss some of the recent research on emotions in organizational settings, and argue that biology both constrains and guides theory at each level of the model. Based on this proposition, I revisit each of the five levels of the model to demonstrate how this integration can be accomplished. Finally, I address two additional points: aggregation bias, and the possibility of extending the model to include the higher levels of industry and region.

Relevance: 100.00%

Publisher:

Abstract:

Six of the short dietary questions used in the 1995 National Nutrition Survey (listed below) were evaluated for relative validity, both directly and indirectly, and for consistency, by documenting the differences in mean intakes of foods and nutrients, as measured on the 24-hour recall, between groups with different responses to the short questions.

1. Including snacks, how many times do you usually have something to eat in a day, including evenings?
2. How many days per week do you usually have something to eat for breakfast?
3. In the last 12 months, were there any times that you ran out of food and couldn’t afford to buy more?
4. What type of milk do you usually consume?
5. How many serves of vegetables do you usually eat each day? (a serve = 1/2 cup cooked vegetables or 1 cup of salad vegetables)
6. How many serves of fruit do you usually eat each day? (a serve = 1 medium piece or 2 small pieces of fruit or 1 cup of diced pieces)

These comparisons were made for males and females overall and for population sub-groups of interest, including: age, socio-economic disadvantage, region of residence, country of birth, and BMI category. Several limitations of this evaluation of the short questions, as discussed in the report, need to be kept in mind, including:

· The method available for comparison (24-hour recall) was not an ideal (gold standard) method, as it measures yesterday’s intake. This limitation was overcome by examining only mean differences between groups of respondents, since mean intake for a group can provide a reasonable approximation of ‘usual’ intake.
· The need to define and identify, post hoc, from the 24-hour recall the number of eating occasions, and the occasions identified by the respondents as breakfast.
· Predetermined response categories for some of the questions effectively limited the number of categories available for evaluation.
· Other foods and nutrients, not selected for this evaluation, may have an indirect relationship with the question, and might have shown stronger and more consistent responses.
· The number of responses in some categories of the short questions (e.g. for food security) may have been too small to detect significant differences between population sub-groups.
· No information was available to examine the validity of these questions for detecting differences over time (establishing trends) in food habits and indicators of selected nutrient intakes.

By contrast, the strength of this evaluation was its very large sample size (atypical of most validation studies of dietary assessment) and thus the opportunity to investigate question performance in a range of broad population sub-groups, compared with a well-conducted, quantified survey of intakes. The results of the evaluation are summarised below for each of the questions, and specific recommendations for future testing, modification and use are provided for each question. The report concludes with some general recommendations for the further development and evaluation of short dietary questions.
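
The core of the evaluation, comparing mean 24-hour-recall intakes across short-question response categories, amounts to a grouped comparison such as the sketch below; the question, nutrient, and all values are invented for illustration:

```python
import pandas as pd

# Hypothetical records: each row is one respondent's 24-hour recall total for a
# nutrient plus the response category they gave to a short dietary question.
records = pd.DataFrame({
    "fruit_serves_question": ["<1", "1", "2+", "1", "2+", "<1"],
    "recall_vitamin_c_mg":   [35.0, 60.0, 110.0, 55.0, 95.0, 40.0],
})

# Relative validity check: mean recalled intake should increase across response categories.
print(records.groupby("fruit_serves_question")["recall_vitamin_c_mg"].agg(["mean", "count"]))
```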

Relevance: 100.00%

Publisher:

Abstract:

A mixture model incorporating long-term survivors has been adopted in the field of biostatistics, where some individuals may never experience the failure event under study. The surviving fractions may be considered as cured. In most applications, the survival times are assumed to be independent. However, when the survival data are obtained from a multi-centre clinical trial, it is conceived that the environmental conditions and facilities shared within a clinic affect the proportion cured as well as the failure risk for the uncured individuals. This necessitates a long-term survivor mixture model with random effects. In this paper, the long-term survivor mixture model is extended for the analysis of multivariate failure time data using the generalized linear mixed model (GLMM) approach. The proposed model is applied to analyse a numerical data set from a multi-centre clinical trial of carcinoma as an illustration. Some simulation experiments are performed to assess the applicability of the model based on the average biases of the estimates obtained. Copyright (C) 2001 John Wiley & Sons, Ltd.
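
A bare-bones sketch of the mixture idea: a cured fraction pi, an exponential survival function for the uncured group, and a clinic-level random effect entering the cured proportion through a logistic link. This is illustrative only and is not the paper's GLMM estimation procedure; all numbers are made up:

```python
import math

def population_survival(t, cure_prob, hazard):
    """Mixture cure model sketch: S_pop(t) = pi + (1 - pi) * S_u(t),
    with exponential survival S_u(t) = exp(-hazard * t) for the uncured group."""
    return cure_prob + (1.0 - cure_prob) * math.exp(-hazard * t)

def cure_prob_with_random_effect(intercept, clinic_effect):
    """Clinic-level random effect on the cured proportion via a logistic link (illustrative)."""
    eta = intercept + clinic_effect
    return 1.0 / (1.0 + math.exp(-eta))

pi = cure_prob_with_random_effect(intercept=-0.5, clinic_effect=0.3)
print(pi, population_survival(t=5.0, cure_prob=pi, hazard=0.25))
```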

Relevance: 100.00%

Publisher:

Abstract:

A dynamic modelling methodology, which combines on-line variable estimation and parameter identification with physical laws to form an adaptive model for rotary sugar drying processes, is developed in this paper. In contrast to conventional rate-based models using empirical transfer coefficients, the heat and mass transfer rates are estimated by using on-line measurements in the new model. Furthermore, a set of improved sectional solid transport equations with localized parameters is developed in this work to replace the global correlation for the computation of solid retention time. Since a number of key model variables and parameters are identified on-line using measurement data, the model is able to closely track the dynamic behaviour of rotary drying processes within a broad range of operational conditions. This adaptive model is validated against experimental data obtained from a pilot-scale rotary sugar dryer. The proposed modelling methodology can be easily incorporated into nonlinear model-based control schemes to form a unified modelling and control framework.
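
The paper's estimation scheme is not given in the abstract; as a generic stand-in for on-line parameter identification, a recursive least-squares update with a forgetting factor is sketched below. The regressors and measurements are invented, and the single coefficient stands in for a lumped transfer coefficient:

```python
import numpy as np

def rls_step(theta, P, phi, y, lam=0.98):
    """One recursive-least-squares update with forgetting factor `lam`:
    refine the parameter vector `theta` using regressor `phi` and new measurement `y`.
    A generic sketch of on-line coefficient estimation, not the paper's exact scheme."""
    phi = phi.reshape(-1, 1)
    gain = P @ phi / (lam + (phi.T @ P @ phi).item())
    error = y - (phi.T @ theta.reshape(-1, 1)).item()
    theta = theta + (gain * error).flatten()
    P = (P - gain @ phi.T @ P) / lam
    return theta, P

# Hypothetical usage: track one lumped heat-transfer coefficient from drying data
theta, P = np.array([0.0]), np.eye(1) * 100.0
for phi_k, y_k in [(np.array([1.2]), 0.61), (np.array([1.1]), 0.57), (np.array([1.3]), 0.66)]:
    theta, P = rls_step(theta, P, phi_k, y_k)
print(theta)  # estimate moves towards ~0.5
```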

Relevance: 100.00%

Publisher:

Abstract:

Map algebra is a data model and simple functional notation to study the distribution and patterns of spatial phenomena. It uses a uniform representation of space as discrete grids, which are organized into layers. This paper discusses extensions to map algebra to handle neighborhood operations with a new data type called a template. Templates provide general windowing operations on grids to enable spatial models for cellular automata, mathematical morphology, and local spatial statistics. A programming language for map algebra that incorporates templates and special processing constructs is described. The programming language is called MapScript. Example program scripts are presented to perform diverse and interesting neighborhood analyses for descriptive, model-based and process-based analysis.
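
MapScript itself is not shown in the abstract; the following sketch illustrates the template idea in generic terms: a boolean window swept over a grid to produce a focal (neighbourhood) statistic. The template shape and elevation grid are hypothetical:

```python
import numpy as np

def focal_mean(grid, template):
    """Neighbourhood (focal) operation in the spirit of a map-algebra template:
    for each cell, average the values selected by a boolean window centred on it."""
    pad_r, pad_c = template.shape[0] // 2, template.shape[1] // 2
    padded = np.pad(grid, ((pad_r, pad_r), (pad_c, pad_c)), mode="edge")
    out = np.empty_like(grid, dtype=float)
    for i in range(grid.shape[0]):
        for j in range(grid.shape[1]):
            window = padded[i:i + template.shape[0], j:j + template.shape[1]]
            out[i, j] = window[template].mean()
    return out

# A 3x3 cross-shaped template (von Neumann neighbourhood) over a small elevation grid
cross = np.array([[0, 1, 0], [1, 1, 1], [0, 1, 0]], dtype=bool)
elevation = np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9]], dtype=float)
print(focal_mean(elevation, cross))
```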