931 results for model-based reasoning


Relevance: 90.00%

Abstract:

This paper presents a theoretical model for the analysis of decisions regarding farm household labour allocation. The agricultural household model is selected as the most appropriate theoretical framework: a model based on the assumption that households behave so as to maximise utility, which is a function of consumption and leisure, subject to time and budget constraints. The model can be used to describe the role of government subsidies in farm household labour allocation decisions; in particular, the impact of decoupled subsidies on labour allocation can be examined. Decoupled subsidies are a labour-free payment and as such represent an increase in labour-free income or wealth. An increase in wealth allows farm households to work less while maintaining consumption. On the other hand, decoupled subsidies represent a decline in the return to farm labour and may lead to a substitution effect, i.e., farmers may choose to substitute non-farm work for farm work. The theoretical framework proposed in this paper allows these two conflicting effects to be examined.
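A minimal sketch of the utility-maximisation problem described above, in assumed notation (consumption c, farm labour l_f, off-farm labour l_o, time endowment T, farm production function f, off-farm wage w, decoupled payment D):

\[
\max_{c,\; l_f,\; l_o} \; U\bigl(c,\; T - l_f - l_o\bigr)
\quad \text{s.t.} \quad
c \le p\, f(l_f) + w\, l_o + D .
\]

A rise in D relaxes the budget constraint without changing the marginal return to any form of labour (the wealth effect: less work, same consumption), while replacing output-linked support with D lowers the marginal return p f'(l_f) to farm labour relative to the off-farm wage, which drives the substitution of non-farm work for farm work.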

Relevance: 90.00%

Abstract:

Thesis (Master's)--University of Washington, 2016-06

Relevance: 90.00%

Abstract:

In a previous paper, Hoornaert et al. (Powder Technol. 96 (1998) 116-128) presented data from granulation experiments performed in a 50 L Lodige high shear mixer. In this study, that same data set was simulated with a population balance model. Based on an analysis of the experimental data, the granulation process was divided into three separate stages: nucleation, induction, and coalescence growth. These three stages were then simulated separately, with promising results: it is possible to derive a kernel that fits both the induction and the coalescence growth stages. Modeling the nucleation stage proved more challenging due to the complex mechanism of nucleus formation. From this work, some recommendations are made for the improvement of this type of model.
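For context, a coalescence-only population balance of the kind referred to above is commonly written (in assumed notation; the specific kernel fitted in the study is not reproduced here) as

\[
\frac{\partial n(v,t)}{\partial t}
= \frac{1}{2}\int_0^{v} \beta(u,\, v-u)\, n(u,t)\, n(v-u,t)\, du
\;-\; n(v,t)\int_0^{\infty} \beta(u,\, v)\, n(u,t)\, du ,
\]

where n(v,t) is the number density of granules of volume v and the coalescence kernel \beta(u,v) encodes the rate at which granules of volumes u and v combine; "deriving a kernel" for the induction and coalescence growth stages amounts to choosing a functional form for \beta.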

Relevance: 90.00%

Abstract:

We examine the event statistics obtained from two differing simplified models for earthquake faults. The first model is a reproduction of the Block-Slider model of Carlson et al. (1991), a model often employed in seismicity studies. The second model is an elastodynamic fault model based upon the Lattice Solid Model (LSM) of Mora and Place (1994). We performed simulations in which the fault length was varied in each model and generated synthetic catalogs of event sizes and times. From these catalogs, we constructed event size distributions and inter-event time distributions. The larger, localised events in the Block-Slider model displayed the same scaling behaviour as events in the LSM; however, the distribution of inter-event times was markedly different. The analysis of both event size and inter-event time statistics is an effective method for comparative studies of differing simplified models of earthquake faults.
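As an illustration of the kind of catalog analysis described above, the sketch below builds event-size and inter-event-time histograms from a purely synthetic catalog (hypothetical data and parameters, not the catalogs generated by the two fault models):

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic catalog: event times and heavy-tailed event sizes.
times = np.cumsum(rng.exponential(scale=2.0, size=5000))
sizes = rng.pareto(a=1.0, size=5000) + 1.0

# Event size distribution: counts per logarithmic size bin.
size_bins = np.logspace(0.0, np.log10(sizes.max()), 30)
size_counts, _ = np.histogram(sizes, bins=size_bins)

# Inter-event time distribution.
inter_event = np.diff(times)
time_counts, _ = np.histogram(inter_event, bins=50)

# A roughly straight line of log(count) against log(size) indicates
# power-law (Gutenberg-Richter-like) scaling of event sizes.
print(size_counts)
print(time_counts)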

Relevance: 90.00%

Abstract:

Background: Reliable information on causes of death is a fundamental component of health development strategies, yet globally only about one-third of countries have access to such information. For countries currently without adequate mortality reporting systems there are useful models other than resource-intensive population-wide medical certification. Sample-based mortality surveillance is one such approach. This paper provides methods for addressing appropriate sample size considerations in relation to mortality surveillance, with particular reference to situations in which prior information on mortality is lacking. Methods: The feasibility of model-based approaches for predicting the expected mortality structure and cause composition is demonstrated for populations in which only limited empirical data is available. An algorithm approach is then provided to derive the minimum person-years of observation needed to generate robust estimates for the rarest cause of interest in three hypothetical populations, each representing different levels of health development. Results: Modelled life expectancies at birth and cause of death structures were within expected ranges based on published estimates for countries at comparable levels of health development. Total person-years of observation required in each population could be more than halved by limiting the set of age, sex, and cause groups regarded as 'of interest'. Discussion: The methods proposed are consistent with the philosophy of establishing priorities across broad clusters of causes for which the public health response implications are similar. The examples provided illustrate the options available when considering the design of mortality surveillance for population health monitoring purposes.
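The paper's actual algorithm is not reproduced in this listing; the sketch below is a minimal illustration, under an assumed Poisson model of death counts, of how a minimum person-years requirement for the rarest cause of interest can be derived from a target relative standard error:

def min_person_years(rate_per_100k, target_rse=0.20):
    """Person-years needed to estimate a cause-specific death rate.

    For a Poisson death count d observed over PY person-years at rate r,
    RSE(rate) ~ 1/sqrt(d) = 1/sqrt(r * PY), so PY >= 1 / (r * RSE**2).
    """
    rate = rate_per_100k / 100_000.0
    return 1.0 / (rate * target_rse ** 2)

# Example: a rare cause at 5 deaths per 100,000 person-years, estimated to
# within a 20% relative standard error, needs about 500,000 person-years.
print(f"{min_person_years(5.0):,.0f}")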

Relevance: 90.00%

Abstract:

Motivation: The clustering of gene profiles across some experimental conditions of interest contributes significantly to the elucidation of unknown gene function, the validation of gene discoveries and the interpretation of biological processes. However, this clustering problem is not straightforward, as the profiles of the genes are not all independently distributed and the expression levels may have been obtained from an experimental design involving replicated arrays. Ignoring the dependence between the gene profiles and the structure of the replicated data can result in important sources of variability in the experiments being overlooked in the analysis, with the consequent possibility of misleading inferences being made. We propose a random-effects model that provides a unified approach to the clustering of genes with correlated expression levels measured in a wide variety of experimental situations. Our model is an extension of the normal mixture model to account for the correlations between the gene profiles and to enable covariate information to be incorporated into the clustering process. Hence the model is applicable to longitudinal studies with or without replication, for example, time-course experiments by using time as a covariate, and to cross-sectional experiments by using categorical covariates to represent the different experimental classes. Results: We show that our random-effects model can be fitted by maximum likelihood via the EM algorithm, for which the E (expectation) and M (maximization) steps can be implemented in closed form. Hence our model can be fitted deterministically without the need for time-consuming Monte Carlo approximations. The effectiveness of our model-based procedure for the clustering of correlated gene profiles is demonstrated on three real datasets, representing typical microarray experimental designs, covering time-course, repeated-measurement and cross-sectional data. In these examples, relevant clusters of the genes are obtained, which are supported by existing gene-function annotation. A synthetic dataset is also considered.
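One way to write such a random-effects extension of the normal mixture model (the notation here is assumed, not taken verbatim from the paper): conditional on gene j belonging to cluster h,

\[
\mathbf{y}_j = \mathbf{X}\boldsymbol{\beta}_h + \mathbf{U}\mathbf{b}_{hj} + \mathbf{V}\mathbf{c}_h + \boldsymbol{\varepsilon}_j ,
\qquad
\mathbf{b}_{hj} \sim N(\mathbf{0}, \mathbf{D}_h), \quad
\mathbf{c}_h \sim N(\mathbf{0}, \boldsymbol{\Theta}_h), \quad
\boldsymbol{\varepsilon}_j \sim N(\mathbf{0}, \sigma_h^2 \mathbf{I}),
\]

where X carries the covariates (e.g. time or experimental class), b_{hj} are gene-specific random effects, and the cluster-level random effects c_h are shared by all genes in cluster h, which is what induces the correlation between gene profiles. Cluster membership is governed by mixing proportions \pi_1, \dots, \pi_g, and the whole model is fitted by maximum likelihood via the EM algorithm.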

Relevance: 90.00%

Abstract:

The ‘leading coordinate’ approach to computing an approximate reaction pathway, with subsequent determination of the true minimum energy profile, is applied to a two-proton chain transfer model based on the chromophore and its surrounding moieties within the green fluorescent protein (GFP). Using an ab initio quantum chemical method, a number of different relaxed energy profiles are found for several plausible guesses at leading coordinates. The results obtained for different trial leading coordinates are rationalized through the calculation of a two-dimensional relaxed potential energy surface (PES) for the system. Analysis of the 2-D relaxed PES reveals that two of the trial pathways are entirely spurious, while two others contain useful information and can be used to furnish starting points for successful saddle-point searches. Implications for selection of trial leading coordinates in this class of proton chain transfer reactions are discussed, and a simple diagnostic function is proposed for revealing whether or not a relaxed pathway based on a trial leading coordinate is likely to furnish useful information.
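As a toy illustration of the leading-coordinate idea (the potential below is a made-up coupled double-well surface, not the ab initio GFP surface from the paper), a relaxed profile along a trial leading coordinate q1 is obtained by minimising the energy over the remaining coordinate q2 at each fixed q1:

import numpy as np
from scipy.optimize import minimize_scalar

def energy(q1, q2):
    # Illustrative coupled double-well surface (arbitrary units).
    return (q1**2 - 1.0)**2 + (q2**2 - 1.0)**2 + 0.5 * q1 * q2

q1_grid = np.linspace(-1.5, 1.5, 31)
relaxed = []
for q1 in q1_grid:
    res = minimize_scalar(lambda q2: energy(q1, q2), bounds=(-2.0, 2.0), method="bounded")
    relaxed.append(res.fun)

# The relaxed profile approximates a minimum-energy path only if q1 is a good
# leading coordinate; a poor choice can yield a spurious path, e.g. one with a
# discontinuous jump in the relaxed value of q2.
print(min(relaxed), max(relaxed))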

Relevance: 90.00%

Abstract:

Many developing south-east Asian governments are not capturing full rent from domestic forest logging operations. Such rent losses are commonly related to institutional failures, where informal institutions tend to dominate the control of forestry activity in spite of weakly enforced regulations. Our model is an attempt to add a new dimension to thinking about deforestation. We present a simple conceptual model, based on individual decisions rather than social or forest planning, which includes the human dynamics of participation in informal activity and the relatively slower ecological dynamics of changes in forest resources. We demonstrate how incumbent informal logging operations can be persistent, and that any spending aimed at replacing the informal institutions can only be successful if it pushes institutional settings past some threshold.

Relevance: 90.00%

Abstract:

Model transformations are an integral part of model-driven development. Incremental updates are a key execution scenario for transformations in model-based systems, and are especially important for the evolution of such systems. This paper presents a strategy for the incremental maintenance of declarative, rule-based transformation executions. The strategy involves recording dependencies of the transformation execution on information from source models and from the transformation definition. Changes to the source models or the transformation itself can then be directly mapped to their effects on transformation execution, allowing changes to target models to be computed efficiently. This particular approach has many benefits. It supports changes to both source models and transformation definitions, it can be applied to incomplete transformation executions, and a priori knowledge of volatility can be used to further increase the efficiency of change propagation.
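As a rough, hypothetical illustration of the dependency-recording idea (the class name, rule format and one-source-element-to-one-target-element mapping below are assumptions made for the sake of a small runnable example, not the paper's strategy):

from collections import defaultdict

class IncrementalEngine:
    """Toy engine: each rule maps one source element to one target element."""

    def __init__(self, rules):
        self.rules = rules            # rule name -> function(source element) -> target element
        self.deps = defaultdict(set)  # source element id -> {(rule name, source id)}
        self.targets = {}             # (rule name, source id) -> target element

    def run(self, source):
        for rule_name, rule in self.rules.items():
            for elem_id, elem in source.items():
                self._apply(rule_name, rule, elem_id, elem)
        return dict(self.targets)

    def _apply(self, rule_name, rule, elem_id, elem):
        self.targets[(rule_name, elem_id)] = rule(elem)
        self.deps[elem_id].add((rule_name, elem_id))   # record the dependency

    def source_changed(self, source, elem_id):
        # Map a source change directly to the executions that depend on it.
        for rule_name, src_id in list(self.deps.get(elem_id, ())):
            self._apply(rule_name, self.rules[rule_name], src_id, source[src_id])

# Usage: one rule deriving a target record from each source record.
engine = IncrementalEngine({"class2table": lambda e: {"table": e["name"].lower()}})
src = {1: {"name": "Person"}, 2: {"name": "Order"}}
print(engine.run(src))
src[1]["name"] = "Customer"
engine.source_changed(src, 1)                 # only element 1 is re-transformed
print(engine.targets[("class2table", 1)])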

Relevance: 90.00%

Abstract:

Knowledge management (KM) is an emerging discipline (Ives, Torrey & Gordon, 1997) characterised by four processes: generation, codification, transfer, and application (Alavi & Leidner, 2001). Completing the loop, knowledge transfer is regarded as a precursor to knowledge creation (Nonaka & Takeuchi, 1995) and thus forms an essential part of the knowledge management process. Understanding how knowledge is transferred is very important for explaining evolution and change in institutions, organisations, technology, and the economy. However, knowledge transfer is often found to be laborious, time consuming, complicated, and difficult to understand (Huber, 2001; Szulanski, 2000). It has received negligible systematic attention (Huber, 2001; Szulanski, 2000), and thus we know little about it (Huber, 2001). Some literature, such as Davenport and Prusak (1998) and Shariq (1999), has addressed knowledge transfer within an organisation, but studies of inter-organisational knowledge transfer remain much neglected. An emergent view is that organisations may benefit from research that helps them understand, and thus improve, their inter-organisational knowledge transfer process. Therefore, this article provides an overview of inter-organisational knowledge transfer and its related literature and presents a proposed inter-organisational knowledge transfer process model based on theoretical and empirical studies.

Relevance: 90.00%

Abstract:

What does endogenous growth theory tell about regional economies? Empirics of R&D worker-based productivity growth, Regional Studies. Endogenous growth theory emerged in the 1990s as ‘new growth theory’ accounting for technical progress in the growth process. This paper examines the role of research and development (R&D) workers underlying the Romer (1990) model and its subsequent modifications, and compares it with a model based on the accumulation of human capital engaged in R&D. Cross-section estimates of the models against productivity growth of European regions in the 1990s suggest that each R&D worker has a unique set of knowledge, while his/her contributions are enhanced by knowledge sharing within a region as well as spillovers from other regions in proximity.
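For reference, the knowledge-production (R&D) equation of the Romer (1990) model discussed above is

\[
\dot{A} = \delta H_A A ,
\]

where A is the stock of knowledge (ideas), H_A the human capital devoted to R&D, and \delta a productivity parameter, so the growth rate of the knowledge stock is \delta H_A. This is why the size and knowledge of the R&D workforce matter for productivity growth; the paper's own regional specification is not reproduced here.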

Relevance: 90.00%

Abstract:

Feature detection is a crucial stage of visual processing. In previous feature-marking experiments we found that peaks in the 3rd derivative of the luminance profile can signify edges where there are no 1st derivative peaks or 2nd derivative zero-crossings (Wallis and Georgeson). These 'Mach edges' (the edges of Mach bands) were nicely predicted by a new nonlinear model based on 3rd derivative filtering. As a critical test of the model, we now use a new class of stimuli, formed by adding a linear luminance ramp to the blurred triangle waves used previously. The ramp has no effect on the second or higher derivatives, but the nonlinear model predicts a shift from seeing two edges to seeing only one edge as the added ramp gradient increases. In experiment 1, subjects judged whether one or two edges were visible on each trial. In experiment 2, subjects used a cursor to mark perceived edges and bars. The position and polarity of the marked edges were close to model predictions. Both experiments produced the predicted shift from two to one Mach edge, but the shift was less complete than predicted. We conclude that the model is a useful predictor of edge perception, but needs some modification.
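A quick numerical check of the ramp manipulation described above (the blurred triangle-wave profile here is illustrative, not the exact stimulus from the experiments): adding a linear ramp changes only the 1st derivative of the luminance profile, leaving the 2nd and 3rd derivatives untouched.

import numpy as np
from scipy.ndimage import gaussian_filter1d

x = np.linspace(0.0, 4.0, 2001)
dx = x[1] - x[0]
triangle = 2.0 * np.abs((x % 1.0) - 0.5)             # triangle wave
profile = gaussian_filter1d(triangle, sigma=20)       # blurred triangle wave
with_ramp = profile + 0.3 * x                         # add a linear luminance ramp

def derivative(y, order, dx):
    for _ in range(order):
        y = np.gradient(y, dx)
    return y

for n in (1, 2, 3):
    change = np.max(np.abs(derivative(with_ramp, n, dx) - derivative(profile, n, dx)))
    print(f"max change in derivative {n}: {change:.6f}")
# Derivative 1 changes by the ramp gradient (0.3); derivatives 2 and 3 do not.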

Relevance: 90.00%

Abstract:

The preparation and characterisation of collagen:PCL, gelatin:PCL and gelatin/collagen:PCL biocomposites for the manufacture of tissue-engineered skin substitutes are reported. Films of collagen:PCL, gelatin:PCL (1:4, 1:8 and 1:20 w/w) and gelatin/collagen:PCL (1:8 and 1:20 w/w) biocomposites were prepared by impregnation of lyophilised collagen and/or gelatin mats with PCL solutions followed by solvent evaporation. In vitro assays of total protein release from the collagen:PCL and gelatin:PCL biocomposite films revealed the expected inverse relationship between the collagen release rate and the content of synthetic polymer in the biocomposite samples, which may be exploited for the controlled presentation and release of biopharmaceuticals such as growth factors. Good compatibility of all biocomposite groups was demonstrated in vitro by interaction with 3T3 fibroblasts, normal human epidermal keratinocytes (NHEK), primary human epidermal keratinocytes (PHEK) and primary human dermal fibroblasts (PHDF). The 1:20 collagen:PCL materials, which exhibited good cell growth curves and mechanical characteristics, were selected for the engineering of skin substitutes in this work. A tissue-engineered skin model based on single-donor PHEK and PHDF, with a differentiated confluent epidermal layer and a fibrous porous dermal layer, was then developed successfully in vitro, as shown by SEM and immunohistochemistry. A subsequent in vivo study on athymic mice revealed early, complete wound healing within 10 days and good integration of the co-cultured skin substitutes with the adjacent mouse skin structures. Thus the co-cultured skin substitutes based on 1:20 collagen:PCL biocomposite membranes were proven in principle. The approach to skin modelling reported here may find application in wound treatment, gene therapy and the screening of new pharmaceuticals.

Relevance: 90.00%

Abstract:

In this thesis, I review the historical background of Zimbabwe to show the patterns of traditional life that existed prior to settlerism. The form, nature, pace and impact of settlerism and colonialism up to the time of independence are also discussed to show how they affected the health of the population and the pace of development of the country. The political, social and economic underdevelopment of the African people that occurred in Zimbabwe prior to independence was a result of deliberate, politically motivated and controlled policy initiatives. These led to inequitable, inadequate, inappropriate and inaccessible health care provision. It is submitted that since it was politics that determined the pace of underdevelopment, politics must be at the forefront of the development strategy adopted. In the face of the armed conflict that existed in Zimbabwe, existing frameworks of analysis are shown to be inadequate for planning purposes because of their inability to provide indications about the stability of future outcomes. The metagame technique of analysis of options is proposed as a methodology that can be applied in such situations. It rejects deterministic predictive models as misleading and advocates an interactive model based on objective and subjective valuation of human behaviour. In conclusion, the search for stable outcomes, rather than optimal or 'best solution' strategies, is advocated in decision making in organisations of all sizes.