929 results for model-based
Abstract:
Motivation: The clustering of gene profiles across some experimental conditions of interest contributes significantly to the elucidation of unknown gene function, the validation of gene discoveries and the interpretation of biological processes. However, this clustering problem is not straightforward, as the profiles of the genes are not all independently distributed and the expression levels may have been obtained from an experimental design involving replicated arrays. Ignoring the dependence between the gene profiles and the structure of the replicated data can result in important sources of variability in the experiments being overlooked in the analysis, with the consequent possibility of misleading inferences being made. We propose a random-effects model that provides a unified approach to the clustering of genes with correlated expression levels measured in a wide variety of experimental situations. Our model is an extension of the normal mixture model to account for the correlations between the gene profiles and to enable covariate information to be incorporated into the clustering process. Hence the model is applicable to longitudinal studies with or without replication (for example, time-course experiments, by using time as a covariate) and to cross-sectional experiments, by using categorical covariates to represent the different experimental classes. Results: We show that our random-effects model can be fitted by maximum likelihood via the EM algorithm, for which the E (expectation) and M (maximization) steps can be implemented in closed form. Hence our model can be fitted deterministically, without the need for time-consuming Monte Carlo approximations. The effectiveness of our model-based procedure for the clustering of correlated gene profiles is demonstrated on three real datasets, representing typical microarray experimental designs, covering time-course, repeated-measurement and cross-sectional data. In these examples, relevant clusters of genes are obtained, which are supported by existing gene-function annotation. A synthetic dataset is also considered.
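As an illustration of the closed-form E/M alternation described above, here is a minimal sketch in Python of EM for a plain one-dimensional normal mixture; it omits the paper's random-effects extension, correlation structure and covariates, and all names are illustrative.

    import numpy as np

    def em_normal_mixture(x, k, n_iter=100, seed=0):
        # Minimal EM for a 1-D normal mixture; the paper's model extends
        # this with random effects, correlated profiles and covariates.
        x = np.asarray(x, dtype=float)
        rng = np.random.default_rng(seed)
        n = len(x)
        pi = np.full(k, 1.0 / k)                   # mixing proportions
        mu = rng.choice(x, size=k, replace=False)  # component means
        var = np.full(k, np.var(x))                # component variances
        for _ in range(n_iter):
            # E-step: posterior component probabilities (closed form)
            dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
            tau = pi * dens
            tau /= tau.sum(axis=1, keepdims=True)
            # M-step: closed-form updates of the parameters
            nk = tau.sum(axis=0)
            pi = nk / n
            mu = (tau * x[:, None]).sum(axis=0) / nk
            var = (tau * (x[:, None] - mu) ** 2).sum(axis=0) / nk
        return pi, mu, var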
Abstract:
The ‘leading coordinate’ approach to computing an approximate reaction pathway, with subsequent determination of the true minimum energy profile, is applied to a two-proton chain transfer model based on the chromophore and its surrounding moieties within the green fluorescent protein (GFP). Using an ab initio quantum chemical method, a number of different relaxed energy profiles are found for several plausible guesses at leading coordinates. The results obtained for different trial leading coordinates are rationalized through the calculation of a two-dimensional relaxed potential energy surface (PES) for the system. Analysis of the 2-D relaxed PES reveals that two of the trial pathways are entirely spurious, while two others contain useful information and can be used to furnish starting points for successful saddle-point searches. Implications for selection of trial leading coordinates in this class of proton chain transfer reactions are discussed, and a simple diagnostic function is proposed for revealing whether or not a relaxed pathway based on a trial leading coordinate is likely to furnish useful information.
Abstract:
Many developing south-east Asian governments are not capturing full rent from domestic forest logging operations. Such rent losses are commonly related to institutional failures, where informal institutions tend to dominate the control of forestry activity in spite of weakly enforced regulations. Our model is an attempt to add a new dimension to thinking about deforestation. We present a simple conceptual model, based on individual decisions rather than social or forest planning, which includes the human dynamics of participation in informal activity and the relatively slower ecological dynamics of changes in forest resources. We demonstrate how incumbent informal logging operations can be persistent, and that any spending aimed at replacing the informal institutions can only be successful if it pushes institutional settings past some threshold.
Abstract:
Model transformations are an integral part of model-driven development. Incremental updates are a key execution scenario for transformations in model-based systems, and are especially important for the evolution of such systems. This paper presents a strategy for the incremental maintenance of declarative, rule-based transformation executions. The strategy involves recording dependencies of the transformation execution on information from source models and from the transformation definition. Changes to the source models or the transformation itself can then be directly mapped to their effects on transformation execution, allowing changes to target models to be computed efficiently. This particular approach has many benefits. It supports changes to both source models and transformation definitions, it can be applied to incomplete transformation executions, and a priori knowledge of volatility can be used to further increase the efficiency of change propagation.
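A toy sketch of the dependency-recording idea, assuming nothing about the authors' actual engine or API (the class and method names here are hypothetical): each rule records which source elements it read, so a later source change re-executes only the affected rules.

    from collections import defaultdict

    class IncrementalTransform:
        # rules: name -> function(source) returning
        # (target_key, target_value, set_of_source_keys_read)
        def __init__(self, rules):
            self.rules = rules
            self.deps = defaultdict(set)  # source key -> dependent rules
            self.target = {}

        def run(self, source):
            self.source = source
            for name in self.rules:
                self._apply(name)
            return self.target

        def _apply(self, name):
            key, value, read = self.rules[name](self.source)
            self.target[key] = value
            for src_key in read:
                self.deps[src_key].add(name)

        def update_source(self, src_key, value):
            # Map a source change directly to its effects: only the
            # rules recorded as reading src_key are re-executed.
            self.source[src_key] = value
            for name in tuple(self.deps.get(src_key, ())):
                self._apply(name)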
Abstract:
Knowledge management (KM) is an emerging discipline (Ives, Torrey & Gordon, 1997) characterised by four processes: generation, codification, transfer, and application (Alavi & Leidner, 2001). Completing the loop, knowledge transfer is regarded as a precursor to knowledge creation (Nonaka & Takeuchi, 1995) and thus forms an essential part of the knowledge management process. Understanding how knowledge is transferred is important for explaining evolution and change in institutions, organisations, technology, and the economy. However, knowledge transfer is often found to be laborious, time-consuming, complicated, and difficult to understand (Huber, 2001; Szulanski, 2000). It has received negligible systematic attention (Huber, 2001; Szulanski, 2000), and thus we know little about it (Huber, 2001). Some literature, such as Davenport and Prusak (1998) and Shariq (1999), has addressed knowledge transfer within an organisation, but studies on inter-organisational knowledge transfer remain much neglected. An emergent view is that organisations may benefit from research that helps them understand and thereby improve their inter-organisational knowledge transfer processes. This article therefore provides an overview of inter-organisational knowledge transfer and the related literature, and presents a proposed inter-organisational knowledge transfer process model based on theoretical and empirical studies.
Abstract:
What does endogenous growth theory tell about regional economies? Empirics of R&D worker-based productivity growth, Regional Studies. Endogenous growth theory emerged in the 1990s as ‘new growth theory’ accounting for technical progress in the growth process. This paper examines the role of research and development (R&D) workers underlying the Romer model (1990) and its subsequent modifications, and compares it with a model based on the accumulation of human capital engaged in R&D. Cross-section estimates of the models against productivity growth of European regions in the 1990s suggest that each R&D worker has a unique set of knowledge while his/her contributions are enhanced by knowledge sharing within a region as well as spillovers from other regions in proximity.
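For reference, the knowledge-production equation at the heart of the Romer (1990) model, in its standard textbook notation (the paper's own specification and its human-capital variant may differ):

    \dot{A} = \delta H_A A

where A is the stock of ideas (technology), H_A the human capital employed in R&D, and \delta a research productivity parameter.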
Abstract:
Feature detection is a crucial stage of visual processing. In previous feature-marking experiments we found that peaks in the 3rd derivative of the luminance profile can signify edges where there are no 1st-derivative peaks or 2nd-derivative zero-crossings (Wallis and George). 'Mach edges' (the edges of Mach bands) were nicely predicted by a new nonlinear model based on 3rd-derivative filtering. As a critical test of the model, we now use a new class of stimuli, formed by adding a linear luminance ramp to the blurred triangle waves used previously. The ramp has no effect on the second or higher derivatives, but the nonlinear model predicts a shift from seeing two edges to seeing only one edge as the added ramp gradient increases. In experiment 1, subjects judged whether one or two edges were visible on each trial. In experiment 2, subjects used a cursor to mark perceived edges and bars. The position and polarity of the marked edges were close to model predictions. Both experiments produced the predicted shift from two to one Mach edge, but the shift was less complete than predicted. We conclude that the model is a useful predictor of edge perception, but needs some modification.
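A minimal numerical sketch, not the authors' model, of the core operation: locating extrema in the 3rd derivative of a 1-D luminance profile. Since a linear ramp contributes only to the 1st derivative, the peaks found this way are unchanged by the added ramp, which is what makes these stimuli a critical test.

    import numpy as np

    def third_derivative_extrema(luminance, threshold=1e-6):
        # Discrete 3rd derivative of the luminance profile.
        d3 = np.diff(np.asarray(luminance, dtype=float), n=3)
        mag = np.abs(d3)
        # Local maxima of |d3| above threshold mark candidate Mach edges;
        # the sign of d3 at each peak gives the edge polarity.
        peaks = [i for i in range(1, len(d3) - 1)
                 if mag[i] > threshold and mag[i] >= mag[i - 1] and mag[i] >= mag[i + 1]]
        return peaks, d3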
Abstract:
The preparation and characterisation of collagen:PCL, gelatin:PCL and gelatin/collagen:PCL biocomposites for the manufacture of tissue-engineered skin substitutes are reported. Films of collagen:PCL, gelatin:PCL (1:4, 1:8 and 1:20 w/w) and gelatin/collagen:PCL (1:8 and 1:20 w/w) biocomposites were prepared by impregnation of lyophilised collagen and/or gelatin mats with PCL solutions followed by solvent evaporation. In vitro assays of total protein release from the collagen:PCL and gelatin:PCL biocomposite films revealed an expected inverse relationship between the collagen release rate and the content of synthetic polymer in the biocomposite samples, which may be exploited for controlled presentation and release of biopharmaceuticals such as growth factors. Good compatibility of all biocomposite groups was demonstrated by interaction with 3T3 fibroblasts, normal human epidermal keratinocytes (NHEK), primary human epidermal keratinocytes (PHEK) and primary human dermal fibroblasts (PHDF) in vitro. The 1:20 collagen:PCL materials, exhibiting good cell growth curves and mechanical characteristics, were selected for the engineering of skin substitutes in this work. A tissue-engineered skin model based on single-donor PHEK and PHDF, with a differentiated confluent epidermal layer and a fibrous porous dermal layer, was then developed successfully in vitro, as shown by SEM and immunohistochemistry assays. A subsequent in vivo study on athymic mice revealed complete wound healing within 10 days and good integration of the co-cultured skin substitutes with adjacent mouse skin structures. Thus the co-cultured skin substitutes based on 1:20 collagen:PCL biocomposite membranes were proven in principle. The approach to skin modelling reported here may find application in wound treatment, gene therapy and the screening of new pharmaceuticals.
Abstract:
In this thesis, I review the historical background of Zimbabwe to show the patterns of traditional life that existed prior to settlerism. The form, nature, pace and impact of settlerism and colonialism up to the time of independence are also discussed to show how they affected the health of the population and the pace of development of the country. The political, social and economic underdevelopment of the African people that occurred in Zimbabwe prior to independence was a result of deliberate, politically motivated and controlled policy initiatives. These led to inequitable, inadequate, inappropriate and inaccessible health care provision. It is submitted that since it was politics that determined the pace of underdevelopment, politics must be at the forefront of the development strategy adopted. In the face of the armed conflict that existed in Zimbabwe, existing frameworks of analysis are shown to be inadequate for planning purposes because of their inability to provide indications about the stability of future outcomes. The metagame technique of analysis of options is proposed as a methodology that can be applied in such situations. It rejects deterministic predictive models as misleading and advocates an interactive model based on objective and subjective valuation of human behaviour. In conclusion, the search for stable outcomes, rather than optimal or 'best' solution strategies, is advocated in decision making in organisations of all sizes.
Abstract:
Molecular dynamics (MD) has been used to identify the relative distribution of dysprosium in the phosphate glass DyAl0.30P3.05O9.62. The MD model has been compared directly with experimental data obtained from neutron diffraction to enable a detailed comparison beyond the total structure factor level. The MD simulation gives Dy–Dy correlations at 3.80(5) and 6.40(5) angstrom with relative coordination numbers of 0.8(1) and 7.3(5), thus providing evidence of minority rare-earth clustering within these glasses. The nearest-neighbour Dy-O peak occurs at 2.30 angstrom, with each Dy atom having on average 5.8 nearest-neighbour oxygen atoms. The MD simulation is consistent with the phosphate network model based on interlinked PO4 tetrahedra, in which the addition of the network modifier Dy3+ depolymerizes the phosphate network through the breakage of P-(O)-P bonds whilst leaving the tetrahedral units intact. The role of aluminium within the network has been taken into explicit account, and Al is found to be predominantly (78%) tetrahedrally coordinated. In fact, all four Al bonds are found to be to P (via an oxygen atom), with negligible amounts of Al-O-Dy bonds present. This provides an important insight into the role of Al additives in improving the mechanical properties of these glasses.
Abstract:
The paper proposes an ISE (Information goal, Search strategy, Evaluation threshold) user classification model based on Information Foraging Theory for understanding user interaction with content-based image retrieval (CBIR). The proposed model is verified by a multiple linear regression analysis based on 50 users' interaction features collected from a task-based user study of interactive CBIR systems. To the best of our knowledge, this is the first principled user classification model in CBIR verified by a formal and systematic analysis of extensive user interaction data.
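For concreteness, a generic multiple linear regression of the kind used for the verification can be sketched as follows; the feature matrix and outcome here are synthetic stand-ins, since the paper's 50 users' interaction features are not reproduced in the abstract.

    import numpy as np

    rng = np.random.default_rng(0)
    # Synthetic stand-ins: 50 users x 3 interaction features, plus an
    # outcome generated from known coefficients for illustration only.
    X = rng.normal(size=(50, 3))
    y = X @ np.array([0.5, -0.2, 0.8]) + 0.1 * rng.normal(size=50)

    # Ordinary least squares with an intercept term.
    Xd = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    print("intercept and coefficients:", beta)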
Abstract:
Background: Coronary heart disease (CHD) is a public health priority in the UK. The National Service Framework (NSF) has set standards for the prevention, diagnosis and treatment of CHD, which include the use of cholesterol-lowering agents aimed at achieving targets of blood total cholesterol (TC) < 5.0 mmol/L and low-density lipoprotein cholesterol (LDL-C) < 3.0 mmol/L. In order to achieve these targets cost-effectively, prescribers need to make an informed choice from the range of statins available. Aim: To estimate the average and relative cost-effectiveness of atorvastatin, fluvastatin, pravastatin and simvastatin in achieving the NSF LDL-C and TC targets. Design: Model-based economic evaluation. Methods: An economic model was constructed to estimate the number of patients achieving the NSF targets for LDL-C and TC at each dose of statin, and to calculate the average drug cost and incremental drug cost per patient achieving the target levels. The population baseline LDL-C and TC, and drug efficacy and drug costs, were taken from previously published data. Estimates of the distribution of patients receiving each dose of statin were derived from the UK national DIN-LINK database. Results: The estimated annual drug cost per 1000 patients treated was £289 000 with atorvastatin, £315 000 with simvastatin, £333 000 with pravastatin and £167 000 with fluvastatin. The percentages of patients achieving target were 74.4%, 46.4%, 28.4% and 13.2% for atorvastatin, simvastatin, pravastatin and fluvastatin, respectively. Incremental drug costs per extra patient treated to the LDL-C and TC targets, compared with fluvastatin, were £198 and £226 for atorvastatin, £443 and £567 for simvastatin, and £1089 and £2298 for pravastatin, using 2002 drug costs. Conclusions: As a result of its superior efficacy, atorvastatin generates a favourable cost-effectiveness profile as measured by drug cost per patient treated to LDL-C and TC targets. For a given drug budget, more patients would achieve the NSF LDL-C and TC targets with atorvastatin than with any of the other statins examined.
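The incremental figures in the Results can be reproduced from the reported annual costs and success rates; a short sketch using the abstract's rounded numbers (per 1000 patients, LDL-C target, fluvastatin as comparator), so the outputs differ from the quoted £198/£443/£1089 only through rounding of the published inputs:

    # Annual drug cost per 1000 patients and proportion reaching the
    # LDL-C target, as reported in the abstract (rounded figures).
    cost = {"atorvastatin": 289_000, "simvastatin": 315_000,
            "pravastatin": 333_000, "fluvastatin": 167_000}
    to_target = {"atorvastatin": 0.744, "simvastatin": 0.464,
                 "pravastatin": 0.284, "fluvastatin": 0.132}

    base = "fluvastatin"
    for drug in ("atorvastatin", "simvastatin", "pravastatin"):
        extra_cost = cost[drug] - cost[base]
        extra_patients = 1000 * (to_target[drug] - to_target[base])
        print(f"{drug}: about £{extra_cost / extra_patients:.0f} per extra patient to target")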
Abstract:
Let V be an array. The range query problem concerns the design of data structures for implementing the following operations. The operation update(j, x) has the effect v_j ← v_j + x, and the query operation retrieve(i, j) returns the partial sum v_i + ... + v_j. These tasks are to be performed on-line. We define an algebraic model, based on the use of matrices, for the study of the problem. We also establish a lower bound for the sum of the average complexity of both kinds of operations, and demonstrate that this lower bound is near-optimal in terms of asymptotic complexity.
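The operations above are exactly those supported by the classical binary indexed (Fenwick) tree, which realises both update and retrieve in O(log n) time; a standard sketch follows (this is a textbook structure used for illustration, not the paper's matrix-based model):

    class FenwickTree:
        # Classical binary indexed tree over v_1..v_n, supporting
        # update(j, x): v_j <- v_j + x, and retrieve(i, j): v_i + ... + v_j,
        # each in O(log n) time.
        def __init__(self, n):
            self.n = n
            self.tree = [0] * (n + 1)  # 1-indexed internal array

        def update(self, j, x):
            while j <= self.n:
                self.tree[j] += x
                j += j & (-j)  # move to the next covering node

        def _prefix(self, j):
            s = 0
            while j > 0:
                s += self.tree[j]
                j -= j & (-j)  # strip the lowest set bit
            return s

        def retrieve(self, i, j):
            return self._prefix(j) - self._prefix(i - 1)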
Abstract:
The purpose of this paper is to present a methodology for viable-model-based enterprise management, which modern enterprises need in order to survive and grow in the information age. The approach is based on Beer's viable system model and uses it as the basis for information technology implementation and development. The enterprise is viewed as a cybernetic system whose functioning is governed by the same rules as those of any living system.
Abstract:
There have been multifarious approaches to building expert knowledge in the medical and engineering fields, through expert systems, case-based reasoning, model-based reasoning and large-scale knowledge-based systems. The key considerations in these approaches are mainly the choices of reasoning mechanism, ontology, knowledge representation, elicitation and modelling. In our study, we argue that knowledge construction through a hypermedia-based community channel is an effective approach to constructing expert knowledge. We take knowledge to range from the simplest forms, such as stories, to the most complex, such as on-the-job experience. Current approaches to encoding experience require expert knowledge to be acquired and represented as rules, cases or causal models. We differentiate two types of knowledge: content knowledge and socially derivable knowledge. The latter is knowledge gained through social interaction. The Intelligent Conversational Channel is a system that supports the building and sharing of this type of knowledge.