652 results for model library


Relevance: 30.00%

Publisher:

Abstract:

We consider the problem of assessing the number of clusters in a limited number of tissue samples containing gene expressions for possibly several thousands of genes. It is proposed to use a normal mixture model-based approach to the clustering of the tissue samples. One advantage of this approach is that the question of the number of clusters in the data can be formulated as a test of the smallest number of components in the mixture model compatible with the data. This test can be carried out on the basis of the likelihood ratio test statistic, using resampling to assess its null distribution. The effectiveness of this approach is demonstrated on simulated data and on some microarray datasets, as considered previously in the bioinformatics literature. (C) 2004 Elsevier Inc. All rights reserved.
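The resampling test described above can be sketched as a parametric bootstrap of the likelihood ratio statistic. The sketch below uses scikit-learn's `GaussianMixture` as a stand-in for the paper's normal mixture fitting; the function names and settings are illustrative, not the authors' implementation.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def lrt_statistic(X, g):
    """-2 log likelihood ratio for g versus g + 1 normal mixture components."""
    ll_g = GaussianMixture(g, n_init=2, random_state=0).fit(X).score(X) * len(X)
    ll_g1 = GaussianMixture(g + 1, n_init=2, random_state=0).fit(X).score(X) * len(X)
    return -2.0 * (ll_g - ll_g1)

def bootstrap_p_value(X, g, n_boot=19, seed=0):
    """Resampling assessment of the null distribution: fit the g-component
    model, simulate bootstrap samples from it, and recompute the statistic."""
    observed = lrt_statistic(X, g)
    # pass a RandomState instance so repeated .sample() calls advance the stream
    null_model = GaussianMixture(
        g, n_init=2, random_state=np.random.RandomState(seed)).fit(X)
    null_stats = []
    for _ in range(n_boot):
        Xb, _ = null_model.sample(len(X))
        null_stats.append(lrt_statistic(Xb, g))
    # add-one correction so the p-value is never exactly zero
    return (1 + sum(s >= observed for s in null_stats)) / (n_boot + 1)
```

A small p-value rejects the g-component model in favour of at least g + 1 components, which is how the test on the smallest compatible number of components proceeds.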


Mixture models implemented via the expectation-maximization (EM) algorithm are being increasingly used in a wide range of problems in pattern recognition such as image segmentation. However, the EM algorithm requires considerable computational time in its application to huge data sets such as a three-dimensional magnetic resonance (MR) image of over 10 million voxels. Recently, it was shown that a sparse, incremental version of the EM algorithm could improve its rate of convergence. In this paper, we show how this modified EM algorithm can be speeded up further by adopting a multiresolution kd-tree structure in performing the E-step. The proposed algorithm outperforms some other variants of the EM algorithm for segmenting MR images of the human brain. (C) 2004 Pattern Recognition Society. Published by Elsevier Ltd. All rights reserved.
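The speed-up idea, evaluating the E-step once per group of nearby voxels rather than once per voxel, can be illustrated in one dimension. In this sketch a regular histogram grid stands in for the multiresolution kd-tree of the paper; all names and the binning scheme are illustrative.

```python
import numpy as np
from scipy.stats import norm

def binned_e_step(x, means, stds, weights, n_bins=64):
    """Approximate E-step for a 1-D Gaussian mixture: responsibilities are
    evaluated once per bin centre and weighted by bin counts, rather than
    once per data point. A regular grid stands in for the multiresolution
    kd-tree; the grouping principle is the same."""
    counts, edges = np.histogram(x, bins=n_bins)
    centres = 0.5 * (edges[:-1] + edges[1:])
    # component densities at each bin centre, shape (n_bins, n_components)
    dens = weights * norm.pdf(centres[:, None], means, stds)
    resp = dens / dens.sum(axis=1, keepdims=True)
    # sufficient statistics for the M-step, accumulated with bin counts
    Nk = (counts[:, None] * resp).sum(axis=0)
    mu = (counts[:, None] * resp * centres[:, None]).sum(axis=0) / Nk
    return Nk, mu
```

The cost of the E-step becomes proportional to the number of bins (or tree nodes) rather than the number of voxels, which is where the gain comes from on 10-million-voxel MR volumes.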


Mineral processing plants use two main processes: comminution and separation. The objective of the comminution process is to break complex particles consisting of numerous minerals into smaller, simpler particles in which individual particles consist primarily of only one mineral. The process in which the mineral composition distribution in particles changes due to breakage is called 'liberation'. The purpose of separation is to separate particles consisting of valuable mineral from those containing nonvaluable mineral. The energy required to break particles to fine sizes is expensive, and therefore the mineral processing engineer must design the circuit so that the breakage of liberated particles is reduced in favour of breaking composite particles. In order to optimize a circuit effectively through simulation it is necessary to predict how the mineral composition distributions change due to comminution. Such a model is called a 'liberation model for comminution'. It was generally considered that such a model should incorporate information about the ore, such as its texture. However, the relationship between the feed and product particles can be estimated using a probability method, the probability being that a feed particle of a particular composition and size will form a particular product particle of a particular size and composition. The model is based on maximizing the entropy of this probability subject to mass and composition constraints. This methodology allows a liberation model to be developed not only for binary particles but also for particles consisting of many minerals. Results from applying the model to real plant ore are presented. A laboratory ball mill was used to break particles, and the results from this experiment were used to estimate the kernel that represents the relationship between parent and progeny particles.
A second feed, consisting primarily of heavy particles subsampled from the main ore, was then ground through the same mill. The results from the first experiment were used to predict the product of the second experiment. The agreement between the predicted and actual results is very good. It is therefore recommended that more extensive validation be carried out to fully evaluate the method. (C) 2003 Elsevier Ltd. All rights reserved.
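Under entropy maximization with linear constraints, the constrained distribution takes an exponential form, and the Lagrange multiplier can be found numerically. The sketch below shows this for a single size class with one composition-conservation constraint; the discretization and function names are illustrative, not the paper's kernel.

```python
import numpy as np
from scipy.optimize import brentq

def max_entropy_kernel(comp_classes, feed_comp):
    """Maximum-entropy distribution over product composition classes,
    constrained so the mean product composition equals the feed composition
    (mineral conservation). The solution has the exponential-family form
    p_j proportional to exp(lam * c_j)."""
    c = np.asarray(comp_classes, dtype=float)

    def mean_gap(lam):
        # difference between the constrained mean and the target composition
        w = np.exp(lam * c)
        return (w * c).sum() / w.sum() - feed_comp

    lam = brentq(mean_gap, -50.0, 50.0)  # multiplier enforcing the constraint
    p = np.exp(lam * c)
    return p / p.sum()
```

For a feed composition in the interior of the class range the bracketing interval always contains a sign change, since the constrained mean sweeps from the smallest to the largest composition class as the multiplier grows.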


The XXZ Gaudin model with generic integrable boundaries specified by generic non-diagonal K-matrices is studied. The commuting families of Gaudin operators are diagonalized by the algebraic Bethe ansatz method. The eigenvalues and the corresponding Bethe ansatz equations are obtained. (C) 2004 Elsevier B.V. All rights reserved.


Purpose: The purpose of this article is to critically review the literature to examine factors that are most consistently related to employment outcome following traumatic brain injury (TBI), with a particular focus on metacognitive skills. It also aims to develop a conceptual model of factors related to employment outcome. Method: The first stage of the review considered 85 studies published between 1980 and December 2003 which investigated factors associated with employment outcome following TBI. English-language studies were identified through searches of Medline and PsycINFO, as well as manual searches of journals and reference lists. The studies were evaluated and rated by two independent raters (Kappa = 0.835) according to the quality of their methodology based upon nine criteria. Fifty studies met the criteria for inclusion in the second stage of the review, which examined the relationship between a broad range of variables and employment outcome. Results: The factors most consistently associated with employment outcome included pre-injury occupational status, functional status at discharge, global cognitive functioning, perceptual ability, executive functioning, involvement in vocational rehabilitation services and emotional status. Conclusions: A conceptual model is presented which emphasises the importance of metacognitive, emotional and social environment factors for improving employment outcome.


This paper summarises test results that were used to validate a model and scale-up procedure for the high pressure grinding roll (HPGR) which was developed at the JKMRC by Morrell et al. [Morrell, Lim, Tondo, David, 1996. Modelling the high pressure grinding rolls. In: Mining Technology Conference, pp. 169-176.]. Verification of the model is based on results from four data sets that describe the performance of three industrial scale units fitted with both studded and smooth roll surfaces. The industrial units are currently in operation within the diamond mining industry and are represented by De Beers, BHP Billiton and Rio Tinto. Ore samples from the De Beers and BHP Billiton operations were sent to the JKMRC for ore characterisation and HPGR laboratory-scale tests. Rio Tinto contributed an historical data set of tests completed during a previous research project. The results indicate that the modelling of the HPGR process has matured to a point where the model may be used to evaluate new comminution circuits and to optimise existing ones. The model prediction of product size distribution is good and has been found to be strongly dependent on the characteristics of the material being tested. The prediction of throughput and corresponding power draw (based on throughput) is sensitive to inconsistent gap/diameter ratios observed between laboratory-scale tests and full-scale operations. (C) 2004 Elsevier Ltd. All rights reserved.


The present study adds to the sparse published Australian literature on the size effect, the book-to-market (BM) effect and the ability of the Fama-French three-factor model to account for these effects and to improve on the asset pricing ability of the Capital Asset Pricing Model (CAPM). The present study extends the 1981–1991 period examined by Halliwell, Heaney and Sawicki (1999) a further 10 years to 2000 and addresses several limitations and findings of that research. In contrast to Halliwell, Heaney and Sawicki, the current study finds that the three-factor model provides significantly improved explanatory power over the CAPM, and evidence that the BM factor plays a role in asset pricing.
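The three factor regression at the core of such a comparison is a plain OLS fit of portfolio excess returns on the market, size (SMB) and value (HML) factors. A minimal sketch on synthetic data follows; all names and numbers are illustrative, not the study's data.

```python
import numpy as np

def three_factor_loadings(excess_ret, mkt, smb, hml):
    """OLS estimates for the Fama-French three-factor model:
    R_i - R_f = alpha + b*MKT + s*SMB + h*HML + e.
    Returns [alpha, b, s, h]."""
    X = np.column_stack([np.ones_like(mkt), mkt, smb, hml])
    coef, *_ = np.linalg.lstsq(X, excess_ret, rcond=None)
    return coef
```

The CAPM comparison drops the SMB and HML columns; the improvement in explanatory power is then judged by the change in fit and the significance of the extra loadings.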


A simulation-based modelling approach is used to examine the effects of stratified seed dispersal (representing the distribution of the majority of dispersal around the maternal parent and also rare long-distance dispersal) on the genetic structure of maternally inherited genomes and the colonization rate of expanding plant populations. The model is parameterized to approximate postglacial oak colonization in the UK, but is relevant to plant populations that exhibit stratified seed dispersal. The modelling approach considers the colonization of individual plants over a large area (three 500 km x 10 km rolled transects are used to approximate a 500 km x 300 km area). Our approach shows how the interaction of plant population dynamics with stratified dispersal can result in a spatially patchy haplotype structure. We show that while both colonization speeds and the resulting genetic structure are influenced by the characteristics of the dispersal kernel, they are robust to changes in the periodicity of long-distance events, provided the average number of long-distance dispersal events remains constant. We also consider the effects of additional physical and environmental mechanisms on plant colonization. Results show significant changes in genetic structure when the initial colonization of different haplotypes is staggered over time and when a barrier to colonization is introduced. Environmental influences on survivorship and fecundity affect both the genetic structure and the speed of colonization. The importance of these mechanisms in relation to the postglacial spread and genetic structure of oak in the UK is discussed.
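A minimal one-dimensional sketch can illustrate the stratified-dispersal mechanism: a local normal kernel for most seeds plus rare exponential long-distance jumps driving the colonization front. All parameter values below are illustrative, not those of the oak parameterization.

```python
import numpy as np

def simulate_front(generations=50, local_sd=0.5, ld_prob=0.01,
                   ld_scale=20.0, seeds_per_gen=200, seed=0):
    """1-D stratified dispersal sketch: each generation most seeds fall near
    the front (normal kernel) while a small fraction make forward
    long-distance jumps (exponential kernel). Returns the front position
    recorded at each generation."""
    rng = np.random.default_rng(seed)
    front = 0.0
    history = [front]
    for _ in range(generations):
        local = rng.normal(front, local_sd, seeds_per_gen)
        n_ld = rng.binomial(seeds_per_gen, ld_prob)  # rare long-distance events
        ld = front + rng.exponential(ld_scale, n_ld)
        front = max(front, local.max(), ld.max() if n_ld else front)
        history.append(front)
    return np.array(history)
```

Even a handful of long-distance events per generation dominates the advance of the front, which is the qualitative point behind patchy haplotype structure: isolated forward colonies are founded by single long-distance dispersers and then fill in locally.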


Objectives: This pilot study describes a modelling approach to translate group-level changes in health status into changes in preference values, by using the effect size (ES) to summarize group-level improvement. Methods: The ES is the standardized mean difference between treatment groups in standard deviation (SD) units. Vignettes depicting varying severity in SD decrements on the SF-12 mental health summary scale, with corresponding symptom severity profiles, were valued by a convenience sample of general practitioners (n = 42) using the rating scale (RS) and time trade-off methods. Translation factors between ES differences and changes in preference value were developed for five mental disorders, such that ESs from published meta-analyses could be transformed into predicted changes in preference values. Results: An ES difference in health status was associated with an average 0.171-0.204 difference in preference value using the RS, and 0.104-0.158 using the time trade-off. Conclusions: This observed relationship may be particular to the specific versions of the measures employed in the present study. With further development using different raters and preference measures, this approach may expand the evidence base available for modelling preference change for economic analyses from existing data.
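The translation step itself is linear arithmetic: a published effect size is multiplied by a translation factor to give a predicted change in preference value. A minimal sketch using the rating-scale range quoted above (the function name is illustrative):

```python
def preference_change(effect_size, factor_low=0.171, factor_high=0.204):
    """Predicted change in preference value for a given effect size,
    using the rating-scale translation-factor range reported in the study
    (0.171-0.204 per SD unit of health-status change). Returns the
    (low, high) interval of predicted change."""
    return effect_size * factor_low, effect_size * factor_high

# e.g. a meta-analytic ES of 0.5 maps to a 0.0855-0.102 preference change
low, high = preference_change(0.5)
```

The same form applies with the time trade-off range (0.104-0.158) substituted for the rating-scale factors.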


We obtain a diagonal solution of the dual reflection equation for the elliptic A(n-1)((1)) solid-on-solid model. The isomorphism between the solutions of the reflection equation and its dual is studied. (C) 2004 American Institute of Physics.


Consistent with action-based theories of attention, the presence of a nontarget stimulus in the environment has been shown to alter the characteristics of goal-directed movements. Specifically, it has been reported that movement trajectories veer away from (Howard & Tipper, 1997) or towards (Welsh, Elliott, & Weeks, 1999) the location of a nontarget stimulus. The purpose of the experiments reported in this paper was to test a response activation model of selective reaching conceived to account for these variable results. In agreement with the model, the trajectory changes in the movements appear to be determined by the activation levels of each competing response at the moment of response initiation. The results of the present work, as well as those of previous studies, are discussed within the framework of the model of response activation.


The best accepted method for design of autogenous and semi-autogenous (AG/SAG) mills is to carry out pilot scale test work using a 1.8 m diameter by 0.6 m long pilot scale test mill. The load in such a mill typically contains 250,000-450,000 particles larger than 6 mm, allowing correct representation of more than 90% of the charge in Discrete Element Method (DEM) simulations. Most AG/SAG mills use discharge grate slots which are 15 mm or more in width. The mass in each size fraction usually decreases rapidly below grate size. This scale of DEM model is now within the possible range of standard workstations running an efficient DEM code. This paper describes various ways of extracting collision data from the DEM model and translating it into breakage estimates. Account is taken of the different breakage mechanisms (impact and abrasion) and of the specific impact histories of the particles in order to assess the breakage rates for various size fractions in the mills. At some future time, the integration of smoothed particle hydrodynamics with DEM will allow for the inclusion of slurry within the pilot mill simulation. (C) 2004 Elsevier Ltd. All rights reserved.
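One widely used way to turn collision energies into impact breakage estimates is the JKMRC breakage relation t10 = A(1 - exp(-b*Ecs)), where Ecs is the specific comminution energy and t10 is the percentage passing one tenth of the parent size. The abstract does not specify the mapping used, so the sketch below is illustrative, with made-up A and b values; real values are fitted per ore from drop-weight tests.

```python
import numpy as np

def t10_from_collisions(ecs_kwh_per_t, A=50.0, b=1.0):
    """JK-style impact breakage index: t10 = A * (1 - exp(-b * Ecs)),
    with Ecs the specific collision energy in kWh/t. A and b are
    ore-specific parameters; the defaults here are illustrative only."""
    ecs = np.asarray(ecs_kwh_per_t, dtype=float)
    return A * (1.0 - np.exp(-b * ecs))
```

Applied per size fraction to the collision-energy histories extracted from the DEM model, a relation of this shape yields the fraction broken per unit time, i.e. a breakage rate estimate.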


Background and Aims: We have optimized the isolated perfused mouse kidney (IPMK) model for studying renal vascular and tubular function in vitro using 24-28 g C57BL6J mice; the wild type controls for many transgenic mice. Methods and Results: Buffer composition was optimized for bovine serum albumin concentration (BSA). The effect of adding erythrocytes on renal function and morphology was assessed. Autoregulation was investigated during stepped increases in perfusion pressure. Perfusion for 60 min at 90-110 mmHg with Krebs bicarbonate buffer containing 5.5% BSA, and amino acids produced functional parameters within the in vivo range. Erythrocytes increased renal vascular resistance (3.8 ± 0.2 vs 2.4 ± 0.1 mL/min.mmHg, P < 0.05), enhanced sodium reabsorption (FENa = 0.3 ± 0.08 vs 1.5 ± 0.7%, P < 0.05), produced equivalent glomerular filtration rates (GFR; 364 ± 38 vs 400 ± 9 µL/min per gkw) and reduced distal tubular cell injury in the inner stripe (5.8 ± 1.7 vs 23.7 ± 3.1%, P < 0.001) compared to cell free perfusion. The IPMK was responsive to vasoconstrictor (angiotensin II, EC50 100 pM) and vasodilator (methacholine, EC50 75 nM) mediators and showed partial autoregulation of perfusate flow under control conditions over 65-85 mmHg; autoregulatory index (ARI) of 0.66 ± 0.11. Angiotensin II (100 pM) extended this range (to 65-120 mmHg) and enhanced efficiency (ARI 0.21 ± 0.02, P < 0.05). Angiotensin II facilitation was antagonized by methacholine (ARI 0.76 ± 0.08) and papaverine (ARI 0.91 ± 0.13). Conclusion: The IPMK model is useful for studying renal physiology and pathophysiology without systemic neurohormonal influences.


The marginalisation of the teaching and learning of legal research in the Australian law school curriculum is, in the author's experience, a condition common to many law schools. This is reflected in the reluctance of some law teachers to include legal research skills in the substantive law teaching schedule — often the result of unwillingness on the part of law school administrators to provide the resources necessary to ensure that such integration does not place a disproportionately heavy burden of assessment on those who are tempted. However, this may only be one of many reasons for the marginalisation of legal research in the law school experience. Rather than analyse the reasons for this marginalisation, this article deals with what needs to be done to rectify the situation, and to ensure that the teaching of legal research can be integrated into the law school curriculum in a meaningful way. This requires the use of teaching and learning theory that focuses on student-centred learning. This article outlines a model of legal research. It incorporates five transparent stages: analysis, contextualisation, bibliographic skills, interpretation and assessment and application.