932 results for Information dispersal algorithm


Relevance: 20.00%

Abstract:

For many species of marine invertebrates, variability in larval settlement behaviour appears to be the rule rather than the exception. This variability has the potential to affect larval dispersal, because settlement behaviour will influence the length of time larvae spend in the plankton. Despite the ubiquity and importance of this variability, relatively few sources of variation in larval settlement behaviour have been identified. One important factor that can affect larval settlement behaviour is the nutritional state of larvae. Non-feeding larvae often become less discriminating in their 'choice' of settlement substrate, i.e. more desperate to settle, when energetic reserves run low. We tested whether variation in larval size (and presumably in nutritional reserves) also affects the settlement behaviour of 3 species of colonial marine invertebrate larvae: the bryozoans Bugula neritina and Watersipora subtorquata, and the ascidian Diplosoma listerianum. For all 3 species, larger larvae delayed settlement for longer in the absence of settlement cues; settlement of B. neritina larvae was accelerated by the presence of settlement cues, independently of larval size. In the field, larger W. subtorquata larvae also took longer to settle than smaller larvae and were more discriminating towards settlement surfaces. These differences in settlement time are likely to result in differences in the distance that larvae disperse in the field. We suggest that species that produce non-feeding larvae can affect the dispersal potential of their offspring by manipulating larval size and thus larval desperation.

Relevance: 20.00%

Abstract:

An equivalent algorithm is proposed to simulate thermal effects of magma intrusion in geological systems composed of porous rocks. Based on physical and mathematical equivalence, the original magma solidification problem, with a moving boundary between the rock and the intruded magma, is transformed into a new problem without the moving boundary but with a physically equivalent heat source. From the analysis of an ideal solidification model, the physically equivalent heat source has been determined in this paper. The major advantage of the proposed equivalent algorithm is that a fixed finite element mesh with a variable integration time step can be employed to simulate the thermal effect of the intruded magma solidification using the conventional finite element method. The related numerical results have demonstrated the correctness and usefulness of the proposed equivalent algorithm for simulating the thermal effect of intruded magma solidification in geological systems. (C) 2003 Elsevier B.V. All rights reserved.
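The fixed-mesh idea can be illustrated with a standard apparent-heat-capacity treatment, in which the latent heat released by the solidifying magma is folded into an equivalent source term over the freezing interval, so no phase boundary is tracked. The sketch below is a minimal 1D explicit finite-difference version (the paper uses the finite element method); all parameter values, names, and the slab geometry are illustrative, not taken from the paper.

```python
import numpy as np

# Sketch of a fixed-grid "equivalent heat source" treatment of magma
# solidification: latent heat L is folded into an apparent heat capacity
# over the freezing interval [T_sol, T_liq], so no moving boundary is
# tracked. All parameter values are illustrative.
L_latent = 4.0e5                  # J/kg, latent heat of the magma
T_liq, T_sol = 1200.0, 1000.0     # degC, liquidus / solidus
rho, c, k = 2700.0, 1000.0, 2.5   # density, specific heat, conductivity

def apparent_heat_capacity(T):
    """c_eff = c + L/(T_liq - T_sol) inside the freezing interval."""
    c_eff = np.full_like(T, c)
    freezing = (T > T_sol) & (T < T_liq)
    c_eff[freezing] += L_latent / (T_liq - T_sol)
    return c_eff

def solidify(nx=100, length=100.0, t_end=1e7, T_rock=300.0, T_magma=1300.0):
    """Cool an intruded magma slab on a fixed 1D grid."""
    dx = length / (nx - 1)
    T = np.full(nx, T_rock)
    T[nx // 3: 2 * nx // 3] = T_magma    # intruded magma slab in the middle
    t = 0.0
    while t < t_end:
        c_eff = apparent_heat_capacity(T)
        alpha = k / (rho * c_eff)
        dt = 0.4 * dx**2 / alpha.max()   # variable, stability-limited step
        lap = np.zeros_like(T)
        lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
        T = T + dt * alpha * lap         # fixed mesh, explicit update
        t += dt
    return T
```

Because the latent heat only inflates the heat capacity locally, the same fixed mesh and time-stepping loop serve for both the magma and the host rock, which is the practical appeal of the equivalent-source formulation.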

Relevance: 20.00%

Abstract:

The general objective of this work was to study the contribution of ERP systems to the quality of managerial accounting information, through the perception of managers of large Brazilian companies. The initial premise was that we now live in a business reality characterized by a global, competitive scenario in which information about enterprise performance and the evaluation of intangible assets are necessary conditions for the survival of companies. The exploratory research is based on a sample of 37 managers of large Brazilian companies. The analysis of the data, treated by means of a qualitative method, showed that the great majority of the companies in the sample (86%) have an ERP implemented, and that this system is used in combination with other application software. Most managers were also satisfied with the information generated with respect to the dimensions of Time and Content. However, with regard to the qualitative nature of the information, the ERP made some analyses possible when the Balanced Scorecard was adopted, but information able to provide an estimate of the investments made in intangible assets was not obtained. These results suggest that, in these companies, ERP systems are not adequate to support strategic decisions.

Relevance: 20.00%

Abstract:

Managing a variable demand scenario is particularly challenging for service organizations because service companies usually carry a large proportion of fixed costs. The article studies how a services organization manages its demand variability and how that variability relates to the organization's profitability. Moreover, the study searched for alternatives used to reduce the impact of demand variability on the profitability of the company. The research was based on a case study of a Brazilian information technology services provider. The study suggests that alternatives such as using outsourced employees to cover demand peaks may bring benefits only in the short term, reducing the profitability of the company in the long term. Some options are revealed, such as the internalization of employees and investment in developing the company's own workforce.

Relevance: 20.00%

Abstract:

We introduced a spectral clustering algorithm based on the bipartite graph model for the Manufacturing Cell Formation problem in [Oliveira S, Ribeiro JFF, Seok SC. A spectral clustering algorithm for manufacturing cell formation. Computers and Industrial Engineering. 2007 [submitted for publication]]. It constructs two similarity matrices: one for parts and one for machines. The algorithm executes a spectral clustering algorithm on each separately to find families of parts and cells of machines. The similarity measure in that approach utilized only limited information between parts and between machines. This paper reviews several well-known similarity measures which have been used for Group Technology. Computational clustering results are compared by various performance measures. (C) 2008 The Society of Manufacturing Engineers. Published by Elsevier Ltd. All rights reserved.
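As a rough illustration of the machine-side half of such an approach, the sketch below builds a Jaccard similarity matrix from a 0/1 machine-part incidence matrix and bipartitions the machines by the sign of the Fiedler vector of the normalized Laplacian. This is a generic spectral bipartitioning sketch under assumed definitions, not the cited algorithm; the example incidence matrix `M` is invented.

```python
import numpy as np

def jaccard_similarity(M):
    """Pairwise Jaccard similarity between rows of a 0/1 incidence matrix."""
    inter = M @ M.T
    row_sums = M.sum(axis=1)
    union = row_sums[:, None] + row_sums[None, :] - inter
    S = inter / np.maximum(union, 1)
    np.fill_diagonal(S, 0.0)          # no self-similarity
    return S

def spectral_bipartition(M):
    """Split machines into 2 cells by the sign of the Fiedler vector of
    the normalized Laplacian of the machine-similarity graph."""
    S = jaccard_similarity(M.astype(float))
    d = S.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    L = np.eye(len(S)) - D_inv_sqrt @ S @ D_inv_sqrt
    eigvals, eigvecs = np.linalg.eigh(L)   # ascending eigenvalues
    fiedler = eigvecs[:, 1]                # second-smallest eigenvector
    return (fiedler > 0).astype(int)

# Invented example: machines 0-2 mostly process parts 0-2,
# machines 3-4 process parts 3-5, with one overlapping operation.
M = np.array([
    [1, 1, 1, 0, 0, 0],
    [1, 1, 1, 0, 0, 0],
    [1, 1, 1, 1, 0, 0],
    [0, 0, 0, 1, 1, 1],
    [0, 0, 0, 1, 1, 1],
])
```

Running `spectral_bipartition(M)` separates machines {0, 1, 2} from {3, 4}; richer similarity measures of the kind the paper reviews would simply replace `jaccard_similarity`.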

Relevance: 20.00%

Abstract:

A graph clustering algorithm constructs groups of closely related parts and machines separately. After the groups are matched for the fewest intercell moves, a refining process runs on the initial cell formation to further decrease the number of intercell moves. A simple modification of this main approach can deal with some practical constraints, such as the common constraint bounding the maximum number of machines in a cell. Our approach substantially improves computational time; more importantly, it also improves the number of intercell moves when the computational results are compared with the best known solutions from the literature. (C) 2009 Elsevier Ltd. All rights reserved.
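A refining process of this kind can be sketched as a greedy local search: reassign each machine to whichever cell most reduces exceptional elements (operations of a part performed on a machine outside the part's cell, a common proxy for intercell moves), optionally capping machines per cell. This is a generic illustration under assumed definitions, not the paper's actual procedure.

```python
import numpy as np

def exceptional_elements(M, machine_cell, part_cell):
    """Count operations (1-entries of machine-part matrix M) where a
    part's machine lies outside the part's assigned cell."""
    rows, cols = np.nonzero(M)
    return sum(machine_cell[m] != part_cell[p] for m, p in zip(rows, cols))

def refine(M, machine_cell, part_cell, max_machines=None, passes=5):
    """Greedy refinement: move each machine to the cell that most reduces
    exceptional elements, optionally bounding machines per cell."""
    n_cells = max(machine_cell.max(), part_cell.max()) + 1
    machine_cell = machine_cell.copy()
    for _ in range(passes):
        changed = False
        for m in range(M.shape[0]):
            best_c, best_cost = machine_cell[m], None
            for c in range(n_cells):
                if (max_machines is not None and c != machine_cell[m]
                        and np.sum(machine_cell == c) >= max_machines):
                    continue              # cell full: practical constraint
                old = machine_cell[m]
                machine_cell[m] = c       # tentatively move machine m
                cost = exceptional_elements(M, machine_cell, part_cell)
                machine_cell[m] = old
                if best_cost is None or cost < best_cost:
                    best_cost, best_c = cost, c
            if best_c != machine_cell[m]:
                machine_cell[m] = best_c
                changed = True
        if not changed:                   # local optimum reached
            break
    return machine_cell
```

On a small invented instance where one machine starts in the wrong cell, a single pass moves it and leaves only the genuinely exceptional operations behind.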

Relevance: 20.00%

Abstract:

Background: A major goal in the post-genomic era is to identify and characterise disease susceptibility genes and to apply this knowledge to disease prevention and treatment. Rodents and humans have remarkably similar genomes and share closely related biochemical, physiological and pathological pathways. In this work we utilised the latest information on the mouse transcriptome as revealed by the RIKEN FANTOM2 project to identify novel human disease-related candidate genes. We define a new term, patholog, to mean a homolog of a human disease-related gene encoding a product (transcript, anti-sense or protein) potentially relevant to disease. Rather than just focus on Mendelian inheritance, we applied the analysis to all potential pathologs regardless of their inheritance pattern. Results: Bioinformatic analysis and human curation of 60,770 RIKEN full-length mouse cDNA clones produced 2,578 sequences that showed similarity (70-85% identity) to known human disease genes. Using a newly developed biological information extraction and annotation tool (FACTS) in parallel with human expert analysis of 17,051 MEDLINE scientific abstracts, we identified 182 novel potential pathologs. Of these, 36 were identified by computational tools only, 49 by human expert analysis only, and 97 by both methods. These pathologs were related to neoplastic (53%), hereditary (24%), immunological (5%), cardiovascular (4%), or other (14%) disorders. Conclusions: Large-scale genome projects continue to produce a vast amount of data with potential application to the study of human disease. For this potential to be realised we need intelligent strategies for data categorisation and the ability to link sequence data with relevant literature. This paper demonstrates the power of combining human expert annotation with FACTS, a newly developed bioinformatics tool, to identify novel pathologs from within large-scale mouse transcript datasets.

Relevance: 20.00%

Abstract:

A two-component survival mixture model is proposed to analyse a set of ischaemic stroke-specific mortality data. The survival experience of stroke patients after the index stroke may be described by a subpopulation of patients in the acute phase and another subpopulation of patients in the chronic phase. To adjust for the inherent correlation of observations due to random hospital effects, a mixture model of two survival functions with random effects is formulated. Assuming a Weibull hazard in both components, an EM algorithm is developed for the estimation of fixed effect parameters and variance components. A simulation study is conducted to assess the performance of the two-component survival mixture model estimators. Simulation results confirm the applicability of the proposed model in a small sample setting. Copyright (C) 2004 John Wiley & Sons, Ltd.
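A stripped-down version of the estimation idea is sketched below: EM for a two-component Weibull mixture on uncensored survival times, with each M-step performed by weighted numerical maximum likelihood. The censoring and random hospital effects of the actual model are omitted, and all function names and starting values are illustrative assumptions.

```python
import numpy as np
from scipy.stats import weibull_min
from scipy.optimize import minimize

def weighted_weibull_fit(t, w, init=(1.0, 1.0)):
    """Weighted Weibull MLE via numerical optimisation (log-parameterised
    so shape and scale stay positive)."""
    def nll(theta):
        shape, scale = np.exp(theta)
        return -np.sum(w * weibull_min.logpdf(t, shape, scale=scale))
    res = minimize(nll, np.log(init), method="Nelder-Mead")
    return tuple(np.exp(res.x))

def em_weibull_mixture(t, n_iter=30):
    """EM for a two-component Weibull mixture (no censoring, no random
    effects): alternate posterior responsibilities and weighted fits."""
    pi = 0.5
    params = [(1.0, np.quantile(t, 0.25)), (1.0, np.quantile(t, 0.75))]
    loglik = []
    for _ in range(n_iter):
        # E-step: posterior probability of the first (e.g. acute) component
        f1 = weibull_min.pdf(t, params[0][0], scale=params[0][1])
        f2 = weibull_min.pdf(t, params[1][0], scale=params[1][1])
        denom = pi * f1 + (1 - pi) * f2
        r = pi * f1 / denom
        loglik.append(np.sum(np.log(denom)))
        # M-step: update mixing weight and component parameters
        pi = r.mean()
        params[0] = weighted_weibull_fit(t, r, init=params[0])
        params[1] = weighted_weibull_fit(t, 1 - r, init=params[1])
    return pi, params, loglik
```

The full model in the abstract would additionally integrate over hospital-level random effects and handle censored observations inside both EM steps.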

Relevance: 20.00%

Abstract:

On the basis of a spatially distributed sediment budget across a large basin, the costs of achieving given sediment reduction targets in rivers were estimated. A range of investment prioritization scenarios were tested to identify the most cost-effective strategy to control suspended sediment loads. The scenarios were based on successively introducing more information from the sediment budget. The effect of spatial heterogeneity of contributing sediment sources on the cost-effectiveness of prioritization was also investigated. Cost-effectiveness was shown to increase with the sequential introduction of sediment budget terms. The lowest-cost solution was achieved by including spatial information linking sediment sources to the downstream target location; this solution produced cost curves similar to those derived using a genetic algorithm formulation. Appropriate investment prioritization can offer large cost savings because the magnitude of the costs can vary severalfold depending on what type of erosion source or sediment delivery mechanism is targeted. Target setting that considers only erosion source rates can potentially result in spending more money than random management intervention for achieving downstream targets. Coherent spatial patterns of contributing sediment emerge from the budget model and its many inputs, and the heterogeneity in these patterns can be summarized in a succinct form. This summary was shown to be consistent with the cost difference between local and regional prioritization for three of four test catchments. To explain the effect for the fourth catchment, the detail of the individual sediment sources needed to be taken into account.
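The core point, that ranking sources by cost per tonne actually delivered to the target can beat ranking by erosion rate alone, can be shown with a toy greedy prioritization. All source rates, delivery ratios, and costs below are invented for illustration, not taken from the study.

```python
# Illustrative sketch of why spatial (delivery) information matters when
# prioritising erosion-control investment. All numbers are invented.

def remediation_cost(sources, target_reduction, key):
    """Greedily treat sources in the order given by `key` until the
    required reduction in sediment delivered to the target is met."""
    total_cost = achieved = 0.0
    for s in sorted(sources, key=key):
        if achieved >= target_reduction:
            break
        achieved += s["erosion"] * s["delivery"]   # tonnes reaching target
        total_cost += s["cost"]
    return total_cost

sources = [
    {"erosion": 100.0, "delivery": 0.05, "cost": 10.0},  # big but remote
    {"erosion": 40.0,  "delivery": 0.9,  "cost": 12.0},  # well connected
    {"erosion": 30.0,  "delivery": 0.8,  "cost": 6.0},
    {"erosion": 80.0,  "delivery": 0.1,  "cost": 9.0},
]

# Rank by cost per tonne delivered vs by erosion rate alone.
budget_informed = remediation_cost(
    sources, 50.0, key=lambda s: s["cost"] / (s["erosion"] * s["delivery"]))
rate_only = remediation_cost(
    sources, 50.0, key=lambda s: -s["erosion"])
```

With these numbers the delivery-informed ranking meets the 50-tonne target for 18 cost units, while ranking by erosion rate alone treats the large but poorly connected sources first and spends 37, mirroring the paper's finding that source rates alone can misdirect spending.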

Relevance: 20.00%

Abstract:

A Cellular-Automaton Finite-Volume-Method (CAFVM) algorithm has been developed, coupling a macroscopic model for heat transfer calculation with microscopic models for nucleation and growth. The governing equations have been solved to determine the time-dependent constitutional undercooling and interface retardation during solidification. The constitutional undercooling is then coupled into the CAFVM algorithm to investigate the effects of both thermal and constitutional undercooling on columnar growth and crystal selection in the columnar zone, and on the formation of equiaxed crystals in the bulk liquid. The model can not only simulate the microstructures of alloys but also investigate nucleation mechanisms and growth kinetics of alloys solidified with various solute concentrations and solidification morphologies.
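The cellular-automaton half of such a model can be caricatured as follows: liquid cells nucleate new grains at random and may be captured by solid neighbours. This toy sketch prescribes constant nucleation and capture probabilities instead of computing undercooling from a coupled finite-volume heat-transfer solve, so it illustrates only the grain bookkeeping, not the physics; all parameters are invented.

```python
import numpy as np

def ca_solidify(n=50, steps=30, nucleation_p=0.002, capture_p=0.5, seed=0):
    """Toy 2D cellular automaton for nucleation and grain growth on a
    fixed grid. 0 = liquid; positive integers are grain identities."""
    rng = np.random.default_rng(seed)
    grain = np.zeros((n, n), dtype=int)
    next_id = 1
    for _ in range(steps):
        liquid = grain == 0
        # Nucleation: some liquid cells start new grains.
        nucleate = liquid & (rng.random((n, n)) < nucleation_p)
        for i, j in zip(*np.nonzero(nucleate)):
            grain[i, j] = next_id
            next_id += 1
        # Growth: a liquid cell may be captured by a solid neighbour
        # and inherit that neighbour's grain identity.
        new = grain.copy()
        for i, j in zip(*np.nonzero(grain == 0)):
            neighbours = [grain[(i + di) % n, (j + dj) % n]
                          for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))]
            solid = [g for g in neighbours if g > 0]
            if solid and rng.random() < capture_p:
                new[i, j] = rng.choice(solid)
        grain = new
    return grain
```

In the full CAFVM scheme, the nucleation and capture probabilities would instead be driven by the local thermal and constitutional undercooling computed by the finite-volume heat transfer solution on the same grid.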

Relevance: 20.00%

Abstract:

Three experiments examined the hypothesis that people show consistency in motivated social cognitive processing across self-serving domains. Consistent with this hypothesis, Experiment 1 revealed that people who rated a task at which they succeeded as more important than a task at which they failed also cheated on a series of math problems, but only when they could rationalize their cheating as unintentional. Experiment 2 replicated this finding and demonstrated that a self-report measure of self-deception did not predict this rationalized cheating. Experiment 3 replicated Experiments 1 and 2 and ruled out several alternative explanations. These experiments suggest that people who show motivated processing in ego-protective domains also show motivated processing in extrinsic domains. They also introduce a new measurement procedure for differentiating between intentional and rationalized cheating.