50 results for Model selection criteria
Abstract:
Time-lagged responses of biological variables to landscape modifications are widely recognized, but rarely considered in ecological studies. In order to test for the existence of time-lags in the response of trees, small mammals, birds and frogs to changes in fragment area and connectivity, we studied a fragmented and highly dynamic landscape in the Atlantic forest region. We also investigated the biological correlates associated with differential responses among taxonomic groups. Species richness and abundance for four taxonomic groups were measured in 21 secondary forest fragments during the same period (2000-2002), following a standardized protocol. Data analyses were based on power regressions and model selection procedures. The model inputs included present (2000) and past (1962, 1981) fragment areas and connectivity, as well as observed changes in these parameters. Although past landscape structure was particularly relevant for trees, all taxonomic groups (except small mammals) were affected by landscape dynamics, exhibiting a time-lagged response. Furthermore, fragment area was more important for species groups with lower dispersal capacity, while species with higher dispersal ability had stronger responses to connectivity measures. Although these secondary forest fragments still maintain a large fraction of their original biodiversity, the delay in biological response, combined with high rates of deforestation and fast forest regeneration, implies a reduction in the average age of the forest. This also indicates that future species losses are likely, especially of species that are strict forest dwellers. Conservation actions should be implemented to reduce species extinction, to maintain old-growth forests and to favour the regeneration process. Our results demonstrate that landscape history can strongly affect the present distribution pattern of species in fragmented landscapes, and should be considered in conservation planning. (C) 2009 Elsevier Ltd. All rights reserved.
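As a rough illustration of the kind of analysis described above, the sketch below fits power-law (species-area) regressions against present and past fragment area and ranks them by AIC. The data, variable names and the use of statsmodels are assumptions for illustration, not the authors' actual pipeline.

```python
# Minimal sketch (not the authors' code): comparing power-law models
# S = c * A^z fitted with present vs. past fragment area, ranked by AIC.
# Data values and variable names are hypothetical.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
area_2000 = rng.uniform(1, 100, 21)            # present fragment area (ha), hypothetical
area_1962 = area_2000 * rng.uniform(1, 3, 21)  # past fragment area (ha), hypothetical
richness = 5 * area_1962 ** 0.25 * rng.lognormal(0, 0.1, 21)

def fit_power(area, richness):
    """Fit log(S) = log(c) + z*log(A) by OLS and return the fitted model."""
    X = sm.add_constant(np.log(area))
    return sm.OLS(np.log(richness), X).fit()

models = {"present area (2000)": fit_power(area_2000, richness),
          "past area (1962)": fit_power(area_1962, richness)}
for name, m in models.items():
    print(f"{name}: z = {m.params[1]:.2f}, AIC = {m.aic:.1f}")
# The lower-AIC model indicates whether present or past landscape structure
# better explains current species richness (a time-lag signal).
```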
Abstract:
Spiders are considered conservative with regard to their resting metabolic rate, exhibiting the same allometric relation with body mass as the majority of land arthropods. Nevertheless, web-building is thought to have a great impact on energetic metabolism, and any modification that affects this complex behavior is expected to have an impact on the daily energetic budget. We analyzed whether the presence of the cribellum affects the allometric relation between resting metabolic rate and body mass in a cribellate species (Zosis geniculata) and an ecribellate one (Metazygia rogenhoferi), and employed a model selection approach to test whether these species follow the same allometric relationship as other land arthropods. Our results show that M. rogenhoferi has a higher resting metabolic rate, while Z. geniculata fitted the allometric prediction for land arthropods. This indicates that the absence of the cribellum is associated with a higher resting metabolic rate, thus explaining the greater promptness to activity found in the ecribellate species. If our result proves to be a general rule among spiders, the radiation of Araneoidea could be connected to a more energy-consuming lifestyle. Thus, we briefly outline an alternative model of diversification of Araneoidea that accounts for this possibility. (C) 2011 Elsevier Ltd. All rights reserved.
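The sketch below illustrates the model-selection logic of the abstract: fitting log-log allometric regressions of resting metabolic rate on body mass with and without a species term and comparing them by AIC. All data values, the simulated effect size and the statsmodels workflow are hypothetical.

```python
# Minimal sketch (hypothetical data): comparing allometric models
# log(RMR) = log(a) + b*log(mass), with and without a species effect, by AIC.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 30
df = pd.DataFrame({
    "species": ["Z_geniculata"] * n + ["M_rogenhoferi"] * n,
    "mass_mg": np.concatenate([rng.lognormal(2, 0.4, n), rng.lognormal(2, 0.4, n)]),
})
# Hypothetical RMR: same slope, higher intercept for the ecribellate species.
elev = np.where(df["species"] == "M_rogenhoferi", 0.3, 0.0)
df["log_rmr"] = -1.0 + 0.8 * np.log(df["mass_mg"]) + elev + rng.normal(0, 0.1, 2 * n)
df["log_mass"] = np.log(df["mass_mg"])

common = smf.ols("log_rmr ~ log_mass", data=df).fit()
by_species = smf.ols("log_rmr ~ log_mass + C(species)", data=df).fit()
print(f"common fit AIC: {common.aic:.1f}  species-specific AIC: {by_species.aic:.1f}")
# A lower AIC for the species model would support different RMR elevations,
# consistent with a cribellum-related shift in metabolic rate.
```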
Abstract:
1. Analyses of species association have major implications for selecting indicators for freshwater biomonitoring and conservation, because they allow for the elimination of redundant information and focus on taxa that can be easily handled and identified. These analyses are particularly relevant in the debate about using speciose groups (such as the Chironomidae) as indicators in the tropics, because they require difficult and time-consuming analysis, and their responses to environmental gradients, including anthropogenic stressors, are poorly known. 2. Our objective was to show whether chironomid assemblages in Neotropical streams include clear associations of taxa and, if so, how well these associations could be explained by a set of models containing information from different spatial scales. For this, we formulated a priori models that allowed for the influence of local, landscape and spatial factors on chironomid taxon associations (CTA). These models represented biological hypotheses capable of explaining associations between chironomid taxa. For instance, CTA could be best explained by local variables (e.g. pH, conductivity and water temperature) or by processes acting at wider landscape scales (e.g. percentage of forest cover). 3. Biological data were taken from 61 streams in Southeastern Brazil, 47 of which were in well-preserved regions, and 14 of which drained areas severely affected by anthropogenic activities. We adopted a model selection procedure using Akaike's information criterion to determine the most parsimonious models for explaining CTA. 4. Applying Kendall's coefficient of concordance, seven genera (Tanytarsus/Caladomyia, Ablabesmyia, Parametriocnemus, Pentaneura, Nanocladius, Polypedilum and Rheotanytarsus) were identified as associated taxa. The best-supported model explained 42.6% of the total variance in the abundance of associated taxa. This model combined local and landscape environmental filters and spatial variables (which were derived from eigenfunction analysis). However, the model with local filters and spatial variables also had a good chance of being selected as the best model. 5. Standardised partial regression coefficients of local and landscape filters, including spatial variables, derived from model averaging allowed an estimation of which variables were best correlated with the abundance of associated taxa. In general, the abundance of the associated genera tended to be lower in streams characterised by a high percentage of forest cover (landscape scale), lower proportion of muddy substrata and high values of pH and conductivity (local scale). 6. Overall, our main result adds to the increasing number of studies that have indicated the importance of local and landscape variables, as well as the spatial relationships among sampling sites, for explaining aquatic insect community patterns in streams. Furthermore, our findings open new possibilities for the elimination of redundant data in the assessment of anthropogenic impacts on tropical streams.
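The following sketch shows the AIC ranking and Akaike-weight computation that underlies model selection and model averaging of the kind reported above; the candidate model formulas, variable names and data are hypothetical and do not reproduce the authors' analysis.

```python
# Minimal sketch (hypothetical data): ranking candidate models for the
# abundance of associated chironomid taxa with AIC and Akaike weights.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 61
df = pd.DataFrame({
    "abundance": rng.poisson(20, n).astype(float),
    "ph": rng.normal(6.5, 0.5, n),            # local filter
    "conductivity": rng.normal(50, 10, n),    # local filter
    "forest_cover": rng.uniform(0, 100, n),   # landscape filter
    "spatial1": rng.normal(0, 1, n),          # eigenfunction-style spatial variable
})

candidates = {
    "local": "abundance ~ ph + conductivity",
    "landscape": "abundance ~ forest_cover",
    "local+landscape+spatial": "abundance ~ ph + conductivity + forest_cover + spatial1",
}
aics = {name: smf.ols(formula, data=df).fit().aic for name, formula in candidates.items()}
delta = {name: a - min(aics.values()) for name, a in aics.items()}
weights = {name: np.exp(-0.5 * d) for name, d in delta.items()}
total = sum(weights.values())
for name in candidates:
    print(f"{name}: dAIC = {delta[name]:.2f}, Akaike weight = {weights[name] / total:.2f}")
# Akaike weights of this kind underlie the model-averaged coefficients
# described in the abstract.
```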
Abstract:
In this paper, we present an algorithm for cluster analysis that integrates aspects from cluster ensembles and multi-objective clustering. The algorithm is based on a Pareto-based multi-objective genetic algorithm, with a special crossover operator, which uses clustering validation measures as objective functions. The proposed algorithm can deal with data sets presenting different types of clusters, without the need of expertise in cluster analysis. Its result is a concise set of partitions representing alternative trade-offs among the objective functions. We compare the results obtained with our algorithm, in the context of gene expression data sets, to those achieved with Multi-Objective Clustering with automatic K-determination (MOCK), the algorithm most closely related to ours. (C) 2009 Elsevier B.V. All rights reserved.
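A minimal sketch of the multi-objective idea: candidate partitions are scored by two clustering validation measures and only the Pareto-optimal set is kept. The use of k-means to generate candidates and the choice of silhouette and Davies-Bouldin indices are illustrative assumptions, not the genetic algorithm proposed in the paper.

```python
# Minimal sketch (not the authors' algorithm): scoring candidate partitions
# with two clustering validation measures and keeping the Pareto front,
# the core idea behind multi-objective clustering.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score, davies_bouldin_score

X, _ = make_blobs(n_samples=300, centers=4, random_state=0)

candidates = []
for k in range(2, 8):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    # Objectives to minimize: negative silhouette and Davies-Bouldin index.
    candidates.append((k, (-silhouette_score(X, labels), davies_bouldin_score(X, labels))))

def dominated(a, b):
    """True if solution b dominates solution a (no worse in all objectives, better in one)."""
    return all(bb <= aa for aa, bb in zip(a, b)) and any(bb < aa for aa, bb in zip(a, b))

pareto = [(k, obj) for k, obj in candidates
          if not any(dominated(obj, other) for _, other in candidates)]
print("Pareto-optimal partitions (k, objectives):", pareto)
# A genetic algorithm such as the one in the paper evolves partitions toward
# this front instead of enumerating values of k.
```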
Abstract:
The purpose of this paper is to develop a Bayesian approach for log-Birnbaum-Saunders Student-t regression models under right-censored survival data. Markov chain Monte Carlo (MCMC) methods are used to develop a Bayesian procedure for the considered model. In order to attenuate the influence of outlying observations on the parameter estimates, we present Birnbaum-Saunders models in which a Student-t distribution is assumed to explain the cumulative damage. We also discuss model selection criteria for comparing the fitted models, and develop case-deletion influence diagnostics for the joint posterior distribution based on the Kullback-Leibler divergence. The developed procedures are illustrated with a real data set. (C) 2010 Elsevier B.V. All rights reserved.
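The sketch below illustrates the case-deletion diagnostic idea using the standard identity relating the Kullback-Leibler divergence to the conditional predictive ordinate (CPO), estimated from posterior draws. A plain normal likelihood and simulated draws stand in for the log-Birnbaum-Saunders Student-t model; this is not the authors' implementation.

```python
# Minimal sketch: case-deletion influence via Kullback-Leibler divergence
# estimated from MCMC output, using the standard identity
# K(P, P_(-i)) = -log(CPO_i) + E_post[log f(y_i | theta)].
# A simple normal likelihood stands in for the log-Birnbaum-Saunders
# Student-t model of the paper; the posterior draws are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
y = rng.normal(0, 1, 50)
y[0] = 6.0                                   # an artificial outlying observation
theta_draws = rng.normal(y.mean(), y.std() / np.sqrt(len(y)), 2000)  # stand-in posterior draws

def kl_case_deletion(i):
    loglik_i = stats.norm.logpdf(y[i], loc=theta_draws, scale=1.0)   # log f(y_i | theta^(s))
    cpo_i = 1.0 / np.mean(np.exp(-loglik_i))                         # harmonic-mean CPO estimate
    return -np.log(cpo_i) + loglik_i.mean()

kl = np.array([kl_case_deletion(i) for i in range(len(y))])
print("most influential case:", kl.argmax(), "KL =", round(kl.max(), 3))
# Large KL values flag observations whose deletion would most change the posterior.
```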
Abstract:
In the context of either Bayesian or classical sensitivity analyses of over-parametrized models for incomplete categorical data, it is well known that prior dependence of posterior inferences for nonidentifiable parameters, or the choice of over-parametrized models that are too parsimonious, may lead to erroneous conclusions. Nevertheless, some authors either pay no attention to which parameters are nonidentifiable or do not appropriately account for possible prior-dependence. We review the literature on this topic and consider simple examples to emphasize that, in both inferential frameworks, the subjective components can influence results in nontrivial ways, irrespective of the sample size. Specifically, we show that prior distributions commonly regarded as slightly informative or noninformative may actually be too informative for nonidentifiable parameters, and that the choice of over-parametrized models may drastically impact the results, suggesting that a careful examination of their effects should be considered before drawing conclusions.
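A toy numeric example of the prior-dependence issue, assuming a likelihood that identifies only the product of two parameters: the marginal posterior of each factor keeps tracking its prior no matter how large the sample is. The binomial setup and Beta priors are illustrative, not taken from the paper.

```python
# Minimal sketch (toy example, not from the paper): when the likelihood
# depends only on the product theta1*theta2, the marginal posterior of theta1
# tracks its prior, however large the sample is.
import numpy as np
from scipy import stats

# Observed binomial data whose success probability is p = theta1 * theta2.
n, k = 1000, 420

grid = np.linspace(0.001, 0.999, 300)
t1, t2 = np.meshgrid(grid, grid, indexing="ij")

def marginal_posterior_t1(a, b):
    """Posterior of theta1 under independent Beta(a, b) priors on both parameters."""
    prior = stats.beta.pdf(t1, a, b) * stats.beta.pdf(t2, a, b)
    loglik = stats.binom.logpmf(k, n, t1 * t2)
    post = prior * np.exp(loglik - loglik.max())
    post /= post.sum()
    return post.sum(axis=1)          # integrate out theta2

for a, b in [(1, 1), (2, 5), (5, 2)]:   # a "noninformative" and two informative priors
    m = marginal_posterior_t1(a, b)
    print(f"Beta({a},{b}) prior -> posterior mean of theta1: {(m * grid).sum():.3f}")
# The posterior mean of theta1 shifts with the prior even with n = 1000,
# illustrating the prior-dependence discussed in the abstract.
```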
Abstract:
A combination of an extension of the topological instability "lambda criterion" and a thermodynamic criterion was applied to the Al-La system, indicating the best range of compositions for glass formation. Alloy compositions in this range were prepared by melt-spinning and by casting in an arc-melting furnace with a wedge-section copper mold. The glass-forming ability (GFA) of these samples was evaluated by X-ray diffraction, differential scanning calorimetry and scanning electron microscopy. The results indicated that the gamma* parameter of compositions with high GFA is higher, corresponding to a range in which the lambda parameter is greater than 0.1, i.e. to compositions far from the Al solid solution. A new alloy was identified with the best GFA reported so far for this system, showing a maximum thickness of 286 μm in a wedge-section copper mold. Crown Copyright (C) 2009 Published by Elsevier B.V. All rights reserved.
Abstract:
In 2004 the National Household Sample Survey (Pesquisa Nacional por Amostra de Domicílios - PNAD) estimated the prevalence of food and nutrition insecurity in Brazil. However, PNAD data cannot be disaggregated at the municipal level. The objective of this study was to build a statistical model to predict severe food insecurity for Brazilian municipalities based on the PNAD dataset. Exclusion criteria were: incomplete food security data (19.30%); informants younger than 18 years old (0.07%); collective households (0.05%); and households headed by indigenous persons (0.19%). The modeling was carried out in three stages, beginning with the selection of variables related to food insecurity using univariate logistic regression. The variables chosen to construct the municipal estimates were selected from those included in both the PNAD and the 2000 Census. Multivariate logistic regression was then performed, removing non-significant variables, with odds ratios adjusted by multiple logistic regression. The Wald test was applied to check the significance of the coefficients in the logistic equation. The final model included the following variables: per capita income; years of schooling; race and gender of the household head; urban or rural residence; access to public water supply; presence of children; total number of household inhabitants; and state of residence. The adequacy of the model was tested using the Hosmer-Lemeshow test (p=0.561) and the ROC curve (area=0.823). These tests indicated that the model has strong predictive power and can be used to determine household food insecurity in Brazilian municipalities, suggesting that similar predictive models may be useful tools in other Latin American countries.
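The sketch below reproduces the general workflow in miniature: a multiple logistic regression for severe food insecurity followed by an ROC-curve assessment. The simulated data, coefficients and variable names are hypothetical stand-ins for the PNAD/Census variables.

```python
# Minimal sketch (hypothetical data): multiple logistic regression for severe
# food insecurity and assessment of discrimination with the area under the ROC curve.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
n = 5000
df = pd.DataFrame({
    "log_income": rng.normal(6.5, 0.8, n),
    "schooling_years": rng.integers(0, 16, n),
    "urban": rng.integers(0, 2, n),
    "public_water": rng.integers(0, 2, n),
    "children": rng.integers(0, 2, n),
})
logit_p = 6 - 1.0 * df["log_income"] - 0.1 * df["schooling_years"] - 0.3 * df["public_water"]
df["severe_fi"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = smf.logit("severe_fi ~ log_income + schooling_years + urban + public_water + children",
                  data=df).fit(disp=0)
auc = roc_auc_score(df["severe_fi"], model.predict(df))
print(model.params.round(2))
print(f"area under the ROC curve: {auc:.3f}")   # the paper reports 0.823 for the real model
```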
Abstract:
The implementation of confidential contracts between a container liner carrier and its customers, as a result of the Ocean Shipping Reform Act (OSRA) of 1998, demands a revision of the methodology applied in the carrier's marketing and sales planning. The marketing and sales planning process should be more scientific and make better use of operational research tools, since the selection of customers under contract, the duration of the contracts, the freight rates, and the container imbalances of these contracts are basic factors for the carrier's yield. This work aims to develop a decision support system based on a linear programming model to generate the business plan for a container liner carrier, maximizing the contribution margin of its freight.
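A minimal linear-programming sketch in the spirit of the described decision support system: choose contract volumes that maximize contribution margin subject to slot capacity and a container-imbalance limit. All coefficients, bounds and constraint choices are hypothetical.

```python
# Minimal sketch (hypothetical numbers): a small linear program choosing
# contract volumes (TEU) to maximize contribution margin subject to slot
# capacity and a container imbalance limit. scipy's linprog minimizes,
# so margins enter with a minus sign.
from scipy.optimize import linprog

margin = [120.0, 95.0, 60.0]        # contribution margin per TEU for three candidate contracts
imbalance = [1.0, -0.5, -0.8]       # net empty-container imbalance per TEU (positive = surplus)
capacity = 4000                      # available slots on the trade lane (TEU)

res = linprog(
    c=[-m for m in margin],                                   # maximize total margin
    A_ub=[[1, 1, 1], imbalance, [-i for i in imbalance]],     # capacity and |imbalance| <= 500
    b_ub=[capacity, 500, 500],
    bounds=[(0, 2500), (0, 2000), (0, 1500)],                 # contractual volume limits
    method="highs",
)
print("volumes (TEU):", [round(v) for v in res.x])
print("total contribution margin:", round(-res.fun))
```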
Abstract:
Background: The criteria and timing for nerve surgery in infants with obstetric brachial plexopathy remain controversial. Our aim was to develop a new method for early prognostic assessment to assist this decision process. Methods: Fifty-four patients with unilateral obstetric brachial plexopathy who were ten to sixty days old underwent bilateral motor-nerve-conduction studies of the axillary, musculocutaneous, proximal radial, distal radial, median, and ulnar nerves. The ratio between the amplitude of the compound muscle action potential of the affected limb and that of the healthy side was called the axonal viability index. The patients were followed and classified into three groups according to the clinical outcome. We analyzed the receiver operating characteristic curve of each index to define the best cutoff point to detect patients with a poor recovery. Results: The best cutoff points on the axonal viability index for each nerve (and its sensitivity and specificity) were <10% (88% and 89%, respectively) for the axillary nerve, 0% (88% and 73%) for the musculocutaneous nerve, <20% (82% and 97%) for the proximal radial nerve, <50% (82% and 97%) for the distal radial nerve, and <50% (59% and 97%) for the ulnar nerve. The indices from the proximal radial, distal radial, and ulnar nerves had better specificities compared with the most frequently used clinical criterion: absence of biceps function at three months of age. Conclusions: The axonal viability index yields an earlier and more specific prognostic estimation of obstetric brachial plexopathy than does the clinical criterion of biceps function, and we believe it may be useful in determining surgical indications in these patients.
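The index itself is simple to compute; the sketch below forms the ratio of affected-side to healthy-side CMAP amplitudes and applies the cutoff points reported above. The amplitude values are hypothetical.

```python
# Minimal sketch: computing the axonal viability index (affected-side CMAP
# amplitude as a percentage of the healthy side) and applying the cutoff
# points reported in the abstract. The amplitudes below are hypothetical.
cutoffs = {            # index at or below these values suggests poor recovery
    "axillary": 10, "musculocutaneous": 0, "proximal_radial": 20,
    "distal_radial": 50, "ulnar": 50,
}

cmap_affected = {"axillary": 0.3, "musculocutaneous": 0.0, "proximal_radial": 0.8,
                 "distal_radial": 1.5, "ulnar": 2.0}      # mV, hypothetical
cmap_healthy = {"axillary": 4.0, "musculocutaneous": 3.5, "proximal_radial": 5.0,
                "distal_radial": 4.5, "ulnar": 5.5}       # mV, hypothetical

for nerve, cut in cutoffs.items():
    index = 100 * cmap_affected[nerve] / cmap_healthy[nerve]
    poor = index < cut or (cut == 0 and index == 0)       # the musculocutaneous cutoff is 0%
    print(f"{nerve}: index = {index:.0f}% -> {'poor-recovery range' if poor else 'above cutoff'}")
```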
Abstract:
Background: Considering the broad variation in the expression of housekeeping genes among tissues and experimental situations, studies using quantitative RT-PCR require strict definition of adequate endogenous controls. For glioblastoma, the most common type of tumor in the central nervous system, there was no previous report regarding this issue. Results: Here we show that, amongst seven frequently used housekeeping genes, TBP and HPRT1 are adequate references for glioblastoma gene expression analysis. Evaluation of the expression levels of 12 target genes utilizing different endogenous controls revealed that the normalization method applied might introduce errors in the estimation of relative quantities. Genes presenting expression levels which do not significantly differ between tumor and normal tissues can be considered either increased or decreased if unsuitable reference genes are applied. Most importantly, genes showing significant differences in expression levels between tumor and normal tissues can be missed. We also demonstrated that the Holliday Junction Recognizing Protein, a novel DNA repair protein overexpressed in lung cancer, is extremely overexpressed in glioblastoma, with a median change of about 134-fold. Conclusion: Altogether, our data show the relevance of prior validation of candidate control genes for each experimental model and indicate TBP plus HPRT1 as suitable references for studies on glioblastoma gene expression.
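The sketch below illustrates why reference-gene choice matters, using the standard 2^-(delta delta Ct) relative quantification: a reference whose own expression shifts between tumor and normal tissue distorts the fold change. The Ct values are hypothetical.

```python
# Minimal sketch (hypothetical Ct values): how the choice of reference gene
# changes the estimated fold change in the 2^-(delta delta Ct) method.

# Mean qRT-PCR Ct values for one target gene and two candidate references.
ct = {
    "tumor":  {"target": 24.0, "TBP": 28.0, "unstable_ref": 26.0},
    "normal": {"target": 31.0, "TBP": 28.2, "unstable_ref": 29.5},
}

def fold_change(reference):
    d_tumor = ct["tumor"]["target"] - ct["tumor"][reference]
    d_normal = ct["normal"]["target"] - ct["normal"][reference]
    return 2.0 ** (-(d_tumor - d_normal))

for ref in ("TBP", "unstable_ref"):
    print(f"reference {ref}: fold change = {fold_change(ref):.1f}")
# A reference whose own expression differs between tumor and normal tissue
# (here 'unstable_ref') distorts the estimate, as the abstract cautions.
```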
Abstract:
Background: Plasmodium vivax malaria is a major public health challenge in Latin America, Asia and Oceania, with 130-435 million clinical cases per year worldwide. Invasion of host blood cells by P. vivax mainly depends on a type I membrane protein called Duffy binding protein (PvDBP). The erythrocyte-binding motif of PvDBP is a 170 amino-acid stretch located in its cysteine-rich region II (PvDBP(II)), which is the most variable segment of the protein. Methods: To test whether diversifying natural selection has shaped the nucleotide diversity of PvDBP(II) in Brazilian populations, this region was sequenced in 122 isolates from six different geographic areas. A Bayesian method was applied to test for the action of natural selection under a population genetic model that incorporates recombination. The analysis was integrated with a structural model of PvDBP(II), and T- and B-cell epitopes were localized on the 3-D structure. Results: The results suggest that: (i) recombination plays an important role in determining the haplotype structure of PvDBP(II), and (ii) PvDBP(II) appears to contain neutrally evolving codons as well as codons evolving under natural selection. Diversifying selection preferentially acts on sites identified as epitopes, particularly on amino acid residues 417, 419, and 424, which show strong linkage disequilibrium. Conclusions: This study shows that some polymorphisms of PvDBP(II) are present near the erythrocyte-binding domain and might serve to elude antibodies that inhibit cell invasion. Therefore, these polymorphisms should be taken into account when designing vaccines aimed at eliciting antibodies to inhibit erythrocyte invasion.
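As a small illustration of the sequence summaries underlying such analyses, the sketch below computes nucleotide diversity (pi) as the average number of pairwise differences per site from a toy alignment; it is not the Bayesian selection test used in the study.

```python
# Minimal sketch: nucleotide diversity (pi) as the average number of pairwise
# differences per site, a basic summary of the kind of sequence variation
# analysed in the abstract. The short aligned haplotypes are hypothetical.
from itertools import combinations

haplotypes = [
    "ATGCGTACGT",
    "ATGCGTACCT",
    "ATGAGTACGT",
    "ATGCGTTCGT",
]

def pairwise_diff(a, b):
    return sum(x != y for x, y in zip(a, b))

pairs = list(combinations(haplotypes, 2))
pi = sum(pairwise_diff(a, b) for a, b in pairs) / (len(pairs) * len(haplotypes[0]))
print(f"nucleotide diversity pi = {pi:.3f}")
# Tests for diversifying selection (e.g., contrasting synonymous and
# non-synonymous variation) build on summaries of this kind.
```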
Abstract:
Background: Feature selection is a pattern recognition approach to choose important variables according to some criteria in order to distinguish or explain certain phenomena (i.e., for dimensionality reduction). There are many genomic and proteomic applications that rely on feature selection to answer questions such as selecting signature genes which are informative about some biological state, e.g., normal tissues and several types of cancer, or inferring a prediction network among elements such as genes, proteins and external stimuli. In these applications, a recurrent problem is the lack of samples to perform an adequate estimate of the joint probabilities between element states. A myriad of feature selection algorithms and criterion functions have been proposed, although it is difficult to point to the best solution for each application. Results: The intent of this work is to provide an open-source multiplatform graphical environment for bioinformatics problems, which supports many feature selection algorithms, criterion functions and graphic visualization tools such as scatterplots, parallel coordinates and graphs. A feature selection approach for growing genetic networks from seed genes (targets or predictors) is also implemented in the system. Conclusion: The proposed feature selection environment allows data analysis using several algorithms, criterion functions and graphic visualization tools. Our experiments have shown the software's effectiveness in two distinct types of biological problems. In addition, the environment can be used in different pattern recognition applications, although its main focus is on bioinformatics tasks.
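The sketch below shows the generic pattern such an environment wraps: a feature selection algorithm (greedy forward search) driven by a pluggable criterion function (here cross-validated accuracy of a simple classifier). The dataset, classifier and criterion are illustrative choices, not the software's defaults.

```python
# Minimal sketch (not the described software): greedy forward feature selection
# driven by a plug-in criterion function, the general scheme that feature
# selection environments expose with many algorithms and criteria.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=200, n_features=20, n_informative=4, random_state=0)

def criterion(feature_idx):
    """Criterion function: mean cross-validated accuracy of a simple classifier."""
    clf = LogisticRegression(max_iter=1000)
    return cross_val_score(clf, X[:, feature_idx], y, cv=5).mean()

selected, remaining = [], list(range(X.shape[1]))
for _ in range(4):                       # grow a signature of four features
    scores = {f: criterion(selected + [f]) for f in remaining}
    best = max(scores, key=scores.get)
    selected.append(best)
    remaining.remove(best)
    print(f"added feature {best}, criterion = {scores[best]:.3f}")
```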
Abstract:
In this work we study an agent-based model to investigate the role of the degree of asymmetric information in market evolution. The model is quite simple and may be treated analytically, since each consumer evaluates the quality of a given good taking into account only the quality of the last good purchased plus her perceptive capacity beta. As a consequence, the system evolves according to a stationary Markov chain. The value of a good offered by the firms increases with quality according to an exponent alpha, which is a measure of the technology. It incorporates all the technological capacity of the production system, such as education, scientific development and techniques that change productivity rates. The technological level plays an important role in explaining how the asymmetry of information may affect market evolution in this model. We observe that, for high technological levels, the market can detect adverse selection. The model allows us to compute the maximum degree of asymmetric information before the market collapses. Below this critical point the market evolves during a limited period of time and then dies out completely. When beta is closer to 1 (symmetric information), the market becomes more profitable for high-quality goods, although high- and low-quality markets coexist. The maximum asymmetric information level is a consequence of an ergodicity breakdown in the process of quality evaluation. (C) 2011 Elsevier B.V. All rights reserved.
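A schematic toy simulation of the mechanism described above, in which a consumer judges goods through the last purchase weighted by the perceptive capacity beta while prices grow with quality through the exponent alpha. The functional forms and parameter values are invented for illustration and do not reproduce the authors' model.

```python
# Schematic toy simulation inspired by the abstract (not the authors' exact
# model): consumers judge a good only by the last good purchased plus a
# perceptive capacity beta, while the price of a good grows with quality
# through an exponent alpha. All numerical choices are illustrative.
import numpy as np

def simulate(beta, alpha, steps=5000):
    qualities = np.array([0.3, 0.9])             # low- and high-quality goods
    prices = 0.6 * qualities ** alpha            # value offered rises with quality via alpha
    last_quality, high_sales, purchases = 0.5, 0, 0
    for _ in range(steps):
        # Perceived quality mixes the true quality with memory of the last purchase.
        perceived = beta * qualities + (1 - beta) * last_quality
        surplus = perceived - prices
        choice = int(np.argmax(surplus))
        if surplus[choice] <= 0:                 # no good is worth buying: the market dies out
            break
        last_quality = qualities[choice]
        high_sales += choice
        purchases += 1
    return high_sales / purchases if purchases else 0.0

for beta in (0.2, 0.6, 0.95):
    print(f"beta = {beta}: high-quality market share ~ {simulate(beta, alpha=0.8):.2f}")
```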
Abstract:
Model predictive control (MPC) is usually implemented as a control strategy in which the system outputs are controlled within specified zones instead of at fixed set points. One strategy to implement zone control is the selection of different weights for the output error in the control cost function. A disadvantage of this approach is that closed-loop stability cannot be guaranteed, as a different linear controller may be activated at each time step. A way to implement stable zone control is to use an infinite-horizon cost in which the set point is an additional variable of the control problem. In this case, the set point is restricted to remain inside the output zone, and an appropriate output slack variable is included in the optimisation problem to assure the recursive feasibility of the control optimisation problem. Following this approach, a robust MPC is developed for the case of multi-model uncertainty of open-loop stable systems. The controller is devoted to maintaining the outputs within their corresponding feasible zones while reaching the desired optimal input target. Simulation of a process from the oil refining industry illustrates the performance of the proposed strategy.
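The sketch below writes one step of a zone-control MPC in the spirit described above, with the set point as an extra decision variable constrained to the output zone and a slack variable for feasibility. The SISO model, horizon, weights and the use of cvxpy are assumptions for illustration, not the paper's controller.

```python
# Minimal sketch (illustrative, not the paper's controller): a zone-control MPC
# step for a stable SISO model y[k+1] = a*y[k] + b*u[k]. The set point is an
# extra decision variable restricted to the output zone, and a slack variable
# keeps the problem feasible.
import cvxpy as cp

a, b = 0.9, 0.5                      # hypothetical open-loop stable model
y0, u_target = 3.0, 0.2              # current output and desired input target
y_min, y_max, N = -1.0, 1.0, 10      # output zone and prediction horizon

u = cp.Variable(N)
y = cp.Variable(N + 1)
ysp = cp.Variable()                  # set point, an additional decision variable
slack = cp.Variable(nonneg=True)     # output slack for recursive feasibility

constraints = [y[0] == y0, ysp >= y_min, ysp <= y_max]
for k in range(N):
    constraints += [y[k + 1] == a * y[k] + b * u[k], cp.abs(u[k]) <= 1.0]
constraints += [y[N] >= y_min - slack, y[N] <= y_max + slack]

cost = cp.sum_squares(y[1:] - ysp) + 0.1 * cp.sum_squares(u - u_target) + 100 * slack ** 2
prob = cp.Problem(cp.Minimize(cost), constraints)
prob.solve()
print("first control move:", round(float(u.value[0]), 3),
      "chosen set point:", round(float(ysp.value), 3))
```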