853 results for heterogeneous regressions algorithms
Abstract:
The objectives of the present study were to determine if variance components of calving intervals varied with age at calving and if considering calving intervals as a longitudinal trait would be a useful approach for fertility analysis of Zebu dairy herds. With these purposes, calving records from females born from 1940 to 2006 in a Guzerat dairy subpopulation in Brazil were analyzed. The fixed effects of contemporary groups, formed by year and farm at birth or at calving, and the regressions of age at calving, equivalent inbreeding coefficient and day of the year on the studied traits were considered in the statistical models. In one approach, calving intervals (CI) were analyzed as a single trait, by fitting a statistical model in which both animal and permanent environment effects were adjusted for the effect of age at calving by random regression. In a second approach, a four-trait analysis was conducted, including age at first calving (AFC) and three different female categories for the calving intervals: first calving females; young females (less than 80 months old, but not first calving); or mature females (80 months old or more). Finally, a two-trait analysis was performed, also including AFC and CI, but calving intervals were regarded as a single trait in a repeatability model. Additionally, the ranking of sires was compared among approaches. Calving intervals decreased with age until females were about 80 months old, remaining nearly constant after that age. A quasi-linear increase of 11.5 days in the calving intervals was observed for each 10% increase in the female's equivalent inbreeding coefficient. The heritability of AFC was 0.37. For CI, the genetic-phenotypic variance ratios ranged from 0.064 to 0.141, depending on the approach and on ages at calving. Differences among genetic variance components for calving intervals were observed along the animal's lifetime. Those differences confirmed the longitudinal aspect of that trait, indicating the importance of such consideration when assessing fertility of Zebu dairy females, especially in situations where the available information relies on their calving intervals. Spearman rank correlations among approaches ranged from 0.90 to 0.95, and changes observed in the ranking of sires suggested that the genetic progress of the population could be affected by the approach chosen for the analysis of calving intervals. (C) 2012 Elsevier B.V. All rights reserved.
Abstract:
Assessment of the suitability of anthropogenic landscapes for wildlife species is crucial for setting priorities for biodiversity conservation. This study aimed to analyse the environmental suitability of a highly fragmented region of the Brazilian Atlantic Forest, one of the world's 25 recognized biodiversity hotspots, for forest bird species. Eight forest bird species were selected for the analyses, based on point counts (n = 122) conducted in April-September 2006 and January-March 2009. Six additional variables (landscape diversity, distance from forest and streams, aspect, elevation and slope) were modelled in Maxent for (1) actual and (2) simulated land cover, based on the forest expansion required by existing Brazilian forest legislation. Models were evaluated by bootstrap or jackknife methods and their performance was assessed by AUC, omission error, binomial probability or p value. All predictive models were statistically significant, with high AUC values and low omission errors. A small proportion of the actual landscape (24.41 +/- 6.31%) was suitable for forest bird species. The simulated landscapes led to an increase of c. 30% in total suitable areas. On average, models predicted a small increase (23.69 +/- 6.95%) in the area of suitable native forest for bird species. Proximity to forest increased the environmental suitability of landscapes for all bird species; landscape diversity was also a significant factor for some species. In conclusion, this study demonstrates that species distribution modelling (SDM) successfully predicted bird distribution across a heterogeneous landscape at fine spatial resolution, as all models were biologically relevant and statistically significant. The use of landscape variables as predictors contributed significantly to the results, particularly for species distributions over small extents and at fine scales. This is the first study to evaluate the environmental suitability of the remaining Brazilian Atlantic Forest for bird species in an agricultural landscape, and provides important additional data for regional environmental planning.
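As an illustration of the kind of model evaluation reported above (AUC and omission error), the following Python sketch computes both metrics for hypothetical suitability scores at withheld presence and background points; the scores, the 10th-percentile threshold rule and all values are assumptions for illustration, not the paper's Maxent configuration.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Hypothetical SDM outputs: suitability scores at withheld presence points
# and at background points (values and distributions are assumptions).
rng = np.random.default_rng(0)
pres_scores = rng.beta(5, 2, 60)        # suitability at withheld presences
back_scores = rng.beta(2, 5, 500)       # suitability at background points

# AUC: probability that a presence point outranks a background point.
y_true = np.concatenate([np.ones_like(pres_scores), np.zeros_like(back_scores)])
y_score = np.concatenate([pres_scores, back_scores])
auc = roc_auc_score(y_true, y_score)

# Omission error under an illustrative 10th-percentile presence threshold.
threshold = np.percentile(pres_scores, 10)
omission = np.mean(pres_scores < threshold)

print(f"AUC = {auc:.3f}, omission error = {omission:.3f}")
```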
Abstract:
Several variants of the widely used Fuzzy C-Means (FCM) algorithm support clustering of data distributed across different sites. These methods have been studied under different names, such as collaborative and parallel fuzzy clustering. In this study, we augment two FCM-based algorithms used to cluster distributed data by deriving constructive ways of determining their essential parameters (including the number of clusters) and by forming a set of systematically structured guidelines, such as selecting the specific algorithm according to the nature of the data environment and the assumptions made about the number of clusters. A thorough complexity analysis, covering space, time, and communication aspects, is reported. A series of detailed numerical experiments illustrates the main ideas discussed in the study.
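For reference, the sketch below shows the standard single-site FCM iteration on which such collaborative and parallel variants build; the function name, defaults (m = 2) and toy data are illustrative assumptions, not details taken from the study.

```python
import numpy as np

def fuzzy_c_means(X, n_clusters, m=2.0, max_iter=100, tol=1e-5, seed=0):
    """Standard single-site FCM iteration (illustrative parameter names)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    # Random fuzzy partition matrix; each row sums to 1.
    U = rng.random((n, n_clusters))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(max_iter):
        Um = U ** m
        # Prototype (cluster centre) update.
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        # Distances from every point to every prototype (epsilon avoids 0/0).
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        # Membership update: u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1)).
        ratio = (d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0))
        U_new = 1.0 / ratio.sum(axis=2)
        if np.max(np.abs(U_new - U)) < tol:
            return centers, U_new
        U = U_new
    return centers, U

# Toy usage on two Gaussian blobs.
X = np.vstack([np.random.default_rng(1).normal(0, 1, (100, 2)),
               np.random.default_rng(2).normal(5, 1, (100, 2))])
centers, U = fuzzy_c_means(X, n_clusters=2)
print(centers)
```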
Abstract:
This paper presents a survey of evolutionary algorithms designed for decision-tree induction. In this context, most of the paper focuses on approaches that evolve decision trees as an alternative heuristic to the traditional top-down divide-and-conquer approach. Additionally, we present some alternative methods that make use of evolutionary algorithms to improve particular components of decision-tree classifiers. The paper's original contributions are the following. First, it provides an up-to-date overview that is fully focused on evolutionary algorithms and decision trees and does not concentrate on any specific evolutionary approach. Second, it provides a taxonomy that addresses both works that evolve decision trees and works that design decision-tree components through the use of evolutionary algorithms. Finally, a number of references are provided that describe applications of evolutionary algorithms for decision-tree induction in different domains. At the end of this paper, we address some important issues and open questions that can be the subject of future research.
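To make the surveyed idea concrete, here is a schematic sketch of evolving axis-aligned decision trees with a simple (mu + lambda) loop; the tree representation, mutation operator, fitness (training accuracy) and toy data are illustrative choices, not any specific algorithm covered by the survey.

```python
import random
import numpy as np

def random_tree(X, y, depth=0, max_depth=3):
    """Grow a random tree: internal nodes are (feature, threshold, left, right),
    leaves are class labels."""
    if depth >= max_depth or random.random() < 0.3:
        return int(np.bincount(y).argmax()) if len(y) else 0
    f = random.randrange(X.shape[1])
    t = random.uniform(X[:, f].min(), X[:, f].max())
    return (f, t, random_tree(X, y, depth + 1, max_depth),
                  random_tree(X, y, depth + 1, max_depth))

def predict(tree, x):
    while isinstance(tree, tuple):
        f, t, left, right = tree
        tree = left if x[f] <= t else right
    return tree

def fitness(tree, X, y):
    preds = np.array([predict(tree, x) for x in X])
    return (preds == y).mean()

def mutate(tree, X, y):
    """Subtree mutation: replace a random subtree by a freshly grown one."""
    if not isinstance(tree, tuple) or random.random() < 0.3:
        return random_tree(X, y, depth=2)
    f, t, left, right = tree
    if random.random() < 0.5:
        return (f, t, mutate(left, X, y), right)
    return (f, t, left, mutate(right, X, y))

def evolve(X, y, pop_size=30, generations=40):
    pop = [random_tree(X, y) for _ in range(pop_size)]
    for _ in range(generations):
        offspring = [mutate(random.choice(pop), X, y) for _ in range(pop_size)]
        pop = sorted(pop + offspring, key=lambda t: fitness(t, X, y),
                     reverse=True)[:pop_size]
    return pop[0]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 4))
    y = (X[:, 0] + X[:, 2] > 0).astype(int)   # toy classification target
    best = evolve(X, y)
    print("training accuracy:", fitness(best, X, y))
```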
Abstract:
Background: This paper addresses the prediction of the free energy of binding of a drug candidate with the enzyme InhA associated with Mycobacterium tuberculosis. This problem arises in rational drug design, where interactions between drug candidates and target proteins are verified through molecular docking simulations. In this application, it is important not only to correctly predict the free energy of binding, but also to provide a comprehensible model that could be validated by a domain specialist. Decision-tree induction algorithms have been successfully used in drug-design related applications, especially considering that decision trees are simple to understand, interpret, and validate. There are several decision-tree induction algorithms available for general use, but each one has a bias that makes it more suitable for a particular data distribution. In this article, we propose and investigate the automatic design of decision-tree induction algorithms tailored to particular drug-enzyme binding data sets. We investigate the performance of our new method for evaluating binding conformations of different drug candidates to InhA, and we analyze our findings with respect to decision-tree accuracy, comprehensibility, and biological relevance. Results: The empirical analysis indicates that our method is capable of automatically generating decision-tree induction algorithms that significantly outperform the traditional C4.5 algorithm with respect to both accuracy and comprehensibility. In addition, we provide the biological interpretation of the rules generated by our approach, reinforcing the importance of comprehensible predictive models in this particular bioinformatics application. Conclusions: We conclude that automatically designing a decision-tree induction algorithm tailored to molecular docking data is a promising alternative for predicting the free energy of binding of a drug candidate with a flexible receptor.
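Since the automatically designed algorithms themselves cannot be reproduced from the abstract, the sketch below only illustrates the comparison baseline: a shallow, interpretable decision tree (scikit-learn's CART standing in for C4.5) trained on hypothetical docking-derived features. The feature names, synthetic data and labels are assumptions for illustration.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical docking-derived features and a hypothetical "favourable
# binding conformation" label (none of these come from the paper).
rng = np.random.default_rng(42)
n = 300
X = np.column_stack([
    rng.normal(-8.0, 2.0, n),    # hypothetical docking score
    rng.normal(3.5, 1.0, n),     # hypothetical hydrogen-bond count
    rng.normal(150.0, 40.0, n),  # hypothetical contact surface area
])
y = (X[:, 0] + 0.5 * X[:, 1] - 0.01 * X[:, 2] < -7.0).astype(int)

# A shallow tree keeps the model comprehensible (few, readable rules).
clf = DecisionTreeClassifier(max_depth=3, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
clf.fit(X, y)
print(export_text(clf, feature_names=["dock_score", "h_bonds", "contact_area"]))
```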
Abstract:
Leao RM, Li S, Doiron B, Tzounopoulos T. Diverse levels of an inwardly rectifying potassium conductance generate heterogeneous neuronal behavior in a population of dorsal cochlear nucleus pyramidal neurons. J Neurophysiol 107: 3008-3019, 2012. First published February 29, 2012; doi:10.1152/jn.00660.2011. Homeostatic mechanisms maintain homogeneous neuronal behavior among neurons that exhibit substantial variability in the expression levels of their ionic conductances. In contrast, the mechanisms that generate heterogeneous neuronal behavior across a neuronal population remain poorly understood. We addressed this problem in the dorsal cochlear nucleus, where principal neurons exist in two qualitatively distinct states: spontaneously active or not spontaneously active. Our studies reveal that distinct activity states are generated by differential levels of a Ba2+-sensitive, inwardly rectifying potassium conductance (Kir). Variability in the Kir maximal conductance causes variations in the resting membrane potential (RMP). A low Kir conductance depolarizes the RMP to voltages above the threshold for activating subthreshold persistent sodium channels (NaP). Once NaP channels are activated, the RMP becomes unstable, and spontaneous firing is triggered. Our results provide a biophysical mechanism for generating neural heterogeneity, which may play a role in the encoding of sensory information.
Abstract:
Diffuse large B-cell lymphoma can be subclassified into at least two molecular subgroups by gene expression profiling: germinal center B-cell like and activated B-cell like diffuse large B-cell lymphoma. Several immunohistological algorithms have been proposed as surrogates to gene expression profiling at the level of protein expression, but their reliability has been an issue of controversy. Furthermore, in all reported algorithms, the proportion of germinal center B-cell cases misclassified by immunohistochemistry is higher compared with germinal center B-cell cases defined by gene expression profiling. We analyzed 424 cases of nodal diffuse large B-cell lymphoma with the panel of markers included in the three previously described algorithms: Hans, Choi, and Tally. To test whether the sensitivity of detecting germinal center B-cell cases could be improved, the germinal center B-cell marker HGAL/GCET2 was also added to all three algorithms. Our results show that the inclusion of HGAL/GCET2 significantly increased the detection of germinal center B-cell cases in all three algorithms (P<0.001). The proportions of germinal center B-cell cases in the original algorithms were 27%, 34%, and 19% for Hans, Choi, and Tally, respectively. In the modified algorithms, with the inclusion of HGAL/GCET2, the frequencies of germinal center B-cell cases increased to 38%, 48%, and 35%, respectively. Therefore, HGAL/GCET2 protein expression may function as a marker for germinal center B-cell type diffuse large B-cell lymphoma. Consideration should be given to the inclusion of HGAL/GCET2 analysis in algorithms to better predict the cell of origin. These findings warrant further validation through comparison with gene expression profiles and with clinical/therapeutic data. Modern Pathology (2012) 25, 1439-1445; doi: 10.1038/modpathol.2012.119; published online 29 June 2012
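For orientation, the sketch below encodes the original Hans decision rule as it is commonly described (30% positivity cutoffs on CD10, BCL6 and MUM1); how HGAL/GCET2 is inserted into the modified algorithms is not specified in the abstract, so it is deliberately omitted here.

```python
def hans_classifier(cd10: float, bcl6: float, mum1: float, cutoff: float = 0.30) -> str:
    """Sketch of the original Hans immunohistochemical algorithm as commonly
    described (fractions of positive cells, 30% cutoff). The modified
    algorithms in the paper additionally use HGAL/GCET2, which is not
    reproducible from the abstract and therefore omitted."""
    if cd10 >= cutoff:
        return "GCB"
    if bcl6 < cutoff:
        return "non-GCB"
    return "GCB" if mum1 < cutoff else "non-GCB"

# Example: CD10-negative, BCL6-positive, MUM1-negative case.
print(hans_classifier(cd10=0.05, bcl6=0.60, mum1=0.10))  # -> GCB
```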
Abstract:
Ubiquitous Computing promises seamless access to a wide range of applications and Internet-based services from anywhere, at any time, and using any device. In this scenario, new challenges for the practice of software development arise: applications and services must keep a coherent behavior and a proper appearance, and must adapt to a wide variety of contextual usage requirements and hardware aspects. In particular, due to its interactive nature, the interface content of Web applications must adapt to a large diversity of devices and contexts. In order to overcome such obstacles, this work introduces an innovative methodology for content adaptation of Web 2.0 interfaces. The basis of our work is to combine static adaptation - the implementation of static Web interfaces - with dynamic adaptation - the alteration of static interfaces at execution time so as to adapt them to different contexts of use. In this hybrid fashion, our methodology benefits from the advantages of both adaptation strategies, static and dynamic. Along these lines, we designed and implemented UbiCon, a framework over which we tested our concepts through a case study and through a development experiment. Our results show that the hybrid methodology over UbiCon leads to broader and more accessible interfaces, and to faster and less costly software development. We believe that the UbiCon hybrid methodology can foster more efficient and accurate interface engineering in industry and academia.
Abstract:
Cancer cachexia induces loss of fat mass that accounts for a large part of the dramatic weight loss observed both in humans and in animal models; however, the literature does not provide consistent information regarding the set point of weight loss and how the different visceral adipose tissue depots contribute to this symptom. To evaluate this, 8-week-old male Wistar rats were subcutaneously inoculated with 1 ml (2 x 10(7)) of tumour cells (Walker 256). Samples of different visceral white adipose tissue (WAT) depots were collected at days 0, 4, 7 and 14 and stored at -80 degrees C (seven to ten animals per day per group). Mesenteric and retroperitoneal depot mass was decreased to the greatest extent on day 14 compared with day 0. Gene and protein expression of PPAR gamma(2) (PPARG) fell significantly following tumour implantation in all three adipose tissue depots, while C/EBP alpha (CEBPA) and SREBP-1c (SREBF1) expression decreased over time only in the epididymal and retroperitoneal depots. Decreased adipogenic gene expression and morphological disruption of visceral WAT are further supported by the dramatic reduction in mRNA and protein levels of perilipin. Classical markers of inflammation and macrophage infiltration (F4/80, CD68 and MIF-1 alpha) in WAT were significantly increased in the later stage of cachexia (although showing an incremental pattern along the course of cachexia) and presented a depot-specific regulation. These results indicate that impairment in the lipid-storing function of adipose tissue occurs at different times and that the mesenteric adipose tissue is more resistant to the 'fat-reducing effect' than the other visceral depots during cancer cachexia progression. Journal of Endocrinology (2012) 215, 363-373
Abstract:
The present study investigates the use of solar heterogeneous photocatalysis (TiO2) for the destruction of [D-Leu]-Microcystin-LR, a powerful toxin of widespread occurrence within cyanobacteria blooms. We extracted [D-Leu]-Microcystin-LR from a culture of Microcystis spp. and used a flat plate glass reactor coated with TiO2 (Degussa, P25) for the degradation studies. The irradiance was measured during the experiments with the aid of a spectroradiometer. After the degradation experiments, toxin concentrations were determined by HPLC and mineralization by TOC analyses. Acute and chronic toxicities were quantified using mice and in vitro phosphatase inhibition assays, respectively. According to the performed experiments, 150 min were necessary to reduce the toxin concentration to the WHO's guideline for drinking water (from 10 to 1 µg L-1) and to mineralize 90% of the initial carbon content. Another important finding is that solar heterogeneous photocatalysis was indeed a destructive process, not only for the toxin, but also for the other extract components and the degradation products generated. Moreover, toxicity tests using mice showed that the acute effect caused by the initial sample was removed. However, tests using the phosphatase enzyme indicated that products capable of inducing chronic effects on mammals may have been formed. The performed experiments indicate the feasibility of using solar heterogeneous photocatalysis for treating water contaminated with [D-Leu]-Microcystin-LR, not only due to the destruction of the toxin, but also due to the significant removal of organic matter and acute toxicity that can be achieved. (C) 2012 Elsevier Ltd. All rights reserved.
Abstract:
Breakthrough advances in microprocessor technology and efficient power management have altered the course of processor development with the emergence of multi-core processor technology, in order to deliver a higher level of processing. The utilization of many-core technology has boosted the computing power provided by clusters of workstations or SMPs, providing large computational power at an affordable cost using solely commodity components. Different implementations of message-passing libraries and system software (including operating systems) are installed in such cluster and multi-cluster computing systems. In order to guarantee correct execution of message-passing parallel applications in a computing environment other than the one in which the parallel application was originally developed, a review of the application code is needed. In this paper, a hybrid communication interfacing strategy is proposed to execute a parallel application on a group of computing nodes belonging to different clusters or multi-clusters (computing systems that may be running different operating systems and MPI implementations), interconnected with public or private IP addresses, and responding interchangeably to user execution requests. Experimental results demonstrate the feasibility and effectiveness of the proposed strategy through the execution of benchmarking parallel applications.
Abstract:
This work aimed to apply genetic algorithms (GA) and particle swarm optimization (PSO) to cash balance management using the Miller-Orr model, a stochastic model that does not define a single ideal point for the cash balance, but rather an oscillation range between a lower bound, an ideal balance and an upper bound. Thus, this paper proposes the application of GA and PSO to minimize the total cost of cash maintenance by determining the lower-bound parameter of the Miller-Orr model, using the assumptions presented in the literature. Computational experiments were applied in the development and validation of the models. The results indicated that both GA and PSO are applicable to determining the cash level from the lower limit, with the PSO model achieving the best results; PSO had not previously been applied to this type of problem.
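As an illustration of the optimization set-up, the sketch below runs a basic PSO over candidate lower bounds of the Miller-Orr model; the total-cost function, its parameters and the penalty term are hypothetical placeholders, since the authors' actual cost formulation is not given in the abstract.

```python
import numpy as np

# Hypothetical Miller-Orr inputs: transfer cost F, daily cash-flow variance,
# and daily opportunity rate k (values are assumptions for illustration).
F, sigma2, k = 50.0, 1000.0**2, 0.0003

def miller_orr_points(L):
    # Classic Miller-Orr: z = (3*F*sigma^2 / (4*k))^(1/3),
    # return point = L + z, upper bound = L + 3*z.
    z = (3.0 * F * sigma2 / (4.0 * k)) ** (1.0 / 3.0)
    return L + z, L + 3.0 * z

def total_cost(L):
    # Hypothetical placeholder cost: opportunity cost of cash held at the
    # return point plus a shortage-risk penalty that shrinks as L grows.
    ret, upper = miller_orr_points(L)
    shortage_penalty = 5000.0 * np.exp(-L / 2000.0)
    return k * ret + shortage_penalty

def pso(cost, lo=0.0, hi=50000.0, n=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, n)                 # particle positions (candidate L)
    v = np.zeros(n)
    pbest, pbest_cost = x.copy(), np.array([cost(xi) for xi in x])
    gbest = pbest[pbest_cost.argmin()]
    for _ in range(iters):
        r1, r2 = rng.random(n), rng.random(n)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        c = np.array([cost(xi) for xi in x])
        improved = c < pbest_cost
        pbest[improved], pbest_cost[improved] = x[improved], c[improved]
        gbest = pbest[pbest_cost.argmin()]
    return gbest, cost(gbest)

L_best, cost_best = pso(total_cost)
print(f"best lower bound L = {L_best:.2f}, total cost = {cost_best:.2f}")
```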
Abstract:
The solution of structural reliability problems by the first-order method requires optimization algorithms to find the smallest distance between a limit state function and the origin of standard Gaussian space. The Hasofer-Lind-Rackwitz-Fiessler (HLRF) algorithm, developed specifically for this purpose, has been shown to be efficient but not robust, as it fails to converge for a significant number of problems. On the other hand, recent developments in general (augmented Lagrangian) optimization techniques have not been tested in application to structural reliability problems. In the present article, three new optimization algorithms for structural reliability analysis are presented. One algorithm is based on the HLRF, but uses a new differentiable merit function with Wolfe conditions to select the step length in the line search. It is shown in the article that, under certain assumptions, the proposed algorithm generates a sequence that converges to the local minimizer of the problem. Two new augmented Lagrangian methods are also presented, which use quadratic penalties to solve nonlinear problems with equality constraints. Performance and robustness of the new algorithms are compared to the classical augmented Lagrangian method, to HLRF and to the improved HLRF (iHLRF) algorithms in the solution of 25 benchmark problems from the literature. The new proposed HLRF algorithm is shown to be more robust than HLRF or iHLRF, and as efficient as the iHLRF algorithm. The two augmented Lagrangian methods proposed herein are shown to be more robust and more efficient than the classical augmented Lagrangian method.
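For context, the sketch below implements the classic HLRF recursion in standard Gaussian space, i.e. the baseline that the proposed merit-function/Wolfe-condition variant improves on; the limit state function and starting point are illustrative, and the new algorithms themselves are not reproducible from the abstract.

```python
import numpy as np

def g(u):
    # Illustrative limit state function; g(u) = 0 defines the failure surface.
    return 3.0 - u[0] - 0.5 * u[1] ** 2

def grad_g(u, h=1e-6):
    # Finite-difference gradient, adequate for a sketch.
    return np.array([(g(u + h * e) - g(u - h * e)) / (2 * h)
                     for e in np.eye(len(u))])

def hlrf(u0, tol=1e-8, max_iter=100):
    """Classic HLRF recursion: project onto the linearized limit state."""
    u = np.asarray(u0, dtype=float)
    for _ in range(max_iter):
        gv, grad = g(u), grad_g(u)
        u_new = (grad @ u - gv) / (grad @ grad) * grad
        if np.linalg.norm(u_new - u) < tol:
            return u_new
        u = u_new
    return u

u_star = hlrf([1.0, 1.0])
beta = np.linalg.norm(u_star)   # reliability index = distance to the origin
print(f"design point {u_star}, beta = {beta:.4f}")
```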
Abstract:
Strontium zirconate oxide was synthesized by co-precipitation and by the citrate route and was evaluated as a heterogeneous catalyst for biodiesel production. The catalyst samples were characterized by XRD, FTIR, and TG, and catalytic activity was measured based on the ester content of the biodiesel produced, quantified by GC. The co-precipitated samples were obtained at alkaline pH and contained a mixture of the perovskite phase and pure strontium and zirconium oxide phases. Ester conversion using these samples was approximately 1.6%, indicating no catalytic activity. The citrate route was more efficient in producing the perovskite when carried out at pH 7-8; however, excess SrCO3 was found on the catalyst surface due to CO2 adsorption, and these samples likewise demonstrated no catalytic activity. The same synthesis carried out at pH 2 resulted in free OH- groups with only a small amount of carbonate species, and produced ester yield values of 98%. Therefore, matrices based on strontium zirconate produced via the citrate route in acidic media are potential heterogeneous catalysts for transesterification. (C) 2012 Elsevier B.V. All rights reserved.
Abstract:
This paper addresses the analysis of the probabilistic corrosion initiation time in reinforced concrete structures exposed to chloride ion penetration. Structural durability is an important criterion that must be evaluated in every type of structure, especially when structures are built in aggressive atmospheres. For reinforced concrete members, the chloride diffusion process is widely used to evaluate durability; therefore, by modelling this phenomenon, corrosion of the reinforcement can be better estimated and prevented. Corrosion begins when a threshold chloride concentration is reached at the steel bars of the reinforcement. Despite the robustness of several models proposed in the literature, deterministic approaches fail to predict the corrosion initiation time accurately due to the inherent randomness observed in this process. In this regard, durability can be represented more realistically using probabilistic approaches. A probabilistic analysis of chloride ion penetration is presented in this paper. Chloride penetration is simulated using Fick's second law of diffusion, which represents the chloride diffusion process considering time-dependent effects. The probability of failure is calculated using Monte Carlo simulation and the First Order Reliability Method (FORM) with a direct coupling approach. Some examples are considered in order to study these phenomena, and a simplified method is proposed to determine optimal values for the concrete cover.
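As a worked illustration of the probabilistic approach described above, the sketch below combines the error-function solution of Fick's second law with crude Monte Carlo sampling to estimate the probability of corrosion initiation; all distributions and parameter values are hypothetical assumptions, not data from the paper.

```python
import numpy as np
from scipy.special import erf

# Error-function solution of Fick's second law:
#   C(x, t) = Cs * (1 - erf(x / (2 * sqrt(D * t)))),
# with corrosion initiation when C(cover, t_design) >= C_crit.
# All distributions and values below are hypothetical.
rng = np.random.default_rng(1)
n = 200_000
t = 50.0                                                  # design life [years]

cover = rng.normal(50.0, 8.0, n) / 1000.0                 # concrete cover [m]
D     = rng.lognormal(np.log(1e-12), 0.3, n) * 3.154e7    # diffusion coeff. [m^2/year]
Cs    = rng.lognormal(np.log(3.5), 0.2, n)                # surface chloride [% binder]
Ccrit = rng.normal(0.9, 0.15, n)                          # critical content [% binder]

C_at_rebar = Cs * (1.0 - erf(cover / (2.0 * np.sqrt(D * t))))
pf = np.mean(C_at_rebar >= Ccrit)                         # P(corrosion initiation)
print(f"estimated probability of corrosion initiation: {pf:.4f}")
```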