15 results for Multi-classifier systems

in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo


Relevance: 100.00%

Abstract:

Binary and ternary systems of Ni2+, Zn2+ and Pb2+ were investigated at initial metal concentrations of 0.5, 1.0 and 2.0 mM as competitive adsorbates, using Arthrospira platensis and Chlorella vulgaris as biosorbents. The experimental results were evaluated in terms of equilibrium sorption capacity and metal removal efficiency and fitted to the multi-component Langmuir and Freundlich isotherms. The pseudo-second-order model of Ho and McKay described the adsorption kinetics well, and FT-IR spectroscopy confirmed metal binding to both biomasses. The interference of Ni2+ and Zn2+ on Pb2+ sorption was weaker than the reverse, likely because both biosorbents preferentially bind Pb2+. In general, the higher the total initial metal concentration, the lower the adsorption capacity. The results of this study demonstrate that dry biomass of C. vulgaris is a better biosorbent than A. platensis and suggest its use as an effective alternative sorbent for metal removal from wastewater. (C) 2012 Elsevier B.V. All rights reserved.
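The pseudo-second-order model mentioned above is commonly fitted in its standard linearized form, t/q_t = 1/(k2·qe²) + t/qe, by ordinary least squares. A minimal sketch in Python, using synthetic data rather than the paper's measurements:

```python
# Linearized pseudo-second-order model of Ho and McKay:
#   t/q_t = 1/(k2*qe^2) + t/qe
# Regressing t/q_t against t gives qe = 1/slope and k2 = slope^2/intercept.
# The data below are synthetic (illustrative only, not from the paper).

def fit_pseudo_second_order(t, q):
    y = [ti / qi for ti, qi in zip(t, q)]       # t/q_t values
    n = len(t)
    mean_t = sum(t) / n
    mean_y = sum(y) / n
    slope = sum((ti - mean_t) * (yi - mean_y) for ti, yi in zip(t, y)) / \
            sum((ti - mean_t) ** 2 for ti in t)
    intercept = mean_y - slope * mean_t
    qe = 1.0 / slope
    k2 = slope ** 2 / intercept
    return qe, k2

# Synthetic kinetics generated from qe = 2.0, k2 = 0.5 (arbitrary units):
times = [5, 10, 20, 40, 80]
q_t = [0.5 * 2.0 ** 2 * t / (1 + 0.5 * 2.0 * t) for t in times]
qe, k2 = fit_pseudo_second_order(times, q_t)
```

Because the synthetic data follow the model exactly, the fit recovers the generating parameters.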

Relevance: 100.00%

Abstract:

Twenty years after the discovery of the first planets outside our solar system, the current exoplanetary population includes more than 700 confirmed planets around main-sequence stars. Approximately 50% belong to multiple-planet systems in very diverse dynamical configurations, from two-planet hierarchical systems to multiple resonances that could only have been attained as the consequence of a smooth large-scale orbital migration. The first part of this paper reviews the main techniques employed for the detection and orbital characterization of multiple-planet systems, from the (now) classical radial velocity (RV) method to the use of transit time variations (TTV) for the identification of additional planetary bodies orbiting the same star. The second part discusses the dynamical evolution of multi-planet systems due to their mutual gravitational interactions. We analyze possible modes of motion for hierarchical, secular or resonant configurations, and the stability criteria that can be defined in each case. In some cases the dynamics can be well approximated by simple analytical expressions for the Hamiltonian function, while other configurations can only be studied with semi-analytical or numerical tools. In particular, we show how mean-motion resonances can generate complex structures in phase space where different libration islands and circulation domains are separated by chaotic layers. In all cases we use real exoplanetary systems as working examples.
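As a small illustration of the period commensurabilities behind mean-motion resonances, the sketch below finds the small-integer ratio closest to an observed period ratio. The GJ 876 periods used are approximate published values, and the search bounds are arbitrary choices:

```python
def nearest_resonance(p_outer, p_inner, max_q=5):
    """Return (p, q, relative offset) for the small-integer ratio p:q
    closest to the observed period ratio. Search bounds are arbitrary."""
    ratio = p_outer / p_inner
    best = None
    for q in range(1, max_q + 1):
        for p in range(q + 1, 4 * q):
            offset = abs(ratio - p / q) / ratio
            if best is None or offset < best[2]:
                best = (p, q, offset)
    return best

# GJ 876 c and b orbit with periods near 30.1 and 61.1 days (approximate),
# a well-known pair deep in a mean-motion resonance.
p, q, off = nearest_resonance(61.1, 30.1)
```

For this pair the nearest commensurability is 2:1, a few percent from exact, consistent with the libration expected inside the resonance.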

Relevance: 90.00%

Abstract:

Current SoC design trends are characterized by the integration of a growing number of IPs targeting a wide range of application fields. Such multi-application systems are constrained by a set of requirements. In this scenario, networks-on-chip (NoCs) are becoming more important as the on-chip communication structure. Designing an optimal NoC that satisfies the requirements of each individual application demands the specification of a large set of configuration parameters, leading to a wide solution space. It has been shown that IP mapping is one of the most critical parameters in NoC design, strongly influencing SoC performance. IP mapping has been solved for single-application systems using single- and multi-objective optimization algorithms. In this paper we propose the use of a multi-objective adaptive immune algorithm (M(2)AIA), an evolutionary approach, to solve the multi-application NoC mapping problem. Latency and power consumption were adopted as the target objective functions. To assess the efficiency of our approach, our results are compared with those of genetic and branch-and-bound multi-objective mapping algorithms. We tested 11 well-known benchmarks, including random and real applications, combining up to 8 applications in the same SoC. The experimental results showed that M(2)AIA reduces power consumption and latency by, on average, 27.3% and 42.1% relative to the branch-and-bound approach and by 29.3% and 36.1% relative to the genetic approach.
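Multi-objective mapping algorithms of this kind compare candidate mappings by Pareto dominance over the two objectives. A minimal sketch with hypothetical (latency, power) pairs, both minimized:

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly
    better in at least one (both objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Keep only the non-dominated (latency, power) pairs."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o != s)]

# Hypothetical (latency, power) values for five candidate IP mappings:
mappings = [(10, 5.0), (8, 6.0), (12, 4.0), (9, 7.0), (8, 5.5)]
front = pareto_front(mappings)
```

Here (8, 6.0) and (9, 7.0) are dominated and drop out; the remaining three mappings form the front an evolutionary algorithm would evolve toward.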

Relevance: 40.00%

Abstract:

Network reconfiguration for service restoration (SR) in distribution systems is a complex optimization problem. For large-scale distribution systems, it is computationally hard to find adequate SR plans in real time, since the problem is combinatorial and non-linear and involves several constraints and objectives. Two multi-objective evolutionary algorithms that use Node-Depth Encoding (NDE) have proved able to efficiently generate adequate SR plans for large distribution systems: (i) the hybridization of the Non-Dominated Sorting Genetic Algorithm-II (NSGA-II) with NDE, named NSGA-N; and (ii) a multi-objective evolutionary algorithm based on subpopulation tables that uses NDE, named MEAN. Two further challenges are addressed here: designing SR plans for larger systems that are as good as those for relatively smaller ones, and for multiple faults that are as good as those for a single fault. To tackle both challenges, this paper proposes a method that combines NSGA-N, MEAN and a new heuristic. The heuristic focuses the application of NDE operators on network zones in alarm, according to technical constraints. The method generates SR plans of similar quality in distribution systems of significantly different sizes (from 3,860 to 30,880 buses). Moreover, the number of switching operations required to implement the SR plans generated by the proposed method increases only moderately with the number of faults.
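Node-Depth Encoding is commonly described as storing each feeder tree as a list of (node, depth) pairs in depth-first order. Assuming that reading of the encoding, the sketch below recovers parent pointers from a hypothetical feeder; it is an illustration of the representation, not the paper's operator implementation:

```python
def nde_to_parents(nde):
    """Recover parent pointers from a node-depth list in DFS order:
    each node's parent is the nearest preceding node one level shallower."""
    parents = {}
    stack = []  # nodes on the current root-to-node path
    for node, depth in nde:
        while len(stack) > depth:   # unwind to this node's parent level
            stack.pop()
        parents[node] = stack[-1] if stack else None
        stack.append(node)
    return parents

# Hypothetical feeder rooted at substation S, in DFS order with depths:
nde = [("S", 0), ("A", 1), ("B", 2), ("C", 1), ("D", 2)]
parents = nde_to_parents(nde)
```

Because the whole tree is recoverable from one linear pass, operators that cut and reattach subtrees (as in NDE-based reconfiguration) can work directly on slices of this list while preserving radiality.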

Relevance: 30.00%

Abstract:

This paper presents a theoretical model for estimating the power, the optical signal-to-noise ratio and the number of carriers generated in a comb generator, taking as reference the minimum optical signal-to-noise ratio at the receiver input for a given fiber link. Based on the recirculating frequency shifting technique, the generator relies on coherent and orthogonal multi-carriers (Coherent-WDM) derived from a single laser source (seed) for feeding high-capacity (above 100 Gb/s) systems. The theoretical model has been validated by an experimental demonstration in which 23 comb lines with an optical signal-to-noise ratio ranging from 25 to 33 dB, in a spectral window of approximately 3.5 nm, were obtained.

Relevance: 30.00%

Abstract:

This work addresses the problem of robust model predictive control (MPC) of systems with model uncertainty. The case of zone control of multi-variable stable systems with multiple time delays is considered. The usual approach to this kind of problem is the inclusion of a non-linear cost constraint in the control problem; the control action is then obtained at each sampling time as the solution to a non-linear programming (NLP) problem, which for high-order systems can be computationally expensive. Here, the robust MPC problem is formulated as a linear matrix inequality problem that can be solved in real time with a fraction of the computational effort. The proposed approach is compared with conventional robust MPC and tested through the simulation of a reactor system from the process industry.
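Reformulations of this kind typically rest on a standard trick: a quadratic (non-linear) cost bound can be rewritten as a linear matrix inequality via the Schur complement. Sketching the generic step (not necessarily the paper's exact formulation), for $P \succ 0$:

```latex
x^{\top} P x \le \gamma
\quad\Longleftrightarrow\quad
\begin{bmatrix} \gamma & x^{\top} \\ x & Q \end{bmatrix} \succeq 0,
\qquad Q = P^{-1} \succ 0,
```

which is linear in the decision variables $\gamma$ and $Q$ and can therefore be handled by semidefinite programming solvers at each sampling time, avoiding the NLP.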

Relevance: 30.00%

Abstract:

Over the last few years, Business Process Management (BPM) has achieved increasing popularity and dissemination. An analysis of the underlying assumptions of BPM shows that it pursues two apparently contradictory goals: on the one hand, it aims at formalising work practices into business process models; on the other hand, it intends to confer flexibility on the organization, i.e. to maintain its ability to respond to new and unforeseen situations. This paper analyses the relationship between formalisation and flexibility in business process modelling by means of an empirical case study of a BPM project in an aircraft maintenance company. A qualitative approach is adopted, based on Actor-Network Theory. The paper offers two major contributions: (a) it illustrates the sociotechnical complexity involved in BPM initiatives; (b) it points towards a multidimensional understanding of the relation between formalisation and flexibility in BPM projects.

Relevance: 30.00%

Abstract:

Background: We investigated whether 9p21 polymorphisms are associated with cardiovascular events in a group of 611 patients enrolled in the Medical, Angioplasty or Surgery Study II (MASS II), a randomized trial comparing treatments for patients with coronary artery disease (CAD) and preserved left ventricular function. Methods: The participants of MASS II were genotyped for 9p21 polymorphisms (rs10757274, rs2383206, rs10757278 and rs1333049). Survival curves were calculated with the Kaplan-Meier method and compared with the log-rank statistic. We assessed the relationship between baseline variables and the composite end-point of death, death from cardiac causes and myocardial infarction using a Cox proportional hazards survival model. Results: We observed significant differences between patients within each polymorphism genotype group for baseline characteristics. The frequency of diabetes was lower in patients carrying the GG genotype for rs10757274, rs2383206 and rs10757278 (29.4%, 32.8%, 32.0%) than in patients carrying AA or AG genotypes (49.1% and 39.2%, p = 0.01; 52.4% and 40.1%, p = 0.01; 47.8% and 37.9%, p = 0.04; respectively). Significant differences in genotype frequencies between double- and triple-vessel disease patients were observed for rs10757274, rs10757278 and rs1333049. Finally, there was a higher incidence of overall mortality in patients with the GG genotype for rs2383206 compared to patients with AA and AG genotypes (19.5%, 11.9%, 11.0%, respectively; p = 0.04). Moreover, rs2383206 remained significantly associated with a 1.75-fold increased risk of overall mortality (p = 0.02) even after adjustment of a Cox multivariate model for age, previous myocardial infarction, diabetes, smoking and type of coronary anatomy. Conclusions: Our data are in accordance with previous evidence that chromosome 9p21 genetic variation may constitute a genetic modulator in the cardiovascular system in different scenarios. In patients with established CAD, we observed an association between rs2383206 and a higher incidence of overall mortality and death from cardiac causes in patients with multi-vessel CAD.
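The Kaplan-Meier estimator behind the survival curves can be sketched in a few lines; the event times and flags below are toy values for illustration, not MASS II data:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate. events[i] is 1 for an event
    (e.g. death/MI), 0 for censoring. Returns (time, S(t)) steps."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    s = 1.0
    curve = []
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths = 0
        n = at_risk                 # subjects at risk just before time t
        while i < len(order) and times[order[i]] == t:
            deaths += events[order[i]]
            at_risk -= 1            # both events and censorings leave the risk set
            i += 1
        if deaths:
            s *= 1.0 - deaths / n
            curve.append((t, s))
    return curve

# Toy follow-up data (months, event flag):
times = [2, 3, 3, 5, 8, 8, 9]
events = [1, 1, 0, 1, 1, 0, 0]
curve = kaplan_meier(times, events)
```

Each step multiplies the running survival by (1 - d/n) at every event time, which is exactly the product-limit form compared across genotype groups with the log-rank test.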

Relevance: 30.00%

Abstract:

Breakthrough advances in microprocessor technology and efficient power management have altered the course of processor development with the emergence of multi-core technology, bringing a higher level of processing power. Many-core technology has boosted the computing power provided by clusters of workstations or SMPs, delivering large computational power at an affordable cost using solely commodity components. Different implementations of message-passing libraries and system software (including operating systems) are installed in such cluster and multi-cluster computing systems. To guarantee the correct execution of a message-passing parallel application in a computing environment other than the one for which it was originally developed, a review of the application code is needed. In this paper, a hybrid communication interfacing strategy is proposed to execute a parallel application on a group of computing nodes belonging to different clusters or multi-clusters (computing systems that may be running different operating systems and MPI implementations), interconnected with public or private IP addresses, and responding interchangeably to user execution requests. Experimental results demonstrate the feasibility and effectiveness of the proposed strategy through the execution of benchmarking parallel applications.
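The interfacing idea can be caricatured as a dispatch table from each node's home cluster to the transport that cluster understands, so nodes behind different MPI stacks are addressed uniformly. The cluster names and send functions below are hypothetical placeholders, not the paper's implementation:

```python
sent = []  # log of (transport, node, payload), for illustration only

def send_via_stack_a(node, payload):
    """Stand-in for a send through one MPI implementation."""
    sent.append(("stack-a", node, payload))

def send_via_stack_b(node, payload):
    """Stand-in for a send through a different MPI implementation."""
    sent.append(("stack-b", node, payload))

# Registry: which transport each (hypothetical) cluster understands,
# and which cluster each node belongs to.
TRANSPORT = {"clusterA": send_via_stack_a, "clusterB": send_via_stack_b}
NODE_CLUSTER = {"n0": "clusterA", "n1": "clusterA", "n2": "clusterB"}

def hybrid_send(node, payload):
    """Dispatch to the transport registered for the node's home cluster."""
    TRANSPORT[NODE_CLUSTER[node]](node, payload)

for node in ("n0", "n2"):
    hybrid_send(node, "work-unit")
```

The application code calls one uniform `hybrid_send`; only the registry knows which underlying stack actually carries each message.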

Relevance: 30.00%

Abstract:

In this article we propose an efficient and accurate method for fault location in underground distribution systems by means of an Optimum-Path Forest (OPF) classifier. We applied the time-domain reflectometry method for signal acquisition, and the acquired signals were then analyzed by OPF and several other well-known pattern recognition techniques. The results indicated that OPF and support vector machines outperformed artificial neural networks and a Bayesian classifier, but OPF was much more efficient than all the other classifiers for training, and the second fastest for classification.
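The acquisition side rests on the basic TDR relation: the distance to a fault is half the round-trip delay times the propagation velocity in the cable. A sketch with an assumed velocity factor (an illustrative typical value, not a figure from the paper):

```python
# Time-domain reflectometry locates a reflection (e.g. a fault) from the
# round-trip delay of a launched pulse: distance = velocity * delay / 2.

C = 299_792_458.0   # speed of light in vacuum, m/s
VF = 0.55           # assumed velocity factor of the cable (illustrative)

def fault_distance(round_trip_s):
    """Distance in metres from the measurement point to the reflection."""
    return C * VF * round_trip_s / 2.0

d = fault_distance(12e-6)  # a 12-microsecond round trip
```

A 12 µs echo at this velocity factor places the reflection just under a kilometre away; the classifiers in the paper then work on features of such acquired waveforms.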

Relevance: 30.00%

Abstract:

In this paper, we consider the stochastic optimal control problem of discrete-time linear systems subject to Markov jumps and multiplicative noises under two criteria. The first is an unconstrained mean-variance trade-off performance criterion over time, and the second is a minimum-variance criterion over time with constraints on the expected output. We present explicit conditions for the existence of an optimal control strategy for these problems, generalizing previous results in the literature. We conclude the paper with a numerical example of a multi-period portfolio selection problem with regime switching, in which it is desired to minimize the sum of the variances of the portfolio over time under the restriction of keeping the expected value of the portfolio greater than some minimum values specified by the investor. (C) 2011 Elsevier Ltd. All rights reserved.
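A one-period analogue of the constrained criterion (minimize variance subject to a floor on the expected value) can be sketched with two hypothetical uncorrelated assets and a coarse grid search; the paper's actual problem is multi-period with regime switching and is solved analytically:

```python
# Two assets with assumed moments (illustrative, zero correlation),
# weights summing to 1, no short selling.
mu = (0.10, 0.04)    # expected returns
var = (0.09, 0.01)   # variances
floor = 0.06         # required minimum expected return

best = None
for k in range(1001):
    w = k / 1000.0                         # weight of asset 1
    exp_ret = w * mu[0] + (1 - w) * mu[1]
    if exp_ret < floor:                    # expected-value constraint
        continue
    v = w ** 2 * var[0] + (1 - w) ** 2 * var[1]   # portfolio variance
    if best is None or v < best[1]:
        best = (w, v)

w_opt, v_opt = best
```

Here the unconstrained variance minimum (w = 0.1) violates the floor, so the optimum sits on the constraint at w = 1/3, mirroring how the expected-output restriction shapes the solution in the multi-period setting.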

Relevance: 30.00%

Abstract:

In multi-label classification, examples can be associated with multiple labels simultaneously. The task of learning from multi-label data can be addressed by methods that transform the multi-label classification problem into several single-label classification problems. The binary relevance approach is one of these methods, where the multi-label learning task is decomposed into several independent binary classification problems, one for each label in the set of labels, and the final labels for each example are determined by aggregating the predictions from all binary classifiers. However, this approach fails to consider any dependency among the labels. Aiming to accurately predict label combinations, in this paper we propose a simple approach that enables the binary classifiers to discover existing label dependency by themselves. An experimental study using decision trees, a kernel method as well as Naive Bayes as base-learning techniques shows the potential of the proposed approach to improve the multi-label classification performance.
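The idea of letting binary classifiers discover label dependency can be sketched as a two-stage binary relevance scheme in which each second-stage classifier sees the other labels' first-stage predictions as extra features. The sketch below uses a 1-nearest-neighbour stand-in as base learner and synthetic data; it illustrates the spirit of the approach, not the paper's exact method:

```python
def nn_predict(train_X, train_y, x):
    """1-nearest-neighbour as a stand-in base learner (illustrative only)."""
    dists = [sum((a - b) ** 2 for a, b in zip(row, x)) for row in train_X]
    return train_y[dists.index(min(dists))]

def br_with_dependency(X, Y, x_new):
    """Binary relevance in two stages: the second stage feeds each label's
    classifier with the first-stage predictions of the *other* labels."""
    n_labels = len(Y[0])
    # Stage 1: plain binary relevance, one classifier per label.
    stage1 = [nn_predict(X, [row[j] for row in Y], x_new)
              for j in range(n_labels)]
    # Stage 2: augment the features with the other labels' outputs.
    final = []
    for j in range(n_labels):
        Xa = [list(x) + [row[k] for k in range(n_labels) if k != j]
              for x, row in zip(X, Y)]
        xa = list(x_new) + [stage1[k] for k in range(n_labels) if k != j]
        final.append(nn_predict(Xa, [row[j] for row in Y], xa))
    return final

# Tiny synthetic dataset in which label 1 always equals label 0:
X = [(0.0, 0.0), (0.1, 0.0), (1.0, 1.0), (0.9, 1.0)]
Y = [(0, 0), (0, 0), (1, 1), (1, 1)]
pred = br_with_dependency(X, Y, (0.95, 1.0))
```

The final prediction still comes from independent binary classifiers, but each one can exploit the correlation with the other labels through the augmented features.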

Relevance: 30.00%

Abstract:

Objective: The aim of this study was to evaluate, ex vivo, the precision of five electronic root canal length measurement devices (ERCLMDs) with different operating systems: the Root ZX, Mini Apex Locator, Propex II, iPex, and RomiApex A-15, and the possible influence of the positioning of the instrument tips short of the apical foramen. Material and Methods: Forty-two mandibular bicuspids had their real canal lengths (RL) previously determined. Electronic measurements were performed 1.0 mm short of the apical foramen (-1.0), followed by measurements at the apical foramen (0.0). The data resulting from the comparison of the ERCLMD measurements and the RL were evaluated by the Wilcoxon and Friedman tests at a significance level of 5%. Results: Considering the measurements performed at 0.0 and -1.0, the precision rates for the ERCLMDs were: 73.5% and 47.1% (Root ZX), 73.5% and 55.9% (Mini Apex Locator), 67.6% and 41.1% (Propex II), 61.7% and 44.1% (iPex), and 79.4% and 44.1% (RomiApex A-15), respectively, considering ±0.5 mm of tolerance. Regarding the mean discrepancies, no differences were observed at 0.0; however, in the measurements at -1.0, the iPex, a multi-frequency ERCLMD, had significantly more discrepant readings short of the apical foramen than the other devices, except for the Propex II, which had intermediate results. When the ERCLMDs measurements at -1.0 were compared with those at 0.0, the Propex II, iPex and RomiApex A-15 presented significantly higher discrepancies in their readings. Conclusions: Under the conditions of the present study, all the ERCLMDs provided acceptable measurements at the 0.0 position. However, at the -1.0 position, the ERCLMDs had a lower precision, with statistically significant differences for the Propex II, iPex, and RomiApex A-15.
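The precision rates reported above follow from counting readings that fall within the ±0.5 mm tolerance of the real canal length; a minimal sketch with hypothetical measurements:

```python
def precision_rate(measured, real, tol=0.5):
    """Share of electronic readings within +/- tol mm of the real length."""
    hits = sum(abs(m - r) <= tol for m, r in zip(measured, real))
    return hits / len(measured)

# Hypothetical readings (mm) against real canal lengths (mm):
rate = precision_rate([20.0, 19.2, 21.0], [20.3, 20.0, 20.9])
```

With two of the three hypothetical readings inside the tolerance, the rate is 2/3; the study computes the same fraction over 42 teeth per device and position.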

Relevance: 30.00%

Abstract:

Methods: We conducted a phase I, multicenter, randomized, double-blind, placebo-controlled, multi-arm (10 arms) parallel study involving healthy adults to evaluate the safety and immunogenicity of non-adjuvanted and adjuvanted influenza A (H1N1) 2009 candidate vaccines. Subjects received two intramuscular injections of one of the candidate vaccines, administered 21 days apart. Antibody responses were measured by means of a hemagglutination-inhibition assay before and 21 days after each vaccination. The three co-primary immunogenicity end points were a seroprotection rate >70%, a seroconversion rate >40%, and a factor increase in the geometric mean titer >2.5. Results: A total of 266 participants were enrolled in the study. No deaths or serious adverse events were reported. The most commonly solicited local and systemic adverse events were injection-site pain and headache, respectively. Only three subjects (1.1%) reported severe injection-site pain. Four 2009 influenza A (H1N1) inactivated monovalent candidate vaccines that met the three requirements for evaluating influenza protection after a single dose were identified: 15 μg of hemagglutinin antigen without adjuvant; 7.5 μg of hemagglutinin antigen with aluminum hydroxide, MPL and squalene; 3.75 μg of hemagglutinin antigen with aluminum hydroxide and MPL; and 3.75 μg of hemagglutinin antigen with aluminum hydroxide and squalene. Conclusions: Adjuvant systems can be safely used in influenza vaccines, including the adjuvant monophosphoryl lipid A (MPL) derived from Bordetella pertussis with squalene and aluminum hydroxide, MPL with aluminum hydroxide, and squalene and aluminum hydroxide.
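The three co-primary end points can be computed directly from pre- and post-vaccination HI titers. The sketch below uses hypothetical titers and assumes the usual HI conventions (seroprotection at a titer ≥1:40; seroconversion as a ≥4-fold rise, or a rise from <1:10 to ≥1:40), which the abstract does not spell out:

```python
from math import exp, log

def gmt(titers):
    """Geometric mean titer."""
    return exp(sum(log(t) for t in titers) / len(titers))

def immunogenicity(pre, post):
    """Seroprotection rate, seroconversion rate, and GMT factor increase,
    under the assumed HI conventions noted above."""
    seroprotection = sum(p >= 40 for p in post) / len(post)
    seroconversion = sum((b < 10 and p >= 40) or p >= 4 * b
                         for b, p in zip(pre, post)) / len(pre)
    gmt_factor = gmt(post) / gmt(pre)
    return seroprotection, seroconversion, gmt_factor

# Hypothetical reciprocal HI titers for five subjects, before and after:
pre = [10, 20, 10, 40, 5]
post = [80, 80, 40, 160, 10]
sp, sc, gf = immunogenicity(pre, post)
```

On these toy titers the candidate would clear all three thresholds (80% > 70%, 80% > 40%, 4.0 > 2.5).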

Relevance: 30.00%

Abstract:

This work proposes a novel texture descriptor based on fractal theory. The method builds on the Bouligand-Minkowski descriptors. We decompose the original image recursively into four equal parts. In each recursion step, we estimate the average and the deviation of the Bouligand-Minkowski descriptors computed over each part. We then extract entropy features from both the average and the deviation, and the proposed descriptors are obtained by concatenating these measures. The method is tested in a classification experiment on well-known datasets, namely Brodatz and Vistex. The results demonstrate that the novel technique achieves better results than classical and state-of-the-art texture descriptors such as Local Binary Patterns, Gabor wavelets and the co-occurrence matrix.
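The recursive decomposition can be sketched as follows, with a simple foreground-pixel count standing in for the Bouligand-Minkowski descriptor of each part (the paper's dilation-volume descriptor and the final entropy step over the collected values are omitted):

```python
from statistics import mean, pstdev

def quadrants(img):
    """Split a square image (list of rows) into four equal parts."""
    n = len(img) // 2
    return [[row[:n] for row in img[:n]], [row[n:] for row in img[:n]],
            [row[:n] for row in img[n:]], [row[n:] for row in img[n:]]]

def descriptor(part):
    """Stand-in per-part descriptor: foreground pixel count. The paper
    uses Bouligand-Minkowski dilation volumes instead."""
    return sum(sum(row) for row in part)

def recursive_features(img, depth):
    """At each recursion step, record the average and the deviation of the
    per-quadrant descriptors, then recurse into each quadrant."""
    feats = []
    parts = quadrants(img)
    d = [descriptor(p) for p in parts]
    feats.append((mean(d), pstdev(d)))
    if depth > 1:
        for p in parts:
            feats.extend(recursive_features(p, depth - 1))
    return feats

# Toy 4x4 binary "texture":
img = [[1, 0, 0, 0],
       [0, 1, 0, 0],
       [0, 0, 1, 1],
       [0, 0, 1, 1]]
feats = recursive_features(img, 2)
```

Concatenating the (average, deviation) pairs across recursion levels yields the multi-scale feature vector the classifier would consume.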