944 results for Multi-soft sets
Abstract:
Although individual techniques of supervised Machine Learning (ML), also known as classifiers or classification algorithms, supply solutions that most of the time are considered efficient, experimental results obtained with large pattern sets, and/or with sets containing a significant amount of irrelevant or incomplete data, show a decrease in the precision of these techniques. In other words, such techniques cannot recognize patterns efficiently in complex problems. Intending to improve the performance and efficiency of these ML techniques, the idea arose of making several types of ML algorithms work jointly, thus giving origin to the term Multi-Classifier System (MCS). An MCS has different ML algorithms, called base classifiers, as its components, and combines the results obtained by these algorithms to reach the final result. For an MCS to perform better than its base classifiers, the results obtained by each base classifier must present a certain diversity, in other words, a difference between the results obtained by each classifier that composes the system. It makes no sense to have an MCS whose base classifiers give identical answers to the same patterns. Although MCSs present better results than individual systems, there is a constant search for ways to improve the results obtained by this type of system. Aiming at this improvement, at more consistent results, and at greater diversity among the classifiers of an MCS, methodologies characterized by the use of weights, or confidence values, have recently been investigated. These weights can describe the importance that a certain classifier had when associating each pattern with a given class. The weights are also used, together with the outputs of the classifiers, during the recognition (use) phase of the MCSs. There are different ways of calculating these weights, and they can be divided into two categories: static weights and dynamic weights. The first category is characterized by values that do not change during the classification process, unlike the second category, whose values are modified during classification. In this work, an analysis is made to verify whether the use of weights, both static and dynamic, can increase the performance of MCSs in comparison with individual systems. Moreover, the diversity obtained by the MCSs is analyzed, in order to verify whether there is any relation between the use of weights in MCSs and different levels of diversity.
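The weighted-combination idea can be pictured with a short sketch. This is an illustrative Python example, not the system studied in the work: the probability outputs and weight choices are hypothetical; static weights are fixed beforehand (e.g., from each base classifier's validation accuracy), while dynamic weights are recomputed for every pattern.

```python
import numpy as np

def combine(prob_outputs, weights):
    """Weighted soft vote: scale each base classifier's class-probability
    vector by its weight, sum, and pick the highest-scoring class."""
    weighted = np.tensordot(weights, prob_outputs, axes=1)  # shape: (n_classes,)
    return int(np.argmax(weighted))

# Static weights: fixed during classification (e.g., validation accuracies).
static_w = np.array([0.9, 0.7, 0.8])
static_w = static_w / static_w.sum()

# Dynamic weights: recomputed per pattern, here from each classifier's
# confidence (its maximum posterior) on the pattern being classified.
def dynamic_weights(prob_outputs):
    conf = prob_outputs.max(axis=1)
    return conf / conf.sum()

# Outputs of three base classifiers for one pattern, three classes
# (illustrative numbers only).
outputs = np.array([[0.6, 0.3, 0.1],
                    [0.2, 0.5, 0.3],
                    [0.5, 0.4, 0.1]])

print(combine(outputs, static_w))                  # static combination
print(combine(outputs, dynamic_weights(outputs)))  # dynamic combination
```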
Abstract:
Deformation bands are structures, developed in porous sandstones, that have small offsets and are not visible on seismic sections. The deformation bands of the pre- and synrift sandstones of the Araripe Basin were studied at outcrop, macroscopic, and microscopic scales. Their hierarchical, kinematic, and spatial-geometric characteristics, as well as the deformational mechanisms acting during their structural evolution, were also established. In general, observation at the mesoscopic scale allowed deformation bands to be discriminated, as single strands or clusters, into three main sets: NNE-SSW dextral; NE-SW normal (sometimes with strike-slip offset); and E-W sinistral; in addition to bed-parallel deformation bands as a local set. The microscopic characterization allowed the shearing and cataclastic character of these structures to be recognized. Through the multi-scale study carried out in this work, we verified that the deformation bands analyzed developed preferentially when the sandstones were at an advanced stage of lithification. We also infer that the geometrical-spatial complexity of these bands, together with the presence of cataclastic matrix, can hinder the migration of fluids in reservoir rocks, resulting in their compartmentalization. Therefore, the study of deformation bands can aid research on the structural evolution of sedimentary basins, as well as help to understand the hydrodynamic behavior of reservoirs compartmentalized by these deformational structures.
Abstract:
The tectonic activity on the southern border of the Parnaíba Basin resulted in a wide range of brittle structures that affect siliciclastic sedimentary rocks. This tectonic activity and the related faults, joints, and folds are poorly known. The main aims of this study were (1) to identify lineaments using several remote-sensing systems, (2) to check how interpretation based on these systems at several scales influences the identification of lineaments, and (3) to contribute to the knowledge of brittle tectonics on the southern border of the Parnaíba Basin. The integration of orbital and aerial systems allowed a multi-scale identification, classification, and quantification of lineaments. Lineament maps were elaborated at the following scales: 1:200,000 (SRTM - Shuttle Radar Topography Mission), 1:50,000 (Landsat 7 ETM+ satellite), 1:10,000 (aerial photographs), and 1:5,000 (Quickbird satellite). The classification of the features with structural significance allowed the determination of four structural sets: NW, NS, NE, and EW. They were usually identified in all remote-sensing systems. The NE-trending set was not easily identified in aerial photographs but was better visualized on images from medium-resolution systems (SRTM and Landsat 7 ETM+). The same behavior characterizes the NW-trending set. The NS- and EW-trending sets were better identified on images from high-resolution systems (aerial photographs and Quickbird). The structural meaning of the lineaments was established after field work. The NE-trending set is associated with normal and strike-slip faults, including deformation bands. These are the oldest structures identified in the region and are related to the reactivation of Precambrian basement structures of the Transbrazilian Lineament. The NW-trending set represents strike-slip and subordinate normal faults. The high dispersion of this set suggests a more recent origin than the previous structures. The NW-trending set may be related to the Picos-Santa Inês Lineament. The NS- and EW-trending sets correspond to large joints (100 m to 5 km long). The truncation relationships between these joint sets indicate that the EW-trending set is older than the NS-trending set. The methodology developed in the present work is an excellent tool for understanding regional and local tectonic structures in the Parnaíba Basin. It helps in choosing the best remote-sensing system to identify brittle features in a poorly known sedimentary basin.
Abstract:
This paper presents a multi-cell single-phase high-power-factor boost rectifier in interleaved connection, operating in critical conduction mode, employing a soft-switching technique, and controlled by a Field Programmable Gate Array (FPGA). The soft-switching technique is based on zero-current-switching (ZCS) cells, providing ZC (zero-current) turn-on and ZCZV (zero-current-zero-voltage) turn-off for the active switches, and ZV (zero-voltage) turn-on and ZC (zero-current) turn-off for the boost diodes. The disadvantages related to the reverse-recovery effects of boost diodes operated in continuous conduction mode (additional losses and electromagnetic interference (EMI) problems) are minimized, due to the operation in critical conduction mode. In addition, due to the interleaving technique, the rectifier's features include a reduction in the input current ripple, a reduction in the output voltage ripple, the use of low-stress devices, low volume for the EMI input filter, high input power factor (PF), and low total harmonic distortion (THD) in the input current, in compliance with the IEC 61000-3-2 standards. The digital controller has been developed using a hardware description language (VHDL) and implemented on an XC2S200E (Spartan-IIE) Xilinx FPGA device, performing true critical-conduction-mode operation for all interleaved cells and providing closed-loop output voltage regulation, as in a pre-regulator rectifier. Experimental results are presented for a prototype implemented with two and with four interleaved cells, 400 V nominal output voltage and 220 V(rms) nominal input voltage, in order to verify the feasibility and performance of the proposed digital control through the use of an FPGA device.
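As a rough illustration of the closed-loop regulation described above (a sketch only; the actual controller is written in VHDL, and the gains, names, and sampling period here are hypothetical), a discrete PI outer voltage loop with evenly phase-shifted interleaved cells might look like this:

```python
# Sketch of an outer voltage-regulation loop for an interleaved boost PFC
# pre-regulator. Hypothetical gains and setpoint; illustration only.
N_CELLS = 4
V_REF = 400.0              # nominal output voltage [V]
KP, KI, DT = 0.05, 2.0, 100e-6

integral = 0.0

def voltage_loop(v_out: float) -> float:
    """Discrete PI controller: returns the command (e.g., switch on-time)
    shared by all interleaved cells."""
    global integral
    error = V_REF - v_out
    integral += error * DT
    return KP * error + KI * integral

# Interleaving: the cells' switching is phase-shifted by 360/N degrees,
# which cancels part of the input-current and output-voltage ripple.
phase_offsets = [cell * 360.0 / N_CELLS for cell in range(N_CELLS)]
print(phase_offsets)   # [0.0, 90.0, 180.0, 270.0]
```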
Abstract:
This paper presents a new approach for solving constraint optimization problems (COPs) based on the philosophy of lexicographic goal programming. A two-phase methodology that solves COPs with a multi-objective strategy is used. In the first phase, the objective function is completely disregarded and the entire search effort is directed towards finding a single feasible solution. In the second phase, the problem is treated as a bi-objective optimization problem, the two resulting objectives being the original objective function and the degree of constraint violation. For the first phase, a methodology based on the progressive hardening of soft constraints is proposed in order to find feasible solutions. The performance of the proposed methodology was tested on 11 well-known benchmark functions.
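A minimal sketch of the two-phase strategy on a toy problem (the problem, operators, and acceptance rule below are illustrative assumptions, not the paper's exact method):

```python
import random

def violation(x):
    """Constraint-violation degree for a toy constraint: x0 + x1 - 1 <= 0."""
    return max(0.0, x[0] + x[1] - 1.0)

def objective(x):
    return (x[0] - 2.0) ** 2 + (x[1] + 1.0) ** 2

def mutate(x, step=0.1):
    return [xi + random.gauss(0.0, step) for xi in x]

x = [random.uniform(-5, 5), random.uniform(-5, 5)]

# Phase 1: the objective is completely disregarded; the search only
# minimizes the violation degree until a feasible solution is found.
while violation(x) > 0.0:
    y = mutate(x)
    if violation(y) <= violation(x):
        x = y

# Phase 2: the problem is treated as bi-objective. Here a simple
# lexicographic acceptance rule (violation first, objective second)
# stands in for a full multi-objective ranking.
for _ in range(10_000):
    y = mutate(x)
    if (violation(y), objective(y)) <= (violation(x), objective(x)):
        x = y

print(x, objective(x), violation(x))
```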
Abstract:
Reconfigurable computing is one of the most recent research topics in computer science. The Altera Nios II soft-core processor can be included in a large set of reconfigurable architectures, especially because it is designed in software, allowing it to be configured according to the application. The recent growth in applications that demand reconfigurable computing has made it necessary to build compilers that translate high-level language source code into reconfigurable-device instruction sets. In this paper we present a compiler that takes as input the bytecodes generated by a Java front-end compiler and generates a set of instructions that conforms to the Nios II processor's instruction set rules. Our work shows how we process Java bytecodes into intermediate code, in the Nios II instruction format, and build the control-flow and control-dependence graphs. © 2009 IEEE.
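As an illustration of one step of such a compiler, the sketch below (in Python, with a simplified, hypothetical bytecode representation) finds basic-block leaders, the starting points of the nodes of a control-flow graph:

```python
# Basic-block leaders in a (simplified) bytecode listing: the first
# instruction, every branch target, and every instruction that follows
# a branch. The blocks between leaders become control-flow-graph nodes.
BRANCHES = {"goto", "ifeq", "ifne", "if_icmplt"}

def find_leaders(code):
    """code: list of (offset, opcode, operand) tuples, where operand is a
    branch-target offset for branch opcodes and None otherwise."""
    leaders = {code[0][0]}
    for i, (off, op, target) in enumerate(code):
        if op in BRANCHES:
            leaders.add(target)                # the branch target
            if i + 1 < len(code):
                leaders.add(code[i + 1][0])    # the fall-through successor
    return sorted(leaders)

bytecodes = [
    (0, "iload_1", None),
    (1, "ifeq", 6),        # conditional branch to offset 6
    (4, "iconst_1", None),
    (5, "goto", 7),
    (6, "iconst_0", None),
    (7, "ireturn", None),
]
print(find_leaders(bytecodes))   # [0, 4, 6, 7]
```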
Abstract:
Multi-relational data mining enables pattern mining from multiple tables. Existing multi-relational association-rule mining algorithms are not able to process large volumes of data, because the amount of memory required exceeds the amount available. The proposed algorithm, MR-Radix, presents a framework that promotes the optimization of memory usage. It also uses the concept of partitioning to handle large volumes of data. The original contribution of this proposal is to enable superior performance compared to other related algorithms and, moreover, to successfully conclude the task of mining association rules in large databases, bypassing the problem of available memory. One of the tests showed that MR-Radix uses fourteen times less memory than GFP-growth. © 2011 IEEE.
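The partitioning idea can be illustrated with a toy two-pass sketch (this is not MR-Radix itself; the helper names and the restriction to item pairs are simplifications):

```python
from collections import Counter
from itertools import combinations

def frequent_pairs(transactions, min_sup):
    """Mine one partition that fits in memory: count item pairs."""
    counts = Counter()
    for t in transactions:
        for pair in combinations(sorted(set(t)), 2):
            counts[pair] += 1
    return {p for p, c in counts.items() if c >= min_sup}

def mine_partitioned(partitions, min_sup_part, min_sup_global):
    # Pass 1: union of locally frequent pairs, i.e., the global candidates.
    candidates = set()
    for part in partitions:
        candidates |= frequent_pairs(part, min_sup_part)
    # Pass 2: one more scan to check the global support of candidates only,
    # so the full database never has to be held in memory at once.
    counts = Counter()
    for part in partitions:
        for t in part:
            items = set(t)
            for pair in candidates:
                if set(pair) <= items:
                    counts[pair] += 1
    return {p for p, c in counts.items() if c >= min_sup_global}

parts = [[["a", "b", "c"], ["a", "b"]], [["a", "b"], ["b", "c"]]]
print(mine_partitioned(parts, min_sup_part=2, min_sup_global=3))  # {('a', 'b')}
```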
Abstract:
The present paper proposes a new hybrid multi-population genetic algorithm (HMPGA) as an approach to solve the multi-level capacitated lot-sizing problem with backlogging. The method combines a multi-population-based metaheuristic with a fix-and-optimize heuristic and mathematical programming techniques. A total of four test sets from the MULTILSB (Multi-Item Lot-Sizing with Backlogging) library are solved and the results are compared with those reached by two other recently published methods. The results show that HMPGA performed better for most of the test sets solved, especially when longer computing time is given. © 2012 Elsevier Ltd.
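A skeletal sketch of the hybrid loop follows; it is an assumption-laden illustration (the fitness stand-in, operators, and migration scheme are hypothetical, and the fix-and-optimize step is a stub for the mathematical-programming subproblem):

```python
import random

POPS, POP_SIZE, GENS, MIGRATE_EVERY = 4, 20, 100, 10

def fitness(ind):
    # Stand-in cost (sphere function); the real problem evaluates
    # multi-level lot-sizing costs, including backlogging penalties.
    return sum(x * x for x in ind)

def fix_and_optimize(ind):
    # Stub: in HMPGA this step fixes most variables and re-optimizes a
    # small subset with a mathematical-programming solver.
    return [x * 0.9 for x in ind]

def evolve(pop):
    pop.sort(key=fitness)
    survivors = pop[: POP_SIZE // 2]
    children = [[xi + random.gauss(0, 0.1) for xi in random.choice(survivors)]
                for _ in range(POP_SIZE - len(survivors))]
    return survivors + children

populations = [[[random.uniform(-5, 5) for _ in range(6)]
                for _ in range(POP_SIZE)] for _ in range(POPS)]

for gen in range(GENS):
    populations = [evolve(p) for p in populations]
    if gen % MIGRATE_EVERY == 0:
        # Ring migration: each population receives the previous one's best.
        bests = [min(p, key=fitness) for p in populations]
        for i, p in enumerate(populations):
            p[-1] = bests[i - 1]
    # Periodically polish each population's best with fix-and-optimize.
    for p in populations:
        p[0] = min(p[0], fix_and_optimize(min(p, key=fitness)), key=fitness)

best = min((min(p, key=fitness) for p in populations), key=fitness)
print(best, fitness(best))
```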
Abstract:
This paper tackles a Nurse Scheduling Problem which consists of generating work schedules for a set of nurses while considering their shift preferences and other requirements. The objective is to maximize the satisfaction of nurses' preferences and minimize the violation of soft constraints. This paper presents a new deterministic heuristic algorithm, called MAPA (multi-assignment problem-based algorithm), which is based on successive resolutions of the assignment problem. The algorithm has two phases: a constructive phase and an improvement phase. The constructive phase builds a full schedule by solving successive assignment problems, one for each day in the planning period. The improvement phase uses a couple of procedures that re-solve assignment problems to produce a better schedule. Given the deterministic nature of this algorithm, the same schedule is obtained each time the algorithm is applied to the same problem instance. The performance of MAPA is benchmarked against published results for almost 250,000 instances from the NSPLib dataset. In most cases, particularly on large instances of the problem, the results produced by MAPA are better than the best-known solutions in the literature. The experiments reported here also show that MAPA finds more feasible solutions than other algorithms in the literature, which suggests that the proposed approach is effective and robust. © 2013 Springer Science+Business Media New York.
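A condensed sketch of the constructive phase, using SciPy's assignment-problem solver; the random cost matrix standing in for preference and soft-constraint penalties is a hypothetical stand-in:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)
N_NURSES, N_DAYS = 4, 7
SHIFTS = ["early", "late", "night", "off"]

schedule = {}
for day in range(N_DAYS):
    # cost[i][j]: penalty of assigning nurse i to shift j on this day,
    # built from preference violations and soft constraints (random here).
    cost = rng.integers(0, 10, size=(N_NURSES, len(SHIFTS)))
    nurses, shifts = linear_sum_assignment(cost)  # one assignment problem per day
    for n, s in zip(nurses, shifts):
        schedule[(n, day)] = SHIFTS[s]

# The improvement phase would now re-solve selected assignment problems
# with updated costs to repair or improve the constructed schedule.
print({k: v for k, v in schedule.items() if k[1] == 0})   # day-0 assignments
```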
Abstract:
Graduate Program in Biological Sciences (Zoology) - IBRC
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Given a large image set in which very few images have labels, how do we guess labels for the remaining majority? How do we spot images that need brand-new labels, different from the predefined ones? How do we summarize these data to route the user's attention to what really matters? Here we answer all these questions. Specifically, we propose QuMinS, a fast, scalable solution to two problems: (i) low-labor labeling (LLL) - given an image set in which very few images have labels, find the most appropriate labels for the rest; and (ii) mining and attention routing - in the same setting, find clusters, the top-N_O outlier images, and the N_R images that best represent the data. Experiments on satellite images spanning up to 2.25 GB show that, in contrast to state-of-the-art labeling techniques, QuMinS scales linearly with the data size, being up to 40 times faster than top competitors (GCap), while achieving better or equal accuracy; it spots images that potentially require unpredicted labels, and it works even with tiny initial label sets, i.e., of nearly five examples. We also report a case study of our method's practical usage to show that QuMinS is a viable tool for automatic coffee-crop detection from remote sensing images.
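The low-labor labeling setting can be pictured with a tiny label-propagation sketch over feature vectors (an illustration of the problem, not QuMinS's actual graph-based algorithm; data and labels are synthetic):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.random((100, 8))                 # image feature vectors
labels = -np.ones(100, dtype=int)        # -1 means "unlabeled"
labels[:5] = [0, 0, 1, 1, 2]             # a tiny initial label set

# Guess labels for the remaining majority: each unlabeled image takes
# the label of its nearest labeled neighbor in feature space.
labeled = np.flatnonzero(labels >= 0)
for i in np.flatnonzero(labels < 0):
    dist = np.linalg.norm(X[labeled] - X[i], axis=1)
    labels[i] = labels[labeled[np.argmin(dist)]]

print(np.bincount(labels))               # images per guessed label
```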
Abstract:
Membrane proteins play a major role in every living cell. They are the key factors in the cell's metabolism and in other functions, for example in cell-cell interaction, signal transduction, and transport of ions and nutrients. Cytochrome c oxidase (CcO), as one of the membrane proteins of the respiratory chain, plays a significant role in the energy transformation of higher organisms. CcO is a multi-centered heme protein, utilizing redox energy to actively transport protons across the mitochondrial membrane. One aim of this dissertation is to investigate single steps in the mechanism of the ion transfer process coupled to electron transfer, which are not fully understood. The protein-tethered bilayer lipid membrane is a general approach to immobilize membrane proteins in an oriented fashion on a planar electrode embedded in a biomimetic membrane. This system enables the combination of electrochemical techniques with surface-enhanced resonance Raman spectroscopy (SERRS), surface-enhanced infrared reflection absorption spectroscopy (SEIRAS), and surface plasmon spectroscopy to study protein-mediated electron and ion transport processes. The orientation of the enzymes within the surface-confined architecture can be controlled by specific site mutations, i.e., the insertion of a poly-histidine tag into different subunits of the enzyme. CcO can thus be oriented uniformly with its natural electron-pathway entry pointing either towards or away from the electrode surface. The first orientation allows an ultra-fast direct electron transfer (ET) into the protein, not provided by conventional systems, which can be leveraged to study intrinsic charge transfer processes. The second orientation permits studying the interaction with its natural electron donor, cytochrome c. Electrochemical and SERR measurements show conclusively that the redox site structure and the activity of the surface-confined enzyme are preserved. Therefore, this biomimetic system offers a unique platform to study the kinetics of ET processes in order to clarify mechanistic properties of the enzyme. Highly sensitive and ultra-fast electrochemical techniques allow the separation of ET steps between all four redox centres, including the determination of ET rates. Furthermore, proton transfer coupled to ET could be directly measured and discriminated from other ion transfer processes, revealing novel mechanistic information on the proton transfer mechanism of cytochrome c oxidase. In order to study the kinetics of ET inside the protein, including the catalytic center, time-resolved SEIRAS and SERRS measurements were performed to gain more insight into the structural and coordination changes of the heme environment. The electrical behaviour of tethered membrane systems and membrane-intrinsic proteins, as well as related charge transfer processes, were simulated by solving the respective sets of differential equations, using a software package called SPICE. This helps to understand charge transfer processes across membranes and to develop models that can help to elucidate the mechanisms of complex enzymatic processes.
Abstract:
Nowadays, the rise of non-recurring engineering (NRE) costs associated with complexity is becoming a major factor in SoC design, limiting both scaling opportunities and the flexibility advantages offered by the integration of complex computational units. The introduction of embedded programmable elements can represent an appealing solution, able both to guarantee the desired flexibility and upgradability and to widen the SoC market. In particular, embedded FPGA (eFPGA) cores can provide bit-level optimization for those applications which benefit from synthesis, paying on the other side in terms of performance penalties and area overhead with respect to standard-cell ASIC implementations. In this scenario, this thesis proposes a design methodology for a synthesizable programmable device designed to be embedded in a SoC. A soft-core embedded FPGA (eFPGA) is hence presented and analyzed in terms of the opportunities given by a fully synthesizable approach, following an implementation flow based on standard-cell methodology. A key point of the proposed eFPGA template is that it adopts a Multi-Stage Switching Network (MSSN) as the foundation of the programmable interconnects, since it can be efficiently synthesized and optimized through a standard-cell-based implementation flow, while ensuring an intrinsically congestion-free network topology. The evaluation of the flexibility potential of the eFPGA was performed using different technology libraries (STMicroelectronics CMOS 65nm and BCD9s 0.11μm) through a design-space exploration in terms of area-speed-leakage tradeoffs, enabled by the full synthesizability of the template. Since the most relevant disadvantage of the adopted soft approach, compared to a hard core, is a performance overhead increase, the eFPGA analysis targeted small area budgets. The generation of the configuration bitstream was achieved through the implementation of a custom CAD flow environment, which allowed functional verification and performance evaluation through an application-aware analysis.
Virtobot – a multi-functional robotic system for 3D surface scanning and automatic post mortem biopsy
Abstract:
The Virtopsy project, a multi-disciplinary project that involves forensic science, diagnostic imaging, computer science, automation technology, telematics, and biomechanics, aims to develop new techniques to improve the outcome of forensic investigations. This paper presents a new approach in the field of minimally invasive virtual autopsy: a versatile robotic system that is able to perform three-dimensional (3D) surface scans as well as post mortem image-guided soft tissue biopsies.