845 results for Sign Data LMS algorithm.
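The query names the sign-data variant of the least-mean-squares (LMS) adaptive filter, which replaces the input vector in the weight update with its element-wise sign, so each tap update avoids multiplying by the data. Below is a minimal NumPy sketch of the update rule w ← w + μ·e(n)·sign(x(n)) on a toy system-identification problem; the filter length, step size, and test signal are illustrative choices only, not taken from any of the results listed here.

```python
import numpy as np

def sign_data_lms(x, d, num_taps, mu):
    """Sign-data LMS: the weight update uses sign(input) instead of the
    input itself, trading some convergence speed for cheaper updates."""
    w = np.zeros(num_taps)
    e = np.zeros(len(x))
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]  # newest sample first
        e[n] = d[n] - w @ u                  # a-priori estimation error
        w += mu * e[n] * np.sign(u)          # sign-data update rule
    return w, e

# Toy identification problem: recover an unknown 4-tap FIR filter.
rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
h = np.array([0.5, -0.3, 0.2, 0.1])
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
w, _ = sign_data_lms(x, d, num_taps=4, mu=0.002)
print(np.round(w, 2))  # should approach h
```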
Abstract:
A search is presented for physics beyond the standard model (BSM) in final states with a pair of opposite-sign isolated leptons accompanied by jets and missing transverse energy. The search uses LHC data recorded at a center-of-mass energy √s = 7 TeV with the CMS detector, corresponding to an integrated luminosity of approximately 5 fb⁻¹. Two complementary search strategies are employed. The first probes models with a specific dilepton production mechanism that leads to a characteristic kinematic edge in the dilepton mass distribution. The second strategy probes models of dilepton production with heavy, colored objects that decay to final states including invisible particles, leading to very large hadronic activity and missing transverse energy. No evidence for an event yield in excess of the standard model expectations is found. Upper limits on the BSM contributions to the signal regions are deduced from the results, which are used to exclude a region of the parameter space of the constrained minimal supersymmetric extension of the standard model. Additional information related to detector efficiencies and response is provided to allow testing specific models of BSM physics not considered in this Letter. © 2012 CERN.
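For context, the "characteristic kinematic edge" in such searches arises when a heavy state decays through an intermediate on-shell particle to two leptons plus an invisible particle; the endpoint of the dilepton invariant-mass spectrum then depends only on the three masses. A short sketch of that standard endpoint formula, evaluated for purely hypothetical masses (not values from the Letter):

```python
import math

def dilepton_edge(m_heavy, m_intermediate, m_invisible):
    """Endpoint of m(ll) for the chain heavy -> l + intermediate,
    intermediate -> l + invisible (all two-body, on-shell)."""
    return math.sqrt((m_heavy**2 - m_intermediate**2)
                     * (m_intermediate**2 - m_invisible**2)) / m_intermediate

# Hypothetical mass spectrum in GeV, for illustration only.
print(round(dilepton_edge(180.0, 130.0, 100.0), 1))  # -> 79.6 GeV
```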
Abstract:
In this paper, a search for supersymmetry (SUSY) is presented in events with two opposite-sign isolated leptons in the final state, accompanied by hadronic jets and missing transverse energy. An artificial neural network is employed to discriminate possible SUSY signals from a standard model background. The analysis uses a data sample collected with the CMS detector during the 2011 LHC run, corresponding to an integrated luminosity of 4.98 fb⁻¹ of proton-proton collisions at the center-of-mass energy of 7 TeV. Compared to other CMS analyses, this one uses relaxed criteria on missing transverse energy (E̸T > 40 GeV) and total hadronic transverse energy (HT > 120 GeV), thus probing different regions of parameter space. Agreement is found between standard model expectation and observations, yielding limits in the context of the constrained minimal supersymmetric standard model and on a set of simplified models. © 2013 CERN.
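As a generic illustration of the technique only (not the analysis's actual network, inputs, or event selection), a small scikit-learn multilayer perceptron can separate two synthetic event classes whose kinematic-like features overlap; every distribution and name below is invented:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
# Toy stand-ins for three kinematic inputs; signal-like events are
# shifted to higher values than background, with overlap.
n = 4000
background = rng.normal(loc=[60.0, 150.0, 70.0], scale=25.0, size=(n, 3))
signal = rng.normal(loc=[110.0, 260.0, 95.0], scale=30.0, size=(n, 3))
X = np.vstack([background, signal])
y = np.concatenate([np.zeros(n), np.ones(n)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=500,
                    random_state=0).fit(X_tr, y_tr)
print(f"test accuracy: {clf.score(X_te, y_te):.2f}")
```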
Abstract:
Wireless Sensor Networks (WSNs) can be used to monitor hazardous and inaccessible areas. In these situations, the power supply (e.g. battery) of each node cannot be easily replaced. One solution to deal with the limited capacity of current power supplies is to deploy a large number of sensor nodes, since the lifetime and dependability of the network will increase through cooperation among nodes. Applications on WSNs may also have other concerns, such as meeting temporal deadlines on message transmissions and maximizing the quality of information. Data fusion is a well-known technique that can be useful for the enhancement of data quality and for the maximization of WSN lifetime. In this paper, we propose an approach that allows the implementation of parallel data fusion techniques in IEEE 802.15.4 networks. One of the main advantages of the proposed approach is that it enables a trade-off between different user-defined metrics through the use of a genetic machine learning algorithm. Simulations and field experiments performed in different communication scenarios highlight significant improvements when compared with, for instance, the Gur Game approach or the implementation of conventional periodic communication techniques over IEEE 802.15.4 networks. © 2013 Elsevier B.V. All rights reserved.
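The paper's genetic machine learning component is not reproduced here, but the general idea of evolving a configuration against user-defined, weighted metrics can be sketched with a plain genetic algorithm. The chromosome encoding (which nodes report), the toy quality/energy objective, and all weights below are assumptions for illustration:

```python
import random

random.seed(42)
N_NODES, POP_SIZE, GENERATIONS = 20, 40, 80
W_QUALITY, W_ENERGY = 1.0, 0.6  # user-defined trade-off weights

def fitness(chrom):
    # Toy objective: quality of fused information rises with
    # diminishing returns in the number of reporting nodes, while
    # energy cost rises linearly; the optimum balances the two.
    active = sum(chrom)
    quality = (active / N_NODES) ** 0.5
    energy = active / N_NODES
    return W_QUALITY * quality - W_ENERGY * energy

def tournament(pop):
    a, b = random.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):
    cut = random.randrange(1, N_NODES)      # one-point crossover
    return p1[:cut] + p2[cut:]

def mutate(chrom, rate=0.05):
    return [bit ^ (random.random() < rate) for bit in chrom]

pop = [[random.randint(0, 1) for _ in range(N_NODES)]
       for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    pop = [mutate(crossover(tournament(pop), tournament(pop)))
           for _ in range(POP_SIZE)]
best = max(pop, key=fitness)
print(sum(best), round(fitness(best), 3))  # nodes kept active, fitness
```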
Abstract:
This paper proposes a simulated annealing method for identifying building roof contours from a LiDAR-derived digital elevation model. Our method is based on the concept of first extracting aboveground objects and then identifying those objects that are building roof contours. First, to detect aboveground objects (buildings, trees, etc.), the digital elevation model is segmented through a recursive splitting technique followed by a region merging process. Vectorization and polygonization are used to obtain polyline representations of the detected aboveground objects. Second, building roof contours are identified from among the aboveground objects by optimizing a Markov-random-field-based energy function that embodies roof contour attributes and spatial constraints. The solution of this function is a polygon set corresponding to building roof contours and is found by using a minimization technique, such as the Simulated Annealing algorithm. Experiments carried out with a laser scanning digital elevation model showed that the methodology works properly, as it provides roof contour information with approximately 90% shape accuracy and no verified false positives.
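A minimal sketch of the optimization step only: Metropolis-style simulated annealing over binary labels ("roof" or not) with a toy energy that combines per-object attribute terms and a pairwise spatial-consistency term. The features, weights, and cooling schedule are illustrative stand-ins, not the paper's energy function:

```python
import math, random

random.seed(7)
# Toy candidate objects: (rectangularity in [0, 1], area in m^2).
objects = [(0.95, 120), (0.40, 15), (0.88, 200), (0.30, 8), (0.91, 90)]
neighbors = [(0, 2), (2, 4)]  # spatially adjacent candidate pairs

def energy(labels):
    # Unary term: labelling an object "roof" (1) is cheap when it is
    # rectangular and large; the pairwise term favours consistent
    # labels for adjacent objects.
    e = 0.0
    for lab, (rect, area) in zip(labels, objects):
        score = rect + min(area, 100) / 100.0   # in [0, 2]
        e += (2.0 - score) if lab == 1 else score
    for i, j in neighbors:
        e += 0.5 if labels[i] != labels[j] else 0.0
    return e

labels = [random.randint(0, 1) for _ in objects]
T = 2.0
while T > 0.01:
    cand = labels[:]
    cand[random.randrange(len(cand))] ^= 1     # flip one label
    dE = energy(cand) - energy(labels)
    if dE < 0 or random.random() < math.exp(-dE / T):
        labels = cand                          # Metropolis acceptance
    T *= 0.995                                 # geometric cooling
print(labels)  # 1 = identified as a building roof contour
```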
Abstract:
Feature selection aims to find the most important information to save computational effort and data storage. We formulated this task as a combinatorial optimization problem, since the exponential growth of possible solutions makes an exhaustive search infeasible. In this work, we propose a new nature-inspired feature selection technique based on bat behavior, namely, the binary bat algorithm. The wrapper approach combines the exploration power of the bats with the speed of the optimum-path forest classifier to find a better data representation. Experiments on public datasets have shown that the proposed technique can indeed improve the effectiveness of the optimum-path forest and outperform some well-known swarm-based techniques. © 2013 Elsevier Inc. All rights reserved.
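A simplified sketch of the binary bat idea: real-valued velocities are pushed toward the best solution found so far, and a sigmoid transfer function turns each velocity component into a bit-selection probability. Loudness and pulse-rate updates are omitted for brevity, and the wrapper fitness below is a toy stand-in (a real wrapper would train the optimum-path forest classifier on each candidate subset):

```python
import math, random

random.seed(3)
N_FEATURES, N_BATS, ITERS = 12, 15, 100
RELEVANT = {0, 3, 7}   # ground truth for the toy fitness below

def fitness(mask):
    # Toy wrapper stand-in: reward selecting relevant features,
    # penalise carrying extra ones.
    hits = len(RELEVANT & {i for i, b in enumerate(mask) if b})
    return hits - 0.1 * sum(mask)

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

pos = [[random.randint(0, 1) for _ in range(N_FEATURES)]
       for _ in range(N_BATS)]
vel = [[0.0] * N_FEATURES for _ in range(N_BATS)]
best = max(pos, key=fitness)[:]
for _ in range(ITERS):
    for i in range(N_BATS):
        f = random.uniform(0.0, 2.0)          # bat's emitted frequency
        for j in range(N_FEATURES):
            vel[i][j] += (pos[i][j] - best[j]) * f
            # Transfer function maps the real-valued velocity to a
            # bit probability -- the "binary" part of the algorithm.
            pos[i][j] = 1 if random.random() < sigmoid(vel[i][j]) else 0
        if fitness(pos[i]) > fitness(best):
            best = pos[i][:]
print(best, round(fitness(best), 2))
```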
Abstract:
This paper presents software, developed in the Delphi programming language, to compute a reservoir's annual regulated active storage based on the sequent-peak algorithm. Mathematical models used for that purpose generally require extended hydrological series, and the analysis of those series is usually performed with spreadsheets or graphical representations. On that basis, software for calculating reservoir active capacity was developed. An example calculation is shown using 30 years (1977 to 2009) of monthly mean flow data from the Corrente River, in the São Francisco River Basin, Brazil. As an additional tool, an interface was developed to support water resources management, helping to manipulate data and highlight information of interest to the user. Moreover, with that interface, irrigation districts with high water consumption can be analyzed as a function of specific seasonal water demand situations. A practical application shows that the program performs the calculation originally proposed. It was designed to keep information organized and retrievable at any time and to simulate seasonal water demands throughout the year, contributing to studies concerning reservoir projects. With this functionality, the program is an important tool for decision making in water resources management.
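The sequent-peak computation itself is compact. Here is a sketch in Python with hypothetical monthly volumes (the program described above is in Delphi, and this is not its code); the single pass below assumes the critical drawdown does not wrap around the record boundary, whereas the classical formulation cycles the series twice:

```python
def sequent_peak(inflows, demands):
    """Sequent-peak estimate of required active storage: track the
    running cumulative deficit (demand minus inflow), reset it at
    zero, and return its maximum over the record."""
    deficit, capacity = 0.0, 0.0
    for q_in, q_dem in zip(inflows, demands):
        deficit = max(0.0, deficit + q_dem - q_in)
        capacity = max(capacity, deficit)
    return capacity

# Hypothetical monthly volumes (hm^3): the dry spell mid-record
# dictates the required storage.
inflows = [50, 42, 35, 20, 12, 8, 6, 10, 18, 30, 44, 52]
demands = [25] * 12
print(sequent_peak(inflows, demands))  # -> 76.0
```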
Abstract:
The Kalman-Bucy method is here analyzed and applied to the solution of a specific filtering problem to increase the signal message/noise ratio. The method is a time-domain treatment of a geophysical process classified as stochastic and non-stationary. The derivation of the estimator is based on the relationship between the Kalman-Bucy and Wiener approaches for linear systems. In the present work we emphasize the criterion used, the model with a priori information, the algorithm, and the quality of the results. The examples are for the ideal well-log response, and the results indicate that this method can be used on a variety of geophysical data treatments; its study clearly offers proper insight into the modeling and processing of geophysical problems.
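Kalman-Bucy is the continuous-time formulation; as a rough discrete-time analogue of the idea, here is a scalar random-walk Kalman filter smoothing a noisy piecewise-constant trace standing in for an ideal well-log response. The noise variances and the synthetic signal are assumptions, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(2)
# Toy piecewise-constant "well-log" signal buried in noise.
truth = np.concatenate([np.full(80, 1.0), np.full(80, 3.0), np.full(80, 2.0)])
z = truth + 0.8 * rng.standard_normal(truth.size)  # noisy measurements

# Scalar random-walk state model: x_k = x_{k-1} + w_k,  z_k = x_k + v_k.
Q, R = 0.01, 0.64      # process and measurement noise variances (assumed)
x, P = 0.0, 1.0        # initial state estimate and its variance
est = []
for zk in z:
    P += Q                      # predict: variance grows by Q
    K = P / (P + R)             # Kalman gain
    x += K * (zk - x)           # update with the innovation
    P *= (1.0 - K)              # posterior variance
    est.append(x)

# Mean squared error after the filter settles:
print(round(float(np.mean((np.array(est[40:]) - truth[40:]) ** 2)), 3))
```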
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
We present an implementation of the F-statistic to carry out the first search in data from the Virgo laser interferometric gravitational wave detector for periodic gravitational waves from a priori unknown, isolated rotating neutron stars. We searched a frequency f₀ range from 100 Hz to 1 kHz and a frequency-dependent spindown f₁ range from −1.6(f₀/100 Hz) × 10⁻⁹ Hz s⁻¹ to zero. A large part of this frequency-spindown space was unexplored by any of the all-sky searches published so far. Our method consisted of a coherent search over two-day periods using the F-statistic, followed by a search for coincidences among the candidates from the two-day segments. We have introduced a number of novel techniques and algorithms that allow the use of the fast Fourier transform (FFT) algorithm in the coherent part of the search, resulting in a fifty-fold speed-up in computation of the F-statistic with respect to the algorithm used in the other pipelines. No significant gravitational wave signal was found. The sensitivity of the search was estimated by injecting signals into the data. In the most sensitive parts of the detector band, more than 90% of signals would have been detected with dimensionless gravitational-wave amplitude greater than 5 × 10⁻²⁴.
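The F-statistic itself involves detector antenna patterns and Doppler demodulation and is not reproduced here; the caricature below only illustrates why the FFT matters (one transform evaluates every frequency bin of a coherent segment at once) and how a coincidence step between segments suppresses noise outliers. Sample rate, injected signal, and thresholds are all invented:

```python
import numpy as np

rng = np.random.default_rng(5)
fs, T = 256.0, 64.0                  # sample rate (Hz), segment length (s)
t = np.arange(0, T, 1.0 / fs)
f_signal = 61.5                      # injected periodic signal (made up)

def segment():
    return 0.1 * np.sin(2 * np.pi * f_signal * t) + rng.standard_normal(t.size)

def candidates(x, n_sigma=4.0):
    power = np.abs(np.fft.rfft(x)) ** 2   # one FFT gives all bins at once
    freqs = np.fft.rfftfreq(x.size, 1.0 / fs)
    thresh = power.mean() + n_sigma * power.std()
    return {round(float(f), 2) for f in freqs[power > thresh]}

# Coincidence step: keep only frequencies flagged in both segments.
coincident = candidates(segment()) & candidates(segment())
print(sorted(coincident))  # expected: the injected frequency, 61.5
```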
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Concept drift, which refers to nonstationary learning problems over time, has increasing importance in machine learning and data mining. Many concept drift applications require fast response, which means an algorithm must always be (re)trained with the latest available data. But the process of data labeling is usually expensive and/or time consuming compared to the acquisition of unlabeled data, so usually only a small fraction of the incoming data can be effectively labeled. Semi-supervised learning methods may help in this scenario, as they use both labeled and unlabeled data in the training process. However, most of them are based on the assumption that the data is static. Therefore, semi-supervised learning with concept drift is still an open and challenging task in machine learning. Recently, a particle competition and cooperation approach was developed to realize graph-based semi-supervised learning from static data. We have extended that approach to handle data streams and concept drift. The result is a passive algorithm using a single classifier, naturally adapted to concept changes without any explicit drift detection mechanism. It has built-in mechanisms that provide a natural way of learning from new data, gradually "forgetting" older knowledge as older data items become no longer useful for the classification of newer data items. The proposed algorithm is applied to the KDD Cup 1999 network intrusion data, showing its effectiveness.
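The particle competition and cooperation model is not reproduced here; the sketch below shows only the passive-adaptation ingredients the abstract describes, using a much simpler prototype-based classifier: exponential forgetting of old evidence, full-weight updates from the sparse labeled items, and down-weighted self-training updates from the unlabeled stream. All rates and the drifting toy stream are assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)
DECAY, UNLABELED_W = 0.995, 0.2  # forgetting rate, weight of self-labeled data

centroids = {0: np.zeros(2), 1: np.zeros(2)}
weights = {0: 1e-9, 1: 1e-9}

def predict(x):
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

def update(x, label, w):
    # Exponential forgetting: old evidence decays, new points move the
    # class prototype, so the model tracks a drifting stream passively.
    for c in centroids:
        weights[c] *= DECAY
    centroids[label] = ((centroids[label] * weights[label] + w * x)
                        / (weights[label] + w))
    weights[label] += w

# Drifting two-class stream: the class means rotate slowly over time.
correct = total = 0
for i in range(5000):
    angle = 2 * np.pi * i / 5000
    c = int(rng.integers(2))
    mean = (3 * np.cos(angle + c * np.pi), 3 * np.sin(angle + c * np.pi))
    x = np.array(mean) + 0.5 * rng.standard_normal(2)
    if i % 10 == 0:
        update(x, c, 1.0)                # 10% of the stream is labeled
    else:
        pred = predict(x)
        correct += int(pred == c)
        total += 1
        update(x, pred, UNLABELED_W)     # self-training on unlabeled data
print(f"accuracy on unlabeled items: {correct / total:.2f}")
```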
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
This project aims to develop methods for data classification in a Data Warehouse for decision-making purposes. Another goal is the reduction of an attribute set in a Data Warehouse, such that the reduced set keeps the same properties as the original one. Once we achieve a reduced set, we obtain a smaller computational cost of processing, we are able to identify attributes that are irrelevant to certain kinds of situations, and we are able to recognize patterns in the database that will help us make decisions. To achieve these objectives, the Rough Sets algorithm will be implemented. We chose PostgreSQL as our database management system due to its efficiency and maturity, and because it is an open-source (freely distributed) system.
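A minimal sketch of the attribute-reduction idea from rough set theory: a subset of condition attributes is a reduct if records that agree on it never disagree on the decision class (i.e. the positive region is preserved). The decision table below is a toy example, unrelated to the project's Data Warehouse or its PostgreSQL schema:

```python
from itertools import combinations

# Toy decision table: rows hold condition attributes A, B, C plus a
# decision class D (all values hypothetical).
rows = [
    # A  B  C   D
    (0, 1, 0, 'yes'),
    (0, 1, 1, 'yes'),
    (1, 0, 0, 'no'),
    (1, 1, 0, 'no'),
    (0, 0, 1, 'yes'),
    (1, 0, 1, 'no'),
]
ATTRS = (0, 1, 2)

def consistent(attr_subset):
    """True if records agreeing on attr_subset never disagree on the
    decision, i.e. the subset preserves the positive region."""
    seen = {}
    for r in rows:
        key = tuple(r[a] for a in attr_subset)
        if seen.setdefault(key, r[3]) != r[3]:
            return False
    return True

# A reduct: a smallest subset as discriminating as all of ATTRS.
for size in range(1, len(ATTRS) + 1):
    reducts = [s for s in combinations(ATTRS, size) if consistent(s)]
    if reducts:
        print(reducts)  # -> [(0,)]: attribute A alone decides this table
        break
```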
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)