992 results for Extracting information
Abstract:
In recent years, the area of data mining has experienced considerable demand for technologies that extract knowledge from large and complex data sources. There has been substantial commercial interest, as well as active research, aimed at developing new and improved approaches for extracting information, relationships, and patterns from large datasets. Artificial neural networks (NNs) are popular biologically inspired intelligent methodologies whose classification, prediction, and pattern recognition capabilities have been used successfully in many areas, including science, engineering, medicine, business, banking, and telecommunications. This paper highlights, from a data mining perspective, the implementation of NNs using supervised and unsupervised learning for pattern recognition, classification, prediction, and cluster analysis, and focuses the discussion on their usage in bioinformatics and financial data analysis tasks. © 2012 Wiley Periodicals, Inc.
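To make the supervised-learning side of this concrete, here is a minimal sketch of training a single logistic neuron by gradient descent; the toy data, learning rate, and iteration count are illustrative assumptions, not taken from the paper.

```python
# Minimal supervised NN sketch: one logistic neuron, cross-entropy gradient descent.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))                  # toy two-class data
y = (X[:, 0] + X[:, 1] > 0).astype(float)      # linearly separable labels

w, b, lr = np.zeros(2), 0.0, 0.1
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))     # sigmoid activation
    grad = p - y                               # gradient of the cross-entropy loss
    w -= lr * X.T @ grad / len(y)
    b -= lr * grad.mean()

print("training accuracy:", ((p > 0.5) == y).mean())
```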
Abstract:
The need to analyze volatile memory from Macintosh computers running OS X has become increasingly significant, both because these computers have grown in popularity and because volatile memory analysis has become an increasingly important part of an IT forensic examiner's work. The reason volatile memory analysis has become more important is that it can reveal valuable information that is not stored permanently on the computer's internal hard drive. The problem underlying this thesis was the evident lack of examination methods for the volatile memory of Mac computers running OS X. The purpose of this work was therefore to investigate the possibility of extracting information from the volatile memory of a Mac computer running OS X by surveying and assessing different examination methods. To carry out this investigation, literature studies, informal interviews, the author's own expertise, and practical experiments were used. The conclusion is that the possibility of extracting information from the volatile memory of a Mac computer running OS X is relatively limited. The biggest problem is the memory dump itself: many of the available dumping methods require administrative privileges. When analyzing a memory dump, one should never rely on a single analysis method, since different methods yield different results that can be useful in a further examination of a Mac computer.
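One basic analysis step of the kind alluded to above is scanning a raw dump for printable strings; a minimal sketch follows. The file name "memdump.raw" is a hypothetical placeholder, and a real examination would combine this with dedicated forensic tools and, as the thesis stresses, more than one analysis method.

```python
# Scan a raw memory dump for runs of printable ASCII characters.
import re

MIN_LEN = 6  # shortest run of printable characters worth reporting

with open("memdump.raw", "rb") as f:   # hypothetical dump file
    data = f.read()

for match in re.finditer(rb"[\x20-\x7e]{%d,}" % MIN_LEN, data):
    print(match.start(), match.group().decode("ascii"))
```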
Abstract:
A growing awareness in modern society of the direct relationship between a growing global community with increasing industrial activity on the one hand, and various environmental problems and the limits of natural resources on the other, has set the stage for sustainable or "green" approaches within the supply chain. This paper therefore looks at the issue of "Green Logistics," which seeks to reduce the environmental impact of logistics activities by taking into account functions such as recycling, waste and carbon emission reduction, and the use of alternative sources of energy. To analyze how these approaches and ideas are perceived by the system as a whole, two models from the area of prospective and scenario planning are used and described to identify the main drivers and tendencies within the system and to formulate feasible hypotheses. The URCA/CHIVAS model allows us to identify, out of a large number of variables, the driver variables that best describe the system "Green Logistics." Together with an analysis of the actors' strategies in the system using the Mactor model, it is possible to reduce the complexity of a fully holistic system to a few key drivers that can be analyzed further. The implications of URCA/CHIVAS and Mactor are then used to formulate hypotheses about the perception of Green Logistics and its successful implementation among logistics decision makers, tested through an online survey. This research seeks to demonstrate the usefulness of scenario planning for a highly complex system, observing it from all angles and extracting information about its relevant factors. The results indicate that there are drivers far beyond the factory walls that need to be considered when successfully implementing a system such as Green Logistics.
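The driver-identification idea behind such structural analysis can be sketched generically: given a cross-impact matrix M, where M[i][j] rates the influence of variable i on variable j, row sums give influence and column sums give dependence, and high-influence, low-dependence variables are driver candidates. The matrix and variable names below are invented for illustration; this is not the URCA/CHIVAS data or algorithm from the study.

```python
# Generic influence/dependence sketch for a cross-impact matrix.
import numpy as np

variables = ["recycling", "carbon tax", "fuel price", "customer demand"]
M = np.array([[0, 1, 0, 2],     # M[i][j]: influence of variable i on j
              [3, 0, 2, 1],
              [2, 2, 0, 1],
              [1, 0, 1, 0]])

influence, dependence = M.sum(axis=1), M.sum(axis=0)
for name, inf, dep in zip(variables, influence, dependence):
    print(f"{name:16s} influence={inf} dependence={dep}")
```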
Abstract:
Neural networks and wavelet transforms have recently been seen as attractive tools for developing efficient solutions to many real-world function approximation problems. Function approximation is a very important task in environments where computation has to be based on extracting information from data samples of real-world processes, so sound mathematical models are essential to the development of the neural network area. In this article we introduce a series of mathematical demonstrations that guarantee the wavelet properties of the PPS functions. As an application, we show the use of PPS-wavelets in handwritten digit recognition problems through function approximation techniques.
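As a rough sketch of the general idea, assumed here rather than quoted from the paper's exact definition, a PPS (Polynomial Powers of Sigmoid) function is a polynomial whose terms are powers of the sigmoid:

\sigma(x) = \frac{1}{1 + e^{-x}}, \qquad \mathrm{PPS}_n(x) = \sum_{k=1}^{n} a_k \, \sigma(x)^k

The wavelet properties referred to above would then concern particular coefficient choices a_k that yield localized, zero-mean shapes.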
Abstract:
This work presents tVoice, software that processes tag languages, extracting information; as an integral part of the VoiceProxy system, it assists users with special needs in accessing the Web. The system is responsible for searching and processing documents on the Web, extracting the textual information contained in those documents and providing the capability of eventually generating, through translation techniques, an audio script used by the interface subsystem of VoiceProxy, iVoice, in the voice synthesis process. At this stage tVoice, besides handling the tag language HTML, processes two other document formats, PDF and XHTML. Additionally, to allow interface subsystems other than iVoice to use tVoice through remote access, we propose distributed-systems techniques based on the client-server model, providing document-processing operations in the manner of a proxy server.
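The core text-extraction step described here can be illustrated with the Python standard library; tVoice itself is not public here, so this is only a minimal sketch of pulling readable text out of tagged markup.

```python
# Extract plain text from HTML using only the standard library.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.chunks = []
    def handle_data(self, data):
        if data.strip():                 # keep only non-blank text nodes
            self.chunks.append(data.strip())

doc = "<html><body><h1>News</h1><p>Flood warnings issued.</p></body></html>"
parser = TextExtractor()
parser.feed(doc)
print(" ".join(parser.chunks))           # -> "News Flood warnings issued."
```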
Abstract:
Extensive systematizations of theoretical and experimental nuclear densities and of optical potential strengths extracted from heavy-ion elastic scattering data analyses at low and intermediate energies are presented. The energy dependence of the nuclear potential is accounted for within a model based on the nonlocal nature of the interaction. The systematics indicates that the heavy-ion nuclear potential can be described in a simple global way through a double-folding shape, which basically depends only on the density of nucleons of the partners in the collision. The possibility of extracting information about the nucleon-nucleon interaction from the heavy-ion potential is investigated.
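For reference, the double-folding shape mentioned above conventionally takes the form below, where ρ1 and ρ2 are the nucleon densities of the colliding nuclei and v_NN is the effective nucleon-nucleon interaction (standard notation assumed here, not quoted from the paper):

V_F(\vec{R}) = \int d\vec{r}_1 \int d\vec{r}_2 \; \rho_1(\vec{r}_1)\, \rho_2(\vec{r}_2)\, v_{NN}(\vec{R} + \vec{r}_2 - \vec{r}_1)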
Abstract:
We present a general formalism for extracting information on the fundamental parameters associated with neutrino masses and mixings from two or more long-baseline neutrino oscillation experiments. This formalism is then applied to the currently most likely experiments, using neutrino beams from the Japan Hadron Facility (JHF) and Fermilab's NuMI beamline. Different combinations of muon neutrino and muon anti-neutrino running are considered. The type of neutrino mass hierarchy is extracted using the effects of matter on neutrino propagation. Contrary to naive expectation, we find that running both beams with neutrinos is more suitable for determining the hierarchy, provided that the neutrino energy divided by baseline (E/L) for NuMI is smaller than or equal to that of JHF, whereas to determine the small mixing angle theta(13) and the CP- or T-violating phase delta, running one beam with neutrinos and the other with anti-neutrinos is most suitable. We make extensive use of bi-probability diagrams for both understanding and extracting the physics involved in such comparisons.
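As background for the probabilities entering such bi-probability diagrams, the standard two-flavor vacuum oscillation formula (a simplification; the paper works with full three-flavor matter effects) reads, for baseline L in km, energy E in GeV, and Δm² in eV²:

P(\nu_\mu \to \nu_e) \simeq \sin^2 2\theta \; \sin^2\!\left( \frac{1.27\, \Delta m^2 \, L}{E} \right)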
Abstract:
Function approximation is a very important task in environments where computation has to be based on extracting information from data samples of real-world processes. Neural networks and wavenets have recently been seen as attractive tools for developing efficient solutions to many real-world function approximation problems. In this paper, it is shown how feedforward neural networks can be built using a different type of activation function, referred to as the PPS-wavelet. An algorithm is presented to generate a family of PPS-wavelets that can be used to efficiently construct feedforward networks for function approximation.
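An illustrative sketch of the approximation setting, not the authors' exact algorithm: fit a target function by least squares over a basis of sigmoid powers, the building blocks of PPS-style activations. The target function and basis size below are assumptions chosen for demonstration.

```python
# Least-squares function approximation with a sigmoid-power basis.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x = np.linspace(-3, 3, 200)
target = np.sin(2 * x)                      # toy function to approximate

# Design matrix: columns are sigmoid(x)**k for k = 1..8
Phi = np.column_stack([sigmoid(x) ** k for k in range(1, 9)])
coef, *_ = np.linalg.lstsq(Phi, target, rcond=None)

approx = Phi @ coef
print("max abs error:", np.abs(approx - target).max())
```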
Abstract:
Graduate Program in Agronomy (Energy in Agriculture) - FCA
Abstract:
Function approximation is a very important task in environments where computation has to be based on extracting information from data samples of real-world processes, so the development of new mathematical models is essential to the evolution of the function approximation area. In this sense, we present the Polynomial Powers of Sigmoid (PPS) as a linear neural network. In this paper we introduce a series of practical results for the Polynomial Powers of Sigmoid, showing some advantages of using powers of sigmoid functions over traditional MLP-backpropagation networks and polynomials in function approximation problems.
Abstract:
We investigated the effects of texture gradient and of the position of a test stimulus relative to the horizon on the perception of relative size. Using the staircase method, 50 participants adjusted the size of a bar presented above, below, or on the horizon until it was perceived as the same size as a bar presented in the lower visual field. Stimuli were presented for 100 ms against five background conditions. The perspective gradient contributed more to the overestimation of relative size than the compression gradient, and the sizes of objects intercepting the horizon line were overestimated. The visual system was very effective in extracting information from perspective depth cues, doing so even during very brief exposures.
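A hedged sketch of the adaptive idea behind the staircase method follows: a simple 1-up/1-down rule with step halving at reversals, converging on the point of subjective equality. The study's exact rules, step sizes, and stimuli are not reproduced here, and a noisy simulated observer stands in for the human participant.

```python
# Simple 1-up/1-down staircase with a simulated observer.
import random

reference = 10.0          # bar whose perceived size is being matched
test = 14.0               # adjustable bar, starts clearly larger
step = 1.0
reversals, last_direction = [], None

random.seed(1)
for trial in range(40):
    # Simulated response: "test looks bigger" with some decision noise
    looks_bigger = (test - reference + random.gauss(0, 0.5)) > 0
    direction = -1 if looks_bigger else +1
    if last_direction is not None and direction != last_direction:
        reversals.append(test)
        step = max(step * 0.5, 0.1)      # shrink the step at each reversal
    test += direction * step
    last_direction = direction

print("point of subjective equality ~", sum(reversals) / len(reversals))
```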
Abstract:
The floods that occurred on the Aare and Rhine rivers in May 2015, and the mostly successful handling of this event in terms of flood protection measures, are a good reminder of how important it is to comprehend the causes and processes involved in such natural hazards. While the necessary data series of gauge measurements and peak discharge calculations reach back to the 19th century, historical records dating further back in time can provide additional, useful information for understanding extreme flood events and for evaluating prevention measures such as river dams and corrections undertaken prior to instrumental measurements. In my PhD project I will use a wide range of historical sources to assess and quantify past extreme flood events. It is part of the SNF-funded project “Reconstruction of the Genesis, Process and Impact of Major Pre-instrumental Flood Events of Major Swiss Rivers Including a Peak Discharge Quantification” and will cover the research locations Fribourg (Saane R.), Burgdorf (Emme R.), Thun, Bern (both Aare R.), and Lake Constance at the locations Lindau, Constance, and Rorschach. My main goals are to provide a long time series of quantitative data on extreme flood events, to discuss the changes occurring in these data, and to evaluate the impact of the aforementioned human influences on the drainage system. Extracting information from the account books of the towns of Basel and Solothurn may also enable me to assess the frequency and seasonality of less severe river floods. Finally, historical information will be used to remodel the historical hydrological regime, homogenizing the historical data series to modern-day conditions and thus making it comparable to the data provided by instrumental measurements. The method I will apply for processing all information provided by historical sources such as chronicles, newspapers, and institutional records, as well as flood marks, paintings, and archaeological evidence, was developed and successfully applied to the site of Basel by Wetter et al. (2011). They have also shown that data homogenization is possible by reconstructing previous stream-flow conditions using historical river profiles and by carefully observing and reconstructing human changes to the river bed and its surroundings. Taking all this information into account, peak discharges for past extreme flood events will be calculated with a one-dimensional hydrological model.
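To give a flavor of the hydraulic calculation behind such peak-discharge reconstructions, here is a minimal sketch using Manning's equation applied to a reconstructed cross-section. The channel geometry, roughness coefficient n, and slope below are invented placeholders; the project itself relies on a full one-dimensional hydrological model rather than this single formula.

```python
# Manning's equation: Q = (1/n) * A * R**(2/3) * S**(1/2), with R = A / P.
def manning_discharge(area_m2, wetted_perimeter_m, slope, n):
    """Discharge in m3/s from cross-section area, wetted perimeter,
    channel slope, and Manning roughness n."""
    r = area_m2 / wetted_perimeter_m          # hydraulic radius
    return (1.0 / n) * area_m2 * r ** (2.0 / 3.0) * slope ** 0.5

# Example: a flood stage giving a 400 m2 cross-section and 120 m wetted perimeter
print(manning_discharge(400.0, 120.0, slope=0.001, n=0.035))
```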