4 results for complex data

at Universidade Federal do Rio Grande do Norte (UFRN)


Relevância:

60.00%

Publicador:

Resumo:

Currently, one of the biggest challenges in the field of data mining is performing cluster analysis on complex data. Several techniques have been proposed but, in general, they achieve good results only within specific domains, providing no consensus on the best way to group this kind of data. These techniques typically fail because of non-realistic assumptions about the true probability distribution of the data. Motivated by this, this thesis proposes a new measure based on the Cross Information Potential that uses representative points of the dataset and statistics extracted directly from the data to measure the interaction between groups. The proposed approach exploits all the advantages of this information-theoretic descriptor while overcoming the limitations imposed by its nature. From this measure, two cost functions and three algorithms are proposed to perform cluster analysis. Because the use of Information Theory captures the relationship between different patterns regardless of assumptions about the nature of that relationship, the proposed approach achieved better performance than the main algorithms in the literature. These results hold both for synthetic data designed to test the algorithms in specific situations and for real data extracted from problems in different fields.
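The thesis's exact measure is not reproduced in the abstract, but the underlying quantity, the Cross Information Potential between two point sets, is commonly estimated with Parzen windows and Gaussian kernels. A minimal sketch under that assumption (the kernel width `sigma` is an illustrative choice, not a value from the thesis):

```python
import numpy as np

def cross_information_potential(X, Y, sigma=1.0):
    """Parzen-window estimate of the Cross Information Potential
    between point sets X (n1, d) and Y (n2, d): the mean of a
    Gaussian kernel over all cross pairs. Lower values indicate
    less interaction, i.e. better-separated groups."""
    # Pairwise squared Euclidean distances between every x in X and y in Y.
    diff = X[:, None, :] - Y[None, :, :]
    sq_dists = np.sum(diff ** 2, axis=-1)
    # Gaussian kernel of width sigma, averaged over all pairs.
    return np.mean(np.exp(-sq_dists / (2.0 * sigma ** 2)))

rng = np.random.default_rng(0)
near = cross_information_potential(rng.normal(0, 1, (50, 2)),
                                   rng.normal(0, 1, (50, 2)))
far = cross_information_potential(rng.normal(0, 1, (50, 2)),
                                  rng.normal(8, 1, (50, 2)))
print(near > far)  # overlapping clusters interact more than distant ones
```

A clustering cost function can then reward partitions that minimize this cross-group interaction; the thesis builds its cost functions on representative points rather than all pairs.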

Relevância:

60.00%

Publicador:

Resumo:

The advance of computer networks in recent decades is well known, whether in transmission rates, in the number of interconnected devices, or in the existing applications. Similar progress is visible in various sectors of automation, such as the industrial, commercial, and residential sectors. One of its branches is hospital networks, which can make use of a range of services, from the simple registration of patients to surgery performed by a robot under the supervision of a physician. At the intersection of these two worlds appear applications in Telemedicine and Telehealth, which involve the real-time transfer of high-resolution images, sound, video, and patient data. A problem then arises: computer networks, originally developed for the transfer of less complex data, are now being used by services that involve high transfer rates and demand quality-of-service (QoS) guarantees from the network. Thus, this work analyzes and compares the performance of a network subjected to this type of application in two situations: first without the use of QoS policies, and second with such policies applied, using as the test scenario the Metropolitan Health Network of the Federal University of Rio Grande do Norte (UFRN).
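The abstract does not say which QoS mechanism was applied; one common approach in IP networks is DiffServ, where traffic classes are marked with DSCP code points carried in the upper six bits of the IP TOS byte so routers can prioritize them. A minimal sketch, assuming a conventional class-to-DSCP mapping (the mapping and function names below are illustrative, not taken from the thesis):

```python
import socket

# Conventional DiffServ code points: EF (46) for real-time media such
# as telemedicine video, AF41 (34) for interactive imaging, 0 for best
# effort. Illustrative mapping only, not the thesis's configuration.
DSCP = {"realtime_video": 46, "medical_imaging": 34, "bulk": 0}

def tos_byte(dscp: int) -> int:
    """The DSCP occupies the upper six bits of the IP TOS byte."""
    return dscp << 2

def marked_udp_socket(traffic_class: str) -> socket.socket:
    """Create a UDP socket whose outgoing packets carry the DSCP mark
    for the given class, so QoS-enabled routers can prioritize them."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS,
                    tos_byte(DSCP[traffic_class]))
    return sock

sock = marked_udp_socket("realtime_video")
print(hex(sock.getsockopt(socket.IPPROTO_IP, socket.IP_TOS)))  # 0xb8
sock.close()
```

Marking alone changes nothing; the routers along the path must be configured with matching queueing policies, which is precisely the "with QoS" scenario the work evaluates.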

Relevância:

60.00%

Publicador:

Resumo:

Oil prospecting is one of the most complex and important activities of the oil industry. Direct prospecting methods, such as drilling well logs, are very expensive, so indirect methods are preferred. Among the indirect prospecting techniques, seismic imaging is a relevant method. The seismic method is based on artificial seismic waves that are generated, travel through the geologic medium undergoing diffraction and reflection, and return to the surface, where they are recorded and analyzed to construct seismograms. However, a seismogram contains not only actual geologic information but also noise, and one of the main components of this noise is the ground roll. Noise attenuation is essential for a good geologic interpretation of the seismogram. It is common to study seismograms using time-frequency transformations that map the seismic signal into a frequency space where it is easier to remove or attenuate noise. The data is then reconstructed in the original space in such a way that geologic structures are shown in more detail. The curvelet transform is a new and effective spectral transformation that has been used in the analysis of complex data. In this work, we employ the curvelet transform to represent geologic data using basis functions that are directional in space. This basis can represent two-dimensional objects with contours and lines more effectively. The curvelet analysis maps real space into frequency scales and angular sectors in such a way that we can distinguish in detail the sub-spaces where the noise lies and remove the coefficients corresponding to the undesired data. We develop and apply this denoising analysis to remove the ground roll from seismograms, applying the technique both to an artificial seismogram and to a real one. In both cases we obtain good noise attenuation.
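The curvelet transform itself requires specialized libraries, but the denoising scheme the text describes (transform, suppress the coefficients in the sub-spaces occupied by noise, transform back) can be illustrated with a plain Fourier transform and a hard threshold. A sketch under that simplification; the curvelet version additionally partitions the spectrum into scales and angular sectors, which is what lets it isolate directional noise like ground roll:

```python
import numpy as np

def transform_domain_denoise(signal, keep_fraction=0.05):
    """Illustrate transform-domain denoising: move to a frequency
    basis, zero all but the largest coefficients (where a coherent
    signal concentrates its energy), and transform back. Curvelet
    denoising follows the same pattern with a directional basis."""
    coeffs = np.fft.fft(signal)
    magnitudes = np.abs(coeffs)
    # Hard threshold: keep only the strongest coefficients.
    threshold = np.quantile(magnitudes, 1.0 - keep_fraction)
    coeffs[magnitudes < threshold] = 0.0
    return np.real(np.fft.ifft(coeffs))

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 512, endpoint=False)
clean = np.sin(2 * np.pi * 12 * t)             # stand-in for a seismic event
noisy = clean + 0.5 * rng.normal(size=t.size)  # additive noise stand-in
denoised = transform_domain_denoise(noisy)
print(np.linalg.norm(denoised - clean) < np.linalg.norm(noisy - clean))
```

Note the simplification: real ground roll is coherent, low-frequency, and directional rather than broadband, which is exactly why a basis with angular selectivity, such as curvelets, separates it better than a pure frequency threshold.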

Relevância:

60.00%

Publicador:

Resumo:

The use of increasingly complex software applications is demanding greater investment in the development of such systems to ensure applications of better quality. Therefore, new techniques are being used in Software Engineering to make the development process more effective. Among these new approaches, we highlight Formal Methods, which use formal languages that are strongly based on mathematics and have a well-defined syntax and semantics. One of these languages is Circus, which can be used to model concurrent systems. It was developed from the union of concepts from two other specification languages: Z, which specifies systems with complex data, and CSP, which is normally used to model concurrent systems. Circus has an associated refinement calculus, which can be used to develop software in a precise and stepwise fashion. Each step is justified by the application of a refinement law (possibly with the discharge of proof obligations). Sometimes, the same laws can be applied in the same manner in different developments, or even in different parts of a single development. A strategy to optimize this calculus is to formalise these applications as a refinement tactic, which can then be used as a single transformation rule. CRefine was developed to support the Circus refinement calculus. However, before the work presented here, it did not provide support for refinement tactics. The aim of this work is to provide tool support for refinement tactics. To that end, we develop a new module in CRefine, which automates the process of defining and applying refinement tactics formalised in the tactic language ArcAngelC. Finally, we validate the extension by applying the new module in a case study, which used refinement tactics in a refinement strategy for the verification of SPARK Ada implementations of control systems. In this work, we apply our module in the first two phases of this strategy.
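ArcAngelC itself is not executable here, but the core idea of a tactic language, combining individual law applications with combinators such as sequencing and alternation so that a whole strategy behaves as a single transformation rule, can be sketched generically. In the sketch below the string terms and toy "laws" are purely illustrative; real Circus laws act on a typed syntax tree and may generate proof obligations:

```python
from typing import Callable, Optional

# A "law" rewrites a term or returns None when it does not apply.
Law = Callable[[str], Optional[str]]

def seq(t1: Law, t2: Law) -> Law:
    """Sequencing: apply t1, then t2 to its result; fail if either fails."""
    def tactic(term):
        out = t1(term)
        return t2(out) if out is not None else None
    return tactic

def alt(t1: Law, t2: Law) -> Law:
    """Alternation: try t1, fall back to t2 if t1 fails."""
    def tactic(term):
        out = t1(term)
        return out if out is not None else t2(term)
    return tactic

# Two toy "refinement laws" over string terms (hypothetical examples).
unfold = lambda s: s.replace("SPEC", "x := e") if "SPEC" in s else None
skip_intro = lambda s: s + "; skip" if s else None

strategy = seq(alt(unfold, skip_intro), skip_intro)
print(strategy("SPEC"))  # x := e; skip
```

The payoff the abstract describes is exactly this composition: once a recurring sequence of law applications is packaged as one tactic, the tool can apply it in a single step instead of replaying each law by hand.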