853 results for Tutorial on Computing


Relevance:

30.00%

Publisher:

Abstract:

A significant amount of information stored in databases around the world can be shared through peer-to-peer databases. This yields a large knowledge base without requiring large investments, because existing databases and infrastructure are reused. However, the structural characteristics of peer-to-peer networks make the process of locating such information complex. Moreover, these databases are often heterogeneous in their schemas, yet semantically similar in their content. A good peer-to-peer database system should allow the user to access information from databases scattered across the network and receive only the information genuinely related to the topic of interest. This paper proposes the use of ontologies in peer-to-peer database queries to represent the semantics inherent in the data. The main contributions of this work are enabling integration between heterogeneous databases, improving the performance of such queries, and applying the Ant Colony Optimization algorithm to the problem of locating information on peer-to-peer networks, which yields an 18% improvement in results. © 2011 IEEE.
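The abstract does not detail its Ant Colony Optimization variant, so the following is only a minimal sketch of the general idea it invokes: ants probabilistically walk from a querying peer toward the peer holding the target data, and edges on shorter successful routes accumulate more pheromone, biasing later ants toward them. The graph, function name, and parameters are all hypothetical.

```python
import random

def aco_route(graph, source, target, n_ants=50, n_iters=30,
              evaporation=0.5, deposit=1.0, seed=42):
    """Toy Ant Colony Optimization for locating a target peer.

    graph: dict mapping each peer to its list of neighbour peers.
    Edges on short successful walks receive more pheromone,
    biasing later ants toward them.
    """
    rng = random.Random(seed)
    pheromone = {(u, v): 1.0 for u in graph for v in graph[u]}

    for _ in range(n_iters):
        walks = []
        for _ in range(n_ants):
            node, path = source, [source]
            while node != target and len(path) < len(graph):
                nxt = [v for v in graph[node] if v not in path]
                if not nxt:
                    break
                weights = [pheromone[(node, v)] for v in nxt]
                node = rng.choices(nxt, weights=weights)[0]
                path.append(node)
            if node == target:
                walks.append(path)
        # Evaporate, then reinforce edges on successful walks;
        # shorter paths deposit more pheromone per edge.
        for edge in pheromone:
            pheromone[edge] *= (1.0 - evaporation)
        for path in walks:
            for u, v in zip(path, path[1:]):
                pheromone[(u, v)] += deposit / len(path)
    return pheromone

# Hypothetical overlay: A reaches D via B (3 hops) or via C-E (4 hops).
graph = {"A": ["B", "C"], "B": ["D"], "C": ["E"], "E": ["D"], "D": []}
```

After a few iterations the pheromone concentrates on the shorter route through B, which is the behaviour the paper exploits to steer queries.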

Relevance:

30.00%

Publisher:

Abstract:

Aiming to ensure greater reliability and consistency of the data stored in a database, the data cleaning stage is set early in the process of Knowledge Discovery in Databases (KDD) and is responsible for eliminating problems and adjusting the data for the later stages, especially data mining. Such problems occur at the instance and schema levels, namely missing values, null values, duplicate tuples, and out-of-domain values, among others. Several algorithms have been developed to perform the cleaning step in databases; some of them were designed specifically to work with the phonetics of words, since a word can be written in different ways. Within this perspective, this work presents as its original contribution an optimized algorithm for detecting duplicate tuples in databases through phonetics, based on multithreading, without the need for training data, as well as a language-independent environment to support it. © 2011 IEEE.
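The paper's own algorithm is not reproduced in the abstract; as a generic illustration of phonetic duplicate detection with multithreading, the sketch below groups names by a simplified Soundex code, fanning the independent encodings out over a thread pool. The names and parameters are hypothetical, and note that in CPython the GIL limits true CPU parallelism for work like this; the structure only illustrates the decomposition.

```python
from collections import defaultdict
from concurrent.futures import ThreadPoolExecutor

def soundex(word):
    """Simplified American Soundex: first letter plus up to three digits."""
    codes = {}
    for group, digit in [("BFPV", "1"), ("CGJKQSXZ", "2"), ("DT", "3"),
                         ("L", "4"), ("MN", "5"), ("R", "6")]:
        for ch in group:
            codes[ch] = digit
    word = word.upper()
    result = word[0]
    last = codes.get(word[0], "")
    for ch in word[1:]:
        code = codes.get(ch, "")
        if code and code != last:
            result += code
        if ch not in "HW":   # H and W do not separate equal codes
            last = code
    return (result + "000")[:4]

def find_phonetic_duplicates(names, n_threads=4):
    """Group names sharing a Soundex code; encoding each name is
    independent work, so it is spread across a thread pool."""
    with ThreadPoolExecutor(max_workers=n_threads) as pool:
        phonetic = list(pool.map(soundex, names))
    groups = defaultdict(list)
    for name, code in zip(names, phonetic):
        groups[code].append(name)
    return {c: g for c, g in groups.items() if len(g) > 1}
```

For example, "Robert"/"Rupert" and "Smith"/"Smyth" are flagged as phonetic duplicates even though they are spelled differently, which is exactly the class of problem the abstract targets.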

Relevance:

30.00%

Publisher:

Abstract:

The effect of snoring on the cardiovascular system is not well known. In this study we analyzed the Heart Rate Variability (HRV) differences between light and heavy snorers. The experiments were performed on full-night polysomnography (PSG) recordings with ECG and audio channels from a patient group (heavy snorers) and a control group (light snorers), gender- and age-matched, totalling 30 subjects. A Snoring Density (SND) feature of the audio signal was computed as the classification criterion, along with HRV features. A Mann-Whitney statistical test and Support Vector Machine (SVM) classification were performed to examine the correlation. The results of this study show that snoring has a close relationship with the HRV features. This result can provide deeper insight into the physiological understanding of snoring. © 2011 CCAL.
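The abstract gives no implementation details; as an illustration of the statistical test it names, a textbook Mann-Whitney U statistic can be computed from rank sums, with ties given average ranks (the SVM step is omitted, and the function name is an assumption):

```python
def mann_whitney_u(x, y):
    """Mann-Whitney U statistic from rank sums (ties get average ranks)."""
    pooled = sorted(list(x) + list(y))
    ranks = {}
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + 1 + j) / 2   # average of ranks i+1 .. j
        i = j
    r1 = sum(ranks[v] for v in x)
    u1 = r1 - len(x) * (len(x) + 1) / 2
    return min(u1, len(x) * len(y) - u1)
```

A small U (relative to its null distribution) indicates that the HRV feature separates the heavy- and light-snorer groups, which is how the paper uses the test.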

Relevance:

30.00%

Publisher:

Abstract:

Risk management, carried out effectively, leads software development to success and may influence the organization. Knowledge is part of such a process as a means to support decision making. This research aimed to analyze the use of Knowledge Management techniques in Risk Management for software development projects and their possible influence on enterprise revenue. Its main subjects of study were Brazilian incubated and graduated software development enterprises. The chosen research method was a survey. Multivariate statistical methods were used to treat and analyze the results, identifying the most significant factors, that is, the factors constraining the enterprise's achievement and those driving it. Among the latter we highlight the knowledge methodology, the enterprise's time of existence, the number of employees, and knowledge externalization. The results encourage actions that contribute to increasing financial revenue. © 2013 Springer-Verlag.

Relevance:

30.00%

Publisher:

Abstract:

Parametric VaR (Value-at-Risk) is widely used due to its simplicity and easy calculation. However, the normality assumption, often used in the estimation of the parametric VaR, does not provide satisfactory estimates for risk exposure. Therefore, this study suggests a method for computing the parametric VaR based on goodness-of-fit tests using the empirical distribution function (EDF) for extreme returns, and compares the feasibility of this method for the banking sector in an emerging market and in a developed one. The paper also discusses possible theoretical contributions in related fields like enterprise risk management (ERM). © 2013 Elsevier Ltd.
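As a sketch of the baseline the paper starts from, parametric VaR under the normality assumption can be computed from the sample mean and standard deviation, with the historical (empirical-quantile) estimate shown for contrast. The paper's actual EDF-based goodness-of-fit procedure for extreme returns is more elaborate and is not reproduced here; function names and the sample data are assumptions.

```python
import math
from statistics import NormalDist

def parametric_var(returns, alpha=0.05):
    """One-period parametric VaR under normality:
    VaR_alpha = -(mu + z_alpha * sigma), reported as a positive loss."""
    mu = sum(returns) / len(returns)
    var = sum((r - mu) ** 2 for r in returns) / (len(returns) - 1)
    sigma = math.sqrt(var)
    z = NormalDist().inv_cdf(alpha)   # e.g. about -1.645 for alpha = 0.05
    return -(mu + z * sigma)

def historical_var(returns, alpha=0.05):
    """Nonparametric alternative: the empirical alpha-quantile of losses."""
    s = sorted(returns)
    k = max(0, math.ceil(alpha * len(s)) - 1)
    return -s[k]
```

The gap between the two estimates on fat-tailed return series is precisely the shortcoming of the normality assumption that motivates the paper's EDF-based method.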

Relevance:

30.00%

Publisher:

Abstract:

Includes bibliography.

Relevance:

30.00%

Publisher:

Abstract:

- The “Implementation of the National Data Centre” project, Augusto Espín, Deputy Minister of Telecommunications and Information Society, Ecuador
- Cloud computing and public policy in Brazil, Rafael Henrique Rodrigues Moreira, Ministry of Science, Technology and Innovation, Brazil
- “The cloud is being taken up more quickly in Latin America than in the rest of the world”, interview with Lalo Steinmann, Microsoft
- The impact of education and research networks on the development of cloud computing, Eduardo Vera, University of Chile
- “The cloud helps to narrow divides by providing access to technology resources that used to be unaffordable”, interview with Luis Urzúa, Movistar Chile
- “Cloud computing will be a strategic sector of the economy in the coming years”, interview with Jean-Bernard Gramunt, France’s digital strategy
- “If take-up in Latin America is as strong as predicted, it will be a good opportunity to create and export technology”, interview with Flavio Junqueira, Yahoo! Labs

Relevance:

30.00%

Publisher:

Abstract:

Selecting appropriate methods for statistical analysis may seem complex, especially for graduate students and researchers at the start of their scientific careers. On the other hand, PowerPoint presentations are a common tool for students and researchers. Thus, a Biostatistics tutorial developed as a PowerPoint presentation could narrow the distance between orthodontists and Biostatistics. This guide provides useful, objective information on several statistical methods, using examples related to Dentistry and, more specifically, to Orthodontics. The tutorial is intended mainly to answer common questions about the most appropriate test for comparisons between groups, for examining correlations and regressions, or for analyzing method error. It also offers help in checking the distribution of the data (normal or non-normal) and in choosing the most suitable graph for presenting the results. The guide can likewise be very useful for journal reviewers who need to quickly assess the adequacy of the statistical methods in a submitted manuscript.

Relevance:

30.00%

Publisher:

Abstract:

In this tutorial we present a review of Euler deconvolution in three parts. In the first part, we recall the role of the classical formulation of 2D and 3D Euler deconvolution as a method for automatically locating sources of anomalous potential fields, and point out the difficulties of this formulation: the presence of an undesirable cloud of solutions; the empirical criterion used to determine the structural index (a parameter related to the nature of the anomalous source); the feasibility of applying Euler deconvolution to ground magnetic surveys; and the determination of the dip and magnetic susceptibility contrast of geological contacts (or the product of the susceptibility contrast and the thickness when applied to a thin dike). In the second part, we present recent improvements aimed at minimizing some of the difficulties discussed in the first part. These improvements include: i) the selection of solutions essentially associated with observations presenting a high signal-to-noise ratio; ii) the use of the correlation between the estimated anomaly base level and the observed anomaly itself, or the combination of Euler deconvolution with the analytic signal, to determine the structural index; iii) the combination of the results of (i) and (ii), allowing the structural index to be estimated independently of the number of solutions, so that a smaller number of observations (as in ground surveys) can be used; iv) the introduction of additional equations, independent of Euler's equation, that allow the dip and susceptibility contrast of 2D magnetic sources to be estimated. In the third part we offer a forecast of future short- and medium-term developments involving Euler deconvolution.
The main perspectives are: i) new attacks on the problems singled out in the second part of this tutorial; ii) the development of methods that take into account interference from sources located beside or above the main source; and iii) the use of the anomalous-source location estimates produced by Euler deconvolution as constraints in inversion methods to delineate the sources in a user-friendly computational environment.
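For reference, the classical formulation reviewed in the first part rests on Euler's homogeneity equation; in its standard 3D form (not spelled out in the abstract) it reads:

```latex
% Euler's homogeneity equation for a total-field anomaly T
% observed at (x, y, z), produced by a source at (x_0, y_0, z_0):
(x - x_0)\frac{\partial T}{\partial x}
+ (y - y_0)\frac{\partial T}{\partial y}
+ (z - z_0)\frac{\partial T}{\partial z}
= N\,(B - T)
```

Here B is the anomaly base level and N is the structural index discussed above; the deconvolution solves this equation in moving windows for the source position (x_0, y_0, z_0) and B.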

Relevance:

30.00%

Publisher:

Abstract:

In this article, the authors investigate, from an interdisciplinary perspective, possible ethical implications of the presence of ubiquitous computing systems in human perception/action. The term ubiquitous computing characterizes information-processing capacity from computers that are available everywhere and all the time, integrated into everyday objects and activities. The underlying theme of this paper is the contrast between traditional treatments of the ethical issues of ubiquitous computing and the Ecological Philosophy view of its possible consequences in the context of perception/action. The focus is on an analysis of how the generalized dissemination of microprocessors in embedded systems, commanded by a ubiquitous computing system, can affect the behaviour of people considered as embodied, embedded agents.

Relevance:

30.00%

Publisher:

Abstract:

Huge image collections have recently become available. In this scenario, Content-Based Image Retrieval (CBIR) systems have emerged as a promising approach to support image searches. The objective of CBIR systems is to retrieve the images in a collection most similar to a given query image, taking into account visual properties such as texture, color, and shape. In these systems, the effectiveness of the retrieval process depends heavily on the accuracy of the ranking approach. Recently, re-ranking approaches have been proposed to improve the effectiveness of CBIR systems by taking into account the relationships among images. Re-ranking approaches consider the relationships among all images in a given dataset, and therefore typically demand a huge amount of computational power, which hampers their use in practical situations. On the other hand, these methods can be massively parallelized. In this paper, we propose to speed up the computation of the RL-Sim algorithm, a recently proposed image re-ranking approach, by using the computational power of Graphics Processing Units (GPUs). GPUs are emerging as relatively inexpensive parallel processors that are becoming available on a wide range of computer systems. We address the performance challenges of image re-ranking by proposing a parallel solution designed to fit the computational model of GPUs. We conducted an experimental evaluation considering different implementations and devices. Experimental results demonstrate that significant performance gains can be obtained: our approach achieves speedups of 7x over a serial implementation for the overall algorithm, and of up to 36x on its core steps. © IEEE.
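The RL-Sim algorithm itself is not specified in the abstract; the sketch below only illustrates the general family of rank-list re-ranking methods it belongs to, in which two images are pulled closer together when their top-k ranked lists overlap. Every (i, j) update is independent of the others, which is the property that makes such methods amenable to GPUs; the function, parameters, and update rule here are hypothetical, and the code is serial.

```python
def rerank(dist, k=3, iterations=2):
    """Simplified contextual re-ranking in the spirit of rank-list methods:
    the distance between two images shrinks in proportion to the overlap
    of their top-k neighbour lists. Each (i, j) cell of the new distance
    matrix is computed independently, i.e. the loop nest is data-parallel."""
    n = len(dist)
    for _ in range(iterations):
        # top-k neighbour lists under the current distances
        tops = [set(sorted(range(n), key=lambda j: dist[i][j])[:k])
                for i in range(n)]
        new = [[0.0] * n for _ in range(n)]
        for i in range(n):
            for j in range(n):
                overlap = len(tops[i] & tops[j]) / k
                new[i][j] = dist[i][j] * (1.0 - 0.5 * overlap)
        dist = new
    return dist
```

On a GPU, the inner (i, j) loop nest would map one thread per matrix cell, which is the kind of decomposition the paper exploits for its core steps.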

Relevance:

30.00%

Publisher:

Abstract:

Technologies are developing rapidly, but some of those present in computers, such as processing capacity, are reaching their physical limits. It falls to quantum computation to offer solutions to these limitations and to the issues that may arise. In the field of information security, encryption is of paramount importance, motivating the development of quantum methods in place of classical ones, given the computational power offered by quantum computing. In the quantum world, physical states can be interrelated, a phenomenon called entanglement. This study presents both a theoretical essay on the foundations of quantum mechanics, computing, information, cryptography, and quantum entropy, and some simulations, implemented in the C language, of the effects of the entropy of entanglement of photons on a data transmission, using the Von Neumann entropy and the Tsallis entropy.
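The authors' simulations are in C and are not reproduced in the abstract. Purely as an illustration of the two entropies named: for a pure two-photon state α|00⟩ + β|11⟩, the reduced density matrix of either photon has eigenvalues |α|² and |β|², and both entropies follow directly from that spectrum (function names are assumptions):

```python
import math

def von_neumann_entropy(probs):
    """S = -sum p log2 p over the eigenvalues of the reduced density matrix."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def tsallis_entropy(probs, q):
    """S_q = (1 - sum p^q) / (q - 1); recovers the Von Neumann form as q -> 1."""
    return (1 - sum(p ** q for p in probs)) / (q - 1)

# Maximally entangled Bell state: alpha = beta = 1/sqrt(2),
# so the reduced eigenvalues are [0.5, 0.5].
bell = [0.5, 0.5]
```

A product (unentangled) state has spectrum [1.0] and zero entropy, while the Bell state above yields the maximum of one bit, which is the quantity such simulations track during transmission.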

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

An expression for the U(3) content of the matrix elements of one- and two-body operators in Elliott's basis is obtained. Three alternative ways of evaluating this content, with increasing performance in computing time, are presented. All of them allow an exact representation of that content in terms of integers, avoiding rounding errors in the computer codes. The role of dual bases in dealing with nonorthogonal bases is also clarified. © 1992 American Institute of Physics.
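The exact integer representation claimed by the abstract is specific to the U(3) expressions and is not shown there; the generic point, that exact rational arithmetic avoids the rounding errors of floating point, can be illustrated with Python's `Fraction`:

```python
from fractions import Fraction

# Floating point accumulates rounding error:
float_sum = sum(0.1 for _ in range(10))               # 0.9999999999999999

# Exact rational arithmetic, in the spirit of the abstract's
# integer representation, does not:
exact_sum = sum(Fraction(1, 10) for _ in range(10))   # Fraction(1, 1)
```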

Relevance:

30.00%

Publisher:

Abstract:

Biophysical and biochemical systems are often based on nanoscale phenomena in different host environments; because they involve many particles, they can often not be solved explicitly. Instead, a physicist, biologist, or chemist has to rely on approximate or numerical methods. For a certain type of system, called integrable, there exist particular mathematical structures and symmetries which permit an exact and explicit description. Most integrable systems we come across are low-dimensional, for instance a one-dimensional chain of coupled atoms in a DNA molecular system with a particular direction, or existing as a vector in the environment. This theoretical research paper presents one of the pioneering 'reaction-diffusion' aspects of the DNA-plasma material system, based on an integrable lattice model approach utilizing quantized functional algebras, in order to disseminate the new developments and initiate novel computational and design paradigms.
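The paper's quantized-functional-algebra model is not given in the abstract. As a generic illustration of a reaction-diffusion process on a one-dimensional lattice (not the authors' integrable model), a discrete chain with a logistic reaction term can be stepped forward with an explicit Euler scheme; all names and parameter values are hypothetical.

```python
def step(u, D=0.1, r=0.5, dt=0.1):
    """One explicit Euler step of a discrete reaction-diffusion chain:
        du_i/dt = D (u_{i-1} - 2 u_i + u_{i+1}) + r u_i (1 - u_i)
    with a logistic reaction and zero-flux (reflecting) boundaries."""
    n = len(u)
    new = []
    for i in range(n):
        left = u[i - 1] if i > 0 else u[i]
        right = u[i + 1] if i < n - 1 else u[i]
        lap = left - 2 * u[i] + right        # discrete Laplacian
        new.append(u[i] + dt * (D * lap + r * u[i] * (1 - u[i])))
    return new

# A localized seed spreads along the chain as a front toward u = 1.
u = [0.0] * 10
u[0] = 0.5
for _ in range(200):
    u = step(u)
```

The resulting travelling front is the simplest non-integrable cousin of the lattice dynamics the paper studies; the integrable treatment replaces this numerical scheme with exact algebraic structure.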