987 results for [INFO] Computer Science [cs]
Abstract:
We study the relationship between topological scales and dynamic time scales in complex networks. The analysis is based on the full dynamics towards synchronization of a system of coupled oscillators. In the synchronization process, modular structures corresponding to well-defined communities of nodes emerge in different time scales, ordered in a hierarchical way. The analysis also provides a useful connection between synchronization dynamics, complex networks topology, and spectral graph analysis.
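The emergence of community-level synchronization ahead of global synchronization can be sketched with a toy simulation (an illustrative assumption, not the authors' model or code): identical-frequency Kuramoto oscillators on a hypothetical network of two five-node cliques joined by a single bridge edge. The local order parameter of each community rises before the global one, reflecting the correspondence between topological scales and dynamic time scales.

```python
import math, random

# Toy network: two 5-node cliques (the "communities") plus one bridge
# edge -- an illustrative assumption, not the networks from the paper.
random.seed(0)
N = 10
communities = [list(range(5)), list(range(5, 10))]
edges = {(i, j) for c in communities for i in c for j in c if i < j}
edges.add((0, 5))                      # weak bridge between communities

theta = [random.uniform(0, 2 * math.pi) for _ in range(N)]

def order(nodes):
    """Kuramoto order parameter r of a set of phases (1 = full sync)."""
    re = sum(math.cos(theta[i]) for i in nodes) / len(nodes)
    im = sum(math.sin(theta[i]) for i in nodes) / len(nodes)
    return math.hypot(re, im)

# Euler-integrate identical-frequency Kuramoto dynamics to t = 5:
# dtheta_i/dt = K * sum_j A_ij sin(theta_j - theta_i)
dt, K = 0.01, 1.0
for _ in range(500):
    force = [0.0] * N
    for i, j in edges:
        s = math.sin(theta[j] - theta[i])
        force[i] += K * s
        force[j] -= K * s
    theta = [t + dt * f for t, f in zip(theta, force)]

r_local = sum(order(c) for c in communities) / 2   # mean community sync
r_global = order(range(N))                         # whole-network sync
print(f"local r = {r_local:.3f}, global r = {r_global:.3f}")
```

By the triangle inequality the global order parameter can never exceed the community average, and at intermediate times the densely connected cliques are already synchronized internally while the two communities still drift relative to each other.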
Abstract:
A recent method used to optimize biased neural networks with low levels of activity is applied to a hierarchical model. As a consequence, the performance of the system is strongly enhanced. The steps to achieve optimization are analyzed in detail.
Abstract:
Remote sensing image processing is nowadays a mature research area. The techniques developed in the field enable many real-life applications of great societal value: for instance, urban monitoring, fire detection or flood prediction can have a great impact on economic and environmental issues. To attain these objectives, remote sensing has become a multidisciplinary field of science that embraces physics, signal theory, computer science, electronics, and communications. From a machine learning and signal/image processing point of view, all of these applications are tackled under specific formalisms, such as classification and clustering, regression and function approximation, image coding, restoration and enhancement, source unmixing, data fusion, or feature selection and extraction. This paper serves as a survey of methods and applications, and reviews the latest methodological advances in remote sensing image processing.
Abstract:
This final-year project for the Ingeniería en Informática de Gestión degree consists of the design and development of a system for controlling devices connected to a Raspberry Pi and managed from an Android device. Different actions can be carried out, such as switching the heating on and off, scheduling it, and remotely checking the current temperature, either from a device connected to the home Wi-Fi network or from any network with Internet access.
Abstract:
Finding an adequate paraphrase representation formalism is a challenging issue in Natural Language Processing. In this paper, we analyse the performance of Tree Edit Distance as a paraphrase representation baseline. Our experiments using the Edit Distance Textual Entailment Suite show that, since Tree Edit Distance is a purely syntactic approach, paraphrase alternations that are not based on structural reorganizations do not find an adequate representation. They also show that there is much scope for better modelling of the way trees are aligned.
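To make the "purely syntactic" point concrete, here is a minimal sketch of what a Tree Edit Distance baseline computes: the naive ordered-forest edit-distance recurrence with unit costs, memoised. The tuple tree format and the example constituency trees are illustrative assumptions; this is not the efficient Zhang-Shasha algorithm, nor the authors' system.

```python
from functools import lru_cache

# Trees are (label, (child, ...)) tuples -- an illustrative toy format.

def size(tree):
    label, children = tree
    return 1 + sum(size(c) for c in children)

@lru_cache(maxsize=None)
def forest_dist(f1, f2):
    """Edit distance between two ordered forests (tuples of trees),
    with unit insert/delete/relabel costs."""
    if not f1 and not f2:
        return 0
    if not f1:
        return sum(size(t) for t in f2)   # insert all remaining nodes
    if not f2:
        return sum(size(t) for t in f1)   # delete all remaining nodes
    (l1, c1), (l2, c2) = f1[-1], f2[-1]   # rightmost trees and their roots
    return min(
        forest_dist(f1[:-1] + c1, f2) + 1,      # delete root l1 (splice children)
        forest_dist(f1, f2[:-1] + c2) + 1,      # insert root l2
        forest_dist(f1[:-1], f2[:-1])           # match the two roots
        + forest_dist(c1, c2)
        + (0 if l1 == l2 else 1),               # relabel if labels differ
    )

def tree_edit_distance(t1, t2):
    return forest_dist((t1,), (t2,))

# Two toy constituency trees differing only in the object NP:
a = ("S", (("NP", ()), ("VP", (("V", ()), ("NP", ())))))
b = ("S", (("NP", ()), ("VP", (("V", ()),))))
print(tree_edit_distance(a, b))  # 1 (delete one node)
```

With memoisation this is fine for toy inputs; production systems use the Zhang-Shasha dynamic program instead. Note that the distance only sees node labels and tree shape, which is exactly why lexical or semantic paraphrase alternations are invisible to it.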
Abstract:
In this paper, we present a critical analysis of the state of the art in the definition and typologies of paraphrasing. This analysis shows that there exists no characterization of paraphrasing that is at once comprehensive, linguistically based and computationally tractable. We then set out to define and delimit the concept on the basis of its propositional content. We present a general, inclusive and computationally oriented typology of the linguistic mechanisms that give rise to form variations between paraphrase pairs.
Abstract:
We evaluate the performance of different optimization techniques developed in the context of optical flow computation with different variational models. In particular, building on truncated Newton (TN) methods, which have been an effective approach for large-scale unconstrained optimization, we develop efficient multilevel schemes for computing the optical flow. More precisely, we compare the performance of a standard unidirectional multilevel algorithm, called multiresolution optimization (MR/OPT), against a bidirectional multilevel algorithm, called full multigrid optimization (FMG/OPT). The FMG/OPT algorithm treats the coarse-grid correction as an optimization search direction and eventually scales it using a line search. Experimental results on different image sequences using four models of optical flow computation show that the FMG/OPT algorithm outperforms both the TN and MR/OPT algorithms in terms of computational work and the quality of the optical flow estimation.
Abstract:
Although paraphrasing is the linguistic mechanism underlying many plagiarism cases, little attention has been paid to its analysis in the framework of automatic plagiarism detection. Therefore, state-of-the-art plagiarism detectors find it difficult to detect cases of paraphrase plagiarism. In this article, we analyse the relationship between paraphrasing and plagiarism, paying special attention to which paraphrase phenomena underlie acts of plagiarism and which of them are detected by plagiarism detection systems. With this aim in mind, we created the P4P corpus, a new resource which uses a paraphrase typology to annotate a subset of the PAN-PC-10 corpus for automatic plagiarism detection. The results of the Second International Competition on Plagiarism Detection were analysed in the light of this annotation. The presented experiments show that (i) more complex paraphrase phenomena and a high density of paraphrase mechanisms make plagiarism detection more difficult, (ii) lexical substitutions are the paraphrase mechanisms used the most when plagiarising, and (iii) paraphrase mechanisms tend to shorten the plagiarized text. For the first time, the paraphrase mechanisms behind plagiarism have been analysed, providing critical insights for the improvement of automatic plagiarism detection systems.
Abstract:
This article reflects on the training requirements that the knowledge society demands of professionals. One of the most important goals of the university in the knowledge society is to train competent professionals with sufficient intellectual tools to face the uncertainty of information, the awareness that it has a short-term expiry date, and the anxiety this causes. Beyond that, they must also be able to define and create the working tools with which they will give meaning and effectiveness to this changing, mutating knowledge. For this reason, the European Higher Education Area prioritises the transversal competence of collaborative work, with the aim of promoting autonomous, committed learning adapted to the new needs of twenty-first-century business. In this context, we present the theoretical framework underlying the work developed on the ACME computing platform, which combines collaborative work with blended learning. We also describe in detail some examples of wikis, the paradigm of collaborative work, produced in courses taught at the Universitat de Girona within the ACME virtual environment.
Abstract:
We consider the numerical treatment of the optical flow problem by evaluating the performance of the trust region method versus the line search method. To the best of our knowledge, the trust region method is studied here for the first time for variational optical flow computation. Four different optical flow models are used to test the performance of the proposed algorithm, combining linear and nonlinear data terms with quadratic and TV regularization. We show that the trust region method often performs better than line search, especially in the presence of non-linearity and non-convexity in the model.
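The trust-region mechanics compared against line search above can be sketched in one dimension (a generic illustration under simplifying assumptions, not the variational optical-flow setting of the paper): minimise the local quadratic model within a radius, then grow or shrink the radius according to how well the model predicted the actual decrease.

```python
def trust_region_1d(f, g, h, x, radius=1.0, max_radius=10.0,
                    tol=1e-10, iters=100):
    """Minimal 1-D trust-region Newton method (a sketch; function and
    parameter names are illustrative).  At each step the quadratic model
    m(p) = f(x) + g(x)*p + h(x)*p**2/2 is minimised over |p| <= radius,
    and the radius adapts to the ratio of actual to predicted decrease."""
    for _ in range(iters):
        fx, gx, hx = f(x), g(x), h(x)
        if abs(gx) < tol:
            break
        if hx > 0 and abs(gx / hx) <= radius:
            p = -gx / hx                  # interior Newton point
        else:
            # non-convex model or step too long: minimum on the boundary
            p = min((radius, -radius),
                    key=lambda q: gx * q + 0.5 * hx * q * q)
        predicted = -(gx * p + 0.5 * hx * p * p)
        actual = fx - f(x + p)
        rho = actual / predicted if predicted > 0 else -1.0
        if rho > 0.75 and abs(p) >= 0.99 * radius:
            radius = min(2.0 * radius, max_radius)   # model good: expand
        elif rho < 0.25:
            radius *= 0.25                           # model poor: shrink
        if rho > 0:                                  # accept real decrease only
            x = x + p
    return x

# Non-convex test function with minima at x = +/-1:
f = lambda x: (x * x - 1.0) ** 2
g = lambda x: 4.0 * x * (x * x - 1.0)
h = lambda x: 12.0 * x * x - 4.0
print(round(trust_region_1d(f, g, h, x=2.0), 6))  # 1.0
```

The key difference from line search shows in the `else` branch: where a line-search Newton method must repair an indefinite Hessian before choosing a direction, the trust region simply bounds the step and lets the model's own curvature decide.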
Abstract:
This work focuses on the prediction of the two main nitrogenous variables that describe the water quality at the effluent of a wastewater treatment plant. We have developed two kinds of neural network architectures, based on considering either a single output or the usual five effluent variables that define the water quality: suspended solids, biochemical organic matter, chemical organic matter, total nitrogen and total Kjeldahl nitrogen. Two learning techniques, based on a classical adaptive gradient and on a Kalman filter, have been implemented. In order to improve generalization and performance, we have selected variables by means of genetic algorithms and fuzzy systems. The training, testing and validation sets show that the final networks are able to learn the simulated available data well enough, especially for the total nitrogen.
Abstract:
The final-year project came to us as an opportunity to get involved in a topic that had proved attractive during the process of majoring in economics: statistics and its application to the analysis of economic data, i.e. econometrics. Moreover, the combination of econometrics and computer science is a very hot topic nowadays, given the Information Technologies boom of the last decades and the consequent exponential increase in the amount of data collected and stored day by day. Data analysts able to deal with Big Data and to extract useful results from it are in high demand these days and, in our understanding, the work they do, although sometimes controversial in terms of ethics, is a clear source of added value both for private corporations and for the public sector. For these reasons, the essence of this project is the study of a statistical instrument, directly related to computer science, that is valid for the analysis of large datasets: Partial Correlation Networks. The structure of the project has been determined by our objectives throughout its development. First, the characteristics of the studied instrument are explained, from the basic ideas up to the features of the model behind it, with the final goal of presenting the SPACE model as a tool for estimating interconnections between elements of large data sets. Afterwards, an illustrated simulation is performed in order to show the power and efficiency of the model presented. Finally, the model is put into practice by analyzing a relatively large set of real-world data, with the objective of assessing whether the proposed statistical instrument is valid and useful when applied to a real multivariate time series.
In short, our main goals are to present the model and to evaluate whether Partial Correlation Network Analysis is an effective, useful instrument that allows finding valuable results in Big Data. The findings throughout this project suggest that the Partial Correlation Estimation by Joint Sparse Regression Models approach presented by Peng et al. (2009) works well under the assumption of sparsity of the data. Moreover, partial correlation networks are shown to be a valid tool for representing cross-sectional interconnections between elements of large data sets. The scope of this project is nevertheless limited, as there are some sections in which a deeper analysis would have been appropriate: considering intertemporal connections between elements, the choice of the tuning parameter lambda, or a deeper analysis of the results in the real-data application are examples of aspects in which this project could be extended. To sum up, the analyzed statistical tool has proved to be a very useful instrument for finding relationships that connect the elements of a large data set. After all, partial correlation networks allow the owner of such a set to observe and analyze existing linkages that would otherwise have been overlooked.
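The core idea behind a partial correlation network can be sketched as follows (an illustrative toy, not the SPACE estimator itself, which fits joint sparse regressions with an l1 penalty): invert the covariance matrix, read partial correlations off the precision matrix, and draw an edge wherever they are non-zero.

```python
# Toy partial correlation network from a hand-picked covariance matrix.
# Variables 0 and 1 are marginally correlated only through variable 2,
# so the network has edges 0--2 and 1--2 but no edge 0--1.

def inverse(m):
    """Gauss-Jordan matrix inverse with partial pivoting (fine for tiny
    matrices; real code would use a linear-algebra library)."""
    n = len(m)
    a = [list(row) + [float(i == j) for j in range(n)]
         for i, row in enumerate(m)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        p = a[col][col]
        a[col] = [x / p for x in a[col]]
        for r in range(n):
            if r != col:
                f = a[r][col]
                a[r] = [x - f * y for x, y in zip(a[r], a[col])]
    return [row[n:] for row in a]

sigma = [[0.75, 0.25, 0.50],     # illustrative 3-variable covariance
         [0.25, 0.75, 0.50],
         [0.50, 0.50, 1.00]]
prec = inverse(sigma)            # precision (inverse covariance) matrix

def pcorr(i, j):
    """Partial correlation of i and j given all other variables."""
    return -prec[i][j] / (prec[i][i] * prec[j][j]) ** 0.5

marginal_01 = sigma[0][1] / (sigma[0][0] * sigma[1][1]) ** 0.5
edges = [(i, j) for i in range(3) for j in range(i + 1, 3)
         if abs(pcorr(i, j)) > 1e-9]
print(round(marginal_01, 3), edges)  # 0.333 [(0, 2), (1, 2)]
```

The example shows why partial rather than marginal correlations define the network: variables 0 and 1 have marginal correlation 1/3, yet conditionally on variable 2 they are independent, so the spurious link is dropped. In high dimensions the plain inverse is unstable, which is where sparse estimators such as SPACE come in.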
Abstract:
We describe some applications of matrix theory to various topics in the field of discrete mathematics.
Abstract:
The study analyses the scientific output of the UPC related to computer science and compares it with that of 16 other universities in Spain, Europe, the United States and Asia with notable research activity in this field.