934 results for Computer Science (all)
Abstract:
Free software has lately been carrying ever more weight in companies, but it is still a great unknown to many people. Since its creation in the 1980s there has been exponential growth in high-quality free software, offering tools for every kind of need: office suites, mail clients, file systems, operating systems and more. This movement has not gone unnoticed by many users and companies, who have taken advantage of it to cover their needs. As for companies, more and more of them use free software to a greater or lesser extent, whether for its lower acquisition cost, its high reliability, its easy adaptability, or to avoid technological lock-in; in short, to gain more freedom. When a new company is created and all of its computing starts from scratch, that is the least costly moment to implement the IT architecture with free software, since the impact on the company, its users and its clients is smallest. Companies that already have an IT system will need to establish a migration plan, whether total or partial. The purpose of this project is not to declare one piece of software better than another or to dictate what should be installed, but to introduce the world of free software, to show part of this software, to compare some free software with proprietary counterparts, and to offer ideas and a set of solutions for companies, so that a company can draw implementation ideas from some of the solutions presented or follow some of the advice proposed. Many companies already use free software. Some use only a small part of it in their installations, since running a company 100% on free software, although some are beginning to do so, still strikes me as somewhat risky for now; in a short time, however, it will become increasingly common.
Abstract:
Remote sensing image processing is nowadays a mature research area. The techniques developed in the field allow many real-life applications with great societal value. For instance, urban monitoring, fire detection or flood prediction can have a great impact on economic and environmental issues. To attain such objectives, the remote sensing community has turned into a multidisciplinary field of science that embraces physics, signal theory, computer science, electronics, and communications. From a machine learning and signal/image processing point of view, all the applications are tackled under specific formalisms, such as classification and clustering, regression and function approximation, image coding, restoration and enhancement, source unmixing, data fusion, and feature selection and extraction. This paper serves as a survey of methods and applications, and reviews the latest methodological advances in remote sensing image processing.
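To make the classification formalism mentioned above concrete, here is a minimal, hypothetical sketch (not taken from the survey): a pixel-wise land-cover classifier trained on synthetic spectra with scikit-learn's SVC. The band count, the class rule and all names are illustrative assumptions.

```python
# Illustrative sketch on synthetic pixel "spectra"; band count, class
# rule and all names are assumptions, not taken from the survey.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_pixels, n_bands = 1000, 20
X = rng.normal(size=(n_pixels, n_bands))     # per-pixel band values
y = (X[:, :5].mean(axis=1) > 0).astype(int)  # two synthetic land-cover classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)      # pixel-wise classifier
print("held-out accuracy:", clf.score(X_te, y_te))
```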
Abstract:
The final year project came to us as an opportunity to get involved in a topic which had appeared attractive while majoring in economics: statistics and its application to the analysis of economic data, i.e. econometrics. Moreover, the combination of econometrics and computer science is a very hot topic nowadays, given the Information Technologies boom of the last decades and the consequent exponential increase in the amount of data collected and stored day by day. Data analysts able to deal with Big Data and to extract useful results from it are in high demand these days and, in our understanding, the work they do, although sometimes controversial in terms of ethics, is a clear source of added value both for private corporations and for the public sector. For these reasons, the essence of this project is the study of a statistical instrument, directly related to computer science, that is valid for the analysis of large datasets: Partial Correlation Networks. The structure of the project has been determined by our objectives as the work developed. First, the characteristics of the studied instrument are explained, from the basic ideas up to the features of the model behind it, with the final goal of presenting the SPACE model as a tool for estimating interconnections between elements in large data sets. Afterwards, an illustrative simulation is performed in order to show the power and efficiency of the model presented. Finally, the model is put into practice by analysing a relatively large set of real-world data, with the objective of assessing whether the proposed statistical instrument is valid and useful when applied to a real multivariate time series. In short, our main goals are to present the model and to evaluate whether Partial Correlation Network Analysis is an effective, useful instrument that allows finding valuable results in Big Data. The findings throughout this project suggest that the Partial Correlation Estimation by Joint Sparse Regression Models (SPACE) approach presented by Peng et al. (2009) works well under the assumption of sparsity of the data. Moreover, partial correlation networks are shown to be a valid tool for representing cross-sectional interconnections between elements in large data sets. The scope of this project is, however, limited, as there are some sections in which deeper analysis would have been appropriate. Considering intertemporal connections between elements, the choice of the tuning parameter lambda, and a deeper analysis of the results in the real-data application are examples of aspects in which this project could be extended. To sum up, the analysed statistical tool has proved to be a very useful instrument for finding relationships that connect the elements present in a large data set. And, after all, partial correlation networks allow the owner of such a set to observe and analyse existing linkages that could otherwise have been overlooked.
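As a hedged illustration of the idea (not the SPACE estimator of Peng et al., which scikit-learn does not ship): a sparse precision matrix Ω estimated with the graphical lasso yields partial correlations via ρ_ij = -Ω_ij / √(Ω_ii Ω_jj), from which the network's edges can be read off. The data, the edge threshold and all names below are illustrative.

```python
# Minimal sketch on synthetic data; scikit-learn's graphical lasso is
# used as a stand-in for a sparse estimator, NOT the SPACE method itself.
import numpy as np
from sklearn.covariance import GraphicalLassoCV

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))      # 200 observations of 10 variables

model = GraphicalLassoCV().fit(X)   # cross-validated sparsity level
P = model.precision_                # estimated precision matrix

# Partial correlations: rho_ij = -P_ij / sqrt(P_ii * P_jj)
d = np.sqrt(np.diag(P))
rho = -P / np.outer(d, d)
np.fill_diagonal(rho, 1.0)

# Edges of the partial correlation network (threshold is illustrative)
edges = np.argwhere(np.triu(np.abs(rho) > 0.05, k=1))
print("network edges:", edges.tolist())
```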
Abstract:
Development of a Business Intelligence system that analyses data extracted from tweets on the Twitter platform, in relation to hospitalisations at a hospital in Catalonia, in order to obtain a predictive analysis of the emergence of a flu outbreak. The work goes further by using a non-conventional technology to implement the BI system: the ElasticSearch and Kibana pair is chosen in order to obtain a robust, distributed, scalable and, above all, fully customisable system. After a study of these two solutions, including the monitoring and data-loading plugins, a complete data warehouse and an introductory dashboard have been built.
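A minimal sketch of the kind of query such a system might run, assuming an Elasticsearch 8.x Python client and a hypothetical "tweets" index with "text" and "created_at" fields (none of these names come from the thesis): daily counts of flu-related tweets, i.e. the raw series a Kibana dashboard would chart.

```python
# Hypothetical sketch, not the thesis's actual code: daily counts of
# flu-related tweets from an assumed "tweets" index, using the
# Elasticsearch 8.x Python client. Index and field names are invented.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # assumed local node

resp = es.search(
    index="tweets",
    query={"match": {"text": "grip"}},  # "grip" is Catalan for flu
    aggs={
        "per_day": {
            "date_histogram": {"field": "created_at",
                               "calendar_interval": "day"}
        }
    },
    size=0,  # only the aggregation is needed, not the hits
)

for bucket in resp["aggregations"]["per_day"]["buckets"]:
    print(bucket["key_as_string"], bucket["doc_count"])
```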
Abstract:
The use of contextual information in mobile devices is receiving increasing attention in mobile and ubiquitous computing research. An important requirement for mobile development today is that devices should be able to interact with the context. In this paper we present a series of contributions regarding previous work on context-awareness. In the first place, we describe a client-server architecture that provides a mechanism for preparing target non context-aware applications so that they can be delivered as context-aware applications in a semi-automatic way. Secondly, the framework used on the server to instantiate specific components for context-awareness, the Implicit Plasticity Framework, provides independence from the underlying mobile technology used in the client device, as shown in the case studies presented. Finally, the proposed infrastructure deals with the interaction among different context constraints provided by diverse sensors. All of these contributions are extensions to the infrastructure based on the Dichotomic View of plasticity, which now offers multi-purpose support.
Abstract:
This thesis considers aspects related to the design and standardisation of transmission systems for wireless broadcasting, comprising terrestrial and mobile reception. The purpose is to identify which factors influence the technical decisions and which issues could be better considered in the design process in order to assess different use cases, service scenarios and end-user quality. Further, the necessity of cross-layer optimisation for efficient data transmission is emphasised and means to take this into consideration are suggested. The work is mainly related to terrestrial and mobile digital video broadcasting systems, but many of the findings can be generalised to other transmission systems and design processes. The work has led to three main conclusions. First, it is discovered that there are no sufficiently accurate error criteria for measuring the subjective perceived audiovisual quality that could be utilised in transmission system design. Means for designing new error criteria for mobile TV (television) services are suggested, and similar work related to other services is recommended. Second, it is suggested that, in addition to commercial requirements, there should be technical requirements setting the framework for the design process of a new transmission system. The technical requirements should include the assessed reception conditions, technical quality of service and service functionalities. Reception conditions comprise radio channel models, receiver types and antenna types. Technical quality of service consists of bandwidth, timeliness and reliability. Of these, the thesis focuses on radio channel models and error criteria (reliability) as two of the most important design challenges and provides means to optimise transmission parameters based on them. Third, the thesis argues that the most favourable development for wireless broadcasting would be a single system suitable for all scenarios of wireless broadcasting. It is claimed that there are no major technical obstacles to achieving this and that the recently published second-generation digital terrestrial television broadcasting system provides a good basis. The challenges and opportunities of a universal wireless broadcasting system are discussed mainly from the technical, but briefly also from the commercial and regulatory, aspects.
Abstract:
The proposed transdisciplinary field of 'complexics' would bring together all contemporary efforts in any specific disciplines or by any researchers specifically devoted to constructing tools, procedures, models and concepts intended for transversal application that are aimed at understanding and explaining the most interwoven and dynamic phenomena of reality. Our aim needs to be, as Morin says, not "to reduce complexity to simplicity, [but] to translate complexity into theory". New tools for the conception, apprehension and treatment of the data of experience will need to be devised to complement existing ones and to enable us to make headway toward practices that better fit complexic theories. New mathematical and computational contributions have already continued to grow in number, thanks primarily to scholars in statistical physics and computer science, who are now taking an interest in social and economic phenomena. Certainly, these methodological innovations put into question and again make us take note of the excessive separation between the training received by researchers in the 'sciences' and in the 'arts'. Closer collaboration between these two subsets would, in all likelihood, be much more energising and creative than their current mutual distance. Human complexics must be seen as multi-methodological, insofar as it necessarily combines quantitative-computational methodologies and more qualitative methodologies aimed at understanding the mental and emotional world of people. In the final analysis, however, models always have a narrative running behind them that reflects the attempts of a human being to understand the world, and models are always interpreted on that basis.
Abstract:
Games are one of the largest software industries in computing. From the first black-and-white games simulating paddles and a ball to the present day, when developing a game involves a team of professionals as large as or larger than the biggest software projects in industry, games have evolved more than most programs. The possibility of building a game is, besides a tempting proposal (since it differs enormously from any assignment done during the degree), a personal challenge for someone who has always been in contact with video games and who, after acquiring a set of indispensable skills, set out to try to develop one from scratch. The goal of this work is therefore exactly that: to learn how a game is born from nothing, and to see all the complications that arise while developing it. The results of the project amply show the large number of problems that arise in such a process, but the important conclusions include satisfaction with the results obtained, as well as the knowledge gained through the development of the program.
Abstract:
In recent years the construction sector has experienced exponential growth. This growth has affected many aspects: from the need for more personnel on site, and the setting up of offices to manage accounting and keep control over the works, to the need for specific software that helps carry out the work as comfortably and quickly as possible. The project carried out covers one of these needs: managing the budgets of the different works that builders undertake. It uses the database of the ITEC (Institut de Tecnologia de la Construcció de Catalunya), on which the vast majority of architects work when designing buildings, but it also allows the builder to enter their own data. The user of the application can produce budgets for new construction, renovations and so on, grouping each of them into chapters. These chapters can be understood as the different phases to carry out, for example laying the foundations, raising the walls or building the roof. Within the chapters are the work items ("partides"), each a set of materials, labour hours and machinery needed to carry out a part of the work, such as building a partition wall between rooms. In that case we would have the different materials needed (bricks, mortar), the labourer hours needed to raise it, the transport of all the material to the site, and so on. All these parameters (materials, hours, transport...) are called articles and are included within the work items. The application is designed to run in a client/server environment, using a Linux OpenSuse 10.2 server and Windows XP workstations as clients, although other versions of Microsoft operating systems could also be used. The development environment used is the FDS language, which comes with an integrated file manager, and that is what will be used.
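The budget/chapter/work-item/article hierarchy described above maps naturally onto a nested data structure. Here is a minimal sketch of that hierarchy; all class and field names are hypothetical illustrations, not taken from the project (which is written in FDS, not Python).

```python
# Illustrative sketch of the described budget hierarchy; every name
# here is hypothetical, not from the project's actual FDS code.
from dataclasses import dataclass, field

@dataclass
class Article:                    # material, labour hours, transport...
    name: str
    unit_price: float
    quantity: float

    def cost(self) -> float:
        return self.unit_price * self.quantity

@dataclass
class WorkItem:                   # "partida": a priced unit of work
    name: str
    articles: list[Article] = field(default_factory=list)

    def cost(self) -> float:
        return sum(a.cost() for a in self.articles)

@dataclass
class Chapter:                    # "capítol": a phase, e.g. foundations
    name: str
    items: list[WorkItem] = field(default_factory=list)

    def cost(self) -> float:
        return sum(i.cost() for i in self.items)

@dataclass
class Budget:
    chapters: list[Chapter] = field(default_factory=list)

    def total(self) -> float:
        return sum(c.cost() for c in self.chapters)

# Example: a partition wall priced from its articles.
wall = WorkItem("partition wall", [Article("bricks", 0.30, 200),
                                   Article("labour hours", 18.0, 6)])
print(Budget([Chapter("walls", [wall])]).total())
```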
Abstract:
The skill of programming is a key asset for every computer science student. Many studies have shown that it is a hard skill to learn, and the outcomes of programming courses have often been substandard. Thus, a range of methods and tools have been developed to assist students' learning processes. One of the biggest fields in computer science education is the use of visualizations as a learning aid, and many visualization-based tools have been developed to aid the learning process during the last few decades. The studies conducted in this thesis focus on two different visualization-based tools, TRAKLA2 and ViLLE. This thesis includes results from multiple empirical studies of the effects the introduction and usage of these tools have on students' opinions and performance, and of the implications from a teacher's point of view. The results show that students preferred to do web-based exercises and felt that those exercises contributed to their learning. The usage of the tools motivated students to work harder during their course, which was reflected in overall course performance and drop-out statistics. We have also shown that visualization-based tools can be used to enhance the learning process, and that one of the key factors is a higher and more active level of engagement (see the Engagement Taxonomy by Naps et al., 2002). Automatic grading accompanied by immediate feedback helps students to overcome obstacles during the learning process and to grasp the key element of the learning task. Such tools can help us to cope with the fact that many programming courses are overcrowded and have limited teaching resources: they let us tackle this problem by utilizing automatic assessment in exercises that are most suitable for the web (like tracing and simulation), since this supports students' independent learning regardless of time and place. In summary, we can use a course's resources more efficiently to increase the quality of the learning experience of the students and the teaching experience of the teacher, and even to increase the performance of the students. There are also methodological results from this thesis which contribute to developing insight into the conduct of empirical evaluations of new tools or techniques. When we evaluate a new tool, especially one accompanied by visualization, we need to give a proper introduction to it and to the graphical notation the tool uses. The standard procedure should also include capturing the screen with audio to confirm that the participants of the experiment are doing what they are supposed to do. By taking such measures in studies of the learning impact of visualization support, we can avoid drawing false conclusions from our experiments. As computer science educators, we face two important challenges. First, we need to start to deliver the message, in our own institutions and all over the world, about the new, scientifically proven innovations in teaching such as TRAKLA2 and ViLLE. Second, we have relevant experience of conducting teaching-related experiments, and thus we can help our colleagues learn the essential know-how of research-based improvement of their teaching. Such work can turn academic teaching into publications, and by utilizing this approach we can significantly increase the adoption of new tools and techniques and the overall knowledge of best practices.
In the future, we need to combine our forces and tackle these universal and common problems together by creating multi-national and multi-institutional research projects. We need to create a community and a platform in which we can share these best practices and, at the same time, conduct multi-national research projects easily.
Abstract:
The second Symposium on Cellular Automata "Journées Automates Cellulaires" (JAC 2010) took place in Turku, Finland, on December 15-17, 2010. The first two conference days were held in the Educarium building of the University of Turku, while the talks of the third day were given onboard passenger ferry boats in the beautiful Turku archipelago, along the route Turku–Mariehamn–Turku. The conference was organized by FUNDIM, the Fundamentals of Computing and Discrete Mathematics research center at the mathematics department of the University of Turku. The program of the conference included 17 submitted papers that were selected by the international program committee, based on three peer reviews of each paper. These papers form the core of these proceedings. I want to thank the members of the program committee and the external referees for the excellent work that they have done in choosing the papers to be presented at the conference. In addition to the submitted papers, the program of JAC 2010 included four distinguished invited speakers: Michel Coornaert (Université de Strasbourg, France), Bruno Durand (Université de Provence, Marseille, France), Dora Giammarresi (Università di Roma Tor Vergata, Italy) and Martin Kutrib (Universität Gießen, Germany). I sincerely thank the invited speakers for accepting our invitation to come and give a plenary talk at the conference. The invited talk by Bruno Durand was eventually given by his co-author Alexander Shen, and I thank him for accepting to make the presentation on short notice. Abstracts or extended abstracts of the invited presentations appear in the first part of this volume. The program also included several informal presentations describing very recent developments and ongoing research projects. I wish to thank all the speakers for their contribution to the success of the symposium. I would also like to thank the sponsors and our collaborators: the Finnish Academy of Science and Letters, the French National Research Agency project EMC (ANR-09-BLAN-0164), Turku Centre for Computer Science, the University of Turku, and Centro Hotel. Finally, I sincerely thank the members of the local organizing committee for making the conference possible. These proceedings are published both in an electronic format and in print. The electronic proceedings are available in the electronic repository HAL, managed by several French research agencies. The printed version is published in the general publications series of TUCS, Turku Centre for Computer Science. We thank both HAL and TUCS for accepting to publish the proceedings.