794 results for Conference topics
Abstract:
Polissema: Revista de Letras do ISCAP 2001/N.º 1 - Tradução
Abstract:
Purpose – The aim of this article is to present results from research into the information behaviour of European Documentation Centre (EDC) users. It reflects on the practices used to access European information by a group of 234 users of 55 EDCs, covering 21 Member States of the European Union (EU). Design/methodology/approach – To collect the data presented here, five questionnaires were sent to users of every EDC in Finland, Ireland, Hungary and Portugal. In the remaining EU countries, five questionnaires were sent to each of two EDCs chosen at random. The questionnaires were sent by post, following telephone contact with the EDC managers. Findings – Factors determining access to information on the European Union, and the frequency of this access, are identified. The information providers most commonly used to access European information and the information sources considered most reliable by respondents are also analysed. A further area of analysis concerns the factors cited by respondents as facilitating, or conversely hindering, access to information on Europe. In parallel, the aspects of accessing information on the EU that users value most are assessed. Research limitations/implications – Questionnaires had to be used, as the intention was to cover a very extensive geographical area. However, in opting for closed questions, it is acknowledged that standardised responses were obtained with no scope for capturing the individual circumstances of each respondent, making a qualitative approach difficult. Practical implications – The results provide an overall picture of certain aspects of the information behaviour of EDC users. They may serve as a starting point for planning training sessions designed to develop the skills required to search for, access, evaluate and apply European information within an academic context. 
From a broader perspective, they also constitute factors which the European Commission should take into consideration when formulating its information and communication policy. Originality/value – This is the first piece of academic research into EDCs and their users that aimed to cover all Member States of the EU.
Abstract:
Low-density parity-check (LDPC) codes are nowadays one of the hottest topics in coding theory, notably due to their advantages in terms of bit error rate performance and low complexity. To exploit the potential of the Wyner-Ziv coding paradigm, practical distributed video coding (DVC) schemes should use powerful error-correcting codes with near-capacity performance. In this paper, new ways to design LDPC codes for the DVC paradigm are proposed and studied. The new LDPC solutions rely on merging parity-check nodes, which corresponds to reducing the number of rows in the parity-check matrix. This makes it possible to gracefully change the compression ratio of the source (DCT coefficient bitplane) according to the correlation between the original and the side information. The proposed LDPC codes perform well over a wide range of source correlations and achieve better rate-distortion (RD) performance than the popular turbo codes.
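The check-node merging idea can be sketched in a few lines: merging two rows of the parity-check matrix corresponds to their modulo-2 sum, which shortens the syndrome and thus raises the compression ratio. The matrix below is a toy example for illustration, not one of the codes designed in the paper:

```python
import numpy as np

# Toy parity-check matrix H (3 checks, 6 bits) -- illustrative only.
H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]], dtype=np.uint8)

def merge_checks(H, i, j):
    """Merge parity-check rows i and j by modulo-2 addition.
    The result has one row fewer, i.e. a shorter syndrome and
    therefore a higher compression ratio."""
    merged = (H[i] + H[j]) % 2
    keep = [r for r in range(H.shape[0]) if r not in (i, j)]
    return np.vstack([H[keep], merged])

H_merged = merge_checks(H, 0, 1)  # 2 checks instead of 3

# Any word satisfying both original checks also satisfies the merged one:
c = np.array([1, 1, 0, 0, 1, 1], dtype=np.uint8)  # H @ c = 0 (mod 2)
```

Since the merged check is the GF(2) sum of the two originals, every word with zero syndrome under the original matrix also has zero syndrome under the merged one, which is what allows the compression ratio to be adapted without redesigning the code.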
Abstract:
This paper suggests that the thought of the North American critical theorist James W. Carey provides a relevant perspective on communication and technology. Drawing on American social pragmatism and progressive thinkers of the beginning of the 20th century (such as Dewey, Mead, Cooley, and Park), Carey built a perspective that brought together the political economy of Harold A. Innis and the social criticism of David Riesman and Charles W. Mills, and that incorporated Marxist topics such as commodification and sociocultural domination. The main goal of this paper is to explore the connection Carey established between modern technological communication and what he called the “transmissive model”, a model which not only reduces the symbolic process of communication to instrumentalization and information delivery, but also converges politically with capitalism and with goals of power, control and expansion. Conceiving communication as a process that creates symbolic and cultural systems, in which and through which social life takes place, Carey gives equal emphasis to the incorporation processes of communication. If symbolic forms and culture are ways of conditioning action, they are also influenced by technological and economic materializations of symbolic systems, and by other conditioning structures. In Carey’s view, communication is never a disembodied force; rather, it is a set of practices in which conceptions, techniques and social relations co-exist. These practices configure reality or, alternatively, can refute, transform and celebrate it. Showing a sensitivity to the historical understanding of communication, media and information technologies, one of the issues Carey explored most was the history of the telegraph as a harbinger of the Internet, of its problems and contradictions. 
For Carey, the Internet was the contemporary heir of the communications revolution triggered by the prototype of transmission technologies, the telegraph, in the 19th century. In the telegraph Carey saw the prototype of many subsequent commercial empires based on science and technology; a pioneer model for complex business management; an example of conflict of interest over the control of patents; an inducer of changes both in language and in structures of knowledge; and a promoter of a futurist and utopian view of information technologies. After a brief approach to Carey’s communication theory, this paper focuses on his seminal essay "Technology and ideology. The case of the telegraph", bearing in mind the communication revolution introduced by the Internet. We maintain that this essay is of seminal relevance for critically studying the information society. Our reading of it highlights the reach, as well as the problems, of an approach which conceives the innovation of the telegraph as a metaphor for all innovations, announcing the modern stage of history and determining to this day the major lines of development in modern communication systems.
Abstract:
Workplace aggression is a factor that shapes the interaction between individuals and their work environment and produces many undesirable outcomes, sometimes imposing heavy costs on organizations. Only through a comprehensive understanding of the genesis of workplace aggression is it possible to develop strategies and interventions to minimize its harmful effects. The existing body of knowledge has already identified several individual, situational and contextual antecedents of workplace aggression, although this is a research area with significant gaps, where many issues have not yet been addressed (Dupré and Barling, 2006). According to Baron and Neuman (1998), one of these predictors is organizational change, since certain changes in the work environment (e.g., changes in management) can lead to increased aggression. This paper intends to contribute to workplace aggression research by studying its relationship with organizational change, considering the moderating role of political behaviors and organizational cynicism (Ammeter et al., 2002; Ferris et al., 2002). The literature review suggests that the mediators and moderators intervening in the relationships between workplace aggression and its antecedents are understudied. James (2005) argues that organizational politics is related to cynicism, and the empirical research of Miranda (2008) identified leadership political behavior as an antecedent of cynicism, but these two variables had not yet been investigated in relation to workplace aggression. This investigation was operationalized using several scales, including the Organizational Change Questionnaire - climate of change, processes, and readiness (Bouckenooghe, Devos and Broeck, 2009), a Workplace Aggression Scale (Vicente and D’Oliveira, 2008, 2009, 2010), an Organizational Cynicism Scale (Wanous, Reichers and Austin, 1994) and a Political Behavior Questionnaire (Yukl and Falbe, 1990). 
Participants representing a wide variety of jobs across many organizations were surveyed. The results of the study and their implications are presented and discussed, along with the study's contribution to organizational change practices.
Abstract:
On 19 and 20 October 2006, the Research Centre on Enterprise and Work Organisation (IET) organised the first international conference on “Foresight Studies on Work in the Knowledge Society”. It took place at the auditorium of the new Library of FCT-UNL and had the support of the research project “CodeWork@VO” (financed by FCT-MCTES and co-ordinated by INESC, Porto). The conference was related to the European research project “Work Organisation and Restructuring in the Knowledge Society” (WORKS), which is financed by the European Commission. The main objective of the conference was to analyse and discuss research findings on trends in work structures in the knowledge society, and to debate new work organisation models and new forms of work supported by ICT.
Abstract:
WORKS final conference report
Abstract:
The ninth edition of the International Conference on Remote Engineering and Virtual Instrumentation (REV) [1] was held at the Faculty of Engineering of the University of Deusto, Bilbao (Spain), from 4 to 6 July 2012. A world-class research community in remote and virtual laboratories joined the event.
Abstract:
Proceedings of the 2nd International Conference on Computational Cybernetics, Vienna University of Technology, August 30 - September 1, 2004
Abstract:
To meet the increasing demands of complex inter-organizational processes and the demand for continuous innovation and internationalization, new forms of organisation are evidently being adopted, fostering more intensive collaboration processes and sharing of resources, in what can be called collaborative networks (Camarinha-Matos, 2006:03). Information and knowledge are crucial resources in collaborative networks, and their management is a fundamental process to optimize. Knowledge organisation and collaboration systems are thus important instruments for the success of collaborative networks of organisations, and have been researched over the last decade in computer science, information science, management sciences, terminology and linguistics. Nevertheless, research in this area has paid little attention to multilingual contexts of collaboration, which pose specific and challenging problems. It is clear that access to and representation of knowledge will happen more and more in multilingual settings, which implies overcoming the difficulties inherent to the presence of multiple languages, through processes such as the localization of ontologies. Although localization, like other processes that involve multilingualism, is a rather well-developed practice, and its methodologies and tools are fruitfully employed by the language industry in the development and adaptation of multilingual content, it has not yet been sufficiently explored as an element of support for the development of knowledge representations - in particular ontologies - expressed in more than one language. Multilingual knowledge representation is thus an open research area calling for cross-contributions from knowledge engineering, terminology, ontology engineering, cognitive sciences, computational linguistics, natural language processing, and management sciences. 
This workshop brought together researchers interested in multilingual knowledge representation, in a multidisciplinary environment, to debate the possibilities of cross-fertilization between knowledge engineering, terminology, ontology engineering, cognitive sciences, computational linguistics, natural language processing, and management sciences, applied to contexts where multilingualism continuously creates new and demanding challenges for current knowledge representation methods and techniques. Six papers dealing with different approaches to multilingual knowledge representation are presented, most of them describing tools, approaches and results obtained in ongoing projects. In the first paper, Andrés Domínguez Burgos, Koen Kerremans and Rita Temmerman present a software module that is part of a workbench for terminological and ontological mining: Termontospider, a wiki crawler that aims to optimally traverse Wikipedia in search of domain-specific texts for extracting terminological and ontological information. The crawler is part of a tool suite for automatically developing multilingual termontological databases, i.e. ontologically underpinned multilingual terminological databases. The authors describe the basic principles behind the crawler and summarize the research setting in which the tool is currently being tested. In the second paper, Fumiko Kano presents work comparing four feature-based similarity measures derived from the cognitive sciences. The purpose of the comparative analysis is to identify the most effective model for mapping independent ontologies in a culturally influenced domain. To this end, datasets based on standardized pre-defined feature dimensions and values, obtainable from the UNESCO Institute for Statistics (UIS), were used for the comparative analysis of the similarity measures. 
The purpose of the comparison is to verify the similarity measures against objectively developed datasets. According to the author, the results demonstrate that the Bayesian Model of Generalization provides the most effective cognitive model for identifying the most similar corresponding concepts for a targeted socio-cultural community. In another presentation, Thierry Declerck, Hans-Ulrich Krieger and Dagmar Gromann describe ongoing work and propose an approach to the automatic extraction of information from multilingual financial Web resources, to provide candidate terms for building ontology elements or instances of ontology concepts. The authors present an approach complementary to the direct localization/translation of ontology labels: acquiring terminologies by accessing and harvesting the multilingual Web presences of structured information providers in the field of finance. This leads to the detection of candidate terms in various multilingual sources in the financial domain that can be used not only as labels of ontology classes and properties but also for the possible generation of (multilingual) domain ontologies themselves. In the next paper, Manuel Silva, António Lucas Soares and Rute Costa claim that, despite the availability of tools, resources and techniques aimed at the construction of ontological artifacts, developing a shared conceptualization of a given reality still raises questions about the principles and methods that support the initial phases of conceptualization. These questions become, according to the authors, more complex when the conceptualization occurs in a multilingual setting. 
To tackle these issues the authors present a collaborative platform, conceptME, where terminological and knowledge representation processes support domain experts throughout a conceptualization framework, allowing the inclusion of multilingual data as a way to promote knowledge sharing, enhance conceptualization and support multilingual ontology specification. In another presentation, Frieda Steurs and Hendrik J. Kockaert present TermWise, a large project dealing with legal terminology and phraseology for the Belgian public services, namely the translation office of the Ministry of Justice. The project aims to develop an advanced tool that embeds expert knowledge in the algorithms used to extract specialized language from textual data (legal documents); its outcome is a knowledge database of Dutch/French equivalents for legal concepts, enriched with the phraseology related to the terms under discussion. Finally, Deborah Grbac, Luca Losito, Andrea Sada and Paolo Sirito report on the preliminary results of a pilot project currently ongoing at the UCSC Central Library, in which they propose to adapt, for subject librarians employed in large multilingual academic institutions, the model used by translators working within European Union institutions. The authors are using User Experience (UX) analysis to provide subject librarians with visual support, by means of “ontology tables” depicting the conceptual linking and connections of words with concepts, presented according to their semantic and linguistic meaning. The organizers hope that the selection of papers presented here will be of interest to a broad audience, and will be a starting point for further discussion and cooperation.
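The feature-based similarity measures in Kano's comparison come from a family that can be illustrated with Tversky's classic ratio model (the abstract does not enumerate the four measures compared, so this particular one is an assumption): similarity between two concepts is computed from their shared and distinctive feature sets.

```python
def tversky_similarity(a, b, alpha=1.0, beta=1.0):
    """Tversky's ratio model: similarity of two concepts represented
    as sets of features. alpha and beta weight the distinctive
    features of a and b; alpha = beta = 1 recovers the Jaccard index."""
    a, b = set(a), set(b)
    common = len(a & b)
    return common / (common + alpha * len(a - b) + beta * len(b - a))

# Two hypothetical culturally defined concepts described by features:
school_a = {"primary", "public", "six_years"}
school_b = {"primary", "public", "five_years"}
print(tversky_similarity(school_a, school_b))  # 2 / (2 + 1 + 1) = 0.5
```

Mapping culturally influenced ontologies would then amount to pairing each concept in one ontology with the concept in the other that maximizes such a score over standardized feature dimensions.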
Abstract:
Hyperspectral imaging has become one of the main topics in remote sensing. Hyperspectral sensors acquire hundreds of spectral bands at different (almost contiguous) wavelength channels over the same area, generating large data volumes of several GB per flight. This high spectral resolution can be used for object detection and for discriminating between different objects based on their spectral characteristics. One of the main problems in hyperspectral analysis is the presence of mixed pixels, which arise when the spatial resolution of the sensor is not able to separate spectrally distinct materials. Spectral unmixing is therefore one of the most important tasks in hyperspectral data exploitation. However, unmixing algorithms can be computationally very expensive and power-hungry, which compromises their use in applications under on-board constraints. In recent years, graphics processing units (GPUs) have evolved into highly parallel and programmable systems. Several hyperspectral imaging algorithms have been shown to benefit from this hardware, taking advantage of the extremely high floating-point processing performance, compact size, huge memory bandwidth, and relatively low cost of these units, which make them appealing for on-board data processing. In this paper, we propose a parallel implementation of an augmented Lagrangian based method for unsupervised hyperspectral linear unmixing on GPUs using CUDA. The method, called simplex identification via split augmented Lagrangian (SISAL), aims to identify the endmembers of a scene, i.e., it is able to unmix hyperspectral data sets in which the pure pixel assumption is violated. The efficient implementation of SISAL presented in this work exploits the GPU architecture at a low level, using shared memory and coalesced memory accesses.
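The linear mixing model underlying SISAL can be illustrated without the GPU machinery: each observed pixel is a convex combination of endmember spectra, and unmixing recovers the abundances. The sketch below uses a plain least-squares inversion rather than SISAL's augmented Lagrangian solver, and all spectra are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic endmember matrix M: 50 spectral bands, 3 endmembers.
bands, p = 50, 3
M = rng.uniform(0.0, 1.0, size=(bands, p))

# True abundances of a mixed pixel: nonnegative, summing to one.
a_true = np.array([0.6, 0.3, 0.1])
y = M @ a_true  # observed pixel spectrum (noise-free for clarity)

# Unmixing by least squares -- a stand-in for SISAL's augmented
# Lagrangian solver, valid here because the problem is noise-free
# and M has full column rank.
a_hat, *_ = np.linalg.lstsq(M, y, rcond=None)
```

In real data the observation is noisy, the abundances must be explicitly constrained to the simplex, and the endmember matrix itself is unknown; estimating it without assuming any pure pixel is precisely what SISAL addresses, and the per-pixel independence of the inversion is what makes the problem map well onto a GPU.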