918 results for Communication, information technologies
Abstract:
This working paper seeks to establish a new field of research at the crossroads between migration flows and information and communication flows. Several factors make this perspective worth adopting. The central point is that contemporary international migration is embedded in the dynamics of the information society, following common patterns and interconnected dynamics. Consequently, information flows are beginning to be identified as key issues in migration policies. Moreover, there is a lack of empirical knowledge on the design of information networks and the use of information and communication technologies in migration contexts. This working paper also aims to be a source of hypotheses for further research.
Abstract:
Contemporary international migration is embedded in a process of global interconnection defined by the revolutions in transport and in information and communication technologies. One consequence of this global interconnection is that migrants have a greater capacity to process information both before and after leaving. These changes could have unexpected implications for contemporary migration in terms of migrants' capacity to make better-informed decisions, the reduction of uncertainty in migration contexts, the blurring of the concept of distance, and the decision to migrate to more distant places. This research matters because a lack of knowledge on this issue could widen the gap between the objectives of migration policies and their outcomes. The role played by information agents in migration contexts could also change. In this scenario, for migration policies to be more effective, they will have to take into account migrants' greater capacity to process information and the information sources they trust. This article shows that the equation "more information equals better informed" does not always hold. Even in the information age, unreliable sources, false expectations, information overload and rumours are still present in migration contexts. Nevertheless, we argue that these unintended effects could be reduced by meeting four requirements of reliable information: that it be comprehensive, relevant, trusted and up to date.
Abstract:
Earthquakes occurring around the world each year cause thousands of deaths, millions of dollars in damage to infrastructure, and incalculable human suffering. In recent years, satellite technology has been a significant boon to response efforts following an earthquake and its after-effects by providing mobile communications between response teams and remote sensing of damaged areas to disaster management organizations. In 2007, an international team of students and professionals assembled during the International Space University's Summer Session Program in Beijing, China to examine how satellite and ground-based technology could be better integrated to provide an optimised response in the event of an earthquake. The resulting Technology Resources for Earthquake MOnitoring and Response (TREMOR) proposal describes an integrative prototype response system that will implement mobile satellite communication hubs providing telephone and data links between response teams, onsite telemedicine consultation for emergency first-responders, and satellite navigation systems that will locate and track emergency vehicles and guide search-and-rescue crews. A prototype earthquake simulation system is also proposed, integrating historical data, earthquake precursor data, and local geomatics and infrastructure information to predict the damage that could occur in the event of an earthquake. The backbone of these proposals is a comprehensive education and training program to help individuals, communities and governments prepare in advance. The TREMOR team recommends the coordination of these efforts through a centralised, non-governmental organization.
Abstract:
In this paper we present the theoretical and methodological foundations for the development of a multi-agent Selective Dissemination of Information (SDI) service model that applies Semantic Web technologies for specialized digital libraries. These technologies make it possible to achieve more efficient information management, improve agent–user communication processes, and facilitate accurate access to relevant resources. Other tools used are fuzzy linguistic modelling techniques (which ease the interaction between users and the system) and natural language processing (NLP) techniques for semiautomatic thesaurus generation. Also, RSS feeds are used as "current awareness bulletins" to generate personalized bibliographic alerts.
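The idea of turning an RSS feed into a personalized bibliographic alert can be sketched very simply: parse the feed and keep only the items that match a user's interest profile. The abstract does not describe the actual matching mechanism (which uses fuzzy linguistic modelling), so the keyword filter, feed content and function name below are illustrative assumptions only.

```python
import xml.etree.ElementTree as ET

# Hypothetical RSS fragment standing in for a "current awareness bulletin".
RSS = """<rss version="2.0"><channel>
<item><title>Fuzzy linguistic modelling for digital libraries</title></item>
<item><title>Advances in crop rotation</title></item>
<item><title>Semantic Web thesauri and NLP</title></item>
</channel></rss>"""

def personalized_alerts(rss_xml: str, interests: list[str]) -> list[str]:
    """Return item titles matching any of the user's interest keywords."""
    root = ET.fromstring(rss_xml)
    titles = [item.findtext("title", "") for item in root.iter("item")]
    wanted = [kw.lower() for kw in interests]
    return [t for t in titles if any(kw in t.lower() for kw in wanted)]

alerts = personalized_alerts(RSS, ["semantic web", "fuzzy"])
```

A real SDI service would replace the plain substring match with the fuzzy linguistic and thesaurus-based matching the paper proposes; the filtering pipeline itself stays the same.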
Abstract:
Today, information technology is strategically important to the goals and aspirations of business enterprises, governments and higher-education institutions such as universities. Universities face new challenges in the emerging global economy, characterized by the importance of providing faster communication services and improving the productivity and effectiveness of individuals. One such challenge is to provide an information network that supports the demands and diversity of university activities. A network architecture, which is a set of design principles for building a network, is one of the pillars of such an effort. It is the cornerstone that enables the university's faculty, researchers, students, administrators, and staff to discover, learn, reach out, and serve society. This thesis focuses on network architecture definitions and fundamental components. The three most important characteristics of a high-quality architecture are that it is an open network architecture, it is service-oriented, and it is an IP network based on packets. There are four important components in the architecture: Services and Network Management, Network Control, Core Switching and Edge Access. The theoretical contribution of this study is a reference model for a university campus network architecture that can be followed or adapted to build a robust yet flexible network that responds to next-generation requirements. The results are relevant as a complete reference guide to the process of building a campus network, which nowadays plays a very important role. The research thus gives university networks a structured, modular model that is reliable, robust and can easily grow.
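The four-component reference model named above can be represented as a simple layered data structure. The component names come from the abstract; the concrete duties attached to each layer are assumptions added here for illustration, not the thesis's own lists.

```python
from dataclasses import dataclass, field

@dataclass
class Layer:
    """One architectural component and its (illustrative) responsibilities."""
    name: str
    duties: list[str] = field(default_factory=list)

# The four components named in the thesis; the duties are assumed examples.
campus_reference_model = [
    Layer("Services and Network Management", ["monitoring", "provisioning"]),
    Layer("Network Control", ["routing policy", "authentication"]),
    Layer("Core Switching", ["high-speed IP packet forwarding"]),
    Layer("Edge Access", ["wired and wireless user attachment"]),
]
```

Modelling the architecture this explicitly makes the modular, easily grown structure the abstract claims concrete: a new campus service is a new duty on an existing layer, not a new layer.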
Abstract:
This dissertation is an opportunity to share the results of the implementation of a distance-learning programme that we carried out at the University of Cape Verde. In the first part, we describe the nature of education policy in Cape Verde. We contextualize the principles behind the establishment of the public university in this country, as well as the university's intentions regarding pedagogical innovation. The second part takes a complementary look at the use of ICT and the Internet in the teaching and learning of a language, in this case French as a foreign language, drawing on constructivist and socio-constructivist theories. Finally, the third part details all the stages of the design and implementation of the distance-learning programme. In this third part, we first address the question of the stakes and risks of e-learning and present our role in the "e-Learning.cv" project led by the university. We then analyse some of the courses we put online, bearing in mind that an online course is not a simple reproduction of printed teaching material but offers the learner a multimedia and interactive environment. Finally, we step back to offer a critical analysis of what we achieved and to identify prospects for improving the work carried out.
Abstract:
Earthquakes represent a major hazard for populations around the world, causing frequent loss of life, human suffering and enormous damage to homes, other buildings and infrastructure. The Technology Resources for Earthquake Monitoring and Response (TREMOR) Team of 36 space professionals analysed this problem over the course of the International Space University Summer Session Program and published their recommendations in the form of a report. The TREMOR Team proposes a series of space- and ground-based systems to provide improved capability to manage earthquakes. The first proposed system is a prototype earthquake early-warning system that improves the existing knowledge of earthquake precursors and addresses the potential of these phenomena. Thus, the system will at first enable the definitive assessment of whether reliable earthquake early warning is possible through precursor monitoring. Should the answer be affirmative, the system itself would then form the basis of an operational early-warning system. To achieve these goals, the authors propose a multi-variable approach in which the system will combine, integrate and process precursor data from space- and ground-based seismic monitoring systems (already existing and newly proposed systems) and data from a variety of related sources (e.g. historical databases, space weather data, fault maps). The second proposed system, the prototype earthquake simulation and response system, coordinates the main components of the response phase to reduce the time delays of response operations, increase the level of precision in the data collected, facilitate communication amongst teams, enhance rescue and aid capabilities and so forth. It is based in part on an earthquake simulator that will provide pre-event (if early warning is proven feasible) and post-event damage assessment and detailed data on the affected areas to the corresponding disaster management actors by means of a geographic information system (GIS) interface. This is coupled with proposed mobile satellite communication hubs to provide links between response teams. Business- and policy-based implementation strategies for these proposals, such as the establishment of a non-governmental organisation to develop and operate the systems, are included.
Abstract:
There is no doubt about the necessity of protecting digital communication: citizens entrust their most confidential and sensitive data to digital processing and communication, and so do governments, corporations, and armed forces. Digital communication networks are also an integral component of many critical infrastructures we seriously depend on in our daily lives. Transportation services, financial services, energy grids, and food production and distribution networks are only a few examples of such infrastructures. Protecting digital communication means protecting confidentiality and integrity by encrypting and authenticating its contents. Yet most digital communication is not secure today, even though some of the most pressing problems could be solved with a more stringent use of current cryptographic technologies. Quite surprisingly, a new cryptographic primitive emerges from the application of quantum mechanics to information and communication theory: Quantum Key Distribution (QKD). QKD is difficult to understand, complex, technically challenging, and costly, yet it enables two parties to share a secret key for use in any subsequent cryptographic task, with unprecedented long-term security. It is disputed whether technically and economically feasible applications can be found. Our vision is that, despite technical difficulty and inherent limitations, Quantum Key Distribution has great potential and fits well with other cryptographic primitives, enabling the development of highly secure new applications and services. In this thesis we take a structured approach to analysing the practical applicability of QKD and present several use cases of different complexity for which it can be the technology of choice, either because of its unique forward-security features or because of its practicability.
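The key-sharing step QKD provides can be illustrated with the sifting phase of the BB84 protocol, the best-known QKD scheme: Alice sends random bits in randomly chosen bases, Bob measures in randomly chosen bases, and only the positions where the bases agree contribute to the shared key. This is a classical simulation sketch (no quantum channel, no eavesdropper, no error correction or privacy amplification), with all names and parameters chosen here for illustration.

```python
import random

def bb84_sift(n_qubits: int, seed: int = 0) -> list[int]:
    """Simulate BB84 sifting: keep Alice's bit wherever Bob happened
    to measure in the same basis she used to encode it."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n_qubits)]
    alice_bases = [rng.randint(0, 1) for _ in range(n_qubits)]  # 0 = rectilinear, 1 = diagonal
    bob_bases   = [rng.randint(0, 1) for _ in range(n_qubits)]
    # With matching bases (and no eavesdropper), Bob's measurement
    # deterministically yields Alice's bit, so it joins the sifted key.
    return [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases) if a == b]

key = bb84_sift(1000)  # roughly half the positions survive sifting
```

The resulting sifted key is what a real QKD link would then feed, after error correction and privacy amplification, into a subsequent cryptographic task such as one-time-pad or AES encryption.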
Abstract:
The Spanish version is available at http://hdl.handle.net/2445/8955
Abstract:
The Catalan version is available at http://hdl.handle.net/2445/8954
Abstract:
The application of mathematics and statistics to the study of informational phenomena has given rise, within information science, to a new axis of research and development: infometrics. After showing the value of this application, but also warning against certain abuses and misuses, we present some examples of mathematical and statistical infometrics applied to scientific journals. They illustrate the scope and effectiveness of the analyses that can be carried out on one or more informational variables.
Abstract:
New and alternative scientific publishing business models are a reality, driven mostly by information and communication technologies, by movements towards the academic community's recovery of control over scientific communication activities, and by open access approaches. The hybrid business model, mixing open and toll access, is a reality, and the two will probably co-exist with their respective trade-offs. This essay discusses the changes driven by e-publishing and their impacts on the interrelationships among scholarly communication stakeholders (publisher–researcher, publisher–library and publisher–user interrelationships), as well as the changes in scientific publishing business models, followed by a discussion of possible evolving business models. Whatever model evolves and dominates, a huge cultural change in authors' and institutions' publishing practices will be necessary for open access to happen and for the right business models for traditional publishers to consolidate. External changes such as policies, reward systems and institutional mandates should also happen in order to sustain the whole changing scenario.
Abstract:
This paper analyses the adoption of new information and communication technologies (ICTs) by Spanish journalists specialising in science. Applying an ethnographic research model, this study was based on a wide sample of professionals, aiming to evaluate the extent to which science journalists have adopted the new media and changed the way they use information sources. In addition, interviewees were asked whether, in their opinion, the Web 2.0 has had an impact on the quality of the news. The integration of formats certainly raises issues for today's newsrooms. Finally, with the purpose of improving the practice of science information dissemination, the authors put forward a few proposals, namely: increasing the training of Spanish science journalists in the field of new technologies; emphasising the accuracy of information and the validation of sources; and rethinking the mandates and tasks of information professionals.