67 results for Integration and security technologies
Abstract:
This article addresses the normative dilemma located within the application of 'securitization' as a method of understanding the social construction of threats and security policies. Securitization as a theoretical and practical undertaking is being increasingly used by scholars and practitioners. This article aims to provide those wishing to engage with securitization with an alternative application of this theory, one which is sensitive to and self-reflective of the possible normative consequences of its employment. It argues that discussing and analyzing securitization processes has normative implications, understood here as the negative securitization of a referent. The negative securitization of a referent is asserted to be carried out through the unchallenged analysis of securitization processes which have emerged through relations of exclusion and power. It then offers a critical understanding and application of securitization studies as a way of overcoming the identified normative dilemma. First, it examines how the Copenhagen School's formation of securitization theory gives rise to a normative dilemma, which is situated in the performative and symbolic power of security as a political invocation and theoretical concept. Second, it evaluates previous attempts to overcome the normative dilemma of securitization studies, outlining the obstacles that each individual proposal faces. Third, this article argues that the normative dilemma of applying securitization can be avoided, first, by deconstructing the institutional power of security actors and dominant security subjectivities and, second, by addressing countering or alternative approaches to security and incorporating different security subjectivities. Examples of the securitization of international terrorism and immigration are prominent throughout.
Abstract:
The aim of this paper is to analyse the effects of human capital, advanced manufacturing technologies (AMT), and new work organizational practices on firm productivity, while taking into account the synergies existing between them. This study expands current knowledge in this area in two ways. First, in contrast with previous works, we focus on AMT rather than ICT (information and communication technologies). Second, we use a unique employer-employee data set for small firms in a particular area of southern Europe (Catalonia, Spain). Using a small-firm data set allows us to analyse the particular case of small and medium enterprises, since we cannot assume they have the same characteristics as large firms. The results provide evidence in favor of the complementarity hypothesis between human capital, advanced manufacturing technologies, and new work organization practices, although we show that the complementarity effects depend on the type of work organization practices used by a firm. For small and medium Catalan firms, the only set of work organization practices that improves the benefits of human capital and technology investment comprises the more quality-oriented practices, such as quality circles, problem-solving groups or total quality management.
Abstract:
Research project carried out during a stay at Stanford University, USA, between 2007 and 2009. In recent years there has been spectacular progress in the technology applied to genome and proteome analysis (microarrays, real-time quantitative PCR, two-dimensional electrophoresis, mass spectrometry, etc.), allowing the resolution of complex samples and the quantitative detection of different genes and proteins in a single experiment. Moreover, the importance of these techniques lies in their capacity to identify potential therapeutic targets and candidate drugs, as well as their application in the design and development of new diagnostic tools. The applicability of current techniques, however, is limited by the level to which the tissue can be dissected. Although they provide valuable information on the expression of genes and proteins involved in a disease or in response to a drug, for example, in no case do they provide in situ information, nor can spatial information or temporal resolution be obtained, nor do they yield information on in vivo systems. The aim of this project is to develop and validate a new high-resolution, ultrasensitive and easy-to-use microscope that allows both the detection of metabolites, genes or proteins in the living cell in real time and the study of their function, thus obtaining a detailed description of the protein/gene interactions taking place inside the cell. This microscope will be a sensitive, selective, fast, robust, automated and moderately priced instrument that will perform genetic, medical, chemical and pharmaceutical high-throughput screening (for diagnostic applications and for the identification and selection of active compounds) more efficiently.
To achieve these objectives the microscope will make use of the newest technologies: 1) optical microscopy and imaging, to improve spatial visualization and image sensitivity; 2) new detection methods, including the latest advances in nanoparticles; 3) computational methods to acquire, store and process the images obtained.
Abstract:
This working paper seeks to establish a new field of research at the crossroads between migration flows and information and communication flows. Several factors make this perspective worth adopting. The central point is that contemporary international migration is embedded in the dynamics of the information society, following common patterns and interconnected dynamics. Consequently, information flows are beginning to be identified as key issues in migration policies. In addition, there is a lack of empirical knowledge on the design of information networks and the use of information and communication technologies in migration contexts. This working paper also aims to serve as a source of hypotheses for further research.
Abstract:
The development of information and communication technologies (ICT) during the last forty years of the twentieth century and their incorporation into the different spheres of human activity lead us to ask, at the beginning of the twenty-first century, what profound transformations accompany these developments and what consequences, at least in the short term, they entail. The focus of this project is the analysis of the transformation processes of university academic life in the Catalan context, their connection with the current reality and the repercussions these processes have on society in general. More specifically, the objective is, first, to explore from a global perspective the incorporation of the Internet into Catalan universities and, second, to analyse the processes of change this entails in the teaching and research processes of the Universitat Rovira i Virgili (URV). This report presents the results of three specific studies, each with its own objectives, methodology and discussion: the configuration of the network of Catalan universities (physical connection and shared projects), the presence of Catalan universities on the Internet, and a case study of the URV.
Abstract:
In this paper, we consider ATM networks in which the virtual path concept is implemented. The question of how to multiplex two or more diverse traffic classes while providing different quality of service (QoS) requirements is a very complicated open problem. Two distinct options are available: integration and segregation. In an integration approach, all the traffic from different connections is multiplexed onto one VP. This implies that the most restrictive QoS requirements must be applied to all services; therefore, link utilization is decreased because unnecessarily stringent QoS is provided to all connections. With the segregation approach the problem can be much simplified if the different types of traffic are separated by assigning to each a VP with dedicated resources (buffers and links); in that case, resources may not be efficiently utilized because no sharing of bandwidth can take place across VPs. The probability that the bandwidth required by the accepted connections exceeds the capacity of the link is evaluated with the probability of congestion (PC). Since the PC can be expressed as the cell loss probability (CLP), we shall simply carry out bandwidth allocation using the PC. We first focus on the influence of some parameters (CLP, bit rate and burstiness) on the capacity required by a VP supporting a single traffic class using the new convolution approach. Numerical results are presented both to compare the required capacity and to observe under which conditions each approach is preferred.
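The convolution approach mentioned in this abstract can be sketched as follows: treating each connection as a two-state (on/off) source, the aggregate bandwidth-demand distribution on a VP is obtained by convolving the per-connection distributions, and the PC is the tail mass above the link capacity. The peak rates, activity probabilities and capacity units below are illustrative assumptions, not figures from the paper.

```python
import numpy as np

# A minimal sketch of the convolution approach, assuming each connection is an
# on/off source demanding "peak_rate" bandwidth units with probability
# "p_active" and zero otherwise. All figures are illustrative.
def congestion_probability(connections, capacity):
    """Return PC = P(aggregate demand > capacity) for a VP carrying the
    given connections, by convolving per-connection demand distributions."""
    dist = np.array([1.0])  # start with "no connections": demand is 0
    for peak_rate, p_active in connections:
        src = np.zeros(peak_rate + 1)
        src[0] = 1.0 - p_active    # source silent
        src[peak_rate] = p_active  # source transmitting at peak rate
        dist = np.convolve(dist, src)
    return dist[capacity + 1:].sum()  # tail mass above the link capacity

# Three identical bursty connections multiplexed onto a VP of capacity 8:
# congestion occurs only if all three are active (aggregate demand 12)
pc = congestion_probability([(4, 0.3), (4, 0.3), (4, 0.3)], capacity=8)
```

Dimensioning the VP then amounts to finding the smallest capacity whose PC falls below the target CLP.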
Abstract:
The recent revolution in genomic data generation techniques has led to exponential growth in the amount of data produced, making it more necessary than ever to work on optimizing the management and handling of this information. This work has tackled three facets of the problem: the dissemination of the information, the integration of data from diverse sources and, finally, its visualization. Building on the Distributed Annotation System (DAS), we have created an application for the automated creation of new data sources, in a standardized and programmatically accessible format, from simple data files. This software, easyDAS, is in operation at the European Bioinformatics Institute. The system facilitates and encourages the sharing and dissemination of genomic data in usable formats. jsDAS is a DAS client library that makes it possible to incorporate DAS data into any web application simply and quickly. Taking advantage of what DAS offers, it can integrate data from multiple sources coherently and robustly. GenExp is a prototype of a highly interactive web-based genome browser that facilitates the exploration of genomes in real time. It can integrate data from any DAS source and render them on the client using the latest advances in web technologies.
Abstract:
Quantitative or algorithmic trading is the automation of investment decisions obeying fixed or dynamic sets of rules to determine trading orders. It now accounts for up to 70% of the trading volume of some of the biggest financial markets, such as the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, due to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject in which publications are scarce and infrequent. We review the basic and fundamental mathematical concepts needed for modeling financial markets, such as stochastic processes, stochastic integration and basic models for price and spread dynamics, necessary for building quantitative strategies. We also contrast these models with real market data sampled at one-minute frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behavior: trend following or mean reversion. The former is grouped in the so-called technical models and the latter in so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast as well-defined scientific predictors if the signals they generate pass the test of being a Markov time. That is, we can tell whether the signal has occurred or not by examining the information up to the current time; or, more technically, if the event is F_t-measurable. On the other hand, the concept of pairs trading, or market-neutral strategy, is fairly simple. However, it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, to a co-integration framework, to stochastic differential equations such as the well-known mean-reverting Ornstein-Uhlenbeck SDE and its variations.
A model for forecasting any economic or financial magnitude could be properly defined with scientific rigor but could also lack any economic value and be considered useless from a practical point of view. This is why this project could not be complete without a backtest of the mentioned strategies. Conducting a useful and realistic backtest is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving in time. This is why we emphasize the calibration process of the strategies' parameters to adapt them to the given market conditions. We find that the parameters of technical models are more volatile than their counterparts from market-neutral strategies, and that calibration must be done at high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtest with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numeric computations and graphics used and shown in this project were implemented from scratch in MATLAB as part of this thesis; no other mathematical or statistical software was used.
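As a minimal illustration of the mean-reversion (pairs-trading) idea summarized above, the sketch below generates positions from a rolling z-score of a simulated Ornstein-Uhlenbeck-like spread: enter when the spread leaves an entry band, close when it reverts. The window, thresholds and simulated series are arbitrary choices for illustration, not the calibrated parameters of the thesis (which used MATLAB and real DJIA data).

```python
import numpy as np

# Toy mean-reversion signal on a spread; all parameters are illustrative.
def zscore_signals(spread, window=20, entry=2.0, exit_band=0.5):
    """Return a position series: +1 long spread, -1 short spread, 0 flat."""
    signals = np.zeros(len(spread))
    position = 0
    for t in range(window, len(spread)):
        hist = spread[t - window:t]
        z = (spread[t] - hist.mean()) / hist.std()
        if position == 0:
            if z > entry:
                position = -1   # spread unusually wide: short it
            elif z < -entry:
                position = 1    # spread unusually narrow: long it
        elif abs(z) < exit_band:
            position = 0        # spread has reverted: close the position
        signals[t] = position
    return signals

# Simulate an Ornstein-Uhlenbeck-like spread with a simple Euler scheme
rng = np.random.default_rng(0)
spread = np.zeros(500)
for t in range(1, 500):
    spread[t] = spread[t - 1] - 0.1 * spread[t - 1] + 0.1 * rng.standard_normal()
sig = zscore_signals(spread)
```

A realistic backtest would additionally account for transaction costs and re-calibrate the window and thresholds as market conditions drift, as the abstract stresses.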
Abstract:
The aim of this book is to survey different Land Use Planning and safety approaches in the vicinity of industrial plants. As this research spans the three broad fields of Land Use Planning, safety and security, the guiding principle has been to avoid unnecessary and overly detailed information while including what is useful, so as to provide a comprehensive resource applicable for several purposes. Besides, the proposed method, explained in Chapter 7, can open a new field for the future of Land Use Planning in the vicinity of industrial plants.
Abstract:
This paper examines the importance that the current Convention on the Future of Europe is giving (or not) to the question of democratic accountability in European foreign and defence policy. As all European Union (EU) member states are parliamentary democracies, and as there is a European Parliament (EP) which also covers CFSP (Common Foreign and Security Policy) and ESDP (European Security and Defence Policy) matters, I will concentrate on parliamentary accountability rather than democratic accountability more widely defined. Where appropriate, I will also refer to the work of other transnational parliamentary bodies such as the North Atlantic Assembly or NAA (NATO's Parliamentary Assembly) or the Western European Union (WEU) Parliamentary Assembly. The article will consist of three sections. First, I will briefly put the question under study within its wider context (section 1). Then, I will examine the current level of parliamentary accountability in CFSP and defence matters (section 2). Finally, I will consider the current Convention debate and assess how much attention is being given to the question of accountability in foreign and defence policies (section 3). This study basically argues that, once again, there is very little interest in an issue that should be considered vital for the future democratic development of a European foreign and defence policy. It is important to note, however, that this paper does not cover the wider debate about how to democratise the EU and make it more transparent and closer to its citizens. It concentrates on the Second Pillar because its claim is that very little if any attention is being given to this question.
Abstract:
Next Generation Access Networks (NGAN) are the new step forward to deliver broadband services and to facilitate the integration of different technologies. It is plausible to assume that, from a technological standpoint, the Future Internet will be composed of long-range high-speed optical networks; a number of wireless networks at the edge; and, in between, several access technologies, among which the Passive Optical Networks (xPON) are very likely to succeed, due to their simplicity, low cost, and increased bandwidth. Among the different PON technologies, the Ethernet PON (EPON) is the most promising alternative to satisfy operator and user needs, due to its cost, flexibility and interoperability with other technologies. One of the most interesting challenges in such technologies relates to the scheduling and allocation of resources in the upstream (shared) channel. The aim of this research project is to study and evaluate current contributions and propose new efficient solutions to address the resource allocation issues in Next Generation EPON (NG-EPON). Key issues in this context are future end-user needs, integrated quality of service (QoS) support and optimized service provisioning for real-time and elastic flows. This project will unveil research opportunities, issue recommendations and propose novel mechanisms associated with the convergence of heterogeneous access networks, and will thus serve as a basis for long-term research projects in this direction. The project has served as a platform for the generation of new concepts and solutions that have been published in national and international conferences, scientific journals and a book chapter. We expect further research publications, in addition to the ones mentioned, in the coming months.
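To make the upstream scheduling problem concrete, the sketch below shows the classic "limited service" discipline of IPACT-style dynamic bandwidth allocation (DBA) for EPON: each ONU is granted the bandwidth it REPORTed, capped per polling cycle so that one backlogged ONU cannot monopolize the shared channel. The ONU names, reported queue sizes and per-cycle cap are illustrative assumptions, not mechanisms proposed by the project.

```python
# A minimal sketch of limited-service DBA grant sizing for the EPON upstream
# channel; all names and figures below are hypothetical examples.
def limited_service_grants(reports, max_grant):
    """Grant each ONU the bandwidth it REPORTed, capped per polling cycle,
    so a single backlogged ONU cannot monopolize the shared upstream."""
    return {onu: min(requested, max_grant) for onu, requested in reports.items()}

# Queue lengths (bytes) reported by three ONUs; per-cycle cap of 5000 bytes
grants = limited_service_grants({"onu1": 3000, "onu2": 8000, "onu3": 500}, 5000)
# grants == {"onu1": 3000, "onu2": 5000, "onu3": 500}
```

NG-EPON proposals refine this baseline with, for example, QoS-aware weighting and differentiated treatment of real-time versus elastic flows, which is the design space the project explores.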
Abstract:
The study examines the relationships between (1) the personal social networks of the immigrant population living in Barcelona and (2) their multiple cultural identities. The main objective of the study is to understand how the content and structure of immigrants' social relationships facilitate or hinder (1) developing a sense of belonging to the new host cultures, Catalan and Spanish, and (2) the integration of these new sociocultural identities with their identity of origin into a new, cohesive bicultural identity. Our initial premise was that immigrants whose social networks are more diverse in cultural composition would have more social resources and more diverse cognitive experiences, factors that favour multiple identifications and civic participation. The results of the study show that the participants' degree of identification with their ethnic culture or culture of origin is quite high and, to some extent, higher than with the host cultures (Catalan, civic and Spanish). Nevertheless, the participants' bond with the host cultures (e.g., Catalan culture) is strong enough to indicate a bicultural orientation (Catalan and ethnic). Correlation analyses reveal that feeling Catalan does not preclude feeling part of the ethnocultural community of origin. Moreover, there is an interrelationship between Catalan cultural orientation and identification with local civic communities. Likewise, competence in Catalan does not come at the expense of competence in Spanish. The analyses also show that factors such as Catalan cultural orientation, use of Catalan and identification with Catalan culture correlate positively with the degree of cohesion of the bicultural identity, promote psychological well-being and reduce acculturative stress.
The social network analysis shows that identification with Catalan culture, Catalan cultural orientation and identity integration are key factors for having social networks that are more ethnically and linguistically diverse, with fewer members of the community of origin and with culturally more heterogeneous subgroups or "cliques". Spanish identification also predicts, to a lesser extent, network diversity. Our results contribute to current research and theories on interculturality and cultural identity.
Abstract:
Information and communication technologies pose accessibility problems to people with disabilities because their design fails to take into account their communication and usability requirements. The impossibility of accessing the services provided by these technologies creates a situation of exclusion that reduces the self-sufficiency of disabled individuals and causes social isolation, which in turn diminishes their overall quality of life. Considering the importance of these technologies and services in our society, we have developed a pictogram-based Instant Messaging service for individuals with cognitive disabilities who have reading and writing problems. Throughout the paper we introduce and discuss the User Centred Design methodology that we have used to develop and evaluate the pictogram-based Instant Messaging service and client with individuals with cognitive disabilities, taking into account their communication and usability requirements. From the results obtained in the evaluation process we can state that individuals with cognitive disabilities have been able to use the pictogram-based Instant Messaging service and client to communicate with their relatives and acquaintances, thus serving as a tool to help reduce their social and digital exclusion.
Abstract:
A recent finding of the structural VAR literature is that the response of hours worked to a technology shock depends on the assumption made about the order of integration of hours. In this work we relax this assumption, allowing for fractional integration and long memory in the processes for hours and productivity. We find that the sign and magnitude of the estimated impulse responses of hours to a positive technology shock depend crucially on the assumptions applied to identify them. Responses estimated with short-run identification are positive and statistically significant in all datasets analyzed. Long-run identification results in negative, often statistically insignificant, responses. We check the validity of these assumptions with the Sims (1989) procedure, concluding that both types of assumptions are appropriate to recover the impulse responses of hours in a fractionally integrated VAR. However, the application of long-run identification results in a substantial increase in sampling uncertainty. JEL Classification numbers: C22, E32. Keywords: technology shock, fractional integration, hours worked, structural VAR, identification.
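The fractional integration allowed for in this abstract rests on the fractional-differencing filter (1 - L)^d, whose binomial weights follow a simple recursion. The sketch below applies a truncated version of the filter to a series; the truncation length and the test series are illustrative, not the paper's estimation procedure.

```python
import numpy as np

# A sketch of fractional differencing: apply the truncated binomial expansion
# of (1 - L)^d to a series. Truncation length and inputs are illustrative.
def frac_diff(x, d, n_weights=100):
    """Apply (1 - L)^d to x using recursively computed binomial weights."""
    w = [1.0]
    for k in range(1, n_weights):
        w.append(-w[-1] * (d - k + 1) / k)  # w_k = -w_{k-1} * (d - k + 1) / k
    w = np.array(w)
    out = np.empty(len(x))
    for t in range(len(x)):
        lags = x[max(0, t - n_weights + 1):t + 1][::-1]  # x_t, x_{t-1}, ...
        out[t] = np.dot(w[:len(lags)], lags)
    return out

x = np.arange(10.0)
# d = 1 reproduces the ordinary first difference; d = 0 leaves x unchanged;
# intermediate d gives the long-memory behavior exploited in the paper
```

Integer d thus nests the "hours in levels" (d = 0) and "hours in differences" (d = 1) assumptions whose consequences the paper contrasts.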
Abstract:
One of the most relevant difficulties faced by first-year undergraduate students is settling into the educational environment of universities. This paper presents a case study that proposes a computer-assisted collaborative experience designed to help students in their transition from high school to university. This is done by facilitating their first contact with the campus and its services, the university community, methodologies and activities. The experience combines individual and collaborative activities, conducted in and out of the classroom, structured following the Jigsaw Collaborative Learning Flow Pattern. A specific environment including portable technologies with network and computer applications has been developed to support and facilitate the orchestration of a flow of learning activities into a single integrated learning setting. The result is a Computer-Supported Collaborative Blended Learning scenario, which has been evaluated with first-year university students of the Software and Audiovisual Engineering degrees within the subject Introduction to Information and Communications Technologies. The findings reveal that the scenario significantly improves students' interest in their studies and their understanding of the campus and the services provided. The environment is also an innovative approach to successfully supporting the heterogeneous activities conducted by both teachers and students during the scenario. This paper introduces the goals and context of the case study, describes how the technology was employed to conduct the learning scenario, and presents the evaluation methods and the main results of the experience.