12 results for New Information and Communication Technology
at Université de Lausanne, Switzerland
Abstract:
This article examines the extent and limits of nonstate forms of authority in international relations. It analyzes how the information and communication technology (ICT) infrastructure for the tradability of services in a global knowledge-based economy relies on informal regulatory practices for the adjustment of ICT-related skills. By focusing on the challenge that highly volatile and short-lived cycles of demand for this type of knowledge pose for ensuring the right qualification of the labor force, the article explores how companies and associations provide training and certification programs as part of a growing market for educational services that sets its own standards. The existing literature on non-conventional forms of authority in the global political economy has emphasized that the consent of actors subject to informal rules, together with some form of state support, remains crucial for the effectiveness of those new forms of power. However, analyses based on a limited sample of actors tend toward a narrow understanding of the issues concerned and fail to fully explore the differentiated space in which nonstate authority is emerging. This article develops a three-dimensional analytical framework that brings together the scope of the issues involved, the range of nonstate actors concerned, and the spatial scope of their authority. The empirical findings highlight the limits of these new forms of nonstate authority and shed light on the role of the state and international governmental organizations in this new context.
Abstract:
This article examines the extent and limits of non-state forms of authority in international relations. It analyses how the information and communication technology (ICT) infrastructure for the tradability of services in a global knowledge-based economy relies on informal regulatory practices for the adjustment of ICT-related skills. Companies and associations provide training and certification programmes as part of a growing market for educational services that sets its own standards. The existing literature on non-conventional forms of authority in the global political economy has emphasised that the consent of actors subject to informal rules, together with explicit or implicit state recognition, remains crucial for the effectiveness of those new forms of power. However, analyses based on a limited sample of actors tend toward a narrow understanding of the issues and fail to fully explore the differentiated space in which non-state authority is emerging. This paper examines the form of authority underpinning the global knowledge-based economy within the broader perspective of the issues likely to be standardised by technical ICT specifications, the wide range of actors involved, and the highly differentiated space in which standards become authoritative. The empirical findings highlight the role of different private actors in establishing international educational norms in this field. They also pinpoint the limits of profit-oriented standard-setting, notably with regard to generic norms.
Abstract:
The globalization of markets, changes in the economic environment, and the impact of new information technologies have forced companies to rethink the way they manage their intellectual capital (knowledge management) and human capital (competence management). It is now commonly accepted that these assets play a particularly strategic role in the organization. A company wishing to embark on a policy of managing them faces several problems: managing this knowledge and these competences requires a long capitalization process, which passes through stages such as the identification, extraction, and representation of knowledge and competences. Various knowledge and competence management methods exist for this purpose, such as MASK, CommonKADS, and KOD. Unfortunately, these methods are cumbersome to implement, are confined to certain types of knowledge, and are consequently limited in the functionality they can offer. Moreover, competence management and knowledge management are treated as two separate fields, whereas it would be worthwhile to unify the two approaches into one. Competences are indeed very close to knowledge, as the following definition of competence underlines: "a set of knowledge in action in a given context." We therefore chose to base our proposal on the concept of competence. Competence is among the most crucial forms of a company's knowledge, in particular for avoiding the loss of know-how and for anticipating the company's future needs, because behind employees' competences lies the efficiency of the organization. In addition, many other organizational concepts, such as jobs, missions, projects, and training, can be described in terms of competences. Unfortunately, there is no real consensus on the definition of competence, and the existing definitions, even when fully satisfactory to experts, do not make it possible to build an operational system. In our approach, we address competence management by means of a knowledge management method: by their very nature, knowledge and competence are intimately linked, so such a method is well suited to competence management. To exploit this knowledge and these competences, we first had to define the organizational concepts in a clear and computational way. On this basis, we propose a methodology for building the various company repositories (competence, mission, and job repositories, among others). To model these repositories we chose ontologies, because they yield coherent and consensual definitions of the concepts while supporting linguistic diversity. We then map the company's knowledge (training, missions, jobs, and so on) onto these ontologies so that it can be exploited and disseminated. This approach to knowledge and competence management led to a tool offering numerous features, such as the management of mobility areas, strategic analysis, directories, and CV management.
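A minimal sketch of the kind of competence repository the thesis describes, in Python; the class names, the overlap criterion, and the example entries are illustrative assumptions, not the thesis's actual ontology or model:

```python
# Sketch: a competence repository linking organizational concepts
# (jobs, trainings) to competences, where a competence is
# "knowledge in action in a given context". All names are
# illustrative assumptions.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Competence:
    name: str              # e.g. "Java programming"
    knowledge: frozenset   # the knowledge the competence mobilizes
    context: str           # the context in which it is exercised

@dataclass
class Repository:
    jobs: dict = field(default_factory=dict)       # job -> set of Competence
    trainings: dict = field(default_factory=dict)  # training -> set of Competence

    def mobility_area(self, source_job: str, overlap: float = 0.5):
        """Jobs whose required competences the source job already covers."""
        have = self.jobs[source_job]
        return [job for job, need in self.jobs.items()
                if job != source_job
                and len(have & need) / len(need) >= overlap]

repo = Repository()
java = Competence("Java programming", frozenset({"OOP", "JVM"}), "software projects")
uml = Competence("UML modelling", frozenset({"OOP", "design"}), "analysis phase")
repo.jobs["developer"] = {java, uml}
repo.jobs["analyst"] = {uml}
print(repo.mobility_area("developer"))  # -> ['analyst']
```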
Abstract:
OBJECTIVES: Occupational ultraviolet (UV) exposure was evaluated in a population-based sample in France. METHODS: A random survey was conducted in 2012 in individuals aged 25 to 69 years. The median daily standard erythemal UV dose (SED) was estimated from exposure time and place and matched to satellite UV records. RESULTS: A total of 889 individuals were exposed to solar UV with highest doses observed among gardeners (1.19 SED), construction workers (1.13 SED), agricultural workers (0.95 SED), and culture/art/social science workers (0.92 SED). Information and communication technology, industry, and transport workers were highly exposed (>0.70 SED). Significant factors associated with high occupational UV exposure were sex (P < 0.0001), phototype (P = 0.0003), and taking lunch outdoors (P < 0.0001). CONCLUSIONS: This study identified not only expected occupations with high UV exposure but also unexpected occupations with high exposures. This could serve as a basis for future prevention.
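As a hedged illustration of the aggregation behind such figures, the median daily SED per occupational group could be computed from exposure records already matched to satellite UV data; the field names and values below are invented for the example, not the study's data:

```python
# Sketch: median daily erythemal dose (SED) per occupational group,
# computed from individual exposure records. Values are synthetic.
import statistics
from collections import defaultdict

records = [  # (occupation, daily dose in SED)
    ("gardener", 1.4), ("gardener", 1.0),
    ("construction", 1.2), ("construction", 1.05),
    ("office", 0.2), ("office", 0.3),
]

by_occupation = defaultdict(list)
for occupation, dose in records:
    by_occupation[occupation].append(dose)

for occupation, doses in sorted(by_occupation.items()):
    print(f"{occupation}: median {statistics.median(doses):.2f} SED")
```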
Abstract:
There is no doubt about the necessity of protecting digital communication: citizens entrust their most confidential and sensitive data to digital processing and communication, as do governments, corporations, and armed forces. Digital communication networks are also an integral component of many critical infrastructures we depend on in our daily lives: transportation services, financial services, energy grids, and food production and distribution networks are only a few examples. Protecting digital communication means protecting confidentiality and integrity by encrypting and authenticating its contents. Yet most digital communication is not secure today, even though some of the most pressing problems could be solved by a more stringent use of current cryptographic technologies. Quite surprisingly, a new cryptographic primitive emerges from the application of quantum mechanics to information and communication theory: Quantum Key Distribution (QKD). QKD is difficult to understand; it is complex, technically challenging, and costly. Yet it enables two parties to share a secret key for use in any subsequent cryptographic task, with unprecedented long-term security. Whether technically and economically feasible applications can be found is disputed. Our vision is that, despite its technical difficulty and inherent limitations, Quantum Key Distribution has great potential and fits well with other cryptographic primitives, enabling the development of highly secure new applications and services. In this thesis we take a structured approach to analyzing the practical applicability of QKD and present several use cases of different complexity for which it can be a technology of choice, either because of its unique forward-security features or because of its practicability.
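For intuition, here is a toy classical simulation of BB84-style key sifting, the best-known QKD protocol; it assumes a noiseless channel and no eavesdropper, and makes no security claims:

```python
# Toy BB84-style key sifting: Alice sends bits in random bases, Bob
# measures in random bases, and they keep only the positions where
# their bases happened to agree. Classical simulation for intuition.
import secrets

n = 16
alice_bits  = [secrets.randbelow(2) for _ in range(n)]
alice_bases = [secrets.randbelow(2) for _ in range(n)]  # 0: rectilinear, 1: diagonal
bob_bases   = [secrets.randbelow(2) for _ in range(n)]

# Bob's measurement: correct bit when bases match, random otherwise.
bob_bits = [a if ab == bb else secrets.randbelow(2)
            for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: publicly compare bases, keep positions where they agree.
key_alice = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
key_bob   = [b for b, ab, bb in zip(bob_bits,   alice_bases, bob_bases) if ab == bb]
assert key_alice == key_bob  # identical shared key (no noise, no Eve)
print(key_alice)
```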
Abstract:
This study explores biomonitoring communication with workers exposed to risks. Using a qualitative approach, semi-structured interviews were conducted. The results show that occupational physicians and workers share some perceptions, but they also point to communication gaps; consequently, informed consent is not guaranteed. The article proposes recommendations for occupational physicians' practice.
Abstract:
Voting is fundamental to democracy; yet this decisive democratic act requires considerable effort. Decision making at elections depends largely on the interest in gathering information about candidates and parties, the effort to process the information at hand, and the motivation to reach a vote choice. Especially in electoral systems with highly fragmented party systems and hundreds of candidates running for office, decision making in the pre-election sphere is highly demanding. In the age of information and communication technologies, new possibilities for gathering and processing such information are available. Voting Advice Applications (VAAs) provide guidance to voters prior to the act of voting and assist them in choosing between candidates and parties on the basis of issue congruence. While such tools are now widely used all over the world, scientific inquiry into their effect on electoral behavior is ongoing. This paper adds to the current debate by examining whether the popularity of candidates on the Swiss VAA smartvote paid off at the 2007 Swiss federal elections, and whether there is a direct link between a candidate's performance on the tool and his or her electoral performance.
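As a sketch of the issue-congruence matching that VAAs perform, the snippet below ranks candidates by agreement with a voter's answers; the city-block scoring rule on a 0-100 answer scale is an illustrative assumption, not smartvote's actual formula:

```python
# Sketch: rank candidates by issue congruence with a voter.
# Answers are positions on a 0-100 scale for each policy question.

def congruence(voter, candidate):
    """Return a 0..1 agreement score (1 = identical answers)."""
    max_dist = 100 * len(voter)
    dist = sum(abs(v - c) for v, c in zip(voter, candidate))
    return 1 - dist / max_dist

voter = [100, 0, 50, 75]  # answers to four policy questions
candidates = {
    "A": [100, 25, 50, 100],
    "B": [0, 100, 50, 0],
}
ranking = sorted(candidates,
                 key=lambda c: congruence(voter, candidates[c]),
                 reverse=True)
print(ranking)  # -> ['A', 'B']
```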
Abstract:
The motivation for this research originated in the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications owing to their significantly lower cost than their predecessors, the mainframes. Later, industrial automation developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control, and resilience to harsh environmental conditions. This led to the creation of an independent industry, industrial automation, embodied in PLC, DCS, SCADA, and robot control systems. This industry today employs over 200,000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into a single information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advance of Moore's law. These developments created the high-performance, low-energy-consumption system-on-chip (SoC) architecture. Unlike in the CISC world, the RISC processor architecture business is an industry separate from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface, and application market, which give customers more choice thanks to hardware-independent, real-time-capable software applications. An architecture disruption emerged, and the smartphone and tablet markets formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An underlying element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominant closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-per-click marketing has changed the way application development is compensated: freeware, ad-based, or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. As a slow-clockspeed industry, however, industrial automation has remained intact through the SoC- and software-platform-driven disruptions in the ICT industries.
Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based, hardware. They enjoy admirable profitability on a very narrow customer base thanks to strong technology-enabled customer lock-in and customers' high risk exposure, since production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels previously occupied by TV broadcasting, have gradually begun to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will build that the industrial automation market will in due course face an architecture disruption driven by these new trends. The thesis examines the current state of industrial automation and the competition among incumbents, first through research on cost-competitiveness efforts in captive outsourcing of engineering, research, and development, and second through research on process re-engineering in the case of global software support for complex systems. Third, we investigate the views of industry actors, namely customers, incumbents, and newcomers, on the future direction of industrial automation, and conclude with our assessment of the possible routes industrial automation could take, considering the looming rise of the Internet of Things and weightless networks. Industrial automation is dominated by a handful of global players, each focused on maintaining its own proprietary solutions. The rise of de facto standards such as the IBM PC, Unix, Linux, and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung, and others, created the new markets of personal computers, smartphones, and tablets, and will eventually also affect industrial automation through game-changing commoditization and the associated control-point and business-model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.
Abstract:
This article introduces a new interface for T-Coffee, a consistency-based multiple sequence alignment program. The interface provides easy and intuitive access to the most popular functionality of the package: the default T-Coffee mode for protein and nucleic acid sequences, the M-Coffee mode that combines the output of other aligners, and the template-based modes of T-Coffee that deliver high-accuracy alignments using structural or homology-derived templates. The three available template modes are Expresso for aligning proteins with known 3D structures, R-Coffee for aligning RNA sequences with conserved secondary structures, and PSI-Coffee for accurately aligning distantly related sequences using homology extension. The new server benefits from recent improvements to the T-Coffee algorithm, can align up to 150 sequences of up to 10,000 residues, and is available from both http://www.tcoffee.org and its main mirror http://tcoffee.crg.cat.
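For readers who prefer the command-line tool that the server wraps, here is a minimal sketch of driving the same modes from Python; it assumes a local t_coffee installation, and the mode names, while taken from the T-Coffee documentation, should be verified against the installed version:

```python
# Sketch: run t_coffee on a FASTA file, optionally in one of the
# modes described above. Assumes t_coffee is on the PATH.
import subprocess

def align(fasta, mode=None):
    """Align sequences in `fasta`, optionally using a named mode."""
    cmd = ["t_coffee", fasta]
    if mode:  # e.g. "mcoffee", "expresso", "rcoffee", "psicoffee"
        cmd += ["-mode", mode]
    subprocess.run(cmd, check=True)

align("proteins.fasta")                   # default T-Coffee mode
align("proteins.fasta", mode="expresso")  # structure-template mode
```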
Abstract:
OBJECTIVES: In this study, we investigated the structural plasticity of the contralesional motor network in ischemic stroke patients using diffusion magnetic resonance imaging (MRI) and explored a model that combines an MRI-based metric of contralesional network integrity with clinical data to predict functional outcome at 6 months after stroke. METHODS: MRI and clinical examinations were performed in 12 patients in the acute phase and at 1 and 6 months after stroke. Twelve age- and gender-matched controls underwent 2 MRIs 1 month apart. Structural remodeling after stroke was assessed using diffusion MRI with an automated measurement of generalized fractional anisotropy (GFA), which was calculated along connections between contralesional cortical motor areas. The predictive model of poststroke functional outcome was computed using a linear regression of acute GFA measures and the clinical assessment. RESULTS: GFA changes in the contralesional motor tracts were found in all patients and differed significantly from controls (0.001 ≤ p < 0.05). GFA changes in intrahemispheric and interhemispheric motor tracts correlated with age (p ≤ 0.01); those in intrahemispheric motor tracts correlated strongly with clinical scores and stroke sizes (p ≤ 0.001). GFA measured in the acute phase, together with a routine motor score and age, was a strong predictor of motor outcome at 6 months (r(2) = 0.96, p = 0.0002). CONCLUSION: These findings represent a proof of principle that contralesional diffusion MRI measures may provide reliable information for personalized rehabilitation planning after ischemic motor stroke.
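A hedged reconstruction of the prediction step: a linear regression of 6-month motor outcome on acute-phase GFA, a routine motor score, and age. The numbers are synthetic and the exact model specification is an assumption; the study's actual data are not reproduced here:

```python
# Sketch: linear regression predicting 6-month motor outcome from
# acute GFA change, acute motor score, and age. Synthetic data.
import numpy as np
from sklearn.linear_model import LinearRegression

# Columns: acute GFA change, acute motor score, age (invented values)
X = np.array([
    [0.02, 40, 61], [0.05, 25, 72], [0.01, 55, 58],
    [0.04, 30, 66], [0.03, 45, 70], [0.06, 20, 75],
])
y = np.array([80, 55, 90, 65, 70, 45])  # motor outcome at 6 months

model = LinearRegression().fit(X, y)
print(f"r^2 = {model.score(X, y):.2f}")
print(model.predict([[0.03, 35, 68]]))  # outcome for a new patient
```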