956 results for concept of globalization
Abstract:
Migration partnerships (MPs) have become a key instrument in global migration governance. In contrast to traditional unilateral approaches, MPs emphasize a more comprehensive and inclusive handling of migration issues among countries of origin, transit, and destination. Owing to this cooperation-oriented concept, most existing studies on MPs neglect questions of power within partnerships, in line with the official discourse and reflecting a broader trend in the international migration governance literature. Others take an instrumentalist view in analysing the power of partnerships or focus on soft power. Illustrated with the examples of the European Mobility Partnerships (EU MPs) and the Swiss Migration Partnerships (CH MPs), we conduct an analysis based on a concept of productive power that draws on post-structural and post-colonial insights. Our main argument is that, in contrast to their seemingly consent-oriented and technical character, MPs are sites of intense (discursive) struggles that (re-)produce meanings, subjects, and resistances. A productive power analysis allows us to move beyond the dichotomy in the literature between coercion and cooperation, as well as between power and resistance more broadly.
Abstract:
This paper addresses the surprising lack of quality control in the analysis and selection of energy policies observable over the last decades. As an example, we discuss the delusional idea that it is possible to replace fossil energy with large-scale ethanol production from agricultural crops. But if large-scale ethanol production is not practical in energetic terms, why has a huge amount of money been invested in it, and why is it still being invested? To answer this question, we introduce two concepts useful to frame, in general terms, the predicament of quality control in science: (i) the concept of "granfalloons" proposed by K. Vonnegut (1963), flagging the danger of the formation of "crusades to save the world" void of real meaning; these granfalloons are often used by powerful lobbies to distort policy decisions; and (ii) the concept of Post-Normal Science by S. Funtowicz and J. Ravetz (1990), indicating a standard predicament faced by science when producing information for governance. When uncertainty, multiple scales, and legitimate but contrasting views are mixed together, it becomes impossible to deal with complex issues using the conventional scientific approach based on reductionism. We finally discuss the implications of a different approach to the assessment of alternative energy sources by introducing the concept of Promethean technology.
Abstract:
The generic concept of the artificial meteorite experiment STONE is to fix rock samples bearing microorganisms on the heat shield of a recoverable space capsule and to study their modifications during atmospheric re-entry. The STONE-5 experiment was performed mainly to answer astrobiological questions. The rock samples mounted on the heat shield were used (i) as a carrier for microorganisms and (ii) as internal control to verify whether physical conditions during atmospheric re-entry were comparable to those experienced by "real" meteorites. Samples of dolerite (an igneous rock), sandstone (a sedimentary rock), and gneiss impactite from Haughton Crater carrying endolithic cyanobacteria were fixed to the heat shield of the unmanned recoverable capsule FOTON-M2. Holes drilled on the back side of each rock sample were loaded with bacterial and fungal spores and with dried vegetative cryptoendoliths. The front of the gneissic sample was also soaked with cryptoendoliths.

The mineralogical differences between pre- and post-flight samples are detailed. Despite intense ablation resulting in deeply eroded samples, all rocks in part survived atmospheric re-entry. Temperatures attained during re-entry were high enough to melt dolerite, silica, and the gneiss impactite sample. The formation of fusion crusts in STONE-5 was a real novelty and strengthens the link with real meteorites. The exposed part of the dolerite is covered by a fusion crust consisting of silicate glass formed from the rock sample with an admixture of holder material (silica). Compositionally, the fusion crust varies from silica-rich areas (undissolved silica fibres of the holder material) to areas whose composition is "basaltic". Likewise, the fusion crust on the exposed gneiss surface was formed from gneiss with an admixture of holder material. The corresponding composition of the fusion crust varies from silica-rich areas to areas with "gneiss" composition (main component potassium-rich feldspar). The sandstone sample was retrieved intact and did not develop a fusion crust. Thermal decomposition of the calcite matrix followed by disintegration and liberation of the silicate grains prevented the formation of a melt.

Furthermore, the non-exposed surface of all samples experienced strong thermal alterations. Hot gases released during ablation pervaded the empty space between sample and sample holder, leading to intense local heating. The intense heating below the protective sample holder led to surface melting of the dolerite rock and to the formation of calcium-silicate rims on quartz grains in the sandstone sample.
Abstract:
The theory of language has occupied a special place in the history of Indian thought. Indian philosophers give particular attention to the analysis of the cognition obtained from language, known under the generic name of śābdabodha. This term is used to denote, among other things, the cognition episode of the hearer, the content of which is described in the form of a paraphrase of a sentence represented as a hierarchical structure. Philosophers submit the meaning of the component items of a sentence and their relationship to a thorough examination, and represent the content of the resulting cognition as a paraphrase centred on a meaning element that is taken as the principal qualificand (mukhyaviśesya) and is qualified by the other meaning elements. This analysis is the object of continuous debate, over a period of more than a thousand years, between the philosophers of the schools of Mīmāmsā, Nyāya (mainly in its Navya form) and Vyākarana. While these philosophers are in complete agreement on the idea that the cognition of sentence meaning has a hierarchical structure and share the concept of a single principal qualificand (qualified by other meaning elements), they strongly disagree on the question of which meaning element has this role and by which morphological item it is expressed. This disagreement is the central point of their debate and gives rise to competing versions of this theory. The Mīmāmsakas argue that the principal qualificand is what they call bhāvanā, 'bringing into being', 'efficient force' or 'productive operation', expressed by the verbal affix, and distinct from the specific procedures signified by the verbal root; the Naiyāyikas generally take it to be the meaning of the word with the first case ending, while the Vaiyākaranas take it to be the operation expressed by the verbal root. All the participants rely on the Pāninian grammar, insofar as the Mīmāmsakas and Naiyāyikas do not compose a new grammar of Sanskrit, but they use different interpretive strategies in order to justify their views, which are often in overt contradiction with the interpretation of the Pāninian rules accepted by the Vaiyākaranas. In each of the three positions, weakness in one area is compensated by strength in another, and the cumulative force of the total argumentation shows that no position can be declared correct or overall superior to the others. This book is an attempt to understand this debate, and to show that, to make full sense of the irreconcilable positions of the three schools, one must go beyond linguistic factors and consider the very beginnings of each school's concern with the issue under scrutiny. The texts, and particularly the late texts of each school, present very complex versions of the theory, yet the key to understanding why these positions remain irreconcilable seems to lie elsewhere, in spite of the extensive argumentation involving a great deal of linguistic and logical technicalities. Historically, this theory arises in Mīmāmsā (with Sabara and Kumārila), then in Nyāya (with Udayana), in a doctrinal and theological context, as a byproduct of the debate over Vedic authority. The Navya-Vaiyākaranas enter this debate last (with Bhattoji Dīksita and Kaunda Bhatta), with the declared aim of refuting the arguments of the Mīmāmsakas and Naiyāyikas by bringing to light the shortcomings in their understanding of Pāninian grammar.
The central argument focuses on the capacity of these initial contexts, together with the network of issues to which the principal qualificand theory is connected, to render intelligible the presuppositions and aims behind the complex linguistic justifications of the classical and late stages of this debate. Reading the debate in this light not only reveals the rationality and internal coherence of each position beyond the linguistic arguments, but also makes it possible to understand why the thinkers of the three schools have continued to hold on to three mutually exclusive positions. They are defending not only their version of the principal qualificand theory, but also (though this is not openly acknowledged) the entire network of arguments, linguistic and/or extra-linguistic, to which this theory is connected, as well as the presuppositions and aims underlying these arguments.
Abstract:
We propose a class of models of social network formation based on a mathematical abstraction of the concept of social distance. Social distance attachment is represented by the tendency of peers to establish acquaintances via a decreasing function of the relative distance in a representative social space. We derive analytical results (corroborated by extensive numerical simulations) showing that the model reproduces the main statistical characteristics of real social networks: a large clustering coefficient, positive degree correlations, and the emergence of a hierarchy of communities. The model is confronted with the social network formed by people who share confidential information using the Pretty Good Privacy (PGP) encryption algorithm, the so-called web of trust of PGP.
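A minimal numerical sketch of the social distance attachment mechanism described above (not the authors' exact model): agents receive random coordinates in a low-dimensional social space, and each pair is linked with a probability that decreases with their social distance. The connection function r(d) = 1/(1 + (d/b)^alpha), the parameter values, and all names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def social_distance_network(n=300, dim=2, b=0.05, alpha=3.0):
    """Toy social-distance-attachment graph: agents get random coordinates in a
    dim-dimensional social space, and a pair at distance d is linked with
    probability r(d) = 1 / (1 + (d / b)**alpha), a decreasing function of d.
    The functional form and the parameters b, alpha are illustrative choices."""
    pos = rng.random((n, dim))
    d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
    p = 1.0 / (1.0 + (d / b) ** alpha)
    upper = np.triu(rng.random((n, n)) < p, k=1)  # sample each pair once, no self-loops
    return upper | upper.T                        # symmetric adjacency matrix

adj = social_distance_network()
deg = adj.sum(axis=1)

# global clustering (transitivity): 3 * triangles / connected triplets
A = adj.astype(int)
triangles = np.trace(A @ A @ A) / 6
triplets = (deg * (deg - 1) / 2).sum()
print("mean degree:", deg.mean(), "clustering:", 3 * triangles / triplets)
```

Because attachment is concentrated among nearby agents, triangles form frequently within neighbourhoods of the social space, which is the intuition behind the large clustering coefficient mentioned in the abstract.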
Abstract:
Since the introduction of the principle of respect for autonomy in medical ethics, respect for the patient's will has occupied a central place in the decision-making process. To address the difficulties that appeared when applying this principle in clinical medicine, Bruce Miller proposed in the early eighties a way to clarify the meaning of this notion in medical practice. He showed that the concept of autonomy can be understood in four senses, which deserve to be explored in cases of ethical conflict. Through the analysis of a clinical situation, this article shows the relevance of the approach suggested by this author and proposes referring to it in cases of ethical dilemma in clinical practice.
Iowa Development of Rubblized Concrete Pavement Base, Mills County, Construction Report, HR-315, 1990
Abstract:
The concept of rubblizing existing concrete pavement prior to the placement of an asphaltic cement concrete (ACC) overlay has been around for several years and, in fact, has been tried successfully in the states of New York, Michigan, and Ohio. With available construction and maintenance dollars usually not meeting the demands of the needed work, much of the necessary rehabilitation of existing Portland cement concrete pavements is not being completed when it would be most beneficial. Research project HR-315, "Iowa Development of Rubblized Concrete", has been undertaken to determine how a rubblized concrete pavement base affects the cracking pattern and longevity of the ACC overlay.
Abstract:
The objective of this work was to assess the spatial and temporal variability of sugarcane yield efficiency and yield gap in the state of São Paulo, Brazil, throughout 16 growing seasons, considering climate and soil as main effects, and socioeconomic factors as complementary. An empirical model was used to assess potential and attainable yields, using climate data series from 37 weather stations. Soil effects were analyzed using the concept of production environments associated with a soil aptitude map for sugarcane. Crop yield efficiency increased from 0.42 to 0.58 in the analyzed period (1990/1991 to 2005/2006 crop seasons), and yield gap consequently decreased from 58 to 42%. Climatic factors explained 43% of the variability of sugarcane yield efficiency, in the following order of importance: solar radiation, water deficit, maximum air temperature, precipitation, and minimum air temperature. Soil explained 15% of the variability, considering the average of all seasons. There was a change in the correlation pattern of climate and soil with yield efficiency after the 2001/2002 season, probably due to the crop expansion to the west of the state during the subsequent period. Socioeconomic, biotic and crop management factors together explain 42% of sugarcane yield efficiency in the state of São Paulo.
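For clarity, the yield-gap percentages quoted above follow directly from the yield-efficiency values. The symbols E, Y_actual, and Y_attainable are introduced here only for illustration; they do not appear in the original study.

```latex
\[
E = \frac{Y_{\mathrm{actual}}}{Y_{\mathrm{attainable}}},
\qquad
\text{yield gap}\,(\%) = (1 - E)\times 100 .
\]
% Example: E = 0.42 gives a gap of 58%, and E = 0.58 gives a gap of 42%,
% matching the figures reported for the 1990/1991 and 2005/2006 seasons.
```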
Abstract:
Biochar has a relatively long half-life in soil and can fundamentally alter soil properties, processes, and ecosystem services. The prospect of global-scale biochar application to soils highlights the importance of a sophisticated and rigorous certification procedure. The objective of this work was to discuss the concept of integrating biochar properties with environmental and socioeconomic factors, in a sustainable biochar certification procedure that optimizes complementarity and compatibility between these factors over relevant time periods. Biochar effects and behavior should also be modelled at temporal scales similar to its expected functional lifetime in soils. Finally, when existing soil data are insufficient, soil sampling and analysis procedures need to be described as part of a biochar certification procedure.
Abstract:
Diagnostic reference levels (DRLs) were established for 21 indication-based CT examinations for adults in Switzerland. One hundred and seventy-nine of the 225 computed tomography (CT) scanners operated in hospitals and private radiology institutes were audited on-site, and patient doses were collected. For each CT scanner, a correction factor was calculated, expressing the deviation of the measured weighted computed tomography dose index (CTDI) from the nominal weighted CTDI displayed on the workstation. Patient doses were corrected by this factor, providing a realistic basis for establishing national DRLs. Results showed large variations in doses between different radiology departments in Switzerland, especially for examinations of the petrous bone, pelvis, lower limbs, and heart. This indicates that the concept of DRLs has not yet been correctly applied to CT examinations in clinical routine. A close collaboration of all stakeholders is mandatory to assure effective radiation protection of patients. On-site audits will be intensified to further establish the concept of DRLs in Switzerland.
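A minimal sketch of the dose-correction step described above, under stated assumptions: the correction factor is taken as the ratio of the measured weighted CTDI to the value displayed on the workstation, and the DRL is derived as the 75th percentile of the corrected dose distribution, a common convention that the abstract does not specify. All function and variable names are hypothetical.

```python
import numpy as np

def correction_factor(measured_ctdi_w, nominal_ctdi_w):
    """Ratio of the CTDIw measured on-site to the value displayed on the
    workstation (illustrative definition; the direction of the ratio is assumed)."""
    return measured_ctdi_w / nominal_ctdi_w

def corrected_doses(recorded_doses, factor):
    """Scale the doses recorded for one scanner by its correction factor."""
    return np.asarray(recorded_doses) * factor

def reference_level(all_corrected_doses, percentile=75):
    """DRL as a percentile of the pooled corrected dose distribution
    (75th percentile is a common convention, assumed here)."""
    return np.percentile(all_corrected_doses, percentile)

# toy example: one scanner whose display under-reports CTDIw by 10 %
f = correction_factor(measured_ctdi_w=11.0, nominal_ctdi_w=10.0)
doses = corrected_doses([8.2, 9.1, 7.5, 10.3], f)
print(round(reference_level(doses), 2))
```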
Abstract:
Internet governance is a recent issue in global politics. It has nevertheless become, over the years, a major economic and political stake, and has gained particular prominence in recent months as a recurrent topic in the news. Against this background, this research outlines the history of Internet governance from its emergence as a political issue in the 1980s to the end of the World Summit on the Information Society (WSIS) in 2005. Rather than focusing on one or the other institution involved in the regulation of the global network, it analyses the emergence and historical evolution of a space of struggle bringing together a growing number of different actors. This evolution is described through the prism of the dialectical relation between elites and non-elites and of the struggle over the definition of Internet governance. The thesis thus explores how the relations among the elites of Internet governance, and between these elites and non-elites, explain the emergence, evolution, and structuration of a relatively autonomous field of world politics centred on Internet governance. Against the dominant realist and liberal perspectives, this research draws on a combination of heterodox international political economy and international political sociology, articulated around the concepts of field, elites, and hegemony. The concept of field, developed by Bourdieu, inspires a growing number of studies of world politics; it allows both a differentiated analysis of globalisation and an account of the emergence of transnational spaces of struggle and domination. Elite sociology offers a pragmatic, actor-centred approach to questions of power in globalisation; this research draws in particular on Wright Mills's concept of the power elite to study the unification of a priori different elites around shared projects. Finally, the thesis uses the Neo-Gramscian concept of hegemony to study both the relative stability of elite power, guaranteed by the consensual dimension of domination, and the seeds of change contained in any international order.

Through the analysis of the documents produced during the period under study, and drawing on databases of networks of actors, this research focuses on the debates that followed the commercialisation of the network in the early 1990s and on the negotiations during the WSIS. The first period led to the creation of the Internet Corporation for Assigned Names and Numbers (ICANN) in 1998. This creation resulted from the search for a consensus between the dominant discourses of the 1990s and from a coalition of interests within an emerging power elite of Internet governance. However, this institutionalisation of Internet governance around ICANN excluded a number of actors and discourses that have since tried to overturn this order. The WSIS became the framework within which this mode of governance was challenged by excluded states, scholars, NGOs, and international organisations; it therefore constitutes the second historical period studied in this thesis.

The confrontation during the WSIS triggered a reconfiguration of the power elite of Internet governance as well as a redefinition of the boundaries of the field. A new hegemonic project emerged around discursive elements such as multistakeholderism and institutional elements such as the Internet Governance Forum. The relative success of this project has allowed for an unprecedented institutional stability since the end of the WSIS and an acceptance of the elites' discourse by a large number of actors in the field. Only recently has this order been challenged by the emerging powers of Internet governance. This research seeks to contribute to the scientific debate on three levels. On the theoretical level, it fosters a dialogue between international political economy and international political sociology in order to analyse both the structural dynamics of the globalisation process and the situated practices of actors in a given issue-area; it notably stresses the contribution of the notions of field and power elite and their compatibility with Neo-Gramscian analyses of hegemony. On the methodological level, this dialogue translates into the use of mixed methods, combining qualitative document analysis with social network analysis of actors and statements. Finally, on the empirical level, this research offers an original perspective on Internet governance by stressing its historical dimension, by demonstrating the fragility of the concept of multistakeholder governance, and by focusing on power relations and the links between Internet governance and globalisation.
Abstract:
A firm that wishes to launch a new product to the market faces the difficult task of deciding what the best moment for the launch is. Timing may also be critical when a firm plans to adopt new processes or intends to head for new markets. The critical question the firm needs to tackle is whether it will try to reach the so-called first-mover advantage by acting earlier than its rivals. The first-mover position may reward the entrant with various opportunities to gain competitive advantage over later movers. However, there are also great risks involved in early market entry, and sometimes the very first entrant fails even before the followers enter the market. The follower, on the other hand, may be able to free-ride on the earlier entrants' investments and benefit from the reduction of the uncertainties that characterize new markets. According to the current understanding, the occurrence of entry-order advantages depends not only on the mechanisms and attributes in the firm's environment that provide the initial opportunities but also on the firm's ability to capitalize on these advantage opportunities. This study contributes to this discussion by analyzing the linkages between the asset base of the firm, the characteristics of the operating environment, and the firm's entry timing orientation. To shed light on the relationship between entry timing strategy and competitive advantage, this study utilizes the concept of entry timing orientation. The rationale for choosing this type of approach arises from the inability of previously employed research tools to reach the underlying factors that result in entry timing advantage. The work consists of an introductory theoretical discussion on entry timing advantages and of four research publications. The empirical findings support the understanding that entry timing advantage is related to the characteristics of the firm's operating environment but may also be related to firm-specific factors. This in turn suggests that some of the traditional ways of detecting and measuring first-mover advantage (which to some extent ignore these dimensions) may be outdated.
Abstract:
In this letter, we obtain the Maximum Likelihood Estimator of position in the framework of Global Navigation Satellite Systems. This theoretical result is the basis of a completely different approach to the positioning problem, in contrast to the conventional two-step position estimation, which consists of estimating the synchronization parameters of the in-view satellites and then performing a position estimation with that information. To the authors' knowledge, this is a novel approach which copes with signal fading and mitigates multipath and jamming interferences. Besides, the concept of Position-based Synchronization is introduced, which states that synchronization parameters can be recovered from a user position estimation. We provide computer simulation results showing the robustness of the proposed approach in fading multipath channels. The Root Mean Square Error performance of the proposed algorithm is compared to those achieved with state-of-the-art synchronization techniques. A Sequential Monte Carlo based method is used to deal with the multivariate optimization problem resulting from the ML solution in an iterative way.
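A toy illustration of the contrast between the conventional two-step approach and the direct (one-step) ML approach described above. This is not the letter's algorithm: it uses an idealized triangular correlator model and a plain grid search rather than the Sequential Monte Carlo optimization mentioned in the abstract, and every function and parameter name is an assumption.

```python
import numpy as np

C = 299_792_458.0  # speed of light [m/s]

def predicted_delays(pos, clock_bias, sat_pos):
    """Propagation delay predicted for each in-view satellite, given a
    candidate receiver position and receiver clock bias."""
    return np.linalg.norm(sat_pos - pos, axis=1) / C + clock_bias

def correlator(tau, tau_measured, chip=1e-6):
    """Idealized triangular correlation peak centred on the measured delay."""
    return np.maximum(0.0, 1.0 - np.abs(tau - tau_measured) / chip)

def direct_ml_fix(candidates, biases, sat_pos, tau_measured):
    """One-step positioning: score each candidate (position, clock bias) directly
    in the position domain and keep the best one, instead of first estimating
    per-satellite delays and then solving for position (the two-step approach)."""
    best, best_score = None, -np.inf
    for pos in candidates:
        for b in biases:
            score = correlator(predicted_delays(pos, b, sat_pos), tau_measured).sum()
            if score > best_score:
                best, best_score = (pos, b), score
    return best, best_score

# toy usage: 4 satellites, true receiver at the origin with zero clock bias
sats = np.array([[2.0e7, 0, 0], [0, 2.0e7, 0], [0, 0, 2.0e7], [1.5e7, 1.5e7, 1.0e7]])
tau_true = predicted_delays(np.zeros(3), 0.0, sats)
grid = [np.array([x, y, 0.0]) for x in (-1e3, 0.0, 1e3) for y in (-1e3, 0.0, 1e3)]
(best_pos, best_bias), _ = direct_ml_fix(grid, [0.0], sats, tau_true)
print(best_pos)
```

The point of the sketch is that the cost function is evaluated over positions, so information from all satellites is combined before any per-satellite delay estimate is committed to, which is what gives the direct approach its robustness to fading and interference on individual channels.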