978 results for Quadratic Number Field


Relevance: 30.00%

Publisher:

Abstract:

The development of side-branching in solidifying dendrites in a regime of large values of the Peclet number is studied by means of a phase-field model. We have compared our numerical results with experiments of the preceding paper and we obtain good qualitative agreement. The growth rate of each side branch shows a power-law behavior from the early stages of its life. From their birth, branches which finally succeed in the competition process of side-branching development have a greater growth exponent than branches which are stopped. Coarsening of branches is entirely defined by their geometrical position relative to their dominant neighbors. The winner branches escape from the diffusive field of the main dendrite and become independent dendrites.
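The power-law growth of side branches can be illustrated with a short fit. A minimal sketch in Python, using synthetic branch-length trajectories rather than the paper's measured data; the exponents 0.7 and 0.4 are illustrative stand-ins for a winning and a stopped branch:

```python
import numpy as np

def growth_exponent(t, length):
    """Estimate the exponent b in length ~ a * t**b by linear
    regression in log-log space."""
    b, _log_a = np.polyfit(np.log(t), np.log(length), 1)
    return b

# Synthetic trajectories (illustrative values only): a branch that wins
# the side-branching competition grows with a larger exponent than one
# that is eventually stopped by its dominant neighbors.
t = np.linspace(1.0, 10.0, 50)
winner = 2.0 * t**0.7
loser = 2.0 * t**0.4
```

On these noise-free series the regression recovers the exponents exactly; on measured branch lengths the same fit gives the growth exponent from the early stages of a branch's life.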

Results of a field and microstructural study between the northern and the central bodies of the Lanzo plagioclase peridotite massif (NW Italy) indicate that the spatial distribution of deformation is asymmetric across kilometre-scale mantle shear zones. The southwestern part of the shear zone (footwall) shows a gradually increasing degree of deformation from porphyroclastic peridotites to mylonite, whereas the northeastern part (hanging wall) quickly grades into weakly deformed peridotites. Discordant gabbroic and basaltic dykes are asymmetrically distributed and far more abundant in the footwall of the shear zone. The porphyroclastic peridotite displays porphyroclastic zones and domains of igneous crystallization, whereas mylonites are characterized by elongated porphyroclasts embedded between fine-grained, polycrystalline bands of olivine, plagioclase, clinopyroxene, orthopyroxene, spinel, rare titanian pargasite, and domains of recrystallized olivine. Two types of melt impregnation textures have been found: (1) clinopyroxene porphyroclasts incongruently reacted with migrating melt to form orthopyroxene + plagioclase; (2) olivine porphyroclasts are partially replaced by interstitial orthopyroxene. The melt-rock reaction textures tend to disappear in the mylonites, indicating that deformation in the mylonite continued under subsolidus conditions. The pyroxene chemistry is correlated with grain size. High-Al pyroxene cores indicate high temperatures (1100-1030 °C), whereas low-Al neoblasts display lower final equilibration temperatures (860 °C). The spinel Cr-number [molar Cr/(Cr + Al)] and TiO2 concentrations show extreme variability, covering almost the entire range known from abyssal peridotites. The spinel compositions of porphyroclastic peridotites from the central body are more variable than spinel from mylonite, mylonite with ultra-mylonite bands, and porphyroclastic rocks of the northern body. The spinel compositions probably indicate disequilibrium and would favour rapid cooling and a faster exhumation of the central peridotite body relative to the northern one. Our results indicate that melt migration and high-temperature deformation are juxtaposed both in time and space. Melt-rock reaction may have caused grain-size reduction, which in turn led to localization of deformation. It is likely that melt-lubricated, actively deforming peridotites acted as melt focusing zones, with permeabilities higher than the surrounding, less deformed peridotites. Later, under subsolidus conditions, pinning in polycrystalline bands in the mylonites inhibited substantial grain growth and led to permanent weak zones in the upper mantle peridotite, with a permeability that is lower than in the weakly deformed peridotites. Such an inversion in permeability might explain why actively deforming, fine-grained peridotite mylonite acted as a permeability barrier, and why ascending mafic melts might terminate and crystallize as gabbros along actively deforming shear zones. Melt-lubricated mantle shear zones provide a mechanism for explaining the discontinuous distribution of gabbros in ocean-continent transition zones, oceanic core complexes and ultraslow-spreading ridges.
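The spinel Cr-number used above is a simple molar ratio. A minimal sketch of the conversion from oxide weight percent; the analyses fed in are hypothetical compositions, not measured Lanzo spinels:

```python
# Molar masses in g/mol; each oxide contributes two cations per formula unit.
M_CR2O3 = 151.99
M_AL2O3 = 101.96

def cr_number(wt_cr2o3, wt_al2o3):
    """Spinel Cr-number, molar Cr/(Cr + Al), from oxide wt%."""
    mol_cr = 2.0 * wt_cr2o3 / M_CR2O3  # moles of Cr cations per 100 g
    mol_al = 2.0 * wt_al2o3 / M_AL2O3  # moles of Al cations per 100 g
    return mol_cr / (mol_cr + mol_al)

# Hypothetical spinel analyses spanning part of the abyssal-peridotite range:
low_cr = cr_number(15.0, 45.0)
high_cr = cr_number(45.0, 20.0)
```

Variable Cr-numbers like these, within one rock body, are what the abstract reads as a sign of disequilibrium and rapid cooling.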

Research projects aimed at proposing fingerprint statistical models based on the likelihood ratio framework have shown that low quality finger impressions left at crime scenes may have significant evidential value. These impressions are currently either not recovered, considered to be of no value when first analyzed by fingerprint examiners, or lead to inconclusive results when compared to control prints. There are growing concerns within the fingerprint community that recovering and examining these low quality impressions will result in a significant increase of the workload of fingerprint units and ultimately of the number of backlogged cases. This study was designed to measure the number of impressions currently not recovered or not considered for examination, and to assess the usefulness of these impressions in terms of the number of additional detections that would result from their examination.
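The likelihood-ratio framework mentioned above weighs the observed features under two competing propositions. A minimal sketch with made-up probabilities (the study does not report these numbers):

```python
def likelihood_ratio(p_features_if_same_source, p_features_if_diff_source):
    """LR > 1 supports the same-source proposition; the further from 1,
    the stronger the evidential value of the mark."""
    return p_features_if_same_source / p_features_if_diff_source

# A low-quality mark: its features are only moderately probable under the
# same-source proposition, but rare among non-matching sources, so the
# mark still carries appreciable weight. Probabilities are illustrative.
lr = likelihood_ratio(0.4, 0.002)
```

This is why marks currently written off as "of no value" can still contribute: the denominator, not the image quality alone, drives the evidential weight.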

Purpose: To perform in vivo imaging of the cerebellum with an in-plane resolution of 120 μm to observe its cortical granular and molecular layers, taking advantage of the high signal-to-noise ratio and the increased magnetic susceptibility-related contrast available at high magnetic field strengths such as 7 T. Materials and Methods: The study was approved by the institutional review board, and all participants provided written consent. Three healthy persons (two men, one woman; mean age, 30 years; age range, 28-31 years) underwent MR imaging with a 7-T system. Gradient-echo images (repetition time msec/echo time msec, 1000/25) of the human cerebellum were acquired with a nominal in-plane resolution of approximately 120 μm and a section thickness of 1 mm. Results: Structures with dimensions as small as 240 μm, such as the granular and molecular layers in the cerebellar cortex, were detected in vivo. The detection of these structures was confirmed by comparing the contrast obtained on T2*-weighted and phase images with that obtained on images of rat cerebellum acquired at 14 T with 30 μm in-plane resolution. Conclusion: In vivo cerebellar imaging at near-microscopic resolution is feasible at 7 T. Such detailed observation of an anatomic area that can be affected by a number of neurologic and psychiatric diseases, such as stroke, tumors, autism, and schizophrenia, could potentially provide newer markers for diagnosis and follow-up in patients with such pathologic conditions. (c) RSNA, 2010.

Roughly 242 million used tires are generated annually in the United States. Many of these tires end up being landfilled or stockpiled. The stockpiles are unsightly and unsanitary, and they also collect water, which creates the perfect breeding ground for mosquitoes, some of which carry disease. In an effort to reduce the number of used-tire stockpiles, the federal government mandated the use of recycled rubber in federally funded, state-implemented department of transportation (DOT) projects. This mandate required the use of recycled rubber in 5% of the asphalt cement concrete (ACC) tonnage used in federally funded projects in 1994, increasing that amount by 5% each year until 20% was reached, and remaining at 20% thereafter. The mandate was removed as part of the appropriations process in 1994, after the projects in this research had been completed. This report covers five separate projects that were constructed by the Iowa DOT in 1991 and 1992. These projects all had some form of rubber incorporated into their construction and were evaluated for 5 years. The conclusion of the study is that the pavements with tire rubber added performed essentially the same as conventional ACC pavement. An exception was the use of rubber chips in a surface lift, which performed better at crack control but worse in friction values than conventional ACC. The cost of the pavement with rubber additive was significantly higher. As a result, the benefits do not outweigh the costs of using this recycled rubber process in pavements in Iowa.

Among the variety of road users and vehicle types that travel on U.S. public roadways, slow moving vehicles (SMVs) present unique safety and operations issues. SMVs include vehicles that cannot maintain a constant speed of 25 mph, such as large farm equipment, construction vehicles, or horse-drawn buggies. Though the number of crashes involving SMVs is relatively small, SMV crashes tend to be severe. Additionally, SMVs can be encountered regularly on non-Interstate/non-expressway public roadways, but motorists may not be accustomed to these vehicles. This project was designed to improve transportation safety for SMVs on Iowa's public roadway system. This report includes a literature review that shows various SMV statistics and laws across the United States, a crash study based on three years of Iowa SMV crash data, and recommendations from the SMV community.

In jointed portland cement concrete pavements, dowel bars are typically used to transfer loads between adjacent slabs. A common practice is for designers to place dowel bars at a certain, consistent spacing such that a sufficient number of dowels are available to effectively transfer anticipated loads. In many cases, however, the standards developed today for new highway construction simply do not reflect the design needs of low traffic volume, rural roads. The objective of this research was to evaluate the impact of the number of dowel bars and dowel location on joint performance and ultimately on pavement performance. For this research, test sections were designed, constructed, and tested in actual field service pavement. Test sections were developed to include areas with load transfer assemblies having three and four dowels in the outer wheel path only, areas with no joint reinforcement whatsoever, and full lane dowel basket assemblies as the control. Two adjacent paving projects provided both rural and urban settings and differing base materials. This report documents the approach to implementing the study and provides discussion and suggestions based on the results of the research. The research results indicate that the use of single three or four dowel basket assemblies in the outer wheel path is acceptable for use in low truck volume roads. In the case of roadways with relatively stiff bases such as asphalt treated or stabilized bases, the use of the three dowel bar pattern in the outside wheel path is expected to provide adequate performance over the design life of the pavement. In the case of untreated or granular bases, the results indicate that the use of the three or four dowel bar basket in both wheel paths provides the best long-term solution to load transfer and faulting measurements.

Recent data compiled by the National Bridge Inventory revealed 29% of Iowa's approximate 24,600 bridges were either structurally deficient or functionally obsolete. This large number of deficient bridges and the high cost of needed repairs create unique problems for Iowa and many other states. The research objective of this project was to determine the load capacity of a particular type of deteriorating bridge – the precast concrete deck bridge – which is commonly found on Iowa's secondary roads. The number of these precast concrete structures requiring load postings and/or replacement can be significantly reduced if the deteriorated structures are found to have adequate load capacity or can be reliably evaluated. Approximately 600 precast concrete deck bridges (PCDBs) exist in Iowa. A typical PCDB span is 19 to 36 ft long and consists of eight to ten simply supported precast panels. Bolts and either a pipe shear key or a grouted shear key are used to join adjacent panels. The panels resemble a steel channel in cross-section; the web is orientated horizontally and forms the roadway deck and the legs act as shallow beams. The primary longitudinal reinforcing steel bundled in each of the legs frequently corrodes and causes longitudinal cracks in the concrete and spalling. The research team performed service load tests on four deteriorated PCDBs; two with shear keys in place and two without. Conventional strain gages were used to measure strains in both the steel and concrete, and transducers were used to measure vertical deflections. Based on the field results, it was determined that these bridges have sufficient lateral load distribution and adequate strength when shear keys are properly installed between adjacent panels. The measured lateral load distribution factors are larger than AASHTO values when shear keys were not installed. 
Since some of the reinforcement had hooks, deterioration of the reinforcement has a minimal effect on the service-level performance of the bridges when there is minimal loss of cross-sectional area. Laboratory tests were performed on the PCDB panels obtained from three bridge replacement projects. Twelve deteriorated panels were loaded to failure in a four-point bending arrangement. Although the panels had significant deflections prior to failure, the experimental capacity of eleven panels exceeded the theoretical capacity. Experimental capacity of the twelfth panel, an extremely distressed panel, was only slightly below the theoretical capacity. Service tests and an ultimate strength test were performed on a laboratory bridge model consisting of four joined panels to determine the effect of various shear connection configurations. These data were used to validate a PCDB finite element model that can provide more accurate live load distribution factors for use in rating calculations. Finally, a strengthening system was developed and tested for use in situations where one or more panels of an existing PCDB need strengthening.

The key goals in winter maintenance operations are preserving the safety and mobility of the traveling public. To do this, it is generally necessary to increase the friction of the road surface above the typical friction levels found on a snow- or ice-covered roadway. Because of prior work on the performance of abrasives (discussed in greater detail in chapter 2), a key concern when using abrasives has become how to ensure the greatest increase in pavement friction for the longest period of time. There are a number of ways in which the usage of abrasives can be optimized, and these methods are discussed and compared in this report. In addition, results of an Iowa DOT test of zero-velocity spreaders are presented, along with results of field studies conducted in Johnson County, Iowa, on the road surface friction of pavements treated with abrasives applied using different modes of delivery. The experiments were not able to determine any significant difference in material placement performance between a standard delivery system and a chute-based delivery system. The report makes a number of recommendations based upon the reviews and the experiments.

As a result of the collapse of a 140-foot high-mast lighting tower in Sioux City, Iowa, in November 2003, a thorough investigation into the behavior and design of these tall yet relatively flexible structures was undertaken. Extensive work regarding the root cause of this failure was carried out by Robert Dexter of the University of Minnesota. Furthermore, a statewide inspection of all the high-mast towers in Iowa revealed fatigue cracks and loose anchor bolts on other existing structures. The current study was proposed to examine the static and dynamic behavior of a variety of towers in the State of Iowa utilizing field testing, specifically long-term monitoring and load testing. This report presents the results and conclusions from this project. The field work for this project was divided into two phases. Phase 1 of the project was conducted in October 2004 and focused on the dynamic properties of ten different towers in Clear Lake, Ames, and Des Moines, Iowa. Of those ten, two were also instrumented to obtain stress distributions at various details and were included in a 12-month long-term monitoring study. Phase 2 of this investigation was conducted in May 2005, in Sioux City, Iowa, and focused on determining the static and dynamic behavior of a tower similar to the one that collapsed in November 2003. Identical tests were performed on a similar tower which was retrofitted with a more substantial replacement bottom section in order to assess the effect of the retrofit. A third tower with different details was dynamically load tested to determine its dynamic characteristics, similar to the Phase 1 testing. Based on the dynamic load tests, the modal frequencies of the towers fall within the same range. Also, the damping ratios are significantly lower in the higher modes than the values suggested in the AASHTO and CAN/CSA specifications. The comparatively higher damping ratios in the first mode may be due to aerodynamic damping.
These low damping ratios in combination with poor fatigue details contribute to the accumulation of a large number of damage-causing cycles. As predicted, the stresses in the original Sioux City tower are much greater than the stresses in the retrofitted towers at Sioux City. Additionally, it was found that poor installation practices which often lead to loose anchor bolts and out-of-level leveling nuts can cause high localized stresses in the towers, which can accelerate fatigue damage.
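One standard way to obtain damping ratios like those discussed above from a free-decay record is the logarithmic decrement. A minimal sketch on synthetic peak amplitudes, not the monitored tower data:

```python
import math

def damping_from_decay(peaks):
    """Modal damping ratio from successive free-decay peak amplitudes
    via the logarithmic decrement, averaged over all recorded cycles."""
    n_cycles = len(peaks) - 1
    delta = math.log(peaks[0] / peaks[-1]) / n_cycles
    return delta / math.sqrt(4.0 * math.pi**2 + delta**2)

# Synthetic decay at 1% of critical damping (illustrative value only):
zeta = 0.01
delta = 2.0 * math.pi * zeta / math.sqrt(1.0 - zeta**2)
peaks = [math.exp(-delta * k) for k in range(6)]
```

At damping ratios this low the amplitude shrinks by only about 6% per cycle, which is why a lightly damped tower accumulates so many damage-causing stress cycles before the motion dies out.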

The objective of this study was to improve the simulation of node number in soybean cultivars with determinate stem habits. A nonlinear model considering two approaches to input daily air temperature data (daily mean temperature and daily minimum/maximum air temperatures) was used. The node number on the main stem of ten soybean cultivars was collected in a three-year field experiment (from 2004/2005 to 2006/2007) at Santa Maria, RS, Brazil. Node number was simulated using the Soydev model, which has a nonlinear temperature response function [f(T)]. The f(T) was calculated using two methods: using daily mean air temperature, calculated as the arithmetic average of the daily minimum and maximum air temperatures (Soydev_tmean); and calculating one f(T) using minimum air temperature and another using maximum air temperature, and then averaging the two f(T)s (Soydev_tmm). Root mean square error (RMSE) and deviations (simulated minus observed) were used as statistics to evaluate the performance of the two versions of Soydev. Simulations of node number in soybean were better with the Soydev_tmm version, with a 0.5 to 1.4 node RMSE. Node number can be simulated for several soybean cultivars using only one set of model coefficients, with a 0.8 to 2.4 node RMSE.
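The two ways of feeding daily temperatures into the nonlinear response can be sketched as follows. The beta-type function and its cardinal temperatures are illustrative stand-ins, not the calibrated Soydev coefficients:

```python
import math

def f_temp(T, tmin=7.6, topt=31.0, tmax=40.0):
    """Beta-type temperature response in [0, 1]; the cardinal
    temperatures are placeholders, not fitted Soydev values."""
    if T <= tmin or T >= tmax:
        return 0.0
    alpha = math.log(2.0) / math.log((tmax - tmin) / (topt - tmin))
    x = (T - tmin) ** alpha
    xo = (topt - tmin) ** alpha
    return (2.0 * x * xo - x * x) / (xo * xo)

tn, tx = 14.0, 28.0                        # daily min/max air temperature, °C
ft_tmean = f_temp((tn + tx) / 2.0)         # Soydev_tmean: f of the daily mean
ft_tmm = (f_temp(tn) + f_temp(tx)) / 2.0   # Soydev_tmm: mean of the two f values
```

Because f(T) is nonlinear, the two input approaches give different daily values, which is why the Soydev_tmm variant can simulate node appearance differently from, and in this study better than, Soydev_tmean.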

The objective of this work was to assess the effects of a forest-field ecotone on earthworm assemblages. Five sites (blocks) differing in the type of crop rotation used in the field were studied in Central Bohemia, Czech Republic. In each block, sampling was carried out in seven parallel rows perpendicular to a transect from a forest (oak or oak-pine) to the centre of a field, both in spring and autumn 2001-2003. Individual rows were located in the forest (5 m from the edge), in the forest edge, and in the field (at 5, 10, 25, 50 and 100 m distances from the forest edge). The density and biomass of earthworms were lowest in the forest, increased markedly in the forest edge, decreased again at 5 or 10 m distance from the forest edge and then continuously increased along the distance to the field boundary. The highest number of species was found in the forest edge and in the field boundary. Individual species differed in their distribution along the transect. Both density and biomass of earthworms were correlated with distance from forest edge, soil organic matter content, soil porosity, and water infiltration rate.

Combinatorial optimization involves finding an optimal solution in a finite set of options; many everyday life problems are of this kind. However, the number of options grows exponentially with the size of the problem, such that an exhaustive search for the best solution is practically infeasible beyond a certain problem size. When efficient algorithms are not available, a practical approach to obtain an approximate solution to the problem at hand, is to start with an educated guess and gradually refine it until we have a good-enough solution. Roughly speaking, this is how local search heuristics work. These stochastic algorithms navigate the problem search space by iteratively turning the current solution into new candidate solutions, guiding the search towards better solutions. The search performance, therefore, depends on structural aspects of the search space, which in turn depend on the move operator being used to modify solutions. A common way to characterize the search space of a problem is through the study of its fitness landscape, a mathematical object comprising the space of all possible solutions, their value with respect to the optimization objective, and a relationship of neighborhood defined by the move operator. The landscape metaphor is used to explain the search dynamics as a sort of potential function. The concept is indeed similar to that of potential energy surfaces in physical chemistry. Borrowing ideas from that field, we propose to extend to combinatorial landscapes the notion of the inherent network formed by energy minima in energy landscapes. In our case, energy minima are the local optima of the combinatorial problem, and we explore several definitions for the network edges. At first, we perform an exhaustive sampling of local optima basins of attraction, and define weighted transitions between basins by accounting for all the possible ways of crossing the basins frontier via one random move. 
Then, we reduce the computational burden by only counting the chances of escaping a given basin via random kick moves that start at the local optimum. Finally, we approximate network edges from the search trajectory of simple search heuristics, mining the frequency and inter-arrival time with which the heuristic visits local optima. Through these methodologies, we build a weighted directed graph that provides a synthetic view of the whole landscape, and that we can characterize using the tools of complex networks science. We argue that the network characterization can advance our understanding of the structural and dynamical properties of hard combinatorial landscapes. We apply our approach to prototypical problems such as the Quadratic Assignment Problem, the NK model of rugged landscapes, and the Permutation Flow-shop Scheduling Problem. We show that some network metrics can differentiate problem classes, correlate with problem non-linearity, and predict problem hardness as measured from the performances of trajectory-based local search heuristics.
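The escape-edge construction described above (random kick moves from each local optimum, followed by hill climbing) can be sketched on a toy landscape. The random lookup-table fitness and the two-bit kick are illustrative choices, not the QAP/NK/flow-shop instances studied in the thesis:

```python
import random
from collections import defaultdict

random.seed(7)
N = 8  # bit-string length; solutions are the integers 0 .. 2**N - 1
fitness_table = [random.random() for _ in range(2 ** N)]

def neighbors(s):
    return [s ^ (1 << k) for k in range(N)]  # all one-bit flips

def hill_climb(s):
    """Best-improvement local search: returns the local optimum of s's basin."""
    while True:
        best = max(neighbors(s), key=lambda n: fitness_table[n])
        if fitness_table[best] <= fitness_table[s]:
            return s
        s = best

# Exhaustively map basins of attraction, then estimate escape edges by
# kicking each optimum with a random two-bit perturbation and re-climbing.
optima = sorted({hill_climb(s) for s in range(2 ** N)})
edges = defaultdict(int)
KICKS = 50
for o in optima:
    for _ in range(KICKS):
        kicked = o ^ (1 << random.randrange(N)) ^ (1 << random.randrange(N))
        edges[(o, hill_climb(kicked))] += 1
```

The resulting weighted directed graph over `optima` (self-loops included, for kicks that fall back into the same basin) is a small local optima network of the kind whose metrics the thesis relates to problem hardness.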

Internet governance is a recent topic in global politics, but over the years it has become a major political and economic issue, and in recent months it has appeared regularly in the news. Against this background, this research outlines the history of Internet governance from its emergence as a political issue in the 1980s to the end of the World Summit on the Information Society (WSIS) in 2005.
Rather than focusing on one or the other institution involved in Internet governance, this research analyses the emergence and historical evolution of a space of struggle affecting a growing number of different actors. This evolution is described through the analysis of the dialectical relation between elites and non-elites and through the struggle around the definition of Internet governance. The thesis explores the question of how the relations among the elites of Internet governance and between these elites and non-elites explain the emergence, the evolution, and the structuration of a relatively autonomous field of world politics centred around Internet governance. Against dominant realist and liberal perspectives, this research draws upon a cross-fertilisation of heterodox international political economy and international political sociology. This approach focuses on concepts such as field, elites and hegemony. The concept of field, as developed by Bourdieu, is increasingly used in International Relations to build a differentiated analysis of globalisation and to describe the emergence of transnational spaces of struggle and domination. Elite sociology allows for a pragmatic actor-centred analysis of the issue of power in the globalisation process. This research particularly draws on Wright Mills's concept of the power elite in order to explore the unification of different elites around shared projects. Finally, this thesis uses the Neo-Gramscian concept of hegemony in order to study both the consensual dimension of domination and the prospect of change contained in any international order. Through the analysis of the documents produced within the analysed period, and through the creation of databases of networks of actors, this research focuses on the debates that followed the commercialisation of the Internet throughout the 1990s and during the WSIS.
The first time period led to the creation of the Internet Corporation for Assigned Names and Numbers (ICANN) in 1998. This creation resulted from the consensus-building between the dominant discourses of the time. It also resulted from the coalition of interests among an emerging power elite. However, this institutionalisation of Internet governance around the ICANN excluded a number of actors and discourses that resisted this mode of governance. The WSIS became the institutional framework within which the governance system was questioned by some excluded states, scholars, NGOs and intergovernmental organisations. The confrontation between the power elite and counter-elites during the WSIS triggered a reconfiguration of the power elite as well as a re-definition of the boundaries of the field. A new hegemonic project emerged around discursive elements such as the idea of multistakeholderism and institutional elements such as the Internet Governance Forum. The relative success of the hegemonic project allowed for a certain stability within the field and an acceptance by most non-elites of the new order. It is only recently that this order began to be questioned by the emerging powers of Internet governance. This research provides three main contributions to the scientific debate. On the theoretical level, it contributes to the emergence of a dialogue between International Political Economy and International Political Sociology perspectives in order to analyse both the structural trends of the globalisation process and the located practices of actors in a given issue-area. It notably stresses the contribution of concepts such as field and power elite and their compatibility with a Neo-Gramscian framework to analyse hegemony. On the methodological level, this perspective relies on the use of mixed methods, combining qualitative content analysis with social network analysis of actors and statements. 
Finally, on the empirical level, this research provides an original perspective on Internet governance. It stresses the historical dimension of current Internet governance arrangements. It also criticises the notion of multistakeholderism and focuses instead on the power dynamics and the relation between Internet governance and globalisation.

Abstract: The objective of this work was to evaluate Trichoderma harzianum isolates for biological control of white mold in common bean (Phaseolus vulgaris). Five isolates were evaluated for biocontrol of white mold in 'Pérola' common bean under field conditions, in the 2009 and 2010 crop seasons. A commercial isolate (1306) and a control treatment were included. Foliar applications at 2×10^9 conidia mL^-1 were performed at 42 and 52 days after sowing (DAS) in 2009, and at 52 DAS in 2010. The CEN287, CEN316, and 1306 isolates decreased the number of Sclerotinia sclerotiorum apothecia per square meter in comparison to the control, in both crop seasons. CEN287, CEN316, and 1306 decreased white mold severity during the experimental period, when compared to the control.