983 results for All terrain vehicles
Abstract:
We present a measurement of the top quark mass in the all-hadronic channel ($t\bar{t} \to b\bar{b}q_{1}\bar{q}_{2}q_{3}\bar{q}_{4}$) using 943 pb$^{-1}$ of $p\bar{p}$ collisions at $\sqrt{s} = 1.96$ TeV collected with the CDF II detector at Fermilab. We apply the standard model production and decay matrix element (ME) to $t\bar{t}$ candidate events. We calculate per-event probability densities according to the ME calculation and construct template models of signal and background. The jet energy scale is calibrated using additional templates formed from the invariant mass of pairs of jets. These templates form an overall likelihood function that depends on the top quark mass and on the jet energy scale (JES). We estimate both by maximizing this function. Given 72 observed events, we measure a top quark mass of 171.1 $\pm$ 3.7 (stat.+JES) $\pm$ 2.1 (syst.) GeV/$c^{2}$. The combined uncertainty on the top quark mass is 4.3 GeV/$c^{2}$.
Abstract:
An inexpensive all-glass sealed stirrer may be constructed using a surgical syringe of about 10-ml capacity.
Abstract:
The concept of globalization has become a shorthand for making sense of contemporary society. It reflects large-scale economic and social change, which affects people differently and evokes different viewpoints. Globalization is thus a highly contested concept and phenomenon. Contradictory and competing views, in turn, seem to be based on different interpretations of the present dominant forms of globalization, and of the material, economic, social and cultural conditions that these forms produce and give rise to. We view globalization not only as a significant set of economic, financial, social, political and cultural forces but as a powerful and contested discursive space. In this article, we present an overview of recent literature to introduce different thematic perspectives on globalization, to specify different ideological and discursive bases from which to approach globalization, and to place multinational corporations (MNCs) within this context. Our account is not exhaustive; rather, it is intended as a basis for further discussion on the nature and role of multinational corporations in complex "global" society.
Abstract:
This article focuses on cultural identity-building in the cross-border merger context. To provide an alternative to the dominant essentialist analyses of cultures and cultural differences, cultural identity-building is conceptualized as a metaphoric process. The focus is on two processes inherent in the cross-border merger context: construction of images of Us and Them and construction of images of a Common Future. Based on an analysis of a special metaphor exercise carried out in a recent Finnish–Swedish merger, the article illustrates how the metaphoric perspective reveals specific cognitive, emotional and political aspects of cultural identity-building that easily remain 'hidden' in more traditional approaches.
Abstract:
In this article, the problem of two Unmanned Aerial Vehicles (UAVs) cooperatively searching an unknown region is addressed. The search region is discretized into hexagonal cells, each of which is assumed to possess an uncertainty value. The UAVs have to search these cells cooperatively while taking limited endurance, sensor and communication range constraints into account. Due to limited endurance, the UAVs need to return to a base station for refuelling, and must also select a base station when multiple base stations are present. This article proposes a route planning algorithm that takes endurance time constraints into account and uses game-theoretical strategies to reduce the uncertainty. The route planning algorithm selects only those cells that ensure the agent can return to one of the available bases. A set of paths is formed from these cells, from which the game-theoretical strategies select a path that yields maximum uncertainty reduction. We explore non-cooperative Nash, cooperative and security strategies from game theory to enhance the search effectiveness. Monte Carlo simulations show the superiority of the game-theoretical strategies over a greedy strategy for different look-ahead step lengths. Among the game-theoretical strategies, the non-cooperative Nash and cooperative strategies perform similarly in the ideal case, but the Nash strategy performs better than the cooperative strategy when the perceived information differs. We also propose a heuristic based on partitioning the search space into sectors to reduce computational overhead without performance degradation.
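The endurance-constrained path selection described in this abstract can be illustrated in miniature. The sketch below is not the paper's algorithm: the hexagonal-grid setup (axial coordinates), the uncertainty map, and all numeric values are assumptions for illustration. It enumerates all k-step paths on a hex grid, discards those from which no base remains reachable on the remaining endurance, and picks the path with the largest total uncertainty reduction.

```python
from itertools import product

HEX_DIRS = [(1, 0), (1, -1), (0, -1), (-1, 0), (-1, 1), (0, 1)]  # axial coords

def hex_distance(a, b):
    """Grid distance between two hexagonal cells in axial coordinates."""
    dq, dr = a[0] - b[0], a[1] - b[1]
    return (abs(dq) + abs(dr) + abs(dq + dr)) // 2

def best_path(pos, uncertainty, bases, endurance, k):
    """Among all k-step paths from pos, keep only those whose endpoint can
    still reach some base within the remaining endurance, and return the
    path with maximum total uncertainty reduction."""
    best, best_gain = None, -1.0
    for moves in product(HEX_DIRS, repeat=k):
        path, cur = [], pos
        for dq, dr in moves:
            cur = (cur[0] + dq, cur[1] + dr)
            path.append(cur)
        # Return-to-base check: after k steps, a base must be reachable.
        fuel_left = endurance - k
        if min(hex_distance(cur, b) for b in bases) > fuel_left:
            continue
        # Count each visited cell once, even if the path revisits it.
        gain = sum(uncertainty.get(c, 0.0) for c in set(path))
        if gain > best_gain:
            best_gain, best = gain, path
    return best, best_gain

# Toy uncertainty map and a single base at the origin (all values assumed).
uncertainty = {(1, 0): 0.9, (2, 0): 0.8, (0, 1): 0.1}
path, gain = best_path((0, 0), uncertainty, bases=[(0, 0)], endurance=4, k=2)
```

Exhaustive enumeration is only feasible for short look-ahead lengths (6^k paths), which is one motivation for the sector-partitioning heuristic the abstract mentions.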
Abstract:
With the objective of better understanding the significance of New Car Assessment Program (NCAP) tests conducted by the National Highway Traffic Safety Administration (NHTSA), head-on collisions between two identical cars of different sizes and between cars and a pickup truck are studied in the present paper using LS-DYNA models. Available finite element models of a compact car (Dodge Neon), midsize car (Dodge Intrepid), and pickup truck (Chevrolet C1500) are first improved and validated by comparing the analysis-based vehicle deceleration pulses against corresponding NCAP crash test histories reported by NHTSA. In confirmation of prevalent perception, simulation-based results indicate that an NCAP test against a rigid barrier is a good representation of a collision between two similar cars approaching each other at a speed of 56.3 km/h (35 mph), both in terms of peak deceleration and intrusions. However, analyses carried out for collisions between two incompatible vehicles, such as an Intrepid or Neon against a C1500, point to the inability of the NCAP tests to represent the substantially higher intrusions in the front upper regions experienced by the cars, although peak decelerations in the cars are comparable to those observed in NCAP tests. In an attempt to improve the capability of a front NCAP test to better represent real-world crashes between incompatible vehicles, i.e., ones with contrasting ride height and lower body stiffness, two modified rigid barriers are studied. One of these barriers, which is of stepped geometry with a curved front face, leads to significantly improved correlation of intrusions in the upper regions of cars with respect to those yielded in the simulation of collisions between incompatible vehicles, while producing vehicle peak decelerations similar to those obtained in NCAP tests.
Abstract:
Let $G = (V, E)$ be a weighted undirected graph having nonnegative edge weights. An estimate $\hat{\delta}(u, v)$ of the actual distance $d(u, v)$ between $u, v \in V$ is said to be of stretch $t$ if and only if $d(u, v) \le \hat{\delta}(u, v) \le t \cdot d(u, v)$. Computing all-pairs small-stretch distances efficiently (both in terms of time and space) is a well-studied problem in graph algorithms. We present a simple, novel, and generic scheme for all-pairs approximate shortest paths. Using this scheme and some new ideas and tools, we design faster algorithms for all-pairs $t$-stretch distances for a whole range of stretch $t$, and we also answer an open question posed by Thorup and Zwick in their seminal paper [J. ACM, 52 (2005), pp. 1-24].
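The stretch-$t$ condition above can be checked directly on a small example. The graph, the exact-distance routine, and the inflated estimate below are illustrative assumptions, not the paper's construction; the point is only to make the definition $d(u,v) \le \hat{\delta}(u,v) \le t \cdot d(u,v)$ concrete.

```python
def floyd_warshall(n, edges):
    """Exact all-pairs distances on an undirected nonnegatively weighted graph."""
    INF = float("inf")
    d = [[0 if i == j else INF for j in range(n)] for i in range(n)]
    for u, v, w in edges:
        d[u][v] = min(d[u][v], w)
        d[v][u] = min(d[v][u], w)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d

def is_stretch_t(d, d_hat, t):
    """True iff d[u][v] <= d_hat[u][v] <= t * d[u][v] holds for all pairs."""
    n = len(d)
    return all(d[i][j] <= d_hat[i][j] <= t * d[i][j]
               for i in range(n) for j in range(n))

# Toy graph: the direct 0-2 edge (weight 10) is beaten by the path 0-1-2.
edges = [(0, 1, 2), (1, 2, 3), (0, 2, 10), (2, 3, 1)]
d = floyd_warshall(4, edges)
# A hypothetical estimate that overshoots each distance by exactly 50%:
d_hat = [[x * 1.5 for x in row] for row in d]
```

Real distance oracles of this kind trade the exact $O(n^3)$ computation above for subcubic preprocessing and compact storage, at the price of stretch $t > 1$.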
Abstract:
Instrument landing systems (ILS) are normally designed assuming the site around them to be flat. Uneven terrain results in undulations in the glide slope. In recent years, models have been developed for predicting such aberrations as a simpler alternative to experimental methods. Such modeling normally assumes the ground to be fully conducting. A method is presented for considering imperfect terrain conductivity within the framework of the uniform theory of diffraction (UTD). A single impedance wedge formulation is developed into a form that resembles the standard form of UTD, with only one extra term in the diffraction coefficient. This extends the applicability of the standard UTD formulation and software packages to the case of imperfectly conducting terrain. The method has been applied to a real airport site in India, and improved agreement with measured glide-slope parameters is demonstrated.
Abstract:
In this thesis I examine the U.S. foreign policy discussion that followed the war between Russia and Georgia in August 2008. In the politically charged setting that preceded the presidential elections, the subject of the debate was not only Washington's response to the crisis in the Caucasus but, more generally, the direction of U.S. foreign policy after the presidency of George W. Bush. As of November 2010, the reasons for and consequences of the Russia-Georgia war continue to be contested. My thesis demonstrates that there were already a number of different stories about the conflict immediately after the outbreak of hostilities. I want to argue that among these stories one can discern a "neoconservative narrative" that described the war as a confrontation between the East and the West and considered it a test for Washington's global leadership. I draw on the theory of securitization, particularly on a framework introduced by Holger Stritzel. Accordingly, I consider statements about the conflict as "threat texts" and analyze these based on the existing discursive context, the performative force of the threat texts and the positional power of the actors presenting them. My thesis suggests that a notion of narrativity can complement Stritzel's securitization framework and take it further. Threat texts are established as narratives by attaching causal connections, meaning and actorship to the discourse. By focusing on this process I want to shed light on the relationship between the text and the context, capture the time dimension of a speech act articulation and help to explain how some interpretations of the conflict are privileged and others marginalized. I develop the theoretical discussion through an empirical analysis of the neoconservative narrative. Drawing on Stritzel's framework, I argue that the internal logic of the narrative, which was presented as self-evident, can be analyzed in its historicity. Asking what was perceived to be at stake in the conflict, how the narrative was formed and what purposes it served also reveals the possibility of alternative explanations. My main source material consists of transcripts of think tank seminars organized in Washington, D.C. in August 2008. In addition, I draw on the foreign policy discussion in the mainstream media.
Abstract:
This paper proposes a differential evolution based method of improving the performance of conventional guidance laws at large heading errors, without resorting to techniques from optimal control theory, which are complicated and suffer from several limitations. The basic guidance law is augmented with a term that is a polynomial function of the heading error. The values of the coefficients of the polynomial are found by applying the differential evolution algorithm. The results are compared with the basic guidance law and with the all-aspect proportional navigation laws in the literature. A scheme for online implementation of the proposed law for application in practice is also given.
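The coefficient search described here can be sketched with a bare-bones DE/rand/1/bin loop. Everything below is an illustrative assumption rather than the paper's setup: the cost function is a toy stand-in (a least-squares polynomial fit), whereas the paper's cost would be evaluated from simulated engagements, and the population size, mutation factor F and crossover rate CR are generic defaults.

```python
import math
import random

def cost(coeffs):
    """Toy stand-in cost: squared error of the polynomial against sin(x)
    sampled on [0, 1.5]. A guidance application would instead score the
    coefficients by simulating the augmented law."""
    pts = [i * 0.05 for i in range(31)]
    poly = lambda x: sum(c * x ** k for k, c in enumerate(coeffs))
    return sum((poly(x) - math.sin(x)) ** 2 for x in pts)

def differential_evolution(cost, dim, pop_size=30, F=0.7, CR=0.9,
                           gens=200, seed=1):
    """Minimal DE/rand/1/bin: mutate with a scaled difference of two random
    members, binomially cross with the current member, keep the better one."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-2, 2) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            mutant = [a[k] + F * (b[k] - c[k]) for k in range(dim)]
            jrand = rng.randrange(dim)  # force at least one mutant gene
            trial = [mutant[k] if (rng.random() < CR or k == jrand)
                     else pop[i][k] for k in range(dim)]
            if cost(trial) <= cost(pop[i]):
                pop[i] = trial
    return min(pop, key=cost)

best = differential_evolution(cost, dim=4)  # four polynomial coefficients
```

DE needs only cost evaluations (no gradients), which is why it suits black-box objectives such as trajectory simulations.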
Abstract:
A linear optimization model was used to calculate seven wood procurement scenarios for the years 1990, 2000 and 2010. Productivity and cost functions for seven cutting methods, five terrain transport methods, three long-distance transport methods, and various work supervision and scaling methods were derived from available work study reports. All methods are based on the Nordic cut-to-length system. Finland was divided into three parts to describe the harvesting conditions, and twenty imaginary wood processing points and their wood procurement areas were created for these areas. The procurement systems, which consist of the harvesting conditions and work productivity functions, were described as a simulation model. In the LP model, the wood procurement system has to fulfil the volume and wood assortment requirements of the processing points while minimizing procurement cost. The model consists of 862 variables and 560 constraints. Results show that it is economical to increase the share of mechanized harvesting work. Cost-increase alternatives have only a small effect on the profitability of manual work. The areas of later thinnings and of seed-tree and shelterwood cuttings increase at the expense of first thinnings. In mechanized work one method, the 10-tonne single-grip harvester with forwarder, is gaining an advantage over the other methods. Forwarder working hours are decreasing, in contrast to those of the harvester. There is little need to increase the number of harvesters and trucks, or their drivers, from today's level. Quite large fluctuations in procurement level and cost can be handled with a constant number of machines by varying the number of seasonal workers and by running machines in two shifts, provided that some environmental problems of large-scale summertime harvesting can be solved.
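The structure of such an LP (minimize procurement cost subject to volume requirements and capacity limits) can be shown in miniature. The two-method setup and all numbers below are hypothetical; the actual model has 862 variables and 560 constraints and would be handed to an LP solver, while this two-variable toy simply enumerates the vertices of the feasible region, where an LP optimum must lie.

```python
from itertools import combinations

def solve_2var_lp(cost, constraints):
    """Minimize cost(x1, x2) subject to constraints (a, b, c), each meaning
    a*x1 + b*x2 <= c. For a bounded feasible region the optimum is at a
    vertex, so we intersect constraint boundaries pairwise and keep the
    cheapest feasible intersection."""
    def feasible(p, eps=1e-9):
        return all(a * p[0] + b * p[1] <= c + eps for a, b, c in constraints)
    verts = []
    for (a1, b1, c1), (a2, b2, c2) in combinations(constraints, 2):
        det = a1 * b2 - a2 * b1
        if abs(det) < 1e-12:
            continue  # parallel boundaries, no intersection
        x = (c1 * b2 - c2 * b1) / det
        y = (a1 * c2 - a2 * c1) / det
        if feasible((x, y)):
            verts.append((x, y))
    return min(verts, key=lambda p: cost(*p))

# Hypothetical data: a mill demands >= 100 units (written as -x1 - x2 <= -100),
# manual harvesting x1 is capped at 60, mechanized x2 at 80, both nonnegative.
constraints = [(-1, -1, -100), (1, 0, 60), (0, 1, 80), (-1, 0, 0), (0, -1, 0)]
cost = lambda x1, x2: 12 * x1 + 9 * x2  # assumed unit costs, manual vs mechanized
x1, x2 = solve_2var_lp(cost, constraints)
```

Because mechanized work is cheaper in this toy instance, the optimum uses mechanized capacity fully and tops up the demand with manual work, mirroring the abstract's finding that increasing mechanized harvesting is economical.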
Abstract:
We report on a search for the production of the Higgs boson decaying to two bottom quarks accompanied by two additional quarks. The data sample used corresponds to an integrated luminosity of approximately 4 fb$^{-1}$ of $p\bar{p}$ collisions at $\sqrt{s} = 1.96$ TeV recorded by the CDF II experiment. This search includes twice the integrated luminosity of the previously published result, uses analysis techniques to distinguish jets originating from light-flavor quarks from those arising from gluon radiation, and adds sensitivity to a Higgs boson produced by vector boson fusion. We find no evidence of the Higgs boson and place limits on the Higgs boson production cross section for Higgs boson masses between 100 GeV/$c^{2}$ and 150 GeV/$c^{2}$ at the 95% confidence level. For a Higgs boson mass of 120 GeV/$c^{2}$, the observed (expected) limit is 10.5 (20.0) times the predicted standard model cross section.
Abstract:
It is shown that the euclideanized Yukawa theory, with the Dirac fermion belonging to an irreducible representation of the Lorentz group, is not bounded from below. A one parameter family of supersymmetric actions is presented which continuously interpolates between the N = 2 SSYM and the N = 2 supersymmetric topological theory. In order to obtain a theory which is bounded from below and satisfies Osterwalder-Schrader positivity, the Dirac fermion should belong to a reducible representation of the Lorentz group and the scalar fields have to be reinterpreted as the extra components of a higher dimensional vector field.
Abstract:
Much of the benefits of deploying unmanned aerial vehicles can be derived from autonomous missions. For such missions, however, sense-and-avoid capability (i.e., the ability to detect potential collisions and avoid them) is a critical requirement. Collision avoidance can be broadly classified into global and local path-planning algorithms, both of which need to be addressed in a successful mission. Whereas global path planning (which is mainly done offline) broadly lays out a path that reaches the goal point, local collision-avoidance algorithms, which are usually fast, reactive, and carried out online, ensure safety of the vehicle from unexpected and unforeseen obstacles/collisions. Even though many techniques for both global and local collision avoidance have been proposed in the recent literature, there is a great interest around the globe to solve this important problem comprehensively and efficiently and such techniques are still evolving. This paper presents a brief overview of a few promising and evolving ideas on collision avoidance for unmanned aerial vehicles, with a preferential bias toward local collision avoidance.