893 results for running reward
Abstract:
In this dissertation we study the interaction between Saturn's moon Titan and the magnetospheric plasma and magnetic field. The research method is a three-dimensional computer simulation of this interaction. The simulation model used is a hybrid model. Hybrid models enable individual tracking or tracing of ions and also take the particle motion into account in the propagation of the electromagnetic fields. The hybrid model has been developed at the Finnish Meteorological Institute. This thesis gives a general description of the effects that the solar wind has on Earth and other planets of our solar system. Planetary satellites can have similar interactions, both with the solar wind and with the plasma flows of planetary magnetospheres. Titan is clearly the largest among the satellites of Saturn and also the only known satellite with a dense atmosphere. It is this atmosphere that makes Titan's plasma interaction with the magnetosphere of Saturn so unique. Nevertheless, comparisons with the plasma interactions of other solar system bodies are valuable. Detecting charged plasma particles requires in situ measurements obtainable through scientific spacecraft. The Cassini mission has been one of the most remarkable international efforts in space science. Since 2004 the measurements and images obtained from instruments onboard the Cassini spacecraft have increased the scientific knowledge of Saturn as well as its satellites and magnetosphere in a way that probably no one was able to predict. The current level of science on Titan is practically unthinkable without the Cassini mission. Many of the observations by Cassini instrument teams have influenced this research, both the direct measurements of Titan and the observations of its plasma environment. The theoretical principles of the hybrid modelling approach are presented in connection with the broader context of plasma simulations. The developed hybrid model is described in detail: for example, the way the equations of the hybrid model are solved is shown explicitly. Several simulation techniques, such as the grid structure and various boundary conditions, are discussed in detail as well. The testing and monitoring of simulation runs is presented as an essential routine when running sophisticated and complex models. Several significant improvements to the model that are in preparation are also discussed. A main part of this dissertation consists of four scientific articles based on the results of the Titan model. The Titan model developed during the course of the Ph.D. research has been shown to be an important tool for understanding Titan's plasma interaction. One reason for this is that the structures of the magnetic field around Titan are very much three-dimensional. The simulation results give a general picture of the magnetic fields in the vicinity of Titan. The magnetic fine structure of Titan's wake, as seen in the simulations, appears connected to Alfvén waves, an important wave mode in space plasmas. The particle escape from Titan is also a major part of these studies. Our simulations show a bending or turning of Titan's ionotail that we have shown to be a direct consequence of basic principles of plasma physics. Furthermore, the ion flux from the magnetosphere of Saturn into Titan's upper atmosphere has been studied. The modelled ion flux has asymmetries that would likely have a large impact on the heating of different parts of Titan's upper atmosphere.
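As an orientation to the particle side of hybrid modelling described above (this is not the FMI model's actual code), the sketch below advances one ion macro-particle with the standard Boris algorithm in given electric and magnetic fields. A full hybrid code would interpolate the fields from a grid to many such particles, deposit their charge and current back onto the grid, and advance the fields with Faraday's law and an electron-fluid Ohm's law. All names and field values here are illustrative assumptions.

```python
import numpy as np

def boris_push(x, v, E, B, q, m, dt):
    """One Boris-scheme step for a single ion macro-particle in given
    E (V/m) and B (T) fields. Illustrative sketch only."""
    qmdt2 = q * dt / (2.0 * m)
    v_minus = v + qmdt2 * E                      # first half electric acceleration
    t = qmdt2 * B                                # magnetic rotation vector
    s = 2.0 * t / (1.0 + np.dot(t, t))
    v_prime = v_minus + np.cross(v_minus, t)
    v_plus = v_minus + np.cross(v_prime, s)      # rotated velocity
    v_new = v_plus + qmdt2 * E                   # second half electric acceleration
    x_new = x + v_new * dt                       # leapfrog position update
    return x_new, v_new

# Example: a heavy ion (illustrative O2+-like mass/charge) gyrating in uniform fields
x, v = np.zeros(3), np.array([1.0e3, 0.0, 0.0])
E, B = np.array([0.0, 1.0e-3, 0.0]), np.array([0.0, 0.0, 5.0e-9])
q, m = 1.602e-19, 32 * 1.673e-27
for _ in range(1000):
    x, v = boris_push(x, v, E, B, q, m, dt=0.01)
```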
Abstract:
Discoveries at the LHC will soon set the physics agenda for future colliders. This report of a CERN Theory Institute includes the summaries of Working Groups that reviewed the physics goals and prospects of LHC running with 10 to 300 fb⁻¹ of integrated luminosity, of the proposed sLHC luminosity upgrade, of the ILC, of CLIC, of the LHeC and of a muon collider. The four Working Groups considered possible scenarios for the first 10 fb⁻¹ of data at the LHC in which (i) a state with properties that are compatible with a Higgs boson is discovered, (ii) no such state is discovered either because the Higgs properties are such that it is difficult to detect or because no Higgs boson exists, (iii) a missing-energy signal beyond the Standard Model is discovered as in some supersymmetric models, and (iv) some other exotic signature of new physics is discovered. In the contexts of these scenarios, the Working Groups reviewed the capabilities of the future colliders to study in more detail whatever new physics may be discovered by the LHC. Their reports provide the particle physics community with some tools for reviewing the scientific priorities for future colliders after the LHC produces its first harvest of new physics from multi-TeV collisions.
Abstract:
This dissertation concerns the Punan Vuhang, former hunter-gatherers who are now part-time farmers living in an area of remote rainforest in the Malaysian state of Sarawak. It covers two themes: first, examining their methods of securing a livelihood in the rainforest, and second, looking at their adaptation to a settled life and agriculture, and their response to rapid and large-scale commercial logging. This study engages the long-running debates among anthropologists and ecologists on whether recent hunting-gathering societies were able to survive in the tropical rainforest without dependence on farming societies for food resources. In the search for evidence, the study poses three questions: What food resources were available to rainforest hunter-gatherers? How did they hunt and gather these foods? How did they cope with periodic food shortages? In fashioning a life in the rainforest, the Punan Vuhang survived resource scarcity by developing adaptive strategies through intensive use of their knowledge of the forest and its resources. They also adopted social practices such as sharing, reciprocity, and resource tenure to sustain themselves without recourse to external sources of food. In the 1960s, the Punan Vuhang settled down in response to external influences arising in part from the Indonesian-Malaysian Confrontation. This, in turn, initiated a series of processes with political, economic and religious implications. However, elements of the traditional economy have remained resilient as the people continue to hunt, fish and gather, and are able to farm on an individual basis, unlike neighboring shifting cultivators who need to cooperate with each other. At the beginning of the 21st century, the Punan Vuhang face a new challenge arising from the issue of rights in the context of state and national law and large-scale commercial logging in their forest habitat. The future seems bleak as they face the social problems of alcoholism, declining leadership, and dependence on cash income and commodities from the market.
Abstract:
We develop extensions of the Simulated Annealing with Multiplicative Weights (SAMW) algorithm, which was proposed as a method of solution for Finite-Horizon Markov Decision Processes (FH-MDPs). The extensions developed are in three directions: a) use of the dynamic programming principle in the policy update step of SAMW; b) a two-timescale actor-critic algorithm that uses simulated transitions alone; and c) extension of the algorithm to the infinite-horizon discounted-reward scenario. In particular, a) reduces the storage required from exponential to linear in the number of actions per stage-state pair. On the faster timescale, a 'critic' recursion performs policy evaluation, while on the slower timescale an 'actor' recursion performs policy improvement using SAMW. We give a proof outlining convergence w.p. 1 and show experimental results on two settings: semiconductor fabrication and flow control in communication networks.
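As a rough illustration of the multiplicative-weights idea behind SAMW (the exact update and the two-timescale actor-critic coupling are given in the papers, not reproduced here), the sketch below keeps one weight per action for each stage-state pair, matching the linear storage mentioned in a), and multiplicatively boosts actions with higher simulated reward-to-go estimates. The update form, the annealing base, and all names are assumptions for illustration.

```python
import numpy as np

def mw_policy_update(weights, q_hat, beta):
    """Multiplicative-weights (Hedge/SAMW-style) policy update for one
    (stage, state) pair. weights: unnormalized weights over actions;
    q_hat: simulated estimates of reward-to-go per action (the 'critic'
    output in an actor-critic variant); beta > 1: annealing base."""
    new_w = weights * beta ** q_hat        # favour actions with higher estimates
    return new_w / new_w.sum()             # normalize back to a randomized policy

# toy usage: 3 actions at some (stage, state) pair
w = np.ones(3) / 3.0
rng = np.random.default_rng(0)
for _ in range(100):
    q_hat = np.array([1.0, 1.5, 0.7]) + 0.1 * rng.standard_normal(3)  # noisy estimates
    w = mw_policy_update(w, q_hat, beta=1.05)
print(w)   # probability mass concentrates on the best action as updates accumulate
```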
Abstract:
This article discusses the physics programme of the TOTEM experiment at the LHC. A new special beam optics with beta* = 90 m, enabling the measurements of the total cross-section, elastic pp scattering and diffractive phenomena already at early LHC runs, is explained. For this and the various other TOTEM running scenarios, the acceptances of the leading proton detectors and of the forward tracking stations for some physics processes are described.
Abstract:
Wear studies of engine components of high-speed diesel engines running under various operating conditions are presented. Tests were conducted under controlled conditions over long periods. The results of the various tests are discussed and attempts have been made to examine the effects of engine operating variables and the quality of the lubricating oil on the wear of engine components.
Abstract:
The status of the TOTEM experiment is described, as well as the prospects for the measurements in the early LHC runs. The primary goal of TOTEM is the measurement of the total p-p cross section, using a method independent of the luminosity. A final accuracy of 1% is expected with dedicated β* = 1540 m runs, while at the beginning a 5% resolution is achievable with a β* = 90 m optics. According to the running scenarios, TOTEM will be able to measure the elastic scattering in a wide range of t and to study the cross-sections and the topologies of diffractive events. In a later stage, physics studies will be extended to low-x and forward physics in collaboration with CMS, operating as a whole experimental apparatus.
Abstract:
Query incentive networks capture the role of incentives in extracting information from decentralized information networks such as a social network. Several game theoretic models of query incentive networks have been proposed in the literature to study and characterize the dependence of the monetary reward required to extract the answer for a query on various factors such as the structure of the network, the level of difficulty of the query, and the required success probability. None of the existing models, however, captures the practical and important factor of the quality of answers. In this paper, we develop a complete mechanism design based framework to incorporate the quality of answers in the monetization of query incentive networks. First, we extend the model of Kleinberg and Raghavan [2] to allow the nodes to modulate the incentive on the basis of the quality of the answer they receive. For this quality-conscious model, we show the existence of a unique Nash equilibrium and study the impact of the quality of answers on the growth rate of the initial reward with respect to the branching factor of the network. Next, we present two mechanisms, the direct comparison mechanism and the peer prediction mechanism, for truthful elicitation of quality from the agents. These mechanisms are based on scoring rules and cover different scenarios which may arise in query incentive networks. We show that the proposed quality elicitation mechanisms are incentive compatible and ex-ante budget balanced. We also derive conditions under which ex-post budget balance can be achieved by these mechanisms.
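For orientation on the scoring-rule machinery that such quality-elicitation mechanisms build on (the direct comparison and peer prediction mechanisms themselves are specified in the paper, not here), the sketch below computes a quadratic (Brier) scoring-rule payment for a reported quality distribution against a realized quality signal; strictly proper rules of this kind make truthful reporting optimal in expectation. The payment scaling, offset, and all names are illustrative assumptions.

```python
import numpy as np

def quadratic_score(report, outcome):
    """Strictly proper quadratic (Brier) scoring rule.
    report: reported probability distribution over quality levels
    outcome: index of the realized quality level (e.g. a peer's report in a
    peer-prediction setting, or the asker's observed quality in a
    direct-comparison setting)."""
    report = np.asarray(report, dtype=float)
    return 2.0 * report[outcome] - np.dot(report, report)

def payment(report, outcome, scale=1.0, offset=1.0):
    """Affine shift of the score so payments are non-negative; scale and
    offset would be chosen to meet incentive and budget-balance constraints."""
    return scale * (quadratic_score(report, outcome) + offset)

# toy usage with three quality levels {low, medium, high}
print(payment([0.1, 0.3, 0.6], outcome=2))   # truthful report, high quality realized
print(payment([0.6, 0.3, 0.1], outcome=2))   # a misreport earns less in expectation
```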
Abstract:
Volatile organic compounds (VOCs) are emitted into the atmosphere from natural and anthropogenic sources, vegetation being the dominant source on a global scale. Some of these reactive compounds are deemed major contributors or inhibitors to aerosol particle formation and growth, thus making VOC measurements essential for current climate change research. This thesis discusses ecosystem scale VOC fluxes measured above a boreal Scots pine dominated forest in southern Finland. The flux measurements were performed using the micrometeorological disjunct eddy covariance (DEC) method combined with proton transfer reaction mass spectrometry (PTR-MS), which is an online technique for measuring VOC concentrations. The measurement, calibration, and calculation procedures developed in this work proved to be well suited to long-term VOC concentration and flux measurements with PTR-MS. A new averaging approach based on running averaged covariance functions improved the determination of the lag time between wind and concentration measurements, which is a common challenge in DEC when measuring fluxes near the detection limit. The ecosystem scale emissions of methanol, acetaldehyde, and acetone were substantial. These three oxygenated VOCs made up about half of the total emissions, with the rest comprised of monoterpenes. Contrary to the traditional assumption that monoterpene emissions from Scots pine originate mainly as evaporation from specialized storage pools, the DEC measurements indicated a significant contribution from de novo biosynthesis to the ecosystem scale monoterpene emissions. This thesis offers practical guidelines for long-term DEC measurements with PTR-MS. In particular, the new averaging approach to the lag time determination seems useful in the automation of DEC flux calculations. Seasonal variation in the monoterpene biosynthesis and the detailed structure of a revised hybrid algorithm, describing both de novo and pool emissions, should be determined in further studies to improve biological realism in the modelling of monoterpene emissions from Scots pine forests. The increasing number of DEC measurements of oxygenated VOCs will probably enable better estimates of the role of these compounds in plant physiology and tropospheric chemistry.
Keywords: disjunct eddy covariance, lag time determination, long-term flux measurements, proton transfer reaction mass spectrometry, Scots pine forests, volatile organic compounds
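To illustrate the lag-time problem the abstract refers to (finding the delay between the wind and PTR-MS concentration records when the flux is near the detection limit), the sketch below locates the lag that maximizes the absolute cross-covariance and smooths the covariance function across averaging periods with a running (exponential) average. The exact procedure of the thesis is not reproduced; the smoothing constant and all names are illustrative.

```python
import numpy as np

def cross_covariance(w, c, max_lag):
    """Cross-covariance between vertical wind w and concentration c for
    lags -max_lag..max_lag (in samples). Illustrative sketch."""
    w = w - w.mean()
    c = c - c.mean()
    lags = np.arange(-max_lag, max_lag + 1)
    cov = np.empty(lags.size)
    for i, k in enumerate(lags):
        if k >= 0:
            cov[i] = np.mean(w[: w.size - k] * c[k:])
        else:
            cov[i] = np.mean(w[-k:] * c[: c.size + k])
    return lags, cov

def update_running_cov(prev_cov, cov, alpha=0.1):
    """Running (exponential) average of covariance functions across
    averaging periods, stabilizing lag detection for weak fluxes."""
    return cov if prev_cov is None else (1.0 - alpha) * prev_cov + alpha * cov

# toy usage: the concentration record lags the wind by 15 samples
rng = np.random.default_rng(0)
w = rng.standard_normal(4000)
c = np.roll(w, 15) + 2.0 * rng.standard_normal(4000)
lags, cov = cross_covariance(w, c, max_lag=50)
smoothed = update_running_cov(None, cov)
print(lags[np.argmax(np.abs(smoothed))])   # approximately 15
```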
Abstract:
Two optimal non-linear reinforcement schemes—the Reward-Inaction and the Penalty-Inaction—for the two-state automaton functioning in a stationary random environment are considered. Very simple conditions of symmetry of the non-linear function figuring in the reinforcement scheme are shown to be necessary and sufficient for optimality. General expressions for the variance and rate of learning are derived. These schemes are compared with the already existing optimal linear schemes in the light of average variance and average rate of learning.
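For orientation, the sketch below implements the general form of a Reward-Inaction scheme for a two-action automaton: on a reward, the probability of the chosen action is increased by an update function f; on a penalty, nothing changes. The classic linear scheme uses f(p) = a(1 - p); the non-linear schemes studied here would substitute a non-linear f satisfying the symmetry conditions mentioned in the abstract. The concrete f below is the linear one, used only for illustration, and all parameter values are assumptions.

```python
import numpy as np

def reward_inaction_step(p, rng, env_reward_probs, f=lambda p: 0.05 * (1.0 - p)):
    """One step of a two-action Reward-Inaction automaton.
    p: probability of choosing action 0 (action 1 has probability 1 - p)
    env_reward_probs: reward probability of each action in the stationary
    random environment. f: update function; a non-linear optimal scheme
    would replace the linear default shown here."""
    action = 0 if rng.random() < p else 1
    rewarded = rng.random() < env_reward_probs[action]
    if rewarded:                                   # Reward: reinforce the chosen action
        p = p + f(p) if action == 0 else p - f(1.0 - p)
    return p                                       # Inaction: a penalty leaves p unchanged

rng = np.random.default_rng(1)
p = 0.5
for _ in range(20000):
    p = reward_inaction_step(p, rng, env_reward_probs=(0.7, 0.4))
print(p)   # with high probability close to 1.0, i.e. the action with higher reward probability
```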
Abstract:
One of the effects of the Internet is that the dissemination of scientific publications in a few years has migrated to electronic formats. The basic business practices between libraries and publishers for selling and buying the content, however, have not changed much. In protest against the high subscription prices of mainstream publishers, scientists have started Open Access (OA) journals and e-print repositories, which distribute scientific information freely. Despite widespread agreement among academics that OA would be the optimal distribution mode for publicly financed research results, such channels still constitute only a marginal phenomenon in the global scholarly communication system. This paper discusses, in view of the experiences of the last ten years, the many barriers hindering a rapid proliferation of Open Access. The discussion is structured according to the main OA channels; peer-reviewed journals for primary publishing, subject-specific and institutional repositories for secondary parallel publishing. It also discusses the types of barriers, which can be classified as consisting of the legal framework, the information technology infrastructure, business models, indexing services and standards, the academic reward system, marketing, and critical mass.
Abstract:
This paper summarizes literature explaining workplace bullying and focuses on organisational antecedents of bullying. In order to better understand the logic behind bullying, a model discussing different types of explanations is put forward. Thus, explanations for and factors associated with bullying are classified into three groups, i.e. enabling structures or necessary antecedents (e.g. perceived power imbalances, low perceived costs, and dissatisfaction and frustration), motivating structures or incentives (e.g. internal competition, reward systems, and expected benefits), and precipitating processes or triggering circumstances (e.g. downsizing and restructuring, organisational changes, changes in the composition of the workgroup). The paper concludes that bullying is often an interaction between structures and processes from all three groupings.
Abstract:
As globalization and the free movement of capital have increased, so has interest in the effects of global money flow, especially during financial crises. The concern has been that large global money flows will affect the pricing of small local markets by causing, in particular, overreaction. The purpose of this thesis is to contribute to the body of work concerning short-term under- and overreaction and the short-term effects of foreign investment flow in the small Finnish equity markets. This thesis also compares foreign execution return to domestic execution return. This study's results indicate that short-term under- and overreaction occurs in domestic-buy portfolios (domestic net buying) rather than in foreign-buy portfolios. This under- and overreaction, however, is not economically meaningful after controlling for the bid-ask bounce effect. Based on this finding, one can conclude that foreign investors do not have a destabilizing effect in the short term in the Finnish markets. Foreign activity affects short-term returns: when foreign investors are net buyers (sellers) there are positive (negative) market-adjusted returns. The literature related to the nationality and institutional effects leads us to expect this kind of result. These foreign flows are persistent at a 5% to 21% level, and the persistence of the foreign buy flow is higher than that of the foreign sell flow. Foreign daily trading execution is worse than domestic execution. The literature which characterizes foreign investors as liquidity demanders and the literature related to front-running lead us to expect poorer foreign execution than domestic execution.
Abstract:
The move towards IT outsourcing is the first step towards an environment where compute infrastructure is treated as a service. In utility computing this IT service has to honor Service Level Agreements (SLAs) in order to meet the desired Quality of Service (QoS) guarantees. Such an environment requires reliable services in order to maximize the utilization of the resources and to decrease the Total Cost of Ownership (TCO). Such reliability cannot come at the cost of resource duplication, since that increases the TCO of the data center and hence the cost per compute unit. In this paper, we look into aspects of projecting the impact of hardware failures on the SLAs and the techniques required to take proactive recovery steps in case of a predicted failure. By maintaining health vectors of all hardware and system resources, we predict the failure probability of resources at runtime, based on observed hardware errors/failure events. This in turn influences an availability-aware middleware to take proactive action (even before the application is affected, in case the system and the application have low recoverability). The proposed framework has been prototyped on a system running HP-UX. Our offline analysis of the prediction system on hardware error logs indicates no more than 10% false positives. This work, to the best of our knowledge, is the first of its kind to perform an end-to-end analysis of the impact of a hardware fault on application SLAs in a live system.
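As a hedged illustration of the kind of health-vector bookkeeping described above (the actual HP-UX prototype and its prediction model are not reproduced here), the sketch below keeps an exponentially decayed count of observed error events per resource and maps it through a logistic curve to a failure probability that a proactive-recovery policy could compare against an SLA-derived threshold. All thresholds and coefficients are made-up illustrative values.

```python
import math, time

class HealthVector:
    """Per-resource health tracking: an exponentially decayed error count
    mapped to a failure probability. Illustrative sketch only."""
    def __init__(self, decay_per_sec=1e-4, k=0.8, x0=5.0):
        self.score = 0.0            # decayed error count
        self.last = time.time()
        self.k, self.x0 = k, x0     # logistic steepness and midpoint (assumed)
        self.decay = decay_per_sec

    def record_error(self, weight=1.0):
        now = time.time()
        self.score *= math.exp(-self.decay * (now - self.last))  # age old errors
        self.score += weight                                     # e.g. weight by severity
        self.last = now

    def failure_probability(self):
        return 1.0 / (1.0 + math.exp(-self.k * (self.score - self.x0)))

# toy usage: trigger proactive recovery when the predicted risk threatens the SLA
hv = HealthVector()
for _ in range(8):
    hv.record_error(weight=1.0)          # e.g. correctable-memory-error events
if hv.failure_probability() > 0.7:       # threshold derived from the SLA penalty (assumed)
    print("schedule proactive failover / migration")
```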
Abstract:
Motivated by certain situations in manufacturing systems and communication networks, we look into the problem of maximizing the profit in a queueing system with a linear reward and cost structure and with a choice of selecting the streams of Poisson arrivals according to an independent Markov chain. We view the system as an MMPP/GI/1 queue and seek to maximize the profits by optimally choosing the stationary probabilities of the modulating Markov chain. We consider two formulations of the optimization problem. The first one (which we call the PUT problem) seeks to maximize the profit per unit time, whereas the second one considers the maximization of the profit per accepted customer (the PAC problem). In each of these formulations, we explore three separate problems. In the first one, the constraints come from bounding the utilization of an infinite-capacity server; in the second one the constraints arise from bounding the mean queue length of the same queue; and in the third one the finite capacity of the buffer is reflected as a set of constraints. In the problems bounding the utilization factor of the queue, the solutions are given by essentially linear programs, while the problems with mean queue length constraints are linear programs if the service is exponentially distributed. The problems modeling the finite-capacity queue are non-convex programs for which global maxima can be found. There is a rich relationship between the solutions of the PUT and PAC problems. In particular, the PUT solutions always make the server work at a utilization factor that is no less than that of the PAC solutions.
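To make the PUT formulation concrete in the simplest of the three cases (the utilization-bounded one, which the abstract says is essentially a linear program), the sketch below maximizes a linear profit rate over the stationary probabilities of the modulating Markov chain subject to a utilization bound. The objective coefficients, rates, service time, and bound are illustrative assumptions, not the paper's data.

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative PUT-style linear program over the stationary probabilities pi_i
lam = np.array([2.0, 5.0, 9.0])                    # arrival rates of the Poisson streams
reward_per_customer = 1.0                          # linear reward r
cost_per_unit_time = np.array([0.2, 0.5, 1.0])     # linear per-state cost rates c_i
mean_service = 0.15                                # E[S]
rho_max = 0.9                                      # utilization bound for the server

# maximize sum_i pi_i * (r*lam_i - c_i)  ==  minimize the negated objective
c = -(reward_per_customer * lam - cost_per_unit_time)
A_ub = [mean_service * lam]                        # (sum_i pi_i*lam_i) * E[S] <= rho_max
b_ub = [rho_max]
A_eq = [np.ones_like(lam)]                         # probabilities sum to one
b_eq = [1.0]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, 1)] * len(lam))
print(res.x)   # optimal stationary probabilities of the modulating chain
```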