52 results for [JEL:C79] Mathematical and Quantitative Methods - Game Theory and Bargaining Theory - Other
Abstract:
We present new optical and near-infrared (NIR) photometry and spectroscopy of the Type IIP supernova (SN) 2004et. In combination with already published data, this provides one of the most complete optical and NIR data sets for any Type IIP SN, from just after explosion to +500 d. The contribution of the NIR flux to the bolometric light curve is estimated to increase from 15 per cent at explosion to around 50 per cent at the end of the plateau, and then to decline to 40 per cent at 300 d. SN 2004et is one of the most luminous IIP SNe that has been well studied and characterized: with a luminosity of log (L/erg s⁻¹) = 42.3 and a ⁵⁶Ni mass of 0.06 ± 0.04 M⊙, it is two times brighter than SN 1999em. We provide parametrized bolometric corrections as a function of time since explosion for SN 2004et and three other IIP SNe that have extensive optical and NIR data. These can be used as templates for future events in optical and NIR surveys without full wavelength coverage. We compare the physical parameters of SN 2004et with those of other well-studied IIP SNe and find that the kinetic energies span a range of 10⁵⁰-10⁵¹ erg. We compare the ejected masses calculated from hydrodynamic models with the progenitor masses and limits derived from pre-discovery images. Some of the ejected mass estimates are significantly higher than the progenitor mass estimates, with SN 2004et showing perhaps the most serious discrepancy. With the current models, it appears difficult to reconcile 100 d plateau lengths and high expansion velocities with the low ejected masses of 5-6 M⊙ implied by 7-8 M⊙ progenitors. The nebular phase is studied using very late-time Hubble Space Telescope photometry, along with optical and NIR spectroscopy. The light curve shows a clear flattening at 600 d in both the optical and the NIR, which is likely due to the ejecta impacting on circumstellar material.
We further show that the [O I] 6300, 6364 Å line strengths in the nebular spectra of four Type IIP SNe imply ejected oxygen masses of 0.5-1.5 M⊙.
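The parametrized bolometric corrections described above can be sketched in a few lines. The following is a hypothetical Python example, assuming a single-band (V) light curve and a placeholder linear form for BC(t); the paper's actual fitted coefficients are not reproduced here.

```python
import math

def bolometric_correction(t_days, a=-0.3, b=-1.0e-3):
    """Placeholder linear BC(t) in magnitudes; the coefficients a and b
    are illustrative stand-ins for the paper's fitted parametrization."""
    return a + b * t_days

def bolometric_luminosity(M_V, t_days, M_bol_sun=4.74, L_sun=3.828e33):
    """Apply BC(t) to an absolute V magnitude and convert the resulting
    bolometric magnitude to a luminosity in erg/s."""
    M_bol = M_V + bolometric_correction(t_days)
    return L_sun * 10 ** (0.4 * (M_bol_sun - M_bol))

if __name__ == "__main__":
    # A plateau-phase point for a bright IIP SN (illustrative numbers).
    L = bolometric_luminosity(M_V=-17.0, t_days=50.0)
    print(f"log10(L/erg s^-1) = {math.log10(L):.2f}")
```

This is exactly the use case the templates target: surveys that lack full wavelength coverage can recover an approximate bolometric light curve from one band plus BC(t).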
Abstract:
A web-service is a remote computational facility made available for general use over the internet. An orchestration is a multi-threaded computation that invokes remote services. In this paper, game theory is used to analyse the behaviour of orchestration evaluations when the underlying web-services are unreliable. Uncertainty profiles are proposed as a means of defining bounds on the number of service failures that can be expected during an orchestration evaluation. An uncertainty profile describes a strategic situation that can be analysed using a zero-sum angel-daemon game with two competing players: an angel a, whose objective is to minimize damage to an orchestration, and a daemon d, who acts in a destructive fashion. An uncertainty profile is assessed using the value of its angel-daemon game. It is shown that uncertainty profiles form a partial order which is monotonic with respect to assessment.
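The assessment step above, taking the value of a zero-sum game, can be sketched for the 2x2 case. The payoff matrix below is illustrative only: rows are angel choices, columns are daemon choices, and entries are the utility the orchestration retains.

```python
def saddle_value(M):
    """Value of a 2x2 zero-sum game (row player maximizes).
    Assumes a non-degenerate game when no saddle point exists."""
    # Pure-strategy saddle point: maximin equals minimax.
    row_mins = [min(r) for r in M]
    col_maxs = [max(M[0][j], M[1][j]) for j in range(2)]
    maximin, minimax = max(row_mins), min(col_maxs)
    if maximin == minimax:
        return maximin
    # Otherwise use the closed-form mixed-strategy value.
    (a, b), (c, d) = M
    return (a * d - b * c) / (a + d - b - c)

if __name__ == "__main__":
    # Hypothetical angel-daemon game for one uncertainty profile.
    profile_game = [[4.0, 1.0], [2.0, 3.0]]
    print(saddle_value(profile_game))  # the profile's assessment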
Abstract:
This article examines the role that qualitative methods can play in the study of children's racial attitudes and behaviour. It does this by discussing a number of examples taken from a qualitative, ethnographic study of five- and six-year-old children in an English multi-ethnic, inner-city primary school. The examples are used to highlight the limitations of research that relies solely on quantitative methods and the potential that qualitative methods have for addressing these limitations. Within this context the article contrasts the strengths and weaknesses of qualitative and quantitative methods in the study of children's racial attitudes and identities. The article concludes by arguing that a much more integrated multi-method approach is needed in this area and sets out some of the most effective ways this could be achieved.
Abstract:
Contestants are predicted to adjust the cost of a fight in line with the perceived value of the resource and this provides a way of determining whether the resource has been assessed. An assessment of resource value is predicted to alter an animal's motivational state and we note different methods of measuring that state. We provide a categorical framework in which the degree of resource assessment may be evaluated and also note limitations of various approaches. We place studies in six categories: (1) cases of no assessment, (2) cases of internal state such as hunger influencing apparent value, (3) cases of the contestants differing in assessment ability, (4) cases of mutual and equal assessment of value, (5) cases where opponents differ in resource value and (6) cases of particularly complex assessment abilities that involve a comparison of the value of two resources. We examine the extent to which these studies support game theory predictions and suggest future areas of research.
Abstract:
Aim: This paper reports a study on how men cope with the side-effects of radiotherapy and neo-adjuvant androgen deprivation for prostate cancer up to 1 year after treatment.
Background: With early detection and improved treatments, prostate cancer survivors are living longer with the disease and the side-effects of treatment. How they cope affects their long-term physical and mental health.
Design: A prospective, longitudinal, exploratory design using both qualitative and quantitative methods was used in this study.
Method: Between September 2006 and September 2007, 149 men who were about to undergo radical radiotherapy ± androgen deprivation for localized prostate cancer in Northern Ireland were recruited to the study. They completed the Brief COPE scale at four time-points.
Results: Acceptance, positive reframing, emotional support, planning and just getting on with it were the most common ways of coping. Fewer men used coping strategies at 6 months and 1 year after radiotherapy than pre-treatment and at 4-6 weeks after radiotherapy. Interviews with these men demonstrated that they adapted to a new norm, with the support of their wives/partners, and did not readily seek professional help. A minority of men used alcohol, behavioural disengagement and self-blame as ways of coping.
Conclusion: Men used a variety of ways of coping to help them deal with radiotherapy and neo-adjuvant androgen deprivation for up to 12 months after radiotherapy. Interventions need to be developed to take account of the specific needs of partners of men with prostate cancer and single men who have prostate cancer.
Abstract:
From the early 1900s, some psychologists have attempted to establish their discipline as a quantitative science. In using quantitative methods to investigate their theories, they adopted their own special definition of measurement of attributes such as cognitive abilities, as though they were quantities of the type encountered in Newtonian science. Joel Michell has presented a carefully reasoned argument that psychological attributes lack additivity, and therefore cannot be quantities in the same way as the attributes of classical Newtonian physics. In the early decades of the 20th century, quantum theory superseded Newtonian mechanics as the best model of physical reality. This paper gives a brief, critical overview of the evolution of current measurement practices in psychology, and suggests the need for a transition from a Newtonian to a quantum theoretical paradigm for psychological measurement. Finally, a case study is presented that considers the implications of a quantum theoretical model for educational measurement. In particular, it is argued that, since the OECD’s Programme for International Student Assessment (PISA) is predicated on a Newtonian conception of measurement, this may constrain the extent to which it can make accurate comparisons of the achievements of different education systems.
Abstract:
Security is a critical concern around the world. Since resources for security are always limited, much interest has arisen in using game theory to handle security resource allocation problems. However, most existing work does not adequately address how a defender chooses an optimal strategy in a game whose strategy-profile payoffs are absent, inaccurate, uncertain, or even ambiguous. To address this issue, we propose a general framework of security games under ambiguities based on Dempster-Shafer theory and the ambiguity-aversion principle of minimax regret. We then reveal some properties of this framework and present two methods to reduce the influence of complete ignorance. Our investigation shows that this new framework is better at handling security resource allocation problems under ambiguities.
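The minimax-regret principle invoked above can be illustrated with a small sketch. Rows are defender strategies, columns are ambiguous scenarios, and the payoff entries are hypothetical; this shows only the decision rule, not the paper's Dempster-Shafer machinery.

```python
def minimax_regret(payoffs):
    """Index of the strategy minimizing worst-case regret.
    payoffs[i][j] = defender payoff of strategy i in scenario j."""
    n_scen = len(payoffs[0])
    # Best achievable payoff in each scenario.
    best = [max(row[j] for row in payoffs) for j in range(n_scen)]
    # Regret = shortfall from that best, per strategy and scenario.
    regrets = [[best[j] - row[j] for j in range(n_scen)] for row in payoffs]
    worst = [max(r) for r in regrets]
    return min(range(len(payoffs)), key=lambda i: worst[i])

if __name__ == "__main__":
    # Three defender strategies, two ambiguous scenarios (toy numbers).
    payoffs = [[5, 1], [3, 3], [0, 6]]
    print(minimax_regret(payoffs))
```

Note how the rule favours the hedged middle strategy: it is best in neither scenario but can never be regretted by much.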
Abstract:
Threat prevention with limited security resources is a challenging problem. An optimal strategy is to effectively predict attackers' targets (or goals) based on currently available information, and to use such predictions to prevent (or disrupt) their planned attacks. In this paper, we propose a game-theoretic framework to address this challenge which encompasses three elements. First, we design a method to analyze an attacker's types in order to determine the most plausible type of an attacker. Second, we propose an approach to predict the possible targets of an attack and the course of actions that attackers may take even when their types are ambiguous. Third, a game-theoretic strategy is developed to determine the best protection actions for defenders (security resources).
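The first element, scoring attacker types, can be sketched as a simple Bayesian update from observed actions. The type names, priors, and likelihoods below are hypothetical placeholders, not the paper's model.

```python
def most_plausible_type(priors, likelihoods, observed):
    """Return the type maximizing P(type) * P(observed action | type)."""
    scores = {t: priors[t] * likelihoods[t][observed] for t in priors}
    return max(scores, key=scores.get)

if __name__ == "__main__":
    # Illustrative attacker types and action likelihoods.
    priors = {"opportunist": 0.6, "targeted": 0.4}
    likelihoods = {
        "opportunist": {"scan": 0.7, "phish": 0.3},
        "targeted":    {"scan": 0.2, "phish": 0.8},
    }
    print(most_plausible_type(priors, likelihoods, "phish"))
```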
Abstract:
Traditional internal combustion engine vehicles are a major contributor to global greenhouse gas emissions and other air pollutants, such as particulate matter and nitrogen oxides. If the tail-pipe point emissions could be managed centrally without reducing commercial and personal user functionality, then one of the most attractive solutions for achieving a significant reduction of emissions in the transport sector would be the mass deployment of electric vehicles. Though electric vehicle sales are still hindered by battery performance, cost and a few other technological bottlenecks, focused commercialisation and support from government policies are encouraging large-scale electric vehicle adoption. The mass proliferation of plug-in electric vehicles is likely to bring a significant additional electric load onto the grid, creating a highly complex operational problem for power system operators. Electric vehicle batteries also have the ability to act as energy storage points on the distribution system. This double charge-and-storage impact of many uncontrollable small kW loads, as consumers will want maximum flexibility, on a distribution system that was not originally designed for such operations has the potential to be detrimental to grid balancing. Intelligent scheduling methods, if established correctly, could smoothly integrate electric vehicles onto the grid: they would help to avoid cycling large combustion plant, avoid running expensive fossil-fuel peaking plant, match renewable generation to electric vehicle charging, and avoid overloading the distribution system and thereby reducing power quality. In this paper, state-of-the-art scheduling methods for integrating plug-in electric vehicles are reviewed, examined and categorised based on their computational techniques. Various existing approaches covering analytical scheduling, conventional optimisation methods (e.g. linear and non-linear mixed-integer programming and dynamic programming), game theory, and meta-heuristic algorithms including genetic algorithms and particle swarm optimisation are all comprehensively surveyed, offering a systematic reference for grid scheduling considering intelligent electric vehicle integration.
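A minimal instance of the analytical scheduling idea surveyed above is "valley filling": shifting EV charging into the hours of lowest base load. The hourly loads, energy requirement, and charge-rate limit below are illustrative numbers, not drawn from any cited study.

```python
def valley_fill(base_load, energy_needed, charge_rate, step=1.0):
    """Greedily assign charging energy (in `step` blocks) to the
    lowest-load hours, respecting a per-hour charge-rate limit."""
    load = list(base_load)
    schedule = [0.0] * len(load)
    remaining = energy_needed
    while remaining > 1e-9:
        # Hours that still have charging capacity left.
        free = [i for i in range(len(load)) if schedule[i] < charge_rate]
        if not free:
            break
        h = min(free, key=lambda i: load[i])  # cheapest available hour
        put = min(step, charge_rate - schedule[h], remaining)
        schedule[h] += put
        load[h] += put
        remaining -= put
    return schedule

if __name__ == "__main__":
    base = [50, 42, 40, 41, 55, 70]  # hourly base load, kW (toy data)
    print(valley_fill(base, energy_needed=9.0, charge_rate=3.0))
```

The greedy loop naturally flattens the overnight valley while leaving the evening peak untouched, which is the behaviour the more sophisticated optimisation and game-theoretic schedulers generalise.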
Abstract:
This chapter focuses on the relationship between improvisation and indeterminacy. We discuss the two practices by referring to play theory and game studies and situate them in recent network music performance. We develop a parallel with game theory in which indeterminacy is seen as a way of articulating situations where structural decisions are left to the discernment of the performers, and we discuss improvisation as a method of play. The improvisation-indeterminacy relationship is discussed in the context of network music performance, which employs digital networks in the exchange of data between performers and hence relies on topological structures with varying degrees of openness and flexibility. Artists such as Max Neuhaus and The League of Automatic Music Composers initiated the development of a multitude of practices and technologies exploring the network as an environment for music making. Even though the technologies behind "the network" have shifted dramatically since Neuhaus' use of radio in the 1960s, a preoccupation with the distribution and sharing of artistic agency has remained at the centre of networked practices. Gollo Föllmer, after undertaking an extensive review of network music initiatives, produced a typology that comprises categories as diverse as remix lists, sound toys, real/virtual space installations and network performances. For Föllmer, "the term 'Net music' comprises all formal and stylistic kinds of music upon which the specifics of electronic networks leave considerable traces, whereby the electronic networks strongly influence the process of musical production, the musical aesthetic, or the way music is received" (2005: 185).
Abstract:
Two experiments examined identification and bisection of tones varying in temporal duration (Experiment 1) or frequency (Experiment 2). Absolute identification of both durations and frequencies was influenced by prior stimuli and by stimulus distribution. Stimulus distribution influenced bisection for both stimulus types consistently, with more positively skewed distributions producing lower bisection points. The effect of distribution was greater when the ratio of the largest to smallest stimulus magnitude was greater. A simple mathematical model, temporal range frequency theory, was applied. It is concluded that (a) similar principles describe identification of temporal durations and other stimulus dimensions and (b) temporal bisection point shifts can be understood in terms of psychophysical principles independently developed in nontemporal domains, such as A. Parducci's (1965) range frequency theory.
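Parducci's range-frequency model applied above admits a compact sketch: the judgment of a stimulus blends its range value R (position within the stimulus range) with its frequency value F (its rank in the presented distribution). The stimulus set and the weight w = 0.5 are illustrative; the model assumes distinct stimulus magnitudes.

```python
def range_frequency(stimuli, w=0.5):
    """Range-frequency judgments for a list of distinct magnitudes:
    J = w * R + (1 - w) * F, each scaled to [0, 1]."""
    lo, hi = min(stimuli), max(stimuli)
    ranks = {s: i for i, s in enumerate(sorted(stimuli))}
    n = len(stimuli)
    out = []
    for s in stimuli:
        R = (s - lo) / (hi - lo)        # position within the range
        F = ranks[s] / (n - 1)          # rank within the distribution
        out.append(w * R + (1 - w) * F)
    return out

if __name__ == "__main__":
    # A positively skewed set: most stimuli crowd the low end.
    print(range_frequency([100, 120, 140, 160, 400]))
```

The sketch reproduces the qualitative finding above: in this positively skewed set, the stimulus 160, far below the arithmetic midpoint of 250, is already judged near 0.5, so the bisection point falls lower than it would for a symmetric distribution.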
Abstract:
Quantitative examination of prostate histology offers clues in the diagnostic classification of lesions and in the prediction of response to treatment and prognosis. To facilitate the collection of quantitative data, the development of machine vision systems is necessary. This study explored the use of imaging for identifying tissue abnormalities in prostate histology. Medium-power histological scenes were recorded from whole-mount radical prostatectomy sections at × 40 objective magnification and assessed by a pathologist as exhibiting stroma, normal tissue (nonneoplastic epithelial component), or prostatic carcinoma (PCa). A machine vision system was developed that divided the scenes into subregions of 100 × 100 pixels and subjected each to image-processing techniques. Analysis of morphological characteristics allowed the identification of normal tissue. Analysis of image texture demonstrated that Haralick feature 4 was the most suitable for discriminating stroma from PCa. Using these morphological and texture measurements, it was possible to define a classification scheme for each subregion. The machine vision system is designed to integrate these classification rules and generate digital maps of tissue composition from the classification of subregions; 79.3% of subregions were correctly classified. Established classification rates have demonstrated the validity of the methodology on small scenes; a logical extension was to apply the methodology to whole slide images via scanning technology. The machine vision system is capable of classifying these images. The machine vision system developed in this project facilitates the exploration of morphological and texture characteristics in quantifying tissue composition. It also illustrates the potential of quantitative methods to provide highly discriminatory information in the automated identification of prostatic lesions using computer vision.
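The texture measurements above rest on grey-level co-occurrence matrices (GLCMs), from which Haralick statistics are computed per subregion. A minimal pure-Python sketch follows; the 4x4 patch is illustrative, and contrast is shown rather than the specific Haralick feature 4 the study selected.

```python
def glcm(patch, levels, dx=1, dy=0):
    """Normalized grey-level co-occurrence matrix for offset (dx, dy)."""
    P = [[0.0] * levels for _ in range(levels)]
    h, w = len(patch), len(patch[0])
    total = 0
    for y in range(h - dy):
        for x in range(w - dx):
            P[patch[y][x]][patch[y + dy][x + dx]] += 1
            total += 1
    return [[c / total for c in row] for row in P]

def contrast(P):
    """Haralick contrast: sum over i, j of P[i][j] * (i - j)^2."""
    return sum(P[i][j] * (i - j) ** 2
               for i in range(len(P)) for j in range(len(P)))

if __name__ == "__main__":
    # A toy 4x4 subregion quantized to 4 grey levels.
    patch = [[0, 0, 1, 1],
             [0, 0, 1, 1],
             [2, 2, 3, 3],
             [2, 2, 3, 3]]
    print(contrast(glcm(patch, levels=4)))
```

In a machine vision pipeline of the kind described, each 100 x 100 pixel subregion would yield one such feature vector, which the classification rules then map to stroma, normal tissue, or PCa.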
Abstract:
The removal of O through water formation is an important process in the Fischer-Tropsch synthesis. In this study, both steps in water formation (O + H → OH, OH + H → H2O) are studied on stepped Co(0001) at high coverages using density functional theory. We find the following. (i) In both the O-O and O-OH co-adsorption systems, two transition states (TSs) were located for the O hydrogenation: in one TS both O and H are on the same terrace, and in the other they are at the interface between the step edge and the terrace below. (ii) In both systems, the O hydrogenation at the interface is easier (E_a = 0.32 eV in the O-O system, E_a = 1.10 eV in the O-OH system) than on the same terrace (E_a = 1.49 eV in the O-O system, E_a = 1.80 eV in the O-OH system). (iii) In both systems, only one TS for the OH hydrogenation was located, in which both OH and H are on the same terrace. (iv) Compared to the OH hydrogenation in the O-OH system (E_a = 1.46 eV), the reaction in the OH-OH system (E_a = 0.64 eV) is much easier. The barrier differences and the effect of water on the Fischer-Tropsch synthesis are discussed. A possible low-barrier route for water formation is proposed.
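The practical weight of these barrier differences can be gauged with a simple Arrhenius factor exp(-E_a / kT). Assuming a common prefactor for both steps (so only rate ratios are meaningful) and an illustrative temperature of 500 K:

```python
import math

K_B = 8.617333e-5  # Boltzmann constant in eV/K

def rate_ratio(ea_low, ea_high, T=500.0):
    """How much faster the low-barrier step is at temperature T (K),
    assuming identical Arrhenius prefactors for both steps."""
    return math.exp((ea_high - ea_low) / (K_B * T))

if __name__ == "__main__":
    # OH + H -> H2O: OH-OH system (0.64 eV) vs O-OH system (1.46 eV).
    print(f"{rate_ratio(0.64, 1.46):.2e}")
```

A 0.82 eV barrier difference translates into roughly eight orders of magnitude in rate at 500 K, which is why the OH-OH pathway dominates the proposed low-barrier route.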