Abstract:
Placing a child in out-of-home care is one of the most important decisions made by professionals in the child care system, with substantial social, psychological, educational, medical and economic consequences. This paper considers the challenges and difficulties of building statistical models of this decision by reviewing the available international evidence. Despite the large number of empirical investigations over a 50-year period, a consensus on the variables associated with this decision is hard to identify. In addition, the individual models have low explanatory and predictive power and should not be relied on to make placement decisions. A number of reasons for this poor performance are offered, and some ways forward are suggested. This paper also aims to facilitate the emergence of a coherent and integrated international literature from the disconnected and fragmented empirical studies. Rather than one placement problem, there are many slightly different problems, and therefore it is expected that a number of related sub-literatures will emerge, each concentrating on a particular definition of the placement problem.
Abstract:
We examine to what degree we can expect to obtain accurate temperature trends for the last two decades near the surface and in the lower troposphere. We compare temperatures obtained from surface observations and radiosondes as well as satellite-based measurements from the Microwave Sounding Units (MSU), which have been adjusted for orbital decay and non-linear instrument-body effects, and reanalyses from the European Centre for Medium-Range Weather Forecasts (ERA) and the National Centers for Environmental Prediction (NCEP). In regions with abundant conventional data coverage, where the MSU has no major influence on the reanalysis, temperature anomalies obtained from microwave sounders, radiosondes and from both reanalyses agree reasonably well. Where coverage is insufficient, in particular over the tropical oceans, large differences are found between the MSU and either reanalysis. These differences apparently relate to changes in satellite data availability and to differing satellite retrieval methodologies, to which both reanalyses are quite sensitive over the oceans. For NCEP, this results from the use of raw radiances directly incorporated into the analysis, which makes the reanalysis sensitive to changes in the underlying algorithms, e.g. those introduced in August 1992. For ERA, the bias correction of the one-dimensional variational analysis may introduce an error when the satellite relative to which the correction is calculated is itself biased, or when radiances change on a time scale longer than a couple of months, e.g. due to orbit decay. ERA inhomogeneities are apparent in April 1985, October/November 1986 and April 1989. These dates can be identified with the replacements of satellites. It is possible that a negative bias in the sea surface temperatures (SSTs) used in the reanalyses may have been introduced over the period of the satellite record.
This could have resulted from a decrease in the number of ship measurements, a concomitant increase in the importance of satellite-derived SSTs, and a likely cold bias in the latter. Alternatively, a warm bias in SSTs could have been caused by an increase in the percentage of buoy measurements (relative to deeper ship intake measurements) in the tropical Pacific. No indications of uncorrected inhomogeneities in land surface temperatures were found. Near-surface temperatures have biases in the boundary layer in both reanalyses, presumably due to the incorrect treatment of snow cover. The increase of near-surface relative to lower tropospheric temperatures in the last two decades may be due to a combination of several factors, including high-latitude near-surface winter warming due to an enhanced NAO and upper-tropospheric cooling due to stratospheric ozone decrease.
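The bias-correction pitfall the abstract describes can be sketched numerically. This is a hypothetical illustration with invented numbers, not the ERA 1D-Var scheme itself: calibrating a new instrument against a reference record that is itself biased simply transfers the reference bias into the "corrected" series.

```python
# Hypothetical sketch (invented numbers, not the paper's data): bias-correcting
# a new satellite record against a reference that is itself biased propagates
# the reference bias into the corrected record.
truth = [0.1 * k for k in range(10)]     # true anomaly series (assumed)
ref = [t + 0.3 for t in truth]           # reference satellite, warm-biased
new = [t - 0.2 for t in truth]           # new satellite, cold-biased

# offset estimated over an overlap period against the biased reference
offset = sum(r - s for r, s in zip(ref, new)) / len(ref)
corrected = [s + offset for s in new]

# the mean residual against truth equals the reference bias, not zero
residual = sum(c - t for c, t in zip(corrected, truth)) / len(truth)
print(round(residual, 6))  # 0.3 -- the reference's warm bias survives
```

The corrected series is now internally consistent with the reference but offset from the truth by exactly the reference bias, which is why the abstract stresses the choice of the satellite relative to which the correction is computed.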
Abstract:
A system for continuous data assimilation is presented and discussed. To simulate the dynamical development, a channel version of a balanced barotropic model is used, and geopotential (height) data are assimilated into the model's computations as they become available. In the first experiment the updating is performed every 24, 12 and 6 hours with a given network. The stations are distributed at random in 4 groups in order to simulate 4 areas with different densities of stations. Optimum interpolation is performed for the difference between the forecast and the valid observations. The RMS error of the analyses is reduced over time, and the error is smaller the more frequently the updating is performed. Updating every 6 hours yields an analysis error smaller than the RMS error of the observations. In a second experiment the updating is performed with data from a moving satellite with a side-scan capability of about 15°. If the satellite data are analysed at every time step before they are introduced into the system, the analysis error falls below the RMS error of the observations after only 24 hours, and the result as a whole is better than updating from a fixed network. If the satellite data are introduced without any modification, the analysis error is reduced much more slowly, and it takes about 4 days to reach a result comparable to the one where the data have been analysed.
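The central result — that frequent optimum-interpolation updates drive analysis error below the observation error — can be sketched with a scalar assimilation cycle. This is a minimal illustration with assumed error statistics, not the paper's barotropic-channel experiment: the analysis blends forecast and observation weighted by their error variances, and the cycle converges to an analysis error smaller than the observation error.

```python
import math

# Minimal sketch (assumed numbers, not the paper's experiment): a scalar
# optimum-interpolation update cycle. Forecast error grows between updates;
# each update blends forecast and observation by their error variances.
sigma_o = 1.0    # observation error std (assumed)
growth = 1.5     # forecast error growth factor per cycle (assumed)
sigma_f = 3.0    # initial forecast error std (assumed)

for cycle in range(10):
    sigma_f *= growth                                 # error grows in the forecast
    w = sigma_f**2 / (sigma_f**2 + sigma_o**2)        # OI weight on the innovation
    sigma_a2 = (1.0 - w) * sigma_f**2                 # analysis error variance
    sigma_f = math.sqrt(sigma_a2)                     # analysis is next first guess

print(sigma_f < sigma_o)  # True: cycling yields analysis error below obs error
```

A single update already gives an analysis variance of sigma_f² sigma_o² / (sigma_f² + sigma_o²), which is below sigma_o²; cycling balances this reduction against the growth between updates, so more frequent updating (smaller effective growth per cycle) lowers the equilibrium error, as in the 6-hourly case.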
Abstract:
An analysis of the attribution of past and future changes in stratospheric ozone and temperature to anthropogenic forcings is presented. The analysis is an extension of the study of Shepherd and Jonsson (2008) who analyzed chemistry-climate simulations from the Canadian Middle Atmosphere Model (CMAM) and attributed both past and future changes to changes in the external forcings, i.e. the abundances of ozone-depleting substances (ODS) and well-mixed greenhouse gases. The current study is based on a new CMAM dataset and includes two important changes. First, we account for the nonlinear radiative response to changes in CO2. It is shown that over centennial time scales the radiative response in the upper stratosphere to CO2 changes is significantly nonlinear and that failure to account for this effect leads to a significant error in the attribution. To our knowledge this nonlinearity has not been considered before in attribution analysis, including multiple linear regression studies. For the regression analysis presented here the nonlinearity was taken into account by using CO2 heating rate, rather than CO2 abundance, as the explanatory variable. This approach yields considerable corrections to the results of the previous study and can be recommended to other researchers. Second, an error in the way the CO2 forcing changes are implemented in the CMAM was corrected, which significantly affects the results for the recent past. As the radiation scheme, based on Fomichev et al. (1998), is used in several other models we provide some description of the problem and how it was fixed.
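The abstract's regression fix — using CO2 heating rate rather than CO2 abundance as the explanatory variable — can be illustrated with synthetic numbers. This is not the CMAM analysis; it is a toy example in which the temperature response is nonlinear in abundance but linear in a heating-rate-like variable (proxied here by log-abundance, since CO2 radiative forcing is approximately logarithmic), so regressing on the right variable recovers the response coefficient exactly.

```python
import math

# Hypothetical illustration (invented series, not CMAM output): the response
# is nonlinear in CO2 abundance but linear in a heating-rate-like variable.
co2 = [280 + 10 * k for k in range(30)]             # synthetic abundance series
temp = [-2.0 * math.log(c / 280.0) for c in co2]    # nonlinear cooling response

def ols_slope(x, y):
    """Ordinary least-squares slope of y regressed on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

# Regressing on log-abundance (heating-rate-like) recovers the true coefficient.
logc = [math.log(c / 280.0) for c in co2]
slope = ols_slope(logc, temp)
print(round(slope, 6))  # -2.0
```

Regressing the same response directly on abundance would leave structured residuals and a biased coefficient, which is the attribution error the study corrects for over centennial time scales.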
Abstract:
Considerable progress has taken place in numerical weather prediction over the last decade. It has been possible to extend predictive skill in the extra-tropics of the Northern Hemisphere during winter from less than five days to seven days. Similar improvements, albeit at a lower level, have taken place in the Southern Hemisphere. Another example of improvement in the forecasts is the prediction of intense synoptic phenomena such as cyclogenesis, which on the whole is quite successful with the most advanced operational models (Bengtsson, 1989; Gadd and Kruze, 1988). A careful examination shows that there is no single cause for the improvements in predictive skill; instead they are due to several different factors encompassing the forecasting system as a whole (Bengtsson, 1985). In this paper we focus our attention on the role of data assimilation and the effect it may have on reducing the initial error and hence improving the forecast. The first part of the paper contains a theoretical discussion of error growth in simple data assimilation systems, following Leith (1983). In the second part we apply the results to actual forecast data from ECMWF. The potential for further forecast improvements within the framework of the present observing system in the two hemispheres is also discussed.
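The kind of simple error-growth model discussed in the Leith (1983) framework can be sketched as a logistic equation: small initial errors grow roughly exponentially, then saturate at the climatological error level. The parameters below (doubling time, initial error) are assumptions for illustration, not values from the paper.

```python
import math

# Minimal sketch (assumed parameters, not the paper's values) of logistic
# forecast-error growth: dE/dt = a * E * (1 - E / E_inf).
a = math.log(2) / 2.0   # growth rate for an assumed 2-day error doubling time
E_inf = 1.0             # saturation (climatological) error level, normalized
E = 0.05                # initial analysis error (assumed)
dt = 0.25               # time step in days

t = 0.0
while E < 0.95 * E_inf:
    E += dt * a * E * (1.0 - E / E_inf)   # forward-Euler logistic growth step
    t += dt
print(t)  # days until forecast error reaches 95% of saturation
```

In this picture, better data assimilation lowers the initial error E, and because early growth is exponential, each halving of the initial error buys roughly one doubling time of additional predictive skill — the mechanism by which assimilation improvements translate into the extended skill described above.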
Abstract:
Within generative L2 acquisition research there is a longstanding debate as to what underlies observable differences in L1/L2 knowledge/performance. On the one hand, Full Accessibility approaches maintain that target L2 syntactic representations (new functional categories and features) are acquirable (e.g., Schwartz & Sprouse, 1996). Conversely, Partial Accessibility approaches claim that L2 variability and/or optionality, even at advanced levels, obtains as a result of inevitable deficits in L2 narrow syntax and is conditioned upon a maturational failure in adulthood to acquire (some) new functional features (e.g., Beck, 1998; Hawkins & Chan, 1997; Hawkins & Hattori, 2006; Tsimpli & Dimitrakopoulou, 2007). The present study tests the predictions of these two sets of approaches with advanced English learners of L2 Brazilian Portuguese (n = 21) in the domain of inflected infinitives. These advanced L2 learners reliably differentiate syntactically between finite verbs, uninflected and inflected infinitives, which, as argued, only supports Full Accessibility approaches. Moreover, we discuss how testing the domain of inflected infinitives is especially interesting in light of recent proposals that Brazilian Portuguese colloquial dialects no longer actively instantiate them (Lightfoot, 1991; Pires, 2002, 2006; Pires & Rothman, 2009; Rothman, 2007).
Abstract:
An unusually strong and prolonged stratospheric sudden warming (SSW) in January 2006 was the first major SSW for which globally distributed long-lived trace gas data are available covering the upper troposphere through the lower mesosphere. We use data from the Aura Microwave Limb Sounder (MLS) and the Atmospheric Chemistry Experiment-Fourier Transform Spectrometer (ACE-FTS), the SLIMCAT Chemistry Transport Model (CTM), and assimilated meteorological analyses to provide a comprehensive picture of transport during this event. The upper tropospheric ridge that triggered the SSW was associated with an elevated tropopause and layering in trace gas profiles in conjunction with stratospheric and tropospheric intrusions. Anomalous poleward transport (with corresponding quasi-isentropic troposphere-to-stratosphere exchange at the lowest levels studied) in the region over the ridge extended well into the lower stratosphere. In the middle and upper stratosphere, the breakdown of the polar vortex transport barrier was seen in a signature of rapid, widespread mixing in trace gases, including CO, H2O, CH4 and N2O. The vortex broke down slightly later and more slowly in the lower than in the middle stratosphere. In the middle and lower stratosphere, small remnants with trace gas values characteristic of the pre-SSW vortex lingered through the weak and slow recovery of the vortex. The upper stratospheric vortex quickly reformed, and, as enhanced diabatic descent set in, CO descended into this strong vortex, echoing the fall vortex development. Trace gas evolution in the SLIMCAT CTM agrees well with that in the satellite trace gas data from the upper troposphere through the middle stratosphere.
In the upper stratosphere and lower mesosphere, the SLIMCAT simulation does not capture the strong descent of mesospheric CO and H2O values into the reformed vortex; this poor CTM performance in the upper stratosphere and lower mesosphere results primarily from biases in the diabatic descent in assimilated analyses.
Abstract:
In Britain, substantial cuts in police budgets alongside controversial handling of incidents such as politically sensitive enquiries, public disorder and relations with the media have recently triggered much debate about public knowledge of and trust in the police. To date, however, little academic research has investigated how knowledge of police performance affects citizens’ trust. We address this long-standing lacuna by exploring citizens’ trust before and after exposure to real performance data in the context of a British police force. The results reveal that being informed of performance data affects citizens’ trust significantly. Furthermore, the direction and degree of change in trust are related to variations across the different elements of the reported performance criteria. Interestingly, the volatility of citizens’ trust is related to initial performance perceptions (such that citizens with low initial perceptions of police performance react more strongly to evidence of both good and bad performance than citizens with high initial perceptions), and citizens’ intentions to support the police do not always correlate with their cognitive and affective trust towards the police. In discussing our findings, we explore the implications of how being transparent with performance data can both hinder and help the development of citizens’ trust towards a public organisation such as the police. From our study, we pose a number of ethical challenges that practitioners face when deciding what data to highlight, to whom, and for what purpose.
Abstract:
In this paper we consider transcripts which originated from a practical series of Turing’s Imitation Game held on 23rd June 2012 at Bletchley Park, England. In some cases the tests involved a 3-participant simultaneous comparison of two hidden entities, whereas others were the result of a direct 2-participant interaction. Each of the transcripts considered here resulted in a human interrogator being fooled, by a machine, into concluding that they had been conversing with a human. Particular features of the conversation are highlighted, successful ploys on the part of each machine are discussed, and likely reasons for the interrogator being fooled are considered. Subsequent feedback from the interrogators involved is also included.
Abstract:
In the literature on achievement goals, performance-approach goals (striving to do better than others) and performance-avoidance goals (striving to avoid doing worse than others) tend to exhibit a moderate to high correlation, raising questions about whether the 2 goals represent distinct constructs. In the current article, we sought to examine the separability of these 2 goals using a broad factor-analytic approach that attended to issues that have been overlooked or underexamined in prior research. Five studies provided strong evidence for the separation of these 2 goal constructs: Separation was observed not only with exploratory factor analysis across different age groups and countries (Studies 1a and 1b) but also with change analysis (Study 2), ipsative factor analysis (Study 3), within-person analysis (Study 4), and behavioral genetics analysis (Study 5). We conclude by discussing the implications of the present research for the achievement goal literature, as well as the psychological literature in general.
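The separability question the studies address can be sketched with a toy simulation: none of this is the studies' data, and the loadings and correlations are invented. Two correlated latent goal factors each generate their own items; within-factor item correlations then exceed between-factor ones, which is the covariance pattern a factor analysis exploits to keep the constructs distinct even when the factors themselves correlate moderately.

```python
import math
import random

# Hypothetical simulation (invented parameters, not the studies' data):
# two correlated latent factors, two items each.
random.seed(0)
n = 5000

def corr(x, y):
    """Pearson correlation of two equal-length lists."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    sx = math.sqrt(sum((v - mx) ** 2 for v in x))
    sy = math.sqrt(sum((v - my) ** 2 for v in y))
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

items = {k: [] for k in ("ap1", "ap2", "av1", "av2")}
for _ in range(n):
    g = random.gauss(0, 1)                  # shared variance -> factor correlation
    ap = 0.6 * g + 0.8 * random.gauss(0, 1) # approach-goal latent score
    av = 0.6 * g + 0.8 * random.gauss(0, 1) # avoidance-goal latent score
    items["ap1"].append(ap + 0.5 * random.gauss(0, 1))
    items["ap2"].append(ap + 0.5 * random.gauss(0, 1))
    items["av1"].append(av + 0.5 * random.gauss(0, 1))
    items["av2"].append(av + 0.5 * random.gauss(0, 1))

within = corr(items["ap1"], items["ap2"])    # same construct, ~0.8 here
between = corr(items["ap1"], items["av1"])   # across constructs, ~0.3 here
print(within > between)  # True: distinct factors despite correlated constructs
```

The point mirrors the abstract's argument: a moderate-to-high correlation between the two goals is compatible with a clear two-factor structure, because separability depends on the pattern of item covariances, not on the factor correlation alone.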
Abstract:
The recent change in funding structure in the UK higher education system has fuelled an animated debate about the role that arts and humanities (A&H) subjects play not only within higher education but more broadly in society and the economy. The debate has engaged with a variety of arguments and perspectives, from the intrinsic value of A&H, to their contribution to the broader society and their economic impact, particularly in relation to the creative economy, through knowledge exchange activities. The paper argues that in the current debate very little attention has been paid to the role that A&H graduates play in the economy, through their work after graduation, and specifically in the creative economy. Using Higher Education Statistics Agency data, we analyse the performance of A&H graduates (compared with other graduates) and particularly explore how embedded they are in the creative economy and its associated industries. The results highlight a complex intersection of different subdisciplines of the A&H with the creative economy but also reveal the salary gap and unstable working conditions experienced by graduates in this field.
Abstract:
The narrative of Rosemary’s Baby hinges on a central hesitation between pregnancy-induced madness and the existence of Satanism. Accordingly, the monstrous element is embodied in both the real and the supernatural: Rosemary’s husband Guy (John Cassavetes) is responsible for her victimisation through rape in either explanation. However, I will argue that the inherent ambiguity of the plot makes it difficult to place him as a figure typical of the archetypal horror binaries of normality/monster, human/inhuman. By displacing generic convention the film complicates the issue of monstrosity, whilst simultaneously offering the possibility for the depiction of the female experience of marriage to be at the centre of the narrative, for the real to be possibly of more significance than the supernatural. Previous writing has tended to concentrate on Rosemary and her pregnancy, so through detailed consideration of Cassavetes’ performance and its placement in the mise-en-scène this focus on Guy aims to demonstrate that he changes almost as much as Rosemary does. The chapter will focus on the film’s depiction of rape, during Rosemary’s nightmare and after it, in order to demonstrate how the notion of performance reveals Guy’s monstrousness and the difficulties this represents in our engagement with him.
Abstract:
The Back to the Future Trilogy incorporates several different generic elements, including aspects of the fifties teen movie, science fiction, comedy and the western. These different modes playfully intertwine with each other, creating a complex world of repetitions, echoes and modulations. This essay seeks to interrogate the construction of generic elements and the play between them through a close analysis of a repeated performance. Genre is signalled through various strategies employed within the construction of mise-en-scène, and a significant portion of this, as I would like to argue, is transmitted through performance. The material detail of a performance – incorporating gesture, movement, voice, and even surrounding elements such as costume – as well as the way it is presented within a film is key to the establishment, invocation and coherence of genre. Furthermore, attention to the complexity of performance details, particularly in the manner in which they reverberate across texts, demonstrates the intricacy of genre and its inherent mutability. The Back to the Future trilogy represents a specific interest in the flexibility of genre. Within each film, and especially across all three, aspects of various genres are interlaced through both visual and narrative detail, thus constructing a dense layer of references both within and without the texts. To explore this patterning in more detail I will interrogate the contribution of performance to generic play through close analysis of Thomas F. Wilson’s performance of Biff/Griff/Burford Tannen and his central encounter with Marty McFly (Michael J. Fox) in each film. These moments take place in a fifties diner, a 1980s retro diner and a saloon respectively, each space contributing to the similarities and differences in each repetition.
Close attention to Wilson’s performance of each related character, which contains both modulations and repetitions used specifically to place each film’s central generic theme, demonstrates how embedded the play between genres and their flexibility is within the trilogy.
Abstract:
We extend the theory of the multinational enterprise (MNE) by exploring the concept of subsidiary-specific advantages (SSAs) as a driver of subsidiary performance. We investigate the relationship of host country-specific advantages (host CSAs) in the form of market attractiveness, SSAs and subsidiary sales as they affect subsidiary performance. From an original primary dataset of 101 British MNE subsidiaries in six South East Asian countries, our analysis reveals three significant findings. First, host market attractiveness has a statistically significant positive impact on the performance of subsidiaries. Second, the three traditional SSAs of general management, marketing capabilities and invested capital enhance subsidiary performance. Third, we examine the geographic direction and types of customers for subsidiary sales by following international accounting standards. We find that these subsidiaries generate on average 95 percent of total sales from the Asia Pacific region and 91 percent of total sales from external customers. Our findings have important research and managerial implications.