36 results for Gone Girl
Abstract:
This is a primarily meta-critical essay and is in direct dialogue with the argument presented in my essay on Gone with the Wind (published in ‘Screen’). Here, however, the emphasis is much more on the challenge my definition of spectacle represents for important traditions of film analysis, particularly ‘mise-en-scène criticism’. I argue for the possibility of spectacle forming part of the ‘organic’ whole of a film’s texture and form, while noting the challenge the concept poses (by dint of certain ideological associations and its taint of commercialism) to ‘organicist’ traditions of interpretative film analysis.
Abstract:
This article presents findings and seeks to establish the theoretical markers that indicate the growing importance of fact-based drama in screen and theatre performance to the wider Anglophone culture. During the final decade of the twentieth century and the opening one of the twenty-first, television docudrama and documentary theatre have grown in visibility and importance in the UK, providing key responses to social, cultural and political change over the millennial period. Actors were the prime focus for the enquiry, principally because so little research has been done into the special demands that fact-based performance makes on them. The main emphasis in actor training (in the UK at any rate) is, as it always has been, on preparation for fictional drama. Preparation in acting schools is also heavily geared towards stage performance. Our thesis was that performers called upon to play the roles of real people, in whatever medium, have added responsibilities both towards history and towards real individuals and their families. Actors must engage with ethical questions whether they like it or not, and we found them keenly aware of this. In the course of the research, we conducted 30 interviews with a selection of actors ranging from the experienced to the recently trained. We also interviewed a few industry professionals and actor trainers. Once the interviews started, it was clear that actors themselves made little or no distinction between how they set about their work for television and film. The essential disciplines for work in front of the camera, they told us, are the same whether the camera is electronic or photographic. Some adjustments become necessary, of course, in the multi-camera TV studio. But much serious drama for the screen is made on film anyway. We found it was also the case that young actors now tend to get their first paid employment before a camera rather than on a stage.
The screen-before-stage tendency, along with the fundamental re-shaping that has gone on in the British theatre since at least the early 1980s, had implications for actor training. We also found that theatre work still tends to be most valued by actors. For all the actors we interviewed, theatre was what they liked doing best because it was there they could practice and develop their skills, there they could work most collectively towards performance, and there they could more directly experience audience feedback in the real time of the stage play. The current world of television has been especially constrained in regard to rehearsal time in comparison to theatre (and, to a lesser extent, film). This too has affected actors’ valuation of their work. Theatre is, and is not, the most important medium in which they find work. Theatre is most important spiritually and intellectually, because theatre is collaborative, intensive, and involving; theatre is not as important in financial and career terms, because it is not as lucrative and not as visible to a large public as acting for the screen. Many actors took the view that, for all the industrial differences that do affect them and inevitably interest the academic, acting for the visible media of theatre, film and television involved fundamentally the same process with slightly different emphases.
Abstract:
WAGGGS, the World Association of Girl Guides and Girl Scouts, is the umbrella organization for Member Organizations from 145 countries around the world. As such, one of its remits is to provide programmes that promote leadership development and opportunities for girls and young women to advocate on issues they care about. One way WAGGGS is exploring to do this more widely and efficiently is through the use of digital technologies. This paper presents the results of an audit of the technologies already used by potential participants in online communities and courses, and investigates the challenges faced in using technology to facilitate learning in this context.
Abstract:
This paper constructs a housing market model to analyse conditions for different generations of households in the UK. Previous policy work has suggested that baby-boomers have benefitted at the expense of younger generations. The model relies on a form of financial accelerator in which existing homeowners reinvest a proportion of the capital gains on moving home. The model is extended to look at homeownership probabilities. It also explains why an increasing share of mortgages has gone to existing owners, despite market liberalisation and securitisation. In addition, the model contributes to the explanation of volatility.
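The "financial accelerator" mechanism the abstract describes, in which existing homeowners reinvest a proportion of capital gains when moving, can be sketched in code. This is a purely illustrative toy model, not the paper's actual specification; the function name, parameters and values are all hypothetical:

```python
# Illustrative financial-accelerator toy model (all parameters hypothetical):
# movers reinvest a fraction of their capital gains, amplifying price growth.

def simulate_prices(periods=20, p0=100.0, income_growth=0.02,
                    reinvest_rate=0.3, demand_elasticity=0.5):
    """Toy house-price path with a capital-gains feedback loop."""
    prices = [p0]
    last_purchase_price = p0
    for _ in range(periods):
        p = prices[-1]
        # Capital gain (as a share of price) realised by a mover who
        # bought at last period's price and sells at p.
        gain_share = max(p - last_purchase_price, 0.0) / p
        # Demand pressure = fundamental growth + reinvested-gains feedback.
        growth = income_growth + demand_elasticity * reinvest_rate * gain_share
        prices.append(p * (1.0 + growth))
        last_purchase_price = p
    return prices

with_feedback = simulate_prices(reinvest_rate=0.3)
without_feedback = simulate_prices(reinvest_rate=0.0)
print(with_feedback[-1], without_feedback[-1])
```

Switching the reinvestment rate off reduces the model to pure income-driven growth, so comparing the two runs isolates the amplification the feedback adds.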
Abstract:
The World Association of Girl Guides and Girl Scouts (WAGGGS) is the umbrella organisation for Member Organisations from 145 countries around the world, with a total membership of ten million. While Member Organisations offer training and development within their own countries, WAGGGS offers international opportunities. This project seeks to explore how technology can be used to offer similar opportunities to those provided by the face-to-face courses to a much wider audience, while retaining the community and interactive learning aspects of the existing programmes.
Abstract:
This study documents the size and nature of “Hindu-Muslim” and “boy-girl” gaps in children’s school participation and attainments in India. Individual-level data from two successive rounds of the National Sample Survey suggest that considerable progress has been made in decreasing the Hindu-Muslim gap. Nonetheless, the gap remains sizable even after controlling for numerous socio-economic and parental covariates, and the Muslim educational disadvantage in India today is greater than that experienced by girls and Scheduled Caste Hindu children. A gender gap still appears within as well as between communities, though it is smaller within Muslim communities. While differences in gender and other demographic and socio-economic covariates have recently become more important in explaining the Hindu-Muslim gap, those differences altogether explain only 25 percent to 45 percent of the observed schooling gap.
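The abstract does not name the method used to compute the "25 percent to 45 percent explained" figure; the share of a group gap attributable to observed covariates is conventionally obtained with a Blinder–Oaxaca-type decomposition, sketched here on synthetic data (all variables and values hypothetical):

```python
import numpy as np

# Blinder-Oaxaca-style decomposition sketch (synthetic data, hypothetical
# variables): how much of a group gap in schooling outcome y is "explained"
# by differences in observed covariates X (e.g. parental education)?
rng = np.random.default_rng(1)

def fit(X, y):
    """OLS coefficients, with intercept prepended."""
    Z = np.column_stack([np.ones(len(X)), X])
    return np.linalg.lstsq(Z, y, rcond=None)[0]

n = 1000
x_a = rng.normal(0.5, 1.0, size=(n, 2))   # group A covariates
x_b = rng.normal(0.0, 1.0, size=(n, 2))   # group B covariates
y_a = 1.0 + x_a @ np.array([0.6, 0.4]) + rng.normal(0, 1, n)
y_b = 0.6 + x_b @ np.array([0.6, 0.4]) + rng.normal(0, 1, n)

gap = y_a.mean() - y_b.mean()
beta_b = fit(x_b, y_b)
# "Explained" part: gap attributable to covariate differences,
# evaluated at group B's coefficients; the remainder is "unexplained".
explained = (x_a.mean(axis=0) - x_b.mean(axis=0)) @ beta_b[1:]
share = explained / gap
print(gap, share)
```

Here the covariates account for only part of the mean gap; the intercept difference plays the role of the residual disadvantage that survives controlling for covariates.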
Abstract:
Bangladesh Rural Advancement Committee (BRAC), a non-governmental organisation (NGO), runs a large number of non-formal primary schools in Bangladesh which target out-of-school children from poor families. These schools are well-known for their effectiveness in closing the gender gap in primary school enrolment. On the other hand, registered non-government secondary madrasas (or Islamic schools) today enrol one girl for every boy student. In this article, we document a positive spillover effect of BRAC schools on female secondary enrolment in registered madrasas. Drawing upon school enrolment data aggregated at the region level, we first show that regions that had more registered madrasas experienced greater secondary female enrolment growth during 1999–2003, holding the number of secular secondary schools constant. In this context we test the impact of BRAC-run primary schools on female enrolment in registered madrasas. We deal with the potential endogeneity of placement of BRAC schools using an instrumental variable approach. Controlling for factors such as local-level poverty, road access and distance from major cities, we show that regions with a greater presence of BRAC schools have higher female enrolment growth in secondary madrasas. The effect is much bigger than the corresponding effect on secondary schools.
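The abstract names an instrumental-variable strategy without giving the estimating equations. A generic two-stage least squares (2SLS) sketch on synthetic data shows the mechanics; the variable names and the data-generating process are hypothetical, not the paper's:

```python
import numpy as np

# Generic two-stage least squares (2SLS) sketch on synthetic data.
# y: female madrasa enrolment growth; x: BRAC-school presence (endogenous);
# z: instrument correlated with x but not with the confounder u.
rng = np.random.default_rng(0)
n = 500
z = rng.normal(size=n)                              # instrument
u = rng.normal(size=n)                              # unobserved confounder
x = 0.8 * z + 0.5 * u + rng.normal(size=n)          # endogenous regressor
y = 1.0 + 2.0 * x + 1.5 * u + rng.normal(size=n)    # true effect of x is 2.0

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

ones = np.ones(n)
# Stage 1: project x on the instrument (plus constant), keep fitted values.
a = ols(np.column_stack([ones, z]), x)
x_hat = a[0] + a[1] * z
# Stage 2: regress y on the fitted values.
beta_2sls = ols(np.column_stack([ones, x_hat]), y)[1]
# Naive OLS is biased upward by the confounder u;
# 2SLS should land near the true coefficient of 2.0.
beta_ols = ols(np.column_stack([ones, x]), y)[1]
print(beta_ols, beta_2sls)
```

The first stage purges x of its correlation with the confounder, which is exactly what instrumenting BRAC-school placement is meant to achieve.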
Abstract:
A series of numerical models has been used to investigate the predictability of atmospheric blocking for an episode selected from FGGE Special Observing Period I. Level II-b FGGE data have been used in the experiment. The blocking took place over the North Atlantic region and is a very characteristic example of high winter blocking. It is found that the very high resolution models developed at ECMWF manage, in a remarkable way, to predict the blocking event in great detail, even beyond one week. Although models with much less resolution manage to predict the blocking phenomenon as such, the actual evolution differs greatly from that observed, and consequently the practical value is substantially reduced. Wind observations from the geostationary satellites are shown to have a substantial impact on the forecast beyond 5 days, as does an extension of the integration domain to the whole globe. Quasi-geostrophic baroclinic models and, even more, barotropic models are totally inadequate for predicting blocking except in its initial phase. The prediction experiment illustrates clearly that the efforts which have gone into the improvement of numerical prediction models in recent decades have been worthwhile.
Abstract:
The concept of slow vortical dynamics and its role in theoretical understanding is central to geophysical fluid dynamics. It leads, for example, to “potential vorticity thinking” (Hoskins et al. 1985). Mathematically, one imagines an invariant manifold within the phase space of solutions, called the slow manifold (Leith 1980; Lorenz 1980), to which the dynamics are constrained. Whether this slow manifold truly exists has been a major subject of inquiry over the past 20 years. It has become clear that an exact slow manifold is an exceptional case, restricted to steady or perhaps temporally periodic flows (Warn 1997). Thus the concept of a “fuzzy slow manifold” (Warn and Ménard 1986) has been suggested. The idea is that nearly slow dynamics will occur in a stochastic layer about the putative slow manifold. The natural question then is, how thick is this layer? In a recent paper, Ford et al. (2000) argue that Lighthill emission—the spontaneous emission of freely propagating acoustic waves by unsteady vortical flows—is applicable to the problem of balance, with the Mach number Ma replaced by the Froude number F, and that it is a fundamental mechanism for this fuzziness. They consider the rotating shallow-water equations and find emission of inertia–gravity waves at O(F²). This is rather surprising at first sight, because several studies of balanced dynamics with the rotating shallow-water equations have gone beyond second order in F, and found only an exponentially small unbalanced component (Warn and Ménard 1986; Lorenz and Krishnamurthy 1987; Bokhove and Shepherd 1996; Wirosoetisno and Shepherd 2000). We have no technical objection to the analysis of Ford et al. (2000), but wish to point out that it depends crucially on R ≳ 1, where R is the Rossby number. This condition requires the ratio of the characteristic length scale of the flow L to the Rossby deformation radius L_R to go to zero in the limit F → 0.
This is the low Froude number scaling of Charney (1963), which, while originally designed for the Tropics, has been argued to be also relevant to mesoscale dynamics (Riley et al. 1981). If L/L_R is fixed, however, then F → 0 implies R → 0, which is the standard quasigeostrophic scaling of Charney (1948; see, e.g., Pedlosky 1987). In this limit there is reason to expect the fuzziness of the slow manifold to be “exponentially thin,” and balance to be much more accurate than is consistent with (algebraic) Lighthill emission.
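The scaling relation behind this argument can be made explicit. With characteristic velocity U, length scale L, mean depth H, gravitational acceleration g, and Coriolis parameter f, the standard definitions give:

```latex
F = \frac{U}{\sqrt{gH}}, \qquad
R = \frac{U}{fL}, \qquad
L_R = \frac{\sqrt{gH}}{f}
\quad\Longrightarrow\quad
\frac{L}{L_R} = \frac{fL}{\sqrt{gH}} = \frac{F}{R}.
```

Hence if L/L_R is held fixed, F → 0 forces R → 0 (the quasigeostrophic limit), whereas keeping R of order one or larger, as in the regime of Ford et al. (2000), sends L/L_R → 0 along with F.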
Abstract:
This is an analysis of Iris Murdoch's plays, including The Italian Girl, A Severed Head, The Black Prince, The Three Arrows and The Servants and the Snow. It also assesses Murdoch's significance for theatre in the early 1960s and 1970s, as Women's Theatre was beginning to make its mark.
Abstract:
Aerosols affect the Earth's energy budget directly by scattering and absorbing radiation and indirectly by acting as cloud condensation nuclei and, thereby, affecting cloud properties. However, large uncertainties exist in current estimates of aerosol forcing because of incomplete knowledge concerning the distribution and the physical and chemical properties of aerosols, as well as aerosol–cloud interactions. In recent years, a great deal of effort has gone into improving measurements and datasets. It is thus feasible to shift the estimates of aerosol forcing from largely model-based to increasingly measurement-based. Our goal is to assess current observational capabilities and identify uncertainties in the aerosol direct forcing through comparisons of different methods with independent sources of uncertainties. Here we assess the aerosol optical depth (τ), the direct radiative effect (DRE) by natural and anthropogenic aerosols, and the direct climate forcing (DCF) by anthropogenic aerosols, focusing on satellite and ground-based measurements supplemented by global chemical transport model (CTM) simulations. The multi-spectral MODIS measures global distributions of τ on a daily scale, with a high accuracy of ±0.03 ± 0.05τ over ocean. The annual average τ is about 0.14 over the global ocean, of which about 21% ± 7% is contributed by human activities, as estimated by the MODIS fine-mode fraction. The multi-angle MISR derives an annual average AOD of 0.23 over global land with an uncertainty of ~20% or ±0.05. These high-accuracy aerosol products and broadband flux measurements from CERES make it feasible to obtain observational constraints for the aerosol direct effect, especially over the global ocean. A number of measurement-based approaches estimate the clear-sky DRE (on solar radiation) at the top-of-atmosphere (TOA) to be about −5.5 ± 0.2 W m⁻² (median ± standard error from various methods) over the global ocean.
Accounting for thin cirrus contamination of the satellite-derived aerosol field reduces the TOA DRE to −5.0 W m⁻². Because of a lack of measurements of aerosol absorption and the difficulty of characterizing land-surface reflection, estimates of DRE over land and at the ocean surface are currently realized through a combination of satellite retrievals, surface measurements, and model simulations, and are less well constrained. Over the oceans the surface DRE is estimated to be −8.8 ± 0.7 W m⁻². Over land, an integration of satellite retrievals and model simulations derives a DRE of −4.9 ± 0.7 W m⁻² and −11.8 ± 1.9 W m⁻² at the TOA and surface, respectively. CTM simulations derive a wide range of DRE estimates that on average are smaller than the measurement-based DRE by about 30–40%, even after accounting for thin cirrus and cloud contamination. A number of issues remain. Current estimates of the aerosol direct effect over land are poorly constrained. Uncertainties of DRE estimates are also larger on regional scales than on a global scale, and large discrepancies exist between different approaches. The characterization of aerosol absorption and vertical distribution remains challenging. The aerosol direct effect in the thermal infrared range and in cloudy conditions remains relatively unexplored and quite uncertain, because of a lack of global systematic aerosol vertical-profile measurements. A coordinated research strategy needs to be developed for the integration and assimilation of satellite measurements into models to constrain model simulations. Enhanced measurement capabilities in the next few years and high-level scientific cooperation will further advance our knowledge.
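The figures quoted in this abstract combine straightforwardly; as a quick arithmetic check (using only numbers stated above):

```python
# Quick arithmetic check on the figures quoted in the abstract.
tau_ocean = 0.14           # annual-mean AOD over the global ocean
anthro_frac = 0.21         # MODIS fine-mode estimate of the human-activity share
tau_anthro = tau_ocean * anthro_frac   # anthropogenic AOD over ocean, ~0.03

dre_toa = -5.0             # measurement-based TOA DRE after cirrus correction, W m^-2
# CTM estimates are reported ~30-40% smaller in magnitude:
ctm_range = (dre_toa * (1 - 0.40), dre_toa * (1 - 0.30))
print(tau_anthro, ctm_range)
```

This puts the implied anthropogenic component of ocean AOD near 0.03, and the model-based TOA DRE between roughly −3.0 and −3.5 W m⁻².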
Abstract:
The long observational record is critical to our understanding of the Earth’s climate, but most observing systems were not developed with a climate objective in mind. As a result, tremendous efforts have gone into assessing and reprocessing the data records to improve their usefulness in climate studies. The purpose of this paper is both to review recent progress in reprocessing and reanalyzing observations, and to summarize the challenges that must be overcome in order to improve our understanding of climate and variability. Reprocessing improves data quality through more scrutiny and improved retrieval techniques for individual observing systems, while reanalysis merges many disparate observations with models through data assimilation, yet both aim to provide a climatology of Earth processes. Many challenges remain, such as tracking the improvement of processing algorithms and limited spatial coverage. Reanalyses have fostered significant research, yet reliable global trends in many physical fields are not yet attainable, despite significant advances in data assimilation and numerical modeling. Oceanic reanalyses have made significant advances in recent years, but will only be discussed here in terms of progress toward integrated Earth system analyses. Climate data sets are generally adequate for process studies and large-scale climate variability. Communication of the strengths, limitations and uncertainties of reprocessed observations and reanalysis data, not only among the community of developers, but also with the extended research community, including the new generations of researchers and the decision makers, is crucial for further advancement of the observational data records. It must be emphasized that careful investigation of the data and processing methods is required to use the observations appropriately.
Abstract:
Environmental building assessment tools have been developed to measure how well or poorly a building performs, or is likely to perform, against a declared set of criteria or environmental considerations, in order to achieve sustainability principles. Knowledge of environmental building assessment tools is therefore important for the successful design and construction of environmentally friendly buildings. The purpose of the research is to investigate the knowledge and level of awareness of environmental building assessment tools among industry practitioners in Botswana. One hundred and seven paper-based questionnaires were delivered to industry practitioners, including architects, engineers, quantity surveyors, real estate developers and academics. Respondents were asked what they know about building assessment, whether they have used any building assessment tool in the past, and what they perceive as possible barriers to the implementation of environmental building assessment tools in Botswana. Sixty-five questionnaires were returned, and the responses were analysed statistically using IBM SPSS V19 software. Almost 85 per cent of respondents indicate that they are extremely or moderately aware of environmental design. Furthermore, the results indicate that 32 per cent of respondents have gone through formal training, which suggests ‘reasonable knowledge’. This, however, does not correspond with the use of the tools on the ground, as 69 per cent of practitioners report never having used any environmental building assessment tool in any project. The study highlights the need to develop an assessment tool for Botswana to enhance knowledge and further improve the level of awareness of environmental issues relating to building design and construction.
Abstract:
This article deals with the analysis of three institutional configurations of child protection: those in Burkina Faso, in Belgium, and in Québec. With respect to each configuration, the text explores the changes from situations in which the family had sole control of the child to those where the State played a greater role, and the manner in which the 1989 International Convention on the Rights of the Child has affected such changes. It shows, through a reading of history, that the involvement of the State in child protection has gone through different forms and stages over time and space.
The current strategies as regards child protection in the North and in the South—an approach that is largely subject to a legal perspective—represent a challenge both for intervenors and for families, since their application depends on the resources available for helping children and families in difficulty, on the ability of the public institutions to intervene, and on the efficiency of such interventions.
Abstract:
This paper has three aims. First, it argues that the present use of ‘ideal theory’ is unhelpful, and that an earlier and apparently more natural use focusing on perfection would be preferable. Second, it tries to show that revising the use of the term would better expose two distinctive normative issues, illustrating that claim by showing how some contributors to debates about ideal theory have gone wrong partly through not distinguishing them. Third, in exposing those two distinct normative issues, it reveals a particular way in which ideal theory, even under the more specific, revisionary definition, will often be central to working out what to do in non-ideal circumstances. By clarifying the terms on which debates over ideal and non-ideal theory should take place, and highlighting the particular problems each deals with, it aims to clear the ground for turning to the actual problem: what to do in our non-ideal and often tragic circumstances.