961 results for classic permanence


Relevance:

10.00%

Publisher:

Abstract:

How do changing notions of children’s reading practices alter or even create classic texts? This article looks at how the nineteenth-century author Jules Verne (1828-1905) was modernised by Hachette for their Bibliothèque Verte children’s collection in the 1950s and 60s. Using the methodology of adaptation studies, the article reads the abridged texts in the context of the concerns that emerged in postwar France about what children were reading. It examines how these concerns shaped editorial policy, and the transformations that Verne’s texts underwent before they were considered suitable for the children of the baby-boom generation. It asks whether these adapted versions damaged Verne’s reputation, as many literary scholars have suggested, or if the process of dividing his readership into children and adults actually helped to reinforce the new idea of his texts as complex and multilayered. In so doing, this article provides new insights into the impact of postwar reforms on children’s publishing and explores the complex interplay between abridgment, censorship, children’s literature and the adult canon.


This paper proposes and demonstrates an approach, Skilloscopy, to the assessment of decision makers. In an increasingly sophisticated, connected and information-rich world, decision making is becoming both more important and more difficult. At the same time, modelling decision-making on computers is becoming more feasible and of interest, partly because the information-input to those decisions is increasingly on record. The aims of Skilloscopy are to rate and rank decision makers in a domain relative to each other: the aims do not include an analysis of why a decision is wrong or suboptimal, nor the modelling of the underlying cognitive process of making the decisions. In the proposed method a decision-maker is characterised by a probability distribution of their competence in choosing among quantifiable alternatives. This probability distribution is derived by classic Bayesian inference from a combination of prior belief and the evidence of the decisions. Thus, decision-makers’ skills may be better compared, rated and ranked. The proposed method is applied and evaluated in the game domain of chess. A large set of games by players across a broad range of the World Chess Federation (FIDE) Elo ratings has been used to infer the distribution of players’ ratings directly from the moves they play rather than from game outcomes. Demonstration applications address questions frequently asked by the chess community regarding the stability of the Elo rating scale, the comparison of players of different eras and/or leagues, and controversial incidents possibly involving fraud. The method of Skilloscopy may be applied in any decision domain where the value of the decision-options can be quantified.
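
A minimal sketch of this kind of Bayesian skill inference, assuming a softmax choice likelihood over a discrete grid of competence levels. The paper's actual likelihood is tied to chess move values and Elo ratings, so every name and number below is illustrative:

```python
import numpy as np

# Hypothetical sketch of the Skilloscopy idea: a decision-maker's competence
# is a latent parameter s; skilled players choose high-value options more
# reliably.  We model choice probability with a softmax whose sharpness is s
# (an assumption -- the paper's likelihood is based on chess move values).

skills = np.linspace(0.1, 5.0, 50)                    # candidate competence levels
posterior = np.full(len(skills), 1.0 / len(skills))   # uniform prior belief

def choice_likelihood(option_values, chosen, s):
    """P(chosen option | skill s) under a softmax over the option values."""
    w = np.exp(s * np.asarray(option_values))
    return w[chosen] / w.sum()

# Observed decisions: each is (values of the alternatives, index chosen).
decisions = [([0.2, 1.0, 0.5], 1), ([0.0, 0.3, 0.9], 2), ([1.1, 0.4], 0)]

for values, chosen in decisions:                      # classic Bayesian updating
    likelihood = np.array([choice_likelihood(values, chosen, s) for s in skills])
    posterior *= likelihood
    posterior /= posterior.sum()

# Posterior mean competence: a rating inferred from moves, not game outcomes.
rating = float(np.sum(skills * posterior))
```

Because every observed choice here happens to be the highest-value option, the posterior mass shifts toward higher competence than the uniform prior implies; poor choices would shift it the other way.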


The DNA sequence of the chromosomal gene cluster encoding the SEF14 fimbriae of Salmonella enterica serovar Enteritidis was determined. Five contiguous open reading frames, sefABCDE, were identified. The sefE gene shared significant homology with araC-like positive regulators. Serovar-associated virulence plasmid (SAP) genes orf7,8,9 and pefI were identified immediately adjacent to the sef operon. The pefI gene encoded a putative regulator of plasmid-encoded fimbrial antigen (PEF) expression. The entire sef-pef region, flanked by two IS-like elements, was inserted adjacent to leuX, which encoded a transfer RNA molecule. The organisation of this region was suggestive of a classic pathogenicity islet. Southern hybridisation confirmed two copies of the SAP-derived orf7,8,9 and pefI region in S. Enteritidis, one in the chromosome and one on the SAP. Of the other group D Salmonella, only S. Blegdam and S. Moscow harboured both chromosomal and plasmid copies of the pefI-orf9 region, although polymorphism was evident. Crown Copyright (C) 2001 Published by Elsevier Science B.V. All rights reserved.


Let θ denote the level of quality inherent in a food product that is delivered to some terminal market. In this paper, I characterize allocations over θ and provide an economic rationale for regulating safety and quality standards in the food system. Zusman and Bockstael investigate the theoretical foundations for imposing standards and stress the importance of providing a tractable conceptual foundation. Despite a wealth of contributions that are mainly empirical (for reviews of these works see, respectively, Caswell and Antle), there have been relatively few attempts to model formally the linkages between farm and food markets when food quality and consumer safety are at issue. Here, I attempt to provide such a framework, building on key contributions in the theoretical literature and linking them in a simple model of quality determination in a vertically related marketing channel. The food-marketing model is due to Gardner. Spence provides a foundation for Pareto-improving intervention in a deterministic model of quality provision, and Leland, building on the classic paper by Akerlof, investigates licensing and minimum standards when the information structure is incomplete. Linking these ideas in a satisfactory model of the food markets is the main objective of the paper.


Sarah Kane's notorious 1995 debut, Blasted, has been widely though belatedly recognized as a defining example of experiential or ‘in-yer-face’ theatre. However, Graham Saunders here argues that the best playwrights not only innovate in use of language and dramatic form, but also rewrite the classic plays of the past. He believes that too much stress has been placed on the play's radical structure and contemporary sensibility, with the effect of obscuring the influence of Shakespearean tradition on its genesis and content. He clarifies Kane's gradually dawning awareness of the influence of Shakespeare's King Lear on her work and how elements of that tragedy were rewritten in terms of dialogue, recast thematically, and reworked in terms of theatrical image. He sees Blasted as both a response to contemporary reality and an engagement with the history of drama. Graham Saunders is Senior Lecturer in Theatre Studies at the University of the West of England, Bristol, and author of the first full-length study of Kane's work: ‘Love Me or Kill Me’: Sarah Kane and the Theatre of Extremes (Manchester University Press, 2002). An earlier version of this article was given as a paper at the ‘Crucible of Cultures: Anglophone Drama at the Dawn of a New Millennium’ conference in Brussels, May 2001. Saunders is currently working on articles about Samuel Beckett and Edward Bond.


The classic Reynolds flocking model is formally analysed, with results presented and discussed. Flocking behaviour was investigated through the development of two measurements of flocking, flock area and polarisation, with a view to applying the findings to robotic applications. Experiments varying the flocking simulation parameters individually and simultaneously provide new insight into the control of flock behaviour.
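
The two measurements can be sketched as follows. The exact definitions are assumptions, not necessarily those used in the paper: polarisation as the norm of the mean unit heading (1 means all boids aligned, near 0 means random headings), and flock area as the axis-aligned bounding box of boid positions (a convex hull is a common alternative):

```python
import numpy as np

# Sketch of the two flocking measurements named above.  Both definitions are
# illustrative assumptions rather than the paper's formulas.

def polarisation(headings):
    """headings: (N, 2) array of velocity vectors; returns a value in [0, 1]."""
    units = headings / np.linalg.norm(headings, axis=1, keepdims=True)
    return float(np.linalg.norm(units.mean(axis=0)))

def flock_area(positions):
    """positions: (N, 2) array; area of the axis-aligned bounding box."""
    extent = positions.max(axis=0) - positions.min(axis=0)
    return float(extent.prod())

aligned = np.tile([2.0, 0.0], (10, 1))        # ten boids all heading east
scattered = np.array([[1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0]])
```

Logging these two numbers over time while sweeping the Reynolds parameters (separation, alignment, cohesion weights) is one way to reproduce the kind of analysis described.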


The human mirror neuron system (hMNS) has been associated with various forms of social cognition and affective processing including vicarious experience. It has also been proposed that a faulty hMNS may underlie some of the deficits seen in the autism spectrum disorders (ASDs). In the present study we set out to investigate whether emotional facial expressions could modulate a putative EEG index of hMNS activation (mu suppression) and if so, would this differ according to the individual level of autistic traits [high versus low Autism Spectrum Quotient (AQ) score]. Participants were presented with 3 s films of actors opening and closing their hands (classic hMNS mu-suppression protocol) while simultaneously wearing happy, angry, or neutral expressions. Mu-suppression was measured in the alpha and low beta bands. The low AQ group displayed greater low beta event-related desynchronization (ERD) to both angry and neutral expressions. The high AQ group displayed greater low beta ERD to angry than to happy expressions. There was also significantly more low beta ERD to happy faces for the low than for the high AQ group. In conclusion, an interesting interaction between AQ group and emotional expression revealed that hMNS activation can be modulated by emotional facial expressions and that this is differentiated according to individual differences in the level of autistic traits. The EEG index of hMNS activation (mu suppression) seems to be a sensitive measure of the variability in facial processing in typically developing individuals with high and low self-reported traits of autism.
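
The mu-suppression index itself is straightforward to compute. A minimal sketch, assuming FFT-based band power and an 8–13 Hz alpha band; the sampling rate and synthetic signals below are illustrative, not the study's data:

```python
import numpy as np

# Event-related desynchronization (ERD) as a percentage change in band power
# during the stimulus relative to baseline (negative = mu suppression).
# Band edges and synthetic data are illustrative assumptions.

FS = 250  # sampling rate in Hz (assumed)

def band_power(signal, low, high):
    """Mean spectral power of `signal` between `low` and `high` Hz (FFT-based)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / FS)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].mean()

def erd_percent(baseline, event, low=8.0, high=13.0):
    p0, p1 = band_power(baseline, low, high), band_power(event, low, high)
    return 100.0 * (p1 - p0) / p0

t = np.arange(0.0, 3.0, 1.0 / FS)             # a 3-s epoch, as in the protocol
rng = np.random.default_rng(0)
baseline = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(len(t))
event = 0.4 * np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(len(t))
```

Here the 10 Hz component is attenuated during the event epoch, so the index comes out strongly negative, i.e. mu suppression.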


X-ray rheology is an experimental technique which uses time-resolved X-ray scattering as a probe of the molecular-level structural reorganisation which accompanies flow. It provides quantitative information on the direction of alignment and on the level of global orientation. This information is very helpful in interpreting the classic rheological data on liquid crystal polymers. In this research we use data obtained from a cellulose derivative which exhibits a thermotropic liquid crystal phase. We show how increased shear rates lead to a rapid rise in the global orientation, and we relate this to theories of flow in liquid crystal polymers from the literature. We show that the relaxation time is independent of the prior shear rate.


The concept of a slowest invariant manifold is investigated for the five-component model of Lorenz under conservative dynamics. It is shown that Lorenz's model is a two-degree-of-freedom canonical Hamiltonian system, consisting of a nonlinear vorticity-triad oscillator coupled to a linear gravity wave oscillator, whose solutions consist of regular and chaotic orbits. When either the Rossby number or the rotational Froude number is small, there is a formal separation of timescales, and one can speak of fast and slow motion. In the same regime, the coupling is weak, and the Kolmogorov–Arnold–Moser theorem is shown to apply. The chaotic orbits are inherently unbalanced and are confined to regions sandwiched between invariant tori consisting of quasi-periodic regular orbits. The regular orbits generally contain free fast motion, but a slowest invariant manifold may be geometrically defined as the set of all slow cores of invariant tori (defined by zero fast action) that are smoothly related to such cores in the uncoupled system. This slowest invariant manifold is not global; in fact, its structure is fractal; but it is of nearly full measure in the limit of weak coupling. It is also nonlinearly stable. As the coupling increases, the slowest invariant manifold shrinks until it disappears altogether. The results clarify previous definitions of a slowest invariant manifold and highlight the ambiguity in the definition of “slowness.” An asymptotic procedure, analogous to standard initialization techniques, is found to yield nonzero free fast motion even when the core solutions contain none. A hierarchy of Hamiltonian balanced models preserving the symmetries in the original low-order model is formulated; these models are compared with classic balanced models, asymptotically initialized solutions of the full system, and the slowest invariant manifold defined by the core solutions.
The analysis suggests that for sufficiently small Rossby or rotational Froude numbers, a stable slowest invariant manifold can be defined for this system, which has zero free gravity wave activity, but it cannot be defined everywhere. The implications of the results for more complex systems are discussed.
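
The fast/slow structure can be illustrated with a toy two-degree-of-freedom Hamiltonian: a nonlinear (pendulum) oscillator weakly coupled to a fast linear oscillator. This is an analogy to, not a reproduction of, Lorenz's five-component model; the coupling strength, fast frequency, and initial conditions are all assumptions:

```python
import numpy as np

# Toy analogue of the fast/slow split discussed above (NOT Lorenz's model):
#   H = p^2/2 - cos(q)  +  (P^2 + W^2 X^2)/2  +  EPS*q*X
# (q, p): slow nonlinear oscillator; (X, P): fast linear oscillator.
EPS, W = 0.1, 10.0            # weak coupling, large frequency separation

def rhs(state):
    q, p, X, P = state
    return np.array([p, -np.sin(q) - EPS * X, P, -W**2 * X - EPS * q])

def rk4_step(state, dt):
    k1 = rhs(state)
    k2 = rhs(state + 0.5 * dt * k1)
    k3 = rhs(state + 0.5 * dt * k2)
    k4 = rhs(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def energy(state):
    q, p, X, P = state
    return 0.5 * p**2 - np.cos(q) + 0.5 * (P**2 + W**2 * X**2) + EPS * q * X

# Start with zero fast activity: the analogue of a "slow core" initial state.
state = np.array([1.0, 0.0, 0.0, 0.0])
e0, max_fast = energy(state), 0.0
dt = 0.001
for _ in range(20000):        # integrate to t = 20
    state = rk4_step(state, dt)
    max_fast = max(max_fast, abs(state[2]))
```

Started on the slow part of phase space, the fast coordinate stays O(EPS/W²) while total energy is conserved to integrator accuracy; pushing EPS toward O(1) destroys the separation, mirroring the shrinking slowest invariant manifold described above.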


Neurovascular coupling in response to stimulation of the rat barrel cortex was investigated using concurrent multichannel electrophysiology and laser Doppler flowmetry. The data were used to build a linear dynamic model relating neural activity to blood flow. Local field potential time series were subject to current source density analysis, and the time series of a layer IV sink of the barrel cortex was used as the input to the model. The model output was the time series of the changes in regional cerebral blood flow (CBF). We show that this model can provide an excellent fit of the CBF responses for stimulus durations of up to 16 s. The structure of the model consisted of two coupled components representing vascular dilation and constriction. The complex temporal characteristics of the CBF time series were reproduced by the relatively simple balance of these two components. We show that the impulse response obtained under the 16-s duration stimulation condition generalised to provide a good prediction of the data from the shorter duration stimulation conditions. Furthermore, by optimising three out of the total of nine model parameters, the variability in the data can be well accounted for over a wide range of stimulus conditions. By establishing linearity, classic system analysis methods can be used to generate and explore a range of equivalent model structures (e.g., feed-forward or feedback) to guide the experimental investigation of the control of vascular dilation and constriction following stimulation.
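
The linearity claim has a simple operational meaning: a single impulse response predicts the response to any stimulus by convolution, and superposition holds exactly. A sketch, with a hypothetical two-component kernel (a fast dilating lobe minus a slower constricting lobe) standing in for the paper's fitted model:

```python
import numpy as np

# Linear-systems sketch of the neural-activity -> CBF model described above.
# Kernel shapes and time constants are illustrative assumptions, not the
# paper's nine fitted parameters.

DT = 0.1                                    # seconds per sample (assumed)
t = np.arange(0.0, 30.0, DT)

def gamma_kernel(t, peak):
    """Simple gamma-like lobe peaking near `peak` seconds, unit area."""
    k = (t / peak) ** 2 * np.exp(-t / peak)
    return k / k.sum()

# Impulse response = dilation component minus a slower constriction component.
h = gamma_kernel(t, 1.5) - 0.4 * gamma_kernel(t, 6.0)

def cbf_response(stimulus):
    """Predicted CBF change: the stimulus convolved with the impulse response."""
    return np.convolve(stimulus, h)[: len(stimulus)]

stim_16s = (t < 16.0).astype(float)         # 16-s stimulus
stim_2s = (t < 2.0).astype(float)           # shorter stimulus, same kernel
```

Because the model is linear, the response to a long stimulus equals the sum of responses to its parts; this is the property that lets a kernel estimated from the 16-s condition generalise to shorter stimuli.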


Many applications, such as intermittent data assimilation, lead to a recursive application of Bayesian inference within a Monte Carlo context. Popular data assimilation algorithms include sequential Monte Carlo methods and ensemble Kalman filters (EnKFs). These methods differ in the way Bayesian inference is implemented. Sequential Monte Carlo methods rely on importance sampling combined with a resampling step, while EnKFs utilize a linear transformation of Monte Carlo samples based on the classic Kalman filter. While EnKFs have proven to be quite robust even for small ensemble sizes, they are not consistent since their derivation relies on a linear regression ansatz. In this paper, we propose another transform method, which does not rely on any a priori assumptions on the underlying prior and posterior distributions. The new method is based on solving an optimal transportation problem for discrete random variables. © 2013, Society for Industrial and Applied Mathematics
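
A transform-based update of this kind can be sketched in one dimension, where the optimal transport coupling between weighted prior particles and equally weighted posterior particles is the monotone rearrangement. The Gaussian likelihood and toy numbers are illustrative assumptions, not the paper's formulation:

```python
import numpy as np

# Sketch of a transform-style Bayesian update: instead of resampling,
# weighted prior particles are mapped to N equally weighted posterior
# particles via the coupling that solves the 1-D optimal transport problem.

def transport_update(particles, weights):
    """Map weighted 1-D particles to equally weighted ones by optimal transport."""
    n = len(particles)
    order = np.argsort(particles)
    x, w = particles[order], weights[order]
    cum = np.concatenate([[0.0], np.cumsum(w)])       # cumulative prior weight
    new = np.empty(n)
    for i in range(n):                                # target bins [i/n, (i+1)/n]
        lo, hi = i / n, (i + 1) / n
        overlap = np.clip(np.minimum(cum[1:], hi) - np.maximum(cum[:-1], lo),
                          0.0, None)
        new[i] = n * np.dot(overlap, x)               # barycentre of coupling row
    return new

rng = np.random.default_rng(1)
prior = rng.normal(0.0, 1.0, 200)                     # prior ensemble
obs, obs_var = 1.0, 0.5**2                            # toy observation
w = np.exp(-0.5 * (obs - prior) ** 2 / obs_var)       # Gaussian likelihood
w /= w.sum()
posterior = transport_update(prior, w)
```

Unlike importance resampling, the map is deterministic given the ensemble, and it preserves the weighted posterior mean exactly; unlike the EnKF, it makes no linear-regression ansatz.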


The Hadley Centre Global Environmental Model (HadGEM) includes two aerosol schemes: the Coupled Large-scale Aerosol Simulator for Studies in Climate (CLASSIC), and the new Global Model of Aerosol Processes (GLOMAP-mode). GLOMAP-mode is a modal aerosol microphysics scheme that simulates not only aerosol mass but also aerosol number, represents internally mixed particles, and includes aerosol microphysical processes such as nucleation. In this study, both schemes provide hindcast simulations of natural and anthropogenic aerosol species for the period 2000–2006. HadGEM simulations of the aerosol optical depth using GLOMAP-mode compare better than CLASSIC against a data-assimilated aerosol re-analysis and aerosol ground-based observations. Because of differences in wet deposition rates, the GLOMAP-mode sulphate aerosol residence time is two days longer than that of CLASSIC sulphate aerosols, whereas the black carbon residence time is much shorter. As a result, CLASSIC underestimates aerosol optical depths in continental regions of the Northern Hemisphere and likely overestimates absorption in remote regions. Aerosol direct and first indirect radiative forcings are computed from simulations of aerosols with emissions for the years 1850 and 2000. In 1850, GLOMAP-mode predicts lower aerosol optical depths and higher cloud droplet number concentrations than CLASSIC. Consequently, simulated clouds are much less susceptible to natural and anthropogenic aerosol changes when the microphysical scheme is used. In particular, the response of cloud condensation nuclei to an increase in dimethyl sulphide emissions becomes a factor of four smaller. The combined effect of different 1850 baselines, residence times, and abilities to affect cloud droplet number leads to substantial differences in the aerosol forcings simulated by the two schemes. GLOMAP-mode finds a present-day direct aerosol forcing of −0.49 W m−2 on a global average, 72% stronger than the corresponding forcing from CLASSIC. This difference is compensated by changes in the first indirect aerosol forcing: the forcing of −1.17 W m−2 obtained with GLOMAP-mode is 20% weaker than with CLASSIC. Results suggest that mass-based schemes such as CLASSIC lack the necessary sophistication to provide realistic input to aerosol–cloud interaction schemes. Furthermore, the importance of the 1850 baseline highlights how model skill in predicting present-day aerosol does not guarantee reliable forcing estimates. These findings suggest that the more complex representation of aerosol processes in microphysical schemes improves the fidelity of simulated aerosol forcings.


In a series of recent cases, courts have reasserted unconscionability as the basis of proprietary estoppel and in doing so have moved away from the structured form of discretion envisaged in the classic Taylors Fashions formula. In light of these developments, this paper traces the use of unconscionability in estoppel and examines the changing role attributed to the concept. In a parallel development, in exercising their remedial discretion once a claim to estoppel has been established, the courts have emphasised the foundation of estoppel in unconscionability to assert the need for proportionality between the detriment and remedy as ‘the most essential requirement’. Collectively, the cases demonstrate a lack of transparency or consistency, which raises concerns that the courts are descending into a form of individualised discretion. These developments are of particular concern as they come at a time when commentators are predicting a ‘boom’ in estoppel to follow the introduction of electronic conveyancing.


In the context of the environmental valuation of natural disasters, an important component of the evaluation procedure lies in determining the periodicity of events. This paper explores alternative methodologies for determining such periodicity, illustrating the advantages and disadvantages of the separate methods and their comparative predictions. The procedures employ Bayesian inference and explore recent advances in computational aspects of mixtures methodology. The procedures are applied to the classic data set of Maguire et al. (Biometrika, 1952), subsequently updated by Jarrett (Biometrika, 1979), which together comprise the seminal investigations of the periodicity of mining disasters within the United Kingdom, 1851-1962.
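
One concrete version of this Bayesian machinery is a single change-point model: Poisson disaster counts with one rate before the change and another after, conjugate Gamma priors on both rates, and an exact grid posterior over the change-point. The yearly counts below are invented for illustration; they are not the Maguire/Jarrett data:

```python
import numpy as np
from math import lgamma, log

# Single change-point sketch: counts ~ Poisson(lam1) before year k and
# Poisson(lam2) after, with Gamma(A, B) priors on both rates.  The counts
# are hypothetical -- NOT the Maguire et al. (1952) series.

counts = np.array([5, 4, 6, 3, 5, 4, 1, 0, 2, 1, 0, 1])   # made-up yearly counts
A, B = 2.0, 1.0                                            # Gamma prior parameters

def log_marginal(segment):
    """log of the Gamma-Poisson marginal likelihood of a segment, up to a constant."""
    s, n = segment.sum(), len(segment)
    return A * log(B) - lgamma(A) + lgamma(A + s) - (A + s) * log(B + n)

T = len(counts)
logpost = np.array([log_marginal(counts[:k]) + log_marginal(counts[k:])
                    for k in range(1, T)])                 # change after year k
post = np.exp(logpost - logpost.max())
post /= post.sum()                                         # posterior over k
most_likely = int(np.argmax(post)) + 1
```

With these toy counts the posterior concentrates on the obvious drop after year 6; mixture-based methods generalise this to an unknown number of regimes, which is the comparison the paper pursues.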