Abstract:
Modern methods of analysis applied to cemeteries have often been used in our pages to suggest generalities about mobility and diet. But these same techniques applied to a single individual, together with the grave goods and burial rite, can open a special kind of personal window on the past. Here, the authors of a multidisciplinary project use a combination of scientific techniques to illuminate Roman York, and later Roman history in general, with their image of a glamorous mixed-race woman, in touch with Africa, Christianity, Rome and Yorkshire.
Abstract:
In order to investigate how the population diversity at major Romano-British urban centres compared to small towns and military outposts, we conducted multi-isotope (carbon, nitrogen, oxygen and strontium) analyses of bones (42 individuals) and teeth (26 individuals) of human skeletons from Cataractonium/Roman Catterick in North Yorkshire (U.K.). The results suggest a markedly less diverse population at Catterick than at the larger towns. Significant differences are observed between burials from the town and fort area and the suburb of Bainesse to the south, and it is suggested that these reflect a shift to more localised recruitment for the Roman army in the Late Roman period. Isotope data for the ‘Bainesse Eunuch’, an unusual 4th century burial that has been interpreted as the remains of a ‘transvestite’ priest of Cybele, are ultimately inconclusive but consistent with origins in Southern Britain or areas with a similar climate abroad. This paper also presents strontium isotope data for modern vegetation samples from 17 sites in the Catterick/northern Vale of York area, which contribute to a continuing effort to map biosphere ⁸⁷Sr/⁸⁶Sr variation in Britain.
Abstract:
The low rates of child literacy in South Africa are cause for considerable concern. Research from the developed world shows that parental sharing of picture books with infants and young children is beneficial for child language and cognitive development, as well as literacy skills. We conducted a pilot study to examine whether such benefits might extend to an impoverished community in South Africa, by evaluating the impact of training mothers in book sharing with their 14- to 18-month-old infants. Seventeen mothers received book sharing training, and 13 mothers did not, instead receiving a comparison training in toy play. We assessed the mothers’ behavior during both book sharing and toy play before and after training, and we also assessed infant attention and language. Mothers receiving book sharing training engaged well with it, and they also benefited from it; thus, compared to the comparison group mothers, they became more sensitive, more facilitating, and more elaborative with their infants during book sharing, and they also became more sensitive to their infants during toy play. In addition, infants whose mothers received the book sharing training showed greater benefits than the comparison group infants in both their attention and language. Training in book sharing for families living in conditions of marked socio-economic adversity in South Africa has the potential to be of considerable benefit to child developmental progress. A large scale controlled trial is required to confirm this.
Abstract:
Decadal climate predictions exhibit large biases, which are often subtracted and forgotten. However, understanding the causes of bias is essential to guide efforts to improve prediction systems, and may offer additional benefits. Here the origins of biases in decadal predictions are investigated, including whether analysis of these biases might provide useful information. The focus is especially on the lead-time-dependent bias tendency. A “toy” model of a prediction system is initially developed and used to show that there are several distinct contributions to bias tendency. Contributions from sampling of internal variability and a start-time-dependent forcing bias can be estimated and removed to obtain a much improved estimate of the true bias tendency, which can provide information about errors in the underlying model and/or errors in the specification of forcings. It is argued that the true bias tendency, not the total bias tendency, should be used to adjust decadal forecasts. The methods developed are applied to decadal hindcasts of global mean temperature made using the Hadley Centre Coupled Model, version 3 (HadCM3), climate model, and it is found that this model exhibits a small positive bias tendency in the ensemble mean. When considering different model versions, it is shown that the true bias tendency is very highly correlated with both the transient climate response (TCR) and non–greenhouse gas forcing trends, and can therefore be used to obtain observationally constrained estimates of these relevant physical quantities.
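The decomposition described in this abstract can be illustrated with a minimal numerical sketch (all numbers hypothetical, not taken from the paper): hindcast errors are generated as a linear model drift plus a start-date-dependent offset from sampled internal variability, and averaging over many start dates removes the sampling contribution, so that the slope of the mean bias with lead time recovers the underlying drift, i.e. the true bias tendency.

```python
import numpy as np

rng = np.random.default_rng(0)
n_starts, n_leads = 40, 10
true_tendency = 0.05       # model drift per lead time (the "true" bias tendency)
leads = np.arange(1, n_leads + 1)

# forecast-minus-observation error for each start date and lead time:
# linear drift + start-dependent offset (sampled internal variability
# at initialisation) + lead-dependent noise
errors = (true_tendency * leads[None, :]
          + rng.normal(0.0, 0.3, size=(n_starts, 1))      # internal variability
          + rng.normal(0.0, 0.1, size=(n_starts, n_leads)))

# averaging over start dates removes the sampling contribution; the slope
# of the mean bias with lead time then estimates the true bias tendency
mean_bias = errors.mean(axis=0)
est_tendency = np.polyfit(leads, mean_bias, 1)[0]
```

With enough start dates the estimated slope converges on the imposed drift, whereas the bias at any single start date is contaminated by the sampled internal variability.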
Abstract:
Fourth-century a.d. chalk tesserae from Roman Leicester (Ratae Corieltavorum) yield rich microfossil assemblages that identify a biostratigraphical age of Cretaceous Late Cenomanian to Early Turonian. The nearest chalk outcrops to Leicester lie in Hertfordshire, Lincolnshire, Yorkshire and north Norfolk, indicating that the material for the tesserae must have been sourced remotely and transported to Ratae. Superimposing the Roman road network onto a map of the relevant Chalk Group distribution provides a guide to possible sources. A process of evaluation identifies Baldock in Hertfordshire and Bridlington in Yorkshire as the most likely sources for the Leicester tesserae.
Abstract:
Where are the terps in Yorkshire, or for that matter where is any other evidence of exploitation of the wetlands in the early medieval period? Archaeological evidence remains largely elusive for the period between the early fifth and the late ninth century. Among the very few sites in wetland landscapes dated to this period are the settlement of York and the middle Anglo-Saxon bridge at Skerne in the Hull valley. Sites from the free-draining soils adjacent to wetlands are more frequent, and include a monastery (Beverley), settlements (e.g. Nafferton and North Frodingham), cemeteries (e.g. Hornsea, Burton Pidsea, Hessle, North Frodingham, Swine and Stamford Bridge) and various isolated finds (recently summarised in Van de Noort and Davies 1993).
Abstract:
Introduction: Resistance to anticoagulants in Norway rats (Rattus norvegicus) and house mice (Mus domesticus) has been studied in the UK since the early 1960s. In no other country in the world is our understanding of resistance phenomena so extensive and profound. Almost every aspect of resistance in the key rodent target species has been examined in laboratory and field trials, and results obtained by independent researchers have been published. It is the principal purpose of this document to present a short synopsis of this information. More recently, however, the development of genetical techniques has provided a definitive means of detecting resistant genotypes among pest rodent populations. Preliminary information from a number of such surveys will also be presented. Resistance in Norway rats: A total of nine different anticoagulant resistance mutations (single nucleotide polymorphisms, or SNPs) are found among Norway rats in the UK. Nowhere else in the world are so many different forms of Norway rat resistance present. Among these nine SNPs, five are known to confer on rats that carry them a significant degree of resistance to anticoagulant rodenticides. These mutations are: L128Q, Y139S, L120Q, Y139C and Y139F. The latter three mutations confer, to varying degrees, practical resistance to bromadiolone and difenacoum, the two second-generation anticoagulants in predominant use in the UK. It is the recommendation of RRAG that bromadiolone and difenacoum should not be used against rats carrying the L120Q, Y139C and Y139F mutations because this will promote the spread of resistance and jeopardise the long-term efficacy of anticoagulants. Brodifacoum, flocoumafen and difethialone are effective against these three genotypes but cannot presently be used because of the regulatory restriction that they may only be applied against rats that are living and feeding predominantly indoors.
Our understanding of the geographical distribution of Norway rat resistance is incomplete but is rapidly increasing. In particular, the mapping of the focus of L120Q Norway rat resistance in central-southern England by DNA sequencing is well advanced. We now know that rats carrying this resistance mutation are present across a large part of the counties of Hampshire, Berkshire and Wiltshire, and the resistance spreads into Avon, Oxfordshire and Surrey. It is also found, perhaps as outlier foci, in south-west Scotland and East Sussex. L120Q is currently the most severe form of anticoagulant resistance found in Norway rats and is prevalent over a considerable part of central-southern England. A second form of advanced Norway rat resistance is conferred by the Y139C mutation. This is noteworthy because it occurs in at least four different foci that are widely geographically dispersed, namely in Dumfries and Galloway, Gloucestershire, Yorkshire and Norfolk. Once again, bromadiolone and difenacoum are not recommended for use against rats carrying this genotype, and a concern of RRAG is that continued applications of resisted active substances may result in Y139C becoming more or less ubiquitous across much of the UK. Another type of advanced resistance, the Y139F mutation, is present in Kent and Sussex. This means that Norway rats carrying some degree of resistance to bromadiolone and difenacoum are now found from the south coast of Kent, west into the city of Bristol, to Yorkshire in the north-east and to the south-west of Scotland. This difficult situation can only deteriorate further where these three genotypes exist and resisted anticoagulants are predominantly used against them. Resistance in house mice: Resistance in the house mouse is not so well understood, but the presence in the UK of two resistant genotypes, L128S and Y139C, is confirmed.
House mice are naturally tolerant of anticoagulants, and such is the combined effect of this tolerance and genetical resistance that house mice resistant to the first-generation anticoagulants are considered to be widespread in the UK. Consequently, baits containing warfarin, sodium warfarin, chlorophacinone and coumatetralyl are not approved for use against mice. This regulatory position is endorsed by RRAG. Baits containing brodifacoum, flocoumafen and difethialone are effective against house mice and may be applied in practice because house mouse infestations are predominantly indoors. There are some reports of resistance among mice in some areas to the second-generation anticoagulant bromadiolone, while difenacoum remains largely efficacious. Alternatives to anticoagulants: The use of habitat manipulation, that is the removal of harbourage, denial of the availability of food and the prevention of ingress to structures, is an essential component of sustainable rodent pest management. All are of importance in the management of resistant rodents and have the advantage of not selecting for resistant genotypes. The use of these techniques may be particularly valuable in preventing the build-up of rat infestations. However, none can be used to remove any sizeable extant rat infestation, and for practical reasons their use against house mice is problematic. Few alternative chemical interventions are available in the European Union because of the removal from the market of zinc phosphide, calciferol and bromethalin. Our virtually complete reliance on anticoagulants for the chemical control of rodents in the UK, and more widely in the EU, calls for improved schemes for resistance management. Of course, these might involve the use of alternatives to anticoagulant rodenticides. Also important is an increasing knowledge of the distribution of resistance mutations in rats and mice and the use of only fully effective anticoagulants against them.
Abstract:
4-Dimensional Variational Data Assimilation (4DVAR) assimilates observations through the minimisation of a least-squares objective function, which is constrained by the model flow. We refer to 4DVAR as strong-constraint 4DVAR (sc4DVAR) in this thesis as it assumes the model is perfect. Relaxing this assumption gives rise to weak-constraint 4DVAR (wc4DVAR), leading to a different minimisation problem with more degrees of freedom. We consider two wc4DVAR formulations in this thesis, the model error formulation and the state estimation formulation. The 4DVAR objective function is traditionally solved using gradient-based iterative methods. The principal method used in Numerical Weather Prediction today is the Gauss-Newton approach. This method introduces a linearised 'inner-loop' objective function, which, upon convergence, updates the solution of the non-linear 'outer-loop' objective function. This requires many evaluations of the objective function and its gradient, which emphasises the importance of the Hessian. The eigenvalues and eigenvectors of the Hessian provide insight into the degree of convexity of the objective function, while also indicating the difficulty one may encounter while iteratively solving 4DVAR. The condition number of the Hessian is an appropriate measure for the sensitivity of the problem to input data. The condition number can also indicate the rate of convergence and solution accuracy of the minimisation algorithm. This thesis investigates the sensitivity of the solution process minimising both wc4DVAR objective functions to the internal assimilation parameters composing the problem. We gain insight into these sensitivities by bounding the condition number of the Hessians of both objective functions. We also precondition the model error objective function and show improved convergence. Using the bounds, we show that both formulations' sensitivities are related to error variance balance, assimilation window length and correlation length-scales.
We further demonstrate this through numerical experiments on the condition number and data assimilation experiments using linear and non-linear chaotic toy models.
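A minimal sketch of the conditioning question can be given for the strong-constraint case, assuming a hypothetical two-variable linear model, identity background covariance, and observation of only the first state variable (none of these choices come from the thesis itself). The sc4DVAR Hessian is assembled from its standard closed form, and decreasing the observation-error variance relative to the background inflates its condition number, one instance of the error variance balance discussed above.

```python
import numpy as np

def sc4dvar_hessian(M, H, Binv, Rinv, n_obs):
    """Hessian of the strong-constraint 4DVAR cost function for a linear
    model x_k = M^k x_0, observed at times k = 0, ..., n_obs-1:
        S = B^{-1} + sum_k (H M^k)^T R^{-1} (H M^k)
    """
    S = Binv.copy()
    Mk = np.eye(Binv.shape[0])
    for _ in range(n_obs):
        G = H @ Mk                 # observation operator composed with M^k
        S = S + G.T @ Rinv @ G
        Mk = M @ Mk
    return S

# hypothetical 2-variable linear model; only the first variable is observed
M = np.array([[0.98, 0.2], [0.0, 0.9]])
H = np.array([[1.0, 0.0]])
Binv = np.eye(2)                   # unit background-error variances

# tightening the observation errors relative to the background
# (larger R^{-1}) worsens the conditioning of the Hessian
cond_loose = np.linalg.cond(sc4dvar_hessian(M, H, Binv, np.array([[1.0]]), 3))
cond_tight = np.linalg.cond(sc4dvar_hessian(M, H, Binv, np.array([[100.0]]), 3))
```

Lengthening the window (larger `n_obs`) changes the conditioning through the same observation term, which is the mechanism behind the window-length sensitivity studied in the thesis.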
Abstract:
The emergence and development of digital imaging technologies and their impact on mainstream filmmaking is perhaps the most familiar special effects narrative associated with the years 1981-1999. This is in part because some of the questions raised by the rise of the digital still concern us now, but also because key milestone films showcasing advancements in digital imaging technologies appear in this period, including Tron (1982) and its computer-generated image elements, the digital morphing in The Abyss (1989) and Terminator 2: Judgment Day (1991), computer animation in Jurassic Park (1993) and Toy Story (1995), digital extras in Titanic (1997), and ‘bullet time’ in The Matrix (1999). As a result, it is tempting to characterize 1981-1999 as a ‘transitional period’ in which digital imaging processes grow in prominence and technical sophistication, and what we might call ‘analogue’ special effects processes correspondingly become less common. But such a narrative risks eliding the other practices that also shape effects sequences in this period. Indeed, the 1980s and 1990s are striking for the diverse range of effects practices in evidence in both big budget films and lower budget productions, and for the extent to which analogue practices persist independently of or alongside digital effects work in a range of production and genre contexts. The chapter seeks to document and celebrate this diversity and plurality, this sustaining of earlier traditions of effects practice alongside newer processes, this experimentation with materials and technologies old and new in the service of aesthetic aspirations alongside budgetary and technical constraints.
The common characterization of the period as a series of rapid transformations in production workflows, practices and technologies will be interrogated in relation to the persistence of certain key figures such as Douglas Trumbull, John Dykstra, and James Cameron, but also through a consideration of the contexts for and influences on creative decision-making. Comparative analyses of the processes used to articulate bodies, space and scale in effects sequences drawn from different generic sites of special effects work, including science fiction, fantasy, and horror, will provide a further frame for the chapter’s mapping of the commonalities and specificities, continuities and variations in effects practices across the period. In the process, the chapter seeks to reclaim analogue processes’ contribution both to moments of explicit spectacle, and to diegetic verisimilitude, in the decades most often associated with the digital’s ‘arrival’.
Abstract:
The purported migrations that have formed the peoples of Britain have been the focus of generations of scholarly controversy. However, this has not benefited from direct analyses of ancient genomes. Here we report nine ancient genomes (~1×) of individuals from northern Britain: seven from a Roman era York cemetery, bookended by earlier Iron-Age and later Anglo-Saxon burials. Six of the Roman genomes show affinity with modern British Celtic populations, particularly Welsh, but significantly diverge from populations from Yorkshire and other eastern English samples. They also show similarity with the earlier Iron-Age genome, suggesting population continuity, but differ from the later Anglo-Saxon genome. This pattern concords with a profound impact of migrations in the Anglo-Saxon period. Strikingly, one Roman skeleton shows a clear signal of exogenous origin, with affinities pointing towards the Middle East, confirming the cosmopolitan character of the Empire, even at its northernmost fringes.
Abstract:
This chapter analyses how children, and especially boys, are constructed as ‘savage’ in relation to warlike toys and representations that narrate particular versions of conflict, such as war and terrorism. The chapter uses Action Man toys as a case study that is contextualized against a wider background of other toys, television programmes and films. Action Man is most familiar as a twelve-inch costumed toy figure, but the brand also extends into related media representations such as television programmes, comics and advertising. The chapter focuses increasingly on the specifics of Action Man representations produced from the 1960s to the 1990s, prefacing this detailed discussion with some examples of transmedia texts aimed at children in film and television. This chapter suggests that making the toy a central object of analysis allows for insights into representations of the gendered body that are particularly useful for work on the child-savage analogy. Some of the cultural meanings of war toys, warlike play and representations of war that can be analysed from this perspective include their role in the construction of masculine identity, their representation of particular wars and warlikeness in general, and their relationship to consumer society. This complex of meanings exhibits many of the contradictions that inhabit the construction of ‘the child’ in general, such as that the often extreme masculinity of war toys and games is countered by an aesthetic of spatial disposition, collecting and sometimes nurturing that is more conventionally feminine. Such inter-dependent but apparently opposed meanings can also be seen in the construction of the child as untainted by adult corruption yet also savage, or as in need of adult guidance yet also offering a model of innocence and purity that adults are expected to admire.
Abstract:
I consider the case for genuinely anonymous web searching. Big data seems to have it in for privacy. The story is well known, particularly since the dawn of the web. Vastly more personal information, monumental and quotidian, is gathered than in the pre-digital days. Once gathered it can be aggregated and analyzed to produce rich portraits, which in turn permit unnerving prediction of our future behavior. The new information can then be shared widely, limiting prospects and threatening autonomy. How should we respond? Following Nissenbaum (2011) and Brunton and Nissenbaum (2011 and 2013), I will argue that the proposed solutions—consent, anonymity as conventionally practiced, corporate best practices, and law—fail to protect us against routine surveillance of our online behavior. Brunton and Nissenbaum rightly maintain that, given the power imbalance between data holders and data subjects, obfuscation of one’s online activities is justified. Obfuscation works by generating “misleading, false, or ambiguous data with the intention of confusing an adversary or simply adding to the time or cost of separating good data from bad,” thus decreasing the value of the data collected (Brunton and Nissenbaum, 2011). The phenomenon is as old as the hills. Natural selection evidently blundered upon the tactic long ago. Take a savory butterfly whose markings mimic those of a toxic cousin. From the point of view of a would-be predator the data conveyed by the pattern is ambiguous. Is the bug lunch or potential last meal? In the light of the steep costs of a mistake, the savvy predator goes hungry. Online obfuscation works similarly, attempting for instance to disguise the surfer’s identity (Tor) or the nature of her queries (Howe and Nissenbaum 2009). Yet online obfuscation comes with significant social costs. First, it implies free riding. 
If I’ve installed an effective obfuscating program, I’m enjoying the benefits of an apparently free internet without paying the costs of surveillance, which are shifted entirely onto non-obfuscators. Second, it permits sketchy actors, from child pornographers to fraudsters, to operate with near impunity. Third, online merchants could plausibly claim that, when we shop online, surveillance is the price we pay for convenience. If we don’t like it, we should take our business to the local brick-and-mortar and pay with cash. Brunton and Nissenbaum have not fully addressed the last two costs. Nevertheless, I think the strict defender of online anonymity can meet these objections. Regarding the third, the future doesn’t bode well for offline shopping. Consider music and books. Intrepid shoppers can still find most of what they want in a book or record store. Soon, though, this will probably not be the case. And then there are those who, for perfectly good reasons, are sensitive about doing some of their shopping in person, perhaps because of their weight or sexual tastes. I argue that consumers should not have to pay the price of surveillance every time they want to buy that catchy new hit, that New York Times bestseller, or a sex toy.
Abstract:
Logistics is currently one of the most important competitive differentiators for companies worldwide. It involves every area of a company, which makes an exhaustive discussion of the subject difficult. This case study examines some aspects of the concept, specifically those concerned with customer service and its effect on a company's distribution system. The aim of this work is to provide information that makes it possible to define a distribution structure capable of adding value for customers and competitive advantage for the company. In this discussion, a literature review made clear that customer needs define the services a company should offer, and a field survey established what those services are. For this case study, a toy company, Plaything S/A Ind. e Com., with a strong national presence, was chosen, and its set of services is discussed from both an external and an internal perspective. On the external side, a survey establishes a hierarchy of customer needs and verifies whether the services offered by the company match those needs. At the same time, the survey positions the company relative to its competitors in terms of satisfying customer needs, that is, it identifies gaps between needs and the services offered. On the internal side, the study seeks to define the company's order profile as a means of adapting its warehousing structure and processes. This analysis shows that the company studied offers a level of service whose satisfaction is in line with its competitors but falls short of what its customers expect. It is also concluded that the customer profile causes differences in perception regarding the importance and satisfaction of the attributes surveyed.
The study of the order profile reveals that the company receives low-value, infrequent orders with few items. This profile forces the company to make the most of its internal physical distribution structure, in terms of warehousing as well as product picking and handling. The work concludes with some recommendations to the company studied, taking into account the survey results and the need to opt for innovative strategies suited to the rapid conceptual changes of the present day.
Abstract:
The operation of commercial banks depends on the success of their strategies for attracting time deposits. The Bank Deposit Certificate (CDB) is one of the instruments most used for this purpose. Ninety-five per cent of CDBs are floating-rate and indexed to the CDI, and a large share of them have a pre-defined lock-up date for redemption. This feature gives rise to the implicit redemption option offered to the investor: the investor may redeem the investment at any point between the lock-up date and the CDB's maturity without penalty. This work presents a method for pricing the redemption option embedded in the CDB using the Black-Derman-Toy model. The technique is novel in considering the level of the term structure of interest rates both with respect to the CDB curve observed in the market and with respect to its volatility. The volatility, however, is preserved and therefore not contaminated by fluctuations in the term structure. The procedure used CDBs from the Cetip database with values above five hundred thousand reais, issued between 2007 and 2009. All investors were assumed to be rational and not to need early access to their funds, so they redeemed only after the end of the lock-up period. To check the validity of the prices computed with the Black-Derman-Toy model, Monte Carlo simulation was applied, generating ten thousand paths for the asset price. The results obtained with the proposed model were confirmed when compared with the Monte Carlo simulation.
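The pricing-and-validation loop described above can be sketched on a toy Black-Derman-Toy-style tree. The median rates, volatility and tree size below are all assumptions for illustration; a real application would calibrate them to the CDB/CDI term structure as the dissertation does. A zero-coupon claim is priced by backward induction and the price is cross-checked by Monte Carlo over the same tree, mirroring the validation step. The embedded redemption option would enter as a max(continuation, redemption) comparison at each node after the lock-up date, which this sketch omits.

```python
import numpy as np

sigma, dt, n = 0.20, 1.0, 4
base = np.array([0.10, 0.11, 0.12, 0.12])   # hypothetical median short rates

def rate(t, j):
    # BDT-style lognormal short rate at step t after j up-moves
    return base[t] * np.exp(2 * sigma * np.sqrt(dt) * j)

# price a zero-coupon claim paying 1 at step n by backward induction,
# with risk-neutral up/down probabilities of 0.5
V = np.ones(n + 1)
for t in range(n - 1, -1, -1):
    j = np.arange(t + 1)
    V = np.exp(-rate(t, j) * dt) * 0.5 * (V[:-1] + V[1:])
tree_price = V[0]

# Monte Carlo over the same tree, as an independent check of the price
rng = np.random.default_rng(42)
ups = rng.integers(0, 2, size=(100_000, n))
j_before = np.cumsum(ups, axis=1) - ups     # up-moves before each step
path_rates = base[None, :] * np.exp(2 * sigma * np.sqrt(dt) * j_before)
mc_price = np.exp(-(path_rates * dt).sum(axis=1)).mean()
```

Because the simulation draws paths under the same risk-neutral measure used in the backward induction, the two prices agree up to Monte Carlo sampling error, which is the consistency the dissertation's ten-thousand-path check establishes.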