966 results for Perfect Pyramids
Abstract:
The academic discipline of television studies has been constituted by the claim that television is worth studying because it is popular. Yet this claim has also entailed a need to defend the subject against the triviality that is associated with the television medium because of its very popularity. This article analyses the many attempts in the later twentieth and twenty-first centuries to constitute critical discourses about television as a popular medium. It focuses on how the theoretical currents of Television Studies emerged and changed in the UK, where a disciplinary identity for the subject was founded by borrowing from related disciplines, yet argued for the specificity of the medium as an object of criticism. Eschewing technological determinism, moral pathologization and sterile debates about television's supposed effects, UK writers such as Raymond Williams addressed television as an aspect of culture. Television theory in Britain has been part of, and also separate from, the disciplinary fields of media theory, literary theory and film theory. It has focused its attention on institutions, audio-visual texts, genres, authors and viewers according to the ways that research problems and theoretical inadequacies have emerged over time. But a consistent feature has been the problem of moving from a descriptive discourse to an analytical and evaluative one, and from studies of specific texts, moments and locations of television to larger theories. By discussing some historically significant critical work about television, the article considers how academic work has constructed relationships between the different kinds of objects of study. The article argues that a fundamental tension between descriptive and politically activist discourses has confused academic writing about ›the popular‹. 
Television study in Britain arose not to supply graduate professionals to the television industry, nor to perfect the instrumental techniques of allied sectors such as advertising and marketing, but to analyse and critique the medium's aesthetic forms and to evaluate its role in culture. Since television cannot be made by ›the people‹, the empowerment that discourses of television theory and analysis aimed for was focused on disseminating the tools for critique. Recent developments in factual entertainment television (in Britain and elsewhere) have greatly increased the visibility of ›the people‹ in programmes, notably in docusoaps, game shows and other participative formats. This has led to renewed debates about whether such ›popular‹ programmes appropriately represent ›the people‹ and how factual entertainment that is often despised relates to genres hitherto considered to be of high quality, such as scripted drama and socially-engaged documentary television. A further aspect of this problem of evaluation is how television globalisation has been addressed, and the example that the issue has crystallised around most is the reality TV contest Big Brother. Television theory has been largely based on studying the texts, institutions and audiences of television in the Anglophone world, and thus in specific geographical contexts. The transnational contexts of popular television have been addressed as spaces of contestation, for example between Americanisation and national or regional identities. Commentators have been ambivalent about whether the discipline's role is to celebrate or critique television, and whether to do so within a national, regional or global context. 
In the discourses of the television industry, ›popular television‹ is a quantitative and comparative measure, and because of the overlap between the programming with the largest audiences and the scheduling of established programme types at the times of day when the largest audiences are available, it has a strong relationship with genre. The measurement of audiences and the design of schedules are carried out in predominantly national contexts, but the article refers to programmes like Big Brother that have been broadcast transnationally, and programmes that have been extensively exported, to consider in what ways they too might be called popular. Strands of work in television studies have at different times attempted to diagnose what is at stake in the most popular programme types, such as reality TV, situation comedy and drama series. This has centred on questions of how aesthetic quality might be discriminated in television programmes, and how quality relates to popularity. The interaction of the designations ›popular‹ and ›quality‹ is exemplified in the ways that critical discourse has addressed US drama series that have been widely exported around the world, and the article shows how the two critical terms are both distinct and interrelated. In this context and in the article as a whole, the aim is not to arrive at a definitive meaning for ›the popular‹ inasmuch as it designates programmes or indeed the medium of television itself. Instead the aim is to show how, in historically and geographically contingent ways, these terms and ideas have been dynamically adopted and contested in order to address a multiple and changing object of analysis.
Abstract:
The paper analyzes the performance of the unconstrained filtered-x LMS (FxLMS) algorithm for active noise control (ANC), in which the constraints that the controller must be causal and have a finite impulse response are removed. It is shown that the unconstrained FxLMS algorithm, if stable, always converges to the true optimum filter, even if the estimate of the secondary path is not perfect, and that its final mean square error is independent of the secondary path. Moreover, we show that the necessary and sufficient stability condition for the feedforward unconstrained FxLMS is that the maximum phase error of the secondary-path estimate be within 90°, which is the only necessary condition for the feedback unconstrained FxLMS. The significance of the analysis for a practical system is also discussed. Finally, we show how the obtained results can guide the design of a robust feedback ANC headset.
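The core of the conventional (causal, FIR) filtered-x LMS iteration that this analysis generalises can be sketched in a few lines. The toy path coefficients, step size and filter length below are illustrative assumptions, not values from the paper; the secondary-path estimate is deliberately imperfect but keeps its phase error small:

```python
import numpy as np

rng = np.random.default_rng(0)
n, L, mu = 20000, 16, 0.002          # samples, controller taps, step size
x = rng.standard_normal(n)           # reference noise at the reference mic
P = np.array([0.0, 0.8, 0.3])        # primary path (toy values, assumed)
S = np.array([0.0, 0.6, 0.2])        # true secondary path (assumed)
S_hat = np.array([0.0, 0.55, 0.25])  # imperfect estimate, small phase error

d = np.convolve(x, P)[:n]            # disturbance at the error mic
fx = np.convolve(x, S_hat)[:n]       # reference filtered through the estimate

w = np.zeros(L)                      # adaptive FIR controller
xbuf = np.zeros(L); fxbuf = np.zeros(L); ybuf = np.zeros(len(S))
err = np.zeros(n)
for k in range(n):
    xbuf = np.roll(xbuf, 1); xbuf[0] = x[k]
    y = w @ xbuf                         # anti-noise sample
    ybuf = np.roll(ybuf, 1); ybuf[0] = y
    e = d[k] + S @ ybuf                  # residual heard at the error mic
    fxbuf = np.roll(fxbuf, 1); fxbuf[0] = fx[k]
    w -= mu * e * fxbuf                  # filtered-x LMS update
    err[k] = e

print(np.mean(d**2), np.mean(err[-1000:]**2))  # noise power before vs after
```

Despite `S_hat` only approximating `S`, the residual power at the error microphone drops well below the disturbance power, in line with the phase-error condition the abstract describes.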
Abstract:
We consider reshaping an obstacle virtually by using transformation optics in acoustic and electromagnetic scattering. Among the general virtual reshaping results, virtual minification and virtual magnification in particular are studied. Stability estimates are derived for the scattering amplitude in terms of the diameter of a small obstacle, which implies that the limiting case for minification corresponds to perfect cloaking, i.e., the obstacle is invisible to detection.
Abstract:
Virulence for bean and soybean is determined by effector genes in a plasmid-borne pathogenicity island (PAI) in race 7 strain 1449B of Pseudomonas syringae pv. phaseolicola. One of the effector genes, avrPphF, confers either pathogenicity, virulence, or avirulence depending on the plant host and is absent from races 2, 3, 4, 6, and 8 of this pathogen. Analysis of cosmid clones and comparison of DNA sequences showed that the absence of avrPphF from strain 1448A is due to deletion of a continuous 9.5-kb fragment. The remainder of the PAI is well conserved in strains 1448A and 1449B. The left junction of the deleted region consists of a chimeric transposable element generated from the fusion of homologs of IS1492 from Pseudomonas putida and IS1090 from Ralstonia eutropha. The borders of the deletion were conserved in 66 P. syringae pv. phaseolicola strains isolated in different countries and representing the five races lacking avrPphF. However, six strains isolated in Spain had a 10.5-kb deletion that extended 1 kb further from the right junction. The perfect conservation of the 28-nucleotide right repeat of the IS1090 homolog in the two deletion types and in the other 47 insertions of the IS1090 homolog in the 1448A genome strongly suggests that the avrPphF deletions were mediated by the activity of the chimeric mobile element. Our data strongly support a clonal origin for the races of P. syringae pv. phaseolicola lacking avrPphF.
Abstract:
In 1999, Elizabeth Hills pointed up the challenges that physically active women on film still posed, in cultural terms, and in relation to certain branches of feminist theory. Since then, a remarkable number of emphatically active female heroes have appeared on screen, from 'Charlie’s Angels' to 'Resident Evil', 'Aeon Flux', and the 'Matrix' and 'X-Men' trilogies. Nevertheless, in a contemporary Western culture frequently characterised as postfeminist, these seem to be the ‘acceptable face’ – and body – of female empowerment: predominantly white, heterosexual, often scantily clad, with the traditional hero’s toughness and resolve re-imagined in terms of gender-biased notions of decorum: grace and dignity alongside perfect hair and make-up, and a body that does not display unsightly markers of physical exertion. The homogeneity of these representations is worth investigating in relation to critical claims that valorise such air-brushed, high-kicking 'action babes' for their combination of sexiness and strength, and the feminist and postfeminist discourses that are refracted through such readings. Indeed, this arguably ‘safe’ set of depictions, dovetailing so neatly with certain postfeminist notions of ‘having it all’, suppresses particular kinds of spectacles in relation to the active female body: images of physical stress and extension, biological consequences of violence and dangerous motivations are all absent. I argue that the untidy female exertions refused in popular 'action babe' representations are now erupting into view in a number of other contemporaneous movies – 'Kill Bill' Vols 1 & 2, 'Monster', and 'Hard Candy' – that mark the return of that which is repressed in the mainstream vision of female power – that is, a more viscerally realistic physicality, rage and aggression. As such, these films engage directly with the issue of how to represent violent female agency.
This chapter explores what is at stake at a representational level and in terms of spectatorial processes of identification in the return of this particularly visceral rendering of the female avenger.
Abstract:
Abandon hope all ye who enter here: a society cannot be truly dystopian if travellers can come and go freely. Anti-utopias and 'satirical utopias' - that is, societies considered perfect by their advocates but not by the implied reader - must be well-regulated enough to prevent the possible disruption caused by a visitor. There is no exit at all from the classic twentieth-century dystopias, which end either in an actual death, like that of the Savage in Huxley's Brave New World (1932), or in a spiritual death like Winston Smith's in Orwell's Nineteen Eighty-Four (1949). Any glimmers of hope that the protagonist may have felt are quickly destroyed.
Abstract:
A square-planar compound [Cu(pyrimol)Cl] (pyrimol = 4-methyl-2-N-(2-pyridylmethylene)aminophenolate; abbreviated as CuL–Cl) is described as a biomimetic model of the enzyme galactose oxidase (GOase). This copper(II) compound is capable of stoichiometric aerobic oxidation of activated primary alcohols in acetonitrile/water to the corresponding aldehydes. It can be obtained either from Hpyrimol (HL) or its reduced/hydrogenated form Hpyramol (4-methyl-2-N-(2-pyridylmethyl)aminophenol; H2L), which readily converts to pyrimol (L-) on coordination to the copper(II) ion. Crystalline CuL–Cl and its bromide derivative exhibit a perfect square-planar geometry with Cu–O(phenolate) bond lengths of 1.944(2) and 1.938(2) Å. The cyclic voltammogram of CuL–Cl exhibits an irreversible anodic wave at +0.50 and +0.57 V versus ferrocene/ferrocenium (Fc/Fc+) in dry dichloromethane and acetonitrile, respectively, corresponding to oxidation of the phenolate ligand to the corresponding phenoxyl radical. In the strongly donating acetonitrile the oxidation path involves reversible solvent coordination at the Cu(II) centre. The presence of the dominant CuII–L. chromophore in the electrochemically and chemically oxidised species is evident from a new fairly intense electronic absorption at 400–480 nm ascribed to several electronic transitions having a mixed pi-pi(L.) intraligand and Cu–Cl -> L. charge transfer character. The EPR signal of CuL–Cl disappears on oxidation due to strong intramolecular antiferromagnetic exchange coupling between the phenoxyl radical ligand (L.) and the copper(II) centre, giving rise to a singlet ground state (S = 0). The key step in the mechanism of the primary alcohol oxidation by CuL–Cl is probably the alpha-hydrogen abstraction from the equatorially bound alcoholate by the phenoxyl moiety in the oxidised pyrimol ligand, Cu–L., through a five-membered cyclic transition state.
Abstract:
Increased penetration of distributed generation and decentralised control are considered a feasible and effective solution for reducing the cost and emissions associated with power generation and distribution, and hence for improving efficiency. Distributed generation in combination with multi-agent technology is a perfect candidate for this solution. The pro-active and autonomous nature of multi-agent systems can provide an effective platform for decentralised control whilst improving the reliability and flexibility of the grid.
Abstract:
Near-perfect vector phase conjugation was achieved at 488 nm in a methyl red dye impregnated polymethylmethacrylate film by employing a temperature tuning technique. Using a degenerate four-wave mixing geometry with vertically polarized counterpropagating pump beams, intensity and polarization gratings were written in the dye/polymer system using a vertically or horizontally polarized weak probe beam. Over a limited temperature range, as the sample was heated, the probe reflectivity from the polarization grating dropped but the reflectivity from the intensity grating rose sharply. At a sample temperature of approximately 50°C, the reflectivities of the gratings were measured to be equal and we confirmed that, at this temperature, the measured vector phase conjugate fidelity was very close to unity. We discuss a possible explanation of this effect.
Abstract:
This paper discusses concepts of value from the point of view of the user of the space and the counter view of the provider of the same. Land and property are factors of production. The value of the land flows from the use to which it is put, and that, in turn, is dependent upon the demand (and supply) for the product or service that is produced/provided from that space. If there is a high demand for the product (at a fixed level of supply), the price will increase and the economic rent for the land/property will increase accordingly. This is the underlying paradigm of Ricardian rent theory, where the supply of land is fixed and a single good is produced; in such a case the rent of land is wholly an economic rent. Economic theory generally distinguishes between two kinds of price: price of production or “value in use” (as determined by the labour theory of value), and market price or “value in exchange” (as determined by supply and demand). It is based on a coherent and consistent theory of value and price. Effectively the distinction is between what space is ‘worth’ to an individual and that space’s price of exchange in the market place. In a perfect market, where any individual has access to the same information as all others in the market, price and worth should coincide. However, in a market where access to information is not uniform, and where different uses compete for the same space, it is more likely that the two figures will diverge. This paper argues that valuers' traditional reliance on methods of comparison to determine “price” has led to an artificial divergence of “value in use” and “value in exchange”; as such comparisons become more difficult, owing to the diversity of lettings in the market place, there will be a requirement to return to fundamentals and pay heed to the thought process of the user in assessing the worth of the space to be let.
Abstract:
This paper examines one of the central issues in the formulation of a sector/regional real estate portfolio strategy, i.e. whether the means, standard deviations and correlations between the returns are sufficiently stable over time to justify using ex-post measures as proxies for the ex-ante portfolio inputs required for MPT. To investigate these issues, this study conducts a number of tests of the inter-temporal stability of the total returns of the 19 UK sector/regions of the IPDMI. The results of the analysis reveal that the theoretical gains in sector and/or regional diversification found in previous work could not have been readily achieved in practice without almost perfect foresight on the part of an investor, as means, standard deviations and correlations varied markedly from period to period.
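The inter-temporal stability test at issue can be illustrated by computing sub-period correlations on a rolling basis. The two monthly return series below are synthetic stand-ins, not IPDMI data; the window length and parameters are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
# synthetic monthly total returns for two hypothetical sector/regions,
# built around a shared market factor (population correlation ~0.36)
n = 240
common = rng.standard_normal(n)
a = 0.005 + 0.020 * (0.6 * common + 0.8 * rng.standard_normal(n))
b = 0.004 + 0.025 * (0.6 * common + 0.8 * rng.standard_normal(n))

window = 60  # non-overlapping 5-year sub-periods
corrs = [np.corrcoef(a[i:i + window], b[i:i + window])[0, 1]
         for i in range(0, n - window + 1, window)]
print([round(c, 2) for c in corrs])  # sub-period correlations vary by sampling alone
```

Even with a constant underlying correlation, the sub-period estimates scatter noticeably, which is the practical obstacle to treating ex-post correlations as ex-ante inputs.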
Abstract:
Gardner's popular model of perfect competition in the marketing sector is extended to a conjectural-variations oligopoly with endogenous entry. Revising Gardner's comparative statics on the "farm-retail price ratio," tests of hypotheses about food industry conduct are derived. Using data from a recent article by Wohlgenant, which employs Gardner's framework, tests are made of the validity of his maintained hypothesis that the food industries are perfectly competitive. No evidence is found of departures from competition in the output markets of the food industries of eight commodity groups: (a) beef and veal, (b) pork, (c) poultry, (d) eggs, (e) dairy, (f) processed fruits and vegetables, (g) fresh fruit, and (h) fresh vegetables.
Abstract:
Data assimilation is predominantly used for state estimation: combining observational data with model predictions to produce an updated model state that most accurately approximates the true system state whilst keeping the model parameters fixed. This updated model state is then used to initiate the next model forecast. Even with perfect initial data, inaccurate model parameters will lead to the growth of prediction errors. To generate reliable forecasts we need good estimates of both the current system state and the model parameters. This paper presents research into data assimilation methods for morphodynamic model state and parameter estimation. First, we focus on state estimation and describe the implementation of a three-dimensional variational (3D-Var) data assimilation scheme in a simple 2D morphodynamic model of Morecambe Bay, UK. The assimilation of observations of bathymetry derived from SAR satellite imagery and a ship-borne survey is shown to significantly improve the predictive capability of the model over a 2-year run. Here, the model parameters are set by manual calibration; this is laborious and is found to produce different parameter values depending on the type and coverage of the validation dataset. The second part of this paper considers the problem of model parameter estimation in more detail. We explain how, by employing the technique of state augmentation, it is possible to use data assimilation to estimate uncertain model parameters concurrently with the model state. This approach removes inefficiencies associated with manual calibration and enables more effective use of observational data. We outline the development of a novel hybrid sequential 3D-Var data assimilation algorithm for joint state-parameter estimation and demonstrate its efficacy using an idealised 1D sediment transport model.
The results of this study are extremely positive and suggest that there is great potential for the use of data assimilation-based state-parameter estimation in coastal morphodynamic modelling.
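A minimal sketch of the state-augmentation idea, using a toy scalar decay model and a stochastic ensemble Kalman update in place of the paper's hybrid sequential 3D-Var scheme; the model, priors and noise levels are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

a_true, x_true = 0.95, 1.0             # true decay parameter and initial state
n_steps, n_ens, obs_var = 40, 200, 0.01**2

# augmented ensemble: each member carries [state, parameter]
ens = np.column_stack([rng.normal(1.0, 0.3, n_ens),    # state prior
                       rng.normal(0.8, 0.1, n_ens)])   # parameter prior

for k in range(n_steps):
    x_true *= a_true
    ens[:, 0] *= ens[:, 1]                      # forecast with each member's parameter
    y = x_true + rng.normal(0, np.sqrt(obs_var))  # noisy observation of the state only
    # Kalman update of the augmented vector; observation operator H = [1, 0]
    P = np.cov(ens.T)                           # 2x2 state-parameter covariance
    K = P[:, 0] / (P[0, 0] + obs_var)           # gain for both state and parameter
    innov = y - ens[:, 0] + rng.normal(0, np.sqrt(obs_var), n_ens)  # perturbed obs
    ens += K[None, :] * innov[:, None]

print(ens[:, 1].mean())  # parameter estimate, pulled toward a_true = 0.95
```

The parameter is never observed directly; it is corrected through its forecast covariance with the observed state, which is exactly the mechanism state augmentation exploits.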
Abstract:
Producing projections of future crop yields requires careful thought about the appropriate use of atmosphere-ocean global climate model (AOGCM) simulations. Here we describe and demonstrate multiple methods for ‘calibrating’ climate projections using an ensemble of AOGCM simulations in a ‘perfect sibling’ framework. Crucially, this type of analysis assesses the ability of each calibration methodology to produce reliable estimates of future climate, which is not possible just using historical observations. This type of approach could be more widely adopted for assessing calibration methodologies for crop modelling. The calibration methods assessed include the commonly used ‘delta’ (change factor) and ‘nudging’ (bias correction) approaches. We focus on daily maximum temperature in summer over Europe for this idealised case study, but the methods can be generalised to other variables and other regions. The calibration methods, which are relatively easy to implement given appropriate observations, produce more robust projections of future daily maximum temperatures and heat stress than using raw model output. The choice over which calibration method to use will likely depend on the situation, but change factor approaches tend to perform best in our examples. Finally, we demonstrate that the uncertainty due to the choice of calibration methodology is a significant contributor to the total uncertainty in future climate projections for impact studies. We conclude that utilising a variety of calibration methods on output from a wide range of AOGCMs is essential to produce climate data that will ensure robust and reliable crop yield projections.
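The two most common calibration approaches named above can be sketched with made-up numbers. For a purely mean-based correction the two projections coincide in the mean, while their day-to-day variability differs: the delta method keeps the observed variability, bias correction keeps the model's. All temperatures below are illustrative, not data from the study:

```python
import numpy as np

# illustrative daily-max summer temperatures (degC); values are made up
obs_hist   = np.array([24.0, 26.5, 23.8, 27.2, 25.1])   # observations, baseline period
model_hist = np.array([22.5, 25.0, 22.0, 25.8, 23.6])   # model, same period (cold bias)
model_fut  = np.array([25.5, 28.2, 25.1, 28.9, 26.6])   # raw model projection

# delta / change-factor: perturb the observations by the model's mean change
delta = model_fut.mean() - model_hist.mean()
proj_delta = obs_hist + delta

# nudging / bias correction: remove the model's mean historical bias from the projection
bias = model_hist.mean() - obs_hist.mean()
proj_bc = model_fut - bias

print(round(delta, 2), round(bias, 2))
print(proj_delta.mean(), proj_bc.mean())  # identical means, different daily spreads
```

More elaborate variants correct higher moments or quantiles as well, which is where the choice of method starts to matter for heat-stress extremes.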
Abstract:
This paper shows the robust non-existence of competitive equilibria even in a simple three period representative agent economy with dynamically inconsistent preferences. We distinguish between a sophisticated and naive representative agent. Even when underlying preferences are monotone and convex, at given prices, we show by example that the induced preference of the sophisticated representative agent over choices in first-period markets is both non-convex and satiated. Even allowing for negative prices, the market-clearing allocation is not contained in the convex hull of demand. Finally, with a naive representative agent, we show that perfect foresight is incompatible with market clearing and individual optimization at given prices.