161 results for Semi-Representation
Abstract:
A UK field experiment compared a complete factorial combination of three backgrounds (cvs Mercia, Maris Huntsman and Maris Widgeon), three alleles at the Rht-B1 locus as Near Isogenic Lines (NILs: rht-B1a (tall), Rht-B1b (semi-dwarf), Rht-B1c (severe dwarf)) and four nitrogen (N) fertilizer application rates (0, 100, 200 and 350 kg N/ha). Linear+exponential functions were fitted to grain yield (GY) and nitrogen-use efficiency (NUE; GY/available N) responses to N rate. Averaged over N rate and background, Rht-B1b conferred significantly (P<0.05) greater GY, NUE, N uptake efficiency (NUpE; N in above-ground crop / available N) and N utilization efficiency (NUtE; GY / N in above-ground crop) compared with rht-B1a and Rht-B1c. However, the economically optimal N rate (Nopt) for N:grain price ratios of 3.5:1 to 10:1 was also greater for Rht-B1b, and because NUE, NUpE and NUtE all declined with N rate, Rht-B1b failed to increase NUE or its components at Nopt. The adoption of semi-dwarf lines in temperate and humid regions, and the greater N rates that such adoption justifies economically, greatly increases land-use efficiency, but not necessarily NUE.
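As an illustration of the response-curve approach summarized above, the following Python sketch (not the authors' code; the data values and the 5:1 price ratio are synthetic placeholders) fits a linear+exponential function to yield versus N rate and locates an economic optimum where the marginal grain return equals the N:grain price ratio.

```python
# A minimal sketch, assuming synthetic data: fit a linear+exponential N-response
# curve y = a + b*r**N + c*N to grain yield and locate the N rate at which the
# marginal grain return equals an assumed N:grain price ratio. None of the
# numbers below are from the experiment described above.
import numpy as np
from scipy.optimize import curve_fit, brentq

def lin_exp(N, a, b, r, c):
    """Linear+exponential yield response: y = a + b*r**N + c*N (yield in t/ha)."""
    return a + b * r**N + c * N

# Illustrative N rates (kg N/ha) and grain yields (t/ha)
N_rate = np.array([0, 50, 100, 150, 200, 250, 300, 350], dtype=float)
grain_yield = np.array([4.00, 5.93, 7.07, 7.74, 8.13, 8.34, 8.46, 8.50])

# Fit the response curve; bounds keep 0 < r < 1 so the exponential term decays
popt, _ = curve_fit(lin_exp, N_rate, grain_yield,
                    p0=[8.0, -4.0, 0.98, -0.0005],
                    bounds=([0, -20, 0.5, -0.01], [20, 0, 0.9999, 0.01]))
a, b, r, c = popt

def marginal_yield(N):
    """dY/dN in t grain per kg N for the fitted curve."""
    return b * np.log(r) * r**N + c

# Economic optimum: marginal return equals the price ratio, here an assumed
# 5:1 ratio, i.e. 5 kg grain per kg N = 0.005 t grain per kg N.
price_ratio = 0.005
N_opt = brentq(lambda N: marginal_yield(N) - price_ratio, 0.0, 500.0)
print(f"Fitted a={a:.2f}, b={b:.2f}, r={r:.4f}, c={c:.5f}; Nopt ~ {N_opt:.0f} kg N/ha")
```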
Abstract:
In the early 2000s the threat of Highly Pathogenic Avian Influenza captured the attention of the world's media. While China is often considered the epicentre of the panzootic, few studies have explored coverage of this variant of avian flu in China. To address this issue, the authors examined the portrayal of Highly Pathogenic Avian Influenza across four Chinese newspapers at the local and national levels. A textual analysis was performed on 160 articles across an eight-year period from 2001 to 2008. The study approach drew from Critical Discourse Analysis and Social Representation Theory. The headline analysis showed the extent to which the risk of the disease was subverted by the depiction of a strong and efficient ‘China’ that was a global leader in the fight against the disease. Ideological referents were called upon to stress teamwork in confronting the crisis. The diachronic analysis illustrated how the relationship between commercial interests, science and public health risks played out within the Chinese media.
Abstract:
Ice cloud representation in general circulation models remains a challenging task, due to the lack of accurate observations and the complexity of microphysical processes. In this article, we evaluate the ice water content (IWC) and ice cloud fraction statistical distributions from the numerical weather prediction models of the European Centre for Medium-Range Weather Forecasts (ECMWF) and the UK Met Office, exploiting the synergy between the CloudSat radar and CALIPSO lidar. Using the last three weeks of July 2006, we analyse the global ice cloud occurrence as a function of temperature and latitude and show that the models capture the main geographical and temperature-dependent distributions, but overestimate ice cloud occurrence in the Tropics in the temperature range from −60 °C to −20 °C and in the Antarctic at temperatures higher than −20 °C, while underestimating ice cloud occurrence at very low temperatures. A global statistical comparison of the occurrence of grid-box mean IWC at different temperatures shows that both the mean and range of IWC increase with increasing temperature. Globally, the models capture most of the IWC variability in the temperature range between −60 °C and −5 °C, and also reproduce the observed latitudinal dependencies in the IWC distribution due to different meteorological regimes. Two versions of the ECMWF model are assessed. The recent operational version with a diagnostic representation of precipitating snow and mixed-phase ice cloud fails to represent the IWC distribution in the −20 °C to 0 °C range, but a new version with prognostic variables for liquid water, ice and snow is much closer to the observed distribution. The comparison of models and observations provides a much-needed analysis of the vertical distribution of IWC across the globe, highlighting the ability of the models to reproduce much of the observed variability as well as the deficiencies where further improvements are required.
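The kind of statistical comparison described above can be illustrated with a short Python sketch (hypothetical inputs, not the study's processing chain) that bins grid-box mean IWC by temperature and returns occurrence frequencies per temperature bin.

```python
# An illustrative sketch (hypothetical inputs, not the study's processing chain):
# bin grid-box mean IWC by temperature and return, for each temperature bin, the
# occurrence frequency across logarithmic IWC bins. The arrays temp_obs/iwc_obs
# stand in for collocated CloudSat/CALIPSO-style samples or model output.
import numpy as np

def iwc_occurrence_by_temperature(iwc, temp, temp_edges, iwc_edges):
    """Occurrence frequency of IWC per temperature bin.

    Returns an array of shape (n_temp_bins, n_iwc_bins); each row sums to 1
    over in-range samples (or to 0 where a temperature bin is empty).
    """
    counts, _, _ = np.histogram2d(temp, iwc, bins=[temp_edges, iwc_edges])
    row_totals = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, row_totals, out=np.zeros_like(counts),
                     where=row_totals > 0)

# Hypothetical samples: temperature (deg C) and IWC (mg m^-3), with IWC
# broadly increasing towards warmer temperatures
rng = np.random.default_rng(0)
temp_obs = rng.uniform(-80.0, 0.0, 10000)
iwc_obs = np.exp(rng.normal(np.log1p(0.1 * (temp_obs + 80.0)), 1.0))

temp_edges = np.arange(-80, 5, 5)        # 5 deg C temperature bins
iwc_edges = np.logspace(-2, 3, 26)       # 0.01 to 1000 mg m^-3, log-spaced

freq = iwc_occurrence_by_temperature(iwc_obs, temp_obs, temp_edges, iwc_edges)
print(freq.shape)  # (16, 25): one occurrence distribution per temperature bin
```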
Abstract:
This piece is a contribution to the exhibition catalogue of Barbadian / Canadian artist Joscelyn Gardner's exhibition, 'Bleeding & Breeding' curated by Olexander Wlasenko, January 14-February 12, 2012 in the Station Gallery, Whitby, Ontario, Canada. The piece examines the ways in which Gardner's Creole Portraits II (2007) and Creole Portraits III (2009) issue a provocative and carefully crafted contestation to the journals of the slave-owner and amateur botanist Thomas Thistlewood. It argues that while Thistlewood’s journals make raced and gendered bodies seemingly available to knowledge, incorporating them within the colonial archive as signs of subjection, Gardner’s portraits disrupt these acts of history and knowledge. Her artistic response marks a radical departure from the significant body of scholarship that has drawn on the Thistlewood journals to date. Creatively contesting his narratives’ dispossession of Creole female subjects and yet aware of the problems of innocent recovery, her works style representations that retain the consciousness and effect of historical erasure. Through an oxymoronic aesthetic that assembles a highly crafted verisimilitude alongside the condition of invisibility and brings atrocity into the orbit of the aesthetic, these portraits force us to question what stakes are involved in bringing the lives of the enslaved and violated back into regimes of representation.
Abstract:
There has been considerable interest in the climate impact of trends in stratospheric water vapor (SWV). However, the representation of the radiative properties of water vapor under stratospheric conditions remains poorly constrained across different radiation codes. This study examines the sensitivity of a detailed line-by-line (LBL) code, a Malkmus narrow-band model and two broadband GCM radiation codes to a uniform perturbation in SWV in the longwave spectral region. The choice of sampling rate in wave number space (Δν) in the LBL code is shown to be important for calculations of the instantaneous change in heating rate (ΔQ) and the instantaneous longwave radiative forcing (ΔFtrop). ΔQ varies by up to 50% for values of Δν spanning 5 orders of magnitude, and ΔFtrop varies by up to 10%. In the three less detailed codes, ΔQ differs by up to 45% at 100 hPa and 50% at 1 hPa compared to a LBL calculation. This causes differences of up to 70% in the equilibrium fixed dynamical heating temperature change due to the SWV perturbation. The stratosphere-adjusted radiative forcing differs by up to 96% across the less detailed codes. The results highlight an important source of uncertainty in quantifying and modeling the links between SWV trends and climate.
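For context (these are standard definitions, not quoted from the paper): the longwave heating rate is the vertical divergence of the net radiative flux, Q = (g/c_p) ∂F_net/∂p, with F_net the net upward longwave flux, p pressure, g gravitational acceleration and c_p the specific heat of air at constant pressure; ΔQ is then the change in this quantity produced by the SWV perturbation, while ΔFtrop is the instantaneous change in net downward longwave flux at the tropopause.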
Abstract:
We study boundary value problems posed in a semistrip for the elliptic sine-Gordon equation, which is the paradigm of an elliptic integrable PDE in two variables. We use the method introduced by one of the authors, which provides a substantial generalization of the inverse scattering transform and can be used for the analysis of boundary as opposed to initial-value problems. We first express the solution in terms of a 2 by 2 matrix Riemann-Hilbert problem whose "jump matrix" depends on both the Dirichlet and the Neumann boundary values. For a well-posed problem one of these boundary values is an unknown function. This unknown function is characterised in terms of the so-called global relation, but in general this characterisation is nonlinear. We then concentrate on the case that the prescribed boundary conditions are zero along the unbounded sides of a semistrip and constant along the bounded side. This corresponds to a case of the so-called linearisable boundary conditions; however, a major difficulty for this problem is the existence of non-integrable singularities of the function q_y at the two corners of the semistrip; these singularities are generated by the discontinuities of the boundary condition at these corners. Motivated by the recent solution of the analogous problem for the modified Helmholtz equation, we introduce an appropriate regularisation which overcomes this difficulty. Furthermore, by mapping the basic Riemann-Hilbert problem to an equivalent modified Riemann-Hilbert problem, we show that the solution can be expressed in terms of a 2 by 2 matrix Riemann-Hilbert problem whose jump matrix depends explicitly on the width of the semistrip L, on the constant value d of the solution along the bounded side, and on the residues at the given poles of a certain spectral function denoted by h. The determination of the function h remains open.
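For reference (standard form, not quoted from the paper): the elliptic sine-Gordon equation for q(x, y) is q_xx + q_yy = sin(q), with subscripts denoting partial derivatives; the semistrip in question is bounded (of width L) in one direction and semi-infinite in the other.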
Abstract:
Although the somatosensory homunculus is a classically used description of the way somatosensory inputs are processed in the brain, the actual contributions of primary (SI) and secondary (SII) somatosensory cortices to the spatial coding of touch remain poorly understood. We studied adaptation of the fMRI BOLD response in the somatosensory cortex by delivering pairs of vibrotactile stimuli to the fingertips of the index and middle fingers. The first stimulus (adaptor) was delivered either to the index or to the middle finger of the right or left hand, whereas the second stimulus (test) was always administered to the left index finger. The overall BOLD response evoked by the stimulation was primarily contralateral in SI and was more bilateral in SII. However, our fMRI adaptation approach also revealed that both somatosensory cortices were sensitive to ipsilateral as well as to contralateral inputs. SI and SII adapted more after subsequent stimulation of homologous as compared with nonhomologous fingers, showing a distinction between different fingers. Most importantly, for both somatosensory cortices, this finger-specific adaptation occurred irrespective of whether the tactile stimulus was delivered to the same or to different hands. This result implies integration of contralateral and ipsilateral somatosensory inputs in SI as well as in SII. Our findings suggest that SI is more than a simple relay for sensory information and that both SI and SII contribute to the spatial coding of touch by discriminating between body parts (fingers) and by integrating the somatosensory input from the two sides of the body (hands).
Abstract:
The proteome of Salmonella enterica serovar Typhimurium was characterized by 2-dimensional HPLC mass spectrometry to provide a platform for subsequent proteomic investigations of low-level multiple antibiotic resistance (MAR). Bacteria (2.15 ± 0.23 × 10^10 cfu; mean ± s.d.) were harvested from liquid culture and proteins differentially fractionated, on the basis of solubility, into preparations representative of the cytosol, cell envelope and outer membrane proteins (OMPs). These preparations were digested by treatment with trypsin and peptides separated into fractions (n = 20) by strong cation exchange chromatography (SCX). Tryptic peptides in each SCX fraction were further separated by reversed-phase chromatography and detected by mass spectrometry. Peptides were assigned to proteins and consensus rank listings compiled using SEQUEST. A total of 816 ± 11 individual proteins were identified, which included 371 ± 33, 565 ± 15 and 262 ± 5 from the cytosolic, cell envelope and OMP preparations, respectively. A significant correlation was observed (r^2 = 0.62 ± 0.10; P < 0.0001) between consensus rank position for duplicate cell preparations, and an average of 74 ± 5% of proteins were common to both replicates. A total of 34 outer membrane proteins were detected, 20 of these from the OMP preparation. A range of proteins (n = 20) previously associated with the mar locus in E. coli were also found, including the key MAR effectors AcrA, TolC and OmpF.
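The replicate comparison reported above (squared correlation of consensus rank positions and the fraction of proteins common to both preparations) can be sketched in a few lines of Python; the protein rank values below are hypothetical placeholders, not data from the study.

```python
# A minimal sketch, with hypothetical protein ranks (not data from the study):
# compare consensus rank listings from duplicate preparations via the squared
# Pearson correlation of rank positions for shared proteins and the fraction of
# proteins detected in both replicates.
import numpy as np

def compare_replicates(ranks_a, ranks_b):
    """ranks_a, ranks_b: dicts mapping protein ID -> consensus rank position."""
    shared = sorted(set(ranks_a) & set(ranks_b))
    frac_common = len(shared) / len(set(ranks_a) | set(ranks_b))
    a = np.array([ranks_a[p] for p in shared], dtype=float)
    b = np.array([ranks_b[p] for p in shared], dtype=float)
    r = np.corrcoef(a, b)[0, 1]
    return r**2, frac_common

# Hypothetical rank positions for two replicate preparations
rep1 = {"AcrA": 12, "TolC": 30, "OmpF": 5, "OmpA": 1, "GroEL": 2}
rep2 = {"AcrA": 15, "TolC": 25, "OmpF": 7, "OmpA": 2, "Tuf": 3}

r2, frac_common = compare_replicates(rep1, rep2)
print(f"r^2 = {r2:.2f}, fraction common to both replicates = {frac_common:.2f}")
```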
Abstract:
The discourse surrounding the virtual has moved away from the utopian thinking accompanying the rise of the Internet in the 1990s. The cyber-gurus of the last decades promised a technotopia removed from materiality and the confines of the flesh and the built environment, a liberation from old institutions and power structures. But since then, the virtual has grown into a distinct yet related sphere of cultural and political production that both parallels and occasionally flows over into the old world of material objects. The strict dichotomy of matter and digital purity has more recently been replaced with a more complex model where both the world of stuff and the world of knowledge support, resist and at the same time contain each other. Online social networks amplify and extend existing ones; other cultural interfaces like YouTube have not replaced the communal experience of watching moving images in a semi-public space (the cinema) or the semi-private space (the family living room). Rather, the experience of viewing is very much about sharing and communicating, offering interpretations and comments. Many of the web’s strongest entities (Amazon, eBay, Gumtree etc.) sit exactly at this juncture, applying tools taken from the knowledge management industry to organize the chaos of the material world according to a (post-)Fordist rationality. Since the early 1990s there have been many artistic and curatorial attempts to use the Internet as a platform for producing and exhibiting art, but a lot of these were reluctant to let go of the fantasy of digital freedom.

Storage Room collapses the binary opposition of real and virtual space by using online data storage as a conduit for IRL art production. The artworks here will not be available for viewing online in a 'screen' environment but only as part of a downloadable package, with the intention that the exhibition could be displayed (in a physical space) by any interested party and realised as ambitiously or minimally as the downloader wishes, based on their means. The artists will therefore also supply a set of instructions for the physical installation of the work alongside the digital files. In response to this curatorial initiative, File Transfer Protocol invites seven UK-based artists to produce digital art for a physical environment, addressing the intersection between the virtual and the material. The files range from sound, video, digital prints and net art to blueprints for an action to take place, something to be made, a conceptual text piece, etc.

About the works and artists:

Polly Fibre is the pseudonym of London-based artist Christine Ellison. Ellison creates live music using domestic devices such as sewing machines, irons and slide projectors. Her costumes and stage sets propose a physical manifestation of the virtual space that is created inside software like Photoshop. For this exhibition, Polly Fibre invites the audience to create a musical composition using a pair of amplified scissors and a turntable. http://www.pollyfibre.com

John Russell, a founding member of 1990s art group Bank, is an artist, curator and writer who explores in his work the contemporary political conditions of the work of art. In his digital print, Russell collages together visual representations of abstract philosophical ideas and transforms them into a post-apocalyptic landscape that is complex and banal at the same time. www.john-russell.org

The work of Bristol-based artist Jem Noble opens up a dialogue between the contemporary and the legacy of 20th-century conceptual art around questions of collectivism and participation, authorship and individualism. His print SPACE concretizes the representation of the most common piece of Unicode: the vacant space between words. In this way, the gap itself turns from invisible cipher to sign. www.jemnoble.com

Annabel Frearson is rewriting Mary Shelley's Frankenstein using all and only the words from the original text. Frankenstein 2, or the Monster of Main Stream, is read in parts by different performers, embodying the psychotic character of the protagonist, a mongrel hybrid of used language. www.annabelfrearson.com

Darren Banks uses fragments of effect-laden Hollywood films to create an impossible space. The fictitious parts don't add up to a convincing material reality, leaving the viewer with a failed amalgamation of simulations of sophisticated technologies. www.darrenbanks.co.uk

FIELDCLUB is a collaboration between artist Paul Chaney and researcher Kenna Hernly. Chaney and Hernly developed together a project that critically examines various proposals for the management of sustainable ecological systems. Their FIELDMACHINE invites the public to design an ideal agricultural field. By playing with different types of crops that are found in the south west of England, it is possible for the user, for example, to create a balanced but protein-poor diet or to simply decide to 'get rid' of half the population. The meeting point of the Platonic field and its physical consequences generates a geometric abstraction that investigates the relationship between modernist utopianism and contemporary actuality. www.fieldclub.co.uk

Pil and Galia Kollectiv, who have also curated the exhibition, are London-based artists and run the xero, kline & coma gallery. Here they present a dialogue between two computers. The conversation opens with a simple textbook problem in business studies. But gradually the language, mimicking the application of game theory in the business sector, becomes more abstract. The two interlocutors become adversaries trapped forever in a competition without winners. www.kollectiv.co.uk
Abstract:
We present a new sparse shape modeling framework on the Laplace-Beltrami (LB) eigenfunctions. Traditionally, the LB-eigenfunctions are used as a basis for intrinsically representing surface shapes by forming a Fourier series expansion. To reduce high-frequency noise, only the first few terms are used in the expansion and higher-frequency terms are simply thrown away. However, some lower-frequency terms may not necessarily contribute significantly in reconstructing the surfaces. Motivated by this idea, we propose to retain only the significant eigenfunctions by imposing an l1-penalty. The new sparse framework can further avoid additional surface-based smoothing often used in the field. The proposed approach is applied in investigating the influence of age (38-79 years) and gender on amygdala and hippocampus shapes in the normal population. In addition, we show how the emotional response is related to the anatomy of the subcortical structures.
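A minimal Python sketch of the general idea (not the paper's implementation) is given below: surface coordinates are expanded in a precomputed Laplace-Beltrami eigenfunction basis and an l1 (lasso) penalty selects the significant eigenfunctions rather than truncating at a fixed number of low-frequency terms. The basis matrix Phi, the coordinate array and the penalty weight alpha are all assumed inputs; the demo data are random stand-ins.

```python
# A minimal sketch of the general idea, not the paper's implementation: expand
# surface coordinates in a precomputed Laplace-Beltrami eigenfunction basis and
# use an l1 (lasso) penalty to keep only significant eigenfunctions rather than
# truncating at a fixed number of low-frequency terms. Phi, coords and alpha
# are assumed inputs; the demo data at the bottom are random stand-ins.
import numpy as np
from sklearn.linear_model import Lasso

def sparse_lb_coefficients(Phi, coords, alpha=0.05):
    """Fit sparse expansion coefficients for the x, y, z coordinates.

    Phi    : (n_vertices, n_basis) LB eigenfunctions sampled at mesh vertices
    coords : (n_vertices, 3) vertex coordinates of the surface
    alpha  : l1 penalty weight; larger values zero out more eigenfunctions
    """
    lasso = Lasso(alpha=alpha, max_iter=10000)
    lasso.fit(Phi, coords)                   # independent lasso per coordinate
    return lasso.coef_.T, lasso.intercept_   # (n_basis, 3), (3,)

def reconstruct(Phi, coef, intercept):
    """Rebuild the (smoothed) surface from the selected eigenfunctions."""
    return Phi @ coef + intercept

# Random stand-in for a basis and a surface driven by 10 of 100 eigenfunctions
rng = np.random.default_rng(0)
Phi = rng.normal(size=(500, 100))
coords = Phi[:, :10] @ rng.normal(size=(10, 3))

coef, intercept = sparse_lb_coefficients(Phi, coords, alpha=0.05)
print("non-zero eigenfunctions per coordinate:", (np.abs(coef) > 1e-8).sum(axis=0))
```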
Abstract:
As a major mode of intraseasonal variability, which interacts with weather and climate systems on a near-global scale, the Madden-Julian Oscillation (MJO) is a crucial source of predictability for numerical weather prediction (NWP) models. Despite its global significance and comprehensive investigation, improvements in the representation of the MJO in an NWP context remain elusive. However, recent modifications to the model physics in the ECMWF model led to advances in the representation of atmospheric variability and the unprecedented propagation of the MJO signal through the entire integration period. In light of these recent advances, a set of hindcast experiments has been designed to assess the sensitivity of MJO simulation to the formulation of convection. Through the application of established MJO diagnostics, it is shown that the improvements in the representation of the MJO can be directly attributed to the modified convective parametrization. Furthermore, the improvements are attributed to the move from a moisture-convergence-dependent to a relative-humidity-dependent formulation for organized deep entrainment. It is concluded that, in order to understand the physical mechanisms through which a relative-humidity-dependent formulation for entrainment led to an improved simulation of the MJO, a more process-based approach should be taken. The application of process-based diagnostics to the hindcast experiments presented here will be the focus of Part II of this study.
Abstract:
This chapter looks into the gap between presentational realism and the representation of physical experience in Werner Herzog's work so as to retrieve the indexical trace – or the absolute materiality of death. To that end, it draws links between Herzog and other directors akin to realism in its various forms, including surrealism. In particular, it focuses on François Truffaut and Glauber Rocha, representing respectively the Nouvelle Vague and the Cinema Novo, whose works had a decisive weight on Herzog’s aesthetic choices, to the point of originating distinct phases of his output. The analyses, though restricted to a small number of films, intend to re-evaluate Herzog’s position within, and contribution to, film history.