Abstract:
The communal lands of the Eastern Cape have been regarded as both tools and problems by policy-makers. In particular, communal lands are problematised as environmentally degraded, of suboptimal productivity and constraining economic development. The Eastern Cape Communal Lands Research Project was framed within this policy discourse with the aim of introducing legume-based pasture into ‘abandoned arable lands’. Initial results from community workshops show that the institutional arrangements for these arable lands vary widely and, with them, the capacity to utilise any new technology that may have application to them. Rather than simply drawing on social capital, a participatory research approach that aims to enhance the agency of the participating communities may need to contribute to building social capital, and especially to creating a dialogical space in which the matters being researched can be discussed meaningfully.
Abstract:
This article shows how the solution to the promotion problem—the problem of locating the optimal level of advertising in a downstream market—can be derived simply, empirically, and robustly through the application of some simple calculus and Bayesian econometrics. We derive the complete distribution of the level of promotion that maximizes producer surplus and generate recommendations about patterns as well as levels of expenditure that increase net returns. The theory and methods are applied to a quarterly series (1978:2–1988:4) on red meats promotion by the Australian Meat and Live-Stock Corporation. A slightly different pattern of expenditure would have profited lamb producers.
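The calculus at the heart of the promotion problem can be sketched in a few lines: posit a net-return function that is concave in advertising, set its derivative to zero, and solve. The square-root response below, the coefficient b, and the SciPy check are illustrative assumptions for a minimal sketch, not the paper's estimated Bayesian model.

```python
# Minimal sketch of the promotion problem: choose advertising A to maximize
# net producer surplus S(A) = gain(A) - A. The concave response b*sqrt(A)
# and all coefficients are illustrative assumptions, not estimated values.
import numpy as np
from scipy.optimize import minimize_scalar

b = 10.0  # assumed marginal effectiveness of promotion

def net_surplus(A):
    return b * np.sqrt(A) - A  # gain from promotion minus its cost

# Calculus: dS/dA = b/(2*sqrt(A)) - 1 = 0  =>  A* = (b/2)**2
A_star = (b / 2.0) ** 2

# Numerical check of the first-order condition
res = minimize_scalar(lambda A: -net_surplus(A), bounds=(1e-9, 1e4), method="bounded")
print(f"analytic optimum A* = {A_star:.2f}, numeric = {res.x:.2f}")
```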
Abstract:
In terms of evolution, the strategy of catching prey would have been an important part of survival in a constantly changing environment. A prediction mechanism would have developed to compensate for any delay in the sensory-motor system. A previous study found evidence of “proactive control”, in which the motion of the hands preceded the virtual moving target. These results implied that the positive phase shift of the hand motion reflects the proactive nature of the visual-motor control system, which attempts to minimize the brief error in the hand motion when the target changes position unexpectedly. In our study, a visual target moved in a circle (13 cm diameter) on a computer screen, and each subject was asked to track the target’s motion with the motion of a cursor. As the frequency of the target increased, a rhythmic component appeared in the velocity of the cursor even though the velocity of the target was constant. The generation of a rhythmic component cannot be explained simply as a feedback mechanism responding to the phase shift between the target and the cursor in a sensory-motor system. It therefore appears that the rhythmic component was generated to predict the velocity of the target, which is a feed-forward mechanism in the sensory-motor system. Here, we discuss the generation of the rhythmic component and its role in the feed-forward mechanism.
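As an illustration of how such a phase lead can be quantified, the sketch below cross-correlates a synthetic target velocity with a cursor velocity that leads it. The sampling rate, the revolution frequency, and the assumed 50 ms lead are stand-ins for demonstration, not the study's recordings.

```python
# Illustrative sketch: estimating the phase shift between target and cursor
# velocities with cross-correlation. The sinusoidal signals and the assumed
# 50 ms lead are stand-ins for real tracking data.
import numpy as np

fs = 200.0                      # sampling rate (Hz), assumed
t = np.arange(0, 10, 1 / fs)
f = 0.8                         # target revolution frequency (Hz), assumed
lead = 0.05                     # cursor leads the target by 50 ms (proactive)

v_target = np.cos(2 * np.pi * f * t)           # tangential target velocity
v_cursor = np.cos(2 * np.pi * f * (t + lead))  # cursor velocity, phase-advanced

# Cross-correlate and locate the lag of the peak
xcorr = np.correlate(v_cursor - v_cursor.mean(), v_target - v_target.mean(), "full")
lag = (np.argmax(xcorr) - (len(t) - 1)) / fs
print(f"estimated lead: {-lag * 1000:.1f} ms")  # positive => cursor precedes target
```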
Abstract:
The discourse surrounding the virtual has moved away from the utopian thinking accompanying the rise of the Internet in the 1990s. The Cyber-gurus of the last decades promised a technotopia removed from materiality and the confines of the flesh and the built environment, a liberation from old institutions and power structures. But since then, the virtual has grown into a distinct yet related sphere of cultural and political production that both parallels and occasionally flows over into the old world of material objects. The strict dichotomy of matter and digital purity has been replaced more recently with a more complex model where both the world of stuff and the world of knowledge support, resist and at the same time contain each other. Online social networks amplify and extend existing offline ones; other cultural interfaces like YouTube have not replaced the communal experience of watching moving images in a semi-public space (the cinema) or the semi-private space (the family living room). Rather, the experience of viewing is very much about sharing and communicating, offering interpretations and comments. Many of the web’s strongest entities (Amazon, eBay, Gumtree, etc.) sit exactly at this juncture, applying tools taken from the knowledge-management industry to organize the chaos of the material world along (post-)Fordist lines of rationality. Since the early 1990s there have been many artistic and curatorial attempts to use the Internet as a platform for producing and exhibiting art, but a lot of these were reluctant to let go of the fantasy of digital freedom. Storage Room collapses the binary opposition of real and virtual space by using online data storage as a conduit for IRL art production. The artworks here will not be available for viewing online in a 'screen' environment but only as part of a downloadable package, with the intention that the exhibition could be displayed (in a physical space) by any interested party and realised as ambitiously or minimally as the downloader wishes, based on their means. The artists will therefore also supply a set of instructions for the physical installation of the work alongside the digital files. In response to this curatorial initiative, File Transfer Protocol invites seven UK-based artists to produce digital art for a physical environment, addressing the intersection between the virtual and the material. The files range from sound, video, digital prints and net art to blueprints for an action to take place, something to be made, and a conceptual text piece. About the works and artists: Polly Fibre is the pseudonym of London-based artist Christine Ellison. Ellison creates live music using domestic devices such as sewing machines, irons and slide projectors. Her costumes and stage sets propose a physical manifestation of the virtual space that is created inside software like Photoshop. For this exhibition, Polly Fibre invites the audience to create a musical composition using a pair of amplified scissors and a turntable. http://www.pollyfibre.com John Russell, a founding member of 1990s art group Bank, is an artist, curator and writer who explores in his work the contemporary political conditions of the work of art. In his digital print, Russell collages together visual representations of abstract philosophical ideas and transforms them into a post-apocalyptic landscape that is complex and banal at the same time.
www.john-russell.org The work of Bristol-based artist Jem Noble opens up a dialogue between the contemporary and the legacy of 20th-century conceptual art around questions of collectivism and participation, authorship and individualism. His print SPACE concretizes the representation of the most common piece of Unicode: the vacant space between words. In this way, the gap itself turns from invisible cipher to sign. www.jemnoble.com Annabel Frearson is rewriting Mary Shelley's Frankenstein using all and only the words from the original text. Frankenstein 2, or the Monster of Main Stream, is read in parts by different performers, embodying the psychotic character of the protagonist, a mongrel hybrid of used language. www.annabelfrearson.com Darren Banks uses fragments of effect-laden Hollywood films to create an impossible space. The fictitious parts don't add up to a convincing material reality, leaving the viewer with a failed amalgamation of simulations of sophisticated technologies. www.darrenbanks.co.uk FIELDCLUB is a collaboration between artist Paul Chaney and researcher Kenna Hernly. Chaney and Hernly together developed a project that critically examines various proposals for the management of sustainable ecological systems. Their FIELDMACHINE invites the public to design an ideal agricultural field. By playing with different types of crops that are found in the south west of England, the user can, for example, create a balanced but protein-poor diet, or simply decide to 'get rid' of half the population. The meeting point of the Platonic field and its physical consequences generates a geometric abstraction that investigates the relationship between modernist utopianism and contemporary actuality. www.fieldclub.co.uk Pil and Galia Kollectiv, who have also curated the exhibition, are London-based artists and run the xero, kline & coma gallery. Here they present a dialogue between two computers. The conversation opens with a simple textbook problem in business studies. But gradually the language, mimicking the application of game theory in the business sector, becomes more abstract. The two interlocutors become adversaries trapped forever in a competition without winners. www.kollectiv.co.uk
Abstract:
We present a new sparse shape modeling framework based on the Laplace-Beltrami (LB) eigenfunctions. Traditionally, the LB eigenfunctions are used as a basis for intrinsically representing surface shapes by forming a Fourier series expansion. To reduce high-frequency noise, only the first few terms are used in the expansion and the higher-frequency terms are simply thrown away. However, some lower-frequency terms may not necessarily contribute significantly to reconstructing the surfaces. Motivated by this observation, we propose to select only the significant eigenfunctions by imposing an l1-penalty. The new sparse framework can further avoid the additional surface-based smoothing often used in the field. The proposed approach is applied in investigating the influence of age (38-79 years) and gender on amygdala and hippocampus shapes in the normal population. In addition, we show how the emotional response is related to the anatomy of these subcortical structures.
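A one-dimensional analogue conveys the idea: substitute a cosine basis for the LB eigenfunctions and impose the l1-penalty with an off-the-shelf Lasso solver. The basis, the synthetic signal, and the penalty weight below are illustrative assumptions, not the paper's pipeline.

```python
# Sketch of l1-penalized basis selection, using a 1-D cosine basis as a
# stand-in for Laplace-Beltrami eigenfunctions on a surface.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 400)
K = 50                                            # number of basis functions
Phi = np.cos(np.pi * np.outer(t, np.arange(K)))   # "eigenfunction" basis

# Ground truth mixes low- AND mid-frequency terms, plus noise
signal = 2*Phi[:, 1] + 1.5*Phi[:, 7] + 0.8*Phi[:, 23] + 0.1*rng.standard_normal(len(t))

fit = Lasso(alpha=0.05).fit(Phi, signal)
selected = np.flatnonzero(np.abs(fit.coef_) > 1e-3)
print("retained eigenfunction indices:", selected)
# Unlike simple truncation (keep indices < k), the l1-penalty keeps whichever
# terms matter, including higher-frequency ones, and zeroes the rest.
```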
Abstract:
An experiment published in this Journal has been revisited, and it is found that the pattern of the anodic polarization curve for iron repeats itself successively when the potential scan is repeated. It is surprising that this observation has not been reported previously in the literature, because it immediately calls into question the long-accepted and well-known explanations involving a passive film. A plausible qualitative explanation of this new finding is provided from surprisingly simple principles. Some important pedagogic conclusions are derived from this work. It is noteworthy that this somewhat complicated phenomenon can be explained simply, providing two important lessons to students: first, even well-accepted scientific work studying simple processes may be incomplete and worthy of further study; and second, such processes may be explained simply at the undergraduate level. The contents of the paper also confirm that presenting curricular content in a new and more correct manner is beneficial and interesting, and that research into curricular content is an important form of educational research.
Abstract:
Using aggregate indices of education, health, demographic, and gender equality outcomes, we empirically investigate the hypothesis that Bangladesh has achieved a higher level of social development than countries at a similar level of per capita income. Stylized facts and cross-country regression results support this hypothesis across a broad range of dimensions. Further tests show that these achievements do not simply reflect income-mediated channels and social expenditure programs. We conclude by speculating on the role of Bangladesh’s social development in sustaining its growth, and on the role of governance and institutional quality in the nexus between growth and development.
Abstract:
This paper presents an account of the literacy activities engaged in by the parents of 29 children around the time that the children were about to start school at Key Stage 1. Fifteen of the children were reading fluently before they began school; the remaining fourteen were matched for age, sex, receptive vocabulary scores, preschool group attended and socio-economic family status, but were not reading fluently. In order to ascertain that the fluent readers were not simply coming from homes where literacy activities were more in evidence, parents were asked to report on their own literacy activities. The data obtained indicated that there were no systematic differences in the activities of the two sets of parents. They also showed that there was considerable evidence of literacy activity in the homes. It is argued that, whilst the home environment is highly instrumental in nurturing literacy development, it is not enough to account for precocious reading ability.
Abstract:
With the introduction of new observing systems based on asynoptic observations, the analysis problem has changed in character. In the near future we may expect that a considerable part of meteorological observations will be unevenly distributed in four dimensions, i.e. three dimensions in space and one in time. The term analysis, or objective analysis in meteorology, means the process of interpolating meteorological observations from unevenly distributed locations to a network of regularly spaced grid points. Necessitated by the requirement of numerical weather prediction models to solve the governing finite-difference equations on such a grid lattice, objective analysis is a three-dimensional (or, more often, two-dimensional) interpolation technique. As a consequence of the structure of the conventional synoptic network, with its separated data-sparse and data-dense areas, four-dimensional analysis has in fact been used intensively for many years. Weather services have thus based their analyses not only on synoptic data at the time of the analysis and on climatology, but also on fields predicted from the previous observation hour and valid at the time of the analysis. The inclusion of the time dimension in objective analysis will be called four-dimensional data assimilation. From one point of view it seems possible to apply the conventional technique to the new data sources by simply reducing the time interval in the analysis-forecasting cycle. This could in fact be justified for the conventional observations as well. We have fairly good coverage of surface observations 8 times a day, and several upper-air stations make radiosonde and radiowind observations 4 times a day. If we use a 3-hour step in the analysis-forecasting cycle instead of the more usual 12 hours, we may without any difficulty treat all observations as synoptic. No observation would then be more than 90 minutes off time, and even observations taken during strong transient motion would fall within a horizontal mesh of 500 km × 500 km.
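The interpolation step itself can be illustrated with a classical Cressman-style distance-weighting scheme, the building block of successive-correction objective analysis. The station network, the "observed" temperature field, and the influence radius below are invented for demonstration.

```python
# Sketch of objective analysis: interpolate scattered observations to a
# regular grid with Cressman-style distance weighting. The field, station
# locations and influence radius are all illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
xs, ys = rng.uniform(0, 1000, 50), rng.uniform(0, 1000, 50)   # station coords (km)
obs = 15 + 5 * np.sin(xs / 300) + 3 * np.cos(ys / 200)        # "observed" temps

R = 250.0                                  # influence radius (km), assumed
gx, gy = np.meshgrid(np.linspace(0, 1000, 21), np.linspace(0, 1000, 21))
analysis = np.zeros_like(gx)

for i in np.ndindex(gx.shape):
    d2 = (xs - gx[i])**2 + (ys - gy[i])**2
    w = np.maximum((R**2 - d2) / (R**2 + d2), 0.0)  # Cressman weight, 0 beyond R
    analysis[i] = np.sum(w * obs) / np.sum(w) if w.sum() > 0 else np.nan

print(analysis.round(1))
```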
Abstract:
A necessary condition for a good probabilistic forecast is that the forecast system is shown to be reliable: forecast probabilities should equal observed frequencies when verified over a large number of cases. As climate change trends are now emerging from the natural variability, we can apply this concept to climate predictions and compute the reliability of simulated local and regional temperature and precipitation trends (1950–2011) in a recent multi-model ensemble of climate model simulations prepared for the Intergovernmental Panel on Climate Change (IPCC) fifth assessment report (AR5). With only a single verification time, the verification is over the spatial dimension. The local temperature trends appear to be reliable. However, when the global mean climate response is factored out, the ensemble is overconfident: the observed trend lies outside the range of modelled trends in many more regions than would be expected from the model estimate of natural variability and model spread. Precipitation trends are overconfident for all trend definitions. This implies that for near-term local climate forecasts the CMIP5 ensemble cannot simply be used as a reliable probabilistic forecast.
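The reliability concept itself can be demonstrated directly: bin the forecast probabilities and compare each bin's mean forecast probability against the observed frequency of the event. The synthetic, reliable-by-construction forecasts below are illustrative and are not CMIP5 output.

```python
# Sketch of a reliability check: in each probability bin, a reliable system's
# forecast probability matches the observed frequency of the event.
import numpy as np

rng = np.random.default_rng(2)
p_fcst = rng.uniform(0, 1, 10000)            # forecast probabilities
outcome = rng.uniform(0, 1, 10000) < p_fcst  # events drawn to be reliable

bins = np.linspace(0, 1, 11)
idx = np.digitize(p_fcst, bins) - 1
for b in range(10):
    m = idx == b
    if m.any():
        print(f"forecast {p_fcst[m].mean():.2f}  observed {outcome[m].mean():.2f}")
# An overconfident ensemble would instead show observed frequencies pulled
# toward the climatological mean relative to the forecast probabilities.
```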
Abstract:
However common it has become, the term World Cinema still lacks a proper, positive definition. Despite its all-encompassing, democratic vocation, it is not usually employed to mean cinema worldwide. On the contrary, the usual way of defining it is restrictive and negative, as ‘the non-Hollywood cinema’. Needless to say, negation here translates a positive intention to turn difference from the dominant model into a virtue to be rescued from an unequal competition. However, it unwittingly sanctions the American way of looking at the world, according to which Hollywood is the centre and all other cinemas are the periphery. As an alternative to this model, this chapter proposes:
• World Cinema is simply the cinema of the world. It has no centre. It is not the other, but it is us. It has no beginning and no end, but is a global process. World Cinema, as the world itself, is circulation.
• World Cinema is not a discipline, but a method, a way of cutting across film history according to waves of relevant films and movements, thus creating flexible geographies.
• As a positive, inclusive, democratic concept, World Cinema allows all sorts of theoretical approaches, provided they are not based on the binary perspective.
Abstract:
Atmospheric CO2 concentration is hypothesized to influence vegetation distribution via tree–grass competition, with higher CO2 concentrations favouring trees. The stable carbon isotope (δ13C) signature of vegetation is influenced by the relative importance of C4 plants (including most tropical grasses) and C3 plants (including nearly all trees), and the degree of stomatal closure – a response to aridity – in C3 plants. Compound-specific δ13C analyses of leaf-wax biomarkers in sediment cores of an offshore South Atlantic transect are used here as a record of vegetation changes in subequatorial Africa. These data suggest a large increase in C3 relative to C4 plant dominance after the Last Glacial Maximum. Using a process-based biogeography model that explicitly simulates 13C discrimination, it is shown that precipitation and temperature changes cannot explain the observed shift in δ13C values. The physiological effect of increasing CO2 concentration is decisive, altering the C3/C4 balance and bringing the simulated and observed δ13C values into line. It is concluded that CO2 concentration itself was a key agent of vegetation change in tropical southern Africa during the last glacial–interglacial transition. Two additional inferences follow. First, long-term variations in terrestrial δ13C values are not simply a proxy for regional rainfall, as has sometimes been assumed. Although precipitation and temperature changes have had major effects on vegetation in many regions of the world during the period between the Last Glacial Maximum and recent times, CO2 effects must also be taken into account, especially when reconstructing changes in climate between glacial and interglacial states. Second, rising CO2 concentration today is likely to be influencing tree–grass competition in a similar way, and thus contributing to the "woody thickening" observed in savannas worldwide. This second inference points to the importance of experiments to determine how vegetation composition in savannas is likely to be influenced by the continuing rise of CO2 concentration.
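The link between δ13C and the C3/C4 balance rests on a standard two-end-member mixing relation, which can be sketched in a few lines. The end-member values used below are typical literature figures for bulk tissue and are assumptions; they differ for leaf-wax biomarkers and shift with water stress, which is one reason δ13C is not simply a rainfall proxy.

```python
# Sketch of the standard two-end-member mixing behind the d13C interpretation:
# estimate the C4 fraction of vegetation from a measured d13C value. The
# end-member values (about -27 permil for C3, -12 permil for C4) are typical
# assumptions for bulk tissue, not calibrated biomarker values.
def c4_fraction(d13c, d13c_c3=-27.0, d13c_c4=-12.0):
    """Linear mixing d13c = f*d13c_c4 + (1 - f)*d13c_c3, solved for f."""
    return (d13c - d13c_c3) / (d13c_c4 - d13c_c3)

# Example: a shift from -18 to -22 permil implies a large drop in C4 dominance.
for d in (-18.0, -22.0):
    print(f"d13C = {d} permil  ->  C4 fraction = {c4_fraction(d):.0%}")
```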
Abstract:
The energy-Casimir stability method, also known as the Arnold stability method, has been widely used in fluid dynamical applications to derive sufficient conditions for nonlinear stability. The most commonly studied system is two-dimensional Euler flow. It is shown that the set of two-dimensional Euler flows satisfying the energy-Casimir stability criteria is empty for two important cases: (i) domains having the topology of the sphere, and (ii) simply-connected bounded domains with zero net vorticity. The results apply to both the first and the second of Arnold’s stability theorems. In the spirit of Andrews’ theorem, this puts a further limitation on the applicability of the method.
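For orientation, the construction whose applicability is being limited can be sketched compactly; sign conventions vary between texts, and the following is one common choice rather than the paper's exact formulation.

```latex
% Energy-Casimir functional for 2-D Euler flow: kinetic energy plus a
% Casimir built from an arbitrary function C of the vorticity q.
\[
  \mathcal{A}[q] \;=\; \tfrac{1}{2}\int_D |\nabla \psi|^2 \, dA \;+\; \int_D C(q)\, dA,
  \qquad q = \nabla^2 \psi .
\]
% For a steady flow with streamfunction-vorticity relation \(\psi = \Psi(q)\),
% choosing \(C'(q) = \Psi(q)\) makes the first variation vanish there. Arnold's
% first theorem then gives nonlinear stability when the second variation
% \(\delta^2\mathcal{A} = \int_D |\nabla \delta\psi|^2 + \Psi'(q)\,(\delta q)^2 \, dA\)
% is positive definite, e.g. when
\[
  0 \;<\; c_1 \;\le\; \frac{d\Psi}{dq} \;\le\; c_2 \;<\; \infty .
\]
```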
Abstract:
Processing of highly perishable, non-storable crops such as tomato is typically promoted for two reasons: as a way of absorbing excess supply, particularly during the gluts that result from predominantly rainfed cultivation; and as a value-added process that enhances the value chain. For Ghana, improving domestic tomato processing would also reduce the country’s dependence on imported tomato paste, and so improve foreign exchange reserves, as well as provide employment and development opportunities in the country’s poor rural areas. Many reports simply repeat the mantra that processing offers a way of buying up the glut. Yet the reality is that the “tomato gluts,” an annual feature of the local press, occur only for a few weeks of the year, and are almost always the result of large volumes of rainfed local varieties, unsuitable for processing, entering the fresh market at the same time, not of the improved varieties that could be used by the processors. For most of the year, the price of tomatoes suitable for processing is above the breakeven price for tomato processors, given the competition from imports. Improved varieties (such as Pectomech) that are suitable for processing are also preferred by consumers and command a premium price over the local varieties.
Abstract:
Interactions between different convection modes can be investigated using an energy-cycle description under a framework of mass-flux parameterization. The present paper systematically investigates this system by taking the limit of two modes: shallow and deep convection. Shallow convection destabilizes itself as well as the other convective modes by moistening and cooling the environment, whereas deep convection stabilizes itself as well as the other modes by drying and warming the environment. As a result, shallow convection leads to a runaway growth process in its stand-alone mode, whereas deep convection simply damps out. Interaction between these two convective modes becomes a rich problem, even when it is limited to the case with no large-scale forcing, because of these opposing tendencies. Only if the two modes are coupled at a proper level can a self-sustaining system arise, exhibiting a periodic cycle. The present study establishes the conditions for self-sustaining periodic solutions. It carefully documents the behaviour of the two-mode system in order to facilitate the interpretation of global model behaviours when this energy cycle is implemented as a closure in a convection parameterization in the future.
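The qualitative behaviour described here (one mode that grows on its own, one that decays on its own, and a periodic cycle only under suitable coupling) can be mimicked by a predator-prey style toy system. The equations and coefficients below are illustrative stand-ins, not the actual energy-cycle equations of the parameterization.

```python
# Illustrative two-mode energy cycle written as a predator-prey style system:
# shallow convection s grows on its own, deep convection d decays on its own,
# and their coupling yields a self-sustaining periodic cycle.
import numpy as np
from scipy.integrate import solve_ivp

a, b, c, e = 1.0, 0.5, 1.0, 0.5   # growth/damping/coupling rates, assumed

def rhs(t, y):
    s, d = y
    return [a * s - b * s * d,    # shallow: self-amplifying, suppressed by deep
            -c * d + e * s * d]   # deep: self-damping, energized by shallow

sol = solve_ivp(rhs, (0, 30), [1.0, 0.5], dense_output=True, rtol=1e-8)
t = np.linspace(0, 30, 7)
for ti, (s, d) in zip(t, sol.sol(t).T):
    print(f"t={ti:5.1f}  shallow={s:6.3f}  deep={d:6.3f}")
```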