100 results for Symbolic Computations
Pianos for the people: the manufacture, marketing and sale of pianos as consumer durables, 1850-1914
Abstract:
During the second half of the nineteenth century, British society experienced a rise in real incomes and a change in its composition, with the expansion of the middle classes. These two factors led to a consumer revolution, with a growing, but still segmented, demand for household goods that could express status and aspiration. At the same time, technological changes and new ways of marketing and selling goods made such goods more affordable. This paper analyses these themes and the process of mediation that took place between producers, retailers, and consumers by looking at the most culturally symbolic of nineteenth-century consumer goods: the piano.
Abstract:
This article examines the relationship between nationalism and liberal values, and more specifically the redefinition of boundaries between national communities and others in the rhetoric of radical right parties in Europe. The aim is to examine the tension between radical right party discourse and the increasing need to shape this discourse in liberal terms. We argue that the radical right parties that successfully operate within the democratic system tend to be those best able to tailor their discourse to the liberal and civic characteristics of national identity so as to present themselves and their ideologies as the true authentic defenders of the nation's unique reputation for democracy, diversity and tolerance. Comparing the success of a number of European radical right parties ranging from the most electorally successful SVP to the more mixed BNP, FN and NPD, we show that the parties that effectively deploy the symbolic resources of national identity through a predominantly voluntaristic prism tend to be the ones that fare better within their respective political systems. In doing so, we challenge the conventional view in the study of nationalism which expects civic values to shield countries from radicalism and extremism.
Abstract:
The Daochos Monument at Delphi has received some scholarly attention from an art-historical and archaeological perspective; this article, however, examines it rather as a reflection of contemporary Thessalian history and discourse, an aspect which has been almost entirely neglected. Through its visual imagery and its inscriptions, the monument adopts and adapts long-standing Thessalian themes of governance and identity, and achieves a delicate balance with Macedonian concerns to forge a symbolic rapprochement between powers and cultures in the Greek north. Its dedicator, Daochos, emerges as far more than just the puppet of Philip II of Macedon. This hostile and largely Demosthenic characterisation, which remains influential even in modern historiography, is far from adequate in allowing for an understanding of the relationship between Thessalian and Macedonian motivations at this time, or of the importance of Delphi as the pan-Hellenic setting of their interaction. Looking closely at the Daochos Monument instead allows for a rare glimpse into the Thessalian perspective in all its complexity.
Abstract:
How fast can a mammal evolve from the size of a mouse to the size of an elephant? Achieving such a large transformation calls for major biological reorganization. Thus, the speed at which this occurs has important implications for extensive faunal changes, including adaptive radiations and recovery from mass extinctions. To quantify the pace of large-scale evolution we developed a metric, clade maximum rate, which represents the maximum evolutionary rate of a trait within a clade. We applied this metric to body mass evolution in mammals over the last 70 million years, during which multiple large evolutionary transitions occurred in oceans and on continents and islands. Our computations suggest that it took a minimum of 1.6, 5.1, and 10 million generations for terrestrial mammal mass to increase 100-, 1,000-, and 5,000-fold, respectively. Values for whales were down to half those lengths (i.e., 1.1, 3, and 5 million generations), perhaps due to the reduced mechanical constraints of living in an aquatic environment. When differences in generation time are considered, we find an exponential increase in maximum mammal body mass during the 35 million years following the Cretaceous–Paleogene (K–Pg) extinction event. Our results also indicate a basic asymmetry in macroevolution: very large decreases (such as extreme insular dwarfism) can happen at more than 10 times the rate of increases. Our findings allow more rigorous comparisons of microevolutionary and macroevolutionary patterns and processes.
Keywords: haldanes, biological time, scaling, pedomorphosis
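To make the rate arithmetic above concrete, here is a minimal Python sketch of a clade-maximum-rate-style calculation; the function names, toy masses, and the exact rate definition (change in log10 body mass per generation) are illustrative assumptions, not the authors' published metric.

```python
import math

def log_mass_rate(mass_start_g, mass_end_g, generations):
    """Per-generation rate of change in log10 body mass (illustrative)."""
    return abs(math.log10(mass_end_g) - math.log10(mass_start_g)) / generations

def clade_maximum_rate(transitions):
    """Maximum rate over (start_mass, end_mass, generations) transitions in a clade."""
    return max(log_mass_rate(s, e, g) for s, e, g in transitions)

# Toy example: a 100-fold mass increase over 1.6 million generations,
# the minimum reported above for terrestrial mammals.
rate = log_mass_rate(20.0, 2_000.0, 1.6e6)
print(f"{rate:.2e} log10(mass) per generation")  # ~1.25e-06
```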
Abstract:
Using figures derived from the UK Home Office, this paper analyses and reviews the impact and deployment of Part V of the Criminal Justice and Public Order Act 1994 since its enactment. This is done with special reference to its impact on citizenship and the regulation of ‘the environment’ and associated rural spaces. It is argued that, notwithstanding the actual use of the public order clauses in Part V of the Act, its underlying meanings are largely of a symbolic nature. Such symbolism is, however, a powerful indication of the defence of particularist constructions of rural space. It can also open out new conditions of possibility, providing a useful ‘oppressed’ status and media spectacle for a range of protesters and activists.
The unsteady flow of a weakly compressible fluid in a thin porous layer II: three-dimensional theory
Abstract:
We consider the problem of determining the pressure and velocity fields for a weakly compressible fluid flowing in a three-dimensional layer, composed of an inhomogeneous, anisotropic porous medium, with vertical side walls and variable upper and lower boundaries, in the presence of vertical wells injecting and/or extracting fluid. Numerical solution of this three-dimensional evolution problem may be expensive, particularly in the case that the depth scale of the layer h is small compared to the horizontal length scale l, a situation which occurs frequently in the application to oil and gas reservoir recovery and which leads to significant stiffness in the numerical problem. Under the assumption that $\epsilon\propto h/l\ll 1$, we show that, to leading order in $\epsilon$, the pressure field varies only in the horizontal directions away from the wells (the outer region). We construct asymptotic expansions in $\epsilon$ in both the inner (near the wells) and outer regions and use the asymptotic matching principle to derive expressions for all significant process quantities. The only computations required are for the solution of non-stiff linear, elliptic, two-dimensional boundary-value, and eigenvalue problems. This approach, via the method of matched asymptotic expansions, takes advantage of the small aspect ratio of the layer, $\epsilon$, at precisely the stage where full numerical computations become stiff, and also reveals the detailed structure of the dynamics of the flow, both in the neighbourhood of wells and away from wells.
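As a schematic of the expansion structure the abstract describes (notation assumed here, not quoted from the paper), the outer-region pressure can be written as a power series in the aspect ratio, with the vertical dependence dropping out at leading order:

```latex
% Schematic only; notation is assumed, not quoted from the paper.
p = p_0 + \epsilon\, p_1 + \epsilon^2 p_2 + \cdots , \qquad \epsilon \propto h/l \ll 1 .
% At leading order the vertical (Darcy) balance forces
\frac{\partial p_0}{\partial z} = 0 \quad \Longrightarrow \quad p_0 = p_0(x, y, t) ,
% so the outer pressure varies only horizontally and satisfies a two-dimensional
% elliptic problem, matched to the inner (near-well) expansions.
```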
Abstract:
This is an extended version of Philip Murphy's inaugural lecture as director of the Institute of Commonwealth Studies, delivered on 23 February 2011. It traces the relationship of the UK with the wider Commonwealth over 40 years, paying particular attention to the rhetoric of governments and opposition parties from Wilson and Heath to Cameron. It examines the reasons for the Commonwealth being relegated to a peripheral role in British foreign policy, especially European preoccupations and the issues of Rhodesia and South Africa. It argues that the Commonwealth remains of considerable practical and enormous symbolic importance to the UK. The British government should engage with the Commonwealth more than it has done in the recent past and the Commonwealth should be both open to and critical of its imperial past.
Abstract:
A system for continuous data assimilation is presented and discussed. To simulate the dynamical development, a channel version of a balanced barotropic model is used, and geopotential (height) data are assimilated into the model's computations as they become available. In the first experiment the updating is performed every 24, 12 and 6 hours with a given network. The stations are distributed at random in 4 groups in order to simulate 4 areas with different densities of stations. Optimum interpolation is performed on the difference between the forecast and the valid observations. The RMS error of the analyses is reduced over time, and the error is smaller the more frequently the updating is performed. Updating every 6 hours yields an analysis error smaller than the RMS error of the observations. In a second experiment the updating is performed with data from a moving satellite with a side-scan capability of about 15°. If the satellite data are analysed at every time step before they are introduced into the system, the analysis error is reduced to a value below the RMS error of the observations after only 24 hours, and this yields on the whole a better result than updating from a fixed network. If the satellite data are introduced without any modification, the analysis error is reduced much more slowly, and it takes about 4 days to reach a result comparable to the one where the data have been analysed.
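A minimal sketch of the optimum-interpolation update described above, reduced to a single grid point in Python; the numbers, variable names, and scalar form are assumptions for illustration (the paper's network geometry, error covariances, and barotropic model are not reproduced):

```python
def oi_update(forecast, obs, var_f, var_o):
    """Blend forecast and observation, weighting the innovation by error variances."""
    gain = var_f / (var_f + var_o)      # optimal scalar weight
    return forecast + gain * (obs - forecast)

# Assimilate one geopotential height observation (values in metres, illustrative).
analysis = oi_update(forecast=5520.0, obs=5508.0, var_f=400.0, var_o=100.0)
print(analysis)  # 5510.4: pulled towards the observation, since var_f > var_o
```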
Abstract:
This article examines utopian gestures and inaugural desires in two films which became symbolic of the Brazilian Film Revival in the late 1990s: Central Station (1998) and Midnight (1999). Both revolve around the idea of an overcrowded or empty centre in a country trapped between past and future, in which the motif of the zero stands for both the announcement and the negation of utopia. The analysis draws parallels between them and new wave films which also elaborate on the idea of the zero, with examples drawn from Italian neo-realism, the Brazilian Cinema Novo and the New German Cinema. In Central Station, the ‘point zero’, or the core of the homeland, is retrieved in the archaic backlands, where political issues are resolved in the private sphere and the social drama turns into family melodrama. Midnight, in its turn, recycles Glauber Rocha’s utopian prophecies in the new millennium’s hour zero, when the earthly paradise represented by the sea is re-encountered by the middle-class character, but not by the poor migrant. In both cases, public injustice is compensated by the heroes’ personal achievements, but these do not refer to the real nation, its history or society. Their utopian breadth, based on nostalgia, citation and genre techniques, is of a virtual kind, attuned to cinema only.
Abstract:
Ensemble-based data assimilation is rapidly proving itself as a computationally efficient and skilful assimilation method for numerical weather prediction, which can provide a viable alternative to more established variational assimilation techniques. However, a fundamental shortcoming of ensemble techniques is that the resulting analysis increments can only span a limited subspace of the state space, whose dimension is less than the ensemble size. This limits the amount of observational information that can effectively constrain the analysis. In this paper, a data selection strategy is presented that aims to assimilate only the observational components that matter most and that can be used with both stochastic and deterministic ensemble filters. This avoids unnecessary computations, reduces round-off errors and minimizes the risk of importing observation bias in the analysis. When an ensemble-based assimilation technique is used to assimilate high-density observations, the data-selection procedure allows the use of larger localization domains that may lead to a more balanced analysis. Results from the use of this data selection technique with a two-dimensional linear and a nonlinear advection model using both in situ and remote sounding observations are discussed.
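One way to picture the data-selection idea is a signal-to-noise screen in observation space: keep only the components whose ensemble-predicted spread is large relative to the observation error. The criterion and threshold below are assumptions for illustration, not the paper's exact procedure:

```python
import numpy as np

def select_observations(ens_obs, obs_err_var, keep_ratio=1.0):
    """ens_obs: (n_members, n_obs) ensemble mapped into observation space."""
    spread = ens_obs.var(axis=0, ddof=1)   # forecast variance seen by each obs
    snr = spread / obs_err_var             # signal-to-noise ratio per component
    return np.where(snr >= keep_ratio)[0]  # indices deemed worth assimilating

rng = np.random.default_rng(0)
ens_obs = rng.normal(size=(20, 5)) * np.array([0.2, 1.5, 0.1, 2.0, 0.9])
print(select_observations(ens_obs, obs_err_var=1.0))  # likely components 1 and 3
```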
Abstract:
We consider the Dirichlet and Robin boundary value problems for the Helmholtz equation in a non-locally perturbed half-plane, modelling time-harmonic acoustic scattering of an incident field by, respectively, sound-soft and impedance infinite rough surfaces. Recently proposed novel boundary integral equation formulations of these problems are discussed. It is usual in practical computations to truncate the infinite rough surface, solving a boundary integral equation on a finite section of the boundary, of length 2A, say. In the case of surfaces of small amplitude and slope we prove the stability and convergence as A→∞ of this approximation procedure. For surfaces of arbitrarily large amplitude and/or surface slope we prove stability and convergence of a modified finite section procedure in which the truncated boundary is ‘flattened’ in finite neighbourhoods of its two endpoints.
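Schematically, the finite-section approximation replaces an integral equation on the infinite surface with one on a truncated section; the kernel notation below is an assumption for illustration, not the paper's exact formulation:

```latex
% Full problem: a boundary integral equation on the infinite rough surface \Gamma,
\phi(x) + \int_{\Gamma} K(x, y)\, \phi(y)\, \mathrm{d}s(y) = f(x), \qquad x \in \Gamma .
% Finite-section approximation on \Gamma_A = \{ x \in \Gamma : |x_1| \le A \}:
\phi_A(x) + \int_{\Gamma_A} K(x, y)\, \phi_A(y)\, \mathrm{d}s(y) = f(x), \qquad x \in \Gamma_A ,
% with stability and convergence meaning \phi_A \to \phi in a suitable norm as A \to \infty.
```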
Abstract:
This article examines the potential to improve numerical weather prediction (NWP) by estimating upper and lower bounds on predictability, revisiting the original study of Lorenz (1982) but applying it to the most recent version of the European Centre for Medium-Range Weather Forecasts (ECMWF) forecast system, for both the deterministic and ensemble prediction systems (EPS). These bounds are contrasted with an older version of the same NWP system to see how they have changed as the NWP system has improved. The computations were performed for the earlier seasons of DJF 1985/1986 and JJA 1986 and the later seasons of DJF 2010/2011 and JJA 2011 using the 500-hPa geopotential height field. Results indicate that for this field we may be approaching the limit of deterministic forecasting, so that further improvements might only be obtained by improving the initial state. The results also show that predictability calculations with earlier versions of the model may overestimate potential forecast skill, which may be due to insufficient internal variability in the model and because recent versions of the model are more realistic in representing the true atmospheric evolution. The same methodology is applied to the EPS to calculate upper and lower bounds of predictability of the ensemble-mean forecast, in order to explore how ensemble forecasting could extend the limits of the deterministic forecast. The results show that there is large potential to improve the ensemble predictions, but the increased predictability of the ensemble mean comes with a trade-off in information, as the forecasts become increasingly smoothed with time. From around the 10-day forecast time, the ensemble mean begins to converge towards climatology. Until this point, the ensemble mean is able to predict the main features of the large-scale flow accurately and with high consistency from one forecast cycle to the next. By the 15-day forecast time, the ensemble mean has lost information, with the anomaly of the flow strongly smoothed out. In contrast, the control forecast is much less consistent from run to run, but provides more detailed (unsmoothed), though less useful, information.
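The Lorenz (1982) diagnostic the article revisits can be sketched in a few lines of Python: differences between forecasts valid at the same date but one day apart in lead bound the error growth from below, while forecast-minus-analysis errors give the actual curve. The array layout and field choice are assumptions here:

```python
import numpy as np

def rms(a, b):
    return np.sqrt(np.mean((a - b) ** 2))

def lorenz_curves(forecasts, analyses):
    """forecasts: (n_starts, n_leads, n_points), lead j verifying at analyses[s + j];
    analyses: (n_starts + n_leads, n_points) verifying fields (e.g. 500-hPa height)."""
    n_starts, n_leads, _ = forecasts.shape
    actual = [np.mean([rms(forecasts[s, j], analyses[s + j])
                       for s in range(n_starts)]) for j in range(n_leads)]
    # Pairs valid at the same date: lead j+1 from start s vs lead j from start s+1.
    growth = [np.mean([rms(forecasts[s, j + 1], forecasts[s + 1, j])
                       for s in range(n_starts - 1)]) for j in range(n_leads - 1)]
    return np.array(actual), np.array(growth)
```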
Abstract:
Approximate Bayesian computation (ABC) methods make use of comparisons between simulated and observed summary statistics to overcome the problem of computationally intractable likelihood functions. As the practical implementation of ABC requires computations based on vectors of summary statistics rather than full data sets, a central question is how to derive low-dimensional summary statistics from the observed data with minimal loss of information. In this article we provide a comprehensive review and comparison of the performance of the principal methods of dimension reduction proposed in the ABC literature. The methods are split into three non-mutually-exclusive classes: best subset selection methods, projection techniques and regularization. In addition, we introduce two new methods of dimension reduction. The first is a best subset selection method based on the Akaike and Bayesian information criteria, and the second uses ridge regression as a regularization procedure. We illustrate the performance of these dimension reduction techniques through the analysis of three challenging models and data sets.
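A toy ABC rejection sampler shows where the summary statistics enter, and hence why their dimension matters; the model (a normal mean with mean/variance summaries), tolerance, and prior are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(loc=3.0, scale=1.0, size=100)   # "observed" data

def summaries(x):
    return np.array([x.mean(), x.var()])          # low-dimensional summary vector

s_obs = summaries(data)
accepted = []
for _ in range(20_000):
    theta = rng.uniform(-10.0, 10.0)                       # draw from the prior
    sim = rng.normal(loc=theta, scale=1.0, size=100)       # simulate data
    if np.linalg.norm(summaries(sim) - s_obs) < 0.25:      # compare summaries
        accepted.append(theta)

print(len(accepted), np.mean(accepted))  # ABC posterior sample for the mean
```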
Abstract:
This paper presents a critical history of the concept of ‘structured deposition’. It examines the long-term development of this idea in archaeology, from its origins in the early 1980s through to the present day, looking at how it has been moulded and transformed. On the basis of this historical account, a number of problems are identified with the way that ‘structured deposition’ has generally been conceptualized and applied. It is suggested that the range of deposits described under a single banner as being ‘structured’ is unhelpfully broad, and that archaeologists have been too willing to view material culture patterning as intentionally produced – the result of symbolic or ritual action. It is also argued that the material signatures of ‘everyday’ practice have been undertheorized and all too often ignored. Ultimately, it is suggested that if we are ever to understand fully the archaeological signatures of past practice, it is vital to consider the ‘everyday’ as well as the ‘ritual’ processes which lie behind the patterns we uncover in the ground.
Abstract:
Previously, governments have responded to the impacts of economic failures by developing more regulations to protect employees, customers, shareholders and the economic wellbeing of the state. Our research addresses how Accounting Information Systems (AIS) may act as carriers for institutionalised practices associated with maintaining regulatory compliance within the context of UK Asset Management Houses. The AIS was found to be a strong conduit for institutionalised compliance-related practices, utilising symbolic systems, relational systems, routines and artefacts to carry approaches relating to the regulative, normative and cultural-cognitive strands of institutionalism. Thus, AIS are integral to the development and dissemination of best practice for the management of regulatory compliance. As institutional elements are clearly present, we argue that AIS and regulatory compliance provide a rich context in which to further institutionalism. Since AIS may act as conduits for regulatory approaches, both systems adopters and clients may benefit from actively seeking to codify and abstract best practices into AIS. However, the application of generic institutionalised approaches, which may be applied across similar organisations, must be tempered by each firm’s business environment and associated regulatory exposure. A balance should be sought between approaches specific enough to be useful but generic enough to be universally applied.