783 results for Fundamentals of computing theory
Abstract:
This article reviews the use of complexity theory in planning theory using the theory of metaphors for theory transfer and theory construction. The introduction to the article presents the author's positioning of planning theory. The first section thereafter provides a general background of the trajectory of development of complexity theory and discusses the rationale of using the theory of metaphors for evaluating the use of complexity theory in planning. The second section introduces the workings of metaphors in general and theory-constructing metaphors in particular, drawing out an understanding of how to proceed with an evaluative approach towards an analysis of the use of complexity theory in planning. The third section presents two case studies – reviews of two articles – to illustrate how the framework might be employed. It then discusses the implications of the evaluation for the question ‘can complexity theory contribute to planning?’ The concluding section discusses the employment of the ‘theory of metaphors’ for evaluating theory transfer and draws out normative suggestions for engaging in theory transfer using the metaphorical route.
Abstract:
This paper represents the second part of a study of semi-geostrophic (SG) geophysical fluid dynamics. SG dynamics shares certain attractive properties with the better known and more widely used quasi-geostrophic (QG) model, but is also a good prototype for balanced models that are more accurate than QG dynamics. The development of such balanced models is an area of great current interest. The goal of the present work is to extend a central body of QG theory, concerning the evolution of disturbances to prescribed basic states, to SG dynamics. Part 1 was based on the pseudomomentum; Part 2 is based on the pseudoenergy. A pseudoenergy invariant is a conserved quantity, of second order in disturbance amplitude relative to a prescribed steady basic state, which is related to the time symmetry of the system. We derive such an invariant for the semi-geostrophic equations, and use it to obtain: (i) a linear stability theorem analogous to Arnol'd's ‘first theorem’; and (ii) a small-amplitude local conservation law for the invariant, obeying the group-velocity property in the WKB limit. The results are analogous to their quasi-geostrophic forms, and reduce to those forms in the limit of small Rossby number. The results are derived for both the f-plane Boussinesq form of semi-geostrophic dynamics, and its extension to β-plane compressible flow by Magnusdottir & Schubert. Novel features particular to semi-geostrophic dynamics include apparently unnoticed lateral boundary stability criteria. Unlike the boundary stability criteria found in the first part of this study, however, these boundary criteria do not necessarily preclude the construction of provably stable basic states. The interior semi-geostrophic dynamics has an underlying Hamiltonian structure, which guarantees that symmetries in the system correspond naturally to the system's invariants. This is an important motivation for the theoretical approach used in this study. The connection between symmetries and conservation laws is made explicit using Noether's theorem applied to the Eulerian form of the Hamiltonian description of the interior dynamics.
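For orientation only (this generic construction is standard in Hamiltonian geophysical fluid dynamics and is not the specific semi-geostrophic form derived in the paper): given a Hamiltonian H, Casimir invariants C and a steady basic state Q, a pseudoenergy is built as a conserved functional whose first variation vanishes at the basic state, so that it is of second order in the disturbance amplitude.

```latex
% Generic pseudoenergy construction (schematic; the paper derives the SG-specific form):
\mathcal{A}(q) \;=\; \bigl[H(q) + C(q)\bigr] - \bigl[H(Q) + C(Q)\bigr],
\qquad
\delta\bigl(H + C\bigr)\big|_{q=Q} = 0,
% so that \mathcal{A} is exactly conserved and is O(|q'|^2) for small disturbances q' = q - Q.
```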
Abstract:
The quantitative effects of uniform strain and background rotation on the stability of a strip of constant vorticity (a simple shear layer) are examined. The thickness of the strip decreases in time under the strain, so it is necessary to formulate the linear stability analysis for a time-dependent basic flow. The results show that even a strain rate γ (scaled with the vorticity of the strip) as small as 0.25 suppresses the conventional Rayleigh shear instability mechanism, in the sense that the r.m.s. wave steepness cannot amplify by more than a certain factor, and must eventually decay. For γ < 0.25 the amplification factor increases as γ decreases; however, it is only 3 when γ = 0.065. Numerical simulations confirm the predictions of linear theory at small steepness and predict a threshold value necessary for the formation of coherent vortices. The results help to explain the impression from numerous simulations of two-dimensional turbulence reported in the literature that filaments of vorticity infrequently roll up into vortices. The stabilization effect may be expected to extend to two- and three-dimensional quasi-geostrophic flows.
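To make the time dependence of the basic state concrete (a hedged kinematic aside, not taken from the paper): if the strip lies along the extensional axis of a uniform pure strain of rate γ, its thickness decays exponentially, so any disturbance growth must compete with this thinning and the stability analysis cannot assume a steady basic flow.

```latex
% Exponential thinning of a material strip under uniform pure strain of rate \gamma (illustrative):
b(t) = b_0\, e^{-\gamma t}.
```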
Abstract:
This article assesses the impact of a UK-based professional development programme on curriculum innovation and change in English Language Education (ELE) in Western China. Based on interviews, focus group discussions and observation of a total of 48 English teachers who had participated in an overseas professional development programme influenced by modern approaches to education and ELE, and 9 of their colleagues who had not taken part, it assesses the uptake of new approaches on teachers’ return to China. Interviews with 10 senior managers provided supplementary data. Using Diffusion of Innovations Theory as the conceptual framework, we examine those aspects of the Chinese situation that are supportive of change and those that constrain innovation. We offer evidence of innovation in classroom practice on the part of returnees and ‘reinvention’ of the innovation to ensure a better fit with local needs. The key role of course participants as opinion leaders in the diffusion of new ideas is also explored. We conclude that the selective uptake of this innovation is under way and likely to be sustained against a background of continued curriculum reform in China.
Abstract:
The Code for Sustainable Homes (the Code) will require new homes in the United Kingdom to be ‘zero carbon’ from 2016. Drawing upon an evolutionary innovation perspective, this paper contributes to a gap in the literature by investigating which low and zero carbon technologies are actually being used by house builders, rather than the prevailing emphasis on the potentiality of these technologies. Using the results from a questionnaire, three empirical contributions are made. First, house builders are selecting a narrow range of technologies. Second, these choices are made to minimise the disruption to their standard design and production templates (SDPTs). Finally, the coalescence around a small group of technologies is expected to intensify, with solar-based technologies predicted to become more important. This paper challenges the dominant technical rationality in the literature that technical efficiency and cost benefits are the primary drivers for technology selection. These drivers play an important role but one which is mediated by the logic of maintaining the SDPTs of the house builders. This emphasises the need for construction diffusion of innovation theory to be problematized and developed within the context of business and market regimes constrained and reproduced by resilient technological trajectories.
Abstract:
The present research aimed to comprehensively explore psychopathology in Williams syndrome (WS) across the lifespan and evaluate the relationship between psychopathology and age category (child or adult), gender and cognitive ability. The parents of 50 participants with WS, aged 6-50 years, were interviewed using the Schedule for Affective Disorders and Schizophrenia for School-Age Children (K-SADS-PL). The prevalence of a wide range of Axis I DSM-IV disorders was assessed. In addition to high rates of anxiety and Attention Deficit Hyperactivity Disorder (ADHD) (38% and 20% respectively), 14% of our sample met criteria for a depressive disorder and 42% of participants were not experiencing any significant psychopathological difficulties. There was some evidence for different patterns of psychopathology between children and adults with WS and between males and females. These relationships were largely in keeping with those found in the typically developing population, thus supporting the validity of applying theory and treatment approaches for psychopathology in the typically developing population to WS.
Abstract:
Quantile forecasts are central to risk management decisions because of the widespread use of Value-at-Risk. A quantile forecast is the product of two factors: the model used to forecast volatility, and the method of computing quantiles from the volatility forecasts. In this paper we calculate and evaluate quantile forecasts of the daily exchange rate returns of five currencies. The forecasting models that have been used in recent analyses of the predictability of daily realized volatility permit a comparison of the predictive power of different measures of intraday variation and intraday returns in forecasting exchange rate variability. The methods of computing quantile forecasts include making distributional assumptions for future daily returns as well as using the empirical distribution of predicted standardized returns with both rolling and recursive samples. Our main findings are that the Heterogeneous Autoregressive model provides more accurate volatility and quantile forecasts for currencies which experience shifts in volatility, such as the Canadian dollar, and that the use of the empirical distribution to calculate quantiles can improve forecasts when there are shifts in volatility.
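As a hedged illustration of the two-factor structure described above (volatility model plus quantile method), the sketch below computes a one-step-ahead quantile forecast from a volatility forecast either under a Gaussian assumption or from the empirical distribution of standardized returns. All names and the simulated data are hypothetical; this is not the paper's code.

```python
# Illustrative sketch (not the paper's code): one-step-ahead quantile (VaR) forecasts
# built from a volatility forecast in two ways -- a Gaussian assumption versus the
# empirical distribution of standardized returns.
import numpy as np
from scipy.stats import norm

def quantile_forecast_gaussian(sigma_forecast, alpha=0.01, mu=0.0):
    """Quantile forecast assuming conditionally normal returns."""
    return mu + sigma_forecast * norm.ppf(alpha)

def quantile_forecast_empirical(sigma_forecast, past_returns, past_sigmas, alpha=0.01):
    """Quantile forecast scaling the empirical quantile of standardized returns
    (past returns divided by their predicted volatilities)."""
    standardized = np.asarray(past_returns) / np.asarray(past_sigmas)
    return sigma_forecast * np.quantile(standardized, alpha)

# Toy usage with simulated data (illustrative only):
rng = np.random.default_rng(0)
past_sigmas = 0.01 * np.exp(0.1 * rng.standard_normal(500))
past_returns = past_sigmas * rng.standard_t(df=5, size=500)
sigma_hat = 0.012  # stands in for a volatility forecast, e.g. from a HAR-type model
print(quantile_forecast_gaussian(sigma_hat, alpha=0.01))
print(quantile_forecast_empirical(sigma_hat, past_returns, past_sigmas, alpha=0.01))
```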
Abstract:
Computational formalisms have been pushing the boundaries of the field of computing for the last 80 years and much debate has surrounded what computing entails; what it is, and what it is not. This paper seeks to explore the boundaries of the ideas of computation and provide a framework for enabling a constructive discussion of computational ideas. First, a review of computing is given, ranging from Turing Machines to interactive computing. Then, a variety of natural physical systems are considered for their computational qualities. From this exploration, a framework is presented under which all dynamical systems can be considered as instances of the class of abstract computational platforms. An abstract computational platform is defined by both its intrinsic dynamics and how it allows computation that is meaningful to an external agent through the configuration of constraints upon those dynamics. It is asserted that a platform’s computational expressiveness is directly related to the freedom with which constraints can be placed. Finally, the requirements for a formal constraint description language are considered and it is proposed that Abstract State Machines may provide a reasonable basis for such a language.
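A toy sketch follows (not from the paper; all names invented) of the framing in which a fixed dynamical system becomes a computational platform only once an external agent imposes constraints: the same intrinsic dynamics realizes different Boolean functions depending on the constraint chosen.

```python
# Toy illustration (not from the paper): a fixed dynamical system acts as a
# "computational platform"; an external agent configures constraints
# (parameters and readout) so that the same dynamics realizes different
# computations that are meaningful to that agent.
from dataclasses import dataclass

@dataclass
class LeakyIntegratorPlatform:
    """Intrinsic dynamics: a leaky integrator x <- (1 - leak) * x + input."""
    leak: float = 0.5
    threshold: float = 1.5  # a constraint chosen by the external agent

    def run(self, inputs):
        x = 0.0
        for u in inputs:
            x = (1.0 - self.leak) * x + u
        # Readout: the agent interprets the final state relative to the threshold.
        return x >= self.threshold

# The same dynamics, differently constrained, computes different Boolean functions
# of two unit inputs presented in sequence:
and_gate = LeakyIntegratorPlatform(leak=0.5, threshold=1.4)  # behaves as AND
or_gate = LeakyIntegratorPlatform(leak=0.5, threshold=0.4)   # behaves as OR

for a in (0.0, 1.0):
    for b in (0.0, 1.0):
        print(a, b, and_gate.run([a, b]), or_gate.run([a, b]))
```

The point of the sketch is only that the ‘computation’ lives in the constraints and in the agent's readout, not in the dynamics alone.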
Abstract:
This paper provides an overview of interpolation of Banach and Hilbert spaces, with a focus on establishing when equivalence of norms is in fact equality of norms in the key results of the theory. (In brief, our conclusion for the Hilbert space case is that, with the right normalisations, all the key results hold with equality of norms.) In the final section we apply the Hilbert space results to the Sobolev spaces H^s(Ω) and H̃^s(Ω), for s ∈ ℝ and an open Ω ⊂ ℝⁿ. We exhibit examples in one and two dimensions of sets Ω for which these scales of Sobolev spaces are not interpolation scales. In the cases when they are interpolation scales (in particular, if Ω is Lipschitz) we exhibit examples that show that, in general, the interpolation norm does not coincide with the intrinsic Sobolev norm and, in fact, the ratio of these two norms can be arbitrarily large.
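For reference, the real-interpolation (K-method) norm that underlies such statements, given here in its textbook form up to a normalisation constant (the paper's point is precisely that the normalisation can be chosen so that norm equivalences become equalities):

```latex
% Real interpolation by the K-method (textbook form, up to normalisation):
K(t, u) = \inf\bigl\{ \|u_0\|_{X_0} + t\,\|u_1\|_{X_1} : u = u_0 + u_1,\; u_0 \in X_0,\; u_1 \in X_1 \bigr\},
\qquad
\|u\|_{(X_0, X_1)_{\theta, 2}}
  = \left( \int_0^\infty \bigl( t^{-\theta} K(t, u) \bigr)^2 \, \frac{dt}{t} \right)^{1/2},
\quad 0 < \theta < 1.
```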
Abstract:
Casson and Wadeson (International Journal of the Economics of Business, 1998, 5, pp. 5-27) have modelled the dialogue, or conversation, which customers have with their suppliers in order to convey their requirements, while taking production implications into account. They showed that this has important implications for the positioning of the boundaries of the firm. Unfortunately, their model has the restriction that communication is only costly in the direction of customer to supplier. This paper extends their model by introducing two-way communication costs. It shows that the level of communication cost in the direction of supplier to customer is a key additional factor in determining the nature of the dialogue that takes place. It also shows that this has important additional implications for the positioning of the boundaries of the firm. Custom computer software development is used as an example of an application of the theory.
Abstract:
The magnetization properties of aggregated ferrofluids are calculated by combining the chain formation model developed by Zubarev with the modified mean-field theory. Using moderate assumptions for the inter- and intrachain interactions we obtain expressions for the magnetization and initial susceptibility. When comparing the results of our theory to molecular dynamics simulations of the same model we find that at large dipolar couplings (λ > 3) the chain formation model appears to give better predictions than other analytical approaches. This supports the idea that chain formation is an important structural ingredient of strongly interacting dipolar particles.
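For context only: the standard Langevin and first-order modified mean-field expressions for a monodisperse, non-aggregated ferrofluid are sketched below as background; the chain-formation corrections that are the paper's contribution are not reproduced here.

```latex
% Langevin magnetization and first-order modified mean-field (MMF1) closure for a
% monodisperse, non-aggregated ferrofluid (background only; not the paper's chain model):
M_L(H) = n\,m\,L\!\left(\frac{\mu_0 m H}{k_B T}\right), \qquad L(x) = \coth x - \frac{1}{x},
\qquad
M(H) = M_L\!\left(H + \tfrac{1}{3} M_L(H)\right),
\qquad
\chi = \chi_L\left(1 + \tfrac{\chi_L}{3}\right), \quad \chi_L = \frac{\mu_0 n m^2}{3 k_B T}.
```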
Abstract:
The study of the mechanical energy budget of the oceans using Lorenz available potential energy (APE) theory is based on knowledge of the adiabatically re-arranged Lorenz reference state of minimum potential energy. The compressible and nonlinear character of the equation of state for seawater has been thought to cause the reference state to be ill-defined, casting doubt on the usefulness of APE theory for investigating ocean energetics under realistic conditions. Using a method based on the volume frequency distribution of parcels as a function of temperature and salinity in the context of the seawater Boussinesq approximation, which we illustrate using climatological data, we show that compressibility effects are in fact minor. The reference state can be regarded as a well defined one-dimensional function of depth, which forms a surface in temperature, salinity and density space between the surface and the bottom of the ocean. For a very small proportion of water masses, this surface can be multivalued and water parcels can have up to two statically stable levels in the reference density profile, of which the shallowest is energetically more accessible. Classifying parcels from the surface to the bottom gives a different reference density profile than classifying in the opposite direction. However, this difference is negligible. We show that the reference state obtained by standard sorting methods is equivalent, though computationally more expensive, to the volume frequency distribution approach. The approach we present can be applied systematically and in a computationally efficient manner to investigate the APE budget of the ocean circulation using models or climatological data.
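A minimal sketch follows (not the paper's code) of the standard sorting construction of the Lorenz reference state, which the abstract notes is equivalent to, but costlier than, the volume-frequency-distribution approach. The equal-volume parcels, the linear equation of state and the single reference level used for ranking are simplifying assumptions made purely for illustration.

```python
# Minimal sketch (not the paper's code): build a Lorenz-type reference profile by
# adiabatically re-sorting equal-volume parcels into a statically stable column.
# Assumes equal-volume parcels, a user-supplied density function rho(T, S, z), and
# ranking at a single common depth -- all simplifications; real calculations must
# handle unequal volumes and the fully nonlinear, compressible equation of state.
import numpy as np

def reference_profile(T, S, z_levels, rho):
    """Return the reference density at each target level after re-sorting parcels.

    Parcels are ranked by their density evaluated at a common mid-depth (a crude
    proxy for the iterative treatment a compressible equation of state requires),
    with the lightest parcels assigned to the shallowest levels.
    """
    T = np.asarray(T)
    S = np.asarray(S)
    order = np.argsort(rho(T, S, z_levels.mean()))  # lightest first
    T_ref, S_ref = T[order], S[order]
    # Reference density evaluated in place, level by level, after the re-sorting.
    return np.array([rho(t, s, z) for t, s, z in zip(T_ref, S_ref, z_levels)])

# Toy usage with a simple linear equation of state (illustrative only):
rho_lin = lambda T, S, z: 1027.0 * (1.0 - 2e-4 * (T - 10.0) + 8e-4 * (S - 35.0))
T = np.array([18.0, 4.0, 10.0, 2.0])
S = np.array([36.0, 34.8, 35.2, 34.9])
z = np.array([-100.0, -500.0, -1000.0, -2000.0])  # target levels, shallowest first
print(reference_profile(T, S, z, rho_lin))
```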
Abstract:
The assessment of chess players is an increasingly attractive opportunity and an unfortunate necessity. The chess community needs to limit potential reputational damage by inhibiting cheating and unjustified accusations of cheating: there has been a recent rise in both. A number of counter-intuitive discoveries have been made by benchmarking the intrinsic merit of players’ moves: these call for further investigation. Is Capablanca actually, objectively the most accurate World Champion? Has Elo rating inflation not taken place? Stimulated by FIDE/ACP, we revisit the fundamentals of the subject to advance a framework suitable for improved standards of computational experiment and more precise results. Other domains look to chess as the demonstrator of good practice, including the rating of professionals making high-value decisions under pressure, personnel evaluation by Multichoice Assessment and the organization of crowd-sourcing in citizen science projects. The ‘3P’ themes of performance, prediction and profiling pervade all these domains.