217 results for determinism
Abstract:
This thesis examines the theory of technological determinism, which holds that technological change drives social change, through an analysis of the impact of new media on higher education models in the United States of America. In so doing, it explores the impacts of new media technologies on higher education in particular and on society in general. The thesis reviews the theoretical shape of the discourse surrounding new media technologies before focusing on utopian claims about the impact of new media technologies on education. It tests these claims through a specific case study of higher education in the USA. The study investigates whether 'new' media technologies (e.g. the Internet) are resulting in new forms of higher education in the USA and whether the blurring of information and entertainment technologies has caused a similar blurring of education and entertainment providers. It uses primary data gathered by the author in a series of interviews with key education, industry and media representatives in North America in 1997. Chapter 2 reviews the literature and history surrounding several topics central to the thesis: the discourses of technological determinism, the history of technology use and adoption in education, and the impacts of new media technologies on education. Chapter 3 presents the findings of the American case study on the relationship between media and higher education, and Chapter 4 concludes and synthesises the investigation.
Abstract:
The environment moderates behaviour using a subtle language of 'affordances' and 'behaviour-settings'. Affordances are environmental offerings: objects that demand action; a cliff demands a leap and binoculars demand a peek. Behaviour-settings are 'places': spaces encoded with expectations and meanings. Behaviour-settings work in the opposite way to affordances; they demand inhibition, such as an introspective demeanour in a church or when under surveillance. Most affordances and behaviour-settings are designed, and as such, designers are effectively predicting brain reactions. • Affordances are nested within, and moderated by, behaviour-settings. Both trigger automatic neural responses (excitation and inhibition). These, for the most part, cancel each other out. This balancing enables object recognition and allows choice about what action should be taken (if any). But when excitation exceeds inhibition, instinctive action will automatically commence. In positive circumstances this may mean laughter or a smile. In negative circumstances, fleeing, screaming or other panic responses are likely. People with poor frontal function, whether due to immaturity (childhood or developmental disorders) or to hypofrontality (schizophrenia, brain damage or dementia), have a reduced capacity to balance excitatory and inhibitory impulses. For these people, environmental behavioural demands increase as frontal brain function declines. • The world around us is not only encoded with symbols and sensory information. Opportunities and restrictions work on a much more primal level. Person/space interactions constantly take place at a molecular scale. Every space we enter has its own special dynamic, in which individualism vies for supremacy between the opposing forces of affordance-related excitation and the inhibition intrinsic to behaviour-settings. In this context, even a small change, such as the installation of a CCTV camera, can turn a circus into a prison.
• This paper draws on cutting-edge neurological theory to understand the psychological determinants of the everyday experience of the designed environment.
Exploring processes of indeterminate determinism in music composition, programming and improvisation
Abstract:
This portfolio consists of 15 original musical works. Taking the form of electronic and acousmatic music, multimedia, and scores, these chamber works are the result of experimentation and improvisation with individually built computer interfaces. The accompanying commentary discusses the conceptual practice of these interfaces becoming a compositional entity that presents a multi-interpretative opportunity to explore, engage, and personalise. It then examines the path of creative decisions and musical choices that formed both the interfaces and the resulting musical and visual works. The portfolio is accompanied by the interfaces used, transcoded interfacing behavioural information, and documented improvisational findings.
Abstract:
The role of climatic change in determining the shape of human evolution, a theme that came to prominence during the early years of the twentieth century, has resurfaced with renewed vigor. The author examines the rise and resurgence of the idea that hominid evolutionary pathways have been triggered by climatic causes, in order to illustrate the continuing vitality of environmental determinism and to highlight some continuities between early-twentieth-century and contemporary archaeoanthropology.
Abstract:
In this thesis the old philosophical question "does every event have a cause?" is examined in the light of quantum mechanics and probability theory. In physics as well as in the philosophy of science, the orthodox position holds that the physical world is indeterministic. At the fundamental level of physical reality, the quantum level, events are said to happen without causes, by chance, by 'irreducible' chance. The most precise physical theorem leading to this conclusion is Bell's theorem. Here the premises of that theorem are re-examined. It is recalled that solutions to the theorem other than indeterminism are conceivable, some of them known but neglected, such as 'superdeterminism'. It is argued, moreover, that further solutions compatible with determinism exist, notably through the study of model physical systems. One general conclusion of this thesis is that the interpretation of Bell's theorem and of quantum mechanics depends crucially on the philosophical premises from which one starts. Within a Spinozist worldview, for example, the quantum world can well be understood as deterministic. But it is argued that even a determinism far less radical than Spinoza's is not ruled out by the physical experiments. If this is true, the 'determinism versus indeterminism' debate is not settled in the laboratory: it remains philosophical and open, contrary to what is often thought. In the second part of this thesis a model for the interpretation of probability is proposed. A conceptual study of the notion of probability indicates that the hypothesis of determinism helps to better understand what a 'probabilistic system' is. It appears that determinism can answer certain questions for which indeterminism has no answer.
For this reason we conclude that Laplace's conjecture, namely that probability theory presupposes an underlying deterministic reality, retains all its legitimacy. In this thesis the methods of both philosophy and physics are used. The two fields prove to be closely connected here, and they offer a vast potential for cross-fertilisation, in both directions.
Abstract:
Multiple equilibria in a coupled ocean–atmosphere–sea ice general circulation model (GCM) of an aquaplanet with many degrees of freedom are studied. Three different stable states are found for exactly the same set of parameters and external forcings: a cold state in which a polar sea ice cap extends into the midlatitudes; a warm state, which is ice free; and a completely sea ice–covered “snowball” state. Although low-order energy balance models of the climate are known to exhibit intransitivity (i.e., more than one climate state for a given set of governing equations), the results reported here are the first to demonstrate that this is a property of a complex coupled climate model with a consistent set of equations representing the 3D dynamics of the ocean and atmosphere. The coupled model notably includes atmospheric synoptic systems, large-scale circulation of the ocean, a fully active hydrological cycle, sea ice, and a seasonal cycle. There are no flux adjustments, with the system being solely forced by incoming solar radiation at the top of the atmosphere. It is demonstrated that the multiple equilibria owe their existence to the presence of meridional structure in ocean heat transport: namely, a large heat transport out of the tropics and a relatively weak high-latitude transport. The associated large midlatitude convergence of ocean heat transport leads to a preferred latitude at which the sea ice edge can rest. The mechanism operates in two very different ocean circulation regimes, suggesting that the stabilization of the large ice cap could be a robust feature of the climate system. Finally, the role of ocean heat convergence in permitting multiple equilibria is further explored in simpler models: an atmospheric GCM coupled to a slab mixed layer ocean and an energy balance model.
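The intransitivity that the coupled GCM exhibits can be illustrated, at a much simpler level, by the classic zero-dimensional energy balance model with ice-albedo feedback. The sketch below is illustrative only (it is not the paper's model, and the parameter values are conventional textbook choices): for a single solar forcing, the balance between absorbed shortwave and linearised outgoing longwave radiation admits three equilibria, of which the warm and "snowball" states are stable and the intermediate partial-ice state is unstable.

```python
# Zero-dimensional Budyko-type energy balance model: even this minimal
# climate model has multiple equilibria under one solar forcing.
S = 1368.0          # solar constant, W/m^2 (illustrative)
A, B = 202.0, 1.9   # linearised outgoing longwave A + B*T (T in deg C)

def albedo(T):
    """High albedo when fully ice covered, low when ice free,
    linear ramp in between (the ice-albedo feedback)."""
    if T <= -10.0:
        return 0.6                    # "snowball": fully ice covered
    if T >= 0.0:
        return 0.3                    # ice free
    return 0.3 + 0.3 * (-T / 10.0)    # partial ice cover

def net_flux(T):
    """Absorbed shortwave minus outgoing longwave, W/m^2."""
    return S * (1.0 - albedo(T)) / 4.0 - (A + B * T)

def equilibria(lo=-60.0, hi=60.0, step=0.1):
    """Scan for sign changes of net_flux and refine each root by bisection."""
    roots, T = [], lo
    while T < hi:
        if net_flux(T) * net_flux(T + step) < 0:
            a, b = T, T + step
            for _ in range(60):
                m = 0.5 * (a + b)
                if net_flux(a) * net_flux(m) <= 0:
                    b = m
                else:
                    a = m
            roots.append(0.5 * (a + b))
        T += step
    return roots

eqs = equilibria()
print([round(T, 1) for T in eqs])   # three equilibrium temperatures
# an equilibrium is stable if warming restores it from below and
# cooling restores it from above:
stable = [net_flux(T - 0.5) > 0 > net_flux(T + 0.5) for T in eqs]
print(stable)                       # the intermediate state is unstable
```

The same sign structure of the feedback, a destabilising albedo ramp sandwiched between two stabilising branches, is what lets a preferred ice-edge latitude coexist with ice-free and snowball states in the full coupled model.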
Abstract:
This is the first half of a two-part paper which deals with the social theoretic assumptions underlying system dynamics. The motivation is that clarification in this area can help mainstream social scientists to understand how our field relates to their literature, methods and concerns. Part I has two main sections. The aim of the first is to answer the question: How do the ideas of system dynamics relate to traditional social theories? The theoretic assumptions of the field are seldom explicit but rather are implicit in its practice. The range of system dynamics practice is therefore considered and related to a framework - widely used in both operational research (OR) and systems science - that organises the assumptions behind traditional social theoretic paradigms. Distinct and surprisingly varied groupings of practice are identified, making it difficult to place system dynamics in any one paradigm with any certainty. The difficulties of establishing a social theoretic home for system dynamics are exemplified in the second main section. This is done by considering the question: Is system dynamics deterministic? An analysis shows that attempts to relate system dynamics to strict notions of voluntarism or determinism quickly indicate that the field does not fit with either pole of this dichotomous, and strictly paradigmatic, view. Part I therefore concludes that definitively placing system dynamics with respect to traditional social theories is highly problematic. The scene is therefore set for Part II of the paper, which proposes an innovative and potentially fruitful resolution to this problem.
Abstract:
Background: Deterministic evolution, phylogenetic contingency and evolutionary chance each can influence patterns of morphological diversification during adaptive radiation. In comparative studies of replicate radiations, convergence in a common morphospace implicates determinism, whereas non-convergence suggests the importance of contingency or chance. Methodology/Principal Findings: The endemic cichlid fish assemblages of the three African great lakes have evolved similar sets of ecomorphs but show evidence of non-convergence when compared in a common morphospace, suggesting the importance of contingency and/or chance. We then analyzed the morphological diversity of each assemblage independently and compared their axes of diversification in the unconstrained global morphospace. We find that despite differences in phylogenetic composition, invasion history, and ecological setting, the three assemblages are diversifying along parallel axes through morphospace and have nearly identical variance-covariance structures among morphological elements. Conclusions/Significance: By demonstrating that replicate adaptive radiations are diverging along parallel axes, we have shown that non-convergence in the common morphospace is associated with convergence in the global morphospace. Applying these complementary analyses to future comparative studies will improve our understanding of the relationship between morphological convergence and non-convergence, and the roles of contingency, chance and determinism in driving morphological diversification.
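The notion of "parallel axes through morphospace" can be made concrete with a small simulation. The sketch below is hypothetical (simulated points, not the cichlid measurements): two assemblages occupy different regions of a two-dimensional morphospace, so their means do not converge, yet they share the same major axis of variation, which comparing the leading eigenvectors of their covariance matrices recovers.

```python
import math
import random

random.seed(1)

def sample_assemblage(center, axis, n=500, s_major=2.0, s_minor=0.4):
    """Points elongated along the unit vector `axis`, centred at `center`:
    a different region of morphospace, but the same direction of spread."""
    ax, ay = axis
    pts = []
    for _ in range(n):
        u = random.gauss(0, s_major)   # spread along the major axis
        v = random.gauss(0, s_minor)   # spread along the minor axis
        pts.append((center[0] + u * ax - v * ay,
                    center[1] + u * ay + v * ax))
    return pts

def leading_axis(pts):
    """Leading eigenvector of the 2x2 sample covariance matrix."""
    n = len(pts)
    mx = sum(p[0] for p in pts) / n
    my = sum(p[1] for p in pts) / n
    a = sum((p[0] - mx) ** 2 for p in pts) / n
    c = sum((p[1] - my) ** 2 for p in pts) / n
    b = sum((p[0] - mx) * (p[1] - my) for p in pts) / n
    lam = 0.5 * (a + c) + math.sqrt((0.5 * (a - c)) ** 2 + b * b)
    vx, vy = b, lam - a        # eigenvector of [[a, b], [b, c]] for lam
    norm = math.hypot(vx, vy)
    return (vx / norm, vy / norm)

theta = math.radians(30)
axis = (math.cos(theta), math.sin(theta))
lakeA = sample_assemblage((0.0, 0.0), axis)
lakeB = sample_assemblage((5.0, -3.0), axis)  # different region, same axis

a1, a2 = leading_axis(lakeA), leading_axis(lakeB)
cosang = abs(a1[0] * a2[0] + a1[1] * a2[1])   # axes have no sign
angle = math.degrees(math.acos(min(1.0, cosang)))
print(f"angle between major axes: {angle:.1f} degrees")
```

A small angle indicates parallel diversification even though the two clouds sit far apart, mirroring the paper's combination of non-convergence in the common morphospace with convergence of the diversification axes.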
Abstract:
The rank-based nonlinear predictability score was recently introduced as a test for determinism in point processes. We here adapt this measure to time series sampled from time-continuous flows. We use noisy Lorenz signals to compare this approach against a classical amplitude-based nonlinear prediction error. Both measures show an almost identical robustness against Gaussian white noise. In contrast, when the amplitude distribution of the noise has a narrower central peak and heavier tails than the normal distribution, the rank-based nonlinear predictability score outperforms the amplitude-based nonlinear prediction error. For this type of noise, the nonlinear predictability score has a higher sensitivity for deterministic structure in noisy signals. It also yields a higher statistical power in a surrogate test of the null hypothesis of linear stochastic correlated signals. We show the high relevance of this improved performance in an application to electroencephalographic (EEG) recordings from epilepsy patients. Here the nonlinear predictability score again shows a higher sensitivity to nonrandomness. Importantly, it yields an improved contrast between signals recorded from brain areas where the first ictal EEG signal changes were detected (focal EEG signals) versus signals recorded from brain areas that were not involved at seizure onset (nonfocal EEG signals).
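The general shape of such a test, nearest-neighbour prediction in a delay embedding followed by a rank-based scoring of the predictions, can be sketched as follows. This is an illustrative stand-in, not the authors' exact definitions: for brevity it uses the logistic map instead of Lorenz signals, and a Spearman rank correlation between predicted and observed futures as the rank-based score.

```python
import math
import random

random.seed(0)

def embed(x, dim=2, lag=1):
    """Delay embedding of a scalar series."""
    return [tuple(x[i + j * lag] for j in range(dim))
            for i in range(len(x) - (dim - 1) * lag)]

def nn_predictions(x, dim=2, lag=1, horizon=1, theiler=5):
    """For each embedded point, predict the future value x[t+horizon]
    from the future of its nearest neighbour (temporal neighbours excluded)."""
    E = embed(x, dim, lag)
    last = len(E) - horizon - 1
    preds, actual = [], []
    for i in range(last):
        best, bestd = None, float("inf")
        for j in range(last):
            if abs(i - j) <= theiler:        # Theiler window
                continue
            d = sum((a - b) ** 2 for a, b in zip(E[i], E[j]))
            if d < bestd:
                bestd, best = d, j
        preds.append(x[best + (dim - 1) * lag + horizon])
        actual.append(x[i + (dim - 1) * lag + horizon])
    return preds, actual

def spearman(u, v):
    """Spearman rank correlation (no ties expected for float data)."""
    def ranks(w):
        order = sorted(range(len(w)), key=lambda k: w[k])
        r = [0] * len(w)
        for rank, k in enumerate(order):
            r[k] = rank
        return r
    ru, rv = ranks(u), ranks(v)
    mu = (len(u) - 1) / 2
    cov = sum((a - mu) * (b - mu) for a, b in zip(ru, rv))
    var = sum((a - mu) ** 2 for a in ru)
    return cov / var

# deterministic signal (logistic map as a stand-in for a flow) plus noise
x = [0.4]
for _ in range(400):
    x.append(4 * x[-1] * (1 - x[-1]))
noisy = [v + random.gauss(0, 0.05) for v in x]
shuffled = noisy[:]
random.shuffle(shuffled)             # destroys the deterministic structure

p, a = nn_predictions(noisy)
det_score = spearman(p, a)
p, a = nn_predictions(shuffled)
sto_score = spearman(p, a)
print(f"deterministic: {det_score:.2f}  shuffled: {sto_score:.2f}")
```

The rank-based score is high for the noisy deterministic signal and near zero for its shuffled (stochastic) counterpart; ranking is what buys robustness against heavy-tailed noise, since outliers affect ranks far less than amplitudes.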
Abstract:
We describe the current status of and provide performance results for a prototype compiler of Prolog to C, ciaocc. ciaocc is novel in that it is designed to accept different kinds of high-level information, typically obtained via an automatic analysis of the initial Prolog program and expressed in a standardized language of assertions. This information is used to optimize the resulting C code, which is then processed by an off-the-shelf C compiler. The basic translation process essentially mimics the unfolding of a bytecode emulator with respect to the particular bytecode corresponding to the Prolog program. This is facilitated by a flexible design of the instructions and their lower-level components. This approach allows reusing a sizable amount of the machinery of the bytecode emulator: predicates already written in C, data definitions, memory management routines and areas, etc., as well as mixing emulated bytecode with native code in a relatively straightforward way. We report on the performance of programs compiled by the current version of the system, both with and without analysis information.
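The unfolding idea can be illustrated with a toy interpreter, here in Python rather than the Prolog-to-C setting of ciaocc, and with invented names. A generic emulator dispatches on each instruction at run time; "compilation by unfolding" performs that dispatch once, over a known bytecode, and emits straight-line code, reusing the emulator's per-instruction fragments.

```python
def emulate(bytecode, stack):
    """Generic emulator: dispatches on every instruction at run time."""
    for op, arg in bytecode:
        if op == "push":
            stack.append(arg)
        elif op == "add":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "mul":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack[-1]

def unfold(bytecode):
    """'Compiler': specialise the emulator for one known bytecode by
    emitting straight-line code, one fragment per instruction. The
    dispatch happens once, at compile time, not on every execution."""
    lines = ["def compiled(stack):"]
    for op, arg in bytecode:
        if op == "push":
            lines.append(f"    stack.append({arg!r})")
        elif op == "add":
            lines.append("    b, a = stack.pop(), stack.pop(); stack.append(a + b)")
        elif op == "mul":
            lines.append("    b, a = stack.pop(), stack.pop(); stack.append(a * b)")
    lines.append("    return stack[-1]")
    ns = {}
    exec("\n".join(lines), ns)   # stand-in for handing the C code to gcc
    return ns["compiled"]

prog = [("push", 2), ("push", 3), ("add", None), ("push", 4), ("mul", None)]
compiled = unfold(prog)
print(emulate(prog, []), compiled([]))   # both compute (2 + 3) * 4
```

Because each emitted fragment is the emulator's own instruction body, the generated code shares the emulator's data definitions and runtime machinery, which is the reuse property the paper emphasises.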
Improving the compilation of Prolog to C using type and determinism information: Preliminary results
Abstract:
We describe the current status of and provide preliminary performance results for a compiler of Prolog to C. The compiler is novel in that it is designed to accept different kinds of high-level information (typically obtained via an analysis of the initial Prolog program and expressed in a standardized language of assertions) and use this information to optimize the resulting C code, which is then further processed by an off-the-shelf C compiler. The basic translation process used essentially mimics an unfolding of a C-coded bytecode emulator with respect to the particular bytecode corresponding to the Prolog program. Optimizations are then applied to this unfolded program. This is facilitated by a more flexible design of the bytecode instructions and their lower-level components. This approach allows reusing a sizable amount of the machinery of the bytecode emulator: ancillary pieces of C code, data definitions, memory management routines and areas, etc., as well as mixing bytecode emulated code with natively compiled code in a relatively straightforward way. We report on the performance of programs compiled by the current version of the system, both with and without analysis information.
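How determinism information pays off can be sketched with a hypothetical example (Python, invented names, not actual compiler output). A predicate translated in the general way must expose its solutions through backtracking machinery, modelled here as a generator that keeps a choice point open; one that analysis proves to have at most one solution can instead be compiled to a plain call, with no choice-point bookkeeping at all.

```python
def length_general(xs):
    """General translation: every predicate yields its solutions lazily,
    so the caller must create and later discard a choice point."""
    yield len(xs)

def length_det(xs):
    """Translation when analysis proves determinism: a direct function
    call, no solution enumeration, no choice-point management."""
    return len(xs)

def call_general(goal, *args):
    """Caller-side protocol for the general translation."""
    gen = goal(*args)          # create the choice point
    sol = next(gen, None)      # take the first solution
    gen.close()                # discard the (provably empty) rest
    return sol

print(call_general(length_general, [1, 2, 3]), length_det([1, 2, 3]))
```

Both calls compute the same answer; the determinism-informed version simply skips the generator protocol, which is the analogue of the C-level choice-point and trail management that the analysis information lets the real compiler elide.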
Abstract:
The testing of concurrent software components can be difficult due to the inherent non-determinism present in these components. For example, if the same test case is run multiple times, it may produce different results. This non-determinism may lead to problems with determining expected outputs. In this paper, we present and discuss several possible solutions to this problem in the context of testing concurrent Java components using the ConAn testing tool. We then present a recent extension to the tool that provides a general solution to this problem that is sufficient to deal with the level of non-determinism that we have encountered in testing over 20 components with ConAn. © 2005 IEEE
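The kind of non-determinism at issue can be reproduced in a few lines. The sketch below is a hypothetical stand-in (Python threads rather than Java components under ConAn, with invented names): the same test body may observe different interleavings across runs, so the expected output has to be specified up to reordering, one simple way of stating an order-insensitive test oracle.

```python
import threading

def run_once():
    """One 'test case': four workers append to a shared log."""
    log, lock = [], threading.Lock()

    def worker(name):
        with lock:
            log.append(name)   # order depends on thread scheduling

    threads = [threading.Thread(target=worker, args=(f"t{i}",))
               for i in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return log

# repeat the test: the exact order may differ between runs
runs = {tuple(run_once()) for _ in range(50)}
print(len(runs))  # may exceed 1: the interleaving is non-deterministic

# an order-insensitive oracle still holds on every run
assert all(sorted(r) == ["t0", "t1", "t2", "t3"] for r in runs)
```

Comparing outputs as multisets (or against a set of acceptable sequences) is one way to state expected outputs for such components; tool support of the kind the paper describes automates this kind of check.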