874 results for Theoretical approaches
Abstract:
This review will focus on four areas of motor control which have recently been enriched both by neural network and control system models: motor planning, motor prediction, state estimation and motor learning. We will review the computational foundations of each of these concepts and present specific models which have been tested by psychophysical experiments. We will cover the topics of optimal control for motor planning, forward models for motor prediction, observer models of state estimation and modular decomposition in motor learning. The aim of this review is to demonstrate how computational approaches, as well as proposing specific models, provide a theoretical framework to formalize the issues in motor control.
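To make the observer idea concrete, here is a minimal sketch of state estimation in the spirit of the models reviewed: a forward model predicts the hand state from an efference copy of the motor command, and a Kalman-gain correction blends that prediction with noisy sensory feedback. The first-order dynamics, noise variances and constant command are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

# Minimal observer (Kalman-filter) sketch of state estimation: a forward
# model predicts the next state from the motor command, and delayed/noisy
# sensory feedback corrects the prediction. All values here are illustrative.

dt = 0.01                      # time step (s)
A, B = 1.0, dt                 # assumed first-order hand-position dynamics
Q, R = 1e-4, 1e-2              # assumed process and sensory noise variances

x_true, x_hat, P = 0.0, 0.0, 1.0
rng = np.random.default_rng(0)

for t in range(100):
    u = 0.5                                       # constant motor command (velocity)
    # simulate the real plant
    x_true = A * x_true + B * u + rng.normal(0.0, np.sqrt(Q))
    y = x_true + rng.normal(0.0, np.sqrt(R))      # noisy sensory feedback

    # forward model: predict state and uncertainty from the efference copy
    x_pred = A * x_hat + B * u
    P_pred = A * P * A + Q

    # observer correction: blend prediction with sensory feedback
    K = P_pred / (P_pred + R)                     # Kalman gain
    x_hat = x_pred + K * (y - x_pred)
    P = (1 - K) * P_pred

print(f"true position {x_true:.3f}, estimated {x_hat:.3f}")
```

Lowering R (more reliable feedback) shifts the estimate toward sensory input, while raising it shifts the estimate toward the forward-model prediction, which is the qualitative trade-off these observer models are meant to capture.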
Abstract:
The 4d photoabsorption spectra of I2+, I3+, and I4+ have been obtained in the 70-127 eV region with the dual laser-produced plasma technique at time delays ranging from 400 to 520 ns. With decreasing time delay, the dominant contribution to the spectra evolves from the I2+ to the I4+ ions, and each spectrum contains discrete 4d-nf transitions and a broad 4d-εf shape resonance, which are identified with the aid of multiconfiguration Hartree-Fock calculations. The excited states decay by direct autoionization involving 5s or 5p electrons, and rates for the different processes and the resulting linewidths were calculated. With increasing ionization, the 4d-εf shape resonance becomes more intense and broader in going from I2+ to I3+, and then vanishes at I5+. In addition, the discrete structure of the calculated spectrum of each ion gradually approaches the corresponding shape resonance position. Based on the assumption of a normalized Boltzmann distribution among the excited states and a steady-state collisional-radiative model, we reproduced spectra which are in good agreement with experiment.
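As a toy illustration of the reconstruction step described in the last sentence, the sketch below builds a synthetic absorption spectrum as a Boltzmann-weighted sum of Lorentzian lines. The level energies, transition energies, strengths, widths and temperature are made-up placeholders, not the multiconfiguration Hartree-Fock values or the collisional-radiative populations used in the paper.

```python
import numpy as np

# Toy synthetic spectrum: Boltzmann-weighted sum of Lorentzian lines.
# All numbers below are illustrative placeholders.

kT = 5.0                                  # assumed temperature in eV
lines = [                                 # (lower-level energy, transition energy, strength, width) in eV
    (0.0, 90.0, 1.0, 0.3),
    (1.2, 95.5, 0.6, 0.4),
    (2.5, 101.0, 0.3, 0.5),
]

E = np.linspace(70.0, 127.0, 2000)        # photon energy grid (eV)
spectrum = np.zeros_like(E)

Z = sum(np.exp(-El / kT) for El, *_ in lines)         # partition function
for E_level, E0, f, gamma in lines:
    weight = np.exp(-E_level / kT) / Z                # Boltzmann population of the lower level
    spectrum += weight * f * (gamma / 2) ** 2 / ((E - E0) ** 2 + (gamma / 2) ** 2)

print(f"peak absorption at {E[np.argmax(spectrum)]:.1f} eV")
```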
Abstract:
Past years have seen the development of different approaches to detect phytoplankton groups from space. One of these methods, PHYSAT, is empirically based on reflectance anomalies. Despite observations in good agreement with in situ measurements, the underlying theoretical explanation of the method is still missing and is needed by the ocean color community, as its absence prevents improvement of the method and characterization of uncertainties in the inverted products. In this study, radiative transfer simulations are used in addition to in situ measurements to understand the organization of the signals used in PHYSAT. Sensitivity analyses are performed to assess the impact of the variability of the following three parameters on the reflectance anomalies: specific phytoplankton absorption, colored dissolved organic matter absorption, and particle backscattering. While the latter parameter explains the largest part of the anomaly variability, results show that each group is generally associated with a specific bio-optical environment which should be considered to improve methods of phytoplankton group detection.
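For readers unfamiliar with the quantity being analysed, the sketch below computes PHYSAT-style reflectance anomalies. The abstract does not spell out the definition, so this assumes the commonly described form, a ratio of the measured spectrum to a chlorophyll-dependent mean reference spectrum; the wavelengths, radiances and reference model are hypothetical.

```python
import numpy as np

# Illustrative reflectance-anomaly sketch. The reference model and all
# numbers are made up; they are not the PHYSAT look-up tables.

wavelengths = np.array([412, 443, 490, 510, 555])                 # nm
measured = np.array([0.0042, 0.0038, 0.0031, 0.0024, 0.0016])     # toy measured spectrum

def reference_spectrum(chl):
    """Hypothetical mean spectrum for a given chlorophyll-a concentration."""
    return 0.004 * np.exp(-0.002 * (wavelengths - 412)) / (1.0 + 0.3 * chl)

anomaly = measured / reference_spectrum(chl=0.5)                  # Ra(lambda)
for wl, ra in zip(wavelengths, anomaly):
    print(f"{wl} nm: anomaly = {ra:.2f}")
```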
Abstract:
Over the past decade or so a number of historians of science and historical geographers, alert to the situated nature of scientific knowledge production and reception and to the migratory patterns of science on the move, have called for more explicit treatment of the geographies of past scientific knowledge. Closely linked to work in the sociology of scientific knowledge and science studies, and connected with a heightened interest in spatiality evident across the humanities and social sciences, this ‘spatial turn’ has informed a wide-ranging body of work on the history of science. This discussion essay revisits some of the theoretical props supporting this turn to space and provides a number of worked examples from the history of the life sciences that demonstrate the different ways in which the spaces of science have been comprehended.
Abstract:
The identification of non-linear systems using only observed finite datasets has become a mature research area over the last two decades. A class of linear-in-the-parameter models with universal approximation capabilities has been intensively studied and widely used due to the availability of many linear-learning algorithms and their inherent convergence conditions. This article presents a systematic overview of basic research on model selection approaches for linear-in-the-parameter models. One of the fundamental problems in non-linear system identification is to find the minimal model with the best model generalisation performance from observational data only. The important concepts in achieving good model generalisation used in various non-linear system-identification algorithms are first reviewed, including Bayesian parameter regularisation and model selection criteria based on cross-validation and experimental design. A significant advance in machine learning has been the development of the support vector machine as a means of identifying kernel models based on the structural risk minimisation principle. Developments in convex optimisation-based model construction algorithms, including support vector regression algorithms, are outlined. Input selection algorithms and on-line system identification algorithms are also included in this review. Finally, some industrial applications of non-linear models are discussed.
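A minimal, self-contained sketch of the linear-in-the-parameter framework with cross-validation-based model selection may help fix ideas. The toy system, Gaussian RBF basis, ridge regularisation and candidate model sizes below are illustrative assumptions rather than an algorithm taken from the article.

```python
import numpy as np

# Linear-in-the-parameter non-linear system identification with k-fold
# cross-validation model selection. Toy data and basis; all settings assumed.

rng = np.random.default_rng(1)
x = np.linspace(-3, 3, 120)
y = np.sin(2 * x) + 0.1 * rng.standard_normal(x.size)   # observed data only

def design_matrix(x, centres, width=0.8):
    """Gaussian RBF basis: the model is linear in the weights, non-linear in x."""
    return np.exp(-(x[:, None] - centres[None, :]) ** 2 / (2 * width ** 2))

def cv_error(n_centres, folds=5, ridge=1e-3):
    """k-fold cross-validation error for a model with n_centres basis functions."""
    centres = np.linspace(x.min(), x.max(), n_centres)
    idx = np.arange(x.size) % folds
    err = 0.0
    for k in range(folds):
        tr, te = idx != k, idx == k
        Phi = design_matrix(x[tr], centres)
        # regularised least-squares estimate of the weights
        w = np.linalg.solve(Phi.T @ Phi + ridge * np.eye(n_centres), Phi.T @ y[tr])
        err += np.mean((design_matrix(x[te], centres) @ w - y[te]) ** 2)
    return err / folds

sizes = range(2, 21)
best = min(sizes, key=cv_error)
print(f"selected model size: {best} basis functions")
```

The same skeleton accommodates other selective criteria: swapping `cv_error` for an information criterion or a Bayesian evidence approximation changes only the scoring function, not the model construction.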
Abstract:
This article explores ‘temporal framing’ in the oral conte. The starting point is a recent theoretical debate around the temporal structure of narrative discourse which has highlighted a fundamental tension between the approaches of two of the most influential current theoretical models, one of which is ‘framing theory’. The specific issue concerns the role of temporal adverbials appearing at the head of the clause (e.g. dates, relative temporal adverbials such as le lendemain) versus that of temporal ‘connectives’ such as puis, ensuite, etc. Through an analysis of a corpus of contes performed at the Conservatoire contemporain de Littérature Orale, I shall explore temporal framing in the light of this theoretical debate, and shall argue that, as with other types of narrative discourse, framing is primarily a structural rather than a temporal device in oral narrative. In a final section, I shall further argue, using Kintsch’s construction-integration model of narrative processing, that framing is fundamental to the cognitive processes involved in oral story performance.
Abstract:
The scheduling problem in distributed data-intensive computing environments has become an active research topic due to the tremendous growth in grid and cloud computing environments. As an innovative distributed intelligent paradigm, swarm intelligence provides a novel approach to solving these potentially intractable problems. In this paper, we formulate the scheduling problem for workflow applications with security constraints in distributed data-intensive computing environments and present a novel security constraint model. Several meta-heuristic adaptations to the particle swarm optimization algorithm are introduced to deal with the formulation of efficient schedules. A variable neighborhood particle swarm optimization algorithm is compared with a multi-start particle swarm optimization and a multi-start genetic algorithm. Experimental results illustrate that population-based meta-heuristic approaches usually provide a good balance between global exploration and local exploitation, and demonstrate their feasibility and effectiveness for scheduling workflow applications.
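As an illustration of the kind of particle swarm adaptation discussed, here is a minimal sketch that uses standard global-best PSO to assign tasks to machines so as to minimise makespan. The problem instance, swarm size and coefficients are assumptions; the paper's security constraints and variable-neighbourhood extension are not modelled here.

```python
import numpy as np

# Minimal global-best PSO for a toy task-to-machine scheduling problem
# (minimise makespan). Instance and coefficients are illustrative.

rng = np.random.default_rng(2)
task_cost = rng.uniform(1.0, 10.0, size=20)   # processing time of each task
n_machines = 4

def makespan(position):
    """Decode a continuous particle position into machine assignments."""
    assign = np.clip(position, 0, n_machines - 1e-9).astype(int)
    return max(task_cost[assign == m].sum() for m in range(n_machines))

n_particles, n_iter = 30, 200
w, c1, c2 = 0.7, 1.5, 1.5                     # inertia and acceleration weights

pos = rng.uniform(0, n_machines, (n_particles, task_cost.size))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([makespan(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0, n_machines - 1e-9)
    vals = np.array([makespan(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print(f"best makespan found: {pbest_val.min():.2f}")
```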
Abstract:
The aim of this paper is to explore the ‘natural attitude’ underpinning risk practices in child welfare. This refers to various taken-for-granted approaches to risk that social workers and other human service professionals draw upon in their everyday practice. The approach proceeds by identifying and critically examining three key meta-theoretical paradigms on risk which typically shape the natural attitude. They are labelled ‘objectivist’, ‘subjectivist’ and ‘critical’. The ontological, epistemological, axiological and methodological premises supporting each paradigm, and how they shape risk practices, are then reviewed, leading to a composite, meta-theoretical position on risk termed ‘methodological pragmatism’. This position draws on the strengths of each paradigm and is formulated into ten propositions which consider how risk should be approached in child welfare. Within this corpus of thought, salient themes are endorsed, such as the need for method triangulation, an examination of ‘deep causality’, and the promotion of emancipatory perspectives. By critically reflecting on meta-theory, the paper contributes to the development of substantive theories of risk assessment and management in child welfare.
Abstract:
There is extensive theoretical work on measures of inconsistency for arbitrary formulae in knowledge bases. Many of these are defined in terms of the set of minimal inconsistent subsets (MISes) of the base. However, few have been implemented or experimentally evaluated to support their viability, since computing all MISes is intractable in the worst case. Fortunately, recent work on a related problem of minimal unsatisfiable sets of clauses (MUSes) offers a viable solution in many cases. In this paper, we begin by drawing connections between MISes and MUSes through algorithms based on a MUS generalization approach and a new optimized MUS transformation approach to finding MISes. We implement these algorithms, along with a selection of existing measures for flat and stratified knowledge bases, in a tool called mimus. We then carry out an extensive experimental evaluation of mimus using randomly generated arbitrary knowledge bases. We conclude that these measures are viable for many large and complex random instances. Moreover, they represent a practical and intuitive tool for inconsistency handling.
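To show what is being computed, the sketch below enumerates minimal inconsistent subsets of a tiny propositional knowledge base by brute force over truth assignments. The formulas are illustrative, and the exhaustive check merely stands in for the SAT-solver-based MUS extraction that mimus relies on, which scales far better.

```python
from itertools import combinations, product

# Brute-force enumeration of minimal inconsistent subsets (MISes) of a tiny
# propositional knowledge base. Formulas and the truth-table check are toy.

# Each formula is a function of a truth assignment (a dict of atom -> bool).
kb = {
    "a":      lambda v: v["a"],
    "not a":  lambda v: not v["a"],
    "a -> b": lambda v: (not v["a"]) or v["b"],
    "not b":  lambda v: not v["b"],
}
atoms = ["a", "b"]

def satisfiable(names):
    """Exhaustively check whether the selected formulas have a joint model."""
    for bits in product([False, True], repeat=len(atoms)):
        v = dict(zip(atoms, bits))
        if all(kb[n](v) for n in names):
            return True
    return False

mises = []
for size in range(1, len(kb) + 1):
    for subset in combinations(kb, size):
        # keep only inconsistent sets that contain no previously found MIS
        if not satisfiable(subset) and not any(set(m) <= set(subset) for m in mises):
            mises.append(subset)

print("minimal inconsistent subsets:", mises)
```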
Abstract:
The selective catalytic reduction (SCR) of NOx compounds with NH3 has been a hot topic in recent years. Among various catalysts, zeolites have proved to be efficient and promising for NH3-SCR, yet the whole process and the intrinsic mechanism are still not well understood due to the structural complexity of zeolites. With the improvement of theoretical chemistry techniques, quantum-chemical calculations are now capable of modeling the structure, acidity, adsorption, and ultimately reaction pathways over zeolites to some extent. In this review, a brief summary of the relevant concepts of NH3-SCR is presented. Cluster approaches, embedded techniques, and periodic treatments are described as the three main methods. Details of quantum-chemical investigations of the key issues, such as the structure of active sites, the adsorption of small molecules, and the reaction mechanism of NH3-SCR over zeolites, are discussed. Finally, a perspective on future theoretical research is given.
Abstract:
Law's Ethical, Global and Theoretical Contexts examines William Twining's principal contributions to law and jurisprudence in the context of three issues which will receive significant scholarly attention over the coming decades. Part I explores human rights, including torture, the role of evidence in human rights cases, the emerging discourse on 'traditional values', the relevance of 'Southern voices' to human rights debates, and the relationship between human rights and peace agreements. Part II assesses the impact of globalization through the lenses of sociology and comparative constitutionalism, and features an analysis of the development of pluralistic ideas of law in the context of privatization. Finally, Part III addresses issues of legal theory, including whether global legal pluralism needs a concept of law, the importance of context in legal interpretation, the effect of increasing digitalization on legal theory, and the utility of feminist and postmodern approaches to globalization and legal theory.
Abstract:
Religion is a funny thing, because it always seems to be riding two horses at once. One could describe these horses in a number of different ways, using all sorts of familiar dichotomies; practice and belief, body and soul, earthly and heavenly, here and hereafter. “Give us this day our daily bread and forgive us our trespasses”. Here, food and forgiveness, or, perhaps more accurately, ingestion and salvation, are claimed, simultaneously – even seamlessly – by religion. This list could (and does) go on, being inclusive of, for example, immanence and transcendence – but more on this below. Yet these binary pairs can clearly be observed bleeding into one another. Ingesting pork, for example, often appears to be religiously more troublesome than does ingesting bread. This is because matter matters. We may ask, then, is religion really riding two horses, or are these ‘familiar dichotomies’ so familiar because they are false? Rephrasing the question in terms that partially echo the title and subtitle of Morgan’s (2010) landmark edited volume Religion and Material Culture: The Matter of Belief, is, I think, helpfully clarifying. What, then, is the matter with religion? The answer presented below is that, very often, the matter with religion is the matter of religion. Put more simply still, the problem with religion is its materiality. This chapter examines the whys and wherefores of this problem for the anthropology of religion – its ethnographic puzzles and methodological opportunities, as well as its conceptual impasses and theoretical insights.
Abstract:
Real-space grids are a powerful alternative for the simulation of electronic systems. One of the main advantages of the approach is the flexibility and simplicity of working directly in real space, where the different fields are discretized on a grid, combined with competitive numerical performance and great potential for parallelization. These properties constitute a great advantage when implementing and testing new physical models. Based on our experience with the Octopus code, in this article we discuss how the real-space approach has allowed for the recent development of new ideas for the simulation of electronic systems. Among these applications are approaches to calculate response properties, modeling of photoemission, optimal control of quantum systems, simulation of plasmonic systems, and the exact solution of the Schrödinger equation for low-dimensionality systems.
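A minimal sketch of the real-space-grid idea for the low-dimensionality case mentioned last: discretise the 1D Schrödinger equation on a uniform grid with central finite differences and diagonalise the resulting Hamiltonian. The harmonic potential and grid parameters are illustrative; Octopus itself handles realistic systems with pseudopotentials and far more sophisticated numerics.

```python
import numpy as np

# 1D Schrödinger equation on a uniform real-space grid (atomic units):
# central finite differences for the kinetic term, direct diagonalisation.

n, L = 400, 20.0                       # grid points and box length (assumed)
x = np.linspace(-L / 2, L / 2, n)
h = x[1] - x[0]

V = 0.5 * x ** 2                       # harmonic oscillator potential
# second-derivative operator by central finite differences
lap = (np.diag(np.full(n - 1, 1.0), -1) - 2 * np.eye(n)
       + np.diag(np.full(n - 1, 1.0), 1)) / h ** 2
H = -0.5 * lap + np.diag(V)            # Hamiltonian in the grid basis

energies = np.linalg.eigvalsh(H)[:3]
print("lowest eigenvalues:", np.round(energies, 3))   # approximately 0.5, 1.5, 2.5
```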
Abstract:
This study considers the potential for influencing business students to become ethical managers by directing their undergraduate learning environment. In particular, the relationship between business students’ academic cheating, as a predictor of workplace ethical behavior, and their approaches to learning is explored. The three approaches to learning identified from the students’ approaches to learning literature are deep approach, represented by an intrinsic interest in and a desire to understand the subject, surface approach, characterized by rote learning and memorization without understanding, and strategic approach, associated with competitive students whose motivation is the achievement of good grades by adopting either a surface or deep approach. Consistent with the hypothesized theoretical model, structural equation modeling revealed that the surface approach is associated with higher levels of cheating, while the deep approach is related to lower levels. The strategic approach was also associated with less cheating and had a statistically stronger influence than the deep approach. Further, a significantly positive relationship reported between deep and strategic approaches suggests that cheating is reduced when deep and strategic approaches are paired. These findings suggest that future managers and business executives can be influenced to behave more ethically in the workplace by directing their learning approaches. It is hoped that the evidence presented may encourage those involved in the design of business programs to implement educational strategies which optimize students’ approaches to learning towards deep and strategic characteristics, thereby equipping tomorrow’s managers and business executives with skills to recognize and respond appropriately to workplace ethical dilemmas.
Abstract:
This thesis contributes to a general theory of project design. Set within a demand shaped by the challenges of sustainable development, the main objective of this research is to contribute a theoretical model of design that better situates the use of tools and standards for assessing the sustainability of a project. The fundamental principles of these normative instruments are analysed along four dimensions: ontological, methodological, epistemological and teleological. Indicators of certain counter-productive effects related, in particular, to the application of these standards confirm the need for a theory of qualitative judgement. Our main hypothesis builds on the conceptual framework offered by the notion of the ‘precautionary principle’, whose first formulations date back to the early 1970s and were intended precisely to remedy the shortcomings of traditional scientific assessment tools and methods. The thesis is divided into five parts. Beginning with a historical review of the classical models of design theory (design thinking), it focuses on the evolution of the ways in which sustainability has been taken into account. From this perspective, theories of ‘green design’ dating from the early 1960s and theories of ‘ecological design’ dating from the 1970s and 1980s are seen to have ultimately converged with the more recent theories of ‘sustainable design’ that emerged in the early 1990s. The different approaches to the ‘precautionary principle’ are then examined from the standpoint of project sustainability. Risk-assessment standards are compared with approaches based on the precautionary principle, revealing certain limitations in the design of a project. A first theoretical model of design integrating the main dimensions of the precautionary principle is thus sketched out. This model offers a global vision for judging a project that integrates principles of sustainable development, and presents itself as an alternative to traditional risk-assessment approaches, which are both deterministic and instrumental. The precautionary-principle hypothesis is then proposed and examined in the specific context of the architectural project. This exploration begins with a presentation of the classical notion of ‘prudence’ as it was historically used to guide architectural judgement. What, then, of the challenges posed by the judgement of architectural projects amid the rise of standardized assessment methods (e.g. Leadership in Energy and Environmental Design; LEED)? The thesis proposes a reinterpretation of design theory as formulated by Donald A. Schön as a way of taking assessment tools such as LEED into account. This exercise nevertheless reveals an epistemological obstacle that must be taken into account in a reformulation of the model. In line with constructivist epistemology, a new theoretical model is then confronted with the study and illustration of three contemporary Canadian architectural competitions that adopted the sustainability assessment method standardized by LEED. A preliminary series of ‘tensions’ is identified in the process of designing and judging the projects.
These tensions are then categorized into their conceptual counterparts, constructed at the intersection of the precautionary principle and design theories. They fall into four categories: (1) conceptualization, analogical/logical; (2) uncertainty, epistemological/methodological; (3) comparability, interpretive/analytical; and (4) proposition, universality/contextual relevance. These conceptual tensions are regarded as so many vectors correlating with the theoretical model, which they help to enrich without thereby constituting validations in the positivist sense of the term. These confrontations with the real allow the epistemological obstacle identified earlier to be defined more precisely. The thesis thus highlights the generally underestimated impacts of environmental standardization on the process of designing and judging projects. It takes as its example, in a non-restrictive way, the examination of Canadian architectural competitions for public buildings. The conclusion underscores the need for a new form of ‘reflexive prudence’ as well as a more critical use of current sustainability assessment tools. It calls for an instrumentation based on the global integration, rather than the opposition, of environmental approaches.