880 results for theory and modeling
Abstract:
Aquifers are a vital water resource whose quality characteristics must be safeguarded or, if damaged, restored. The extent and complexity of aquifer contamination are related to the characteristics of the porous medium, the influence of boundary conditions, and the biological, chemical and physical processes involved. Since the 1990s, scientists' efforts have grown considerably in the search for efficient ways of estimating aquifer hydraulic parameters and, from these, recovering the contaminant source position and its release history. To isolate and understand the influence of these various factors on aquifer phenomena, researchers commonly use numerical and controlled experiments. This work presents some of these methods, applying and comparing them on data collected during laboratory, field and numerical tests. The work is structured in four parts, which present the results and conclusions for each of the specific objectives.
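As a concrete illustration of the kind of parameter estimation referred to above (and not of the thesis's own methods), the following minimal sketch fits the simplified Ogata-Banks analytical solution of the one-dimensional advection-dispersion equation to a synthetic breakthrough curve, recovering a seepage velocity and a dispersion coefficient as one might from laboratory column data; all values are hypothetical.

```python
# Hedged sketch: estimate a seepage velocity v and a dispersion coefficient D by
# fitting the simplified Ogata-Banks solution of the 1-D advection-dispersion
# equation to a synthetic breakthrough curve. All numbers are hypothetical.
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erfc

x = 0.5  # observation distance from the inlet [m]

def breakthrough(t, v, D):
    """Approximate relative concentration C/C0 at distance x (second Ogata-Banks term neglected)."""
    return 0.5 * erfc((x - v * t) / (2.0 * np.sqrt(D * t)))

# synthetic "measured" data generated with v = 1e-4 m/s, D = 5e-6 m^2/s, plus noise
rng = np.random.default_rng(0)
t_obs = np.linspace(600.0, 12000.0, 40)
c_obs = breakthrough(t_obs, 1e-4, 5e-6) + rng.normal(0.0, 0.01, t_obs.shape)

(v_hat, D_hat), _ = curve_fit(breakthrough, t_obs, c_obs, p0=[5e-5, 1e-6], bounds=(0, np.inf))
print(f"estimated v = {v_hat:.2e} m/s, D = {D_hat:.2e} m^2/s")
```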
Abstract:
The data available during the drug discovery process is vast in amount and diverse in nature. To gain useful information from such data, an effective visualisation tool is required. To provide better visualisation facilities to domain experts (screening scientists, biologists, chemists, etc.), we developed a software tool based on recently developed principled visualisation algorithms such as Generative Topographic Mapping (GTM) and Hierarchical Generative Topographic Mapping (HGTM). The software also supports conventional visualisation techniques such as Principal Component Analysis, NeuroScale, PhiVis, and Locally Linear Embedding (LLE). In addition, it provides global and local regression facilities, supporting algorithms such as the Multilayer Perceptron (MLP), Radial Basis Function networks (RBF), Generalised Linear Models (GLM), Mixture of Experts (MoE), and the newly developed Guided Mixture of Experts (GME). This user manual gives an overview of the purpose of the software tool, highlights some of the issues to consider when creating a new model, and explains how to install and use the tool. Readers are not required to be familiar with the algorithms the tool implements; basic computing skills are enough to operate the software.
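As a small, hedged illustration of two of the conventional projection techniques named above (and not of the software's own interface, which is not reproduced here), the sketch below applies PCA and Locally Linear Embedding from scikit-learn to a hypothetical screening data matrix.

```python
# Hedged sketch using scikit-learn (not the tool's own API): project a hypothetical
# 500-compound x 30-descriptor screening matrix to 2-D with PCA and LLE.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import LocallyLinearEmbedding

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 30))  # hypothetical compounds x assay/descriptor values

pca_coords = PCA(n_components=2).fit_transform(X)
lle_coords = LocallyLinearEmbedding(n_neighbors=10, n_components=2).fit_transform(X)
print(pca_coords.shape, lle_coords.shape)  # both (500, 2): 2-D coordinates ready to plot
```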
Abstract:
Today, the data available to tackle many scientific challenges is vast in quantity and diverse in nature. The exploration of heterogeneous information spaces requires suitable mining algorithms as well as effective visual interfaces. miniDVMS v1.8 provides a flexible visual data mining framework which combines advanced projection algorithms developed in the machine learning domain with visual techniques developed in the information visualisation domain. The advantage of this interface is that the user is directly involved in the data mining process. Principled projection methods, such as generative topographic mapping (GTM) and hierarchical GTM (HGTM), are integrated with powerful visual techniques, such as magnification factors, directional curvatures, parallel coordinates, and user interaction facilities, to provide this integrated visual data mining framework. The software also supports conventional visualisation techniques such as principal component analysis (PCA), Neuroscale, and PhiVis. This user manual gives an overview of the purpose of the software tool, highlights some of the issues to consider when creating a new model, and explains how to install and use the tool. Readers are not required to be familiar with the algorithms the tool implements; basic computing skills are enough to operate the software.
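By way of illustration of one of the visual techniques mentioned above (parallel coordinates), the following minimal sketch uses pandas and matplotlib rather than miniDVMS itself; the column names and data are hypothetical.

```python
# Hedged sketch of a parallel-coordinates view, drawn with pandas/matplotlib rather
# than miniDVMS; data and column names are hypothetical.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from pandas.plotting import parallel_coordinates

rng = np.random.default_rng(0)
df = pd.DataFrame(rng.normal(size=(100, 4)), columns=["f1", "f2", "f3", "f4"])
df["cluster"] = rng.choice(["A", "B"], size=100)  # a class column is required by the plot

parallel_coordinates(df, class_column="cluster", alpha=0.4)
plt.title("Parallel coordinates view of a hypothetical dataset")
plt.show()
```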
Abstract:
For anyone involved in developing a research project, this textbook provides an integrated, accessible and humorous account that explains why research methods are the way they are and how they do what they do. Unrivalled in its nature, Doing Business Research addresses the research project as a whole and provides:
- essential detail of philosophical and theoretical matters that are crucial to conceptualising the nature of methodology;
- a pragmatic guide to why things are important and how they are important;
- a huge range of things to consider that the reader can use to develop their research project further;
- a resource book, providing extensive suggested reading to help the researcher do their research.
Abstract:
It has been suggested that, in order to maintain its relevance, critical research must develop a strong emphasis on empirical work rather than the conceptual emphasis that has typically characterized critical scholarship in management. A critical project of this nature is applicable in the information systems (IS) arena, which has a growing tradition of qualitative inquiry. Despite its relativist ontology, actor–network theory places a strong emphasis on empirical inquiry and this paper argues that actor–network theory, with its careful tracing and recording of heterogeneous networks, is well suited to the generation of detailed and contextual empirical knowledge about IS. The intention in this paper is to explore the relevance of IS research informed by actor–network theory in the pursuit of a broader critical research project as defined in earlier work.
Abstract:
We study the role of political accountability as a determinant of corruption and economic growth. Our model identifies two governance regimes, defined by the quality of political institutions, and shows that the relationship between corruption and growth is regime specific. We use a threshold model to estimate the impact of corruption on growth, treating corruption as an endogenous variable. We find two governance regimes, conditional on the quality of political institutions. In the regime with high-quality political institutions, corruption has a substantial negative impact on growth; in the regime with low-quality institutions, corruption has no impact on growth.
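A minimal sketch of a two-regime threshold regression in the spirit of the model described above might look as follows; note that the paper additionally treats corruption as an endogenous variable, which this illustration does not, and the data and the 0.5 threshold used to generate them are invented.

```python
# Hedged sketch of a two-regime threshold regression: grid-search the institutional-quality
# threshold that minimises total SSR, then fit growth ~ corruption within each regime.
import numpy as np

rng = np.random.default_rng(1)
n = 200
quality = rng.uniform(0, 1, n)      # hypothetical index of institutional quality
corruption = rng.uniform(0, 1, n)   # hypothetical corruption index
# synthetic "growth": corruption hurts growth only where institutions are strong,
# mirroring the regime-specific relationship described in the abstract
growth = np.where(quality > 0.5, 2.0 - 1.5 * corruption, 1.0) + rng.normal(0, 0.2, n)

def regime_ssr(mask):
    """OLS of growth on corruption within one regime; returns (SSR, coefficients)."""
    X = np.c_[np.ones(mask.sum()), corruption[mask]]
    coef, *_ = np.linalg.lstsq(X, growth[mask], rcond=None)
    resid = growth[mask] - X @ coef
    return resid @ resid, coef

# grid-search the threshold over interior quantiles so both regimes stay populated
candidates = np.quantile(quality, np.linspace(0.15, 0.85, 71))
best = min(candidates, key=lambda g: regime_ssr(quality <= g)[0] + regime_ssr(quality > g)[0])
_, low_coef = regime_ssr(quality <= best)
_, high_coef = regime_ssr(quality > best)
print(f"estimated threshold: {best:.2f}")
print(f"low-quality regime:  growth = {low_coef[0]:.2f} {low_coef[1]:+.2f} * corruption")
print(f"high-quality regime: growth = {high_coef[0]:.2f} {high_coef[1]:+.2f} * corruption")
```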
Abstract:
The book aims to introduce the reader to DEA in the most accessible manner possible. It is specifically aimed at those who have had no prior exposure to DEA and wish to learn its essentials: how it works, its key uses, and the mechanics of using it, including the use of DEA software. Students on degree or training courses will find the book especially helpful. The same is true of practitioners engaging in comparative efficiency assessments and performance management within their organisation. Examples are used throughout the book to help the reader consolidate the concepts covered. Table of contents:
1. Introduction to Performance Measurement
2. Definitions of Efficiency and Related Measures
3. Data Envelopment Analysis under Constant Returns to Scale: Basic Principles
4. Data Envelopment Analysis under Constant Returns to Scale: General Models
5. Using Data Envelopment Analysis in Practice
6. Data Envelopment Analysis under Variable Returns to Scale
7. Assessing Policy Effectiveness and Productivity Change Using DEA
8. Incorporating Value Judgements in DEA Assessments
9. Extensions to Basic DEA Models
10. A Limited User Guide for Warwick DEA Software
(plus List of Tables, List of Figures, Preface, Abbreviations, Author Index, Topic Index, and References).
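For readers who want to see the mechanics, the sketch below solves the basic input-oriented, constant-returns-to-scale (CCR) DEA model as a linear programme with SciPy; the input/output figures are invented, and this is not the Warwick DEA software referred to in the book.

```python
# Hedged sketch of an input-oriented CCR (constant returns to scale) DEA model,
# solved as a linear programme. The data below are made-up illustrative figures.
import numpy as np
from scipy.optimize import linprog

# inputs X: (n_dmus, n_inputs), outputs Y: (n_dmus, n_outputs)
X = np.array([[20., 300.], [30., 200.], [40., 100.], [20., 200.], [10., 400.]])
Y = np.array([[100.], [100.], [100.], [80.], [90.]])

def ccr_efficiency(o, X, Y):
    """Efficiency score theta of DMU `o` under constant returns to scale."""
    n, m = X.shape          # number of DMUs, number of inputs
    s = Y.shape[1]          # number of outputs
    # decision variables: [theta, lambda_1 ... lambda_n]
    c = np.r_[1.0, np.zeros(n)]
    # input constraints:  sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.c_[-X[o], X.T]
    b_in = np.zeros(m)
    # output constraints: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.c_[np.zeros(s), -Y.T]
    b_out = -Y[o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

for o in range(len(X)):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o, X, Y):.3f}")
```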
Abstract:
There would seem to be no greater field for observing the effects of neo-liberal reforms in higher education than the former Soviet university, where attempts to legitimize neo-liberal philosophy over Soviet ideology play out in everyday practices of educational reform. However, ethnographic research about higher education in post-Soviet Central Asia suggests that its “liberalization” is both an ideological myth and a complicated reality. This chapter focuses on how and why neo-liberal agendas have “travelled” to the Central Asian republic of Kyrgyzstan, what happens when educators encounter and resist them, and why these spaces of resistance are important starting points for the development of alternative visions of educational possibility in this recently “Third-worlded” society.
Abstract:
Many see the absence of conflict between groups as indicative of effective intergroup relations; others consider its management a suitable effectiveness criterion. In this article we demarcate a different approach and propose that these views are deficient in describing effective intergroup relations. The article theorizes alternative criteria of intergroup effectiveness rooted in team representatives' subjective value judgements and assesses the psychometric characteristics of a short measure based on these criteria. Results on empirical validity suggest that the measure is a potential alternative outcome of organizational conflict. Implications for both the study of intergroup relations and conflict theory are discussed.
Abstract:
The introduction situates the ‘hard problem’ in its historical context and argues that the problem has two sides: the output side (the Kant-Eccles problem of the freedom of the Will) and the input side (the problem of qualia). The output side ultimately reduces to whether quantum mechanics can affect the operation of synapses. A discussion of the detailed molecular biology of synaptic transmission as presently understood suggests that such effects are unlikely. Instead, an evolutionary argument is presented which suggests that our conviction of free agency is an evolutionarily induced illusion and hence that the Kant-Eccles problem is itself illusory. This conclusion is supported by well-known neurophysiology. The input side, the problem of qualia, of subjectivity, is not so easily outflanked. After a brief review of the neurophysiological correlates of consciousness (NCC) and of the Penrose-Hameroff microtubular neuroquantology, it is again concluded that the molecular neurobiology makes quantum wave-mechanics an unlikely explanation. Instead, recourse is made to an evolutionarily and neurobiologically informed panpsychism. The notion of an ‘emergent’ property is carefully distinguished from that of the more usual ‘system’ property used by most dual-aspect theorists (and the majority of neuroscientists) and is used to support Llinas’ concept of an ‘oneiric’ consciousness continuously modified by sensory input. I conclude that a panpsychist theory such as this, coupled with the non-classical understanding of matter flowing from quantum physics (both epistemological and scientific), may be the default and only solution to the problem posed by the presence of mind in a world of things.
Abstract:
It is generally assumed when using Bayesian inference methods for neural networks that the input data contains no noise. For real-world (errors-in-variables) problems this is clearly an unsafe assumption. This paper presents a Bayesian neural network framework which accounts for input noise provided that a model of the noise process exists. In the limit where the noise process is small and symmetric it is shown, using the Laplace approximation, that this method adds an extra term to the usual Bayesian error bar which depends on the variance of the input noise process. Further, by treating the true (noiseless) input as a hidden variable, and sampling this jointly with the network’s weights using a Markov chain Monte Carlo method, it is demonstrated that it is possible to infer the regression over the noiseless input. This leads to the possibility of training an accurate model of a system using less accurate, or more uncertain, data. This is demonstrated on both a synthetic noisy sine wave problem and a real problem of inferring the forward model for a satellite radar backscatter system used to predict sea surface wind vectors.
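The following minimal sketch is not the paper's full framework; it swaps the neural network for a Bayesian linear model with RBF basis functions, but illustrates the flavour of the Laplace-approximation result on a noisy sine wave: the predictive variance gains an extra term of roughly (dy/dx)^2 * sigma_x^2 from small, symmetric input noise. All hyperparameter and noise values are assumed for illustration.

```python
# Hedged sketch: Bayesian linear regression with RBF basis functions on a noisy sine,
# plus the extra predictive-variance term contributed by small, symmetric input noise:
# sigma^2_total(x) ~= sigma^2_bayes(x) + (dy/dx)^2 * sigma^2_x.
import numpy as np

rng = np.random.default_rng(0)
sigma_x, sigma_t = 0.1, 0.1                              # assumed input- and output-noise std devs
x_true = rng.uniform(0, 2 * np.pi, 80)
x_obs = x_true + rng.normal(0, sigma_x, x_true.shape)    # noisy (errors-in-variables) inputs
t = np.sin(x_true) + rng.normal(0, sigma_t, x_true.shape)

centres = np.linspace(0, 2 * np.pi, 12)
width = 0.5

def phi(x):
    """RBF design matrix for inputs x."""
    return np.exp(-0.5 * ((np.atleast_1d(x)[:, None] - centres) / width) ** 2)

alpha, beta = 1e-2, 1.0 / sigma_t ** 2                   # prior precision, output-noise precision
Phi = phi(x_obs)
S_N = np.linalg.inv(alpha * np.eye(len(centres)) + beta * Phi.T @ Phi)
m_N = beta * S_N @ Phi.T @ t

def predict(x, eps=1e-4):
    p = phi(x)
    mean = p @ m_N
    var_bayes = 1.0 / beta + np.einsum('ij,jk,ik->i', p, S_N, p)   # usual Bayesian error bar
    dydx = (phi(x + eps) @ m_N - phi(x - eps) @ m_N) / (2 * eps)   # numerical output gradient
    var_input = dydx ** 2 * sigma_x ** 2                           # extra input-noise term
    return mean, var_bayes, var_bayes + var_input

xs = np.linspace(0, 2 * np.pi, 5)
for x, m, v0, v1 in zip(xs, *predict(xs)):
    print(f"x={x:.2f}  mean={m:+.3f}  std(no input noise)={np.sqrt(v0):.3f}  std(corrected)={np.sqrt(v1):.3f}")
```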
Abstract:
This work explores the relevance of semantic and linguistic description to translation theory and practice. It aims towards a practical model for approaching texts to be translated. As literary texts [poetry mainly] are the focus of attention, so are stylistic matters. Note, however, that 'style', and, to some extent, the conclusions of the work, are not limited to so-called literary texts. The study of semantic description reveals that most translation problems stem not from the cognitive (langue-related) but from the contextual (parole-related) aspects of meaning. Thus, any linguistic model that fails to account for the latter is bound to fall short. T.G.G. does, whereas Systemics, concerned with both the 'langue' and 'parole' (mainly stylistic and sociolinguistic) aspects of meaning, provides a useful framework for approaching texts to be translated. Two essential semantic principles for translation are that meaning is the property of a language (Firth), and the 'relativity of meaning assignments' (Tymoczko). Both imply that meaning can only be assessed correctly against the relevant socio-cultural background. Translation is seen as a restricted creation, and the translator's approach as a three-dimensional critical one. To encompass texts from the most technical to the most literary, and to account for variations in emphasis within any text, translation theory must be based on a typology of functions: Halliday's ideational, interpersonal and textual functions, or Bühler's symbol, signal and symptom functions. Function, overall and specific, will dictate aims and method, and also provide the critic with criteria to assess a translation's faithfulness. Translation can never be reduced to purely objective methods, however. Intuitive procedures intervene in textual interpretation and analysis, in the choice of equivalents, and in the reception of a translation. Ultimately, translation theory and practice may perhaps constitute the touchstone for the validity of linguistic and semantic theories.