12 results for Subjectivity and objectivity
in Aston University Research Archive
Abstract:
Techniques are developed for the visual interpretation of drainage features from satellite imagery. The process of interpretation is formalised by the introduction of objective criteria. Problems of assessing the accuracy of maps are recognised, and a method is developed for quantifying the correctness of an interpretation, in which the more important features are given an appropriate weight. A study was made of imagery from a variety of landscapes in Britain and overseas, from which maps of drainage networks were drawn. The accuracy of the mapping was assessed in absolute terms, and also in relation to the geomorphic parameters used in hydrologic models. Results are presented relating the accuracy of interpretation to image quality, subjectivity and the effects of topography. It is concluded that the visual interpretation of satellite imagery gives maps of sufficient accuracy for the preliminary assessment of water resources, and for the estimation of geomorphic parameters. An examination is made of the use of remotely sensed data in hydrologic models. It is proposed that the spectral properties of a scene are holistic, and are therefore more efficient than conventional catchment characteristics. Key hydrologic parameters were identified and estimated from streamflow records. The correlation between hydrologic variables and spectral characteristics was examined, and regression models for streamflow were developed based solely on spectral data. Regression models were also developed using conventional catchment characteristics, whose values were estimated using satellite imagery. It was concluded that models based primarily on variables derived from remotely sensed data give results as good as, or better than, models using conventional map data. The holistic properties of remotely sensed data are realised only in undeveloped areas; in developed areas, an assessment of current land use is a more useful indicator of hydrologic response.
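As a rough illustration of the spectral-only regression idea described above (the variable names and values below are hypothetical, not the thesis data), a model of this kind regresses a streamflow statistic on per-band spectral summaries for a set of catchments:

```python
# Hypothetical illustration, not the thesis code: regressing a streamflow
# statistic on the mean spectral reflectance of each band over a catchment.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n_catchments, n_bands = 30, 4

# X: mean reflectance per spectral band for each catchment (made-up data)
X = rng.uniform(0.05, 0.45, size=(n_catchments, n_bands))
# y: a hydrologic variable estimated from streamflow records, e.g. mean annual flow
y = 2.0 - 3.5 * X[:, 0] + 1.8 * X[:, 2] + rng.normal(0, 0.1, n_catchments)

model = LinearRegression().fit(X, y)
print("R^2 of spectral-only model:", round(model.score(X, y), 3))
print("band coefficients:", model.coef_.round(2))
```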
Abstract:
Concept evaluation at the early phase of product development plays a crucial role in new product development. It determines the direction of the subsequent design activities. However, the evaluation information at this stage mainly comes from experts' judgments, which are subjective and imprecise. How to manage this subjectivity so as to reduce evaluation bias is a major challenge in design concept evaluation. This paper proposes a comprehensive evaluation method which combines information entropy theory and rough numbers. Rough numbers are first used to aggregate individual judgments and priorities and to handle vagueness in a group decision-making environment. A rough-number-based information entropy method is then proposed to determine the relative weights of the evaluation criteria. Composite performance values based on rough numbers are finally calculated to rank the candidate design concepts. The results of a practical case study on the concept evaluation of an industrial robot design show that the integrated evaluation model can effectively strengthen objectivity throughout the decision-making process.
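To make the two ingredients concrete, the sketch below is an illustration under our own assumptions, not the paper's implementation: a rough-number aggregation of one criterion's expert ratings, and entropy-based weights for the evaluation criteria.

```python
# Hypothetical sketch: rough-number aggregation and entropy-based criterion weights.
import numpy as np

def rough_interval(ratings):
    """Rough-number aggregation of one criterion's expert ratings.
    For each rating, the lower limit is the mean of all ratings <= it and the
    upper limit the mean of all ratings >= it; the rough interval averages
    these limits over all experts."""
    r = np.asarray(ratings, dtype=float)
    lowers = [r[r <= x].mean() for x in r]
    uppers = [r[r >= x].mean() for x in r]
    return np.mean(lowers), np.mean(uppers)

def entropy_weights(perf):
    """Entropy weights from a (concepts x criteria) performance matrix:
    criteria whose scores vary more across concepts receive larger weights.
    Assumes all performance values are positive."""
    p = perf / perf.sum(axis=0)                 # normalise each criterion column
    k = 1.0 / np.log(perf.shape[0])
    e = -k * (p * np.log(p)).sum(axis=0)        # entropy per criterion
    d = 1.0 - e                                 # degree of divergence
    return d / d.sum()

# Example: four experts rate one criterion; three concepts scored on three criteria
print(rough_interval([3, 4, 4, 5]))
perf = np.array([[7.0, 5.0, 8.0], [6.0, 6.0, 5.0], [8.0, 4.0, 6.0]])
print(entropy_weights(perf).round(3))
```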
Abstract:
In this paper, the exchange rate forecasting performance of neural network models is evaluated against random walk, autoregressive moving average and generalised autoregressive conditional heteroskedasticity models. No guidelines are available for choosing the parameters of neural network models, so the parameters are chosen according to what the researcher considers to be best. Such an approach, however, implies that the risk of making bad decisions is extremely high, which could explain why, in many studies, neural network models do not consistently perform better than their time series counterparts. In this paper, through extensive experimentation, the level of subjectivity in building neural network models is considerably reduced, giving them a better chance of performing well. The results show that, in general, neural network models perform better than the traditionally used time series models in forecasting exchange rates.
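The sketch below illustrates the general setup only; the architecture, lag length and data are our own assumptions, not the study's. A small neural network forecasts next-period returns from lagged returns and is benchmarked against the random walk, which predicts no change.

```python
# Illustrative sketch: MLP forecast of next-period FX returns vs. a random walk.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(1)
returns = rng.normal(0, 0.006, 1000)           # stand-in for daily FX returns

lags = 5
X = np.column_stack([returns[i:len(returns) - lags + i] for i in range(lags)])
y = returns[lags:]
split = int(0.8 * len(y))                      # chronological train/test split

nn = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
nn.fit(X[:split], y[:split])

rmse_nn = mean_squared_error(y[split:], nn.predict(X[split:])) ** 0.5
rmse_rw = mean_squared_error(y[split:], np.zeros(len(y) - split)) ** 0.5
print(f"RMSE  NN: {rmse_nn:.5f}   random walk: {rmse_rw:.5f}")
```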
Abstract:
Linear models reach their limitations in applications with nonlinearities in the data. In this paper, new empirical evidence is provided on the relative Euro inflation forecasting performance of linear and non-linear models. The well-established and widely used univariate ARIMA and multivariate VAR models are used as linear forecasting models, whereas neural networks (NN) are used as non-linear forecasting models. The level of subjectivity in the NN building process is kept to a minimum in an attempt to exploit the full potential of the NNs. It is also investigated whether the historically poor performance of the theoretically superior measure of the monetary services flow, Divisia, relative to the traditional Simple Sum measure could be attributed, to a certain extent, to the evaluation of these indices within a linear framework. The results suggest that non-linear models provide better within-sample and out-of-sample forecasts, and that linear models are simply a subset of them. The Divisia index also outperforms the Simple Sum index when evaluated in a non-linear framework. © 2005 Taylor & Francis Group Ltd.
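For context, the Divisia monetary aggregate referred to above is commonly computed with a discrete-time (Törnqvist-Theil) approximation; the notation below is generic rather than the paper's own, with m_{jt} the component asset quantities, r_{jt} their own rates of return, R_t the benchmark rate and π_{jt} the user costs.

```latex
% Tornqvist-Theil (discrete-time Divisia) growth rate of the monetary services index
\[
  \ln M^{D}_{t} - \ln M^{D}_{t-1}
    = \sum_{j=1}^{n} \bar{s}_{jt}\left(\ln m_{jt} - \ln m_{j,t-1}\right),
  \qquad
  \bar{s}_{jt} = \tfrac{1}{2}\left(s_{jt} + s_{j,t-1}\right),
\]
\[
  s_{jt} = \frac{\pi_{jt}\, m_{jt}}{\sum_{k=1}^{n} \pi_{kt}\, m_{kt}},
  \qquad
  \pi_{jt} = \frac{R_{t} - r_{jt}}{1 + R_{t}},
  \qquad
  \text{whereas the Simple Sum is } M^{SS}_{t} = \sum_{j=1}^{n} m_{jt}.
\]
```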
Abstract:
The existing method of pipeline health monitoring, which requires an entire pipeline to be inspected periodically, is unproductive. A risk-based decision support system (DSS) that reduces the amount of time spent on inspection is presented. The risk-based DSS uses the analytic hierarchy process (AHP), a multiple-attribute decision-making technique, to identify the factors that influence failure on specific segments and analyzes their effects by determining the probability of occurrence of these risk factors. The severity of failure is determined through consequence analysis. From this, the effect of a failure caused by each risk factor can be established in terms of cost, and the cumulative effect of failure is determined through probability analysis. The model optimizes the cost of pipeline operations by reducing subjectivity in selecting a specific inspection method, identifying and prioritizing the right pipeline segments for inspection and maintenance, deriving budget allocations, providing guidance on deploying the right mix of labor for inspection and maintenance, planning emergency preparedness, and deriving a logical insurance plan. The proposed methodology also helps derive an inspection and maintenance policy for the entire pipeline system and suggests a design, operational philosophy, and construction methodology for new pipelines.
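The AHP step can be sketched as follows; the comparison matrix and factor names are hypothetical, not taken from the paper. Priority weights come from the principal eigenvector of a pairwise comparison matrix, followed by a consistency check.

```python
# Hypothetical AHP sketch: priorities for pipeline risk factors from a
# pairwise comparison matrix via its principal eigenvector.
import numpy as np

# Saaty-scale comparisons of three illustrative risk factors, e.g.
# external corrosion vs. third-party damage vs. construction defects.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                      # priority vector (sums to 1)

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)          # consistency index
cr = ci / 0.58                                # Saaty random index RI = 0.58 for n = 3
print("priorities:", weights.round(3), " consistency ratio:", round(cr, 3))
```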
Abstract:
The existing method of pipeline health monitoring, which requires an entire pipeline to be inspected periodically, is both time-wasting and expensive. A risk-based model that reduces the amount of time spent on inspection is presented. This model not only reduces the cost of maintaining petroleum pipelines, but also suggests an efficient design and operation philosophy, construction methodology and logical insurance plans. The risk-based model uses the Analytic Hierarchy Process (AHP), a multiple-attribute decision-making technique, to identify the factors that influence failure on specific segments and analyzes their effects by determining the probability of occurrence of the risk factors. The severity of failure is determined through consequence analysis. From this, the effect of a failure caused by each risk factor can be established in terms of cost, and the cumulative effect of failure is determined through probability analysis. The technique does not totally eliminate subjectivity, but it is an improvement over the existing inspection method.
Abstract:
In this paper, the exchange rate forecasting performance of neural network models is evaluated against the random walk and a range of time series models. No guidelines are available for choosing the parameters of neural network models, so the parameters are chosen according to what the researcher considers to be best. Such an approach, however, implies that the risk of making bad decisions is extremely high, which could explain why in many studies neural network models do not consistently perform better than their time series counterparts. In this paper, through extensive experimentation, the level of subjectivity in building neural network models is considerably reduced, giving them a better chance of performing well. Our results show that, in general, neural network models perform better than traditionally used time series models in forecasting exchange rates.
Abstract:
The introduction situates the ‘hard problem’ in its historical context and argues that the problem has two sides: the output side (the Kant-Eccles problem of the freedom of the Will) and the input side (the problem of qualia). The output side ultimately reduces to whether quantum mechanics can affect the operation of synapses. A discussion of the detailed molecular biology of synaptic transmission as presently understood suggests that such effects are unlikely. Instead, an evolutionary argument is presented which suggests that our conviction of free agency is an evolutionarily induced illusion and hence that the Kant-Eccles problem is itself illusory. This conclusion is supported by well-known neurophysiology. The input side, the problem of qualia, of subjectivity, is not so easily outflanked. After a brief review of the neurophysiological correlates of consciousness (NCC) and of the Penrose-Hameroff microtubular neuroquantology, it is again concluded that the molecular neurobiology makes quantum wave mechanics an unlikely explanation. Instead, recourse is made to an evolutionarily- and neurobiologically-informed panpsychism. The notion of an ‘emergent’ property is carefully distinguished from that of the more usual ‘system’ property used by most dual-aspect theorists (and the majority of neuroscientists), and is used to support Llinas’ concept of an ‘oneiric’ consciousness continuously modified by sensory input. I conclude that a panpsychist theory such as this, coupled with the non-classical understanding of matter flowing from quantum physics (both epistemological and scientific), may be the default and only solution to the problem posed by the presence of mind in a world of things.
Abstract:
The central theses of Kant's critical philosophy are sometimes said to have been overtaken by evolutionary biology. This paper considers how far this proposition can be sustained. I argue that the ‘architectonic’ or ‘system-building’ character of the mind, the categories and the forms of intuition, can indeed be seen as the outcome of a particular evolutionary lineage in a Darwinian world. I argue, further, that the principal motive energizing the critical philosophy is the ‘nightmare’ of physical determinism. An alternative escape route from this particular nightmare is rehearsed. If this route is taken, the intricate arguments of the Critiques are unnecessary to save moral action in a world of things. Nonetheless, insofar as ‘first philosophy’ necessarily starts from within the philosopher's own subjectivity, Kant's work retains its power. I suggest that the Kantian and the Darwinian interpretations are to an extent complementary. If this is so, some form of evolutionarily-informed dual-aspect psychoneural identity theory could combine the essence of the two interpretations.
Abstract:
The primary aim of this thesis was to investigate the in vivo ocular morphological and contractile changes occurring within the accommodative apparatus prior to the onset of presbyopia, with particular reference to ciliary muscle changes with age and the origin of a myopic shift in refraction during incipient presbyopia. Commissioned semi-automated software proved capable of extracting accurate and repeatable measurements from crystalline lens and ciliary muscle Anterior Segment Optical Coherence Tomography (AS-OCT) images, and reduced the subjectivity of AS-OCT image analysis. AS-OCT was utilised to document longitudinal changes in ciliary muscle morphology within an incipient presbyopic population (n=51). A significant antero-inwards shift of ciliary muscle mass was observed after 2.5 years. Furthermore, in a subgroup study (n=20), an accommodative antero-inwards movement of ciliary muscle mass was evident. After 2.5 years, the centripetal response of the ciliary muscle during accommodation was significantly attenuated, whereas the antero-posterior mobility of the ciliary muscle remained invariant. Additionally, longitudinal measurement of ocular biometry revealed a significant increase in crystalline lens thickness and a corresponding decrease in anterior chamber depth after 2.5 years (n=51). Lenticular changes appear to be a determinant of changes in refraction during incipient presbyopia. During accommodation, a significant increase in crystalline lens thickness and axial length was observed, whereas anterior chamber depth decreased (n=20). The change in ocular biometry per dioptre of accommodation exerted remained invariant after 2.5 years. Cross-sectional ocular biometric data were collected to quantify accommodative axial length changes from early adulthood to advanced presbyopia (n=72). Accommodative axial length elongation was significantly attenuated during presbyopia, consistent with a significant increase in ocular rigidity during presbyopia. The studies presented in this thesis support the Helmholtz theory of accommodation and, despite the reduction in centripetal ciliary muscle contractile response with age, primarily implicate lenticular changes in the development of presbyopia.
Abstract:
Intersubjectivity is an important concept in psychology and sociology. It refers to sharing conceptualizations through social interactions in a community and using such shared conceptualizations as a resource to interpret things that happen in everyday life. In this work, we make use of intersubjectivity as the basis to model shared stance and subjectivity for sentiment analysis. We construct an intersubjectivity network which links review writers, the terms they used, and the polarities of those terms. Based on this network model, we propose a method to learn writer embeddings, which are subsequently incorporated into a convolutional neural network for sentiment analysis. Evaluations on the IMDB, Yelp 2013 and Yelp 2014 datasets show that the proposed approach achieves state-of-the-art performance.
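A minimal sketch of the general idea follows; the architecture details are assumptions rather than the paper's exact model. A text CNN's pooled sentence features are concatenated with a learned per-writer embedding before classification.

```python
# Illustrative sketch: text CNN whose sentence features are concatenated with
# a per-writer embedding before the sentiment classifier.
import torch
import torch.nn as nn

class WriterAwareTextCNN(nn.Module):
    def __init__(self, vocab_size, n_writers, emb_dim=50, writer_dim=16,
                 n_filters=32, kernel_size=3, n_classes=2):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, emb_dim)
        # Writer embeddings; in the spirit of the paper these could be
        # initialised from embeddings learned on the intersubjectivity network.
        self.writer_emb = nn.Embedding(n_writers, writer_dim)
        self.conv = nn.Conv1d(emb_dim, n_filters, kernel_size, padding=1)
        self.fc = nn.Linear(n_filters + writer_dim, n_classes)

    def forward(self, tokens, writer_ids):
        x = self.word_emb(tokens).transpose(1, 2)       # (B, emb_dim, T)
        x = torch.relu(self.conv(x)).max(dim=2).values  # max-pool over time
        w = self.writer_emb(writer_ids)                 # (B, writer_dim)
        return self.fc(torch.cat([x, w], dim=1))        # class logits

# Toy forward pass: batch of 4 reviews, 20 tokens each, 10 known writers
model = WriterAwareTextCNN(vocab_size=500, n_writers=10)
tokens = torch.randint(0, 500, (4, 20))
writers = torch.randint(0, 10, (4,))
print(model(tokens, writers).shape)    # torch.Size([4, 2])
```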