46 results for Adaptive object model
in Aston University Research Archive
Abstract:
Information systems have developed to the stage that there is plenty of data available in most organisations, but there are still major problems in turning that data into information for management decision making. This thesis argues that the link between decision support information and transaction processing data should be through a common object model which reflects the real world of the organisation and encompasses the artefacts of the information system. The CORD (Collections, Objects, Roles and Domains) model is developed, which is richer in appropriate modelling abstractions than current object models. A flexible Object Prototyping tool based on a Semantic Data Storage Manager has been developed which enables a variety of models to be stored and experimented with. A statistical summary table model, COST (Collections of Objects Statistical Table), has been developed within CORD and is shown to be adequate to meet the modelling needs of Decision Support and Executive Information Systems. The COST model is supported by COSTed, a statistical table creator and editor which is also built on top of the Object Prototyper and uses the CORD model to manage its metadata.
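The thesis itself publishes no code here, but the four CORD abstractions can be pictured as a data structure. The Python sketch below is purely illustrative; every class and attribute name is hypothetical, and the summarise method only hints at how a COST-style statistical table might sit on top of a Collection.

```python
# A purely illustrative sketch of the four CORD abstractions; none of
# these names come from the thesis.

from dataclasses import dataclass, field

@dataclass
class Domain:                      # the set of permissible values for an attribute
    name: str
    values: set

@dataclass
class BusinessObject:              # a real-world entity of the organisation
    name: str
    attributes: dict               # attribute name -> value drawn from a Domain

@dataclass
class Role:                        # a part an object plays in some context
    name: str
    player: BusinessObject

@dataclass
class Collection:                  # a managed group of objects
    name: str
    members: list = field(default_factory=list)

    def summarise(self, attribute):
        """A COST-style statistical summary over one attribute."""
        values = [m.attributes[attribute] for m in self.members]
        return {"count": len(values), "total": sum(values)}

orders = Collection("orders", [BusinessObject("o1", {"value": 120.0}),
                               BusinessObject("o2", {"value": 80.0})])
print(orders.summarise("value"))   # {'count': 2, 'total': 200.0}
```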
Abstract:
When object databases arrived on the scene some ten years ago, they provided database capabilities for previously neglected, complex applications, such as CAD, but were burdened with one inherent teething problem: poor performance. Physical database design is one tool that can provide performance improvements, and it is the general area of concern for this thesis. Clustering is one fruitful design technique which can provide improvements in performance. However, clustering in object databases has not been explored in depth and so has not been truly exploited. Further, clustering, although a physical concern, can be determined from the logical model. The object model is richer than previous models, notably the relational model, and so it is anticipated that the opportunities with respect to clustering are greater. This thesis provides a thorough analysis of object clustering strategies with a view to highlighting any links between the object logical and physical model and improving performance. This is achieved by considering all possible types of object logical model construct and the implementation of those constructs in terms of theoretical clustering strategies to produce actual clustering arrangements. This analysis results in a greater understanding of object clustering strategies, aiding designers in the development process and providing some valuable rules of thumb to support the design process.
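The thesis is analytical rather than implementation-oriented, but its central idea, that a logical-model construct such as aggregation can drive physical placement, can be illustrated with a small hypothetical sketch in which an object and its aggregated parts are laid out contiguously so a traversal touches fewer pages:

```python
# Hypothetical illustration of one clustering strategy: store an object
# and the objects it aggregates contiguously, so that a logical-model
# relationship (aggregation) drives physical placement.

PAGE_SIZE = 4          # objects per page, artificially small

def cluster_by_aggregation(roots):
    """Depth-first placement: each object is followed by its parts."""
    placement = []
    def visit(obj):
        placement.append(obj["id"])
        for part in obj.get("parts", []):
            visit(part)
    for root in roots:
        visit(root)
    # carve the linear placement into fixed-size pages
    return [placement[i:i + PAGE_SIZE]
            for i in range(0, len(placement), PAGE_SIZE)]

car = {"id": "car1", "parts": [{"id": "engine1"}, {"id": "wheel1"},
                               {"id": "wheel2"}]}
print(cluster_by_aggregation([car]))
# [['car1', 'engine1', 'wheel1', 'wheel2']] -- one page, one disk read
```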
Abstract:
This paper presents some forecasting techniques for energy demand and price prediction, one day ahead. These techniques combine the wavelet transform (WT) with fixed and adaptive machine learning/time series models (multi-layer perceptron (MLP), radial basis functions, linear regression, or GARCH). To create an adaptive model, we use an extended Kalman filter or particle filter to update the parameters continuously on the test set. The adaptive GARCH model is a new contribution, broadening the applicability of GARCH methods. We empirically compared two approaches to combining the WT with prediction models: multicomponent forecasts and direct forecasts. These techniques are applied to large sets of real data (both stationary and non-stationary) from the UK energy markets, so as to provide comparative results that are statistically stronger than those previously reported. The results showed that the forecasting accuracy is significantly improved by using the WT and adaptive models. The best models for the electricity demand and gas price forecasts are the adaptive MLP and the adaptive GARCH, respectively, with the multicomponent forecast; their MSEs are 0.02314 and 0.15384, respectively.
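As a rough sketch of the multicomponent approach (not the authors' code), the series can be split into additive wavelet components, each component forecast separately, and the component forecasts summed. The sketch below assumes the PyWavelets library and substitutes a deliberately trivial AR(1) fit where the paper uses MLP or GARCH component models:

```python
# Sketch of a multicomponent forecast: split the series into additive
# wavelet components, forecast each component, and recombine.
# A trivial AR(1) fit stands in for the paper's MLP/GARCH models.

import numpy as np
import pywt  # PyWavelets

def mra_components(series, wavelet="db4", level=3):
    """Split a series into additive components, one per wavelet band."""
    coeffs = pywt.wavedec(series, wavelet, level=level)
    components = []
    for i in range(len(coeffs)):
        isolated = [c if j == i else np.zeros_like(c)
                    for j, c in enumerate(coeffs)]
        components.append(pywt.waverec(isolated, wavelet)[:len(series)])
    return components  # the components sum back to (approximately) the series

def forecast_next(series):
    prediction = 0.0
    for comp in mra_components(series):
        a, b = np.polyfit(comp[:-1], comp[1:], 1)  # x[t] ~ a*x[t-1] + b
        prediction += a * comp[-1] + b             # one-step extrapolation
    return prediction

demand = np.sin(np.linspace(0, 20, 256)) + 0.1 * np.random.randn(256)
print(forecast_next(demand))
```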
Abstract:
The proliferation of data throughout the strategic, tactical and operational areas within many organisations has created a need for the decision maker to be presented with structured information that is appropriate for achieving allocated tasks. However, despite this abundance of data, managers at all levels in the organisation commonly encounter a condition of ‘information overload’, which results in a paucity of the correct information. Specifically, this thesis focuses upon the tactical domain within the organisation and the information needs of the management who reside at this level. In doing so, it argues that the link between decision making at the tactical level in the organisation and low-level transaction processing data should be through a common object model that uses a framework based upon knowledge leveraged from co-ordination theory. To achieve this, the Co-ordinated Business Object Model (CBOM) was created. The CBOM details a two-tier framework: the first tier models data using four interacting object models, namely processes, activities, resources and actors; the second tier analyses the data captured by the four object models and returns information that can be used to support tactical decision making. In addition, the Co-ordinated Business Object Support System (CBOSS) is a prototype tool that has been developed both to support the CBOM implementation and to demonstrate the functionality of the CBOM as a modelling approach for supporting tactical management decision making. Through a graphical user interface, the system allows the user to create and explore alternative implementations of an identified tactical-level process. To validate the CBOM, three verification tests have been completed. The results provide evidence that the CBOM framework helps bridge the gap between low-level transaction data and the information that is used to support tactical-level decision making.
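The CBOM's two-tier structure can be pictured with a minimal, hypothetical sketch: tier one captures transaction data in the four interacting object models, and tier two derives tactical information from them. All names below are illustrative, not taken from the thesis:

```python
# Hypothetical sketch of the CBOM's two tiers.  Tier one: transaction
# data held in four object models (processes, activities, resources,
# actors).  Tier two: analysis over those objects.

from dataclasses import dataclass, field

@dataclass
class Actor:
    name: str

@dataclass
class Resource:
    name: str
    capacity_h: float              # available hours

@dataclass
class Activity:
    name: str
    actor: Actor
    resource: Resource
    duration_h: float

@dataclass
class Process:
    name: str
    activities: list = field(default_factory=list)

# --- tier two: derive tactical-level information -------------------
def resource_load(process):
    """Summarise hours booked per resource."""
    load = {}
    for act in process.activities:
        load[act.resource.name] = load.get(act.resource.name, 0.0) \
                                  + act.duration_h
    return load

bob = Actor("Bob")
lathe = Resource("lathe", capacity_h=40.0)
p = Process("machining", [Activity("turn", bob, lathe, 3.0),
                          Activity("finish", bob, lathe, 1.5)])
print(resource_load(p))            # {'lathe': 4.5}
```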
Abstract:
This paper presents a forecasting technique for forward electricity/gas prices, one day ahead. This technique combines a Kalman filter (KF) and a generalised autoregressive conditional heteroskedasticity (GARCH) model (often used in financial forecasting). The GARCH model is used to compute the next value of a time series, and the KF updates the parameters of the GARCH model as each new observation becomes available. This technique is applied to real data from the UK energy markets to evaluate its performance. The results show that the forecasting accuracy is improved significantly by using this hybrid model. The methodology can also be applied to forecasting market clearing prices and electricity/gas loads.
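A deliberately simplified reading of the hybrid can be sketched as follows. The GARCH(1,1) recursion sigma2[t] = omega + alpha*r[t-1]^2 + beta*sigma2[t-1] is linear in its parameters, so a Kalman filter with a random-walk state can re-estimate (omega, alpha, beta) as each observation arrives, using the squared return as a noisy proxy for the variance. This is an illustration of the idea, not the authors' implementation:

```python
# Simplified KF+GARCH sketch: the GARCH(1,1) one-step variance forecast
# is linear in (omega, alpha, beta), so a Kalman filter with a
# random-walk state re-estimates the parameters online.

import numpy as np

def kf_garch(returns, q=1e-6, r_noise=1e-2):
    theta = np.array([0.01, 0.1, 0.8])     # initial (omega, alpha, beta)
    P = np.eye(3) * 0.1                    # parameter covariance
    sigma2 = float(np.var(returns[:10]))   # initial variance estimate
    forecasts = []
    for t in range(1, len(returns)):
        h = np.array([1.0, returns[t-1]**2, sigma2])  # regressors
        sigma2 = float(h @ theta)                     # one-step forecast
        forecasts.append(sigma2)
        # Kalman update with observation y = r[t]^2
        P += np.eye(3) * q                 # random-walk state growth
        s = float(h @ P @ h) + r_noise     # innovation variance
        k = (P @ h) / s                    # Kalman gain
        theta += k * (returns[t]**2 - sigma2)
        P -= np.outer(k, h @ P)
        # note: a production model would constrain theta so sigma2 > 0
    return np.array(forecasts)

prices_returns = np.random.randn(500) * 0.01
print(kf_garch(prices_returns)[-5:])       # last five variance forecasts
```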
Abstract:
In this article, we describe and model the language classroom as a complex adaptive system (see Logan & Schumann, 2005). We argue that linear, categorical descriptions of classroom processes and interactions do not sufficiently explain the complex nature of classrooms, and cannot account for how classroom change occurs (or does not occur) over time. A relational model of classrooms is proposed which focuses on the relations between different elements (physical, environmental, cognitive, social) in the classroom and on how their interaction is crucial in understanding and describing classroom action.
Abstract:
This first edition of the workshop Model-driven Software Adaptation (M-ADAPT'07) took place at the Technische Universität Berlin, co-located with the International Conference ECOOP'07, in the beautiful and buzzing city of Berlin on the 30th of July, 2007. The workshop was organized by Gordon Blair, Nelly Bencomo, and Robert France. Participants explored how to develop appropriate model-driven approaches to model, analyze, and validate the volatile properties of the behaviour of adaptive systems and their environments. This report gives an overview of the presentations as well as an account of the fruitful discussions that took place at M-ADAPT'07. © 2008 Springer-Verlag Berlin Heidelberg.
Abstract:
Engineering adaptive software is an increasingly complex task. Here, we demonstrate Genie, a tool that supports the modelling, generation, and operation of highly reconfigurable, component-based systems. We showcase how Genie is used in two case studies: i) the development and operation of an adaptive flood-warning system, and ii) a service discovery application. In this context, adaptation is enabled by the Gridkit reflective middleware platform.
Abstract:
A new LIBS (laser-induced breakdown spectroscopy) quantitative analysis method, based on adaptive selection of analytical lines and a Relevance Vector Machine (RVM) regression model, is proposed. First, a scheme for adaptively selecting analytical lines is put forward in order to overcome the drawback of high dependency on a priori knowledge. The candidate analytical lines are automatically selected based on the built-in characteristics of the spectral lines, such as spectral intensity, wavelength and full width at half maximum. The analytical lines which will be used as input variables of the regression model are determined adaptively according to the samples used for both training and testing. Second, an LIBS quantitative analysis method based on RVM is presented. The intensities of the analytical lines and the elemental concentrations of certified standard samples are used to train the RVM regression model. The predicted elemental concentrations are given in the form of confidence intervals over a probabilistic distribution, which is helpful for evaluating the uncertainty contained in the measured spectra. Chromium concentration analysis experiments on 23 certified standard high-alloy steel samples have been carried out. The multiple correlation coefficient of the prediction was up to 98.85%, and the average relative error of the prediction was 4.01%. The experimental results showed that the proposed LIBS quantitative analysis method achieved better prediction accuracy and better modelling robustness than methods based on partial least squares regression, artificial neural networks and the standard support vector machine.
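A rough sketch of the pipeline follows: select candidate lines from the spectra, then fit a sparse Bayesian regression from line intensities to concentration. Since scikit-learn ships no RVM, ARDRegression (which uses the same sparse automatic-relevance prior) stands in for it here, and a simple intensity threshold stands in for the paper's adaptive line-selection scheme; the data are synthetic:

```python
# Hypothetical sketch: threshold-based line selection (a stand-in for
# the paper's adaptive scheme) followed by sparse Bayesian regression.
# ARDRegression substitutes for a full RVM implementation.

import numpy as np
from sklearn.linear_model import ARDRegression

def select_lines(spectra, min_mean_intensity=0.1):
    """Keep wavelength channels whose mean intensity is informative."""
    return spectra.mean(axis=0) > min_mean_intensity

rng = np.random.default_rng(0)
spectra = rng.random((23, 200))          # 23 samples x 200 channels
conc = spectra[:, 17] * 3.0 + rng.normal(0, 0.05, 23)  # synthetic truth

mask = select_lines(spectra)
model = ARDRegression().fit(spectra[:, mask], conc)

# predictions come with a standard deviation, i.e. an uncertainty
# estimate comparable in spirit to the paper's probabilistic intervals
pred, std = model.predict(spectra[:, mask], return_std=True)
print(pred[:3], std[:3])
```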
Abstract:
Most object-based approaches to Geographical Information Systems (GIS) have concentrated on the representation of geometric properties of objects in terms of fixed geometry. In our road traffic marking application domain we have a requirement not only to represent the static locations of the road markings but also to enforce the associated regulations, which are typically geometric in nature. For example, a give-way line of a pedestrian crossing in the UK must be within 1100-3000 mm of the edge of the crossing pattern. In previous studies of the application of spatial rules (often called 'business logic') in GIS, emphasis has been placed on the representation of topological constraints and data integrity checks. There is very little GIS literature that describes models for geometric rules, although there are some examples in the Computer Aided Design (CAD) literature. This paper introduces some of the ideas from so-called variational CAD models to the GIS application domain, and extends these using a Geography Markup Language (GML) based representation. In our application we have an additional requirement: the geometric rules change often and vary from country to country, so they should be represented in a flexible manner. In this paper we describe an elegant solution to the representation of geometric rules, such as requiring lines to be offset from other objects. The method uses the feature-property model embraced in GML 3.1 and extends the possible relationships in feature collections to permit the application of parameterized geometric constraints to sub-features. We show the parametric rule model we have developed and discuss the advantage of using simple parametric expressions in the rule base. We discuss the possibilities and limitations of our approach and relate our data model to GML 3.1. © 2006 Springer-Verlag Berlin Heidelberg.
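The flavour of such a parameterised rule can be sketched in a few lines: the rule itself is data (feature types plus minimum and maximum offsets), so it can vary by country without code changes. The sketch below assumes the shapely library for geometry and uses entirely hypothetical feature names; it is not the paper's GML representation:

```python
# Illustrative sketch of a parameterised offset rule.  The rule is
# plain data, so a different jurisdiction only needs different
# parameters, not different code.

from shapely.geometry import LineString

# e.g. a UK give-way line must sit 1100-3000 mm from the crossing edge
RULES = [{"subject": "give_way_line", "reference": "crossing_edge",
          "min_mm": 1100, "max_mm": 3000}]

def check(features, rules=RULES):
    violations = []
    for rule in rules:
        d = features[rule["subject"]].distance(features[rule["reference"]])
        if not rule["min_mm"] <= d <= rule["max_mm"]:
            violations.append((rule["subject"], d))
    return violations

features = {"give_way_line": LineString([(0, 0), (5000, 0)]),
            "crossing_edge": LineString([(0, 2000), (5000, 2000)])}
print(check(features))   # [] -- the 2000 mm offset satisfies the rule
```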
Abstract:
The development of strategy remains a debate for academics and a concern for practitioners. Published research has focused on producing models for strategy development and on studying how strategy is developed in organisations. The Operational Research literature has highlighted the importance of considering complexity within strategic decision making, but little has been done to link strategy development with complexity theories, despite organisations and organisational environments becoming increasingly complex. We review the dominant streams of strategy development and complexity theories. Our theoretical investigation results in the first conceptual framework which links an established Strategic Operational Research model, the Strategy Development Process model, with complexity via Complex Adaptive Systems theory. We present preliminary findings from the use of this conceptual framework applied to a longitudinal, in-depth case study to demonstrate the advantages of using this integrated conceptual model. Our research shows that the proposed conceptual model provides rich data and allows for a more holistic examination of the strategy development process. © 2012 Operational Research Society Ltd. All rights reserved.
Abstract:
The kinematic mapping of a rigid open-link manipulator is a homomorphism between Lie groups. The homomorphism has solution groups that act on an inverse kinematic solution element. A canonical representation of solution group operators that act on a solution element of three and seven degree-of-freedom (dof) dextrous manipulators is determined by geometric analysis. Seven canonical solution groups are determined for the seven-dof Robotics Research K-1207 and Hollerbach arms. The solution element of a dextrous manipulator is a collection of trivial fibre bundles with solution fibres homotopic to the torus. If fibre solutions are parameterised by a scalar, a direct inverse function that maps the scalar and Cartesian base space coordinates to solution element fibre coordinates may be defined. A direct inverse parameterisation of a solution element may be approximated by a local linear map generated by an inverse augmented Jacobian correction of a linear interpolation. The action of canonical solution group operators on a local linear approximation of the solution element of the inverse kinematics of dextrous manipulators generates cyclical solutions. The solution representation is proposed as a model of inverse kinematic transformations in primate nervous systems. Simultaneous calibration of a composition of stereo-camera and manipulator kinematic models is under-determined by equi-output parameter groups in the composition of stereo-camera and Denavit-Hartenberg (DH) models. An error measure for simultaneous calibration of a composition of models is derived, and parameter subsets with no equi-output groups are determined by numerical experiments to simultaneously calibrate the composition of homogeneous or pan-tilt stereo-cameras with DH models. To accelerate exact Newton second-order re-calibration of DH parameters after a sequential calibration of stereo-camera and DH parameters, an optimal numerical evaluation of DH matrix first-order and second-order error derivatives with respect to a re-calibration error function is derived, implemented and tested. A distributed object environment for point-and-click image-based tele-command of manipulators and stereo-cameras is specified and implemented that supports rapid prototyping of numerical experiments in distributed system control. The environment is validated by a hierarchical k-fold cross-validated calibration to Cartesian space of a radial basis function regression correction of an affine stereo model. Basic design and performance requirements are defined for scalable virtual micro-kernels that broker inter-Java-virtual-machine remote method invocations between components of secure, manageable, fault-tolerant, open, distributed, agile, Total Quality Managed, ISO 9000+ conformant, Just-in-Time manufacturing systems.
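To make the "local linear map generated by an inverse ... Jacobian correction" concrete, here is a minimal sketch for a planar two-link arm: the Jacobian pseudo-inverse maps a small Cartesian error back to joint corrections, and iterating this local linear step converges on an inverse kinematic solution. This illustrates the general Jacobian-corrected idea only, not the thesis's specific augmented-Jacobian construction:

```python
# Minimal local-linear inverse-kinematics step for a planar 2-link arm.

import numpy as np

L1, L2 = 1.0, 1.0                      # link lengths

def fk(q):
    """Forward kinematics: joint angles -> end-effector position."""
    return np.array([L1*np.cos(q[0]) + L2*np.cos(q[0]+q[1]),
                     L1*np.sin(q[0]) + L2*np.sin(q[0]+q[1])])

def jacobian(q):
    s1, s12 = np.sin(q[0]), np.sin(q[0]+q[1])
    c1, c12 = np.cos(q[0]), np.cos(q[0]+q[1])
    return np.array([[-L1*s1 - L2*s12, -L2*s12],
                     [ L1*c1 + L2*c12,  L2*c12]])

def ik_step(q, target):
    """One Newton-style correction toward the target position."""
    error = target - fk(q)
    return q + np.linalg.pinv(jacobian(q)) @ error

q = np.array([0.3, 0.6])
target = np.array([1.2, 0.8])          # reachable: |target| < L1 + L2
for _ in range(20):                    # iterate the local linear map
    q = ik_step(q, target)
print(fk(q))                           # ~ [1.2, 0.8]
```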
Abstract:
The two areas of theory upon which this research was based were 'strategy development process' (SDP) and 'complex adaptive systems' (CAS), the latter as part of complexity theory focused on human social organisations. The literature reviewed showed that there is a paucity of empirical work and theory in the overlap of the two areas, providing an opportunity for contributions to knowledge in each area of theory, and for practitioners. An inductive approach was adopted for this research, in an effort to discover new insights into the focus area of study. It was undertaken from within an interpretivist paradigm, and based on a novel conceptual framework. The organisationally intimate nature of the research topic, and the researcher's circumstances, required a research design that was both in-depth and long-term. The result was a single, exploratory case study, which drew on data from 44 in-depth, semi-structured interviews with 36 people, involving all the top management team members and significant other staff members; observations, rumour and grapevine (ORG) data; and archive data, over a 5½-year period (2005-2010). Findings confirm the validity of the conceptual framework, and that complex adaptive systems theory has the potential to extend strategy development process theory. The research has shown how and why the strategy process developed in the case study organisation by providing deeper insights into the behaviour of the people, their backgrounds, and their interactions. Broad predictions of the 'latent strategy development' process and some elements of the strategy content are also possible. Based on this research, it is possible to extend the utility of the SDP model by including people's behavioural characteristics within the organisation, via complex adaptive systems theory. Further research is recommended to test the limits of the application of the conceptual framework and to improve its efficacy with more organisations across a variety of sectors.
Abstract:
In a series of experiments, we tested category-specific activation in normal participants using magnetoencephalography (MEG). Our experiments explored the temporal processing of objects, as MEG characterises neural activity on the order of milliseconds. They examined object processing, including the time-course of object naming, and both early and late differences in processing living compared with nonliving objects and in processing objects at the basic compared with the domain level. In addition to studies using normal participants, we also utilised MEG to explore category-specific processing in a patient with a deficit for living objects. Our findings support the cascade model of object naming (Humphreys et al., 1988). In addition, our findings using normal participants demonstrate early, category-specific perceptual differences. These findings are corroborated by our patient study. In our assessment of the time-course of category-specific effects, as well as in a separate analysis designed to measure semantic differences between living and nonliving objects, we found support for the sensory/motor model of object naming (Martin, 1998), in addition to support for the cascade model. Thus, object processing in normal participants appears to be served by a distributed network in the brain, and there are both perceptual and semantic differences between living and nonliving objects. A separate study assessing the influence of the level at which participants are asked to identify an object on processing in the brain found evidence supporting the convergence zone hypothesis (Damasio, 1989). Taken together, these findings indicate the utility of MEG in exploring the time-course of object processing, isolating early perceptual and later semantic effects within the brain.