524 results for implements
Abstract:
The data available during the drug discovery process is vast in amount and diverse in nature. To gain useful information from such data, an effective visualisation tool is required. To provide better visualisation facilities to domain experts (screening scientists, biologists, chemists, etc.), we developed a software tool based on recently developed principled visualisation algorithms such as Generative Topographic Mapping (GTM) and Hierarchical Generative Topographic Mapping (HGTM). The software also supports conventional visualisation techniques such as Principal Component Analysis (PCA), NeuroScale, PhiVis, and Locally Linear Embedding (LLE), and provides global and local regression facilities through algorithms such as the Multilayer Perceptron (MLP), Radial Basis Function network (RBF), Generalised Linear Models (GLM), Mixture of Experts (MoE), and the newly developed Guided Mixture of Experts (GME). This user manual gives an overview of the purpose of the software tool, highlights some of the issues to be taken care of when creating a new model, and provides information about how to install and use the tool. The manual does not require readers to be familiar with the algorithms the software implements; basic computing skills are enough to operate it.
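For readers unfamiliar with these projection methods, the following is a minimal sketch (not code from the tool itself) of the conventional techniques it supports, PCA and LLE, applied to a hypothetical matrix of compound descriptors using scikit-learn:

```python
# Minimal sketch of the conventional projections mentioned above (PCA, LLE),
# using scikit-learn on a hypothetical matrix of compound descriptors.
# Illustration only, not code from the tool itself.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import LocallyLinearEmbedding

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))            # 500 compounds, 20 descriptors (toy data)

pca_coords = PCA(n_components=2).fit_transform(X)        # linear projection
lle_coords = LocallyLinearEmbedding(
    n_neighbors=10, n_components=2).fit_transform(X)     # nonlinear projection

print(pca_coords.shape, lle_coords.shape)                # (500, 2) (500, 2)
```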
Abstract:
Today, the data available to tackle many scientific challenges is vast in quantity and diverse in nature. The exploration of heterogeneous information spaces requires suitable mining algorithms as well as effective visual interfaces. miniDVMS v1.8 provides a flexible visual data mining framework which combines advanced projection algorithms developed in the machine learning domain with visual techniques developed in the information visualisation domain. The advantage of this interface is that the user is directly involved in the data mining process. Principled projection methods, such as generative topographic mapping (GTM) and hierarchical GTM (HGTM), are integrated with powerful visual techniques, such as magnification factors, directional curvatures, parallel coordinates, and user interaction facilities, to provide this integrated visual data mining framework. The software also supports conventional visualisation techniques such as principal component analysis (PCA), NeuroScale, and PhiVis. This user manual gives an overview of the purpose of the software tool, highlights some of the issues to be taken care of when creating a new model, and provides information about how to install and use the tool. The manual does not require readers to be familiar with the algorithms the software implements; basic computing skills are enough to operate it.
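As a rough illustration of one of the visual techniques mentioned, a parallel-coordinates view of a toy data set can be produced with pandas and matplotlib; this is a generic sketch, not part of miniDVMS:

```python
# Generic sketch of a parallel-coordinates view on toy data.
# Illustration only; miniDVMS itself is a separate tool.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from pandas.plotting import parallel_coordinates

rng = np.random.default_rng(1)
df = pd.DataFrame(rng.normal(size=(100, 4)), columns=["f1", "f2", "f3", "f4"])
df["cluster"] = rng.integers(0, 3, size=100).astype(str)   # hypothetical grouping

parallel_coordinates(df, class_column="cluster", alpha=0.4)
plt.title("Parallel-coordinates view of a toy data set")
plt.show()
```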
Abstract:
In order to reverse the reliance on lecture-based teaching, it is argued that personal reflection can be used as part of the quality assurance process. This paper proposes one response to personal reflection: reflective imagination, summarised as an action plan with six activities. It combines two conceptual issues raised in the US, the need to think creatively about learning and the reflective mindset, with one issue raised in the UK, cultivating the entrepreneurial imagination. Reflective imagination is linked to wider social science research on the place of self and reflexivity in scholarship. Finally, a personal history case study is presented which records a visit to Harvard Business School; the visit implements the six activities associated with reflective imagination. This is a method paper exploring reflective imagination.
Abstract:
Marr's work offered guidelines on how to investigate vision (the theory-algorithm-implementation distinction), as well as specific proposals on how vision is done. Many of the latter have inevitably been superseded, but the approach was inspirational and remains so. Marr saw the computational study of vision as tightly linked to psychophysics and neurophysiology, but the last twenty years have seen some weakening of that integration. Because feature detection is a key stage in early human vision, we have returned to basic questions about representation of edges at coarse and fine scales. We describe an explicit model in the spirit of the primal sketch, but tightly constrained by psychophysical data. Results from two tasks (location-marking and blur-matching) point strongly to the central role played by second-derivative operators, as proposed by Marr and Hildreth. Edge location and blur are evaluated by finding the location and scale of the Gaussian-derivative 'template' that best matches the second-derivative profile ('signature') of the edge. The system is scale-invariant, and accurately predicts blur-matching data for a wide variety of 1-D and 2-D images. By finding the best-fitting scale, it implements a form of local scale selection and circumvents the knotty problem of integrating filter outputs across scales. [Supported by BBSRC and the Wellcome Trust]
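To make the scale-selection idea concrete, here is a toy sketch (in Python with SciPy, and not the authors' psychophysically constrained model) that filters a blurred 1-D edge with Gaussian second-derivative operators over a range of scales and picks the scale giving the strongest normalised response:

```python
# Toy sketch: second-derivative filtering of a blurred 1-D step edge, with
# local scale selection by picking the scale of strongest normalised response.
# Illustrates the idea only; it is not the model described in the abstract.
import numpy as np
from scipy.ndimage import gaussian_filter1d
from scipy.special import erf

n = np.arange(400)
true_blur = 6.0                                               # edge blur, in samples
edge = 0.5 * (1 + erf((n - 200) / (true_blur * np.sqrt(2))))  # Gaussian-blurred step

scales = np.arange(1.0, 20.0, 0.25)
responses = [s * np.abs(gaussian_filter1d(edge, sigma=s, order=2)).max()
             for s in scales]                                 # scale-normalised peaks

best = scales[int(np.argmax(responses))]
print(f"best-fitting scale ~ {best:.2f} samples (true blur {true_blur})")
```

With this normalisation the response peaks when the filter scale matches the blur of the edge, which is the sense in which the best-fitting scale recovers edge blur.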
Abstract:
This thesis considers four broad areas:
(i) ANALYSIS OF THE STRESS FIELD. (a) Research studies, relevant to the British Social Services, considering the cultural setting and the rigor with which they were conducted; (b) models of stress, specifically examining the theoretical soundness and practical application of the Medical, Engineering and Transactional models; (c) organisational models of stress relating specifically to human service organisations.
(ii) QUALITATIVE AND QUANTITATIVE RESEARCH METHODOLOGIES. (a) The appropriate application of each respective methodology and the particular usefulness of qualitative research designs; (b) the relevance of understanding the language and terminology associated with the subject area prior to the implementation of survey methods.
(iii) FIELDWORK. (a) Phase 1. By use of focus groups, in-depth interviews and diary keeping amongst a small range of teams and managers, the Researcher develops a basic conceptual framework of stress within a Social Services context. In addition, a small-scale personality inventory was administered to participants. (b) Phase 2. This consisted of three key elements: 6 case studies in which the Researcher implements and appraises the impact of a range of intervention strategies designed to assist teams and their managers in dealing more effectively with stress; the administration of a large-scale survey to all the field social work teams within the Social Services Department; and an analysis of the user role within the stress process by way of two focus groups.
(iv) THEORETICAL DEVELOPMENT.
Abstract:
This thesis introduces and develops a novel real-time predictive maintenance system to estimate machine system parameters from the motion current signature. Recently, motion current signature analysis has been investigated as an alternative to the use of sensors for monitoring internal faults of a motor. A maintenance system based upon the analysis of the motion current signature avoids the need to install and maintain expensive motion sensing technology. By developing nonlinear dynamical analysis for the motion current signature, the research described in this thesis implements a novel real-time predictive maintenance system for current and future manufacturing machine systems. A crucial concept underpinning this project is that the motion current signature contains information relating to the machine system parameters and that this information can be extracted using nonlinear mapping techniques, such as neural networks. Towards this end, a proof-of-concept procedure is performed, which substantiates this concept. A simulation model, TuneLearn, is developed to generate the large amount of training data required by the neural network approach. Statistical validation and verification of the model are performed to establish confidence in the simulated motion current signature. The validation experiment concludes that, although the simulation model generates a good macro-dynamical mapping of the motion current signature, it fails to accurately map the micro-dynamical structure, owing to a lack of knowledge about the behaviour of higher-order and nonlinear factors such as backlash and compliance. The failure of the simulation model to capture the micro-dynamical structure suggests the presence of nonlinearity in the motion current signature, which motivated surrogate data testing for nonlinearity. The results confirm the presence of nonlinearity in the motion current signature, thereby motivating the use of nonlinear techniques for further analysis. The outcomes of the experiments show that nonlinear noise reduction combined with the linear reverse algorithm offers precise machine system parameter estimation from the motion current signature for the implementation of the real-time predictive maintenance system. Finally, a linear reverse algorithm, BJEST, is developed and applied to the motion current signature to estimate the machine system parameters.
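The core idea, that a nonlinear mapping such as a neural network can recover machine parameters from the current signature, can be sketched generically as follows (toy synthetic data and scikit-learn; this is not the thesis's TuneLearn or BJEST code):

```python
# Generic sketch of the nonlinear-mapping idea: a small neural network learns
# to map features of a simulated current signature to a machine parameter.
# Toy data only; not the TuneLearn or BJEST code described in the abstract.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
params = rng.uniform(0.5, 2.0, size=2000)             # hypothetical machine parameter
t = np.linspace(0, 1, 64)
signatures = np.sin(2 * np.pi * np.outer(params, t)) \
    + 0.05 * rng.normal(size=(2000, 64))              # simulated current signatures

X_tr, X_te, y_tr, y_te = train_test_split(signatures, params, random_state=0)
net = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
net.fit(X_tr, y_tr)
print("R^2 on held-out signatures:", round(net.score(X_te, y_te), 3))
```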
Abstract:
We report statistical time-series analysis tools providing improvements in the rapid, precise extraction of discrete-state dynamics from time traces of experimental observations of molecular machines. By building physical knowledge and statistical innovations into the analysis tools, we provide techniques for estimating discrete state transitions buried in highly correlated molecular noise. We demonstrate the effectiveness of our approach on simulated and real examples of step-like rotation of the bacterial flagellar motor and the F1-ATPase enzyme. We show that our method can clearly identify molecular steps, periodicities and cascaded processes that are too weak for existing algorithms to detect, and can do so much faster than existing algorithms. Our techniques represent a step toward automated analysis of high-sample-rate molecular-machine dynamics. Modular, open-source software that implements these techniques is provided.
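As a minimal illustration of step detection in a noisy trace (far simpler than the published method, which handles correlated noise and many steps), one can scan for the change point that best splits a record into two constant levels:

```python
# Toy sketch of finding a step-like transition in a noisy time trace by scanning
# for the split that minimises the within-segment sum of squares.
# Illustration only; not the algorithm described in the abstract.
import numpy as np

rng = np.random.default_rng(3)
trace = np.concatenate([np.full(500, 0.0), np.full(500, 1.0)]) \
    + 0.4 * rng.normal(size=1000)                     # one hidden step plus noise

def best_single_step(y):
    """Return the split index minimising the within-segment sum of squares."""
    n = len(y)
    costs = np.full(n, np.inf)
    for k in range(10, n - 10):                       # avoid tiny segments
        left, right = y[:k], y[k:]
        costs[k] = ((left - left.mean())**2).sum() + ((right - right.mean())**2).sum()
    return int(np.argmin(costs))

print("estimated step location:", best_single_step(trace))   # close to 500
```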
Abstract:
This thesis provides a set of tools for managing uncertainty in Web-based models and workflows. To support the use of these tools, the thesis first provides a framework for exposing models through Web services. An introduction to uncertainty management, Web service interfaces, and workflow standards and technologies is given, with a particular focus on the geospatial domain. An existing specification for exposing geospatial models and processes, the Web Processing Service (WPS), is critically reviewed. A processing service framework is presented as a solution to usability issues with the WPS standard. The framework implements support for the Simple Object Access Protocol (SOAP), the Web Service Description Language (WSDL) and JavaScript Object Notation (JSON), allowing models to be consumed by a variety of tools and software. Strategies for communicating with models from Web service interfaces are discussed, demonstrating the difficulty of exposing existing models on the Web. The thesis then reviews existing mechanisms for uncertainty management, with an emphasis on emulator methods for building efficient statistical surrogate models. A tool is developed to address the accessibility issues of such methods by providing a Web-based user interface and backend that ease the process of building and integrating emulators. These tools, plus the processing service framework, are applied to a real case study as part of the UncertWeb project. The usability of the framework is demonstrated by the implementation of a Web-based workflow for predicting future crop yields in the UK, which also demonstrates the abilities of the tools for emulator building and integration. Future directions for the development of the tools are discussed.
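To give a flavour of what exposing a model through a Web service means in practice, here is a minimal, hypothetical JSON endpoint written with Flask; the endpoint name, payload fields and placeholder model are illustrative and are not taken from the thesis's framework:

```python
# Minimal, hypothetical sketch of exposing a model as a JSON Web service,
# in the spirit of the processing-service framework described above.
# Endpoint name, payload fields and model are illustrative only.
from flask import Flask, jsonify, request

app = Flask(__name__)

def crop_yield_model(rainfall_mm: float, temperature_c: float) -> float:
    """Placeholder model standing in for a real (or emulated) process."""
    return 2.0 + 0.003 * rainfall_mm - 0.05 * abs(temperature_c - 15.0)

@app.route("/process/crop-yield", methods=["POST"])
def run_model():
    inputs = request.get_json()
    result = crop_yield_model(inputs["rainfall_mm"], inputs["temperature_c"])
    return jsonify({"yield_t_per_ha": result})

if __name__ == "__main__":
    app.run(port=5000)
```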
Abstract:
We propose an arithmetic of function intervals as a basis for convenient rigorous numerical computation. Function intervals can be used as mathematical objects in their own right or as enclosures of functions over the reals. We present two areas of application of function interval arithmetic and associated software that implements the arithmetic: (1) Validated ordinary differential equation solving using the AERN library and within the Acumen hybrid system modeling tool. (2) Numerical theorem proving using the PolyPaver prover. © 2014 Springer-Verlag.
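The enclosure idea behind interval arithmetic can be illustrated with a toy numeric interval type in Python; the actual AERN and PolyPaver software is written in Haskell and operates on intervals of functions, not just numbers:

```python
# Toy sketch of interval arithmetic as enclosures. The AERN/PolyPaver tools
# are Haskell libraries working with intervals of *functions*; this numeric
# version only illustrates the enclosure principle.
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))

# Every product of a number in [1, 2] and a number in [-1, 3] lies in the result:
print(Interval(1.0, 2.0) * Interval(-1.0, 3.0))   # Interval(lo=-2.0, hi=6.0)
```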
Abstract:
One of the current research trends in Enterprise Resource Planning (ERP) involves examining the critical factors for its successful implementation. However, such research is limited to system implementation and does not focus on the flexibility of ERP to respond to changes in business. Therefore, this study explores a combination system, made up of an ERP and informality, intended to provide organisations with both efficient and flexible performance. In addition, this research analyses the benefits and challenges of using the system. The research was based on socio-technical system (STS) theory, which contains two dimensions: 1) a technical dimension, which evaluates the performance of the system; and 2) a social dimension, which examines the impact of the system on an organisation. A mixed-method approach has been followed in this research. The qualitative part aims to understand the constraints of using a single ERP system and to define a new system that responds to these problems. To achieve this goal, four Chinese companies operating in different industries were studied, all of which faced challenges in using an ERP system due to complexity and uncertainty in their business environments. The quantitative part contains a discrete-event simulation study that examines the impact on operational performance when a company implements the hybrid system in a real-life situation. Moreover, this research conducts a further qualitative case study to better understand the influence of the system in an organisation. The empirical aspect of the study reveals that an ERP with pre-determined business activities cannot react promptly to unanticipated changes in a business. Incorporating informality into an ERP allows it to react to different situations by using different procedures based on the practical knowledge of frontline employees. Furthermore, the simulation study shows that the combination system can achieve a balance between efficiency and flexibility. Unlike existing research, which emphasises continuous improvement of the IT functions of an enterprise system, this research contributes a theoretical definition of a new system, which has mixed performance and contains both the formal practices embedded in an ERP and informal activities based on human knowledge. It supports both cost-efficiency in executing business transactions and flexibility in coping with business uncertainty. This research also indicates the risks of using the system, such as using an ERP with limited functions, a high cost of performing informally, and low system acceptance owing to a shift in organisational culture. With respect to practical contribution, this research suggests that companies can choose the most suitable enterprise system approach in accordance with their operational strategies. The combination system can be implemented in a company that needs to operate with a medium level of volume and variety. By contrast, the traditional ERP system is better suited to a company operating in a high-volume market, while an informal system is more suitable for a firm requiring a high level of variety.
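For readers unfamiliar with discrete-event simulation, the following SimPy sketch shows the general style of such a study, with orders queuing for a single processing resource; the parameters and process logic are illustrative and do not reproduce the study's ERP/informality model:

```python
# Generic discrete-event simulation sketch (SimPy): orders queue for a single
# processing resource. Illustrative only; not the study's actual model.
import random
import simpy

def order(env, name, desk):
    arrive = env.now
    with desk.request() as req:
        yield req                                     # wait for the processing desk
        wait = env.now - arrive
        yield env.timeout(random.expovariate(1 / 4))  # processing time, mean 4
        print(f"{name}: waited {wait:.1f}, done at t={env.now:.1f}")

def generate_orders(env, desk):
    for i in range(10):
        env.process(order(env, f"order-{i}", desk))
        yield env.timeout(random.expovariate(1 / 3))  # inter-arrival time, mean 3

random.seed(0)
env = simpy.Environment()
desk = simpy.Resource(env, capacity=1)
env.process(generate_orders(env, desk))
env.run()
```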
Abstract:
Accurate T-cell epitope prediction is a principal objective of computational vaccinology. As a service to the immunology and vaccinology communities at large, we have implemented, as a server on the World Wide Web, a partial least squares-based multivariate statistical approach to the quantitative prediction of peptide binding to major histocompatibility complexes (MHC), the key checkpoint on the antigen presentation pathway within adaptive, cellular immunity. MHCPred implements robust statistical models for both Class I alleles (HLA-A*0101, HLA-A*0201, HLA-A*0202, HLA-A*0203, HLA-A*0206, HLA-A*0301, HLA-A*1101, HLA-A*3301, HLA-A*6801, HLA-A*6802 and HLA-B*3501) and Class II alleles (HLA-DRB*0401, HLA-DRB*0401 and HLA-DRB*0701).
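A generic sketch of the underlying statistical technique, partial least squares regression on encoded peptides, is shown below using scikit-learn; the one-hot encoding and synthetic affinities are illustrative, and this is not the MHCPred model itself:

```python
# Generic sketch of partial least squares (PLS) regression on toy peptide
# encodings. Data and encoding are synthetic; this is not MHCPred itself.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(4)
amino_acids = "ACDEFGHIKLMNPQRSTVWY"

def one_hot(peptide):
    """Encode a 9-mer as a flat 9 x 20 indicator vector."""
    vec = np.zeros((9, 20))
    for pos, aa in enumerate(peptide):
        vec[pos, amino_acids.index(aa)] = 1.0
    return vec.ravel()

peptides = ["".join(rng.choice(list(amino_acids), size=9)) for _ in range(300)]
X = np.array([one_hot(p) for p in peptides])
y = rng.normal(6.0, 1.0, size=300)                 # toy pIC50-like affinities

pls = PLSRegression(n_components=5).fit(X, y)
print("prediction for first peptide:", float(pls.predict(X[:1]).ravel()[0]))
```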
Abstract:
A methodology for the computer-aided investigation and assurance of safety for complex constructions, and a prototype of the intelligent applied system which implements it, are considered. The methodology is determined by the model of the object under scrutiny, by the structure and functions of the safety investigation, and by a set of research methods. The methods are based on the technologies of object-oriented databases and expert systems, and on mathematical modeling. The prototype of the intelligent system is component software that supports decision making in the process of safety investigations and in the investigation of the causes of failure. Decision-making support is carried out by analogy, through a deterministic search for precedents (cases) with respect to the predicted (at the design stage) and observed (at the operation stage) parameters of damage, destruction and malfunction of a complex hazardous construction.
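The search-for-precedents step can be illustrated with a toy nearest-neighbour retrieval over a hypothetical case base; the features, cases and labels below are invented for illustration, and the real system combines object-oriented databases with expert-system rules:

```python
# Toy sketch of retrieval by analogy: find stored precedent cases whose damage
# parameters are closest to the observed ones. Case base is hypothetical.
import numpy as np
from sklearn.neighbors import NearestNeighbors

# Hypothetical case base: [crack_length_mm, corrosion_depth_mm, load_ratio]
case_base = np.array([
    [12.0, 0.5, 0.7],
    [30.0, 2.1, 0.9],
    [5.0,  0.1, 0.4],
    [28.0, 1.8, 0.95],
])
case_labels = ["fatigue crack", "corrosion failure", "no action", "corrosion failure"]

observed = np.array([[27.0, 2.0, 0.9]])            # parameters of the new incident
nn = NearestNeighbors(n_neighbors=2).fit(case_base)
_, idx = nn.kneighbors(observed)
print("closest precedents:", [case_labels[i] for i in idx[0]])
```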
Abstract:
The purpose of this work is the development of the database of a distributed information measurement and control system that implements methods of optical spectroscopy for plasma physics and atomic collision research and provides remote access to information and hardware resources within Intranet/Internet networks. The database is built on the Oracle9i database management system. The client software was written in Java using the Model-View-Controller architecture, which separates application data from graphical presentation components and input processing logic. The following graphical presentations were implemented: measurement of radiation spectra of beam and plasma objects, excitation functions for inelastic collisions of heavy particles, and analysis of data acquired in previous experiments. The graphical clients interact with the database in the following ways: browsing information on experiments of a given type, searching for data by various criteria, and inserting information about previous experiments.
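As a small illustration of the "searching for data by various criteria" functionality (using Python's built-in sqlite3 rather than the Oracle9i/Java stack described above, and with hypothetical table and column names):

```python
# Toy sketch of criteria-based searching over experiment records, using
# sqlite3 instead of Oracle9i/Java; table and column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE experiments (id INTEGER, type TEXT, date TEXT)")
conn.executemany("INSERT INTO experiments VALUES (?, ?, ?)", [
    (1, "plasma spectrum", "2004-03-01"),
    (2, "excitation function", "2004-05-12"),
    (3, "plasma spectrum", "2005-01-20"),
])

# Parameterised search by experiment type and date range
rows = conn.execute(
    "SELECT id, date FROM experiments WHERE type = ? AND date >= ?",
    ("plasma spectrum", "2004-06-01"),
).fetchall()
print(rows)   # [(3, '2005-01-20')]
```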
Abstract:
* The work is supported by RFBR, grant 04-01-00858-a
Abstract:
The present paper is devoted to the implementation of cryptographic data security and of a packet mode in the distributed information measurement and control system that implements methods of optical spectroscopy for plasma physics research and atomic collisions. The system provides remote access to information and hardware resources within Intranet/Internet networks. Access to the physical equipment is realised through the standard interface servers (PXI, CAMAC, and GPIB), the server providing access to Ethernet devices, and the communication server, which integrates the equipment servers into a uniform information system. The system is used to carry out research tasks in optical spectroscopy, as well as to support teaching at the Department of Physics and Engineering of Petrozavodsk State University.
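As an illustration of the kind of cryptographic protection of data packets described here, a toy sketch using the Python cryptography package is given below; the actual system's protocol, algorithms and key handling are not reproduced:

```python
# Toy sketch of symmetric encryption of a measurement packet. Uses the Python
# 'cryptography' package; the real system's protocol and key handling differ.
from cryptography.fernet import Fernet

key = Fernet.generate_key()             # shared secret between client and server
cipher = Fernet(key)

packet = b'{"device": "spectrometer-1", "counts": [102, 98, 87]}'
token = cipher.encrypt(packet)          # what travels over the network
print(cipher.decrypt(token) == packet)  # True: receiver recovers the packet
```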