28 results for skills mapping process

in Aston University Research Archive


Relevance: 80.00%

Abstract:

This thesis is a study of the generation of topographic mappings - dimension-reducing transformations of data that preserve some element of geometric structure - with feed-forward neural networks. As an alternative to established methods, a transformational variant of Sammon's method is proposed, where the projection is effected by a radial basis function neural network. This approach is related to the statistical field of multidimensional scaling, from which the concept of a 'subjective metric' is defined, permitting the exploitation of additional prior knowledge about the data in the mapping process. This in turn enables the generation of more appropriate feature spaces for enhanced visualisation or subsequent classification. A comparison with established methods for feature extraction is given for data taken from the 1992 Research Assessment Exercise for higher educational institutions in the United Kingdom. This is a difficult high-dimensional dataset, and it illustrates well the benefit of the new topographic technique. A generalisation of the proposed model is considered for implementation of the classical multidimensional scaling (MDS) routine. This is related to Oja's principal subspace neural network, whose learning rule is shown to descend the error surface of the proposed MDS model. Some of the technical issues concerning the design and training of topographic neural networks are investigated. It is shown that neural network models can be less sensitive to entrapment in the sub-optimal local minima that badly affect the standard Sammon algorithm, and tend to exhibit good generalisation as a result of implicit weight decay in the training process. It is further argued that, for ideal structure retention, the network transformation should be perfectly smooth for all inter-data directions in input space. Finally, there is a critique of optimisation techniques for topographic mappings, and a new training algorithm is proposed. A convergence proof is given, and the method is shown to produce lower-error mappings more rapidly than previous algorithms.
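
The following is a minimal, hedged sketch of the transformational idea the abstract describes: rather than optimising free two-dimensional coordinates as in classical Sammon mapping, the projection is the output of a radial basis function network and Sammon's stress is minimised with respect to the network weights. The data, centres, widths and optimiser below are illustrative choices, not the thesis's actual settings.

```python
# Transformational Sammon mapping via an RBF network (illustrative sketch)
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 5))                  # toy high-dimensional data

centres = X[rng.choice(len(X), 10, replace=False)]
width = np.median(pdist(X))                   # a common heuristic width

def phi(X):
    # Gaussian RBF design matrix, one column per centre
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

D = pdist(X)                                  # input-space distances

def sammon_stress(w):
    Y = phi(X) @ w.reshape(10, 2)             # RBF projection to 2-D
    d = pdist(Y)                              # map-space distances
    return np.sum((D - d) ** 2 / D) / D.sum() # Sammon's stress measure

w0 = rng.normal(scale=0.1, size=10 * 2)
res = minimize(sammon_stress, w0, method="L-BFGS-B")
Y = phi(X) @ res.x.reshape(10, 2)             # final 2-D map
print(f"stress = {res.fun:.4f}")
```

Because the map is a smooth function of the inputs rather than a lookup table of coordinates, it can project previously unseen points, which is the property the thesis exploits.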

Relevance: 80.00%

Abstract:

The objective of this study was to compare the in vitro dissolution profile of a new rapidly absorbed paracetamol tablet containing sodium bicarbonate (PS) with that of a conventional paracetamol tablet (P), and to relate these by deconvolution and mapping to in vivo release. The dissolution methods used included the standard procedure described in the USP monograph for paracetamol tablets, employing buffer at pH 5.8 or 0.05 M HCl at stirrer speeds between 10 and 50 rpm. The mapping process was developed and implemented in Microsoft Excel® worksheets that iteratively calculated the optimal values of the scale and shape factors linking in vivo time to in vitro time. The in vitro-in vivo correlation (IVIVC) was carried out simultaneously for both formulations to produce common mapping factors. The USP method, using buffer at pH 5.8, demonstrated no difference between the two products. Using an acidic medium, however, the rate of dissolution of P, but not of PS, decreased with decreasing stirrer speed. A significant correlation (r = 0.773; p < 0.00001) was established between in vivo release and in vitro dissolution using the profiles obtained with 0.05 M HCl and a stirrer speed of 30 rpm. The scale factor for optimal simultaneous IVIVC in the fasting state was 2.54 and the shape factor was 0.16; the corresponding values for mapping in the fed state were 3.37 and 0.13, implying a larger in vitro-in vivo time difference but a reduced shape difference in the fed state. The current IVIVC explains, in part, the observed in vivo variability of the two products. The approach to mapping may also be extended to different batches of these products, to predict the impact of any changes in in vitro dissolution on in vivo release and plasma drug concentration-time profiles.
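
The Excel worksheets themselves are not reproduced here, so the sketch below is only a hedged illustration of the time-mapping step: it assumes a power-law link between in vivo and in vitro time (t_vitro = scale * t_vivo ** shape, an assumed functional form chosen because the abstract reports one scale and one shape factor) and fits both factors by least squares on clearly labelled synthetic profiles.

```python
# Illustrative IVIVC time-mapping fit (synthetic data, assumed power-law link)
import numpy as np
from scipy.optimize import minimize

# Synthetic profiles - NOT the study's data, invented for illustration only.
t_vivo  = np.array([0.25, 0.5, 1.0, 1.5, 2.0, 3.0])       # h
fa      = np.array([0.20, 0.45, 0.70, 0.84, 0.92, 0.98])  # fraction absorbed
t_vitro = np.array([5, 10, 15, 20, 30, 45]) / 60.0         # h
diss    = np.array([0.35, 0.60, 0.78, 0.88, 0.96, 0.99])   # fraction dissolved

def sse(params):
    scale, shape = params
    mapped_t = scale * t_vivo ** shape           # assumed link function
    pred = np.interp(mapped_t, t_vitro, diss)    # in vitro value at mapped time
    return np.sum((pred - fa) ** 2)

res = minimize(sse, x0=[1.0, 0.5], method="Nelder-Mead")
scale, shape = res.x
pred = np.interp(scale * t_vivo ** shape, t_vitro, diss)
r = np.corrcoef(pred, fa)[0, 1]
print(f"scale = {scale:.2f}, shape = {shape:.2f}, r = {r:.3f}")
```

Fitting both formulations simultaneously, as the study does, would simply pool the residuals of P and PS in `sse` so that one common pair of factors is optimised.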

Relevance: 80.00%

Abstract:

This paper presents a framework for quality control of volunteered geographic information (VGI). Different issues need to be considered during the conception, acquisition and post-acquisition phases of VGI creation, including collecting metadata on the volunteer, providing suitable training, giving corrective feedback during the mapping process and using control data, among others. Two examples of VGI data collection are then considered with respect to this quality control framework: collection by National Mapping Agencies and by the most recent Geo-Wiki tool, a game called Cropland Capture. Although good practices are beginning to emerge, there is still a need for the development and sharing of best practice, especially if VGI is to be integrated with authoritative map products or used for calibration and/or validation of land cover in the future.

Relevance: 40.00%

Abstract:

A case study demonstrates the use of a process-based approach to change in the implementation of an information system for road traffic accident reporting in a UK police force. The supporting tools of process mapping and business process simulation are used in the change process and assist in communicating the current process design and people's roles in the overall performance of that design. The simulation model is also used to predict the performance of new designs incorporating the use of information technology. The approach is seen to have a number of advantages in the context of a public sector organisation. These include enabling personnel to move from a traditional grouping of staff into occupational groups, with relationships defined by reporting requirements, to a view of their role within a process that delivers performance to a customer. By running the simulation through time it is also possible to gauge how changes at an operational level can lead to the meeting of strategic targets over time. The ability of simulation to test new designs was seen as particularly important in a government agency where past failures of information technology investments had contributed to a more risk-averse approach to their implementation.
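
Purely as an illustration of the kind of business process simulation the paper applies (the actual model is not reproduced here), the sketch below uses the simpy discrete-event library, with invented arrival and service rates, to compare mean lead time for a current and a redesigned reporting process.

```python
# Discrete-event sketch of an accident-report process (illustrative rates)
import random
import simpy

def run(service_time_min, label, staff=2, arrivals_per_hr=10, sim_hrs=200):
    random.seed(1)                      # same arrival stream for both runs
    env = simpy.Environment()
    clerk = simpy.Resource(env, capacity=staff)
    lead_times = []

    def report(env):
        start = env.now
        with clerk.request() as req:    # queue for an available clerk
            yield req
            yield env.timeout(random.expovariate(1 / service_time_min))
        lead_times.append(env.now - start)

    def source(env):
        while True:                     # Poisson arrivals of new reports
            yield env.timeout(random.expovariate(arrivals_per_hr / 60))
            env.process(report(env))

    env.process(source(env))
    env.run(until=sim_hrs * 60)         # simulate in minutes
    print(f"{label}: mean lead time {sum(lead_times) / len(lead_times):.1f} min")

run(10.0, "current paper-based process")
run(6.0,  "redesigned process with IT support")
```

As the abstract notes, the value of the dynamic model over a paper process map is exactly what this toy run shows: queueing effects make the lead-time gap between the two designs larger than the difference in raw task times.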

Relevance: 40.00%

Abstract:

We re-analysed visuo-spatial perspective-taking data from Kessler and Thomson (2010), plus a previously unpublished pilot, with respect to individual and sex differences in embodied processing (defined as body-posture congruence effects). We found that so-called 'systemisers' (males/low social skills) showed weaker embodiment than so-called 'embodiers' (females/high social skills). We conclude that 'systemisers' either have difficulties with embodied processing or, alternatively, a strategic advantage in selecting different mechanisms or the appropriate level of embodiment. In contrast, 'embodiers' have an advantageous strategy of 'deep' embodied processing reflecting their urge to empathise or, alternatively, less flexibility in fine-tuning the involvement of bodily representations.

Relevance: 30.00%

Abstract:

The data available during the drug discovery process is vast in amount and diverse in nature. To gain useful information from such data, an effective visualisation tool is required. To provide better visualisation facilities to domain experts (screening scientists, biologists, chemists, etc.), we developed software based on recently developed principled visualisation algorithms such as Generative Topographic Mapping (GTM) and Hierarchical Generative Topographic Mapping (HGTM). The software also supports conventional visualisation techniques such as Principal Component Analysis, NeuroScale, PhiVis, and Locally Linear Embedding (LLE), and provides global and local regression facilities. It supports regression algorithms such as the Multilayer Perceptron (MLP), Radial Basis Function networks (RBF), Generalised Linear Models (GLM), Mixture of Experts (MoE), and the newly developed Guided Mixture of Experts (GME). This user manual gives an overview of the purpose of the software tool, highlights some of the issues to be taken care of when creating a new model, and provides information about how to install and use the tool. The manual does not require readers to be familiar with the algorithms it implements; basic computing skills are enough to operate the software.
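
The software itself is not publicly documented here, so the following is only a hedged sketch of the simplest technique it lists, PCA, applied to invented screening-like data; in the tool, GTM or HGTM would replace the projection step.

```python
# PCA projection of synthetic high-dimensional "screening" data (illustrative)
import numpy as np
import matplotlib.pyplot as plt
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# two invented compound classes in a 20-dimensional descriptor space
X = np.vstack([rng.normal(0.0, 1, (100, 20)),
               rng.normal(1.5, 1, (100, 20))])
labels = np.array([0] * 100 + [1] * 100)

Y = PCA(n_components=2).fit_transform(X)      # linear 2-D projection
plt.scatter(Y[:, 0], Y[:, 1], c=labels, cmap="coolwarm", s=12)
plt.xlabel("PC 1")
plt.ylabel("PC 2")
plt.title("PCA view of synthetic screening data")
plt.show()
```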

Relevance: 30.00%

Abstract:

Today, the data available to tackle many scientific challenges is vast in quantity and diverse in nature. The exploration of heterogeneous information spaces requires suitable mining algorithms as well as effective visual interfaces. miniDVMS v1.8 provides a flexible visual data-mining framework which combines advanced projection algorithms developed in the machine learning domain with visual techniques developed in the information visualisation domain. The advantage of this interface is that the user is directly involved in the data-mining process. Principled projection methods, such as generative topographic mapping (GTM) and hierarchical GTM (HGTM), are integrated with powerful visual techniques, such as magnification factors, directional curvatures, parallel coordinates, and user interaction facilities, to provide this integrated visual data-mining framework. The software also supports conventional visualisation techniques such as principal component analysis (PCA), NeuroScale, and PhiVis. This user manual gives an overview of the purpose of the software tool, highlights some of the issues to be taken care of when creating a new model, and provides information about how to install and use the tool. The manual does not require readers to be familiar with the algorithms it implements; basic computing skills are enough to operate the software.
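
miniDVMS itself is not available here; as one concrete example of the visual techniques the manual lists, the sketch below draws a parallel-coordinates plot with pandas on a standard dataset. This is purely illustrative and is not the tool's own code.

```python
# Parallel-coordinates plot, one of the listed visual techniques (illustrative)
import matplotlib.pyplot as plt
from pandas.plotting import parallel_coordinates
from sklearn.datasets import load_iris

data = load_iris(as_frame=True)
df = data.frame
# replace integer class codes with readable species names
df["target"] = df["target"].map(dict(enumerate(data.target_names)))

parallel_coordinates(df, class_column="target", colormap="viridis")
plt.title("Parallel-coordinates view of the iris data")
plt.show()
```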

Relevance: 30.00%

Abstract:

This paper addresses the problem of novelty detection in the case where the observed data are a mixture of a known 'background' process contaminated with an unknown other process, which generates the outliers, or novel observations. The framework described here is quite general, employing univariate classification with incomplete information, based on knowledge of the distribution (the probability density function, pdf) of the data generated by the 'background' process. The relative proportion of this 'background' component (the prior 'background' probability), and the pdfs and prior probabilities of all other components, are assumed unknown. The main contribution is a new classification scheme that identifies the maximum proportion of observed data following the known 'background' distribution. The method exploits the Kolmogorov-Smirnov test to estimate the proportions, after which the data are Bayes-optimally separated. Results, demonstrated with synthetic data, show that this approach can produce more reliable results than a standard novelty detection scheme. The classification algorithm is then applied to the problem of identifying outliers in the SIC2004 data set, in order to detect the radioactive release simulated in the 'joker' data set. We propose this method as a reliable means of novelty detection in the emergency situation, which can also be used to identify outliers prior to the application of a more general automatic mapping algorithm.
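
The following is a loose sketch of the scheme under strong simplifying assumptions: a known Gaussian background pdf, and contamination assumed uniform over the observed range for the Bayes step. The proportion estimate scans for the largest background fraction still compatible with the empirical CDF under a KS-style tolerance; the paper's exact estimator may differ.

```python
# Simplified KS-based background-proportion estimate + Bayes split (sketch)
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0, 1, 900),     # known 'background'
                       rng.uniform(3, 6, 100)])   # unknown contamination

bg = stats.norm(0, 1)
x = np.sort(data)
F_emp = np.arange(1, len(x) + 1) / len(x)
tol = 1.36 / np.sqrt(len(x))                      # ~5% KS critical value

# Mixture CDF F = p*F_bg + (1-p)*F_out with 0 <= F_out <= 1 implies
# p*F_bg(x) - tol <= F_emp(x) <= p*F_bg(x) + (1-p) + tol for all x.
feasible = [p for p in np.linspace(0, 1, 201)
            if np.all(F_emp >= p * bg.cdf(x) - tol)
            and np.all(F_emp <= p * bg.cdf(x) + (1 - p) + tol)]
p_hat = max(feasible)

# Bayes-optimal split under the assumed uniform contamination density
out_pdf = 1.0 / (x.max() - x.min())
is_background = p_hat * bg.pdf(data) > (1 - p_hat) * out_pdf
print(f"estimated background proportion: {p_hat:.2f}")
print(f"flagged {np.sum(~is_background)} novel observations")
```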

Relevance: 30.00%

Abstract:

This paper argues for the use of reusable simulation templates as a tool that can help to predict the effect of e-business introduction on business processes. First, a set of requirements for e-business modelling is introduced and modelling options are described. Traditional business process mapping techniques are examined as a way of identifying potential changes. Whilst paper-based process mapping may not highlight significant differences between traditional and e-business processes, simulation does allow the real effects of e-business to be identified. Simulation has the advantage of capturing the dynamic characteristics of the process, thus reflecting more accurately the changes in behaviour. This paper shows the value of using generic process maps as a starting point for collecting the data needed to build the simulation and proposes the use of reusable templates/components for the speedier building of e-business simulation models.
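
As a hedged illustration of the reusable-template idea, the sketch below encodes one generic process map as parameterised activities and instantiates it for a traditional and an e-business variant of the same order process. Activity names and times are invented for illustration.

```python
# Reusable process template instantiated for two process variants (sketch)
from dataclasses import dataclass

@dataclass
class Activity:
    name: str
    minutes: float

def order_process(entry_minutes, confirm_minutes):
    # one generic template; variant-specific steps are parameters
    return [
        Activity("receive order", entry_minutes),
        Activity("check stock", 5),
        Activity("confirm to customer", confirm_minutes),
        Activity("dispatch", 30),
    ]

def cycle_time(template):
    # deterministic roll-up; a full simulation run would replace this
    return sum(a.minutes for a in template)

variants = {
    "traditional (phone/fax)": order_process(15, 60),
    "e-business (web form)": order_process(1, 1),
}
for label, template in variants.items():
    print(f"{label}: {cycle_time(template)} min end-to-end")
```

In a full model, the template's activities would feed a discrete-event simulation rather than a simple sum, which is where the dynamic effects the paper emphasises would appear.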

Relevance: 30.00%

Abstract:

A structured approach to process improvement is described in the context of the human resources division of a UK police force. The approach combines a number of established process improvement techniques, such as the balanced scorecard and process mapping, with a scoring system developed to prioritise processes for improvement. The methodology presents one way of ensuring that the correct processes are identified and redesigned at an operational level in such a way as to support the organisation's strategic aims. In addition, a performance measurement system is used to check that the changes implemented actually achieve the desired effect over time. The case demonstrates the need to choose, and in some cases develop, in-house tools and techniques depending on the context of the process improvement effort.
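
The paper's in-house scoring system is not reproduced here; the sketch below only illustrates the general pattern of such a system: score each candidate process on weighted criteria linked to strategic aims, then rank the processes for redesign. Criteria, weights and scores are invented for illustration.

```python
# Weighted scoring to prioritise processes for improvement (illustrative)
processes = {
    "recruitment":        {"strategic_impact": 5, "performance_gap": 4, "feasibility": 3},
    "training booking":   {"strategic_impact": 3, "performance_gap": 5, "feasibility": 5},
    "sickness reporting": {"strategic_impact": 2, "performance_gap": 3, "feasibility": 4},
}
weights = {"strategic_impact": 0.5, "performance_gap": 0.3, "feasibility": 0.2}

def priority(scores):
    # weighted sum of criterion scores
    return sum(weights[criterion] * s for criterion, s in scores.items())

for name, scores in sorted(processes.items(), key=lambda kv: priority(kv[1]),
                           reverse=True):
    print(f"{name}: priority score {priority(scores):.2f}")
```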

Relevance: 30.00%

Abstract:

Aim: To undertake a national study of teaching, learning and assessment in UK schools of pharmacy.

Design: Triangulation of course documentation, 24 semi-structured interviews undertaken with 29 representatives from the schools, and a survey of all final-year students (n = 1,847) in the 15 schools within the UK during 2003–04.

Subjects and setting: All established UK pharmacy schools and final-year MPharm students.

Outcome measures: Data were combined and analysed under the topics of curriculum, teaching and learning, assessment, multi-professional teaching and learning, placement education and research projects.

Results: Professional accreditation was the main driver for curriculum design, but links to preregistration training were poor. Curricula were consistent but offered little student choice. On average, half the curriculum was science-based; staff supported the science content but students less so. Courses were didactic, but schools were experimenting with new methods of learning. Examinations were the principal form of assessment, but the contribution of practice to the final degree varied considerably (21–63%). Most students considered the assessment load to be about right, but with too much emphasis upon knowledge. Assessment of professional competence was focused upon dispensing and pharmacy law. All schools undertook placement teaching in hospitals, but there was little in community/primary care and little inter-professional education. Resources and logistics were the major limiting factors.

Conclusions: There is a need for an integrated review of the accreditation process for the MPharm and preregistration training, and a redefinition of professional competence at an undergraduate level.