895 results for Hierarchical task analysis
Abstract:
We have undertaken two-dimensional gel electrophoresis proteomic profiling on a series of cell lines with different recombinant antibody production rates. Due to the nature of gel-based experiments not all protein spots are detected across all samples in an experiment, and hence datasets are invariably incomplete. New approaches are therefore required for the analysis of such graduated datasets. We approached this problem in two ways. Firstly, we applied a missing value imputation technique to calculate missing data points. Secondly, we combined a singular value decomposition based hierarchical clustering with the expression variability test to identify protein spots whose expression correlates with increased antibody production. The results have shown that while imputation of missing data was a useful method to improve the statistical analysis of such data sets, this was of limited use in differentiating between the samples investigated, and highlighted a small number of candidate proteins for further investigation. (c) 2006 Elsevier B.V. All rights reserved.
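The two-step analysis described above can be sketched in miniature. This is an illustrative stand-in, not the study's actual pipeline: simple row-mean imputation replaces the missing value technique, and spots are clustered hierarchically on their leading singular-vector coordinates. All data and thresholds are invented.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Toy spot-intensity matrix (rows = protein spots, cols = samples);
# NaN marks spots not detected on a given gel.
X = np.array([
    [1.0, 1.2, np.nan, 1.1],
    [0.4, np.nan, 0.5, 0.45],
    [2.0, 2.1, 2.2, np.nan],
    [0.9, 1.0, 1.1, 1.0],
])

# Step 1: impute missing values with each spot's mean intensity
# (a stand-in for the imputation technique used in the study).
row_means = np.nanmean(X, axis=1, keepdims=True)
X_imp = np.where(np.isnan(X), row_means, X)

# Step 2: project spots onto the leading singular vectors, then
# hierarchically cluster in the reduced space.
U, s, Vt = np.linalg.svd(X_imp, full_matrices=False)
scores = U[:, :2] * s[:2]            # spot coordinates in SVD space
Z = linkage(scores, method="average")
labels = fcluster(Z, t=2, criterion="maxclust")
```

The expression variability test used in the paper would then be applied to the imputed matrix to rank spots; that step is omitted here.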
Abstract:
The paper investigates a Bayesian hierarchical model for the analysis of categorical longitudinal data from a large social survey of immigrants to Australia. Data for each subject are observed on three separate occasions, or waves, of the survey. One of the features of the data set is that observations for some variables are missing for at least one wave. A model for the employment status of immigrants is developed by introducing, at the first stage of a hierarchical model, a multinomial model for the response and then subsequent terms are introduced to explain wave and subject effects. To estimate the model, we use the Gibbs sampler, which allows missing data for both the response and the explanatory variables to be imputed at each iteration of the algorithm, given some appropriate prior distributions. After accounting for significant covariate effects in the model, results show that the relative probability of remaining unemployed diminished with time following arrival in Australia.
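The imputation-within-Gibbs idea can be illustrated on a deliberately simplified model: a normal mean with known observation variance, rather than the paper's multinomial hierarchy with wave and subject effects. At each iteration, missing responses are drawn from their conditional distribution given the current parameters, then the parameters are redrawn given the completed data. Every number here is invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy longitudinal responses: rows = subjects, cols = waves;
# NaN marks a wave with a missing observation.
y = np.array([[1.2, np.nan, 1.5],
              [0.8, 0.9, np.nan],
              [1.1, 1.3, 1.4]])
missing = np.isnan(y)

mu, tau2 = 0.0, 10.0      # normal prior on the grand mean
sigma2 = 0.25             # known observation variance (simplification)
theta = 1.0               # initial value of the grand mean

for it in range(2000):
    # Imputation step: draw missing y's from their conditional given theta.
    y[missing] = rng.normal(theta, np.sqrt(sigma2), size=missing.sum())
    # Parameter step: draw theta from its conjugate normal conditional.
    n = y.size
    post_var = 1.0 / (1.0 / tau2 + n / sigma2)
    post_mean = post_var * (mu / tau2 + y.sum() / sigma2)
    theta = rng.normal(post_mean, np.sqrt(post_var))
```

In the paper's setting the same alternation applies, but the parameter step updates multinomial regression coefficients and the imputation step covers missing covariates as well as missing responses.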
Abstract:
This study examined the multilevel relationships among negative affect, Behavioural Inhibition System (BIS) sensitivity, and performance. It also investigated whether the relationships among these variables changed across practice. Participants performed multiple trials of a simulated air traffic control task. A single measure of BIS was taken before practice, while negative affect and performance were measured at repeated intervals. As expected, negative affect was detrimental to performance at both the between-person and within-person levels. BIS was also found to be detrimental to performance. Contrary to expectations, the relationship between BIS and performance was not mediated by overall levels of negative affect. As predicted, the effects of overall levels of negative affect and BIS strengthened across practice as participants gained task knowledge and skill. The findings of this study are interpreted using resource allocation theory, and the implications for skill acquisition are discussed.
Abstract:
Experiments with simulators allow psychologists to better understand the causes of human errors and build models of cognitive processes to be used in human reliability assessment (HRA). This paper investigates an approach to task failure analysis based on patterns of behaviour, by contrast to more traditional event-based approaches. It considers, as a case study, a formal model of an air traffic control (ATC) system which incorporates controller behaviour. The cognitive model is formalised in the CSP process algebra. Patterns of behaviour are expressed as temporal logic properties. Then a model-checking technique is used to verify whether the decomposition of the operator's behaviour into patterns is sound and complete with respect to the cognitive model. The decomposition is shown to be incomplete and a new behavioural pattern is identified, which appears to have been overlooked in the analysis of the data provided by the experiments with the simulator. This illustrates how formal analysis of operator models can yield fresh insights into how failures may arise in interactive systems.
Abstract:
Web transaction data between Web visitors and Web functionalities usually convey task-oriented user behaviour patterns, and mining such click-stream data can capture usage-pattern information. Web usage mining has become one of the most widely used methods for Web recommendation, which customizes Web content to a user's preferred style. Traditional Web usage mining techniques, such as Web user session or Web page clustering, association rule mining, and frequent navigational path mining, can only discover usage patterns explicitly; they cannot reveal the underlying navigational activities or identify the latent relationships among Web users and Web pages that are associated with those patterns. In this work, we propose a Web recommendation framework that incorporates Web usage mining based on the Probabilistic Latent Semantic Analysis (PLSA) model. The main advantage of this method is that it discovers not only usage-based access patterns but also the underlying latent factors. With the discovered user access patterns, we then present users with content of greater interest via collaborative recommendation. To validate the effectiveness of the proposed approach, we conduct experiments on real-world datasets and compare the results against some existing traditional techniques. The preliminary experimental results demonstrate the usability of the proposed approach.
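The latent-factor machinery the framework builds on is standard PLSA fitted by EM. The sketch below runs the textbook PLSA updates on an invented session-by-page count matrix; it is not the authors' full recommendation framework, and the data and factor count are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy session-page co-occurrence counts (rows = user sessions,
# cols = Web pages); entries are click counts.
N = np.array([[3, 2, 0, 0],
              [4, 1, 0, 1],
              [0, 0, 5, 2],
              [0, 1, 3, 4]], dtype=float)
n_z = 2  # number of latent factors

# Random initialisation of P(z), P(session|z), P(page|z).
Pz = np.full(n_z, 1.0 / n_z)
Ps_z = rng.random((N.shape[0], n_z))
Ps_z /= Ps_z.sum(axis=0)
Pp_z = rng.random((N.shape[1], n_z))
Pp_z /= Pp_z.sum(axis=0)

for _ in range(100):
    # E-step: posterior P(z | session, page), shape (s, p, z).
    joint = Pz * Ps_z[:, None, :] * Pp_z[None, :, :]
    Pz_sp = joint / joint.sum(axis=2, keepdims=True)
    # M-step: re-estimate the factors from expected counts.
    weighted = N[:, :, None] * Pz_sp
    Ps_z = weighted.sum(axis=1)
    Ps_z /= Ps_z.sum(axis=0)
    Pp_z = weighted.sum(axis=0)
    Pp_z /= Pp_z.sum(axis=0)
    Pz = weighted.sum(axis=(0, 1))
    Pz /= Pz.sum()
```

For recommendation, the fitted P(session|z) posteriors would be used to match a new visitor's partial session to latent usage factors and rank unseen pages; that step is not shown.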
Abstract:
Recently, goal orientation, a mental framework for understanding how individuals approach learning and achievement situations, has emerged as an important predictor of performance. This study addressed the effects of domain-specific avoid and prove orientations on performance from the between- and within-person levels of analysis. One hundred and three participants performed thirty trials of an air traffic control task. Domain-specific avoid and prove orientations were measured before each trial to assess the effects of changes in goal orientation on changes in performance (i.e. within-person relationships). Average levels of avoid and prove orientations were calculated to assess the effect of goal orientation on overall performance (i.e. between-person relationships). Findings from the between-person level of analysis revealed that high prove-oriented individuals performed better than low prove-oriented individuals. Results also revealed that average goal orientation levels moderated the within-person relationships. The effect of changes in avoid orientation on changes in performance was stronger for low versus high avoid-oriented individuals, while the effect of changes in prove orientation on changes in performance was stronger for low versus high prove-oriented individuals. Implications of these findings are considered.
Abstract:
This paper incorporates hierarchical structure into the neoclassical theory of the firm. Firms are hierarchical in two respects: the organization of workers in production and the wage structure. The firm’s hierarchy is represented as the sector of a circle, where the radius represents the hierarchy’s height, the width of the sector represents the breadth of the hierarchy at a given height, and the angle of the sector represents span of control for any given supervisor. A perfectly competitive firm then chooses height and width, as well as capital inputs, in order to maximize profit. We analyze the short run and long run impact of changes in scale economies, input substitutability and input and output prices on the firm’s hierarchical structure. We find that the firm unambiguously becomes more hierarchical as the specialization of its workers increases or as its output price increases relative to input prices. The effect of changes in scale economies is contingent on the output price. The model also brings forth an analysis of wage inequality within the firm, which is found to be independent of technological considerations, and only depends on the firm’s wage schedule.
Abstract:
Software simulation models are computer programs that need to be verified and debugged like any other software. In previous work, a method for error isolation in simulation models has been proposed. The method relies on a set of feature matrices that can be used to determine which part of the model implementation is responsible for deviations in the output of the model. Currently these feature matrices have to be generated by hand from the model implementation, which is a tedious and error-prone task. In this paper, a method based on mutation analysis, as well as prototype tool support for the verification of the manually generated feature matrices, is presented. The application of the method and tool to a model for wastewater treatment shows that the feature matrices can be verified effectively using a minimal number of mutants.
Abstract:
Among the Solar System's bodies, the Moon, Mercury, and Mars are at present, or have recently been, the object of space missions aimed, among other goals, at improving our knowledge of surface composition. Among the techniques for detecting a planet's mineralogical composition, from both remote and close-range platforms, visible and near-infrared reflectance (VNIR) spectroscopy is a powerful tool, because crystal field absorption bands are related to particular transition metals in well-defined crystal structures, e.g., Fe2+ in the M1 and M2 sites of olivine or pyroxene (Burns, 1993). Thanks to improvements in the spectrometers onboard recent missions, a more detailed interpretation of planetary surfaces can now be delineated. However, quantitative interpretation of planetary surface mineralogy is not always a simple task. In fact, several factors, such as mineral chemistry, the presence of different minerals that absorb in a narrow spectral range, regolith with a variable particle size range, space weathering, atmospheric composition, etc., act in unpredictable ways on the reflectance spectra of a planetary surface (Serventi et al., 2014). One method for the interpretation of reflectance spectra of unknown materials involves the study of a number of spectra acquired in the laboratory under different conditions, such as different mineral abundances or different particle sizes, in order to derive empirical trends. This is the methodology followed in this PhD thesis: each of the factors listed above has been analyzed by creating, in the laboratory, a set of terrestrial analogues with well-defined composition and particle size. The aim of this work is to provide new tools and criteria to improve our knowledge of the composition of planetary surfaces.
In particular, mixtures of plagioclase (PL) and mafic minerals with different contents and chemistries have been spectroscopically analyzed at different particle sizes and with different relative mineral percentages. The reflectance spectra of each mixture have been analyzed both qualitatively (using the ORIGIN® software) and quantitatively, applying the Modified Gaussian Model (MGM; Sunshine et al., 1990) algorithm. In particular, the variations of the spectral parameters of each absorption band have been evaluated against the volumetric FeO% content in the PL phase and against the PL modal abundance. This delineated calibration curves of composition versus spectral parameters and allowed the implementation of spectral libraries. Furthermore, the trends derived from the terrestrial analogues analyzed here and from analogues in the literature have been applied to the interpretation of hyperspectral images of both plagioclase-rich (Moon) and plagioclase-poor (Mars) bodies.
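The band-fitting step can be illustrated with a single-Gaussian least-squares fit to a synthetic absorption band. The real MGM fits several modified Gaussians plus a continuum simultaneously, so this is only a sketch; the wavelengths, band parameters, and noise level below are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

def band(w, continuum, strength, center, width):
    """Flat continuum minus one Gaussian absorption band."""
    return continuum - strength * np.exp(-0.5 * ((w - center) / width) ** 2)

# Synthetic reflectance spectrum with one absorption band near 1000 nm.
rng = np.random.default_rng(3)
wavelength = np.linspace(800, 1200, 200)
true_params = (0.9, 0.3, 1000.0, 60.0)
reflectance = band(wavelength, *true_params) + rng.normal(0, 0.002, wavelength.size)

# Least-squares recovery of the band parameters from the noisy spectrum.
popt, _ = curve_fit(band, wavelength, reflectance,
                    p0=(0.8, 0.2, 980.0, 50.0))
```

In an MGM-style analysis, fitted band centers, depths, and widths such as these are the spectral parameters plotted against FeO% content and modal abundance to build the calibration curves.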
Abstract:
Underpinned by the resource-based view (RBV), social exchange theory (SET), and a theory of intrinsic motivation (empowerment), I proposed and tested a multi-level model that simultaneously examines the intermediate linkages or mechanisms through which HPWS impact individual and organizational performance. First, underpinned by the RBV, I examined, at the unit level, collective human capital and competitive advantage as pathways through which the use of HPWS influences branch market performance. Second, underpinned by social exchange (perceived organizational support) and intrinsic motivation (psychological empowerment) theories, I examined cross-level and individual-level mechanisms through which experienced HPWS may influence employee performance. I tested the propositions of this study with multisource data obtained from junior and senior customer contact employees, and managers, of 37 branches of two banks in Ghana. Results of the Structural Equation Modeling (SEM) analysis revealed that collective human capital partially mediated the relationship between management-rated HPWS and competitive advantage, while competitive advantage completely mediated the influence of human capital on branch market performance. Consequently, management-rated HPWS influenced branch market performance indirectly through collective human capital and competitive advantage. Additionally, results of hierarchical linear modeling (HLM) tests of the cross-level influences on the motivational implications of HPWS revealed that (i) management-rated HPWS influenced experienced HPWS; (ii) perceived organizational support (POS) and psychological empowerment fully mediated the influence of experienced HPWS on service-oriented organizational citizenship behaviour (OCB); and (iii) service-oriented OCB mediated the influence of psychological empowerment and POS on service quality and task performance. I discuss the theoretical and practical implications of these findings.
Abstract:
Visualization has proven to be a powerful and widely applicable tool for the analysis and interpretation of data. Most visualization algorithms aim to find a projection from the data space down to a two-dimensional visualization space. However, for complex data sets living in a high-dimensional space it is unlikely that a single two-dimensional projection can reveal all of the interesting structure. We therefore introduce a hierarchical visualization algorithm which allows the complete data set to be visualized at the top level, with clusters and sub-clusters of data points visualized at deeper levels. The algorithm is based on a hierarchical mixture of latent variable models, whose parameters are estimated using the expectation-maximization algorithm. We demonstrate the principle of the approach first on a toy data set, and then apply the algorithm to the visualization of a synthetic data set in 12 dimensions obtained from a simulation of multi-phase flows in oil pipelines and to data in 36 dimensions derived from satellite images.
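The top-down idea can be caricatured with plain PCA standing in for the latent variable models: one global two-dimensional projection of the whole data set, then a separate projection fitted per cluster. The median split below is a crude stand-in for the mixture responsibilities the actual algorithm computes via EM; all data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(2)

def pca_2d(X):
    """Project the rows of X onto their first two principal components."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:2].T

# Toy high-dimensional data with two well-separated clusters.
X = np.vstack([rng.normal(0, 1, (50, 12)),
               rng.normal(6, 1, (50, 12))])

# Top level: a single 2-D view of the complete data set.
top = pca_2d(X)

# Second level: split on the first top-level coordinate, then fit a
# separate 2-D projection for each cluster, revealing local structure
# the global projection may compress.
split = top[:, 0] > np.median(top[:, 0])
children = [pca_2d(X[split]), pca_2d(X[~split])]
```

In the actual algorithm each child view is a probabilistic latent variable model, and points contribute to child views in proportion to their posterior responsibilities rather than by a hard split.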
Abstract:
We are concerned with the problem of image segmentation in which each pixel is assigned to one of a predefined finite number of classes. In Bayesian image analysis, this requires fusing together local predictions for the class labels with a prior model of segmentations. Markov Random Fields (MRFs) have been used to incorporate some of this prior knowledge, but this is not entirely satisfactory, as inference in MRFs is NP-hard. The multiscale quadtree model of Bouman and Shapiro (1994) is an attractive alternative, as it is a tree-structured belief network in which inference can be carried out in linear time (Pearl, 1988). It is a hierarchical model in which the bottom-level nodes are pixels and higher levels correspond to downsampled versions of the image. The conditional-probability tables (CPTs) in the belief network encode the knowledge of how the levels interact. In this paper we discuss two methods of learning the CPTs given training data, using (a) maximum likelihood and the EM algorithm and (b) conditional maximum likelihood (CML). Segmentations obtained using networks trained by CML show a statistically significant improvement in performance on synthetic images. We also demonstrate the methods on a real-world outdoor-scene segmentation task.
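When both parent and child labels are observed, the maximum-likelihood CPT estimate is just normalised counts (the EM machinery is needed only when higher-level labels are hidden). A counting sketch with invented labels, using add-one smoothing to avoid zero probabilities:

```python
import numpy as np

# Toy fully observed (parent_label, child_label) pairs from adjacent
# levels of a quadtree; labels are class indices 0..2.
pairs = np.array([[0, 0], [0, 0], [0, 1], [1, 1], [1, 1],
                  [1, 2], [2, 2], [2, 2], [2, 2], [0, 0]])
n_classes = 3

# Maximum-likelihood CPT: P(child | parent) from normalised counts,
# with add-one (Laplace) smoothing.
counts = np.ones((n_classes, n_classes))
for parent, child in pairs:
    counts[parent, child] += 1
cpt = counts / counts.sum(axis=1, keepdims=True)
```

Conditional maximum likelihood, by contrast, optimises P(labels | pixel evidence) directly and has no such closed form, which is why the paper resorts to iterative training for the CML variant.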
Abstract:
It has been argued that a single two-dimensional visualization plot may not be sufficient to capture all of the interesting aspects of complex data sets, and therefore a hierarchical visualization system is desirable. In this paper we extend an existing locally linear hierarchical visualization system, PhiVis (Bishop et al., 1998), in several directions: (1) We allow for non-linear projection manifolds. The basic building block is the Generative Topographic Mapping (GTM). (2) We introduce a general formulation of hierarchical probabilistic models consisting of local probabilistic models organized in a hierarchical tree. General training equations are derived, regardless of the position of the model in the tree. (3) Using tools from differential geometry, we derive expressions for local directional curvatures of the projection manifold. Like PhiVis, our system is statistically principled and is built interactively in a top-down fashion using the EM algorithm. It enables the user to interactively highlight those data in the ancestor visualization plots which are captured by a child model. We also incorporate into our system a hierarchical, locally selective representation of magnification factors and directional curvatures of the projection manifolds. Such information is important for further refinement of the hierarchical visualization plot, as well as for controlling the amount of regularization imposed on the local models. We demonstrate the principle of the approach on a toy data set and apply our system to two more complex 12- and 18-dimensional data sets.
Abstract:
The problem of resource allocation in sparse graphs with real variables is studied using methods of statistical physics. An efficient distributed algorithm is devised on the basis of insight gained from the analysis and is examined using numerical simulations, showing excellent performance and full agreement with the theoretical results.
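As a loose illustration of a distributed algorithm for balancing resources on a sparse graph (not the statistical-physics-derived algorithm of the paper), neighbouring nodes can repeatedly exchange a fraction of their surplus difference until demands are met. The graph, rates, and resource values below are invented.

```python
import numpy as np

# Toy sparse graph: node resources (negative = demand) and an edge list.
resource = np.array([3.0, -1.0, -2.0, 0.0])
edges = [(0, 1), (1, 2), (2, 3), (0, 3)]

# Each edge repeatedly shifts a fraction of the local surplus
# difference between its endpoints; flow[(i, j)] records the
# cumulative amount moved from node i to node j.
flow = {e: 0.0 for e in edges}
for _ in range(200):
    for (i, j) in edges:
        shift = 0.25 * (resource[i] - resource[j])
        resource[i] -= shift
        resource[j] += shift
        flow[(i, j)] += shift
```

The scheme is purely local (each update touches one edge), conserves total resource, and on a connected graph converges to a uniform allocation; the paper's algorithm additionally minimises a transportation cost, which this sketch does not attempt.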