51 results for absorbing sets
Abstract:
Complexity is conventionally defined as the level of detail or intricacy contained within a picture. The study of complexity has received relatively little attention, in part because of the absence of an acceptable metric. Traditionally, normative ratings of complexity have been based on human judgments. However, this study demonstrates that published norms for visual complexity are biased. Familiarity and learning influence the subjective complexity scores for nonsense shapes, with a significant training × familiarity interaction [F(1,52) = 17.53, p < .05]. Several image-processing techniques were explored as alternative measures of picture and image complexity. A perimeter detection measure correlates strongly with human judgments of the complexity of line drawings of real-world objects and nonsense shapes, and captures some of the processes important in judgments of subjective complexity while removing the bias due to familiarity effects.
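The abstract does not specify the exact perimeter detection algorithm, but the idea can be sketched as counting boundary pixels in a binarized image: the more perimeter a shape has relative to its area, the more intricate it looks. A minimal illustration (the threshold, the 4-neighbour boundary rule, and the test shapes are all assumptions, not the paper's method):

```python
import numpy as np

def perimeter_complexity(img, threshold=0.5):
    """Estimate visual complexity as the fraction of boundary pixels.

    A pixel counts toward the perimeter when its binarized value differs
    from any of its 4-neighbours (a simple boundary-detection scheme)."""
    b = (img > threshold).astype(int)
    # replicate-pad so border pixels compare against themselves
    p = np.pad(b, 1, mode="edge")
    edges = (
        (p[1:-1, 1:-1] != p[:-2, 1:-1]) |   # up
        (p[1:-1, 1:-1] != p[2:, 1:-1]) |    # down
        (p[1:-1, 1:-1] != p[1:-1, :-2]) |   # left
        (p[1:-1, 1:-1] != p[1:-1, 2:])      # right
    )
    return edges.mean()

# A plain square has far less boundary than a jagged scatter of equal scale,
# so the measure should rank the scatter as more complex.
square = np.zeros((32, 32))
square[8:24, 8:24] = 1
rng = np.random.default_rng(0)
jagged = np.zeros((32, 32))
jagged[rng.integers(0, 32, 256), rng.integers(0, 32, 256)] = 1
```

Unlike familiarity-sensitive human ratings, this score depends only on the pixels, which is the property the abstract highlights.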
Abstract:
Motivation: Recently, many univariate and several multivariate approaches have been suggested for testing differential expression of gene sets between different phenotypes. However, despite a wealth of literature studying their performance on simulated and real biological data, there is still a need to quantify their relative performance when the tests address different null hypotheses.
Results: In this article, we compare the performance of univariate and multivariate tests on both simulated and biological data. In the simulation study we demonstrate that high correlations equally affect the power of both univariate and multivariate tests. In addition, for most of them the power is similarly affected by the dimensionality of the gene set and by the percentage of genes in the set whose expression changes between the two phenotypes. The application of different test statistics to biological data reveals that three statistics (sum of squared t-tests, Hotelling's T², N-statistic), testing different null hypotheses, find some common but also some complementary differentially expressed gene sets under specific settings. This demonstrates that, owing to their complementary null hypotheses, each test probes different aspects of the data, and for the analysis of biological data it is beneficial to use all three tests simultaneously instead of focusing exclusively on just one.
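Two of the three statistics named above are easy to state concretely. A minimal sketch (the sample sizes, gene-set size, and simulated shift are illustrative assumptions; the paper's exact estimators and permutation scheme are not given here):

```python
import numpy as np

def hotelling_t2(X, Y):
    """Two-sample Hotelling's T^2 for a gene set.
    X, Y: (samples x genes) expression matrices for the two phenotypes."""
    n1, n2 = len(X), len(Y)
    d = X.mean(axis=0) - Y.mean(axis=0)
    # pooled within-group covariance (genes x genes)
    S = ((n1 - 1) * np.cov(X, rowvar=False) +
         (n2 - 1) * np.cov(Y, rowvar=False)) / (n1 + n2 - 2)
    return (n1 * n2) / (n1 + n2) * float(d @ np.linalg.solve(S, d))

def sum_sq_t(X, Y):
    """Sum of squared per-gene Welch t-statistics (ignores correlations)."""
    n1, n2 = len(X), len(Y)
    se2 = X.var(axis=0, ddof=1) / n1 + Y.var(axis=0, ddof=1) / n2
    t = (X.mean(axis=0) - Y.mean(axis=0)) / np.sqrt(se2)
    return float((t ** 2).sum())

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 5))             # phenotype A, a 5-gene set
Y_null = rng.normal(size=(30, 5))        # phenotype B, no shift
Y_alt = rng.normal(size=(30, 5)) + 2.0   # phenotype B, all genes shifted
```

The contrast between the two is exactly the point made in the simulation study: `hotelling_t2` accounts for the gene-gene covariance via `S`, while `sum_sq_t` treats genes independently, so correlation structure affects their power differently only through the null distribution, not the raw statistic.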
Abstract:
Hunter and Konieczny explored the relationships between measures of inconsistency for a belief base and the minimal inconsistent subsets of that belief base in several of their papers. In particular, an inconsistency value termed MIVC, defined from minimal inconsistent subsets, can be considered as a Shapley Inconsistency Value. Moreover, it can be axiomatized completely in terms of five simple axioms. MinInc, one of the five axioms, states that each minimal inconsistent set carries the same amount of conflict. However, this conflicts with the intuition illustrated by the lottery paradox: as the size of a minimal inconsistent belief base increases, its degree of inconsistency becomes smaller. To address this, we present two kinds of revised inconsistency measures for a belief base, derived from its minimal inconsistent subsets. Each of these measures considers the size of each minimal inconsistent subset as well as the number of minimal inconsistent subsets of the belief base. More specifically, we first present a vectorial measure to capture the inconsistency of a belief base, which is more discriminative than MIVC. Then we present a family of weighted inconsistency measures based on the vectorial inconsistency measure, which allow us to capture the inconsistency of a belief base in terms of a single numerical value as usual. We also show that each of the two kinds of revised inconsistency measures can be considered as a particular Shapley Inconsistency Value, and can be axiomatically characterized by the corresponding revised axioms presented in this paper.
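To make the objects concrete: given the minimal inconsistent subsets (MISes) of a belief base, the Shapley value MIVC assigns each formula the sum of 1/|M| over the MISes M containing it, and a size-sensitive revision can down-weight larger MISes. A toy sketch, assuming formulas are opaque labels and the MISes are already computed (the 1/k weighting is one illustrative choice, not necessarily the paper's):

```python
from fractions import Fraction

def mivc(formula, mis_list):
    """Shapley inconsistency value MIV_C: each minimal inconsistent
    subset M that contains the formula contributes 1/|M|."""
    return sum((Fraction(1, len(M)) for M in mis_list if formula in M),
               Fraction(0))

def weighted_measure(mis_list, weight=lambda k: Fraction(1, k)):
    """A size-sensitive base-level measure: a minimal inconsistent subset
    of size k contributes weight(k), so larger subsets count for less --
    in the spirit of the lottery-paradox intuition above."""
    return sum((weight(len(M)) for M in mis_list), Fraction(0))

# hypothetical belief base with two minimal inconsistent subsets
mises = [frozenset({"a", "not_a"}),
         frozenset({"b", "c", "not_bc"})]
```

With this weighting the 3-element conflict contributes only 1/3, whereas under MinInc every MIS would contribute equally regardless of size.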
Abstract:
Studies of HeLa cells and serum- and glucocorticoid-regulated kinase 1 (SGK1) knockout mice identified threonine residues in the n-myc downstream-regulated gene 1 protein (NDRG1-Thr(346/356/366)) that are phosphorylated by SGK1 but not by related kinases (Murray et al., Biochem J 385:1-12, 2005). We have, therefore, monitored the phosphorylation of NDRG1-Thr(346/356/366) in order to explore the changes in SGK1 activity associated with the induction and regulation of the glucocorticoid-dependent Na+ conductance (G(Na)) in human airway epithelial cells. Transient expression of active (SGK1-S422D) and inactive (SGK1-K127A) SGK1 mutants confirmed that activating SGK1 stimulates NDRG1-Thr(346/356/366) phosphorylation. Although G(Na) is negligible in hormone-deprived cells, these cells displayed basal SGK1 activity that was sensitive to LY294002, an inhibitor of phosphoinositide 3-kinase (PI3K). Dexamethasone (0.2 μM) acutely activated SGK1, the peak of this response (2-3 h) coincided with the induction of G(Na), and both responses were PI3K-dependent. While these data suggest that SGK1 might mediate the rise in G(Na), transient expression of the inactive SGK1-K127A mutant did not affect the hormonal induction of G(Na), although it did suppress the activation of SGK1. Dexamethasone-treated cells grown on permeable supports formed confluent epithelial sheets that generated short-circuit current due to electrogenic Na+ absorption. Forskolin and insulin both stimulated this current, and the response to insulin, but not forskolin, was LY294002-sensitive and associated with the activation of SGK1. While these data suggest that SGK1 is involved in the control of G(Na), its role may be minor, which could explain why sgk1 knockout has different effects upon different tissues.
Abstract:
We construct a bounded linear operator on a separable, reflexive and strictly convex Banach space whose resolvent norm is constant in a neighbourhood of zero.
Abstract:
In the present paper, we introduce a notion of a style representing abstract, complex objects whose characteristics can be represented as structured objects. Furthermore, we provide some mathematical properties of such styles. As a main result, we present a novel approach to perform a meaningful comparative analysis of such styles by defining and using graph-theoretic measures. We compare two styles by structurally comparing the underlying feature sets, which represent sets of graphs. To determine the structural similarity between the underlying graphs, we use graph similarity measures that are computationally efficient. More precisely, in order to compare styles, we map each feature set to a so-called median graph and compare the resulting median graphs. As an application, we perform an experimental study to compare special styles representing sets of undirected graphs and present numerical results thereof.
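The pipeline described above — map each style (a set of graphs) to a median graph, then compare the medians — can be sketched with a deliberately cheap stand-in for the paper's similarity measures. Here a graph is a frozenset of undirected edges, the distance is edge-set Jaccard distance, and the "median" is the set median (the member minimizing total distance to the others); all three choices are illustrative assumptions:

```python
def graph_distance(g1, g2):
    """Jaccard distance between edge sets: a cheap, illustrative proxy
    for structural graph dissimilarity (0 = identical edge sets)."""
    g1, g2 = set(g1), set(g2)
    union = g1 | g2
    return 0.0 if not union else 1.0 - len(g1 & g2) / len(union)

def set_median(graphs):
    """Set median graph: the member of the set minimizing the summed
    distance to all other members."""
    return min(graphs, key=lambda g: sum(graph_distance(g, h) for h in graphs))

def compare_styles(style_a, style_b):
    """Compare two styles (sets of graphs) via their median graphs only,
    avoiding the quadratic all-pairs comparison between the styles."""
    return graph_distance(set_median(style_a), set_median(style_b))

# two hypothetical styles over undirected graphs with integer vertices
style_a = [frozenset({(1, 2), (2, 3)}),
           frozenset({(1, 2)}),
           frozenset({(1, 2), (2, 3)})]
style_b = [frozenset({(1, 2), (3, 4)})]
```

Reducing each style to a single representative before comparing is what keeps the analysis computationally efficient, at the cost of ignoring within-style variance.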
Absorbing new knowledge in small and medium-sized enterprises: A multiple case analysis of Six Sigma
Abstract:
The primary aim of this article is to critically analyse the development of Six Sigma theory and practice within small and medium-sized enterprises (SMEs) using a multiple case study approach. The article also explores the subsequent development of Lean Six Sigma as a means of addressing the perceived limitations of the efficacy of Six Sigma in this context. The overarching theoretical framework is that of absorptive capacity, where Six Sigma is conceptualized as new knowledge to be absorbed by smaller firms. The findings from a multiple case study involving repeat interviews and focus groups informed the development of an analytical model demonstrating the dynamic routines underlying the absorptive capacity process, as well as a number of summative propositions relating the characteristics of SMEs to Six Sigma and Lean Six Sigma implementation.