847 results for computational creativity
Abstract:
In this article, we offer a new way of exploring the relationships between three dimensions of a business operation: the stage of business development, the methods of creativity, and the major cultural values. Although each of these has separately gained enormous attention from the management research community, as evidenced by a large volume of research studies, few studies attempt to describe the logic that connects these three important aspects of a business, let alone provide empirical evidence supporting any significant relationships among them. The paper also provides a data set and an empirical investigation of it, using categorical data analysis, to show that examining these possible relationships is meaningful and feasible even for seemingly unquantifiable information. The results further show that the most significant category among all creativity methods employed in Vietnamese enterprises is the “creative disciplines” rule in the “entrepreneurial phase,” while creative disciplines in general play a critical role in explaining the structure of our data sample for both stages of development under consideration.
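The abstract does not state the exact test, but a minimal sketch of the kind of categorical analysis it describes, checking for association between development stage and creativity method in a contingency table, could look like this (all counts are invented for illustration):

```python
# Hypothetical sketch of testing association between development stage and
# creativity method with a chi-square test on a contingency table.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: entrepreneurial vs. established phase.
# Columns: "creative disciplines", "cross-discipline", "other" methods.
counts = np.array([
    [42, 18, 11],
    [25, 22, 19],
])

chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi2={chi2:.2f}, dof={dof}, p={p:.4f}")
# A small p-value would indicate that creativity method and development
# stage are associated, the kind of relationship the paper examines.
```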
Abstract:
Strikingly, most of the literature suggests that market competition pushes firms to take creativity and innovation seriously as a matter of survival. Using our data, we examined creativity methods (Napier and Nilsson, 2008; Napier, 2010) in conjunction with three influential cultural values (risk tolerance, relationships, and dependence on resources) to assess how they influence entrepreneurs' decisions. The primary objective of this study is the perceived value of entrepreneurship and creativity in business conducted within a turbulent environment. Our initial hypothesis is that a typical entrepreneurial process carries with it “creativity-enabling elements.” In normal conditions, when businesses focus on optimizing their resources for commercial gain, perceptions of the value of entrepreneurial creativity are usually vague. In difficult times and harsh competition, however, the difference between survival and failure may be creativity. This paper also examines many previous findings on both entrepreneurship and creativity and suggests a highly plausible “organic growth” of creativity in an entrepreneurial environment, with the value of entrepreneurship reinforced when creative power is present; in other words, each idea reinforces the other. We use data from a survey of Vietnamese firms during the chaotic economic year 2012 to learn about the ‘entrepreneurship-creativity nexus.’ A data set of 137 responses qualified for statistical examination was obtained from an online survey sent to local entrepreneurs and corporate managers through social networks, which ran from February 16 to May 24, 2012. The authors employed categorical data analysis (Agresti, 2002; Azen & Walker, 2011). Statistical analyses confirm that in business operations, creativity and entrepreneurial spirit can hardly be separated, and that this holds not only for entrepreneurial firms but also for well-established companies. The single most important factor before business startup and during early implementation in Vietnam is what we call “connection/relationship.” Businesspeople are, however, increasingly aware of the need for creativity and innovation.
Abstract:
Creativity is often defined as developing something novel that fits its context and has value. To achieve this, the creative process itself has gained increasing attention as organizational leaders seek competitive advantage through developing new products, services, processes, or business models. In this paper, we explore the notion that the creative process includes a series of “filters,” or ways of processing information, as a critical component. We use the metaphor of coffee making and filters because many of our examples come from Vietnam, one of the world’s top coffee exporters, which has created a coffee culture rivaling that of many other countries. We begin with a brief review of the creative process and its connection to information processing, propose a tentative framework for integrating the two ideas, and provide examples of how it might work. We close with implications for further practical and theoretical directions for this idea.
Abstract:
We report a comprehensive study of the binary systems of the platinum-group metals with the transition metals, using high-throughput first-principles calculations. These computations predict the stability of new compounds in 28 binary systems where no compounds have been reported experimentally, and a few dozen as-yet-unreported compounds in additional systems. Our calculations also identify stable structures at compound compositions that have previously been reported without detailed structural data, and indicate that some experimentally reported compounds may actually be unstable at low temperatures. With these results, we construct enhanced structure maps for the binary alloys of platinum-group metals. These maps are much more complete, systematic, and predictive than those based on empirical results alone.
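The stability criterion is not spelled out in the abstract, but in high-throughput thermodynamics a compound is conventionally called stable when its formation energy lies on the lower convex hull of energy versus composition. A minimal sketch of that check for a hypothetical A-B system (all energies made up):

```python
# Convex-hull stability check for a binary A-B system (illustrative values).
import numpy as np
from scipy.spatial import ConvexHull

# (composition x_B, formation energy in eV/atom) -- hypothetical data.
names = ["A", "A3B", "AB", "AB2", "B"]
pts = np.array([
    [0.000,  0.000],
    [0.250, -0.180],
    [0.500, -0.310],
    [0.667, -0.190],   # shallow enough that it should decompose
    [1.000,  0.000],
])

hull = ConvexHull(pts)
# A facet whose outward normal points downward in energy (ny < 0) belongs
# to the lower envelope; its vertices are the stable phases.
lower = set()
for simplex, eq in zip(hull.simplices, hull.equations):
    nx, ny, offset = eq
    if ny < 0:
        lower.update(simplex)

print("stable:", [names[i] for i in sorted(lower)])
print("unstable:", [n for i, n in enumerate(names) if i not in lower])
```

Here AB2 lies above the tie-line between AB and B, so it is predicted to decompose into that two-phase mixture.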
Abstract:
Proteins are essential components of cells and are crucial for catalyzing reactions, signaling, recognition, motility, recycling, and structural stability. This diversity of function suggests that nature is only scratching the surface of protein functional space. Protein function is determined by structure, which in turn is determined predominantly by amino acid sequence. Protein design aims to explore protein sequence and conformational space to design novel proteins with new or improved function. The vast number of possible protein sequences makes exploring the space a challenging problem.
Computational structure-based protein design (CSPD) allows for the rational design of proteins. Because of the large search space, CSPD methods must balance search accuracy and modeling simplifications. We have developed algorithms that allow for the accurate and efficient search of protein conformational space. Specifically, we focus on algorithms that maintain provability, account for protein flexibility, and use ensemble-based rankings. We present several novel algorithms for incorporating improved flexibility into CSPD with continuous rotamers. We applied these algorithms to two biomedically important design problems. We designed peptide inhibitors of the cystic fibrosis agonist CAL that were able to restore function of the vital cystic fibrosis protein CFTR. We also designed improved HIV antibodies and nanobodies to combat HIV infections.
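The abstract does not name its provable search algorithms; a classic example from this family is dead-end elimination (DEE), which prunes side-chain rotamers that provably cannot occur in the minimum-energy conformation. A toy sketch of the discrete DEE criterion (all energies are random stand-ins, not from the dissertation):

```python
# Toy dead-end elimination (DEE) sketch: rotamer r at position i is pruned
# if some alternative t satisfies
#   E(i_r) - E(i_t) + sum_{j != i} min_s [ E(i_r, j_s) - E(i_t, j_s) ] > 0,
# i.e., t beats r no matter which rotamers the other positions choose.
import itertools
import random

random.seed(0)
n_pos, n_rot = 3, 3   # tiny design: 3 positions, 3 rotamers each

# Hypothetical self (one-body) and pairwise (two-body) energies.
E_self = [[random.uniform(0.0, 2.0) for _ in range(n_rot)]
          for _ in range(n_pos)]
E_pair = {(i, j): [[random.uniform(-1.0, 1.0) for _ in range(n_rot)]
                   for _ in range(n_rot)]
          for i, j in itertools.combinations(range(n_pos), 2)}

def pair(i, r, j, s):
    """Pairwise energy lookup, symmetric in the two positions."""
    return E_pair[(i, j)][r][s] if i < j else E_pair[(j, i)][s][r]

def dee_prunable(i, r):
    for t in range(n_rot):
        if t == r:
            continue
        gap = E_self[i][r] - E_self[i][t]
        for j in range(n_pos):
            if j != i:
                gap += min(pair(i, r, j, s) - pair(i, t, j, s)
                           for s in range(n_rot))
        if gap > 0:   # t dominates r in every context
            return True
    return False

pruned = [(i, r) for i in range(n_pos) for r in range(n_rot)
          if dee_prunable(i, r)]
print("provably eliminated (position, rotamer):", pruned)
```

The continuous-rotamer and ensemble-based methods the abstract mentions extend this discrete criterion, but the dominance logic above is the core of provable pruning.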
Abstract:
Determining how information flows along anatomical brain pathways is a fundamental requirement for understanding how animals perceive their environments, learn, and behave. Attempts to reveal such neural information flow have been made using linear computational methods, but neural interactions are known to be nonlinear. Here, we demonstrate that a dynamic Bayesian network (DBN) inference algorithm we originally developed to infer nonlinear transcriptional regulatory networks from gene expression data collected with microarrays is also successful at inferring nonlinear neural information flow networks from electrophysiology data collected with microelectrode arrays. The inferred networks we recover from the songbird auditory pathway are correctly restricted to a subset of known anatomical paths, are consistent with timing of the system, and reveal both the importance of reciprocal feedback in auditory processing and greater information flow to higher-order auditory areas when birds hear natural as opposed to synthetic sounds. A linear method applied to the same data incorrectly produces networks with information flow to non-neural tissue and over paths known not to exist. To our knowledge, this study represents the first biologically validated demonstration of an algorithm to successfully infer neural information flow networks.
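The authors' exact DBN algorithm is not given here, but a minimal sketch of score-based DBN inference, choosing for each channel the lagged parent set that maximizes a BIC-penalized likelihood over discretized activity, conveys the idea (data and scoring details are illustrative):

```python
# Sketch of score-based dynamic Bayesian network (DBN) inference: each
# channel at time t+1 gets the lagged parent set that maximizes a
# BIC-penalized multinomial likelihood over discretized activity levels.
import itertools
import numpy as np

rng = np.random.default_rng(0)
T, n_ch, levels = 500, 4, 3
data = rng.integers(0, levels, size=(T, n_ch))  # stand-in discretized recordings

def bic_score(target, parents, data, levels):
    """BIC of P(target_{t+1} | parents_t) as a multinomial CPT."""
    child = data[1:, target]
    if parents:
        # Encode each parent configuration as a single integer index.
        conf = np.zeros(T - 1, dtype=int)
        for p in parents:
            conf = conf * levels + data[:-1, p]
        n_conf = levels ** len(parents)
    else:
        conf, n_conf = np.zeros(T - 1, dtype=int), 1
    loglik = 0.0
    for c in range(n_conf):
        counts = np.bincount(child[conf == c], minlength=levels)
        n = counts.sum()
        if n:
            probs = counts / n
            loglik += (counts[counts > 0] * np.log(probs[probs > 0])).sum()
    penalty = 0.5 * np.log(T - 1) * n_conf * (levels - 1)
    return loglik - penalty

max_parents = 2  # self-edges (a channel's own past) are allowed in a DBN
for target in range(n_ch):
    candidates = [ps for k in range(max_parents + 1)
                  for ps in itertools.combinations(range(n_ch), k)]
    best = max(candidates, key=lambda ps: bic_score(target, ps, data, levels))
    print(f"channel {target}: inferred parents {best}")
```

With random stand-in data the empty parent set typically wins; on real recordings, structured parent sets would emerge, and anatomical constraints and multiple lags would restrict the search.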
Abstract:
Our media is saturated with claims of "facts" made from data. Database research has in the past focused on how to answer queries, but has not devoted much attention to discerning more subtle qualities of the resulting claims, e.g., is a claim "cherry-picking"? This paper proposes a Query Response Surface (QRS) based framework that models claims based on structured data as parameterized queries. A key insight is that we can learn a lot about a claim by perturbing its parameters and seeing how its conclusion changes. This framework lets us formulate and tackle practical fact-checking tasks, such as reverse-engineering vague claims and countering questionable claims, as computational problems. Within the QRS framework, we go one step further and propose a problem, along with efficient algorithms, for finding high-quality claims of a given form from data, i.e., raising good questions in the first place. This is achieved by using a limited number of high-valued claims to represent high-valued regions of the QRS. Besides the general-purpose high-quality claim-finding problem, lead-finding can be tailored toward specific claim quality measures, also defined within the QRS framework. An example of uniqueness-based lead-finding is presented for "one-of-the-few" claims, yielding interpretable high-quality claims and an adjustable mechanism for ranking objects, e.g., NBA players, based on what claims can be made for them. Finally, we study the use of visualization as a powerful way of conveying the results of a large number of claims. An efficient two-stage sampling algorithm is proposed for generating the input of a 2D scatter plot with a heatmap, evaluating only a limited amount of data while preserving two essential visual features, namely outliers and clusters. For all these problems, we present real-world examples and experiments that demonstrate the power of our model, the efficiency of our algorithms, and the usefulness of their results.
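A minimal sketch of the key insight, perturbing a claim's parameters and watching the response, might look as follows (the data, the query, and the robustness measure are all hypothetical, not the paper's definitions):

```python
# Toy illustration of the QRS insight: treat a claim as a parameterized
# query and perturb its parameters to see how the conclusion holds up.
import numpy as np

rng = np.random.default_rng(1)
monthly = rng.normal(100, 15, size=120)   # stand-in monthly statistic

def claim_value(start, length):
    """Parameterized query: mean over a window of the series."""
    return monthly[start:start + length].mean()

# Stated claim: the average over one particular 12-month window was high.
s0, L0 = 96, 12
stated = claim_value(s0, L0)

# Perturb start and length around the stated parameters.
variants = [claim_value(s, L)
            for s in range(max(0, s0 - 12), s0 + 1)
            for L in (6, 12, 18, 24)
            if s + L <= len(monthly)]
share_beaten = np.mean([stated >= v for v in variants])
print(f"stated value {stated:.1f} is >= {share_beaten:.0%} of nearby variants")
# A claim that looks strong only at its exact parameters, with a sharp drop
# under small perturbations, is a candidate for cherry-picking.
```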
Abstract:
With increasing recognition of the roles RNA molecules and RNA/protein complexes play in an unexpected variety of biological processes, understanding of RNA structure-function relationships is of high current importance. To make clean biological interpretations from three-dimensional structures, it is imperative to have high-quality, accurate RNA crystal structures available, and the community has thoroughly embraced that goal. However, due to the many degrees of freedom inherent in RNA structure (especially for the backbone), it is a significant challenge to succeed in building accurate experimental models for RNA structures. This chapter describes the tools and techniques our research group and our collaborators have developed over the years to help RNA structural biologists both evaluate and achieve better accuracy. Expert analysis of large, high-resolution, quality-conscious RNA datasets provides the fundamental information that enables automated methods for robust and efficient error diagnosis in validating RNA structures at all resolutions. The even more crucial goal of correcting the diagnosed outliers has steadily developed toward highly effective, computationally based techniques. Automation enables solving complex issues in large RNA structures, but cannot circumvent the need for thoughtful examination of local details, and so we also provide some guidance for interpreting and acting on the results of current structure validation for RNA.
Abstract:
Transcriptional regulation has been studied intensively in recent decades. One important aspect of this regulation is the interaction between regulatory proteins, such as transcription factors (TF) and nucleosomes, and the genome. Different high-throughput techniques have been invented to map these interactions genome-wide, including ChIP-based methods (ChIP-chip, ChIP-seq, etc.), nuclease digestion methods (DNase-seq, MNase-seq, etc.), and others. However, a single experimental technique often only provides partial and noisy information about the whole picture of protein-DNA interactions. Therefore, the overarching goal of this dissertation is to provide computational developments for jointly modeling different experimental datasets to achieve a holistic inference on the protein-DNA interaction landscape.
We first present a computational framework that can incorporate the protein binding information in MNase-seq data into a thermodynamic model of protein-DNA interaction. We use a correlation-based objective function to model the MNase-seq data and a Markov chain Monte Carlo method to maximize the function. Our results show that the inferred protein-DNA interaction landscape is concordant with the MNase-seq data and provides a mechanistic explanation for the experimentally collected MNase-seq fragments. Our framework is flexible and can easily incorporate other data sources. To demonstrate this flexibility, we use prior distributions to integrate experimentally measured protein concentrations.
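As a rough illustration of this paragraph's recipe (a correlation-based objective maximized by Markov chain Monte Carlo), here is a toy Metropolis loop over the parameters of a stand-in occupancy model; the model, features, and data are invented for the sketch:

```python
# Sketch of maximizing a correlation-based objective with MCMC
# (Metropolis updates over a parameter vector of a toy occupancy model).
import numpy as np

rng = np.random.default_rng(2)
n_bins = 200
observed = rng.poisson(5, n_bins).astype(float)  # stand-in MNase-seq coverage
features = rng.normal(size=(n_bins, 3))          # stand-in sequence features

def predicted_occupancy(theta):
    """Toy thermodynamic model: occupancy from a Boltzmann-like weight."""
    energy = features @ theta
    return 1.0 / (1.0 + np.exp(energy))

def objective(theta):
    """Pearson correlation between predicted occupancy and observed data."""
    return np.corrcoef(predicted_occupancy(theta), observed)[0, 1]

theta = rng.normal(scale=0.5, size=3)  # nonzero start avoids a flat profile
score = objective(theta)
temperature = 0.05
for step in range(5000):
    proposal = theta + rng.normal(scale=0.1, size=3)
    new_score = objective(proposal)
    # Metropolis rule: always accept improvements; sometimes accept worse.
    if np.log(rng.uniform()) < (new_score - score) / temperature:
        theta, score = proposal, new_score

print(f"final correlation: {score:.3f}, theta: {theta.round(2)}")
```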
We also study the ability of DNase-seq data to position nucleosomes. Traditionally, DNase-seq has only been widely used to identify DNase hypersensitive sites, which tend to be open chromatin regulatory regions devoid of nucleosomes. We reveal for the first time that DNase-seq datasets also contain substantial information about nucleosome translational positioning, and that existing DNase-seq data can be used to infer nucleosome positions with high accuracy. We develop a Bayes-factor-based nucleosome scoring method to position nucleosomes using DNase-seq data. Our approach utilizes several effective strategies to extract nucleosome positioning signals from the noisy DNase-seq data, including jointly modeling data points across the nucleosome body and explicitly modeling the quadratic and oscillatory DNase I digestion pattern on nucleosomes. We show that our DNase-seq-based nucleosome map is highly consistent with previous high-resolution maps. We also show that the oscillatory DNase I digestion pattern is useful in revealing the nucleosome rotational context around TF binding sites.
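A simplified sketch of a Bayes-factor-style nucleosome score: with model parameters held fixed it reduces to a log-likelihood ratio between a nucleosome cut-rate profile (quadratic trend plus a roughly 10.25 bp oscillation, echoing the patterns described above) and a flat background. Profile shapes and rates are illustrative only:

```python
# Compare the likelihood of DNase I cut counts across a 147 bp window under
# a nucleosome model versus a flat background (all parameters invented).
import numpy as np
from scipy.stats import poisson

L = 147
pos = np.arange(L)
center = (L - 1) / 2

# Nucleosome model: cuts depleted overall, lowest at the dyad (quadratic),
# with an oscillation reflecting the rotational positioning of the helix.
quadratic = 0.5 + 2.0 * ((pos - center) / center) ** 2
oscillation = 1.0 + 0.4 * np.cos(2 * np.pi * pos / 10.25)
rate_nuc = 0.8 * quadratic * oscillation

# Background model: uniform cut rate with the same mean.
rate_bg = np.full(L, rate_nuc.mean())

def log_score(cuts):
    """Log 'Bayes factor' with fixed parameters (a log-likelihood ratio)."""
    return (poisson.logpmf(cuts, rate_nuc).sum()
            - poisson.logpmf(cuts, rate_bg).sum())

rng = np.random.default_rng(3)
cuts_from_nuc = rng.poisson(rate_nuc)   # window truly covering a nucleosome
cuts_from_bg = rng.poisson(rate_bg)     # open-chromatin window
print(f"nucleosome window score: {log_score(cuts_from_nuc):+.1f}")
print(f"background window score: {log_score(cuts_from_bg):+.1f}")
```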
Finally, we present a state-space model (SSM) for jointly modeling different kinds of genomic data to provide an accurate view of the protein-DNA interaction landscape. We also provide an efficient expectation-maximization algorithm to learn model parameters from data. We first show in simulation studies that the SSM can effectively recover underlying true protein binding configurations. We then apply the SSM to model real genomic data (both DNase-seq and MNase-seq data). Through incrementally increasing the types of genomic data in the SSM, we show that different data types can contribute complementary information for the inference of protein binding landscape and that the most accurate inference comes from modeling all available datasets.
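As a stand-in for the SSM, a two-state hidden Markov model (bound/unbound) with Gaussian emissions from two data tracks, fit by a few EM iterations, illustrates how joint modeling lets complementary data sources sharpen the inferred binding track (a simplification, not the dissertation's model):

```python
# Two-state HMM with Gaussian emissions from two tracks (e.g., DNase-seq
# and MNase-seq stand-ins), fit with EM; transitions held fixed for brevity.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
T = 300
true_state = (np.sin(np.arange(T) / 15) > 0.3).astype(int)  # toy binding track
track1 = rng.normal(true_state * 2.0, 1.0)    # e.g., DNase signal
track2 = rng.normal(true_state * -1.5, 1.0)   # e.g., MNase signal

A = np.array([[0.9, 0.1], [0.1, 0.9]])        # fixed transition matrix
means = np.array([[0.0, 0.0], [1.0, -1.0]])   # per-state means, both tracks

for _ in range(20):  # EM iterations
    # E-step: forward-backward over the two emission tracks jointly.
    emis = (norm.pdf(track1[:, None], means[:, 0], 1.0)
            * norm.pdf(track2[:, None], means[:, 1], 1.0))  # shape (T, 2)
    alpha = np.zeros((T, 2))
    beta = np.ones((T, 2))
    alpha[0] = 0.5 * emis[0]
    alpha[0] /= alpha[0].sum()
    for t in range(1, T):
        alpha[t] = emis[t] * (alpha[t - 1] @ A)
        alpha[t] /= alpha[t].sum()
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (emis[t + 1] * beta[t + 1])
        beta[t] /= beta[t].sum()
    gamma = alpha * beta
    gamma /= gamma.sum(axis=1, keepdims=True)
    # M-step: re-estimate per-state emission means from soft assignments.
    w = gamma.sum(axis=0)
    means[:, 0] = gamma.T @ track1 / w
    means[:, 1] = gamma.T @ track2 / w

accuracy = ((gamma[:, 1] > 0.5) == true_state).mean()
print(f"posterior decoding agrees with truth at {accuracy:.0%} of positions")
```

Dropping either track from the emission product degrades the decoding, which is the complementarity argument in miniature.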
This dissertation provides a foundation for future research by taking a step toward genome-wide inference of the protein-DNA interaction landscape through data integration.
Abstract:
The adoption of antisense gene silencing as a novel disinfectant for prokaryotic organisms is hindered by poor silencing efficiencies. Few studies have considered the effects of off-targets on silencing efficiencies, especially in prokaryotic organisms. In this computational study, a novel algorithm was developed that determined and sorted the number of off-targets as a function of alignment length in Escherichia coli K-12 MG1655 and Mycobacterium tuberculosis H37Rv. The mean number of off-targets per single location was calculated to be 14.1 ± 13.3 and 36.1 ± 58.5 for the genomes of E. coli K-12 MG1655 and M. tuberculosis H37Rv, respectively. Furthermore, when the entire transcriptome was analyzed, it was found that there was no general gene location that could be targeted to minimize or maximize the number of off-targets. To determine the effects of off-targets on silencing efficiencies, previously published studies were used. Analyses with acpP, ino1, and marORAB revealed a statistically significant relationship between the number of short-alignment-length off-target hybrids and the efficacy of antisense gene silencing, suggesting that minimizing off-targets may be beneficial for antisense gene silencing in prokaryotic organisms.
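A minimal sketch of the counting idea, sliding windows of a given alignment length over the target and tallying exact matches elsewhere, could look like this (random stand-in sequences, not the study's genomes):

```python
# Count off-targets as a function of alignment length: slide a window over
# the antisense target site and count exact matches elsewhere in the genome.
import random

random.seed(5)
genome = "".join(random.choice("ACGT") for _ in range(50_000))
target = genome[10_000:10_030]  # 30 nt target region of some gene

def count_off_targets(genome, target, align_len):
    """Exact-match off-target count over all windows of length align_len."""
    total = 0
    for i in range(len(target) - align_len + 1):
        seed = target[i:i + align_len]
        # Genome-wide occurrences minus the intended site itself
        # (str.count is non-overlapping, adequate for a sketch).
        total += max(genome.count(seed) - 1, 0)
    return total

for k in (7, 9, 11, 13):
    print(f"alignment length {k:>2}: {count_off_targets(genome, target, k)} off-target hits")
# Shorter alignment lengths produce many more spurious hybridization sites,
# which is the quantity the study relates to silencing efficacy.
```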
Abstract:
While much research has focused on entrepreneurship and creativity in developed economies, both notions are still embryonic in many emerging economies. This paper focuses on entrepreneurs in one such economy, Vietnam, to understand their perceptions of the role that innovation and creativity may play in their own entrepreneurial ventures and success. This is important because, before reaping benefits from entrepreneurship, entrepreneurs need to decide when and under what conditions to start, based on their calculations of required resources and predictions of likely outcomes. The research also sought to understand how "creativity," broadly applied ("innovation" and "creative performance"), affects the ways entrepreneurs think about and anticipate their own success and decisions. In essence, the study suggests that the higher an entrepreneur's creativity, the more likely she or he is to start a new business and to believe success will result. Future research could examine whether history, industry, and geographic location matter in entrepreneurs' perceptions, as well as whether transition/emerging economies like Vietnam may have different views altogether about the two key concepts.
Abstract:
In the analysis of industrial processes, there is an increasing emphasis on systems governed by interacting continuum phenomena. Mathematical models of such multi-physics processes can be made practical for simulation only through computational solution procedures, that is, computational mechanics. Examples of such multi-physics systems in the context of metals processing are used to explore some of the key issues. Finite-volume methods on unstructured meshes are proposed as a means of achieving efficient, rapid solutions to such systems. Issues associated with software design, the exploitation of high-performance computers, and the concept of the virtual computational-mechanics modelling laboratory are also addressed in this context.
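As a minimal illustration of the finite-volume style the paper advocates (conservative cell balances with fluxes through faces), here is a toy 1D transient heat-conduction solver; the geometry, properties, and boundary values are invented:

```python
# 1D finite-volume sketch: transient heat conduction in a bar, explicit
# time stepping, conservative update from face fluxes.
import numpy as np

n, length = 50, 1.0            # control volumes, bar length (m)
dx = length / n
alpha = 1e-4                   # thermal diffusivity (m^2/s)
dt = 0.25 * dx**2 / alpha      # step small enough for explicit stability
T = np.full(n, 300.0)          # initial temperature (K)
T_left, T_right = 400.0, 300.0 # fixed wall temperatures

for _ in range(20_000):
    flux = np.zeros(n + 1)
    # Interior faces: central-difference diffusive flux between neighbours.
    flux[1:-1] = -alpha * (T[1:] - T[:-1]) / dx
    # Boundary faces: wall sits half a cell from the first/last centroid.
    flux[0] = -alpha * (T[0] - T_left) / (dx / 2)
    flux[-1] = -alpha * (T_right - T[-1]) / (dx / 2)
    # Conservative update: each cell changes by its net face flux.
    T -= dt / dx * (flux[1:] - flux[:-1])

print(f"near-steady profile: T[0]={T[0]:.1f} K, T[-1]={T[-1]:.1f} K")
```

The same flux-balance structure generalizes to unstructured meshes and coupled physics, which is where the paper's software-design concerns arise.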
Abstract:
One thing is (a) to develop a system that handles some task to one's satisfaction and also has a universally recognized mirthful side to its output. Another thing is (b) to provide an analysis of why you are getting such a byproduct. Yet another thing is (c) to develop a model that incorporates reflection about some phenomenon in humor for its own sake. This paper selects Alibi in particular for discussion, going on to describe the preliminaries of Columbus. The former, which fits (a), is a planner with an explanatory capability: it invents pretexts. This is no legal defense, but it is relevant to evidential thinking in AI & Law. Some of the output pretexts are mirthful, not in the sense that they are silly: they are not. A key factor seems to be the very alacrity with which the system explains away detail after detail of globally damning evidence. I attempt a reanalysis of Alibi in respect of (b). Columbus, in turn, fits (c); we introduce here the basics of this (unimplemented) model, developed to account for a sample text in parody.
Abstract:
A computational model of solder joint formation and the subsequent cooling behaviour is described. Given the rapid changes in printed circuit board technology, there is a requirement for comprehensive models of solder joint formation that permit detailed analysis of design and optimization options. Solder joint formation is complex, involving a range of interacting phenomena. This paper presents a model implementation (as part of a more comprehensive framework) covering shape formation (conditioned by surface tension), heat transfer, phase change, and the development of elastoviscoplastic stress. The computational modelling framework is based upon mixed finite element and finite volume procedures and uses unstructured meshes, enabling arbitrarily complex geometries to be analysed. Initial results for both through-hole and surface-mount geometries are presented.
Abstract:
There are many processes, particularly in the nuclear and metals processing industries, where electromagnetic fields are used to influence the flow behaviour of a fluid. Procedures exploiting finite volume (FV) methods on both structured and unstructured meshes have recently been developed which enable this influence to be modelled in the context of conventional FV CFD codes. A range of problems has been tackled by the authors, including electromagnetic pumps and brakes, weirs and dams in steelmaking tundishes, and interface effects in aluminium smelting cells. Two cases are presented here which exemplify the application of the new procedures. The first investigates the influence of electromagnetic fields on solidification front progression in a tin casting, and the second shows how the free surface of a liquid metal may be controlled through an externally imposed magnetic field in the semi-levitation casting process.