947 results for quantization artifacts


Relevance:

10.00%

Publisher:

Abstract:

Environmental chambers were designed for the accelerated ageing of materials used in artistic artifacts, in order to study the synergistic action of temperature, humidity, UV and visible radiation, and gaseous pollutants. Two stainless-steel/PTFE compartments are kept under controlled temperature and relative humidity; the measured values are transmitted to a PC, which stores them, plots them in real time, and continuously feeds back to the heating and humidifying devices through logic signals. A borosilicate or quartz window allows irradiation of the chamber interior from an external source. A flow of purified air purges the chamber and conveys selected pollutants from an external source. Each independent compartment works under either stationary or cyclic conditions.
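The feedback loop described above (sensor values sent to a PC, which returns logic signals to the heating and humidifying devices) can be sketched as simple on/off control with hysteresis. This is a minimal sketch with hypothetical setpoints and dead band; the abstract does not describe the actual control software:

```python
def control_step(reading, setpoint, band=0.5, device_on=False):
    """On/off (bang-bang) control with hysteresis.

    Returns the new state of a heating or humidifying device given the
    current sensor reading. `band` is the half-width of the dead band.
    """
    if reading < setpoint - band:
        return True            # below the dead band: switch the device on
    if reading > setpoint + band:
        return False           # above the dead band: switch the device off
    return device_on           # inside the dead band: keep the current state

# Example: temperature control around a hypothetical 25 degC setpoint
state = control_step(24.2, 25.0)                    # heater switched on
state = control_step(25.3, 25.0, device_on=state)   # inside the band: stays on
```

The dead band prevents the devices from chattering on and off around the setpoint, which matters for cyclic ageing programs that run for long periods.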

Relevance:

10.00%

Publisher:

Abstract:

The quantum harmonic oscillator is described by the Hermite equation.¹ The asymptotic solution is predominantly used to obtain its analytical solutions. The wave functions (solutions) are quadratically integrable when taken as the product of the convergent asymptotic solution (a Gaussian function) and a Hermite polynomial,¹ whose degree provides the associated quantum number. When the equation is solved numerically, quantization is observed as a real control variable is "tuned" to integer values. This can be interpreted by graphical inspection of Y(x) and |Y(x)|², without further mathematical analysis, and proves useful for teaching the fundamentals of quantum chemistry to undergraduates.
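The numerical "tuning" can be illustrated in a few lines. In dimensionless form the oscillator obeys y'' = (x² - ε)y, with eigenvalues at ε = 2n + 1; integrating outward from x = 0 and watching the sign of the diverging tail reveals the quantization. This is a sketch only: the grid size, integration range and even-parity start are choices made here, not taken from the paper:

```python
import numpy as np

def tail(eps, x_max=6.0, n=2000):
    """Integrate y'' = (x^2 - eps) * y outward from x = 0 and return y(x_max).

    Even-parity initial conditions y(0) = 1, y'(0) = 0 are used, so the
    even eigenvalues eps = 1, 5, 9, ... are the values at which the
    diverging tail changes sign.
    """
    x = np.linspace(0.0, x_max, n)
    h = x[1] - x[0]
    y = np.empty(n)
    y[0] = y[1] = 1.0                       # y'(0) = 0 for even states
    for i in range(1, n - 1):               # central-difference recursion
        y[i + 1] = 2 * y[i] - y[i - 1] + h * h * (x[i] ** 2 - eps) * y[i]
    return y[-1]

# Bracketing the ground state eps = 1: the tail diverges with opposite
# signs on either side of the eigenvalue, signalling quantization.
below, above = tail(0.9), tail(1.1)
```

Sweeping ε and locating the sign changes of the tail reproduces the integer "tuning" that the abstract describes, which students can read directly off plots of Y(x).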

Relevance:

10.00%

Publisher:

Abstract:

An investigation of galvanomagnetic effects in the GaAs/Mn/GaAs/In0.15Ga0.85As/GaAs nanostructure is presented. This nanostructure is classified as a diluted magnetic semiconductor (DMS). The temperature dependence of the transverse magnetoresistivity of the sample was studied. The anomalous Hall effect was detected and subtracted from the total Hall component. Special attention was paid to measurements of Shubnikov-de Haas oscillations, which exist only when the magnetic field is aligned perpendicular to the plane of the sample. This confirms the two-dimensional character of the hole energy spectrum in the quantum well. Important characteristics such as the cyclotron mass, the Fermi energy and the Dingle temperature were calculated from the experimental Shubnikov-de Haas data. The hole concentration and hole mobility in the quantum well were also estimated for the sample. At 4.2 K, spin splitting of the maxima of the transverse resistivity was observed, and the g-factor was calculated for that case. The values of the Dingle temperature were obtained by two different approaches. From the comparison of these values it was concluded that the broadening of the Landau levels in the investigated structure is mainly defined by the scattering of charge carriers on defects of the crystal lattice.
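For illustration, the Dingle-temperature extraction from the field dependence of Shubnikov-de Haas amplitudes can be sketched with the standard Lifshitz-Kosevich damping factors. All parameter values below (effective mass, fields, temperatures) are hypothetical stand-ins, not the measured values of this work:

```python
import numpy as np

kB, hbar, e = 1.381e-23, 1.055e-34, 1.602e-19   # SI constants
m_star = 0.40 * 9.109e-31   # assumed effective hole mass (hypothetical)
T, T_D = 4.2, 8.0           # lattice and Dingle temperatures, K

B = np.linspace(2.0, 8.0, 20)                   # magnetic fields, T
X = 2 * np.pi**2 * kB * T * m_star / (hbar * e * B)
# Lifshitz-Kosevich amplitude: thermal factor times Dingle damping factor
A = (X / np.sinh(X)) * np.exp(-2 * np.pi**2 * kB * T_D * m_star / (hbar * e * B))

# Dingle plot: ln(A * sinh(X) / X) is linear in 1/B; the slope gives T_D
slope, _ = np.polyfit(1.0 / B, np.log(A * np.sinh(X) / X), 1)
T_D_est = -slope * hbar * e / (2 * np.pi**2 * kB * m_star)
```

On real data, A would be the measured oscillation amplitude at each field; the same straight-line fit then recovers the Dingle temperature, and repeating the thermal-factor fit at several temperatures yields the cyclotron mass.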

Relevance:

10.00%

Publisher:

Abstract:

The aim of this research was to develop a framework for analyzing how the physical environment influences scientific creativity. Due to the relative novelty of this topic, there is still no unified method for studying the connection between the physical environment and creativity. Therefore, in order to study this issue in depth, qualitative methods were used (interviews and a qualitative questionnaire). Scientists (PhD students and senior researchers) of the Graduate School of Management were interviewed to build the model, and one expert interview was conducted to assess its validity. The model highlights several dimensions through which the physical environment can influence scientific creativity: Comfort, Instruments and Diversity. Comfort and Instruments are considered to relate mostly to productivity, an initial requirement for creativity, while Diversity is the factor responsible for supporting all stages of the scientific creative process. Thus, a creative physical environment is not one place by its nature, but an aggregative phenomenon. Because of its two levels of analysis, the model is named the two-level model of creative physical environment.

Relevance:

10.00%

Publisher:

Abstract:

In the present study, using noise-free simulated signals, we performed a comparative examination of several preprocessing techniques used to transform the cardiac event series into a regularly sampled time series appropriate for spectral analysis of heart rhythm variability (HRV). First, a group of noise-free simulated point event series, representing time series of heartbeats, was generated by an integral pulse frequency modulation (IPFM) model. To evaluate the performance of the preprocessing methods, the differences between the spectra of the preprocessed simulated signals and the true spectrum (the spectrum of the model's input modulating signals) were surveyed by visual analysis and by contrasting merit indices. Estimated spectra should match the true spectrum as closely as possible, showing a minimum of harmonic components and other artifacts. The merit indices proposed to quantify these mismatches were the leakage rate, defined as a measure of leakage components (located outside narrow windows centered at the frequencies of the model's input modulating signals) relative to the whole spectral content, and the numbers of leakage components with amplitudes greater than 1%, 5% and 10% of the total spectral content. Our data, obtained from a noise-free simulation, indicate that using heart rate values instead of heart period values in the derivation of signals representative of heart rhythm results in more accurate spectra. Furthermore, our data support the efficiency of the widely used preprocessing technique based on the convolution of inverse interval function values with a rectangular window, and suggest the technique based on cubic polynomial interpolation of inverse interval function values followed by spectral analysis as another efficient and fast method for the analysis of HRV signals.
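The cubic-interpolation route recommended above can be sketched as follows. The beat times are synthetic and the 4 Hz resampling rate is an assumption made here for illustration; the paper's IPFM-generated series are not reproduced:

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Synthetic beat (event) times in seconds
beat_times = np.array([0.0, 0.8, 1.7, 2.5, 3.2, 4.1, 4.9, 5.8, 6.6, 7.4])
intervals = np.diff(beat_times)                   # R-R intervals, s
inst_rate = 1.0 / intervals                       # inverse interval function, Hz
t_mid = 0.5 * (beat_times[:-1] + beat_times[1:])  # attach each value mid-interval

# Cubic interpolation onto a regular 4 Hz grid, then an FFT-based spectrum
fs = 4.0
t_reg = np.arange(t_mid[0], t_mid[-1], 1.0 / fs)
rate_reg = CubicSpline(t_mid, inst_rate)(t_reg)
detrended = rate_reg - rate_reg.mean()
spectrum = np.abs(np.fft.rfft(detrended)) ** 2    # power per frequency bin
freqs = np.fft.rfftfreq(len(detrended), d=1.0 / fs)
```

Leakage indices like those in the study would then be computed by comparing `spectrum` against the known modulating frequencies of the simulated input.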

Relevance:

10.00%

Publisher:

Abstract:

The current myogenesis and myofibrillogenesis model has been based mostly on in vitro cell culture studies and, to a lesser extent, on in situ studies in avian and mammalian embryos. While the more isolated, artificial conditions of cells in culture permit careful structural analysis, the actual in situ cellular structures have not been described in detail because the embryos are more difficult to section and manipulate. To overcome these difficulties, we used the optically clear and easy-to-handle embryos of the zebrafish Danio rerio. We monitored the expression of cytoskeletal and cell-adhesion proteins (actin, myosin, desmin, alpha-actinin, troponin, titin, vimentin and vinculin) using immunofluorescence microscopy and video-enhanced, background-subtracted differential interference contrast microscopy of 24- to 48-h zebrafish embryos. In the mature myotome, the mononucleated myoblasts displayed periodic striations for all sarcomeric proteins tested. The changes in desmin distribution from aggregates to perinuclear and striated forms, although following the same sequence, occurred much faster than in other models. All desmin-positive cells were also positive for myofibrillar proteins and were striated, in contrast to what occurs in cell cultures. Vimentin appeared striated in mature cells, whereas it is developmentally down-regulated in vitro. The whole connective tissue septum between the somites was positive for adhesion proteins such as vinculin, rather than showing the isolated adhesion plaques observed in cell cultures. The differences between zebrafish myogenesis in situ and in cell culture in vitro suggest that some of the structures and protein distributions previously observed in cultures could be methodological artifacts.

Relevance:

10.00%

Publisher:

Abstract:

Point mutations and small insertions or deletions in the human alpha-globin genes may produce alpha-chain structural variants and alpha-thalassemia. Mutations can be detected either by direct DNA sequencing or by screening methods, which select the mutated exon for sequencing. Although small (about 1 kb, 3 exons and 2 introns), the alpha-globin genes are duplicated (alpha2 and alpha1) and highly G-C rich, which makes them difficult to denature, reducing sequencing efficiency and causing frequent artifacts. We modified some PCR and electrophoresis conditions in order to detect mutations in these genes by nonradioactive single-strand conformation polymorphism (SSCP). Primers previously described by other authors for radioactive SSCP and for phast-SSCP plus denaturing gradient gel electrophoresis were combined here, and the resulting fragments (6 new in addition to the 6 original per alpha-gene) were submitted to silver-staining SSCP. Nine structural mutations and one thalassemic mutation were tested under different conditions, including two electrophoretic apparatus (PhastSystem™ and GenePhor™, Amersham Biosciences), different polyacrylamide gel concentrations, run temperatures and denaturing agents, and entire and restriction-enzyme-cut fragments. One hundred percent sensitivity was achieved with four of the new fragments, using the PhastSystem™ and 20% gels at 15ºC, without the need for restriction enzymes. This nonradioactive PCR-SSCP approach proved to be simple, rapid and sensitive, reducing the costs involved in frequent sequencing repetitions and increasing the reliability of the results. It can be especially useful for laboratories that do not have an automated sequencer.

Relevance:

10.00%

Publisher:

Abstract:

Within the last few decades, the videogame has become an important media, economic, and cultural phenomenon. As the phenomenon has proliferated, however, the aspects that constitute its identity have become ever more challenging to determine. The persistent surfacing of novel ludic forms continues to expand the conceptual range of 'games' and 'videogames,' which has already led to anxious generalizations within academic as well as popular discourses. Such generalizations make it increasingly difficult to comprehend how the instances of this phenomenon actually work, which in turn generates pragmatic problems: the lack of an applicable identification of the videogame hinders its study, play, and everyday conceptualization. To counteract these problems, this dissertation establishes a geneontological research methodology that enables the identification of the videogame in relation to its cultural surroundings. Videogames are theorized as 'games,' 'puzzles,' 'stories,' and 'aesthetic artifacts' (or 'artworks'), which produces a geneontological sequence of the videogame as a singular species of culture, Artefactum ludus ludus, or ludom for short. According to this sequence, the videogame's position as a 'game' in the historicized evolution of culture is mainly metaphorical, while its artifactuality, dynamic system structure, time-critical strategic input requirements and aporetically rhematic aesthetics allow it to be discovered as a conceptually stable but empirically transient uniexistential phenomenon that currently thrives but may soon die out.

Relevance:

10.00%

Publisher:

Abstract:

The subject of this study is a single military unit, the Headquarters and Signals Company (Esikunta- ja viestikomppania) of the Reserve Officer School (Reserviupseerikoulu). The unit trains reserve officers for military police, logistics, command-and-control-systems and command post duties. The military police line and the logistics line joined the unit at the end of 2014. This integration gave rise to an interest in the cultures of the training lines. The purpose of the study is to find out how the training cultures of the lines differ from one another and what happens when the cultures integrate. The study also aims to give the unit recommendations for managing possible cultural contradictions and conflicts. The study is social-anthropological cultural research within business studies. Its background disciplines are business administration, anthropology, sociology and, in part, organizational psychology. The research approach is qualitative, and the data were collected through interviews with the unit's instructors and through participant observation. The data were analyzed using theory-guided qualitative content analysis. The aim of the analysis was to identify the differences between the integrating training cultures, to understand the logic of the conflicts arising from these differences, and to examine the logic of cultural leakage. The differences between the training cultures of the Headquarters and Signals Company were analyzed through the following theoretical themes: conception of the student, attitudes toward officer students, artifacts (training methods), values, attitudes, and deep assumptions about training. The encounter of the training cultures was analyzed through the theories of psychological ownership, territorial behavior and cultural leakage. The results show that culture really does matter when a military organization undergoes integration. There are considerable differences, but also similarities, between the training cultures of the Headquarters and Signals Company. The cultural differences lead to contradictions, conflicts and territorial behavior, and make people express their resistance in different ways.
On the other hand, cultural traits also tend to leak into the surroundings, speeding up the adaptation of new parts of the organization. The leader's role in managing cultures is central. A leader's own cultural orientation can lead to conflicts, but it can also resolve them. The most important conclusion is that a large military unit should draw up a strategy, i.e. a plan for managing culture, when an integration is taking place. Such a strategy makes it possible to address many of the cultural contradictions that arise, as they do in the case of the Headquarters and Signals Company. The whole personnel of the unit should be committed to and involved in drawing up the strategy. The strategy would clarify, for example, the unit's values, vision, mission, tasks, the special positions of and exceptions for the different lines, desired behavior and attitudes, and reward practices; in other words, the various elements of culture. Strategy is not merely an obligatory long-term tool of a commercial profit unit, but also an instrument for managing culture in a military organization.

Relevance:

10.00%

Publisher:

Abstract:

Single-photon emission computed tomography (SPECT) is a non-invasive imaging technique that provides information on the functional state of tissues. SPECT imaging has been used as a diagnostic tool in several human disorders and can be used in animal models of disease for physiopathological, genomic and drug discovery studies. However, most experimental models used in research involve rodents, which are at least one order of magnitude smaller in linear dimensions than man. Consequently, images of targets obtained with conventional gamma cameras and collimators have poor spatial resolution and statistical quality. We review the methodological approaches developed in recent years to obtain images of small targets with good spatial resolution and sensitivity. Multipinhole, coded-mask- and slit-based collimators are presented as alternative approaches to improve image quality. In combination with appropriate decoding algorithms, these collimators permit a significant reduction of the time needed to register the projections used to make 3-D representations of the volumetric distribution of a target's radiotracers. They can simultaneously be used to minimize the artifacts and blurring that arise when single-pinhole collimators are used. Representative images illustrating the use of these collimators are presented. We also comment on the use of coded masks to attain tomographic resolution with a single projection, as discussed by some investigators since their introduction for near-field imaging. We conclude this review by showing that appropriate hardware and software tools adapted to conventional gamma cameras can be of great help in obtaining relevant functional information in experiments using small animals.

Relevance:

10.00%

Publisher:

Abstract:

The herpes simplex virus type I tegument protein VP22 is abundant and well known for its ability to translocate proteins from one cell to another. In spite of some reports questioning this ability and attributing the observed results to fixation artifacts or simple attachment to the cell membrane, VP22 has been used to deliver several proteins into different cell types, triggering the expected cell response. However, whether VP22 can enter stem cells has not been addressed. We investigated whether VP22 could serve as a tool for stem cell research and differentiation, given its capacity to internalize other proteins without altering the cell genome. We generated a VP22.eGFP fusion construct to evaluate whether VP22 could be internalized, carrying another protein with it, in two different types of stem cells, namely adult human dental pulp stem cells and mouse embryonic stem cells. We demonstrated that the VP22.eGFP fusion protein does, in fact, enter stem cells. This system may therefore be used as a tool to deliver various proteins into stem cells, enabling stem cell research, differentiation studies and the generation of induced pluripotent stem cells without genome alterations.

Relevance:

10.00%

Publisher:

Abstract:

Object detection is a fundamental task of computer vision that serves as a core component in a number of industrial and scientific applications, for example in robotics, where objects need to be correctly detected and localized before being grasped and manipulated. Existing object detectors vary in (i) the amount of supervision they need for training, (ii) the type of learning method adopted (generative or discriminative) and (iii) the amount of spatial information used in the object model (model-free, using no spatial information, or model-based, with an explicit spatial model of the object). Although some existing methods report good performance in the detection of certain objects, the results tend to be application-specific, and no universal method has been found that clearly outperforms all others in all areas. This work proposes a novel generative part-based object detector. The generative learning procedure of the developed method allows learning from positive examples only. The detector is based on finding semantically meaningful parts of the object (i.e. a part detector) that can provide information beyond object location, for example pose. The object class model, i.e. the appearance of the object parts and their spatial variance (constellation), is explicitly modelled in a fully probabilistic manner. The appearance is based on bio-inspired complex-valued Gabor features that are transformed into part probabilities by an unsupervised Gaussian mixture model (GMM). The proposed novel randomized GMM enables learning from only a few training examples. The probabilistic spatial model of the part configurations is constructed with a mixture of 2D Gaussians. The appearance of the object parts is learned in an object canonical space that removes geometric variations from the part appearance model.
Robustness to pose variations is achieved by object pose quantization, which is more efficient than the previously used scale and orientation shifts in the Gabor feature space. The performance of the resulting generative object detector is characterized by high recall with low precision, i.e. the generative detector produces a large number of false positive detections. A discriminative classifier is therefore used to prune the false positive candidate detections produced by the generative detector, improving its precision while keeping recall high. Using only a small number of positive examples, the developed object detector performs comparably to state-of-the-art discriminative methods.
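The step that turns feature vectors into part probabilities can be illustrated with the E-step of a hand-parameterized two-component GMM. The thesis uses complex-valued Gabor features and a randomized GMM; the 2-D features, means and shared spherical variance below are purely illustrative:

```python
import numpy as np

means = np.array([[0.0, 0.0],
                  [2.0, 2.0]])    # one mean per object part (illustrative)
var = 0.3 ** 2                    # shared spherical variance
weights = np.array([0.5, 0.5])    # mixing weights

def part_probabilities(x):
    """Posterior p(part k | x) for each row of x (the GMM responsibilities)."""
    d2 = ((x[:, None, :] - means[None, :, :]) ** 2).sum(axis=-1)
    lik = weights * np.exp(-0.5 * d2 / var)       # unnormalized component pdfs
    return lik / lik.sum(axis=1, keepdims=True)   # normalize over parts

probs = part_probabilities(np.array([[1.9, 2.1],     # close to part 2
                                     [0.1, -0.1]]))  # close to part 1
```

In the detector, each local feature would receive such a probability vector over the learned parts, and the mixture of 2D Gaussians then scores the spatial constellation of high-probability part locations.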

Relevance:

10.00%

Publisher:

Abstract:

Imagine the potential implications of an organization whose business and IT processes are well aligned and capable of responding both reactively and proactively to external and internal changes. The Philips IT Infrastructure and Operations department (I&O) is undergoing a series of transformation activities to help the Philips business keep up with such changes. I&O would serve a critical function in any business sector; given that I&O's strategy switched from "design, build and run" to "specify, acquire and performance manage", that function is amplified. In 2013, I&O's biggest transformation programme, I&O Futures, engaged multiple interdisciplinary departments and programs in decommissioning legacy processes and restructuring new processes with respect to the Information Technology Infrastructure Library (ITIL), helping I&O to achieve a common infrastructure and operating platform (CI&OP). The author joined I&O Futures in early 2014 and contributed to CI&OP release 1, during which the Bing Box model was designed and evaluated through the lens of Six Sigma's structured define-measure-analyze-improve-control (DMAIC) improvement approach. The Bing Box model was intended, firstly, to combine business and IT principles, namely Lean IT, Agile, ITIL best practices and aspect-oriented programming (AOP), into a single framework. Secondly, the author implemented modularized optimization cycles according to the defined framework in Philips' ITIL-based processes, in order to enhance business process performance and to increase the efficiency of the optimization cycles themselves. The unique contribution of this thesis is that the Bing Box model not only provides comprehensive optimization approaches and principles for business process performance, but also integrates and standardizes optimization modules for the optimization process itself.
The research followed design research guidelines, which seek to extend the boundaries of human and organizational capabilities by creating new and innovative artifacts. Chapter 2 reviews the current research on Lean Six Sigma, Agile, AOP and ITIL, aiming to identify the broad conceptual bases for this study. Chapter 3 describes the construction of the Bing Box model. Chapter 4 describes the adoption of the Bing Box model in two implementation cases, validated by stakeholders through observations and interviews. Chapter 5 contains the concluding remarks, the limitations of this research and future research areas. Chapter 6 provides the references used in this thesis.

Relevance:

10.00%

Publisher:

Abstract:

Most applications of airborne laser scanner data to forestry require that the point cloud be normalized, i.e., that each point represents height above the ground instead of elevation. To normalize the point cloud, a digital terrain model (DTM), derived from the ground returns in the point cloud, is employed. Unfortunately, extracting accurate DTMs from airborne laser scanner data is a challenging task, especially in tropical forests, where the canopy is normally very thick (partially closed), so that only a limited number of laser pulses reach the ground. Therefore, robust algorithms for extracting accurate DTMs in low-ground-point-density situations are needed in order to realize the full potential of airborne laser scanner data for forestry. The objective of this thesis is to develop algorithms for processing airborne laser scanner data in order to: (1) extract DTMs in demanding forest conditions (complex terrain and low numbers of ground points) for applications in forestry; (2) estimate canopy base height (CBH) for forest fire behavior modeling; and (3) assess the robustness of LiDAR-based high-resolution biomass estimation models against different field plot designs. Here, the aim is to find out whether field plot data gathered by professional foresters can be combined with field plot data gathered by professionally trained community foresters and used in LiDAR-based high-resolution biomass estimation modeling without affecting prediction performance. The question of interest is whether local forest communities can achieve the level of technical proficiency required for accurate forest monitoring.
The DTM extraction algorithms presented in this thesis address the challenges of low-ground-point situations and complex terrain, while the CBH estimation algorithm addresses the challenge of variations in the distribution of points in the LiDAR point cloud caused by factors such as tree species and the season of data acquisition. These algorithms are adaptive with respect to point cloud characteristics and exhibit a high degree of tolerance to variations in the density and distribution of points in the LiDAR point cloud. Comparisons with existing DTM extraction algorithms showed that the algorithms proposed in this thesis performed better with respect to the accuracy of tree heights estimated from airborne laser scanner data. On the other hand, the proposed DTM extraction algorithms, being mostly based on trend surface interpolation, cannot retain small terrain details (e.g., bumps, small hills and depressions). Therefore, the DTMs generated by these algorithms are only suitable for forestry applications in which the primary objective is to estimate tree heights from normalized airborne laser scanner data. The CBH estimation algorithm proposed in this thesis is based on the idea of a moving voxel, in which gaps (openings in the canopy) that act as fuel breaks are located and their height is estimated. Test results showed a slight improvement in CBH estimation accuracy over existing methods, which are based on height percentiles of the airborne laser scanner data. Being based on a moving voxel, however, the algorithm has one main advantage over existing CBH estimation methods in the context of forest fire modeling: it has great potential for providing information about vertical fuel continuity.
This information can be used to create vertical fuel continuity maps, which can provide more realistic information on the risk of crown fires than CBH alone.
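The normalization step that motivates the DTM work can be sketched as a simple lookup of the ground elevation under each return. The 2 x 2 DTM grid and the three points below are synthetic; a real DTM would come from the classified ground returns:

```python
import numpy as np

# Gridded DTM: ground elevation (m) per 1 m cell, row = y, column = x
dtm = np.array([[10.0, 10.5],
                [11.0, 11.5]])
cell = 1.0                                    # DTM resolution, m

# Airborne returns as (x, y, elevation) triples
points = np.array([[0.2, 0.3, 22.0],
                   [1.4, 0.6, 28.5],
                   [0.7, 1.8, 14.0]])

# Height above ground = elevation minus the DTM value at the point's cell
rows = np.minimum((points[:, 1] // cell).astype(int), dtm.shape[0] - 1)
cols = np.minimum((points[:, 0] // cell).astype(int), dtm.shape[1] - 1)
heights = points[:, 2] - dtm[rows, cols]      # normalized point heights
```

In practice the DTM value under each point would be interpolated (e.g. bilinearly) rather than taken from the nearest cell, but the subtraction is the same, which is why tree-height accuracy depends directly on DTM accuracy.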

Relevance:

10.00%

Publisher:

Abstract:

SUMMARY: Organizational creativity – hegemonic and alternative discourses

Over the course of recent developments in the societal and business environment, the concept of creativity has been brought into new arenas. The rise of 'creative industries' and the idea of creativity as a form of capital have attracted the interest of business and management professionals as well as academics. As the notion of creativity has been adopted in the organization studies literature, the concept of organizational creativity has been introduced to refer to creativity that takes place in an organizational context. This doctoral thesis focuses on organizational creativity, and its purpose is to explore and problematize the hegemonic organizational creativity discourse and to provide alternative viewpoints for theorizing about creativity in organizations. Taking a discourse theory approach, this thesis first provides an outline of the currently predominant, i.e. hegemonic, discourse on organizational creativity, which is explored with regard to themes, perspectives, methods and paradigms. Second, this thesis consists of five studies that act as illustrations of certain alternative viewpoints. Through these exemplary studies, the thesis sheds light on the limitations and taken-for-granted aspects of the hegemonic discourse and discusses what these alternative viewpoints could offer for understanding and theorizing about organizational creativity. This study rests on the assumption that the development of organizational creativity knowledge and the related discourse is not inevitable or progressive but contingent. The organizational creativity discourse has developed in a certain direction, meaning that some themes, perspectives and methods, as well as assumptions, values and objectives, have gained a hegemonic position over others, and are therefore often taken for granted and considered valid and relevant.
The hegemonization of certain aspects, however, contributes to the marginalization of others. The thesis concludes that the hegemonic discourse on organizational creativity is based on extensive coverage of certain themes and perspectives, such as those focusing on individual cognitive processes, motivation, or organizational climate and their relation to creativity, to name a few. The limited focus on some themes and the confinement to certain prevalent perspectives, however, result in the marginalization of other themes and perspectives. The negative, often unintended, consequences, implications and side effects of creativity, the factors that might hinder or prevent creativity, and deeper inquiry into the ontology and epistemology of creativity have attracted relatively marginal interest. The material embeddedness of organizational creativity, in other words the physical organizational environment as well as the human body and its non-cognitive resources, has largely been overlooked in the hegemonic discourse, although there are studies in this area that give reason to believe it might prove relevant for the understanding of creativity. The hegemonic discourse is based on an individual-centered understanding of creativity, which overattributes creativity to the individual and his or her cognitive capabilities while neglecting how, for instance, the physical environment, artifacts, social dynamics and interactions condition organizational creativity. For historical reasons, quantitative as well as qualitative yet functionally-oriented studies have predominated in the organizational creativity discourse, although studies within the interpretationist paradigm have gradually become more popular. The two radical paradigms, as well as the methodological and analytical approaches typical of radical research, hold a marginal position in the field of organizational creativity.
The hegemonic organizational creativity discourse has provided extensive findings related to many aspects of organizational creativity, although its conceptualizations and understandings of organizational creativity are also in many respects limited and one-sided. The hegemonic discourse rests on the assumption that creativity is desirable, good, necessary, or even obligatory, and should be encouraged and nourished. Its conceptualizations favor the kind of creativity that is useful and valuable and can be harnessed for productivity. The current conceptualization is limited to the type of creativity that is acceptable and fits the managerial ideology, and washes out any risky, seemingly useless, or negative aspects of creativity. It also limits the possible meanings and representations that 'creativity' has in the respective discourse, excluding many meanings of creativity encountered in other discourses. The excessive focus on creativity that is good, positive, productive and fits the managerial agenda, while ignoring other forms and aspects of creativity, contributes to the dilution of the notion. Practices aimed at encouraging this kind of creativity may actually foster moderate alterations rather than more radical novelty, along with management and organizational practices that limit creative endeavors rather than increase their likelihood. The thesis concludes that, although not often given the space and attention they deserve, there are alternative conceptualizations and understandings of organizational creativity that embrace a broader notion of creativity. The inability to accommodate the 'other' understandings and viewpoints within the organizational creativity discourse runs the risk of misrepresenting the complex and many-sided phenomenon of creativity in an organizational context.

Keywords: organizational creativity, creativity, organization studies, discourse theory, hegemony