883 results for Terminology.

Relevance: 10.00%

Abstract:

Despite increasing interest in the discursive aspects of strategy, few studies have examined strategy texts and their power effects. We draw on Critical Discourse Analysis to better understand the power of strategic plans as a directive genre. In our empirical analysis, we examined the creation of the official strategic plan of the City of Lahti in Finland. As a result of our inductive analysis, we identified five central discursive features of this plan: self-authorization, special terminology, discursive innovation, forced consensus and deonticity. We argue that these features can, with due caution, be generalized and conceived of as distinctive features of the strategy genre. We maintain that these discursive features are not trivial characteristics; they have important consequences for the textual agency of strategic plans, their performative effects, their impact on power relations and their ideological implications.

Relevance: 10.00%

Abstract:

Gene expression is one of the most critical factors influencing the phenotype of a cell. As a result of several technological advances, measuring gene expression levels has become one of the most common molecular biological measurements used to study the behaviour of cells. The scientific community has produced an enormous and constantly growing collection of gene expression data from various human cells, both from healthy and pathological conditions. However, while each of these studies is informative and enlightening in its own context and research setup, diverging methods and terminologies make it very challenging to integrate existing gene expression data into a more comprehensive view of human transcriptome function. On the other hand, bioinformatic science advances only through data integration and synthesis. The aim of this study was to develop biological and mathematical methods to overcome these challenges, to construct an integrated database of the human transcriptome and to demonstrate its usage. The methods developed in this study can be divided into two distinct parts. First, the biological and medical annotation of the existing gene expression measurements needed to be encoded with systematic vocabularies. As no single existing biomedical ontology or vocabulary was suitable for this purpose, a new annotation terminology was developed as part of this work. The second part was to develop mathematical methods for correcting the noise and the systematic differences and errors in the data caused by the various array generations. Additionally, suitable computational methods had to be developed for sample collection and archiving, unique sample identification, database structures, data retrieval and visualization. Bioinformatic methods were developed to analyse gene expression levels and putative functional associations of human genes using the integrated gene expression data. A method to interpret individual gene expression profiles against all the healthy and pathological tissues of the reference database was also developed. As a result of this work, 9783 human gene expression samples measured on Affymetrix microarrays were integrated into a unique human transcriptome resource, GeneSapiens. This makes it possible to analyse the expression levels of 17330 genes across 175 types of healthy and pathological human tissues. Applying this resource to interpret individual gene expression measurements allowed identification of the tissue of origin with 92.0% accuracy among 44 healthy tissue types. A systematic analysis of the transcriptional activity levels of 459 kinase genes was performed across 44 healthy and 55 pathological tissue types, and a genome-wide analysis of kinase gene co-expression networks was carried out. This analysis revealed biologically and medically interesting data on putative kinase gene functions in health and disease. Finally, we developed a method for the alignment of gene expression profiles (AGEP) that analyses individual patient samples to pinpoint gene- and pathway-specific changes in the test sample relative to the reference transcriptome database. We also showed how large-scale gene expression data resources can be used to quantitatively characterize changes in the transcriptomic program of differentiating stem cells. Taken together, these studies indicate the power of systematic bioinformatic analyses to infer biological and medical insights from existing published datasets as well as to facilitate the interpretation of new molecular profiling data from individual patients.
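
As a rough illustration of the kind of tissue-of-origin classification described above (a minimal sketch, not the GeneSapiens/AGEP implementation), one might assign a sample to the reference tissue whose mean expression profile it correlates with most strongly; the gene count and tissue names below are hypothetical.

```python
import numpy as np

def tissue_of_origin(sample, centroids):
    """Assign a sample to the reference tissue whose mean expression
    profile correlates with it most strongly (Pearson correlation).

    sample:    1-D array of per-gene expression values
    centroids: dict mapping tissue name -> 1-D array of mean expression
    """
    best_tissue, best_r = None, -np.inf
    for tissue, profile in centroids.items():
        r = np.corrcoef(sample, profile)[0, 1]
        if r > best_r:
            best_tissue, best_r = tissue, r
    return best_tissue, best_r

# Hypothetical toy data: 1000 genes, three reference tissues.
rng = np.random.default_rng(42)
centroids = {t: rng.normal(size=1000) for t in ("liver", "kidney", "brain")}
sample = centroids["liver"] + rng.normal(scale=0.3, size=1000)  # noisy liver-like sample
print(tissue_of_origin(sample, centroids))  # expected: ('liver', high correlation)
```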

Relevance: 10.00%

Abstract:

In this thesis I study various quantum coherence phenomena and lay some of the foundations for a systematic coherence theory. So far, the approach to quantum coherence in science has been purely phenomenological. In the thesis I try to answer the question of what quantum coherence is and how it should be approached within the framework of physics, the metatheory of physics and the terminology related to them. It is worth noting that quantum coherence is a conserved quantity that can be exactly defined. I propose a way to define quantum coherence mathematically from the density matrix of the system. Degenerate quantum gases, i.e., Bose condensates and ultracold Fermi systems, form a good laboratory for studying coherence, since their entropy is small and their coherence is large, and thus they exhibit strong coherence phenomena. Concerning coherence phenomena in degenerate quantum gases, I concentrate mainly on collective association of atoms into molecules, Rabi oscillations and decoherence. It turns out that collective association and oscillations do not depend on the spin statistics of the particles. Moreover, I study the logical features of decoherence in closed systems via a simple spin model. I argue that decoherence is a valid concept also in systems that may experience recoherence, i.e., Poincaré recurrences. Metatheoretically this is a remarkable result, since it justifies quantum cosmology: studying the whole universe (i.e., physical reality) purely quantum physically is meaningful and valid science, in which decoherence explains why the quantum-physical universe appears very classical to cosmologists and other scientists. The study of the logical structure of closed systems also reveals that sufficiently complex closed (physical) systems obey a principle similar to Gödel's incompleteness theorem in logic. According to this principle, it is impossible to describe a closed system completely from within the system, and the inside and outside descriptions of the system can differ remarkably. Understanding this feature may make it possible to comprehend coarse-graining better and to define the mutual entanglement of quantum systems uniquely.
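
One standard way to quantify coherence directly from a density matrix is the l1-norm of coherence, the sum of the magnitudes of the off-diagonal elements in a chosen basis; this is offered only as a familiar example of such a definition, not necessarily the one adopted in the thesis. A minimal numpy sketch:

```python
import numpy as np

def l1_coherence(rho):
    """Sum of the absolute values of the off-diagonal elements of a density matrix."""
    return np.sum(np.abs(rho)) - np.sum(np.abs(np.diag(rho)))

# Maximally coherent qubit state |+><+| versus the fully mixed state.
plus = np.array([[0.5, 0.5], [0.5, 0.5]])
mixed = np.array([[0.5, 0.0], [0.0, 0.5]])
print(l1_coherence(plus))   # 1.0  (maximal for a qubit)
print(l1_coherence(mixed))  # 0.0  (no coherence)
```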

Relevance: 10.00%

Abstract:

I discuss role responsibility, individual responsibility and collective responsibility in a multinational corporate setting. My case study concerns minerals used in electronics that come from the Democratic Republic of Congo. What I try to show throughout the thesis is how many things need to be taken into consideration when we discuss the responsibility of individuals in corporations. No easy and simple answers are available. Instead, we must keep the complexity of the situation in mind at all times, judging cases on an individual basis, emphasizing the importance of individual judgement and virtue, as well as the responsibility we all share as members of groups and the wider society. I begin by discussing the demands that are placed on us as employees. There is always a potential conflict between our different roles and the wider demands placed on us. Role demands are usually much more specific than the wider question of how we should act as human beings. The terminology of roles can also be misleading, as it can create the illusion that our work selves are somehow radically separated from our everyday, true selves. The nature of collective decision-making and its implications for responsibility are important too. When discussing the moral responsibility of an employee in a corporate setting, one must take into account arguments from individual and collective responsibility, as well as role ethics. Individual responsibility is not a notion separate from, or competing with, collective responsibility. Rather, the two are interlinked. Individuals' responsibilities in collective settings combine individual responsibility and collective responsibility (which is different from aggregate individual responsibility). In the majority of cases, both will apply in varying degrees. Some members may bear individual responsibility in addition to the collective responsibility, while others bear only the collective responsibility. There are also times when no one bears individual moral responsibility but the members are still responsible for the collective part. My intuition is that collective moral responsibility is strongly linked to the way the collective setting affects individual judgements and moulds decisions, and to how individuals use the collective setting to further their own ends. Individuals remain the moral agents, but responsibility is collective if the actions in question are collective in character. I also explore the impact of bureaucratic ethics and their influence on the individual. Bureaucracies can compartmentalize work to such a degree that individual human action is reduced to mere behaviour. Responsibility is diffused, and the people working in a bureaucracy can come to view their actions as lying outside the normal human realm in which they would be responsible for what they do. Language games and rules, anonymity, internal power struggles and the fragmentation of information are just some of the reasons why responsibility and morality can become blurred in large institutional settings. Throughout the thesis I defend the following theses:
● People act differently depending on their roles. This is necessary for our society to function, but the more specific role demands should always be kept in check by the wider requirements of being a good human being.
● Acts in corporations (and other large collectives) are not reducible to individual actions and cannot be explained fully by the behaviour of individual employees.
● Individuals are responsible for the actions they undertake in the collective as role occupiers and are very rarely off the hook. Hiding behind role demands is usually only an excuse and shows a lack of virtue.
● Individuals in roles can be responsible even when the collective is not. This depends on whether or not the act they performed was corporate in nature.
● Bureaucratic structure affects individual thinking and is not always a healthy environment to work in.
● Individual members can share responsibility with the collective, and our share of the collective responsibility is strongly linked to our relations.
● Corporations and other collectives can be responsible for harm even when no individual is at fault. The structure and the policies of the collective are crucial.
● Socialization plays an important role in our morality both at work and outside it. We are all responsible for the kind of moral context we create.
● When accepting a role or a position in a collective, we attach ourselves to the values of that collective.
● Ethical theories should put more emphasis on good judgement and decision-making instead of vague generalisations.
My conclusion is that the individual person is always at the centre when it comes to responsibility, and not so easily off the hook as we sometimes think. What we do, and especially whom we choose to associate ourselves with, does matter, and we should be more careful when we choose whom we work for. Individuals within corporations are responsible for ensuring that the corporation they associate with is one they can subscribe to morally, if not fully, then at least for the most part. Individuals are also inclusively responsible, to a varying degree, for the collective activities they contribute to, even in overdetermined contexts. We are all responsible for the kind of corporations we choose to support through our actions as consumers, investors and citizens.

Relevance: 10.00%

Abstract:

This paper introduces the META-NORD project, which develops the Nordic and Baltic part of the European open language resource infrastructure. META-NORD works on assembling, linking across languages, and making widely available the basic language resources used by developers, professionals and researchers to build specific products and applications. The goals of the project, its overall approach and its specific focus lines on wordnets, terminology resources and treebanks are described. Moreover, the results achieved in the first five months of the project, i.e. language white papers, the metadata specification and IPR, are presented.

Relevance: 10.00%

Abstract:

Language software applications encounter new words, e.g., acronyms, technical terminology, names or compounds of such words. In order to add new words to a lexicon, we need to indicate their inflectional paradigm. We present a new, generally applicable method for creating an entry generator, i.e. a paradigm guesser, for finite-state transducer lexicons. As a guesser tends to produce numerous suggestions, it is important that the correct suggestions be among the first few candidates. We prove some formal properties of the method and evaluate it on Finnish, English and Swedish full-scale transducer lexicons. We use the open-source Helsinki Finite-State Technology to create finite-state transducer lexicons from existing lexical resources and automatically derive guessers for unknown words. The method has a recall of 82-87% and a precision of 71-76% for the three test languages. The model needs no external corpus and can therefore serve as a baseline.
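
As a rough illustration of what a paradigm guesser does (the paper's guesser is derived automatically from a finite-state transducer lexicon; the longest-suffix heuristic and the paradigm names below are purely hypothetical), ranking candidate paradigms by the length of a matching word-final suffix might look like this:

```python
# Toy suffix-based paradigm guesser. Paradigm names and suffixes are invented
# for illustration; the actual method operates on FST lexicons.
PARADIGMS = {
    "talo":   ["o"],          # nouns ending in -o
    "kala":   ["a"],          # nouns ending in -a
    "vieras": ["as", "äs"],   # nouns ending in -as/-äs
}

def guess_paradigms(word):
    """Return candidate paradigms ranked by the length of the matching suffix."""
    candidates = []
    for paradigm, suffixes in PARADIGMS.items():
        for suf in suffixes:
            if word.endswith(suf):
                candidates.append((len(suf), paradigm))
    # Longest suffix first, so the best guesses come out on top.
    return [p for _, p in sorted(candidates, reverse=True)]

print(guess_paradigms("opas"))   # -> ['vieras'] in this toy setup
print(guess_paradigms("auto"))   # -> ['talo']
```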

Relevance: 10.00%

Abstract:

Even though dynamic programming offers an optimal control solution in state feedback form, the method is overwhelmed by computational and storage requirements. Approximate dynamic programming implemented with an Adaptive Critic (AC) neural network structure has evolved as a powerful alternative technique that obviates the need for excessive computation and storage in solving optimal control problems. In this paper, an improvement to the AC architecture, called the Single Network Adaptive Critic (SNAC), is presented. This approach is applicable to a wide class of nonlinear systems for which the optimal control (stationarity) equation can be explicitly expressed in terms of the state and costate variables. The terminology is motivated by the fact that the approach eliminates one of the neural networks (namely the action network) found in a typical dual-network AC setup. As a consequence, the SNAC architecture offers three potential advantages: a simpler architecture, a lower computational load, and elimination of the approximation error associated with the eliminated network. In order to demonstrate these benefits and the control synthesis technique using SNAC, two problems are solved with both the AC and SNAC approaches and their computational performances are compared. One of these problems is a real-life micro-electro-mechanical systems (MEMS) problem, which demonstrates that the SNAC technique is applicable to complex engineering systems.
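
For context, the standard discrete-time optimal control relations behind adaptive-critic designs can be sketched as follows (textbook form with generic notation, not taken from the paper). SNAC trains a single network to map the state x_k to the costate lambda_{k+1}, so the control follows directly from the stationarity condition and no separate action network is required.

```latex
% Discrete-time optimal control relations underlying adaptive-critic designs
% (textbook form; generic notation, not taken from the paper).
\begin{align*}
  x_{k+1} &= f(x_k, u_k)                                   &&\text{state equation} \\
  H_k &= U(x_k, u_k) + \lambda_{k+1}^{\top} f(x_k, u_k)    &&\text{Hamiltonian} \\
  \frac{\partial H_k}{\partial u_k} &= 0
      \;\Rightarrow\; u_k = u_k(x_k, \lambda_{k+1})        &&\text{optimal control (stationarity)} \\
  \lambda_k &= \frac{\partial H_k}{\partial x_k}           &&\text{costate equation}
\end{align*}
```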

Relevance: 10.00%

Abstract:

In this paper we consider the problem of scheduling expression trees on delayed-load architectures. The problem tackled here takes its root from the one considered in [Proceedings of the ACM SIGPLAN '91 Conf. on Programming Language Design and Implementation, 1991, p. 256], in which the leaves of the expression trees all refer to memory locations. A generalization of this involves the situation in which the trees may contain register variables, with the registers being used only at the leaves. Solutions to this generalization are given in [ACM Trans. Prog. Lang. Syst. 17 (1995) 740; Microproc. Microprog. 40 (1994) 577]. This paper considers the most general case, in which the registers are reusable. This problem is tackled in [Comput. Lang. 21 (1995) 49], which gives an approximate solution under certain assumptions about the contiguity of the evaluation order. Here we propose an optimal solution (which may even involve a non-contiguous evaluation of the tree). The schedule generated by the algorithm given in this paper is optimal in the sense that it is an interlock-free schedule that uses the minimum number of registers required. An extension to the algorithm incorporates spilling. The problem as stated in this paper is an instruction scheduling problem; however, it could also be rephrased as an operations research problem with a change of terminology.
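
For background, a classical building block of expression-tree scheduling is Sethi-Ullman labelling, which computes the minimum number of registers needed to evaluate a tree without spilling; this sketch is only the textbook labelling step, not the paper's algorithm, which additionally handles delayed loads, register-variable leaves, reusable registers and non-contiguous evaluation orders.

```python
# Classical Sethi-Ullman labelling for binary expression trees:
# label(node) = registers needed to evaluate the subtree without spills.
class Node:
    def __init__(self, op, left=None, right=None):
        self.op, self.left, self.right = op, left, right

def label(node, is_left_child=True):
    if node.left is None and node.right is None:       # leaf
        # A left leaf must be loaded into a register; a right leaf can be
        # used directly as a memory operand on many two-address machines.
        return 1 if is_left_child else 0
    l = label(node.left, True)
    r = label(node.right, False)
    return max(l, r) if l != r else l + 1

# (a + b) * (c - d): needs 2 registers in this model.
tree = Node('*', Node('+', Node('a'), Node('b')), Node('-', Node('c'), Node('d')))
print(label(tree))  # -> 2
```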

Relevance: 10.00%

Abstract:

The problem of semantic interoperability arises when integrating applications in different task domains across the product life cycle. A new shape-function-relationship (SFR) framework is proposed as a taxonomy on which an ontology is developed. An ontology based on the SFR framework, which captures explicit definitions of terminology and knowledge relationships in terms of shape, function and relationship descriptors, offers an attractive approach to solving the semantic interoperability issue. Since all instances of terms are based on a single taxonomy with a formal classification, mapping of terms requires only a simple check on the attributes used in the classification. As a preliminary study, the framework is used to develop an ontology of terms used in the aero-engine domain, and the ontology is used to resolve the semantic interoperability problem in the integration of design and maintenance. Since the framework allows a single term to have multiple classifications, handling context-dependent usage of terms becomes possible. Automating the classification of terms and establishing the completeness of the classification scheme are currently being addressed.
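
A minimal sketch of how terms might be encoded against shape, function and relationship descriptors, with term mapping reduced to an attribute comparison; the descriptor values and term names below are hypothetical and are not taken from the aero-engine ontology.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class SFRTerm:
    """A term classified by shape, function and relationship descriptors."""
    name: str
    shape: frozenset = field(default_factory=frozenset)
    function: frozenset = field(default_factory=frozenset)
    relationship: frozenset = field(default_factory=frozenset)

    def matches(self, other) -> bool:
        # Two terms from different task domains map to each other when their
        # descriptor sets coincide, regardless of the surface name.
        return (self.shape == other.shape and
                self.function == other.function and
                self.relationship == other.relationship)

# Hypothetical example: the design domain says "blade", maintenance says "aerofoil".
design_term = SFRTerm("blade", frozenset({"aerofoil-profile"}),
                      frozenset({"turn-flow"}), frozenset({"attached-to-disc"}))
maint_term = SFRTerm("aerofoil", frozenset({"aerofoil-profile"}),
                     frozenset({"turn-flow"}), frozenset({"attached-to-disc"}))
print(design_term.matches(maint_term))  # True: same classification, so the terms map
```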

Relevance: 10.00%

Abstract:

Moore's Law has driven the semiconductor revolution, enabling over four decades of scaling in frequency, size, complexity, and power. However, the limits of physics are preventing further scaling of speed, forcing a paradigm shift towards multicore computing and parallelization. In effect, the system is taking over the role that the single CPU used to play: high-speed signals running not only through chips but also through packages and boards connect ever more complex systems. High-speed signals making their way through the entire system create new challenges in the design of computing hardware. Inductance, phase shifts and velocity-of-light effects, material resonances, and wave behavior not only become prevalent but need to be calculated accurately and rapidly to enable short design cycle times. In essence, continuing to scale with Moore's Law requires the incorporation of Maxwell's equations in the design process. Incorporating Maxwell's equations into the design flow is only possible through the combined power that new algorithms, parallelization and high-speed computing provide. At the same time, incorporating Maxwell-based models into circuit- and system-level simulation presents a massive accuracy, passivity, and scalability challenge. In this tutorial, we navigate through the often confusing terminology and concepts behind field solvers, show how advances in field solvers enable integration into EDA flows, present novel methods for model generation and passivity assurance in large systems, and demonstrate the power of cloud computing in enabling the next generation of scalable Maxwell solvers and the next generation of Moore's Law scaling of systems. We intend to show the truly symbiotic, growing relationship between Maxwell and Moore!

Relevance: 10.00%

Abstract:

The advent of nanotechnology has necessitated a better understanding of how changes in material microstructure at the atomic level affect the macroscopic properties that control performance. Such a challenge has uncovered many phenomena that were not previously understood and were taken for granted. Among them are the basic foundations of dislocation theories, which are now known to be inadequate. Simplifying assumptions invoked at the macroscale may not be applicable at the micro- and/or nanoscale. There are implications of scaling hierarchy associated with inhomogeneity and non-equilibrium of physical systems. What is taken to be homogeneous and in equilibrium at the macroscale may not be so when the physical size of the material is reduced to microns. These fundamental issues cannot be dispensed with at will for the sake of convenience, because they could alter the outcome of predictions. Even more unsatisfying is the lack of consistency in modeling physical systems. This can translate into an inability to identify the relevant manufacturing parameters, rendering the end product impractical because of high cost. Advanced composite and ceramic materials are cases in point. Potential pitfalls of applying models at both the atomic and continuum levels are discussed. No attempt is made to unravel the truth of nature, be it particulates, a smooth continuum or a combination of both. The present trend of development in scaling tends to seek different characteristic lengths of material microstructures, with or without the influence of time effects. Much will be learned from atomistic simulation models, which show how results can differ as boundary conditions and scales are changed. Quantum mechanics, continuum and cosmological models provide evidence that no general approach is in sight. Of immediate interest is perhaps the establishment of greater precision in terminology, so as to better communicate results involving multiscale physical events.

Relevance: 10.00%

Abstract:

In the current paper we primarily address one powerful simulation tool developed over the last few decades, Large Eddy Simulation (LES), which is most suitable for unsteady, three-dimensional, complex turbulent flows in industry and the natural environment. The main point in LES is that the large-scale motion is resolved while the small-scale motion is modeled or, in geophysical terminology, parameterized. With a view to devising a high-quality subgrid-scale (SGS) model, we highlight the physical aspects of scale interaction and energy transfer, such as dissipation, backscatter, local and non-local interaction, anisotropy and resolution requirements. These are the factors from which the advantages and disadvantages of existing SGS models stem. A case study on LES of turbulence in a vegetative canopy is presented to illustrate that LES modeling rests more on physical arguments. A variety of challenging complex turbulent flows in both industrial and geophysical fields, to be tackled in the near future, is then presented. In conclusion, we may say with confidence that the new century will see turbulence research flourish with the aid of LES combined with other approaches.
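
As a concrete example of the resolved/modeled split, the classical Smagorinsky closure (a standard textbook SGS model, cited here for illustration and not taken from the paper) expresses the unresolved SGS stress through an eddy viscosity built from the resolved strain rate, with Delta the filter width and C_s an empirical constant:

```latex
% Classical Smagorinsky subgrid-scale closure (illustrative, not from the paper).
\begin{align*}
  \tau_{ij} - \tfrac{1}{3}\tau_{kk}\,\delta_{ij} &= -2\,\nu_t\,\bar{S}_{ij},
  & \bar{S}_{ij} &= \tfrac{1}{2}\!\left(\partial_j \bar{u}_i + \partial_i \bar{u}_j\right),\\
  \nu_t &= (C_s\,\Delta)^2\,|\bar{S}|,
  & |\bar{S}| &= \sqrt{2\,\bar{S}_{ij}\bar{S}_{ij}}.
\end{align*}
```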

Relevance: 10.00%

Abstract:

Identifying the phenomena considered sublunar in Antiquity on the basis of Latin terminology is particularly difficult. Our proposal is to consider the terms that appear in the enumerations of Pliny and Seneca, taking modern astronomical nomenclature as the reference parameter and working from the hypothesis that ancient observations, made by highly capable observers under skies free of light pollution, can be compared with those we can obtain today through astrophotography. The conclusion is that, since in Greco-Latin Antiquity each phenomenon was reported with the same descriptive formula, it is possible to determine the equivalence between the Roman terms and the current ones (comets, meteors, etc.) as a step prior to adopting a translation decision, whether we regard them as scientific or as culturally specific terms.

Relevance: 10.00%

Abstract:

In this article I propose to reflect on aspects of the links between literature and cinema that even today arouse controversy or raise more questions than answers. Among them are the disciplinary and methodological framework, certain problems of terminology, and the criteria for classifying film transpositions. To conclude, I outline a proposed typology of transposition.

Relevance: 10.00%

Abstract:

The growing phenomenon of globalization is generating a new and more complex economic and business environment. It is therefore a phenomenon with a decisive influence on business management, one that has not only introduced new management variables but is also driving the need to rework certain terms and concepts that seemed sufficiently consolidated in business economics yet do not always provide the conceptual delimitation that some constituent elements of the complex management of international firms now appear to require. From this perspective, we define updated concepts of both firm internationalization and the multinational enterprise, drawing on their respective evolutions. Both are complex and at times ambiguous phenomena, with a high degree of heterogeneity that makes them difficult to define. To address this problem, we additionally propose a new terminology for enumerating the variety of existing foreign establishments or subsidiaries.