934 results for Préhistoire -- Terminology



Relevance:

10.00%

Publisher:

Abstract:

This report has been written as part of the E-ruralnet project, which addresses e-learning as a means of enhancing lifelong learning opportunities in rural areas, with emphasis on SMEs, micro-enterprises, the self-employed and persons seeking employment. E-ruralnet is a European network project part-funded by the European Commission in the context of the Lifelong Learning Programme (Transversal Projects, ICT). This report aims to address two issues identified as requiring attention in the previous Observatory study: firstly, access to e-learning for rural areas that do not have adequate ICT infrastructure; and secondly, new learning approaches introduced through interactive ICT tools such as Web 2.0, wikis and podcasts. The possibility of using alternative technology in addition to computers (mobile telephones, DVDs) is examined, as well as new approaches to learning (simulation, serious games). The first part of the report examines the existing literature on e-learning and what e-learning is about. Institutional users, learners and instructors/teachers are each considered separately. We then turn to the implementation of e-learning from the organisational point of view and focus on quality issues related to e-learning. The report includes a separate chapter on e-learning from the rural perspective, since most of Europe is, geographically speaking, rural, and the population of those areas is the one that could most benefit from the possibilities opened up by e-learning. The section titled "Alternative media", in accordance with the project terminology, looks at standalone technology that is of particular use to rural areas without a proper internet connection. It also evaluates the use of new tools and media in e-learning and takes a look at m-learning. Finally, the use of games, serious games and simulations in learning is considered. Practical examples and cases are displayed in boxes to facilitate pleasant reading.

Relevance:

10.00%

Publisher:

Abstract:

Optical examination of orthorhombic CsIO4 crystals has revealed the existence of ferroelastic domains. Their ferroelastic nature was confirmed by subjecting the crystal to external stresses. Our results strongly suggest that the transition at 150°C is of the species 4/mmmFmmm in Aizu's terminology.

Relevance:

10.00%

Publisher:

Abstract:

Pragmatism has sometimes been taken as a catchphrase for epistemological stances in which anything goes. Other authors argue, however, that the real novelty and contribution of this tradition lie in its view of action as the context in which all things human take place. Thus it is action, rather than, for example, discourses, that should be our starting point in social theory. The introductory section of the book situates pragmatism (especially the ideas of G. H. Mead and John Dewey) within the field and tradition of social theory. The introduction also contextualizes the main core of the book, which consists of four chapters. Two of these chapters have been published as articles in scientific journals and one in an edited book. All of them discuss the core problem of social theory: how is action related to social structures (and vice versa)? The argument is that habitual action explains the emergence of social structures from our action. Action produces structures, and social reproduction takes place when action is habitualized, that is, when we develop social dispositions to act in a certain manner in familiar environments. This also means that even though the physical environment is the same for all of us, our habits structure it into different kinds of action possibilities. Each chapter highlights these general insights from a different angle.

Practice theory has gained momentum in recent years, and it has much in common with pragmatism, because both highlight the situated and corporeal character of human activity. One famous proponent of practice theory is Margaret Archer, who has argued that the pragmatism of G. H. Mead leads to an oversocialized conception of selfhood. Mead does indeed present a socialized view of selfhood, but this is a meta-sociological argument rather than a substantive sociological claim. Accordingly, one can argue that in this general sense intersubjectivity precedes subjectivity, and not the other way around. Such a view does not imply that our social relations necessarily "colonize" individual action, because there is a place for internal conversations (in Archer's terminology): they arise especially in those phases of action where action meets obstacles due to changes in the environment.

The second theme rests on the background assumption that social structures can fruitfully be conceptualized as institutions. A general classification of different institution theories is presented, and it is argued that the problems associated with these theories call for a habitual theory of institutions. So-called habitual institutionalism accounts for institutions in terms of established and prevalent social dispositions that structure our social interactions. The germs of this institution theory can be found in the work of Thorstein Veblen. Since Veblen's time, these ideas have been discussed, for example, by the economist Geoffrey M. Hodgson. His ideas on the evolution of institutions are presented, but a critical stance is taken towards his tendency to define institutions with the help of rules, because rules are not always present in institutions. Accordingly, habitual action is the most basic, but by no means the only, aspect of institutional reproduction.

The third chapter deals with the theme of action and structures in the context of Pierre Bourdieu's thought. Bourdieu's term habitus refers to a system of dispositions that structure social fields. It is argued that habits come close to the concept of habitus in the sense that the latter consists of particular kinds of habits: those related to the reproduction of socioeconomic positions. Habits are thus constituents of a general theory of societal reproduction, whereas habitus is a systematic combination of socioeconomic habits.

The fourth theme relates to issues of social change and development. The capabilities approach, associated with the name of Amartya Sen among others, underscores the problems inherent in economistic ways of evaluating social development. However, Sen's argument has some theoretical problems. For example, his theory cannot adequately confront the problem of relativism, and his discussion also lacks a theory of the role of the public. With the help of arguments derived from pragmatism, one arrives at an action-based, socially constituted view of freedom in which the role of the public is essential. In general, it is argued that a socially constituted view of agency does not necessarily lead to pessimistic conclusions about the freedom of action.

Relevance:

10.00%

Publisher:

Abstract:

Despite increasing interest in the discursive aspects of strategy, few studies have examined strategy texts and their power effects. We draw on Critical Discourse Analysis to better understand the power of strategic plans as a directive genre. In our empirical analysis, we examined the creation of the official strategic plan of the City of Lahti in Finland. As a result of our inductive analysis, we identified five central discursive features of this plan: self-authorization, special terminology, discursive innovation, forced consensus and deonticity. We argue that these features can, with due caution, be generalized and conceived of as distinctive features of the strategy genre. We maintain that these discursive features are not trivial characteristics; they have important implications for the textual agency of strategic plans, their performative effects, their impact on power relations, and their ideological consequences.

Relevance:

10.00%

Publisher:

Abstract:

Gene expression is one of the most critical factors influencing the phenotype of a cell. As a result of several technological advances, measuring gene expression levels has become one of the most common molecular biological measurements for studying the behaviour of cells. The scientific community has produced an enormous and constantly growing collection of gene expression data from various human cells, in both healthy and pathological conditions. However, while each of these studies is informative and enlightening in its own context and research setup, diverging methods and terminologies make it very challenging to integrate existing gene expression data into a more comprehensive view of human transcriptome function. On the other hand, bioinformatic science advances only through data integration and synthesis. The aim of this study was to develop biological and mathematical methods to overcome these challenges, to construct an integrated database of the human transcriptome, and to demonstrate its usage.

The methods developed in this study can be divided into two distinct parts. First, the biological and medical annotation of the existing gene expression measurements needed to be encoded with systematic vocabularies. No single existing biomedical ontology or vocabulary was suitable for this purpose, so new annotation terminology was developed as part of this work. The second part was to develop mathematical methods for correcting the noise and the systematic differences and errors in the data caused by the various array generations. In addition, suitable computational methods had to be developed for sample collection and archiving, unique sample identification, database structures, and data retrieval and visualization. Bioinformatic methods were developed to analyze gene expression levels and putative functional associations of human genes using the integrated gene expression data. A method was also developed to interpret individual gene expression profiles across all the healthy and pathological tissues of the reference database.

As a result of this work, 9783 human gene expression samples measured on Affymetrix microarrays were integrated into a unique human transcriptome resource, GeneSapiens. This makes it possible to analyse the expression levels of 17330 genes across 175 types of healthy and pathological human tissues. Applying this resource to interpret individual gene expression measurements allowed identification of the tissue of origin with 92.0% accuracy among 44 healthy tissue types. A systematic analysis of the transcriptional activity levels of 459 kinase genes was performed across 44 healthy and 55 pathological tissue types, together with a genome-wide analysis of kinase gene co-expression networks. This analysis revealed biologically and medically interesting data on putative kinase gene functions in health and disease. Finally, we developed a method for alignment of gene expression profiles (AGEP) that analyses individual patient samples to pinpoint gene- and pathway-specific changes in the test sample relative to the reference transcriptome database. We also showed how large-scale gene expression data resources can be used to quantitatively characterize changes in the transcriptomic program of differentiating stem cells. Taken together, these studies demonstrate the power of systematic bioinformatic analyses to infer biological and medical insights from existing published datasets, as well as to facilitate the interpretation of new molecular profiling data from individual patients.
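The tissue-of-origin identification lends itself to a small illustration. The following is only a plausible sketch of the idea as a nearest-centroid correlation match, not the GeneSapiens implementation (whose details the abstract does not give); all function names and data are hypothetical:

```python
import numpy as np

def classify_tissue(sample, reference_centroids):
    """Assign a sample to the reference tissue whose mean expression
    profile it correlates with most strongly."""
    best_tissue, best_r = None, -1.0
    for tissue, centroid in reference_centroids.items():
        r = np.corrcoef(sample, centroid)[0, 1]  # Pearson correlation
        if r > best_r:
            best_tissue, best_r = tissue, r
    return best_tissue, best_r

# Hypothetical toy data: five genes, two reference tissue centroids.
reference = {
    "liver":  np.array([8.1, 2.3, 5.5, 1.0, 7.2]),
    "kidney": np.array([3.0, 6.8, 5.1, 4.4, 2.9]),
}
sample = np.array([7.9, 2.5, 5.2, 1.3, 7.0])
print(classify_tissue(sample, reference))  # ('liver', ~0.99)
```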

Relevance:

10.00%

Publisher:

Abstract:

In the thesis I study various quantum coherence phenomena and create some of the foundations for a systematic coherence theory. So far, the approach to quantum coherence in science has been purely phenomenological. In the thesis I try to answer the question of what quantum coherence is and how it should be approached within the framework of physics, the metatheory of physics, and the terminology related to them. It is worth noting that quantum coherence is a conserved quantity that can be exactly defined. I propose a way to define quantum coherence mathematically from the density matrix of the system. Degenerate quantum gases, i.e., Bose condensates and ultracold Fermi systems, form a good laboratory for studying coherence, since their entropy is small and their coherence is large, and thus they exhibit strong coherence phenomena. Concerning coherence phenomena in degenerate quantum gases, I concentrate mainly on collective association of atoms into molecules, Rabi oscillations, and decoherence. It appears that collective association and oscillations do not depend on the spin statistics of the particles. Moreover, I study the logical features of decoherence in closed systems via a simple spin model. I argue that decoherence is a valid concept even in systems that can experience recoherence, i.e., Poincaré recurrences. Metatheoretically this is a remarkable result, since it justifies quantum cosmology: studying the whole universe (i.e., physical reality) purely quantum physically is meaningful and valid science, in which decoherence explains why the quantum physical universe appears very classical-like to cosmologists and other scientists. The study of the logical structure of closed systems also reveals that sufficiently complex closed (physical) systems obey a principle similar to Gödel's incompleteness theorem in logic. According to the theorem, it is impossible to describe a closed system completely from within the system, and the inside and outside descriptions of the system can differ remarkably. Understanding this feature may make it possible to comprehend coarse-graining better and to define uniquely the mutual entanglement of quantum systems.
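The abstract does not state the proposed definition itself. For orientation only, one standard density-matrix-based measure from the literature (not necessarily the one adopted in the thesis) is the \ell_1-norm of coherence,

    C_{\ell_1}(\rho) = \sum_{i \neq j} \lvert \rho_{ij} \rvert ,

which vanishes exactly for states that are diagonal (fully decohered) in the chosen basis and grows with the off-diagonal amplitudes.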

Relevance:

10.00%

Publisher:

Abstract:

I discuss role responsibility, individual responsibility and collective responsibility in a corporate multinational setting. My case study concerns minerals used in electronics that come from the Democratic Republic of the Congo. What I try to show throughout the thesis is how many things need to be taken into consideration when we discuss the responsibility of individuals in corporations. No easy and simple answers are available. Instead, we must keep in mind the complexity of the situation at all times, judging cases on an individual basis, emphasizing the importance of individual judgement and virtue, as well as the responsibility we all share as members of groups and the wider society.

I begin by discussing the demands that are placed on us as employees. There is always a potential conflict between our different roles and the wider demands placed on us. Role demands are usually much more specific than the wider question of how we should act as human beings. The terminology of roles can also be misleading, as it can create the illusion that our work selves are somehow radically separated from our everyday, true selves. The nature of collective decision-making and its implications for responsibility are important too. When discussing the moral responsibility of an employee in a corporate setting, one must take into account arguments from individual and collective responsibility, as well as role ethics. Individual responsibility is not a notion separate from or competing with collective responsibility; rather, the two are interlinked. Individuals' responsibilities in collective settings combine both individual responsibility and collective responsibility (which is different from aggregate individual responsibility). In the majority of cases, both will apply in varying degrees. Some members might bear individual responsibility in addition to the collective responsibility, while others bear just the collective responsibility. There are also times when no one bears individual moral responsibility but the members are still responsible for the collective part. My intuition is that collective moral responsibility is strongly linked to the way the collective setting affects individual judgements and moulds decisions, and to how individuals use the collective setting to further their own ends. Individuals remain the moral agents, but responsibility is collective if the actions in question are collective in character.

I also explore the impact of bureaucratic ethics and their influence on the individual. Bureaucracies can compartmentalize work to such a degree that individual human action is reduced to mere behaviour. Responsibility is diffused, and the people working in the bureaucracy can come to view their actions as lying outside the normal human realm where they would be responsible for what they do. Language games and rules, anonymity, internal power struggles, and the fragmentation of information are just some of the reasons responsibility and morality can become blurry in big institutional settings.

Throughout the thesis I defend the following theses:
● People act differently depending on their roles. This is necessary for our society to function, but the more specific role demands should always be kept in check by the wider requirements of being a good human being.
● Acts in corporations (and other large collectives) are not reducible to individual actions and cannot be fully explained by the behaviour of individual employees.
● Individuals are responsible for the actions they undertake in the collective as role occupiers, and they are very rarely off the hook. Hiding behind role demands is usually only an excuse and shows a lack of virtue.
● Individuals in roles can be responsible even when the collective is not. This depends on whether the act they performed was corporate in nature or not.
● Bureaucratic structure affects individual thinking and is not always a healthy environment to work in.
● Individual members can share responsibility with the collective, and our share of the collective responsibility is strongly linked to our relations.
● Corporations and other collectives can be responsible for harm even when no individual is at fault. The structure and policies of the collective are crucial.
● Socialization plays an important role in our morality both at work and outside it. We are all responsible for the kind of moral context we create.
● When accepting a role or a position in a collective, we attach ourselves to the values of that collective.
● Ethical theories should put more emphasis on good judgement and decision-making instead of vague generalisations.

My conclusion is that the individual person is always at the centre when it comes to responsibility, and not so easily off the hook as we sometimes think. What we do, and especially whom we choose to associate ourselves with, does matter, and we should be more careful when choosing whom we work for. Individuals within corporations are responsible for choosing a corporation whose conduct they can subscribe to morally, if not fully, then at least for the most part. Individuals are also inclusively responsible, to varying degrees, for the collective activities they contribute to, even in overdetermined contexts. We are all responsible for the kinds of corporations we choose to support through our actions as consumers, investors and citizens.

Relevance:

10.00%

Publisher:

Abstract:

This paper introduces the META-NORD project, which develops the Nordic and Baltic part of the European open language resource infrastructure. META-NORD works on assembling, linking across languages, and making widely available the basic language resources used by developers, professionals and researchers to build specific products and applications. The goals of the project, the overall approach, and the specific focus lines on wordnets, terminology resources and treebanks are described. Moreover, the results achieved in the first five months of the project, i.e. language white papers, metadata specification and IPR, are presented.

Relevance:

10.00%

Publisher:

Abstract:

Language software applications encounter new words, e.g., acronyms, technical terminology, names, or compounds of such words. In order to add new words to a lexicon, we need to indicate their inflectional paradigm. We present a new, generally applicable method for creating an entry generator, i.e. a paradigm guesser, for finite-state transducer lexicons. As a guesser tends to produce numerous suggestions, it is important that the correct suggestions be among the first few candidates. We prove some formal properties of the method and evaluate it on Finnish, English and Swedish full-scale transducer lexicons. We use the open-source Helsinki Finite-State Technology to create finite-state transducer lexicons from existing lexical resources and to automatically derive guessers for unknown words. The method has a recall of 82-87% and a precision of 71-76% for the three test languages. The model needs no external corpus and can therefore serve as a baseline.
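To illustrate the general idea of paradigm guessing (this is not the HFST method, which operates on finite-state transducers; the toy lexicon and paradigm labels are invented), a minimal suffix-based guesser could look like this:

```python
from collections import defaultdict

def build_guesser(lexicon, max_suffix=5):
    """Index known (wordform, paradigm) pairs by their final substrings;
    longer suffix matches count as stronger evidence."""
    index = defaultdict(lambda: defaultdict(int))
    for word, paradigm in lexicon:
        for k in range(1, min(max_suffix, len(word)) + 1):
            index[word[-k:]][paradigm] += 1
    return index

def guess_paradigms(word, index, max_suffix=5):
    """Return candidate paradigms, best first: the longest matching suffix
    wins, with ties broken by frequency in the training lexicon."""
    for k in range(min(max_suffix, len(word)), 0, -1):
        if word[-k:] in index:
            counts = index[word[-k:]]
            return sorted(counts, key=counts.get, reverse=True)
    return []

# Invented toy lexicon with English-like paradigm labels.
lex = [("cats", "N-PL"), ("dogs", "N-PL"), ("runs", "V-3SG"), ("talks", "V-3SG")]
index = build_guesser(lex)
print(guess_paradigms("walks", index))  # ['V-3SG'] via the suffix "alks"
```

Ranking is the point: as the abstract notes, a guesser produces many suggestions, so the correct paradigm must appear among the first few candidates.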

Relevance:

10.00%

Publisher:

Abstract:

Even though dynamic programming offers an optimal control solution in state feedback form, the method is overwhelmed by computational and storage requirements. Approximate dynamic programming implemented with an Adaptive Critic (AC) neural network structure has evolved as a powerful alternative technique that obviates the need for excessive computation and storage in solving optimal control problems. In this paper, an improvement to the AC architecture, called the "Single Network Adaptive Critic" (SNAC), is presented. This approach is applicable to a wide class of nonlinear systems in which the optimal control (stationarity) equation can be explicitly expressed in terms of the state and costate variables. The terminology is motivated by the fact that SNAC eliminates one of the neural networks (namely the action network) that is part of a typical dual-network AC setup. As a consequence, the SNAC architecture offers three potential advantages: a simpler architecture, a lower computational load, and elimination of the approximation error associated with the eliminated network. In order to demonstrate these benefits and the control synthesis technique using SNAC, two problems are solved with the AC and SNAC approaches and their computational performances are compared. One of these problems is a real-life micro-electro-mechanical system (MEMS) problem, which demonstrates that the SNAC technique is applicable to complex engineering systems.
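To make the training idea concrete, here is a minimal sketch for a scalar linear-quadratic problem; the single critic weight stands in for the paper's neural network, and the system constants, learning rate and training range are hypothetical:

```python
import numpy as np

# Scalar system x_{k+1} = a*x_k + b*u_k with cost sum of (q*x_k^2 + r*u_k^2)/2.
# SNAC trains a single critic that maps the state x_k to the costate
# lambda_{k+1}; here the "network" is one weight w, i.e. lambda_{k+1} = w*x_k.
a, b, q, r = 0.9, 0.5, 1.0, 1.0   # hypothetical system and cost constants
w, lr = 0.0, 0.1                  # critic weight and learning rate

rng = np.random.default_rng(0)
for _ in range(2000):
    x = rng.uniform(-1.0, 1.0)      # sample a training state
    lam1 = w * x                    # critic output: costate at step k+1
    u = -b * lam1 / r               # optimal control (stationarity) equation
    x_next = a * x + b * u          # propagate through the dynamics
    lam2 = w * x_next               # critic evaluated at the next state
    target = q * x_next + a * lam2  # costate equation supplies the target
    w -= lr * (lam1 - target) * x   # gradient step on the squared residual

print(w)  # approaches about 1.25, consistent with the discrete Riccati solution
```

The loop mirrors the SNAC cycle described above: the stationarity equation yields the control from the critic's costate estimate, the dynamics yield the next state, and the costate equation yields the training target, with no separate action network.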

Relevance:

10.00%

Publisher:

Abstract:

In this paper we consider the problem of scheduling expression trees on delayed-load architectures. The problem tackled here takes root from the one considered in [Proceedings of the ACM SIGPLAN '91 Conf. on Programming Language Design and Implementation, 1991, p. 256], in which the leaves of the expression trees all refer to memory locations. A generalization involves trees that may contain register variables, with the registers being used only at the leaves. Solutions to this generalization are given in [ACM Trans. Prog. Lang. Syst. 17 (1995) 740; Microproc. Microprog. 40 (1994) 577]. This paper considers the most general case, in which the registers are reusable. This problem is tackled in [Comput. Lang. 21 (1995) 49], which gives an approximate solution under certain assumptions about the contiguity of the evaluation order. Here we propose an optimal solution (which may involve even a non-contiguous evaluation of the tree). The schedule generated by the algorithm given in this paper is optimal in the sense that it is an interlock-free schedule that uses the minimum number of registers required. An extension to the algorithm incorporates spilling. The problem as stated in this paper is an instruction scheduling problem; however, it could also be rephrased as an operations research problem with a difference in terminology. (C) 2002 Elsevier Science B.V. All rights reserved.
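For contrast with the delayed-load setting, the classical register-need computation on expression trees (Sethi-Ullman labeling, which ignores load delays and is not the algorithm of this paper) fits in a few lines:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    op: str
    left: Optional["Node"] = None
    right: Optional["Node"] = None

def register_need(n: Node) -> int:
    """Sethi-Ullman labeling: the minimum number of registers needed to
    evaluate the tree without spills, loading every leaf into a register."""
    if n.left is None and n.right is None:
        return 1
    l, r = register_need(n.left), register_need(n.right)
    # Evaluate the costlier subtree first so its result waits in a single
    # register; equal needs force one extra register.
    return max(l, r) if l != r else l + 1

# (a + b) * (c + d): each sum needs 2 registers, the product needs 3.
t = Node("*", Node("+", Node("a"), Node("b")), Node("+", Node("c"), Node("d")))
print(register_need(t))  # 3
```

The paper's harder problem starts where this model stops: loads incur delay slots, registers are reusable, and an interlock-free schedule may require a non-contiguous evaluation order.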

Relevance:

10.00%

Publisher:

Abstract:

The problem of semantic interoperability arises when integrating applications across different task domains over the product life cycle. A new shape-function-relationship (SFR) framework is proposed as a taxonomy on which an ontology is developed. An ontology based on the SFR framework, capturing explicit definitions of terminology and knowledge relationships in terms of shape, function and relationship descriptors, offers an attractive approach to solving the semantic interoperability problem. Since all instances of terms are based on a single taxonomy with a formal classification, mapping terms requires only a simple check on the attributes used in the classification. As a preliminary study, the framework is used to develop an ontology of terms used in the aero-engine domain, and the ontology is used to resolve the semantic interoperability problem in the integration of design and maintenance. Since the framework allows a single term to have multiple classifications, handling context-dependent usage of terms becomes possible. Automating the classification of terms and establishing the completeness of the classification scheme are currently being addressed.
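The mapping step can be illustrated with a toy sketch; the descriptor values and term names below are hypothetical, not taken from the aero-engine ontology:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SFRTerm:
    """A term classified by shape, function and relationship descriptors
    drawn from a single shared taxonomy."""
    name: str
    shape: frozenset
    function: frozenset
    relationship: frozenset

def same_concept(a: SFRTerm, b: SFRTerm) -> bool:
    # With one underlying taxonomy, mapping terms across task domains
    # reduces to comparing descriptor sets rather than names.
    return (a.shape == b.shape and a.function == b.function
            and a.relationship == b.relationship)

# Hypothetical design-domain and maintenance-domain names for one concept.
design_term = SFRTerm("blade", frozenset({"aerofoil"}),
                      frozenset({"generate lift"}), frozenset({"attached to disc"}))
maint_term = SFRTerm("aerofoil member", frozenset({"aerofoil"}),
                     frozenset({"generate lift"}), frozenset({"attached to disc"}))
print(same_concept(design_term, maint_term))  # True: same concept, different names
```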

Relevance:

10.00%

Publisher:

Abstract:

Moore's Law has driven the semiconductor revolution, enabling over four decades of scaling in frequency, size, complexity, and power. However, the limits of physics are preventing further scaling of speed, forcing a paradigm shift towards multicore computing and parallelization. In effect, the system is taking over the role that the single CPU used to play: high-speed signals running not only through chips but also through packages and boards connect ever more complex systems. High-speed signals making their way through the entire system pose new challenges for the design of computing hardware. Inductance, phase shifts and velocity-of-light effects, material resonances, and wave behavior not only become prevalent but must be calculated accurately and rapidly to enable short design cycle times. In essence, continued scaling with Moore's Law requires the incorporation of Maxwell's equations into the design process. Incorporating Maxwell's equations into the design flow is only possible through the combined power of new algorithms, parallelization and high-speed computing. At the same time, incorporating Maxwell-based models into circuit- and system-level simulation presents a massive accuracy, passivity, and scalability challenge. In this tutorial, we navigate through the often confusing terminology and concepts behind field solvers, show how advances in field solvers enable integration into EDA flows, present novel methods for model generation and passivity assurance in large systems, and demonstrate the power of cloud computing in enabling the next generation of scalable Maxwell solvers and the next generation of Moore's Law scaling of systems. We intend to show the truly symbiotic, growing relationship between Maxwell and Moore!

Relevance:

10.00%

Publisher:

Abstract:

The advent of nanotechnology has necessitated a better understanding of how changes in material microstructure at the atomic level affect the macroscopic properties that control performance. This challenge has uncovered many phenomena that were not previously understood and had been taken for granted, among them the basic foundations of dislocation theories, which are now known to be inadequate. Simplifying assumptions invoked at the macroscale may not be applicable at the micro- and/or nanoscale. There are implications of scaling hierarchy associated with the inhomogeneity and nonequilibrium of physical systems: what is taken to be homogeneous and in equilibrium at the macroscale may not be so when the physical size of the material is reduced to microns. These fundamental issues cannot be dispensed with at will for the sake of convenience, because they could alter the outcome of predictions. Even more unsatisfying is the lack of consistency in modeling physical systems. This could translate into an inability to identify the relevant manufacturing parameters, rendering the end product impractical because of high cost; advanced composite and ceramic materials are cases in point. Potential pitfalls in applying models at both the atomic and continuum levels are discussed. No attempt is made to unravel the truth of nature, be it particulates, a smooth continuum, or a combination of both. The present trend of development in scaling tends to seek different characteristic lengths of material microstructures, with or without the influence of time effects. Much can be learned from atomistic simulation models that show how results can differ as boundary conditions and scales are changed. Quantum mechanics, continuum and cosmological models provide evidence that no general approach is in sight. Of immediate interest is perhaps the establishment of greater precision in terminology, so as to better communicate results involving multiscale physical events.