964 results for Human Language Technologies


Relevance:

30.00%

Publisher:

Abstract:

Since the human animal devised a system of technologies for abstract thought through language, the war against the wild has become a one-way road toward alienation, civilization, and literature. The aim of this work is to analyze how civilizational narratives give structure to experience through segregation, domestication, selection, and extermination, while wild narratives demonstrate the infinite possibilities of chaos for discovering the world in all its diversity and in connection with its community of life. One objective of this thesis has been to bridge the gap between science and literature and to examine the interdependence of fiction and reality. Another has been to place these narratives in dialogue with one another, to trace their expression across different disciplines and in works for children and adults, and to analyze their manifestations in real life. This multidisciplinary effort is reflected in the combination of research methods from anthropology and literary studies. The analysis compares and contrasts three works of children's fiction presenting three different socio-economic paradigms: Milne's "Winnie-the-Pooh", which sets up a civilized monarcho-capitalist world; Nosov's trilogy on the adventures of Neznaika and his friends, which presents the challenges and exploits of an anarcho-socialist society evolving from primitivism toward technology; and Jansson's Moomin books, which represent chaos, anarchy, and the wild state that contains everything, including episodes of civilization.
Centering my research methodology on how we come to know the world, I first examined the construction, transmission, and acquisition of knowledge, in particular through Bourdieu's theory of praxis and the critique of civilization developed in the studies of Zerzan, Ong, and Goody on the links between literacy, debt, and oppression. As for the children's literature, I chose three books I knew in my own childhood, that is, books that became a kind of "mother tongue" for me; in this sense, the work is also an "anthropology of the native field". I further analyze the underlying premises found not only in the three books but in the unfolding of narratives of wildness and civilization in real life, analyses that appear in this thesis as excerpts from an ethnographic journal. Just as I examine the nature of literature and of the civilized structures that domesticate the world by means of death threats, I also trace the presence of these narratives in scientific expression (the Malthusian-Darwinian narrative), in religious expression, and in other cultural expressions, and reflect on the challenges posed by anarchist theory (Kropotkin) as well as by children's books written from the wild point of view, such as those of the Moomins.

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents the research and reflections surrounding the design of an ontology-based application dedicated to e-recruitment in the field of information technology staffing services in the era of the Social Web. The application, named Combine, essentially aims to optimize and enrich the Computer-Mediated Communication (CMC) of actors in the field and uses concepts from the emerging technological paradigm of the Semantic Web. Still little discussed from a CMC perspective, this paradigm's communicational stakes are examined here. The thesis presents its main concepts, including the notion of ontology, which involves the formal modeling of knowledge, and reports the development of Combine: how the application was built, from requirements analysis to evaluation of the prototype by the targeted users, while revealing the concerns, constraints, and opportunities encountered along the way. Following this examination, the thesis critically assesses Combine's potential to optimize CMC in the targeted field of activity. On the whole, it paints a rather favorable picture, both of practitioners' positive perception of using such an application and of the clear Human-Computer Interaction (HCI) benefits it promises. It warns, however, of a certain exacerbation of the problem of "ontological commitment" to be considered when building ontologies that model social objects such as those that populate the world of recruitment.
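The abstract's central notion, an ontology as a formal model of knowledge, can be illustrated with a minimal sketch. The triples and recruitment concepts below are invented for illustration; Combine's actual ontology is not shown in the abstract.

```python
# Minimal sketch of formal knowledge modeling with subject-predicate-object
# triples, in the spirit of Semantic Web ontologies. The class names are
# hypothetical recruitment concepts, not Combine's real vocabulary.
triples = {
    ("JavaDeveloper", "subClassOf", "Developer"),
    ("Developer", "subClassOf", "ITProfessional"),
    ("alice", "hasRole", "JavaDeveloper"),
}

def superclasses(cls):
    # Follow subClassOf links transitively, collecting every ancestor class.
    result = set()
    frontier = {cls}
    while frontier:
        nxt = {o for (s, p, o) in triples
               if p == "subClassOf" and s in frontier}
        frontier = nxt - result
        result |= nxt
    return result

print(superclasses("JavaDeveloper"))
```

Even this toy hierarchy shows why a matching engine benefits from an ontology: a candidate tagged as a "JavaDeveloper" can be inferred to satisfy a job posting that asks for any "ITProfessional".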

Relevance:

30.00%

Publisher:

Abstract:

Genetic studies, such as linkage or association studies, have provided greater knowledge of the etiology of several diseases affecting human populations. Yet even though some ten thousand genetic studies have been conducted on hundreds of diseases and other traits, a large part of their heritability remains unexplained. Over the past decade, several breakthroughs in genomics have been achieved. For example, high-density comparative genomic hybridization microarrays demonstrated the large-scale existence of copy-number variations and polymorphisms, which are now detectable by DNA microarray or high-throughput sequencing. Moreover, recent high-throughput sequencing studies have shown that the majority of the variants present in an individual's exome are rare or even unique to that individual. This made it possible to design a new DNA microarray that quickly and inexpensively genotypes several thousand rare variants across a large set of individuals at once. In this context, the general objective of this thesis is the development of new methodologies and new high-performance bioinformatics tools for detecting, to high quality standards, copy-number variations and rare nucleotide variants in genetic studies. In the long term, these advances will help explain a larger share of the missing heritability of complex traits, furthering knowledge of their etiology. An algorithm for partitioning copy-number polymorphisms was therefore designed, making it possible to use these structural variations in genetic linkage studies on family data.
Next, an exploratory study characterized the various problems associated with genetic studies that use rare copy-number variations in unrelated individuals. This study was carried out in collaboration with the Wellcome Trust Centre for Human Genetics at the University of Oxford. A comparison of the performance of genotyping algorithms was then carried out on a new DNA microarray containing a majority of rare markers. Finally, a bioinformatics tool for filtering genetic data efficiently and quickly was implemented. This tool produces higher-quality data with better reproducibility of results while reducing the chances of obtaining a false association.
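The kind of quality filtering the abstract describes can be sketched in a few lines. The genotype encoding, thresholds, and function names below are illustrative assumptions, not the thesis tool itself: markers with low call rates or very rare minor alleles are common sources of spurious associations, so a QC step drops them before analysis.

```python
# Minimal sketch of genotype QC filtering, assuming genotypes are coded as
# 0/1/2 (counts of one allele) with None for missing calls.

def call_rate(genotypes):
    # Fraction of samples with a non-missing call at this marker.
    return sum(g is not None for g in genotypes) / len(genotypes)

def minor_allele_freq(genotypes):
    called = [g for g in genotypes if g is not None]
    freq = sum(called) / (2 * len(called))  # frequency of the coded allele
    return min(freq, 1 - freq)

def passes_qc(genotypes, min_call_rate=0.95, min_maf=0.01):
    # Keep a marker only if it is well genotyped and not vanishingly rare.
    return (call_rate(genotypes) >= min_call_rate
            and minor_allele_freq(genotypes) >= min_maf)

# One marker across 20 samples: one missing call, three carriers.
marker = [0] * 16 + [1, 1, 2, None]
print(passes_qc(marker))
```

Real pipelines add further filters (Hardy-Weinberg equilibrium, sample-level call rates, relatedness checks), but the principle is the same: remove low-quality data before testing for association.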

Relevance:

30.00%

Publisher:

Abstract:

When hurricanes strike the built and natural environment, public authorities sometimes have no choice but to order the mandatory evacuation of the population in at-risk areas. Because of the unpredictability of how a disaster unfolds and of human behavior, evacuation operations face significant uncertainty. Past experience has shown that information and communication technologies (ICTs) have the potential to advance the state of the art in evacuation management. Despite this recognition, empirical research on the subject remains limited to date. This case study of New York City explores how integrating ICTs into the operational planning of organizations with transportation responsibilities can improve their response to events and influence the overall success of the disaster management system. The analysis is based on information gathered through semi-structured interviews with New York City's transportation and disaster management organizations as well as with academic experts. The results highlight the potential of ICTs for internal decision-making. Although ICTs are widely recognized as effective means of exchanging information within and between organizations, these uses face certain technological, organizational, structural, and systemic constraints. This observation made it possible to identify the constraints experienced in everyday practices of urban systems management.

Relevance:

30.00%

Publisher:

Abstract:

Information and communication technologies (ICTs) are the tools that underpin the emerging "Knowledge Society". Exchange of information or knowledge between people, and through networks of people, has always taken place, but ICTs have radically changed the magnitude of this exchange, so factors such as the timeliness of information and patterns of information dissemination have become more important than ever. Since information and knowledge are so vital for all-round human development, libraries and the institutions that manage these resources are invaluable, and Library and Information Centres have a key role in the acquisition, processing, preservation, and dissemination of information and knowledge. In the modern context, libraries provide services based on different types of documents, such as manuscripts, printed materials, and digital resources; at the same time, the acquisition, access, processing, and servicing of these resources have become more complicated than ever before. ICTs have been instrumental in extending libraries beyond the physical walls of a building and in helping users navigate and analyze tremendous amounts of knowledge with a variety of digital tools. Modern libraries are thus increasingly being redefined as places offering unrestricted access to information in many formats and from many sources. The research was conducted in the university libraries of Kerala State, India. It found that even though information resources are flooding in the world over, and several technologies have emerged to manage the situation and provide effective services to clientele, most of the university libraries in Kerala were unable to exploit these technologies to the maximum. Though the libraries have automated many of their functions, a wide gap prevails between the possible services and the services actually provided.
There are many good examples worldwide of applying ICTs in libraries to maximize services, and many such libraries have adopted the principles of re-engineering and redefinition as a management strategy. This study therefore examined how effectively our libraries have adopted modern ICTs to maximize the efficiency of their operations and services, and whether the principles of re-engineering and redefinition can be applied to this end. Data were collected from library users (students as well as faculty), library professionals, and university librarians using structured questionnaires, supplemented by observation of the working of the libraries, discussions and interviews with the different types of users and staff, a review of the literature, and so on. Personal observations were made of the organizational set-up, management practices, functions, facilities, resources, and the users' utilization of information resources and facilities in the university libraries of Kerala. Statistical techniques such as percentage, mean, weighted mean, standard deviation, correlation, and trend analysis were used to analyze the data. All the libraries could exploit only a very few of the possibilities of modern ICTs, and hence they could not achieve effective Universal Bibliographic Control or the desired efficiency and effectiveness in services; as a result, users as well as professionals are dissatisfied.
Functional effectiveness in the acquisition, access, and processing of information resources in various formats, the development and maintenance of OPACs and WebOPACs, digital document delivery to remote users, Web-based handling of library counter services and resources, the development of full-text databases, digital libraries, and institutional repositories, consortium-based operations for e-journals and databases, user education and information literacy, professional development with stress on ICTs, network administration and website maintenance, and the marketing of information are major areas that need special attention to improve the situation. Finance, the level of ICT knowledge among library staff, professional dynamism and leadership, the vision and support of administrators and policy makers, and the prevailing educational set-up and social environment in the state are among the major hurdles to reaping the maximum possibilities of ICTs in the university libraries of Kerala. The principles of Business Process Re-engineering were found suitable for restructuring and redefining the operations and service systems of the libraries. Most of the conventional departments or divisions in the university libraries were functioning as watertight compartments, and their existing management systems were too rigid to adopt the principles of change management; hence, a thorough restructuring of the divisions was indicated. Consortium-based activities and the pooling and sharing of information resources were advocated to meet the varied needs of users on the main and off campuses of the universities, in affiliated colleges, and at remote stations. A uniform staff policy, similar to that prevailing in CSIR, DRDO, ISRO, etc., was proposed by the study, not only for the university libraries in Kerala but for the entire country. The restructuring of LIS education and the integrated, planned development of school, college, research, and public library systems were also justified as means of reaping the maximum benefits of modern ICTs.
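Among the statistical techniques the study lists, the weighted mean is simple enough to sketch. The Likert ratings and respondent counts below are invented for illustration, not taken from the study's data.

```python
def weighted_mean(values, weights):
    # Weighted mean: sum(w_i * x_i) / sum(w_i).
    assert len(values) == len(weights)
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

# Hypothetical 5-point Likert item: how many respondents chose each rating.
ratings = [1, 2, 3, 4, 5]      # rating values
counts  = [4, 10, 25, 40, 21]  # respondents per rating (100 total)
print(weighted_mean(ratings, counts))  # overall satisfaction score
```

Weighting each rating by its respondent count gives a single summary score per questionnaire item, which is what makes items with different response distributions comparable.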

Relevance:

30.00%

Publisher:

Abstract:

The goal of the work reported here is to capture the commonsense knowledge of non-expert human contributors. Achieving this goal will enable more intelligent human-computer interfaces and pave the way for computers to reason about our world. In the domain of natural language processing, it will provide the world knowledge much needed for semantic processing of natural language. To acquire knowledge from contributors not trained in knowledge engineering, I take the following four steps: (i) develop a knowledge representation (KR) model for simple assertions in natural language, (ii) introduce cumulative analogy, a class of nearest-neighbor based analogical reasoning algorithms over this representation, (iii) argue that cumulative analogy is well suited for knowledge acquisition (KA) based on a theoretical analysis of effectiveness of KA with this approach, and (iv) test the KR model and the effectiveness of the cumulative analogy algorithms empirically. To investigate the effectiveness of cumulative analogy for KA empirically, Learner, an open-source system for KA by cumulative analogy, has been implemented, deployed, and evaluated. (The site, "1001 Questions," is available at http://teach-computers.org/learner.html). Learner acquires assertion-level knowledge by constructing shallow semantic analogies between a KA topic and its nearest neighbors and posing these analogies as natural language questions to human contributors. Suppose, for example, that based on the knowledge about "newspapers" already present in the knowledge base, Learner judges "newspaper" to be similar to "book" and "magazine." Further suppose that assertions "books contain information" and "magazines contain information" are also already in the knowledge base. Then Learner will use cumulative analogy from the similar topics to ask humans whether "newspapers contain information."
Because similarity between topics is computed based on what is already known about them, Learner exhibits bootstrapping behavior --- the quality of its questions improves as it gathers more knowledge. By summing evidence for and against posing any given question, Learner also exhibits noise tolerance, limiting the effect of incorrect similarities. The KA power of shallow semantic analogy from nearest neighbors is one of the main findings of this thesis. I perform an analysis of commonsense knowledge collected by another research effort that did not rely on analogical reasoning and demonstrate that there is indeed a sufficient amount of correlation in the knowledge base to motivate using cumulative analogy from nearest neighbors as a KA method. Empirically, the percentages of questions answered affirmatively, answered negatively, and judged nonsensical in the cumulative analogy case compare favorably with a no-similarity baseline that relies on random objects rather than nearest neighbors. Of the questions generated by cumulative analogy, contributors answered 45% affirmatively, 28% negatively, and marked 13% as nonsensical; in the no-similarity control, 8% of questions were answered affirmatively, 60% negatively, and 26% were marked as nonsensical.
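The newspaper example above can be sketched in a few lines. The toy knowledge base and function names below are illustrative assumptions, not Learner's actual implementation: neighbors are ranked by assertion overlap, and evidence from several neighbors is summed to rank candidate questions.

```python
from collections import Counter

# Toy knowledge base: topic -> set of things asserted about it.
kb = {
    "book":      {"contain information", "have pages", "are sold in stores"},
    "magazine":  {"contain information", "have pages", "are published monthly"},
    "hammer":    {"are tools", "have handles"},
    "newspaper": {"have pages"},
}

def similarity(a, b):
    # Shallow semantic similarity: number of shared assertions.
    return len(kb[a] & kb[b])

def cumulative_analogy(topic, k=2):
    # Rank the other topics by overlap with what is known about `topic`.
    neighbors = sorted((t for t in kb if t != topic),
                       key=lambda t: similarity(topic, t), reverse=True)[:k]
    # Sum evidence across neighbors for assertions not yet known about
    # `topic`; each (assertion, votes) pair is a candidate question.
    votes = Counter()
    for n in neighbors:
        for assertion in kb[n] - kb[topic]:
            votes[assertion] += 1
    return votes.most_common()

# "newspaper" is nearest to "book" and "magazine", both of which contain
# information, so that candidate question gets the strongest evidence.
print(cumulative_analogy("newspaper"))
```

Summing votes across neighbors is also what gives the noise tolerance described above: one spurious neighbor contributes a single vote, so its assertions rank below those supported by several neighbors.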

Relevance:

30.00%

Publisher:

Abstract:

This is one of a series of short case studies describing how academic tutors at the University of Southampton have made use of learning technologies to support their students.

Relevance:

30.00%

Publisher:

Abstract:

This document reports an investigation of managers' attitudes toward the adoption of e-learning as a working tool in organizations in Bogotá. A survey of 101 managers was conducted using convenience sampling, with the aim of identifying their attitudes toward the use of e-learning and its influence within the organization. The results show that managers' attitudes influence the use of e-learning tools, as well as the actions that promote their use and the attitudes of employees; it was also found that beliefs related to the appropriation of e-learning tools, and the factors that facilitate their use, influence managers' attitudes. These conclusions follow from analyses of the results contrasted with the empirical studies found in the literature and the theoretical framework developed.

Relevance:

30.00%

Publisher:

Abstract:

Knowledge creation within organizations becomes visible through the proper management of individuals' knowledge; each individual, however, must interact in such a way as to form a network or system of organizational knowledge that consolidates firms over the long term in the environment in which they operate. This document reviews central elements of knowledge management as seen by various authors and from various perspectives, and identifies key points for designing a knowledge management model for a company in the chemical inputs sector serving the pharmaceutical, cosmetics, and food industries in Bogotá.

Relevance:

30.00%

Publisher:

Abstract:

How do resource booms affect human capital accumulation? We exploit time and spatial variation generated by the commodity boom across local governments in Peru to measure the effect of natural resources on human capital formation. We explore the effect of both mining production and tax revenues on test scores, finding a substantial and statistically significant effect for the latter. Transfers to local governments from mining tax revenues are linked to an increase in math test scores of around 0.23 standard deviations. We find that the hiring of permanent teachers, as well as increases in parental employment and improvements in the health outcomes of adults and children, are plausible mechanisms for such a large effect on learning. These findings suggest that redistributive policies could facilitate the accumulation of human capital in resource-abundant developing countries as a way to avoid the natural resource curse.

Relevance:

30.00%

Publisher:

Abstract:

An evolutionary perspective on human thought and behaviour indicates that we should expect to find universal systems of perception, classification, and decision-making regarding the natural world. It is the interaction between these evolved aspects of the human mind, the biodiversity of the natural world, and unique historical, social, and economic contexts within which individuals develop and act that gives rise to cultural diversity. The palaeoanthropological record also indicates that language is a recently evolved phenomenon. This suggests that linguistic approaches in ethnobiology are likely to provide only a partial understanding of how humans perceive, classify, and engage with the natural world.

Relevance:

30.00%

Publisher:

Abstract:

The human gut microbiota comprises a diverse microbial consortium closely co-evolved with the human genome and diet. The importance of the gut microbiota in regulating human health and disease has however been largely overlooked due to the inaccessibility of the intestinal habitat, the complexity of the gut microbiota itself and the fact that many of its members resist cultivation and are in fact new to science. However, with the emergence of 16S rRNA molecular tools and "post-genomics" high resolution technologies for examining microorganisms as they occur in nature without the need for prior laboratory culture, this limited view of the gut microbiota is rapidly changing. This review will discuss the application of molecular microbiological tools to study the human gut microbiota in a culture independent manner. Genomics or metagenomics approaches have a tremendous capability to generate compositional data and to measure the metabolic potential encoded by the combined genomes of the gut microbiota. Another post-genomics approach, metabonomics, has the capacity to measure the metabolic kinetic or flux of metabolites through an ecosystem at a particular point in time or over a time course. Metabonomics thus derives data on the function of the gut microbiota in situ and how it responds to different environmental stimuli e.g. substrates like prebiotics, antibiotics and other drugs and in response to disease. Recently these two culture independent, high resolution approaches have been combined into a single "transgenomic" approach which allows correlation of changes in metabolite profiles within human biofluids with microbiota compositional metagenomic data. Such approaches are providing novel insight into the composition, function and evolution of our gut microbiota.

Relevance:

30.00%

Publisher:

Abstract:

Modern organisms are adapted to a wide variety of habitats and lifestyles. The processes of evolution have led to the complex, interdependent, well-designed mechanisms of today's world, and the research challenge is to transpose these innovative solutions to problems in the context of architectural design practice, that is, to relate design by nature to design by human. In a design-by-human environment, design synthesis can be performed with rapid prototyping techniques that transform almost instantaneously any 2D design representation into a physical three-dimensional model through a rapid prototyping printer. Rapid prototyping processes add layers of material one on top of another until a complete model is built, and an analogy can be drawn with design by nature, where the natural laying down of earth layers shapes the earth's surface, a process occurring repeatedly over long periods of time. Concurrence in design will particularly benefit from rapid prototyping techniques, as the prime purpose of physical prototyping is to promptly assist iterative design, enabling design participants to work with a three-dimensional hardcopy and use it to validate their design ideas. Concurrent design is a systematic approach that aims to facilitate the simultaneous involvement and commitment of all participants in the building design process, enabling both an effective reduction of time and cost at the design phase and an improvement in the quality of the design product. This paper presents the results of an exploratory survey investigating both how computer-aided design systems help designers fully define the shape of their design ideas and the extent to which design practice applies rapid prototyping technologies coupled with Internet facilities. The findings suggest that design practitioners recognize that these technologies can greatly enhance concurrence in design, though they acknowledge a lack of knowledge in relation to rapid prototyping.

Relevance:

30.00%

Publisher:

Abstract:

Education and ethnicity cannot be discussed without taking language into account. This paper will argue that any discussion of ethnic minorities cannot ignore the question of language, nor can any discussion of human rights ignore the question of language rights. Unfortunately, in today's globalised world, governments and minorities are faced with conflicting pressures: on the one hand, for the development and use of education in a global/international language; on the other for the use and development of mother tongue, local or indigenous languages in education. Language complexity and ethnic plurality were largely brought about as a result of the creation of nation-states, which were spread around the world as a result of European colonialism. European languages and formal education systems were used as a means of political and economic control. The legacy that was left by the colonial powers has complicated ethnic relations and has frequently led to conflict. While there is now greater recognition of the importance of language both for economic and educational development, as well as for human rights, the forces of globalisation are leading towards uniformity in the languages used, in culture and even in education. They are working against the development of language rights for smaller groups. We are witnessing a sharp decline in the number of languages spoken. Only those languages which are numerically, economically and politically strong are likely to survive. As a result many linguistic and ethnic groups are in danger of being further marginalised. This paper will illustrate this thesis both historically and from several contemporary societies, showing how certain policies have exacerbated ethnic conflict while others are seeking to promote harmony and reconciliation. Why this should be so will be explored. (c) 2006 Elsevier Ltd. All rights reserved.
