865 results for diffusion of information
Abstract:
Multispectral satellite images, particularly those with high spatial resolution (finer than 30 m on the ground), are an invaluable source of information for decision-making in many fields related to natural resource management, environmental protection, and the planning and management of urban centres. Study scales range from local (resolutions finer than 5 m) to regional (resolutions coarser than 5 m). These images characterize the variation of object reflectance across the spectrum, which is the key information for a large number of applications of these data. However, satellite sensor measurements are also affected by "parasitic" factors related to illumination and viewing conditions, the atmosphere, topography, and sensor properties. Two questions drove this research. What is the best approach for retrieving ground reflectances from the digital numbers recorded by the sensors while accounting for these parasitic factors? And is this retrieval a sine qua non for extracting reliable information from the images, whatever the problem addressed by the various application domains (land mapping, environmental monitoring, landscape change detection, resource inventories, etc.)? Research over the past 30 years has produced a series of techniques for correcting data for the effects of parasitic factors, some of which can retrieve ground reflectances. Several questions remain open, however, and others require further work, both to improve the accuracy of the results and to make these techniques more versatile by adapting them to a wider range of data acquisition conditions. Among them:
- How can atmospheric characteristics (particularly aerosol particles) suited to local and regional conditions be taken into account, rather than relying on default models that capture long-term spatiotemporal trends but fit poorly to instantaneous, spatially restricted observations?
- How can the "contamination" of the signal from the target object by signals from surrounding objects (the adjacency effect) be accounted for? This phenomenon becomes very important for images with resolutions finer than 5 m.
- What are the effects of off-nadir sensor viewing angles, which are increasingly common since they offer better temporal resolution and the possibility of acquiring stereoscopic image pairs?
- How can automatic processing and analysis techniques for multispectral images be made more effective over rugged and mountainous terrain, given the multiple effects of topographic relief on the remotely sensed signal?
Moreover, although researchers have repeatedly demonstrated that the information extracted from satellite images can be degraded by all of these parasitic factors, radiometric corrections are still rarely applied on a routine basis, unlike geometric corrections, for which commercial remote sensing software offers versatile, powerful algorithms within users' reach.
Radiometric correction algorithms, when they are offered at all, remain inflexible black boxes that usually require expert users. The objectives of this research were: 1) to develop ground reflectance retrieval software that addresses the questions raised above, modular enough to be extended, improved, and adapted to various satellite image applications; and 2) to apply this software in different contexts (urban, agricultural, forest) and analyse the results, in order to assess the gain in accuracy of the information extracted from satellite images converted into ground reflectance images, and hence whether this operation is necessary regardless of the application. Through this research we therefore built a ground reflectance retrieval tool (the new version of the REFLECT software). This software is based on the formulation (and routines) of the 6S code (Second Simulation of the Satellite Signal in the Solar Spectrum) and on the dark-target method for estimating the aerosol optical depth (AOD), the factor that is most difficult to correct. Substantial improvements were made to the existing models. These improvements mainly concern aerosol properties (integration of a more recent model, improved dark-target search for AOD estimation), the adjacency effect, handled with a specular reflection model, support for most of the high-resolution multispectral sensors currently in use (Landsat TM and ETM+, all the SPOT 1 through 5 HR sensors, EO-1 ALI and ASTER) as well as very-high-resolution sensors (QuickBird and Ikonos), and the correction of topographic effects using a model that separates the direct and diffuse components of solar radiation and also adapts to forest canopy. Validation work showed that REFLECT retrieves ground reflectance to within about ±0.01 reflectance units (for the visible, near-infrared and mid-infrared spectral bands), even over variable topography. Through simulations of apparent reflectances, the software demonstrated how strongly the parasitic factors affecting image digital numbers can alter the useful signal, the ground reflectance (errors of 10% to more than 50%). REFLECT was also used to assess the importance of using ground reflectances rather than raw digital numbers in common remote sensing applications in classification, change detection, agriculture, and forestry. In most applications (multi-date change detection, vegetation indices, biophysical parameter estimation, ...), image correction is a crucial step for obtaining reliable results.
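To make the correction formulation concrete, here is a minimal Python sketch of the standard 6S-style inversion from top-of-atmosphere (TOA) reflectance to ground reflectance, assuming the atmospheric terms (gaseous transmittance, path reflectance, scattering transmittances, spherical albedo) have already been computed by a radiative-transfer code; the numeric values are illustrative placeholders, not REFLECT outputs.

```python
def surface_reflectance(rho_toa, t_gas, rho_path, t_down, t_up, s_alb):
    """Invert the standard 6S-style TOA reflectance equation
        rho_toa = t_gas * (rho_path + t_down * t_up * rho_s / (1 - s_alb * rho_s))
    for the surface reflectance rho_s (uniform Lambertian target)."""
    y = rho_toa / t_gas - rho_path           # strip gaseous absorption and atmospheric path signal
    return y / (t_down * t_up + s_alb * y)   # undo scattering transmittances and multiple bounces

# Illustrative red-band values for a clear atmosphere (placeholders):
rho_s = surface_reflectance(rho_toa=0.12, t_gas=0.95, rho_path=0.04,
                            t_down=0.85, t_up=0.90, s_alb=0.10)
print(f"ground reflectance ~ {rho_s:.3f}")   # ~0.112 here
```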
From a software standpoint, REFLECT is organised as a series of easy-to-use menus corresponding to the successive processing steps: entering the scene parameters, computing gaseous transmittances, estimating the AOD with the dark-target method, and finally applying the radiometric corrections to the image, notably through a fast option that processes a 5000 × 5000 pixel image in about 15 minutes. This research opens several avenues for further improvement of the models and methods of radiometric correction, in particular the integration of the BRDF (bidirectional reflectance distribution function) into the formulation, the handling of translucent clouds through modelling of non-selective scattering, and the automation of the equivalent-slopes method proposed for topographic corrections.
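The dark-target step mentioned above can be sketched as follows, under simplified Kaufman-style assumptions (dense vegetation selected by a low SWIR threshold, red surface reflectance predicted from SWIR by a fixed ratio); the threshold, ratio and function names are illustrative, and a full chain such as REFLECT would then invert the resulting path reflectance to an AOD through a radiative-transfer lookup table.

```python
import numpy as np

def dark_target_path_reflectance(rho_toa_red, rho_toa_swir, t_gas_red=1.0,
                                 swir_threshold=0.05, red_swir_ratio=0.5):
    """Select dense dark vegetation by a low SWIR (~2.1 um) TOA reflectance,
    predict its red surface reflectance with an empirical ratio, and return
    the mean excess red signal, read as aerosol path reflectance."""
    dark = rho_toa_swir < swir_threshold
    if not np.any(dark):
        raise ValueError("no dark targets found in the scene")
    rho_surf_pred = red_swir_ratio * rho_toa_swir[dark]     # predicted surface part
    excess = rho_toa_red[dark] / t_gas_red - rho_surf_pred  # attributed to aerosols
    return float(excess.mean())

# Synthetic 100 x 100 scene (placeholder values):
rng = np.random.default_rng(1)
red = 0.05 + 0.02 * rng.random((100, 100))
swir = 0.03 + 0.05 * rng.random((100, 100))
print(dark_target_path_reflectance(red, swir, t_gas_red=0.95))
```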
Abstract:
Doctoral thesis completed under joint supervision (cotutelle) at the Department of History of the Université de Montréal and at the École doctorale d'archéologie of the Université Paris 1 Panthéon-Sorbonne - UMR 7041, Archéologies et Sciences de l'Antiquité - Archéologie du monde grec.
Abstract:
As various scholars have shown, the idea of the "author" is neither absolute nor necessary. On the contrary, it came to life as an answer to the very practical needs of an emerging print technology in search of an economic model of its own. In this context, and given the criticism of the notion of "author" made during the 1960s and 70s (in particular by Barthes and Foucault), one might expect the death of the author to be a claim universally accepted by scholars. Yet this is not the case, because, as Rose suggests, the idea of the "author" and the derived notion of copyright are still too important in our culture to be abandoned. But why such an attachment to the idea of the "author"? The hypothesis on which this chapter is based is that the theory of the death of the author—developed in texts such as What is an Author? by Michel Foucault and The Death of the Author by Roland Barthes—did not provide the conditions for a shift towards a world without authors, because it lacked concrete editorial practices different from the existing ones. In recent years, the birth and diffusion of the Web have allowed the concrete development of a different way of interpreting the authorial function, thanks to new editorial practices—which will be named "editorialization devices" in this chapter. Thus, what was inconceivable for Rose in 1993 is possible today because of the emergence of digital technology—and in particular, the Web.
Abstract:
Information and communication technologies are the tools that underpin the emerging "Knowledge Society". Exchange of information and knowledge between people and through networks of people has always taken place, but ICT has radically changed the magnitude of this exchange, and factors such as timeliness of information and information dissemination patterns have become more important than ever. Since information and knowledge are so vital for all-round human development, the libraries and institutions that manage these resources are invaluable. Library and information centres therefore have a key role in the acquisition, processing, preservation and dissemination of information and knowledge. In the modern context, libraries provide services based on different types of documents: manuscripts, printed, digital, etc. At the same time, the acquisition, access, processing and servicing of these resources have become more complicated than ever before. ICT has been instrumental in extending libraries beyond the physical walls of a building and in helping users navigate and analyse tremendous amounts of knowledge with a variety of digital tools. Thus, modern libraries are increasingly being redefined as places offering unrestricted access to information in many formats and from many sources. The research was conducted in the university libraries of Kerala State, India. It found that even though information resources are flooding in worldwide and several technologies have emerged to manage the situation and provide effective services to their clientele, most of the university libraries in Kerala were unable to exploit these technologies to the maximum. Though the libraries have automated many of their functions, a wide gap remains between the possible services and the services actually provided. There are many good examples worldwide of the application of ICTs in libraries to maximize services, and many such libraries have adopted the principles of re-engineering and re-defining as a management strategy. Hence this study examined how effectively modern ICTs have been adopted in these libraries to maximize the efficiency of operations and services, and whether the principles of re-engineering and re-defining can be applied towards this. Data were collected from library users (students as well as faculty), library professionals and university librarians using structured questionnaires. This was supplemented by observation of the working of the libraries, discussions and interviews with the different types of users and staff, a review of the literature, etc. Personal observations were made of the organisational set-up, management practices, functions, facilities, resources, and utilization of information resources and facilities by users of the university libraries in Kerala. Statistical techniques such as percentage, mean, weighted mean, standard deviation, correlation and trend analysis were used to analyse the data (see the sketch after this abstract). All the libraries could exploit only very few of the possibilities of modern ICTs and hence could not achieve effective Universal Bibliographic Control or the desired efficiency and effectiveness in services. Because of this, users as well as professionals are dissatisfied.
Functional effectiveness in the acquisition, access and processing of information resources in various formats, development and maintenance of OPACs and WebOPACs, digital document delivery to remote users, Web-based clearing of library counter services and resources, development of full-text databases, digital libraries and institutional repositories, consortia-based operations for e-journals and databases, user education and information literacy, professional development with stress on ICTs, network administration and website maintenance, and the marketing of information are the major areas that need special attention to improve the situation. Finance, the level of ICT knowledge among library staff, professional dynamism and leadership, the vision and support of administrators and policy makers, and the prevailing educational set-up and social environment in the state are some of the major hurdles to reaping the full possibilities of ICTs in the university libraries of Kerala. The principles of Business Process Re-engineering were found suitable for re-structuring and redefining the operations and service systems of the libraries. Most of the conventional departments or divisions in the university libraries were functioning as watertight compartments, and their existing management systems were too rigid to adopt the principles of change management. Hence, a thorough re-structuring of the divisions was indicated. Consortia-based activities and the pooling and sharing of information resources were advocated to meet the varied needs of users on the main and satellite campuses of the universities, in affiliated colleges and at remote stations. A uniform staff policy similar to that prevailing in CSIR, DRDO, ISRO, etc. was proposed by the study, not only for the university libraries in Kerala but for the entire country. Restructuring of LIS education and the integrated, planned development of school, college, research and public library systems were also justified for reaping the maximum benefits of modern ICTs.
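As a minimal illustration of the descriptive statistics the study mentions (weighted mean, standard deviation) applied to Likert-type survey responses; the scale and counts below are hypothetical, not the study's data.

```python
import numpy as np

# Hypothetical 5-point Likert responses (1 = very dissatisfied ... 5 = very satisfied)
# and the number of respondents choosing each option (illustrative counts).
scores = np.array([1, 2, 3, 4, 5])
counts = np.array([40, 95, 180, 120, 30])

weighted_mean = np.average(scores, weights=counts)
variance = np.average((scores - weighted_mean) ** 2, weights=counts)
print(f"weighted mean = {weighted_mean:.2f}, sd = {np.sqrt(variance):.2f}")
```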
Abstract:
Information and communication technology (ICT) has brought about fundamental changes in the way libraries gather, preserve and disseminate information. The study was carried out to estimate and compare the information seeking behaviour (ISB) of academics at two prominent universities in Kerala in the context of advances achieved through ICT. It was motivated by the fast-changing scenario of libraries with the proliferation of many high-tech products and services. Its main purpose was to identify the chief sources of information of the academics and to examine their preferences regarding the form and format of information sources. The study also estimates the adequacy of the resources and services currently provided by the libraries. The questionnaire was the central instrument for data collection. A near-census method was adopted, engaging various methods and tools for eliciting data. The total population of the study was 957, of whom 859 academics received the questionnaire; 646 academics responded to the survey, of which 564 were valid responses. Data were coded and analysed using the Statistical Package for the Social Sciences (SPSS) and Microsoft Excel. Various statistical techniques were used to analyse the data. A paradigm shift is evident in the fact that academics are moving towards information on the internet, i.e. they prefer electronic to traditional sources, and this shift is coupled with e-seeking of information. The study reveals that the ISB of the academics is influenced primarily by personal factors, and comparative analysis shows that it is similar at both universities. The productivity of the academics was tested for any relation to their ISB, and productivity was found to be strongly related to ISB. The study also reveals that library users are satisfied with the services provided but not with the sources, and it recommends ways and means to improve the existing library system.
Abstract:
Inadequate links between researchers and farmers have resulted in low uptake of the research advances recommended to improve food security in the central highlands of Kenya. Access to timely and accurate information by extension agents and farmers is paramount in the dissemination of soil fertility management practices. Hence, the study investigated the effect of education levels on the communication channels used to disseminate soil fertility technologies in the central highlands of Kenya. Questionnaires were used to elicit information from 105 extension agents and 240 farmers. About 50.5% of the extension officers were certificate holders while 29.5% were diploma holders from agricultural institutes. The majority of the farmers had attained primary education (59.6%), while 25.8% and 9.2% had attained secondary and post-secondary education, respectively. Research institutions were the most accessible sources of information on soil fertility management practices for extension agents, while the internet and scientific conferences were rated the least accessible. Education levels significantly influenced farmers' preference for individual approach methods. There was a significant positive relationship between education and the accessibility of the internet as a source of information on green manure. The implication is that education levels influenced the mode of communication used to transfer soil fertility research outputs to end users. Consequently, it is extremely important to consider education levels when selecting the dissemination pathways used in agriculture.
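A sketch of the kind of association test implied above (education level versus preferred dissemination channel), using a chi-square test of independence; the contingency counts are illustrative, not the study's data.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical contingency table: farmer education level (rows) versus preferred
# dissemination channel (columns: individual visit, group demonstration, mass media).
table = np.array([
    [70, 50, 23],   # primary education
    [25, 28,  9],   # secondary education
    [ 5, 12,  5],   # post-secondary education
])

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```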
Abstract:
This research investigates what information German Fairtrade coffee consumers search for during pre-purchase information seeking and to what extent information is retrieved. Furthermore, the sequence of the information search as well as the degree of cognitive involvement is highlighted. The role of labeling, the importance of additional ethical information and its quality in terms of concreteness as well as the importance of product price and organic origin are addressed. A set of information relevant to Fairtrade consumers was tested by means of the Information Display Matrix (IDM) method with 389 Fairtrade consumers. Results show that prior to purchase, information on product packages plays an important role and is retrieved rather extensively, but search strategies that reduce the information processing effort are applied as well. Furthermore, general information is preferred over specific information. Results of two regression analyses indicate that purchase decisions are related to search behavior variables rather than to socio-demographic variables and purchase motives. In order to match product information with consumers’ needs, marketers should offer information that is reduced to the central aspects of Fairtrade.
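A hedged sketch of how a purchase decision might be regressed on IDM search-behaviour variables, as the abstract describes; the predictors, data-generating process and model specification here are synthetic assumptions, not the study's actual analysis.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 389  # same sample size as the study, but with synthetic data

# Hypothetical search-behaviour predictors: number of IDM information cells
# opened and mean time spent per cell (seconds).
cells_opened = rng.poisson(12, n)
time_per_cell = rng.gamma(2.0, 1.5, n)
X = np.column_stack([cells_opened, time_per_cell])

# Synthetic purchase decision loosely driven by search depth.
p = 1 / (1 + np.exp(-(0.15 * cells_opened - 2)))
y = rng.binomial(1, p)

model = LogisticRegression().fit(X, y)
print("coefficients (cells opened, time per cell):", model.coef_[0])
```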
Abstract:
This paper describes and analyses the experience of designing, installing and evaluating a farmer-usable touch-screen information kiosk on cattle health in a veterinary institution in Pondicherry. The contents of the kiosk were prepared based on identified demands for information on cattle health, arrived at through various stakeholder meetings. Information on the cattle diseases and conditions affecting the livelihoods of the poor was provided through graphics, text and audio backup, keeping in mind the needs of landless and illiterate poor cattle owners. A methodology for kiosk evaluation based on feedback from the kiosk facilitator, critical group reflection and individual users was formulated. The formative evaluation reveals the potential strength of this ICT in transferring information to cattle owners in a service delivery centre. Such information is vital in preventing diseases and helps cattle owners present their animals for treatment at an early stage of a disease condition, which in turn helps prevent direct and indirect losses to the cattle owners. The study shows how an information kiosk installed at a government institution as a freely accessible source of information for all farmers, irrespective of class and caste, can help transfer information among poor cattle owners, provided that periodic updating, interactivity and communication variability are taken care of. Being in the veterinary centre, the kiosk helps stimulate dialogue and facilitates demand for services based on the information provided by the kiosk screens.
Abstract:
Objective: To determine whether the use of verbal descriptors suggested by the European Union (EU) such as "common" (1-10% frequency) and "rare" (0.01-0.1%) effectively conveys the level of risk of side effects to people taking a medicine. Design: Randomised controlled study with unconcealed allocation. Participants: 120 adults taking simvastatin or atorvastatin after cardiac surgery or myocardial infarction. Setting: Cardiac rehabilitation clinics at two hospitals in Leeds, UK. Intervention: A written statement about one of the side effects of the medicine (either constipation or pancreatitis). Within each side effect condition half the patients were given the information in verbal form and half in numerical form (for constipation, "common" or 2.5%; for pancreatitis, "rare" or 0.04%). Main outcome measure: The estimated likelihood of the side effect occurring. Other outcome measures related to the perceived severity of the side effect, its risk to health, and its effect on decisions about whether to take the medicine. Results: The mean likelihood estimate given for the constipation side effect was 34.2% in the verbal group and 8.1% in the numerical group; for pancreatitis it was 18% in the verbal group and 2.1% in the numerical group. The verbal descriptors were associated with more negative perceptions of the medicine than their equivalent numerical descriptors. Conclusions: Patients want and need understandable information about medicines and their risks and benefits. This is essential if they are to become partners in medicine taking. The use of verbal descriptors to improve the level of information about side effect risk leads to overestimation of the level of harm and may lead patients to make inappropriate decisions about whether or not they take the medicine.
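A small arithmetic illustration using the figures reported in the abstract: mapping the EU verbal descriptors to their intended frequency bands and comparing them with the mean patient estimates shows the scale of the overestimation.

```python
# EU verbal descriptors with their intended frequency bands (percent),
# compared with the mean patient estimates reported above.
eu_bands = {"common": (1.0, 10.0), "rare": (0.01, 0.1)}
verbal_group_estimates = {
    "common (constipation)": 34.2,
    "rare (pancreatitis)": 18.0,
}

for label, est in verbal_group_estimates.items():
    low, high = eu_bands[label.split()[0]]
    print(f"{label}: estimated {est}% vs intended {low}-{high}% "
          f"(~{est / high:.0f}x the band's upper bound)")
```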
Abstract:
The knowledge economy offers a broad and diverse community of information systems users the opportunity to efficiently gain information and know-how for improving qualifications and enhancing productivity in the workplace. Such demand will continue, and users will increasingly require optimised and personalised information content. The advancement of information technology and the wide dissemination of information support individual users in constructing new knowledge from their experience in a real-world context. However, designing personalised information provision is challenging because users' requirements and information provision specifications are complex to represent. Existing methods cannot effectively support this analysis process. This paper presents a mechanism that can holistically facilitate the customisation of information provision based on individual users' goals, level of knowledge and cognitive style preferences. An ontology model with embedded norms represents the domain knowledge of information provision in a specific context, in which users' needs can be articulated and represented in a user profile. These formal requirements can then be transformed into information provision specifications, which are used to discover suitable information content from repositories and to organise the selected content pedagogically to meet the users' needs. The method is adaptive, enabling an appropriate response to changes in users' requirements during the process of acquiring knowledge and skills.
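A minimal sketch of the kind of profile-to-content matching the mechanism describes, assuming a simple user profile of goal, knowledge level and cognitive style; the class names, fields and matching rule are hypothetical, not the paper's ontology formalism.

```python
from dataclasses import dataclass

@dataclass
class UserProfile:
    goal: str              # e.g. "learn supply-chain basics"
    knowledge_level: int   # 1 = novice ... 5 = expert
    cognitive_style: str   # e.g. "visual" or "verbal"

@dataclass
class ContentItem:
    topic: str
    difficulty: int        # same 1..5 scale
    modality: str

def match(profile: UserProfile, repository: list[ContentItem]) -> list[ContentItem]:
    """Keep items on the user's goal topic at (or one step above) the user's
    level, and rank items in the user's preferred modality first."""
    candidates = [c for c in repository
                  if c.topic in profile.goal
                  and profile.knowledge_level <= c.difficulty <= profile.knowledge_level + 1]
    return sorted(candidates, key=lambda c: c.modality != profile.cognitive_style)

repo = [ContentItem("supply-chain", 2, "visual"), ContentItem("supply-chain", 3, "verbal")]
print(match(UserProfile("learn supply-chain basics", 2, "visual"), repo))
```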
Abstract:
In the emerging digital economy, the management of information in aerospace and construction organisations faces a particular challenge due to the ever-increasing volume of information and the extensive use of information and communication technologies (ICTs). This paper addresses the problems of information overload and the value of information in both industries by providing cross-disciplinary insights. In particular, it identifies the major issues and challenges in current information evaluation practice in these two industries. Interviews were conducted to obtain a spectrum of industrial perspectives (director/strategic, project management and ICT/document management) on these issues, in particular on information storage and retrieval strategies and on the contrasting personalisation and codification approaches to knowledge and information management. Industry feedback was collected through a follow-up workshop to strengthen the findings of the research. An information-handling agenda is outlined for the development of a future Information Evaluation Methodology (IEM), which could facilitate the codification of high-value information in order to support through-life knowledge and information management (K&IM) practice.
Abstract:
The volume–volatility relationship during the dissemination stages of information flow is examined by treating various theories relating volume and volatility as complementary rather than competing models. The mixture of distributions hypothesis, the sequential arrival of information hypothesis, the dispersion of beliefs hypothesis, and the noise trader hypothesis all add to the understanding of how volume and volatility interact for different types of futures traders. An integrated picture of the volume–volatility relationship is provided by investigating the dynamic linear and nonlinear associations between volatility and the volume of informed (institutional) and uninformed (general public) traders. In particular, the trading-behaviour explanation for the persistence of futures volatility, the effect of the timing of private information arrival, and the response of institutional traders to excess noise-trading risk are examined.
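A toy sketch of the linear side of such an analysis: regressing a volatility proxy on the (log) volumes of informed and uninformed traders with OLS on synthetic data; the paper's actual dynamic linear and nonlinear specifications are richer, and all values below are assumptions.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 1000  # synthetic daily futures observations

# Hypothetical volumes of informed (institutional) and uninformed (public) traders.
inst_vol = rng.lognormal(10, 0.5, n)
pub_vol = rng.lognormal(9, 0.7, n)

# Volatility proxy (e.g. absolute returns) loosely tied to both volume series.
volatility = 0.3 * np.log(inst_vol) + 0.1 * np.log(pub_vol) + rng.normal(0, 0.5, n)

X = sm.add_constant(np.column_stack([np.log(inst_vol), np.log(pub_vol)]))
ols = sm.OLS(volatility, X).fit()
print(ols.params)  # intercept, informed-volume and uninformed-volume coefficients
```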
Abstract:
What are the precise brain regions supporting the short-term retention of verbal information? A previous functional magnetic resonance imaging (fMRI) study suggested that they may be topographically variable across individuals, occurring, in most, in regions posterior to prefrontal cortex (PFC), and that detection of these regions may be best suited to a single-subject (SS) approach to fMRI analysis (Feredoes and Postle, 2007). In contrast, other studies using spatially normalized group-averaged (SNGA) analyses have localized storage-related activity to PFC. To evaluate the necessity of the regions identified by these two methods, we applied repetitive transcranial magnetic stimulation (rTMS) to SS- and SNGA-identified regions throughout the retention period of a delayed letter-recognition task. Results indicated that rTMS targeting SS analysis-identified regions of left perisylvian and sensorimotor cortex impaired performance, whereas rTMS targeting the SNGA-identified region of left caudal PFC had no effect on performance. Our results support the view that the short-term retention of verbal information can be supported by regions associated with acoustic, lexical, phonological, and speech-based representation of information. They also suggest that the brain bases of some cognitive functions may be better detected by SS than by SNGA approaches to fMRI data analysis.