1000 results for "Information de profondeur"
Abstract:
Protocols for secure archival storage are becoming increasingly important as the use of digital storage for sensitive documents gains wider practice. Wong et al. [8] combined verifiable secret sharing with proactive secret sharing without reconstruction and proposed a verifiable secret redistribution protocol for long-term storage. However, their protocol requires that each of the receivers be honest during redistribution. We proposed [3] an extension to their protocol in which we relaxed the requirement that all recipients be honest to the condition that only a simple majority among the recipients need be honest during the (re)distribution processes. Further, both of these protocols use Feldman's approach to achieve integrity during the (re)distribution processes. In this paper, we present a revised version of our earlier protocol and its adaptation to incorporate Pedersen's approach instead of Feldman's, thereby achieving information-theoretic secrecy while retaining integrity guarantees.
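The Feldman-versus-Pedersen distinction can be sketched in miniature. Below is a toy Python illustration of Pedersen-style verifiable secret sharing, not the paper's actual redistribution protocol: all parameters are far too small to be secure and the function names are invented. Each share is checked against commitments of the form g^a·h^b which, unlike Feldman's g^a, reveal nothing about the polynomial coefficients, which is the source of the information-theoretic secrecy mentioned above.

```python
import random

# Toy Schnorr group (illustrative sizes only, NOT secure): P = 2Q + 1, both prime.
P, Q = 2879, 1439
G, H = 4, 9  # two order-Q generators (squares mod P); log_G(H) assumed unknown

def share(secret, t, n):
    """Split `secret` into n shares (any t reconstruct it) and publish
    Pedersen commitments G^a_j * H^b_j to the polynomial coefficients."""
    a = [secret] + [random.randrange(Q) for _ in range(t - 1)]  # secret poly f
    b = [random.randrange(Q) for _ in range(t)]                 # blinding poly
    commits = [pow(G, aj, P) * pow(H, bj, P) % P for aj, bj in zip(a, b)]
    shares = {i: (sum(aj * i**j for j, aj in enumerate(a)) % Q,
                  sum(bj * i**j for j, bj in enumerate(b)) % Q)
              for i in range(1, n + 1)}
    return shares, commits

def verify(i, si, ti, commits):
    """Check G^s_i * H^t_i == prod_j C_j^(i^j); holds iff the share is consistent."""
    lhs = pow(G, si, P) * pow(H, ti, P) % P
    rhs = 1
    for j, c in enumerate(commits):
        rhs = rhs * pow(c, i**j, P) % P
    return lhs == rhs

def reconstruct(points):
    """Lagrange interpolation at x = 0 over Z_Q (points maps x -> f(x))."""
    secret = 0
    for xi in points:
        num = den = 1
        for xj in points:
            if xj != xi:
                num = num * (-xj) % Q
                den = den * (xi - xj) % Q
        secret += points[xi] * num * pow(den, -1, Q)
    return secret % Q

shares, commits = share(secret=1234, t=3, n=5)
assert all(verify(i, s, t_, commits) for i, (s, t_) in shares.items())
assert reconstruct({i: shares[i][0] for i in (1, 3, 5)}) == 1234
```

A cheating participant who alters a share fails `verify`, while the commitments themselves leak no information about the secret because each coefficient is blinded by H^b_j.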
Abstract:
The Rapid Visual Information Processing (RVIP) task, a serial discrimination task whose performance is believed to reflect sustained attention capabilities, is widely used in behavioural research and increasingly in neuroimaging studies. To date, functional neuroimaging research into the RVIP has been undertaken using block analyses, reflecting the sustained processing involved in the task, but not necessarily the transient processes associated with individual trial performance. Furthermore, this research has been limited to young cohorts. This study assessed the behavioural and functional magnetic resonance imaging (fMRI) outcomes of the RVIP task using both block and event-related analyses in a healthy middle-aged cohort (mean age = 53.56 years, n = 16). The results show that the version of the RVIP used here is sensitive to changes in attentional demand, with participants achieving a 43% accuracy hit rate in the experimental task compared with 96% accuracy in the control task. As shown by previous research, the block analysis revealed an increase in activation in a network of frontal, parietal, occipital and cerebellar regions. The event-related analysis showed a similar network of activation, seemingly omitting regions involved in the processing of the task (as shown in the block analysis), such as occipital areas and the thalamus, providing an indication of a network of regions involved in correct trial performance. Frontal (superior and inferior frontal gyri), parietal (precuneus, inferior parietal lobe) and cerebellar regions were shown to be active in both the block and event-related analyses, suggesting their importance in sustained attention/vigilance. These networks and the differences between them are discussed in detail, as are implications for future research in middle-aged cohorts.
Abstract:
In this digital age, as social media emerges as a central site where information is shared and interpreted, it is essential to study information construction on social media sites in order to understand how social reality is constructed. While a number of studies take an information-as-objective point of view, this proposed study emphasizes the constructed and interpretive nature of information and explores the processes through which information surrounding acute events comes into being on micro-blogs. In order to conduct this analysis systematically and theoretically, the concept of interpretive communities will be deployed. This research investigates whether micro-blog-based social groups can serve as interpretive communities and, if so, what role they might play in the construction of information and what social impacts may arise. To understand how this process is entangled with the surrounding social, political and technical contexts, cases from both China (focusing on Sina Weibo) and Australia (focusing on Twitter) will be analysed.
Abstract:
This study investigates the use of unsupervised features derived from word embedding approaches and novel sequence representation approaches for improving clinical information extraction systems. Our results corroborate previous findings that the use of word embeddings significantly improves the effectiveness of concept extraction models; however, we further determine the influence of the corpora used to generate such features. We also demonstrate the promise of sequence-based unsupervised features for further improving concept extraction.
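A common way to turn pretrained embeddings into unsupervised features for a concept-extraction tagger is to cluster them and feed the cluster ID to the model as a discrete feature. The pure-Python sketch below is purely illustrative (toy two-dimensional vectors and invented names; real systems use embeddings trained on large clinical corpora and a proper clustering step): clinically similar words land in the same cluster, so a tagger can generalize from seen words like "fever" to related unseen ones.

```python
# Toy embedding-cluster features for sequence tagging (illustrative only).
EMB = {  # made-up 2-d "pretrained" vectors
    "fever":   (0.90, 0.10), "cough":   (0.80, 0.20), "nausea": (0.85, 0.15),
    "aspirin": (0.10, 0.90), "insulin": (0.20, 0.80), "statin": (0.15, 0.85),
}

def dist2(u, v):
    """Squared Euclidean distance between two vectors."""
    return sum((a - b) ** 2 for a, b in zip(u, v))

# Use the two mutually farthest vectors as centroids: a deterministic
# stand-in for k-means with k = 2 on this toy data.
CENTROIDS = max(((u, v) for u in EMB.values() for v in EMB.values()),
                key=lambda uv: dist2(*uv))

def features(token):
    """Features a CRF-style tagger might consume for one token."""
    v = EMB.get(token.lower())
    cluster = None if v is None else min(
        (0, 1), key=lambda i: dist2(v, CENTROIDS[i]))
    return {"lower": token.lower(),
            "is_title": token.istitle(),
            "emb_cluster": cluster}

assert features("fever")["emb_cluster"] == features("cough")["emb_cluster"]
assert features("fever")["emb_cluster"] != features("insulin")["emb_cluster"]
```

The corpus used to train the embeddings determines which words end up neighbours, which is one way the choice of corpus can influence downstream extraction quality.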
Abstract:
The range of consumer health and medicines information sources has diversified along with the increased use of the Internet. This has led to a drive to develop medicines information services and to better incorporate the Internet and e-mail into routine practice in health care and in community pharmacies. To support the development of such services, more information is needed about the use of online information by consumers, particularly those who may be the most likely to use and benefit from the new sources and modes of medicines communication. This study explored the role and utilization of Internet-based medicines information and information services in the context of the wider network of information sources accessible to the public in Finland. The overall aim was to gather information to develop better and more accessible sources of information and services that better meet the needs of consumers. Special focus was on the needs and information behavior of people with depression who were using antidepressant medicines. The study applied both qualitative and quantitative methods. Consumer medicines information needs and sources were identified by analyzing the utilization of the national drug information call center operated by the University Pharmacy (Study I) and by surveying Finnish adults' (n=2348) use of different medicines information sources (Study II). The utilization of the Internet as a source of antidepressant information was explored through focus group discussions among people with depression and current or past use of antidepressants (n=29, Studies III & IV). The pharmacy response to consumers' needs in terms of providing e-mail counseling was assessed by conducting a virtual pseudo-customer study among Finnish community pharmacies (n=161, Study V). Physicians and pharmacists were the primary sources of medicines information.
People with mental disorders were more frequent users of telephone- and Internet-based medicines information sources and patient information leaflets than people without mental disorders. These sources were used to complement rather than replace information provided face-to-face by health professionals. People with depression used the Internet to seek facts about antidepressants, to share experiences with peers, and out of curiosity. They described access to online drug information as empowering. Some people reported lacking the skills necessary to assess the quality of online information. E-mail medication counseling services provided by community pharmacies were rare and varied in quality. The results suggest that rather than discouraging use of the Internet, health professionals should direct patients to accurate and reliable sources of online medicines information. Health care providers, including community pharmacies, should also seek to develop new ways of communicating information about medicines with consumers. This study determined that people with depression who use antidepressants need services enabling interactive communication not only with health care professionals but also with peers. Further research should focus on developing medicines information services that facilitate communication among different patient and consumer groups.
Abstract:
A novel method is proposed to treat the problem of the random resistance of a strictly one-dimensional conductor with static disorder. For the probability distribution of the transfer matrix of the conductor, we suggest the distribution of maximum information entropy, constrained by the following physical requirements: 1) flux conservation, 2) time-reversal invariance and 3) scaling, with the length of the conductor, of the two lowest cumulants of ζ, where the resistance is given by sinh²ζ. The preliminary results discussed in the text are in qualitative agreement with those obtained by sophisticated microscopic theories.
Abstract:
This paper describes the design and implementation of ADAMIS (‘A database for medical information systems’). ADAMIS is a relational database management system for a general hospital environment. Apart from the usual database (DB) facilities of data definition and data manipulation, ADAMIS supports a query language called the ‘simplified medical query language’ (SMQL) which is completely end-user oriented and highly non-procedural. Other features of ADAMIS include provision of facilities for statistics collection and report generation. ADAMIS also provides adequate security and integrity features and has been designed mainly for use on interactive terminals.
Abstract:
Earlier studies have shown that the speed of information transmission developed radically during the 19th century. This fast development was mainly due to the change from sailing ships and horse-drawn coaches to steamers and railways, as well as the telegraph. The speed of information transmission has normally been measured by calculating the duration between the writing and receiving of a letter, or between an important event and the time when the news was published elsewhere. As overseas mail was generally carried by ships, the history of communications and maritime history are closely related. This study also brings a postal-historical aspect to the academic discussion. Additionally, another new aspect is included: in business enterprises, information flows generally consisted of multiple transactions. Although fast one-way information was often crucial, e.g. news of a changing market situation, it was at least equally important to be able to react rapidly. To examine the development of business information transmission, the duration of mail transport has been measured by a systematic and commensurable method, using consecutive information circles per year as the principal tool of measurement. The study covers a period of six decades, several of the world's most important trade routes and different mail-carrying systems operated by merchant ships, sailing packets and several nations' steamship services. The main sources have been the sailing data of mail-carrying ships and the correspondence of several merchant houses in England. As the world's main trade routes had their specific historical backgrounds, with different businesses, interests and needs, the systems for information transmission did not develop similarly or simultaneously. It was a process lasting several decades, initiated by the idea of organizing sailings in a regular line system.
The evolution proceeded generally as follows: originally there was a more or less irregular system, then a regular system and finally a more frequent regular system of mail services. The trend was from sail to steam, but both these means of communication improved following the same scheme. Faster sailings alone did not radically improve the number of consecutive information circles per year, if the communication was not frequent enough. Neither did improved frequency advance the information circulation if the trip was very long or if the sailings were overlapping instead of complementing each other. The speed of information transmission could be improved by speeding up the voyage itself (technological improvements, minimizing the waiting time at ports of call, etc.) but especially by organizing sailings so that the recipients had the possibility to reply to arriving mails without unnecessary delay. It took two to three decades before the mail-carrying shipping companies were able to organize their sailings in an optimal way. Strategic shortcuts over isthmuses (e.g. Panama, Suez) together with the cooperation between steamships and railways enabled the most effective improvements in global communications before the introduction of the telegraph.
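The "consecutive information circles per year" measure can be illustrated with simple arithmetic (the figures and the averaging rule below are invented for illustration; the study's own method is more detailed). One circle is a letter out plus the reply back, including the wait for the next departure, so both passage time and sailing frequency limit how many circles fit in a year:

```python
# Hypothetical simplification: a reply waits, on average, half a departure
# interval before the next sailing home.
def circles_per_year(passage_days, departures_per_year):
    mean_wait = 365.0 / departures_per_year / 2.0   # wait for next sailing
    round_trip = passage_days + mean_wait + passage_days
    return 365.0 / round_trip

monthly = circles_per_year(passage_days=30, departures_per_year=12)
weekly = circles_per_year(passage_days=30, departures_per_year=52)
faster = circles_per_year(passage_days=15, departures_per_year=12)

# Both a more frequent service and a halved passage time beat the
# monthly 30-day baseline.
assert weekly > monthly and faster > monthly
```

This mirrors the abstract's point that faster sailings alone helped little unless departures were frequent enough for recipients to reply without waiting long for the next ship.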
Abstract:
The Gascoyne-Murchison region of Western Australia experiences an arid to semi-arid climate with a highly variable temporal and spatial rainfall distribution. The region has around 39.2 million hectares available for pastoral lease and supports predominantly cattle and sheep grazing leases. In recent years a number of climate forecasting systems have become available, offering rainfall probabilities with different lead times and forecast periods; however, the extent to which these systems are capable of fulfilling the requirements of local pastoralists is still ambiguous. Issues range from ensuring forecasts are issued with sufficient lead time to enable key plans or decisions to be revoked or altered, to ensuring forecast language is simple and clear so as to negate possible misunderstandings in interpretation. A climate research project sought to provide an objective method to determine which available forecasting systems had the greatest forecasting skill at times of the year relevant to local property management. To aid this climate research project, the study reported here was undertaken with the overall objective of exploring local pastoralists' climate information needs. We also explored how well they understand common climate forecast terms such as 'mean', 'median' and 'probability', and how they interpret and apply forecast information to decisions. Stratified, proportional random sampling was used to derive a representative sample based on rainfall-enterprise combinations. In order to provide more time for decision-making than existing operational forecasts issued with zero lead time, pastoralists requested that forecasts be issued for May-July and January-March with lead times counting down from 4 to 0 months. We found forecasts of between 20 and 50 mm break-of-season or follow-up rainfall were likely to influence decisions.
Eighty percent of pastoralists demonstrated in a test question that they had a poor technical understanding of how to interpret the standard wording of a probabilistic median rainfall forecast. This is worthy of further research to investigate whether inappropriate management decisions are being made because the forecasts are being misunderstood. We found that more than half the respondents regularly access and use weather and climate forecasts or outlook information from a range of sources, and almost three-quarters considered climate information or tools useful, with preferred methods for accessing this information being email, a faxback service, the internet and the Department of Agriculture Western Australia's Pastoral Memo. Despite differences in enterprise types and rainfall seasonality across the region, we found seasonal climate forecasting needs were relatively consistent. It became clear that providing basic training and working with pastoralists to help them understand regional climatic drivers, climate terminology and jargon, and the best ways to apply the forecasts to enhance decision-making is important to improve their use of information. Consideration could also be given to engaging a range of producers to write the climate forecasts themselves, in the language they use and understand, in consultation with the scientists who prepare the forecasts.
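The forecast wording that pastoralists misread can be unpacked with a small worked example (the rainfall figures below are invented for illustration). "A 70% chance of exceeding median rainfall" is a statement about the odds of beating the long-term median, which by construction is exceeded in about half of all years; it is not a prediction that 70 mm, or 70% of average rainfall, will fall:

```python
import statistics

# Made-up seasonal rainfall history for one station (mm).
HISTORY = [12, 25, 30, 41, 47, 55, 60, 72, 90, 110]

MEDIAN = statistics.median(HISTORY)  # 51.0 mm for this toy record

def exceedance_rate(years, threshold):
    """Fraction of years whose rainfall exceeded the threshold."""
    return sum(r > threshold for r in years) / len(years)

# Climatology: by definition of the median, about half the years exceed it.
assert exceedance_rate(HISTORY, MEDIAN) == 0.5

# A "70% chance of exceeding the median" forecast claims the coming season
# behaves like 7 out of 10 such seasons would: rainfall above 51 mm.
# It says nothing about receiving 70 mm or 70% of average rainfall.
forecast_probability = 0.70
assert forecast_probability > exceedance_rate(HISTORY, MEDIAN)
```

Confusing the probability with a rainfall amount is precisely the kind of misreading the test question above was designed to detect.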
Abstract:
Timely access to effective technical information is a key ingredient of profitable strawberry production. Through the Better Berries Program, a joint RD&E initiative of government and industry, a number of information products and services have been provided in recent years to Australia's subtropical strawberry industry, centred in southern Queensland. However, there is a lack of knowledge of how well these are meeting the information needs of growers, in both content and delivery. To better understand grower information use and needs, a stratified sample of 25 growers was interviewed on-farm during the 2004 season. Growers were asked how they currently accessed information, what they thought of a range of information products and ideas on show, and what advice they would provide for information development in the future. Results indicated that the information sought by growers, and the style in which it is best presented, varied considerably with grower experience but little with farm size. New growers had a wide range of needs, while the needs of experienced growers focused mainly on problem identification and new production developments. Interestingly, the overwhelming majority across all sectors still preferred paper-based information products despite their extensive use of computers for business purposes. The findings were used to develop a strategy for an improved range of technical information products and services that are more accessible, easier to use, more timely, and more relevant to the needs of growers.
Abstract:
We study which factors in terms of trading environment and trader characteristics determine individual information acquisition in experimental asset markets. Traders with larger endowments, existing inconclusive information, lower risk aversion, and less experience in financial markets tend to acquire more information. Overall, we find that traders overacquire information, so that informed traders on average obtain negative profits net of information costs. Information acquisition and the associated losses do not diminish over time. This overacquisition phenomenon is inconsistent with predictions of rational expectations equilibrium, and we argue it resembles the overdissipation results from the contest literature. We find that more acquired information in the market leads to smaller differences between fundamental asset values and prices. Thus, the overacquisition phenomenon is a novel explanation for the high forecasting accuracy of prediction markets.
Abstract:
Public-private partnerships (PPPs) have generated a lot of interest from governments around the world for leveraging private sector involvement in developing and sustaining public infrastructure and services. Initially, PPPs were favoured by transport, energy, and other large infrastructure-intensive sectors. More recently, the concept has been expanded to include social sectors such as education.
Abstract:
Background: With the advances in DNA sequencer-based technologies, it has become possible to automate several steps of the genotyping process, leading to increased throughput. To efficiently handle the large amounts of genotypic data generated and to help with quality control, there is a strong need for a software system that can help with the tracking of samples and the capture and management of data at different steps of the process. Such systems, while serving to manage the workflow precisely, also encourage good laboratory practice by standardizing protocols and recording and annotating data from every step of the workflow. Results: A laboratory information management system (LIMS) has been designed and implemented at the International Crops Research Institute for the Semi-Arid Tropics (ICRISAT) that meets the requirements of a moderately high-throughput molecular genotyping facility. The application is designed as modules and is simple to learn and use. The application leads the user through each step of the process, from starting an experiment to storing the output data from the genotype detection step with auto-binning of alleles, thus ensuring that every DNA sample is handled in an identical manner and all the necessary data are captured. The application keeps track of DNA samples and generated data. Data entry into the system is through the use of forms for file uploads. The LIMS provides functions to trace back to the electrophoresis gel files or sample source for any genotypic data and for repeating experiments. The LIMS is presently being used for the capture of high-throughput SSR (simple-sequence repeat) genotyping data from the legume (chickpea, groundnut and pigeonpea) and cereal (sorghum and millets) crops of importance in the semi-arid tropics. Conclusions: A laboratory information management system is available that has been found useful in the management of microsatellite genotype data in a moderately high-throughput genotyping laboratory.
The application with source code is freely available for academic users and can be downloaded from http://www.icrisat.org/bt-software-d-lims.htm
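The traceability feature described above, linking any genotype call back to its gel file and source DNA sample, boils down to keeping a provenance pointer on every record. A hypothetical miniature in Python (all record names and fields invented for illustration; the actual LIMS is a full database application):

```python
from dataclasses import dataclass, field

@dataclass
class Record:
    """One tracked entity: a DNA sample, a gel file, or a genotype call."""
    id: str
    kind: str
    meta: dict = field(default_factory=dict)
    source: "Record | None" = None  # provenance pointer to the parent record

def trace(record):
    """Walk the provenance chain back to the original DNA sample."""
    chain = [record]
    while record.source is not None:
        record = record.source
        chain.append(record)
    return [r.id for r in chain]

sample = Record("DNA-0001", "sample", {"crop": "chickpea"})
gel = Record("GEL-17", "gel", {"file": "gel17.tif"}, source=sample)
call = Record("GT-9921", "genotype", {"marker": "SSR-42", "allele": 210},
              source=gel)

assert trace(call) == ["GT-9921", "GEL-17", "DNA-0001"]
```

In a relational implementation the same idea is a foreign key from each table row to its parent, which is what makes both traceback and experiment repetition cheap to support.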
Abstract:
Cognitive impairments of attention, memory and executive functions are a fundamental feature of the pathophysiology of schizophrenia. Neurophysiological and neurochemical changes in the auditory cortex have been shown to underlie cognitive impairments in schizophrenia patients. The functional state of the neural substrate of auditory information processing can be objectively and non-invasively probed with auditory event-related potentials (ERPs) and event-related fields (ERFs). In the current work, we explored neurochemical effects on the neural origins of auditory information processing in relation to schizophrenia. By means of ERPs/ERFs we aimed to determine how the neural substrates of auditory information processing are modulated by antipsychotic medication in schizophrenia spectrum patients (Studies I, II) and by neuropharmacological challenges in healthy human subjects (Studies III, IV). First, with auditory ERPs we investigated the effects of olanzapine (Study I) and risperidone (Study II) in a group of patients with schizophrenia spectrum disorders. After 2 and 4 weeks of treatment, olanzapine had no significant effects on mismatch negativity (MMN) and P300, which have been suggested to reflect preattentive and attention-dependent information processing, respectively. After 2 weeks of treatment, risperidone had no significant effect on P300; however, risperidone reduced P200 amplitude. This latter effect of risperidone on the neural resources responsible for P200 generation could be partly explained through the action of dopamine. Subsequently, we used simultaneous EEG/MEG to investigate the effects of memantine (Study III) and methylphenidate (Study IV) in healthy subjects. We found that memantine modulates the MMN response without changing other ERP components.
This could be interpreted as reflecting the influence of memantine, through NMDA receptors, on the auditory change-detection mechanism, with the processing of auditory stimuli remaining otherwise unchanged. Further, we found that methylphenidate does not modulate the MMN response. This finding could indicate no association between catecholaminergic activity and electrophysiological measures of preattentive auditory discrimination processes reflected in the MMN. However, methylphenidate decreases P200 amplitudes. This could be interpreted as a modulation of auditory information processing reflected in P200 by the dopaminergic and noradrenergic systems. Taken together, our set of studies indicates a complex pattern of neurochemical influences on the neural substrate of auditory information processing, produced by antipsychotic drugs in patients with schizophrenia spectrum disorders and by pharmacological challenges in healthy subjects, as studied with ERPs and ERFs.