842 results for Use of information
Abstract:
More data will be produced in the next five years than in the entire history of humankind, a digital deluge that marks the beginning of the Century of Information. Through a year-long consultation with UK researchers, a coherent strategy has been developed which will nurture Century-of-Information Research (CIR); it crystallises the ideas developed by the e-Science Directors' Forum Strategy Working Group. This paper is an abridged version of their latest report, which can be found at http://wikis.nesc.ac.uk/escienvoy/Century_of_Information_Research_Strategy and which also records the consultation process and the affiliations of the authors. This document is derived from a paper presented at the Oxford e-Research Conference 2008 and takes into account suggestions made in the ensuing panel discussion. The goals of the CIR Strategy are to facilitate the growth of UK research and innovation that is data- and computationally intensive, and to develop a new culture of 'digital-systems judgement' that will equip research communities, businesses, government and society as a whole with the skills essential to compete and prosper in the Century of Information. The CIR Strategy identifies a national requirement for a balanced programme of coordination, research, infrastructure, translational investment and education to empower UK researchers, industry, government and society. The Strategy is designed to deliver an environment which meets the needs of UK researchers so that they can respond with agility to challenges, can create knowledge and skills, and can lead new kinds of research. It is a call to action for those engaged in research, those providing data and computational facilities, those governing research and those shaping education policies. The ultimate aim is to help researchers strengthen the international competitiveness of the UK research base and increase its contribution to the economy. The objectives of the Strategy are to better enable UK researchers across all disciplines to contribute world-leading fundamental research; to accelerate the translation of research into practice; and to develop improved capabilities, facilities and context for research and innovation. It envisages a culture that is better able to grasp the opportunities provided by the growing wealth of digital information. Computing has, of course, already become a fundamental tool in all research disciplines. The UK e-Science programme (2001-06), since emulated internationally, pioneered the invention and use of new research methods, and a new wave of innovations in digital-information technologies that have enabled them. The Strategy argues that the UK must now harness and leverage its own, plus the now global, investment in digital-information technology in order to spread the benefits as widely as possible in research, education, industry and government. Implementing the Strategy would deliver the computational infrastructure and its benefits as envisaged in the Science & Innovation Investment Framework 2004-2014 (July 2004), and in the reports developing those proposals.
To achieve this, the Strategy proposes the following actions: support the continuous innovation of digital-information research methods; provide easily used, pervasive and sustained e-Infrastructure for all research; enlarge the productive research community which exploits the new methods efficiently; generate capacity, propagate knowledge and develop skills via new curricula; and develop coordination mechanisms to improve the opportunities for interdisciplinary research and to make digital-infrastructure provision more cost effective. To gain the best value for money, strategic coordination is required across a broad spectrum of stakeholders. A coherent strategy is essential in order to establish and sustain the UK as an international leader of well-curated national data assets and computational infrastructure, which is expertly used to shape policy, support decisions, empower researchers and roll out the results to the wider benefit of society. The value of data as a foundation for wellbeing and a sustainable society must be appreciated; national resources must be more wisely directed to the collection, curation, discovery, widening access, analysis and exploitation of these data. Every researcher must be able to draw on skills, tools and computational resources to develop insights, test hypotheses and translate inventions into productive use, or to extract knowledge in support of governmental decision making. This foundation, plus the skills developed, will launch significant advances in research, in business, in professional practice and in government, with many consequent benefits for UK citizens. The Strategy presented here addresses these complex and interlocking requirements.
Abstract:
This article assesses the extent to which sampling variation affects findings about Malmquist productivity change derived using data envelopment analysis (DEA): in the first stage by calculating productivity indices, and in the second stage by investigating the farm-specific change in productivity. Confidence intervals for Malmquist indices are constructed using Simar and Wilson's (1999) bootstrapping procedure. The main contribution of this article is to account, in the second stage, for the information provided by the first-stage bootstrap. The bootstrapped standard errors (SEs) of the DEA Malmquist indices are employed in an innovative heteroscedastic panel regression, estimated by maximum likelihood. The application is to a sample of 250 Polish farms over the period 1996 to 2000. The confidence interval results suggest that the second half of the 1990s was characterized for Polish farms not so much by productivity regress as by stagnation. As for the determinants of farm productivity change, we find that the integration of the DEA SEs in the second-stage regression is significant in explaining a proportion of the variance in the error term. Although our heteroscedastic regression results differ from those of standard OLS in terms of significance and sign, they are consistent with theory and previous research.
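The second-stage estimator described above can be illustrated with a short sketch: a linear model of farm productivity change whose error variance is allowed to depend on the first-stage bootstrap SEs, estimated by maximum likelihood. This is a minimal reconstruction on synthetic data; the variable names and the exponential (multiplicative) variance function are our illustrative assumptions, not the authors' exact specification.

```python
# Sketch: second-stage heteroscedastic regression by maximum likelihood,
# with error variance modelled as a function of first-stage bootstrap SEs.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 250                                     # farms, as in the application
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])  # hypothetical determinants
se = rng.uniform(0.05, 0.30, size=n)        # stand-in for bootstrap SEs of Malmquist indices
beta_true = np.array([0.0, 0.1, -0.05])
y = X @ beta_true + rng.normal(scale=np.sqrt(0.01 * np.exp(2.0 * se)))

def negloglik(theta):
    """Gaussian likelihood with Var(e_i) = exp(log_s2 + gamma * se_i)."""
    beta, log_s2, gamma = theta[:3], theta[3], theta[4]
    var = np.exp(log_s2 + gamma * se)
    resid = y - X @ beta
    return 0.5 * np.sum(np.log(2 * np.pi * var) + resid**2 / var)

res = minimize(negloglik, x0=np.zeros(5), method="BFGS")
print("beta:", res.x[:3])
print("gamma (effect of bootstrap SE on error variance):", res.x[4])
```

A significantly non-zero gamma corresponds to the paper's finding that the first-stage SEs explain part of the second-stage error variance.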
Abstract:
The findings from a study measuring consumer acceptance of genetically modified (GM) foods are presented. The empirical data were collected in an experimental market, an approach used extensively in experimental economics for measuring the monetary value of goods. The approach has several advantages over standard approaches used in sensory and marketing research (e.g., surveys and focus groups) because of its non-hypothetical nature and the realism introduced by using real goods, real money, and market discipline. In each of three US locations, we elicited the monetary compensation consumers required to consume a GM food. Providing positive information about the benefits of GM food production, in some cases, reduced the level of monetary compensation demanded to consume the GM food.
Abstract:
Conservation of crop wild relatives (CWRs) is a complex interdisciplinary process that is being addressed by various national and international initiatives, including two Global Environment Facility projects ('In situ Conservation of Crop Wild Relatives through Enhanced Information Management and Field Application' and 'Design, Testing and Evaluation of Best Practices for in situ Conservation of Economically Important Wild Species'), the European Community-funded project 'European Crop Wild Relative Diversity Assessment and Conservation Forum (PGR Forum)' and the European 'In situ and On Farm Network'. The key issues that have arisen are: (1) the definition of what constitutes a CWR, (2) the need for national and regional information systems and a global system, (3) development and application of priority-determining mechanisms, (4) the incorporation of the conservation of CWRs into existing national, regional and international PGR programmes, (5) assessment of the effectiveness of conservation actions, (6) awareness of the importance of CWRs in agricultural development at local, national and international levels, for both the scientific and lay communities, and (7) policy development and legal framework. The above issues are illustrated by work on the conservation of a group of legumes known as grasspea, chicklings, vetchlings and horticultural ornamental peas (Lathyrus spp.) in their European and Mediterranean centre of diversity.
Abstract:
Information technology in construction (ITC) has been gaining wide acceptance and is being implemented in the construction research domains as a tool to assist decision makers. Most of the research into visualization technologies (VT) has been on the wide range of 3D and simulation applications suitable for construction processes. Despite developments in interoperability and the standardization of products, VT usage has remained very low when it comes to communicating with and addressing the needs of building end-users (BEU). This paper argues that building end-users are a source of experience and expertise that can be brought into the briefing stage for the evaluation of design proposals. It also suggests that the end-user is a source of new ideas promoting innovation. In this research, a positivistic methodology that includes the comparison of 3D models and traditional 2D methods is proposed. It will help to identify how much, if anything, a non-spatial specialist can gain in terms of "understanding" of a particular design proposal presented using both methods.
Abstract:
The completion of the Human Genome Project has revealed a multitude of potential avenues for the identification of therapeutic targets. Extensive sequence information enables the identification of novel genes but does not facilitate a thorough understanding of how changes in gene expression control the molecular mechanisms underlying the development and regulation of a cell or the progression of disease. Proteomics encompasses the study of proteins expressed by a population of cells, and evaluates changes in protein expression, post-translational modifications, protein interactions, protein structure and splice variants, all of which are imperative for a complete understanding of protein function within the cell. From the outset, proteomics has been used to compare the protein profiles of cells in healthy and diseased states and as such can be used to identify proteins associated with disease development and progression. These candidate proteins might provide novel targets for new therapeutic agents or aid the development of assays for disease biomarkers. This review provides an overview of the current proteomic techniques available and focuses on their application in the search for novel therapeutic targets for the treatment of disease.
Abstract:
Objectives: To investigate people's views about the efficacy and specific risks of herbal, over-the-counter (OTC) conventional, and prescribed conventional medicines, and their likelihood of taking a second (herbal or OTC conventional) product in addition to a prescribed medicine. Methods: Experiment 1 used a one-factor within-participant design; Experiment 2 used a one-factor between-participant design. Convenience samples of the general population were given a hypothetical scenario and required to make a number of judgements. Results: People believed herbal remedies to be less effective, but less risky, than OTC and prescribed conventional medicines. Herbal medicines were not seen as being safer simply because of their easier availability. Participants indicated that they would be more likely to take a herbal medicine than a conventional OTC medicine in addition to a prescribed medicine, and less likely to consult their doctor in advance. Conclusion: People believe that herbal medicines are natural and relatively safe and can be used with less caution. People need to be given clear information about the risks and benefits of herbal medicines if they are to use such products safely and effectively.
Abstract:
Providing effective information about drug risks and benefits has become a major challenge for health professionals, as many people are ill equipped to understand, retain and use the information effectively. This paper reviews the growing evidence that people’s understanding (and health behaviour) is affected not only by the content of medicines information, but also by the particular way in which it is presented. Such presentational factors include whether information is presented verbally or numerically, framed positively or negatively, whether risk reductions are described in relative or absolute terms (and whether baseline information is included), and whether information is personalized or tailored in any way. It also looks at how understanding is affected by the order in which information is presented, and the way in which it is processed. The paper concludes by making a number of recommendations for providers of medicines information, about both the content and presentation of such information, that should enhance safe and effective medicines usage.
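A small worked example (ours, not the paper's) makes the relative-versus-absolute distinction concrete: the same treatment effect can be framed as "halves the risk" or as "one percentage point", with very different effects on how it is understood.

```python
# Illustrative numbers only: a side effect affecting 2% of untreated
# patients falls to 1% with the medicine.
baseline_risk = 0.02
treated_risk = 0.01

absolute_reduction = baseline_risk - treated_risk        # 0.01 -> one percentage point
relative_reduction = absolute_reduction / baseline_risk  # 0.50 -> "halves the risk"
number_needed_to_treat = 1 / absolute_reduction          # 100 treated per event avoided

print(f"absolute risk reduction: {absolute_reduction:.1%}")      # 1.0%
print(f"relative risk reduction: {relative_reduction:.0%}")      # 50%
print(f"number needed to treat:  {number_needed_to_treat:.0f}")  # 100
```

Reporting only the 50% relative figure, without the 2% baseline, is exactly the kind of presentation the paper flags as likely to inflate perceived benefit.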
Abstract:
In the emerging digital economy, the management of information in aerospace and construction organisations faces a particular challenge due to the ever-increasing volume of information and the extensive use of information and communication technologies (ICTs). This paper addresses the problems of information overload and the value of information in both industries by providing some cross-disciplinary insights. In particular, it identifies major issues and challenges in current information evaluation practice in these two industries. Interviews were conducted to gather a spectrum of industrial perspectives (director/strategic, project management and ICT/document management) on these issues, in particular on information storage and retrieval strategies and on the contrasting personalisation and codification approaches to knowledge and information management. Industry feedback was collected at a follow-up workshop to strengthen the findings of the research. An information-handling agenda is outlined for the development of a future Information Evaluation Methodology (IEM) which could facilitate the practice of codifying high-value information in order to support through-life knowledge and information management (K&IM) practice.
Abstract:
“Point and click” interactions remain one of the key features of graphical user interfaces (GUIs). People with motion impairments, however, can often have difficulty with accurate control of standard pointing devices. This paper discusses work that aims to reveal the nature of these difficulties through analyses that consider the cursor’s path of movement. A range of cursor measures was applied, and a number of them were found to be significant in capturing the differences between able-bodied users and motion-impaired users, as well as the differences between a haptic force feedback condition and a control condition. The cursor measures found in the literature, however, do not make up a comprehensive list, but provide a starting point for analysing cursor movements more completely. Six new cursor characteristics for motion-impaired users are therefore introduced to capture aspects of cursor movement different from those already proposed.
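As an illustration of the kind of path-based analysis described, the sketch below computes two measures commonly used in this literature: the path-length ratio (actual cursor path length divided by the straight-line task distance) and the mean perpendicular deviation from the task axis. It is a minimal reconstruction under those standard definitions, not the paper's own code or its six new characteristics.

```python
# Sketch: two classic cursor-path measures from sampled cursor coordinates.
import numpy as np

def path_measures(points, start, end):
    pts = np.asarray(points, dtype=float)
    start, end = np.asarray(start, float), np.asarray(end, float)
    path_len = np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1))  # travelled distance
    straight = np.linalg.norm(end - start)                           # ideal distance
    axis = (end - start) / straight                                  # unit task axis
    rel = pts - start
    perp = rel - np.outer(rel @ axis, axis)        # component orthogonal to task axis
    mean_offset = np.mean(np.linalg.norm(perp, axis=1))
    return path_len / straight, mean_offset

# Hypothetical cursor trace for a horizontal 100 px pointing task:
ratio, offset = path_measures([(0, 0), (40, 12), (80, -8), (100, 0)], (0, 0), (100, 0))
print(f"path-length ratio: {ratio:.2f}, mean offset: {offset:.1f} px")
```

Higher values on both measures indicate a less direct, less controlled movement, which is the pattern such analyses look for when comparing user groups or input conditions.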
Organisational semiotics methods to assess organisational readiness for internal use of social media
Abstract:
The paper presents organisational semiotics (OS) as an approach for identifying organisational readiness factors for the internal use of social media within information-intensive organisations (IIOs). The paper examines OS methods, such as organisational morphology, containment analysis and collateral analysis, to reveal factors of readiness within an organisation. These models also help to identify the essential patterns of activities needed for social media use within an organisation, which can provide a basis for future analysis. The findings confirmed many of the factors previously identified in the literature, while also revealing new factors through the OS methods. The factors for organisational readiness for internal use of social media include resources, organisational climate, processes, motivational readiness, benefit and organisational control factors. The organisational control factors revealed are security/privacy, policies, communication procedures, accountability and fallback.
Abstract:
In recent years, the area of data mining has been experiencing considerable demand for technologies that extract knowledge from large and complex data sources. There has been substantial commercial interest, as well as active research in the area, that aims to develop new and improved approaches for extracting information, relationships, and patterns from large datasets. Artificial neural networks (NNs) are popular biologically inspired intelligent methodologies, whose classification, prediction, and pattern recognition capabilities have been utilized successfully in many areas, including science, engineering, medicine, business, banking, telecommunication, and many other fields. This paper highlights, from a data mining perspective, the implementation of NNs using supervised and unsupervised learning for pattern recognition, classification, prediction, and cluster analysis, and focuses the discussion on their usage in bioinformatics and financial data analysis tasks.
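As a minimal sketch of the supervised side of what the review surveys, the snippet below trains a small feed-forward network as a classifier. It assumes scikit-learn and uses a synthetic dataset; it illustrates the general technique, not an example from the paper.

```python
# Sketch: supervised NN classification on synthetic data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
clf.fit(X_train, y_train)                 # supervised learning from labelled examples
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

The same pattern (fit on labelled data, evaluate on held-out data) underlies the bioinformatics and financial classification tasks the paper discusses; unsupervised use replaces the labels with a clustering or self-organising objective.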
Abstract:
A favoured method of assimilating information from state-of-the-art climate models into integrated assessment models of climate impacts is to use the transient climate response (TCR) of the climate models as an input, sometimes accompanied by a pattern-matching approach to provide spatial information. More recent approaches to the problem use TCR with another independent piece of climate model output: the land-sea surface warming ratio (φ). In this paper we show why the use of φ in addition to TCR has such utility. Multiple linear regressions of surface temperature change onto TCR and φ in 22 climate models from the CMIP3 multi-model database show that the inclusion of φ explains a much greater fraction of the inter-model variance than using TCR alone. The improvement is particularly pronounced in North America and Eurasia in the boreal summer season, and in the Amazon all year round. The use of φ as the second metric is beneficial for three reasons: firstly, it is uncorrelated with TCR in state-of-the-art climate models and can therefore be considered an independent metric; secondly, because of its projected time-invariance, the magnitude of φ is better constrained than TCR in the immediate future; thirdly, the use of two variables is much simpler than approaches such as pattern scaling from climate models. Finally, we show how using the latest estimates of φ from climate models, with a mean value of 1.6 as opposed to the previously reported value of 1.4, can significantly increase the mean time-integrated discounted damage projections in a state-of-the-art integrated assessment model by about 15%. When compared to damages calculated without the inclusion of the land-sea warming ratio, this figure rises to 65%, equivalent to almost 200 trillion dollars over 200 years.
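The regression at the heart of the argument can be sketched as follows: regional warming regressed onto TCR and φ across an ensemble of models, by ordinary least squares. The numbers below are synthetic stand-ins rather than CMIP3 output; only the structure of the regression follows the paper.

```python
# Sketch: inter-model regression of regional warming onto TCR and phi.
import numpy as np

rng = np.random.default_rng(1)
n_models = 22                            # ensemble size, as in the paper
tcr = rng.normal(1.8, 0.4, n_models)     # transient climate response (K), made up
phi = rng.normal(1.6, 0.1, n_models)     # land-sea warming ratio, made up
dT = 0.5 + 0.9 * tcr + 1.2 * phi + rng.normal(0, 0.1, n_models)  # synthetic warming

A = np.column_stack([np.ones(n_models), tcr, phi])
coef, *_ = np.linalg.lstsq(A, dT, rcond=None)
resid = dT - A @ coef
r2 = 1 - resid @ resid / np.sum((dT - dT.mean()) ** 2)
print(f"intercept={coef[0]:.2f}, b_TCR={coef[1]:.2f}, b_phi={coef[2]:.2f}, R^2={r2:.2f}")
```

Because φ is roughly uncorrelated with TCR across models, the two regressors carry independent information, which is why adding φ raises the explained inter-model variance rather than duplicating what TCR already provides.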
Abstract:
Introduction. Feature usage is a prerequisite to realising the benefits of investments in feature-rich systems. We propose that conceptualising the dependent variable 'system use' as 'level of use' and specifying it as a formative construct has greater value for measuring the post-adoption use of feature-rich systems. We then validate the content of the construct as a first step in developing a research instrument to measure it. The context of our study is the post-adoption use of electronic medical records (EMRs) by primary care physicians. Method. Initially, a literature review of the empirical context defines the scope based on prior studies. Having identified core features from the literature, they are further refined with the help of experts in a consensus-seeking process that follows the Delphi technique. Results. The methodology was successfully applied to EMRs, which were selected as an example of feature-rich systems. A review of EMR usage and regulatory standards provided the feature input for the first round of the Delphi process. A panel of experts then reached consensus after four rounds, identifying ten task-based features that would be indicators of level of use. Conclusions. To study why some users deploy more advanced features than others, theories of post-adoption require a rich formative dependent variable that measures level of use. We have demonstrated that a context-sensitive literature review followed by refinement through a consensus-seeking process is a suitable methodology to validate the content of this dependent variable. This is the first step of instrument development, prior to statistical confirmation with a larger sample.
Abstract:
Introduction: Care home residents are at particular risk from medication errors, and our objective was to determine the prevalence and potential harm of prescribing, monitoring, dispensing and administration errors in UK care homes, and to identify their causes. Methods: A prospective study of a random sample of residents within a purposive sample of homes in three areas. Errors were identified by patient interview, note review, observation of practice and examination of dispensed items. Causes were understood by observation and from theoretically framed interviews with home staff, doctors and pharmacists. Potential harm from errors was assessed by expert judgement. Results: The 256 residents recruited in 55 homes were taking a mean of 8.0 medicines. One hundred and seventy-eight (69.5%) of the residents had one or more errors; the mean number per resident was 1.9 errors. The mean potential harm from prescribing, monitoring, administration and dispensing errors was 2.6, 3.7, 2.1 and 2.0 (0 = no harm, 10 = death), respectively. Contributing factors from the 89 interviews included doctors who were not accessible, did not know the residents and lacked information in homes when prescribing; home staff’s high workload, lack of medicines training and drug-round interruptions; lack of teamwork among home, practice and pharmacy; inefficient ordering systems; inaccurate medicine records and a prevalence of verbal communication; and difficult-to-fill (and check) medication administration systems. Conclusions: That two-thirds of residents were exposed to one or more medication errors is of concern. The will to improve exists, but there is a lack of overall responsibility. Action is required from all concerned.