37 results for Use of information
Abstract:
The findings from a study measuring consumer acceptance of genetically modified (GM) foods are presented. The empirical data were collected in an experimental market, an approach used extensively in experimental economics for measuring the monetary value of goods. The approach has several advantages over standard approaches used in sensory and marketing research (e.g., surveys and focus groups) because of its non-hypothetical nature and the realism introduced by using real goods, real money, and market discipline. In each of three US locations, we elicited the monetary compensation consumers required to consume a GM food. Providing positive information about the benefits of GM food production, in some cases, reduced the level of monetary compensation demanded to consume the GM food. © 2004 Elsevier Ltd. All rights reserved.
Abstract:
Conservation of crop wild relatives (CWRs) is a complex interdisciplinary process that is being addressed by various national and international initiatives, including two Global Environment Facility projects ('In situ Conservation of Crop Wild Relatives through Enhanced Information Management and Field Application' and 'Design, Testing and Evaluation of Best Practices for in situ Conservation of Economically Important Wild Species'), the European Community-funded project 'European Crop Wild Relative Diversity Assessment and Conservation Forum (PGR Forum)' and the European 'In situ and On Farm Network'. The key issues that have arisen are: (1) the definition of what constitutes a CWR, (2) the need for national and regional information systems and a global system, (3) development and application of priority-determining mechanisms, (4) the incorporation of the conservation of CWRs into existing national, regional and international PGR programmes, (5) assessment of the effectiveness of conservation actions, (6) awareness of the importance of CWRs in agricultural development at local, national and international levels both for the scientific and lay communities and (7) policy development and legal framework. The above issues are illustrated by work on the conservation of a group of legumes known as grasspea chicklings, vetchlings, and horticultural ornamental peas (Lathyrus spp.) in their European and Mediterranean centre of diversity. © 2007 Published by Elsevier B.V.
Abstract:
Information technology in construction (ITC) has been gaining wide acceptance and is being implemented in construction research domains as a tool to assist decision makers. Most of the research into visualization technologies (VT) has focused on the wide range of 3D and simulation applications suitable for construction processes. Despite developments in interoperability and the standardization of products, VT usage has remained very low when it comes to communicating with and addressing the needs of building end-users (BEU). This paper argues that building end-users are a source of experience and expertise that can be brought into the briefing stage for the evaluation of design proposals. It also suggests that the end-user is a source of new ideas promoting innovation. In this research a positivistic methodology that includes the comparison of 3D models with traditional 2D methods is proposed. It will help to identify "how much", if anything, a non-spatial specialist can gain in terms of "understanding" of a particular design proposal presented using both methods.
Abstract:
The completion of the Human Genome Project has revealed a multitude of potential avenues for the identification of therapeutic targets. Extensive sequence information enables the identification of novel genes but does not facilitate a thorough understanding of how changes in gene expression control the molecular mechanisms underlying the development and regulation of a cell or the progression of disease. Proteomics encompasses the study of proteins expressed by a population of cells, and evaluates changes in protein expression, post-translational modifications, protein interactions, protein structure and splice variants, all of which are imperative for a complete understanding of protein function within the cell. From the outset, proteomics has been used to compare the protein profiles of cells in healthy and diseased states and as such can be used to identify proteins associated with disease development and progression. These candidate proteins might provide novel targets for new therapeutic agents or aid the development of assays for disease biomarkers. This review provides an overview of the current proteomic techniques available and focuses on their application in the search for novel therapeutic targets for the treatment of disease.
Abstract:
Objectives: To investigate people's views about the efficacy and specific risks of herbal, over-the-counter (OTC) conventional, and prescribed conventional medicines, and their likelihood of taking a second (herbal or OTC conventional) product in addition to a prescribed medicine. Methods: Experiment 1 (1 factor within-participant design); Experiment 2 (1 factor between-participant design). Convenience samples of the general population were given a hypothetical scenario and required to make a number of judgements. Results: People believed herbal remedies to be less effective, but less risky than OTC and prescribed conventional medicines. Herbal medicines were not seen as being safer simply because of their easier availability. Participants indicated that they would be more likely to take a herbal medicine than a conventional OTC medicine in addition to a prescribed medicine, and less likely to consult their doctor in advance. Conclusion: People believe that herbal medicines are natural and relatively safe and can be used with less caution. People need to be given clear information about the risks and benefits of herbal medicines if they are to use such products safely and effectively. © 2006 Elsevier Ltd. All rights reserved.
Abstract:
Providing effective information about drug risks and benefits has become a major challenge for health professionals, as many people are ill equipped to understand, retain and use the information effectively. This paper reviews the growing evidence that people’s understanding (and health behaviour) is not only affected by the content of medicines information, but also by the particular way in which it is presented. Such presentational factors include whether information is presented verbally or numerically, framed positively or negatively, whether risk reductions are described in relative or absolute terms (and baseline information included), and whether information is personalized or tailored in any way. It also looks at how understanding is affected by the order in which information is presented, and the way in which it is processed. The paper concludes by making a number of recommendations for providers of medicines information, about both the content and presentation of such information, that should enhance safe and effective medicines usage.
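To make the presentational point about relative versus absolute risk concrete, here is a minimal worked sketch in Python; the numbers are hypothetical illustrations, not figures from the paper.

```python
# Hypothetical numbers: a medicine lowers the risk of an adverse event
# from 2 in 100 (baseline) to 1 in 100 (treated).
baseline_risk = 0.02
treated_risk = 0.01

# Relative risk reduction: proportional drop from baseline ("50% lower risk").
rrr = (baseline_risk - treated_risk) / baseline_risk

# Absolute risk reduction: difference in risk ("1 fewer case per 100 people").
arr = baseline_risk - treated_risk

# Number needed to treat: people treated to avoid one event.
nnt = 1 / arr

print(f"RRR = {rrr:.0%}, ARR = {arr:.1%}, NNT = {nnt:.0f}")
```

The same change in risk sounds far larger framed relatively ("halves the risk") than absolutely ("one fewer case per hundred"), which is precisely the framing effect the review describes.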
Abstract:
In the emerging digital economy, the management of information in aerospace and construction organisations is facing a particular challenge due to the ever-increasing volume of information and the extensive use of information and communication technologies (ICTs). This paper addresses the problems of information overload and the value of information in both industries by providing some cross-disciplinary insights. In particular, it identifies major issues and challenges in current information evaluation practice in these two industries. Interviews were conducted to obtain a spectrum of industrial perspectives (director/strategic, project management and ICT/document management) on these issues, in particular on information storage and retrieval strategies and on the contrasting knowledge and information management approaches of personalisation and codification. Industry feedback was collected through a follow-up workshop to strengthen the findings of the research. An information-handling agenda is outlined for the development of a future Information Evaluation Methodology (IEM), which could facilitate the codification of high-value information in order to support through-life knowledge and information management (K&IM) practice.
Abstract:
“Point and click” interactions remain one of the key features of graphical user interfaces (GUIs). People with motion-impairments, however, can often have difficulty with accurate control of standard pointing devices. This paper discusses work that aims to reveal the nature of these difficulties through analyses that consider the cursor’s path of movement. A range of cursor measures was applied, and a number of them were found to be significant in capturing the differences between able-bodied users and motion-impaired users, as well as the differences between a haptic force feedback condition and a control condition. The cursor measures found in the literature, however, do not make up a comprehensive list, but provide a starting point for analysing cursor movements more completely. Six new cursor characteristics for motion-impaired users are introduced to capture aspects of cursor movement different from those already proposed.
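As an illustration of what path-based cursor analysis involves, the sketch below computes a few generic measures from a sampled cursor trajectory. These are standard measures from the wider cursor-analysis literature, not the six new characteristics introduced in the paper, and the function name is hypothetical.

```python
import math

def path_measures(points, target):
    """Generic cursor-path measures for one point-and-click trial.

    points: list of (x, y) cursor samples from movement start to click.
    target: (x, y) centre of the intended target.
    """
    # Total distance actually travelled by the cursor.
    path_length = sum(
        math.dist(points[i], points[i + 1]) for i in range(len(points) - 1)
    )
    # Straight-line (task-axis) distance from start to target.
    task_axis = math.dist(points[0], target)
    # Ratio > 1 indicates a less direct, more meandering path.
    efficiency = path_length / task_axis if task_axis else float("inf")
    # Final click offset from the target centre.
    click_error = math.dist(points[-1], target)
    return {"path_length": path_length,
            "efficiency": efficiency,
            "click_error": click_error}

# Example: a slightly curved movement towards a target at (100, 0).
trace = [(0, 0), (30, 8), (60, 10), (90, 4), (101, 1)]
print(path_measures(trace, (100, 0)))
```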
Organisational semiotics methods to assess organisational readiness for internal use of social media
Abstract:
The paper presents organisational semiotics (OS) as an approach for identifying organisational readiness factors for internal use of social media within information-intensive organisations (IIOs). The paper examines OS methods, such as organisational morphology, containment analysis and collateral analysis, to reveal factors of readiness within an organisation. These models also help to identify the essential patterns of activities needed for social media use within an organisation, which can provide a basis for future analysis. The findings confirmed many of the factors previously identified in the literature, while also revealing new factors through the OS methods. The factors for organisational readiness for internal use of social media include resources, organisational climate, processes, motivational readiness, benefit and organisational control factors. The organisational control factors revealed are security/privacy, policies, communication procedures, accountability and fallback.
Abstract:
In recent years, the area of data mining has experienced considerable demand for technologies that extract knowledge from large and complex data sources. There has been substantial commercial interest, as well as active research, aimed at developing new and improved approaches for extracting information, relationships, and patterns from large datasets. Artificial neural networks (NNs) are popular biologically inspired intelligent methodologies, whose classification, prediction, and pattern recognition capabilities have been used successfully in many areas, including science, engineering, medicine, business, banking, telecommunications, and many other fields. This paper highlights, from a data mining perspective, the implementation of NNs, using supervised and unsupervised learning, for pattern recognition, classification, prediction, and cluster analysis, and focuses the discussion on their usage in bioinformatics and financial data analysis tasks. © 2012 Wiley Periodicals, Inc.
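A minimal sketch of the supervised side of such a workflow, assuming scikit-learn and synthetic data (neither is taken from the paper):

```python
# Supervised NN classification on a synthetic stand-in for a
# bioinformatics or financial dataset; scikit-learn is an assumed
# toolchain, not the authors' implementation.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=1000, n_features=20,
                           n_informative=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A small feed-forward network (multilayer perceptron).
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

An unsupervised analogue would train on unlabelled data, for example replacing the classifier with a clustering step for the cluster analysis task mentioned above.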
Abstract:
A favoured method of assimilating information from state-of-the-art climate models into integrated assessment models of climate impacts is to use the transient climate response (TCR) of the climate models as an input, sometimes accompanied by a pattern matching approach to provide spatial information. More recent approaches to the problem use TCR with another independent piece of climate model output: the land-sea surface warming ratio (φ). In this paper we show why the use of φ in addition to TCR has such utility. Multiple linear regressions of surface temperature change onto TCR and φ in 22 climate models from the CMIP3 multi-model database show that the inclusion of φ explains a much greater fraction of the inter-model variance than using TCR alone. The improvement is particularly pronounced in North America and Eurasia in the boreal summer season, and in the Amazon all year round. The use of φ as the second metric is beneficial for three reasons: firstly, it is uncorrelated with TCR in state-of-the-art climate models and can therefore be considered an independent metric; secondly, because of its projected time-invariance, the magnitude of φ is better constrained than TCR in the immediate future; thirdly, the use of two variables is much simpler than approaches such as pattern scaling from climate models. Finally we show how using the latest estimates of φ from climate models, with a mean value of 1.6 (as opposed to previously reported values of 1.4), can significantly increase the mean time-integrated discounted damage projections in a state-of-the-art integrated assessment model by about 15%. When compared to damages calculated without the inclusion of the land-sea warming ratio, this figure rises to 65%, equivalent to almost 200 trillion dollars over 200 years.
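The central calculation is a two-predictor multiple linear regression across models. A minimal sketch with fabricated numbers standing in for the CMIP3 output might look like this:

```python
# Regress local warming onto TCR and the land-sea warming ratio (phi)
# across 22 hypothetical models; all values here are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n_models = 22
tcr = rng.uniform(1.2, 2.6, n_models)   # transient climate response (K)
phi = rng.uniform(1.3, 1.9, n_models)   # land-sea surface warming ratio
# Fabricated "local warming" with contributions from both predictors.
local_dT = 0.8 * tcr + 1.5 * phi + rng.normal(0, 0.1, n_models)

# Design matrix with an intercept column; ordinary least squares fit.
X = np.column_stack([np.ones(n_models), tcr, phi])
coef, *_ = np.linalg.lstsq(X, local_dT, rcond=None)

fitted = X @ coef
r2 = 1 - np.sum((local_dT - fitted) ** 2) / np.sum((local_dT - local_dT.mean()) ** 2)
print(f"intercept={coef[0]:.2f}, b_TCR={coef[1]:.2f}, b_phi={coef[2]:.2f}, R^2={r2:.2f}")
```

Comparing this fit against one that uses TCR alone would show the extra inter-model variance explained by φ, which is the paper's key point.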
Abstract:
Introduction. Feature usage is a prerequisite to realising the benefits of investments in feature-rich systems. We propose that conceptualising the dependent variable 'system use' as 'level of use' and specifying it as a formative construct has greater value for measuring the post-adoption use of feature-rich systems. We then validate the content of the construct as a first step in developing a research instrument to measure it. The context of our study is the post-adoption use of electronic medical records (EMR) by primary care physicians. Method. Initially, a literature review of the empirical context defines the scope based on prior studies. Core features identified from the literature are then refined with the help of experts in a consensus-seeking process that follows the Delphi technique. Results. The methodology was successfully applied to EMRs, which were selected as an example of feature-rich systems. A review of EMR usage and regulatory standards provided the feature input for the first round of the Delphi process. A panel of experts then reached consensus after four rounds, identifying ten task-based features that would be indicators of level of use. Conclusions. To study why some users deploy more advanced features than others, theories of post-adoption require a rich formative dependent variable that measures level of use. We have demonstrated that a context-sensitive literature review followed by refinement through a consensus-seeking process is a suitable methodology for validating the content of this dependent variable. This is the first step of instrument development, prior to statistical confirmation with a larger sample.
Abstract:
Introduction: Care home residents are at particular risk from medication errors, and our objective was to determine the prevalence and potential harm of prescribing, monitoring, dispensing and administration errors in UK care homes, and to identify their causes. Methods: A prospective study of a random sample of residents within a purposive sample of homes in three areas. Errors were identified by patient interview, note review, observation of practice and examination of dispensed items. Causes were understood by observation and from theoretically framed interviews with home staff, doctors and pharmacists. Potential harm from errors was assessed by expert judgement. Results: The 256 residents recruited in 55 homes were taking a mean of 8.0 medicines. One hundred and seventy-eight residents (69.5%) had one or more errors; the mean number of errors per resident was 1.9. The mean potential harm from prescribing, monitoring, administration and dispensing errors was 2.6, 3.7, 2.1 and 2.0 (0 = no harm, 10 = death), respectively. Contributing factors from the 89 interviews included doctors who were not accessible, did not know the residents and lacked information in homes when prescribing; home staff's high workload, lack of medicines training and drug round interruptions; lack of teamwork among home, practice and pharmacy; inefficient ordering systems; inaccurate medicine records and the prevalence of verbal communication; and difficult-to-fill (and check) medication administration systems. Conclusions: That two-thirds of residents were exposed to one or more medication errors is of concern. The will to improve exists, but there is a lack of overall responsibility. Action is required from all concerned.
Abstract:
Climate data are used in a number of applications including climate risk management and adaptation to climate change. However, the availability of climate data, particularly throughout rural Africa, is very limited. Available weather stations are unevenly distributed and mainly located along main roads in cities and towns. This imposes severe limitations on the availability of climate information and services for the rural community where, arguably, these services are needed most. Weather station data also suffer from gaps in the time series. Satellite proxies, particularly satellite rainfall estimates, have been used as alternatives because of their availability even over remote parts of the world. However, satellite rainfall estimates also suffer from a number of critical shortcomings, including heterogeneous time series, short periods of observation, and poor accuracy, particularly at higher temporal and spatial resolutions. An attempt is made here to alleviate these problems by combining station measurements with the complete spatial coverage of satellite rainfall estimates. Rain gauge observations are merged with a locally calibrated version of the TAMSAT satellite rainfall estimates to produce over 30 years (1983 to date) of rainfall estimates over Ethiopia at a spatial resolution of 10 km and a ten-day time scale. This involves quality control of the rain gauge data, generating a locally calibrated version of the TAMSAT rainfall estimates, and combining these with rain gauge observations from the national station network. The infrared-only satellite rainfall estimates produced using the relatively simple TAMSAT algorithm performed as well as or even better than other satellite rainfall products that use passive microwave inputs and more sophisticated algorithms. There is no substantial difference between the gridded-gauge and combined gauge-satellite products over the test area in Ethiopia, which has a dense station network; however, the combined product exhibits better quality over parts of the country where stations are sparsely distributed.
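The merging step can be illustrated with a deliberately simplified scheme that spreads gauge-minus-satellite residuals over the grid using inverse-distance weighting; this is an assumption-laden sketch, not the actual TAMSAT calibration or merging algorithm.

```python
# Simplified gauge-satellite merging: adjust a gridded satellite rainfall
# field towards rain-gauge observations. Illustrative only.
import numpy as np

def merge(sat_grid, grid_xy, gauges_xy, gauge_vals, power=2.0):
    """sat_grid: (n_cells,) satellite estimates at grid points grid_xy (n_cells, 2).
    gauges_xy: (n_gauges, 2) station locations; gauge_vals: (n_gauges,) observations."""
    # Satellite value at each gauge location: take the nearest grid cell.
    d_gs = np.linalg.norm(gauges_xy[:, None, :] - grid_xy[None, :, :], axis=2)
    sat_at_gauge = sat_grid[np.argmin(d_gs, axis=1)]
    diff = gauge_vals - sat_at_gauge            # gauge-minus-satellite residuals

    # Spread residuals over the grid with inverse-distance weights.
    d = np.linalg.norm(grid_xy[:, None, :] - gauges_xy[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-6) ** power
    w /= w.sum(axis=1, keepdims=True)
    adjustment = w @ diff
    return np.maximum(sat_grid + adjustment, 0.0)   # rainfall cannot be negative

# Toy example: four grid cells on a line, two gauges.
grid_xy = np.array([[0., 0.], [1., 0.], [2., 0.], [3., 0.]])
sat = np.array([10., 12., 8., 9.])
gauges = np.array([[0.9, 0.], [2.9, 0.]])
obs = np.array([15., 7.])
print(merge(sat, grid_xy, gauges, obs))
```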
Abstract:
Decadal climate predictions exhibit large biases, which are often subtracted and forgotten. However, understanding the causes of bias is essential to guide efforts to improve prediction systems, and may offer additional benefits. Here the origins of biases in decadal predictions are investigated, including whether analysis of these biases might provide useful information. The focus is especially on the lead-time-dependent bias tendency. A “toy” model of a prediction system is initially developed and used to show that there are several distinct contributions to bias tendency. Contributions from sampling of internal variability and a start-time-dependent forcing bias can be estimated and removed to obtain a much improved estimate of the true bias tendency, which can provide information about errors in the underlying model and/or errors in the specification of forcings. It is argued that the true bias tendency, not the total bias tendency, should be used to adjust decadal forecasts. The methods developed are applied to decadal hindcasts of global mean temperature made using the Hadley Centre Coupled Model, version 3 (HadCM3), climate model, and it is found that this model exhibits a small positive bias tendency in the ensemble mean. When considering different model versions, it is shown that the true bias tendency is very highly correlated with both the transient climate response (TCR) and non–greenhouse gas forcing trends, and can therefore be used to obtain observationally constrained estimates of these relevant physical quantities.
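As a rough illustration of estimating a lead-time-dependent bias tendency from a set of hindcasts, the sketch below fits a linear trend to mean errors on synthetic data; it is not the paper's toy model or the HadCM3 analysis.

```python
# Estimate a bias tendency: average hindcast-minus-observation errors
# over start dates at each lead time, then fit a linear trend.
import numpy as np

rng = np.random.default_rng(1)
n_starts, n_leads = 20, 10          # hindcast start dates x lead years

# Fabricated errors with a true drift of 0.03 K per lead year plus noise
# (standing in for internal variability and forcing-bias contributions).
lead = np.arange(1, n_leads + 1)
errors = 0.03 * lead + rng.normal(0, 0.15, size=(n_starts, n_leads))

mean_bias = errors.mean(axis=0)     # mean bias at each lead time
tendency = np.polyfit(lead, mean_bias, 1)[0]
print(f"estimated bias tendency: {tendency:.3f} K per lead year")
```

In the paper's terms, the contributions from sampling of internal variability and start-time-dependent forcing bias would be estimated and removed from such a raw estimate before interpreting the remainder as the true bias tendency.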