910 results for click-and-use software


Abstract:

A principal objective of software engineering is to be able to produce complex, large-scale, reliable software within a reasonable time. Object-oriented (OO) technology has provided good concepts and modelling and programming techniques that have made it possible to develop complex applications both in academia and in industry. This experience has, however, revealed the weaknesses of the object paradigm (for example, code scattering and the traceability problem). Aspect-oriented (AO) programming offers a simple solution to the limitations of OO programming, such as the problem of crosscutting concerns. These crosscutting concerns manifest themselves as the scattering of the same code across several modules of the system, or the tangling of several pieces of code within a single module. This new way of programming makes it possible to implement each concern independently of the others and then to assemble them according to well-defined rules. AO programming therefore promises better productivity, better code reuse, and better adaptation of code to change. This new approach quickly spread across the entire software development process, with the goal of preserving modularity and traceability, two important properties of good-quality software. However, AO technology presents numerous challenges. Reasoning about, specifying, and verifying AO programs is difficult, all the more so as these programs evolve over time. Consequently, modular reasoning about these programs is required; otherwise they would need to be re-examined in their entirety every time a component is changed or added. It is, however, well known in the literature that modular reasoning about AO programs is difficult, since the applied aspects often change the behaviour of their base components [47]. The same difficulties arise in the specification and verification phases of the software development process. To the best of our knowledge, modular specification and modular verification are weakly covered and constitute a very interesting field of research. Likewise, interactions between aspects are a serious problem in the aspect community. To address these problems, we have chosen to use category theory and algebraic specification techniques. To provide a solution to the problems cited above, we have used the work of Wiels [110] and other contributions such as those described in the book [25]. We assume that the system under development is already decomposed into aspects and classes. The first contribution of this thesis is the extension of algebraic specification techniques to the notion of aspect. Second, we defined a logic, LA, which is used in the body of specifications to describe the behaviour of these components. The third contribution consists in the definition of the weaving operator, which corresponds to the interconnection relation between aspect modules and class modules. The fourth contribution concerns the development of a prevention mechanism that makes it possible to prevent undesirable interactions in aspect-oriented systems.
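
To make the idea of a crosscutting concern and its weaving concrete, here is a minimal sketch using Python decorators. It is only an analogy for illustration, not the categorical weaving operator or the LA logic defined in the thesis, and every name in it is hypothetical.

    import functools

    AUDIT_LOG = []

    def audited(func):
        """Crosscutting auditing concern, kept out of the base components."""
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            AUDIT_LOG.append((func.__name__, args, kwargs))
            return func(*args, **kwargs)
        return wrapper

    # Base components know nothing about auditing (no scattering or tangling).
    def open_account(owner):
        return {"owner": owner, "balance": 0}

    def deposit(account, amount):
        account["balance"] += amount

    # 'Weaving': the concern is attached by composition, not by editing the bases.
    open_account = audited(open_account)
    deposit = audited(deposit)

    if __name__ == "__main__":
        acc = open_account("alice")
        deposit(acc, 50)
        print(AUDIT_LOG)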

Abstract:

The open access movement and the open source software movement play an important role in the creation of knowledge, knowledge management and knowledge dissemination. Scholarly communication and publishing are increasingly taking place in the electronic environment. With a growing proportion of the scholarly record now existing only in digital format, serious issues regarding access and preservation are being raised that are central to future scholarship. Institutional Repositories provide access to past, present and future scholarly literature and research documentation; ensure its preservation; assist users in discovery and use; and offer educational programs to enable users to develop lifelong literacy. This paper explores these aspects, examining how the IR of Cochin University of Science & Technology supports the scientific community in knowledge creation, knowledge management, and knowledge dissemination.

Abstract:

The current research investigates the possibility of using unmodified and modified nanokaolin, multiwalled carbon nanotube (MWCNT) and graphene as fillers to impart enhancements in mechanical, thermal, and electrical properties to elastomers. Taking advantage of the latex blending method, nanoclay, MWCNT and graphene dispersions, prepared by ultrasound sonication, are dispersed in polymer latices. The improvement in material properties indicated better interaction between filler and polymer. MWCNT and graphene imparted electrical conductivity with simultaneous improvement in mechanical properties. Layered silicates prepared by the microwave method also significantly improve the mechanical properties of the nanocomposites. The thesis entitled ‘Studies on the use of Nanokaolin, MWCNT and Graphene in NBR and SBR’ consists of ten chapters. The first chapter is a concise introduction to nanocomposites, nanofillers, elastomeric matrices and applications of polymer nanocomposites. The state-of-the-art research in elastomer-based nanocomposites is also presented. At the end of this chapter the main objectives of the work are stated. Chapter 2 outlines the specifications of the various materials used and the details of the experimental techniques employed for preparing and characterizing nanocomposites. Chapter 3 includes the characterization of the nanofillers, the optimisation of the cure time of latex-based composites and the methods used for the preparation of latex-based and dry-rubber-based nanocomposites. Chapter 4 presents the reinforcing effect of the nanofillers in XNBR latex and the characterization of the nanocomposites. Chapter 5 comprises the effect of nanofillers on the properties of SBR latex and their characterization. Chapter 6 deals with the study of cure characteristics, mechanical and thermal properties and the characterization of NBR-based nanocomposites. Chapter 7 covers the microwave studies of MWCNT- and graphene-filled elastomeric nanocomposites. Chapter 8 gives details of the preparation of layered silicates, their characterization and their use in different elastomeric matrices. Chapter 9 is the study of the mechanical properties of nanoclay-incorporated nitrile gloves. Chapter 10 presents the summary and conclusions of the investigation.

Abstract:

Semantic Web Mining aims at combining the two fast-developing research areas of the Semantic Web and Web Mining. This survey analyzes the convergence of trends from both areas: growing numbers of researchers work on improving the results of Web Mining by exploiting semantic structures in the Web, and they make use of Web Mining techniques for building the Semantic Web. Last but not least, these techniques can be used for mining the Semantic Web itself. The second aim of this paper is to use these concepts to circumscribe what Web space is, what it represents and how it can be represented and analyzed. This is used to sketch the role that Semantic Web Mining, and the software agents and human agents involved in it, can play in the evolution of Web space.

Abstract:

As elsewhere, urban and peri-urban agriculture (UPA) in Kabul, Afghanistan has often been accused of being resource-inefficient and unsustainable, causing negative externalities for community health and the surroundings. These arise from the inappropriate management and use of agricultural inputs, often including pesticides and inter-city wastes containing heavy metal residues and pathogens. To address these concerns, parallel studies were conducted for two years in the UPA of Kabul, from April 2008 to October 2009, with the aims of quantifying the horizontal and vertical fluxes of carbon (C), nitrogen (N), phosphorus (P) and potassium (K); assessing heavy metal and pathogen contamination of UPA produce; and carrying out an economic analysis of the cereal, vegetable and grape production systems. The results from these three diverse UPA production systems can be summarised as follows. Biennial net balances in vegetable production systems were positive for N (80 kg ha-1), P (75 kg ha-1) and C (3,927 kg ha-1) and negative for K (-205 kg ha-1), whereas in cereal production systems biennial horizontal balances were positive for P (20 kg ha-1) and C (4,900 kg ha-1) and negative for N (-155 kg ha-1) and K (-355 kg ha-1); in vineyards the corresponding values were highly positive for N (295 kg ha-1), P (235 kg ha-1) and C (3,362 kg ha-1) and slightly positive for K (5 kg ha-1). Disregarding gaseous emissions of N and C, yearly leaching losses of N and P in selected vegetable gardens varied from 70-205 kg N ha-1 and 5-10 kg P ha-1. Manure and irrigation water contributed on average 12-79% of the total inputs of N, P and K, and 10-53% of the total inputs of C, in the gardens and fields. The elevated heavy metal and pathogen loads on fresh UPA vegetables reflected contamination from increasing traffic in the city, deposits from the past decades of war, and the lack of collection and treatment of raw inter-city wastes, which calls for solutions to protect consumer and producer health and to increase the reliability of UPA production. A cost-revenue analysis of all inputs and outputs of the cereal, vegetable and grape production systems over two years showed substantial differences in net UPA household income. To confirm these results, more detailed studies are needed, but tailoring and managing the application of inputs to crop needs will significantly enhance farmers' revenues as well as environmental and produce quality.

Abstract:

This study was conducted in 2010 in the Eastern Nuba Mountains, Sudan, to investigate ethnobotanical food and non-food uses of 16 wild edible fruit-producing trees. Quantitative and qualitative information was collected from 105 individuals distributed across 7 villages using a semi-structured questionnaire. Data were also gathered using a number of rapid rural appraisal techniques, including key informant interviews, group discussions, secondary data sources and direct observation. Data were analysed using the fidelity level and informant consensus factor methods to reveal the cultural importance of species and use categories. Utilisation for timber products was found to be of greater importance to the community than food uses, especially during periods when cultivated food was abundant. Balanites aegyptiaca, Ziziphus spina-christi and Tamarindus indica fruits were stated to be the most preferred over the others and highly marketable in most of the study sites. Harvesting for timber-based uses, in addition to agricultural expansion and overgrazing, was the principal threat to wild edible food-producing trees in the area. The on-and-off armed conflict prevailing in the area makes it crucial to conserve wild food trees, which usually play a more significant role in securing food supply during emergencies, especially in times of famine and war. Increasing the population's awareness of the importance of wild food trees and securing alternative income sources, other than wood products, is necessary in any rural development programme aiming to secure food and sustain its resources in the area.

Abstract:

This is a presentation for our year one course, INFO1008 Computational Systems. It covers the need for requirements capture and the difficulty of building a specification based on user information. We present UML Use Cases and Use Case diagrams as a way of capturing requirements from the user's point of view in a semi-structured way.

Abstract:

This presentation discusses the role and purpose of testing in the Systems/Software Development Life Cycle. We examine the consequences of the 'cost curve' on defect removal and how agile methods can reduce its effects. We concentrate on Black Box Testing and use Equivalence Partitioning and Boundary Value Analysis to construct the smallest number of test cases and test scenarios necessary for a test plan.
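
As a minimal sketch of the two techniques, assuming a hypothetical function under test (the grading rule and its boundaries below are invented for illustration):

    def classify_mark(mark: int) -> str:
        """Hypothetical function under test: 'fail' for 0-39, 'pass' for 40-100."""
        if not 0 <= mark <= 100:
            raise ValueError("mark out of range")
        return "pass" if mark >= 40 else "fail"

    # Equivalence Partitioning: one representative per partition keeps the test count small.
    partition_cases = [(-5, ValueError), (20, "fail"), (70, "pass"), (150, ValueError)]

    # Boundary Value Analysis: exercise the values on either side of each boundary.
    boundary_cases = [(-1, ValueError), (0, "fail"), (39, "fail"), (40, "pass"),
                      (100, "pass"), (101, ValueError)]

    def run(cases):
        for mark, expected in cases:
            if expected is ValueError:
                try:
                    classify_mark(mark)
                    raise AssertionError(f"{mark}: expected ValueError")
                except ValueError:
                    pass
            else:
                assert classify_mark(mark) == expected, f"{mark}: expected {expected}"

    if __name__ == "__main__":
        run(partition_cases)
        run(boundary_cases)
        print("all cases behave as expected")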

Abstract:

Wednesday 23rd April 2014. Speaker(s): Willi Hasselbring. Organiser: Leslie Carr. Time: 23/04/2014 14:00-15:00. Location: B32/3077. File size: 802Mb. Abstract: The internal behavior of large-scale software systems cannot be determined on the basis of static (e.g., source code) analysis alone. Kieker provides complementary dynamic analysis capabilities, i.e., monitoring/profiling and analyzing a software system's runtime behavior. Application Performance Monitoring is concerned with continuously observing a software system's performance-specific runtime behavior, including analyses like assessing service level compliance or detecting and diagnosing performance problems. Architecture Discovery is concerned with extracting architectural information from an existing software system, including both structural and behavioral aspects like identifying architectural entities (e.g., components and classes) and their interactions (e.g., local or remote procedure calls). In addition to the Architecture Discovery of Java systems, Kieker supports Architecture Discovery for other platforms, including legacy systems implemented, for instance, in C#, C++, Visual Basic 6, COBOL or Perl. Thanks to Kieker's extensible architecture it is easy to implement and use custom extensions and plugins. Kieker was designed for continuous monitoring in production systems, inducing only a very low overhead, which has been evaluated in extensive benchmark experiments. Please refer to http://kieker-monitoring.net/ for more information.
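
By way of illustration only, here is a small sketch of the idea behind operation-execution monitoring and a simple service-level check. It is plain Python rather than the Kieker API (Kieker itself is a Java framework), and every name in it is hypothetical.

    import functools
    import statistics
    import time

    MONITORING_RECORDS = []  # a real monitoring tool would write these asynchronously

    def monitored(func):
        """Record the operation name and response time for later offline analysis."""
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return func(*args, **kwargs)
            finally:
                elapsed = time.perf_counter() - start
                MONITORING_RECORDS.append({"operation": func.__name__,
                                           "response_time_s": elapsed})
        return wrapper

    @monitored
    def handle_request(n: int) -> int:
        return sum(range(n))  # stand-in for real application work

    def service_level_ok(threshold_s: float = 0.01) -> bool:
        """A toy service-level check: mean response time below a threshold."""
        times = [r["response_time_s"] for r in MONITORING_RECORDS]
        return statistics.mean(times) < threshold_s

    if __name__ == "__main__":
        for _ in range(100):
            handle_request(10_000)
        print(len(MONITORING_RECORDS), "records collected; SLA met:", service_level_ok())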

Abstract:

Introduction. Fractal geometry measures the irregularity of abstract and natural objects with the fractal dimension. Fractal calculations have been applied to structures of the human body and to quantifications in physiology from the theory of dynamic systems. Materials and Methods. The fractal dimensions, the number of occupied spaces at the box-counting space border and the areas of two groups of red blood cells, 7 normal (group A) and 7 abnormal (group B), obtained from patients and from transfusion bags, were calculated using the box-counting method and software developed for this purpose. The obtained measures were compared, looking for differences between normal and abnormal red blood cells, with the purpose of differentiating samples. Results. Abnormality is characterised by a number of occupied squares of the fractal space greater than or equal to 180; area values between 25.117 and 33.548 correspond to normality. In cases where the evaluation according to the number of squares indicates normality, it must be confirmed with the area value applied to adjacent red blood cells within the sample; values outside the established range and/or occupied spaces greater than or equal to 180 suggest abnormality of the sample. Conclusions. The developed methodology is effective in differentiating red blood cell alterations and is probably useful in the analysis of transfusion bags for clinical use.
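
A minimal sketch of the box-counting estimate of fractal dimension on a binary image follows; it assumes a NumPy array as input, and the grid sizes and test shape are illustrative rather than taken from the authors' software.

    import numpy as np

    def box_count(image: np.ndarray, box_size: int) -> int:
        """Count grid boxes of side `box_size` containing at least one foreground pixel."""
        h, w = image.shape
        occupied = 0
        for i in range(0, h, box_size):
            for j in range(0, w, box_size):
                if image[i:i + box_size, j:j + box_size].any():
                    occupied += 1
        return occupied

    def fractal_dimension(image: np.ndarray, box_sizes=(2, 4, 8, 16, 32)) -> float:
        """Estimate the box-counting dimension as the slope of log N(s) versus log(1/s)."""
        counts = [box_count(image, s) for s in box_sizes]
        slope, _ = np.polyfit(np.log(1.0 / np.asarray(box_sizes)), np.log(counts), 1)
        return float(slope)

    if __name__ == "__main__":
        # Illustrative test object: a filled disc, whose dimension should be close to 2.
        y, x = np.ogrid[:256, :256]
        disc = (x - 128) ** 2 + (y - 128) ** 2 <= 100 ** 2
        print(round(fractal_dimension(disc), 2))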

Abstract:

The aim of this thesis is to narrow the gap between two different control techniques: continuous control and discrete event (DES) control. This gap can be reduced by the study of hybrid systems and by interpreting the majority of large-scale systems as hybrid systems. In particular, when looking deeply into a process, it is often possible to identify interaction between discrete and continuous signals. Hybrid systems are systems that have both continuous and discrete signals. Continuous signals are generally assumed to be continuous and differentiable in time, whereas discrete signals are neither continuous nor differentiable in time because of their abrupt changes. Continuous signals often represent measures of natural physical magnitudes such as temperature or pressure; discrete signals are normally artificial signals operated through human artefacts such as current, voltage or light. Typical processes modelled as hybrid systems are production systems, chemical processes, or continuous production in which time and continuous measures interact with the transport and stock inventory system. Complex systems such as manufacturing lines are hybrid in a global sense: they can be decomposed into several subsystems and their links. Another motivation for the study of hybrid systems is the tools developed in other research domains. These tools benefit from the use of temporal logic for the analysis of several properties of hybrid system models, and use it to design systems and controllers that satisfy physical or imposed restrictions. This thesis focuses on particular types of systems with discrete and continuous signals in interaction, which can model hard non-linearities, such as hysteresis, jumps in the state, limit cycles, etc., and their possible non-deterministic future behaviour, expressed by an interpretable model description. The hybrid systems treated in this work are systems with several discrete states, always fewer than thirty (beyond which the problem can become NP-hard), and continuous dynamics evolving according to an expression with Ki ∈ Rn constant vectors or matrices for the state vector X; in several states the continuous evolution can have Ki = 0. In this formulation, the mathematics can express time-invariant linear systems. By using this expression for a local part, the combination of several local linear models makes it possible to represent non-linear systems, and through interaction with the discrete events of the system the model can compose non-linear hybrid systems. Multistage processes with fast continuous dynamics, in particular, are well represented by the proposed methodology. State vectors with more than two components, that is, third-order models or higher, are well captured by the proposed approximation. Flexible belt transmissions, chemical reactions with an initial start-up phase, and mobile robots with significant friction are physical systems that profit from the accuracy of the proposed methodology. The motivation of this thesis is to obtain a solution that can control and drive a hybrid system from the origin or starting point to the goal. How to obtain this solution, and which solution is best in terms of a cost function subject to the physical restrictions and control actions, is analysed. Hybrid systems that have several possible states, different ways to drive the system to the goal and different continuous control signals are the problems that motivate this research.
The requirements for the system on which we work are: a model that can represent the behaviour of non-linear systems and enables the prediction of possible future behaviour, so that a supervisor can decide the optimal and safe action to drive the system towards the goal. Specific problems that can be addressed by this kind of hybrid model are: the unity of order; control of the system along a reachable path; control of the system along a safe path; optimisation of the cost function; and modularity of control. The proposed model solves the specified problems in the switching-model problem, the initial-condition calculus and the unity of the order of the models. Continuous and discrete phenomena are represented in linear hybrid models, defined by an eight-tuple of parameters to model different types of hybrid phenomena. Applying a transformation over the state vector, for an LTI system we obtain from a two-dimensional state space a single parameter, alpha, which still maintains the dynamical information. Combining this parameter with the system output, a complete description of the system is obtained in the form of a graph in polar representation. A Takagi-Sugeno type III fuzzy model, which includes a linear time-invariant (LTI) model for each local model, is used; the fuzzification of the different LTI local models yields a non-linear time-invariant model. In our case the output and the alpha measure govern the membership functions. Hybrid system control is a huge task: the process needs to be guided from the starting point to the desired end point, passing through different specific states and points of the trajectory. The system can be structured in different levels of abstraction, and the control of hybrid systems in three layers, from planning the process to producing the actions: the planning, process and control layers. In this case the algorithms will be applied to robotics, a domain where improvements are well accepted; it is expected to find simple repetitive processes for which the extra effort in complexity can be compensated by some cost reduction. It may also be interesting to apply some control optimisation to processes such as fuel injection, DC-DC converters, etc. In order to apply the Ramadge-Wonham (RW) theory of discrete event systems to a hybrid system, we must abstract the continuous signals and project the events generated by these signals in order to obtain new sets of observable and controllable events. Ramadge and Wonham's theory, along with the TCT software, gives a controllable sublanguage of the legal language generated by a Discrete Event System (DES). Continuous abstraction transforms predicates over continuous variables into controllable or uncontrollable events, and modifies the sets of controllable, uncontrollable, observable and unobservable events. Continuous signals produce virtual events in the system when they cross the bound limits. If such an event is deterministic, it can be projected; it is necessary to determine its controllability in order to assign it to the corresponding set of controllable, uncontrollable, observable or unobservable events. Finding optimal trajectories that minimise some cost function is the goal of the modelling procedure. A mathematical model of the system allows the user to apply mathematical techniques to this expression: to minimise a specific cost function, to obtain optimal controllers and to approximate a specific trajectory.
The combination of dynamic programming with Bellman's principle of optimality gives us the procedure to solve the minimum-time trajectory problem for hybrid systems. The problem is harder when there is interaction between adjacent states. In hybrid systems the problem is to determine the partial set points to be applied to the local models. An optimal controller can be implemented in each local model in order to ensure the minimisation of the local costs. The solution of this problem needs to give us the trajectory the system should follow, a trajectory marked by a set of set points that force the system to pass over them. Several ways are possible to drive the system from the starting point Xi to the end point Xf; different ways are interesting in the dynamic sense, in the number of states, in the approximation of set points, etc. These ways need to be safe, viable and reachable, and only one of them must be applied, normally the best one, which minimises the proposed cost function. A reachable way, meaning a controllable and safe way, will be evaluated in order to determine which one minimises the cost function. The contribution of this work is a complete framework for working with the majority of hybrid systems; the procedures to model, control and supervise are defined, explained and their use demonstrated. The procedure to model the systems to be analysed for automatic verification is also explained. Great improvements were obtained by using this methodology in comparison with other piecewise linear approximations, and it is demonstrated that in particular cases this methodology can provide a better approximation. The most important contribution of this work is the alpha approximation for non-linear systems with fast dynamics: while this kind of process is not typical, in such cases the alpha approximation is the best linear approximation to use and gives a compact representation.
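
Since the abstract's own expressions are not reproduced above, the following is a hedged sketch, in generic notation, of the kind of mode-dependent local dynamics and Bellman recursion it refers to; the exact forms used in the thesis may differ.

    % Assumed generic form: discrete mode i with local affine (LTI) continuous dynamics
    \dot{X} = A_i X + K_i, \qquad K_i \in \mathbb{R}^n, \quad i \in \{1, \dots, m\}, \; m < 30
    % Bellman's principle of optimality, discretised, for a minimum-time trajectory
    J^{*}(X) = \min_{u \in U} \left[ \Delta t + J^{*}\!\left(X + \dot{X}\,\Delta t\right) \right]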

Abstract:

One of the distinctive characteristics of the water supply system of Greater Amman, the capital of Jordan, is that it has been based on a regime of rationing since 1987, with households receiving water once a week for various durations. This reflects the fact that while Amman's recent growth has been phenomenal, Jordan is one of the ten most water-scarce nations on earth. Amman is highly polarised socio-economically, and by means of household surveys conducted in both high- and low-income divisions of the city, the aim has been to provide detailed empirical evidence concerning the storage and use of water, the strategies used by households to manage water, and overall satisfaction with water supply issues, looking specifically at issues of social equity. The analysis demonstrates that the social costs of water rationing and consequent household management are high, as well as emphasising that issues of water quality are of central importance to all consumers.

Abstract:

The community pharmacy service medicines use review (MUR) was introduced in 2005 ‘to improve patient knowledge, concordance and use of medicines’ through a private patient–pharmacist consultation. The MUR represents a fundamental change in community pharmacy service provision. While traditionally pharmacists are dispensers of medicines and providers of medicines advice, and patients are recipients, the MUR has pharmacists providing consultation-type activities and patients as active participants. The MUR facilitates a two-way discussion about medicines use. Traditional patient–pharmacist behaviours transform into a new set of behaviours involving the booking of appointments, consultation processes and form completion, and the physical environment of the patient–pharmacist interaction moves from the traditional setting of the dispensary and medicines counter to a private consultation room. Thus, the new service challenges the traditional identities and behaviours of the patient and the pharmacist as well as the environment in which the interaction takes place. In 2008, the UK government concluded that there is at present too much emphasis on the quantity of MURs rather than on their quality.[1] A number of plans to remedy the perceived imbalance included a suggestion to reward ‘health outcomes’ achieved, with calls for a more focussed and scientific approach to the evaluation of pharmacy services using outcomes research. Specifically, the UK government set out the principal research areas for the evaluation of pharmacy services to include ‘patient and public perceptions and satisfaction’ as well as ‘impact on care and outcomes’. A limited number of ‘patient satisfaction with pharmacy services’ type questionnaires are available, of varying quality, measuring dimensions relating to pharmacists’ technical competence, behavioural impressions and general satisfaction. For example, an often-cited paper by Larson[2] uses two factors to measure satisfaction, namely ‘friendly explanation’ and ‘managing therapy’; the factors are highly interrelated and the questions somewhat awkwardly phrased, but more importantly, we believe the questionnaire excludes some specific domains unique to the MUR. By conducting patient interviews with recent MUR recipients, we have been working to identify relevant concepts and develop a conceptual framework to inform item development for a Patient Reported Outcome Measure questionnaire bespoke to the MUR. We note with interest the recent launch of a multidisciplinary audit template by the Royal Pharmaceutical Society of Great Britain (RPSGB) in an attempt to review the effectiveness of MURs and improve their quality.[3] This template includes an MUR ‘patient survey’. We will discuss this ‘patient survey’ in light of our work and existing patient satisfaction with pharmacy questionnaires, outlining a new conceptual framework as a basis for measuring patient satisfaction with the MUR. Ethical approval for the study was obtained from the NHS Surrey Research Ethics Committee on 2 June 2008.
References
1. Department of Health (2008). Pharmacy in England: Building on Strengths – Delivering the Future. London: HMSO. www.official-documents.gov.uk/document/cm73/7341/7341.pdf (accessed 29 September 2009).
2. Larson LN et al. Patient satisfaction with pharmaceutical care: update of a validated instrument. J Am Pharm Assoc 2002; 42: 44–50.
3. Royal Pharmaceutical Society of Great Britain (2009). Pharmacy Medicines Use Review – Patient Audit. London: RPSGB. http://qi4pd.org.uk/index.php/Medicines-Use-Review-Patient-Audit.html (accessed 29 September 2009).

Abstract:

In the past decade, a number of mechanistic, dynamic simulation models of several components of the dairy production system have become available. However, their use has been limited by the detailed technical knowledge and special software required to run them, and by the lack of compatibility between models in predicting various metabolic processes in the animal. The first objective of the current study was to integrate the dynamic models of [Brit. J. Nutr. 72 (1994) 679] on rumen function, [J. Anim. Sci. 79 (2001) 1584] on methane production, [J. Anim. Sci. 80 (2002) 2481] on N partition, and a new model of P partition. The second objective was to construct a decision support system to analyse nutrient partition between animal and environment. The integrated model combines key environmental pollutants such as N, P and methane within a nutrient-based feed evaluation system. The model was run under different scenarios and the sensitivity of various parameters analysed. A comparison of predictions from the integrated model with the original simulation models showed an improvement in N excretion, since the integrated model uses the dynamic model of [Brit. J. Nutr. 72 (1994) 679] to predict microbial N, which was not represented in detail in the original model. The integrated model can be used to investigate the degree to which production and environmental objectives are antagonistic, and it may help to explain and understand the complex mechanisms involved at the ruminal and metabolic levels. Among the outputs of the integrated model were the forms of N and P in excreta and methane, which can be used as indices of environmental pollution. © 2004 Elsevier B.V. All rights reserved.
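
As an illustration of what ‘nutrient partition between animal and environment’ amounts to, here is a toy mass-balance sketch; the function name and coefficients are hypothetical and are not taken from the integrated model described above (only the milk protein-to-nitrogen factor of 6.38 is a standard conversion).

    def nitrogen_partition(n_intake_g_per_day: float, milk_yield_kg_per_day: float,
                           milk_protein_fraction: float = 0.033) -> dict:
        """Toy N balance: whatever is not captured in milk or retained is excreted."""
        milk_n = milk_yield_kg_per_day * 1000 * milk_protein_fraction / 6.38
        retained_n = 0.02 * n_intake_g_per_day        # assumed small body retention
        excreted_n = n_intake_g_per_day - milk_n - retained_n
        return {"milk_N_g": milk_n, "retained_N_g": retained_n, "excreted_N_g": excreted_n}

    if __name__ == "__main__":
        print(nitrogen_partition(n_intake_g_per_day=600, milk_yield_kg_per_day=30))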

Abstract:

Patients want and need comprehensive and accurate information about their medicines so that they can participate in decisions about their healthcare. In particular, they require information about the likely risks and benefits that are associated with the different treatment options. However, to provide this information in a form that people can readily understand and use is a considerable challenge to healthcare professionals. One recent attempt to standardise the language of risk has been to produce sets of verbal descriptors that correspond to specific probability ranges, such as those outlined in the European Commission (EC) Pharmaceutical Committee guidelines in 1998 for describing the incidence of adverse effects. This paper provides an overview of a number of studies, involving members of the general public, patients, and hospital doctors, that evaluated the utility of the EC guideline descriptors (very common, common, uncommon, rare, very rare). In all studies it was found that people significantly over-estimated the likelihood of adverse effects occurring, given specific verbal descriptors. This in turn resulted in significantly higher ratings of the perceived risks to health and significantly lower ratings of the likelihood of taking the medicine. Such problems of interpretation are not restricted to the EC guideline descriptors. Similar levels of misinterpretation have also been demonstrated with two other recently advocated risk scales (Calman's verbal descriptor scale and Barclay, Costigan and Davies' lottery scale). In conclusion, the challenge for risk communicators and for future research will be to produce a language of risk that is sufficiently flexible to take into account different perspectives, as well as changing circumstances and contexts of illness and its treatments. In the meantime, we urge the EC and other legislative bodies to stop recommending the use of specific verbal labels or phrases until there is a stronger evidence base to support their use.
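
For orientation, the probability bands usually associated with these descriptors are roughly: very common above 10%, common 1-10%, uncommon 0.1-1%, rare 0.01-0.1%, and very rare below 0.01%. Treat these cut-offs, and the sketch below mapping an observed frequency to a descriptor, as assumptions to be checked against the 1998 guideline rather than a quotation of it.

    # Assumed EC-style frequency bands as (upper bound, descriptor); verify the exact
    # cut-offs against the 1998 guideline before relying on them.
    BANDS = [(0.0001, "very rare"), (0.001, "rare"), (0.01, "uncommon"),
             (0.1, "common"), (1.0, "very common")]

    def describe_frequency(probability: float) -> str:
        """Map an adverse-effect probability (0-1) to a verbal descriptor."""
        if not 0.0 <= probability <= 1.0:
            raise ValueError("probability must be between 0 and 1")
        for upper, label in BANDS:
            if probability <= upper:
                return label
        return "very common"

    if __name__ == "__main__":
        print(describe_frequency(0.004))  # 'uncommon' under the assumed bands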