35 results for behind-the-counter-lääkkeet
Abstract:
The purpose of this paper is to present the outcomes of a so-called “employability management needs analysis” intended to provide more insight into current employability management activities and their possible benefits for Information and Communication Technology (ICT) professionals working in Small- and Medium-sized Enterprises (SMEs) throughout Europe. A considerable series of interviews (N=107) was conducted with managers in SMEs in seven European countries: Germany, Greece, Italy, the Netherlands, Norway, Poland, and the UK. A semi-structured interview protocol was used during the interviews to cover three issues: employability (13 items), ageing (8 items), and future developments and requirements (13 items). All final interview transcriptions were analysed at a national level using an elaborate common coding scheme. Although an interest in employability emerged, actual policy and action lagged behind; this lag appeared connected to the recession in the ICT sector at the time of the investigation and to the developmental stage of the sector in each participating country. Ageing was not seen as a major issue in the ICT sector because managers considered ICT to be a relatively young sector. There appeared to be a serious lack of investment in the development of expertise of ICT professionals. Generalization of the results to large organizations in the ICT sector should be made with caution. The interview protocol developed is of value for further research and complements survey research undertaken within the employability field of study. It can be concluded that proactive Human Resource Management (HRM) policies and strategies are essential, even in times of economic downturn. Employability management activities are especially important in the light of current career issues. The study advances knowledge of HRM practices adopted by SMEs in the ICT sector, especially as there is a gap in knowledge about career development issues in that particular sector.
Abstract:
Factors affecting the current role of the community pharmacist in responding to symptoms are investigated. Communication and collaboration with general medical practitioners (GPs), and the competency of pharmacists and counter assistants to perform the role of responding to symptoms, are examined. A national survey of GPs, conducted by postal questionnaire, explores attitudes towards the role of the community pharmacist in the treatment of patients' symptoms, and towards future extension of such a role. A majority (over 90%) of respondents thought that the counter prescribing activities of the pharmacist should be maintained or increased. Doctors supported treatment of most minor illnesses by pharmacists, but there was relatively little support for the deregulation of selected Prescription Only Medicines. Three quarters of respondents were in favour of joint educational meetings for pharmacists and doctors. Most GPs (85%) expressed support for a formal referral route from pharmacists to doctors, using a "notification card". A pilot study of the use of a notification card was conducted. Two thirds of the patients who were advised to see their doctor by the pharmacist subsequently did so. In most cases, the GP rated the patients' symptoms "significant" and the card "helpful". Pharmacists' and counter assistants' competency in responding to symptoms was assessed by a programme of pharmacy visits, where previously-defined symptoms were presented. Some pharmacists' questioning skills were found to be inadequate, and their knowledge not sufficiently current. Counter assistants asked fewer and less appropriate questions than did pharmacists, and assistants' knowledge base was shown to be inadequate. Recommendations are made in relation to the education and training of pharmacists and counter assistants in responding to symptoms.
Abstract:
The work presented in this thesis is concerned with the dynamic behaviour of structural joints which are both loaded, and excited, normal to the joint interface. Since the forces on joints are transmitted through their interface, the surface texture of joints was carefully examined. A computerised surface measuring system was developed and computer programs were written. Surface flatness was functionally defined, measured and quantised into a form suitable for the theoretical calculation of the joint stiffness. Dynamic stiffness and damping were measured at various preloads for a range of joints with different surface textures. Dry clean and lubricated joints were tested, and the results indicated an increase in damping for the lubricated joints of between 30 and 100 times. A theoretical model for the computation of the stiffness of dry clean joints was built. The model is based on the theory that the elastic recovery of joints is due to the recovery of the material behind the loaded asperities. It takes into account, in a quantitative manner, the flatness deviations present on the surfaces of the joint. The theoretical results were found to be in good agreement with those measured experimentally. It was also found that theoretical assessment of the joint stiffness could be carried out using a different model based on the recovery of loaded asperities into a spherical form. Stepwise procedures are given in order to design a joint having a particular stiffness. A theoretical model for the loss factor of dry clean joints was built. The theoretical results are in reasonable agreement with those experimentally measured. The theoretical models for the stiffness and loss factor were employed to evaluate the second natural frequency of the test rig. The results are in good agreement with the experimentally measured natural frequencies.
Abstract:
The study investigated the potential applications and the limitations of non-standard techniques of visual field investigation utilizing automated perimetry. Normal subjects exhibited a greater sensitivity to kinetic stimuli than to static stimuli of identical size. The magnitude of physiological stato-kinetic dissociation (SKD) was found to be largely independent of age, stimulus size, meridian and eccentricity. The absence of a dependency on stimulus size indicated that successive lateral spatial summation could not totally account for the underlying mechanism of physiological SKD. The visual field indices MD and LV exhibited a progressive deterioration during the time course of a conventional central visual field examination, both for normal subjects and for ocular hypertensive patients. The fatigue effect was more pronounced in the latter stages and for the second eye tested. The confidence limits for the definition of abnormality should reflect the greater effect of fatigue on the second eye. A 330 cd m⁻² yellow background was employed for blue-on-yellow perimetry. Instrument measurement range was preserved by positioning a concave mirror behind the stimulus bulb to increase the light output by 60%. The mean magnitude of SWS pathway isolation was approximately 1.4 log units relative to a 460 nm stimulus filter. The absorption spectra of the ocular media exhibited an exponential increase with increase in age, whilst that of the macular pigment showed no systematic trend. The magnitude of ocular media absorption was demonstrated to reduce with increase in wavelength. Ocular media absorption was significantly greater in diabetic patients than in normal subjects. Five diabetic patients with either normal or borderline achromatic sensitivity exhibited an abnormal blue-on-yellow sensitivity; two of these patients showed no signs of retinopathy. A greater vulnerability of the SWS pathway to the diabetic disease process was hypothesized.
Abstract:
The mechanism behind the immunostimulatory effect of the cationic liposomal vaccine adjuvant dimethyldioctadecylammonium and trehalose 6,6′-dibehenate (DDA:TDB) has been linked to the ability of these cationic vesicles to promote a depot after administration, with the liposomal adjuvant and the antigen both being retained at the injection site. This can be attributed to their cationic nature, since reduction in vesicle size does not influence their distribution profile, yet neutral or anionic liposomes have more rapid clearance rates. Therefore the aim of this study was to investigate the impact of a combination of reduced vesicle size and surface PEGylation on the biodistribution and adjuvanticity of the formulations, in a bid to further manipulate the pharmacokinetic profiles of these adjuvants. From the biodistribution studies, it was found that with small unilamellar vesicles (SUVs), 10% PEGylation of the formulation could influence liposome retention at the injection site after 4 days, whilst higher levels (25 mol%) of PEG blocked the formation of a depot and promoted clearance to the draining lymph nodes. Interestingly, whilst the use of 10% PEG in the small unilamellar vesicles did not block the formation of a depot at the site of injection, it did result in earlier antibody response rates and switched the type of T cell responses from a Th1 to a Th2 bias, suggesting that the presence of PEG in the formulation not only controls the biodistribution of the vaccine but also results in different types of interactions with innate immune cells. © 2012 Elsevier B.V.
Abstract:
The concept of plagiarism is not uncommonly associated with the concept of intellectual property, for both historical and legal reasons: the approach to the ownership of ‘moral’, nonmaterial goods has evolved into the right to individual property, and consequently a need arose to establish a legal framework to cope with the infringement of those rights. The solution to plagiarism therefore falls most often under two categories: ethical and legal. On the ethical side, education and intercultural studies have addressed plagiarism critically, not only as a means to improve academic ethics policies (PlagiarismAdvice.org, 2008), but mainly to demonstrate that, if anything, the concept of plagiarism is far from being universal (Howard & Robillard, 2008). Although in different ways, Howard (1995) and Scollon (1994, 1995) argued, and Angèlil-Carter (2000) and Pecorari (2008) later emphasised, that the concept of plagiarism cannot be studied on the assumption that one definition is clearly understandable by everyone. Scollon (1994, 1995), for example, claimed that authorship attribution is a particular problem in non-native writing in English, and so did Pecorari (2008) in her comprehensive analysis of academic plagiarism. If among higher education students plagiarism is often a problem of literacy, with prior, conflicting social discourses that may interfere with academic discourse, as Angèlil-Carter (2000) demonstrates, we then have to accept that a distinction should be made between intentional and inadvertent plagiarism: plagiarism should be prosecuted when intentional, but if it is part of the learning process and results from the plagiarist’s unfamiliarity with the text or topic, it should be considered ‘positive plagiarism’ (Howard, 1995: 796) and hence not an offense. Determining the intention behind the instances of plagiarism therefore determines the nature of the disciplinary action adopted.
Unfortunately, in order to demonstrate the intention to deceive and charge students with accusations of plagiarism, teachers necessarily have to position themselves as ‘plagiarism police’, although it has been argued otherwise (Robillard, 2008). Practice demonstrates that in their daily activities teachers find themselves required to command investigative skills and tools that they most often lack. We thus claim that the ‘intention to deceive’ cannot invariably be dissociated from plagiarism as a legal issue, even if Garner (2009) asserts that generally plagiarism is immoral but not illegal, and Goldstein (2003) makes the same distinction. However, these claims, and the claim that only cases of copyright infringement tend to go to court, have recently been challenged, mainly by forensic linguists, who have been actively involved in cases of plagiarism. Turell (2008), for instance, demonstrated that plagiarism is often connoted with an illegal appropriation of ideas. Previously, she (Turell, 2004) had demonstrated, by comparison of four translations of Shakespeare’s Julius Caesar into Spanish, that linguistic evidence can demonstrate instances of plagiarism. This challenge is also reinforced by practice in international organisations, such as the IEEE, for whom plagiarism potentially has ‘severe ethical and legal consequences’ (IEEE, 2006: 57). What the plagiarism definitions used by publishers and organisations have in common – and which academia usually lacks – is their focus on the legal nature. We speculate that this is due to the relation they intentionally establish with copyright laws, whereas in education the focus tends to shift from the legal to the ethical aspects. However, the number of plagiarism cases taken to court is very small, and jurisprudence is still being developed on the topic.
In countries within the Civil Law tradition, Turell (2008) claims, (forensic) linguists are seldom called upon as expert witnesses in cases of plagiarism, either because plagiarists are rarely taken to court or because there is little tradition of accepting linguistic evidence. In spite of the investigative and evidential potential of forensic linguistics to demonstrate the plagiarist’s intention or otherwise, this potential is restricted by the ability to identify a text as being suspected of plagiarism. In an era of such massive textual production, ‘policing’ plagiarism thus becomes an extraordinarily difficult task without the assistance of plagiarism detection systems. Although plagiarism detection has attracted the attention of computer engineers and software developers for years, a lot of research is still needed. Given the investigative nature of academic plagiarism, plagiarism detection has of necessity to consider not only concepts of education and computational linguistics, but also forensic linguistics, especially if it is to counter claims of being a ‘simplistic response’ (Robillard & Howard, 2008). In this paper, we use a corpus of essays written by university students who were accused of plagiarism to demonstrate that a forensic linguistic analysis of improper paraphrasing in suspect texts has the potential to identify and provide evidence of intention. A linguistic analysis of the corpus texts shows that the plagiarist acts on the paradigmatic axis to replace relevant lexical items with a related word from the same semantic field, i.e. a synonym, a subordinate, a superordinate, etc. In other words, relevant lexical items were replaced with related, but not identical, ones. Additionally, the analysis demonstrates that the word order is often changed intentionally to disguise the borrowing. On the other hand, the linguistic analysis of linking and explanatory verbs (i.e. 
referencing verbs) and prepositions shows that these have the potential to discriminate between instances of ‘patchwriting’ and instances of plagiarism. This research demonstrates that when the plagiarism is inadvertent, the referencing verbs are borrowed from the original in an attempt to construct the new text cohesively, whereas when it is intentional, the plagiarist has made an effort to prevent the reader from identifying the text as plagiarism. In some of these cases, the referencing elements prove able to identify direct quotations and thus ‘betray’ and denounce plagiarism. Finally, we demonstrate that a forensic linguistic analysis of these verbs is critical to allow detection software to identify them as proper paraphrasing and not – mistakenly and simplistically – as plagiarism.
Abstract:
This volume focuses on the closely allied yet differing linguistic varieties of Birmingham and its immediate neighbour to the west, the industrial heartland of the Black Country. Both of these areas rose to economic prominence and success during the Industrial Revolution, and both have suffered economically and socially as a result of post-war industrial decline. The industrial heritage of both areas has meant that tight-knit and socially homogeneous individual areas in each region have demonstrated in many respects little linguistic change over time, and have continued to exhibit linguistic features, especially morphological constructions, peculiar to these areas or now restricted to these areas. At the same time, immigration from other areas of the British Isles over time, from Commonwealth countries and later from EU member states, together with increased social mobility, has meant that newly developing structures and more widespread UK linguistic phenomena have spread into these varieties. This volume provides a clear description of the structure of the linguistic varieties spoken in the two areas. Following the structure of the Dialects of English volumes, it provides:
• A comprehensive overview of the phonological, grammatical and lexical structure of both varieties, as well as similarities between the two varieties and distinguishing features
• Thorough discussion of the historical and social factors behind the development of the varieties and the stigma attached to these varieties
• Discussion of the unusual situation of the Black Country as an area undefined in geographical and administrative terms, existing only in the imagination
• Examples of the variety from native speakers of differing ethnicities, ages and genders
• An annotated bibliography for further consultation
Abstract:
We experimentally demonstrate an all-optical binary counter composed of four semiconductor optical amplifier based all-optical switching gates. The time-of-flight optical circuit operates with bit-differential delays between the exclusive-OR gate used for modulo-2 binary addition and the AND gate used for binary carry detection. A movie of the counter operating in real time is presented.
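The modulo-2 addition (XOR) and carry detection (AND) described above are the classic half-adder decomposition of binary counting. As an illustrative software analogue only (the paper's actual implementation is an optical time-of-flight circuit; the `pulse` helper below is a hypothetical name), the equivalent ripple-counter logic can be sketched as:

```python
def pulse(counter):
    """Advance a ripple binary counter by one input pulse.

    Each stage holds one bit (counter[0] is the least significant).
    At each stage, XOR gives the modulo-2 sum that becomes the new
    stage value, and AND gives the carry passed to the next stage.
    """
    carry = 1  # the incoming pulse
    for i, bit in enumerate(counter):
        counter[i] = bit ^ carry   # modulo-2 binary addition (XOR gate)
        carry = bit & carry        # binary carry detection (AND gate)
    return counter

c = [0, 0, 0, 0]
for _ in range(5):
    pulse(c)
# c now encodes 5 (0b0101, least significant bit first)
assert c == [1, 0, 1, 0]
```

In the optical circuit, the analogous carry propagation is realised with bit-differential delays between the XOR and AND gates rather than with a sequential loop.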
Abstract:
Background - When a moving stimulus and a briefly flashed static stimulus are physically aligned in space, the static stimulus is perceived as lagging behind the moving stimulus. This widely replicated phenomenon is known as the Flash-Lag Effect (FLE). For the first time we employed biological motion as the moving stimulus, which is important for two reasons. Firstly, biological motion is processed by visual as well as somatosensory brain areas, which makes it a prime candidate for elucidating the interplay between the two systems with respect to the FLE. Secondly, discussions about the mechanisms of the FLE tend to resort to evolutionary arguments, while most studies employ highly artificial stimuli with constant velocities. Methodology/Principal Findings - Since biological motion is ecologically valid, it follows complex patterns with changing velocity. We therefore compared biological to symbolic motion with the same acceleration profile. Our results with 16 observers revealed a qualitatively different pattern for biological compared to symbolic motion, and this pattern was predicted by the characteristics of motor resonance: the amount of anticipatory processing of perceived actions, based on the induced perspective and agency, modulated the FLE. Conclusions/Significance - Our study provides the first evidence for an FLE with non-linear motion in general and with biological motion in particular. Our results suggest that predictive coding within the sensorimotor system alone cannot explain the FLE. Our findings are compatible with visual prediction (Nijhawan, 2008), which assumes that extrapolated motion representations within the visual system generate the FLE. These representations are modulated by sudden visual input (e.g. offset signals) or by input from other systems (e.g. sensorimotor) that can boost or attenuate overshooting representations in accordance with biased neural competition (Desimone & Duncan, 1995).
Abstract:
PURPOSE. To compare the objective accommodative amplitude and dynamics of eyes implanted with the one-compartment-unit (1CU; HumanOptics AG, Erlangen, Germany) accommodative intraocular lenses (IOLs) with that measured subjectively. METHODS. Twenty eyes with a 1CU accommodative IOL implanted were refracted, and distance and near acuity were measured with a logMAR (logarithm of the minimum angle of resolution) chart. The objective accommodative stimulus-response curve for static targets between 0.17 and 4.00 D accommodative demand was measured with the SRW-5000 (Shin-Nippon Commerce Inc., Tokyo, Japan) and PowerRefractor (PlusOptiX, Nürnberg, Germany) autorefractors. Continuous objective recording of dynamic accommodation was measured with the SRW-5000, with the subject viewing a target moving from 0 to 2.50 D at 0.3 Hz through a Badal lens system. Wavefront aberrometry measures (Zywave; Bausch & Lomb, Rochester, NY) were made through undilated pupils. Subjective amplitude of accommodation was measured with the RAF (Royal Air Force accommodation and vergence measurement) rule. RESULTS. Four months after implantation, best-corrected acuity was -0.01 ± 0.16 logMAR at distance and 0.60 ± 0.09 logMAR at near. Objectively, the static amplitude of accommodation was 0.72 ± 0.38 D. The average dynamic amplitude of accommodation was 0.71 ± 0.47 D, with a lag behind the target of 0.50 ± 0.48 seconds. Aberrometry showed a decrease in power of the lens-eye combination from the center to the periphery in all subjects (on average, -0.38 ± 0.28 D/mm). Subjective amplitude of accommodation was 2.24 ± 0.42 D. Two years after 1CU implantation, refractive error and distance visual acuity remained relatively stable, but near visual acuity and the subjective and objective amplitudes of accommodation decreased. CONCLUSIONS. The objective accommodating effects of the 1CU lens appear to be limited, although patients are able to track a moving target.
Subjective and objective accommodation were reduced at the 2-year follow-up. The greater subjective amplitude of accommodation is likely to result from the eye's depth of focus and the aspheric nature of the IOL. Copyright © Association for Research in Vision and Ophthalmology.
Abstract:
In less than a decade, personal computers have become part of our daily lives. Many of us come into contact with computers every day, whether at work, school or home. As useful as the new technologies are, they also have a darker side. By making computers part of our daily lives, we run the risk of allowing thieves, swindlers, and all kinds of deviants directly into our homes. Armed with a personal computer, a modem and just a little knowledge, a thief can easily access confidential information, such as details of bank accounts and credit cards. This book helps people avoid harm at the hands of Internet criminals. It offers a tour of the more dangerous parts of the Internet, as the author explains who the predators are, their motivations, how they operate and how to protect against them. Behind the doors of our own homes, we assume we are safe from predators, con artists, and other criminals wishing us harm. But the proliferation of personal computers and the growth of the Internet have invited these unsavory types right into our family rooms. With a little psychological knowledge a con man can start to manipulate us in different ways.
A terrorist can recruit new members and raise money over the Internet. Identity thieves can gather personal information and exploit it for criminal purposes. Spammers can wreak havoc on businesses and individuals. Here, an expert helps readers recognize the signs of a would-be criminal in their midst. Focusing on the perpetrators, the author provides information about how they operate, why they do it, what they hope to do, and how to protect yourself from becoming a victim.
Abstract:
Focal points:
• This study was designed to elicit the views of community pharmacists on any perceived business and professional changes following the loss of resale price maintenance (RPM)
• A piloted, 22-point self-completion questionnaire containing open, closed and scaled response questions was distributed to 35 independent (<10 stores), 13 multiple group and three supermarket-based pharmacies, and 40 responses were obtained (29 independent, eight multiple and three supermarket)
• Theme analysis indicated that 20 respondents felt that an increased range of services was now provided, 27 reported a decreased sales potential and 25 thought that patients now purchased more medicines
• The average price at which eight common over-the-counter medicines were offered was found to be £4.34 in independents, £4.37 in multiples and £4.22 in the supermarket pharmacies, compared with an average standard list price of £4.32
• There are indications that removal of RPM may have instigated changes in community pharmacy
Abstract:
More information on the biochemical interactions taking place between the tear film and the contact lens is required to further our understanding of the causative mechanisms behind the symptoms of dryness and grittiness often experienced by contact lens wearers. These symptoms can often lead to an intolerance to contact lens wear.
Abstract:
In 1962, D. June Sutor published the first crystallographic analysis of C–H…O hydrogen bonding based on a selection of structures then known. Her follow-up paper the next year cited more structures and provided more details, but her ideas met with formidable opposition. This review begins by describing knowledge of C–H…O hydrogen bonding available at the time from physico-chemical and spectroscopic studies. By comparison of structures cited by Sutor with modern redeterminations, the soundness of her basic data set is assessed. The plausibility of the counter-arguments against her is evaluated. Finally, her biographical details are presented along with consideration of factors that might have impeded the acceptance of her work. © 2012 Taylor & Francis.
Abstract:
In the past thirty years, autofiction has been at the center of many literary studies (Alberca 2005/6, 2007; Colonna 1989, 2004; Gasparini 2004; Genette 1982), although only recently in Hispanic literary studies. Despite its many characteristics in common with self-translation, no interdisciplinary perspective has ever been offered by academic researchers in either Literary or Translation Studies. This article analyses how the Cuban author Reinaldo Arenas uses methods inherent to autofiction, such as nominal and personal identity exploitation, among others, to translate himself metaphorically into his novel El color del verano [The colour of summer]. Analysing this novel through the theory of self-translation sheds light on the intrinsic and extraneous motives behind the writer’s decision to use this specific literary genre, as well as the benefits presented to the reader, who gains access to the ‘interliminal space’ of the writer’s work as a whole.