915 results for Sun safe apparel
Abstract:
Introduction: Intravenous insulin therapy is best suited to achieving rapid, effective glycaemic control or to situations where insulin requirements change unpredictably, but its use outside the intensive care unit often runs up against caregivers' lack of training and reluctance. The inclusion of rapid glycaemic control in our institutional standards for acute stroke management prompted a request for a treatment protocol adapted to the needs of the cerebrovascular unit. Patients and methods: The insulin therapy protocol was derived from published algorithms integrating current blood glucose, glycaemic kinetics and the patient's insulin sensitivity. At meals, a 1-hour increase in the IV insulin rate was added. Glycaemic targets were 4-6 mmol/l preprandially and <8 mmol/l postprandially. Implementation took place through a co-construction process (management tools, documents and training activities) with the medical and nursing leads of the unit. Results: Data from the first 90 patients were analysed (known diabetes: 38, newly detected hyperglycaemia: 52; 2715 h of cumulative treatment). Durations of IV insulin therapy were 34.5 h [interquartile range 24-39] and 26.5 h [21-36.3] respectively (p = 0.03), and times to reach the target were 5 h [4.0-8.25] and 7 h [4.0-9.75] (p = ns). Over the following 24 h, the proportions of blood glucose values within target were 70.4%/81.3% (90.3%/94.6% between 4-8 mmol/l), with a low rate of hypoglycaemia (3.9%/3.1% <4.0 mmol/l, 0.4%/0.2% <3.3 mmol/l) and comparable postprandial glycaemic control (excursions +2.6 mmol/l [0.7-3.9] and +1.7 mmol/l [0.6-3.7]; N = 75; p = ns). Conclusion: Intravenous insulin therapy outside the intensive care unit is feasible, highly safe and effective, even with particularly strict glycaemic targets. Beyond the perceived reliability of the management tool, the partnership approach adopted with caregivers, allowing their concerns to be taken into account at every step of the process, was an important factor in the success of its implementation.
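The abstract above describes the dosing logic only at a high level (current glycaemia, glycaemic trend, a per-patient sensitivity factor, and a transient one-hour rate increase at meals). As a rough illustration of what such a rule can look like, the following sketch is purely hypothetical: the thresholds, step sizes and function names are assumptions, not the published protocol.

```python
# Hypothetical sketch of a rate-adjustment rule of the kind described above:
# combines current glucose, its trend, and a per-patient sensitivity factor,
# with a transient meal-time increase. Illustrative values only.

def next_insulin_rate(current_rate_u_h: float,
                      glucose_mmol_l: float,
                      previous_glucose_mmol_l: float,
                      hours_since_last_sample: float,
                      sensitivity_factor: float = 1.0,
                      meal_started: bool = False) -> float:
    """Return the next IV insulin rate (U/h) for a 4-6 mmol/l preprandial target."""
    trend = (glucose_mmol_l - previous_glucose_mmol_l) / hours_since_last_sample

    if glucose_mmol_l < 4.0:
        return 0.0                                   # stop infusion, treat hypoglycaemia
    if glucose_mmol_l > 6.0 and trend >= 0:
        new_rate = current_rate_u_h + 0.5 * sensitivity_factor   # high and not falling
    elif glucose_mmol_l > 6.0:
        new_rate = current_rate_u_h                  # high but already falling
    elif trend < -1.0:
        new_rate = max(current_rate_u_h - 0.5, 0.0)  # in range but falling fast
    else:
        new_rate = current_rate_u_h                  # in range and stable

    if meal_started:
        new_rate *= 1.5                              # transient 1-hour increase at meals
    return new_rate
```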
Abstract:
Question: Outdoor workers can be exposed to intense ultraviolet (UV) solar radiation likely to result in sunburns. As sunburn is an important risk factor for skin cancer, in particular melanoma, we investigated the causes of occupational sunburns (OS) in French outdoor workers. Methods: A population-based survey was conducted in May-June 2012 through computer-assisted telephone interviews in a population 25 to 69 years of age. History of sunburn from occupational exposure within the year preceding the interview was collected. We analysed the risk of OS using multivariate logistic regression. Results: Out of 1442 individuals who declared occupational exposure to solar UV radiation, 403 (27.9%) reported a sunburn from occupational exposure in the year preceding the interview. Sunburns were more frequent in women (30% vs. 26.4% in men, although not significant, p = 0.14), in younger workers (p = 0.0099), in sensitive phototypes (40% in phototype I/II vs. 23% in phototype III/IV, p < 0.001) and in workers taking lunch outdoors (p = 0.0355). Some occupations were more associated with OS (more than 30%): health occupations, management, research/engineering, construction workers and culture/art/social sciences workers. In multivariate analysis, risk factors for OS were phototype (I vs. IV, OR = 4.30, 95% CI [2.65-6.98]), sunburn during leisure time (OR = 3.46, 95% CI [2.62-4.59]), seasonality of exposure (seasonal vs. constant exposure, OR = 1.36, 95% CI [1.02-1.81]) and annual UVA exposure (OR for a 10 J/m² daily average increment, 1.08, 95% CI [1.02-1.14]). In multivariate analysis the type of occupation was not associated with increased OS. Conclusion: Sunburns from occupational exposure were also observed in the non-sensitive population (phototype IV), which shows that outdoor workers are potentially exposed to intense UV radiation. This study suggests that prevention should target UV-sensitive outdoor workers as well as those accumulating intense UV exposure.
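As a rough illustration of how the multivariate analysis reported above could be set up, the sketch below fits a logistic regression of occupational sunburn on the covariates named in the abstract and converts the coefficients into odds ratios with 95% confidence intervals. The file name and column names are hypothetical; this is not the authors' code.

```python
# Illustrative multivariate logistic regression (hypothetical column names).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("outdoor_workers_survey.csv")   # assumed survey extract

model = smf.logit(
    "occupational_sunburn ~ C(phototype, Treatment(reference='IV'))"
    " + leisure_sunburn"
    " + C(exposure_pattern, Treatment(reference='constant'))"
    " + uva_daily_dose",
    data=df,
).fit()

# Odds ratios and 95% confidence intervals, analogous to those in the abstract
odds_ratios = np.exp(model.params).rename("OR")
ci = np.exp(model.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})
print(pd.concat([odds_ratios, ci], axis=1))
```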
Abstract:
Purpose: Aqueous flow through trabeculectomy blebs has been suggested to influence filtration bleb survival. We investigated the relationship between the requirement to increase aqueous flow via adjustable suture removal and surgical outcomes following "safe trabeculectomy" with mitomycin C (MMC). Methods: 62 consecutive eyes of 53 patients underwent fornix-based trabeculectomy with adjustable sutures, intraoperative MMC and intensive postoperative steroids. Subconjunctival antimetabolite injections and bleb needlings were administered according to bleb vascularity and IOP trends. Main outcome measures were: success rates (definition: IOP ≤21 mmHg and 20% IOP reduction); number of antimetabolite injections; bleb needlings; number of eyes recommencing glaucoma medications; and complications. Results: Mean age was 70.4±16.0 years (mean±SD); mean preoperative IOP was 24.5±9.1 mmHg and decreased to 12.3±8.9 mmHg postoperatively. Mean number of sutures was 2.6±0.7. Eyes were divided into 2 groups according to the number of sutures removed. The number of subconjunctival MMC injections required by eyes needing 2 suture removals was significantly greater than by those needing 1 suture removal (p<0.05). The number of needlings and 5FU injections also increased but did not reach significance (p=0.09 and p=0.34 respectively). Least-squares linear regression analysis showed that the number of needlings required had a statistically significant (p=0.05) trend with respect to the time elapsed between surgery and first suture removal. No other interventions had significant trends. Mean time between surgery and suture removal was 4.2±9.2 weeks (suture #1) and 5.7±9.7 weeks (suture #2). Antiglaucoma medication was restarted in only 5 eyes. Postoperative complications were infrequent: Seidel (3.2%), peripheral choroidal effusions at any time (3.2%), and shallow anterior chamber (1.6%). Conclusion: Eyes requiring a greater number of suture removals required a significantly greater number of antifibrosis interventions. The time elapsed before suture removal was inversely related to the number of postoperative needlings, suggesting these eyes may have decreased aqueous production and therefore require aggressive postoperative management to prevent bleb failure.
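For readers less familiar with the trend analysis mentioned above, the following minimal sketch shows how a least-squares regression of the number of needlings on the time from surgery to first suture removal could be run; the numbers are synthetic, not the study data.

```python
# Least-squares trend of needlings vs. time to first suture removal (synthetic data).
from scipy import stats

weeks_to_first_removal = [1, 2, 3, 4, 6, 8, 10, 14]   # hypothetical values
needlings = [3, 3, 2, 2, 1, 1, 0, 0]                  # hypothetical values

fit = stats.linregress(weeks_to_first_removal, needlings)
print(f"slope = {fit.slope:.2f}, p = {fit.pvalue:.3f}")  # negative slope ~ inverse relation
```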
Abstract:
Plasmapheresis is an extracorporeal technique used to remove pathogenic macromolecules, particularly autoantibodies, from the circulation. This is illustrated in two female patients. The first patient, aged 61 years, was treated successfully with non-selective plasmapheresis for acute humoral rejection shortly after receiving a renal allograft. In the second patient, aged 82 years, plasmapheresis for refractory myasthenia gravis had to be stopped because of bradycardia and hypotension during the procedure; she was treated successfully with immunoglobulins. Plasmapheresis is used to treat neurological, renal, haematological and systemic disorders. In non-selective plasmapheresis, the plasma is replaced with saline and albumin or donor plasma. In selective plasmapheresis, a highly selective filter is used to remove a specific pathogenic macromolecule. Adverse effects of the treatment include disturbances of acid-base equilibrium or coagulation, and allergic reactions. Most of these complications, however, can nowadays be avoided.
Abstract:
We carried out a systematic review of HPV vaccine pre- and post-licensure trials to assess the evidence of their effectiveness and safety. We find that the design of HPV vaccine clinical trials, and the interpretation of both efficacy and safety outcomes, were largely inadequate. Additionally, we note evidence of selective reporting of results from clinical trials (i.e., exclusion from peer-reviewed publications of vaccine efficacy figures for study subgroups in which efficacy might be lower or even negative). Given this, the widespread optimism regarding the long-term benefits of HPV vaccines appears to rest on a number of unproven assumptions (or ones at odds with factual evidence) and on significant misinterpretation of available data. For example, the claim that HPV vaccination will result in an approximately 70% reduction of cervical cancers is made despite the fact that the clinical trial data have not demonstrated to date that the vaccines have actually prevented a single case of cervical cancer (let alone a cervical cancer death), nor that the current, overly optimistic surrogate-marker-based extrapolations are justified. Likewise, the notion that HPV vaccines have an impressive safety profile is only supported by the highly flawed design of the safety trials and is contrary to accumulating evidence from vaccine safety surveillance databases and case reports, which continue to link HPV vaccination to serious adverse outcomes (including death and permanent disabilities). We thus conclude that further reduction of cervical cancers might be best achieved by optimizing cervical screening (which carries no such risks) and targeting other factors of the disease rather than by reliance on vaccines with questionable efficacy and safety profiles.
Abstract:
Nanotechnology is becoming part of our daily life in a wide range of products such as computers, bicycles, sunscreens and nanomedicines. While these applications are already becoming reality, considerable work awaits the scientists, engineers and policy makers who want such nanotechnological products to yield maximum benefit at a minimum of social, environmental, economic and (occupational) health cost. Considerable coordination and collaboration in research are needed to reach these goals within a reasonable time frame and at an affordable cost. This is recognized in Europe by the European Commission, which not only funds research projects but also supports the coordination of research efforts. One of these coordination efforts is NanoImpactNet, a researcher-operated network started in 2008 to promote scientific cross-talk across all disciplines on the health and environmental impact of nanomaterials. Stakeholders contribute to these activities, notably the definition of research and knowledge needs. Initial discussions in this domain focused on finding agreement on common metrics and on which elements are needed for standardized approaches to hazard and exposure identification. Many nanomaterial properties may play a role. Hence, to gain the time needed to study this complex matter full of uncertainties, researchers and stakeholders unanimously called for simple, easy and fast risk assessment tools that can support decision making in this rapidly moving and growing domain. Today, several projects are starting or already running that will develop such assessment tools. At the same time, other projects are investigating in depth which factors and material properties can lead to unwanted toxicity or exposure, what mechanisms are involved and how such responses can be predicted and modelled. A vision for the future is that, once these factors, properties and mechanisms are understood, they can and will be accounted for in the development of new products and production processes, following the idea of "Safety by Design". The promise of all these efforts is a future with nanomaterials in which most of their risks are recognized and addressed before they even reach the market.
Abstract:
1. Introduction "The one that has compiled ... a database, the collection, securing the validity or presentation of which has required an essential investment, has the sole right to control the content over the whole work or over either a qualitatively or quantitatively substantial part of the work both by means of reproduction and by making them available to the public", Finnish Copyright Act, section 49.1 These are the laconic words that implemented the much-awaited and hotly debated European Community Directive on the legal protection of databases,2 the EDD, into Finnish copyright legislation in 1998. Now, in 2005, after more than half a decade of domestic implementation, the proper meaning and construction of the convoluted qualitative criteria that the current legislation employs as a prerequisite for database protection remain uncertain, both in Finland and within the European Union. Further, this opaque pan-European instrument has the potential to bring about a number of far-reaching economic and cultural ramifications, which have remained largely uncharted or unobserved. Thus the task of understanding this particular and currently peculiarly European new intellectual property regime is twofold: first, to understand the mechanics and functioning of the EDD and, second, to realise the potential and risks inherent in the new legislation in their economic, cultural and societal dimensions. 2. Subject-matter of the study: basic issues The first part of the task mentioned above is straightforward: questions such as what is meant by the key concepts that trigger the functioning of the EDD, such as the presentation of independent information, what constitutes an essential investment in acquiring data, and when the reproduction of a given database reaches the qualitative or quantitative threshold of substantiality beyond which the right-holder of a database can avail himself of the remedies provided by the statutory framework, remain unclear and call for careful analysis. As for the second task, it is already obvious that the practical importance of the legal protection provided by the database right is increasing rapidly. The accelerating transformation of information into digital form is an existing fact, not merely a reflection of the shape of things to come. To take a simple example, the digitisation of a map, traditionally in paper format and protected by copyright, can provide the consumer with markedly easier and faster access to the wanted material, and the price can be, depending on the current state of the marketplace, lower than that of the traditional form or even free where public lending libraries provide access to the information online. This also makes it possible for authors and publishers to make their products available and sell them to markedly larger, international markets, while production and distribution costs can be kept to a minimum thanks to new electronic production, marketing and distribution mechanisms, to mention a few. The troublesome side, for authors and publishers, is the vastly enhanced potential for illegal copying by electronic means, producing numerous virtually identical copies at speed.
The fear of illegal copying can lead to stark technical protection measures that in turn can dampen demand for information goods and services and, furthermore, effectively hamper the right of access to materials lawfully available in electronic form, thereby weakening access to information, education and the cultural heritage of a nation or nations, a condition precedent for a functioning democracy. 3. Particular issues in the Digital Economy and Information Networks All that is said above applies a fortiori to databases. As a result of the ubiquity of the Internet and the pending breakthrough of the Mobile Internet, peer-to-peer networks, and Local and Wide Local Area Networks, a rapidly increasing amount of information not protected by traditional copyright, such as various lists, catalogues and tables,3 previously protected in part by the old section 49 of the Finnish Copyright Act, is available free or for consideration on the Internet; by the same token, and importantly, numerous databases are compiled in order to enable the marketing, tendering and sale of products and services in the above-mentioned networks. Databases and the information embedded therein constitute a pivotal element in virtually any commercial operation, including product and service development, scientific research and education. A poignant but not immediately obvious example of this is a database consisting of the physical coordinates of a selected group of customers, used for marketing through cellular phones, laptops and various handheld or vehicle-based devices connected online. These practical needs call for answers to the plethora of questions already outlined above: Has the collection and securing of the validity of this information required an essential input? What qualifies as a quantitatively or qualitatively significant investment? According to the Directive, a database comprises works, information and other independent materials which are arranged in a systematic or methodical way and are individually accessible by electronic or other means. Under what circumstances, then, are materials regarded as arranged in a systematic or methodical way? Only once the protected elements of a database are established does the question concerning the scope of protection become acute. In the digital context, the traditional notions of reproduction and making available to the public of digital materials seem to fit ill or lead to interpretations that are at variance with the analogue domain as regards the lawful and unlawful uses of information. This may well interfere with, or rework, the way in which commercial and other operators have to establish themselves and function in the existing value networks of information products and services. 4. International sphere After the expiry of the implementation period for the European Community Directive on the legal protection of databases, the goals of the Directive must have been consolidated into the domestic legislation of the current twenty-five Member States of the European Union. On the one hand, these fundamental questions readily imply that the problems related to the correct construction of the Directive underlying the domestic legislation transcend national boundaries. On the other hand, disputes arising from the implementation and interpretation of the Directive at the European level acquire significance domestically.
Consequently, the guidelines on the correct interpretation of the Directive, importing practical, business-oriented solutions, may well have application at the European level. This underlines the exigency of a thorough analysis of the implications of the meaning and potential scope of database protection in Finland and the European Union. This position has to be contrasted with the larger, international sphere, which in early 2005 differs markedly from the European Union's stance, with a direct negative effect on international trade, particularly in digital content. A particular case in point is the USA, a database producer primus inter pares, which does not, at least yet, have a Sui Generis database regime or its kin, while both the political and academic discourse on the matter abounds. 5. The objectives of the study The above-mentioned background, with its several open issues, calls for a detailed study of the following questions: - What is a database at law, and when is a database protected by intellectual property rights, particularly by the European database regime? What is the international situation? - How is a database protected, and what is its relation to other intellectual property regimes, particularly in the digital context? - What opportunities and threats does the current protection present to creators, users and society as a whole, including the commercial and cultural implications? - The difficult question of the relation between database protection and the protection of factual information as such. 6. Disposition The study, in purporting to analyse and cast light on the questions above, is divided into three main parts. The first part has the purpose of introducing the political and rational background and the subsequent legislative evolution of European database protection, reflected against the international backdrop on the issue. An introduction to databases, originally a vehicle of modern computing and information and communication technology, is also incorporated. The second part sets out the chosen and existing two-tier model of database protection, reviewing both its copyright and Sui Generis right facets in detail, together with the emergent application of this machinery in real-life societal and, particularly, commercial contexts. Furthermore, a general outline of copyright, relevant in the context of copyright-protected databases, is provided. For purposes of further comparison, a chapter on the precursor of the Sui Generis database right, the Nordic catalogue rule, also ensues. The third and final part analyses the positive and negative impacts of the database protection system and attempts to scrutinise their implications further into the future, with some caveats and tentative recommendations, in particular as regards the convoluted issue of the IPR protection of information per se, a new tenet in the domain of copyright and related rights.
Abstract:
There is widespread agreement from patient and professional organisations alike that the safety of stem cell therapeutics is of paramount importance, particularly for ex vivo autologous gene therapy. Yet current technology makes it difficult to thoroughly evaluate the behaviour of genetically corrected stem cells before they are transplanted. To address this, we have developed a strategy that permits transplantation of a clonal population of genetically corrected autologous stem cells that meet stringent selection criteria and the principle of precaution. As a proof of concept, we have stably transduced epidermal stem cells (holoclones) obtained from a patient suffering from recessive dystrophic epidermolysis bullosa. Holoclones were infected with self-inactivating retroviruses bearing a COL7A1 cDNA and cloned before the progeny of individual stem cells were characterised using a number of criteria. Clonal analysis revealed a great deal of heterogeneity among transduced stem cells in their capacity to produce functional type VII collagen (COLVII). Selected transduced stem cells transplanted onto immunodeficient mice regenerated a non-blistering epidermis for months and produced functional COLVII. Safety was assessed by determining the sites of proviral integration, rearrangements and hit genes, and by whole-genome sequencing. The progeny of the selected stem cells also had a diploid karyotype, were not tumorigenic and did not disseminate after long-term transplantation onto immunodeficient mice. In conclusion, a clonal strategy is a powerful and efficient means of bypassing the heterogeneity of a transduced stem cell population. It guarantees a safe and homogeneous medicinal product, fulfilling the principle of precaution and the requirements of regulatory affairs. Furthermore, a clonal strategy makes it possible to envision exciting gene-editing technologies such as zinc finger nucleases, TALENs and homologous recombination for next-generation gene therapy.
Abstract:
Stockton 1789, T.Bewick
Abstract:
BACKGROUND: Several guidelines recommend computed tomography scans for populations at high risk for lung cancer. The number of individuals evaluated for peripheral pulmonary lesions (PPL) will probably increase, and with it non-surgical biopsies. Associating a guidance method with a target confirmation technique has been shown to achieve the highest diagnostic yield, but the utility of bronchoscopy with radial probe endobronchial ultrasound using fluoroscopy as guidance, without a guide sheath, has not been reported. METHODS: We conducted a retrospective analysis of bronchoscopy procedures with radial probe endobronchial ultrasound using fluoroscopy for the investigation of PPL, performed by experienced bronchoscopists with no specific previous training in this particular technique. Operator learning curves and radiological predictors were assessed for all consecutive patients examined during the first year of application of the technique. RESULTS: Fifty-one PPL were investigated. Diagnostic yield (DY) and visualization yield were 72.5% and 82.3%, respectively. The diagnostic yield was 64.0% for PPL ≤20 mm and 80.8% for PPL >20 mm. No false-positive results were recorded. The learning curve across all diagnostic tools showed a DY of 72.7% for the first subgroup of patients, 81.8% for the second, 72.7% for the third, and 81.8% for the last. CONCLUSION: Bronchoscopy with radial probe endobronchial ultrasound using fluoroscopy as guidance is safe and simple to perform, even without specific prior training, and the diagnostic yield is high for PPL both >20 mm and ≤20 mm. Based on these findings, this method could be introduced as a first-line procedure for the investigation of PPL, particularly in centers with limited resources.
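The yields quoted above are simple proportions. As a quick illustration (with a hypothetical record layout, not the study data), the sketch below computes the overall diagnostic and visualization yields and the size-stratified diagnostic yield.

```python
# Diagnostic yield (DY) = diagnostic procedures / all procedures; illustrative only.
from dataclasses import dataclass
from typing import List

@dataclass
class Procedure:
    lesion_mm: float          # largest diameter of the PPL
    visualized: bool          # lesion seen with the radial ultrasound probe
    diagnostic: bool          # procedure yielded a definitive diagnosis

def yields(procedures: List[Procedure]) -> dict:
    def rate(flags):
        return 100 * sum(flags) / len(flags) if flags else float("nan")
    small = [p.diagnostic for p in procedures if p.lesion_mm <= 20]
    large = [p.diagnostic for p in procedures if p.lesion_mm > 20]
    return {
        "diagnostic_yield_%": rate([p.diagnostic for p in procedures]),
        "visualization_yield_%": rate([p.visualized for p in procedures]),
        "DY_<=20mm_%": rate(small),
        "DY_>20mm_%": rate(large),
    }
```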
Abstract:
INTRODUCTION: Hyperglycemia is a metabolic alteration in major burn patients associated with complications. The study aimed to evaluate the safety of general ICU glucose control protocols applied in major burns receiving prolonged ICU treatment. METHODS: 15-year retrospective analysis of consecutive adult burn patients admitted to a single specialized centre. EXCLUSION CRITERIA: death or length of stay <10 days, age <16 years. VARIABLES: demographic variables, burned surface area (TBSA), severity scores, infections, ICU stay, outcome. Metabolic variables: total energy, carbohydrate and insulin delivery per 24 h, arterial blood glucose and CRP values. Analysis of 4 periods: 1, before protocol; 2, tight, doctor-driven; 3, tight, nurse-driven; 4, moderate, nurse-driven. RESULTS: 229 patients, aged 45±20 years (mean±SD) and burned over 32±20% TBSA, were analyzed. SAPS II was 35±13. TBSA, Ryan and ABSI remained stable; inhalation injury increased. A total of 28,690 blood glucose samples were analyzed: the median value remained unchanged, with a narrower distribution over time. After protocol initiation, the proportion of normoglycemic values increased from 34.7% to 65.9%, with a reduction in hypoglycemic events (no extreme hypoglycemia in period 4). Severe hyperglycemia persisted throughout, with a decrease in period 4 (9.25%). Energy and glucose deliveries decreased in periods 3 and 4 (p<0.0001). Infectious complications increased during the last 2 periods (p=0.01). CONCLUSION: A standardized ICU glucose control protocol improved glycemic control in adult burn patients, reducing glucose variability. Moderate glycemic control in burns was safe, specifically with regard to hypoglycemia, reducing the incidence of hypoglycemic events compared with the pre-protocol period. Hyperglycemia persisted, but at a lower level.
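As an illustration of the per-period metrics reported above (proportion of values in range, frequency of hypoglycemia), the sketch below computes them from a flat table of blood glucose samples. The file name, column names and the severe-hyperglycemia threshold are assumptions, not the study's definitions.

```python
# Per-period glucose metrics from a flat sample table (illustrative layout).
import pandas as pd

samples = pd.read_csv("burn_icu_glucose.csv")   # assumed columns: period, glucose_mmol_l

def period_metrics(g: pd.Series) -> dict:
    return {
        "n_samples": int(g.size),
        "median_mmol_l": round(g.median(), 1),
        "pct_in_range_4_8": round(g.between(4.0, 8.0).mean() * 100, 1),
        "pct_hypo_lt_4": round((g < 4.0).mean() * 100, 1),
        "pct_severe_hyper_gt_10": round((g > 10.0).mean() * 100, 1),  # threshold assumed
    }

for period, group in samples.groupby("period"):
    print(period, period_metrics(group["glucose_mmol_l"]))
```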