46 results for Key Management Protocol
Abstract:
Summary: Forests are key ecosystems of the earth and are associated with a large range of functions. Many of these functions are beneficial to humans and are referred to as ecosystem services. Sustainable development requires that all relevant ecosystem services are quantified, managed and monitored equally. Natural resource management therefore targets the services associated with ecosystems. The main hypothesis of this thesis is that the spatial and temporal domains of the relevant services do not correspond to a discrete forest ecosystem. As a consequence, the services are not quantified, managed and monitored in an equal and sustainable manner. The aims of the thesis were therefore to test this hypothesis, establish an improved conceptual approach and provide spatial applications for the relevant land cover and structure variables. The study was carried out in western Switzerland and was based primarily on data from a countrywide landscape inventory. This inventory is part of the third Swiss national forest inventory and assesses continuous landscape variables based on a regular sampling of true colour aerial imagery. In addition, land cover variables were derived from Landsat 5 TM passive sensor data, and land structure variables from active sensor data acquired by a small-footprint laser scanning system. The results confirmed the main hypothesis, as the relevant services did not scale well with the forest ecosystem. Instead, a new conceptual approach for the sustainable management of natural resources was described. This concept quantifies the services as a continuous function of the landscape rather than a discrete function of the forest ecosystem. The explanatory landscape variables are therefore called continuous fields, and the forest becomes a dependent, function-driven management unit. Continuous field mapping methods were established for land cover and structure variables. In conclusion, the discrete forest ecosystem is an adequate planning and management unit. However, monitoring the state of and trends in the sustainability of services requires them to be quantified as a continuous function of the landscape. Sustainable natural resource management iteratively combines the ecosystem and gradient approaches.
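To make the continuous-field idea concrete, a minimal sketch is given below. It assumes a synthetic binary canopy grid with 5 m pixels (not the inventory data described above) and derives a continuous canopy-cover fraction with a moving-window mean; a discrete "forest" unit is then obtained from the field by thresholding, i.e. as a dependent product rather than as the unit of observation.

```python
# Minimal sketch: continuous-field canopy cover from a binary canopy grid.
# The grid, window size and threshold are illustrative assumptions, not values
# from the Swiss landscape inventory described above.
import numpy as np
from scipy.ndimage import uniform_filter

rng = np.random.default_rng(0)
canopy = (rng.random((100, 100)) < 0.4).astype(float)  # 1 = canopy, 0 = open land

# Continuous field: canopy-cover fraction within a 25 m x 25 m moving window
# (assuming 5 m pixels, hence a 5 x 5 pixel window).
cover_fraction = uniform_filter(canopy, size=5)

# A discrete "forest" unit becomes a dependent product of the continuous field,
# e.g. by applying a cover threshold, rather than the unit of observation itself.
forest_mask = cover_fraction >= 0.2
print(f"mean cover fraction: {cover_fraction.mean():.2f}, "
      f"forest share under a 20% threshold: {forest_mask.mean():.2%}")
```

Any window size or cover threshold could be substituted; the point is only that the continuous field, not a predefined forest polygon, carries the information.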
Abstract:
Despite abundant research on work meaningfulness, the link between work meaningfulness and general ethical attitude at work has not been discussed so far. In this article, we propose a theoretical framework to explain how work meaningfulness contributes to enhanced ethical behavior. We argue that, by providing a way for individuals to relate work to their personal core values and identity, work meaningfulness leads to affective commitment, the involvement of one's cognitive, emotional, and physical resources. This, in turn, leads to engagement, facilitates the integration of one's personal values into daily work routines, and so reduces the risk of unethical behavior. By contrast, anomie, that is, the absence of meaning and consequently of personal involvement, will lead to a weaker, merely rational commitment rather than affective commitment, and consequently to disengagement and amorality. We conclude with implications for the management of ethical attitudes.
Abstract:
OBJECTIVE: To describe chronic disease management programs active in Switzerland in 2007, using an exploratory survey. METHODS: We searched the internet (Swiss official websites and Swiss web pages, using Google), a medical electronic database (Medline) and reference lists of pertinent articles, and contacted key informants. Programs met our operational definition of chronic disease management if their interventions targeted a chronic disease, included a multidisciplinary team (≥2 healthcare professionals), lasted at least six months, and had already been implemented and were active in December 2007. We developed an extraction grid and collected data pertaining to eight domains (patient population, intervention recipient, intervention content, delivery personnel, method of communication, intensity and complexity, environment, clinical outcomes). RESULTS: We identified seven programs fulfilling our operational definition of chronic disease management. Programs targeted patients with diabetes, hypertension, heart failure, obesity, psychosis and breast cancer. Interventions were multifaceted; all included education and half considered planned follow-ups. The recipients of the interventions were patients, and the healthcare professionals involved were physicians, nurses, social workers, psychologists and case managers of various backgrounds. CONCLUSIONS: In Switzerland, a country with universal healthcare insurance coverage and little incentive to develop new healthcare strategies, chronic disease management programs are scarce. For future developments, appropriate evaluations of existing programs, involvement of all healthcare stakeholders, strong leadership and political will are, at a minimum, desirable.
Abstract:
Medication nonadherence is common and its determinants are diverse. Adherence is influenced by many parameters, such as the patient's self-efficacy, knowledge of health risks, outcome expectations, benefits of change, and barriers and facilitators. Sociocognitive theory helps professionals structure their approach and support patients in managing their treatment. Professionals need skills and time, and benefit from coordination of care, in particular between physicians and pharmacists. This article presents the key elements of a medication adherence program, as well as tools and some useful questions.
Abstract:
Protozoa and helminths are frequently associated with persistent digestive complaints, not only in travelers returning from the tropics but also in industrialized countries. The symptoms are often vaguer than those associated with bacterial or viral infections, and diarrhea is not always a key feature of the clinical presentation. Three stool examinations and a full blood cell count looking for eosinophilia are the cornerstone of the work-up for digestive parasites. This article reviews the epidemiology, clinical presentation, diagnosis and management of digestive protozoa and helminths.
Abstract:
The aim was to propose a strategy for finding reasonable compromises between image noise and dose as a function of patient weight. The weighted CT dose index (CTDIw) was measured on a multidetector-row CT unit using CTDI test objects of 16, 24 and 32 cm in diameter at 80, 100, 120 and 140 kV. These test objects were then scanned in helical mode using a wide range of tube currents and voltages with a reconstructed slice thickness of 5 mm. For each set of acquisition parameters, image noise was measured, and the Rose model observer was used to test two strategies for proposing a reasonable compromise between dose and low-contrast detection performance: (1) the use of a unique noise level for all test object diameters, and (2) the use of a unique dose efficacy level, defined as the noise reduction per unit dose. Published data were used to define four weight classes, and an acquisition protocol was proposed for each class. The protocols have been applied in clinical routine for more than one year. CTDIvol values of 6.7, 9.4, 15.9 and 24.5 mGy were proposed for the weight classes 2.5-5, 5-15, 15-30 and 30-50 kg, with image noise levels in the range of 10-15 HU. The proposed method allows patient dose and image noise to be controlled in such a way that dose reduction does not impair the detection of low-contrast lesions. The proposed values correspond to high-quality images and can be reduced if only high-contrast organs are assessed.
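The weight-based protocol above can be read as a simple lookup from patient weight class to the proposed CTDIvol. The sketch below encodes the four classes and dose values reported in the abstract; the function name and the handling of class boundaries are illustrative assumptions.

```python
# Illustrative lookup of the proposed CTDIvol (mGy) by paediatric weight class,
# using the four classes and dose values reported in the abstract above.
# Boundary handling (upper bound inclusive) is an assumption for illustration.
WEIGHT_CLASSES_KG = [
    ((2.5, 5.0), 6.7),
    ((5.0, 15.0), 9.4),
    ((15.0, 30.0), 15.9),
    ((30.0, 50.0), 24.5),
]

def proposed_ctdi_vol(weight_kg: float) -> float:
    """Return the proposed CTDIvol (mGy) for a given patient weight in kg."""
    for (low, high), ctdi_vol in WEIGHT_CLASSES_KG:
        if low <= weight_kg <= high:
            return ctdi_vol
    raise ValueError(f"weight {weight_kg} kg is outside the 2.5-50 kg protocol range")

print(proposed_ctdi_vol(12.0))  # 9.4 mGy for a 12 kg child
```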
Abstract:
Introduction: Excellent coordination between firefighters, police and medical rescue services is the key to success in the management of major accidents. In order to improve and assist the medical teams engaged on site, the Swiss "medical command and control system" for rescue operations is based on a binomial set-up involving one head emergency doctor and one head rescue paramedic, both trained in disaster medicine. We recently experimented with an innovative on-site "medical command and control system", based on this binomial team and supported by a dedicated 144 dispatcher. Methods: A major road traffic accident took place on the highway between Lausanne and Vevey on April 9th, 2008. We retrospectively collected all data concerning the victims as well as the logistics and dedicated structures, as reported by the 144 dispatch centre, the hospitals, the state authorities and the police and fire departments. Results: The 72-car pileup caused one death and 26 slightly injured patients. The management on the accident site was organized around a tripartite system, bringing together the medical command and control team with the police and fire departments. On the medical side, 16 ambulances, 2 medical response teams (SMUR), the Rega crew and the medical command and control team were dispatched by the 144. On that occasion, an advanced medical command car equipped with communication devices and staffed with a 144 dispatcher was also engaged, allowing efficient medical regulation directly from the site. Discussion: The specific skills of one doctor and one paramedic, both trained in disaster management, proved to be perfectly complementary. The presence of a dispatcher on site with a medical command car also proved useful, improving the transmission of orders from the medical command team to all other on-site and off-site partners. It removed the need for repeated back-and-forth communication with the 144, allowing both the paramedic and the doctor to focus on strategy and tactics rather than on communication and logistics.
Abstract:
OBJECTIVE: To provide an update to the original Surviving Sepsis Campaign clinical management guidelines, "Surviving Sepsis Campaign guidelines for management of severe sepsis and septic shock," published in 2004. DESIGN: Modified Delphi method with a consensus conference of 55 international experts, several subsequent meetings of subgroups and key individuals, teleconferences, and electronic-based discussion among subgroups and among the entire committee. This process was conducted independently of any industry funding. METHODS: We used the GRADE system to guide assessment of quality of evidence from high (A) to very low (D) and to determine the strength of recommendations. A strong recommendation indicates that an intervention's desirable effects clearly outweigh its undesirable effects (risk, burden, cost), or clearly do not. Weak recommendations indicate that the tradeoff between desirable and undesirable effects is less clear. The grade of strong or weak is considered of greater clinical importance than a difference in letter level of quality of evidence. In areas without complete agreement, a formal process of resolution was developed and applied. Recommendations are grouped into those directly targeting severe sepsis, recommendations targeting general care of the critically ill patient that are considered high priority in severe sepsis, and pediatric considerations. RESULTS: Key recommendations, listed by category, include: early goal-directed resuscitation of the septic patient during the first 6 hrs after recognition (1C); blood cultures prior to antibiotic therapy (1C); imaging studies performed promptly to confirm potential source of infection (1C); administration of broad-spectrum antibiotic therapy within 1 hr of diagnosis of septic shock (1B) and severe sepsis without septic shock (1D); reassessment of antibiotic therapy with microbiology and clinical data to narrow coverage, when appropriate (1C); a usual 7-10 days of antibiotic therapy guided by clinical response (1D); source control with attention to the balance of risks and benefits of the chosen method (1C); administration of either crystalloid or colloid fluid resuscitation (1B); fluid challenge to restore mean circulating filling pressure (1C); reduction in rate of fluid administration with rising filling pressures and no improvement in tissue perfusion (1D); vasopressor preference for norepinephrine or dopamine to maintain an initial target of mean arterial pressure ≥65 mm Hg (1C); dobutamine inotropic therapy when cardiac output remains low despite fluid resuscitation and combined inotropic/vasopressor therapy (1C); stress-dose steroid therapy given only in septic shock after blood pressure is identified to be poorly responsive to fluid and vasopressor therapy (2C); recombinant activated protein C in patients with severe sepsis and clinical assessment of high risk for death (2B except 2C for post-operative patients).
In the absence of tissue hypoperfusion, coronary artery disease, or acute hemorrhage, target a hemoglobin of 7-9 g/dL (1B); a low tidal volume (1B) and limitation of inspiratory plateau pressure strategy (1C) for acute lung injury (ALI)/acute respiratory distress syndrome (ARDS); application of at least a minimal amount of positive end-expiratory pressure in acute lung injury (1C); head of bed elevation in mechanically ventilated patients unless contraindicated (1B); avoiding routine use of pulmonary artery catheters in ALI/ARDS (1A); to decrease days of mechanical ventilation and ICU length of stay, a conservative fluid strategy for patients with established ALI/ARDS who are not in shock (1C); protocols for weaning and sedation/analgesia (1B); using either intermittent bolus sedation or continuous infusion sedation with daily interruptions or lightening (1B); avoidance of neuromuscular blockers, if at all possible (1B); institution of glycemic control (1B) targeting a blood glucose <150 mg/dL after initial stabilization (2C); equivalency of continuous veno-venous hemofiltration or intermittent hemodialysis (2B); prophylaxis for deep vein thrombosis (1A); use of stress ulcer prophylaxis to prevent upper GI bleeding using H2 blockers (1A) or proton pump inhibitors (1B); and consideration of limitation of support where appropriate (1D). Recommendations specific to pediatric severe sepsis include: greater use of physical examination therapeutic end points (2C); dopamine as the first drug of choice for hypotension (2C); steroids only in children with suspected or proven adrenal insufficiency (2C); and a recommendation against the use of recombinant activated protein C in children (1B). CONCLUSION: There was strong agreement among a large cohort of international experts regarding many level 1 recommendations for the best current care of patients with severe sepsis. Evidence-based recommendations regarding the acute management of sepsis and septic shock are the first step toward improved outcomes for this important group of critically ill patients.
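The GRADE labels used throughout these recommendations combine a strength (1 = strong, 2 = weak) with an evidence quality from A (high) to D (very low). The minimal sketch below encodes that scheme with a few entries taken from the list above; the class and field names are illustrative assumptions, not part of the guideline.

```python
# Minimal sketch of the GRADE labels used above: strength 1 (strong) or 2 (weak)
# combined with evidence quality A (high) to D (very low). Class and field names
# are illustrative; the sample entries are taken from the recommendation list above.
from dataclasses import dataclass

QUALITY = {"A": "high", "B": "moderate", "C": "low", "D": "very low"}

@dataclass
class Recommendation:
    text: str
    strength: int   # 1 = strong, 2 = weak
    quality: str    # "A" .. "D"

    def label(self) -> str:
        kind = "strong" if self.strength == 1 else "weak"
        return f"{self.strength}{self.quality} ({kind}, {QUALITY[self.quality]} quality evidence)"

recs = [
    Recommendation("Blood cultures prior to antibiotic therapy", 1, "C"),
    Recommendation("Broad-spectrum antibiotics within 1 hr of septic shock diagnosis", 1, "B"),
    Recommendation("Stress-dose steroids only in fluid/vasopressor-refractory septic shock", 2, "C"),
]
for r in recs:
    print(r.label(), "-", r.text)
```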
Abstract:
BACKGROUND: Intrathecal analgesia and avoidance of perioperative fluid overload are key items within enhanced recovery pathways. Potential side effects include hypotension and renal dysfunction. STUDY DESIGN: From January 2010 until May 2010, all patients undergoing colorectal surgery within enhanced recovery pathways were included in this retrospective cohort study and were analyzed according to whether they received intrathecal analgesia (IT) or not (noIT). Primary outcome measures were systolic and diastolic blood pressure, mean arterial pressure, and heart rate for 48 hours after surgery. Renal function was assessed by urine output and creatinine values. RESULTS: One hundred sixty-three consecutive colorectal patients (127 IT and 36 noIT) were included in the analysis. Both patient groups showed low blood pressure values within the first 4 to 12 hours and a steady increase thereafter before returning to baseline values after about 24 hours. Systolic and diastolic blood pressure and mean arterial pressure were significantly lower until 16 hours after surgery in patients having IT compared with the noIT group. Low urine output (<0.5 mL/kg/h) was reported in 11% vs 29% (IT vs noIT; p = 0.010) intraoperatively, and in 20% vs 11% (p = 0.387), 33% vs 22% (p = 0.304), and 31% vs 21% (p = 0.478) in the postanesthesia care unit and on postoperative days 1 and 2, respectively. Only 3 of 127 (2.4%) IT and 1 of 36 (2.8%) noIT patients had a transitory creatinine increase >50%; no patients required dialysis. CONCLUSIONS: Postoperative hypotension affects approximately 10% of patients within an enhanced recovery pathway and is slightly more pronounced in patients with IT. Hemodynamic depression persists for <20 hours after surgery; it has no measurable negative impact and therefore cannot justify detrimental postoperative fluid overload.
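The low-urine-output criterion used above (<0.5 mL/kg/h) is a simple rate check; the small helper below makes it explicit. The function name and the example values are illustrative assumptions, not trial data.

```python
# Illustrative check of the low-urine-output criterion used in the abstract above
# (< 0.5 mL/kg/h). Function name and example values are assumptions for illustration.
def is_low_urine_output(urine_ml: float, weight_kg: float, hours: float) -> bool:
    """Return True if urine output falls below 0.5 mL per kg per hour."""
    rate_ml_per_kg_h = urine_ml / (weight_kg * hours)
    return rate_ml_per_kg_h < 0.5

# Example: 240 mL over 8 h in an 80 kg patient -> 0.375 mL/kg/h, flagged as low.
print(is_low_urine_output(240, 80, 8))  # True
```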
Abstract:
Classical cryptography is based on mathematical functions. The robustness of a cryptosystem essentially depends on the difficulty of computing the inverse of its one-way function. There is no mathematical proof establishing that the inverse of a given one-way function cannot be found, and such schemes remain at the mercy of growing computing power and of new algorithms that invert certain functions in a "reasonable" time. It is therefore necessary to use a cryptosystem whose security is scientifically proven, especially for critical exchanges (banking systems, governments, etc.). Quantum cryptography answers this need: its security can be formally demonstrated, as it rests on the laws of quantum physics, which ensure unconditionally secure operation. How can quantum cryptography be used and integrated into existing solutions? This thesis justifies the need for quantum cryptography and shows that the cost of deploying it is justified. It proposes a simple, practical mechanism for integrating quantum cryptography into widely used communication protocols such as PPP, IPSec and 802.11i, and sketches application scenarios to demonstrate the feasibility and estimate the cost of such deployments. A methodology for evaluating quantum cryptography solutions according to the Common Criteria, with directives and checkpoints to support certification, is also proposed.
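The integration idea described here amounts to feeding quantum-distributed key material into the key-establishment step of an existing protocol stack, in place of or alongside a classically derived secret. The sketch below is a simplified illustration of that pattern rather than the thesis's actual mechanism: fetch_qkd_key is a hypothetical stand-in for a QKD link delivering shared secret bits, and the HMAC-based expansion merely mimics how a protocol such as IPsec/IKE derives session keys from a shared secret.

```python
# Simplified sketch of feeding QKD-derived key material into an existing
# key-derivation step (loosely modelled on how IKE expands a shared secret
# into session keys). fetch_qkd_key() is a hypothetical stand-in for a QKD
# link; it is NOT an API of any real QKD product or of the thesis itself.
import hashlib
import hmac
import os

def fetch_qkd_key(num_bytes: int = 32) -> bytes:
    """Stand-in for key material delivered by a quantum key distribution link."""
    return os.urandom(num_bytes)  # placeholder: a real QKD link would supply this

def derive_session_keys(shared_secret: bytes, context: bytes, n_keys: int = 2) -> list[bytes]:
    """HMAC-SHA256 expansion of a shared secret into per-direction session keys."""
    keys, block = [], b""
    for i in range(1, n_keys + 1):
        block = hmac.new(shared_secret, block + context + bytes([i]), hashlib.sha256).digest()
        keys.append(block)
    return keys

classical_secret = os.urandom(32)   # e.g. a Diffie-Hellman result in a classical exchange
quantum_secret = fetch_qkd_key()    # QKD-supplied secret replacing or augmenting it
combined = hmac.new(quantum_secret, classical_secret, hashlib.sha256).digest()

inbound_key, outbound_key = derive_session_keys(combined, b"ipsec-sa-example")
print(inbound_key.hex()[:16], outbound_key.hex()[:16])
```

Combining the classical and quantum secrets, rather than discarding the classical one, is a conservative design choice: the session keys remain at least as strong as the stronger of the two inputs.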
Abstract:
Quantitative information from magnetic resonance imaging (MRI) may substantiate clinical findings and provide additional insight into the mechanism of clinical interventions in therapeutic stroke trials. The PERFORM study is exploring the efficacy of terutroban versus aspirin for secondary prevention in patients with a history of ischemic stroke. We report on the design of an exploratory longitudinal MRI follow-up study that was performed in a subgroup of the PERFORM trial. An international multi-centre longitudinal follow-up MRI study was designed for different MR systems employing safety and efficacy readouts: new T2 lesions, new DWI lesions, whole brain volume change, hippocampal volume change, changes in tissue microstructure as depicted by mean diffusivity and fractional anisotropy, vessel patency on MR angiography, and the presence and development of new microbleeds. A total of 1,056 patients (men and women ≥55 years) were included. The data analysis included 3D reformatting, image registration of different contrasts, tissue segmentation, and automated lesion detection. This large international multi-centre study demonstrates how new MRI readouts can be used to provide key information on the evolution of cerebral tissue lesions and of the macrovasculature after atherothrombotic stroke in a large sample of patients.
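One of the readouts listed above, whole-brain volume change, reduces to comparing segmented brain volumes across time points. The toy sketch below illustrates that arithmetic on synthetic masks; the masks, voxel size and mask-generation step are assumptions, and a real analysis would follow the registration and segmentation steps described in the abstract.

```python
# Toy illustration of one readout listed above: whole-brain volume change
# between baseline and follow-up. The binary brain masks and voxel size are
# synthetic assumptions; real analyses would use registered, segmented images.
import numpy as np

rng = np.random.default_rng(1)
voxel_volume_mm3 = 1.0 * 1.0 * 1.0               # assume 1 mm isotropic voxels

baseline_mask = rng.random((64, 64, 64)) < 0.50   # stand-in segmented brain mask
followup_mask = rng.random((64, 64, 64)) < 0.49   # slightly smaller at follow-up

v_baseline = baseline_mask.sum() * voxel_volume_mm3
v_followup = followup_mask.sum() * voxel_volume_mm3
pbvc = 100.0 * (v_followup - v_baseline) / v_baseline   # percent brain volume change

print(f"baseline {v_baseline/1000:.1f} cm3, follow-up {v_followup/1000:.1f} cm3, "
      f"change {pbvc:+.2f}%")
```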
Abstract:
The safe and responsible development of engineered nanomaterials (ENMs), nanotechnology-based materials and products, together with the definition of regulatory measures and the implementation of "nano" legislation in Europe, requires a widely supported scientific basis and sufficient high-quality data upon which to base decisions. At the very core of such a scientific basis is a general agreement on key issues related to the risk assessment of ENMs, which encompass the key parameters for characterising ENMs, appropriate methods of analysis, and the best approach to express the effect of ENMs in widely accepted dose-response toxicity tests. The following major conclusions were drawn. Due to the high batch-to-batch variability in the characteristics of commercially available and, to a lesser degree, laboratory-made ENMs, it is not possible to make general statements regarding the toxicity resulting from exposure to ENMs. 1) Concomitant with using the OECD priority list of ENMs, other criteria for the selection of ENMs, such as relevance for mechanistic (scientific) studies or risk assessment-based studies, widespread availability (and thus high expected volumes of use) or consumer concern (route of consumer exposure depending on application), could be helpful. The OECD priority list focuses on the validity of OECD tests; therefore source material will be first in scope for testing. However, for risk assessment it is much more relevant to have toxicity data from the material as present in the products and matrices to which humans and the environment are exposed. 2) For most, if not all, characteristics of ENMs, standardized analytical methods, though not necessarily validated, are available. Generally these methods are only able to determine one single characteristic, and some of them can be rather expensive. Practically, it is currently not feasible to fully characterise ENMs. Many techniques that are available to measure the same nanomaterial characteristic produce contrasting results (e.g. reported sizes of ENMs). It was recommended that at least two complementary techniques should be employed to determine a metric of ENMs. The first great challenge is to prioritise metrics which are relevant in the assessment of biological dose-response relations and to develop analytical methods for characterising ENMs in biological matrices. It was generally agreed that one metric is not sufficient to describe ENMs fully. 3) Characterisation of ENMs in biological matrices starts with sample preparation. It was concluded that there is currently no standard approach or protocol for sample preparation to control agglomeration/aggregation and (re)dispersion. It was recommended that harmonization be initiated and that exchange of protocols should take place. The precise methods used to disperse ENMs should be specifically, yet succinctly, described within the experimental section of a publication. 4) ENMs need to be characterised in the matrix as it is presented to the test system (in vitro/in vivo). 5) Alternative approaches (e.g. biological or in silico systems) for the characterisation of ENMs are simply not possible with current knowledge. Contributors: Iseult Lynch, Hans Marvin, Kenneth Dawson, Markus Berges, Diane Braguer, Hugh J. Byrne, Alan Casey, Gordon Chambers, Martin Clift, Giuliano Elia, Teresa F. Fernandes, Lise Fjellsbø, Peter Hatto, Lucienne Juillerat, Christoph Klein, Wolfgang Kreyling, Carmen Nickel, and Vicki Stone.
Abstract:
Gout is an inflammatory arthritis caused by monosodium urate (MSU) crystal deposits in and around the joint. The formation of urinary calculi can also occur in gout but is less common than arthritis. Gout usually presents with recurrent episodes of joint inflammation, which over time lead to tophus formation and joint destruction. In the last decade, significant advances have been made regarding not only the epidemiology and genetics of gout and hyperuricemia but also the mechanisms of inflammation and the treatment of gout. In addition, knowledge concerning the key role of interleukin 1 (IL-1) has provided new therapeutic perspectives. However, the current management of gout is often suboptimal, with many patients either not receiving adequate treatment or being unable to tolerate existing treatments. New therapeutic agents provide interesting new options for patients with difficult-to-treat gouty arthritis. The English full-text version of this article is available at SpringerLink (under "Supplemental").
Abstract:
A score system integrating the evolution of efficacy and tolerability over time was applied to a subpopulation of the STRATHE trial, a trial performed according to a parallel-group design with double-blind, random allocation to either a fixed-dose combination strategy (perindopril/indapamide 2 mg/0.625 mg, with the possibility to increase the dose to 3 mg/0.935 mg and 4 mg/1.250 mg if needed, n = 118), a sequential monotherapy approach (atenolol 50 mg, followed by losartan 50 mg and amlodipine 5 mg if needed, n = 108), or a stepped-care strategy (valsartan 40 mg, followed by valsartan 80 mg and valsartan 80 mg + hydrochlorothiazide 12.5 mg if needed, n = 103). The aim was to lower blood pressure below 140/90 mmHg within a 9-month period. The treatment could be adjusted after 3 and 6 months. Only patients in whom the study protocol was strictly applied were included in this analysis. At completion of the trial, the total score averaged 13.1 ± 70.5 (mean ± SD) with the fixed-dose combination strategy, compared with -7.2 ± 81.0 with the sequential monotherapy approach and -17.5 ± 76.4 with the stepped-care strategy. In conclusion, the use of a score system allows the comparison of antihypertensive therapeutic strategies, taking efficacy and tolerability into account at the same time. In the STRATHE trial, the best results were observed with the fixed-dose combination containing low doses of an angiotensin-converting enzyme inhibitor (perindopril) and a diuretic (indapamide).
Abstract:
PURPOSE: To evaluate the technical quality and the diagnostic performance of a protocol using a low volume of contrast medium (25 mL) at 64-detector spiral computed tomography (CT) in the diagnosis and management of adult, nontraumatic subarachnoid hemorrhage (SAH). MATERIALS AND METHODS: This study was performed outside the United States and was approved by the institutional review board. Intracranial CT angiography was performed in 73 consecutive patients with nontraumatic SAH diagnosed at nonenhanced CT. Image quality was evaluated by two observers using two criteria: degree of arterial enhancement and venous contamination. The two independent readers evaluated diagnostic performance (lesion detection and correct therapeutic decision making) using rotational angiographic findings as the standard of reference. Sensitivity, specificity, and positive and negative predictive values were calculated for patients who underwent both CT angiography and three-dimensional rotational angiography. The intraclass correlation coefficient was calculated to assess interobserver concordance concerning aneurysm measurements and therapeutic management. RESULTS: All aneurysms, whether ruptured or unruptured, were detected. Arterial opacification was excellent in 62 cases (85%), and venous contamination was absent or minor in 61 cases (84%). In 95% of cases, CT angiographic findings allowed optimal therapeutic management. The intraclass correlation coefficient ranged between 0.93 and 0.95, indicating excellent interobserver agreement. CONCLUSION: With only 25 mL of iodinated contrast medium focused on the arterial phase, 64-detector CT angiography allowed satisfactory diagnostic and therapeutic management of nontraumatic SAH.
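The diagnostic-performance measures reported above follow directly from a 2x2 table of CT angiography results against the rotational-angiography reference. The sketch below shows the standard formulas; the counts are made-up illustrative numbers, not the study's data.

```python
# Standard diagnostic-performance formulas used above, computed from a 2x2 table
# against the reference standard. The counts below are made-up illustrative
# numbers, not data from the study.
def diagnostic_performance(tp: int, fp: int, fn: int, tn: int) -> dict:
    return {
        "sensitivity": tp / (tp + fn),   # detected lesions / all true lesions
        "specificity": tn / (tn + fp),   # correct negatives / all true negatives
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

print(diagnostic_performance(tp=70, fp=2, fn=1, tn=20))
```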