974 results for Security protocols
Abstract:
OBJECTIVE: To assess whether formatting the medical order sheet has an effect on the accuracy and security of antibiotics prescription. DESIGN: Prospective assessment of antibiotics prescription over time, before and after the intervention, in comparison with a control ward. SETTING: The medical and surgical intensive care unit (ICU) of a university hospital. PATIENTS: All patients hospitalized in the medical or surgical ICU between February 1 and April 30, 1997, and July 1 and August 31, 2000, for whom antibiotics were prescribed. INTERVENTION: Formatting of the medical order sheet in the surgical ICU in 1998. MEASUREMENTS AND MAIN RESULTS: Compliance with the American Society of Hospital Pharmacists' criteria for prescription safety was measured. The proportion of safe orders increased in both units, but the increase was 4.6 times greater in the surgical ICU (66% vs. 74% in the medical ICU and 48% vs. 74% in the surgical ICU). For unsafe orders, the proportion of ambiguous orders decreased by half in the medical ICU (9% vs. 17%) and nearly disappeared in the surgical ICU (1% vs. 30%). The only missing criterion remaining in the surgical ICU was the drug dose unit, which could not be preformatted. The aim of antibiotics prescription (either prophylactic or therapeutic) was indicated only in 51% of the order sheets. CONCLUSIONS: Formatting of the order sheet markedly increased security of antibiotics prescription. These findings must be confirmed in other settings and with different drug classes. Formatting the medical order sheet decreases the potential for prescribing errors before full computerized prescription is available.
New developments of peace research: The impact of recent campaigns on disarmament and human security
Abstract:
The present text, based on the authors' previous work on peace research (Grasa 1990 and 2010) and on the disarmament campaigns linked to Human Security (Alcalde 2009 and 2010), has two objectives. First, to present a new agenda for peace research, based on the resolution/transformation of conflicts and the promotion of collective action in furtherance of human security and human development. Second, to focus specifically on collective action and on a positive reading of some of the campaigns of recent decades, in order to see how those experiences will affect the future agenda for peace research and action for peace.
Abstract:
Digitalization empowers the Internet by allowing several virtual representations of reality, including that of identity. We leave an increasingly large digital footprint in cyberspace, and this situation puts our identity at high risk. Privacy is a right and a fundamental social value that could play a key role as a medium to secure digital identities. Identity functionality is increasingly delivered as sets of services rather than as monolithic applications. An identity layer in which identity and privacy management services are loosely coupled, publicly hosted and available for on-demand calls is therefore a more realistic and acceptable proposition. Identity and privacy should be interoperable and distributed through the adoption of service orientation and implementations based on open standards (technical interoperability). The objective of this project is to provide a way to implement interoperable, user-centric, digital identity-related privacy that responds to the distributed nature of federated identity systems. It is recognized that technical initiatives, emerging standards and protocols are not enough to resolve the concerns surrounding the multifaceted and complex issue of identity and privacy. For this reason, they should be approached from a global perspective through an integrated, multidisciplinary approach. This approach dictates that privacy laws, policies, regulations and technologies be crafted together from the start, rather than attached to digital identity after the fact. Thus, we draw Digital Identity-Related Privacy (DigIdeRP) requirements from global, domestic and business-specific privacy policies. These requirements take the shape of business interoperability.
We suggest a layered implementation framework (the DigIdeRP framework), in accordance with the model-driven architecture (MDA) approach, that helps an organization's security team turn business interoperability into technical interoperability in the form of a set of services that fit a service-oriented architecture (SOA): a Privacy-as-a-set-of-services (PaaSS) system. The DigIdeRP framework serves as a basis for a shared understanding between business management and technical managers on digital identity-related privacy initiatives. It presents five practical layers as an ordered sequence that forms the basis of the DigIdeRP project roadmap; in practice, however, the process is iterative, to ensure that each layer effectively supports and enforces the requirements of the adjacent ones. Each layer is composed of a set of blocks, which define a roadmap that a security team can follow to successfully implement PaaSS. Several block descriptions are based on the OMG SoaML modeling language and BPMN process descriptions. We identified, designed and implemented the seven services that form PaaSS and described their consumption. The PaaSS code (Java EE project), WSDL, and XSD are given and explained.
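The loosely coupled, on-demand service idea at the heart of PaaSS can be sketched in a few lines. The `ConsentService` class below, its method names, and its in-memory store are illustrative assumptions for one hypothetical privacy service, not part of the DigIdeRP deliverables:

```python
from dataclasses import dataclass

@dataclass
class PrivacyPolicyDecision:
    allowed: bool
    reason: str

class ConsentService:
    """Hypothetical PaaSS building block: an on-demand consent check,
    decoupled from the identity provider so it can be hosted and
    consumed independently, in the service-oriented spirit of PaaSS."""

    def __init__(self) -> None:
        self._consents: dict[tuple[str, str], bool] = {}  # (subject, purpose) -> granted

    def grant(self, subject: str, purpose: str) -> None:
        self._consents[(subject, purpose)] = True

    def check(self, subject: str, purpose: str) -> PrivacyPolicyDecision:
        ok = self._consents.get((subject, purpose), False)
        return PrivacyPolicyDecision(ok, "consent on file" if ok else "no consent recorded")

# A consumer calls the service per request instead of embedding privacy logic.
svc = ConsentService()
svc.grant("alice", "analytics")
```

In a deployed system this interface would be exposed as a web service (e.g. described in WSDL, as the abstract does for the real PaaSS services) rather than called in-process.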
Abstract:
The objective of this project is to develop an application that implements the functionality of the Game of the Goose (joc de l'oca) securely, with n players distributed across different computers, with different IP addresses and ports, connected through a network, which may be the Internet, an intranet or a corporate network.
Abstract:
The proposed system offers a secure remote electronic gaming solution for roulette, using public-key cryptography, certificates and digital signatures. It defines how the actions will be carried out, i.e. the protocols, to assure the participants that the game is fair and honest.
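A standard way to make such a game verifiably fair is a commit-reveal protocol, in which no single party controls the spin. The sketch below is a minimal illustration using hash commitments; the party names and the modulo-37 spin are assumptions, and a real system would add the certificates and digital signatures the abstract mentions:

```python
import hashlib
import secrets

def commit(value: int, nonce: bytes) -> str:
    """SHA-256 commitment: hides the value until the nonce is revealed."""
    return hashlib.sha256(nonce + str(value).encode()).hexdigest()

# Each party picks a secret contribution and publishes only its commitment.
contrib = {p: (secrets.randbelow(37), secrets.token_bytes(16))
           for p in ("casino", "player")}
commits = {p: commit(v, n) for p, (v, n) in contrib.items()}

# After all commitments are exchanged, each party reveals (value, nonce)
# and everyone verifies the reveal against the earlier commitment.
for p, (v, n) in contrib.items():
    assert commit(v, n) == commits[p]

# The spin is the sum of the contributions mod 37: neither party alone
# controls it, and neither can change its contribution after committing.
spin = sum(v for v, _ in contrib.values()) % 37
```

Because each commitment is published before any value is revealed, a cheating party cannot adapt its contribution to the other's choice.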
Abstract:
In this paper, we define a new scheme to develop and evaluate protection strategies for building reliable GMPLS networks. It is based on what we have called the network protection degree (NPD). The NPD consists of an a priori evaluation, the failure sensibility degree (FSD), which provides the failure probability, and an a posteriori evaluation, the failure impact degree (FID), which determines the impact on the network in case of failure, in terms of packet loss and recovery time. Having mathematically formulated these components, experimental results demonstrate the benefits of the NPD when used to enhance some current QoS routing algorithms in order to offer a certain degree of protection.
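The abstract does not give the formula combining FSD and FID into the NPD, so the sketch below is only an illustrative assumption: FID as a weighted mix of packet loss and normalized recovery time, and NPD as its product with the failure probability.

```python
def network_protection_degree(fsd: float, packet_loss: float,
                              recovery_time_ms: float,
                              w_loss: float = 0.5, w_time: float = 0.5) -> float:
    """Illustrative NPD: combine the a priori failure probability (FSD)
    with an a posteriori failure impact (FID). The weights, the 50 ms
    normalization target and the product form are assumptions here,
    not the paper's actual formulation."""
    # FID mixes packet loss with recovery time normalized against a
    # 50 ms protection-switching target; both terms lie in [0, 1].
    fid = w_loss * packet_loss + w_time * min(recovery_time_ms / 50.0, 1.0)
    return fsd * fid  # lower is better: unlikely failures with low impact

# A link that fails often and recovers slowly scores worse (higher NPD).
risky = network_protection_degree(fsd=0.3, packet_loss=0.2, recovery_time_ms=100)
safe = network_protection_degree(fsd=0.05, packet_loss=0.01, recovery_time_ms=20)
```

A QoS routing algorithm could then prefer paths whose links minimize this score.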
Abstract:
In this paper, a method for enhancing current QoS routing methods by means of QoS protection is presented. In an MPLS network, the segments (links) to be protected are predefined, and an LSP request involves, apart from establishing a working path, creating a specific type of backup path (local, reverse or global). Different QoS parameters, such as network load balancing, resource optimization and minimization of LSP request rejection, should be considered. QoS protection is defined as a function of QoS parameters such as packet loss, restoration time and resource optimization. A framework to add QoS protection to many of the current QoS routing algorithms is introduced, a backup decision module to select the most suitable protection method is formulated, and different case studies are analyzed.
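A backup decision module of the kind described could be sketched as a simple rule over QoS targets. The thresholds and the mapping to local, reverse and global backups below are illustrative assumptions, not the paper's formulation:

```python
def select_backup(restoration_ms_target: float, loss_tolerance: float) -> str:
    """Toy backup-type selector. Intuition (assumed, not from the paper):
    local protection restores fastest, reverse backup limits loss while
    reusing part of the working path, and a global end-to-end backup is
    the most resource-efficient when requirements are relaxed."""
    if restoration_ms_target < 50:   # strict recovery time -> protect each link locally
        return "local"
    if loss_tolerance < 0.01:        # loss-sensitive traffic -> reverse backup
        return "reverse"
    return "global"                  # otherwise a single end-to-end backup path

# Example: voice-like traffic with a tight restoration target gets local backup.
choice = select_backup(restoration_ms_target=10, loss_tolerance=0.05)
```

A real decision module would weigh all the listed QoS parameters jointly (including network load and resource usage) instead of a fixed threshold cascade.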
Abstract:
The field of public finance focuses on the spending and taxing activities of governments and their influence on the allocation of resources and distribution of income. This work covers, in three parts, different topics related to public finance which are currently widely discussed in media and politics. The first two parts deal with issues in social security, which is in general one of the biggest spending shares of governments. The third part looks at the main income source of governments by analyzing the perceived value of tax competition. Part one deals with the current problem of increased early retirement by focusing on Switzerland as a special case. Early retirement is predominantly considered to be the result of incentives set by social security and the tax system. But the Swiss example demonstrates that the incidence of early retirement has dramatically increased even in the absence of institutional changes. We argue that the wealth effect also plays an important role in the retirement decision for middle and high income earners. An actuarially fair, but mandatory, funded system with a relatively high replacement rate may thus contribute to a low labor market participation rate of elderly workers. We provide evidence using a unique dataset on individual retirement decisions in Swiss pension funds, allowing us to perfectly control for pension scheme details. Our findings suggest that affordability is a key determinant in the retirement decision. The higher the accumulated pension capital, the earlier men, and to a smaller extent women, tend to leave the workforce. The fact that early retirement has become much more prevalent in the last 15 years is a further indicator of the importance of a wealth effect, as the maturing of the Swiss mandatory funded pension system over that period has led to an increase in the effective replacement rates for middle and high income earners. Part two covers the theoretical side of social security.
Theories analyzing optimal social security benefits provide important qualitative results, mainly by using one general type of economy. Economies are, however, very diverse in numerous aspects, one of the most important being the wealth level. This can lead to significant quantitative benefit differences that imply differences in replacement rates and levels of labor supply. We focus on several aspects related to this fact. In a within-cohort social security model, we introduce disability insurance with an imperfect screening mechanism. We then vary the wealth level of the model economy and analyze how the optimal social security benefit structure, or equivalently the optimal replacement rates, changes depending on the wealth level of the economy, and whether the introduction of disability insurance into a social security system is preferable for all economies. Second, the screening mechanism of disability insurance and the threshold level at which people are defined as disabled can differ. For economies with different wealth levels, we determine, for different thresholds, the screening level that maximizes social welfare. Finally, part three turns to the income of governments, by adding an element to the controversy on tax competition versus tax harmonization. Inter-jurisdictional tax competition can generate at least two potential benefits or costs: On a public level, tax competition may result in a lower or higher efficiency in the production of public services. But there is also a more private benefit in the form of an option for individuals to move to a community with a lower tax rate in the future. To explore the value citizens attach to tax competition, we analyze a unique popular vote for a complete tax harmonization between communities in the third largest Swiss canton, Vaud. Although a majority of voters would have seemingly benefited from replacing the current tax rate by a revenue-neutral average tax rate, the proposal was rejected by a large margin.
Our estimates suggest that the estimated combined perceived benefit from tax competition is in the range of 10%.
Abstract:
The objective of this article is to present the essential characteristics of two sensor network protocols, the 802.15.4 standard and the ZigBee protocol, as well as the main threats to which they are exposed. Finally, through a real case and as a proof of concept, it is shown how to disable a node in a network that uses the ZigBee protocol.
Abstract:
BACKGROUND Advanced heart failure (HF) is associated with high morbidity and mortality; it represents a major burden for the health system. Episodes of acute decompensation requiring frequent and prolonged hospitalizations account for most HF-related expenditure. Inotropic drugs are frequently used during hospitalization, but rarely in out-patients. The LAICA clinical trial aims to evaluate the effectiveness and safety of monthly levosimendan infusion in patients with advanced HF to reduce the incidence of hospital admissions for acute HF decompensation. METHODS The LAICA study is a multicenter, prospective, randomized, double-blind, placebo-controlled, parallel group trial. It aims to recruit 213 out-patients, randomized to receive either a 24-h infusion of levosimendan at 0.1 μg/kg/min dose, without a loading dose, every 30 days, or placebo. RESULTS The main objective is to assess the incidence of admission for acute HF worsening during 12 months. Secondarily, the trial will assess the effect of intermittent levosimendan on other variables, including the time in days from randomization to first admission for acute HF worsening, mortality and serious adverse events. CONCLUSIONS The LAICA trial results could allow confirmation of the usefulness of intermittent levosimendan infusion in reducing the rate of hospitalization for HF worsening in advanced HF outpatients.
Abstract:
The primary goal of this paper is to discuss how the leading position of Brazil in South America could contribute to boost security cooperation between the European Union and Mercosur. Both parties share common foreign and security policy concerns, including immigration, terrorism and drug trafficking. Through its great influence on the regional security agenda, Brazil could seek closer bilateral cooperation with Europe in tackling these global challenges, acting at the same time as a representative of regional interests.
Abstract:
Computed tomography (CT) is an imaging technique in which interest has been growing steadily since it first came into use in the early 1970s. In the clinical environment, this imaging system has emerged as the gold-standard modality because of its high sensitivity in producing accurate diagnostic images. However, even if a direct benefit to patient healthcare is attributed to CT, the dramatic increase in the number of CT examinations performed has raised concerns about the potential negative effects of ionizing radiation on the population. To ensure a benefit-risk balance that works in favor of the patient, it is important to balance image quality and dose in order to avoid unnecessary patient exposure.

If this balance is important for adults, it should be an absolute priority for children undergoing CT examinations, especially for patients suffering from diseases requiring several follow-up examinations over the patient's lifetime. Indeed, children and young adults are more sensitive to ionizing radiation and have a longer life expectancy than adults.
For this population, the risk of developing cancer, whose latency period can exceed 20 years, is significantly higher than for adults. Assuming that each patient examination is justified, it then becomes a priority to optimize CT acquisition protocols in order to minimize the dose delivered to the patient. Over the past few years, CT technology has been advancing at a rapid pace. Since 2009, new iterative image reconstruction techniques, called statistical iterative reconstructions, have been introduced in order to decrease patient exposure and improve image quality.

The goal of the present work was to determine the potential of statistical iterative reconstructions to reduce dose as much as possible without compromising image quality or the diagnosis of children and young adult examinations.

The optimization step requires evaluating both the delivered dose and the image quality needed to perform a diagnosis. While the dose is estimated using CT indices (CTDIvol and DLP), the particularity of this research was to use two radically different approaches to evaluate image quality. The first approach, called the "physical approach", computed physical metrics (SD, MTF, NPS, etc.) measured on phantoms under well-known conditions. Although this technique has some limitations, because it does not take the radiologist's perspective into account, it enables the physical characterization of image properties in a simple and timely way. The second approach, called the "clinical approach", was based on the evaluation of anatomical structures (diagnostic criteria) present on patient images. Radiologists, involved in the assessment step, were asked to score the image quality of these structures for diagnostic purposes using a simple rating scale. This approach is relatively complicated to implement and also time-consuming.
Nevertheless, it has the advantage of being very close to the practice of radiologists and is considered a reference method.

Primarily, this work revealed that the statistical iterative reconstructions studied in the clinic (ASIR and VEO) have a strong potential to reduce CT dose (by up to 90%). However, by their mechanisms, they modify the appearance of the image, with a change in image texture that may then affect the quality of the diagnosis. By comparing the results of the "clinical" and "physical" approaches, it was shown that a change in texture is related to a modification of the noise spectrum bandwidth; NPS analysis makes it possible to anticipate or avoid a decrease in image quality. This project demonstrated that integrating these new statistical iterative reconstruction techniques can be complex and cannot be done simply on the basis of protocols using conventional reconstructions. The conclusions of this work and the image quality tools developed will also be able to guide future studies in the field of image quality, such as texture analysis or model observers dedicated to CT.
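The noise power spectrum (NPS) analysis used to detect such texture changes can be estimated from a region of interest in a uniform-phantom image. The sketch below is a textbook 2-D NPS estimate, not the thesis's implementation; the ROI size, pixel spacing and scaling convention are assumptions:

```python
import numpy as np

def noise_power_spectrum(roi: np.ndarray, pixel_mm: float = 0.5) -> np.ndarray:
    """Illustrative 2-D NPS estimate from a uniform-phantom ROI:
    subtract the mean, take |FFT|^2, scale by pixel area / N pixels.
    A shift in this spectrum's bandwidth reflects a texture change."""
    noise = roi - roi.mean()
    ny, nx = roi.shape
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(noise))) ** 2
    return spectrum * (pixel_mm ** 2) / (nx * ny)

# Synthetic sanity check: for white noise the spectrum is roughly flat,
# and by Parseval's theorem its sum over pixels recovers the variance.
rng = np.random.default_rng(0)
roi = rng.normal(0.0, 10.0, size=(64, 64))
nps = noise_power_spectrum(roi, pixel_mm=1.0)
assert np.isclose(nps.sum() / roi.size, roi.var())
```

On real reconstructions, comparing the NPS of a filtered-back-projection image with that of an iterative reconstruction at the same dose reveals the frequency shift described above.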