981 results for Critical awareness
Abstract:
Background: Ethical conflicts are arising as a result of the growing complexity of clinical care, coupled with technological advances. Most studies that have developed instruments for measuring ethical conflict base their measures on the variables "frequency" and "degree of conflict". In our view, however, these variables are insufficient for explaining the root of ethical conflicts. Consequently, the present study formulates a conceptual model that also includes the variable "exposure to conflict", as well as considering six "types of ethical conflict". An instrument was then designed to measure the ethical conflicts experienced by nurses who work with critical care patients. The paper describes the development process and validation of this instrument, the Ethical Conflict in Nursing Questionnaire Critical Care Version (ECNQ-CCV). Methods: The sample comprised 205 nursing professionals from the critical care units of two hospitals in Barcelona (Spain). The ECNQ-CCV presents 19 nursing scenarios with the potential to produce ethical conflict in the critical care setting. Exposure to ethical conflict was assessed by means of the Index of Exposure to Ethical Conflict (IEEC), a specific index developed to provide a reference value for each respondent by combining the intensity and frequency of occurrence of each scenario featured in the ECNQ-CCV. After content validity was established, construct validity was assessed by means of Exploratory Factor Analysis (EFA), while Cronbach's alpha was used to evaluate the instrument's reliability. All analyses were performed using the statistical software PASW v19. Results: Cronbach's alpha for the ECNQ-CCV as a whole was 0.882, which is higher than the values reported for certain other related instruments. The EFA suggested a unidimensional structure, with one component accounting for 33.41% of the explained variance. Conclusions: The ECNQ-CCV is shown to be a valid and reliable instrument for use in critical care units. Its structure is such that the four variables on which our model of ethical conflict is based may be studied separately or in combination. The critical care nurses in this sample present moderate levels of exposure to ethical conflict. This study represents the first evaluation of the ECNQ-CCV.
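The IEEC described above combines the intensity and frequency of each of the 19 scenarios into a per-respondent reference value, and reliability is reported as Cronbach's alpha. Below is a minimal Python sketch of both computations; the alpha formula is the standard one, while the product-then-sum form of the exposure index is an assumption (the abstract does not give the exact IEEC formula), and all names and toy data are illustrative.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def exposure_index(intensity: np.ndarray, frequency: np.ndarray) -> np.ndarray:
    """Hypothetical per-respondent exposure index: intensity x frequency,
    summed over the 19 scenarios. The abstract says the IEEC combines the
    two but does not give the formula; the product form is an assumption."""
    return (intensity * frequency).sum(axis=1)

# Toy data: 5 respondents x 19 scenarios, each rated 0-4.
rng = np.random.default_rng(1)
intensity = rng.integers(0, 5, size=(5, 19))
frequency = rng.integers(0, 5, size=(5, 19))
print(exposure_index(intensity, frequency))
print(f"alpha = {cronbach_alpha(intensity):.3f}")
```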
Abstract:
The main objective of this article is to assess the risk factors and the types of surface associated with the development of pressure ulcers (PU) in critically ill patients in an Intensive Care Unit (ICU).
Abstract:
Nerve injuries often lead to neuropathic pain syndrome. The mechanisms contributing to this syndrome involve local inflammatory responses, activation of glial cells, and changes in the plasticity of neuronal nociceptive pathways. Cannabinoid CB(2) receptors contribute to the local containment of neuropathic pain by modulating glial activation in response to nerve injury. Thus, neuropathic pain spreads beyond the site of nerve injury in mice lacking CB(2) receptors. To further investigate the mechanisms leading to this enhanced manifestation of neuropathic pain, we established expression profiles of spinal cord tissues from wild-type and CB(2)-deficient mice after nerve injury. An enhanced interferon-gamma (IFN-gamma) response was revealed in the absence of CB(2) signaling. Immunofluorescence staining demonstrated IFN-gamma production by astrocytes and neurons ipsilateral to the nerve injury in wild-type animals. In contrast, CB(2)-deficient mice showed neuronal and astrocytic IFN-gamma immunoreactivity also in the contralateral region, thus matching the pattern of nociceptive hypersensitivity in these animals. Experiments in BV-2 microglia cells revealed that transcriptional changes induced by IFN-gamma in two key elements for neuropathic pain development, iNOS (inducible nitric oxide synthase) and CCR2, are modulated by CB(2) receptor signaling. The most direct support for a functional involvement of IFN-gamma as a mediator of CB(2) signaling was obtained with a double knock-out mouse strain deficient in both CB(2) receptors and IFN-gamma. These animals no longer show the enhanced manifestations of neuropathic pain observed in CB(2) knock-outs. These data clearly demonstrate that the CB(2) receptor-mediated control of neuropathic pain is IFN-gamma dependent.
Abstract:
Successful pregnancy depends on well-coordinated developmental events involving both maternal and embryonic components. Although a host of signaling pathways participate in implantation, decidualization, and placentation, whether there is a common molecular link that coordinates these processes remains unknown. By exploiting genetic, molecular, pharmacological, and physiological approaches, we show here that the nuclear transcription factor peroxisome proliferator-activated receptor (PPAR) delta plays a central role at various stages of pregnancy: maternal PPARdelta is critical to implantation and decidualization, whereas embryonic PPARdelta is vital for placentation. Using trophoblast stem cells, we further elucidate that a reciprocal relationship between the PPARdelta-AKT and leukemia inhibitory factor-STAT3 signaling pathways serves as a cell lineage sensor to direct trophoblast cell fates during placentation. This novel finding of stage-specific integration of maternal and embryonic PPARdelta signaling provides evidence that PPARdelta is a molecular link coordinating implantation, decidualization, and placentation, all crucial to pregnancy success. This study is clinically relevant because deferral of on-time implantation leads to spontaneous pregnancy loss, and defective trophoblast invasion is one cause of preeclampsia in humans.
Abstract:
Access management involves balancing the dual roles that roadways must play: through travel, and access to property and economic activity. When these roles are not in proper balance, the result is a roadway system that functions sub-optimally. Arterial routes with too high a driveway density and overly extensive access to property have high crash rates and begin to suffer in terms of traffic operations: such routes become congested, delays increase, and mean travel speeds decline. The Iowa access management research and awareness project has had four distinct phases. Phase I involved a detailed review of the extensive national access management literature so lessons learned elsewhere could be applied in Iowa. In Phase II, original case study research was conducted in Iowa. Phase III concentrated on outreach and education about access management. Phase IV extended the work conducted during Phases II and III. The main work products for Phase IV were as follows: 1) three additional before-and-after case studies illustrating the impacts of various access management treatments on traffic safety, traffic operations, and business vitality; 2) an access management handbook aimed primarily at local governments in Iowa; 3) a modular access management toolkit with brief descriptions of various access management treatments and considerations; and 4) an extensive outreach plan aimed at getting the results of Phases I through IV out to diverse audiences in Iowa and elsewhere.
Abstract:
Disasters are often perceived as fast and random events. While the triggers may be sudden, disasters themselves are the result of an accumulation of the consequences of inappropriate actions and decisions, and of global change. To modify this perception of risk, advocacy tools are needed. Quantitative methods have been developed to identify the distribution and the underlying factors of risk.
Disaster risk results from the intersection of hazards, exposure and vulnerability. The frequency and intensity of hazards can be influenced by climate change or by the decline of ecosystems; population growth increases exposure, while changes in the level of development affect vulnerability. Given that each of these components may change, risk is dynamic and should be reassessed periodically by governments, insurance companies or development agencies. At the global level, these analyses are often performed using databases of reported losses. Our results show that these are likely to be biased, in particular by improvements in access to information; international loss databases are not exhaustive and give no information on exposure, intensity or vulnerability. A new approach, independent of reported losses, is therefore necessary.
The research presented here was mandated by the United Nations and by agencies working in development and the environment (UNDP, UNISDR, GTZ, UNEP and IUCN). These organizations needed a quantitative assessment of the underlying factors of risk in order to raise awareness among policymakers and to prioritize disaster risk reduction projects.
The method is based on geographic information systems, remote sensing, databases and statistical analysis. It required a large amount of data (1.7 Tb, covering both the physical environment and socio-economic parameters) and several thousand hours of processing. A comprehensive global risk model was developed to reveal the distribution of hazards, exposure and risk, and to identify the underlying risk factors for several hazards (floods, tropical cyclones, earthquakes and landslides). Two multiple-risk indexes were generated to compare countries. The results include an evaluation of the role of hazard intensity, exposure, poverty and governance in the pattern and trends of risk. It appears that vulnerability factors change depending on the type of hazard and that, contrary to exposure, their weight decreases as intensity increases.
At the local level, the method was tested to highlight the influence of climate change and ecosystem decline on the hazard. In northern Pakistan, deforestation increases landslide susceptibility. Research in Peru (based on satellite imagery and ground data collection) revealed a rapid glacier retreat and provides an assessment of the remaining ice volume as well as scenarios of its possible evolution.
These results were presented to different audiences, including 160 governments. The results and the data generated are available online through an open-source SDI (http://preview.grid.unep.ch). The method is flexible and easily transferable to other scales and issues, with good prospects for adaptation to other research areas. Risk characterization at the global level and identification of the role of ecosystems in disaster risk are rapidly developing fields. This research revealed many challenges; some were resolved, while others remain limitations. It is clear, however, that the level of development, and moreover unsustainable development, configures a large part of disaster risk, and that the dynamics of risk are governed primarily by global change.
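The model above treats disaster risk as the intersection of hazard, exposure and vulnerability evaluated over gridded data. Here is a minimal sketch of a multiplicative risk index on toy rasters, assuming the common R = H x E x V formulation; the study's actual model, weights and data layers are not given in this abstract, and all names and values are illustrative.

```python
import numpy as np

def risk_index(hazard: np.ndarray,
               exposure: np.ndarray,
               vulnerability: np.ndarray) -> np.ndarray:
    """Cell-by-cell multiplicative risk index R = H * E * V.

    hazard        -- e.g. expected annual event frequency per grid cell
    exposure      -- e.g. population count per grid cell
    vulnerability -- dimensionless factor in [0, 1] per grid cell
    """
    return hazard * exposure * vulnerability

# Toy 2x2 grids; real inputs would be GIS rasters (e.g. loaded with rasterio).
hazard = np.array([[0.1, 0.5], [0.0, 0.2]])
exposure = np.array([[1000, 200], [50, 4000]])
vulnerability = np.array([[0.3, 0.8], [0.5, 0.1]])
print(risk_index(hazard, exposure, vulnerability))
```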
Abstract:
Background: Overdiagnosis is defined as the diagnosis of a condition not associated with a substantial risk to health in an asymptomatic person. There are several causes of overdiagnosis. Its clinical and public health implications are underappreciated. Objective: To review the causes of overdiagnosis and its clinical and public health implications. Method: Narrative review. Results: Overdiagnosis results from some screening activities, increasingly sensitive diagnostic test procedures, incidental findings on routine exams, and the widening of diagnostic criteria that define a condition requiring an intervention. The fear of missing a diagnosis and patients' requests for reassurance are further causes of overdiagnosis. Examples of overdiagnosis include some cases of breast and prostate cancers found by screening, pulmonary emboli identified on highly sensitive CT scans, and kidney cancers found incidentally on abdominal CT scans. Lowering the critical levels of blood pressure, glycemia, and cholesterol used to define hypertension, diabetes, and hypercholesterolemia, respectively, is also a cause of overdiagnosis. An overdiagnosed condition implies unnecessary procedures to confirm or exclude the presence of the disease, as well as unnecessary treatments, both of which have potential adverse effects. Overdiagnosis also diverts health professionals from other health issues and generates costs without any benefit. Measures to prevent overdiagnosis are notably: 1) to increase awareness among health professionals and the population of its occurrence; 2) to account systematically for the risks and benefits of screening and diagnostic procedures using an evidence-based framework; and 3) to decide at which risk level to intervene based on the absolute risk of health events and the absolute risk reduction expected from an intervention. Conclusion: Overdiagnosis has major clinical and public health implications. Increasing awareness of its causes and implications is a step toward its prevention.
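The third preventive measure rests on standard risk arithmetic: the absolute risk reduction (ARR) of an intervention and its reciprocal, the number needed to treat (NNT). A minimal worked sketch with illustrative numbers (not from the review):

```python
# Absolute risk reduction (ARR) and number needed to treat (NNT),
# the standard quantities behind "decide at which risk level to intervene".
# The event rates below are illustrative, not taken from the study.
baseline_risk = 0.04            # event risk without treatment
relative_risk_reduction = 0.25  # 25% relative reduction from treatment

treated_risk = baseline_risk * (1 - relative_risk_reduction)
arr = baseline_risk - treated_risk  # absolute risk reduction: 0.01
nnt = 1 / arr                       # treat 100 people to prevent 1 event
print(f"ARR = {arr:.3f}, NNT = {nnt:.0f}")
```

The same 25% relative reduction yields a much larger ARR (and smaller NNT) in a high-risk group, which is why intervention thresholds are framed in absolute rather than relative terms.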
Abstract:
We study steady states in d-dimensional lattice systems that evolve in time by a probabilistic majority rule, which corresponds to the zero-temperature limit of a system with conflicting dynamics. The rule satisfies detailed balance for d=1 but not for d>1. We find numerically nonequilibrium critical points of the Ising class for d=2 and 3.
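As a rough illustration of such dynamics, here is a minimal sketch of a probabilistic majority-rule update for Ising spins on a 2D square lattice; the flip probability, the random sequential update scheme, and the coin-flip tie-breaking are assumptions, since the abstract does not specify the paper's exact rule.

```python
import numpy as np

rng = np.random.default_rng(0)
L = 64   # linear lattice size (d = 2)
p = 0.9  # probability of following the local majority
spins = rng.choice([-1, 1], size=(L, L))

def step(spins):
    """One random sequential update: a site aligns with the majority of
    its four nearest neighbors with probability p, anti-aligns with 1-p;
    ties are resolved by a fair coin flip (an assumption)."""
    i, j = rng.integers(L, size=2)
    s = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
         + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
    majority = rng.choice([-1, 1]) if s == 0 else np.sign(s)
    spins[i, j] = majority if rng.random() < p else -majority

for _ in range(100_000):
    step(spins)
print("magnetization:", spins.mean())
```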
Abstract:
Dr. Narakas intended to study a series of 61 cases of shoulder sequelae of obstetric palsy. His vast experience would have enriched our clinical knowledge of this ailment. The authors carry on with that study to clarify his therapeutic approach and share the benefit of his experience.
Abstract:
Today, most software development teams use free and open source software (FOSS) components, because doing so increases the speed and quality of development. Many open source components are the de facto standard of their category. However, FOSS comes with licensing restrictions, and corporate organizations usually maintain a list of allowed and forbidden licenses. But how do you enforce this policy? How can you make sure that ALL files in your source depot either belong to you or fit your licensing policy? A first, preventive approach is to train the development team and increase its awareness of these licensing issues. Depending on the size of the team, this may be costly but necessary. However, it does not ensure that a single individual will not commit a forbidden icon or library and jeopardize the legal status of the whole release... if not the company, since software is becoming more and more a critical asset. Another approach is to verify what is included in the source repository and check whether it belongs to the open-source world. This can be done on-the-fly, whenever a new file is added to the source depot. It can also be part of the release process, as a verification step before publishing the release. In both cases, tools and databases exist to automate the detection process. We will present the various options regarding FOSS detection, how this process can be integrated into the "software factory", and how the results can be displayed in a usable and efficient way.
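As an illustration of the on-the-fly verification idea, here is a minimal sketch of a hook that scans newly added files for forbidden license markers before they enter the source depot; the marker list, the policy, and the hook wiring are illustrative assumptions, not any specific tool's behavior.

```python
import sys
from pathlib import Path

# Illustrative policy: license texts the organization forbids in its releases.
FORBIDDEN_MARKERS = {
    "GNU General Public License": "GPL",
    "GNU Lesser General Public License": "LGPL",
    "Mozilla Public License": "MPL",
}

def scan_file(path: Path) -> list[str]:
    """Return the forbidden licenses whose marker text appears in the file."""
    try:
        text = path.read_text(errors="ignore")
    except OSError:
        return []
    return [name for marker, name in FORBIDDEN_MARKERS.items() if marker in text]

if __name__ == "__main__":
    # Intended to run as a pre-commit hook over the staged file paths.
    violations = {f: hits for f in map(Path, sys.argv[1:])
                  if (hits := scan_file(f))}
    for f, hits in violations.items():
        print(f"{f}: forbidden license marker(s): {', '.join(hits)}")
    sys.exit(1 if violations else 0)
```

Real detection tools also match code fingerprints against open-source databases rather than just license text, which catches files whose headers were stripped.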
Abstract:
Purified, [131I]-labeled goat antibodies against carcinoembryonic antigen, which have been shown to localize in human carcinoma in nude mice, were injected into 27 patients with carcinoma. Patients were scanned with a scintillation camera at various intervals. In 11 patients, radioactivity was detectable in the tumor 48 hours after injection. Computerized subtraction of blood-pool radioactivity provided clearer pictures in positive cases, but in 16 patients the scans remained doubtful or negative. To study the specificity of [131I]-antibody localization, we gave some patients simultaneous injections of [125I]-labeled normal IgG. Both isotopes were measured by means of scintillation counting in tumors and normal tissues recovered after surgery. The results demonstrated that only the anti-CEA antibodies localized in tumors. However, the total antibody-derived radioactivity in the tumor was only about 0.001 of the injected dose. We conclude that, despite the present demonstration of specificity, this method of tumor detection is not yet clinically useful.
Abstract:
Neuropeptide- and hormone-containing secretory granules (SGs) are synthesized at the trans-Golgi network (TGN) as immature secretory granules (ISGs) and complete their maturation in the F-actin-rich cell cortex. This maturation process is characterized by acidification-dependent processing of cargo proteins, condensation of the SG matrix, and removal of membrane and proteins not destined for mature secretory granules (MSGs). Here we addressed a potential role of Rab3 isoforms in these maturation steps by expressing their nucleotide-binding-deficient mutants in PC12 cells. Our data show that the presence of Rab3D(N135I) decreases the restriction of maturing SGs to the F-actin-rich cell cortex, blocks the removal of the endoprotease furin from SGs, and impedes the processing of the luminal SG protein secretogranin II. This strongly suggests that Rab3D is implicated in the subcellular localization and maturation of ISGs.
Abstract:
PURPOSE: The primary objective of this study was to describe the frequency of behaviors observed during rest, a non-nociceptive procedure, and a nociceptive procedure in brain-injured intensive care unit (ICU) patients with different levels of consciousness (LOC). Second, it examined the inter-rater reliability and the discriminant and concurrent validity of the behavioral checklist used. METHODS: The non-nociceptive procedure involved calling the patient and shaking his/her shoulder. The nociceptive procedure involved turning the patient. The frequency of behaviors was recorded using a behavioral checklist. RESULTS: Patients with absence of movement, or with stereotyped flexion or extension responses to a nociceptive stimulus, displayed more behaviors during turning (median 5.5, range 0-14) than patients with localized responses (median 4, range 0-10) or patients able to self-report their pain (median 4, range 0-10). Face flushing, clenched teeth, clenched fists, and tremor were more frequent in patients with absence of movement or stereotyped responses to a nociceptive stimulus. The reliability of the checklist was supported by a high intra-class correlation coefficient (0.77-0.92), and internal consistency was acceptable in all three groups (KR-20, 0.71-0.85). Discriminant validity was supported, as significantly more behaviors were observed during nociceptive stimulation than at rest. Concurrent validity was confirmed, as checklist scores were correlated with the patients' self-reports of pain (r_s = 0.53; 95% CI 0.21-0.75). CONCLUSION: Brain-injured patients reacted significantly more during a nociceptive stimulus, and the number of observed behaviors was higher in patients with a stereotyped response.
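The KR-20 statistic reported above measures internal consistency for binary (observed/not observed) checklist items. A minimal sketch of its standard formula, assuming an (observations x items) 0/1 matrix; all names and toy data are illustrative, not the study's data.

```python
import numpy as np

def kr20(items: np.ndarray) -> float:
    """Kuder-Richardson formula 20 for an (n_observations, n_items)
    binary matrix: KR-20 = k/(k-1) * (1 - sum(p*q) / variance of totals),
    where p is the proportion scoring 1 on each item and q = 1 - p."""
    k = items.shape[1]
    p = items.mean(axis=0)
    q = 1 - p
    total_var = items.sum(axis=1).var(ddof=1)  # sample variance of totals
    return (k / (k - 1)) * (1 - (p * q).sum() / total_var)

# Toy checklist: 6 observations x 4 behaviors (1 = behavior observed).
obs = np.array([[1, 1, 0, 1],
                [1, 0, 0, 1],
                [0, 0, 0, 0],
                [1, 1, 1, 1],
                [0, 1, 0, 0],
                [1, 1, 0, 1]])
print(f"KR-20 = {kr20(obs):.2f}")
```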