867 results for key factors
Abstract:
Although rare, stent thrombosis remains a severe complication after stent implantation owing to its high morbidity and mortality. Since the introduction of drug-eluting stents (DES), most interventional centers have noted stent thrombosis up to 3 years after implantation, a complication rarely seen with bare-metal stents. Some data from large registries and meta-analyses of randomized trials indicate a higher risk for DES thrombosis, whereas others suggest an absence of such a risk. Several factors are associated with an increased risk of stent thrombosis, including the procedure itself (stent malapposition and/or underexpansion, number of implanted stents, stent length, persistent slow coronary blood flow, and dissections), patient and lesion characteristics, stent design, and premature cessation of antiplatelet drugs. Drugs released from DES exert distinct biological effects, such as activation of signal transduction pathways and inhibition of cell proliferation. As a result, although primarily aimed at preventing vascular smooth muscle cell proliferation and migration (i.e., key factors in the development of restenosis), they also impair reendothelialization, which leads to delayed arterial healing, and induce tissue factor expression, which results in a prothrombogenic environment. Likewise, the polymers used to carry these drugs have been associated with DES thrombosis. Finally, DES impair endothelial function of the coronary artery distal to the stent, which may increase the risk of ischemia and coronary occlusion. Although several reports raise the possibility of a substantially higher risk of stent thrombosis with DES, the evidence remains inconclusive; consequently, large-scale, long-term clinical trials, as well as further mechanistic studies, are needed. The present review focuses on the pathophysiological mechanisms and pathological findings of stent thrombosis in DES.
Abstract:
Gene duplication is one of the key factors driving genetic innovation, i.e., producing novel genetic variants. Although the contribution of whole-genome and segmental duplications to phenotypic diversity across species is widely appreciated, the phenotypic spectrum and potential pathogenicity of small-scale duplications in individual genomes are less well explored. This review discusses the nature of small-scale duplications and the phenotypes produced by such duplications. Phenotypic variation and disease phenotypes induced by duplications are more diverse and widespread than previously anticipated, and duplications are a major class of disease-related genomic variation. Pathogenic duplications particularly involve dosage-sensitive genes with both similar and dissimilar over- and underexpression phenotypes, and genes encoding proteins with a propensity to aggregate. Phenotypes related to human-specific copy number variation in genes regulating environmental responses and immunity are increasingly recognized. Small genomic duplications containing defense-related genes also contribute to complex common phenotypes.
Abstract:
The present paper discusses a conceptual, methodological and practical framework within which the limitations of the conventional notion of natural resource management (NRM) can be overcome. NRM is understood as the application of scientific ecological knowledge to resource management. By including a consideration of the normative imperatives that arise from scientific ecological knowledge and submitting them to public scrutiny, ‘sustainable management of natural resources’ can be recontextualised as ‘sustainable governance of natural resources’. This in turn makes it possible to place the politically neutralising discourse of ‘management’ in a space for wider societal debate, in which the different actors involved can deliberate and negotiate the norms, rules and power relations related to natural resource use and sustainable development. The transformation of sustainable management into sustainable governance of natural resources can be conceptualised as a social learning process involving scientists, experts, politicians and local actors, and their corresponding scientific and non-scientific knowledges. The social learning process is the result of what Habermas has described as ‘communicative action’, in contrast to ‘strategic action’. Sustainable governance of natural resources thus requires a new space for communicative action aiming at shared, intersubjectively validated definitions of actual situations and the goals and means required for transforming current norms, rules and power relations in order to achieve sustainable development. Case studies from rural India, Bolivia and Mali explore the potentials and limitations for broadening communicative action through an intensification of social learning processes at the interface of local and external knowledge. Key factors that enable or hinder the transformation of sustainable management into sustainable governance of natural resources through social learning processes and communicative action are discussed.
Abstract:
The prevalence of Ventilated Improved Pit (VIP) latrines in Ghana suggests that the design has high user acceptance. The two key factors credited for user acceptance of a VIP latrine over an alternative latrine design, such as the basic pit latrine, are its ability to remove foul odors and to maintain low fly populations, both of which depend directly on an adequate ventilation flow rate. Adequate ventilation for odorless conditions in a VIP latrine has been defined by the United Nations Development Program (UNDP) and the World Bank as an air flow rate equivalent to 6 air changes per hour (6 ACH) of the superstructure's air volume. Additionally, the UNDP determined that the three primary factors affecting ventilation are: 1) wind passing over the mouth of the vent pipe, 2) wind passing into the superstructure, and 3) solar radiation onto the vent pipe. Previous studies also indicate that vent pipes with larger diameters increase flow rates, and that the application of carbonaceous materials to the pit sludge reduces odor and insect prevalence. Furthermore, proper design and construction are critical for the correct functioning of VIP latrines. Under-designing could cause problems with odor and insect control; over-designing would increase costs unnecessarily, potentially making it unaffordable for beneficiaries to independently construct, repair, or replace a VIP latrine. The present study evaluated the design of VIP latrines used by rural communities in the Upper West Region of Ghana, focusing on whether ventilation was adequate for odor removal and insect control. Thirty VIP latrines from six communities in the Upper West Region of Ghana were sampled. Each VIP latrine's ventilation flow rate and micro-environment were measured using a hot-wire anemometer probe and a portable weather station for a minimum of four hours.
To capture any temporal or seasonal variations in ventilation, ten of the latrines were sampled monthly over the course of three months for a minimum of 12 hours. A latrine usage survey and a cost analysis were also conducted to further assess the VIP latrine as an appropriate technology for sustainable development in the Upper West Region. The average air flow rate over the entire sample set was 11.3 m³/hr; the minimum and maximum air flow rates were 0.0 m³/hr and 48.0 m³/hr, respectively. Only 1 of the 30 VIP latrines (3%) had an air flow rate greater than the UNDP-defined odorless condition of 6 ACH, and 19 VIP latrines (63%) had an average air flow rate of less than half that required to achieve 6 ACH. The dominant factors affecting ventilation flow rate were wind passing over the mouth of the vent pipe and air buoyancy forces driven by temperature differences between the substructure and the ambient environment. Of 76 usable VIP latrines found in one community, 68.4% were in actual use. The cost of a VIP latrine was found to be equivalent to approximately 12% of the mean annual household income for Upper West Region inhabitants.
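The 6 ACH benchmark reduces to simple arithmetic: the required flow rate is six times the superstructure's air volume per hour, and a measured flow rate converts back to ACH by dividing by that volume. A minimal sketch (the superstructure dimensions below are an illustrative assumption, not values reported in the study):

```python
def required_flow_m3_per_hr(superstructure_volume_m3, ach=6.0):
    """Flow rate (m³/hr) needed to deliver `ach` air changes per hour."""
    return ach * superstructure_volume_m3

def achieved_ach(flow_m3_per_hr, superstructure_volume_m3):
    """Air changes per hour delivered by a measured flow rate."""
    return flow_m3_per_hr / superstructure_volume_m3

# Illustrative superstructure of 2.0 m x 1.2 m x 2.2 m -- an assumed size,
# not a dimension from the study.
volume = 2.0 * 1.2 * 2.2  # 5.28 m³
print(required_flow_m3_per_hr(volume))  # flow needed for the 6 ACH benchmark
print(achieved_ach(11.3, volume))       # ACH delivered by the study's mean flow
```

Against this assumed 5.28 m³ superstructure, the study's mean flow of 11.3 m³/hr would deliver only about 2.1 ACH, roughly a third of the odorless benchmark.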
Abstract:
In the Dominican Republic, economic growth over the past twenty years has not yielded sufficient improvement in access to drinking water services, especially in rural areas, where 1.5 million people do not have access to an improved water source (WHO, 2006). Worldwide, strategic development planning in the rural water sector has focused on participatory processes and the use of demand filters to ensure that service levels match community commitment to post-project operation and maintenance. However, studies have concluded that an alarmingly high percentage of drinking water systems (20-50%) do not provide service at the design levels and/or fail altogether (up to 90%) (BNWP, 2009; Annis, 2006; Reents, 2003). The World Bank, USAID, NGOs, and private consultants have invested significant resources in an effort to determine what components make up an "enabling environment" for sustainable community management of rural water systems (RWS). Research has identified an array of critical factors, internal and external to the community, that affect the long-term sustainability of water services, and different frameworks have been proposed to better understand the linkages between individual factors and sustainability of service. This research proposes a Sustainability Analysis Tool to evaluate the sustainability of RWS, adapted from previous relevant work in the field to reflect the realities of the Dominican Republic. It can be used as a diagnostic tool by government entities and development organizations to characterize the needs of specific communities and identify weaknesses in existing training regimes or support mechanisms. The framework uses eight indicators in three categories (Organization/Management, Financial Administration, and Technical Service). Nineteen independent variables are measured, resulting in a score of sustainability likely (SL), sustainability possible (SP), or sustainability unlikely (SU) for each of the eight indicators.
Thresholds are based upon benchmarks from the DR and around the world, primary data collected during the research, and the author's 32 months of field experience. A final sustainability score is calculated using weighting factors for each indicator, derived from Lockwood (2003). The framework was tested using a statistically representative, geographically stratified random sample of 61 water systems built in the DR by initiatives of the National Institute of Potable Water (INAPA) and the Peace Corps. The results indicated that 23% of the sampled systems are likely to be sustainable in the long term, 59% are possibly sustainable, and 18% are unlikely to be able to overcome any significant challenge. Communities scored as unlikely to be sustainable performed poorly in participation, financial durability, and governance, while the highest scores were for system function and repair service. The Sustainability Analysis Tool results are corroborated by INAPA and Peace Corps reports, evaluations, and database information, as well as field observations and primary data collected during the surveys. Future research will analyze the nature and magnitude of relationships between the sustainability score defined by the tool and key factors including gender participation, legal status of water committees, plumber/operator remuneration, demand responsiveness, post-construction support methodologies, and project design criteria.
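The tool's roll-up — per-indicator SL/SP/SU ratings combined through indicator weights into one final score — can be sketched as follows. The indicator names, weights, numeric mapping, and cut-offs here are illustrative assumptions, not the values used by the tool or taken from Lockwood (2003):

```python
# Numeric mapping for indicator ratings: sustainability likely (SL),
# possible (SP), unlikely (SU). Values are illustrative, not the tool's.
RATING_SCORES = {"SL": 1.0, "SP": 0.5, "SU": 0.0}

def sustainability_score(ratings, weights):
    """Weighted average of indicator ratings, normalized to [0, 1]."""
    total = sum(weights.values())
    return sum(RATING_SCORES[ratings[name]] * w
               for name, w in weights.items()) / total

def classify(score, likely_cutoff=0.75, possible_cutoff=0.40):
    """Map a final score to an overall rating (cut-offs are assumptions)."""
    if score >= likely_cutoff:
        return "SL"
    return "SP" if score >= possible_cutoff else "SU"

# Hypothetical ratings and weights for one community's water system
ratings = {"participation": "SU", "financial_durability": "SP",
           "governance": "SP", "system_function": "SL"}
weights = {"participation": 2.0, "financial_durability": 1.5,
           "governance": 1.5, "system_function": 1.0}
score = sustainability_score(ratings, weights)
print(round(score, 2), classify(score))  # weak participation drags the score down
```

The weighting makes the design choice visible: a heavily weighted indicator such as participation can pull an otherwise functional system into the "possibly sustainable" band.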
Abstract:
A dynamic environment requires the continuous adaptation of (intra-)logistics processes to maintain a company's performance and competitiveness. The standardization of processes and cross-company process models have been regarded as key factors for efficient business process management, particularly in networks. In practice, however, scientifically grounded and detailed reference process models for (intra-)logistics have been lacking. The research and development of a reference process model, together with the prototypical implementation of a process workbench for the generic creation of adaptable process chains, contributed to process standardization in logistics. The following presents the solution approach taken, which consisted, first, of the development of a metamodel for reference modeling; second, of the empirical demonstration that a reference process model can be generated from "atomic" elements; and third, of the evaluation of the model.
Abstract:
The paper deals with the development of a general, integrative, and holistic framework to systematize and assess vulnerability, risk, and adaptation. The framework is a thinking tool, meant as a heuristic, that outlines the key factors and different dimensions to be addressed when assessing vulnerability in the context of natural hazards and climate change. The approach underlines that the key factors of such a common framework are the exposure of a society or system to a hazard or stressor, the susceptibility of the exposed system or community, and its resilience and adaptive capacity. It also underlines the necessity of considering key factors and multiple thematic dimensions when assessing vulnerability in the context of natural and socio-natural hazards, and in this regard it shows key linkages between the different concepts used within disaster risk management (DRM) and climate change adaptation (CCA) research. The framework is also a tool for communicating complexity, and it stresses the need for societal change in order to reduce risk and promote adaptation. Accordingly, the policy relevance of the framework and the first results of its application are outlined. Overall, the framework presented enhances the discussion on how to frame and link the concepts of vulnerability, disaster risk, risk management, and adaptation.
Abstract:
The aim of this study was to evaluate the ability of dual-energy X-ray absorptiometry (DXA) areal bone mineral density (aBMD), measured in different regions of the proximal human femur, to predict the mechanical properties of matched proximal femora tested in two different loading configurations. Thirty-six pairs of fresh-frozen femora were DXA scanned and tested until failure in two loading configurations: a fall on the side or one-legged standing. The ability of the DXA output from four different regions of the proximal femur to predict the femoral mechanical properties was measured and compared between the two loading scenarios. The femoral neck DXA BMD was best correlated with femoral ultimate force in both configurations and predicted femoral failure load significantly better in the simulated side-fall configuration than in the simulated standing configuration (R²=0.80 vs. R²=0.66, P<0.05). Conversely, the work to failure was predicted similarly in both loading configurations (R²=0.54 vs. R²=0.53, P>0.05). Therefore, neck BMD should be considered one of the key factors for discriminating femoral fracture risk in vivo. Moreover, the better predictive ability of neck BMD for femoral strength in a fall configuration than in a one-legged stance configuration suggests that DXA's clinical relevance may not be as high for spontaneous femoral fractures as for fractures associated with a fall.
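For a single predictor such as neck aBMD, R² values like those quoted above are simply the squared Pearson correlation between the DXA measurement and the measured failure load. A small sketch with made-up numbers (not the study's data):

```python
def r_squared(x, y):
    """Coefficient of determination of a simple linear fit of y on x;
    with one predictor this equals the squared Pearson correlation."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return (sxy * sxy) / (sxx * syy)

# Hypothetical neck aBMD (g/cm²) and femoral ultimate force (N) --
# illustrative values only, not measurements from the study.
bmd = [0.61, 0.72, 0.80, 0.88, 0.95, 1.05]
force = [2100, 2900, 3300, 3900, 4200, 5100]
print(round(r_squared(bmd, force), 3))
```

Comparing such R² values between the two loading configurations is exactly the comparison the abstract reports (0.80 for the side fall vs. 0.66 for standing).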
Abstract:
This study of ambulance workers in the emergency medical services of the City of Houston examined the factors related to shiftwork tolerance and intolerance. The EMS personnel work a 24-hour shift with rotating days of the week. Workers are assigned to an A, B, C, or D shift, each of which rotates 24 hours on, 24 hours off, 24 hours on, and 4 days off. One hundred seventy-six male EMTs, paramedics, and chauffeurs from stations of varying levels of activity were surveyed. The sample group ranged in age from 20 to 45, and the average tenure on the job was 8.2 years. Over 68% of the workers held a second job, the majority working over 20 hours a week at the second position. The survey instrument was a 20-page questionnaire modeled after the Folkard Standardized Shiftwork Index. In addition to demographic data, the survey tool provided measurements of general job satisfaction, sleep quality, general health complaints, morningness/eveningness, cognitive and somatic anxiety, depression, and circadian type, and it included an EMS-specific stress scale. A conceptual model of shiftwork tolerance was presented to identify the key factors examined in the study. An extensive list of 265 variables was reduced to 36 key variables relating to: (1) shift schedule and demographic/lifestyle factors, (2) individual differences in traits and characteristics, and (3) tolerance/intolerance effects. Using the general job satisfaction scale as the key measure of shift tolerance/intolerance, it was shown that a significant relationship existed between this dependent variable and stress, number of years working a 24-hour shift, sleep quality, languidness/vigorousness, the usual amount of sleep received during the shift, general health complaints, and flexibility/rigidity (R² = .5073). The sample consisted of a majority of morningness types or extreme-morningness types, few evening types, and no extreme-evening types, duplicating the findings of Motohashi's previous study of ambulance workers. The level of activity by station did not have a significant effect on any of the dependent variables examined. However, the shift worked had a relationship with sleep quality, despite the fact that all shifts work the same hours and participate in the same rotation schedule.
Abstract:
BACKGROUND Advanced lower extremity peripheral artery disease (PAD), whether presenting as acute limb ischemia (ALI) or chronic critical limb ischemia (CLI), is associated with high rates of cardiovascular ischemic events, amputation, and death. Past research has focused on strategies of revascularization, but few data are available that prospectively evaluate the impact of key process-of-care factors (spanning pre-admission, acute hospitalization, and post-discharge) that might contribute to improving short- and long-term health outcomes. METHODS/DESIGN The FRIENDS registry is designed to prospectively evaluate a range of patient and health system care delivery factors that might serve as future targets for efforts to improve limb and systemic outcomes for patients with ALI or CLI. This hypothesis-driven registry was designed to evaluate the contributions of (i) pre-hospital limb ischemia symptom duration, (ii) use of leg revascularization strategies, and (iii) use of risk-reduction pharmacotherapies as pre-specified factors that may affect amputation-free survival. Sequential patients will be included at an index "vascular specialist-defined" ALI or CLI episode, with patients excluded only for non-vascular etiologies of limb threat. Data including baseline demographics, functional status, co-morbidities, pre-hospital time segments, and use of medical therapies; hospital-based use of revascularization strategies, time segments, and pharmacotherapies; and rates of systemic ischemic events (e.g., myocardial infarction, stroke, hospitalization, and death) and limb ischemic events (e.g., hospitalization for revascularization or amputation) will be recorded during a minimum of one year of follow-up. DISCUSSION The FRIENDS registry is designed to evaluate the potential impact of key factors that may contribute to adverse outcomes for patients with ALI or CLI.
Definition of new "health system-based" therapeutic targets could then become the focus of future interventional clinical trials for individuals with advanced PAD.
Abstract:
Reducing the risk that emerges from hazards of natural origin and societal vulnerability is a key challenge for the development of more resilient communities and for the overall goal of sustainable development. The following chapter outlines a framework for multidimensional, holistic vulnerability assessment, understood as part of risk evaluation and risk management in the context of Disaster Risk Management (DRM) and Climate Change Adaptation (CCA). As a heuristic, the framework is a thinking tool to guide systematic assessments of vulnerability and to provide a basis for developing comparative indicators and criteria to assess the key factors and various dimensions of vulnerability. It was developed with a particular focus on regions in Europe; however, it can also be applied in other world regions. The framework was developed within the context of the research project MOVE (Methods for the Improvement of Vulnerability Assessment in Europe), sponsored by the European Commission within the FP7 program.
Abstract:
The primary isolation of a Mycobacterium sp. of the Mycobacterium tuberculosis complex from an infected animal provides a definitive diagnosis of tuberculosis. However, as Mycobacterium bovis and Mycobacterium caprae are difficult to isolate, particularly from animals in the early stages of disease, success depends on the optimal performance of every step of the bacteriological process, from the initial choice of tissue samples at post-mortem examination, or of clinical samples, to the type of media and conditions used to cultivate the microorganism. Each step has its own performance characteristics, which contribute to the sensitivity and specificity of the procedure and may need to be optimized in order to achieve a gold-standard diagnosis. Once the slow-growing mycobacteria have been isolated, species identification and fine-resolution strain typing are key to understanding the epidemiology of the disease and to devising strategies to limit transmission of infection. New technologies have emerged that can now discriminate even between different isolates from the same animal. In this review we highlight the key factors that contribute to the accuracy of bacteriological diagnosis of M. bovis and M. caprae, and describe the development of advanced genotyping techniques that are increasingly used in diagnostic laboratories to support detailed epidemiological investigations.
Abstract:
BACKGROUND Peripheral artery disease (PAD) is a major cause of cardiovascular ischemic events and amputation. Knowledge gaps exist in defining and measuring key factors that predict these events. The objective of this study was to assess whether duration of limb ischemia would serve as a major predictor of limb and patient survival. METHODS The FReedom from Ischemic Events: New Dimensions for Survival (FRIENDS) registry enrolled consecutive patients with limb-threatening peripheral artery disease at a single tertiary care hospital. Demographic information, key clinical care time segments, functional status and use of revascularization, and pharmacotherapy data were collected at baseline, and vascular ischemic events, cardiovascular mortality, and all-cause mortality were recorded at 30 days and 1 year. RESULTS A total of 200 patients with median (interquartile range) age of 76 years (65-84 years) were enrolled in the registry. Median duration of limb ischemia was 0.75 days for acute limb ischemia (ALI) and 61 days for chronic critical limb ischemia (CLI). Duration of limb ischemia of <12, 12 to 24, and >24 hours in patients with ALI was associated with much higher rates of first amputation (P = .0002) and worse amputation-free survival (P = .037). No such associations were observed in patients with CLI. CONCLUSIONS For individuals with ischemic symptoms <14 days, prolonged limb ischemia is associated with higher 30-day and 1-year amputation, systemic ischemic event rates, and worse amputation-free survival. No such associations are evident for individuals with chronic CLI. These data imply that prompt diagnosis and revascularization might improve outcomes for patients with ALI.
Abstract:
BACKGROUND AND PURPOSE To address the increasing need to counsel patients about treatment indications for unruptured intracranial aneurysms (UIA), we endeavored to develop a consensus on assessment of UIAs among a group of specialists from diverse fields involved in research and treatment of UIAs. METHODS After composition of the research group, a Delphi consensus was initiated to identify and rate all features, which may be relevant to assess UIAs and their treatment by using ranking scales and analysis of inter-rater agreement (IRA) for each factor. IRA was categorized as very high, high, moderate, or low. RESULTS Ultimately, 39 specialists from 4 specialties agreed (high or very high IRAs) on the following key factors for or against UIA treatment decisions: (1) patient age, life expectancy, and comorbid diseases; (2) previous subarachnoid hemorrhage from a different aneurysm, family history for UIA or subarachnoid hemorrhage, nicotine use; (3) UIA size, location, and lobulation; (4) UIA growth or de novo formation on serial imaging; (5) clinical symptoms (cranial nerve deficit, mass effect, and thromboembolic events from UIAs); and (6) risk factors for UIA treatment (patient age and life expectancy, UIA size, and estimated risk of treatment). However, IRAs for features rated with low relevance were also generally low, which underlined the existing controversy about the natural history of UIAs. CONCLUSIONS Our results highlight that neurovascular specialists currently consider many features as important when evaluating UIAs but also highlight that the appreciation of natural history of UIAs remains uncertain, even within a group of highly informed individuals.
Abstract:
Introduction To meet the quality standards for high-stakes OSCEs, it is necessary to ensure high-quality, standardized performance of the SPs involved.[1] One way this can be assured is through assessment of the quality of SPs' performance in training and during the assessment itself. There is some literature on validated instruments that have been used to assess SP performance in formative contexts, but very little related to high-stakes contexts.[2][3][4] Content and structure During this workshop, different approaches to quality control of SPs' performance, developed in medicine, pharmacy, and nursing OSCEs, will be introduced. Participants will have the opportunity to use these approaches in simulated interactions, and the advantages and disadvantages of each approach will be discussed. Anticipated outcomes By the end of this session, participants will be able to discuss the rationale for quality control of SPs' performance in high-stakes OSCEs, outline key factors in creating strategies for quality control, identify various strategies for assuring quality control, and reflect on applications to their own practice. Who should attend The workshop is designed for those interested in quality assurance of SP performance in high-stakes OSCEs. Level All levels are welcome. References [1] Adamo G. Simulated and standardized patients in OSCEs: achievements and challenges: 1992-2003. Med Teach. 2003;25(3):262-270. [2] Wind LA, Van Dalen J, Muijtjens AM, Rethans JJ. Assessing simulated patients in an educational setting: the MaSP (Maastricht Assessment of Simulated Patients). Med Educ. 2004;38(1):39-44. [3] Bouter S, van Weel-Baumgarten E, Bolhuis S. Construction and validation of the Nijmegen Evaluation of the Simulated Patient (NESP): assessing simulated patients' ability to role-play and provide feedback to students. Acad Med. 2012. [4] May W, Fisher D, Souder D. Development of an instrument to measure the quality of standardized/simulated patient verbal feedback. Med Educ. 2012;2(1).