904 results for POTENTIAL RISK
Abstract:
The detection of colorectal cancer (CRC) at early stages is one of the proven strategies for achieving a higher cure rate. In recent years, several studies have appeared identifying potential cancer markers in serum, plasma and stool in an attempt to improve current screening procedures. Thus, the aims of the study were to (1) evaluate micronucleus (MN) frequency, (2) evaluate the capacity of plasma ultrafiltrate to induce MN formation, and (3) evaluate the SEPT9 and NOTCH3 promoter methylation profile in peripheral blood lymphocytes from subjects who tested positive on the fecal occult blood test and were examined by colonoscopy. MN frequency was significantly higher in subjects with a histological diagnosis of CRC and adenoma than in controls (p ≤ 0.001 and p ≤ 0.01, respectively). Regarding the CF-MN analysis, a statistically significant difference was observed between CRC and control subjects (p ≤ 0.05). On the other hand, SEPT9 and NOTCH3 promoter methylation status was significantly lower in CRC subjects than in controls; additionally, NOTCH3 promoter methylation status was significantly lower in CRC subjects than in adenoma subjects (p ≤ 0.01). The results obtained allow us to conclude that MN frequency varies according to CRC pathologic status and, together with other variables, is a valid biomarker of adenoma and CRC risk. Additionally, the plasma of patients affected by CRC serves not only as a biomarker of oxidative stress but also as a biomarker of genetic damage correlated with the carcinogenic process occurring in the colon-rectum. SEPT9 and NOTCH3 promoter methylation status at the peripheral blood level varies according to the histopathological changes observed in the colon-rectum, suggesting that the promoter methylation profile of these genes could be a reliable biomarker of CRC risk.
Abstract:
Proper hazard identification has become progressively more difficult to achieve, as witnessed by several major accidents that took place in Europe, such as the Ammonium Nitrate explosion at Toulouse (2001) and the vapour cloud explosion at Buncefield (2005), whose accident scenarios had not been considered in their site safety cases. Furthermore, the rapid renewal of industrial technology has brought about the need to upgrade hazard identification methodologies. Accident scenarios of emerging technologies, which are still not properly identified, may remain unrecognized until they take place for the first time. The consideration of atypical scenarios deviating from normal expectations of unwanted events or from worst-case reference scenarios is thus extremely challenging. A specific method named Dynamic Procedure for Atypical Scenarios Identification (DyPASI) was developed as a complementary tool to bow-tie identification techniques. The main aim of the methodology is to provide an easier but comprehensive hazard identification of the industrial process analysed by systematizing information from early signals of risk related to past events, near misses and inherent studies. DyPASI was validated on two examples of new and emerging technologies: Liquefied Natural Gas regasification and Carbon Capture and Storage. The study broadened the knowledge of the related emerging risks and, at the same time, demonstrated that DyPASI is a valuable tool for obtaining a complete and updated overview of potential hazards. Moreover, in order to tackle the underlying accident causes of atypical events, three methods for the development of early warning indicators were assessed: the Resilience-based Early Warning Indicator (REWI) method, the Dual Assurance method and the Emerging Risk Key Performance Indicator method. REWI was found to be the most complementary and effective of the three, demonstrating that its synergy with DyPASI would be an adequate strategy for improving hazard identification methodologies towards the capture of atypical accident scenarios.
Abstract:
Salmonella and Campylobacter are common causes of human gastroenteritis. Their epidemiology is complex and a multi-tiered approach to control is needed, taking into account the different reservoirs, pathways and risk factors. In this thesis, trends in human gastroenteritis and food-borne outbreak notifications in Italy were explored. Moreover, the improved sensitivity of two recently-implemented regional surveillance systems in Lombardy and Piedmont was evidenced, providing a basis for improving notification at the national level. Trends in human Salmonella serovars were explored: serovars Enteritidis and Infantis decreased, Typhimurium remained stable and 4,[5],12:i:-, Derby and Napoli increased, suggesting that sources of infection have changed over time. Attribution analysis identified pigs as the main source of human salmonellosis in Italy, accounting for 43–60% of infections, followed by Gallus gallus (18–34%). Attributions to pigs and Gallus gallus showed increasing and decreasing trends, respectively. Potential bias and sampling issues related to the use of non-local/non-recent multilocus sequence typing (MLST) data in Campylobacter jejuni/coli source attribution using the Asymmetric Island (AI) model were investigated. As MLST data become increasingly dissimilar with increasing geographical/temporal distance, attributions to sources not sampled close to human cases can be underestimated. A combined case-control and source attribution analysis was developed to investigate risk factors for human Campylobacter jejuni/coli infection of chicken, ruminant, environmental, pet and exotic origin in The Netherlands. Most infections (~87%) were attributed to chicken and cattle. Individuals infected from different reservoirs had different associated risk factors: chicken consumption increased the risk for chicken-attributed infections; animal contact, barbecuing, tripe consumption, and never/seldom chicken consumption increased that for ruminant-attributed infections; game consumption and attending swimming pools increased that for environment-attributed infections; and dog ownership increased that for environment- and pet-attributed infections. Person-to-person contacts around holiday periods were risk factors for infections with exotic strains, putatively introduced by returning travellers.
Abstract:
Tractor rollover represents a primary cause of death or serious injury in agriculture and, despite mandatory Roll-Over Protective Structures (ROPS), which have reduced the number of injuries, tractor accidents are still of great concern. Because of the versatility and wide use of tractors, many safety studies deal with their stability, but they often rely on controlled or laboratory tests. The evaluation of tractors working in the field, instead, is a very complex issue, because rollover can be influenced by the interaction among operator, tractor and environment. Recent studies are oriented towards the evaluation of actual working conditions, developing prototypes for driver assistance and data acquisition; such devices are currently produced and sold by manufacturers. A warning device was assessed in this study with the aim of evaluating its performance and collecting data on the different variables influencing the dynamics of tractors in the field, by continuously monitoring the working conditions of tractors operating at the experimental farm of Bologna University. The device consists of accelerometers, a gyroscope, GSM/GPRS, GPS for geo-referencing and a transceiver for the automatic recognition of tractor-connected equipment. A microprocessor processes the data and, through a dedicated algorithm requiring data on the geometry of the tested tractor, provides information on the level of risk for the operator in terms of probable loss of stability and suggests corrective measures to reduce the potential instability of the tractor.
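The abstract does not describe the device's algorithm in detail, so the following is only a minimal, hypothetical sketch of the kind of geometry-based check such a system could perform: it compares the roll angle measured by the inertial sensors against the static lateral tipping limit derived from track width and centre-of-gravity height. All function names, parameters and threshold values are illustrative assumptions, not the actual device logic.

```python
import math

def static_rollover_angle(track_width_m: float, cog_height_m: float) -> float:
    """Lateral slope angle (degrees) at which a rigid vehicle would tip,
    from the classic static relation tan(theta) = (track / 2) / h_cog."""
    return math.degrees(math.atan((track_width_m / 2.0) / cog_height_m))

def rollover_risk_level(measured_roll_deg: float, track_width_m: float,
                        cog_height_m: float, safety_margin: float = 0.5) -> str:
    """Classify the current roll angle against the static tipping limit.
    A real on-tractor device would also account for dynamics (speed, turning,
    mounted implements); this sketch uses only the static geometric limit."""
    limit = static_rollover_angle(track_width_m, cog_height_m)
    ratio = abs(measured_roll_deg) / limit
    if ratio < safety_margin:
        return "safe"
    if ratio < 1.0:
        return "warning: reduce speed / avoid steeper slopes"
    return "critical: imminent loss of stability"

# Example: 1.8 m track width, 0.9 m centre-of-gravity height, 20 deg measured roll
print(static_rollover_angle(1.8, 0.9))      # ~45 deg static tipping limit
print(rollover_risk_level(20.0, 1.8, 0.9))  # "safe"
```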
Abstract:
Background. A sizable group of patients with symptomatic aortic stenosis (AS) can undergo neither surgical aortic valve replacement (AVR) nor transcatheter aortic valve implantation (TAVI) because of clinical contraindications. The aim of this study was to assess the potential role of balloon aortic valvuloplasty (BAV) as a “bridge-to-decision” in selected patients with severe AS and potentially reversible contraindications to definitive treatment. Methods. We retrospectively enrolled 645 patients who underwent a first BAV at our institution between July 2007 and December 2012. Of these, the 202 patients (31.2%) who underwent BAV as bridge-to-decision (BTD) and required clinical re-evaluation represented our study population. BTD patients were further subdivided into 5 groups: low left ventricular ejection fraction; mitral regurgitation grade ≥3; frailty; hemodynamic instability; comorbidity. The main objective of the study was to evaluate how BAV influenced the final treatment strategy in the whole BTD group and in its specific subgroups. Results. Mean logistic EuroSCORE was 23.5±15.3%, and mean age was 81±7 years. Mean transaortic gradient decreased from 47±17 mmHg to 33±14 mmHg. Of the 193 patients with BTD-BAV who received a second heart team evaluation, 72.5% were finally deemed eligible for definitive treatment (25.4% for AVR; 47.2% for TAVI): respectively, 96.7% of patients with left ventricular ejection fraction recovery; 70.5% of patients with mitral regurgitation reduction; 75.7% of patients who underwent BAV during hemodynamic instability; 69.2% of frail patients; and 68% of patients who presented relevant comorbidities. The remaining 27.5% of the study population were deemed ineligible for definitive treatment and were treated with standard therapy/repeated BAV. In-hospital mortality was 4.5%, cerebrovascular accidents occurred in 1%, and overall vascular complications occurred in 4% (0.5% major; 3.5% minor). Conclusions. Balloon aortic valvuloplasty should be considered as a bridge-to-decision in high-risk patients with severe aortic stenosis who cannot be immediate candidates for definitive percutaneous or surgical treatment.
Abstract:
Introduction. The survival of patients admitted to an emergency department is determined by the severity of the acute illness and the quality of care provided. The high number and the wide spectrum of severity of illness of admitted patients make an immediate assessment of all patients unrealistic. The aim of this study was to evaluate a scoring system based on readily available physiological parameters immediately after admission to an emergency department (ED) for the purpose of identifying at-risk patients. Methods. This prospective observational cohort study includes 4,388 consecutive adult patients admitted via the ED of a 960-bed tertiary referral hospital over a period of six months. The occurrence of each of seven potential vital sign abnormalities (threat to airway, abnormal respiratory rate, oxygen saturation, systolic blood pressure, heart rate, low Glasgow Coma Scale and seizures) was collected and added up to generate the vital sign score (VSS). VSSinitial was defined as the VSS in the first 15 minutes after admission, and VSSmax as the maximum VSS throughout the stay in the ED. The occurrence of single vital sign abnormalities in the first 15 minutes, as well as VSSinitial and VSSmax, was evaluated as a potential predictor of hospital mortality. Results. Logistic regression analysis identified all evaluated single vital sign abnormalities except seizures and abnormal respiratory rate as independent predictors of hospital mortality. Increasing VSSinitial and VSSmax were significantly correlated with hospital mortality (odds ratio (OR) 2.80, 95% confidence interval (CI) 2.50 to 3.14, P < 0.0001 for VSSinitial; OR 2.36, 95% CI 2.15 to 2.60, P < 0.0001 for VSSmax). The predictive power of the VSS was highest when collected in the first 15 minutes after ED admission (log-rank chi-square 468.1, P < 0.0001 for VSSinitial; log-rank chi-square 361.5, P < 0.0001 for VSSmax). Conclusions. Vital sign abnormalities and the VSS collected in the first minutes after ED admission can identify patients at risk of an unfavourable outcome.
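The score itself is simple additive logic: one point per abnormal vital sign out of the seven listed above. The sketch below illustrates that construction; the cut-off values used to define each abnormality are placeholders, since the abstract does not state the thresholds used in the study.

```python
def vital_sign_score(threatened_airway: bool,
                     resp_rate: float,
                     spo2: float,
                     sys_bp: float,
                     heart_rate: float,
                     gcs: int,
                     seizures: bool) -> int:
    """Sum one point per abnormal vital sign, as in the seven-item VSS.
    Cut-off values below are illustrative placeholders, not those of the study."""
    abnormalities = [
        threatened_airway,
        resp_rate < 8 or resp_rate > 25,      # abnormal respiratory rate
        spo2 < 90,                            # low oxygen saturation
        sys_bp < 90,                          # low systolic blood pressure
        heart_rate < 40 or heart_rate > 130,  # abnormal heart rate
        gcs < 9,                              # low Glasgow Coma Scale
        seizures,
    ]
    return sum(abnormalities)

# Example: tachypnoeic, hypotensive patient -> VSS of 2
print(vital_sign_score(False, 30, 95, 85, 110, 15, False))
```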
Abstract:
Background. Metabolic complications, including cardiovascular events and diabetes mellitus (DM), are a major long-term concern in human immunodeficiency virus (HIV)-infected individuals. Recent genome-wide association studies have reliably associated multiple single nucleotide polymorphisms (SNPs) with DM in the general population. Methods. We evaluated the contribution of 22 SNPs identified in genome-wide association studies and of longitudinally measured clinical factors to DM. We genotyped all 94 white participants in the Swiss HIV Cohort Study who developed DM from 1 January 1999 through 31 August 2009 and 550 participants without DM. Analyses were based on 6054 person-years of follow-up and 13,922 measurements of plasma glucose. Results. The contribution to DM risk explained by SNPs (14% of DM variability) was larger than the contribution to DM risk explained by current or cumulative exposure to different antiretroviral therapy combinations (3% of DM variability). Participants with the most unfavorable genetic score (representing 12% and 19% of the study population, respectively, when applying 2 different genetic scores) had incidence rate ratios for DM of 3.80 (95% confidence interval [CI], 2.05–7.06) and 2.74 (95% CI, 1.53–4.88), respectively, compared with participants with a favorable genetic score. However, the addition of genetic data to clinical risk factors that included body mass index only slightly improved DM prediction. Conclusions. In white HIV-infected persons treated with antiretroviral therapy, the DM effect of genetic variants was larger than the potential toxic effects of antiretroviral therapy. SNPs contributed significantly to DM risk, but their addition to a clinical model improved DM prediction only slightly, similar to findings in the general population.
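The abstract refers to composite "genetic scores" built from the 22 risk SNPs but does not give their exact construction. A common approach, shown here purely as an assumed illustration, is to count risk alleles across the genotyped SNPs, optionally weighting each SNP by its effect size; the SNP identifiers and weights in the example are hypothetical values for demonstration only.

```python
from typing import Dict, Optional

def genetic_risk_score(genotype: Dict[str, int],
                       weights: Optional[Dict[str, float]] = None) -> float:
    """Compute a simple genetic risk score.

    genotype maps each SNP id to the number of risk alleles carried (0, 1 or 2).
    If per-SNP weights (e.g. log odds ratios) are supplied, a weighted score is
    returned; otherwise the unweighted risk-allele count is used. This is a
    generic construction, not necessarily the scores used in the cited study.
    """
    if weights is None:
        return float(sum(genotype.values()))
    return sum(weights.get(snp, 0.0) * count for snp, count in genotype.items())

# Hypothetical example with three DM-associated SNPs and made-up weights
genotype = {"rs7903146": 2, "rs10811661": 1, "rs5219": 0}
weights = {"rs7903146": 0.31, "rs10811661": 0.19, "rs5219": 0.13}
print(genetic_risk_score(genotype))           # unweighted: 3.0
print(genetic_risk_score(genotype, weights))  # weighted:   0.81
```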
Abstract:
Despite successful intensive care, a substantial portion of critically ill patients dies after discharge from the intensive care unit or hospital. Observational studies investigating the long-term survival of critically ill patients have reported that most deaths occur during the first months or year after discharge. Only limited data on the causes of impaired quality of life and post-intensive care unit deaths exist in the current literature. In this manuscript we hypothesize that the acute inflammatory response which characteristically accompanies critical illness is followed by a prolonged imbalance or activation of the immune system. Such a chronic low-grade inflammatory response to critical illness may be sub-clinical and persist for a variable period of time after discharge from the intensive care unit and hospital. Chronic inflammation is a well-recognized risk factor for long-term morbidity and mortality, particularly from cardiovascular causes, and may thus partly contribute to the impaired quality of life as well as the increased morbidity and mortality following intensive care unit and hospital discharge of critically ill patients. Assuming that critical illness is indeed followed by a prolonged inflammatory response, important implications for treatment would arise. An interesting and potentially beneficial therapy could be the administration of immune-modulating drugs during the period after intensive care unit or hospital discharge until chronic inflammation has subsided. Statins are well-investigated and effective drugs for attenuating chronic inflammation and could potentially also improve the long-term outcome of critically ill patients after intensive care unit or hospital discharge. Future studies evaluating the course of inflammation during and after critical illness, as well as its response to statin therapy, are required.
Abstract:
Many people with acute myocardial infarction die from sudden cardiac arrest before reaching the hospital. The current clinical understanding of the mechanisms and risk factors surrounding sudden cardiac death is limited. However, 2 factors related to sudden death, namely the occluded coronary vessel (right coronary, left circumflex, or left anterior descending artery) and the extent of collateral circulation, are of potential relevance. Recent data suggest that the risk differs between the different coronary arteries and that coronary collateral circulation seems to have an important protective 'antiarrhythmic' effect. This editorial will address possible mechanisms and potential implications in clinical practice.
Abstract:
Dronedarone restores sinus rhythm and reduces hospitalization or death in intermittent atrial fibrillation. It also lowers heart rate and blood pressure and has antiadrenergic and potential ventricular antiarrhythmic effects. We hypothesized that dronedarone would reduce major vascular events in high-risk permanent atrial fibrillation.
Abstract:
Cerebral vasospasm is a common complication occurring after aneurysmal subarachnoid hemorrhage (SAH). It is recognized as a leading preventable cause of morbidity and mortality in this patient group, but its management is challenging, and new treatments are needed. Clazosentan is an endothelin receptor antagonist designed to prevent endothelin-mediated cerebral vasospasm. Vajkoczy et al. (Neurosurg 103:9-17, 2005) initially demonstrated that clazosentan reduced moderate/severe angiographically proven vasospasm by 55% relative to placebo. These findings led to the initiation of the CONSCIOUS trial program to further examine the efficacy and safety of clazosentan in reducing angiographic vasospasm and improving clinical outcome after aneurysmal SAH. In the first of these studies, CONSCIOUS-1, 413 patients were randomized to placebo or clazosentan 1, 5 or 15 mg/h. Clazosentan reduced angiographic vasospasm dose-dependently relative to placebo with a maximum risk reduction of 65% with the highest dose. Despite this, there was no benefit of clazosentan on the secondary protocol-defined morbidity/mortality endpoint; however, additional post-hoc and modified endpoint analyses provided some evidence for a potential clinical benefit. Two additional large-scale studies (CONSCIOUS-2 and CONSCIOUS-3) are now underway to further investigate the potential of clazosentan to improve long-term clinical outcome.
Abstract:
With the publication of the quality guideline ICH Q9 "Quality Risk Management" by the International Conference on Harmonization, risk management has already become a standard requirement during the life cycle of a pharmaceutical product. Failure mode and effect analysis (FMEA) is a powerful risk analysis tool that has been used for decades in the mechanical and electrical industries. However, adapting the FMEA methodology to biopharmaceutical processes brings about some difficulties. The proposal presented here is intended to serve as a brief but nevertheless comprehensive and detailed guideline on how to conduct a biopharmaceutical process FMEA. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. The applications of such a biopharmaceutical process FMEA are widespread. It can be useful whenever a biopharmaceutical manufacturing process is developed or scaled up, or when it is transferred to a different manufacturing site. It may also be conducted during substantial optimization of an existing process or the development of a second-generation process. According to their resulting risk ratings, process parameters can be ranked for importance, and important variables for process development, characterization, or validation can be identified. LAY ABSTRACT: Health authorities around the world ask pharmaceutical companies to manage risk during the development and manufacturing of pharmaceuticals. The so-called failure mode and effect analysis (FMEA) is an established risk analysis tool that has been used for decades in the mechanical and electrical industries. However, adapting the FMEA methodology to pharmaceutical processes that use modern biotechnology (biopharmaceutical processes) brings about some difficulties, because those biopharmaceutical processes differ from processes in the mechanical and electrical industries. The proposal presented here explains how a biopharmaceutical process FMEA can be conducted. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. With the help of this guideline, different details of the manufacturing process can be ranked according to their potential risks, and this can help pharmaceutical companies to identify aspects with high potential risks and to react accordingly to improve the safety of medicines.
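The abstract describes a 1-to-10 rating scale for occurrence, severity and detectability but does not spell out how the three ratings are combined. In conventional FMEA they are multiplied into a risk priority number (RPN); the sketch below assumes that standard construction and uses made-up failure modes for a biopharmaceutical process step as illustrative data only.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    process_parameter: str
    occurrence: int     # 1-10 rating: how likely the failure is
    severity: int       # 1-10 rating: impact on product quality / patient safety
    detectability: int  # 1-10 rating: 10 = hardest to detect before release

    def rpn(self) -> int:
        """Conventional FMEA risk priority number: O x S x D (range 1-1000)."""
        return self.occurrence * self.severity * self.detectability

# Hypothetical failure modes for a cell-culture step (illustrative values only)
modes = [
    FailureMode("bioreactor pH excursion", occurrence=4, severity=7, detectability=3),
    FailureMode("feed medium mix-up", occurrence=2, severity=9, detectability=6),
    FailureMode("incubation temperature drift", occurrence=5, severity=5, detectability=2),
]

# Rank process parameters by risk, highest RPN first
for fm in sorted(modes, key=lambda m: m.rpn(), reverse=True):
    print(f"{fm.process_parameter}: RPN = {fm.rpn()}")
```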
Abstract:
Learned irrelevance (LIrr) refers to a form of selective learning that develops as a result of prior noncorrelated exposures to the predicted and predictor stimuli. In learning situations that depend on the associative link between the predicted and predictor stimuli, LIrr is expressed as a retardation of learning. It represents a form of modulation of learning by selective attention. Given the relevance of selective attention impairment to both positive and cognitive schizophrenia symptoms, the question remains whether LIrr impairment represents a state (relating to symptom manifestation) or trait (relating to schizophrenia endophenotypes) marker of human psychosis. We examined this by evaluating the expression of LIrr in an associative learning paradigm in (1) asymptomatic first-degree relatives of schizophrenia patients (SZ-relatives) and in (2) individuals exhibiting prodromal signs of psychosis ("ultrahigh risk" [UHR] patients), in each case relative to demographically matched healthy control subjects. There was no evidence of aberrant LIrr in SZ-relatives, but LIrr as well as associative learning were attenuated in UHR patients. It is concluded that LIrr deficiency in conjunction with a learning impairment might be a useful state marker predictive of psychotic state but a relatively weak link to a potential schizophrenia endophenotype.
Abstract:
BACKGROUND: Detecting a benefit from closure of patent foramen ovale in patients with cryptogenic stroke is hampered by low rates of stroke recurrence and uncertainty about the causal role of the patent foramen ovale in the index event. A method to predict patent foramen ovale-attributable recurrence risk is needed. However, individual databases generally have too few stroke recurrences to support risk modeling. Prior studies of this population have been limited by low statistical power for examining factors related to recurrence. AIMS: The aim of this study was to develop a database to support modeling of patent foramen ovale-attributable recurrence risk by combining extant data sets. METHODS: We identified investigators with extant databases including subjects with cryptogenic stroke investigated for patent foramen ovale, determined the availability and characteristics of data in each database, collaboratively specified the variables to be included in the Risk of Paradoxical Embolism database, harmonized the variables across databases, and collected new primary data when necessary and feasible. RESULTS: The Risk of Paradoxical Embolism database contains individual clinical, radiologic, and echocardiographic data from 12 component databases, including subjects with cryptogenic stroke both with (n = 1925) and without (n = 1749) patent foramen ovale. In the patent foramen ovale subjects, a total of 381 outcomes (stroke, transient ischemic attack, death) occurred (median follow-up 2.2 years). While there were substantial variations in data collection between studies, there was sufficient overlap to define a common set of variables suitable for risk modeling. CONCLUSION: While individual studies are inadequate for modeling patent foramen ovale-attributable recurrence risk, collaboration between investigators has yielded a database with sufficient power to identify those patients at highest risk for a patent foramen ovale-related stroke recurrence who may have the greatest potential benefit from patent foramen ovale closure.
Abstract:
Background. Chronic localized pain syndromes, especially chronic low back pain (CLBP), are common reasons for consultation in general practice. In some cases chronic localized pain syndromes can appear in combination with chronic widespread pain (CWP). Numerous studies have shown a strong association between CWP and several physical and psychological factors. These studies are population-based and cross-sectional and do not allow for assessing chronology. There are very few prospective studies that explore the predictors for the onset of CWP, and their main focus is identifying risk factors for CWP incidence. Until now there have been no studies focusing on preventive factors that keep patients from developing CWP. Our aim is to perform a cross-sectional study on the epidemiology of CLBP and CWP in general practice and to look for distinctive features regarding resources such as resilience, self-efficacy and coping strategies. A subsequent cohort study is designed to identify the risk and protective factors for pain generalization (development of CWP) in primary care for CLBP patients. Methods/Design. Fifty-nine general practitioners consecutively recruit, during a 5-month period, all patients who consult their family doctor because of chronic low back pain (pain that has lasted for 3 months). Patients are asked to fill out a questionnaire on pain anamnesis, pain perception, comorbidities, therapy course, medication, sociodemographic data and psychosomatic symptoms. We assess resilience, coping resources, stress management and self-efficacy as potential protective factors against pain generalization. Furthermore, we assess risk factors for pain generalization such as anxiety, depression, trauma and critical life events. During a twelve-month follow-up period, a cohort of CLBP patients without CWP will be screened on a regular basis (every 3 months) for pain generalization (outcome: incident CWP). Discussion. This cohort study will be the largest study to prospectively analyze predictors of the transition from CLBP to CWP in a primary care setting. In contrast to the typically researched risk factors, which increase the probability of pain generalization, this study also focuses intensively on protective factors, which decrease the probability of pain generalization.