Abstract:
BACKGROUND: Coronary stents improve immediate and late results of balloon angioplasty by tacking up dissections and preventing wall recoil. These goals are achieved within weeks after angioplasty, but with current technology stents permanently remain in the artery, with many limitations including the need for long-term antiplatelet treatment to avoid thrombosis. We report a prospective multicentre clinical trial of coronary implantations of absorbable magnesium stents. METHODS: We enrolled 63 patients (44 men; mean age 61.3 [SD 9.5] years) in eight centres with single de novo lesions in a native coronary artery in a multicentre, non-randomised prospective study. Follow-up included coronary angiography and intravascular ultrasound at 4 months and clinical assessment at 6 months and 12 months. The primary endpoint was cardiac death, non-fatal myocardial infarction, or clinically driven target lesion revascularisation at 4 months. FINDINGS: 71 stents, 10-15 mm in length and 3.0-3.5 mm in diameter, were successfully implanted after pre-dilatation in 63 patients. Diameter stenosis was reduced from 61.5% (SD 13.1) to 12.6% (5.6), with an acute gain of 1.41 mm (0.46 mm) and in-stent late loss of 1.08 mm (0.49 mm). The ischaemia-driven target lesion revascularisation rate was 23.8% after 4 months, and the overall target lesion revascularisation rate was 45% after 1 year. No myocardial infarction, subacute or late thrombosis, or death occurred. Angiography at 4 months showed an increased diameter stenosis of 48.4% (SD 17.0). On serial intravascular ultrasound examinations, only small remnants of the original struts were visible, well embedded into the intima. Neointimal growth and negative remodelling were the main operating mechanisms of restenosis. INTERPRETATION: This study shows that biodegradable magnesium stents can achieve an immediate angiographic result similar to that of other metal stents and can be safely degraded after 4 months. Modifications of stent characteristics with prolonged degradation and drug elution are currently in development.
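The angiographic metrics quoted above (diameter stenosis, acute gain, in-stent late loss) follow standard quantitative coronary angiography definitions. The sketch below illustrates those definitions with hypothetical minimal-lumen-diameter values; it is not derived from the trial data.

```python
# Minimal sketch of the standard QCA metrics cited in the abstract
# (diameter stenosis, acute gain, in-stent late loss). Helper names and
# example numbers are illustrative, not taken from the trial.

def diameter_stenosis(mld_mm: float, reference_mm: float) -> float:
    """Percent diameter stenosis from minimal lumen diameter (MLD)."""
    return (1.0 - mld_mm / reference_mm) * 100.0

def acute_gain(mld_post_mm: float, mld_pre_mm: float) -> float:
    """Acute gain: MLD immediately after stenting minus MLD before."""
    return mld_post_mm - mld_pre_mm

def late_loss(mld_post_mm: float, mld_followup_mm: float) -> float:
    """Late loss: post-procedure MLD minus MLD at angiographic follow-up."""
    return mld_post_mm - mld_followup_mm

if __name__ == "__main__":
    # Hypothetical vessel: 3.0 mm reference, 1.2 mm pre, 2.6 mm post, 1.5 mm at 4 months
    print(diameter_stenosis(1.2, 3.0))   # 60.0 (%)
    print(acute_gain(2.6, 1.2))          # 1.4 (mm)
    print(late_loss(2.6, 1.5))           # 1.1 (mm)
```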
Abstract:
AIM: To present a novel, minimally invasive strabismus surgery (MISS) technique for rectus muscle operations. METHODS: In this prospective study with a non-concurrent, retrospective comparison group, the first 20 consecutive patients treated with MISS were matched by age, diagnosis and muscles operated on, with 20 patients with a limbal opening operated on by the same surgeon at Kantonsspital, St Gallen, Switzerland. A total of 39 muscles were operated on. MISS is performed by applying two small radial cuts along the superior and inferior muscle margin. After muscle separation from surrounding tissue, a recession or plication is performed through the resulting tunnel. Alignment, binocular single vision, variations in vision, refraction, and number and types of complications during the first 6 postoperative months were registered. RESULTS: Visual acuity decreased at postoperative day 1 in both groups. The decrease was less pronounced in the group operated on with MISS (difference of decrease 0.14 logMAR, p<0.001). An abnormal lid swelling at day 1 was more frequent in the control group (21%, 95% confidence interval (CI) 9% to 41%, 5/24 vs 0%, 95% CI 0% to 13%, 0/25, p<0.05). No significant difference was found for final alignment, binocular single vision, other visual acuities, refractive changes or complications (allergic reactions, dellen formation, abnormal conjunctival findings). A conversion to a limbal opening was necessary in 5% (95% CI 2% to 17%, 2/39) of muscles. CONCLUSIONS: This study shows that this new, small-incision, minimal dissection technique is feasible. The MISS technique seems to be superior in the direct postoperative period, as better visual acuities and less lid swelling were observed. Long-term results did not differ between the two groups.
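The complication proportions above are reported with 95% confidence intervals (e.g. 5/24, 21%, CI 9% to 41%). The abstract does not state which interval method was used; the sketch below uses the Wilson score interval, which yields bounds of roughly the same magnitude.

```python
# Hedged sketch: Wilson score 95% CI for a proportion, similar in spirit to
# the intervals quoted in the abstract (the authors' exact method is not stated).
from math import sqrt

def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))) / denom
    return centre - half, centre + half

if __name__ == "__main__":
    lo, hi = wilson_ci(5, 24)  # abnormal lid swelling in the comparison group
    print(f"{5/24:.0%} (95% CI {lo:.0%} to {hi:.0%})")  # ~21% (9% to 40%), close to the quoted 9-41%
```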
Abstract:
The status and dynamics of glaciers are crucial for agriculture in semiarid parts of Central Asia, since river flow is characterized by major runoff in spring and summer, supplied by glacier- and snowmelt. Ideally, this coincides with the critical period of water demand for irrigation. The present study shows a clear trend in glacier retreat between 1963 and 2000 in the Sokoluk watershed, a catchment of the Northern Tien Shan mountain range in Kyrgyzstan. The overall area loss of 28% observed for the period 1963–2000, and a clear acceleration of wastage since the 1980s, correlate with the results of previous studies in other regions of the Tien Shan as well as the Alps. Glaciers smaller than 0.5 km² exhibited this phenomenon most starkly: while they showed a moderate decrease of only 9.1% between 1963 and 1986, they lost 41.5% of their surface area between 1986 and 2000. Furthermore, a general increase in the minimum glacier elevation of 78 m has been observed over the last three decades. This corresponds to about one-third of the entire rise of the minimum glacier elevation in the Northern Tien Shan since the Little Ice Age maximum.
Abstract:
To assess the prevalence of tooth wear on buccal/facial and lingual/palatal tooth surfaces and identify related risk factors in a sample of young European adults aged 18-35 years. Calibrated and trained examiners measured tooth wear, using the basic erosive wear examination (BEWE), in 3187 patients in seven European countries and assessed the impact of risk factors with a previously validated questionnaire. Each individual was characterized by the highest BEWE score recorded for any scoreable surface. Bivariate analyses examined the proportion of participants who scored 2 or 3 in relation to a range of demographic, dietary and oral care variables. The highest tooth wear BEWE score was 0 for 1368 patients (42.9%), 1 for 883 (27.7%), 2 for 831 (26.1%) and 3 for 105 (3.3%). There were large differences between countries, with the highest levels of tooth wear observed in the UK. Important risk factors for tooth wear included heartburn or acid reflux, repeated vomiting, residence in rural areas, electric tooth brushing and snoring. We found no evidence that waiting after breakfast before tooth brushing has any effect on the degree of tooth wear (p=0.088). Fresh fruit and juice intake was positively associated with tooth wear. In this adult sample, 29% had signs of tooth wear (BEWE score 2 or 3), making it a common presenting feature in European adults.
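Each participant is summarized by the highest BEWE score (0-3) recorded on any scoreable surface. The sketch below illustrates that per-patient summarization; the data layout and names are hypothetical.

```python
# Minimal sketch of the per-patient summarization described in the abstract:
# each participant is scored by the highest BEWE value (0-3) found on any
# scoreable surface. Data layout and identifiers are hypothetical.
from collections import Counter

def highest_bewe(surface_scores: list[int]) -> int:
    """Highest BEWE score (0-3) across all scoreable surfaces of one patient."""
    return max(surface_scores)

patients = {
    "p1": [0, 1, 1, 0],
    "p2": [2, 1, 3, 0],   # counts as BEWE 3
    "p3": [0, 0, 2, 1],
}

summary = Counter(highest_bewe(scores) for scores in patients.values())
for score in range(4):
    n = summary.get(score, 0)
    print(f"BEWE {score}: {n} patients ({n / len(patients):.1%})")
```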
Abstract:
PURPOSE To assess the need for clinically-driven secondary revascularization in critical limb ischemia (CLI) patients subsequent to tibial angioplasty during a 2-year follow-up. METHODS Between 2008 and 2010, a total of 128 consecutive CLI patients (80 men; mean age 76.5±9.8 years) underwent tibial angioplasty in 139 limbs. Rutherford categories, ankle-brachial index measurements, and lower limb oscillometries were prospectively assessed. All patients were followed at 3, 6, and 12 months, and annually thereafter. Rates of death, primary and secondary sustained clinical improvement, target lesion (TLR) and target extremity revascularization (TER), as well as major amputation, were analyzed retrospectively. Primary clinical improvement was defined as improvement in Rutherford category to a level of intermittent claudication without unplanned amputation or TLR. RESULTS All-cause mortality was 8.6%, 14.8%, 22.9%, and 29.1% at 3, 6, 12, and 24 months, respectively. At the same intervals, rates of primary sustained clinical improvement were 74.5%, 53.0%, 42.7%, and 37.1%; for secondary improvement, the rates were 89.1%, 76.0%, 68.4%, and 65.0%. Clinically-driven TLR rates were 14.6%, 29.1%, 41.6%, and 46.2%; the rates for TER were 3.0%, 13.6%, 17.2%, and 27.6% at the corresponding intervals, while the rates of major amputation were 1.5%, 5.5%, 10.1%, and 10.1%. CONCLUSION Clinically-driven TLR is frequently required to maintain favorable functional clinical outcomes in CLI patients following tibial angioplasty. Dedicated technologies addressing tibial arterial restenosis warrant further academic scrutiny.
Abstract:
In this study, we document glacial deposits and reconstruct the glacial history of the Karagöl valley system in the eastern Uludağ in northwestern Turkey, based on 42 cosmogenic 10Be exposure ages from boulders and bedrock. Our results suggest a Last Glacial Maximum (LGM) advance prior to 20.4 ± 1.2 ka and at least three re-advances until 18.6 ± 1.2 ka during the global LGM within Marine Isotope Stage 2. In addition, two older advances of unknown age are geomorphologically well constrained but were not dated owing to the absence of suitable boulders. Glaciers advanced again twice during the Lateglacial. The older advance is exposure-dated to no later than 15.9 ± 1.1 ka, and the younger is attributed to the Younger Dryas (YD) on the basis of field evidence. The timing of the glaciations in the Karagöl valley correlates well with documented archives in the Anatolian and Mediterranean mountains and the Alps. These glacier fluctuations may be explained by changes in the atmospheric circulation pattern during the different phases of the North Atlantic Oscillation (NAO) winter index.
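The 10Be exposure ages quoted above rest on the standard relation between the measured nuclide concentration, the local surface production rate and radioactive decay. A simplified form (neglecting erosion and inheritance, which the abstract does not discuss) is:

```latex
% Simplified surface exposure-age relation, shown here as background to the
% 10Be ages quoted in the abstract (erosion and inheritance neglected).
\[
  N(t) \;=\; \frac{P}{\lambda}\left(1 - e^{-\lambda t}\right)
  \quad\Longrightarrow\quad
  t \;=\; -\frac{1}{\lambda}\,\ln\!\left(1 - \frac{N\lambda}{P}\right),
\]
% N: measured 10Be concentration (atoms g^{-1})
% P: local surface production rate (atoms g^{-1} a^{-1})
% \lambda: 10Be decay constant (about 5 x 10^{-7} a^{-1})
```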
Abstract:
The trabecular bone score (TBS) is an index of bone microarchitectural texture calculated from anteroposterior dual-energy X-ray absorptiometry (DXA) scans of the lumbar spine (LS) that predicts fracture risk independently of bone mineral density (BMD). The aim of this study was to compare the effects of yearly intravenous zoledronate (ZOL) versus placebo (PLB) on LS BMD and TBS in postmenopausal women with osteoporosis. Changes in TBS were assessed in the subset of 107 patients recruited at the Department of Osteoporosis of the University Hospital of Berne, Switzerland, who were included in the HORIZON trial. All subjects received adequate calcium and vitamin D3. In these patients, randomly assigned to either ZOL (n = 54) or PLB (n = 53) for 3 years, BMD was measured by DXA and TBS was assessed by TBS iNsight (v1.9) at baseline and at 6, 12, 24, and 36 months after treatment initiation. Baseline characteristics (mean ± SD) were similar between groups in terms of age (76.8 ± 5.0 years), body mass index (BMI; 24.5 ± 3.6 kg/m²) and TBS (1.178 ± 0.1), but not LS T-score (ZOL −2.9 ± 1.5 versus PLB −2.1 ± 1.5). Changes in LS BMD were significantly greater with ZOL than with PLB at all time points (p < 0.0001 for all), reaching +9.58% versus +1.38% at month 36. Change in TBS was significantly greater with ZOL than with PLB from month 24 onwards, reaching +1.41% versus −0.49% at month 36 (p = 0.031). LS BMD and TBS were weakly correlated (r = 0.20), and there were no correlations between changes in BMD and TBS from baseline at any visit. In postmenopausal women with osteoporosis, once-yearly intravenous ZOL therapy significantly increased LS BMD relative to PLB over 3 years, and TBS as of 2 years.
Abstract:
AIMS High-density lipoprotein (HDL) cholesterol is a strong predictor of cardiovascular mortality. This work aimed to investigate whether the presence of coronary artery disease (CAD) impacts on its predictive value. METHODS AND RESULTS We studied 3141 participants (2191 males, 950 females) of the LUdwigshafen RIsk and Cardiovascular health (LURIC) study. They had a mean ± standard deviation age of 62.6 ± 10.6 years, body mass index of 27.5 ± 4.1 kg/m², and HDL cholesterol of 38.9 ± 10.8 mg/dL. The cohort consisted of 699 people without CAD, 1515 patients with stable CAD, and 927 patients with unstable CAD. The participants were prospectively followed for cardiovascular mortality over a median (inter-quartile range) period of 9.9 (8.7-10.7) years. A total of 590 participants died from cardiovascular diseases. High-density lipoprotein cholesterol by tertiles was inversely related to cardiovascular mortality in the entire cohort (P = 0.009). There was a significant interaction between HDL cholesterol and CAD in predicting the outcome (P = 0.007). In stratified analyses, HDL cholesterol was strongly associated with cardiovascular mortality in people without CAD [3rd vs. 1st tertile: HR (95% CI) = 0.37 (0.18-0.74), P = 0.005], but not in patients with stable [3rd vs. 1st tertile: HR (95% CI) = 0.81 (0.61-1.09), P = 0.159] and unstable [3rd vs. 1st tertile: HR (95% CI) = 0.91 (0.59-1.41), P = 0.675] CAD. These results were replicated by analyses in 3413 participants of the AtheroGene cohort and 5738 participants of the ESTHER cohort, and by a meta-analysis comprising all three cohorts. CONCLUSION The inverse relationship of HDL cholesterol with cardiovascular mortality is weakened in patients with CAD. The usefulness of considering HDL cholesterol for cardiovascular risk stratification seems limited in such patients.
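The interaction reported above (HDL cholesterol tertile × CAD status in predicting cardiovascular mortality) is typically tested by adding a product term to a Cox proportional hazards model. The sketch below shows one such setup on synthetic data using the lifelines package; the column names, simulated data and modelling details are illustrative only and do not reproduce the LURIC analysis.

```python
# Hedged sketch: Cox model with an HDL-tertile x CAD interaction term,
# fitted to synthetic data (not the LURIC cohort).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 300
hdl_tertile = rng.integers(1, 4, n)   # 1 = lowest HDL tertile
cad = rng.integers(0, 2, n)           # 0 = no CAD at baseline
# Hypothetical hazard in which HDL is protective only without CAD
hazard = 0.10 * np.exp(-0.4 * (hdl_tertile - 1) * (1 - cad) + 0.5 * cad)
time = rng.exponential(1.0 / hazard)

df = pd.DataFrame({
    "time_years": np.minimum(time, 10.0),        # administrative censoring at 10 years
    "cv_death": (time < 10.0).astype(int),
    "hdl_tertile": hdl_tertile,
    "cad": cad,
})
df["hdl_x_cad"] = df["hdl_tertile"] * df["cad"]  # interaction term

cph = CoxPHFitter()
cph.fit(df, duration_col="time_years", event_col="cv_death")
cph.print_summary()  # inspect the hdl_x_cad coefficient and p-value
```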
Abstract:
OBJECTIVE To validate use of stress MRI for evaluation of stifle joints of dogs with an intact or deficient cranial cruciate ligament (CrCL). SAMPLE 10 cadaveric stifle joints from 10 dogs. PROCEDURES A custom-made limb-holding device and a pulley system linked to a paw plate were used to apply axial compression across the stifle joint and induce cranial tibial translation with the joint in various degrees of flexion. By use of sagittal proton density-weighted MRI, CrCL-intact and deficient stifle joints were evaluated under conditions of loading stress simulating the tibial compression test or the cranial drawer test. Medial and lateral femorotibial subluxation following CrCL transection measured under a simulated tibial compression test and a cranial drawer test were compared. RESULTS By use of tibial compression test MRI, the mean ± SD cranial tibial translations in the medial and lateral compartments were 9.6 ± 3.7 mm and 10 ± 4.1 mm, respectively. By use of cranial drawer test MRI, the mean ± SD cranial tibial translations in the medial and lateral compartments were 8.3 ± 3.3 mm and 9.5 ± 3.5 mm, respectively. No significant difference in femorotibial subluxation was found between stress MRI techniques. Femorotibial subluxation elicited by use of the cranial drawer test was greater in the lateral than in the medial compartment. CONCLUSIONS AND CLINICAL RELEVANCE Both stress techniques induced stifle joint subluxation following CrCL transection that was measurable by use of MRI, suggesting that both methods may be further evaluated for clinical use.
Abstract:
BACKGROUND Previous studies indicate increased prevalences of suicidal ideation, suicide attempts, and completed suicide in Huntington's disease (HD) compared with the general population. This study investigates correlates and predictors of suicidal ideation in HD. METHODS The study cohort consisted of 2106 HD mutation carriers, all participating in the REGISTRY study of the European Huntington's Disease Network. Of the 1937 participants without suicidal ideation at baseline, 945 had one or more follow-up measurements. Participants were assessed for suicidal ideation with the behavioural subscale of the Unified Huntington's Disease Rating Scale (UHDRS). Correlates of suicidal ideation were analyzed using logistic regression analysis and predictors were analyzed using Cox regression analysis. RESULTS At baseline, 169 (8.0%) mutation carriers endorsed suicidal ideation. Disease duration (odds ratio [OR]=0.96; 95% confidence interval [CI]: 0.9-1.0), anxiety (OR=2.14; 95%CI: 1.4-3.3), aggression (OR=2.41; 95%CI: 1.5-3.8), a previous suicide attempt (OR=3.95; 95%CI: 2.4-6.6), and a depressed mood (OR=13.71; 95%CI: 6.7-28.0) were independently associated with suicidal ideation at baseline. The 4-year cumulative incidence of suicidal ideation was 9.9%. Longitudinally, the presence of a depressed mood (hazard ratio [HR]=2.05; 95%CI: 1.1-4.0) and use of benzodiazepines (HR=2.44; 95%CI: 1.2-5.0) at baseline were independent predictors of incident suicidal ideation, whereas a previous suicide attempt was not predictive. LIMITATIONS As suicidal ideation was assessed by only one item, and participants were a selection of all HD mutation carriers, the prevalence of suicidal ideation was likely underestimated. CONCLUSIONS Suicidal ideation occurs frequently in HD. Assessment of suicidal ideation is a priority in mutation carriers with a depressed mood and in those using benzodiazepines.
Abstract:
Introduction: A rather little-known form of mental training is practice in lucid dreams (Erlacher, Stumbrys & Schredl, 2011-2012). In a lucid dream, the dreamer is aware of dreaming and can thereby control the ongoing dream content. Earlier studies showed that it is possible to practise motor tasks in lucid dreams and thereby improve performance in the waking state (Erlacher & Schredl, 2010). However, little is known about the prevalence of lucid dreamers in sport. Methods: The sample comprised 840 German (D: 483 male, 357 female) and 1323 Japanese (J: 1000 male, 323 female) athletes. The mean age was 20.4 years (D: 21.6; J: 19.7). The participants were recruited from a variety of sports, from team sports (e.g. basketball) to individual sports (e.g. track and field), and completed a questionnaire on sport, sleep and dreams. The athletes had been active for an average of 9.1 years (D: 11.1; J: 7.9) and trained about 14.4 hours (D: 11.1; J: 16.7) per week. The questionnaire recorded lucid dream frequency on an 8-point scale (together with a definition to ensure a clear understanding of lucid dreams), the use of lucid dreams (e.g. for training) in sport and, if this was affirmed, whether improvements in athletic performance were experienced. Results: 47% (D: 57%; J: 41%) of the athletes reported having experienced at least one lucid dream, 20% (D: 24%; J: 18%) were frequent lucid dreamers (one or more lucid dreams per month) and 9% (D: 9%; J: 9%) used lucid dreams for their sport; of these, the majority reported that lucid dream training improved athletic performance in the waking state. Discussion: About half of the athletes know lucid dreaming from their own experience, one fifth are frequent lucid dreamers and about one in ten athletes uses lucid dreams for their sport. For the German sample, the prevalence rate among athletes is similar to that in the general population. No representative population data are available for the Japanese sample, but on the basis of the questionnaire data presented here, cultural differences appear to play a minor role. References: Erlacher, D. & Schredl, M. (2010). Practicing a motor task in a lucid dream enhances subsequent performance: A pilot study. The Sport Psychologist, 24(2), 157-167. Erlacher, D., Stumbrys, T. & Schredl, M. (2011-2012). Frequency of lucid dreams and lucid dream practice in German athletes. Imagination, Cognition and Personality, 31(3), 237-246.
Abstract:
PURPOSE The objectives of this systematic review are (1) to quantitatively estimate the esthetic outcomes of implants placed in postextraction sites, and (2) to evaluate the influence of simultaneous bone augmentation procedures on these outcomes. MATERIALS AND METHODS Electronic and manual searches of the dental literature were performed to collect information on esthetic outcomes based on objective criteria with implants placed after extraction of maxillary anterior and premolar teeth. All levels of evidence were accepted (case series studies required a minimum of 5 cases). RESULTS From 1,686 titles, 114 full-text articles were evaluated and 50 records included for data extraction. The included studies reported on single-tooth implants adjacent to natural teeth, with no studies on multiple missing teeth identified (6 randomized controlled trials, 6 cohort studies, 5 cross-sectional studies, and 33 case series studies). Considerable heterogeneity in study design was found. A meta-analysis of controlled studies was not possible. The available evidence suggests that esthetic outcomes, determined by esthetic indices (predominantly the pink esthetic score) and positional changes of the peri-implant mucosa, may be achieved for single-tooth implants placed after tooth extraction. Immediate (type 1) implant placement, however, is associated with a greater variability in outcomes and a higher frequency of recession of > 1 mm of the midfacial mucosa (eight studies; range 9% to 41% and median 26% of sites, 1 to 3 years after placement) compared to early (type 2 and type 3) implant placement (2 studies; no sites with recession > 1 mm). In two retrospective studies of immediate (type 1) implant placement with bone graft, the facial bone wall was not detectable on cone beam CT in 36% and 57% of sites. These sites had more recession of the midfacial mucosa compared to sites with detectable facial bone. Two studies of early implant placement (types 2 and 3) combined with simultaneous bone augmentation with GBR (contour augmentation) demonstrated a high frequency (above 90%) of facial bone wall visible on CBCT. Recent studies of immediate (type 1) placement imposed specific selection criteria, including thick tissue biotype and an intact facial socket wall, to reduce esthetic risk. There were no specific selection criteria for early (type 2 and type 3) implant placement. CONCLUSIONS Acceptable esthetic outcomes may be achieved with implants placed after extraction of teeth in the maxillary anterior and premolar areas of the dentition. Recession of the midfacial mucosa is a risk with immediate (type 1) placement. Further research is needed to investigate the most suitable biomaterials to reconstruct the facial bone and the relationship between long-term mucosal stability and presence/absence of the facial bone, the thickness of the facial bone, and the position of the facial bone crest.
Abstract:
BACKGROUND AND AIMS We investigated the association between significant liver fibrosis, determined by the AST-to-platelet ratio index (APRI), and all-cause mortality among HIV-infected patients prescribed antiretroviral therapy (ART) in Zambia. METHODS Among HIV-infected adults who initiated ART, we categorized baseline APRI scores according to established thresholds for significant hepatic fibrosis (APRI ≥1.5) and cirrhosis (APRI ≥2.0). Using multivariable logistic regression we identified risk factors for elevated APRI, including demographic characteristics, body mass index (BMI), HIV clinical and immunologic status, and tuberculosis. In the subset tested for hepatitis B surface antigen (HBsAg), we investigated the association of hepatitis B virus co-infection with APRI score. Using Kaplan-Meier analysis and Cox proportional hazards regression we determined the association of elevated APRI with death during ART. RESULTS Among 20,308 adults in the analysis cohort, 1,027 (5.1%) had significant liver fibrosis at ART initiation, including 616 (3.0%) with cirrhosis. Risk factors for significant fibrosis or cirrhosis included male sex, BMI <18, WHO clinical stage 3 or 4, CD4+ count <200 cells/mm³, and tuberculosis. Among the 237 (1.2%) who were tested, HBsAg-positive patients had four times the odds (adjusted odds ratio, 4.15; 95% CI, 1.71-10.04) of significant fibrosis compared with HBsAg-negative patients. Both significant fibrosis (adjusted hazard ratio 1.41, 95% CI, 1.21-1.64) and cirrhosis (adjusted hazard ratio 1.57, 95% CI, 1.31-1.89) were associated with increased all-cause mortality. CONCLUSION Liver fibrosis may be a risk factor for mortality during ART among HIV-infected individuals in Africa. APRI is an inexpensive and potentially useful test for liver fibrosis in resource-constrained settings.
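APRI is computed from routine laboratory values as (AST / upper limit of normal) divided by the platelet count (10⁹/L), multiplied by 100, with the thresholds used above (≥1.5 significant fibrosis, ≥2.0 cirrhosis). A minimal sketch with hypothetical laboratory values:

```python
# Sketch of the AST-to-platelet ratio index (APRI) and the thresholds cited in
# the abstract. The AST upper limit of normal (ULN) is laboratory-dependent;
# the example values below are hypothetical.

def apri(ast_iu_l: float, ast_uln_iu_l: float, platelets_10e9_l: float) -> float:
    """APRI = (AST / ULN) / platelet count (10^9/L) * 100."""
    return (ast_iu_l / ast_uln_iu_l) / platelets_10e9_l * 100.0

def fibrosis_category(score: float) -> str:
    if score >= 2.0:
        return "cirrhosis (APRI >= 2.0)"
    if score >= 1.5:
        return "significant fibrosis (APRI >= 1.5)"
    return "no significant fibrosis"

if __name__ == "__main__":
    score = apri(ast_iu_l=120, ast_uln_iu_l=40, platelets_10e9_l=150)  # hypothetical labs
    print(f"APRI = {score:.2f} -> {fibrosis_category(score)}")          # APRI = 2.00 -> cirrhosis
```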
Abstract:
AIM The optimal duration of dual antiplatelet therapy (DAPT) following the use of new-generation drug-eluting stents is unknown. METHODS AND RESULTS The association between DAPT interruption and the rates of stent thrombosis (ST) and cardiac death/target-vessel myocardial infarction (CD/TVMI) in patients receiving a Resolute zotarolimus-eluting stent (R-ZES) was analysed in 4896 patients from the pooled RESOLUTE clinical programme. Daily acetylsalicylic acid (ASA) and a thienopyridine were prescribed for 6-12 months. A DAPT interruption was defined as any interruption of ASA and/or a thienopyridine of >1 day; long interruptions were >14 days. Three groups were analysed: no interruption, interruption during the first month, and interruption during months >1-12. There were 1069 (21.83%) patients with a DAPT interruption and 3827 patients with no interruption. Among the 166 patients in the 1-month interruption group, 6 definite/probable ST events occurred (3.61%; all long DAPT interruptions), and among the 903 patients in the >1-12 months interruption group (60% of these interruptions occurred between 6 and 12 months), 1 ST event occurred (0.11%; 2-day DAPT interruption). Among patients with no DAPT interruption, 32 ST events occurred (0.84%). Rates of CD/TVMI were 6.84% in the 1-month long-interruption group, 1.41% in the >1-12 months long-interruption group, and 4.08% in patients on continuous DAPT. CONCLUSION In a pooled population of patients receiving an R-ZES, DAPT interruptions within 1 month are associated with a high risk of adverse outcomes. Dual antiplatelet therapy interruptions between 1 and 12 months were associated with low rates of ST and adverse cardiac outcomes. Randomized clinical trials are needed to determine whether early temporary or permanent interruption of DAPT is truly safe. CLINICALTRIALS.GOV IDENTIFIERS NCT00617084; NCT00726453; NCT00752128; NCT00927940.
Abstract:
QUESTIONS UNDER STUDY To improve the response to deteriorating patients during their hospital stay, the University Hospital Bern has introduced a Medical Emergency Team (MET). The aim of this retrospective cohort study was to review the preceding factors, patient characteristics and process parameters of MET calls since the introduction of the team, and their correlation with patient outcomes. METHODS Data on patient characteristics, parameters related to MET activation and intervention, and patient outcomes were evaluated. A Vital Sign Score (VSS), defined as the sum of the vital sign abnormalities present, was calculated from all physiological parameters before and during the MET event and correlated with hospital outcome. RESULTS A total of 1,628 MET calls occurred in 1,317 patients; 262 (19.9%) of the patients with MET calls died during their hospital stay. The VSS before the MET event (odds ratio [OR] 1.78, 95% confidence interval [CI] 1.50-2.13; AUROC 0.63; all p <0.0001) and during the MET call (OR 1.60, 95% CI 1.41-1.83; AUROC 0.62; all p <0.0001) were significantly correlated with patient outcomes. A significant increase in MET calls, from 5.2 to 16.5 per 1000 hospital admissions (p <0.0001), and a decrease in cardiac arrest calls in the MET perimeter, from 1.6 per 1000 admissions in 2008 to 0.8 per 1000 admissions, were observed during the study period (p = 0.014). CONCLUSIONS The VSS is a significant predictor of mortality in patients assessed by the MET. Increasing MET utilisation coincided with a decrease in cardiac arrest calls in the MET perimeter.
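The VSS described above is a simple count of abnormal vital signs. The sketch below illustrates the idea with hypothetical abnormality thresholds; the study's actual criteria are not given in the abstract.

```python
# Minimal sketch of a Vital Sign Score as described in the abstract: one point
# per abnormal vital sign, summed. The thresholds below are hypothetical
# illustrations, not the study's criteria.

def vital_sign_score(hr: float, sbp: float, rr: float, spo2: float, temp: float) -> int:
    abnormalities = [
        hr < 40 or hr > 130,        # heart rate, /min
        sbp < 90,                   # systolic blood pressure, mmHg
        rr < 8 or rr > 30,          # respiratory rate, /min
        spo2 < 90,                  # oxygen saturation, %
        temp < 35.0 or temp > 39.0, # temperature, degrees C
    ]
    return sum(abnormalities)

if __name__ == "__main__":
    # Hypothetical patient at the time of the MET call
    print(vital_sign_score(hr=140, sbp=85, rr=22, spo2=88, temp=37.2))  # VSS = 3
```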