987 results for NEW-ONSET


Relevance: 60.00%

Abstract:

Background. The incidence of Clostridium difficile-associated diarrhea (CDAD) is increasing worldwide, likely because of the increased use of broad-spectrum antibiotics and the introduction of a clonal hyper-virulent strain called the BI strain. Short-term complications of CDAD include recurrent disease, requirement for colectomy, and persistent disease. However, data on the long-term consequences of CDAD are scarce. After other enteric infections (Shigella, Salmonella, and Campylobacter), long-term consequences such as irritable bowel syndrome (IBS), chronic dyspepsia/diarrhea, and other GI effects have been noted. Since the mechanism of action of these agents is similar to that of C. difficile, we hypothesized that patients with CDAD have a greater likelihood of developing IBS and other functional gastrointestinal disorders (FGIDs) in the long term than a general sample of recently hospitalized patients. Objective. To evaluate the long-term gastrointestinal complications of CDAD (IBS, functional diarrhea, functional abdominal bloating, functional constipation and functional abdominal pain syndrome). Methods. The current study was a secondary analysis of a previously completed observational case-control outcome study. Adult CDAD patients at St. Luke's Episcopal Hospital, Houston (SLEH) were followed up, interviewed by telephone six months after the initial diagnosis, and evaluated for the development of IBS and other FGIDs. A total of 46 patients with CDAD were recruited at SLEH between May and November 2007. The comparators were patients hospitalized at SLEH within one month before or after the admission of the reference case, with a hospital length of stay within one week of that of the reference case and an age within 10 years of the reference case. Cases and comparators were compared using Fisher's exact test. A p<0.05 was considered significant. Results. Thirty CDAD patients responded to the questionnaires and were compared to 40 comparators. No comparator developed a FGID, while 3 (10%) CDAD patients developed new-onset IBS (p=0.07), 4 (13.3%) developed new-onset functional diarrhea (p=0.03), and 3 (10%) developed new-onset functional constipation (p=0.07). No patient developed functional abdominal bloating or functional abdominal pain syndrome. Conclusion. In this study, new-onset functional diarrhea was significantly more common in patients with CDAD within six months after the initial infection than in matched controls.
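The case-control comparisons above rest on Fisher's exact test applied to small 2x2 tables. A minimal sketch, assuming the counts reported in the abstract (4/30 CDAD patients vs. 0/40 comparators for functional diarrhea, 3/30 vs. 0/40 for IBS), reproduces p-values close to those reported:

```python
from scipy.stats import fisher_exact

# 2x2 tables built from the counts reported in the abstract:
# rows = CDAD cases / comparators, columns = developed FGID / did not
functional_diarrhea = [[4, 26], [0, 40]]
ibs = [[3, 27], [0, 40]]

for name, table in [("functional diarrhea", functional_diarrhea), ("IBS", ibs)]:
    _, p = fisher_exact(table, alternative="two-sided")
    print(f"{name}: two-sided Fisher's exact p = {p:.3f}")
# Prints roughly p ≈ 0.03 and p ≈ 0.07, in line with the abstract.
```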

Relevance: 60.00%

Abstract:

Background: Congestive heart failure (CHF) is the most frequent and prognostically severe symptom of aortic stenosis (AS), and the most common indication for surgery. The mainstay of treatment for AS is aortic valve replacement (AVR), and the main indication for an AVR is the development of symptomatic disease. ACC/AHA guidelines define severe AS as an aortic valve area (AVA) ≤1 cm², but there are limited data correlating echocardiographic AVA with the onset of symptomatic CHF. We evaluated the risk of developing CHF with progressively decreasing echocardiographic AVA. We also compared echocardiographic AVA with jet velocity (V2) and indexed AVA (AVAI) to assess the best predictor of the development of symptomatic CHF. Methods and Results: This retrospective cohort study evaluated 518 patients with asymptomatic moderate or severe AS from a single community-based cardiology practice. A total of 925 echocardiograms were performed over an 11-year period. Each echocardiogram was correlated with concurrent clinical assessments while the investigator was blinded to the echocardiographic severity of AS. A Cox proportional hazards model was used to analyze the relationship between AVA and the development of CHF. The median age at entry was 76.1 years, and 54% of patients were male. A total of 116 patients (21.8%) developed new-onset CHF during follow-up. Compared to patients with AVA >1.0 cm², patients with lower AVA had an exponentially increasing risk of developing CHF for each 0.2 cm² decrement in AVA, becoming statistically significant only at an AVA less than 0.8 cm². Also, compared to V2 and AVAI, AVA added more information for assessing the risk of development of CHF (p=0.041). Conclusion: In patients with normal or mildly impaired LVEF, the risk of CHF rises exponentially with decreasing valve area and becomes statistically significant after AVA falls below 0.8 cm². AVA is a better predictor of CHF than V2 or AVAI.
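The central analysis here is a Cox proportional hazards model relating AVA to time to new-onset CHF. A minimal sketch of that kind of model, assuming a hypothetical data frame with `time_to_chf`, `chf_event` and `ava_cm2` columns (names invented for illustration) and using the lifelines library (not necessarily the software used in the study):

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical columns: follow-up time in years, CHF event indicator (1/0),
# and aortic valve area in cm² at the index echocardiogram.
df = pd.DataFrame({
    "time_to_chf": [2.1, 4.5, 1.3, 6.0, 3.2, 0.8],
    "chf_event":   [1,   0,   1,   0,   0,   1],
    "ava_cm2":     [0.7, 1.1, 0.6, 1.3, 0.9, 0.8],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_to_chf", event_col="chf_event")
cph.print_summary()  # hazard ratio per 1 cm² change in AVA; the study binned AVA in 0.2 cm² steps
```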

Relevance: 60.00%

Abstract:

PURPOSE: To describe and follow cotton wool spots (CWS) in branch retinal vein occlusion (BRVO) using multimodal imaging. METHODS: In this prospective cohort study including 24 patients with new-onset BRVO, CWS were described and analyzed on color fundus photography (CF), spectral-domain optical coherence tomography (SD-OCT), infrared imaging (IR) and fluorescein angiography (FA) every 3 months for 3 years. The CWS area on SD-OCT and CF was evaluated using the OCT-Tool-Kit software: CWS were marked in each single OCT B-scan and the software calculated the area by interpolation. RESULTS: Twenty-nine central CWS lesions were found. All of these CWS (100%) were visible on SD-OCT and on FA, 86.2% on IR imaging, but only 65.5% on CF imaging. CWS were visible for 12.4 ± 7.5 months on SD-OCT, for 4.4 ± 3 months and 4.3 ± 3.4 months on CF and IR, respectively, and for 17.5 ± 7.1 months on FA. The CWS area evaluated on SD-OCT was larger than on CF (0.26 ± 0.17 mm² vs. 0.13 ± 0.1 mm², p < 0.0001). The CWS area on SD-OCT and surrounding pathology such as intraretinal cysts, avascular zones and intraretinal hemorrhage predicted how long CWS remained visible (r² = 0.497, p < 0.002). CONCLUSIONS: The lifetime and presentation of CWS in BRVO seem comparable to those in other diseases. SD-OCT shows a higher sensitivity for detecting CWS than CF. The duration of visibility of CWS varies among imaging modalities and depends on the surrounding pathology and the CWS size.
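The per-lesion area on SD-OCT was obtained by marking the lesion in each B-scan and letting the software interpolate across scans. A minimal sketch of that idea, assuming hypothetical lesion widths (in mm) on consecutive B-scans with a known scan spacing (this is not the OCT-Tool-Kit implementation itself), is a trapezoidal integration of width over scan position:

```python
# Hypothetical CWS widths (mm) marked on consecutive OCT B-scans,
# with a fixed spacing between scans (mm); values are illustrative only.
widths = [0.00, 0.18, 0.35, 0.42, 0.30, 0.12, 0.00]
scan_spacing = 0.12  # mm between adjacent B-scans (assumed value)

# Trapezoidal interpolation of lesion width across scans gives an en-face area.
area_mm2 = sum(
    0.5 * (w0 + w1) * scan_spacing
    for w0, w1 in zip(widths, widths[1:])
)
print(f"interpolated CWS area ≈ {area_mm2:.3f} mm²")
```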

Relevance: 60.00%

Abstract:

With the development of an insulin autoantibody (IAA) assay performed in 96-well filtration plates, we have prospectively evaluated the development of IAA in NOD mice (from 4 weeks of age) and in children (from 7 to 10 months of age) at genetic risk for the development of type 1 diabetes. NOD mice had heterogeneous expression of IAA despite being inbred. IAA reached a peak between 8 and 16 weeks and then declined. IAA expression by NOD mice at 8 weeks of age was strongly associated with early development of diabetes, which occurred at 16–18 weeks of age (83% (5/6) of mice IAA-positive at 8 weeks were diabetic by 18 weeks versus 11% (1/9) of mice IAA-negative at 8 weeks; P < .01). In humans, IAA was frequently present as early as 9 months of age, the first sampling time. Of five children found to have persistent IAA before 1 year of age, four have progressed to diabetes (all before 3.5 years of age) and the fifth is currently younger than 2 years of age. Of the 929 children not expressing persistent IAA before age 1, only one has progressed to diabetes to date (onset at age 3), and this child expressed IAA at his second visit (age 1.1). In new-onset patients, higher levels of IAA correlated with an earlier age of diabetes onset. Our data suggest that the program for developing diabetes in NOD mice and humans is relatively “fixed” early in life and, for NOD mice, a high risk of early development of diabetes is often determined by 8 weeks of age. With such early determination of a high risk of progression to diabetes, immunologic therapies in humans may need to be tested in children before the development of IAA for maximal efficacy.
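The association between IAA positivity at 8 weeks and early diabetes can be summarized as a relative risk from the counts given above (5/6 IAA-positive vs. 1/9 IAA-negative mice diabetic by 18 weeks); a minimal sketch of that arithmetic:

```python
# Counts reported in the abstract for NOD mice.
diabetic_iaa_pos, total_iaa_pos = 5, 6   # IAA+ at 8 weeks
diabetic_iaa_neg, total_iaa_neg = 1, 9   # IAA- at 8 weeks

risk_pos = diabetic_iaa_pos / total_iaa_pos    # ≈ 0.83
risk_neg = diabetic_iaa_neg / total_iaa_neg    # ≈ 0.11
relative_risk = risk_pos / risk_neg            # ≈ 7.5

print(f"risk of early diabetes if IAA+ at 8 weeks: {risk_pos:.0%}")
print(f"risk of early diabetes if IAA- at 8 weeks: {risk_neg:.0%}")
print(f"relative risk: {relative_risk:.1f}")
```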

Relevance: 60.00%

Abstract:

Lamotrigine (LTG) is a phenyltriazine drug used in the treatment of generalized and focal epileptic seizures and as adjunctive therapy in refractory epilepsy. Because of the high interindividual variability, drug interactions and adverse effects observed during LTG administration, therapeutic drug monitoring of patients taking this drug is necessary to adjust the dose individually and to avoid adverse effects. The aim of this work was therefore to evaluate two microextraction techniques, hollow-fiber liquid-phase microextraction (HF-LPME) and dispersive liquid-liquid microextraction (DLLME), for the analysis of lamotrigine in plasma samples from epileptic patients. The electrophoretic conditions were defined first: a fused-silica capillary of 75 µm internal diameter and 50 cm effective length was used. The background electrolyte (BGE) consisted of 130 mmol/L 2-(N-morpholino)ethanesulfonic acid (MES) at pH 5.0. Analyses were carried out at 20 °C and 15 kV. The sample was injected hydrodynamically (0.5 psi for 10 s) and detection was performed at 214 nm. Under these conditions, LTG and the internal standard (IS), lidocaine, could be analyzed in less than 7 minutes. HF-LPME was evaluated in the three-phase mode, using 500 µL of plasma and 3.5 mL of 50 mmol/L sodium phosphate solution, pH 9.0, as the donor phase. The solvent used to impregnate the fiber was 1-octanol. The acceptor phase was 60 µL of hydrochloric acid solution, pH 4.0. For the DLLME evaluation, a sample pretreatment step (500 µL of plasma with 1 mL of acetonitrile) was required. Then, 1.3 mL of the supernatant was added to 4 mL of 50 mmol/L sodium phosphate solution, pH 9.0; 120 µL of chloroform (extraction solvent) was injected into this aqueous sample, and 165 µL of the sedimented phase was recovered. The analytical performance of both methods was evaluated: linearity was obtained over the plasma concentration range of 1–20 µg/mL, with a lower limit of quantification (LLOQ) of 1 µg/mL. Precision and accuracy assays met the requirements of official guidelines. In addition, the methods were selective, showed no carryover, and the samples were stable. Recoveries were 54.3% and 23% for HF-LPME and DLLME, respectively. The validated methods were successfully applied to plasma samples from epileptic patients treated with LTG. The two techniques were also compared, and HF-LPME showed advantages over DLLME, proving to be a promising technique for the analysis of complex matrices, with reduced organic solvent consumption and the possibility of automation.
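Quantification against the lidocaine internal standard relies on a calibration curve over the validated 1–20 µg/mL range. A minimal sketch, assuming hypothetical analyte/IS peak-area ratios for the calibration standards (the real calibration data are not given in the abstract), fits the line by least squares and back-calculates an unknown:

```python
import numpy as np

# Hypothetical calibration standards (µg/mL) spanning the validated range
# and invented LTG / lidocaine (IS) peak-area ratios.
conc = np.array([1.0, 2.5, 5.0, 10.0, 15.0, 20.0])
area_ratio = np.array([0.11, 0.27, 0.52, 1.05, 1.58, 2.09])

slope, intercept = np.polyfit(conc, area_ratio, 1)

# Back-calculate a patient sample from its measured area ratio.
sample_ratio = 0.80
sample_conc = (sample_ratio - intercept) / slope
print(f"estimated plasma LTG ≈ {sample_conc:.1f} µg/mL")
```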

Relevance: 60.00%

Abstract:

Background: The immigrant population living in Spain grew exponentially in the early 2000s but has been particularly affected by the economic crisis. This study aims to analyse health inequalities between immigrants born in middle- or low-income countries and natives in Spain, in 2006 and 2012, taking into account gender, year of arrival and socioeconomic exposures. Methods: Study of trends using two cross-sections, the 2006 and 2012 editions of the Spanish National Health Survey, including residents in Spain aged 15–64 years (20,810 natives and 2,950 immigrants in 2006; 14,291 natives and 2,448 immigrants in 2012). Fair/poor self-rated health, poor mental health (GHQ-12 > 2), chronic activity limitation and use of psychotropic drugs were compared between natives and immigrants who arrived in Spain before 2006, adjusting robust Poisson regression models for age and socioeconomic variables to obtain prevalence ratios (PR) and 95% confidence intervals (CI). Results: Inequalities in poor self-rated health between immigrants and natives tended to increase among women (age-adjusted PR2006 = 1.39; 95% CI: 1.24–1.56, PR2012 = 1.56; 95% CI: 1.33–1.82). Among men, inequalities in poor mental health newly emerged (PR2006 = 1.10; 95% CI: 0.86–1.40, PR2012 = 1.34; 95% CI: 1.06–1.69) and the previously lower use of psychotropic drugs equalized (PR2006 = 0.22; 95% CI: 0.11–0.43, PR2012 = 1.20; 95% CI: 0.73–2.01). Conclusions: Between 2006 and 2012, the health status of immigrants who arrived in Spain before 2006 appeared to worsen relative to that of natives. The loss of the healthy immigrant effect, in the context of a harsher impact of the economic crisis on immigrants, appears to be a potential explanation. Employment, social protection and the re-universalization of healthcare would prevent further deterioration of immigrants’ health status.
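The prevalence ratios above come from Poisson regression with robust (sandwich) variance, a standard way to estimate PRs directly for common binary outcomes. A minimal sketch, assuming a hypothetical data frame with `poor_health`, `immigrant` and `age` columns (names and data invented for illustration), using statsmodels:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical individual-level data: binary outcome, immigrant indicator, age.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "poor_health": rng.integers(0, 2, size=500),
    "immigrant":   rng.integers(0, 2, size=500),
    "age":         rng.integers(15, 65, size=500),
})

# Poisson GLM with robust (HC1) standard errors; exponentiated coefficients
# are prevalence ratios (PR) with 95% confidence intervals.
fit = smf.glm("poor_health ~ immigrant + age", data=df,
              family=sm.families.Poisson()).fit(cov_type="HC1")
pr = np.exp(fit.params["immigrant"])
ci = np.exp(fit.conf_int().loc["immigrant"])
print(f"PR = {pr:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```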

Relevance: 60.00%

Abstract:

Relaxation of the upper age limits for solid organ transplantation, coupled with improvements in post-transplant survival, has resulted in greater numbers of elderly patients receiving immunosuppressant drugs such as tacrolimus. Tacrolimus is a potent agent with a narrow therapeutic window and large inter- and intraindividual pharmacokinetic variability. Numerous physiological changes occur with aging that could potentially affect the pharmacokinetics of tacrolimus and, hence, patient dosage requirements. Tacrolimus is primarily metabolised by cytochrome P450 (CYP) 3A enzymes in the gut wall and liver. It is also a substrate for P-glycoprotein, which counter-transports diffused tacrolimus out of intestinal cells and back into the gut lumen. Age-associated alterations in CYP3A and P-glycoprotein expression and/or activity, along with changes in liver mass and body composition, would be expected to affect the pharmacokinetics of tacrolimus in the elderly. However, interindividual variation in these processes may mask any changes caused by aging. More investigation is needed into the impact aging has on CYP and P-glycoprotein activity and expression. No single-dose, intensive blood-sampling study has specifically compared the pharmacokinetics of tacrolimus across different patient age groups. However, five population pharmacokinetic studies, one in kidney, one in bone marrow and three in liver transplant recipients, have investigated age as a covariate. None found a significant influence of age on tacrolimus bioavailability, volume of distribution or clearance. The number of elderly patients included in each study, however, was not documented and may have been small. It is likely that the inter- and intraindividual pharmacokinetic variability associated with tacrolimus increases in elderly populations. In addition to pharmacokinetic differences, donor organ viability, multiple co-morbidities, polypharmacy and immunological changes need to be considered when using tacrolimus in the elderly. Aging is associated with decreased immunoresponsiveness, a slower body repair process and increased drug adverse effects. Elderly liver and kidney transplant recipients are more likely to develop new-onset diabetes mellitus than younger patients. Elderly transplant recipients exhibit higher mortality from infectious and cardiovascular causes than younger patients but may be less likely to develop acute rejection. Elderly kidney recipients have a higher potential for chronic allograft nephropathy, and a single rejection episode can be more devastating. There is a paucity of information on optimal tacrolimus dosage and target trough concentrations in the elderly. The therapeutic window for tacrolimus concentrations may be narrower. Further integrated pharmacokinetic-pharmacodynamic studies of tacrolimus are required. It would appear reasonable, based on current knowledge, to commence tacrolimus at similar doses to those used in younger patients. Maintenance dose requirements over the longer term may be lower in the elderly, but the increased variability in kinetics and the variety of factors that impact on dosage suggest that patient care needs to be based around more frequent monitoring in this age group.
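The pharmacokinetic quantities discussed here (bioavailability, volume of distribution, clearance) enter a standard one-compartment oral-absorption model. A minimal sketch with invented, non-tacrolimus-specific parameter values shows how a concentration-time profile follows from those parameters:

```python
import numpy as np

# Illustrative one-compartment, first-order absorption model.
# Parameter values are invented for demonstration, not tacrolimus estimates.
F = 0.25      # bioavailability (fraction absorbed)
dose = 5.0    # mg
V = 100.0     # volume of distribution (L)
CL = 30.0     # clearance (L/h)
ka = 1.5      # absorption rate constant (1/h)
ke = CL / V   # elimination rate constant (1/h)

t = np.linspace(0, 12, 7)  # hours post-dose
conc = (F * dose * ka) / (V * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))
for ti, ci in zip(t, conc * 1000):  # convert mg/L to ng/mL
    print(f"t = {ti:4.1f} h   C ≈ {ci:5.1f} ng/mL")
```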

Relevance: 60.00%

Abstract:

Atrial fibrillation (AF) is the most common sustained cardiac arrhythmia. Some non-coding RNAs (miRNAs) have been implicated in the regulation of arrhythmogenesis, targeting genes that contribute to the development of AF. The present study aimed to evaluate the expression of candidate miRNAs in plasma from patients with AF and new-onset AF and their potential application as future markers for the diagnosis and monitoring of the disease. miR-21, miR-133a, miR-133b, miR-150, miR-328 and miR-499 were selected as targets through a prior literature review. They were isolated from the plasma of individuals aged 20 to 85 years with AF (n = 17), new-onset AF (n = 5) and without AF (n = 15), the latter serving as the control group. The results were analyzed by real-time PCR (RT-PCR) with miScript SYBR Green PCR. miR-21, miR-133b, miR-328 and miR-499 showed different levels of expression among the three groups (p < 0.05), with increased expression of miR-21 (0.6-fold), miR-133b (1.4-fold), miR-328 (2.0-fold) and miR-499 (2.3-fold) in patients with new-onset AF compared with AF patients and control subjects. miR-133a and miR-150 expression did not differ among the groups. miR-21, miR-133b, miR-328 and miR-499 may be potential biomarkers for AF as well as for new-onset AF, both for monitoring and for diagnosis. These findings may contribute to the understanding of the process that triggers AF and suggest the application of these molecules as future biomarkers for AF.
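Fold changes of this kind are typically derived from qPCR Ct data with the 2^(-ΔΔCt) method; the abstract does not state the exact normalization used, so the following is only a hedged illustration with invented Ct values and an assumed endogenous reference:

```python
# Illustrative 2^(-ΔΔCt) fold-change calculation for one miRNA.
# Ct values and the reference assay are invented for demonstration only.
ct_target_case, ct_ref_case = 27.1, 20.0   # e.g. miR-328 in a new-onset AF sample
ct_target_ctrl, ct_ref_ctrl = 28.4, 20.3   # same miRNA in a control sample

delta_ct_case = ct_target_case - ct_ref_case
delta_ct_ctrl = ct_target_ctrl - ct_ref_ctrl
fold_change = 2 ** -(delta_ct_case - delta_ct_ctrl)

print(f"fold change vs. control ≈ {fold_change:.1f}")  # ≈ 2.0 with these numbers
```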


Relevance: 60.00%

Abstract:

The Singapore Cohort Study of the Risk Factors for Myopia (SCORM) is a longitudinal school-based study that recruited 1979 children aged 7 to 9 years between 1999 and 2001, who were re-examined as adolescents in 2006 and 2007. The current study aimed to determine the prevalence, incidence and progression of myopia among Singapore teenagers and to describe trends in the SCORM cohort.

At each visit, participants underwent comprehensive eye examinations that included cycloplegic autorefraction and ocular biometry measurements. The prevalence of myopia (SE < -0.5 D) and high myopia (SE < -6.0 D) among Singapore teenagers aged 11-18 years was 69.1% [95% confidence interval (CI) 66.5-71.7] and 7.1% (95% CI 5.8-8.7), respectively, with the highest prevalence in people of Chinese ethnicity (p<0.001). The annual incidence was 13.7% (95% CI 9.8-17.6). Males had twice the incidence of females (p=0.043), and adolescents with longer axial lengths (p<0.001) and deeper vitreous chambers (p<0.001) had a higher incidence of myopia. Annual myopia progression was -0.32 diopters (D) (SD=0.40), with no difference by age, race or gender. However, adolescents with higher myopia levels in 2006 had significantly faster myopia progression rates (p<0.001).

Myopia prevalence in Singapore teenagers, especially Singapore Chinese teenagers, is among the highest in the world. In adolescents, there is still a high rate of new-onset myopia and rapid myopia progression. These findings indicate that adolescence may still represent a viable period for intervention programs to mitigate myopia onset and progression.
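Prevalence estimates of this kind carry binomial confidence intervals. A minimal sketch, with an assumed denominator of 1,200 re-examined teenagers (the exact follow-up n is not stated in this summary), gives a Wilson interval of roughly the width reported above:

```python
from statsmodels.stats.proportion import proportion_confint

# Assumed counts for illustration: ~69.1% myopic among ~1,200 teenagers re-examined.
n_myopic, n_total = 829, 1200

prevalence = n_myopic / n_total
low, high = proportion_confint(n_myopic, n_total, alpha=0.05, method="wilson")
print(f"prevalence = {prevalence:.1%} (95% CI {low:.1%}-{high:.1%})")
```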

Relevance: 60.00%

Abstract:

Preeclampsia (PE) is a pregnancy complication characterized by new-onset hypertension and proteinuria after 20 weeks of gestation. However, subclinical renal dysfunction may be apparent earlier in gestation, prior to the clinical presentation of PE. Although the maternal syndrome of PE resolves early postpartum, women with a history of PE are at higher risk of renal dysfunction later in life. Mineral metabolism, such as phosphate balance, is heavily dependent on renal function; yet phosphate handling in women with a history of PE is largely unknown. To investigate whether women with a history of PE would exhibit changes in phosphate metabolism compared to healthy parous women, a phosphate loading test was used. Women with or without a history of PE, who were 6 months to 5 years postpartum, were recruited for this study. Blood and urine samples were collected before and after oral dosing with a 500 mg phosphate solution. Biochemical markers of phosphate metabolism and renal function were evaluated. In order to assess the difference in renal function between first-trimester women who were or were not destined to develop PE, plasma cystatin C concentration was analysed. After phosphate loading, women with a history of PE had significantly elevated serum phosphate at both 1 and 2 hours, while controls had a higher urinary phosphate:creatinine excretion ratio at 1 hour than women with a history of PE. Women with a history of PE had no changes in intact parathyroid hormone (iPTH) concentration throughout the study period, whereas controls had elevated iPTH at 1 hour relative to baseline. In terms of renal function in the first trimester, there was no difference in plasma cystatin C concentration between women who were or were not destined to develop PE. The elevation of serum phosphate in women with a history of PE could be due to a delay in phosphate excretion. Prolonged elevation of serum phosphate can have serious consequences later in life. Thus, an oral phosphate challenge may serve as a useful method of early screening for altered phosphate metabolism and renal function.


Relevance: 60.00%

Abstract:

AIMS: Heart failure has been shown in previous studies to have a dismal prognosis. However, the modern-day prognosis of patients with new-onset heart failure diagnosed in the community and managed within a disease management programme is not known. The purpose of this study is to report on the prognosis of patients presenting with new-onset heart failure in the community who are subsequently followed in a disease management programme.

METHODS AND RESULTS: A review of patients referred to a rapid access heart failure diagnostic clinic between 2002 and 2012 was undertaken. Details of diagnosis, demographics, medical history, medications, investigations and mortality data were analysed. A total of 733 patients were seen in the rapid access clinic for a potential new diagnosis of heart failure. Of these, 38.9% (n=285) were diagnosed with heart failure; 40.7% (n=116) had HF-REF and 59.3% (n=169) had HF-PEF. There were 84 (29.5%) deaths in the group of patients diagnosed with heart failure; 41 deaths (35.3%) occurred in patients with HF-REF and 43 deaths (25.4%) occurred in patients with HF-PEF. Of the patients with heart failure who died, 52.4% (n=44) died from cardiovascular causes. Overall, 63.8% of heart failure patients were alive after 5 years, corresponding on average to the loss of about one month of life expectancy per year over that period compared with an age-matched simulated population.
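Five-year survival figures like the one above are typically read off a Kaplan-Meier curve fitted to follow-up times and death indicators. A minimal sketch, assuming hypothetical `followup_years` and `died` columns (the study's individual-level data are not shown here) and using the lifelines library:

```python
import pandas as pd
from lifelines import KaplanMeierFitter

# Hypothetical follow-up data for a handful of heart failure patients.
df = pd.DataFrame({
    "followup_years": [1.2, 5.0, 3.4, 0.8, 5.0, 2.6, 4.1, 5.0],
    "died":           [1,   0,   1,   1,   0,   0,   1,   0],
})

kmf = KaplanMeierFitter()
kmf.fit(durations=df["followup_years"], event_observed=df["died"])
surv_5y = kmf.survival_function_at_times(5.0).iloc[0]  # estimated P(alive at 5 years)
print(f"estimated 5-year survival ≈ {surv_5y:.1%}")
```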

CONCLUSIONS: In this community-based cohort, the prognosis of heart failure was better than reported in previous studies. This is likely due to the impact of prompt diagnosis, improvements in therapies, and care within a disease management structure.

Relevance: 60.00%

Abstract:

In an attempt to reduce the heart failure epidemic, screening and prevention will become an increasing focus of management in the wider at-risk population. Refining risk prediction through the use of biomarkers, in isolation or in combination, is emerging as a critical step in this process. The utility of biomarkers to identify disease manifestations before the onset of symptoms and detrimental myocardial damage is proving to be valuable. In addition, biomarkers that predict the likelihood and rate of disease progression over time will help streamline and focus clinical efforts and therapeutic strategies. Importantly, several recent early intervention studies using biomarker strategies are promising and indicate that not only can new-onset heart failure be reduced but also the development of other cardiovascular conditions.

Relevance: 60.00%

Abstract:

Aortic stenosis is the most common valvular heart disease in elderly patients. Once symptoms appear, patient survival decreases drastically in the absence of aortic valve replacement. However, a considerable proportion of these patients are not operated on because of high surgical risk, age being one of the main reasons for refusing surgical aortic valve replacement. This gap in the management of these patients has driven the development of transcatheter aortic valve replacement, or transcatheter aortic valve implantation (TAVR or TAVI), which has represented a revolution in the treatment of aortic stenosis. This intervention is now a routine treatment for patients with aortic stenosis at high surgical risk, even when cardiac surgery is not contraindicated. In recent years, the profile of potential candidates has shifted toward a lower-risk population. However, several concerns remain. One of the most important is the occurrence of arrhythmias and conduction disturbances, notably left bundle branch block and atrioventricular block, which are complications frequently associated with TAVR. Despite technological advances and the development of new devices that reduce the overall complication rate, no improvement has been incorporated to prevent the onset of these complications. Moreover, the use of some new-generation devices appears to be associated with an increased risk of conduction disturbances, and the incidence of these complications may therefore increase in the future. The impact and evolution of these complications, however, remain unknown. This research work evaluates the incidence and evolution of conduction disturbances after TAVR and the impact of new-onset left bundle branch block and pacemaker implantation on clinical and echocardiographic outcomes.