41 results for Administration system
at Université de Lausanne, Switzerland
Abstract:
BACKGROUND: Most clinical guidelines recommend that AIDS-free, HIV-infected persons with CD4 cell counts below 0.350 × 10^9 cells/L initiate combined antiretroviral therapy (cART), but the optimal CD4 cell count at which cART should be initiated remains a matter of debate. OBJECTIVE: To identify the optimal CD4 cell count at which cART should be initiated. DESIGN: Prospective observational data from the HIV-CAUSAL Collaboration and dynamic marginal structural models were used to compare cART initiation strategies for CD4 thresholds between 0.200 and 0.500 × 10^9 cells/L. SETTING: HIV clinics in Europe and the Veterans Health Administration system in the United States. PATIENTS: 20,971 HIV-infected, therapy-naive persons with baseline CD4 cell counts at or above 0.500 × 10^9 cells/L and no previous AIDS-defining illnesses, of whom 8392 had a CD4 cell count that decreased into the range of 0.200 to 0.499 × 10^9 cells/L and were included in the analysis. MEASUREMENTS: Hazard ratios and survival proportions for all-cause mortality and a combined end point of AIDS-defining illness or death. RESULTS: Compared with initiating cART at the CD4 cell count threshold of 0.500 × 10^9 cells/L, the mortality hazard ratio was 1.01 (95% CI, 0.84 to 1.22) for the 0.350 threshold and 1.20 (CI, 0.97 to 1.48) for the 0.200 threshold. The corresponding hazard ratios were 1.38 (CI, 1.23 to 1.56) and 1.90 (CI, 1.67 to 2.15), respectively, for the combined end point of AIDS-defining illness or death. LIMITATIONS: CD4 cell count at cART initiation was not randomized. Residual confounding may exist. CONCLUSION: Initiation of cART at a threshold CD4 count of 0.500 × 10^9 cells/L increases AIDS-free survival. However, mortality did not vary substantially with the use of CD4 thresholds between 0.300 and 0.500 × 10^9 cells/L.
Abstract:
Drug delivery is one of the most common clinical routines in hospitals and is critical to patients' health and recovery. It includes a decision-making process in which a medical doctor decides the amount (dose) and frequency (dose interval) on the basis of a set of available patient feature data and the doctor's clinical experience (a priori adaptation). This process can be computerized to make the prescription procedure fast, objective, inexpensive, non-invasive, and accurate. This paper proposes a Drug Administration Decision Support System (DADSS) to help clinicians/patients with initial dose computation. The system is based on a Support Vector Machine (SVM) algorithm for estimating the potential drug concentration in the blood of a patient, from which the best combination of dose and dose interval is selected at the level of the DSS. The addition of the RANdom SAmple Consensus (RANSAC) technique enhances prediction accuracy by selecting inliers for SVM modeling. Experiments performed on the drug imatinib as a case study show more than 40% improvement in prediction accuracy compared with previous works. An important extension to the patient feature data is also proposed in this paper.
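The RANSAC inlier-selection idea described above can be sketched as follows. This is a simplified illustration, not the paper's implementation: a linear model stands in for the SVM, and all data and parameters (sample size, tolerance, iteration count) are hypothetical. Points with distinct x-values are assumed.

```python
import random

def fit_line(points):
    # Ordinary least-squares fit of y = a*x + b over (x, y) pairs.
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return a, (sy - a * sx) / n

def ransac_fit(points, n_iter=200, tol=1.0, seed=0):
    # RANSAC: repeatedly fit a minimal random sample, keep the model
    # explaining the most inliers, then refit on those inliers only.
    # In the DADSS setting, the retained inliers would then be used to
    # train the final concentration model.
    rng = random.Random(seed)
    best_inliers = []
    for _ in range(n_iter):
        a, b = fit_line(rng.sample(points, 2))
        inliers = [(x, y) for x, y in points if abs(a * x + b - y) <= tol]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    return fit_line(best_inliers), best_inliers
```

The design point is that the outlier (e.g., a mis-recorded blood concentration) never influences the final fit: it is excluded before the refit, which is what improves the downstream prediction accuracy.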
Abstract:
Liposomes are vesicular lipidic systems allowing encapsulation of drugs. This article reviews the relevant issues in liposome structure (composition and size), and their influence on intravitreal pharmacokinetics. Liposome-mediated drug delivery to the posterior segment of the eye via intravitreal administration has been addressed by several authors and remains experimental. Liposomes have been used for intravitreal delivery of antibiotics, antivirals, antifungal drugs, antimetabolites, and cyclosporin. Encapsulation of these drugs within liposomes markedly increased their intravitreal half-life, and reduced their retinal toxicity. Liposomes have also shown an attractive potential for retinal gene transfer by intravitreal delivery of plasmids or oligonucleotides.
Abstract:
RATIONALE: A dysregulation of the hypothalamic-pituitary-adrenal (HPA) axis is a well-documented neurobiological finding in major depression. Moreover, clinically effective therapy with antidepressant drugs may normalize HPA axis activity. OBJECTIVE: The aim of this study was to test whether citalopram (R/S-CIT) affects the function of the HPA axis in patients with major depression (DSM IV). METHODS: Twenty depressed patients (11 women and 9 men) were challenged with a combined dexamethasone (DEX) suppression and corticotropin-releasing hormone (CRH) stimulation test (DEX/CRH test) following a placebo week and after 2, 4, and 16 weeks of 40 mg/day R/S-CIT treatment. RESULTS: The results show a time-dependent reduction of adrenocorticotrophic hormone (ACTH) and cortisol response during the DEX/CRH test, both in treatment responders and nonresponders, within 16 weeks. There was a significant relationship between post-DEX baseline cortisol levels (measured before administration of CRH) and severity of depression at pretreatment baseline. Multiple linear regression analyses were performed to identify the impact of psychopathology, hormonal stress responsiveness, and R/S-CIT concentrations in plasma and cerebrospinal fluid (CSF). The magnitude of the decrease in cortisol responsivity from pretreatment baseline to week 4 on drug [delta-area under the curve (AUC) cortisol] was a significant predictor (p<0.0001) of the degree of symptom improvement following 16 weeks on drug (i.e., decrease in HAM-D21 total score). The model demonstrated that the interaction of CSF S-CIT concentrations and clinical improvement was the most powerful predictor of AUC cortisol responsiveness. CONCLUSION: The present study shows that decreased AUC cortisol was highly associated with S-CIT concentrations in plasma and CSF.
Therefore, our data suggest that the CSF or plasma S-CIT concentration, rather than the R/S-CIT dose, should be considered an indicator of the effect of selective serotonin reuptake inhibitors (SSRIs) on HPA axis responsiveness as measured by the AUC cortisol response.
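The AUC summary used throughout this abstract is conventionally computed with the trapezoidal rule over the sampling times of the DEX/CRH test. A minimal sketch, with hypothetical sampling times and hormone values (the actual schedule and units are not given here):

```python
def auc_trapezoid(times, values):
    # Area under the hormone-response curve by the trapezoidal rule;
    # times in minutes, values in the assay's concentration units.
    return sum((times[i + 1] - times[i]) * (values[i] + values[i + 1]) / 2.0
               for i in range(len(times) - 1))

def delta_auc(times, baseline_values, on_drug_values):
    # Decrease in cortisol responsivity from pretreatment baseline
    # to the on-drug test (the delta-AUC cortisol predictor above).
    return auc_trapezoid(times, baseline_values) - auc_trapezoid(times, on_drug_values)
```

A flattened on-drug response curve thus yields a large positive delta-AUC, which in the study predicted greater symptom improvement.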
Abstract:
Abstract: The increasingly high hygienic standards characterizing westernized societies correlate with an increasingly high prevalence of allergic disease. Initially based on these observations, the hygiene hypothesis postulates that reduced microbial stimulation during infancy impairs the immune system development and increases the risk of allergy. Moreover, there is increasing evidence that the crosstalk existing between the intestine and the resident microbiota is crucial for gut homeostasis. In particular, bacterial colonization of the gut affects the integrity of the gut barrier and stimulates the development of the gut associated immune tissue, both phenomena being essential for the immune system to mount a controlled response to food antigens. Therefore, alterations in the microbial colonization process, by compromising the barrier homeostasis, may increase the risk of food allergy. In this context, antibiotic treatment, frequently prescribed during infancy, affects gut colonization by bacteria. However, little is known about the impact of alterations in the colonization process on the maturation of the gut barrier and on the immunological response to oral antigens. The objective of this work was to determine the impact of a commercial antibiotic preparation employed in pediatric settings on the gut barrier status at the critical period of the suckling/weaning transition and to evaluate the physiological consequences of this treatment in terms of immune response to food antigens. We established an antibiotic-treated suckling rat model relevant to the pediatric population in terms of type, dose and route of administration of the antibiotic and of changes in the patterns of microbial colonization. Oral tolerance to a novel luminal antigen (ovalbumin) was impaired when the antigen was introduced during antibiotic treatment. 
These results paralleled alterations in intestinal permeability to macromolecules and reduced intestinal expression of genes coding for major histocompatibility complex II molecules, which suggests a reduced capacity for antigen handling and presentation in the intestine of the antibiotic-treated animals. In addition, low luminal IgA levels and reduced intestinal expression of genes coding for antimicrobial proteins suggest that protection against pathogens was reduced under antibiotic treatment. In conclusion, we observed in suckling rats that treatment with a broad-spectrum antibiotic commonly used in pediatric practice reduced the capacity of the immune system to develop tolerance. The impact of the antibiotic treatment on the immune response to the antigen was likely mediated by alterations of the gut microbiota, through impairment of the mechanisms of antigen handling and presentation. This work reinforces the body of data supporting a key role of the intestinal microbiota in modulating the risk of allergy development, and leads us to propose that the introduction of new food antigens should be avoided during antibiotic treatment in infants.
Abstract:
Abstract: Drug addiction is a multifactorial disorder affecting human beings regardless of their education level, economic status, origin, or gender, but the vulnerability to develop addiction depends on environmental, genetic, and psychosocial dispositions.
Drug addiction is defined as a chronic relapsing disorder characterized by compulsive drug seeking, loss of control over drug intake, and persistent maladaptive decision making in spite of adverse consequences. The brain mechanisms responsible for drug abuse remain partially unknown despite accumulating evidence delineating molecular and cellular adaptations within the glutamatergic and dopaminergic systems. However, these adaptations do not fully explain the complex brain disease of drug addiction. The identification of other neurobiological factors responsible for the vulnerability to substance abuse is crucial for the development of promising therapeutic treatments able to alleviate signs of drug dependence. Over the past few years, growing evidence has demonstrated that a recently discovered brain circuit, the hypocretinergic system, is implicated in many physiological functions, including arousal, energy metabolism, motivation, stress, and reward-related behaviors. The hypocretin system is composed of a few thousand neurons arising from the lateral hypothalamus and projecting to the entire brain. Hypocretin-deficient mice have been generated and, unexpectedly, their phenotype resembles that of wild-type mice except for sleep attacks strikingly similar to those of human narcolepsy patients. Evidence suggesting that hypocretins are required for the acquisition and expression of drug addiction has also been reported; however, the precise mechanism by which hypocretins modulate drug-seeking behaviors remains a matter of debate. Here, we report alcohol and cocaine reward-related behaviors in hypocretin-deficient (KO) mice, as well as heterozygous (HET) and wild-type (WT) littermates. We first evaluated the impact of repeated cocaine injections (15 mg/kg, ip) on locomotor sensitization and conditioned place preference.
We observed that WT, HET, and KO mice exhibited behavioral sensitization following repeated cocaine administrations, but hypocretin-deficient males displayed a delayed and attenuated response to chronic cocaine administrations. Interestingly, HET males exhibited an intermediate pattern of behavioral sensitization. However, after standardization of the post-injection data against the habituation period prior to cocaine injections, all mice displayed similar amplitudes of behavioral sensitization, except for a reduced response in KO males on the first day, suggesting that the delayed and reduced cocaine-induced locomotor sensitization may reflect a hypoactive phenotype rather than an altered response to repeated cocaine administrations. Unexpectedly, all female mice exhibited similar patterns of cocaine-induced behavioral sensitization. We then assessed behavioral conditioning for an environment repeatedly paired with cocaine injections (15 mg/kg, ip). All mice, whatever their sex or genotype, exhibited a robust preference for the environment previously paired with cocaine administrations. Notably, following two weeks of cocaine abstinence, hypocretin-deficient males and females no longer exhibited any preference for the compartment previously paired with cocaine rewards, whereas both WT and HET mice continued to manifest a robust preference. We finally assessed drinking behaviors in WT, HET, and KO female mice using a novel paradigm, the IntelliCage®. We report here that KO females tended to explore less the four cage corners where water was easily available. When exposed to four different liquid solutions (water, 1 mM quinine or 0.2% saccharin, 8% alcohol, and 16% alcohol), KO mice tended to consume less of the sweet and alcoholic beverages.
However, after data standardization, no significant differences were noticed between genotypes, suggesting that the hypoactive phenotype is most likely accountable for the trend toward reduced sweet or alcohol intake in KO mice. Taken together, the present findings confirm that the behavior seen in Hcrt KO mice likely reflects developmental compensations, since only a slightly altered cocaine-induced behavioral sensitization and a normal behavioral conditioning with cocaine were observed in these mice compared with HET and WT littermates. With regard to drinking behaviors, KO mice barely displayed any behavioral changes other than a trend toward reduced intake of sweet and alcoholic beverages. Overall, the most striking observation is the constant hypoactive phenotype seen in the hypocretin-deficient mice, which is most likely accountable for their reduced tendency to explore the environment. Whether this hypoactive phenotype is due to reduced alertness or reduced motivation for reward seeking remains debatable, but our findings suggest that hypocretin-deficient mice barely display any altered motivation for reward seeking in environments where little effort is required to access a reward.
Abstract:
The feasibility of three-dimensional (3D) whole-heart imaging of the coronary venous (CV) system was investigated. The hypothesis that coronary magnetic resonance venography (CMRV) can be improved by using an intravascular contrast agent (CA) was tested. A simplified model of the contrast in T2-prepared steady-state free precession (SSFP) imaging was applied to calculate optimal T2-preparation durations for the various deoxygenation levels expected in venous blood. Non-contrast-agent (nCA)- and CA-enhanced images were compared for the delineation of the coronary sinus (CS) and its main tributaries. A quantitative analysis of the resulting contrast-to-noise ratio (CNR) and signal-to-noise ratio (SNR) in both approaches was performed. Precontrast, visualization of the CV system was limited by the poor CNR between large portions of the venous blood and the surrounding tissue. Postcontrast, a significant increase in CNR between the venous blood and the myocardium (Myo) resulted in a clear delineation of the target vessels. The CNR improvement was 347% (P < 0.05) for the CS, 260% (P < 0.01) for the mid cardiac vein (MCV), and 430% (P < 0.05) for the great cardiac vein (GCV). The improvement in SNR was on average 155%, but was not statistically significant for the CS and the MCV. The signal of the Myo could be significantly reduced to about 25% (P < 0.001).
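For reference, the two image-quality metrics compared above are signal differences (or signals) scaled by the noise standard deviation, and the reported gains are relative changes from the pre-contrast values. A minimal sketch under one common convention, with all signal values hypothetical:

```python
def cnr(signal_vessel, signal_tissue, noise_sd):
    # Contrast-to-noise ratio between the venous blood pool and a
    # reference tissue (e.g., myocardium), per the usual definition.
    return (signal_vessel - signal_tissue) / noise_sd

def pct_improvement(pre, post):
    # Relative improvement expressed as a percentage of the
    # pre-contrast value.
    return 100.0 * (post - pre) / pre
```

Under this convention, a 347% CNR improvement for the CS means the post-contrast CNR is roughly 4.5 times the pre-contrast value.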
Abstract:
Whether a higher dose of a long-acting angiotensin II receptor blocker (ARB) can provide as much blockade of the renin-angiotensin system over a 24-hour period as the combination of an angiotensin-converting enzyme inhibitor and a lower dose of ARB has not been formally demonstrated so far. In this randomized double-blind study we investigated renin-angiotensin system blockade obtained with 3 doses of olmesartan medoxomil (20, 40, and 80 mg every day) in 30 normal subjects and compared it with that obtained with lisinopril alone (20 mg every day) or combined with olmesartan medoxomil (20 or 40 mg). Each subject received 2 dose regimens for 1 week according to a crossover design with a 1-week washout period between doses. The primary endpoint was the degree of blockade of the systolic blood pressure response to angiotensin I 24 hours after the last dose after 1 week of administration. At trough, the systolic blood pressure response to exogenous angiotensin I was 58% +/- 19% with 20 mg lisinopril (mean +/- SD), 58% +/- 11% with 20 mg olmesartan medoxomil, 62% +/- 16% with 40 mg olmesartan medoxomil, and 76% +/- 12% with the highest dose of olmesartan medoxomil (80 mg) (P = .016 versus 20 mg lisinopril and P = .0015 versus 20 mg olmesartan medoxomil). With the combinations, blockade was 80% +/- 22% with 20 mg lisinopril plus 20 mg olmesartan medoxomil and 83% +/- 9% with 20 mg lisinopril plus 40 mg olmesartan medoxomil (P = .3 versus 80 mg olmesartan medoxomil alone). These data demonstrate that a higher dose of the long-acting ARB olmesartan medoxomil can produce an almost complete 24-hour blockade of the blood pressure response to exogenous angiotensin in normal subjects. Hence, a higher dose of a long-acting ARB is as effective as a lower dose of the same compound combined with an angiotensin-converting enzyme inhibitor in terms of blockade of the vascular effects of angiotensin.
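The degree of blockade reported above is conventionally the percentage reduction of the systolic pressor response to exogenous angiotensin I relative to the unblocked response. A minimal sketch of that endpoint, with hypothetical blood-pressure readings (the abstract does not give the raw pressure changes):

```python
def blockade_percent(delta_sbp_on_drug, delta_sbp_baseline):
    # Percent inhibition of the systolic blood pressure rise evoked by
    # exogenous angiotensin I, relative to the off-drug response.
    return 100.0 * (1.0 - delta_sbp_on_drug / delta_sbp_baseline)
```

For example, a pressor response attenuated from 25 mm Hg off drug to 10.5 mm Hg at trough corresponds to 58% blockade, the value reported for 20 mg lisinopril or 20 mg olmesartan medoxomil.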
Abstract:
OBJECTIVE: To assess the change in non-compliant items in prescription orders following the implementation of a computerized physician order entry (CPOE) system named PreDiMed. SETTING: The departments of internal medicine (39 and 38 beds) in two regional hospitals in Canton Vaud, Switzerland. METHOD: The prescription lines in 100 pre- and 100 post-implementation patients' files were classified according to three modes of administration (medicines for oral or other non-parenteral uses; medicines administered parenterally or via nasogastric tube; pro re nata (PRN), as needed) and analyzed for a number of relevant variables constitutive of medical prescriptions. MAIN OUTCOME MEASURE: The monitored variables depended on the pharmaceutical category and included mainly the name of the medicine, pharmaceutical form, posology and route of administration, diluting solution, flow rate, and identification of the prescriber. RESULTS: In 2,099 prescription lines, the total number of non-compliant items was 2,265 before CPOE implementation, or 1.079 non-compliant items per line. Two-thirds of these were due to missing information, and the remaining third to incomplete information. In 2,074 prescription lines post-CPOE implementation, the number of non-compliant items had decreased to 221, or 0.107 non-compliant items per line, a dramatic 10-fold decrease (χ² = 4615; P < 10^-6). Limitations of the computerized system were the risk of erroneous items in some non-prefilled fields and ambiguity due to a field with doses shown on commercial products. CONCLUSION: The deployment of PreDiMed in two departments of internal medicine has led to a major improvement in formal aspects of physicians' prescriptions. Some limitations of the first version of PreDiMed were unveiled and are being corrected.
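The per-line rates and the roughly 10-fold decrease quoted above follow directly from the reported counts; a quick arithmetic check:

```python
def rate_per_line(non_compliant_items, prescription_lines):
    # Non-compliant items per prescription line.
    return non_compliant_items / prescription_lines

pre_cpoe = rate_per_line(2265, 2099)   # before CPOE: ~1.079 items/line
post_cpoe = rate_per_line(221, 2074)   # after CPOE:  ~0.107 items/line
fold_decrease = pre_cpoe / post_cpoe   # ~10.1-fold reduction
```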
Oral cancer treatments and adherence: medication event monitoring system assessment for capecitabine
Abstract:
Background: Oncological treatments are traditionally administered via intravenous injection by qualified personnel. Oral formulations, which are developing rapidly, are preferred by patients and facilitate administration; however, they may increase non-adherence. In this study, 4 common oral chemotherapeutics are given to 50 patients (inclusion is still ongoing) divided into 4 groups. The aim is to evaluate adherence and offer these patients interdisciplinary support with the joint help of doctors and pharmacists. We present here the results for capecitabine. Materials and Methods: The final goal is to evaluate adherence in 50 patients split into 4 groups according to oral treatment (letrozole/exemestane, imatinib/sunitinib, capecitabine, and temozolomide), using persistence and quality of execution as parameters. These parameters are evaluated using a medication event monitoring system (MEMS®) in addition to routine oncological visits and semi-structured interviews. Patients were monitored for the entire duration of treatment, up to a maximum of 1 year. Patient satisfaction was assessed at the end of the monitoring period using a standardized questionnaire. Results: The capecitabine group included 2 women and 8 men with a median age of 55 years (range: 36-77 years), monitored for an average duration of 100 days (range: 5-210 days). Persistence was 98% and quality of execution 95%. 5 patients underwent cyclic treatment (2 out of 3 weeks) and 5 patients continuous treatment. Toxicities higher than grade 1 were grade 2-3 hand-foot syndrome in 1 patient and grade 3 acute coronary syndrome in 1 patient, both without impact on adherence. Patients were satisfied with the interviews undergone during the study (57% useful, 28% very useful, 15% useless) and successfully integrated the MEMS® into their daily lives (57% very easily, 43% easily), according to the questionnaire results obtained at the end of the monitoring period.
Conclusion: Persistence and quality of execution observed in our capecitabine group were excellent and better than expected compared with previously published studies. The interdisciplinary approach allowed us to better identify patients with toxicities and help them maintain adherence. Overall, patients were satisfied with the global interdisciplinary follow-up. With longer follow-up, a better evaluation of our method and its impact will be possible. Interpretation of the results of patients in the other groups of this ongoing trial will provide information for a more detailed analysis.
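Persistence and quality of execution derived from MEMS® event logs can be sketched as follows. Operational definitions vary between studies, so this is one plausible formulation with hypothetical counts, not the exact computation used in this trial:

```python
def persistence_pct(days_on_treatment, days_monitored):
    # Percent of the monitoring period elapsed before the patient
    # discontinued (or completed) the regimen.
    return 100.0 * days_on_treatment / days_monitored

def execution_pct(doses_taken_as_scheduled, doses_prescribed):
    # Percent of prescribed doses for which a cap opening was recorded
    # within the scheduled dosing window.
    return 100.0 * doses_taken_as_scheduled / doses_prescribed
```

Separating the two metrics matters: a patient can execute a regimen almost perfectly while on treatment (high execution) yet stop early (low persistence), and the two failure modes call for different interventions.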
Abstract:
BACKGROUND: Magnetic resonance imaging (MRI) of pacemakers is a relative contraindication because of the risks to the patient from potentially hazardous interactions between the MRI and the pacemaker system. Chest scans (ie, cardiac magnetic resonance scans) are of particular importance and higher risk. The previously Food and Drug Administration-approved magnetic resonance conditional system includes positioning restrictions, limiting the powerful utility of MRI. OBJECTIVE: To confirm the safety and effectiveness of a pacemaker system designed for safe whole body MRI without MRI scan positioning restrictions. METHODS: Primary eligibility criteria included standard dual-chamber pacing indications. Patients (n = 263) were randomized in a 2:1 ratio to undergo 16 chest and head scans at 1.5 T between 9 and 12 weeks postimplant (n = 177) or to not undergo MRI (n = 86) post-implant. Evaluation of the pacemaker system occurred immediately before, during (monitoring), and after MRI, 1-week post-MRI, and 1-month post-MRI, and similarly for controls. Primary end points measured the MRI-related complication-free rate for safety and compared pacing capture threshold between MRI and control subjects for effectiveness. RESULTS: There were no MRI-related complications during or after MRI in subjects undergoing MRI (n = 148). Differences in pacing capture threshold values from pre-MRI to 1-month post-MRI were minimal and similar between the MRI and control groups. CONCLUSIONS: This randomized trial demonstrates that the Advisa MRI pulse generator and CapSureFix MRI 5086MRI lead system is safe and effective in the 1.5 T MRI environment without positioning restrictions for MRI scans or limitations of body parts scanned.
Resumo:
Introduction: In forensic toxicology, cocaine is better known for its powerful stimulant effects on the nervous system and its high potential for recreational abuse than for its therapeutic use. However, cocaine is still used as a topical anesthetic and peripheral vasoconstrictor in eye, ear, nose, and throat surgery. Over the last decade, an increase in the presence of cocaine and its metabolites in blood samples of drivers suspected of driving under the influence of drugs (DUID) was observed in Switzerland (Augsburger et al., Forensic Sci Int 153 (2005) 11-15; Senna et al., Forensic Sci Int 198 (2010) 11-16). Observed blood concentration ranges of cocaine and benzoylecgonine were 10-925 μg/L and 20-5200 μg/L, respectively. Since 2005, a zero-tolerance approach has been part of Swiss legislation for several substances, including cocaine (analytical cutoff: 15 μg/L). Interpretation therefore often amounts to determining whether the concentration lies above or below this limit. However, it is important for the interpretation to take the context into account and to be critical of the results obtained; otherwise, erroneous conclusions may be drawn. Methods: Systematic toxicological analyses were performed on blood and, when available, urine for 5 DUID cases, as previously published (Augsburger et al., Forensic Sci Int 153 (2005)). Positive results were confirmed and drugs were quantified in biological samples by GC-MS, GC-MS/MS, or LC-MS/MS. Results: Administration of cocaine after a traffic accident was identified in five cases. All five persons were admitted to the emergency room because of severe trauma. Maxillofacial surgery was performed shortly after admission to the emergency room, involving nasal application of cocaine (swab). In no case was the use of a cocaine swab mentioned in the documents completed by the police or by the medical staff requested to take blood and urine samples.
The information was obtained retrospectively from the medical records, without precise indication of the application time or dose.
Case 1. An 83-year-old man (pedestrian) was hit by a car. Blood (+11 h after the accident): cocaine (16 μg/L), benzoylecgonine (370 μg/L). Urine: cocaine (1700 μg/L), benzoylecgonine (560 μg/L).
Case 2. An 84-year-old woman (pedestrian) was hit by a car. Blood (+1.5 h after the accident): cocaine (230 μg/L), benzoylecgonine (370 μg/L). Urine was not available. Hair (+4 months after the accident): segment 1 (0-2 cm), cocaine not detected; segment 2 (2-4 cm), cocaine <0.5 ng/mg.
Case 3. A 66-year-old man was involved in a car/car accident. He died 2 hours and 5 minutes after the crash. Blood (+1.5 h after the accident): cocaine and metabolites not detected. Blood (+2 h after the accident): cocaine (1750 μg/L), benzoylecgonine (460 μg/L). Blood (post-mortem): cocaine (370 μg/L), benzoylecgonine (200 μg/L). Urine (+1.5 h after the accident): cocaine not detected.
Case 4. A 57-year-old woman on a motor scooter was hit by a car. She died 2 hours and 10 minutes after the crash. Blood (+0.5 h after the accident): cocaine and metabolites not detected. Urine (post-mortem): cocaine (<20 μg/L), benzoylecgonine (120 μg/L).
Case 5. A 30-year-old man was involved in a car accident. Blood (+4 h after the accident): cocaine (29 μg/L), benzoylecgonine (<20 μg/L). Urine (+4 h after the accident): cocaine and metabolites not detected. Ethanol (1.32 g/kg) and cannabinoids (THC, 2.0 μg/L; THC-COOH, 38 μg/L) were also detected in blood.
Conclusion: To our knowledge, this is the first description of DUID cases involving therapeutic use of cocaine after an accident. These results indicate that even where a per se law is in force for the prosecution of DUID cases, a critical interpretation of the results remains necessary, especially when a medical intervention occurred after the accident.
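The per se decision described above reduces, mechanically, to comparing a measured blood concentration with the 15 μg/L analytical cutoff; the abstract's point is that this check alone can misclassify drivers whose cocaine exposure was therapeutic. A minimal illustrative sketch of that threshold comparison (the function name and structure are hypothetical, not part of the study; the concentrations are taken from the cases reported above):

```python
# Illustrative only: applying the Swiss per se analytical cutoff for
# cocaine in whole blood (15 ug/L) to measured concentrations.
CUTOFF_UG_PER_L = 15.0

def exceeds_cutoff(cocaine_blood_ug_per_l: float) -> bool:
    """Return True if the blood cocaine concentration is at or above
    the Swiss analytical cutoff of 15 ug/L (hypothetical helper)."""
    return cocaine_blood_ug_per_l >= CUTOFF_UG_PER_L

# Blood cocaine concentrations (ug/L) from Cases 1, 2, and 5 above.
cases = {"Case 1": 16.0, "Case 2": 230.0, "Case 5": 29.0}
for name, concentration in cases.items():
    verdict = "above cutoff" if exceeds_cutoff(concentration) else "below cutoff"
    print(f"{name}: {concentration} ug/L -> {verdict}")
```

All three cases exceed the cutoff, yet in each the cocaine originated from a medical swab, which is exactly why the authors argue that context, not the threshold alone, must drive the interpretation.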
Resumo:
Pharmacological treatment of hypertension is effective in preventing cardiovascular and renal complications. Calcium antagonists (CAs) and blockers of the renin-angiotensin system [angiotensin-converting enzyme (ACE) inhibitors and angiotensin II receptor blockers (ARBs)] are widely used today to initiate antihypertensive treatment but, when given as monotherapy, do not suffice in most patients to normalise blood pressure (BP). Combining a CA with either an ACE inhibitor or an ARB considerably increases antihypertensive efficacy without a deterioration in tolerability. Several fixed-dose combinations are available (CA + ACE inhibitor: amlodipine + benazepril, felodipine + ramipril, verapamil + trandolapril; CA + ARB: amlodipine + valsartan). They are expected not only to improve BP control but also to facilitate long-term adherence to antihypertensive therapy, thereby providing maximal protection against the cardiovascular and renal damage caused by high BP.