92 results for IMPACT ASSESSMENT
Abstract:
Valganciclovir (VGC, Valcyte®) is an orally administered ester prodrug of ganciclovir (GCV), the standard anti-cytomegalovirus (CMV) drug. This drug has substantially reduced the burden of CMV morbidity and mortality in solid organ transplant recipients. Prevention of CMV infection and treatment of CMV disease require drug administration over many weeks, so oral administration is convenient. Valganciclovir was developed to overcome the poor oral bioavailability of ganciclovir, which limits its exposure after oral administration and thus its efficacy. The prodrug crosses the intestinal barrier efficiently and is then hydrolyzed into ganciclovir, providing exposure similar to that of intravenous ganciclovir. Valganciclovir is now preferred for the prophylaxis and treatment of CMV infection in solid organ transplant recipients. Nevertheless, adequate dosage adjustment is necessary to optimize its use, avoiding either insufficient or excessive exposure related to differences in the pharmacokinetic profile between patients. The main goal of this thesis was to better describe the pharmacokinetic and pharmacodynamic profile of valganciclovir in solid organ transplant recipients, to assess its reproducibility and predictability, and thus to evaluate the current recommendations for valganciclovir dosage adjustment and the potential contribution of routine therapeutic drug monitoring (TDM) to patient management. A total of 437 ganciclovir plasma concentrations from 65 transplant patients (41 kidney, 12 lung, 10 heart and 2 liver recipients; 58 on oral valganciclovir prophylaxis, 8 on oral valganciclovir treatment and 2 on intravenous ganciclovir) were measured using a validated high-performance liquid chromatography (HPLC) method developed for this study. The results were analyzed by nonlinear mixed-effects modeling (NONMEM). A two-compartment model with first-order absorption described the data appropriately. Systemic clearance was markedly influenced by GFR, with further differences between graft types and sexes (CL/GFR = 1.7 in kidney, 0.9 in heart and 1.2 in lung and liver recipients), an interpatient variability (CV%) of 26% and an interoccasion variability of 12%. Body weight and sex influenced the central volume of distribution (V1 = 0.34 l/kg in males and 0.27 l/kg in females), with an interpatient variability of 20%. Residual intrapatient variability was 21%. No significant drug interaction influenced GCV disposition. VGC prophylactic efficacy and tolerability were good, without detectable dependence on the GCV concentration profile. In conclusion, this analysis highlights the importance of careful adjustment of the VGC dosage to renal function and body weight. Considering the good predictability and reproducibility of the GCV profile after oral VGC in solid organ transplant recipients, routine TDM does not appear to be clinically indicated. However, GCV plasma measurement may still be helpful in specific clinical situations, such as documenting appropriate exposure in patients with potentially compromised absorption, lack of response to CMV disease treatment, or renal replacement therapy.
SUMMARY: Valganciclovir (Valcyte®) is an oral prodrug of ganciclovir, the reference anti-infective agent against cytomegalovirus (CMV) infections. This antiviral has reduced the deleterious effects of an infection hitherto responsible for substantial morbidity and mortality in organ transplant recipients.
Prevention and treatment of CMV infection are therefore necessary but require administration of an antiviral agent over a long period, so an orally administered drug is a clear advantage. Valganciclovir was developed to improve the poor oral absorption of ganciclovir and hence its efficacy. This valyl ester of ganciclovir crosses the gastrointestinal barrier more easily and is then hydrolyzed into ganciclovir in the bloodstream, producing exposure comparable to that of an intravenous ganciclovir infusion. Valganciclovir has consequently become widely used for both the prophylaxis and the treatment of CMV infection. Nevertheless, optimal use of this new drug requires sound knowledge of its pharmacokinetic profile in order to establish a suitable dosing scheme that avoids both overexposure and underexposure arising from differences in elimination between patients. The aim of this thesis was to study the pharmacokinetic and pharmacodynamic profile of valganciclovir in organ transplant recipients, together with its reproducibility and predictability, and thereby to critically appraise the currently recommended scheme for valganciclovir dose adjustment as well as the possible contribution of routine blood concentration monitoring. A total of 437 ganciclovir blood levels were measured in 65 organ transplant patients (41 kidney, 12 lung, 10 heart and 2 liver recipients; 58 on oral valganciclovir prophylaxis, 8 on valganciclovir treatment and 2 on intravenous treatment). A high-performance liquid chromatography method was developed and validated for this study, and the results were analyzed by nonlinear mixed-effects modeling (NONMEM). A two-compartment model with first-order absorption described the data. Systemic clearance was mainly influenced by the glomerular filtration rate (GFR), with differences between graft types and sexes (CL/GFR = 1.7 in kidney recipients, 0.9 in heart recipients and 1.2 in the lung and liver group), an interindividual variability of 26% (CV%) and an interoccasion variability of 12%. Body weight and sex influenced the central volume of distribution (V1 = 0.34 l/kg in men and 0.27 l/kg in women), with an interindividual variability of 20%. Residual intraindividual variability was 21%. No drug interaction was found to influence the ganciclovir profile. Valganciclovir prophylaxis proved effective and well tolerated. In conclusion, this analysis underlines the importance of adjusting the valganciclovir dose to renal function and body weight. Given the good reproducibility and predictability of the ganciclovir pharmacokinetic profile in transplant patients receiving valganciclovir, routine blood concentration monitoring does not appear clinically indicated. Nevertheless, measuring ganciclovir plasma levels may be useful in particular situations, such as verifying adequate exposure in patients with potentially insufficient absorption, patients not responding to treatment of a CMV infection, or those on renal replacement therapy.
LAY SUMMARY: Valganciclovir is a precursor that releases ganciclovir, recently developed to improve the poor oral absorption of the latter. Once valganciclovir has been absorbed, the ganciclovir released into the bloodstream is active against cytomegalovirus infections. This widespread virus causes insidious and sometimes serious disease in people with weakened immune defences, such as organ transplant recipients receiving anti-rejection therapy. Ganciclovir is given for several consecutive months, either to prevent an infection after transplantation or to treat an established infection. The ease of oral administration of valganciclovir is an advantage over ganciclovir infusion, which requires hospitalization. However, the oral route may be an additional source of variability between patients, with a potential impact on the drug's efficacy or toxicity. The aims of this study were: to describe the fate of this drug in the human body (the discipline of pharmacokinetics); to identify the clinical factors that may explain the differences in blood concentration observed between patients on a given dosage; and to explore the relationships between blood drug concentrations and efficacy or the occurrence of adverse effects (the discipline of pharmacodynamics). The study required the development and validation of an analytical method to measure the blood concentration of ganciclovir, which was then applied to 437 samples from 65 solid organ transplant patients (41 kidney, 12 lung, 10 heart and 2 liver recipients) receiving valganciclovir. The measurements were analyzed with a mathematical tool in order to build a model of the drug's fate in the blood of each patient on each occasion. This made it possible to estimate, in patients receiving valganciclovir, the rates at which the body absorbs, distributes and then eliminates the drug. The elimination rate depended closely on renal function, graft type and sex, whereas distribution depended on the patient's weight and sex. The variability not explained by these clinical factors was moderate and probably without obvious clinical consequences for efficacy or tolerability, both of which were very satisfactory in the study patients. No relationship was observed between drug concentrations and therapeutic efficacy or the occurrence of adverse effects, confirming that the relatively low doses used in our patient population were sufficient to produce reproducible exposure at adequate concentrations. In conclusion, the profile (and therefore the absorption) of valganciclovir in transplant patients appears well predictable once the dose is adjusted to renal function and body weight. Systematic monitoring of blood concentrations is probably not indicated in routine care, but such measurements may be of interest in certain specific situations.
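The dose-adjustment message of this record rests on a simple covariate model. Below is a minimal Python sketch assuming only the typical values quoted in the abstract (CL/GFR ratios of 1.7, 0.9 and 1.2 by graft type; V1 of 0.34 l/kg in men and 0.27 l/kg in women); the function names and the example patient are hypothetical illustrations, not part of the thesis.

```python
# Illustrative sketch of the covariate relationships reported for ganciclovir (GCV)
# after oral valganciclovir (VGC). Typical values come from the abstract above;
# the function names and the example patient are hypothetical.

CL_TO_GFR = {"kidney": 1.7, "heart": 0.9, "lung": 1.2, "liver": 1.2}  # CL/GFR ratio by graft type
V1_PER_KG = {"male": 0.34, "female": 0.27}                            # central volume, l/kg

def typical_clearance(graft: str, gfr_ml_min: float) -> float:
    """Typical GCV clearance (same units as GFR), modelled as a fixed multiple of GFR."""
    return CL_TO_GFR[graft] * gfr_ml_min

def typical_v1(sex: str, weight_kg: float) -> float:
    """Typical central volume of distribution (litres), scaled by body weight and sex."""
    return V1_PER_KG[sex] * weight_kg

if __name__ == "__main__":
    # Hypothetical 70 kg male kidney recipient with a GFR of 60 ml/min:
    cl = typical_clearance("kidney", 60.0)
    v1 = typical_v1("male", 70.0)
    print(f"typical CL ≈ {cl:.0f} ml/min, typical V1 ≈ {v1:.1f} l")  # ≈ 102 ml/min, ≈ 23.8 l
```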
Abstract:
BACKGROUND: The Advisa MRI system is designed to safely undergo magnetic resonance imaging (MRI). Its influence on image quality is not well known. OBJECTIVE: To evaluate cardiac magnetic resonance (CMR) image quality and to characterize myocardial contraction patterns by using the Advisa MRI system. METHODS: In this international trial with 35 participating centers, an Advisa MRI system was implanted in 263 patients. Of those, 177 were randomized to the MRI group and 150 underwent MRI scans at the 9-12-week visit. Left ventricular (LV) and right ventricular (RV) cine long-axis steady-state free precession MR images were graded for quality. Signal loss along the implantable pulse generator and leads was measured. The tagging CMR data quality was assessed as the percentage of trackable tagging points on complementary spatial modulation of magnetization acquisitions (n=16) and segmental circumferential fiber shortening was quantified. RESULTS: Of all cine long-axis steady-state free precession acquisitions, 95% of LV and 98% of RV acquisitions were of diagnostic quality, with 84% and 93%, respectively, being of good or excellent quality. Tagging points were trackable from systole into early diastole (360-648 ms after the R-wave) in all segments. During RV pacing, tagging demonstrated a dyssynchronous contraction pattern, which was not observed in nonpaced (n = 4) and right atrial-paced (n = 8) patients. CONCLUSIONS: In the Advisa MRI study, high-quality CMR images for the assessment of cardiac anatomy and function were obtained in most patients with an implantable pacing system. In addition, this study demonstrated the feasibility of acquiring tagging data to study the LV function during pacing.
Abstract:
OBJECTIVE: To assess the properties of various indicators aimed at monitoring the impact of a bed closure in a surgical intensive care unit (ICU) on activity and patient outcome. DESIGN: Comparison before and after the intervention. SETTING: A surgical ICU at a university hospital. PATIENTS: All patients admitted to the unit over two periods of 10 months. INTERVENTION: Closure of one bed out of 17. MEASUREMENTS AND RESULTS: Activity and outcome indicators in the ICU and the structures upstream from it (emergency department, operating theater, recovery room) and downstream from it (intermediate care units). After the bed closure, the monthly medians of admitted patients and ICU hospital days increased from 107 (interquartile range 94-112) to 113 (106-121, P=0.07) and from 360 (325-443) to 395 (345-436, P=0.48), respectively, in line with the linear trend observed in our institution. All indicators of workload, patient severity, and outcome remained stable except for SAPS II score, emergency admissions, and ICU readmissions, which increased not only transiently but also over the mid term (10 months), indicating that the process of patient care delivery was no longer predictable. CONCLUSIONS: Health care systems, including ICUs, are extraordinarily flexible and can adapt to multiple external constraints without altering commonly used activity and outcome indicators. It is therefore necessary to set up multiple indicators in order to reliably monitor the impact of external interventions and intervene rapidly when the system is no longer under control.
Abstract:
OBJECTIVE: To evaluate the public health impact of statin prescribing strategies based on the Justification for the Use of Statins in Primary Prevention: An Intervention Trial Evaluating Rosuvastatin (JUPITER) study. METHODS: We studied 2268 adults aged 35-75 years without cardiovascular disease in a population-based study in Switzerland in 2003-2006. We assessed eligibility for statins according to the Adult Treatment Panel III (ATPIII) guidelines, and after adding "strict" (hs-CRP ≥ 2.0 mg/L and LDL-cholesterol < 3.4 mmol/L) and "extended" (hs-CRP ≥ 2.0 mg/L alone) JUPITER-like criteria. We estimated the proportion of CHD deaths potentially prevented over 10 years in the Swiss population. RESULTS: Fifteen percent were already taking statins; 42% were eligible by ATPIII guidelines, 53% after adding the "strict" criteria and 62% after adding the "extended" criteria, with a total of 19% newly eligible. The number needed to treat with statins to avoid one CHD death over 10 years was 38 for ATPIII, 84 for the "strict" and 92 for the "extended" JUPITER-like criteria. ATPIII would prevent 17% of CHD deaths, compared with 20% for ATPIII + "strict" and 23% for ATPIII + "extended" criteria (+6%). CONCLUSION: Implementing JUPITER-like strategies would make statin prescribing for primary prevention more common and less efficient than it is with current guidelines.
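As a rough illustration of the eligibility logic and the number-needed-to-treat (NNT) arithmetic described above, here is a hedged Python sketch; ATPIII eligibility is treated as a given input rather than recomputed, the thresholds come from the abstract, and the function names and example values are hypothetical.

```python
# Sketch of the "JUPITER-like" eligibility extensions and the NNT arithmetic described
# above. ATPIII eligibility is a given input (its full scoring rules are not reproduced);
# thresholds come from the abstract, function names are hypothetical.

def strict_jupiter(hs_crp_mg_l: float, ldl_mmol_l: float) -> bool:
    """'Strict' criterion: hs-CRP >= 2.0 mg/L and LDL-cholesterol < 3.4 mmol/L."""
    return hs_crp_mg_l >= 2.0 and ldl_mmol_l < 3.4

def extended_jupiter(hs_crp_mg_l: float) -> bool:
    """'Extended' criterion: hs-CRP >= 2.0 mg/L alone."""
    return hs_crp_mg_l >= 2.0

def eligible(atp3_eligible: bool, hs_crp_mg_l: float, ldl_mmol_l: float, strategy: str) -> bool:
    """Statin eligibility under ATPIII alone or ATPIII plus a JUPITER-like extension."""
    if strategy == "ATPIII":
        return atp3_eligible
    if strategy == "ATPIII+strict":
        return atp3_eligible or strict_jupiter(hs_crp_mg_l, ldl_mmol_l)
    if strategy == "ATPIII+extended":
        return atp3_eligible or extended_jupiter(hs_crp_mg_l)
    raise ValueError(f"unknown strategy: {strategy}")

def nnt(n_treated: float, deaths_prevented: float) -> float:
    """Number needed to treat to avoid one CHD death: people treated per death prevented."""
    return n_treated / deaths_prevented

# Hypothetical person: not eligible by ATPIII, hs-CRP 3.1 mg/L, LDL 3.0 mmol/L.
print(eligible(False, 3.1, 3.0, "ATPIII+strict"))    # True
print(eligible(False, 3.1, 3.6, "ATPIII+strict"))    # False (LDL too high for the strict rule)
print(eligible(False, 3.1, 3.6, "ATPIII+extended"))  # True
```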
Abstract:
Osteoporosis (OP) is a systemic skeletal disease characterized by low bone mineral density (BMD) and micro-architectural (MA) deterioration. Clinical risk factors (CRF) are often used as an approximation of MA. MA can nevertheless be evaluated in daily practice with the trabecular bone score (TBS), which is very simple to obtain by reanalyzing a lumbar DXA scan. TBS has proven diagnostic and prognostic value, partially independent of CRF and BMD. The aim of the OsteoLaus cohort is to combine, in daily practice, the CRF and the information given by DXA (BMD, TBS and vertebral fracture assessment (VFA)) to better identify women at high fracture risk. The OsteoLaus cohort (1400 women aged 50 to 80 years living in Lausanne, Switzerland) started in 2010. It is derived from the COLAUS cohort, which started in Lausanne in 2003 and whose main goal is to obtain information on the epidemiology and genetic determinants of cardiovascular risk in 6700 men and women. CRF for OP, bone ultrasound of the heel, lumbar spine and hip BMD, VFA by DXA and MA evaluation by TBS are recorded in OsteoLaus. Preliminary results are reported here. We included 631 women: mean age 67.4 ± 6.7 years, BMI 26.1 ± 4.6, mean lumbar spine BMD 0.943 ± 0.168 (T-score −1.4 SD) and TBS 1.271 ± 0.103. As expected, the correlation between BMD and site-matched TBS is low (r² = 0.16). The prevalence of vertebral fracture (VFx) grade 2/3, major OP fracture (Fx) and all OP Fx is 8.4%, 17.0% and 26.0%, respectively. Age- and BMI-adjusted ORs (per SD decrease) are 1.8 (1.2-2.5), 1.6 (1.2-2.1) and 1.3 (1.1-1.6) for BMD for the different fracture categories, and 2.0 (1.4-3.0), 1.9 (1.4-2.5) and 1.4 (1.1-1.7) for TBS, respectively. Only 32 to 37% of women with an OP Fx are identified by a BMD < −2.5 SD or by a TBS < 1.200 taken separately. If women with either a BMD < −2.5 SD or a TBS < 1.200 are flagged, 54 to 60% of women with an osteoporotic Fx are identified. As in previously published studies, these preliminary results confirm the partial independence between BMD and TBS. More importantly, combining TBS with BMD significantly increases the identification of women with prevalent OP Fx who would have been misclassified by BMD alone. For the first time we can obtain complementary information on fracture (VFA), density (BMD) and micro- and macro-architecture (TBS and HAS) from a single simple, inexpensive device with low ionizing radiation: DXA. Such complementary information is very useful for the patient in daily practice and is likely to have an impact on cost-effectiveness analyses.
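A minimal sketch of the combined screening rule discussed above (flagging a woman when either the BMD T-score is below −2.5 SD or the TBS is below 1.200), in Python; the thresholds are those quoted in the abstract, while the function name and example values are hypothetical.

```python
# Sketch of the combined screening rule quoted above: flag a woman if either her
# lumbar spine BMD T-score is below -2.5 SD or her TBS is below 1.200.
# Thresholds are from the abstract; the function name and example values are hypothetical.

def flagged_by_bmd_or_tbs(bmd_t_score: float, tbs: float) -> bool:
    """True if BMD T-score < -2.5 SD or TBS < 1.200 (either criterion is sufficient)."""
    return bmd_t_score < -2.5 or tbs < 1.200

# Example: a T-score of -1.8 alone would not flag this (hypothetical) woman,
# but her low TBS (degraded micro-architecture) does.
print(flagged_by_bmd_or_tbs(bmd_t_score=-1.8, tbs=1.15))  # True
print(flagged_by_bmd_or_tbs(bmd_t_score=-1.8, tbs=1.30))  # False
```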
Abstract:
Real-time glycemia measurement is a cornerstone of metabolic research, particularly when performing oral glucose tolerance tests (OGTT) or glucose clamps. From 1965 to 2009, the gold standard device for real-time plasma glucose assessment was the Beckman glucose analyzer 2 (Beckman Instruments, Fullerton, CA), whose technology couples a glucose oxidase enzymatic assay with oxygen sensors. Since its discontinuation in 2009, researchers have been left with few alternatives based on glucose oxidase technology. The first is the YSI 2300 (Yellow Springs Instruments Corp., Yellow Springs, OH), known to be as accurate as the Beckman(1). The YSI has been used extensively in clinical research studies and is used to validate other glucose monitoring devices(2). Its major drawback is that it is relatively slow and requires high maintenance. The Analox GM9 (Analox instruments, London), more recent and faster, is increasingly used in clinical research(3) as well as in basic science(4) (e.g., 23 papers in Diabetes and 21 in Diabetologia).
Abstract:
1. Costs of reproduction lie at the core of basic ecological and evolutionary theories, and their existence is commonly invoked to explain adaptive processes. Despite their importance, empirical evidence for the existence and magnitude of costs of reproduction in tree species comes mostly from correlational studies, and more comprehensive approaches are still missing. Manipulative experiments are a preferred approach for studying costs of reproduction, as they allow control of otherwise inherent confounding factors such as size or genetic background. 2. Here, we conducted a manipulative experiment in a Pinus halepensis common garden, removing developing cones from a group of trees and comparing growth and reproduction after treatment with a control group. We also estimated phenotypic and genetic correlations between reproductive and vegetative traits. 3. Manipulated trees grew slightly more than control trees just after treatment, with only a transient, marginally non-significant difference. By contrast, larger differences were observed in the number of female cones initiated 1 year after treatment, with 70% more cones in the manipulated group. Phenotypic and genetic correlations showed that smaller trees invested a higher proportion of their resources in reproduction than larger trees, which can be interpreted as indirect evidence of costs of reproduction. 4. Synthesis. This research showed a strong effect of current reproduction on future reproductive potential, even though no significant effect on vegetative growth was detected. This has strong implications for how we understand adaptive strategies in forest trees and should encourage further interest in their still poorly known reproductive life-history traits.
Abstract:
One aim of this study was to determine the impact of water velocity on the uptake of indicator polychlorinated biphenyls (iPCBs) by silicone rubber (SR) and low-density polyethylene (LDPE) passive samplers. A second aim was to assess the efficiency of performance reference compounds (PRCs) in correcting for the impact of water velocity. SR and LDPE samplers were spiked with 11 or 12 PRCs and exposed for 6 weeks to four different velocities (in the range of 1.6 to 37.7 cm s−1) in river-like flow conditions, using a channel system supplied with river water. A relationship between velocity and uptake was found for each iPCB, which makes it possible to determine the expected changes in uptake due to velocity variations. For both samplers, velocity increases from 2 to 10 cm s−1, 30 cm s−1 (interpolated data) and 100 cm s−1 (extrapolated data) lead to increases in uptake that do not exceed factors of 2, 3 and 4.5, respectively. The results also showed that the influence of velocity decreased with increasing octanol-water partition coefficient (log Kow) of the iPCBs when SR was used, whereas the opposite effect was observed for LDPE. Time-weighted average (TWA) concentrations of iPCBs in water were calculated from iPCB uptake and PRC release. These calculations were performed using either a single PRC or all the PRCs. The efficiency of PRCs in correcting for the impact of velocity was assessed by comparing the TWA concentrations obtained at the four tested velocities. For SR, good agreement was found among the four TWA concentrations with both methods (average RSD < 10%). For LDPE as well, PRCs offered a good correction of the impact of water velocity (average RSD of about 10 to 20%). These results contribute to the process of acceptance of passive sampling in routine regulatory monitoring programs.
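The abstract does not give the equations behind the PRC correction, so the Python sketch below assumes the standard first-order exchange model commonly used for passive samplers: the in-situ sampling rate is estimated from PRC dissipation and then reused to convert the absorbed amount of an analyte into a TWA water concentration. All symbols, parameter values and function names are illustrative assumptions, not taken from this study.

```python
import math

# Hedged sketch of a generic first-order exchange model for passive samplers
# (not taken from this study): the in-situ sampling rate Rs is inferred from the
# fraction of a PRC remaining after exposure, then reused to turn the absorbed
# amount of an analyte into a time-weighted average (TWA) water concentration.

def sampling_rate_from_prc(frac_prc_retained: float, log_ksw: float,
                           sampler_mass_kg: float, days: float) -> float:
    """Rs (L/day) assuming first-order PRC release: f = exp(-Rs * t / (Ksw * m))."""
    ksw_l_per_kg = 10.0 ** log_ksw
    return -math.log(frac_prc_retained) * ksw_l_per_kg * sampler_mass_kg / days

def twa_concentration(absorbed_ng: float, log_ksw: float, sampler_mass_kg: float,
                      rs_l_per_day: float, days: float) -> float:
    """TWA water concentration (ng/L) from the same first-order uptake model:
    N = Cw * Ksw * m * (1 - exp(-Rs * t / (Ksw * m)))."""
    ksw_l_per_kg = 10.0 ** log_ksw
    approach_to_equilibrium = 1.0 - math.exp(-rs_l_per_day * days / (ksw_l_per_kg * sampler_mass_kg))
    return absorbed_ng / (ksw_l_per_kg * sampler_mass_kg * approach_to_equilibrium)

# Hypothetical 3 g silicone sheet exposed for 42 days; a PRC with log Ksw = 5.0 of which
# 40% remains; 25 ng of an iPCB with the same log Ksw absorbed by the sampler.
# (In practice Rs is compound-specific and is usually rescaled, e.g. via log Kow.)
rs = sampling_rate_from_prc(0.40, 5.0, 0.003, 42.0)       # about 6.5 L/day
print(rs, twa_concentration(25.0, 5.0, 0.003, rs, 42.0))  # about 0.14 ng/L
```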
Abstract:
Shallow upland drains (grips) have been hypothesized to be responsible for increased downstream flow magnitudes. Observations provide counterfactual evidence, often reflecting the difficulty of inferring conclusions from statistical correlation and paired catchment comparisons, and the complexity of designing field experiments to test grip impacts at the catchment scale. Drainage should provide drier antecedent moisture conditions, and hence more storage at the start of an event; however, grips have higher flow velocities than overland flow and thus potentially deliver flow more rapidly to the drainage network. We develop and apply a model for assessing the impacts of grips on flow hydrographs. The model was calibrated on the gripped case, and the gripped case was then compared with the intact case obtained by removing all grips. This comparison showed that, even given parameter uncertainty, the intact case had significantly higher flood peaks and lower baseflows, mirroring field observations of the hydrological response of intact peat. The simulations suggest that delivery effects may not translate into catchment-scale impacts for three reasons. First, in our case, the proportions of flow path lengths on hillslopes were not changed significantly by gripping. Second, the structure of the grip network, as compared with the structure of the drainage basin, militated against grip-related increases in the concentration of runoff in the drainage network, although it did marginally reduce the mean timing of that concentration at the catchment outlet. Third, the effect of the latter on downstream flow magnitudes can only be assessed by reference to the peak timing of other tributary basins, emphasizing that drain effects are both relative and scale dependent. However, given the importance of hillslope flow paths, we show that if upland drainage causes significant changes in surface roughness on hillslopes, then critical and important feedbacks may impact the speed of hydrological response.
Abstract:
BACKGROUND: Frequent emergency department users represent a small number of patients but account for a large number of emergency department visits. They should be a focus because they are often vulnerable patients with many risk factors affecting their quality of life (QoL). Case management interventions have resulted in a significant decrease in emergency department visits, but association with QoL has not been assessed. One aim of our study was to examine to what extent an interdisciplinary case management intervention, compared to standard emergency care, improved frequent emergency department users' QoL. METHODS: Data are part of a randomized, controlled trial designed to improve frequent emergency department users' QoL and use of health-care resources at the Lausanne University Hospital, Switzerland. In total, 250 frequent emergency department users (≥5 attendances during the previous 12 months; ≥ 18 years of age) were interviewed between May 2012 and July 2013. Following an assessment focused on social characteristics; social, mental, and somatic determinants of health; risk behaviors; health care use; and QoL, participants were randomly assigned to the control or the intervention group (n=125 in each group). The final sample included 194 participants (20 deaths, 36 dropouts, n=96 in the intervention group, n=99 in the control group). Participants in the intervention group received a case management intervention by an interdisciplinary, mobile team in addition to standard emergency care. The case management intervention involved four nurses and a physician who provided counseling and assistance concerning social determinants of health, substance-use disorders, and access to the health-care system.
Abstract:
OBJECTIVES: Therapeutic coma is advocated in guidelines for the management of refractory status epilepticus; this is, however, based on weak evidence. Here we address the specific impact of therapeutic coma on status epilepticus outcome. DESIGN: Retrospective assessment of a prospectively collected cohort. SETTING: Academic hospital. PATIENTS: Consecutive adults with incident status epilepticus lasting greater than or equal to 30 minutes, admitted between 2006 and 2013. MEASUREMENTS AND MAIN RESULTS: We prospectively recorded demographics, clinical status epilepticus features, treatment, and outcome at discharge, and retrospectively recorded medical comorbidities, hospital stay, and infectious complications. Associations between potential predictors and clinical outcome were analyzed using multinomial logistic regression. Of 467 patients with incident status epilepticus, 238 returned to baseline (51.1%), 162 had new disability (34.6%), and 67 died (14.3%); 50 subjects (10.7%) were managed with therapeutic coma. Therapeutic coma was associated with poorer outcome in the whole cohort (relative risk ratio for new disability, 6.86; 95% CI, 2.84-16.56; for mortality, 9.10; 95% CI, 3.17-26.16); the effect was more pronounced in patients with complex partial status epilepticus than in those with generalized convulsive or nonconvulsive status epilepticus in coma. The prevalence of infections was higher (odds ratio, 3.81; 95% CI, 1.66-8.75), and the median hospital stay in patients discharged alive was longer (16 d [range, 2-240 d] vs 9 d [range, 1-57 d]; p < 0.001), in subjects managed with therapeutic coma. CONCLUSIONS: This study provides class III evidence that therapeutic coma is associated with poorer outcome after status epilepticus; furthermore, it portends higher infection rates and longer hospitalizations. These data suggest caution in the straightforward use of this approach, especially in patients with complex partial status epilepticus.
Abstract:
BACKGROUND: Evidence suggests a relationship between exposure to trauma during childhood and functional impairments in psychotic patients. However, the impact of age at the time of exposure has been understudied in early psychosis (EP) patients. METHOD: Two hundred and twenty-five patients aged 18-35 years were assessed at baseline and after 2, 6, 18, 24, 30 and 36 months of treatment. Patients exposed to sexual and/or physical abuse (SPA) were classified according to age at the time of first exposure (Early SPA: before age 11 years; Late SPA: between ages 12 and 15 years) and then compared with patients who had not been exposed to such trauma (Non-SPA). Premorbid functioning was measured with the Premorbid Adjustment Scale (PAS), and functioning during follow-up with the Global Assessment of Functioning (GAF) scale and the Social and Occupational Functioning Assessment Scale (SOFAS). RESULTS: A documented history of SPA was present in 24.8% of patients. Late SPA patients were more likely to be female (p = 0.010). Comparison with non-SPA patients revealed that: (1) both the Early and Late SPA groups showed poorer premorbid social functioning during early adolescence, and (2) while patients with Early SPA had a poorer functional level at follow-up, with lower GAF (p = 0.025) and lower SOFAS (p = 0.048) scores, Late SPA patients did not. CONCLUSION: Our results suggest a link between exposure to SPA and later impairment of social functioning before the onset of the disease. EP patients exposed to SPA before age 12 may present long-lasting functional impairment, while patients exposed at a later age may improve in this regard and have a better functional outcome.
Abstract:
Introduction: The literature suggests a link between exposure to traumatic experiences during childhood and deficits in functioning in patients suffering from psychosis. However, the impact of the age at which these experiences occurred has not been investigated in patients in the early phase of psychosis. Methods: Two hundred and twenty-five patients aged 18 to 35 years were assessed on entry into a specialized treatment programme for early psychosis (TIPP) and after 2, 6, 12, 18, 24, 30 and 36 months of treatment. Patients exposed to sexual and/or physical abuse (SPA) were classified according to age at first exposure (Early-SPA: before 11 years of age; Late-SPA: between 12 and 15 years of age) and compared with patients who had never been exposed to such experiences (Non-SPA). Premorbid functioning was measured with the Premorbid Adjustment Scale (PAS) and, during follow-up, with the Global Assessment of Functioning (GAF) and Social and Occupational Functioning Assessment Scale (SOFAS). Results: 24.8% of patients had been exposed to SPA. Patients in the Late-SPA group were more often women (p = 0.010). Comparisons with the Non-SPA group revealed that: (1) patients in the Early- and Late-SPA groups showed poorer premorbid social functioning during early adolescence, and (2) whereas patients in the Early-SPA group showed poorer functioning throughout follow-up on GAF (p = 0.025) and SOFAS (p = 0.048) scores, patients in the Late-SPA group showed no such differences from the non-exposed group. Conclusion: Our results suggest a link between exposure to SPA and later impairment of social functioning, before the onset of the disease. Patients in the early phase of psychosis exposed to SPA before the age of 12 show lasting functional impairment, whereas patients exposed to SPA later seem to improve in this respect and show a better capacity for recovery.
Abstract:
In this article, we show how the use of state-of-the-art methods in computer science based on machine perception and learning allows the unobtrusive capture and automated analysis of interpersonal behavior in real time (social sensing). Given the high ecological validity of the behavioral sensing, the ease of behavioral-cue extraction for large groups over long observation periods in the field, the possibility of investigating completely new research questions, and the ability to provide people with immediate feedback on behavior, social sensing will fundamentally impact psychology.