992 results for CART analysis


Relevance: 60.00%

Abstract:

Background: Individual signs and symptoms are of limited value for the diagnosis of influenza. Objective: To develop a decision tree for the diagnosis of influenza based on a classification and regression tree (CART) analysis. Methods: Data from two previous similar cohort studies were assembled into a single dataset. The data were randomly divided into a development set (70%) and a validation set (30%). We used CART analysis to develop three models that maximize the number of patients who do not require diagnostic testing prior to treatment decisions. The validation set was used to evaluate overfitting of the model to the training set. Results: Model 1 has seven terminal nodes based on temperature, the onset of symptoms and the presence of chills, cough and myalgia. Model 2 was a simpler tree with only two splits based on temperature and the presence of chills. Model 3 was developed with temperature as a dichotomous variable (≥38°C) and had only two splits based on the presence of fever and myalgia. The areas under the receiver operating characteristic curve (AUROCC) for the development and validation sets, respectively, were 0.82 and 0.80 for Model 1, 0.75 and 0.76 for Model 2, and 0.76 and 0.77 for Model 3. Model 2 classified 67% of patients in the validation group into a high- or low-risk group, compared with only 38% for Model 1 and 54% for Model 3. Conclusions: A simple decision tree (Model 2) classified two-thirds of patients as low or high risk and had an AUROCC of 0.76. After further validation in an independent population, this CART model could support clinical decision making regarding influenza, with low-risk patients requiring no further evaluation for influenza and high-risk patients being candidates for empiric symptomatic or drug therapy.
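Model 2's two-split structure lends itself to a direct encoding: a split on temperature, then on chills. The sketch below is a minimal illustration, assuming a 38.0°C cutpoint; the abstract reports only which variables appear at each split, not the fitted split values.

```python
def classify_influenza_risk(temperature_c: float, has_chills: bool) -> str:
    """Sketch of a two-split decision tree in the spirit of Model 2.

    The 38.0 C cutpoint is an illustrative assumption: the abstract
    describes the tree's structure (temperature, then chills) but does
    not report the fitted threshold.
    """
    if temperature_c >= 38.0:
        return "high risk"   # candidates for empiric symptomatic or drug therapy
    if has_chills:
        return "high risk"
    return "low risk"        # no further evaluation for influenza


print(classify_influenza_risk(38.5, False))
print(classify_influenza_risk(37.0, True))
print(classify_influenza_risk(37.0, False))
```

Such a hand-coded rule is only for exposition; the fitted tree would come from a CART procedure run on the development set.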

Relevance: 60.00%

Abstract:

Background: In Cambodia, malaria transmission is low and most cases occur in forested areas. Seroepidemiological techniques can be used to identify both areas of ongoing transmission and high-risk groups to be targeted by control interventions. This study utilizes repeated cross-sectional data to assess the risk of being malaria sero-positive at two consecutive time points during the rainy season and investigates who is most likely to sero-convert over the transmission season. Methods: In 2005, two cross-sectional surveys, one in the middle and the other at the end of the malaria transmission season, were carried out in two ecologically distinct regions in Cambodia. Parasitological and serological data were collected in four districts. Antibodies to Plasmodium falciparum Glutamate Rich Protein (GLURP) and Plasmodium vivax Merozoite Surface Protein-1₁₉ (MSP-1₁₉) were detected using Enzyme Linked Immunosorbent Assay (ELISA). The force of infection was estimated using a simple catalytic model fitted using maximum likelihood methods. Risks for sero-converting during the rainy season were analysed using the Classification and Regression Tree (CART) method. Results: A total of 804 individuals participating in both surveys were analysed. The overall parasite prevalence was low (4.6% and 2.0% for P. falciparum and 7.9% and 6.0% for P. vivax in August and November, respectively). P. falciparum force of infection was higher in the eastern region and increased between August and November, whilst P. vivax force of infection was higher in the western region and remained similar in both surveys. In the western region, malaria transmission changed very little across the season (for both species). CART analysis for P. falciparum in the east highlighted age, ethnicity, village of residence and forest work as important predictors for malaria exposure during the rainy season. Adults were more likely to increase their antibody responses to P. falciparum during the transmission season than children, whilst members of the Charay ethnic group demonstrated the largest increases. Discussion: In areas of low transmission intensity, such as in Cambodia, the analysis of longitudinal serological data enables a sensitive evaluation of transmission dynamics. Consecutive serological surveys allow an insight into spatio-temporal patterns of malaria transmission. The use of CART enabled multiple interactions to be accounted for simultaneously and permitted risk factors for exposure to be clearly identified.
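The simple catalytic model mentioned above treats seroconversion as a constant-rate process, so the probability of being seropositive at age a is 1 − exp(−λa), where λ is the force of infection. A minimal maximum-likelihood fit of λ can be sketched on synthetic data; the true λ, the age range and the coarse grid search below are all illustrative assumptions, not values from the study.

```python
import math
import random

def neg_log_lik(lam, data):
    """Negative log-likelihood of the simple catalytic model:
    P(seropositive at age a) = 1 - exp(-lam * a)."""
    ll = 0.0
    for age, pos in data:
        p = 1.0 - math.exp(-lam * age)
        ll += math.log(p) if pos else math.log(1.0 - p)
    return -ll

def fit_foi(data):
    """Estimate the force of infection by a coarse grid search
    (illustrative; a real analysis would use a proper optimizer
    and report confidence intervals)."""
    grid = [i / 1000.0 for i in range(1, 500)]
    return min(grid, key=lambda lam: neg_log_lik(lam, data))

# Synthetic cross-sectional serology with an assumed true force of infection
random.seed(1)
true_lam = 0.05  # per year of age (invented for illustration)
data = [(age, random.random() < 1.0 - math.exp(-true_lam * age))
        for age in range(1, 61) for _ in range(20)]
print(fit_foi(data))
```

With enough individuals per age group, the grid-search estimate lands close to the assumed λ, which is the same logic the study applies to its observed age-seroprevalence data.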

Relevance: 60.00%

Abstract:

Aim: Our aim was to discriminate different species of Pinus via pollen analysis in order to assess the responses of particular pine species to orbital and millennial-scale climate changes, particularly during the last glacial period. Location: Modern pollen grains were collected from current pine populations along transects from the Pyrenees to southern Iberia and the Balearic Islands. Fossil pine pollen was recovered from the south-western Iberian margin core MD95-2042. Methods: We measured a set of morphological traits of modern pollen from the Iberian pine species Pinus nigra, P. sylvestris, P. halepensis, P. pinea and P. pinaster and of fossil pine pollen from selected samples of the last glacial period and the early to mid-Holocene. Classification and regression tree (CART) analysis was used to establish a model from the modern dataset that discriminates pollen from the different pine species and allows identification of fossil pine pollen at the species level. Results: The CART model was effective in separating pollen of P. nigra and P. sylvestris from that of the Mediterranean pine group (P. halepensis, P. pinea and P. pinaster). The pollen of Pinus nigra diverged from that of P. sylvestris by having a more flattened corpus. Predictions using this model suggested that fossil pine pollen is mainly from P. nigra in all the samples analysed. Pinus sylvestris was more abundant in samples from Greenland stadials than Heinrich stadials, whereas Mediterranean pines increased in samples from Greenland interstadials and during the early to mid-Holocene. Main conclusions: Morphological parameters can be successfully used to increase the taxonomic resolution of fossil pine pollen at the species level for the highland pines (P. nigra and P. sylvestris) and at the species-group level for the Mediterranean pines. Our study indicates that P. nigra was the dominant component of the last glacial south-western/central Iberian pinewoods, although the species composition of these woodlands varied in response to abrupt climate changes.
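The core step of a CART analysis like the one above is the search for the split on a trait that best separates the classes, typically by minimizing Gini impurity. A minimal sketch of that single step follows; the "flatness" measurements and their separation between species are invented for illustration, not the study's data.

```python
def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def best_split(xs, ys):
    """Find the threshold on one trait minimizing weighted Gini impurity;
    CART applies this search recursively over all traits to grow the tree."""
    best = (None, float("inf"))
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if score < best[1]:
            best = (t, score)
    return best

# Hypothetical corpus-flatness measurements: lower values = more flattened
# corpus, as the abstract reports for P. nigra relative to P. sylvestris.
flatness = [0.61, 0.63, 0.65, 0.66, 0.74, 0.76, 0.78, 0.80]
species = ["nigra"] * 4 + ["sylvestris"] * 4
threshold, impurity = best_split(flatness, species)
print(threshold, impurity)
```

On this toy data a single threshold separates the two species perfectly (impurity 0); on real morphological data CART would combine several such splits across traits.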

Relevance: 60.00%

Abstract:

Thesis (Master's)--University of Washington, 2016-06

Relevance: 40.00%

Abstract:

INTRODUCTION: Tolerability and convenience are crucial aspects for the long-term success of combined antiretroviral therapy (cART). The aim of this study was to investigate the impact, in routine clinical practice, of switching to the single-tablet regimen (STR) RPV/FTC/TDF in patients with intolerance to previous cART, in terms of patients' well-being, assessed by several validated measures. METHODS: Prospective, multicenter study. Adult HIV-infected patients with a viral load under 1,000 copies/mL while receiving stable ART for at least the previous three months, who switched to RPV/FTC/TDF due to intolerance of the previous regimen, were included. Analyses were performed by intention-to-treat (ITT). Presence/magnitude of symptoms (ACTG-HIV Symptom Index), quality of life (EQ-5D, EUROQoL & MOS-HIV), adherence (SMAQ), preference of treatment and perceived ease of medication (ESTAR) were assessed through 48 weeks. RESULTS: An interim analysis of 125 patients with 16 weeks of follow-up was performed. One hundred (80%) were male; mean age was 46 years. Mean CD4 at baseline was 629.5±307.29 and 123 (98.4%) had a viral load <50 copies/mL; 15% were HCV co-infected. Ninety-two (73.6%) patients switched from an NNRTI (84.8% from EFV/FTC/TDF) and 33 (26.4%) from a PI/r. The most frequent reasons for switching were psychiatric disorders (51.2%), CNS adverse events (40.8%), gastrointestinal (19.2%) and metabolic disorders (19.2%). At the time of this analysis (week 16), four patients (3.2%) had discontinued treatment: one due to adverse events, two virologic failures and one with no data. A total of 104 patients (83.2%) were virologically suppressed (<50 copies/mL). The average degree of discomfort on the ACTG-HIV Symptom Index significantly decreased from baseline (21±15.55) to week 4 (10.89±12.36) and week 16 (10.81±12.62), p<0.001. In all patients, quality of life tools showed a significant benefit in well-being (Table 1). Adherence to therapy significantly and progressively increased (SMAQ) from baseline (54.4%) to week 4 (68%), p<0.001, and to week 16 (72.0%), p<0.001. CONCLUSIONS: Switching to RPV/FTC/TDF from another ARV regimen due to toxicity significantly improved the quality of life of HIV-infected patients, in both mental and physical components, and improved adherence to therapy while maintaining a good immune and virological response.

Relevance: 40.00%

Abstract:

The definition of knowledge as justified true belief is the best we presently have. However, the canonical tripartite analysis of knowledge does not do justice to it due to a Platonic conception of a priori truth that puts the cart before the horse. Within a pragmatic approach, I argue that by doing away with a priori truth, namely by submitting truth to justification, and by accordingly altering the canonical analysis of knowledge, this is a fruitful definition. So fruitful indeed that it renders the Gettier counterexamples vacuous, allowing positive work in epistemology and related disciplines.

Relevance: 30.00%

Abstract:

Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.

Relevance: 30.00%

Abstract:

BACKGROUND: Adherence to combination antiretroviral therapy (cART) is a dynamic process; however, changes in adherence behavior over time are insufficiently understood. METHODS: Data on self-reported missed doses of cART were collected every 6 months from Swiss HIV Cohort Study participants. We identified behavioral groups associated with specific cART adherence patterns using trajectory analyses. Repeated-measures logistic regression identified predictors of changes in adherence between consecutive visits. RESULTS: Six thousand seven hundred and nine individuals completed 49,071 adherence questionnaires [median 8 (interquartile range: 5-10)] during a median follow-up time of 4.5 years (interquartile range: 2.4-5.1). Individuals were clustered into 4 adherence groups: good (51.8%), worsening (17.4%), improving (17.6%), and poor adherence (13.2%). Independent predictors of worsening adherence were younger age, basic education, loss of a roommate, starting intravenous drug use, increasing alcohol intake, depression, longer time with HIV, onset of lipodystrophy, and changing care provider. Independent predictors of improvements in adherence were regimen simplification, changing class of cART, less time on cART, and starting comedications. CONCLUSIONS: Treatment, behavioral changes, and life events influence patterns of drug intake in HIV patients. Clinical care providers should routinely monitor factors related to worsening adherence and intervene early to reduce the risk of treatment failure and drug resistance.

Relevance: 30.00%

Abstract:

BACKGROUND: There is an ongoing debate as to whether combined antiretroviral treatment (cART) during pregnancy is an independent risk factor for prematurity in HIV-1-infected women. OBJECTIVE: The aim of the study was to examine (1) crude effects of different ART regimens on prematurity, (2) the association between duration of cART and duration of pregnancy, and (3) the role of possibly confounding risk factors for prematurity. METHOD: We analysed data from 1180 pregnancies prospectively collected by the Swiss Mother and Child HIV Cohort Study (MoCHiV) and the Swiss HIV Cohort Study (SHCS). RESULTS: Odds ratios for prematurity in women receiving mono/dual therapy and cART were 1.8 [95% confidence interval (CI) 0.85-3.6] and 2.5 (95% CI 1.4-4.3) compared with women not receiving ART during pregnancy (P=0.004). In a subgroup of 365 pregnancies with comprehensive information on maternal clinical, demographic and lifestyle characteristics, there was no indication that maternal viral load, age, ethnicity or history of injecting drug use affected prematurity rates associated with the use of cART. Duration of cART before delivery was also not associated with duration of pregnancy. CONCLUSION: Our study indicates that confounding by maternal risk factors or duration of cART exposure is not a likely explanation for the effects of ART on prematurity in HIV-1-infected women.

Relevance: 30.00%

Abstract:

OBJECTIVE: To investigate whether HIV-infected patients on a stable and fully suppressive combination antiretroviral therapy (cART) regimen could safely be monitored less often than the current recommendation of every 3 months. DESIGN: Two thousand two hundred and forty patients from the EuroSIDA study who maintained a stable and fully suppressed cART regimen for 1 year were included in the analysis. METHODS: The risk of treatment failure, defined by viral rebound, fall in CD4 cell count, development of a new AIDS-defining illness, serious opportunistic infection or death, in the 12 months following a year of a stable and fully suppressed regimen was assessed. RESULTS: One hundred and thirty-one (6%) patients experienced treatment failure in the 12 months following a year of stable therapy; viral rebound occurred in 99 (4.6%) patients. After 3, 6 and 12 months, patients had a 0.3% [95% confidence interval (CI) 0.1-0.5], 2.2% (95% CI 1.6-2.8) and 6.0% (95% CI 5.0-7.0) risk of treatment failure, respectively. Patients who spent more than 80% of their time on cART with fully suppressed viraemia prior to baseline had a 38% reduced risk of treatment failure, hazard ratio 0.62 (95% CI 0.42-0.90, P = 0.01). CONCLUSION: Patients who have responded well to cART and are on a well-tolerated and durably fully suppressive cART regimen have a low chance of experiencing treatment failure in the next 3-6 months. Therefore, in this subgroup of otherwise healthy patients, it may be reasonable to extend visit intervals to 6 months, with cost and time savings to both the treating clinics and the patients.

Relevance: 30.00%

Abstract:

Background: HIV-1 infection increases plasma levels of inflammatory markers. Combination antiretroviral therapy (cART) does not restore inflammatory markers to normal levels. Since intensification of cART with raltegravir reduced CD8 T-cell activation in the Discor-Ral and IntegRal studies, we have evaluated the effect of raltegravir intensification on several soluble inflammation markers in these studies. Methods: Longitudinal plasma samples (0-48 weeks) from the IntegRal (n = 67; 22 control and 45 intensified individuals) and Discor-Ral (44 individuals with CD4 T-cell counts <350 cells/µl; 14 control and 30 intensified) studies were assayed for 25 markers. The Mann-Whitney, Wilcoxon and Spearman tests and linear mixed models were used for analysis. Results: At baseline, different inflammatory markers were strongly associated with HCV co-infection, lower CD4 counts and the cART regimen (being higher in PI-treated individuals), but correlated poorly with markers of residual viral replication. Although raltegravir intensification reduced inflammation in individuals with lower CD4 T-cell counts, no effect of intensification was observed on plasma markers of inflammation in a global analysis. An association was found, however, between reductions in immune activation and plasma levels of the coagulation marker D-dimer, which decreased exclusively in intensified patients on protease inhibitor (PI)-based cART regimens (P = 0.040). Conclusions: The inflammatory profile in treated HIV-infected individuals showed a complex association with HCV co-infection, CD4 T-cell levels and the cART regimen. Raltegravir intensification specifically reduced D-dimer levels in PI-treated patients, highlighting the link between cART composition and residual viral replication; however, raltegravir had little effect on other inflammatory markers.

Relevance: 30.00%

Abstract:

The complete version of this thesis is available only for individual consultation at the Music Library of the Université de Montréal (www.bib.umontreal.ca/MU).

Relevance: 30.00%

Abstract:

Therapeutic drug monitoring is recommended for dose adjustment of immunosuppressive agents. A growing number of studies support the relevance of using the area under the concentration-time curve (AUC) as a biomarker for the therapeutic monitoring of cyclosporine (CsA) in hematopoietic stem cell transplantation. However, for reasons inherent to how the AUC is calculated, its use in the clinical setting is impractical. Limited sampling strategies, based on regression approaches (R-LSS) or Bayesian approaches (B-LSS), represent practical alternatives for satisfactory estimation of the AUC. For these methodologies to be applied effectively, however, their design must accommodate clinical reality, notably by requiring a minimal number of concentrations collected over a short sampling window. Particular attention should also be paid to their adequate development and validation. It is also worth noting that irregularity in blood sample collection times can have a non-negligible impact on the predictive performance of R-LSS; to date, this impact has not been studied. This doctoral thesis addresses these issues in order to allow precise and practical estimation of the AUC. The studies were carried out in the context of CsA use in pediatric patients who had undergone hematopoietic stem cell transplantation. First, multiple regression and population pharmacokinetic (Pop-PK) approaches were used constructively to develop and adequately validate LSS. Next, several Pop-PK models were evaluated, keeping in mind their intended use for AUC estimation. The performance of B-LSS targeting different versions of the AUC was also studied. Finally, the impact of deviations between actual blood sampling times and the planned nominal times on the predictive performance of R-LSS was quantified using a simulation approach covering diverse, realistic scenarios representing potential errors in the blood sampling schedule. This work first led to the development of R-LSS and B-LSS with satisfactory clinical performance that are also practical, as they involve 4 or fewer sampling points obtained within 4 hours post-dose. The Pop-PK analysis retained a two-compartment structural model with a lag time. The final model, including covariates, did not, however, improve B-LSS performance compared with the structural models without covariates. In addition, we showed that B-LSS perform better for the AUC derived from simulated concentrations that exclude residual error, which we termed the "underlying AUC", than for the observed AUC calculated directly from measured concentrations. Finally, our results demonstrated that irregularity in blood sample collection times has an important impact on the predictive performance of R-LSS; this impact depends on the number of samples required, but even more on the duration of the sampling process involved. We also showed that sampling-time errors made when the concentration is changing rapidly are those that most affect the predictive power of R-LSS. More interestingly, we highlighted that even if different R-LSS can perform similarly when based on nominal times, their tolerance to sampling-time errors can differ widely. Indeed, adequate consideration of the impact of these errors can lead to more reliable selection and use of R-LSS. Through an in-depth investigation of various aspects underlying limited sampling strategies, this thesis provides notable methodological improvements and proposes new avenues for ensuring their reliable and informed use, while promoting their suitability for clinical practice.
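The R-LSS idea described above, predicting the full AUC from a regression on a few early concentrations, can be sketched on synthetic data. The mono-exponential kinetics, sampling times and single-predictor model below are illustrative assumptions, not the thesis's fitted models (published CsA strategies use several timepoints within 4 hours post-dose).

```python
import math

def auc_trapezoid(times, concs):
    """Reference AUC by the linear trapezoidal rule over the full profile."""
    return sum((t2 - t1) * (c1 + c2) / 2
               for (t1, c1), (t2, c2) in zip(zip(times, concs),
                                             zip(times[1:], concs[1:])))

def fit_rlss(profiles, sample_time):
    """Fit a one-predictor R-LSS, AUC ~ b0 + b1 * C(sample_time),
    by ordinary least squares (deliberately minimal for illustration)."""
    xs = [concs[times.index(sample_time)] for times, concs in profiles]
    ys = [auc_trapezoid(times, concs) for times, concs in profiles]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    b1 = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
          / sum((x - xbar) ** 2 for x in xs))
    return ybar - b1 * xbar, b1

# Synthetic mono-exponential profiles C(t) = C0 * exp(-k t), sampled 0-12 h
times = [0, 1, 2, 4, 6, 8, 12]
profiles = [(times, [c0 * math.exp(-0.25 * t) for t in times])
            for c0 in (200, 300, 400, 500)]
b0, b1 = fit_rlss(profiles, sample_time=2)
est = b0 + b1 * profiles[0][1][times.index(2)]
print(round(est, 1), round(auc_trapezoid(*profiles[0]), 1))
```

On these idealized profiles the 2-hour concentration predicts the AUC exactly; the thesis's point is precisely that on real data such strategies must be validated, and that deviations from the nominal sampling times degrade this prediction.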

Relevance: 30.00%

Abstract:

Combination antiretroviral therapy (cART) aims to inhibit viral replication, delay the progression of immunodeficiency and improve survival in AIDS patients. The objective of this study was to compare two different cART schemes, based on plasma viral load (VL) and CD4+ T lymphocyte count, during 48 weeks of treatment. For this purpose, 472 medical charts of a Specialized Outpatient Service were reviewed from 1998 to 2005. Of these, 58 AIDS patients who had received a triple-drug scheme as initial treatment were included in the study, forming two groups. Group 1 (G1): 47 individuals treated with two nucleoside reverse-transcriptase inhibitors (NRTI) and one non-nucleoside reverse-transcriptase inhibitor; Group 2 (G2): 11 patients treated with two NRTI and one protease inhibitor. In G1 and G2, 53.2% and 81.8% respectively were patients with an AIDS-defining disease. The CD4+ T lymphocyte count increased progressively up to the 24th week of treatment in all patients, while VL became undetectable in 68.1% of G1 and 63.6% of G2. The study concluded that the evolution of laboratory tests was similar in the two treatment groups and that both presented a favorable clinical evolution.

Relevance: 30.00%

Abstract:

This dissertation analyses the strategies of small rural producers organized into cooperatives in pursuit of secure work and income in the municipality of Cametá, Pará. It first considers the influence of their conditions of social, productive and economic integration, and the encouragement provided by the Prelazia and advisory institutions that reinforce this form of organization. Along this path, the Rural Workers' Union (STR) motivated struggles to improve local socioeconomic conditions in Cametá. Theoretical reflection and practical experience show that these relationships have been articulated in the context of the transformations that have affected the municipality, aggravated since the implementation of large projects in the Amazon, such as the Tucuruí hydroelectric dam, which significantly altered this population's way of life. Rural workers, collectively organized, thus began to work to overcome these difficulties through productive activities as a strategy for sustainable local development, securing the means to produce and market fruit, with emphasis on açaí but also other products such as cassava flour and, more recently, oilseeds. The dissertation therefore develops a socio-productive characterization of Cametá, analyses the emergence of the STR-Cametá and the creation of CART, focusing on its relation to networked marketing strategies through the organization of the Marketing Consortium and the Federation of Family Farming and Solidarity Economy Cooperatives (FECAFES) as an instrument of strategic productive valorization for the organization of Cametá's small rural producers.