916 results for Multilevel Linear Models
Abstract:
BACKGROUND: How change comes about is hotly debated in psychotherapy research. One camp considers 'non-specific' or 'common factors', shared by different therapy approaches, as essential, whereas researchers of the other camp consider specific techniques as the essential ingredients of change. This controversy, however, suffers from unclear terminology and logical inconsistencies. The Taxonomy Project therefore aims at contributing to the definition and conceptualization of common factors of psychotherapy by analyzing their differential associations with standard techniques. METHODS: A review identified 22 common factors discussed in the psychotherapy research literature. We conducted a survey in which 68 psychotherapy experts assessed how common factors are implemented by specific techniques. Using hierarchical linear models, we predicted each common factor by techniques and by experts' age, gender and allegiance to a therapy orientation. RESULTS: Common factors differed considerably in their relevance for technique implementation. Patient engagement, Affective experiencing and Therapeutic alliance were judged most relevant. Common factors also differed with respect to how well they could be explained by the set of techniques. We present detailed profiles of all common factors by the (positively or negatively) associated techniques. There were indications of a biased taxonomy not covering the embodiment of psychotherapy (expressed by body-centred techniques such as progressive muscle relaxation, biofeedback training and hypnosis). Likewise, common factors did not adequately represent effective psychodynamic and systemic techniques. CONCLUSION: This taxonomic endeavour is a step towards the clarification of important core constructs of psychotherapy. KEY PRACTITIONER MESSAGE: This article relates standard techniques of psychotherapy (well known to practising therapists) to the change factors/change mechanisms discussed in psychotherapy theory.
It gives a short review of the current debate on the mechanisms by which psychotherapy works. We provide detailed profiles of change mechanisms and how they may be generated by practice techniques.
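The hierarchical linear model described above, with expert ratings nested within experts, can be sketched on synthetic data (the 68 experts match the abstract; all variable names, item counts, and effect sizes are illustrative assumptions, not the study's data):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Synthetic stand-in for the survey: 68 experts each rate how strongly a
# technique implements a common factor (names and sizes are hypothetical).
n_experts, n_items = 68, 10
df = pd.DataFrame({
    "expert": np.repeat(np.arange(n_experts), n_items),
    "technique_use": rng.normal(size=n_experts * n_items),
    "age": np.repeat(rng.integers(30, 70, n_experts), n_items),
})
# Ratings depend on technique use plus an expert-level random intercept.
expert_effect = np.repeat(rng.normal(0, 0.5, n_experts), n_items)
df["rating"] = (2.0 + 0.8 * df["technique_use"] + expert_effect
                + rng.normal(0, 0.3, len(df)))

# Hierarchical (multilevel) linear model: ratings nested within experts.
model = smf.mixedlm("rating ~ technique_use + age", df, groups=df["expert"])
result = model.fit()
print(result.params["technique_use"])  # slope estimate, near the true 0.8
```

The random intercept absorbs systematic rating tendencies of individual experts, so the technique coefficient reflects within-expert associations rather than expert-level response styles.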
Abstract:
BACKGROUND: Renal failure after thoracoabdominal aortic repair is a significant clinical problem. Distal aortic perfusion for organ and spinal cord protection requires cannulation of the left femoral artery. In 2006, we reported the finding that direct cannulation led to leg ischemia in some patients and was associated with increased renal failure. After this finding, we modified our perfusion technique to eliminate leg ischemia from cannulation. In this article, we present the effects of this change on postoperative renal function. METHODS: Between February 1991 and July 2008, we repaired 1464 thoracoabdominal aortic aneurysms. Distal aortic perfusion was used in 1088, and these were studied. Median patient age was 68 years, and 378 (35%) were women. In September 2006, we began to adopt a sidearm femoral cannulation technique that provides distal aortic perfusion while maintaining downstream flow to the leg. This was used in 167 patients (15%). We measured the joint effects of preoperative glomerular filtration rate (GFR) and cannulation technique on the highest postoperative creatinine level, postoperative renal failure, and death. Analysis was by multiple linear or logistic regression with interaction. RESULTS: The preoperative GFR was the strongest predictor of postoperative renal dysfunction and death. No significant main effects of sidearm cannulation were noted. For peak creatinine level and postoperative renal failure, however, strong interactions between preoperative GFR and sidearm cannulation were present, resulting in reductions of postoperative renal complications of 15% to 20% when GFR was <60 mL/min/1.73 m². For normal GFR, the effect was negated or even reversed at very high levels of GFR. Mortality, although not significantly affected by sidearm cannulation, showed a similar trend to the renal outcomes.
CONCLUSION: Use of sidearm cannulation is associated with a clinically important and highly statistically significant reduction in postoperative renal complications in patients with a low GFR. Reduced renal effect of skeletal muscle ischemia is the proposed mechanism. Effects among patients with good preoperative renal function are less clear. A randomized trial is needed.
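The interaction analysis described above, where preoperative GFR modifies the effect of cannulation technique on renal failure, can be sketched as a logistic regression with an interaction term on synthetic data (all coefficients and cohort values are invented for illustration):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)

# Synthetic cohort; coefficients are illustrative assumptions only.
n = 5000
gfr = rng.normal(70, 25, n).clip(10, 150)   # preoperative GFR, mL/min/1.73 m²
sidearm = rng.integers(0, 2, n)             # 1 = sidearm cannulation

# Build in an interaction: sidearm lowers risk mainly when GFR is low,
# with the benefit fading (or reversing) at high GFR.
logit = -1.0 - 0.03 * (gfr - 70) + sidearm * (-0.8 + 0.02 * (gfr - 70))
p = 1.0 / (1.0 + np.exp(-logit))
df = pd.DataFrame({"gfr": gfr, "sidearm": sidearm,
                   "renal_failure": rng.binomial(1, p)})

# Logistic regression with a GFR-by-cannulation interaction.
fit = smf.logit("renal_failure ~ gfr * sidearm", df).fit(disp=False)
print(fit.params[["sidearm", "gfr:sidearm"]])
```

A positive interaction coefficient alongside a negative sidearm main effect reproduces the pattern in the abstract: protection at low GFR, little or no effect at high GFR.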
Abstract:
Increased renal resistive index (RRI) has been recently associated with target organ damage and cardiovascular or renal outcomes in patients with hypertension and diabetes mellitus. However, reference values in the general population and information on familial aggregation are largely lacking. We determined the distribution of RRI, associated factors, and heritability in a population-based study. Families of European ancestry were randomly selected in 3 Swiss cities. Anthropometric parameters and cardiovascular risk factors were assessed. A renal Doppler ultrasound was performed, and RRI was measured in 3 segmental arteries of both kidneys. We used multilevel linear regression analysis to explore the factors associated with RRI, adjusting for center and family relationships. Sex-specific reference values for RRI were generated according to age. Heritability was estimated by variance components using the ASSOC program (SAGE software). Four hundred women (mean age±SD, 44.9±16.7 years) and 326 men (42.1±16.8 years) with normal renal ultrasound had mean RRI of 0.64±0.05 and 0.62±0.05, respectively (P<0.001). In multivariable analyses, RRI was positively associated with female sex, age, systolic blood pressure, and body mass index. We observed an inverse correlation with diastolic blood pressure and heart rate. Age had a nonlinear association with RRI. We found no independent association of RRI with diabetes mellitus, hypertension treatment, smoking, cholesterol levels, or estimated glomerular filtration rate. The adjusted heritability estimate was 42±8% (P<0.001). In a population-based sample with normal renal ultrasound, RRI normal values depend on sex, age, blood pressure, heart rate, and body mass index. The significant heritability of RRI suggests that genes influence this phenotype.
Abstract:
BACKGROUND: Risk factors and outcomes of bronchial stricture after lung transplantation are not well defined. An association between acute rejection and development of stricture has been suggested in small case series. We evaluated this relationship using a large national registry. METHODS: All lung transplantations between April 1994 and December 2008 per the United Network for Organ Sharing (UNOS) database were analyzed. Generalized linear models were used to determine the association between early rejection and development of stricture after adjusting for potential confounders. The association of stricture with postoperative lung function and overall survival was also evaluated. RESULTS: Nine thousand three hundred thirty-five patients were included for analysis. The incidence of stricture was 11.5% (1,077/9,335), with no significant change in incidence during the study period (P=0.13). Early rejection was associated with a significantly greater incidence of stricture (adjusted odds ratio [AOR], 1.40; 95% confidence interval [CI], 1.22-1.61; p<0.0001). Male sex, restrictive lung disease, and pretransplantation requirement for hospitalization were also associated with stricture. Those who experienced stricture had a lower postoperative peak percent predicted forced expiratory volume at 1 second (FEV1) (median 74% versus 86% for bilateral transplants only; p<0.0001), shorter unadjusted survival (median 6.09 versus 6.82 years; p<0.001) and increased risk of death after adjusting for potential confounders (adjusted hazard ratio 1.13; 95% CI, 1.03-1.23; p=0.007). CONCLUSIONS: Early rejection is associated with an increased incidence of stricture. Recipients with stricture demonstrate worse postoperative lung function and survival. Prospective studies may be warranted to further assess causality and the potential for coordinated rejection and stricture surveillance strategies to improve postoperative outcomes.
Abstract:
We examined outcomes and trends in surgery and radiation use for patients with locally advanced esophageal cancer, for whom optimal treatment is not clear. Trends in surgery and radiation for patients with T1-T3N1M0 squamous cell or adenocarcinoma of the mid or distal esophagus in the Surveillance, Epidemiology, and End Results database from 1998 to 2008 were analyzed using generalized linear models including year as a predictor; Surveillance, Epidemiology, and End Results does not record chemotherapy data. Local treatment was unimodal if patients had only surgery or radiation and bimodal if they had both. Five-year cancer-specific survival (CSS) and overall survival (OS) were analyzed using propensity-score adjusted Cox proportional-hazard models. Overall 5-year survival for the 3295 patients identified (mean age 65.1 years, standard deviation 11.0) was 18.9% (95% confidence interval: 17.3-20.7). Local treatment was bimodal for 1274 (38.7%) and unimodal for 2021 (61.3%) patients; 1325 (40.2%) had radiation alone and 696 (21.1%) underwent only surgery. The use of bimodal therapy (32.8-42.5%, P = 0.01) and radiation alone (29.3-44.5%, P < 0.001) increased significantly from 1998 to 2008. Bimodal therapy predicted improved CSS (hazard ratio [HR]: 0.68, P < 0.001) and OS (HR: 0.58, P < 0.001) compared with unimodal therapy. For the first 7 months (before the survival curves crossed), CSS after radiation therapy alone was similar to surgery alone (HR: 0.86, P = 0.12), while OS was worse for surgery only (HR: 0.70, P = 0.001). However, worse CSS (HR: 1.43, P < 0.001) and OS (HR: 1.46, P < 0.001) after that initial timeframe were found for radiation therapy only. The use of radiation to treat locally advanced mid and distal esophageal cancers increased from 1998 to 2008. Survival was best when both surgery and radiation were used.
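The survival comparison above relies on Cox proportional-hazards models; a minimal Cox regression along those lines (without the propensity-score step, and on invented data) can be sketched with statsmodels' `phreg`:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)

# Synthetic survival data (illustrative): bimodal = surgery plus radiation.
n = 1000
bimodal = rng.integers(0, 2, n)
age = rng.normal(65, 11, n)

# Exponential event times; bimodal therapy lowers the hazard (true HR ~ 0.6).
hazard = 0.1 * np.exp(-0.5 * bimodal + 0.01 * (age - 65))
event_time = rng.exponential(1.0 / hazard)
censor_time = rng.uniform(0, 15, n)          # administrative censoring

df = pd.DataFrame({
    "duration": np.minimum(event_time, censor_time),
    "event": (event_time <= censor_time).astype(int),
    "bimodal": bimodal,
    "age": age,
})

# Cox proportional-hazards model; status marks observed (uncensored) events.
res = smf.phreg("duration ~ bimodal + age", df, status=df["event"]).fit()
hr = np.exp(res.params[res.model.exog_names.index("bimodal")])
print(hr)  # estimated hazard ratio for bimodal therapy, near the built-in 0.6
```

In the actual study the treatment indicator would additionally be adjusted by a propensity score to account for non-random assignment to bimodal therapy.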
Abstract:
Introduction Research has shown that individuals infer their group-efficacy beliefs from their groups’ abilities to perform specific tasks. Group abilities also seem to affect team members’ performance motivation, adding a psychological advantage to teams already high in task-relevant abilities. In a recent study we found the effect of group abilities on individual performance motivation to be partially mediated by the team members’ individual group-efficacy beliefs, which is an example of how group-level attributes can affect individual-level parameters. Objectives The study aimed to test whether the direct and mediated effects of low group abilities on performance motivation can be reduced by augmenting the visibility of individual contributions to group performance via the inclusion of a separate ranking of individual performances. Method Forty-seven students (M=22.83 years, SD=2.83, 34% women) of the University of Bern participated in the study. At three collection points (t1-t3), subjects were provided information about fictive team members with whom they had to imagine performing a group triathlon. Three values (low, medium, high) of the other team members’ abilities to perform in their parts of the triathlon (swimming and biking) were combined in a 3×3 full factorial design, yielding nine groups with different ability profiles. At t1 subjects were asked to rate their confidence that the teams would perform well in the triathlon task; at t2 and t3 subjects were asked how motivated they were to perform at their best in the respective groups. At t3 the presence of an individual performance ranking was mentioned in the cover story. Mixed linear models (SPSS) and structural equation models for complex survey data (Mplus) were specified to estimate the effects of the individual performance rankings on the relationship between group-efficacy beliefs and performance motivation.
Results A significant interaction effect of individual group-efficacy beliefs and the triathlon condition on performance motivation was found; the effect of group-efficacy beliefs on performance motivation was smaller when individual performance rankings were available. The partial mediation of group attributes on performance motivation by group-efficacy beliefs disappeared with the announcement of individual performance rankings. Conclusion In teams low in task-relevant abilities, the disadvantageous effect of group-efficacy beliefs on performance motivation might be reduced by providing means of evaluating individual performances apart from the group’s overall performance. While a common group goal is believed to be a core criterion of a well-performing sport group, future studies should also examine the possible benefit of individualized goal setting in groups.
Abstract:
Background Nowadays there is extensive evidence showing the efficacy of cognitive remediation therapies. Integrative approaches seem superior regarding the maintenance of proximal outcomes at follow-up as well as generalization to other areas of functioning. To date, only limited evidence about the efficacy of CRT is available concerning older schizophrenia patients. The Integrated Neurocognitive Therapy (INT) represents a newly developed cognitive remediation approach. It is a manualized group therapy approach targeting all 11 NIMH-MATRICS dimensions within one therapy concept. In this study we compared the effects of INT on an early-course group (duration of disease <5 years) with a long-term group of schizophrenia outpatients (duration of disease >15 years). Methods In an international multicenter study carried out in Germany, Switzerland and Austria, a total of 90 outpatients diagnosed with schizophrenia (DSM-IV-TR) were randomly assigned either to INT or to Treatment-As-Usual (TAU). 50 of the 90 patients formed an Early-Course (EC) group, suffering from schizophrenia for less than 5 years (mean age=29 years, mean duration of illness=3.3 years). The other 40 formed a Long-term Course (LC) group, suffering from schizophrenia for longer than 15 years (mean age=45 years, mean duration of illness=22 years). Treatment comprised 15 biweekly sessions. An extensive assessment battery was conducted before and after treatment and at follow-up (1 year). Multivariate General Linear Models (GLM) (duration of illness × treatment × time) tested our hypothesis that an EC group of schizophrenia outpatients differs in proximal and distal outcome from an LC group. Results Irrespective of the duration of illness, both groups (EC & LC) were able to benefit from INT. INT was superior to TAU in most of the assessed domains. The dropout rate of the EC group (21.4%) was much higher than that of the LC group (8%) during the therapy phase.
However, interaction effects show that the LC group revealed significantly larger effects in the neurocognitive domains of speed of processing (F>3.6) and vigilance (F>2.4). In social cognition, the EC group showed significantly larger effects in social schema (F>2.5) and social attribution (blame; F>6.0) compared with the LC group. Regarding more distal outcomes, patients treated with INT obtained reduced general symptoms, unaffected by the duration of illness, during the therapy phase and at follow-up (F>4.3). Discussion Results suggest that INT is a valid goal-oriented treatment to improve cognitive functions in schizophrenia outpatients. Irrespective of the duration of illness, significant treatment effects were evident. Contrary to common expectations, long-term, more chronic patients showed larger effects in basal cognitive functions compared with younger patients and patients without any active therapy (TAU). Consequently, more integrated therapy offers are also recommended for long-term course schizophrenia patients.
Abstract:
BACKGROUND AND PURPOSE Visit-to-visit variability in systolic blood pressure (SBP) is associated with an increased risk of stroke and was reduced in randomized trials by calcium channel blockers and diuretics but not by renin-angiotensin system inhibitors. However, time of day effects could not be determined. Day-to-day variability on home BP readings predicts stroke risk and potentially offers a practical method of monitoring response to variability-directed treatment. METHODS SBP mean, maximum, and variability (coefficient of variation=SD/mean) were determined in 500 consecutive transient ischemic attack or minor stroke patients on 1-month home BP monitoring (3 BPs, 3× daily). Hypertension was treated to a standard protocol. Differences in SBP variability from 3 to 10 days before to 8 to 15 days after starting or increasing calcium channel blockers/diuretics versus renin-angiotensin system inhibitors versus both were compared by general linear models, adjusted for risk factors and baseline BP. RESULTS Among 288 eligible interventions, variability in SBP was reduced after increased treatment with calcium channel blockers/diuretics versus both versus renin-angiotensin system inhibitors (-4.0 versus 6.9 versus 7.8%; P=0.015), primarily because of effects on maximum SBP (-4.6 versus -1.0 versus -1.0%; P=0.001), with no differences in effect on mean SBP. Class differences were greatest for early-morning SBP variability (3.6 versus 17.0 versus 38.3; P=0.002) and maximum (-4.8 versus -2.0 versus -0.7; P=0.001), with no effect on midmorning (P=0.29), evening (P=0.65), or diurnal variability (P=0.92). CONCLUSIONS After transient ischemic attack or minor stroke, calcium channel blockers and diuretics reduced variability and maximum home SBP, primarily because of effects on morning readings. Home BP readings enable monitoring of response to SBP variability-directed treatment in patients with recent cerebrovascular events.
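The variability metric defined above, the coefficient of variation (SD/mean) over a month of home readings, is straightforward to compute; a sketch on simulated readings (the 3 × 3 × 30 diary layout follows the abstract, the values are illustrative, not patient data):

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated 1-month home SBP diary: 30 days × 3 daily sessions × 3 readings.
sbp = rng.normal(150, 12, size=(30, 3, 3))  # illustrative values, mmHg

# The paper's variability metric: coefficient of variation = SD / mean,
# plus the maximum SBP over the monitoring period.
mean_sbp = sbp.mean()
cv = sbp.std(ddof=1) / mean_sbp
max_sbp = sbp.max()

# Early-morning variability uses only the first daily session (index 0).
morning = sbp[:, 0, :]
cv_morning = morning.std(ddof=1) / morning.mean()

print(round(cv, 3), round(cv_morning, 3))
```

The within-class comparison in the study is then a before/after contrast of these per-patient summaries across drug classes, adjusted for risk factors and baseline BP.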
Abstract:
AIMS To estimate physical activity trajectories for people who quit smoking, and compare them to what would have been expected had smoking continued. DESIGN, SETTING AND PARTICIPANTS A total of 5115 participants in the Coronary Artery Risk Development in Young Adults Study (CARDIA) study, a population-based study of African American and European American people recruited at age 18-30 years in 1985/6 and followed over 25 years. MEASUREMENTS Physical activity was self-reported during clinical examinations at baseline (1985/6) and at years 2, 5, 7, 10, 15, 20 and 25 (2010/11); smoking status was reported each year (at examinations or by telephone, and imputed where missing). We used mixed linear models to estimate trajectories of physical activity under varying smoking conditions, with adjustment for participant characteristics and secular trends. FINDINGS We found significant interactions by race/sex (P = 0.02 for the interaction with cumulative years of smoking), hence we investigated the subgroups separately. Increasing years of smoking were associated with a decline in physical activity in black and white women and black men [e.g. coefficient for 10 years of smoking: -0.14; 95% confidence interval (CI) = -0.20 to -0.07, P < 0.001 for white women]. An increase in physical activity was associated with years since smoking cessation in white men (coefficient 0.06; 95% CI = 0 to 0.13, P = 0.05). The physical activity trajectory for people who quit diverged progressively towards higher physical activity from the expected trajectory had smoking continued. For example, physical activity was 34% higher (95% CI = 18 to 52%; P < 0.001) for white women 10 years after stopping compared with continuing smoking for those 10 years (P = 0.21 for race/sex differences). CONCLUSIONS Smokers who quit have progressively higher levels of physical activity in the years after quitting compared with continuing smokers.
Abstract:
BACKGROUND Estimating the prevalence of comorbidities and their associated costs in patients with diabetes is fundamental to optimizing health care management. This study assesses the prevalence and health care costs of comorbid conditions among patients with diabetes compared with patients without diabetes. Distinguishing potentially diabetes- and nondiabetes-related comorbidities in patients with diabetes, we also determined the most frequent chronic conditions and estimated their effect on costs across different health care settings in Switzerland. METHODS Using health care claims data from 2011, we calculated the prevalence and average health care costs of comorbidities among patients with and without diabetes in inpatient and outpatient settings. Patients with diabetes and comorbid conditions were identified using pharmacy-based cost groups. Generalized linear models with a negative binomial distribution were used to analyze the effect of comorbidities on health care costs. RESULTS A total of 932,612 persons, including 50,751 patients with diabetes, were enrolled. The most frequent potentially diabetes- and nondiabetes-related comorbidities in patients older than 64 years were cardiovascular diseases (91%), rheumatologic conditions (55%), and hyperlipidemia (53%). The mean total health care costs for diabetes patients varied substantially by comorbidity status (US$3,203-$14,223). Patients with diabetes and more than two comorbidities incurred US$10,584 higher total costs than patients without comorbidity. Costs were significantly higher in patients with diabetes and comorbid cardiovascular disease (US$4,788), hyperlipidemia (US$2,163), hyperacidity disorders (US$8,753), and pain (US$8,324) compared with those without the given condition. CONCLUSION Comorbidities in patients with diabetes are highly prevalent and have substantial consequences for medical expenditures. Interestingly, hyperacidity disorders and pain were the most costly conditions.
Our findings highlight the importance of developing strategies that meet the needs of patients with diabetes and comorbidities. Integrated diabetes care, such as that used in the Chronic Care Model, may represent a useful strategy.
Abstract:
OBJECTIVES Pre-antiretroviral therapy (ART) inflammation and coagulation activation predict clinical outcomes in HIV-positive individuals. We assessed whether pre-ART inflammatory marker levels predicted the CD4 count response to ART. METHODS Analyses were based on data from the Strategic Management of Antiretroviral Therapy (SMART) trial, an international trial evaluating continuous vs. interrupted ART, and the Flexible Initial Retrovirus Suppressive Therapies (FIRST) trial, evaluating three first-line ART regimens with at least two drug classes. For this analysis, participants had to be ART-naïve or off ART at randomization and (re)starting ART and have C-reactive protein (CRP), interleukin-6 (IL-6) and D-dimer measured pre-ART. Using random effects linear models, we assessed the association between each of the biomarker levels, categorized as quartiles, and change in CD4 count from ART initiation to 24 months post-ART. Analyses adjusted for CD4 count at ART initiation (baseline), study arm, follow-up time and other known confounders. RESULTS Overall, 1084 individuals [659 from SMART (26% ART naïve) and 425 from FIRST] met the eligibility criteria, providing 8264 CD4 count measurements. Seventy-five per cent of individuals were male, with a mean age of 42 years. The median (interquartile range) baseline CD4 counts were 416 (350-530) and 100 (22-300) cells/μL in SMART and FIRST, respectively. All of the biomarkers were inversely associated with baseline CD4 count in FIRST but not in SMART. In adjusted models, there was no clear relationship between pre-ART biomarker levels and mean change in CD4 count post-ART (P for trend: CRP, P = 0.97; IL-6, P = 0.25; and D-dimer, P = 0.29). CONCLUSIONS Pre-ART inflammation and coagulation activation do not predict the CD4 count response to ART and appear to influence the risk of clinical outcomes through mechanisms other than blunting long-term CD4 count gain.
Abstract:
1. The cover of plant species was recorded annually from 1988 to 2000 in nine spatially replicated plots in a species-rich, semi-natural meadow at Negrentino (southern Alps). This period showed large climatic variation and included the centennial maximum and minimum frequency of days with ≥ 10 mm of rain. 2. Changes in species composition were compared between three 4-year intervals characterized by increasingly dry weather (1988–91), a preceding extreme drought (1992–95), and increasingly wet weather (1997–2000). Redundancy analysis and ANOVA with repeated spatial replicates were used to find trends in vegetation data across time. 3. Recruitment capacity, the potential for fast clonal growth and seasonal expansion rate were determined for abundant taxa and tested in general linear models (GLM) as predictors for rates of change in relative cover of species across the climatically defined 4-year intervals. 4. Relative cover of the major growth forms present, graminoids and forbs, changed more in the period following extreme drought than at other times. Recruitment capacity was the only predictor of species’ rates of change. 5. Following perturbation, re-colonization was the primary driver of vegetation dynamics. The dominant grasses, which lacked high recruitment from seed, therefore decreased in relative abundance. This effect persisted until the end of the study and may represent a lasting response to an extreme climatic event.
Abstract:
Background Nowadays there is extensive evidence available showing the efficacy of cognitive remediation (CR). To date, only limited evidence is available about the impact of the duration of illness on CR effects. The Integrated Neurocognitive Therapy (INT) represents a newly developed CR approach. It is a manualized group therapy targeting all 11 NIMH-MATRICS domains. Methods In an international multicenter study, 166 schizophrenia outpatients (DSM-IV-TR) were randomly assigned either to INT or to Treatment-As-Usual (TAU). 60 patients were defined as the Early Course (EC) group, characterized by less than 5 years of illness; 40 patients were in the Long-Term (LT) group, characterized by more than 15 years of illness; and 76 patients were in the Medium-Long-Term (MLT) group, characterized by an illness duration of 5-15 years. Treatment comprised 15 biweekly sessions. Assessments were conducted before and after treatment and at follow-up (1 year). Multivariate General Linear Models (GLM) tested our hypothesis that the EC, LT, and MLT groups differ from each other in outcome under INT and TAU. Results First, the attendance rate of 65% was significantly lower and the drop-out rate of 18.5% during therapy was higher in the EC group compared with the other groups. Interaction effects regarding proximal outcome showed that the duration of illness has a strong impact on neurocognitive functioning in speed of processing (F>2.4) and attention (F>2.8). However, the INT intervention compared with TAU had a significant effect only in the more chronically ill patients of the MLT and LT groups, not in the younger patients of the EC group. In social cognitive domains, only the EC group showed a significant change in attribution (hostility; F>2.5); the LT and MLT groups did not. However, no differences between the 3 groups were evident in memory, problem solving, and emotion perception. Regarding more distal outcome, LT patients had more symptoms compared with EC (F>4.4).
Finally, EC patients showed greater improvements in psychosocial functioning compared with LT and MLT (F=1.8). Conclusions Contrary to common expectations, long-term, more chronically ill patients showed larger effects in basal cognitive functions compared with younger patients and patients without any active therapy (TAU). On the other hand, early-course patients had a greater potential to change in attribution, symptoms and psychosocial functioning. Consequently, more integrated therapy offers are also recommended for long-term course schizophrenia patients.
Abstract:
Aim Our aims were to compare the composition of testate amoeba (TA) communities from Santa Cruz Island, Galápagos Archipelago, which likely exist only as a result of anthropogenic habitat transformation, with similar naturally occurring communities from northern and southern continental peatlands. Additionally, we aimed to assess the importance of niche-based and dispersal-based processes in determining community composition and taxonomic and functional diversity. Location The humid highlands of the central island of Santa Cruz, Galápagos Archipelago. Methods We survey the alpha, beta and gamma taxonomic and functional diversities of TA, and the changes in functional traits along a gradient of wet to dry habitats. We compare the TA community composition, abundance and frequency recorded in the insular peatlands with those recorded in continental peatlands of the Northern and Southern Hemispheres. We use generalized linear models to determine how environmental conditions influence taxonomic and functional diversity as well as the mean values of functional traits within communities. We finally apply variance partitioning to assess the relative importance of niche- and dispersal-based processes in determining community composition. Results TA communities in Santa Cruz Island differed from their Northern Hemisphere and South American counterparts, with most genera considered characteristic of Northern Hemisphere and South American Sphagnum peatlands missing or very rare in the Galápagos. Functional traits were most strongly correlated with elevation and site topography, and alpha functional diversity with the type of material sampled and site topography. Community composition was more strongly correlated with spatial variables than with environmental ones. Main conclusions The TA communities of the Sphagnum peatlands of Santa Cruz Island and the mechanisms shaping these communities contrast with those of Northern Hemisphere and South American peatlands.
Soil moisture was not a strong predictor of community composition, most likely because rainfall and clouds provide sufficient moisture. Dispersal limitation was more important than environmental filtering because of the isolation of the insular peatlands from continental ones and the young ecological history of these ecosystems.