169 results for Load force
Abstract:
Knowledge of the quantitative genetics of resistance to parasitism is key to appraising host evolutionary responses to parasite selection. Here, we studied the effects of common origin (i.e. genetic and pre-hatching parental effects) and common rearing environment (i.e. post-hatching parental effects and other environmental effects) on variance in ectoparasite load in nestling Alpine swifts (Apus melba). This colonial bird is intensely parasitized by blood-sucking louse-flies that impair nestling development and survival. By cross-fostering half of the hatchlings between pairs of nests, we show a strong, significant effect of the common rearing environment on variance in the number of louse-flies per nestling (90.7% in 2002 and 90.9% in 2003) and no significant effect of common origin on this variance. In contrast, significant effects of common origin were found for all the nestling morphological traits under investigation (i.e. body mass, wing length, tail length, fork length and sternum length). Hence, our study suggests that genetic and pre-hatching parental effects play little role in the distribution of parasites among nestling Alpine swifts, and thus that nestlings have only limited scope for evolutionary responses against parasites. Our results highlight the need to take environmental factors, including the evolution of post-hatching parental effects such as nest sanitation, into consideration in our understanding of host-parasite relationships.
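A minimal sketch of the variance partitioning that such a cross-fostering design permits (illustrative only: the brood size, effect sizes and the crude ANOVA-style estimator below are assumptions, not the authors' data or statistical model, which would typically be a mixed model with nest-of-origin and nest-of-rearing as random effects):

```python
import numpy as np

rng = np.random.default_rng(0)

n_nests = 40          # nests contributing hatchlings (illustrative)
chicks_per_nest = 4   # brood size (illustrative)

# Simulate a dominant rearing-environment effect and a negligible origin effect,
# mirroring the pattern reported in the abstract (values are made up).
rearing_effect = rng.normal(0.0, 3.0, n_nests)   # e.g. nest sanitation, micro-habitat
origin_effect = rng.normal(0.0, 0.3, n_nests)    # genetic + pre-hatching parental effects

records = []
for origin in range(n_nests):
    for k in range(chicks_per_nest):
        # half of each brood is swapped into a partner nest (cross-fostering)
        rearing = origin if k < chicks_per_nest // 2 else (origin + 1) % n_nests
        load = 10 + rearing_effect[rearing] + origin_effect[origin] + rng.normal(0.0, 1.0)
        records.append((origin, rearing, load))

origin_id, rearing_id, load = (np.array(col) for col in zip(*records))

def variance_fraction(group_id, y):
    """Crude ANOVA-style estimate: share of total variance lying between group means."""
    means = np.array([y[group_id == g].mean() for g in np.unique(group_id)])
    return means.var() / y.var()

print(f"variance explained by rearing nest: {variance_fraction(rearing_id, load):.2f}")
print(f"variance explained by origin nest:  {variance_fraction(origin_id, load):.2f}")
```

Because half of each brood is swapped between nests, nest of origin and nest of rearing are decoupled, which is what allows the two variance fractions to be estimated separately.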
Abstract:
During an infection the antigen-nonspecific memory CD8 T cell compartment is not simply an inert pool of cells, but becomes activated and cytotoxic. It is unknown how these cells contribute to the clearance of an infection. We measured the strength of T cell receptor (TCR) signals that bystander-activated, cytotoxic CD8 T cells (BA-CTLs) receive in vivo and found evidence of limited TCR signaling. Given this marginal contribution of the TCR, we asked how BA-CTLs identify infected target cells. We show that target cells express NKG2D ligands following bacterial infection and demonstrate that BA-CTLs directly eliminate these target cells in an innate-like, NKG2D-dependent manner. Selective inhibition of BA-CTL-mediated killing led to a significant defect in pathogen clearance. Together, these data suggest an innate role for memory CD8 T cells in the early immune response before the onset of a de novo generated, antigen-specific CD8 T cell response.
Abstract:
BACKGROUND: Tendon transfers and calcaneal osteotomies are commonly used to treat symptoms related to medial ankle arthrosis in fixed pes cavovarus. However, the relative effects of these osteotomies on lateralizing the ground contact point of the hindfoot and redistributing ankle joint contact stresses are unknown. MATERIALS AND METHODS: Pes cavovarus with fixed hindfoot varus was simulated in eight cadaver specimens. The effects of three types of calcaneal osteotomy on the migration of the center of force and on the tibiotalar peak pressure under a 300 N static axial load (half body weight) were recorded using pressure sensors. RESULTS: A significant lateral shift of the center of force was observed: 4.9 mm for the laterally closing Z-shaped osteotomy with additional lateralization of the tuberosity, 3.4 mm for the lateral sliding osteotomy of the calcaneal tuberosity, and 2.7 mm for the laterally closing Z-shaped osteotomy (all p < 0.001). A significant peak pressure reduction was recorded: -0.53 MPa for the Z-shaped osteotomy with lateralization, -0.58 MPa for the lateral sliding osteotomy of the calcaneal tuberosity, and -0.41 MPa for the Z-shaped osteotomy (all p < 0.01). CONCLUSION: This cadaver study supports the hypothesis that lateralizing calcaneal osteotomies substantially help to normalize ankle contact stresses in pes cavovarus.
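For context, the center of force reported above is the pressure-weighted centroid of the sensor readings. The short sketch below shows how such a centroid, and its lateral migration between two conditions, can be computed; the grid size and toy pressure maps are invented for illustration and do not represent the study's measurement pipeline.

```python
import numpy as np

def center_of_force(pressure, cell_size_mm=1.0):
    """Pressure-weighted centroid (x, y) of a 2-D sensor grid, in mm."""
    p = np.asarray(pressure, dtype=float)
    ys, xs = np.indices(p.shape)
    total = p.sum()
    return (p * xs).sum() / total * cell_size_mm, (p * ys).sum() / total * cell_size_mm

# Toy pressure maps: a Gaussian contact patch whose peak moves 3 cells (3 mm) laterally.
ys, xs = np.indices((20, 20))

def patch(cx):
    return np.exp(-((xs - cx) ** 2 + (ys - 10) ** 2) / 8.0)

before, after = patch(8.0), patch(11.0)
dx = center_of_force(after)[0] - center_of_force(before)[0]
print(f"lateral shift of the center of force: {dx:.1f} mm")   # ~3.0 mm
```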
Abstract:
This study aimed to examine the effects of a 5-h hilly run on ankle plantar flexor (PF) and dorsal flexor (DF) force and fatigability. It was hypothesised that DF fatigue/fatigability would be greater than PF fatigue/fatigability. Eight male long-distance trail runners (42.5 ± 5.9 years) were tested before and after the run for ankle PF and DF maximal voluntary isokinetic contraction strength, fatigue resistance (percent decrement score), and maximal voluntary and electrically evoked isometric contraction strength. Maximal EMG root mean square (RMS(max)) and mean power frequency (MPF) values of the tibialis anterior (TA), gastrocnemius lateralis (GL) and soleus (SOL) EMG activity were calculated. The peak torque of the potentiated high- and low-frequency doublets and the ratio of paired-stimulation peak torques at 10 Hz over 100 Hz (Db10:100) were analysed for PF. Maximal voluntary isometric contraction strength of PF decreased from pre- to post-run (-17.0 ± 6.2%; P < 0.05), but no significant decrease was evident for DF (-7.9 ± 6.2%). Maximal voluntary isokinetic contraction strength and fatigue resistance remained unchanged for both PF and DF. RMS(max) of SOL during maximal voluntary isometric contraction and RMS(max) of TA during maximal voluntary isokinetic contraction were decreased (P < 0.05) after the run. For MPF, a significant decrease was found for TA (P < 0.05), and the Db10:100 ratio decreased for PF (-6.5 ± 6.0%; P < 0.05). In conclusion, significant isometric strength loss was detected only for PF after the 5-h hilly run and was partly due to low-frequency fatigue. This study contradicted the hypothesis that neuromuscular alterations due to prolonged hilly running are predominant for the DF.
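Restating the low-frequency fatigue index named above in notation (the symbols are chosen here for clarity and are not taken from the paper): Db10:100 is simply the quotient of the potentiated doublet peak torques evoked at the two stimulation frequencies, and a post-exercise fall in this ratio is the marker of low-frequency fatigue referred to in the conclusion.

```latex
\[
\mathrm{Db}_{10:100} \;=\; \frac{T^{\mathrm{doublet}}_{10\,\mathrm{Hz}}}{T^{\mathrm{doublet}}_{100\,\mathrm{Hz}}}
\]
```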
Abstract:
Disorders of language, spatial perception, attention, memory, calculation and praxis are a frequent consequence of acquired brain damage [in particular, stroke and traumatic brain injury (TBI)] and a major determinant of disability. The rehabilitation of aphasia and, more recently, of other cognitive disorders is an important area of neurological rehabilitation. We report here a review of the available evidence on the effectiveness of cognitive rehabilitation. Given the limited number and generally low quality of randomized clinical trials (RCTs) in this area of therapeutic intervention, the Task Force considered, besides the available Cochrane reviews, lower-class evidence, which was critically analysed until a consensus was reached. In particular, we considered evidence from small-group or single-case studies that included an appropriate statistical evaluation of effect sizes. The general conclusion is that there is evidence to award a grade A, B or C recommendation to some forms of cognitive rehabilitation in patients with neuropsychological deficits in the post-acute stage after a focal brain lesion (stroke, TBI). These include aphasia therapy, rehabilitation of unilateral spatial neglect (ULN), attentional training in the post-acute stage after TBI, the use of electronic memory aids in memory disorders, and the treatment of apraxia with compensatory strategies. There is clearly a need for adequately designed studies in this area, which should take into account specific problems such as patient heterogeneity and treatment standardization.
Abstract:
FRAX(®) is a fracture risk assessment algorithm developed by the World Health Organization in cooperation with other medical organizations and societies. Using easily available clinical information and, when available, femoral neck bone mineral density (BMD) measured by dual-energy X-ray absorptiometry (DXA), FRAX(®) predicts the 10-year probability of hip fracture and of major osteoporotic fracture. These values may be included in country-specific guidelines to help clinicians determine when fracture risk is sufficiently high that the patient is likely to benefit from pharmacological therapy to reduce that risk. Since the introduction of FRAX(®) into clinical practice, many practical clinical questions have arisen regarding its use. To address such questions, the International Society for Clinical Densitometry (ISCD) and the International Osteoporosis Foundation (IOF) assigned task forces to review the best available medical evidence and make recommendations for optimal use of FRAX(®) in clinical practice. Questions were identified and divided into three general categories. A task force was assigned to investigate the medical evidence in each category and to develop clinically useful recommendations. The BMD Task Force addressed issues that included the potential use of skeletal sites other than the femoral neck, the use of technologies other than DXA, and the deletion or addition of clinical data for FRAX(®) input. The evidence and recommendations were presented to a panel of experts at the ISCD-IOF FRAX(®) Position Development Conference, resulting in the development of ISCD-IOF Official Positions addressing FRAX(®)-related issues.
Abstract:
OBJECTIVES: An article by the Swiss AIDS Commission states that patients with stably suppressed viraemia [i.e. several successive HIV-1 RNA plasma concentrations (viral loads, VL) below the limits of detection during 6 months or more of highly active antiretroviral therapy (HAART)] are unlikely to be infectious. Questions then arise: how reliable is the undetectability of the VL, given the history of previous measurements? What factors determine reliability? METHODS: We assessed the probability (henceforth termed reliability) that the (n+1)th VL would remain below 50 or 1000 HIV-1 RNA copies/mL when the nth one had been <50 copies/mL, in 6168 patients of the Swiss HIV Cohort Study who were continuing to take HAART between 2003 and 2007. Generalized estimating equations were used to analyse potential factors of reliability. RESULTS: With a cut-off at 50 copies/mL, reliability was 84.5% (n=1), increasing to 94.5% (n=5). Compliance, the current type of HAART and the first antiretroviral therapy (ART) received (HAART or not) were predictive factors of reliability. With a cut-off at 1000 copies/mL, reliability was 97.5% (n=1), increasing to 99.1% (n=4). Chart review revealed that, in 72.2% of these cases, patients had stopped their treatment, admitted to major problems with compliance or were taking non-HAART ART. Viral escape caused by resistance was found in 5.6%. No explanation was found in the charts of 22.2% of cases. CONCLUSIONS: After several successive VLs at <50 copies/mL, reliability reaches approximately 94% with a cut-off at 50 copies/mL and approximately 99% with a cut-off at 1000 copies/mL. Compliance is the most important factor predicting reliability.
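A rough sketch of how such a "reliability" curve could be tabulated from longitudinal viral-load records. The column names, the 90-day sampling interval and the toy values are invented for illustration; the study itself analysed predictors of reliability with generalized estimating equations rather than the raw proportions computed here.

```python
import pandas as pd

def reliability_by_run_length(df, cutoff=50, max_n=5):
    """df columns: patient_id, date, vl (HIV-1 RNA copies/mL), one row per measurement."""
    df = df.sort_values(["patient_id", "date"])
    rows = []
    for _, g in df.groupby("patient_id"):
        vl = g["vl"].to_numpy()
        run = 0  # consecutive measurements below the cut-off so far
        for current, nxt in zip(vl[:-1], vl[1:]):
            run = run + 1 if current < cutoff else 0
            if run >= 1:
                rows.append({"n": min(run, max_n), "next_below_cutoff": nxt < cutoff})
    out = pd.DataFrame(rows)
    # empirical probability that the (n+1)th VL stays below the cut-off, by run length n
    return out.groupby("n")["next_below_cutoff"].mean()

# Toy data for two hypothetical patients (values invented for illustration).
toy = pd.DataFrame({
    "patient_id": [1] * 6 + [2] * 4,
    "date": list(pd.date_range("2003-01-01", periods=6, freq="90D"))
            + list(pd.date_range("2003-01-01", periods=4, freq="90D")),
    "vl": [40, 20, 20, 30, 600, 20, 20, 20, 20, 20],
})
print(reliability_by_run_length(toy))
```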
Abstract:
OBJECTIVES: Toll-like receptors (TLRs) are innate immune sensors that are integral to resisting chronic and opportunistic infections. Mounting evidence implicates TLR polymorphisms in susceptibilities to various infectious diseases, including HIV-1. We investigated the impact of TLR single nucleotide polymorphisms (SNPs) on clinical outcome in a seroincident cohort of HIV-1-infected volunteers. DESIGN: We analyzed TLR SNPs in 201 antiretroviral treatment-naive HIV-1-infected volunteers from a longitudinal seroincident cohort with regular follow-up intervals (median follow-up 4.2 years, interquartile range 4.4). Participants were stratified into two groups according to either disease progression, defined as peripheral blood CD4(+) T-cell decline over time, or peak and setpoint viral load. METHODS: Haplotype tagging SNPs from TLR2, TLR3, TLR4, and TLR9 were detected by mass array genotyping, and CD4(+) T-cell counts and viral load measurements were determined prior to antiretroviral therapy initiation. The association of TLR haplotypes with viral load and rapid progression was assessed by multivariate regression models using age and sex as covariates. RESULTS: Two TLR4 SNPs in strong linkage disequilibrium [1063 A/G (D299G) and 1363 C/T (T399I)] were more frequent among individuals with high peak viral load compared with low/moderate peak viral load (odds ratio 6.65, 95% confidence interval 2.19-20.46, P < 0.001; adjusted P = 0.002 for 1063 A/G). In addition, a TLR9 SNP previously associated with slow progression was found less frequently among individuals with high viral setpoint compared with low/moderate setpoint (odds ratio 0.29, 95% confidence interval 0.13-0.65, P = 0.003, adjusted P = 0.04). CONCLUSION: This study suggests a potentially new role for TLR4 polymorphisms in HIV-1 peak viral load and confirms a role for TLR9 polymorphisms in disease progression.
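A hedged sketch of the kind of model implied by "multivariate regression models using age and sex as covariates": a logistic regression of group membership on carrier status, reported as an odds ratio with a 95% confidence interval. The variable names and the randomly generated placeholder data are assumptions, not the cohort data, and the original haplotype analysis was richer than this.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 201  # cohort size quoted in the abstract; the data below are random placeholders
df = pd.DataFrame({
    "high_peak_vl": rng.integers(0, 2, n),   # 1 = high peak viral load group
    "tlr4_carrier": rng.integers(0, 2, n),   # 1 = carries the tagged TLR4 haplotype
    "age": rng.normal(38, 10, n),
    "sex": rng.integers(0, 2, n),
})

# Logistic regression adjusted for age and sex; exponentiated coefficients are odds ratios.
fit = smf.logit("high_peak_vl ~ tlr4_carrier + age + sex", data=df).fit(disp=False)
or_table = np.exp(pd.concat([fit.params, fit.conf_int()], axis=1))
or_table.columns = ["odds_ratio", "ci_low", "ci_high"]
print(or_table.loc["tlr4_carrier"])
```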
Abstract:
Recent data from AFM studies of nucleoprotein complexes of different types are reviewed in this paper. The first section describes progress in sample preparation methods for AFM studies of nucleic acids and nucleoprotein complexes. The second part reviews AFM data on complexes of DNA with regulatory proteins. These studies cover two different types of DNA distortion induced by protein binding: local bending of DNA at sites of protein binding, and formation of large loops due to protein-protein interactions between molecules bound to distant sites along the DNA molecule (DNA looping). The prospects for the use of AFM in the physical mapping of genomes are also discussed in this section. The third part reviews data on complexes of DNA with non-sequence-specific binding proteins. Special emphasis is given to studies of chromatin, which have resulted in progress in understanding the structure of the native chromatin fiber. In this section, novel data from AFM studies of RecA-DNA filaments and of complexes of dsRNA with the dsRNA-specific protein p25 are also presented. The substrate preparation procedures relevant to AFM studies of nucleoprotein complexes are discussed in the final section.
Abstract:
Human-induced habitat fragmentation constitutes a major threat to biodiversity. Both genetic and demographic factors combine to drive small and isolated populations into extinction vortices. Nevertheless, the deleterious effects of inbreeding and drift load may depend on population structure, migration patterns, and mating systems, and are difficult to predict in the absence of crossing experiments. We performed stochastic individual-based simulations aimed at predicting the effects of deleterious mutations on population fitness (offspring viability and median time to extinction) under a variety of settings (landscape configurations, migration models, and mating systems) on the basis of easy-to-collect demographic and genetic information. When all simulations were pooled, a large part (70%) of the variance in offspring viability was explained by a combination of genetic structure (F(ST)) and within-deme heterozygosity (H(S)). A similar part of the variance in median time to extinction was explained by a combination of local population size (N) and heterozygosity (H(S)). In both cases, predictive power increased above 80% when information on mating systems was available. These results provide robust predictive models for evaluating the viability prospects of fragmented populations.
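For readers less familiar with the two genetic predictors named above, the sketch below shows how H_S and F_ST are obtained for a single biallelic locus from per-deme allele frequencies (a textbook definition with equal deme sizes assumed; it is not the authors' simulation code, which derived these quantities from individual-based genotypes):

```python
import numpy as np

def hs_fst(deme_allele_freqs):
    """H_S and F_ST for one biallelic locus, from the allele frequency in each deme."""
    p = np.asarray(deme_allele_freqs, dtype=float)
    hs = np.mean(2.0 * p * (1.0 - p))        # mean expected heterozygosity within demes
    p_bar = p.mean()                          # pooled allele frequency
    ht = 2.0 * p_bar * (1.0 - p_bar)          # total expected heterozygosity
    fst = (ht - hs) / ht if ht > 0 else 0.0   # F_ST = (H_T - H_S) / H_T
    return hs, fst

hs, fst = hs_fst([0.1, 0.5, 0.9])  # three demes with divergent allele frequencies
print(f"H_S = {hs:.3f}, F_ST = {fst:.3f}")
```

Strong differentiation among demes drives F_ST up while depressing H_S, which is why the two statistics together capture much of the variance in offspring viability in the simulations.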