54 results for Errors-in-variables model


Relevance:

100.00%

Publisher:

Abstract:

INTRODUCTION: Video records are widely used to analyze performance in alpine skiing at the professional or amateur level. Part of these analyses requires the labeling of certain movements (i.e. determining when specific events occur). Although differences among coaches, and for the same coach between different dates, are expected, they have never been quantified. Knowing these differences is essential to determine which parameters can reliably be used. This study aimed to quantify the precision and repeatability of video labeling by alpine skiing coaches of various levels, as has been done in other fields (Koo et al, 2005). METHODS: Software similar to commercial products was designed to allow video analyses. 15 coaches divided into 3 groups (5 amateur coaches (G1), 5 professional instructors (G2) and 5 semi-professional coaches (G3)) were enrolled. They were asked to label 15 timing parameters (TP) according to the Swiss ski manual (Terribilini et al, 2001) for each curve. TP included phases (initiation, steering I-II) and body and ski movements (e.g. rotation, weighting, extension, balance). Three video sequences sampled at 25 Hz were used and one curve per video was labeled. The first video was used to familiarize the analyzer with the software. The two other videos, corresponding to slalom and giant slalom, were considered for the analysis. G1 performed the analysis twice (A1 and A2) on different dates, and TP were randomized between both analyses. Reference TP were defined as the median of G2 and G3 at A1. Precision was defined as the RMS difference between individual TP and reference TP, whereas repeatability was calculated as the RMS difference between individual TP at A1 and at A2. RESULTS AND DISCUSSION: For G1, G2 and G3, precisions of ±5.6, ±3.0 and ±2.0 frames, respectively, were obtained. These results, showing that G2 was more precise than G1 and G3 more precise than G2, were in accordance with group levels. The repeatability for G1 was ±3.1 frames. Furthermore, differences in precision among TP were observed for G2 and G3, with the largest difference of ±5.9 frames for "body counter rotation movement in steering phase II" and of 0.8 frames for "ski unweighting in initiation phase". CONCLUSION: This study quantified coaches' ability to label video in terms of precision and repeatability. The best precision, obtained for G3, was ±0.08 s, which corresponds to ±6.5% of the curve cycle. For repeatability, we obtained ±0.12 s for G1, corresponding to ±12% of the curve cycle. The repeatability of G2 and G3 is expected to be lower than the precision of G1, and the corresponding repeatability will be assessed soon. In conclusion, our results indicate that the labeling of video records is reliable for some TP, whereas caution is required for others. REFERENCES: Koo S, Gold MD, Andriacchi TP. (2005). Osteoarthritis, 13, 782-789. Terribilini M, et al. (2001). Swiss Ski manual, 29-46. IASS, Lucerne.
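For reference, the precision and repeatability metrics defined above reduce to root-mean-square (RMS) differences between labeled frame indices. A minimal sketch of that computation (the frame labels below are invented for illustration, not data from the study):

```python
import numpy as np

def rms_difference(labels, reference):
    """Root-mean-square difference between two sets of labeled frame indices."""
    labels = np.asarray(labels, dtype=float)
    reference = np.asarray(reference, dtype=float)
    return np.sqrt(np.mean((labels - reference) ** 2))

# Hypothetical frame indices for one coach versus the group reference (video sampled at 25 Hz).
coach_a1 = [12, 41, 75, 104]   # coach's labels at analysis A1
coach_a2 = [14, 39, 77, 101]   # same coach, analysis A2
reference = [10, 42, 73, 105]  # median of the expert groups at A1

precision = rms_difference(coach_a1, reference)     # precision vs. reference TP
repeatability = rms_difference(coach_a1, coach_a2)  # repeatability between A1 and A2

# Frames convert to seconds at the 25 Hz sampling rate used in the study.
print(f"precision: ±{precision:.1f} frames (±{precision / 25:.2f} s)")
print(f"repeatability: ±{repeatability:.1f} frames (±{repeatability / 25:.2f} s)")
```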

Relevance:

100.00%

Publisher:

Abstract:

Cutaneous leishmaniasis (CL) caused by Leishmania aethiopica is a public health and social problem, with sequelae of severe and mutilating skin lesions. It is manifested in three forms: localized CL (LCL), mucosal CL (MCL) and diffuse CL (DCL). Unresponsiveness to sodium stibogluconate (Sb(V)) is common in Ethiopian CL patients. Using the amastigote-macrophage in vitro model, the susceptibility of 24 clinical isolates of L. aethiopica derived from untreated patients was investigated. Eight strains from LCL, 9 from MCL, and 7 from DCL patients, together with a reference strain (MHOM/ET/82/117/82), were tested against four antileishmanial drugs: amphotericin B, miltefosine, Sb(V) and paromomycin. In the same order of drugs, IC50 (μg/ml ± SD) values for the 24 strains tested were 0.16±0.18, 5.88±4.79, 10.23±8.12, and 13.63±18.74. The susceptibility of isolates originating from the 3 categories of patients did not differ for any of the 4 drugs (p>0.05). Maximal efficacy was superior for miltefosine across all the strains. Further susceptibility testing could validate miltefosine as a potential alternative drug in cases of sodium stibogluconate treatment failure in CL patients.
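For context, an IC50 such as those reported above is typically estimated by fitting a sigmoidal concentration-response curve to growth-inhibition data from the assay. A minimal sketch of such a fit (the concentrations, inhibition values, and Hill-type model are illustrative assumptions, not data from this study):

```python
import numpy as np
from scipy.optimize import curve_fit

def hill_curve(conc, ic50, hill):
    """Fraction of growth inhibited at a given drug concentration (Hill model)."""
    return conc ** hill / (ic50 ** hill + conc ** hill)

# Hypothetical drug concentrations (µg/ml) and measured inhibition fractions.
conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
inhibition = np.array([0.05, 0.15, 0.40, 0.70, 0.90, 0.97])

(ic50, hill), _ = curve_fit(hill_curve, conc, inhibition, p0=[1.0, 1.0])
print(f"estimated IC50: {ic50:.2f} µg/ml (Hill slope {hill:.2f})")
```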

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Voltage-gated sodium channel dysregulation is important for the hyperexcitability leading to pain persistence. Sodium channel blockers currently used to treat neuropathic pain are poorly tolerated, and bringing new molecules into clinical use is laborious. Here we propose a drug already marketed as an anticonvulsant, rufinamide. METHODS: We compared the behavioral effect of rufinamide to that of amitriptyline using the Spared Nerve Injury neuropathic pain model in mice. Using in vitro patch clamp, we compared the effects of rufinamide, amitriptyline and mexiletine on sodium currents in cells expressing the voltage-gated sodium channel Nav1.7 isoform and in dissociated dorsal root ganglion neurons. RESULTS: In naive mice, amitriptyline (20 mg/kg) increased the withdrawal threshold to mechanical stimulation from 1.3 (0.6-1.9) (median [95% CI]) to 2.3 g (2.2-2.5) and the latency of withdrawal to heat stimulation from 13.1 (10.4-15.5) to 30.0 s (21.8-31.9), whereas rufinamide had no effect. Rufinamide and amitriptyline alleviated injury-induced mechanical allodynia for 4 h (maximal effect: 0.10 ± 0.03 g (mean ± SD) to 1.99 ± 0.26 g for rufinamide and 0.25 ± 0.22 g to 1.92 ± 0.85 g for amitriptyline). All drugs reduced peak current and stabilized the inactivated state of voltage-gated sodium channel Nav1.7, with similar effects in dorsal root ganglion neurons. CONCLUSIONS: At doses alleviating neuropathic pain, amitriptyline altered behavioral responses, possibly related to altered basal pain sensitivity, a sedative effect, or both. Side effects and drug tolerance/compliance are major problems with drugs such as amitriptyline. Rufinamide appears to have a better tolerability profile and could be a new alternative to explore for the treatment of neuropathic pain.

Relevance:

100.00%

Publisher:

Abstract:

PURPOSE: To prospectively compare various parameters of vessels imaged at 3 T by using time-of-flight (TOF) and T2-prepared magnetic resonance (MR) angiography in a rabbit model of hind limb ischemia. MATERIALS AND METHODS: Experiments were approved by the institutional animal care and use committee. Endovascular occlusion of the left superficial femoral artery was induced in 14 New Zealand white rabbits. After 2 weeks, MR angiography and conventional (x-ray) angiography were performed. Vessel sharpness was evaluated visually in the ischemic and nonischemic limbs, and the presence of small collateral vessels was evaluated in the ischemic limbs. Vessel sharpness was also quantified by evaluating the magnitude of signal intensity change at the vessel borders. RESULTS: The sharpness of vessels in the nonischemic limbs was similar between the TOF and the T2-prepared images. In the ischemic limbs, however, T2-prepared imaging, as compared with TOF imaging, generated higher vessel sharpness in arteries with diminished blood flow (mean vessel sharpness: 44% vs 30% for popliteal arteries, 45% vs 28% for saphenous arteries; P < .001 for both comparisons) and enabled better detection of small collateral vessels (93% vs 36% of vessels, P < .001). CONCLUSION: T2-prepared imaging can facilitate high-spatial-resolution MR angiography of small vessels with low blood flow and thus has potential as a tool for noninvasive evaluation of arteriogenic therapies, without use of contrast material. Supplemental material: http://radiology.rsnajnls.org/cgi/content/full/2452062067/DC1.
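The paper quantifies vessel sharpness from the magnitude of signal intensity change at the vessel borders; one common way to express such a measure is the maximum intensity change per pixel along a profile drawn across the vessel, normalized by the vessel-to-background contrast. A rough sketch of that idea (the profile values and the exact normalization are assumptions, not the authors' implementation):

```python
import numpy as np

def vessel_sharpness(profile):
    """Edge sharpness of a 1-D signal-intensity profile across a vessel:
    maximum intensity change per pixel, as a fraction of total contrast."""
    profile = np.asarray(profile, dtype=float)
    contrast = profile.max() - profile.min()
    max_gradient = np.max(np.abs(np.diff(profile)))
    return max_gradient / contrast  # 1.0 = ideal step edge; lower = blurrier border

# Hypothetical signal intensities sampled across a small artery.
profile = [10, 12, 30, 80, 95, 96, 94, 70, 25, 11, 10]
print(f"vessel sharpness: {vessel_sharpness(profile):.0%}")
```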

Relevance:

100.00%

Publisher:

Abstract:

Continuous positive airway pressure, aimed at preventing pulmonary atelectasis, has been used for decades to reduce lung injury in critically ill patients. In neonatal practice, it is increasingly used worldwide as a primary form of respiratory support due to its low cost and because it reduces the need for endotracheal intubation and conventional mechanical ventilation. We studied the anesthetized rat in vivo and determined the optimal circuit design for delivery of continuous positive airway pressure. We then investigated the effects of continuous positive airway pressure following lipopolysaccharide administration in the anesthetized rat. Whereas neither continuous positive airway pressure nor lipopolysaccharide alone caused lung injury, continuous positive airway pressure applied following intravenous lipopolysaccharide resulted in increased microvascular permeability, elevated cytokine protein and mRNA production, and impaired static compliance. A dose-response relationship was demonstrated whereby higher levels of continuous positive airway pressure (up to 6 cmH2O) caused greater lung injury. Lung injury was attenuated by pretreatment with dexamethasone. These data demonstrate that, despite optimal circuit design, continuous positive airway pressure causes significant lung injury (proportional to the airway pressure) in the setting of circulating lipopolysaccharide. Although we would currently avoid direct extrapolation of these findings to clinical practice, we believe that, in the context of increasing clinical use, these data are grounds for concern and warrant further investigation.

Relevance:

100.00%

Publisher:

Abstract:

When researchers introduce a new test, they have to demonstrate that it is valid, using unbiased designs and suitable statistical procedures. In this article we use Monte Carlo analyses to highlight how incorrect statistical procedures (i.e., stepwise regression, extreme scores analyses) or ignoring regression assumptions (e.g., heteroscedasticity) contribute to erroneous validity estimates. Beyond these demonstrations, and as an example, we re-examined the results reported by Warwick, Nettelbeck, and Ward (2010) concerning the validity of the Ability Emotional Intelligence Measure (AEIM). Warwick et al. used the wrong statistical procedures to conclude that the AEIM was incrementally valid beyond intelligence and personality traits in predicting various outcomes. In our re-analysis, we found that the reliability-corrected multiple correlation of their measures with personality and intelligence was up to .69. Using robust statistical procedures and appropriate controls, we also found that the AEIM did not predict incremental variance in GPA, stress, loneliness, or well-being, demonstrating the importance of testing for validity rather than merely looking for it.
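The reliability-corrected multiple correlation mentioned above follows the classical correction for attenuation, which divides an observed correlation by the square root of the product of the two measures' reliabilities. A small illustration (the numbers below are made up and are not Warwick et al.'s data):

```python
import math

def correct_for_attenuation(r_observed, reliability_x, reliability_y):
    """Classical correction of a correlation for attenuation due to measurement error."""
    return r_observed / math.sqrt(reliability_x * reliability_y)

# Hypothetical observed correlation and reliabilities of the two score composites.
r_obs = 0.50           # observed multiple correlation
rel_test = 0.75        # reliability of the new test's scores
rel_predictors = 0.85  # reliability of the intelligence/personality composite

print(f"reliability-corrected correlation: "
      f"{correct_for_attenuation(r_obs, rel_test, rel_predictors):.2f}")
```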

Relevance:

100.00%

Publisher:

Abstract:

The evolution of continuous traits is the central component of comparative analyses in phylogenetics, and the comparison of alternative models of trait evolution has greatly improved our understanding of the mechanisms driving phenotypic differentiation. Several factors influence the comparison of models, and we explore the effects of random errors in trait measurement on the accuracy of model selection. We simulate trait data under a Brownian motion model (BM) and introduce different magnitudes of random measurement error. We then evaluate the resulting statistical support for this model against two alternative models: Ornstein-Uhlenbeck (OU) and accelerating/decelerating rates (ACDC). Our analyses show that even small measurement errors (10%) consistently bias model selection towards erroneous rejection of BM in favour of more parameter-rich models (most frequently the OU model). Fortunately, methods that explicitly incorporate measurement errors in phylogenetic analyses considerably improve the accuracy of model selection. Our results call for caution in interpreting the results of model selection in comparative analyses, especially when complex models garner only modest additional support. Importantly, as measurement errors occur in most trait data sets, we suggest that estimation of measurement errors should always be performed during comparative analysis to reduce chances of misidentification of evolutionary processes.
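A deliberately simplified sketch of this kind of model comparison on a single simulated lineage (not a full phylogenetic analysis: the one-lineage setup, parameter values, and AIC comparison are illustrative assumptions):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)

# Simulate a Brownian-motion (BM) trait trajectory along one lineage, then add
# independent Gaussian measurement error to each observation.
n_steps, dt, sigma_bm, error_sd = 200, 1.0, 1.0, 0.5
true_trait = np.cumsum(rng.normal(0.0, sigma_bm * np.sqrt(dt), n_steps))
observed = true_trait + rng.normal(0.0, error_sd, n_steps)

def neg_loglik_bm(params, x):
    """BM: successive increments are iid Normal(0, sigma^2 * dt)."""
    (log_sigma,) = params
    scale = np.exp(log_sigma) * np.sqrt(dt)
    return -norm.logpdf(np.diff(x), 0.0, scale).sum()

def neg_loglik_ou(params, x):
    """OU (optimum at 0): x_{t+dt} | x_t is Normal with a mean-reverting transition."""
    log_sigma, log_alpha = params
    sigma, alpha = np.exp(log_sigma), np.exp(log_alpha)
    mean = x[:-1] * np.exp(-alpha * dt)
    var = sigma ** 2 / (2 * alpha) * (1 - np.exp(-2 * alpha * dt))
    return -norm.logpdf(x[1:], mean, np.sqrt(var)).sum()

def aic(neg_loglik, n_params):
    return 2 * n_params + 2 * neg_loglik

fit_bm = minimize(neg_loglik_bm, x0=[0.0], args=(observed,))
fit_ou = minimize(neg_loglik_ou, x0=[0.0, -2.0], args=(observed,))
print(f"AIC, BM: {aic(fit_bm.fun, 1):.1f}   AIC, OU: {aic(fit_ou.fun, 2):.1f}")
# Repeating this with error_sd = 0 versus error_sd > 0 illustrates the study's point:
# unmodelled measurement error tends to shift support towards the richer model.
```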

Relevance:

100.00%

Publisher:

Abstract:

When individuals learn by trial-and-error, they perform randomly chosen actions and then reinforce those actions that led to a high payoff. However, individuals do not always have to physically perform an action in order to evaluate its consequences. Rather, they may be able to mentally simulate actions and their consequences without actually performing them. Such fictitious learners can select actions with high payoffs without making long chains of trial-and-error learning. Here, we analyze the evolution of an n-dimensional cultural trait (or artifact) by learning, in a payoff landscape with a single optimum. We derive the stochastic learning dynamics of the distance to the optimum in trait space when choice between alternative artifacts follows the standard logit choice rule. We show that for both trial-and-error and fictitious learners, the learning dynamics stabilize at an approximate distance of √(n/(2λ_e)) away from the optimum, where λ_e is an effective learning performance parameter depending on the learning rule under scrutiny. Individual learners are thus unlikely to reach the optimum when traits are complex (n large), and so face a barrier to further improvement of the artifact. We show, however, that this barrier can be significantly reduced in a large population of learners performing payoff-biased social learning, in which case λ_e becomes proportional to population size. Overall, our results illustrate the effects of errors in learning, levels of cognition, and population size for the evolution of complex cultural traits.
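The logit choice rule referenced above selects between a current and a candidate artifact with probabilities proportional to the exponential of their payoffs. A toy sketch of trial-and-error learning under this rule (the payoff landscape, step size, and parameter values are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

def payoff(trait, optimum):
    """Single-peaked payoff landscape: higher payoff closer to the optimum."""
    return -np.sum((trait - optimum) ** 2)

def logit_choice(current, candidate, optimum, lam):
    """Keep the candidate with probability exp(lam*pi_cand) / (exp(lam*pi_cand) + exp(lam*pi_cur))."""
    diff = lam * (payoff(candidate, optimum) - payoff(current, optimum))
    p_adopt = 1.0 / (1.0 + np.exp(-diff))
    return candidate if rng.random() < p_adopt else current

# Trial-and-error learning of an n-dimensional artifact towards a single optimum.
n, lam, steps = 10, 1.0, 5000
optimum = np.zeros(n)
trait = rng.normal(0.0, 1.0, n)                   # random starting artifact
for _ in range(steps):
    candidate = trait + rng.normal(0.0, 0.1, n)   # randomly modified artifact
    trait = logit_choice(trait, candidate, optimum, lam)

print(f"distance to optimum after learning: {np.linalg.norm(trait - optimum):.2f}")
```

The residual distance printed at the end is a toy illustration of the barrier to further improvement discussed in the abstract.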

Relevance:

100.00%

Publisher:

Abstract:

INTRODUCTION: Hip fractures are responsible for excess mortality, decreasing the 5-year survival rate by about 20%. From an economic perspective, they represent a major source of expense, with direct costs in hospitalization, rehabilitation, and institutionalization. The incidence rate increases sharply after the age of 70, but it can be reduced in women aged 70-80 years by therapeutic interventions. Recent analyses suggest that the most efficient strategy is to implement such interventions in women at the age of 70 years. As several guidelines recommend bone mineral density (BMD) screening of postmenopausal women with clinical risk factors, our objective was to assess the cost-effectiveness of two screening strategies applied to elderly women aged 70 years and older. METHODS: A cost-effectiveness analysis was performed using decision-tree analysis and a Markov model. Two alternative strategies, one measuring BMD in all women and one measuring BMD only in those having at least one risk factor, were compared with the reference strategy of no screening. Cost-effectiveness ratios were expressed as cost per year gained without hip fracture. Most probabilities were based on data observed in the EPIDOS, SEMOF and OFELY cohorts. RESULTS: In this model, which is mostly based on observed data, the "screen all" strategy was more cost-effective than "screen women at risk". For one woman screened at the age of 70 and followed for 10 years, the incremental (additional) cost-effectiveness ratios of these two strategies compared with the reference were 4,235 euros and 8,290 euros, respectively. CONCLUSION: The results of this model, under the assumptions described in the paper, suggest that in women aged 70-80 years, screening all women with dual-energy X-ray absorptiometry (DXA) would be more effective than no screening or screening only women with at least one risk factor. Cost-effectiveness studies based on decision-analysis trees may be useful tools for helping decision makers, and further models based on different assumptions should be performed to improve the level of evidence on cost-effectiveness ratios of the usual screening strategies for osteoporosis.
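The incremental cost-effectiveness ratios quoted above are the extra cost of a strategy divided by its extra effectiveness relative to the reference strategy. A minimal illustration of the ratio (all cost and effectiveness figures are invented, not estimates from the EPIDOS, SEMOF or OFELY cohorts):

```python
def icer(cost, effect, cost_ref, effect_ref):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of effect
    (here, per year gained without hip fracture) versus the reference strategy."""
    return (cost - cost_ref) / (effect - effect_ref)

# Hypothetical 10-year cost (euros) and years without hip fracture per woman screened.
no_screening = {"cost": 1000.0, "effect": 9.50}
screen_all = {"cost": 1400.0, "effect": 9.60}

ratio = icer(screen_all["cost"], screen_all["effect"],
             no_screening["cost"], no_screening["effect"])
print(f"ICER of 'screen all' vs no screening: {ratio:.0f} euros per year without hip fracture")
```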

Relevance:

100.00%

Publisher:

Abstract:

A 3D in vitro model of rat organotypic brain cell cultures in aggregates was used to investigate neurotoxicity mechanisms in glutaric aciduria type I (GA-I). 1 mM glutarate (GA) or 3-hydroxyglutarate (3OHGA) was repeatedly added to the culture media at two different time points. In cultures treated with 3OHGA, we observed an increase in lactate in the medium, pointing to a possible inhibition of the Krebs cycle and respiratory chain. We further observed that 3OHGA, and to a lesser extent GA, induced an increase in ammonia production with a concomitant decrease in glutamine concentrations, which may suggest an inhibition of the astrocytic enzyme glutamine synthetase. These previously unreported findings may uncover a pathogenic mechanism in this disease with deleterious effects on early stages of brain development. By immunohistochemistry we showed that 3OHGA increased non-apoptotic cell death. At the cellular level, 3OHGA, and to a lesser extent GA, led to cell swelling and loss of astrocytic fibers, whereas a loss of oligodendrocytes was only observed for 3OHGA. We conclude that 3OHGA was the most toxic metabolite in our model of GA-I. 3OHGA induced deleterious effects on glial cells and an increase in ammonia production, and resulted in accentuated cell death of non-apoptotic origin.

Relevance:

100.00%

Publisher:

Abstract:

Glutaric aciduria type I (glutaryl-CoA dehydrogenase deficiency) is an inborn error of metabolism that usually manifests in infancy as an acute encephalopathic crisis and often results in permanent motor handicap. Biochemical hallmarks of this disease are elevated levels of glutarate and 3-hydroxyglutarate in blood and urine. The neuropathology of this disease is still poorly understood, as a low-lysine diet and carnitine supplementation do not always prevent brain damage, even in early-treated patients. We used a 3D in vitro model of rat organotypic brain cell cultures in aggregates to mimic glutaric aciduria type I by repeated administration of 1 mM glutarate or 3-hydroxyglutarate at two time points representing different developmental stages. Both metabolites were deleterious for the developing brain cells, with 3-hydroxyglutarate being the most toxic metabolite in our model. Astrocytes were the cells most strongly affected by metabolite exposure. We observed an up to 11-fold increase of ammonium in the culture medium, with a concomitant decrease of glutamine. We further observed an increase in lactate and a concomitant decrease in glucose. Exposure to 3-hydroxyglutarate led to a significantly increased cell death rate. Thus, we propose a three-step model for brain damage in glutaric aciduria type I: (i) 3-hydroxyglutarate causes the death of astrocytes, (ii) deficiency of the astrocytic enzyme glutamine synthetase leads to intracerebral ammonium accumulation, and (iii) high ammonium triggers secondary death of other brain cells. These unexpected findings need to be further investigated and verified in vivo. They suggest that intracerebral ammonium accumulation might be an important target for the development of more effective treatment strategies to prevent brain damage in patients with glutaric aciduria type I.

Relevance:

100.00%

Publisher:

Abstract:

Mice from the majority of inbred strains are resistant to infection by Leishmania major, an obligate intracellular protozoan parasite of macrophages in the mammalian host. In contrast, mice from BALB strains are unable to control infection and develop progressive disease. In this model of infection, genetically determined resistance and susceptibility have been clearly shown to result from the appearance of parasite-specific CD4+ T helper 1 or T helper 2 cells, respectively. This murine model of infection is considered one of the best experimental systems for studying the mechanisms operating in vivo at the initiation of polarised T helper 1 and T helper 2 cell maturation. Among the several factors influencing Th cell development, cytokines themselves critically regulate this process. The results accumulated over recent years have clarified some aspects of the role played by cytokines in Th cell differentiation. They provide critical information that may ultimately lead to the rational design of means by which to tailor immune responses to the effector functions that are most efficient in preventing and/or controlling infections with pathogens.

Relevance:

100.00%

Publisher:

Abstract:

Parkinson's disease (PD) is a chronic neurodegenerative disorder characterized by progressive loss of dopaminergic (DA) neurons of the substantia nigra pars compacta, with unknown aetiology. 6-Hydroxydopamine (6-OHDA) treatment of neuronal cells is an established in vitro model for mimicking the effect of the oxidative stress found in PD brains. We examined the effects of 6-OHDA treatment on human neuroblastoma cells (SH-SY5Y) and primary mesencephalic cultures. Using a reverse arbitrarily primed polymerase chain reaction (RAP-PCR) approach, we generated reproducible genetic fingerprints of differential expression levels in cell cultures treated with 6-OHDA. Of the resulting sequences, 23 showed considerable homology to known human coding sequences. The results of the RAP-PCR were validated by reverse transcription PCR, real-time PCR and, for selected genes, by Western blot analysis and immunofluorescence. In four cases [tomoregulin-1 (TMEFF-1), collapsin response mediator protein 1 (CRMP-1), neurexin-1, and phosphoribosylaminoimidazole synthetase (GART)], down-regulation of mRNA and protein levels was detected. Further studies will be necessary on the physiological roles of the identified proteins and their impact on pathways leading to neurodegeneration in PD.

Relevance:

100.00%

Publisher:

Abstract:

1. Species distribution modelling is used increasingly in both applied and theoretical research to predict how species are distributed and to understand attributes of species' environmental requirements. In species distribution modelling, various statistical methods are used that combine species occurrence data with environmental spatial data layers to predict the suitability of any site for that species. While the number of data-sharing initiatives involving species' occurrences in the scientific community has increased dramatically over the past few years, various data quality and methodological concerns related to using these data for species distribution modelling have not been addressed adequately. 2. We evaluated how uncertainty in georeferences and associated locational error in occurrences influence species distribution modelling using two treatments: (1) a control treatment, where models were calibrated with original, accurate data, and (2) an error treatment, where data were first degraded spatially to simulate locational error. To incorporate error into the coordinates, we shifted each coordinate by a random number drawn from a normal distribution with a mean of zero and a standard deviation of 5 km. We evaluated the influence of error on the performance of 10 commonly used distributional modelling techniques applied to 40 species in four distinct geographical regions. 3. Locational error in occurrences reduced model performance in three of these regions; nevertheless, relatively accurate predictions of species distributions were possible for most species, even with degraded occurrences. Two species distribution modelling techniques, boosted regression trees and maximum entropy, were the best-performing models in the face of locational errors. The results obtained with boosted regression trees were only slightly degraded by errors in location, and the results obtained with the maximum entropy approach were not affected by such errors. 4. Synthesis and applications. To use the vast array of occurrence data that currently exists for research and management relating to the geographical ranges of species, modellers need to know the influence of locational error on model quality and whether some modelling techniques are particularly robust to error. We show that certain modelling techniques are particularly robust to a moderate level of locational error and that useful predictions of species distributions can be made even when occurrence data include some error.
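The error treatment described in point 2 amounts to adding zero-mean Gaussian noise with a 5 km standard deviation to each occurrence coordinate. A small sketch of that degradation step (assuming coordinates in a projected, kilometre-based system; the sample points are invented):

```python
import numpy as np

rng = np.random.default_rng(42)

def degrade_occurrences(coords_km, sd_km=5.0):
    """Simulate locational error by shifting each x/y coordinate by a random offset
    drawn from a normal distribution with mean 0 and the given standard deviation (km)."""
    coords_km = np.asarray(coords_km, dtype=float)
    return coords_km + rng.normal(0.0, sd_km, coords_km.shape)

# Hypothetical occurrence records (x, y) in km within a projected coordinate system.
occurrences = [(512.3, 4120.8), (530.1, 4098.4), (498.7, 4135.2)]
print(degrade_occurrences(occurrences))
```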

Relevance:

100.00%

Publisher:

Abstract:

Background: Intranasal administration of high amounts of allergen has been shown to induce tolerance and to reverse the allergic phenotype. However, the mechanisms of tolerance induction via the mucosal route are still unclear. Objectives: To characterize the therapeutic effects of intranasal application of ovalbumin (OVA) in a mouse model of bronchial inflammation, as well as the cellular and molecular mechanisms leading to protection upon re-exposure to allergen. Methods: After induction of bronchial inflammation, mice were treated intranasally with OVA and re-exposed to OVA aerosols 10 days later. Bronchoalveolar lavage fluid (BALF), T cell proliferation and cytokine secretion were examined. The respective roles of CD4+CD25+ and CD4+CD25- T cells in the induction of tolerance were analysed. Results: Intranasal treatment with OVA drastically reduced inflammatory cell recruitment into BALF and bronchial hyperresponsiveness upon re-exposure to allergen. Both OVA-specific T cell proliferation and Th1 and Th2 cytokine production from lung and bronchial lymph nodes were inhibited. Transfer of CD4+CD25- T cells, which strongly expressed membrane-bound transforming growth factor beta (mTGFβ), from tolerized mice protected asthmatic recipient mice from subsequent aerosol challenges. The presence of CD4+CD25+ (Foxp3+) T cells during the process of tolerization was indispensable for CD4+CD25- T cells to acquire regulatory properties. Whereas the presence of IL-10 appeared dispensable in this model, the suppression of CD4+CD25-mTGFβ+ T cells in transfer experiments significantly impaired the down-regulation of airway inflammation. Conclusion: Nasal application of OVA in established asthma led to the induction of CD4+CD25-mTGFβ+ T cells with regulatory properties, able to confer protection upon allergen re-exposure.