192 results for "Medians".


Relevância: 10.00%

Publicador:

Resumo:

Objectives: To investigate substance loss and bond strength capacity of sclerotic, non-carious cervical dentin after airborne-particle abrasion or diamond bur preparation. Methods: Fifteen non-sclerotic dentin specimens were made from crowns of extracted human incisors of which the labial surfaces had been ground with silicon carbide papers (non-sclerotic control; Group 1). Forty-five sclerotic dentin specimens (n=15/group) were made from the labial, non-carious cervical root part of extracted human incisors and underwent either no pre-treatment (sclerotic control; Group 2), pre-treatment with airborne-particle abrasion (CoJet Prep [3M ESPE] and 50 µm aluminium oxide; Group 3), or with diamond bur preparation (40 µm grit size; Group 4). Substance loss after pre-treatment was measured in Groups 3 and 4. Subsequently, Scotchbond Universal (3M ESPE) and resin composite (CeramX [DENTSPLY DeTrey]) were applied on the treated dentin surfaces. The specimens were stored at 37°C and 100% humidity for 24 h. After storage, shear bond strength (SBS) was measured and data analyzed with nonparametric ANOVA followed by Wilcoxon rank sum tests. Results: Substance loss (medians) was 19 µm in Group 3 and 113 µm in Group 4. SBS-values (MPa; medians) in Group 2 (9.24) were significantly lower than in Group 1 (13.15; p=0.0069), Group 3 (13.05; p=0.01), and Group 4 (13.02; p=0.0142). There were no significant differences in SBS between Groups 1, 3, and 4 (p≥0.8063). Conclusion: Airborne-particle abrasion and diamond bur preparation restored bond strength of Scotchbond Universal to sclerotic dentin to the level of non-sclerotic dentin, with airborne-particle abrasion being less invasive than diamond bur preparation.
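The group comparisons above rely on Wilcoxon rank sum tests. As a minimal sketch of the mechanics (not the authors' actual analysis, and with made-up SBS-like numbers), a normal-approximation rank-sum test can be written with only the standard library:

```python
from statistics import NormalDist

def rank_sum_test(x, y):
    """Two-sided Wilcoxon rank-sum test (normal approximation, mid-ranks for ties)."""
    data = sorted([(v, 0) for v in x] + [(v, 1) for v in y])
    ranks = [0.0] * len(data)
    i = 0
    while i < len(data):
        j = i
        while j + 1 < len(data) and data[j + 1][0] == data[i][0]:
            j += 1
        for k in range(i, j + 1):          # tied values share the average rank
            ranks[k] = (i + j) / 2 + 1
        i = j + 1
    w = sum(r for r, (_, g) in zip(ranks, data) if g == 0)  # rank sum of group x
    n1, n2 = len(x), len(y)
    mean = n1 * (n1 + n2 + 1) / 2
    sd = (n1 * n2 * (n1 + n2 + 1) / 12) ** 0.5
    z = (w - mean) / sd
    return w, 2 * (1 - NormalDist().cdf(abs(z)))

# hypothetical SBS values (MPa) for a sclerotic and a pre-treated group
w, p = rank_sum_test([8.1, 9.2, 9.5, 10.0, 8.7], [12.8, 13.1, 13.4, 12.5, 13.9])
```

For the small group sizes used in the study, an exact test (as in R's wilcox.test or scipy.stats.mannwhitneyu) would be preferred over the normal approximation shown here.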


OBJECTIVES Clinical benefit response (CBR), based on changes in pain, Karnofsky performance status, and weight, is an established palliative endpoint in trials for advanced gastrointestinal cancer. We investigated whether CBR is associated with survival, and whether CBR reflects a wide-enough range of domains to adequately capture patients' perception. METHODS CBR was prospectively evaluated in an international phase III chemotherapy trial in patients with advanced pancreatic cancer (n = 311) in parallel with patient-reported outcomes (PROs). RESULTS The median time to treatment failure was 3.4 months (range: 0-6). The majority of the CBRs (n = 39) were noted in patients who received chemotherapy for at least 5 months. Patients with CBR (n = 62) had longer survival than non-responders (n = 182) (hazard ratio = 0.69; 95% confidence interval: 0.51-0.94; p = 0.013). CBR was predicted with a sensitivity and specificity of 77-80% by various combinations of 3 mainly physical PROs. A comparison between the duration of CBR (n = 62, median = 8 months, range = 4-31) and clinically meaningful improvements in the PROs (n = 100-116; medians = 9-11 months, range = 4-24) showed similar intervals. CONCLUSION CBR is associated with survival and mainly reflects physical domains. Within phase III chemotherapy trials for advanced gastrointestinal cancer, CBR can be replaced by a PRO evaluation, without losing substantial information but gaining complementary information.
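The 77-80% figures above are the sensitivity and specificity with which PRO combinations predicted CBR. With hypothetical binary data (not the trial's), the computation is simply:

```python
def sens_spec(predicted, actual):
    """Sensitivity and specificity of a binary predictor against observed labels."""
    tp = sum(p and a for p, a in zip(predicted, actual))          # true positives
    tn = sum(not p and not a for p, a in zip(predicted, actual))  # true negatives
    fp = sum(p and not a for p, a in zip(predicted, actual))      # false positives
    fn = sum(not p and a for p, a in zip(predicted, actual))      # false negatives
    return tp / (tp + fn), tn / (tn + fp)

# hypothetical: PRO-based prediction vs. observed clinical benefit response
sens, spec = sens_spec([1, 1, 1, 0, 0, 0], [1, 1, 0, 1, 0, 0])
```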


OBJECTIVE: To assess whether the impact location causing different facial fracture patterns was associated with diffuse axonal injury in patients with severe closed head injury. METHODS: All patients referred to the Trauma Unit of the University Hospital of Zurich, Switzerland between 1996 and 2002 with severe closed head injuries (Abbreviated Injury Scale (AIS) (face) of 2-4 and an AIS (head and neck) of 3-5) were retrospectively assessed according to the Glasgow Coma Scale (GCS) and the Injury Severity Score (ISS). Facial fracture patterns were classified as resulting from frontal, oblique or lateral impact. All patients had undergone computed tomography. The association between impact location and diffuse axonal injury, correcting for the level of consciousness (GCS) and severity of injury (ISS), was calculated with a multivariate regression analysis. RESULTS: Of 200 screened patients, 61 fulfilled the inclusion criteria for severe closed head injury. The medians (interquartile ranges 25;75) for GCS, AIS (face), AIS (head and neck) and ISS were 3 (3;13), 2 (2;4), 4 (4;5) and 30 (24;41), respectively. Of the patients, 51% had a frontal, 26% an oblique and 23% a lateral impact. A total of 21% of patients developed diffuse axonal injury (DAI). Compared with frontal impact, the likelihood of diffuse axonal injury increased 11.0-fold (1.7-73.0) in patients with a lateral impact. CONCLUSIONS: Clinicians should be aware of the substantial increase in diffuse axonal injury related to lateral impact in patients with severe closed head injuries.
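The reported 11.0-fold increase with its interval (1.7-73.0) is the kind of effect estimate a regression on a small sample produces. As a sketch of how such an interval arises for a simple 2×2 table (illustrative counts, not the study's data), using the standard Woolf log-odds method:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% CI for a 2x2 table:
    a = exposed with outcome, b = exposed without,
    c = unexposed with outcome, d = unexposed without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of the log odds ratio
    lo, hi = (math.exp(math.log(or_) + s * z * se) for s in (-1, 1))
    return or_, lo, hi

# illustrative counts: lateral impact vs. frontal impact, DAI yes/no
or_, lo, hi = odds_ratio_ci(6, 8, 3, 28)
```

Note how small cell counts make the interval very wide, mirroring the 1.7-73.0 range above.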


OBJECTIVES The aim of this study was to optimise dexmedetomidine and alfaxalone dosing, for intramuscular administration with butorphanol, to perform minor surgeries in cats. METHODS Initially, cats were assigned to one of five groups, each composed of six animals and receiving, in addition to 0.3 mg/kg butorphanol intramuscularly, one of the following: (A) 0.005 mg/kg dexmedetomidine, 2 mg/kg alfaxalone; (B) 0.008 mg/kg dexmedetomidine, 1.5 mg/kg alfaxalone; (C) 0.012 mg/kg dexmedetomidine, 1 mg/kg alfaxalone; (D) 0.005 mg/kg dexmedetomidine, 1 mg/kg alfaxalone; and (E) 0.012 mg/kg dexmedetomidine, 2 mg/kg alfaxalone. Thereafter, a modified 'direct search' method, conducted in a stepwise manner, was used to optimise drug dosing. The quality of anaesthesia was evaluated on the basis of composite scores (one for anaesthesia and one for recovery), visual analogue scales and the propofol requirement to suppress spontaneous movements. The medians or means of these variables were used to rank the treatments; 'unsatisfactory' and 'promising' combinations were identified to calculate, through the equation first described by Berenbaum in 1990, new dexmedetomidine and alfaxalone doses to be tested in the next step. At each step, five combinations (one new plus the best previous four) were tested. RESULTS None of the tested combinations resulted in adverse effects. Four steps and 120 animals were necessary to identify the optimal drug combination (0.014 mg/kg dexmedetomidine, 2.5 mg/kg alfaxalone and 0.3 mg/kg butorphanol). CONCLUSIONS AND RELEVANCE The investigated drug mixture, at the doses found with the optimisation method, is suitable for cats undergoing minor clinical procedures.


Patients who had started HAART (Highly Active Anti-Retroviral Treatment) under the previous, aggressive DHHS guidelines (1997) underwent life-long continuous HAART that was associated with many short-term as well as long-term complications. Many interventions attempted to reduce those complications, including intermittent treatment, also called pulse therapy. Many studies examined the determinants of the rate of fall in CD4 count after interruption, as these data would help guide treatment interruptions. The data set used here was part of a cohort study ongoing at the Johns Hopkins AIDS service since January 1984, in which data were collected both prospectively and retrospectively. This data set consisted of 47 patients receiving pulse therapy with the aim of reducing long-term complications. The aim of this project was to study the impact of virologic and immunologic factors on the rate of CD4 loss after treatment interruption. The exposure variables under investigation included CD4 cell count and viral load at treatment initiation. The rates of change of CD4 cell count after treatment interruption were estimated from observed data using advanced longitudinal data analysis methods (i.e., a linear mixed model). Random effects accounted for repeated measures of CD4 per person after treatment interruption. The regression coefficient estimates from the model were then used to produce subject-specific rates of CD4 change, accounting for group trends in change. The exposure variables of interest were age, race, gender, CD4 cell count and HIV RNA level at HAART initiation. The rate of fall of CD4 count did not depend on CD4 cell count or viral load at initiation of treatment; thus these factors may not be used to determine who has a chance of a successful treatment interruption.
CD4 and viral load were also studied by t-tests and ANOVA after grouping based on medians and quartiles, to detect any difference in the mean rate of CD4 fall after interruption. There was no significant difference between the groups, suggesting no association between the rate of fall of CD4 after treatment interruption and the above-mentioned exposure variables.
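The subject-specific rates of CD4 change came from a linear mixed model; fitting one is beyond a short sketch, but the core quantity — a per-subject slope of CD4 against time since interruption — can be illustrated with ordinary least squares on hypothetical data (a rough stand-in for the mixed-model estimates, which additionally borrow strength across subjects):

```python
def ols_slope(t, y):
    """Least-squares slope of y on t (e.g. CD4 cells/µL per month)."""
    n = len(t)
    mt, my = sum(t) / n, sum(y) / n
    return (sum((ti - mt) * (yi - my) for ti, yi in zip(t, y))
            / sum((ti - mt) ** 2 for ti in t))

# hypothetical CD4 counts at 0, 1, 2, 3 months after treatment interruption
rate = ols_slope([0, 1, 2, 3], [520, 470, 420, 370])  # cells/µL per month
```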


In biomedical studies, the general data structures have been matched (paired) and unmatched designs. Recently, many researchers have turned to meta-analysis to obtain a better understanding of a medical treatment from several clinical data sets. The hybrid design, which combines the two data structures, raises fundamental questions for statistical methods and challenges for statistical inference. The applicable methods depend on the underlying distribution. If the outcomes are normally distributed, we would use the classic paired and two-independent-sample t-tests on the matched and unmatched cases. If not, we can apply the Wilcoxon signed-rank and rank-sum tests on each case. To assess an overall treatment effect in a hybrid design, we can apply the inverse-variance weighting method used in meta-analysis. In the nonparametric case, we can use a test statistic that combines the two Wilcoxon test statistics. However, these two test statistics are not on the same scale. We propose the Hybrid Test Statistic based on the Hodges-Lehmann estimates of the treatment effects, which are medians on the same scale. To compare the proposed method, we use the classic meta-analysis t-test statistic on the combined estimates of the treatment effects from the two t-test statistics. Theoretically, the efficiency of two unbiased estimators of a parameter is the ratio of their variances. Using the concept of Asymptotic Relative Efficiency (ARE) developed by Pitman, we derive the ARE of the hybrid test statistic relative to the classic meta-analysis t-test statistic, using the Hodges-Lehmann estimators associated with the two test statistics. From several simulation studies, we calculate the empirical type I error rate and power of the test statistics. The proposed statistic provides an effective tool for evaluating and understanding treatment effects in various public health studies as well as clinical trials.
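The Hodges-Lehmann estimate that puts both Wilcoxon statistics on a common scale is, for the two-sample (unmatched) case, the median of all pairwise differences. A minimal sketch with toy data:

```python
from statistics import median

def hodges_lehmann_shift(x, y):
    """Hodges-Lehmann estimator of the treatment effect: median of all x_i - y_j."""
    return median(xi - yj for xi in x for yj in y)

# toy treated vs. control outcomes
shift = hodges_lehmann_shift([10, 12, 14], [1, 2, 3])
```

For the matched (paired) case, the analogous estimator is the median of the Walsh averages of the within-pair differences.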


Sediments were collected with Ekman and Petersen dredges from the bottom of Trout Lake, northern Wisconsin, at 221 stations. Sampling was done with a spud sampler at 32 stations, and core samples were obtained with a Jenkins and Mortimer and a Twenhofel sampler at 17 stations. The shore and offshore deposits of the shores of Trout Lake and of the islands are described. Megascopic descriptions are given of the samples collected with the Ekman and Petersen dredges. Sediments on bottoms of about 10 meters or deeper are mainly gyttja, or crusts composed of mixtures of organic matter, ferric hydroxide, and some form of manganese oxide; the latter deposits are extensive. Detailed descriptions of some of the sand samples are given, and generalizations respecting size and distribution are made. Quartiles, medians, and coefficients of sorting and skewness of the coarse sediments collected from the bottom are given in tables. Mechanical analyses of the fine sediments, mainly gyttja, were not made, as previous experience had demonstrated that the results have no sedimentational value. The organic matter of the gyttja was determined, as were the percentages of lignin in the organic matter. Core samples are composed almost entirely of fine materials, mainly gyttja, and determinations were made on these samples in the same way as on the samples obtained with the Ekman and Petersen dredges. Studies of the core samples show that the fine sediments usually contain in excess of 90 per cent moisture, and that there is very little change in moisture content from top to bottom of the cores. A map shows the distribution of the iron and manganese deposits. These deposits were found to contain 10 to 20 per cent organic matter, 11 to 16 per cent metallic iron, and 12 to 30 per cent metallic manganese. No stratification of any kind was found in any of the deep-water sediments of Trout Lake except in the iron and manganese crusts.
Absence of stratification is considered to be due to the slow rate of deposition and the mixing of sediments by organisms which dwell in them. The data indicate that the rate of deposition in the deep waters of Trout Lake is of the order of 1 foot in 15,000 years.
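The quartile-based sorting and skewness coefficients tabulated for the coarse sediments are, in the classic usage of that era, presumably Trask's coefficients: sorting So = sqrt(Q3/Q1) and skewness Sk = Q1·Q3/Md², computed from grain-size quartiles. A sketch under that assumption, with hypothetical grain-size data:

```python
from statistics import median, quantiles

def trask_coefficients(grain_sizes):
    """Trask-style sorting (So) and skewness (Sk) from grain-size quartiles.
    Assumed definitions: So = sqrt(Q3/Q1) with quartiles ordered so So >= 1,
    Sk = Q1*Q3 / Md**2. Illustrative only."""
    q1, _, q3 = quantiles(grain_sizes, n=4, method="inclusive")
    md = median(grain_sizes)
    so = (max(q1, q3) / min(q1, q3)) ** 0.5
    sk = (q1 * q3) / md ** 2
    return so, sk

# hypothetical grain diameters (mm) from one dredge sample
so, sk = trask_coefficients([1, 2, 3, 4, 5, 6, 7])
```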


The abundances and distribution of metazoan within-ice meiofauna (13 stations) and under-ice fauna (12 stations) were investigated in level sea ice and sea-ice ridges in the Chukchi/Beaufort Seas and Canada Basin in June/July 2005 using a combination of ice coring and SCUBA diving. Ice meiofauna abundance was estimated based on live counts in the bottom 30 cm of level sea ice based on triplicate ice core sampling at each location, and in individual ice chunks from ridges at four locations. Under-ice amphipods were counted in situ in replicate (N=24-65 per station) 0.25 m**2 quadrats using SCUBA to a maximum water depth of 12 m. In level sea ice, the most abundant ice meiofauna groups were Turbellaria (46%), Nematoda (35%), and Harpacticoida (19%), with overall low abundances per station that ranged from 0.0 to 10.9 ind/l (median 0.8 ind/l). In level ice, low ice algal pigment concentrations (<0.1-15.8 µg Chl a /l), low brine salinities (1.8-21.7) and flushing from the melting sea ice likely explain the low ice meiofauna concentrations. Higher abundances of Turbellaria, Nematoda and Harpacticoida also were observed in pressure ridges (0-200 ind/l, median 40 ind/l), although values were highly variable and only medians of Turbellaria were significantly higher in ridge ice than in level ice. Median abundances of under-ice amphipods at all ice types (level ice, various ice ridge structures) ranged from 8 to 114 ind/m**2 per station and mainly consisted of Apherusa glacialis (87%), Onisimus spp. (7%) and Gammarus wilkitzkii (6%). Highest amphipod abundances were observed in pressure ridges at depths >3 m where abundances were up to 42-fold higher compared with level ice. We propose that the summer ice melt impacted meiofauna and under-ice amphipod abundance and distribution through (a) flushing, and (b) enhanced salinity stress at thinner level sea ice (less than 3 m thickness). 
We further suggest that pressure ridges, which extend into deeper, high-salinity water, become accumulation regions for ice meiofauna and under-ice amphipods in summer. Pressure ridges thus might be crucial for faunal survival during periods of enhanced summer ice melt. Previous estimates of Arctic sea ice meiofauna and under-ice amphipods on regional and pan-Arctic scales likely underestimate abundances at least in summer because they typically do not include pressure ridges.


Run-off-road crashes to the left on high-capacity roads are a problem that, beyond the dramatic situations they cause, inflicts high costs on society. Close attention must therefore be paid to the design of medians and to the placement of safety barriers within them, with the aim of preventing this type of accident and limiting the consequences of those that nevertheless occur. Highway medians are usually designed by almost systematically applying the minimum parameters required by regulations, generally with safety barriers installed against or very close to one of the inside shoulders. However, both national technical recommendations and the international literature advise carrying out an economic study of alternatives before installing barriers and, where their installation is justified, moving them away from the carriageway and placing them close to the centreline of the median. This thesis analyses the advantages and limitations of placing the safety barrier close to the centreline of the median. The behaviour of vehicles running through the median has been investigated, and the thesis shows how the barrier was installed on the A‐40 highway, stretch: Villarrubia de Santiago‐Santa Cruz de la Zarza, highlighting the most novel and striking aspects, which nevertheless conform to best practice in the field and to the applicable regulations.


This paper gives three related results: (i) a new, simple, fast, monotonically converging algorithm for deriving the L1-median of a data cloud in ℝ^d, a problem that can be traced to Fermat and has fascinated applied mathematicians for over three centuries; (ii) a new general definition for depth functions, as functions of multivariate medians, so that different definitions of medians will, correspondingly, give rise to different depth functions; and (iii) a simple closed-form formula for the L1-depth function of a given data cloud in ℝ^d.
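The paper's own algorithm is not reproduced here, but the classical Weiszfeld iteration — the standard fixed-point scheme for the L1-median that such algorithms improve upon — gives the flavour of the problem:

```python
def l1_median(points, iters=500, eps=1e-12):
    """Weiszfeld iteration for the geometric (L1) median of a point cloud in R^d.
    Classical scheme, not the paper's improved algorithm; coincident data
    points are handled crudely by skipping them."""
    d = len(points[0])
    m = [sum(p[i] for p in points) / len(points) for i in range(d)]  # start at centroid
    for _ in range(iters):
        num, den = [0.0] * d, 0.0
        for p in points:
            dist = sum((p[i] - m[i]) ** 2 for i in range(d)) ** 0.5
            if dist < eps:
                continue          # iterate currently sits on a data point
            w = 1.0 / dist        # inverse-distance weights
            den += w
            for i in range(d):
                num[i] += w * p[i]
        if den == 0.0:
            break
        m = [num[i] / den for i in range(d)]
    return m

# four corners of a square plus its centre: the L1-median is the centre
m = l1_median([(0, 0), (10, 0), (0, 10), (10, 10), (5, 5)])
```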


Forensic science employs the facial reconstruction technique to increase the possibilities of human recognition. After anthropological analysis, the face is sculpted over the skeletonized skull and released to the media. There are several methodologies for modelling the face and its features, as well as several soft-tissue thickness data sets that guide the facial contour. To investigate whether any methodology favours more recognitions and allows a greater resemblance to the individual, this study compared manual facial reconstructions performed with two approaches to filling in the soft tissues (the American and Manchester methods) and to predicting the eyes, nose, mouth and ears. It also compared reconstructions performed with four soft-tissue thickness tables developed for Brazilians in previous studies, considering the possibility of combining these data to aid reconstruction. A fourth objective was to ascertain whether sex and anatomical or forensic knowledge influence the frequency of recognitions. The study was divided into two phases. In the first, two reconstructions were made for two target individuals (one man and one woman) with the American and Manchester methods, applying two guides for the eyes, nose, mouth and ears. The reconstructions were evaluated by forty individuals (men and women, divided into four groups: undergraduate dental students who had not taken the forensic dentistry course, undergraduate dental students who had taken it, specialists in forensic dentistry, and individuals with no knowledge of human anatomy) by means of recognition and resemblance tests. For the female target, the recognition rates were 20% and 10% for the American and Manchester methods, respectively; for the male target, the rates were 35% and 17.5%.
Regarding resemblance, the medians were below 3 (on a scale of 1 to 5); however, one exception was the sculpture made with the American method for the male target, which had a median of 3. In the second phase, facial reconstructions for four targets (two men and two women) were produced with the American method, considering the four soft-tissue thickness tables for Brazilians. Sixteen reconstructions were evaluated by one hundred and twenty individuals, again through recognition and resemblance tests. As in phase I, the sex and group of the evaluators were considered. For target 1, the proportions of correct recognitions were significantly higher for reconstructions made with the cadaver tables (44% and 38%) than for those based on imaging data. For target 4, the proportions of correct recognitions with the cadaver data (Tedeschi-Oliveira et al.) and with the magnetic resonance data were significantly higher than for reconstructions based on computed tomography data. Regarding resemblance, only target 1 showed significant differences in the frequency of slight resemblance between reconstructions. Furthermore, neither the sex of the evaluators nor their knowledge of anatomy influenced the frequency of correct recognitions. It is hoped that the proposed table can be used for the Brazilian population.


Context. The ongoing Gaia-ESO Public Spectroscopic Survey is using FLAMES at the VLT to obtain high-quality medium-resolution Giraffe spectra for about 10^5 stars and high-resolution UVES spectra for about 5000 stars. With UVES, the Survey has already observed 1447 FGK-type stars. Aims. These UVES spectra are analyzed in parallel by several state-of-the-art methodologies. Our aim is to present how these analyses were implemented, to discuss their results, and to describe how a final recommended parameter scale is defined. We also discuss the precision (method-to-method dispersion) and accuracy (biases with respect to the reference values) of the final parameters. These results are part of the Gaia-ESO second internal release and will be part of its first public release of advanced data products. Methods. The final parameter scale is tied to the scale defined by the Gaia benchmark stars, a set of stars with fundamental atmospheric parameters. In addition, a set of open and globular clusters is used to evaluate the physical soundness of the results. Each of the implemented methodologies is judged against the benchmark stars to define weights in three different regions of the parameter space. The final recommended results are the weighted medians of those from the individual methods. Results. The recommended results successfully reproduce the atmospheric parameters of the benchmark stars and the expected Teff-log g relation of the calibrating clusters. Atmospheric parameters and abundances have been determined for 1301 FGK-type stars observed with UVES. The median of the method-to-method dispersion of the atmospheric parameters is 55 K for Teff, 0.13 dex for log g and 0.07 dex for [Fe/H]. Systematic biases are estimated to be between 50−100 K for Teff, 0.10−0.25 dex for log g and 0.05−0.10 dex for [Fe/H]. Abundances for 24 elements were derived: C, N, O, Na, Mg, Al, Si, Ca, Sc, Ti, V, Cr, Mn, Fe, Co, Ni, Cu, Zn, Y, Zr, Mo, Ba, Nd, and Eu.
The typical method-to-method dispersion of the abundances varies between 0.10 and 0.20 dex. Conclusions. The Gaia-ESO sample of high-resolution spectra of FGK-type stars will be among the largest of its kind analyzed in a homogeneous way. The extensive list of elemental abundances derived in these stars will enable significant advances in the areas of stellar evolution and Milky Way formation and evolution.
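The recommended parameters are weighted medians of the individual methods' results. A small sketch of a weighted median, with hypothetical Teff estimates and weights (not the Survey's actual benchmark-derived weighting scheme):

```python
def weighted_median(values, weights):
    """Smallest value whose cumulative weight reaches half the total weight
    (a simple convention; boundary ties could also be averaged)."""
    total = sum(weights)
    acc = 0.0
    for v, w in sorted(zip(values, weights)):
        acc += w
        if acc >= total / 2:
            return v

# hypothetical Teff (K) from three analysis nodes, with illustrative weights
teff = weighted_median([5700, 5770, 5820], [1.0, 1.0, 2.0])
```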


Mild traumatic brain injury (mTBI) has complex effects on several brain functions, which can be difficult to assess and monitor. Visual problems and balance disorders are among the complaints frequently reported after an mTBI. Moreover, these problems can continue to affect people who have sustained an mTBI long after the acute phase of the injury. However, conventional clinical assessments of vision and balance usually fail to objectify these symptoms, especially when they persist. Furthermore, to our knowledge, no longitudinal study has examined visual perceptual deficits as such, or balance disorders secondary to mTBI, in adults. The aim of this project was therefore to determine the nature and duration of the effects of such an injury on visual perception and on postural stability, by assessing mTBI and control adults over a one-year period. Exactly the same subjects took part in both experiments, which were conducted on the same days for each subject. The impact of mTBI on the visual perception of sinusoidal gratings defined by first- and second-order attributes was studied first. Fifteen adults diagnosed with mTBI were assessed 15 days, 3 months and 12 months after their injury. Fifteen matched control adults were assessed at identical intervals. Reaction times (RTs) for flicker detection and motion-direction discrimination were measured. The contrast levels of the first- and second-order stimuli were adjusted to give them comparable visibility, and the means, medians, standard deviations (SDs) and interquartile ranges (IQRs) of the RTs for correct responses were computed. Symptom levels were also assessed for comparison with the RT data.
Overall, the RTs of the mTBI participants were longer and more variable (larger SDs and IQRs) than those of controls. In addition, in the motion-discrimination condition, the mTBI participants' RTs were shorter for first-order than for second-order stimuli, and more variable for first-order than for second-order stimuli. These observations were repeated across the three sessions. The symptom level of the mTBI participants was higher than that of controls and, despite improvement, the difference remained significant over the year following the injury. The second experiment assessed the impact of mTBI on postural control. To this end, we measured the amplitude of postural sway along the anteroposterior axis and postural instability (using the root mean square (RMS) velocity of postural sway) while subjects stood with feet together on a firm surface under five conditions: eyes closed, and within a three-dimensional virtual tunnel that was either static or oscillating sinusoidally in the anteroposterior direction at three different velocities. Balance measures derived from clinical tests, the Bruininks-Oseretsky Test of Motor Proficiency 2nd edition (BOT-2) and the Balance Error Scoring System (BESS), were also used. Participants diagnosed with mTBI showed greater postural instability (greater RMS velocity of postural sway) than controls 2 weeks and 3 months after the injury, across all conditions. These balance deficits secondary to mTBI were no longer present one year after the injury. The results also suggest that the deficits in visual integration processes revealed in the first experiment may have contributed to the balance deficits secondary to mTBI.
Neither the amplitude of anteroposterior postural sway nor the measures derived from the clinical balance tests (BOT-2 and BESS) proved sensitive enough to quantify the postural deficit in mTBI subjects. Associating RT measures with the perception of specific stimulus properties proved to be both a measurement method particularly sensitive to the visuomotor anomalies secondary to mTBI and a precise tool for investigating the mechanisms underlying the anomalies that arise when the brain is exposed to a mild trauma. Likewise, the postural-instability measures proved sensitive enough to quantify the balance deficits secondary to mTBI. The development of screening tests based on these results, for the assessment of mTBI from its earliest stages, therefore appears particularly worthwhile. It also seems essential to examine the relationships between such deficits and the performance of activities of daily living, such as school, work or sports activities, to determine the functional impact of these visuomotor and balance-control deficits.


Federal Highway Administration, Office of Research, Development and Technology, Washington, D.C.


Removing noise from piecewise constant (PWC) signals is a challenging signal processing problem arising in many practical contexts. For example, in exploration geosciences, noisy drill hole records need to be separated into stratigraphic zones, and in biophysics, jumps between molecular dwell states have to be extracted from noisy fluorescence microscopy signals. Many PWC denoising methods exist, including total variation regularization, mean shift clustering, stepwise jump placement, running medians, convex clustering shrinkage and bilateral filtering; conventional linear signal processing methods are fundamentally unsuited. This paper (part I, the first of two) shows that most of these methods are associated with a special case of a generalized functional, minimized to achieve PWC denoising. The minimizer can be obtained by diverse solver algorithms, including stepwise jump placement, convex programming, finite differences, iterated running medians, least angle regression, regularization path following and coordinate descent. In the second paper, part II, we introduce novel PWC denoising methods and compare them with existing ones on synthetic and real signals, showing that the new understanding of the problem gained in part I leads to new methods that have a useful role to play.
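Of the methods listed, running medians are the simplest to sketch. A single pass of a width-3 running median already removes an isolated impulse while preserving a genuine jump (a toy illustration only, not the paper's generalized functional):

```python
from statistics import median

def running_median(signal, width=3):
    """Sliding-window median; the window is truncated at the signal edges."""
    h = width // 2
    return [median(signal[max(0, i - h):i + h + 1]) for i in range(len(signal))]

# noisy piecewise constant signal: one impulse on the first level, then a jump
denoised = running_median([0, 0, 9, 0, 0, 5, 5, 5])
```

The impulse (9) is suppressed, while the step from 0 to 5 survives unsmoothed — exactly the behaviour that makes median-type methods suited to PWC signals where linear filters fail.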