979 results for "Moderation statistical analysis"


Relevance: 90.00%

Abstract:

BACKGROUND: In equine laminitis, the deep digital flexor muscle (DDFM) appears to have increased muscle force, but evidence-based confirmation is lacking. OBJECTIVES: The purpose of this study was to test whether the DDFM of laminitic equines has an increased muscle force detectable by needle electromyography interference pattern analysis (IPA). ANIMALS AND METHODS: The control group included six Royal Dutch Sport horses, three Shetland ponies and one Welsh pony [10 healthy, sound adults weighing 411 ± 217 kg (mean ± SD) and aged 10 ± 5 years]. The laminitic group included three Royal Dutch Sport horses, one Friesian, one Haflinger, one Icelandic horse, one Welsh pony, one miniature Appaloosa and six Shetland ponies (14 adults, weight 310 ± 178 kg, aged 13 ± 6 years) with acute/chronic laminitis. The electromyography IPA measurements included firing rate, turns/second (T), amplitude/turn (M) and M/T ratio. Statistical analysis used a general linear model with outcomes transformed to geometric means. RESULTS: The firing rate of the total laminitic group was higher than that of the total control group. This difference was smaller for the ponies than for the horses; in the horses, the geometric mean difference of the laminitic group was 1.73 [geometric 95% confidence interval (CI) 1.29-2.32], and in the ponies this value was 1.09 (geometric 95% CI 0.82-1.45). CONCLUSION AND CLINICAL RELEVANCE: In human medicine, an increased firing rate is characteristic of increased muscle force. Thus, the increased firing rate of the DDFM in the context of laminitis suggests an elevated muscle force. However, this appears to be only a partial effect, as the unchanged turns/second and amplitude/turn in this study failed to demonstrate the recruitment of larger motor units with larger-amplitude motor unit potentials in laminitic equids.
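As a hedged illustration of the log-transform route to geometric means used in this abstract, the sketch below computes a geometric mean ratio and its 95% CI from two groups; the synthetic firing rates and the simple pooled-df formula are illustrative assumptions, not the study's data or its exact GLM.

```python
# Sketch: geometric mean ratio with 95% CI via log-transformation,
# mirroring "outcomes transformed to geometric means". Synthetic data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
firing_laminitic = rng.lognormal(mean=3.2, sigma=0.3, size=14)  # Hz, hypothetical
firing_control = rng.lognormal(mean=2.7, sigma=0.3, size=10)

log_l, log_c = np.log(firing_laminitic), np.log(firing_control)
diff = log_l.mean() - log_c.mean()
se = np.sqrt(log_l.var(ddof=1)/len(log_l) + log_c.var(ddof=1)/len(log_c))
df = len(log_l) + len(log_c) - 2          # simple pooled-df approximation
t_crit = stats.t.ppf(0.975, df)

ratio = np.exp(diff)                                 # geometric mean ratio
ci = np.exp([diff - t_crit*se, diff + t_crit*se])    # geometric 95% CI
print(f"geometric mean ratio = {ratio:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```

Back-transforming a difference of log-means yields a ratio of geometric means, which is why the reported CIs (e.g., 1.29-2.32) are multiplicative rather than symmetric.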

Relevance: 90.00%

Abstract:

OBJECTIVES To determine the relationship between nasolabial symmetry and esthetics in subjects with orofacial clefts. MATERIAL AND METHODS Eighty-four subjects (mean age 10 years, standard deviation 1.5) with various types of nonsyndromic clefts were included: 11 had unilateral cleft lip (UCL); 30 had unilateral cleft lip and alveolus (UCLA); and 43 had unilateral cleft lip, alveolus, and palate (UCLAP). A 3D stereophotogrammetric image of the face was taken for each subject. Symmetry and esthetics were evaluated on cropped 3D facial images. The degree of asymmetry of the nasolabial area was calculated based on all 3D data points using a surface registration algorithm. Esthetic ratings of various elements of nasal morphology were performed by eight lay raters on a 100 mm visual analog scale. Statistical analysis included ANOVA tests and regression models. RESULTS Nasolabial asymmetry increased with growing severity of the cleft (p = 0.029). Overall, nasolabial appearance was affected by nasolabial asymmetry; subjects with more nasolabial asymmetry were judged as having a less esthetically pleasing nasolabial area (p < 0.001). However, the relationship between nasolabial symmetry and esthetics was relatively weak in subjects with UCLAP, in whom only vermilion border esthetics was associated with asymmetry. CONCLUSIONS Nasolabial symmetry assessed with 3D facial imaging can be used as an objective measure of treatment outcome in subjects with less severe cleft deformity. In subjects with more severe cleft types, other factors may play a decisive role. CLINICAL SIGNIFICANCE Assessment of nasolabial symmetry is a useful measure of treatment success in less severe cleft types.
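A minimal sketch of the ANOVA-plus-regression analysis named in the abstract, run on synthetic asymmetry scores and VAS ratings; group sizes follow the abstract, but all values and the assumed linear relationship are invented for illustration.

```python
# Sketch: one-way ANOVA across cleft types and regression of esthetic
# VAS ratings on nasolabial asymmetry. Synthetic data only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
asym = {"UCL": rng.normal(1.0, 0.3, 11),
        "UCLA": rng.normal(1.3, 0.3, 30),
        "UCLAP": rng.normal(1.6, 0.4, 43)}  # mm, hypothetical asymmetry

# Does asymmetry differ with cleft severity?
f, p = stats.f_oneway(asym["UCL"], asym["UCLA"], asym["UCLAP"])
print(f"ANOVA: F = {f:.2f}, p = {p:.4f}")

# Are more asymmetric faces rated as less esthetic (100 mm VAS)?
all_asym = np.concatenate(list(asym.values()))
vas = 70 - 15 * all_asym + rng.normal(0, 8, all_asym.size)
slope, intercept, r, p_reg, se = stats.linregress(all_asym, vas)
print(f"regression: slope = {slope:.1f} VAS mm per mm asymmetry, p = {p_reg:.4g}")
```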

Relevance: 90.00%

Abstract:

PURPOSE To compare time-efficiency in the production of implant crowns using a digital workflow versus the conventional pathway. MATERIALS AND METHODS This prospective clinical study used a crossover design that included 20 study participants receiving single-tooth replacements in posterior sites. Each patient received a customized titanium abutment plus a computer-aided design/computer-assisted manufacture (CAD/CAM) zirconia suprastructure (for those in the test group, using digital workflow) and a standardized titanium abutment plus a porcelain-fused-to-metal crown (for those in the control group, using a conventional pathway). The start of the implant prosthetic treatment was established as the baseline. Time-efficiency analysis was defined as the primary outcome and was measured in minutes for every single clinical and laboratory work step. Statistical analysis was performed with the Wilcoxon rank sum test. RESULTS All crowns could be provided within two clinical appointments, independent of the manufacturing process. The mean total production time, as the sum of clinical plus laboratory work steps, was significantly different. The mean ± standard deviation (SD) time was 185.4 ± 17.9 minutes for the digital workflow process and 223.0 ± 26.2 minutes for the conventional pathway (P = .0001); digital processing of the overall treatment was therefore 16% faster. Detailed analysis of the clinical treatment revealed a significantly reduced mean ± SD chair time of 27.3 ± 3.4 minutes for the test group compared with 33.2 ± 4.9 minutes for the control group (P = .0001). Similar results were found for the mean laboratory work time, which was significantly shorter at 158.1 ± 17.2 minutes for the test group vs 189.8 ± 25.3 minutes for the control group (P = .0001). CONCLUSION Only a few studies have investigated efficiency parameters of digital workflows compared with conventional pathways in implant dental medicine. This investigation shows that the digital workflow seems to be more time-efficient than the established conventional production pathway for fixed implant-supported crowns. Both clinical chair time and laboratory manufacturing steps could be effectively shortened with the digital process of intraoral scanning plus CAD/CAM technology.
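A hedged sketch of the Wilcoxon rank sum comparison named in the abstract, on synthetic production times drawn to roughly match the reported means and SDs (not the study's data).

```python
# Sketch: comparing production times with the Wilcoxon rank sum test.
# Times are simulated around the reported summary statistics.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
digital = rng.normal(185.4, 17.9, 20)       # total minutes, test group
conventional = rng.normal(223.0, 26.2, 20)  # total minutes, control group

stat, p = stats.ranksums(digital, conventional)
print(f"rank sum statistic = {stat:.2f}, p = {p:.4g}")

# Note: in a crossover design each participant yields a pair of times,
# so a paired test on the differences (stats.wilcoxon) would also be a
# natural choice; the abstract reports the rank sum test.
```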

Relevance: 90.00%

Abstract:

OBJECTIVES To assess the presence of within-group comparisons with baseline in a subset of leading dental journals and to explore possible associations with a range of study characteristics, including journal and study design. STUDY DESIGN AND SETTING Thirty consecutive issues of five leading dental journals were electronically searched. The conduct and reporting of statistical analyses, in particular whether comparisons were made against baseline, were assessed along with how the results were interpreted. Descriptive statistics were obtained, and chi-square and Fisher's exact tests were used to test the association between trial characteristics and overall study interpretation. RESULTS A total of 184 studies were included, with the highest proportion published in the Journal of Endodontics (n = 84, 46%) and most involving a single center (n = 157, 85%). Overall, 43 studies (23%) presented interpretation of their outcomes based solely on comparisons against baseline. Inappropriate use of baseline testing was found to be less likely in interventional studies (P < 0.001). CONCLUSION Use of comparisons with baseline appears to be common among both observational and interventional research studies in dentistry. Enhanced conduct and reporting of statistical tests are required to ensure that inferences from research studies are appropriate and informative.
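A minimal sketch of the chi-square/Fisher's exact association test described above, on a hypothetical 2x2 table whose margins echo the reported totals (184 studies, 43 baseline-only); the cell split by design is invented.

```python
# Sketch: association between study design and baseline-only interpretation.
# Counts are hypothetical illustrations, not the paper's cross-tabulation.
import numpy as np
from scipy import stats

#                 baseline-only  appropriate
table = np.array([[10, 80],    # interventional studies (hypothetical)
                  [33, 61]])   # observational studies (hypothetical)

chi2, p_chi, dof, expected = stats.chi2_contingency(table)
odds, p_fisher = stats.fisher_exact(table)
print(f"chi-square p = {p_chi:.4g}, Fisher exact p = {p_fisher:.4g}")
```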

Relevance: 90.00%

Abstract:

Detecting lame cows is important for improving animal welfare. Automated tools are potentially useful for identifying and monitoring lame cows. The goal of this study was to evaluate the suitability of various physiological and behavioral parameters for automatically detecting lameness in dairy cows housed in a cubicle barn. Lame cows suffering from a claw horn lesion (sole ulcer or white line disease) of one claw of the same hind limb (n=32; group L) and 10 nonlame healthy cows (group C) were included in this study. Lying and standing behavior at night (tridimensional accelerometers), weight distribution between hind limbs (4-scale weighing platform), feeding behavior at night (noseband sensor), and heart activity (Polar device; Polar Electro Oy, Kempele, Finland) were assessed. Either the entire data set or parts of the data collected over a 48-h period were used for statistical analysis, depending upon the parameter in question. The standing time at night over 12 h and the limb weight ratio (LWR) were significantly higher in group C than in group L, whereas the lying time at night over 12 h, the mean limb difference (Δweight), and the standard deviation (SD) of the weight applied on the limb taking less weight were significantly lower in group C than in group L. No significant difference was noted between the groups for the parameters of heart activity and feeding behavior at night. The locomotion score of cows in group L was positively correlated with the lying time and Δweight, whereas it was negatively correlated with LWR and SD. The highest sensitivity (0.97) for lameness detection was found for the parameter SD [specificity of 0.80 and an area under the curve (AUC) of 0.84]. The highest specificity (0.90) was found for Δweight (sensitivity=0.78; AUC=0.88) and LWR (sensitivity=0.81; AUC=0.87). The model considering the data of SD together with lying time at night was the best predictor of cows being lame, accounting for 40% of the variation in the likelihood of a cow being lame (sensitivity=0.94; specificity=0.80; AUC=0.86). In conclusion, the data derived from the 4-scale weighing platform, either alone or combined with the lying time at night over 12 h, represent the most valuable parameters for automated identification of lame cows suffering from a claw horn lesion of one hind limb.
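As a hedged illustration of the two-parameter model and its sensitivity/specificity/AUC evaluation, the sketch below fits a logistic classifier on SD and night lying time; scikit-learn is assumed, and all numbers are synthetic stand-ins for the study's measurements.

```python
# Sketch: lameness classifier from weighing-platform SD + night lying time,
# evaluated by AUC, sensitivity, and specificity. Synthetic data only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
n_lame, n_sound = 32, 10
X = np.vstack([
    np.column_stack([rng.normal(18, 4, n_lame),       # SD of weight (kg), lame
                     rng.normal(7.5, 1.0, n_lame)]),  # lying h per 12-h night
    np.column_stack([rng.normal(10, 3, n_sound),      # SD of weight, sound
                     rng.normal(6.0, 1.0, n_sound)]),
])
y = np.r_[np.ones(n_lame), np.zeros(n_sound)]

model = LogisticRegression().fit(X, y)
scores = model.predict_proba(X)[:, 1]
print(f"AUC = {roc_auc_score(y, scores):.2f}")

pred = scores >= 0.5
sens = (pred & (y == 1)).sum() / (y == 1).sum()
spec = (~pred & (y == 0)).sum() / (y == 0).sum()
print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")
```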

Relevance: 90.00%

Abstract:

Background: Research into methods for recovery from exercise-induced fatigue is a popular topic in sports medicine, kinesiology, and physical therapy. However, both the quantity and quality of studies are lacking, and no clear recovery solution has emerged. An analysis of the statistical methods in the existing literature on performance recovery can enhance the quality of research and provide guidance for future studies. Methods: A literature review was performed using the SCOPUS, SPORTDiscus, MEDLINE, CINAHL, Cochrane Library, and Science Citation Index Expanded databases to identify studies of human performance recovery from exercise. Original studies, and their statistical analyses, of recovery methods including Active Recovery, Cryotherapy/Contrast Therapy, Massage Therapy, Diet/Ergogenics, and Rehydration were examined. Results: The review produces a Research Design and Statistical Method Analysis Summary. Conclusion: Research design and statistical methods can be improved by following the guidance in the Research Design and Statistical Method Analysis Summary. This summary table lists potential issues and suggested solutions, such as sample size calculation, consideration of sport-specific and research design issues, selection of populations and outcome markers, statistical methods for different analytical requirements, checks for equality of variance and normality of data, post hoc analyses, and effect size calculation.
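A minimal sketch of the a priori sample-size calculation the summary table recommends, using statsmodels' power tools; the effect size, alpha, and power are assumed illustrative values, not figures from the review.

```python
# Sketch: sample size for a two-group comparison at a given effect size.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.5,   # Cohen's d, assumed
                                   alpha=0.05,
                                   power=0.80,
                                   alternative="two-sided")
print(f"required sample size per group: {n_per_group:.1f}")
```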

Relevance: 90.00%

Abstract:

Mixed longitudinal designs are important study designs for many areas of medical research. Mixed longitudinal studies have several advantages over cross-sectional or purely longitudinal studies, including shorter completion time and the ability to separate time and age effects, and are thus an attractive choice. Statistical methodology for general longitudinal studies has developed rapidly over the last few decades. A common approach for statistical modeling in studies with mixed longitudinal designs has been the linear mixed-effects model incorporating an age or time effect, and the general linear mixed-effects model is considered an appropriate choice for analyzing repeated measurements in longitudinal studies. However, common applications of the linear mixed-effects model to mixed longitudinal studies often incorporate age as the only random effect and fail to account for the cohort effect when conducting statistical inferences on age-related trajectories of outcome measurements. We believe special attention should be paid to cohort effects when analyzing data from mixed longitudinal designs with multiple overlapping cohorts; this is an important statistical issue to address. This research aims to address statistical issues related to mixed longitudinal studies. The proposed study examined the existing statistical analysis methods for mixed longitudinal designs and developed an alternative analytic method that incorporates effects from multiple overlapping cohorts as well as from subjects of different ages. The study used simulation to evaluate the performance of the proposed analytic method by comparing it with the commonly used model. Finally, the study applied the proposed analytic method to data collected by an existing study, Project HeartBeat!, which had previously been evaluated using traditional analytic techniques. Project HeartBeat! is a longitudinal study of cardiovascular disease (CVD) risk factors in childhood and adolescence using a mixed longitudinal design. The proposed model was used to evaluate four blood lipids, adjusting for age, gender, race/ethnicity, and endocrine hormones. The results of this dissertation suggest the proposed analytic model could be a more flexible and reliable choice than the traditional model, fitting the data to provide more accurate estimates in mixed longitudinal studies. Conceptually, the proposed model has useful features, including consideration of effects from multiple overlapping cohorts, and is an attractive approach for analyzing data from mixed longitudinal design studies.
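In the spirit of the proposed approach, here is a minimal sketch of a linear mixed-effects model that adds a fixed cohort term to the usual subject-level random intercept; the column names and simulated data are hypothetical, not Project HeartBeat! variables.

```python
# Sketch: mixed model for a mixed longitudinal design with overlapping
# cohorts. Three cohorts enter at different baseline ages.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
rows = []
for subj in range(120):
    cohort = subj % 3                 # three overlapping age cohorts
    base_age = 8 + 2 * cohort
    u = rng.normal(0, 4)              # subject random intercept
    for visit in range(4):
        age = base_age + visit
        lipid = 150 + 2.0*age + 3.0*cohort + u + rng.normal(0, 5)
        rows.append((subj, cohort, age, lipid))
df = pd.DataFrame(rows, columns=["subject", "cohort", "age", "lipid"])

# Fixed effects: age and cohort; random intercept per subject.
model = smf.mixedlm("lipid ~ age + C(cohort)", df, groups=df["subject"])
result = model.fit()
print(result.summary())
```

Omitting `C(cohort)` here would fold the cohort effect into the age trajectory, which is exactly the inference problem the abstract highlights.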

Relevance: 90.00%

Abstract:

Birth defects are the leading cause of infant mortality in the United States and a major cause of lifetime disability. However, efforts to understand their causes have been hampered by a lack of population-specific data. During 1990–2004, 22 state legislatures responded to this need by proposing birth defects surveillance legislation (BDSL). The contrast between these states and those that did not pass BDSL provides an opportunity to better understand the conditions associated with US public health policy diffusion. This study identifies key state-specific determinants that predict: (1) the introduction of BDSL onto states' formal legislative agendas, and (2) the successful adoption of these laws. Secondary aims were to interpret these findings in a theoretically sound framework and to incorporate evidence from three analytical approaches. The study begins with a comparative case study of Texas and Oregon (states with divergent BDSL outcomes), including a review of historical documentation and content analysis of key informant interviews. After selecting and operationalizing explanatory variables suggested by the case study, Qualitative Comparative Analysis (QCA) was applied to publicly available data to describe important patterns of variation among 37 states. Results from logistic regression were compared to determine whether the two methods produced consistent findings. Themes emerging from the comparative case study included differing budgetary conditions and the significance of relationships within policy issue networks. However, the QCA and statistical analysis pointed to the importance of political parties and contrasting societal contexts. Notably, state policies that allow greater access to citizen-driven ballot initiatives were consistently associated with a lower likelihood of introducing BDSL. Methodologically, these results indicate that a case study approach, while important for eliciting valuable context-specific detail, may fail to detect the influence of overarching, systemic variables such as party competition. However, the QCA and statistical analyses were limited by a lack of existing data to operationalize policy issue networks, and thus may have downplayed the impact of personal interactions. This study contributes to the field of health policy studies in three ways. First, it emphasizes the importance of collegial and consistent relationships among policy issue network members. Second, it calls attention to political party systems in predicting policy outcomes. Finally, it demonstrates a novel approach (QCA) to interpreting state data in a theoretically significant manner.
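A hedged sketch of the logistic-regression arm of such a comparison, relating state characteristics to BDSL adoption; the predictor names, data, and coefficients are hypothetical stand-ins for the variables the case study suggests, not the study's dataset.

```python
# Sketch: logistic regression of policy adoption on state-level predictors.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 37  # states analyzed
df = pd.DataFrame({
    "party_competition": rng.uniform(0, 1, n),    # hypothetical index
    "ballot_initiative": rng.integers(0, 2, n),   # citizen-initiative access
})
# Simulate adoption with the signs suggested by the findings.
logit_p = -0.5 + 2.0*df.party_competition - 1.5*df.ballot_initiative
df["adopted_bdsl"] = (rng.random(n) < 1/(1 + np.exp(-logit_p))).astype(int)

result = smf.logit("adopted_bdsl ~ party_competition + ballot_initiative",
                   df).fit(disp=False)
print(result.params)
```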

Relevance: 90.00%

Abstract:

The geometrical factors defining an adhesive joint are of great importance, as its design strongly conditions the performance of the bond. One of the most relevant is the adhesive thickness, which decisively influences the mechanical properties of the bond and has a clear economic impact on manufacturing processes, particularly long production runs. Traditional mechanical joints (riveting, welding, etc.) are characterised by predictable performance and are very reliable in service conditions. Thus, structural adhesive joints will only be selected for industrial applications with demanding mechanical requirements and adverse environmental conditions if suitable reliability (equal to or higher than that of mechanical joints) is guaranteed. To this end, the objective of this paper is to analyse the influence of the adhesive thickness on the mechanical behaviour of the joint and, by means of a statistical analysis based on the Weibull distribution, to propose the optimum adhesive thickness combining the best mechanical performance with high reliability. This procedure, readily applicable to other joints and adhesives, provides a general framework for the more reliable use of adhesive bonding and, therefore, for its better and wider use in industrial manufacturing processes.
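A hedged sketch of the Weibull treatment: fitting a two-parameter Weibull to joint-strength data for a single bondline thickness; the strength values are synthetic illustrations, not the paper's measurements.

```python
# Sketch: two-parameter Weibull fit to lap-shear strength results.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
strength = rng.weibull(8.0, 30) * 25.0  # MPa, hypothetical test results

# Fix the location parameter at 0 for the classical 2-parameter form.
shape, loc, scale = stats.weibull_min.fit(strength, floc=0)
print(f"Weibull modulus (shape) = {shape:.2f}, "
      f"characteristic strength = {scale:.2f} MPa")

# A higher Weibull modulus means less scatter, i.e. a more reliable joint;
# repeating the fit per thickness is one way to locate the optimum.
```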

Relevance: 90.00%

Abstract:

Highway concession contracts often include clauses (for example, a minimum traffic guarantee) that allow for better management of business risk. The value of these clauses may be substantial and should be added to the total value of the concession. In these cases, however, traditional valuation techniques, such as the NPV (net present value) of the project, are insufficient. An alternative methodology for the valuation of highway concessions is one based on the real options approach. This methodology is generally built on the assumption that traffic volume evolves as a GBM (geometric Brownian motion), the hypothesis analyzed in this paper. First, the methodology used to test for unit roots (i.e., the hypothesis of non-stationarity) is described; the Dickey-Fuller approach, the most common test for this kind of analysis, is used. This methodology is then applied to a statistical analysis of traffic series on Spanish toll highways, using data on the AADT (annual average daily traffic) for a set of highways over a period of roughly thirty years in most cases. The main outcome of the research is that the hypothesis that traffic volume follows a GBM process on Spanish toll highways cannot be rejected. This result is robust and can therefore serve as a starting point for applying real options theory to the assessment of toll highway concessions.
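A minimal sketch of the unit-root test described: an augmented Dickey-Fuller test on log-AADT. A GBM hypothesis implies that log-traffic has a unit root, so failing to reject is consistent with GBM. The series below is simulated as a random walk with drift, not actual Spanish traffic data.

```python
# Sketch: ADF unit-root test on a simulated ~30-year log-AADT series.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(7)
log_aadt = np.cumsum(rng.normal(0.02, 0.05, 30)) + np.log(20000)

stat, pvalue, usedlag, nobs, crit, icbest = adfuller(log_aadt)
print(f"ADF statistic = {stat:.2f}, p-value = {pvalue:.3f}")
if pvalue > 0.05:
    print("unit root not rejected: consistent with GBM traffic dynamics")
```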

Relevance: 90.00%

Abstract:

This thesis analyzes the morphological evolution of assemblies of living neurons as they self-organize from collections of separated cells into elaborate, clustered networks. In particular, it contributes the design and implementation of a graph-based unsupervised segmentation algorithm with a very low computational cost. The processing automatically retrieves the whole network structure from large-scale phase-contrast images taken at high resolution throughout the entire life of a cultured neuronal network. The network structure is represented by a mathematical object (a matrix) in which nodes are identified neurons or neuron clusters, and links are the reconstructed connections between them. The algorithm also extracts other relevant morphological information characterizing neurons and neurites. More importantly, and unlike other segmentation methods that require fluorescence imaging from immunocytochemistry techniques, our measures are non-invasive and allow a fully longitudinal analysis during the maturation of a single culture. In turn, a systematic statistical analysis of a group of topological observables makes it possible to quantify and track the progression of the main network characteristics during the self-organization process of the culture. Our results point to the existence of a particular state corresponding to a small-world network configuration, in which several relevant graph properties emerge at the micro- and meso-scale. Finally, we identify the main physical processes taking place during the cultures' morphological transformations and embed them into a simplified growth model that quantitatively reproduces the overall set of experimental observations.
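A minimal sketch of the kind of small-world observables the thesis tracks, computed with networkx on a toy Watts-Strogatz graph standing in for a reconstructed culture network; this is illustrative only, not the thesis's segmentation pipeline.

```python
# Sketch: small-world topological observables on a toy graph.
import networkx as nx

G = nx.watts_strogatz_graph(n=100, k=6, p=0.1, seed=8)

clustering = nx.average_clustering(G)
path_len = nx.average_shortest_path_length(G)
print(f"average clustering = {clustering:.3f}")
print(f"average shortest path length = {path_len:.2f}")
# A small-world configuration combines high clustering (lattice-like)
# with a short mean path length (random-graph-like).
```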

Relevance: 90.00%

Abstract:

Accuracy analysis of global digital elevation models. ABSTRACT: Terrain-based analysis produces derived products from an input DEM, and these products are needed to perform various analyses. To use these products efficiently in decision-making, their accuracies must be estimated systematically. This paper proposes a procedure to assess the accuracy of these derived products by calculating the accuracy of the slope dataset and its significance, taking the accuracy of the DEM as input. Based on the output of previously published research on modeling the relative accuracy of a DEM, specifically the ASTER and SRTM DEMs with Lebanon coverage as the study area, the analysis showed that ASTER has low significance over the majority of the area, with only 2% of the modeled terrain reaching 50% or more significance. SRTM, on the other hand, showed better significance, with 37% of the modeled terrain at 50% or more. Statistical analysis showed that the accuracy of the slope dataset, calculated on a cell-by-cell basis, is highly correlated with the accuracy of the input DEM. This correlation is lower between slope accuracy and slope significance, whereas it is much higher between the modeled slope and slope significance.
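A hedged sketch of a cell-by-cell slope derivation and correlation of the kind described: slope is computed from a synthetic DEM grid and correlated with a hypothetical per-cell accuracy surface. The grid spacing, terrain, and accuracy relationship are all invented for illustration.

```python
# Sketch: derive a slope dataset from a DEM and correlate per-cell fields.
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
cell = 30.0  # m, grid spacing (SRTM-like, assumed)
dem = np.cumsum(rng.normal(0, 1, (100, 100)), axis=0)  # synthetic terrain

dz_dy, dz_dx = np.gradient(dem, cell)
slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

# Hypothetical per-cell DEM accuracy, partly driven by slope plus noise.
dem_accuracy = 5 + 0.3*slope_deg + rng.normal(0, 1, slope_deg.shape)

r, p = stats.pearsonr(slope_deg.ravel(), dem_accuracy.ravel())
print(f"slope vs. DEM accuracy: r = {r:.2f}, p = {p:.3g}")
```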

Relevance: 90.00%

Abstract:

Optimizing the Quality of Experience (QoE) of HTTP adaptive video streaming (HAS) is receiving increasing attention. The growth of interest is mainly caused by the fact that current HAS solutions are not QoE-driven, i.e., end-user quality perception is not an integral part of the adaptation logic. However, obtaining reliable ground truths on HAS QoE faces substantial challenges, since the subjective video quality assessment methodologies proposed by current standards are not well suited to the time-varying quality that is characteristic of HAS. This thesis investigates the influence of dynamic quality adaptation on the QoE of streaming video by means of subjective evaluation approaches. Based on a comprehensive survey of related work on subjective HAS QoE assessment, the associated challenges and open research questions are highlighted and discussed. As a result, two main research directions are selected for further investigation: analysis of the QoE impact of different technical adaptation parameters, and investigation of testing methodologies suitable for HAS QoE evaluation. To investigate the related research questions, a set of laboratory experiments was conducted using different subjective testing methodologies. Our statistical analysis demonstrates that not all assumptions and claims reported in the literature are robust, particularly as regards the QoE impact of switching frequency, smooth vs. abrupt switching, and quality oscillation. On the other hand, our results confirm the influence of other parameters, such as chunk length and switching amplitude, on perceived quality. We also show that taking the objective characteristics of the content into account can be beneficial for improving the adaptation viewing experience. All of these findings are validated by means of an extensive cross-experimental analysis involving external laboratory and crowdsourcing studies. Finally, to address the methodological aspects of subjective QoE testing, a comparison was performed between experimental results obtained with a standardized short-stimulus ACR method and with a semi-continuous method developed for the assessment of long video sequences. In spite of some observed differences, the statistical analysis does not show any significant effect of testing methodology. Similarly, although the presence of audio is perceived to influence the evaluation of video-related degradations, no statistically significant effect of audio presence could be found. Motivated by these findings (no effect of testing method or audio presence), a subsequent analysis investigated the impact of performing multiple statistical comparisons on statistical significance levels, which increases the likelihood of Type-I errors (false positives). Our results show that, to obtain robust effects from the statistical analysis of subjective results, it is necessary to increase the number of test subjects well beyond the sample sizes proposed by current quality assessment standards and recommendations.
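A minimal sketch of the multiple-comparison correction implied by the Type-I error discussion, using a Holm adjustment via statsmodels; the p-values are invented for illustration.

```python
# Sketch: controlling the family-wise error rate across several tests.
from statsmodels.stats.multitest import multipletests

pvals = [0.012, 0.034, 0.049, 0.18, 0.61]  # hypothetical per-condition tests
reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="holm")

for p, pa, r in zip(pvals, p_adj, reject):
    print(f"raw p = {p:.3f} -> Holm-adjusted p = {pa:.3f}, reject = {r}")
```

After adjustment, borderline raw p-values (e.g., 0.049) typically no longer reach significance, which is one reason larger subject panels are needed for robust conclusions.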