990 results for Controlled experiment
Abstract:
Nowadays, the importance of using software processes is well consolidated and is considered fundamental to the success of software development projects. Large and medium software projects demand the definition and continuous improvement of software processes in order to promote the productive development of high-quality software. Customizing and evolving existing software processes to address the variety of scenarios, technologies, cultures, and scales is a recurrent challenge in the software industry. It involves adapting software process models to the reality of their projects, and it must also promote the reuse of past experiences in the definition and development of software processes for new projects. The adequate management and execution of software processes can bring better quality and productivity to the produced software systems. This work aimed to explore the use and adaptation of consolidated software product line techniques to manage the variabilities of software process families. In order to achieve this aim: (i) a systematic literature review was conducted to identify and characterize variability management approaches for software processes; (ii) an annotative approach for the variability management of software process lines was proposed and developed; and finally (iii) empirical studies and a controlled experiment assessed and compared the proposed annotative approach against a compositional one. One study, a comparative qualitative study, analyzed the annotative and compositional approaches from different perspectives, such as modularity, traceability, error detection, granularity, uniformity, adoption, and systematic variability management. Another study, a comparative quantitative study, considered internal attributes of the specification of software process lines, such as modularity, size, and complexity.
Finally, the last study, a controlled experiment, evaluated the effort to use and the understandability of the investigated approaches when modeling and evolving specifications of software process lines. The studies bring evidence of several benefits of the annotative approach, and of the potential of integrating it with the compositional approach, to assist the variability management of software process lines.
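The annotative idea this abstract investigates can be sketched in a few lines: tag each process element with a presence condition over features, and derive a concrete process by evaluating those conditions. The following is a minimal illustration with hypothetical element and feature names, not the thesis's actual notation or tooling.

```python
# Minimal sketch of annotative variability in a software process line.
# Element and feature names are hypothetical, for illustration only.

process_line = [
    # (process element,          required features — None means mandatory)
    ("Requirements elicitation", None),
    ("Formal specification",     {"safety_critical"}),
    ("Code review",              {"peer_review"}),
    ("Unit testing",             None),
]

def derive(elements, features):
    """Keep mandatory elements plus those whose presence condition holds."""
    return [name for name, cond in elements
            if cond is None or cond <= features]

print(derive(process_line, {"peer_review"}))
# ['Requirements elicitation', 'Code review', 'Unit testing']
```

A compositional approach would instead keep the optional elements in separate modules and merge them into the base process; the annotations above keep everything in one artifact, which is the trade-off the studies compare.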
Abstract:
There is a growing interest in the Computer Science education community in including testing concepts in introductory programming courses. Aiming to contribute to this issue, we introduce POPT, a Problem-Oriented Programming and Testing approach for introductory programming courses. POPT's main goal is to improve the traditional method of teaching introductory programming, which concentrates mainly on implementation and neglects testing. POPT extends the POP (Problem-Oriented Programming) methodology proposed in the PhD thesis of Andrea Mendonça (UFCG). In both methodologies, POPT and POP, students' skills in dealing with ill-defined problems must be developed from the first programming courses. In POPT, however, students are stimulated to clarify ill-defined problem specifications, guided by the definition of test cases (in a table-like manner). This paper presents POPT and TestBoot, a tool developed to support the methodology. In order to evaluate the approach, a case study and a controlled experiment (which adopted the Latin Square design) were performed in an introductory programming course of the Computer Science and Software Engineering programs at the Federal University of Rio Grande do Norte, Brazil. The study results have shown that, when compared to a blind-testing approach, POPT stimulates the implementation of programs of better external quality: the first program version submitted by POPT students passed twice as many test cases (professor-defined ones) as those of non-POPT students. Moreover, POPT students submitted fewer program versions and spent more time before submitting the first version to the automatic evaluation system, which leads us to think that POPT students are stimulated to think more carefully about the solution they are implementing. The controlled experiment confirmed the influence of the proposed methodology on the quality of the code developed by POPT students.
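The test-case tables the abstract describes can be pictured as follows: before implementing, the student writes a table that pins down the edge cases an ill-defined specification leaves open, then codes against it. The problem, table entries, and function name below are hypothetical, chosen only to illustrate the workflow.

```python
# Hypothetical POPT-style exercise: the spec "return the largest even
# number in a list" is ill-defined (what if there is no even number?).
# The test table forces those decisions before any code is written.

test_table = [
    # (input,        expected)  — rationale
    ([2, 7, 10, 3],  10),       # ordinary case
    ([5, 4],         4),        # single even value
    ([1, 3, 5],      None),     # no even numbers: table fixes the answer as None
    ([],             None),     # empty input: also None
]

def largest_even(values):
    """Implementation written after, and against, the table above."""
    evens = [v for v in values if v % 2 == 0]
    return max(evens) if evens else None

results = [largest_even(inp) == want for inp, want in test_table]
print(all(results))  # True: the implementation satisfies the table
```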
Abstract:
A controlled experiment, related to the mating system of Scinax rizibilis (Bokermann, 1964), was conducted to assess whether larval variation could be due to the size of the male or its ability to manage an amplexus. Adult individuals were caught during breeding activity (from February 1993 to January 1994) in a temporary pond in the municipality of Ribeirão Branco, São Paulo State, southeastern Brazil. The duration of the larval period was not different between tadpoles of large and small males, nor was it different between tadpoles coming from natural or artificial pairs. The reproductive status of the male (whether it had managed an amplexus) also did not influence the total length or the mass of the tadpoles close to metamorphosis. However, tadpoles of larger and heavier males were, on average, approximately 5.5% and 11% larger and heavier, respectively, than tadpoles of smaller males. These results indicate that the breeding system of S. rizibilis could potentially have a directional effect on larval characteristics.
Abstract:
The influence of bulk light absorption on running photorefractive holograms is investigated. By solving the coupled wave equations we prove that the beam intensities, but not the beam phases, can be calculated by averaging the coupling constant over the crystal thickness. We show the importance of the effect by calculating the dielectric relaxation time at the crystal front, and from that the quantum efficiency, from a feedback-controlled experiment with a 2.05 mm thick BTO crystal. We propose to simulate the effect of bulk light absorption by a rough estimate of the average dielectric relaxation time, which is related in a simple way to the dielectric relaxation time at the crystal front; in doing so, an error of less than 10% is introduced.
Abstract:
The feeding choices of the mangrove crab Ucides cordatus for various mangrove plant leaves (Avicennia schaueriana, Laguncularia racemosa, and Rhizophora mangle) at different ages (mature, senescent pre-abscission, and decomposing leaves) were examined. In a controlled experiment set in a mangrove area, we evaluated crab selection for different plant leaves by analyzing foraging rate (number of leaves with predation marks) and leaf consumption. Crabs were housed individually in plastic containers and, after a 3-day fast, supplied with leaf fragments every 24 h for 72 h. Uneaten leaves were removed before each new food offering. No food selection was observed on the first day, but after this period, senescent leaves, which have a high polyphenol content, were rejected. On the third day, an interactive effect between plant species and leaf age was shown to affect leaf selection, with mature leaves of A. schaueriana and L. racemosa being selected more often than the other treatments. This observation was consistent across crab sexes and ages. Our results show that food selection by this mangrove crab changes through time in fasted animals, suggesting that this variable must be controlled in food preference studies. © 2012 Springer Science+Business Media B.V.
Abstract:
Spreadsheets are widely used but often contain faults. Thus, in prior work we presented a data-flow testing methodology for use with spreadsheets, which studies have shown can be used cost-effectively by end-user programmers. To date, however, the methodology has been investigated across a limited set of spreadsheet language features. Commercial spreadsheet environments are multiparadigm languages, utilizing features not accommodated by our prior approaches. In addition, most spreadsheets contain large numbers of replicated formulas that severely limit the efficiency of data-flow testing approaches. We show how to handle these two issues with a new data-flow adequacy criterion and automated detection of areas of replicated formulas, and report results of a controlled experiment investigating the feasibility of our approach.
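The replicated-formula problem the abstract mentions can be sketched concisely: if formulas are stored in a relative (R1C1-like) form, copied-down formulas compare equal, and contiguous runs in a column collapse into one region, so a data-flow testing tool can validate one representative cell per region. The representation and function below are assumptions for illustration, not the paper's actual detection algorithm.

```python
# Minimal sketch: detect runs of replicated formulas in one column.
# Formulas are assumed to already be in relative R1C1-style text, so
# that "copies" of a formula are string-equal; None marks an empty cell.

def formula_regions(column):
    """column: list of relative-formula strings (or None), top to bottom.
    Returns (start_row, end_row, formula) for each maximal run."""
    regions, start = [], 0
    for row in range(1, len(column) + 1):
        if row == len(column) or column[row] != column[start]:
            if column[start] is not None:
                regions.append((start, row - 1, column[start]))
            start = row
    return regions

col = ["R[0]C[-1]*2", "R[0]C[-1]*2", "R[0]C[-1]*2", None, "SUM(R[-3]C:R[-1]C)"]
print(formula_regions(col))
# [(0, 2, 'R[0]C[-1]*2'), (4, 4, 'SUM(R[-3]C:R[-1]C)')]
```

Testing one cell per region instead of every cell is what recovers the efficiency that naive per-cell data-flow analysis loses on large replicated areas.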
Abstract:
Homeopathic preparations are used in homeopathy and anthroposophic medicine. Although there is evidence of effectiveness in several clinical studies, including double-blinded randomized controlled trials, their nature and mode of action have not yet been explained with current scientific approaches. Several physical methods have already been applied to investigate homeopathic preparations, but it is still unclear which methods are best suited to identify characteristic physicochemical properties of homeopathic preparations. The aim of this study was to investigate homeopathic preparations with UV-spectroscopy. In a blinded, randomized, controlled experiment, homeopathic preparations of copper sulfate (CuSO(4); 11c-30c), quartz (SiO(2); 10c-30c, i.e., centesimal dilution steps) and sulfur (S; 11×-30×, i.e., decimal dilution steps) and controls (one-time succussed diluent) were investigated using UV-spectroscopy and tested for contamination by inductively coupled plasma mass spectrometry (ICP-MS). The UV transmission for homeopathic CuSO(4) preparations was significantly lower than in controls. The transmission seemed to be lower for both SiO(2) and S as well, but not significantly. The mean effect size (95% confidence interval) was similar for the homeopathic preparations: CuSO(4) (pooled data) 0.0544% (0.0260-0.0827%), SiO(2) 0.0323% (-0.0064% to 0.0710%) and S 0.0281% (-0.0520% to 0.1082%). UV transmission values of homeopathic preparations had a significantly higher variability compared to controls. In none of the samples did the concentration of any element analyzed by ICP-MS exceed 100 ppb. Lower transmission of UV light may indicate that homeopathic preparations are less structured or more dynamic than their succussed pure solvent.
Abstract:
Proton therapy is growing increasingly popular due to its superior dose characteristics compared to conventional photon therapy. Protons travel a finite range in the patient body and stop, thereby delivering no dose beyond their range. However, because the range of a proton beam is heavily dependent on the tissue density along its beam path, uncertainties in patient setup position and inherent range calculation can degrade the dose distribution significantly. Despite these challenges that are unique to proton therapy, current management of the uncertainties during treatment planning of proton therapy has been similar to that of conventional photon therapy. The goal of this dissertation research was to develop a treatment planning method and a plan evaluation method that address proton-specific issues regarding setup and range uncertainties. Treatment plan design method adapted to proton therapy: Currently, for proton therapy using a scanning beam delivery system, setup uncertainties are largely accounted for by geometrically expanding a clinical target volume (CTV) to a planning target volume (PTV). However, a PTV alone cannot adequately account for range uncertainties coupled to misaligned patient anatomy in the beam path, since it does not account for the change in tissue density. In order to remedy this problem, we proposed a beam-specific PTV (bsPTV) that accounts for the change in tissue density along the beam path due to the uncertainties. Our proposed method was successfully implemented, and its superiority over the conventional PTV was shown through a controlled experiment. Furthermore, we have shown that the bsPTV concept can be incorporated into beam angle optimization for better target coverage and normal tissue sparing for a selected lung cancer patient.
Treatment plan evaluation method adapted to proton therapy: The dose-volume histogram of the clinical target volume (CTV) or any other volumes of interest at the time of planning does not represent the most probable dosimetric outcome of a given plan as it does not include the uncertainties mentioned earlier. Currently, the PTV is used as a surrogate of the CTV’s worst case scenario for target dose estimation. However, because proton dose distributions are subject to change under these uncertainties, the validity of the PTV analysis method is questionable. In order to remedy this problem, we proposed the use of statistical parameters to quantify uncertainties on both the dose-volume histogram and dose distribution directly. The robust plan analysis tool was successfully implemented to compute both the expectation value and its standard deviation of dosimetric parameters of a treatment plan under the uncertainties. For 15 lung cancer patients, the proposed method was used to quantify the dosimetric difference between the nominal situation and its expected value under the uncertainties.
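The robust evaluation the dissertation proposes amounts to summarizing a dose-volume metric by its expectation and spread over uncertainty scenarios instead of by a single PTV worst-case surrogate. The sketch below uses invented voxel doses and a simplified D95 (the minimum-percentile dose in a tiny array) purely to show the bookkeeping, not the dissertation's implementation.

```python
# Illustrative sketch: expectation and standard deviation of a DVH metric
# (D95) across setup/range uncertainty scenarios. All numbers hypothetical.
import statistics

def d95(doses):
    """Dose received by at least 95% of voxels (5th-percentile dose)."""
    s = sorted(doses)
    return s[int(0.05 * (len(s) - 1))]

def robust_summary(scenario_doses):
    """Mean and sample standard deviation of D95 over all scenarios."""
    vals = [d95(d) for d in scenario_doses]
    return statistics.mean(vals), statistics.stdev(vals)

# Four hypothetical uncertainty scenarios for a 5-voxel target (Gy):
scenarios = [
    [60, 61, 62, 60, 59],
    [58, 60, 61, 59, 57],
    [60, 60, 60, 60, 60],
    [55, 59, 60, 58, 54],
]
mean_d95, sd_d95 = robust_summary(scenarios)
print(round(mean_d95, 2), round(sd_d95, 2))  # 57.5 2.65
```

Reporting 57.5 ± 2.65 Gy conveys both the expected coverage and its sensitivity to the uncertainties, which a single nominal or worst-case number cannot.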
Abstract:
Currently, there is a great deal of well-founded explicit knowledge formalizing general notions, such as time concepts and the part_of relation. Yet, it is often the case that, instead of reusing ontologies that implement such notions (the so-called general ontologies), engineers create procedural programs that implicitly implement this knowledge. They do not save time and code by reusing explicit knowledge, and they devote effort to solving problems that other people have already adequately solved. Consequently, we have developed a methodology that helps engineers to: (a) identify the type of general ontology to be reused; (b) find out which axioms and definitions should be reused; (c) make a decision, using formal concept analysis, on which general ontology is going to be reused; and (d) adapt and integrate the selected general ontology into the domain ontology to be developed. To illustrate our approach we have employed use cases. For each use case, we provide a set of heuristics with examples. Each of these heuristics has been tested in either OWL or Prolog. Our methodology has been applied to develop a pharmaceutical product ontology. Additionally, we have carried out a controlled experiment with graduate students taking an MSc in Artificial Intelligence. This experiment has yielded some interesting findings concerning what kind of features future extensions of the methodology should have.
Abstract:
The verification and validation activity plays a fundamental role in improving software quality. Determining which techniques are most effective for carrying out this activity has been an aspiration of experimental software engineering researchers for years. This paper reports a controlled experiment evaluating the effectiveness of two unit testing techniques: the functional testing technique known as equivalence partitioning (EP) and the control-flow structural testing technique known as branch testing (BT). This experiment is a literal replication of Juristo et al. (2013). Both experiments serve the purpose of determining whether the effectiveness of BT and EP varies depending on whether or not the faults are visible to the technique (InScope or OutScope, respectively). We have used the materials, design, and procedures of the original experiment, but in order to adapt the experiment to the context we have: (1) reduced the number of studied techniques from 3 to 2; (2) assigned subjects to experimental groups by means of stratified randomization to balance the influence of programming experience; (3) localized the experimental materials; and (4) adapted the training duration. We ran the replication at the Escuela Politécnica del Ejército Sede Latacunga (ESPEL) as part of a software verification & validation course. The experimental subjects were 23 master's degree students. EP is more effective than BT at detecting InScope faults. The session/program and group variables are found to have significant effects. BT is more effective than EP at detecting OutScope faults. The session/program and group variables have no effect in this case. The results of the replication and the original experiment are similar with respect to testing techniques. There are some inconsistencies with respect to the group factor. They can be explained by small sample effects.
The results for the session/program factor are inconsistent for InScope faults. We believe that these differences are due to a combination of the fatigue effect and a technique × program interaction. Although we were able to reproduce the main effects, the changes to the design of the original experiment make it impossible to identify the causes of the discrepancies with certainty. We believe that further replications closely resembling the original experiment should be conducted to improve our understanding of the phenomena under study.
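The EP/BT distinction the experiment studies can be made concrete with a toy example (entirely hypothetical, not taken from the experiment's materials): a seeded boundary fault is visible to tests derived from the specification's partitions, while tests that merely cover both branches of the faulty code can pass.

```python
# Hypothetical illustration of equivalence partitioning (EP) vs.
# branch testing (BT) on a function with a seeded boundary fault.

def grade(score):
    """Spec: 0-59 -> 'fail', 60-100 -> 'pass'. Seeded fault: boundary at 61."""
    if score >= 61:          # fault: should be score >= 60
        return "pass"
    return "fail"

# EP derives one representative per spec partition, plus boundary values.
ep_cases = [(0, "fail"), (59, "fail"), (60, "pass"), (100, "pass")]

# BT derives inputs that exercise each branch of the code as written.
bt_cases = [(70, "pass"), (30, "fail")]

ep_detects = any(grade(x) != want for x, want in ep_cases)
bt_detects = any(grade(x) != want for x, want in bt_cases)
print(ep_detects, bt_detects)  # True False
```

Here the fault is "in scope" for EP (its boundary case 60 exposes it) but a branch-adequate BT suite can miss it, which is the kind of visibility asymmetry the InScope/OutScope analysis quantifies.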
Abstract:
This study analyzes the structure of the game and the morphological and physiological parameters in badminton players, investigated in four applied studies. Purpose: The purposes of the study were: (1) To check if there are differences between the dominant and non-dominant side in the anthropometric measures of badminton players at the highest national level and verify if the side of the body where the measurements are performed can influence the calculation of the body composition and the somatotype. (2) To compare the temporal and notational structure in men's singles matches between the Olympic Games in Beijing and London to observe the evolution of badminton between 2008 and 2012. (3) To assess the occurrence of muscle damage after a simulated badminton match and its influence on physical and haematological parameters. (4) To determine the effectiveness of a commercially available energy drink that contains caffeine to improve match performance in elite badminton players.
Methods: A total of 78 elite badminton players (63 men and 15 women), distributed across three studies, participated in this thesis to characterize the sport of badminton, and 40 sets of men's singles badminton were analyzed using the official videos of the Olympic Games of Beijing 2008 and London 2012. In the first study, skinfolds, diameters, lengths and perimeters of the dominant and non-dominant side of the players were measured, and body composition and somatotype were calculated. In the second study, the temporal and notational factors of the matches were analyzed. In the third study, maximal isometric force and speed in badminton-specific tests were measured, and blood samples were taken before and after a badminton match of 45 minutes. In the fourth study, a double-blind, randomized, placebo-controlled experiment, players ingested 3 mg of caffeine per kilogram of body mass in the form of an energy drink or an identical drink with no caffeine content (placebo). In this study, different badminton-specific tests (jump tests, handgrip force test and an agility test) were recorded and a simulated badminton match of 45 minutes was played. Results and discussion: (1) The percentage of bone was higher when calculated from measurements of the dominant body side (dominant = 16.37 ± 1.14 %, non-dominant = 15.66 ± 1.12 %; P < 0.001), while the muscle percentage was higher when calculated from measurements of the non-dominant side (dominant = 49.39 ± 2.60 %, non-dominant = 50.18 ± 2.69 %; P < 0.001). (2) Set duration (Beijing: 1124.6 ± 229.9 s vs. London: 1260.3 ± 267.1 s; P < 0.05), real time played (Beijing: 306.9 ± 45.7 s vs. London: 354.7 ± 86.5 s; P < 0.05), rally time, shots per rally, rest time at point 11 and rest time between sets were significantly higher in London than in Beijing.
(3) A simulated badminton match did not affect maximal isometric force (Pre: 1263.6 ± 245.5, Post: 1290.8 ± 240.4 N) or specific badminton speed (Pre: 21.0 ± 1.7, Post: 20.9 ± 1.8 s), however, concentrations of myoglobin and creatine kinase in blood increased from 26.5 ± 11.6 to 197.3 ± 70.2 μg • L-1 and from 258.6 ± 192.2 to 466.0 ± 296.5 U • L-1, respectively after the badminton match. (4) In comparison to the placebo drink, the caffeinated beverage increased height in the SJ (34.5±4.7 vs. 36.4±4.3 cm; P < 0.05) and in the CMJ (37.7 ± 4.5 vs. 39.5 ± 5.1 cm; P < 0.05) and increased the number of total accelerations during the match (7395 ± 1594 vs. 7707 ± 2033 accelerations; P < 0.05). Conclusions: (1) Body asymmetries were found in high level badminton players, due to the differences found in bone diameters and perimeters between the dominant and non-dominant body side. When calculating body composition with the dominant side of the badminton players we are overestimating bone percentage and underestimating muscle percentage. (2) Badminton is evolving towards longer rallies with greater rest intervals, resulting in longer matches. (3) The badminton match generated muscle damage, however, the level of muscle damage reached after a badminton match did not produce a decrease in muscle performance. (4) The ingestion of an energy drink containing caffeine might be an effective ergogenic nutritional supplement to increase jump performance and activity patterns during the game in elite badminton players.
Abstract:
We present statistical methods for analyzing replicated cDNA microarray expression data and report the results of a controlled experiment. The study was conducted to investigate inherent variability in gene expression data and the extent to which replication in an experiment produces more consistent and reliable findings. We introduce a statistical model to describe the probability that mRNA is contained in the target sample tissue, converted to probe, and ultimately detected on the slide. We also introduce a method to analyze the combined data from all replicates. Of the 288 genes considered in this controlled experiment, 32 would be expected to produce strong hybridization signals because of the known presence of repetitive sequences within them. Results based on individual replicates, however, show that there are 55, 36, and 58 highly expressed genes in replicates 1, 2, and 3, respectively. On the other hand, an analysis using the combined data from all 3 replicates reveals that only 2 of the 288 genes are incorrectly classified as expressed. Our experiment shows that any single microarray output is subject to substantial variability. By pooling data from replicates, we can provide a more reliable analysis of gene expression data. Therefore, we conclude that designing experiments with replications will greatly reduce misclassification rates. We recommend that at least three replicates be used when designing experiments with cDNA microarrays, particularly when gene expression data from single specimens are being analyzed.
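The pooling effect the abstract reports can be illustrated in miniature: a single noisy replicate can flip a gene's expressed/not-expressed call, while averaging across replicates before thresholding absorbs the noise. The gene names, signal values, and threshold below are invented for illustration and are much simpler than the paper's statistical model.

```python
# Illustrative sketch (hypothetical numbers): per-replicate calls vs.
# calls made after pooling replicate signals, with an assumed cutoff.
import statistics

THRESHOLD = 2.0  # assumed signal cutoff for calling a gene "expressed"

# Three replicate measurements for four hypothetical genes.
replicates = {
    "geneA": [2.5, 2.8, 2.4],   # consistently expressed
    "geneB": [0.3, 2.2, 0.1],   # one noisy replicate
    "geneC": [0.2, 0.1, 0.4],   # consistently not expressed
}

# Calling each replicate separately lets noise through:
per_replicate_calls = {
    g: [v > THRESHOLD for v in vals] for g, vals in replicates.items()
}

# Pooling (here, a simple mean) before thresholding is more stable:
pooled_calls = {
    g: statistics.mean(vals) > THRESHOLD for g, vals in replicates.items()
}
print(per_replicate_calls["geneB"], pooled_calls["geneB"])
# [False, True, False] False — pooling averages out the noisy replicate
```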