347 results for Randomization


Relevance: 10.00%

Abstract:

Structural composite lumber (SCL) products often possess significantly higher design values than the top grades of solid lumber, making them a popular choice for both residential and commercial applications. The enhanced mechanical properties of SCL are due mainly to defect randomization and densification of the wood fiber, both largely functions of the size, shape and composition (species) of the wood element. Traditionally, SCL manufacturers have used thin, rectangular elements produced from either moderate-density softwoods or low-density hardwoods. Higher-density hardwood species have been avoided, as they require higher pressures to adequately densify and consolidate the wood furnish. These higher pressures can lead to increased manufacturing costs, damage to the wood fiber and/or a product that is too dense, making it heavy and unreceptive to common mechanical fastening techniques. In the northeastern United States, high-density, diffuse-porous hardwoods (such as maple, beech and birch) are abundant. Use of these species as primary furnish for an SCL product may allow for a competitive advantage in resource cost over products that rely on veneer-grade logs. Proximity to this abundant and relatively inexpensive resource may facilitate entry of SCL production facilities into the northeastern United States, where currently none exist. However, modifications to current strand sizes, geometries or production techniques will likely be required to allow for use of these species. A new SCL product concept has been invented that allows for use of these high-density hardwoods. The product, referred to as long-strand structural composite lumber (LSSCL), uses strands of significantly larger cross-sectional areas and volumes than existing SCL products. In spite of the large strand size, satisfactory consolidation is achieved without excessive densification of the wood fiber through use of a symmetrical strand cross-section. LSSCL density is similar to that of existing SCL products, but is due mainly to the inherent density of the species rather than to densification. An experiment was designed and conducted to produce LSSCL from both large (7/16") and small (1/4") strands, of both square and triangular cross-sections. Testing results indicate that the large, triangular strands produce LSSCL beams with projected design values of: modulus of elasticity (MOEapp), 1,750,000 psi; allowable bending stress (Fb), 2,750 psi; allowable shear stress (Fv), 260 psi. Several modifications are recommended which may lead to improvement of these values, likely allowing for competition with existing SCL products.

Relevance: 10.00%

Abstract:

The purpose of this research was to determine whether principles from organizational theory could be used as a framework to compare and contrast safety interventions developed by for-profit industry during the period 1986-1996. A literature search of electronic databases and a manual search of journals and local university libraries' book stacks were conducted for safety interventions developed by for-profit businesses. To maintain a constant regulatory environment, the business sectors of nuclear power, aviation and non-profits were excluded. Safety intervention evaluations were screened for scientific merit. Leavitt's model from organizational theory was updated to include safety climate and renamed the Updated Leavitt's Model. In all, 8,000 safety citations were retrieved, 525 met the inclusion criteria, 255 met the organizational safety intervention criteria, and 50 met the scientific merit criteria. Most came from non-public-health journals. These 50 were categorized by the Updated Leavitt's Model according to where within the organizational structure the intervention took place. Evidence tables were constructed for descriptive comparison. The interventions clustered in the areas of social structure, safety climate, the interaction between social structure and participants, and the interaction between technology and participants. No interventions were found in the interactions between social structure and technology, goals and technology, or participants and goals. Despite the scientific merit criteria, many still had significant study design weaknesses. Five interventions tested for statistical significance, but none commented on the statistical power of the study. Empirical studies based on safety climate theory had the most rigorous designs; these studies attempted to randomize subjects to avoid bias. This work highlights the utility of the Updated Leavitt's Model, a model from organizational theory, as a framework for comparing safety interventions. It also highlights the need for better study design in future trials of safety interventions.

Relevance: 10.00%

Abstract:

At least 15 million American adults have participated in yoga at least once in their lifetime (Saper, Eisenberg, Davis, Culpepper, & Phillips, 2004). The field of yoga research is relatively new in the United States, and the majority of studies have concentrated on yoga's effect on measures of physiology (cardiovascular disease, diabetes, obesity) or psychological measures of stress and anxiety. This review identified studies measuring a different set of outcomes, specifically violence, trauma, eating, and other behavioral disorders. In 9 of the 10 studies reviewed, researchers observed statistically significant effects of yoga interventions. Effects were most evident in multifaceted studies that combined intensive yoga practice with group discussion and training. Only one group (Mitchell, Mazzeo, Rausch, & Cooke, 2007) failed to observe any significant differences between yoga practice groups and control groups. Effects were seen in both sexes, although a majority of the studies were aimed specifically at women. All studies were limited by small sample size and lack of follow-up data. Future research should seek to increase sample size, to diversify recruitment so that treatment and control groups can be randomized, and to allow for long-term follow-up.

Relevance: 10.00%

Abstract:

Background. There is currently a push to increase the number of minorities in cancer clinical trials in an effort to reduce cancer health disparities. Overcoming barriers to clinical trial participation among minorities is necessary if we are to achieve the goals of Healthy People 2010. To understand the unexpectedly high rate of attrition in the A NULIFE study, the research team examined the perceived barriers to participation among minority women. The purpose of this study was to determine whether personal or study-related factors influenced healthy premenopausal women aged 25-45 years to terminate their participation in the A NULIFE study. We hypothesized that personal factors were the driving forces behind attrition in the prevention trial.

Methods. The target population consisted of eligible women who consented to the A NULIFE study but withdrew prior to being randomized (N = 46), as well as eligible women who completed the informed consent process and withdrew after randomization (N = 42). Attrition was examined at a time point when 10 of the 12 participant groups had completed the A NULIFE study; data from the 2 groups still actively engaged in study activities were not used in this analysis. A survey instrument was designed to query the personal and study-related factors believed to have contributed to the decision to terminate participation.

Results. The highest-ranked personal reason for withdrawal from the study was being "too busy" with other obligations; the second highest was work obligations. While more than half of all participants agreed that they were well informed about the study and considered the study personnel approachable, 54% of participants would have been inclined to remain in the study had it been located at a local community center.

Conclusions. Time commitment was likely a major factor in withdrawal from the A NULIFE study. Future investigators should implement trials within participant communities where possible. Focus group settings may also provide detailed insight into factors that contribute to the attrition of minorities in cancer clinical trials.

Relevance: 10.00%

Abstract:

Colorectal cancer (CRC) is the third leading cause of cancer death in the United States. While the disease burden is high, there are proven methods to screen for CRC and detect it at a stage that is amenable to cure. Patients with low health literacy have difficulty navigating the health care system and are at increased risk of not receiving preventive care services such as colorectal cancer screening (CRCS). To address this need, an exam-room-based video was developed to be played for patients in the privacy of the exam room while they wait to be seen by their medical provider. In roughly 2 minutes, the video informs the patient about CRC and CRCS and how to successfully complete CRCS. A key barrier to completing CRCS is the need to increase patients' knowledge and improve attitudes surrounding CRCS. This study examines the impact of the video on patients' knowledge and attitudes about CRC and CRCS in a medically underserved patient population in Houston, Texas.

Sixty-one patients presenting for routine medical care were enrolled in the study. Depending on their randomization, patients either received routine information about CRC and CRCS or watched the video. Patients who watched the video showed improved knowledge of, and attitudes toward, CRC and CRCS. Future studies will be needed to examine whether the video improves patients' completion of CRCS.

Relevance: 10.00%

Abstract:

Bisphosphonates have proven effectiveness in preventing skeletal-related events (SREs) in advanced breast cancer, prostate cancer and multiple myeloma. The purpose of this study was to assess the efficacy of bisphosphonates in preventing SREs, controlling pain, and increasing life expectancy in lung cancer patients with bone metastases.

We performed an electronic search in MEDLINE, EMBASE, Web of Science, and Cochrane Library databases up to April 4, 2010. Hand searching and searching in clinicaltrials.gov were also performed. Two independent reviewers selected all clinical trials that included lung cancer patients with bone metastases treated with bisphosphonates. We excluded articles involving cancers other than lung, patients without bone metastasis, and treatments other than bisphosphonates. Outcomes assessed were overall pain control, overall survival, and reduction in skeletal-related events (fracture, cord compression, radiation or surgery to the bone, hypercalcemia of malignancy). The quality of each study was evaluated using the Cochrane Back Review Group questionnaire to assess risk of bias (0 worst to 11 best). Data extraction and quality assessments were performed independently by two assessors. Meta-analyses were performed where more than one study with similar outcomes was found.

We identified eight trials that met our inclusion criteria. Three studies evaluated zoledronic acid, three pamidronate, three clodronate and two ibandronate. Two were placebo-controlled trials, two had multi-group comparisons (radiotherapy, radionucleotides, and chemotherapy), and two used a different bisphosphonate as an active control. Quality scores ranged from 1 to 4 out of 11, suggesting high risk of bias: studies failed to report adequate explanation of randomization procedures, concealment of randomization, and blinding. Meta-analysis showed that patients treated with zoledronic acid alone had lower rates of developing SREs compared to placebo at 21 months (RR = 0.80, 95% CI 0.66-0.97, p = 0.02). Meta-analyses also showed increased pain control when a bisphosphonate was added to an existing treatment modality such as chemotherapy or radiation (RR = 1.17, 95% CI 1.03-1.34, p = 0.02). However, pain control was not statistically significantly different among the various bisphosphonates when other treatment modalities were absent. Despite the improvements in SREs and pain control, bisphosphonates failed to show improvement in overall survival (difference in means = 109.1 days, 95% CI -51.5 to 269.7, p = 0.183).

Adding bisphosphonates to standard care improved pain control and reduced SREs. Bisphosphonates did not improve overall survival. Further, larger studies of higher quality are required to strengthen the evidence.

Keywords/MeSH terms: bisphosphonates/diphosphonates (generic, chemical and trade names).
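As a concrete illustration of the pooling step behind results such as the RR of 0.80 reported above, the following Python sketch computes a fixed-effect, inverse-variance-weighted pooled risk ratio from study-level 2x2 counts. The counts are hypothetical placeholders for illustration only; they are not data from the reviewed trials.

```python
import math

# Hypothetical per-study 2x2 counts: (events_treated, n_treated, events_control, n_control).
# These numbers are illustrative only, NOT the data from the reviewed trials.
studies = [
    (30, 200, 45, 210),
    (22, 150, 28, 140),
    (15, 120, 24, 125),
]

weights, log_rrs = [], []
for a, n1, c, n2 in studies:
    rr = (a / n1) / (c / n2)                          # risk ratio for one study
    se = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n2)   # standard error of log(RR)
    log_rrs.append(math.log(rr))
    weights.append(1 / se ** 2)                       # inverse-variance weight

pooled_log_rr = sum(w * lr for w, lr in zip(weights, log_rrs)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))
lo = math.exp(pooled_log_rr - 1.96 * pooled_se)
hi = math.exp(pooled_log_rr + 1.96 * pooled_se)
print(f"pooled RR = {math.exp(pooled_log_rr):.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```

A random-effects model would additionally estimate between-study heterogeneity; the fixed-effect version above is the simplest form of the calculation.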

Relevance: 10.00%

Abstract:

Early-phase clinical trial designs have long been a focus of interest for clinicians and statisticians working in the oncology field. There are several standard phase I and phase II designs that have been widely implemented in medical practice. For phase I, the most commonly used methods are 3+3 and CRM; a newly developed Bayesian model-based mTPI design is now used by an increasing number of hospitals and pharmaceutical companies. The advantages and disadvantages of these three leading phase I designs are discussed in this work, and their performance is compared using simulated data. The mTPI design exhibited superior performance in most scenarios in comparison with the 3+3 and CRM designs.

The next major part of this work proposes an innovative seamless phase I/II design that allows clinicians to conduct phase I and phase II clinical trials simultaneously. A Bayesian framework is implemented throughout the design. The phase I portion adopts the mTPI method, with the addition of a futility rule that monitors the efficacy of the tested drugs. Dose graduation rules are proposed to allow doses to move forward from the phase I portion of the study to the phase II portion without interrupting the ongoing phase I dose-finding schema. Once a dose graduates to phase II, adaptive randomization is used to allocate patients among treatment arms, so that more patients are assigned to the more promising dose(s). Simulations were again performed to compare this innovative phase I/II design with a recently published phase I/II design, together with the conventional phase I and phase II designs. The simulation results indicate that the seamless phase I/II design outperforms the two competing methods in most scenarios, with superior trial power and a smaller required sample size; it also significantly reduces the overall study time.

Like other early-phase clinical trial designs, the proposed seamless phase I/II design requires that the efficacy and safety outcomes be observable within a short time frame. This limitation can be overcome by using validated surrogate markers for the efficacy and safety endpoints.
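For readers unfamiliar with the rule-based baseline against which CRM and mTPI are compared, here is a minimal Python sketch of one common formulation of the 3+3 escalation rule; exact stopping conventions vary between protocols, so treat this as illustrative rather than definitive.

```python
def three_plus_three(n_treated: int, n_dlt: int) -> str:
    """Decision after observing n_dlt dose-limiting toxicities (DLTs) among
    n_treated patients at the current dose (one common 3+3 formulation)."""
    if n_treated == 3:
        if n_dlt == 0:
            return "escalate to the next dose"
        if n_dlt == 1:
            return "expand cohort to 6 at the same dose"
        return "de-escalate: dose exceeds the MTD"
    if n_treated == 6:
        if n_dlt <= 1:
            return "escalate to the next dose"
        return "de-escalate: dose exceeds the MTD"
    raise ValueError("3+3 cohorts contain 3 or 6 patients")

for n, d in [(3, 0), (3, 1), (6, 1), (6, 2)]:
    print(f"{d}/{n} DLTs -> {three_plus_three(n, d)}")
```

Unlike CRM or mTPI, this rule uses no model of the dose-toxicity curve, which is one reason the model-based designs tend to perform better in simulation.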

Relevance: 10.00%

Abstract:

The development of targeted therapy involves many challenges. Our study addresses some of the key issues involved in biomarker identification and clinical trial design. We propose two biomarker selection methods and then apply them in two different clinical trial designs for targeted therapy development. In particular, in Chapter 2 we propose a Bayesian two-step lasso procedure for biomarker selection in the proportional hazards model. In the first step of this strategy, we use the Bayesian group lasso to identify the important marker groups, wherein each group contains the main effect of a single marker and its interactions with treatments. In the second step, we zoom in to select each individual marker and the interactions between markers and treatments, using the Bayesian adaptive lasso to identify prognostic or predictive markers. In Chapter 3, we propose a Bayesian two-stage adaptive design for targeted therapy development that implements the variable selection method of Chapter 2. In Chapter 4, we propose an alternative frequentist adaptive randomization strategy for situations where a large number of biomarkers must be incorporated into the study design. We also propose a new adaptive randomization rule that takes into account the variation associated with the point estimates of survival times. In all of our designs, we seek to identify the key markers that are either prognostic or predictive with respect to treatment. We use extensive simulations to evaluate the operating characteristics of our methods.
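The adaptive randomization idea underlying Chapters 3 and 4 can be illustrated with a Thompson-sampling-style Beta-Binomial sketch in Python: each new patient is more likely to be allocated to the arm whose posterior currently looks better. This is a generic sketch of response-adaptive randomization, not the dissertation's specific rule (which also accounts for the variance of survival-time estimates); the arm names and response rates are made up.

```python
import random

# Beta(1, 1) priors on each arm's response probability (hypothetical arms).
arms = {"A": {"resp": 0, "fail": 0}, "B": {"resp": 0, "fail": 0}}
true_rates = {"A": 0.25, "B": 0.45}   # hypothetical true response rates

random.seed(1)
allocation = {"A": 0, "B": 0}
for _ in range(200):
    # Sample a response rate for each arm from its Beta posterior, then
    # assign the next patient to the arm whose sampled rate is highest.
    draws = {a: random.betavariate(1 + s["resp"], 1 + s["fail"])
             for a, s in arms.items()}
    chosen = max(draws, key=draws.get)
    allocation[chosen] += 1
    if random.random() < true_rates[chosen]:   # simulate the patient's response
        arms[chosen]["resp"] += 1
    else:
        arms[chosen]["fail"] += 1

print(allocation)   # most patients should end up on the better arm, B
```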

Relevance: 10.00%

Abstract:

This thesis project is motivated by the problem of using observational data to draw inferences about a causal relationship in observational epidemiology research when controlled randomization is not applicable. The instrumental variable (IV) method is one statistical tool for overcoming this problem; a Mendelian randomization study uses genetic variants as IVs in genetic association studies. In this thesis, the IV method, along with standard logistic and linear regression models, is used to investigate the causal association between risk of pancreatic cancer and circulating levels of soluble receptor for advanced glycation end-products (sRAGE). Higher levels of serum sRAGE were found to be associated with a lower risk of pancreatic cancer in a previous observational study (255 cases and 485 controls); however, such a novel association may be biased by unknown confounding factors. In a case-control study, we aimed to use the IV approach to confirm or refute this observation in the subset of study subjects for whom genotyping data were available (178 cases and 177 controls). A two-stage IV analysis using generalized method of moments-structural mean models (GMM-SMM) was conducted and the relative risk (RR) was calculated. In the first-stage analysis, we found that the single nucleotide polymorphism (SNP) rs2070600 of the receptor for advanced glycation end-products (AGER) gene met all three general assumptions for a genetic IV in examining the causal association between sRAGE and risk of pancreatic cancer: the variant allele was associated with lower levels of sRAGE, and it was associated neither with risk of pancreatic cancer nor with the confounding factors. It was a potentially strong IV (F statistic = 29.2). However, in the second-stage analysis, the GMM-SMM model failed to converge, probably due to non-concavity arising from the small sample size, so the IV analysis could not support the causality of the association between serum sRAGE levels and risk of pancreatic cancer. Nevertheless, these analyses suggest that rs2070600 is a potentially good genetic IV for testing the causality between risk of pancreatic cancer and sRAGE levels. A larger sample size is required to conduct a credible IV analysis.
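To make the two-stage IV logic concrete, the following Python sketch runs a linear two-stage least-squares analysis on simulated data, including the first-stage F statistic used to judge instrument strength. The thesis itself used GMM-SMM for a binary outcome; the linear 2SLS below is only a simplified stand-in under that caveat, and all variable names and numbers are simulated assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Simulated data (illustrative only): g is the genotype (the instrument),
# u an unobserved confounder, x the exposure (sRAGE-like), y the outcome.
g = rng.binomial(2, 0.3, n)                       # SNP coded 0/1/2
u = rng.normal(size=n)                            # unmeasured confounder
x = 0.5 * g - 0.8 * u + rng.normal(size=n)        # instrument shifts exposure
y = -0.4 * x + 1.2 * u + rng.normal(size=n)       # true causal effect: -0.4

# Naive OLS of y on x is biased because u affects both x and y.
b_ols = np.linalg.lstsq(np.column_stack([np.ones(n), x]), y, rcond=None)[0]
print(f"naive OLS estimate = {b_ols[1]:.3f}")     # far from -0.4

# Stage 1: regress the exposure on the instrument; check instrument strength.
G = np.column_stack([np.ones(n), g])
b1 = np.linalg.lstsq(G, x, rcond=None)[0]
x_hat = G @ b1
ss_res = np.sum((x - x_hat) ** 2)
ss_reg = np.sum((x_hat - x.mean()) ** 2)
f_stat = ss_reg / (ss_res / (n - 2))
print(f"first-stage F = {f_stat:.1f}")            # > 10 is the usual strength rule

# Stage 2: regress the outcome on the fitted exposure from stage 1.
X2 = np.column_stack([np.ones(n), x_hat])
b2 = np.linalg.lstsq(X2, y, rcond=None)[0]
print(f"IV estimate of causal effect = {b2[1]:.3f}")   # close to -0.4
```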

Relevance: 10.00%

Abstract:

We show a method for parallelizing top-down dynamic programs in a straightforward way through a careful choice of a lock-free shared hash table implementation and randomization of the order in which the dynamic program computes its subproblems. This generic approach is applied to dynamic programs for knapsack, shortest paths, and RNA structure alignment, as well as to a state-of-the-art solution for minimizing the maximum number of open stacks. Experimental results are provided on three different modern multicore architectures, showing that this parallelization is effective and reasonably scalable. In particular, we obtain over 10 times speedup for 32 threads on the open stacks problem.
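A minimal Python sketch of the idea follows: a top-down knapsack recursion whose memo table is a dictionary shared by several threads, with the branch order randomized so threads tend to fill different subproblems. CPython's GIL makes dict access safe here but prevents real speedup, so this toy shows only the structure; the paper's implementation relies on a genuinely lock-free hash table on real multicore hardware, and the item values below are made up.

```python
import random
import threading

# Toy 0/1 knapsack solved top-down. The shared dict stands in for the paper's
# lock-free hash table; dict operations are atomic under CPython's GIL.
values = [6, 10, 12, 7, 3, 9]
weights = [1, 2, 3, 2, 1, 4]
memo = {}

def solve(i, cap):
    if i == len(values) or cap == 0:
        return 0
    key = (i, cap)
    hit = memo.get(key)
    if hit is not None:             # another thread may already have solved this
        return hit
    branches = [lambda: solve(i + 1, cap)]                   # skip item i
    if weights[i] <= cap:
        branches.append(lambda: values[i] + solve(i + 1, cap - weights[i]))
    random.shuffle(branches)        # randomized subproblem order per thread
    best = max(b() for b in branches)
    memo[key] = best                # racy but benign: all writers store the same value
    return best

threads = [threading.Thread(target=solve, args=(0, 7)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print("optimum:", solve(0, 7), "memo entries:", len(memo))
```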

Relevance: 10.00%

Abstract:

The verification and validation activity plays a fundamental role in improving software quality. Determining which techniques are most effective for carrying out this activity has been an aspiration of experimental software engineering researchers for years. This paper reports a controlled experiment evaluating the effectiveness of two unit testing techniques: the functional testing technique known as equivalence partitioning (EP) and the control-flow structural testing technique known as branch testing (BT). The experiment is a literal replication of Juristo et al. (2013). Both experiments serve the purpose of determining whether the effectiveness of BT and EP varies depending on whether or not the faults are visible to the technique (InScope or OutScope, respectively). We used the materials, design and procedures of the original experiment, but in order to adapt the experiment to the context we: (1) reduced the number of studied techniques from 3 to 2; (2) assigned subjects to experimental groups by means of stratified randomization to balance the influence of programming experience; (3) localized the experimental materials; and (4) adapted the training duration. We ran the replication at the Escuela Politécnica del Ejército Sede Latacunga (ESPEL) as part of a software verification & validation course. The experimental subjects were 23 master's degree students. EP is more effective than BT at detecting InScope faults; the session/program and group variables are found to have significant effects. BT is more effective than EP at detecting OutScope faults; the session/program and group variables have no effect in this case. The results of the replication and the original experiment are similar with respect to testing techniques. There are some inconsistencies with respect to the group factor, which can be explained by small-sample effects. The results for the session/program factor are inconsistent for InScope faults. We believe that these differences are due to a combination of the fatigue effect and a technique × program interaction. Although we were able to reproduce the main effects, the changes to the design of the original experiment make it impossible to identify the causes of the discrepancies for sure. We believe that further replications closely resembling the original experiment should be conducted to improve our understanding of the phenomena under study.
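The stratified randomization of step (2) can be sketched in a few lines of Python: subjects are shuffled within experience strata and then dealt alternately to the two groups, so experience levels stay balanced across groups. Subject IDs, strata sizes, and group labels below are hypothetical, not taken from the replication.

```python
import random

# 23 hypothetical subjects, each tagged with a programming-experience stratum.
subjects = [(f"s{i:02d}", exp) for i, exp in
            enumerate(["low"] * 8 + ["medium"] * 8 + ["high"] * 7)]

random.seed(42)
groups = {"EP-first": [], "BT-first": []}

# Shuffle within each stratum, then alternate assignments between the groups
# so that each experience level is split as evenly as possible.
for stratum in ("low", "medium", "high"):
    block = [sid for sid, exp in subjects if exp == stratum]
    random.shuffle(block)
    for k, sid in enumerate(block):
        target = "EP-first" if k % 2 == 0 else "BT-first"
        groups[target].append(sid)

for name, members in groups.items():
    print(name, len(members), members)
```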

Relevance: 10.00%

Abstract:

Among structural materials, magnesium and its alloys are the focus of intense research aimed at understanding the relationship between their microstructure and their mechanical behavior, with the goal of optimizing current Mg alloys through their microstructure and guiding the design of new alloys. However, the particular effect of several microstructural factors (precipitate shape, size and orientation, grain morphology distribution, etc.) on the mechanical performance of an Mg alloy is still under study. Improving the understanding of the relation between microstructure and mechanical behavior requires combining advanced experimental characterization techniques with numerical simulation at several length scales. Among the simulation techniques, polycrystalline homogenization is a very useful tool to predict the macroscopic response from the polycrystalline microstructure (grain size, shape and orientation distributions) and the single-crystal behavior. The microstructure description is fully covered by modern characterization techniques (X-ray diffraction, electron backscatter diffraction, optical and electron microscopy). The mechanical behavior of single crystals, however, is not well known, especially in Mg alloys, where correctly parameterizing the mechanical behavior of the different slip and twinning modes is a very difficult task.

A computational homogenization framework for predicting the behavior of magnesium alloys has been developed in this thesis. The polycrystalline behavior was obtained by means of the finite element simulation of a representative volume element (RVE) of the microstructure, including the actual grain shape and orientation distributions. The crystal behavior of the grains was described by a crystal plasticity model that takes into account the physical deformation mechanisms, i.e. slip and twinning. Finally, the parameterization of the crystal behavior (critical resolved shear stresses (CRSSs) and strain hardening rates of all the slip and twinning modes) was obtained through an inverse optimization methodology, one of the main original contributions of this thesis. The inverse methodology aims at finding, by means of the Levenberg-Marquardt optimization algorithm, the set of parameters defining the crystal behavior that best fits a set of independent macroscopic tests. Both the objectivity of the method and the uniqueness of the solution as a function of the input information have been studied numerically.

The inverse optimization strategy was first used to obtain the crystal behavior of a rolled polycrystalline AZ31 Mg alloy, which showed a marked basal texture and a strong plastic anisotropy. Four deformation mechanisms were included to characterize the single-crystal behavior: basal, prismatic and pyramidal ⟨c+a⟩ slip, together with tensile twinning. The validity of the resulting parameters was proved by the ability of the polycrystalline model to predict independent macroscopic tests in different directions. Secondly, the influence of neodymium (Nd) content on an extruded polycrystalline Mg-Mn-Nd alloy was studied using the same homogenization and optimization framework. The effect of Nd addition was a progressive isotropization of the macroscopic behavior. The model showed that this increase in macroscopic isotropy was due both to a randomization of the initial texture and to an increase in the isotropy of the crystal behavior, with similar values of the CRSSs of the different deformation modes. Finally, the model was used to analyze the effect of temperature on the crystal behavior of the Mg-Mn-Nd alloy. Introducing non-Schmid effects on the pyramidal ⟨c+a⟩ slip mode into the model made it possible to capture the inverse strength differential between tension and compression that appeared above 150 °C. This is the first time, to the author's knowledge, that non-Schmid effects have been reported for Mg alloys.


Relevance: 10.00%

Abstract:

The aim of the study was to investigate the effects of a standardized commercial blend of phytogenic feed additives containing 5% carvacrol, 3% cinnamaldehyde, and 2% capsicum on utilization of dietary energy and performance in broiler chickens. Four experimental diets were offered to the birds from 7 to 21 d of age: 2 basal control diets based on either wheat or maize, containing 215 g CP/kg and 12.13 MJ/kg ME, and the same 2 basal diets supplemented with the plant extract combination at 100 mg/kg diet. Each diet was fed to 16 individually penned birds following randomization. Dietary plant extracts improved feed intake and weight gain (P < 0.05) and slightly (P < 0.1) improved feed efficiency of birds fed the maize-based diet. Supplementary plant extracts did not change dietary ME (P > 0.05) but improved (P < 0.05) dietary NE by reducing the heat increment (P < 0.05) per kilogram of feed intake. Feeding phytogenics improved (P < 0.05) total carcass energy retention and the efficiency of dietary ME for carcass energy retention. The number of interactions between the type of diet and the supplementary phytogenic feed additive suggests that the chemical composition and the energy-to-protein ratio of the diet may influence the efficiency of phytogenics when fed to chickens. The experiment showed that although supplementary phytogenic additives did not affect dietary ME, they caused a significant improvement in the utilization of dietary energy for carcass energy retention, but this did not always translate into growth performance.

Relevance: 10.00%

Abstract:

This project concerns the Java-language implementation of the MIDI humanization algorithm developed by Jorge Grundman in his thesis La Humanización de la Interpretación Virtual: Tres ejemplos significativos de la obra de Chopin. The algorithm, called Zig-Zag, aims to make a score performed by a computer exhibit characteristics similar to a pianist's sight reading of the same score. To this end, it randomizes the tempo according to a set of parameters, modifies the dynamics in step with the tempo changes, and applies a second randomization to each figure in the score. The algorithm has a broad field of application as a complement to the various sequencers and score editors that exist today, providing them with new features. It was implemented using the Java SDK (Standard Development Kit) 7 and the tools this platform provides for handling and modifying MIDI messages.
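A rough Python sketch of this kind of two-level humanization is shown below, operating on plain (onset, duration, pitch, velocity) tuples rather than real MIDI messages. The jitter ranges and the tempo-to-dynamics coupling are invented for illustration; they are not Grundman's actual Zig-Zag parameters.

```python
import random

# Each note: (onset_beats, duration_beats, pitch, velocity). Values are made up.
score = [(0.0, 0.5, 60, 80), (0.5, 0.5, 62, 80),
         (1.0, 1.0, 64, 80), (2.0, 1.0, 65, 80)]

def humanize(notes, base_tempo=120.0, tempo_jitter=0.04, vel_ratio=0.6, seed=7):
    """Randomize the local tempo around base_tempo, scale each note's velocity
    with the tempo deviation, then apply a second per-note randomization."""
    rng = random.Random(seed)
    out, clock = [], 0.0
    for onset, dur, pitch, vel in notes:
        # First randomization: a local tempo drawn around the base tempo.
        tempo = base_tempo * (1.0 + rng.uniform(-tempo_jitter, tempo_jitter))
        sec_per_beat = 60.0 / tempo
        # Dynamics follow the tempo deviation (here: faster -> slightly louder).
        v = vel * (1.0 + vel_ratio * (tempo / base_tempo - 1.0))
        # Second randomization: small independent jitter for each figure.
        v *= 1.0 + rng.uniform(-0.05, 0.05)
        out.append((clock, dur * sec_per_beat, pitch,
                    max(1, min(127, round(v)))))
        clock += dur * sec_per_beat
    return out

for note in humanize(score):
    print(note)
```

Mapping the resulting onset times, durations, and velocities onto real MIDI events is the part the Java implementation handles through the platform's MIDI tools.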