20 results for Test Design
in CentAUR: Central Archive University of Reading - UK
Abstract:
This study investigates the effects of a short-term pedagogic intervention on the development of L2 fluency among learners studying English for Academic Purposes (EAP) at a university in the UK. It also examines the interaction between the development of fluency and that of complexity and accuracy. Using a pre-test/post-test design, data were collected over a period of four weeks from learners performing monologic tasks. While the Control Group (CG) focused on developing general speaking and listening skills, the Experimental Group (EG) received awareness-raising activities and fluency strategy training in addition to general speaking and listening practice (i.e. following the syllabus). The data, coded in terms of a range of measures of fluency, accuracy and complexity, were subjected to repeated-measures MANOVA, t-tests and correlation analyses. The results indicate that after the intervention, while some fluency gains were achieved by the CG, the EG produced significantly more fluent language, demonstrating faster speech and articulation rates, longer runs and higher phonation-time ratios. The significant correlations obtained between measures of accuracy and learners' pauses in the CG suggest that pausing opportunities may have been linked to accuracy. The findings have significant implications for L2 pedagogy, highlighting the positive impact of instruction on the development of fluency.
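For readers unfamiliar with the fluency measures named above, the sketch below shows how speech rate, articulation rate, phonation-time ratio and mean length of run are conventionally computed from a pause-annotated sample. It is a minimal illustration under assumed conventions (a 0.25 s silent-pause threshold and a simple (duration, is_pause) segment format), not the study's exact operationalisation.

```python
# Utterance fluency measures from a pause-annotated speech sample.
# Assumed input: a syllable count plus (duration_s, is_pause) segments;
# the 0.25 s pause threshold is a common convention, not necessarily
# the one used in the study.
def fluency_measures(n_syllables, segments, pause_threshold=0.25):
    total_time = sum(d for d, _ in segments)                      # seconds
    pause_time = sum(d for d, p in segments if p and d >= pause_threshold)
    n_pauses = sum(1 for d, p in segments if p and d >= pause_threshold)
    phonation_time = total_time - pause_time
    return {
        "speech_rate": 60 * n_syllables / total_time,             # syll/min
        "articulation_rate": 60 * n_syllables / phonation_time,   # syll/min
        "phonation_time_ratio": phonation_time / total_time,
        "mean_length_of_run": n_syllables / (n_pauses + 1),       # syll/run
    }

# Example: three speech runs separated by two silent pauses
sample = [(1.8, False), (0.4, True), (2.6, False), (0.9, True), (1.2, False)]
print(fluency_measures(n_syllables=28, segments=sample))
```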
Abstract:
The conventional method for assessing acute oral toxicity (OECD Test Guideline 401) was designed to identify the median lethal dose (LD50), using the death of animals as an endpoint. Introduced as an alternative method (OECD Test Guideline 420), the Fixed Dose Procedure (FDP) relies on the observation of clear signs of toxicity, uses fewer animals and causes less suffering. More recently, the Acute Toxic Class method and the Up-and-Down Procedure have also been adopted as OECD test guidelines. Both of these methods also use fewer animals than the conventional method, although they still use death as an endpoint. Each of the three new methods incorporates a sequential dosing procedure, which results in increased efficiency. In 1999, with a view to replacing OECD Test Guideline 401, the OECD requested that the three new test guidelines be updated. This was to bring them in line with the regulatory needs of all OECD Member Countries, provide further reductions in the number of animals used, and introduce refinements to reduce the pain and distress experienced by the animals. This paper describes a statistical modelling approach for the evaluation of acute oral toxicity tests, by using the revised FDP for illustration. Opportunities for further design improvements are discussed.
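To illustrate the kind of statistical modelling described, the following minimal sketch assumes a probit dose-response curve for the probability of evident toxicity and simulates a simplified FDP-style sequential procedure over the standard fixed doses; the dose levels, probit parameters and stopping rule are illustrative assumptions, not the paper's actual model.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Illustrative assumptions: a probit dose-response curve for "evident
# toxicity" and the FDP's fixed dose levels in mg/kg.
DOSES = (5.0, 50.0, 300.0, 2000.0)

def p_toxicity(dose, mu=2.0, sigma=0.5):
    """P(evident toxicity) = Phi((log10(dose) - mu) / sigma); with
    mu = 2.0, half of the animals show toxicity at 100 mg/kg."""
    return norm.cdf((np.log10(dose) - mu) / sigma)

def fixed_dose_run(start_idx=1, n_per_dose=5):
    """Simplified sequential procedure: test groups at successive fixed
    doses, stepping up until evident toxicity is observed, and classify
    the substance at that dose (deaths are not modelled in this sketch)."""
    idx = start_idx
    while True:
        toxic = rng.random(n_per_dose) < p_toxicity(DOSES[idx])
        if toxic.any():
            return DOSES[idx]
        if idx == len(DOSES) - 1:
            return None          # no evident toxicity at the limit dose
        idx += 1

# Operating characteristics by simulation
runs = [fixed_dose_run() for _ in range(10_000)]
for d in DOSES:
    print(f"classified at {d:7.0f} mg/kg: {runs.count(d) / len(runs):.3f}")
print(f"unclassified: {runs.count(None) / len(runs):.3f}")
```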
Abstract:
Pharmacogenetic trials investigate the effect of genotype on treatment response. When there are two or more treatment groups and two or more genetic groups, investigation of gene-treatment interactions is of key interest. However, calculation of the power to detect such interactions is complicated, because this depends not only on the treatment effect size within each genetic group, but also on the number of genetic groups, the size of each genetic group, and the type of genetic effect that is both present and tested for. The scale chosen to measure the magnitude of an interaction can also be problematic, especially in the binary case. Elston et al. proposed a test for detecting the presence of gene-treatment interactions for binary responses, and gave appropriate power calculations. This paper shows how the same approach can also be used for normally distributed responses. We also propose a method for analysing, and performing sample size calculations for, such trials based on a generalized linear model (GLM) approach. The power of the Elston et al. and GLM approaches is compared for the binary and normal cases using several illustrative examples. While more sensitive to errors in model specification than the Elston et al. approach, the GLM approach is much more flexible and in many cases more powerful. Copyright © 2005 John Wiley & Sons, Ltd.
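As a rough illustration of a GLM-based power calculation for a gene-treatment interaction, the sketch below estimates power by simulation for a logistic model with a binary response; the effect sizes, carrier frequency and sample size are made-up values, and the Wald test used here is only one of several possible test choices.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)

def interaction_power(n=400, p_gene=0.3, beta=(-0.5, 0.8, 0.0, 0.9),
                      alpha=0.05, n_sim=500):
    """Monte Carlo power of the Wald test for the gene-treatment
    interaction in a logistic GLM with a binary response.
    beta = (intercept, treatment, gene, interaction) on the logit scale."""
    hits = 0
    for _ in range(n_sim):
        treat = rng.integers(0, 2, n)                  # 1:1 randomisation
        gene = (rng.random(n) < p_gene).astype(float)  # carrier indicator
        X = np.column_stack([np.ones(n), treat, gene, treat * gene])
        p = 1.0 / (1.0 + np.exp(-X @ np.asarray(beta)))
        y = (rng.random(n) < p).astype(float)
        fit = sm.GLM(y, X, family=sm.families.Binomial()).fit()
        hits += fit.pvalues[3] < alpha                 # interaction term
    return hits / n_sim

print(f"estimated power for the interaction test: {interaction_power():.2f}")
```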
Abstract:
There is increasing interest in combining Phases II and III of clinical development into a single trial in which one of a small number of competing experimental treatments is ultimately selected and where a valid comparison is made between this treatment and the control treatment. Such a trial usually proceeds in stages, with the least promising experimental treatments dropped as soon as possible. In this paper we present a highly flexible design that uses adaptive group sequential methodology to monitor an order statistic. By using this approach, it is possible to design a trial which can have any number of stages, begins with any number of experimental treatments, and permits any number of these to continue at any stage. The test statistic used is based upon efficient scores, so the method can be easily applied to binary, ordinal, failure time, or normally distributed outcomes. The method is illustrated with an example, and simulations are conducted to investigate its type I error rate and power under a range of scenarios.
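The select-then-confirm structure can be conveyed by a toy simulation. The sketch below is not the paper's efficient-score, order-statistic procedure: it uses plain normal outcomes and a single interim selection, and demonstrates why unadjusted final testing after selection inflates the type I error, which is what the adaptive group sequential boundaries are there to control.

```python
import numpy as np

rng = np.random.default_rng(2)

def two_stage_trial(effects=(0.0, 0.1, 0.3), n_stage=50):
    """Toy two-stage select-then-confirm design with normal outcomes:
    stage 1 compares every experimental arm with control and keeps only
    the best; stage 2 adds data on that arm and control, and the final
    z-statistic pools both stages."""
    c1 = rng.normal(0.0, 1.0, n_stage)
    a1 = [rng.normal(e, 1.0, n_stage) for e in effects]
    z1 = [(a.mean() - c1.mean()) * np.sqrt(n_stage / 2) for a in a1]
    best = int(np.argmax(z1))                        # drop the rest
    c2 = rng.normal(0.0, 1.0, n_stage)
    a2 = rng.normal(effects[best], 1.0, n_stage)
    diff = (np.concatenate([a1[best], a2]).mean()
            - np.concatenate([c1, c2]).mean())
    return best, diff * np.sqrt(n_stage)             # pooled z, 2*n_stage per arm

# Selecting the best arm and then testing it naively at z > 1.96 inflates
# the type I error -- hence the need for adjusted sequential boundaries.
null_z = [two_stage_trial(effects=(0.0, 0.0, 0.0))[1] for _ in range(20_000)]
print(f"type I error without adjustment: {np.mean(np.array(null_z) > 1.96):.3f}")
```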
Abstract:
While planning the GAIN International Study of gavestinel in acute stroke, a sequential triangular test was proposed but not implemented. Before the trial commenced it was agreed to evaluate the sequential design retrospectively, to assess the differences in the resulting analyses, trial durations and sample sizes and thereby gauge the potential of sequential procedures for future stroke trials. This paper presents four sequential reconstructions of the GAIN study made under various scenarios. For the data as observed, the sequential design would have reduced the trial sample size by 234 patients and shortened its duration by 3 or 4 months. Had the study not achieved a recruitment rate that far exceeded expectation, the advantages of the sequential design would have been much greater. Sequential designs appear to be an attractive option for trials in stroke. Copyright 2004 S. Karger AG, Basel
Abstract:
This paper summarizes the design, manufacturing, testing, and finite element analysis (FEA) of glass-fibre-reinforced polyester leaf springs for rail freight vehicles. FEA predictions of load-deflection curves under static loading are presented, together with comparisons with test results. Bending stress distribution at typical load conditions is plotted for the springs. The springs have been mounted on a real wagon and drop tests at tare and full load have been carried out on a purpose-built shaker rig. The transient response of the springs from tests and FEA is presented and discussed.
Abstract:
This paper presents a novel actuator design that ameliorates or eliminates the effects of the non-linearities that are characteristically present in geared actuator systems and which are very problematic for low-velocity applications. The design centres on providing an internal rotational element within a single actuator to ensure that the actuator operates away from the stiction region whilst allowing a zero-velocity external output. The construction has the added advantage of substantially reducing backlash. The prototype comprises two commercially available servo-actuators used to test the principle of operation, and the results presented indicate that the concept is worth exploring further.
Abstract:
A novel sparse kernel density estimator is derived based on a regression approach, which selects a very small subset of significant kernels by means of the D-optimality experimental design criterion using an orthogonal forward selection procedure. The weights of the resulting sparse kernel model are calculated using the multiplicative nonnegative quadratic programming algorithm. The proposed method is computationally attractive, in comparison with many existing kernel density estimation algorithms. Our numerical results also show that the proposed method compares favourably with other existing methods, in terms of both test accuracy and model sparsity, for constructing kernel density estimates.
Abstract:
This paper derives an efficient algorithm for constructing sparse kernel density (SKD) estimates. The algorithm first selects a very small subset of significant kernels using an orthogonal forward regression (OFR) procedure based on the D-optimality experimental design criterion. The weights of the resulting sparse kernel model are then calculated using a modified multiplicative nonnegative quadratic programming algorithm. Unlike most SKD estimators, the proposed D-optimality regression approach is an unsupervised construction algorithm and does not require an empirical desired response for the kernel selection task. The strength of the D-optimality OFR lies in the fact that the algorithm automatically selects a small subset of the most significant kernels, related to the largest eigenvalues of the kernel design matrix, which accounts for most of the energy of the kernel training data; this also guarantees the most accurate kernel weight estimate. The proposed method is also computationally attractive in comparison with many existing SKD construction algorithms. Extensive numerical investigation demonstrates the ability of this regression-based approach to efficiently construct a very sparse kernel density estimate with excellent test accuracy, and our results show that the proposed method compares favourably with other existing sparse methods, in terms of test accuracy, model sparsity and complexity, for constructing kernel density estimates.
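A rough sketch of this two-stage construction, under stated assumptions (Gaussian kernels, a greedy log-determinant implementation of the D-optimality criterion, and simple multiplicative updates fitted against the full Parzen estimate) rather than the authors' exact algorithm:

```python
import numpy as np

def gaussian_kernel_matrix(x, centres, sigma):
    """Phi[i, j] = Gaussian kernel of width sigma between x_i and centre j."""
    d = x[:, None] - centres[None, :]
    return np.exp(-0.5 * (d / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def d_optimal_forward_selection(Phi, n_select):
    """Greedy D-optimality: repeatedly pick the candidate column with the
    largest squared norm after orthogonalising against the chosen columns
    (this maximises the growth of log det of the selected Gram matrix)."""
    R = Phi.copy()
    chosen = []
    for _ in range(n_select):
        j = int(np.argmax((R ** 2).sum(axis=0)))
        chosen.append(j)
        q = R[:, j] / np.linalg.norm(R[:, j])
        R -= np.outer(q, q @ R)            # deflate remaining candidates
    return chosen

def mnqp_weights(Phi_S, target, n_iter=500):
    """Multiplicative nonnegative QP sketch: fit w >= 0, sum(w) = 1, so
    that Phi_S @ w approximates the target density values (all positive)."""
    B = Phi_S.T @ Phi_S
    b = Phi_S.T @ target
    w = np.full(Phi_S.shape[1], 1.0 / Phi_S.shape[1])
    for _ in range(n_iter):
        w *= b / (B @ w + 1e-12)           # multiplicative update
        w /= w.sum()                       # keep weights on the simplex
    return w

# Demo on a bimodal 1-D sample (sigma and subset size are ad hoc choices)
rng = np.random.default_rng(3)
x = np.concatenate([rng.normal(-2, 0.5, 200), rng.normal(1, 1.0, 200)])
Phi = gaussian_kernel_matrix(x, x, sigma=0.4)
parzen = Phi.mean(axis=1)                  # full Parzen estimate at the data
idx = d_optimal_forward_selection(Phi, n_select=12)
w = mnqp_weights(Phi[:, idx], parzen)
print(f"kept {len(idx)} of {len(x)} kernels; weights sum to {w.sum():.3f}")
```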
Abstract:
The Improved Stratospheric and Mesospheric Sounder (ISAMS) is designed to measure the Earth's middle atmosphere in the range of 4.6 to 16.6 microns. This paper considers all the coated optical elements in two radiometric test channels. (Analysis of the spectral response will be presented in a separate paper at this symposium; see Sheppard et al.) Comparisons between the computed spectral performance and measurements from actual coatings will be discussed; these will include substrate absorption simulations. The results of environmental testing (durability and stability) are included, together with details of coating deposition and monitoring conditions.
Abstract:
A role for sequential test procedures is emerging in genetic and epidemiological studies using banked biological resources. This stems from the methodology's potential for improved use of information relative to comparable fixed sample designs. Studies in which cost, time and ethics feature prominently are particularly suited to a sequential approach. In this paper sequential procedures for matched case–control studies with binary data will be investigated and assessed. Design issues such as sample size evaluation and error rates are identified and addressed. The methodology is illustrated and evaluated using both real and simulated data sets.
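For intuition, a sequential test for 1:1 matched binary data can be based on the discordant pairs, as in a sequential analogue of McNemar's test. The sketch below runs a plain Wald SPRT on the probability that the case is the exposed member of a discordant pair; the hypothesised probabilities, error rates and stopping rule are illustrative, not the procedures investigated in the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

def sprt_discordant_pairs(p0=0.5, p1=0.7, alpha=0.05, beta=0.1,
                          p_true=0.7, max_pairs=2000):
    """Wald SPRT on discordant matched pairs. Each observation is 1 if the
    case is the exposed member of the pair, 0 otherwise. H0: p = p0 (no
    association) vs H1: p = p1. Stops at the first boundary crossing."""
    upper = np.log((1 - beta) / alpha)     # cross above: reject H0
    lower = np.log(beta / (1 - alpha))     # cross below: accept H0
    llr = 0.0
    for n in range(1, max_pairs + 1):
        x = rng.random() < p_true          # simulate one discordant pair
        llr += np.log(p1 / p0) if x else np.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "reject H0", n
        if llr <= lower:
            return "accept H0", n
    return "no decision", max_pairs

decision, n_pairs = sprt_discordant_pairs()
print(f"{decision} after {n_pairs} discordant pairs")
```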
Abstract:
OBJECTIVE: The present study was carried out to investigate the effects of meals rich in either saturated fatty acids (SFA), n-6 fatty acids or n-3 fatty acids on postprandial plasma lipid and hormone concentrations, as well as post-heparin plasma lipoprotein lipase (LPL) activity. DESIGN: The study was a randomized single-blind study comparing responses to three test meals. SETTING: The volunteers attended the Clinical Investigation Unit of the Royal Surrey County Hospital on three separate occasions in order to consume the meals. SUBJECTS: Twelve male volunteers with an average age of 22.5 +/- 1.4 years (mean +/- SD) were selected from the University of Surrey student population; one subject dropped out of the study because he found the test meal unpalatable. INTERVENTIONS: Three meals were given in the early evening and postprandial responses were followed overnight for 11 h. The oils used to prepare the three test meals were: a mixed oil rich in saturated fatty acids (SFA), which mimicked the fatty acid composition of the current UK diet; corn oil, rich in n-6 fatty acids; and a fish oil concentrate (MaxEPA), rich in n-3 fatty acids. The oil under investigation (40 g) was incorporated into the test meals, which were otherwise identical [208 g carbohydrate, 35 g protein, 5.65 MJ (1350 kcal) energy]. Postprandial plasma triacylglycerol (TAG), gastric inhibitory polypeptide (GIP) and insulin responses, as well as post-heparin LPL activity (measured at 12 h postprandially only), were investigated. RESULTS: Fatty acids of the n-3 series significantly reduced plasma TAG responses compared with the mixed oil meal (P < 0.05) and increased post-heparin LPL activity 15 min after the injection of heparin (P < 0.01). A biphasic response was observed in TAG, with peak responses occurring at 1 h and between 3 and 7 h postprandially. GIP and insulin showed similar responses to the three test meals and no significant differences were observed. CONCLUSION: We conclude that fish oils can decrease postprandial plasma TAG levels partly through an increase in post-heparin LPL activity, which, however, is not due to increased GIP or insulin concentrations.
Abstract:
In this paper we have explored areas of application for health care manipulators and possible user groups. We have shown the steps in the design approach for the conceptual mechanism of the AAS. Future work will involve measuring muscle properties with the elbow parameterization test-bed in order to build a database for designing part of the AAS control system. More work on the mechanical design is required before a functional prototype can be built.
Abstract:
It is considered that systemisation, the use of standard systems of components and design solutions, has an effect on the activities of the designer. This paper draws on many areas of knowledge (design movements, economic theory, quality evaluation and organisation theory) to substantiate the view that systemisation will reduce the designer's discretion at the point of design. A methodology to test this hypothesis is described, which will be of use to other researchers studying the social processes of construction organisations.
Abstract:
The objective of this study was to investigate whether Salkovskis' (1985) inflated responsibility model of obsessive-compulsive disorder (OCD) applies to children. In an experimental design, 81 children aged 9–12 years were randomly allocated to three conditions: an inflated responsibility group, a moderate responsibility group and a reduced responsibility group. In all groups children were asked to sort sweets according to whether or not they contained nuts. At baseline the groups did not differ on children's self-reported anxiety, depression, obsessive-compulsive symptoms or inflated responsibility beliefs. The experimental manipulation successfully changed children's perceptions of responsibility. During the sorting task, the time taken to complete the task, checking behaviours, hesitations and anxiety were recorded. There was a significant effect of responsibility level on the behavioural variables of time taken, hesitations and checking: as perceived responsibility increased, children took longer to complete the task and checked and hesitated more often. There was no between-group difference in children's self-reported state anxiety. The results offer preliminary support for the link between inflated responsibility and increased checking behaviours in children and add to the small but growing literature suggesting that cognitive models of OCD may apply to children.