925 results for Anisotropic Gevrey Classes
Abstract:
First-year undergraduate university classes can be very large, and large student numbers often make it challenging for instructors to ensure assignments are graded consistently across the cohort. This session describes and demonstrates the use of interactive audience response technology (ART) with assessors (rather than students) to moderate assignment grading. Results from preliminary research indicate that this method of moderating assignment grading is effective and achieves more consistent outcomes for students.
Abstract:
This paper examines the contribution of aspects of critical and referential realism to the “logic” of structural explanation through an analysis of Erik Olin Wright’s Classes and the debate surrounding this work. Wright’s Classes has been selected as a case study because it offers an opportunity to examine issues pertaining to “objective” and “subjective” determinations of class, and related questions of agency and structure, at the level of actual methodological strategies. A close examination of the structure of Wright’s inquiry reveals a number of places where Harré’s and Bhaskar’s approaches may contribute to the prescription of methodological strategies which could overcome some of the antinomies on which the debate on Classes is based. As a case study, the paper underlines the important “underlabourer” role of critical and referential realism and their contribution to questions of agency and structure in the context of the actual stages involved in structural explanation.
Abstract:
This paper reports on the development of a tool that generates randomised, non-multiple-choice assessment within the Blackboard Learning Management System interface. An accepted weakness of multiple-choice assessment is that it cannot elicit learning outcomes from the upper levels of Biggs’ SOLO taxonomy. However, written assessment items require extensive resources for marking, and are susceptible to copying as well as marking inconsistencies in large classes. This project developed an assessment tool which is valid, reliable and sustainable and that addresses the issues identified above. The tool provides each student with an assignment assessing the same learning outcomes, but containing different questions, with responses in the form of words or numbers. Practice questions are available, enabling students to obtain feedback on their approach before submitting their assignment. Thus, the tool incorporates automatic marking (essential for large classes), randomised tasks for each student (reducing copying), the capacity to give credit for working (feedback on the application of theory), and the capacity to target higher-order learning outcomes by requiring students to derive their answers rather than choosing them. Results and feedback from students are presented, along with technical implementation details.
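The core idea of the tool described above, giving every student the same learning outcome but different numbers, can be sketched by seeding a random generator with the student identifier. This is a hedged, hypothetical illustration only; the paper's actual Blackboard implementation, question bank, and marking rules are not specified here, and all names and parameters below are invented for the example.

```python
import random

def make_question(student_id):
    """Generate a per-student numeric question (hypothetical example of
    randomised, non-multiple-choice assessment): each student gets the
    same task type but different parameters."""
    rng = random.Random(student_id)  # seeding by student id makes it reproducible
    principal = rng.randrange(1000, 9000, 100)
    rate = rng.choice([0.04, 0.05, 0.06])
    text = (f"Find the value of ${principal} after 3 years "
            f"at {rate:.0%} compound interest.")
    answer = principal * (1 + rate) ** 3
    return text, answer

def mark(student_id, submitted, tol=0.01):
    """Automatic marking: regenerate the student's question from the same
    seed and accept answers within a relative tolerance."""
    _, answer = make_question(student_id)
    return abs(submitted - answer) <= tol * answer
```

Because the generator is deterministic in the student id, the marking routine never needs to store each student's question, only the id.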
Abstract:
The proportion of functional sequence in the human genome is currently a subject of debate. The most widely accepted figure is that approximately 5% is under purifying selection. In Drosophila, estimates are an order of magnitude higher, though this corresponds to a similar quantity of sequence. These estimates depend on the difference between the distribution of genome-wide evolutionary rates and that observed in a subset of sequences presumed to be neutrally evolving. Motivated by the widening gap between these estimates and experimental evidence of genome function, especially in mammals, we developed a sensitive technique for evaluating such distributions and found that they are much more complex than previously apparent. We found strong evidence for at least nine well-resolved evolutionary rate classes in an alignment of four Drosophila species and at least seven classes in an alignment of four mammals, including human. We also identified at least three rate classes in human ancestral repeats. By positing that the largest of these ancestral repeat classes is neutrally evolving, we estimate that the proportion of non-neutrally evolving sequence is 30% of human ancestral repeats and 45% of the aligned portion of the genome. However, we also question whether any of the classes represent neutrally evolving sequences and argue that a plausible alternative is that they reflect variable structure-function constraints operating throughout the genomes of complex organisms.
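The analysis above rests on resolving a distribution of evolutionary rates into discrete rate classes. As a hedged illustration of that idea (not the authors' actual technique, which is not specified here), the sketch below fits a fixed number of Gaussian rate classes to a one-dimensional rate distribution by expectation-maximization; the class count, the Gaussian component shape, and all parameter values are assumptions made for the example.

```python
import numpy as np

def em_gauss_mix(x, k=2, iters=200):
    """Fit a k-component 1-D Gaussian mixture by EM, as a toy stand-in
    for resolving rate classes in a distribution of evolutionary rates.
    Returns mixture weights, means, and standard deviations."""
    # initialize means at evenly spaced quantiles of the data
    mu = np.quantile(x, np.linspace(0.1, 0.9, k))
    sigma = np.full(k, x.std())
    weights = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibility of each class for each observation
        dens = (weights * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2)
                / (sigma * np.sqrt(2 * np.pi)))
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and spreads from responsibilities
        nk = r.sum(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
        weights = nk / len(x)
    return weights, mu, sigma
```

In practice the number of classes would itself be selected (e.g. by a model-selection criterion), which is where claims like "at least nine well-resolved classes" come from.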
Abstract:
In the multi-view approach to semisupervised learning, we choose one predictor from each of multiple hypothesis classes, and we co-regularize our choices by penalizing disagreement among the predictors on the unlabeled data. We examine the co-regularization method used in the co-regularized least squares (CoRLS) algorithm, in which the views are reproducing kernel Hilbert spaces (RKHSs), and the disagreement penalty is the average squared difference in predictions. The final predictor is the pointwise average of the predictors from each view. We call the set of predictors that can result from this procedure the co-regularized hypothesis class. Our main result is a tight bound on the Rademacher complexity of the co-regularized hypothesis class in terms of the kernel matrices of each RKHS. We find that the co-regularization reduces the Rademacher complexity by an amount that depends on the distance between the two views, as measured by a data-dependent metric. We then use standard techniques to bound the gap between training error and test error for the CoRLS algorithm. Experimentally, we find that the amount of reduction in complexity introduced by co-regularization correlates with the amount of improvement that co-regularization gives in the CoRLS algorithm.
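A minimal sketch of the co-regularization objective described above, simplified to two linear views (a linear feature map is a special case of an RKHS view); the regularization weights and data shapes below are illustrative assumptions, not values from the paper. The objective is the labeled squared loss in each view, plus norm penalties, plus a penalty on the squared disagreement between the two views' predictions on the unlabeled points. Being quadratic, it reduces to one linear system, and the final predictor is the pointwise average of the two views.

```python
import numpy as np

def corls_fit(X1l, X2l, y, X1u, X2u, g1=0.1, g2=0.1, gc=1.0):
    """Co-regularized least squares with two linear views.
    Minimizes  ||X1l w1 - y||^2 + ||X2l w2 - y||^2
             + g1||w1||^2 + g2||w2||^2 + gc||X1u w1 - X2u w2||^2
    by solving the stacked normal equations."""
    d1, d2 = X1l.shape[1], X2l.shape[1]
    A = np.zeros((d1 + d2, d1 + d2))
    A[:d1, :d1] = X1l.T @ X1l + g1 * np.eye(d1) + gc * X1u.T @ X1u
    A[d1:, d1:] = X2l.T @ X2l + g2 * np.eye(d2) + gc * X2u.T @ X2u
    A[:d1, d1:] = -gc * X1u.T @ X2u   # cross terms couple the two views
    A[d1:, :d1] = -gc * X2u.T @ X1u
    b = np.concatenate([X1l.T @ y, X2l.T @ y])
    w = np.linalg.solve(A, b)
    return w[:d1], w[d1:]

def corls_predict(w1, w2, X1, X2):
    # final predictor: pointwise average of the two views
    return 0.5 * (X1 @ w1 + X2 @ w2)
```

Raising the coupling weight `gc` forces the two views to agree more on the unlabeled data, which is exactly the mechanism the abstract relates to the reduction in Rademacher complexity.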
Abstract:
This paper argues for a renewed focus on statistical reasoning in the elementary school years, with opportunities for children to engage in data modeling. Data modeling involves investigations of meaningful phenomena, deciding what is worthy of attention, and then progressing to organizing, structuring, visualizing, and representing data. Reported here are some findings from a two-part activity (Baxter Brown’s Picnic and Planning a Picnic) implemented at the end of the second year of a current three-year longitudinal study (grade levels 1-3). Planning a Picnic was also implemented in a grade 7 class to provide an opportunity for the different age groups to share their products. Addressed here are the grade 2 children’s predictions for missing data in Baxter Brown’s Picnic, the questions posed and representations created by both grade levels in Planning a Picnic, and the metarepresentational competence displayed in the grade levels’ sharing of their products for Planning a Picnic.
Abstract:
Audience Response Systems (ARS) have been successfully used by academics to facilitate student learning and engagement, particularly in large lecture settings. However, in large core subjects a key challenge is not only to engage students, but also to engage large and diverse teaching teams in order to ensure a consistent approach to grading assessments. This paper provides an insight into the ways in which ARS can be used to encourage participation by tutors in marking and moderation meetings. It concludes that ARS can improve the consistency of grading and the quality of feedback provided to students.
Abstract:
University classes in marketing are often large, and therefore require teams of teachers to cover all of the necessary activities. A major problem with teaching teams is the inconsistency that results from myriad individuals offering subjective opinions. This innovation uses the latest moderation techniques along with Audience Response Technology (ART) to enhance the learning experience by providing more consistent and reliable grading in large classes. Assessment items are moderated before they are graded in meetings that employ ART. Results show the process is effective when the teaching team is very large, or there is a diverse range of experienced and inexperienced teachers. This “behind the scenes” innovation is not immediately apparent to students, but results in more consistent grades, more useful feedback for students, and more confident graders.
Performance of elite seated discus throwers in F30s classes: Part II: does feet positioning matter?
Abstract:
Background: Studies on the relationship between performance and design of the throwing frame have been limited. Part I provided only a description of whole-body positioning. Objectives: The specific objectives were (a) to benchmark feet positioning characteristics (i.e. position, spacing and orientation) and (b) to investigate the relationship between performance and these characteristics for male seated discus throwers in F30s classes. Study Design: Descriptive analysis. Methods: A total of 48 attempts performed by 12 stationary discus throwers in the F33 and F34 classes during the seated discus throwing event of the 2002 International Paralympic Committee Athletics World Championships were analysed in this study. Feet positioning was characterised by three-dimensional data on front and back foot position, as well as by spacing and orientation, corresponding respectively to the distance between and the angle made by the two feet. Results: Only 4 of 30 feet positioning characteristics presented a correlation coefficient greater than 0.5, namely the feet spacing on the mediolateral and anteroposterior axes in the F34 class, and the back foot position and feet spacing on the mediolateral axis in the F33 class. Conclusions: This study provided key information for a better understanding of the interaction between the throwing technique of elite seated throwers and their throwing frame.
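The screening step reported in the results above, keeping only characteristics whose correlation with performance exceeds 0.5, can be sketched as follows. This is a hypothetical illustration with invented feature names; the paper's actual data and statistical workflow are not reproduced here.

```python
import numpy as np

def screen_characteristics(features, distance, threshold=0.5):
    """Correlate each feet positioning characteristic with throw distance
    and keep those with |r| above the threshold.

    features: dict mapping characteristic name -> array of values per attempt
    distance: array of throw distances (one per attempt)
    """
    kept = {}
    for name, values in features.items():
        r = np.corrcoef(values, distance)[0, 1]
        if abs(r) > threshold:
            kept[name] = r
    return kept
```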
Abstract:
In urban locations in Australia and elsewhere, public space may be said to be under attack from developers and also from attempts by civic authorities to oversee and control it (Davis 1995, Mitchell 2003, Watson 2006, Iveson 2006). The use of public space by young people, in particular, raises issues in Australia and elsewhere in the world. In a context of monitoring and control procedures, young people’s use of public space is often viewed as a threat to the prevailing social order (Loader 1996, White 1998, Crane and Dee 2001). This paper discusses recent technological developments in the surveillance, governance and control of public space used by young people, children and people of all ages.
Abstract:
X-ray microtomography (micro-CT) with micron resolution enables new ways of characterizing microstructures and opens pathways for forward calculations of multiscale rock properties. A quantitative characterization of the microstructure is the first step in this challenge. We developed a new approach to extract scale-dependent characteristics of porosity, percolation, and anisotropic permeability from 3-D microstructural models of rocks. The Hoshen-Kopelman algorithm of percolation theory is employed for a standard percolation analysis. The anisotropy of permeability is calculated by means of the star volume distribution approach. The local porosity distribution and local percolation probability are obtained by using the local porosity theory. Additionally, the local anisotropy distribution is defined and analyzed through two empirical probability density functions, the isotropy index and the elongation index. For such a high-resolution data set, the typical data sizes of the CT images are on the order of gigabytes to tens of gigabytes; thus an extremely large number of calculations are required. To resolve this large memory problem, parallelization in OpenMP was used to optimally harness the shared memory infrastructure on cache-coherent Non-Uniform Memory Access architecture machines such as the iVEC SGI Altix 3700Bx2 supercomputer. We see adequate visualization of the results as an important element in this first pioneering study.
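The Hoshen-Kopelman cluster labeling named above can be sketched in its serial, small-scale form: scan the binary pore image once, assign labels using already-visited neighbours, and merge labels with union-find; percolation along an axis then reduces to asking whether one cluster touches both opposite faces. This is a toy sketch, not the paper's OpenMP-parallel implementation for gigabyte CT volumes.

```python
import numpy as np

def hoshen_kopelman(pore):
    """Label connected pore clusters in a 3-D binary array (6-connectivity)
    with a single scan plus union-find label merging."""
    labels = np.zeros(pore.shape, dtype=int)
    parent = [0]  # parent[i] == i marks a root; index 0 is "not pore"

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    def union(i, j):
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[rj] = ri
        return ri

    nx, ny, nz = pore.shape
    for x in range(nx):
        for y in range(ny):
            for z in range(nz):
                if not pore[x, y, z]:
                    continue
                # only already-scanned neighbours are considered
                neigh = []
                if x > 0 and pore[x - 1, y, z]: neigh.append(labels[x - 1, y, z])
                if y > 0 and pore[x, y - 1, z]: neigh.append(labels[x, y - 1, z])
                if z > 0 and pore[x, y, z - 1]: neigh.append(labels[x, y, z - 1])
                if not neigh:
                    parent.append(len(parent))      # open a new cluster
                    labels[x, y, z] = len(parent) - 1
                else:
                    root = neigh[0]
                    for n in neigh[1:]:
                        root = union(root, n)       # merge touching clusters
                    labels[x, y, z] = find(root)
    # final pass: flatten every label to its cluster root
    return np.vectorize(lambda i: find(i) if i else 0)(labels)

def percolates_z(labels):
    """True if any single cluster spans from z=0 to z=max."""
    bottom = set(labels[:, :, 0].ravel()) - {0}
    top = set(labels[:, :, -1].ravel()) - {0}
    return bool(bottom & top)
```

The parallel version partitions the volume into slabs labeled independently and then merges labels across slab boundaries, which is what makes the algorithm attractive on shared-memory machines.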
Abstract:
We present a rigorous validation of the analytical Amadei solution for the stress concentration around an arbitrarily orientated borehole in general anisotropic elastic media. First, we revisit the theoretical framework of the Amadei solution and present analytical insights showing that the solution does indeed contain all special cases of symmetry, contrary to previous understanding, provided that the reduced strain coefficients b11 and b55 are not equal. It is shown from theoretical considerations and published experimental data that b11 and b55 are not equal for realistic rocks. Second, we develop a 3D finite element elastic model within a hybrid analytical-numerical workflow that circumvents the need to rebuild and remesh the model for every borehole and material orientation. Third, we show that the borehole stresses computed from the numerical model and the analytical solution match almost perfectly for different borehole orientations (vertical, deviated and horizontal) and for several cases involving isotropic, transverse isotropic and orthorhombic symmetries. It is concluded that the analytical Amadei solution is valid with no restriction on the borehole orientation or the symmetry of the elastic anisotropy.
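For context on the special cases mentioned above: in the fully isotropic limit, any valid borehole stress solution must reduce to the classical Kirsch solution. A minimal sketch of that limit (the hoop stress at the wall of a vertical borehole under two horizontal far-field stresses and mud pressure) is below; it is the textbook isotropic formula, not the general anisotropic Amadei solution.

```python
import numpy as np

def kirsch_hoop_stress(sH, sh, pw, theta):
    """Hoop stress at the wall of a vertical borehole in an isotropic,
    linear elastic medium (Kirsch solution).

    sH, sh : major and minor horizontal far-field stresses
    pw     : wellbore (mud) pressure
    theta  : angle from the direction of sH, in radians
    """
    return (sH + sh) - 2.0 * (sH - sh) * np.cos(2.0 * theta) - pw
```

The familiar extremes follow directly: the maximum hoop stress 3*sH - sh - pw occurs perpendicular to sH (theta = 90 deg), and the minimum 3*sh - sH - pw occurs along sH, which is why breakouts and tensile fractures form at those azimuths.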
Abstract:
Anisotropic damage distribution and evolution have a profound effect on borehole stress concentrations. Damage evolution is an irreversible process that is not adequately described within classical equilibrium thermodynamics. Therefore, we propose a constitutive model, based on non-equilibrium thermodynamics, that accounts for anisotropic damage distribution, anisotropic damage threshold and anisotropic damage evolution. We implemented this constitutive model numerically, using the finite element method, to calculate stress–strain curves and borehole stresses. The resulting stress–strain curves are distinctively different from linear elastic-brittle and linear elastic-ideal plastic constitutive models and realistically model experimental responses of brittle rocks. We show that the onset of damage evolution leads to an inhomogeneous redistribution of material properties and stresses along the borehole wall. The classical linear elastic-brittle approach to borehole stability analysis systematically overestimates the stress concentrations on the borehole wall, because dissipative strain-softening is underestimated. The proposed damage mechanics approach explicitly models dissipative behaviour and leads to non-conservative mud window estimations. Furthermore, anisotropic rocks with preferential planes of failure, like shales, can be addressed with our model.
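The qualitative difference claimed above, a stress-strain response that softens after a peak rather than failing abruptly or yielding at constant stress, can be illustrated with the simplest possible damage model. The sketch below uses a scalar (isotropic) damage variable with an assumed exponential softening law; it is a toy illustration of strain-softening damage mechanics, not the authors' anisotropic, non-equilibrium constitutive model, and all parameter values are invented.

```python
import numpy as np

def damage_softening_curve(E=10e9, eps0=1e-3, eps_c=2e-3, n=200):
    """Uniaxial stress-strain curve for a scalar damage model:
        sigma = (1 - D) * E * eps
    with D = 0 below the damage threshold eps0 and an assumed
    exponential softening law beyond it:
        D = 1 - (eps0 / eps) * exp(-(eps - eps0) / eps_c).
    Returns strain, damage, and stress arrays."""
    eps = np.linspace(1e-6, 5 * eps0, n)
    D = np.where(
        eps <= eps0,
        0.0,
        1.0 - (eps0 / eps) * np.exp(-(eps - eps0) / eps_c),
    )
    sigma = (1.0 - D) * E * eps
    return eps, D, sigma
```

The curve is linear elastic up to the threshold, peaks at E * eps0, and then decays smoothly: the dissipative softening branch that, per the abstract, the linear elastic-brittle idealization ignores when it overestimates borehole wall stresses.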