991 results for Prediction of scholastic success


Abstract:

Mode of access: Internet.

Abstract:

Bibliography: leaves 48-51.

Abstract:

The attainment of high grades in the Victorian Certificate of Education (VCE) is critical to the future study and employment prospects of many Australian adolescents, so it is important to understand the factors that contribute to VCE performance. This study had two aims. The main aim was to test competing models of academic performance, subsuming a range of situational and dispositional variables based on (a) self-efficacy theory, (b) target and purpose goals, (c) cognitive skills and self-regulatory strategies, and (d) positive psychology. These models were each tested for English performance and for mathematics performance, as these units contribute proportionally the most to overall VCE scores. The second aim was to examine whether pressures peculiar to the VCE affect performance; to this end, the competing models were tested in a sample of Victorian students prior to the VCE (year 10) and again during the VCE (year 11). A preliminary study, using an independent sample of 302 year nine students, was conducted to develop and test four scales required for the major study; the results indicated that these new scales were psychometrically reliable and valid. Three hundred and seven Australian students participated in the year 10 and 11 study. These students were asked to provide their final year 9, 10 and 11 English and mathematics grades at times one, three and five, and to complete a series of questionnaires at times two and four. Results of the year 10 study indicated that models based on self-efficacy theory were the best predictors of both English and mathematics performance, with high past grades, high self-efficacy and low anxiety contributing most to performance. While the year 10 self-efficacy, target goal, positive psychology, self-regulatory and cognitive-skill models each remained robust in year 11, a substantial increase in explained variance was observed from year 10 to year 11 in the purpose goal models. Students' mastery goals and performance-approach goals became substantially more predictive during the VCE than before it, suggesting that students responded in highly instrumental ways to the pressures, and importance, of their VCE. An integrated model combining variables from the competing models was also tested in the VCE. The integrated models were comparable, in both English and mathematics, to the self-efficacy models, but explained less variance than the purpose goal models; on grounds of parsimony, the integrated models were therefore not preferred. The implications of these results for teaching and school counselling practices are discussed. It is recommended that students be encouraged to maintain a positive outlook on their schoolwork and to set their VCE goals as a combination of self-referenced (mastery) and other-referenced (performance-approach) goals.

Abstract:

Explores how machine learning techniques can be used to build effective student modelling systems with constrained development and operational overheads by integrating top-down and bottom-up initiatives. Emphasizes feature-based modelling.

Abstract:

Institutionally collected data identifying student demographics, course performance in both the pre-college mathematics course and the college-level mathematics course, and the 'stopping-out' time between the two courses were used to create a predictive model of academic success for 'high-risk' college-level mathematics students. The two most significant factors were the pre-college mathematics course grade and the student's overall college GPA.
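
As an illustration only: the abstract does not name the modelling technique, so the following minimal sketch assumes logistic regression (scikit-learn) and hypothetical grade/GPA data to show the shape of such a two-factor model.

```python
# Minimal sketch of a two-predictor model of college-level math success.
# Data, column meanings, and the choice of logistic regression are assumptions;
# the study's actual modelling method is not specified in this abstract.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical records: [pre-college math grade (0-4), overall college GPA]
X = np.array([[2.0, 2.1], [3.3, 3.0], [1.7, 2.4], [3.7, 3.5],
              [2.3, 2.8], [4.0, 3.8], [1.0, 1.9], [3.0, 3.2]])
y = np.array([0, 1, 0, 1, 0, 1, 0, 1])  # 1 = passed the college-level course

model = LogisticRegression().fit(X, y)
# Estimated success probability for a hypothetical new student
print(model.predict_proba([[2.5, 2.9]])[0, 1])
```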

Abstract:

Exponential growth of genomic data over the last two decades has made manual analyses impractical for all but trial studies. As genomic analyses have become more sophisticated and move toward comparisons across large datasets, computational approaches have become essential. One of the most important biological questions is to understand the mechanisms underlying gene regulation. Genetic regulation is commonly investigated and modelled through transcriptional regulatory network (TRN) structures, which model the regulatory interactions between two key components: transcription factors (TFs) and the target genes (TGs) they regulate. Transcriptional regulatory networks have proven invaluable in bioinformatics and, used in conjunction with comparative genomics, have provided substantial insights into the evolution of regulatory interactions. Current approaches to regulatory network inference, however, omit two additional key entities: promoters and transcription factor binding sites (TFBSs). In this study, we explored the relationships among these regulatory components in bacteria. Our primary goal was to identify relationships that can assist in reducing the high false positive rates associated with transcription factor binding site predictions and thereby enhance the reliability of the inferred transcriptional regulatory networks. In a preliminary exploration of the relationships between the key regulatory components in Escherichia coli transcription, we discovered a number of potentially useful features, some of which proved successful in reducing the number of false positives when applied to re-evaluate binding site predictions. Combining a location score with sequence dissimilarity scores increased de novo binding site prediction accuracy by 13.6%. Of chief interest was the relationship between promoter strength and transcription factors grouped by their regulatory role. Based on the common assumption that promoter homology positively correlates with transcription rate, we hypothesised that weak promoters would be preferentially associated with activator binding sites to enhance gene expression, whilst strong promoters would have more repressor binding sites to repress or inhibit gene transcription. A t-test on E. coli σ70 promoters returned a p-value of 0.072, which at the 0.1 significance level supports this (alternative) hypothesis, although the trend may only be present for promoters whose corresponding TFBSs are either all repressors or all activators. While these observations are specific to σ70 promoters, such suggestive results strongly encourage additional investigation as more experimentally confirmed data become available.
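
As an illustration of the hypothesis test described above, a minimal sketch (SciPy; the grouping of promoters and all strength values are hypothetical placeholders, not the study's data):

```python
# Minimal sketch of the promoter-strength comparison described above.
# The arrays below are hypothetical placeholders, not data from the study.
import numpy as np
from scipy import stats

# Promoter strength scores grouped by the regulatory role of the TF whose
# binding site is associated with each promoter (hypothetical values).
activator_assoc = np.array([0.31, 0.42, 0.28, 0.35, 0.39, 0.30])
repressor_assoc = np.array([0.44, 0.52, 0.47, 0.38, 0.55, 0.49])

# One-sided two-sample t-test: H1 is that activator-associated promoters
# are weaker (lower strength) than repressor-associated ones.
t_stat, p_value = stats.ttest_ind(activator_assoc, repressor_assoc,
                                  alternative="less")
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
print("support at 0.1 level" if p_value < 0.1 else "no support at 0.1 level")
```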
Much of the remainder of the thesis concerns a machine learning study of binding site prediction using the SVM and kernel methods, principally the spectrum kernel. Spectrum kernels have been successfully applied in previous studies of protein classification [91, 92], as well as the related problem of promoter prediction [59], and we have here successfully applied the technique to refining TFBS predictions. The advantages provided by the SVM classifier were best seen in 'moderately' conserved transcription factor binding sites, as represented by our E. coli CRP case study. Inclusion of additional position feature attributes further increased accuracy by 9.1%, but more notable was the considerable decrease in false positive rate from 0.8 to 0.5 while retaining 0.9 sensitivity. Improved prediction of transcription factor binding sites is in turn extremely valuable in improving the inference of regulatory relationships, a problem notoriously prone to false positive predictions. Here, the number of false regulatory interactions inferred using the conventional two-component model was substantially reduced when we integrated de novo transcription factor binding site predictions as an additional acceptance criterion in a case study of inference in the Fur regulon. This initial work was extended to a comparative study of the iron regulatory system across 20 Yersinia strains, which revealed interesting strain-specific differences, especially between pathogenic and non-pathogenic strains. Such differences were made clear through interactive visualisations using the TRNDiff software developed as part of this work, and would have remained undetected using conventional methods. This approach led to the nomination of the Yfe iron-uptake system as a candidate for further wet-lab experimentation, due to its potential active functionality in non-pathogens and its known participation in the full virulence of the bubonic plague strain. Building on this work, we introduced novel structures we have labelled 'regulatory trees', inspired by the phylogenetic tree concept. Instead of using gene or protein sequence similarity, regulatory trees are constructed from the number of similar regulatory interactions. While common phylogenetic trees convey information regarding changes in gene repertoire, which we might regard as analogous to 'hardware', the regulatory tree informs us of changes in regulatory circuitry, in some respects analogous to 'software'. In this context, we explored the 'pan-regulatory network' for the Fur system: the entire set of regulatory interactions found for the Fur transcription factor across a group of genomes. In the pan-regulatory network, emphasis is placed on inferring the regulatory network for each target genome from multiple sources instead of a single source, as is the common approach. The benefit of using multiple reference networks is a more comprehensive survey of the relationships and increased confidence in the predicted regulatory interactions. We distinguish between relationships found across the full set of genomes, the 'core regulatory set', and interactions found only in a subset of the genomes explored, the 'sub-regulatory set'. We found nine Fur target gene clusters present across the four genomes studied; this core set potentially identifies basic regulatory processes essential for survival.
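
For readers unfamiliar with the spectrum kernel mentioned above, a minimal sketch (hypothetical sequences; k = 3 is an arbitrary choice, not necessarily the one used in the thesis):

```python
# Minimal sketch of a k-spectrum kernel for DNA sequences.
# Sequences and k are hypothetical; the study's actual parameters may differ.

def spectrum_features(seq: str, k: int = 3) -> dict:
    """Count occurrences of every k-mer in the sequence."""
    counts = {}
    for i in range(len(seq) - k + 1):
        kmer = seq[i:i + k]
        counts[kmer] = counts.get(kmer, 0) + 1
    return counts

def spectrum_kernel(s1: str, s2: str, k: int = 3) -> int:
    """Spectrum kernel: dot product of the two k-mer count vectors."""
    f1, f2 = spectrum_features(s1, k), spectrum_features(s2, k)
    return sum(c * f2.get(kmer, 0) for kmer, c in f1.items())

# Hypothetical binding-site candidates
site_a = "TGTGATCTAGATCACA"  # CRP-like palindromic site
site_b = "TGTGACGTACGTCACA"
print(spectrum_kernel(site_a, site_b, k=3))
```

In an SVM, this kernel would replace the default inner product, so candidate sites are compared by shared k-mer content rather than position-by-position identity.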
Species-level differences are seen at the sub-regulatory-set level; for example, the known virulence factors YbtA and PchR were found in Y. pestis and P. aeruginosa respectively, but were not present in either E. coli or B. subtilis. Such factors, and the iron-uptake systems they regulate, are ideal candidates for wet-lab investigation to determine whether or not they are pathogen-specific. In this study, we employed a broad range of approaches to address our goals and assessed these methods using the Fur regulon as our initial case study. We identified a set of promising feature attributes, demonstrated their success in increasing transcription factor binding site prediction specificity while retaining sensitivity, and showed the importance of binding site predictions in enhancing the reliability of regulatory interaction inferences. Most importantly, these outcomes led to the introduction of a range of visualisations and techniques which are applicable across the entire bacterial spectrum and can be utilised in studies beyond the understanding of transcriptional regulatory networks.
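
The regulatory-tree construction described above maps naturally onto standard hierarchical clustering. A minimal sketch follows, assuming Jaccard similarity over shared (TF, target) interactions as the similarity measure (the thesis specifies only "the number of similar regulatory interactions"; strains and interaction sets are hypothetical):

```python
# Minimal sketch: build a 'regulatory tree' from shared regulatory interactions.
# Strains and interaction sets are hypothetical placeholders.
import numpy as np
from scipy.cluster.hierarchy import linkage, dendrogram
from scipy.spatial.distance import squareform

strains = ["strain_A", "strain_B", "strain_C", "strain_D"]
# Each set holds (TF, target_gene) regulatory interactions for one strain.
interactions = {
    "strain_A": {("fur", "yfeA"), ("fur", "ybtA"), ("fur", "fhuA")},
    "strain_B": {("fur", "yfeA"), ("fur", "ybtA")},
    "strain_C": {("fur", "yfeA"), ("fur", "fhuA")},
    "strain_D": {("fur", "fhuA")},
}

n = len(strains)
dist = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        a, b = interactions[strains[i]], interactions[strains[j]]
        jaccard = len(a & b) / len(a | b)   # similarity from shared interactions
        dist[i, j] = dist[j, i] = 1.0 - jaccard  # convert to a distance

tree = linkage(squareform(dist), method="average")
dendrogram(tree, labels=strains)  # renders the regulatory tree (needs matplotlib)
```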

Abstract:

The present paper deals with a CAE-based study of the impact of jacketed projectiles on single- and multi-layered metal armour plates using LS-DYNA. The validation of the finite element modelling procedure is mainly based on a mesh convergence study using both shell and solid elements to represent single-layered mild steel target plates. It is shown that the proper choice of mesh density and strain rate-dependent material properties is essential for an accurate prediction of projectile residual velocity. The modelling requirements are initially arrived at by correlating against test residual velocities for single-layered mild steel plates of different depths at impact velocities in the range of approximately 800-870 m/s. The efficacy of correlation is adjudged in terms of a 'correlation index', defined in the paper, for which values close to unity are desirable. The experience gained for single-layered plates is next used in simulating projectile impacts on multi-layered mild steel target plates, and once again a high degree of correlation with experimental residual velocities is observed. The study is repeated for single- and multi-layered aluminium target plates with a similar level of success in test residual velocity prediction. To the authors' best knowledge, the present comprehensive study shows in particular, for the first time, that with a proper modelling approach LS-DYNA can be used with a great degree of confidence in designing perforation-resistant single- and multi-layered metallic armour plates.
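
The paper's 'correlation index' is defined there and is not reproduced in this abstract; purely as an illustrative stand-in, the sketch below compares simulated and test residual velocities with a simple mean ratio, so that values near unity indicate good correlation (all numbers hypothetical):

```python
# Illustrative only: compare simulated vs. test residual velocities.
# The actual 'correlation index' is defined in the paper; this mean-ratio
# stand-in merely shows the shape of such a comparison. Values are hypothetical.
import numpy as np

v_test = np.array([612.0, 655.0, 701.0, 668.0])  # measured residual velocities (m/s)
v_sim  = np.array([605.0, 661.0, 695.0, 674.0])  # LS-DYNA predictions (m/s)

index = np.mean(v_sim / v_test)  # close to 1.0 indicates good correlation
print(f"stand-in correlation index: {index:.3f}")
```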

Abstract:

Numerical simulation of separated flows in rocket nozzles is challenging because existing turbulence models are unable to predict them correctly. This paper addresses the issue with the Spalart-Allmaras and Shear Stress Transport (SST) eddy-viscosity models, which predict flow separation with moderate success. Their performance has been compared against experimental data for a conical and two contoured subscale nozzles. Both models fail to predict the separation location correctly, exhibiting sensitivity to the nozzle pressure ratio (NPR) and nozzle type. A careful assessment indicated how the SST model had to be tuned for better, consistent prediction: its failure is caused by the limiting of shear stress inside the boundary layer according to Bradshaw's assumption, and by overprediction of the jet spreading rate. Accordingly, the SST coefficients were empirically modified to match the experimental wall pressure data. The results confirm that accurate RANS prediction of separation depends on correctly capturing the jet spreading rate, and that this is feasible over a wide range of NPRs with modified values of the diffusion coefficients in the turbulence model.
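
For reference, the shear-stress limiting referred to above is Menter's standard SST eddy-viscosity limiter, built on Bradshaw's assumption that the turbulent shear stress in a boundary layer scales with the turbulent kinetic energy (standard formulation; the paper's modified diffusion coefficients are not reproduced in the abstract):

```latex
% Bradshaw's assumption: shear stress proportional to turbulent kinetic energy
\tau = \rho\, a_1 k, \qquad a_1 \approx 0.31

% SST eddy viscosity with the shear-stress limiter
% (S: strain-rate magnitude, F_2: boundary-layer blending function)
\nu_t = \frac{a_1 k}{\max\!\left(a_1 \omega,\; S F_2\right)}
```

Inside a boundary layer, where S F_2 exceeds a_1 ω, the limiter caps the eddy viscosity so that the modelled shear stress cannot exceed Bradshaw's bound, which is precisely the behaviour the paper identifies as problematic for separated nozzle flows.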

Abstract:

The potential of Raman spectroscopy for determining meat quality attributes has been investigated using data from a set of 52 cooked beef samples rated by trained taste panels. The Raman spectra, shear force and cooking loss were measured, and partial least squares (PLS) regression was used to correlate the attributes with the Raman data. Good correlations and standard errors of prediction were found when the Raman data were used to predict the panels' rating of acceptability of texture (R² = 0.71, root mean square error of prediction (RMSEP) as % of the mean (μ) = 15%), degree of tenderness (R² = 0.65, RMSEP% of μ = 18%), degree of juiciness (R² = 0.62, RMSEP% of μ = 16%), and overall acceptability (R² = 0.67, RMSEP% of μ = 11%). In contrast, the mechanically determined shear force was poorly correlated with tenderness (R² = 0.15). Tentative interpretation of the regression coefficient plots suggests that the α-helix to β-sheet ratio of the proteins and the hydrophobicity of the myofibrillar environment are important factors contributing to the shear force, tenderness, texture and overall acceptability of the beef. In summary, this work demonstrates that Raman spectroscopy can be used to predict consumer-perceived beef quality. In part, this success is due to the fact that the Raman method predicts texture and tenderness, the predominant factors determining overall acceptability in the Western world. It is clear that Raman spectroscopy has considerable potential as a method for the non-destructive and rapid determination of beef quality parameters.
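
As an illustration of the PLS calibration workflow described above, a minimal sketch (scikit-learn; randomly generated stand-in spectra and panel scores, not the study's data):

```python
# Minimal sketch of PLS calibration of sensory scores against Raman spectra.
# Data are random placeholders, not the study's measurements.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(0)
X = rng.normal(size=(52, 500))  # 52 spectra, 500 wavenumber bins (hypothetical)
y = 5 + X[:, :10].sum(axis=1) + rng.normal(scale=0.5, size=52)  # panel score

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
y_hat = pls.predict(X_te).ravel()

r2 = r2_score(y_te, y_hat)
rmsep = np.sqrt(mean_squared_error(y_te, y_hat))
print(f"R^2 = {r2:.2f}, RMSEP = {100 * rmsep / y_te.mean():.0f}% of mean")
```

Reporting RMSEP as a percentage of the mean, as above, is what allows the 11-18% figures quoted in the abstract to be compared across attributes measured on different scales.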

Abstract:

A computational approach to predicting the thermodynamics of forming a variety of imidazolium-based salts and ionic liquids from typical starting materials is described. The gas-phase proton and methyl cation acidities of several protonating and methylating agents, as well as the proton and methyl cation affinities of many important methyl-, nitro-, and cyano-substituted imidazoles, have been calculated reliably using the computationally feasible DFT (B3LYP) and MP2 (extrapolated to the complete basis set limit) methods. These accurately calculated proton and methyl cation affinities of neutrals and anions are used in conjunction with an empirical approach based on molecular volumes to estimate the lattice enthalpies and entropies of ionic liquids, organic solids, and organic liquids. These quantities were used to construct a thermodynamic cycle for salt formation that reliably predicts the ability to synthesize a variety of salts, including ones with potentially high energy densities. An adjustment of the gas-phase thermodynamic cycle to account for solid- and liquid-phase chemistries provides the best overall assessment of salt formation and stability. The approach has been applied to imidazoles (the cation to be formed) with alkyl, nitro, and cyano substituents. The proton and methyl cation donors studied were as follows: HCl, HBr, HI, (HO)2SO2, HSO3CF3 (TfOH), and HSO3(C6H4)CH3 (TsOH); CH3Cl, CH3Br, CH3I, (CH3O)2SO2, CH3SO3CF3 (TfOCH3), and CH3SO3(C6H4)CH3 (TsOCH3). As substitution of the cation with electron-withdrawing groups increases, the triflate reagents appear to be the best overall choice as protonating and methylating agents; even stronger alkylating agents should be considered to enhance the chances of synthetic success. When using the enthalpies of reaction for the gas-phase reactants (eq 6) to form a salt, a cutoff value of -13 kcal mol⁻¹ or lower (more negative) should be used as the minimum for predicting whether a salt can be synthesized.
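
Schematically, the cycle described above combines a gas-phase ion-formation step with a volume-based lattice-enthalpy estimate. The sketch below is hedged: it follows the common Glasser-Jenkins-type volume-based form, with α and β as empirically fitted constants and I the ionic strength factor (I = 1 for a 1:1 salt); the paper's own equations, including eq 6, are not reproduced here.

```latex
% Gas-phase ion-pair formation from acid HA and imidazole base Im:
%   HA(g) + Im(g) -> [HIm]+(g) + [A]-(g)
\Delta H_{\mathrm{gas}} = \Delta H_{\mathrm{acid}}(\mathrm{HA}) - \mathrm{PA}(\mathrm{Im})

% Volume-based estimate of the lattice enthalpy of the 1:1 salt,
% with V_m the molecular (formula unit) volume:
\Delta H_{\mathrm{latt}} \approx 2I\!\left(\frac{\alpha}{V_m^{1/3}} + \beta\right) + 2RT

% Overall condensed-phase salt formation; synthesis is predicted feasible
% when this quantity is sufficiently exothermic:
\Delta H_{\mathrm{salt}} = \Delta H_{\mathrm{gas}} - \Delta H_{\mathrm{latt}}
```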