843 results for the linear logistic test model
Abstract:
Educational assessment was commonplace worldwide throughout the last century. As the theoretical underpinnings of education have shifted from behaviourism and social efficiency to constructivism and cognitive theories over the past two decades, assessment theories and practices have undergone a widespread transformation. The emergent assessment paradigm, with a futurist perspective, indicates a move away from the prevailing large-scale, high-stakes standardised testing and towards classroom-based formative assessment. Innovations and reforms initiated to achieve better education outcomes for a sustainable future through more developed learning and assessment theories include the 2007 College English Reform Program (CERP) in the Chinese higher education context. This paper focuses on the College English Test (CET), the national English as a Foreign Language (EFL) testing system for non-English majors at the tertiary level in China. It explores the roles that the CET played in the past two College English curriculum reforms, and the new role that testing and assessment assume in the newly launched reform. The paper holds that the CET was operationalised to raise standards. However, the extended use of this standardised testing system imposes constraints on, and has negative washback effects for, tertiary EFL education. Therefore, the newly launched reform, the CERP, proposes a new assessment model that combines summative and formative approaches; testing and assessment thereby assume a new role: to engender desirable education outcomes. The question asked is: will the mixed approach of formative and summative assessment provide the intended cure for the agony that tertiary EFL education in China has long suffered, namely spending much time yet achieving little effect?
The paper reports the progress and challenges documented in the available research literature, yet asserts that much remains to be explored about the potential of this assessment mix in a society with a deep-rooted examination tradition and an obsession with examinations.
Abstract:
The field of destination image has been widely discussed in the destination literature since the early 1970s (see Mayo, 1973). However, the extent to which travel context affects an individual's destination image evaluation, and therefore destination choice, has received scant attention (Hu & Ritchie, 1993). This study, utilising expectancy-value theory, sought to elicit salient destination attributes from consumers across two travel contexts: short-break holidays and longer getaways. Using the Repertory Test technique, the attributes elicited as salient for short-break holidays were consistent with those elicited for longer getaways. While this study was limited to Brisbane's near-home destinations, the results will be of interest to destination marketers and researchers interested in the challenge of positioning a destination in diverse markets.
Abstract:
A nonlinear interface element modelling method is formulated to predict the deformation and failure of high-adhesive, thin-layer polymer-mortared masonry exhibiting failure of both units and mortar. Plastic flow vectors are integrated explicitly within the implicit finite element framework rather than through predictor-corrector approaches. The method is calibrated using experimental data from uniaxial compression, shear triplet, and flexural beam tests. The model is validated against a thin-layer mortared masonry shear wall whose experimental datasets are reported in the literature, and is then used to examine the behaviour of thin-layer mortared masonry under biaxial loading.
Abstract:
The total entropy utility function is considered for the dual purpose of Bayesian design for model discrimination and parameter estimation. A sequential design setting is proposed in which it is shown how to efficiently estimate the total entropy utility for a wide variety of data types. Utility estimation relies on forming particle approximations to a number of intractable integrals, which is made possible by the use of the sequential Monte Carlo algorithm for Bayesian inference. A number of motivating examples are considered to demonstrate the performance of total entropy in comparison with utilities targeting model discrimination or parameter estimation alone. The results suggest that the total entropy utility selects designs which are efficient under both experimental goals, with little compromise in achieving either. As such, the total entropy utility is advocated as a general utility for Bayesian design in the presence of model uncertainty.
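The idea of scoring a candidate design by expected information gain can be illustrated with a toy sketch. Everything below (the two rival models, the prior, the designs) is invented, and only the model-discrimination component of a utility is estimated, by plain Monte Carlo rather than the paper's SMC machinery:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 0.5  # known observation noise

def simulate(model, theta, x):
    """Draw data under model 0 (linear mean) or model 1 (quadratic mean)."""
    mean = theta * x if model == 0 else theta * x**2
    return mean + rng.normal(0.0, sigma, size=x.shape)

def log_marginal(model, x, y, n_theta=500):
    """Monte Carlo marginal likelihood with a N(0, 1) prior on theta."""
    thetas = rng.normal(0.0, 1.0, size=n_theta)
    mu = np.outer(thetas, x if model == 0 else x**2)
    ll = -0.5 * np.sum((y - mu)**2, axis=1) / sigma**2
    m = ll.max()
    return m + np.log(np.mean(np.exp(ll - m)))

def discrimination_utility(x, n_sim=200):
    """Expected reduction in model-indicator entropy for design x."""
    gains = []
    for _ in range(n_sim):
        model = int(rng.integers(2))          # equal prior model probabilities
        y = simulate(model, rng.normal(), x)
        lm = np.array([log_marginal(m, x, y) for m in (0, 1)])
        post = np.exp(lm - lm.max())
        post /= post.sum()
        h_post = -np.sum(post * np.log(post + 1e-12))
        gains.append(np.log(2.0) - h_post)    # prior entropy minus posterior
    return float(np.mean(gains))

# At x = 0 and x = 1 the two models predict identical means, so that design
# cannot discriminate; points where the models diverge are far more useful.
u_poor = discrimination_utility(np.array([0.0, 1.0]))
u_good = discrimination_utility(np.array([2.0, 3.0]))
```

A total entropy utility would add an analogous expected-gain term for the parameters within each model; the design comparison at the end shows why the utility surface depends so strongly on where observations are placed.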
Abstract:
Notched three-point bend (TPB) specimens were tested under crack mouth opening displacement (CMOD) control at a rate of 0.0004 mm/s, and the entire fracture process was simulated using a regular triangular two-dimensional lattice network only over the expected fracture process zone width. The rest of the beam specimen was discretised by a coarse triangular finite element mesh. The discrete grain structure of the concrete was generated assuming the grains to be spherical. The simulated load versus CMOD plots agreed reasonably well with the experimental results. Moreover, acoustic emission (AE) hits were recorded during the test and compared with the number of fractured lattice elements. The cumulative AE hits correlated well with the cumulative fractured lattice elements at all load levels, thus providing a useful means of predicting when micro-cracks form during the fracturing process, in both the pre-peak and post-peak regimes.
Abstract:
The line spectral frequency (LSF) of a causal finite-length sequence is a frequency at which the spectrum of the sequence annihilates, i.e. the magnitude spectrum has a spectral null. A causal finite-length sequence with (L + 1) samples having exactly L LSFs is referred to as an annihilating (AH) sequence. Using some spectral properties of finite-length sequences and some model parameters, we develop spectral decomposition structures, which are used to translate any finite-length sequence into an equivalent set of AH sequences defined by LSFs and some complex constants. This alternate representation of a finite-length sequence is referred to as its LSF model. For a given finite-length sequence, multiple LSF models can be obtained by varying the model parameters. In the time domain, the LSF model can be used to synthesize any arbitrary causal finite-length sequence in terms of its characteristic AH sequences. In the frequency domain, the LSF model can be used to obtain the spectral samples of the sequence as a linear combination of the spectra of its characteristic AH sequences. We also summarize the utility of the LSF model in practical discrete signal processing systems.
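The defining property of an AH sequence, that a length-(L + 1) sequence whose z-transform has all L zeros on the unit circle exhibits spectral nulls exactly at those frequencies, can be checked numerically. The null frequencies below are arbitrary illustrative choices, not values from the paper:

```python
import numpy as np

# Three chosen null frequencies (radians); any values in [0, 2*pi) work.
lsfs = np.array([0.5, 1.2, 2.4])

# Build the length-(L+1) sequence whose z-transform zeros are e^{j*lsf}.
seq = np.poly(np.exp(1j * lsfs))        # 4 (complex) samples, L = 3

def dtft(x, w):
    """Discrete-time Fourier transform of x evaluated at frequency w."""
    n = np.arange(len(x))
    return np.sum(x * np.exp(-1j * w * n))

null_mags = [abs(dtft(seq, w)) for w in lsfs]   # ~0 at every LSF
other_mag = abs(dtft(seq, 0.1))                 # nonzero away from the LSFs
```

Because the zeros are placed individually rather than in conjugate pairs, the sequence is complex-valued; a real AH sequence would pair each zero with its conjugate.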
Abstract:
The momentum balance of the linear-combination integral model for the transition zone is investigated for constant pressure flows. The imbalance is found to be small enough to be negligible for all practical purposes.
Abstract:
Variable cross-sectional area ducts are often used for attenuation at lower frequencies (of the order of the firing frequency), whereas concentric tube resonators provide attenuation at relatively higher frequencies. In this paper, a one-dimensional control volume analysis of conical concentric tube resonators is validated experimentally. The effects of mean flow and taper are investigated. The experimental setup is specially designed to measure the pressure transfer function in the form of the Level Difference, or Noise Reduction, across the test muffler. Reasonably good agreement is shown between the predicted and measured Noise Reduction values for incompressible mean flow as well as for a stationary medium.
Abstract:
Evaluating the parameters of the two-parameter fracture model, i.e. the critical stress intensity factor and the critical crack tip opening displacement, for Mode I fracture of plain concrete in a given test configuration and geometry requires considerable computational effort. A simple graphical method using normalized fracture parameters has been proposed for the three-point bend (3PB) notched specimen and the double-edge notched (DEN) specimen. A similar graphical method is proposed here to compute the maximum load-carrying capacity of a specimen from the critical fracture parameters for both 3PB and DEN configurations.
Abstract:
The two-phase thermodynamic (2PT) model is used to determine the absolute entropy and energy of carbon dioxide over a wide range of conditions from molecular dynamics (MD) trajectories. The 2PT method determines thermodynamic properties by applying the proper statistical mechanical partition function to the normal modes of a fluid. The vibrational density of states (DoS), obtained from the Fourier transform of the velocity autocorrelation function, converges quickly, allowing the free energy, entropy, and other thermodynamic properties to be determined from short 20-ps MD trajectories. Anharmonic effects in the vibrations are accounted for by the broadening of the normal modes into bands from sampling the velocities over the trajectory. The low-frequency diffusive modes, which lead to a finite DoS at zero frequency, are accounted for by treating the DoS as a superposition of gas-phase and solid-phase components (the two phases). This analytical decomposition of the DoS allows an evaluation of the properties contributed by different types of molecular motion. We show that the 2PT analysis leads to accurate predictions of the entropy and energy of CO2 over a wide range of conditions (from the triple point to the critical point, for both the vapor and liquid phases along the saturation line). This allows the equation of state of CO2 to be determined, limited only by the accuracy of the force field. We also validated that the 2PT entropy agrees with that determined from thermodynamic integration, while requiring only a fraction of the computation time. A complication for CO2 is that its equilibrium configuration is linear, which would imply only two rotational modes; during the dynamics, however, the molecule is never exactly linear, so a third mode arises from rotation about the molecular axis. In this work, we show how to treat such linear molecules within the 2PT framework.
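The DoS pipeline described above (velocity autocorrelation, then Fourier transform) can be sketched on a synthetic one-dimensional velocity trace; the mode frequency, damping time, noise level, and timestep below are invented stand-ins for real MD output:

```python
import numpy as np

rng = np.random.default_rng(1)

dt = 1e-15                      # 1 fs timestep (assumed)
n = 4096
t = np.arange(n) * dt
f0 = 2.0e13                     # a 20 THz "vibrational" mode (assumed)

# Synthetic velocity trace: damped oscillation plus thermal-like noise
v = np.cos(2 * np.pi * f0 * t) * np.exp(-t / 2e-13) + 0.1 * rng.normal(size=n)

# Velocity autocorrelation function, unbiased estimate over available lags
vacf = np.correlate(v, v, mode="full")[n - 1:] / np.arange(n, 0, -1)

# Density of states from the Fourier transform of the VACF
dos = np.abs(np.fft.rfft(vacf))
freqs = np.fft.rfftfreq(n, d=dt)

# The dominant finite-frequency peak sits near the injected mode frequency;
# the zero-frequency bin (the diffusive component in 2PT) is skipped here.
peak = freqs[np.argmax(dos[1:]) + 1]
```

In the full 2PT treatment this spectrum would then be split into its gas-like (diffusive) and solid-like parts before applying the respective partition functions; the sketch stops at recovering the vibrational peak.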
Abstract:
We report our studies of the linear and nonlinear rheology of aqueous solutions of the surfactant cetyl trimethylammonium tosylate (CTAT) with varying amounts of sodium chloride (NaCl). The CTAT concentration is fixed at 42 mM, and the salt concentration is varied between 0 and 120 mM. On increasing the salt (NaCl) concentration, we see three distinct regimes in the zero-shear viscosity and the high-frequency plateau modulus data. In regime I, the zero-shear viscosity shows a weak increase with salt concentration due to enhanced micellar growth. The decrease in the zero-shear viscosity with salt concentration in regimes II and III can be explained in terms of intermicellar branching. The most intriguing feature of our data, however, is the anomalous behavior of the high-frequency plateau modulus in regime II (0.12 ≤ [NaCl]/[CTAT] ≤ 1.42). In this regime, the plateau modulus increases with NaCl concentration. This is highly interesting, since the correlation length of concentration fluctuations, and hence the plateau modulus G(0), is not expected to change appreciably in the semidilute regime. We propose to explain the changes in regime II in terms of a possible unbinding of the organic counterions (tosylate) from the CTA(+) surfaces on the addition of NaCl. In the nonlinear flow curves of the samples with high salt content, significant deviations from the predictions of the Giesekus model for entangled micelles are observed.
Abstract:
Climate change in response to a change in external forcing can be understood in terms of a fast response to the imposed forcing and a slow feedback associated with surface temperature change. Previous studies have investigated the characteristics of the fast response and slow feedback for different forcing agents. Here we examine to what extent the fast response and slow feedback derived from time-mean results of climate model simulations can be used to infer the total climate change. To this end, we develop a multivariate regression model of climate change, in which the change in a climate variable is represented by a linear combination of its sensitivity to CO2 forcing, its sensitivity to solar forcing, and the change in global mean surface temperature. We derive the parameters of the regression model using time-mean results from a set of HadCM3L step-forcing simulations, and then use the regression model to emulate HadCM3L-simulated transient climate change. Our results show that the regression model emulates well the HadCM3L-simulated temporal evolution and spatial distribution of climate change, including surface temperature, precipitation, runoff, soil moisture, cloudiness, and radiative fluxes, under transient CO2 and/or solar forcing scenarios. Our findings suggest that the temporal and spatial patterns of the total change in the climate variables considered here are well represented by the sum of the fast response and slow feedback. Furthermore, using a simple 1-D heat-diffusion climate model, we show that the temporal and spatial characteristics of climate change under transient forcing scenarios can be emulated well using information from step-forcing simulations alone.
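The regression emulator has a very simple skeleton: fit sensitivities on a few step-forcing states, then apply them to a transient scenario. The toy sketch below illustrates this; the coefficients, "step-forcing" states, and transient scenario are all invented numbers, not HadCM3L output:

```python
import numpy as np

rng = np.random.default_rng(2)

# Invented "true" sensitivities of some climate variable to CO2 forcing,
# solar forcing, and global mean surface temperature change.
a, b, c = 0.8, 0.3, 1.5

def response(f_co2, f_sol, dT):
    return a * f_co2 + b * f_sol + c * dT

# "Step-forcing" training states: columns are (CO2 forcing, solar forcing,
# global mean temperature change); the values are arbitrary.
X = np.array([[1.0, 0.0, 0.2],
              [0.0, 1.0, 0.1],
              [1.0, 1.0, 0.5],
              [2.0, 0.0, 0.9]])
y = response(X[:, 0], X[:, 1], X[:, 2]) + 0.01 * rng.normal(size=4)

# Least-squares fit of the regression coefficients
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Emulate a "transient" scenario: ramping CO2, a weak solar ramp, warming
s = np.linspace(0.0, 1.0, 50)
f_co2, f_sol, dT = 2.0 * s, 0.5 * s, 1.2 * s
emulated = coef[0] * f_co2 + coef[1] * f_sol + coef[2] * dT
truth = response(f_co2, f_sol, dT)
```

In the paper the same fit is done per grid point and per variable, which is what lets the emulator reproduce spatial patterns as well as time series.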
Abstract:
The Linear Ordering Problem (LOP) is a popular combinatorial optimisation problem which has been extensively addressed in the literature. In spite of its popularity, however, little is known about the characteristics of this problem. This paper studies a procedure to extract static information from an instance of the problem, and proposes a method to incorporate the obtained knowledge in order to improve the performance of local search-based algorithms. The procedure identifies the positions at which indexes cannot generate local optima for the insert neighbourhood, and hence cannot generate globally optimal solutions. This information is then used to define a restricted insert neighbourhood that discards the insert operations moving indexes to positions where optimal solutions cannot be generated. In order to measure the efficiency of the proposed restricted insert neighbourhood, two state-of-the-art algorithms for the LOP that include local search procedures have been modified. The experiments conducted confirm that the restricted versions of the algorithms systematically outperform the classical designs. The statistical tests included in the experimentation report significant differences in all cases, which validates the efficiency of our proposal.
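For concreteness, here is a minimal sketch of the unrestricted insert-neighbourhood local search that such position restrictions would prune; the instance is a small random matrix, not a benchmark LOP instance:

```python
import numpy as np

rng = np.random.default_rng(3)

def lop_value(B, perm):
    """LOP objective: sum of entries above the diagonal under ordering `perm`."""
    P = B[np.ix_(perm, perm)]
    return float(np.triu(P, k=1).sum())

def insert_local_search(B, perm):
    """Best-improvement local search over the insert neighbourhood:
    remove the index at position i and reinsert it at position j."""
    perm = list(perm)
    best = lop_value(B, perm)
    improved = True
    while improved:
        improved = False
        for i in range(len(perm)):
            for j in range(len(perm)):
                if i == j:
                    continue
                cand = perm.copy()
                cand.insert(j, cand.pop(i))     # the insert move (i -> j)
                v = lop_value(B, cand)
                if v > best:
                    best, perm, improved = v, cand, True
    return perm, best

n = 8
B = rng.random((n, n))
start = list(rng.permutation(n))
perm, val = insert_local_search(B, start)   # val >= starting objective
```

The restriction proposed in the paper would simply skip the (i, j) pairs whose target positions are known a priori to be incapable of producing optima, shrinking the inner loops without losing solution quality.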
Abstract:
Beam lattice-type models, such as the Euler-Bernoulli (or Timoshenko) beam lattice and the generalized beam (GB) lattice, have proved very effective in simulating failure processes in concrete and rock due to their simplicity and easy implementation. However, these existing lattice models account only for tensile failures, so they may not be applicable to the simulation of failure behaviour under compressive states. The main aim of this paper is to incorporate the Mohr-Coulomb failure criterion, which is widely used for many kinds of materials, into the GB lattice procedure. The improved GB lattice procedure is capable of modeling both element failures and the contact/separation of cracked elements. Numerical examples show its effectiveness in simulating compressive failures. Furthermore, the influences of lateral confinement, friction angle, loading platen stiffness, and the inclusion of aggregates on the failure processes are analyzed in detail.
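At the element level, the Mohr-Coulomb criterion is a one-line check; a minimal sketch follows, with illustrative (not paper-derived) cohesion, friction angle, and tensile cut-off values, and a sign convention taking compression as positive:

```python
import math

def mohr_coulomb_fails(sigma_n, tau, c=2.0e6, phi_deg=30.0, f_t=1.0e6):
    """Return True when an element's stress state violates the Mohr-Coulomb
    envelope |tau| >= c + sigma_n * tan(phi), with a simple tension cut-off
    at strength f_t (compression positive, so tension means sigma_n < 0)."""
    if -sigma_n > f_t:                  # tensile failure branch
        return True
    return abs(tau) >= c + sigma_n * math.tan(math.radians(phi_deg))

# Normal compression raises the shear capacity of an element:
unconfined = mohr_coulomb_fails(0.0, 2.5e6)    # shear exceeds cohesion alone
confined = mohr_coulomb_fails(5.0e6, 2.5e6)    # confinement adds frictional capacity
```

This captures why lateral confinement features in the paper's parameter study: the frictional term makes compressive shear capacity pressure-dependent, which a tension-only lattice model cannot represent.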
Abstract:
Predictions for a 75 × 205 mm semi-elliptical surface defect in the NESC-1 spinning cylinder test have been made using BS PD 6493:1991, the R6 procedure, non-linear cracked-body finite element analysis techniques, and the local approach to fracture. All the techniques agree in predicting ductile tearing near the inner surface of the cylinder followed by cleavage initiation. However, they differ in the amount of ductile tearing and in the exact location and time of any cleavage event. The amount of ductile tearing decreases with increasing sophistication of the analysis, owing to the drop in peak crack driving force and the more explicit consideration of constraint effects. The local approach predicts a high probability of cleavage in both the HAZ and the base material after 190 s, while the other predictions suggest that cleavage is unlikely in the HAZ due to constraint loss, but likely in the underlying base material. The timing of this event varies from ∼150 s for the R6 predictions to ∼250-300 s using non-linear cracked-body analysis.