5 results for test case optimization
in QSpace: Queen's University - Canada
Abstract:
Most essay rating research in language assessment has examined human raters’ essay rating as a cognitive process, thus overlooking or oversimplifying the interaction between raters and sociocultural contexts. Given that raters are social beings, their practices have social meanings and consequences. Hence, it is important to situate essay rating within its sociocultural context for a more meaningful understanding. Drawing on Engeström’s (1987, 2001) cultural-historical activity theory (CHAT) framework with a sociocultural perspective, this study reconceptualized essay rating as a socially mediated activity with both cognitive (individual raters’ goal-directed decision-making actions) and social layers (raters’ collective object-oriented essay rating activity in related settings). In particular, this study explored raters’ essay rating at one provincial rating centre in China within the context of a high-stakes university entrance examination, the National Matriculation English Test (NMET). This study adopted a multiple-method, multiple-perspective qualitative case study design. Think-aloud protocols, stimulated recalls, interviews, and documents served as the data sources. This investigation involved 25 participants in two settings (a rating centre and high schools), including rating centre directors, team leaders, NMET essay raters who were high school teachers, and the school principals and teaching colleagues of these essay raters. Data were analyzed using Strauss and Corbin’s (1990) open and axial coding techniques, and CHAT for data integration. The findings revealed the interaction between raters and the NMET sociocultural context. Such interaction can be understood through a surface structure (cognitive layer) and a deep structure (social layer) concerning how raters assessed NMET essays, where the surface structure reflected the “what” and the deep structure explained the “how” and “why” in raters’ decision-making.
The findings highlighted the roles of goals and rules in rater decision-making, rating tensions and raters’ solutions, and the relationship between essay rating and teaching. This study highlights the value of a sociocultural view in essay rating research, demonstrates CHAT as a sociocultural approach to investigating essay rating, and proposes a direction for future washback research on the effect of essay rating. It also provides support for NMET rating practices that can potentially bring positive washback to English teaching in Chinese high schools.
Abstract:
Flame retardants (FRs) are added to materials to enhance the fire safety level of readily combustible polymers. Although they have been purported to aid in preventing fires in some cases, they have also become a significant cause for concern given the vast data on environmental persistence and adverse health effects in humans and animals. Evidence since the 1980s has shown that Canadians, Americans, and Europeans have detectable levels of FRs in their bodies. North Americans in particular have high levels of these chemicals due to stringent flammability standards and the higher use of polybrominated diphenyl ethers (PBDEs) in North America as opposed to Europe. FRs have been detected in household dust, and some evidence suggests that TVs could be a significant source of exposure to FRs. It is imperative to revisit the flammability standard (UL94V) that allows for FR use in TV plastic materials by providing a risk-versus-benefit analysis to determine whether this standard provides a fire safety benefit and whether it plays a major role in FR exposure. This report first examined the history of televisions and the progression to the UL94V flammability test standard to understand why FRs were first added to polymers used in the manufacturing of TVs. This was found to be due to fire hazards resulting from the use of plastic materials in cathode-ray tube (CRT) TVs, which had an “instant-on” feature and high voltages and operating temperatures.
In providing a risk-versus-benefit analysis, this paper argues that 1) based on a market survey, the current flammability test standard (UL94V) is outdated and lacks relevance to current technology, as flat, thin, energy-efficient liquid crystal displays (LCDs) now dominate over the traditionally used heavy, bulky, and energy-intensive CRTs; 2) FRs do not impart fire safety benefits, given the lack of a valid fire safety concern (such as reduced internal and external ignition and fire hazard) and the lack of valid fire data and hazard for television fires in general; and finally 3) the standard is overly stringent, as it does not consider the risk of exposure to FRs in household dust arising from the proliferation and greater use of televisions in households. Therefore, this report argues that the UL94V standard has become trapped in history and needs to be updated, as it may play a major role in FR exposure.
Abstract:
Strategic supply chain optimization (SCO) problems are often modelled as two-stage optimization problems, in which the first-stage variables represent decisions on the development of the supply chain and the second-stage variables represent decisions on its operations. When uncertainty is explicitly considered, the problem becomes an intractable infinite-dimensional optimization problem, which is usually solved approximately via a scenario or a robust approach. This paper proposes a novel synergy of the scenario and robust approaches for strategic SCO under uncertainty. Two formulations are developed, namely, a naïve robust scenario formulation and an affinely adjustable robust scenario formulation. It is shown that both formulations can be reformulated into tractable deterministic optimization problems if the uncertainty is bounded by the infinity norm, and that the uncertain equality constraints can be reformulated into deterministic constraints without any assumption on the uncertainty region. Case studies of a classical farm planning problem and an energy and bioproduct SCO problem demonstrate the advantages of the proposed formulations over the classical scenario formulation. The proposed formulations can not only generate solutions with guaranteed feasibility or indicate the infeasibility of a problem, but also achieve optimal expected economic performance with smaller numbers of scenarios.
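The infinity-norm reformulation mentioned in the abstract can be illustrated with a minimal sketch (toy numbers, not taken from the paper): a linear constraint (a + δ)ᵀx ≤ b that must hold for every perturbation with ‖δ‖∞ ≤ ρ is equivalent to the single deterministic constraint aᵀx + ρ‖x‖₁ ≤ b, because the worst-case perturbation aligns each δᵢ with the sign of xᵢ.

```python
from itertools import product

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def robust_lhs(a, x, rho):
    # Deterministic reformulation of the worst case: a^T x + rho * ||x||_1
    return dot(a, x) + rho * sum(abs(xi) for xi in x)

def worst_case_lhs(a, x, rho):
    # Brute-force check: over the infinity-norm ball ||delta||_inf <= rho,
    # the worst case is attained at one of the ball's 2^n sign vertices.
    return max(dot([ai + rho * si for ai, si in zip(a, signs)], x)
               for signs in product([-1.0, 1.0], repeat=len(a)))
```

Comparing `robust_lhs` against `worst_case_lhs` on any point confirms that the deterministic left-hand side matches the exact worst case, which is what makes the robust counterpart tractable.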
Abstract:
This paper is concerned with the strategic optimization of a typical industrial chemical supply chain, which involves a material purchase and transportation network, several manufacturing plants with on-site material and product inventories, a product transportation network, and several regional markets. To address large uncertainties in customer demands at the different regional markets, a novel robust scenario formulation, recently developed by the authors, is tailored and applied to the strategic optimization. Case study results show that the robust scenario formulation works well for this real industrial supply chain system, outperforming the deterministic formulation and the classical scenario-based stochastic programming formulation by generating better expected economic performance and solutions that are guaranteed to be feasible for all uncertainty realizations. The robust scenario problem exhibits a decomposable structure that Benders decomposition can exploit for efficient solution, so the application of Benders decomposition to the strategic optimization is also discussed. The case study results show that Benders decomposition can reduce the solution time by almost an order of magnitude when the number of scenarios in the problem is large.
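The role of Benders decomposition here can be sketched on a toy two-stage problem (illustrative numbers and a grid-search stand-in for the master LP solver, not the paper's model): the master problem chooses the first-stage decision y against a growing set of optimality cuts, and each scenario subproblem returns a dual multiplier that defines a new cut; the loop stops when no scenario produces a violated cut.

```python
# Toy two-stage problem: min_y  c*y + sum_s p_s * Q_s(y),
# with recourse cost Q_s(y) = q * max(d_s - y, 0)  (cover shortfall at price q).
c, q = 1.0, 3.0
scenarios = [(0.3, 4.0), (0.5, 7.0), (0.2, 10.0)]  # (probability p_s, demand d_s)

def solve_master(cuts, grid):
    # Stand-in for an LP solver: minimize the cut approximation over a grid.
    def approx(y):
        return c * y + sum(p * max([0.0] + [lam * (d - y) for lam in cuts[s]])
                           for s, (p, d) in enumerate(scenarios))
    return min(grid, key=approx)

def true_cost(y):
    return c * y + sum(p * q * max(d - y, 0.0) for p, d in scenarios)

grid = [0.1 * k for k in range(0, 121)]
cuts = [[] for _ in scenarios]      # dual multipliers defining optimality cuts
for _ in range(10):                 # Benders loop
    y = solve_master(cuts, grid)
    new_cut = False
    for s, (p, d) in enumerate(scenarios):
        lam = q if d > y else 0.0   # optimal dual of the scenario subproblem
        if lam not in cuts[s]:
            cuts[s].append(lam)
            new_cut = True
    if not new_cut:                 # no violated cut: master solution is optimal
        break
```

Because each scenario subproblem is solved independently, the per-iteration work grows only linearly in the number of scenarios, which is the decomposable structure the abstract refers to.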
Abstract:
The problem of decentralized sequential detection is studied in this thesis, where local sensors are memoryless, receive independent observations, and operate without feedback from the fusion center. In addition to the traditional criteria of detection delay and error probability, we introduce a new constraint: the number of communications between local sensors and the fusion center. This metric reflects both the cost of establishing communication links and the overall energy consumption over time. A new formulation for communication-efficient decentralized sequential detection is proposed, in which the overall detection delay is minimized under constraints on both the error probabilities and the communication cost. Two types of problems are investigated based on the communication-efficient formulation: decentralized hypothesis testing and decentralized change detection. In the former case, an asymptotically person-by-person optimum detection framework is developed, where the fusion center performs a sequential probability ratio test based on dependent observations. The proposed algorithm utilizes not only the statistics reported by local sensors, but also the reporting times. The asymptotic relative efficiency of the proposed algorithm with respect to the centralized strategy is expressed in closed form. When the probabilities of false alarm and missed detection are close to one another, a reduced-complexity algorithm is proposed based on a Poisson arrival approximation. In addition, decentralized change detection with a communication cost constraint is also investigated. A person-by-person optimum change detection algorithm is proposed, where transmissions of sensing reports are modeled as a Poisson process. The optimum threshold value is obtained through dynamic programming. An alternative method with a simpler fusion rule is also proposed, where the threshold values in the algorithm are determined by a combination of sequential detection analysis and constrained optimization. In both the decentralized hypothesis testing and change detection problems, tradeoffs in parameter choices are investigated through Monte Carlo simulations.
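The test performed at the fusion center is, at its core, Wald's sequential probability ratio test. A minimal centralized sketch for Bernoulli observations (the parameters p0, p1 and the error targets are hypothetical, not the thesis's sensor model): accumulate the log-likelihood ratio sample by sample and stop as soon as it crosses either threshold derived from the target error probabilities.

```python
import math

def sprt(samples, p0=0.3, p1=0.7, alpha=0.05, beta=0.05):
    # Wald's SPRT for H0: p = p0 vs H1: p = p1 on Bernoulli samples.
    upper = math.log((1 - beta) / alpha)   # accept H1 when LLR >= upper
    lower = math.log(beta / (1 - alpha))   # accept H0 when LLR <= lower
    llr = 0.0
    for n, x in enumerate(samples, 1):
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", len(samples)
```

The detection delay is the stopping index n returned alongside the decision; in the decentralized setting of the thesis, the fusion center additionally exploits when sensors report, not just the reported statistics.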