4 results for Search engine results page
at Bucknell University Digital Commons - Pennsylvania - USA
Abstract:
This is the second part of a study investigating a model-based transient calibration process for diesel engines. The first part addressed the data requirements and data processing required for empirical transient emission and torque models. The current work focuses on modelling and optimization. The unexpected result of this investigation is that when trained on transient data, simple regression models perform better than more powerful methods such as neural networks or localized regression. This result has been attributed to extrapolation over data that have estimated rather than measured transient air-handling parameters. The challenges of detecting and preventing extrapolation using statistical methods that work well with steady-state data have been explained. The concept of constraining the distribution of statistical leverage relative to the distribution of the starting solution to prevent extrapolation during the optimization process has been proposed and demonstrated. Separate from the issue of extrapolation is preventing the search from being quasi-static. Second-order linear dynamic constraint models have been proposed to prevent the search from returning solutions that are feasible if each point were run at steady state, but which are unrealistic in a transient sense. Dynamic constraint models translate commanded parameters to actually achieved parameters that then feed into the transient emission and torque models. Combined model inaccuracies have been used to adjust the optimized solutions. To frame the optimization problem within reasonable dimensionality, the coefficients of commanded surfaces that approximate engine tables are adjusted during search iterations, each of which involves simulating the entire transient cycle. The resulting strategy, which differs from the corresponding manual calibration strategy and results in lower emissions and efficiency, is intended to improve rather than replace the manual calibration process.
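To make the two mechanisms described in this abstract concrete, the sketch below (Python, with illustrative names and parameters not taken from the paper) shows one way a leverage-based extrapolation check and a second-order dynamic constraint model could be set up: the first compares the statistical leverage distribution of a candidate trajectory against that of the starting solution, and the second, approximated here as two cascaded first-order lags, maps commanded parameter trajectories onto the values actually achieved before they feed the transient emission and torque models.

import numpy as np

def leverage(X_train, X_points):
    # Statistical leverage h_i = x_i^T (X^T X)^{-1} x_i of each row of
    # X_points, measured relative to the regression design matrix X_train.
    XtX_inv = np.linalg.pinv(X_train.T @ X_train)
    return np.einsum('ij,jk,ik->i', X_points, XtX_inv, X_points)

def within_leverage_envelope(X_train, X_candidate, X_start, q=0.95, slack=1.05):
    # Reject candidate solutions whose upper-tail leverage exceeds that of
    # the starting solution, i.e. constrain the leverage distribution so the
    # search cannot drift into extrapolation (q and slack are illustrative).
    h_cand = leverage(X_train, X_candidate)
    h_start = leverage(X_train, X_start)
    return np.quantile(h_cand, q) <= slack * np.quantile(h_start, q)

def dynamic_constraint(u_cmd, dt, tau1, tau2):
    # Second-order linear dynamic constraint model, built here as two
    # cascaded first-order lags (critically damped when tau1 == tau2).
    # It converts a commanded parameter trajectory into the trajectory
    # actually achieved over the transient cycle (requires dt < 2*min(tau)).
    u_cmd = np.asarray(u_cmd, dtype=float)
    y1 = np.empty_like(u_cmd)
    y2 = np.empty_like(u_cmd)
    y1[0] = y2[0] = u_cmd[0]
    for k in range(1, len(u_cmd)):
        y1[k] = y1[k - 1] + dt / tau1 * (u_cmd[k - 1] - y1[k - 1])
        y2[k] = y2[k - 1] + dt / tau2 * (y1[k - 1] - y2[k - 1])
    return y2  # achieved values fed to the transient emission/torque models

In such a scheme, both checks would run once per search iteration, after the entire transient cycle has been simulated with the candidate commanded surfaces.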
Abstract:
In 2011, researchers at Bucknell University and Illinois Wesleyan University compared the search efficacy of Serials Solutions Summon, EBSCO Discovery Service, Google Scholar and conventional library databases. Using a mixed-methods approach, qualitative and quantitative data were gathered on students' usage of these tools. Regardless of the search system, students exhibited a marked inability to effectively evaluate sources and a heavy reliance on default search settings. On the quantitative benchmarks measured by this study, the EBSCO Discovery Service tool outperformed the other search systems in almost every category. This article describes these results and makes recommendations for libraries considering these tools.
Abstract:
A shared code of connection arguably exists between two plays by Lope de Vega, El mayordomo de la duquesa de Amalfi and El perro del hortelano, and the work of Michel de Montaigne. Nevertheless, one cannot but ask: how can it be that in two works produced so close in time, the same situation is resolved so differently? Montaigne can be said to provide an answer in his Essays, explaining that a similar situation can produce wholly different results: how in the first, one is 'saved', and in the second, one is destroyed. One might imagine, too, that Belflor's countess and her ennobled secretary, who together sustain a lie in a society that lived by the lie, would have been likewise 'consoled' by a set of 'interlocking tropes and similitudes', in the words of Stephen Greenblatt, which linked two contemporary and complementary fashioners of human nature, Lope and Montaigne, in a discursive dialogue on how otherwise honest women and men were subject to the vice of lying in their process of self-fashioning, as well as potentially 'enslaved' by it.