977 results for Modeling complexity
Abstract:
Fruit drying is a process of removing moisture to preserve fruit by preventing microbial spoilage. It increases shelf life, reduces weight and volume (thereby minimizing packing, storage, and transportation costs), and enables storage of food under ambient conditions. However, it is a complex process that couples heat and mass transfer with changes in physical properties and shrinkage of the material. Against this background, the aim of this paper is to develop a mathematical model to simulate coupled heat and mass transfer during convective drying of fruit. The model can be used to predict the temperature and moisture distribution inside the fruit during drying. Two models were developed, considering shrinkage-dependent and temperature-dependent moisture diffusivity, and their results were compared. The governing heat and mass transfer equations were solved and a parametric study was carried out in COMSOL Multiphysics 4.3. The predicted results were validated against experimental data.
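A minimal numerical sketch of such a coupled model is given below, assuming an explicit 1-D finite-difference discretization of a fruit slab with an Arrhenius-type temperature-dependent moisture diffusivity. All geometry and material constants are illustrative assumptions, not values from the paper, which solves the governing equations in COMSOL Multiphysics instead.

```python
import numpy as np

# Hedged sketch (not the authors' COMSOL model): explicit 1-D finite
# differences for coupled heat and moisture transfer in a fruit slab with a
# temperature-dependent (Arrhenius) moisture diffusivity. All constants
# below are illustrative assumptions.

L = 5e-3                 # slab half-thickness (m), assumed
nx = 51
dx = L / (nx - 1)
dt = 1e-3                # time step (s), small enough for explicit stability
steps = 200_000          # ~200 s of simulated drying

alpha = 1.4e-7           # thermal diffusivity (m^2/s), assumed
D0, Ea, Rg = 2e-6, 30e3, 8.314   # Arrhenius diffusivity D = D0*exp(-Ea/(Rg*T))
h, k = 25.0, 0.5         # surface heat transfer coeff (W/m^2 K), conductivity (W/m K)
hm = 2e-7                # surface mass transfer coefficient (m/s), assumed
T_air, M_eq = 333.15, 0.05   # drying-air temperature (K), equilibrium moisture

T = np.full(nx, 298.15)  # initial temperature (K)
M = np.full(nx, 4.0)     # initial moisture (kg water per kg dry solid)

for _ in range(steps):
    D = D0 * np.exp(-Ea / (Rg * T))           # diffusivity follows local T
    M[1:-1] += dt * D[1:-1] * (M[2:] - 2 * M[1:-1] + M[:-2]) / dx**2
    T[1:-1] += dt * alpha * (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
    M[0], T[0] = M[1], T[1]                   # symmetry plane at x = 0
    beta = dx * hm / D[-1]                    # convective mass BC at surface
    M[-1] = (M[-2] + beta * M_eq) / (1 + beta)
    gamma = dx * h / k                        # convective heat BC at surface
    T[-1] = (T[-2] + gamma * T_air) / (1 + gamma)

print(f"centre moisture: {M[0]:.2f}, surface temperature: {T[-1]-273.15:.1f} C")
```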
Abstract:
Accepting that culture and language are interrelated in second language learning (SLL), SLL websites should be designed to integrate cultural aspects. Yet many SLL websites fail to do so and/or focus on language acquisition alone. This study identified three issues: (1) the anthropological models of culture most often adopted in cross-cultural web user interfaces have been applied only superficially; (2) web designers treat culture as something fixed that merely needs to be modeled into interface design elements; and therefore (3) there is a need for a communication framework between educators and design practitioners that can be utilized in web design processes. This paper discusses what anthropology can contribute to language learning, mediated through web design processes, and suggests a cultural user experience framework for web-based SLL, presenting an exemplary matrix. To evaluate the effectiveness of the framework, the key stakeholders (learners, teachers, and designers) participated in a case scenario-based evaluation. The results show strong potential for the framework to enhance effective communication and collaboration around cultural integration.
Abstract:
In this research we used inductive reasoning through design to understand how stakeholders in the Waterfall Way (New South Wales, Australia) perceive the relationships between themselves and the place they live in. This paper describes a collaborative design methodology used to surface information about local identities, which guided the regional branding exercise. The methodology is explicit about the uncertainties and complexities of the design process and of its reception system. As such, it aims to engage local stakeholders and experts in order to help elicit tacit knowledge and identify system patterns and trends that would likely not be visible if a top-down, expert-based process were used. Through collective design, local people were drawn together in search of a symbol to represent the meaning attached to their places/region in relation to sustainable tourism activity.
Abstract:
The purpose of this paper is to develop a second-moment closure with a near-wall turbulent pressure diffusion model for three-dimensional complex flows, and to evaluate the influence of the turbulent diffusion term on the prediction of detached and secondary flows. A complete turbulent diffusion model, including a near-wall turbulent pressure diffusion closure for the slow part, was developed based on the tensorial form of Lumley and incorporated into the re-calibrated wall-normal-free Reynolds-stress model of Gerolymos and Vallet. The proposed model was validated against several one-, two-, and three-dimensional complex flows.
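For orientation, and in standard incompressible notation rather than the compressible wall-normal-free formulation the paper actually employs, the exact Reynolds-stress transport equation shows where the modeled pressure-diffusion term sits alongside the viscous and triple-velocity diffusion terms:

```latex
% Exact incompressible Reynolds-stress transport equation (standard notation);
% the closure developed in the paper models the slow part of d_ij^(p).
\frac{\mathrm{D}\,\overline{u_i' u_j'}}{\mathrm{D}t}
  = P_{ij} + \phi_{ij} - \varepsilon_{ij}
  + \underbrace{\frac{\partial}{\partial x_k}\!\Big(\nu\,
      \frac{\partial \overline{u_i' u_j'}}{\partial x_k}\Big)}_{d_{ij}^{(\nu)}}
  \underbrace{-\,\frac{\partial \overline{u_i' u_j' u_k'}}{\partial x_k}}_{d_{ij}^{(u)}}
  \underbrace{-\,\frac{1}{\rho}\frac{\partial}{\partial x_k}
      \Big(\overline{p' u_i'}\,\delta_{jk}
         + \overline{p' u_j'}\,\delta_{ik}\Big)}_{d_{ij}^{(p)}}
```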
Abstract:
This paper proposes the use of Bayesian approaches with the cross likelihood ratio (CLR) as a criterion for speaker clustering within a speaker diarization system, using eigenvoice modeling techniques. The CLR has previously been shown to be an effective decision criterion for speaker clustering using Gaussian mixture models. Recently, eigenvoice modeling has become an increasingly popular technique, due to its ability to adequately represent a speaker based on sparse training data, as well as to provide an improved capture of differences in speaker characteristics. The integration of eigenvoice modeling into the CLR framework to capitalize on the advantage of both techniques has also been shown to be beneficial for the speaker clustering task. Building on that success, this paper proposes the use of Bayesian methods to compute the conditional probabilities in computing the CLR, thus effectively combining the eigenvoice-CLR framework with the advantages of a Bayesian approach to the diarization problem. Results obtained on the 2002 Rich Transcription (RT-02) Evaluation dataset show an improved clustering performance, resulting in a 33.5% relative improvement in the overall Diarization Error Rate (DER) compared to the baseline system.
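A rough sketch of the classical GMM-based CLR merging criterion, the starting point that the paper extends with eigenvoice and Bayesian techniques, is shown below; the model sizes, UBM configuration, and toy features are assumptions for illustration only:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Hedged sketch of the GMM-based cross likelihood ratio (CLR) used as a
# cluster-merging criterion in speaker diarization. Sizes and thresholds
# here are illustrative, not the paper's setup.

def avg_loglik(model, X):
    """Mean per-frame log-likelihood of features X under a fitted GMM."""
    return model.score(X)  # sklearn's score() is the per-sample mean

def clr(Xa, Xb, ubm, n_components=8):
    """CLR(a, b): how much better each cluster is explained by the other
    cluster's model than by the universal background model (UBM)."""
    gmm_a = GaussianMixture(n_components, covariance_type="diag").fit(Xa)
    gmm_b = GaussianMixture(n_components, covariance_type="diag").fit(Xb)
    return (avg_loglik(gmm_b, Xa) - avg_loglik(ubm, Xa)) + \
           (avg_loglik(gmm_a, Xb) - avg_loglik(ubm, Xb))

# toy usage: two 20-dim MFCC-like streams plus a UBM trained on everything
rng = np.random.default_rng(0)
Xa, Xb = rng.normal(0, 1, (500, 20)), rng.normal(0.5, 1, (400, 20))
ubm = GaussianMixture(16, covariance_type="diag").fit(np.vstack([Xa, Xb]))
print("CLR:", clr(Xa, Xb, ubm))   # merge the pair with the largest CLR
```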
Abstract:
Elaborated Intrusion theory (EI theory; Kavanagh, Andrade, & May, 2005) posits two main cognitive components in craving: associative processes that lead to intrusive thoughts about the craved substance or activity, and elaborative processes supporting mental imagery of the substance or activity. We used a novel visuospatial task to test the hypothesis that visual imagery plays a key role in craving. Experiment 1 showed that spending 10 min constructing shapes from modeling clay (plasticine) reduced participants' craving for chocolate compared with spending 10 min 'letting your mind wander'. Increasing the load on verbal working memory using a mental arithmetic task (counting backwards by threes) did not reduce craving further. Experiment 2 compared effects on craving of a simpler verbal task (counting by ones) and clay modeling. Clay modeling reduced overall craving strength and strength of craving imagery, and reduced the frequency of thoughts about chocolate. The results are consistent with EI theory, showing that craving is reduced by loading the visuospatial sketchpad of working memory but not by loading the phonological loop. Clay modeling might be a useful self-help tool to help manage craving for chocolate, snacks and other foods.
Abstract:
Flood-related scientific and community-based data are rarely systematically collected and analysed in the Philippines. Over the last decades the Pagsangaan River Basin, Leyte, has experienced several flood events, yet documentation describing flood characteristics such as extent, duration, or height is close to non-existent. To address this issue, computerized flood modelling was used to reproduce past events for which data were available for at least partial calibration and validation. The model was also used to provide scenario-based predictions under A1B climate change assumptions for the area. The most important input for flood modelling is a Digital Elevation Model (DEM) of the river basin. No accurate topographic maps or Light Detection And Ranging (LIDAR)-generated data are available for the Pagsangaan River, so the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) Global Digital Elevation Map (GDEM), Version 1, was chosen as the DEM. Although its horizontal spatial resolution of 30 m is adequate, the dataset contains substantial vertical errors. These were identified, different correction methods were tested, and the resulting DEM was used for flood modelling. These data were combined with cross-sections at various strategic locations of the river network, meteorological records, river water levels, and current velocities to develop the 1D-2D flood model. SOBEK was used as the modelling software to create different rainfall scenarios, including historic flooding events. Owing to the lack of scientific data for verifying model quality, interviews with local stakeholders served as the gauge for judging the quality of the generated flood maps. According to interviewees, the model reflects reality more accurately than previously available flood maps. The resulting flood maps are now used by the operations centre of a local flood early warning system for warnings and evacuation alerts. Furthermore, these maps can serve as a basis for identifying flood hazard areas for spatial land use planning purposes.
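As an illustration of the kind of vertical-error correction involved, the sketch below applies two generic fixes (median filtering of single-cell spikes and removal of a constant bias against ground-control points) to a synthetic DEM tile; these are common techniques, not necessarily the specific correction methods the authors tested:

```python
import numpy as np
from scipy.ndimage import median_filter

# Hedged sketch: two generic corrections for vertical noise in an ASTER
# GDEM tile before flood modelling. Illustrative only; the paper's own
# correction methods are not reproduced here.

def correct_dem(dem, ref_points=None, kernel=5):
    """dem: 2-D array of elevations (m). ref_points: optional list of
    (row, col, true_elevation) ground-control points used to remove a
    constant vertical bias."""
    # 1) suppress single-cell spikes/pits, a typical ASTER GDEM v1 artefact
    smoothed = median_filter(dem, size=kernel)
    # 2) remove the mean vertical bias against ground-control elevations
    if ref_points:
        bias = np.mean([smoothed[r, c] - z for r, c, z in ref_points])
        smoothed = smoothed - bias
    return smoothed

# toy usage: a noisy synthetic tile with a known +8 m bias
rng = np.random.default_rng(1)
truth = np.add.outer(np.linspace(0, 50, 100), np.linspace(0, 20, 100))
dem = truth + 8.0 + rng.normal(0, 3, truth.shape)
gcps = [(10, 10, truth[10, 10]), (80, 40, truth[80, 40])]
fixed = correct_dem(dem, gcps)
print("residual bias (m):", float(np.mean(fixed - truth)))
```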
Abstract:
Readily accepted knowledge regarding crash causation is consistently omitted from efforts to model, and subsequently understand, motor vehicle crash occurrence and its contributing factors. For instance, distracted and impaired driving accounts for a significant proportion of crash occurrence, yet is rarely modeled explicitly. In addition, spatially allocated influences such as local law enforcement efforts, proximity to bars and schools, and roadside chronic distractions (advertising, pedestrians, etc.) play a role in contributing to crash occurrence, yet are routinely absent from crash models. By and large, these well-established omitted effects are simply assumed to contribute to model error, with the predominant focus on modeling the engineering and operational effects of transportation facilities (e.g. AADT, number of lanes, speed limits, lane width, etc.). The typical analytical approach, with a variety of statistical enhancements, has been to model crashes that occur at system locations as negative binomial (NB) distributed events arising from a single underlying crash-generating process. These models and their statistical kin dominate the literature; however, this paper argues that they fail to capture the underlying complexity of motor vehicle crash causes, and thus thwart deeper insights regarding crash causation and prevention. The paper first describes hypothetical scenarios that collectively illustrate why current models mislead highway safety researchers and engineers. It is argued that current model shortcomings are significant and will lead to poor decision-making. Exploiting the current state of knowledge of crash causation, crash counts are postulated to arise from three processes: observed network features, unobserved spatial effects, and 'apparent' random influences that largely reflect the behavioral influences of drivers. It is argued, furthermore, that these three processes can in theory be modeled separately to gain deeper insight into crash causes, and that the resulting model represents a more realistic depiction of reality than state-of-practice NB regression. An admittedly imperfect empirical model that mixes three independent crash occurrence processes is shown to outperform the classical NB model. The questioning of current modeling assumptions and the implications of the latent mixture model for current practice are the most important contributions of this paper, with an initial but rather vulnerable attempt to model the latent mixtures as a secondary contribution.
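The sketch below illustrates the core idea of a latent mixture of crash-generating processes against a single NB fit, using a generic three-component Poisson mixture estimated by maximum likelihood; the components, data, and fitting details are illustrative assumptions, not the authors' specification:

```python
import numpy as np
from scipy.stats import nbinom, poisson
from scipy.optimize import minimize

# Hedged sketch: a generic three-component finite mixture of crash-count
# processes vs. a single negative binomial (NB), both fit by maximum
# likelihood on synthetic site counts.

rng = np.random.default_rng(2)
y = np.concatenate([rng.poisson(0.3, 600),      # mostly-safe sites
                    rng.poisson(2.0, 300),      # typical sites
                    rng.poisson(8.0, 100)])     # high-risk sites

def nb_negloglik(params):
    r = np.exp(params[0])                        # keep r > 0
    p = 1 / (1 + np.exp(-params[1]))             # keep 0 < p < 1
    return -nbinom.logpmf(y, r, p).sum()

def mix_negloglik(params):
    lams = np.exp(params[:3])                    # three Poisson rates
    w = np.append(np.exp(params[3:]), 1.0)       # softmax-style weights
    w /= w.sum()
    dens = sum(wk * poisson.pmf(y, lk) for wk, lk in zip(w, lams))
    return -np.log(dens + 1e-300).sum()

nb_fit = minimize(nb_negloglik, x0=[0.0, 0.0], method="Nelder-Mead")
mix_fit = minimize(mix_negloglik,
                   x0=[np.log(0.5), np.log(2.0), np.log(6.0), 0.0, 0.0],
                   method="Nelder-Mead")
print("NB  loglik:", -nb_fit.fun)
print("mix loglik:", -mix_fit.fun)   # the latent mixture typically fits better
```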
Abstract:
Underlying all assessments are human judgements regarding the quality of students' understandings. Despite their ubiquity, those judgements are conceptually elusive. The articles selected for inclusion in this issue explore the complexity of judgement practice, raising critical questions that challenge existing views and accepted policy and practice.
Abstract:
Student performance on examinations is influenced by the level of difficulty of the questions. It therefore seems reasonable to propose that assessment of the difficulty of exam questions could be used to gauge the level of skills and knowledge expected at the end of a course. This paper reports the results of a study investigating the difficulty of exam questions using a subjective assessment of difficulty and a purpose-built exam question complexity classification scheme. The scheme, devised for exams in introductory programming courses, assesses the complexity of each question using six measures: external domain references, explicitness, linguistic complexity, conceptual complexity, length of code involved in the question and/or answer, and intellectual complexity (Bloom level). We apply the scheme to 20 introductory programming exam papers from five countries and find substantial variation across the exams for all measures. Most exams include a mix of questions of low, medium, and high difficulty, although seven of the 20 have no questions of high difficulty. All of the complexity measures correlate with the assessment of difficulty, indicating that the difficulty of an exam question relates to each of these more specific measures. We discuss the implications of these findings for the development of measures to assess learning standards in programming courses.
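A small sketch of the kind of correlation analysis described, using rank correlation between each complexity measure and a difficulty rating; the question scores below are fabricated placeholders purely to show the computation, not data from the study:

```python
import numpy as np
from scipy.stats import spearmanr

# Hedged sketch: Spearman rank correlation of each complexity measure
# against a subjective difficulty rating, over a set of exam questions.
# All values here are fabricated placeholders.

measures = ["external_refs", "explicitness", "linguistic", "conceptual",
            "code_length", "bloom_level"]
rng = np.random.default_rng(3)
difficulty = rng.integers(1, 6, 40)                 # 40 questions, rated 1-5
scores = {m: difficulty + rng.normal(0, 1.5, 40) for m in measures}

for m in measures:
    rho, p = spearmanr(scores[m], difficulty)
    print(f"{m:>14}: rho = {rho:+.2f} (p = {p:.3f})")
```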
Abstract:
According to Karl Popper, widely regarded as one of the greatest philosophers of science in the 20th century, falsifiability is the primary characteristic that distinguishes scientific theories from ideologies or dogma. For example, for creationism to be treated in schools as a scientific theory comparable to modern theories of evolution, its advocates would need to become engaged in the generation of falsifiable hypotheses, and would need to abandon the practice of discouraging questioning and inquiry. Ironically, scientific theories themselves are accepted or rejected based on a principle that might be called survival of the fittest. So, for healthy theories of development to emerge, four Darwinian functions should operate: (a) variation, avoiding orthodoxy and encouraging divergent thinking; (b) selection, submitting all assumptions and innovations to rigorous testing; (c) diffusion, encouraging the shareability of new and/or viable ways of thinking; and (d) accumulation, encouraging the reusability of viable aspects of productive innovations.
Abstract:
This article argues for an interdisciplinary approach to mathematical problem solving at the elementary school level, one that draws upon the engineering domain. A modeling approach, using engineering model-eliciting activities, might provide a rich source of meaningful situations that capitalize on and extend students' existing mathematical learning. The study reports on the developments of 48 twelve-year-old students who worked on the Bridge Design activity. Results revealed that young students, even before formal instruction, have the capacity to deal with complex interdisciplinary problems. A number of students created quite appropriate models by developing the necessary mathematical constructs to solve the problem. Students' difficulties in mathematizing the problem and in revising and documenting their models are presented and analysed, followed by a discussion of the appropriateness of a modeling approach as a means of introducing complex problems to elementary school students.