886 results for Assessing creative learning
Abstract:
Includes bibliography
Abstract:
Includes bibliography
Abstract:
In order to achieve a better understanding of multiple infections and long latency in the dynamics of Mycobacterium tuberculosis infection, we analyze a simple model. Since backward bifurcation is well documented in the literature with respect to the model we are considering, our aim is to illustrate this behavior in terms of the range of variations of the model's parameters. We show that backward bifurcation disappears (and forward bifurcation occurs) if: (a) the latent period is shortened below a critical value; and (b) the rates of super-infection and re-infection are decreased. This result shows that among immunosuppressed individuals, super-infection and/or changes in the latent period could act to facilitate the onset of tuberculosis. When we decrease the incubation period below the critical value, we obtain the curve of the incidence of tuberculosis following forward bifurcation; however, this curve envelops that obtained from the backward bifurcation diagram.
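The backward-versus-forward behaviour described above can be illustrated with a minimal numerical sketch. The quadratic below is a generic toy equilibrium equation for reinfection models, with a hypothetical coefficient B whose sign switches between the two regimes; it is an illustration of the phenomenon, not the paper's actual model:

```python
import math

def endemic_equilibria(R0, A=1.0, B=-0.8):
    """Positive roots of a toy equilibrium equation A*I^2 + B*I + (1 - R0) = 0.

    B < 0 mimics strong super-infection/re-infection with a long latent
    period (backward bifurcation); B > 0 mimics the forward case. These
    coefficients are illustrative assumptions only.
    """
    c = 1.0 - R0
    disc = B * B - 4.0 * A * c
    if disc < 0:
        return []
    roots = ((-B - math.sqrt(disc)) / (2.0 * A),
             (-B + math.sqrt(disc)) / (2.0 * A))
    return [r for r in roots if r > 0]

# Backward regime: two endemic equilibria coexist with the disease-free
# state even though R0 < 1.
print(endemic_equilibria(0.9, B=-0.8))  # two positive roots
# Forward regime: no endemic equilibrium below the threshold R0 = 1.
print(endemic_equilibria(0.9, B=0.8))   # []
```

Shrinking the reinfection terms (driving B positive) removes the sub-threshold endemic branch, which is the qualitative result the abstract reports for shortened latency and reduced super-/re-infection rates.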
Abstract:
Background Educational computer games are examples of computer-assisted learning objects and represent an educational strategy of growing interest. Given the changes in the digital world over recent decades, students of the current generation expect technology to be used to advance their learning, requiring a shift from traditional passive learning methodologies to active, multisensory, experiential learning methodologies. The objective of this study was to compare a computer game-based learning method with a traditional learning method, regarding learning gains and knowledge retention, as a means of teaching head and neck Anatomy and Physiology to Speech-Language and Hearing Pathology undergraduate students. Methods Students were randomized to one of the two learning methods, and the data analyst was blinded to which method each student had received. Students’ prior knowledge (i.e. before undergoing the learning method), short-term knowledge retention and long-term knowledge retention (i.e. six months after undergoing the learning method) were assessed with a multiple-choice questionnaire. Students’ performance was compared across the three assessment moments, both for the mean total score and for separate mean scores on the Anatomy questions and the Physiology questions. Results Students who received the game-based method performed better in the post-test assessment only on the Anatomy questions. Students who received the traditional lecture performed better in both the post-test and the long-term post-test on both the Anatomy and Physiology questions. Conclusions The game-based learning method is comparable to the traditional learning method in terms of general and short-term gains, while the traditional lecture still seems more effective for improving students’ short- and long-term knowledge retention.
Abstract:
Background Where malaria endemicity is low, control programmes need increasingly sensitive tools for monitoring malaria transmission intensity (MTI) and for better defining health priorities. A cross-sectional survey was conducted in a low-endemicity area of the Peruvian north-western coast to assess the MTI using both molecular and serological tools. Methods Epidemiological, parasitological and serological data were collected from 2,667 individuals in three settlements of Bellavista district in May 2010. Parasite infection was detected using microscopy and polymerase chain reaction (PCR). Antibodies to Plasmodium vivax merozoite surface protein-1₁₉ (PvMSP1₁₉) and to Plasmodium falciparum glutamate-rich protein (PfGLURP) were detected by ELISA. Risk factors for exposure to malaria (seropositivity) were assessed by multivariate survey logistic regression models. Age-specific antibody prevalence for both P. falciparum and P. vivax was analysed using a previously published catalytic conversion model based on maximum likelihood for generating seroconversion rates (SCR). Results The overall parasite prevalence by microscopy and PCR was extremely low: 0.3 and 0.9%, respectively, for P. vivax, and 0 and 0.04%, respectively, for P. falciparum, while seroprevalence was much higher: 13.6% for P. vivax and 9.8% for P. falciparum. Settlement, age and occupation as a moto-taxi driver during the previous year were significantly associated with P. falciparum exposure, while age and distance to the water drain were associated with P. vivax exposure. Likelihood ratio tests supported age-seroprevalence curves with two SCRs for both P. vivax and P. falciparum, indicating significant changes in the MTI over time. The SCR for PfGLURP was 19-fold lower after 2002 than before (λ1 = 0.022 versus λ2 = 0.431), and the SCR for PvMSP1₁₉ was four-fold higher after 2006 than before (λ1 = 0.024 versus λ2 = 0.006).
Conclusion Combining molecular and serological tools considerably enhanced the capacity to detect current and past exposure to malaria infections and related risk factors in this very-low-endemicity area. This allowed for an improved characterization of the current human reservoir of infections, largely hidden and heterogeneous, and provided insights into recent changes in species-specific MTIs. This approach will be of key importance for evaluating and monitoring future malaria elimination strategies.
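The change-point seroprevalence analysis described above can be sketched with a simplified catalytic conversion model. The function below neglects seroreversion (the study fits a reversible catalytic model by maximum likelihood) and assumes, for illustration, a change point four years before the 2010 survey, using the PvMSP1₁₉ rates quoted in the abstract:

```python
import math

def seroprev(age, lam_old, lam_recent, tau):
    """Expected seropositive fraction at a given age under a catalytic
    conversion model whose seroconversion rate (SCR) changed from
    lam_old to lam_recent tau years before the survey.

    Seroreversion is neglected here for simplicity; the study's actual
    model accounts for it.
    """
    if age <= tau:
        # Whole life spent under the recent transmission regime.
        return 1.0 - math.exp(-lam_recent * age)
    # Older individuals: lam_old for (age - tau) years, then lam_recent.
    return 1.0 - math.exp(-lam_recent * tau - lam_old * (age - tau))

# PvMSP1-19 rates from the abstract, assumed change point tau = 4 years
# (2006 relative to the 2010 survey):
child = seroprev(4, lam_old=0.006, lam_recent=0.024, tau=4)
adult = seroprev(40, lam_old=0.006, lam_recent=0.024, tau=4)
```

Fitting one curve per regime to the observed age-seroprevalence data, and comparing against a single-SCR fit with a likelihood ratio test, is what supports the two-rate model reported in the Results.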
Abstract:
Background Over the last years, a number of researchers have investigated how to improve the reuse of crosscutting concerns. New possibilities have emerged with the advent of aspect-oriented programming, and many frameworks have been designed around the abstractions provided by this new paradigm. We call this type of framework a Crosscutting Framework (CF), as it usually encapsulates a generic and abstract design of one crosscutting concern. However, most of the proposed CFs employ white-box strategies in their reuse process, requiring mainly two technical skills: (i) knowing syntax details of the programming language used to build the framework, and (ii) being aware of the architectural details of the CF and its internal nomenclature. Another problem is that the reuse process can only begin once development reaches the implementation phase, preventing it from starting earlier. Method To address these problems, we present in this paper a model-based approach for reusing CFs which shields application engineers from technical details, letting them concentrate on what the framework really needs from the application under development. To support our approach, two models are proposed: the Reuse Requirements Model (RRM) and the Reuse Model (RM). The former describes the framework structure, and the latter supports the reuse process. As soon as the application engineer has filled in the RM, the reuse code can be generated automatically. Results We also present the results of two comparative experiments using two versions of a Persistence CF: the original one, whose reuse process is based on writing code, and the new one, which is model-based. The first experiment evaluated productivity during the reuse process, and the second evaluated the effort of maintaining applications developed with both CF versions.
The results show a 97% improvement in productivity; however, little difference was perceived in the effort required to maintain the applications. Conclusion Using the approach presented here, we conclude that: (i) it is possible to automate the instantiation of CFs, and (ii) developer productivity is improved when a model-based instantiation approach is used.
Abstract:
Brain functions such as learning, orchestrating locomotion, memory recall, and processing information all require glucose as a source of energy. During these functions, the glucose concentration decreases as glucose is consumed by brain cells. By measuring this drop in concentration, it is possible to determine which parts of the brain are used during specific functions and, consequently, how much energy the brain requires to complete the function. One way to measure in vivo brain glucose levels is with a microdialysis probe. The drawback of this analytical procedure, as with many steady-state fluid flow systems, is that the probe fluid will not reach equilibrium with the brain fluid. Therefore, brain concentration is inferred by taking samples at multiple inlet glucose concentrations and finding a point of convergence. The goal of this thesis is to create a three-dimensional, time-dependent, finite element representation of the brain-probe system in COMSOL 4.2 that describes the diffusion and convection of glucose. Once validated with experimental results, this model can then be used to test parameters that experiments cannot access. When simulations were run using published values for physical constants (i.e. diffusivities, density and viscosity), the resulting model glucose concentrations were within the error of the experimental data. This verifies that the model is an accurate representation of the physical system. In addition to accurately describing the experimental brain-probe system, the model I created is able to show the validity of zero-net-flux for a given experiment. A useful discovery is that the slope of the zero-net-flux line depends on the perfusate flow rate and diffusion coefficients, but is independent of brain glucose concentration. The model was simplified with the realization that the perfusate is at thermal equilibrium with the brain throughout the active region of the probe.
This allowed for the assumption that all model parameters are temperature independent. The time to steady-state for the probe is approximately one minute. However, the signal degrades in the exit tubing due to Taylor dispersion, on the order of two minutes for two meters of tubing. Given an analytical instrument requiring a five μL aliquot, the smallest brain process measurable for this system is 13 minutes.
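The point-of-convergence inference mentioned above (the zero-net-flux method) amounts to a linear regression: net flux is plotted against inlet concentration, and the x-intercept of the fitted line estimates the brain concentration. The concentrations in this sketch are invented for illustration:

```python
def estimate_brain_glucose(inlet, outlet):
    """Zero-net-flux estimate of brain extracellular glucose.

    Net flux (outlet - inlet concentration) is regressed linearly on the
    inlet concentration; the x-intercept is the inlet concentration at
    which no net exchange occurs, i.e. the brain concentration.
    """
    n = len(inlet)
    flux = [o - i for i, o in zip(inlet, outlet)]
    mx = sum(inlet) / n
    my = sum(flux) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(inlet, flux))
             / sum((x - mx) ** 2 for x in inlet))
    intercept = my - slope * mx
    return -intercept / slope  # x-intercept of the fitted line

# Synthetic example: true brain concentration 1.0 mM, 40% probe recovery.
inlet = [0.0, 0.5, 1.0, 1.5, 2.0]
outlet = [i + 0.4 * (1.0 - i) for i in inlet]
```

With these synthetic values the fitted line crosses zero at 1.0 mM, recovering the assumed brain concentration; the thesis result that the line's slope depends on flow rate and diffusivity but not on brain concentration is what makes this calibration robust.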
Abstract:
Perceptual learning is a training-induced improvement in performance. Mechanisms underlying the perceptual learning of depth discrimination in dynamic random-dot stereograms were examined by assessing stereothresholds as a function of decorrelation. The inflection point of the decorrelation function was defined as the level of decorrelation corresponding to 1.4 times the threshold when decorrelation is 0%. In general, stereothresholds increased with increasing decorrelation. Following training, stereothresholds and standard errors of measurement decreased systematically for all tested decorrelation values. Post-training decorrelation functions were reduced by a multiplicative constant (approximately 5), exhibiting changes in stereothresholds without changes in the inflection points. Disparity energy model simulations indicate that a post-training reduction in neuronal noise is sufficient to account for the perceptual learning effects. In two subjects, learning effects were retained over a period of six months, which may have application for training stereo-deficient subjects.
Abstract:
Competing water demands for household consumption as well as the production of food, energy, and other uses pose challenges for water supply and sustainable development in many parts of the world. Designing creative strategies and learning processes for sustainable water governance is thus of prime importance. While this need is uncontested, suitable approaches still have to be found. In this article we present and evaluate a conceptual approach to scenario building aimed at transdisciplinary learning for sustainable water governance. The approach combines normative, explorative, and participatory scenario elements. This combination allows for adequate consideration of stakeholders’ and scientists’ systems, target, and transformation knowledge. Application of the approach in the MontanAqua project in the Swiss Alps confirmed its high potential for co-producing new knowledge and establishing a meaningful and deliberative dialogue between all actors involved. The iterative and combined approach ensured that stakeholders’ knowledge was adequately captured, fed into scientific analysis, and brought back to stakeholders in several cycles, thereby facilitating learning and co-production of new knowledge relevant for both stakeholders and scientists. However, the approach also revealed a number of constraints, including the enormous flexibility required of stakeholders and scientists in order for them to truly engage in the co-production of new knowledge. Overall, the study showed that shifts from strategic to communicative action are possible in an environment of mutual trust. This ultimately depends on creating conditions of interaction that place scientists’ and stakeholders’ knowledge on an equal footing.
Abstract:
Livelihood resilience draws attention to the factors and processes that keep livelihoods functioning despite change, and thus enriches the livelihood approach, which places people, their differential capabilities to cope with shocks, and ways to reduce poverty and improve adaptive capacity at the centre of analysis. However, the few studies addressing resilience from a livelihood perspective take different approaches and focus only on some dimensions of livelihoods. This paper presents a framework that can be used for a comprehensive empirical analysis of livelihood resilience. We use a concept of resilience that considers agency as well as structure. A review of both theoretical and empirical literature related to livelihoods and resilience served as the basis for integrating the perspectives. The paper identifies attributes and indicators for the three dimensions of resilience, namely buffer capacity, self-organisation and capacity for learning. The framework has not yet been systematically tested; however, the potentials and limitations of its components are explored and discussed by drawing on empirical examples from the literature on farming systems. Besides providing a basis for applying the resilience concept in livelihood-oriented research, the framework offers a way to communicate with practitioners on identifying and improving the factors that build resilience. It can thus serve as a tool for monitoring the effectiveness of policies and practices aimed at building livelihood resilience.
Abstract:
Because land degradation is intrinsically complex and involves decisions by many agencies and individuals, land degradation mapping should be used as a learning tool through which managers, experts and stakeholders can re-examine their views within a wider semantic context. In this paper, we introduce an analytical framework for mapping land degradation, developed by the World Overview of Conservation Approaches and Technologies (WOCAT) programmes, which aims to produce thematic maps that serve as a useful tool and include effective information on land degradation and conservation status. This methodology would thus provide an important background for decision-making, enabling rehabilitation/remediation actions to be launched in high-priority intervention areas. As land degradation mapping is a problem-solving task that aims to provide clear information, this study entails the implementation of the WOCAT mapping tool, which integrates a set of indicators to appraise the severity of land degradation across a representative watershed. This work therefore focuses on the most relevant indicators for measuring the impacts of different degradation processes in the El Mkhachbiya catchment, situated in northwestern Tunisia, and on the actions taken to deal with them, based on an analysis of the operating modes and degradation issues in different land use systems. The study aims to provide a database for the surveillance and monitoring of land degradation, supporting stakeholders in making appropriate choices and in judging guidelines and suitable recommendations to remedy the situation and promote sustainable development. The approach is illustrated through a case study of an urban watershed in northwestern Tunisia. Results showed that the main land degradation drivers in the study area were natural processes exacerbated by human activities. The output of this analytical framework thus enabled better communication of land degradation issues and concerns in a way that is relevant for policymakers.
Abstract:
Patient-specific biomechanical models including local bone mineral density and anisotropy have gained importance for assessing musculoskeletal disorders. However, the trabecular bone anisotropy captured by high-resolution imaging is only available at the peripheral skeleton in clinical practice. In this work, we propose a supervised learning approach to predict trabecular bone anisotropy that builds on a novel set of pose-invariant feature descriptors. The statistical relationship between trabecular bone anisotropy and the feature descriptors was learned from a database of pairs of high-resolution QCT and clinical QCT reconstructions. In a set of leave-one-out experiments, we compared the accuracy of the proposed approach to previous ones, and report a mean prediction error of 6% for the tensor norm, 6% for the degree of anisotropy and 19° for the principal tensor direction. These findings show the potential of the proposed approach to predict trabecular bone anisotropy from clinically available QCT images.
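As a rough sketch of learning a mapping from feature descriptors to anisotropy tensors from a paired database, the snippet below uses a 1-nearest-neighbour rule over toy data. The feature and tensor values are invented, and this baseline is only illustrative; the abstract does not specify the study's actual statistical model:

```python
import math

def nearest_neighbor_predict(train_feats, train_tensors, query):
    """Predict the anisotropy tensor of a new clinical-QCT sample as the
    tensor paired with the closest training feature descriptor
    (Euclidean distance). A toy stand-in for the learned relationship."""
    best = min(range(len(train_feats)),
               key=lambda i: math.dist(train_feats[i], query))
    return train_tensors[best]

# Invented database: pose-invariant descriptors paired with flattened
# 2x2 tensor entries (real descriptors and tensors are higher-dimensional).
feats = [[0.1, 0.9], [0.8, 0.2], [0.5, 0.5]]
tensors = [[1.0, 0.0, 0.0, 2.0], [2.0, 0.1, 0.1, 1.0], [1.5, 0.05, 0.05, 1.5]]
pred = nearest_neighbor_predict(feats, tensors, [0.75, 0.25])
```

Leave-one-out evaluation, as in the study, would hold each pair out in turn, predict its tensor from the remaining pairs, and accumulate the norm, degree-of-anisotropy and principal-direction errors.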
Abstract:
As co-founder of KIPP, I know from experience and research that more time in school works. A well-designed extended-time program can help underserved students catch up academically, and prepare them for the rigors of higher education. Implementing extended time more widely poses challenges, but there are also creative solutions to these challenges.
Abstract:
Ocean acidification is one of the most pressing environmental concerns of our time, and not surprisingly, we have seen a recent explosion of research into the physiological impacts and ecological consequences of changes in ocean chemistry. We are gaining considerable insights from this work, but further advances require greater integration across disciplines. Here, we showed that projected near-future CO2 levels impaired the ability of damselfish to learn the identity of predators. These effects stem from impaired neurotransmitter function; impaired learning under elevated CO2 was reversed when fish were treated with gabazine, an antagonist of the GABA-A receptor, a major inhibitory neurotransmitter receptor in the brain of vertebrates. The effects of CO2 on learning and the link to neurotransmitter interference were manifested as major differences in survival for fish released into the wild. Lower survival under elevated CO2, as a result of impaired learning, could have a major influence on population recruitment.
Abstract:
Organisms inhabiting coastal waters naturally experience diel and seasonal physico-chemical variations. According to various assumptions, coastal species are either considered to be highly tolerant to environmental changes or, conversely, living at the thresholds of their physiological performance. Therefore, these species are either more resistant or more sensitive, respectively, to ocean acidification and warming. Here, we focused on Crepidula fornicata, an invasive gastropod that colonized bays and estuaries on northwestern European coasts during the 20th century. Small (<3 cm in length) and large (>4.5 cm in length), sexually mature individuals of C. fornicata were raised for 6 months in three different pCO2 conditions (390 µatm, 750 µatm, and 1400 µatm) at four successive temperature levels (10°C, 13°C, 16°C, and 19°C). At each temperature level and in each pCO2 condition, we assessed the physiological rates of respiration, ammonia excretion, filtration and calcification on small and large individuals. Results show that, in general, temperature positively influenced respiration, excretion and filtration rates in both small and large individuals. Conversely, increasing pCO2 negatively affected calcification rates, leading to net dissolution in the most drastic pCO2 condition (1400 µatm) but did not affect the other physiological rates. Overall, our results indicate that C. fornicata can tolerate ocean acidification, particularly in the intermediate pCO2 scenario. Moreover, in this eurythermal species, moderate warming may play a buffering role in the future responses of organisms to ocean acidification.