52 results for Information literacy integration model
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)
Abstract:
Much has been discussed about Digital Literacy, but the skills required to develop such a process remain poorly identified. This study integrates the Digital Literacy process with the specific informational skills a person must master in order to search, retrieve and use information efficiently in professional, academic or personal life. The main objective of this work is to propose methodological parameters for training in informational skills; the specific objectives concern the identification of the skills expected of Digital Literacy program participants. The research is exploratory in character and relies on two tools: literature review and case studies. Besides providing a structured methodology for information competence, the study points out that the country is still far from what is desired concerning the development and deployment of Digital Literacy programs consistent enough to support the teaching and learning of searching, retrieving and using information. It is therefore essential to create programs that provide not only equipment but also motivate individuals to develop the informational skills that support the learning process.
Abstract:
The "reflexive thinking" concept is discussed in this article as a means of contextualizing John Dewey's intellectual legacy. "Reflection" represents a fundamental element in the construction of the competencies necessary for information seeking and use, and consequently for individual and collective development. Since the habit of reflexive thinking in information literacy is a way of learning, some questions concerning teaching and learning processes are also investigated. The discussion is therefore supported by the supposition that reflexive thinking is a cognitive strategy that allows a deeper comprehension of related problems, phenomena and processes by means of the perception of relations and the identification of the elements involved, as well as the analysis and interpretation of meanings, empowering the information literacy process.
Abstract:
Model trees are a particular case of decision trees employed to solve regression problems. They have the advantage of presenting an interpretable output, helping the end-user to gain more confidence in the prediction and providing a basis for new insight about the data, confirming or rejecting previously formed hypotheses. Moreover, model trees present an acceptable level of predictive performance in comparison to most techniques used for solving regression problems. Since generating the optimal model tree is an NP-Complete problem, traditional model tree induction algorithms make use of a greedy top-down divide-and-conquer strategy, which may not converge to the globally optimal solution. In this paper, we propose a novel algorithm based on the evolutionary algorithms paradigm as an alternative heuristic for generating model trees, in order to improve convergence to globally near-optimal solutions. We call our new approach evolutionary model tree induction (E-Motion). We test its predictive performance using public UCI data sets and compare the results to traditional greedy regression/model tree induction algorithms, as well as to other evolutionary approaches. Results show that our method presents a good trade-off between predictive performance and model comprehensibility, which may be crucial in many machine learning applications. (C) 2010 Elsevier Inc. All rights reserved.
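The following is a minimal sketch of the evolutionary paradigm this abstract applies, not the authors' E-Motion implementation: candidate trees are reduced here to a single split with a least-squares linear model in each leaf, and a toy elitism-plus-mutation loop searches for the split that minimizes leaf-weighted MSE. The dataset, population size and mutation rates are all invented for illustration.

```python
# Toy evolutionary induction of a depth-1 model tree (one split, linear leaves).
# A sketch only: E-Motion evolves full trees; here the "genome" is (feature, threshold).
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(300, 2))
y = np.where(X[:, 0] < 0.5, 2 * X[:, 1] + 1, -X[:, 1] + 4) + rng.normal(0, 0.2, 300)

def leaf_mse(Xs, ys):
    """Fit a linear model by least squares in a leaf; return its MSE."""
    if len(ys) < 3:
        return np.inf
    A = np.c_[Xs, np.ones(len(ys))]
    coef, *_ = np.linalg.lstsq(A, ys, rcond=None)
    return np.mean((A @ coef - ys) ** 2)

def fitness(split):
    """Lower is better: sample-weighted MSE of the two leaf models."""
    feat, thr = split
    mask = X[:, int(feat)] < thr
    if mask.sum() < 3 or (~mask).sum() < 3:
        return np.inf
    return (mask.mean() * leaf_mse(X[mask], y[mask])
            + (~mask).mean() * leaf_mse(X[~mask], y[~mask]))

def mutate(split):
    feat, thr = split
    if rng.random() < 0.2:                    # occasionally change the split feature
        feat = rng.integers(X.shape[1])
    return (feat, thr + rng.normal(0, 0.3))   # jitter the split threshold

pop = [(rng.integers(2), rng.uniform(-3, 3)) for _ in range(20)]
for gen in range(50):
    pop.sort(key=fitness)
    pop = pop[:10] + [mutate(p) for p in pop[:10]]   # elitism + mutation
best = min(pop, key=fitness)
print("best split: feature %d at %.2f, fitness %.4f" % (best[0], best[1], fitness(best)))
```

Unlike a greedy top-down split search, the population can escape locally good but globally poor thresholds, which is the convergence argument the abstract makes.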
Abstract:
This paper presents and reviews the recommendations made by experts during the specialists' meeting held in the city of Alexandria, Egypt, at the end of 2005 and, in light of this information, analyzes the Brazilian situation. The internationalization and institutionalization of information literacy and lifelong learning as factors essential to the development of nations are also explored. The beacons of the Information Society translate the vision and concepts involved. In Brazil, actions around information literacy have not reached consensus. The challenges to be faced include: discovering ways to foster and appropriately disseminate national and local knowledge; advancing discussions and deepening the subject; finding adequate alternatives for disseminating information practices that encompass distinct professional groups and populations; and overcoming structural development gaps. As a matter that permeates every process of learning, research, development, problem-solving and decision-making, information literacy has gone beyond the boundaries of librarianship and transformed itself into a worldwide transdisciplinary movement, even under the aegis of different denominations and emphases.
Abstract:
This essay attempts to give some mathematical form to the concept of biological complexity by exploring four attributes considered essential to characterize a complex system in a biological context: decomposition, heterogeneous assembly, self-organization, and adequacy. It is a theoretical and speculative approach that opens possibilities for further numerical and experimental work, illustrated by references to several studies that have applied the concepts presented here. (C) 2008 Elsevier B.V. All rights reserved.
Abstract:
The lateral part of the intermediate layer of the superior colliculus (SCI) is a critical substrate for successful predation by rats. Hunting-evoked expression of the activity marker Fos is concentrated in the SCI, and prey capture is impaired in rats with NMDA lesions in the SCI. Particularly affected are the rapid orienting and stereotyped sequences of actions associated with predation of fast-moving prey. Such deficits are consistent with the view that the deep layers of the SC are important for the sensory guidance of movement. Although much of the relevant evidence involves visual control of movement, less is known about movement guidance by somatosensory input from the vibrissae. Indeed, our impression is that prey contact with the whiskers is a likely stimulus to trigger predation. Moreover, the SCI receives whisker and orofacial somatosensory information directly from the trigeminal complex, and indirectly from the zona incerta, the parvicellular reticular formation and the somatosensory barrel cortex. To better understand the sensory guidance of predation by vibrissal information, we investigated prey capture by rats after whisker removal and probed the role of the superior colliculus (SC) by comparing Fos expression after hunting with and without whiskers. Rats were allowed to hunt cockroaches, after which their whiskers were removed. Two days later they were allowed to hunt cockroaches again. Without whiskers, the rats were less able to retain the cockroaches after capture and less able to pursue them if the cockroach escaped. The predatory behaviour of rats with re-grown whiskers returned to normal. In parallel, Fos expression in the SCI induced by predation was significantly reduced in whiskerless animals. We conclude that whiskers contribute to the efficiency of rat prey capture and that the loss of vibrissal input to the SCI, as reflected by reduced Fos expression, could play a critical role in the predatory deficits of whiskerless rats. (C) 2011 IBRO. Published by Elsevier Ltd. All rights reserved.
Abstract:
Background: High-density tiling arrays and new sequencing technologies are generating rapidly increasing volumes of transcriptome and protein-DNA interaction data. Visualization and exploration of these data are critical to understanding the regulatory logic encoded in the genome, by which the cell dynamically affects its physiology and interacts with its environment. Results: The Gaggle Genome Browser is a cross-platform desktop program for interactively visualizing high-throughput data in the context of the genome. Important features include dynamic panning and zooming, keyword search and open interoperability through the Gaggle framework. Users may bookmark locations on the genome with descriptive annotations and share these bookmarks with other users. The program handles large sets of user-generated data using an in-process database and leverages the facilities of SQL and the R environment for importing and manipulating data. A key aspect of the Gaggle Genome Browser is interoperability. By connecting to the Gaggle framework, the genome browser joins a suite of interconnected bioinformatics tools for analysis and visualization, with connectivity to major public repositories of sequences, interactions and pathways. To this flexible environment for exploring and combining data, the Gaggle Genome Browser adds the ability to visualize diverse types of data in relation to their coordinates on the genome. Conclusions: Genomic coordinates function as a common key by which disparate biological data types can be related to one another. In the Gaggle Genome Browser, heterogeneous data are joined by their location on the genome to create information-rich visualizations, yielding insight into genome organization, transcription and its regulation and, ultimately, a better understanding of the mechanisms that enable the cell to respond dynamically to its environment.
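The conclusion's core idea, genomic coordinates as a common join key, can be illustrated in a few lines. This is not Gaggle Genome Browser code; the record types and data below are invented, and the quadratic scan stands in for the indexed lookups a real browser would use.

```python
# Sketch: relating heterogeneous records (hypothetical ChIP peaks and
# transcripts) purely by interval overlap on shared genomic coordinates.
peaks = [("chr1", 100, 250, "peak_a"), ("chr1", 900, 980, "peak_b")]
transcripts = [("chr1", 50, 300, "geneX"), ("chr1", 400, 800, "geneY")]

def overlaps(a_start, a_end, b_start, b_end):
    """Half-open intervals overlap iff each starts before the other ends."""
    return a_start < b_end and b_start < a_end

for chrom_p, ps, pe, pname in peaks:
    for chrom_t, ts, te, tname in transcripts:
        if chrom_p == chrom_t and overlaps(ps, pe, ts, te):
            print(f"{pname} overlaps {tname} on {chrom_p}")
# -> peak_a overlaps geneX on chr1
```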
Abstract:
In this work we study an agent-based model to investigate the role of the degree of information asymmetry in market evolution. The model is quite simple and may be treated analytically, since consumers evaluate the quality of a given good taking into account only the quality of the last good purchased plus their perceptive capacity beta. As a consequence, the system evolves according to a stationary Markov chain. The value of a good offered by the firms increases with quality according to an exponent alpha, which is a measure of the technology; it incorporates the full technological capacity of the production system, such as education, scientific development and techniques that change productivity rates. The technological level plays an important role in explaining how information asymmetry may affect market evolution in this model. We observe that, at high technological levels, the market can detect adverse selection. The model allows us to compute the maximum degree of information asymmetry before the market collapses. Below this critical point the market evolves for a limited period of time and then dies out completely. When beta is close to 1 (symmetric information), the market becomes more profitable for high-quality goods, although high- and low-quality markets coexist. The maximum level of information asymmetry is a consequence of an ergodicity breakdown in the process of quality evaluation. (C) 2011 Elsevier B.V. All rights reserved.
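A hedged sketch of one possible reading of this setup follows: perceived quality mixes the true quality with the consumer's anchor (the last purchase) according to beta, and value scales as quality raised to alpha. The update rule, quality levels and prices below are illustrative guesses, not the paper's equations.

```python
# Toy asymmetric-information market: buy only when perceived value covers price.
import numpy as np

rng = np.random.default_rng(1)

def simulate(beta, alpha, steps=2000):
    """Return the fraction of steps in which a high-quality good is traded."""
    high_trades = 0
    last_quality = 0.5                       # quality of the last good purchased
    for _ in range(steps):
        true_q = rng.choice([0.2, 0.9])      # firms offer low- or high-quality goods
        # consumer's estimate: anchored on the last purchase, corrected by beta
        perceived = beta * true_q + (1 - beta) * last_quality
        price = true_q ** alpha              # value increases with quality via alpha
        if perceived ** alpha >= price:      # trade happens; update the anchor
            last_quality = true_q
            high_trades += true_q > 0.5
    return high_trades / steps

for beta in (0.3, 0.7, 1.0):
    print(f"beta={beta}: high-quality trade fraction = {simulate(beta, 2.0):.3f}")
```

Even in this crude form, sweeping beta toward 1 shows the high-quality segment becoming viable, the qualitative effect the abstract reports.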
Abstract:
Here, we study the stable integration of real-time optimization (RTO) with model predictive control (MPC) in a three-layer structure. The intermediate layer is a quadratic programming problem whose objective is to compute targets, reachable by the MPC layer, that lie at the minimum distance from the optimum set-points produced by the RTO layer. The lower layer is an infinite-horizon MPC with guaranteed stability, with additional constraints that force the feasibility and convergence of the target-calculation layer. We also consider the case in which there is polytopic uncertainty in the steady-state model used in the target calculation. The dynamic part of the MPC model is likewise considered unknown, but it is assumed to be represented by one member of a discrete set of models. The efficiency of the methods presented here is illustrated by the simulation of a low-order system. (C) 2010 Elsevier Ltd. All rights reserved.
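The flavor of the intermediate target-calculation layer can be sketched as a small constrained least-squares problem: find the steady-state input closest (in inputs and outputs) to what RTO asked for, subject to actuator limits. The gain matrix G, set-points and bounds below are made-up numbers, and the formulation is a simplification, not the paper's.

```python
# Sketch of a QP-style target calculation between an RTO layer and an MPC layer.
import numpy as np
from scipy.optimize import minimize

G = np.array([[1.2, -0.3],
              [0.4,  0.8]])          # assumed steady-state gain: y_ss = G @ u_ss
u_rto = np.array([2.0, 1.5])         # inputs suggested by the RTO layer
y_rto = G @ np.array([1.0, 1.0])     # outputs the RTO layer would like
u_lo, u_hi = np.array([0.0, 0.0]), np.array([1.5, 1.2])   # actuator limits

def objective(u):
    # squared distance of the reachable target to the RTO optimum (inputs and outputs)
    return np.sum((u - u_rto) ** 2) + np.sum((G @ u - y_rto) ** 2)

res = minimize(objective, x0=np.zeros(2), method="SLSQP",
               bounds=list(zip(u_lo, u_hi)))
print("reachable MPC target u* =", res.x, " y* =", G @ res.x)
```

Because u_rto violates the actuator limits here, the solver returns the nearest reachable target, which is exactly the role the intermediate layer plays for the MPC below it.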
Abstract:
The understanding of complex physiological processes requires information from many different areas of knowledge. This interdisciplinary scenario demands the ability to integrate and articulate information. The difficulty of such an approach arises because, more often than not, information is fragmented across undergraduate education in the Health Sciences. Shifting from a fragmentary, in-depth view of many topics to joining them horizontally in a global view is not a trivial task for teachers to implement. To attain that objective we proposed the course described here, "Biochemistry of the envenomation response", aimed at integrating previous contents of Health Sciences courses, following international recommendations for the interdisciplinary model. The contents were organized in modules of increasing topic complexity; a full understanding of the envenoming pathophysiology in each module required the integration of knowledge from different disciplines. An active-learning strategy was employed, focused on concept-map drawing. Evaluation was obtained through a 30-item Likert-type survey answered by ninety students: 84% of the students considered that the number of relations they were able to establish, as seen in their concept maps, increased throughout the course, and 98% considered that both the theme and the strategy adopted in the course contributed to developing an interdisciplinary view.
Abstract:
This paper presents an approach for assisting low-literacy readers in accessing online Web information. The "Educational FACILITA" tool is a Web content adaptation tool that provides innovative features and follows more intuitive interaction models with regard to accessibility concerns. In particular, we propose an interaction model and a Web application that exploit the natural language processing tasks of lexical elaboration and named entity labeling to improve Web accessibility. We report the results of a pilot usability study carried out with low-literacy users. The preliminary results show that Educational FACILITA improves the comprehension of text elements, although the assistance mechanisms may also confuse users when word-sense ambiguity is introduced by gathering, for a complex word, a list of synonyms with multiple meanings. This points to a future solution in which the correct sense of a complex word in a sentence is identified, addressing this pervasive characteristic of natural languages. The pilot study also showed that experienced computer users find the tool more useful than novice computer users do.
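A toy illustration of the lexical-elaboration task, and of the ambiguity problem the abstract reports, follows. The word list and synonym table are invented; a real system would draw on lexical resources and would still face the word-sense ambiguity noted above.

```python
# Toy lexical elaboration: annotate "complex" words with candidate synonyms.
synonyms = {
    "ubiquitous": ["common", "widespread"],
    "bank": ["financial institution", "river edge"],   # deliberately ambiguous entry
}

def elaborate(sentence):
    out = []
    for word in sentence.split():
        key = word.lower().strip(".,")
        if key in synonyms:
            out.append(f"{word} ({' / '.join(synonyms[key])})")
        else:
            out.append(word)
    return " ".join(out)

print(elaborate("Smartphones are ubiquitous near the bank."))
# Both senses of "bank" are offered, which is exactly the confusion reported.
```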
Abstract:
Optimized experimental conditions for extracting accurate information at subpixel length scales from analyzer-based X-ray imaging were obtained and applied to investigate bone regeneration by means of synthetic beta-TCP grafting materials in a rat calvaria model. The results showed a 30% growth in the particulate size due to bone ongrowth/ingrowth within the critical size defect over a 1-month healing period.
Abstract:
Hypercycles are information integration systems that are thought to overcome the information crisis of prebiotic evolution by ensuring the coexistence of several short templates. For imperfect template replication, we derive a simple expression for the maximum number of distinct templates n_m that can coexist in a hypercycle and show that it is a decreasing function of the length L of the templates. In the case of high replication accuracy we find that the product n_m L tends to a constant value, thus limiting the information content of the hypercycle. Template coexistence is achieved either as a stationary equilibrium (stable fixed point) or as a stable periodic orbit in which the total concentration of functional templates is nonzero. For the hypercycle system studied here we find numerical evidence that the existence of an unstable fixed point is a necessary condition for the presence of periodic orbits. (C) 2008 Elsevier Ltd. All rights reserved.
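A numerical sketch of hypercycle replicator dynamics with imperfect replication, in the spirit of this abstract, is below: template i is catalyzed by template i-1, and only a fraction Q = q^L of copies is functional. The rate constants, template count and parameters are illustrative, not the paper's.

```python
# Hypercycle dynamics: dx_i/dt = Q*k_i*x_i*x_{i-1} - x_i*phi, integrated numerically.
import numpy as np
from scipy.integrate import solve_ivp

n, L, q = 5, 20, 0.999             # number of templates, length, per-base accuracy
Q = q ** L                          # probability a full copy is error-free
k = np.ones(n)                      # catalytic rate constants

def rhs(t, x):
    growth = Q * k * x * np.roll(x, 1)       # x[i-1] catalyzes replication of x[i]
    phi = np.sum(k * x * np.roll(x, 1))      # dilution flux; total functional
    return growth - x * phi                  # concentration relaxes toward Q,
                                             # the remainder being error copies
x0 = np.full(n, 1.0 / n)
sol = solve_ivp(rhs, (0, 2000), x0, max_step=1.0)
print("final template concentrations:", np.round(sol.y[:, -1], 4))
# Coexistence shows up as all concentrations staying bounded away from zero;
# for n >= 5 this can be a periodic orbit rather than a stable fixed point.
```

Increasing L (so that Q falls) shrinks the coexisting set, which is the n_m versus L trade-off the abstract derives analytically.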
Abstract:
OBJECTIVES: The complexity and heterogeneity of human bone, as well as ethical issues, almost always hinder the performance of clinical trials. Thus, in vitro studies become an important source of information for understanding biomechanical events in implant-supported prostheses, although study results cannot be considered reliable unless validation studies are conducted. The purpose of this work was to validate an artificial experimental model, based on its modulus of elasticity, to simulate the performance of human bone in vivo in biomechanical studies of implant-supported prostheses. MATERIAL AND METHODS: In this study, fast-curing polyurethane (F16 polyurethane, Axson) was used to build 40 specimens that were divided into five groups. The following reagent ratios (part A/part B) were used: Group A (0.5/1.0), Group B (0.8/1.0), Group C (1.0/1.0), Group D (1.2/1.0), and Group E (1.5/1.0). A universal testing machine (Kratos model K - 2000 MP) was used to measure modulus of elasticity values in compression. RESULTS: Mean modulus of elasticity values were: Group A - 389.72 MPa, Group B - 529.19 MPa, Group C - 571.11 MPa, Group D - 470.35 MPa, Group E - 437.36 MPa. CONCLUSION: The best mechanical characteristics, with a modulus of elasticity comparable to that of human trabecular bone, were obtained when the A/B ratio was 1:1.
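For readers outside biomechanics, the quantity being compared above is obtained from a compression test as E = stress / strain = (F / A) / (dL / L0). The specimen dimensions and load in this worked example are hypothetical, chosen only so that E lands near the ~571 MPa reported for Group C.

```python
# Worked example: elastic modulus from a compression test (hypothetical numbers).
F = 2856.0          # applied load in N
A = 100.0e-6        # cross-sectional area in m^2 (10 mm x 10 mm)
L0 = 20.0e-3        # initial specimen height in m
dL = 1.0e-3         # measured compression in m

stress = F / A                  # Pa
strain = dL / L0                # dimensionless
E = stress / strain             # Pa
print(f"E = {E / 1e6:.1f} MPa") # -> E = 571.2 MPa
```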
Abstract:
Motivated by a recently proposed biologically inspired face recognition approach, we investigated the relation between human behavior and a computational model based on Fourier-Bessel (FB) spatial patterns. We measured human recognition performance for FB-filtered face images using an 8-alternative forced-choice method. Test stimuli were generated by converting the images from the spatial to the FB domain, filtering the resulting coefficients with a band-pass filter, and finally taking the inverse FB transform of the filtered coefficients. The performance of the computational models was tested in a simulation of the psychophysical experiment. In the FB model, face images were first filtered by simulated V1-type neurons and later analyzed globally for their content of FB components. In general, human contrast sensitivity was higher for radially than for angularly filtered images, but both functions peaked in the 11.3-16 frequency interval. The FB-based model showed similar behavior with regard to peak position and relative sensitivity, but had a wider frequency bandwidth and a narrower response range. The response patterns of two alternative models, based on local FB analysis and on raw luminance, diverged strongly from the human behavior patterns. These results suggest that human performance can be constrained by the type of information conveyed by polar patterns, and consequently that humans might use FB-like spatial patterns in face processing.
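A hedged sketch of the band-pass step in the FB domain follows, reduced to a 1-D radial profile: the profile is decomposed into J0 components on a disk and only a band of components is kept. The study filtered full 2-D face images, which also involves angular (J_n, n > 0) components; the profile, band limits and disk radius here are invented.

```python
# Radial Fourier-Bessel band-pass sketch: decompose f(r) into J0 modes, keep a band.
import numpy as np
from scipy.special import jv, jn_zeros

R, N, K = 1.0, 512, 40
r = np.linspace(0, R, N)
f = np.exp(-8 * (r - 0.4) ** 2)             # hypothetical radial profile
alpha = jn_zeros(0, K)                       # first K zeros of J0

# forward transform: c_k = 2/(R^2 J1(a_k)^2) * integral_0^R f(r) J0(a_k r/R) r dr
dr = r[1] - r[0]
c = np.array([2.0 / (R**2 * jv(1, a) ** 2) * np.sum(f * jv(0, a * r / R) * r) * dr
              for a in alpha])

band = (np.arange(K) >= 5) & (np.arange(K) < 15)   # keep components 5..14 only
f_band = sum(c[k] * jv(0, alpha[k] * r / R) for k in range(K) if band[k])

Ek = c ** 2 * (R**2 / 2) * jv(1, alpha) ** 2       # per-component energy (Parseval)
print("fraction of energy kept: %.3f" % (Ek[band].sum() / Ek.sum()))
print("peak of band-passed profile: %.3f" % f_band.max())
```

Taking the inverse transform of the retained coefficients, as f_band does here, mirrors the stimulus-generation pipeline the abstract describes.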