803 results for Information technology and education
Abstract:
The physics of plasmas encompasses basic problems of the universe and holds promise for applications across a wide range of scientific and engineering domains, linked to many evolved and still-evolving fundamental problems. A substantial part of this domain can be described by reaction–diffusion (R–D) mechanisms involving two or more species, which can further account for the simultaneous non-linear effects of heating, diffusion and other related losses. In laboratory-scale experiments, a suitable combination of these processes is of vital importance and largely decisive for investigating and computing the net behaviour of the plasmas under consideration. Plasmas are also being used in the revolution of information processing, so in this technical note we consider a simple framework intended to pave the way for better formalisms and informatics spanning diverse domains of science and technology. A challenging and fascinating aspect of plasma physics is that it requires a great deal of insight in formulating the relevant design problems, which in turn requires ingenuity and flexibility in choosing a particular set of mathematical (and/or experimental) tools to implement them.
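As a rough illustration of the reaction–diffusion framework referred to above (a generic two-species sketch, not the note's own formulation), such systems are typically written as

\partial_t n_1 = D_1 \nabla^2 n_1 + R_1(n_1, n_2) - L_1(n_1, T),
\partial_t n_2 = D_2 \nabla^2 n_2 + R_2(n_1, n_2) - L_2(n_2, T),

where the n_i are species densities, the D_i diffusion coefficients, the R_i the (generally non-linear) reaction terms and the L_i the heating/loss contributions at temperature T; the symbols here are illustrative placeholders rather than the authors' notation.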
Abstract:
Pós-graduação em Educação Escolar - FCLAR
Abstract:
Information technologies have enabled new forms of interaction among individuals and can also be found in various educational institutions. However, despite appropriating new technological tools, educational practices continue to be based on a supposed pedagogical subject conceived in a transcendent way, ignoring its emergence in a particular historical period as well as the control mechanisms in which it is immersed. This article presents the results of an analysis of technology and education based on the thought of the philosopher Michel Foucault. We analyzed proposals and discourses found in articles in a magazine entitled Patio - Revista Pedagógica.
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
The paper presents some reflections, in the form of questions, about the challenges and possibilities of education with respect to the integration of information and communication technologies into the school curriculum. To this end, we present: initial considerations on contemporary education and the urgent need to rethink the teaching curriculum for the information and knowledge society; perspectives on the school curriculum and the stages through which it passes before being translated into the everyday curriculum of teaching practices; and, finally, an emphasis on the need to integrate technologies across the curriculum in teaching practices through inter- and transdisciplinary projects.
Abstract:
This edited collection grew out of a symposium held at Utah State University in Logan in 2002. According to the editors, the symposium's purpose was to "publicly explore the particular ways environmental writing educates the public through a fusion of science and literary expression." The Search for a Common Language achieves that purpose by including short prose pieces, ranging from memoirs and essays on specific locations to scientific papers, as well as poetry on natural themes. The range of topics and genres and the inclusion of poetry provide a variety of ways to talk about the environment and to reach out to different audiences to educate them about the natural world.
Abstract:
OBJECTIVE: Because of their toxicity, diesel emissions have been subjected to progressively more restrictive regulations in developed countries. However, in Brazil, the implementation of the Cleaner Diesel Technologies policy (Euro IV standards for vehicles produced in 2009 and low-sulfur diesel with 50 ppm of sulfur) was postponed until 2012 without a comprehensive analysis of the effect of this delay on public health parameters. We aimed to evaluate the impact of the delay in implementing the Cleaner Diesel Technologies policy on health indicators and monetary health costs in Brazil. METHODS: The primary estimator of exposure to air pollution was the concentration of ambient fine particulate matter (particles with aerodynamic diameter <2.5 μm [PM2.5]). This parameter was measured daily in six Brazilian metropolitan areas during 2007-2008. We calculated 1) the projected reduction in PM2.5 that would have been achieved if the Euro IV standards had been implemented in 2009 and 2) the expected reduction after implementation in 2012. The difference between these two time curves was transformed into health outcomes using previously published dose-response curves. The economic valuation was performed based on the DALY (disability-adjusted life years) method. RESULTS: The delay in implementing the Cleaner Diesel Technologies policy will result in an estimated excess of 13,984 deaths up to 2040. Health expenditures are projected to increase by nearly US$ 11.5 billion over the same period. CONCLUSIONS: The present results indicate that a significant health burden will occur because of the postponement in implementing the Cleaner Diesel Technologies policy. They also reinforce the concept that health effects must be considered when revising fuel and emission policies.
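As a hedged sketch of how such a dose-response step is commonly carried out in health-impact assessments (the abstract does not give the authors' exact model), excess mortality attributable to a concentration difference \Delta C is often estimated with a log-linear relative-risk function,

\Delta M \approx M_0 \left(1 - e^{-\beta\,\Delta C}\right) P,

where M_0 is the baseline mortality rate, \beta a published concentration-response coefficient for PM2.5, and P the exposed population; the resulting excess deaths and DALYs are then monetized with a per-DALY value. The symbols are illustrative assumptions, not the study's notation.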
Abstract:
The Grupo de Estudos e Pesquisas de Tecnologia da Informacao nos Processos de Trabalho em Enfermagem (Study and Research Group for Information Technology in the Nursing Working Processes, GEPETE) has the purpose of producing and disseminating knowledge on information technology and on health and nursing communication, establishing links with research groups in this field and promoting student participation. This study was performed by the group tutors with the objective of reporting on the development of a virtual learning environment (VLE) and on the tutors' experience as mediators of a research group using the Moodle platform. To this end, a VLE was developed and pedagogical mediation was carried out around the theme of mentoring. An initial diagnosis was made of the difficulties in using this technology for interaction and communication, which supported the proposal to continue using the platform as a resource for research activities, to offer lead researchers mechanisms for sharing their projects, and to make advising at a distance possible.
Abstract:
This review reports the Brazilian history of astrobiology, as well as a first outline of the future development of the field in the country, which can draw on abundant biodiversity, highly capable human resources and state-of-the-art facilities. These reflect the last few years of stable governmental investment in science, technology and education, conditions that provide good prospects for continued and steadily growing funding for astrobiology-related research. Brazil is growing steadily and fast in terms of its worldwide economic power, an effect reflected in different areas of Brazilian society, including industry, technology, education, social care and scientific production. In the field of astrobiology, the country has had some important landmarks, more intensely after the First Brazilian Workshop on Astrobiology in 2006. The history of astrobiology in Brazil, however, is not so recent and had its first occurrence in 1958. Since then, researchers have carried out many individual initiatives across the country in astrobiology-related fields, resulting in an ever-growing and expressive scientific production. The number of publications, including articles and theses, has increased particularly in the last decade, but still relies on the effort of researchers working individually. That scenario started to change in 2009, when a formal group of Brazilian researchers working with astrobiology was organized, aiming to bring together the scientific community interested in the subject and to promote the interactions needed for multidisciplinary work, with facilities and funding from the University of Sao Paulo and other funding agencies.
Abstract:
The amount of information exchanged per unit of time between two nodes in a dynamical network, or between two data sets, is a powerful concept for analysing complex systems. This quantity, known as the mutual information rate (MIR), is calculated from the mutual information, which is rigorously defined only for random systems. Moreover, the definition of mutual information is based on probabilities of significant events. This work offers a simple alternative way to calculate the MIR in dynamical (deterministic) networks or between two time series (not fully deterministic), and to calculate its upper and lower bounds without having to calculate probabilities, but rather in terms of well-known and well-defined quantities in dynamical systems. As possible applications of our bounds, we study the relationship between synchronisation and the exchange of information in a system of two coupled maps and in experimental networks of coupled oscillators.
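For reference (standard textbook definitions, not the specific bounds derived in this work), the mutual information between symbol blocks of length L drawn from two trajectories X and Y, and the corresponding rate, can be written as

I_L(X;Y) = \sum_{x,y} p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)}, \qquad \mathrm{MIR} = \lim_{L\to\infty} \frac{I_L(X;Y)}{L},

where p(x,y) is the joint probability of observing blocks x and y; the contribution of the work summarized above is precisely to bound this quantity without estimating such probabilities.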
Abstract:
Green information technology refers to a new line of research focused on ecological or "green" technologies aimed at respecting the environment. At first glance one might ask what the real motivations are for studying green technologies in the information technology sector: are computers really that polluting? Don't cars, industry, aircraft and landfills have a greater polluting impact on the environment? Certainly they do, but the polluting footprint of the IT sector should not be underestimated; according to a survey conducted in 2007 by the US research firm Gartner, IT systems are among the major sources of CO2 and other greenhouse-gas emissions, accounting for 2% of the planet's total emissions and matching the pollution rate of the aviation sector. The enormous number of computers scattered around the world absorbs large amounts of electricity, and the power plants that feed them emit tonnes of carbon dioxide, polluting the atmosphere. This thesis aims to highlight the environmental impact of the sector by examining, through an analysis of social and environmental reports, which measures have been adopted by the leaders of the IT industry. The research seeks to show that the largest IT multinationals are aware of the pollution they produce, yet do not adopt enough solutions to limit emissions, setting futile future targets.
Abstract:
The meaning of a place has commonly been tied to rootedness, or a sense of belonging to that setting. Today, by contrast, people are more concerned with the possibilities of free movement and networks of communication, and the meaning, as well as the materiality, of architecture has been dramatically altered by these forces. It is therefore significant to explore and redefine the sense and direction of architecture in the age of flow. In this dissertation, we first review the gradually changing concept of "place-non-place" and its underlying technological basis. We then portray the transformation of the meaning of architecture as influenced by media, information technology and advanced methods of mobility at the dawn of the 21st century. Against such a backdrop, there is a need to sort and analyze architectural practices in response to the triplet of place, non-place and the space of flows, which we aim to do conclusively. We also trace the concept of flow in the formation and transformation of old cities. As a rich case study, we look at the Persian Bazaar from a socio-architectural point of view. In other words, based on Robert Putnam's theory of social capital, we link the social context of the Bazaar with the architectural configuration of cities. On this basis we argue that "cities as flow" are not necessarily a new paradigm.
Abstract:
Modern embedded systems embrace many-core shared-memory designs. Due to constrained power and area budgets, most of them feature software-managed scratchpad memories instead of data caches to increase data locality. It is therefore the programmer's responsibility to explicitly manage memory transfers, and this makes programming these platforms cumbersome. Moreover, complex modern applications must be adequately parallelized before the parallel potential of the platform can be turned into actual performance. To support this, programming languages have been proposed that work at a high level of abstraction and rely on a runtime whose cost hinders performance, especially in embedded systems, where resources and power budgets are constrained. This dissertation explores the applicability of the shared-memory paradigm on modern many-core systems, focusing on ease of programming. It concentrates on OpenMP, the de-facto standard for shared-memory programming. In the first part, the cost of algorithms for synchronization and data partitioning is analyzed, and these algorithms are adapted to modern embedded many-cores. Then, the original design of an OpenMP runtime library is presented, which supports complex forms of parallelism such as multi-level and irregular parallelism. The second part of the thesis focuses on heterogeneous systems, where hardware accelerators are coupled to (many-)cores to implement key functional kernels with orders of magnitude higher speedup and energy efficiency compared to the "pure software" version. However, three main issues arise, namely i) platform design complexity, ii) architectural scalability and iii) programmability. To tackle them, a template for a generic hardware processing unit (HWPU) is proposed, which shares memory banks with the cores, together with a template for a scalable architecture that integrates the HWPUs through the shared-memory system. Then, a full software stack and toolchain are developed to support platform design and to let programmers exploit the accelerators of the platform, and the OpenMP frontend is extended to interact with it.
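As a minimal sketch of the kind of loop-level parallelism such a runtime must support (plain C with a standard OpenMP toolchain and a hypothetical vector-add kernel, not code from the thesis or its custom embedded runtime):

/* Build with: cc -fopenmp vec_add.c */
#include <stdio.h>
#include <omp.h>

#define N 1024

int main(void)
{
    static float a[N], b[N], c[N];

    /* Initialise the input vectors. */
    for (int i = 0; i < N; i++) {
        a[i] = (float)i;
        b[i] = 2.0f * (float)i;
    }

    /* The OpenMP runtime partitions the iteration space across the available
     * cores; on a scratchpad-based many-core, the runtime library discussed
     * above would additionally stage a[], b[] and c[] into local memory
     * before the loop body runs. */
    #pragma omp parallel for schedule(static)
    for (int i = 0; i < N; i++)
        c[i] = a[i] + b[i];

    printf("c[N-1] = %f (max threads: %d)\n", c[N - 1], omp_get_max_threads());
    return 0;
}

The static schedule mirrors the simple block partitioning commonly used on embedded many-cores, where dynamic scheduling overhead is harder to amortize.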
Abstract:
In order to improve in-vitro cell-culture techniques, bioreactor systems are increasingly being used, e.g. in bone tissue engineering. Spinner flasks, rotating bioreactors and flow-perfusion systems are in use today, and each system has advantages and disadvantages. This work describes the development of a simple perfusion bioreactor and the results of the evaluation methodology employed, based on X-ray μCT analysis and 3D modelling techniques. A simple bioreactor with a propeller-driven flow generator was designed and built with the aim of improving the differentiation of human embryo-derived mesenchymal progenitor cells (HES-MP); the cells were seeded on porous titanium scaffolds, which ensure better adhesion of the mineralized matrix. Through a microcontroller and a graphical interface, the bioreactor generates three types of flow: forward (clockwise), backward (counter-clockwise) and a pulsed mode (forward and backward). A simple model was built to estimate the pressure generated by the flow in the scaffolds (3·10-2 Pa). Three scaffolds in static culture were compared with three inside the bioreactor. These were incubated for 21 days, fixed in paraformaldehyde (4% w/v) and imaged by X-ray μCT. The resulting images were then processed with 3D imaging software; a "virtual" sectioning of the scaffolds was carried out in order to obtain the distribution of the grey-value gradients of samples extracted from their surface and interior. This distribution is used to distinguish the various components present in the images, in this case the scaffolds from the putative cell matrix. The results show that, both on the surface and inside the scaffolds kept in the bioreactor, there is a higher density of grey-value gradients, suggesting better deposition of the mineralized matrix. The lessons learned from building this bioreactor will be used to design a new version that will make it possible to analyse more than 20 scaffolds simultaneously, allowing further analysis of the quality of differentiation using molecular and histochemical methods.