934 results for computer technology


Relevance:

60.00%

Publisher:

Abstract:

This study deals with teaching and learning practices for reading in the mother tongue, with the computer as one of the main didactic tools. It is an action-research project grounded in socio-interactionist assumptions about language and reading. It therefore draws on the dialogical perspective, on discursive genres (both as theoretical concepts and as objects of teaching), on the didactic sequence, on the blog as a virtual medium and on the genre of experience reports in blogs, as well as on the teaching of written comprehension in the mother tongue and on the computer as a didactic tool. This theoretical apparatus is used to analyze part of a didactic sequence that prioritized work on reading and writing competence (the former being the focus of our study), developed with a lower secondary (ensino fundamental II) class in the municipality of Benevides/PA. The study seeks to verify to what extent these practices, by using the computer as a didactic-technological tool, contribute to improving the learners' written comprehension.

Relevance:

60.00%

Publisher:

Abstract:

The central objective of this study was to analyze the official curricular prescriptions for the implementation of integrated secondary education in the Network of Technological Education Schools of the State of Pará (EETEPA) from 2004 to 2009. It was funded by CNPq and linked to the Study and Research Group on Curriculum and Teacher Education from the Perspective of Inclusion (INCLUDERE). It is a documentary case study. The study proceeded through exploratory research into related work, with a bibliographic review of secondary sources in the field of education, followed by documentary research in primary sources such as laws, normative acts, newsletters, the pedagogical proposal, and the informatics course plan of one technological unit of the EETEPA network, selected according to defined criteria. The topics discussed were: the theoretical foundations for implementing secondary education integrated with professional education; and the curricular policies for secondary and professional education, highlighting the dual character of general versus professional training. The movement to implement integrated secondary education was also traced at the national level and locally in the state of Pará through the official curricular prescriptions. The results refer, first, to the strategies adopted by SEDUC to implement integrated secondary education in the schools of the EETEPA network, namely: creation of the Secondary Education Directorate, with two coordination offices, one for Secondary Education and one for Professional Education; termination of the contract with OSETPP, returning the eleven schools to SEDUC's administration; creation of the EETEPA network; organization of events (conferences, forums and seminars); preparation of the educational proposal for the EETEPA network; preparation of the Political-Pedagogical Project for the EETEPA network; guidelines for restructuring the political-pedagogical projects of the technological schools and for drawing up the plans for the technical courses; opening of a public call for subsequent, integrated and PROEJA courses; and the start of physical renovations of the technological units. The official curricular prescriptions for the EETEPA network were also identified and analyzed, namely: the Educational Proposal for the EETEPA Network; the Guidelines for the implementation of integrated technical secondary courses in the EETEPA network; and Specific Guidelines II: General Guidelines for Integrated Secondary Education. In analyzing these documents, I found that the educational proposal prescribed for the EETEPA network by COEP/DEMP-SEDUC is consistent with the proposal conceived by the Ministry of Education; both recover elements already disseminated in Brazilian educational thought since the 1980s, with the aim of laying the foundations of a unitary and polytechnic school, making explicit the philosophical conception that inspired the document. However, it was found that the informatics course plan of the Escola Técnica Magalhães Barata (located in the metropolitan region of Belém do Pará) did not present a proposal consistent with the foundations of integrated secondary education.

Relevance:

60.00%

Publisher:

Abstract:

The processes through which human groups are formed are being transformed by the new information technologies, which are renewing procedures, techniques and media. Today's generations have contact with machines, computers, passwords and access codes unknown to the older generations and to those who remain socially and economically distant from these technologies, media, codes and innovations. Black and impoverished people are those most removed from the geography of this vehicle and medium for learning, knowing, exchanging, dialoguing and obtaining power. Writing, energy and computer technology, interlinked, produce a devastating side effect associated with the accumulation of wealth and goods in the hands of a few; together they drive the harshest process of discrimination and social exclusion on the planet, affecting above all the populations traditionally made most vulnerable by racism and sexism, that is, Black people and women, who are the most impoverished on the planet.

Relevance:

60.00%

Publisher:

Abstract:

This article discusses the project of the Information Society and the discourses that underpin it, as part of a political and ideological conception universalized by the countries that created and dominate computer technology, which in turn is aligned with the post-Fordist industrial capitalist order and its emphasis on economic accumulation and consumerism. We explain how information technology creates routines and legitimizes social orders, taking as a case for analysis the Clinton-Gore policy in the United States, when the discourse of the computer society was associated with development and social welfare. This association is revealed in the speech Clinton delivered in the city of Knoxville in 1996. There we see the beginnings of the concern about the Digital Divide as a new form of "social disease" that prevents the passage to a better world centered on productivity, accumulation and consumption in information-dense societies. This generates a clash between the industrial, graphocentric world and oral, pre-industrial communities, as a result of attempting to transplant the institutional forms of the developed West. We explain the pillars of the new computerized order and how they replaced earlier epic narratives, creating techno-deterministic or techno-phobic discourses to the detriment of more critical approaches. We identify the effects of such deterministic discourses, which connote the association between the Information Society, welfare and development, and we question the urgency of deploying this system at a global level without profound critical discussion, clear goals focused on the benefit of human beings, and the open participation of the users of the system.

Relevance:

60.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

60.00%

Publisher:

Abstract:

With the advancement of computer technology and the availability of computer-aided design (CAD) tools, errors in designs are becoming smaller. In this context, the project aims to assess the reliability of a CNC machine designed by mechanical engineering students of the College of Engineering, UNESP Bauru, through the design, modeling, simulation and machining of an automotive airfoil. The profile selected for the study is a NACA 0012, machined in medium-density fiberboard (MDF) plates. A structural analysis will be performed using finite-element simulation together with CFD (Computational Fluid Dynamics) software, and the full-scale model will be tested in a wind tunnel. The results obtained in the wind tunnel and with the CFD software will be compared in order to assess the error introduced by the machining process.
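The NACA 0012 section mentioned above is defined analytically by the standard NACA four-digit thickness distribution, so the CAD profile to be machined can be generated directly from that formula. Below is a minimal sketch of such a profile generator; the chord length, point count and function name are illustrative assumptions, not values taken from the project.

import numpy as np

def naca0012_profile(chord_mm=200.0, n_points=100, closed_te=True):
    """Generate (x, y) coordinates of a symmetric NACA 0012 section.

    Standard NACA four-digit thickness distribution; chord length and point
    count are illustrative assumptions, not values from the study.
    """
    t = 0.12  # maximum thickness as a fraction of chord (the "12" in 0012)
    # -0.1036 closes the trailing edge (easier to machine); -0.1015 is the classical open-TE value.
    a4 = -0.1036 if closed_te else -0.1015
    x = np.linspace(0.0, 1.0, n_points)  # normalized chordwise coordinate
    yt = 5.0 * t * (0.2969 * np.sqrt(x)
                    - 0.1260 * x
                    - 0.3516 * x**2
                    + 0.2843 * x**3
                    + a4 * x**4)
    # Upper and lower surfaces of the symmetric section, scaled to the chord.
    upper = np.column_stack((x * chord_mm,  yt * chord_mm))
    lower = np.column_stack((x * chord_mm, -yt * chord_mm))
    return upper, lower

upper, lower = naca0012_profile()
print(upper[:3])  # first coordinates that a CAD/CAM tool could import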

Relevance:

60.00%

Publisher:

Abstract:

[ES] The main objective of this thesis is to understand and diagnose the actual situation of computing, and by extension of information and communication technologies (ICT), in the Portuguese education system at the pre-university levels, focusing on teachers and their academic and professional profile.

Relevance:

60.00%

Publisher:

Abstract:

The present research concerns the field of computer technology applied to supporting intellectual activities such as text translation, screenwriting and the organization of content for popular and educational courses, especially in connection with museum visits. The research started with an in-depth analysis of the cognitive process that characterizes a screenwriter at work. This choice was made because a screenplay is not only an aid to the realization of a show but, more generally, can be considered the planning of an educational, popularizing and formative intellectual activity. After this analysis, the research focused on the specific area of planning, describing and introducing topics related to the history of science, and in particular of computer science. To focus on this area it was essential to analyze the didactics of organizing museum visits, with the aim of identifying the guidelines a teacher should follow when planning the visit of a museum (a virtual museum of the history of computer science). The results of this research made it possible to design and implement an automatic support system for the description and production of a formative, educational and popularizing multimedia product (on the history of computer science). The resulting system provides the following features:
· management of multimedia slides (texts, video, audio or images), which can be classified according to the topic and to the profile of the user;
· automatic creation of a sequence of multimedia slides that introduces the topic;
· management of the interaction with the user in order to check and validate the product.
The most innovative aspect of the present research is that the product is built according to the profile of the user.
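As an illustration of the slide-selection idea in the feature list above, the sketch below filters multimedia slides by topic and user profile and orders them into a sequence. All class names, fields and sample data are assumptions made for illustration, not the actual data model of the system.

from dataclasses import dataclass

@dataclass
class Slide:
    title: str
    topic: str
    media: str       # "text", "video", "audio" or "image"
    profiles: set    # user profiles the slide is suitable for
    level: int       # position in the didactic progression

def build_sequence(slides, topic, user_profile):
    """Select the slides matching topic and profile, ordered by didactic level."""
    selected = [s for s in slides
                if s.topic == topic and user_profile in s.profiles]
    return sorted(selected, key=lambda s: s.level)

slides = [
    Slide("Babbage's Analytical Engine", "history of computing", "image", {"pupil", "adult"}, 1),
    Slide("From relays to transistors",  "history of computing", "video", {"adult"}, 2),
    Slide("Quiz: the first programmers", "history of computing", "text",  {"pupil"}, 3),
]
for slide in build_sequence(slides, "history of computing", "adult"):
    print(slide.title)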

Relevance:

60.00%

Publisher:

Abstract:

The rapid development of the computer industry through the continuous shrinking of transistors is bringing silicon technology ever closer to the limit beyond which tunneling processes in the transistors no longer permit further miniaturization or higher transistor densities in processors. The future of computer technology lies in the processing of quantum information. For the development of quantum computers, the detection and targeted manipulation of individual spins in solids is of paramount importance. Standard spin-detection methods such as ESR, however, only allow the detection of spin ensembles. The idea that should make the readout of individual spins possible is to carry out the manipulation separately from the detection. The NV− center is a special lattice defect in diamond that can be used as an atomic, optically readable magnetic-field sensor. By measuring its fluorescence, it should be possible to detect, via spin-spin coupling, the manipulation of other, optically undetectable "dark spins" in the immediate vicinity of the NV center. The proposed model of the quantum computer is based on N@C60 enclosed in SWCNTs. The peapods, as the units of nitrogen-containing fullerenes packed into carbon nanotubes are called, are intended to form the basis of the processing units of a truly scalable quantum computer. The computations carried out in them with the nitrogen electron spin are to be read out optically via the near-surface NV centers (in diamond plates) above which they are to be positioned. The primary goal of the present work was to optically detect, by means of ODMR coupling experiments, the coupling of near-surface single NV centers to the optically undetectable spins of radical molecules on the diamond surface, and thereby to take decisive steps towards the realization of a quantum register. An ODMR setup that was still in its development stage was rebuilt, and its existing functionality was verified on commercial NV-rich nanodiamonds. In the next step, the efficiency and mode of measurement were adapted to the detection and manipulation of single NV centers implanted near the surface (< 7 nm deep) of diamond plates. A very large part of the work, which can only partially be described here, consisted of adapting the existing control software to the requirements of practical measurement. The correct operation of all implemented pulse sequences and other software improvements was then verified by measurements on near-surface implanted single NV centers. The measurement station was also extended with the components required for double-resonance measurements, such as a controllable electromagnet and an RF signal source. Taking the thermal stability of N@C60 into account, an optical cryostat was also designed, built, integrated into the setup and characterized for future experiments. The spin-spin coupling experiments were carried out with the oxygen-stable galvinoxyl radical as a model system for coupling. Via the coupling to an NV center, the RF spectrum of the coupled radical spin was observed, and a Rabi nutation of the coupled spin could also be recorded. Further aspects of the peapod measurements and of surface implantation were also considered. It was investigated whether NV detection is disturbed by the SWCNTs, peapods or fullerenes. It turned out that the components of the planned quantum computer, apart from the C60 clusters, are not detectable in an ODMR measurement configuration and will not disturb the NV measurement. It was also examined which types of commercial diamond plates are suitable for surface implantation; a density of implanted NV centers suitable for the coupling measurements was estimated, and an implantation at the estimated density was considered.
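As a point of reference for the Rabi nutation mentioned above (a textbook relation for a resonantly driven spin, not a result reported in this thesis), the spin-flip probability oscillates as

    P_\mathrm{flip}(t) = \sin^2\!\left(\frac{\Omega_R\, t}{2}\right), \qquad \Omega_R = \gamma B_1 ,

so the nutation frequency \Omega_R recorded on the coupled radical spin scales linearly with the amplitude B_1 of the applied RF driving field.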

Relevance:

60.00%

Publisher:

Abstract:

This article presents the model of a multi-agent system (SMAF) whose objectives are the input of fuzzy incidents, as human experts express them, with different degrees of severity, and the subsequent search for and suggestion of solutions. The solutions are later confirmed or rejected by the users. The model was designed, implemented and tested in the telecommunications field, with heterogeneous agents in a cooperative model. In the design, different levels of abstraction were considered, according to the agents' objectives, the way they carry them out, and the environment in which they act. Each agent is modeled with a different spectrum of the knowledge base.
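A minimal sketch of the kind of structures the abstract describes is given below: an incident whose severity is expressed as fuzzy membership degrees over linguistic labels, and a ranking of candidate solutions by fuzzy similarity. All names (FuzzyIncident, suggest_solutions, the severity labels and the sample knowledge base) are illustrative assumptions, not identifiers from the SMAF system itself.

from dataclasses import dataclass

@dataclass
class FuzzyIncident:
    description: str
    severity: dict  # fuzzy membership degrees, e.g. {"medium": 0.7, "high": 0.4}

@dataclass
class Solution:
    action: str
    confirmed: bool = False  # users later confirm or reject the suggestion

def suggest_solutions(incident: FuzzyIncident, knowledge_base: dict) -> list:
    """Rank stored solutions by how well their severity profile matches the incident."""
    ranked = []
    for action, profile in knowledge_base.items():
        # Fuzzy similarity: sum of min-memberships over the labels of the profile.
        score = sum(min(incident.severity.get(label, 0.0), degree)
                    for label, degree in profile.items())
        ranked.append((score, Solution(action)))
    ranked.sort(key=lambda pair: pair[0], reverse=True)
    return [solution for _, solution in ranked]

kb = {"restart link card": {"medium": 0.8, "high": 0.3},
      "escalate to field crew": {"high": 0.9}}
incident = FuzzyIncident("intermittent packet loss on a trunk", {"medium": 0.7, "high": 0.4})
for solution in suggest_solutions(incident, kb):
    print(solution.action)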

Relevance:

60.00%

Publisher:

Abstract:

Abstract is not available

Relevance:

60.00%

Publisher:

Abstract:

The boundary element method (BEM) has been applied successfully to many engineering problems during the last decades. Compared with domain-type methods such as the finite element method (FEM) or the finite difference method (FDM), the BEM handles problems where the medium extends to infinity much more easily, as there is no need to develop special boundary conditions (quiet or absorbing boundaries) or infinite elements to limit the domain studied. The determination of the dynamic stiffness of arbitrarily shaped footings is just one of the fields where the BEM has been the method of choice, especially in the 1980s. With the continuous development of computer technology and the available hardware, the size of the problems under study grew and, as the flop count for solving the resulting linear system of equations grows with the third power of the number of equations, there was a need for iterative methods with better performance. The GMRES algorithm presented in [1] is now widely used in implementations of the collocation BEM. While the FEM results in sparsely populated coefficient matrices, the BEM leads, in general, to fully or densely populated ones, depending on the number of subregions, posing a serious memory problem even for today's computers. If the geometry of the problem permits the surface of the domain to be meshed with equally shaped elements, many of the resulting coefficients are calculated and stored repeatedly. The present paper shows how these unnecessary operations can be avoided, reducing both the calculation time and the storage requirement. To this end a similar coefficient identification algorithm (SCIA) has been developed and implemented in a program written in Fortran 90. The vertical dynamic stiffness of a single pile in layered soil has been chosen to test the performance of the implementation. The results obtained with the 3D model may be compared with those obtained with an axisymmetric formulation, which are considered the reference values as the mesh quality is much better. The entire 3D model comprises more than 35000 dofs, the biggest single region being a soil region with 21168 dofs. Note that the memory needed to store all coefficients of this single region is about 6.8 GB, an amount usually not available on personal computers. In the problem under study, the interface zone between the two adjacent soil regions as well as the surface of the top layer may be meshed with equally sized elements. In this case the application of the SCIA leads to an important reduction in memory requirements: the maximum memory used during the calculation has been reduced to 1.2 GB. The application of the SCIA thus permits problems to be solved on personal computers which would otherwise require much more powerful hardware.
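The coefficient-reuse idea behind the SCIA can be illustrated with a simple caching scheme: if two collocation-point/element pairs have the same relative geometry, their influence coefficients are identical and only need to be computed and stored once. The sketch below is an illustration of that idea under stated assumptions (the names, the rounding-based key and the dummy integration routine are not taken from the Fortran 90 implementation described in the paper).

import numpy as np

def geometry_key(collocation_point, element_nodes, tol=1e-9):
    """Key describing the element shape and its position relative to the collocation point."""
    rel = np.asarray(element_nodes, dtype=float) - np.asarray(collocation_point, dtype=float)
    return tuple(np.round(rel / tol).astype(np.int64).ravel())

def influence_coefficient(collocation_point, element_nodes, cache, integrate):
    """Return the influence coefficient, reusing a cached value when the geometry repeats."""
    key = geometry_key(collocation_point, element_nodes)
    if key not in cache:
        cache[key] = integrate(collocation_point, element_nodes)  # expensive quadrature
    return cache[key]

# With a regular mesh of equally shaped elements most pairs map to a small set of
# keys, so both the flop count and the number of stored coefficients drop sharply.
cache = {}
dummy_integrate = lambda p, e: 1.0 / (1.0 + np.linalg.norm(np.mean(e, axis=0) - np.asarray(p)))
coefficient = influence_coefficient((0.0, 0.0, 0.0),
                                    [(1.0, 0.0, 0.0), (1.0, 1.0, 0.0), (1.0, 1.0, 1.0)],
                                    cache, dummy_integrate)
print(coefficient, len(cache))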

Relevance:

60.00%

Publisher:

Abstract:

It is difficult, if not impossible, to find something that is not changing in computer technology: circuits, architectures, languages, methods, fields of application... The "central object" itself of this brand of engineering, software, represents such a diverse reality (many objects) that the fact that it has only one name gives rise to considerable confusion. This issue, among others, was taken up by Fox (1) and, at this point, I would like to underline that it is more of a pragmatic issue than an academic one. Thus, Software Engineering Education moves in an unstable, undefined world. This axiom governs and limits the validity of all educational proposals in the area of Software Engineering and, therefore, all the ideas presented in this paper.

Relevance:

60.00%

Publisher:

Abstract:

Geologic storage of carbon dioxide (CO2) has been proposed as a viable means for reducing anthropogenic CO2 emissions. Once injection begins, a program for measurement, monitoring, and verification (MMV) of the CO2 distribution is required in order to: a) research key features, effects and processes needed for risk assessment; b) manage the injection process; c) delineate and identify leakage risk and surface escape; d) provide early warnings of failure near the reservoir; and e) verify storage for accounting and crediting. The selection of the monitoring methodology (site characterization, and control and verification in the post-injection phase) is influenced by economic and technological variables. Multiple Criteria Decision Making (MCDM) refers to a methodology developed for making decisions in the presence of multiple criteria. MCDM as a discipline has a relatively short history of about 40 years, and its development has been closely tied to advances in computer technology. Evaluation methods and multicriteria decisions include the selection of a set of feasible alternatives, the simultaneous optimization of several objective functions, and a decision-making process and evaluation procedures that must be rational and consistent. The application of a mathematical decision-making model helps to find the best solution, establishing mechanisms to manage the information generated by several disciplines. Problems in which the decision alternatives are finite are called discrete multicriteria decision problems. Such problems are the most common in practice, and this is the scenario applied here to the problem of selecting sites for storing CO2. Discrete MCDM is used to assess and decide on issues that by nature or design support a finite number of alternative solutions. Recently, Multicriteria Decision Analysis has been applied to rank policy incentives for CCS, to assess the role of CCS, and to select potential areas that could be suitable for storage. For these reasons, MCDM has been considered for the monitoring phase of CO2 storage, in order to select suitable technologies that could be techno-economically viable. In this paper, we identify subsurface gas-measurement techniques currently applied in the characterization (pre-injection) phase; MCDM will help decision-makers rank the most suitable technique for monitoring each specific physico-chemical parameter.
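A minimal sketch of a discrete weighted-sum ranking, one of the simplest MCDM evaluation schemes alluded to above, is given below. The monitoring techniques, criteria, scores and weights are purely illustrative placeholders, not data or results from this study.

import numpy as np

techniques = ["soil-gas probes", "tracer sampling", "downhole gas sensors"]
criteria   = ["cost", "sensitivity", "maturity"]   # scores normalized so that higher is better
weights    = np.array([0.4, 0.4, 0.2])             # criterion weights, summing to 1

# Rows correspond to techniques, columns to criteria, all scores in [0, 1].
scores = np.array([[0.8, 0.5, 0.9],
                   [0.4, 0.9, 0.6],
                   [0.3, 0.8, 0.7]])

ranking = scores @ weights  # weighted-sum aggregation
for name, value in sorted(zip(techniques, ranking), key=lambda pair: -pair[1]):
    print(f"{name}: {value:.2f}")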