992 results for Four basic operations
Abstract:
The main objective of this research was to analyze the possibility of knowledge elaboration and re-elaboration of the ideas and algorithmic procedures related to the basic operations by 6th-grade elementary school pupils in a meaningful learning process. The study was based on a methodological intervention carried out in a 6th-grade class of a municipal elementary school in the city of João Pessoa, PB. Its central steps were the application of pre-tests (1 and 2); semi-structured interviews with the pupils involved in the in-depth study of the theme; the design and development of teaching activities grounded in meaningful learning; and the application of a post-test. The data collected in the pre-tests (1 and 2) showed a low level of pupil comprehension of the contents related to the four operations. The answers to the post-test questions were analyzed mainly from a qualitative point of view, based on the theory of comprehension of mathematical concepts proposed by Skemp (1980), with data collected through the interviews as a complementary source. The analysis of the post-test results showed that most pupils reached a relational comprehension of the ideas and algorithmic procedures related to addition, subtraction, multiplication, and division. These results showed that a teaching methodology that privileges content comprehension, taking into account pupils' previous knowledge and reflection on action throughout the proposed activities, made possible the elaboration or re-elaboration of knowledge by the pupils regarding the contents adopted as the theme of this research.
Abstract:
Learning is a continuous process permeated by constructions and reconstructions of knowledge. With the introduction of the computer into the teaching-learning process, together with an analysis of approaches from Educational Psychology and Mathematics Education, it was possible in this work to develop a computational prototype aimed at supporting the learning of mathematics. This prototype is an interactive computational environment to support the learning of the four basic operations (addition, subtraction, multiplication and division), a subject of great repercussion in the school environment since, if not learned properly, it causes serious problems in the evolution of the student's mathematical learning. The work involves four stages: theoretical aspects of the teaching-learning process, with emphasis on the constructivist approach; the teaching-learning process of mathematics, its difficulties and the prospects for change through computer-aided learning; the design and modeling of the prototype; and the results obtained during its application, results favorable to the initial proposal of the work.
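To make the idea concrete, here is a minimal sketch, assuming a console drill rather than the authors' actual interactive environment (all names here are hypothetical), of an exercise generator for the four basic operations:

```python
# Minimal sketch (hypothetical, not the authors' prototype): a console drill
# for the four basic operations, generating exercises and checking answers.
import random

OPERATIONS = {
    "+": lambda a, b: a + b,
    "-": lambda a, b: a - b,
    "*": lambda a, b: a * b,
    "/": lambda a, b: a // b,  # integer division keeps answers whole
}

def make_exercise():
    """Return a (prompt, answer) pair for a random basic operation."""
    op = random.choice(list(OPERATIONS))
    a, b = random.randint(1, 12), random.randint(1, 12)
    if op == "/":
        a = a * b  # guarantee an exact quotient
    return f"{a} {op} {b} = ?", OPERATIONS[op](a, b)

if __name__ == "__main__":
    prompt, answer = make_exercise()
    reply = int(input(prompt + " "))
    print("Correct!" if reply == answer else f"Not quite; it is {answer}.")
```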
Abstract:
Novel input modalities such as touch, tangibles or gestures try to exploit humans' innate skills rather than imposing new learning processes. However, despite the recent boom of different natural interaction paradigms, it has not been systematically evaluated how these interfaces influence a user's performance, or whether each interface could be more or less appropriate for: 1) different age groups; and 2) different basic operations, such as data selection, insertion or manipulation. This work presents the first step of an exploratory evaluation of whether or not users' performance is indeed influenced by the different interfaces. The key point is to understand how different interaction paradigms affect specific target audiences (children, adults and older adults) when dealing with a selection task. Sixty participants took part in this study to assess how different interfaces may influence the interaction of specific groups of users with regard to their age. Four input modalities were used to perform a selection task, and the methodology was based on usability testing (speed, accuracy and user preference). The study suggests a statistically significant difference between mean selection times for each group of users, and also raises new issues regarding the “old” mouse input versus the “new” input modalities.
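The reported group comparison can be illustrated with a short sketch; a one-way ANOVA on mean selection times is an assumption about the analysis, and the numbers are invented for illustration:

```python
# Sketch of the kind of analysis described (assumed, not the authors' code):
# comparing mean selection times across age groups with a one-way ANOVA.
from scipy import stats

# Hypothetical selection times in seconds, one list per age group.
children     = [2.1, 2.4, 1.9, 2.6, 2.2]
adults       = [1.4, 1.2, 1.6, 1.3, 1.5]
older_adults = [3.0, 2.8, 3.4, 3.1, 2.9]

f_stat, p_value = stats.f_oneway(children, adults, older_adults)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# p < 0.05 would indicate a statistically significant difference
# between the groups' mean selection times.
```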
Abstract:
Transaction processing is generally regarded as one of the basic requirements of reliable computing. A transaction is a sequence of operations whose execution can be treated as a single logical action. Four basic rules, known by the abbreviation ACID, govern the execution of transactions. In distributed systems the need for transaction processing grows further, because the success of operations cannot be ensured by local methods alone. Several attempts have been made to standardize distributed transaction processing, but despite these attempts no generally accepted and open standards exist. The closest to such a standard is probably the X/Open DTP family of standards, and in particular the XA standard it includes. This work also studies how the vendor-independent database architecture of the Intellitel ONE system should be developed so as to enable it to run applications that require transaction processing.
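The atomicity rule (the "A" in ACID) can be sketched with Python's built-in sqlite3 module; this is illustrative only and unrelated to Intellitel ONE or the XA protocol:

```python
# Minimal sketch of transactional atomicity: either both updates of a
# transfer apply, or neither does.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.execute("INSERT INTO accounts VALUES ('alice', 100), ('bob', 0)")
conn.commit()

try:
    with conn:  # opens a transaction; commits on success, rolls back on error
        conn.execute("UPDATE accounts SET balance = balance - 150 "
                     "WHERE name = 'alice'")
        cur = conn.execute("SELECT balance FROM accounts WHERE name = 'alice'")
        if cur.fetchone()[0] < 0:
            raise ValueError("insufficient funds")
        conn.execute("UPDATE accounts SET balance = balance + 150 "
                     "WHERE name = 'bob'")
except ValueError:
    pass  # the whole transfer was rolled back as one logical unit

print(conn.execute("SELECT * FROM accounts").fetchall())  # balances unchanged
```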
Abstract:
The remarks that I have prepared deal with direct contacts selling pest and bird control programs. I am going to limit my remarks to what I feel are the more important aspects of selling Bird Control. I think it is safe to say that one of the most difficult aspects of selling for most sales personnel is prospecting, that is, finding accounts to call on. Our sales personnel have to more or less come up with their own leads. They have to find out who to contact once they get there. I have found that the best prospects most of us have for selling Bird Control accounts are our present pest control accounts. Generally speaking, we try to maintain contact with our applicators in the field, who are in these accounts every day, asking them if there are any of their accounts that are having bird control problems. Another method of finding potential accounts is driving around looking. It is more difficult to drive around and look for rat and/or roach problems, but generally speaking if a building or some type of business has a bird problem, it is fairly easy to locate. Another thing we can do is call on specific accounts. There are generally certain accounts that just by the manufacturing process do attract birds, for example: food plants, mills, beet plants, grain elevators, food processors, and so on. Other types of operations which lend themselves to bird problems are industrial plants, because of the superstructure (physical plant) that they have. Sub-stations and power plants are very attractive to birds. Some other situations that should be checked for bird problems are lumber yards and contractors' storage buildings. After deciding on a contact we get into what I call my basic four. There are four basic things that I try to impress upon our personnel to keep in mind when they go in to make a contact. The first one is the interview, or actually making the contact so that you get an opportunity to have the interview, either calling for an appointment or making a "cold" call. The second one is closing for the survey. The third one is making the survey and preparing a proposal. The fourth and last one is the proposal presentation and closing of the sale. An additional item, which would make a basic five: after you make the sale, don't forget to follow up on the sale.
Abstract:
This paper was presented at the 12th International Conference on Applications of Computer Algebra, Varna, Bulgaria, June 2006.
Abstract:
Universidade Estadual de Campinas. Faculdade de Educação Física
Abstract:
Three-dimensional modeling of piezoelectric devices requires a precise knowledge of piezoelectric material parameters. The commonly used piezoelectric materials belong to the 6mm symmetry class, which have ten independent constants. In this work, a methodology to obtain precise material constants over a wide frequency band through finite element analysis of a piezoceramic disk is presented. Given an experimental electrical impedance curve and a first estimate for the piezoelectric material properties, the objective is to find the material properties that minimize the difference between the electrical impedance calculated by the finite element method and that obtained experimentally by an electrical impedance analyzer. The methodology consists of four basic steps: experimental measurement, identification of vibration modes and their sensitivity to material constants, a preliminary identification algorithm, and final refinement of the material constants using an optimization algorithm. The application of the methodology is exemplified using a hard lead zirconate titanate piezoceramic. The same methodology is applied to a soft piezoceramic. The errors in the identification of each parameter are statistically estimated in both cases, and are less than 0.6% for elastic constants, and less than 6.3% for dielectric and piezoelectric constants.
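The final refinement step can be sketched as a least-squares fit of model parameters to the measured impedance curve. In the sketch below, a toy one-mode resonator stands in for the full finite element simulation, and two parameters stand in for the ten 6mm-class constants; these are illustrative assumptions, not the authors' code:

```python
# Sketch of the refinement step: minimize the difference between modeled
# and "measured" impedance curves over the material parameters.
import numpy as np
from scipy.optimize import least_squares

freqs = np.linspace(0.8e6, 1.2e6, 400)               # Hz, measurement band

def model_impedance(params, f):
    """Toy stand-in for the FE-computed electrical impedance magnitude."""
    f_res, q = params                                 # resonance freq, Q factor
    return 1.0 / np.sqrt(1.0 + (q * (f / f_res - f_res / f)) ** 2)

# Synthetic "measured" curve with noise, in place of analyzer data.
true_params = np.array([1.00e6, 50.0])
rng = np.random.default_rng(0)
z_measured = model_impedance(true_params, freqs) + rng.normal(0, 0.01, freqs.size)

def residual(params):
    return model_impedance(params, freqs) - z_measured

first_estimate = np.array([0.95e6, 30.0])            # preliminary identification
fit = least_squares(residual, first_estimate)
print("refined parameters:", fit.x)                  # close to true_params
```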
Abstract:
Societal changes have, throughout history, pushed the long-established boundaries of education across all grade levels. Technology and media merge with education in a continuous complex social process with human consequences and effects. We, teachers, can aspire to understand and interpret this volatile context that is being redesigned at the same time society itself is being reshaped as a result of the technological evolution. The language-learning classroom is not impenetrable to these transformations. Rather, it can perhaps be seen as a playground where teachers and students gather to combine the past and the present in an integrated approach. We draw on the results from a previous study and argue that Digital Storytelling as a Process is capable of aggregating and fostering positive student development in general, as well as enhancing interpersonal relationships and self-knowledge while improving digital literacy. Additionally, we establish a link between the four basic language-learning skills and the Digital Storytelling process and demonstrate how these converge into what can be labeled as an integrated language learning approach.
Abstract:
BACKGROUND: Accurate catalogs of structural variants (SVs) in mammalian genomes are necessary to elucidate the potential mechanisms that drive SV formation and to assess their functional impact. Next generation sequencing methods for SV detection are an advance on array-based methods, but are almost exclusively limited to four basic types: deletions, insertions, inversions and copy number gains. RESULTS: By visual inspection of 100 Mbp of genome to which next generation sequence data from 17 inbred mouse strains had been aligned, we identify and interpret 21 paired-end mapping patterns, which we validate by PCR. These paired-end mapping patterns reveal a greater diversity and complexity in SVs than previously recognized. In addition, Sanger-based sequence analysis of 4,176 breakpoints at 261 SV sites reveal additional complexity at approximately a quarter of structural variants analyzed. We find micro-deletions and micro-insertions at SV breakpoints, ranging from 1 to 107 bp, and SNPs that extend breakpoint micro-homology and may catalyze SV formation. CONCLUSIONS: An integrative approach using experimental analyses to train computational SV calling is essential for the accurate resolution of the architecture of SVs. We find considerable complexity in SV formation; about a quarter of SVs in the mouse are composed of a complex mixture of deletion, insertion, inversion and copy number gain. Computational methods can be adapted to identify most paired-end mapping patterns.
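The classification of a paired-end mapping pattern into one of the four basic SV types can be sketched from insert size and read orientation; this is a simplified illustration of the general signature logic, not the paper's computational method:

```python
# Sketch (assumed, simplified) of assigning a paired-end mapping pattern
# to one of the four basic SV types.
def classify_pair(insert_size, expected, tolerance, orientation,
                  depth_ratio=1.0):
    """orientation '+-' is the expected forward/reverse pair layout."""
    if orientation in ("++", "--"):
        return "inversion"          # one mate maps to the wrong strand
    if insert_size > expected + tolerance:
        return "deletion"           # mates span sequence missing from the sample
    if insert_size < expected - tolerance:
        return "insertion"          # extra sequence pushed the mates together
    if depth_ratio > 1.5:
        return "copy number gain"   # read depth, not span, signals the gain
    return "reference-consistent"

print(classify_pair(900, expected=400, tolerance=150, orientation="+-"))
# -> deletion
```

The 21 patterns the authors identify show that real SVs often mix these elementary signatures, which is why visual inspection and breakpoint sequencing were needed.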
Abstract:
This work extends the Java implementation of data structures begun by Esteve Mariné, using his basic design. Specifically, the following structures were programmed: a) disjoint sets, using the linked-list and tree-based algorithms; b) heaps, with the binary, binomial and Fibonacci algorithms; and c) search trees based on the red-black binary tree algorithm, which complements the two already existing ones based on chaining and AVL algorithms. To examine the evolution of the structures, an interactive graphical visualizer was prepared that lets the user perform the basic operations of each structure. With this environment it is possible to save the structures, replay them, and undo and redo the operations performed on a structure. Finally, the work contributes a methodology, with graphical visualization, for the comparative evaluation of the implemented algorithms, which allows evaluation parameters such as the number of elements to process, the algorithms to compare, and the number of repetitions to be modified. The data obtained can be exported for later analysis.
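The project's implementation is in Java; as a compact illustration of the tree-based disjoint-set structure it describes, here is a Python sketch with union by rank and path compression:

```python
# Sketch of the tree-based disjoint-set (union-find) structure,
# with path compression and union by rank.
class DisjointSets:
    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x):
        if self.parent[x] != x:
            self.parent[x] = self.find(self.parent[x])  # path compression
        return self.parent[x]

    def union(self, x, y):
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return
        if self.rank[rx] < self.rank[ry]:
            rx, ry = ry, rx
        self.parent[ry] = rx                            # attach shorter tree
        if self.rank[rx] == self.rank[ry]:
            self.rank[rx] += 1

ds = DisjointSets(5)
ds.union(0, 1); ds.union(1, 2)
print(ds.find(2) == ds.find(0))  # True: 0, 1 and 2 now share one set
```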
Abstract:
As stated in Aitchison (1986), a proper study of relative variation in a compositional data set should be based on logratios, and dealing with logratios excludes dealing with zeros. Nevertheless, it is clear that zero observations might be present in real data sets, either because the corresponding part is completely absent –essential zeros– or because it is below detection limit –rounded zeros. Because the second kind of zeros is usually understood as “a trace too small to measure”, it seems reasonable to replace them by a suitable small value, and this has been the traditional approach. As stated, e.g. by Tauber (1999) and by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000), the principal problem in compositional data analysis is related to rounded zeros. One should be careful to use a replacement strategy that does not seriously distort the general structure of the data. In particular, the covariance structure of the involved parts –and thus the metric properties– should be preserved, as otherwise further analysis on subpopulations could be misleading. Following this point of view, a non-parametric imputation method is introduced in Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000). This method is analyzed in depth by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2003), where it is shown that the theoretical drawbacks of the additive zero replacement method proposed in Aitchison (1986) can be overcome using a new multiplicative approach on the non-zero parts of a composition. The new approach has reasonable properties from a compositional point of view. In particular, it is “natural” in the sense that it recovers the “true” composition if replacement values are identical to the missing values, and it is coherent with the basic operations on the simplex. This coherence implies that the covariance structure of subcompositions with no zeros is preserved. As a generalization of the multiplicative replacement, in the same paper a substitution method for missing values on compositional data sets is introduced.
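The multiplicative replacement itself is short: rounded zeros become a small imputed value δ and the non-zero parts are rescaled multiplicatively so the composition still closes to 1, leaving the ratios among non-zero parts untouched. A minimal sketch, assuming the composition sums to 1 and a common δ for every zero:

```python
# Sketch of the multiplicative rounded-zero replacement of
# Martín-Fernández, Barceló-Vidal and Pawlowsky-Glahn (2003).
import numpy as np

def multiplicative_replacement(x, delta):
    """x: composition summing to 1; delta: value imputed for each rounded zero."""
    x = np.asarray(x, dtype=float)
    zeros = x == 0
    r = x * (1.0 - delta * zeros.sum())   # shrink non-zero parts multiplicatively
    r[zeros] = delta                      # impute the rounded zeros
    return r

x = np.array([0.6, 0.3, 0.0, 0.1])
print(multiplicative_replacement(x, delta=0.005))
# Sums to 1; ratios among the non-zero parts are unchanged, which is why
# the covariance structure of zero-free subcompositions is preserved.
```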
Abstract:
Management control is now oriented to acting before undesirable events occur, ensuring that the objectives established by management are achieved within the fixed timing. Besides, management control must be the engine that makes it possible to achieve the best performance in the company's critical areas, not only in the economic and financial domain, but also in the areas of growth, security and productivity. One of the most important concerns of today's administrations is to determine whether the organization's performance is in line with what was previously established, that is, its objectives and targets. This performance would be verified through effective methods and performance measurement systems. In this context, the present work is an exploratory and descriptive study that identifies and investigates how the banking institutions of Cape Verde manage certain aspects, particularly performance measurement and strategic control, and which indicators they use. Although the specific objectives of this work are others, we also pay special attention to the characteristics of the Capeverdean market and to the importance of the banking sector for the economy. Finally, we present the Balanced Scorecard as a tool capable of overcoming the difficulties of performance measurement, together with the set of indicators we consider most appropriate. At this point, we concentrate on the four basic perspectives and the strategy map, noting the role of the Balanced Scorecard in strategic alignment and in the measurement of organizational performance. We conclude the study with an interview with an expert (a Financial Manager) of one of the local banks, whose name we promised not to publish. In this way, we hope to contribute to a better perception of the reality under study, both from the theoretical point of view and from the verification of practices in the sector.
Abstract:
This paper surveys asset allocation methods that extend the traditional approach. An important feature of the traditional approach is that it measures the risk and return tradeoff in terms of the mean and variance of final wealth. However, there are also other important features that are not always made explicit in terms of the investor's wealth, information, and horizon: the investor makes a single portfolio choice based only on the mean and variance of her final financial wealth, and she knows the relevant parameters in that computation. First, the paper describes traditional portfolio choice based on four basic assumptions, while the rest of the sections extend those assumptions. Each section describes the corresponding equilibrium implications in terms of portfolio advice and asset pricing.
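The single-period mean-variance choice that the paper starts from has a closed form: with known mean vector μ, covariance Σ, and risk aversion γ, the unconstrained optimum is w* = (1/γ) Σ⁻¹ μ. A worked sketch with illustrative numbers (not taken from the paper):

```python
# Worked sketch of the traditional mean-variance portfolio choice:
# w* = (1/gamma) * Sigma^{-1} mu, with invented example inputs.
import numpy as np

mu = np.array([0.05, 0.08])               # expected excess returns
sigma = np.array([[0.04, 0.01],
                  [0.01, 0.09]])          # covariance of returns
gamma = 4.0                               # risk aversion

w = np.linalg.solve(sigma, mu) / gamma    # optimal risky-asset weights
print(w)                                  # remainder (1 - w.sum()) sits in cash
```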
Abstract:
Interaction analysis is not a prerogative of any discipline in social sciences. It has its own history within each disciplinary field and is related to specific research objects. From the standpoint of psychology, this article first draws upon a distinction between factorial and dialogical conceptions of interaction. It then briefly presents the basis of a dialogical approach in psychology and focuses upon four basic assumptions. Each of them is examined on a theoretical and on a methodological level with a leading question: to what extent is it possible to develop analytical tools that are fully coherent with dialogical assumptions? The conclusion stresses the difficulty of developing methodological tools that are fully consistent with dialogical assumptions and argues that there is an unavoidable tension between accounting for the complexity of an interaction and using methodological tools which necessarily "monologise" this complexity.