1000 results for CNPQ::CIENCIAS EXATAS E DA TERRA::MATEMATICA: ENSINO DE CI
Abstract:
This dissertation aims to suggest to high school mathematics teachers a way of teaching logic to their students. To this end, it uses a teaching sequence that explores the mathematical concepts involved in the operation of a calculator, one of the greatest symbols of mathematics.
Abstract:
This work presents discussions on the teaching of Chemical Bonds in high school and some implications of this approach for students' learning of chemistry. In general, understanding how chemical elements combine to form substances and compounds is a key point for understanding the properties of substances and their structure. In this sense, chemical bonds represent an extremely important topic, and knowledge of them is essential for a better understanding of the changes occurring in our world. Despite these findings, it is observed that the way this concept is discussed in chemistry classes has contributed, paradoxically, to the emergence of several alternative conceptions, hindering students' understanding of the subject. It is believed that one explanation for these observations is the exclusive use of the "octet rule" as an explanatory model for Chemical Bonds. Over time, the use of such a model eventually replaces the chemical principles that gave rise to it, transforming knowledge into a series of rituals that are uninteresting and even confusing for students. Based on these findings, a reformulation of the way this content is approached in the classroom is deemed necessary, taking into account especially the fact that explanations of the formation of substances should be based on the concept of energy, which is fundamental to understanding how atoms combine. Thus, the main research question described here is the following: is it possible to develop an explanatory model for Chemical Bonds in high school based on the concept of energy and without the need to use the "octet rule"? Based on the concepts and methodologies of modeling activities, the development of a teaching model was pursued through Teaching Units designed to support high school teachers in addressing chemical bonds through the concept of energy.
Through this work it is intended that the process of teaching and learning the Chemical Bonds content becomes more meaningful to students, by developing models that contribute to the learning of this and, consequently, of other basic fundamentals of chemistry.
Abstract:
This work studies the van Hiele model, the levels of development of geometric thinking, and its learning phases. Using this knowledge, we prepared a Research Instrument to identify the Level of Development of Geometric Thinking (van Hiele Levels) of Middle School students with respect to the content of Polygons. We applied this Research Instrument to 237 students from a public (state) school in Curitiba and analyzed the acquired data. We then improved the Instrument's questions so that it can be used by teachers in class, helping to identify the level at which each student stands with respect to the proposed content.
Abstract:
The aim of the present work is to investigate the interest and motivation for learning awakened in pupils when the educator's practice is guided by the ethnomathematics perspective. The main question is: can an ethnomathematical approach awaken enthusiasm in pupils, causing them to become more critical and active in building their knowledge? The methodology guiding the investigation is qualitative, based on techniques drawn from the ethnographic case study. The theoretical contributions that support the investigation come from scientific methodology and from ethnomathematics. The research material is composed of the researcher's field diary, audio recordings of participant observation, interview reports of community residents and students' parents, and, notably, the material produced by the students. This study was developed in an 8th-year class of a rural community school. During the work, priority was given to the concepts of the Ethnomathematics Program, which establishes a link of exchange in which educators insert themselves into the reality of their pupils in a way that promotes an appreciation of their identity and a commitment to their learning. The educator investigates and values the ideas of pupils through dialogue. There are challenges to applying education from an ethnomathematical perspective, pointed out by several authors, which are listed and supplemented in the research. In this context, it is believed that socio-cultural knowledge must be respected, and that as its specificities, capabilities, and characteristics come to be understood, it can guide teaching practice, making the process meaningful for pupils and enabling the appropriation of scientific knowledge. Analysis of the research practice indicated that the students who were research subjects felt appreciated when they solved issues contextualized in their way of life.
The conclusion is that, through continuous contextualization of school mathematics, starting from recognition of the environment and of cultural identity, the educator has the opportunity to review their own condition as a participant and thereby promote enthusiasm for learning, because a motivated pupil becomes active, provided that the whole project is guided by a meaningful theme.
Abstract:
In this work, we study and compare two percolation algorithms, one elaborated by Elias and the other by Newman and Ziff, using theoretical tools of algorithm complexity and a program that performs an experimental comparison. The work is divided into three chapters. The first covers definitions and theorems necessary for a more formal mathematical study of percolation. The second presents the techniques used for estimating the complexity of algorithms, namely: worst case, best case, and average case. We use the worst-case technique to estimate the complexity of both algorithms so that we can compare them. The last chapter shows several characteristics of each algorithm and, through the theoretical estimate of the complexity and the comparison of the execution times of the most important part of each one, compares these two important algorithms for simulating percolation.
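As a point of reference for the algorithms compared above, the following is a minimal Python sketch (not the dissertation's implementations) of the core idea of the Newman–Ziff algorithm: sites of an L×L square lattice are occupied one at a time in random order and clusters are merged with a union-find structure, so each addition costs nearly O(1) amortized; two virtual nodes detect top-to-bottom spanning.

```python
import random

def newman_ziff_site(L, seed=0):
    """Occupy the L*L sites of a square lattice one at a time in random
    order, merging clusters with union-find; return the fraction of
    occupied sites at which a cluster first spans top to bottom."""
    n = L * L
    TOP, BOT = n, n + 1              # virtual nodes for spanning detection
    parent = list(range(n + 2))      # union-find forest
    size = [1] * (n + 2)
    occupied = [False] * n

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    def union(i, j):
        ri, rj = find(i), find(j)
        if ri == rj:
            return
        if size[ri] < size[rj]:             # union by size
            ri, rj = rj, ri
        parent[rj] = ri
        size[ri] += size[rj]

    order = list(range(n))
    random.Random(seed).shuffle(order)

    for t, s in enumerate(order, start=1):
        occupied[s] = True
        r, c = divmod(s, L)
        if r == 0:
            union(s, TOP)
        if r == L - 1:
            union(s, BOT)
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < L and 0 <= nc < L and occupied[nr * L + nc]:
                union(s, nr * L + nc)
        if find(TOP) == find(BOT):          # a spanning cluster appeared
            return t / n
    return 1.0

p_span = newman_ziff_site(32)
```

For large L a single run of `p_span` concentrates near the site-percolation threshold of the square lattice (about 0.593); at L = 32 it still fluctuates noticeably from seed to seed.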
Abstract:
In this work, we study the survival cure rate model proposed by Yakovlev et al. (1993), based on a structure of competing risks concurring to cause the event of interest, and the approach proposed by Chen et al. (1999), in which covariates are introduced to model the number of risks. We focus on covariate measurement error, considering the corrected score method in order to obtain consistent estimators. A simulation study is carried out to evaluate the behavior of the estimators obtained by this method for finite samples. The simulation aims to identify the impact not only on the regression coefficients of the covariates measured with error (Mizoi et al., 2007) but also on the coefficients of covariates measured without error. We also verify the adequacy of the piecewise exponential distribution for the cure rate model with measurement error. Finally, applications of the model to real data are presented.
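The corrected score machinery itself is beyond the scope of an abstract, but the phenomenon it addresses can be shown in a few lines. The sketch below is an illustrative simulation, not the dissertation's model: a plain linear model stands in for the cure rate model, showing the attenuation of a naive slope estimate when the covariate is observed with error, and the classical moment correction that subtracts the (assumed known) error variance.

```python
import numpy as np

rng = np.random.default_rng(0)
n, beta, sigma_u = 200_000, 2.0, 0.7       # sigma_u: known measurement-error sd

x = rng.normal(size=n)                      # true covariate
w = x + rng.normal(scale=sigma_u, size=n)   # covariate observed with error
y = beta * x + rng.normal(size=n)           # response

# naive estimate regresses y on w and is attenuated toward zero:
# E[beta_naive] = beta / (1 + sigma_u**2) ~= 1.34 here
beta_naive = np.cov(w, y)[0, 1] / np.var(w)

# moment correction: remove the error variance from the denominator
beta_corr = np.cov(w, y)[0, 1] / (np.var(w) - sigma_u**2)
```

The corrected score method generalizes this idea to likelihood-based estimation, correcting the score function rather than a moment ratio.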
Abstract:
In this work we present an exposition of the mathematical theory of orthogonal compactly supported wavelets in the context of multiresolution analysis. These wavelets are particularly attractive because they lead to a stable and very efficient algorithm, namely the Fast Wavelet Transform (FWT). One of our objectives is to develop efficient algorithms for calculating the wavelet coefficients (FWT) through Mallat's pyramid algorithm and to discuss its connection with filter banks. We also study the concept of multiresolution analysis, which is the context in which wavelets can be understood and built naturally, an important step in the passage from the mathematical universe (continuous domain) to the universe of representation (discrete domain).
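To make the pyramid idea concrete, here is a minimal sketch of one analysis/synthesis stage of Mallat's algorithm using the Haar pair, the simplest orthogonal compactly supported wavelet: the signal is split into low-pass (scaling) and high-pass (wavelet) coefficients by filtering and downsampling, and perfect reconstruction is recovered by the synthesis step. The full FWT simply iterates the analysis stage on the approximation branch.

```python
import numpy as np

SQRT2 = np.sqrt(2.0)

def haar_analysis(x):
    """One Mallat pyramid stage with the Haar filters: returns the
    approximation (low-pass) and detail (high-pass) coefficients."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / SQRT2    # scaling coefficients
    d = (x[0::2] - x[1::2]) / SQRT2    # wavelet coefficients
    return a, d

def haar_synthesis(a, d):
    """Inverse stage: the two branches are upsampled and interleaved,
    giving perfect reconstruction since the filter bank is orthogonal."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / SQRT2
    x[1::2] = (a - d) / SQRT2
    return x

signal = np.array([4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0])
a, d = haar_analysis(signal)
rec = haar_synthesis(a, d)
```

Because the transform is orthogonal, it also preserves energy (Parseval), which is the "stability" referred to in the abstract.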
Abstract:
This work presents a brief discussion of methods to estimate the parameters of the Generalized Pareto Distribution (GPD). The following techniques are addressed: Moments (MOM), Maximum Likelihood (MLE), Biased Probability Weighted Moments (PWMB), Unbiased Probability Weighted Moments (PWMU), Minimum Density Power Divergence (MDPD), Median (MED), Pickands (PICKANDS), Maximum Penalized Likelihood (MPLE), Maximum Goodness-of-Fit (MGF), and the Maximum Entropy (POME) technique, the focus of this manuscript. By way of illustration, Generalized Pareto fits were made for a sequence of intraplate earthquakes that occurred in the city of João Câmara, in the northeastern region of Brazil, which was monitored continuously for two years (1987 and 1988). It was found that MLE and POME were the most efficient methods, presenting essentially the same mean squared errors. Based on a threshold of magnitude 1.5, the seismic risk for the city was estimated, along with the return levels for earthquakes of magnitude 1.5, 2.0, 2.5, and 3.0 and for the most intense earthquake ever recorded in the city, which occurred in November 1986 with a magnitude of about 5.2.
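Of the estimation techniques listed, maximum likelihood is the easiest to reproduce with standard tools. The sketch below is a generic illustration on simulated exceedances, not the João Câmara catalogue: it fits a GPD by ML with scipy, fixing the location at zero as is usual for threshold exceedances, and reads off a return level from the fitted quantile function.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(42)
c_true, scale_true = 0.25, 1.0     # GPD shape (xi) and scale
sample = genpareto.rvs(c_true, scale=scale_true, size=5000, random_state=rng)

# ML fit; location fixed at 0 since exceedances over a threshold start at 0
c_hat, loc_hat, scale_hat = genpareto.fit(sample, floc=0)

# fitted level exceeded on average once every 100 exceedances
x100 = genpareto.ppf(1 - 1 / 100, c_hat, loc=0, scale=scale_hat)
```

The POME estimator discussed in the abstract maximizes an entropy functional subject to moment constraints instead of the likelihood, but it is used on exactly the same exceedance data.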
Abstract:
In this work we study the survival cure rate model proposed by Yakovlev (1993), which is formulated in a competing risks setting. Covariates are introduced for modeling the cure rate, and we allow some covariates to have missing values. We consider only the case in which the missing covariates are categorical, and implement the EM algorithm via the method of weights for maximum likelihood estimation. We present a Monte Carlo simulation experiment to compare the properties of the estimators based on this method with those of the estimators under the complete-case scenario. In this experiment we also evaluate the impact on the parameter estimates of increasing the proportion of immune and censored individuals among the non-immune ones. We demonstrate the proposed methodology with a real data set involving the time until graduation for students of the undergraduate Statistics program of the Universidade Federal do Rio Grande do Norte.
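The method of weights can be illustrated away from the cure rate setting: in the E-step, each observation with a missing categorical covariate is expanded into one weighted copy per category, with weights given by the posterior probabilities of the categories; the M-step then solves a weighted complete-data problem. The toy sketch below (a hypothetical two-component model, not the dissertation's) runs this EM for a binary covariate X with Y | X = x ~ N(mu_x, 1) when 30% of the X values are missing at random.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p_true, mu = 4000, 0.4, (0.0, 3.0)
x = (rng.random(n) < p_true).astype(int)            # binary covariate
y = rng.normal(loc=np.where(x == 1, mu[1], mu[0]))  # Y | X=x ~ N(mu_x, 1)
obs = rng.random(n) > 0.3                           # 30% of x's missing (MCAR)

def phi(y, m):
    """N(m, 1) density."""
    return np.exp(-0.5 * (y - m) ** 2) / np.sqrt(2 * np.pi)

p, m0, m1 = 0.5, np.min(y), np.max(y)               # crude starting values
for _ in range(200):
    # E-step (method of weights): each missing x contributes a weighted
    # copy for x=0 and x=1, weights proportional to p_x * phi(y; mu_x);
    # observed x's keep weight 1 on their own category.
    w1 = np.where(obs, x, p * phi(y, m1) /
                  (p * phi(y, m1) + (1 - p) * phi(y, m0)))
    w0 = 1 - w1
    # M-step: weighted complete-data maximum likelihood estimates
    p = w1.mean()
    m0 = np.sum(w0 * y) / np.sum(w0)
    m1 = np.sum(w1 * y) / np.sum(w1)
```

In the dissertation's setting the weighted complete-data step is a cure rate likelihood rather than these closed-form means, but the weighting scheme is the same.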
Abstract:
The present work presents improvement strategies for a successful evolutionary metaheuristic for the Asymmetric Traveling Salesman Problem, namely a Memetic Algorithm designed specifically for this problem. Basically, the improvement applies the optimization techniques known as Path-Relinking and Vocabulary Building. Furthermore, the latter is used in two different ways, in order to evaluate the effects of the improvement on the evolutionary metaheuristic. These methods were implemented in C++ and the experiments were carried out on instances from the TSPLIB library; it was possible to observe that the proposed procedures were successful in the tests performed.
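As a reference for the Path-Relinking step mentioned above, the sketch below (an illustrative Python version, not the dissertation's C++ code) walks from an initial tour to a guiding tour by fixing one city position at a time, keeping the cheapest asymmetric tour seen along the trajectory; the distance matrix and tours are made up for the example.

```python
def tour_cost(dist, tour):
    """Cost of a directed (asymmetric) tour given a distance matrix."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]]
               for i in range(len(tour)))

def path_relinking(dist, start, guide):
    """Transform `start` into `guide` one swap at a time and return the
    cheapest tour visited on the way (endpoints included)."""
    current = list(start)
    best, best_cost = list(current), tour_cost(dist, current)
    for i in range(len(guide)):
        if current[i] != guide[i]:
            j = current.index(guide[i])        # bring guide[i] into slot i
            current[i], current[j] = current[j], current[i]
            cost = tour_cost(dist, current)
            if cost < best_cost:
                best, best_cost = list(current), cost
    return best, best_cost

# tiny asymmetric instance (dist[i][j] != dist[j][i] in general)
dist = [[0, 2, 9, 5],
        [6, 0, 4, 8],
        [3, 7, 0, 1],
        [9, 1, 6, 0]]
start, guide = [0, 1, 2, 3], [0, 2, 3, 1]
best, best_cost = path_relinking(dist, start, guide)
```

In a memetic algorithm this relinking move is typically applied between an offspring and an elite solution, so that intermediate tours mixing attributes of both can enter the population.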
Abstract:
In this work we present the principal fractals, their characteristics, properties, and classification, comparing them to the elements of Euclidean Geometry. We show the importance of Fractal Geometry in the analysis of several elements of our society. We emphasize the importance of an appropriate definition of dimension for these objects, because the definition we presently know does not seem satisfactory for them. As instruments to obtain these dimensions we present the box-counting method, the Hausdorff–Besicovitch dimension, and the scale method. We also study the percolation process on the square lattice, comparing it to percolation on the multifractal support Qmf, where we observe some differences between these two processes. We analyze the histogram of percolating lattices versus the site occupation probability p, among other numerical simulations. Finally, we show that we can estimate the fractal dimension of the percolation cluster and that percolation on a multifractal support is in the same universality class as standard percolation. We observe that the area of the blocks of Qmf is variable and that pc is a function of p, which is related to the anisotropy of Qmf.
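The box-counting method mentioned above can be sketched in a few lines. The example below (illustrative, not from the dissertation) builds a Sierpinski carpet on a grid and estimates its fractal dimension as the slope of log N(ε) versus log(1/ε); for this set the exact value is log 8 / log 3 ≈ 1.8928.

```python
import numpy as np

def carpet(depth):
    """Sierpinski carpet as a boolean array of side 3**depth."""
    c = np.ones((1, 1), dtype=bool)
    for _ in range(depth):
        z = np.zeros_like(c)
        c = np.block([[c, c, c],
                      [c, z, c],     # remove the central block at each step
                      [c, c, c]])
    return c

def box_count(img, box):
    """Number of box*box cells containing at least one occupied site."""
    s = img.shape[0] // box
    blocks = img.reshape(s, box, s, box)
    return int(blocks.any(axis=(1, 3)).sum())

img = carpet(5)                       # 243 x 243 grid
sizes = [1, 3, 9, 27, 81]
counts = [box_count(img, b) for b in sizes]

# slope of log N(box) versus log(1/box) is the box-counting dimension
dim = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)[0]
```

For a self-similar set built on a 3-adic grid the log-log points fall exactly on a line, so the fitted slope reproduces log 8 / log 3 essentially to machine precision; for empirical sets (such as a percolation cluster) the slope is only an estimate.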
Abstract:
Present-day weather forecast models usually cannot provide realistic descriptions of local, and particularly extreme, weather conditions. However, for lead times of a few days, they provide reliable forecasts of the atmospheric circulation that encompasses the subscale processes leading to extremes. Hence, forecasts of extreme events can only be achieved through a combination of dynamical and statistical analysis methods, where a stable and significant statistical model, based on prior physical reasoning, establishes a posteriori a statistical-dynamical model between the local extremes and the large-scale circulation. Here we present the development and application of such a statistical model calibration on the basis of extreme value theory, in order to derive probabilistic forecasts of extreme local temperature. The downscaling is applied to the NCEP/NCAR reanalysis, in order to derive estimates of daily temperature at weather stations in the Brazilian northeastern region.
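The extreme value machinery referred to above can be sketched generically: fit a Generalized Extreme Value (GEV) distribution to block maxima and read off a return level from its quantile function. The code below is an illustration on synthetic daily temperatures, not the NCEP/NCAR downscaling itself.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(7)
# 40 "years" of synthetic daily temperatures (deg C), one row per year
daily = rng.normal(loc=28.0, scale=3.0, size=(40, 365))
annual_max = daily.max(axis=1)          # block maxima

# ML fit of the GEV to the block maxima (scipy's c is minus the usual xi)
c, loc, scale = genextreme.fit(annual_max)

# 10-year return level: the level exceeded on average once per decade
r10 = genextreme.ppf(1 - 1 / 10, c, loc=loc, scale=scale)
```

In a statistical-dynamical setup the GEV parameters would additionally be conditioned on large-scale circulation predictors, turning the return level into a probabilistic forecast.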
Abstract:
In this work we studied the asymptotic unbiasedness and the strong and uniform strong consistency of a class of kernel estimators fn as estimators of a density function f of observations taking values on a k-dimensional sphere.
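For the circle (the case k = 1), a kernel density estimator of this type can be written with the von Mises kernel. The sketch below is illustrative only, with an arbitrary fixed concentration parameter standing in for a data-driven bandwidth choice; it checks that the resulting estimate is a genuine density, i.e. that it integrates to one over the circle.

```python
import numpy as np
from scipy.special import i0   # modified Bessel function I_0

def vmf_kde(theta, data, kappa=20.0):
    """Kernel density estimate on the circle: the average of von Mises
    kernels centered at the observations; kappa plays the role of an
    inverse bandwidth (larger kappa = narrower kernels)."""
    theta = np.atleast_1d(theta)[:, None]
    k = np.exp(kappa * np.cos(theta - data)) / (2 * np.pi * i0(kappa))
    return k.mean(axis=1)

rng = np.random.default_rng(3)
data = rng.vonmises(mu=0.0, kappa=4.0, size=500)   # sample concentrated at 0

grid = np.linspace(-np.pi, np.pi, 2001)
f_hat = vmf_kde(grid, data)

# left Riemann sum over one period; should be close to 1
mass = np.sum(f_hat[:-1]) * (grid[1] - grid[0])
```

On the k-dimensional sphere the same construction uses the von Mises–Fisher kernel with its dimension-dependent normalizing constant; the consistency results studied in the work concern the behavior of such estimators as the sample size grows and the concentration increases.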