41 results for Marcinkiewicz interpolation theorem
at Universidade Federal do Rio Grande do Norte (UFRN)
Abstract:
In the machining of internal threads, dedicated tools known as taps are required for each profile type and diameter, and low cutting speeds are used compared with the main machining processes. This restriction on cutting speed is associated with the difficulty of synchronizing the tool's rotation speed and feed velocity during the process. It limits flexibility and lengthens machining lead times whenever threaded components must be manufactured. An alternative to the constraints imposed by the tap is thread milling with helical interpolation. The technique combines two movements: rotation and helical interpolation. The tools may have different configurations: a single edge or multiple edges (axial, radial, or both). However, thread milling with helical interpolation is relatively new and studies on the subject are limited, which poses challenges to its wide application on the manufacturing shop floor. The objective of this research is to determine the performance of different tool types in thread milling with helical interpolation of hardened steel workpieces. Four tool configurations were used for thread milling in AISI 4340 quenched and tempered steel (40 HRC). The results showed that climb milling produced a greater number of machined threads regardless of tool configuration. Up milling causes chipping of the cutting edge, while climb milling promotes abrasive wear. Another important finding is that increasing the ratio of hole diameter to tool diameter increases tool life.
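The kinematics of helical interpolation can be made concrete with a short sketch. The snippet below is a minimal illustration, not taken from the thesis: it generates the tool-center path for one helical orbit, assuming the orbit radius is the difference between hole and tool radii and that one full orbit advances axially by one thread pitch.

```python
import numpy as np

def helical_toolpath(hole_d, tool_d, pitch, n_points=360):
    """Tool-center path for one helical-interpolation orbit.

    One full orbit around the hole axis advances the tool axially
    by exactly one thread pitch (hypothetical parameterization).
    """
    orbit_r = (hole_d - tool_d) / 2.0   # radius of the tool-center circle
    theta = np.linspace(0.0, 2.0 * np.pi, n_points)
    x = orbit_r * np.cos(theta)
    y = orbit_r * np.sin(theta)
    z = pitch * theta / (2.0 * np.pi)   # axial feed coupled to rotation
    return np.column_stack([x, y, z])

# Example: M10 x 1.5 thread machined with a 7 mm thread mill.
path = helical_toolpath(hole_d=10.0, tool_d=7.0, pitch=1.5)
print(path[:3])
```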
Abstract:
Present-day weather forecast models usually cannot provide realistic descriptions of local, and particularly extreme, weather conditions. However, for lead times of a few days they provide reliable forecasts of the atmospheric circulation that encompasses the subscale processes leading to extremes. Hence, forecasts of extreme events can only be achieved through a combination of dynamical and statistical analysis methods, in which a stable and significant statistical model, built on prior physical reasoning, establishes an a posteriori statistical-dynamical relation between the local extremes and the large-scale circulation. Here we present the development and application of such a statistical model calibration on the basis of extreme value theory, in order to derive probabilistic forecasts of extreme local temperature. The downscaling is applied to the NCEP/NCAR reanalysis, in order to derive estimates of daily temperature at weather stations in the Brazilian northeast region.
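As a hedged illustration of the extreme-value step (not the thesis's actual calibration code), the sketch below fits a generalized extreme-value (GEV) distribution to a sample of annual temperature maxima with scipy and evaluates an exceedance probability; the synthetic data and the 38 °C threshold are placeholders.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
# Placeholder data: 50 years of annual daily-maximum temperatures (deg C).
annual_maxima = 34.0 + 2.0 * rng.gumbel(size=50)

# Fit a GEV distribution (shape, location, scale) by maximum likelihood.
shape, loc, scale = genextreme.fit(annual_maxima)

# Probabilistic forecast ingredient: P(annual maximum exceeds 38 deg C).
p_exceed = genextreme.sf(38.0, shape, loc=loc, scale=scale)
print(f"shape={shape:.3f} loc={loc:.2f} scale={scale:.2f} P(T>38)={p_exceed:.3f}")
```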
Abstract:
In this work we develop a spline-based method for solving initial value problems involving ordinary differential equations, with emphasis on linear equations. The method can be seen as an alternative to traditional solvers such as Runge-Kutta, and it avoids root calculations in the linear time-invariant case. The method is then applied to a central problem of control theory, namely the step-response problem for linear ODEs with possibly varying coefficients, where root calculations do not apply. We implemented an efficient algorithm that uses exclusively matrix-vector operations. The working interval (up to the settling time) was determined by computing the least stable mode with a modified power method. Several variants of the method were compared by simulation. For general linear problems on a fine grid, the proposed method compares favorably with the Euler method. In the time-invariant case, where the alternative is root calculation, we have indications that the proposed method is competitive for equations of sufficiently high order.
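The "least stable mode" step can be illustrated with a standard inverse power iteration; this is a generic sketch, not the thesis's modified variant, and the example matrix and the 4/|Re(lambda)| settling-time rule of thumb are assumptions made here for illustration.

```python
import numpy as np

def least_stable_mode(A, iters=200, tol=1e-10):
    """Inverse power iteration: eigenvalue of A of smallest magnitude.

    For a stable system whose eigenvalues are real and negative, this is
    the least stable (slowest-decaying) mode, which sets the settling time.
    """
    v = np.ones(A.shape[0])
    mu = 0.0
    for _ in range(iters):
        w = np.linalg.solve(A, v)      # one power-iteration step on A^{-1}
        v_new = w / np.linalg.norm(w)
        mu_new = v_new @ (A @ v_new)   # Rayleigh quotient in terms of A
        if abs(mu_new - mu) < tol:
            break
        v, mu = v_new, mu_new
    return mu

# Companion matrix of y'' + 3y' + 2y = 0 (modes at -1 and -2).
A = np.array([[0.0, 1.0], [-2.0, -3.0]])
lam = least_stable_mode(A)
print("least stable mode:", lam, " settling time ~", 4.0 / abs(lam))
```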
Abstract:
Among the theorems taught in basic education, some can be proved in the classroom and others cannot, because of the degree of difficulty of their formal proofs. A classic example is the Fundamental Theorem of Algebra, which is not proved at this level, since higher-level knowledge of mathematics is needed. In this paper we justify the validity of this theorem intuitively using the software GeoGebra. Then, based on [2], we present a clear formal proof of the theorem, addressed to school teachers and undergraduate students in mathematics.
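The intuitive argument usually visualized in GeoGebra traces the image of circles |z| = r under the polynomial: for small r the image stays near p(0), while for large r it winds around the origin, so some intermediate circle's image must pass through 0. A minimal Python sketch of that visualization, with p(z) = z^3 - z + 1 as an arbitrary example and matplotlib assumed available:

```python
import numpy as np
import matplotlib.pyplot as plt

p = np.poly1d([1, 0, -1, 1])          # p(z) = z^3 - z + 1, arbitrary example

theta = np.linspace(0, 2 * np.pi, 1000)
for r in (0.2, 0.8, 1.4, 2.0):        # images of circles |z| = r
    w = p(r * np.exp(1j * theta))
    plt.plot(w.real, w.imag, label=f"r = {r}")

plt.scatter([0], [0], color="black")  # a root exists where an image crosses 0
plt.legend()
plt.gca().set_aspect("equal")
plt.title("Images of circles |z| = r under p(z)")
plt.show()
```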
Abstract:
This study presents the results of an investigation into how the history of mathematics and theater can contribute to the construction of mathematical knowledge by students in the 9th year of elementary school, through the experience of preparing and performing a play, in addition to the presentation of its script. The script takes a historical approach, defining the space and time of the events and leading the reader and viewer through the biography of Thales of Miletus (624-546 BC), creating situations that prompt the study and discussion of the content related to the episode in which it became possible to measure the height of the pyramid of Khufu, and of the Theorem of Thales (see the sketch after this abstract). The pedagogical proposal implemented in this work was based on the theoretical and methodological assumptions of the history of mathematics and of theater, drawing upon authors such as Mendes (2006), Miguel (1993), Gutierre (2010), Desgrandes (2011) and Cabral (2012). Regarding methodological procedures, we used qualitative research, since it responds to particular issues, analyzing and interpreting the data generated in the research field. As methodological tools we used participant observation, a questionnaire given to the students, a field diary, and essay texts produced by the students. The data collected through the questionnaires were organized, classified and quantified in tables and graphs for ease of viewing, interpretation, understanding and analysis. Data analysis corroborated our hypothesis and contributed to improving the use and staging of the play as a motivating activity in mathematics classrooms. Thus, we consider that the script developed, i.e., the proposed educational product, will bring significant contributions to the teaching of mathematics in primary education.
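For reference, the pyramid-height episode rests on the similar-triangles proportion behind Thales' measurement; in the usual telling (a detail not spelled out in the abstract), a staff of known height is compared with the pyramid through their shadows:

```latex
\[
\frac{H_{\text{pyramid}}}{S_{\text{pyramid}}} = \frac{h_{\text{staff}}}{s_{\text{staff}}}
\quad\Longrightarrow\quad
H_{\text{pyramid}} = S_{\text{pyramid}} \cdot \frac{h_{\text{staff}}}{s_{\text{staff}}},
\]
```

where the pyramid's shadow length is measured from the center of its base.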
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - CAPES
Abstract:
Dark matter is a fundamental ingredient of modern cosmology. It is necessary in order to explain the process of structure formation in the Universe, the rotation curves of galaxies, and the mass discrepancy in clusters of galaxies. However, although many efforts, both theoretical and experimental, have been made, the nature of dark matter is still unknown, and the only convincing evidence for its existence is gravitational. This raises doubts about its existence and, in turn, opens the possibility that Einstein's gravity needs to be modified at some scale. In this work we study the possibility that Eddington-Born-Infeld (EBI) modified gravity provides an alternative explanation for the mass discrepancy in clusters of galaxies. For this purpose we derive the modified Einstein field equations and find their solutions for a spherical system of identical, collisionless point particles. Then, starting from the collisionless relativistic Boltzmann equation and using some approximations and assumptions valid for weak gravitational fields, we derive the generalized virial theorem in the framework of EBI gravity. In order to compare the predictions of EBI gravity with astrophysical observations, we estimate the order of magnitude of the geometric mass, showing that it is compatible with present observations. Finally, considering a power law for the density of galaxies in the cluster, we derive expressions for the radial velocity dispersion of the galaxies, which can be used for testing some features of EBI gravity.
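For orientation (the generalized EBI version derived in the thesis is not reproduced here), the classical virial theorem that such derivations extend relates the kinetic and potential energies of a relaxed cluster, giving the usual virial mass estimate from the radial velocity dispersion sigma_r and radius R:

```latex
\[
2K + W = 0,
\qquad
K = \tfrac{3}{2} M \sigma_r^{2},
\quad
W \sim -\frac{G M^{2}}{R}
\;\;\Longrightarrow\;\;
M_{\mathrm{vir}} \sim \frac{3 R\, \sigma_r^{2}}{G}.
\]
```

In the modified-gravity setting, the discrepancy between this dynamical mass and the visible mass is what the geometric mass term is meant to account for.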
Abstract:
Untreated effluents that reach surface waters affect aquatic life and humans. This study aimed to evaluate the toxicity of the wastewaters (municipal, industrial and shrimp-pond effluents) released into the Jundiaí-Potengi estuarine complex, Natal/RN, through quantitative and qualitative chronic toxicity tests using the test organism Mysidopsis juniae, CRUSTACEA, MYSIDACEA (Silva, 1979). For this, a new methodology for observing chronic effects on M. juniae was used (single renewal), based on an existing methodology for a very similar test organism, M. bahia (daily renewal). Toxicity tests of 7 days' duration were used to detect effects on the survival and fecundity of M. juniae. The median lethal concentration (LC50) was determined by the Trimmed Spearman-Karber method; the 50% inhibition concentration (IC50) for fecundity was determined by linear interpolation. One-way ANOVA tests (p = 0.05) were used to determine the No Observed Effect Concentration (NOEC) and the Lowest Observed Effect Concentration (LOEC). Effluent flows were measured and the toxic loads of the effluents were estimated. Multivariate analyses, Principal Component Analysis (PCA) and Correspondence Analysis (CA), identified the physico-chemical parameters that best explain the patterns of toxicity found in the survival and fecundity of M. juniae. We verified the feasibility of applying the single-renewal system in chronic tests with M. juniae. Most effluents proved toxic to the survival and fecundity of M. juniae, except for some shrimp-pond effluents. The most toxic effluents were ETE Lagoa Aerada (LC50 6.24%; IC50 4.82%), ETE Quintas (LC50 5.85%), Giselda Trigueiro Hospital (LC50 2.05%), CLAN (LC50 2.14%) and COTEMINAS (LC50 38.51%; IC50 6.94%). The greatest toxic loads originated from the high-flow effluents of inefficient sewage treatment plants (ETEs), from textile effluents, and from CLAN. The organic load was related to the toxic effects of municipal wastewaters and hospital effluents on the survival of M. juniae, as were heavy metals, total residual chlorine and phenols. In industrial effluents, a relationship was found between toxicity and organic load, phenols, oils and greases, and benzene. The effects on fecundity were related, in turn, to chlorine and heavy metals. Toxicity tests using other organisms from different trophic levels, as well as sediment toxicity analyses, are recommended to confirm the patterns found with M. juniae. However, the results indicate the need to implement and improve the sewage treatment systems discharging into the Potengi estuary.
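The IC50-by-linear-interpolation step can be sketched generically; this mirrors the common ICp linear-interpolation approach, not the exact software used in the study, and the concentration-response data below are hypothetical. Given the mean fecundity at each tested effluent concentration, it finds the concentration at which the response falls to 50% of the control.

```python
import numpy as np

def icp_linear(concs, responses, p=0.50):
    """Concentration causing a p (e.g. 50%) inhibition of the control
    response, by linear interpolation between bracketing concentrations."""
    target = responses[0] * (1.0 - p)          # control response reduced by p
    for i in range(1, len(concs)):
        if responses[i] <= target:             # first response below target
            frac = (target - responses[i - 1]) / (responses[i] - responses[i - 1])
            return concs[i - 1] + frac * (concs[i] - concs[i - 1])
    return None                                # no p-level inhibition observed

# Hypothetical data: effluent concentration (%) vs. mean offspring per female.
concs = np.array([0.0, 1.0, 2.5, 5.0, 10.0])
fecundity = np.array([22.0, 20.5, 16.0, 9.5, 3.0])
print("IC50 ~", icp_linear(concs, fecundity), "%")
```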
Abstract:
Trigonometry, the branch of mathematics devoted to the study of triangles, developed from practical needs, especially those of astronomy, surveying and navigation. Johann Müller, known as Regiomontanus (1436-1476), a mathematician and astronomer of the fifteenth century, played an important role in the development of this science. His work De Triangulis Omnimodis Libri Quinque, written around 1464 and published posthumously in 1533, presents the first systematic European exposition of plane and spherical trigonometry, in a treatment independent of astronomy. In this study we present a description, translation and analysis of some aspects of this important work in the history of trigonometry. The translation was based on Barnabas Hughes's 1967 edition, Regiomontanus on Triangles, which contains the original Latin text alongside an English translation. For most of our Portuguese translation we worked from the English version, but doubtful utterances, statements and figures were checked against the original Latin. In this work we can see that trigonometry is treated as a branch of mathematics subordinated to geometry, that is, directed toward the study of triangles. Regiomontanus provides a large number of theorems, such as the original trigonometric formula for the area of a triangle. He uses algebra to solve geometric problems and, most notably, presents the first practical theorem for the law of cosines in spherical trigonometry. Thus, this study shows some of the development of trigonometry in the fifteenth century, especially with regard to concepts such as the sine and the versed sine. The work discussed above is of paramount importance for research in the history of mathematics, more specifically in the area of historical analysis and critique of literary sources and the study of the work of a particular mathematician.
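For reference, the two results the abstract singles out are standard: the trigonometric area formula for a plane triangle with sides b and c enclosing angle A, and the spherical law of cosines for a spherical triangle with arc-sides a, b, c and the angle A opposite side a:

```latex
\[
\text{Area} = \tfrac{1}{2}\, b\, c\, \sin A,
\qquad
\cos a = \cos b \cos c + \sin b \sin c \cos A .
\]
```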
Abstract:
Portfolio theory is a field of study devoted to investigating how investors allocate resources. The purpose of this process is to reduce risk through diversification and thus secure a return. Nevertheless, the classical mean-variance (MV) model has been criticized regarding its parameters, since the variance and covariance estimates are sensitive to the market and to parameter-estimation error. In order to reduce estimation errors, Bayesian models offer more flexibility in modeling, being capable of incorporating quantitative and qualitative information about the behavior of the market. With this in mind, the present study formulates a new matrix model that uses Bayesian inference to replace the covariance matrix in the MV model, called MCB (Bayesian Covariance model). To evaluate the model, hypotheses were analyzed using the ex post facto method and sensitivity analysis. The benchmarks used as references were: (1) the classical mean-variance model, (2) the Bovespa market index, and (3) 94 investment funds. The returns earned from May 2002 to December 2009 demonstrated the superiority of the MCB over the classical MV model and the Bovespa index, while taking on slightly more diversifiable risk than the MV. A robustness analysis of the model over the time horizon found returns close to the Bovespa index with less risk than the market. Finally, in relation to Mao's index, the model showed satisfactory return and risk, especially over longer horizons. Some final considerations are presented, along with suggestions for further work.
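The role the MCB assigns to the covariance matrix can be illustrated generically. The sketch below is not the thesis's model: it computes minimum-variance weights from a sample covariance blended with a diagonal prior, a simple shrinkage stand-in for a Bayesian posterior covariance; the returns data and the blending weight tau are placeholders.

```python
import numpy as np

def min_variance_weights(cov):
    """Fully invested minimum-variance portfolio: w proportional to Cov^{-1} 1."""
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)
    return w / w.sum()

rng = np.random.default_rng(1)
returns = rng.normal(0.001, 0.02, size=(500, 4))   # placeholder daily returns
S = np.cov(returns, rowvar=False)                  # sample covariance

# Stand-in for a Bayesian posterior covariance: shrink S toward a
# diagonal prior, with weight tau on the prior (an assumption here).
tau = 0.3
prior = np.diag(np.diag(S))
S_post = (1 - tau) * S + tau * prior

print("sample-cov weights   :", min_variance_weights(S))
print("posterior-cov weights:", min_variance_weights(S_post))
```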
Abstract:
This work develops a robustness analysis with respect to modeling errors, applied to indirect control strategies that use artificial neural networks (ANNs) of the multilayer feedforward perceptron class with online training based on the gradient method (backpropagation). The schemes presented are called Indirect Hybrid Control and Indirect Neural Control. Two robustness theorems are presented, one for each proposed indirect control scheme, which allow the computation of the maximum steady-state control error that will occur due to the modeling error caused by the neural identifier, either for the closed-loop configuration with a conventional controller (Indirect Hybrid Control) or for the closed-loop configuration with a neural controller (Indirect Neural Control). Since the robustness analysis is restricted to the steady-state behavior of the plant, this work also includes a stability analysis, adapted to the multilayer perceptron class of ANNs trained with the backpropagation algorithm, to ensure the convergence and stability of the neural systems used. On the other hand, the boundedness of the initial transient behavior is ensured by the assumption that the plant is BIBO (bounded-input, bounded-output) stable. The robustness theorems were tested on the proposed indirect control strategies, applied to the regulation control of simulated examples using nonlinear plants, and the results are presented.
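The neural identifier the abstract describes (a feedforward perceptron trained online by gradient descent) can be sketched minimally; everything here, from the one-hidden-layer shape to the placeholder plant being identified, is an illustrative assumption rather than the thesis's setup.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(0, 0.5, (8, 2)); b1 = np.zeros(8)   # hidden layer (tanh)
W2 = rng.normal(0, 0.5, (1, 8)); b2 = np.zeros(1)   # linear output layer
eta = 0.05                                          # learning rate

def plant(u, y_prev):
    """Placeholder nonlinear plant to be identified."""
    return 0.6 * y_prev + 0.4 * np.tanh(u)

y = 0.0
for k in range(2000):                   # online (sample-by-sample) training
    u = np.sin(0.05 * k)                # excitation input
    x = np.array([u, y])                # regressor: current input, past output
    h = np.tanh(W1 @ x + b1)            # hidden activations
    y_hat = (W2 @ h + b2)[0]            # identifier prediction
    y = plant(u, y)                     # actual plant output
    e = y_hat - y                       # modeling (identification) error
    # Backpropagation: one gradient step on 0.5 * e**2.
    W2 -= eta * e * h[None, :];  b2 -= eta * e
    delta = e * (W2[0] * (1 - h**2))    # error backpropagated through tanh
    W1 -= eta * np.outer(delta, x);  b1 -= eta * delta

print("final identification error:", e)
```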
Abstract:
The use of Geographic Information Systems (GIS) has become very important in fields where a detailed and precise study of earth surface features is required. Applications in environmental protection are one such example, requiring GIS tools for analysis and decision-making by managers and by the community involved with protected areas. In this specific field, a remaining challenge is to build a GIS that can be dynamically fed with data, allowing researchers and other agents to retrieve current, up-to-date information. In some cases, data are acquired in several ways and come from different sources. To address this problem, tools were implemented that include a model for spatial data treatment on the Web. The research issues involved start with the feeding and processing of environmental control data collected in loco, such as biotic and geological variables, and finish with the presentation of all the information on the Web. For this dynamic processing, tools were developed that make MapServer more flexible and dynamic, allowing data to be uploaded by the users themselves. Furthermore, a module was also developed that uses interpolation for spatial data analysis. A complex application that validated this research feeds the system with data coming from coral reef regions located in the northeast of Brazil. The system was implemented using the interactivity provided by the AJAX model and resulted in a substantial contribution to efficient information access, providing an essential mechanism for tracking events in environmental monitoring.
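The abstract does not say which interpolation method the module implements; inverse distance weighting (IDW), sketched below, is a common choice for point-sampled environmental variables and stands in here as an assumed example, with hypothetical monitoring stations.

```python
import numpy as np

def idw(xy_known, values, xy_query, power=2.0, eps=1e-12):
    """Inverse-distance-weighted estimate at query points.

    xy_known : (n, 2) coordinates of sampled stations
    values   : (n,)   measured variable (e.g. water temperature)
    xy_query : (m, 2) points where an estimate is wanted
    """
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    w = 1.0 / (d + eps) ** power            # closer samples weigh more
    return (w @ values) / w.sum(axis=1)

# Hypothetical monitoring stations and one query location.
stations = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
temp = np.array([26.1, 26.8, 25.9, 26.5])
print(idw(stations, temp, np.array([[0.5, 0.5]])))
```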
Abstract:
Support Vector Machines (SVMs) have attracted increasing attention in the machine learning area, particularly in classification and pattern recognition. However, in some cases it is not easy to determine accurately the class to which a given pattern belongs. This thesis involves the construction of an interval pattern classifier using SVMs in association with interval theory, in order to model with precision the separation of a pattern set into distinct classes, aiming to obtain an optimized separation capable of treating the imprecisions contained in the initial data and those generated during computational processing. The SVM is a linear machine. In order to allow it to solve real-world problems (usually nonlinear ones), it is necessary to treat the pattern set, known as the input set, transforming the problem from a nonlinear into a linear one. Kernel machines are responsible for this mapping. To create the interval extension of the SVM, for both linear and nonlinear problems, it was necessary to define interval kernels and to extend Mercer's theorem (which characterizes kernel functions) to interval functions.
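For reference, the classical (non-interval) statement being extended is Mercer's condition: a continuous symmetric K is a valid kernel, admitting an expansion in eigenfunctions with nonnegative coefficients, exactly when it is positive semidefinite:

```latex
\[
\int\!\!\int K(x, y)\, f(x)\, f(y)\, dx\, dy \;\ge\; 0
\quad \text{for all } f \in L^{2},
\qquad
K(x, y) = \sum_{i} \lambda_i\, \varphi_i(x)\, \varphi_i(y),
\;\; \lambda_i \ge 0 .
\]
```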
Abstract:
Fuzzy intelligent systems are present in a variety of equipment, ranging from household appliances to small devices such as digital cameras and cell phones, and are used primarily to deal with the uncertainties in the modeling of real systems. However, commercial implementations of fuzzy systems are not general purpose and are not portable across different hardware platforms. With these issues in mind, this work presents the implementation of an open-source development environment consisting of a desktop system capable of graphically generating a general-purpose fuzzy controller and exporting its parameters to an embedded system with a fuzzy controller written for the Java Platform, Micro Edition (J2ME), whose modular design makes it portable to any mobile device that supports J2ME. Thus, the proposed development platform is capable of generating all the parameters of a fuzzy controller and exporting them to an XML file, and the code responsible for the control logic embedded in the mobile device is able to read this file and start the controller. All the parameters of a fuzzy controller are configurable through the desktop system, from the membership functions and the rule base to the universe of discourse of the linguistic terms of the output variables. The system generates fuzzy controllers based on the Takagi-Sugeno interpolation model. To validate and test the proposed solution, the fuzzy controller was embedded in a Sun SPOT® mobile device and used to control a Quanser® level plant; to compare the fuzzy controller generated by the system with other controller types, a PID controller was also implemented and embedded in the Sun SPOT to control the same Quanser level plant.
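The Takagi-Sugeno interpolation the environment targets can be shown in a few lines; this sketch, with made-up membership functions and rule consequents for a level-control error input, is only illustrative and does not reproduce the generated controllers.

```python
import numpy as np

def trimf(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return max(0.0, min((x - a) / (b - a), (c - x) / (c - b)))

def ts_controller(error):
    """Zero-order Takagi-Sugeno inference: weighted average of rule outputs."""
    # Firing strengths of three hypothetical rules on the error universe [-1, 1].
    w = np.array([
        trimf(error, -1.5, -1.0, 0.0),   # error negative -> close valve
        trimf(error, -1.0,  0.0, 1.0),   # error near zero -> hold
        trimf(error,  0.0,  1.0, 1.5),   # error positive -> open valve
    ])
    u = np.array([-1.0, 0.0, 1.0])       # crisp consequent of each rule
    return (w @ u) / w.sum()             # TS interpolation between consequents

for e in (-0.8, -0.2, 0.0, 0.5):
    print(f"error={e:+.1f} -> control={ts_controller(e):+.3f}")
```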