63 results for Mathematical representations
Abstract:
The aim of this work was to compare planialtimetric charts obtained from different surveys using two theodolites of different precision and a total station, with a precision level used as control. Using a total station, an area with clear relief variations was staked out on a grid, with a spacing of 20 meters between stakes. The stakes were then read with the total station and with the two theodolites, and geometric leveling was carried out with the precision level. The data were input into the DataGeosis software and a numerical terrain model was built with a maximum-rigidity mesh, generating a planialtimetric representation for each survey. Comparison of the four representations showed that only small variations occur in relation to the control. The representation closest to the control was the planialtimetry based on the total station data, while the representations obtained from the two theodolites were identical to each other. It was concluded that, to obtain a detailed planialtimetry of small areas staked out on a grid, composed geometric leveling is not necessary, reducing the work to the exclusive use of a total station or a conventional theodolite.
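A comparison like the one described above can be quantified with a point-by-point error measure against the control grid. The sketch below is illustrative only: the grid values, station names, and the choice of root-mean-square difference are assumptions, not data or methods from the study.

```python
# Hypothetical sketch: scoring elevation grids from different surveys
# against a control grid via root-mean-square (RMS) difference.
# All numbers below are made up for illustration.

def rms_difference(grid, control):
    """RMS of point-by-point elevation differences (same grid layout)."""
    diffs = [(g - c) ** 2 for row_g, row_c in zip(grid, control)
             for g, c in zip(row_g, row_c)]
    return (sum(diffs) / len(diffs)) ** 0.5

control = [[100.0, 101.2], [99.5, 100.8]]        # precision-level control
total_station = [[100.1, 101.1], [99.5, 100.9]]  # hypothetical reading
theodolite = [[100.2, 101.0], [99.3, 101.0]]     # hypothetical reading

# The survey with the smaller RMS is the closer match to the control.
closer = rms_difference(total_station, control) < rms_difference(theodolite, control)
```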
Abstract:
Research on the influence of multiple representations in mathematics education gained new momentum when personal computers and software started to become available in the mid-1980s. It became much easier for students who were not fond of algebraic representations to work with concepts such as function using graphs or tables. Research on how students use such software showed that they shaped the tools to their own needs, resulting in an intershaping relationship in which tools shape the way students come to know at the same time as students shape the tools and influence the design of the next generation of tools. This kind of research led to the theoretical perspective presented in this paper: knowledge is constructed by collectives of humans-with-media. In this paper, I will discuss how media have shaped the notions of problem and knowledge, and a parallel will be developed between the way that software has brought new possibilities to mathematics education and the changes that the Internet may bring to mathematics education. This paper is, therefore, a discussion about the future of mathematics education. Potential scenarios for the future of mathematics education, if the Internet becomes accepted in the classroom, will be discussed. © FIZ Karlsruhe 2009.
Abstract:
In this paper a framework based on the decomposition of the first-order optimality conditions is described and applied to solve the Probabilistic Power Flow (PPF) problem in a coordinated but decentralized way in the context of multi-area power systems. The purpose of the decomposition framework is to solve the problem through a process of iteratively solving smaller subproblems associated with each area of the power system. This strategy allows the probabilistic analysis of the variables of interest in a particular area without explicit knowledge of the network data of the other interconnected areas, requiring only the exchange of border information related to the tie-lines between areas. An efficient method for probabilistic analysis, considering uncertainty in n system loads, is applied. The proposal is to use a particular case of the point estimate method, known as the Two-Point Estimate Method (TPM), rather than the traditional approach based on Monte Carlo simulation. The main feature of the TPM is that it only requires solving 2n power flows to obtain the behavior of any random variable. An iterative coordination algorithm between areas is also presented. This algorithm solves the Multi-Area PPF problem in a decentralized way, ensures the independent operation of each area and integrates the decomposition framework and the TPM appropriately. The IEEE RTS-96 system is used to show the operation and effectiveness of the proposed approach, and Monte Carlo simulations are used to validate the results. © 2011 IEEE.
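The 2n-evaluation idea behind the TPM can be sketched generically: each uncertain input is shifted to two points around its mean while the others stay at their means, and the weighted outputs give approximate moments. The toy `model` below stands in for a power-flow solver; the zero-skewness point locations and all values are assumptions for illustration, not the paper's implementation.

```python
import math

# Illustrative sketch of a two-point estimate method (TPM) for
# propagating input uncertainty through a deterministic model,
# using 2n model evaluations for n uncertain inputs.

def two_point_estimate(model, means, stds):
    """Approximate mean and std of model(x) via 2n evaluations.

    Zero-skewness scheme: input i is shifted to mu_i +/- sqrt(n)*sigma_i
    while the other inputs stay at their means; each point has weight 1/(2n).
    """
    n = len(means)
    loc = math.sqrt(n)
    w = 1.0 / (2 * n)
    m1 = m2 = 0.0  # first and second raw moments of the output
    for i in range(n):
        for sign in (+1, -1):
            x = list(means)
            x[i] = means[i] + sign * loc * stds[i]
            y = model(x)
            m1 += w * y
            m2 += w * y * y
    var = max(m2 - m1 * m1, 0.0)
    return m1, math.sqrt(var)

# Toy "power flow": a linear combination of two load injections.
# For a linear model the estimate is exact: mean 25, std sqrt(8).
mean, std = two_point_estimate(lambda x: 2 * x[0] + x[1], [10.0, 5.0], [1.0, 2.0])
```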
Abstract:
This paper presents three methods for the automatic detection of dust devil tracks in images of Mars. The methods are mainly based on Mathematical Morphology, and their performance is analyzed and compared. A dataset of 21 images from the surface of Mars, representative of the diversity of those track features, was used for developing, testing and evaluating the methods, comparing their outputs with manually produced ground-truth images. Methods 1 and 3, based on the closing top-hat and the path-closing top-hat, respectively, showed similar mean accuracies of around 90%, but the processing time was much greater for Method 1 than for Method 3. Method 2, based on radial closing, was the fastest but showed the worst mean accuracy. Processing time was thus the tiebreak factor. © 2011 Springer-Verlag.
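The closing top-hat underlying Method 1 can be shown in miniature: a grayscale closing (dilation followed by erosion) fills in dark features narrower than the structuring element, and subtracting the original isolates them. This 1-D pure-Python sketch is an assumption-laden stand-in for the 2-D image operators; the signal, window size, and values are invented.

```python
# Minimal 1-D sketch of a closing top-hat. Dust devil tracks are dark
# streaks on brighter terrain, which is exactly what this operator
# highlights. The signal below is illustrative only.

def dilate(signal, size):
    """Grayscale dilation: max over a sliding window of `size` samples."""
    r = size // 2
    return [max(signal[max(0, i - r):i + r + 1]) for i in range(len(signal))]

def erode(signal, size):
    """Grayscale erosion: min over a sliding window of `size` samples."""
    r = size // 2
    return [min(signal[max(0, i - r):i + r + 1]) for i in range(len(signal))]

def closing_top_hat(signal, size):
    """Closing (dilation then erosion) minus the original signal.

    Responds strongly to dark features narrower than the window.
    """
    closed = erode(dilate(signal, size), size)
    return [c - s for c, s in zip(closed, signal)]

bright_terrain_with_dark_track = [9, 9, 9, 2, 2, 9, 9, 9]
response = closing_top_hat(bright_terrain_with_dark_track, 5)
# The narrow dark "track" at positions 3-4 produces the only nonzero response.
```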
Abstract:
According to Peirce, one of the most important philosophical problems is continuity. Consequently, he set forth an innovative and peculiar approach in order to elucidate at once its mathematical and metaphysical challenges through proper non-classical logical reasoning. I will restrict my argument to the definition of the different types of discrete collections according to Peirce, with special regard to the phenomenon called premonition of continuity (Peirce, 1976, Vol. 3, p. 87, c. 1897). © 2012 Copyright Taylor and Francis Group, LLC.
Abstract:
This study was undertaken to characterize the effects of monotonous training at lactate minimum (LM) intensity on aerobic and anaerobic performances; glycogen concentrations in the soleus muscle, the gastrocnemius muscle and the liver; and creatine kinase (CK), free fatty acid and glucose concentrations in rats. The rats were separated into trained (n = 10), baseline (n = 10) and sedentary (n = 10) groups. The trained group was submitted to the following: 60 min/day, 6 days/week, at an intensity equivalent to the LM, during the 12-week training period. The training volume was reduced after four weeks according to a sigmoid function. Total CK (U/L) increased in the trained group after 12 weeks (742.0±158.5) in comparison with the baseline (319.6±40.2) and the sedentary (261.6±42.2) groups. Free fatty acids and glycogen stores (liver, soleus muscle and gastrocnemius muscle) increased after 12 weeks of monotonous training, but aerobic and anaerobic performances were unchanged in relation to the sedentary group. The monotonous training at LM increased the level of energy substrates, left aerobic performance unchanged, reduced anaerobic capacity and increased the serum CK concentration; however, the rats did not achieve the predicted training volume.
Abstract:
Dosage and frequency of treatment schedules are important for successful chemotherapy. However, in this work we argue that cell-kill response and tumoral growth should not be seen as separate and are therefore both essential in a mathematical cancer model. This paper presents a mathematical model for the sequencing of cancer chemotherapy and surgery. Our purpose is to investigate treatments for large human tumours considering a suitable cell-kill dynamics. We use biological and pharmacological data in a numerical approach, where drug administration occurs in cycles (periodic infusion) and surgery is performed instantaneously. Moreover, we also present a stability analysis for a chemotherapeutic model with continuous drug administration. In agreement with Norton & Simon [22], our results indicate that chemotherapy is less efficient in treating tumours that have reached a growth plateau, and that a combination with surgical treatment can provide better outcomes.
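A cycle-based infusion schedule of the kind described can be sketched with a simple forward Euler integration. This is a hedged toy, not the paper's model: Gompertz growth and a proportional kill term stand in for the actual dynamics, and every parameter value (growth rate, carrying capacity, kill rate, cycle timing) is invented for illustration.

```python
import math

# Hedged numerical sketch: a Gompertz-growing tumour under periodic
# drug infusion, integrated with the forward Euler method.
# Parameters are illustrative and do not reproduce the paper.

def simulate(n0, r, K, kill, cycle, infusion, t_end, dt=0.01):
    """Tumour cell count over time; the drug acts only during the
    infusion window at the start of each treatment cycle."""
    n, t = n0, 0.0
    while t < t_end:
        growth = r * n * math.log(K / n)       # Gompertz growth term
        drug_on = (t % cycle) < infusion       # periodic infusion window
        dn = growth - (kill * n if drug_on else 0.0)
        n = max(n + dn * dt, 1.0)              # keep at least one cell
        t += dt
    return n

# 21-day cycles with a 3-day infusion, over 100 days (made-up schedule).
untreated = simulate(1e9, 0.01, 1e12, 0.0, 21.0, 3.0, 100.0)
treated = simulate(1e9, 0.01, 1e12, 0.5, 21.0, 3.0, 100.0)
```

Because the kill term only ever subtracts cells, the treated trajectory stays below the untreated one, which is the qualitative behaviour such models are built to explore.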
Abstract:
This paper reports the use of chromatographic profiles of volatiles to determine disease markers in plants - in this case, leaves of Eucalyptus globulus contaminated by the necrotrophic fungus Teratosphaeria nubilosa. The volatile fraction was isolated by headspace solid-phase microextraction (HS-SPME) and analyzed by comprehensive two-dimensional gas chromatography-fast quadrupole mass spectrometry (GC×GC-qMS). For the correlation between the metabolic profile described by the chromatograms and the presence of the infection, unfolded partial least squares discriminant analysis (U-PLS-DA) with orthogonal signal correction (OSC) was employed. The proposed method was shown to be independent of factors such as the age of the harvested plants. Manipulation of the mathematical model obtained also resulted in graphic representations similar to real chromatograms, which allowed the tentative identification of more than 40 compounds potentially useful as disease biomarkers for this plant/pathogen pair. The proposed methodology can be considered highly reliable, since the diagnosis is based on the whole chromatographic profile rather than on the detection of a single analyte. © 2013 Elsevier B.V.
Abstract:
An inclusive search for supersymmetric processes that produce final states with jets and missing transverse energy is performed in pp collisions at a centre-of-mass energy of 8 TeV. The data sample corresponds to an integrated luminosity of 11.7 fb-1 collected by the CMS experiment at the LHC. In this search, a dimensionless kinematic variable, αT, is used to discriminate between events with genuine and misreconstructed missing transverse energy. The search is based on an examination of the number of reconstructed jets per event, the scalar sum of transverse energies of these jets, and the number of these jets identified as originating from bottom quarks. No significant excess of events over the standard model expectation is found. Exclusion limits are set in the parameter space of simplified models, with a special emphasis on both compressed-spectrum scenarios and direct or gluino-induced production of third-generation squarks. For the case of gluino-mediated squark production, gluino masses up to 950-1125 GeV are excluded depending on the assumed model. For the direct pair-production of squarks, masses up to 450 GeV are excluded for a single light first- or second-generation squark, increasing to 600 GeV for bottom squarks. © 2013 CERN for the benefit of the CMS collaboration.
Abstract:
A software application developed in the Delphi programming language is presented to compute a reservoir's annual regulated active storage, based on the sequent-peak algorithm. Mathematical models used for that purpose generally require long hydrological series, whose analysis is usually performed with spreadsheets or graphical representations; this motivated the development of dedicated software for calculating reservoir active capacity. An example calculation is shown using a 30-year (1977 to 2009) historical series of monthly mean flows from the Corrente River, located in the São Francisco River Basin, Brazil. As an additional tool, an interface was developed for water resources management, helping to manipulate the data and to highlight information of interest to the user. Moreover, with that interface, irrigation districts where water consumption is higher can be analyzed as a function of specific seasonal water demand situations. A practical application shows that the program performs the calculation originally proposed. It was designed to keep information organized and retrievable at any time, and to simulate seasonal water demands throughout the year, contributing to the study of reservoir projects. With its functionality, this program is an important tool for decision making in water resources management.
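The sequent-peak algorithm at the core of the described software can be sketched in a few lines: it tracks the running water deficit under a constant demand, and the largest deficit reached over the series is the required active storage. The flow series and demand below are made-up numbers, not the Corrente River data.

```python
# Sketch of the sequent-peak algorithm. Inflows and demand must be in
# consistent units per time step; the required storage comes out in
# those same units. Values below are illustrative only.

def sequent_peak(inflows, demand):
    """Required active storage so a constant demand is always met.

    Tracks the cumulative deficit K_t = max(0, K_{t-1} + demand - inflow);
    the answer is the largest deficit reached over the series.
    """
    deficit, required = 0.0, 0.0
    for q in inflows:
        deficit = max(0.0, deficit + demand - q)
        required = max(required, deficit)
    return required

# One hypothetical year of monthly mean flows with a dry season.
monthly_inflows = [50, 40, 30, 10, 5, 5, 10, 30, 60, 80, 70, 60]
storage = sequent_peak(monthly_inflows, demand=25)
# The deficit accumulates through the dry months and is erased when
# high flows return; its peak is the storage requirement.
```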
Abstract:
Based on literature data from HT-29 cell monolayers, we develop a model for their growth, analogous to an epidemic model, mixing local and global interactions. First, we propose and solve a deterministic equation for the progress of these colonies. Then, we add a stochastic (local) interaction and simulate the evolution of an Eden-like aggregate using dynamical Monte Carlo methods. The growth curves of both the deterministic and stochastic models are in excellent agreement with the experimental observations. The waiting-time distributions generated by our stochastic model allowed us to analyze the role of mesoscopic events. We obtain log-normal distributions in the initial stages of the growth and Gaussians at long times. We interpret these outcomes in the light of cellular division events: in the early stages, the phenomena are dependent on each other in a multiplicative, geometric-based process, and they are independent at long times. We conclude that the main ingredients for a good minimalist model of tumor growth, at the mesoscopic level, are intrinsic cooperative mechanisms and competitive search for space. © 2013 Elsevier Ltd.
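An Eden-like aggregate of the kind simulated can be sketched in a few lines: each Monte Carlo step occupies one random empty site adjacent to the cluster, mimicking division into free neighbouring space. This is a minimal stand-in under assumed rules (square lattice, uniform frontier choice), not the paper's implementation.

```python
import random

# Illustrative Eden-like growth on a square lattice. Each step, one
# empty site bordering the aggregate is occupied at random, so growth
# happens only where free space exists ("competitive search for space").

def eden_growth(steps, seed=0):
    rng = random.Random(seed)
    occupied = {(0, 0)}  # seed cell at the origin
    for _ in range(steps):
        # frontier: empty sites with at least one occupied neighbour
        frontier = {(x + dx, y + dy)
                    for (x, y) in occupied
                    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))}
        frontier -= occupied
        occupied.add(rng.choice(sorted(frontier)))  # one division per step
    return occupied

colony = eden_growth(200)  # compact, roughly round aggregate of 201 sites
```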
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Pós-graduação em Educação Matemática - IGCE