973 results for Mathematical functions
Abstract:
Graduate Program in Mathematics Education - IGCE
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Foraminiferal data were obtained from 66 box-core samples on the southeastern Brazilian upper margin (between 23.8°-25.9° S and 42.8°-46.13° W) to evaluate the benthic foraminiferal fauna distribution and its relation to selected abiotic parameters. We focused on areas with different primary production regimes on the southern Brazilian margin, which is generally considered an oligotrophic region. The total density (D), richness (R), mean diversity (H̄), average living depth (ALD(x)) and percentages of specimens from different microhabitats (epifauna, shallow infauna, intermediate infauna and deep infauna) were analyzed. The dominant species identified were Uvigerina spp., Globocassidulina subglobosa, Bulimina marginata, Adercotryma wrighti, Islandiella norcrossi, Rhizammina spp. and Brizalina sp. We also established a set of mathematical functions for analyzing the vertical foraminiferal distribution patterns, providing a quantitative tool for correlating the microfaunal density distributions with abiotic factors. In general, the cores fitted by pure exponential decay functions were related to the oligotrophic conditions prevalent on the Brazilian margin and to the flow of the Brazil Current (BC). Different foraminiferal responses were identified in cores located in higher-productivity zones, such as the northern and southern regions of the study area: these cores contained high percentages of infauna, and the functions used to fit their profiles differ appreciably from a pure exponential, reflecting the significant living fauna in deeper layers of the sediment. One of the main factors underlying the different foraminiferal assemblage responses may be the differences in primary productivity of the water column and, consequently, in the estimated carbon flux to the sea floor. Nevertheless, bottom water velocities, substrate type and water depth also need to be considered.
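The "pure exponential decay functions" mentioned above can be fitted to a vertical density profile with standard non-linear least squares. The sketch below is illustrative only: the depth and density values are made up, and scipy's curve_fit stands in for whatever software the authors actually used.

```python
# Minimal sketch: fit a pure exponential decay N(z) = N0 * exp(-k * z) to a
# hypothetical vertical profile of living foraminiferal density.
# All data values are illustrative, not from the study.
import numpy as np
from scipy.optimize import curve_fit

depth_cm = np.array([0.5, 1.5, 2.5, 3.5, 4.5, 6.0, 8.0])        # mid-points of sediment slices
density = np.array([120.0, 70.0, 42.0, 25.0, 15.0, 6.0, 2.0])   # individuals per sample (made up)

def exp_decay(z, n0, k):
    """Pure exponential decay of density with depth in the sediment."""
    return n0 * np.exp(-k * z)

params, cov = curve_fit(exp_decay, depth_cm, density, p0=(100.0, 0.5))
n0, k = params
residuals = density - exp_decay(depth_cm, *params)
print(f"N0 = {n0:.1f}, k = {k:.2f} cm^-1, RMS residual = {np.sqrt(np.mean(residuals**2)):.2f}")
```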
Abstract:
Every seismic event produces seismic waves which travel throughout the Earth. Seismology is the science of interpreting measurements to derive information about the structure of the Earth. Seismic tomography is the most powerful tool for determining the 3D structure of the Earth's deep interior. Tomographic models obtained at global and regional scales are an underlying tool for determining the geodynamical state of the Earth, showing evident correlation with other geophysical and geological characteristics. Global tomographic images of the Earth can be written as linear combinations of basis functions from a specifically chosen set, defining the model parameterization. A number of different parameterizations are commonly seen in the literature: seismic velocities in the Earth have been expressed, for example, as combinations of spherical harmonics or by means of the simpler characteristic functions of discrete cells. With this work we focus our attention on this aspect, evaluating a new type of parameterization based on wavelet functions. It is known from classical Fourier theory that a signal can be expressed as the sum of a, possibly infinite, series of sines and cosines. This sum is often referred to as a Fourier expansion. The big disadvantage of a Fourier expansion is that it has only frequency resolution and no time resolution. Wavelet analysis (or the wavelet transform) is probably the most recent solution to overcome the shortcomings of Fourier analysis. The fundamental idea behind this analysis is to study a signal according to scale. Wavelets, in fact, are mathematical functions that cut up data into different frequency components and then study each component with a resolution matched to its scale, so they are especially useful in the analysis of non-stationary processes that contain multi-scale features, discontinuities and sharp spikes. Wavelets are essentially used in two ways when applied to geophysical processes or signals: 1) as a basis for representation or characterization of a process; 2) as an integration kernel for analysis, to extract information about the process. These two types of application are the object of study of this work. We first use wavelets as a basis to represent and solve the tomographic inverse problem. After a brief introduction to seismic tomography theory, we assess the power of wavelet analysis in the representation of two different types of synthetic models; we then apply it to real data, obtaining surface-wave phase velocity maps and evaluating its abilities by comparison with another type of parameterization (i.e., block parameterization). For the second type of wavelet application we analyze the ability of the continuous wavelet transform in spectral analysis, starting again with some synthetic tests to evaluate its sensitivity and capability, and then applying the same analysis to real data to obtain local correlation maps between different models at the same depth or between different profiles of the same model.
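As a minimal illustration of the first use of wavelets described above (a basis that separates a signal by scale), the sketch below applies one Haar analysis step to a hypothetical 1-D anomaly profile. This is not the parameterization used in the work, just the simplest wavelet example, with made-up values.

```python
# Illustrative sketch (not the authors' code): one Haar wavelet analysis step splits
# a 1-D "velocity anomaly" profile into a coarse (scale) part and a detail part,
# showing how a wavelet basis separates a signal by scale.
import numpy as np

def haar_step(signal):
    """One Haar analysis step: returns (approximation, detail) at half resolution."""
    s = np.asarray(signal, dtype=float)
    approx = (s[0::2] + s[1::2]) / np.sqrt(2.0)
    detail = (s[0::2] - s[1::2]) / np.sqrt(2.0)
    return approx, detail

def haar_inverse(approx, detail):
    """Exact reconstruction from one analysis step."""
    out = np.empty(2 * approx.size)
    out[0::2] = (approx + detail) / np.sqrt(2.0)
    out[1::2] = (approx - detail) / np.sqrt(2.0)
    return out

profile = np.array([1.0, 1.2, 1.1, 0.9, 3.0, 3.2, 0.8, 1.0])  # hypothetical anomaly values
a, d = haar_step(profile)
print("approximation:", a)
print("detail:       ", d)
print("reconstruction matches:", np.allclose(haar_inverse(a, d), profile))
```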
Abstract:
The aim of this study is to develop a new, simple method for analyzing one-dimensional transcranial magnetic stimulation (TMS) mapping studies in humans. Motor evoked potentials (MEP) were recorded from the abductor pollicis brevis (APB) muscle during stimulation at nine different positions on the scalp along a line passing through the APB hot spot and the vertex. Non-linear curve fitting according to the Levenberg-Marquardt algorithm was performed on the averaged amplitude values obtained at all points to find the best-fitting symmetrical and asymmetrical peak functions. Several peak functions could be fitted to the experimental data. Across all subjects, a symmetric, bell-shaped curve based on the complementary error function (erfc) gave the best results. This function is characterized by three parameters giving its amplitude, position, and width. None of the mathematical functions tested with fewer or more than three parameters fitted better. The amplitude and position parameters of the erfc were highly correlated with the amplitude at the hot spot and with the location of the center of gravity of the TMS curve. In conclusion, non-linear curve fitting is an accurate method for the mathematical characterization of one-dimensional TMS curves. This is the first method that provides information on amplitude, position and width simultaneously.
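A hedged sketch of the kind of fit described: a three-parameter symmetric peak built from the complementary error function, fitted with a Levenberg-Marquardt least-squares routine to hypothetical MEP amplitudes at nine scalp positions. The exact parameterisation used in the study may differ; all data values are invented.

```python
# Sketch: three-parameter erfc-based peak (amplitude, position, width) fitted to
# hypothetical MEP amplitudes with a Levenberg-Marquardt routine.
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erfc

positions_cm = np.linspace(-4, 4, 9)                                   # 9 stimulation sites along the line
mep_mV = np.array([0.05, 0.1, 0.4, 1.1, 1.6, 1.2, 0.5, 0.15, 0.05])   # illustrative MEP amplitudes

def erfc_peak(x, amplitude, center, width):
    """Symmetric bell-shaped peak: amplitude * erfc(|x - center| / width)."""
    return amplitude * erfc(np.abs(x - center) / width)

popt, pcov = curve_fit(erfc_peak, positions_cm, mep_mV, p0=(1.0, 0.0, 2.0), method="lm")
amp, pos, wid = popt
print(f"amplitude = {amp:.2f} mV, position = {pos:.2f} cm, width = {wid:.2f} cm")
```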
Abstract:
Knowledge of the time interval since death (post-mortem interval, PMI) has an enormous legal, criminological and psychological impact. Aiming to find an objective method for determining PMIs in forensic medicine, 1H-MR spectroscopy (1H-MRS) was used in a sheep head model to follow changes in brain metabolite concentrations after death. Following the characterization of newly observed metabolites (Ith et al., Magn. Reson. Med. 2002; 5: 915-920), the full set of acquired spectra was analyzed statistically to provide a quantitative estimation of PMIs with their respective confidence limits. In a first step, analytical mathematical functions are proposed to describe the time courses of 10 metabolites in the decomposing brain up to 3 weeks post-mortem. Subsequently, the inverted functions are used to predict PMIs based on the measured metabolite concentrations. Individual PMIs calculated from five different metabolites are then pooled, weighted by their inverse variances. The predicted PMIs from all individual examinations in the sheep model are compared with the known true times. In addition, four human cases with forensically estimated PMIs are compared with predictions based on single in situ MRS measurements. Interpretation of the individual sheep examinations gave a good correlation up to 250 h post-mortem, demonstrating that the predicted PMIs are consistent with the data used to generate the model. Comparison of the estimated PMIs with the forensically determined PMIs in the four human cases shows an adequate correlation. Current PMI estimates based on forensic methods typically suffer from uncertainties on the order of days to weeks, without mathematically defined confidence information. In contrast, a single 1H-MRS measurement of brain tissue in situ yields PMIs with defined and favorable confidence intervals in the range of hours, thus offering a quantitative and objective method for the determination of PMIs.
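The pooling step described above, individual PMIs weighted by their inverse variances, can be written in a few lines. The numbers below are illustrative only, not from the study.

```python
# Sketch of inverse-variance weighted pooling of per-metabolite PMI estimates.
# All values are illustrative.
import numpy as np

pmi_hours = np.array([61.0, 55.0, 70.0, 64.0, 58.0])   # PMI predicted from 5 metabolites (made up)
variance = np.array([40.0, 90.0, 150.0, 60.0, 80.0])   # variance of each prediction, h^2 (made up)

weights = 1.0 / variance
pooled_pmi = np.sum(weights * pmi_hours) / np.sum(weights)
pooled_var = 1.0 / np.sum(weights)                      # variance of the pooled estimate
ci95 = 1.96 * np.sqrt(pooled_var)                       # approximate 95% confidence half-width

print(f"pooled PMI = {pooled_pmi:.1f} h +/- {ci95:.1f} h (95% CI)")
```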
Abstract:
Motivated by these difficulties, Castillo et al. (2012) made some suggestions on how to build consistent stochastic models, avoiding the selection of easy-to-use mathematical functions in favour of functions derived from a set of properties that the model must satisfy.
Abstract:
Many computer vision and human-computer interaction applications developed in recent years require evaluating complex and continuous mathematical functions as an essential step toward proper operation. However, rigorous evaluation of this kind of function often implies a very high computational cost, unacceptable in real-time applications. To alleviate this problem, functions are commonly approximated by simpler piecewise-polynomial representations. Following this idea, we propose a novel, efficient, and practical technique to evaluate complex and continuous functions using a nearly optimal design of two types of piecewise linear approximations in the case of a large budget of evaluation subintervals. To this end, we develop a thorough error analysis that yields asymptotically tight bounds to accurately quantify the approximation performance of both representations. It improves upon previous error estimates and allows the user to control the trade-off between the approximation error and the number of evaluation subintervals. To guarantee real-time operation, the method is suitable for, but not limited to, an efficient implementation in modern Graphics Processing Units (GPUs), where it outperforms previous alternative approaches by exploiting the fixed-function interpolation routines present in their texture units. The proposed technique is a good match for any application requiring the evaluation of continuous functions. We have measured its quality and efficiency in detail on several functions, and in particular on the Gaussian function, because it is extensively used in many areas of computer vision and cybernetics and is expensive to evaluate.
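To make the error/subinterval trade-off concrete, the sketch below approximates the Gaussian with a uniform piecewise-linear interpolant and measures the maximum error for several subinterval budgets. It is a simple baseline under assumed settings, not the paper's nearly optimal design or its GPU implementation.

```python
# Baseline sketch: uniform piecewise-linear interpolation of the Gaussian on [-4, 4]
# and the resulting maximum approximation error for increasing subinterval counts.
import numpy as np

def gaussian(x):
    return np.exp(-0.5 * x * x)

def max_pwl_error(n_intervals, lo=-4.0, hi=4.0, n_test=100_000):
    """Max |f - PWL interpolant| over a dense test grid, breakpoints on a uniform grid."""
    knots = np.linspace(lo, hi, n_intervals + 1)
    x = np.linspace(lo, hi, n_test)
    approx = np.interp(x, knots, gaussian(knots))   # linear interpolation between knots
    return np.max(np.abs(gaussian(x) - approx))

for n in (16, 64, 256, 1024):
    print(f"{n:5d} subintervals -> max error ~ {max_pwl_error(n):.2e}")
```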
Abstract:
This thesis presents an approach to cutting dynamics during turning based upon the mechanism of deformation of work material around the tool nose known as "ploughing". Starting from the shearing process in the cutting zone and accounting for ploughing, new mathematical models relating turning force components to cutting conditions, tool geometry and tool vibration are developed. These models are developed separately for steady-state and for oscillatory turning with new and worn tools. Experimental results are used to determine mathematical functions expressing the parameters introduced by the steady-state model in the case of a new tool. The form of these functions is of general validity, though their coefficients depend on work and tool materials. Good agreement is achieved between experimental and predicted forces. On one hand, the model is extended to include different work materials by introducing a hardness factor, and it provides good predictions when predicted forces are compared to present and published experimental results. On the other hand, the extension of the ploughing model to turning with a worn edge showed the ability of the model to predict machining forces during steady-state turning with the worn flank of the tool. In the development of the dynamic models, the dynamic turning force equations define the cutting process as a system for which vibration of the tool tip in the feed direction is the input and the measured forces are the output. The model takes into account the shear plane oscillation and the variation of the cutting configuration in response to tool motion. Theoretical expressions of the turning forces are obtained for new and worn cutting edges. The dynamic analysis revealed the interaction between the cutting mechanism and the machine tool structure. The effect of the machine tool and tool post is accounted for by using experimental data on the transfer function of the tool post system. Steady-state coefficients are corrected to include the changes in the cutting configuration with tool vibration and are used in the dynamic model. A series of oscillatory cutting tests at various conditions and various tool flank wear levels was carried out, and the experimental results are compared with model-predicted forces. Good agreement between predictions and experiments was achieved over a wide range of cutting conditions. This research bridges the gap between the analysis of vibration and of turning forces in turning. It offers an explicit expression of the dynamic turning force generated during machining and highlights the relationships between tool wear, tool vibration and turning force. Spectral analysis of tool acceleration and turning force components led to the definition of an "Inertance Power Ratio" as a flank wear monitoring factor. A formulation of an on-line flank wear monitoring methodology is presented, showing how the results of the present model can be applied to practical in-process tool wear monitoring in turning operations.
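The abstract does not give the formula behind the "Inertance Power Ratio". One plausible reading, used in the hypothetical sketch below, is the ratio of tool-acceleration spectral power to turning-force spectral power in a band around the dominant cutting frequency; the signals, sampling rate and band are all assumptions, not the thesis's definition.

```python
# Hypothetical sketch only: a spectral "inertance power ratio" computed as the ratio
# of acceleration power to force power in a chosen frequency band, from synthetic signals.
import numpy as np
from scipy.signal import welch

fs = 10_000.0                                   # sampling rate, Hz (assumed)
t = np.arange(0, 1.0, 1.0 / fs)
force = np.sin(2 * np.pi * 120 * t) + 0.2 * np.random.randn(t.size)                # synthetic feed force
accel = 3.0 * np.sin(2 * np.pi * 120 * t + 0.5) + 0.5 * np.random.randn(t.size)    # synthetic tool acceleration

f, p_force = welch(force, fs=fs, nperseg=1024)
_, p_accel = welch(accel, fs=fs, nperseg=1024)

band = (f > 50) & (f < 500)                     # band around the dominant cutting frequency (assumed)
inertance_power_ratio = p_accel[band].sum() / p_force[band].sum()
print(f"inertance power ratio in 50-500 Hz band: {inertance_power_ratio:.2f}")
```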
Abstract:
The Semantic Web relies on carefully structured, well-defined data to allow machines to communicate and understand one another. In many domains (e.g. geospatial) the data being described contains some uncertainty, often due to incomplete knowledge; meaningful processing of this data requires these uncertainties to be carefully analysed and integrated into the process chain. Currently, within the Semantic Web there is no standard mechanism for interoperable description and exchange of uncertain information, which renders the automated processing of such information implausible, particularly where error must be considered and captured as it propagates through a processing sequence. In particular we adopt a Bayesian perspective and focus on the case where the inputs/outputs are naturally treated as random variables. This paper discusses a solution to the problem in the form of the Uncertainty Markup Language (UncertML). UncertML is a conceptual model, realised as an XML schema, that allows uncertainty to be quantified in a variety of ways, i.e. realisations, statistics and probability distributions. UncertML is based upon a soft-typed XML schema design that provides a generic framework from which any statistic or distribution may be created. Making extensive use of Geography Markup Language (GML) dictionaries, UncertML provides a collection of definitions for common uncertainty types. Containing both written descriptions and mathematical functions encoded as MathML, the definitions within these dictionaries provide a robust mechanism for defining any statistic or distribution and can be easily extended. Uniform Resource Identifiers (URIs) are used to introduce semantics to the soft-typed elements by linking to these dictionary definitions. The INTAMAP (INTeroperability and Automated MAPping) project provides a use case for UncertML. This paper demonstrates how observation errors can be quantified using UncertML and wrapped within an Observations & Measurements (O&M) Observation. The interpolation service uses the information within these observations to influence the prediction outcome. The output uncertainties may be encoded in a variety of UncertML types, e.g. a series of marginal Gaussian distributions, a set of statistics, such as the first three marginal moments, or a set of realisations from a Monte Carlo treatment. Quantifying and propagating uncertainty in this way allows such interpolation results to be consumed by other services. This could form part of a risk management chain or a decision support system, and ultimately paves the way for complex data processing chains in the Semantic Web.
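The three kinds of uncertainty encodings mentioned above (realisations, summary statistics such as the first moments, and a marginal Gaussian distribution) can all be derived from a Monte Carlo sample, as in the sketch below. Serialising them as UncertML XML is omitted, since the element names are defined by the UncertML dictionaries themselves; the sample values are invented.

```python
# Illustrative sketch: derive the three uncertainty encodings mentioned in the abstract
# (realisations, moments, Gaussian distribution) from a synthetic Monte Carlo sample.
import numpy as np

rng = np.random.default_rng(42)
realisations = rng.normal(loc=12.3, scale=1.8, size=1000)    # hypothetical interpolation output samples

encoding_realisations = realisations[:10]                    # a (truncated) set of realisations
encoding_statistics = {                                      # first three marginal moments
    "mean": float(np.mean(realisations)),
    "variance": float(np.var(realisations, ddof=1)),
    "skewness": float(((realisations - realisations.mean())**3).mean() / realisations.std()**3),
}
encoding_gaussian = {                                        # a marginal Gaussian distribution
    "mean": float(np.mean(realisations)),
    "variance": float(np.var(realisations, ddof=1)),
}

print(encoding_statistics)
print(encoding_gaussian)
```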
Abstract:
This dissertation introduces an integrated algorithm for a new application dedicated to discriminating between electrodes that lead to a seizure onset and those that do not, using interictal subdural EEG data. The significance of this study lies in determining, among all of these channels, all of which contain interictal spikes, why some electrodes eventually lead to seizure while others do not. A first finding in the development process of the algorithm is that these interictal spikes had to be asynchronous and located in different regions of the brain before any consequential interpretations of EEG behavioral patterns were possible. A singular merit of the proposed approach is that even when the EEG data are randomly selected (independent of the onset of seizure), we are able to distinguish the channels that lead to seizure from those that do not. It is also revealed that the region of ictal activity does not necessarily evolve from the tissue located at the channels that present interictal activity, as commonly believed. The study is also significant in terms of correlating clinical features of EEG with the patient's source of ictal activity, which comes from a specific subset of the channels that present interictal activity. The contributions of this dissertation emanate from (a) the choice made on the discriminating parameters used in the implementation, (b) the unique feature space that was used to optimize the delineation process of these two types of electrodes, (c) the development of a back-propagation neural network that automated the decision-making process, and (d) the establishment of mathematical functions that elicited the reasons for this delineation process.
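As an illustration of item (c), the sketch below trains a small back-propagation network to separate channels that lead to seizure onset from those that do not, using hypothetical per-channel features. The dissertation's actual feature space, network architecture and data are not specified in the abstract; everything here is synthetic.

```python
# Hedged sketch: a small back-propagation classifier on synthetic per-channel features
# (spike rate and amplitude are placeholders for the dissertation's real feature space).
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_channels = 200
spike_rate = rng.gamma(shape=2.0, scale=1.5, size=n_channels)       # spikes per minute (made up)
spike_amplitude = rng.normal(loc=80, scale=20, size=n_channels)     # microvolts (made up)
X = np.column_stack([spike_rate, spike_amplitude])
y = (spike_rate + 0.02 * spike_amplitude + rng.normal(0, 1, n_channels) > 4.5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)
print(f"held-out accuracy on synthetic data: {clf.score(X_test, y_test):.2f}")
```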