852 results for mathematical competency
Abstract:
This paper applies a routine for highway detection in high-resolution images using mathematical morphology tools. Mathematical Morphology theory quantitatively describes the geometric structures present in an image (targets or features), which motivates its use in this work. Because high-resolution images are used, the greatest difficulty in the highway detection process is the presence of trees and automobiles at the borders of the tracks. Thus, to obtain good results with the morphological tools, it was necessary to choose appropriately the structuring element used in the functions. Through the appropriate choice of morphological operators and structuring elements, it was possible to detect the highway tracks. Linear feature detection using mathematical morphology techniques can contribute to cartographic applications, such as the updating of cartographic products.
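The role of the structuring element described above can be illustrated with a minimal one-dimensional sketch: a morphological opening with a flat (line-like) structuring element removes bright features narrower than the element, the way cars or tree crowns can be suppressed while an elongated road profile survives. The signal, window radius and values below are illustrative and not taken from the paper, which works on 2D imagery.

```python
def erode(signal, radius):
    """Grayscale erosion: minimum over a sliding window (clamped at borders)."""
    n = len(signal)
    return [min(signal[max(0, i - radius):min(n, i + radius + 1)]) for i in range(n)]

def dilate(signal, radius):
    """Grayscale dilation: maximum over a sliding window (clamped at borders)."""
    n = len(signal)
    return [max(signal[max(0, i - radius):min(n, i + radius + 1)]) for i in range(n)]

def opening(signal, radius):
    """Opening (erosion then dilation): removes bright features narrower
    than the structuring element while preserving wider ones."""
    return dilate(erode(signal, radius), radius)

# A bright 1-pixel spike (e.g. a car) next to a wide bright plateau (a road
# cross-section): the opening removes the spike and keeps the plateau.
profile = [0, 0, 9, 0, 0, 3, 3, 3, 3, 3, 0, 0]
print(opening(profile, 1))  # -> [0, 0, 0, 0, 0, 3, 3, 3, 3, 3, 0, 0]
```

In 2D the same idea applies with a line segment oriented along the road as the structuring element, which is why its size and shape must match the target feature.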
Abstract:
In this paper, a framework based on the decomposition of the first-order optimality conditions is described and applied to solve the Probabilistic Power Flow (PPF) problem in a coordinated but decentralized way in the context of multi-area power systems. The purpose of the decomposition framework is to solve the problem by iteratively solving smaller subproblems associated with each area of the power system. This strategy allows the probabilistic analysis of the variables of interest in a particular area without explicit knowledge of the network data of the other interconnected areas; only the exchange of border information related to the tie-lines between areas is necessary. An efficient method for probabilistic analysis, considering uncertainty in n system loads, is applied. The proposal is to use a particular case of the point estimate method, known as the Two-Point Estimate Method (TPM), rather than the traditional approach based on Monte Carlo simulation. The main feature of the TPM is that it requires solving only 2n power flows to obtain the behavior of any random variable. An iterative coordination algorithm between areas is also presented. This algorithm solves the multi-area PPF problem in a decentralized way, ensures the independent operation of each area and integrates the decomposition framework and the TPM appropriately. The IEEE RTS-96 system is used to show the operation and effectiveness of the proposed approach, and Monte Carlo simulations are used to validate the results. © 2011 IEEE.
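The 2n-evaluation idea can be sketched as follows. This is a simplified, symmetric (zero-skewness) two-point estimate scheme in which each uncertain input is concentrated at two points around its mean; the locations and weights follow the classical Rosenblueth-style construction and are not necessarily the exact variant used in the paper, and the "model" here is a stand-in for a power flow solve.

```python
import math

def two_point_estimate(h, means, stds):
    """Symmetric Two-Point Estimate sketch: each of the n uncertain inputs
    is evaluated at mu_i +/- sqrt(n)*sigma_i with the others held at their
    means, so only 2n evaluations of h (stand-ins for power flows) are
    needed to estimate the output's mean and variance."""
    n = len(means)
    m1 = 0.0  # first raw moment of the output
    m2 = 0.0  # second raw moment of the output
    for i in range(n):
        for sign in (+1.0, -1.0):
            x = list(means)
            x[i] = means[i] + sign * math.sqrt(n) * stds[i]
            y = h(x)               # one "power flow" evaluation
            m1 += y / (2 * n)
            m2 += y * y / (2 * n)
    return m1, m2 - m1 * m1       # estimated mean and variance

# For a linear response the estimates are exact:
# mean = 2*1 + 3*2 = 8, variance = 2^2*0.5^2 + 3^2*1^2 = 10.
mean, var = two_point_estimate(lambda x: 2 * x[0] + 3 * x[1], [1.0, 2.0], [0.5, 1.0])
print(mean, var)
```

For a nonlinear power flow the result is an approximation, but the cost stays at 2n deterministic solves instead of the thousands typically needed by Monte Carlo.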
Abstract:
This paper presents three methods for the automatic detection of dust devil tracks in images of Mars. The methods are mainly based on Mathematical Morphology, and their performance is analyzed and compared. A dataset of 21 images of the surface of Mars, representative of the diversity of these track features, was used for developing, testing and evaluating the methods, comparing their outputs with manually produced ground-truth images. Methods 1 and 3, based on the closing top-hat and the path-closing top-hat, respectively, showed similar mean accuracies of around 90%, but the processing time was much greater for Method 1 than for Method 3. Method 2, based on radial closing, was the fastest but showed the worst mean accuracy; accuracy was therefore the tiebreak factor. © 2011 Springer-Verlag.
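The closing top-hat underlying Methods 1 and 3 can be shown in a minimal one-dimensional sketch: a closing fills dark features narrower than the structuring element, so subtracting the original signal isolates them, the way dark dust devil tracks stand out against the brighter Martian surface. The profile and window radius below are illustrative, not data from the paper.

```python
def erode(signal, radius):
    """Grayscale erosion: minimum over a sliding window (clamped at borders)."""
    n = len(signal)
    return [min(signal[max(0, i - radius):min(n, i + radius + 1)]) for i in range(n)]

def dilate(signal, radius):
    """Grayscale dilation: maximum over a sliding window (clamped at borders)."""
    n = len(signal)
    return [max(signal[max(0, i - radius):min(n, i + radius + 1)]) for i in range(n)]

def closing(signal, radius):
    """Closing (dilation then erosion): fills dark features narrower
    than the structuring element."""
    return erode(dilate(signal, radius), radius)

def closing_top_hat(signal, radius):
    """Closing top-hat (black top-hat): closing minus the original signal.
    Narrow dark features are filled by the closing, so they appear as
    positive peaks in the difference."""
    return [c - s for c, s in zip(closing(signal, radius), signal)]

# A narrow dark track crossing a bright surface:
profile = [5, 5, 5, 1, 5, 5, 5]
print(closing_top_hat(profile, 1))  # -> [0, 0, 0, 4, 0, 0, 0]
```

The path-closing variant of Method 3 replaces the fixed window with flexible, elongated paths, which is what makes it faster on thin curvilinear tracks; that refinement is not reproduced here.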
Abstract:
According to Peirce, one of the most important philosophical problems is continuity. Consequently, he set forth an innovative and peculiar approach in order to elucidate at once its mathematical and metaphysical challenges through proper non-classical logical reasoning. I will restrict my argument to the definition of the different types of discrete collections according to Peirce, with special regard to the phenomenon called the premonition of continuity (Peirce, 1976, Vol. 3, p. 87, c. 1897). © 2012 Copyright Taylor and Francis Group, LLC.
Abstract:
Includes bibliography
Abstract:
This study was undertaken to characterize the effects of monotonous training at lactate minimum (LM) intensity on aerobic and anaerobic performance; on glycogen concentrations in the soleus muscle, the gastrocnemius muscle and the liver; and on creatine kinase (CK), free fatty acid and glucose concentrations in rats. The rats were separated into trained (n = 10), baseline (n = 10) and sedentary (n = 10) groups. The trained group trained 60 min/day, 6 days/week, at an intensity equivalent to the LM, during the 12-week training period. The training volume was reduced after four weeks according to a sigmoid function. Total CK (U/L) increased in the trained group after 12 weeks (742.0 ± 158.5) in comparison with the baseline (319.6 ± 40.2) and sedentary (261.6 ± 42.2) groups. Free fatty acids and glycogen stores (liver, soleus muscle and gastrocnemius muscle) increased after 12 weeks of monotonous training, but aerobic and anaerobic performance was unchanged relative to the sedentary group. Monotonous training at the LM intensity thus increased energy substrate levels, left aerobic performance unchanged, reduced anaerobic capacity and increased the serum CK concentration; however, the rats did not achieve the predicted training volume.
Abstract:
Dosage and frequency of treatment schedules are important for successful chemotherapy. In this work, however, we argue that cell-kill response and tumour growth should not be treated separately, and that both are therefore essential in a mathematical cancer model. This paper presents a mathematical model for the sequencing of cancer chemotherapy and surgery. Our purpose is to investigate treatments for large human tumours under suitable cell-kill dynamics. We use biological and pharmacological data in a numerical approach in which drug administration occurs in cycles (periodic infusion) and surgery is performed instantaneously. Moreover, we also present a stability analysis for a chemotherapeutic model with continuous drug administration. In agreement with Norton & Simon [22], our results indicate that chemotherapy is less efficient in treating tumours that have reached a plateau of growth and that a combination with surgical treatment can provide better outcomes.
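The plateau effect mentioned above can be illustrated with a minimal sketch: Gompertzian growth with a Norton-Simon-style cell-kill term, in which the kill rate is proportional to the instantaneous growth rate. The parameters r, K, c and the Euler scheme below are illustrative assumptions, not the fitted model or values from the paper.

```python
import math

def treated_fraction(n0, r=0.1, k=1.0, c=2.0, t_end=10.0, dt=0.01):
    """Euler integration of Gompertzian growth under a Norton-Simon-style
    kill term proportional to the growth rate:
        dN/dt = r*N*ln(K/N) - c*r*N*ln(K/N) = (1 - c)*r*N*ln(K/N).
    With drug strength c > 1 the tumour shrinks, but the kill term vanishes
    as N -> K, so a plateau-phase tumour responds much more slowly.
    Returns the surviving fraction N(t_end)/N(0)."""
    n, t = n0, 0.0
    while t < t_end:
        growth = r * n * math.log(k / n)
        n += dt * (1.0 - c) * growth
        t += dt
    return n / n0

print(treated_fraction(0.01))  # small, fast-growing tumour: strong response
print(treated_fraction(0.90))  # near-plateau tumour: weak response
```

The surviving fraction is far smaller for the small tumour, which is the qualitative point of the abstract's conclusion: near the plateau, chemotherapy alone is inefficient and surgery (an instantaneous reduction of N) becomes attractive.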
Abstract:
An inclusive search for supersymmetric processes that produce final states with jets and missing transverse energy is performed in pp collisions at a centre-of-mass energy of 8 TeV. The data sample corresponds to an integrated luminosity of 11.7 fb-1 collected by the CMS experiment at the LHC. In this search, a dimensionless kinematic variable, αT, is used to discriminate between events with genuine and misreconstructed missing transverse energy. The search is based on an examination of the number of reconstructed jets per event, the scalar sum of transverse energies of these jets, and the number of these jets identified as originating from bottom quarks. No significant excess of events over the standard model expectation is found. Exclusion limits are set in the parameter space of simplified models, with a special emphasis on both compressed-spectrum scenarios and direct or gluino-induced production of third-generation squarks. For the case of gluino-mediated squark production, gluino masses up to 950-1125 GeV are excluded depending on the assumed model. For the direct pair-production of squarks, masses up to 450 GeV are excluded for a single light first- or second-generation squark, increasing to 600 GeV for bottom squarks. © 2013 CERN for the benefit of the CMS collaboration.
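For the dijet case, αT is conventionally defined as the transverse energy of the softer jet divided by the transverse mass of the jet pair; a sketch follows, with purely illustrative jet values (pt in GeV, azimuth in radians) rather than CMS data.

```python
import math

def alpha_t(jet1, jet2):
    """alpha_T for a dijet system: E_T of the softer jet over the transverse
    mass M_T of the pair. A perfectly measured, back-to-back balanced dijet
    gives alpha_T = 0.5 exactly; jet mismeasurement pushes alpha_T below 0.5,
    while genuine missing transverse energy can push it above 0.5, which is
    what makes the variable discriminating."""
    (pt1, phi1), (pt2, phi2) = jet1, jet2
    sum_et = pt1 + pt2
    px = pt1 * math.cos(phi1) + pt2 * math.cos(phi2)
    py = pt1 * math.sin(phi1) + pt2 * math.sin(phi2)
    mt = math.sqrt(sum_et ** 2 - px ** 2 - py ** 2)
    return min(pt1, pt2) / mt

balanced = alpha_t((100.0, 0.0), (100.0, math.pi))    # -> 0.5
mismeasured = alpha_t((100.0, 0.0), (80.0, math.pi))  # below 0.5
print(balanced, mismeasured)
```

Events with misreconstructed missing energy from jet mismeasurement thus cluster below 0.5 and can be rejected by an αT threshold, as exploited in the search.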
Abstract:
Based on literature data from HT-29 cell monolayers, we develop a model for their growth, analogous to an epidemic model, mixing local and global interactions. First, we propose and solve a deterministic equation for the progress of these colonies. We then add a stochastic (local) interaction and simulate the evolution of an Eden-like aggregate using dynamical Monte Carlo methods. The growth curves of both the deterministic and stochastic models are in excellent agreement with the experimental observations. The waiting-time distributions generated by our stochastic model allowed us to analyze the role of mesoscopic events. We obtain log-normal distributions in the initial stages of growth and Gaussians at long times. We interpret these outcomes in the light of cellular division events: in the early stages the phenomena are dependent on each other in a multiplicative, geometric-based process, whereas at long times they are independent. We conclude that the main ingredients for a good minimalist model of tumor growth at the mesoscopic level are intrinsic cooperative mechanisms and competitive search for space. © 2013 Elsevier Ltd.
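The dynamical Monte Carlo ingredient can be sketched as a Gillespie-type simulation of a saturating birth process: division events occur after exponentially distributed waiting times whose rate falls as the colony approaches a carrying capacity (the "competitive search for space"). The rate law, capacity and seed below are illustrative assumptions; the paper's model additionally carries the spatial Eden-like structure, which is omitted here.

```python
import random

def monte_carlo_growth(n0=5, capacity=100, rate=1.0, t_max=200.0, seed=42):
    """Dynamical (kinetic) Monte Carlo for a saturating birth process.
    Total division rate R(N) = rate * N * (1 - N/capacity); each event adds
    one cell after an exponentially distributed waiting time 1/R(N) on
    average. Returns the event times and colony sizes (the growth curve);
    the gaps between successive times are the waiting times whose
    distributions the paper analyses."""
    rng = random.Random(seed)
    n, t = n0, 0.0
    times, sizes = [t], [n]
    while n < capacity and t < t_max:
        total_rate = rate * n * (1.0 - n / capacity)
        if total_rate <= 0.0:
            break
        t += rng.expovariate(total_rate)  # exponential waiting time
        n += 1                            # one cell division event
        times.append(t)
        sizes.append(n)
    return times, sizes

times, sizes = monte_carlo_growth()
print(sizes[-1])  # the colony saturates near the carrying capacity
```

Collecting the inter-event gaps `[t2 - t1 for t1, t2 in zip(times, times[1:])]` over many runs is how one would build the waiting-time histograms discussed in the abstract.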
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
In this study, the flocculation process in continuous systems with chambers in series was analyzed using the classical kinetic model of aggregation and break-up proposed by Argaman and Kaufman, which incorporates two main parameters, Ka and Kb. Typical values for these parameters were used, i.e., Ka = 3.68 × 10^-5 to 1.83 × 10^-4 and Kb = 1.83 × 10^-7 to 2.30 × 10^-7 s^-1. The analysis consisted of simulating system behavior under different operating conditions, including variations in the number of chambers and the use of fixed or scaled velocity gradients in the units. The response variable analyzed in all simulations was the total retention time necessary to achieve a given flocculation efficiency, determined by conventional solution methods for the nonlinear algebraic equations corresponding to the material balances on the system. Numbers of chambers ranging from 1 to 5, velocity gradients of 20-60 s^-1 and flocculation efficiencies of 50-90% were adopted.
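The kind of simulation described can be sketched as follows: a steady-state balance over equal completely mixed chambers in series under the Argaman-Kaufman aggregation/break-up kinetics, with a bisection search for the total retention time that reaches a target efficiency. The recurrence is the standard CSTR-in-series form of the model; the specific Ka, Kb and G values lie within the ranges quoted above but are otherwise illustrative, not the paper's cases.

```python
def floc_ratio(t_total, chambers, g, ka, kb):
    """Steady-state primary-particle ratio N_out/N_0 for equal completely
    mixed chambers in series under the Argaman-Kaufman model:
        N_i = (N_{i-1} + Kb * G^2 * N_0 * T) / (1 + Ka * G * T),
    where T is the per-chamber retention time, Ka the aggregation and Kb
    the break-up parameter, and G the velocity gradient."""
    t = t_total / chambers
    x = 1.0  # N/N0 entering the first chamber
    for _ in range(chambers):
        x = (x + kb * g * g * t) / (1.0 + ka * g * t)
    return x

def retention_time(efficiency, chambers, g=40.0, ka=1.0e-4, kb=2.0e-7):
    """Bisection on the total retention time needed to reach a target
    flocculation efficiency 1 - N_out/N_0 (the paper's response variable).
    floc_ratio decreases monotonically with retention time."""
    target = 1.0 - efficiency
    lo, hi = 0.0, 1.0e6
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if floc_ratio(mid, chambers, g, ka, kb) > target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

t1 = retention_time(0.90, chambers=1)
t3 = retention_time(0.90, chambers=3)
print(t1, t3)  # compartmentalization reduces the required total time
```

Running the search for 1 to 5 chambers at several G values reproduces the qualitative trend studied in the paper: for the same efficiency, more chambers in series require less total retention time.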
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)