716 results for Mathematical computing
Abstract:
Using results recently obtained for computing integrals over (non-minimal) pure spinor superspace, we compute the coefficient of the massless two-loop four-point amplitude from first principles. In contrast to the RNS formalism, where unknown normalizations of chiral determinant formulæ force the two-loop coefficient to be determined only indirectly through factorization, the computation in the pure spinor formalism can be carried out smoothly. © SISSA 2010.
Abstract:
This paper presents an individually designed prosthesis for surgical use and proposes a design methodology based on mathematical extrapolation of data from digital images obtained by tomography of the individual patient's bones. A prosthesis tailored to fit a particular patient's requirements as accurately as possible should result in more successful reconstruction, enable better planning before surgery and, consequently, fewer complications during surgery. Fast and accurate design and manufacture of personalized prostheses for surgical use in bone replacement or reconstruction is potentially feasible through the application and integration of several existing technologies, each at a different stage of maturity. Initial case-study experiments have been undertaken to validate the research concepts by making dimensional comparisons between a bone and a virtual model produced using the proposed methodology, and future research directions are discussed.
Abstract:
In this paper, a framework based on the decomposition of the first-order optimality conditions is described and applied to solve the Probabilistic Power Flow (PPF) problem in a coordinated but decentralized way in the context of multi-area power systems. The purpose of the decomposition framework is to solve the problem iteratively through smaller subproblems, each associated with one area of the power system. This strategy allows the probabilistic analysis of the variables of interest in a particular area without explicit knowledge of the network data of the other interconnected areas; only border information related to the tie-lines between areas needs to be exchanged. An efficient method for probabilistic analysis, considering uncertainty in n system loads, is applied. The proposal is to use a particular case of the point estimate method, known as the Two-Point Estimate Method (TPM), rather than the traditional approach based on Monte Carlo simulation. The main feature of the TPM is that it requires solving only 2n power flows to obtain the behavior of any random variable. An iterative coordination algorithm between areas is also presented. This algorithm solves the multi-area PPF problem in a decentralized way, ensures the independent operation of each area and integrates the decomposition framework and the TPM appropriately. The IEEE RTS-96 system is used to show the operation and effectiveness of the proposed approach, and Monte Carlo simulations are used to validate the results. © 2011 IEEE.
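The Two-Point Estimate Method above evaluates the model only 2n times, once at each of two concentration points per uncertain input. A minimal sketch of Hong's 2n-point scheme for the zero-skewness case (the concentrations μ_k ± √n·σ_k with weights 1/(2n) are the textbook values; the generic function `f` stands in for a power-flow solve, which this sketch does not implement):

```python
import numpy as np

def two_point_estimate(f, mu, sigma):
    # Hong's 2n-point scheme (zero-skewness case): for each of the n
    # inputs, evaluate f at mu_k +/- sqrt(n)*sigma_k with the other
    # inputs held at their means; weight every evaluation by 1/(2n).
    mu, sigma = np.asarray(mu, float), np.asarray(sigma, float)
    n = mu.size
    vals = []
    for k in range(n):
        for s in (+1.0, -1.0):
            x = mu.copy()
            x[k] += s * np.sqrt(n) * sigma[k]
            vals.append(f(x))
    vals = np.array(vals)
    w = 1.0 / (2 * n)
    m1 = w * vals.sum()           # approximate E[f]
    m2 = w * (vals ** 2).sum()    # approximate E[f^2]
    return m1, np.sqrt(max(m2 - m1 ** 2, 0.0))
```

For a linear output the scheme reproduces the mean and standard deviation exactly; in the PPF setting `f` would map the uncertain loads to the quantity of interest.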
Abstract:
This paper presents three methods for the automatic detection of dust devil tracks in images of Mars. The methods are based mainly on Mathematical Morphology, and their performance is analyzed and compared. A dataset of 21 images from the surface of Mars, representative of the diversity of those track features, was used to develop, test and evaluate the methods, comparing their outputs with manually produced ground-truth images. Methods 1 and 3, based on the closing top-hat and the path-closing top-hat, respectively, showed similar mean accuracies of around 90%, but the processing time was much greater for method 1 than for method 3. Method 2, based on radial closing, was the fastest but showed the worst mean accuracy. Processing time was thus the tiebreak factor. © 2011 Springer-Verlag.
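The closing top-hat underlying methods 1 and 3 is the difference between a morphological closing and the original image, which highlights dark structures (such as dust devil tracks) narrower than the structuring element. A minimal NumPy-only sketch with a flat square structuring element (the abstract does not specify the element actually used, so the 3×3 square here is illustrative):

```python
import numpy as np

def _morph(img, k, op):
    # Apply a sliding max (dilation) or min (erosion) over a flat
    # (2k+1)x(2k+1) square structuring element, with edge padding.
    p = np.pad(img, k, mode='edge')
    h, w = img.shape
    shifts = [p[i:i + h, j:j + w]
              for i in range(2 * k + 1) for j in range(2 * k + 1)]
    return op(np.stack(shifts), axis=0)

def closing_tophat(img, k=1):
    # closing = erosion of the dilation; the (black) top-hat
    # closing - original responds to dark features smaller than
    # the structuring element.
    dilated = _morph(img, k, np.max)
    closed = _morph(dilated, k, np.min)
    return closed - img
```

On a bright image with a single dark pixel, the top-hat is zero everywhere except at that pixel, which is the behavior the track-detection methods exploit.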
Abstract:
According to Peirce, one of the most important philosophical problems is continuity. Consequently, he set forth an innovative and peculiar approach intended to elucidate at once its mathematical and metaphysical challenges through proper non-classical logical reasoning. I will restrict my argument to the definition of the different types of discrete collections according to Peirce, with special regard to the phenomenon called premonition of continuity (Peirce, 1976, Vol. 3, p. 87, c. 1897). © 2012 Copyright Taylor and Francis Group, LLC.
Abstract:
Includes bibliography
Abstract:
This study was undertaken to characterize the effects of monotonous training at lactate minimum (LM) intensity on aerobic and anaerobic performance; glycogen concentrations in the soleus muscle, the gastrocnemius muscle and the liver; and creatine kinase (CK), free fatty acid and glucose concentrations in rats. The rats were separated into trained (n = 10), baseline (n = 10) and sedentary (n = 10) groups. The trained group followed a protocol of 60 min/day, 6 days/week, at an intensity equivalent to LM during the 12-week training period. The training volume was reduced after four weeks according to a sigmoid function. Total CK (U/L) increased in the trained group after 12 weeks (742.0 ± 158.5) in comparison with the baseline (319.6 ± 40.2) and sedentary (261.6 ± 42.2) groups. Free fatty acids and glycogen stores (liver, soleus muscle and gastrocnemius muscle) increased after 12 weeks of monotonous training, but aerobic and anaerobic performance was unchanged relative to the sedentary group. Monotonous training at LM increased the level of energy substrates, left aerobic performance unchanged, reduced anaerobic capacity and increased the serum CK concentration; however, the rats did not achieve the predicted training volume.
Abstract:
Dosage and frequency of treatment schedules are important for successful chemotherapy. However, in this work we argue that cell-kill response and tumoral growth should not be seen as separate, and that both are therefore essential in a mathematical cancer model. This paper presents a mathematical model for the sequencing of cancer chemotherapy and surgery. Our purpose is to investigate treatments for large human tumours considering a suitable cell-kill dynamics. We use biological and pharmacological data in a numerical approach, where drug administration occurs in cycles (periodic infusion) and surgery is performed instantaneously. Moreover, we also present a stability analysis for a chemotherapeutic model with continuous drug administration. In agreement with Norton & Simon [22], our results indicate that chemotherapy is less efficient in treating tumours that have reached a plateau level of growth, and that a combination with surgical treatment can provide better outcomes.
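The qualitative behavior described above can be illustrated with a toy growth-plus-kill simulation. The sketch below uses Gompertz growth with a periodic log-kill infusion term; all parameter values are hypothetical and purely illustrative, not the paper's fitted model:

```python
import math

def simulate(n0, r=0.1, k=1e12, dose=0.0, period=7.0, infusion=1.0,
             days=84, dt=0.01):
    # Euler integration of a Gompertz tumour with periodic infusion:
    #   dN/dt = r*N*ln(K/N) - c(t)*N,
    # where c(t) = dose during the first `infusion` days of each
    # `period`-day cycle and 0 otherwise (illustrative parameters only).
    n, t = n0, 0.0
    for _ in range(int(days / dt)):
        c = dose if (t % period) < infusion else 0.0
        n += dt * (r * n * math.log(k / n) - c * n)
        t += dt
    return n

untreated = simulate(1e9)            # approaches the plateau K
treated = simulate(1e9, dose=2.0)    # held well below the plateau
```

Near the plateau the growth term r·N·ln(K/N) vanishes, so the same periodic kill term removes proportionally more of the remaining growth capacity, in line with the Norton-Simon observation quoted above.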
Abstract:
The present paper proposes a new hybrid multi-population genetic algorithm (HMPGA) as an approach to solve the multi-level capacitated lot-sizing problem with backlogging. This method combines a multi-population-based metaheuristic with a fix-and-optimize heuristic and mathematical programming techniques. A total of four test sets from the MULTILSB (Multi-Item Lot-Sizing with Backlogging) library are solved, and the results are compared with those reached by two other recently published methods. The results show that HMPGA performed better on most of the test sets, especially when longer computing time is given. © 2012 Elsevier Ltd.
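The multi-population (island-model) skeleton underlying such a metaheuristic can be sketched generically: several populations evolve independently and periodically exchange their best individuals. This is only a minimal illustration on a toy bit-string problem; the paper's actual HMPGA additionally embeds a fix-and-optimize heuristic and mathematical programming, which are not reproduced here:

```python
import random

def multi_population_ga(fitness, n_bits, n_pops=3, pop_size=20,
                        generations=50, migrate_every=10, seed=0):
    # Island-model GA sketch: each population evolves separately by
    # elitist selection, one-point crossover and point mutation; every
    # `migrate_every` generations the best individual of each island
    # replaces the worst individual of the next island (ring topology).
    rng = random.Random(seed)
    pops = [[[rng.randint(0, 1) for _ in range(n_bits)]
             for _ in range(pop_size)] for _ in range(n_pops)]

    def step(pop):
        pop.sort(key=fitness, reverse=True)
        elite = pop[:pop_size // 2]
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n_bits)
            child = a[:cut] + b[cut:]
            child[rng.randrange(n_bits)] ^= 1   # point mutation
            children.append(child)
        return elite + children

    for g in range(1, generations + 1):
        pops = [step(p) for p in pops]
        if g % migrate_every == 0:
            for i, p in enumerate(pops):
                best = max(p, key=fitness)
                dst = pops[(i + 1) % n_pops]
                worst = min(range(len(dst)), key=lambda j: fitness(dst[j]))
                dst[worst] = best[:]
    return max((ind for p in pops for ind in p), key=fitness)

best = multi_population_ga(sum, n_bits=16)   # one-max toy problem
```

In a lot-sizing setting the bit strings would encode setup decisions, with the remaining continuous quantities resolved by the embedded mathematical programming step.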
Abstract:
An inclusive search for supersymmetric processes that produce final states with jets and missing transverse energy is performed in pp collisions at a centre-of-mass energy of 8 TeV. The data sample corresponds to an integrated luminosity of 11.7 fb-1 collected by the CMS experiment at the LHC. In this search, a dimensionless kinematic variable, αT, is used to discriminate between events with genuine and misreconstructed missing transverse energy. The search is based on an examination of the number of reconstructed jets per event, the scalar sum of transverse energies of these jets, and the number of these jets identified as originating from bottom quarks. No significant excess of events over the standard model expectation is found. Exclusion limits are set in the parameter space of simplified models, with a special emphasis on both compressed-spectrum scenarios and direct or gluino-induced production of third-generation squarks. For the case of gluino-mediated squark production, gluino masses up to 950-1125 GeV are excluded depending on the assumed model. For the direct pair-production of squarks, masses up to 450 GeV are excluded for a single light first- or second-generation squark, increasing to 600 GeV for bottom squarks. © 2013 CERN for the benefit of the CMS collaboration.
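For reference, in the dijet case the kinematic variable αT used above is conventionally built from the transverse energy of the softer jet and the transverse mass of the dijet system (this is the standard CMS definition, not spelled out in the abstract itself):

```latex
\alpha_T = \frac{E_T^{\,j_2}}{M_T},
\qquad
M_T = \sqrt{\Bigl(\textstyle\sum_i E_T^{\,j_i}\Bigr)^{2}
          - \Bigl(\textstyle\sum_i p_x^{\,j_i}\Bigr)^{2}
          - \Bigl(\textstyle\sum_i p_y^{\,j_i}\Bigr)^{2}}
```

A perfectly measured, back-to-back dijet event gives αT = 0.5, while values above 0.5 indicate genuine missing transverse energy, which is what makes the variable a discriminator against misreconstruction.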
Abstract:
Concisely presents a set of computer systems currently used for statistical tasks in developed countries, and describes the activities that CEPAL is carrying out in this field and the possibilities for regional cooperation in which it is engaged.
Abstract:
This paper develops a novel fully analytic model for vibration analysis of solid-state electronic components. The model is as accurate as finite element models yet numerically light enough to permit quick design trade-offs and statistical analysis. The paper presents the development of the model, a comparison with finite elements and an application to a common engineering problem. A gull-wing flat-pack component was selected as the benchmark test case, although the presented methodology is applicable to a wide range of component packages. Results showed very good agreement between the presented method and finite elements and demonstrated how standard test data can be used in a general application. © 2013 Elsevier Ltd.
Abstract:
Parametric VaR (Value-at-Risk) is widely used due to its simplicity and easy calculation. However, the normality assumption, often used in the estimation of the parametric VaR, does not provide satisfactory estimates for risk exposure. Therefore, this study suggests a method for computing the parametric VaR based on goodness-of-fit tests using the empirical distribution function (EDF) for extreme returns, and compares the feasibility of this method for the banking sector in an emerging market and in a developed one. The paper also discusses possible theoretical contributions in related fields like enterprise risk management (ERM). © 2013 Elsevier Ltd.
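The normality-based baseline that the study improves upon is straightforward to state: under a normal assumption, VaR at confidence level α is the α-quantile of the return distribution, reported as a positive loss. A minimal sketch using only the standard library (this is the conventional normal VaR, i.e. the benchmark the EDF-based method is compared against, not the paper's proposed estimator):

```python
from statistics import NormalDist, mean, stdev

def parametric_var(returns, alpha=0.99):
    # Normal parametric VaR: VaR_alpha = -(mu + sigma * z_{1-alpha}),
    # where z_{1-alpha} is the standard normal quantile; the sign flip
    # reports the loss as a positive number.
    mu, sigma = mean(returns), stdev(returns)
    z = NormalDist().inv_cdf(1.0 - alpha)
    return -(mu + sigma * z)
```

Because extreme returns are typically heavier-tailed than the normal distribution, this baseline tends to understate risk, which motivates fitting the tail with goodness-of-fit tests on the empirical distribution function instead.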
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Includes bibliography.