92 results for Mathematical computing
Abstract:
This study was undertaken to characterize the effects of monotonous training at lactate minimum (LM) intensity on aerobic and anaerobic performance; glycogen concentrations in the soleus muscle, the gastrocnemius muscle and the liver; and creatine kinase (CK), free fatty acid and glucose concentrations in rats. The rats were separated into trained (n = 10), baseline (n = 10) and sedentary (n = 10) groups. The trained group was submitted to the following protocol: 60 min/day, 6 days/week, at an intensity equivalent to the LM, during a 12-week training period. The training volume was reduced after four weeks according to a sigmoid function. Total CK (U/L) increased in the trained group after 12 weeks (742.0 ± 158.5) in comparison with the baseline (319.6 ± 40.2) and sedentary (261.6 ± 42.2) groups. Free fatty acids and glycogen stores (liver, soleus muscle and gastrocnemius muscle) increased after 12 weeks of monotonous training, but aerobic and anaerobic performances were unchanged relative to the sedentary group. In summary, monotonous training at the LM increased energy substrate levels, left aerobic performance unchanged, reduced anaerobic capacity and increased the serum CK concentration; however, the rats did not achieve the predicted training volume.
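The sigmoid reduction of training volume described above can be sketched as follows. The 60 min/day ceiling comes from the abstract; the midpoint and steepness are illustrative assumptions, not the study's fitted parameters.

```python
import math

def training_volume(week, full_volume=60.0, midpoint=8.0, steepness=1.0):
    """Weekly training volume (min/day) tapered along a sigmoid.

    Hypothetical parameterization: volume stays near `full_volume`
    in the early weeks and declines sigmoidally toward week 12,
    mirroring the reduction described in the abstract.
    """
    return full_volume / (1.0 + math.exp(steepness * (week - midpoint)))
```

With these assumed parameters the volume is close to the full 60 min/day in week 1 and falls to a few minutes per day by week 12.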
Abstract:
Dosage and frequency of treatment schedules are important for successful chemotherapy. In this work we argue that cell-kill response and tumour growth should not be treated separately, and that both are therefore essential in a mathematical cancer model. This paper presents a mathematical model for the sequencing of cancer chemotherapy and surgery. Our purpose is to investigate treatments for large human tumours using a suitable cell-kill dynamics. We use biological and pharmacological data in a numerical approach in which drug administration occurs in cycles (periodic infusion) and surgery is performed instantaneously. Moreover, we also present a stability analysis for a chemotherapeutic model with continuous drug administration. In agreement with Norton & Simon [22], our results indicate that chemotherapy is less efficient in treating tumours that have reached a plateau growth level, and that combining it with surgical treatment can provide better outcomes.
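The qualitative claim about plateau tumours can be illustrated with a minimal sketch, not the paper's actual equations: Gompertz growth whose net rate is scaled down during infusion, in the spirit of the Norton-Simon hypothesis (kill proportional to growth rate). All parameter values below are assumptions for illustration.

```python
import math

def simulate(n0, days, growth=0.1, capacity=1e12, kill=2.0,
             cycle=21.0, infusion_days=3.0, dt=0.05):
    """Euler integration of dN/dt = r*N*ln(K/N)*(1 - kill*c(t)),
    where c(t) = 1 during the first `infusion_days` of each `cycle`
    (periodic infusion) and 0 otherwise."""
    n = n0
    for i in range(int(days / dt)):
        t = i * dt
        c = 1.0 if (t % cycle) < infusion_days else 0.0
        dn = growth * n * math.log(capacity / n) * (1.0 - kill * c)
        n = max(n + dn * dt, 1.0)  # floor at one cell
    return n
```

Near the plateau (N close to the capacity K) the Gompertz factor ln(K/N) is small, so both regrowth and drug-induced kill are weak and the tumour barely responds, whereas a much smaller tumour under the same schedule shrinks by orders of magnitude.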
Abstract:
The present paper proposes a new hybrid multi-population genetic algorithm (HMPGA) to solve the multi-level capacitated lot sizing problem with backlogging. The method combines a multi-population metaheuristic with a fix-and-optimize heuristic and mathematical programming techniques. Four test sets from the MULTILSB (Multi-Item Lot-Sizing with Backlogging) library are solved and the results are compared with those of two recently published methods. The results show that HMPGA performed better on most of the test sets, especially when longer computing time was allowed. © 2012 Elsevier Ltd.
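The overall structure of such a hybrid, multi-population scheme can be sketched on a toy bitstring problem. This is an illustrative skeleton only, not the paper's algorithm: the fix-and-optimize step is stubbed as an exact re-optimization of one small window of variables with the rest held fixed.

```python
import random

def fitness(x):
    return sum(x)  # toy objective: maximize the number of ones

def fix_and_optimize(x, rng, window=5):
    """Re-optimize one window of variables, keeping the rest fixed.
    For the toy objective the optimal sub-solution is all ones."""
    x = x[:]
    start = rng.randrange(len(x) - window + 1)
    x[start:start + window] = [1] * window
    return x

def hmpga(n_bits=30, n_pops=3, pop_size=20, generations=60,
          migrate_every=10, seed=1):
    rng = random.Random(seed)
    pops = [[[rng.randint(0, 1) for _ in range(n_bits)]
             for _ in range(pop_size)] for _ in range(n_pops)]
    for gen in range(1, generations + 1):
        for pop in pops:
            new = []
            for _ in range(pop_size):
                # tournament selection and uniform crossover
                a = max(rng.sample(pop, 3), key=fitness)
                b = max(rng.sample(pop, 3), key=fitness)
                child = [ai if rng.random() < 0.5 else bi
                         for ai, bi in zip(a, b)]
                if rng.random() < 0.3:  # single-bit mutation
                    i = rng.randrange(n_bits)
                    child[i] ^= 1
                new.append(child)
            pop[:] = new
        if gen % migrate_every == 0:
            # migrate each population's best into the next population,
            # refined by the fix-and-optimize step
            bests = [max(pop, key=fitness) for pop in pops]
            for i, pop in enumerate(pops):
                pop[rng.randrange(pop_size)] = fix_and_optimize(bests[i - 1], rng)
    return max((max(pop, key=fitness) for pop in pops), key=fitness)
```

In the real HMPGA the local step would hand a partially fixed mixed-integer model to an exact solver rather than solve a window in closed form.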
Abstract:
An inclusive search for supersymmetric processes that produce final states with jets and missing transverse energy is performed in pp collisions at a centre-of-mass energy of 8 TeV. The data sample corresponds to an integrated luminosity of 11.7 fb⁻¹ collected by the CMS experiment at the LHC. In this search, a dimensionless kinematic variable, αT, is used to discriminate between events with genuine and misreconstructed missing transverse energy. The search is based on an examination of the number of reconstructed jets per event, the scalar sum of transverse energies of these jets, and the number of these jets identified as originating from bottom quarks. No significant excess of events over the standard model expectation is found. Exclusion limits are set in the parameter space of simplified models, with a special emphasis on both compressed-spectrum scenarios and direct or gluino-induced production of third-generation squarks. For the case of gluino-mediated squark production, gluino masses up to 950-1125 GeV are excluded depending on the assumed model. For the direct pair-production of squarks, masses up to 450 GeV are excluded for a single light first- or second-generation squark, increasing to 600 GeV for bottom squarks. © 2013 CERN for the benefit of the CMS collaboration.
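For the dijet case, αT has a compact closed form, αT = ET(j2)/MT, with MT the transverse mass of the jet pair. A minimal sketch under simplifying assumptions (massless jets, two-jet events only):

```python
import math

def alpha_t(et1, phi1, et2, phi2):
    """Dijet alpha_T = ET(softer jet) / MT of the jet pair.

    Well-measured back-to-back dijets give alpha_T <= 0.5; genuine
    missing transverse energy can push alpha_T above 0.5, while jet
    mismeasurement alone pulls it below 0.5.
    """
    ht = et1 + et2                                 # scalar ET sum
    px = et1 * math.cos(phi1) + et2 * math.cos(phi2)
    py = et1 * math.sin(phi1) + et2 * math.sin(phi2)
    mt = math.sqrt(max(ht * ht - px * px - py * py, 0.0))
    return min(et1, et2) / mt
```

A perfectly balanced back-to-back pair sits exactly at the 0.5 endpoint; losing energy from one jet moves the value below 0.5, which is what makes the cut αT > 0.5 robust against misreconstruction.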
Abstract:
This paper develops a novel, fully analytic model for vibration analysis of solid-state electronic components. The model is as accurate as finite element models yet numerically light enough to permit quick design trade-offs and statistical analysis. The paper presents the development of the model, a comparison with finite elements and an application to a common engineering problem. A gull-wing flat pack component was selected as the benchmark test case, although the presented methodology is applicable to a wide range of component packages. Results showed very good agreement between the presented method and finite elements, and demonstrated how the method can use standard test data in a general application. © 2013 Elsevier Ltd.
Abstract:
Parametric VaR (Value-at-Risk) is widely used due to its simplicity and easy calculation. However, the normality assumption, often used in the estimation of the parametric VaR, does not provide satisfactory estimates for risk exposure. Therefore, this study suggests a method for computing the parametric VaR based on goodness-of-fit tests using the empirical distribution function (EDF) for extreme returns, and compares the feasibility of this method for the banking sector in an emerging market and in a developed one. The paper also discusses possible theoretical contributions in related fields like enterprise risk management (ERM). © 2013 Elsevier Ltd.
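A minimal sketch of the ingredients (illustrative, not the paper's exact procedure): fit a parametric distribution to the returns, measure the distance between its CDF and the empirical distribution function with a Kolmogorov-Smirnov-type statistic, and read the parametric VaR off the fitted quantile. The normal fit below is the baseline assumption the paper argues is unsatisfactory for extreme returns.

```python
from statistics import NormalDist, mean, stdev

def ks_statistic(returns, dist):
    """Max distance between the EDF of the sample and the fitted CDF."""
    xs = sorted(returns)
    n = len(xs)
    return max(max(abs((i + 1) / n - dist.cdf(x)),
                   abs(i / n - dist.cdf(x)))
               for i, x in enumerate(xs))

def parametric_var(returns, q=0.99):
    """One-period parametric VaR (as a positive loss) under normality,
    plus the EDF goodness-of-fit statistic for the fitted distribution."""
    dist = NormalDist(mean(returns), stdev(returns))
    return -dist.inv_cdf(1.0 - q), ks_statistic(returns, dist)
```

A large KS statistic signals that the normal assumption misfits the tails, which is the motivation for the EDF-based alternative the abstract proposes.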
Abstract:
In this study, the flocculation process in continuous systems with chambers in series was analyzed using the classical kinetic model of aggregation and break-up proposed by Argaman and Kaufman, which incorporates two main parameters, Ka and Kb. Typical values for these parameters were used, i.e., Ka = 3.68 × 10⁻⁵ to 1.83 × 10⁻⁴ and Kb = 1.83 × 10⁻⁷ to 2.30 × 10⁻⁷ s⁻¹. The analysis consisted of simulating system behavior under different operating conditions, including variations in the number of chambers and the use of fixed or scaled velocity gradients in the units. The response variable in all simulations was the total retention time necessary to achieve a given flocculation efficiency, determined by conventional solution methods for the nonlinear algebraic equations corresponding to the material balances on the system. Numbers of chambers ranging from 1 to 5, velocity gradients of 20-60 s⁻¹ and flocculation efficiencies of 50-90% were adopted.
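The steady-state Argaman-Kaufman balance for N completely mixed chambers in series can be sketched and solved for the retention time that reaches a target efficiency. The Ka and Kb values below sit inside the typical ranges quoted above; the bisection solver stands in for the "conventional solution methods" mentioned in the abstract.

```python
def residual_ratio(T, n_chambers, G, Ka, Kb):
    """n_N / n_0 after N chambers in series, each with retention time T (s).

    Per-chamber steady state: (n_prev - n_i)/T = Ka*G*n_i - Kb*G^2*n_0,
    i.e. aggregation removes primary particles while break-up restores
    them in proportion to the inlet concentration.
    """
    ratio = 1.0
    for _ in range(n_chambers):
        ratio = (ratio + Kb * G**2 * T) / (1.0 + Ka * G * T)
    return ratio

def total_retention_time(n_chambers, G, efficiency,
                         Ka=1.0e-4, Kb=2.0e-7):
    """Total retention time (s) needed for a target flocculation
    efficiency, found by bisection on the per-chamber time."""
    target = 1.0 - efficiency
    lo, hi = 0.0, 1.0e6
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if residual_ratio(mid, n_chambers, G, Ka, Kb) > target:
            lo = mid  # not enough time yet
        else:
            hi = mid
    return n_chambers * 0.5 * (lo + hi)
```

With these illustrative parameters the sketch reproduces the paper's qualitative finding: splitting the same process into more chambers in series reduces the total retention time required for a given efficiency.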
Abstract:
In this article, the authors investigate, from an interdisciplinary perspective, possible ethical implications of the presence of ubiquitous computing systems in human perception/action. The term ubiquitous computing characterizes information-processing capacity from computers that are available everywhere and all the time, integrated into everyday objects and activities. The underlying theme of this paper is the contrast between traditional considerations of the ethical issues of ubiquitous computing and the Ecological Philosophy view of its possible consequences for perception/action. The focus is an analysis of how the generalized dissemination of microprocessors in embedded systems, commanded by a ubiquitous computing system, can affect the behaviour of people considered as embodied, embedded agents.