Abstract:
We use ideas on integrability in higher dimensions to define Lorentz-invariant field theories with an infinite number of local conserved currents. The models considered have a two-dimensional target space. Requiring the existence of a Lagrangian and the stability of static solutions singles out a class of models which have an additional conformal symmetry. That symmetry is used to explain the existence of an ansatz leading to solutions with non-trivial Hopf charges. © SISSA/ISAS 2002.
Abstract:
Includes bibliography
Abstract:
In this work we study a Hořava-like 5-dimensional model in the context of braneworld theory. The equations of motion of the model are obtained and, within the realm of warped geometry, we show that the model is consistent if and only if λ takes its relativistic value 1. Furthermore, we show that the problematic terms involving second-order derivatives of the warp factor are eliminated by imposing the detailed balance condition in the bulk. Afterwards, Israel's junction conditions are computed, allowing the attainment of an effective Lagrangian on the visible brane. In particular, we show that the resulting effective Lagrangian on the brane corresponds to a (3 + 1)-dimensional Hořava-like model with an emergent positive cosmological constant but without the detailed balance condition. Restoring the detailed balance condition, this time imposed on the brane, plays an interesting role by fixing the sign of the arbitrary constant β accordingly, ensuring a positive brane tension and a real energy for the graviton in its dispersion relation. The brane consistency equations are also obtained and, as a result, the model admits positive brane tensions in the compactification scheme if, and only if, β is negative and the detailed balance condition is imposed. © 2013 Springer-Verlag Berlin Heidelberg and Società Italiana di Fisica.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
In this paper, we propose a cure rate survival model by assuming that the number of competing causes of the event of interest follows a geometric distribution and that the time to event follows a Birnbaum-Saunders distribution. We consider a frequentist analysis for parameter estimation of the geometric Birnbaum-Saunders model with cure rate. Finally, we analyze a data set from the medical area. (C) 2011 Elsevier B.V. All rights reserved.
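The abstract does not spell out the model's functional form; a common formulation of a geometric cure-rate model takes the population survival as E[S(t)^M] = (1 - θ)/(1 - θ·S(t)) for M geometric with P(M = m) = (1 - θ)θ^m, so that the cure fraction is P(M = 0) = 1 - θ. A minimal sketch under that assumption, with a Birnbaum-Saunders latent survival function and purely illustrative parameter values:

```python
from math import erf, sqrt

def bs_survival(t, alpha, beta):
    """Birnbaum-Saunders survival: S(t) = 1 - Phi((sqrt(t/beta) - sqrt(beta/t)) / alpha)."""
    xi = (sqrt(t / beta) - sqrt(beta / t)) / alpha
    return 1.0 - 0.5 * (1.0 + erf(xi / sqrt(2.0)))

def pop_survival(t, alpha, beta, theta):
    """Population survival E[S(t)^M] when the number of competing causes M
    is geometric with P(M = m) = (1 - theta) * theta**m, m = 0, 1, 2, ..."""
    return (1.0 - theta) / (1.0 - theta * bs_survival(t, alpha, beta))

# Illustrative (hypothetical) parameters; the cure fraction is 1 - theta = 0.4.
alpha, beta, theta = 0.5, 2.0, 0.6
```

As t grows, S(t) → 0 and the population survival levels off at the cure fraction 1 - θ rather than decaying to zero, which is the defining feature of a cure-rate model.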
Abstract:
Long-term survival models have historically been considered for analyzing time-to-event data with a fraction of long-term survivors. However, situations in which a fraction (1 - p) of systems is subject to failure from independent competing causes, while the remaining proportion p is cured or has not presented the event of interest during the time period of the study, have not been fully considered in the literature. In order to accommodate such situations, we present in this paper a new long-term survival model. The maximum likelihood estimation procedure is discussed, as well as interval estimation and hypothesis tests. A real dataset illustrates the methodology.
Abstract:
Most superdiffusive non-Markovian random walk models assume that correlations are maintained at all time scales, e.g., fractional Brownian motion, Lévy walks, and the Elephant walk and Alzheimer walk models. In the latter two models the random walker can always "remember" the initial times near t = 0. Assuming jump size distributions with finite variance, the question naturally arises: is superdiffusion possible if the walker is unable to recall the initial times? We give a conclusive answer to this general question by studying a non-Markovian model in which the walker's memory of the past is weighted by a Gaussian centered at time t/2, at which time the walker had one half its present age, and with a standard deviation σ_t which grows linearly as the walker ages. For large widths we find that the model behaves similarly to the Elephant model, but for small widths this Gaussian memory profile model behaves like the Alzheimer walk model. We also report that the phenomenon of amnestically induced persistence, known to occur in the Alzheimer walk model, arises in the Gaussian memory profile model. We conclude that memory of the initial times is not a necessary condition for generating (log-periodic) superdiffusion.
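The abstract does not give the full update rule, so the following is a minimal sketch under one plausible reading: at each time step a past instant is recalled from a Gaussian centered at t/2 with standard deviation proportional to t, and the recalled step is repeated or reversed, in the style of the Elephant walk. The parameters `p` (repeat probability) and `rel_width` (σ_t / t) are hypothetical:

```python
import numpy as np

def gaussian_memory_walk(n_steps, p=0.75, rel_width=0.1, seed=0):
    """Non-Markovian walk with a Gaussian memory profile.

    At time t a past instant t' is drawn from N(t/2, (rel_width * t)^2),
    clipped to the existing history; the step taken at t' is repeated with
    probability p or reversed with probability 1 - p (Elephant-walk-style).
    Returns the trajectory of positions, starting at the origin.
    """
    rng = np.random.default_rng(seed)
    steps = [1 if rng.random() < 0.5 else -1]   # first step chosen at random
    for t in range(1, n_steps):
        mu, sigma = t / 2.0, max(rel_width * t, 1e-12)
        t_recalled = int(round(rng.normal(mu, sigma)))
        t_recalled = min(max(t_recalled, 0), t - 1)  # clip into the history
        recalled = steps[t_recalled]
        steps.append(recalled if rng.random() < p else -recalled)
    return np.concatenate(([0], np.cumsum(steps)))

x = gaussian_memory_walk(5000)
```

Diagnosing superdiffusion would require averaging the mean squared displacement over many realizations and fitting its growth exponent; the sketch above only generates single trajectories.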
Abstract:
The Standard Model of elementary particle physics was developed to describe the fundamental particles which constitute matter and the interactions between them. The Large Hadron Collider (LHC) at CERN in Geneva was built to solve some of the remaining open questions in the Standard Model and to explore physics beyond it, by colliding two proton beams at world-record centre-of-mass energies. The ATLAS experiment is designed to reconstruct particles and their decay products originating from these collisions. The precise reconstruction of particle trajectories plays an important role in the identification of particle jets which originate from bottom quarks (b-tagging). This thesis describes the step-wise commissioning of the ATLAS track reconstruction and b-tagging software and one of the first measurements of the b-jet production cross section in pp collisions at sqrt(s)=7 TeV with the ATLAS detector. The performance of the track reconstruction software was studied in great detail, first using data from cosmic ray showers and then collisions at sqrt(s)=900 GeV and 7 TeV. The good understanding of the track reconstruction software allowed a very early deployment of the b-tagging algorithms. First studies of these algorithms and the measurement of the b-tagging efficiency in the data are presented. They agree well with predictions from Monte Carlo simulations. The b-jet production cross section was measured with the 2010 dataset recorded by the ATLAS detector, employing muons in jets to estimate the fraction of b-jets. The measurement is in good agreement with the Standard Model predictions.
Abstract:
The field of computational neuroscience develops mathematical models to describe neuronal systems, with the aim of better understanding the nervous system. Historically, the integrate-and-fire model, developed by Lapicque in 1907, was the first model describing a neuron. In 1952, Hodgkin and Huxley [8] described the so-called Hodgkin-Huxley model in the article "A Quantitative Description of Membrane Current and Its Application to Conduction and Excitation in Nerve". The Hodgkin-Huxley model is one of the most successful and widely used biological neuron models. Based on experimental data from the squid giant axon, Hodgkin and Huxley developed their mathematical model as a four-dimensional system of first-order ordinary differential equations. One of these equations characterizes the membrane potential as a process in time, whereas the other three depict the opening and closing states of sodium and potassium ion channels. The rate of change of the membrane potential is proportional to the sum of the ionic currents flowing across the membrane and an externally applied current. For various types of external input the membrane potential behaves differently. This thesis considers the following three types of input: (i) Rinzel and Miller [15] calculated an interval of amplitudes for a constant applied current within which the membrane potential spikes repetitively; (ii) Aihara, Matsumoto and Ikegaya [1] showed that, depending on the amplitude and the frequency of a periodic applied current, the membrane potential responds periodically; (iii) Izhikevich [12] stated that brief pulses of positive and negative current with different amplitudes and frequencies can lead to a periodic response of the membrane potential. In chapter 1 the Hodgkin-Huxley model is introduced following Izhikevich [12]. Besides the definition of the model, several biological and physiological notes are made, and further concepts are described by examples. Moreover, the numerical methods used to solve the equations of the Hodgkin-Huxley model in the computer simulations of chapters 2 and 3 are presented. In chapter 2 the statements for the three different inputs (i), (ii) and (iii) are verified, and periodic behavior for inputs (ii) and (iii) is investigated. In chapter 3 the inputs are embedded in an Ornstein-Uhlenbeck process to study the influence of noise on the results of chapter 2.
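As a minimal illustration of input type (i), the four Hodgkin-Huxley equations can be integrated by forward Euler under a constant applied current. The rate functions and conductances below are the standard squid-axon values (modern convention, resting potential near -65 mV); the current amplitude, time step, and spike-counting threshold are illustrative choices, not the ones used in the thesis:

```python
import numpy as np

# Standard Hodgkin-Huxley rate functions (voltages in mV, times in ms)
def a_n(V): return 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
def b_n(V): return 0.125 * np.exp(-(V + 65.0) / 80.0)
def a_m(V): return 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
def b_m(V): return 4.0 * np.exp(-(V + 65.0) / 18.0)
def a_h(V): return 0.07 * np.exp(-(V + 65.0) / 20.0)
def b_h(V): return 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))

C, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3   # uF/cm^2 and mS/cm^2
ENa, EK, EL = 50.0, -77.0, -54.387       # reversal potentials, mV
I_ext = 15.0                             # constant input well inside the
                                         # repetitive-spiking interval (case (i))
dt, T = 0.01, 100.0                      # ms
V = -65.0
# Initialize the gating variables at their steady state for the resting potential
n = a_n(V) / (a_n(V) + b_n(V))
m = a_m(V) / (a_m(V) + b_m(V))
h = a_h(V) / (a_h(V) + b_h(V))

steps = int(T / dt)
V_trace = np.empty(steps)
for k in range(steps):
    I_ion = gNa * m**3 * h * (V - ENa) + gK * n**4 * (V - EK) + gL * (V - EL)
    V += dt * (I_ext - I_ion) / C                 # membrane equation
    n += dt * (a_n(V) * (1.0 - n) - b_n(V) * n)   # K+ activation
    m += dt * (a_m(V) * (1.0 - m) - b_m(V) * m)   # Na+ activation
    h += dt * (a_h(V) * (1.0 - h) - b_h(V) * h)   # Na+ inactivation
    V_trace[k] = V

# Count spikes as upward crossings of 0 mV
n_spikes = int(np.sum((V_trace[:-1] < 0.0) & (V_trace[1:] >= 0.0)))
```

Forward Euler with a small step is enough for a qualitative picture of repetitive spiking; a production simulation would typically use a higher-order or adaptive scheme.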
Abstract:
The 1-D spin-1/2 XXZ model with a staggered external magnetic field, when restricted to low field, can be mapped onto the quantum sine-Gordon model through bosonization: this ensures the presence of soliton, antisoliton and breather excitations in it. In particular, the action of the staggered field opens a gap, so that these physical objects are stable against energetic fluctuations. In the present work, this model is studied both analytically and numerically. On the one hand, analytical calculations are made to solve the model exactly through the Bethe ansatz: the solution for the XX + h staggered model is found first by means of the Jordan-Wigner transformation and then through the Bethe ansatz; after this stage, efforts are made to extend the latter approach to the XXZ + h staggered model (without finding its exact solution). On the other hand, the energies of the elementary soliton excitations are pinpointed through static DMRG (Density Matrix Renormalization Group) for different values of the parameters in the Hamiltonian. Breathers are found to exist in the antiferromagnetic region only, while solitons and antisolitons are present in both the ferromagnetic and antiferromagnetic regions. Their single-site z-magnetization expectation values are also computed to see how they appear in real space, and time-dependent DMRG is employed to perform quenches on the Hamiltonian parameters and monitor their time evolution. The results obtained reveal the quantum nature of these objects and provide some information about their features. Further studies and a better understanding of their properties could lead to the realization of a two-level state through a soliton-antisoliton pair, in order to implement a qubit.
Abstract:
Despite the widespread popularity of linear models for correlated outcomes (e.g. linear mixed models and time series models), distribution diagnostic methodology remains relatively underdeveloped in this context. In this paper we present an easy-to-implement approach that lends itself to graphical displays of model fit. Our approach involves multiplying the estimated marginal residual vector by the Cholesky decomposition of the inverse of the estimated marginal variance matrix. Linear functions of the resulting "rotated" residuals are used to construct an empirical cumulative distribution function (ECDF), whose stochastic limit is characterized. We describe a resampling technique that serves as a computationally efficient parametric bootstrap for generating representatives of the stochastic limit of the ECDF. Through functionals, such representatives are used to construct global tests for the hypothesis of normal marginal errors. In addition, we demonstrate that the ECDF of the predicted random effects, as described by Lange and Ryan (1989), can be formulated as a special case of our approach. Thus, our method supports both omnibus and directed tests. Our method works well in a variety of circumstances, including models having independent units of sampling (clustered data) and models for which all observations are correlated (e.g., a single time series).
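The rotation step can be sketched numerically. If V̂ is the estimated marginal covariance with Cholesky factor L (V̂ = LL^T), then multiplying the marginal residuals by L^{-1}, a Cholesky-type factor of V̂^{-1}, yields residuals with identity covariance, whose ECDF can be compared with the standard normal CDF. The clustered random-intercept covariance, the variance components, and the simulation of residuals below are all illustrative assumptions standing in for a fitted model:

```python
import numpy as np
from math import erf

rng = np.random.default_rng(42)

# Hypothetical clustered-data setting: 200 clusters of size 5 with a
# random-intercept (compound-symmetry) marginal covariance block.
m_clusters, n_i = 200, 5
tau2, sigma2 = 1.0, 1.0                      # illustrative variance components
Vi = sigma2 * np.eye(n_i) + tau2 * np.ones((n_i, n_i))
Li = np.linalg.cholesky(Vi)                  # Vi = Li @ Li.T

rotated = []
for _ in range(m_clusters):
    e_i = Li @ rng.standard_normal(n_i)      # stands in for y_i - X_i @ beta_hat
    # Multiplying by Li^{-1} whitens e_i: cov(Li^{-1} e_i) = I
    rotated.append(np.linalg.solve(Li, e_i))
r = np.concatenate(rotated)

# ECDF of the rotated residuals vs. the standard normal CDF
xs = np.sort(r)
ecdf = np.arange(1, r.size + 1) / r.size
Phi = np.array([0.5 * (1.0 + erf(v / np.sqrt(2.0))) for v in xs])
D = float(np.max(np.abs(ecdf - Phi)))        # Kolmogorov-Smirnov-type distance
```

In the paper's setting the reference distribution of such a statistic is obtained by the parametric bootstrap rather than a textbook KS table, since the variance parameters are estimated.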