963 results for continuous-time models
Abstract:
A compositional time series is obtained when a compositional data vector is observed at different points in time. Inherently, then, a compositional time series is a multivariate time series with important constraints on the variables observed at any instance in time. Although this type of data frequently occurs in situations of real practical interest, a trawl through the statistical literature reveals that research in the field is very much in its infancy and that many theoretical and empirical issues still remain to be addressed. Any appropriate statistical methodology for the analysis of compositional time series must take into account the constraints which are not allowed for by the usual statistical techniques available for analysing multivariate time series. One general approach to analysing compositional time series consists in the application of an initial transform to break the positive and unit sum constraints, followed by the analysis of the transformed time series using multivariate ARIMA models. In this paper we discuss the use of the additive log-ratio, centred log-ratio and isometric log-ratio transforms. We also present results from an empirical study designed to explore how the selection of the initial transform affects subsequent multivariate ARIMA modelling as well as the quality of the forecasts.
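As a concrete illustration of the three transforms named above, here is a minimal NumPy sketch (not code from the paper; the Helmert-type basis used for the ILR is one of several possible orthonormal bases, and the example composition is made up):

```python
# Illustrative sketch of the additive, centred and isometric log-ratio
# transforms for a single composition (parts positive, summing to 1).
import numpy as np

def alr(x):
    """Additive log-ratio: log of each part relative to the last part."""
    return np.log(x[:-1] / x[-1])

def clr(x):
    """Centred log-ratio: log of each part relative to the geometric mean."""
    g = np.exp(np.mean(np.log(x)))
    return np.log(x / g)

def ilr(x):
    """Isometric log-ratio via a Helmert-type orthonormal basis."""
    D = len(x)
    y = np.log(x)
    z = np.empty(D - 1)
    for i in range(1, D):
        # balance between the first i parts and part i+1
        z[i - 1] = np.sqrt(i / (i + 1.0)) * (np.mean(y[:i]) - y[i])
    return z

x = np.array([0.5, 0.3, 0.2])   # a hypothetical 3-part composition
print(alr(x), clr(x), ilr(x))   # each result is unconstrained in R^(D-1) or R^D
```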
Abstract:
The composition of the labour force is an important economic factor for a country. Often the changes in proportions of different groups are of interest. In this paper we study a monthly compositional time series from the Swedish Labour Force Survey from 1994 to 2005. Three models are studied: the ILR-transformed series, the ILR transformation of the compositionally differenced series of order 1, and the ILR transformation of the compositionally differenced series of order 12. For each of the three models a VAR model is fitted based on the data from 1994-2003. We predict the time series 15 steps ahead and calculate 95% prediction regions. The predictions of the three models are compared with actual values using MAD and MSE, and the prediction regions are compared graphically in a ternary time series plot. We conclude that the first, and simplest, model possesses the best predictive power of the three.
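A hedged sketch of the first model above, with simulated shares standing in for the Labour Force Survey data (which are not reproduced here): ILR-transform the composition, fit a VAR with statsmodels, and forecast 15 steps ahead.

```python
# Sketch only: Dirichlet draws play the role of monthly labour-force shares.
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
comp = rng.dirichlet([5.0, 3.0, 2.0], size=120)   # 120 months, 3 shares

def ilr(x):
    """ILR with a Helmert-type basis; rows are time points."""
    y = np.log(x)
    d = x.shape[-1]
    i = np.arange(1, d)
    run_mean = np.cumsum(y, axis=-1)[..., :-1] / i   # mean of first i log-parts
    return np.sqrt(i / (i + 1.0)) * (run_mean - y[..., 1:])

z = ilr(comp)                      # (120, 2) unconstrained multivariate series
res = VAR(z).fit(2)                # fixed lag order, purely for the sketch
fc = res.forecast(z[-res.k_ar:], steps=15)   # 15 steps ahead
print(fc.shape)                    # (15, 2); invert the ILR to map back to shares
```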
Abstract:
BACKGROUND Only multifaceted, hospital-wide interventions have been successful in achieving sustained improvements in hand hygiene (HH) compliance. METHODOLOGY/PRINCIPAL FINDINGS Pre-post intervention study of HH performance at baseline (October 2007-December 2009) and during an intervention that included two phases. Phase 1 (2010) included the multimodal WHO approach. Phase 2 (2011) added Continuous Quality Improvement (CQI) tools and was based on: a) an increase in alcohol hand rub (AHR) dispenser placement (from 0.57 to 1.56 dispensers/bed); b) an increase in the frequency of audits (three days every three weeks: the "3/3 strategy"); c) implementation of a standardized register form of HH corrective actions; d) Statistical Process Control (SPC) as the time series analysis methodology, through appropriate control charts. During the intervention period we performed 819 scheduled direct observation audits, which provided data from 11,714 HH opportunities. The most remarkable findings were: a) significant improvements in HH compliance with respect to baseline (25% mean increase); b) a sustained high level (82%) of HH compliance during the intervention; c) a significant increase in AHR consumption over time; d) a significant decrease in the rate of healthcare-acquired MRSA; e) small but significant improvements in HH compliance when comparing phase 2 to phase 1 [79.5% (95% CI: 78.2-80.7) vs 84.6% (95% CI: 83.8-85.4), p<0.05]; f) successful use of control charts to identify significant negative and positive deviations (special causes) related to the HH compliance process over time ("positive": 90.1% as the highest HH compliance, coinciding with World Hand Hygiene Day; "negative": 73.7% as the lowest HH compliance, coinciding with a statutory lay-off proceeding). CONCLUSIONS/SIGNIFICANCE CQI tools may be a key addition to the WHO strategy to maintain good HH performance over time. In addition, SPC has been shown to be a powerful methodology for detecting special causes in HH performance (positive and negative) and for helping to establish adequate feedback to healthcare workers.
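As an illustration of the SPC component, a minimal p-chart sketch in Python (the audit counts are simulated, not the study's data): each audit period's compliance proportion is compared against 3-sigma binomial limits around the pooled mean, and points outside the limits flag special causes.

```python
# p-chart for hand hygiene compliance; simulated audit data.
import numpy as np

rng = np.random.default_rng(1)
n = np.full(24, 60)                        # HH opportunities per audit period
x = rng.binomial(n, 0.82)                  # observed compliant opportunities
p = x / n                                  # per-period compliance proportion

p_bar = x.sum() / n.sum()                  # centre line (pooled compliance)
sigma = np.sqrt(p_bar * (1 - p_bar) / n)   # per-period binomial standard error
ucl, lcl = p_bar + 3 * sigma, p_bar - 3 * sigma

for t, (pt, lo, hi) in enumerate(zip(p, lcl, ucl)):
    flag = "special cause" if (pt > hi or pt < lo) else ""
    print(f"period {t:2d}: p={pt:.3f}  limits=({lo:.3f}, {hi:.3f}) {flag}")
```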
Abstract:
BACKGROUND: The ideal local anesthetic regime for femoral nerve block that balances analgesia with mobility after total knee arthroplasty (TKA) remains undefined. QUESTIONS/PURPOSES: We compared two volumes and concentrations of a fixed dose of ropivacaine for continuous femoral nerve block after TKA to a single-injection femoral nerve block with ropivacaine to determine (1) time to discharge readiness; (2) early pain scores and analgesic consumption; and (3) functional outcomes, including range of motion and WOMAC scores at the time of recovery. METHODS: Ninety-nine patients were allocated to one of three continuous femoral nerve block groups for this randomized, placebo-controlled, double-blind trial: a high concentration group (ropivacaine 0.2% infusion), a low concentration group (ropivacaine 0.1% infusion), or a placebo infusion group (saline 0.9% infusion). Infusions were discontinued on postoperative day (POD) 2. The primary outcome was time to discharge readiness. Secondary outcomes included opioid consumption, pain, and functional outcomes. Ninety-three patients completed the study protocol; the study was halted early because of unanticipated changes to pain protocols at the host institution, by which time only 61% of the required number of patients had been enrolled. RESULTS: With the numbers available, the mean time to discharge readiness was not different between groups (high concentration group, 62 hours [95% confidence interval (CI), 51-72 hours]; low concentration group, 73 hours [95% CI, 63-83 hours]; placebo infusion group, 65 hours [95% CI, 56-75 hours]; p = 0.27). Patients in the low concentration group consumed significantly less morphine during the period of infusion (POD 1: high concentration group, 56 mg [95% CI, 42-70 mg]; low concentration group, 35 mg [95% CI, 27-43 mg]; placebo infusion group, 48 mg [95% CI, 38-59 mg], p = 0.02; POD 2: high concentration group, 50 mg [95% CI, 41-60 mg]; low concentration group, 33 mg [95% CI, 24-42 mg]; placebo infusion group, 39 mg [95% CI, 30-48 mg], p = 0.04); however, there were no important differences in pain scores or opioid-related side effects with the numbers available. Likewise, there were no important differences in functional outcomes between groups. CONCLUSIONS: Based on this study, which was terminated prematurely before the desired sample size could be achieved, we were unable to demonstrate that varying the concentration and volume of a fixed-dose ropivacaine infusion for continuous femoral nerve block influences time to discharge readiness when compared with a conventional single-injection femoral nerve block after TKA. A low-concentration ropivacaine infusion can reduce postoperative opioid consumption, but without any important differences in pain scores, side effects, or functional outcomes. These pilot data may be used to inform the statistical power of future randomized trials. LEVEL OF EVIDENCE: Level II, therapeutic study. See Guidelines for Authors for a complete description of levels of evidence.
Abstract:
Viruses rapidly evolve, and HIV in particular is known to be one of the fastest evolving human viruses. It is now commonly accepted that viral evolution is the cause of the intriguing dynamics exhibited during HIV infections and the ultimate success of the virus in its struggle with the immune system. To study viral evolution, we use a simple mathematical model of the within-host dynamics of HIV which incorporates random mutations. In this model, we assume a continuous distribution of viral strains in a one-dimensional phenotype space where random mutations are modelled by diffusion. Numerical simulations show that random mutations combined with competition result in evolution towards higher Darwinian fitness: a stable traveling wave of evolution, moving towards higher levels of fitness, is formed in the phenotype space.
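A rough numerical sketch of this kind of model (my own simplification, not the paper's equations): strains at phenotype x replicate at a fitness-dependent rate, compete through the population mean fitness, and mutate by diffusion; the distribution forms a pulse that drifts towards higher fitness.

```python
# Explicit finite-difference simulation of u_t = D u_xx + u (f(x) - phi(t)),
# where phi is the mean fitness (keeps total population roughly constant).
import numpy as np

L, N, D, dt = 10.0, 400, 1e-3, 0.01
x = np.linspace(0, L, N); dx = x[1] - x[0]
f = x / L                                   # assumed fitness: grows with x
u = np.exp(-((x - 1.0) ** 2) / 0.05)        # initial strain distribution
u /= u.sum() * dx                           # normalize to unit mass

for _ in range(20000):
    lap = (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx ** 2
    phi = np.sum(f * u) * dx                # mean fitness (competition term)
    u += dt * (D * lap + u * (f - phi))     # diffusion + selection
    u = np.clip(u, 0, None)

mean_phenotype = np.sum(x * u) * dx / (np.sum(u) * dx)
print("mean phenotype after evolution:", mean_phenotype)  # moves to higher x
```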
Abstract:
We analyse in a unified way how the presence of a trader with privileged information makes the market efficient when the release time of the information is known. We establish a general relation between the problem of finding an equilibrium and the problem of enlargement of filtrations. We also consider the case where the time of the announcement is random. In that case the market is not fully efficient, and an equilibrium exists if the sensitivity of prices with respect to the global demand is decreasing in time in accordance with the distribution of the random time.
Abstract:
BACKGROUND Several lines of evidence indicate that the gut microbiota is involved in the control of host energy metabolism. OBJECTIVE To evaluate the differences in the composition of the gut microbiota in rat models under different nutritional status and physical activity, and to identify their associations with serum leptin and ghrelin levels. METHODS In a case-control study, forty male rats were randomly assigned to one of four experimental groups: an ABA group with food restriction and free access to exercise; a control ABA group with food restriction and no access to exercise; an exercise group with free access to exercise and fed ad libitum; and an ad libitum group without access to exercise and fed ad libitum. The fecal bacterial composition was investigated by PCR-denaturing gradient gel electrophoresis and real-time qPCR. RESULTS In restricted eaters, we found a significant increase in the numbers of Proteobacteria, Bacteroides, Clostridium, Enterococcus, Prevotella and M. smithii and a significant decrease in the quantities of Actinobacteria, Firmicutes, Bacteroidetes, the B. coccoides-E. rectale group, Lactobacillus and Bifidobacterium with respect to unrestricted eaters. Moreover, a significant increase in the numbers of Lactobacillus, Bifidobacterium and the B. coccoides-E. rectale group was observed in the exercise group with respect to the rest of the groups. We also found a significant positive correlation between the quantities of Bifidobacterium and Lactobacillus and serum leptin levels, and a significant negative correlation between the numbers of Clostridium, Bacteroides and Prevotella and serum leptin levels in all experimental groups. Furthermore, serum ghrelin levels were negatively correlated with the quantities of Bifidobacterium, Lactobacillus and the B. coccoides-Eubacterium rectale group and positively correlated with the numbers of Bacteroides and Prevotella. CONCLUSIONS Nutritional status and physical activity alter gut microbiota composition, affecting its diversity and similarity. This study highlights the associations between the gut microbiota and the appetite-regulating hormones that may be important in terms of satiety and host metabolism.
Abstract:
Quantitative or algorithmic trading is the automation of investment decisions obeying a fixed or dynamic set of rules to determine trading orders. It has increasingly made its way up to 70% of the trading volume of some of the biggest financial markets, such as the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, due to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject on which publications are scarce and infrequent. We review the basic and fundamental mathematical concepts needed for modelling financial markets, such as stochastic processes, stochastic integration and basic models for price and spread dynamics, necessary for building quantitative strategies. We also contrast these models with real market data sampled at one-minute frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behavior: trend following or mean reversion. The former is grouped in the so-called technical models and the latter in so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast as a well-defined scientific predictor if the signal they generate passes the test of being a Markov time. That is, we can tell whether the signal has occurred or not by examining the information up to the current time; or, more technically, if the event is F_t-measurable. On the other hand, the concept of pairs trading, or market-neutral strategy, is fairly simple. However, it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, to a co-integration framework, to models involving stochastic differential equations such as the well-known mean-reverting Ornstein-Uhlenbeck equation and its variations. A model for forecasting any economic or financial magnitude could be properly defined with scientific rigor but could also lack any economic value and be considered useless from a practical point of view. This is why this project could not be complete without a backtesting of the mentioned strategies. Conducting a useful and realistic backtesting is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving in time. This is the reason why we put emphasis on the calibration of the strategies' parameters to adapt to the given market conditions. We find that the parameters of technical models are more volatile than their counterparts from market-neutral strategies, and calibration must be done at high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtesting with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numeric computations and graphics used and shown in this project were implemented in MATLAB from scratch as a part of this thesis. No other mathematical or statistical software was used.
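A hedged sketch of the mean-reversion idea behind pairs trading (Python rather than the thesis's MATLAB, and a simulated spread rather than DJIA quotes): model the spread as an Ornstein-Uhlenbeck process, estimate its parameters by OLS on the discretized dynamics, and trade on the standardized deviation from the long-run mean.

```python
# Simulate an OU spread, recover (kappa, mu, sigma) by regression, build a signal.
import numpy as np

rng = np.random.default_rng(2)
kappa, mu, sigma, dt = 2.0, 0.0, 0.3, 1 / 252
s = np.zeros(2000)
for t in range(1999):
    # Euler step of dS = kappa (mu - S) dt + sigma dW
    s[t + 1] = s[t] + kappa * (mu - s[t]) * dt + sigma * np.sqrt(dt) * rng.standard_normal()

# OLS on s[t+1] = a + b s[t] + eps recovers the OU parameters
b, a = np.polyfit(s[:-1], s[1:], 1)
kappa_hat = (1 - b) / dt
mu_hat = a / (1 - b)
resid = s[1:] - (a + b * s[:-1])
sigma_hat = resid.std() / np.sqrt(dt)

# Standardize by the stationary std sigma / sqrt(2 kappa); trade the extremes
z = (s - mu_hat) / (sigma_hat / np.sqrt(2 * kappa_hat))
signal = np.where(z > 1, -1, np.where(z < -1, 1, 0))   # short rich, long cheap
print(kappa_hat, mu_hat, sigma_hat, signal[-5:])
```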
Abstract:
This paper studies the limits of discrete-time repeated games with public monitoring. We solve and characterize the Abreu, Milgrom and Pearce (1991) problem. We find that in the "bad" ("good") news model the lower (higher) magnitude events suggest cooperation, i.e., zero punishment probability, while the higher (lower) magnitude events suggest defection, i.e., punishment with probability one. Public correlation is used to connect these two sets of signals and to make the enforceability constraint bind. The dynamic and limit behavior of the punishment probabilities for variations in ... (the discount rate) and ... (the time interval) is characterized, as well as the limit payoffs for all these scenarios (we also introduce uncertainty in the time domain). The obtained ... limits are, to the best of my knowledge, new. The obtained ... limits coincide with Fudenberg and Levine (2007) and Fudenberg and Olszewski (2011), with the exception that we clearly state the precise informational conditions that cause the limit to converge from above, to converge from below, or to degenerate. JEL: C73, D82, D86. KEYWORDS: Repeated Games, Frequent Monitoring, Random Public Monitoring, Moral Hazard, Stochastic Processes.
Abstract:
The administration of selective serotonin reuptake inhibitors (SSRIs), typically used as antidepressants, increases alcohol consumption after an alcohol deprivation period in rats. However, the appearance of this effect after treatment with selective noradrenaline reuptake inhibitors (SNRIs) has not been studied. In the present work we examined the effects of a 15-d treatment with the SNRI atomoxetine (1, 3 and 10 mg/kg, i.p.) in male rats trained to drink alcohol solutions in a 4-bottle choice test. Treatment with atomoxetine (10 mg/kg, i.p.) during an alcohol deprivation period increased alcohol consumption after relapse. This effect lasted only one week, disappearing thereafter. Treatment with atomoxetine did not cause a sensitized behavioral response to a challenge dose of amphetamine (1.5 mg/kg, i.p.), indicating the absence of supersensitive dopaminergic transmission. This effect is markedly different from that of SSRI antidepressants, which produced both long-lasting increases in alcohol consumption and behavioral sensitization. Clinical implications are discussed.
Abstract:
Genes affect not only the behavior and fitness of their carriers but also those of other individuals. According to Hamilton's rule, whether a mutant gene will spread in the gene pool depends on the effects of its carrier on the fitness of all individuals in the population, each weighted by its relatedness to the carrier. However, social behaviors may affect not only recipients living in the generation of the actor but also individuals living in subsequent generations. In this note, I evaluate space-time relatedness coefficients for localized dispersal. These relatedness coefficients weight the selection pressures on long-lasting behaviors, which stem from a multigenerational gap between phenotypic expression by actors and the resulting environmental feedback on the fitness of recipients. Explicit values of the space-time relatedness coefficients reveal that they can be surprisingly large for typical dispersal rates, even for hundreds of generations in the future.
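For reference, the classical textbook form of Hamilton's rule (the standard statement, not the paper's space-time extension, which generalizes the relatedness coefficients across generations):

```latex
% Hamilton's rule: a costly social act is favoured by selection when the
% relatedness-weighted benefits to recipients exceed the cost to the actor.
\[
  \sum_{i} r_i \, b_i \;-\; c \;>\; 0
\]
% r_i: relatedness of recipient i to the actor; b_i: fitness effect on
% recipient i; c: fitness cost to the actor. The paper's contribution is
% to evaluate the r_i for recipients living many generations after the actor.
```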
Abstract:
Piecewise linear systems arise as mathematical models in many practical applications, often from the linearization of nonlinear systems. There are two main approaches to dealing with these systems, according to their continuous-time or discrete-time aspects. We propose an approach based on state transformation, more particularly the partition of the phase portrait into different regions, where each subregion is modelled as a two-dimensional linear time-invariant system. Then the Takagi-Sugeno model, which is a combination of the local models, is calculated. The simulation results show that the Alpha partition is well suited for dealing with such a system.
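A minimal sketch of the Takagi-Sugeno idea described above: the dynamics are a membership-weighted blend of local linear time-invariant models. The matrices, membership functions and partition below are illustrative assumptions, not the paper's Alpha partition.

```python
# Two-rule Takagi-Sugeno model: blend two local 2-D LTI models with
# normalized membership weights that depend on the first state.
import numpy as np

A1 = np.array([[0.0, 1.0], [-1.0, -0.5]])   # local model, region 1
A2 = np.array([[0.0, 1.0], [-4.0, -0.2]])   # local model, region 2

def memberships(x1, lo=-1.0, hi=1.0):
    """Simple ramp weights on the first state; they sum to 1."""
    w1 = np.clip((hi - x1) / (hi - lo), 0.0, 1.0)
    return np.array([w1, 1.0 - w1])

def ts_step(x, dt=0.01):
    h = memberships(x[0])
    xdot = h[0] * A1 @ x + h[1] * A2 @ x     # fuzzy blend of local dynamics
    return x + dt * xdot                     # explicit Euler step

x = np.array([0.8, 0.0])
for _ in range(500):
    x = ts_step(x)
print("state after 5 s:", x)
```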
Abstract:
Sackung is a widespread post-glacial morphological feature affecting Alpine mountains and creating a characteristic geomorphological expression that can be detected from topography. Over long-term evolution, internal deformation can lead to the formation of rapidly moving phenomena such as a rock slide or rock avalanche. In this study, a detailed description of the Sierre rock avalanche (SW Switzerland) is presented. This convex-shaped postglacial instability is one of the largest rock avalanches in the Alps, involving more than 1.5 billion m3 with a run-out distance of about 14 km and an extremely low Fahrböschung angle. This study presents comprehensive analyses of the structural and geological characteristics leading to the development of the Sierre rock avalanche. In particular, by combining field observations, digital elevation model analyses and numerical modelling, the strong influence of both ductile and brittle tectonic structures on the failure mechanism and on the failure surface geometry is highlighted. The detection of pre-failure deformation indicates that the development of the rock avalanche corresponds to the last evolutionary stage of a pre-existing deep-seated gravitational slope instability. These analyses, accompanied by the dating and characterization of the rock avalanche deposits, allow the proposal of a destabilization model that clarifies the different phases leading to the development of the Sierre rock avalanche.
Abstract:
We present a detailed analytical and numerical study of the avalanche distributions of the continuous damage fiber bundle model (CDFBM). Linearly elastic fibers undergo a series of partial failure events which give rise to a gradual degradation of their stiffness. We show that the model reproduces a wide range of mechanical behaviors. We find that macroscopic hardening and plastic responses are characterized by avalanche distributions which exhibit an algebraic decay with exponents between 5/2 and 2, different from those observed in mean-field fiber bundle models. We also derive analytically the phase diagram of a family of CDFBMs which covers a large variety of potential avalanche size distributions. Our results provide a unified view of the statistics of breaking avalanches in fiber bundle models.
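In the notation commonly used for fiber bundle avalanches (a standard rendering of what "algebraic decay" means here, not an equation quoted from the paper), the reported size distributions take the form

```latex
% Avalanche size distribution with algebraic (power-law) decay:
% \Delta is the avalanche size, \xi the decay exponent, reported above
% to lie between 2 and 5/2 depending on the regime.
\[
  P(\Delta) \sim \Delta^{-\xi}, \qquad 2 \le \xi \le \tfrac{5}{2}.
\]
```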
Abstract:
BACKGROUND: Good adherence to antiretroviral therapy (ART) is critical for successful HIV treatment. However, some patients remain virologically suppressed despite suboptimal adherence. We hypothesized that this could result from host genetic factors influencing drug levels. METHODS: Eligible individuals were Caucasians treated with efavirenz (EFV) and/or boosted lopinavir (LPV/r) with self-reported poor adherence, defined as missing doses of ART at least weekly for more than 6 months. Participants were genotyped for single nucleotide polymorphisms (SNPs) in candidate genes previously reported to decrease EFV (rs3745274, rs35303484, rs35979566 in CYP2B6) and LPV/r clearance (rs4149056 in SLCO1B1, rs6945984 in CYP3A, rs717620 in ABCC2). Viral suppression was defined as having HIV-1 RNA <400 copies/ml throughout the study period. RESULTS: From January 2003 until May 2009, 37 individuals on EFV (28 suppressed and 9 not suppressed) and 69 on LPV/r (38 suppressed and 31 not suppressed) were eligible. The poor adherence period was a median of 32 weeks, with 18.9% of EFV and 20.3% of LPV/r patients reporting missed doses on a daily basis. The tested SNPs were not determinants of viral suppression. Reporting missing >1 dose/week was associated with a lower probability of viral suppression compared to missing 1 dose/week (EFV: odds ratio (OR) 0.11, 95% confidence interval (CI): 0.01-0.99; LPV/r: OR 0.29, 95% CI: 0.09-0.94). In both groups, the probability of remaining suppressed increased with the duration of continuous suppression prior to the poor adherence period (EFV: OR 3.40, 95% CI: 0.62-18.75; LPV/r: OR 5.65, 95% CI: 1.82-17.56). CONCLUSIONS: The investigated genetic variants did not play a significant role in the sustained viral suppression of individuals with suboptimal adherence. Risk of failure decreased with longer duration of viral suppression in this population.