890 results for Discrete-time sliding mode control


Relevance: 100.00%

Publisher:

Abstract:

Executive Summary The unifying theme of this thesis is the pursuit of satisfactory ways to quantify the risk-reward trade-off in financial economics: first in the context of a general asset pricing model, then across models, and finally across country borders. The guiding principle in that pursuit was to seek innovative solutions by combining ideas from different fields in economics and broader scientific research. For example, in the first part of this thesis we sought a fruitful application of strong existence results in utility theory to topics in asset pricing. In the second part we implement an idea from the field of fuzzy set theory in the optimal portfolio selection problem, while the third part of this thesis is, to the best of our knowledge, the first empirical application of some general results on asset pricing in incomplete markets to the important topic of measuring financial integration. While the first two parts of this thesis effectively combine well-known ways to quantify risk-reward trade-offs, the third can be viewed as an empirical verification of the usefulness of the so-called "good deal bounds" theory in designing risk-sensitive pricing bounds. Chapter 1 develops a discrete-time asset pricing model based on a novel ordinally equivalent representation of recursive utility. To the best of our knowledge, we are the first to use a member of a novel class of recursive utility generators to construct a representative-agent model that addresses some long-standing issues in asset pricing. Applying strong representation results allows us to show that the model features countercyclical risk premia, for both consumption and financial risk, together with a low and procyclical risk-free rate. As the recursive utility used nests the well-known time-state separable utility as a special case, all results nest the corresponding ones from the standard model and thus shed light on its well-known shortcomings.
The empirical investigation undertaken to support these theoretical results, however, showed that as long as one resorts to econometric methods based on approximating conditional moments with unconditional ones, it is not possible to distinguish the model we propose from the standard one. Chapter 2 is joint work with Sergei Sontchik. There we provide theoretical and empirical motivation for the aggregation of performance measures. The main idea is that, just as it makes sense to apply several performance measures ex post, it also makes sense to base optimal portfolio selection on ex-ante maximization of as many performance measures as desired. We thus offer a concrete algorithm for optimal portfolio selection via ex-ante optimization, over different horizons, of several risk-return trade-offs simultaneously. An empirical application of that algorithm, using seven popular performance measures, suggests that realized returns feature better distributional characteristics than the realized returns of portfolio strategies optimal with respect to a single performance measure. When comparing the distributions of realized returns we used two partial risk-reward orderings: first- and second-order stochastic dominance. We first used the Kolmogorov-Smirnov test to determine whether the two distributions are indeed different, which, combined with a visual inspection, allowed us to demonstrate that the proposed way of aggregating performance measures leads to portfolio realized returns that first-order stochastically dominate the ones that result from optimization with respect to only a single measure such as the Treynor ratio or Jensen's alpha. We checked for second-order stochastic dominance via pointwise comparison of the so-called absolute Lorenz curve, i.e., the sequence of expected shortfalls over a range of quantiles.
Since the plot of the absolute Lorenz curve for the aggregated performance measures lay above the one corresponding to each individual measure, we were led to conclude that the algorithm we propose yields a portfolio returns distribution that second-order stochastically dominates those obtained under virtually all individual performance measures considered. Chapter 3 proposes a measure of financial integration based on recent advances in asset pricing in incomplete markets. Given a base market (a set of traded assets) and an index of another market, we propose to measure financial integration through time by the size of the spread between the pricing bounds of the market index, relative to the base market. The bigger the spread around country index A, viewed from market B, the less integrated markets A and B are. We investigate the presence of structural breaks in the size of the spread for EMU member-country indices before and after the introduction of the Euro. We find evidence that both the level and the volatility of our financial integration measure increased after the introduction of the Euro. That counterintuitive result suggests an inherent weakness in any attempt to measure financial integration independently of economic fundamentals. Nevertheless, the results on the bounds of the risk-free rate appear plausible from the viewpoint of existing economic theory about the impact of integration on interest rates.
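The first-order stochastic dominance check described for Chapter 2 amounts to a pointwise comparison of empirical CDFs. A minimal, self-contained sketch (function names and toy samples are illustrative, not taken from the thesis):

```python
import bisect

def ecdf(sample):
    """Return a callable evaluating the empirical CDF of `sample`."""
    xs = sorted(sample)
    n = len(xs)
    # fraction of observations <= x
    return lambda x: bisect.bisect_right(xs, x) / n

def first_order_dominates(a, b):
    """Weak first-order stochastic dominance of sample `a` over `b`:
    the empirical CDF of `a` never lies above that of `b`."""
    fa, fb = ecdf(a), ecdf(b)
    return all(fa(x) <= fb(x) for x in sorted(set(a) | set(b)))
```

A second-order check would instead compare integrated CDFs, i.e., the absolute Lorenz curves (expected-shortfall sequences) mentioned above.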

Relevance: 100.00%

Publisher:

Abstract:

A simple model of the diffusion of innovations in a social network with upgrading costs is introduced. Agents are characterized by a single real variable, their technological level. Based on local information, agents decide whether or not to upgrade their level, balancing the potential benefit against the upgrading cost. A critical point at which technological avalanches display power-law behavior is also found. This critical point is characterized by a macroscopic observable that turns out to optimize technological growth in the stationary state. Analytical results supporting our findings are obtained for the globally coupled case.
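A minimal sketch of this kind of dynamics, assuming a ring network, synchronous updates, and a random-innovation rule; the exact specification in the paper may differ:

```python
import random

def step(levels, cost, rng):
    """One synchronous update of the hypothetical diffusion model: each
    agent adopts the best level among its two ring neighbours if the
    gain exceeds the upgrade cost; then one random agent innovates."""
    n = len(levels)
    new = levels[:]
    for i in range(n):
        best = max(levels[(i - 1) % n], levels[(i + 1) % n])
        if best - levels[i] > cost:
            new[i] = best              # upgrade: imitation pays off
    j = rng.randrange(n)
    new[j] += rng.random()             # spontaneous innovation, gain < 1
    return new
```

Avalanche sizes would then be measured as the number of agents upgrading per step; tuning `cost` plays the role of the control parameter near the critical point.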

Relevance: 100.00%

Publisher:

Abstract:

We consider a discrete-time, pure-exchange, infinite-horizon economy with two or more consumers and at least one consumption good per period. Within the framework of decentralized mechanisms, we show that for a given consumption trade at any period of time, say at time one, the consumers will in general need an infinite-dimensional (informational) space to identify such a trade as an intertemporal Walrasian one. However, we show and characterize a set of environments where the Walrasian trades at each period of time can be achieved as the equilibrium trades of a sequence of decentralized competitive mechanisms, using only current prices and quantities to coordinate decisions.

Relevance: 100.00%

Publisher:

Abstract:

OBJECTIVE: Hierarchical modeling has been proposed as a solution to the multiple exposure problem. We estimate associations between metabolic syndrome and different components of antiretroviral therapy using both conventional and hierarchical models. STUDY DESIGN AND SETTING: We use discrete-time survival analysis to estimate the association between metabolic syndrome and cumulative exposure to 16 antiretrovirals from four drug classes. We fit a hierarchical model in which the drug class provides a prior model of the association between metabolic syndrome and exposure to each antiretroviral. RESULTS: One thousand two hundred and eighteen patients were followed for a median of 27 months, with 242 cases of metabolic syndrome (20%) at a rate of 7.5 cases per 100 patient-years. Metabolic syndrome was more likely to develop in patients exposed to stavudine, but less likely to develop in those exposed to atazanavir. The estimate for exposure to atazanavir increased from a hazard ratio of 0.06 per 6 months' use in the conventional model to 0.37 in the hierarchical model (or from 0.57 to 0.81 when using spline-based covariate adjustment). CONCLUSION: These results are consistent with trials that show the disadvantage of stavudine and the advantage of atazanavir relative to other drugs in their respective classes. The hierarchical model gave more plausible results than the equivalent conventional model.
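Discrete-time survival analysis of the kind used here is commonly fit by expanding each patient's follow-up into one row per time interval and then modeling the per-interval event indicator with a (possibly hierarchical) logistic regression. A sketch of the expansion step only, with hypothetical field names:

```python
def person_period_rows(patient_id, n_periods, event):
    """Expand one subject into per-period rows for discrete-time survival:
    the outcome is 0 in every period survived and 1 in the final period
    iff the event occurred (otherwise the subject is censored there)."""
    rows = []
    for t in range(1, n_periods + 1):
        y = 1 if (event and t == n_periods) else 0
        rows.append({"id": patient_id, "period": t, "event": y})
    return rows
```

Time-varying covariates such as cumulative drug exposure would be attached to each per-period row before fitting the logistic hazard model.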

Relevance: 100.00%

Publisher:

Abstract:

Objective To evaluate the feasibility of a comprehensive, interdisciplinary adherence program aimed at HIV patients. Setting Two centers of the Swiss HIV Cohort Study: Lausanne and Basel. Method 6-month, pilot, quasi-experimental, 2-arm design (control and intervention). Patients starting a first or second combined antiretroviral therapy line were invited to participate in the study. Patients entering the intervention arm were offered a multifactorial intervention along with an electronic drug monitor (EDM). It consisted of a maximum of six 30-min sessions with the interventionist, coinciding with routine HIV check-ups. The sessions relied on individualized, semi-structured motivational interviews. Patients in the control arm used a blinded EDM directly and did not participate in motivational interviews. Main outcome measures Rate of patients' acceptance to take part in the HIV adherence program and rate of patients' retention in this program, assessed in both the intervention and control groups. Persistence, execution, and adherence. Results The study was feasible in one center but not in the other. Hence, the control group originally planned in Basel was recruited in Lausanne. The inclusion rate was 84% (n = 21) in the intervention group versus 52% (n = 11) in the control group (P = 0.027). The retention rate was 91% in the intervention group versus 82% in the control group (P = ns). Regarding adherence, execution was high in both groups (97 vs. 95%). Interestingly, the statistical model showed that adherence decreased more quickly in the control than in the intervention group (interaction group × time, P < 0.0001). Conclusion The difficulties encountered lie at the implementation level, i.e., at the level of the program and the health care system rather than the patient. Implementation needs to be evaluated further; to be feasible, a new adherence program needs to fit into the daily routine of the center and has to be supported by all trained healthcare providers.
However, this study shows that patients' adherence behavior evolved differently in the two groups: it decreased more quickly over time in the control than in the intervention group. RCTs are eventually needed to assess the clinical impact of such an adherence program and to verify whether skilled pharmacists can ensure continuity of care for HIV outpatients.

Relevance: 100.00%

Publisher:

Abstract:

MotionMaker (TM) is a stationary programmable test and training system for the lower limbs developed at the Ecole Polytechnique Federale de Lausanne with the Fondation Suisse pour les Cybertheses. The system is composed of two robotic orthoses comprising motors and sensors, and a control unit managing transcutaneous electrical muscle stimulation with real-time regulation. The control of the Functional Electrical Stimulation (FES)-induced muscle force necessary to mimic natural exercise is ensured by the control unit, which receives continuous input from the position and force sensors mounted on the robot. First results with control subjects showed the feasibility of creating movements by such closed-loop-controlled, FES-induced muscle contractions. To make exercising with the MotionMaker (TM) safe for clinical trials with Spinal Cord Injured (SCI) volunteers, several original safety features have been introduced. The MotionMaker (TM) is able to identify and manage the occurrence of spasms. Fatigue can also be detected, and overfatigue during exercise prevented.

Relevance: 100.00%

Publisher:

Abstract:

This paper is concerned with the derivation of new estimators and performance bounds for the problem of timing estimation of (linearly) digitally modulated signals. The conditional maximum likelihood (CML) method is adopted, in contrast to the classical low-SNR unconditional ML (UML) formulation that is systematically applied in the literature for the derivation of non-data-aided (NDA) timing-error detectors (TEDs). A new CML TED is derived and proved to be self-noise free, in contrast to the conventional low-SNR-UML TED. In addition, the paper provides a derivation of the conditional Cramér-Rao bound (CRB), which is higher (less optimistic) than the modified CRB (MCRB) [which is only reached by decision-directed (DD) methods]. It is shown that the CRB is a lower bound on the asymptotic statistical accuracy of the set of consistent estimators that are quadratic with respect to the received signal. Although the obtained bound is not general, it applies to most NDA synchronizers proposed in the literature. A closed-form expression of the conditional CRB is obtained, and numerical results confirm that the CML TED attains the new bound for moderate to high Eg/No.

Relevance: 100.00%

Publisher:

Abstract:

The Wigner higher-order moment spectra (WHOS) are defined as extensions of the Wigner-Ville distribution (WD) to higher-order moment spectra domains. A general class of time-frequency higher-order moment spectra is also defined in terms of arbitrary higher-order moments of the signal, as generalizations of Cohen's general class of time-frequency representations. The properties of the general class of time-frequency higher-order moment spectra can be related to the properties of WHOS, which are, in fact, extensions of the properties of the WD. Discrete time and frequency Wigner higher-order moment spectra (DTF-WHOS) distributions are introduced for signal processing applications and are shown to be implemented with two FFT-based algorithms. One application is presented in which the Wigner bispectrum (WB), a WHOS in the third-order moment domain, is utilized for the detection of transient signals embedded in noise. The WB is compared with the WD in terms of simulation examples and analysis of real sonar data. It is shown that better detection schemes can be derived, at low signal-to-noise ratio, when the WB is applied.
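For intuition on the second-order object that WHOS generalize, a discrete (pseudo) Wigner-Ville distribution can be computed as the DFT of the instantaneous autocorrelation; the paper's FFT-based algorithms accelerate exactly this kind of transform. A direct, naive O(N³) sketch (the normalization and frequency-doubling convention here are assumptions, and real implementations would use an FFT per time index):

```python
import cmath

def wigner_ville(x):
    """Naive discrete pseudo Wigner-Ville distribution: for each time n,
    the DFT over lag m of r_n[m] = x[n+m] * conj(x[n-m]); the conjugate
    symmetry of r_n makes each row real-valued."""
    N = len(x)
    W = []
    for n in range(N):
        L = min(n, N - 1 - n)          # largest symmetric lag at time n
        row = []
        for k in range(N):
            acc = 0j
            for m in range(-L, L + 1):
                acc += (x[n + m] * (x[n - m]).conjugate()
                        * cmath.exp(-2j * cmath.pi * 2 * m * k / N))
            row.append(acc)
        W.append(row)
    return W
```

The Wigner bispectrum used in the paper replaces the two-factor kernel with a third-order moment kernel, but the transform structure is the same.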

Relevance: 100.00%

Publisher:

Abstract:

The aim of this work was to select an appropriate digital filter for a servo application and to filter the noise from the measurement devices. A low-pass filter attenuates high-frequency noise beyond the specified cut-off frequency. Digital low-pass filters with both IIR and FIR responses were designed and experimentally compared to understand their characteristics from the corresponding step responses of the system. The Kaiser windowing and equiripple methods were selected for the FIR response, whereas the Butterworth, Chebyshev, inverse Chebyshev, and elliptic methods were designed for the IIR case. Limitations of digital filter design for a servo system were analysed, in particular the dynamic influence of each designed filter on the control stability of the electrical servo drive. The criteria for the selection of parameters in designing digital filters for servo systems were studied. Control system dynamics was given significant importance, and the use of FIR and IIR responses in different situations was compared to justify the selection of a suitable response in each case. The software used in the filter design was MATLAB/Simulink® and dSPACE's DSP application. A speed-controlled permanent magnet linear synchronous motor was used in the experimental work.
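For intuition on the FIR/IIR step-response trade-off discussed above, here is a minimal sketch contrasting a first-order IIR low-pass (a simple stand-in for the Butterworth family, not the filters actually designed in the work) with a moving-average FIR:

```python
def iir_lowpass(x, alpha):
    """First-order IIR low-pass (exponential smoothing):
    y[n] = alpha*x[n] + (1-alpha)*y[n-1], zero initial state."""
    y, prev = [], 0.0
    for v in x:
        prev = alpha * v + (1.0 - alpha) * prev
        y.append(prev)
    return y

def fir_moving_average(x, taps):
    """Simple FIR low-pass: length-`taps` moving average, zero pre-padding."""
    pad = [0.0] * (taps - 1) + list(x)
    return [sum(pad[i:i + taps]) / taps for i in range(len(x))]
```

The FIR settles in exactly `taps` samples and has linear phase, while the IIR only approaches the step asymptotically; this difference in transient behavior is what matters for the servo-loop stability comparison described.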

Relevance: 100.00%

Publisher:

Abstract:

In this paper, we consider a discrete-time risk process allowing for delay in claim settlement, which introduces a certain type of dependence in the process. From martingale theory, an expression for the ultimate ruin probability is obtained, and Lundberg-type inequalities are derived. The impact of delay in claim settlement is then investigated. To this end, a convex order comparison of the aggregate claim amounts is performed with the corresponding non-delayed risk model, and numerical simulations are carried out with Belgian market data.
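The ruin probabilities bounded analytically in the paper can also be estimated by simulation. A minimal Monte-Carlo sketch for a discrete-time surplus process without the claim-settlement delay studied there (the i.i.d. exponential claims and all parameter values below are hypothetical):

```python
import random

def ruin_probability(u, premium, claim_dist, horizon, n_paths, seed=0):
    """Monte-Carlo estimate of the finite-horizon ruin probability for the
    discrete-time surplus process U_t = u + premium*t - aggregate claims."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(n_paths):
        surplus = u
        for _ in range(horizon):
            surplus += premium - claim_dist(rng)
            if surplus < 0:            # ruin: surplus goes negative
                ruined += 1
                break
    return ruined / n_paths
```

A Lundberg-type bound of the form ψ(u) ≤ exp(-R·u) could then be checked against such estimates for increasing initial capital u.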

Relevance: 100.00%

Publisher:

Abstract:

The worldwide antibiotic crisis has led to renewed interest in phage therapy. Phages have controlled bacterial populations on Earth since time immemorial. Potent lytic phages against bacterial pathogens can be isolated from the environment, or selected from a collection, in a matter of days. In addition, phages have the capacity to rapidly overcome the bacterial resistances that will inevitably emerge. To maximally exploit these advantages phages have over conventional drugs such as antibiotics, it is important that sustainable phage products are not subjected to the conventional, lengthy medicinal product development and licensing pathway. There is a need for an adapted framework, including realistic production, quality, and safety requirements, that allows a timely supply of phage therapy products for 'personalized therapy' or for public health or medical emergencies. This paper enumerates all phage therapy product-related quality and safety risks known to the authors, as well as the tests that can be performed to minimize these risks, only to the extent needed to protect patients and to allow and advance responsible phage therapy and research.

Relevance: 100.00%

Publisher:

Abstract:

We study discrete-time models in which death benefits can depend on a stock price index, the logarithm of which is modeled as a random walk. Examples of such benefit payments include put and call options, barrier options, and lookback options. Because the distribution of the curtate-future-lifetime can be approximated by a linear combination of geometric distributions, it suffices to consider curtate-future-lifetimes with a geometric distribution. In binomial and trinomial tree models, closed-form expressions for the expectations of the discounted benefit payment are obtained for a series of options. They are based on results concerning geometric stopping of a random walk, in particular also on a version of the Wiener-Hopf factorization.
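The reduction to geometric curtate-future-lifetimes works because expectations such as E[v^K] have simple closed forms under a geometric law. A small numerical check of the identity E[v^K] = pv / (1 - (1-p)v) for K ~ Geometric(p) on {1, 2, ...} (a standard fact; the parameter values below are arbitrary):

```python
def expected_discount_geometric(p, v, n_terms=10_000):
    """E[v^K] for K ~ Geometric(p) on {1, 2, ...}, by truncated summation:
    sum over k of P(K = k) * v^k with P(K = k) = p * (1-p)^(k-1)."""
    return sum(p * (1 - p) ** (k - 1) * v ** k for k in range(1, n_terms + 1))

def expected_discount_closed_form(p, v):
    """Closed form of the geometric sum: E[v^K] = p*v / (1 - (1-p)*v)."""
    return p * v / (1 - (1 - p) * v)
```

In the paper this kind of geometric-stopping expectation is combined with tree-model payoffs (puts, calls, barrier and lookback options) rather than with the bare discount factor shown here.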

Relevance: 100.00%

Publisher:

Abstract:

In this paper we focus our attention on a particle that follows a unidirectional quantum walk, an alternative version of the currently widespread discrete-time quantum walk on a line. Here the walker at each time step can either remain in place or move in a fixed direction, e.g., rightward or upward. While both formulations are essentially equivalent, the present approach leads us to consider discrete Fourier transforms, which eventually yields explicit expressions for the wave functions in terms of finite sums and allows the use of efficient algorithms based on the fast Fourier transform. The wave functions obtained here govern the probability of finding the particle at any given location, but also determine the exit-time probability of the walker from a fixed interval, which is analyzed as well.
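A minimal sketch of a unidirectional walk of this type, using a Hadamard coin as an example (the paper's coin operator and conventions may differ): the walker either stays in place or hops one site to the right, so probability is conserved and the support only spreads rightward.

```python
import math

def qw_step(amps):
    """One step of a unidirectional discrete-time quantum walk:
    Hadamard coin, then a conditional shift (coin 0: stay, coin 1: move
    right). `amps` maps position -> [amplitude_coin0, amplitude_coin1]."""
    h = 1 / math.sqrt(2)
    new = {}
    for n, (a0, a1) in amps.items():
        b0 = h * (a0 + a1)      # coin-0 component stays at site n
        b1 = h * (a0 - a1)      # coin-1 component moves to site n+1
        new.setdefault(n, [0j, 0j])[0] += b0
        new.setdefault(n + 1, [0j, 0j])[1] += b1
    return new

def total_probability(amps):
    return sum(abs(a0) ** 2 + abs(a1) ** 2 for a0, a1 in amps.values())
```

Since the shift is a basis permutation and the coin is unitary, each step is unitary, which is what the conservation check in the test verifies.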

Relevance: 100.00%

Publisher:

Abstract:

In general, models of ecological systems can be broadly categorized as 'top-down' or 'bottom-up' models, based on the hierarchical level at which the model processes are formulated. The structure of a top-down, also known as phenomenological, population model can be interpreted in terms of population characteristics, but it typically lacks an interpretation on a more basic level. In contrast, bottom-up, also known as mechanistic, population models are derived from assumptions and processes on a more basic level, which allows interpretation of the model parameters in terms of individual behavior. Both approaches, phenomenological and mechanistic modelling, have their advantages and disadvantages in different situations. However, mechanistically derived models may be better at capturing the properties of the system at hand, and thus give more accurate predictions. In particular, when models are used for evolutionary studies, mechanistic models are more appropriate, since natural selection takes place at the individual level, and in mechanistic models the direct connection between model parameters and individual properties has already been established. The purpose of this thesis is twofold. Firstly, a systematic way to derive mechanistic discrete-time population models is presented. The derivation is based on combining explicitly modelled, continuous processes at the individual level within a reproductive period with a discrete-time maturation process between reproductive periods. Secondly, as an example of how evolutionary studies can be carried out in mechanistic models, the evolution of the timing of reproduction is investigated. Thus these two lines of research, the derivation of mechanistic population models and evolutionary studies, complement each other.
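The derivation strategy described, continuous within-season dynamics composed with a discrete between-season step, can be illustrated with a textbook example (not necessarily the thesis's own model): within-season mortality dN/dt = -(mu + nu*N)*N followed by discrete reproduction yields a Beverton-Holt-type map whose parameters inherit individual-level meaning (baseline mortality mu, competition nu, fecundity).

```python
import math

def within_season_survivors(n0, mu, nu, T, steps=10_000):
    """Euler integration of dN/dt = -(mu + nu*N)*N over a season of
    length T: baseline mortality mu plus density-dependent mortality nu."""
    n, dt = n0, T / steps
    for _ in range(steps):
        n += -(mu + nu * n) * n * dt
    return n

def beverton_holt_update(n0, mu, nu, T, fecundity):
    """Closed-form season survivors times per-capita fecundity, giving the
    mechanistically derived discrete-time (Beverton-Holt-type) map."""
    e = math.exp(-mu * T)
    survivors = mu * n0 * e / (mu + nu * n0 * (1 - e))
    return fecundity * survivors
```

Iterating `beverton_holt_update` between reproductive periods gives the discrete-time population model; evolutionary questions (e.g., timing of reproduction) can then be phrased as changes in the individual-level parameters.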

Relevance: 100.00%

Publisher:

Abstract:

The drug discovery process is facing new challenges in the evaluation of lead compounds as the number of new compounds synthesized increases. The potential of test compounds is most frequently assayed through the binding of the test compound to the target molecule or receptor, or by measuring functional secondary effects caused by the test compound in target model cells, tissues, or organisms. Modern homogeneous high-throughput screening (HTS) assays for purified estrogen receptors (ER) utilize various luminescence-based detection methods. Fluorescence polarization (FP) is a standard method for the ER ligand binding assay. It was used to demonstrate the performance of two-photon excitation of fluorescence (TPFE) versus the conventional one-photon excitation method. As a result, the TPFE method showed improved dynamics, was found to be comparable with the conventional method, and also held potential for efficient miniaturization. Other luminescence-based ER assays utilize energy transfer from a long-lifetime luminescent label, e.g., lanthanide chelates (Eu, Tb), to a prompt luminescent label, the signal being read in time-resolved mode. As an alternative to this method, a new single-label (Eu) time-resolved detection method was developed, based on quenching of the label by a soluble quencher molecule when it is displaced from the receptor into the solution phase by an unlabeled competing ligand. The new method was compared in parallel with the standard FP method; it was shown to yield comparable results and to hold a significantly higher signal-to-background ratio than FP. Cell-based functional assays determining the extent of cell surface adhesion molecule (CAM) expression, combined with microscopy analysis of the target molecules, would provide improved information content compared to an expression-level assay alone.
In this work, an immune response was simulated by exposing endothelial cells to cytokine stimulation, and the resulting increase in adhesion molecule expression was analyzed on fixed cells by means of immunocytochemistry utilizing specific long-lifetime-luminophore-labeled antibodies against the chosen adhesion molecules. The results showed that the method is suitable for a multi-parametric assay of the protein expression levels of several CAMs simultaneously, combined with analysis of the cellular localization of the chosen adhesion molecules through time-resolved luminescence microscopy.