928 results for Discrete-time control


Relevance:

80.00%

Publisher:

Abstract:

We analyze second-birth decisions within the theoretical framework of joint household decision making, comparing two countries that represent the international extremes in terms of women's career behaviour: Denmark and Spain. Using all eight ECHP panels, we apply discrete-time estimation of the likelihood of a second birth and show that in Spain fertility behaviour continues to conform to the classic "Becker model", while in Denmark we identify a radically new behavioural pattern according to which career women's fertility is conditional on their partners' contribution to caring for the children.
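
A minimal sketch of what such a discrete-time estimation can look like in practice (not the authors' ECHP specification): the event history is expanded into person-period records and the yearly hazard of a second birth is modelled with a logistic regression. All variable names and coefficients (career_woman, partner_care, ...) are illustrative assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)

rows = []
for woman in range(500):
    career = rng.integers(0, 2)            # "career woman" indicator (assumed)
    care = rng.integers(0, 2)              # partner contributes to child care (assumed)
    for year in range(1, 9):               # up to 8 yearly waves, as in the ECHP
        logit = -2.0 + 0.2 * year - 0.8 * career + 0.6 * career * care
        event = rng.random() < 1.0 / (1.0 + np.exp(-logit))
        rows.append(dict(woman=woman, year=year, career_woman=career,
                         partner_care=care, second_birth=int(event)))
        if event:
            break                          # woman leaves the risk set after the event

df = pd.DataFrame(rows)
fit = smf.logit("second_birth ~ year + career_woman * partner_care", df).fit(disp=0)
print(fit.summary().tables[1])             # discrete-time hazard model estimates
```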

Relevance:

80.00%

Publisher:

Abstract:

In the framework of the classical compound Poisson process in collective risk theory, we study a modification of the horizontal dividend barrier strategy by introducing random observation times at which dividends can be paid and ruin can be observed. This model contains both the continuous-time and the discrete-time risk model as limiting cases and represents a certain type of bridge between them that still enables the explicit calculation of moments of total discounted dividend payments until ruin. Numerical illustrations for several sets of parameters are given and the effect of random observation times on the performance of the dividend strategy is studied.
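
The paper derives moments of the discounted dividends analytically; as a rough complementary illustration (with made-up parameter values, exponential claims and exponential inter-observation times), the sketch below estimates the expected discounted dividends of the modified barrier strategy by Monte Carlo simulation.

```python
import numpy as np

rng = np.random.default_rng(1)

def discounted_dividends(u, barrier, premium_rate=1.2, claim_rate=1.0,
                         claim_mean=1.0, obs_rate=0.5, delta=0.03, horizon=200.0):
    """One path of the compound Poisson surplus process; dividends are paid
    (the excess over the barrier is skimmed off) and ruin is declared only
    at Poisson observation times."""
    t, surplus, pv = 0.0, u, 0.0
    next_claim = rng.exponential(1 / claim_rate)
    next_obs = rng.exponential(1 / obs_rate)
    while t < horizon:
        t_next = min(next_claim, next_obs, horizon)
        surplus += premium_rate * (t_next - t)    # premiums accrue continuously
        t = t_next
        if t == next_claim:
            surplus -= rng.exponential(claim_mean)
            next_claim = t + rng.exponential(1 / claim_rate)
        elif t == next_obs:
            if surplus < 0:                       # ruin only counts when observed
                break
            if surplus > barrier:                 # dividend = excess over the barrier
                pv += np.exp(-delta * t) * (surplus - barrier)
                surplus = barrier
            next_obs = t + rng.exponential(1 / obs_rate)
    return pv

est = np.mean([discounted_dividends(u=5.0, barrier=10.0) for _ in range(5000)])
print(f"estimated expected discounted dividends: {est:.3f}")
```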

Relevance:

80.00%

Publisher:

Abstract:

This paper looks at the dynamic management of risk in an economy with discrete-time consumption and endowments and continuous trading. I study how agents in such an economy deal with all the risk in the economy and attain their Pareto-optimal allocations by trading in a few natural securities: private insurance contracts and a common set of derivatives on the aggregate endowment. The parsimonious nature of the implied securities needed for Pareto optimality suggests that, in such contexts, complete markets is a very reasonable assumption.

Relevance:

80.00%

Publisher:

Abstract:

A high-resolution three-dimensional (3D) seismic reflection system for small-scale targets in lacustrine settings has been developed. Its main characteristics include navigation and shot-triggering software that fires the seismic source at regular distance intervals (max. error of 0.25 m) with real-time control on navigation using differential GPS (Global Positioning System). Receiver positions are accurately calculated (error < 0.20 m) with the aid of GPS antennas attached to the end of each of three 24-channel streamers. Two telescopic booms hold the streamers at a distance of 7.5 m from each other. With a receiver spacing of 2.5 m, the bin dimension is 1.25 m in the inline and 3.75 m in the crossline direction. To test the system, we conducted a 3D survey of about 1 km² in Lake Geneva, Switzerland, over a complex fault zone. A 5-m shot spacing resulted in a nominal fold of 6. A double-chamber bubble-cancelling 15/15 in³ air gun (40-650 Hz) operated at 80 bars and 1 m depth gave a signal penetration of 300 m below the water bottom and a best vertical resolution of 1.1 m. Processing followed a conventional scheme, but had to be adapted to the high sampling rates, and our unconventional navigation data needed conversion to industry standards. The high-quality data enabled us to construct maps of seismic horizons and fault surfaces in three dimensions. The system proves to be well adapted to investigate complex structures by providing non-aliased images of reflectors with dips of up to 30 degrees.
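
As a quick cross-check of the acquisition geometry quoted above, the snippet below reproduces the bin dimensions and nominal fold from the receiver spacing, streamer separation and shot spacing using the usual single-vessel 3D rules of thumb; it is only a back-of-the-envelope sketch, not part of the system described.

```python
# Geometry quoted in the abstract
receiver_spacing = 2.5      # m
streamer_separation = 7.5   # m
shot_spacing = 5.0          # m
channels_per_streamer = 24

inline_bin = receiver_spacing / 2                      # 1.25 m
crossline_bin = streamer_separation / 2                # 3.75 m
# nominal fold = active spread length / (2 * shot spacing)
fold = channels_per_streamer * receiver_spacing / (2 * shot_spacing)  # 6.0

print(inline_bin, crossline_bin, fold)
```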

Relevance:

80.00%

Publisher:

Abstract:

The analysis of multiexponential decays is challenging because of their complex nature. When analyzing these signals, not only the parameters, but also the orders of the models, have to be estimated. We present an improved spectroscopic technique specially suited for this purpose. The proposed algorithm combines an iterative linear filter with an iterative deconvolution method. A thorough analysis of the noise effect is presented. The performance is tested with synthetic and experimental data.
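
The proposed algorithm (an iterative linear filter combined with iterative deconvolution) is not reproduced here; as a generic point of comparison, the sketch below fits a bi-exponential decay to synthetic noisy data by ordinary nonlinear least squares, the baseline against which such techniques are usually measured. Model order selection, a key part of the problem, is not addressed in this sketch.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

def biexp(t, a1, tau1, a2, tau2):
    """Sum of two exponential decays."""
    return a1 * np.exp(-t / tau1) + a2 * np.exp(-t / tau2)

t = np.linspace(0, 10, 400)
y = biexp(t, 3.0, 0.8, 1.0, 4.0) + rng.normal(0, 0.05, t.size)  # noisy synthetic decay

popt, _ = curve_fit(biexp, t, y, p0=(1, 1, 1, 5))
print(popt)   # estimated amplitudes and time constants
```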

Relevance:

80.00%

Publisher:

Abstract:

We consider a discrete-time, pure-exchange, infinite-horizon economy with two or more consumers and at least one consumption good per period. Within the framework of decentralized mechanisms, we show that for a given consumption trade at any period of time, say at time one, the consumers will need, in general, an infinite-dimensional (informational) space to identify such a trade as an intertemporal Walrasian one. However, we show and characterize a set of environments where the Walrasian trades at each period of time can be achieved as the equilibrium trades of a sequence of decentralized competitive mechanisms, using only current prices and quantities to coordinate decisions.

Relevance:

80.00%

Publisher:

Abstract:

Executive Summary: The unifying theme of this thesis is the pursuit of satisfactory ways to quantify the risk-reward trade-off in financial economics, first in the context of a general asset pricing model, then across models, and finally across country borders. The guiding principle in that pursuit was to seek innovative solutions by combining ideas from different fields in economics and broader scientific research. For example, in the first part of this thesis we sought a fruitful application of strong existence results in utility theory to topics in asset pricing. In the second part we apply an idea from the field of fuzzy set theory to the optimal portfolio selection problem, while the third part of this thesis is, to the best of our knowledge, the first empirical application of some general results in asset pricing in incomplete markets to the important topic of measuring financial integration. While the first two parts of this thesis effectively combine well-known ways to quantify risk-reward trade-offs, the third one can be viewed as an empirical verification of the usefulness of the so-called "good deal bounds" theory in designing risk-sensitive pricing bounds. Chapter 1 develops a discrete-time asset pricing model based on a novel ordinally equivalent representation of recursive utility. To the best of our knowledge, we are the first to use a member of a novel class of recursive utility generators to construct a representative-agent model that addresses some long-standing issues in asset pricing. Applying strong representation results allows us to show that the model features countercyclical risk premia, for both consumption and financial risk, together with a low and procyclical risk-free rate. As the recursive utility used nests the well-known time-state separable utility as a special case, all results nest the corresponding ones from the standard model and thus shed light on its well-known shortcomings. The empirical investigation in support of these theoretical results, however, showed that as long as one resorts to econometric methods based on approximating conditional moments with unconditional ones, it is not possible to distinguish the model we propose from the standard one. Chapter 2 is joint work with Sergei Sontchik. There we provide theoretical and empirical motivation for the aggregation of performance measures. The main idea is that just as it makes sense to apply several performance measures ex post, it also makes sense to base optimal portfolio selection on ex-ante maximization of as many performance measures as desired. We thus offer a concrete algorithm for optimal portfolio selection via ex-ante optimization, over different horizons, of several risk-return trade-offs simultaneously. An empirical application of that algorithm, using seven popular performance measures, suggests that the realized returns feature better distributional characteristics than the realized returns of portfolio strategies that are optimal with respect to single performance measures. When comparing the distributions of realized returns we used two partial risk-reward orderings: first- and second-order stochastic dominance.
We first used the Kolmogorov-Smirnov test to determine whether the two distributions are indeed different, which, combined with a visual inspection, allowed us to demonstrate that the way we propose to aggregate performance measures leads to portfolio realized returns that first-order stochastically dominate those resulting from optimization with respect to only, for example, the Treynor ratio or Jensen's alpha. We checked for second-order stochastic dominance via pointwise comparison of the so-called absolute Lorenz curve, i.e. the sequence of expected shortfalls for a range of quantiles. Since the plot of the absolute Lorenz curve for the aggregated performance measures was above the one corresponding to each individual measure, we were tempted to conclude that the algorithm we propose leads to a portfolio return distribution that second-order stochastically dominates those of virtually all performance measures considered. Chapter 3 proposes a measure of financial integration based on recent advances in asset pricing in incomplete markets. Given a base market (a set of traded assets) and an index of another market, we propose to measure financial integration through time by the size of the spread between the pricing bounds of the market index relative to the base market. The bigger the spread around country index A, viewed from market B, the less integrated markets A and B are. We investigate the presence of structural breaks in the size of the spread for EMU member country indices before and after the introduction of the Euro. We find evidence that both the level and the volatility of our financial integration measure increased after the introduction of the Euro. That counterintuitive result suggests the presence of an inherent weakness in the attempt to measure financial integration independently of economic fundamentals. Nevertheless, the results about the bounds on the risk-free rate appear plausible from the viewpoint of existing economic theory about the impact of integration on interest rates.
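
A compact sketch of the two comparisons described above, run on placeholder return series rather than the thesis data: a two-sample Kolmogorov-Smirnov test, a quantile-wise check of first-order stochastic dominance, and a second-order check via pointwise comparison of the absolute Lorenz curve (the sequence of expected shortfalls over a grid of quantiles).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
ret_aggregate = rng.normal(0.010, 0.040, 1000)   # returns of the aggregated strategy (placeholder)
ret_single = rng.normal(0.006, 0.050, 1000)      # returns of a single-measure strategy (placeholder)

print(stats.ks_2samp(ret_aggregate, ret_single))  # are the two distributions different?

q = np.linspace(0.01, 0.99, 99)
# First-order dominance: the aggregated strategy's quantile function lies (weakly) above.
fosd = np.all(np.quantile(ret_aggregate, q) >= np.quantile(ret_single, q))

def absolute_lorenz(x, q):
    """Cumulative expected shortfall E[X * 1{X <= F^{-1}(p)}] for each p in q."""
    xs = np.sort(x)
    idx = np.ceil(q * len(xs)).astype(int)
    return np.array([xs[:k].sum() / len(xs) for k in idx])

# Second-order dominance: the absolute Lorenz curve of the aggregated strategy lies above.
sosd = np.all(absolute_lorenz(ret_aggregate, q) >= absolute_lorenz(ret_single, q))
print(fosd, sosd)
```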

Relevance:

80.00%

Publisher:

Abstract:

A simple model of diffusion of innovations in a social network with upgrading costs is introduced. Agents are characterized by a single real variable, their technological level. According to local information, agents decide whether to upgrade their level or not, balancing their possible benefit with the upgrading cost. A critical point where technological avalanches display a power-law behavior is also found. This critical point is characterized by a macroscopic observable that turns out to optimize technological growth in the stationary state. Analytical results supporting our findings are found for the globally coupled case.
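
A loose, illustrative sketch of the globally coupled case, not the authors' exact update rule: each agent carries a technological level, a randomly chosen agent innovates at each step, and every agent whose gap to the current best level exceeds the upgrading cost C adopts that level, producing an avalanche whose size is recorded. All parameter values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
N, C, steps = 1000, 1.0, 5000
level = np.zeros(N)                               # technological level of each agent
avalanche_sizes = []

for _ in range(steps):
    i = rng.integers(N)
    level[i] += rng.exponential(0.5)              # idiosyncratic innovation
    best = level.max()
    laggards = best - level > C                   # benefit of catching up exceeds the cost
    if laggards.any():
        level[laggards] = best                    # pay the upgrading cost and adopt
        avalanche_sizes.append(int(laggards.sum()))

sizes = np.array(avalanche_sizes)
print("avalanches:", len(sizes), "largest:", sizes.max() if len(sizes) else 0)
```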

Relevance:

80.00%

Publisher:

Abstract:

Acute normocapnic hypoxemia can cause functional renal insufficiency by increasing renal vascular resistance (RVR), leading to renal hypoperfusion and decreased glomerular filtration rate (GFR). Insulin-like growth factor 1 (IGF-1) activity is low in fetuses and newborns and further decreases during hypoxia. IGF-1 administration to humans and adult animals induces pre- and postglomerular vasodilation, thereby increasing GFR and renal blood flow (RBF). A potential protective effect of IGF-1 on renal function was evaluated in newborn rabbits with hypoxemia-induced renal insufficiency. Renal function and hemodynamic parameters were assessed in 17 anesthetized and mechanically ventilated newborn rabbits. After hypoxemia stabilization, saline solution (time control) or IGF-1 (1 mg/kg) was given as an intravenous (i.v.) bolus, and renal function was determined for six 30-min periods. Normocapnic hypoxemia significantly increased RVR (+16%), leading to decreased GFR (-14%), RBF (-19%) and diuresis (-12%), with an increased filtration fraction (FF). Saline solution resulted in a worsening of parameters affected by hypoxemia. Contrarily, although mean blood pressure decreased slightly but significantly, IGF-1 prevented a further increase in RVR, with subsequent improvement of GFR, RBF and diuresis. FF indicated relative postglomerular vasodilation. Although hypoxemia-induced acute renal failure was not completely prevented, IGF-1 elicited efferent vasodilation, thereby precluding a further decline in renal function.

Relevance:

80.00%

Publisher:

Abstract:

The motivation for this research originated from the abrupt rise and fall of minicomputers, which were initially used for both industrial automation and business applications because of their significantly lower cost than their predecessors, the mainframes. Later, industrial automation developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA and robot control systems. Today this industry employs over 200,000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into one information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC continued to evolve in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption system-on-chip (SoC) architecture. Unlike with CISC processors, the RISC processor architecture business is a separate industry from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which gives customers more choices thanks to hardware-independent, real-time-capable software applications. An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers and is now being further extended by tablets. An additional underlying element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominant closed operating-system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries.
Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based, hardware. They enjoy admirable profitability levels on a very narrow customer base thanks to strong technology-enabled customer lock-in and customers' high risk exposure, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will in due course face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation, subject to the competition between the incumbents, first through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development, and second through research on process re-engineering in the case of global software support for complex systems. Third, we investigate the views of industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and we conclude with our assessment of the possible routes along which industrial automation could advance, taking into account the looming rise of the Internet of Things (IoT) and weightless networks. Industrial automation is an industry dominated by a handful of global players, each of them focused on maintaining its own proprietary solutions. The rise of de facto standards such as the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created the new markets of personal computers, smartphones and tablets, and will eventually also impact industrial automation through game-changing commoditization and related control-point and business-model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.

Relevance:

80.00%

Publisher:

Abstract:

OBJECTIVE: Hierarchical modeling has been proposed as a solution to the multiple exposure problem. We estimate associations between metabolic syndrome and different components of antiretroviral therapy using both conventional and hierarchical models. STUDY DESIGN AND SETTING: We use discrete-time survival analysis to estimate the association between metabolic syndrome and cumulative exposure to 16 antiretrovirals from four drug classes. We fit a hierarchical model where the drug class provides a prior model of the association between metabolic syndrome and exposure to each antiretroviral. RESULTS: One thousand two hundred and eighteen patients were followed for a median of 27 months, with 242 cases of metabolic syndrome (20%) at a rate of 7.5 cases per 100 patient-years. Metabolic syndrome was more likely to develop in patients exposed to stavudine, but was less likely to develop in those exposed to atazanavir. The estimate for exposure to atazanavir increased from a hazard ratio of 0.06 per 6 months' use in the conventional model to 0.37 in the hierarchical model (or from 0.57 to 0.81 when using spline-based covariate adjustment). CONCLUSION: These results are consistent with trials that show the disadvantage of stavudine and the advantage of atazanavir relative to other drugs in their respective classes. The hierarchical model gave more plausible results than the equivalent conventional model.
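
The flavour of the hierarchical estimate can be conveyed with a simple empirical-Bayes shrinkage sketch: each drug's log hazard ratio from a conventional fit is pulled toward its drug-class mean with precision weights. The coefficients, standard errors and between-drug variance below are placeholders, not the study's estimates, and the model in the paper is fitted jointly rather than in this two-stage fashion.

```python
import numpy as np

drugs = ["stavudine", "zidovudine", "atazanavir", "lopinavir"]
drug_class = ["NRTI", "NRTI", "PI", "PI"]
beta = np.array([0.45, 0.10, -2.80, -0.30])   # log HR per 6 months' exposure (placeholders)
se = np.array([0.15, 0.20, 1.10, 0.25])       # their standard errors (placeholders)
tau2 = 0.25                                   # assumed between-drug variance within a class

shrunk = beta.copy()
for cls in set(drug_class):
    idx = [i for i, c in enumerate(drug_class) if c == cls]
    mu = beta[idx].mean()                     # class-level prior mean
    w = (1 / se[idx] ** 2) / (1 / se[idx] ** 2 + 1 / tau2)
    shrunk[idx] = w * beta[idx] + (1 - w) * mu   # precision-weighted compromise

for d, b, s in zip(drugs, beta, shrunk):
    print(f"{d:12s} HR {np.exp(b):5.2f} -> {np.exp(s):5.2f}")
```

Imprecisely estimated drug effects (large standard errors) are pulled strongly toward the class mean, while precisely estimated ones barely move, which is the mechanism behind the more plausible atazanavir estimate reported above.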

Relevance:

80.00%

Publisher:

Abstract:

This paper is concerned with the derivation of new estimators and performance bounds for the problem of timing estimation of (linearly) digitally modulated signals. The conditional maximum likelihood (CML) method is adopted, in contrast to the classical low-SNR unconditional ML (UML) formulation that is systematically applied in the literature for the derivation of non-data-aided (NDA) timing error detectors (TEDs). A new CML TED is derived and proved to be self-noise free, in contrast to the conventional low-SNR-UML TED. In addition, the paper provides a derivation of the conditional Cramér–Rao bound (CRB), which is higher (less optimistic) than the modified CRB (MCRB) [which is only reached by decision-directed (DD) methods]. It is shown that the CRB is a lower bound on the asymptotic statistical accuracy of the set of consistent estimators that are quadratic with respect to the received signal. Although the obtained bound is not general, it applies to most NDA synchronizers proposed in the literature. A closed-form expression of the conditional CRB is obtained, and numerical results confirm that the CML TED attains the new bound for moderate to high Eg/N0.

Relevance:

80.00%

Publisher:

Abstract:

The Wigner higher-order moment spectra (WHOS) are defined as extensions of the Wigner-Ville distribution (WD) to higher-order moment spectra domains. A general class of time-frequency higher-order moment spectra is also defined in terms of arbitrary higher-order moments of the signal, as generalizations of Cohen's general class of time-frequency representations. The properties of the general class of time-frequency higher-order moment spectra can be related to the properties of WHOS, which are, in fact, extensions of the properties of the WD. Discrete time and frequency Wigner higher-order moment spectra (DTF-WHOS) distributions are introduced for signal processing applications and are shown to be implemented with two FFT-based algorithms. One application is presented in which the Wigner bispectrum (WB), which is a WHOS in the third-order moment domain, is utilized for the detection of transient signals embedded in noise. The WB is compared with the WD in terms of simulation examples and analysis of real sonar data. It is shown that better detection schemes can be derived, at low signal-to-noise ratio, when the WB is applied.
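
The second-order member of this family is the ordinary Wigner-Ville distribution, and the FFT-over-lag implementation pattern mentioned above can be illustrated at that order. The sketch below computes a discrete pseudo-WVD of an analytic signal; the higher-order WHOS and the paper's two specific algorithms are not reproduced.

```python
import numpy as np
from scipy.signal import chirp, hilbert

def wigner_ville(x):
    """Discrete pseudo Wigner-Ville distribution of an analytic signal x.
    Row n is time sample n; frequency bin k corresponds to k * fs / (2 * N)."""
    x = np.asarray(x, dtype=complex)
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        tau_max = min(n, N - 1 - n)                          # lags limited by the signal edges
        tau = np.arange(-tau_max, tau_max + 1)
        kernel = np.zeros(N, dtype=complex)
        kernel[tau % N] = x[n + tau] * np.conj(x[n - tau])   # instantaneous autocorrelation
        W[n, :] = np.real(np.fft.fft(kernel))                # FFT over the lag variable
    return W

t = np.linspace(0, 1, 256)
x = hilbert(chirp(t, f0=10, t1=1, f1=60))   # analytic chirp to suppress negative-frequency cross-terms
print(wigner_ville(x).shape)                # (256, 256) time-frequency map
```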

Relevance:

80.00%

Publisher:

Abstract:

In this paper, we consider a discrete-time risk process allowing for delay in claim settlement, which introduces a certain type of dependence in the process. From martingale theory, an expression for the ultimate ruin probability is obtained, and Lundberg-type inequalities are derived. The impact of delay in claim settlement is then investigated. To this end, a convex order comparison of the aggregate claim amounts is performed with the corresponding non-delayed risk model, and numerical simulations are carried out with Belgian market data.
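
As an illustrative companion to the analytical results, the sketch below estimates a finite-horizon ruin probability by simulating a discrete-time surplus process in which each claim is settled a random number of periods after it occurs. The claim frequency, severity and delay distributions are assumed placeholders, not the Belgian market data used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def ruin_probability(u, premium, horizon, n_paths=20_000,
                     p_claim=0.3, claim_mean=2.0, delay_max=3):
    """Monte Carlo estimate of the finite-horizon ruin probability for a
    discrete-time risk process with delayed claim settlement."""
    ruined = 0
    for _ in range(n_paths):
        surplus = u
        pending = np.zeros(horizon + delay_max + 1)   # claims scheduled for future payment
        for t in range(horizon):
            surplus += premium
            if rng.random() < p_claim:                # a claim occurs ...
                amount = rng.exponential(claim_mean)
                delay = rng.integers(0, delay_max + 1)
                pending[t + delay] += amount          # ... but is settled 'delay' periods later
            surplus -= pending[t]                     # pay claims settled in this period
            if surplus < 0:
                ruined += 1
                break
    return ruined / n_paths

print(ruin_probability(u=10.0, premium=0.8, horizon=100))
```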