937 results for Discrete time inventory models


Relevance:

100.00%

Publisher:

Abstract:

Background: An elevated resting heart rate is associated with rehospitalization for heart failure and is a modifiable risk factor in heart failure patients. We aimed to examine the association between resting heart rate and incident heart failure in a population-based cohort study of healthy adults without pre-existing overt heart disease. Methods and Results: We studied 4768 men and women aged ≥55 years from the population-based Rotterdam Study. We excluded participants with prevalent heart failure, coronary heart disease, pacemaker, atrial fibrillation, atrioventricular block, and those using β-blockers or calcium channel blockers. We used extended Cox models allowing for time-dependent variation of resting heart rate during follow-up. During a median of 14.6 years of follow-up, 656 participants developed heart failure. The risk of heart failure was higher in men with higher resting heart rate. For each increment of 10 beats per minute, the multivariable adjusted hazard ratios in men were 1.16 (95% confidence interval, 1.05-1.28; P=0.005) in the time-fixed heart rate model and 1.13 (95% confidence interval, 1.02-1.25; P=0.017) in the time-dependent heart rate model. The association could not be demonstrated in women (P for interaction=0.004). Censoring participants for incident coronary heart disease or using time-dependent models to account for the use of β-blockers or calcium channel blockers during follow-up did not alter the results. Conclusions: A higher resting heart rate at baseline, or one that persists over time, is an independent risk factor for the development of heart failure in healthy older men in the general population.

Relevance:

100.00%

Publisher:

Abstract:

A simple model of diffusion of innovations in a social network with upgrading costs is introduced. Agents are characterized by a single real variable, their technological level. According to local information, agents decide whether to upgrade their level or not, balancing their possible benefit with the upgrading cost. A critical point where technological avalanches display a power-law behavior is also found. This critical point is characterized by a macroscopic observable that turns out to optimize technological growth in the stationary state. Analytical results supporting our findings are obtained for the globally coupled case.
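A minimal agent-based sketch of such a model can make the upgrade rule concrete. The ring network, update rule and innovation rate below are invented for illustration and are not the paper's exact specification:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200          # number of agents on a ring network (illustrative)
C = 1.0          # upgrading cost
a = np.zeros(N)  # technological level of each agent

for step in range(2000):
    i = rng.integers(N)
    # best technological level among the agent's ring neighbours
    best = max(a[(i - 1) % N], a[(i + 1) % N])
    # upgrade only if the benefit of matching the neighbourhood's
    # best level exceeds the upgrading cost
    if best - a[i] > C:
        a[i] = best
    # occasional spontaneous innovation pushes some agent forward
    if rng.random() < 0.1:
        j = rng.integers(N)
        a[j] += rng.random()
```

Sweeping the cost `C` in such a toy model is how one would look for the critical point where upgrade avalanches become scale-free.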

Relevance:

100.00%

Publisher:

Abstract:

Interfacial hydrodynamic instabilities arise in a range of chemical systems. One mechanism for instability is the occurrence of unstable density gradients due to the accumulation of reaction products. In this paper we conduct two-dimensional nonlinear numerical simulations for a member of this class of system: the methylene blue-glucose reaction. The result of these reactions is the oxidation of glucose to a relatively, but marginally, dense product, gluconic acid, that accumulates at oxygen permeable interfaces, such as the surface open to the atmosphere. The reaction is catalyzed by methylene blue. We show that simulations help to disentangle the mechanisms responsible for the onset of instability and evolution of patterns, and we demonstrate that some of the results are remarkably consistent with experiments. We probe the impact of the upper oxygen boundary condition, for fixed flux, fixed concentration, or mixed boundary conditions, and find significant qualitative differences in solution behavior; structures either attract or repel one another depending on the boundary condition imposed. We suggest that measurement of the form of the boundary condition is possible via observation of oxygen penetration, and improved product yields may be obtained via proper control of boundary conditions in an engineering setting. We also investigate the dependence on parameters such as the Rayleigh number and depth. Finally, we find that pseudo-steady linear and weakly nonlinear techniques described elsewhere are useful tools for predicting the behavior of instabilities beyond their formal range of validity, as good agreement is obtained with the simulations.

Relevance:

100.00%

Publisher:

Abstract:

We consider a discrete time, pure exchange, infinite horizon economy with two or more consumers and at least one consumption good per period. Within the framework of decentralized mechanisms, we show that for a given consumption trade at any period of time, say at time one, the consumers will need, in general, an infinite dimensional (informational) space to identify such a trade as an intertemporal Walrasian one. However, we show and characterize a set of environments where the Walrasian trades at each period of time can be achieved as the equilibrium trades of a sequence of decentralized competitive mechanisms, using only current prices and quantities to coordinate decisions.
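The one-period coordination idea can be made concrete with a tiny spot economy. The sketch below (two Cobb-Douglas consumers with illustrative tastes and endowments, not the paper's general setting) computes the Walrasian trade of a single period from current prices and quantities alone:

```python
import numpy as np

# Per-period spot economy: two consumers, two goods, Cobb-Douglas
# preferences u_i = alpha_i*log(x1) + (1 - alpha_i)*log(x2).
alpha = np.array([0.5, 0.3])   # taste for good 1 (illustrative values)
w1 = np.array([1.0, 2.0])      # endowments of good 1
w2 = np.array([2.0, 1.0])      # endowments of good 2 (numeraire, p2 = 1)

# Market clearing for good 1 gives a closed-form relative price:
#   p1 * sum(w1_i) = sum(alpha_i * (p1*w1_i + w2_i))
p1 = np.sum(alpha * w2) / (np.sum(w1) - np.sum(alpha * w1))

# Walrasian demands at prices (p1, 1)
wealth = p1 * w1 + w2
x1 = alpha * wealth / p1
x2 = (1 - alpha) * wealth
```

Both markets clear at the computed price, so the period's trade is pinned down by current prices and quantities, which is the kind of low-dimensional coordination the characterized environments allow.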

Relevance:

100.00%

Publisher:

Abstract:

This paper is concerned with the derivation of new estimators and performance bounds for the problem of timing estimation of (linearly) digitally modulated signals. The conditional maximum likelihood (CML) method is adopted, in contrast to the classical low-SNR unconditional ML (UML) formulation that is systematically applied in the literature for the derivation of non-data-aided (NDA) timing-error-detectors (TEDs). A new CML TED is derived and proved to be self-noise free, in contrast to the conventional low-SNR-UML TED. In addition, the paper provides a derivation of the conditional Cramér-Rao bound (CRB), which is higher (less optimistic) than the modified CRB (MCRB), which is only reached by decision-directed (DD) methods. It is shown that the CRB is a lower bound on the asymptotic statistical accuracy of the set of consistent estimators that are quadratic with respect to the received signal. Although the obtained bound is not general, it applies to most NDA synchronizers proposed in the literature. A closed-form expression of the conditional CRB is obtained, and numerical results confirm that the CML TED attains the new bound for moderate to high Eg/N0.

Relevance:

100.00%

Publisher:

Abstract:

The Wigner higher order moment spectra (WHOS) are defined as extensions of the Wigner-Ville distribution (WD) to higher order moment spectra domains. A general class of time-frequency higher order moment spectra is also defined in terms of arbitrary higher order moments of the signal as generalizations of Cohen's general class of time-frequency representations. The properties of the general class of time-frequency higher order moment spectra can be related to the properties of WHOS, which are, in fact, extensions of the properties of the WD. Discrete time and frequency Wigner higher order moment spectra (DTF-WHOS) distributions are introduced for signal processing applications and are shown to be implemented with two FFT-based algorithms. One application is presented where the Wigner bispectrum (WB), which is a WHOS in the third-order moment domain, is utilized for the detection of transient signals embedded in noise. The WB is compared with the WD in terms of simulation examples and analysis of real sonar data. It is shown that better detection schemes can be derived, in low signal-to-noise ratio, when the WB is applied.
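The FFT-based implementation can be illustrated on the second-order member of this family, the discrete Wigner-Ville distribution, which is an FFT of the instantaneous autocorrelation over the lag variable. The sketch below is not the paper's own algorithm; the chirp signal and normalization are illustrative:

```python
import numpy as np

def wigner_ville(x):
    """Discrete Wigner-Ville distribution via an FFT over the lag axis."""
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        mmax = min(n, N - 1 - n)          # largest symmetric lag at time n
        m = np.arange(-mmax, mmax + 1)
        kernel = np.zeros(N, dtype=complex)
        # instantaneous autocorrelation x[n+m] x*[n-m]
        kernel[m % N] = x[n + m] * np.conj(x[n - m])
        W[n] = np.fft.fft(kernel).real    # FFT over the lag variable
    return W

# Linear chirp test signal
N = 128
t = np.arange(N)
x = np.exp(1j * 0.001 * t ** 2)
W = wigner_ville(x)
```

With this normalization the distribution satisfies the time marginal exactly: summing each row over frequency returns N times the instantaneous power of the signal.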

Relevance:

100.00%

Publisher:

Abstract:

This thesis examines general methods for analysing the performance of control loops and applies them to the controls of a continuous pulp digester. The presented methods also offer means for finding the cause of poor control performance and hints for achieving better performance. The analysis proceeded top-down, starting from the digester's most important control, the kappa number control. Factors affecting it were then sought among the measured quantities. Next, the control performance of the factor judged most important (chip level) was assessed, and room for improvement was found. Finally, the tuning of the chip level control was changed and an identification experiment was carried out for rearranging the control structure.
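One standard family of such performance-assessment methods compares the observed output variance with the minimum variance achievable under ideal control (the Harris index). The sketch below uses simulated data and a hypothetical loop model, not the digester itself, and assumes a one-sample process delay:

```python
import numpy as np

rng = np.random.default_rng(1)
# Simulated closed-loop output: white disturbance filtered by toy
# loop dynamics (stands in for routine operating data)
e = rng.standard_normal(5000)
y = e.copy()
for t in range(2, len(y)):
    y[t] = 0.6 * y[t - 1] - 0.2 * y[t - 2] + e[t]

# Fit an AR model to y; its residual variance estimates the innovation
# variance, which is the minimum achievable output variance when the
# process delay is one sample.
p = 10
X = np.column_stack([y[p - k:len(y) - k] for k in range(1, p + 1)])
coef, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
resid = y[p:] - X @ coef
sigma2_mv = np.var(resid)          # estimated minimum variance
harris = sigma2_mv / np.var(y)     # near 1 = good control, near 0 = poor
```

For longer delays the first few impulse-response coefficients of the disturbance model would also be included in the minimum-variance estimate.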

Relevance:

100.00%

Publisher:

Abstract:

Pulsewidth-modulated (PWM) rectifier technology is increasingly used in industrial applications like variable-speed motor drives, since it offers several desired features such as sinusoidal input currents, controllable power factor, bidirectional power flow and high quality DC output voltage. To achieve these features, however, an effective control system with fast and accurate current and DC voltage responses is required. Of the various control strategies proposed to meet these control objectives, in most cases the commonly known principle of synchronous-frame current vector control along with some space-vector PWM scheme has been applied. Recently, however, new control approaches analogous to the well-established direct torque control (DTC) method for electrical machines have also emerged to implement a high-performance PWM rectifier. In this thesis the concepts of classical synchronous-frame current control and DTC-based PWM rectifier control are combined and a new converter-flux-based current control (CFCC) scheme is introduced. To achieve sufficient dynamic performance and to ensure stable operation, the proposed control system is thoroughly analysed and simple rules for the controller design are suggested. Special attention is paid to the estimation of the converter flux, which is the key element of converter-flux-based control. Discrete-time implementation is also discussed. Line-voltage-sensorless reactive power control methods for the L- and LCL-type line filters are presented. For the L-filter an open-loop control law for the d-axis current reference is proposed. In the case of the LCL-filter a combination of open-loop and feedback control is proposed. The influence of erroneous filter parameter estimates on the accuracy of the developed control schemes is also discussed. A new zero-vector selection rule for suppressing the zero-sequence current in parallel-connected PWM rectifiers is proposed. 
With this method a truly standalone and independent control of the converter units is possible, and traditional transformer-isolation and synchronised-control-based solutions are avoided. The implementation requires only one additional current sensor. The proposed schemes are evaluated by simulations and laboratory experiments. Satisfactory performance and good agreement between theory and practice are demonstrated.
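As a minimal illustration of current control for a line filter (a simplified single-axis sketch with illustrative plant and gain values, not the thesis's converter-flux-based scheme), a discrete-time PI controller regulating the current through an L filter can be simulated as follows:

```python
import numpy as np

# Plant: L di/dt = u - R*i (the grid voltage is treated as a measured
# disturbance and assumed pre-compensated), discretised by forward Euler.
L, R, Ts = 5e-3, 0.1, 1e-4     # inductance (H), resistance (ohm), step (s)
a = 1 - R * Ts / L
b = Ts / L

kp, ki = 5.0, 2000.0           # illustrative PI gains
i, integ = 0.0, 0.0
i_ref = 10.0                   # current reference (A)
hist = []
for _ in range(2000):
    e = i_ref - i
    integ += ki * Ts * e       # integral action removes steady-state error
    u = kp * e + integ         # PI control voltage
    i = a * i + b * u
    hist.append(i)
```

The integral term drives the steady-state current error to zero; the real scheme performs this in the synchronous dq frame with the converter flux estimate in the loop.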

Relevance:

100.00%

Publisher:

Abstract:

Abstract Purpose: There is a lack of studies on tourism demand forecasting that use non-linear models. The aim of this paper is to introduce consumer expectations into time-series models in order to analyse their usefulness for forecasting tourism demand. Design/methodology/approach: The paper focuses on forecasting tourism demand in Catalonia for the four main visitor markets (France, the UK, Germany and Italy), combining qualitative information with quantitative models: autoregressive (AR), autoregressive integrated moving average (ARIMA), self-exciting threshold autoregression (SETAR) and Markov switching regime (MKTAR) models. The forecasting performance of the different models is evaluated for different time horizons (one, two, three, six and 12 months). Findings: Although some differences are found between the results obtained for the different countries, when comparing the forecasting accuracy of the different techniques, ARIMA and Markov switching regime models outperform the rest. In all cases, forecasts of arrivals show lower root mean square errors (RMSE) than forecasts of overnight stays. Models with consumer expectations do not outperform benchmark models. These results hold for all time horizons analysed. Research limitations/implications: This study encourages the use of qualitative information and more advanced econometric techniques in order to improve tourism demand forecasting. Originality/value: This is the first study on tourism demand focusing specifically on Catalonia. To date, there have been no studies on tourism demand forecasting that use non-linear models such as self-exciting threshold autoregressions (SETAR) and Markov switching regime (MKTAR) models. This paper fills this gap and analyses forecasting performance at a regional level. Keywords: Tourism, Forecasting, Consumers, Spain, Demand management. Paper type: Research paper
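The flavour of such a model comparison can be sketched on a synthetic monthly series: fit a linear autoregression by least squares, produce iterated multi-step forecasts, and compare them with a naive benchmark by RMSE. The data, lag order and horizon below are illustrative; the paper itself uses ARIMA, SETAR and MKTAR models:

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic monthly arrivals: trend + annual seasonality + noise
t = np.arange(240)
y = 100 + 0.2 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 2, 240)

train, test = y[:228], y[228:]

# Naive benchmark: repeat the last observed value over the horizon
naive_fc = np.repeat(train[-1], 12)

# AR(13) fitted by least squares (lag 13 > seasonal period 12)
p = 13
X = np.column_stack([train[p - k:len(train) - k] for k in range(1, p + 1)])
X = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(X, train[p:], rcond=None)

# Iterated one-step forecasts over a 12-month horizon
hist_vals = list(train)
ar_fc = []
for _ in range(12):
    lags = np.r_[1.0, hist_vals[-1:-p - 1:-1]]   # intercept + recent lags
    f = lags @ coef
    ar_fc.append(f)
    hist_vals.append(f)

rmse = lambda f, o: np.sqrt(np.mean((np.asarray(f) - o) ** 2))
```

On seasonal data the autoregression should beat the naive benchmark by a wide RMSE margin, which is the kind of comparison the paper tabulates across models and horizons.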

Relevance:

100.00%

Publisher:

Abstract:

In this paper we focus our attention on a particle that follows a unidirectional quantum walk, an alternative version of the currently widespread discrete-time quantum walk on a line. Here the walker at each time step can either remain in place or move in a fixed direction, e.g., rightward or upward. While both formulations are essentially equivalent, the present approach leads us to consider discrete Fourier transforms, which eventually results in obtaining explicit expressions for the wave functions in terms of finite sums and allows the use of efficient algorithms based on the fast Fourier transform. The wave functions here obtained govern the probability of finding the particle at any given location, but also determine the exit-time probability of the walker from a fixed interval, which is likewise analyzed.
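The stay-or-move dynamics can be simulated directly; the shift step is diagonalized by a discrete Fourier transform, which is what makes FFT-based evaluation of the wave functions efficient. The minimal sketch below uses a Hadamard coin and a localized initial state chosen for illustration:

```python
import numpy as np

T = 50                        # number of time steps
N = T + 1                     # the walker can reach at most position T
# Two internal (coin) components per site: index 0 -> "stay", 1 -> "move"
psi = np.zeros((2, N), dtype=complex)
psi[0, 0] = 1.0               # walker starts at the origin

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard coin

for _ in range(T):
    phi = np.tensordot(H, psi, axes=(1, 0))    # coin toss on every site
    new = np.zeros_like(psi)
    new[0] = phi[0]                            # "stay" component remains
    new[1, 1:] = phi[1, :-1]                   # "move" component shifts right
    psi = new

prob = (np.abs(psi) ** 2).sum(axis=0)          # position distribution
```

Because motion is unidirectional, the support after T steps lies entirely in [0, T] and the evolution is norm-preserving on that half-line.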

Relevance:

100.00%

Publisher:

Abstract:

Nowadays, species distribution models (SDMs) are a widely used tool. Using different statistical approaches, these models reconstruct the realized niche of a species using presence data and a set of variables, often topoclimatic. Their range of uses is quite large, from understanding the requirements of a single species, to creating nature reserves based on species hotspots, to modeling the impact of climate change. Most of the time these models use variables at a resolution of 50 km x 50 km or 1 km x 1 km. However, in some cases these models are used with resolutions below the kilometer scale and are thus called high resolution models (100 m x 100 m or 25 m x 25 m). Quite recently a new kind of data has emerged enabling precision up to 1 m x 1 m and thus allowing very high resolution modeling. However, these new variables are very costly and need a considerable amount of time to be processed. This is especially the case when these variables are used in complex calculations like model projections over large areas. Moreover, the importance of very high resolution data in SDMs has not been assessed yet and is not well understood. Some basic knowledge of what drives species presences and absences is still missing. Indeed, it is not clear whether in mountain areas like the Alps coarse topoclimatic gradients drive species distributions, whether fine-scale temperature or topography are more important, or whether their importance can be neglected when balanced against competition or stochasticity. In this thesis I investigated the importance of very high resolution data (2-5 m) in species distribution models using very high resolution topographic, climatic and edaphic variables over a 2000 m elevation gradient in the Western Swiss Alps. I also investigated more local responses of these variables for a subset of species living in this area at two precise elevation belts. 
During this thesis I showed that high resolution data necessitate very good datasets (species and variables for the models) to produce satisfactory results. Indeed, in mountain areas temperature is the most important factor driving species distributions, and it needs to be modeled at very fine resolution, instead of being interpolated over large surfaces, to produce satisfactory results. Despite the intuitive idea that topography should be very important at high resolution, the results are mixed. Looking at the importance of variables over a large gradient, however, buffers the importance of individual variables. Indeed, topographic factors have been shown to be highly important at the subalpine level, but their importance decreases at lower elevations. Whereas at the montane level edaphic and land use factors are more important, high resolution topographic data are more important at the subalpine level. Finally, the biggest improvement in the models happens when edaphic variables are added. Indeed, adding soil variables is of high importance, and variables like pH surpass the usual topographic variables in SDMs in terms of importance in the models. To conclude, high resolution is very important in modeling but necessitates very good datasets. Only increasing the resolution of the usual topoclimatic predictors is not sufficient, and the use of edaphic predictors has been highlighted as fundamental to produce significantly better models. This is of primary importance, especially if these models are used to reconstruct communities or as a basis for biodiversity assessments. -- In recent years, the use of species distribution models (SDMs) has steadily increased. These models use various statistical tools to reconstruct the realized niche of a species from variables, notably climatic or topographic, and presence data collected in the field. 
Their use covers many fields, ranging from the study of a species' ecology to the reconstruction of communities or the impact of global warming. Most of the time, these models use occurrences from global databases at a rather coarse resolution (1 km or even 50 km). Some databases nevertheless make it possible to work at high resolution, hence to go below the kilometre scale and to work with resolutions of 100 m x 100 m or 25 m x 25 m. Recently, a new generation of very high resolution data has appeared that makes it possible to work at the metre scale. The variables that can be generated from these new data are, however, very costly and require considerable processing time. Indeed, any complex statistical computation, such as projections of species distributions over large areas, demands powerful computers and a lot of time. Moreover, the factors governing species distributions at fine scale are still poorly known, and the importance in the models of high resolution variables such as microtopography or temperature is not certain. Other factors such as competition or natural stochasticity could have an equally strong influence. This is the context of my thesis work. I sought to understand the importance of high resolution in species distribution models, whether for temperature, microtopography or edaphic variables, along a large elevation gradient in the Prealps of Vaud. I also sought to understand the local impact of certain variables potentially neglected because of confounding effects along the elevation gradient. 
During this thesis, I was able to show that high resolution variables, whether related to temperature or to microtopography, yield only a limited improvement of the models. To obtain a substantial improvement, it is necessary to work with larger datasets, both for the species and for the variables used. For example, the usual interpolated climate layers must be replaced by temperature layers modelled at high resolution from field data. Working along a 2000 m temperature gradient naturally makes temperature very important in the models. The importance of microtopography is negligible compared with topography at a resolution of 25 m. However, at a more local scale, high resolution is extremely important in the subalpine belt. At the montane belt, by contrast, variables related to soils and land use are very important. Finally, the species distribution models were particularly improved by the addition of edaphic variables, mainly pH, whose importance supplants or equals the topographic variables when added to the usual species distribution models.
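The headline finding, that adding an edaphic predictor such as pH improves the models, can be illustrated with a toy logistic-regression SDM. The synthetic data and coefficients below are invented for the example; real SDMs typically use GLM, GAM or ensemble tools:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 2000
# Hypothetical standardised predictors: a coarse topoclimatic variable
# (temperature) and a fine-scale edaphic variable (pH)
temp = rng.standard_normal(n)
ph = rng.standard_normal(n)
p_true = 1 / (1 + np.exp(-(1.5 * temp + 1.0 * ph - 0.5)))
pres = (rng.random(n) < p_true).astype(float)   # presence/absence data

def fit_and_loglik(X, y, iters=3000, lr=0.5):
    """Fit logistic regression by gradient ascent; return log-likelihood."""
    X = np.column_stack([np.ones(len(y)), X])
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-X @ w))
        w += lr * X.T @ (y - p) / len(y)
    p = np.clip(1 / (1 + np.exp(-X @ w)), 1e-9, 1 - 1e-9)
    return np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

ll_climate = fit_and_loglik(temp[:, None], pres)             # climate only
ll_full = fit_and_loglik(np.column_stack([temp, ph]), pres)  # climate + pH
```

Here the model that also sees the soil variable fits the presence-absence data markedly better, mirroring the thesis's conclusion about edaphic predictors.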

Relevance:

100.00%

Publisher:

Abstract:

Concern about the state of the environment and the rising price of fossil fuels have accelerated the search for new energy sources. Fuel cells are one of the most promising technologies, especially for distributed energy production, backup power plants and transportation. As a power source, however, a fuel cell is highly non-ideal and places numerous special requirements on the power electronics. Connecting a fuel cell to the grid is usually implemented with a galvanically isolating DC/DC converter and an inverter in series. To prevent fuel cell degradation, the power electronics must control the fuel cell output current accurately. Traditionally, current control has been implemented by regulating the converter input current with a PI (Proportional and Integral) or PID (Proportional, Integral and Derivative) controller. Owing to the non-linearity of the converter, such a solution does not necessarily work far from the linearization point. In addition, conventional controllers are sensitive to modelling errors. This Master's thesis presents a model of the fuel cell boost converter based on the state-space averaging method, and a discrete-time integral sliding mode control based on this model. The proposed control is non-linear in nature and is suitable for controlling non-linear and poorly known systems.
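A discrete-time integral sliding mode controller of the kind described can be sketched on a toy first-order converter model. All parameter values are illustrative; the equivalent-control term assumes known nominal parameters, and a tanh boundary layer replaces the discontinuous sign function to limit chattering:

```python
import numpy as np

# Toy averaged current model, x+ = x + Ts*(b*u - a*x); the values
# below are illustrative, not taken from the thesis.
Ts, a, b = 1e-3, 2.0, 5.0
x, ref = 0.0, 1.0
z = 0.0                          # discrete integral of the tracking error
lam, K, phi = 1.0, 20.0, 0.1     # surface slope, switching gain, boundary layer

for _ in range(4000):
    e = ref - x
    z += Ts * e
    s = e + lam * z              # integral sliding surface
    u_eq = a * x / b             # equivalent control from the nominal model
    u = u_eq + K * np.tanh(s / phi)   # smooth switching term
    x += Ts * (b * u - a * x)
```

The switching term drives the state onto the surface s = 0, and the integral term keeps the tracking error at zero there even under modest model mismatch, which is the property that motivates sliding mode control for poorly known converters.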

Relevance:

100.00%

Publisher:

Abstract:

Thermal and air conditions inside animal facilities change during the day due to the influence of the external environment. For statistical and geostatistical analyses to be representative, a large number of points spatially distributed in the facility area must be monitored. This work suggests that the time variation of environmental variables of interest for animal production, monitored within an animal facility, can be modeled accurately from discrete-time records. The aim of this study was to develop a numerical method to correct for the temporal variation of these environmental variables, transforming the data so that the observations are independent of the time spent during the measurement. The proposed method brings values recorded with time delays closer to those expected at the exact moment of interest, as if the data had been measured simultaneously at that moment at all spatially distributed points. The correction model for numerical environmental variables was validated for the air temperature parameter, and the values corrected by the method did not differ, by Tukey's test at 5% significance, from the real values recorded by data loggers.
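The idea of referring delayed readings back to a single moment can be sketched as follows. The numbers are hypothetical, and in this simple variant the facility-wide drift is taken from a continuously logging reference sensor and removed by linear interpolation (the paper's actual correction model may differ):

```python
import numpy as np

# Walk-through measurement: each of 5 points is read some seconds
# after the previous one while the facility temperature drifts.
t_meas = np.array([0.0, 30.0, 60.0, 90.0, 120.0])   # reading times (s)
temp = np.array([24.0, 24.3, 24.7, 24.9, 25.4])     # deg C as read

# Reference sensor logging the facility-wide drift continuously
t_ref = np.arange(0, 121, 10.0)
drift = 24.0 + 0.012 * t_ref      # deg C trend at the reference point

# Correct every reading back to the moment t0 = 0 by removing the
# drift accumulated between t0 and the reading time
t0 = 0.0
drift_at = np.interp(t_meas, t_ref, drift)
drift_t0 = np.interp(t0, t_ref, drift)
temp_corr = temp - (drift_at - drift_t0)
```

After correction all five values refer to the same instant t0, so spatial (geostatistical) analysis is no longer confounded by the time spent walking between points.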

Relevance:

100.00%

Publisher:

Abstract:

An option is a financial contract that gives its holder a right (but no obligation) to sell or buy something (for example a share) to or from the seller of the option at a certain price at a specified time in the future. The seller of the option commits to going along with this future transaction if the option holder later decides to exercise the option. The seller of the option thus takes on the risk that the future transaction the option holder can force him to make turns out to be unfavourable for him. The question of how the seller can protect himself against this risk leads to interesting optimization problems, where the goal is to find an optimal hedging strategy under certain given conditions. Such optimization problems have been studied extensively in financial mathematics. The thesis "The knapsack problem approach in solving partial hedging problems of options" introduces an additional viewpoint to this discussion: in a relatively simple (finite and complete) market model, certain partial hedging problems can be formulated as so-called knapsack problems. The latter are well known in a branch of mathematics called operations research. The thesis shows how hedging problems previously solved by other means can alternatively be solved with methods developed for knapsack problems. The approach is also applied to entirely new hedging problems involving so-called American options.
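The combinatorial core being borrowed is the classic 0/1 knapsack problem; loosely, the scenarios to be hedged play the role of items, their hedging costs the weights, the capital budget the capacity, and their probability mass the values (this mapping is a simplification of the thesis's construction). A standard dynamic-programming solution:

```python
def knapsack(weights, values, cap):
    """Maximum total value of items fitting within capacity `cap`."""
    best = [0] * (cap + 1)
    for w, v in zip(weights, values):
        # iterate capacities downwards so each item is used at most once
        for c in range(cap, w - 1, -1):
            best[c] = max(best[c], best[c - w] + v)
    return best[cap]
```

For example, with weights [2, 3, 4, 5], values [3, 4, 5, 6] and capacity 5, the optimum combines the first two items. In the hedging reading, that is the probability-maximizing set of scenarios coverable within the capital budget.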

Relevance:

100.00%

Publisher:

Abstract:

Pulse Response Based Control (PRBC) is a recently developed minimum time control method for flexible structures. The flexible behavior of the structure is represented by a set of discrete time sequences, which are the responses of the structure to rectangular force pulses. The rectangular force pulses are applied by the actuators that control the structure. The set of pulse responses, desired outputs, and force bounds forms a numerical optimization problem. The solution of the optimization problem is a minimum time piecewise constant control sequence that drives the system to a desired final state. The method was developed for driving positive semi-definite systems. In case the system is positive definite, some final states of the system may not be reachable. Necessary conditions for reachability of the final states are derived for systems with a finite number of degrees of freedom. Numerical results are presented that confirm the derived analytical conditions. Numerical simulations of maneuvers of distributed parameter systems have shown a relationship between the error in the estimated minimum control time and the sampling interval.
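The structure of the underlying optimization can be sketched as follows: stacking the pulse responses into a convolution matrix turns the reachability question into a linear system, whose minimum-norm solution can then be checked against the force bounds. The pulse response and target below are illustrative, and a plain least-squares solve stands in for the paper's minimum-time optimization:

```python
import numpy as np

n, m = 60, 40
t = np.arange(n)
h = np.exp(-0.05 * t) * np.sin(0.4 * t)   # pulse response of a damped mode

# Convolution matrix: y = T @ u for a piecewise-constant pulse train u
T = np.zeros((n, m))
for j in range(m):
    T[j:, j] = h[: n - j]

# Reach a desired final condition (the last two output samples) with the
# minimum-norm pulse amplitude sequence
A = T[-2:]                       # map from pulse amplitudes to final samples
y_final = np.array([0.0, 1.0])   # illustrative final state
u, *_ = np.linalg.lstsq(A, y_final, rcond=None)

y = T @ u                        # resulting output trajectory
```

In the actual method the horizon m would be shrunk until the bounded problem just remains feasible, which is what makes the resulting control sequence minimum time.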