1000 results for Consumption optimality


Relevance:

20.00%

Publisher:

Abstract:

In a system with energy harvesting (EH) nodes, the design focus shifts from minimizing energy consumption by transmitting less information less often to making the best use of the available energy to deliver data efficiently while adhering to the fundamental energy neutrality constraint. We address the problem of maximizing the throughput of a system consisting of rate-adaptive EH nodes that transmit to a destination. Unlike the related literature, we focus on the practically important discrete-rate adaptation model. First, for a single EH node, we propose a discrete-rate adaptation rule and prove its optimality for a general class of stationary and ergodic EH and fading processes. We then study a general system with multiple EH nodes in which one node is opportunistically selected to transmit. We first derive a novel, throughput-optimal joint selection and rate adaptation rule (TOJSRA) when the nodes are subject to a weaker average power constraint. We then propose a novel rule, based on TOJSRA, for the multi-EH-node system, and prove its optimality for stationary and ergodic EH and fading processes. We also model the various energy overheads of the EH nodes and characterize their effect on the adaptation policy and the system throughput.
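
As a rough illustration of the kind of rule this abstract refers to, the sketch below implements a greedy discrete-rate heuristic for a single EH node: in each slot it picks the largest rate in a finite rate set whose transmit-energy cost the current battery level can cover, so energy neutrality is respected by construction. The rate set, energy model, and arrival statistics are placeholder assumptions, not the optimal rule proved in the paper.

```python
# Illustrative sketch only: a greedy discrete-rate adaptation heuristic for a
# single energy-harvesting node. The rate set, energy model, and statistics
# below are hypothetical placeholders, not the optimal rule derived in the paper.
import numpy as np

RATES_BPS_HZ = np.array([0.0, 0.5, 1.0, 2.0, 4.0])   # discrete rate set (assumed)

def tx_power_for_rate(rate, channel_gain, noise_power=1.0):
    """Power needed to support `rate` over an AWGN-style link: r = log2(1 + g*P/N)."""
    if rate == 0.0:
        return 0.0
    return noise_power * (2.0 ** rate - 1.0) / channel_gain

def pick_rate(battery_energy, channel_gain, slot_duration=1.0):
    """Pick the largest discrete rate whose energy cost fits the current battery."""
    best = 0.0
    for r in RATES_BPS_HZ:
        if tx_power_for_rate(r, channel_gain) * slot_duration <= battery_energy:
            best = r
    return best

# Toy simulation: Bernoulli energy arrivals, Rayleigh-fading channel power gains.
rng = np.random.default_rng(0)
battery, throughput = 0.0, 0.0
for _ in range(10_000):
    battery += rng.choice([0.0, 2.0])        # harvested energy this slot (assumed model)
    gain = rng.exponential(1.0)              # fading power gain
    r = pick_rate(battery, gain)
    battery -= tx_power_for_rate(r, gain)    # energy neutrality: never spend more than stored
    throughput += r
print("average throughput (bits/s/Hz):", throughput / 10_000)
```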

Relevance:

20.00%

Publisher:

Abstract:

This article considers a semi-infinite mathematical programming problem with equilibrium constraints (SIMPEC), defined as a semi-infinite mathematical programming problem with complementarity constraints. We establish necessary and sufficient optimality conditions for SIMPEC. We also formulate Wolfe- and Mond-Weir-type dual models for SIMPEC and establish weak, strong and strict converse duality theorems for SIMPEC and the corresponding dual problems under invexity assumptions.
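
For orientation, a SIMPEC of the kind described can be written schematically as below; the notation (f, g, G, H, index set T) is generic and not necessarily that of the article.

```latex
% Schematic semi-infinite program with complementarity constraints (generic notation):
\begin{align*}
\min_{x \in \mathbb{R}^{n}} \quad & f(x) \\
\text{s.t.} \quad & g(x, t) \le 0 \quad \text{for all } t \in T \ (\text{an infinite index set}), \\
& G(x) \ge 0, \quad H(x) \ge 0, \quad G(x)^{\top} H(x) = 0 .
\end{align*}
```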

Relevance:

20.00%

Publisher:

Abstract:

We consider a server serving a time-slotted queued system of multiple packet-based flows, where at most one flow can be serviced in a single time slot. The flows have exogenous packet arrivals and time-varying service rates. At each time, the server can observe instantaneous service rates for only a subset of flows (selected from a fixed collection of observable subsets) before scheduling a flow in that subset for service. We are interested in queue-length-aware scheduling to keep the queues short. The limited availability of instantaneous service rate information requires the scheduler to make a careful choice of which subset of service rates to sample. We develop scheduling algorithms that use only partial service rate information from subsets of channels, and that minimize the likelihood of queue overflow in the system. Specifically, we present a new joint subset-sampling and scheduling algorithm called Max-Exp that uses only the current queue lengths to pick a subset of flows, and subsequently schedules a flow using the Exponential rule. When the collection of observable subsets is disjoint, we show that Max-Exp achieves the best exponential decay rate of the tail of the longest queue among all scheduling algorithms that base their decision on the current (or any finite past history of) system state. To accomplish this, we employ novel analytical techniques for studying the performance of scheduling algorithms using partial state, which may be of independent interest. These include new sample-path large deviations results for processes obtained by non-random, predictable sampling of sequences of independent and identically distributed random variables. A consequence of these results is that scheduling with partial state information yields a rate function significantly different from scheduling with full channel information. In the special case when the observable subsets are singleton flows, i.e., when there is effectively no a priori channel state information, Max-Exp reduces to simply serving the flow with the longest queue; thus, our results show that always serving the longest queue in the absence of any channel state information is large-deviations optimal.
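
A hedged sketch of the two-stage decision the abstract describes is given below: sample one observable subset using only queue lengths, then apply the Exponential rule within it. The subset-selection criterion used here (the subset containing the longest queue) and the Exponential-rule parameters are illustrative placeholders, not necessarily the exact Max-Exp rule of the paper.

```python
# Illustrative sketch of queue-aware subset sampling plus the Exponential rule.
# The subset-selection step (pick the observable subset containing the longest
# queue) is a simplified placeholder, not necessarily the paper's Max-Exp criterion.
import numpy as np

def exp_rule(queues, rates, beta=1.0, eta=0.5):
    """Exponential scheduling rule: favour long queues, scaled by instantaneous rate."""
    qbar = np.mean(queues)
    weights = rates * np.exp(queues / (beta + qbar ** eta))
    return int(np.argmax(weights))

def schedule(queues, observable_subsets, sample_rates):
    """queues: per-flow backlog; observable_subsets: list of flow-index tuples;
    sample_rates(subset): reveals instantaneous rates for that subset only."""
    longest = int(np.argmax(queues))                      # uses only current queue lengths
    subset = next(s for s in observable_subsets if longest in s)
    rates = sample_rates(subset)                          # partial channel-state information
    winner = subset[exp_rule(queues[list(subset)], rates)]
    return winner

# Toy usage with three flows observable through two disjoint subsets.
rng = np.random.default_rng(1)
queues = np.array([5.0, 2.0, 9.0])
subsets = [(0, 1), (2,)]
served = schedule(queues, subsets, lambda s: rng.uniform(0.5, 2.0, size=len(s)))
print("flow scheduled for service:", served)
```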

Relevance:

20.00%

Publisher:

Abstract:

The concentration of nitrogen oxides (NOx) in the exhaust of engines fuelled with biodiesel is higher than in conventional diesel engine exhaust. In this paper, an attempt has been made to treat this exhaust using a combination of high-frequency AC (HFAC) plasma and an industrial waste, red mud, which shows a proclivity towards nitrogen dioxide (NO2) adsorption. The high-frequency AC source, in combination with the proposed compact double-dielectric plasma reactors, is relatively more efficient in converting nitric oxide (NO) to NO2. It has been shown that the plasma-treated gas enhances the activity of red mud as an adsorbent/catalyst, and about 60-72% NOx removal efficiency was observed at a specific energy of 250 J/L. The advantages of this method are its cost-effectiveness and the abundant industrial availability of waste red mud. Further, power estimation studies were carried out using Manley's equation for the two reactors employed in the experiment, and close agreement between the experimental and predicted powers was observed.
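
As a quick sanity check on the reported figures of merit, the snippet below computes the two standard quantities quoted: specific energy density (discharge power over gas flow rate) and NOx removal efficiency. The numerical inputs are placeholders, not measurements from the paper.

```python
# Back-of-the-envelope check of the two reported figures of merit.
# The definitions are the standard ones for plasma gas treatment; the numbers
# plugged in below are placeholders, not measurements from the paper.

def specific_energy_density(power_w, flow_lpm):
    """Specific energy density in J/L: discharge power divided by gas flow rate."""
    return power_w / (flow_lpm / 60.0)   # convert L/min to L/s

def nox_removal_efficiency(nox_in_ppm, nox_out_ppm):
    """Fractional NOx removal, expressed in percent."""
    return 100.0 * (nox_in_ppm - nox_out_ppm) / nox_in_ppm

# Example: ~4.2 W into a 1 L/min flow gives ~250 J/L, the energy level quoted.
print(specific_energy_density(power_w=4.17, flow_lpm=1.0))       # ~250 J/L
print(nox_removal_efficiency(nox_in_ppm=500, nox_out_ppm=170))   # 66 %, within 60-72 %
```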

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we first derive a necessary and sufficient condition for a stationary strategy to be a Nash equilibrium of a discounted constrained stochastic game under certain assumptions. In the process, we also develop a nonlinear (non-convex) optimization problem for the discounted constrained stochastic game. We use the linear best-response functions of every player and the complementary slackness theorem for linear programs to derive both the optimization problem and the equivalent condition. We then extend this result to average-reward constrained stochastic games. Finally, we present a heuristic algorithm motivated by our necessary and sufficient conditions for a discounted cost constrained stochastic game. We numerically observe the convergence of this algorithm to a Nash equilibrium.
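
Schematically, the discounted constrained stochastic game referred to can be written as below; the notation (discount factor β, rewards r_i, costs c_{i,k}, bounds κ_{i,k}) is generic rather than the paper's.

```latex
% Generic discounted constrained stochastic game (illustrative notation).
% Given the other players' stationary strategies $\pi_{-i}$, player $i$ solves
\begin{align*}
\max_{\pi_i} \quad & \mathbb{E}^{\pi_i, \pi_{-i}} \Big[ \sum_{t=0}^{\infty} \beta^{t}\, r_i(s_t, a_t) \Big] \\
\text{s.t.} \quad & \mathbb{E}^{\pi_i, \pi_{-i}} \Big[ \sum_{t=0}^{\infty} \beta^{t}\, c_{i,k}(s_t, a_t) \Big] \le \kappa_{i,k},
\qquad k = 1, \dots, K_i ,
\end{align*}
% and a stationary strategy profile is a Nash equilibrium when each $\pi_i$ is a
% best response to $\pi_{-i}$ within this constrained set.
```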

Relevance:

20.00%

Publisher:

Abstract:

In this paper, soft lunar landing with minimum fuel expenditure is formulated as a nonlinear optimal guidance problem. Pinpoint soft landing with terminal velocity and position constraints is achieved using Model Predictive Static Programming (MPSP). High accuracy of the terminal conditions is ensured because the MPSP formulation inherently poses the final conditions as a set of hard constraints. Its computational efficiency and fast convergence make MPSP well suited as a fixed-final-time onboard optimal guidance algorithm. It has also been observed that the minimum fuel requirement depends strongly on the choice of the final time (a critical point that is not given due importance in much of the literature). Hence, to select the final time optimally, a neural network is used to learn the mapping between various initial conditions in the domain of interest and the corresponding optimal flight time. To generate the training data set, the optimal final time is computed offline using a gradient-based optimization technique. The effectiveness of the proposed method is demonstrated with rigorous simulation results.
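
The final-time selection step could look roughly like the sketch below: fit a small regressor offline on (initial condition, optimal flight time) pairs, then query it onboard before running MPSP. The feature set, network size, and the synthetic targets are assumptions standing in for the offline gradient-based solver.

```python
# Illustrative sketch of the final-time selection step: a small neural-network
# regressor is fit offline to map initial conditions to the fuel-optimal flight
# time, then queried onboard before running MPSP. The features, network size,
# and synthetic targets below are assumptions, not the paper's setup.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Placeholder training set: initial (altitude, downrange, vertical velocity) samples,
# with a synthetic "optimal final time" standing in for the offline gradient-based solver.
X = rng.uniform([1000.0, -500.0, -60.0], [3000.0, 500.0, -20.0], size=(500, 3))
t_f_optimal = 0.02 * X[:, 0] - 0.5 * X[:, 2] + rng.normal(0.0, 1.0, 500)  # stand-in labels

model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
model.fit(X, t_f_optimal)

# Onboard use: predict the final time for the current initial condition,
# then hand (t_f, terminal constraints) to the MPSP guidance loop.
x0 = np.array([[2000.0, 100.0, -45.0]])
print("predicted optimal final time [s]:", float(model.predict(x0)[0]))
```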

Relevance:

20.00%

Publisher:

Abstract:

In this paper, a micro gas sensor was fabricated using indium oxide nanowires for an effective gas detection and monitoring system. The indium oxide nanowires were grown by thermal CVD, and their structural properties were examined by SEM, XRD and TEM. The electrical properties of the micro-dropped indium oxide nanowire device were measured, and its gas response characteristics were examined for CO gas. The sensors showed high sensitivity and stability for CO gas, and 5 ppm CO could be detected with a power consumption below 20 mW.

Relevance:

20.00%

Publisher:

Abstract:

We evaluate the management of the Northern Stock of Hake during 1986-2001. A stochastic bioeconomic model is calibrated to match the main features of this fishing ground. We show what catches, biomass and profits would have been if the optimal Common Fisheries Policy (CFP), consistent with the target biomass implied by Fischler's Recovery Plan, had been implemented. The main findings are: i) an optimal CFP would have generated profits of more than 667 million euros; and ii) if side-payments are allowed (implemented via ITQs, for example), these profits increase by 26%.
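
For readers unfamiliar with the modelling approach, the sketch below shows a minimal stochastic surplus-production (Schaefer-type) bioeconomic model of the general kind used for such policy evaluations; all parameter values and the two effort policies are placeholders, not the calibrated Northern Hake model.

```python
# Minimal sketch of a stochastic surplus-production (Schaefer-type) bioeconomic
# model of the kind used to evaluate harvest policies. All parameter values are
# placeholders, not the calibration for the Northern Stock of Hake.
import numpy as np

r, K = 0.4, 300.0            # intrinsic growth rate, carrying capacity (kt)
price, cost = 1.5, 0.3       # price per unit catch, cost per unit effort (M EUR)
q = 0.01                     # catchability coefficient
rng = np.random.default_rng(42)

def simulate(effort_policy, b0=100.0, years=16):
    """Roll the biomass forward under a given effort policy; return final biomass and profits."""
    biomass, profits = b0, 0.0
    for _ in range(years):
        effort = effort_policy(biomass)
        catch = q * effort * biomass
        shock = rng.lognormal(mean=0.0, sigma=0.1)   # environmental stochasticity
        biomass = max(biomass + r * biomass * (1 - biomass / K) * shock - catch, 1.0)
        profits += price * catch - cost * effort
    return biomass, profits

# Compare a constant-effort status quo with a simple recovery-style rule that
# cuts effort whenever biomass falls below a target level.
final_b, pi = simulate(lambda b: 50.0)
final_b2, pi2 = simulate(lambda b: 20.0 if b < 140.0 else 50.0)
print("status quo:    biomass %.0f, profits %.1f" % (final_b, pi))
print("recovery rule: biomass %.0f, profits %.1f" % (final_b2, pi2))
```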

Relevance:

20.00%

Publisher:

Abstract:

In this paper we study the effect of population age distribution upon private consumption expenditure in Spain from 1964 to 1997 using aggregate data. We obtain four main results. First, changes in the population pyramid have substantial effects upon the behaviour of private consumption. Second, the pattern of the coefficients of the demographic variables is not consistent with the simplest version of the life cycle hypothesis. Third, we estimate the impact of the demographic transition upon consumption and find positive values associated with episodes in which the shares of groups of individuals with expenditure levels higher (lower) than the mean increased (decreased). Fourth, the results are robust to alternative specifications for the population age distribution.
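
The type of aggregate regression involved can be sketched as below: private consumption regressed on income together with population age-group shares. The data are synthetic and the specification schematic, not the paper's estimated model.

```python
# Sketch of the kind of aggregate regression involved: private consumption
# regressed on income and population age-group shares. The data below are
# synthetic and the specification is schematic, not the paper's estimated model.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)
T = 34                                                      # annual observations, e.g. 1964-1997
income = np.cumsum(rng.normal(0.03, 0.02, T)) + 10.0        # log disposable income
share_young = 0.30 - 0.003 * np.arange(T)                   # population share aged 0-19
share_old = 0.10 + 0.002 * np.arange(T)                     # population share aged 65+
log_c = 0.9 * income - 0.5 * share_young + 0.8 * share_old + rng.normal(0, 0.01, T)

X = sm.add_constant(pd.DataFrame({
    "log_income": income,
    "share_young": share_young,
    "share_old": share_old,
}))
print(sm.OLS(log_c, X).fit().summary())
```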

Relevance:

20.00%

Publisher:

Abstract:

We extend the classic Merton (1969, 1971) problem that investigates the joint consumption-savings and portfolio-selection problem under capital risk by assuming sophisticated but time-inconsistent agents. We introduce stochastic hyperbolic preferences as in Harris and Laibson (2013) and find closed-form solutions for Merton's optimal consumption and portfolio selection problem in continuous time. We find that the portfolio rule remains identical to the time-consistent solution with power utility and no borrowing constraints. However, the marginal propensity to consume out of wealth is unambiguously greater than in the time-consistent, exponential case and, importantly, it is also more responsive to changes in risk. These results suggest that hyperbolic discounting with sophisticated agents offers promise for explaining important aspects of asset market data.
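
As a reference point for the comparison the abstract draws, the classical time-consistent Merton solution with CRRA utility is reproduced below; the abstract's result is that the portfolio share is unchanged under stochastic hyperbolic discounting, while the equilibrium marginal propensity to consume exceeds the exponential benchmark m.

```latex
% Reference point: the classical Merton solution with CRRA utility
% $u(c)=c^{1-\gamma}/(1-\gamma)$, risk-free rate $r$, risky drift $\mu$,
% volatility $\sigma$, and exponential discount rate $\rho$. The risky
% portfolio share and the marginal propensity to consume out of wealth are
\begin{align*}
\pi^{*} &= \frac{\mu - r}{\gamma \sigma^{2}}, &
C_t &= m\, W_t, \quad
m = \frac{1}{\gamma}\left[\rho - (1-\gamma)\left(r + \frac{(\mu - r)^{2}}{2\gamma\sigma^{2}}\right)\right].
\end{align*}
```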