916 results for Linear programming


Relevance:

60.00%

Publisher:

Abstract:

Traffic safety on rural highways is a constant source of concern in many countries. Transportation professionals now widely use Intelligent Transportation Systems (ITS) to address safety issues; however, compared with metropolitan applications, rural (non-urban) highway ITS applications are still not well defined. This paper provides a comprehensive review of existing ITS safety solutions for rural highways. The research focuses mainly on infrastructure-based control and surveillance ITS technology, such as Crash Prevention and Safety, Road Weather Management and other applications directly related to reducing the frequency and severity of accidents. The main outcome of this research is the development of an ‘ITS control and surveillance device locating model’ to achieve the maximum safety benefit for rural highways. Using ITS cost and benefit databases, an integer linear programming method is employed as the optimization technique to choose the most suitable set of ITS devices. Finally, a computational analysis is performed on an existing highway in Iran to validate the effectiveness of the proposed locating model.
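
As a rough illustration of the kind of optimization involved (not the paper's actual model), the device selection step can be written as a small budget-constrained binary ILP; the benefit and cost figures below are purely hypothetical, and scipy's MILP solver stands in for whatever solver the authors used.

```python
# Hypothetical sketch of a budget-constrained ITS device selection ILP;
# benefit and cost figures are illustrative, not taken from the paper.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

benefit = np.array([12.0, 8.0, 15.0, 5.0])   # expected safety benefit per device type
cost = np.array([4.0, 2.0, 6.0, 1.0])        # installation cost per device type
budget = 8.0

# maximize benefit @ x  <=>  minimize -benefit @ x, with binary x
res = milp(
    c=-benefit,
    constraints=LinearConstraint(cost[np.newaxis, :], -np.inf, budget),
    integrality=np.ones_like(benefit),       # all variables integer (binary via bounds)
    bounds=Bounds(0, 1),
)
chosen = res.x > 0.5
print("chosen device types:", np.flatnonzero(chosen), "total benefit:", benefit[chosen].sum())
```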

Relevance:

60.00%

Publisher:

Abstract:

The purpose of this paper is to describe a new decomposition construction for perfect secret sharing schemes with graph access structures. The previous decomposition construction proposed by Stinson is a recursive method that uses small secret sharing schemes as building blocks in the construction of larger schemes. When the Stinson method is applied to graph access structures, the number of such “small” schemes is typically exponential in the number of participants, resulting in an exponential algorithm. Our method has the same flavor as the Stinson decomposition construction; however, the linear programming problem involved in the construction is formulated so that the number of “small” schemes is polynomial in the number of participants, which in turn gives rise to a polynomial-time construction. We also show that if the Stinson construction is applied to the “small” schemes arising from our new construction, both constructions achieve the same information rate.

Relevance:

60.00%

Publisher:

Abstract:

In the past few years there has been a steady increase in attention to green initiatives for data centers. While various energy-aware measures have been developed for data centers, the accompanying requirement of improving the efficiency of application assignment has yet to be fulfilled; many energy-aware measures applied to data centers, for instance, involve a trade-off between energy consumption and Quality of Service (QoS). To address this problem, this paper presents a novel concept of profiling to facilitate offline optimization of a deterministic assignment of applications to virtual machines. A profile-based model is then established for obtaining near-optimal allocations of applications to virtual machines with respect to three major objectives: energy cost, CPU utilization efficiency and application completion time. From this model, a scalable profile-based matching algorithm is developed to solve it. The assignment efficiency of our algorithm is then compared with that of the Hungarian algorithm, which gives the optimal solution but does not scale well.
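
For context, the Hungarian baseline mentioned above can be reproduced with scipy's linear_sum_assignment on a cost matrix that blends the three objectives; the weights and the random profile data in this sketch are assumptions, not the paper's profile-based model.

```python
# Minimal sketch of assigning applications to VMs with the Hungarian algorithm
# as a baseline; the weighted cost profile below is purely illustrative.
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)
n_apps, n_vms = 5, 5
energy     = rng.uniform(1, 10, (n_apps, n_vms))   # energy cost of app i on VM j
cpu_waste  = rng.uniform(0, 1, (n_apps, n_vms))    # 1 - CPU utilisation efficiency
completion = rng.uniform(1, 20, (n_apps, n_vms))   # completion time of app i on VM j

# Combine the three objectives into a single cost matrix (weights are assumptions).
cost = 0.5 * energy + 0.3 * cpu_waste + 0.2 * completion
apps, vms = linear_sum_assignment(cost)            # optimal but O(n^3): does not scale well
print(dict(zip(apps.tolist(), vms.tolist())), "total cost:", cost[apps, vms].sum())
```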

Relevance:

60.00%

Publisher:

Abstract:

Circular shortest paths are a powerful methodology for image segmentation. The circularity condition ensures that the contour found by the algorithm is closed, a natural requirement for regular objects. Several implementations have been proposed that either achieve closure with high probability or ensure closure strictly, at a mild cost in computational efficiency. Circularity can be viewed as a priori information that helps recover the correct object contour. Our observation is that circularity is only one among many possible constraints that can be imposed on shortest paths to guide them to a desirable solution. In this contribution we illustrate this opportunity with a volume constraint, but the concept is generally applicable. We also describe several adornments to the circular shortest path algorithm that have proved useful in applications. © 2011 IEEE.
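
One common way to enforce closure strictly is to restart a dynamic program from every candidate start column of a polar-unwrapped cost image and require the path to return to its start; the sketch below is a generic illustration of that idea (not the authors' implementation), and the O(width) restarts are exactly the kind of efficiency handicap mentioned above.

```python
# Hedged sketch of a circular shortest path by dynamic programming on a
# polar-unwrapped cost image (rows = angle, cols = radius). Closure is
# enforced by trying every start column and requiring the path to return to it.
import numpy as np

def circular_shortest_path(cost, max_step=1):
    n_rows, n_cols = cost.shape
    best_total, best_path = np.inf, None
    for start in range(n_cols):
        # dp[r, c]: cheapest cost of a path from (0, start) to (r, c)
        dp = np.full((n_rows, n_cols), np.inf)
        back = np.zeros((n_rows, n_cols), dtype=int)
        dp[0, start] = cost[0, start]
        for r in range(1, n_rows):
            for c in range(n_cols):
                lo, hi = max(0, c - max_step), min(n_cols, c + max_step + 1)
                prev = np.argmin(dp[r - 1, lo:hi]) + lo
                dp[r, c] = dp[r - 1, prev] + cost[r, c]
                back[r, c] = prev
        if dp[-1, start] < best_total:            # closure: end where we started
            best_total = dp[-1, start]
            path = [start]
            for r in range(n_rows - 1, 0, -1):
                path.append(back[r, path[-1]])
            best_path = path[::-1]
    return best_total, best_path

total, path = circular_shortest_path(np.random.rand(16, 8))
print(total, path)
```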

Relevance:

60.00%

Publisher:

Abstract:

This paper proposes a recommendation system that supports process participants in making risk-informed decisions, with the goal of reducing the risks that may arise during process execution. Risk reduction means decreasing the likelihood and severity of process faults. Given a business process exposed to risks, e.g. a financial process exposed to a risk of reputation loss, we enact this process, and whenever a process participant needs to provide input to the process, e.g. by selecting the next task to execute or by filling out a form, we suggest the action that minimizes the predicted process risk. Risks are predicted by traversing decision trees generated from the logs of past process executions, which capture process data, involved resources, task durations and other information such as task frequencies. When multiple process instances run concurrently, a second technique uses integer linear programming to compute the optimal assignment of resources to the tasks to be performed, in order to deal with the interplay between the risks of different instances. The recommendation system has been implemented as a set of components on top of the YAWL BPM system, and its effectiveness has been evaluated in a real-life scenario in collaboration with risk analysts of a large insurance company. The results, based on a simulation of the real-life scenario and a comparison with the event data provided by the company, show that concurrently executed process instances complete with significantly fewer faults and with lower fault severities when the recommendations provided by our system are taken into account.
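
A minimal sketch of the resource-to-task assignment step as an ILP is given below, assuming a predicted-risk matrix already produced by the decision trees; the matrix here is random and the formulation is a plain assignment ILP, not the paper's full model.

```python
# Illustrative sketch: minimise the total predicted risk of assigning
# resource r to pending task t across concurrent instances.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

n_res, n_task = 4, 3
risk = np.random.rand(n_res, n_task)          # predicted risk of resource r on task t
c = risk.ravel()                              # decision vars x[r, t], flattened row-wise

A_task = np.zeros((n_task, n_res * n_task))   # each task gets exactly one resource
for t in range(n_task):
    A_task[t, t::n_task] = 1
A_res = np.zeros((n_res, n_res * n_task))     # each resource takes at most one task
for r in range(n_res):
    A_res[r, r * n_task:(r + 1) * n_task] = 1

sol = milp(
    c=c,
    constraints=[LinearConstraint(A_task, 1, 1), LinearConstraint(A_res, 0, 1)],
    integrality=np.ones_like(c),
    bounds=Bounds(0, 1),
)
assignment = sol.x.reshape(n_res, n_task).round().astype(int)
print(assignment)
```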

Relevance:

60.00%

Publisher:

Abstract:

Organisations constantly seek new ways to improve operational efficiency. This study investigates a novel way to identify potential efficiency gains in business operations by observing how they were carried out in the past and then exploring better ways of executing them, taking into account trade-offs between time, cost and resource utilisation. This paper demonstrates how these trade-offs can be incorporated into the assessment of alternative process execution scenarios by making use of a cost environment. A number of optimisation techniques are proposed to explore and assess alternative execution scenarios, with the objective function represented by a cost structure that captures different process dimensions. An experimental evaluation analyses the performance and scalability of the optimisation techniques: integer linear programming (ILP), hill climbing, tabu search, and our previously proposed hybrid genetic algorithm. The findings show that the hybrid genetic algorithm scales well and outperforms the other techniques. Moreover, we argue that ILP is unrealistic in this setting, as it cannot handle complex cost functions such as those we propose. Finally, we show how cost-related insights can be gained from improved execution scenarios and how these can be used to put forward recommendations for reducing process-related cost and overhead within organisations.
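
As a toy illustration of the hill-climbing variant, the sketch below searches over task-to-resource allocations; the task durations, resource rates and weights are assumptions standing in for the paper's cost environment.

```python
# Toy hill climbing over alternative execution scenarios, encoded as a
# task-to-resource allocation; the cost function is an illustrative stand-in
# for a cost structure over time, cost and resource utilisation.
import random

durations = [4, 2, 6, 3, 5]            # task durations (assumed)
rates     = [1.0, 1.5, 0.8]            # hourly cost of each resource (assumed)

def cost(alloc):
    load = [0.0] * len(rates)
    money = 0.0
    for task, res in enumerate(alloc):
        load[res] += durations[task]
        money += durations[task] * rates[res]
    makespan  = max(load)                       # time dimension
    imbalance = max(load) - min(load)           # resource-utilisation dimension
    return 0.5 * makespan + 0.3 * money + 0.2 * imbalance

def neighbours(alloc):
    for task in range(len(alloc)):
        for res in range(len(rates)):
            if res != alloc[task]:
                yield alloc[:task] + (res,) + alloc[task + 1:]

def hill_climb(alloc):
    while True:
        best = min(neighbours(alloc), key=cost)
        if cost(best) >= cost(alloc):
            return alloc                        # local optimum reached
        alloc = best

start = tuple(random.randrange(len(rates)) for _ in durations)
best = hill_climb(start)
print(best, cost(best))
```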

Relevance:

60.00%

Publisher:

Abstract:

This study estimates the environmental efficiency of internationally listed firms in 10 worldwide sectors from 2007 to 2013 by applying an order-m method, a non-parametric approach based on the free disposal hull with subsampling bootstrapping. Using a conventional output (gross profit) and two conventional inputs (labor and capital), the study examines order-m environmental efficiency accounting for the presence of each of 10 undesirable inputs/outputs and measures the shadow price of each undesirable input and output. The results show that there is greater potential for reducing undesirable inputs than undesirable outputs: on average, total energy, electricity or water usage could be reduced by 50%. The median shadow prices of undesirable inputs, however, are much higher than the surveyed representative market prices. Approximately 10% of the firms in the sample appear to be potential sellers or production reducers with respect to undesirable inputs/outputs, which implies that the price of each item at its current level has little impact on most of the firms. Moreover, the study shows that a firm's environmental, social and governance activities do not considerably affect its environmental efficiency.
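
A hedged Monte Carlo sketch of an input-oriented order-m score, in the spirit of the free disposal hull approach described above, is shown below; the data, the choice of m and the treatment of an undesirable input as an ordinary input are all assumptions for illustration.

```python
# Monte Carlo sketch of an input-oriented order-m efficiency score
# (free disposal hull with subsampling); data and m are illustrative.
import numpy as np

def order_m_input_efficiency(x0, y0, X, Y, m=25, B=2000, rng=None):
    """Expected minimal proportional input contraction over m random peers
    that produce at least as much output as y0 (free disposability)."""
    rng = rng or np.random.default_rng(0)
    peers = np.flatnonzero((Y >= y0).all(axis=1))
    draws = rng.choice(peers, size=(B, m), replace=True)
    # For each drawn peer, the input ratio needed to dominate x0 on every input.
    ratios = (X[draws] / x0).max(axis=2)        # shape (B, m)
    return ratios.min(axis=1).mean()            # average of the best peer per draw

# Toy data: columns of X could be labour, capital and an undesirable input (energy).
rng = np.random.default_rng(1)
X = rng.uniform(1, 10, size=(100, 3))
Y = rng.uniform(1, 10, size=(100, 1))
theta = order_m_input_efficiency(X[0], Y[0], X, Y)
print("order-m efficiency of firm 0:", round(theta, 3))   # < 1 => inputs could shrink
```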

Relevance:

60.00%

Publisher:

Abstract:

This paper presents a chance-constrained linear programming formulation for the operation of a multipurpose reservoir. The release policy is defined by a chance constraint requiring that the probability of the irrigation release in any period equalling or exceeding the irrigation demand be at least equal to a specified value P (the reliability level). The model determines the maximum annual hydropower that can be produced while meeting the irrigation demand at the specified reliability level, and it accounts for the variation in reservoir water level and the operating range of the turbine. A linear approximation of the nonlinear power production function is assumed, and the solution is obtained within a specified tolerance. The reservoir inflow is treated as random, and the chance constraint is converted into its deterministic equivalent using a linear decision rule and the inflow probability distribution. The model's application is demonstrated through a case study.
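
The key conversion step can be stated generically as follows (a textbook form; the paper's specific linear decision rule is not reproduced here), with ξ the random inflow term, F its distribution function and h(x) linear in the decision variables:

```latex
% Deterministic equivalent of a one-sided chance constraint (generic textbook
% form; the paper's specific linear decision rule is not reproduced here).
\Pr\!\left[\, h(x) \ge \xi \,\right] \ge P
\;\Longleftrightarrow\;
h(x) \ge F^{-1}(P),
\qquad
\Pr\!\left[\, h(x) \le \xi \,\right] \ge P
\;\Longleftrightarrow\;
h(x) \le F^{-1}(1 - P).
```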

Relevance:

60.00%

Publisher:

Abstract:

This paper presents three methodologies for determining the optimum locations and magnitudes of reactive power compensation in power distribution systems. Methods I and II are suitable for complex distribution systems with a combination of radial and ring-main feeders and different voltage levels, while Method III is suitable for low-tension, single-voltage-level radial feeders. Method I is based on an iterative scheme with successive power-flow analyses, with the optimization problem formulated and solved using linear programming. Methods II and III are essentially based on the steady-state performance of distribution systems; they are simple to implement and yield satisfactory results comparable to those of Method I. The proposed methods have been applied to several distribution systems, and results obtained for two typical systems are presented for illustration.

Relevance:

60.00%

Publisher:

Abstract:

Adaptations of weighted rank regression to the accelerated failure time model for censored survival data have been successful in yielding asymptotically normal estimates and flexible weighting schemes that increase statistical efficiency. However, only for one simple weighting scheme, the Gehan or Wilcoxon weights, are the estimating equations guaranteed to be monotone in the parameter components, and even in this case they are step functions, requiring the equivalent of linear programming for computation. The lack of smoothness makes standard error or covariance matrix estimation even more difficult. An induced smoothing technique has overcome these difficulties in various problems involving monotone but pure-jump estimating equations, including conventional rank regression. The present paper applies induced smoothing to Gehan-Wilcoxon weighted rank regression for the accelerated failure time model, in the more difficult case of survival times subject to censoring, where the inapplicability of permutation arguments necessitates a new method of estimating the null variance of the estimating functions. Smooth monotone parameter estimation and rapid, reliable standard error or covariance matrix estimation are obtained.
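
For reference, the Gehan-weighted estimating function and its usual induced-smoothing counterpart take the following standard form (notation assumed, with covariates Z_i, censoring indicators δ_i and residuals e_i(β) = log T_i − β'Z_i); Φ is the standard normal distribution function and r_ij an O(n^{-1/2}) smoothing bandwidth.

```latex
% Gehan-weighted estimating function and its induced-smoothing counterpart
% (standard form, not reproduced from the paper).
U_G(\beta) = \frac{1}{n^2}\sum_{i=1}^{n}\sum_{j=1}^{n}
  \delta_i\,(Z_i - Z_j)\, I\!\left\{ e_i(\beta) \le e_j(\beta) \right\},
\qquad
\tilde U_G(\beta) = \frac{1}{n^2}\sum_{i=1}^{n}\sum_{j=1}^{n}
  \delta_i\,(Z_i - Z_j)\,
  \Phi\!\left( \frac{e_j(\beta) - e_i(\beta)}{r_{ij}} \right).
```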

Relevance:

60.00%

Publisher:

Abstract:

Two algorithms that improve upon the sequent-peak procedure for reservoir capacity calculation are presented. The first incorporates storage-dependent losses (such as evaporation losses) exactly as the standard linear programming formulation does. The second extends the first to enable designing for less than maximum reliability, even when the allowable shortfall in any failure year is also specified. Together, the algorithms provide a more accurate, flexible and yet fast method of calculating the storage capacity requirement in preliminary screening and optimization models.
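
For comparison, the baseline sequent-peak recursion that the two algorithms improve upon is sketched below; the paper's extensions (storage-dependent losses and designs for less than maximum reliability) are deliberately not included, and the monthly figures are arbitrary.

```python
# Baseline sequent-peak calculation for reference only.
def sequent_peak(inflows, demands, cycles=2):
    """Required active storage so that demands are always met (no losses)."""
    k, k_max = 0.0, 0.0
    for _ in range(cycles):                 # repeat the record to handle critical
        for i, d in zip(inflows, demands):  # periods that wrap around the year
            k = max(0.0, k + d - i)         # accumulated deficit after this period
            k_max = max(k_max, k)
    return k_max

# Toy monthly data (arbitrary units).
inflows = [5, 7, 9, 12, 10, 6, 3, 2, 2, 3, 4, 5]
demands = [5] * 12
print("required capacity:", sequent_peak(inflows, demands))
```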

Relevance:

60.00%

Publisher:

Abstract:

This article contributes an original integrated model of an open-pit coal mine for supporting energy-efficient decisions. Mixed integer linear programming is used to formulate a general integrated model of the operational energy consumption of four common open-pit coal mining subsystems: excavation and haulage, stockpiles, processing plants and belt conveyors. Mines are represented as connected instances of the four subsystems, in a flow-sheet manner, which are then fitted to data provided by the mine operators. Solving the integrated model ensures that the subsystems' operations are synchronised and encourages whole-of-mine energy efficiency. An investigation of a case study of an open-pit coal mine is conducted to validate the proposed methodology. Opportunities for using the model to aid energy-efficient decision-making at various levels of a mine are presented, and future work to improve the approach is described.

Relevance:

60.00%

Publisher:

Abstract:

The bearing capacity factor N_c for axially loaded piles in clays whose cohesion increases linearly with depth has been estimated numerically under undrained (φ = 0) conditions. The study follows the lower bound limit analysis in conjunction with finite elements and linear programming, and a new formulation is proposed for solving an axisymmetric geotechnical stability problem. The variation of N_c with embedment ratio is obtained for several rates of increase of soil cohesion with depth; a special case is also examined in which the pile base rests on a stiff clay stratum overlain by a soft clay layer. The magnitude of N_c is found to reach an almost constant value for embedment ratios greater than unity, and the roughness of the pile base and shaft affects the magnitude of N_c only marginally. The results of the present study compare quite well with the various numerical solutions reported in the literature.

Relevance:

60.00%

Publisher:

Abstract:

The vertical uplift resistance of two interfering rigid rough strip anchors embedded horizontally in sand at shallow depths has been examined. The analysis is performed using the upper bound theorem of limit analysis in combination with finite elements and linear programming. It is specified that both anchors are loaded to failure simultaneously at the same magnitude of failure load. For different clear spacings (S) between the anchors, the magnitude of the efficiency factor ξ_γ is determined. On account of interference, the magnitude of ξ_γ is found to reduce continuously with decreasing spacing between the anchors. The results of the numerical analysis compare reasonably well with the available theoretical data in the literature.
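
The efficiency factor is conventionally defined as the ratio of the ultimate uplift load of one of the interfering anchors to that of an isolated anchor of the same size and depth (the usual convention, assumed here rather than quoted from the paper):

```latex
% Usual definition of the efficiency factor (assumed convention);
% P_u denotes the ultimate uplift load per anchor.
\xi_{\gamma} \;=\; \frac{P_{u,\ \text{interfering}}}{P_{u,\ \text{isolated}}},
\qquad 0 < \xi_{\gamma} \le 1,
\qquad \xi_{\gamma} \to 1 \ \text{as}\ S \to \infty .
```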

Relevance:

60.00%

Publisher:

Abstract:

The simultaneous state and parameter estimation problem for a linear discrete-time system with unknown noise statistics is treated as a large-scale optimization problem. The a posteriori probability density function is maximized directly with respect to the states and parameters, subject to the constraint of the system dynamics. The resulting optimization problem is too large for any of the standard non-linear programming techniques, and hence a hierarchical optimization approach is proposed. It turns out that the states can be computed at the first level for given noise and system parameters, which are in turn modified at the second level. The states are computed from a large system of linear equations, and two solution methods are considered for solving these equations, limiting the horizon to a suitable length. The resulting algorithm is a filter-smoother suitable for off-line as well as on-line state estimation for given noise and system parameters. The second-level problem is split into two parts: one for modifying the noise statistics and the other for modifying the system parameters. An adaptive relaxation technique is proposed for modifying the noise statistics, and a modified Gauss-Newton technique is used to adjust the system parameters.
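
Written out for a linear Gaussian system (notation assumed, not the paper's), the objective maximized across the two levels is the joint posterior of the state trajectory, the system parameters θ and the noise covariances Q and R:

```latex
% Joint MAP objective for a linear Gaussian system (notation assumed):
% states x_{0:N}, parameters \theta, noise covariances Q and R.
\max_{x_{0:N},\,\theta,\,Q,\,R}\;
p\!\left(x_{0:N},\theta,Q,R \mid y_{1:N}\right)
\;\propto\;
p(x_0)\prod_{k=1}^{N} p\!\left(x_k \mid x_{k-1},\theta,Q\right)
\prod_{k=1}^{N} p\!\left(y_k \mid x_k,\theta,R\right),
\qquad
x_k = A(\theta)\,x_{k-1} + w_k,\quad y_k = C(\theta)\,x_k + v_k .
```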