940 results for: Complex network. Optimal path. Optimal path cracks
Abstract:
Emerging vehicular comfort applications pose a completely new set of requirements, such as maintaining end-to-end connectivity, packet routing, and reliable communication for internet access while on the move. One of the biggest challenges is to provide good quality of service (QoS), such as low packet delay, while coping with fast topological changes. In this paper, we propose a clustering algorithm based on the minimal path loss ratio (MPLR), which improves spectrum efficiency and reduces data congestion in the network. The vehicular nodes which experience minimal path loss are selected as the cluster heads. The performance of the MPLR clustering algorithm is evaluated by the rate of change of cluster heads, the average number of clusters, and the average cluster size. Vehicular traffic models derived from the Traffic Wales data are fed as input to the motorway simulator. A mathematical analysis of the rate of change of cluster heads is derived which validates the MPLR algorithm and is compared with the simulated results. The mathematical and simulated results are in good agreement, indicating the stability of the algorithm and the accuracy of the simulator. The MPLR system is also compared with a V2R system, with the MPLR system performing better. © 2013 IEEE.
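The cluster-head selection rule described above can be sketched in a few lines. This is a minimal illustration, not the paper's simulator: the free-space channel model, the 5.9 GHz carrier frequency, and the vehicle positions are all assumptions introduced here for the example.

```python
import math

def path_loss_db(d_m, f_mhz=5900.0):
    """Free-space path loss in dB (illustrative channel model)."""
    return 20 * math.log10(d_m) + 20 * math.log10(f_mhz) - 27.55

def elect_cluster_head(cluster_positions):
    """Pick the vehicle whose total path loss to all other cluster
    members is minimal, following the MPLR idea."""
    best, best_loss = None, float("inf")
    for i, pi in enumerate(cluster_positions):
        total = sum(path_loss_db(max(abs(pi - pj), 1.0))
                    for j, pj in enumerate(cluster_positions) if j != i)
        if total < best_loss:
            best, best_loss = i, total
    return best

# Vehicles at positions (metres) along a motorway: the central vehicle,
# closest on average to the others, is elected head.
print(elect_cluster_head([0.0, 50.0, 100.0]))  # → 1
```

Because path loss grows with distance, the rule tends to elect geographically central vehicles, which is what keeps the resulting clusters stable under motorway mobility.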
Abstract:
Fifty-seven short fatigue cracks in the Ni-base superalloy AP1 have been examined to ascertain how the paths taken by growing fatigue cracks are determined. The observations were made on the surface of a smooth specimen and on the exposed fracture surfaces. Three-dimensional reconstructions of the vulnerable microstructures in the vicinity of the cracks were produced. Initiation occurred in mode II, with the lines of intersection of the initiation sites with the specimen top surface orientated at approximately 45° to the tensile axis. These initiation sites developed in slip bands which crossed a large grain and at least one other grain via a grain boundary with a low angle of misorientation. 'River markings' on one of the initiation facets indicated that the crack first opened from the top centre of the initiation grain. Subsequent to initiation, the growth paths of these cracks are related to the misorientations of the grains and the progress of the crack front.
Abstract:
Renewable energy forms have been widely used in the past decades, highlighting a "green" shift in energy production. A major driver of this turn to renewable energy production is the EU directives which set the Union's targets for energy production from renewable sources, greenhouse gas emissions, and increases in energy efficiency. All member countries are obligated to apply harmonized legislation and practices and to restructure their energy production networks in order to meet the EU targets. Towards the fulfillment of the 20-20-20 EU targets, Greece promotes a specific strategy based on the construction of large-scale Renewable Energy Source plants. In this paper, we present an optimal design of the Greek renewable energy production network applying a 0-1 Weighted Goal Programming model, considering social, environmental, and economic criteria. In the absence of a panel of experts, a Data Envelopment Analysis (DEA) approach is used to filter the best out of the possible network structures, seeking the maximum technical efficiency. A Super-Efficiency DEA model is also used to narrow the solutions and find the best of all possible ones. The results show that in order to achieve maximum efficiency, the social and environmental criteria must be weighted more heavily than the economic ones.
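A 0-1 weighted goal programming model of the kind used above can be sketched by brute force for a handful of candidate plants. Everything concrete here is an assumption for illustration: the three candidate plants, the budget/CO2/jobs goals, and the equal weights are invented numbers, not data from the study.

```python
from itertools import product

def weighted_goal_program(costs, co2, jobs, budget_goal, co2_goal, jobs_goal, w):
    """Brute-force 0-1 weighted goal programming: choose a subset of
    candidate plants (x_i in {0,1}) minimising the weighted sum of
    unwanted goal deviations."""
    best_x, best_dev = None, float("inf")
    for x in product([0, 1], repeat=len(costs)):
        cost = sum(c * xi for c, xi in zip(costs, x))
        saved = sum(c * xi for c, xi in zip(co2, x))
        jb = sum(c * xi for c, xi in zip(jobs, x))
        dev = (w[0] * max(0, cost - budget_goal)    # over-budget penalty
               + w[1] * max(0, co2_goal - saved)    # CO2-savings shortfall
               + w[2] * max(0, jobs_goal - jb))     # jobs shortfall
        if dev < best_dev:
            best_x, best_dev = x, dev
    return best_x, best_dev

# Three hypothetical plants; plants 2 and 3 together hit every goal exactly.
x, dev = weighted_goal_program([4, 3, 5], [3, 2, 4], [2, 1, 3], 8, 6, 4, (1, 1, 1))
print(x, dev)  # → (0, 1, 1) 0
```

Real instances replace the enumeration with an integer programming solver, but the objective, a weighted sum of one-sided goal deviations, is the same.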
Abstract:
The task of constructing smooth and stable decision rules in logical recognition models is considered. Logical regularities of classes are defined as conjunctions of one-place predicates that determine the membership of feature values in intervals of the real axis. The conjunctions are true on special non-extendable subsets of the reference objects of some class and are optimal. The standard approach to constructing linear decision rules for given sets of logical regularities consists in the realization of voting schemes. The weighting coefficients of the voting procedures are either set heuristically or obtained as solutions of a complex optimization task. Modifications of the linear decision rules are proposed that are based on the search for maximal estimates of the standard objects for their classes and use approximations of the logical regularities by smooth sigmoid functions.
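The sigmoid approximation mentioned above can be illustrated concretely: each interval predicate a_j <= x_j <= b_j in the conjunction is an indicator function, and replacing each indicator by a product of sigmoids makes the whole regularity differentiable. The steepness parameter k and the example intervals are assumptions for the sketch.

```python
import math

def sigmoid(x, k=10.0):
    """Smooth step: approaches the indicator [x >= 0] as k grows."""
    return 1.0 / (1.0 + math.exp(-k * x))

def smooth_regularity(x, intervals, k=10.0):
    """Smooth approximation of a logical regularity, i.e. a conjunction
    of interval predicates a_j <= x_j <= b_j: each indicator is replaced
    by sigmoid(x_j - a_j) * sigmoid(b_j - x_j), and the conjunction by
    a product."""
    val = 1.0
    for xj, (a, b) in zip(x, intervals):
        val *= sigmoid(xj - a, k) * sigmoid(b - xj, k)
    return val

box = [(0.0, 1.0), (0.0, 1.0)]
print(smooth_regularity([0.5, 0.5], box))  # near 1: point inside the box
print(smooth_regularity([2.0, 2.0], box))  # near 0: point outside the box
```

Because the smoothed regularities are differentiable in both the point and the interval bounds, the voting weights and the rules themselves can be tuned by gradient methods instead of purely combinatorial search.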
Abstract:
This paper considers the global synchronisation of a stochastic version of coupled map lattice networks through an innovative stochastic adaptive linear quadratic pinning control methodology. In a stochastic network, each state receives only noisy measurements of its neighbours' states. For such networks we derive a generalised Riccati solution that quantifies and incorporates the uncertainty of the forward dynamics and inverse controller in the derivation of the stochastic optimal control law. The generalised Riccati solution is derived using the Lyapunov approach. A probabilistic approximation-type algorithm is employed to estimate the conditional distributions of the state and inverse controller from historical data and to quantify model uncertainties. The theoretical derivation is complemented by its validation on a set of representative examples.
Abstract:
Many practical routing algorithms are heuristic, ad hoc, and centralized, rendering generic and optimal path configurations difficult to obtain. Here we study a scenario whereby selected nodes in a given network communicate with fixed routers and employ statistical physics methods to obtain optimal routing solutions subject to a generic cost. A distributive message-passing algorithm capable of optimizing the path configuration in real instances is devised, based on the analytical derivation, and is greatly simplified by expanding the cost function around the optimized flow. Good algorithmic convergence is observed in most of the parameter regimes. By applying the algorithm, we study and compare the pros and cons of balanced traffic configurations against consolidated traffic, which has important implications for practical communication and transportation networks. Interesting macroscopic phenomena are observed in the optimized states as an interplay between the communication density and the cost functions used. © 2013 IEEE.
Abstract:
In this paper shortest path games are considered. The transportation of a good in a network has both costs and benefits. The problem is to divide the profit of the transportation among the players. Fragnelli et al (2000) introduce the class of shortest path games, which coincides with the class of monotone games. They also give a characterization of the Shapley value on this class of games. In this paper we consider four further characterizations of the Shapley value (Shapley (1953)'s, Young (1985)'s, Chun (1989)'s, and van den Brink (2001)'s axiomatizations), and conclude that all the mentioned axiomatizations are valid for shortest path games. Fragnelli et al (2000)'s axioms are based on the graph behind the problem; in this paper we do not consider graph-specific axioms, we take TU axioms only. That is, we consider all shortest path problems and take the view of an abstract decision maker who focuses on the abstract problem rather than on concrete situations.
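The Shapley value discussed above can be computed directly from its definition as the average marginal contribution over all player orderings. The sketch below uses a toy monotone game that is an assumption for illustration: a transportation profit of 6 is realised only when both owners of the shortest path's edges (players 1 and 2) are in the coalition, while player 3 is a dummy.

```python
from itertools import permutations

def shapley(players, v):
    """Shapley value by averaging each player's marginal contribution
    v(S + {p}) - v(S) over all orderings; fine for small games."""
    phi = {p: 0.0 for p in players}
    perms = list(permutations(players))
    for order in perms:
        coalition = set()
        for p in order:
            phi[p] += v(coalition | {p}) - v(coalition)
            coalition |= {p}
    return {p: phi[p] / len(perms) for p in players}

# Toy monotone shortest path game: profit 6 needs players 1 and 2.
v = lambda S: 6 if {1, 2} <= S else 0
print(shapley([1, 2, 3], v))  # → {1: 3.0, 2: 3.0, 3: 0.0}
```

The output illustrates two of the axioms at play: symmetry (players 1 and 2 are interchangeable and split the profit equally) and the dummy/null player property (player 3, who never changes the worth, receives nothing).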
Abstract:
This thesis is an analysis of the recruitment processes of the Shining Path (SP) and Revolutionary Movement "Túpac Amaru" (MRTA) guerrilla groups. Although SP was considered more aggressive, it gained more followers than MRTA. This thesis tries to explain why. Social Revolution Theory and Social Movement Theory provide explanations based on issues of "poverty", disregarding the specific characteristics of the guerrilla groups and their supporters, as well as the influence of specific persuasive processes between the leaders of the groups and their followers. Integrative complexity theory, on the contrary, provides a consistent method to analyze cognitive processes: because people tend to reject complex and sophisticated explanations that require mental effort, simplicity was the key to success. To determine which guerrilla group provided a simpler worldview, a sample of official documents of SP and MRTA is compared. Finally, content analysis is applied through the Paragraph Completion Test (P.C.T.).
Abstract:
The major barrier to practical optimization of pavement preservation programming has always been that, for formulations where the identity of individual projects is preserved, the solution space grows exponentially with the problem size, to the extent that it can become unmanageable by traditional analytical optimization techniques within reasonable limits. This has been attributed to the problem of combinatorial explosion, that is, the exponential growth of the number of combinations. The relatively large number of constraints often present in real-life pavement preservation programming problems, and the trade-off considerations required between preventive maintenance, rehabilitation, and reconstruction, present yet another factor that contributes to the solution complexity. In this research study, a new integrated multi-year optimization procedure was developed to solve network-level pavement preservation programming problems through cost-effectiveness based evolutionary programming analysis, using the Shuffled Complex Evolution (SCE) algorithm. A case study problem was analyzed to illustrate the robustness and consistency of the SCE technique in solving network-level pavement preservation problems. The output from this program is a list of maintenance and rehabilitation (M&R) treatment strategies for each identified segment of the network in each programming year, and the impact on the overall performance of the network, in terms of the performance levels of the recommended optimal M&R strategy. The results show that the SCE is very efficient and consistent in the simultaneous consideration of the trade-offs between various pavement preservation strategies, while preserving the identity of the individual network segments. The flexibility of the technique is also demonstrated in the sense that, by suitably coding the problem parameters, it can be used to solve several forms of pavement management programming problems.
It is recommended that for large networks, some form of decomposition technique be applied to aggregate sections that exhibit similar performance characteristics into links, such that whatever M&R alternative is recommended for a link can be applied to all the sections connected to it. In this way the problem size, and hence the solution time, can be greatly reduced to a more manageable solution space. The study concludes that the robust search characteristics of SCE are well suited to solving the combinatorial problems in long-term network-level pavement M&R programming and provide a rich area for future research.
Abstract:
This paper analyzes a manager's optimal ex-ante reporting system using a Bayesian persuasion approach (Kamenica and Gentzkow (2011)) in a setting where investors affect cash flows through their decision to finance the firm's investment opportunities, possibly assisted by the costly acquisition of additional information (inspection). I examine how the informativeness and the bias of the optimal system are determined by investors' inspection cost, the degree of incentive alignment between the manager and the investor, and the prior belief that the project is profitable. I find that a misaligned manager's system is informative only when the market prior is pessimistic and is always positively biased; this bias decreases as investors' inspection cost decreases. In contrast, a well-aligned manager's system is fully revealing when investors' inspection cost is high, and is counter-cyclical to the market belief when the inspection cost is low: it is positively (negatively) biased when the market belief is pessimistic (optimistic). Furthermore, I explore the extent to which the results generalize to a case with managerial manipulation and discuss the implications for investment efficiency. Overall, the analysis describes the complex interactions among determinants of firm disclosures and governance, and offers explanations for the mixed empirical results in this area.
Abstract:
With the increasing prevalence and capabilities of autonomous systems as part of complex heterogeneous manned-unmanned environments (HMUEs), an important consideration is the impact of the introduction of automation on the optimal assignment of human personnel. The US Navy implemented optimal staffing techniques in the 1990s and 2000s with a "minimal staffing" approach. The results were poor, leading to the degradation of Naval preparedness. Clearly, another approach to determining optimal staffing is necessary. To this end, the goal of this research is to develop human performance models for use in determining optimal manning of HMUEs. The human performance models are developed using an agent-based simulation of the aircraft carrier flight deck, a representative safety-critical HMUE. The Personnel Multi-Agent Safety and Control Simulation (PMASCS) simulates and analyzes the effects of introducing generalized maintenance crew skill sets and accelerated failure repair times on the overall performance and safety of the carrier flight deck. A behavioral model of the operator types (ordnance officers, chocks and chains, fueling officers, plane captains, and maintenance operators) is presented here, along with an aircraft failure model. The main focus of this work is on the maintenance operators and aircraft failure modeling, since they have a direct impact on total launch time, a primary metric for carrier deck performance. With PMASCS I explore the effects of two variables on the total launch time of 22 aircraft: 1) the skill level of maintenance operators and 2) aircraft failure repair times while on the catapult (referred to as Phase 4 repair times). It is found that neither introducing a generic skill set to maintenance crews nor introducing a technology to accelerate Phase 4 aircraft repair times improves the average total launch time of 22 aircraft.
An optimal manning level of 3 maintenance crews is found under all conditions; beyond this point, additional maintenance crews do not reduce the total launch time. An additional discussion is included about how these results change if the operations are relieved of the bottleneck of installing the holdback bar at launch time.
Abstract:
I explore and analyze the problem of finding the socially optimal capital requirements for financial institutions, considering two distinct channels of contagion: direct exposures among the institutions, as represented by a network, and fire-sale externalities, which reflect the negative price impact of the massive liquidation of assets. These two channels amplify shocks from individual financial institutions to the financial system as a whole and thus increase the risk of joint defaults among the interconnected financial institutions; this is often referred to as systemic risk. In the model, there is a trade-off between reducing systemic risk and raising the capital requirements of the financial institutions. The policymaker considers this trade-off and determines the optimal capital requirements for individual financial institutions. I provide a method for finding and analyzing the optimal capital requirements that can be applied to arbitrary network structures and arbitrary distributions of investment returns.
In particular, I first consider a network model consisting only of direct exposures and show that the optimal capital requirements can be found by solving a stochastic linear programming problem. I then extend the analysis to financial networks with default costs and show the optimal capital requirements can be found by solving a stochastic mixed integer programming problem. The computational complexity of this problem poses a challenge, and I develop an iterative algorithm that can be efficiently executed. I show that the iterative algorithm leads to solutions that are nearly optimal by comparing it with lower bounds based on a dual approach. I also show that the iterative algorithm converges to the optimal solution.
Finally, I incorporate fire sales externalities into the model. In particular, I am able to extend the analysis of systemic risk and the optimal capital requirements with a single illiquid asset to a model with multiple illiquid assets. The model with multiple illiquid assets incorporates liquidation rules used by the banks. I provide an optimization formulation whose solution provides the equilibrium payments for a given liquidation rule.
I further show that the socially optimal capital problem using the "socially optimal liquidation" and prioritized liquidation rules can be formulated as a convex problem and a convex mixed-integer problem, respectively. Finally, I illustrate the results of the methodology on numerical examples and discuss some implications for capital regulation policy and stress testing.
Abstract:
Recently, honeycomb meshes have been considered as alternative candidates for interconnection networks in parallel and distributed computer systems. This paper presents a solution to one of the open problems about honeycomb meshes: the so-called three disjoint path problem. The problem requires minimizing the length of the longest of any three disjoint paths between 3-degree nodes. This solution provides information for re-routing traffic along the network in the presence of faults.
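By Menger's theorem, three internally node-disjoint paths exist between two 3-degree nodes of a 3-connected graph, and they can be found with a unit-capacity max-flow after node splitting. The sketch below is a generic illustration of that standard technique, not the paper's construction; the K4 test graph is an assumption standing in for a honeycomb mesh fragment.

```python
from collections import deque, defaultdict

def max_node_disjoint_paths(edges, s, t):
    """Count internally node-disjoint s-t paths: split each node v into
    v_in -> v_out with unit capacity, give each undirected edge unit
    capacity in both directions, and run BFS-augmenting max-flow."""
    cap = defaultdict(int)
    adj = defaultdict(set)

    def add(u, v, c):
        cap[(u, v)] += c
        adj[u].add(v)
        adj[v].add(u)          # residual arc for the augmenting search

    nodes = {u for e in edges for u in e}
    for v in nodes:            # endpoints get effectively unlimited capacity
        add((v, "in"), (v, "out"), len(edges) if v in (s, t) else 1)
    for u, v in edges:
        add((u, "out"), (v, "in"), 1)
        add((v, "out"), (u, "in"), 1)

    src, sink = (s, "in"), (t, "out")
    flow = 0
    while True:
        parent = {src: None}   # BFS for a shortest augmenting path
        q = deque([src])
        while q and sink not in parent:
            u = q.popleft()
            for v in adj[u]:
                if v not in parent and cap[(u, v)] > 0:
                    parent[v] = u
                    q.append(v)
        if sink not in parent:
            return flow
        v = sink               # push one unit of flow along the path
        while parent[v] is not None:
            u = parent[v]
            cap[(u, v)] -= 1
            cap[(v, u)] += 1
            v = u
        flow += 1

# K4: between two 3-degree nodes there are three internally disjoint paths.
k4 = [("a", "b"), ("a", "c"), ("a", "d"), ("b", "c"), ("b", "d"), ("c", "d")]
print(max_node_disjoint_paths(k4, "a", "d"))  # → 3
```

Recording the augmenting paths (rather than only counting them) yields the actual disjoint routes, which is the information needed for fault-tolerant re-routing.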