Abstract:
We use a unique dataset of bank clients' security holdings for all German banks to examine how macroeconomic shocks affect the asset allocation preferences of households and non-financial firms. Our analysis focuses on two alternative mechanisms that can influence portfolio choice: wealth shocks, represented by the sovereign debt crisis in the Eurozone, and credit-supply shocks, which arise from reduced borrowing capacity during bank distress. We document heterogeneous responses to these two types of shocks. While households with large holdings of securities from stressed Eurozone countries (Greece, Ireland, Italy, Portugal, and Spain) decrease the concentration of their security portfolios as a result of the Eurozone crisis, non-financial firms with similar levels of holdings from stressed Eurozone countries do not. Credit-supply shocks at the bank level (caused by bank distress) result in lower concentration for both households and non-financial corporations. We also show that only shocks to corporate credit affect bank clients' portfolio concentration, while shocks to retail credit are inconsequential. Our results are robust to falsification tests, propensity score matching techniques, and instrumental variables estimation.
Abstract:
The increasing number of renewable energy generators introduced into the electricity grid is putting pressure on its stability and management, as renewable energy sources cannot be predicted accurately or fully controlled. Combined with the additional pressure of fluctuations in demand, this presents a problem more complex than the one current methods of controlling electricity distribution were designed for. A global approximate and distributed optimisation method for power allocation that accommodates uncertainties and volatility is suggested and analysed. It is based on a probabilistic method known as message passing [1], which has deep links to statistical physics methodology. This principled optimisation method is based on local calculations and inherently accommodates uncertainties; it is of modest computational complexity and provides good approximate solutions. We consider uncertainty and fluctuations drawn from a Gaussian distribution and incorporate them into the message-passing algorithm. We show the effect that increasing uncertainty has on the transmission cost and how the placement of volatile nodes, such as renewable generators or consumers, within a grid affects it.
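One intuition behind the abstract's cost result can be illustrated outside the paper's message-passing framework: under a quadratic line cost, Gaussian fluctuations at a node raise the expected cost in proportion to their variance. The sketch below is an illustrative Monte Carlo check of that effect, not the authors' algorithm; the function name and cost model are assumptions for illustration.

```python
import random

def expected_line_cost(flow, sigma, n_samples=20000, seed=0):
    """Monte Carlo estimate of E[(flow + eps)^2], eps ~ N(0, sigma^2).

    Illustrative only: a quadratic transmission cost on a single line
    serving one volatile node. Analytically the expectation equals
    flow**2 + sigma**2, so expected cost grows with the node's variance.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        eps = rng.gauss(0.0, sigma)  # Gaussian fluctuation at the node
        total += (flow + eps) ** 2
    return total / n_samples
```

With `sigma = 0` the estimate reduces to the deterministic cost `flow**2`; increasing `sigma` raises it, mirroring the qualitative effect of volatility on transmission cost described above.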
Abstract:
From 1992 to 2012, 4.4 billion people were affected by disasters, with almost 2 trillion USD in damages and 1.3 million people killed worldwide. The increasing threat of disasters stresses the need to provide solutions for the challenges faced by disaster managers, such as the logistical deployment of resources required to provide relief to victims. The location of emergency facilities, stock prepositioning, evacuation, inventory management, resource allocation, and relief distribution have been identified as directly impacting the relief provided to victims during a disaster. Managing these factors appropriately is critical to reduce suffering. Disaster management commonly attracts several organisations working alongside each other and sharing resources to cope with the emergency. Coordinating these agencies is a complex task, but there is little research considering multiple organisations, and none actually optimising the number of actors required to avoid shortages and convergence. The aim of this research is to develop a system for disaster management based on a combination of optimisation techniques and geographical information systems (GIS) to aid multi-organisational decision-making. An integrated decision system was created, comprising a cartographic model implemented in GIS to discard floodable facilities, combined with two models focused on optimising the decisions regarding the location of emergency facilities, stock prepositioning, the allocation of resources and relief distribution, along with the number of actors required to perform these activities. Three in-depth case studies in Mexico were conducted, gathering information from different organisations. The cartographic model proved to reduce the risk of selecting unsuitable facilities. The preparedness and response models showed the capacity to optimise the decisions and the number of organisations required for logistical activities, pointing towards an excess of actors involved in all cases.
The system as a whole demonstrated its capacity to provide integrated support for disaster preparedness and response, and revealed room for improvement for Mexican organisations in flood management.
Abstract:
This paper proposes an allocation Malmquist index inspired by work on the non-parametric cost Malmquist index. We first show how to decompose the cost Malmquist index into the input-oriented Malmquist index and the allocation Malmquist index. An application to corporate management in the China securities industry, using a panel data set of 40 securities companies over the period 2005–2011, shows the practicality of the proposed model.
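The decomposition described above can be sketched in standard Malmquist-index notation; the symbols $CM$, $M_I$, and $AM$ are labels chosen here for illustration and need not match the paper's own notation:

```latex
% Cost Malmquist index split into a technical and an allocative part
CM = M_I \times AM,
\qquad\text{so that}\qquad
AM = \frac{CM}{M_I},
```

where $CM$ is the cost Malmquist index, $M_I$ the input-oriented Malmquist index capturing productivity change, and $AM$ the residual allocation Malmquist index capturing change in allocative efficiency.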
Abstract:
Non-orthogonal multiple access (NOMA) is emerging as a promising multiple access technology for fifth-generation cellular networks to address fast-growing mobile data traffic. It applies superposition coding at transmitters, allowing simultaneous allocation of the same frequency resource to multiple intra-cell users. Successive interference cancellation is used at the receivers to cancel intra-cell interference. User pairing and power allocation (UPPA) is a key design aspect of NOMA. Existing UPPA algorithms are mainly based on exhaustive search with extensive computational complexity, which can severely affect NOMA performance. A fast proportional fairness (PF) scheduling based UPPA algorithm is proposed to address the problem. The novel idea is to form user pairs around the users with the highest PF metrics, with pre-configured fixed power allocation. System-level simulation results show that the proposed algorithm is significantly faster than the existing exhaustive search algorithm (seven times faster in a scenario with 20 users), with negligible throughput loss.
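The pairing idea in the abstract can be sketched as follows. This is a hedged illustration, not the paper's exact algorithm: users are ranked by PF metric (instantaneous rate divided by long-term average throughput), each remaining highest-PF user is paired with a dissimilar-channel partner (approximated here by the lowest instantaneous rate), and a pre-configured fixed power split is applied; the function name, the partner-selection rule, and the default split are assumptions.

```python
def pf_pair_users(rates, avg_throughputs, power_split=(0.2, 0.8)):
    """Greedy PF-based user pairing with a fixed power split.

    rates: instantaneous achievable rates per user.
    avg_throughputs: long-term average throughputs per user.
    power_split: (strong-user, weak-user) power fractions; the weaker
    user conventionally receives the larger fraction in downlink NOMA.
    Returns a list of pairs [((strong, p_s), (weak, p_w)), ...].
    """
    # Rank users by proportional-fairness metric, highest first.
    pf = sorted(range(len(rates)),
                key=lambda u: rates[u] / avg_throughputs[u],
                reverse=True)
    pairs = []
    while len(pf) >= 2:
        strong = pf.pop(0)                       # highest remaining PF metric
        weak = min(pf, key=lambda u: rates[u])   # most dissimilar channel
        pf.remove(weak)
        pairs.append(((strong, power_split[0]), (weak, power_split[1])))
    return pairs
```

Because each pair is formed by one linear scan rather than by evaluating all candidate pairs, the pairing step avoids the exhaustive search the abstract identifies as the bottleneck.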