940 results for Exact constraint
Abstract:
The budget constraint has not hardened to equal degrees in the various post-socialist countries. In some of them, a great deal has been done in this respect, while in others there has been hardly any change from the initial state. This study surveys the typical manifestations of softness of the budget constraint, such as state subsidies, soft taxation, non-performing loans, the accumulation of trade arrears between firms, and the build-up of wage arrears. Softness of the budget constraint is caused by several factors that tend to act in combination. Thus retention of state ownership helps to preserve the soft budget-constraint syndrome, while privatization encourages the budget constraint to harden, although it is not a sufficient condition for this to happen. Purposeful development of the requisite political, legal and economic conditions is also required. It was widely maintained at the outset of the post-socialist transition that the 'Holy Trinity' of liberalization, privatization and stabilization would suffice to produce an efficient market economy. Since then, it has become clear that hardening the budget constraint needs to be given equal priority with these three tasks. Otherwise, the effects of privatization will fall short of expectations, as they have in Russia, for example.
Abstract:
Research so far has examined mainly how the soft budget constraint syndrome appears in the corporate sphere and the credit system. This article concentrates on the hospital sector. It describes the motivations and the contradictory behaviour of the five main types of participant in the events: patients, doctors, hospital managers, politicians, and hospital owners. The motivations explain why the propensity to overspend and the tendency to soften the budget constraint are so strong. The burdens of overspending and indebtedness are pushed upwards at every level of the decision-making and funding processes. The article considers the connection between the soft budget constraint syndrome and the various forms of ownership (state ownership and the non-profit and for-profit forms of non-state ownership). Finally, the phenomenon is examined from the normative point of view: what are the favourable and unfavourable consequences of hardening the budget constraint, and how are the normative dilemmas reflected in the consciousness of the participants in the events?
Abstract:
The author’s ideas on the soft budget constraint (SBC) were first expressed in 1976. Much progress has been made in understanding the problem over the ensuing four decades. The study takes issue with those who confine the concept to the process of bailing out loss-making socialist firms. It shows how the syndrome can appear in various organizations and forms in many spheres of the economy and points to the various means available for financial rescue. Single bailouts do not as such generate the SBC syndrome. It develops where the SBC becomes built into expectations. Special heed is paid to features generated by the syndrome in rescuer and rescuee organizations. The study reports on the spread of the syndrome in various periods of the socialist and the capitalist system, in various sectors. The author expresses his views on normative questions and on therapies against the harmful effects. He deals first with actual practice, then places the theory of the SBC in the sphere of ideas and models, showing how it relates to other theoretical trends, including institutional and behavioural economics and theories of moral hazard and time inconsistency. He shows how far the intellectual apparatus of the SBC has spread in theoretical literature and where it has reached in the process of “canonization” by the economics profession. Finally, he reviews the main research tasks ahead.
Abstract:
Public management reforms are usually underpinned by arguments that they will make the public administration system more effective and efficient. In practice, however, it is very hard to determine whether a given reform will improve the efficiency and effectiveness of the public administration system in the long run. Here, I shall examine how the concept of the soft budget constraint (SBC) introduced by János Kornai (Kornai 1979, 1986; Kornai, Maskin & Roland 2003) can be applied to this problem. In the following, I shall describe the Hungarian public administration reforms implemented by the Orbán government from 2010 onward and analyze them, focusing on which measures harden and which soften the budget constraint of the actors of the Hungarian public administration system. In the literature of economics, there is some evidence-based knowledge on how hardening or softening the budget constraint improves or reduces the effectiveness, and hence the efficiency, of a given system. By using the concept of SBC, I also hope to shed some light on the rationale behind the Hungarian government’s introduction of such a contradictory reform package. Previously, the concept of SBC was utilized only narrowly in public management studies, mostly in the field of fiscal federalism. My goal is to apply the concept to a broader area of public management studies. My conclusion is that the concept of SBC can significantly contribute to public management studies by deepening our knowledge of the reasons behind the success and failure of public administration reforms.
Abstract:
This paper introduces a screw-theory-based method, termed the constraint and position identification (CPI) approach, to synthesize decoupled spatial translational compliant parallel manipulators (XYZ CPMs) with consideration of actuation isolation. The proposed approach is based on a systematic arrangement of rigid stages and compliant modules in a three-legged XYZ CPM system using the constraint spaces and the position spaces of the compliant modules. The constraint spaces and the position spaces are first derived based on screw theory instead of relying on rigid-body mechanism design experience. Additionally, the constraint spaces are classified into different constraint combinations, with typical position spaces depicted via geometric entities. Furthermore, the systematic synthesis process based on the constraint combinations and the geometric entities is demonstrated via several examples. Finally, several novel decoupled XYZ CPMs with monolithic configurations are created and verified by finite element analysis. The present CPI approach enables experts and beginners to synthesize a variety of decoupled XYZ CPMs with consideration of actuation isolation by selecting an appropriate constraint and an optimal position for each of the compliant modules according to a specific application.
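For orientation, the screw-theory relation underlying the constraint spaces mentioned above can be stated as follows (a standard textbook identity, with generic symbols that are not taken from the paper): a constraint wrench $\hat{\$}_w = (\mathbf{f};\,\mathbf{m})$ exerted by a compliant module is reciprocal to a freedom twist $\hat{\$}_t = (\boldsymbol{\omega};\,\mathbf{v})$ of that module when their virtual work vanishes,

\[
\hat{\$}_w \circ \hat{\$}_t \;=\; \mathbf{f}\cdot\mathbf{v} + \mathbf{m}\cdot\boldsymbol{\omega} \;=\; 0 ,
\]

so the constraint space of an n-DOF compliant module is the (6 − n)-dimensional space of wrenches reciprocal to its freedom (twist) space.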
Abstract:
Numerous works have been conducted on modelling basic compliant elements such as wire beams, and closed-form analytical models of most basic compliant elements have been well developed. However, the modelling of complex compliant mechanisms is still a challenging task. This paper proposes a constraint-force-based (CFB) modelling approach to model compliant mechanisms, with a particular emphasis on complex compliant mechanisms. The proposed CFB modelling approach can be regarded as an improved free-body-diagram (FBD) based modelling approach, and can be extended to further develop the screw-theory-based design approach. A compliant mechanism can be decomposed into rigid stages and compliant modules. A compliant module offers elastic forces due to its deformation. Such elastic forces are regarded as variable constraint forces in the CFB modelling approach. Additionally, the CFB modelling approach defines external forces applied on a compliant mechanism as constant constraint forces. If a compliant mechanism is at static equilibrium, all the rigid stages are also at static equilibrium under the influence of the variable and constant constraint forces. Therefore, the constraint-force equilibrium equations for all the rigid stages can be obtained, and the analytical model of the compliant mechanism can be derived from these equilibrium equations. The CFB modelling approach can model a compliant mechanism linearly or nonlinearly, can obtain the displacements of any points of the rigid stages, and allows external forces to be exerted at any positions on the rigid stages. Compared with the FBD based modelling approach, the CFB modelling approach does not need to identify the possible deformed configuration of a complex compliant mechanism in order to obtain the geometric compatibility conditions and the force equilibrium equations. Additionally, the mathematical expressions in the CFB approach have an easily understood physical meaning. Using the CFB modelling approach, the variable constraint forces of three compliant modules, a wire beam, a four-beam compliant module and an eight-beam compliant module, are derived in this paper. Based on these variable constraint forces, the linear and nonlinear models of a decoupled XYZ compliant parallel mechanism are derived and verified by FEA simulations and experimental tests.
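A minimal sketch of the kind of equilibrium system the CFB approach assembles, with generic symbols introduced here only for illustration (the paper's own notation may differ): for each rigid stage $i$, the variable constraint forces supplied by the attached compliant modules must balance the constant constraint forces (external loads),

\[
\sum_{j \in \mathcal{M}(i)} \mathbf{w}^{\mathrm{var}}_{j}(\mathbf{x}) \;+\; \sum_{k} \mathbf{w}^{\mathrm{const}}_{ik} \;=\; \mathbf{0},
\qquad
\mathbf{w}^{\mathrm{var}}_{j}(\mathbf{x}) \;\approx\; -\,\mathbf{K}_{j}\,\mathbf{x}_{j} \quad \text{(linearized case)},
\]

where $\mathcal{M}(i)$ indexes the compliant modules acting on stage $i$, $\mathbf{x}_{j}$ is the deformation of module $j$ and $\mathbf{K}_{j}$ its stiffness matrix; solving the coupled equations for all stages yields the displacement of any point of interest.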
Abstract:
The generalized KP (GKP) equations with an arbitrary nonlinear term model and characterize many nonlinear physical phenomena. The symmetries of the GKP equation with an arbitrary nonlinear term are obtained. The condition that must be satisfied for the symmetry group of the GKP equation to exist is derived, and the symmetries obtained are classified according to the different forms of the nonlinear term. The resulting similarity reductions are studied by performing bifurcation and phase-portrait analysis of the GKP equation, and the corresponding solitary wave solutions are constructed.
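As a point of reference, one commonly studied form of the GKP equation with an arbitrary nonlinear term $f(u)$ is (assumed here for illustration; the paper may use a different normalization)

\[
\bigl(u_t + f(u)\,u_x + u_{xxx}\bigr)_x + \sigma\,u_{yy} = 0,
\]

which reduces to the classical KP equation when $f(u) = 6u$, with the sign of $\sigma$ distinguishing KP-I from KP-II.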
Abstract:
This study investigates topology optimization of energy absorbing structures in which material damage is accounted for in the optimization process. The optimization objective is to design the lightest structures that are able to absorb the required mechanical energy. A structural continuity constraint check is introduced that is able to detect when no feasible load path remains in the finite element model, usually as a result of large-scale fracture. This ensures that designs do not fail when loaded under the conditions prescribed in the design requirements. This continuity constraint check is automated and requires no intervention from the analyst once the optimization process is initiated. Consequently, the optimization algorithm proceeds towards evolving an energy absorbing structure with the minimum structural mass that is not susceptible to global structural failure. A method is also introduced to determine when the optimization process should halt. The method identifies when the optimization has plateaued and is no longer likely to provide improved designs if continued for further iterations. This provides the designer with a rational way to determine how long to run the optimization and avoid wasting computational resources on unnecessary iterations. A case study is presented to demonstrate the use of this method.
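The continuity constraint check described above is, in essence, a connectivity test over the surviving finite elements. A minimal sketch of such a test in Python, with hypothetical data structures (this is not the authors' implementation):

from collections import defaultdict, deque

def load_path_exists(elements, failed, loaded_nodes, support_nodes):
    """Return True if at least one chain of intact elements connects a
    loaded node to a supported node (i.e. a feasible load path remains).

    elements      : dict element_id -> iterable of node ids
    failed        : set of element ids flagged as fully damaged
    loaded_nodes  : set of node ids where the load is applied
    support_nodes : set of node ids that are constrained
    """
    # Map each node to the intact elements attached to it.
    node_to_elems = defaultdict(set)
    for eid, nodes in elements.items():
        if eid in failed:
            continue
        for n in nodes:
            node_to_elems[n].add(eid)

    # Breadth-first search over nodes, moving through intact elements only.
    frontier = deque(n for n in loaded_nodes if n in node_to_elems)
    seen = set(frontier)
    while frontier:
        n = frontier.popleft()
        if n in support_nodes:
            return True
        for eid in node_to_elems[n]:
            for m in elements[eid]:
                if m not in seen:
                    seen.add(m)
                    frontier.append(m)
    return False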
Abstract:
In this paper, we propose three relay selection schemes for full-duplex heterogeneous networks in the presence of multiple cognitive radio eavesdroppers. In this setup, the cognitive small-cell nodes (secondary network) can share the spectrum licensed to the macro-cell system (primary network) on the condition that the quality-of-service of the primary network is always satisfied, subject to its outage probability constraint. The messages are delivered from one small-cell base station to the destination with the help of full-duplex small-cell base stations, which act as relay nodes. Based on the availability of the network’s channel state information at the secondary information source, three different selection criteria for full-duplex relays are proposed, namely: 1) partial relay selection; 2) optimal relay selection; and 3) minimal self-interference relay selection. We derive the exact closed-form and asymptotic expressions of the secrecy outage probability for the three criteria under the attack of non-colluding/colluding eavesdroppers. We demonstrate that the optimal relay selection scheme outperforms the partial relay selection and minimal self-interference relay selection schemes at the expense of acquiring full channel state information. In addition, increasing the number of full-duplex small-cell base stations can improve the security performance. On the illegitimate side, deploying colluding eavesdroppers and increasing the number of eavesdroppers put the confidential information at greater risk. Moreover, the transmit power and the desired outage probability of the primary network have a strong influence on the secrecy outage probability of the secondary network.
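A toy sketch of how the three selection rules differ, using generic per-relay channel-gain estimates; all names are hypothetical, and the "optimal" metric below is a crude stand-in for the end-to-end secrecy criterion analysed in the paper:

def select_relay(relays, criterion):
    """Pick a full-duplex relay index according to one of three rules.

    relays: list of dicts with hypothetical keys
        'g_sr' - source-to-relay channel gain
        'g_rd' - relay-to-destination channel gain
        'g_re' - strongest relay-to-eavesdropper channel gain
        'g_si' - residual self-interference channel gain
    """
    if criterion == "partial":
        # Partial relay selection: only the source-relay link is known.
        return max(range(len(relays)), key=lambda i: relays[i]["g_sr"])
    if criterion == "optimal":
        # Optimal relay selection: maximize an end-to-end secrecy metric,
        # here a simple ratio of legitimate to eavesdropper link quality.
        return max(range(len(relays)),
                   key=lambda i: min(relays[i]["g_sr"], relays[i]["g_rd"])
                   / (relays[i]["g_re"] + relays[i]["g_si"] + 1e-12))
    if criterion == "min_self_interference":
        # Minimal self-interference relay selection.
        return min(range(len(relays)), key=lambda i: relays[i]["g_si"])
    raise ValueError("unknown criterion")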
Abstract:
Many countries have set challenging wind power targets to achieve by 2020. This paper implements a realistic analysis of curtailment and constraint of wind energy at a nodal level, using a unit commitment and economic dispatch model of the Irish Single Electricity Market in 2020. The key findings show that a significant reduction in curtailment can be achieved when the system non-synchronous penetration limit increases from 65% to 75%. For the period analyzed, this results in a decreased total generation cost and a reduction in the dispatch-down of wind. However, some nodes experience significant dispatch-down of wind, which can be on the order of 40%. This work illustrates the importance of carrying out analysis at a nodal level for the purpose of power system planning.
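For context, the system non-synchronous penetration referred to above is commonly defined for the Irish system as

\[
\mathrm{SNSP} \;=\; \frac{P_{\text{wind}} + P_{\text{HVDC import}}}{P_{\text{demand}} + P_{\text{HVDC export}}},
\]

and wind is curtailed whenever instantaneous non-synchronous generation would push this ratio above the permitted limit (65% or 75% in the scenarios studied). This is the standard operational definition rather than a formula quoted from the paper.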
Abstract:
This work examines analytically the forced convection in a channel partially filled with a porous material and subjected to a constant wall heat flux. The Darcy–Brinkman–Forchheimer model is used to represent the fluid transport through the porous material. The local thermal non-equilibrium (two-equation) model is further employed to describe the solid and fluid heat transport. Two fundamental models (models A and B) represent the thermal boundary conditions at the interface between the porous medium and the clear region. The governing equations of the problem are manipulated, and for each interface model exact solutions for the solid and fluid temperature fields are developed. These solutions incorporate the porous material thickness, Biot number, fluid-to-solid thermal conductivity ratio and Darcy number as parameters. The results can be readily used to validate numerical simulations. They are, further, applicable to the analysis of enhanced heat transfer, using porous materials, in heat exchangers.
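For orientation, the two-equation local thermal non-equilibrium model mentioned above is usually written, for fully developed flow in the porous region, in a form similar to

\[
k_{f,\mathrm{eff}}\,\frac{\partial^{2} T_f}{\partial y^{2}}
\;+\; h_{sf}\,a_{sf}\,(T_s - T_f)
\;=\; \rho c_p\, u\,\frac{\partial T_f}{\partial x},
\qquad
k_{s,\mathrm{eff}}\,\frac{\partial^{2} T_s}{\partial y^{2}}
\;-\; h_{sf}\,a_{sf}\,(T_s - T_f) \;=\; 0,
\]

where $T_f$ and $T_s$ are the fluid and solid temperatures and $h_{sf}$, $a_{sf}$ are the interfacial heat transfer coefficient and specific surface area; the Biot number and conductivity ratio quoted in the abstract are built from these quantities. The symbols are generic, and the paper's exact non-dimensional form may differ.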
Abstract:
The bond formation between an oxide surface and oxygen, which is of importance for numerous surface reactions including catalytic reactions, is investigated within the framework of hybrid density functional theory that includes nonlocal Fock exchange. We show that there exists a linear correlation between the adsorption energies of oxygen on LaMO3 (M = Sc–Cu) surfaces obtained using a hybrid functional (e.g., Heyd–Scuseria–Ernzerhof) and those obtained using a semilocal density functional (e.g., Perdew–Burke–Ernzerhof) through the magnetic properties of the bulk phase as determined with a hybrid functional. The energetics of the spin-polarized surfaces follows the same trend as corresponding bulk systems, which can be treated at a much lower computational cost. The difference in adsorption energy due to magnetism is linearly correlated to the magnetization energy of bulk, that is, the energy difference between the spin-polarized and the non-spin-polarized solutions. Hence, one can estimate the correction ...
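The reported correlation can be summarized schematically as

\[
E_{\mathrm{ads}}^{\mathrm{HSE}} \;\approx\; E_{\mathrm{ads}}^{\mathrm{PBE}} \;+\; a\,\Delta E_{\mathrm{mag}} \;+\; b,
\qquad
\Delta E_{\mathrm{mag}} \;=\; E_{\mathrm{bulk}}^{\mathrm{SP}} - E_{\mathrm{bulk}}^{\mathrm{NSP}},
\]

where $a$ and $b$ are generic fit parameters introduced here only to show the form of the relation and $\Delta E_{\mathrm{mag}}$ is the bulk magnetization energy (the energy difference between the spin-polarized and non-spin-polarized solutions); a cheap semilocal calculation plus a single hybrid-functional bulk calculation then suffices to estimate the hybrid-level adsorption energy.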
Abstract:
The problem addressed concerns the determination of the average number of successive attempts of guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
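A brute-force sketch of the quantity being approximated, feasible only for small alphabets and short words (the letter distribution below is hypothetical, and the paper's point is precisely that such enumeration becomes infeasible at realistic sizes):

from itertools import product

def expected_guesses(letter_probs, word_length):
    """Average number of guesses when words are guessed in decreasing
    order of probability (first-order approximation: i.i.d. letters).

    letter_probs: dict letter -> probability (must sum to 1)
    """
    word_probs = []
    for letters in product(letter_probs, repeat=word_length):
        p = 1.0
        for ch in letters:
            p *= letter_probs[ch]
        word_probs.append(p)
    word_probs.sort(reverse=True)                 # guess most likely first
    return sum(rank * p for rank, p in enumerate(word_probs, start=1))

# Example with a hypothetical 3-letter alphabet and 4-letter words.
print(expected_guesses({"a": 0.5, "b": 0.3, "c": 0.2}, word_length=4))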
Abstract:
Cops and robbers games have been studied for about thirty years in computer science and mathematics. As in pursuit games in general, pursuers (the cops) seek to capture evaders (the robbers); here, however, the players move in turns and are constrained to move on a discrete structure. It is always assumed that the players know the exact positions of their opponents; in other words, the game is played with perfect information. The first definition of a cops-and-robbers game goes back to Nowakowski and Winkler [39] and, independently, Quilliot [46]. This first definition presents a game opposing a single cop and a single robber, with constraints on their movement speeds. Extensions were gradually proposed, such as adding cops and increasing movement speeds. In 2014, Bonato and MacGillivray [6] proposed a generalization of cops-and-robbers games that allows them to be studied as a whole. However, their model does not cover games with stochastic components, such as those in which the robbers can move randomly. This thesis therefore presents a new model that includes stochastic aspects. Secondly, this thesis presents a concrete application of these games in the form of a method for solving a problem from search theory. While cops-and-robbers games rely on the perfect-information assumption, search problems cannot make that assumption. It turns out, however, that the cops-and-robbers game can be analysed as a constraint relaxation of a search problem. This new viewpoint is exploited to design an upper bound on the objective function of a search problem that can be used in a branch and bound method.
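As background on the classical one-cop, one-robber game cited above (Nowakowski and Winkler, Quilliot): a finite graph is cop-win exactly when it is dismantlable, i.e., it can be reduced to a single vertex by repeatedly deleting corners (vertices whose closed neighbourhood is contained in that of another vertex). A short sketch of that standard check, not the stochastic model developed in the thesis:

def is_cop_win(adj):
    """Check whether a finite graph is cop-win by dismantling it.

    adj: dict vertex -> set of neighbours (closed neighbourhoods are
    formed below, so the vertex itself need not be listed).
    """
    vertices = set(adj)
    closed = {v: set(adj[v]) | {v} for v in vertices}
    while len(vertices) > 1:
        # A corner is a vertex whose closed neighbourhood is dominated.
        corner = next(
            (u for u in vertices
             if any(closed[u] <= closed[w] for w in vertices if w != u)),
            None)
        if corner is None:
            return False                 # no corner left: robber wins
        vertices.remove(corner)
        for v in vertices:               # delete the corner from the graph
            closed[v].discard(corner)
    return True

# Example: a path on three vertices is cop-win, a 4-cycle is not.
print(is_cop_win({1: {2}, 2: {1, 3}, 3: {2}}))                    # True
print(is_cop_win({1: {2, 4}, 2: {1, 3}, 3: {2, 4}, 4: {1, 3}}))   # False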