879 results for Egocentric Constraint
Abstract:
The budget constraint has not hardened to equal degrees in the various post-socialist countries. In some of them, a great deal has been done in this respect, while in others there has been hardly any change from the initial state. This study surveys the typical manifestations of softness of the budget constraint, such as state subsidies, soft taxation, non-performing bank loans, the accumulation of trade arrears between firms, and the build-up of wage arrears. Softness of the budget constraint is caused by several factors that tend to act in combination. Thus retention of state ownership helps to preserve the soft budget-constraint syndrome, while privatization encourages the budget constraint to harden, although it does not form a sufficient condition for it to happen. Purposeful development of the requisite political, legal and economic conditions is also required. It was widely maintained at the outset of the post-socialist transition that the 'Holy Trinity' of liberalization, privatization and stabilization would suffice to produce an efficient market economy. Since then, it has become clear that hardening the budget constraint needs to be given equal priority with these three tasks. Otherwise, the effects of privatization will fall short of expectations, as they have in Russia, for example.
Abstract:
Research so far has examined mainly how the soft budget constraint syndrome appears in the corporate sphere and the credit system. This article concentrates on the hospital sector. It describes the motivations and the contradictory behaviour of the five main types of participant in the events: patients, doctors, hospital managers, politicians, and hospital owners. The motivations explain why the propensity to overspend and the tendency to soften the budget constraint are so strong. The burdens of overspending and indebtedness are pushed upwards at every level of the decision-making and funding processes. The article considers the connection between the soft budget constraint syndrome and the various forms of ownership (state ownership and the non-profit and for-profit forms of non-state ownership). Finally, the phenomenon is examined from the normative point of view: what are the favourable and unfavourable consequences of hardening the budget constraint, and how are these normative dilemmas reflected in the consciousness of the participants in the events?
Abstract:
The author's ideas on the soft budget constraint (SBC) were first expressed in 1976. Much progress has been made in understanding the problem over the ensuing four decades. The study takes issue with those who confine the concept to the process of bailing out loss-making socialist firms. It shows how the syndrome can appear in various organizations and forms in many spheres of the economy, and points to the various means available for financial rescue. Single bailouts do not as such generate the SBC syndrome. It develops where the SBC becomes built into expectations. Special heed is paid to features generated by the syndrome in rescuer and rescuee organizations. The study reports on the spread of the syndrome in various periods of the socialist and the capitalist system, and in various sectors. The author expresses his views on normative questions and on therapies against the harmful effects. He deals first with actual practice, then places the theory of the SBC in the sphere of ideas and models, showing how it relates to other theoretical trends, including institutional and behavioural economics and theories of moral hazard and time inconsistency. He shows how far the intellectual apparatus of the SBC has spread in the theoretical literature and where it stands in the process of "canonization" by the economics profession. Finally, he reviews the main research tasks ahead.
Abstract:
Public management reforms are usually underpinned by arguments that they will make the public administration system more effective and efficient. In practice, however, it is very hard to determine whether a given reform will improve the efficiency and effectiveness of the public administration system in the long run. Here, I shall examine how the concept of the soft budget constraint (SBC) introduced by János Kornai (Kornai 1979, 1986; Kornai, Maskin & Roland 2003) can be applied to this problem. In the following, I shall describe the Hungarian public administration reforms implemented by the Orbán government from 2010 onward and analyze its reforms, focusing on which measures harden and which ones soften the budget constraint of the actors of the Hungarian public administration system. In the literature of economics, there is some evidence-based knowledge on how to harden/soften the budget constraint, which improves/reduces the effectiveness and hence the efficiency of the given system. By using the concept of SBC, I also hope to shed some light on the rationale behind the Hungarian government’s introduction of such a contradictory reform package. Previously, the concept of SBC was utilized narrowly in public management studies, mostly in the field of fiscal federalism. My goal is to apply the concept to a broader area of public management studies. My conclusion is that the concept of SBC can significantly contribute to public management studies by deepening our knowledge on the reasons behind the success and failure of public administration reforms.
Abstract:
This paper introduces a screw-theory-based method, termed the constraint and position identification (CPI) approach, to synthesize decoupled spatial translational compliant parallel manipulators (XYZ CPMs) with consideration of actuation isolation. The proposed approach is based on a systematic arrangement of rigid stages and compliant modules in a three-legged XYZ CPM system using the constraint spaces and the position spaces of the compliant modules. The constraint spaces and the position spaces are first derived from screw theory rather than from rigid-body mechanism design experience. Additionally, the constraint spaces are classified into different constraint combinations, with typical position spaces depicted via geometric entities. Furthermore, the systematic synthesis process based on the constraint combinations and the geometric entities is demonstrated via several examples. Finally, several novel decoupled XYZ CPMs with monolithic configurations are created and verified by finite element analysis. The present CPI approach enables experts and beginners alike to synthesize a variety of decoupled XYZ CPMs with consideration of actuation isolation, by selecting an appropriate constraint and an optimal position for each of the compliant modules according to the specific application.
Abstract:
Numerous works have been conducted on modelling basic compliant elements such as wire beams, and closed-form analytical models of most basic compliant elements are well developed. However, modelling complex compliant mechanisms remains challenging. This paper proposes a constraint-force-based (CFB) modelling approach for compliant mechanisms, with a particular emphasis on complex ones. The proposed CFB modelling approach can be regarded as an improved free-body-diagram (FBD) based modelling approach, and can be extended toward the development of a screw-theory-based design approach. A compliant mechanism can be decomposed into rigid stages and compliant modules. A compliant module offers elastic forces due to its deformation. Such elastic forces are regarded as variable constraint forces in the CFB modelling approach. Additionally, the CFB modelling approach defines external forces applied to a compliant mechanism as constant constraint forces. If a compliant mechanism is at static equilibrium, all the rigid stages are also at static equilibrium under the influence of the variable and constant constraint forces. Therefore, the constraint-force equilibrium equations for all the rigid stages can be obtained, and the analytical model of the compliant mechanism can be derived from these equations. The CFB modelling approach can model a compliant mechanism linearly or nonlinearly, can obtain the displacements of any points on the rigid stages, and allows external forces to be exerted at any positions on the rigid stages. Compared with the FBD-based modelling approach, the CFB modelling approach does not need to identify the possible deformed configuration of a complex compliant mechanism in order to obtain the geometric compatibility conditions and the force equilibrium equations. Additionally, the mathematical expressions in the CFB approach have an easily understood physical meaning.
Using the CFB modelling approach, the variable constraint forces of three compliant modules, a wire beam, a four-beam compliant module and an eight-beam compliant module, have been derived in this paper. Based on these variable constraint forces, the linear and non-linear models of a decoupled XYZ compliant parallel mechanism are derived, and verified by FEA simulations and experimental tests.
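In the simplest one-degree-of-freedom linear case, the constraint-force balance reduces to a single equilibrium equation: the variable constraint forces −k·x from the compliant modules balance the constant external force F. The following minimal sketch illustrates that idea only; the function names and the scalar reduction are assumptions, not the paper's formulation.

```python
def stage_displacement(stiffnesses, external_force):
    """Solve sum_i(-k_i * x) + F = 0 for x: the variable constraint
    forces from each compliant module balance the constant constraint
    force F applied to the rigid stage (1-DOF linear illustration)."""
    return external_force / sum(stiffnesses)

def equilibrium_residual(stiffnesses, external_force, x):
    """Net force on the rigid stage; zero at static equilibrium."""
    return external_force - sum(k * x for k in stiffnesses)
```

For example, two modules of stiffness 2 and 3 N/mm under a 10 N load displace by 2 mm, at which point the residual force vanishes.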
Abstract:
This study investigates topology optimization of energy absorbing structures in which material damage is accounted for in the optimization process. The optimization objective is to design the lightest structures that are able to absorb the required mechanical energy. A structural continuity constraint check is introduced that is able to detect when no feasible load path remains in the finite element model, usually as a result of large scale fracture. This assures that designs do not fail when loaded under the conditions prescribed in the design requirements. This continuity constraint check is automated and requires no intervention from the analyst once the optimization process is initiated. Consequently, the optimization algorithm proceeds towards evolving an energy absorbing structure with the minimum structural mass that is not susceptible to global structural failure. A method is also introduced to determine when the optimization process should halt. The method identifies when the optimization method has plateaued and is no longer likely to provide improved designs if continued for further iterations. This provides the designer with a rational method to determine the necessary time to run the optimization and avoid wasting computational resources on unnecessary iterations. A case study is presented to demonstrate the use of this method.
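The continuity constraint check described above amounts to a reachability test over the surviving finite elements: if no path of intact elements connects the loaded nodes to the supports, the structure has failed globally. A minimal sketch of such a test follows; the graph representation and all names are assumptions for illustration, not the paper's implementation.

```python
from collections import deque

def load_path_exists(elements, failed, load_nodes, support_nodes):
    """Continuity check: after removing fractured elements, is any
    support node still reachable from a loaded node?
    `elements` maps element id -> set of mesh node ids it connects."""
    # Build node adjacency from the surviving elements only.
    adj = {}
    for eid, nodes in elements.items():
        if eid in failed:
            continue
        for a in nodes:
            for b in nodes:
                if a != b:
                    adj.setdefault(a, set()).add(b)
    # Breadth-first search from the loaded nodes.
    seen, queue = set(load_nodes), deque(load_nodes)
    while queue:
        n = queue.popleft()
        if n in support_nodes:
            return True
        for m in adj.get(n, ()):
            if m not in seen:
                seen.add(m)
                queue.append(m)
    return False
```

On a chain mesh 1–2–3–4 loaded at node 1 and supported at node 4, the check passes when all elements are intact and fails once the middle element fractures.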
Abstract:
Many countries have set challenging wind power targets to achieve by 2020. This paper implements a realistic analysis of curtailment and constraint of wind energy at a nodal level using a unit commitment and economic dispatch model of the Irish Single Electricity Market in 2020. The key findings show that significant reduction in curtailment can be achieved when the system non-synchronous penetration limit increases from 65% to 75%. For the period analyzed, this results in a decreased total generation cost and a reduction in the dispatch-down of wind. However, some nodes experience significant dispatch-down of wind, which can be in the order of 40%. This work illustrates the importance of implementing analysis at a nodal level for the purpose of power system planning.
Abstract:
This article reports the results of a survey of the pearl oyster industry in French Polynesia. Its purpose is to examine perceptions of the priorities for moving this industry towards sustainable development. These perceptions were captured through a survey of pearl oyster farmers and other stakeholders in the sector (management authorities, scientists). After describing the methodological protocol of these investigations, the article first confronts the priorities chosen by professionals (i.e. pearl farmers) concerning sustainable development with the perceptions of the other stakeholders in the sector. Second, it builds a typology of the pearl farmers' priorities concerning sustainable development. This analysis enables an assessment of the degree of convergence within the sector, which is the base material for defining a shared action plan at the territorial scale. This is the first study compiling survey data from various professionals and stakeholders of the pearl farming industry over such a large area of French Polynesia.
Abstract:
Context. In February-March 2014, the MAGIC telescopes observed the high-frequency peaked BL Lac 1ES 1011+496 (z=0.212) in a flaring state at very high energy (VHE, E>100 GeV). The flux reached a level more than 10 times higher than any previously recorded flaring state of the source. Aims. We describe the characteristics of the flare, presenting the light curve and the spectral parameters of the night-wise spectra and the average spectrum of the whole period. From these data we aim to detect the imprint of the Extragalactic Background Light (EBL) in the VHE spectrum of the source, in order to constrain its intensity in the optical band. Methods. We analyzed the gamma-ray data from the MAGIC telescopes using the standard MAGIC software to produce the light curve and the spectra. To constrain the EBL we implement the method developed by the H.E.S.S. collaboration, in which the intrinsic energy spectrum of the source is modeled with a simple function (< 4 parameters) and the EBL-induced optical depth is calculated using a template EBL model. The likelihood of the observed spectrum is then maximized, including a normalization factor for the EBL opacity among the free parameters. Results. The collected data allowed us to describe the flux changes night by night and to produce differential energy spectra for all nights of the observed period. The estimated intrinsic spectra of all the nights could be fitted by power-law functions. Evaluating the changes in the fit parameters, we conclude that the spectral shapes for most of the nights were compatible regardless of the flux level, which enabled us to produce an average spectrum from which the EBL imprint could be constrained. The likelihood ratio test shows that the model with an EBL density of 1.07 (-0.20, +0.24)_stat+sys, relative to the one in the tested EBL template (Domínguez et al. 2011), is preferred at the 4.6 σ level over the no-EBL hypothesis, under the assumption that the intrinsic source spectrum can be modeled as a log-parabola. This translates into a constraint on the EBL density in the wavelength range [0.24 μm, 4.25 μm], with a peak value at 1.4 μm of λF_λ = 12.27^(+2.75)_(-2.29) nW m^(-2) sr^(-1), including systematics.
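The spectral model underlying a fit of this kind can be sketched as an intrinsic spectrum attenuated by e^(-ατ(E)), where α is the EBL normalization left free in the likelihood maximization. The sketch below uses a simple power-law intrinsic shape for brevity (the abstract's final fit uses a log-parabola), and all names are illustrative assumptions.

```python
import math

def observed_flux(energy, n0, gamma, tau, alpha):
    """Observed VHE flux: an intrinsic power law n0 * E^-gamma attenuated
    by EBL absorption. tau(energy) is the optical depth from a template
    EBL model; alpha rescales the template's EBL density."""
    intrinsic = n0 * energy ** (-gamma)
    return intrinsic * math.exp(-alpha * tau(energy))
```

With α = 0 the observed spectrum reduces to the intrinsic one, which is the no-EBL hypothesis the likelihood ratio test compares against.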
Abstract:
Undergraduate project presented to Universidade Fernando Pessoa in partial fulfilment of the requirements for the Licentiate degree in Physiotherapy.
Abstract:
Declarative techniques such as Constraint Programming can be very effective in modeling and assisting management decisions. We present a method for managing university classrooms which extends the previous design of a Constraint-Informed Information System to generate timetables while dealing with spatial resource optimization issues. We seek to maximize space utilization along two dimensions: classroom use and occupancy rates. While we want to maximize the room use rate, we still need to satisfy the soft constraints which model students' and lecturers' preferences. We present a constraint logic programming-based local search method which relies on an evaluation function that combines room utilization and timetable soft preferences. Based on this, we developed a tool which we applied to improving classroom allocation at a university. Compared with the current timetables, obtained without optimizing space utilization, the initial versions of our tool manage to reach a 30% improvement in space utilization while preserving the quality of the timetable for both students and lecturers.
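An evaluation function of the kind described, blending room occupancy with soft-preference satisfaction, might look like this toy sketch. The weights, data shapes, and names are assumptions, not the paper's actual function.

```python
def timetable_score(assignments, rooms, w_util=0.5, w_pref=0.5):
    """Combine average occupancy rate with the fraction of satisfied
    soft preferences into a single score for local search.
    assignments: list of (class_size, room_id, pref_satisfied) tuples;
    rooms: dict mapping room_id -> capacity."""
    if not assignments:
        return 0.0
    util = sum(size / rooms[rid] for size, rid, _ in assignments) / len(assignments)
    pref = sum(1 for *_, ok in assignments if ok) / len(assignments)
    return w_util * util + w_pref * pref
```

A local search step would then accept a reassignment only if it raises this score, trading occupancy gains against preference violations.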
Abstract:
Solving a complex Constraint Satisfaction Problem (CSP) is a computationally hard task which may require a considerable amount of time. Parallelism has been applied successfully to the job, and there are already many applications capable of harnessing the parallel power of modern CPUs to speed up the solving process. Current Graphics Processing Units (GPUs), containing from a few hundred to a few thousand cores, possess a level of parallelism that surpasses that of CPUs, yet far fewer applications are capable of solving CSPs on GPUs, leaving space for further improvement. This paper describes work in progress on solving CSPs on GPUs, CPUs and other devices, such as Intel Many Integrated Cores (MICs), in parallel. It presents the gains obtained when applying more devices to solve some problems, and the main challenges that must be faced when using devices with architectures as different as CPUs and GPUs, with a particular focus on how to achieve good load balancing between such heterogeneous devices.
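One simple baseline for such load balancing is a static split of the search space in proportion to each device's measured throughput. The sketch below shows only that baseline; real heterogeneous balancers are usually dynamic (work stealing or adaptive chunking), and these names are illustrative assumptions.

```python
def split_search_space(total_subproblems, device_speeds):
    """Partition `total_subproblems` among devices proportionally to
    their measured relative speeds (static load-balancing sketch)."""
    speed_sum = sum(device_speeds)
    shares = [int(total_subproblems * v / speed_sum) for v in device_speeds]
    shares[0] += total_subproblems - sum(shares)  # rounding remainder
    return shares
```

For instance, a GPU measured at three times the CPU's solving rate would receive three quarters of the subproblems.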
Abstract:
A High-Performance Computing (HPC) job dispatcher is a critical piece of software that assigns the finite computing resources to submitted jobs. This resource assignment over time is known as the on-line job dispatching problem in HPC systems. Because the problem is on-line, solutions must be computed in real time, and the time they require must not exceed a threshold, so as not to affect normal system functioning. In addition, a job dispatcher must deal with considerable uncertainty: submission times, the number of requested resources, and the duration of jobs. Heuristic-based techniques have been used broadly in HPC systems, achieving (sub-)optimal solutions in a short time. However, their scheduling and resource allocation components are separated, which produces decoupled decisions that may cause a performance loss. Optimization-based techniques are used less for this problem, although they can significantly improve the performance of HPC systems, at the expense of higher computation time. Nowadays, HPC systems are used for modern applications, such as big data analytics and predictive model building, that in general employ many short jobs. However, this information is unknown at dispatching time, and job dispatchers need to process large numbers of jobs quickly while ensuring high Quality-of-Service (QoS) levels. Constraint Programming (CP) has been shown to be an effective approach to job dispatching problems. However, state-of-the-art CP-based job dispatchers are unable to meet the challenges of on-line dispatching, such as generating dispatching decisions in a short period and integrating current and past information about the hosting system. For these reasons, we propose CP-based dispatchers that are more suitable for HPC systems running modern applications, generating on-line dispatching decisions within an appropriate time and making effective use of job duration predictions to improve QoS levels, especially for workloads dominated by short jobs.
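The way duration predictions can shape a dispatching decision can be caricatured by a greedy shortest-predicted-first step. This is a heuristic sketch for illustration only, not the CP-based dispatcher the abstract proposes, and the data shapes and names are assumptions.

```python
def dispatch_step(waiting_jobs, free_cores):
    """One on-line dispatching step: start jobs in order of predicted
    duration (shortest first) while enough cores remain.
    waiting_jobs: list of (predicted_duration, cores_requested, job_id)."""
    started = []
    for duration, cores, job_id in sorted(waiting_jobs):
        if cores <= free_cores:
            free_cores -= cores
            started.append(job_id)
    return started, free_cores
```

Favoring predicted-short jobs keeps queue waiting times low for short-job-dominated workloads, which is precisely where mispredictions are cheapest to absorb.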