10 results for Dominant logic

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance: 20.00%

Abstract:

Interaction protocols establish how different computational entities can interact with each other. The interaction can be aimed at the exchange of data, as in 'communication protocols', or oriented to achieving some result, as in 'application protocols'. Moreover, with the increasing complexity of modern distributed systems, protocols are also used to control such complexity and to ensure that the system as a whole evolves with certain features. However, the extensive use of protocols has raised some issues, from the language for specifying them to the several verification aspects. Computational Logic provides models, languages and tools that can be effectively adopted to address such issues: its declarative nature can be exploited for a protocol specification language, while its operational counterpart can be used to reason upon such specifications. In this thesis we propose a proof-theoretic framework, called SCIFF, together with its extensions. SCIFF is based on Abductive Logic Programming and provides a formal specification language with a clear declarative semantics (based on abduction). The operational counterpart is given by a proof procedure that makes it possible to reason upon the specifications and to test the conformance of given interactions with respect to a defined protocol. Moreover, by suitably adapting the SCIFF framework, we propose solutions for addressing (1) the verification of protocol properties (g-SCIFF framework), and (2) the a-priori conformance verification of peers with respect to a given protocol (AlLoWS framework). We also introduce an agent-based architecture, the SCIFF Agent Platform, where the same protocol specification can be used to program the interacting peers and to ease their implementation task.
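
The idea of checking whether a given interaction fulfils the expectations raised by a protocol can be illustrated with a minimal sketch. The following Python fragment is only an illustration of expectation-style conformance checking over a finite trace; it is not the SCIFF language or its abductive proof procedure, and the event fields, rule and names are invented for the example.

from typing import Dict, List

def expectations_from(event: Dict) -> List[Dict]:
    # Hypothetical rule: a 'request' raises the expectation of a matching 'answer'.
    if event["type"] == "request":
        return [{"type": "answer", "id": event["id"]}]
    return []

def conformant(trace: List[Dict]) -> bool:
    """A trace conforms if every raised expectation is matched by a later event."""
    for i, event in enumerate(trace):
        for exp in expectations_from(event):
            matched = any(all(later.get(k) == v for k, v in exp.items())
                          for later in trace[i + 1:])
            if not matched:
                return False  # a pending expectation was never fulfilled
    return True

if __name__ == "__main__":
    ok = [{"type": "request", "id": 1}, {"type": "answer", "id": 1}]
    bad = [{"type": "request", "id": 2}]
    print(conformant(ok), conformant(bad))  # True False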

Relevance: 20.00%

Abstract:

Sustainable computer systems require some flexibility to adapt to unpredictable environmental changes. A solution lies in autonomous software agents, which can adapt autonomously to their environments. Though autonomy allows agents to decide which behavior to adopt, a disadvantage is a lack of control, and as a side effect even untrustworthiness: we want to keep some control over such autonomous agents. How can autonomous agents be controlled while respecting their autonomy? A solution is to regulate agents' behavior by norms. The normative paradigm makes it possible to control autonomous agents while respecting their autonomy, limiting untrustworthiness and augmenting system compliance. It can also facilitate the design of the system, for example by regulating the coordination among agents. However, an autonomous agent will follow norms or violate them under certain conditions. Under which conditions is a norm binding upon an agent? While autonomy is regarded as the driving force behind the normative paradigm, cognitive agents provide a basis for modeling the bindingness of norms. In order to cope with the complexity of modeling cognitive agents and normative bindingness, we adopt an intentional stance. Since agents are embedded in a dynamic environment, events may not occur at the same instant. Accordingly, our cognitive model is extended to account for some temporal aspects. Special attention is given to the temporal peculiarities of the legal domain such as, among others, the time in force and the time in efficacy of provisions. Some types of normative modifications are also discussed in the framework. It is noteworthy that our temporal account of legal reasoning is integrated with our commonsense temporal account of cognition. As our intention is to build sustainable reasoning systems running in unpredictable environments, we adopt a declarative representation of knowledge. A declarative representation of norms makes it easier to update their system representation, thus facilitating system maintenance, and improves system transparency, thus easing system governance. Since agents are bounded and embedded in unpredictable environments, and since conflicts may appear among mental states and norms, agent reasoning has to be defeasible, i.e. new pieces of information can invalidate formerly derivable conclusions. In this dissertation, our model is formalized in a non-monotonic logic, namely a temporal modal defeasible logic, in order to account for the interactions between normative systems and software cognitive agents.
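
The defeasibility just mentioned, where a new piece of information withdraws a previously derivable conclusion, can be shown with a toy example. The Python sketch below is not the temporal modal defeasible logic of the thesis; it only mimics one defeasible rule with one overriding exception, and the predicate names are invented.

def derivable(facts: set) -> set:
    """Defeasibly derive 'violation' from 'deadline_passed' unless the
    overriding fact 'extension_granted' is also known."""
    conclusions = set()
    if "deadline_passed" in facts and "extension_granted" not in facts:
        conclusions.add("violation")
    return conclusions

if __name__ == "__main__":
    print(derivable({"deadline_passed"}))                        # {'violation'}
    # new information defeats the previously derivable conclusion
    print(derivable({"deadline_passed", "extension_granted"}))   # set()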

Relevance: 20.00%

Abstract:

The elusive fiction of J. M. Coetzee is not a body of work in which one can read fixed ethical stances. I suggest testing the potentialities of a logic based on frames and double binds in Coetzee's novels. A double bind is a dilemma in communication which consists of two conflicting messages, with the result that one cannot successfully respond to either. Jacques Derrida highlighted the strategic value of a way of thinking based on the double bind (and on frames as well), which makes it possible to escape binary thinking and thus opens an ethical space, where one can make a choice outside a set of fixed rules and take responsibility for it. In Coetzee's fiction the author himself can be considered to be in a double bind, seeing that he is a white South African writer who feels that his "task" cannot be as simple as choosing to represent faithfully the violence and racism of apartheid, or choosing to give a voice to the oppressed. Good intentions alone do not ensure protection against entering unwittingly into complicity with the dominant discourse, and this is why it is important to make the frame in which one is always situated clearly visible and explicit. The logic of the double bind also becomes the way in which moral problems are staged in Coetzee's fiction: the opportunity to give a voice to the oppressed through the same language which has been co-opted to serve the cause of oppression, a never-completed relation with otherness, or the representability of evil in literature, of the secret, and of the paradoxical implications of confession and forgiveness.

Relevance: 20.00%

Abstract:

Several activities were carried out during my PhD. For the NEMO experiment, a collaboration between the INFN/University groups of Catania and Bologna led to the development and production of a mixed-signal acquisition board for the NEMO Km3 telescope. The research concerned the feasibility study of an acquisition technique quite different from that adopted in the NEMO Phase 1 telescope. The DAQ board that we realized exploits the LIRA06 front-end chip for the analog acquisition of anodic and dynodic sources of a PMT (Photo-Multiplier Tube). The low-power analog acquisition makes it possible to sample multiple channels of the PMT simultaneously at different gain factors, in order to increase the linearity of the signal response over a wider dynamic range. The auto-triggering and self-event-classification features also help to improve the acquisition performance and the knowledge of the neutrino event. A fully functional interface towards the first-level data concentrator, the Floor Control Module, has been integrated on the board as well, and specific firmware has been realized to comply with the present communication protocols. This stage of the project foresees the use of an FPGA, a high-speed configurable device, to provide the board with a flexible digital logic control core. After the validation of the whole front-end architecture, this feature will probably be integrated in a common mixed-signal ASIC (Application Specific Integrated Circuit). The volatile nature of the configuration memory of the FPGA implied the integration of a flash ISP (In System Programming) memory and a smart architecture for its safe remote reconfiguration. All the integrated features of the board have been tested. At the Catania laboratory the behavior of the LIRA chip has been investigated in the digital environment of the DAQ board, and we succeeded in driving the acquisition with the FPGA. The PMT pulses generated with an arbitrary waveform generator were correctly triggered and acquired by the analog chip, and then digitized by the on-board ADC under the supervision of the FPGA. For the communication towards the data concentrator, a test bench was set up in Bologna where, thanks to equipment lent by the University of Roma and INFN, a full readout chain equivalent to that of NEMO Phase 1 was installed. These tests showed a good behavior of the digital electronics, which was able to receive and execute commands issued by the PC console and to answer back with a reply. The remotely configurable logic behaved well too and demonstrated, at least in principle, the validity of this technique. A new prototype board is now under development at the Catania laboratory as an evolution of the one described above. This board is going to be deployed within the NEMO Phase 2 tower, in one of its floors dedicated to new front-end proposals. It will integrate a new analog acquisition chip called SAS (Smart Auto-triggering Sampler), thus introducing a new analog front-end but inheriting most of the digital logic present in the current DAQ board discussed in this thesis. As for the activity on high-resolution vertex detectors, I worked within the SLIM5 collaboration on the characterization of a MAPS (Monolithic Active Pixel Sensor) device called APSEL-4D. The chip is a matrix of 4096 active pixel sensors with deep N-well implantations meant for charge collection and for shielding the analog electronics from digital noise.
The chip integrates the full-custom sensor matrix and the sparsification/readout logic, realized with standard cells in 130 nm STM CMOS technology. For the chip characterization a test beam was set up on the 12 GeV PS (Proton Synchrotron) line at CERN, Geneva (CH). The collaboration prepared a silicon strip telescope and a DAQ system (hardware and software) for data acquisition and control of the telescope, which made it possible to store about 90 million events in 7 equivalent days of beam live-time. My activities mainly concerned the realization of a firmware interface to and from the MAPS chip in order to integrate it into the general DAQ system. Thereafter I worked on the DAQ software to implement a proper Slow Control interface for the APSEL4D. Several APSEL4D chips with different thinnings were tested during the test beam. Those thinned to 100 and 300 um showed an overall efficiency of about 90% with a threshold of 450 electrons. The test beam also made it possible to estimate the resolution of the pixel sensor, providing good results consistent with the pitch/sqrt(12) formula. The MAPS intrinsic resolution has been extracted from the width of the residual plot, taking into account the multiple scattering effect.
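
The last step can be sketched as follows. This Python fragment is not the collaboration's analysis code; it only illustrates the standard procedure of comparing the binary-readout expectation pitch/sqrt(12) with an intrinsic resolution obtained by subtracting, in quadrature, telescope pointing and multiple-scattering contributions from the measured residual width. All numerical values are illustrative placeholders.

import math

def binary_resolution(pitch_um: float) -> float:
    """Expected resolution of a binary-readout pixel with the given pitch."""
    return pitch_um / math.sqrt(12.0)

def intrinsic_resolution(sigma_residual_um: float,
                         sigma_telescope_um: float,
                         sigma_ms_um: float) -> float:
    """Quadratic subtraction of telescope pointing and multiple-scattering
    terms from the width of the residual distribution."""
    arg = sigma_residual_um**2 - sigma_telescope_um**2 - sigma_ms_um**2
    if arg < 0:
        raise ValueError("subtracted terms exceed the measured residual width")
    return math.sqrt(arg)

if __name__ == "__main__":
    pitch = 50.0  # um, placeholder pixel pitch
    print(f"pitch/sqrt(12)       = {binary_resolution(pitch):.1f} um")
    print(f"intrinsic resolution = "
          f"{intrinsic_resolution(16.0, 5.0, 4.0):.1f} um")  # placeholder widths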

Relevance: 20.00%

Abstract:

The advent of distributed and heterogeneous systems has laid the foundation for the birth of new architectural paradigms, in which many separate and autonomous entities collaborate and interact with the aim of achieving complex strategic goals that would be impossible to accomplish on their own. A non-exhaustive list of systems targeted by such paradigms includes Business Process Management, Clinical Guidelines and Careflow Protocols, Service-Oriented and Multi-Agent Systems. It is largely recognized that engineering these systems requires novel modeling techniques. In particular, many authors claim that an open, declarative perspective is needed to complement the closed, procedural nature of state-of-the-art specification languages. For example, the ConDec language has recently been proposed to target the declarative and open specification of Business Processes, overcoming the over-specification and over-constraining issues of classical procedural approaches. On the one hand, the success of such novel modeling languages strongly depends on their usability by non-IT-savvy users: they must provide an appealing, intuitive graphical front-end. On the other hand, they must be amenable to verification, in order to guarantee the trustworthiness and reliability of the developed model, as well as to ensure that the actual executions of the system effectively comply with it. In this dissertation, we claim that Computational Logic is a suitable framework for dealing with the specification, verification, execution, monitoring and analysis of these systems. We propose to adopt an extended version of the ConDec language for specifying interaction models with a declarative, open flavor. We show how all the (extended) ConDec constructs can be automatically translated to the CLIMB Computational Logic-based language, and illustrate how its corresponding reasoning techniques can be successfully exploited to provide support and verification capabilities along the whole life cycle of the targeted systems.
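
The flavor of such declarative constraints can be conveyed with a minimal sketch. The Python fragment below checks a ConDec-style "response(a, b)" constraint (every occurrence of activity a must eventually be followed by an occurrence of activity b) against a finite execution trace; it is only an illustration, not the CLIMB language or its reasoning techniques, and the activity names are invented.

from typing import Sequence

def response_holds(trace: Sequence[str], a: str, b: str) -> bool:
    """True if every occurrence of 'a' is followed, later in the trace, by 'b'."""
    pending = False
    for event in trace:
        if event == a:
            pending = True    # an obligation for b is now open
        elif event == b:
            pending = False   # the open obligation is discharged
    return not pending

if __name__ == "__main__":
    print(response_holds(["register", "pay", "ship"], "pay", "ship"))  # True
    print(response_holds(["register", "pay"], "pay", "ship"))          # False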

Relevance: 20.00%

Abstract:

In recent years an ever-increasing degree of automation has been observed in most industrial automation processes. This increase is motivated by the higher demand for systems with high performance in terms of the quality of the products/services generated, productivity, efficiency and low costs in design, realization and maintenance. This trend towards complex automation systems is rapidly spreading over automated manufacturing systems (AMS), where the integration of mechanical and electronic technology, typical of mechatronics, is merging with other technologies such as informatics and communication networks. An AMS is a very complex system that can be thought of as constituted by a set of flexible working stations and one or more transportation systems. To understand how important these machines are in our society, consider that every day most of us use bottles of water or soda, or buy boxed products such as food or cigarettes. Another indication of this complexity is that the consortium of machine producers has estimated around 350 types of manufacturing machines. A large number of manufacturing machine industries are present in Italy, notably packaging machine industries; a great concentration of this kind of industry is located in the Bologna area, which for this reason is called the "packaging valley". Usually, the various parts of an AMS interact in a concurrent and asynchronous way, and coordinating the parts of the machine to obtain a desired overall behaviour is a hard task. Often this is the case in large-scale systems, organized in a modular and distributed manner. Even if the success of a modern AMS from a functional and behavioural point of view is still to be attributed to the design choices made in the definition of the mechanical structure and of the electrical/electronic architecture, the system that governs the control of the plant is becoming crucial, because of the large number of duties assigned to it. Apart from the activity inherent to the automation of the machine cycles, the supervisory system is called upon to perform other main functions, such as: emulating the behaviour of traditional mechanical members, thus allowing a drastic constructive simplification of the machine and a crucial functional flexibility; dynamically adapting the control strategies according to the different productive needs and operational scenarios; obtaining a high quality of the final product through the verification of the correctness of the processing; guiding the machine operator to promptly and carefully take the actions needed to establish or restore the optimal operating conditions; managing in real time information on diagnostics, as a support for the maintenance operations of the machine. The facilities that designers can find directly on the market, in terms of software component libraries, in fact provide adequate support for the implementation of either top-level or bottom-level functionalities, typically pertaining to the domains of user-friendly HMIs, closed-loop regulation and motion control, and fieldbus-based interconnection of remote smart devices.
What is still lacking is a reference framework comprising a comprehensive set of highly reusable logic control components that, by focussing on the cross-cutting functionalities characterizing the automation domain, may help designers in the process of modelling and structuring their applications according to their specific needs. Historically, the design and verification process for complex automated industrial systems has been performed in an empirical way, without a clear distinction between functional and technological-implementation concepts and without a systematic method to deal organically with the complete system. Traditionally, in the field of analog and digital control, design and verification through formal and simulation tools have long been adopted, at least for multivariable and/or nonlinear controllers for complex time-driven dynamics, as in the fields of vehicles, aircraft, robots, electric drives and complex power electronics equipment. Moving to the field of logic control, typical of industrial manufacturing automation, the design and verification process is approached in a completely different way, usually very "unstructured". No clear distinction between functions and implementations, or between functional architectures and technological architectures and platforms, is considered. This difference is probably due to the different "dynamical framework" of logic control with respect to analog/digital control. As a matter of fact, in logic control discrete-event dynamics replace time-driven dynamics; hence most of the formal and mathematical tools of analog/digital control cannot be directly migrated to logic control to highlight the distinction between functions and implementations. In addition, in the common view of application technicians, logic control design is strictly connected to the adopted implementation technology (relays in the past, software nowadays), leading again to a deep confusion between the functional view and the technological view. In software engineering for industrial automation, concepts such as modularity, encapsulation, composability and reusability are strongly emphasized and profitably realized in the so-called object-oriented methodologies. Industrial automation has lately been adopting this approach, as testified by IEC standards such as IEC 61131-3 and IEC 61499, which have been considered in commercial products only recently. On the other hand, in the scientific and technical literature many contributions have already been proposed to establish a suitable modelling framework for industrial automation. In recent years it has been possible to note a considerable growth in the exploitation of innovative concepts and technologies from the ICT world in industrial automation systems. As far as logic control design is concerned, Model Based Design (MBD) is being imported into industrial automation from the software engineering field. Another key point in industrial automated systems is the growth of requirements in terms of availability, reliability and safety for technological systems. In other words, the control system should not only deal with the nominal behaviour, but should also deal with other important duties, such as diagnosis and fault isolation, recovery and safety management. Indeed, along with high performance, fault occurrences increase in complex systems.
This is a consequence of the fact that, as typically occurs in reliable mechatronic systems, in complex systems such as AMS an increasing number of electronic devices are present alongside reliable mechanical elements, and these devices are more vulnerable by their own nature. The diagnosis and fault isolation problem in a generic dynamical system consists in the design of a processing unit that, by appropriately processing the inputs and outputs of the dynamical system, is capable of detecting incipient faults in the plant devices and of reconfiguring the control system so as to guarantee satisfactory performance. The designer should be able to formally verify the product, certifying that, in its final implementation, it will perform its required function while guaranteeing the desired level of reliability and safety; the next step is to prevent faults and, if necessary, reconfigure the control system so that faults are tolerated. On this topic, important improvements to formal verification of logic control, fault diagnosis and fault-tolerant control derive from Discrete Event Systems theory. The aim of this work is to define a design pattern and a control architecture to help the designer of control logic in industrial automated systems. The work starts with a brief discussion of the main characteristics and a description of industrial automated systems in Chapter 1. In Chapter 2 a survey of the state of the art of the software engineering paradigm applied to industrial automation is presented. Chapter 3 presents an architecture for industrial automated systems based on the new concept of the Generalized Actuator, showing its benefits, while in Chapter 4 this architecture is refined using a novel entity, the Generalized Device, in order to achieve better reusability and modularity of the control logic. In Chapter 5 a new approach based on Discrete Event Systems is presented for the problem of software formal verification, together with an active fault-tolerant control architecture using online diagnostics. Finally, conclusive remarks and some ideas on new directions to explore are given. Appendix A briefly reports some concepts and results about Discrete Event Systems which should help the reader understand some crucial points in Chapter 5, while Appendix B gives an overview of the experimental testbed of the Laboratory of Automation of the University of Bologna, used to validate the approach presented in Chapters 3, 4 and 5. Appendix C reports some component models used in Chapter 5 for formal verification.
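
The kind of check enabled by a Discrete Event Systems model can be conveyed with a minimal sketch. The Python fragment below models a logic control behaviour as a finite-state automaton and performs a simple reachability check for a forbidden state; it is only an illustration under invented states and events, not the verification framework developed in the thesis.

from collections import deque

# transitions[state][event] -> next state (hypothetical machine states/events)
transitions = {
    "idle":    {"start": "running"},
    "running": {"stop": "idle", "fault": "alarm"},
    "alarm":   {"reset": "idle"},
}
forbidden = {"alarm_unhandled"}  # hypothetical unsafe state

def reachable_states(initial: str) -> set:
    """Breadth-first exploration of the DES state space."""
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        for nxt in transitions.get(state, {}).values():
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

if __name__ == "__main__":
    reached = reachable_states("idle")
    print("reachable:", sorted(reached))
    print("no forbidden state reachable:", reached.isdisjoint(forbidden))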

Relevance: 20.00%

Abstract:

The mitochondrion is an essential cytoplasmic organelle that provides most of the energy necessary for eukaryotic cell physiology. Mitochondrial structure and functions are maintained by proteins of both mitochondrial and nuclear origin. These organelles are organized in an extended network that dynamically fuses and divides. Mitochondrial morphology results from the equilibrium between fusion and fission processes, controlled by a family of "mitochondria-shaping" proteins. It is becoming clear that defects in mitochondrial dynamics can impair mitochondrial respiration, morphology and motility, leading to apoptotic cell death in vitro and to more or less severe neurodegenerative disorders in vivo in humans. Mutations in OPA1, a nuclear-encoded mitochondrial protein, cause autosomal Dominant Optic Atrophy (DOA), a heterogeneous blinding disease characterized by retinal ganglion cell degeneration leading to optic neuropathy (Delettre et al., 2000; Alexander et al., 2000). OPA1 is a mitochondrial dynamin-related guanosine triphosphatase (GTPase) protein involved in mitochondrial network dynamics, cytochrome c storage and apoptosis. This protein is anchored to or associated with the inner mitochondrial membrane, facing the intermembrane space. Eight OPA1 isoforms resulting from alternative splicing combinations of exons 4, 4b and 5b have been described (Delettre et al., 2001). These variants vary greatly among organs, and the presence of specific isoforms has been associated with various mitochondrial functions. The differently spliced exons encode domains included in the amino-terminal region and contribute to determining OPA1 functions (Olichon et al., 2006). It has been shown that exon 4, which is conserved throughout evolution, confers on OPA1 functions involved in the maintenance of the mitochondrial membrane potential and in the fusion of the network. Conversely, exons 4b and 5b, which are vertebrate specific, are involved in the regulation of cytochrome c release from mitochondria and in the activation of apoptosis, a process restricted to vertebrates (Olichon et al., 2007). While Mgm1p was identified thanks to its role in mtDNA maintenance, OPA1 has only recently been linked to mtDNA stability. Missense mutations in OPA1 cause the accumulation of multiple deletions in skeletal muscle. The syndrome associated with these mutations (DOA-1 plus) is complex, consisting of a combination of dominant optic atrophy, progressive external ophthalmoplegia, peripheral neuropathy, ataxia and deafness (Amati-Bonneau et al., 2008; Hudson et al., 2008). OPA1 is the fifth gene associated with mtDNA "breakage syndrome", together with ANT1, PolG1-2 and TYMP (Spinazzola et al., 2009). In this thesis we show for the first time that specific OPA1 isoforms associated with exon 4b are important for mtDNA stability, by anchoring the nucleoids to the inner mitochondrial membrane. Our results clearly demonstrate that OPA1 isoforms including exon 4b are intimately associated with the maintenance of the mitochondrial genome, as their silencing leads to mtDNA depletion. The mechanism leading to mtDNA loss is associated with replication inhibition in cells where the exon 4b-containing isoforms were down-regulated. Furthermore, silencing of the exon 4b-associated isoforms is responsible for an alteration in the distribution of mtDNA nucleoids in the mitochondrial network.
This study showed that the OPA1 exon 4b isoform is cleaved to provide a 10 kDa peptide, embedded in the inner membrane by a second transmembrane domain, that seems to be crucial for mitochondrial genome maintenance and corresponds to the second transmembrane domain of the yeast orthologues encoded by MGM1 or Msp1, which is also mandatory for this process (Diot et al., 2009; Herlan et al., 2003). Furthermore, in this thesis we show that the NT-OPA1-exon 4b peptide co-immunoprecipitates with mtDNA and specifically interacts with two major components of the mitochondrial nucleoids: polymerase gamma and Tfam. Thus, the conclusion from these experiments is that the NT-OPA1-exon 4b peptide contributes to anchoring the nucleoids to the inner mitochondrial membrane, a process that is required for the initiation of mtDNA replication and for the distribution of nucleoids along the network. These data provide crucial new insights into the mechanism involved in the maintenance of mtDNA integrity, because they clearly demonstrate that, besides the genes implicated in mtDNA replication (i.e. polymerase gamma, Tfam, Twinkle and genes involved in nucleotide pool metabolism), OPA1 and mitochondrial membrane dynamics also play an important role. Notably, the effect on mtDNA differs depending on the specific OPA1 isoforms down-regulated, suggesting the involvement of two different combined mechanisms. Over two hundred OPA1 mutations, spread throughout the coding region of the gene, have been described to date, including substitutions, deletions and insertions. Some mutations are predicted to generate a truncated protein inducing haploinsufficiency, whereas the missense nucleotide substitutions result in amino acid changes affecting conserved positions of the OPA1 protein. So far, the functional consequences of OPA1 mutations in cells from DOA patients are poorly understood. Phosphorus MR spectroscopy in patients with the c.2708delTTAG deletion revealed a defect in oxidative phosphorylation in muscle (Lodi et al., 2004). An energetic impairment has also been shown in fibroblasts with the severe OPA1 R445H mutation (Amati-Bonneau et al., 2005). Our group previously reported that OPA1 mutations leading to haploinsufficiency are associated in fibroblasts with an oxidative phosphorylation dysfunction, mainly involving respiratory complex I (Zanna et al., 2008). In this study we evaluated the energetic efficiency of a panel of skin fibroblasts derived from DOA patients, five fibroblast cell lines with OPA1 mutations causing haploinsufficiency (DOA-H) and two cell lines bearing missense amino acid substitutions (DOA-AA), and compared them with control fibroblasts. Although both types of DOA fibroblasts maintained a similar ATP content when incubated in a glucose-free medium, i.e. when forced to rely on oxidative phosphorylation alone to produce ATP, the mitochondrial ATP synthesis through complex I, measured in digitonin-permeabilized cells, was significantly reduced only in cells with OPA1 haploinsufficiency, whereas it was similar to controls in cells with the missense substitutions. Furthermore, evaluation of the mitochondrial membrane potential (ΔΨm) in the two DOA-AA fibroblast lines and in two DOA-H fibroblast lines, namely those bearing the c.2819-2A>C mutation and the c.2708delTTAG microdeletion, revealed an anomalous depolarizing response to oligomycin in the DOA-H cell lines only.
This finding clearly supports the hypothesis that these mutations cause a significant alteration in respiratory chain function, which can be unmasked only when the operation of the ATP synthase is prevented. Notably, oligomycin-induced depolarization in these cells was almost completely prevented by preincubation with cyclosporin A, a well-known inhibitor of the permeability transition pore (PTP). This result is very important because it suggests for the first time that the voltage threshold for PTP opening is altered in DOA-H fibroblasts. Although this issue has not yet been addressed in the present study, several mechanisms have been proposed to lead to PTP deregulation, including in particular increased production of reactive oxygen species and altered Ca2+ homeostasis, whose role in PTP opening in DOA fibroblasts is currently under investigation. Identification of the mechanisms leading to an altered threshold for PTP regulation will not only further our understanding of the pathophysiology of DOA, but also provide a strategy for therapeutic intervention.

Relevance: 20.00%

Abstract:

It has been almost fifty years since Harry Eckstein's classic monograph, A Theory of Stable Democracy (Princeton, 1961), in which he sketched out the basic tenets of "congruence theory", which was to become one of the most important and innovative contributions to understanding democratic rule. His next work, Division and Cohesion in Democracy (Princeton University Press, 1966), is designed to serve as a plausibility probe for this 'theory' (ftn.) and is a case study of a Northern democratic system, Norway. What is more, this line of his work best exemplifies the contribution Eckstein brought to the methodology of comparative politics through his seminal article, "Case Study and Theory in Political Science" (in Greenstein and Polsby, eds., Handbook of Political Science, 1975), on the importance of the case study as an approach to empirical theory. This article demonstrates the special utility of "crucial case studies" in testing theory, thereby undermining the accepted wisdom in comparative research that the larger the number of cases the better. Although not along the same lines, but shifting the case study unit of research, I intend to take up the challenge and build upon an equally unique political system, the Swedish one. Bearing in mind the peculiarities of the Swedish political system, my unit of analysis is further restricted to the Swedish Social Democratic Party, the Svenska Arbetare Partiet. However, my research stays within the methodological framework of case study theory inasmuch as it focuses on a single political system and party. SAP's endurance in government office and its electoral success throughout half a century (ftn. As of the 1991 election, there were about 56 years - more than half a century - of uninterrupted social democratic "reign" in Sweden.) are undeniably a performance no other Social Democratic party has yet achieved in democratic conditions. Therefore, it is legitimate to inquire about the exceptionality of this unique political power combination. What were the different components of this dominant power position that made SAP's governmental office stamina possible? I will argue here that it was the end product of a combination of multifarious factors, such as a key position in the party system, strong party leadership and organization, and a carefully designed strategy regarding class politics and welfare policy. My research is divided into three main parts: the historical overview, the 'welfare' part and the 'environment' part. The first part is a historical account of the main political events and issues relevant to my case study. Chapter 2 is devoted to the historical events unfolding in the 1920-1960 period: the Saltsjoebaden Agreement, the series of workers' strikes in the 1920s and SAP's inception. It describes SAP's ascent to power in the mid-1930s and the party's ensuing strategies for winning and keeping political office, that is, its economic program and key economic goals. The following chapter - Chapter 3 - explores the next period, from the 1960s to the 1990s, and covers the party's troubled political times, its peak and the beginnings of its decline. The 1960s are relevant for SAP's planning of a long-term economic strategy - the Rehn-Meidner model, a new way of macroeconomic steering, based on the Keynesian model but adapted to the new economic realities of welfare capitalist societies.
The second and third parts of this study develop several hypotheses related to SAP's 'dominant position' (endurance in politics and in office) and then test them. Mainly, the twin issues of economics and the environment are raised and their political relevance for the party analyzed. On one hand, globalization and its spillover effects on the Swedish welfare system are important causal factors in explaining the transformative social-economic challenges the party had to cope with. On the other hand, Europeanization and environmental change greatly influenced SAP's foreign policy choices and its domestic electoral strategies. The implications of globalization for the Swedish welfare system are the subject of two chapters - chapters four and five - whereas the consequences of Europeanization are treated at length in the third part of this work - chapters six and seven. At first sight, the link between foreign policy and electoral strategy is difficult to prove and unusual, to say the least. However, in SAP's case there is a large body of literature and public opinion statistical data showing that governmental domestic policy and party politics depend tightly on foreign policy decisions and sovereignty issues. Again, these country characteristics and peculiar causal relationships are outlined in the first chapters and explained in the second and third parts. The sixth chapter explores the presupposed relationship between Europeanization and environmental policy, on one hand, and SAP's environmental policy formulation and simultaneous agenda-setting at the international level, on the other. This chapter describes Swedish leadership in environmental policy formulation on two simultaneous fronts and across two different time spans. The last chapter, chapter eight, while drawing conclusions, explores the alternative theories plausible for explaining the outlined hypotheses and points out the reasons why these theories do not fit as valid alternative explanations to my systemic corporatism thesis as the main causal factor determining SAP's 'dominant position'. Among the alternative theories, I consider L. Traedgaardh's and Bo Rothstein's historical exceptionalism thesis and the public opinion thesis, which alone are not able to explain the half-century of social democratic endurance in government in the Swedish case.

Relevance: 20.00%

Abstract:

The question from which this research arises concerns the possibility of identifying, against the grain of neoliberal logic, strategies for the affirmation of a culture of development that is environmentally sustainable and respectful of people's dignity, able to value their differences and to take on the difficulties each person may encounter in the course of his or her existence. Central is the theme of work, a decisive aspect of the conditions of social belonging and of the valorization of human resources. The thesis reviews studies of the reality in which we are immersed, characterized by free-market thinking that has become dominant on a global scale in recent decades and that has brought about a conception of social relations based on exasperated competitiveness and on the exclusion of those who cannot keep up with the laws of the market: the dramatic consequences of the neoliberal deception; the reduction of people to consumers; the flight from community and the refuge in separate identities; the time of risk, of fear and of the separation between ethics and business. And, against this trend, it reviews the studies that open research perspectives on an inclusive and humanizing development: the perspectives of degrowth, of social business, of a Christian path towards a just economy, and of the valorization of the capabilities of human resources. The thesis then investigates the links with experiences active in the territory of the city of Bologna that promote, through collaboration among institutions, intermediate organizations and citizens, occasions of a community welfare that develops competences and rights together with responsibilities: the introduction of social clauses in public procurement for the professional fulfillment of disadvantaged persons; the promotion of corporate social responsibility for social and occupational inclusion; and the valorization of the resources of people living through the experience of imprisonment. These are still limited experiences, but they can constitute a cultural and operational reference for a possible model of development that benefits everyone, compatible with environmental limits and humanizing.