29 results for "Almost always propositional logic"
Abstract:
The elusive fiction of J. M. Coetzee is not a body of work in which one can read fixed ethical stances. I suggest testing the potentialities of a logic based on frames and double binds in Coetzee's novels. A double bind is a dilemma in communication which consists of two conflicting messages, with the result that one cannot successfully respond to either. Jacques Derrida highlighted the strategic value of a way of thinking based on the double bind (but on frames as well), which makes it possible to escape binary thinking and so opens an ethical space, where one can make a choice outside a set of fixed rules and take responsibility for it. In Coetzee's fiction the author himself can be considered to be in a double bind, seeing that he is a white South African writer who feels that his "task" cannot be as simple as choosing to represent faithfully the violence and racism of apartheid, or choosing to give a voice to the oppressed. Good intentions alone do not ensure protection against entering unwittingly into complicity with the dominant discourse, and this is why it is important to make the frame in which one is always situated clearly visible and explicit. The logic of the double bind becomes the way in which moral problems are staged in Coetzee's fiction as well: the opportunity to give a voice to the oppressed through the same language which has been co-opted to serve the cause of oppression, a relation with otherness that is never completed, or the representability in literature of evil, of the secret, and of the paradoxical implications of confession and forgiveness.
Abstract:
Several activities were conducted during my PhD. For the NEMO experiment, a collaboration between the INFN/University groups of Catania and Bologna led to the development and production of a mixed-signal acquisition board for the NEMO Km3 telescope. The research concerned the feasibility study of an acquisition technique quite different from that adopted in the NEMO Phase 1 telescope. The DAQ board that we realized exploits the LIRA06 front-end chip for the analog acquisition of the anodic and dynodic sources of a PMT (Photo-Multiplier Tube). The low-power analog acquisition makes it possible to sample multiple channels of the PMT simultaneously at different gain factors, in order to increase the linearity of the signal response over a wider dynamic range. The auto-triggering and self-event-classification features also help to improve the acquisition performance and the knowledge of the neutrino event. A fully functional interface towards the first-level data concentrator, the Floor Control Module, has been integrated on the board as well, and a specific firmware has been realized to comply with the present communication protocols. This stage of the project foresees the use of an FPGA, a high-speed configurable device, to provide the board with a flexible digital logic control core. After the validation of the whole front-end architecture, this feature will probably be integrated in a common mixed-signal ASIC (Application Specific Integrated Circuit). The volatile nature of the FPGA's configuration memory required the integration of a flash ISP (In System Programming) memory and a smart architecture for its safe remote reconfiguration. All the integrated features of the board have been tested. At the Catania laboratory the behavior of the LIRA chip was investigated in the digital environment of the DAQ board, and we succeeded in driving the acquisition with the FPGA.
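The safe remote reconfiguration mentioned above rests on a simple principle that can be sketched in a few lines: keep a known-good image available and fall back to it whenever an uploaded image fails verification. The following Python sketch illustrates only the idea; the function name and the CRC-based integrity check are illustrative assumptions, not the board's actual firmware logic.

```python
import zlib

def select_bitstream(update: bytes, update_crc: int, golden: bytes) -> bytes:
    """Pick the FPGA image to load: the freshly uploaded one if its
    checksum matches, otherwise the known-good 'golden' fallback."""
    if zlib.crc32(update) == update_crc:
        return update    # new image verified: configure from it
    return golden        # corrupted upload: configure from the safe copy

golden = b"golden-image"
update = b"new-image"
print(select_bitstream(update, zlib.crc32(update), golden) == update)  # → True
print(select_bitstream(update, 0, golden) == golden)                   # → True
```

The point of the scheme is that a failed or interrupted upload can never leave the board unbootable, which is essential for a device deployed on the sea floor.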
The PMT pulses generated with an arbitrary waveform generator were correctly triggered and acquired by the analog chip, and subsequently digitized by the on-board ADC under the supervision of the FPGA. For the communication towards the data concentrator, a test bench was set up in Bologna where, thanks to a loan from the University of Roma and INFN, a full readout chain equivalent to that present in NEMO Phase 1 was installed. These tests showed good behavior of the digital electronics, which were able to receive and execute commands issued by the PC console and to answer back with a reply. The remotely configurable logic behaved well too and demonstrated, at least in principle, the validity of this technique. A new prototype board is now under development at the Catania laboratory as an evolution of the one described above. This board is going to be deployed within the NEMO Phase 2 tower, in one of its floors dedicated to new front-end proposals. It will integrate a new analog acquisition chip called SAS (Smart Auto-triggering Sampler), thus introducing a new analog front-end while inheriting most of the digital logic present in the current DAQ board discussed in this thesis. As regards the activity on high-resolution vertex detectors, I worked within the SLIM5 collaboration on the characterization of a MAPS (Monolithic Active Pixel Sensor) device called APSEL-4D. This chip is a matrix of 4096 active pixel sensors with deep N-well implantations, meant for charge collection and for shielding the analog electronics from digital noise. The chip integrates the full-custom sensor matrix and the sparsification/readout logic realized with standard cells in 130 nm STM CMOS technology. For the chip characterization, a test beam was set up on the 12 GeV PS (Proton Synchrotron) line facility at CERN in Geneva (CH).
The collaboration prepared a silicon strip telescope and a DAQ system (hardware and software) for data acquisition and control of the telescope, which made it possible to store about 90 million events in 7 equivalent days of beam live-time. My activities basically concerned the realization of a firmware interface towards and from the MAPS chip, in order to integrate it into the general DAQ system. Thereafter I worked on the DAQ software to implement a proper Slow Control interface for the APSEL4D. Several APSEL4D chips with different thinnings were tested during the test beam. Those thinned to 100 and 300 um showed an overall efficiency of about 90% with a threshold of 450 electrons. The test beam also allowed the resolution of the pixel sensor to be estimated, providing good results consistent with the pitch/sqrt(12) formula. The MAPS intrinsic resolution was extracted from the width of the residual plot, taking into account the multiple scattering effect.
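The pitch/sqrt(12) benchmark quoted above is the standard expected resolution of a detector with binary (hit/no-hit) readout: the RMS of a uniform distribution over one pixel pitch. A minimal Python sketch, where the 50 µm pitch is only an illustrative value, not a figure taken from the test beam:

```python
import math

def binary_resolution(pitch_um: float) -> float:
    """Expected intrinsic resolution (µm) for binary pixel readout:
    the RMS of a uniform distribution over one pitch, pitch/sqrt(12)."""
    return pitch_um / math.sqrt(12)

# Hypothetical 50 µm pitch, used here purely as an illustration.
print(round(binary_resolution(50.0), 2))  # → 14.43
```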
Abstract:
Technology advances in recent years have dramatically changed the way users exploit contents and services available on the Internet, by enforcing pervasive and mobile computing scenarios and enabling access to networked resources almost from everywhere, at any time, and independently of the device in use. In addition, people increasingly require the ability to customize their experience, by exploiting specific device capabilities and limitations, inherent features of the communication channel in use, and interaction paradigms that significantly differ from the traditional request/response one. The so-called Ubiquitous Internet scenario calls for solutions that address many different challenges, such as device mobility, session management, content adaptation, context awareness and the provisioning of multimodal interfaces. Moreover, new service opportunities demand simple and effective ways to integrate existing resources into new, value-added applications that can also undergo run-time modifications according to ever-changing execution conditions. Although service-oriented architectural models are gaining momentum to tame the increasing complexity of composing and orchestrating distributed and heterogeneous functionalities, existing solutions generally lack a unified approach and only provide support for specific Ubiquitous Internet aspects. Moreover, they usually target rather static scenarios and scarcely support the dynamic nature of pervasive access to Internet resources, which can quickly render existing compositions obsolete or inadequate, and hence in need of reconfiguration. This thesis proposes a novel middleware approach to deal comprehensively with the facets of the Ubiquitous Internet and to assist in establishing innovative application scenarios.
We claim that a truly viable ubiquity support infrastructure must neatly decouple the distributed resources to be integrated and push any kind of content-related logic outside its core layers, keeping only management and coordination responsibilities. Furthermore, we promote an innovative, open, and dynamic resource composition model that makes it easy to describe and enforce complex scenario requirements, and to react suitably to changes in the execution conditions.
Abstract:
The advent of distributed and heterogeneous systems has laid the foundation for the birth of new architectural paradigms, in which many separate and autonomous entities collaborate and interact with the aim of achieving complex strategic goals that would be impossible to accomplish on their own. A non-exhaustive list of systems targeted by such paradigms includes Business Process Management, Clinical Guidelines and Careflow Protocols, and Service-Oriented and Multi-Agent Systems. It is largely recognized that engineering these systems requires novel modeling techniques. In particular, many authors claim that an open, declarative perspective is needed to complement the closed, procedural nature of state-of-the-art specification languages. For example, the ConDec language has recently been proposed to target the declarative and open specification of Business Processes, overcoming the over-specification and over-constraining issues of classical procedural approaches. On the one hand, the success of such novel modeling languages strongly depends on their usability by non-IT-savvy users: they must provide an appealing, intuitive graphical front-end. On the other hand, they must be amenable to verification, in order to guarantee the trustworthiness and reliability of the developed model, as well as to ensure that the actual executions of the system effectively comply with it. In this dissertation, we claim that Computational Logic is a suitable framework for dealing with the specification, verification, execution, monitoring and analysis of these systems. We propose to adopt an extended version of the ConDec language for specifying interaction models with a declarative, open flavor.
We show how all the (extended) ConDec constructs can be automatically translated to the CLIMB Computational Logic-based language, and illustrate how its corresponding reasoning techniques can be successfully exploited to provide support and verification capabilities along the whole life cycle of the targeted systems.
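To give a flavour of the declarative style of ConDec constraints and of compliance checking over execution traces, here is a minimal Python sketch of one typical constraint, response(a, b): every occurrence of activity a must eventually be followed by an occurrence of activity b. The function and the example traces are illustrative assumptions, not the actual CLIMB translation.

```python
def response_holds(trace, a, b):
    """ConDec 'response(a, b)': every occurrence of activity a must be
    eventually followed by a later occurrence of activity b."""
    for i, event in enumerate(trace):
        if event == a and b not in trace[i + 1:]:
            return False
    return True

# A compliant and a violating execution trace (illustrative only).
print(response_holds(["order", "pay", "ship"], "pay", "ship"))  # → True
print(response_holds(["order", "pay"], "pay", "ship"))          # → False
```

Note what the constraint does not say: it imposes no ordering on unrelated activities, which is exactly the openness that procedural models over-constrain away.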
Abstract:
The aim of this PhD thesis is to study accurately and in depth the figure and the literary production of the intellectual Jacopo Aconcio. This minor author of the 16th century has long been considered a sort of “enigmatic character”, a profile which results from the work of those who, for many centuries, have left his writing to its fate: a story of constant re-readings and equally incessant oversights. This is why it is necessary to re-read Aconcio’s production in its entirety and to devote a monographic study to it. Previous scholars’ interpretations will obviously be considered, but at the same time an effort will be made to go beyond them through the analysis of both published and manuscript sources, in the attempt to attain a deeper understanding of the figure of this man, who was a Christian, a military and hydraulic engineer and a political philosopher. The title of the thesis was chosen to emphasise how, throughout the three years of the doctorate, my research concentrated in equal measure and with the same degree of importance on all the reflections and activities of Jacopo Aconcio. My object, in fact, was to establish how and to what extent the methodological thinking of the intellectual found application in, and at the same time guided, his theoretical and practical production. I did not mention in the title the author’s religious thinking, which everyone has always considered the most original and interesting element of his production, because religion, from the Reformation onwards, was primarily a political question, and it was treated as such by almost all the authors involved in the Protestant movement, Aconcio in the first place. Even the remarks concerning the private, intimate sphere of faith have therefore been analysed in this light: only by acknowledging the centrality of the “problem of politics” in Aconcio’s theories, in fact, is it possible to interpret them correctly.
This approach proves the truth of the theoretical premise of my research, that is to say the unity and orderliness of the author’s thought: in every field of knowledge, Aconcio applies the rules of the methodus resolutiva as a means to achieve knowledge and elaborate models of peaceful cohabitation in society. Aconcio’s continuous references to method can make his writing pedantic and rather complex, but at the same time they allow for a consistent and valid analysis of different disciplines. I have not considered it a limitation that most of his reflections appear to our eyes as strongly conditioned by the time in which he lived. To see in him, as some have done, the forerunner of Descartes’ methodological discourse or, conversely, to judge his religious theories as not very modern, is to force the thought of an author who was first and foremost a Christian man of his own time. Aconcio repeats this himself several times in his writings: he wants to provide individuals with the necessary tools to reach a full-fledged scientific knowledge in the various fields, and also to enable them to seek truth incessantly in the religious domain, which is the duty of every human being. The will to find rules, instruments and effective solutions characterizes the whole of the author’s corpus: Aconcio feels he must look for truth in all the arts, aware as he is that anything can become science as long as it is analysed with method. Nevertheless, he remains a man of his own time, a Christian convinced of the existence of God, creator and governor of the world, to whom people must account for their own actions. To neglect this fact in order to construct a “character”, a generic forerunner of, but not participant in, whatever philosophical current, is a dangerous and misleading operation.
In this study, I have highlighted how Aconcio’s arguments only reveal their full meaning when read in the context in which they were born, without depriving them of their originality but also without charging them with meanings they do not possess. Through a historical-doctrinal approach, I have tried to analyse the complex web of theories and events which constitute the substratum of Aconcio’s reflection, in order to trace the correct relations between texts and contexts. The thesis is therefore organised in six chapters, dedicated respectively to Aconcio’s biography, the methodological question, the author’s engineering activity, his historical knowledge and his religious thinking, followed by a last section concerning his fortune throughout the centuries. The above-mentioned complexity is determined by the special historical moment in which the author lived. On the one hand, thanks to the new union between science and technique, the 16th century produced discoveries and inventions which made available a previously unthinkable number of notions and led to a “revolution” in the way of studying and teaching the different subjects; by producing a new kind of intellectual, involved in politics but also aware of scientific-technological issues, this revolution would contribute to the subsequent birth of modern science. On the other hand, the 16th century was ravaged by religious conflicts, which shattered the unity of the Christian world and generated theological-political disputes that would inform the history of European states for many decades. My aim is to show how Aconcio’s multifarious activity is the conscious fruit of this historical and religious situation, as well as an attempt to answer the demand for a new kind of engagement on the intellectual’s part.
Plunged into the discussions around methodus, employed in the most important European courts, involved in the abrupt acceleration of technical-scientific activities, and especially concerned by the radical religious reformation brought on by the Protestant movement, Jacopo Aconcio reflects this complex conjunction in his writings, without lacking in order and consistency, contrary to what many scholars assume. The object of this work, therefore, is to highlight the unity of the author’s thought, in which science, technique, faith and politics are woven into a combination which, although it may appear illogical and confused, is actually tidy and methodical, and therefore in agreement with Aconcio’s own intentions and with the specific characters of European culture in the Renaissance. This theory is confirmed by the reading of the Ars muniendorum oppidorum, Aconcio’s only work which had until now been unavailable. I am persuaded that only a methodical reading of Aconcio’s works, neither forgetting nor glorifying any single one, respects the author’s will. From De methodo (1558) onwards, all his writings are summae, guides for the reader who wishes to approach the study of the various disciplines. Undoubtedly, Satan’s Stratagems (1565) is something more, not only because of its length, but because it deals with the author’s main interest: the celebration of doubt and debate as bases on which to build religious tolerance, which is the best method for peaceful cohabitation in society. This, however, does not justify the total centrality which the Stratagems have enjoyed for centuries, at the expense of a proper understanding of the author’s will to offer examples of methodological rigour in all sciences. Maybe it is precisely because of the reforming power of Aconcio’s thought that, albeit often forgotten throughout the centuries, he has never ceased to reappear and continues to draw attention, both as a man and as an author.
His ideas never stop stimulating the reader’s curiosity, and this may ultimately be the best demonstration of their worth, independently of the historical moment in which they come back to the surface.
Abstract:
The purpose of this research is to deepen the study of the section in architecture. The survey focuses on the important elements of the project Teatro Domestico by Aldo Rossi, built for the XVII Triennale di Milano in 1986, and, by applying them to several topics of architecture, verifies their timeliness and fertility in new compositional exercises. Through the study of certain areas of Rossi’s theory, we tried to find a common thread for the reading of the theatre project. The theatre is the place of the ephemeral and the artificial, which is why its destiny is its end and fatal loss. The design and construction of theatre settings has always had a double meaning, between the value of civil architecture and the testing of newly available technologies. Rossi’s experiences in this area are clear examples of the inseparable relationship between the representation of architecture as art and the design of architecture as a model of reality. In the Teatro Domestico, the distinction between representation and the real world is constantly cancelled and restored through the reversal of meaning and through shifts of scale. At present, studies conducted on the work of Rossi concern the relationship between architectural composition and the theory of form, focusing on the compositional development of a design process between typological analysis and the invention of form. The research, through the analysis of a few projects and drawings, will try to examine this issue through the rules of composition, both graphical and constructive, hoping to decipher the mechanism underlying the invention. The almost total lack of published material on the project Teatro Domestico, and the opportunity to visit the archives that preserve the drawings, has allowed the author of this study to investigate the internal issues of the project in depth, thus placing this research as a first step toward possible further analysis of the works of Rossi linked to the world of performance.
The final aim is therefore to produce material that can best describe the work of Rossi. Through the reading of the material published by the author himself and the examination of unpublished material preserved in the archives, it was possible to develop new material and increase knowledge about the work, which would otherwise be difficult to analyse. The research is divided into two parts. The first, taking into account the close relationship, frequently mentioned by Rossi himself, between archaeology and architectural composition, stresses the importance of the tipo as a system for reading urban composition as well as an open tool of invention. Taking up Ezio Bonfanti’s essay on the work of the architect, we wanted to investigate how the paratactic method is applied in the early works and how, subsequently, the process reaches an accentuated complexity while keeping its basic terms stable. Following a brief introduction on the concept of the section and the different interpretations that the term has had over time, we tried to identify through this device a methodology for reading Rossi’s projects. The result is a constant typological interpretation of the term, related not only to composition in plan but also to composition through the elevations. The section is therefore understood as the overturning of the elevation onto the same plane: the terms used reveal a different approach but a similarity of characters. The identification of architectural phonemes allows comparison with other arts. The research moves in the direction of language, trying to identify the relationship between representation and construction, between the ephemeral and the real world. In this sense it will highlight the similarities between the graphic material produced by Rossi and some important examples by contemporary authors.
The comparison of the compositional system with the surrealist world of painting and literature will facilitate the understanding and identification of the possible rules applied by Rossi. The second part of the research focuses on the intentions of the chosen project. The Teatro Domestico embodies a number of elements that seem to conclude (as an end point, but also a new start) the author’s path. With it, the experiments on the theatre begun with the project for the Teatrino Scientifico (1978) and continued with the project for the Teatro del Mondo (1979) converge into a lay tabernacle representing the collective and private memory of the city. Starting from a reading of the project, through the collection of published material, we carried out an analysis of the explicit themes of the work, identifying its conceptual references. Following the examination of the unpublished original materials kept in the Aldo Rossi Archive of the Collection of the Canadian Centre for Architecture in Montréal, a virtual reconstruction of the project will be produced using existing techniques of digital representation, adding to the material a new element for future studies. The reconstruction is part of a larger line of research in which the current technologies of composition and representation in architecture stand side by side with research on the compositional method of this architect. The results achieved add to past experiences concerning the reconstruction of some of the lost works of Aldo Rossi. A partial objective is to reactivate a discourse around this work, considered minor among the many born of the architect’s prolific activity, and to reassess the ephemeral works by giving them the value they have earned.
In conclusion, the research aims to open a new field of interest in the section, not only as a technical instrument for the representation of an idea, but as an actual mechanism through which the composition is formed and the idea is developed.
Abstract:
In recent years, an ever-increasing degree of automation has been observed in most industrial processes. This increase is motivated by the demand for systems with high performance in terms of the quality of the products and services generated, productivity, efficiency and low costs in design, realization and maintenance. This trend in the growth of complex automation systems is rapidly spreading over automated manufacturing systems (AMS), where the integration of mechanical and electronic technology, typical of Mechatronics, is merging with other technologies such as Informatics and communication networks. An AMS is a very complex system that can be thought of as constituted by a set of flexible working stations and one or more transportation systems. To understand how important these machines are in our society, consider that every day most of us use bottles of water or soda and buy boxed products such as food or cigarettes. Another indication of this complexity derives from the fact that the consortium of machine producers has estimated around 350 types of manufacturing machine. A large number of manufacturing machine industries are present in Italy, notably packaging machine industries; in particular, a great concentration of this kind of industry is located in the Bologna area, which for this reason is called the “packaging valley”. Usually, the various parts of an AMS interact in a concurrent and asynchronous way, and coordinating the parts of the machine to obtain a desired overall behaviour is a hard task. Often, this is the case in large-scale systems organized in a modular and distributed manner.
Even if the success of a modern AMS from a functional and behavioural point of view is still attributable to the design choices made in the definition of the mechanical structure and the electrical/electronic architecture, the system that governs the control of the plant is becoming crucial, because of the large number of duties associated with it. Apart from the activity inherent in the automation of the machine cycles, the supervisory system is called upon to perform other main functions, such as: emulating the behaviour of traditional mechanical members, thus allowing a drastic constructive simplification of the machine and a crucial functional flexibility; dynamically adapting the control strategies according to the different productive needs and operational scenarios; obtaining a high quality of the final product through the verification of the correctness of the processing; guiding the machine operator to promptly and carefully take the actions needed to establish or restore the optimal operating conditions; and managing in real time information on diagnostics, as a support for the maintenance operations of the machine. The kinds of facilities that designers can directly find on the market, in terms of software component libraries, provide in fact adequate support as regards the implementation of either top-level or bottom-level functionalities, typically pertaining to the domains of user-friendly HMIs, closed-loop regulation and motion control, and fieldbus-based interconnection of remote smart devices. What is still lacking is a reference framework comprising a comprehensive set of highly reusable logic control components that, focusing on the cross-cutting functionalities characterizing the automation domain, may help designers in the process of modelling and structuring their applications according to their specific needs.
Historically, the design and verification process for complex automated industrial systems has been performed in an empirical way, without a clear distinction between functional and technological-implementation concepts and without a systematic method to deal organically with the complete system. Traditionally, in the field of analog and digital control, design and verification through formal and simulation tools have long been adopted, at least for multivariable and/or nonlinear controllers for complex time-driven dynamics, as in the fields of vehicles, aircraft, robots, electric drives and complex power electronics equipment. Moving to the field of logic control, typical of industrial manufacturing automation, the design and verification process is approached in a completely different way, usually very “unstructured”. No clear distinction between functions and implementations, or between functional architectures and technological architectures and platforms, is considered. Probably this difference is due to the different “dynamical framework” of logic control with respect to analog/digital control. As a matter of fact, in logic control discrete-event dynamics replace time-driven dynamics; hence most of the formal and mathematical tools of analog/digital control cannot be directly migrated to logic control to enlighten the distinction between functions and implementations. In addition, in the common view of application technicians, logic control design is strictly connected to the adopted implementation technology (relays in the past, software nowadays), leading again to a deep confusion between the functional view and the technological view. In industrial automation software engineering, concepts such as modularity, encapsulation, composability and reusability are strongly emphasized and profitably realized in so-called object-oriented methodologies.
Industrial automation has lately been receiving this approach, as testified by some IEC standards (IEC 61131-3, IEC 61499) which have been considered in commercial products only recently. On the other hand, in the scientific and technical literature many contributions have already been proposed to establish a suitable modelling framework for industrial automation. In recent years it has been possible to note a considerable growth in the exploitation of innovative concepts and technologies from the ICT world in industrial automation systems. As far as logic control design is concerned, Model Based Design (MBD) is being imported into industrial automation from the software engineering field. Another key point in industrial automated systems is the growth of requirements in terms of availability, reliability and safety for technological systems. In other words, the control system should not only deal with the nominal behaviour, but should also deal with other important duties, such as diagnosis and fault isolation, recovery and safety management. Indeed, together with high performance, fault occurrences increase in complex systems. This is a consequence of the fact that, as typically occurs in reliable mechatronic systems, in complex systems such as AMS an increasing number of electronic devices are present alongside reliable mechanical elements, and these devices are by their own nature more vulnerable. The diagnosis and fault isolation problem in a generic dynamical system consists in the design of a processing unit that, by appropriately processing the inputs and outputs of the dynamical system, is capable of detecting incipient faults in the plant devices and of reconfiguring the control system so as to guarantee satisfactory performance.
The designer should be able to formally verify the product, certifying that, in its final implementation, it will perform its required function while guaranteeing the desired level of reliability and safety; the next step is that of preventing faults and eventually reconfiguring the control system so that faults are tolerated. On this topic, important improvements to the formal verification of logic control, fault diagnosis and fault tolerant control derive from Discrete Event Systems theory. The aim of this work is to define a design pattern and a control architecture to help the designer of control logic in industrial automated systems. The work starts with a brief discussion of the main characteristics and a description of industrial automated systems in Chapter 1. In Chapter 2 a survey on the state of the software engineering paradigm applied to industrial automation is discussed. Chapter 3 presents an architecture for industrial automated systems based on the new concept of the Generalized Actuator, showing its benefits, while in Chapter 4 this architecture is refined using a novel entity, the Generalized Device, in order to obtain better reusability and modularity of the control logic. In Chapter 5 a new approach based on Discrete Event Systems is presented for the problem of software formal verification, together with an active fault tolerant control architecture using online diagnostics. Finally, concluding remarks and some ideas on new directions to explore are given. Appendix A briefly reports some concepts and results about Discrete Event Systems which should help the reader in understanding some crucial points in Chapter 5, while Appendix B gives an overview of the experimental testbed of the Laboratory of Automation of the University of Bologna, used to validate the approach presented in Chapters 3, 4 and 5. Appendix C reports some component models used in Chapter 5 for formal verification.
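The discrete-event view of fault detection can be illustrated with a minimal sketch: model the nominal plant as a finite automaton over observable events, and flag a fault as soon as an observed event is not enabled in the current state. The automaton and the event names below are illustrative assumptions, not the component models verified in the thesis.

```python
# Nominal behaviour as a finite automaton: (state, event) -> next state.
# Any observed event with no transition defined is inconsistent with the
# nominal model and therefore reveals a fault.
NOMINAL = {
    ("idle", "start"): "working",
    ("working", "done"): "idle",
}

def detect_fault(observations, initial="idle"):
    state = initial
    for ev in observations:
        nxt = NOMINAL.get((state, ev))
        if nxt is None:   # event not enabled in this state: fault detected
            return True
        state = nxt
    return False          # all observations consistent so far

print(detect_fault(["start", "done", "start"]))  # → False
print(detect_fault(["start", "start"]))          # → True
```

Real diagnosers must also handle unobservable fault events and ambiguity between nominal and faulty runs, which is where the Discrete Event Systems machinery of Appendix A comes in; the sketch only captures the consistency-checking core.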
Abstract:
The PhD project focused on the study of poultry welfare conditions and their improvement. The project work was divided into three main research activities. A) Field evaluation of the rearing conditions of meat chickens kept in intensive farms. Considering the lack of published reports on the overall Italian rearing conditions of broiler chickens, a survey was carried out to assess the welfare conditions of broilers reared by the most important poultry companies in Italy, to verify whether they are in accordance with the recommendations given in the European proposal COM (2005) 221 final. Chicken farm conditions, carcass lesions and meat quality were investigated. 1. The densities currently used in Italy are in accordance with the European proposal COM 221 final (2005), which suggests keeping broilers at a density lower than 30-32 kg live weight/m2 and not exceeding 38-40 kg live weight/m2. 2. The mortality rates in summer and winter agree with the mortality score calculated following the formula reported in the EU proposal COM 221 final (2005). 3. The incidence of damaged carcasses was very low and did not seem related to the stocking density. 4. The FPD scores were generally above the maximum limit advised by the EU proposal COM 221 final (2005), although the stocking densities were lower than 30-32 kg live weight per m2. 5. It can be stated that the control of environmental conditions, particularly litter quality, appears to be a key issue in controlling the onset of foot pad dermatitis. B) Manipulation of several farm parameters, such as litter material and depth, stocking density and light regimen, to improve chicken welfare conditions in the winter season. 1. Even though two different stocking densities were established in this study, the performances achieved by the chickens were almost identical among groups. 2.
The FCR was significantly better under Standard conditions than for birds reared under Welfare conditions, with lower stocking density, more litter material and a light programme of 16 hours of light and 8 hours of dark. 3. In our trial we observed, in the Standard groups, a higher content of moisture, nitrogen and ammonia released from the litter. It can therefore be assumed that the environmental characteristics were positively changed by the improvements in the rearing conditions adopted for the Welfare groups. 4. In the Welfare groups the exhausted litter of the pens was drier and broilers showed a lower occurrence of FPD. 5. The prevalence of hock burn lesions, like FPD, is high under poor litter quality conditions. 6. The combined effect of a lower stocking density, a greater amount of litter material and a photoperiod similar to the natural one positively influenced the chickens' welfare status; in fact, the occurrence of FPD in the Welfare groups was the lowest, keeping the score under the European threshold of the proposal COM 221 final (2005). C) The purpose of the third study was to assess the effect of high or low stocking density of broiler chickens, different types of litter and the adoption of a short or long lighting regimen on broiler welfare, through the evaluation of their productivity and the incidence of foot pad dermatitis during the hot season. 1. Feed efficiency was better for the Low Density than for the High Density broilers. 2. The appearance of FPD was not influenced by stocking density. 3. The foot examination revealed that lesions occurred more in birds kept on chopped wheat straw than on wood shavings. 4. In conclusion, the adoption of a short light regimen, similar to that occurring in nature during summer, reduces feed intake without modifying the growth rate, thus improving feed efficiency.
Foot pad lesions were affected neither by stocking densities nor by light regimens, whereas wood shavings exerted a favourable effect in preserving the foot pads in good condition. D) A study was carried out to investigate more widely the possible role of 25-hydroxycholecalciferol supplemented in the diet of a commercial laying hen strain (Lohmann Brown), in comparison with diets supplemented with vitamin D3 or with D3 + 25-hydroxycholecalciferol. Egg traits during a productive cycle, as well as the bone characteristics of the layers, were also evaluated to determine whether vitamin D3 may enhance the welfare status of the birds. 1. The weight of the egg and of its components is often greater in hens fed a diet enriched with 25-hydroxycholecalciferol. 2. Since the eggs of the treated groups are heavier and a larger amount of shell is needed, a direct effect on shell strength is observed. 3. At 30 and at 50 wk of age, hens fed 25-hydroxycholecalciferol exhibited greater values of bone breaking force. 4. The radiographic density values obtained in the trial were always higher in hens fed 25-hydroxycholecalciferol under both treatments: supplemented for the whole laying cycle (25D3) or from 40 weeks of age onward (D3+25D3).
Abstract:
The problem addressed in this work concerns the allocation of expenditure across groups of food products (at-home and away-from-home) and the changes this allocation has undergone over the last decade. The main objective of the proposed analysis is therefore to explain how variations in the budget share devoted to the components of food consumption can be attributed to strictly economic factors, as well as to the socio-demographic characteristics of consumers. In order to evaluate the inter-temporal allocation of individual expenditure, the Almost Ideal Demand System (AIDS) of Deaton and Muellbauer is proposed as the analytical framework.
Abstract:
The specific way in which content is acquired through digital interfaces condemns the epistemic agent to a fragmented interaction, insufficient from a computational, mnemonic and temporal point of view relative to the mass of information now accessible through any implementation of the human-computer relation, and it invalidates the applicability of the standard model of knowledge as justified true belief, undermining the concept of rationally grounded belief: to form such a belief the agent would need conceptual, computational and temporal resources that are in fact inaccessible. The consequence is that the agent, constrained by the ontological limitations typical of interaction with cultural interfaces, is forced to fall back on ambiguous, arbitrary and often more random than believed processes of selecting and managing information. These give rise to genuine epistemological hybrids (in Latour's sense), made of sensations and program outputs, ungrounded beliefs and bits of indirect testimony, and a whole series of human-digital relations that prompt a retreat into a transcendent dimension which finds in the sacred its most immediate sphere of actuation. Given all this, the present work sets out to construct a new epistemological paradigm of propositional knowledge obtainable through a digital content-acquisition interface, founded on the new concept of Digital Tracing (Tracciatura Digitale), defined as a process of digital acquisition of a set of traces, i.e. meta-information of a testimonial nature.
Once recognized as a process of content communication, this device will rest on the search for and selection of meta-information, that is, traces, which will allow the implementation of approaches derived from decision analysis under bounded rationality. Such approaches, besides being almost never used in this field, are ontologically suited to handling the kind of uncertainty found in the instantiation of the informational hybrid and, under certain conditions, can assure the agent of the epistemic soundness of the acquired content.
Abstract:
This thesis presents a brief state of the art of the sociology and ethics of sport as a new academic discipline, from its origins to the present. Secondly, it aims to show what the relational sociology perspective contributes to the study of sport. The first contribution is the "trans-disciplinarity" between disciplines such as sociology and ethics, which avoids falling into a "sociologized" ethics or an "ethicized" sociology. Thanks to relationality, the de-commodification of well-being is obtained. Relational logic seeks economic development without forgetting that society is made up of persons, who are what really matters, above economic interests. Another contribution is the distinction between "human society" and "the society of the human", which avoids many of the current problems brought about by new technologies. Finally, it contributes the "AGIL" scheme as a "relational compass" that helps to realize the objectives, means and ethical rules within sporting practice.
Abstract:
The aim of this work is to improve the reading of European rurality. Given the profound transformations that have occurred, rural territories can no longer be analysed with a merely dichotomous approach that simply distinguishes them from cities. On the contrary, the work integrates the analysis of socio-economic aspects with that of territorial elements, highlighting the main dimensions that characterize the many types of rurality present in Europe today. Starting from the debate on the classification of rural areas, a synthetic indicator of rurality is first proposed which, adopting fuzzy logic, jointly considers demographic (density), sectoral (importance of agricultural activity), territorial and geographical (accessibility and land use) aspects. This technique makes it possible to reconstruct a continuum of degrees of rurality, thus distinguishing, within the European Union (about 1,300 observations), the most central areas from those progressively more rural and peripheral. Subsequently, a cluster analysis identifies types of areas that are homogeneous in terms of economic structure, landscape and diversification of agricultural activity. These clusters also reflect the geographical distribution of the areas themselves: groups of central regions are distinguished from groups of more peripheral ones. Above all, this analysis shows that the pairing of rurality with backwardness is now outdated: some rural areas have in fact benefited from the transformations that have affected the European Union in recent decades (the spread of ICT, or the development of manufacturing). The last part of the work offers analytical tools to support Community policy action, analysing the varying capacity of European regions to respond to the challenges of the Europe 2020 Strategy.
A principal component analysis summarizes the main dimensions of this regional performance; the results are then reinterpreted in the light of the structural characteristics of European territories. Finally, a more direct spatial analysis of the data shows how geography still profoundly influences the capacity of territories to respond to the new challenges of the decade.
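As a rough illustration of how a fuzzy synthetic indicator of this kind can work, the sketch below maps each dimension to a membership degree and combines them into a single degree of rurality in [0, 1]. The variables, thresholds and aggregation operator are hypothetical, not the ones calibrated in the work.

```python
def ramp_down(x, low, high):
    """Membership of fuzzy 'x is low': 1 below `low`, 0 above `high`."""
    if x <= low:
        return 1.0
    if x >= high:
        return 0.0
    return (high - x) / (high - low)

def ramp_up(x, low, high):
    """Membership of fuzzy 'x is high': 0 below `low`, 1 above `high`."""
    return 1.0 - ramp_down(x, low, high)

def rurality(density, agri_share, accessibility):
    """
    Degree of rurality in [0, 1] for a region, combining:
      density       inhabitants per km^2 (rural = sparsely populated)
      agri_share    share of employment in agriculture (rural = high)
      accessibility remoteness index to the nearest city (rural = remote)
    """
    sparse = ramp_down(density, 50, 300)
    agricultural = ramp_up(agri_share, 0.02, 0.15)
    remote = ramp_up(accessibility, 0.2, 0.8)
    # Plain average as the aggregation operator; min (fuzzy AND) is an
    # alternative that would demand all three traits at once.
    return (sparse + agricultural + remote) / 3
```

Scoring every region this way yields the continuum of degrees of rurality the abstract describes, rather than a binary rural/urban label.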
Non-normal modal logics, quantification, and deontic dilemmas. A study in multi-relational semantics
Abstract:
This dissertation is devoted to the study of non-normal (modal) systems for deontic logics, both at the propositional level and at the first-order one. In particular, we develop our study in the multi-relational setting, which generalises standard Kripke semantics. We present new completeness results concerning the semantics of several systems which are able to handle normative dilemmas and conflicts. Although primarily driven by issues related to the legal and moral field, these results are also relevant for the more theoretical field of modal logic itself, as we propose a syntactic and semantic study of intermediate systems between the classical propositional calculus CPC and the minimal normal modal logic K.
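The multi-relational generalisation can be made concrete with a toy model checker. The truth clause used below (box-f is true at w iff every successor along *some* accessibility relation satisfies f) is one common multi-relational reading that blocks the aggregation of conflicting obligations; the exact clause studied in the dissertation may differ, and the example model is purely illustrative.

```python
def holds(model, world, formula):
    """
    Evaluate a formula at a world of a multi-relational model.
    model: {"rels": [set of (w, v) pairs, ...], "val": {prop: set of worlds}}
    formula: a proposition string, ("not", f), ("and", f, g), or ("box", f).
    """
    if isinstance(formula, str):  # propositional variable
        return world in model["val"].get(formula, set())
    op = formula[0]
    if op == "not":
        return not holds(model, world, formula[1])
    if op == "and":
        return holds(model, world, formula[1]) and holds(model, world, formula[2])
    if op == "box":
        # [box]f holds at w iff for SOME relation R_i every R_i-successor
        # of w satisfies f: this keeps [box]p and [box]~p jointly
        # satisfiable while refuting their aggregation [box](p & ~p).
        return any(
            all(holds(model, v, formula[1]) for (w, v) in rel if w == world)
            for rel in model["rels"]
        )
    raise ValueError(f"unknown connective: {op}")
```

With two relations pointing to different "ideal" worlds, both [box]p and [box]not-p come out true at the same world while [box](p and not-p) fails, which is the kind of behaviour a semantics for deontic dilemmas needs.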
Abstract:
Pediatric acute myeloid leukemia (AML) is a molecularly heterogeneous disease that arises from genetic alterations in pathways regulating self-renewal and myeloid differentiation. While the majority of patients carry recurrent chromosomal translocations, almost 20% of childhood AML cases do not show any recognizable cytogenetic alteration and are defined as cytogenetically normal (CN)-AML. CN-AML patients have always shown great variability in response to therapy and in overall outcome, pointing to the presence of unknown genetic changes, not detectable by conventional analyses but relevant for the pathogenesis and outcome of AML. The development of novel genome-wide techniques such as next-generation sequencing has tremendously improved our ability to interrogate the cancer genome. Against this background, the aim of this research study was to investigate, through whole-transcriptome sequencing (RNA-seq), the mutational landscape of pediatric CN-AML patients negative for all the currently known somatic mutations reported in AML. RNA-seq performed on diagnostic leukemic blasts from 19 pediatric CN-AML cases revealed a considerable incidence of cryptic chromosomal rearrangements, with the identification of 21 putative fusion genes. Several of the fusion genes identified in this study are recurrent and might have prognostic and/or therapeutic relevance. A paradigm of this is the CBFA2T3-GLIS2 fusion, which has been demonstrated to be a common alteration in pediatric CN-AML, predicting poor outcome. Important findings have also been obtained in the identification of novel therapeutic targets. On one side, the identification of the NUP98-JARID1A fusion suggests the use of disulfiram; on the other, we describe alterations activating tyrosine kinases, providing functional data supporting the use of tyrosine kinase inhibitors to specifically target leukemia cells.
This study provides new insights into the genetic alterations underlying pediatric AML, defines novel prognostic markers and putative therapeutic targets, and prospectively ensures correct risk stratification and risk-adapted therapy also for the “all-neg” AML subgroup.