899 results for Almost always propositional logic


Relevance: 20.00%

Publisher:

Abstract:

The rational construction of the house. The writings and projects of Giuseppe Pagano

Description, themes and research objectives

The research analyses the architecture of Giuseppe Pagano, focusing on the theme of dwelling, through the reading of three of his house projects. On the one hand, these projects are "minor" works not thoroughly known to Pagano's contemporary critics; on the other, they exemplify a particular methodological approach which the author uses to explore a theme closely linked to his theoretical thought. The house project is a key to Pagano's research, given its ties to the socio-cultural and political conditions in which the architect was working, so that it becomes a mirror of a specific theoretical path, always in a state of becoming. Pagano understands architecture as a "servant of the human being", subject to a "utilitarian slavery", since it is a clear, essential and "modest" answer to specific human needs, free from aprioristic aesthetic and formal choices. It is a rational architecture in the strict sense: a perfect synthesis between cause and effect and between function and form. The house must accommodate these principles because it is closely intertwined with human needs and intimately linked to a specific place, to climatic conditions and to technical and economic possibilities. Moreover, unlike his public and collective masterpieces, such as the Palazzo Gualino, the Istituto di Fisica and the Università Commerciale Bocconi, the house projects express a precise design intention in a more authentic way, partially freed from political influences and dogmatic preoccupations and, therefore, far from the attempt to pursue a specific expressive language. I believe that the house project better represents the "ingenuity", freshness and "sincerity" that Pagano identifies with minor architecture, thereby revealing a more authentic expression of his understanding of the project. The thesis, by tracing Pagano's theoretical research through the analysis of some of his designed and built works, therefore attempts to identify a specific methodological approach to his projects which, developed over time, achieves a certain clarity in the 1930s. In fact, this methodological approach becomes most evident in his last projects, mainly concerning the house and the urban space. These reflect the attempt to respond to new social needs and, at the same time, express a freer idea of built architecture, closely linked with the place and with the human being who dwells in it. The three chosen projects (Villa Colli, the Casa a struttura d'acciaio and Villa Caraccio) confront Pagano with different places, different clients and different economic and technical conditions which, given the author's biography, correspond to important historical and political circumstances. This is why the projects appear to be distant works, both linguistically and conceptually, to the point that one could define them as "eclectic". I argue, however, that this eclecticism is actually an added value of Pagano's architectural work, stemming from the use of a method which, based on the postulate of a rational architecture as the essence and logic of building, finds specific variations depending on the multiple variables each project has to address.
This is the methodological heritage that Pagano learns from tradition, especially from rural residential architecture, defined by Pagano as a "dictionary of the building logic of man", as an "a-stylistic background". For Pagano this traditional architecture is a clear expression of the relationship between a theme and its development, an architectural "fact" resolved with purely technical and utilitarian aims and with a spontaneous development far from any aprioristic theoretical principle. Architecture, therefore, cannot be an invention for Pagano, and the personal contribution of each architect has to take into account his or her close relationship with the specific historical context, the place and the new building methods. These are basic principles of the methodological approach that drives a great deal of his research and that also makes his thought modern. I argue that both ongoing and new collaborations with younger protagonists of the culture and architecture of the period are significant for the development of his methodology. These encounters represent the will to spread his own understanding of the "new architecture" as well as a way of self-renewal, by confronting himself with new themes and realities and by learning from his collaborators.

Thesis outline

The thesis is divided into two principal parts, each articulated in four chapters, attempting to offer a new reading of the theory and work of Pagano by emphasising the central themes of the research. The first chapter is an introduction to the thesis and to the theme of the rational house, as understood and developed in its typological and technical aspects by Pagano and by other protagonists of Italian rationalism in the 1930s. Here the attention falls on two different aspects which, according to Pagano, define the house project: on the one hand, the typological renewal, aimed at defining a "standard form" as a clear and essential answer to certain needs and variables of the project, leading to different formal expressions; on the other, the building process, understood as a technique to "produce" architecture, where new technologies and new materials are not merely tools but essential elements of the architectural work. In this way the villa is distinguished from the theme of the common house or of the minimal house through rules for the choice of materials and techniques that differ each time, depending on the theme under exploration and on the contingencies of the place. Also visible is the rigorous rationalism that distinguishes the author's appropriation of certain themes of rural architecture. The pages of «Casabella» and the events of the contemporary Triennali form the preliminary material for this chapter, since they are primary sources for identifying the projects and writings produced by Pagano and by contemporary architects on this theme. These writings and projects, when compared, reconstruct the evolution of the idea of the rational house and, specifically, of Pagano's personal research. The second part concerns the reading of three of Pagano's house projects as a built verification of his theories. This section constitutes the central part of the thesis, since it aims to detect a specific methodological approach that shows the theoretical and ideological evolution expressed in his vast published writings.
The three chosen projects explore the theme of the house, looking at the various research themes that the author proposes and that find continuity in the affirmation of a specific rationalism, focussed on concepts such as essentiality, utility, functionality and building honesty. These concepts guide Pagano's thought and activity, while also reflecting a social and cultural period. The projects range from the theme of the villa moderna, Villa Colli, which, inspired by the architecture of Northern Europe, anticipates a specific rationalism of Pagano based on rigour, simplicity and essentiality, to the theme of the common house, the Casa a struttura d'acciaio, la casa del domani, which reflects on the definition of new living spaces and, moreover, on new concepts of standardisation, economic efficiency and new materials responding to the changing needs of modern society. Finally, the third project, Villa Caraccio, returns to the theme of the villa, revisiting it from new perspectives. These perspectives find, in the open-plan solution, in the openness to nature and landscape, and in the reinterpretation of local materials and building systems, that idea of the freed house which clearly expresses a new theoretical thought.

Methodology

It needs to be noted that, due to the lack of an official archive of Pagano's work, the analysis of his work has been difficult, and this explains the necessity of reading the articles and drawings published in the pages of «Casabella» and «Domus». For the projects of Villa Colli and the Casa a struttura d'acciaio, parts of the original drawings have been consulted. These drawings are unpublished and are kept in the private archives of Pagano's collaborators. The consultation of these documents has permitted the analysis of the cited works, which have been subject to a more complete reading following the different proposed solutions, making it possible to understand the design path. The projects are analysed through the method of comparison and critical reading, which specifically means graphical elaborations and analytical schemes, mostly reconstructed on the basis of the original projects but, where possible, also on a photographic survey. The focus is on the project theme which, beginning with a specific dwelling typology, finds variations because of the historico-political context in which Pagano is embedded and which partially shapes his research and theoretical thought, then translated into the built work. The analysis of each work follows, beginning, where possible, with a reconstruction of the evolution of the project as elaborated on the basis of the original documents and ending with an analysis of the constructive principles and of the composition. This second phase employs a methodology proposed by Pagano in his article Piante di ville, which focuses on the plan as the essential tool to identify the "true practical and poetic qualities of the construction" (Pagano, «Costruzioni-Casabella», 1940, p. 2). The reading of the projects is integrated with the constructive analyses related to the technical aspects of the house which, in the case of the Casa a struttura d'acciaio, play an important role in the project, while in Villa Colli and Villa Caraccio they are principally linked to the choice of materials for the construction of the different architectural elements. These are nonetheless key factors in the composition of the work.
Future work could extend this reading to other house projects, deepening a research that could be completed with the consultation of archival materials which are missing at present. Finally, in the appendix I present a critical selection of Pagano's writings, which recall the themes discussed and embodied by the three projects. The texts have been selected among the articles published in «Casabella» and in other journals, completing the reading of the project work, which cannot be detached from his theoretical thought. Moving from theory to project, we follow a path that leads us to define and deepen the central theme of the thesis: rational building as the principal feature of Pagano's architectural research, which is expressed in multiple ways in his designed and built works.

Relevance: 20.00%

Publisher:

Abstract:

Nowadays, computing is migrating from traditional high-performance and distributed computing to pervasive and utility computing based on heterogeneous networks and clients. The current trend suggests that future IT services will rely on distributed resources and on fast communication of heterogeneous contents. The success of this new range of services is directly linked to the effectiveness of the infrastructure in delivering them. The communication infrastructure will be the aggregation of different technologies, even though the current trend suggests the emergence of a single IP-based transport service. Optical networking is a key technology to answer the increasing requests for dynamic bandwidth allocation and for configuring multiple topologies over the same physical-layer infrastructure; however, optical networks today are still "far" from being directly accessible for configuring and offering network services, and they need to be enriched with more "user oriented" functionalities. Moreover, current Control Plane architectures only facilitate efficient end-to-end connectivity provisioning and certainly cannot meet future network service requirements, e.g. the coordinated control of resources. The overall objective of this work is to improve the usability and accessibility of the services provided by the optical network. More precisely, the definition of a service-oriented architecture is the enabling technology that allows user applications to benefit from advanced services over an underlying dynamic optical layer. The definition of a service-oriented networking architecture based on advanced optical network technologies facilitates users' and applications' access to abstracted levels of information regarding the offered advanced network services. This thesis addresses the problem of defining a Service Oriented Architecture and its relevant building blocks, protocols and languages. In particular, this work has focused on the use of the SIP protocol as an inter-layer signalling protocol, which defines the Session Plane in conjunction with the Network Resource Description language. On the other hand, an advanced optical network must accommodate high data bandwidth with different granularities. Currently, two main technologies are emerging to promote the development of the future optical transport network: Optical Burst Switching and Optical Packet Switching. The two technologies promise to provide all-optical burst or packet switching, respectively, instead of the current circuit switching. However, the electronic domain is still present in the scheduling, forwarding and routing decisions. Because of the high optical transmission rates, the burst or packet scheduler faces a difficult challenge; consequently, a high-performance, timing-focused design of both the memory and the forwarding logic is needed. This open issue is faced in this thesis by proposing a highly efficient implementation of the burst and packet scheduler. The main novelty of the proposed implementation is that the scheduling problem is turned into the simple calculation of a min/max function, whose complexity is almost independent of the traffic conditions.
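The scheduler itself is not shown here; purely as an illustration of how channel selection can reduce to a min/max computation, the following Python sketch (hypothetical names, a simplified horizon-based scheme in the spirit of LAUC, not the hardware design developed in the thesis) picks, among the wavelengths already free at the burst arrival time, the one that became free latest, i.e. a single max over per-channel availability times.

    # Illustrative sketch only: horizon-based burst scheduling reduced to a max.
    def schedule_burst(horizons, arrival):
        """horizons[i] is the time at which channel i becomes free again.
        Eligible channels are those already free at `arrival`; among them the one
        freed latest is chosen (smallest idle gap). Returns None if all are busy."""
        eligible = [i for i, h in enumerate(horizons) if h <= arrival]
        if not eligible:
            return None                      # no free channel: the burst is dropped
        return max(eligible, key=lambda i: horizons[i])

    # toy usage
    horizons = [3.0, 7.5, 6.0, 9.2]          # per-wavelength "free from" times
    ch = schedule_burst(horizons, arrival=8.0)
    if ch is not None:
        horizons[ch] = 8.0 + 1.5             # extend the chosen channel's horizon by the burst length
    print(ch, horizons)                      # -> 1 [3.0, 9.5, 6.0, 9.2]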

Relevance: 20.00%

Publisher:

Abstract:

In this work we propose a new approach for preliminary epidemiological studies on Standardized Mortality Ratios (SMR) collected in many spatial regions. A preliminary study on SMRs aims to formulate hypotheses to be investigated via individual epidemiological studies, which avoid the bias carried by aggregated analyses. Starting from the collection of disease counts and the calculation of expected disease counts by means of reference population disease rates, in each area an SMR is derived as the MLE under the Poisson assumption on each observation. Such estimators have high standard errors in small areas, i.e. where the expected count is low either because of the low population underlying the area or because of the rarity of the disease under study. Disease mapping models and other techniques for screening disease rates across the map, aiming to detect anomalies and possible high-risk areas, have been proposed in the literature under both the classical and the Bayesian paradigm. Our proposal approaches this issue with a decision-oriented method, which focuses on multiple testing control without, however, abandoning the preliminary-study perspective that an analysis of SMR indicators is required to keep. We implement the control of the FDR, a quantity largely used to address multiple comparisons problems in the field of microarray data analysis but not usually employed in disease mapping. Controlling the FDR means providing an estimate of the FDR for a set of rejected null hypotheses. The small-areas issue raises difficulties in applying traditional methods for FDR estimation, which are usually based only on knowledge of the p-values (Benjamini and Hochberg, 1995; Storey, 2003). Tests evaluated by a traditional p-value provide weak power in small areas, where the expected number of disease cases is small. Moreover, the tests cannot be assumed to be independent when spatial correlation between SMRs is expected, nor are they identically distributed when the population underlying the map is heterogeneous. The Bayesian paradigm offers a way to overcome the inappropriateness of p-value-based methods. Another peculiarity of the present work is the proposal of a hierarchical fully Bayesian model for FDR estimation when testing many null hypotheses of absence of risk. We use concepts from Bayesian models for disease mapping, referring in particular to the Besag, York and Mollié model (1991), often used in practice for its flexible prior assumption on the distribution of risks across regions. The borrowing of strength between prior and likelihood, typical of a hierarchical Bayesian model, has the advantage of evaluating a single test (i.e. a test in a single area) by means of all the observations in the map under study, rather than just the single observation. This improves the power of the test in small areas and addresses more appropriately the spatial correlation issue, which suggests that relative risks are closer in spatially contiguous regions. The proposed model estimates the FDR by means of the MCMC-estimated posterior probabilities b_i of the null hypothesis (absence of risk) in each area. An estimate of the expected FDR conditional on the data (the estimated FDR) can be calculated for any set of b_i corresponding to areas declared at high risk (where the null hypothesis is rejected) by averaging the b_i themselves. The estimated FDR can be used to provide an easy decision rule for selecting high-risk areas, i.e. selecting as many areas as possible such that the estimated FDR does not exceed a prefixed value; we call these estimated-FDR-based decision (or selection) rules.
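As a minimal illustration of the selection rule just described (variable names are assumptions, and this is not the thesis's MCMC code), the following Python sketch takes the posterior null probabilities b_i of the areas, adds areas in order of increasing b_i, and stops when the running average, i.e. the estimated FDR of the selected set, would exceed a prefixed level alpha.

    # Illustrative sketch: estimated-FDR-based selection of high-risk areas.
    def fdr_selection(post_null_prob, alpha=0.05):
        """post_null_prob[i] is the posterior probability that area i is NOT at high risk.
        Areas are selected in order of increasing posterior null probability while the
        running mean (the estimated FDR of the selected set) stays within alpha."""
        order = sorted(range(len(post_null_prob)), key=lambda i: post_null_prob[i])
        selected, running_sum = [], 0.0
        for i in order:
            running_sum += post_null_prob[i]
            if running_sum / (len(selected) + 1) > alpha:
                break
            selected.append(i)
        return selected

    # toy usage: six areas, alpha = 0.10
    probs = [0.01, 0.03, 0.20, 0.60, 0.85, 0.02]
    print(fdr_selection(probs, alpha=0.10))   # -> [0, 5, 1, 2]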
The sensitivity and specificity of such a rule depend on the accuracy of the FDR estimate: over-estimation of the FDR causes a loss of power, while under-estimation produces a loss of specificity. Moreover, our model has the interesting feature of still being able to provide an estimate of the relative risk values, as in the Besag, York and Mollié model (1991). A simulation study was set up to evaluate the model performance in terms of FDR estimation accuracy, sensitivity and specificity of the decision rule, and goodness of the relative risk estimates. We chose a real map from which we generated several spatial scenarios whose disease counts vary according to the degree of spatial correlation, the size of the areas, the number of areas where the null hypothesis is true and the risk level in the latter areas. In summarizing the simulation results we always consider the FDR estimation in sets constituted by all the b_i lower than a threshold t. We show graphs of the estimated FDR and the true FDR (known by simulation) plotted against the threshold t to assess the FDR estimation. By varying the threshold we can learn which FDR values can be accurately estimated by the practitioner willing to apply the model (from the closeness between the estimated and the true FDR). By plotting the calculated sensitivity and specificity (both known by simulation) against the estimated FDR we can check the sensitivity and specificity of the corresponding estimated-FDR-based decision rules. To investigate the over-smoothing of the relative risk estimates we compare box-plots of such estimates in the high-risk areas (known by simulation), obtained by both our model and the classic Besag, York and Mollié model. All the summary tools are worked out for all the simulated scenarios (54 scenarios in total). Results show that the FDR is well estimated (in the worst case we get an over-estimation, hence a conservative FDR control) in scenarios with small areas, low risk levels and spatially correlated risks, which are our primary aims. In such scenarios we have good estimates of the FDR for all values less than or equal to 0.10. The sensitivity of estimated-FDR-based decision rules is generally low, but the specificity is high. In such scenarios the use of selection rules based on an estimated FDR of 0.05 or 0.10 can be suggested. In cases where the number of true alternative hypotheses (the number of true high-risk areas) is small, FDR values up to 0.15 are also well estimated, and decision rules based on an estimated FDR of 0.15 gain power while maintaining a high specificity. On the other hand, in scenarios with non-small areas and non-small risk levels the FDR is under-estimated except for very small values (much lower than 0.05); this results in a loss of specificity of a decision rule based on an estimated FDR of 0.05. In such scenarios, decision rules based on an estimated FDR of 0.05 or, even worse, 0.10 cannot be suggested because the true FDR is actually much higher. As regards the relative risk estimation, our model achieves almost the same results as the classic Besag, York and Mollié model. For this reason, our model is interesting for its ability to perform both the estimation of relative risk values and the FDR control, except for scenarios with non-small areas and large risk levels. A case study is finally presented to show how the method can be used in epidemiology.

Relevance: 20.00%

Publisher:

Abstract:

The elusive fiction of J. M. Coetzee is not a body of work in which one can read fixed ethical stances. I suggest testing the potentialities of a logic based on frames and double binds in Coetzee's novels. A double bind is a dilemma in communication which consists of two conflicting messages, with the result that one cannot successfully respond to either. Jacques Derrida highlighted the strategic value of a way of thinking based on the double bind (but on frames as well), which makes it possible to escape binary thinking and so opens an ethical space, where one can make a choice outside a set of fixed rules and take responsibility for it. In Coetzee's fiction the author himself can be considered to be in a double bind, seeing that he is a white South African writer who feels that his "task" cannot simply be a matter of choosing to represent faithfully the violence and racism of apartheid, or of choosing to give a voice to the oppressed. Good intentions alone do not ensure protection against entering unwittingly into complicity with the dominant discourse, and this is why it is important to make clearly visible and explicit the frame in which one is always situated. The logic of the double bind also becomes the way in which moral problems are staged in Coetzee's fiction: the opportunity to give a voice to the oppressed through the same language that was co-opted to serve the cause of oppression, a never-completed relation with otherness, or the representability in literature of evil, of the secret and of the paradoxical implications of confession and forgiveness.

Relevance: 20.00%

Publisher:

Abstract:

Several activities were conducted during my PhD. For the NEMO experiment, a collaboration between the INFN/University groups of Catania and Bologna led to the development and production of a mixed-signal acquisition board for the NEMO Km3 telescope. The research concerned the feasibility study of an acquisition technique quite different from that adopted in the NEMO Phase 1 telescope. The DAQ board that we realized exploits the LIRA06 front-end chip for the analog acquisition of the anodic and dynodic sources of a PMT (Photo-Multiplier Tube). The low-power analog acquisition allows multiple channels of the PMT to be sampled simultaneously at different gain factors in order to increase the signal response linearity over a wider dynamic range. The auto-triggering and self-event-classification features also help to improve the acquisition performance and the knowledge of the neutrino event. A fully functional interface towards the first-level data concentrator, the Floor Control Module, has also been integrated on the board, and specific firmware has been realized to comply with the present communication protocols. This stage of the project foresees the use of an FPGA, a high-speed configurable device, to provide the board with a flexible digital logic control core. After the validation of the whole front-end architecture, this feature would probably be integrated into a common mixed-signal ASIC (Application Specific Integrated Circuit). The volatile nature of the configuration memory of the FPGA implied the integration of a flash ISP (In System Programming) memory and of a smart architecture for its safe remote reconfiguration. All the integrated features of the board have been tested. At the Catania laboratory the behaviour of the LIRA chip has been investigated in the digital environment of the DAQ board, and we succeeded in driving the acquisition with the FPGA. PMT pulses generated with an arbitrary waveform generator were correctly triggered and acquired by the analog chip, and subsequently digitized by the on-board ADC under the supervision of the FPGA. For the communication towards the data concentrator, a test bench was set up in Bologna where, thanks to a loan from the Roma University and INFN, a full readout chain equivalent to that present in NEMO Phase 1 was installed. These tests showed a good behaviour of the digital electronics, which were able to receive and execute commands imparted from the PC console and to answer back with a reply. The remotely configurable logic also behaved well and demonstrated, at least in principle, the validity of this technique. A new prototype board is now under development at the Catania laboratory as an evolution of the one described above. This board is going to be deployed within the NEMO Phase-2 tower, in one of its floors dedicated to new front-end proposals. It will integrate a new analog acquisition chip called SAS (Smart Auto-triggering Sampler), thus introducing a new analog front-end while inheriting most of the digital logic present in the current DAQ board discussed in this thesis. As for the activity on high-resolution vertex detectors, I worked within the SLIM5 collaboration on the characterization of a MAPS (Monolithic Active Pixel Sensor) device called APSEL-4D. The chip is a matrix of 4096 active pixel sensors with deep N-well implantations meant for charge collection and for shielding the analog electronics from digital noise.
The chip integrates the full-custom sensor matrix and the sparsification/readout logic, realized with standard cells in 130 nm STM CMOS technology. For the chip characterization a test beam was set up on the 12 GeV PS (Proton Synchrotron) line facility at CERN in Geneva (CH). The collaboration prepared a silicon strip telescope and a DAQ system (hardware and software) for data acquisition and control of the telescope, which allowed about 90 million events to be stored in 7 equivalent days of beam live-time. My activities basically concerned the realization of a firmware interface to and from the MAPS chip in order to integrate it into the general DAQ system. Thereafter I worked on the DAQ software to implement a proper Slow Control interface for the APSEL4D. Several APSEL4D chips with different thinning were tested during the test beam. Those thinned to 100 and 300 µm presented an overall efficiency of about 90% with a threshold of 450 electrons. The test beam also allowed the resolution of the pixel sensor to be estimated, providing good results consistent with the pitch/sqrt(12) formula. The MAPS intrinsic resolution has been extracted from the width of the residual plot, taking into account the multiple scattering effect.
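For context only (this derivation is standard and not taken from the thesis), the pitch/sqrt(12) figure quoted above is the binary-readout limit obtained from the variance of a hit position distributed uniformly over one pixel of pitch p:

    \sigma^2 = \frac{1}{p}\int_{-p/2}^{p/2} x^2\,dx = \frac{p^2}{12}
    \quad\Longrightarrow\quad
    \sigma = \frac{p}{\sqrt{12}} \approx 0.29\,p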

Relevance: 20.00%

Publisher:

Abstract:

Technology advances in recent years have dramatically changed the way users exploit contents and services available on the Internet, by fostering pervasive and mobile computing scenarios and enabling access to networked resources almost from everywhere, at any time, and independently of the device in use. In addition, people increasingly require the ability to customize their experience, by exploiting specific device capabilities and limitations, the inherent features of the communication channel in use, and interaction paradigms that significantly differ from the traditional request/response one. The so-called Ubiquitous Internet scenario calls for solutions that address many different challenges, such as device mobility, session management, content adaptation, context-awareness and the provisioning of multimodal interfaces. Moreover, new service opportunities demand simple and effective ways to integrate existing resources into new and value-added applications, which can also undergo run-time modifications according to ever-changing execution conditions. Although service-oriented architectural models are gaining momentum to tame the increasing complexity of composing and orchestrating distributed and heterogeneous functionalities, existing solutions generally lack a unified approach and only provide support for specific Ubiquitous Internet aspects. Moreover, they usually target rather static scenarios and scarcely support the dynamic nature of pervasive access to Internet resources, which can quickly make existing compositions obsolete or inadequate, and hence in need of reconfiguration. This thesis proposes a novel middleware approach to comprehensively deal with Ubiquitous Internet facets and assist in establishing innovative application scenarios. We claim that a truly viable ubiquity support infrastructure must neatly decouple the distributed resources to integrate, and must push any kind of content-related logic outside its core layers, keeping only management and coordination responsibilities. Furthermore, we promote an innovative, open, and dynamic resource composition model that makes it easy to describe and enforce complex scenario requirements, and to react suitably to changes in the execution conditions.

Relevance: 20.00%

Publisher:

Abstract:

The advent of distributed and heterogeneous systems has laid the foundation for the birth of new architectural paradigms, in which many separate and autonomous entities collaborate and interact with the aim of achieving complex strategic goals that would be impossible to accomplish on their own. A non-exhaustive list of systems targeted by such paradigms includes Business Process Management, Clinical Guidelines and Careflow Protocols, Service-Oriented and Multi-Agent Systems. It is largely recognized that engineering these systems requires novel modeling techniques. In particular, many authors claim that an open, declarative perspective is needed to complement the closed, procedural nature of state-of-the-art specification languages. For example, the ConDec language has recently been proposed to target the declarative and open specification of Business Processes, overcoming the over-specification and over-constraining issues of classical procedural approaches. On the one hand, the success of such novel modeling languages strongly depends on their usability by non-IT-savvy users: they must provide an appealing, intuitive graphical front-end. On the other hand, they must be amenable to verification, in order to guarantee the trustworthiness and reliability of the developed model, as well as to ensure that the actual executions of the system effectively comply with it. In this dissertation, we claim that Computational Logic is a suitable framework for dealing with the specification, verification, execution, monitoring and analysis of these systems. We propose to adopt an extended version of the ConDec language for specifying interaction models with a declarative, open flavour. We show how all the (extended) ConDec constructs can be automatically translated to the CLIMB Computational Logic-based language, and illustrate how its corresponding reasoning techniques can be successfully exploited to provide support and verification capabilities along the whole life cycle of the targeted systems.
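CLIMB itself is not reproduced here; purely as an illustration of the kind of declarative, open constraint that ConDec expresses, the following Python sketch (assumed names, not the thesis's translation) checks a response constraint, i.e. that every occurrence of activity A in an execution trace is eventually followed by an occurrence of activity B.

    # Illustrative sketch: runtime check of a ConDec-style "response(A, B)" constraint.
    def satisfies_response(trace, a, b):
        """True if every event `a` in `trace` is followed, later on, by an event `b`."""
        pending = False                 # an `a` has occurred with no later `b` yet
        for event in trace:
            if event == a:
                pending = True
            elif event == b:
                pending = False
        return not pending

    # toy traces for an order-handling process
    print(satisfies_response(["receive", "pay", "ship"], "pay", "ship"))   # True
    print(satisfies_response(["receive", "pay"], "pay", "ship"))           # False: open obligation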

Relevance: 20.00%

Publisher:

Abstract:

The aim of this PhD thesis is to study accurately and in depth the figure and the literary production of the intellectual Jacopo Aconcio. This minor author of the 16th century has long been considered a sort of "enigmatic character", a profile which results from the work of those who, for many centuries, have left his writing to its fate: a story of constant re-readings and equally incessant oversights. This is why it is necessary to re-read Aconcio's production in its entirety and to devote to it a monographic study. Previous scholars' interpretations will obviously be considered, but at the same time an effort will be made to go beyond them through the analysis of both published and manuscript sources, in the attempt to attain a deeper understanding of the figure of this man, who was a Christian, a military and hydraulic engineer and a political philosopher. The title of the thesis was chosen to emphasise how, throughout the three years of the doctorate, my research concentrated in equal measure and with the same degree of importance on all the reflections and activities of Jacopo Aconcio. My aim, in fact, was to establish how and to what extent the methodological thinking of the intellectual found application in, and at the same time guided, his theoretical and practical production. I did not mention in the title the author's religious thinking, which has always been considered by everyone the most original and interesting element of his production, because religion, from the Reformation onwards, was primarily a political question and thus it was treated by almost all the authors involved in the Protestant movement, Aconcio in the first place. Even the remarks concerning the private, intimate sphere of faith have therefore been analysed in this light: only by acknowledging the centrality of the "problem of politics" in Aconcio's theories, in fact, is it possible to interpret them correctly. This approach proves the truth of the theoretical premise of my research, that is to say the unity and orderliness of the author's thought: in every field of knowledge, Aconcio applies the rules of the methodus resolutiva, as a means to achieve knowledge and to elaborate models of peaceful cohabitation in society. Aconcio's continuous references to method can make his writing pedantic and rather complex, but at the same time they allow for a consistent and valid analysis of different disciplines. I have not considered it a limit that most of his reflections appear to our eyes strongly conditioned by the time in which he lived. To see in him, as some have done, the forerunner of Descartes' methodological discourse or, conversely, to judge his religious theories as not very modern, is to force the thought of an author who was first and foremost a Christian man of his own time. Aconcio repeats this himself several times in his writings: he wants to provide individuals with the necessary tools to reach a full-fledged scientific knowledge in the various fields, and also to enable them to seek truth incessantly in the religious domain, which is the duty of every human being. The will to find rules, instruments and effective solutions characterizes the whole of the author's corpus: Aconcio feels he must look for truth in all the arts, aware as he is that anything can become science as long as it is analysed with method. Nevertheless, he remains a man of his own time, a Christian convinced of the existence of God, creator and governor of the world, to whom people must account for their own actions.
To neglect this fact in order to construct a “character”, a generic forerunner, but not participant, of whatever philosophical current, is a dangerous and sidetracking operation. In this study, I have highlighted how Aconcio’s arguments only reveal their full meaning when read in the context in which they were born, without depriving them of their originality but also without charging them with meanings they do not possess. Through a historical-doctrinal approach, I have tried to analyse the complex web of theories and events which constitute the substratum of Aconcio’s reflection, in order to trace the correct relations between texts and contexts. The thesis is therefore organised in six chapters, dedicated respectively to Aconcio’s biography, to the methodological question, to the author’s engineering activity, to his historical knowledge and to his religious thinking, followed by a last section concerning his fortune throughout the centuries. The above-mentioned complexity is determined by the special historical moment in which the author lived. On the one hand, thanks to the new union between science and technique, the 16th century produces discoveries and inventions which make available a previously unthinkable number of notions and lead to a “revolution” in the way of studying and teaching the different subjects, which, by producing a new form of intellectual, involved in politics but also aware of scientific-technological issues, will contribute to the subsequent birth of modern science. On the other, the 16th century is ravaged by religious conflicts, which shatter the unity of the Christian world and generate theological-political disputes which will inform the history of European states for many decades. My aim is to show how Aconcio’s multifarious activity is the conscious fruit of this historical and religious situation, as well as the attempt of an answer to the request of a new kind of engagement on the intellectual’s behalf. Plunged in the discussions around methodus, employed in the most important European courts, involved in the abrupt acceleration of technical-scientific activities, and especially concerned by the radical religious reformation brought on by the Protestant movement, Jacopo Aconcio reflects this complex conjunction in his writings, without lacking in order and consistency, differently from what many scholars assume. The object of this work, therefore, is to highlight the unity of the author’s thought, in which science, technique, faith and politics are woven into a combination which, although it may appear illogical and confused, is actually tidy and methodical, and therefore in agreement with Aconcio’s own intentions and with the specific characters of European culture in the Renaissance. This theory is confirmed by the reading of the Ars muniendorum oppidorum, Aconcio’s only work which had been up till now unavailable. I am persuaded that only a methodical reading of Aconcio’s works, without forgetting nor glorifying any single one, respects the author’s will. From De methodo (1558) onwards, all his writings are summae, guides for the reader who wishes to approach the study of the various disciplines. Undoubtedly, Satan’s Stratagems (1565) is something more, not only because of its length, but because it deals with the author’s main interest: the celebration of doubt and debate as bases on which to build religious tolerance, which is the best method for pacific cohabitation in society. 
This, however, does not justify the total centrality which the Stratagems have enjoyed for centuries, at the expense of a proper understanding of the author’s will to offer examples of methodological rigour in all sciences. Maybe it is precisely because of the reforming power of Aconcio’s thought that, albeit often forgotten throughout the centuries, he has never ceased to reappear and continues to draw attention, both as a man and as an author. His ideas never stop stimulating the reader’s curiosity and this may ultimately be the best demonstration of their worth, independently from the historical moment in which they come back to the surface.

Relevance: 20.00%

Publisher:

Abstract:

Human reasoning is a fascinating and complex cognitive process that can be applied in different research areas such as philosophy, psychology, law and finance. Unfortunately, developing supporting software (for those different areas) able to cope with such complex reasoning is difficult and requires a suitable abstract logical formalism. In this thesis we aim to develop a program whose job is to evaluate a theory (a set of rules) with respect to a goal, and to provide results such as "the goal is derivable from the KB (of the theory)". In order to achieve this goal we need to analyse different logics and choose the one that best meets our needs. In logic we usually try to determine whether a given conclusion is logically implied by a set of assumptions T (the theory). However, when we deal with logic programming we need an efficient algorithm to find such implications. In this work we use a logic rather similar to human logic. Indeed, human reasoning requires an extension of first-order logic able to reach a conclusion depending on premises that are not definitely true and that belong to an incomplete set of knowledge. Thus, we implemented a defeasible logic framework able to manipulate defeasible rules. Defeasible logic is a non-monotonic logic designed by Nute for efficient defeasible reasoning (see Chapter 2). These kinds of applications are useful in the legal area, especially if they offer an implementation of an argumentation framework that provides a formal modelling of the game. Roughly speaking, if the theory is the set of laws, the key claim is the conclusion that one of the parties wants to prove (and the other wants to defeat), and we add the dynamic assertion of rules, namely facts put forward by the parties, then we can play an argumentative challenge between two players and decide whether the conclusion is provable or not, depending on the strategies performed by the players. Implementing a game model requires one more meta-interpreter able to evaluate the defeasible logic framework; indeed, according to Gödel's theorem (see page 127), we cannot evaluate the meaning of a language using the tools provided by the language itself, but we need a meta-language able to manipulate the object language. Thus, rather than a simple meta-interpreter, we propose a Meta-level containing different Meta-evaluators. The first has been explained above, the second is needed to run the game model, and the last will be used to change the game execution and tree-derivation strategies.
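To give a concrete flavour of the inference involved, here is a drastically simplified, purely illustrative propositional sketch in Python (no rule chaining; assumed data structures, not the meta-interpreter developed in the thesis): a literal is defeasibly provable if some applicable rule supports it and every applicable rule for the opposite conclusion is beaten by a superior supporting rule.

    # Illustrative sketch of simplified defeasible inference (premises checked against facts only).
    def neg(lit):
        return lit[1:] if lit.startswith("~") else "~" + lit

    def defeasibly_provable(goal, facts, rules, superior):
        """rules: list of (name, premises, conclusion); superior: set of (stronger, weaker) rule names."""
        applicable = [r for r in rules if all(p in facts for p in r[1])]
        supporting = [r for r in applicable if r[2] == goal]
        attacking  = [r for r in applicable if r[2] == neg(goal)]
        if not supporting:
            return False
        # every attacking rule must be defeated by some superior supporting rule
        return all(any((s[0], a[0]) in superior for s in supporting) for a in attacking)

    # toy theory: birds fly, penguins do not, and the penguin rule is stronger
    rules = [("r1", ["bird"], "flies"), ("r2", ["penguin"], "~flies")]
    superior = {("r2", "r1")}
    print(defeasibly_provable("flies", {"bird"}, rules, superior))              # True
    print(defeasibly_provable("flies", {"bird", "penguin"}, rules, superior))   # False
    print(defeasibly_provable("~flies", {"bird", "penguin"}, rules, superior))  # True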

Relevance: 20.00%

Publisher:

Abstract:

The purpose of this research is to deepen the study of the section in architecture. The survey focuses on the project Teatro Domestico by Aldo Rossi, built for the XVII Triennale di Milano in 1986, and, through its application to several topics of architecture, aims to verify its timeliness and fertility for new compositional exercises. Through the study of certain areas of Rossi's theory we tried to find a common thread for the reading of the theatre project. The theatre is the place of the ephemeral and the artificial, which is why its destiny is its end and its fatal loss. The design and construction of theatre settings has always had a double meaning, between the value of civil architecture and the testing of newly available technologies. Rossi's experiences in this area are clear examples of the inseparable relationship between the representation of architecture as art and the design of architecture as a model of reality. In the Teatro Domestico, the distinction between representation and the real world is constantly cancelled and restored through the reversal of meaning and through the jump of scale. At present, studies conducted on Rossi's work concern the relationship between architectural composition and the theory of form, focusing on the compositional development of a design process between typological analysis and the invention of form. The research, through the analysis of a few projects and drawings, will try to examine this issue through the rules of composition, both graphical and constructive, hoping to decipher the mechanism underlying the invention. The almost total lack of published material on the Teatro Domestico project and the opportunity to visit the archives that preserve the drawings have allowed the author of this study to investigate the internal issues of the project in depth, thus placing this research as a first step towards possible further analyses of Rossi's works linked to the world of performance. The final aim is therefore to produce material that can best describe Rossi's work. Through the reading of the material published by the author himself and the examination of unpublished material preserved in the archives, it was possible to develop new material and to increase knowledge about the work, which would otherwise be difficult to analyse. The research is divided into two parts. The first, taking into account the close relationship, most frequently mentioned by Rossi himself, between archaeology and architectural composition, stresses the importance of the tipo as a system for reading the urban composition as well as an open tool of invention. Taking up Ezio Bonfanti's essay on the work of the architect, we wanted to investigate how the paratactic method is applied to the early works and how, subsequently, the process reaches an accentuated complexity while keeping its basic terms stable. Following a brief introduction on the concept of the section and the different interpretations the term has had over time, we tried to use it to identify a methodology for reading Rossi's projects. The result is a constant typological interpretation of the term, not only related to the composition in plan but also extended to the elevations. The section is therefore understood as the overturning of the elevation onto the same plane: the terms used remain similar in character, although the approach is different. The identification of architectural phonemes allows comparison with other arts.
The research moves in the direction of language, trying to identify the relationship between representation and construction, between the ephemeral and the real world. In this sense it highlights the similarities between the graphic material produced by Rossi and some important examples by contemporary authors. The comparison of the compositional system with the surrealist world of painting and literature will facilitate the understanding and identification of possible rules applied by Rossi. The second part of the research is characterized by a focus on the intent of the chosen project. The Teatro Domestico embodies a number of elements that seem to conclude (as an end point, but also a new start) the author's path. With it, the experiments on the theatre begun with the project for the Teatrino Scientifico (1978) and continued with the project for the Teatro del Mondo (1979) converge into a lay tabernacle representing the collective and private memory of the city. Starting from a reading of the project, through the collection of published material, we analysed the explicit themes of the work, identifying their conceptual references. Following the examination of the unpublished original materials kept in the Aldo Rossi Collection of the Canadian Center for Architecture in Montréal, a virtual reconstruction of the project will be carried out using existing techniques of digital representation, adding to the material, albeit in a small way, a new element for future studies. The reconstruction is part of a larger body of research in which the current technologies of composition and representation in architecture stand side by side with research on the compositional method of this architect. The results achieved add to past experiences dealing with the reconstruction of some of Aldo Rossi's lost works. A partial objective is to reactivate a discourse around this work, considered minor among the many born of his prolific activity. The reassessment of such projects would raise the ephemeral works to the level of his most studied ones, giving them the value they have earned. In conclusion, the research aims to open a new field of interest in the section, not only as a technical instrument for the representation of an idea but as an actual mechanism through which the composition is formed and the idea is developed.

Relevance: 20.00%

Publisher:

Abstract:

In recent years an ever-increasing degree of automation has been observed in most industrial processes. This increase is motivated by the demand for systems with high performance in terms of the quality of the products/services generated, productivity, efficiency and low costs of design, realization and maintenance. This trend towards the growth of complex automation systems is rapidly spreading over automated manufacturing systems (AMS), where the integration of mechanical and electronic technology, typical of Mechatronics, is merging with other technologies such as Informatics and communication networks. An AMS is a very complex system that can be thought of as a set of flexible working stations and one or more transportation systems. To understand how important these machines are in our society, consider that every day most of us use bottles of water or soda, buy boxed products such as food or cigarettes, and so on. Another indication of their complexity is the fact that the consortium of machine producers has estimated that there are around 350 types of manufacturing machines. A large number of manufacturing machine industries are present in Italy, notably the packaging machine industry; in particular, a great concentration of this kind of industry is located in the Bologna area, which for this reason is called the "packaging valley". Usually, the various parts of an AMS interact in a concurrent and asynchronous way, and coordinating the parts of the machine to obtain a desired overall behaviour is a hard task. This is often the case in large-scale systems, organized in a modular and distributed manner. Even if the success of a modern AMS from a functional and behavioural point of view is still to be attributed to the design choices made in the definition of the mechanical structure and of the electrical/electronic architecture, the system that governs the control of the plant is becoming crucial, because of the large number of duties associated with it. Apart from the activity inherent in the automation of the machine cycles, the supervisory system is called upon to perform other main functions, such as: emulating the behaviour of traditional mechanical members, thus allowing a drastic constructive simplification of the machine and a crucial functional flexibility; dynamically adapting the control strategies according to the different productive needs and operational scenarios; obtaining a high quality of the final product through the verification of the correctness of the processing; guiding the operator in charge of the machine to promptly and carefully take the actions needed to establish or restore the optimal operating conditions; and managing in real time information on diagnostics, as a support to the maintenance operations of the machine. The kinds of facilities that designers can directly find on the market, in terms of software component libraries, in fact provide adequate support for the implementation of either top-level or bottom-level functionalities, typically pertaining to the domains of user-friendly HMIs, closed-loop regulation and motion control, and fieldbus-based interconnection of remote smart devices.
What is still lacking is a reference framework comprising a comprehensive set of highly reusable logic control components that, by focusing on the cross-cutting functionalities characterizing the automation domain, may help designers in the process of modelling and structuring their applications according to their specific needs. Historically, the design and verification process for complex automated industrial systems has been performed in an empirical way, without a clear distinction between functional and technological-implementation concepts and without a systematic method to deal organically with the complete system. Traditionally, in the field of analog and digital control, design and verification through formal and simulation tools have long been adopted, at least for multivariable and/or nonlinear controllers for complex time-driven dynamics, as in the fields of vehicles, aircraft, robots, electric drives and complex power electronics equipment. Moving to the field of logic control, typical of industrial manufacturing automation, the design and verification process is approached in a completely different way, usually a very "unstructured" one. No clear distinction between functions and implementations, or between functional architectures and technological architectures and platforms, is considered. Probably this difference is due to the different "dynamical framework" of logic control with respect to analog/digital control. As a matter of fact, in logic control discrete-event dynamics replace time-driven dynamics; hence most of the formal and mathematical tools of analog/digital control cannot be directly migrated to logic control to highlight the distinction between functions and implementations. In addition, in the common view of application technicians, logic control design is strictly connected to the adopted implementation technology (relays in the past, software nowadays), leading again to a deep confusion between the functional view and the technological view. In industrial automation software engineering, concepts such as modularity, encapsulation, composability and reusability are strongly emphasized and profitably realized in the so-called object-oriented methodologies. Industrial automation has lately been receiving this approach, as testified by some IEC standards (IEC 61131-3, IEC 61499) which have been considered in commercial products only recently. On the other hand, in the scientific and technical literature many contributions have already been proposed to establish a suitable modelling framework for industrial automation. In recent years it has been possible to note a considerable growth in the exploitation of innovative concepts and technologies from the ICT world in industrial automation systems. As far as logic control design is concerned, Model Based Design (MBD) is being imported into industrial automation from the software engineering field. Another key point in industrial automated systems is the growth of requirements in terms of availability, reliability and safety for technological systems. In other words, the control system should not only deal with the nominal behaviour, but should also deal with other important duties, such as diagnosis and fault isolation, recovery and safety management. Indeed, together with high performance, fault occurrences increase in complex systems.
This is a consequence of the fact that, as typically occurs in reliable mechatronic systems, in complex systems such as AMSs an increasing number of electronic devices are present alongside reliable mechanical elements, and these devices are more vulnerable by their own nature. The diagnosis and fault isolation problem in a generic dynamical system consists in the design of a processing unit that, by appropriately processing the inputs and outputs of the dynamical system, is capable of detecting incipient faults in the plant devices and of reconfiguring the control system so as to guarantee satisfactory performance. The designer should be able to formally verify the product, certifying that, in its final implementation, it will perform its required function while guaranteeing the desired level of reliability and safety; the next step is that of preventing faults and, when necessary, reconfiguring the control system so that faults are tolerated. On this topic, important improvements to the formal verification of logic control, fault diagnosis and fault-tolerant control derive from Discrete Event Systems theory. The aim of this work is to define a design pattern and a control architecture to help the designer of control logic in industrial automated systems. The work starts with a brief discussion of the main characteristics and a description of industrial automated systems in Chapter 1. In Chapter 2 a survey of the software engineering paradigms applied to industrial automation is presented. Chapter 3 presents an architecture for industrial automated systems based on the new concept of the Generalized Actuator, showing its benefits, while in Chapter 4 this architecture is refined using a novel entity, the Generalized Device, in order to achieve better reusability and modularity of the control logic. In Chapter 5 a new approach based on Discrete Event Systems is presented for the problem of software formal verification, together with an active fault-tolerant control architecture using online diagnostics. Finally, concluding remarks and some ideas on new directions to explore are given. Appendix A briefly reports some concepts and results about Discrete Event Systems which should help the reader understand some crucial points in Chapter 5, while Appendix B gives an overview of the experimental testbed of the Laboratory of Automation of the University of Bologna, used to validate the approaches presented in Chapters 3, 4 and 5. In Appendix C some component models used in Chapter 5 for formal verification are reported.
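Purely as an illustration of the flavour of Discrete Event Systems models used for such verification tasks (the toy plant and all names are assumptions, not the Generalized Actuator/Device architecture of the thesis), the following Python sketch models a small actuator as a finite automaton and checks by breadth-first search whether a fault state is reachable from the initial state.

    # Illustrative sketch: discrete-event model of a toy actuator plus a reachability check.
    from collections import deque

    transitions = {                      # (state, event) -> next state
        ("idle", "start"):  "moving",
        ("moving", "done"): "idle",
        ("moving", "jam"):  "fault",
        ("fault", "reset"): "idle",
    }

    def reachable(initial, target):
        """Breadth-first search over the automaton's state graph."""
        seen, queue = {initial}, deque([initial])
        while queue:
            state = queue.popleft()
            if state == target:
                return True
            for (src, _event), dst in transitions.items():
                if src == state and dst not in seen:
                    seen.add(dst)
                    queue.append(dst)
        return False

    print(reachable("idle", "fault"))    # True: the "jam" event makes the fault state reachable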

Relevância:

20.00% 20.00%

Publicador:

Resumo:

The PhD project focused on the study of poultry welfare conditions and their improvement. The project work was divided into 3 main research activities.

A) Field evaluation of the rearing conditions of meat chickens kept in intensive farms. Considering the lack of published reports concerning the overall Italian rearing conditions of broiler chickens, a survey was carried out to assess the welfare conditions of broilers reared by the most important poultry companies in Italy and to verify whether they are in accordance with the advice given in the European proposal COM (2005) 221 final. Chicken farm conditions, carcass lesions and meat quality were investigated. 1. The densities currently used in Italy are in accordance with the European proposal COM 221 final (2005), which suggests keeping broilers at a density lower than 30-32 kg live weight/m2 and not exceeding 38-40 kg live weight/m2. 2. The mortality rates in summer and winter agree with the mortality score calculated following the formula reported in the EU proposal COM 221 final (2005). 3. The incidence of damaged carcasses was very low and did not seem related to stocking density. 4. The FPD scores were generally above the maximum limit advised by the EU proposal COM 221 final (2005), although the stocking densities were lower than 30-32 kg live weight per m2. 5. It can be stated that the control of environmental conditions, particularly litter quality, appears to be a key issue in controlling the onset of foot pad dermatitis.

B) Manipulation of several farm parameters, such as litter material and depth, stocking density and light regimen, to improve chicken welfare conditions during the winter season. 1. Even though 2 different stocking densities were established in this study, the performances achieved by the chickens were almost identical among groups. 2. The FCR was significantly better under Standard conditions than for birds reared under Welfare conditions, i.e. with lower stocking density, more litter material and a light programme of 16 hours of light and 8 hours of dark. 3. In our trial, in the Standard groups we observed a higher content of moisture, nitrogen and ammonia released from the litter; therefore it can be assumed that the environmental characteristics were positively changed by the improvements in the rearing conditions adopted for the Welfare groups. 4. In the Welfare groups the exhausted litters of the pens were drier and broilers showed a lower occurrence of FPD. 5. The prevalence of hock burn lesions, like FPD, is high under poor litter quality conditions. 6. The combined effect of a lower stocking density, a greater amount of litter material and a photoperiod similar to the natural one positively influenced the welfare status of the chickens; in fact the occurrence of FPD in the Welfare groups was the lowest, keeping the score under the European threshold of the proposal COM 221 final (2005).

C) The purpose of the third study was to evaluate the effect of high or low stocking density of broiler chickens, different types of litter and the adoption of short or long lighting regimens on broiler welfare, through the evaluation of their productivity and the incidence of foot pad dermatitis during the hot season. 1. Feed efficiency was better for the Low Density than for the High Density broilers. 2. The appearance of FPD was not influenced by stocking density. 3. The foot examination revealed that lesions occurred more in birds kept on chopped wheat straw than on wood shavings. 4. In conclusion, the adoption of a short light regimen similar to that occurring in nature during summer reduces feed intake without modifying the growth rate, thus improving feed efficiency. Foot pad lesions were affected neither by stocking densities nor by light regimens, whereas wood shavings exerted a favourable effect in preserving foot pads in good condition.

D) A study was carried out to investigate more widely the possible role of 25-hydroxycholecalciferol supplemented in the diet of a commercial laying hen strain (Lohmann brown), in comparison with diets supplemented with D3 or with D3 + 25-hydroxycholecalciferol. Egg traits during a productive cycle, as well as the bone characteristics of the layers, were also evaluated to determine whether vitamin D3 may enhance the welfare status of the birds. 1. The weight of the egg and of its components is often greater in hens fed a diet enriched with 25-hydroxycholecalciferol. 2. Since eggs of the treated groups are heavier and a larger amount of shell is needed, a direct effect on shell strength is observed. 3. At 30 and at 50 wk of age, hens fed 25-hydroxycholecalciferol exhibited greater values of bone breaking force. 4. Radiographic density values obtained in the trial are always higher in hens fed 25-hydroxycholecalciferol in both treatments: supplemented for the whole laying cycle (25D3) or from 40 weeks of age onward (D3+25D3).
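For reference, the stocking-density figures quoted above (kg live weight per m2) follow by definition from the number of birds per unit floor area and their average final live weight; the numbers in the worked example below are purely hypothetical and not taken from the survey.

```latex
% Stocking density by definition (illustrative, hypothetical numbers):
\[
  d \;=\; \frac{n_{\text{birds}} \cdot \bar{w}_{\text{final}}}{A_{\text{floor}}}
  \qquad
  \text{e.g. } d = \frac{16 \times 2.3\ \text{kg}}{1\ \text{m}^2} \approx 36.8\ \tfrac{\text{kg}}{\text{m}^2},
\]
% which would stay below the 38--40 kg/m^2 ceiling but exceed the advised 30--32 kg/m^2.
```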

Relevância:

20.00% 20.00%

Publicador:

Resumo:

The problem addressed in this work concerns the allocation of expenditure among groups of food goods (consumed at home and away from home) and the changes this allocation has undergone over the last decade. The main objective of the proposed analysis is therefore to explain how variations in the share of expenditure devoted to the components of food consumption can be attributed to strictly economic factors, as well as to the socio-demographic characteristics of consumers. In order to evaluate the inter-temporal allocation of individual expenditure, the Almost Ideal Demand System (AIDS) of Deaton and Muellbauer is proposed as the analytical framework.
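For reference, the standard AIDS budget-share specification of Deaton and Muellbauer (1980), which the abstract adopts as its analytical framework, can be written as below; this is the textbook formulation, not necessarily the exact specification estimated in the thesis.

```latex
% Almost Ideal Demand System (Deaton & Muellbauer, 1980): w_i is the expenditure share
% of food group i, p_j are prices, x is total expenditure and P is a price index.
\[
  w_i \;=\; \alpha_i \;+\; \sum_j \gamma_{ij} \ln p_j \;+\; \beta_i \ln\!\left(\frac{x}{P}\right),
  \qquad
  \ln P \;=\; \alpha_0 + \sum_k \alpha_k \ln p_k + \tfrac{1}{2}\sum_j\sum_k \gamma_{kj}\ln p_k \ln p_j ,
\]
% subject to adding-up ($\sum_i \alpha_i = 1$, $\sum_i \gamma_{ij} = 0$, $\sum_i \beta_i = 0$),
% homogeneity ($\sum_j \gamma_{ij} = 0$) and symmetry ($\gamma_{ij} = \gamma_{ji}$) restrictions.
```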

Relevância:

20.00% 20.00%

Publicador:

Resumo:

The specific nature of content acquisition through digital interfaces condemns the epistemic agent to a fragmented interaction, insufficient from a computational, mnemonic and temporal point of view with respect to the mass of information accessible today through any implementation of the human-computer relationship, and invalidates the applicability of the standard model of knowledge as justified true belief, disavowing the concept of rationally grounded belief, the formation of which would instead require the agent to have at their disposal conceptual, computational and temporal resources that are in fact inaccessible. The consequence is that the agent, constrained by the ontological limitations typical of interaction with cultural interfaces, is forced to fall back on ambiguous, arbitrary and often more random than they believe processes of selecting and managing information, which give rise to genuine epistemological hybrids (in Latour's sense), made up of sensations and program outputs, unfounded beliefs, bits of indirect testimony and a whole series of human-digital relations, prompting a retreat into a transcendent dimension that finds in the sacred its most immediate sphere of actualization. Given all this, the present work is concerned with constructing a new epistemological paradigm of propositional knowledge obtainable through a digital content-acquisition interface, founded on the new concept of Tracciatura Digitale (Digital Tracing), defined as a process of digital acquisition of a set of traces, that is, meta-information of a testimonial nature. Once recognized as a process of content communication, this device will be based on the search for and selection of meta-information, i.e. traces, which will allow the implementation of approaches derived from decision analysis under bounded rationality; approaches which, besides being almost never used in this field, are ontologically suited to handling the kind of uncertainty found in the instantiation of the informational hybrid and which, under certain conditions, can assure the agent of the epistemic soundness of the acquired content.

Relevância:

20.00% 20.00%

Publicador:

Resumo:

Within this dissertation, the ground beetle (carabid) fauna of the riparian forests of the northern Upper Rhine between Mainz and Bingen was surveyed between 2002 and 2005 using pitfall traps (Barber traps). Various Rhine islands and near-shore mainland areas served as study plots. Five of the typical inhabitants of these plots (Agonum afrum, Nebria brevicollis, Oxypselaphus obscurus, Platynus assimilis, Pterostichus anthracinus) further served as model species for investigating the genetic diversity between the individual populations by means of RAPD analyses. All in all, over 20,000 individuals from 101 carabid species were caught in the study area. The most abundant representatives were Platynus assimilis, Pterostichus melanarius and Agonum afrum. High diversity and dominance indices on all plots point to the dynamics of the habitat and thus to the intactness of the studied riparian forests. The occurrence of different ecological groups provides further evidence of the constantly changing living conditions caused by recurring floods. Everywhere, the species that can cope with a certain degree of habitat disturbance, or escape it thanks to their high dispersal potential, clearly dominated: species that overwinter as adults, as well as macropterous, hygrophilous and small species. The sex ratio likewise shows clear signs of regular disturbance of the plots. Almost half of the species observed in the study area are on one of the Red Lists of Germany, Rhineland-Palatinate or Hesse; the whole area is therefore in high need of protection. The main focus of this work was on the influence of flood water levels on the species composition and the genetic diversity of the ground beetle populations. Therefore, the effect of those factors that are themselves directly influenced by extreme water levels was also examined. Above all, the floodplain type (softwood/hardwood) and the location of the plots on islands or on the mainland should be mentioned here, as they showed the clearest differences in species diversity and in the genetics of the individual populations. Further factors, such as water level dynamics, floodplain forest width, distance from the river and position relative to the dyke, also show associations with particular ecological groups. Only habitat size appears to have no influence on the diversities. Finally, negative effects on the ground beetle communities could also be shown for the year 2003, in which extremely hot and dry conditions prevailed.
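Since the abstract refers to diversity and dominance indices without naming them, the small sketch below merely illustrates two indices commonly used for carabid communities (Shannon diversity and Berger-Parker dominance), computed from hypothetical pitfall-trap counts; it is not code or data from the dissertation.

```python
# Illustrative sketch only: two common community indices of the kind referred to in the
# abstract (which does not specify which indices were used), computed from species counts.
import math

def shannon_diversity(counts):
    """Shannon index H' = -sum(p_i * ln p_i) over species proportions p_i."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

def berger_parker_dominance(counts):
    """Berger-Parker dominance d = N_max / N: proportion of the most abundant species."""
    return max(counts) / sum(counts)

if __name__ == "__main__":
    # Hypothetical pitfall-trap counts for one plot (not data from the dissertation).
    plot_counts = [120, 80, 45, 30, 10, 5]
    print(f"Shannon H' = {shannon_diversity(plot_counts):.2f}")
    print(f"Berger-Parker d = {berger_parker_dominance(plot_counts):.2f}")
```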