14 results for publicly verifiable

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance:

10.00%

Publisher:

Abstract:

This Ph.D. thesis collects the research work I conducted under the supervision of Prof. Bruno Samorì in 2005, 2006, and 2007. Some parts of the work included in Part III were begun during my undergraduate thesis in the same laboratory and then completed during the initial part of my Ph.D.: the whole set of results has been included for the sake of understanding and completeness. During my graduate studies I worked on two very different protein systems. The theoretical trait d’union between these studies, at the biological level, is the acknowledgement that protein biophysical and structural studies must, in many cases, take into account the dynamical states of protein conformational equilibria and the local physico-chemical conditions in which the system studied actually performs its function. This is introduced in the introductory part, in Chapter 2. Two different examples are presented: the structural significance deriving from the action of mechanical forces in vivo (Chapter 3) and the complexity of conformational equilibria in intrinsically unstructured proteins and amyloid formation (Chapter 4). My experimental work investigated both examples, in both cases using the single molecule force spectroscopy technique (described in Chapters 5 and 6). The work conducted on angiostatin focused on characterizing the relationships between the mechanochemical properties and the mechanism of action of the angiostatin protein and, most importantly, their intertwining with the further layer of complexity due to disulfide redox equilibria (Part III). These studies were accompanied by the elaboration of a theoretical model for a novel signalling pathway that may be relevant in the extracellular space, detailed in Chapter 7.2. 
The work conducted on α-synuclein (Part IV) instead brought a whole new twist to the single molecule force spectroscopy methodology, applying it as a structural technique to elucidate the conformational equilibria present in intrinsically unstructured proteins. These equilibria are of utmost interest from a biophysical point of view, but most importantly because of their direct relationship with amyloid aggregation and, consequently, the aetiology of relevant pathologies like Parkinson’s disease. The work characterized, for the first time, conformational equilibria in an intrinsically unstructured protein at the single molecule level and, again for the first time, identified a monomeric folded conformation that is correlated with conditions leading to α-synuclein aggregation and, ultimately, Parkinson’s disease. Also, during the research work, I found myself in need of a general-purpose application that could solve some logistic and data analysis problems common to single molecule force spectroscopy. I developed an application that addresses some of these problems, herein presented (Part V), which I aim to release publicly soon.
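The data analysis mentioned above typically involves fitting the unfolding events in force-extension curves with polymer elasticity models. As a purely illustrative sketch (not the application described in Part V), the standard worm-like chain interpolation formula of Marko and Siggia, with assumed parameter values, can be written as:

```python
# Illustrative sketch: worm-like chain (WLC) force-extension model, a common
# tool for analyzing unfolding events in single-molecule force spectroscopy
# traces. Parameter values are assumptions chosen for the example.

KBT = 4.11  # thermal energy at room temperature, pN*nm

def wlc_force(extension, contour_length, persistence_length=0.4):
    """Force (pN) needed to stretch a WLC to a given end-to-end extension (nm),
    using the Marko-Siggia interpolation formula."""
    x = extension / contour_length  # relative extension, must stay below 1
    if not 0 <= x < 1:
        raise ValueError("extension must lie in [0, contour_length)")
    return (KBT / persistence_length) * (0.25 / (1 - x) ** 2 - 0.25 + x)

# Example: force to stretch an unfolded 30 nm polypeptide segment to 25 nm
f = wlc_force(25.0, 30.0)
```

In a real analysis, this function would be fitted to each sawtooth peak of a trace to estimate the contour-length increase released by every unfolding event.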

Relevance:

10.00%

Publisher:

Abstract:

The scaling down of transistor technology allows microelectronics manufacturers such as Intel and IBM to build ever more sophisticated systems on a single microchip. The classical interconnection solutions based on shared buses or direct connections between the modules of the chip are becoming obsolete, as they struggle to sustain the increasingly tight bandwidth and latency constraints that these systems demand. The most promising solution for future chip interconnects are Networks on Chip (NoCs): networks composed of routers and channels used to interconnect the different components installed on a single microchip. Examples of advanced processors based on NoC interconnects are the IBM Cell processor, composed of eight CPUs and installed in the Sony PlayStation 3, and the Intel Teraflops project, composed of 80 independent (simple) microprocessors. On-chip integration is becoming popular not only in the Chip Multi Processor (CMP) research area but also in the wider and more heterogeneous world of Systems on Chip (SoC). SoCs comprise all the electronic devices that surround us, such as cell phones, smartphones, home embedded systems, automotive systems, set-top boxes, etc. SoC manufacturers such as ST Microelectronics, Samsung and Philips, as well as universities such as Bologna University, M.I.T. and Berkeley, are all proposing proprietary frameworks based on NoC interconnects. These frameworks help engineers switch design methodology and speed up the development of new NoC-based systems on chip. In this Thesis we give an introduction to CMP and SoC interconnection networks. Then, focusing on SoC systems, we propose:
• a detailed, simulation-based analysis of the Spidergon NoC, an ST Microelectronics solution for SoC interconnects. The Spidergon NoC differs from many classical solutions inherited from the parallel computing world; here we analyze this NoC topology and its routing algorithms in detail. Furthermore, we propose aEqualized, a new routing algorithm designed to optimize the use of the resources of the network while also increasing its performance;
• a methodology flow based on modified publicly available tools that, combined, can be used to design, model and analyze any kind of System on Chip;
• a detailed analysis of an ST Microelectronics proprietary transport-level protocol that the author of this Thesis helped to develop;
• a simulation-based comprehensive comparison of different network interface designs proposed by the author and the researchers at the AST lab, in order to integrate shared-memory and message-passing based components on a single System on Chip;
• a powerful and flexible solution to address the timing closure exception issue in the design of synchronous Networks on Chip. Our solution is based on relay station repeaters and makes it possible to reduce the power and area demands of NoC interconnects while also reducing their buffer needs;
• a solution to simplify the design of NoCs while also increasing their performance and reducing their power and area consumption. We propose to replace complex and slow virtual channel-based routers with multiple, flexible, small Multi Plane ones. This solution allows us to reduce the area and power dissipation of any NoC while also increasing its performance, especially when resources are reduced.
This Thesis has been written in collaboration with the Advanced System Technology laboratory in Grenoble, France, and the Computer Science Department at Columbia University in the City of New York.
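For readers unfamiliar with the topology: Spidergon arranges an even number N of nodes on a ring and adds a diametral "across" link at each node. The sketch below is illustrative only; it shows each node's three neighbors and the classical across-first shortest-path scheme from the Spidergon literature, not the aEqualized algorithm proposed in the thesis.

```python
# Sketch of the Spidergon NoC topology: N nodes (N even) on a ring, each with
# clockwise, counter-clockwise, and "across" (diametral) links.
# Illustrative only; the thesis's own routing algorithm is not reproduced here.

def spidergon_neighbors(node, n):
    """Return the three neighbors of `node` in an n-node Spidergon."""
    assert n % 2 == 0, "Spidergon requires an even node count"
    return {
        "clockwise": (node + 1) % n,
        "counterclockwise": (node - 1) % n,
        "across": (node + n // 2) % n,
    }

def across_first_route(src, dst, n):
    """Across-first routing: take the diametral link when the ring distance
    exceeds n/4, then travel the rest of the way along the ring."""
    path, cur = [src], src
    delta = (dst - cur) % n          # signed ring distance in (-n/2, n/2]
    if delta > n // 2:
        delta -= n
    if abs(delta) > n // 4:          # across link is the shorter option
        cur = (cur + n // 2) % n
        path.append(cur)
        delta = (dst - cur) % n
        if delta > n // 2:
            delta -= n
    step = 1 if delta >= 0 else -1
    while cur != dst:                # walk the ring in the shorter direction
        cur = (cur + step) % n
        path.append(cur)
    return path
```

For example, in a 16-node Spidergon a packet from node 0 to node 7 crosses to node 8 and then takes one counter-clockwise hop, for two hops instead of seven along the ring.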

Relevance:

10.00%

Publisher:

Abstract:

I. Max Bill is an intense giornata of a big fresco. An analysis of the main social, artistic and cultural events of the twentieth century is needed in order to trace his career through his masterpieces and architectures. Some of the faces of this hypothetical mural painting are, among others, Le Corbusier, Walter Gropius, Ernesto Nathan Rogers, Kandinskij, Klee, Mondrian, Vantongerloo and Ignazio Silone, while the backcloth is given by the artistic avant-gardes, the Bauhaus, the International Exhibitions, the CIAM, war events, reconstruction, the Milan Triennali, the Venice Biennali and the School of Ulm. An architect, even though better known as a painter, sculptor, designer and graphic artist, Max Bill attends the Bauhaus as a student in the years 1927-1929, and from this experience derives the main features of a rational, objective, constructive and non-figurative art. His research is devoted to giving his art a scientific methodology: each work proceeds from the analysis of a problem to the logical, and always verifiable, solution of that same problem. By means of composition elements (such as rhythm, seriality, theme and its variation, harmony and dissonance), he faces, with consistent results, themes apparently very distant from each other, such as the project for the H.f.G. or the design for a font. Mathematics is a constant reference frame, as a field of certainties, order and objectivity: ‘for Bill mathematics are never confined to a simple function: they represent a climate of spiritual certainties, and also the theme of non attempted in its purest state, objectivity of the sign and of the geometrical place, and at the same time restlessness of the infinity: Limited and Unlimited’. In almost sixty years of activity, experiencing all artistic fields, Max Bill works, projects, designs, and holds conferences and exhibitions in Europe, Asia and the Americas, confronting himself with the most influential personalities of the twentieth century. 
In such a vast scenery, the need to limit the field of investigation combined with the necessity to address and analyse the unpublished and original aspects of Bill’s relations with Italy. The original contribution of the present research regards this particular ‘geographic delimitation’: in particular, beyond the deep cultural exchanges between Bill and a series of Milanese architects, above all Rogers, two main projects have been addressed: the realtà nuova at the Milan Triennale in 1947, and the Contemporary Art Museum in Florence in 1980. It is important to note that these projects have not been previously investigated, and the former never even appears in the sources. These works, together with the better-known ones, such as the projects for the VI and IX Triennale and the Swiss pavilion for the Biennale, add important details to the reference frame of the relations between Zurich and Milan. Most of the occasions for exchange took place between the Thirties and the Fifties, years during which Bill underwent a significant period of artistic growth. He meets the Swiss progressive architects and the Paris artists of the Abstraction-Création movement, enters the CIAM, collaborates with Le Corbusier on the third volume of his Complete Works, and in Milan works on, and is confronted with, the events related to post-war reconstruction. In these years Bill defines his own working methodology, attaining artistic maturity in his work. The present research investigates the mentioned time period, with some necessary exceptions. II. The official Max Bill bibliography is naturally wide, including popularizing works along with others more devoted to analytical investigation, mainly written in German and often translated into French and English (Max Bill himself published his works in three languages). Few works have been published in Italian and, excluding the catalogue of the Parma exhibition of 1977, they cannot be considered comprehensive. 
Many publications are exhibition catalogues, some of which include essays written by Max Bill himself; others present Bill’s comments in an educational-pedagogical approach, to accompany the observer towards a full understanding of the composition processes of his art works. Bill also left a great amount of theoretical speculation to encourage a critical reading of his works, in the form of books edited or written by him and essays published in ‘Werk’, the magazine of the Swiss Werkbund, and other international reviews, among which Domus and Casabella. These three reviews have been important tools of analysis, since they include traces of some of Max Bill’s architectural works. The architectural aspect is less investigated than the plastic and pictorial ones in all the main reference manuals on the subject: Benevolo, Tafuri and Dal Co, Frampton and Allenspach consider Max Bill as an artist proceeding in his work from the Bauhaus to the Ulm experience. A first inventory of his works was published in 2004 in the monographic issue of the Spanish magazine 2G, together with critical essays by Karin Gimmi, Stanislaus von Moos, Arthur Rüegg and Hans Frei, and in ‘Konkrete Architektur?’, again by Hans Frei. Also noteworthy are the monographic essay on the Atelier Haus building by Arthur Rüegg from 1997, and the DPA 17 issue of the Catalonia Polytechnic with contributions by Carlos Martì, Bruno Reichlin and Ton Salvadò, the latter publication concentrating on a few of Bill’s themes and architectures. An urge to study Max Bill’s works in depth was marked in 2008 by the centenary of his birth and by a recent rediscovery of Bill as initiator of the ‘minimalist’ tradition in Swiss architecture. Bill’s heirs are both very active in promoting exhibitions, researching and publishing. Jakob Bill, Max Bill’s son and a painter himself, recently published a work on Bill’s experience at the Bauhaus, and earlier had published an in-depth study of the ‘Endless Ribbons’ sculptures. 
Angela Thomas Schmid, Bill’s wife and an art historian, published at the end of 2008 the first volume of a biography of Max Bill and, together with the film maker Eric Schmid, produced a documentary film which was also presented at the last Locarno Film Festival. Both biography and documentary concentrate on Max Bill’s political involvement, from antifascism and the 1968 protest movements to Bill’s experiences as a Zurich Municipality councilman and member of the Swiss Confederation Parliament. In the present research, the bibliography also includes direct sources, such as interviews and original materials in the form of correspondence and graphic works together with related essays, kept in the max+binia+jakob bill stiftung archive in Zurich. III. The results of the present research are organized into four main chapters, each of them subdivided into four parts. The first chapter concentrates on the research field and on the reasons, tools and methodologies employed, whereas the second consists of a short biographical note organized by topics, introducing the subject of the research. The third chapter, which includes unpublished events, traces the historical and cultural frame, with particular reference to the relations between Max Bill and the Italian scene, especially Milan and the architects Rogers and Baldessari around the Fifties, searching for the themes and keys of interpretation of Bill’s architectures and investigating the critical debate in the reviews and the plastic survey through sculpture. The fourth and last chapter examines four main architectures, chosen on a geographical basis and all devoted to exhibition spaces, investigating Max Bill’s composition process in relation to the pictorial field. Painting has surely been easier and faster to investigate and verify than the field of building. 
A doctoral thesis discussed in Lausanne in 1977, investigating Max Bill’s plastic and pictorial works, provided a series of devices which were corrected and adapted to define the interpretation grid for the composition structures of Bill’s main architectures. Four different tools are employed in the investigation of each work: a context analysis related to the results of chapter three; a specific theoretical essay by Max Bill briefly explaining his main theses, even though not directly linked to the very work of art considered; the interpretation grid for the composition themes, derived from a related pictorial work; and the architectural drawing and digital three-dimensional model. The double analysis of the architectural and pictorial fields serves to underline the relation among the different elements of the composition process; the two fields, however, cannot be compared and they remain, in Max Bill’s works as in the present research, interdependent though self-sufficient. IV. An important aspect of Max Bill’s production is self-referentiality: talking of Max Bill, also through Max Bill, as a need for coherence rather than a limitation of method. Ernesto Nathan Rogers describes Bill as the last humanist, and his horizon is the known world; but, like the ‘Concrete Art’ of which he is one of the main representatives, his production justifies itself: Max Bill not only found a method, but autonomously re-wrote the ‘rules of the game’, derived timeless theoretical principles and verified them through a rich and interdisciplinary artistic production. The most recurrent words in the present research are synthesis, unity, space and logic. These terms are part of Max Bill’s vocabulary and can be referred to his works. Similarly, the graphic settings and analytical schemes in this research text referring to, or commenting on, Bill’s architectural projects were drawn up keeping in mind the concise precision of his architectural design. 
As with Mies van der Rohe, it has been written that Max Bill took art to ‘zero degree’, reaching in this way a high complexity. His works are a synthesis of art: they conceptually encompass all previous and, considering their developments, most contemporary pictures. Contents and message are generally explicitly declared in the title or in Bill’s essays on his artistic works and architectural projects: the beneficiary is invited to go through and re-build the process of synthesis generating the shape. In the course of an interview, the Milanese artist Getulio Alviani tells how he would not write more than a page for an essay on Josef Albers: everything was already evident ‘on the surface’ and any additional sentence would be redundant. Two years after that interview, these pages attempt to decompose and single out the elements and processes connected with some of Max Bill’s works which, by their own origin, already contain all possible explanations and interpretations. The formal reduction in favour of the maximization of content is, perhaps, Max Bill’s main lesson.

Relevance:

10.00%

Publisher:

Abstract:

Premise: In the literary works of our anthropological and cultural imagination, the various languages and the different discursive practices are not necessarily quoted, expressly alluded to or declared through clear expressive mechanisms; instead, they rather constitute a substratum, a background, now consolidated, which with irony and intertextuality shines through the thematic and formal elements of each text. The various contaminations, hybridizations and promptings that we find in the expressive forms, the rhetorical procedures and the linguistic and thematic choices of post-modern literary texts are shaped as fluid and familiar categories. Exchanges and passages are no longer only allowed but also inevitable; the post-modern imagination is made up of an agglomeration of discourses that are no longer really separable, built up from texts that blend and quote one another, composing, each with its own specificities, the great family of the cultural products of our social scenario. A literary work, therefore, is not only a whole phenomenon, delimited hic et nunc by a beginning and an ending, but is a fragment of that complex, dense and boundless network that is given by the continual interrelations between human forms of communication and symbolization. The research hypothesis: A vision is delineated of comparative literature as a discipline attentive to the social contexts in which texts take shape and move and to the media-type consistency that literary phenomena inevitably take on. Hence literature is seen as an open systematicity that chooses to be contaminated by other languages and other discursive practices of an imagination that is more than ever polymorphic and irregular. Inside this interpretative framework the aim is to focus the analysis on the relationship that postmodern literature establishes with advertising discourse. 
On one side, post-modern literature is inserted in the world of communication, loudly asserting the blending and reciprocal contamination of literary modes with media ones, absorbing their languages and signification practices, translating them now into thematic nuclei, motifs and sub-motifs and now into formal expedients and new narrative choices; on the other side, advertising is chosen as a signification practice of the media universe which, since the 1960s, has actively contributed to shaping the dynamics of our socio-cultural scenarios, in terms just as important as those of other discursive practices. Advertising has always been a form of communication and symbolization that draws on the collective imagination – myths, actors and values – turning them into specific narrative programs for its own texts. Hence the aim is to interpret and analyze this relationship both from a strictly thematic perspective – trying to understand what literature speaks about when it speaks about advertising, and seeking advertising quotations in post-modern fiction – and from a formal perspective, searching for parallels and discordances between the rhetorical procedures, the languages and the verifiable stylistic choices in the texts of the two different signification practices. The analysis method chosen, for the purpose of a constructive multiplication of perspectives, aims to approach the analytical processes of semiotics, applying its instruments when possible, in order to highlight the thematic and formal relationships between literature and advertising. The corpus: The corpus of literary texts is made up of various novels and, although attention is focused on the post-modern period, there will also be ineludible quotations from essential authors whose works prompted various reflections: H. de Balzac, Zola, Fitzgerald, Joyce, Calvino, etc. However, the analysis focuses on three authors: Don DeLillo, Martin Amis and Aldo Nove, and in particular the following novels: “Americana” (1971) and “Underworld” (1999) by Don DeLillo, “Money” (1984) by Martin Amis, and “Woobinda and other stories without a happy ending” (1996) and “Superwoobinda” (1998) by Aldo Nove. The corpus is restricted to these novels for two fundamental reasons: 1. assuming parameters of spatio-temporal evaluation, the texts are representative of different socio-cultural contexts and collective imaginations (from the masterly glimpses of American life by DeLillo, to the examples of contemporary Italian life by Nove, to the English imagination of Amis) and of different historical moments (the 1970s of DeLillo’s Americana, the 1980s of Amis, the 1990s of Nove, decades often used as criteria for dividing postmodernism into phases); 2. adopting a perspective of strictly thematic analysis, as mentioned in the research hypothesis, the variations and constants in the novels (thematic nuclei, topoi, images and narrative developments) frequently speak of advertising and, inside the narrative plot, affirm its various expressions and realizations: in values, themes, texts, urban settings, etc. In these novels the themes and processes of signification of advertising discourse pervade time, space and the relationships that the narrator-character builds around himself. 
We are looking at “particle-characters” whose endless facets attest to the influence and contamination of advertising in a large part of the narrative developments of the plot: on everyday life, on the processes of acquiring and encoding reality, on ideological and cultural baggage, on the relationships and interchanges with the other characters, etc. Often the characters are victims of the implacable consequentiality of the advertising mechanism, since the latter gets the upper hand over the usual processes of communication, which are overwhelmed by it, wittingly or unwittingly (for example: disturbing openings in which the protagonist kills his or her parents on the basis of a TV spot; former advertisers who live life codifying it through the commercial mechanisms of products; sons and daughters of advertisers who as children, instead of playing outside, watched tapes of commercials for whole nights). Hence the analysis arises from the text and aims to show how much the developments and narrative plots of the novels encode, elaborate and recount the myths, values and narrative programs of advertising discourse, transforming them into novel components in their own right. Also starting from the text, a socio-cultural reference context is delineated, a collective imagination that differs now geographically, now historically, and from the comparison between them the aim is to deduce the constants, similarities and variations in the relationship between literature and advertising.

Relevance:

10.00%

Publisher:

Abstract:

Analysis of publicly available genomes of Streptococcus pneumoniae has led to the identification of a new genomic element resembling gram-positive pilus islets (PIs). Here, we demonstrate that this genomic region, herein referred to as PI-2 (containing the genes pitA, sipA, pitB, srtG1, and srtG2), codes for a novel functional pilus in pneumococcus. There are therefore two pilus islets identified so far in this pathogen (PI-1 and PI-2). Polymerization of the PI-2 pilus requires the backbone protein PitB as well as the sortase SrtG1 and the signal peptidase-like protein SipA. PI-2 is associated with serotypes 1, 2, 7F, 19A, and 19F, considered to be emerging in both industrialized and developing countries. Interestingly, strains belonging to clonal complex 271 (CC271) contain both PI-1 and PI-2, as revealed by genome analyses. In these strains both pili are surface exposed and independently assembled. Furthermore, in vitro experiments provide evidence that the pilus encoded by PI-2 of S. pneumoniae is involved in adherence. Thus, pneumococci encode at least two types of pili that may play a role in the initial host cell contact in the respiratory tract. In addition, the pilus proteins are potential antigens for inclusion in a new generation of pneumococcal vaccines. Adherence by pili could represent an important factor in bacterial community formation, since bacterial community formation has been demonstrated to play an important role in pneumococcal otitis media. In vitro quantification of bacterial community formation by S. pneumoniae was performed in order to investigate the possible role of pneumococcal pili in forming communities. Using different growth media, we were not able to see a clear association between pili and community formation. However, our findings revealed that strains belonging to MLST clonal complex CC15 efficiently form bacterial communities in vitro in a glucose-dependent manner. 
We compared the genomes of forty-four pneumococcal isolates, discovering four open reading frames specifically associated with CC15. These four genes are annotated as members of an operon responsible for the biosynthesis of a putative lantibiotic peptide, described as being involved in bacterial community formation. Our experiments show that deletion of the lantibiotic operon affects glucose-mediated community formation in the CC15 strain INV200. Moreover, since glucose consumption during bacterial growth produces an acidic environment, we tested bacterial community formation at different pH values and showed that the lantibiotic operon deletion also affected pH-mediated community formation in INV200. In conclusion, these data demonstrate that the putative lantibiotic operon is associated with in vitro bacterial community formation in pneumococcal CC15 strains.
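The comparative step described above (finding genes present in every CC15 isolate and absent from all others) can be illustrated, with entirely made-up gene and isolate names, as a simple presence/absence set computation:

```python
# Illustrative sketch (toy data): identifying open reading frames specifically
# associated with one clonal complex by presence/absence across isolates.
# All gene names and isolate labels below are invented for the example.

presence = {  # isolate -> set of ORFs detected in its genome
    "INV200":  {"orfA", "orfB", "lanM", "lanB"},   # CC15
    "CC15_b":  {"orfA", "lanM", "lanB"},           # CC15
    "other_1": {"orfA", "orfB"},                   # non-CC15
    "other_2": {"orfA"},                           # non-CC15
}
cc15 = {"INV200", "CC15_b"}

# ORFs present in every CC15 isolate...
in_all_cc15 = set.intersection(*(presence[i] for i in cc15))
# ...and absent from every other isolate
in_any_other = set.union(*(presence[i] for i in presence if i not in cc15))
cc15_specific = in_all_cc15 - in_any_other
```

With this toy data the computation singles out the two "lan" genes, mirroring how a clade-specific operon stands out once genomes are reduced to gene presence/absence tables.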

Relevance:

10.00%

Publisher:

Abstract:

The aim of this proposal is to offer an alternative perspective on the study of the Cold War, since insufficient attention is usually paid to those organizations that mobilized against the development and proliferation of nuclear weapons. The antinuclear movement began to mobilize between the 1950s and the 1960s, when it finally gained the attention of public opinion and helped to build a sort of global conscience about nuclear bombs. This was due to the activism of a significant part of the international scientific community, which offered powerful intellectual and political legitimization to the struggle, and to the combined actions of scientific and organized protests. This antinuclear conscience is something we usually tend to consider a fait accompli in the contemporary world, but the point is to show its roots and the way it influenced statesmen and political choices during the period of nuclear confrontation of the early Cold War. To understand what this conscience could be and how it should be defined, we have to look at the very meaning of nuclear weapons, which deeply modified the sense of war. Nuclear weapons seemed able to destroy human beings everywhere, with no realistic form of control over the damage they could set off, and they represented the last resource in the wide range of means of mass destruction. Even if we tend to consider this idea fully rational and incontrovertible, it was not born immediately with nuclear weapons themselves. Or better, not everyone in the world immediately shared it. Due to the particular climate of Cold War confrontation, deeply influenced by the persistence of realist paradigms in international relations, the British and U.S. governments looked at nuclear weapons simply as «a bullet». From the Trinity test to the signing of the Limited Test Ban Treaty in 1963, many things happened that helped to shift this view of nuclear weapons. 
First of all, more than ten years of scientific protest provided more informed knowledge about the consequences of nuclear tests and the use of nuclear weapons. Many scientists devoted their social activities to informing public opinion and policy-makers about the real significance of the power of the atom and the related danger for human beings. Secondly, some public figures, such as physicists, philosophers, biologists and chemists, appealed directly to the human community to «leave the folly and face reality», publicly sponsoring the antinuclear conscience. Then, several organizations led by political, religious or radical individuals gave these protests a formal structure. The Campaign for Nuclear Disarmament in Great Britain, as well as the National Committee for a Sane Nuclear Policy in the U.S., represented the voice of the masses against the attempts of governments to present nuclear arsenals as a fundamental part of the international equilibrium. Therefore, the antinuclear conscience could be defined as a feeling of opposition to the development and use of nuclear weapons, able to create a political issue oriented towards influencing military and foreign policies. Only by taking into consideration the strength of this pressure does it seem possible to understand not only the beginning of nuclear negotiations, but also the reasons that permitted the Cold War to remain cold.

Relevance:

10.00%

Publisher:

Abstract:

The present study aims at assessing the innovation strategies adopted within a regional economic system, the Italian region of Emilia-Romagna, as it faced the challenges of a changing international scenario. As the strengthening of regional innovative capabilities is regarded as a keystone to fostering a new phase of economic growth, it is also important to understand how the local industrial, institutional and academic actors have tackled the problem of innovation in the recent past. In this study we explore the approaches to innovation and the strategies adopted by the main regional actors through three different case studies. Chapter 1 provides a general survey of the innovative performance of the regional industries over the past two decades, as it emerges from statistical data and systematic comparisons at the national and European levels. The chapter also discusses the innovation policies that the regional government has set up since 2001 in order to strengthen collaboration among local economic actors, including universities and research centres. As mechanics is the most important regional industry, chapter 2 analyses the combination of knowledge and practices utilized from the 1960s to the 1990s in the design of a particular kind of machinery produced by G.D S.p.A., a world leader in the market of tobacco packaging machines. G.D is based in Bologna, the region’s capital, and is at the centre of the most important Italian packaging district. In chapter 3 the attention turns to the institutional level, focusing on how the local public administrations and the local, publicly-owned utility companies dealt with the creation of new telematic networks on the regional territory during the 1990s and 2000s. Finally, chapter 4 assesses the technology transfer carried out by the main university of the region – the University of Bologna – by focusing on the patenting activities involving its research personnel in the period 1960-2010.


Resumo:

A recent initiative of the European Space Agency (ESA) aims at the definition and adoption of a software reference architecture for use in the on-board software of future space missions. Our PhD project is placed in the context of that effort. At the outset of our work we gathered the industrial needs relevant to ESA and to the main European space stakeholders, and consolidated them into a set of high-level technical requirements. The conclusion we reached from that phase confirmed that the adoption of a software reference architecture was indeed the best solution for fulfilling those requirements. The software reference architecture we set out to build rests on four constituents: (i) a component model, to design the software as a composition of individually verifiable and reusable software units; (ii) a computational model, to ensure that the architectural description of the software is statically analyzable; (iii) a programming model, to ensure that the implementation of the design entities conforms to the semantics, assumptions, and constraints of the computational model; and (iv) a conforming execution platform, to actively preserve at run time the properties asserted by static analysis. The nature, feasibility, and fitness of constituents (ii), (iii), and (iv) had already been proved by the author in an international project that preceded the commencement of the PhD work. The core of the PhD project was therefore centered on the design and prototype implementation of constituent (i), the component model. Our proposed component model is centered on: (i) rigorous separation of concerns, achieved through support for design views and careful allocation of concerns to dedicated software entities; (ii) support for the specification and model-based analysis of extra-functional properties; and (iii) the inclusion of space-specific concerns.
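To illustrate the separation-of-concerns principle the abstract describes, the following is a minimal, purely hypothetical sketch (not the thesis's actual API): a component declares its functional contract through provided and required interfaces, while extra-functional annotations are kept in a separate view so they can be handed to static analysis under a computational model.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Port:
    name: str
    interface: str  # name of the provided/required interface type

@dataclass
class Component:
    """Illustrative design entity: the functional contract (ports) is
    kept separate from the extra-functional view, mirroring the
    separation of concerns described in the abstract. All names here
    are hypothetical."""
    name: str
    provides: list = field(default_factory=list)
    requires: list = field(default_factory=list)
    # Extra-functional view: e.g. activation pattern, period, deadline,
    # declared separately so a schedulability analysis can consume them.
    extra_functional: dict = field(default_factory=dict)

def bindable(producer: Component, consumer: Component) -> bool:
    """A binding is legal when every required interface of the consumer
    is matched by a provided interface of the same type."""
    provided = {p.interface for p in producer.provides}
    return all(r.interface in provided for r in consumer.requires)

sensor = Component("Sensor", provides=[Port("out", "TelemetryIF")])
logger = Component("Logger", requires=[Port("in", "TelemetryIF")],
                   extra_functional={"activation": "cyclic", "period_ms": 250})
```

Under this sketch, `bindable(sensor, logger)` holds because the required `TelemetryIF` is provided, and a verification step could reject illegal bindings before any code is generated.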


Resumo:

The identification of people by measuring traits of individual anatomy or physiology has led to a specific research area called biometric recognition. This thesis focuses on improving fingerprint recognition systems with respect to three important problems: fingerprint enhancement, fingerprint orientation extraction, and the automatic evaluation of fingerprint algorithms. An effective extraction of salient fingerprint features depends on the quality of the input fingerprint: if the fingerprint is very noisy, a reliable set of features cannot be detected. A new fingerprint enhancement method, both iterative and contextual, is proposed. This approach detects high-quality regions in fingerprints, selectively applies contextual filtering, and iteratively expands, like a wildfire, toward low-quality regions. A precise estimation of the orientation field greatly simplifies the estimation of other fingerprint features (singular points, minutiae) and improves the performance of a fingerprint recognition system. Fingerprint orientation extraction is improved along two directions. First, after introducing a new taxonomy of fingerprint orientation extraction methods, several variants of baseline methods are implemented and, by pointing out the role of pre- and post-processing, we show how to improve the extraction. Second, a new hybrid orientation extraction method, which follows an adaptive scheme, significantly improves orientation extraction in noisy fingerprints. Scientific papers typically propose recognition systems that integrate many modules, so an automatic evaluation of fingerprint algorithms is needed to isolate the contributions that determine actual progress in the state of the art.
The lack of a publicly available framework for comparing fingerprint orientation extraction algorithms motivated the introduction of a new benchmark area called FOE (including fingerprints and manually marked orientation ground truth), along with fingerprint matching benchmarks, in the FVC-onGoing framework. The success of this framework is attested by relevant statistics: more than 1450 algorithms submitted and two international competitions.
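As context for the orientation-extraction problem discussed above, here is a sketch of the textbook gradient-based baseline (averaging doubled gradient angles over a block), not the hybrid method proposed in the thesis: opposite gradient directions along the same ridge would cancel if averaged directly, so the angles are doubled before averaging and halved afterwards.

```python
import numpy as np

def block_orientation(block):
    """Estimate the dominant ridge orientation of an image block.

    Classic gradient-based baseline: accumulate the doubled-angle
    gradient statistics so that opposite gradient directions reinforce
    rather than cancel. Returns an angle in [0, pi), measured along
    the ridges (perpendicular to the mean gradient direction).
    """
    gy, gx = np.gradient(block.astype(float))  # gradients along rows, cols
    gxx = np.sum(gx * gx)
    gyy = np.sum(gy * gy)
    gxy = np.sum(gx * gy)
    theta = 0.5 * np.arctan2(2.0 * gxy, gxx - gyy) + np.pi / 2.0
    return theta % np.pi

# Synthetic ridge pattern: horizontal ridges (intensity varies with row).
rows = np.sin(2 * np.pi * np.arange(32) / 8.0)
horizontal = np.tile(rows[:, None], (1, 32))
```

On this synthetic pattern the estimator returns an orientation of 0 (horizontal ridges), and pi/2 on its transpose; real methods apply this per block, then smooth and post-process the resulting field.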


Resumo:

The general theme of the present inquiry is the role of training and the continuous updating of knowledge and skills in relation to the concepts of employability and social vulnerability. The empirical research covered the period from 13 February 2010 to 31 December 2010; the data refer to a very specific context, namely training courses funded by the Emilia-Romagna region and targeted at employees placed in the "cassa integrazione in deroga" wage-supplementation scheme and domiciled in the region. The investigation was carried out in a vocational training organisation accredited by Emilia-Romagna for the provision of publicly funded training courses. The quantitative data collected are limited to the region and distributed across all the provinces of Emilia-Romagna. The study addresses the role of continuing education throughout life and the importance of updating knowledge and skills, as privileged instruments for coping with the instability of the labour market and as a strategy for reducing the risk of unemployment. On the basis of the different strategies that employees put in place during their professional careers, we introduce two concepts that are increasingly common in the so-called knowledge society: social vulnerability and employability. In modern organisations, what becomes relevant is the knowledge that workers bring and the relationships that develop between people, which allow such knowledge and skills to grow and spread exponentially. Knowledge thus becomes the primary productive force, defined by Davenport and Prusak (1998) as a "fluid combination of experience, values, contextual information and specialist knowledge that provides a framework for the evaluation and assimilation of new experience and new information". Learning at work, however, is not always explicit and conscious, nor enjoyable for everyone, especially outside a formal training intervention. The study then addresses the specific issue of training in the context of an increasingly deconstructed labour market.


Resumo:

From the late 1980s, the automation of sequencing techniques and the spread of computers gave rise to a flourishing number of new molecular structures and sequences, and to a proliferation of new databases in which to store them. Here, three computational approaches are presented that analyse the massive amount of publicly available data in order to answer important biological questions. The first strategy studies the incorrect assignment of the first AUG codon in a messenger RNA (mRNA), due to the incomplete determination of its 5' end sequence. Using an automated expressed sequence tag (EST)-based approach, an extension of the mRNA 5' coding region was identified in 477 human loci, out of all the known human mRNAs analysed. Proof-of-concept confirmation was obtained by in vitro cloning and sequencing for GNB2L1, QARS and TDP2, and the consequences for functional studies are discussed. The second approach analyses codon bias, the phenomenon in which distinct synonymous codons are used with different frequencies, and, after integration with a gene expression profile, estimates the total number of codons present across all the expressed mRNAs (named here the "codonome value") in a given biological condition. Systematic analyses across different pathological and normal human tissues and multiple species show a surprisingly tight correlation between the codon bias and the codonome bias. The third approach studies the expression of genes implicated in human autism spectrum disorder (ASD). ASD-implicated genes sharing microRNA response elements (MREs) for the same microRNA are co-expressed in brain samples from healthy and ASD-affected individuals. The differential expression of a recently identified long non-coding RNA, which has four MREs for the same microRNA, could disrupt the equilibrium in this network, but further analyses and experiments are needed.
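The codonome idea, counting codon usage across expressed transcripts weighted by expression level, can be sketched as follows. This is an illustrative toy, not the thesis's actual pipeline; the function names and the tuple-based input format are assumptions.

```python
from collections import Counter

def codon_counts(cds):
    """Count codons in an in-frame coding sequence (string of A/C/G/T)."""
    usable = len(cds) - len(cds) % 3  # drop a trailing partial codon, if any
    return Counter(cds[i:i + 3] for i in range(0, usable, 3))

def codonome(seqs_expr):
    """Aggregate codon usage across expressed mRNAs, weighting each
    transcript's codon counts by its expression level; a sketch of the
    'codonome value' idea described in the abstract."""
    total = Counter()
    for cds, expression in seqs_expr:
        for codon, n in codon_counts(cds).items():
            total[codon] += n * expression
    return total

# Toy example: two transcripts with different expression levels.
profile = [("ATGAAA", 2.0), ("AAAAAA", 0.5)]
weighted = codonome(profile)
```

In the toy example, `AAA` contributes 1 x 2.0 from the first transcript and 2 x 0.5 from the second, so its weighted count is 3.0; summing over all codons would give a total codonome value for the condition.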


Resumo:

This study explores the links between migration processes and experiences of health and illness, starting from an investigation of Latin American migration to Emilia-Romagna. At the same time, it examines the terms of the debate on the spread of Chagas disease, a "neglected tropical infection" endemic in Central and South America which, owing to the growth of transnational migration flows, is now being reframed as "emerging" in some immigration contexts. Drawing on the theoretical and methodological paradigms of medical anthropology, global health, and migration studies, the study investigates the nature of the relationship between "neglect" and "emergency" in the policies that characterise the European, and specifically the Italian, migration context. It analyses questions tied to the legitimacy of the actors involved in redefining the phenomenon in the public sphere; to the visions informing the health strategies for managing the infection; and to the possible repercussions of those visions on care practices. Part of the research was carried out in the hospital ward where the first diagnosis and treatment service for the infection in Emilia-Romagna was implemented. An ethnography was therefore conducted both inside and outside the service, involving the main subjects of the field of inquiry, Latin American immigrants and health workers, with the aim of capturing visions, logics, and practices, starting from an analysis of the legislation regulating access to the public health service in Italy.
Through the collection of biographical narratives, the study has shed light on particular migration and life trajectories in the local context; it has allowed reflection on the validity of categories such as "Latin American", used by the scientific community in close association with Chagas disease; and it has reframed the meaning of an approach attentive to cultural connotations within a broader rethinking of the forms of inclusion and participation aimed at accommodating the most strongly perceived health needs and subjective experiences of illness.


Resumo:

The enigma of the relationship between Jesus and John the Baptist has always stimulated the historical imagination of scholars, prompting a variety of hypotheses and assessments, often very different from one another. Nevertheless, all agree on one point: that, at least in its major phase in Galilee, Jesus's ministry was an essentially autonomous reality, distinct, original, and irreducible with respect to John's mission. Against this "default setting", the present study argues the thesis that Jesus carried out his mission as the intentional and programmatic continuation of John's prematurely interrupted work. The first part closely examines what memory of the relationship is preserved in the earliest sources, namely Q (analysed here with particular attention) and Mark, to which Matthew is added: because of its close historical and sociological link with Q, Matthew offers an illuminating example of a retelling of that memory that is highly original and yet profoundly faithful. The conclusion is that the earliest memory of the Jesus-John relationship is deeply marked by features of agreement, conformity, and alignment. The second part examines a series of traditions attesting that Jesus was publicly perceived in relation to the Baptist, and that he himself shared and fostered that perception, appealing to John in polemic with his adversaries and portraying him as a figure of capital importance in his preaching and in his teaching to followers and disciples. Finally, the study argues for the existence of broad and substantial areas of agreement between the two at the level of eschatology, ethical instruction and social programme, penitential mission to sinners, and baptismal activity.
The hypothesis that Jesus carried forward John's reforming activity, in the form of a "penitential-baptismal-exorcistic" purificatory campaign in preparation for God's advent, finally makes it possible to harmonise satisfactorily the two most characteristic aspects of Jesus's activity, normally juxtaposed when not opposed: eschatology and miracles, Jesus the prophet and Jesus the wonder-worker.


Resumo:

This thesis investigates interactive scene reconstruction and understanding using RGB-D data only. Indeed, we believe that, in the near future, depth cameras will remain a cheap and low-power 3D sensing alternative suitable for mobile devices as well. Our contributions therefore build on top of state-of-the-art approaches to achieve advances in three challenging scenarios, namely mobile mapping, large-scale surface reconstruction, and semantic modeling. First, we will describe an effective approach to Simultaneous Localization And Mapping (SLAM) on platforms with limited resources, such as a tablet device. Unlike previous methods, dense reconstruction is achieved by reprojection of RGB-D frames, while local consistency is maintained by deploying relative bundle adjustment principles. We will show quantitative results comparing our technique to the state of the art, as well as detailed reconstructions of various environments ranging from rooms to small apartments. Then, we will address large-scale surface modeling from depth maps exploiting parallel GPU computing. We will develop a real-time camera tracking method based on the popular KinectFusion system, and an online surface alignment technique capable of counteracting drift errors and closing small loops. We will show very high quality meshes outperforming existing methods on publicly available datasets, as well as on data recorded with our RGB-D camera even in complete darkness. Finally, we will move to our Semantic Bundle Adjustment framework, which effectively combines object detection and SLAM in a unified system. Though the mathematical framework we will describe is not restricted to a particular sensing technology, in the experimental section we will again refer only to RGB-D sensing. We will discuss successful implementations of our algorithm, showing the benefit of joint object detection, camera tracking, and environment mapping.
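A basic building block underlying the camera tracking and surface alignment steps described above is rigid registration of matched 3D point sets. The following is a sketch of the closed-form least-squares solution (Kabsch/Procrustes, as used inside ICP-style alignment), offered only as a textbook illustration, not as the thesis's actual tracking pipeline.

```python
import numpy as np

def rigid_align(P, Q):
    """Closed-form least-squares rigid alignment (Kabsch/Procrustes):
    find rotation R and translation t minimizing
    sum_i ||R p_i + t - q_i||^2 for matched point sets P, Q of shape (N, 3)."""
    p_mean = P.mean(axis=0)
    q_mean = Q.mean(axis=0)
    H = (P - p_mean).T @ (Q - q_mean)        # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = q_mean - R @ p_mean
    return R, t

# Toy check: recover a known rotation about the z-axis plus a translation.
a = 0.3
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0,        0.0,       1.0]])
t_true = np.array([1.0, 2.0, 3.0])
P = np.array([[1.0, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 0]])  # non-coplanar
Q = P @ R_true.T + t_true
```

With perfect correspondences the transform is recovered exactly; in a real SLAM front end, correspondences come from feature matching or projective data association, and this closed-form step is iterated within ICP.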