22 results for RIGHTS GUARANTEES
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
Self-organisation is increasingly being regarded as an effective approach to tackle the complexity of modern systems. The self-organisation approach allows the development of systems exhibiting complex dynamics and adapting to environmental perturbations without requiring a complete knowledge of the future surrounding conditions. However, the development of self-organising systems (SOS) is driven by different principles with respect to traditional software engineering. For instance, engineers typically design systems by combining smaller elements, where the composition rules depend on the reference paradigm but typically produce predictable results. Conversely, SOS display non-linear dynamics, which can hardly be captured by deterministic models, and, although robust with respect to external perturbations, are quite sensitive to changes in inner working parameters. In this thesis, we describe methodological aspects concerning the early-design stage of SOS built relying on the Multiagent paradigm: in particular, we refer to the A&A metamodel, where MAS are composed of agents and artefacts, i.e. environmental resources. Then, we describe an architectural pattern that has been extracted from a recurrent solution in designing self-organising systems: this pattern is based on a MAS environment formed by artefacts, modelling non-proactive resources, and environmental agents acting on artefacts so as to enable self-organising mechanisms. In this context, we propose a scientific approach for the early design stage of the engineering of self-organising systems. The process is iterative, and each cycle is articulated in four stages: modelling, simulation, formal verification, and tuning. During the modelling phase we mainly rely on the existence of a self-organising strategy observed in Nature and, hopefully, encoded as a design pattern. Simulations of an abstract system model are used to drive design choices until the required quality properties are obtained, thus providing guarantees that the subsequent design steps would lead to a correct implementation. However, system analysis exclusively based on simulation results does not provide sound guarantees for the engineering of complex systems: for this purpose, we envision the application of formal verification techniques, specifically model checking, in order to exactly characterise the system behaviours. During the tuning stage, parameters are tweaked in order to meet the target global dynamics and feasibility constraints. In order to evaluate the methodology, we analysed several systems: in this thesis, we only describe three of them, i.e. the most representative ones for each of the three years of the PhD course. We analyse each case study using the presented method, and describe the formal tools and techniques exploited.
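To make the four-stage cycle above concrete, the following is a minimal, self-contained Python sketch of one plausible instantiation of the loop; the parameter being tuned (an "evaporation rate"), the toy simulator and the model-checking stub are hypothetical illustrations, not tools or results from the thesis.

```python
import random

# Hypothetical illustration of the iterative early-design cycle described
# above (modelling, simulation, formal verification, tuning). The parameter,
# the simulator and the model checker are toy stand-ins.

def simulate(evaporation_rate: float, runs: int = 200) -> float:
    """Toy stochastic simulation: estimated rate of reaching the target dynamics."""
    rng = random.Random(42)
    return sum(rng.random() < 1.0 - evaporation_rate for _ in range(runs)) / runs

def model_check(evaporation_rate: float) -> bool:
    """Stand-in for exhaustive verification of a qualitative property."""
    return 0.0 < evaporation_rate < 0.5  # e.g. "the pheromone eventually decays"

def design_cycle(target_success: float = 0.8, max_cycles: int = 20) -> float:
    rate = 0.45                                  # modelling: initial parameter guess
    for _ in range(max_cycles):
        if simulate(rate) < target_success:      # simulation against quality targets
            rate = round(rate - 0.05, 2)         # tuning: adjust the working parameter
            continue
        if not model_check(rate):                # formal verification (model checking)
            rate = round(rate - 0.05, 2)
            continue
        return rate                              # design frozen for implementation
    raise RuntimeError("no satisfactory design within the iteration budget")

print("chosen evaporation rate:", design_cycle())
```

Each iteration either returns a design that satisfies both the simulated quality target and the verified property, or revises the parameter and repeats, mirroring the modelling-simulation-verification-tuning cycle described above.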
Abstract:
Large scale wireless ad hoc networks of computers, sensors, PDAs etc. (i.e. nodes) are revolutionizing connectivity and leading to a paradigm shift from centralized systems to highly distributed and dynamic environments. An example of ad hoc networks are sensor networks, which are usually composed of small units able to sense and transmit to a sink elementary data which are successively processed by an external machine. Recent improvements in the memory and computational power of sensors, together with the reduction of energy consumption, are rapidly changing the potential of such systems, moving the attention towards data-centric sensor networks. A plethora of routing and data management algorithms have been proposed for network path discovery, ranging from broadcasting/flooding-based approaches to those using global positioning systems (GPS). We studied WGrid, a novel decentralized infrastructure that organizes wireless devices in an ad hoc manner, where each node has one or more virtual coordinates through which both message routing and data management occur without reliance on either flooding/broadcasting operations or GPS. The resulting ad hoc network does not suffer from the dead-end problem, which happens in geographic-based routing when a node is unable to locate a neighbor closer to the destination than itself. WGrid allows multidimensional data management since nodes' virtual coordinates can act as a distributed database without requiring any special implementation or reorganization. Any kind of data (both single- and multi-dimensional) can be distributed, stored and managed. We will show how a location service can be easily implemented so that any search is reduced to a simple query, just like for any other data type. WGrid has then been extended by adopting a replication methodology. We called the resulting algorithm WRGrid. Just like WGrid, WRGrid acts as a distributed database without requiring any special implementation or reorganization, and any kind of data can be distributed, stored and managed. We have evaluated the benefits of replication on data management, finding out, from experimental results, that it can halve the average number of hops in the network. The direct consequences of this are a significant improvement in energy consumption and workload balancing among sensors (number of messages routed by each node). Finally, thanks to the replicas, whose number can be arbitrarily chosen, the resulting sensor network can face sensor disconnections/connections, due to failures of sensors, without data loss. Another extension to WGrid is W*Grid, which strongly improves network recovery performance from link and/or device failures that may happen due to crashes or battery exhaustion of devices or to temporary obstacles. W*Grid guarantees, by construction, at least two disjoint paths between each couple of nodes. This implies that recovery in W*Grid occurs without broadcasting transmissions, guaranteeing robustness while drastically reducing the energy consumption. An extensive set of simulations shows the efficiency, robustness and traffic load of the resulting networks under several scenarios of device density and number of coordinates. Performance analyses have been compared to existing algorithms in order to validate the results.
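The abstract does not spell out WGrid's actual coordinate scheme, so the snippet below is only a hedged toy illustration of the general idea it describes: each node holds a virtual coordinate (here, a binary string assigned along a spanning tree, with parent and children assumed to be physical neighbours), and every forwarding decision is taken locally from coordinates alone, with no GPS positions and no flooding.

```python
# Toy illustration only (not the actual WGrid scheme): nodes carry binary-string
# virtual coordinates assigned along a spanning tree, whose parent/children are
# assumed to be physical neighbours; each hop is chosen locally from coordinates
# alone (no GPS, no flooding).

def next_hop(current: str, destination: str) -> str:
    """Coordinate of the neighbour to forward the message to."""
    if destination.startswith(current):
        # The destination lies in one of our subtrees: descend one level.
        return destination[: len(current) + 1]
    # Otherwise climb towards the common ancestor through our parent.
    return current[:-1]

def route(source: str, destination: str) -> list[str]:
    """Full coordinate path from source to destination."""
    path = [source]
    while path[-1] != destination:
        path.append(next_hop(path[-1], destination))
    return path

# Example: routing between two nodes of a small virtual-coordinate tree.
print(route("0101", "011"))  # ['0101', '010', '01', '011']
```

In this toy tree-shaped coordinate space greedy forwarding cannot hit a dead end; extensions that give each node more than one coordinate, as the abstract describes for W*Grid, presumably build their disjoint-path redundancy on top of the same coordinate-driven mechanism.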
Abstract:
This study aims at analysing Brian O'Nolan's literary production in the light of a reconsideration of the role played by his two most famous pseudonyms, Flann O'Brien and Myles na Gopaleen, behind which he was active both as a novelist and as a journalist. We tried to establish a new kind of relationship between them and their empirical author following recent cultural and scientific surveys in the field of Humour Studies, Psychology, and Sociology: taking as a starting point the appreciation of the comic attitude in nature and in cultural history, we progressed through a short history of laughter and derision, followed by an overview of humour theories. After having established such a frame, we considered an integration of scientific studies in the field of laughter and humour as a base for our study scheme, in order to come to a definition of the comic author as a recognised, powerful and authoritative social figure who acts as a critic of conventions. The history of laughter and the comic we briefly summarized, based on the one related by the French scholar Georges Minois in his work (Minois 2004), has been taken into account in the view that the humorous attitude is one of man's characteristic traits, always present and witnessed throughout the ages, though subject in most cases to repression by conservative cultural and political power. This sort of Super-Ego notwithstanding, or perhaps because of it, the comic impulse proved irreducible, precisely in its influence on current cultural debates. Drawing mainly on Robert R. Provine's (Provine 2001), Fabio Ceccarelli's (Ceccarelli 1988), Arthur Koestler's (Koestler 1975) and Peter L. Berger's (Berger 1995) scientific essays on the actual occurrence of laughter and smiling in complex social situations, we underlined the abundant evidence of how the use of the comic, humour and wit (in a Freudian sense) can best be comprehended if seen as a common mind process designed for the improvement of knowledge, in which we traced a close relation to the play-element the Dutch historian Huizinga highlighted in his famous essay, Homo Ludens (Huizinga 1955). We considered the comic and humour/wit as different sides of the same coin, and showed how the demonstrations scientists provided on this particular subject are not conclusive, given that the mental processes still cannot be irrefutably shown to be separate as regards gradations in comic expression and reception: in fact, different outputs in expression might lead back to one and the same production process, following the general 'Economy Rule' of evolution; man is the only animal who lies, meaning by this that one feeling is not necessarily biuniquely associated with one and the same outward display, so human expressions are not validating proofs of feelings. Considering societies, we found that in nature they are all organized in more or less the same way, that is, into élites who govern a community which, in turn, recognizes them as legitimate delegates for that task; we inferred from this the epistemological possibility of the existence of an additional ruling figure alongside the political and religious ones: this figure being the comic, the person in charge of expressing true feelings towards given subjects of contention.
Every community owns one, and his very peculiar status is validated by the fact that his place is within the community, living in it and speaking to it, while at the same time he is outside it, in the sense that his action focuses mainly on shedding light on ideas and objects placed outside the boundaries of social convention: taboos, fears, sacred objects and, finally, culture are the favourite targets of the comic person's arrow. This is the reason for the word a(rche)typical as applied to the comic figure in society: atypical in a sense, because unconventional and disrespectful of traditions, critical and never at ease with unblinkered respect of canons; archetypical, because the 'village fool', buffoon, jester or anyone in any kind of society who plays such roles is an archetype in the Jungian sense, i.e. a personification of an irreducible side of human nature that everybody instinctively knows: the beginner of a tradition, the perfect type, what is most conventional of all and therefore the exact opposite of an atypical. There is an intrinsic necessity, we think, for such figures in societies, just like politicians and priests, who should play an elitist role in order to guide and rule not for their own benefit but for the good of the community. We are not naïve and do know that actual owners of power always tend to keep it indefinitely: the 'social comic' as a role of power has nonetheless the distinctive feature of being the only job whose tension is not towards stability. It carries within itself the rewarding permission of contradiction, for the very reason we set out before: the comic must cast an eye both inside and outside society, and his vision may perforce be inconsistent; it is nonetheless satisfying for the popularity it earns amongst readers and audience. Finally, the difference between governors, priests and comic figures is the seriousness of the first two (fundamentally monologic) and the merry contradiction of the third (essentially dialogic). MPs, mayors, bishops and pastors should always console, comfort and soothe the popular mood in respect of public convention; the comic has the opposite task of provoking, urging and irritating, accomplishing at the same time a sort of check on the soothing powers of society, the keepers of righteousness. In this view, the comic person assumes a paramount importance in the counterbalancing of power administration, whether in the form of acting in public places or in written pieces which could circulate for private reading. At this point our Irish writer Brian O'Nolan (1911-1966) comes into question, the real name that stood behind the more famous masks of Flann O'Brien, novelist, author of At Swim-Two-Birds (1939), The Hard Life (1961), The Dalkey Archive (1964) and, posthumously, The Third Policeman (1967); and of Myles na Gopaleen, journalist, keeper for more than 25 years of the Cruiskeen Lawn column in The Irish Times (1940-1966), and author of the famous book-parody in Irish, An Béal Bocht (1941), later translated into English as The Poor Mouth (1973). Brian O'Nolan, a professional senior civil servant of the Republic, has never had his authorship recognized in literary studies, since all of them concentrated on his alter egos Flann, Myles and some others he used for minor contributions. As far as we know, this is the first study which places the real name in the title, thereby acknowledging in him a unity of intent that no one did before.
And this choice in titling is not a mere mark of distinction for its own sake, but also a wilful sign of how his opus should now be reconsidered. In effect, the aim of this study is exactly that of demonstrating how the empirical author Brian O'Nolan was the real Deus in machina, the master of puppets who skilfully directed all of his identities in planned directions, so as to completely fulfil the role of the comic figure we explained before. Flann O'Brien and Myles na Gopaleen were personae and not persons, but the impression one gets from the critical studies on them is the exact opposite. Literary consideration, which came only after O'Nolan's death, began with Anne Clissmann's work, Flann O'Brien: A Critical Introduction to His Writings (Clissmann 1975), while the most recent book is Keith Donohue's The Irish Anatomist: A Study of Flann O'Brien (Donohue 2002), passing through M. Keith Booker's Flann O'Brien, Bakhtin and Menippean Satire (Booker 1995), Keith Hopper's Flann O'Brien: A Portrait of the Artist as a Young Post-Modernist (Hopper 1995) and Monique Gallagher's Flann O'Brien, Myles et les autres (Gallagher 1998). There have also been a couple of biographies, which incidentally try to explain critical points of his literary production, while many critical studies do the same from the opposite side, trying to ground critical points of view in the author's restless life and habits. At this stage, we attempted to merge into O'Nolan's corpus the journalistic articles he wrote, more than 4,200, amounting to roughly two million words over the column's 26-year run. To justify this, we appealed to several considerations about the figure O'Nolan used as a writer: Myles na Gopaleen (later simplified to na Gopaleen), who was the equivalent of the street artist or storyteller, speaking to his imaginary public and trying to involve it in his stories, quarrels and debates of all kinds. First of all, he relied much on language for the reactions he would obtain, playing on, and with, words so as to ironically unmask untrue relationships between words and things. Secondly, he pushed to the limit the convention of addressing spectators and listeners usually employed in live performance, stretching its role in written discourse to achieve a greater effect of reader involvement. Lastly, he profited much from what we labelled his 'specific weight', i.e. the potential influence in society given by his recognised authority in certain matters, a position from which he could launch deeper attacks on conventional beliefs, thus complying with the duty of the comic we hypothesised before: that of criticising society even at the risk of losing the benefits the post guarantees. That seemingly masochistic tendency has its rationale. Every representative has many privileges on the assumption that he, or she, has great responsibilities in administration. The higher those responsibilities, the higher the reward, but also the severer the punishment for misdeeds done while in charge. But we all know that not everybody accepts the rules, and many try to use their power for their personal benefit and do not want to undergo the law's penalties. The comic, showing in this case more civic sense than others, and helped very much in this by his lack of access to the use of public force, finds in the role of the scapegoat the proper accomplishment of his task, accepting punishment when his breaking of conventions is too stark to be forgiven.
As Ceccarelli demonstrated, the role of the object of laughter (comic, ridicule) has its very own positive side: there is freedom of expression for the person, and at the same time integration in society, even though at a low level. Thus the banishment of a 'social' comic can never reach total extirpation from society, revealing how the scope of the comic lies on an entirely fictional layer, bearing no relation to facts, nor real consequences in terms of physical health. Myles na Gopaleen, mastering in the highest degree the three characteristics we postulated, can be considered an author worth noting; and the oeuvre he wrote, the whole collection of Cruiskeen Lawn articles, is rightfully a novel because it respects the canons of the genre, especially regarding the authorial figure and his relationship with the readers. In addition, his work can be studied even if we cannot conduct our research on the whole of it, this procedure being justified precisely because of the resemblance to the real figure of the storyteller: its 'chapters', the daily articles, had a format that even the distracted reader could follow, even one who had not read each and every article before. So we can also critically consider a good part of them, as collected in the seven volumes published so far, with the addition of some others outside the collections, because completeness in this case is by no means a guarantee of better precision in the assessment; on the contrary, examination of the totality of the articles might lead us to consider him as a person and not a persona. Having cleared up these points, we proceeded to consider the works of Brian O'Nolan tout court as the works of a single author, rather than complicating the references with many names which are none other than well-wrought sides of the same personality. By taking O'Nolan as the correct object of our research, empirical author of the works of the personae Flann O'Brien and Myles na Gopaleen, a clearer literary landscape emerges: the comic author Brian O'Nolan, conscious of his paramount role in society as both a guide and a scourge, in a word as an a(rche)typical, intentionally chose to differentiate his personalities so as to create different perspectives in different fields of knowledge, using, in addition, different means of communication: novels and journalism. We finally compared the newly assessed author Brian O'Nolan with other great Irish comic writers in English, such as James Joyce (the one everybody named as the master in the field), Samuel Beckett, and Jonathan Swift. This comparison showed once more how O'Nolan is in no way inferior to these authors who, greatly celebrated by critics, have nonetheless failed to achieve the great public recognition O'Nolan received as Myles, granted by the daily audience he reached and influenced with his Cruiskeen Lawn column. For this reason, we believe him to be representative of the comic figure's function as a social regulator and as a builder of solidarity, such as the one Raymond Williams spoke of in his work (Williams 1982), with the aim of building a 'culture in common' in mind. There is no way for a 'culture in common' to be acquired if we do not accept the fact that even the most functional society rests on conventions, and in a world that is more and more 'connected' we need someone to help everybody negotiate with different cultures and persons.
The comic gives us a worldly perspective which is at the same time comfortable and distressing, but in the end not as harmful as the one furnished by politicians could be: he lets us peep into parallel worlds without moving too far from our armchair and, as a consequence, is the one who does his best for the improvement of our understanding of things.
Abstract:
The Time-Of-Flight (TOF) detector of ALICE is designed to identify charged particles produced in Pb-Pb collisions at the LHC, in order to address the physics of strongly-interacting matter and the Quark-Gluon Plasma (QGP). The detector is based on the Multigap Resistive Plate Chamber (MRPC) technology, which guarantees the excellent performance required for a large time-of-flight array. The construction and installation of the apparatus in the experimental site have been completed and the detector is presently fully operational. All the steps which led to the construction of the TOF detector were accompanied by a set of quality assurance procedures to ensure high and uniform performance, and eventually the detector was commissioned with cosmic rays. This work aims at giving a detailed overview of the ALICE TOF detector, also focusing on the tests performed during the construction phase. The first data-taking experience and the first results obtained with cosmic rays during the commissioning phase are presented as well, and confirm the readiness of the TOF detector for LHC collisions.
Abstract:
Family businesses have acquired a very significant weight in the economy of Western countries, generating most of the employment and wealth in recent decades. In Spain, family businesses represent 65% of all enterprises, with 1.5 million companies. They give employment to 8 million people, 80% of private employment, and generate 65% of the Spanish GNP (Gross National Product). Moreover, the family business needs a complete legal regulation that satisfies its own necessities and challenges. These companies have to deal with the national and international economic scene to assure their permanence and competitiveness. In fact, statistics show that family companies have an average life of 35 years, and European family businesses complete their succession process successfully in only 10 to 25% of cases. It is said: the first generation makes, the second generation stays, the third generation distributes. In that sense, the Recommendation of the European Commission of 7 December 1994 on the succession of small and medium-sized companies has prompted reforms of European internal legal orders in order to make the succession process easier and to introduce practices of good family-company governance. Thus Italian law, with the Law of 14 February 2006, has reformed the Civil Code, introducing a new concept, the 'Patto di famiglia', which abolishes the prohibition laid down in Article 458 on succession agreements, admitting the possibility that the testator guarantees the continuity of the company, or of the family society, by giving it, totally or in part, to one or more of his descendants. On the other hand, Spain has promulgated the Royal Decree of 9 February 2007, which governs the publicity of family agreements (protocolos familiares). These 'protocolos familiares' (family agreements) are understood as an accord of wills, consented to and accepted unanimously by all the family members and the company, taking into account recommendations and practices of good family-company governance.
Abstract:
The Ph.D. dissertation analyses the reasons for which political actors (governments, legislatures and political parties) decide consciously to give away a source of power by increasing the political significance of the courts. It focuses on a single case of particular significance: the passage of the Constitutional Reform Act 2005 in the United Kingdom. This Act has deeply changed the governance and the organization of the English judicial system, and has provided a much clearer separation of powers and a stronger independence of the judiciary from the executive and the legislature. What's more, this strengthening of judicial independence was decided in a period in which the political role of the English judges was evidently increasing. I argue that the reform can be interpreted as a «paradigm shift» (Hall 1993) that has changed the way in which the judicial power is considered. The most widespread conceptions in the sub-system of English judicial policy have shifted, and a new paradigm has become dominant. The new paradigm includes: (i) stronger separation of powers, (ii) a collective (as well as individual) conception of the independence of the judiciary, (iii) reduction of the political accountability of the judges, (iv) formalization of the guarantees of judicial independence, (v) a principle-driven (instead of pragmatic) approach to the reforms, and (vi) transformation of a non-codified constitution into a codified one. Judicialization through political decisions represents an important, but not fully explored, field of research. The literature, in particular, has focused on factors unable to explain the English case: the competitiveness of the party system (Ramseyer 1994), the political uncertainty at the time of constitutional design (Ginsburg 2003), the cultural divisions within the polity (Hirschl 2004), federal institutions and division of powers (Shapiro 2002). All these contributions link the decision to enhance the political relevance of the judges to some kind of diffusion of political power. In contemporary England, characterized by a relatively high concentration of power in the government, the reasons for such a reform should be located elsewhere. I argue that the Constitutional Reform Act 2005 can be interpreted as a result of three different kinds of reasons: (i) the social and demographic transformations of the English judiciary, which have rendered most of the previous mechanisms of governance inefficient, (ii) the role played by the judges in the policy process, and (iii) the cognitive and normative influences originating from the European context, as a consequence of the membership of the United Kingdom in the European Union and the Council of Europe. My thesis is that only a full analysis of all three of these aspects can explain the decision to reform the judicial system and the content of the Constitutional Reform Act 2005. Above all, only the cultural influences coming from the European legal complex can explain the paradigm shift previously described.
Abstract:
The theme of infrastructures, understood as part of the architecture of urban space and of the territory, plays a central role in many contemporary projects and is the reason for this research. The object of the study is the extra-urban route of the via Emilia, an ancient Roman consular road whose layout dates back to the 2nd century BC, in its stretch between the cities of Rimini and Forlì. Studying the road in its relationship with the local territory has meant first of all considering the via Emilia as an artefact, but also as a route that unfolds over time. The research therefore seeks to show how, in parallel with the evolution of its section and the geometry of its alignment, its use has also changed, as has the way in which the road is "measured", named and managed. Within a critical reflection on the shape and role of the road through the centuries, the thesis rereads the territory in its dimension of "palimpsest", recognising and isolating certain moments in which the via Emilia assumed a "symbolic" value recalling Imperial Rome. The loss of meaning of the via Emilia, understood as an element that "constructs" the territory, originates with the process of diffuse urbanization that has invested the extra-urban territory since the end of the Second World War. The current condition of the road, increasingly congested by vehicular traffic, is the premise for a reflection on the future of its shape and of the settlements it crosses. The strategy proposed by the local authorities, which foresees doubling the road by building the via Emilia Bis, does not merely provide an infrastructural upgrade: it also represents an opportunity to relieve the existing alignment of its function as the principal axis of extra-urban communication. The via Emilia could thus regain its role as a narrative itinerary, through the configuration of its public spaces, the architecture of its buildings and the meaning of its monuments, and become a privileged space of relationship and aggregation.
Abstract:
This thesis deals with the design of advanced OFDM systems. Both waveform and receiver design have been treated. The main scope of the thesis is to study, create, and propose ideas and novel design solutions able to cope with the weaknesses and crucial aspects of modern OFDM systems. Starting from the transmitter side, the problem represented by low resilience to non-linear distortion has been assessed. A novel technique that considerably reduces the Peak-to-Average Power Ratio (PAPR), yielding a quasi-constant signal envelope in the time domain (PAPR close to 1 dB), has been proposed. The proposed technique, named Rotation Invariant Subcarrier Mapping (RISM), is a novel scheme for subcarrier data mapping, where the symbols belonging to the modulation alphabet are not anchored but maintain some degrees of freedom. In other words, a bit tuple is not mapped on a single point; rather, it is mapped onto a geometrical locus which is totally or partially rotation invariant. The final positions of the transmitted complex symbols are chosen by an iterative optimization process in order to minimize the PAPR of the resulting OFDM symbol. Numerical results confirm that RISM makes OFDM usable even in severe non-linear channels. Another well-known problem which has been tackled is the vulnerability to synchronization errors. Indeed, in OFDM systems an accurate recovery of carrier frequency and symbol timing is crucial for the proper demodulation of the received packets. In general, timing and frequency synchronization is performed in two separate phases, called PRE-FFT and POST-FFT synchronization. Regarding the PRE-FFT phase, a novel joint symbol timing and carrier frequency synchronization algorithm has been presented. The proposed algorithm is characterized by very low hardware complexity and, at the same time, guarantees very good performance in both AWGN and multipath channels. Regarding the POST-FFT phase, a novel approach for both pilot structure and receiver design has been presented. In particular, a novel pilot pattern has been introduced in order to minimize the occurrence of overlaps between two pattern-shifted replicas. This allows conventional pilots to be replaced with nulls in the frequency domain, introducing the so-called Silent Pilots. As a result, the optimal receiver turns out to be very robust against severe Rayleigh fading multipath and characterized by low complexity. The performance of this approach has been analytically and numerically evaluated. Comparing the proposed approach with state-of-the-art alternatives, in both AWGN and multipath fading channels, considerable performance improvements have been obtained. The crucial problem of channel estimation has been thoroughly investigated, with particular emphasis on the decimation of the Channel Impulse Response (CIR) through the selection of the Most Significant Samples (MSSs). In this context our contribution is twofold: on the theoretical side, we derived lower bounds on the estimation mean-square error (MSE) performance for any MSS selection strategy; on the receiver design side, we proposed novel MSS selection strategies which have been shown to approach these MSE lower bounds and to outperform the state-of-the-art alternatives. Finally, the possibility of using Single Carrier Frequency Division Multiple Access (SC-FDMA) in the Broadband Satellite Return Channel has been assessed.
Notably, SC-FDMA is able to improve the physical-layer spectral efficiency with respect to the single carrier systems which have been used so far in the Return Channel Satellite (RCS) standards. However, it requires strict synchronization and is also sensitive to the phase noise of local radio frequency oscillators. For this reason, an effective pilot tone arrangement within the SC-FDMA frame and a novel Joint Multi-User (JMU) estimation method for SC-FDMA have been proposed. As shown by numerical results, the proposed scheme manages to satisfy strict synchronization requirements and to guarantee proper demodulation of the received signal.
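As a point of reference for the PAPR figures quoted above, the following is a small numpy sketch of how the Peak-to-Average Power Ratio of a single OFDM symbol is conventionally measured (peak over mean power of the oversampled time-domain IFFT output). It only illustrates the metric that RISM minimizes; the constellation, subcarrier count and oversampling factor are arbitrary choices for the example, and the RISM mapping itself is not reproduced here.

```python
import numpy as np

def papr_db(freq_symbols: np.ndarray, oversample: int = 4) -> float:
    """Peak-to-Average Power Ratio (in dB) of one OFDM symbol.

    freq_symbols: complex constellation points, one per subcarrier.
    oversample: frequency-domain zero-padding factor for a finer time-domain estimate.
    """
    n = len(freq_symbols)
    # Zero-pad in the middle of the spectrum, then move to the time domain.
    padded = np.concatenate([freq_symbols[: n // 2],
                             np.zeros((oversample - 1) * n, dtype=complex),
                             freq_symbols[n // 2:]])
    power = np.abs(np.fft.ifft(padded)) ** 2
    return 10 * np.log10(power.max() / power.mean())

# Example: PAPR of a random QPSK-mapped OFDM symbol with 256 subcarriers.
rng = np.random.default_rng(0)
bits = rng.integers(0, 2, size=(256, 2))
qpsk = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)
print(f"PAPR = {papr_db(qpsk):.2f} dB")  # typically on the order of 10 dB for random QPSK
```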
Abstract:
Italy and France in Trianon's Hungary: two political and cultural penetration models. During the first post-war period, Danubian Europe was the theatre of an Italian-French diplomatic contest for hegemony in that part of the continent. Because of its geographical position, Hungary had a decisive strategic importance for the ambitions of French and Italian foreign policy. Since in the 1920s culture and propaganda became the fourth dimension of international relations, Rome and Paris developed their diplomatic action in Hungary to affirm not only political and economic influence, but also cultural supremacy. In the 1930s, after Hitler's rise to power, the unstoppable comeback of German political influence in central-eastern Europe determined the progressive decline of Italian and French political and economic positions in Hungary: only the cultural field allowed the survival of Italian-Hungarian and French-Hungarian relations in the context of a Europe dominated by Nazi Germany during the Second World War. Nevertheless, the radical geopolitical changes in Europe during the second post-war period did not compromise the Italian and French cultural presence in the new communist Hungary. Although cultural diplomacy is originally motivated by contingent political aims, it does not follow the short time-scale of politics; rather, it is the only foreign-policy tool that guarantees the preservation of bilateral relations in the long run.
Abstract:
Cost, performance and availability considerations are forcing even the most conservative high-integrity embedded real-time systems industry to migrate from simple hardware processors to ones equipped with caches and other acceleration features. This migration disrupts the practices and solutions that industry had developed and consolidated over the years to perform timing analysis. Industries that are confident in the efficiency and effectiveness of their verification and validation processes for old-generation processors do not have sufficient insight into the effects of the migration to cache-equipped processors. Caches are perceived as an additional source of complexity, which has the potential to shatter the guarantees of cost- and schedule-constrained qualification of their systems. The current industrial approach to timing analysis is ill-equipped to cope with the variability incurred by caches. Conversely, the application of advanced WCET analysis techniques to real-world industrial software, developed without analysability in mind, is hardly feasible. We propose a development approach aimed at minimising cache jitters, as well as at enabling the application of advanced WCET analysis techniques to industrial systems. Our approach builds on: (i) identification of those software constructs that may impede or complicate timing analysis in industrial-scale systems; (ii) elaboration of practical means, under the model-driven engineering (MDE) paradigm, to enforce the automated generation of software that is analysable by construction; (iii) implementation of a layout optimisation method to remove cache jitters stemming from the software layout in memory, with the intent of facilitating incremental software development, which is of high strategic interest to industry. The integration of those constituents in a structured approach to timing analysis achieves two interesting properties: the resulting software is analysable from the earliest releases onwards, as opposed to becoming so only when the system is final, and more easily amenable to advanced timing analysis by construction, regardless of the system scale and complexity.
Abstract:
The study carried out on production techniques for structural components in composite material provided a precise understanding of the state of the art of the sector, in particular with reference to the processes currently used for medium-to-large series industrialization. With the aim of combining the main advantages of these technologies and allowing the realization of more complex shapes, a feasibility analysis was carried out, through a functional study and a first design of a production technology for the automated taping of structural components in composite material. The flexibility and consistency of the process were then demonstrated by designing a taped carbon-fibre frame, interchangeable with the tubular steel FSAE 2009 frame (same engine mounting points, rear subframe mounting points, and front suspension attachments) and guaranteeing a substantial weight advantage at equal torsional stiffness. The characterization of this frame was carried out by means of structural calculation, validated by experimental tests.
Abstract:
The research project falls within the field of judicial informatics, the sector that studies the information systems implemented in judicial offices with the aim of improving the efficiency of the service and providing a lever for reducing lengthy trial times, with the ultimate goal of better guaranteeing the rights granted to citizens and increasing the country's competitiveness. The specific object of study of the research project is the use of ICT in criminal proceedings. This is a less studied area than civil proceedings, yet the efficiency crisis of the trial is no less felt there: the backlog to be cleared as of 30 June 2011 was quantified at 3.4 million criminal proceedings, and the average time needed to conclude them is four years and nine months. To look at criminal proceedings through the eyes of information systems design is to see an uninterrupted flow of information that includes realities located upstream and downstream of the trial itself: from the transmission of the notice of a crime to the execution of the sentence. In this perspective the importance of correct information management becomes evident: the quantity, accuracy and speed of access to information are such crucial factors for criminal proceedings that the efficiency of the information system and the quality of the justice delivered are strongly interrelated. The research project aims to identify the conditions under which efficiency can actually be achieved and, above all, to verify which technological choices can preserve, or even strengthen, the principles and guarantees of criminal proceedings. Criminal proceedings in fact involve fundamental rights of the individual, such as personal liberty, dignity and privacy, which are protected through a wide range of procedural rights such as the presumption of innocence, the right of defence, the right to an adversarial hearing, and the re-educational purpose of punishment.
Abstract:
Photography is used intermedially for the narration of counter-memories and traumatic memories, resorting to numerous different modes and strategies of insertion and use. If, on the one hand, intermediality cannot be reduced to a set of conventional practices but depends on the narrative context, on the other hand it possesses an organic quality that aligns it functionally with the processes and inquiries into the representability of trauma. Moreover, thanks to the versatility of its multifaceted nature, intermedial narrative practice (in its most diverse configurations) takes on an epistemological and methodological value with respect to studies on the externalization and reworking of trauma. This study aims to compare theoretical texts and narrative texts in order to highlight their mutual contribution.
Abstract:
This research aims to identify and resolve some problems of classification and applicable rules concerning the institution regulated by Article 8 of Law No. 40/2007, as subsequently amended and supplemented, defined at the regulatory level as «portabilità del mutuo» (loan portability). In particular, we asked how the new rules on the transferability of loans can fit within the discipline of subrogation if the latter is not regarded as a possible instrument for the circulation of credit, and whether one can go so far as to consider Article 8 a modern rewriting of the institution found in the Civil Code. Although Article 8 has not been limited to mortgage loans, such loans constitute the main field of application of the rules. For this reason it has been argued that the provision, rather than the 'portability of the loan', is intended to encourage the 'portability of the mortgage', meaning by the latter the subrogation of the new lender into the mortgage credit, or more specifically into the mortgage, pursuant to Article 1202 of the Civil Code. The study of the effects of subrogation, as provided for by the 2007 law, on guarantees in general and on mortgages in particular has shown how the legislator, by introducing a simplified discipline, intended to adapt traditional legal institutions to the practical needs of flexibility of the credit market; this, however, with little success and leaving open certain interpretative doubts. In order to deepen the research, the subject matter was also addressed from a comparative perspective, noting the main differences at the European level regarding the circulation of credit, loan portability and the transferability of guarantees.
Abstract:
In this work we studied the efficiency of the benchmarks used in the asset management industry. In chapter 2 we analyzed the efficiency of the benchmarks used for the government bond markets. We found that for Emerging Market bonds an equally weighted index for the country weights is probably the more suitable, because it guarantees maximum diversification of country risk, whereas for the Eurozone government bond market a GDP-weighted index is better, because the most important issue is to avoid giving a higher weight to highly indebted countries. In chapter 3 we analyzed the efficiency of a Derivatives Index, instead of a Cash Index, for investing in the European corporate bond market. We can state that the two indexes are similar in terms of returns, but that the Derivatives Index is less risky because it has a lower volatility, has values of skewness and kurtosis closer to those of a normal distribution, and is a more liquid instrument, as the autocorrelation is not significant. Chapter 4 analyzes the impact of fallen angels on corporate bond portfolios. Our analysis investigated the impact of the month-end rebalancing of the ML Emu Non Financial Corporate Index caused by the exit of downgraded bonds (the event). We can conclude that a flexible approach to the month-end rebalancing is better, in order to avoid a loss of value due to the benchmark construction rules. In chapter 5 we compared the equally weighted and capitalization-weighted methods for the European equity market. The benefit which results from reweighting the portfolio into equal weights can be attributed to the fact that EW portfolios implicitly follow a contrarian investment strategy, because they mechanically rebalance away from stocks that increase in price.
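As a toy illustration of the comparison in chapter 5, the snippet below contrasts capitalization weights with 1/N equal weights on three hypothetical stocks; the figures are invented for the example and are unrelated to the thesis data, but the final comment points at the mechanical contrarian rebalancing effect described above.

```python
import numpy as np

def cap_weights(market_caps: np.ndarray) -> np.ndarray:
    """Capitalization weights: proportional to each stock's market cap."""
    return market_caps / market_caps.sum()

def equal_weights(n_assets: int) -> np.ndarray:
    """Equal weights: 1/N regardless of size or past performance."""
    return np.full(n_assets, 1.0 / n_assets)

# Toy example with three stocks (hypothetical market caps in billions).
caps = np.array([100.0, 30.0, 10.0])
prices_t0 = np.array([50.0, 20.0, 10.0])
prices_t1 = np.array([60.0, 20.0, 9.0])   # first stock rallies, third falls

returns = prices_t1 / prices_t0 - 1
print("CW portfolio return:", cap_weights(caps) @ returns)
print("EW portfolio return:", equal_weights(3) @ returns)
# Rebalancing the EW portfolio back to 1/N at t1 sells part of the winner
# and buys the loser, i.e. the implicit contrarian behaviour cited above.
```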