12 results for Single case
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
The Ph.D. dissertation analyses the reasons why political actors (governments, legislatures and political parties) consciously decide to give away a source of power by increasing the political significance of the courts. It focuses on a single case of particular significance: the passage of the Constitutional Reform Act 2005 in the United Kingdom. This Act has deeply changed the governance and organization of the English judicial system, providing a much clearer separation of powers and a stronger independence of the judiciary from the executive and the legislature. Moreover, this strengthening of judicial independence was decided in a period in which the political role of English judges was evidently increasing. I argue that the reform can be interpreted as a «paradigm shift» (Hall 1993) that has changed the way in which judicial power is considered. The most widespread conceptions in the sub-system of English judicial policy have shifted, and a new paradigm has become dominant. The new paradigm includes: (i) a stronger separation of powers, (ii) a collective (as well as individual) conception of the independence of the judiciary, (iii) a reduction of the political accountability of judges, (iv) the formalization of the guarantees of judicial independence, (v) a principle-driven (instead of pragmatic) approach to reform, and (vi) the transformation of a non-codified constitution into a codified one. Judicialization through political decisions represents an important, but not fully explored, field of research. The literature, in particular, has focused on factors unable to explain the English case: the competitiveness of the party system (Ramseyer 1994), political uncertainty at the time of constitutional design (Ginsburg 2003), cultural divisions within the polity (Hirschl 2004), and federal institutions and the division of powers (Shapiro 2002). All these contributions link the decision to enhance the political relevance of judges to some kind of diffusion of political power. In contemporary England, characterized by a relatively high concentration of power in the government, the reasons for such a reform must be located elsewhere. I argue that the Constitutional Reform Act 2005 can be interpreted as the result of three different kinds of reasons: (i) the social and demographic transformations of the English judiciary, which made most of the previous mechanisms of governance inefficient, (ii) the role played by the judges in the policy process, and (iii) the cognitive and normative influences originating from the European context, as a consequence of the United Kingdom's membership in the European Union and the Council of Europe. My thesis is that only a full analysis of all three aspects can explain the decision to reform the judicial system and the content of the Constitutional Reform Act 2005. Above all, only the cultural influences coming from the European legal complex can explain the paradigm shift previously described.
Abstract:
Matteo Allodi's thesis analyses some social-welfare practices aimed at minors and families in difficulty, relating to reception projects carried out in residential facilities. In particular, Matteo Allodi focuses on reception projects developed in some family Communities whose intervention methodology is characterized by an orientation towards a subsidiary model of social work aimed at recovering parental bonds and parenting skills. The first part of the thesis addresses the theoretical dimension of a project-based approach to social intervention which, by placing the relationships of the subjects involved at its centre, can promote their activation towards the goal of recovering parenthood. From a theoretical point of view, Allodi concentrates on how to deliver a personal service guided by the principle of subsidiarity, that is, oriented towards enhancing the actors' reflexive capacities. In the second part, Allodi presents the investigation conducted in some family-type Communities in Parma. The initial research strategy is the case study. Allodi chooses to investigate the phenomenon starting from participant observation with an ethnomethodological orientation, integrated with interviews with key informants. In this phase, a first qualitative study was carried out through the case-study methodology, which made it possible to come into contact with several types of residential facilities for minors, in order to complete the general picture of the phenomenon of family Communities and to set up a first exploratory mapping. The research continues with a prospective longitudinal study aimed at monitoring and evaluating the networking of the community and the services, mainly observing the mobilization towards autonomy and empowerment of the subjects (minors) and of the networks anchored to the subject (single case study). The aim was to understand which relational modalities the actors of the coping network bring into play in relation to "social change".
Abstract:
The practice of remix is nowadays increasingly widespread, and an ever larger number of people now have the skills and the technological tools to perform operations once reserved to narrow niches. However, in its audiovisual form, remix has received little academic attention. This work explores the practice of remix understood both as a contemporary declination of a long-standing practice in the history of audiovisual production – namely the reuse of images – and as a characteristic form of the contemporary media landscape, a grassroots act of appropriation of mainstream content by users. The thesis is divided into two sections. In the first, the theoretical and historical-critical analysis is divided into two macro-areas: on the one hand, remix understood as a practice, an act of appropriation, a gesture of recycling, decontextualization and resemantization of media images that has run through the history of audiovisual media [first chapter]; on the other, remix culture, that is, the cultural and social context that informs the media environment within which the practice of remix has achieved, over the last decade, the widespread diffusion that characterizes it today [second chapter]. The second section, corresponding to the third chapter, provides a detailed overview of a case study, the practice of fan vidding. A form of remix practiced almost exclusively by women, vidding consists in creating fan videos by editing together images taken from films or television series, using a song as musical accompaniment. Vidders, using specific editing techniques, produce critical readings of the media products they appropriate, in order to comment on, criticize or celebrate the objects of their interest. Through vidding, this work investigates the tactics of reworking and rewriting the media imaginary through the reuse of images, with particular attention to remix as a gendered practice.
Abstract:
To better understand the problem that companies face in training people capable of managing innovation processes, in particular Open Innovation (OI), a multiple case study was carried out in 2021 on a non-formal OI education programme run by the consortium company ART-ER and addressed to PhD students of the universities of Emilia-Romagna. In the second phase of this training programme, four working tables were set up to respond to the OI challenges launched by the companies. Each working table involved 3-4 PhD students, two company representatives, a consultant and an ART-ER operator. The overall sample consisted of 14 PhD students, 8 company representatives from four companies, 4 members of a consulting firm and 4 operators of the consortium company ART-ER. The following research question guided the investigation: does the interaction between the subjects involved in each working table – considered as a single case – take the form of a Community of Practice capable of fostering the development of individual learning functional to managing the OI processes activated in the companies? Data were collected through desk research, focus groups, semi-structured interviews and an online semi-structured questionnaire. Data analysis was carried out by means of a multi-stage qualitative content analysis with the aid of the MAXQDA software. The results show that in three cases out of four the working tables took the form of a Community of Practice. In these three tables, moreover, the development of some areas of competence functional to the management of OI processes emerged. In the conclusion, some proposals for the redesign of future editions of the training programme are presented.
Abstract:
The femicide in Ciudad Juárez is a story of extreme violence against women for different reasons, by different actors, under different circumstances, and following different behavioural patterns. All of it happens within a frame of gender discrimination based on the idea that women are inferior, interchangeable and disposable according to the patriarchal hierarchy still present in Mexico, strongly reinforced by a sort of conspiracy of silence produced by the high impunity rate, the government's incompetence in solving the crimes, or the general indifference of the population. It is the story of hundreds of young women kidnapped, raped, in many cases tortured, and murdered on the border between Mexico and the United States. The murders first came to light in 1993, and to this day young women continue to "disappear" without any hope of bringing the perpetrators to justice, ending impunity, convicting the assassins, and bringing justice to the families of the deceased girls and women. The main questions about femicide in Ciudad Juárez seem to be: why were they brutally murdered? Why have most of the crimes not been solved yet? Why and how is Ciudad Juárez different from other border cities with the same characteristics? Which powers are behind those crimes in a city that employs mainly women as its labor force and has the lowest unemployment rate in the whole country? But there are also many other questions, dealing more with the context, the Juarenses' lifestyles, the possible hidden powers behind the crimes, the possible murderers' motives, the response of the local civil society, or the international community's actions to fight femicide there, among many other things, that are still waiting for an answer and that this paper will 'narrate' in order to provide a holistic panorama for the readers. But above all there is the need to remember that every single woman or girl assassinated there had a name, an identity, a family, a story to be told time after time and as many times as necessary, in order to avoid accepting these crimes merely as statistics, as cold numbers that might make us forget the human tragedy that has been scourging the city since 1993. We must remember as well that their deaths express gender oppression and the inequality of the relations between what is male and what is female: a manifestation of domination, terror, social extermination, patriarchal hegemony, social class and impunity. The city is the perfect mirror in which all the contradictions of globalization are reflected. It is there that all the evils of globalization are present and survive by sucking its women's blood. It is a city where concepts such as gender, migration and power are closely related, with a negative connotation.
Abstract:
Objectives: To evaluate the most effective modality for the functional rehabilitation of the "single strut" fibula free flap after wide resections for malignant neoplastic disease of the oral cavity. Methods: From a series of 62 microvascular reconstructions with fibula free flaps, 11 cases were selected for rehabilitation with implant-supported dental prostheses. 6 cases were treated without further surgical procedures apart from implant placement (group 1), compensating for the vertical deficit of the fibula through the dental prosthesis, while the remaining cases were treated with distraction osteogenesis (DO) of the fibula before prosthetic rehabilitation (group 2). The fibula/mandible vertical discrepancy was measured. The evaluation criteria included clinical and radiographic measurement of peri-implant bone and soft-tissue levels, and the patient's level of satisfaction assessed through a purpose-designed questionnaire. Results: All prosthetic rehabilitations consisted of dental prostheses screwed onto implants. The mean age was 52 years; the male/female ratio was 6/5. The mean number of implants placed in the fibulas was 5. The maximum follow-up period after masticatory loading was 30 months for group 1 and 38.5 months (17-81) on average for group 2. No surgical complications were recorded. No implants were removed in group 1 patients; the mean peri-implant bone loss recorded was 1.5 mm. In group 2, one case of lingual tipping of the distraction vector during the consolidation phase and one case of fracture of the basal cortical bone without new bone formation were reported. The mean vertical bone gain was 13.6 mm (12-15). 4 implants out of 32 (12.5%) were lost after the follow-up period. The mean peri-implant bone resorption was 2.5 mm. Conclusions: The most common solutions to overcome the vertical deficit of the fibula free flap are harvesting an iliac crest free flap, placing the fibula in the prosthetically ideal position at the expense of the basal bone profile, using the fibula flap in the version described as "double barrel", and distraction osteogenesis of the fibula. Our experience concerns the fibula free flap, which in malignant neoplastic disease we use in the "single strut" version, in order to keep the full length of the vascular pedicle available without the need for vein grafts. Both solutions, the dental prosthesis compensating for the vertical discrepancy and distraction osteogenesis followed by a prosthesis, both screwed onto implants, are satisfactory options for the functional rehabilitation of the fibula despite its vertical deficit. The first solution was inspired by the good results observed with dental prostheses on short implants, which have a comparable crown/root ratio; DO applied to the fibula, although it proved to be a technique with a higher number of complications and a greater degree of peri-implant bone resorption, is in any case a valid rehabilitation option, especially in cases of marked mandibulo-fibular discrepancy. The choice of the therapeutic path after careful evaluation of each single case is decisive. The selection criteria derived from our experience are illustrated.
Abstract:
In the post-genomic era, with the massive production of biological data, understanding the factors affecting protein stability is one of the most important and challenging tasks for highlighting the role of mutations in relation to human disease. The problem is at the basis of what is referred to as molecular medicine, with the underlying idea that pathologies can be detailed at a molecular level. To this purpose, scientific efforts focus on characterising mutations that hamper protein functions and thereby affect the biological processes underlying cell physiology. New techniques have been developed with the aim of detailing single nucleotide polymorphisms (SNPs) at large in all the human chromosomes, and as a result the information stored in dedicated databases is increasing exponentially. Mutations found at the DNA level, when occurring in transcribed regions, may lead to mutated proteins, and this can be a serious medical problem, largely affecting the phenotype. Bioinformatics tools are therefore urgently needed to cope with the flood of genomic data stored in databases and to analyse the role of SNPs at the protein level. Several experimental and theoretical observations suggest that protein stability in the solvent-protein space is responsible for correct protein functioning. Mutations found to be disease related during DNA analysis are therefore often assumed to perturb protein stability as well. However, so far no extensive analysis at the proteome level has investigated whether this is the case. Computational methods have also been developed to infer whether a mutation is disease related and, independently, whether it affects protein stability. Whether the perturbation of protein stability is related to what is routinely referred to as a disease therefore remains an open question. In this work we have tried for the first time to explore the relation between mutations at the protein level and their relevance to disease with a large-scale computational study of data from different databases. To this aim, in the first part of the thesis, for each mutation type we have derived two probabilistic indices (for 141 out of 150 possible SNPs): the perturbing index (Pp), which indicates the probability that a given mutation affects protein stability, considering all the available in vitro thermodynamic data, and the disease index (Pd), which indicates the probability of a mutation being disease related, given all the mutations that have been clinically associated so far. We find, with robust statistics, that the two indices correlate, with the exception of the mutations related to somatic cancer. In this way, each of the 150 mutation types can be coded by two values that allow a direct comparison with database information. Furthermore, we implement a computational method that, starting from the protein structure, predicts the effect of a mutation on protein stability, and we find that it outperforms a set of other predictors performing the same task. The predictor is based on support vector machines and takes protein tertiary structures as input. We show that the predicted data correlate well with the data from the databases. All our efforts therefore contribute to the SNP annotation process and, more importantly, establish the relationship between protein stability perturbation and the human variome, leading towards the diseasome.
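To make the two indices concrete, here is a minimal Python sketch (toy records and hypothetical field names, not the thesis' actual datasets or pipeline) estimating a perturbing index Pp and a disease index Pd as relative frequencies per mutation type and testing their correlation:

```python
# Hypothetical sketch: per-mutation-type indices analogous to Pp and Pd.
# The records below are illustrative, not real thermodynamic/clinical data.
from collections import defaultdict
from scipy.stats import pearsonr

# Each record: (wild-type residue, mutant residue, perturbs_stability, disease_related)
records = [
    ("A", "V", True, False),
    ("R", "W", True, True),
    ("G", "D", False, True),
    ("A", "V", False, False),
    # ... thousands of annotated mutations would go here
]

counts = defaultdict(lambda: {"n": 0, "perturbing": 0, "disease": 0})
for wt, mut, perturbs, disease in records:
    key = (wt, mut)
    counts[key]["n"] += 1
    counts[key]["perturbing"] += int(perturbs)
    counts[key]["disease"] += int(disease)

# Pp: fraction of thermodynamically perturbing occurrences of the mutation type.
# Pd: fraction of disease-associated occurrences of the mutation type.
Pp = {k: v["perturbing"] / v["n"] for k, v in counts.items()}
Pd = {k: v["disease"] / v["n"] for k, v in counts.items()}

keys = sorted(Pp)
r, p_value = pearsonr([Pp[k] for k in keys], [Pd[k] for k in keys])
print(f"Correlation between Pp and Pd over {len(keys)} mutation types: r={r:.2f} (p={p_value:.3g})")
```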
Abstract:
This research has focused on the study of the behavior and collapse of masonry arch bridges. The latest decades have seen an increasing interest in this structural type, which is still present and in use despite the passage of time and the changes in means of transport. Several strategies have been developed over time to simulate the response of this type of structure, although even today there is no generally accepted standard for the assessment of masonry arch bridges. The aim of this thesis is to compare the principal analytical and numerical methods existing in the literature on case studies, trying to highlight strengths and weaknesses. The methods examined are mainly three: i) the Thrust Line Analysis Method; ii) the Mechanism Method; iii) the Finite Element Method. The Thrust Line Analysis Method and the Mechanism Method are analytical methods derived from two of the fundamental theorems of plastic analysis, while the Finite Element Method is a numerical method that uses different discretization strategies to analyze the structure. Every method is applied to the case studies through computer-based representations that allow a user-friendly application of the principles explained. A particular closed-form approach based on an elasto-plastic material model and developed by some Belgian researchers is also studied. To compare the three methods, two different case studies have been analyzed: i) a generic single-span masonry arch bridge; ii) a real masonry arch bridge, the Clemente Bridge, built on the Savio River in Cesena. In the analyses performed, all the models are two-dimensional in order to obtain results comparable between the different methods examined. The different methods have been compared with each other in terms of collapse load and hinge positions.
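As a rough illustration of the Thrust Line Analysis Method, the sketch below (toy geometry, loads and trial thrust, assumed purely for demonstration and unrelated to the bridges analysed in the thesis) builds the line of thrust of a discretized arch and checks, in the spirit of the safe theorem of plastic analysis, whether it stays within the ring thickness:

```python
# Minimal sketch of a thrust-line (safe-theorem) check for a 2D arch
# discretized into voussoirs. All numbers are illustrative assumptions.
import numpy as np

# Joint abscissae and centreline ordinates of a 10 m span arch (m)
x = np.linspace(0.0, 10.0, 21)
y_centre = 2.0 * np.sin(np.pi * x / 10.0)
thickness = 0.6                               # arch ring thickness (m)

# Self-weight lumped at the mid-point of each voussoir (kN)
x_mid = 0.5 * (x[:-1] + x[1:])
W = np.full(x_mid.size, 12.0)

# Trial reactions at the left springing: vertical V and horizontal thrust H (kN).
# In practice H (and the reaction point) is varied until an admissible line is found.
V, H = W.sum() / 2.0, 150.0

def thrust_line(xs):
    """Line of thrust: y(x) = y_A + [V*x - sum_i W_i*(x - x_i)] / H, loads left of x."""
    ys = []
    for xi in xs:
        left = x_mid < xi
        moment = V * xi - np.sum(W[left] * (xi - x_mid[left]))
        ys.append(y_centre[0] + moment / H)
    return np.array(ys)

eccentricity = thrust_line(x) - y_centre
admissible = np.all(np.abs(eccentricity) <= thickness / 2.0)
print("max |eccentricity| = %.3f m -> %s" % (
    np.abs(eccentricity).max(),
    "thrust line inside the ring (no collapse under these loads)"
    if admissible else "thrust line outside the ring"))
```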
Abstract:
Background: Clinical trials have demonstrated that selected secondary prevention medications for patients after acute myocardial infarction (AMI) reduce mortality. Yet, these medications are generally underprescribed in daily practice, and older people are often absent from drug trials. Objectives: To examine the relationship between adherence to evidence-based (EB) drugs and post-AMI mortality, focusing on the effects of single therapy and polytherapy in very old patients (≥80 years) compared with elderly and adults (<80 years). Methods: Patients hospitalised for AMI between 01/01/2008 and 30/06/2011 and resident in the Local Health Authority of Bologna were followed up until 31/12/2011. Medication adherence was calculated as the proportion of days covered for filled prescriptions of angiotensin-converting enzyme inhibitors (ACEIs)/angiotensin receptor blockers (ARBs), β-blockers, antiplatelet drugs, and statins. We adopted a risk set sampling method, and the adjusted relationship between medication adherence (PDC≥75%) and mortality was investigated using conditional multiple logistic regression. Results: The study population comprised 4861 patients. During a median follow-up of 2.8 years, 1116 deaths (23.0%) were observed. Adherence to the 4 EB drugs was 7.1%, while nonadherence to any of the drugs was 19.7%. For both patients aged ≥80 years and those aged <80 years, rate ratios of death linearly decreased as the number of EB drugs taken increased. There was a significant inverse relationship between adherence to each of 4 medications and mortality, although its magnitude was higher for ACEIs/ARBs (adj. rate ratio=0.60, 95%CI=0.52–0.69) and statins (0.60, 0.50–0.72), and lower for β-blockers (0.75, 0.61–0.92) and antiplatelet drugs (0.73, 0.63–0.84). Conclusions: The beneficial effect of EB polytherapy on long-term mortality following AMI is evident also in nontrial older populations. Given that adherence to combination therapies is largely suboptimal, the implementation of strategies and initiatives to increase the use of post-AMI secondary preventive medications in old patients is crucial.
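For readers unfamiliar with the adherence metric, a minimal proportion-of-days-covered (PDC) computation is sketched below (toy fill dates and days' supply, assumed for illustration only; the study's exact handling of overlapping fills, hospital stays and censoring is not reproduced):

```python
# Minimal sketch of a proportion-of-days-covered (PDC) calculation for one
# drug class over a follow-up window. Dates and supplies are toy values.
from datetime import date, timedelta

follow_up_start = date(2008, 3, 1)
follow_up_end = date(2008, 12, 31)

# (fill date, days' supply) for, e.g., a statin
fills = [(date(2008, 3, 5), 30), (date(2008, 4, 20), 30), (date(2008, 6, 1), 90)]

covered = set()
for fill_date, days_supply in fills:
    for d in range(days_supply):
        day = fill_date + timedelta(days=d)
        if follow_up_start <= day <= follow_up_end:
            covered.add(day)          # overlapping fills are not double-counted

observation_days = (follow_up_end - follow_up_start).days + 1
pdc = len(covered) / observation_days
print(f"PDC = {pdc:.2%} -> adherent (PDC >= 75%): {pdc >= 0.75}")
```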
Abstract:
Depth represents a crucial piece of information in many practical applications, such as obstacle avoidance and environment mapping. This information can be provided either by active sensors, such as LiDARs, or by passive devices like cameras. A popular passive device is the binocular rig, which allows the depth of the scene to be triangulated through two synchronized and aligned cameras. However, many devices already available in existing infrastructures are monocular passive sensors, such as most surveillance cameras. The intrinsic ambiguity of the problem makes monocular depth estimation a challenging task. Nevertheless, the recent progress of deep learning strategies is paving the way towards a new class of algorithms able to handle this complexity. This work addresses many relevant topics related to the monocular depth estimation problem. It presents networks capable of predicting accurate depth values even on embedded devices and without the need for expensive ground-truth labels at training time. Moreover, it introduces strategies to estimate the uncertainty of these models, and it shows that monocular networks can easily generate training labels for different tasks at scale. Finally, it evaluates off-the-shelf monocular depth predictors for the relevant use case of social distance monitoring, and shows how this technology makes it possible to overcome the limitations of existing strategies.
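As an illustration of the social distance monitoring use case, the sketch below (hypothetical camera intrinsics, detections and predicted depths, not taken from the thesis) back-projects two detected people into 3D with the pinhole model and measures their Euclidean distance:

```python
# Hypothetical sketch: inter-person distance from a single image, given
# per-pixel depth from a monocular network and camera intrinsics.
import numpy as np

fx, fy, cx, cy = 700.0, 700.0, 320.0, 240.0   # pinhole intrinsics (pixels)

def backproject(u, v, depth):
    """Lift a pixel (u, v) with metric depth to a 3D point in the camera frame."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

# Pixel locations of two detected people (e.g. bounding-box feet midpoints)
# and the corresponding predicted depths in metres.
p1 = backproject(250, 300, depth=4.2)
p2 = backproject(420, 310, depth=5.0)

distance = np.linalg.norm(p1 - p2)
print(f"Estimated inter-person distance: {distance:.2f} m "
      f"({'OK' if distance >= 1.0 else 'too close'})")
```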
Abstract:
The topic of seismic loss assessment not only incorporates many aspects of earthquake engineering, but also entails social factors, public policies and business interests. Because of its multidisciplinary character, this process may be complex to tackle and may sound discouraging to newcomers. In this context, there is an increasing need for simplified methodologies that streamline the process and provide tools for decision-makers and practitioners. This dissertation investigates different possible applications, both in the modelling of seismic losses and in the analysis of observational seismic data. Regarding the first topic, the PRESSAFE-disp method is proposed for the fast evaluation of the fragility curves of precast reinforced-concrete (RC) structures. A direct application of the method to the industrial area of San Felice is then studied to assess the number of collapses under a specific seismic scenario. In particular, with reference to the 2012 events, two large-scale stochastic models are outlined. The outcomes of the framework are promising and in good agreement with the observed damage scenario. Furthermore, a simplified displacement-based methodology is outlined to estimate different loss performance metrics for the decision-making phase of the seismic retrofit of a single RC building. The aim is to evaluate the seismic performance of different retrofit options, for a comparative analysis of their effectiveness and convenience. Finally, a contribution to the analysis of observational data is presented in the last part of the dissertation. A specific database of losses of precast RC buildings damaged by the 2012 earthquake is created. A statistical analysis is performed, allowing several consequence functions to be derived. The outcomes presented may be implemented in probabilistic seismic risk assessments to forecast losses at the large scale. Furthermore, they may be adopted to establish retrofit policies to prevent and reduce the consequences of future earthquakes in industrial areas.
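For context on what deriving a fragility curve involves, the following sketch (toy data and a generic lognormal model, not the PRESSAFE-disp formulation) fits a fragility function to binary damage observations by maximum likelihood:

```python
# Minimal sketch of fitting a lognormal fragility curve,
# P(damage | IM) = Phi(ln(IM / theta) / beta), to binary damage observations.
# Intensity measures and outcomes are toy values, not thesis results.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

im = np.array([0.10, 0.15, 0.20, 0.25, 0.30, 0.40, 0.50, 0.60])   # e.g. PGA (g)
damaged = np.array([0, 0, 1, 0, 1, 1, 1, 1])                      # observed outcomes

def neg_log_likelihood(params):
    ln_theta, ln_beta = params
    theta, beta = np.exp(ln_theta), np.exp(ln_beta)
    p = norm.cdf(np.log(im / theta) / beta)
    p = np.clip(p, 1e-9, 1 - 1e-9)                                 # numerical safety
    return -np.sum(damaged * np.log(p) + (1 - damaged) * np.log(1 - p))

res = minimize(neg_log_likelihood, x0=[np.log(0.3), np.log(0.5)], method="Nelder-Mead")
theta_hat, beta_hat = np.exp(res.x)
print(f"median capacity theta = {theta_hat:.2f} g, dispersion beta = {beta_hat:.2f}")
```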
Abstract:
This thesis studies the optimal control problem of single-arm and dual-arm serial robots to achieve the time-optimal handling of liquids and objects. The first topic deals with the planning of time-optimal anti-sloshing trajectories for an industrial robot carrying a cylindrical container filled with a liquid, considering 1-dimensional and 2-dimensional planar motions. A technique for the estimation of the sloshing height is presented, together with its extension to 3-dimensional motions. An experimental validation campaign is provided and discussed to assess the soundness of this technique. As far as anti-sloshing trajectories are concerned, 2-dimensional paths are considered and, for each of them, three constrained optimizations with different sloshing-height thresholds are solved. Experimental results are presented to compare optimized and non-optimized motions. The second part focuses on time-optimal trajectory planning for dual-arm object handling, employing two collaborative robots (cobots) and adopting an admittance-control strategy. The chosen manipulation approach, known as cooperative grasping, is based on unilateral contact between the cobots and the object, and it may lead to slipping during motion if an internal prestress along the contact-normal direction is not prescribed. Thus, a virtual penetration is considered, aimed at generating the necessary internal prestress. The stability of cooperative grasping is ensured as long as the forces exerted on the object remain inside the static-friction cone. Constrained-optimization problems are solved for 3-dimensional paths: the virtual penetration is chosen among the control inputs of the problem, and friction-cone conditions are treated as inequality constraints. Also in this case, experiments are presented in order to provide evidence of the firm handling of the object, even for fast motions.
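To make the friction-cone condition concrete, a minimal sketch follows (illustrative contact normals, forces and friction coefficient, not the thesis' actual controller) that checks whether each contact force stays inside its static-friction cone:

```python
# Hypothetical sketch of the static-friction-cone condition for a cooperative
# (unilateral) grasp: the tangential component of each contact force must not
# exceed mu times the normal (pressing) component.
import numpy as np

mu = 0.5                                   # static friction coefficient (assumed)

def inside_friction_cone(force, normal, mu):
    """Return True if 'force' at a contact with outward 'normal' lies inside the cone."""
    n = normal / np.linalg.norm(normal)
    f_n = np.dot(force, n)                 # normal (pressing) component
    f_t = force - f_n * n                  # tangential component
    return f_n > 0.0 and np.linalg.norm(f_t) <= mu * f_n

# Two cobot end-effectors pressing on opposite faces of the object;
# the internal prestress raises the normal components.
contacts = [
    {"normal": np.array([1.0, 0.0, 0.0]), "force": np.array([20.0, 3.0, 6.0])},
    {"normal": np.array([-1.0, 0.0, 0.0]), "force": np.array([-20.0, -2.0, 7.0])},
]

for i, c in enumerate(contacts, start=1):
    ok = inside_friction_cone(c["force"], c["normal"], mu)
    print(f"contact {i}: {'inside' if ok else 'outside'} the friction cone")
```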