74 results for Leto, Giulio Pomponio, 1428-1497.
Abstract:
Chromatography is the most widely used technique for high-resolution separation and analysis of proteins. It is particularly useful for the purification of delicate compounds, e.g. pharmaceuticals, because it is usually performed under milder conditions than the separation processes typically used in the chemical industry. This thesis focuses on affinity chromatography. Chromatographic processes are traditionally performed in columns packed with porous resin. However, these supports have several limitations, including high pressure drop through the packed bed and the dependence on intra-particle diffusion, a slow mass-transfer mechanism, for the transport of solute molecules to the binding sites within the pores. These limitations can be overcome by using chromatographic supports such as membranes or monoliths. Dye ligands are considered important alternatives to natural ligands. Several reactive dyes, particularly Cibacron Blue F3GA, are used as affinity ligands for protein purification. Cibacron Blue F3GA is a triazine dye that interacts specifically and reversibly with albumin. The aim of this study is to prepare dye-affinity membranes and monoliths for the efficient removal of albumin and to compare three different affinity supports: membranes, monoliths, and the commercial HiTrap™ Blue HP column produced by GE Healthcare. The three supports were compared in terms of binding capacity at saturation (DBC100%) and dynamic binding capacity at 10% breakthrough (DBC10%) using solutions of pure BSA. The results show that the CB-RC membranes and CB-Epoxy monoliths are comparable to the commercial support, the HiTrap™ Blue HP column, for the separation of albumin. These results encourage further characterization of the new supports examined.
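As a rough illustration of how the two capacities compare, the sketch below computes DBC10% and DBC100% from a breakthrough curve by integrating the bound protein up to the chosen breakthrough threshold. The sigmoidal curve, feed concentration, and support volume are hypothetical placeholders, not data from the thesis.

```python
import numpy as np

def dynamic_binding_capacity(volume_ml, c_ratio, feed_mg_ml, support_ml, threshold=0.10):
    """Binding capacity (mg protein per mL of support) at a given breakthrough
    threshold, from a breakthrough curve C/C0 vs. loaded volume.
    Bound protein up to breakthrough = integral of C0*(1 - C/C0) dV."""
    volume_ml = np.asarray(volume_ml, dtype=float)
    c_ratio = np.asarray(c_ratio, dtype=float)
    v_bt = np.interp(threshold, c_ratio, volume_ml)   # loaded volume at breakthrough
    m = volume_ml <= v_bt
    f = 1.0 - c_ratio[m]                              # fraction captured by the support
    # trapezoidal integration of the bound fraction up to breakthrough
    bound_mg = feed_mg_ml * np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(volume_ml[m]))
    return bound_mg / support_ml

# Hypothetical sigmoidal breakthrough curve for a pure BSA feed
v = np.linspace(0.0, 20.0, 201)                       # mL loaded
c = 1.0 / (1.0 + np.exp(-(v - 10.0)))                 # outlet C/C0
dbc10 = dynamic_binding_capacity(v, c, feed_mg_ml=1.0, support_ml=0.5, threshold=0.10)
dbc100 = dynamic_binding_capacity(v, c, feed_mg_ml=1.0, support_ml=0.5, threshold=0.99)  # ~saturation
```

Since DBC100% integrates up to (near) saturation, it is always at least as large as DBC10%; the gap between the two reflects how sharp the breakthrough front is.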
Abstract:
This thesis collects the outcomes of a Ph.D. course in Telecommunications Engineering and focuses on enabling techniques for Spread Spectrum (SS) navigation and communication satellite systems. It provides innovations in both interference management and code synchronization, two aspects that are critical for modern navigation and communication systems and that constitute the common denominator of the work. The thesis is organized in two parts. The first deals with interference management. We propose a novel technique for enhancing the sensitivity of an advanced interference detection and localization system operating in the Global Navigation Satellite System (GNSS) bands, which allows the identification of interfering signals received with power even lower than that of the GNSS signals. Moreover, we introduce an effective cancellation technique for signals transmitted by jammers that exploits their repetitive characteristics and strongly reduces the interference level at the receiver. The second part deals with code synchronization. In more detail, we design the code synchronization circuit for a Telemetry, Tracking and Control system operating during the Launch and Early Orbit Phase; the proposed solution copes with the very large frequency uncertainty and dynamics characterizing this scenario, and estimates the code epoch, the carrier frequency, and the carrier frequency variation rate. Furthermore, considering a generic pair of circuits performing code acquisition, we propose a comprehensive framework for the design and analysis of the optimal cooperation procedure, which minimizes the time required to accomplish synchronization. The study is particularly interesting because it enables a reduction of the code acquisition time without increasing the computational complexity.
Finally, considering a network of collaborating navigation receivers, we propose an innovative cooperative code acquisition scheme that exploits the code epoch information shared between neighboring nodes, according to the Peer-to-Peer paradigm.
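The code epoch estimation at the heart of these synchronization schemes can be sketched as a correlation search over all code delays. The sketch below uses a random ±1 sequence as a stand-in for a real PRN spreading code and an FFT-based parallel search over delays; it is illustrative only and omits the Doppler/frequency dimension of a real acquisition stage.

```python
import numpy as np

rng = np.random.default_rng(0)

def acquire_code_epoch(received, code):
    """Estimate the code epoch (chip delay) of a spreading code by circular
    correlation against the local replica, searching all delays in parallel
    via the FFT. Returns the delay estimate and the correlation metric."""
    corr = np.fft.ifft(np.fft.fft(received) * np.conj(np.fft.fft(code)))
    metric = np.abs(corr)                      # noncoherent decision metric
    return int(np.argmax(metric)), metric

# Illustrative +-1 spreading code and a noisy, delayed copy of it
code = rng.choice([-1.0, 1.0], size=1023)      # stand-in for a PRN code
true_delay = 345
rx = np.roll(code, true_delay) + 0.5 * rng.standard_normal(1023)
delay_hat, metric = acquire_code_epoch(rx, code)
```

A serial-search circuit would test these same delay hypotheses one at a time; the cooperation procedures discussed above are about dividing this search space between circuits or receivers.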
Abstract:
Some eye diseases, such as diabetic retinopathy, macular pucker, and retinal detachment, can be treated with vitrectomy surgery. The risks associated with this surgery could be overcome by enzymatic vitrectomy with plasmin, used together with or instead of conventional vitrectomy. Moreover, the use of autologous plasmin would avoid rejection problems. Plasmin is obtained by activating plasminogen with enzymes such as tissue plasminogen activator (tPA) and urokinase (uPA). Plasminogen is normally purified from blood by affinity chromatography with resins. However, affinity membranes are an ideal support for this application, since they can be easily packed before surgery, enabling a disposable device that provides a fast and inexpensive process. The aim of this work is the preparation of affinity membranes for plasminogen purification using L-lysine as the affinity ligand. For this purpose, epoxy-activated regenerated cellulose membranes were used and modified with two different protocols for L-lysine immobilization. The ligand density was measured by a colorimetric assay using Acid Orange 7 as the indicator. The immobilization yield was studied as a function of reaction time and L-lysine concentration. The optimized membranes were characterized in dynamic experiments with bovine and human serum, and the results were compared with those obtained in parallel experiments with a commercial L-lysine affinity resin. During the serum experiments, the fractions from each chromatographic step were collected and analyzed by HPLC and SDS-PAGE electrophoresis. In particular, the electrophoresis of the eluted samples shows a well-defined plasminogen band, indicating that L-lysine affinity membranes are suitable for plasminogen purification. Moreover, the membranes showed higher productivity than the commercial reference resin.
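A minimal sketch of how a colorimetric ligand-density measurement of this kind can be evaluated, assuming the desorbed dye obeys the Beer-Lambert law and binds the immobilized ligand 1:1. The molar absorptivity and all numerical inputs are illustrative placeholders, not values from the work.

```python
def ligand_density(absorbance, eluate_volume_ml, membrane_volume_ml,
                   molar_abs_l_mol_cm=18_000.0, path_cm=1.0):
    """Ligand density (umol per mL of membrane) from a dye-binding assay:
    dye desorbed from the membrane is quantified by Beer-Lambert
    (A = eps * c * l) and assumed to have bound the ligand 1:1.
    The molar absorptivity default is an illustrative placeholder."""
    conc_mol_l = absorbance / (molar_abs_l_mol_cm * path_cm)   # Beer-Lambert
    dye_umol = conc_mol_l * 1e6 * (eluate_volume_ml / 1000.0)  # mol/L -> umol in eluate
    return dye_umol / membrane_volume_ml

# Hypothetical reading: A = 0.45 in 5 mL of eluate from 0.25 mL of membrane
density = ligand_density(absorbance=0.45, eluate_volume_ml=5.0, membrane_volume_ml=0.25)
```

Plotting this density against reaction time or L-lysine concentration is how an immobilization-yield study like the one described would typically be summarized.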
Abstract:
The present doctoral thesis is structured as a collection of three essays. The first essay, “SOC(HE)-Italy: a classification for graduate occupations”, presents the conceptual basis, the construction, the validation, and the application to the Italian labour force of the occupational classification termed SOC(HE)-Italy. I developed this classification under the supervision of Kate Purcell during my period as a visiting research student at the Warwick Institute for Employment Research. The classification links the constituent tasks and duties of a particular job to the relevant knowledge and skills imparted via Higher Education (HE). It is based on SOC(HE)2010, an occupational classification first proposed by Kate Purcell in 2013, but is constructed differently. In the second essay, “Assessing the incidence and wage effects of overeducation among Italian graduates using a new measure for educational requirements”, I use this classification to build a valid and reliable measure of job requirements. The lack of an unbiased measure of this dimension is one of the major obstacles to a generally accepted measurement of overeducation. Estimates of overeducation incidence and wage effects are run on AlmaLaurea data from the survey on graduate career paths. I wrote this essay and obtained these estimates benefiting from the help and guidance of Giovanni Guidetti and Giulio Pedrini. The third and last essay, “Overeducation in the Italian labour market: clarifying the concepts and addressing the measurement error problem”, addresses a number of theoretical issues concerning the concepts of educational mismatch and overeducation. Using Istat data from the RCFL survey, I estimate the ORU model for the whole Italian labour force. To my knowledge, this is the first time such a model has been estimated on this population. In addition, I adopt the new measure of overeducation based on the SOC(HE)-Italy classification.
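The ORU (over-, required-, under-education) wage equation mentioned above can be sketched as an OLS regression of log wages on the years of schooling required by the job and on years of over- and undereducation. The data below are simulated with illustrative coefficients; nothing here reproduces the thesis estimates.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000

# Simulated years of schooling: required by the job vs. attained by the worker
required = rng.integers(8, 19, n).astype(float)
attained = required + rng.integers(-3, 4, n)
over = np.maximum(attained - required, 0.0)     # years of overeducation
under = np.maximum(required - attained, 0.0)    # years of undereducation

# ORU (Duncan-Hoffman style) wage equation with illustrative coefficients:
# returns to required years exceed returns to surplus years
log_wage = (1.0 + 0.08 * required + 0.04 * over - 0.05 * under
            + 0.1 * rng.standard_normal(n))

# OLS recovers the three returns separately
X = np.column_stack([np.ones(n), required, over, under])
beta, *_ = np.linalg.lstsq(X, log_wage, rcond=None)
```

The typical ORU finding, built into this simulation, is that a surplus year of education earns less than a required year, which is why an unbiased measure of job requirements matters so much for the estimates.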
Abstract:
During recent decades, economists' interest in gender-related issues has risen. Researchers aim to show how economic theory can be applied to gender-related topics such as peer effects, labor market outcomes, and education. This dissertation aims to contribute to our understanding of the interaction, inequality, and sources of differences across genders, and it consists of three empirical papers in the research area of gender economics. The aim of the first paper ("Separating gender composition effect from peer effects in education") is to demonstrate the importance of accounting for endogenous peer effects in order to identify the gender composition effect. This is illustrated analytically using Manski's (1993) linear-in-means model. The paper derives an innovative solution to the simultaneous identification of endogenous and exogenous peer effects: the gender composition effect of interest is estimated from auxiliary reduced-form estimates after the endogenous peer effect has been identified using Graham's (2008) variance restriction method. The paper applies this methodology to two different data sets from American and Italian schools. The motivation of the second paper ("Gender differences in vulnerability to an economic crisis") is to analyze the differential effect of the recent economic crisis on the labor market outcomes of men and women. Using a triple-differences method (before-after crisis, harder-milder hit sectors, men-women) applied to British data at the occupation level, the paper shows that men suffer more than women in terms of the probability of losing their job. Several explanations for the findings are proposed. The third paper ("Gender gap in educational outcome") is concerned with a controversial academic debate on the existence, degree, and origin of the gender gap in test scores. The existence of a gap both in mean scores and in the variability around the mean is documented and analyzed. The origins of the gap are investigated by looking at a wide range of possible explanations.
Abstract:
The Curry-Howard isomorphism is the idea that proofs in natural deduction can be put in correspondence with lambda terms in such a way that the correspondence is preserved by normalization. The concept can be extended from Intuitionistic Logic to other systems, such as Linear Logic. One of the nice consequences of this isomorphism is that we can reason about functional programs with formal tools that are typical of proof systems; such analysis can also cover quantitative properties of programs, such as the number of steps a program takes to terminate. Another is the possibility of describing the execution of these programs in terms of abstract machines. In 1990 Griffin proved that the correspondence can be extended to Classical Logic and control operators; that is, Classical Logic adds the possibility of manipulating continuations. In this thesis we show how the ideas described above work in this larger context.
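A tiny executable illustration of the correspondence: lambda terms encoded as nested tuples, with normal-order beta reduction playing the role of proof normalization. The composition combinator is the proof term for (A→B)→(B→C)→(A→C); substitution is simplified by using distinct variable names throughout, so this is a sketch rather than a full capture-avoiding implementation.

```python
# Lambda terms as nested tuples: ('var', name), ('lam', name, body), ('app', f, a)

def subst(term, name, value):
    """Substitute value for the free variable `name`. Capture avoidance is
    elided: the example terms below use distinct variable names throughout."""
    kind = term[0]
    if kind == 'var':
        return value if term[1] == name else term
    if kind == 'lam':
        return term if term[1] == name else ('lam', term[1], subst(term[2], name, value))
    return ('app', subst(term[1], name, value), subst(term[2], name, value))

def normalize(term):
    """Repeatedly contract the leftmost-outermost beta redex (normal order).
    Under Curry-Howard this is exactly proof normalization."""
    if term[0] == 'app':
        f = normalize(term[1])
        if f[0] == 'lam':
            return normalize(subst(f[2], f[1], term[2]))
        return ('app', f, normalize(term[2]))
    if term[0] == 'lam':
        return ('lam', term[1], normalize(term[2]))
    return term

# \f.\g.\x. g (f x): the proof term for (A->B) -> (B->C) -> (A->C)
compose = ('lam', 'f', ('lam', 'g', ('lam', 'x',
           ('app', ('var', 'g'), ('app', ('var', 'f'), ('var', 'x'))))))
identity = ('lam', 'y', ('var', 'y'))
# Composing two identity proofs normalizes back to the identity proof
result = normalize(('app', ('app', compose, identity), ('lam', 'z', ('var', 'z'))))
```

Counting the redexes contracted during `normalize` is the kind of quantitative analysis the isomorphism makes available; Griffin's extension enriches this term language with control operators.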
Abstract:
Food security has become an important issue in the international debate, particularly during the latest economic crisis. It is a relevant issue also for the Mediterranean Countries (MCs), particularly those of the southern shore, as they are facing complex economic and social changes. On the one hand, there is the need to satisfy the increasing and changing food demand of a growing population; on the other hand, it is important to promote economic growth and adjust agricultural production to food demand in a sustainable perspective. The assessment of food security conditions is a challenging task due to the multi-dimensional nature and complexity of the matter. Many papers in the scientific literature focus on the nutritional aspects of food security, while its economic issues have been addressed less frequently and only in recent times. Thus, the main objective of the research is to assess food (in)security conditions in the MCs. The study intends to identify and implement appropriate theoretical concepts and methodological tools for the assessment of food security, with a particular emphasis on its economic dimension within the MCs. The study follows a composite methodological approach: starting from the identification and selection of a number of relevant variables, a refined set of indicators is obtained by means of a two-step Principal Component Analysis applied to 90 countries, and the PCA findings are studied with particular attention to the food security situation of the MCs. The results show that the MCs have a higher level of economic development than low-income countries; however, the economic and social disparities of the area reveal vulnerability to food (in)security, due to dependency on food imports, lack of infrastructure and agricultural investment, climate conditions, and political instability and inefficiency. In conclusion, the main policy implications of food (in)security conditions in the MCs are discussed.
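A two-step Principal Component Analysis of the kind described can be sketched as follows: a first PCA reduces each thematic block of indicators to a few component scores, and a second PCA over the retained scores yields the refined composite. The indicator blocks below are random placeholders and their names are hypothetical; only the 90-country dimension comes from the abstract.

```python
import numpy as np

rng = np.random.default_rng(3)

def pca_scores(X, k):
    """First k principal-component scores of a standardized data matrix."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)      # standardize each indicator
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    return Z @ Vt[:k].T                            # project onto leading components

# Hypothetical country-level indicator blocks (90 countries, as in the study)
n = 90
food_availability = rng.standard_normal((n, 6))   # e.g. supply-side indicators
economic_access = rng.standard_normal((n, 5))     # e.g. income/price indicators

# Step 1: reduce each thematic block to its leading component scores
step1 = np.column_stack([pca_scores(food_availability, 2),
                         pca_scores(economic_access, 2)])
# Step 2: a second PCA over the retained scores gives the refined composite
composite = pca_scores(step1, 1)
```

The two-step structure keeps each thematic dimension from being swamped by blocks with many correlated indicators, which is the usual rationale for nesting the analyses.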
Abstract:
The first part of this work deals with the solution of the inverse problem in the field of X-ray spectroscopy. An original strategy to solve the inverse problem using the maximum entropy principle is illustrated, and the code UMESTRAT is built to apply this strategy in a semi-automatic way. The application of UMESTRAT is shown with a computational example. The second part of this work deals with the improvement of the X-ray Boltzmann model by studying two radiative interactions neglected in current photon models. First, the characteristic line emission due to Compton ionization is studied. A strategy is developed that allows the evaluation of this contribution for the K, L, and M shells of all elements with Z from 11 to 92. The single-shell Compton/photoelectric ratio is evaluated as a function of the primary photon energy, and the energies at which the Compton interaction becomes the prevailing ionization process for the considered shells are derived. Finally, a new kernel for XRF from Compton ionization is introduced. Second, the bremsstrahlung radiative contribution due to secondary electrons is characterized in terms of space, angle, and energy, for all elements with Z = 1-92 in the energy range 1-150 keV, using the Monte Carlo code PENELOPE. It is demonstrated that the bremsstrahlung contribution can be well approximated by an isotropic point photon source. A data library comprising the energy distributions of bremsstrahlung is created, and a new bremsstrahlung kernel is developed which allows the introduction of this contribution into the modified Boltzmann equation. An example of application to the simulation of a synchrotron experiment is shown.
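A maximum-entropy inversion of the general kind described can be sketched as minimizing a least-squares data term plus an entropy penalty that keeps the solution positive and close to a prior. The forward operator, prior, and plain gradient-descent solver below are illustrative; UMESTRAT's actual strategy is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(4)

def maxent_inverse(A, b, prior, alpha=0.01, lr=0.2, iters=3000):
    """Sketch of a maximum-entropy inversion: minimize
    0.5*||A@x - b||**2 + alpha * sum(x * (log(x/prior) - 1)).
    The entropy term keeps the solution positive and close to the prior.
    Plain projected gradient descent; a production solver would do better."""
    x = prior.copy()
    for _ in range(iters):
        grad = A.T @ (A @ x - b) + alpha * np.log(x / prior)
        x = np.clip(x - lr * grad, 1e-12, None)   # keep the spectrum positive
    return x

# Illustrative smoothing forward operator (rows normalized) and a
# nonnegative "spectrum" to recover from blurred, noisy measurements
n = 40
i = np.arange(n)
A = np.exp(-0.5 * ((i[:, None] - i[None, :]) / 3.0) ** 2)
A /= A.sum(axis=1, keepdims=True)
x_true = np.exp(-0.5 * ((i - 15.0) / 4.0) ** 2)
b = A @ x_true + 0.01 * rng.standard_normal(n)

prior = np.full(n, 0.5)
x_hat = maxent_inverse(A, b, prior)
```

The entropy term is what distinguishes this from plain least squares: it enforces positivity, which is physically required of a spectrum, and regularizes the ill-posed deconvolution.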
Abstract:
Based on the retrieval of a large body of journalistic texts (chronicles, interviews, elzeviri, and third-page articles) devoted to dance practices and published in Italy during the two decades of the Fascist regime, the thesis reconstructs the outlines of what, although still embryonic and certainly not specialist, can nonetheless be considered a sort of early twentieth-century "Italian thinking" on dance. Starting from a systematic survey of numerous daily and periodical publications, and thus from the construction of a substantial corpus of primary sources, the retrieved texts were analyzed through a methodological approach that, while fundamentally historiographic, also draws on some interpretive tools developed in semiotics (with particular reference to the theories of Jurij Lotman and Umberto Eco). The aim is to grasp, within the extreme formal and thematic variety of the documentary material, some underlying cultural dynamics through which to outline, on the one hand, the range of dance genres actually practiced on Italian stages during the regime and, on the other, the set of ideas, opinions, and tastes orbiting around them. The result is an essentially three-part treatment in which, after setting out the methodological issues, the thesis first examines the three main dance genres that the press of the Fascist period considered characteristic of the international dance scene (here termed "theatrical ballet", "Russian ballet", and "free dances") and then presents an in-depth study of three singular intellectual figures who, each with an extremely personal attitude, devoted special attention to dance: Anton Giulio Bragaglia, Paolo Fabbri, and Marco Ramperti. A wide critical anthology completes the work, retracing its main themes.
Abstract:
Until a few years ago, 3D modelling was a topic confined to a professional environment. Nowadays, technological innovations, most notably the 3D printer, have attracted novice users to this application field. This sudden breakthrough has not been supported by adequate software solutions: the 3D editing tools currently available do not assist the non-expert user during the various stages of generation, interaction, and manipulation of 3D virtual models. This is mainly due to the current paradigm, which is largely built on two-dimensional input/output devices and strongly affected by the resulting geometrical constraints. We identified three main phases that characterize the creation and management of 3D virtual models, and we investigated these directions by evaluating and simplifying the classic editing techniques in order to propose more natural and intuitive tools in a pure 3D modelling environment. In particular, we focused on freehand sketch-based modelling to create 3D virtual models, on interaction and navigation in a 3D modelling environment, and on advanced editing tools for free-form deformation and object composition. To pursue these goals, we asked how new gesture-based interaction technologies can be successfully employed in a 3D modelling environment, how we could improve depth perception and interaction in 3D environments, and which operations could be developed to simplify the classical virtual-model editing paradigm. Our main aim was to propose a set of solutions with which a common user can realize an idea as a 3D virtual model, drawing in the air just as they would on paper. Moreover, we used gestures and mid-air movements to explore and interact with the 3D virtual environment, and we studied simple and effective 3D form transformations. The work was carried out adopting the discrete representation of the models, thanks to its intuitiveness, but especially because it is full of open challenges.
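One of the simplest free-form deformation tools, a handle point dragged with a smooth falloff over a discrete vertex set, can be sketched as follows. The Gaussian weight and the toy grid are illustrative choices, not the deformation scheme used in the thesis.

```python
import numpy as np

def freeform_deform(vertices, handle, displacement, radius):
    """Drag mesh vertices along with a handle point, weighting the
    displacement by a smooth Gaussian falloff: a minimal stand-in for the
    free-form deformation tools discussed above."""
    d = np.linalg.norm(vertices - handle, axis=1)
    w = np.exp(-(d / radius) ** 2)                 # 1 at the handle, ~0 far away
    return vertices + w[:, None] * displacement

# A flat 10x10 grid of vertices as a toy discrete surface
g = np.linspace(-1.0, 1.0, 10)
xx, yy = np.meshgrid(g, g)
verts = np.column_stack([xx.ravel(), yy.ravel(), np.zeros(100)])

# Pull the center of the surface 0.5 units upward
pulled = freeform_deform(verts, handle=np.array([0.0, 0.0, 0.0]),
                         displacement=np.array([0.0, 0.0, 0.5]), radius=0.5)
```

In a gesture-based interface, `handle` and `displacement` would come from tracked mid-air hand positions, which is what makes a smooth, parameter-light falloff attractive for novice users.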
Abstract:
Gas separation membranes of high CO2 permeability and selectivity have great potential in both natural gas sweetening and carbon dioxide capture. Many modified PIM membranes show permselectivity above the Robeson upper bound. The main problem that must be solved before these polymers can be commercialized is their aging over time. In highly glassy polymeric membranes such as PIM-1 and its modifications, solubility selectivity contributes more to permselectivity than diffusivity selectivity. In this thesis work, the pure- and mixed-gas sorption behavior of carbon dioxide and methane in three PIM-based membranes (PIM-1, TZPIM-1 and AO-PIM-1) and in a Polynonene membrane is therefore rigorously studied. Sorption experiments were performed at different temperatures and molar fractions. The sorption isotherms obtained show that the solubility of both gases in all polymers decreases as the temperature increases. In the mixed-gas experiments there is also a decrease of solubility due to the presence of the other gas in the system, caused by the competitive sorption effect. The variation of solubility is more visible for methane than for carbon dioxide, which makes the mixed-gas solubility selectivity higher than the pure-gas solubility selectivity. Modeling of the system with the NELF and dual-mode sorption models reproduces the experimental results correctly. Sorption measurements in heat-treated and untreated membranes show that the sorption isotherms of both carbon dioxide and methane do not change upon heat treatment. However, the diffusivity coefficient and the permeability of the pure gases decrease after heat treatment, and both decrease further as the heat-treatment temperature increases. The diffusivity coefficients calculated from transient sorption experiments and from steady-state permeability experiments are also compared in this thesis work. The results reveal that the transient diffusivity coefficient is higher than the steady-state diffusivity coefficient.
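The dual-mode sorption model used in the modeling can be written down directly: a Henry's-law dissolved population plus a Langmuir hole-filling population, which gives the concave isotherms typical of glassy polymers. The parameter values below are illustrative placeholders, not fitted values from the thesis.

```python
import numpy as np

def dual_mode_sorption(p, k_d, c_h, b):
    """Dual-mode sorption model for a glassy polymer:
    C = k_d*p + C'_H * b * p / (1 + b*p),
    i.e. a Henry's-law dissolved population plus a Langmuir
    hole-filling population that saturates at high pressure."""
    p = np.asarray(p, dtype=float)
    return k_d * p + c_h * b * p / (1.0 + b * p)

# Illustrative parameters for the two penetrants (not fitted thesis values)
pressure = np.linspace(0.0, 20.0, 50)
c_co2 = dual_mode_sorption(pressure, k_d=0.7, c_h=60.0, b=0.4)
c_ch4 = dual_mode_sorption(pressure, k_d=0.2, c_h=30.0, b=0.15)

# Solubility coefficient S = C/p decreases with pressure (concave isotherm)
s_co2 = c_co2[1:] / pressure[1:]
```

The Langmuir term is also the natural entry point for the competitive sorption seen in the mixed-gas experiments: in the mixed-gas extension of the model, both penetrants compete for the same hole-filling capacity.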
Abstract:
The thesis analyzes the ongoing change in the sources of labour law through a study of the cases in which statutes refer matters to collective agreements. Part I addresses the relationship between statute and collective agreement. From a static perspective, this relationship is governed by the principles of hierarchy and of favor: the statute sets the minimum standard of protection, and the collective agreement may modify that standard in a direction more favourable to the worker. From a dynamic perspective, the relationship is more complex: in the Italian legal system, the regulation of the employment relationship and of the labour market increasingly values the contribution of collective autonomy. In particular, collective agreements are the target of a series of statutory referrals that authorize them to complete the statutory rules and even to modify them in a direction less favourable to the worker, in order to create a more dynamic labour market. Part II focuses on Article 8 of Law no. 148/2011. This provision was introduced during the economic and financial crisis that hit Italy between 2011 and 2012, following negotiations between the Italian Government and the EU institutions, in order to give firms a tool to increase their competitiveness and productivity. Article 8 authorizes collective agreements to derogate in peius from the statute with reference to a range of matters and institutions covering virtually the entire regulation of the employment relationship, with some exceptions. Article 8 represents the end point of a long legislative evolution and calls into question the traditional reconstruction of the relationship between statute and collective agreement based on the principles of hierarchy and favor.
Abstract:
The work takes stock of defense investigations fifteen years after the entry into force of Law no. 397/2000, the epilogue of a long evolutionary process marked, on the one hand, by a laborious and troubled gestation and, on the other, by a legislative product received by practitioners in a climate of general skepticism. In a constantly evolving legislative and case-law landscape, the paradigms set by Articles 24 and 111 of the Constitution, concerning the right to defense and the formation of criminal evidence through adversarial proceedings between parties on an equal footing, require the justice system to offer both the suspect and the defendant adequate defensive tools. Taking into account the differences that naturally characterize the roles of prosecution and defense, which impose inevitable structural asymmetries, the objective of the research is to examine the instruments capable of guaranteeing the defense's right to evidence at every stage and level of the proceedings, in an attempt to fully realize the principle of equality between prosecution and defense in the criminal trial. The research unfolds along three lines: an analysis of the rules on defense investigations in their historical evolution up to the present day, a study of criminal evidence in the American system and, finally, some concluding remarks in a comparative key. The proposals put forward share a common denominator, namely the premise that in order to rebut one must know, and that only in this way can the right to defend oneself by investigating finally be recognized.
Abstract:
Can the potential availability of unemployment insurance (UI) affect the behavior of employed workers and the duration of their employment spells? After discussing a few straightforward reasons why UI may affect employment duration, I apply a regression kink design (RKD) to address this question using linked employer-employee data from the Brazilian labor market. Exploiting the UI schedule, I find that the potential benefit level significantly affects the duration of employment spells. This effect is local to low-skilled workers and, surprisingly, indicates that a 1% increase in unemployment benefits increases job duration by around 0.3%. The result is driven by the fact that higher UI decreases the probability of job quits, which are not covered by UI in Brazil. These estimates are robust to permutation tests and a number of falsification tests. I develop a reduced-form welfare formula to assess the economic relevance of this result and show that the positive effect on employment duration implies a higher optimal benefit level. Moreover, the formula shows that the elasticity of employment duration enters welfare with the same weight as the well-known elasticity of unemployment duration with respect to the benefit level.
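The regression kink design logic can be sketched as follows: with a benefit schedule that is kinked at a cap, the causal effect of the benefit level is the change in the outcome's slope at the kink divided by the known change in the schedule's slope. The simulation below uses the 0.3% headline magnitude purely as an input; the schedule, bandwidth, and all other numbers are stylized, not the Brazilian rules.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200_000

# Stylized UI schedule: benefit = 80% of the wage up to a cap (the kink)
wage = rng.uniform(500.0, 3000.0, n)
kink = 1500.0
benefit = np.minimum(0.8 * wage, 0.8 * kink)

# Employment duration responds to the benefit level (a 0.003 effect per unit
# of benefit, echoing the paper's headline magnitude, is a simulation input)
# plus a smooth direct effect of the wage
duration = 10.0 + 0.003 * benefit + 0.001 * wage + rng.standard_normal(n)

# RK estimator: change in the outcome's slope at the kink divided by the
# (known) change in the schedule's slope, within a bandwidth around the kink
h = 300.0
mask = np.abs(wage - kink) < h
x = wage[mask] - kink
y = duration[mask]
left, right = x < 0, x >= 0

def slope(xs, ys):
    X = np.column_stack([np.ones(len(xs)), xs])
    return np.linalg.lstsq(X, ys, rcond=None)[0][1]

d_outcome = slope(x[right], y[right]) - slope(x[left], y[left])
d_benefit = 0.0 - 0.8            # the schedule's slope falls from 0.8 to 0
effect = d_outcome / d_benefit   # d(duration)/d(benefit)
```

The identifying assumption mirrors the paper's setting: the direct wage effect is smooth through the kink, so any slope change in the outcome is attributable to the kinked benefit schedule.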