8 results for Endogenous Information Structure
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
The dissertation consists of four papers that aim to provide new contributions in the fields of macroeconomics, monetary policy and financial stability. The first paper proposes a new Dynamic Stochastic General Equilibrium (DSGE) model with credit frictions and a banking sector to study the pro-cyclicality of credit and the role of different prudential regulatory frameworks in shaping business cycle fluctuations and in restoring macroeconomic and financial stability. The second paper develops a simple DSGE model capable of evaluating the effects of large purchases of treasuries by central banks. This theoretical framework is employed to evaluate the impact on yields and on the macroeconomy of the large purchases of medium- and long-term government bonds recently implemented in the US and UK. The third paper studies the effects of ECB communications about unconventional monetary policy operations on the perceived sovereign risk of Italy over the last five years. The empirical results are derived from both an event-study analysis and a GARCH model, which uses Italian long-term bond futures to disentangle expected from unexpected policy actions. The fourth paper proposes a DSGE model with an endogenous term structure of interest rates, which replicates the stylized facts regarding the yield curve and the term premium in the US over the period 1987:3-2011:3 without compromising its ability to match macro dynamics.
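The GARCH specification used in the third paper is not given in this abstract; as a minimal sketch of the ingredient involved, the conditional-variance recursion of a GARCH(1,1) can be written as follows. The parameter values and the return series are illustrative, not estimates from the paper:

```python
# GARCH(1,1) conditional-variance recursion (illustrative sketch; the
# parameters and the return series are made up, not from the paper).
def garch_variance(returns, omega=0.1, alpha=0.1, beta=0.8):
    """Conditional variance series h_t for a return series r_t, with
    h_{t+1} = omega + alpha * r_t**2 + beta * h_t."""
    # Start from the unconditional variance omega / (1 - alpha - beta).
    h = [omega / (1.0 - alpha - beta)]
    for r in returns[:-1]:
        h.append(omega + alpha * r * r + beta * h[-1])
    return h

variances = garch_variance([0.5, -1.2, 0.3, 2.0, -0.4])
```

A large surprise return raises the next period's conditional variance, which is why such models can separate anticipated movements from unexpected shocks.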
Abstract:
The Reverse Vaccinology (RV) approach allows genomic information to be used for the design of new protein-based vaccines, starting from an in silico analysis. The first powerful example of the application of the RV approach is the development of a protein-based vaccine against serogroup B Meningococcus. A similar approach was also used to identify new Staphylococcus aureus vaccine candidates, including the ferric hydroxamate-binding lipoprotein FhuD2. S. aureus is a widespread human pathogen that employs several strategies for iron uptake, including: (i) siderophore-mediated iron acquisition using the endogenous siderophores staphyloferrin A and B, (ii) siderophore-mediated iron acquisition using xeno-siderophores (the pathway exploited by FhuD2) and (iii) heme-mediated iron acquisition. In this work the high-resolution crystal structure of FhuD2 in the iron(III)-siderophore-bound form was determined. FhuD2 belongs to class III of the Periplasmic Binding Protein (PBP) family and is principally formed by two globular domains, at the N- and C-termini of the protein, that make up a cleft where ferrichrome-iron(III) is bound. The N- and C-terminal domains, connected by a single long α-helix, present Rossmann-like folds, showing a β-stranded core and an α-helical periphery, and do not undergo extensive structural rearrangement when they interact with the ligand, as is typical of class III PBP members. The structure shows that ferrichrome-bound iron does not come directly into contact with the protein; rather, the metal ion is fully coordinated by six oxygen donors of the hydroxamate groups of three ornithine residues, which, with the three glycine residues, make up the peptide backbone of ferrichrome. Furthermore, it was found that iron-free ferrichrome is able to subtract iron from transferrin. This study shows for the first time the structure of FhuD2, which was found to bind siderophores, and demonstrates that the protein plays an important role in the S. aureus colonization and infection phases.
Abstract:
The vast majority of known proteins have not yet been experimentally characterized and little is known about their function. The design and implementation of computational tools can provide insight into the function of proteins based on their sequence, their structure, their evolutionary history and their association with other proteins. Knowledge of the three-dimensional (3D) structure of a protein can lead to a deep understanding of its mode of action and interaction, but currently the structures of <1% of sequences have been experimentally solved. For this reason, it has become urgent to develop new methods that can computationally extract relevant information from protein sequence and structure. The starting point of my work has been the study of the properties of contacts between protein residues, since they constrain protein folding and characterize different protein structures. Prediction of residue contacts in proteins is an interesting problem whose solution may be useful in protein fold recognition and de novo design. The prediction of these contacts requires the study of the protein inter-residue distances, related to the specific type of amino acid pair, that are encoded in the so-called contact map. An interesting new way of analyzing these structures emerged when network studies were introduced, with pivotal papers demonstrating that protein contact networks also exhibit small-world behavior. In order to highlight constraints for the prediction of protein contact maps, and for applications in the field of protein structure prediction and/or reconstruction from experimentally determined contact maps, I studied to what extent the characteristic path length and clustering coefficient of the protein contact network reveal characteristic features of protein contact maps.
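The two network descriptors named above can be computed directly from a contact map. The sketch below builds a binary contact map from a synthetic 2D "chain" of residue coordinates with an illustrative cutoff (real contact maps use 3D C-alpha or C-beta distances and cutoffs near 8 Å), then evaluates the clustering coefficient and the characteristic path length:

```python
from collections import deque
from itertools import combinations

def contact_map(coords, cutoff=8.0):
    """Binary contact map: residues i and j are in contact when their
    distance is below the cutoff (toy 2D coordinates here)."""
    n = len(coords)
    cmap = [[False] * n for _ in range(n)]
    for i, j in combinations(range(n), 2):
        d = sum((a - b) ** 2 for a, b in zip(coords[i], coords[j])) ** 0.5
        cmap[i][j] = cmap[j][i] = d < cutoff
    return cmap

def clustering_coefficient(cmap):
    """Average local clustering: the fraction of a residue's contact
    neighbours that are themselves in contact, averaged over residues."""
    n = len(cmap)
    total = 0.0
    for i in range(n):
        nbrs = [j for j in range(n) if cmap[i][j]]
        k = len(nbrs)
        if k < 2:
            continue  # residues with fewer than 2 contacts contribute 0
        links = sum(1 for a, b in combinations(nbrs, 2) if cmap[a][b])
        total += 2.0 * links / (k * (k - 1))
    return total / n

def characteristic_path_length(cmap):
    """Mean shortest-path length over residue pairs, via one BFS per
    residue (assumes the contact network is connected)."""
    n = len(cmap)
    total = pairs = 0
    for s in range(n):
        dist = {s: 0}
        queue = deque([s])
        while queue:
            u = queue.popleft()
            for v in range(n):
                if cmap[u][v] and v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(d for t, d in dist.items() if t > s)
        pairs += sum(1 for t in dist if t > s)
    return total / pairs

coords = [(3.0 * i, 0.0) for i in range(6)]  # synthetic residue "chain"
cmap = contact_map(coords, cutoff=8.0)
```

Small-world behavior corresponds to a short characteristic path length combined with high clustering, which is exactly what these two functions measure.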
Provided that residue contacts are known for a protein sequence, the major features of its 3D structure can be deduced by combining this knowledge with correctly predicted secondary-structure motifs. In the second part of my work I focused on a particular protein structural motif, the coiled-coil, known to mediate a variety of fundamental biological interactions. Coiled-coils are found in a variety of structural forms and in a wide range of proteins including, for example, small units such as leucine zippers, which drive the dimerization of many transcription factors, and more complex structures such as the family of viral proteins responsible for virus-host membrane fusion. The coiled-coil structural motif is estimated to account for 5-10% of the protein sequences in the various genomes. Given their biological importance, in my work I introduced a Hidden Markov Model (HMM) that exploits the evolutionary information derived from multiple sequence alignments to predict coiled-coil regions and to discriminate coiled-coil sequences. The results indicate that the new HMM outperforms the existing programs and can be adopted for coiled-coil prediction and for large-scale genome annotation. Genome annotation is a key issue in modern computational biology, being the starting point towards the understanding of the complex processes involved in biological networks. The rapid growth in the number of available protein sequences and structures poses new fundamental problems that still await interpretation. Nevertheless, these data are at the basis of the design of new strategies for tackling problems such as the prediction of protein structure and function. Experimental determination of the functions of all these proteins would be a hugely time-consuming and costly task and, in most instances, has not been carried out. As an example, currently only approximately 20% of annotated proteins in the Homo sapiens genome have been experimentally characterized.
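The HMM itself is not described in this abstract; as a generic sketch of the decoding step such a model relies on, the following Viterbi implementation labels a simplified sequence (observations reduced to hydrophobic 'h' / polar 'p') with hypothetical coiled-coil ("CC") and background ("BG") states. All states and probabilities here are made up for illustration:

```python
import math

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely state path for an observation sequence (log space)."""
    V = [{s: math.log(start_p[s]) + math.log(emit_p[s][obs[0]])
          for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for s in states:
            prev, score = max(((p, V[t - 1][p] + math.log(trans_p[p][s]))
                               for p in states), key=lambda x: x[1])
            V[t][s] = score + math.log(emit_p[s][obs[t]])
            back[t][s] = prev
    # Trace back from the best final state.
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return path[::-1]

# Hypothetical two-state model; all probabilities are invented.
STATES = ("CC", "BG")
START = {"CC": 0.1, "BG": 0.9}
TRANS = {"CC": {"CC": 0.9, "BG": 0.1}, "BG": {"CC": 0.1, "BG": 0.9}}
EMIT = {"CC": {"h": 0.7, "p": 0.3}, "BG": {"h": 0.4, "p": 0.6}}

path = viterbi("hhhhpppp", STATES, START, TRANS, EMIT)
```

A profile HMM built on multiple sequence alignments, as in the thesis, replaces these two generic states with position-specific match states, but the decoding machinery is the same.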
A commonly adopted procedure for annotating protein sequences relies on "inheritance through homology", based on the notion that similar sequences share similar functions and structures. This procedure consists in assigning sequences to a specific group of functionally related sequences that has been built through clustering techniques. The clustering procedure is based on suitable similarity rules, since predicting protein structure and function from sequence largely depends on the level of sequence identity. However, additional levels of complexity are due to multi-domain proteins, to proteins that share common domains but do not necessarily share the same function, and to the finding that different combinations of shared domains can lead to different biological roles. In the last part of this study I developed and validated a system that contributes to sequence annotation by taking advantage of a validated procedure for transfer through inheritance of molecular functions and structural templates. After a cross-genome comparison with the BLAST program, clusters were built on the basis of two stringent constraints on sequence identity and coverage of the alignment. The adopted measure explicitly addresses the problem of multi-domain protein annotation and allows a fine-grained division of the whole set of proteomes used, which ensures cluster homogeneity in terms of sequence length. A high level of coverage of structural templates over the length of the protein sequences within clusters ensures that multi-domain proteins, when present, can be templates for sequences of similar length. This annotation procedure includes the possibility of reliably transferring statistically validated functions and structures to sequences, considering the information available in the present databases of molecular functions and structures.
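The clustering step described above can be sketched as single-linkage clustering under two stringent constraints. The thresholds, the hit tuples and their format below are illustrative stand-ins for parsed BLAST output, not the actual values or pipeline of the study:

```python
def cluster_pairs(n_seqs, hits, min_identity=90.0, min_coverage=0.9):
    """Single-linkage clustering of sequences from pairwise hits.
    hits: (i, j, identity_percent, coverage_fraction) tuples, e.g.
    parsed from BLAST output; the thresholds here are illustrative."""
    parent = list(range(n_seqs))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[rb] = ra

    for i, j, ident, cov in hits:
        # Keep a link only when BOTH stringent constraints hold.
        if ident >= min_identity and cov >= min_coverage:
            union(i, j)

    clusters = {}
    for s in range(n_seqs):
        clusters.setdefault(find(s), []).append(s)
    return sorted(clusters.values())

# The (1, 2) hit is discarded: its coverage falls below the threshold.
groups = cluster_pairs(
    5, [(0, 1, 95.0, 0.95), (1, 2, 92.0, 0.80), (3, 4, 99.0, 0.99)])
```

The coverage constraint is what keeps multi-domain proteins from being chained to single-domain ones: a hit over a single shared domain covers too little of the longer sequence to pass.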
Abstract:
In this thesis, a strategy to model the behavior of fluids and their interaction with deformable bodies is proposed. The fluid domain is modeled by using the lattice Boltzmann method, thus analyzing the fluid dynamics from a mesoscopic point of view. It has been proved that the solution provided by this method is equivalent to solving the Navier-Stokes equations for an incompressible flow with second-order accuracy. Slender elastic structures idealized through beam finite elements are used. Large displacements are accounted for by using the corotational formulation. Structural dynamics is computed by using the Time Discontinuous Galerkin method. Therefore, two different solution procedures are used, one for the fluid domain and one for the structural part. These two solvers need to communicate and to exchange several pieces of information, i.e. stresses, velocities and displacements. In order to guarantee a continuous, effective and mutual exchange of information, a coupling strategy consisting of three different algorithms has been developed and numerically tested. In particular, the effectiveness of the three algorithms is assessed in terms of the interface energy artificially produced by the approximate fulfilment of the compatibility and equilibrium conditions at the fluid-structure interface. The proposed coupled approach is used to solve different fluid-structure interaction problems, i.e. cantilever beams immersed in a viscous fluid, the impact of a ship hull on the marine free surface, blood flow in deformable vessels, and even flapping wings simulating the take-off of a butterfly. The good results achieved in each application highlight the effectiveness of the proposed methodology, and of the C++ software developed, in successfully approaching several two-dimensional fluid-structure interaction problems.
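As an illustration of the fluid solver's basic building block, the following is a minimal D2Q9 lattice Boltzmann update: a single-relaxation-time (BGK) collision followed by periodic streaming. The grid size, relaxation time and initial density bump are arbitrary toy choices, not the settings of the thesis, and the thesis's C++ solver and coupling algorithms are not reproduced here:

```python
# Standard D2Q9 lattice weights and discrete velocities.
W = [4 / 9] + [1 / 9] * 4 + [1 / 36] * 4
C = [(0, 0), (1, 0), (0, 1), (-1, 0), (0, -1),
     (1, 1), (-1, 1), (-1, -1), (1, -1)]

def equilibrium(rho, ux, uy):
    """Second-order equilibrium populations for one lattice node."""
    usq = ux * ux + uy * uy
    return [w * rho * (1 + 3 * (cx * ux + cy * uy)
                       + 4.5 * (cx * ux + cy * uy) ** 2 - 1.5 * usq)
            for w, (cx, cy) in zip(W, C)]

def step(f, nx, ny, tau=0.6):
    """One collide-and-stream update; f[x][y] holds the 9 populations."""
    # Collision: relax each population towards the local equilibrium.
    for x in range(nx):
        for y in range(ny):
            rho = sum(f[x][y])
            ux = sum(fi * cx for fi, (cx, _) in zip(f[x][y], C)) / rho
            uy = sum(fi * cy for fi, (_, cy) in zip(f[x][y], C)) / rho
            feq = equilibrium(rho, ux, uy)
            f[x][y] = [fi - (fi - fe) / tau for fi, fe in zip(f[x][y], feq)]
    # Streaming: advect populations to neighbours (periodic boundaries).
    g = [[[0.0] * 9 for _ in range(ny)] for _ in range(nx)]
    for x in range(nx):
        for y in range(ny):
            for i, (cx, cy) in enumerate(C):
                g[(x + cx) % nx][(y + cy) % ny][i] = f[x][y][i]
    return g

NX = NY = 4
f = [[equilibrium(1.0, 0.0, 0.0) for _ in range(NY)] for _ in range(NX)]
f[0][0] = equilibrium(1.2, 0.0, 0.0)  # small density perturbation
f = step(f, NX, NY)
```

Both collision and streaming conserve mass exactly, which gives a simple sanity check on any implementation.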
Abstract:
The Vrancea region, at the south-eastern bend of the Carpathian Mountains in Romania, represents one of the most puzzling seismically active zones of Europe. Besides some shallow seismicity spread across the whole Romanian territory, Vrancea is the site of intense seismicity, with a cluster of intermediate-depth foci placed in a narrow, nearly vertical volume. Although large-scale mantle seismic tomographic studies have revealed the presence of a narrow, almost vertical, high-velocity body in the upper mantle, the nature and the geodynamics of this deep intra-continental seismicity are still debated. High-resolution seismic tomography could help to reveal more details of the subcrustal structure of Vrancea. Recent developments in computational seismology, as well as the availability of parallel computing, now make it possible to retrieve more information from seismic waveforms and to reach such high-resolution models. This study aimed to evaluate the application of full waveform inversion tomography at the regional scale to the Vrancea lithosphere, using data from the six-month temporary local network CALIXTO deployed in 1999. Starting from a detailed 3D Vp, Vs and density model, built from classical travel-time tomography together with gravity data, I evaluated the improvements obtained with the full waveform inversion approach. The latter proved to be highly problem dependent and computationally expensive. The model retrieved after the first two iterations does not show large variations with respect to the initial model but remains in agreement with previous tomographic models. It presents a well-defined slab-shaped high-velocity anomaly, composed of an N-S horizontal anomaly at depths between 40 and 70 km linked to a nearly vertical NE-SW anomaly from 70 to 180 km.
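The iterative model update at the heart of full waveform inversion can be caricatured on a single-parameter toy problem: minimize an L2 waveform misfit by steepest descent. Real FWI computes gradients over 3D models with the adjoint method; the forward model, step size and data below are purely illustrative:

```python
def misfit(model, observed, predict):
    """L2 misfit between observed and predicted waveform samples."""
    return 0.5 * sum((p - o) ** 2 for p, o in zip(predict(model), observed))

def fwi_iteration(model, observed, predict, step=0.02, eps=1e-4):
    """One steepest-descent update with a finite-difference gradient.
    Real FWI obtains the gradient from the adjoint wavefield; this
    scalar toy only shows the shape of the iterative update loop."""
    grad = (misfit(model + eps, observed, predict)
            - misfit(model - eps, observed, predict)) / (2 * eps)
    return model - step * grad

def predict(v):
    # Hypothetical forward model: a trace scaling linearly with "velocity" v.
    return [v * t for t in range(5)]

observed = predict(2.0)  # synthetic data from a known true model, v = 2
v = 1.0                  # deliberately wrong starting model
for _ in range(50):
    v = fwi_iteration(v, observed, predict)
```

Even in this caricature, the cost per iteration is dominated by forward-model evaluations, which hints at why the full 3D problem is so computationally expensive.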
Abstract:
This research was designed to answer the question of which direction the restructuring of financial regulators should take: consolidation or fragmentation. It began by examining the need for financial regulation and its related costs. It then described the types of regulatory structures that exist in the world, surveying the regulatory structures in 15 jurisdictions, comparing them and discussing their strengths and weaknesses. The possible regulatory structures were analyzed using three methodological tools: game theory, institutional design and network effects. The incentives for regulatory action were examined in Chapter Four using game-theory concepts. This chapter predicted how two regulators with overlapping supervisory mandates will behave in two different states of the world (where they stand to benefit from regulating and where they stand to lose). The insights derived from the games described in this chapter were then used to analyze the different supervisory models that exist in the world. The problem of information flow was discussed in Chapter Five using tools from institutional design. The idea is based on the need for the right kind of information to reach the decision maker in the shortest time possible in order to predict, mitigate or stop a financial crisis. Network effects and congestion in the context of financial regulation were discussed in Chapter Six, which applied the general literature on network effects in an attempt to conclude whether consolidating financial regulatory standards on a global level might also yield other positive network effects. Returning to the main research question, this research concluded that in general the fragmented model should be preferred to the consolidated model in most cases, as it allows for greater diversity and information flow.
However, in cases in which close cooperation between two authorities is essential, the consolidated model should be used.
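The kind of two-regulator interaction analyzed in Chapter Four can be sketched as a 2x2 game. The payoff matrices below are hypothetical, chosen to show an anti-coordination pattern in which each regulator prefers that exactly one of them acts; they are not the games from the dissertation:

```python
def pure_nash(payoff_a, payoff_b):
    """Pure-strategy Nash equilibria (i, j) of a two-player game given
    payoff matrices payoff_a[i][j] and payoff_b[i][j] for row action i
    and column action j."""
    rows, cols = len(payoff_a), len(payoff_a[0])
    eqs = []
    for i in range(rows):
        for j in range(cols):
            # i must be a best response to j, and j a best response to i.
            a_best = all(payoff_a[i][j] >= payoff_a[k][j] for k in range(rows))
            b_best = all(payoff_b[i][j] >= payoff_b[i][k] for k in range(cols))
            if a_best and b_best:
                eqs.append((i, j))
    return eqs

# Hypothetical overlapping-mandate game (illustrative payoffs only):
# duplicated supervision is wasteful, so each regulator would rather
# let the other one act, yet someone must act.
ACT, ABSTAIN = 0, 1
payoff_a = [[2, 4], [3, 1]]  # row regulator
payoff_b = [[2, 3], [4, 1]]  # column regulator
equilibria = pure_nash(payoff_a, payoff_b)
```

With these payoffs the equilibria are the two asymmetric outcomes, one regulator acting and the other abstaining, which is the sort of prediction such a chapter can then compare against real supervisory models.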
Abstract:
The aim of this thesis is to explore the possible influence of the food matrix on food quality attributes. Using nuclear magnetic resonance techniques, the matrix-dependent properties of different foods were studied and some useful indices were defined to classify food products based on the behaviour of the matrix in response to processing. Correlations were found between fish freshness indices, assessed by certain geometric parameters linked to the morphology of the animal, i.e. a macroscopic structure, and the degradation of the product structure. The same foodomics approach was also applied to explore the protective effect of modified atmospheres on the stability of fish fillets, which are typically susceptible to oxidation of the polyunsaturated fatty acids incorporated in the meat matrix. Here, freshness was assessed by evaluating the time-dependent change in the fish metabolome, providing an established freshness index, and its relationship to lipid oxidation. In vitro digestion studies, focusing on food products with different matrices, alone and in combination with other meal components (e.g. seasoning), were conducted to investigate possible interactions between enzymes and food, modulated by the matrix structure, that influence digestibility. The interaction between water and the gelatinous matrix of the food, consisting of a network of protein gels incorporating fat droplets, was also studied by means of nuclear magnetic relaxometry, in order to create a prediction tool for the correct classification of authentic and counterfeit food products protected by a quality label. This is one of the first applications of an NMR method focusing on the supramolecular structure of the matrix, rather than its chemical composition, to assess food authenticity. The effect of innovative processing technologies, such as pulsed electric fields (PEF) applied to fruit products, was assessed by magnetic resonance imaging, exploiting information associated with the rehydration kinetics of the modified food structure.
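Relaxometry-based classification ultimately rests on extracting relaxation parameters from decay curves. As a sketch of that step, a monoexponential T2 decay S(t) = S0 * exp(-t / T2) can be fitted by log-linear least squares; the S0 and T2 values below are synthetic, not measurements from the thesis, and real gel matrices usually require multi-exponential fits:

```python
import math

def fit_t2(times, signal):
    """Fit S(t) = S0 * exp(-t / T2) by least squares on log(S)."""
    n = len(times)
    logs = [math.log(s) for s in signal]
    tbar = sum(times) / n
    ybar = sum(logs) / n
    # Slope of log(S) vs t is -1/T2; the intercept gives log(S0).
    slope = (sum((t - tbar) * (y - ybar) for t, y in zip(times, logs))
             / sum((t - tbar) ** 2 for t in times))
    s0 = math.exp(ybar - slope * tbar)
    return s0, -1.0 / slope

# Synthetic, noise-free decay with illustrative S0 = 100 and T2 = 50
# (arbitrary units), not data from the thesis.
times = [0.0, 10.0, 20.0, 40.0, 80.0]
signal = [100.0 * math.exp(-t / 50.0) for t in times]
s0, t2 = fit_t2(times, signal)
```

The fitted T2 (or the distribution of relaxation times in the multi-exponential case) is the kind of feature a classifier can use to separate authentic from counterfeit products.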
Abstract:
The study of spectroscopic phenomena in organic solids, in combination with other techniques, is an effective tool for understanding the structural properties of materials based on these compounds. This Ph.D. work was dedicated to the spectroscopic investigation of some relevant processes occurring in organic molecular crystals, with the goal of expanding the knowledge of the relationship between structure, dynamics and photoreactivity of these systems. Vibrational spectroscopy has been the technique of choice, always in combination with X-ray diffraction structural studies and often with the support of computational methods. The vibrational study of the molecular solid state reaches its full potential when it includes the low-wavenumber region of the lattice-phonon modes, which probe the weak intermolecular interactions and are the fingerprints of the lattice itself. Microscopy is an invaluable addition in the investigation of processes that take place on the micrometer scale of the crystal micro-domains. In chemical and phase transitions, as well as in polymorph screening and identification, the combination of Raman microscopy and lattice-phonon detection has provided useful information. Research on the fascinating class of single-crystal-to-single-crystal photoreactions has shown how the homogeneous mechanism of these transformations can be identified by lattice-phonon microscopy, in agreement with the continuous evolution of their XRD patterns. In describing the photodimerization mechanism of vitamin K3, the focus was instead on the influence of its polymorphism in governing the product isomerism. Polymorphism is an additional degree of freedom of molecular functional materials, and by advancing control over its occurrence and properties, functionalities can be promoted for useful applications. Its investigation here focused on thin-film phases, widely employed in organic electronics. The ambiguities in phase identification that often emerge from other experimental methods were successfully resolved by vibrational measurements.