9 results for EFFICIENCY OF PROTEIN UTILIZATION
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
The vast majority of known proteins have not yet been experimentally characterized and little is known about their function. The design and implementation of computational tools can provide insight into the function of proteins based on their sequence, their structure, their evolutionary history and their association with other proteins. Knowledge of the three-dimensional (3D) structure of a protein can lead to a deep understanding of its mode of action and interaction, but currently the structures of fewer than 1% of sequences have been experimentally solved. For this reason, it has become urgent to develop new methods able to computationally extract relevant information from protein sequence and structure. The starting point of my work has been the study of the properties of contacts between protein residues, since they constrain protein folding and characterize different protein structures. Prediction of residue contacts in proteins is an interesting problem whose solution may be useful in protein fold recognition and de novo design. Predicting these contacts requires studying the inter-residue distances, specific to each type of amino acid pair, that are encoded in the so-called contact map. An interesting new way of analyzing these structures emerged with the introduction of network studies, with pivotal papers demonstrating that protein contact networks also exhibit small-world behavior. In order to highlight constraints for the prediction of protein contact maps, and for applications in the field of protein structure prediction and/or reconstruction from experimentally determined contact maps, I studied to what extent the characteristic path length and clustering coefficient of the protein contact network reveal characteristic features of protein contact maps.
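The two network measures mentioned above can be computed directly from a binary contact map. A minimal sketch follows, using a toy 5-residue map (invented for illustration, not data from the thesis):

```python
from collections import deque

def path_lengths_from(adj, src):
    """BFS shortest-path lengths from src in an unweighted graph."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def characteristic_path_length(adj):
    """Mean shortest-path length over all connected node pairs."""
    total, pairs = 0, 0
    for src in adj:
        d = path_lengths_from(adj, src)
        total += sum(l for n, l in d.items() if n != src)
        pairs += len(d) - 1
    return total / pairs if pairs else 0.0

def clustering_coefficient(adj):
    """Mean fraction of each node's neighbour pairs that are themselves linked."""
    coeffs = []
    for u, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            continue
        links = sum(1 for v in nbrs for w in nbrs if v < w and w in adj[v])
        coeffs.append(2.0 * links / (k * (k - 1)))
    return sum(coeffs) / len(coeffs) if coeffs else 0.0

# Toy 5-residue binary contact map (symmetric, zero diagonal)
cmap = [
    [0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0],
    [1, 1, 0, 0, 0],
    [0, 1, 0, 0, 1],
    [0, 0, 0, 1, 0],
]
adj = {i: {j for j, c in enumerate(row) if c} for i, row in enumerate(cmap)}
L = characteristic_path_length(adj)
C = clustering_coefficient(adj)
```

Real contact maps are of course far larger; small-world behavior shows up as a short characteristic path length combined with a high clustering coefficient relative to a random graph of the same density.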
Provided that residue contacts are known for a protein sequence, the major features of its 3D structure can be deduced by combining this knowledge with correctly predicted secondary-structure motifs. In the second part of my work I focused on a particular protein structural motif, the coiled-coil, known to mediate a variety of fundamental biological interactions. Coiled-coils are found in a variety of structural forms and in a wide range of proteins including, for example, small units such as the leucine zippers that drive the dimerization of many transcription factors, and more complex structures such as the family of viral proteins responsible for virus-host membrane fusion. The coiled-coil structural motif is estimated to account for 5-10% of the protein sequences in the various genomes. Given their biological importance, in my work I introduced a Hidden Markov Model (HMM) that exploits the evolutionary information derived from multiple sequence alignments to predict coiled-coil regions and to discriminate coiled-coil sequences. The results indicate that the new HMM outperforms all existing programs and can be adopted for coiled-coil prediction and for large-scale genome annotation. Genome annotation is a key issue in modern computational biology, being the starting point towards understanding the complex processes involved in biological networks. The rapid growth in the number of available protein sequences and structures poses new fundamental problems that still await interpretation. Nevertheless, these data are at the basis of the design of new strategies for tackling problems such as the prediction of protein structure and function. Experimental determination of the functions of all these proteins would be a hugely time-consuming and costly task and, in most instances, has not been carried out. As an example, currently only approximately 20% of annotated proteins in the Homo sapiens genome have been experimentally characterized.
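To illustrate the decoding step an HMM of this kind performs, here is a minimal log-space Viterbi sketch over a toy two-state model ("C" for coiled-coil, "B" for background). The states, emission and transition probabilities are invented for illustration and are not the thesis's trained, alignment-based HMM:

```python
import math

def viterbi(obs, states, log_start, log_trans, log_emit):
    """Most likely state path through an HMM (log-space Viterbi)."""
    V = [{s: log_start[s] + log_emit[s][obs[0]] for s in states}]
    back = []
    for o in obs[1:]:
        row, ptr = {}, {}
        for s in states:
            best_prev = max(states, key=lambda p: V[-1][p] + log_trans[p][s])
            row[s] = V[-1][best_prev] + log_trans[best_prev][s] + log_emit[s][o]
            ptr[s] = best_prev
        V.append(row)
        back.append(ptr)
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return path[::-1]

# Toy model: 'h' = heptad-like residue, 'x' = other (illustrative values).
lg = math.log
states = ("C", "B")
log_start = {"C": lg(0.5), "B": lg(0.5)}
log_trans = {"C": {"C": lg(0.9), "B": lg(0.1)},
             "B": {"C": lg(0.1), "B": lg(0.9)}}
log_emit = {"C": {"h": lg(0.9), "x": lg(0.1)},
            "B": {"h": lg(0.3), "x": lg(0.7)}}
path = viterbi("hhhxxx", states, log_start, log_trans, log_emit)
```

In a real predictor the emissions would come from profile columns of a multiple sequence alignment rather than single residues, which is precisely where the evolutionary information enters.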
A commonly adopted procedure for annotating protein sequences relies on "inheritance through homology", based on the notion that similar sequences share similar functions and structures. This procedure consists of assigning sequences to a specific group of functionally related sequences that has been built through clustering techniques. The clustering procedure is based on suitable similarity rules, since predicting protein structure and function from sequence largely depends on the value of sequence identity. However, additional levels of complexity are due to multi-domain proteins, to proteins that share common domains but do not necessarily share the same function, and to the finding that different combinations of shared domains can lead to different biological roles. In the last part of this study I developed and validated a system that contributes to sequence annotation by taking advantage of a validated transfer-through-inheritance procedure for molecular functions and structural templates. After a cross-genome comparison with the BLAST program, clusters were built on the basis of two stringent constraints on sequence identity and coverage of the alignment. The adopted measure explicitly addresses the problem of multi-domain protein annotation and allows a fine-grained division of the whole set of proteomes used, ensuring cluster homogeneity in terms of sequence length. A high level of coverage of structure templates over the length of the protein sequences within clusters ensures that multi-domain proteins, when present, can be templates for sequences of similar length. This annotation procedure includes the possibility of reliably transferring statistically validated functions and structures to sequences, considering the information available in current databases of molecular functions and structures.
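The clustering step described above can be sketched as single-linkage clustering over BLAST hits that pass both an identity constraint and a coverage constraint on both sequences. The thresholds and hit records below are illustrative placeholders, not the thesis's actual values:

```python
def passes_constraints(hit, min_identity=40.0, min_coverage=0.9):
    """Accept a BLAST hit only if sequence identity AND alignment
    coverage on BOTH sequences meet the thresholds (placeholder values)."""
    cov_q = hit["aln_len"] / hit["query_len"]
    cov_s = hit["aln_len"] / hit["subject_len"]
    return (hit["identity"] >= min_identity
            and cov_q >= min_coverage and cov_s >= min_coverage)

class UnionFind:
    """Single-linkage clustering via union-find with path halving."""
    def __init__(self):
        self.parent = {}
    def find(self, x):
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x
    def union(self, a, b):
        self.parent[self.find(a)] = self.find(b)

# Toy hits: A-B passes both constraints, B-C fails on identity.
hits = [
    {"query": "A", "subject": "B", "identity": 95.0,
     "aln_len": 180, "query_len": 200, "subject_len": 190},
    {"query": "B", "subject": "C", "identity": 30.0,
     "aln_len": 150, "query_len": 190, "subject_len": 160},
]
uf = UnionFind()
for seq in ("A", "B", "C"):
    uf.find(seq)
for h in hits:
    if passes_constraints(h):
        uf.union(h["query"], h["subject"])
clusters = {}
for seq in ("A", "B", "C"):
    clusters.setdefault(uf.find(seq), set()).add(seq)
```

Requiring high coverage on both sequences is what keeps clusters homogeneous in length: a single shared domain is not enough to merge two multi-domain proteins.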
Abstract:
We present a study of the metal sites of different proteins through X-ray Absorption Fine Structure (XAFS) spectroscopy. First of all, the capabilities of XAFS analysis have been improved by ab initio simulation of the near-edge region of the spectra, and an original analysis method has been proposed. The method subsequently served as a tool to treat diverse biophysical problems, such as the inhibition of proton-translocating proteins by metal ions and the matrix effect exerted on photosynthetic proteins (the bacterial Reaction Center, RC) by strongly dehydrated sugar matrices. A time-resolved study of the Fe site of the RC with μs resolution has also been attempted. Finally, a further step aimed at improving the reliability of XAFS analysis has been taken by calculating the dynamical parameters of the metal-binding cluster by means of DFT methods; the theoretical result obtained for MbCO has been successfully compared with experimental data.
Abstract:
As land is developed, the impervious surfaces that are created increase the amount of runoff during rainfall events, disrupting the natural hydrologic cycle and increasing both the runoff volume and the pollutant loadings. Pollutants deposited on, or derived from an activity on, the land surface will likely end up in stormwater runoff in some concentration: nutrients, sediment, heavy metals, hydrocarbons, gasoline additives, pathogens, deicers, herbicides and pesticides. Several of these pollutants are particulate-bound, so sediment removal can clearly provide significant water-quality improvements, and knowledge of the ability of stormwater treatment devices to retain particulate matter is important. For this reason three different units that remove sediments have been tested in the laboratory. In particular, a roadside gully pot has been tested under steady hydraulic conditions, varying the characteristics of the influent solids (diameter, particle size distribution and specific gravity). The efficiency in terms of particles retained has been evaluated as a function of influent flow rate and particle characteristics; results have been compared to the efficiency evaluated by applying an overflow rate model. Furthermore, the role of particle settling velocity in determining efficiency has been investigated. After the experimental runs on the gully pot, a standard full-scale model of a hydrodynamic separator (HS) has been tested under unsteady influent flow rate conditions and constant influent solids concentration. The results presented in this study illustrate that the particle separation efficiency of the unit is predominantly influenced by the operating flow rate, which strongly affects the particle and hydraulic residence times of the system.
The efficiency data have been compared to results obtained from a modified overflow rate model; moreover, the residence time distribution has been experimentally determined through tracer analyses for several steady flow rates. Finally, three testing experiments have been performed for two different configurations of a full-scale model of a clarifier (linear and crenulated) under unsteady influent flow rate conditions and constant influent solids concentration. The results illustrate that the particle separation efficiency of the unit is predominantly influenced by the configuration of the unit itself. Turbidity measurements have been compared with the suspended sediment concentration in order to find a correlation between the two, which would allow the sediment concentration to be measured simply by installing a turbidity probe.
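For reference, a plain (unmodified) overflow rate model reduces to comparing the particle settling velocity with the surface overflow rate Q/A. A minimal sketch follows, with Stokes-law settling and illustrative numbers (not the thesis's modified model or its experimental data):

```python
def stokes_settling_velocity(d, rho_p, rho_w=998.0, mu=1.0e-3, g=9.81):
    """Stokes-law settling velocity [m/s] for a particle of diameter d [m]
    and density rho_p [kg/m^3]; valid only in the laminar (low-Re) regime."""
    return g * (rho_p - rho_w) * d ** 2 / (18.0 * mu)

def overflow_rate_efficiency(v_s, Q, A):
    """Ideal-settling removal fraction: particles whose settling velocity
    exceeds the overflow rate Q/A [m/s] are fully captured."""
    return min(1.0, v_s / (Q / A))

# Illustrative case: 30-um silica-like particle, 2 m^2 unit at 10 L/s
v_s = stokes_settling_velocity(d=30e-6, rho_p=2650.0)   # ~0.81 mm/s
eta = overflow_rate_efficiency(v_s, Q=0.01, A=2.0)      # overflow rate 5 mm/s
```

This makes explicit why flow rate dominates the measured efficiency: Q enters the overflow rate directly, so doubling the flow halves the removal fraction of any particle not already fully captured.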
Abstract:
Microcredit has long been used as a tool to alleviate poverty. This research aims to assess the efficiency of microcredit in the field of social exclusion. Questionnaires were developed and existing tools used to observe the tangible and intangible intertwining of microcredit, and in doing so the effort was concentrated on observing whether microcredit has a direct effect on social exclusion or not. Bangladesh was chosen for the field study and 85 samples were taken for the analysis. The research covers a fixed time period: one year was set to collect the sample and carry out the statistical analysis. The tangible aspect was based on a World Bank questionnaire, and the social capital questionnaire was developed from several well-established tools. The research sample, borrowers of Grameen Bank in Bangladesh, shows a strong correlation between their tangible activity and social life. Significant changes in the tangible aspect and in social participation were observed in the research. A strong correlation between the two aspects was also found, taking into account that the borrowers themselves have a vibrant social life in the village.
Abstract:
In this work we studied the efficiency of the benchmarks used in the asset management industry. In chapter 2 we analyzed the efficiency of the benchmarks used for the government bond markets. We found that for Emerging Market bonds an equally weighted index for the country weights is probably the most suitable, because it guarantees maximum diversification of country risk, while for the Eurozone government bond market a GDP-weighted index is better, because the most important concern is to avoid a higher weight for highly indebted countries. In chapter 3 we analyzed the efficiency of a Derivatives Index, instead of a Cash Index, for investing in the European corporate bond market. We can state that the two indexes are similar in terms of returns, but the Derivatives Index is less risky because it has a lower volatility, has values of skewness and kurtosis closer to those of a normal distribution, and is a more liquid instrument, as its autocorrelation is not significant. Chapter 4 analyzes the impact of fallen angels on corporate bond portfolios. Our analysis investigated the impact of the month-end rebalancing of the ML Emu Non Financial Corporate Index for the exit of downgraded bonds (the event). We can conclude that a flexible approach to the month-end rebalancing is better in order to avoid a loss of value due to the benchmark construction rules. In chapter 5 we compared the equally weighted and capitalization-weighted methods for the European equity market. The benefit that results from reweighting the portfolio into equal weights can be attributed to the fact that EW portfolios implicitly follow a contrarian investment strategy, because they mechanically rebalance away from stocks that increase in price.
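The contrarian mechanism mentioned for chapter 5 can be made concrete with a small sketch: restoring equal weights necessarily sells the positions that have grown and buys those that have shrunk. The tickers and numbers below are illustrative only:

```python
def cap_weights(market_caps):
    """Capitalization weights: each asset in proportion to its market cap."""
    total = sum(market_caps.values())
    return {t: v / total for t, v in market_caps.items()}

def equal_weights(tickers):
    """Equal weights: 1/N for each asset."""
    n = len(tickers)
    return {t: 1.0 / n for t in tickers}

def rebalance_trades(current_values):
    """Trades (in currency units) needed to restore equal weights:
    negative = sell the outperformer, positive = buy the laggard."""
    total = sum(current_values.values())
    target = total / len(current_values)
    return {t: target - v for t, v in current_values.items()}

# Two-asset toy: A has risen to 120, B has fallen to 80.
trades = rebalance_trades({"A": 120.0, "B": 80.0})   # sell A, buy B
```

A cap-weighted portfolio, by contrast, requires no such trades: price moves change the weights exactly as the index construction demands, which is why only the EW scheme behaves as a contrarian.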
Abstract:
Chromatography is the most widely used technique for high-resolution separation and analysis of proteins. This technique is very useful for the purification of delicate compounds, e.g. pharmaceuticals, because it is usually performed at milder conditions than the separation processes typically used by the chemical industry. This thesis focuses on affinity chromatography. Chromatographic processes are traditionally performed using columns packed with porous resin. However, these supports have several limitations, including the dependence on intra-particle diffusion (a slow mass-transfer mechanism) for the transport of solute molecules to the binding sites within the pores, and a high pressure drop through the packed bed. These limitations can be overcome by using chromatographic supports such as membranes or monoliths. Dye ligands are considered important alternatives to natural ligands. Several reactive dyes, particularly Cibacron Blue F3GA, are used as affinity ligands for protein purification. Cibacron Blue F3GA is a triazine dye that interacts specifically and reversibly with albumin. The aim of this study is to prepare dye-affinity membranes and monoliths for the efficient removal of albumin and to compare three different affinity supports: membranes, monoliths and a commercial HiTrap™ Blue HP column, produced by GE Healthcare. The comparison among the three supports was performed in terms of binding capacity at saturation (DBC100%) and dynamic binding capacity at 10% breakthrough (DBC10%), using solutions of pure BSA. The results obtained show that the CB-RC membranes and CB-Epoxy monoliths are comparable to the commercial support, the HiTrap™ Blue HP column, for the separation of albumin. These results encourage further characterization of the new supports examined.
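As a reminder of how a dynamic binding capacity such as DBC10% is read off a breakthrough curve, here is a simplified sketch: it ignores system dead volume and flowthrough corrections, and the curve values are invented, not measured data from the thesis:

```python
def dbc_at_breakthrough(volumes, c_out, c_in, support_volume, frac=0.10):
    """Protein loaded per unit support volume [mg/mL] at the point where
    the effluent concentration first reaches frac * c_in (e.g. DBC10%).
    Simplified: dead volume and flowthrough corrections are omitted."""
    for v, c in zip(volumes, c_out):
        if c >= frac * c_in:
            return c_in * v / support_volume
    raise ValueError("breakthrough never reached")

# Toy breakthrough curve for a 1 mL support loaded with 2 mg/mL BSA
volumes = [0.0, 5.0, 10.0, 15.0, 20.0]   # mL loaded
c_out = [0.0, 0.05, 0.1, 0.3, 1.0]       # effluent concentration, mg/mL
dbc10 = dbc_at_breakthrough(volumes, c_out, c_in=2.0, support_volume=1.0)
```

DBC100% is obtained analogously at full saturation; comparing the two tells how sharp the breakthrough front is, which is exactly where convective supports such as membranes and monoliths are expected to outperform diffusion-limited resins.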
Abstract:
The diameters of traditional dish concentrators can reach several tens of meters, and the construction of monolithic mirrors is difficult at these scales: cheap flat reflecting facets mounted on a common frame generally reproduce a paraboloidal surface. When a standard imaging mirror is coupled with a PV dense array, problems arise since the focused solar image is intrinsically circular. Moreover, the corresponding irradiance distribution is bell-shaped, in contrast with the requirement of having all the cells under the same illumination. Mismatch losses occur when interconnected cells experience different conditions, in particular in series connections. In this PhD thesis, we aim at solving these issues with a multidisciplinary approach, exploiting optical concepts and applications developed specifically for astronomical use, where the improvement of image quality is a very important issue. The strategy we propose is to boost the spot uniformity by acting uniquely on the primary reflector, avoiding the segmentation of the big mirrors into numerous smaller elements that need to be accurately mounted and aligned. In the proposed method, the shape of the mirrors is analytically described by Zernike polynomials and its optimization is obtained numerically, giving a non-imaging optic able to produce a quasi-square spot that is spatially uniform and has a prescribed concentration level. The freeform primary optics leads to a substantial gain in efficiency without secondary optics. Simple electrical schemes for the receiver are also required. The concept has been investigated theoretically by modeling an example of a CPV dense-array application, including the development of non-optical aspects such as the design of the detector and of the supporting mechanics. For the method proposed and the specific CPV system described, a patent application has been filed in Italy with the number TO2014A000016.
The patent has been developed thanks to the collaboration between the University of Bologna and INAF (National Institute for Astrophysics).
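To make the surface description concrete: a freeform sag can be written as a weighted sum of Zernike terms Z_n^m on the unit disk. The sketch below implements only a few low-order terms, and the coefficients are illustrative placeholders, not those of the patented design:

```python
import math

def zernike_term(n, m, rho, theta):
    """A few low-order Zernike polynomials Z_n^m on the unit disk
    (illustrative subset; higher orders follow the same pattern)."""
    radial = {
        (0, 0): 1.0,
        (1, 1): rho,
        (2, 0): 2.0 * rho ** 2 - 1.0,                  # defocus
        (2, 2): rho ** 2,                               # astigmatism
        (4, 0): 6.0 * rho ** 4 - 6.0 * rho ** 2 + 1.0,  # spherical
    }[(n, abs(m))]
    if m > 0:
        return radial * math.cos(m * theta)
    if m < 0:
        return radial * math.sin(-m * theta)
    return radial

def freeform_sag(coeffs, rho, theta):
    """Mirror sag as a weighted sum of Zernike terms {(n, m): coefficient}."""
    return sum(c * zernike_term(n, m, rho, theta)
               for (n, m), c in coeffs.items())

# Evaluate a toy surface at the edge of the pupil (rho=1, theta=0)
sag = freeform_sag({(0, 0): 0.5, (2, 0): 1.0, (2, 2): 0.1}, rho=1.0, theta=0.0)
```

In the optimization described above, the coefficients of such an expansion would be the free parameters tuned numerically until the reflected spot meets the uniformity and concentration targets.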
Abstract:
Sweet sorghum, a C4 crop of tropical origin, is gaining momentum as a multipurpose feedstock to tackle the growing environmental, food and energy security demands. Under temperate climates sweet sorghum is considered a potential bioethanol feedstock; however, being a relatively new crop in such areas, its physiological and metabolic adaptability has to be evaluated, especially to the more frequent and severe drought spells occurring throughout the growing season and to the cold temperatures during the establishment period of the crop. The objective of this thesis was to evaluate some adaptive photosynthetic traits of sweet sorghum under drought and cold stress, both in the field and under controlled conditions. To meet this goal, a series of experiments was carried out. A new cold-tolerant sweet sorghum genotype was sown in rhizotrons of 1 m³ in order to evaluate its tolerance to progressive drought, until plant death, at young and mature stages. Young plants were able to retain a high photosynthetic rate for 10 days longer than mature plants. This response was associated with an efficient PSII down-regulation capacity mediated by light energy dissipation, closure of reaction centers (JIP-test parameters), and accumulation of glucose and sucrose. On the other hand, when sweet sorghum plants entered the blooming stage, neither energy dissipation nor sugar accumulation counteracted the negative effect of drought. Two hybrids with contrasting cold tolerance, selected from an early-sowing field trial, were subjected to chilling temperatures under controlled growth conditions to evaluate in depth their physiological and metabolic cold adaptation mechanisms. The hybrid that performed poorly under field conditions (ICSSH31) showed earlier metabolic changes (Chl a + b, xanthophyll cycle) and greater inhibition of enzymatic activity (Rubisco and PEPcase activity) than the cold-tolerant hybrid (Bulldozer).
Important insights into the potential adaptability of sweet sorghum to temperate climates are given.
Abstract:
Nowadays, in developed countries, excessive food intake, in conjunction with decreased physical activity, has led to an increase in lifestyle-related diseases, such as obesity, cardiovascular diseases, type-2 diabetes, a range of cancer types and arthritis. The socio-economic importance of such lifestyle-related diseases has encouraged countries to increase their research efforts, and many projects focusing on the relationship between food and health have been initiated recently. Thanks to these efforts and to the growing availability of technologies, food companies are beginning to develop healthier foods. The need for rapid and affordable methods to help the food industry in ingredient selection has stimulated the development of in vitro systems that simulate the physiological processes to which food components are subjected when administered in vivo. One of the most promising tools now available is in vitro digestion, which aims at predicting, comparatively among analogous food products, the bioaccessibility of the nutrients of interest. The foodomics approach was chosen in this work to evaluate the modifications occurring during the in vitro digestion of selected protein-rich food products. Protein breakdown was measured via NMR spectroscopy, the only technique capable of observing, directly in the simulated gastric and duodenal fluids, the soluble oligo- and polypeptides released during the in vitro digestion process. The overall approach pioneered in this PhD work has been discussed and promoted in a large scientific community of specialists networked under the INFOGEST COST Action, which recently released a harmonized protocol for in vitro digestion.
NMR spectroscopy, when used in tandem with in vitro digestion, generates a new concept that provides an additional attribute to describe food quality: comparative digestibility, which measures the improvement in nutrient bioaccessibility.