20 results for ingredients
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
This PhD thesis describes the set-up of technological models for obtaining foods and ingredients of high health value that preserve the characteristics of the final product while enriching it with nutritional components. In particular, the main object of my research has been Virgin Olive Oil (VOO) and its important antioxidant compounds, which differentiate it from all other vegetable oils. It is well known that the qualitative and quantitative presence of phenolic molecules extracted from olives during oil production is fundamental for its oxidative and nutritional quality. For this purpose, the agronomic and technological conditions of its production have been investigated. It has also been examined how this fraction can best be preserved during storage. Moreover, its relation to VOO sensorial characteristics and its interaction with a protein in emulsion foods have also been studied. Finally, an experimental work was carried out to determine the antioxidative and heat-resistance properties of a new antioxidant (EVS-OL) when used for high-temperature frying, such as is typically employed for the preparation of french fries. The results of this research have been submitted for publication, and some data have already been published in national and international scientific journals.
Abstract:
Galaxy clusters occupy a special position in the cosmic hierarchy as they are the largest bound structures in the Universe. There is now general agreement on a hierarchical picture for the formation of cosmic structures, in which galaxy clusters are supposed to form by accretion of matter and merging between smaller units. During merger events, shocks are driven by the gravity of the dark matter in the diffuse baryonic component, which is heated up to the observed temperature. Radio and hard X-ray observations have discovered non-thermal components mixed with the thermal Intra Cluster Medium (ICM), and this is of great importance as it calls for a "revision" of the physics of the ICM. The bulk of present information comes from radio observations, which have discovered an increasing number of Mpc-sized emissions from the ICM: Radio Halos (at the cluster centre) and Radio Relics (at the cluster periphery). These sources are due to synchrotron emission from ultra-relativistic electrons diffusing through µG turbulent magnetic fields. Radio Halos are the most spectacular evidence of non-thermal components in the ICM, and understanding the origin and evolution of these sources represents one of the most challenging goals of the theory of the ICM. Cluster mergers are the most energetic events in the Universe, and a fraction of the energy dissipated during these mergers could be channelled into the amplification of the magnetic fields and into the acceleration of high-energy particles via the shocks and turbulence driven by these mergers. Present observations of Radio Halos (and possibly of hard X-rays) can be best interpreted in terms of the re-acceleration scenario, in which MHD turbulence injected during these cluster mergers re-accelerates high-energy particles in the ICM.
The physics involved in this scenario is very complex and model details are difficult to test; however, the model clearly predicts some simple properties of Radio Halos (and of the resulting IC emission in the hard X-ray band) which are almost independent of the details of the adopted physics. In particular, in the re-acceleration scenario MHD turbulence is injected and dissipated during cluster mergers, and thus Radio Halos (and also the resulting hard X-ray IC emission) should be transient phenomena (with a typical lifetime ≲ 1 Gyr) associated with dynamically disturbed clusters. The physics of the re-acceleration scenario should produce an unavoidable cut-off in the spectrum of the re-accelerated electrons, due to the balance between turbulent acceleration and radiative losses. The energy at which this cut-off occurs, and thus the maximum frequency at which synchrotron radiation is produced, depends essentially on the efficiency of the acceleration mechanism, so that observations at high frequencies are expected to catch only the most efficient phenomena, while, in principle, low-frequency radio surveys may find these phenomena to be much more common in the Universe. These basic properties should leave an important imprint on the statistical properties of Radio Halos (and of non-thermal phenomena in general) which, however, has not yet been addressed by present models. The main focus of this PhD thesis is to calculate, for the first time, the expected statistics of Radio Halos in the context of the re-acceleration scenario. In particular, we shall address the following main questions:
• Is it possible to model "self-consistently" the evolution of these sources together with that of the parent clusters?
• How is the occurrence of Radio Halos expected to change with cluster mass and to evolve with redshift? How does the efficiency of catching Radio Halos in galaxy clusters change with the observing radio frequency?
• How many Radio Halos are expected to form in the Universe? At which redshift is the bulk of these sources expected?
• Is it possible to reproduce, in the re-acceleration scenario, the observed occurrence and number of Radio Halos in the Universe and the observed correlations between the thermal and non-thermal properties of galaxy clusters?
• Is it possible to constrain the magnetic field intensity and profile in galaxy clusters, and the energetics of turbulence in the ICM, from the comparison between model expectations and observations?
Several astrophysical ingredients are necessary to model the evolution and statistical properties of Radio Halos in the context of the re-acceleration model and to address the points given above. For this reason we devote some space in this PhD thesis to reviewing the aspects of the physics of the ICM which are relevant to our goals. In Chapt. 1 we discuss the physics of galaxy clusters and, in particular, the cluster formation process; in Chapt. 2 we review the main observational properties of non-thermal components in the ICM; and in Chapt. 3 we focus on the physics of magnetic fields and of particle acceleration in galaxy clusters. As a relevant application, the theory of Alfvénic particle acceleration is applied in Chapt. 4, where we report the most important results from calculations we have carried out in the framework of the re-acceleration scenario. In this Chapter we show that a fraction of the energy of the fluid turbulence driven in the ICM by cluster mergers can be channelled into the injection of Alfvén waves at small scales, and that these waves can efficiently re-accelerate particles and trigger Radio Halos and hard X-ray emission. The main part of this PhD work, the calculation of the statistical properties of Radio Halos and non-thermal phenomena as expected in the context of the re-acceleration model and their comparison with observations, is presented in Chapts. 5, 6, 7 and 8.
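The cut-off argument summarised above can be written compactly. The following is a common schematic form (symbols as assumed here, not necessarily the thesis's exact notation): the electron Lorentz factor γ evolves under systematic turbulent acceleration at a rate χ and radiative losses with coefficient β, set by the synchrotron and inverse Compton channels,

```latex
\frac{d\gamma}{dt} \simeq \chi\,\gamma \;-\; \beta\,\gamma^{2},
\qquad
\beta \propto B^{2} + B_{\rm cmb}^{2},
\quad
B_{\rm cmb} \simeq 3.2\,(1+z)^{2}\ \mu\mathrm{G},
```

where B_cmb is the magnetic field equivalent to the CMB photon energy density. The stationary balance then fixes the spectral cut-off and the corresponding maximum synchrotron frequency,

```latex
\gamma_{\max} \simeq \frac{\chi}{\beta},
\qquad
\nu_{c} \propto \gamma_{\max}^{2}\, B ,
```

so a more efficient acceleration (larger χ) pushes ν_c upwards, which is why high-frequency surveys select only the most efficient merger events.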
In Chapt. 5 we present a first approach to semi-analytical calculations of the statistical properties of giant Radio Halos. The main goal of this Chapter is to model cluster formation, the injection of turbulence in the ICM and the resulting particle acceleration process. We adopt the semi-analytic extended Press & Schechter (PS) theory to follow the formation of a large synthetic population of galaxy clusters, and assume that, during a merger, a fraction of the PdV work done by the infalling subcluster in passing through the most massive one is injected in the form of magnetosonic waves. The processes of stochastic acceleration of relativistic electrons by these waves, and the properties of the ensuing synchrotron (Radio Halo) and inverse Compton (IC, hard X-ray) emission of merging clusters, are then computed under the assumption of a constant rms magnetic field strength averaged over the emitting volume. The main finding of these calculations is that giant Radio Halos are naturally expected only in the most massive clusters, and that the expected fraction of clusters with Radio Halos is consistent with the observed one. In Chapt. 6 we extend the previous calculations by including a scaling of the magnetic field strength with cluster mass. The inclusion of this scaling allows us to derive the expected correlations between the synchrotron radio power of Radio Halos and the X-ray properties (T, LX) and mass of the hosting clusters. For the first time, we show that these correlations, calculated in the context of the re-acceleration model, are consistent with the observed ones for typical µG strengths of the average B intensity in massive clusters. The calculations presented in this Chapter also allow us to derive the evolution of the probability of forming Radio Halos as a function of cluster mass and redshift.
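The extended PS machinery itself is beyond the scope of an abstract, but the shape of the underlying Press & Schechter mass function is easy to sketch. The snippet below is an illustrative toy: the power-law form of σ(M), the mass scale M*, the slope and the units are assumptions for illustration, not the thesis's calibration.

```python
import math

DELTA_C = 1.686  # critical linear overdensity for collapse (Einstein-de Sitter value)

def sigma_toy(M, M_star=1e15, alpha=0.27):
    """Toy power-law rms mass fluctuation sigma(M) (assumed, for illustration)."""
    return (M / M_star) ** (-alpha)

def ps_dndM(M, rho_bar=1.0, alpha=0.27):
    """Press & Schechter comoving mass function dn/dM, in arbitrary units.

    dn/dM = sqrt(2/pi) * (rho_bar / M^2) * nu * |dln sigma / dln M| * exp(-nu^2 / 2),
    with nu = delta_c / sigma(M).  For the toy sigma above, |dln sigma/dln M| = alpha.
    """
    nu = DELTA_C / sigma_toy(M, alpha=alpha)
    return (math.sqrt(2.0 / math.pi) * rho_bar / M**2
            * nu * alpha * math.exp(-nu**2 / 2.0))
```

The exponential tail is the point of interest here: clusters above ~10^15 solar masses are exponentially rare, consistent with the chapter's finding that giant Radio Halos occur only in the most massive (rare, merging) systems.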
The most relevant finding presented in this Chapter is that the luminosity function of giant Radio Halos at 1.4 GHz is expected to peak around a radio power of ~10^24 W/Hz and to flatten (or cut off) at lower radio powers because of the decrease of the electron re-acceleration efficiency in smaller galaxy clusters. In Chapt. 6 we also derive the expected number counts of Radio Halos and compare them with available observations: we claim that ~100 Radio Halos in the Universe could be observed at 1.4 GHz with deep surveys, while more than 1000 Radio Halos are expected to be discovered in the near future by LOFAR at 150 MHz. This is the first (and so far unique) model expectation for the number counts of Radio Halos at lower frequencies, and it makes it possible to design future radio surveys. Based on the results of Chapt. 6, in Chapt. 7 we present work in progress on a "revision" of the occurrence of Radio Halos. We combine past results from the NVSS radio survey (z ≈ 0.05-0.2) with our ongoing GMRT Radio Halo Pointed Observations of 50 X-ray luminous galaxy clusters (at z ≈ 0.2-0.4) and discuss the possibility of testing our model expectations with the number counts of Radio Halos at z ≈ 0.05-0.4. The most relevant limitation of the calculations presented in Chapts. 5 and 6 is the assumption of an "average" size of Radio Halos, independent of their radio luminosity and of the mass of the parent clusters. This assumption cannot be relaxed in the context of the PS formalism used to describe the cluster formation process, while a more detailed analysis of the physics of cluster mergers and of the injection of turbulence in the ICM would require an approach based on numerical (possibly MHD) simulations of a very large volume of the Universe, which is well beyond the aim of this PhD thesis.
On the other hand, in Chapt. 8 we report our discovery of novel correlations between the size (RH) of Radio Halos and their radio power, and between RH and the cluster mass within the Radio Halo region, MH. In particular, this latter "geometrical" MH-RH correlation allows us to "observationally" overcome the limitation of the "average" size of Radio Halos. Thus in this Chapter, by making use of this "geometrical" correlation and of a simplified form of the re-acceleration model based on the results of Chapts. 5 and 6, we are able to discuss the expected correlations between the synchrotron power and the thermal cluster quantities relative to the radio-emitting region. This is a powerful new tool of investigation, and we show that all the observed correlations (PR-RH, PR-MH, PR-T, PR-LX, ...) now become well understood in the context of the re-acceleration model. In addition, we find that observationally the size of Radio Halos scales non-linearly with the virial radius of the parent cluster; this immediately means that the fraction of the cluster volume which is radio emitting increases with cluster mass, and thus that the non-thermal component in clusters is not self-similar.
Abstract:
Phenol and cresols are a good example of primary chemical building blocks, of which 2.8 million tons are currently produced in Europe each year. At present, these primary phenolic building blocks are produced by refining processes from fossil hydrocarbons: 5% of the world-wide production comes from coal (which contains 0.2% phenols) through the distillation of the tar residue after the production of coke, while 95% of the current world production of phenol comes from the distillation and cracking of crude oil. In nature, phenolic compounds are present in terrestrial higher plants and ferns in several different chemical structures, while they are essentially absent in lower organisms and in animals. Biomass (which contains 3-8% phenols) represents a substantial source of secondary chemical building blocks that is presently underexploited. These phenolic derivatives are currently used in tens of thousands of tons to produce high-cost products such as food additives and flavours (e.g. vanillin), fine chemicals (e.g. non-steroidal anti-inflammatory drugs such as ibuprofen or flurbiprofen) and polymers (e.g. poly p-vinylphenol, a photosensitive polymer for electronic and optoelectronic applications). European agrifood waste represents a low-cost, abundant raw material (250 million tons per year) which does not subtract land use and processing resources from necessary sustainable food production. The class of phenolic compounds essentially comprises simple phenols, phenolic acids, hydroxycinnamic acid derivatives, flavonoids and lignans. As in the case of coke production, the removal of the phenolic content from biomass also upgrades the residual biomass. Focusing on the phenolic component of agrifood wastes, huge processing and marketing opportunities open up, since phenols are used as chemical intermediates in a large number of applications, ranging from pharmaceuticals and agricultural chemicals to food ingredients.
Following this approach, we developed a biorefining process to recover the phenolic fraction of wheat bran, based on commercial enzymatic biocatalysts in a completely water-based process and on polymeric resins, with the aim of substituting secondary chemical building blocks with the same compounds naturally present in biomass. We characterized several industrial enzymatic products for their ability to hydrolyse the different molecular features present in wheat bran cell wall structures, focusing on the hydrolysis of polysaccharide chains and phenolic cross-links. These industrial biocatalysts were tested on wheat bran, and the optimized process allowed up to 60% of the treated matter to be liquefied. The enzymatic treatment was also able to solubilise up to 30% of the alkali-extractable ferulic acid. An extraction process for the phenolic fraction of the hydrolyzed wheat bran was developed, based on adsorption/desorption on the styrene-polyvinylbenzene weak anion-exchange resin Amberlite IRA 95. The efficiency of the resin was tested on different model systems containing ferulic acid, and the adsorption and desorption working parameters were optimized for the crude enzymatic hydrolyzed wheat bran. The extraction process developed had an overall yield of 82% and allowed concentrated extracts containing up to 3000 ppm of ferulic acid to be obtained. The crude enzymatic hydrolyzed wheat bran and the concentrated extract were finally used as substrates in a bioconversion process of ferulic acid into vanillin through resting-cell fermentation. The bioconversion process gave vanillin yields of 60-70% within 5-6 hours of fermentation. Our findings are the first step towards demonstrating the economic feasibility of the recovery of biophenols from agrifood wastes through a whole-crop approach in a sustainable biorefining process.
Abstract:
Traditional software engineering approaches and metaphors fall short when applied to areas of growing relevance such as electronic commerce, enterprise resource planning, and mobile computing: such areas, in fact, generally call for open architectures that may evolve dynamically over time so as to accommodate new components and meet new requirements. This is probably one of the main reasons why the agent metaphor and the agent-oriented paradigm are gaining momentum in these areas. This thesis deals with the engineering of complex software systems in terms of the agent paradigm. This paradigm is based on the notions of agent and of systems of interacting agents as fundamental abstractions for designing, developing and managing at runtime typically distributed software systems. However, today the engineer often works with technologies that do not support the abstractions used in the design of the systems. For this reason, research on methodologies becomes a central point of the scientific activity in this field. Currently most agent-oriented methodologies are supported by small teams of academic researchers; as a result, most of them are at an early stage, still in the context of mostly "academic" approaches to agent-oriented systems development. Moreover, such methodologies are not well documented, and are very often defined and presented by focusing only on specific aspects of the methodology. The role played by meta-models thus becomes fundamental for comparing and evaluating methodologies. In fact, a meta-model specifies the concepts, rules and relationships used to define methodologies. Although it is possible to describe a methodology without an explicit meta-model, formalising the underpinning ideas of the methodology in question is valuable when checking its consistency or planning extensions or modifications. A good meta-model must address all the different aspects of a methodology, i.e.
the process to be followed, the work products to be generated and those responsible for making all this happen. In turn, specifying the work products that must be developed implies defining the basic modelling building blocks from which they are built. As a building block, the agent abstraction alone is not enough to fully model all the aspects related to multi-agent systems in a natural way. In particular, different perspectives exist on the role that the environment plays within agent systems: however, it is at least clear that all non-agent elements of a multi-agent system are typically considered to be part of the multi-agent system environment. The key role of the environment as a first-class abstraction in the engineering of multi-agent systems is today generally acknowledged in the multi-agent system community, so the environment should be explicitly accounted for in the engineering of multi-agent systems, working as a new design dimension for agent-oriented methodologies. At least two main ingredients shape the environment: environment abstractions (entities of the environment encapsulating some functions) and topology abstractions (entities of the environment that represent its spatial structure, either logical or physical). In addition, the engineering of non-trivial multi-agent systems requires principles and mechanisms for supporting the management of the complexity of the system representation. These principles lead to the adoption of a multi-layered description, which can be used by designers to provide different levels of abstraction over multi-agent systems. The research in these fields has led to the formulation of a new version of the SODA methodology, in which environment abstractions and layering principles are exploited for engineering multi-agent systems.
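As a loose illustration of the two ingredients named above (a hypothetical minimal sketch, not SODA's actual notation or API), environment and topology abstractions can be made first-class design elements alongside agents:

```python
class EnvironmentAbstraction:
    """An environment entity encapsulating a function: here, a shared tuple space."""
    def __init__(self):
        self._tuples = []
    def out(self, t):
        """Publish a tuple into the space."""
        self._tuples.append(t)
    def rd(self, pred):
        """Read (without removing) the first tuple matching a predicate, or None."""
        return next((t for t in self._tuples if pred(t)), None)

class TopologyAbstraction:
    """A logical spatial structure: an undirected graph of places."""
    def __init__(self, edges):
        self.neighbours = {}
        for a, b in edges:
            self.neighbours.setdefault(a, set()).add(b)
            self.neighbours.setdefault(b, set()).add(a)

class Agent:
    """An agent perceives and acts only through environment abstractions."""
    def __init__(self, name, place):
        self.name, self.place = name, place
    def move(self, topology, dest):
        """Movement is constrained by the topology abstraction, not by the agent."""
        if dest in topology.neighbours.get(self.place, ()):
            self.place = dest
            return True
        return False
```

The design point is that the tuple space and the place graph are modelled explicitly, as the paragraph above argues, rather than being buried inside agent code.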
Abstract:
CHEMICAL COMMUNICATION IN BLATTARIA: CONTRIBUTION TO THE IMPROVEMENT OF CONTROL TECHNIQUES
The management of cockroach infestations in urban environments has undergone some changes in recent years, moving towards the predominant use of baits thanks to the awareness of the risks connected with the use of spray insecticides. The effectiveness of a bait is determined by the collective performance of its components, including active and inert ingredients, the food attractant and any other attractive odour. This research has focused on the behavioural responses of Italian synanthropic cockroaches to semiochemicals and food attractants, with the purpose of evaluating possible practical applications and of contributing to the improvement of control techniques. Behavioural and morphological studies have been carried out on Supella longipalpa (F.), a small cockroach that is spreading in Italy. Behavioural assays showed that the fourth and fifth tergites of females are significantly more attractive than other regions of the body. Cuticular pores and ducts ending in glandular structures (observed with a Scanning Electron Microscope, S.E.M.) are present in large numbers on these tergites, suggesting that they could be involved in the production and release of the sex pheromone. Cockroaches produce an aggregation pheromone that is excreted along with their frass and that consists of volatile and non-volatile compounds, mainly amines and steroidal glycosides. The effectiveness of faecal extracts obtained from the excrement of Blattella germanica (L.), Blatta orientalis L., Periplaneta americana (L.) and S. longipalpa was evaluated, at both the intraspecific and the interspecific level, using a Y-tube olfactometer. Bioassays showed that faecal extracts obtained with methanol have good intraspecific attractiveness and, in some cases, also elicited interspecific behavioural responses.
A gel was prepared with physical characteristics giving good resistance to dehydration, as a potential basis for a new bait; faecal extracts, obtained with methanol from B. germanica and S. longipalpa frass, were then added to the gel. Arena tests showed that the new gel containing faecal extracts is more attractive than some commercial gel formulations used for comparison: it was the only product that could attract 100% of the insects placed in the arenas within 4-5 days. In conclusion, the substances involved in the chemical communication of Blattaria may be able to effectively increase the attractiveness of products for monitoring and controlling cockroaches.
Abstract:
Mixed integer programming is today one of the most widely used techniques for dealing with hard optimization problems. On the one hand, many practical optimization problems arising from real-world applications (such as, e.g., scheduling, project planning, transportation, telecommunications, economics and finance, timetabling, etc.) can be easily and effectively formulated as Mixed Integer linear Programs (MIPs). On the other hand, 50 and more years of intensive research have dramatically improved the capability of the current generation of MIP solvers to tackle hard problems in practice. However, many questions are still open and not fully understood, and the mixed integer programming community is more than active in trying to answer some of them. As a consequence, a huge number of papers are continuously being produced, and new intriguing questions arise every year. When dealing with MIPs, we have to distinguish between two different scenarios. The first arises when we are asked to handle a general MIP and cannot assume any special structure for the given problem. In this case, a Linear Programming (LP) relaxation and some integrality requirements are all we have for tackling the problem, and we are "forced" to use general-purpose techniques. The second arises when mixed integer programming is used to address a somehow structured problem. In this context, polyhedral analysis and other theoretical and practical considerations are typically exploited to devise special-purpose techniques. This thesis tries to give some insights into both of the above-mentioned situations. The first part of the work is focused on general-purpose cutting planes, which are probably the key ingredient behind the success of the current generation of MIP solvers.
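As a flavour of such general-purpose cutting planes, the sketch below derives a Gomory mixed integer (GMI) cut from a single simplex tableau row. The function name and data layout are illustrative assumptions; the coefficient formulas, however, are the standard GMI ones.

```python
import math

def gmi_cut(row_coeffs, rhs, is_integer):
    """Gomory mixed integer cut from one simplex tableau row.

    Row: x_B + sum_j a_j * x_j = b, with nonbasic x_j >= 0 and the basic
    variable x_B required integer but taking a fractional value b.
    Returns (cut_coeffs, f0) for the valid inequality
        sum_j c_j * x_j >= f0,   where f0 = b - floor(b).
    """
    f0 = rhs - math.floor(rhs)
    assert 0.0 < f0 < 1.0, "the basic variable must take a fractional value"
    cut = []
    for a, integer in zip(row_coeffs, is_integer):
        if integer:
            f = a - math.floor(a)
            # integer nonbasics: keep f if small, otherwise complement
            cut.append(f if f <= f0 else f0 * (1.0 - f) / (1.0 - f0))
        else:
            # continuous nonbasics: positive part as-is, negative part scaled
            cut.append(a if a >= 0.0 else -a * f0 / (1.0 - f0))
    return cut, f0
```

For example, the row x_B + 0.3 x_1 + 0.8 x_2 - 0.4 x_3 = 0.5 (x_1, x_2 integer, x_3 continuous) yields the cut 0.3 x_1 + 0.2 x_2 + 0.4 x_3 >= 0.5, which cuts off the current LP vertex where all nonbasics are zero.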
Chapter 1 presents a quick overview of the main ingredients of a branch-and-cut algorithm, while Chapter 2 recalls some results from the literature on disjunctive cuts and their connections with Gomory mixed integer cuts. Chapter 3 presents a theoretical and computational investigation of disjunctive cuts. In particular, we analyze the connections between different normalization conditions (i.e., conditions to truncate the cone associated with disjunctive cutting planes) and other crucial aspects such as cut rank, cut density and cut strength. We give a theoretical characterization of the weak rays of the disjunctive cone that lead to dominated cuts, and propose a practical method to strengthen the cuts arising from such weak extremal solutions. Further, we point out how redundant constraints can affect the quality of the generated disjunctive cuts, and discuss possible ways to cope with them. Finally, Chapter 4 presents some preliminary ideas in the context of multiple-row cuts. Very recently, a series of papers have drawn attention to the possibility of generating cuts using more than one row of the simplex tableau at a time. Several interesting theoretical results have been presented in this direction, often revisiting and recalling important results discovered more than 40 years ago. However, it is not at all clear how these results can be exploited in practice. As stated, the chapter is still a work in progress: it presents a possible way of generating two-row cuts from the simplex tableau, arising from lattice-free triangles, together with some preliminary computational results. The second part of the thesis is instead focused on the heuristic and exact exploitation of integer programming techniques for hard combinatorial optimization problems in the context of routing applications. Chapters 5 and 6 present an integer linear programming local search algorithm for Vehicle Routing Problems (VRPs).
The overall procedure follows a general destroy-and-repair paradigm (i.e., the current solution is first randomly destroyed and then repaired in an attempt to find a new improved solution) in which a class of exponential neighbourhoods is iteratively explored by heuristically solving an integer programming formulation through a general-purpose MIP solver. Chapters 7 and 8 deal with exact branch-and-cut methods. Chapter 7 presents an extended formulation for the Traveling Salesman Problem with Time Windows (TSPTW), a generalization of the well-known TSP in which each node must be visited within a given time window. The polyhedral approaches proposed for this problem in the literature typically follow the one which has proven extremely effective in the classical TSP context. Here we present a (quite) general idea based on a relaxed discretization of the time windows. This idea leads to a stronger formulation and to stronger valid inequalities, which are then separated within the classical branch-and-cut framework. Finally, Chapter 8 addresses branch-and-cut in the context of Generalized Minimum Spanning Tree Problems (GMSTPs), a class of NP-hard generalizations of the classical minimum spanning tree problem. In this chapter, we show how some basic ideas (and, in particular, the usage of general-purpose cutting planes) can be used to improve on the branch-and-cut methods proposed in the literature.
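The destroy-and-repair control flow described above can be sketched generically. The toy below replaces the MIP-based repair with a greedy best-insertion step and uses a one-dimensional stand-in objective, purely to show the loop structure; all names, the objective and the parameters are illustrative assumptions, not the thesis's algorithm.

```python
import random

def cost(seq):
    """Toy objective: total variation of the sequence (a stand-in for route length)."""
    return sum(abs(b - a) for a, b in zip(seq, seq[1:]))

def destroy(seq, rng, k=3):
    """Destroy step: remove k randomly chosen elements from the solution."""
    idx = sorted(rng.sample(range(len(seq)), k), reverse=True)
    partial = list(seq)
    removed = [partial.pop(i) for i in idx]
    return partial, removed

def repair(partial, removed):
    """Repair step: greedy best-insertion (a cheap stand-in for the MIP repair)."""
    seq = list(partial)
    for v in removed:
        best_pos = min(range(len(seq) + 1),
                       key=lambda p: cost(seq[:p] + [v] + seq[p:]))
        seq.insert(best_pos, v)
    return seq

def destroy_and_repair(initial, iters=200, seed=0):
    """Iterate destroy/repair, keeping only improving solutions."""
    rng = random.Random(seed)
    best, best_cost = list(initial), cost(initial)
    for _ in range(iters):
        partial, removed = destroy(best, rng)
        candidate = repair(partial, removed)
        if cost(candidate) < best_cost:
            best, best_cost = candidate, cost(candidate)
    return best, best_cost
```

In the actual scheme described in the abstract, `repair` would instead hand the exponential neighbourhood of the destroyed solution to a general-purpose MIP solver as an integer programming formulation.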
Abstract:
Within the wide range of data that the subject of nutrition offers to historical observation, this investigation focuses on one of the functions that food serves in the social context: that of signifying cultural identity. In this context, we analyse the ways in which industrially produced pasta attained its status as one of the symbolic forms of twentieth-century Italian food, contributing to a sense of social identity that forms part of the process of nation-building that developed during the twentieth century. The nature of the relationship between pasta and Italian food is analysed over a period of almost a century (1886-1984) through a variety of different sources: government enquiries, cookery books, gastronomic guides and menus of official dinners. The assemblage of such documents in one study allows the investigation of certain themes across the wide range of gastronomical cultures active within the national borders. In this way, links are made between the production, adoption, reception and dissemination of the ingredients and Italian Unification. This method has made it possible to recover one possible form of historical knowledge of twentieth-century gastronomy and of the experiences by which it was influenced.
Abstract:
Bread dough, and particularly wheat dough, is probably the most dynamic and complicated rheological system owing to its viscoelastic behaviour, and its characteristics are very important since they strongly affect the textural and sensorial properties of the final products. The study of dough rheology has been a very challenging task for many researchers, since it can provide a wealth of information about dough formulation, structure and processing. This explains why dough rheology has been a matter of investigation for several decades. In this research, the rheological assessment of doughs and breads was performed using empirical and fundamental methods at both small and large deformation, in order to characterize different types of doughs and final products such as bread. In order to study the structural aspects of the food products, image analysis techniques were used to integrate the information coming from the empirical and fundamental rheological measurements. The evaluation of dough properties was carried out by texture profile analysis (TPA), dough stickiness (Chen and Hoseney cell) and uniaxial extensibility determination (Kieffer test) using a Texture Analyser; small-deformation rheological measurements were performed on a controlled stress-strain rheometer; moreover, the structure of the different doughs was observed using image analysis, while bread characteristics were studied using texture profile analysis (TPA) and image analysis. The objective of this research was to understand whether the different rheological measurements were able to characterize and differentiate the samples analysed, in order to investigate the effect of different formulations and processing conditions on dough and final product from a structural point of view.
To this end, the following materials were prepared and analysed:
- frozen dough made without yeast;
- frozen dough, and bread made from frozen dough;
- doughs obtained using different fermentation methods;
- doughs made from Kamut® flour;
- dough and bread made with the addition of ginger powder;
- final products coming from different bakeries.
The influence of sub-zero storage time on the viscoelastic performance of non-fermented and fermented doughs and on the final product (bread) was evaluated using small-deformation and large-deformation methods. In general, the longer the sub-zero storage time, the lower the positive viscoelastic attributes. The effects of fermentation time and of different types of fermentation (straight-dough method, sponge-and-dough procedure and poolish method) on the rheological properties of doughs were investigated using empirical and fundamental analysis, and image analysis was used to integrate this information through the evaluation of the dough's structure. The results of the fundamental rheological tests showed that the incorporation of sourdough (poolish method) provoked changes that were different from those seen with the other types of fermentation. The beneficial effect of some ingredients (extra-virgin olive oil and a liposomic lecithin emulsifier) in improving the rheological characteristics of Kamut® dough was confirmed even when the dough was subjected to low temperatures (24 hours and 48 hours at 4°C). Small-deformation oscillatory measurements and large-deformation mechanical tests provided useful information on the rheological properties of samples made using different amounts of ginger powder, showing that the sample with the highest amount of ginger powder (6%) had worse rheological characteristics than the other samples. Moisture content, specific volume, texture and crumb grain characteristics are the major quality attributes of bread products.
The different samples analysed, “Coppia Ferrarese”, “Pane Comune Romagnolo” and “Filone Terra di San Marino”, showed a decrease in crumb moisture and an increase in hardness over storage time. Parameters such as cohesiveness and springiness, evaluated by TPA, which are indicators of fresh bread quality, decreased during storage. Using empirical rheological tests, we found several differences among the samples, due to the different ingredients used in the formulations and the different processes adopted to prepare them; but since these products are handmade, the differences can be regarded as an added value. In conclusion, small deformation (in fundamental units) and large deformation methods played a significant role in monitoring the influence of different ingredients, processing and storage conditions on dough viscoelastic performance and on the final product. Finally, knowledge of formulation, processing and storage conditions, together with the evaluation of structural and rheological characteristics, is fundamental for the study of complex matrices like bakery products, where numerous variables can influence final quality (e.g. raw materials, bread-making procedure, time and temperature of fermentation and baking).
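For reference, the TPA parameters named above (hardness, cohesiveness, springiness) are conventionally derived from a double-compression force–time curve. The following minimal Python sketch illustrates the standard textbook definitions; the function name and the toy data are hypothetical and are not taken from the thesis.

```python
# Illustrative sketch of standard texture profile analysis (TPA) parameters
# computed from a double-compression ("two-bite") force-time curve.
# Definitions follow the common conventions: hardness = peak force of the
# first compression; cohesiveness = area ratio of second to first compression;
# springiness = duration ratio of second to first compression.

def tpa_parameters(force1, force2, time1, time2):
    """force1/force2: force samples of the 1st and 2nd compression cycles;
    time1/time2: matching time stamps (uniform spacing assumed)."""
    dt1 = time1[1] - time1[0]
    dt2 = time2[1] - time2[0]
    area1 = sum(force1) * dt1              # work done in first compression
    area2 = sum(force2) * dt2              # work done in second compression
    hardness = max(force1)                 # peak force, first compression
    cohesiveness = area2 / area1           # resistance to a second deformation
    springiness = (time2[-1] - time2[0]) / (time1[-1] - time1[0])
    return hardness, cohesiveness, springiness

# Toy curves: the second bite is weaker than the first, as in a real sample.
t = [i * 0.1 for i in range(11)]
f1 = [0, 2, 5, 8, 10, 9, 7, 5, 3, 1, 0]
f2 = [0, 1, 3, 5, 7, 6, 5, 3, 2, 1, 0]
hardness, cohesiveness, springiness = tpa_parameters(f1, f2, t, t)
print(hardness, round(cohesiveness, 2), springiness)
```

A staling crumb, as described for the three breads above, would show hardness rising and cohesiveness/springiness falling over storage time.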
Resumo:
Gnocchi is a typical Italian potato-based fresh pasta that can be either homemade or industrially manufactured. The homemade traditional product is consumed fresh on the day it is produced, whereas the industrially manufactured one is vacuum-packed in polyethylene and usually stored under refrigerated conditions. At the industrial level, most kinds of gnocchi are produced using potato derivatives (i.e. flakes, dehydrated products or flour) to which soft wheat flour, salt, emulsifiers and aromas are added. Recently, a novel type of gnocchi has emerged on the Italian pasta market, intended to be as similar as possible to the traditional homemade product. It is industrially produced from fresh potatoes as the main ingredient, together with soft wheat flour, pasteurized liquid eggs and salt, and undergoes industrial steam-cooking and mashing treatments. Neither preservatives nor emulsifiers are included in the recipe. The main aim of this work was to examine the industrial manufacture of gnocchi, in order to improve the quality characteristics of the final product, by studying the main steps of production, starting from the raw and steam-cooked tubers, through the semi-finished materials, such as the potato puree and the formulated dough. For this purpose, the enzymatic activity of the raw and steam-cooked potatoes, the main characteristics of the puree (colour, texture and starch), the interaction among the ingredients of differently formulated doughs and the basic quality aspects of the final product were investigated. The results obtained in this work indicated that steam cooking influenced the analysed enzymes (pectin methylesterase and α- and β-amylases) differently in the different tissues of the tuber. PME remained active in the cortex; it may therefore affect the texture of cooked potatoes used as the main ingredient in the production of gnocchi.
The starch-degrading enzymes (α- and β-amylases) were inactivated both in the cortex and in the pith of the tuber. The study performed on the potato puree showed that, of the two samples analysed, the product subjected to a dual lower-pressure treatment seemed the most suitable for the production of gnocchi, in terms of its better physicochemical and textural properties; it showed no aggregation phenomena responsible for the hard lumps that may occur in this kind of semi-finished product. The textural properties of the gnocchi doughs were not influenced by the different formulations as expected. Among the ingredients involved in the preparation of the different samples, soft wheat flour seemed the most crucial in affecting the quality features of the gnocchi doughs. As a consequence of the interactive effect of the ingredients on the physicochemical and textural characteristics of the different doughs, a uniform and well-defined separation among the samples was not obtained. In the comparison of the different kinds of gnocchi, the best physicochemical and textural properties were detected in the sample made with fresh tubers. This was probably due not only to the use of fresh steam-cooked potatoes, but also to the pasteurized liquid eggs and to the absence of any emulsifier, additive or preservative.
Resumo:
The role of the human gut microbiota in the host’s health has been widely studied in the last decade. Notably, it has recently been demonstrated that diet and nutritional status are among the most important modifiable determinants of human health, through a plethora of presumptive mechanisms, among which microbiota-mediated processes are thought to play a relevant role. At present, probiotics and prebiotics represent a useful dietary approach for influencing the composition and activity of the human gut microbial community. The present study is composed of two main sections, aimed at elucidating the probiotic potential of the yeast strain K. marxianus B0399 and the promising putative prebiotic activity of four different flours naturally enriched in dietary fibre content. By in vitro studies, we demonstrated that K. marxianus B0399 possesses a number of beneficial and strain-specific properties desirable for a microorganism considered for application as a probiotic. Subsequently, we investigated the impact of a novel probiotic yoghurt containing B. animalis subsp. lactis Bb12 and K. marxianus B0399 on the gut microbiota of a cohort of subjects suffering from IBS and enrolled in an in vivo clinical study. We demonstrated that the beneficial effects described for the probiotic yoghurt were not associated with significant modifications of the human intestinal microbiota. Additionally, using a colonic model system, we investigated the impact of different flours (wholegrain rye and wheat, chickpeas and lentils 50:50, and barley milled grains) on the intestinal microbiota composition and metabolomic output, combining molecular and cellular analyses with an NMR metabolomics approach.
We demonstrated that each tested flour induced distinctive and positive modulations of the intestinal microbiota composition and of its small-molecule metabolome, thus supporting the use of these ingredients in the development of a variety of potentially prebiotic food products aimed at improving human health.
Resumo:
The Mediterranean diet is rich in healthy substances such as fibres, vitamins and phenols, yet these molecules are often lost during food processing. Olive oil mill waste waters, brans and grape skins are among the most relevant agri-food by-products in the Mediterranean countries. These wastes are still rich in extremely valuable molecules, such as phenolic antioxidants, which have several interesting health-promoting properties. Using innovative, environmentally friendly technologies based on the rational use of enzymatic treatments, it is possible to obtain from agri-food by-products new antioxidant-containing ingredients that can be used as functional ingredients to produce fortified foods. Such foods, having health-protecting/promoting properties on top of the traditional nutritional ones, are attracting consumers’ attention due to the increasing awareness of health protection through prevention. The use of these new ingredients in different food preparations was studied in order to evaluate the effects that food processing might have on the antioxidant fraction, the effect of these ingredients on the appearance of the foods, and their impact in terms of taste and scent, crucial features for the acceptability of the final product. Using these new ingredients, it was possible to produce antioxidant-enriched bread, pasta, cheese, cookies and ice cream. These food products retained the antioxidant properties conferred by the added ingredients very well, despite the very different treatments performed. The foods obtained had good palatability and, in some cases, the final product was also successful on the market.
Resumo:
Tuta absoluta (Meyrick) is a moth native to South America that infests tomato and other cultivated and wild solanaceous plants. Through their feeding activity, the larvae cause leaf mines and galleries in the fruits, resulting in severe crop damage. T. absoluta was first reported in Italy in 2008 and in Piedmont in 2009. Research was therefore conducted to determine its distribution in Piedmont, to study its population dynamics under natural and controlled conditions, and to evaluate the effectiveness of different control means in order to define defence strategies. The monitoring, conducted in 2010, showed that T. absoluta was already widespread across the region only a few months after its first report. The insect showed a preference for milder climatic conditions; indeed, it was found more frequently in the warmer areas. The pest reached high population densities from the second half of summer onwards, further demonstrating that, in a temperate-climate region such as Piedmont, T. absoluta gives rise to economically relevant infestations only after the peak of the summer season. To define control strategies, laboratory, semi-field and field trials were carried out to assess the toxicity to the moth of formulations based on emamectin benzoate, rynaxypyr, spinosad and Bacillus thuringiensis Berliner. In the field, the efficacy of the commercially available dicyphine mirid Macrolophus pygmaeus (Rambur) was also tested. In all trials, rynaxypyr and emamectin benzoate proved the most effective. In the field, M. pygmaeus showed difficulty in establishing itself and was able to effectively contain the pest only at low infestation levels.
By contrast, the natural presence of another dicyphine mirid, Dicyphus errans (Wolff), was consistently observed; in the laboratory, this species proved not to be particularly affected by the substances tested.
Resumo:
The purpose of this PhD research was the identification of new farming and processing strategies aimed at improving the nutritional and technological characteristics of poultry meat. Part of the research focused on the evaluation of alternative farming systems, with the aim of increasing animal welfare and improving meat quality and sensorial characteristics in broiler chickens. The use of innovative ingredients for the marination of poultry meat (sodium bicarbonate and natural antioxidants) was also assessed. The research was developed by studying the following aspects: meat quality characteristics, oxidative stability and sensorial traits of chicken meat obtained from two different farming systems (free range vs conventional); meat quality traits of frozen chicken breast pre-salted with increasing concentrations of sodium chloride; use of sodium bicarbonate in comparison with sodium tripolyphosphate for the marination of broiler breast meat; and marination with a thyme and orange essential oil mixture to improve chicken meat quality traits, susceptibility to lipid oxidation and sensory traits. The following meat quality analyses were performed: colour, pH, water-holding capacity by conventional (gravimetric methods, pressure application, centrifugation and cooking) and innovative methods (low-field NMR and DSC analysis), ability to absorb marinade solutions, texture (shear force using different probes and texture profile analysis), proximate analysis (moisture, proteins, lipids, ash content, collagen, fatty acids), susceptibility to lipid oxidation (determination of thiobarbituric acid reactive substances and peroxide value), and sensorial analysis (triangle test and consumer test).
Resumo:
The overall objective of this PhD was to investigate the possibility of increasing the nutritional value of confectionery products through the use of natural ingredients with healthy functions. The first part of the thesis focused on the possible substitution of the most characteristic component of confectionery products, i.e. refined sugar. Many natural whole sweetening alternatives are available, though not widely used; the use of molasses, the by-product of sugar beet and cane production, still rich in healthy components such as minerals and phytochemicals, is discussed here. After verifying the effectiveness of molasses in counteracting oxidative stress in cultured liver cells, the higher antioxidant capacity of a sweet food prepared with molasses instead of refined sugar was confirmed. A second step of the project dealt with another main ingredient of various sweet products, namely wheat. In particular, the exploitation of soft and durum wheat by-products could be another sustainable strategy to improve the healthy value of confectionery. The isolation of oligosaccharides with bioactive functions from different fractions of the wheat milling stream was studied, and the new ingredients were shown to have a high content of dietary fibre and antioxidants. As a valid alternative, product developers should consider the appealing and healthy addition of ancient grain flours to sweet baked goods. The possibility of substituting modern whole durum wheat with the ancient Kamut® khorasan was considered, and the antioxidant and anti-inflammatory effects of these grains were evaluated and compared both in vitro and in vivo in rats. Finally, since high consumption of confectionery is a risk factor for obesity, a possible strategy for counteracting this disease was investigated by assessing the ability of three bioactive compounds to inhibit adipocyte differentiation.
In fact, compounds able to influence adipogenesis could theoretically be used in the formulation of functional sweet products and contribute to preventing obesity.
Resumo:
The aim of this thesis was to investigate some important key factors able to promote the prospected growth of the aquaculture sector. The limited availability of fishmeal and fish oil has prompted the aquafeed industry to reduce its dependency on marine raw materials in favour of vegetable ingredients. In Chapter 2, we reported the effects of fishmeal replacement by a mixture of plant proteins in turbot (Psetta maxima L.) juveniles. At the end of the trial, it was found that plant protein inclusion above 15% can cause stress and exert negative effects on growth performance and welfare. Climate change has drawn the attention of the aquafeed industry to the production of specific diets capable of counteracting high temperatures. In Chapter 3, we investigated the most suitable dietary lipid level for gilthead seabream (Sparus aurata L.) reared at Mediterranean summer temperatures. This trial highlighted that an 18% dietary lipid level allows a protein-sparing effect, thus making the farming of this species economically and environmentally more sustainable. The introduction of new farmed fish species makes the development of new species-specific diets necessary. In Chapter 4, we assessed the growth response and feed utilization of common sole (Solea solea L.) juveniles fed graded dietary lipid levels. At the end of the trial, it was found that increasing dietary lipids above 8% led to a substantial decline in growth performance and feed utilization indices. In Chapter 5, we investigated the suitability of mussel meal as an alternative ingredient in diets for common sole juveniles. Mussel meal proved to be a very effective alternative ingredient for enhancing growth performance, feed palatability and feed utilization in sole, irrespective of the tested inclusion levels. This thesis highlighted the importance of formulating more specific diets in order to support aquaculture growth in a sustainable way.