24 results for New approaches

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance:

100.00%

Publisher:

Abstract:

In the past decade, the advent of efficient genome sequencing tools and high-throughput experimental biotechnology has led to enormous progress in the life sciences. Among the most important innovations is microarray technology, which allows the expression of thousands of genes to be quantified simultaneously by measuring the hybridization of a tissue of interest to probes on a small glass or plastic slide. The characteristics of these data include a fair amount of random noise, a predictor dimension in the thousands, and a sample size in the dozens. One of the most exciting areas to which microarray technology has been applied is the challenge of deciphering complex diseases such as cancer. In these studies, samples are taken from two or more groups of individuals with heterogeneous phenotypes, pathologies, or clinical outcomes. These samples are hybridized to microarrays in an effort to find a small number of genes that are strongly correlated with the grouping of individuals. Even though today's methods for analysing the data are well developed and close to reaching a standard organization (through the efforts of international projects such as the Microarray Gene Expression Data (MGED) Society [1]), it is not infrequent to encounter a clinician's question for which no compelling statistical method is available. The contribution of this dissertation to deciphering disease is the development of new approaches aimed at handling open problems posed by clinicians in specific experimental designs. Chapter 1, starting from a necessary biological introduction, reviews microarray technologies and all the important steps involved in an experiment, from the production of the array through quality control to the preprocessing steps that will be used in the data analysis in the rest of the dissertation.
In Chapter 2, a critical review of standard analysis methods is provided, stressing their main open problems. In Chapter 3, a method is introduced to address the issue of unbalanced designs in microarray experiments. In microarray experiments, experimental design is a crucial starting point for obtaining reasonable results. In a two-class problem, an equal or similar number of samples should be collected for the two classes. However, in some cases, e.g. rare pathologies, the approach to be taken is less evident. We propose to address this issue by applying a modified version of SAM [2]. MultiSAM consists of a reiterated application of a SAM analysis, comparing the less populated class (LPC) with 1,000 random samplings of the same size from the more populated class (MPC). A list of the differentially expressed genes is generated for each SAM application. After 1,000 reiterations, each probe is given a "score" ranging from 0 to 1,000 based on its recurrence as differentially expressed in the 1,000 lists. The performance of MultiSAM was compared to that of SAM and LIMMA [3] over two simulated data sets generated via beta and exponential distributions. The results of all three algorithms over low-noise data sets seem acceptable. However, on a real unbalanced two-channel data set regarding Chronic Lymphocytic Leukemia, LIMMA finds no significant probe, SAM finds 23 significantly changed probes but cannot separate the two classes, while MultiSAM finds 122 probes with score >300 and separates the data into two clusters by hierarchical clustering. We also report extra-assay validation in terms of differentially expressed genes. Although standard algorithms perform well over low-noise simulated data sets, MultiSAM seems to be the only one able to reveal subtle differences in gene expression profiles on real unbalanced data. In Chapter 4, a method to address similarity evaluation in a three-class problem by means of the Relevance Vector Machine [4] is described.
In fact, looking at microarray data in a prognostic and diagnostic clinical framework, not only differences can play a crucial role. In some cases similarities can give useful and, sometimes, even more important information. The goal, given three classes, could be to establish, with a certain level of confidence, whether the third one is similar to the first or to the second. In this work we show that the Relevance Vector Machine (RVM) [4] could be a possible solution to the limitations of standard supervised classification. In fact, RVM offers many advantages compared, for example, with its well-known precursor, the Support Vector Machine (SVM). Among these advantages, the estimate of the posterior probability of class membership represents a key feature for addressing the similarity issue. This is a highly important, but often overlooked, feature of any practical pattern recognition system. We focused on a three-class tumor-grade problem, with 67 samples of grade 1 (G1), 54 samples of grade 3 (G3) and 100 samples of grade 2 (G2). The goal is to find a model able to separate G1 from G3, and then to evaluate the third class G2 as a test set, obtaining for each G2 sample the probability of belonging to class G1 or class G3. The analysis showed that breast cancer samples of grade 2 have a molecular profile more similar to breast cancer samples of grade 1. In the literature this result had been conjectured, but no measure of significance had been given before.

Relevance:

70.00%

Publisher:

Abstract:

Food commodity price fluctuations have important impacts on poverty and food insecurity across the world. Conventional models have not provided a complete picture of recent price spikes in agricultural commodity markets, while there is an urgent need for appropriate policy responses. Perhaps new approaches are needed in order to better understand international spill-overs, the feedback between the real and the financial sectors, and the link between food and energy prices. In this paper, we present results from a new worldwide dynamic model that provides short- and long-run impulse responses of international wheat prices to various real shocks.
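The idea of an impulse response — how a one-time real shock propagates into prices over short and long horizons — can be illustrated with a minimal sketch. The two-variable model and its coefficients below are hypothetical stand-ins, not the worldwide model of the paper:

```python
import numpy as np

# Hypothetical VAR(1): state = [wheat price deviation, energy price deviation].
# Row 1 says wheat responds to its own lag (0.6) and to lagged energy (0.3).
A = np.array([[0.6, 0.3],
              [0.0, 0.8]])

def impulse_response(A, shock, horizon=20):
    """Trace the effect of a one-time unit shock through x_t = A x_{t-1}."""
    x = np.array(shock, dtype=float)
    path = [x.copy()]
    for _ in range(horizon):
        x = A @ x
        path.append(x.copy())
    return np.array(path)

# Response of wheat to a one-standard-deviation energy price shock.
irf = impulse_response(A, shock=[0.0, 1.0])
print(irf[1, 0], irf[-1, 0])   # short-run vs long-run wheat response
```

With both eigenvalues inside the unit circle the shock dies out, so the short-run response is large while the long-run response decays toward zero.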

Relevance:

70.00%

Publisher:

Abstract:

In this thesis, new advances in the development of spectroscopy-based methods for the characterization of heritage materials have been achieved. As concerns FTIR spectroscopy, new approaches aimed at exploiting the near- and far-IR regions for the characterization of inorganic and organic materials have been tested. Paint cross-sections have been analysed by FTIR spectroscopy in the NIR range, and an ad hoc chemometric approach has been developed for the elaboration of hyperspectral maps. Moreover, a new method for the characterization of calcite, based on the use of grinding curves, has been set up in both the MIR and the FIR regions. Indeed, calcite is a material widely used in cultural heritage, and this spectroscopic approach is an efficient and rapid tool for distinguishing between different calcite samples. Different enhanced vibrational techniques for the characterization of dyed fibres have been tested. First, a SEIRA (Surface Enhanced Infra-Red Absorption) protocol was optimised, allowing the analysis of colorant micro-extracts thanks to the enhancement produced by the addition of gold nanoparticles. These preliminary studies led to the identification of a new enhanced FTIR method, named ATR/RAIRS, which allowed lower detection limits to be reached. Regarding Raman microscopy, the research followed two lines, which have in common the aim of avoiding the use of colloidal solutions. AgI-based supports, obtained after deposition on gold-coated glass slides, have been developed and tested by spotting colorant solutions. A SERS spectrum can be obtained thanks to the photoreduction that the laser may induce on the silver salt. Moreover, these supports can be used for the TLC separation of a mixture of colorants, and analyses by means of both Raman/SERS and ATR-RAIRS can be successfully carried out. Finally, a photoreduction method for the "on-fibre" analysis of colorants without the need for any extraction has been optimised.
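The chemometric elaboration of a hyperspectral map can be sketched with a minimal PCA example. The cube size, band count, and synthetic spectra below are hypothetical, standing in for the NIR maps of paint cross-sections; the thesis' own chemometric approach is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic hyperspectral cube: 20 x 20 pixels, 50 spectral bands.
# Left half of the map carries spectrum A, right half spectrum B.
bands = np.linspace(0, 1, 50)
spec_a = np.exp(-((bands - 0.3) / 0.05) ** 2)   # band centred at 0.3
spec_b = np.exp(-((bands - 0.7) / 0.05) ** 2)   # band centred at 0.7
cube = np.empty((20, 20, 50))
cube[:, :10] = spec_a
cube[:, 10:] = spec_b
cube += rng.normal(0, 0.05, cube.shape)          # instrumental noise

# PCA via SVD on mean-centred pixel spectra; the first score map
# separates the two materials spatially.
X = cube.reshape(-1, 50)
X -= X.mean(axis=0)
_, _, vt = np.linalg.svd(X, full_matrices=False)
score_map = (X @ vt[0]).reshape(20, 20)

left, right = score_map[:, :10].mean(), score_map[:, 10:].mean()
print(f"mean PC1 score: left {left:.2f}, right {right:.2f}")
```

Mapping the first principal-component score back onto pixel coordinates turns the spectral difference into a spatial image of where each material sits.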

Relevance:

60.00%

Publisher:

Abstract:

The development of the digital electronics market is founded on the continuous reduction of transistor size, to reduce area, power and cost and to increase the computational performance of integrated circuits. This trend, known as technology scaling, is approaching the nanometer scale. The lithographic process in the manufacturing stage is becoming more uncertain as transistor size scales down, resulting in larger parameter variation in future technology generations. Furthermore, the exponential relationship between leakage current and threshold voltage is limiting the scaling of the threshold and supply voltages, increasing the power density and creating local thermal issues such as hot spots, thermal runaway and thermal cycles. In addition, the introduction of new materials and the smaller device dimensions are reducing transistor robustness, which, combined with high temperatures and frequent thermal cycles, speeds up wear-out processes. These effects are no longer addressable at the process level alone. Consequently, deep sub-micron devices will require solutions spanning several design levels, such as system and logic, and new approaches called Design for Manufacturability (DFM) and Design for Reliability (DFR). The purpose of these approaches is to bring awareness of device reliability and manufacturability into the early design stages, in order to introduce logic and system techniques able to cope with yield and reliability loss. The ITRS roadmap suggests the following research steps to integrate design for manufacturability and reliability into the standard automated CAD design flow: i) the implementation of new analysis algorithms able to predict the system's thermal behavior and its impact on power and speed performance; ii) high-level wear-out models able to predict the mean time to failure (MTTF) of the system;
iii) statistical performance analysis able to predict the impact of process variation, both random and systematic. The new analysis tools have to be developed alongside new logic and system strategies to cope with future challenges, for instance: i) thermal management strategies that increase the reliability and lifetime of devices by acting on some tunable parameter, such as supply voltage or body bias; ii) error detection logic able to interact with compensation techniques such as Adaptive Supply Voltage (ASV), Adaptive Body Bias (ABB) and error recovery, in order to increase yield and reliability; iii) architectures that are fundamentally resistant to variability, including locally asynchronous designs, redundancy, and error-correcting signal encodings (ECC). The literature already features works addressing the prediction of the MTTF, papers focusing on thermal management in general-purpose chips, and publications on statistical performance analysis. In my PhD research activity, I investigated the need for thermal management in future embedded low-power Network on Chip (NoC) devices. I developed a thermal analysis library that has been integrated into a cycle-accurate NoC simulator and into an FPGA-based NoC simulator. The results have shown that an accurate layout distribution can avoid the onset of hot spots in a NoC chip. Furthermore, the application of thermal management can reduce the temperature and the number of thermal cycles, increasing system reliability. The thesis therefore advocates integrating thermal analysis into the first design stages of embedded NoC design. Later on, I focused my research on the development of a statistical process variation analysis tool able to address both random and systematic variations. The tool was used to analyze the impact of self-timed asynchronous logic stages in an embedded microprocessor. As a result, we confirmed the capability of self-timed logic to increase manufacturability and reliability.
Furthermore, we used the tool to investigate the suitability of low-swing techniques for NoC system communication under process variations. In this case we discovered the superior robustness of low-swing links to systematic process variation; they show a good response to compensation techniques such as ASV and ABB. Hence low-swing signalling is a good alternative to standard CMOS communication in terms of power, speed, reliability and manufacturability. In summary, my work proves the advantage of integrating a statistical process variation analysis tool into the first stages of the design flow.
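The kind of statistical performance analysis described above can be illustrated with a minimal Monte Carlo sketch. The delay model, parameter spreads, and yield criterion below are hypothetical and uncalibrated; they are not the tool developed in the thesis, only the shape of the computation:

```python
import numpy as np

rng = np.random.default_rng(42)

def path_delay_samples(n_mc=10000, n_gates=20,
                       d0=10e-12, vdd=1.0, vth_nom=0.3,
                       sigma_rand=0.02, sigma_sys=0.02):
    """Monte Carlo estimate of critical-path delay under process
    variation.  Each sample draws one systematic (chip-wide) Vth shift
    plus an independent random shift per gate; gate delay follows a
    simplified alpha-power law (alpha = 1.3).  All numbers are
    illustrative, not calibrated to any technology."""
    alpha = 1.3
    sys_shift = rng.normal(0.0, sigma_sys, size=(n_mc, 1))          # shared
    rand_shift = rng.normal(0.0, sigma_rand, size=(n_mc, n_gates))  # per gate
    vth = vth_nom + sys_shift + rand_shift
    gate_delay = d0 * vdd / (vdd - vth) ** alpha
    return gate_delay.sum(axis=1)                                   # path delay

delays = path_delay_samples()
nominal = 20 * 10e-12 * 1.0 / (1.0 - 0.3) ** 1.3
yield_est = np.mean(delays < 1.10 * nominal)   # timing yield at +10% slack
print(f"mean delay {delays.mean()*1e12:.1f} ps, yield {yield_est:.2%}")
```

Separating the shared (systematic) draw from the per-gate (random) draws is what lets such a tool attribute how much of the delay spread averages out along a path and how much does not.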

Relevance:

60.00%

Publisher:

Abstract:

Herpes simplex virus entry into cells requires a multipartite fusion apparatus made of gD, gB and the heterodimer gH/gL. gD serves as the receptor-binding glycoprotein and the trigger of fusion; its ectodomain is organized into an N-terminal domain carrying the receptor-binding sites, and a C-terminal domain carrying the profusion domain, required for fusion but not for receptor binding. gB and gH/gL execute fusion. To understand how the four glycoproteins cross-talk with each other, we searched for biochemically defined complexes in infected and transfected cells, and in virions. We report that gD formed complexes with gB in the absence of gH/gL, and with gH/gL in the absence of gB. Complexes with similar composition were formed in infected and transfected cells. They were also present in virions prior to entry, and did not increase upon virus fusion with the cell. A panel of gD mutants enabled the preliminary mapping of part of the gD binding site for gB to the aa 240-260 portion and downstream, with T306-P307 as critical residues, and of the binding site for gH/gL to the aa 260-310 portion, with P291-P292 as critical residues. The results indicate that gD carries composite independent binding sites for gB and gH/gL, both of which are partly located in the profusion domain. The second part of the project dealt with the rational design of peptides inhibiting virus entry. For gB and gD the crystal structure is known, so we designed peptides that dock into the structure or prevent the adoption of the final conformation of the target molecule. For the other glycoproteins, whose structure is not known, peptide libraries were analyzed. Among several peptides, some designed on glycoprotein B were identified as active. Two of them were further analyzed. We identified the peptide residues fundamental for the inhibitory activity, suggesting a possible mechanism of action. Furthermore, by changing the flexibility of the peptides, an increased activity was observed, with an EC50 below 10 μM.
New approaches will aim to demonstrate the direct interaction between these peptides and the target glycoprotein B.

Relevance:

60.00%

Publisher:

Abstract:

Nowadays, it is clear that the goal of creating a sustainable future for the next generations requires re-thinking the industrial application of chemistry. It is also evident that more sustainable chemical processes may be economically convenient in comparison with conventional ones, because fewer by-products mean lower costs for raw materials, separation and disposal treatments; this also implies an increase in productivity and, as a consequence, smaller reactors can be used. In addition, an indirect gain could derive from the better public image of a company marketing sustainable products or processes. In this context, oxidation reactions play a major role, being the tool for the production of huge quantities of chemical intermediates and specialties. Potentially, the impact of these productions on the environment could have been much worse than it is, had continuous effort not been spent on improving the technologies employed. Substantial technological innovations have driven the development of new catalytic systems and the improvement of reaction and process technologies, helping to move the chemical industry in the direction of a more sustainable and ecological approach. The roadmap for the application of these concepts includes new synthetic strategies, alternative reactants, catalyst heterogenisation, and innovative reactor configurations and process design. In order to implement all these ideas in real projects, the development of more efficient reactions is one primary target. Yield, selectivity and space-time yield are the right metrics for evaluating reaction efficiency. In the case of catalytic selective oxidation, the control of selectivity has always been the principal issue, because the formation of total oxidation products (carbon oxides) is thermodynamically more favoured than the formation of the desired, partially oxidized compound.
As a matter of fact, in only a few oxidation reactions is total, or close to total, conversion achieved; usually the selectivity is limited by the formation of by-products or co-products, which often implies unfavourable process economics; moreover, sometimes the cost of the oxidant further penalizes the process. During my PhD work, I investigated four reactions that are emblematic of the new approaches used in the chemical industry. In Part A of my thesis, a new process aimed at a more sustainable production of menadione (vitamin K3) is described. The "greener" approach includes the use of hydrogen peroxide in place of chromate (moving from a stoichiometric to a catalytic oxidation), also avoiding the production of dangerous waste. Moreover, I studied the possibility of using a heterogeneous catalytic system able to efficiently activate hydrogen peroxide. The overall process would be carried out in two different steps: the first is the methylation of 1-naphthol with methanol to yield 2-methyl-1-naphthol; the second is the oxidation of the latter compound to menadione. The catalyst for this latter step, the reaction object of my investigation, consists of Nb2O5-SiO2 prepared with the sol-gel technique. The catalytic tests were first carried out under conditions that simulate the in-situ generation of hydrogen peroxide, i.e. using a low concentration of the oxidant. Then, experiments were carried out using higher hydrogen peroxide concentrations. The study of the reaction mechanism was fundamental to obtaining indications about the best operating conditions and improving the selectivity to menadione. In Part B, I explored the direct oxidation of benzene to phenol with hydrogen peroxide. The industrial process for phenol is the oxidation of cumene with oxygen, which also co-produces acetone.
This can be considered a case of how economics could drive the sustainability issue; in fact, the new process, which allows phenol to be obtained directly, besides avoiding the co-production of acetone (a burden for phenol, because the market requirements for the two products are quite different), might be economically convenient with respect to the conventional process, if a high selectivity to phenol were obtained. Titanium silicalite-1 (TS-1) is the catalyst chosen for this reaction. By comparing the reactivity results obtained with TS-1 samples having different chemical-physical properties, and by analyzing in detail the effect of the more important reaction parameters, we could formulate some hypotheses concerning the reaction network and mechanism. Part C of my thesis deals with the hydroxylation of phenol to hydroquinone and catechol. This reaction is already applied industrially but, for economic reasons, an improvement in the selectivity to the para di-hydroxylated compound and a decrease in the selectivity to the ortho isomer would be desirable. Also in this case, the catalyst used was TS-1. The aim of my research was to find a method to control the selectivity ratio between the two isomers, and ultimately to make the industrial process more flexible, so as to adapt the process performance to fluctuations of the market requirements. The reaction was carried out both in a batch stirred reactor and in a re-circulating fixed-bed reactor. In the first system, the effect of various reaction parameters on the catalytic behaviour was investigated: the type of solvent or co-solvent, and the particle size. With the second reactor type, I investigated the possibility of using a continuous system, with the catalyst shaped into extrudates (instead of powder), in order to avoid the catalyst filtration step. Finally, Part D deals with the study of a new process for the valorisation of glycerol by means of its transformation into valuable chemicals.
This molecule is nowadays produced in large amounts, being a co-product of biodiesel synthesis; therefore, it is considered a raw material from renewable resources (a bio-platform molecule). Initially, we tested the oxidation of glycerol in the liquid phase, with hydrogen peroxide and TS-1. However, the results achieved were not satisfactory. We then investigated the gas-phase transformation of glycerol into acrylic acid, with the intermediate formation of acrolein; the latter can be obtained by dehydration of glycerol, and then oxidized into acrylic acid. Since the oxidation step from acrolein to acrylic acid is already optimized at an industrial level, we decided to investigate in depth the first step of the process. I studied the reactivity of heterogeneous acid catalysts based on sulphated zirconia. Tests were carried out under both aerobic and anaerobic conditions, in order to investigate the effect of oxygen on the catalyst deactivation rate (a main problem usually encountered in glycerol dehydration). Finally, I studied the reactivity of bifunctional systems made of Keggin-type polyoxometalates, either alone or supported on sulphated zirconia, in this way combining the acid functionality (necessary for the dehydration step) with the redox one (necessary for the oxidation step). In conclusion, during my PhD work I investigated reactions that apply the rules and strategies of "green chemistry"; in particular, I studied new greener approaches to the synthesis of chemicals (Parts A and B), the optimisation of reaction parameters to make an oxidation process more flexible (Part C), and the use of a bio-platform molecule for the synthesis of a chemical intermediate (Part D).
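The reaction-efficiency metrics named above (conversion, selectivity, yield, space-time yield) can be made concrete with a small sketch; the mole numbers and reactor figures are hypothetical, and a 1:1 stoichiometry is assumed:

```python
def reaction_metrics(n_reactant_in, n_reactant_out, n_product,
                     reactor_volume_l, time_h):
    """Standard selective-oxidation metrics (1:1 stoichiometry assumed).
    conversion       = fraction of reactant consumed
    selectivity      = fraction of consumed reactant ending up as product
    yield            = conversion * selectivity
    space-time yield = product per reactor volume per time"""
    consumed = n_reactant_in - n_reactant_out
    conversion = consumed / n_reactant_in
    selectivity = n_product / consumed
    y = conversion * selectivity
    sty = n_product / (reactor_volume_l * time_h)   # mol L^-1 h^-1
    return conversion, selectivity, y, sty

# Hypothetical run: 1.00 mol fed, 0.40 mol left, 0.45 mol of product,
# in a 0.5 L reactor over 2 h.
c, s, y, sty = reaction_metrics(1.00, 0.40, 0.45,
                                reactor_volume_l=0.5, time_h=2.0)
print(f"conversion {c:.0%}, selectivity {s:.0%}, "
      f"yield {y:.0%}, STY {sty:.2f} mol/(L·h)")
```

The example shows why selectivity is the binding constraint in selective oxidation: high conversion with mediocre selectivity still gives a poor yield.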

Relevance:

60.00%

Publisher:

Abstract:

This research thesis sets out to examine two types of social song in the nineteenth century, in particular around 1848. The study of song in the contexts examined (Italy and France) is analysed through two parallel lines of research. On the one hand, the concept of sociabilité is used to explore the places where songs were produced and circulated (the importance of the street, the tavern, the Parisian goguettes, the chansonniers des rues and the cantastorie) and the circumstances in which songs were used (the song as a form of oral expression, but also as wall writing, broadside and leaflet). On the other hand, the analysis focuses on the contents of the musical texts, highlighting the different themes, linguistic images and rhetorical figures sung by the artisan-worker, in order to bring out the differences in the idea of nation between the two contexts examined. The attention paid to comparison leads to the identification of points of contact between the two nations. Song, in fact, is a privileged terrain for understanding the image of the "other": what image did French workers have of Risorgimento Italy? And how did Italian artisans perceive the French nation? Song is analysed not only as a "text" but also as a "social practice". These operations make it possible to probe more deeply the social function performed by song within popular culture and its importance as a form of expression and a vector of politicization. The dual use of song, as "text" and as "practice", places the research within a historiographical current that moves from social history towards cultural history. Social song represents fertile ground for research, not only within a single national territory, but also possesses precious heuristic value in a comparative perspective.

Relevance:

60.00%

Publisher:

Abstract:

The aim of this thesis is to study the HIV-1 virus in relation to the systemic alterations found in HIV-infected patients, in particular alterations of the skeletal system induced by the virus or by the action of the drugs used in highly active antiretroviral therapy (HAART). The incidence of osteoporosis in HIV-positive patients is dramatically higher than in the healthy population. Clinical studies have shown that some drugs, for example viral protease inhibitors, lead to the impairment of bone homeostasis, with an increased fracture risk. Our study involves a 12-month follow-up from the start of HAART in a cohort of naïve patients, monitoring several bone markers. The results obtained show an increase in the metabolic markers of bone turnover, confirming the impact of HAART on bone homeostasis. We then focused our attention on osteoblasts, the cell type that regulates the synthesis of new bone matrix. Experiments conducted on the HOBIT cell line show that the treatment, particularly with protease inhibitors, leads to apoptosis when the drug concentration is higher than the physiological one. However, even physiological drug concentrations can negatively regulate some bone markers, such as ALP and osteocalcin. Finally, there is the problem of eradicating HIV-1 from its viral reservoirs. HAART manages to control viraemia levels; nevertheless, several studies propose certain cell types as potential reservoirs of infection, thwarting the effect of therapy. We therefore developed a new molecular approach to eradication: exploiting the viral integrase enzyme to selectively recognize the viral LTR sequences and so target the integrated virus. By fusing integrase with the FokI endonuclease, we generated several clones. These were stably transfected into Jurkat cells, which are susceptible to infection.
Once the cells were infected, we obtained a significant reduction of the infection markers. Subsequently, transfection into the lymphoblastic cell line 8E5/LAV, which carries one integrated copy of HIV in its genome, gave very encouraging results, such as a strong reduction of the integrated viral DNA.

Relevance:

60.00%

Publisher:

Abstract:

The protein silk fibroin (SF) from the silkworm Bombyx mori is an FDA-approved biomaterial used for centuries as suture wire. Importantly, several lines of evidence have highlighted the potential of silk biomaterials obtained using so-called regenerated silk fibroin (RSF) in biomedicine, tissue engineering and drug delivery. Indeed, by a water-based protocol, it is possible to obtain a protein water solution through the extraction and purification of fibroin from silk fibres. Notably, RSF can be processed into a variety of biomaterial forms used in biomedical and technological fields, displaying remarkable properties such as biocompatibility, controllable biodegradability, optical transparency and mechanical robustness. Moreover, RSF biomaterials can be doped and/or chemically functionalized with drugs, optically active molecules, growth factors and/or chemicals. In this view, the activities of my PhD research program focused on standardizing the process of extraction and purification of the protein to obtain the best physical and chemical characteristics. The analysis of the chemo-physical properties of fibroin involved both the RSF water solution and the protein processed into films. The chemo-physical properties have been studied through vibrational (FT-IR and FT-Raman) and optical (UV-VIS absorption and emission) spectroscopy, nuclear magnetic resonance (1H and 13C NMR), and thermal and thermo-gravimetric analysis (DSC and TGA). In the last year of my PhD, activities focused on studying and defining innovative methods for the functionalization of silk fibroin solutions and films. Indeed, the research program addressed different manufacturing approaches for fibroin films that avoid harsh treatments and organic solvents. New approaches to the doping and chemical functionalization of silk fibroin were studied.
Two different methods have been identified: 1) biodoping, which consists in doping fibroin with optically active molecules through the addition of fluorescent molecules to the standard diet used for silkworm breeding; 2) chemical functionalization via silylation.

Relevance:

60.00%

Publisher:

Abstract:

This thesis regards the study and development of new cognitive assessment and rehabilitation techniques for subjects with traumatic brain injury (TBI). In particular, this thesis i) provides an overview of the state of the art of these new assessment and rehabilitation technologies, ii) suggests new methods for assessment and rehabilitation, and iii) contributes to the explanation of the neurophysiological mechanism involved in a rehabilitation treatment. Some chapters provide useful information to contextualize TBI and its outcome, and describe the methods used for its assessment and rehabilitation. The other chapters illustrate a series of experimental studies conducted in healthy subjects and TBI patients that suggest new approaches to assessment and rehabilitation. The proposed approaches have in common the use of electroencephalography (EEG). EEG was used in all the experimental studies, each time with a different purpose: as a diagnostic tool, as the signal commanding a BCI system, as an outcome measure to evaluate the effects of a treatment, etc. The main results achieved concern: i) the study and development of a system for communication with patients with disorders of consciousness; it was possible to identify a paradigm of reliable activation during two imagery tasks using the EEG signal alone or combined EEG and NIRS signals; ii) the study of the effects of a neuromodulation technique (tDCS) on EEG patterns, a topic of great importance and interest. The findings showed that tDCS can manipulate cortical network activity and that, by searching for optimal stimulation parameters, it is possible to move the working point of a neural network and bring it into a condition of maximum learning. In this way it could be possible to improve the performance of a BCI system or the efficacy of a rehabilitation treatment, such as neurofeedback.
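How an EEG signal can drive a two-class imagery BCI can be sketched minimally as follows. The band limits, sampling rate, synthetic trials and threshold classifier are illustrative assumptions, not the system developed in the thesis:

```python
import numpy as np

rng = np.random.default_rng(1)
FS = 250  # sampling rate in Hz (assumed)

def bandpower(trial, f_lo=8.0, f_hi=12.0, fs=FS):
    """Mu-band power of one EEG trial (1-D array) via the periodogram."""
    spec = np.abs(np.fft.rfft(trial - trial.mean())) ** 2
    freqs = np.fft.rfftfreq(trial.size, d=1.0 / fs)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return spec[band].sum() / trial.size

# Synthetic 2 s trials: "rest" has a strong 10 Hz mu rhythm, "imagery"
# shows the classic mu suppression (event-related desynchronization).
t = np.arange(2 * FS) / FS
def make_trial(mu_amp):
    return mu_amp * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, t.size)

rest = [bandpower(make_trial(2.0)) for _ in range(30)]
imag = [bandpower(make_trial(0.3)) for _ in range(30)]

# One-feature classifier: threshold halfway between the class means.
thr = (np.mean(rest) + np.mean(imag)) / 2
acc = (np.sum(np.array(rest) > thr) + np.sum(np.array(imag) <= thr)) / 60
print(f"threshold {thr:.1f}, training accuracy {acc:.0%}")
```

A real BCI replaces the synthetic trials with recorded epochs and the threshold with a trained classifier, but the chain — epoch, band-power feature, decision — is the same.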

Relevance:

60.00%

Publisher:

Abstract:

Starch is the main form in which plants store carbohydrate reserves, both in terms of amount and of distribution among different plant species. Carbohydrates are direct products of photosynthetic activity, and it is well known that yield efficiency and production are directly correlated with the amount of carbohydrates synthesized and with how these are distributed among vegetative and reproductive organs. Nowadays, in pear trees, due to the modernization of orchards through the introduction of new rootstocks and the development of new training systems, the understanding and development of new approaches regarding the distribution and storage of carbohydrates are required. The objective of this research work was to study the behavior of carbohydrate reserves, mainly starch, in different pear tree organs and tissues, i.e. fruits, leaves, woody organs, roots and flower buds, at different physiological stages during the season. Starch in fruit accumulates at early stages and reaches a maximum concentration during the middle phase of fruit development; after that, its degradation begins, with a rise in soluble carbohydrates. Moreover, relationships between fruit starch degradation and different fruit traits, soluble sugars and organic acids were established. In woody organs and roots, an interconversion between starch and soluble carbohydrates was observed during the dormancy period, which confirms its main function of supporting the growth and development of new tissues during the following spring. Factors such as training systems, rootstocks, types of bearing wood, and their position in the canopy influenced the concentrations of starch and soluble carbohydrates at the different sampling dates. Environmental conditions and cultural practices must also be considered to better explain these results.
Thus, a deeper understanding of the dynamics of carbohydrate reserves within the plant could provide relevant information for improving several management practices to increase crop yield efficiency.

Resumo:

Choosing natural enemies to suppress pest populations has long been the key to biological control. Over time the term has also been applied to the use of suppressive soils, bio-disinfection and biopesticides. Biological control agents (BCAs) and natural compounds, extracted or fermented from various sources, are the resources for containing phytopathogens; BCAs can act through direct antagonism or by inducing hypovirulence in the pathogen. The first part of the thesis focused on mycoviruses infecting phytopathogenic fungi of the genus Fusarium, developing new approaches for faster dissection of the virome of filamentous fungal samples. The semiconductor-based sequencer Ion Torrent™ and the nanopore-based sequencer MinION were exploited to analyze DNA and RNA referable to viral genomes. Comparison with GenBank accessions and sequence analysis allowed the identification of more than 40 putative viral species. Some of these mycovirus genera have been studied as inducers of hypovirulence in several phytopathogenic fungi; future work will therefore compare the morphology and physiology of fungal strains infected with, and cured of, the viruses identified, and assess their possible use as biocontrol agents. The second part of the thesis evaluated the potential of botanical pesticides for the biocontrol of phloem-limited phytopathogens such as phytoplasmas. The only active compounds able to control phytoplasmas are the oxytetracycline antibiotics, and direct, fast in vitro screening of new antimicrobial compounds on media is almost impossible because phytoplasmas are difficult to culture. For this reason, a simple and reliable screening method was developed to evaluate the effects of antimicrobials directly on phytoplasmas through an "ex vivo" approach.
Using scanning electron microscopy (SEM) in parallel with molecular tools (ddRT-PCR), the direct activity of tetracyclines on phytoplasma cells was verified, and a promising compound showing similar activity was also identified.
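As a toy illustration of the similarity screening that underlies assigning sequences to putative viral species (the actual work compared reads and contigs against GenBank with dedicated alignment tools), the sketch below scores a contig against hypothetical reference sequences by k-mer containment. Sequence names and contents are invented for the example:

```python
def kmers(seq, k=8):
    """Set of all overlapping k-mers in a nucleotide sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def containment(query, reference, k=8):
    """Fraction of the query's k-mers found in the reference -- a crude
    stand-in for the identity scores a BLAST-style search would report."""
    q = kmers(query, k)
    return len(q & kmers(reference, k)) / len(q)

# Toy references standing in for GenBank viral accessions (hypothetical).
refs = {
    "mycovirus_A": "ATGGCGTACGTTAGCATCGATCGGATCCGTACGATCGTAGCTAGGCT",
    "mycovirus_B": "TTACCGGAATTCCGGAATTGGCCAATTCCGGTTAACCGGAATTCCAA",
}
contig = "ATGGCGTACGTTAGCATCGATCGGATCCGTACG"  # fragment of mycovirus_A

best = max(refs, key=lambda name: containment(contig, refs[name]))
print(best)  # -> mycovirus_A
```

Real pipelines replace the containment score with full alignments and E-values, but the classification logic (best hit above a threshold) is the same in spirit.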

Resumo:

Biomedicine is a highly interdisciplinary research area at the interface of the sciences, anatomy, physiology, and medicine. In the last decade, biomedical studies have been greatly enhanced by the introduction of new technologies and techniques for automated quantitative imaging, considerably advancing the possibility of investigating biological phenomena through image analysis. However, the effectiveness of this interdisciplinary approach is limited by the knowledge that a biologist and a computer scientist, by professional training, have of each other's fields. A possible way to make up for these gaps is to train biologists as interdisciplinary researchers able to develop dedicated image processing and analysis tools by exploiting a content-aware approach. The aim of this Thesis is to show the effectiveness of a content-aware approach to automated quantitative imaging through its application to different biomedical studies, with the secondary purpose of motivating researchers to invest in interdisciplinarity. The content-aware approach was applied first to the phenomization of tumour cell response to stress by confocal fluorescence imaging, and second to the texture analysis of trabecular bone microarchitecture in micro-CT scans. Third, it served the characterization of new 3-D multicellular spheroids of human stem cells and the investigation of the role of the Nogo-A protein in tooth innervation. Finally, the content-aware approach also prompted the development of two novel methods for local image analysis and colocalization quantification. In conclusion, the content-aware approach has proved its benefit by building new methods that improved the quality of image analysis and strengthened statistical significance, helping to unveil biological phenomena. Hopefully, this Thesis will inspire researchers to strive for interdisciplinarity.
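Colocalization quantification, mentioned above, is commonly expressed as a Pearson correlation between two fluorescence channels. The sketch below computes this standard measure on synthetic data; it illustrates the general technique, not the novel method developed in the Thesis:

```python
import numpy as np

def pearson_colocalization(ch1, ch2):
    """Pearson correlation between two image channels, a standard
    pixel-wise colocalization measure."""
    a = ch1.ravel().astype(float)
    b = ch2.ravel().astype(float)
    a -= a.mean()
    b -= b.mean()
    return float((a @ b) / np.sqrt((a @ a) * (b @ b)))

# Synthetic two-channel image: channel 2 is a noisy copy of channel 1,
# so the two signals largely colocalize.
rng = np.random.default_rng(42)
ch1 = rng.random((64, 64))
ch2 = 0.8 * ch1 + 0.2 * rng.random((64, 64))

r = pearson_colocalization(ch1, ch2)
print(round(r, 2))  # close to 1 for strongly colocalized channels
```

Values near 1 indicate that the two markers occupy the same pixels; near 0, independence. Local variants compute the same statistic over sliding windows instead of the whole image.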

Resumo:

Background. The incidence of aortic valve disease is steadily increasing. Definitive therapy is surgical or interventional and yields a clear improvement in quality of life, with an operative risk that is now extremely low. The most recent international guidelines place both procedures in class I for patients aged between 65 and 80 years. Materials and methods. A retrospective analysis was performed on patients aged between 65 and 80 who underwent isolated surgical aortic valve replacement with a sutureless bioprosthesis (SU-AVR group) or transcatheter replacement (TAVR group) at Maria Cecilia Hospital between January 2011 and December 2021. Using propensity score matching, the two resulting groups were compared for mortality and complications in hospital, at 30 days, at one year, and actuarially. Results. The study included 638 patients: 338 (52.98%) in the SU-AVR group and 300 (47.02%) in the TAVR group. After propensity score matching, two groups of 124 patients each were obtained, with no statistically significant differences in preoperative comorbidities. Thirty-day mortality was comparable between the two groups. The TAVR group showed a significantly higher incidence of permanent pacemaker implantation and major vascular damage, whereas the SU-AVR group showed a higher incidence of atrial fibrillation, transfusions, and renal failure. All-cause mortality at one year was significantly higher in the TAVR group, and the gap continues to widen over time. Conclusions. Transcatheter aortic valve replacement (TAVR) shows very good short-term results in patients aged between 65 and 80.
At medium-term follow-up, however, preliminary results show a better outcome for patients undergoing surgical valve replacement, in terms of both all-cause mortality and major cardiovascular and cerebrovascular events.
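The propensity score matching used in the study can be illustrated schematically: fit a model of P(treatment | covariates), then pair each treated patient with the control whose score is nearest. The sketch below uses synthetic data and a hand-rolled logistic fit; the covariates and model form are assumptions for illustration, not the study's actual specification:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical cohort: two baseline covariates (age and a risk index);
# "treated" patients (e.g. TAVR = 1) tend to be older and higher-risk.
n = 400
age = rng.normal(73, 4, n)
risk = rng.normal(3, 1, n)
logit = -1 + 0.3 * (age - 73) + 0.8 * (risk - 3)
treated = rng.random(n) < 1 / (1 + np.exp(-logit))

# Logistic propensity model fitted by gradient descent (a stand-in for
# whatever model the study actually used).
X = np.column_stack([np.ones(n), age - 73, risk - 3])
w = np.zeros(3)
for _ in range(2000):
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - treated) / n
score = 1 / (1 + np.exp(-X @ w))

# 1:1 nearest-neighbour matching on the score, without replacement.
controls = list(np.where(~treated)[0])
pairs = []
for i in np.where(treated)[0]:
    if not controls:
        break
    j = min(controls, key=lambda c: abs(score[i] - score[c]))
    pairs.append((i, j))
    controls.remove(j)

# Matching should shrink the between-group gap in propensity scores.
m_t = [i for i, _ in pairs]
m_c = [j for _, j in pairs]
gap_before = abs(score[treated].mean() - score[~treated].mean())
gap_after = abs(score[m_t].mean() - score[m_c].mean())
print(gap_after < gap_before)  # matched groups are more comparable
```

After matching, outcomes (mortality, complications) are compared between the paired groups, which approximates a randomized comparison under the assumption that all confounders are captured by the covariates.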