12 results for "Propellente random-close packing sferico frazione volumetrica"
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
In the past decade, the advent of efficient genome sequencing tools and high-throughput experimental biotechnology has led to enormous progress in the life sciences. Among the most important innovations is microarray technology, which allows the expression of thousands of genes to be quantified simultaneously by measuring the hybridization of a tissue of interest to probes on a small glass or plastic slide. The characteristics of these data include a fair amount of random noise, a predictor dimension in the thousands, and a sample size in the dozens. One of the most exciting areas to which microarray technology has been applied is the challenge of deciphering complex diseases such as cancer. In these studies, samples are taken from two or more groups of individuals with heterogeneous phenotypes, pathologies, or clinical outcomes. These samples are hybridized to microarrays in an effort to find a small number of genes that are strongly correlated with the groups of individuals. Even though methods to analyse the data are today well developed and close to reaching a standard organization (through the effort of international projects such as the Microarray Gene Expression Data (MGED) Society [1]), it is not infrequent to encounter a clinician's question for which no compelling statistical method exists to answer it. The contribution of this dissertation to deciphering disease is the development of new approaches aimed at handling open problems posed by clinicians in specific experimental designs. Chapter 1, starting from a necessary biological introduction, reviews microarray technologies and all the important steps involved in an experiment, from the production of the array to quality controls, ending with the preprocessing steps that will be used in the data analysis in the rest of the dissertation.
Chapter 2 provides a critical review of standard analysis methods, stressing the main open problems. Chapter 3 introduces a method to address the issue of unbalanced design in microarray experiments. In microarray experiments, experimental design is a crucial starting point for obtaining reasonable results. In a two-class problem, an equal or similar number of samples should be collected for the two classes. However, in some cases, e.g. rare pathologies, the approach to be taken is less evident. We propose to address this issue by applying a modified version of SAM [2]. MultiSAM consists of a reiterated application of SAM, comparing the less populated class (LPC) with 1,000 random samplings of the same size from the more populated class (MPC). A list of the differentially expressed genes is generated for each SAM application. After 1,000 reiterations, each probe is given a "score" ranging from 0 to 1,000 based on its recurrence as differentially expressed in the 1,000 lists. The performance of MultiSAM was compared to that of SAM and LIMMA [3] over two simulated data sets generated via beta and exponential distributions. The results of all three algorithms over low-noise data sets seem acceptable. However, on a real unbalanced two-channel data set regarding Chronic Lymphocytic Leukemia, LIMMA finds no significant probes, SAM finds 23 significantly changed probes but cannot separate the two classes, while MultiSAM finds 122 probes with score >300 and separates the data into two clusters by hierarchical clustering. We also report extra-assay validation in terms of differentially expressed genes. Although standard algorithms perform well over low-noise simulated data sets, MultiSAM seems to be the only one able to reveal subtle differences in gene expression profiles on real unbalanced data. Chapter 4 describes a method to address similarity evaluation in a three-class problem by means of the Relevance Vector Machine [4].
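The resampling-and-scoring core of MultiSAM can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: a plain two-sample t-like statistic and a fixed threshold stand in for the SAM statistic and its significance call, and the names `multisam_scores` and `t_stat` are hypothetical.

```python
import random
from statistics import mean, stdev

def t_stat(a, b):
    # Simple two-sample t-like statistic standing in for the SAM statistic.
    na, nb = len(a), len(b)
    pooled = ((stdev(a) ** 2) / na + (stdev(b) ** 2) / nb) ** 0.5
    return (mean(a) - mean(b)) / pooled if pooled > 0 else 0.0

def multisam_scores(lpc, mpc, n_iter=1000, threshold=2.0, seed=0):
    """lpc, mpc: dicts probe -> list of expression values (one per sample).
    Returns probe -> number of iterations in which it was called significant,
    i.e. the MultiSAM-style recurrence score from 0 to n_iter."""
    rng = random.Random(seed)
    probes = list(lpc)
    n_small = len(next(iter(lpc.values())))      # samples in the less populated class
    n_big = len(next(iter(mpc.values())))
    scores = {p: 0 for p in probes}
    for _ in range(n_iter):
        idx = rng.sample(range(n_big), n_small)  # random MPC subsample of LPC size
        for p in probes:
            sub = [mpc[p][i] for i in idx]
            if abs(t_stat(lpc[p], sub)) >= threshold:
                scores[p] += 1                   # probe recurs as differentially expressed
    return scores
```

A probe with a score near `n_iter` is consistently called in every random comparison, which is the stability the reiterated design is after.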
In fact, looking at microarray data in a prognostic and diagnostic clinical framework, differences are not the only quantities with a crucial role: in some cases similarities can give useful and sometimes even more important information. The goal, given three classes, could be to establish, with a certain level of confidence, whether the third one is more similar to the first or to the second. In this work we show that the Relevance Vector Machine (RVM) [2] is a possible solution to the limitations of standard supervised classification. RVM offers many advantages compared, for example, with its well-known precursor, the Support Vector Machine (SVM) [3]. Among these advantages, the estimate of the posterior probability of class membership represents a key feature for addressing the similarity issue. This is a highly important, but often overlooked, option of any practical pattern recognition system. We focused on a tumor-grade three-class problem, with 67 samples of grade 1 (G1), 54 samples of grade 3 (G3) and 100 samples of grade 2 (G2). The goal is to find a model able to separate G1 from G3, and then evaluate the third class G2 as a test set to obtain, for each G2 sample, the probability of belonging to class G1 or class G3. The analysis showed that breast cancer samples of grade 2 have a molecular profile more similar to breast cancer samples of grade 1. In the literature this result had been conjectured, but no measure of significance had been given before.
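The posterior-probability idea can be illustrated with any probabilistic classifier. The sketch below uses plain logistic regression as a stand-in for RVM (RVM itself is not implemented here); `train_logistic` and `posterior` are hypothetical helpers, with G1 and G3 encoded as labels 0 and 1.

```python
import math

def train_logistic(X, y, lr=0.1, epochs=500):
    # Plain logistic regression via SGD, standing in for RVM: any probabilistic
    # classifier yields the posterior class probabilities the method relies on.
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))
            g = p - yi                                   # gradient of the log-loss
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b

def posterior(model, x):
    """Posterior probability of class G3 (label 1) for sample x."""
    w, b = model
    z = sum(wj * xj for wj, xj in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))
```

Training on the G1 vs G3 samples and averaging `posterior` over the G2 samples gives the kind of class-membership probability used to decide which grade G2 resembles.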
Abstract:
High-frequency seismograms contain features that reflect the random inhomogeneities of the earth. In this work I use an imaging method to locate high-contrast small-scale heterogeneities with respect to the background earth medium. This method was first introduced by Nishigami (1991) and then applied to different volcanic and tectonically active areas (Nishigami, 1997; Nishigami, 2000; Nishigami, 2006). The scattering imaging method is applied to two volcanic areas: Campi Flegrei and Mt. Vesuvius. Volcanically and seismologically active areas are often characterized by complex velocity structures, due to the presence of rocks with different elastic properties. I introduce some modifications to the original method in order to make it suitable for small and highly complex media. In particular, for very complex media the single-scattering approximation assumed by Nishigami (1991) is not applicable, as the mean free path becomes short; the multiple-scattering, or diffusive, approximation becomes closer to reality. In this thesis, differently from the ordinary Nishigami method (Nishigami, 1991), I use the mean of the recorded coda envelopes as the reference curve and calculate the variations from this average envelope. In this way I implicitly do not assume any particular scattering regime for the "average" scattered radiation, whereas I consider the variations as due to waves that are singly scattered by the strongest heterogeneities. The imaging method is applied to a relatively small area (20 x 20 km), a choice justified by the short length of the analyzed codas of the low-magnitude earthquakes. I apply the unmodified Nishigami method to the volcanic area of Campi Flegrei and compare the results with the other tomographies done in the same area. The scattering images, obtained with waves of frequency around 18 Hz, show the presence of strong scatterers in correspondence with the submerged caldera rim in the southern part of the Pozzuoli bay.
Strong scattering is also found below the Solfatara crater, characterized by the presence of densely fractured, fluid-filled rocks and by a strong thermal anomaly. The modified Nishigami technique is applied to the Mt. Vesuvius area. Results show a low-scattering area just below the central cone and a high-scattering area around it. The high-scattering zone seems to be due to the contrast between the high-rigidity body located beneath the crater and the low-rigidity materials located around it. The central low-scattering area overlaps the hydrothermal reservoirs located below the central cone. An interpretation of the results in terms of geological properties of the medium is also supplied, aiming to find a correspondence between the scattering properties and the geological nature of the material. A complementary result reported in this thesis is that the strong heterogeneity of the volcanic medium creates a phenomenon called "coda localization": the shape of the seismograms recorded at stations located at the top of the volcanic edifice of Mt. Vesuvius differs from the shape of the seismograms recorded at the bottom. This behavior is explained by the fact that, at large lapse times, the coda energy is not uniformly distributed within the region surrounding the source.
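The reference-curve step of the modified method, averaging the recorded coda envelopes and measuring deviations from that average, can be sketched as follows. This is a schematic illustration only: `envelope_residuals` is a hypothetical name, and real envelopes would first be smoothed, time-aligned and corrected for source and site effects.

```python
from statistics import mean

def envelope_residuals(envelopes):
    """envelopes: list of coda-envelope time series of equal length, one per record.
    Returns (residuals, reference): the reference is the sample-by-sample mean
    envelope (no scattering regime assumed for this average), and the residuals
    are the per-record deviations attributed to single scattering from the
    strongest heterogeneities."""
    reference = [mean(samples) for samples in zip(*envelopes)]
    residuals = [[e - r for e, r in zip(env, reference)] for env in envelopes]
    return residuals, reference
```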
Abstract:
The inherent stochastic character of most physical quantities involved in engineering models has led to an ever-increasing interest in probabilistic analysis. Many approaches to stochastic analysis have been proposed. However, it is widely acknowledged that the only universal method available to solve accurately any kind of stochastic mechanics problem is Monte Carlo simulation. One of the key parts in the implementation of this technique is the accurate and efficient generation of samples of the random processes and fields involved in the problem at hand. In the present thesis an original method for the simulation of homogeneous, multi-dimensional, multi-variate, non-Gaussian random fields is proposed. The algorithm has proved to be very accurate in matching both the target spectrum and the marginal probability. The computational efficiency and robustness are very good too, even when dealing with strongly non-Gaussian distributions. What is more, the resulting samples possess all the relevant, well-defined and desired properties of "translation fields", including crossing rates and distributions of extremes. The topic of the second part of the thesis lies in the field of non-destructive parametric structural identification. Its objective is to evaluate the mechanical characteristics of the constituent bars in existing truss structures, using static loads and strain measurements. In the cases of missing data and of damage affecting only a small portion of a bar, Genetic Algorithms have proved to be an effective tool to solve the problem.
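The "translation field" construction can be illustrated in its simplest, pointwise form: draw a standard Gaussian value, map it through the Gaussian CDF, then through the inverse CDF of the target marginal (an exponential here, as an arbitrary non-Gaussian example). The thesis method additionally matches a target spectrum by generating a correlated Gaussian field first; that part is omitted, and `translation_sample` is a hypothetical name.

```python
import math
import random

def translation_sample(n, rate=1.0, seed=0):
    """Generate n values with an exponential(rate) marginal by the translation
    idea: x = F_target^{-1}(Phi(g)), with g standard Gaussian."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        g = rng.gauss(0.0, 1.0)
        u = 0.5 * (1.0 + math.erf(g / math.sqrt(2.0)))   # standard Gaussian CDF
        u = min(max(u, 1e-12), 1.0 - 1e-12)              # guard against u = 0 or 1
        out.append(-math.log(1.0 - u) / rate)            # inverse exponential CDF
    return out
```

Because the mapping is a monotone transform of a Gaussian, properties such as crossing rates and extreme-value distributions follow those of translation fields, which is what the thesis exploits.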
Abstract:
Blue mould caused by Penicillium expansum Link is one of the most destructive rots of pome fruit in all growing areas (Snowdon, 1990; Jones and Aldwinckle, 1991; Tonini, 1996). In the past, Penicillium rot was controlled by postharvest fungicide treatments, mainly thiabendazole (TBZ) and benomyl (Hardenburg and Spalding, 1972), but their intensive use led to the appearance of resistant strains and a great reduction of their activity. The aims of the present study were to characterize isolates of Penicillium sp. causing blue mould on pear in Italy by physiological and biochemical parameters, and in particular to differentiate the behavior of the isolates in relation to their sensitivity or resistance to TBZ treatments. We examined the early stage of infection in relation to enzyme activity, local modulation of pH, production of organic acids, and the secondary metabolism of the pathogen. The results described here confirm that the majority of P. expansum isolates from pear packing houses are resistant to TBZ. Among the TBZ-resistant isolates scored in this work, several isolates (RR) showed a higher percentage of conidial germination on TBZ-amended medium than on non-amended medium. This may indicate a stimulatory effect of TBZ on conidial germination. Therefore TBZ treatments are not only ineffective for controlling P. expansum, but may also increase the severity of blue mould on fruit. In the absence of fungicide, the isolates showed a significant difference in infection severity: R and RR isolates are characterized by higher pathogenic fitness on fruit, producing larger lesions than S isolates. These data are supported by the study with laboratory-induced resistant isolates, which shows the lack of correlation between TBZ resistance and osmotic sensitivity, and highlights the association between TBZ resistance and infection severity (Baraldi et al., 2003).
Enzymatic screening gave a positive reaction for esterase, urease and pectinase activity; in addition, the pathogen is able to synthesize a complex of enzymes that degrade the main components of the cell wall, especially pectin and cellulose. Sensitive and resistant isolates are characterized by good pectinase activity, especially polygalacturonase, which, as already reported by several studies (D'hallewin et al., 2004; Prusky et al., 2004), is at the basis of the degradative process of the cell wall. Some cellulase activity was also highlighted, although to a lesser extent; however, the production of cellulase and hemicellulase by P. expansum has not been specifically investigated before, and no other source of information was found in this regard. Twenty isolates of Penicillium expansum were tested in vitro and in vivo for acid production ability and pH drop. We found that the modulation of pH and the extrusion of organic acids were influenced by various parameters. Initial pH: in general, the greatest reduction of pH was observed in isolates grown at pH 7; except for four isolates that maintained the pH of the medium close to 7, the others significantly decreased the pH, to values ranging from 5.5 to 4.1. In extremely acidic conditions (pH 3.0), growth and pH modulation were much lower than in optimal conditions (pH 5.0). R and RR isolates also showed a greater adaptation to environmental conditions than S isolates. Time: although acidification continues for some days, pH modulation is strongest in the early hours (48-72 hours) of the inoculation process. Time also affects the type of organic acids produced: for example, in vitro results showed an initial abundant production of succinic acid, followed by an important production of galacturonic acid. Substrate: there are many differences between the types of acids produced in vitro and in vivo. In vivo, the results showed an abundant production of galacturonic, malic and citric acids, and of some unknown organic acids in smaller concentrations.
Secondary metabolite analysis revealed intra-specific differences. Patulin was found in all isolates, but the most significant reduction was observed between in vitro and in vivo samples. There was no correlation between the concentration of patulin and the percentage of infected fruits; however, samples with a lower infection severity (smaller rotten area) showed a significantly lower mycotoxin concentration than samples with a larger lesion diameter. Beyond patulin, the presence of another secondary metabolite, penitrem A, was detected.
Computer simulation of ordering and dynamics in liquid crystals in the bulk and close to the surface
Abstract:
The aim of this PhD thesis is to investigate the orientational and dynamical properties of liquid crystalline systems, at the molecular level and using atomistic computer simulations, to reach a better understanding of material behavior from a microscopic point of view. In perspective this should make it possible to clarify the relation between micro- and macroscopic properties, with the objective of predicting or confirming experimental results on these systems. In this context, we developed four different lines of work in the thesis. The first one concerns the orientational order and alignment mechanism of rigid solutes of small dimensions dissolved in the nematic phase formed by the 4-pentyl-4'-cyanobiphenyl (5CB) nematic liquid crystal. The orientational distributions of the solutes have been obtained with Molecular Dynamics (MD) simulation and compared with experimental data reported in the literature. We have also verified the agreement between order parameters and the dipolar coupling values measured in NMR experiments. The MD-determined effective orientational potentials have been compared with the predictions of the Maier-Saupe and surface tensor models. The second line concerns the development of a correct parametrization able to reproduce the phase transition properties of a prototype of the oligothiophene semiconductor family: sexithiophene (T6). T6 forms two largely studied crystalline polymorphs and possesses liquid crystalline phases that are still not well characterized. From simulations we detected a phase transition from crystal to liquid crystal at about 580 K, in agreement with available experiments, and in particular we found two LC phases, smectic and nematic. The crystal-smectic transition is associated with a relevant density variation and with strong conformational changes of T6: the molecules in the liquid crystal phase easily assume a bent shape, deviating from the planar structure typical of the crystal.
The third line explores a new approach for calculating the viscosity of a nematic through a virtual experiment resembling the classical falling-sphere experiment. The falling sphere is replaced by a hydrogenated silicon nanoparticle of spherical shape suspended in 5CB, and gravity is replaced by a constant force applied to the nanoparticle in a selected direction. Once the nanoparticle reaches a constant velocity, the viscosity of the medium can be evaluated using Stokes' law. With this method we successfully reproduced the experimental viscosities and viscosity anisotropy of the solvent 5CB. The last line deals with the study of the order induced on nematic molecules by a hydrogenated silicon surface. Gaining predictive power for the anchoring behavior of liquid crystals at surfaces would be a very desirable capability, as many device-related properties depend on the molecular organization close to surfaces. Here we studied, by means of atomistic MD simulations, the flat interface between a hydrogenated (001) silicon surface and a sample of 5CB molecules. We found a planar anchoring of the first layers of 5CB, where surface interactions dominate over the mesogen intermolecular interactions. We also analyzed the 5CB-vacuum interface, finding a homeotropic orientation of the nematic at this interface.
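The final step of the virtual falling-sphere experiment is a direct application of Stokes' law: once the terminal velocity is measured, the viscosity follows from eta = F / (6 * pi * r * v). A minimal helper (hypothetical name; consistent SI units and the low-Reynolds-number regime are assumed):

```python
import math

def stokes_viscosity(force, radius, velocity):
    """Viscosity of the medium from the constant driving force F [N], the
    particle radius r [m] and the terminal velocity v [m/s], via Stokes' law
    F = 6 * pi * eta * r * v  =>  eta = F / (6 * pi * r * v)."""
    return force / (6.0 * math.pi * radius * velocity)
```

Applying the force along different directions relative to the nematic director gives the direction-dependent viscosities, i.e. the viscosity anisotropy the thesis reports.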
Abstract:
An anaerobic consortium, capable of efficiently converting the organic fraction of mechanically sorted municipal solid waste (MS-OFMSW) into methane, was obtained through a dedicated enrichment procedure in a 0.36 L up-flow anaerobic recirculated reactor. This result was obtained after several micro-reactor fed-batch procedures that achieved only limited methanization of the MS-OFMSW.
Abstract:
This thesis work involves various aspects of crystal engineering. Chapter 1 focuses on crystals containing crown ether complexes. Aspects such as the possibility of preparing these materials by non-solution methods, i.e. by direct reaction of the solid components, their thermal behavior, and also isomorphism and interconversion between hydrates are taken into account. In chapter 2 a study is presented aimed at understanding the relationship between the hydrogen bonding capability and the shape of the building blocks chosen to construct crystals. The focus is on the control exerted by shape on the organization of sandwich cations such as cobalticinium, decamethylcobalticinium and bisbenzenechromium(I), and on the aggregation of monoanions, all containing carboxylic and carboxylate groups, into 0-D, 1-D, 2-D and 3-D networks. Reactions conducted in multi-component molecular assemblies, or co-crystals, have been recognized as a way to control reactivity in the solid state. The [2+2] photodimerization of olefins is a successful demonstration of how templated solid-state synthesis can efficiently produce unique materials with remarkable stereoselectivity and under environment-friendly conditions. A demonstration of this synthetic strategy is given in chapter 3. The combination of various types of intermolecular linkages, leading to the formation of highly ordered aggregates and crystalline materials or to random aggregation resulting in an amorphous precipitate, may not go to completion. In such rare cases an aggregation process intermediate between crystalline and amorphous materials is observed, resulting in the formation of a gel, i.e. a viscoelastic solid-like or liquid-like material. In chapter 4 the design of new Low Molecular Weight Gelators is presented. Aspects such as the relationships between molecular structure, crystal packing and gelation properties, and the application of this kind of gel as a medium for the crystal growth of organic molecules, such as APIs, are also discussed.
Abstract:
It is usual to hear a strange short sentence: «Random is better than...». Why is randomness a good solution to a certain engineering problem? There are many possible answers, all of them related to the topic considered. In this thesis I discuss two crucial topics that benefit from randomizing some of the waveforms involved in signal manipulation. In particular, the advantages are obtained by shaping the second-order statistics of antipodal sequences involved in intermediate signal processing stages. The first topic is in the area of analog-to-digital conversion and is named Compressive Sensing (CS). CS is a novel paradigm in signal processing that merges signal acquisition and compression, allowing a signal to be acquired directly in a compressed form. In this thesis, after an ample description of the CS methodology and its related architectures, I present a new approach that tries to achieve high compression by designing the second-order statistics of a set of additional waveforms involved in the signal acquisition/compression stage. The second topic addressed in this thesis is in the area of communication systems; in particular, I focus on ultra-wideband (UWB) systems. One option to produce and decode UWB signals is direct-sequence spreading with multiple access based on code division (DS-CDMA). Focusing on this methodology, I address the coexistence of a DS-CDMA system with a narrowband interferer. To do so, I minimize the joint effect of both multiple access interference (MAI) and narrowband interference (NBI) on a simple matched filter receiver. I show that, when the statistical properties of the spreading sequences are suitably designed, performance improvements are possible with respect to a system exploiting chaos-based sequences minimizing MAI only.
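The CS acquisition step with antipodal waveforms can be sketched as a matrix of +/-1 sequences applied to the signal, y = Phi x with m << n measurements. This is a schematic illustration only: here the sequences are plain i.i.d. random signs, whereas the thesis specifically shapes their second-order statistics; the function names are hypothetical.

```python
import random

def antipodal_matrix(m, n, seed=0):
    """m antipodal (+/-1) sensing waveforms of length n, i.i.d. for illustration."""
    rng = random.Random(seed)
    return [[rng.choice((-1, 1)) for _ in range(n)] for _ in range(m)]

def compress(phi, x):
    """Compressed acquisition y = Phi x: each measurement is the correlation of
    the n-sample signal x with one antipodal waveform (m measurements, m << n)."""
    return [sum(p * xi for p, xi in zip(row, x)) for row in phi]
```

Recovering x from y is then a sparse reconstruction problem (e.g. basis pursuit), which is outside the scope of this sketch.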
Abstract:
The regulation of solid-propellant rocket propulsion systems (Solid Rocket Motors) has always represented one of the main issues related to this type of motor. The absence of any kind of direct control over the combustion process of the solid grain means that internal ballistics prediction has always been the main tool used both to define the optimal motor configuration at the design stage and to analyze any anomalies found experimentally. Local variations in the propellant structure, internal defects, or heterogeneities in the chamber conditions can give rise to alterations of the local burning rate of the propellant and, consequently, to experimental pressure and thrust profiles that differ from those predicted theoretically. Many of the codes currently in use offer a rather simplified approach to the problem, mostly resorting to semi-empirical corrective factors (HUMP factors), without reconstructing the heterogeneities of propellant performance in a more realistic way. This thesis work therefore proposes a new approach to the numerical prediction of the performance of solid-propellant systems, through the development of a new simulation code, named ROBOOST (ROcket BOOst Simulation Tool). Drawing on concepts and techniques from Computer Graphics, this new code is able to reconstruct the grain surface regression process point by point, through the use of a moving triangular mesh. Local variations of the burning rate can thus be easily reproduced, and the internal ballistics computation is carried out by coupling a non-stationary 0D model with a quasi-stationary 1D model. The activity was carried out in collaboration with the company Avio Space Division, and the new code has been successfully applied to the Zefiro 9 motor.
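The quasi-steady part of a 0D internal-ballistics model reduces to a mass balance: propellant mass generated by combustion, rho_p * A_b * a * p**n (with burning rate r = a * p**n), equals the nozzle discharge p * A_t / c_star. The sketch below is the textbook balance, not the ROBOOST code; the function name is hypothetical and consistent units are assumed.

```python
def equilibrium_pressure(a, n, rho_p, A_b, A_t, c_star):
    """Quasi-steady 0D chamber pressure for burning rate r = a * p**n.
    Setting generated mass flow rho_p * A_b * a * p**n equal to the nozzle
    discharge p * A_t / c_star and solving for p gives
        p = (rho_p * a * A_b * c_star / A_t) ** (1 / (1 - n)),
    valid for the usual case n < 1."""
    return (rho_p * a * A_b * c_star / A_t) ** (1.0 / (1.0 - n))
```

Local burning-rate variations (the HUMP-like effects mentioned above) enter through A_b and a; in ROBOOST they are resolved per mesh triangle rather than lumped into this single balance.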
Abstract:
This thesis, after presenting recent advances obtained for the two-dimensional bin packing problem, focuses on the case where guillotine restrictions are imposed. A mathematical characterization of non-guillotine patterns is provided, and the relation between the solution value of the two-dimensional problem with guillotine restrictions and that of the unrestricted two-dimensional problem is studied from a worst-case perspective. Finally, a new heuristic algorithm for the two-dimensional problem with guillotine restrictions, based on partial enumeration, is presented, and its performance is computationally evaluated on a large set of instances from the literature. Computational experiments show that the algorithm is able to produce proven optimal solutions for a large number of problems, and gives a tight approximation of the optimum in the remaining cases.
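A guillotine pattern is one that can be recursively separated by full edge-to-edge cuts. A brute-force feasibility check makes the definition concrete; this is a schematic sketch (exponential in the worst case and unrelated to the thesis algorithm), with a hypothetical function name.

```python
def is_guillotine(rects):
    """rects: list of placed items (x, y, w, h). True if the placement can be
    fully separated by a recursive sequence of edge-to-edge (guillotine) cuts."""
    if len(rects) <= 1:
        return True
    edges_x = sorted({x for x, _, w, _ in rects} | {x + w for x, _, w, _ in rects})
    edges_y = sorted({y for _, y, _, h in rects} | {y + h for _, y, _, h in rects})
    for cut in edges_x[1:-1]:                       # candidate vertical cuts
        left = [r for r in rects if r[0] + r[2] <= cut]
        right = [r for r in rects if r[0] >= cut]
        if left and right and len(left) + len(right) == len(rects):
            if is_guillotine(left) and is_guillotine(right):
                return True
    for cut in edges_y[1:-1]:                       # candidate horizontal cuts
        low = [r for r in rects if r[1] + r[3] <= cut]
        high = [r for r in rects if r[1] >= cut]
        if low and high and len(low) + len(high) == len(rects):
            if is_guillotine(low) and is_guillotine(high):
                return True
    return False                                    # no full cut separates the items
```

The classic counterexample is a pinwheel of four rectangles around a central hole: every candidate cut crosses some item, so no guillotine sequence exists.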