9 results for: Cobalt aluminate. Curing method of complex (CPM). Inorganic pigments. Fractional factorial planning
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
The present PhD thesis focused on the development and application of an analytical methodology (Py-GC-MS) and of data-processing methods based on multivariate data analysis (chemometrics). The chromatographic and mass-spectrometric data obtained with this technique are particularly suitable for interpretation by chemometric methods such as PCA (Principal Component Analysis) for data exploration and SIMCA (Soft Independent Modelling of Class Analogy) for classification. As a first approach, some issues related to the field of cultural heritage were addressed, with particular attention to the differentiation of binders used in paintings. A marker of egg tempera, esterified phosphoric acid, a pyrolysis product of lecithin, was determined using HMDS (hexamethyldisilazane) rather than TMAH (tetramethylammonium hydroxide) as the derivatizing reagent. The validity of analytical pyrolysis as a tool to characterize and classify different types of bacteria was then verified. The FAME chromatographic profiles represent an important tool for bacterial identification. Because of the complexity of the chromatograms, the bacteria could be characterized only at the genus level, while differentiation at the species level was achieved by means of chemometric analysis. To perform this study, the normalized peak areas of the fatty acids were taken into account. Chemometric methods were applied to the experimental datasets. The results obtained demonstrate the effectiveness of analytical pyrolysis and chemometric analysis for the rapid characterization of bacterial species. The approach was also applied to samples of bacterial (Pseudomonas mendocina), fungal (Pleurotus ostreatus) and mixed biofilms. A comparison of the chromatographic profiles established the possibility to:
• differentiate the bacterial and fungal biofilms according to their FAME profiles;
• characterize the fungal biofilm by means of the typical pattern of pyrolytic fragments derived from the saccharides present in the cell wall;
• identify the markers of the bacterial and the fungal biofilm in the same mixed-biofilm sample.
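The chemometric step described above, PCA applied to a matrix of normalized fatty-acid peak areas, can be sketched in a few lines. The peak-area matrix below is made up for illustration only (it is not thesis data); two fictitious "genera" differ mainly in their first two peaks:

```python
import numpy as np

def pca_scores(X, n_components=2):
    """PCA on a matrix of normalized peak areas (rows = samples,
    columns = fatty-acid peaks): mean-center the columns, then project
    onto the leading right singular vectors of the centered matrix."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

# Illustrative normalized FAME profiles (rows sum to 1); not real data.
X = np.array([
    [0.60, 0.25, 0.10, 0.05],
    [0.58, 0.27, 0.09, 0.06],
    [0.20, 0.15, 0.40, 0.25],
    [0.22, 0.13, 0.42, 0.23],
])
scores = pca_scores(X)
```

On such data the first principal component separates the two groups of profiles, which is the kind of exploratory separation that precedes a SIMCA class model.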
Abstract:
The aim of this work is to present various aspects of the numerical simulation of particle and radiation transport for industrial and environmental-protection applications, enabling the analysis of complex physical processes in a fast, reliable and efficient way. In the first part we deal with the speed-up of the numerical simulation of neutron transport for nuclear reactor core analysis. The convergence properties of the source iteration scheme of the Method of Characteristics, applied to heterogeneous structured geometries, have been enhanced by means of Boundary Projection Acceleration, enabling the study of 2D and 3D geometries with transport theory without spatial homogenization. The computational performance has been verified with the C5G7 2D and 3D benchmarks, showing a substantial reduction in iterations and CPU time. The second part is devoted to the study of the temperature-dependent elastic scattering of neutrons by heavy isotopes near the thermal region. A numerical computation of the Doppler convolution of the elastic scattering kernel based on the gas model is presented, for a general energy-dependent cross section and scattering law in the centre-of-mass system. The range of integration has been optimized by employing a numerical cutoff, allowing a faster numerical evaluation of the convolution integral. Legendre moments of the transfer kernel are subsequently obtained by direct quadrature, and a numerical analysis of the convergence is presented. In the third part we focus our attention on remote-sensing applications of radiative transfer employed to investigate the Earth's cryosphere. The photon transport equation is applied to simulate the reflectivity of glaciers while varying the age of the layer of snow or ice, its thickness, the presence or absence of underlying layers, and the amount of dust included in the snow, creating a framework able to decipher the spectral signals collected by orbiting detectors.
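The direct-quadrature step for the Legendre moments of a transfer kernel can be illustrated with a minimal sketch. The angular kernel used here is a made-up isotropic-plus-linear example, not the Doppler-broadened kernel of the thesis; the point is only the Gauss-Legendre evaluation of the moment integrals:

```python
import numpy as np
from numpy.polynomial.legendre import leggauss, legval

def legendre_moments(kernel, lmax, npts=64):
    """Moments f_l = (2l+1)/2 * int_{-1}^{1} P_l(mu) * kernel(mu) dmu,
    evaluated by Gauss-Legendre quadrature on npts nodes."""
    mu, w = leggauss(npts)
    k = kernel(mu)
    return np.array([
        (2 * l + 1) / 2 * np.sum(w * legval(mu, [0] * l + [1]) * k)
        for l in range(lmax + 1)
    ])

# Illustrative angular kernel: isotropic term plus a mild forward bias.
f = legendre_moments(lambda mu: 0.5 + 0.3 * mu, lmax=3)
```

For this polynomial kernel the quadrature is exact: the zeroth moment is 0.5, the first is 0.3, and all higher moments vanish by orthogonality, which gives a quick convergence check of the quadrature order.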
Abstract:
Traditional software engineering approaches and metaphors fall short when applied to areas of growing relevance such as electronic commerce, enterprise resource planning, and mobile computing: such areas, in fact, generally call for open architectures that may evolve dynamically over time so as to accommodate new components and meet new requirements. This is probably one of the main reasons why the agent metaphor and the agent-oriented paradigm are gaining momentum in these areas. This thesis deals with the engineering of complex software systems in terms of the agent paradigm. This paradigm is based on the notions of agent and of systems of interacting agents as fundamental abstractions for designing, developing and managing at runtime typically distributed software systems. However, today the engineer often works with technologies that do not support the abstractions used in the design of the systems. For this reason, research on methodologies becomes a central point of the scientific activity. Currently most agent-oriented methodologies are supported by small teams of academic researchers; as a result, most of them are at an early stage, still in the context of mostly "academic" approaches to agent-oriented systems development. Moreover, such methodologies are not well documented and are very often defined and presented by focusing only on specific aspects of the methodology. The role played by meta-models therefore becomes fundamental for comparing and evaluating methodologies. In fact, a meta-model specifies the concepts, rules and relationships used to define methodologies. Although it is possible to describe a methodology without an explicit meta-model, formalising the underpinning ideas of the methodology in question is valuable when checking its consistency or planning extensions or modifications. A good meta-model must address all the different aspects of a methodology, i.e. the process to be followed, the work products to be generated and those responsible for making all this happen. In turn, specifying the work products that must be developed implies defining the basic modelling building blocks from which they are built. As a building block, the agent abstraction alone is not enough to fully model all the aspects related to multi-agent systems in a natural way. In particular, different perspectives exist on the role that the environment plays within agent systems; it is at least clear, however, that all non-agent elements of a multi-agent system are typically considered part of the multi-agent system environment. The key role of the environment as a first-class abstraction in the engineering of multi-agent systems is today generally acknowledged in the multi-agent systems community, so the environment should be explicitly accounted for in the engineering of multi-agent systems, working as a new design dimension for agent-oriented methodologies. At least two main ingredients shape the environment: environment abstractions - entities of the environment encapsulating some functions - and topology abstractions - entities of the environment that represent its (either logical or physical) spatial structure. In addition, the engineering of non-trivial multi-agent systems requires principles and mechanisms for managing the complexity of the system representation. These principles lead to the adoption of a multi-layered description, which designers can use to provide different levels of abstraction over multi-agent systems. The research in these fields has led to the formulation of a new version of the SODA methodology, in which environment abstractions and layering principles are exploited for engineering multi-agent systems.
Abstract:
Complex network analysis has turned out to be a very promising field of research, as testified by the many research projects and works that span different fields. Such analyses have usually focused on characterizing a single aspect of the system, and a study that considers the many informative axes along which a network evolves is lacking. We propose a new multidimensional analysis that is able to inspect networks along the two most important dimensions, space and time. To achieve this goal, we studied them singly and investigated how the variation of the constituting parameters drives changes in the network as a whole. Focusing on the space dimension, we characterized spatial alteration in terms of abstraction levels. We proposed a novel algorithm that, by applying a fuzziness function, can reconstruct networks at different levels of detail. We verified that statistical indicators depend strongly on the granularity with which a system is described and on the class of networks. We then kept the space axis fixed and isolated the dynamics behind the network evolution process. We detected new incentives that trigger social network utilization and spread the adoption of novel communities. We formalized this enhanced social network evolution by adopting special nodes (called sirens) that, thanks to their ability to attract new links, are able to construct efficient connection patterns. We simulated the dynamics of the system by considering three well-known growth models. Applying this framework to real and synthetic networks, we showed that the sirens, even when used for a limited time span, effectively shrink the time needed to bring a network to a mature state. In order to provide a concrete context for our findings, we formalized the cost of setting up such an enhancement and provided the best combinations of the system's parameters, such as the number of sirens, their time span of utilization and their attractiveness.
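A minimal sketch of the siren idea, grafted onto a plain preferential-attachment growth process: one node (node 0) plays the siren, its attractiveness boosted while it is active so that it draws extra links. The boost value, the activation window, and the single-siren, one-edge-per-newcomer setup are illustrative assumptions, not the three growth models used in the thesis:

```python
import random

def grow_network(n_nodes, siren_boost=10.0, siren_active_until=50, seed=0):
    """Grow a network by preferential attachment, with node 0 acting as a
    'siren': while active, its selection weight is multiplied by siren_boost.
    Returns the degree map and the edge list. Illustrative sketch only."""
    rng = random.Random(seed)
    degree = {0: 1, 1: 1}          # seed graph: a single edge 0-1
    edges = [(0, 1)]
    for new in range(2, n_nodes):
        nodes = list(degree)
        weights = []
        for v in nodes:
            w = degree[v]
            if v == 0 and new < siren_active_until:
                w *= siren_boost   # siren attracts extra links while active
            weights.append(w)
        target = rng.choices(nodes, weights=weights)[0]
        edges.append((new, target))
        degree[new] = 1
        degree[target] += 1
    return degree, edges

deg, edges = grow_network(200)
```

Comparing runs with and without the boost (siren_boost=1.0) is the kind of experiment that lets one measure how much the siren accelerates the emergence of hubs, i.e. how quickly the network reaches a mature state.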
Abstract:
In Chapter 1 I present a brief introduction to the state of the art of nanotechnologies and nanofabrication techniques, and unconventional lithography is introduced as a technique to fabricate a novel electronic device, the resistive switch or so-called memristor. In Chapter 2 a detailed description of the main fabrication and characterization techniques employed in this work is reported. Chapter 3 describes parallel local oxidation lithography (pLOx) as the main technique to obtain an accurate patterning process. All the effective parameters have been studied, and under the optimized conditions the process proved highly reproducible, with excellently patterned nanostructures. The effect of negative bias, called local reduction (LR), was also studied. Moreover, AC bias yields a faster patterning process than DC bias. In Chapter 4 (metal/e-SiO2/Si nanojunction) it is shown how the electrochemical oxide nanostructures obtained by pLOx can be used in the fabrication of a novel device called the memristor. We demonstrate a new concept, based on conventional materials, in which the lifetime problem is resolved by introducing a "regeneration" step, which restores the nano-memristor to its pristine condition by applying an appropriate voltage cycle. In Chapter 5 (graphene/e-SiO2/Si), graphene is used as a building-block material and as an electrode to selectively oxidize the silicon substrate with the pLOx set-up for the fabrication of a novel resistive switch device. In Chapter 6 (surface architecture) another application of pLOx, in biotechnology, is shown: surface functionalization combined with nano-patterning by pLOx is used to design a new surface that accurately binds biomolecules, with the possibility of studying their properties and of further applications in nano-bio device fabrication. Thus, in order to obtain biochips and electronic and optical/photonic devices, nano-patterning of DNA is used to create scaffolds for the fabrication of small functional nano-components.
Abstract:
The study of polymorphism plays an important role in several fields of materials science, because structural differences lead to different physico-chemical properties of the system. This PhD work was dedicated to the investigation of polymorphism in Indigo, Thioindigo and Quinacridone, as case studies among the organic pigments employed as semiconductors, and in Paracetamol, Phenytoin and Nabumetone, chosen among some commonly used APIs. The aim of the research was to improve the understanding of the structures of bulk crystals and thin films, adopting Raman spectroscopy as the method of choice, while resorting to other experimental techniques to complement the gathered information. Different crystalline polymorphs, in fact, may be conveniently distinguished by their Raman spectra in the region of the lattice phonons (10-150 cm⁻¹), whose frequencies, probing the inter-molecular interactions, are very sensitive to even slight modifications of the molecular packing. In particular, we have used confocal Raman microscopy, which is a powerful, yet simple, technique for the investigation of crystal polymorphism in organic and inorganic materials, being capable of monitoring physical modifications, chemical transformations and phase inhomogeneities in crystal domains at the micrometre scale. In this way, we have investigated bulk crystals and thin-film samples obtained with a variety of crystal growth and deposition techniques. Pure polymorphs and samples with phase mixing were found and fully characterized. Raman spectroscopy was complemented mainly by XRD measurements for bulk crystals and by AFM, GIXD and TEM for thin films. The structures and phonons of the investigated polymorphs were computed by DFT methods, and the comparison between theoretical and experimental results was used to assess the relative stability of the polymorphs and to assist the spectroscopic investigation. The Raman measurements were thus found to be able to clarify ambiguities in the phase assignments which the other methods alone were unable to resolve.
Abstract:
In this study, lubrication theory is used to model flow in geological fractures and to analyse the compound effect of medium heterogeneity and complex fluid rheology. Such studies are warranted because the Newtonian rheology is adopted in most numerical models for its ease of use, despite non-Newtonian fluids being ubiquitous in subsurface applications. Past studies on Newtonian and non-Newtonian flow in single rock fractures are summarized in Chapter 1. Chapter 2 presents analytical and semi-analytical conceptual models for the flow of a shear-thinning fluid in rock fractures having a simplified geometry, providing a first insight into their permeability. In Chapter 3, a lubrication-based 2-D numerical model is implemented to solve the flow of an Ellis fluid in rough fractures; the finite-volume model developed is more computationally effective than conducting full 3-D simulations, and introduces an acceptable approximation as long as the flow is laminar and the fracture walls are relatively smooth. The compound effect of the shear-thinning nature of the fluid and of fracture heterogeneity promotes flow localization, which in turn affects the performance of industrial activities and remediation techniques. In Chapter 4, a Monte Carlo framework is adopted to produce multiple realizations of synthetic fractures and to analyze their ensemble statistics pertaining to flow for a variety of real non-Newtonian fluids; the Newtonian case is used as a benchmark. In Chapter 5 and Chapter 6, a conceptual model of the hydro-mechanical aspects of the backflow occurring in the last phase of hydraulic fracturing is proposed and experimentally validated, quantifying the effects of the relaxation induced by the flow.
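In the parallel-plate limit of a smooth fracture, the flux of an Ellis fluid admits a classical closed form that reduces to the Newtonian cubic law when the shear-thinning correction vanishes. The sketch below implements that textbook slot-flow result with illustrative parameter values; it is not the thesis's 2-D finite-volume model:

```python
def ellis_slot_flux(G, h, eta0, tau_half, alpha):
    """Volumetric flux per unit width of an Ellis fluid between smooth
    parallel plates of half-aperture h under pressure gradient G.
    Ellis model: shear rate = (tau/eta0) * (1 + (tau/tau_half)**(alpha-1)).
    The bracketed correction -> 1 (Newtonian cubic law) as tau_half -> inf."""
    tau_w = G * h                                   # wall shear stress
    newtonian = 2.0 * G * h**3 / (3.0 * eta0)       # cubic-law flux
    correction = 1.0 + 3.0 / (alpha + 2.0) * (tau_w / tau_half) ** (alpha - 1.0)
    return newtonian * correction

# Illustrative values (SI units), not thesis data:
q = ellis_slot_flux(G=1e3, h=1e-4, eta0=1e-3, tau_half=10.0, alpha=2.0)
```

The correction factor grows with the wall shear stress, which is the mechanism by which a shear-thinning fluid concentrates flow in the wider, more stressed portions of a heterogeneous fracture.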
Abstract:
The experimental projects discussed in this thesis are all related to the field of artificial molecular machines, specifically to systems composed of pseudorotaxane and rotaxane architectures. The characterization of the peculiar properties of these mechano-molecules is frequently associated with the analysis and elucidation of complex reaction networks; this latter aspect represents the main focus and central thread tying together my thesis work. In each chapter, a specific project is described, as summarized below: the focus of the first chapter is the realization and characterization of a prototype model of a photoactivated molecular transporter based on a pseudorotaxane architecture; the second chapter reports the design, synthesis and characterization of a [2]rotaxane endowed with a dibenzylammonium station and a novel photochromic unit that acts as a recognition site for a DB24C8 crown ether macrocycle; the last chapter describes the synthesis and characterization of a [3]rotaxane in which the relative number of rings and stations can be changed on command.
Abstract:
Background: The frozen elephant trunk (FET) technique is one of the latest evolutions in the treatment of complex pathologies of the aortic arch and the descending thoracic aorta. Materials and methods: Between January 2007 and March 2021, a total of 396 patients underwent total aortic arch replacement with the FET technique in our centre. The main indications were thoracic aortic aneurysm (n=104, 28.2%), chronic aortic dissection (n=224, 53.4%) and acute aortic dissection (n=68, 18.4%). We divided the population into two groups according to the position of the distal anastomosis (zone 2 vs zone 3) and the length of the stent graft (< 150 mm vs > 150 mm): a conservative group (zone 2 anastomosis + stent length < 150 mm, n=140 patients) and an aggressive group (zone 3 anastomosis + stent length > 150 mm, n=141). Results: The overall 30-day mortality rate was 13% (48/369); the risk-factor analysis showed that an aggressive approach was a risk factor neither for major complications (permanent dialysis, tracheostomy, bowel malperfusion and permanent paraplegia) nor for 30-day mortality. The survival rate at 1, 5, 10 and 15 years was 87.7%, 75%, 61.3% and 58.4% respectively. During the follow-up, an aortic reintervention was performed in 122 patients (38%), and 5 patients underwent non-aortic cardiac surgery. Freedom from aortic reintervention at 1, 5 and 10 years was 77%, 54% and 44% respectively. Freedom from aortic reintervention was higher in the 'aggressive' group (62.5% vs 40.0% at 5 years, log-rank p=0.056). An aggressive approach was protective neither against aortic reintervention nor against death at follow-up. Conclusions: The FET technique represents a feasible and efficient option in the treatment of complex thoracic aortic pathologies. Aortic reintervention after FET is very common, and the decision-making process should weigh the higher risk of post-operative complications entailed by an aggressive approach against the higher risk of a second aortic reintervention at follow-up.