19 results for effect and process

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance:

100.00%

Publisher:

Abstract:

The objective of the current thesis is to investigate the temporal dynamics (i.e., time courses) of the Simon effect, from both a theoretical and an experimental point of view, in order to better understand (a) whether one or more processes are responsible for the Simon effect and (b) how this mechanism, or these mechanisms, influence performance in different ways. In the first, theoretical part (“Theoretical Overview”), I examined in detail the rationale for analyzing the temporal dynamics of the Simon effect and the assumptions underlying the interpretation of the results obtained in the existing literature so far. In the second part (“Experimental Investigations”), I experimentally investigated several issues that the existing literature had left unsolved, in order to gather further evidence for or against the mainstream models currently used to account for the different Simon effect time courses. Some points about the experiments are worth mentioning. First, all the experiments were conducted in the laboratory: participants were presented with stimuli on a PC screen and their responses were recorded. Both stimulus presentation and response collection were controlled by the E-Prime software. The dependent variables of interest were always behavioral measures of performance, namely response speed and accuracy. Second, most of my experiments were conducted at the Communication Sciences Department (University of Bologna), under Prof. Nicoletti’s supervision. The remainder were conducted at the Psychological Sciences Department of Purdue University (West Lafayette, Indiana, USA), where I collaborated for one year as a visiting student with Prof. Proctor and his team. Third, my experimental pool was entirely composed of healthy young students, since the cognitive functioning of elderly people was not the target of my research.
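Time-course analyses of the Simon effect are typically carried out with delta plots: reaction times are ordered, split into quantile bins, and the Simon effect (incongruent minus congruent mean RT) is computed per bin. The sketch below is a minimal illustration of this idea with hypothetical RT data; the function names and the binning choice are assumptions, not the thesis's actual analysis pipeline.

```python
import statistics

def delta_plot(congruent_rts, incongruent_rts, n_bins=5):
    """Compute a delta plot: the Simon effect (incongruent - congruent
    mean RT) within each of n_bins quantile bins of the RT distribution."""
    def bin_means(rts):
        ordered = sorted(rts)
        size = len(ordered) // n_bins  # trailing remainder values are dropped
        return [statistics.mean(ordered[i * size:(i + 1) * size])
                for i in range(n_bins)]
    cong = bin_means(congruent_rts)
    incong = bin_means(incongruent_rts)
    return [ic - c for c, ic in zip(cong, incong)]

# Hypothetical data: a Simon effect that shrinks for slower responses,
# the decreasing delta-plot shape often reported for the Simon task
congruent = [300, 320, 340, 360, 380, 400, 420, 440, 460, 480]
incongruent = [330, 348, 364, 378, 392, 408, 424, 442, 462, 482]
print(delta_plot(congruent, incongruent, n_bins=5))
```

A decreasing sequence of deltas across bins is the signature usually taken as evidence that the interfering spatial code decays over time.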

Relevance:

100.00%

Publisher:

Abstract:

This research aims to contribute to a better understanding of changes in local governments’ accounting and reporting practices: in particular, ‘why’, ‘what’ and ‘how’ environmental aspects are included, and the significance of changes across time. It adopts an interpretative approach to conduct a longitudinal analysis of case studies. Pettigrew and Whipp’s framework on context, content and process is used as a lens to distinguish changes along each dimension and to analyse their interconnections. Data are collected from official documents and triangulated with semi-structured interviews. The legal framework defines the boundaries of the accounting information as the territory under local governments’ jurisdiction and its immediate surrounding area. Organisational environmental performance and externalities are excluded from the requirements. An interplay between the local outer context, political commitment and organisational culture explains the implementation of changes beyond what is regulated, as well as the implementation of transformational changes. Local governments engage in international networks to gain access to funding and implement changes, which leads them to adopt the dominant environmental agenda. Key stakeholders, such as citizens, are not engaged in the accounting and reporting process. Thus, there is no evidence that the environmental aspects addressed, and the related changes, align with stakeholders’ needs and expectations, which jeopardises their significance. Findings from the current research have implications for other EU member states, due to the harmonisation of accounting and reporting practices and the common practice across the EU of using external funding to conceptualise and implement changes. This implies that other local governments could also be presenting a limited account of environmental aspects.

Relevance:

100.00%

Publisher:

Abstract:

Nowadays, it is clear that the goal of creating a sustainable future for the next generations requires re-thinking the industrial application of chemistry. It is also evident that more sustainable chemical processes may be economically convenient in comparison with conventional ones, because fewer by-products mean lower costs for raw materials, separation and disposal treatments; they also imply an increase in productivity and, as a consequence, allow smaller reactors to be used. In addition, an indirect gain could derive from the better public image of a company marketing sustainable products or processes. In this context, oxidation reactions play a major role, being the tool for the production of huge quantities of chemical intermediates and specialties. Potentially, the impact of these productions on the environment could have been much worse than it is, had continuous effort not been spent on improving the technologies employed. Substantial technological innovations have driven the development of new catalytic systems and the improvement of reaction and process technologies, contributing to moving the chemical industry towards a more sustainable and ecological approach. The roadmap for the application of these concepts includes new synthetic strategies, alternative reactants, catalyst heterogenisation, and innovative reactor configurations and process design. In order to turn all these ideas into real projects, the development of more efficient reactions is a primary target. Yield, selectivity and space-time yield are the right metrics for evaluating reaction efficiency. In the case of catalytic selective oxidation, the control of selectivity has always been the principal issue, because the formation of total oxidation products (carbon oxides) is thermodynamically more favoured than the formation of the desired, partially oxidized compound.
As a matter of fact, total, or close to total, conversion is achieved in only a few oxidation reactions, and selectivity is usually limited by the formation of by-products or co-products, which often implies unfavourable process economics; moreover, the cost of the oxidant sometimes further penalizes the process. During my PhD work, I investigated four reactions that are emblematic of the new approaches used in the chemical industry. In Part A of my thesis, a new process aimed at a more sustainable production of menadione (vitamin K3) is described. The “greener” approach includes the use of hydrogen peroxide in place of chromate (moving from a stoichiometric oxidation to a catalytic one), thereby also avoiding the production of dangerous waste. Moreover, I studied the possibility of using a heterogeneous catalytic system able to efficiently activate hydrogen peroxide. The overall process would be carried out in two steps: the first is the methylation of 1-naphthol with methanol to yield 2-methyl-1-naphthol; the second is the oxidation of the latter compound to menadione. The catalyst for this latter step, the reaction that was the object of my investigation, consists of Nb2O5-SiO2 prepared by the sol-gel technique. The catalytic tests were first carried out under conditions simulating the in-situ generation of hydrogen peroxide, that is, using a low concentration of the oxidant. Experiments were then carried out using higher hydrogen peroxide concentrations. The study of the reaction mechanism was fundamental to obtaining indications about the best operating conditions and improving the selectivity to menadione. In Part B, I explored the direct oxidation of benzene to phenol with hydrogen peroxide. The current industrial process for phenol is the oxidation of cumene with oxygen, which also co-produces acetone.
This can be considered a case in which economics drives the sustainability issue; in fact, the new process, which yields phenol directly, not only avoids the co-production of acetone (a burden for phenol, because the market requirements for the two products are quite different), but might also be economically convenient with respect to the conventional process, provided a high selectivity to phenol were obtained. Titanium silicalite-1 (TS-1) is the catalyst chosen for this reaction. By comparing the reactivity results obtained with several TS-1 samples having different chemical-physical properties, and by analyzing in detail the effect of the most important reaction parameters, we could formulate some hypotheses concerning the reaction network and mechanism. Part C of my thesis deals with the hydroxylation of phenol to hydroquinone and catechol. This reaction is already applied industrially but, for economic reasons, an increase in the selectivity to the para di-hydroxylated compound and a decrease in the selectivity to the ortho isomer would be desirable. In this case too, the catalyst used was TS-1. The aim of my research was to find a method to control the selectivity ratio between the two isomers and, ultimately, to make the industrial process more flexible, so that process performance can be adapted to fluctuations in market requirements. The reaction was carried out both in a batch stirred reactor and in a re-circulating fixed-bed reactor. In the first system, the effect of various reaction parameters on catalytic behaviour was investigated: the type of solvent or co-solvent, and the particle size. With the second reactor type, I investigated the possibility of using a continuous system, with the catalyst shaped into extrudates (instead of powder), in order to avoid the catalyst filtration step. Finally, Part D deals with the study of a new process for the valorisation of glycerol through its transformation into valuable chemicals.
This molecule is nowadays produced in large amounts as a co-product of biodiesel synthesis; it is therefore considered a raw material from renewable resources (a bio-platform molecule). Initially, we tested the oxidation of glycerol in the liquid phase, with hydrogen peroxide and TS-1; however, the results achieved were not satisfactory. We then investigated the gas-phase transformation of glycerol into acrylic acid, with the intermediate formation of acrolein: the latter can be obtained by dehydration of glycerol and then oxidized into acrylic acid. Since the oxidation step from acrolein to acrylic acid is already optimized at the industrial level, we decided to investigate the first step of the process in depth. I studied the reactivity of heterogeneous acid catalysts based on sulphated zirconia. Tests were carried out under both aerobic and anaerobic conditions, in order to investigate the effect of oxygen on the catalyst deactivation rate (a major problem usually encountered in glycerol dehydration). Finally, I studied the reactivity of bifunctional systems made of Keggin-type polyoxometalates, either alone or supported on sulphated zirconia, thereby combining the acid functionality (necessary for the dehydration step) with the redox one (necessary for the oxidation step). In conclusion, during my PhD work I investigated reactions that apply “green chemistry” rules and strategies; in particular, I studied new, greener approaches for the synthesis of chemicals (Parts A and B), the optimisation of reaction parameters to make an oxidation process more flexible (Part C), and the use of a bio-platform molecule for the synthesis of a chemical intermediate (Part D).
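The abstract names yield, selectivity and space-time yield as the metrics for evaluating reaction efficiency. The following sketch encodes their standard textbook definitions (the numbers are purely illustrative, not data from the thesis):

```python
def conversion(moles_in, moles_out):
    """Fraction of the fed reactant that was converted."""
    return (moles_in - moles_out) / moles_in

def selectivity(moles_product, moles_reactant_converted, stoich=1.0):
    """Fraction of the converted reactant that ended up in the desired
    product (stoich = moles of reactant per mole of product)."""
    return moles_product / (stoich * moles_reactant_converted)

def yield_(moles_product, moles_reactant_in, stoich=1.0):
    """Yield = conversion x selectivity, computed directly."""
    return moles_product / (stoich * moles_reactant_in)

def space_time_yield(mass_product_kg, reactor_volume_m3, time_h):
    """Product mass per unit reactor volume per unit time (kg m^-3 h^-1)."""
    return mass_product_kg / (reactor_volume_m3 * time_h)

# Illustrative run: 100 mol fed, 20 mol left unreacted, 60 mol product
X = conversion(100, 20)     # 0.8
S = selectivity(60, 80)     # 0.75
Y = yield_(60, 100)         # 0.6, i.e. X * S
```

The identity yield = conversion × selectivity is what makes selectivity control the key lever when conversion is already high, as in the selective oxidations discussed above.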

Relevance:

100.00%

Publisher:

Abstract:

The research is aimed at contributing to the identification of reliable, fully predictive Computational Fluid Dynamics (CFD) methods for the numerical simulation of equipment typically adopted in the chemical and process industries. The apparatuses selected for the investigation, specifically membrane modules, stirred vessels and fluidized beds, are characterized by different and often complex fluid dynamic behaviours, and in some cases the momentum transfer phenomena are coupled with mass transfer or multiphase interactions. First of all, a novel CFD-based modelling approach for the prediction of the gas separation process in membrane modules for hydrogen purification is developed. The reliability of the numerically calculated gas velocity field is assessed by comparing the predictions with experimental velocity data collected by Particle Image Velocimetry, while the ability of the model to properly predict the separation process under a wide range of operating conditions is assessed through a strict comparison with experimental permeation data. Then, the effect of numerical issues on the RANS-based predictions of single-phase stirred tanks is analysed. The homogenisation process of a scalar tracer is also investigated, and simulation results are compared with original passive tracer homogenisation curves determined by Planar Laser Induced Fluorescence. The capability of a CFD approach based on the solution of the RANS equations to describe the fluid dynamic characteristics of the dispersion of organics in water is also investigated. Finally, an Eulerian-Eulerian fluid dynamic model is used to simulate mono-disperse suspensions of Geldart Group A particles fluidized by a Newtonian incompressible fluid, as well as binary segregating fluidized beds of particles differing in size and density.
The results obtained under a number of different operating conditions are compared with experimental data from the literature, and the effect of numerical uncertainties on axial segregation is also discussed.
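Validation against PIV velocity data, as described above, requires a quantitative measure of model-experiment agreement. One generic choice (an assumption for illustration, not necessarily the metric used in the thesis) is the root-mean-square error between predicted and measured velocities at matched points, normalized by the measured velocity range:

```python
import math

def nrmse(predicted, measured):
    """Root-mean-square error between CFD-predicted and measured (e.g. PIV)
    velocities at matched points, normalized by the measured velocity range."""
    if len(predicted) != len(measured):
        raise ValueError("profiles must be sampled at the same points")
    rmse = math.sqrt(
        sum((p - m) ** 2 for p, m in zip(predicted, measured)) / len(measured)
    )
    return rmse / (max(measured) - min(measured))

# Illustrative axial-velocity profiles (m/s) at five matched PIV points
cfd = [0.10, 0.22, 0.31, 0.38, 0.41]
piv = [0.12, 0.20, 0.30, 0.40, 0.40]
print(f"NRMSE = {nrmse(cfd, piv):.3f}")
```

A small normalized error across many operating conditions is the kind of evidence needed to call a CFD method "fully predictive".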

Relevance:

100.00%

Publisher:

Abstract:

The effect of the soil incorporation of seven Meliaceae derivatives (six commercial neem cakes and leaves of Melia azedarach L.) on C and N dynamics and on nutrient availability to the micropropagated GF677 rootstock was investigated. In a first laboratory incubation experiment, the derivatives showed different N mineralization dynamics, generally well predicted by their C:N ratio and only partly by their initial N concentration. All derivatives increased microbial biomass C, thus representing a source of C for the soil microbial population. Soil addition of all neem cakes (8 g kg-1) and melia leaves (16 g kg-1) had a positive effect on plant growth and increased root N uptake and leaf green colour of micropropagated GF677 plants. In addition, the neem cakes characterized by a higher nutrient concentration increased P and K concentrations in shoots and leaves 68 days after the amendment. In another experiment, the soil incorporation of 15N-labelled melia leaves (16 g kg-1) had no effect on the total amount of plant N; however, the percentage of melia-derived N in treated plants ranged between 0.8% and 34% during the experiment. At the end of the growing season, about 7% of the N added as melia leaves was recovered in the plant, while 70% of it was still present in the soil. Real C mineralization and the priming effect induced by the addition of the derivatives were quantified by a natural 13C abundance method. The real C mineralization of the derivatives ranged between 22% and 40% of the added C. All the derivatives studied induced a positive priming effect and, 144 days after the amendment, the amount of C primed corresponded to 26% of the added C for all the derivatives. Despite this substantial priming effect, the C balance of the soil 144 days after the amendment was always positive.
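The natural 13C abundance method mentioned above partitions respired CO2-C between residue-derived and soil-derived pools with a two-pool isotopic mixing equation; the priming effect is then the extra soil-derived CO2 relative to an unamended control. A minimal sketch, with illustrative delta-13C values and CO2 amounts (all numbers are assumptions, not the thesis's data):

```python
def residue_fraction(delta_total, delta_soil, delta_residue):
    """Two-pool 13C mixing model: fraction of respired CO2-C derived
    from the added residue, from the delta-13C (per mil) of total respired
    CO2, of the unamended soil, and of the residue."""
    return (delta_total - delta_soil) / (delta_residue - delta_soil)

def priming_effect(co2_amended, f_residue, co2_control):
    """Primed C: soil-derived CO2-C in the amended soil minus the CO2-C
    respired by the unamended control (same units, e.g. mg C per kg soil)."""
    soil_derived = co2_amended * (1.0 - f_residue)
    return soil_derived - co2_control

# Illustrative values: C3-type soil vs an isotopically distinct residue
f = residue_fraction(delta_total=-22.0, delta_soil=-26.0, delta_residue=-14.0)
primed = priming_effect(co2_amended=300.0, f_residue=f, co2_control=150.0)
```

A positive `primed` value corresponds to the positive priming effect reported in the abstract: the amendment accelerated the mineralization of native soil C.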

Relevance:

100.00%

Publisher:

Abstract:

This study is focused on the radio-frequency inductively coupled thermal plasma (ICP) synthesis of nanoparticles, combining experimental and modelling approaches towards process optimization and industrial scale-up, in the framework of the FP7-NMP SIMBA European project (Scaling-up of ICP technology for continuous production of Metallic nanopowders for Battery Applications). First, the state of the art of nanoparticle production through conventional and plasma routes is summarized; then, results on the characterization of the plasma source and on the investigation of the nanoparticle synthesis phenomenon are presented, aiming at highlighting fundamental process parameters while adopting a design-oriented modelling approach. In particular, an energy balance of the torch and of the reaction chamber, based on a calorimetric method, is presented, while results of three- and two-dimensional modelling of an ICP system are compared with calorimetric and enthalpy-probe measurements, in order to validate the temperature field predicted by the model and used to characterize the ICP system under powder-free conditions. Moreover, results from the modelling of critical phases of the ICP synthesis process, such as precursor evaporation, vapour conversion into nanoparticles and nanoparticle growth, are presented, with the aim of providing useful insights both for the design and optimization of the process and into the underlying physical phenomena. Indeed, precursor evaporation, one of the phases with the highest impact on the industrial feasibility of the process, is discussed; by employing models describing particle trajectories and thermal histories, adapted from those originally developed for other plasma technologies or applications, such as DC non-transferred arc torches and powder spheroidization, the evaporation of a micro-sized solid Si precursor in a laboratory-scale ICP system is investigated.
Finally, the role of thermo-fluid dynamic fields in nanoparticle formation is discussed, together with a study of the effect of the reaction chamber geometry on the characteristics of the produced nanoparticles and on the process yield.
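The calorimetric energy balance mentioned above typically infers the power removed by each cooling-water circuit from its mass flow rate and temperature rise, and from that an overall torch efficiency. A minimal sketch of that bookkeeping (the numbers and the specific efficiency definition are illustrative assumptions, not the thesis's measured values):

```python
CP_WATER = 4186.0  # specific heat of water, J kg^-1 K^-1

def cooling_power(mass_flow_kg_s, t_out_c, t_in_c):
    """Power removed by a cooling-water circuit (W): P = m_dot * cp * dT."""
    return mass_flow_kg_s * CP_WATER * (t_out_c - t_in_c)

def torch_efficiency(plate_power_w, torch_cooling_w):
    """Fraction of the input power delivered to the plasma, estimated
    as what is NOT carried away by the torch cooling water."""
    return 1.0 - torch_cooling_w / plate_power_w

# Illustrative numbers for a lab-scale ICP torch
p_losses = cooling_power(0.05, 32.0, 20.0)  # W removed by the torch circuit
eta = torch_efficiency(15000.0, p_losses)   # fraction reaching the plasma
```

Summing such terms over the torch and the reaction chamber circuits closes the energy balance and provides the validation data for the temperature-field models discussed in the abstract.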

Relevance:

100.00%

Publisher:

Abstract:

So-called cascading events, which lead to high-impact, low-frequency scenarios, are raising concern worldwide. A chain of events results in a major industrial accident with dreadful (and often unpredicted) consequences. Cascading events can be the result of the realization of an external threat, such as a terrorist attack or a natural disaster, or of the “domino effect”. During domino events, the escalation of a primary accident is driven by the propagation of the primary event to nearby units, causing an overall increase in accident severity and in the risk associated with an industrial installation. Natural disasters, such as intense flooding, hurricanes, earthquakes and lightning, are also capable of increasing the risk of an industrial area, triggering losses of containment of hazardous materials and major accidents. The scientific community usually refers to these accidents as “NaTechs”: natural events triggering industrial accidents. In this document, a state of the art of the available approaches to the modelling, assessment, prevention and management of domino and NaTech events is described. However, the relevant work carried out in past studies still needs to be consolidated and completed in order to be applicable in a real industrial framework. New methodologies, developed during my research activity and aimed at the quantitative assessment of domino and NaTech accidents, are presented. The tools and methods provided in this study are intended to assist the progress towards a consolidated and universal methodology for the assessment and prevention of cascading events, contributing to enhancing the safety and sustainability of the chemical and process industry.
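Quantitative assessment of domino escalation is often built on probit models, which map the intensity of the damage vector received by a target unit (e.g. heat flux) to an escalation probability through the standard normal CDF. The sketch below shows the generic probit-to-probability machinery; the coefficients `a` and `b` and the intensity value are hypothetical placeholders, not models from the thesis.

```python
import math

def probit_to_probability(y):
    """Convert a probit value Y to a probability via the standard
    normal CDF: P = Phi(Y - 5)."""
    return 0.5 * (1.0 + math.erf((y - 5.0) / math.sqrt(2.0)))

def escalation_probability(intensity, a, b):
    """Escalation probability for a damage vector of given intensity,
    using a log-linear probit model Y = a + b * ln(intensity)."""
    y = a + b * math.log(intensity)
    return probit_to_probability(y)

# Hypothetical coefficients and a heat flux of 40 kW/m2 on a target vessel
p = escalation_probability(intensity=40.0, a=-2.0, b=2.0)
```

In a quantitative risk analysis, such probabilities are combined with the primary-event frequency to quantify the overall increase in risk that domino propagation brings to an installation.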

Relevance:

100.00%

Publisher:

Abstract:

In recent years, Additive Manufacturing (AM) has drawn the attention of both academic research and industry, as it might deeply change and improve several industrial sectors. From the material point of view, AM results in a peculiar microstructure that strictly depends on the conditions of the additive process and directly affects the mechanical properties. The present PhD research project aimed at investigating the process-microstructure-properties relationship of additively manufactured metal components. Two technologies belonging to the AM family were considered: Laser-based Powder Bed Fusion (LPBF) and Wire-and-Arc Additive Manufacturing (WAAM). The experimental activity was carried out on different metals of industrial interest: a CoCrMo biomedical alloy and an AlSi7Mg0.6 alloy processed by LPBF, and an AlMg4.5Mn alloy and an AISI 304L austenitic stainless steel processed by WAAM. In the case of LPBF, great attention was paid to the influence that the feedstock material and the process parameters exert on the hardness and on the morphological and microstructural features of the produced samples. The analyses, targeted at minimizing microstructural defects, led to process optimization. For heat-treatable LPBF alloys, innovative post-process heat treatments, tailored to the peculiar hierarchical microstructure induced by LPBF, were developed and investigated in depth. The main mechanical properties of the as-built and heat-treated alloys were assessed and correlated well with the specific LPBF microstructure. The results showed that, if properly optimized, samples exhibit a good trade-off between strength and ductility already in the as-built condition. However, tailored heat treatments succeeded in improving the overall performance of the LPBF alloys. The characterization of the WAAM alloys, instead, evidenced the microstructural and mechanical anisotropy typical of AM metals.
The experiments also revealed an outstanding anisotropy in the elastic modulus of the austenitic stainless steel which, along with other mechanical properties, was explained on the basis of microstructural analyses.

Relevance:

100.00%

Publisher:

Abstract:

What makes parliamentary procedures faster or slower when international treaties are considered? This question motivates the current research, which aims to understand how the nature of coalitions influences the duration of legislative processes. To this end, the analysis covers all the treaties signed by Mercosur between 1991 and 2021 and the internalisation processes in four member states (Argentina, Brazil, Paraguay and Uruguay). It observes how long each parliament took to approve the treaties and what the effect of political and economic variables was. A mixed-methods approach was adopted for the empirical research, combining Survival Analysis, Qualitative Comparative Analysis and Process Tracing. While the quantitative work investigates all the cases, the qualitative study illuminates the enlargement of Mercosur, with an in-depth analysis of the Paraguayan approval of the Venezuelan and Bolivian accessions. This study provides important insights into the role of national legislatures in Latin American regionalism, concluding that the government-opposition cleavage drives parliamentarians’ behaviour on the topic of regional integration. The study also contributes to the field of Mercosur studies with a characterisation of the treaties ratified domestically, by undertaking a longitudinal analysis at the 30th anniversary of the bloc.
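The Survival Analysis named above treats each ratification process as a duration, with still-pending treaties entering as censored observations. A minimal Kaplan-Meier estimator illustrates the idea; the durations and censoring flags below are hypothetical, not the Mercosur data.

```python
def kaplan_meier(durations, observed):
    """Kaplan-Meier survival estimator. `durations` are times until
    approval (or until censoring); `observed[i]` is True if approval
    actually happened. Returns (time, survival probability) steps."""
    event_times = sorted({t for t, obs in zip(durations, observed) if obs})
    survival, steps = 1.0, []
    for t in event_times:
        at_risk = sum(1 for d in durations if d >= t)          # still pending at t
        events = sum(1 for d, obs in zip(durations, observed)  # approvals at t
                     if obs and d == t)
        survival *= 1.0 - events / at_risk
        steps.append((t, survival))
    return steps

# Hypothetical ratification times in days; False = still pending (censored)
times = [120, 200, 200, 350, 500, 720]
approved = [True, True, False, True, True, False]
for t, s in kaplan_meier(times, approved):
    print(t, round(s, 3))
```

Covariates such as coalition composition would then enter through a regression model on these durations (e.g. a proportional-hazards model), which is how the effect of political and economic variables can be estimated.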

Relevance:

100.00%

Publisher:

Abstract:

The microstructure of 6XXX aluminum alloys deeply affects the mechanical, crash, corrosion and aesthetic properties of extruded profiles. Unfortunately, grain structure evolution during manufacturing is a complex phenomenon, because several process and material parameters, such as the alloy chemical composition, temperature, extrusion speed, tool geometries, and quenching and thermal treatment parameters, affect grain evolution during the manufacturing process. The aim of the present PhD thesis was the analysis of the recrystallization kinetics during the hot extrusion of 6XXX aluminum alloys and the development of reliable recrystallization models to be used in FEM codes for microstructure prediction at the die-design stage. Experimental activities were carried out in order to acquire data for the development and validation of the recrystallization models, and also to investigate the effect of process parameters and die design on the microstructure of the final component. The experimental campaign reported in this thesis involved the extrusion of AA6063, AA6060 and AA6082 profiles with different process parameters, in order to provide a reliable amount of data for model validation. A particular focus was placed on the PCG defect evolution during the extrusion of medium-strength alloys such as AA6082. Several die designs and process conditions were analysed in order to understand the influence of each of them on the recrystallization behaviour of the investigated alloy. From the numerical point of view, innovative models for microstructure prediction were developed and validated on the extrusion of industrial-scale profiles with complex geometries, showing good agreement in terms of grain size and surface recrystallization prediction. The results achieved suggest the reliability of the developed models and support their application in the industrial field for process and material property optimization at the die-design stage.
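Recrystallization kinetics of the kind modelled above are classically described by the JMAK (Avrami) equation, X(t) = 1 - exp(-k t^n). The abstract does not state which model form the thesis develops, so the sketch below is only the generic textbook kinetics, with illustrative constants:

```python
import math

def jmak_fraction(t, k, n):
    """JMAK (Avrami) recrystallized fraction: X(t) = 1 - exp(-k * t^n)."""
    return 1.0 - math.exp(-k * t ** n)

def time_to_fraction(x_target, k, n):
    """Invert JMAK: time needed to reach recrystallized fraction x_target."""
    return (-math.log(1.0 - x_target) / k) ** (1.0 / n)

# Illustrative kinetic constants (not fitted to the thesis data)
k, n = 0.01, 2.0
t50 = time_to_fraction(0.5, k, n)  # time for 50% recrystallization
x_check = jmak_fraction(t50, k, n) # sanity check: ~0.5
```

In a FEM implementation, k would itself depend on local temperature and strain history, which is what couples such a kinetic law to the extrusion process simulation.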

Relevance:

90.00%

Publisher:

Abstract:

Interactive theorem provers (ITPs for short) are tools whose final aim is to certify proofs written by human beings. To reach that objective, they have to fill the gap between the high-level language used by humans for communicating and reasoning about mathematics and the lower-level language that a machine is able to “understand” and process. The user perceives this gap in terms of missing features or inefficiencies. The developer tries to accommodate the user's requests without increasing the already high complexity of these applications. We believe that satisfactory solutions can only come from a strong synergy between users and developers. We devoted most of our PhD to designing and developing the Matita interactive theorem prover. The software was born in the Computer Science Department of the University of Bologna as the result of composing together all the technologies developed by the HELM team (to which we belong) for the MoWGLI project. The MoWGLI project aimed at giving web access to the libraries of formalised mathematics of various interactive theorem provers, taking Coq as the main test case. The motivations for giving life to a new ITP are: • to study the architecture of these tools, with the aim of understanding the source of their complexity; • to exploit such knowledge to experiment with new solutions that, for backward compatibility reasons, would be hard (if not impossible) to test on a widely used system like Coq. Matita is based on the Curry-Howard isomorphism, adopting the Calculus of Inductive Constructions (CIC) as its logical foundation. Proof objects are thus, to some extent, compatible with the ones produced with the Coq ITP, which is itself able to import and process the ones generated using Matita. Although the systems have a lot in common, they share no code at all, and even most of the algorithmic solutions are different.
The thesis is composed of two parts in which we respectively describe our experience as a user and as a developer of interactive provers. In particular, the first part is based on two different formalisation experiences: • our internship in the Mathematical Components team (INRIA), which is formalising the finite group theory required to attack the Feit-Thompson Theorem. To tackle this result, which gives an effective classification of finite groups of odd order, the team adopts the SSReflect Coq extension, developed by Georges Gonthier for the proof of the Four Colour Theorem; • our collaboration on the D.A.M.A. project, whose goal is the formalisation of abstract measure theory in Matita, leading to a constructive proof of Lebesgue’s Dominated Convergence Theorem. The most notable issues we faced, analysed in this part of the thesis, are the following: the difficulties arising when using “black box” automation in large formalisations; the impossibility for a user (especially a newcomer) to master the context of a library of already formalised results; the uncomfortable big-step execution of proof commands historically adopted in ITPs; and the difficult encoding of mathematical structures with a notion of inheritance in a type theory without subtyping like CIC. In the second part of the manuscript, many of these issues are analysed through the lens of an ITP developer, describing the solutions we adopted in the implementation of Matita to solve them: integrated searching facilities to assist the user in handling large libraries of formalised results; a small-step execution semantics for proof commands; a flexible implementation of coercive subtyping allowing multiple inheritance with shared substructures; and automatic tactics, integrated with the searching facilities, that generate proof commands (and not only proof objects, usually kept hidden from the user), one of which is specifically designed to be user-driven.
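The Curry-Howard isomorphism on which Matita rests identifies propositions with types and proofs with programs. A tiny illustration in Lean, whose foundation is a CIC-style type theory like that of Matita and Coq (the example is generic, not taken from the Matita library):

```lean
-- Under Curry-Howard, a proof of P → Q is a function from proofs of P
-- to proofs of Q, so modus ponens is just function application.
def modusPonens {P Q : Prop} (h : P → Q) (p : P) : Q := h p

-- Transitivity of implication is composition of proof functions.
def implTrans {P Q R : Prop} (f : P → Q) (g : Q → R) : P → R :=
  fun p => g (f p)
```

Because proofs are ordinary terms of the calculus, "proof objects" can be type-checked independently of how they were constructed, which is what makes the cross-checking of objects between Matita and Coq described above possible.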

Relevance:

90.00%

Publisher:

Abstract:

Crowding is defined as the negative effect obtained by adding visual distractors around a central target that has to be identified. Some studies have suggested the presence of a marked crowding effect in developmental dyslexia (e.g. Atkinson, 1991; Spinelli et al., 2002). Inspired by Spinelli et al.’s (2002) experimental design, we explored the hypothesis that the crowding effect may affect dyslexics’ response times (RTs) and accuracy in identification tasks involving words, pseudowords, illegal non-words and symbol strings. Moreover, our study aimed to clarify the relationship between the crowding phenomenon and the word-reading process, in a cross-language comparison perspective. For this purpose, we studied twenty-two French dyslexics and twenty-two Italian dyslexics (forty-four dyslexics in total), compared with forty-four subjects matched for reading level (22 French and 22 Italian) and forty-four chronological-age-matched subjects (22 French and 22 Italian). All children were tested on reading and cognitive abilities. Results showed no differences between French and Italian participants, suggesting that performances were homogeneous. Dyslexic children were all significantly impaired in word and pseudoword reading compared to their normally reading controls. Regarding the identification task with which we assessed the crowding effect, both accuracy and RTs showed a lexicality effect: recognition was more accurate and faster for words than for pseudowords, non-words and symbol strings. Moreover, compared to normal readers, dyslexics’ RTs and accuracy were impaired only for verbal material, not for non-verbal material; these results are in line with the phonological hypothesis (Griffiths & Snowling, 2002; Snowling, 2000; 2006). RTs revealed a general crowding effect (RTs in the crowding condition were slower than those recorded in the isolated condition) affecting all subjects’ performances.
This effect, however, turned out not to be specific to dyslexics. The data did not reveal a significant effect of language, allowing the generalization of the obtained results. We also analyzed the performance of two subgroups of dyslexics, categorized according to their reading abilities. The two subgroups produced different results regarding the crowding effect and the type of material, suggesting that it is meaningful to also take into account the heterogeneity of the dyslexia disorder. Finally, we analyzed the relationship of the identification task with both reading and cognitive abilities. In conclusion, this study points out the importance of comparing the performance of dyslexic participants in visual tasks with that of their reading-level-matched controls. This approach may improve our comprehension of the potential causal link between crowding and reading (Goswami, 2003).

Relevance:

90.00%

Publisher:

Abstract:

Tissue engineering is a discipline that aims at regenerating damaged biological tissues by using a cell-construct engineered in vitro made of cells grown into a porous 3D scaffold. The role of the scaffold is to guide cell growth and differentiation by acting as a bioresorbable temporary substrate that will be eventually replaced by new tissue produced by cells. As a matter or fact, the obtainment of a successful engineered tissue requires a multidisciplinary approach that must integrate the basic principles of biology, engineering and material science. The present Ph.D. thesis aimed at developing and characterizing innovative polymeric bioresorbable scaffolds made of hydrolysable polyesters. The potentialities of both commercial polyesters (i.e. poly-e-caprolactone, polylactide and some lactide copolymers) and of non-commercial polyesters (i.e. poly-w-pentadecalactone and some of its copolymers) were explored and discussed. Two techniques were employed to fabricate scaffolds: supercritical carbon dioxide (scCO2) foaming and electrospinning (ES). The former is a powerful technology that enables to produce 3D microporous foams by avoiding the use of solvents that can be toxic to mammalian cells. The scCO2 process, which is commonly applied to amorphous polymers, was successfully modified to foam a highly crystalline poly(w-pentadecalactone-co-e-caprolactone) copolymer and the effect of process parameters on scaffold morphology and thermo-mechanical properties was investigated. In the course of the present research activity, sub-micrometric fibrous non-woven meshes were produced using ES technology. Electrospun materials are considered highly promising scaffolds because they resemble the 3D organization of native extra cellular matrix. A careful control of process parameters allowed to fabricate defect-free fibres with diameters ranging from hundreds of nanometers to several microns, having either smooth or porous surface. 
Moreover, the versatility of ES technology made it possible to produce electrospun scaffolds from different polyesters, as well as “composite” non-woven meshes obtained by concomitantly electrospinning fibres that differ in both morphology and polymer material. The 3D architecture of the electrospun scaffolds fabricated in this research was controlled in terms of mutual fibre orientation by suitably modifying the instrumental apparatus. This aspect is particularly interesting since the micro/nano-architecture of the scaffold is known to affect cell behaviour. Since last-generation scaffolds are expected to induce specific cell responses, the present research activity also explored the possibility of producing electrospun scaffolds that are bioactive towards cells. Bio-functionalized substrates were obtained by loading polymer fibres with growth factors (i.e. biomolecules that elicit specific cell behaviour), and it was demonstrated that, despite the high voltages applied during electrospinning, the growth factor retains its biological activity once released from the fibres upon contact with cell culture medium. A second functionalization approach, ultimately aimed at controlling cell adhesion on electrospun scaffolds, consisted in covering the fibre surface with highly hydrophilic polymer brushes of glycerol monomethacrylate synthesized by Atom Transfer Radical Polymerization. Future investigations will exploit the hydroxyl groups of the polymer brushes to functionalize the fibre surface with the desired biomolecules. Electrospun scaffolds were employed in cell culture experiments, performed in collaboration with biochemical laboratories, aimed at evaluating the biocompatibility of new electrospun polymers and at investigating the effect of fibre orientation on cell behaviour.
Moreover, at a preliminary stage, electrospun scaffolds were also cultured with mammalian tumour cells to develop in vitro tumour models aimed at better understanding the role of the natural ECM in tumour malignancy in vivo.
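The scaffold morphology characterization mentioned in the abstract is commonly summarized by the foam's porosity. As an illustrative aside (the gravimetric formula is a standard characterization method, and the sample numbers are hypothetical, not taken from this thesis), porosity can be estimated from apparent and bulk polymer density:

```python
def scaffold_porosity(mass_g, volume_cm3, polymer_density_g_cm3):
    """Gravimetric porosity estimate for a foamed scaffold:
    porosity = 1 - (apparent density / bulk polymer density)."""
    apparent_density = mass_g / volume_cm3
    return 1.0 - apparent_density / polymer_density_g_cm3

# Hypothetical foamed sample: 0.12 g occupying 1.0 cm^3, with a
# bulk polymer density of 1.145 g/cm^3 (a typical literature value
# for poly(epsilon-caprolactone), assumed here for illustration).
print(round(scaffold_porosity(0.12, 1.0, 1.145), 3))
```

Highly porous scCO2-foamed scaffolds would give values close to 1; a porosity near 0.9, as in this hypothetical sample, is in the range usually sought for cell ingrowth.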

Relevância:

90.00% 90.00%

Publicador:

Resumo:

Multi-Processor SoC (MPSoC) design brings to the foreground a large number of challenges, one of the most prominent of which is the design of the chip interconnection. With the number of on-chip blocks presently in the tens, and quickly approaching the hundreds, the novel issue of how best to provide on-chip communication resources is clearly felt. The scaling down of process technologies has increased process and dynamic variations as well as transistor wearout. Because of this, delay variations increase and impact the performance of MPSoCs. The interconnect architecture in MPSoCs becomes a single point of failure, as it connects all the other components of the system together. A faulty processing element may be shut down entirely, but the interconnect architecture must be able to tolerate partial failure and variations and operate with a performance, power or latency overhead. This dissertation focuses on techniques at different levels of abstraction to address the reliability and variability issues in on-chip interconnection networks. By showing the test results of a GALS NoC test chip, this dissertation motivates the need for techniques to detect and work around manufacturing faults and process variations in the interconnection infrastructure of MPSoCs. As a physical design technique, we propose the bundle routing framework as an effective way to route the Network-on-Chip's global links. At the architecture level, two cases are addressed: (i) intra-cluster communication, for which we propose a low-latency interconnect with variability robustness; (ii) inter-cluster communication, for which online functional testing combined with a reliable NoC configuration is proposed. We also propose dual-Vdd as an orthogonal way of compensating variability at the post-fabrication stage. This is an alternative strategy with respect to the design-time techniques, since it enforces the compensation at the post-silicon stage.
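The requirement that the interconnect "tolerate partial failure ... and operate with performance, power or latency overhead" can be illustrated with a toy routing sketch: given a 2D-mesh NoC with some links marked faulty (e.g. by online functional testing), find a minimal-hop path that detours around them, paying extra hops as latency overhead. This is only a conceptual sketch using breadth-first search; the function name, mesh model and fault representation are assumptions, not the thesis's actual bundle-routing framework or NoC configuration mechanism:

```python
from collections import deque

def route_avoiding_faults(width, height, src, dst, faulty_links):
    """Minimal-hop path on a width x height 2D-mesh NoC that works
    around faulty links. Nodes are (x, y) tuples; faulty_links is a
    set of frozenset({node_a, node_b}) pairs. Returns the node path,
    or None if the partial failure cannot be tolerated."""
    queue = deque([[src]])
    seen = {src}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == dst:
            return path
        x, y = node
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if not (0 <= nxt[0] < width and 0 <= nxt[1] < height):
                continue  # off the mesh
            if nxt in seen or frozenset((node, nxt)) in faulty_links:
                continue  # already visited, or link marked faulty
            seen.add(nxt)
            queue.append(path + [nxt])
    return None

# A 3x3 mesh where the east link out of (0,0) failed manufacturing test:
faults = {frozenset(((0, 0), (1, 0)))}
path = route_avoiding_faults(3, 3, (0, 0), (2, 0), faults)
```

With the faulty link, the 2-hop direct route becomes a 4-hop detour: the traffic still gets through, but with the kind of latency overhead the dissertation's techniques aim to bound.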

Relevância:

90.00% 90.00%

Publicador:

Resumo:

My doctoral research focused on the evaluation of the pharmacological activity of a natural extract of chestnut wood (ENC) on the cardiovascular and gastrointestinal systems, and on the identification of the active compounds. ENC was shown to contain more than 10% (w/w) of phenolic compounds, of which tannins such as vescalagin and castalagin are the most representative. The cardiovascular effects of ENC were investigated in guinea pig cardiac preparations; furthermore, its activity was evaluated in guinea pig aorta strips. ENC induced a transient negative chronotropic effect in isolated spontaneously beating right atria and, simultaneously, a positive inotropic effect in left atria driven at 1 Hz. Cardiac cholinergic receptors are not involved in the negative chronotropic effect, and the positive inotropic effect is not related to adrenergic receptors. In vascular smooth muscle, the extract did not significantly change the contraction induced by potassium (80 mM) or by noradrenaline (1 μM). In guinea pig ileum, ENC reduced the maximum response to carbachol in a concentration-dependent manner and behaved as a reversible non-competitive antagonist; it also showed significant antispasmodic activity against a variety of spasmogenic agents, including histamine, KCl and BaCl2. In guinea pig proximal colon, stomach and jejunum, ENC likewise reduced the maximum response to carbachol in a concentration-dependent manner, behaving as a reversible non-competitive antagonist. ENC contracted guinea pig gallbladder in a reversible and concentration-dependent manner; this effect does not involve cholinergic or cholecystokinin receptors and is reduced by nifedipine. ENC relaxed the smooth muscle of the sphincter of Oddi. The cholecystokinetic and sphincter-relaxing activities also occurred in guinea pigs fed a lithogenic diet, and the cholecystokinetic effect was also observed in human gallbladder.
Fractionation of the extract led to the identification of the active fraction.
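The "reversible non-competitive antagonist" behaviour reported above has a characteristic signature: the antagonist depresses the maximum response to the agonist (carbachol) without shifting its EC50. A minimal sketch of that textbook model, where every parameter value (Emax, EC50, Kb) is hypothetical and chosen purely for illustration, not taken from the thesis:

```python
def agonist_response(conc, e_max=100.0, ec50=1e-6):
    """Hill equation (slope 1) for an agonist such as carbachol."""
    return e_max * conc / (ec50 + conc)

def with_noncompetitive_antagonist(conc, antagonist, kb=1e-5,
                                   e_max=100.0, ec50=1e-6):
    """Non-competitive antagonism: the achievable maximum (Emax) is
    depressed as antagonist concentration rises, while the agonist
    EC50 is left unchanged."""
    depressed_e_max = e_max / (1.0 + antagonist / kb)
    return agonist_response(conc, depressed_e_max, ec50)

# At a saturating carbachol concentration, the maximum response falls
# with increasing antagonist concentration -- the hallmark that
# distinguishes non-competitive from competitive blockade.
for b in (0.0, 1e-5, 3e-5):
    print(with_noncompetitive_antagonist(1e-3, b))
```

A competitive antagonist would instead shift the carbachol curve rightward (higher apparent EC50) with the maximum response preserved, which is not what was observed for ENC.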