904 results for Extrusion Instabilities
Abstract:
As increasingly sophisticated materials and products are developed and times-to-market must be minimized, it is important to make available fast-response characterization tools that use small amounts of sample and are capable of conveying data on the relationships between rheological response, process-induced material structure and product characteristics. For this purpose, a single/twin-screw mini-extrusion system of modular construction, with well-controlled outputs in the range 30-300 g/h, was coupled to an in-house-developed rheo-optical slit die able to measure shear viscosity and normal-stress differences, as well as to perform rheo-optical experiments, namely small-angle light scattering (SALS) and polarized optical microscopy (POM). In addition, the mini-extruder is equipped with ports that allow sample collection, and the extrudate can be further processed into products to be tested later. Here, we present the concept and experimental set-up [1, 2]. As a typical application, we report on the characterization of the processing of a polymer blend and of the properties of extruded sheets. The morphological evolution of a PS/PMMA industrial blend along the extruder, the flow-induced structures developed and the corresponding rheological characteristics are presented, together with the mechanical and structural characteristics of the produced sheets. The application of this experimental tool to other material systems is also discussed.
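The shear-viscosity measurement mentioned above can be illustrated with the textbook slit-die relations (these are standard rheometry formulas, not the authors' specific die geometry; all numerical values below are assumptions for illustration):

```python
# Textbook slit-die rheometry relations: for a slit of height h, width w
# (with h << w) and pressure-tap spacing L, the wall shear stress and the
# apparent (Newtonian) wall shear rate are
#   tau_w     = h * dP / (2 * L)
#   gamma_app = 6 * Q / (w * h**2)

def slit_die_viscosity(dP, Q, h, w, L):
    """Apparent shear viscosity [Pa.s] from slit-die data.

    dP : pressure drop between taps [Pa]
    Q  : volumetric flow rate [m^3/s]
    h, w, L : slit height, width and tap spacing [m]
    """
    tau_w = h * dP / (2.0 * L)          # wall shear stress [Pa]
    gamma_app = 6.0 * Q / (w * h**2)    # apparent wall shear rate [1/s]
    return tau_w / gamma_app

# Illustrative numbers only (not from the paper): a 200 g/h output of a
# melt with density ~900 kg/m^3 corresponds to Q ~ 6.2e-8 m^3/s.
Q = (200e-3 / 3600) / 900.0
eta = slit_die_viscosity(dP=2.0e6, Q=Q, h=1.0e-3, w=10.0e-3, L=50.0e-3)
```

With these assumed values the apparent viscosity comes out in the hundreds of Pa·s, a plausible order of magnitude for a polymer melt in the stated output range.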
Abstract:
Microinjection molding of polymer composites with carbon nanotubes (CNT) requires prior production of the nanocomposites, often by melt extrusion. Each processing step has a thermo-mechanical effect on the polymer melt, conveying different properties to the final product. In this work, polyamide 6 and its composites with pristine and functionalized CNT (f-CNT) were processed by mini twin-screw extrusion, followed by microinjection molding. The morphology induced in the polymer by each process was analyzed by differential scanning calorimetry and wide-angle X-ray diffraction. Calorimetric analysis showed a secondary crystallization for the microinjected materials that was absent for the extruded materials. The characterization of microinjected polyamide 6 by X-ray diffraction revealed a large contribution of the γ phase to the total crystallinity, mainly in the skin region, while the nanocomposites and extruded materials were characterized by a larger contribution of the α phase. Functionalization of the CNT did not significantly affect the polymer morphology compared to composites with pristine CNT.
Abstract:
The performance of parts produced by Free Form Extrusion (FFE), an increasingly popular additive manufacturing technique, depends mainly on their dimensional accuracy, surface quality and mechanical performance. These attributes are strongly influenced by the evolution of the filament temperature and deformation during deposition and solidification. Consequently, the availability of adequate process-modelling software would offer a powerful tool to support efficient process set-up and optimisation. This work examines the contribution to the overall heat transfer of the various thermal phenomena developing during the manufacturing sequence, including convection and radiation to the environment, conduction with the support and between adjacent filaments, radiation between adjacent filaments and convection with entrapped air. The magnitude of the mechanical deformation is also studied. Once this exercise is completed, it is possible to select the material properties, process variables and thermal phenomena that should be taken into account for effective numerical modelling of FFE.
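The simplest of the heat-transfer contributions listed above, convection and radiation to the environment, can be sketched with a lumped-capacitance cooling model (this is our own minimal model with assumed ABS-like property values, not the authors' simulation, and it ignores conduction to the support and between filaments):

```python
# Minimal lumped-capacitance cooling sketch for a single deposited filament.
# Assumptions (ours, not the paper's): cylindrical cross-section, uniform
# temperature across the section, convection + radiation to the environment
# only, temperature-independent properties.

def cool_filament(T0, T_env, d, rho, cp, h_conv, eps, t_end, dt=1e-3):
    """Explicit Euler integration of dT/dt; temperatures in kelvin."""
    sigma = 5.670e-8                 # Stefan-Boltzmann constant [W/m^2 K^4]
    area_per_vol = 4.0 / d           # surface-to-volume ratio of a cylinder
    T = T0
    t = 0.0
    while t < t_end:
        # heat flux leaving the surface [W/m^2]
        q = h_conv * (T - T_env) + eps * sigma * (T**4 - T_env**4)
        T -= dt * q * area_per_vol / (rho * cp)
        t += dt
    return T

# Illustrative ABS-like values (assumed): 230 C extrusion, 50 C envelope,
# 0.3 mm filament diameter.
T_after_2s = cool_filament(T0=503.0, T_env=323.0, d=0.3e-3, rho=1050.0,
                           cp=2000.0, h_conv=60.0, eps=0.9, t_end=2.0)
```

Even this crude sketch shows why the filament-scale thermal history matters: with these values the filament loses a large fraction of its superheat within a couple of seconds, which bounds the time available for bonding to adjacent filaments.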
Abstract:
This paper addresses the potential of polypropylene (PP) as a candidate for the fused deposition modeling (FDM)-based 3D printing technique. The entire filament production chain is evaluated, starting with the PP pellets, followed by filament production by extrusion and the printing of test samples. This strategy enables a true comparison between printed parts and parts manufactured by compression molding using the same grade of raw material. Printed samples were mechanically characterized and the influence of filament orientation, layer thickness, infill degree and material was assessed. Regarding the latter, two grades of PP were evaluated: a glass-fiber-reinforced one and a neat, non-reinforced one. The results showed the potential of FDM to compete with conventional techniques, especially for the production of small series of parts/components; it was also shown that this technique allows the production of parts with adequate mechanical performance and, therefore, need not be restricted to the production of mockups and prototypes.
Abstract:
The currently available clinical imaging methods do not provide highly detailed information about the location and severity of axonal injury or the expected recovery time of patients with traumatic brain injury [1]. High-Definition Fiber Tractography (HDFT) is a novel imaging modality that allows direct visualization and quantification of the degree of axonal damage, predicting functional deficits due to traumatic axonal injury and loss of cortical projections. This imaging modality is based on diffusion technology [2]. The absence of a phantom able to properly mimic the human brain hinders the testing, calibration and validation of these medical imaging techniques. Most research in this area falls short on key points, such as the size limit of the reproduced brain fibers and the quick and easy reproducibility of the phantoms [3]. For that reason, it is necessary to develop similar structures matching the micron scale of axon tubes. Flexible textiles can play an important role, since they allow the production of controlled packing densities and crossing structures that closely match the crossing patterns of the human brain. To build a brain phantom, several parameters must be taken into account regarding materials selection, such as hydrophobicity, density and fiber diameter, since these factors directly influence the values of fractional anisotropy. Fiber cross-section shape is another important parameter. Earlier studies showed that synthetic fibrous materials are a good choice for building a brain phantom [4]. The present work is part of a broader project that aims to develop a brain phantom made of fibrous materials to validate and calibrate HDFT. Due to the similarity between the axons and the thousands of hollow multifilaments in a fibrous arrangement, such as a yarn, low-twist polypropylene multifilament yarns were selected for this development.
In this sense, extruded hollow filaments were analysed by scanning electron microscopy to characterize their main dimensions and shape. In order to approximate the dimensional scale of human axons, five types of polypropylene yarns with different linear densities (denier) were used, with the aim of understanding the effect of linear density on the filament inner and outer areas. Moreover, in order to achieve the required dimensions, the polypropylene filament cross-section was reduced in the drawing stage of a filament extrusion line. Subsequently, tensile tests were performed to characterize the mechanical behaviour of the hollow filaments and to evaluate the differences between stretched and non-stretched filaments. In general, an increase in linear density increases the size of the filament cross-section. With the increase in the structural orientation of the filaments induced by stretching, breaking tenacity increases and elongation at break decreases. The production of hollow fibers with the required characteristics is one of the key steps in creating a brain phantom that properly mimics the human brain and that may be used for the validation and calibration of HDFT, an imaging approach that is expected to contribute significantly to brain-related research.
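The stated link between linear density and cross-section size follows directly from the textbook definition of denier (grams of fiber per 9000 m). A back-of-the-envelope conversion, with an assumed hollowness fraction and the typical density of polypropylene (none of these values are taken from the paper):

```python
import math

# Rough link between linear density (denier) and filament cross-section,
# using the definition: denier = mass in grams of 9000 m of filament.
# The hollowness fraction and the example denier values are our assumptions.

PP_DENSITY = 905.0  # kg/m^3, typical for polypropylene

def outer_diameter_um(denier, hollow_fraction=0.0, rho=PP_DENSITY):
    """Outer diameter [um] of a round filament of the given denier.

    hollow_fraction = void area / outer area, so the solid (mass-bearing)
    area is (1 - hollow_fraction) * outer area.
    """
    mass_per_m = denier * 1e-3 / 9000.0        # kg per metre of filament
    solid_area = mass_per_m / rho              # polymer cross-section [m^2]
    outer_area = solid_area / (1.0 - hollow_fraction)
    return 2.0 * math.sqrt(outer_area / math.pi) * 1e6

# e.g. a 10-denier hollow PP filament with an assumed 30 % void
d_um = outer_diameter_um(10.0, hollow_fraction=0.30)
```

This makes the abstract's qualitative finding explicit: the outer diameter scales with the square root of the linear density, so halving the denier in the drawing stage shrinks the diameter by only about 30 %.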
Abstract:
Integrated master's dissertation in Materials Engineering
Abstract:
Integrated master's dissertation in Materials Engineering
Abstract:
BACKGROUND Most cancers, including breast cancer, have high rates of glucose consumption, associated with lactate production, a process referred to as the "Warburg effect". Acidification of the tumour microenvironment by lactate extrusion, performed by lactate transporters (MCTs), is associated with higher cell proliferation, migration, invasion, angiogenesis and increased cell survival. Previously, we described MCT1 up-regulation in breast carcinoma samples and demonstrated the importance of in vitro MCT inhibition. In this study, we performed siRNA knockdown of MCT1 and MCT4 in basal-like breast cancer cells under both normoxia and hypoxia conditions to validate the potential of lactate transport inhibition in breast cancer treatment. RESULTS The effect of MCT knockdown was evaluated on lactate efflux, proliferation, cell biomass, migration and invasion, and on the induction of tumour xenografts in nude mice. MCT knockdown led to a decrease in in vitro tumour cell aggressiveness, with decreased lactate transport, cell proliferation, migration and invasion and, importantly, to an inhibition of in vivo tumour formation and growth. CONCLUSIONS This work supports MCTs as promising targets in cancer therapy, demonstrates the contribution of MCTs to cancer cell aggressiveness and, more importantly, shows for the first time the disruption of in vivo breast tumour growth by targeting lactate transport.
Abstract:
"Available online 28 March 2016"
Abstract:
Doctoral thesis in Polymer and Composite Science and Engineering.
Abstract:
It is well known that couples that look jointly for jobs in the same centralized labor market may cause instabilities. We demonstrate that for a natural preference domain for couples, namely the domain of responsive preferences, the existence of stable matchings can easily be established. However, a small deviation from responsiveness in one couple's preference relation, modeling the wish of a couple to be closer together, may already cause instability. This demonstrates that the nonexistence of stable matchings in couples markets is not a singular theoretical irregularity. Our nonexistence result persists even when a weaker stability notion is used that excludes myopic blocking. Moreover, we show that even if preferences are responsive, there are problems that do not arise for singles markets. Even though for couples markets with responsive preferences the set of stable matchings is nonempty, the lattice structure that this set has for singles markets does not carry over. Furthermore, we demonstrate that the new algorithm adopted by the National Resident Matching Program to fill positions for physicians in the United States may cycle even when a stable matching does exist, and may be prone to strategic manipulation if the members of a couple pretend to be single.
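The singles-market baseline that the abstract contrasts with can be sketched with the classic deferred-acceptance (Gale-Shapley) algorithm, which always terminates at a stable matching when all agents are single; the names and preference lists below are illustrative only, not data from the paper:

```python
# Minimal proposer-proposing deferred acceptance (Gale-Shapley) for a
# one-to-one singles market. With individual doctors and hospitals the
# algorithm always terminates at a stable matching; it is the couples
# extension, as the abstract explains, where stability can fail.

def deferred_acceptance(proposer_prefs, responder_prefs):
    """Return {proposer: responder} for the proposer-optimal stable matching."""
    # responder's ranking of each proposer (lower index = more preferred)
    rank = {r: {p: i for i, p in enumerate(prefs)}
            for r, prefs in responder_prefs.items()}
    free = list(proposer_prefs)           # proposers not yet matched
    next_choice = {p: 0 for p in proposer_prefs}
    match = {}                            # responder -> tentative proposer
    while free:
        p = free.pop()
        r = proposer_prefs[p][next_choice[p]]  # p's best not-yet-tried option
        next_choice[p] += 1
        if r not in match:
            match[r] = p                  # r tentatively accepts
        elif rank[r][p] < rank[r][match[r]]:
            free.append(match[r])         # r trades up, old partner is freed
            match[r] = p
        else:
            free.append(p)                # r rejects p
    return {p: r for r, p in match.items()}

# Hypothetical two-doctor, two-hospital market
doctors = {"d1": ["h1", "h2"], "d2": ["h1", "h2"]}
hospitals = {"h1": ["d2", "d1"], "h2": ["d1", "d2"]}
result = deferred_acceptance(doctors, hospitals)
```

In this toy market both doctors prefer h1, h1 prefers d2, and the algorithm settles on d2-h1 and d1-h2, which no doctor-hospital pair would jointly overturn.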
Abstract:
Study carried out during a stay at the Physics Department of New York University, United States, between 2006 and 2008. One of the most impactful observations in modern cosmology has been the empirical determination that the Universe is currently in a phase of Accelerated Expansion (AE). This phenomenon implies that either the Universe is dominated by a new matter/energy sector, or General Relativity ceases to be valid on cosmological scales. The first possibility comprises the Dark Energy (DE) models, whose main problem is that the DE must have properties so special that they are difficult to justify theoretically. The second possibility requires the construction of consistent theories of gravity modified at large distances, which are a generalization of massive-gravity models. Phenomenological interest in these theories also resurged with the appearance of the first examples of such models, such as the Dvali-Gabadadze-Porrati (DGP) model, which consists of a type of brane in an extra dimension. Unfortunately, however, this model cannot consistently explain the AE of the Universe. One of the goals of this project was to establish the internal and phenomenological viability of large-distance modified gravity models. From the phenomenological point of view, we focused on the most important question in practice: finding observational signatures that allow these models to be distinguished from DE models. At a more theoretical level, we also investigated the meaning of the instabilities of the DGP model. The other major goal we set ourselves was the construction of new theories of this kind. In the second part of this project, we developed the "Cascading DGP" model and demonstrated its consistency; it generalizes the DGP model to more extra dimensions and represents the second known consistent, Lorentz-invariant model in flat space.
The existence of large-distance modified gravity models beyond DGP is of great interest, since it could make it possible to obtain the AE of the Universe in a purely geometric way.
Abstract:
An expanding literature articulates the view that Taylor rules are helpful in predicting exchange rates. In a changing world, however, Taylor rule parameters may be subject to structural instabilities, for example during the Global Financial Crisis. This paper forecasts exchange rates using such Taylor rules with Time-Varying Parameters (TVP) estimated by Bayesian methods. In core out-of-sample results, we improve upon a random walk benchmark for at least half, and for as many as eight out of ten, of the currencies considered. This contrasts with a constant-parameter Taylor rule model that yields a more limited improvement upon the benchmark. In further results, Purchasing Power Parity and Uncovered Interest Rate Parity TVP models beat a random walk benchmark, implying that our methods have some generality in exchange rate prediction.
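The core idea of a TVP regression, coefficients that follow a random walk and are updated as data arrive, can be sketched with a Kalman filter, which underlies Bayesian TVP estimation in general; the data below are simulated and the code is not the paper's model:

```python
import numpy as np

# Minimal random-walk time-varying-parameter regression sketch.
# State:       beta_t = beta_{t-1} + w_t,   w_t ~ N(0, Q I)
# Observation: y_t    = x_t' beta_t + e_t,  e_t ~ N(0, R)

def kalman_tvp(y, X, Q=1e-3, R=0.1):
    """One-step-ahead forecasts of y under a random-walk coefficient model."""
    T, k = X.shape
    beta = np.zeros(k)                           # coefficient estimate
    P = np.eye(k)                                # its covariance
    forecasts = np.empty(T)
    for t in range(T):
        P = P + Q * np.eye(k)                    # predict state covariance
        forecasts[t] = X[t] @ beta               # forecast before seeing y_t
        S = X[t] @ P @ X[t] + R                  # forecast variance
        K = P @ X[t] / S                         # Kalman gain
        beta = beta + K * (y[t] - forecasts[t])  # update coefficients
        P = P - np.outer(K, X[t] @ P)
    return forecasts

# Simulated data with slowly drifting coefficients (illustrative only)
rng = np.random.default_rng(0)
T = 200
X = np.column_stack([np.ones(T), rng.normal(size=T)])
beta_true = np.cumsum(rng.normal(scale=0.05, size=(T, 2)), axis=0)
y = (X * beta_true).sum(axis=1) + rng.normal(scale=0.1, size=T)

rw_rmse = np.sqrt(np.mean(np.diff(y) ** 2))              # random-walk benchmark
tvp_rmse = np.sqrt(np.mean((y - kalman_tvp(y, X)) ** 2))
```

On this simulated drifting-coefficient data the filter tracks the coefficients and its one-step forecasts beat the no-change (random walk) benchmark, mirroring the kind of comparison the abstract reports.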
Abstract:
Diffusion MRI is a well-established imaging modality providing a powerful way to probe the structure of the white matter non-invasively. Despite its potential, the intrinsically long scan times of these sequences have hampered their use in clinical practice. For this reason, a large variety of methods have recently been proposed to shorten the acquisition times. Among them, spherical deconvolution approaches have gained a lot of interest for their ability to reliably recover the intra-voxel fiber configuration with a relatively small number of data samples. To overcome the intrinsic instabilities of deconvolution, these methods use regularization schemes generally based on the assumption that the fiber orientation distribution (FOD) to be recovered in each voxel is sparse. The well-known Constrained Spherical Deconvolution (CSD) approach resorts to Tikhonov regularization, based on an ℓ2-norm prior, which promotes a weak version of sparsity. Also, in the last few years compressed sensing has been advocated to further accelerate the acquisitions, and ℓ1-norm minimization is generally employed as a means to promote sparsity in the recovered FODs. In this paper, we provide evidence that the use of an ℓ1-norm prior to regularize this class of problems is somewhat inconsistent with the fact that the fiber compartments all sum up to unity. To overcome this ℓ1 inconsistency while simultaneously exploiting sparsity more optimally than through an ℓ2 prior, we reformulate the reconstruction problem as a constrained formulation between a data term and a sparsity prior consisting of an explicit bound on the ℓ0-norm of the FOD, i.e. on the number of fibers. The method has been tested on both synthetic and real data. Experimental results show that the proposed ℓ0 formulation significantly reduces modeling errors compared to the state-of-the-art ℓ2 and ℓ1 regularization approaches.
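The flavour of an explicit ℓ0 bound, limiting the number of active fiber compartments rather than penalizing their magnitudes, can be illustrated with a greedy sparse solver on a synthetic toy problem. This uses plain orthogonal matching pursuit as a stand-in for the authors' algorithm; the dictionary and signal are synthetic, not diffusion data:

```python
import numpy as np

# Toy sketch of recovery under an explicit l0 bound (at most k nonzero
# coefficients), in the spirit of the abstract's constrained formulation.
# Orthogonal matching pursuit is used as a simple stand-in solver.

def omp(A, y, k):
    """Greedy least squares with at most k nonzero coefficients."""
    residual = y.copy()
    support = []
    x = np.zeros(A.shape[1])
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))   # best-correlated atom
        if j not in support:
            support.append(j)
        # refit on the current support, then update the residual
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x[support] = coef
    return x

# Synthetic problem: 30 measurements, 60 candidate "fiber" atoms,
# a 2-sparse ground truth whose weights sum to unity.
rng = np.random.default_rng(1)
A = rng.normal(size=(30, 60))
A /= np.linalg.norm(A, axis=0)                       # unit-norm columns
x_true = np.zeros(60)
x_true[[3, 17]] = [0.7, 0.3]
y = A @ x_true
x_hat = omp(A, y, k=2)                               # l0 bound: 2 "fibers"
```

Bounding the number of nonzeros directly, instead of shrinking all coefficients as an ℓ1 penalty does, leaves the recovered weights free to sum to unity, which is the consistency point the abstract raises against the ℓ1 prior.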
Abstract:
We analyse the role of time variation in coefficients and other sources of uncertainty in exchange rate forecasting regressions. Our techniques incorporate the notion that the relevant set of predictors, and their corresponding weights, change over time. We find that predictive models which allow for sudden, rather than smooth, changes in coefficients significantly beat the random walk benchmark in an out-of-sample forecasting exercise. Using an innovative variance decomposition scheme, we identify uncertainty in coefficient estimation and uncertainty about the precise degree of coefficient variability as the main factors hindering the models' forecasting performance. The uncertainty regarding the choice of predictor is small.