925 results for damping dynamic mechanical analysis DMA CFRP electrospinning tan(delta)


Relevância:

100.00%

Publicador:

Resumo:


The goal of modern radiotherapy is to precisely deliver a prescribed radiation dose to delineated target volumes that contain a significant amount of tumor cells while sparing the surrounding healthy tissues and organs. Precise delineation of treatment and avoidance volumes is key to precision radiation therapy. In recent years, considerable clinical and research effort has been devoted to integrating MRI into the radiotherapy workflow, motivated by its superior soft-tissue contrast and functional imaging capability. Dynamic contrast-enhanced MRI (DCE-MRI) is a noninvasive technique that measures properties of the tissue microvasculature. Its sensitivity to radiation-induced vascular pharmacokinetic (PK) changes has been preliminarily demonstrated. In spite of its great potential, two major challenges have limited DCE-MRI's clinical application in radiotherapy assessment: the technical limitations of accurate DCE-MRI implementation and the need for novel DCE-MRI data analysis methods that extract richer functional heterogeneity information.

This study aims to improve current DCE-MRI techniques and to develop new DCE-MRI analysis methods for radiotherapy assessment, and is accordingly divided into two parts. The first part focuses on temporal resolution as one of the key DCE-MRI technical factors and proposes several improvements to it; the second part explores the potential value of image heterogeneity analysis and of combining multiple PK models for therapeutic response assessment, developing several novel DCE-MRI data analysis methods.

I. Improvement of DCE-MRI temporal resolution. First, the feasibility of improving DCE-MRI temporal resolution via image undersampling was studied. Specifically, a novel iterative MR image reconstruction algorithm, built on the recently developed compressed sensing (CS) theory, was studied for DCE-MRI reconstruction. By utilizing a limited k-space acquisition with shorter imaging time, images can be reconstructed iteratively under the regularization of a newly proposed total generalized variation (TGV) penalty term. In a retrospective, IRB-approved study of brain radiosurgery patient DCE-MRI scans, the clinically obtained image data were selected as reference data, and accelerated k-space acquisition was simulated by undersampling the full k-space of the reference images with designed sampling grids. Two undersampling strategies were proposed: 1) a radial multi-ray grid with a special angular distribution to sample each slice of the full k-space; 2) a series of Cartesian random sampling grids with spatiotemporal constraints between adjacent frames to sample the dynamic k-space series at a slice location. Two sets of PK parameter maps were generated, one from the undersampled data and one from the fully sampled data. Multiple quantitative measurements and statistical tests were performed to evaluate the accuracy of the PK maps generated from the undersampled data against those generated from the fully sampled data. Results showed that at a simulated acceleration factor of four, PK maps could be faithfully calculated from DCE images reconstructed using undersampled data, and no statistically significant differences were found between the regional PK mean values from the undersampled and fully sampled data sets. DCE-MRI acceleration using the investigated image reconstruction method therefore appears feasible and promising.
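As a rough illustration of the retrospective undersampling idea (not the study's actual TGV-regularized solver), the sketch below builds a Cartesian random sampling mask with a fully sampled low-frequency band, applies it to the k-space of a synthetic image, and forms the zero-filled reconstruction that an iterative CS solver would then refine. All dimensions and values are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64

# Hypothetical reference image: a bright disc on a dark background.
y, x = np.mgrid[:N, :N]
img = (((x - N / 2) ** 2 + (y - N / 2) ** 2) < (N / 5) ** 2).astype(float)

# Full k-space of the reference image (low frequencies in the middle).
kspace = np.fft.fftshift(np.fft.fft2(img))

# Cartesian random sampling mask at a simulated acceleration factor of ~4:
# keep a fully sampled low-frequency band, then choose the remaining
# phase-encode lines at random (the study additionally constrains the masks
# of adjacent time frames).
mask = np.zeros((N, N), dtype=bool)
mask[N // 2 - 4 : N // 2 + 4, :] = True
mask[rng.choice(N, size=N // 4, replace=False), :] = True

# Zero-filled reconstruction: the starting point that an iterative,
# TGV-regularized solver would refine.
recon = np.abs(np.fft.ifft2(np.fft.ifftshift(kspace * mask)))
```

The effective sampling fraction is `mask.mean()`, somewhat above one quarter because the center band overlaps some of the random lines.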

Second, for high temporal resolution DCE-MRI, a new PK model fitting method was developed to solve for the PK parameters with better accuracy and efficiency. The method is based on a derivative-based reformulation of the commonly used Tofts PK model, which is usually presented in integral form. It also applies a Kolmogorov-Zurbenko (KZ) filter to suppress noise in the data and solves for the PK parameters as a linear problem in matrix form. In a computer simulation study, PK parameters representing typical intracranial values were selected as references to simulate DCE-MRI data at different temporal resolutions and noise levels. Results showed that at both high temporal resolution (<1 s) and clinically feasible temporal resolution (~5 s), the new method calculated PK parameters more accurately than current methods at clinically relevant noise levels; at high temporal resolution, its computational efficiency exceeded that of current methods by about two orders of magnitude. In a retrospective study of clinical brain DCE-MRI scans, the PK maps derived from the proposed method were comparable with the results from current methods. Based on these results, it can be concluded that the new method enables accurate and efficient PK model fitting for high temporal resolution DCE-MRI.
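The linear-in-the-parameters form of the Tofts model that such a fitting method exploits can be sketched as follows. The arterial input function, timing, and parameter values below are hypothetical, and the KZ filtering step is omitted:

```python
import numpy as np

def cumtrapz(f, t):
    """Cumulative trapezoidal integral of f over t, starting at zero."""
    out = np.zeros_like(f)
    out[1:] = np.cumsum(0.5 * (f[1:] + f[:-1]) * np.diff(t))
    return out

# Hypothetical high-temporal-resolution sampling (0.5 s) and a toy arterial
# input function (not a population AIF model).
t = np.arange(0.0, 300.0, 0.5)
Cp = 5.0 * (t / 60.0) * np.exp(-t / 60.0)
Ktrans_true, kep_true = 0.25 / 60.0, 0.60 / 60.0   # per second

# Simulate the tissue curve from the Tofts ODE: dCt/dt = Ktrans*Cp - kep*Ct.
Ct = np.zeros_like(t)
dt = t[1] - t[0]
for i in range(1, len(t)):
    Ct[i] = Ct[i - 1] + dt * (Ktrans_true * Cp[i - 1] - kep_true * Ct[i - 1])

# Integrating the ODE gives Ct(t) = Ktrans * int(Cp) - kep * int(Ct),
# so both parameters fall out of one linear least-squares solve.
A = np.column_stack([cumtrapz(Cp, t), -cumtrapz(Ct, t)])
Ktrans_fit, kep_fit = np.linalg.lstsq(A, Ct, rcond=None)[0]
```

The single matrix solve replaces the per-voxel nonlinear optimization of conventional fitting, which is where the efficiency gain at high temporal resolution comes from.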

II. Development of DCE-MRI analysis methods for therapeutic response assessment. This part pursues methodological developments along two approaches. The first is to develop model-free analysis methods for evaluating DCE-MRI functional heterogeneity, motivated by the rationale that radiotherapy-induced functional change can be heterogeneous across the treatment area. The first effort was a translational investigation of classic fractal dimension theory for DCE-MRI therapeutic response assessment. In a small-animal anti-angiogenesis drug therapy experiment, randomly assigned treatment/control groups received multiple treatment fractions, with one pre-treatment and multiple post-treatment high-spatiotemporal-resolution DCE-MRI scans. In the post-treatment scan two weeks after the start of treatment, the investigated Rényi dimensions of the classic PK rate-constant map showed significant differences between the treatment and control groups; when the Rényi dimensions were used for treatment/control classification, the achieved accuracy was higher than that obtained from conventional PK parameter statistics. Following this pilot work, two novel texture analysis methods were proposed. First, a new technique called the Gray Level Local Power Matrix (GLLPM) was developed to address the lack of temporal information and the poor computational efficiency of the commonly used Gray Level Co-Occurrence Matrix (GLCOM) techniques. In the same small-animal experiment, the dynamic curves of Haralick texture features derived from the GLLPM performed better overall than the corresponding curves derived from current GLCOM techniques in treatment/control separation and classification. The second method developed is dynamic Fractal Signature Dissimilarity (FSD) analysis. Inspired by classic fractal dimension theory, this method quantitatively measures the dynamics of tumor heterogeneity on DCE images during contrast agent uptake.
In the small-animal experiment mentioned above, selected parameters from dynamic FSD analysis showed significant differences between treatment and control groups as early as after one treatment fraction; in contrast, metrics from conventional PK analysis showed significant differences only after three treatment fractions. Treatment/control classification after the first treatment fraction was also better with dynamic FSD parameters than with conventional PK statistics. These results suggest that this novel method is promising for capturing early therapeutic response.
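A minimal box-counting estimator of the Rényi dimensions used in the pilot fractal analysis might look like the sketch below; the uniform map is a toy stand-in for a normalized PK rate-constant map, on which every D_q equals the embedding dimension 2:

```python
import numpy as np

def renyi_dimension(p, q, box_sizes):
    """Estimate the Renyi dimension D_q of a normalized square intensity
    map `p` (entries sum to 1) by box counting: D_q is the slope of the
    Renyi partition sum against log(box size / map size)."""
    n = p.shape[0]
    xs, ys = [], []
    for s in box_sizes:
        # Probability mass inside each s-by-s box.
        boxes = p.reshape(n // s, s, n // s, s).sum(axis=(1, 3)).ravel()
        boxes = boxes[boxes > 0]
        if q == 1:
            ys.append((boxes * np.log(boxes)).sum())   # information dimension
        else:
            ys.append(np.log((boxes ** q).sum()) / (q - 1))
        xs.append(np.log(s / n))
    return np.polyfit(xs, ys, 1)[0]

# A spatially uniform map has D_q = 2 for every q; a heterogeneous PK map
# would yield a spectrum of dimensions varying with q.
uniform = np.full((64, 64), 1.0 / (64 * 64))
dims = [renyi_dimension(uniform, q, [2, 4, 8, 16]) for q in (0, 1, 2)]
```

Departures of the D_q spectrum from a flat value are what carry the heterogeneity signal exploited for treatment/control separation.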

The second approach to developing novel DCE-MRI methods is to combine PK information from multiple PK models. The classic Tofts model, or an alternative version of it, is currently the widely adopted gold standard for DCE-MRI analysis in therapeutic response assessment. Previously, a shutter-speed (SS) model was proposed to incorporate the transcytolemmal water exchange effect into contrast agent concentration quantification. Despite its richer biological assumptions, its application in therapeutic response assessment has been limited. It is therefore intriguing to combine information from the SS model and the classic Tofts model to explore potential new biological information for treatment assessment. The feasibility of this idea was investigated in the same small-animal experiment. The SS model was compared against the Tofts model for therapeutic response assessment using regional mean values of the PK parameters. Based on the modeled transcytolemmal water exchange rate, a biological subvolume was proposed and automatically identified using histogram analysis. Within this biological subvolume, the PK rate constant derived from the SS model proved superior to the one from the Tofts model in treatment/control separation and classification. Furthermore, a novel biomarker was designed to integrate the PK rate constants from the two models. When evaluated in the biological subvolume, this biomarker reflected significant treatment/control differences in both post-treatment evaluations. These results confirm the potential value of the SS model, and of its combination with the Tofts model, for therapeutic response assessment.

In summary, this study addressed two problems in the application of DCE-MRI to radiotherapy assessment. In the first part, a method of accelerating DCE-MRI acquisition for better temporal resolution was investigated, and a novel PK model fitting algorithm was proposed for high temporal resolution DCE-MRI. In the second part, two model-free texture analysis methods and a multiple-model analysis method were developed for DCE-MRI therapeutic response assessment. The presented work could benefit the future routine clinical application of DCE-MRI in radiotherapy assessment.

Relevância:

100.00%

Publicador:

Resumo:

Synthesis of polyhydroxyalkanoates (PHAs) by Pseudomonas mendocina using different vegetable oils (coconut, groundnut, corn and olive oil) as the sole carbon source was investigated for the first time. The PHA yield obtained was compared with that achieved when sodium octanoate was used as the sole carbon source. Fermentation profiles at shaken-flask and bioreactor scale revealed that vegetable oils supported the growth of Pseudomonas mendocina and PHA accumulation in this organism. Moreover, with coconut oil as the sole carbon source, fermentation profiles showed better growth and polymer production than with sodium octanoate. Comparison of PHA accumulation at shaken-flask and fermenter scale confirmed a higher PHA yield at shaken-flask scale. The highest cell mass found using sodium octanoate was 1.8 g/L, whereas a cell mass as high as 5.1 g/L was observed with coconut oil as the feedstock at flask scale. Moreover, the maximum PHA yield of 60.5% dry cell weight (dcw) was achieved at shaken-flask scale using coconut oil, compared with 35.1% dcw using sodium octanoate as the sole carbon source. The chemical, physical, mechanical, surface and biocompatibility properties of the polymers produced were characterised by the analyses described in the second chapter of this study. Chemical analysis using GC and FTIR showed medium chain length (MCL) PHA production under all conditions. GC-MS analysis revealed a unique terpolymer containing 3-hydroxyoctanoic acid, 3-hydroxydecanoic acid and 3-hydroxydodecanoic acid when coconut oil, groundnut oil, olive oil or corn oil was used as the carbon source.
In contrast, a homopolymer of 3-hydroxyoctanoic acid was produced when sodium octanoate was used as the carbon source. The MCL-PHAs produced using sodium octanoate, coconut oil and olive oil exhibited melting transitions, indicating that each was a crystalline or semi-crystalline polymer. The PHAs produced from groundnut and corn oils, however, showed no melting transition, indicating that they were completely amorphous or at most semi-crystalline, which was also confirmed by the X-ray diffraction (XRD) results obtained in this study. Mechanical analysis showed that the polymer produced from coconut oil was stiffer than that from sodium octanoate. Surface characterisation by scanning electron microscopy (SEM) revealed a rough surface topography, and contact angle measurements revealed the polymers' hydrophobic nature. Moreover, to investigate the potential applicability of the produced polymers as scaffold materials for dental pulp regeneration, multipotent human mesenchymal stem cells (hMSCs) were cultured on the polymer films. The results indicated that these polymers are not cytotoxic towards hMSCs and can support their attachment and proliferation. The highest cell growth was observed on the polymer produced from corn oil, followed by the polymer produced from coconut oil. In conclusion, this work established for the first time that vegetable oils are an economical carbon source for the effective production of MCL-PHA copolymers by Pseudomonas mendocina. Moreover, the biocompatibility studies suggest that the produced polymers may have potential for dental tissue engineering applications.

Relevância:

100.00%

Publicador:

Resumo:

Integrated water resource management requires distinguishing the water pathways that are accessible to societies from those that are not. Water follows many pathways, which vary strongly from place to place. The question can be simplified by focusing instead on water's two destinations. Blue water forms the stores and flows of the hydrosystem: rivers, aquifers and subsurface flows. Green water is the invisible flux of water vapour returning to the atmosphere; it includes the water consumed by plants and the water held in soils. Yet many studies consider only one type of blue water, generally addressing only streamflow or, more rarely, groundwater recharge, so the overall picture is missing. At the same time, climate change is affecting these water pathways by altering the different components of the hydrological cycle in distinct ways. The present study uses the SWAT modelling tool to track all components of the hydrological cycle and to quantify the impact of climate change on the hydrosystem of the Garonne catchment. The first part of the work refined the model set-up to best address the research question. Particular care was taken in using gridded meteorological data (SAFRAN) and in accounting for snow in the mountainous areas. Model calibration was tested in a differential split-sample setting, calibrating and then validating on climatically contrasted years in order to assess the robustness of the simulation under climate change. This step substantially improved performance over the calibration period (2000-2010) and demonstrated the stability of the model under changing climate conditions.
Simulations covering a century (1960-2050) were then produced and analysed in two phases. i) The past period (1960-2000), based on climate observations, served as a long-term validation of the simulated discharge, with very good performance. Analysis of the hydrological components reveals a strong impact on green-water fluxes and stores, with decreasing soil water content and a marked increase in evapotranspiration. The blue-water components are mainly affected through the snowpack and the discharge, both of which decline substantially. ii) Hydrological projections were produced (2010-2050) using a range of scenarios and climate models from dynamical downscaling. The analysis largely confirms the conclusions drawn from the past period: a strong impact on green water, again with decreasing soil water content and increasing potential evapotranspiration. The simulations show that summer soil water content becomes low enough to reduce actual evapotranspiration fluxes, pointing to a possible future deficit of green-water stores. Moreover, while the blue-water components still show a significant decrease in the snowpack, discharge now appears to increase in autumn and winter. These results signal an "acceleration" of the surface blue-water components, probably related to the increase in extreme precipitation events.
This work provides an analysis of the variations of most components of the hydrological cycle at catchment scale, confirming the importance of accounting for all of these components when assessing the impact of climate change, and of environmental change more broadly, on water resources.
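The blue-water/green-water partition can be illustrated with a toy bucket model, a deliberate simplification (SWAT resolves snowpack, soil layers, routing and more); all numbers are hypothetical:

```python
def bucket_water_balance(precip, pet, capacity=100.0, s0=50.0):
    """Toy monthly bucket model splitting precipitation into green water
    (actual evapotranspiration) and blue water (runoff + recharge)."""
    s, green, blue = s0, [], []
    for p, e in zip(precip, pet):
        s += p                       # precipitation enters the soil store
        aet = min(e, s)              # ET limited by available soil water
        s -= aet
        q = max(0.0, s - capacity)   # overflow leaves as runoff/recharge
        s -= q
        green.append(aet)
        blue.append(q)
    return green, blue, s

# Four hypothetical months (mm): two wet, then two dry.
green, blue, s_end = bucket_water_balance([80, 60, 10, 0], [30, 40, 70, 50])
# Mass balance: precipitation = green + blue + change in soil storage.
```

Note how the dry months draw the store down until actual ET falls below potential ET, the same soil-water limitation the Garonne projections highlight for summer.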

Relevância:

100.00%

Publicador:

Resumo:

This article studies the evolution of city sizes in the states of northeastern Brazil for the years 1990, 2000 and 2010 through the empirical regularity known as Zipf's law, which can be represented by the Pareto distribution. Analysing the dynamics of the population distribution over time, urban growth reveals a persistent hierarchy of the cities of Salvador, Fortaleza and Recife, while São Luís held fourth place in the ranking of the largest cities throughout the last two decades. Zipf's law did not hold when the cities of the Northeast were considered together, which may be due to the lower degree of urban development of the cities in this region. In the state-by-state analysis, Zipf's law was not observed either, although Gibrat's law, which postulates that city growth is independent of city size, was verified. Finally, the installation of the mining-metallurgical complex of Maranhão is believed to have contributed to development and to the reduction of intra-city urban inequality in this area.
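The rank-size test of Zipf's law amounts to an OLS regression of log-rank on log-population; a minimal sketch with hypothetical populations:

```python
import numpy as np

def zipf_exponent(populations):
    """OLS rank-size estimate of the Pareto (Zipf) exponent:
    log(rank) = c - alpha * log(population); alpha close to 1 means
    Zipf's law holds for the city-size distribution."""
    pop = np.sort(np.asarray(populations, dtype=float))[::-1]
    rank = np.arange(1, len(pop) + 1)
    return -np.polyfit(np.log(pop), np.log(rank), 1)[0]

# Hypothetical hierarchy following Zipf exactly: city of rank r has
# population P1 / r, where P1 is the largest city's population.
cities = [2_000_000 / r for r in range(1, 51)]
print(zipf_exponent(cities))  # ≈ 1.0
```

An estimated exponent significantly different from 1, as the article finds for the Northeast, is the usual evidence against Zipf's law.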

Relevância:

100.00%

Publicador:

Resumo:

There is no agreement among experimental researchers on whether the points at which a granular material responds with a large change in stresses, strains or excess pore water pressure, given a prescribed small input of some of those same variables, define a straight line or a curve in stress space. This line, known as the instability line, may also vary in shape and position depending on whether the onset of instability is measured in drained or undrained triaxial tests. Failure of granular materials, which may be preceded by the onset of instability, is a subject that geotechnical engineers deal with in daily practice, and it is generally associated with phenomena observed not only in laboratory tests but also in the field; examples are the liquefaction of loose sands under undrained loading and diffuse instability under drained loading. This research presents results of DEM simulations of undrained triaxial tests aimed at studying the influence of stress history and relative density on the onset of instability in granular materials. Micro-mechanical analysis, including the evolution of coordination numbers and fabric tensors, is performed to gain further insight into the particle-scale interactions that underlie this instability. Beyond providing greater understanding, the results presented here may be useful as input for macro-scale constitutive models that enable prediction of the onset of instability in boundary value problems.
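A basic micro-mechanical quantity tracked in such DEM analyses is the coordination number; a brute-force sketch for a small hypothetical packing:

```python
import numpy as np

def coordination_number(centers, radii, tol=1e-9):
    """Average coordination number of a sphere packing: a contact is
    counted when two spheres overlap or just touch (gap <= tol)."""
    n = len(centers)
    contacts = 0
    for i in range(n):
        for j in range(i + 1, n):
            gap = np.linalg.norm(centers[i] - centers[j]) - (radii[i] + radii[j])
            if gap <= tol:
                contacts += 1
    return 2.0 * contacts / n   # every contact is shared by two particles

# Hypothetical packing: a 3x3x3 simple cubic lattice of touching unit spheres.
centers = np.array([[2.0 * i, 2.0 * j, 2.0 * k]
                    for i in range(3) for j in range(3) for k in range(3)])
radii = np.ones(27)
print(coordination_number(centers, radii))  # 4.0 (boundary spheres lower the mean)
```

A drop in this average below the isostatic value is one common particle-scale precursor of the macroscopic instability the study investigates; production DEM codes use neighbour lists rather than the O(n²) loop above.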

Relevância:

100.00%

Publicador:

Resumo:

This PhD thesis contains three main chapters on macro finance, with a focus on the term structure of interest rates and applications of state-of-the-art Bayesian econometrics. Except for Chapters 1 and 5, which set out the general introduction and conclusion, each chapter can be considered a standalone piece of work. In Chapter 2, we model and predict the term structure of US interest rates in a data-rich environment. We allow the model dimension and parameters to change over time, accounting for model uncertainty and sudden structural changes. The proposed time-varying parameter Nelson-Siegel Dynamic Model Averaging (DMA) approach predicts yields better than standard benchmarks, largely because it incorporates more macro-finance information during recessions. The proposed method also allows us to estimate plausible real-time term premia, whose countercyclicality weakened during the financial crisis. Chapter 3 investigates global term structure dynamics using a Bayesian hierarchical factor model augmented with macroeconomic fundamentals. More than half of the variation in the bond yields of seven advanced economies is due to global co-movement. Our results suggest that global inflation is the most important factor among global macro fundamentals. Non-fundamental factors are essential in driving global co-movements and are closely related to sentiment and economic uncertainty. Lastly, we analyze asymmetric spillovers in global bond markets connected to diverging monetary policies. Chapter 4 proposes a no-arbitrage framework for term structure modeling with learning and model uncertainty. The representative agent considers parameter instability as well as uncertainty in learning speed and model restrictions. The empirical evidence shows that, apart from observational variance, parameter instability is the dominant source of predictive variance when compared with uncertainty in learning speed or model restrictions.
When accounting for ambiguity aversion, the out-of-sample predictability of excess returns implied by the learning model can be translated into significant and consistent economic gains over the Expectations Hypothesis benchmark.
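The cross-sectional Nelson-Siegel curve at the core of the Chapter 2 model can be sketched as follows; the thesis makes the factors time-varying and averages over model specifications, and the parameter values here are purely illustrative:

```python
import numpy as np

def nelson_siegel(maturities, beta0, beta1, beta2, lam):
    """Static Nelson-Siegel yield curve: level (beta0), slope (beta1) and
    curvature (beta2) loadings in the standard Diebold-Li parameterisation.
    This is only the cross-sectional building block of the DMA model."""
    m = np.asarray(maturities, dtype=float)
    slope = (1.0 - np.exp(-lam * m)) / (lam * m)
    curvature = slope - np.exp(-lam * m)
    return beta0 + beta1 * slope + beta2 * curvature

# Hypothetical parameters: upward-sloping curve with a mid-maturity hump.
yields = nelson_siegel([0.25, 1, 2, 5, 10, 30],
                       beta0=4.0, beta1=-2.0, beta2=1.5, lam=0.6)
```

With a negative slope factor the curve rises with maturity and approaches the level `beta0` from below at the long end.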


Relevância:

100.00%

Publicador:

Resumo:

The low tenacity of the Portland cement pastes used in oil-well cementing has motivated several studies focused on alternative materials. Additives have been developed to produce flexible pastes with mechanical strength capable of withstanding the expansion and contraction of the well's metallic casing during steam injection, a technique widely used to increase the recovery factor in reservoirs of highly viscous oil. A fresh paste with inadequate rheological behaviour may seriously compromise the cementing operation, introducing flaws that substantially affect the performance of the hardened paste. This work presents the preparation and rheological analysis of Portland cement pastes with the addition of scrap tire rubber in several proportions, with the aim of minimising damage to the cement sheath of these wells. Thermogravimetric analysis showed that rubber particles passing a 0.5 mm (35 mesh) sieve and surface-treated with 1 mol/L NaOH solution exhibited adequate thermal resistance for wells subjected to thermal cycling. The study was evaluated on the basis of the rheological analysis of the pastes, complemented by mechanical testing and by measurements of thickening time, stability, free-water content and fluid loss, using an additive-free reference paste as the benchmark. The results showed satisfactory rheology requiring only minor corrections; a considerable loss of mechanical strength (tensile and compressive), offset by gains in tenacity, though within limits established for oil-well application; and satisfactory stability, free-water content and thickening time.
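Oil-well cement slurry rheology is conventionally summarised with the Bingham plastic model; a minimal fitting sketch on synthetic viscometer readings (the rates and stresses below are illustrative, not the study's data):

```python
import numpy as np

def bingham_fit(shear_rate, shear_stress):
    """Least-squares fit of the Bingham plastic model
    tau = tau0 + mu_p * gamma_dot (tau0: yield stress, mu_p: plastic
    viscosity), routinely used to characterise cement slurries."""
    A = np.column_stack([np.ones(len(shear_rate)), shear_rate])
    tau0, mu_p = np.linalg.lstsq(A, np.asarray(shear_stress, dtype=float),
                                 rcond=None)[0]
    return tau0, mu_p

# Hypothetical shear rates (1/s) and a synthetic Bingham fluid with
# tau0 = 6 Pa and plastic viscosity 0.04 Pa.s.
rates = np.array([5.1, 10.2, 51.0, 102.0, 170.0, 340.0, 511.0])
stresses = 6.0 + 0.04 * rates
tau0, mu_p = bingham_fit(rates, stresses)
```

Comparing `tau0` and `mu_p` between the rubber-filled pastes and the reference paste is one way the "satisfactory rheology, requiring only minor corrections" conclusion would be quantified.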

Relevância:

100.00%

Publicador:

Resumo:

Time series of commercial landings from the Algarve (southern Portugal) from 1982 to 1999 were analyzed using min/max autocorrelation factor analysis (MAFA) and dynamic factor analysis (DFA). These techniques were used to identify trends and explore the relationships between the response variables (annual landings of 12 species) and explanatory variables [sea surface temperature, rainfall, an upwelling index, Guadiana river (south-east Portugal) flow, the North Atlantic oscillation, the number of licensed fishing vessels and the number of commercial fishermen]. Landings were more highly correlated with non-lagged environmental variables and in particular with Guadiana river flow. Both techniques gave coherent results, with the most important trend being a steady decline over time. A DFA model with two explanatory variables (Guadiana river flow and number of fishermen) and three common trends (smoothing functions over time) gave good fits to 10 of the 12 species. Results of other models indicated that river flow is the more important explanatory variable in this model. Changes in the mean flow and discharge regime of the Guadiana river resulting from the construction of the Alqueva dam, completed in 2002, are therefore likely to have a significant and deleterious impact on Algarve fisheries landings.
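As a crude stand-in for the common-trend idea behind DFA, the leading singular vector of the standardized series already extracts a shared decline from synthetic data; the real DFA models noise structure, trend smoothness and covariates (such as Guadiana river flow) explicitly, and all numbers below are made up:

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1982, 2000)

# Hypothetical landings for 5 species sharing one declining common trend.
decline = -0.1 * (years - years[0])
landings = decline[:, None] + 0.3 * rng.standard_normal((len(years), 5))

# Standardize each series, then take the leading left singular vector as a
# crude common trend.
z = (landings - landings.mean(axis=0)) / landings.std(axis=0)
u, s, vt = np.linalg.svd(z, full_matrices=False)
common_trend = u[:, 0] * s[0]

# How well does the extracted trend track the driver that generated the data?
r = np.corrcoef(common_trend, decline)[0, 1]
```

Because the sign of a singular vector is arbitrary, only the magnitude of the correlation with a candidate explanatory variable is meaningful.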

Relevância:

100.00%

Publicador:

Resumo:

Understanding fluctuations in population abundance is a central question in fisheries. The sardine fishery is of great importance to Portugal; it is data-rich and of primary concern to fisheries managers. In Portugal, sub-stocks of Sardina pilchardus (sardine) are found in different regions: the Northwest (IXaCN), the Southwest (IXaCS) and the South coast (IXaS-Algarve). Each sardine sub-stock is affected differently by a unique set of climate and ocean conditions, mainly during larval development and recruitment, which consequently affects the sardine fishery in the short term. Taking this hypothesis into consideration, we examined the effects of hydrographic (river discharge), sea surface temperature, wind-driven, upwelling, climatic (North Atlantic Oscillation) and fisheries (fishing effort) variables on S. pilchardus catch rates (landings per unit effort, LPUE, as a proxy for sardine biomass). A 20-year time series (1989-2009) was used for the different subdivisions of the Portuguese coast (sardine sub-stocks). A multi-model approach was applied, using different time series models for data fitting (dynamic factor analysis, generalised least squares) and forecasting (autoregressive integrated moving average), as well as surplus production stock assessment models. The different models were evaluated and compared, and the most important variables explaining changes in LPUE were identified. The type of relationship between sardine catch rates and environmental variables varied across regions owing to region-specific recruitment responses. Seasonality plays an important role in sardine variability within the three study regions. In IXaCN, autumn (the season with minimum spawning activity and minimum larva and egg concentrations) SST, northerly wind and wind magnitude were negatively related to LPUE. In IXaCS, none of the explanatory variables tested was clearly related to LPUE.
In IXaS-Algarve (southern Portugal), both spring (the period when large abundances of larvae are found) northerly wind and wind magnitude were negatively related to LPUE, revealing that environmental effects coincide with the regional peak in spawning time. Overall, the results suggest that management of small, short-lived pelagic species, such as sardine quotas and sustainable yields, should be adapted to a regional scale because of regional environmental variability.
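A minimal stand-in for the ARIMA forecasting component, fit here on a noise-free synthetic series so the fit is exact (the study's actual models and data are richer):

```python
import numpy as np

def ar1_forecast(series, steps):
    """Fit an AR(1) model y_t = c + phi * y_{t-1} by least squares and
    iterate it forward; a minimal stand-in for the ARIMA models used to
    forecast the landings-per-unit-effort (LPUE) series."""
    y = np.asarray(series, dtype=float)
    X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
    c, phi = np.linalg.lstsq(X, y[1:], rcond=None)[0]
    forecasts, last = [], y[-1]
    for _ in range(steps):
        last = c + phi * last
        forecasts.append(last)
    return np.array(forecasts)

# Synthetic series generated by y_{t+1} = 2 + 0.8 * y_t, so the forecast
# simply continues the same recursion.
y = [0.0]
for _ in range(29):
    y.append(2.0 + 0.8 * y[-1])
fc = ar1_forecast(y, steps=3)
```

A full ARIMA adds differencing and moving-average terms on top of this autoregressive core, which matters for the strongly seasonal LPUE series.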

Relevância:

100.00%

Publicador:

Resumo:

The challenging requirements set for new full-composite aeronautical structures mostly concern demonstrating the damage tolerance capability of primary structures, as required by the airworthiness authorities. While composite structures inherently demonstrate exceptional fatigue properties, under real-life working conditions a number of external factors can cause impact damage, drastically reducing fatigue resistance through fiber delamination, disbonding or breakage. This PhD thesis aims to contribute to a better understanding of the behavior of primary composite aeronautical structures after near-edge impacts, which are inevitable during the service life of an aircraft. The behavior of CFRP structures after impact is only one small piece of the big picture that is the certification of CFRP-built aircraft, in which several other parameters must be evaluated to fulfill the airworthiness requirements. These parameters are also discussed in this thesis to give a better understanding of the complex task of CFRP structure certification, in which the behavior of the impacted structure plays an important role. An experimental and numerical campaign was carried out to determine the level of delamination damage in CFRP specimens after near-edge impacts. By calibrating the numerical model with experimental data it was possible, for different configurations and energy levels, to predict the extent of delamination in a CFRP structure and to estimate its residual static strength using a very simple but robust technique. The original contribution of this work to the analysis of CFRP structures is a model applicable to a wide range of thicknesses and stacking sequences, and thus potentially suitable for industrial application as well.

Resumo:

Objectives: Children with cleft palate (CP) have a high prevalence of sinusitis. Considering that the properties of nasal mucus play a pivotal role in the upper airway defense mechanism, the aim of the study was to evaluate the transportability and physical properties of nasal mucus from children with CP. Setting: Hospital for Rehabilitation of Craniofacial Anomalies, School of Dentistry, University of Sao Paulo, Bauru, SP, Brazil, and Laboratory of Experimental Air Pollution, School of Medicine, University of Sao Paulo, Sao Paulo, SP, Brazil. Methods: Nasal mucus samples were collected by nasal aspiration from children with CP and without CP (non-CP). Sneeze clearance (SC) was evaluated with a simulated sneeze machine, and in vitro mucus transportability by cilia (MCT) was evaluated with the frog palate preparation. Mucus surface properties were assessed by measuring the contact angle (CA). Mucus rheology was determined with a magnetic rheometer, and the results were expressed as log G* (the vector sum of viscosity and elasticity) and tan delta (the ratio of viscosity to elasticity), measured at 1 and 100 rad/s. Results: Mucus samples from children with CP had a higher SC than those from non-CP children (67 +/- 30 vs. 41 +/- 24 mm, p < 0.05), a lower CA (24 +/- 16 vs. 35 +/- 11 degrees, p < 0.05) and a higher tan delta at 100 rad/s (0.79 +/- 0.24 vs. 0.51 +/- 0.12, p < 0.05). There were no significant differences in MCT, log G* at 1 rad/s, tan delta at 1 rad/s, or log G* at 100 rad/s between CP and non-CP children. Conclusions: The physical properties of nasal mucus from children with CP are associated with higher sneeze transportability. The high prevalence of sinusitis in children with CP cannot be explained by changes in mucus physical properties and transportability. (C) 2008 Elsevier Ireland Ltd. All rights reserved.
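The rheological quantities reported above follow directly from the storage (elastic) modulus G' and loss (viscous) modulus G'' measured at each frequency. A minimal sketch of the calculation; the moduli values below are hypothetical illustrations, not the study's data:

```python
import math

def complex_modulus(g_storage, g_loss):
    """Return (log10 |G*|, tan delta) from storage and loss moduli.

    |G*| is the vector sum of the elastic (G') and viscous (G'') moduli;
    tan delta = G'' / G' expresses the viscous/elastic balance.
    """
    g_star = math.hypot(g_storage, g_loss)   # |G*| = sqrt(G'^2 + G''^2)
    tan_delta = g_loss / g_storage
    return math.log10(g_star), tan_delta

# Hypothetical moduli (Pa) at 1 and 100 rad/s, for illustration only
for omega, gp, gpp in [(1, 8.0, 4.0), (100, 20.0, 15.8)]:
    log_g, td = complex_modulus(gp, gpp)
    print(f"omega={omega} rad/s: log G* = {log_g:.2f}, tan delta = {td:.2f}")
```

A tan delta above 1 would indicate a predominantly viscous mucus; values below 1, as reported for both groups, indicate predominantly elastic behavior.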

Resumo:

The rheological properties of fresh gluten in small-amplitude oscillatory shear (SAOS) and in creep recovery after short application of stress were related to the hearth bread baking performance of wheat flours using the multivariate statistical method partial least squares (PLS) regression. The picture was completed by dough mixing and extensional properties, flour protein size distribution determined by SE-HPLC, and high molecular weight glutenin subunit (HMW-GS) composition. The sample set comprised 20 wheat cultivars grown at two different levels of nitrogen fertilizer in one location. Flours yielding stiffer and more elastic glutens, with higher elastic and viscous moduli (G' and G'') and lower tan delta values in SAOS, gave doughs that were better able to retain their shape during proving and baking, resulting in breads of high form ratios. Creep recovery measurements after short application of stress showed that glutens from flours of good breadmaking quality had high relative elastic recovery. The nitrogen fertilizer level affected the protein size distribution through an increase in monomeric proteins (gliadins), which gave glutens of higher tan delta and flatter bread loaves (lower form ratio).
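The relative elastic recovery measured in the creep-recovery tests can be expressed as the fraction of the peak creep strain that is recovered once the stress is removed. A minimal sketch with hypothetical strain values (not the study's measurements):

```python
def relative_elastic_recovery(strain_max, strain_residual):
    """Fraction of peak creep strain recovered after stress removal.

    strain_max: strain at the end of the creep (loading) phase
    strain_residual: permanent strain remaining after recovery
    """
    return (strain_max - strain_residual) / strain_max

# Hypothetical strains: a strong, elastic gluten recovers most of its
# deformation after the stress is removed
print(round(relative_elastic_recovery(0.20, 0.05), 4))  # -> 0.75
```

A value near 1 corresponds to the highly elastic glutens that the study associates with good breadmaking quality; a value near 0 indicates mostly viscous, unrecoverable flow.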

Resumo:

The effect of adding fractions of specific molecular weight on the rheological properties of gluten was investigated. Fractions extracted from Hereward, Riband and Soissons flours were added to the dough prior to gluten extraction. Once extracted, the glutens were subjected to temperature sweeps and creep recovery rheological tests. In the temperature sweeps, Hereward fractions containing the larger polypeptides had a strengthening effect on the gluten, indicated by a decrease in tan delta and an increase in elastic creep recovery, while fractions comprising monomeric gliadins had a weakening effect; adding total gluten also had a strengthening effect. For the biscuit-making flour Riband, the results were the reverse: all fractions appeared to strengthen the gluten network, while the addition of total gluten had no strengthening effect. For Soissons gluten, the addition of total gluten had a strengthening effect, while adding any individual fraction weakened the gluten. The results were confirmed by the creep recovery tests.
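The strengthening criterion used above (tan delta decreases and elastic creep recovery increases when a fraction is added) can be stated compactly. A sketch with hypothetical before/after measurements, labeled as such:

```python
def effect_on_gluten(tan_delta_before, tan_delta_after,
                     recovery_before, recovery_after):
    """Classify a fraction's effect using the criteria in the text:
    strengthening = tan delta decreases AND elastic recovery increases;
    weakening = the opposite; anything else is ambiguous/mixed."""
    stronger = (tan_delta_after < tan_delta_before
                and recovery_after > recovery_before)
    weaker = (tan_delta_after > tan_delta_before
              and recovery_after < recovery_before)
    return "strengthening" if stronger else "weakening" if weaker else "mixed"

# Hypothetical values: a polymeric fraction lowers tan delta and
# raises elastic recovery
print(effect_on_gluten(0.45, 0.38, 0.60, 0.72))  # -> strengthening
```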

Resumo:

Gluten was extracted from flours of several wheat varieties of varying baking quality. Creep compliance was measured at room temperature, and tan delta was measured over a range of temperatures from 25 to 95 degrees C. The extracted glutens were heat-treated for 20 min at 25, 40, 50, 60, 70 and 90 degrees C in a water bath, freeze-dried and ground to a fine powder. Tests were carried out for extractability in sodium dodecyl sulphate, free sulphydryl (SH) groups using Ellman's method, surface hydrophobicity, and molecular weight (MW) distribution (MWD) using field-flow fractionation and multi-angle laser light scattering. With increasing temperature, the glutens showed a decrease in extractability (most rapid between 70 and 90 degrees C), a major transition in tan delta at around 60 degrees C with a minor transition at 40 degrees C for most varieties, a decrease in free SH groups and surface hydrophobicity, and a shift in the MWD towards higher MW. The poor bread-making variety Riband showed the highest values of tan delta and Newtonian compliance, the lowest content of free SH groups, and the largest increase in the HMW/LMW ratio with increasing temperature. No significant correlations were found between any of the measured parameters and baking volume. (c) 2007 Elsevier Ltd. All rights reserved.
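Quantifying free SH groups by Ellman's method reduces to a Beer-Lambert calculation on the absorbance of the released TNB chromophore at 412 nm. A minimal sketch; the extinction coefficient (about 14,150 per M per cm is a commonly cited literature value for TNB) and the absorbance reading are illustrative assumptions, not the study's numbers:

```python
def free_sh_concentration(a412, epsilon=14150.0, path_cm=1.0):
    """Molar free-SH concentration from Ellman's assay via Beer-Lambert:
    c = A / (epsilon * l).

    epsilon: TNB molar extinction coefficient at 412 nm (assumed
    literature value, in 1/(M*cm)); path_cm: cuvette path length.
    """
    return a412 / (epsilon * path_cm)

# Hypothetical absorbance reading from a gluten extract
c = free_sh_concentration(0.283)
print(f"{c * 1e6:.1f} uM free SH")  # -> 20.0 uM free SH
```

Dividing by the sample's protein concentration would then give the free SH content per unit protein, the form in which such results are usually compared across heat treatments.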