993 results for "Wide range determination"


Relevance: 90.00%
Abstract:

Background: Meta-analysis is increasingly employed as a screening procedure in large-scale association studies to select promising variants for follow-up studies. However, standard methods for meta-analysis require the assumption of an underlying genetic model, which is typically unknown a priori. This drawback can introduce model misspecification, making power suboptimal, or require the evaluation of multiple genetic models, which increases the number of false-positive associations, ultimately wasting resources on fruitless replication studies. We used simulated meta-analyses of large genetic association studies to investigate naive strategies of genetic model specification for optimizing screenings of genome-wide meta-analysis signals for further replication. Methods: Different methods, meta-analytical models and strategies were compared in terms of power and type-I error. Simulations were carried out for a binary trait over a wide range of true genetic models, genome-wide thresholds, minor allele frequencies (MAFs), odds ratios and between-study heterogeneity (tau²). Results: Among the investigated strategies, a simple Bonferroni-corrected approach that fits both multiplicative and recessive models was found to be optimal in most examined scenarios, reducing the likelihood of false discoveries and enhancing power in scenarios with small MAFs, in either the presence or absence of heterogeneity. Nonetheless, this strategy is sensitive to tau² whenever the susceptibility allele is common (MAF ≥ 30%), resulting in an increased number of false-positive associations compared with an analysis that considers only the multiplicative model. Conclusion: Invoking a simple Bonferroni adjustment and testing both multiplicative and recessive models is fast and optimal in large meta-analysis-based screenings. However, care must be taken when the examined variants are common, where specification of the multiplicative model alone may be preferable.
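The Bonferroni-corrected dual-model screening rule recommended in the conclusion can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the p-values are hypothetical and the 5×10⁻⁸ genome-wide threshold is an assumed convention.

```python
# Sketch of the screening rule: a variant is flagged when either the
# multiplicative or the recessive model reaches genome-wide significance,
# with the alpha level split (Bonferroni) across the two model tests.
# All numeric values here are hypothetical examples.

GENOME_WIDE_ALPHA = 5e-8  # a commonly used genome-wide threshold (assumed)
N_MODELS = 2              # multiplicative and recessive

def passes_screening(p_multiplicative: float, p_recessive: float,
                     alpha: float = GENOME_WIDE_ALPHA) -> bool:
    """Flag a variant for replication if either genetic model is
    significant after Bonferroni correction for the two tests."""
    corrected_alpha = alpha / N_MODELS
    return min(p_multiplicative, p_recessive) < corrected_alpha

# A variant significant under the recessive model only:
print(passes_screening(1e-3, 1e-9))   # True
# A variant that misses the corrected threshold under both models:
print(passes_screening(4e-8, 6e-8))   # False
```

Splitting alpha over two models trades a small loss of power under the multiplicative model for the ability to catch recessive effects that a single-model screen would miss, which is the trade-off the simulations quantify.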

Relevance: 90.00%
Abstract:

In this paper, we develop a theory for diffusion and flow of pure sub-critical adsorbates in microporous activated carbon over a wide range of pressure, from very low pressure up to high pressures where capillary condensation occurs. The theory requires no fitting parameter; the only information needed for prediction is the complete pore size distribution of the activated carbon. Various interesting behaviours of permeability versus loading are observed, such as a maximum in permeability at high loading (occurring at a relative pressure of about 0.8-0.9). The theory is tested against diffusion and flow of benzene through a commercial activated carbon, and the agreement is found to be very good given that there is no fitting parameter in the model. (C) 2001 Elsevier Science B.V. All rights reserved.

Relevance: 90.00%
Abstract:

A QuEChERS method has been developed for the determination of 14 organochlorine pesticides in 14 soils, with a wide range of compositions, from different Portuguese regions. The extracts were analysed by GC-ECD (gas chromatography with electron-capture detection) and confirmed by GC-MS/MS (tandem mass spectrometry). The organic matter content is a key factor in the process efficiency. An optimization was carried out according to the soils' organic carbon level, divided into two groups: HS (organic carbon > 2.3%) and LS (organic carbon < 2.3%). The method was validated through linearity, recovery, precision and accuracy studies. Quantification was carried out using matrix-matched calibration to minimize matrix effects. Acceptable recoveries were obtained (70-120%) with a relative standard deviation of ≤16% for the three levels of contamination. In HS soils, the limits of detection ranged from 3.42 to 23.77 μg kg⁻¹ and the limits of quantification from 11.41 to 79.23 μg kg⁻¹. In LS soils, the limits of detection ranged from 6.11 to 14.78 μg kg⁻¹ and the limits of quantification from 20.37 to 49.27 μg kg⁻¹. Of the 14 collected soil samples, only one showed a residue of dieldrin (45.36 μg kg⁻¹) above the limit of quantification. This methodology combines the advantages of QuEChERS, GC-ECD detection and GC-MS/MS confirmation, producing a very rapid, sensitive and reliable procedure that can be applied in routine analytical laboratories.
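Detection and quantification limits of the kind reported above are commonly derived from the matrix-matched calibration itself. The sketch below uses the widely applied 3.3σ/slope and 10σ/slope convention (ICH-style); the residual standard deviation and slope are hypothetical numbers, not the paper's data.

```python
# Calibration-based LOD/LOQ, assuming the common 3.3*sigma/slope and
# 10*sigma/slope convention. Inputs are illustrative, not from the study.

def detection_limits(residual_sd: float, slope: float) -> tuple[float, float]:
    """Return (LOD, LOQ) in the calibration's concentration units,
    given the residual standard deviation and slope of the curve."""
    lod = 3.3 * residual_sd / slope
    loq = 10.0 * residual_sd / slope
    return lod, loq

# Hypothetical matrix-matched calibration: sd in peak-area units,
# slope in peak-area units per (µg/kg).
lod, loq = detection_limits(residual_sd=0.021, slope=0.0061)
print(f"LOD = {lod:.2f} µg/kg, LOQ = {loq:.2f} µg/kg")
```

Because the calibration is matrix-matched, σ and the slope (and hence LOD/LOQ) differ between the HS and LS soil groups, which is consistent with the two sets of limits reported in the abstract.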

Relevance: 90.00%
Abstract:

Nowadays it is necessary to build healthier lifestyle habits that contribute to the population's well-being. Adopting such measures and practices in a regular, disciplined way can reduce the risk of developing certain diseases, such as obesity, cardiovascular disease, hypertension, diabetes, some types of cancer and many others. It is also worth stressing that a careful diet promotes health and increases average life expectancy. In Portugal, the population's eating habits have changed significantly in recent years. Home-cooked meals prepared with fresh produce are giving way to the so-called "fast food culture". At the same time, consumers are increasingly demanding and permanently alert to the condition of their food. Beyond its advertising role, product labelling has been the object of specific legislation intended to provide simple, clear information on the composition, quality, quantity, shelf life and other characteristics of a product. This information must be accessible to every kind of audience, regardless of education or social background. Product quality and safety must rest on the assurance that all ingredients, packaging materials and production processes are suited to producing safe, healthy and tasty products. Silliker Portugal, S.A. is an independent service provider for the agri-food sector and a world leader in services for improving food quality and safety. Silliker helps companies find solutions to the sector's current challenges, offering a wide range of services, including microbiological, chemical and sensory analysis; food safety and development consultancy; audits; and labelling and legislation.
The permanent updating of procedures in pursuit of continuous improvement is one of the company's goals. This work arose in response to one of the challenges posed to Silliker: developing a new method for determining fatty acids and total fat in different types of food and comparing the results with those obtained with the analytical method adopted until then. While fat is an element of great importance in the diet owing to its nutritional and organoleptic properties, researchers have recently focused their attention on the various fatty acids (saturated, monounsaturated and polyunsaturated), in particular on the essential fatty acids and on the isomers of conjugated linoleic acid (CLA), a mixture of positional and geometric isomers of linoleic acid with important biological activity. The technique used in the determinations was gas chromatography with flame-ionization detection (GC-FID), the samples having been previously treated and extracted according to the type of matrix. The analytical methodology developed allowed the correct evaluation of the fatty acid profile, using a mixture of 37 methyl esters in which the fatty acid C13:0 served as internal standard. Identification was based on the retention time of each fatty acid in the mixture, and quantification used the response factors. Validation of the implemented method was based on the results obtained for three matrices of materials certified by BIPEA (Bureau Interprofessionnel des Etudes Analytiques), for which twelve replicates of each matrix were run. The fat content was calculated for each replicate and the result then compared with the value issued by the certifying body. After analysing each constituent it was also possible to calculate the content of saturated, monounsaturated and polyunsaturated fatty acids.
The determination of the fatty acid profile of the certified materials was acceptable in view of the values obtained, which fell within the admissible ranges indicated in the reports. For the "Paté à Tartinier" matrix, the fat quantification gave a z-score of 4.3, which under Silliker's internal requirements is not valid. For the other two matrices ("Mélange Nutritif" and "Plat cuisiné à base de viande") the z-scores were 0.7 and -1.0, respectively, supporting the validity of the method. For the method to be adopted as an alternative method, a broader study covering samples with different compositions is needed. The method was applied to samples of ham, whole milk, cheese, omega-3 eggs, peanuts and sunflower oil, and the results were compared with those obtained with the previously adopted method.
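The z-scores quoted above follow the standard proficiency-testing definition. The sketch below reconstructs that calculation with hypothetical laboratory and assigned values; the |z| ≤ 2 acceptance criterion is a common convention and an assumption here, not necessarily Silliker's internal rule.

```python
# Proficiency-testing z-score: how far a laboratory result falls from
# the assigned (certified) value, in units of the assigned standard
# deviation. All numeric inputs below are hypothetical.

def z_score(lab_mean: float, assigned_value: float, assigned_sd: float) -> float:
    """z = (laboratory result - assigned value) / assigned standard deviation."""
    return (lab_mean - assigned_value) / assigned_sd

def acceptable(z: float) -> bool:
    """Common convention: |z| <= 2 is satisfactory (assumed criterion)."""
    return abs(z) <= 2.0

# Hypothetical fat contents in g/100 g for one certified matrix:
z = z_score(lab_mean=23.1, assigned_value=21.8, assigned_sd=0.62)
print(round(z, 1), acceptable(z))
```

Under this convention, the z-scores of 0.7 and -1.0 reported for two of the matrices are satisfactory, while the 4.3 obtained for "Paté à Tartinier" is well outside any usual acceptance band, matching the abstract's conclusion.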

Relevance: 90.00%
Abstract:

This work, carried out within the scope of a Master's thesis, has as its main objective the study of the pozzolanic characteristics of materials from the Arganil area for the partial replacement of Portland cement, with the aim of enhancing certain qualities through the reduction of concrete porosity. These qualities are of interest when greater durability is sought. To this end, several tests were performed for the physical, chemical and mineralogical characterization of the products. The metakaolins used were obtained from clay samples subjected to calcination (750 °C for one hour), a process that achieved almost complete dehydroxylation of the raw material, turning it into an amorphous, irreversible phase with pozzolanic properties. The results of the characterization tests are presented for the raw material, the calcination conditions and the product resulting from dehydroxylation, namely the determination of pozzolanicity and specific surface area and of the characteristics fundamental to the product's applicability. The use of metakaolin in conventional-strength concrete is also described. The influence of metakaolin (15% cement replacement, by mass) on flexural and compressive strength (at 28 days) was studied in mortars, as was the effect of metakaolin (10%, 15% and 20% cement replacement, by mass) on compressive strength (at 3, 7 and 28 days) in concrete.

Relevance: 90.00%
Abstract:

There is an extensive literature arguing that the fall cone (cone penetrometer) test is a reasonable alternative to the more traditional method for determining the liquid limit, the Casagrande cup. This work aims to make a further contribution to the topic of obtaining soil plasticity parameters using different devices and distinct methodologies. To this end, a clayey soil from a pit in the Chaves area was selected, and comparisons were established between the consistency limits obtained with the Casagrande cup and with the fall cone. In this context, a characterization of this type of soil was first drawn up, together with the definition of important concepts such as the liquid limit, plastic limit and plasticity index, as well as a description of the operation of the two devices and the variables associated with each. The soil was classified according to three systems, through identification and characterization tests, with the objective of inferring its composition and behaviour. Another objective of this work was to study the potential influence of the operator on the results obtained with both devices. It was thus possible to draw conclusions about the advantages and disadvantages of each apparatus and to define prospects for future work.

Relevance: 90.00%
Abstract:

This dissertation was carried out in an industrial environment at a company whose main areas of activity are aluminium injection and machining. The components injected and machined there later serve the automotive industry in a wide range of applications, from components for hydraulic braking systems to components for lubrication systems. The dissertation addressed the reduction of unplanned downtime on a machining line, so the main causes of those stoppages were identified. Using the SMED methodology, together with a broad set of auxiliary root-cause techniques such as the 5 Whys, the Ishikawa diagram, 5S and Kaizen, several improvements were proposed to optimize the changeover procedure, including a set of rules aimed at the sustainability of the process in the medium and long term. With the improvements adopted, a 27.9% reduction in total changeover time was achieved compared with the time recorded before the implementation of SMED. A 75% reduction in the time needed to mount and dismount the fixtures in the third process of the line under study stands out. It also proved crucial that, for the SMED technique to succeed in terms of efficiency and sustainability, a set of "Lean Management" and work-standardization tools must be adopted.

Relevance: 90.00%
Abstract:

Dissertation for the degree of Master in Chemical and Biochemical Engineering

Relevance: 90.00%
Abstract:

Master's dissertation in Characterization Techniques and Chemical Analysis

Relevance: 90.00%
Abstract:

The haemolymph of Panstrongylus megistus showed natural lectin activity against a wide range of vertebrate erythrocytes. Agglutination was observed against all vertebrate erythrocytes tested (human ABO, duck, rabbit, mouse, sheep, chicken and cow). Cow erythrocytes showed the lowest titre. Concerning human erythrocytes, the lectin activity was similar for types A+, B+ and AB+, while the highest activity was observed for type O+. Determination of minimal inhibitory concentrations was carried out with human type O+ erythrocytes. Agglutination was inhibited by several carbohydrates (rhamnose, D-galactose, raffinose, D-lactose and D-fucose). Rhamnose was the strongest inhibitor (0.78 mM). The results suggest the presence of more than one lectin in the haemolymph of P. megistus.

Relevance: 90.00%
Abstract:

The species × location interaction was of great importance in explaining the behaviour of the genetic material. The study presented here shows, for the first time, the performance under field conditions of the new species tritordeum, compared with wheat and triticale, across a wide range of Mediterranean countries (Spain, Lebanon and Tunisia). The results revealed that, despite the diversity of environmental conditions, the main differences in yield were due to genotypes, especially to differences between species. The multi-location study under different growth conditions yielded important information about the effect of water availability on yield. In the lowest-yielding environments (rainfed Tunisia), tritordeum and triticale yields were equivalent. However, under better growth conditions (Spain), tritordeum yield was lower than that of wheat and triticale. Interestingly, when water limitation extended through the pre-anthesis period, differences in yield between tritordeum and wheat or triticale were larger than when water stress occurred during anthesis. These variations are explained by the fact that kernel weight was found to be the limiting factor for yield determination in tritordeum, and a delay in the anthesis date may have caused the low kernel weight and low yield under Mediterranean drought conditions. Such yield differences between tritordeum and wheat or triticale could be explained by tritordeum being a relatively new species, with far fewer resources devoted to its improvement than to wheat and triticale. Our results suggest that breeding efforts should be directed towards an earlier anthesis date and a longer grain-filling period. Tritordeum proved to have potential as a new crop for drought environments, since its performance was quite close to that of wheat and triticale. Besides, it has qualitative added value that may improve farmers' income per unit of land.

Relevance: 90.00%
Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields.
The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high- resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity as well as the larger-scale trend of the prevailing hydraulic conductivity field. The results also indicate that this novel data integration approach is remarkably flexible and robust and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges. In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov Chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems in order to allow for a more accurate and realistic quantification of the uncertainty associated with the thus inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigated two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution. 
The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach was proven to be highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
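The gradual-deformation proposal described in the last paragraph can be sketched as follows: two realizations of the same zero-mean Gaussian prior are blended as m(θ) = m₁·cos θ + m₂·sin θ, which preserves the prior covariance for any θ, so a single parameter tunes the perturbation strength. The dimensions, the value of θ, and the use of an uncorrelated prior are illustrative assumptions, not the thesis's actual setup.

```python
# Minimal sketch of a gradual-deformation MCMC proposal. Because
# cos(theta)^2 + sin(theta)^2 = 1, blending two independent draws from a
# zero-mean Gaussian prior yields another draw from the same prior, with
# theta controlling how far the proposal moves from the current model.

import numpy as np

rng = np.random.default_rng(0)

def gradual_deformation(m_current, m_independent, theta):
    """Blend the current model with an independent prior realization;
    small theta => small perturbation, theta = pi/2 => full resample."""
    return m_current * np.cos(theta) + m_independent * np.sin(theta)

n = 10_000
m1 = rng.standard_normal(n)   # current model realization (assumed iid prior)
m2 = rng.standard_normal(n)   # independent draw from the same prior
proposal = gradual_deformation(m1, m2, theta=0.2)

# The proposal retains the prior's unit variance (up to sampling noise):
print(round(proposal.var(), 2))
```

Compared with sequential resampling, which re-simulates a subset of cells conditioned on the rest, this blend perturbs all model parameters at once by a controllable amount, which is why it reduces the number of iterations needed to obtain independent posterior samples.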

Relevance: 90.00%
Abstract:

To estimate the minimal gene set required to sustain bacterial life in nutritious conditions, we carried out a systematic inactivation of Bacillus subtilis genes. Among approximately 4,100 genes of the organism, only 192 were shown to be indispensable by this or previous work. Another 79 genes were predicted to be essential. The vast majority of essential genes were categorized in relatively few domains of cell metabolism, with about half involved in information processing, one-fifth involved in the synthesis of cell envelope and the determination of cell shape and division, and one-tenth related to cell energetics. Only 4% of essential genes encode unknown functions. Most essential genes are present throughout a wide range of Bacteria, and almost 70% can also be found in Archaea and Eucarya. However, essential genes related to cell envelope, shape, division, and respiration tend to be lost from bacteria with small genomes. Unexpectedly, most genes involved in the Embden-Meyerhof-Parnas pathway are essential. Identification of unknown and unexpected essential genes opens research avenues to better understanding of processes that sustain bacterial life.

Relevance: 90.00%
Abstract:

The modern approach to the development of new chemical entities against complex diseases, especially neglected endemic diseases such as tuberculosis and malaria, is based on the use of defined molecular targets. Among its advantages, this approach allows (i) the search for and identification of lead compounds with defined molecular mechanisms against a defined target (e.g. enzymes from defined pathways), (ii) the analysis of a great number of compounds with a favorable cost/benefit ratio, (iii) the development, even in the initial stages, of compounds with selective toxicity (the fundamental principle of chemotherapy), and (iv) the evaluation of plant extracts as well as of pure substances. The current use of such technology, unfortunately, is concentrated in developed countries, especially in big pharma. This significantly hampers the development of innovative new compounds to treat neglected diseases. The large biodiversity within the territory of Brazil puts the country in a strategic position to develop the rational and sustained exploration of new metabolites of therapeutic value. The country's extent covers a wide range of climates, soil types, and altitudes, providing a unique set of selective pressures for the adaptation of plant life in these scenarios. Chemical diversity is also driven by these forces, in an attempt to best fit the plant communities to the particular abiotic stresses, fauna, and microbes that co-exist with them. Certain areas of vegetation (Amazonian Forest, Atlantic Forest, Araucaria Forest, Cerrado-Brazilian Savanna, and Caatinga) are rich in species and types of environments to be used in the search for natural compounds active against tuberculosis, malaria, and chronic-degenerative diseases.
The present review describes some strategies for searching for natural compounds, whose choice can be based on ethnobotanical and chemotaxonomical studies, and for screening them for their ability to bind to immobilized drug targets and to inhibit their activities. Molecular cloning, gene knockout, protein expression and purification, N-terminal sequencing, and mass spectrometry are the methods of choice to provide homogeneous drug targets for immobilization by optimized chemical reactions. Plant extract preparation, fractionation of promising plant extracts, propagation protocols, and definition of in planta studies to maximize product yield of plant species producing active compounds have to be performed to provide a continuing supply of bioactive materials. Chemical characterization of natural compounds, determination of mode of action by kinetics and other spectroscopic methods (MS, X-ray, NMR), as well as in vitro and in vivo biological assays, chemical derivatization, and structure-activity relationships have to be carried out to provide a thorough knowledge base on which to found the search for natural compounds or their derivatives with biological activity.

Relevance: 90.00%
Abstract:

Bone substitute materials allowing trans-scaffold migration and in-scaffold survival of human bone-derived cells are mandatory for the development of cell-engineered permanent implants to repair bone defects. In this study, we evaluated the influence of the material composition and microstructure of calcium aluminate foam scaffolds on human bone-derived cells. The scaffolds were prepared using a direct foaming method that allows wide-range tailoring of the microstructure for pore size and pore openings. Human fetal osteoblasts (osteo-progenitors) attached to the scaffolds, migrated across the entire bioceramic depending on the scaffold pore size, colonized it, and survived in the porous material for at least 6 weeks. The long-term biocompatibility of the scaffold material for human bone-derived cells was evidenced by in-scaffold determination of cell metabolic activity using a modified MTT assay, a repeated WST-1 assay, and scanning electron microscopy. Finally, we demonstrated that the osteo-progenitors can be covalently bound to the scaffolds using biocompatible click chemistry, thus enhancing the rapid adhesion of the cells to the scaffolds. The different microstructures of the foams therefore influenced the migratory potential of the cells, but not cell viability, and the scaffolds permit covalent, biocompatible chemical binding of the cells to the material, enabling either localized or widespread integration of cells into the scaffolds for cell-engineered implants.