978 results for REFERENCE SAMPLES


Relevance: 60.00%

Abstract:

Testing a new method of nanoindentation using the atomic force microscope (AFM) was the purpose of this research. Nanoindentation is a useful technique for studying the properties of materials on the sub-micron scale. The AFM has been used as a nanoindenter previously; however, several parameters needed to obtain accurate results, including the tip radius and the cantilever sensitivity, can be difficult to determine. To solve this problem, a new method to determine the elastic modulus of a material using the AFM has been proposed by Tang et al. This method models the cantilever and the sample as two springs in series. The ratio of the cantilever spring constant (k) to the diameter of the tip (2a) is treated in the model as a single parameter (α = k/2a). The value of α, together with the cantilever sensitivity, is determined on two reference samples with known mechanical properties and then used to find the elastic modulus of an unknown sample. To determine the reliability and accuracy of this technique, it was tested on several polymers. Traditional depth-sensing nanoindentation was performed for comparison. The elastic modulus values from the AFM were shown to be statistically similar to the nanoindenter results for three of the five samples tested.
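The two-springs-in-series model leads to a simple calibration: the measured compliance C = 1/S is the sum of the cantilever compliance 1/k and the contact compliance 1/(2aE), so C is linear in 1/E and two reference samples suffice to fix both unknowns. A minimal sketch under that assumption (all numerical values are illustrative, not from the study):

```python
# Minimal sketch of the two-springs-in-series calibration (after Tang et al.).
# Assumption: measured compliance C = 1/S is the sum of the cantilever
# compliance 1/k and the contact compliance 1/(2a*E), so C = A + B/E with
# A = 1/k and B = 1/(2a).

def calibrate(C1, E1, C2, E2):
    """Solve C = A + B/E from two reference samples with known moduli."""
    B = (C1 - C2) / (1.0 / E1 - 1.0 / E2)
    A = C1 - B / E1
    return A, B

def unknown_modulus(C_u, A, B):
    """Elastic (reduced) modulus of an unknown sample from its compliance."""
    return B / (C_u - A)
```

With illustrative values k = 40 N/m and 2a = 20 nm, two polymer references at 3 GPa and 1 GPa recover an unknown 2 GPa sample exactly.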

Amniotic fluid cells (AFCs) have been proposed as a valuable source for tissue engineering and regenerative medicine. However, before clinical implementation, rigorous evaluation of this cell source in clinically relevant animal models accepted by regulatory authorities is indispensable. Today, the ovine model is one of the most widely accepted preclinical animal models, in particular for cardiovascular applications. Here, we investigate the isolation and use of autologous ovine AFCs as a cell source for cardiovascular tissue engineering applications. Fetal fluids were aspirated in vivo from pregnant ewes (n = 9) and from explanted uteri post mortem at different gestational ages (n = 91). The amniotic (rather than allantoic) nature of the fluids was verified biochemically, and in vivo samples were compared with post mortem reference samples. Isolated cells revealed an immunohistochemical phenotype similar to ovine bone marrow-derived mesenchymal stem cells (MSCs) and expressed stem cell factors described for embryonic stem cells, such as NANOG and STAT-3. Isolated ovine amniotic fluid-derived MSCs were screened for numerical chromosomal aberrations and successfully differentiated into several mesodermal phenotypes. Myofibroblastic ovine AFC lineages were then successfully used for the in vitro fabrication of small- and large-diameter tissue-engineered vascular grafts (n = 10) and cardiovascular patches (n = 34), laying the foundation for the use of this relevant preclinical in vivo assessment model for future amniotic fluid cell-based therapeutic applications. Copyright © 2013 John Wiley & Sons, Ltd.

Elevation of ketone bodies occurs frequently after parturition during negative energy balance in high-yielding dairy cows. Previous studies showed that hyperketonemia interferes with metabolism, and it is assumed that it impairs the immune response. However, a causative effect of ketone bodies had not previously been shown in vivo, because spontaneous hyperketonemia is usually accompanied by high nonesterified fatty acid (NEFA) and low glucose concentrations. The objective was to study the effects of a beta-hydroxybutyrate (BHBA) infusion and an additional intramammary lipopolysaccharide (LPS) challenge on metabolism and immune response in dairy cows. Thirteen dairy cows received intravenously either a BHBA infusion (group BHBA, n = 5) to induce hyperketonemia (1.7 mmol/L) or an infusion of 0.9% saline solution (control, n = 8) for 56 h. Infusions started at 0900 on day 1 and continued until 1700 two days later. Two udder quarters were challenged with 200 μg of Escherichia coli LPS 48 h after the start of infusion. Blood samples were taken one week and 2 h before the start of infusions as reference samples and hourly during the infusion. Liver and mammary gland biopsies were taken one week before the start of the infusion and 48 h after the start of the infusion, and mammary tissue was additionally sampled 8 h after the LPS challenge (56 h after the start of infusions). Rectal temperature (RT) and somatic cell count (SCC) were measured before and 48 h after the start of infusions and hourly during the LPS challenge. Blood samples were analyzed for plasma glucose, BHBA, NEFA, triglyceride, urea, insulin, glucagon, and cortisol concentrations. The mRNA abundance of factors related to potential adaptations of metabolism and the immune system was measured in liver and mammary tissue biopsies.
Differences between blood constituents, RT, SCC, and mRNA abundance before and 48 h after the start of infusions, and differences in mRNA abundance before and after the LPS challenge, were tested for significance using the GLM procedure of SAS with treatment as a fixed effect. The area under the curve was calculated for blood variables during the 48 h of BHBA infusion and during the LPS challenge, and additionally for RT and SCC during the LPS challenge. Most surprisingly, both plasma glucose and glucagon concentrations decreased during the 48 h of BHBA infusion (P<0.05). During the 48 h of BHBA infusion, serum amyloid A mRNA abundance in the mammary gland was increased (P<0.01), and haptoglobin (Hp) mRNA abundance tended to increase in cows treated with BHBA compared with the control group (P = 0.07). RT, SCC, and candidate genes related to the immune response in the liver were not affected by the BHBA infusion. However, during the LPS challenge the expected increase of both plasma glucose and glucagon concentrations was much less pronounced in the animals treated with BHBA (P<0.05), and the increase in SCC was also much less pronounced in the BHBA-infused animals (P<0.05) than in the controls. An increased BHBA infusion rate to keep plasma BHBA constant could not fully compensate for the decrease in plasma BHBA during the LPS challenge, which indicates that BHBA is used as an energy source during the immune response. In addition, BHBA-infused animals showed a more pronounced increase in mRNA abundance of IL-8, IL-10, and citrate synthase in the mammary tissue of LPS-challenged quarters (P<0.05) than control animals. The results demonstrate that infusion of BHBA affects metabolism through a decreased plasma glucose concentration, which is likely related to a decreased release of glucagon during hyperketonemia and during additional inflammation. It also affects the systemic and mammary immune response, which may reflect the increased susceptibility to mastitis during spontaneous hyperketonemia.
The apparent reduction of gluconeogenesis in response to BHBA infusion may be a mechanism to stimulate the use of BHBA as an energy source instead of glucose, and/or to save oxaloacetate for the citric acid cycle rather than gluconeogenesis and, as a consequence, to reduce ketogenesis.
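The area-under-the-curve summaries computed for the blood variables amount to a trapezoidal integration over the hourly samples; a minimal sketch (times and values are illustrative, not from the study):

```python
# Trapezoidal area under the curve for a sampled blood variable.
# times_h: sampling times in hours; values: the measured concentrations.
def auc(times_h, values):
    area = 0.0
    for i in range(1, len(times_h)):
        # each interval contributes the average value times the time step
        area += 0.5 * (values[i] + values[i - 1]) * (times_h[i] - times_h[i - 1])
    return area
```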

Hyperketonemia interferes with metabolic regulation in dairy cows. It is assumed that the metabolic and endocrine changes during hyperketonemia also affect metabolic adaptations during inflammatory processes. We therefore studied the systemic and local intramammary effects of elevated plasma β-hydroxybutyrate (BHBA) before and during the response to an intramammary lipopolysaccharide (LPS) challenge. Thirteen dairy cows received intravenously either a Na-DL-β-OH-butyrate infusion (n = 5) to achieve a constant plasma BHBA concentration (1.7 ± 0.1 mmol/L), with the infusion rate adjusted on the basis of immediate plasma BHBA measurements every 15 min, or an infusion of 0.9% NaCl solution (control; n = 8) for 56 h. Infusions started at 0900 h on d 1 and continued until 1700 h 2 d later. Two udder quarters were challenged with 200 μg of Escherichia coli LPS and 2 udder quarters were treated with 0.9% saline solution as control quarters at 48 h after the start of infusion. Blood samples were taken at 1 wk and 2 h before the start of infusions as reference samples and hourly during the infusion. Mammary gland biopsies were taken 1 wk before, and 48 and 56 h (8 h after LPS challenge) after the start of infusions. The mRNA abundance of key factors related to BHBA and fatty acid metabolism, and of glucose transporters, was determined in the mammary tissue biopsies. Blood samples were analyzed for plasma glucose, BHBA, nonesterified fatty acid, urea, insulin, glucagon, and cortisol concentrations. No effect of the BHBA infusion on the mRNA abundance of any of the measured target genes in the mammary gland was detected before the LPS challenge. The intramammary LPS challenge increased plasma glucose, cortisol, glucagon, and insulin concentrations in both groups, but the increases in plasma glucose and glucagon concentrations were less pronounced in the Na-DL-β-OH-butyrate infusion group than in controls.
In response to the LPS challenge, plasma BHBA concentration decreased in controls and also decreased slightly in the BHBA-infused animals, because the BHBA concentration could not be fully maintained despite a rapid increase in the BHBA infusion rate. The change in mRNA abundance of citrate synthase in LPS-challenged quarters differed significantly between the 2 treatment groups. The results indicate that an elevated circulating BHBA concentration inhibits gluconeogenesis before and during the immune response to an LPS challenge, likely because BHBA can replace glucose as an energy source.
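The BHBA clamp described above (plasma BHBA measured every 15 min, infusion rate adjusted toward the 1.7 mmol/L target) behaves like a simple proportional controller; a hedged sketch, with an invented gain since the study does not report its adjustment rule:

```python
# Proportional feedback clamp toward the target plasma BHBA concentration.
# The gain value is hypothetical; the study's actual adjustment rule is not given.
TARGET_BHBA = 1.7  # mmol/L

def next_rate(rate, measured, gain=0.5):
    """Adjust the infusion rate toward the target; never below zero."""
    return max(0.0, rate + gain * (TARGET_BHBA - measured))
```

Each 15-min cycle, the rate rises when measured BHBA is below target (e.g., during the LPS challenge, when BHBA is consumed faster) and falls when it is above.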

This paper presents the performance of the ATLAS muon reconstruction during the LHC run with pp collisions at √s = 7–8 TeV in 2011–2012, focusing mainly on data collected in 2012. Measurements of the reconstruction efficiency and of the momentum scale and resolution, based on large reference samples of J/ψ → μμ, Z → μμ and ϒ → μμ decays, are presented and compared to Monte Carlo simulations. Corrections to the simulation, to be used in physics analysis, are provided. Over most of the covered phase space (muon |η| < 2.7 and 5 ≲ pT ≲ 100 GeV) the efficiency is above 99% and is measured with per-mille precision. The momentum resolution ranges from 1.7% at central rapidity and for transverse momentum pT ≅ 10 GeV, to 4% at large rapidity and pT ≅ 100 GeV. The momentum scale is known with an uncertainty of 0.05% to 0.2% depending on rapidity. A method for the recovery of final state radiation from the muons is also presented.
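Corrections of the kind provided for physics analysis are typically applied to simulated muons by rescaling the transverse momentum and adding Gaussian smearing so that the simulated scale and resolution match data; a schematic sketch with placeholder numbers (these are not ATLAS values):

```python
# Schematic scale + resolution correction for a simulated muon pT.
# 'scale' and 'extra_res' stand in for the (eta-dependent) corrections
# derived from J/psi, Z and Upsilon dimuon samples; values are placeholders.
import random

def correct_pt(pt_mc, scale=1.001, extra_res=0.01, rng=None):
    rng = rng or random.Random()
    # rescale the momentum, then add extra Gaussian smearing
    return pt_mc * scale * (1.0 + extra_res * rng.gauss(0.0, 1.0))
```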

Analyses of extractable organic matter from selected core samples obtained at DSDP Site 535 in the eastern Gulf of Mexico show that the asphalt (or tar) and adjacent oil stains in Lower Cretaceous fractured limestones have a common origin and are not derived from the surrounding organic-matter-rich limestones. Organic matter indigenous to those surrounding limestones was shown to be thermally immature and incapable of yielding the hydrocarbon mixture discovered. In contrast, the oil-stained and asphaltic material appears to be a post-migration alteration product of a mature oil that migrated from source rocks deeper in the section, or from stratigraphically equivalent but compositionally different source facies down-dip from the drill site. Further, the hydrocarbons of the altered petroleum residues were shown to be similar to Sunniland-type oils found in Lower Cretaceous rocks of South Florida. The results suggest that shallow-water, platform-type source-rock facies similar to those that generated Sunniland-type oils, or deeper-water facies with comparable oil-generating material, are present in this deep-water (> 3000 m) environment. These findings have important implications for the petroleum potential of the eastern Gulf of Mexico and for certain types of deep-sea sediments.

This paper studies the behavior of a building-gypsum matrix to which construction and demolition waste (CDW) has been added: extruded polystyrene (XPS) waste and ceramic waste, combined in different percentages relative to the weight of the gypsum. The XPS waste comes from a construction site in Madrid where the material was used as thermal insulation, and the ceramic waste consists of pieces of rough bricks found at a halted construction site in the city of Ávila. Specimens were prepared with up to 3% XPS and up to 50% ceramic waste by weight of gypsum; reference samples were prepared without any addition of CDW. The specimens were tested in the laboratory and their physical and mechanical characteristics were determined. A comparative analysis shows that the combined addition of XPS and ceramic waste decreases the dry density of the material and its water absorption by capillary action, and in some cases decreases the thermal conductivity and increases the surface hardness.
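Since all additions are dosed as a percentage of the gypsum weight, batch quantities follow directly; a small illustrative helper (function name and quantities are hypothetical, not from the paper):

```python
# Batch masses for a gypsum mix where additions are dosed as a
# percentage of the gypsum weight (illustrative helper).
def batch_masses(gypsum_g, xps_pct, ceramic_pct):
    return {
        "gypsum": gypsum_g,
        "XPS": gypsum_g * xps_pct / 100.0,
        "ceramic": gypsum_g * ceramic_pct / 100.0,
    }
```

For example, the maximum dosages studied (3% XPS, 50% ceramic) on a 1 kg gypsum batch give 30 g of XPS and 500 g of ceramic waste.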

The evolution of smartphones, all equipped with digital cameras, is driving a growing demand for ever more complex applications that rely on real-time computer vision algorithms. Video signals keep growing in size, while the performance of single-core processors has stagnated, so new computer vision algorithms must be parallel, able to run on multiple processors, and computationally scalable. One of the most promising classes of processors today is the graphics processing unit (GPU): devices offering a high degree of parallelism, excellent numerical performance and increasing versatility, which makes them attractive for scientific computing. This thesis explores two computer vision applications whose computational complexity precludes real-time execution on traditional uniprocessors. By parallelizing their subtasks and implementing them on a GPU, both applications reach interactive frame rates. In addition, a technique for the fast evaluation of arbitrarily complex functions, specially designed for GPU implementation, is proposed. First, depth-image-based rendering techniques are applied, with color and depth information, to the unusual configuration of two convergent, wide-baseline cameras, in contrast to the narrow-baseline, parallel cameras customary in 3D TV. Using a backward-mapping approach with a depth-inpainting scheme based on modified median filters, these techniques are shown to be adequate for free-viewpoint video. It is also shown that referring depth information to a global reference system is highly detrimental and should be avoided. Second, a moving-object detection system based on kernel density estimation is proposed. These techniques are well suited to modelling complex scenes with multimodal backgrounds, but have seen little use because of their high computational and memory cost. The proposed system, implemented in real time on a GPU, includes dynamic estimation of the kernel bandwidths, selective update of the background model, update of the positions of the foreground-model reference samples using a multi-region particle filter, and automatic selection of regions of interest to reduce computational cost. The results, evaluated on several databases and compared with other state-of-the-art algorithms, demonstrate the quality and versatility of the proposal. Finally, a method is proposed for approximating arbitrary functions with continuous piecewise-linear functions, specially formulated for GPU implementation by leveraging the texture filtering units, which are normally unused for numerical computation. The proposal includes a rigorous mathematical analysis of the approximation error as a function of the number of samples, as well as a method to obtain a near-optimal partition of the function's domain that minimizes this error.
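The per-pixel kernel density estimation at the heart of the moving-object detection system can be sketched as follows: the background likelihood of a new intensity is a Gaussian KDE over stored reference samples, and the pixel is declared foreground when that likelihood is low. This toy version is single-channel with a fixed bandwidth (the thesis estimates bandwidths dynamically); all numbers are illustrative:

```python
# Toy per-pixel KDE background model: Gaussian kernel density over the
# stored background samples, thresholded to decide foreground/background.
import math

def kde_likelihood(x, samples, bandwidth=5.0):
    """Gaussian KDE of intensity x given the pixel's background samples."""
    norm = 1.0 / (len(samples) * bandwidth * math.sqrt(2.0 * math.pi))
    return norm * sum(math.exp(-0.5 * ((x - s) / bandwidth) ** 2) for s in samples)

def is_foreground(x, samples, threshold=1e-3):
    """Foreground when the background model assigns low likelihood."""
    return kde_likelihood(x, samples) < threshold
```

The cost that motivates the GPU implementation is visible even here: every pixel evaluates one kernel per stored sample on every frame.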

Picobirnaviruses (PBV) belong to the family Picobirnaviridae, divided into two species, Human Picobirnavirus and Rabbit Picobirnavirus. They are small, non-enveloped viruses with a bisegmented double-stranded RNA (dsRNA) genome and a capsid of icosahedral symmetry, and are divided into two genogroups, GI and GII. They have been detected in the feces of humans and of a wide range of animal species, with or without signs of diarrhea; they are considered emerging, opportunistic agents, and a zoonotic potential has been suggested. However, epidemiological and molecular studies of PBV in cattle are rare in the national and international literature. Given this lack of data, the present study aimed at the molecular detection and characterization of bovine PBV strains of genogroups GI and GII in fecal samples from cattle of different ages and regions of Brazil, with or without diarrheic symptoms. Of the 77 animals studied, 18 samples (23.3%) were positive for GI, from animals in the states of São Paulo, Minas Gerais and Goiás. No samples were positive for GII. The nucleotide identity of the positive samples averaged 67.4% when compared with one another and reached 83.77% when compared with PBV reference samples. In the phylogenetic reconstruction, three samples clustered in a human PBV clade and only one clustered in a bovine PBV clade. In summary, the results indicate, for the first time, the circulation of bovine PBV of genogroup GI in different Brazilian states, with heterogeneous phylogenetic profiles.
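The pairwise nucleotide identities reported above are, in essence, the fraction of matching positions in an aligned pair of sequences; a minimal illustration (assuming pre-aligned sequences of equal length, with '-' as the gap character):

```python
# Percent nucleotide identity over an aligned sequence pair.
# Gap positions ('-') never count as matches.
def percent_identity(a, b):
    assert len(a) == len(b), "sequences must be aligned to equal length"
    matches = sum(1 for x, y in zip(a, b) if x == y and x != '-')
    return 100.0 * matches / len(a)
```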

Nanostructured TiO2 photocatalysts with small crystallite sizes have been synthesized by sol-gel using the amphiphilic triblock copolymer Pluronic P123 as template. A new synthesis route, based on the treatment of TiO2 xerogels with acid-ethanol mixtures in two different steps, synthesis and extraction-crystallization, has been investigated, comparing two acids, hydrochloric and hydriodic. As a reference, samples were also prepared by extraction-crystallization in ethanol; these TiO2 materials are amorphous and present higher porosities. The prepared materials show different degrees of crystallinity depending on the experimental conditions used. In general, they exhibit high surface areas, with an important contribution of microporosity and mesoporosity, and very small anatase crystals, ranging from 5 to 7 nm. The activity of the obtained photocatalysts has been assessed in the gas-phase oxidation of propene at low concentration (100 ppmv) under a UVA lamp with a 365 nm wavelength. Under the conditions studied, these photocatalysts show activities in the oxidation of propene that depend not on their surface areas but on their crystallinity and band gap energies, the sample prepared with HCl in both the synthesis and extraction-crystallization steps being the most active, with performance superior to Evonik P25.
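Anatase crystallite sizes in the 5-7 nm range are typically estimated from X-ray diffraction line broadening via the Scherrer equation, D = Kλ/(β cos θ); a hedged sketch, since the abstract does not state which method was actually used (the peak values below are illustrative for the anatase (101) reflection with Cu Kα radiation):

```python
# Scherrer crystallite size estimate from XRD line broadening.
# beta_deg: peak FWHM in degrees; two_theta_deg: peak position in degrees 2-theta.
# Defaults: Cu K-alpha wavelength (0.15406 nm) and shape factor K = 0.9.
import math

def scherrer_size_nm(beta_deg, two_theta_deg, wavelength_nm=0.15406, K=0.9):
    beta = math.radians(beta_deg)            # FWHM in radians
    theta = math.radians(two_theta_deg / 2.0)
    return K * wavelength_nm / (beta * math.cos(theta))
```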

Nanoindentation has become a common technique for measuring the hardness and elastic-plastic properties of materials, including coatings and thin films. In recent years, different nanoindenter instruments have been commercialised and used for this purpose. Each instrument is equipped with its own analysis software for the derivation of the hardness and reduced Young's modulus from the raw data. These data are mostly analysed through the Oliver and Pharr method. In all cases, the calibration of compliance and area function is mandatory. The present work illustrates and describes a calibration procedure and an approach to raw data analysis carried out for six different nanoindentation instruments through several round-robin experiments. Three different indenters were used, Berkovich, cube corner and spherical, and three standardised reference samples were chosen: hard fused quartz, soft polycarbonate, and sapphire. It was clearly shown that the use of these common procedures consistently limited the spread of the hardness and reduced Young's modulus data compared with the same measurements performed using instrument-specific procedures. The following recommendations for nanoindentation calibration must be followed: (a) use only sharp indenters, (b) set a cut-off value for the penetration depth below which measurements must be considered unreliable, (c) perform nanoindentation measurements with limited thermal drift, (d) ensure that the load-displacement curves are as smooth as possible, (e) perform stiffness measurements specific to each instrument/indenter couple, (f) use fused quartz (Fq) and sapphire (Sa) as calibration reference samples for stiffness and area function determination, (g) use a function, rather than a single value, for the stiffness and (h) adopt a unique protocol and software for raw data analysis in order to limit the data spread related to the instruments (i.e. the level of drift or noise, defects of a given probe) and to make the H and Er data intercomparable.
© 2011 Elsevier Ltd.
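The Oliver and Pharr analysis mentioned above reduces each load-displacement curve to a hardness H = P_max/A_c and a reduced modulus E_r = (√π/2)·S/√A_c, where S is the unloading stiffness and A_c the calibrated contact area; a minimal sketch of these two relations (consistent SI units assumed):

```python
# Oliver & Pharr summary quantities from a nanoindentation unloading curve.
# p_max: maximum load; stiffness: unloading stiffness S = dP/dh at p_max;
# area: projected contact area A_c from the calibrated area function.
import math

def hardness(p_max, area):
    """H = P_max / A_c."""
    return p_max / area

def reduced_modulus(stiffness, area):
    """E_r = (sqrt(pi)/2) * S / sqrt(A_c)."""
    return 0.5 * math.sqrt(math.pi) * stiffness / math.sqrt(area)
```

The instrument-specific part of the round-robin, the compliance and area-function calibration, determines `stiffness` and `area` before these formulas are applied.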

Abstract: Natural materials have received considerable attention in many applications because they are degradable and derived directly from the earth. In addition, natural materials can be obtained from renewable resources such as plants (e.g., cellulosic fibers like flax, hemp, and jute). Being cheap and lightweight, cellulosic natural fibers are good candidates for reinforcing bio-based polymer composites. However, their hydrophilic nature, which results from the hydroxyl groups in the structure of these fibers, restricts their application in polymeric matrices, because of weak interfacial adhesion and difficulties in mixing due to the poor wettability of the fibers within the matrices. Many attempts have been made to modify the surface properties of natural fibers, including physical, chemical, and physico-chemical treatments, but these treatments can neither cure the intrinsic defects of the fiber surface nor improve the moisture and alkali resistance of the fibers. The creation of a thin film on the fibers, however, can achieve these objectives. This study aims first to functionalize flax fibers through the selective oxidation of hydroxyl groups present in the cellulose structure, to pave the way for better adhesion of subsequent amphiphilic TiO2 thin films created by the sol-gel technique. This method is capable of creating a very thin layer of metallic oxide on a substrate. In the next step, the effect of oxidation on the interfacial adhesion between the TiO2 film and the fiber, and thus on the physical and mechanical properties of the fiber, was characterized. Eventually, the TiO2-grafted fibers, with and without oxidation, were used to reinforce poly(lactic acid) (PLA). 
Tensile, impact, and short-beam shear tests were performed to characterize the mechanical properties, while thermogravimetric analysis (TGA), differential scanning calorimetry (DSC), dynamic mechanical analysis (DMA), and moisture absorption measurements were used to determine the physical properties of the composites. The results showed a significant increase in the physical and mechanical properties of the flax fibers when they were oxidized prior to TiO2 grafting. Moreover, the TiO2-grafted oxidized fibers caused significant changes when used as reinforcements in PLA: a higher interfacial strength and lower water absorption were obtained in comparison with the reference samples.
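The short-beam shear test mentioned above is commonly reduced to an apparent interlaminar shear strength via ILSS = 0.75·P_max/(b·h), as in ASTM D2344; a hedged sketch, since the abstract does not give the standard or specimen dimensions used:

```python
# Apparent interlaminar shear strength from a short-beam shear test
# (ASTM D2344 form): ILSS = 0.75 * Pmax / (b * h).
# p_max_n: failure load in N; width_mm, thickness_mm: specimen cross-section.
def short_beam_shear_strength(p_max_n, width_mm, thickness_mm):
    """Apparent interlaminar shear strength in MPa (N/mm^2)."""
    return 0.75 * p_max_n / (width_mm * thickness_mm)
```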

Table olives are consumed and appreciated worldwide and, although their commercial classification is not legally required, the International Olive Council suggests that it be regulated on the basis of sensory evaluation by a panel of trained assessors. Implementing such a panel requires compliance with the guidelines established by the International Olive Council, a complex and time-consuming task whose assessments are not free from subjectivity. In this work, for the first time, an electronic tongue was used to classify table olives into commercial categories, established from the presence and the median intensity of the predominant organoleptic defect perceived by the panel. Linear discriminant models were built from subsets of the potentiometric signals of the electronic tongue's sensors, selected using the simulated annealing algorithm. The predictive performance of the classification models was evaluated using leave-one-out cross-validation and repeated K-fold cross-validation, which minimizes the risk of overfitting and yields more realistic results.
The potential of this qualitative approach, based on the electrochemical profiles generated by the electronic tongue, was satisfactorily demonstrated: (i) in the correct classification (sensitivities ≥ 93%) of standard solutions (n-butyric acid, 2-mercaptoethanol and cyclohexanecarboxylic acid) according to the sensory defect they mimic (butyric, putrid or zapatera); (ii) in the correct classification (sensitivities ≥ 93%) of reference samples of olives and brines (presence of a single intense defect), selected by the panel, according to the type of defect perceived (winey-vinegary, butyric, musty, putrid or zapatera); and (iii) in the correct classification (sensitivity ≥ 86%) of highly heterogeneous table olive samples, containing one or more organoleptic defects perceived by the panel in the olives and/or brines, according to their commercial category (defect-free extra olives, extra, 1st choice, 2nd choice, and olives that cannot be marketed as table olives). Finally, the ability of the electronic tongue to quantify the median intensities of the negative attributes detected by the panel was demonstrated using multiple linear regression models coupled with the simulated annealing algorithm, built from selected subsets of the signals generated by the electronic tongue during the potentiometric analysis of the olives and brines. The predictive performance of the quantitative models was validated using the same two cross-validation techniques. The models established for each of the five sensory defects present in the table olive samples satisfactorily quantified the median defect intensities (R² ≥ 0.97).
Thus, the satisfactory quality of the qualitative and quantitative results achieved suggests, for the first time, a possible practical application of electronic tongues as a tool for the sensory analysis of defects in table olives, usable as a rapid, economical and useful technique for the organoleptic evaluation of negative attributes, complementary to traditional sensory analysis by a panel of trained assessors.
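The simulated-annealing sensor selection used to build the discriminant and regression models can be sketched as follows: repeatedly swap one sensor signal in or out of the current subset, keep moves that improve the fit, and occasionally accept worse moves with a temperature-dependent probability. This toy version uses an ordinary least-squares residual as a stand-in for the LDA/MLR objectives of the study; all data and parameters are synthetic:

```python
# Toy simulated-annealing selection of k feature columns that best fit y.
import math
import random
import numpy as np

def fit_error(X, y, cols):
    """Residual sum of squares of an OLS fit on the selected columns."""
    A = np.asarray(X)[:, cols]
    coef, *_ = np.linalg.lstsq(A, np.asarray(y), rcond=None)
    r = np.asarray(y) - A @ coef
    return float(r @ r)

def anneal(X, y, n_feats, k, steps=500, T0=1.0, seed=1):
    rng = random.Random(seed)
    current = rng.sample(range(n_feats), k)
    err = fit_error(X, y, current)
    best, best_err = list(current), err
    for step in range(steps):
        T = T0 * (1 - step / steps) + 1e-9   # linear cooling schedule
        cand = list(current)
        # swap one selected feature for one currently unselected
        cand[rng.randrange(k)] = rng.choice(
            [j for j in range(n_feats) if j not in current])
        e = fit_error(X, y, cand)
        # accept improvements always, worse moves with Boltzmann probability
        if e < err or rng.random() < math.exp((err - e) / T):
            current, err = cand, e
            if e < best_err:
                best, best_err = list(cand), e
    return sorted(best), best_err
```

On synthetic data where the response depends on only two of five columns, the annealer recovers exactly that pair.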