877 results for High-performance computing hyperspectral imaging


Abstract:

An efficient method for the rapid separation and purification of polyphenols from artichoke by polyamide column chromatography in combination with high-speed counter-current chromatography (HSCCC) was successfully established. The crude ethanol extracts from dry artichoke were first pre-separated by polyamide column chromatography and divided into two parts, sample 1 and sample 2. The samples were then further separated by HSCCC, yielding 7.8 mg of chlorogenic acid (compound I), 24.5 mg of luteolin-7-O-β-D-rutinoside (compound II), 18.4 mg of luteolin-7-O-β-D-glucoside (compound III), and 33.4 mg of cynarin (compound IV), with purities of 92.0%, 98.2%, 98.5%, and 98.0%, respectively, as determined by high-performance liquid chromatography (HPLC). The chemical structures of these compounds were identified by electrospray ionization mass spectrometry (ESI-MS) and nuclear magnetic resonance (NMR) spectroscopy.

Abstract:

An effective method for the rapid separation and purification of three stilbenes from the radix of Polygonum cillinerve (Nakai) Ohwi by macroporous resin column chromatography combined with high-speed counter-current chromatography (HSCCC) was successfully established. In the present study, a two-phase solvent system composed of chloroform-n-butanol-methanol-water (4:1:4:2, v/v/v/v) was used for the HSCCC separation. A one-step separation in 4 h from 150 mg of crude extract produced 26.3 mg of trans-resveratrol-3-O-glucoside, 42.0 mg of piceid-2″-O-gallate, and 17.9 mg of trans-resveratrol with purities of 99.1%, 97.8%, and 99.4%, respectively, as determined by high-performance liquid chromatography (HPLC). The chemical structures of these compounds were identified by nuclear magnetic resonance (NMR) spectroscopy.

Abstract:

The papermaking industry has been continuously developing intelligent solutions to characterize the raw materials it uses, to control the manufacturing process in a robust way, and to guarantee the desired quality of the end product. Thanks to much improved imaging techniques and image-based analysis methods, it has become possible to look inside the manufacturing pipeline and propose more effective alternatives to human expertise. This study focuses on the development of image analysis methods for the pulping process of papermaking. Pulping starts with wood disintegration to form the fiber suspension, which is subsequently bleached, mixed with additives and chemicals, and finally dried and shipped to the papermaking mills. At each stage of the process it is important to analyze the properties of the raw material to guarantee product quality. In order to evaluate the properties of fibers, the main component of the pulp suspension, a framework for fiber characterization based on microscopic images is proposed in this thesis as the first contribution. The framework allows computation of fiber length and curl index, correlating well with the ground-truth values. The bubble detection method, the second contribution, was developed in order to estimate the gas volume at the delignification stage of the pulping process based on high-resolution in-line imaging. The gas volume was estimated accurately and the solution enabled just-in-time process termination, whereas accurate estimation of bubble size categories remained challenging. As the third contribution of the study, optical flow computation was studied and the methods were successfully applied to pulp flow velocity estimation based on double-exposed images.
Finally, a framework for classifying dirt particles in dried pulp sheets, including semisynthetic ground-truth generation, feature selection, and performance comparison of state-of-the-art classification techniques, was proposed as the fourth contribution. The framework was successfully tested on semisynthetic and real-world pulp sheet images. These four contributions assist in the development of integrated, factory-level, vision-based process control.
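The fiber curl index computed by the framework has a widely used definition in pulp fiber analysis: the ratio of the fiber's true (contour) length to the end-to-end distance of its skeleton, minus one. A minimal sketch of that relation (the thesis's actual image-processing pipeline is not reproduced here, and the measurements are hypothetical):

```python
def curl_index(contour_length, end_to_end_distance):
    """Curl index as commonly defined in pulp fiber analysis:
    (contour length / end-to-end distance) - 1.
    A perfectly straight fiber has a curl index of 0."""
    return contour_length / end_to_end_distance - 1.0

# Straight fiber: contour length equals the endpoint separation.
print(curl_index(2.0, 2.0))   # 0.0
# Curled fiber whose skeleton is 25% longer than the endpoint separation.
print(curl_index(2.5, 2.0))   # 0.25
```

In the framework both lengths would come from a skeletonized fiber segment in the microscopic image; here they are supplied directly.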

Abstract:

Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Abstract:

Increased heart rate variability (HRV) and high-frequency content of the terminal region of the ventricular activation of the signal-averaged ECG (SAECG) have been reported in athletes. The present study investigates HRV and SAECG parameters as predictors of maximal aerobic power (VO2max) in athletes. HRV, SAECG and VO2max were determined in 18 high-performance long-distance runners (25 ± 6 years; 17 males) 24 h after a training session. Clinical visits, ECG and VO2max determination were scheduled for all athletes during the training period. A group of 18 untrained healthy volunteers matched for age, gender, and body surface area was included as controls. SAECG was acquired in the resting supine position for 15 min and processed to extract the average RR interval (Mean-RR) and the root mean square of successive differences between consecutive normal RR intervals (RMSSD). SAECG variables analyzed in the vector magnitude with 40-250 Hz band-pass bi-directional filtering were: total and 40-µV terminal (LAS40) duration of ventricular activation, and RMS voltage of the total (RMST) and of the 40-ms terminal region of ventricular activation. Linear and multivariate stepwise logistic regressions oriented by inter-group comparisons were adjusted on the significant variables in order to predict VO2max, with P < 0.05 considered significant. VO2max correlated significantly (P < 0.05) with RMST (r = 0.77), Mean-RR (r = 0.62), RMSSD (r = 0.47), and LAS40 (r = -0.39). RMST was the independent predictor of VO2max. In athletes, HRV and high-frequency components of the SAECG correlate with VO2max, and the high-frequency content of the SAECG is an independent predictor of VO2max.
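Mean-RR and RMSSD, the two time-domain HRV indices used above, reduce to short formulas; a sketch with an illustrative RR series (not data from the study):

```python
import math

def mean_rr(rr_intervals_ms):
    """Average RR interval (Mean-RR) in milliseconds."""
    return sum(rr_intervals_ms) / len(rr_intervals_ms)

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between
    consecutive normal RR intervals, a standard HRV index."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

rr = [800, 810, 790, 805, 795]  # hypothetical RR series in ms
print(round(mean_rr(rr), 1))    # 800.0
print(round(rmssd(rr), 1))      # 14.4
```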

Abstract:

Disorders related to depression, burnout, and anxiety are increasingly widespread in modern society. The growing consumption of antidepressants around the world is responsible for the recent detection of trace residues in municipal wastewater. These so-called "emerging" substances, which exert pharmacological activity aimed at regulating certain neurotransmitters in the brain, now raise considerable concern in the scientific community. The main objective of this doctoral project was to better understand the fate of several classes of antidepressants present in various environmental matrices (i.e., surface water, wastewater, treatment sludge, biological tissues) by developing new, reliable analytical methods capable of detecting, quantifying, and confirming them by high-performance liquid chromatography coupled with tandem mass spectrometry (LC-QqQMS, LC-QqToFMS). A first study, completed at the City of Montreal wastewater treatment plant, confirmed the presence of six antidepressants and four N-desmethyl metabolites in the influents (2-330 ng L⁻¹). For this primary (physico-chemical) treatment, low removal rates (≤ 15%) were obtained. Antidepressant concentrations approaching 100 ng L⁻¹ were also detected in the St. Lawrence River 0.5 km from the treatment plant outfall. A second study conducted at the same plant involved the selective extraction of antidepressants from three tissues (i.e., liver, brain, and fillet) of juvenile brook trout exposed to different concentrations of diluted effluent, either untreated or treated with ozone. Some bioaccumulation potential (0.08-10 ng g⁻¹) was observed in specimens exposed to the untreated effluent (20% v/v), with preferential distribution in the liver and brain.
An interesting correlation was established between the concentrations of three antidepressants in the brain and the activity of an exposure biomarker (i.e., the N/K-ATPase pump involved in serotonin regulation) measured in synaptosomes of trout exposed to the effluents. An investigation of the efficiency of several Canadian treatment plants operating different types of treatment showed that secondary (biological) treatments removed antidepressants more effectively than primary (physico-chemical) ones (average removal rate: 30%). The highest levels in treated sludge (biosolids) were obtained for citalopram (1033 ng g⁻¹), venlafaxine (833 ng g⁻¹), and amitriptyline (78 ng g⁻¹). Experimental sorption coefficients (Kd) calculated for each antidepressant indicated strong sorption of sertraline, desmethylsertraline, paroxetine, and fluoxetine onto the solids (log Kd > 4). Finally, an excellent average removal rate of 88% was obtained after ozonation (5 mg L⁻¹) of a primary effluent. However, the characterization of new N-oxide by-products (venlafaxine, desmethylvenlafaxine) by high-resolution mass spectrometry (LC-QqToFMS) in the ozone-treated effluent highlighted the possible formation of multiple polar compounds of unknown toxicity.
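The removal rates and sorption coefficients reported above follow from two simple definitions: removal is the relative loss between influent and effluent, and Kd is the ratio of sorbed to dissolved concentration. A sketch with hypothetical concentrations (not the study's measurements):

```python
import math

def removal_percent(influent, effluent):
    """Removal rate across a treatment step, in percent."""
    return (influent - effluent) / influent * 100.0

def log_kd(c_solid_ng_per_g, c_water_ng_per_ml):
    """Solid-water distribution coefficient Kd = Cs/Cw
    (L/kg when Cs is in ng/g and Cw in ng/mL), as log10 Kd."""
    return math.log10(c_solid_ng_per_g / c_water_ng_per_ml)

# Hypothetical numbers for illustration only.
print(round(removal_percent(100.0, 70.0), 1))   # 30.0 (a 30% removal)
print(round(log_kd(1000.0, 0.05), 2))           # 4.3 -> strongly sorbing (log Kd > 4)
```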

Abstract:

The modeling work was carried out with EGSnrc, a software package developed by the National Research Council Canada.

Abstract:

A directed project submitted to the Faculty of Nursing in partial fulfilment of the requirements for the degree of Master of Science (M.Sc.) in Nursing, Nursing Administration option.

Abstract:

Triple quadrupole mass spectrometers coupled with high-performance liquid chromatography are workhorses in quantitative bioanalysis, providing substantial benefits including reproducibility, sensitivity, and selectivity for trace analysis. Selected Reaction Monitoring allows targeted assay development, but the data sets generated contain very limited information. Data mining and analysis of non-targeted high-resolution mass spectrometry profiles of biological samples offer the opportunity to perform more exhaustive assessments, including quantitative and qualitative analysis. The objectives of this study were to test method precision and accuracy, to statistically compare bupivacaine drug concentrations in real study samples, and to verify whether high-resolution, accurate-mass data collected in scan mode actually permit retrospective data analysis, more specifically the extraction of metabolite-related information. The precision and accuracy data obtained with both instruments were equivalent. Overall, accuracy ranged from 106.2 to 113.2% and precision from 1.0 to 3.7%. A statistical comparison of the two methods using linear regression revealed a coefficient of determination (R²) of 0.9996 and a slope of 1.02, demonstrating a very strong correlation. Individual sample comparisons showed differences from -4.5% to 1.6%, well within the accepted analytical error. Moreover, post-acquisition extracted ion chromatograms at m/z 233.1648 ± 5 ppm (M-56) and m/z 305.2224 ± 5 ppm (M+16) revealed the presence of desbutyl-bupivacaine and three distinct hydroxylated bupivacaine metabolites. Post-acquisition analysis allowed us to produce semiquantitative concentration-time profiles for bupivacaine metabolites.
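Accuracy and precision figures like those above are conventionally computed as mean recovery versus the nominal concentration and as the coefficient of variation across replicates; a sketch with hypothetical replicate data (not the study's results):

```python
import statistics

def accuracy_percent(measured, nominal):
    """Mean measured concentration relative to nominal, in percent."""
    return statistics.mean(measured) / nominal * 100.0

def precision_cv_percent(measured):
    """Precision as the coefficient of variation (CV%)
    of replicate measurements."""
    return statistics.stdev(measured) / statistics.mean(measured) * 100.0

# Hypothetical replicate measurements of a 100 ng/mL QC sample.
reps = [106.0, 108.5, 107.2, 105.8, 107.9]
print(round(accuracy_percent(reps, 100.0), 1))   # 107.1
print(round(precision_cv_percent(reps), 1))      # 1.1
```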

Abstract:

Nanomaterials are a class of contaminants increasingly present in the environment. Their environmental impact will depend on their persistence, mobility, toxicity, and bioaccumulation, each of which depends in turn on their physicochemical behaviour in natural waters (i.e., dissolution and agglomeration). The objective of this study is to understand the agglomeration and heteroagglomeration of silver nanoparticles in the environment. Two different kinds of silver nanoparticles (nAg; citrate-coated and polyacrylic acid-coated), 5 nm in diameter, were covalently labelled with a fluorescent dye and mixed with colloidal silica (SiO2) or clay (montmorillonite). Homo- and heteroagglomeration of the nAg were studied under conditions representative of natural fresh waters (pH 7.0; ionic strength 10⁻⁷ to 10⁻¹ M Ca²⁺). Sizes were measured by fluorescence correlation spectroscopy (FCS) and the results were confirmed by dark-field microscopy with hyperspectral imaging (HSI). The results showed that the polyacrylic acid-coated silver nanoparticles were extremely stable under all conditions imposed, including the presence of other colloids and very high ionic strengths, whereas the citrate-coated silver nanoparticles formed heteroaggregates in the presence of both colloidal particles.
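FCS measures a diffusion coefficient, from which a hydrodynamic size is conventionally derived via the Stokes-Einstein relation. The abstract does not spell this step out, so the sketch below is an assumption about the standard workflow, with an illustrative diffusion coefficient:

```python
import math

def hydrodynamic_diameter_nm(D_m2_per_s, T=293.15, viscosity_pa_s=1.0e-3):
    """Stokes-Einstein relation: d_h = kB*T / (3*pi*eta*D).
    Defaults assume water at ~20 degrees C (eta ~ 1 mPa*s)."""
    kB = 1.380649e-23  # Boltzmann constant, J/K
    return kB * T / (3.0 * math.pi * viscosity_pa_s * D_m2_per_s) * 1e9

# Illustrative diffusion coefficient chosen to correspond to ~5 nm,
# the nominal nAg size in the study.
print(round(hydrodynamic_diameter_nm(8.59e-11), 1))   # ~5.0 nm
```

Agglomeration shows up in FCS as a drop in the apparent diffusion coefficient, i.e., a growing hydrodynamic diameter.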

Abstract:

In recent years, we have observed a significant increase in food fraud, ranging from false label claims to the use of additives and fillers to increase profitability. In 2013, horse and pig DNA were detected in beef products sold by several retailers. Mass spectrometry has become the workhorse of protein research, and the detection of marker proteins can serve for both animal species and tissue authentication. Meat species authentication was performed using a well-defined proteogenomic annotation, carefully chosen surrogate tryptic peptides, and analysis on a hybrid quadrupole-Orbitrap mass spectrometer. Selected mammalian meat samples were homogenized, and the proteins were extracted and digested with trypsin. The samples were analyzed using a high-resolution mass spectrometer. Chromatography was achieved using a 30-minute linear gradient on a BioBasic C8 100 × 1 mm column at a flow rate of 75 µL/min. The mass spectrometer was operated in full-scan, high-resolution, accurate-mass mode. MS/MS spectra were collected for selected proteotypic peptides. Muscular proteins were methodically analyzed in silico in order to generate tryptic peptide mass lists and theoretical MS/MS spectra. Following a comprehensive bottom-up proteomic analysis, we were able to detect and identify a proteotypic myoglobin tryptic peptide [120-134] for each species, with observed m/z within 1.3 ppm of theoretical values. Moreover, proteotypic peptides from myosin-1, myosin-2, and β-hemoglobin were also identified. This targeted method allowed comprehensive meat speciation down to 1% (w/w) of undesired product.
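The sub-1.3 ppm mass accuracy quoted above is the relative deviation of the observed from the theoretical m/z, in parts per million; a sketch (the m/z values are hypothetical, not the peptide masses from the study):

```python
def ppm_error(observed_mz, theoretical_mz):
    """Signed mass accuracy in parts per million (ppm)."""
    return (observed_mz - theoretical_mz) / theoretical_mz * 1e6

# Hypothetical peptide: observed m/z deviates by 0.0008 Th from theory.
err = ppm_error(748.3790, 748.3782)
print(abs(err) < 1.3)   # True -> within the 1.3 ppm tolerance above
```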

Abstract:

Most commercial and financial data are stored in decimal form. Recently, support for decimal arithmetic has received increased attention due to its growing importance in financial analysis, banking, tax calculation, currency conversion, insurance, telephone billing, and accounting. Performing decimal arithmetic on systems that do not support decimal computations may give results with representation error, conversion error, and/or rounding error. In applications demanding precision, such errors are no longer tolerable. These errors can be eliminated, and better accuracy achieved, if decimal computations are done using Decimal Floating Point (DFP) units. However, the floating-point arithmetic units in today's general-purpose microprocessors are based on the binary number system, and decimal computations are done using binary arithmetic; only a few common decimal numbers can be exactly represented in Binary Floating Point (BFP). In many cases, the law requires that results generated from financial calculations performed on a computer exactly match manual calculations. Currently, many applications involving fractional decimal data perform decimal computations either in software or with a combination of software and hardware. Performance can be dramatically improved by complete hardware DFP units, and this leads to the design of processors that include DFP hardware. VLSI implementations using the same modular building blocks can decrease system design and manufacturing cost, and a multiplexer realization is a natural choice from the viewpoint of cost and speed. This thesis focuses on the design and synthesis of an efficient decimal MAC (Multiply-Accumulate) architecture for high-speed decimal processors based on the IEEE Standard for Floating-Point Arithmetic (IEEE 754-2008).
The research goal is to design and synthesize decimal MAC architectures that achieve higher performance. Efficient design methods and architectures for a high-performance DFP MAC unit are developed as part of this research.
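The representation error discussed above is easy to demonstrate: 0.1 has no finite binary expansion, so repeated binary additions drift, while a decimal type stays exact. A sketch in Python, whose `decimal` module implements decimal floating-point arithmetic in software:

```python
from decimal import Decimal

# Binary floating point: 0.1 is stored as the nearest representable
# binary fraction, so three additions do not give exactly 0.3.
print(0.1 + 0.1 + 0.1 == 0.3)   # False

# Decimal floating point: 0.1 is represented exactly, so the sum is exact.
print(Decimal('0.1') + Decimal('0.1') + Decimal('0.1') == Decimal('0.3'))   # True
```

This is precisely the gap a hardware DFP unit closes: software decimal libraries give correct results, but at a large speed penalty relative to native binary floating point.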

Abstract:

High-strength and high-performance concrete are widely used all over the world. Most applications of high-strength concrete are found in high-rise buildings, long-span bridges, etc. The potential of rice husk ash (RHA) as a cement replacement material is well established. Earlier research showed an improvement in the mechanical properties of high-strength concrete with finely ground RHA as a partial cement replacement material. A review of the literature urges the need to optimize the level of cement replacement with RHA for improved mechanical properties at the optimum water-binder ratio. This paper discusses the mechanical properties of RHA high-strength concrete under optimized conditions.

Abstract:

Upgrading two widely used standard plastics, polypropylene (PP) and high-density polyethylene (HDPE), and generating a variety of useful engineering materials based on their blends have been the main objectives of this study. Upgrading was effected using nanomodifiers and/or fibrous modifiers. PP and HDPE were selected for modification due to their attractive inherent properties and wide spectrum of use. Blending is an engineered method of producing new materials with tailor-made properties, combining the advantages of both materials: PP has high tensile and flexural strength, and HDPE acts as an impact modifier in the resultant blend. Hence an optimized blend of PP and HDPE was selected as the matrix material. Nanokaolinite clay and E-glass fibre were chosen for modifying the PP/HDPE blend. In the first stage of the work, the mechanical, thermal, morphological, rheological, dynamic mechanical, and crystallization characteristics of polymer nanocomposites prepared from the PP/HDPE blend and differently surface-modified nanokaolinite clays were analyzed. In the second stage, the effect of the simultaneous inclusion of nanokaolinite clay (both N100A and N100) and short glass fibres was investigated. The presence of nanofiller increased the properties of the hybrid composites to a greater extent than in microcomposites. In the last stage, micromechanical modeling of both the nano- and hybrid composites was carried out to analyze their behavior under load-bearing conditions. These theoretical analyses indicate that the polymer-nanoclay interfacial characteristics partially converge to a state of perfect interfacial bonding (Takayanagi model) with an iso-stress (Reuss, inverse rule of mixtures) response. In the case of the hybrid composites, the experimental data follow the trend of the Halpin-Tsai model.
This implies that the matrix and filler experience varying amounts of strain, and that the interfacial adhesion between filler and matrix, and also between the two fillers, plays a vital role in determining the modulus of the hybrid composites. A significant observation from this study is that the higher fibre loading required for efficient reinforcement of polymers can be substantially reduced by combining nanofiller with a much lower fibre content in the composite. Hybrid composites with both nanokaolinite clay and micron-sized E-glass fibre as reinforcements in a PP/HDPE matrix will generate a novel class of high-performance, cost-effective engineering materials.
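The Reuss (iso-stress, inverse rule of mixtures) and Halpin-Tsai models mentioned above have simple closed forms; a sketch with illustrative moduli (the values are assumptions, not the thesis's measured data):

```python
def reuss(Ef, Em, Vf):
    """Reuss (iso-stress, inverse rule of mixtures) composite modulus:
    1/E = Vf/Ef + (1 - Vf)/Em."""
    return 1.0 / (Vf / Ef + (1.0 - Vf) / Em)

def halpin_tsai(Ef, Em, Vf, xi):
    """Halpin-Tsai semi-empirical composite modulus;
    xi is the reinforcement shape factor."""
    eta = (Ef / Em - 1.0) / (Ef / Em + xi)
    return Em * (1.0 + xi * eta * Vf) / (1.0 - eta * Vf)

# Illustrative moduli in GPa: E-glass fibre ~72, polyolefin matrix ~1.5,
# 10 vol% filler, shape factor xi = 2 (assumed).
print(round(reuss(72.0, 1.5, 0.10), 2))            # ~1.66
print(round(halpin_tsai(72.0, 1.5, 0.10, 2.0), 2)) # ~1.97
```

The Halpin-Tsai prediction sits between the Reuss (lower) and Voigt (upper) bounds, which is why experimental hybrid-composite data following it indicates intermediate strain sharing between the phases.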

Abstract:

Ubiquitous computing has been an attractive research field over the past and current decades. It concerns the unobtrusive support of people in their everyday tasks by computers. This support is enabled by the omnipresence of computers that spontaneously form distributed communication networks in order to exchange and process information. Ambient intelligence is an application of ubiquitous computing and a strategic research direction of the Information Society Technology programme of the European Union; its goal is a more comfortable and safer life. Distributed communication networks for ubiquitous computing are characterized by the heterogeneity of the computers involved, ranging from tiny devices embedded in everyday objects to powerful mainframes. The computers connect spontaneously via wireless network technologies such as wireless local area networks (WLAN), Bluetooth, or UMTS. This heterogeneity complicates the development and deployment of distributed communication networks. Middleware is a software technology that reduces complexity through abstraction into a homogeneous layer, offering a uniform view of the resources, functionalities, and computers it abstracts. Distributed communication networks for ubiquitous computing are characterized by spontaneous connections between computers, whereas classical middleware assumes that computers maintain permanent communication relationships. The concept of service-oriented architecture enables the development of middleware that also supports spontaneous connections between computers; its functionality is realized by services, which are independent software units.
The Wireless World Research Forum describes services that future middleware should include. These services are hosted by an execution environment; however, there are as yet no definitions of how such an execution environment should be structured or what functionality it must provide. This thesis contributes to aspects of middleware development for distributed communication networks in ubiquitous computing, with a focus on middleware and foundational technologies. The contributions are presented as concepts and ideas for middleware development, covering service discovery, service updating, and contracts between services. They are provided within a framework optimized for middleware development. This framework, called Framework for Applications in Mobile Environments (FAME²), includes guidelines, a definition of an execution environment, and support for various access-control mechanisms to protect middleware against unauthorized use. The capabilities of the FAME² execution environment include:
• minimal resource usage, so that it can also run on resource-constrained devices such as mobile phones and tiny embedded computers
• support for adapting middleware by changing the contained services while the middleware is running
• an open interface allowing practically any existing service-discovery solution to be used
• the ability to update services at runtime, enabling bug-fixing, optimizing, and adaptive maintenance of services
A companion contribution is the Extensible Constraint Framework (ECF), which makes Design by Contract (DbC) usable within FAME². DbC is a technique for formulating contracts between services and thereby increasing software quality. ECF supports the negotiation and optimization of such contracts.