905 results for high performance concrete.


Relevância:

90.00%

Publicador:

Resumo:

Disorders related to depression, burnout and anxiety are increasingly widespread in modern society. The growing consumption of antidepressants in countries around the world explains the recent detection of trace-level residues in municipal urban wastewater. These so-called "emerging" substances, which exert a pharmacological activity aimed at regulating certain neurotransmitters in the brain, now raise considerable concern within the scientific community. The main objective of this doctoral project was to better understand the fate of several classes of antidepressants present in various environmental matrices (i.e. surface water, wastewater, treatment sludge, biological tissues) by developing new, reliable analytical methods capable of detecting, quantifying and confirming them by high performance liquid chromatography coupled with tandem mass spectrometry (LC-QqQMS, LC-QqToFMS). A first study carried out at the City of Montréal wastewater treatment plant confirmed the presence of six antidepressants and four N-desmethyl metabolites in the influents (2 - 330 ng L-1). For this primary (physico-chemical) treatment, low removal rates (≤ 15%) were obtained. Antidepressant concentrations reaching nearly 100 ng L-1 were also detected in the St. Lawrence River 0.5 km from the treatment plant outfall. A second study conducted at the same plant allowed the selective extraction of antidepressants from three tissues (i.e. liver, brain and fillet) of juvenile brook trout exposed to various concentrations of diluted effluent, either untreated or treated with ozone. Some bioaccumulation potential in the tissues (0.08-10 ng g-1) was observed for specimens exposed to the untreated effluent (20% v/v), with the distribution concentrated mainly in the liver and brain. An interesting correlation was established between the concentrations of three antidepressants in the brain and the activity of an exposure biomarker (i.e. the N/K ATPase pump involved in serotonin regulation) measured in synaptosomes of trout exposed to the effluents. An investigation of the efficiency of several Canadian wastewater treatment plants operating different types of treatment showed that secondary (biological) treatments removed antidepressants better than primary (physico-chemical) ones (mean removal rate: 30%). The highest levels in treated sludge (biosolids) were obtained for citalopram (1033 ng g-1), venlafaxine (833 ng g-1) and amitriptyline (78 ng g-1). Experimental sorption coefficients (Kd) calculated for each antidepressant indicated strong sorption of sertraline, desmethylsertraline, paroxetine and fluoxetine onto the solids (log Kd > 4). Finally, an excellent mean removal rate of 88% was obtained after ozonation (5 mg L-1) of a primary effluent. However, the characterization of new N-oxide by-products (venlafaxine, desmethylvenlafaxine) by high-resolution mass spectrometry (LC-QqToFMS) in the ozone-treated effluent highlighted the possible formation of multiple polar compounds of unknown toxicity.
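The sorption coefficient reported above is conventionally the equilibrium ratio of the sorbed to the dissolved concentration; a minimal formulation, assuming the sorbed concentration is expressed per kilogram of dry solids and the dissolved concentration per litre of water so that Kd is obtained in L kg-1, is

\[
K_d = \frac{C_{\text{solid}}}{C_{\text{aqueous}}}, \qquad \log K_d > 4 \ \text{indicating strong partitioning onto the sludge, as found here for sertraline, desmethylsertraline, paroxetine and fluoxetine.}
\]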

Relevância:

90.00%

Publicador:

Resumo:

Directed project submitted to the Faculté des Sciences Infirmières in partial fulfillment of the requirements for the degree of Master of Science (M.Sc.) in Nursing, option Nursing Administration.

Relevância:

90.00%

Publicador:

Resumo:

Shadows are an important cue for understanding a scene. They make it possible to resolve otherwise ambiguous situations, in particular regarding motion or the relative positions of objects in the scene. There are mainly two types of shadows: hard shadows, with very sharp boundaries, which usually result from point or directional lights; and soft shadows, which are blurrier and contribute to the atmosphere and visual quality of the scene. Soft shadows result from large light sources, such as environment maps, and are difficult to sample efficiently in real time. When interactivity takes priority over quality, approximation methods can be used to improve the rendering of a scene at a lower computational cost. We compute interactively the soft shadows produced by environment light sources, for scenes composed of moving objects and a dynamic height field. Our method extends the spherical harmonic exponentiation technique, previously limited to spherical blockers, so that it can handle height fields. We also add a representation for diffuse and glossy BRDFs. We can thus combine visibilities and BRDFs in the same space in order to efficiently compute the soft shadows and reflections of complex scenes. A hybrid algorithm, which combines screen-space and object-space visibilities, decouples the complexity of the shadows from the complexity of the scene.
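The identity behind spherical harmonic exponentiation, sketched here only to situate the extension described above, is that the combined visibility of several blockers is a product of per-blocker visibilities, which can instead be accumulated as a sum of log-visibilities and exponentiated once:

\[
V(\omega) \;=\; \prod_i V_i(\omega) \;=\; \exp\!\Big(\sum_i \log V_i(\omega)\Big),
\]

so that, in the spherical harmonic basis, the log-visibility vectors of the blockers are cheap to sum and a single approximate SH exponentiation recovers the combined visibility.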

Relevância:

90.00%

Publicador:

Resumo:

Triple quadrupole mass spectrometers coupled with high performance liquid chromatography are workhorses in quantitative bioanalysis. They provide substantial benefits, including reproducibility, sensitivity and selectivity, for trace analysis. Selected Reaction Monitoring allows targeted assay development, but the data sets it generates contain very limited information. Data mining and analysis of non-targeted high-resolution mass spectrometry profiles of biological samples offer the opportunity to perform more exhaustive assessments, including quantitative and qualitative analysis. The objectives of this study were to test method precision and accuracy, to statistically compare bupivacaine concentrations in real study samples, and to verify whether high-resolution, accurate-mass data collected in scan mode permit retrospective data analysis, more specifically the extraction of metabolite-related information. The precision and accuracy data obtained with the two instruments were equivalent. Overall, accuracy ranged from 106.2 to 113.2% and precision from 1.0 to 3.7%. A statistical comparison of the two methods by linear regression gave a coefficient of determination (R2) of 0.9996 and a slope of 1.02, demonstrating a very strong correlation between the methods. Individual sample comparisons showed differences from -4.5% to 1.6%, well within the accepted analytical error. Moreover, post-acquisition extracted ion chromatograms at m/z 233.1648 ± 5 ppm (M-56) and m/z 305.2224 ± 5 ppm (M+16) revealed the presence of desbutyl-bupivacaine and three distinct hydroxylated bupivacaine metabolites. Post-acquisition analysis allowed us to produce semi-quantitative concentration-time profiles for the bupivacaine metabolites.
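A minimal sketch of the kind of cross-method comparison reported above: ordinary least-squares regression of one method's concentrations against the other's, plus per-sample percent differences. The concentration values below are placeholders for illustration, not the study data.

```python
# Sketch: compare two quantitation methods by linear regression and
# per-sample percent difference. Concentration values are illustrative only.
import numpy as np
from scipy import stats

conc_qqq  = np.array([12.5, 48.0, 101.0, 205.0, 410.0])   # triple quadrupole method (ng/mL)
conc_hrms = np.array([12.8, 47.1, 103.5, 208.0, 415.0])   # high-resolution MS method (ng/mL)

fit = stats.linregress(conc_qqq, conc_hrms)
print(f"slope = {fit.slope:.3f}, R^2 = {fit.rvalue**2:.4f}")

# Per-sample difference of the high-resolution method relative to the QqQ method (%)
pct_diff = 100.0 * (conc_hrms - conc_qqq) / conc_qqq
print("percent differences:", np.round(pct_diff, 1))
```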

Relevância:

90.00%

Publicador:

Resumo:

In recent years, a significant increase in food fraud has been observed, ranging from false label claims to the use of additives and fillers to increase profitability. Recently, in 2013, horse and pig DNA were detected in beef products sold by several retailers. Mass spectrometry has become the workhorse of protein research, and the detection of marker proteins could serve for both animal species and tissue authentication. Meat species authentication was performed using a well-defined proteogenomic annotation, carefully chosen surrogate tryptic peptides, and analysis on a hybrid quadrupole-Orbitrap mass spectrometer. Selected mammalian meat samples were homogenized, and the proteins were extracted and digested with trypsin. The samples were analyzed using a high-resolution mass spectrometer. Chromatography was achieved with a 30-minute linear gradient on a BioBasic C8 100 × 1 mm column at a flow rate of 75 µL/min. The mass spectrometer was operated in full-scan, high-resolution, accurate-mass mode. MS/MS spectra were collected for selected proteotypic peptides. Muscular proteins were methodically analyzed in silico in order to generate tryptic peptide mass lists and theoretical MS/MS spectra. Following a comprehensive bottom-up proteomic analysis, we were able to detect and identify a proteotypic myoglobin tryptic peptide [120-134] for each species, with observed m/z within 1.3 ppm of the theoretical values. Moreover, proteotypic peptides from myosin-1, myosin-2 and β-hemoglobin were also identified. This targeted method allowed comprehensive meat speciation down to 1% (w/w) of undesired product.
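A minimal sketch of the mass-accuracy check described above, i.e. the parts-per-million error between an observed and a theoretical m/z; the values used are placeholders, not the peptide masses from the study.

```python
# Sketch: ppm mass error between an observed and a theoretical m/z.
def ppm_error(observed_mz: float, theoretical_mz: float) -> float:
    """Return the mass error in parts per million (ppm)."""
    return (observed_mz - theoretical_mz) / theoretical_mz * 1e6

# Placeholder values for illustration only.
theoretical = 752.3942
observed = 752.3950
print(f"{ppm_error(observed, theoretical):.2f} ppm")   # ~1.06 ppm
```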

Relevância:

90.00%

Publicador:

Resumo:

Most commercial and financial data are stored in decimal form. Recently, support for decimal arithmetic has received increased attention due to its growing importance in financial analysis, banking, tax calculation, currency conversion, insurance, telephone billing and accounting. Performing decimal arithmetic on systems that do not support decimal computations may give results with representation, conversion and/or rounding errors. In this world of precision, such errors are no longer tolerable. These errors can be eliminated, and better accuracy achieved, if decimal computations are done using Decimal Floating Point (DFP) units. However, the floating-point arithmetic units in today's general-purpose microprocessors are based on the binary number system, and decimal computations are carried out using binary arithmetic. Only a few common decimal numbers can be exactly represented in Binary Floating Point (BFP). In many cases, the law requires that results generated from financial calculations performed on a computer exactly match manual calculations. Currently, many applications involving fractional decimal data perform decimal computations either in software or with a combination of software and hardware. Performance can be dramatically improved with complete hardware DFP units, and this leads to the design of processors that include DFP hardware. VLSI implementations using the same modular building blocks can decrease system design and manufacturing cost. A multiplexer realization is a natural choice from the viewpoint of cost and speed. This thesis focuses on the design and synthesis of an efficient decimal MAC (Multiply-ACcumulate) architecture for high-speed decimal processors based on the IEEE Standard for Floating-Point Arithmetic (IEEE 754-2008). The research goal is to design and synthesize decimal MAC architectures that achieve higher performance. Efficient design methods and architectures for a high-performance DFP MAC unit are developed as part of this research.
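A small illustration of the representation error the abstract refers to: most decimal fractions cannot be represented exactly in binary floating point, whereas a decimal arithmetic type (Python's decimal module is used here only as a software stand-in for hardware DFP) preserves them exactly.

```python
# Sketch: decimal fractions in binary floating point vs. decimal arithmetic.
from decimal import Decimal

# Binary floating point: 0.10 has no exact representation,
# so accumulating it drifts away from the intended result.
print(sum([0.10] * 3) == 0.30)          # False
print(f"{sum([0.10] * 3):.20f}")        # 0.30000000000000004441

# Decimal arithmetic keeps the exact decimal value.
print(sum([Decimal("0.10")] * 3) == Decimal("0.30"))   # True
```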

Relevância:

90.00%

Publicador:

Resumo:

Upgrading two widely used standard plastics, polypropylene (PP) and high density polyethylene (HDPE), and generating a variety of useful engineering materials based on their blends have been the main objectives of this study. The upgrading was effected using nanomodifiers and/or fibrous modifiers. PP and HDPE were selected for modification because of their attractive inherent properties and wide spectrum of use. Blending is an engineered method of producing new materials with tailor-made properties that combine the advantages of both constituents. PP has high tensile and flexural strength, and HDPE acts as an impact modifier in the resulting blend. Hence an optimized blend of PP and HDPE was selected as the matrix material for upgrading. Nanokaolinite clay and E-glass fibre were chosen to modify the PP/HDPE blend. In the first stage of the work, the mechanical, thermal, morphological, rheological, dynamic mechanical and crystallization characteristics of polymer nanocomposites prepared from the PP/HDPE blend and differently surface-modified nanokaolinite clays were analyzed. In the second stage, the effect of the simultaneous inclusion of nanokaolinite clay (both N100A and N100) and short glass fibres was investigated. The presence of the nanofiller increased the properties of the hybrid composites to a greater extent than those of the microcomposites. In the last stage, micromechanical modelling of both the nano and hybrid composites was carried out to analyze the behaviour of the composites under load-bearing conditions. These theoretical analyses indicate that the polymer-nanoclay interfacial characteristics partially converge to a state of perfect interfacial bonding (Takayanagi model) with an iso-stress (Reuss IROM) response. In the case of the hybrid composites, the experimental data follow the trend of the Halpin-Tsai model. This implies that the matrix and filler experience varying amounts of strain, and that interfacial adhesion between filler and matrix, and also between the two fillers, plays a vital role in determining the modulus of the hybrid composites. A significant observation from this study is that the high fibre loading normally required for efficient reinforcement of polymers can be substantially reduced when the nanofiller is present together with a much lower fibre content in the composite. Hybrid composites with both nanokaolinite clay and micron-sized E-glass fibre as reinforcements in a PP/HDPE matrix constitute a novel class of high-performance, cost-effective engineering materials.
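For reference, the two micromechanical models named above are usually written in the following standard forms, with E_m and E_f the matrix and filler moduli, φ the filler volume fraction and ζ the Halpin-Tsai shape factor:

\[
\text{Reuss (iso-stress, IROM):}\quad \frac{1}{E_c} \;=\; \frac{\phi}{E_f} + \frac{1-\phi}{E_m},
\qquad
\text{Halpin-Tsai:}\quad E_c \;=\; E_m\,\frac{1 + \zeta\eta\phi}{1 - \eta\phi},\quad \eta = \frac{E_f/E_m - 1}{E_f/E_m + \zeta}.
\]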

Relevância:

90.00%

Publicador:

Resumo:

The Scheme86 and the HP Precision Architectures represent different trends in computer processor design. The former uses wide micro-instructions, parallel hardware, and a low latency memory interface. The latter encourages pipelined implementation and visible interlocks. To compare the merits of these approaches, algorithms frequently encountered in numerical and symbolic computation were hand-coded for each architecture. Timings were done in simulators and the results were evaluated to determine the speed of each design. Based on these measurements, conclusions were drawn as to which aspects of each architecture are suitable for a high-performance computer.

Relevância:

90.00%

Publicador:

Resumo:

The introduction of non-toxic fluoride compounds as direct replacements for Thorium Fluoride (ThF4) has renewed interest in the use of low index fluoride compounds in high performance infrared filters. This paper reports the results of an investigation into the effects of combining these low index materials, particularly Barium Fluoride (BaF2), with the high index material Lead Telluride (PbTe) in bandpass and edge filters. Infrared filter designs using conventional and the new material combinations are compared, and infrared filters using these material combinations have been manufactured and shown to suffer from residual stress. A possible solution to this problem, utilising Zinc Sulphide (ZnS) layers with compensating compressive stress, is discussed.

Relevância:

90.00%

Publicador:

Resumo:

The use of virtualization in high-performance computing (HPC) has been suggested as a means to provide tailored services and added functionality that many users expect from full-featured Linux cluster environments. The use of virtual machines in HPC can offer several benefits, but maintaining performance is a crucial factor. In some instances the performance criteria are placed above the isolation properties. This selective relaxation of isolation for performance is an important characteristic when considering resilience for HPC environments that employ virtualization. In this paper we consider some of the factors associated with balancing performance and isolation in configurations that employ virtual machines. In this context, we propose a classification of errors based on the concept of “error zones”, as well as a detailed analysis of the trade-offs between resilience and performance based on the level of isolation provided by virtualization solutions. Finally, a set of experiments is performed using different virtualization solutions to elucidate the discussion.

Relevância:

90.00%

Publicador:

Resumo:

Performance modelling is a useful tool in the lifecycle of high performance scientific software, such as weather and climate models, especially as a means of ensuring efficient use of available computing resources. In particular, sufficiently accurate performance prediction could reduce the effort and experimental computer time required when porting and optimising a climate model to a new machine. In this paper, traditional techniques are used to predict the computation time of a simple shallow water model which is illustrative of the computation (and communication) involved in climate models. These models are compared with real execution data gathered on AMD Opteron-based systems, including several phases of the U.K. academic community HPC resource, HECToR. Some success is achieved in relating source code to achieved performance for the K10 series of Opterons, but the method is found to be inadequate for the next-generation Interlagos processor. The experience leads to the investigation of a data-driven application benchmarking approach to performance modelling. Results for an early version of the approach are presented, using the shallow water model as an example.
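A minimal sketch of the data-driven benchmarking idea mentioned at the end: fit measured runtimes against problem size on the target machine, then use the fitted model to predict the time for a larger run. The grid sizes and timings below are invented placeholders, not measurements from the paper.

```python
# Sketch: predict compute time by fitting time ~ cost_per_point * n + overhead
# to benchmark measurements. All numbers are placeholders.
import numpy as np

grid_points = np.array([1e5, 2e5, 4e5, 8e5])   # benchmark problem sizes
runtimes_s  = np.array([0.9, 1.7, 3.4, 6.9])   # measured wall-clock times (s)

cost_per_point, overhead = np.polyfit(grid_points, runtimes_s, 1)
predicted = cost_per_point * 1.6e6 + overhead   # extrapolate to a larger grid
print(f"predicted runtime for 1.6e6 points: {predicted:.1f} s")
```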

Relevância:

90.00%

Publicador:

Resumo:

Manufacturing strategy has been widely studied and is increasingly gaining attention. Its fundamental role is to translate the business strategy into operations by developing the capabilities the company needs in order to accomplish the desired performance. More precisely, manufacturing strategy comprises the decisions managers take over a certain period of time in order to achieve a desired result. These decisions concern which operational practices and resources are implemented. Our goal was to identify the relationship of these two types of decisions with operational performance. We based our arguments on the resource-based view for identifying sources of competitive advantage. Hence, we argued that operational practices and resources positively affect operational performance. Additionally, we proposed that in the presence of certain resources the implementation of operational practices would lead to greater performance. We used previously published scales for measuring operational practices and performance, and developed new constructs for resources. The data used are part of the High Performance Manufacturing project and the sample is composed of 291 plants. Through confirmatory factor analysis and multiple regression we found that operational practices are, to a certain extent, positively related to operational performance. More specifically, the results show that JIT and customer orientation practices have a positive relationship with quality, delivery, flexibility and cost performance. Moreover, we found that resources such as technology and people explain a large share of the variance in operational performance.

Relevância:

90.00%

Publicador:

Resumo:

This study sought to verify the influence of supply chain agents on new product development performance when the agents are analyzed jointly. The motivation for this research came from studies that called for treating supply chain integration as a multidimensional construct, encompassing the involvement of manufacturing, suppliers and customers in new product development, and from the lack of information on the individual influence of these agents on new product development. Under these considerations, we built an analytical model based on Social Capital Theory and Absorptive Capacity, derived hypotheses from the literature review, and connected constructs such as cooperation, supplier involvement in new product development (NPD), customer involvement in NPD, manufacturing involvement in NPD, anticipation of new technologies, continuous improvement, NPD operational performance, NPD market performance and NPD business performance. To test the hypotheses, three moderating variables were considered: environmental turbulence (low, medium and high), industry (electronics, machinery and transport equipment) and location (America, Europe and Asia). To test the model, we used data from the High Performance Manufacturing project, which contains 339 companies from the electronics, machinery and transport equipment industries located in eleven countries. The hypotheses were tested by Confirmatory Factor Analysis (CFA), including multi-group moderation for the three moderating variables mentioned above. The main results showed that the hypotheses related to cooperation were confirmed in medium-turbulence environments, while the hypotheses related to NPD performance were confirmed in low-turbulence environments and in Asian countries. Additionally, under the same conditions, suppliers, customers and manufacturing influence new product performance differently. Supplier involvement directly influences operational performance, and indirectly influences market and business performance, at low levels of environmental turbulence, in the transport equipment industry and in American and European countries. Likewise, customer involvement directly influenced operational performance, and indirectly influenced market and business performance, at medium levels of environmental turbulence, in the machinery industry and in Asian countries. Suppliers and customers do not directly influence market and business performance and do not indirectly influence operational performance. Manufacturing involvement did not influence any type of new product development performance in any of the scenarios tested.

Relevância:

90.00%

Publicador:

Resumo:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevância:

90.00%

Publicador:

Resumo:

Risk factors for the onset of sports injuries have been investigated in order to facilitate understanding of the subject. However, for high performance levels in the track and field events of athletics, documents addressing the topic are scarce. Thus, given the possibility of gathering information on this situation, the aim of the present study was to explore risk factors for sports injuries in athletics through a survey applied to elite athletes of the sport. The population consisted of 60 men and 60 women allocated to groups according to the specificity of their event (sprints, endurance, throws and jumps). Interviews were conducted using a referred-morbidity questionnaire addressing anthropometric and training variables as well as injuries. Parametric analysis of variance was used for the anthropometric variables (age, weight, height) and non-parametric analysis of variance for the training variables (years of training and weekly training hours). The association between moment of injury and event group was assessed with Goodman's test at a 5% significance level. The results showed a high frequency of injuries in the sport for both sexes. The injury rates per interviewed athlete were 0.92 (sprints), 1.08 (endurance), 1.22 (jumps) and 1.20 (throws). There were no statistically significant differences in the anthropometric and training variables across events, except for the jumpers, who showed differences in height and training time; in that case, the injured athletes were taller or had practised athletics for less time (P < 0.05). It was concluded that, for the population studied, the risk of injury is high, but there is no relationship between the variables and the presence of injuries, except for jump specialists, for whom height and training time were predisposing factors for injury.