926 results for portable analyzer
Abstract:
The study details the development of a fully validated, rapid and portable sensor-based method for the on-site analysis of microcystins in freshwater samples. The process employs a novel method for the mechanical lysis of cyanobacterial cells, using glass beads and a handheld frother, in only 10 min. The assay utilises an innovative planar waveguide device that excites fluorescent probes via an evanescent wave, amplifying the signal in a competitive immunoassay based on an anti-microcystin monoclonal antibody with cross-reactivity against the most common and most toxic variants. Validation of the assay showed the limit of detection (LOD) to be 0.78 ng mL⁻¹ and the CCβ to be 1 ng mL⁻¹. Robustness of the assay was demonstrated by intra- and inter-assay testing. Intra-assay analysis gave CVs between 8% and 26% and recoveries between 73% and 101%, while inter-assay analysis gave CVs between 5% and 14% and recoveries between 78% and 91%. Comparison with LC-MS/MS showed a high correlation (R = 0.9954) between the calculated total microcystin concentrations of five different Microcystis aeruginosa cultures. Total microcystin content was ascertained by the separate measurement of free and cell-bound microcystins. Free microcystins can be measured down to 1 ng mL⁻¹, and with a 10-fold concentration step in the intracellular microcystin protocol (which brings the sample within the range of the calibration curve), intracellular pools may be determined down to 0.1 ng mL⁻¹. This allows the determination of microcystins at and below the World Health Organisation (WHO) guideline value of 1 µg L⁻¹. This sensor represents a major advance in portable analysis capability and has the potential for numerous other applications.
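As an illustration of the precision and recovery figures quoted above, intra- or inter-assay CVs and recoveries can be computed from replicate readings of spiked samples. The minimal Python sketch below uses hypothetical replicate values and a hypothetical 2 ng/mL spike level; it is not taken from the study's validation data.

import numpy as np

def cv_percent(replicates):
    """Coefficient of variation (%) of replicate concentration readings."""
    r = np.asarray(replicates, float)
    return 100.0 * r.std(ddof=1) / r.mean()

def recovery_percent(replicates, spiked_conc):
    """Mean recovery (%) relative to the spiked concentration."""
    return 100.0 * np.mean(replicates) / spiked_conc

# Hypothetical intra-assay replicates (ng/mL) for a 2 ng/mL microcystin spike:
reps = [1.8, 2.1, 1.6, 2.0]
print(f"CV = {cv_percent(reps):.1f}%, recovery = {recovery_percent(reps, 2.0):.0f}%")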
Abstract:
Tissue microarray (TMA) is based on the idea of applying miniaturization and a high-throughput approach to hybridization-based analyses of tissues. It facilitates large-scale biomedical research in a single experiment, thus representing one of the most commonly used technologies in translational research. A critical analysis of the existing TMA instruments indicates that there are potential constraints in terms of portability, apart from cost and complexity. This paper presents the development of an affordable, configurable, and portable TMA instrument that allows efficient collection of tissues, especially in instrument-to-tissue scenarios. The purely mechanical instrument requires no energy source other than the user, and is lightweight, portable, and simple to use. [DOI: 10.1115/1.4004922]
Abstract:
The technique of diffusive gradients in thin films (DGT) is often employed to quantify labile metals in situ; however, performing the measurements in the field remains a challenge. This study evaluated the capability of field-portable X-ray fluorescence (FP-XRF) to rapidly generate elemental speciation information with DGT. Biologically available metal ions in environmental samples passively preconcentrate in the thin films of DGT devices, providing an ideal and uniform matrix for nondestructive XRF detection. Strong correlation coefficients (r > 0.992 for Mn, Cu, Zn, Pb and As) were obtained for all elements during calibration. The limits of quantitation (LOQ) of FP-XRF on DGT devices were 2.74 µg cm⁻² for Mn, 4.89 µg cm⁻² for Cu, 2.89 µg cm⁻² for Zn, 2.55 µg cm⁻² for Pb, and 0.48 µg cm⁻² for As. When Pb and As co-existed in the solution trials, As did not interfere with Pb detection when Chelex-DGT was used. However, a significant enhancement of the Pb reading attributable to As was observed when ferrihydrite binding gels were tested, consistent with Fe-oxyhydroxide surfaces adsorbing large quantities of As. This study demonstrates the value of the FP-XRF technique for rapidly and nondestructively detecting the metals accumulated in DGT devices, providing a new and simple diagnostic tool for on-site environmental monitoring of labile metals/metalloids.
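In general terms, LOQ values such as those quoted above can be derived from a linear calibration of FP-XRF response against areal mass on the binding gel. The sketch below assumes the common 10·σ(blank)/slope definition of the LOQ and uses hypothetical calibration and blank data; it is not the study's exact procedure.

import numpy as np

def calibrate(areal_mass_ug_cm2, counts):
    """Least-squares line relating FP-XRF response to areal mass on the DGT binding gel."""
    slope, intercept = np.polyfit(areal_mass_ug_cm2, counts, 1)
    return slope, intercept

def loq_ug_cm2(blank_counts, slope):
    """Common 10-sigma definition: LOQ = 10 * sd(blank) / slope."""
    return 10.0 * np.std(blank_counts, ddof=1) / slope

# Hypothetical calibration standards (ug cm^-2 vs. counts) and blank replicates:
slope, _ = calibrate([0.5, 1, 2, 5, 10], [120, 240, 470, 1180, 2350])
print(f"LOQ ~ {loq_ug_cm2([4, 7, 5, 6, 8], slope):.2f} ug cm^-2")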
Abstract:
In this article, the multibody simulation software package MADYMO, used for analysing and optimizing occupant safety design, was employed to model crash tests for Normal Containment barriers in accordance with EN 1317. The verification process was carried out by simulating a TB31 and a TB32 crash test performed on vertical portable concrete barriers and by comparing the numerical results to those obtained experimentally. The same modelling approach was applied to both tests to evaluate the predictive capacity of the modelling at two different impact speeds. A sensitivity analysis of the vehicle stiffness was also carried out. The capacity to predict all of the principal EN 1317 criteria was assessed for the first time: the acceleration severity index, the theoretical head impact velocity, the barrier working width and the vehicle exit box. Results showed a maximum error of 6% for the acceleration severity index and 21% for the theoretical head impact velocity in the numerical simulations compared with the recorded data. The exit box position was predicted with a maximum error of 4°. For the working width, a large percentage difference was observed for test TB31 due to the small absolute value of the barrier deflection, but the results were well within the limit value of the standard for both tests. The sensitivity analysis showed the robustness of the modelling with respect to changes in contact stiffness of ±20% and ±40%. This is the first multibody model of portable concrete barriers that can reproduce not only the acceleration severity index but all the test criteria of EN 1317; it is therefore a valuable tool for new product development and for injury biomechanics research.
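For reference, the acceleration severity index in EN 1317 is the peak of an RMS combination of 50 ms moving averages of the vehicle accelerations, each normalised by a directional limit (12 g longitudinal, 9 g lateral, 10 g vertical). The Python sketch below illustrates that computation on acceleration traces already expressed in g; the example traces are hypothetical and this is not the MADYMO post-processing code.

import numpy as np

# EN 1317 limit accelerations (in g): longitudinal, lateral, vertical
G_LIMITS = (12.0, 9.0, 10.0)

def asi(ax, ay, az, fs_hz, window_s=0.05):
    """Acceleration Severity Index: peak of the RMS combination of 50 ms moving
    averages of the vehicle CoG accelerations (in g), each normalised by its limit."""
    n = max(1, int(round(window_s * fs_hz)))
    kernel = np.ones(n) / n
    avgs = [np.convolve(np.asarray(a, float), kernel, mode="valid") for a in (ax, ay, az)]
    combined = np.sqrt(sum((avg / g) ** 2 for avg, g in zip(avgs, G_LIMITS)))
    return combined.max()

# Hypothetical 1 kHz acceleration pulses (g):
t = np.linspace(0, 0.3, 300)
print(asi(6 * np.exp(-t / 0.1), 3 * np.exp(-t / 0.1), 0.5 * np.ones_like(t), fs_hz=1000))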
Abstract:
Hardware designers and engineers typically need to explore a multi-parametric design space in order to find the best configuration for their designs, using simulations that can take weeks to months to complete. For example, designers of special-purpose chips need to explore parameters such as the optimal bitwidth and data representation. This is the case for the development of complex algorithms such as Low-Density Parity-Check (LDPC) decoders used in modern communication systems. Currently, high-performance computing offers a wide set of acceleration options, ranging from multicore CPUs to graphics processing units (GPUs) and FPGAs. Depending on the simulation requirements, the ideal architecture can vary. In this paper we propose a new design flow based on OpenCL, a unified multiplatform programming model, which accelerates LDPC decoding simulations, thereby significantly reducing architectural exploration and design time. OpenCL-based parallel kernels are used without modifications or code tuning on multicore CPUs, GPUs and FPGAs. We use SOpenCL (Silicon to OpenCL), a tool that automatically converts OpenCL kernels to RTL, for mapping the simulations onto FPGAs. To the best of our knowledge, this is the first time that a single, unmodified OpenCL code has been used to target these three different platforms. We show that, depending on the design parameters to be explored in the simulation and on the dimension and phase of the design, the GPU or the FPGA may be better suited, providing different acceleration factors. For example, although simulations can typically execute more than 3x faster on FPGAs than on GPUs, the overhead of circuit synthesis often outweighs the benefits of FPGA-accelerated execution.
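One of the design parameters mentioned above, the LLR bitwidth of a fixed-point min-sum LDPC decoder, can be explored functionally before committing to a given architecture. The Python sketch below shows a saturating uniform quantizer and a min-sum check-node update; it does not reproduce the paper's OpenCL kernels, and the quantizer step size is an assumption.

import numpy as np

def quantize(llr, bits, step=0.5):
    """Saturating uniform fixed-point quantizer: `bits` total bits, resolution `step`."""
    max_val = (2 ** (bits - 1) - 1) * step
    return np.clip(np.round(np.asarray(llr, float) / step) * step, -max_val, max_val)

def checknode_minsum(msgs):
    """Min-sum check-node update: each outgoing message carries the sign product and
    the minimum magnitude of all *other* incoming messages."""
    msgs = np.asarray(msgs, float)
    signs = np.where(msgs < 0, -1.0, 1.0)
    mags = np.abs(msgs)
    order = np.argsort(mags)
    min1, min2 = mags[order[0]], mags[order[1]]
    out_mag = np.where(np.arange(msgs.size) == order[0], min2, min1)
    # excluding edge i's own sign: prod(signs)/sign_i == prod(signs)*sign_i for +/-1 values
    return np.prod(signs) * signs * out_mag

# Compare the update at different bitwidths for one set of incoming LLRs:
llrs = [1.3, -0.4, 2.7, -3.1]
for bits in (4, 6, 8):
    print(bits, checknode_minsum(quantize(llrs, bits)))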
Abstract:
This paper presents a portable electrochemical instrument capable of detecting and identifying heavy metals in soil, in situ. The instrument has been developed for use in a variety of situations to facilitate contaminated-land surveys, avoiding expensive and time-consuming procedures. The system uses differential pulse anodic stripping voltammetry, a precise and sensitive analytical method with excellent limits of detection. The identification of metals is based on a statistical, microprocessor-based method. The instrument is capable of detecting six different toxic metals (lead, cadmium, zinc, nickel, mercury and copper) with good sensitivity.
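To illustrate how stripping peaks might be mapped to individual metals, the sketch below matches measured peak potentials against a table of nominal reference potentials. The reference values and the tolerance are illustrative placeholders (real peak potentials depend on the electrode, supporting electrolyte and reference electrode), and this nearest-match scheme stands in for, but is not, the instrument's statistical identification method.

# Nominal stripping-peak potentials (V vs. a reference electrode) -- placeholder values only.
REFERENCE_PEAKS = {"Zn": -1.00, "Ni": -0.90, "Cd": -0.60, "Pb": -0.45, "Cu": -0.05, "Hg": 0.25}

def identify(peak_potentials_v, tolerance_v=0.05):
    """Assign each measured stripping peak to the closest reference metal within tolerance."""
    assignments = []
    for e in peak_potentials_v:
        metal, ref = min(REFERENCE_PEAKS.items(), key=lambda kv: abs(kv[1] - e))
        assignments.append(metal if abs(ref - e) <= tolerance_v else "unknown")
    return assignments

print(identify([-0.46, -0.04, -1.02]))   # e.g. ['Pb', 'Cu', 'Zn']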
Abstract:
The growing need to reduce energy dependence and greenhouse gas emissions has led to the adoption of a series of European-level policies aimed at increasing energy efficiency and the level of equipment control, reducing consumption, and increasing the share of energy produced from renewable sources. These measures have given rise to two situations that are critical for the electricity sector: the replacement of traditional, inefficient linear loads by more efficient non-linear loads, and the emergence of distributed generation from renewable sources. Although their advantages are well documented, both situations can negatively affect power quality in the distribution network, particularly in the low-voltage network, where most customers are connected and where the non-linear loads and the connections to decentralised energy sources are found. Power quality monitoring has therefore gained importance, both because of the costs of the losses caused by poor power quality in the network and because of the need to verify that certain power quality parameters remain within the limits set by standards and customer contracts, so as to avoid disputes or complaints. To this end, the distribution network has been undergoing changes at the level of substations and transformer stations aimed at increasing real-time visibility of power quality in the network. However, these measures only allow power quality to be monitored as far as the medium-to-low-voltage transformer stations and do not reveal the actual state of power quality at the customer delivery points. Monitoring at these points is performed periodically rather than in real time, falling short of what is needed to reliably detect power quality problems on the consumer side. In fact, the monitoring methodology currently used involves sending technicians to the site of a complaint, or to a designated measurement point, to install a power quality analyser that remains in the installation for a certain period of time. This type of after-the-fact monitoring makes it impossible to detect the power quality problem that gave rise to the complaint unless it is a continuous problem. At best, the device may capture a recurrence of the event, but a large proportion of anomalies fall outside this process because they are sporadic. To detect the event that caused the problem, power quality must be monitored permanently. However, this approach implies the permanent installation of equipment and is not viable from the point of view of the distribution companies, since the equipment is too expensive and requires larger spaces at the delivery points to house both the equipment and the electricity meter. A possible alternative that could make permanent power quality monitoring viable is to introduce a monitoring function in the energy meters at certain points of the distribution network. Meters are mandatory, for billing purposes, in every installation connected to the network.
Traditionally these meters have been electromechanical, and they have recently begun to be replaced by smart meters, electronic in nature, which in addition to metering energy allow the collection of information on other parameters and the deployment of a range of functions by the distribution system operator, thanks to their communication capabilities. Reusing this equipment to analyse power quality at the delivery points therefore emerges as a privileged approach, since it essentially amounts to exploiting some of its additional capabilities. The objective of this work is to analyse the possibility described above of permanently monitoring power quality at the customer delivery point using the customer's own electricity meter, and to draw up a set of requirements for the meter, taking into account the applicable standards, the characteristics of the equipment currently used by the network operator, and the needs of the electricity system with regard to power quality monitoring.
Abstract:
BACKGROUND: Screening for obstructive sleep apnea (OSA) is recommended as part of the preoperative assessment of obese patients scheduled for bariatric surgery. The objective of this study was to compare the sensitivity of oximetry alone versus portable polygraphy in the preoperative screening for OSA. METHODS: Polygraphy (type III portable monitor) and oximetry data recorded as part of the preoperative assessment before bariatric surgery in 68 consecutive patients were reviewed. We compared the sensitivity of the 3% or 4% desaturation index (oximetry alone) with the apnea-hypopnea index (AHI; polygraphy) to diagnose OSA and classify the patients as normal (<10 events per hour), mild to moderate (10-30 events per hour), or severe (>30 events per hour). RESULTS: Using the AHI, the prevalence of OSA (AHI > 10 per hour) was 57.4%: 16.2% of the patients were classified as severe, 41.2% as mild to moderate, and 42.6% as normal. Using the 3% desaturation index, 22.1% were classified as severe, 47.1% as mild to moderate, and 30.9% as normal. With the 4% desaturation index, 17.6% were classified as severe, 32.4% as mild to moderate, and 50% as normal. Overall, the 3% desaturation index compared with the AHI yielded a 95% negative predictive value to rule out OSA (AHI > 10 per hour) and a 100% sensitivity (0.73 positive predictive value) to detect severe OSA (AHI > 30 per hour). CONCLUSIONS: Using oximetry with a 3% desaturation index as a screening tool for OSA could allow us to rule out significant OSA in almost a third of the patients and to detect patients with severe OSA. This cheap and widely available technique could accelerate the preoperative work-up of these patients.
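A simplified illustration of how a 3% desaturation index can be derived from a raw SpO2 trace and mapped onto the severity bands used in the study is given below. The moving-baseline rule and window length are assumptions made for the sketch and do not reproduce the clinical scoring rules applied to the recordings.

import numpy as np

def odi(spo2, fs_hz, drop=3.0, baseline_s=120.0):
    """Crude oxygen desaturation index: events per hour in which SpO2 falls at least
    `drop` percentage points below a moving baseline (median of the preceding window)."""
    spo2 = np.asarray(spo2, float)
    n_base = max(1, int(baseline_s * fs_hz))
    events, in_event = 0, False
    for i in range(n_base, spo2.size):
        baseline = np.median(spo2[i - n_base:i])
        desat = spo2[i] <= baseline - drop
        if desat and not in_event:
            events += 1
        in_event = desat
    hours = spo2.size / fs_hz / 3600.0
    return events / hours

def classify(events_per_hour):
    """Severity bands used in the study: <10 normal, 10-30 mild to moderate, >30 severe."""
    if events_per_hour < 10:
        return "normal"
    return "mild to moderate" if events_per_hour <= 30 else "severe"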
Abstract:
Red blood cell (RBC) parameters such as morphology, volume, refractive index, and hemoglobin content are of great importance for diagnostic purposes. Existing approaches require complicated calibration procedures and robust cell perturbation. As a result, reference values for normal RBCs differ depending on the method used. We present a way of measuring the parameters of intact individual RBCs by using digital holographic microscopy (DHM), a new interferometric and label-free technique with nanometric axial sensitivity. The results are compared with values obtained by conventional techniques for RBCs from the same donor and with previously published figures. A DHM equipped with a laser diode (λ = 663 nm) was used to record holograms in an off-axis geometry. Measurements of both RBC refractive indices and volumes were achieved by monitoring the quantitative phase map of the RBCs during the sequential perfusion of two isotonic solutions with different refractive indices obtained by the use of Nycodenz (decoupling procedure). The volume of RBCs labeled with the membrane dye DiI was analyzed by confocal microscopy. The mean cell volume (MCV), red blood cell distribution width (RDW), and mean cell hemoglobin concentration (MCHC) were also measured with an impedance volume analyzer. DHM yielded an RBC refractive index of n = 1.418 ± 0.012, a volume of 83 ± 14 fl, MCH = 29.9 pg, and MCHC = 362 ± 40 g/l. Erythrocyte MCV, MCH, and MCHC obtained with the impedance volume analyzer were 82 fl, 28.6 pg, and 349 g/l, respectively. Confocal microscopy yielded 91 ± 17 fl for the RBC volume. In conclusion, DHM in combination with a decoupling procedure allows noninvasive measurement of the volume, refractive index, and hemoglobin content of single living RBCs with high accuracy.
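The decoupling procedure described above amounts to solving two phase equations, φ_k = (2π/λ)(n_cell − n_k)·d, recorded in two isotonic media of known refractive indices n1 and n2, for the cell thickness d and the intracellular refractive index n_cell; hemoglobin concentration then follows from the refractive increment. The Python sketch below illustrates that algebra only; the refractive increment and background index values are assumed typical figures, not values from the study.

import numpy as np

LAMBDA_M = 663e-9      # wavelength of the laser diode used in the study

def decouple(phi1, phi2, n1, n2):
    """Solve phi_k = (2*pi/lambda) * (n_cell - n_k) * d for thickness d and index n_cell,
    given phase maps (radians) of the same cell in two media of indices n1 and n2."""
    d = LAMBDA_M * (phi1 - phi2) / (2.0 * np.pi * (n2 - n1))
    n_cell = n1 + phi1 * LAMBDA_M / (2.0 * np.pi * d)
    return n_cell, d

ALPHA_DL_PER_G = 0.0019   # hemoglobin refractive increment (assumed typical value, dl/g)
N_BACKGROUND = 1.335      # refractive index of the hemoglobin-free background (assumption)

def hb_concentration_g_per_dl(n_cell):
    """Intracellular hemoglobin concentration via the linear refractive-increment model."""
    return (n_cell - N_BACKGROUND) / ALPHA_DL_PER_G

# Applied pixel-wise, decouple() yields a thickness map; the cell volume is its sum
# multiplied by the pixel area: volume_m3 = d.sum() * pixel_area_m2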
Abstract:
UANL
Abstract:
Over the last decade, technological and computing tools have grown considerably in many spheres of society. Education has been no exception, and the ministère de l'Éducation, du Loisir et du Sport (MELS) has made their use a cross-curricular competency within the Programme de formation de l'école québécoise. The integration of ICT has taken place through various means, starting with computer labs and in-class computers and, more recently, through the introduction of laptop projects in which each student and the teacher have their own computer. In order to be carried out, this research project was embedded in a larger-scale study funded by the Conseil de recherches en sciences humaines du Canada (CRSH), whose objective is to analyse the obstacles teachers face when integrating technology in schools. The present project focused more specifically on the technological and pedagogical challenges inherent to laptop projects. The study took place in a disadvantaged setting, in a Montréal elementary school. Such an integration requires rigorous planning and continuous follow-up to ensure the success of the project. Moreover, this type of project clearly poses particular technological and pedagogical challenges for teachers. In this regard, three categories of factors that can affect the success of laptop projects were identified: personal factors (internal to the teacher), work-related factors (teaching context, pedagogical practices, etc.), and factors related to equipment and infrastructure. Within this thesis, various concepts, dimensions and indicators are therefore explained and related to one another in order to better understand the technological and pedagogical challenges that can arise in the implementation of laptop projects. Three teachers, each responsible for a group of students, agreed to take part in an individual interview and to answer a questionnaire. Email exchanges were also analysed. All the data collected were subjected to qualitative analysis. The analyses showed that the category of factors cited most frequently was, by a strong majority, that of work-related factors. Challenges were nevertheless identified for all dimensions.