961 results for Fast methods


Relevance: 30.00%

Abstract:

Purpose

The objective of our study was to test a new approach to approximating organ dose by using the effective energy of the combined 80 kV/140 kV beam used in fast-kV-switch dual-energy (DE) computed tomography (CT). The two primary aims were, first, to validate experimentally the dose equivalency between MOSFET detectors and an ion chamber (the gold standard) in a fast-kV-switch DE environment and, second, to estimate the effective dose (ED) of DECT scans using MOSFET detectors and an anthropomorphic phantom.

Materials and Methods

A GE Discovery 750 CT scanner was used with a fast-kV-switch abdomen/pelvis protocol alternating between 80 kV and 140 kV. The specific aims of the study were to (1) characterize the effective energy of the dual-energy environment; (2) estimate the f-factor for soft tissue; (3) calibrate the MOSFET detectors using a beam with effective energy equal to that of the combined DE environment; (4) validate the calibration by using MOSFET detectors and an ion chamber to measure dose at the center of a CTDI body phantom; (5) measure ED for an abdomen/pelvis scan using an anthropomorphic phantom and applying ICRP 103 tissue weighting factors; and (6) estimate ED using the AAPM dose-length product (DLP) method. The effective energy of the combined beam was calculated by measuring dose with an ion chamber under varying thicknesses of aluminum to determine the half-value layer (HVL).

Results

The effective energy of the combined dual-energy beam was found to be 42.8 keV. After calibration, tissue dose at the center of the CTDI body phantom was measured as 1.71 ± 0.01 cGy with the ion chamber and as 1.73 ± 0.04 cGy and 1.69 ± 0.09 cGy with two separate MOSFET detectors, corresponding to differences of -0.93% and 1.40%, respectively, between ion chamber and MOSFET. The ED of the dual-energy scan was calculated as 16.49 ± 0.04 mSv by the MOSFET method and 14.62 mSv by the DLP method.
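For context, both ED estimates reported above follow standard definitions: the phantom/MOSFET estimate sums organ equivalent doses weighted by the ICRP 103 tissue weighting factors, ED = Σ_T w_T H_T, while the DLP estimate multiplies the scan DLP by a region-specific conversion coefficient (about 0.015 mSv per mGy·cm for an adult abdomen/pelvis). The sketch below illustrates both calculations; the weights shown are only a subset of ICRP 103, and the organ doses and DLP value are hypothetical numbers, not data from this study.

```python
# Sketch: effective-dose estimation by tissue weighting and by the DLP method
# (hypothetical inputs; not data from the study).
ICRP103_WEIGHTS = {          # subset of ICRP 103 tissue weighting factors
    "colon": 0.12, "stomach": 0.12, "lung": 0.12, "red_bone_marrow": 0.12,
    "gonads": 0.08, "bladder": 0.04, "liver": 0.04,
}

def effective_dose(organ_dose_mSv: dict) -> float:
    """ED = sum over tissues of w_T * H_T (equivalent dose, mSv)."""
    return sum(ICRP103_WEIGHTS[t] * h for t, h in organ_dose_mSv.items())

def effective_dose_dlp(dlp_mGy_cm: float, k: float = 0.015) -> float:
    """DLP method: ED ~= k * DLP, with k in mSv/(mGy*cm) for abdomen/pelvis."""
    return k * dlp_mGy_cm

# Hypothetical example values:
organ_doses = {"colon": 18.0, "stomach": 17.5, "liver": 16.0, "bladder": 19.0}
print(effective_dose(organ_doses), effective_dose_dlp(975.0))
```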

Relevance: 30.00%

Abstract:

Aflatoxins are a group of carcinogenic compounds produced by Aspergillus fungi that can grow on different agricultural crops. Both acute and chronic exposure to these mycotoxins can cause serious illness. Because aflatoxins occur frequently in crops worldwide, fast and cost-effective analytical methods are required to identify contaminated agricultural commodities before they are processed into final products and placed on the market. To provide new tools for aflatoxin screening, two prototype fast ELISA methods were developed: one for the detection of aflatoxin B1 and the other for total aflatoxins. Seven monoclonal antibodies combining unusually high sensitivity with good cross-reactivity profiles were produced. The monoclonal antibodies were characterized, and two antibodies with IC50 values of 0.037 ng/mL and 0.031 ng/mL for aflatoxin B1 were applied in simple and fast direct competitive ELISA tests. The methods were validated for the peanut matrix, as this crop is one of the most affected by aflatoxin contamination. The detection capabilities for aflatoxin B1 were 0.4 μg/kg for the aflatoxin B1 ELISA and 0.3 μg/kg for the total-aflatoxins ELISA, which are among the lowest values reported. The total-aflatoxins ELISA was also validated for the detection of aflatoxins B2, G1 and G2. The application of the developed tests was demonstrated by screening 32 peanut samples collected from UK retailers. The total-aflatoxins ELISA was further applied to analyse naturally contaminated maize porridge and distiller's dried grain with solubles samples, and the results were correlated with those obtained by a UHPLC-MS/MS method.
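A direct competitive ELISA of this kind is typically quantified by fitting a four-parameter logistic (4PL) calibration curve to the absorbance of the standards and reading unknown concentrations off the fitted curve; the IC50 is the concentration at the curve's inflection point. The sketch below, with made-up standard concentrations and absorbances, shows the general approach; it is an illustration, not the validation protocol used in the study.

```python
# Sketch: 4PL calibration for a direct competitive ELISA (hypothetical data).
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """4PL curve: a = top asymptote, b = slope, c = IC50, d = bottom asymptote."""
    return d + (a - d) / (1.0 + (x / c) ** b)

# Hypothetical standards (ng/mL) and their mean absorbances:
conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0])
absorbance = np.array([1.82, 1.60, 1.15, 0.70, 0.35, 0.18])

(a, b, c, d), _ = curve_fit(four_pl, conc, absorbance, p0=[1.9, 1.0, 0.1, 0.1])
print(f"Estimated IC50 ~ {c:.3f} ng/mL")

def concentration(y):
    """Invert the fitted curve to estimate the concentration of an unknown."""
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

print(f"Unknown sample at A = 0.9 -> ~{concentration(0.9):.3f} ng/mL")
```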

Relevance: 30.00%

Abstract:

Enabling natural human-robot interaction with computer-vision-based applications requires fast and accurate hand detection. However, previous work in this field assumes various constraints, such as a limit on the number of detectable gestures, because hands are highly complex objects that are difficult to locate. This paper presents an approach that integrates temporal coherence cues with wrist-based hand detection using a cascade classifier. With this approach we introduce three main contributions: (1) a transparent initialization mechanism, requiring no user participation, that segments hands independently of their gesture; (2) a larger number of detected gestures and a faster training phase than previous cascade-classifier-based methods; and (3) near real-time hand-pose detection in video streams.
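The cascade-classifier stage of such a pipeline can be prototyped with OpenCV's CascadeClassifier API; the sketch below runs a trained cascade over webcam frames and keeps only detections that persist across consecutive frames, as a crude stand-in for temporal-coherence filtering. The file name wrist_cascade.xml is a placeholder for a user-trained cascade (OpenCV does not ship one), and the filter is an illustration, not the paper's method.

```python
# Sketch: cascade detection over a video stream with a simple temporal filter.
import cv2

cascade = cv2.CascadeClassifier("wrist_cascade.xml")  # placeholder: user-trained cascade
cap = cv2.VideoCapture(0)                             # webcam stream
previous = []                                         # detections from the previous frame

def overlaps(a, b, min_iou=0.3):
    """Rough intersection-over-union test between two (x, y, w, h) boxes."""
    ax, ay, aw, ah = a; bx, by, bw, bh = b
    ix = max(0, min(ax + aw, bx + bw) - max(ax, bx))
    iy = max(0, min(ay + ah, by + bh) - max(ay, by))
    inter = ix * iy
    union = aw * ah + bw * bh - inter
    return union > 0 and inter / union >= min_iou

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    boxes = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=4)
    # Keep only detections that also appeared (roughly) in the previous frame.
    stable = [b for b in boxes if any(overlaps(b, p) for p in previous)]
    previous = list(boxes)
    for (x, y, w, h) in stable:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("detections", frame)
    if (cv2.waitKey(1) & 0xFF) == 27:  # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```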

Relevance: 30.00%

Abstract:

Over the past few decades, work on infrared sensor applications has advanced considerably worldwide. A difficulty remains, however: objects are not always clear enough, or cannot always be easily distinguished, in the image obtained of the observed scene. Infrared image enhancement has played an important role in the development of infrared computer vision, image processing and non-destructive testing, among other technologies. This thesis addresses infrared image enhancement from two angles: the processing of a single infrared image in the hybrid spatial-frequency domain, and the fusion of infrared and visible images using the non-subsampled contourlet transform (NSCT). Image fusion can be viewed as a continuation of single-image infrared enhancement, since it combines infrared and visible images into a single image that represents and enhances all the useful information and features of the source images; a single image cannot contain all the relevant or available information because of the restrictions of any single imaging sensor. We review the development of infrared image enhancement techniques and then focus on single-image infrared enhancement, proposing a hybrid-domain enhancement scheme with an improved fuzzy threshold evaluation method that yields higher image quality and improves human visual perception. The infrared/visible fusion techniques are built on an accurate registration of the source images acquired by the different sensors. The SURF-RANSAC algorithm is used for registration throughout this work, producing very accurately registered images and clear benefits for the subsequent fusion processing. For infrared/visible image fusion, a series of advanced and efficient approaches is proposed. A standard multi-channel NSCT-based fusion method is presented as a baseline for the fusion approaches that follow. A joint fusion approach involving an adaptive-Gaussian NSCT and the wavelet transform (WT) is proposed, yielding fusion results better than those obtained with general non-adaptive methods. An NSCT-based fusion approach employing compressed sensing (CS) and total variation (TV), which samples sparse coefficients and accurately reconstructs the fused coefficients, is proposed; it achieves much better fusion results by pre-enhancing the infrared image and reducing redundant information in the fusion coefficients. Finally, an NSCT-based fusion procedure using a fast iterative-shrinking compressed sensing (FISCS) technique is proposed to compress the decomposed coefficients and reconstruct the fused coefficients during the fusion process, leading to better results obtained more quickly and efficiently.
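The SURF-RANSAC registration step mentioned above can be prototyped with OpenCV: SURF keypoints are matched between the visible and infrared images, a homography is estimated with RANSAC to reject mismatches, and one image is warped onto the other before fusion. The sketch below is only an illustration of that generic step under stated assumptions: it requires an OpenCV build with the non-free xfeatures2d module, and the file names and parameters are placeholders.

```python
# Sketch: SURF keypoints + RANSAC homography for infrared/visible registration.
# Requires an OpenCV build with the non-free xfeatures2d module (SURF).
import cv2
import numpy as np

visible = cv2.imread("visible.png", cv2.IMREAD_GRAYSCALE)    # placeholder paths
infrared = cv2.imread("infrared.png", cv2.IMREAD_GRAYSCALE)

surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
kp1, des1 = surf.detectAndCompute(visible, None)
kp2, des2 = surf.detectAndCompute(infrared, None)

# Match descriptors and keep the better matches (Lowe's ratio test).
matcher = cv2.BFMatcher(cv2.NORM_L2)
matches = matcher.knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.7 * n.distance]

src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)

# RANSAC rejects remaining mismatches while estimating the homography.
H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

# Warp the visible image into the infrared frame before fusion.
registered = cv2.warpPerspective(visible, H, infrared.shape[::-1])
cv2.imwrite("visible_registered.png", registered)
```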

Relevance: 30.00%

Abstract:

Seafood fraud, the misrepresentation of seafood products, has been discovered around the world in different forms, such as false labelling, species substitution, short-weighting or over-glazing, in order to hide the correct identity, origin or weight of the product. Because of the value of seafood products such as canned tuna, swordfish or grouper, the main commercial fraud involving these species is the replacement of valuable species with species of little or no value. A similar situation occurs with shelled shrimp or shellfish that are cut into pieces for commercialization. Food fraud by species substitution is an emerging risk given the increasingly global food supply chain and the potential food safety issues. Economic food fraud is committed when food is deliberately placed on the market, for financial gain, with the intention of deceiving consumers (Woolfe, M. & Primrose, S. 2004). As a result of increased demand and the globalization of the seafood supply, more fish species are encountered in the market. In this scenario, it becomes essential to identify species unequivocally. Traditional taxonomy, based primarily on species identification keys, has shown a number of limitations in the use of distinctive features in many animal taxa, which are amplified when fish, crustaceans or shellfish are commercially processed. Many fish species have a similar texture, so the certification of fish products is particularly important when fish have undergone procedures that affect the overall anatomical structure, such as heading, slicing or filleting (Marko et al., 2004). The absence of morphological traits, the main characteristics usually used to identify animal species, represents a challenge, and molecular identification methods are required. Among them, DNA-based methods are the most frequently employed for food authentication (Lockley & Bardsley, 2000). In addition to food authentication and traceability, studies of taxonomy, population and conservation genetics, as well as analyses of dietary habits and prey selection, also rely on genetic analyses including DNA barcoding (Arroyave & Stiassny, 2014; Galimberti et al., 2013; Mafra, Ferreira, & Oliveira, 2008; Nicolé et al., 2012; Rasmussen & Morrissey, 2008), which consists of PCR amplification and sequencing of a specific region of the mitochondrial COI gene. The system proposed by P. Hebert et al. (2003) locates within the mitochondrial COI gene (cytochrome c oxidase subunit I) a bio-identification system useful for the taxonomic identification of species (Lo Brutto et al., 2007). The COI region used for genetic identification, the DNA barcode, is short enough that, with current technology, its sequence (the pairs of nucleotide bases) can be decoded in a single step. Although this region represents only a tiny fraction of the mitochondrial DNA content of each cell, it has sufficient variability to distinguish the majority of species (Biondo et al. 2016). This technique has already been employed to assess the actual identity and/or provenance of marketed products, as well as to unmask mislabelling and fraudulent substitutions, which are difficult to detect especially in manufactured seafood (Barbuto et al., 2010; Galimberti et al., 2013; Filonzi, Chiesa, Vaghi, & Nonnis Marzano, 2010).
Nowadays, research focuses on using genetic markers not only to identify the species and/or varieties of fish, but also to identify molecular characters able to trace the origin and to provide an effective control tool for producers and consumers along the supply chain, in agreement with local regulations.
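In practice, DNA barcoding assigns a specimen to a species by comparing its COI sequence against reference barcodes (e.g. from BOLD or GenBank) and picking the closest match. The sketch below illustrates that comparison naively with Biopython's pairwise aligner; the reference and query sequences are short placeholders, not real barcode records, and real workflows use curated databases and proper distance models.

```python
# Sketch: naive COI-barcode matching by pairwise identity (illustrative only).
from Bio import Align

# Placeholder reference barcodes; in practice these come from BOLD/GenBank records.
references = {
    "Thunnus albacares": "ACTTTATACCTTTTATTCGGCGCATGAGCCGGA",
    "Xiphias gladius":   "ACTCTGTATCTAGTATTTGGTGCTTGAGCCGGT",
}
query = "ACTTTATACCTTTTATTCGGTGCATGAGCTGGA"  # hypothetical sequence from a sample

aligner = Align.PairwiseAligner()
aligner.mode = "global"

def identity(a, b):
    """Fraction of matching positions in a global alignment (match score = 1)."""
    return aligner.score(a, b) / max(len(a), len(b))

best = max(references, key=lambda sp: identity(query, references[sp]))
print(best, f"{identity(query, references[best]):.2%}")
```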

Relevance: 30.00%

Abstract:

Evaluation of the quality of the environment is essential for human wellness, as pollutants in trace amounts can cause serious health problems. Nitrosamines are a group of compounds considered potential carcinogens that can be found in drinking water (as disinfection by-products), foods, beverages and cosmetics. Monitoring the levels of these compounds in order to minimize daily intake requires fast and reliable analytical techniques. Because these compounds are highly polar, extraction and enrichment from environmental (aqueous) samples are challenging. In addition, the trend in analytical techniques toward smaller sample sizes and minimal organic solvent use demands new methods of analysis. To meet these requirements, a new online preconcentration method tailored to electrokinetic chromatography is introduced. In this method, electroosmotic flow (EOF) was suppressed to increase the interaction time between analyte and micellar phase, so the only force mobilizing the neutral analytes is their interaction with the moving micelles. In the absence of EOF, the polarity of the applied potential was switched (negative or positive) to force the (anionic or cationic) micelles toward the detector. To avoid excessive band broadening caused by the longer analysis times of slow-moving micelles, auxiliary pressure was introduced to boost micelle movement toward the detector using an in-house designed and built apparatus. Applying the external auxiliary pressure significantly reduced analysis times without compromising separation efficiency. Parameters such as surfactant type, composition of the background electrolyte (BGE), capillary type, matrix effects and organic modifiers were evaluated in optimizing the method. The enrichment factors for the targeted analytes were impressive; in particular, cationic surfactants proved suitable for the analysis of nitrosamines owing to their ability to act as hydrogen-bond donors. Ammonium perfluorooctanoate (APFO) also showed remarkable results in terms of peak shape and number of theoretical plates. Separation results were best when a high-conductivity sample was paired with a BGE of lower conductivity. Using higher surfactant concentrations (up to 200 mM SDS) than usual (50 mM SDS) for micellar electrokinetic chromatography (MEKC) improved the sweeping. A new method for micro-extraction and enrichment of highly polar neutral analytes (N-nitrosamines in particular), based on three-phase drop micro-extraction, was also introduced and its performance studied. For this method, a new device built from readily available components was fabricated, and its operation and application were demonstrated. Compared with conventional extraction methods (liquid-liquid extraction), organic solvent consumption and operation times were significantly lower.
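Two figures of merit mentioned above, the enrichment factor and the separation efficiency (number of theoretical plates), are straightforward to compute from peak data. The sketch below uses common definitions (enrichment factor as the ratio of calibration sensitivities with and without preconcentration, and the half-height plate-count formula) with hypothetical values; it is an illustration of the general calculations, not numbers from the study.

```python
# Sketch: figures of merit for a sweeping-MEKC separation (hypothetical values).

def enrichment_factor(slope_preconc: float, slope_conventional: float) -> float:
    """Ratio of calibration sensitivities with vs. without preconcentration."""
    return slope_preconc / slope_conventional

def theoretical_plates(migration_time_s: float, width_half_height_s: float) -> float:
    """Half-height plate count: N = 5.54 * (t_m / w_1/2)^2."""
    return 5.54 * (migration_time_s / width_half_height_s) ** 2

print(enrichment_factor(slope_preconc=250.0, slope_conventional=2.5))     # -> 100.0
print(round(theoretical_plates(migration_time_s=480.0, width_half_height_s=3.2)))
```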

Relevance: 30.00%

Abstract:

This study aimed to detect and analyse regular patterns of play in the fast attack of football teams by combining sequential analysis with semi-structured interviews with experienced Portuguese first-league coaches. The sample included 36 games (12 games of the respective national leagues per team) of F.C. Barcelona, Inter Milan and Manchester United, which were coded with the observational instrument developed by Sarmento et al. (2010); the data were analysed through sequential analysis with the software SDIS-GSEQ 5.0. Based on the detected patterns, semi-structured interviews were carried out with 8 expert high-performance football coaches, and the data were analysed through content analysis using the software NVivo 10. The detected patterns of play revealed specific characteristics of the teams under study. Combining the results of sequential analysis with the qualitative interviews with professional coaches proved very fruitful in this area of game analysis, allowing scientific knowledge to be reconciled with the practical interpretation of coaches who carry out their work in the field.
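Lag-sequential analysis of the kind performed in SDIS-GSEQ boils down to counting how often one coded behaviour follows another and testing whether each transition occurs more often than expected by chance (adjusted residuals). The sketch below shows that core computation on a made-up lag-1 event sequence; the codes and data are hypothetical, and this is not a substitute for SDIS-GSEQ.

```python
# Sketch: lag-1 sequential analysis (transition counts and adjusted residuals).
import math
from collections import Counter

# Hypothetical coded event sequence (e.g., R = recovery, P = pass, S = shot).
events = list("RPPSRPRPPSRPPPSRPS")

codes = sorted(set(events))
pairs = Counter(zip(events, events[1:]))          # lag-1 transitions
n = len(events) - 1
row = {a: sum(pairs[(a, b)] for b in codes) for a in codes}
col = {b: sum(pairs[(a, b)] for a in codes) for b in codes}

def adjusted_residual(a, b):
    """z-like statistic: how far the observed a->b count is from independence."""
    expected = row[a] * col[b] / n
    variance = expected * (1 - row[a] / n) * (1 - col[b] / n)
    return (pairs[(a, b)] - expected) / math.sqrt(variance) if variance > 0 else 0.0

for a in codes:
    for b in codes:
        print(f"{a}->{b}: observed={pairs[(a, b)]}, z={adjusted_residual(a, b):.2f}")
```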

Relevance: 30.00%

Abstract:

Background: The aim of this study was to evaluate a fast Gradient Spin Echo technique (GraSE) for cardiac T2-mapping, combining a robust estimation of T2 relaxation times with short acquisition times. The sequence was compared against two previously introduced T2-mapping techniques in a phantom and in vivo.

Methods: Phantom experiments were performed at 1.5 T using a commercially available cylindrical gel phantom. Three T2-mapping techniques were compared: a Multi Echo Spin Echo (MESE; serving as the reference), a T2-prepared balanced Steady State Free Precession (T2prep) and a Gradient Spin Echo sequence. For the subsequent in vivo study, 12 healthy volunteers were examined on a clinical 1.5 T scanner. The three T2-mapping sequences were acquired at three short-axis slices. Global myocardial T2 relaxation times were calculated and statistical analysis was performed. To assess pixel-by-pixel homogeneity, the number of segments showing an inhomogeneous T2 value distribution, defined as a pixel SD exceeding 20% of the corresponding observed T2 time, was counted.

Results: Phantom experiments showed a greater difference in measured T2 values between T2prep and MESE than between GraSE and MESE, especially for species with low T1 values. Both GraSE and T2prep overestimated T2 times compared with MESE. In vivo, significant differences between mean T2 times were observed. In general, T2prep yielded the lowest (52.4 ± 2.8 ms) and GraSE the highest (59.3 ± 4.0 ms) T2 estimates. Analysis of pixel-by-pixel homogeneity revealed the fewest segments with an inhomogeneous T2 distribution for GraSE-derived T2 maps.

Conclusions: The GraSE sequence is a fast and robust sequence, combining advantages of the MESE and T2prep techniques, and promises to improve the clinical applicability of T2-mapping in the future. Our study revealed significant differences in derived mean T2 values when applying different sequence designs. Therefore, a systematic comparison of different cardiac T2-mapping sequences and the establishment of dedicated reference values should be the goal of future studies.
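Whichever acquisition is used, a T2 map is ultimately produced by fitting a mono-exponential decay, S(TE) = S0 exp(-TE/T2), to the signal measured at several echo times in each pixel. The sketch below shows this per-pixel fit on synthetic data with assumed echo times; it illustrates the generic fitting step only, not any specific vendor reconstruction from the study.

```python
# Sketch: pixel-wise mono-exponential T2 fitting, S(TE) = S0 * exp(-TE / T2).
import numpy as np
from scipy.optimize import curve_fit

def decay(te_ms, s0, t2_ms):
    return s0 * np.exp(-te_ms / t2_ms)

echo_times = np.array([10.0, 20.0, 30.0, 40.0, 50.0, 60.0])   # ms (assumed)

# Synthetic "measured" signals for one pixel with a true T2 of ~55 ms plus noise.
rng = np.random.default_rng(0)
signal = decay(echo_times, 1000.0, 55.0) + rng.normal(0, 5, echo_times.size)

(fit_s0, fit_t2), _ = curve_fit(decay, echo_times, signal, p0=[signal[0], 50.0])
print(f"Fitted T2 ~ {fit_t2:.1f} ms")

# A full map repeats this fit for every myocardial pixel; a segment whose pixel SD
# exceeds 20% of its mean T2 would be flagged as inhomogeneous, as in the study.
```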

Relevance: 30.00%

Abstract:

Virtual screening (VS) methods can considerably aid clinical research by predicting how ligands interact with drug targets. Most VS methods assume a single binding site on the target, but it has been demonstrated that diverse ligands interact with unrelated parts of the target, and many VS methods do not take this relevant fact into account. This problem is circumvented by a novel VS methodology named BINDSURF, which scans the whole protein surface to find new hotspots where ligands might potentially interact, and which is implemented on last-generation massively parallel GPU hardware, allowing fast processing of large ligand databases. BINDSURF can thus be used in drug discovery, drug design and drug repurposing, and therefore helps considerably in clinical research. However, the accuracy of most VS methods, and of BINDSURF in particular, is constrained by limitations in the scoring function that describes biomolecular interactions, and even today these uncertainties are not completely understood. To improve the accuracy of the scoring functions used in BINDSURF, we propose a novel hybrid approach in which neural network (NNET) and support vector machine (SVM) methods are trained with databases of known active (drug) and inactive compounds, and this information is then exploited to improve BINDSURF VS predictions.
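The rescoring idea can be sketched with scikit-learn: train a classifier on descriptor vectors of known actives and inactives, then use its predicted probability of activity to re-rank screening hits. The descriptors and labels below are synthetic placeholders, and the SVM shown is only one of the two model families mentioned; the paper's actual feature set, network architecture and integration with BINDSURF are not reproduced here.

```python
# Sketch: SVM-based rescoring of virtual-screening hits (synthetic placeholder data).
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Placeholder descriptor vectors (e.g., physicochemical/interaction features)
# for known actives (label 1) and inactives (label 0).
X = rng.normal(size=(400, 16))
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(0, 0.5, 400) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = SVC(kernel="rbf", probability=True).fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))

# Re-rank candidate ligands by the model's probability of activity, which can
# then be blended with (or replace) the docking score.
candidates = rng.normal(size=(5, 16))
activity_prob = model.predict_proba(candidates)[:, 1]
print("re-ranked candidate indices:", np.argsort(-activity_prob))
```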

Relevance: 30.00%

Abstract:

Following a biography of the Austrian educator and psychologist Elsa Köhler (1879-1940), this article describes her pioneering contributions to the foundation of empirical educational research. As a teacher, she was concerned early on with incorporating pupils' developmental status into didactics, in the sense of developing differentiated teaching approaches. At the Psychological Institute of the University of Vienna she learned from Karl Bühler the quantitative and qualitative observation and protocol techniques designed for longitudinal single-case analyses of child and adolescent development, and she was the first to extend these methods to the pedagogical situation in the classroom, to groups of pupils and to the analysis of the development of whole school classes. She contributed substantially to the adoption of empirical research methods in the progressive education approaches of the 1920s and 1930s, and she made the developmental analyses she carried out in pedagogical settings fruitful for developmental counselling aimed at optimizing pupils' self-regulation. Elsa Köhler combined basic research with a strong applied orientation in the classical areas of developmental psychology of childhood and adolescence, as well as in the areas of educational psychology and pedagogy that are today subsumed under educational research. Engaging with her work is of historical significance for the discipline and can also provide impulses for modern, interdisciplinary educational research. (DIPF/Orig.)

Relevance: 30.00%

Abstract:

Purpose: To develop a simple, fast and sensitive spectrophotometric method for the determination of tofisopam in tablet dosage form.

Methods: Tofisopam, as an n-electron donor, was reacted with two π-acceptors, chloranilic acid (ChA) and 7,7,8,8-tetracyanoquinodimethane (TCNQ), to form charge-transfer complexes. The complexes were evaluated spectrophotometrically at 520 and 824 nm for ChA and TCNQ, respectively. The optimum reaction conditions were established, and the developed method was compared with the Japanese Pharmacopoeia method.

Results: The calibration curves were linear in the ranges 25 – 125 and 30 – 150 μg/mL for ChA and TCNQ, respectively. The lower limits of detection were 8.0 and 10.0 μg/mL for ChA and TCNQ, respectively, while the slope and intercept of the calibration curves were 0.0025 and 0.011 for ChA and 0.0115 and -0.237 for TCNQ.

Conclusion: The developed methods for tofisopam have good accuracy and precision and are comparable to a standard pharmacopeial method. The methods can be applied for routine analysis and in quality control.
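The reported calibration parameters define straight lines A = slope·C + intercept, so an unknown concentration is recovered as C = (A - intercept)/slope within the validated range. The sketch below applies this with the slopes and intercepts quoted above (as interpreted per reagent); the absorbance reading is a hypothetical example value.

```python
# Sketch: concentration from a linear calibration, C = (A - intercept) / slope.
CALIBRATIONS = {
    "ChA":  {"slope": 0.0025, "intercept": 0.011,  "range_ug_mL": (25, 125)},
    "TCNQ": {"slope": 0.0115, "intercept": -0.237, "range_ug_mL": (30, 150)},
}

def concentration(absorbance: float, reagent: str) -> float:
    cal = CALIBRATIONS[reagent]
    c = (absorbance - cal["intercept"]) / cal["slope"]
    lo, hi = cal["range_ug_mL"]
    if not lo <= c <= hi:
        raise ValueError(f"{c:.1f} ug/mL is outside the validated range {lo}-{hi}")
    return c

# Hypothetical absorbance reading for a ChA complex:
print(f"{concentration(0.20, 'ChA'):.1f} ug/mL")  # (0.20 - 0.011) / 0.0025 = 75.6
```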

Relevance: 30.00%

Abstract:

A wide range of studies have shown that liposomes can act as suitable adjuvants for a range of vaccine antigens. Properties such as their amphiphilic character and biphasic nature allow them to incorporate antigens within the lipid bilayer, on the surface, or encapsulated within the inner core. However, appropriate methods for the manufacture of liposomes are limited, and this has resulted in issues with cost, supply, and wider-scale application of these systems. Within this chapter we explore manufacturing processes that can be used for the production of liposomal adjuvants, and we outline new manufacturing methods that can offer fast, scalable, and cost-effective production of liposomal adjuvants.