910 results for Direct Analysis Method
Abstract:
The urinary steroid profile consists of anabolic androgenic steroids, including testosterone and its relatives, which are extensively metabolized into phase II sulfated or glucuronidated steroids. Liquid chromatography coupled to mass spectrometry (LC-MS) enables the direct analysis of these conjugated steroids, without hydrolysis of the conjugated moiety, so that they can be used as urinary markers of exogenous steroid administration in doping analysis. In this study, a sensitive and selective ultra-high-pressure liquid chromatography coupled to quadrupole time-of-flight mass spectrometry (UHPLC-QTOF-MS) method was developed to quantify the major urinary metabolites simultaneously after testosterone intake. Urine samples (1 mL) were prepared by solid-phase extraction on Oasis HLB sorbent in a 96-well plate format. The conjugated steroids were analyzed by UHPLC-QTOF-MS(E) with a single gradient elution of 36 min (including re-equilibration time) in negative electrospray ionization mode. MS(E) analysis involved parallel alternating acquisition of low- and high-collision-energy functions. The method was validated and applied to samples collected in a clinical study of healthy human volunteers who had taken testosterone, compared with samples from a placebo group. Quantitative results were also compared to GC-MS and LC-MS/MS measurements, and the data correlated well. The acquisition of full mass spectra over the entire mass range with QTOF mass analyzers promises to extend the steroid profile to a larger number of conjugated steroids.
Abstract:
We present an open-source ITK implementation of a direct Fourier method for tomographic reconstruction, applicable to parallel-beam x-ray images. Direct Fourier reconstruction makes use of the central-slice theorem to build a polar 2D Fourier space from the 1D transformed projections of the scanned object, which is then resampled onto a Cartesian grid. An inverse 2D Fourier transform finally yields the reconstructed image. Additionally, we provide a complex wrapper to the BSplineInterpolateImageFunction to overcome ITK's current lack of image interpolators dealing with complex data types. A sample application is presented and extensively illustrated on the Shepp-Logan head phantom. We show that appropriate input zero-padding and 2D-DFT oversampling rates, together with radial cubic b-spline interpolation, improve 2D-DFT interpolation quality and are efficient remedies to reduce reconstruction artifacts.
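For readers who want to experiment with the idea, a minimal NumPy/SciPy sketch of the same central-slice pipeline follows. It is illustrative only, not the ITK implementation described above: the function name, the oversampling factor, and the use of scipy.interpolate.griddata in place of radial cubic b-spline interpolation are all assumptions.

```python
import numpy as np
from scipy.interpolate import griddata

def direct_fourier_reconstruction(sinogram, thetas, oversample=2):
    """Parallel-beam reconstruction via the central-slice theorem (sketch)."""
    n_det = sinogram.shape[1]
    n_fft = oversample * n_det                  # zero-padding / oversampling
    # 1D FFT of each projection gives one radial slice of the 2D spectrum
    slices = np.fft.fftshift(
        np.fft.fft(np.fft.ifftshift(sinogram, axes=1), n=n_fft, axis=1), axes=1)
    freqs = np.fft.fftshift(np.fft.fftfreq(n_fft))
    # Polar coordinates of every Fourier-space sample
    kx = np.outer(np.cos(thetas), freqs).ravel()
    ky = np.outer(np.sin(thetas), freqs).ravel()
    gx, gy = np.meshgrid(freqs, freqs)
    # Resample the polar samples onto a Cartesian grid; real and imaginary
    # parts are interpolated separately, mirroring the complex-data gap the
    # paper's complex interpolator wrapper addresses in ITK
    re = griddata((kx, ky), slices.real.ravel(), (gx, gy),
                  method='cubic', fill_value=0.0)
    im = griddata((kx, ky), slices.imag.ravel(), (gx, gy),
                  method='cubic', fill_value=0.0)
    # Inverse 2D FFT yields the image; crop away the padding
    img = np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(re + 1j * im))).real
    c = (n_fft - n_det) // 2
    return img[c:c + n_det, c:c + n_det]

# Example use with the Shepp-Logan phantom (requires scikit-image):
#   from skimage.data import shepp_logan_phantom
#   from skimage.transform import radon, resize
#   img = resize(shepp_logan_phantom(), (128, 128))
#   angles = np.linspace(0., 180., 180, endpoint=False)
#   sino = radon(img, theta=angles).T           # rows = projections
#   rec = direct_fourier_reconstruction(sino, np.deg2rad(angles))
```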
Abstract:
Achieving a high degree of dependability in complex macro-systems is challenging. Because of the large number of components and the numerous independent teams involved, an overview of global system performance is usually lacking, to the detriment of both design and operation. A functional failure mode, effects and criticality analysis (FMECA) approach is proposed to address the dependability optimisation of large and complex systems. The basic inductive FMECA model has been enriched to include considerations such as operational procedures, alarm systems, environmental and human factors, and operation in degraded mode. Its implementation in a commercial software tool allows active linking between the functional layers of the system and facilitates data processing and retrieval, contributing actively to system optimisation. The proposed methodology has been applied to optimise dependability in a railway signalling system. Signalling systems are a typical example of large, complex systems made of multiple hierarchical layers. The proposed approach proves appropriate for assessing the global risk and availability level of the system as well as for identifying its vulnerabilities. This enriched-FMECA approach overcomes some of the limitations and pitfalls previously reported for classical FMECA approaches.
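The criticality ranking at the core of any FMECA can be illustrated with a short sketch. The railway-flavoured failure modes, the ratings, and the classical severity × occurrence × detection Risk Priority Number below are generic textbook ingredients, invented for illustration; they are not the enriched model or the commercial tool described above.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    function: str      # functional layer the mode belongs to
    mode: str
    severity: int      # 1 (negligible) .. 10 (catastrophic)
    occurrence: int    # 1 (rare) .. 10 (frequent)
    detection: int     # 1 (always detected) .. 10 (undetectable)

    @property
    def rpn(self) -> int:
        # Risk Priority Number, the classical FMECA criticality index
        return self.severity * self.occurrence * self.detection

modes = [
    FailureMode("track-circuit occupancy", "false clear indication", 9, 3, 4),
    FailureMode("signal lamp", "lamp failure", 6, 5, 2),
    FailureMode("interlocking logic", "latent software fault", 8, 2, 7),
]

# Rank failure modes to expose the main vulnerabilities first
for fm in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{fm.function:28s} {fm.mode:24s} RPN={fm.rpn}")
```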
Abstract:
Soil infiltration is a key link in the natural water cycle. Studies on soil permeability support water resources assessment and estimation, runoff regulation and management, soil erosion modeling, and nonpoint and point source pollution of farmland, among other applications. The unequal influence of rainfall duration, rainfall intensity, antecedent soil moisture, vegetation cover, vegetation type, and slope gradient on cumulative soil infiltration was studied under simulated rainfall on different underlying surfaces. We established a six-factor model of cumulative soil infiltration using an improved back-propagation (BP) artificial neural network algorithm with a momentum term and a self-adjusting learning rate. Compared to the multiple nonlinear regression method, the improved BP algorithm showed better stability and accuracy. Based on the improved BP model, the sensitivity index of each of the six factors on cumulative soil infiltration was investigated. The grey relational analysis method was then used to study the grey correlations between each of these six factors and cumulative soil infiltration. The results of the two methods were very similar. Rainfall duration was the most influential factor, followed by vegetation cover, vegetation type, rainfall intensity, and antecedent soil moisture; the effect of slope gradient on cumulative soil infiltration was not significant.
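A minimal sketch of the two BP ingredients named above, a momentum term plus a self-adjusting learning rate, is given below. The toy data, network size, and the particular rate-adaptation heuristic (grow while the error falls, shrink when it rises) are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy stand-in for the six predictors (duration, intensity, moisture,
# cover, vegetation type, slope) and the cumulative infiltration target
X = rng.random((200, 6))
y = X @ rng.random(6) + 0.1 * rng.random(200)

n_hidden = 8
W1 = rng.normal(0, 0.5, (6, n_hidden))
W2 = rng.normal(0, 0.5, n_hidden)
dW1, dW2 = np.zeros_like(W1), np.zeros_like(W2)
lr, momentum, prev_err = 0.05, 0.9, np.inf

for epoch in range(500):
    h = np.tanh(X @ W1)                  # forward pass
    pred = h @ W2
    err = np.mean((pred - y) ** 2)
    # Self-adjusting learning rate: one common heuristic
    lr *= 1.05 if err < prev_err else 0.7
    prev_err = err
    g = 2 * (pred - y) / len(y)          # backpropagated output error
    gW2 = h.T @ g
    gW1 = X.T @ (np.outer(g, W2) * (1 - h ** 2))
    dW2 = momentum * dW2 - lr * gW2      # momentum term smooths the updates
    dW1 = momentum * dW1 - lr * gW1
    W2 += dW2
    W1 += dW1
```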
Abstract:
The physical disector is a method of choice for estimating unbiased neuron numbers; nevertheless, calibration is needed to evaluate each counting method. The validity of the method can be assessed by comparing the estimated cell number with the true number determined by direct counting in serial sections. We reconstructed one fifth of rat lumbar dorsal root ganglia taken from two experimental conditions. From each ganglion, images of 200 adjacent semi-thin sections were used to reconstruct a volumetric dataset (stack of voxels). On these stacks the number of sensory neurons was estimated by the physical disector method and counted directly. In addition, using the nuclear coordinates from the direct counting, we simulated, with a Matlab program, disector pairs separated by increasing distances in a ganglion model. The comparison between these approaches clearly demonstrates that the physical disector method provides a valid and reliable estimate of the number of sensory neurons only when the distance between consecutive disector pairs is 60 µm or smaller. Under these conditions the error between the physical disector estimate and the direct count does not exceed 6%. In contrast, when the distance between two pairs is larger than 60 µm (70-200 µm), the error increases rapidly, up to 27%. We conclude that the physical disector method provides a reliable estimate of the number of rat sensory neurons only when the separating distance between consecutive disector pairs is no larger than 60 µm.
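The simulation idea translates into a short sketch along these lines, using a toy uniform model of nucleus positions rather than the Matlab program or the real nuclear coordinates used in the study; the counting rule (count a nucleus when its leading edge first appears in the reference section) and all dimensions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
depth, n_true = 400.0, 2000              # model ganglion depth in um
tops = rng.uniform(0, depth, n_true)     # z of each nucleus "top"

def disector_estimate(tops, depth, spacing, section=4.0):
    """Estimate particle number with physical disector pairs placed
    every `spacing` um: a nucleus is counted when its top falls inside
    the reference section but not the look-up section (sketch)."""
    starts = np.arange(0.0, depth - section, spacing)
    q_minus = sum(((tops >= z0) & (tops < z0 + section)).sum()
                  for z0 in starts)
    sampled_fraction = len(starts) * section / depth
    return q_minus / sampled_fraction

for spacing in (20, 60, 120, 200):
    est = disector_estimate(tops, depth, spacing)
    print(f"pair spacing {spacing:>3} um: "
          f"error {100 * abs(est - n_true) / n_true:.1f}%")
```

With sparser disector pairs, fewer sections are sampled and the estimate's variance grows, which is the mechanism behind the error inflation the study quantifies on real data.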
Abstract:
BACKGROUND: Studies on hexaminolevulinate (HAL) cystoscopy report improved detection of bladder tumours. However, recent meta-analyses report conflicting effects on recurrence. OBJECTIVE: To assess available clinical data for blue light (BL) HAL cystoscopy on the detection of Ta/T1 and carcinoma in situ (CIS) tumours, and on tumour recurrence. DESIGN, SETTING, AND PARTICIPANTS: This meta-analysis reviewed raw data from prospective studies on 1345 patients with known or suspected non-muscle-invasive bladder cancer (NMIBC). INTERVENTION: A single application of HAL cystoscopy was used as an adjunct to white light (WL) cystoscopy. OUTCOME MEASUREMENTS AND STATISTICAL ANALYSIS: We studied the detection of NMIBC (intention to treat [ITT]: n=831; six studies) and recurrence (per protocol: n=634; three studies) up to 1 yr. DerSimonian and Laird's random-effects model was used to obtain pooled relative risks (RRs) and associated 95% confidence intervals (CIs) for detection outcomes. RESULTS AND LIMITATIONS: BL cystoscopy detected significantly more Ta tumours (14.7%; p<0.001; odds ratio [OR]: 4.898; 95% CI, 1.937-12.390) and CIS lesions (40.8%; p<0.001; OR: 12.372; 95% CI, 6.343-24.133) than WL. In 24.9% of patients at least one additional Ta/T1 tumour was seen with BL (p<0.001), a benefit also significant in patients with primary (20.7%; p<0.001) and recurrent cancer (27.7%; p<0.001), and in patients at high risk (27.0%; p<0.001) and intermediate risk (35.7%; p=0.004). In 26.7% of patients, CIS was detected only by BL (p<0.001), a finding also significant in patients with primary (28.0%; p<0.001) and recurrent cancer (25.0%; p<0.001). Recurrence rates up to 12 mo were significantly lower overall with BL, 34.5% versus 45.4% (p=0.006; RR: 0.761 [0.627-0.924]), and lower in patients with T1 or CIS (p=0.052; RR: 0.696 [0.482-1.003]), Ta (p=0.040; RR: 0.804 [0.653-0.991]), and in high-risk (p=0.050) and low-risk (p=0.029) subgroups. Some subgroups had too few patients to allow statistically meaningful analysis. Heterogeneity was minimised by the statistical analysis method used. CONCLUSIONS: This meta-analysis confirms that HAL BL cystoscopy significantly improves the detection of bladder tumours, leading to a reduction of recurrence at 9-12 mo. The benefit is independent of the level of risk and is evident in patients with Ta, T1, CIS, primary, and recurrent cancer.
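For reference, the DerSimonian-Laird random-effects pooling used here can be sketched as follows; the three study inputs are invented for illustration and are not the meta-analysis data.

```python
import numpy as np

def dersimonian_laird(log_rr, var):
    """Pool study-level log relative risks with the DerSimonian-Laird
    random-effects model (sketch with illustrative inputs)."""
    w = 1.0 / var                                   # fixed-effect weights
    fixed = np.sum(w * log_rr) / np.sum(w)
    q = np.sum(w * (log_rr - fixed) ** 2)           # Cochran's Q
    df = len(log_rr) - 1
    # Method-of-moments between-study variance, truncated at zero
    tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
    w_star = 1.0 / (var + tau2)                     # random-effects weights
    mu = np.sum(w_star * log_rr) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return np.exp(mu), np.exp(mu - 1.96 * se), np.exp(mu + 1.96 * se)

# Hypothetical three-study recurrence data (log RR and its variance)
rr, lo, hi = dersimonian_laird(np.array([-0.35, -0.18, -0.28]),
                               np.array([0.02, 0.03, 0.015]))
print(f"pooled RR {rr:.3f} (95% CI {lo:.3f}-{hi:.3f})")
```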
Abstract:
The aim of this thesis is to identify the typical groups of software companies operating in the Southeast Finland region and to describe their activities. By identifying the characteristic features of the region's software companies, the work also provides a basis for finding future development targets and for directing development measures at several companies of the same type. As background, the software industry and software business models are introduced, and a framework for the empirical examination of the region's software companies is built on them. The empirical part examines how the business models presented in the theoretical part are realized in Southeast Finland and groups the region's software companies according to the most discriminating factors. The study is a quantitative census of the software companies of Southeast Finland, descriptive in its research approach. The research material was based on structured interviews conducted by the research group, covering the responsible managers of 58 software companies in total. Based on the results, four basic types of software business with distinct modes of operation could be identified in the region: customer-oriented operators (26 offices), tailors (14 offices), integrators (10 offices), and productizers (8 offices). The results show that descriptions of traditional software business models, and the generalizations drawn from them, provide a good starting point for examining software companies. However, the perspective offered by traditional software business models is too limited for a deeper examination of the business logic of software companies.
Abstract:
Introduction. Genetic epidemiology focuses on the study of the genetic causes that determine health and disease in populations. To achieve this goal, a common strategy is to explore differences in genetic variability between diseased and non-diseased individuals. The usual markers of genetic variability are single nucleotide polymorphisms (SNPs), changes in a single base of the genome. The usual statistical approach in genetic epidemiology is a marginal analysis, where each SNP is analyzed separately for association with the phenotype. Motivation. It has been observed that, for common diseases, single-SNP analysis is not very powerful for detecting causal genetic variants. In this work, we consider Gene Set Analysis (GSA) as an alternative to standard marginal association approaches. GSA assesses the overall association of a set of genetic variants with a phenotype and has the potential to detect subtle effects of variants in a gene or a pathway that might be missed when assessed individually. Objective. We present a new optimized implementation of a pair of gene set analysis methodologies for analyzing the individual evidence of SNPs in biological pathways. We perform a simulation study exploring the power of the proposed methodologies in a set of scenarios with different numbers of causal SNPs and different effect sizes. In addition, we compare the results with the usual single-SNP analysis method. Finally, we show the advantage of using the proposed gene set approaches in the context of an Alzheimer's disease case-control study in which we explore the Reelin signalling pathway.
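One simple, self-contained flavour of gene set analysis, Fisher's combination of per-SNP p-values, is sketched below purely to illustrate how a set-level test can pick up subtle aggregate effects that single-SNP correction misses. It is not necessarily either of the two methodologies implemented in the work, and it assumes independent p-values, which linkage disequilibrium violates in practice.

```python
import numpy as np
from scipy import stats

def fisher_gsa(pvals):
    """Combine per-SNP association p-values for one gene set with
    Fisher's method (illustrative GSA variant)."""
    stat = -2.0 * np.sum(np.log(pvals))
    return stats.chi2.sf(stat, df=2 * len(pvals))

rng = np.random.default_rng(7)
null_snps = rng.uniform(size=40)           # SNPs with no effect
weak_snps = rng.beta(0.6, 1.0, size=10)    # subtle excess of small p-values
pathway = np.concatenate([null_snps, weak_snps])

print("set-level p:", fisher_gsa(pathway))
print("best single SNP after Bonferroni:",
      min(pathway.min() * len(pathway), 1.0))
```

Typically the set-level p-value is small while the Bonferroni-corrected best single SNP is not, which is the motivation stated above.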
Abstract:
A rapid, economical, reproducible, and simple direct spectrophotometric method was developed and validated for the assay of nitazoxanide in pharmaceutical formulations. Nitazoxanide concentration was estimated in water at 345 nm and pH 4.5. The method was validated for specificity, linearity, precision, and accuracy. There was no interference from the excipients in the determination of the active pharmaceutical ingredient. The proposed method was successfully applied to the determination of nitazoxanide in coated tablets and in powders for oral suspension. The method was compared to a previously developed and validated liquid chromatography method for the same drug, and no significant difference was found between the two methods for nitazoxanide quantitation.
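Such a direct assay reduces to a linear (Beer-Lambert) calibration at the analytical wavelength; a minimal sketch with invented calibration standards (the concentrations and absorbances below are not the study's data) might read:

```python
import numpy as np

# Hypothetical calibration standards for nitazoxanide in water at 345 nm
conc = np.array([2.0, 4.0, 8.0, 12.0, 16.0])       # ug/mL
absorbance = np.array([0.102, 0.201, 0.405, 0.598, 0.803])

# Beer-Lambert behaviour: absorbance is linear in concentration
slope, intercept = np.polyfit(conc, absorbance, 1)

def assay(a_sample, dilution=1.0):
    """Read a sample concentration off the calibration line."""
    return dilution * (a_sample - intercept) / slope

print(f"tablet extract: {assay(0.350, dilution=10):.1f} ug/mL")
```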
Abstract:
In general, laboratory activities are costly in terms of time, space, and money. As such, the ability to provide realistically simulated laboratory data that enables students to practice data analysis techniques as a complementary activity would be expected to reduce these costs while opening up very interesting possibilities. In the present work, a novel methodology is presented for the design of analytical chemistry instrumental analysis exercises that can be automatically personalized for each student and evaluated immediately. The proposed system provides each student with a different set of experimental data, generated randomly while satisfying a set of constraints, rather than using data obtained from actual laboratory work. This allows the instructor to provide students with a set of practical problems to complement their regular laboratory work, along with the corresponding feedback from the system's automatic evaluation process. To this end, the Goodle Grading Management System (GMS), an innovative web-based educational tool for automating the collection and assessment of practical exercises in engineering and scientific courses, was developed. The proposed methodology takes full advantage of the Goodle GMS fusion code architecture. The design of a particular exercise is provided ad hoc by the instructor and requires basic Matlab knowledge. The system has been employed with satisfactory results in several university courses. To demonstrate the automatic evaluation process, three exercises are presented in detail. The first involves linear regression analysis of the data and calculation of the quality parameters of an instrumental analysis method. The second and third address two different comparison tests: a comparison test of the mean and a paired t-test.
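As an illustration of the first exercise type, a linear calibration with its quality parameters could be computed as follows. The data are simulated here, much as the system hands each student a randomly generated set, and the 3.3·s/m and 10·s/m conventions for the detection and quantification limits are one common choice, not necessarily the one used in Goodle GMS.

```python
import numpy as np
from scipy import stats

# Simulated calibration data for one student
rng = np.random.default_rng(42)
x = np.linspace(1, 10, 8)                         # standard concentrations
y = 0.25 * x + 0.02 + rng.normal(0, 0.01, x.size) # measured responses

res = stats.linregress(x, y)
residuals = y - (res.intercept + res.slope * x)
s_yx = np.sqrt(np.sum(residuals ** 2) / (x.size - 2))  # std error of fit

# Common figures of merit for an instrumental analysis method
lod = 3.3 * s_yx / res.slope                      # limit of detection
loq = 10.0 * s_yx / res.slope                     # limit of quantification
print(f"sensitivity (slope) = {res.slope:.4f}")
print(f"r = {res.rvalue:.4f}, LOD = {lod:.3f}, LOQ = {loq:.3f}")
```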
Abstract:
Bioprocess technology is a multidisciplinary industry that combines knowledge of biology and chemistry with process engineering. It is a growing industry because its applications play an important role in the food, pharmaceutical, diagnostics and chemical industries. In addition, the current pressure to decrease our dependence on fossil fuels motivates new, innovative research into the replacement of petrochemical products. Bioprocesses are processes that utilize cells and/or their components in the production of desired products. Bioprocesses are already used to produce fuels and chemicals, especially ethanol and building-block chemicals such as carboxylic acids. To enable more efficient, sustainable and economically feasible bioprocesses, the raw materials must be cheap and the bioprocesses must be operated at optimal conditions. It is essential to measure the parameters that provide information about the process conditions and the main critical process parameters, including cell density, substrate concentrations and products. In addition to offline analysis methods, online monitoring tools are becoming increasingly important in the optimization of bioprocesses. Capillary electrophoresis (CE) is a versatile analysis technique with no limitations concerning polar solvents, analytes or samples. Its resolution and efficiency are high in optimized methods, giving it great potential for rapid detection and quantification. This work demonstrates the potential and possibilities of CE as a versatile bioprocess monitoring tool. As part of this study, a commercial CE device was modified for use as an online analysis tool for automated monitoring. The work describes three offline CE analysis methods for the determination of the carboxylic, phenolic and amino acids present in bioprocesses, and an online CE analysis method for monitoring carboxylic acid production during bioprocesses. The detection methods were indirect and direct UV, and laser-induced fluorescence. The results of this work can be used for the optimization of bioprocess conditions, for the development of more robust and tolerant microorganisms, and to study the dynamics of bioprocesses.
Abstract:
In Brazil, scientific research is carried out mainly at universities, where professors coordinate research projects with the active participation of undergraduate and graduate students. However, there is no formal program for teaching and learning the scientific method. The objective of the present study was to evaluate the comprehension of the scientific method by students of the health sciences who participate in scientific projects in an academic research laboratory. An observational, descriptive, cross-sectional study was conducted using Edgar Morin's complexity as the theoretical reference. In a semi-structured interview, students were asked to solve an abstract logical puzzle, the TanGram. The collected data were analyzed using the hermeneutic-dialectic analysis method proposed by Minayo and discussed in terms of the theoretical reference of complexity. The students' concept of the scientific method is limited to participation in projects, stressing the execution of practical procedures as opposed to scientific thinking. Solving the TanGram puzzle revealed that the students had difficulties in understanding questions and activities focused on subjects and their processes. Objective answers, even when dealing with personal issues, were also reflected in the students' opinions about the characteristics of a successful researcher. The students' difficulties concerning these issues may affect their scientific performance and result in poorly designed experiments. This is a preliminary study that should be extended to other centers of scientific research.
Abstract:
The aims of this study were to use isotope analysis to quantify carbon from the C3 photosynthetic cycle in commercial apple nectars and to determine the legal limit used to identify beverages that do not conform to the safety standards established by the Brazilian Ministry of Agriculture, Livestock and Food Supply. Reference beverages (apple nectars) were produced in the laboratory according to Brazilian legislation. Adulterated nectars were also produced, with an amount of pulp juice below the permitted threshold value. The δ13C values of the apple nectars and their fractions (pulp and purified sugar) were measured to quantify the percentage of carbon from C3 sources. To demonstrate the existence of adulteration, the values found were compared to the limit values established by Brazilian law. All the commercial apple nectars analyzed were within the legal limits, making it possible to identify the nectars that conformed with Brazilian law. The isotopic methodology developed proved efficient in quantifying the carbon of C3 origin in commercial apple nectars.
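The underlying calculation is a two-source isotope mass balance between a C3 end member (apple) and a C4 end member (e.g. cane sugar). The sketch below uses typical literature end-member δ13C values, not those measured in the study.

```python
def c3_fraction(delta_sample, delta_c3=-27.0, delta_c4=-12.0):
    """Two-source isotope mass balance: fraction of carbon from the C3
    source, given the sample's delta13C. End-member values are typical
    literature figures (assumptions), not the study's measurements."""
    return (delta_sample - delta_c4) / (delta_c3 - delta_c4)

# Dilution with C4 cane sugar shifts a nectar's delta13C toward -12 permil
print(f"C3 carbon: {100 * c3_fraction(-21.5):.1f}%")
```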
Abstract:
The growing interest in the use of dietary fiber in food has created the need for precise tools to describe its physical properties. This research examined two dietary fibers, from oats and beets respectively, at varying particle sizes. The application of automated static image analysis for describing the hydration properties and particle size distribution of dietary fiber was analyzed. Conventional tests for water holding capacity (WHC) were conducted. The particles were measured at two points: dry and after water soaking. The highest water holding capacity (7.00 g water/g solid) was achieved by the smaller-sized oat fiber; conversely, for beet fiber the water holding capacity was highest (4.20 g water/g solid) at the larger particle size. There was evidence of water absorption increasing with decreasing particle size for the same fiber source. Very strong correlations were found between particle shape parameters, such as fiber length, straightness and width, and the hydration properties measured conventionally. The regression analysis provided the opportunity to assess whether the automated static image analysis method can serve as an efficient tool for describing the hydration properties of dietary fiber. The application of the method was validated using a mathematical model that was verified against conventional WHC measurement results.
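The regression step can be sketched as an ordinary least-squares fit of WHC on the shape descriptors obtained from image analysis. All data below are simulated, and the coefficients carry no relation to the study's measurements.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 60
# Hypothetical per-sample shape descriptors from static image analysis
length = rng.uniform(50, 400, n)          # um
width = rng.uniform(10, 60, n)            # um
straightness = rng.uniform(0.5, 1.0, n)   # dimensionless
whc = (8.0 - 0.01 * length + 0.02 * width - 1.5 * straightness
       + rng.normal(0, 0.2, n))           # g water / g solid

# Ordinary least squares: WHC ~ 1 + length + width + straightness
X = np.column_stack([np.ones(n), length, width, straightness])
coef, *_ = np.linalg.lstsq(X, whc, rcond=None)
pred = X @ coef
r2 = 1 - np.sum((whc - pred) ** 2) / np.sum((whc - whc.mean()) ** 2)
print("coefficients:", np.round(coef, 4), " R2 =", round(r2, 3))
```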
Abstract:
The recent rapid development of biotechnological approaches has enabled the production of large, whole-genome-level biological data sets. In order to handle these data sets, reliable and efficient automated tools and methods for data processing and result interpretation are required. Bioinformatics, as the field of studying and processing biological data, tries to answer this need by combining methods and approaches across computer science, statistics, mathematics and engineering. The need is also increasing for tools that can be used by biological researchers themselves, who may not have a strong statistical or computational background; this requires creating tools and pipelines with intuitive user interfaces, robust analysis workflows and a strong emphasis on result reporting and visualization. Within this thesis, several data analysis tools and methods have been developed for analyzing high-throughput biological data sets. These approaches, covering several aspects of high-throughput data analysis, are specifically aimed at gene expression and genotyping data, although in principle they are suitable for analyzing other data types as well. Coherent handling of the data across the various data analysis steps is highly important in order to ensure robust and reliable results. Thus, robust data analysis workflows are also described, putting the developed tools and methods into a wider context. The choice of the correct analysis method may also depend on the properties of the specific data set, and therefore guidelines for choosing an optimal method are given. The data analysis tools, methods and workflows developed within this thesis have been applied to several research studies, of which two representative examples are included in the thesis. The first study focuses on spermatogenesis in murine testis and the second examines cell lineage specification in mouse embryonic stem cells.