967 results for bioanalytical method validation
Abstract:
Subunit vaccine discovery is an accepted clinical priority. The empirical approach is time- and labor-intensive and can often end in failure. Rational, information-driven approaches can overcome these limitations in a fast and efficient manner; however, informatics solutions require reliable algorithms for antigen identification. All known algorithms use sequence similarity to identify antigens, yet antigenicity may be encoded subtly in a sequence and may not be directly identifiable by sequence alignment. We propose a new alignment-independent method for antigen recognition based on the principal chemical properties of protein amino acid sequences. The method is tested by cross-validation on a training set of bacterial antigens and by external validation on a test set of known antigens. The prediction accuracy is 83% for the cross-validation and 80% for the external test set. Our approach is accurate and robust, and provides a potent tool for the in silico discovery of medically relevant subunit vaccines.
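A minimal sketch of the core idea behind alignment-independent encoding: map each residue to a numeric physicochemical property and collapse the variable-length series into a fixed-length vector with an auto-covariance transform, so sequences of different lengths can be compared without alignment. The property scale (a published z1-style hydrophobicity scale, used here purely for illustration) and the lag length are assumptions, not the paper's exact descriptor set.

```python
# Sketch: alignment-independent encoding of a protein sequence.
import numpy as np

Z1 = {  # illustrative single-property scale (one value per amino acid)
    'A': 0.07, 'R': 2.88, 'N': 3.22, 'D': 3.64, 'C': 0.71,
    'Q': 2.18, 'E': 3.08, 'G': 2.23, 'H': 2.41, 'I': -4.44,
    'L': -4.19, 'K': 2.84, 'M': -2.49, 'F': -4.92, 'P': -1.22,
    'S': 1.96, 'T': 0.92, 'W': -4.75, 'Y': -1.39, 'V': -2.69,
}

def acc_encode(seq: str, max_lag: int = 5) -> np.ndarray:
    """Auto-covariance transform: turns a variable-length sequence
    into a fixed-length vector, so no alignment is required."""
    x = np.array([Z1[a] for a in seq if a in Z1])
    x = x - x.mean()                       # centre the property signal
    n = len(x)
    return np.array([np.dot(x[:n - lag], x[lag:]) / (n - lag)
                     for lag in range(1, max_lag + 1)])

print(acc_encode("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"))
```

The resulting fixed-length vectors can then be fed to any standard classifier trained on known antigens and non-antigens.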
Abstract:
Report published in the Proceedings of the National Conference on "Education and Research in the Information Society", Plovdiv, May 2014.
Abstract:
A new mesoscale simulation model for solids dissolution, based on a computationally efficient and versatile digital modelling approach (DigiDiss), is considered and validated against analytical solutions and published experimental data for simple geometries. As the digital model is specifically designed to handle irregular shapes and complex multi-component structures, use of the model is explored for single crystals (sugars) and clusters. The single crystals and the cluster were first scanned using X-ray microtomography to obtain a digital version of their structures. The digitised particles and clusters were used as structural input to the digital simulation. The same particles were then dissolved in water, and the dissolution process was recorded by a video camera and analysed to yield the overall dissolution times and images of particle size and shape during dissolution. The results demonstrate the ability of the simulation method to reproduce experimental behaviour based on the known chemical and diffusion properties of the constituent phases. The paper discusses how further refinements to the modelling approach will need to include other important effects, such as complex disintegration behaviour (particle ejection) and uncertainties in chemical properties. The nature of the digital modelling approach is well suited to future implementation with high-speed computation using hybrid conventional (CPU) and graphics processor (GPU) systems.
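To illustrate the flavour of a voxel-based dissolution model, here is a deliberately simple toy (not the DigiDiss code): solid voxels lose mass at a rate proportional to their exposed surface, counted as face-neighbouring liquid voxels. The grid size, rate constant, and cubic "particle" are all invented.

```python
# Toy voxel dissolution: mass loss proportional to exposed surface area.
import numpy as np

def dissolve(solid: np.ndarray, rate: float = 0.1, steps: int = 50):
    """solid: 3-D array of voxel mass fractions (1 = solid, 0 = liquid)."""
    for _ in range(steps):
        liquid = 1.0 - np.clip(solid, 0.0, 1.0)
        exposed = sum(np.roll(liquid, shift, axis)   # count liquid
                      for axis in range(3)           # neighbours on
                      for shift in (-1, 1))          # all six faces
        solid = np.clip(solid - rate * (solid > 0) * exposed / 6.0,
                        0.0, None)
    return solid

cube = np.zeros((20, 20, 20))
cube[5:15, 5:15, 5:15] = 1.0        # stand-in for a digitised particle
print(f"mass remaining: {dissolve(cube).sum() / 1000:.2%}")
```

A real model of this kind would couple the surface flux to local solute concentration and diffusion rather than a fixed rate, but the data flow (scanned voxel structure in, dissolution history out) is the same.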
Abstract:
There is an increasing demand for DNA analysis because of the sensitivity of the method and the ability to uniquely identify and distinguish individuals with a high degree of certainty. This demand, however, has led to huge backlogs in evidence lockers, since current DNA extraction protocols require long processing times. The DNA analysis procedure becomes more complicated when analyzing sexual assault casework samples, where the evidence contains more than one contributor. Additional processing to separate different cell types in order to simplify the final data interpretation further contributes to the existing cumbersome protocols. The goal of the present project is to develop a rapid and efficient extraction method that permits selective digestion of mixtures. Selective recovery of male DNA was achieved with as little as 15 minutes of lysis time upon exposure to high pressure under alkaline conditions. Pressure cycling technology (PCT) is carried out in a barocycler that has a small footprint and is semi-automated. Whereas typically less than 10% of male DNA is recovered using the standard extraction protocol for rape kits, almost seven times more male DNA was recovered from swabs using this novel method. Various parameters, including instrument settings and buffer composition, were optimized to achieve selective recovery of sperm DNA. Developmental validation studies were also performed to determine the efficiency of this method in processing samples exposed to various conditions that can affect the quality of the extraction and the final DNA profile. An easy-to-use interface, minimal manual intervention, and the ability to achieve high yields with simple reagents in a relatively short time make this an ideal method for potential application in analyzing sexual assault samples.
Abstract:
Produced water is a by-product of offshore oil and gas production, and is released in large volumes when platforms are actively processing crude oil. Some pollutants are not typically removed by conventional oil/water separation methods and are discharged with produced water. Oil and grease can be found dispersed in produced water in the form of tiny droplets, and polycyclic aromatic hydrocarbons (PAHs) are commonly found dissolved in produced water. Both can have acute and chronic toxic effects in marine environments, even at low exposure levels. The analysis of the dissolved and dispersed phases is a priority, but effort is required to meet the necessary detection limits. There are several methods for the analysis of produced water for dispersed oil and dissolved PAHs, all of which have advantages and disadvantages. In this work, EPA Method 1664 and APHA Method 5520 C for the determination of oil and grease are examined and compared. For the detection of PAHs, EPA Method 525 and PAH MIPs are compared and the results evaluated. APHA Method 5520 C (Partition-Infrared Method) is a liquid-liquid extraction procedure with IR determination of oil and grease. For analysis of spiked samples of artificial seawater, extraction efficiency ranged from 85–97%. Linearity was achieved in the range of 5–500 mg/L. This is a single-wavelength method and is unsuitable for quantification of aromatics and other compounds that lack sp³-hybridized carbon atoms. EPA Method 1664 is the liquid-liquid extraction of oil and grease from water samples followed by gravimetric determination. When distilled water spiked with reference oil was extracted by this procedure, extraction efficiency ranged from 28.4–86.2%, and %RSD ranged from 7.68–38.0%. EPA Method 525 uses solid-phase extraction with analysis by GC-MS, and was performed on distilled water and water from St. John's Harbour, all spiked with naphthalene, fluorene, phenanthrene, and pyrene. The limits of detection in harbour water were 0.144, 3.82, 0.119, and 0.153 µg/L, respectively. Linearity was obtained in the range of 0.5–10 µg/L, and %RSD ranged from 0.36% (fluorene) to 46% (pyrene). Molecularly imprinted polymers (MIPs) are sorbent materials made selective by polymerizing functional monomers and crosslinkers in the presence of a template molecule, usually the analytes of interest or related compounds. They can adsorb and concentrate PAHs from aqueous environments and can be combined with methods of analysis including GC-MS, LC-UV-Vis, and desorption electrospray ionization (DESI)-MS. This work examines MIP-based methods as well as the previously mentioned methods currently used by the oil and gas industry and government environmental agencies. MIPs are shown to give results consistent with the other methods, and are a low-cost alternative that improves ease of use, throughput, and sensitivity. PAH MIPs were used to determine naphthalene spiked into ASTM artificial seawater, as well as produced water from an offshore oil and gas operation. Linearity was achieved in the range studied (0.5–5 mg/L) for both matrices, with R² = 0.936 for seawater and R² = 0.819 for produced water. The %RSD for seawater ranged from 6.58–50.5% and for produced water from 8.19–79.6%.
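For reference, a short sketch of how the two figures of merit quoted above are conventionally computed: LOD from the common 3.3·σ/slope rule on a calibration curve, and %RSD from replicate measurements. All concentrations and signals below are invented placeholders, not the study's data.

```python
# Sketch: LOD and %RSD from calibration and replicate data.
import numpy as np

conc = np.array([0.5, 1, 2, 5, 10])                   # spike level, ug/L
signal = np.array([12.1, 24.3, 49.0, 121.8, 244.5])   # detector response

slope, intercept = np.polyfit(conc, signal, 1)
resid = signal - (slope * conc + intercept)
sigma = resid.std(ddof=2)            # residual std error of the linear fit

lod = 3.3 * sigma / slope            # 3.3*sigma/S convention
print(f"LOD ~ {lod:.3f} ug/L")

replicates = np.array([1.02, 0.97, 1.05, 0.99, 1.01])  # repeat measurements
print(f"%RSD = {100 * replicates.std(ddof=1) / replicates.mean():.1f}%")
```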
Abstract:
Automatic load transfer (ALT) on the 11 kV network is the process by which circuit breakers on the network are switched to form open points in order to feed load from different primary substations. Potential benefits of using ALT dynamically include maximising utilisation of existing assets, voltage regulation, and reduced losses. One of the key issues that has yet to be properly addressed in published research is how to validate that the modelled benefits really exist. On an 11 kV distribution network, where the load is continually changing and the load on each distribution substation is unlikely to be monitored, a reduction in losses from moving the normally open point is particularly difficult to prove. This study proposes a method to overcome this problem and uses measured primary feeder data from two parts of the Western Power Distribution 11 kV network under different configurations. The process of choosing the different configurations is based on a heuristic modelling method that locates minimum voltages to help reduce losses.
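To make the loss mechanism concrete, here is a toy I²R model (not the paper's heuristic) of a ring fed from two primaries: moving the normally open point changes how much current each section carries, and hence the total losses. All resistances and loads are invented.

```python
# Toy: total I^2*R losses for each possible open-point position on a ring.
def radial_losses(resistances, loads):
    """Sections listed from the source outward; the section nearest the
    source carries the sum of all downstream loads."""
    losses, carried = 0.0, 0.0
    for r, load in zip(reversed(resistances), reversed(loads)):
        carried += load
        losses += carried ** 2 * r
    return losses

def ring_losses(r, loads, open_after):
    """Open the ring after load `open_after`; each half becomes radial,
    one fed from primary A, the other from primary B."""
    a_r, a_l = r[:open_after + 1], loads[:open_after + 1]
    b_r, b_l = r[open_after + 1:][::-1], loads[open_after + 1:][::-1]
    return radial_losses(a_r, a_l) + radial_losses(b_r, b_l)

r = [0.05, 0.08, 0.06, 0.07, 0.05]   # ohms per 11 kV section
amps = [40, 60, 30, 80, 50]          # load at each distribution substation
best = min(range(len(amps)), key=lambda k: ring_losses(r, amps, k))
print("lowest-loss open point is after substation", best)
```

The validation difficulty the paper addresses is precisely that `amps` is not monitored in practice and changes continually, so the modelled optimum cannot be checked directly.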
Abstract:
Quantification of the lipid content in liposomal adjuvants for subunit vaccine formulation is of extreme importance, since this concentration impacts both efficacy and stability. In this paper, we outline a high-performance liquid chromatography-evaporative light scattering detector (HPLC-ELSD) method that allows for the rapid and simultaneous quantification of lipid concentrations within liposomal systems prepared by three liposomal manufacturing techniques (lipid film hydration, high shear mixing, and microfluidics). The ELSD system was used to quantify four lipids: 1,2-dimyristoyl-sn-glycero-3-phosphocholine (DMPC), cholesterol, dimethyldioctadecylammonium (DDA) bromide, and D-(+)-trehalose 6,6′-dibehenate (TDB). The developed method offers rapidity, high sensitivity, good linearity, and consistent responses (R² > 0.993 for the four lipids tested). The corresponding limits of detection (LOD) and quantification (LOQ) were 0.11 and 0.36 mg/mL (DMPC), 0.02 and 0.80 mg/mL (cholesterol), 0.06 and 0.20 mg/mL (DDA), and 0.05 and 0.16 mg/mL (TDB), respectively. HPLC-ELSD was shown to be a rapid and effective method for the quantification of lipids within liposome formulations without the need for lipid extraction processes.
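Complementing the LOD sketch above, the linearity criterion (R² > 0.993) and an LOQ can be checked from the same calibration fit; standard concentrations and peak areas below are invented.

```python
# Sketch: calibration linearity (R^2) and LOQ for one lipid standard.
import numpy as np

conc = np.array([0.05, 0.1, 0.25, 0.5, 1.0])    # mg/mL standards
area = np.array([1.9, 4.1, 10.2, 20.6, 40.9])   # ELSD peak areas (invented)

slope, intercept = np.polyfit(conc, area, 1)
fit = slope * conc + intercept
r2 = 1 - np.sum((area - fit) ** 2) / np.sum((area - area.mean()) ** 2)
loq = 10 * (area - fit).std(ddof=2) / slope     # 10*sigma/S convention
print(f"R^2 = {r2:.4f}, LOQ ~ {loq:.3f} mg/mL")
```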
Abstract:
This paper presents a theoretical model for the vibration analysis of microscale fluid-loaded rectangular isotropic plates, based on Lamb's assumption of fluid-structure interaction and the Rayleigh-Ritz energy method. An analytical solution for this model is proposed, which can be applied to most cases of boundary conditions. Dynamical experimental data for a series of microfabricated silicon plates were obtained using a base-excitation dynamic testing facility. The natural frequencies and mode shapes in the experimental results are in good agreement with the theoretical simulations for the lower-order modes. The presented theoretical and experimental investigations of the vibration characteristics of microscale plates are of particular interest in the design of microplate-based biosensing devices.
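As general background (stated with assumed symbols, not the paper's own notation), a standard consequence of Lamb's added-mass treatment is that immersion lowers each in-vacuo natural frequency through a nondimensional added virtual mass factor:

```latex
f_{\mathrm{fluid}} = \frac{f_{\mathrm{vac}}}{\sqrt{1 + \beta}},
\qquad
\beta = \Gamma \, \frac{\rho_f \, a}{\rho_p \, h}
```

where ρ_f and ρ_p are the fluid and plate densities, a and h are a characteristic plate dimension and thickness, and Γ is a nondimensional factor depending on the mode shape and boundary conditions. The fluid's effect is therefore strongest for thin, light plates in dense fluids, which is why it matters for microplate biosensors operating in liquid.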
Abstract:
In the context of products from certain regions or countries being banned because of an identified or non-identified hazard, proof of geographical origin is essential with regard to feed and food safety issues. Usually, the product labeling of an affected feed lot shows origin, and the paper documentation shows traceability. Incorrect product labeling is common in embargo situations, however, and alternative analytical strategies for controlling feed authenticity are therefore needed. In this study, distillers' dried grains and solubles (DDGS) were chosen as the product on which to base a comparison of analytical strategies aimed at identifying the most appropriate one. Various analytical techniques were investigated for their ability to authenticate DDGS, including spectroscopic and spectrometric techniques combined with multivariate data analysis, as well as proven techniques for authenticating food, such as DNA analysis and stable isotope ratio analysis. An external validation procedure (called the system challenge) was used to analyze sample sets blind and to compare analytical techniques. All the techniques were adapted so as to be applicable to the DDGS matrix. They produced positive results in determining the botanical origin of DDGS (corn vs. wheat), and several of them were able to determine the geographical origin of the DDGS in the sample set. The maintenance and extension of the databanks generated in this study through the analysis of new authentic samples from a single location are essential in order to monitor developments and processing that could affect authentication.
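As an illustration of the "spectroscopic techniques combined with multivariate data analysis" route mentioned above, here is a minimal, self-contained sketch of discriminating corn- from wheat-derived DDGS spectra using PCA and a nearest-centroid rule. All spectra are synthetic stand-ins; the study's actual chemometric models were certainly more elaborate.

```python
# Sketch: PCA + nearest-centroid classification of synthetic spectra.
import numpy as np

rng = np.random.default_rng(0)
corn = rng.normal(1.0, 0.1, (20, 50));  corn[:, 10] += 0.5   # fake marker band
wheat = rng.normal(1.0, 0.1, (20, 50)); wheat[:, 30] += 0.5
X = np.vstack([corn, wheat])
labels = np.array([0] * 20 + [1] * 20)

# PCA via SVD on mean-centred spectra; keep the first two components
Xc = X - X.mean(axis=0)
scores = Xc @ np.linalg.svd(Xc, full_matrices=False)[2].T[:, :2]

centroids = np.array([scores[labels == k].mean(axis=0) for k in (0, 1)])
pred = np.argmin(((scores[:, None, :] - centroids) ** 2).sum(-1), axis=1)
print(f"training accuracy: {(pred == labels).mean():.0%}")
```

The "system challenge" described above corresponds to holding out blind samples and scoring such a model on them rather than on its own training set.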
Abstract:
Injection stretch blow moulding is a well-established method of forming thin-walled containers and has been extensively researched for many years. This paper is concerned with validating the finite element analysis of the free-stretch-blow process in an effort to progress the development of injection stretch blow moulding of poly(ethylene terephthalate). Extensive data were obtained experimentally over a wide process window accounting for material temperature and air flow rate, while capturing cavity pressure, stretch-rod reaction force, and preform surface strain. These data were then used to assess the accuracy of the corresponding FE simulation, constructed using the ABAQUS/Explicit solver and an appropriate viscoelastic material subroutine. Results reveal that the simulation is able to give good quantitative correlation for conditions where the deformation was predominantly equal biaxial, whilst qualitative correlation was achievable when the mode of deformation was predominantly sequential biaxial. Overall, the simulation was able to pick up the general trends of how the pressure, reaction force, strain rate, and strain vary with preform temperature and air flow rate. The knowledge gained from these analyses provides insight into the mechanisms of bottle formation, subsequently improving the blow moulding simulation and allowing for a reduction in future development costs.
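One simple way to quantify the simulation-experiment correlation described above is an RMSE and correlation coefficient between measured and predicted traces such as cavity pressure; both signals below are invented placeholders, not the study's data.

```python
# Sketch: quantifying agreement between a measured and a simulated trace.
import numpy as np

t = np.linspace(0.0, 0.4, 200)                 # time, s
p_exp = 8.0 * (1 - np.exp(-t / 0.05))          # "measured" pressure, bar
p_sim = 8.3 * (1 - np.exp(-t / 0.055))         # "FE-predicted" pressure

rmse = np.sqrt(np.mean((p_sim - p_exp) ** 2))
r = np.corrcoef(p_exp, p_sim)[0, 1]
print(f"RMSE = {rmse:.2f} bar, Pearson r = {r:.3f}")
```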
Abstract:
The aim of this study was to develop a multiplex loop-mediated isothermal amplification (LAMP) method capable of detecting Escherichia coli generally and verocytotoxigenic E. coli (VTEC) specifically in beef and bovine faeces. The LAMP assay developed was highly specific (100%) and able to distinguish between E. coli and VTEC based on the amplification of the phoA gene and the stx1 and/or stx2 genes, respectively. In the absence of an enrichment step, the 50% limit of detection (LOD50) of the LAMP assay was determined to be 2.83, 3.17, and 2.83-3.17 log CFU/g for E. coli with the phoA, stx1, and stx2 genes, respectively, when artificially inoculated minced beef and bovine faeces were tested. The LAMP calibration curves generated with pure cultures, and with spiked beef and faeces, suggested that the assay had good quantification capability. Validation of the assay, performed using retail beef and bovine faeces samples, demonstrated good correlation between counts obtained by the LAMP assay and by a conventional culture method, but suggested the possibility of false negative LAMP results for 12.5-14.7% of samples tested. The multiplex LAMP assay developed potentially represents a rapid alternative to culture for monitoring E. coli levels in beef or faeces, and it would provide additional information on the presence of VTEC. However, some further optimisation is needed to improve detection sensitivity.
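A common way to estimate an LOD50 of the kind reported above is to fit a logistic curve to detect/non-detect outcomes across spike levels and read off the level giving a 50% detection probability. The counts below are invented for illustration, not the study's data.

```python
# Sketch: LOD50 by maximum-likelihood logistic fit to detection outcomes.
import numpy as np
from scipy.optimize import minimize

level = np.array([2.0, 2.5, 3.0, 3.5, 4.0])   # spike level, log CFU/g
trials = np.array([10, 10, 10, 10, 10])        # LAMP reactions per level
hits = np.array([0, 2, 5, 9, 10])              # positive results (invented)

def neg_log_lik(params):
    b0, b1 = params
    p = 1.0 / (1.0 + np.exp(-(b0 + b1 * level)))
    p = np.clip(p, 1e-9, 1 - 1e-9)             # numerical safety
    return -np.sum(hits * np.log(p) + (trials - hits) * np.log(1 - p))

b0, b1 = minimize(neg_log_lik, x0=[0.0, 1.0]).x
print(f"LOD50 ~ {-b0 / b1:.2f} log CFU/g")     # p = 0.5 where b0 + b1*x = 0
```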
Abstract:
Surface flow types (SFTs) are advocated as ecologically relevant hydraulic units, often mapped visually from the bankside to rapidly characterise the physical habitat of rivers. SFT mapping is simple, non-invasive, and cost-efficient. However, it is also qualitative, subjective, and plagued by difficulties in recording accurately the spatial extent of SFT units. Quantitative validation of the underlying physical habitat parameters is often lacking, and these parameters do not consistently differentiate between SFTs. Here, we investigate explicitly the accuracy, reliability, and statistical separability of traditionally mapped SFTs as indicators of physical habitat, using independent hydraulic and topographic data collected during three surveys of a c. 50 m reach of the River Arrow, Warwickshire, England. We also explore the potential of a novel remote sensing approach, comprising a small unmanned aerial system (sUAS) and Structure-from-Motion photogrammetry (SfM), as an alternative method of physical habitat characterisation. Our key findings indicate that SFT mapping accuracy is highly variable, with overall mapping accuracy not exceeding 74%. Analysis of similarity (ANOSIM) tests found that strong differences did not exist between all SFT pairs. This leads us to question the suitability of SFTs for characterising physical habitat for river science and management applications. In contrast, the sUAS-SfM approach provided high-resolution, spatially continuous, spatially explicit, quantitative measurements of water depth and point cloud roughness at the microscale (spatial scales ≤ 1 m). Such data are acquired rapidly and inexpensively, and provide new opportunities for examining the heterogeneity of physical habitat over a range of spatial and temporal scales. Whilst continued refinement of the sUAS-SfM approach is required, we propose that this method offers an opportunity to move away from broad, mesoscale classifications of physical habitat (spatial scales 10-100 m) and towards continuous, quantitative measurements of the continuum of hydraulic and geomorphic conditions that actually exists at the microscale.
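Overall mapping accuracy figures like the 74% quoted above are conventionally computed from a confusion matrix of bankside-mapped versus independently surveyed classes; the counts below are invented, not the study's data.

```python
# Sketch: overall mapping accuracy from a confusion matrix.
import numpy as np

# rows = SFT mapped from the bank, cols = SFT from the hydraulic survey
confusion = np.array([[34,  6,  2],
                      [ 8, 25,  7],
                      [ 3,  9, 20]])

overall_accuracy = np.trace(confusion) / confusion.sum()
print(f"overall mapping accuracy: {overall_accuracy:.0%}")
```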
Abstract:
The design number of gyrations (Ndesign), introduced by the Strategic Highway Research Program (SHRP) and used in the Superior Performing Asphalt Pavement (Superpave) mix design method, has been commonly used in flexible pavement design throughout the US since 1996. Ndesign, also known as the compaction effort, is used to simulate field compaction during construction, and has been reported to produce mixes whose air voids do not reach the ultimate pavement density within the first 2 to 3 years after construction, potentially having an adverse impact on long-term performance. Other state transportation agencies have conducted studies validating Ndesign for their specific regions, which resulted in modifications of the gyration effort for the various traffic levels. Validating this relationship for Iowa asphalt mix designs will lead to better correlations between mix design target voids, field voids, and performance. A comprehensive analysis investigated current Ndesign levels with existing mixes and pavements and developed initial asphalt mix design recommendations that identify an optimum Ndesign through the use of performance test data.
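For context, the air-void content checked at Ndesign follows the standard Superpave relation between the compacted specimen's bulk specific gravity and the mix's theoretical maximum specific gravity; the specimen values below are invented.

```python
# Sketch: standard Superpave air-void calculation at Ndesign.
g_mm = 2.485   # theoretical maximum specific gravity of the mix
g_mb = 2.387   # bulk specific gravity of the specimen at Ndesign (invented)

air_voids = 100.0 * (1.0 - g_mb / g_mm)   # design target is typically 4%
print(f"air voids at Ndesign: {air_voids:.1f}%")
```

Raising or lowering Ndesign shifts g_mb at the design point, which is why recalibrating the gyration level changes how well laboratory target voids track field voids.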
Abstract:
Objective: The study was designed to validate use of electronic health records (EHRs) for diagnosing bipolar disorder and classifying control subjects. Method: EHR data were obtained from a health care system of more than 4.6 million patients spanning more than 20 years. Experienced clinicians reviewed charts to identify text features and coded data consistent or inconsistent with a diagnosis of bipolar disorder. Natural language processing was used to train a diagnostic algorithm with 95% specificity for classifying bipolar disorder. Filtered coded data were used to derive three additional classification rules for case subjects and one for control subjects. The positive predictive value (PPV) of EHR-based bipolar disorder and subphenotype diagnoses was calculated against diagnoses from direct semistructured interviews of 190 patients by trained clinicians blind to EHR diagnosis. Results: The PPV of bipolar disorder defined by natural language processing was 0.85. Coded classification based on strict filtering achieved a value of 0.79, but classifications based on less stringent criteria performed less well. No EHR-classified control subject received a diagnosis of bipolar disorder on the basis of direct interview (PPV=1.0). For most subphenotypes, values exceeded 0.80. The EHR-based classifications were used to accrue 4,500 bipolar disorder cases and 5,000 controls for genetic analyses. Conclusions: Semiautomated mining of EHRs can be used to ascertain bipolar disorder patients and control subjects with high specificity and predictive value compared with diagnostic interviews. EHRs provide a powerful resource for high-throughput phenotyping for genetic and clinical research.
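The PPV figures above follow the usual definition, computed against the interview gold standard; a minimal sketch with invented counts:

```python
# Sketch: positive predictive value against gold-standard interviews.
true_positive = 97    # EHR says bipolar, interview confirms (invented)
false_positive = 17   # EHR says bipolar, interview disagrees (invented)

ppv = true_positive / (true_positive + false_positive)
print(f"PPV = {ppv:.2f}")
```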