865 results for Validation of analytical methods
Abstract:
A gravimetric method was evaluated as a simple, sensitive, reproducible, low-cost alternative to quantify the extent of brain infarct after occlusion of the medial cerebral artery in rats. In ether-anesthetized rats, the left medial cerebral artery was occluded for 1, 1.5 or 2 h by inserting a 4-0 nylon monofilament suture into the internal carotid artery. Twenty-four hours later, the brains were processed for histochemical triphenyltetrazolium chloride (TTC) staining and quantitation of the ischemic infarct. In each TTC-stained brain section, the ischemic tissue was dissected with a scalpel and fixed in 10% formalin at 0ºC until its total mass could be estimated. The mass (mg) of the ischemic tissue was weighed on an analytical balance and compared to its volume (mm³), estimated either by plethysmometry using platinum electrodes or by computer-assisted image analysis. Infarct size as measured by the weighing method (mg), and reported as a percent (%) of the affected (left) hemisphere, correlated closely with volume (mm³, also reported as %) estimated by computerized image analysis (r = 0.88; P < 0.001; N = 10) or by plethysmometry (r = 0.97-0.98; P < 0.0001; N = 41). This degree of correlation was maintained between different experimenters. The method was also sensitive for detecting the effect of different ischemia durations on infarct size (P < 0.005; N = 23), and the effect of drug treatments in reducing the extent of brain damage (P < 0.005; N = 24). The data suggest that, in addition to being simple and low cost, the weighing method is a reliable alternative for quantifying brain infarct in animal models of stroke.
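The agreement between the weighing method and the volumetric estimates rests on the Pearson correlation coefficient. A minimal sketch of that computation, using entirely made-up paired infarct measurements (the values below are illustrative, not the study's data):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical paired infarct sizes, each as % of the affected hemisphere.
mass_pct   = [12.0, 18.5, 25.1, 30.2, 41.7]   # weighing method
volume_pct = [11.4, 19.2, 24.0, 31.5, 40.1]   # image analysis
r = pearson_r(mass_pct, volume_pct)
```

A strong linear agreement, as reported in the abstract (r = 0.88-0.98), corresponds to r close to 1 in this computation.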
Abstract:
This work evaluated the effect of filtration methods on the yield and oleochemical characteristics of crude fish oil recovered from the soapstock of marine fish for nutritional purposes. The analytical properties of the crude oil and of oil neutralized with three excesses of sodium hydroxide (NaOH; 20%, 40% and 60%) were determined after filtration in two different ways, through organza and through glass wool. Neutralization brought about a notable improvement in the analytical properties of the oil, leading to a high-quality fish oil in terms of taste, colour, odour, shelf life and market value. Based on these improved characteristics, the oil could be suitable for applications in the pharmaceutical and food industries.
Abstract:
Prostate cancer (PCa) has emerged as the most commonly diagnosed lethal cancer in European men. PCa is a heterogeneous cancer that in the majority of cases is slow growing; consequently, these patients would not need any medical treatment. Currently, the measurement of prostate-specific antigen (PSA) from blood by immunoassay, followed by digital rectal examination and a pathological examination of prostate tissue biopsies, are the most widely used methods in the diagnosis of PCa. These methods suffer from a lack of sensitivity and specificity that may cause either missed cancers or overtreatment as a consequence of over-diagnosis. Therefore, more reliable biomarkers are needed for a better discrimination between indolent and potentially aggressive cancers. The aim of this thesis was the identification and validation of novel biomarkers for PCa. The mRNA expression level of 14 genes, including AMACR, AR, PCA3, SPINK1, TMPRSS2-ERG, KLK3, ACSM1, CACNA1D, DLX1, LMNB1, PLA2G7, RHOU, SPON2, and TDRD1, was measured by truly quantitative reverse transcription PCR in different prostate tissue samples from men with and without PCa. For the last eight genes, their function in PCa progression was studied by specific siRNA knockdown in PC-3 and VCaP cells. The results from radical prostatectomy and cystoprostatectomy samples showed statistically significant overexpression for all the target genes, except for KLK3, in men with PCa compared with men without PCa. A statistically significant difference was also observed in low versus high Gleason grade tumors (for PLA2G7), PSA relapse versus no relapse (for SPON2), and low versus high TNM stages (for CACNA1D and DLX1). Functional studies and siRNA silencing results revealed a cytotoxic effect for the knockdown of DLX1, PLA2G7, and RHOU, and altered tumor cell invasion for PLA2G7, RHOU, ACSM1, and CACNA1D knockdown in 3D conditions.
In addition, effects on tumor cell motility were observed after silencing PLA2G7 and RHOU in 2D monolayer cultures. Altogether, these findings indicate the possibility of utilizing these new markers as diagnostic and prognostic markers, and they may also represent therapeutic targets for PCa.
Abstract:
Several automated reversed-phase HPLC methods have been developed to determine trace concentrations of carbamate pesticides (which are of concern in Ontario environmental samples) in water by utilizing two solid sorbent extraction techniques. One of the methods is known as 'on-line pre-concentration'. This technique involves passing 100 milliliters of sample water through a 3 cm pre-column, packed with 5 micron ODS sorbent, at flow rates varying from 5-10 mL/min. By the use of a valve apparatus, the HPLC system is then switched to a gradient mobile phase program consisting of acetonitrile and water. The analytes, Propoxur, Carbofuran, Carbaryl, Propham, Captan, Chlorpropham, Barban, and Butylate, which are pre-concentrated on the pre-column, are eluted and separated on a 25 cm C-8 analytical column and determined by UV absorption at 220 nm. The total analytical time is 60 minutes, and the pre-column can be used repeatedly for the analysis of as many as thirty samples. The method is highly sensitive as 100 percent of the analytes present in the sample can be injected into the HPLC. No breakthrough of any of the analytes was observed and the minimum detectable concentrations range from 10 to 480 ng/L. The developed method is totally automated for the analysis of one sample. When the above mobile phase is modified with a buffer solution, Aminocarb, Benomyl, and its degradation product, MBC, can also be detected along with the above pesticides with baseline resolution for all of the analytes. The method can also be easily modified to determine Benomyl and MBC both as solute and as particulate matter. By using a commercially available solid phase extraction cartridge, in lieu of a pre-column, for the extraction and concentration of analytes, a completely automated method has been developed with the aid of the Waters Millilab Workstation. Sample water is loaded at 10 mL/min through a cartridge and the concentrated analytes are eluted from the sorbent with acetonitrile.
The resulting eluate is blown down under nitrogen, made up to volume with water, and injected into the HPLC. The total analytical time is 90 minutes. Fifty percent of the analytes present in the sample can be injected into the HPLC, and recoveries for the above eight pesticides ranged from 84 to 93 percent. The minimum detectable concentrations range from 20 to 960 ng/L. The developed method is totally automated for the analysis of up to thirty consecutive samples. The method has proven to be applicable both to pure water samples and to untreated lake water samples.
Abstract:
Atherosclerosis is a disease that, through the accumulation of lipid plaques, causes hardening of the arterial wall and narrowing of the lumen. These lesions are generally located in the coronary, carotid, aortic, renal, digestive and peripheral arterial segments. Among peripheral lesions, those of the lower limbs are particularly frequent. The severity of these arterial lesions is usually assessed by the degree of stenosis (>50% reduction of the lumen diameter) on angiography, magnetic resonance imaging (MRI), computed tomography (CT) or ultrasound. However, to plan a surgical intervention, a 3D representation of the arterial geometry is preferable. Cross-sectional imaging methods (MRI and CT) perform very well at generating good-quality three-dimensional images, but their use is expensive and invasive for patients. 3D ultrasound can be a very promising imaging avenue for the localization and quantification of stenoses. This imaging modality offers distinct advantages such as convenience, low cost for a non-invasive diagnosis (no irradiation or nephrotoxic contrast agent), and the option of Doppler analysis to quantify blood flow. Since medical robots have already been used successfully in surgery and orthopedics, our team designed a new robotic 3D ultrasound system to detect and quantify lower-limb stenoses. With this new technology, a radiologist manually teaches the robot an ultrasound scan of the vessel of interest.
The robot then repeats the learned trajectory with very high precision, simultaneously controls the ultrasound image acquisition process at a constant sampling step, and safely maintains the force applied by the probe on the patient's skin. Consequently, the reconstruction of a 3D arterial geometry of the lower limbs from this system could allow highly reliable localization and quantification of stenoses. The objective of this research project was therefore to validate and optimize this robotic 3D ultrasound imaging system. The reliability of a geometry reconstructed in 3D from a robotic reference frame depends strongly on the positioning accuracy and on the calibration procedure. Accordingly, the positioning accuracy of the robotic arm was evaluated throughout its workspace with a phantom specially designed to simulate the configuration of the lower-limb arteries (article 1 - chapter 3). In addition, a Z-shaped crossed-wire phantom was designed to ensure accurate calibration of the robotic system (article 2 - chapter 4). These optimal methods were used to validate the system for clinical application and to find the transformation that converts the coordinates of the 2D ultrasound image into the Cartesian frame of the robotic arm. From these results, any object scanned by the robotic system can be characterized for an adequate 3D reconstruction. Vascular phantoms compatible with several imaging modalities were used to simulate different arterial representations of the lower limbs (article 2 - chapter 4, article 3 - chapter 5). The reconstructed geometries were validated by means of comparative analyses. The accuracy of this robotic 3D ultrasound imaging system in localizing and quantifying stenoses was also determined.
These evaluations were performed in vivo to assess the potential of using such a system in the clinic (article 3 - chapter 5).
Abstract:
CO, O3, and H2O data in the upper troposphere/lower stratosphere (UTLS) measured by the Atmospheric Chemistry Experiment Fourier Transform Spectrometer (ACE-FTS) on Canada’s SCISAT-1 satellite are validated using aircraft and ozonesonde measurements. In the UTLS, validation of chemical trace gas measurements is a challenging task due to small-scale variability in the tracer fields, strong gradients of the tracers across the tropopause, and scarcity of measurements suitable for validation purposes. Validation based on coincidences therefore suffers from geophysical noise. Two alternative methods for the validation of satellite data are introduced, which avoid the usual need for coincident measurements: tracer-tracer correlations, and vertical tracer profiles relative to tropopause height. Both are increasingly being used for model validation as they strongly suppress geophysical variability and thereby provide an “instantaneous climatology”. This allows comparison of measurements between non-coincident data sets, which yields information about the precision and a statistically meaningful error assessment of the ACE-FTS satellite data in the UTLS. By defining a trade-off factor, we show that the measurement errors can be reduced by including more measurements obtained over a wider longitude range into the comparison, despite the increased geophysical variability. Applying the methods then yields the following upper bounds to the relative differences in the mean found between the ACE-FTS and SPURT aircraft measurements in the upper troposphere (UT) and lower stratosphere (LS), respectively: for CO ±9% and ±12%, for H2O ±30% and ±18%, and for O3 ±25% and ±19%. The relative differences for O3 can be narrowed down by using a larger dataset obtained from ozonesondes, yielding a high bias in the ACE-FTS measurements of 18% in the UT and relative differences of ±8% for measurements in the LS.
When taking into account the smearing effect of the vertically limited spacing between measurements of the ACE-FTS instrument, the relative differences decrease by 5–15% around the tropopause, suggesting a vertical resolution of the ACE-FTS in the UTLS of around 1 km. The ACE-FTS hence offers unprecedented precision and vertical resolution for a satellite instrument, which will allow a new global perspective on UTLS tracer distributions.
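The core of the non-coincident comparison is the relative difference between dataset means within a common tropopause-relative layer. A minimal sketch of that statistic, with hypothetical mixing-ratio values (not the study's data):

```python
import statistics

def relative_mean_difference(a, b):
    """Relative difference (%) between the means of two datasets,
    referenced to the mean of the second (e.g. aircraft) dataset."""
    return 100 * (statistics.mean(a) - statistics.mean(b)) / statistics.mean(b)

# Hypothetical O3 mixing ratios (ppbv) sampled in the same tropopause-relative
# layer by the satellite and by aircraft -- non-coincident in space and time,
# but comparable because both sample the same "instantaneous climatology".
satellite = [410, 455, 390, 430, 470]
aircraft  = [400, 440, 420, 415, 450]
diff_pct = relative_mean_difference(satellite, aircraft)
```

A positive value indicates a high bias of the satellite relative to the aircraft data, as reported for O3 in the UT.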
Abstract:
The estimation of the long-term wind resource at a prospective site based on a relatively short on-site measurement campaign is an indispensable task in the development of a commercial wind farm. The typical industry approach is based on the measure-correlate-predict (MCP) method, where a relational model between the site wind velocity data and the data obtained from a suitable reference site is built from concurrent records. In a subsequent step, a long-term prediction for the prospective site is obtained from a combination of the relational model and the historic reference data. In the present paper, a systematic study is presented where three new MCP models, together with two published reference models (a simple linear regression and the variance ratio method), have been evaluated based on concurrent synthetic wind speed time series for two sites, simulating the prospective and the reference site. The synthetic method has the advantage of generating time series with the desired statistical properties, including Weibull scale and shape factors, required to evaluate the five methods under all plausible conditions. In this work, first a systematic discussion of the statistical fundamentals behind MCP methods is provided and three new models, one based on a nonlinear regression and two (termed kernel methods) derived from the use of conditional probability density functions, are proposed. All models are evaluated by using five metrics under a wide range of values of the correlation coefficient, the Weibull scale, and the Weibull shape factor. Only one of all models, a kernel method based on bivariate Weibull probability functions, is capable of accurately predicting all performance metrics studied.
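One of the published reference models, the variance ratio method, can be sketched in a few lines: the slope is chosen so that the predicted series reproduces the variance of the site data, unlike ordinary least squares, which shrinks it. The wind speeds below are illustrative values, not data from the paper:

```python
import statistics

def variance_ratio_mcp(site, ref):
    """Fit the variance-ratio MCP model y = a + b*x on concurrent records.

    The slope b = sd(site)/sd(ref) preserves the site variance in the
    prediction. Returns a function that maps long-term reference wind
    speeds to predicted site wind speeds."""
    mu_s, mu_r = statistics.mean(site), statistics.mean(ref)
    b = statistics.stdev(site) / statistics.stdev(ref)
    a = mu_s - b * mu_r
    return lambda x: a + b * x

# Hypothetical concurrent wind speeds (m/s) at the reference and target site.
ref_concurrent  = [4.1, 5.3, 6.8, 7.2, 5.9, 4.7]
site_concurrent = [5.0, 6.4, 8.3, 8.8, 7.1, 5.6]
predict = variance_ratio_mcp(site_concurrent, ref_concurrent)

# Applied to long-term reference data, this yields the long-term site estimate.
long_term_site = [predict(v) for v in [5.0, 6.0, 7.0]]
```

By construction, applying `predict` back to the concurrent reference records reproduces both the mean and the standard deviation of the concurrent site records.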
Abstract:
Dietary assessment in older adults can be challenging. The Novel Assessment of Nutrition and Ageing (NANA) method is a touch-screen computer-based food record that enables older adults to record their dietary intakes. The objective of the present study was to assess the relative validity of the NANA method for dietary assessment in older adults. For this purpose, three studies were conducted in which a total of ninety-four older adults (aged 65–89 years) used the NANA method of dietary assessment. On a separate occasion, participants completed a 4 d estimated food diary. Blood and 24 h urine samples were also collected from seventy-six of the volunteers for the analysis of biomarkers of nutrient intake. The results from all three studies were combined, and nutrient intake data collected using the NANA method were compared against the 4 d estimated food diary and biomarkers of nutrient intake. Bland–Altman analysis showed a reasonable agreement between the dietary assessment methods for energy and macronutrient intake; however, there were small, but significant, differences for energy and protein intake, reflecting the tendency for the NANA method to record marginally lower energy intakes. Significant positive correlations were observed between urinary urea and dietary protein intake using both the NANA and the 4 d estimated food diary methods, and between plasma ascorbic acid and dietary vitamin C intake using the NANA method. The results demonstrate the feasibility of computer-based dietary assessment in older adults, and suggest that the NANA method is comparable to the 4 d estimated food diary, and could be used as an alternative to the food diary for the short-term assessment of an individual’s dietary intake.
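The Bland–Altman analysis used to compare the two dietary methods reduces to the mean of the paired differences (the bias) and the 95% limits of agreement. A minimal sketch with hypothetical energy intakes (the values are illustrative, not the study's data):

```python
import statistics

def bland_altman(method_a, method_b):
    """Bland-Altman agreement statistics for paired measurements:
    mean bias and 95% limits of agreement (bias +/- 1.96 SD of the
    paired differences)."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical daily energy intakes (kcal) from the two methods.
nana  = [1850, 1720, 2010, 1640, 1930]
diary = [1900, 1760, 2000, 1710, 1980]
bias, lower, upper = bland_altman(nana, diary)
```

A negative bias here corresponds to the abstract's finding that the NANA method tends to record marginally lower energy intakes than the food diary.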
Abstract:
Capturing the sensory perception and preferences of older adults, whether healthy or with particular disease states, poses major methodological challenges for the sensory community. Currently a vastly under-researched area, it is at the same time a vital area of research, as alterations in sensory perception can affect daily dietary food choices, intake, health and wellbeing. Tailored sensory methods are needed that take into account the challenges of working with such populations, including poor access leading to low patient numbers (study power), cognitive abilities, use of medications, clinical treatments and context (hospitals and care homes). The objective of this paper was to review current analytical and affective sensory methodologies used with different cohorts of healthy and frail older adults, with a focus on food preference and liking. We particularly drew attention to studies concerning general ageing as well as to those considering age-related diseases that have an emphasis on malnutrition and weight loss. PubMed and Web of Science databases were searched to 2014 for relevant articles in English. From this search, 75 papers concerning sensory acuity, 41 regarding perceived intensity and 73 relating to hedonic measures were reviewed. Simpler testing methods, such as directional forced choice tests and paired preference tests, need to be further explored to determine whether they lead to more reliable results and better inter-cohort comparisons. Finally, sensory quality and related quality of life for older adults suffering from dementia must be included and not ignored in our future actions.
Abstract:
Intercomparison and evaluation of the global ocean surface mixed layer depth (MLD) fields estimated from a suite of major ocean syntheses are conducted. Compared with the reference MLDs calculated from individual profiles, MLDs calculated from monthly mean and gridded profiles show negative biases of 10–20 m in early spring related to the re-stratification process of relatively deep mixed layers. Vertical resolution of profiles also influences the MLD estimation. MLDs are underestimated by approximately 5–7 (14–16) m with the vertical resolution of 25 (50) m when the criterion of potential density exceeding the 10-m value by 0.03 kg m−3 is used for the MLD estimation. Using the larger criterion (0.125 kg m−3) generally reduces the underestimations. In addition, positive biases greater than 100 m are found in wintertime subpolar regions when MLD criteria based on temperature are used. Biases of the reanalyses are due to both model errors and errors related to differences between the assimilation methods. The result shows that these errors are partially cancelled out through the ensemble averaging. Moreover, the bias in the ensemble mean field of the reanalyses is smaller than in the observation-only analyses. This is largely attributed to comparably higher resolutions of the reanalyses. The robust reproduction of both the seasonal cycle and interannual variability by the ensemble mean of the reanalyses indicates a great potential of the ensemble mean MLD field for investigating and monitoring upper ocean processes.
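The density criterion described above (potential density exceeding its 10-m value by 0.03 kg m−3) can be sketched directly. The profile below is synthetic and purely illustrative; a real implementation would also handle profiles that do not reach the reference depth:

```python
def mixed_layer_depth(depths, sigma, threshold=0.03, ref_depth=10.0):
    """Estimate mixed layer depth (m) as the shallowest depth where potential
    density exceeds its value at ref_depth by `threshold` (kg/m^3), linearly
    interpolating between profile levels. Assumes depths are increasing and
    the profile spans ref_depth."""
    # Interpolate the density at the reference depth (10 m).
    for i in range(len(depths) - 1):
        if depths[i] <= ref_depth <= depths[i + 1]:
            f = (ref_depth - depths[i]) / (depths[i + 1] - depths[i])
            sigma_ref = sigma[i] + f * (sigma[i + 1] - sigma[i])
            break
    target = sigma_ref + threshold
    # Find the first level pair that brackets the target density.
    for i in range(len(depths) - 1):
        if sigma[i] <= target <= sigma[i + 1]:
            f = (target - sigma[i]) / (sigma[i + 1] - sigma[i])
            return depths[i] + f * (depths[i + 1] - depths[i])
    return depths[-1]  # mixed down to the bottom of the profile

# Synthetic profile: nearly uniform density to ~50 m, stratified below.
depths = [0, 10, 20, 30, 40, 50, 60, 70]
sigma  = [25.50, 25.50, 25.50, 25.51, 25.51, 25.52, 25.60, 25.70]
mld = mixed_layer_depth(depths, sigma)
```

Coarsening the vertical spacing of `depths` biases the interpolated crossing point, which is the resolution effect quantified in the abstract (roughly 5–16 m underestimation at 25–50 m spacing).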
Abstract:
The importance of nutrient intakes in osteoporosis prevention and treatment is widely recognized. The objective of the present study was to develop and validate a FFQ for women with osteoporosis. The questionnaire was composed of 60 items, separated into 10 groups. The relative validation was accomplished through comparison of the 3-Day Food Record (3DR) with the FFQ. The 3DR was applied to 30 elderly women with confirmed osteoporosis, and after 45 days the FFQ was administered. Statistical analysis comprised the Kolmogorov-Smirnov test, Student's t-test and Pearson correlation coefficient. The agreement between the two methods was evaluated by the frequency of similar classification into quartiles, and by the Bland-Altman method. No significant differences between methods were observed for the mean evaluated nutrients, except for carbohydrate and magnesium. Pearson correlation coefficients were positive and statistically significant for all nutrients. The overall proportion of subjects classified in the same quartile by the two methods was on average 50.01%, and in the opposite quartile 0.47%. For calcium intake, only 3% of subjects were classified in opposite extreme quartiles by the two methods. The Bland-Altman analysis demonstrated that the differences obtained by the two methods in each subject were well distributed around the mean of the difference, and the disagreement increases as the mean intake increases. These results indicate that the FFQ for elderly women with osteoporosis presented here is highly acceptable and is an accurate method that can be used in large-scale or clinical studies for evaluation of nutrient intakes in a similar population.
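The quartile cross-classification used here can be sketched as follows: each subject is assigned a quartile by each method, then the fractions of same-quartile and opposite-extreme-quartile classifications are counted. The intake values are hypothetical, not the study's data:

```python
def quartile(values, v):
    """0-based quartile index of v within the distribution of values."""
    s = sorted(values)
    n = len(s)
    cuts = [s[n // 4], s[n // 2], s[(3 * n) // 4]]
    return sum(v >= c for c in cuts)

def cross_classification(a, b):
    """Fraction of subjects placed in the same quartile by both methods,
    and fraction placed in opposite extreme quartiles (Q1 vs Q4)."""
    same = opposite = 0
    for x, y in zip(a, b):
        qa, qb = quartile(a, x), quartile(b, y)
        same += qa == qb
        opposite += abs(qa - qb) == 3
    n = len(a)
    return same / n, opposite / n

# Hypothetical calcium intakes (mg/day) from the FFQ and the 3-day record.
ffq = [420, 510, 630, 700, 820, 900, 980, 1100]
rec = [400, 560, 600, 750, 800, 950, 940, 1150]
same, opposite = cross_classification(ffq, rec)
```

A high same-quartile fraction with essentially no opposite-extreme classifications is the pattern the abstract reports (about 50% and 0.47%, respectively).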
Abstract:
Background: Oxidative modification of low-density lipoprotein (LDL) plays a key role in the pathogenesis of atherosclerosis. LDL(-) is present in blood plasma of healthy subjects and at higher concentrations in diseases with high cardiovascular risk, such as familial hypercholesterolemia or diabetes. Methods: We developed and validated a sandwich ELISA for LDL(-) in human plasma using two monoclonal antibodies against LDL(-) that do not bind to native LDL, extensively copper-oxidized LDL or malondialdehyde-modified LDL. The characteristics of assay performance, such as limits of detection and quantification, accuracy, and inter- and intra-assay precision, were evaluated. Linearity, interference and stability tests were also performed. Results: The calibration range of the assay is 0.625-20.0 mU/L at 1:2000 sample dilution. ELISA validation showed intra- and inter-assay precision and recovery within the required limits for immunoassays. The limits of detection and quantification were 0.423 mU/L and 0.517 mU/L LDL(-), respectively. The intra- and inter-assay coefficients of variation ranged from 9.5% to 11.5% and from 11.3% to 18.9%, respectively. Recovery of LDL(-) ranged from 92.8% to 105.1%. Conclusions: This ELISA represents a very practical tool for measuring LDL(-) in human blood for widespread research and clinical sample use. Clin Chem Lab Med 2008; 46: 1769-75.
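The precision and recovery figures quoted above boil down to two standard calculations, sketched here with hypothetical replicate values (not the study's data):

```python
import statistics

def assay_cv(replicates):
    """Coefficient of variation (%) of replicate assay measurements:
    sample standard deviation over the mean."""
    return 100 * statistics.stdev(replicates) / statistics.mean(replicates)

def recovery(measured, expected):
    """Spike recovery (%): measured concentration over the expected
    (added) concentration."""
    return 100 * measured / expected

# Hypothetical intra-assay replicates of one plasma sample (mU/L LDL(-)).
reps = [5.1, 4.8, 5.3, 4.9, 5.0]
cv = assay_cv(reps)        # within-run precision

# Hypothetical spike: 10.0 mU/L added, 9.6 mU/L measured.
rec = recovery(9.6, 10.0)
```

Intra-assay CVs are computed from replicates within one run, inter-assay CVs from the same sample measured across runs; the abstract's acceptance limits apply to both.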
Abstract:
Policy hierarchies and automated policy refinement are powerful approaches to simplify administration of security services in complex network environments. A crucial issue for the practical use of these approaches is to ensure the validity of the policy hierarchy, i.e. since the policy sets for the lower levels are automatically derived from the abstract policies (defined by the modeller), we must be sure that the derived policies uphold the high-level ones. This paper builds upon previous work on Model-based Management, particularly on the Diagram of Abstract Subsystems approach, and goes further to propose a formal validation approach for the policy hierarchies yielded by the automated policy refinement process. We establish general validation conditions for a multi-layered policy model, i.e. necessary and sufficient conditions that a policy hierarchy must satisfy so that the lower-level policy sets are valid refinements of the higher-level policies according to the criteria of consistency and completeness. Relying upon the validation conditions and upon axioms about the model representativeness, two theorems are proved to ensure compliance between the resulting system behaviour and the abstract policies that are modelled.
Abstract:
Throughout the industrial processes of sheet metal manufacturing and refining, shear cutting is widely used for its speed and cost advantages over competing cutting methods. Industrial shears may include some force measurement possibilities, but the force is most likely influenced by friction losses between the shear tool and the point of measurement, and in general does not reflect the actual force applied to the sheet. Well-defined shears and accurate measurements of force and shear tool position are important for understanding the influence of shear parameters. Accurate experimental data are also necessary for calibration of numerical shear models. Here, a dedicated laboratory set-up with well-defined geometry and movement in the shear, and high measurability in terms of force and geometry, is designed, built and verified. Parameters important to the shear process are studied with perturbation analysis techniques, and requirements on input parameter accuracy are formulated to meet experimental output demands. Input parameters in shearing are mostly geometric parameters, but also material properties and contact conditions. Based on the accuracy requirements, a symmetric experiment with internal balancing of forces is constructed to avoid guides and corresponding friction losses. Finally, the experimental procedure is validated through shearing of a medium grade steel. With the obtained experimental set-up performance, force changes as a result of changes in studied input parameters are distinguishable down to a level of 1%.
Abstract:
Background: The gap between what is known and what is practiced results in health service users not benefitting from advances in healthcare, and in unnecessary costs. A supportive context is considered a key element for successful implementation of evidence-based practices (EBP). There were no tools available for the systematic mapping of aspects of organizational context influencing the implementation of EBPs in low- and middle-income countries (LMICs). Thus, this project aimed to develop and psychometrically validate a tool for this purpose. Methods: The development of the Context Assessment for Community Health (COACH) tool was premised on the context dimension in the Promoting Action on Research Implementation in Health Services framework, and is a derivative product of the Alberta Context Tool. Its development was undertaken in Bangladesh, Vietnam, Uganda, South Africa and Nicaragua in six phases: (1) defining dimensions and draft tool development, (2) content validity amongst in-country expert panels, (3) content validity amongst international experts, (4) response process validity, (5) translation and (6) evaluation of psychometric properties amongst 690 health workers in the five countries. Results: The tool was validated for use amongst physicians, nurse/midwives and community health workers. The six phases of development resulted in a good fit between the theoretical dimensions of the COACH tool and its psychometric properties. The tool has 49 items measuring eight aspects of context: Resources, Community engagement, Commitment to work, Informal payment, Leadership, Work culture, Monitoring services for action and Sources of knowledge. Conclusions: Aspects of organizational context that were identified as influencing the implementation of EBPs in high-income settings were also found to be relevant in LMICs. 
However, there were additional aspects of context of relevance in LMICs, specifically Resources, Community engagement, Commitment to work and Informal payment. Use of the COACH tool will allow for systematic description of the local healthcare context prior to implementing healthcare interventions, so that implementation strategies can be tailored, or as part of the evaluation of implemented healthcare interventions, and will thus allow for deeper insights into the process of implementing EBPs in LMICs.