894 results for Quantitative verification
Abstract:
Diabetic retinopathy is the leading cause of visual loss in individuals under the age of 55. Most investigations into the pathogenesis of diabetic retinopathy have concentrated on the neural retina, since this is where clinical lesions are manifested. Recently, however, various abnormalities in the structural and secretory functions of the retinal pigment epithelium (RPE), which are essential for neuroretinal survival, have been found in diabetic retinopathy. In this context, we study here the effect of hyperglycemic and hypoxic conditions on the metabolism of a human retinal pigment epithelial cell line (ARPE-19) by integrating quantitative proteomics using tandem mass tagging (TMT), untargeted metabolomics using MS and NMR, and 13C-glucose isotopic labeling for metabolic tracking. We observed a remarkable metabolic diversification under our simulated in vitro hyperglycemic conditions of diabetes, characterized by increased flux through the polyol pathway and inhibition of the Krebs cycle and oxidative phosphorylation. Importantly, under low oxygen supply, RPE cells seem to rapidly consume glycogen stores and stimulate anaerobic glycolysis. Our results therefore pave the way for future therapeutic strategies aimed at modulating RPE metabolic impairment, with the goal of regulating the structural and secretory alterations of the RPE. Finally, this study shows the importance of tackling biomedical problems by integrating metabolomic and proteomic results.
Abstract:
OBJECTIVE: To evaluate lung fissure completeness, post-treatment radiological response and quantitative CT analysis (QCTA) in a population of patients with severe emphysema submitted to endobronchial valve (EBV) implantation. MATERIALS AND METHODS: Multidetector CT exams of 29 patients were studied, using a thin-section, low-dose protocol without contrast. Two radiologists retrospectively reviewed all images in consensus; fissure completeness was estimated in 5% increments and the post-EBV radiological response (target-lobe atelectasis/volume loss) was evaluated. QCTA was performed on pre- and post-treatment scans using fully automated software. RESULTS: A CT response was present in 16/29 patients. In the negative CT response group, all 13 patients presented incomplete fissures, and mean oblique fissure completeness was 72.8%, against 88.3% in the other group. The most significant QCTA results showed a reduced post-treatment total lung volume (LV) (mean 542 ml), reduced EBV-treated LV (700 ml) and reduced emphysema volume (331.4 ml) in the positive response group, which also showed improved functional tests. CONCLUSION: EBV benefit is most likely in patients who have complete interlobar fissures and develop lobar atelectasis. In patients with no radiological response, we observed a higher prevalence of incomplete fissures and a greater degree of incompleteness. The fully automated QCTA detected the post-treatment alterations, especially in the treated-lung analysis.
Abstract:
High-resolution mass spectrometry (HRMS) has traditionally been associated with qualitative and research analysis, and QQQ-MS with quantitative and routine analysis. This view is now challenged, and for this reason we have evaluated the quantitative LC-MS performance of a new high-resolution mass spectrometer (HRMS), a Q-Orbitrap-MS, and compared the results with those obtained on a recent triple-quadrupole MS (QQQ-MS). High-resolution full-scan (HR-FS) and MS/MS acquisitions were tested with real plasma extracts or pure standards. Limits of detection, dynamic range, mass accuracy and false positive or false negative detections were determined or investigated with protease inhibitors, tyrosine kinase inhibitors, steroids and metanephrines. Our quantitative results show that today's available HRMS instruments are reliable and sensitive quantitative instruments, comparable to QQQ-MS in quantitative performance. Taking into account their versatility, user-friendliness and robustness, we believe that HRMS should increasingly be seen as key instruments in quantitative LC-MS analyses. In this scenario, most targeted LC-HRMS analyses should be performed by HR-FS, recording virtually "all" ions. In addition to absolute quantifications, HR-FS will allow the relative quantification of hundreds of metabolites in plasma, revealing an individual's metabolome and exposome. This phenotyping of known metabolites should promote HRMS in the clinical environment. A few other LC-HRMS analyses should be performed in single-ion-monitoring or MS/MS mode when increased sensitivity and/or detection selectivity is necessary.
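Mass accuracy, one of the figures of merit evaluated above, is conventionally reported in parts per million. The abstract does not spell out the formula, so the sketch below uses the standard ppm definition; the function name is illustrative:

```python
def mass_accuracy_ppm(measured_mz, theoretical_mz):
    """Standard ppm mass-error definition: relative deviation of the
    measured m/z from the theoretical m/z, scaled by 1e6."""
    return (measured_mz - theoretical_mz) / theoretical_mz * 1e6
```

For example, a measured m/z of 500.0005 against a theoretical 500.0000 corresponds to an error of about 1 ppm.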
Abstract:
This paper presents a first analysis of local electronic participatory experiences in Catalonia. The analysis is based on a database constructed and collected by the authors. The paper carries out an explanatory analysis of local initiatives in e-participation and offline participation, taking into account political variables (usually not considered in this kind of analysis) as well as the classical socio-economic variables that characterise municipalities. Hence, we add a quantitative analysis to the numerous case studies on local e-participation experiences. We have chosen Catalonia because it is one of the European regions with the most initiatives and one that has enjoyed considerable local government support for citizen participation initiatives since the 1980s. The paper offers a characterisation of these experiences and a first explanatory analysis, considering: i) the institutional context in which these experiences are embedded, ii) the characteristics of the online citizen participation processes and mechanisms, and iii) a set of explanatory variables composed of the population size, the political affiliation of the mayor, the electoral abstention rate, and the age, income and level of education in the municipality. The model that we present is explanatory for municipalities with more than 20,000 inhabitants but not for those with fewer than 20,000 inhabitants. In fact, the number of participatory activities developed by these latter municipalities is very low. Among all the variables, population size is the most influential. Political variables such as the political party of the mayor and the local abstention rate have a certain influence, but this has to be controlled by population size.
Abstract:
This article presents an analysis of local participatory experiences in Catalonia, both online and in person. The analysis is based on a database set up by the authors. The article carries out an explanatory analysis of local participatory initiatives (on- and offline) taking into account political variables (not usually considered in this kind of analysis) and also classical socio-economic variables that characterize municipalities. Hence, we add a quantitative analysis to the numerous case studies on local e-participation experiences. We have chosen Catalonia because it is one of the European regions with more initiatives and a considerable local government support for citizen participation initiatives since the 1980s. The article offers a characterization of these experiences and an explanatory analysis, considering: (i) the institutional context in which these experiences are embedded, (ii) the citizen participation processes and mechanisms online and (iii) a set of explanatory variables composed of the population size and the province to which the municipality belongs, the political tendency of the mayor, the electoral abstention rate, age, income, level of education, broadband connection and users of the Internet in the municipality. The model that we present is explanatory for municipalities with more than 20,000 inhabitants but it is not for fewer than 20,000 inhabitants. Actually, the majority of these latter municipalities have not developed any participatory activities. Among all the variables, population size is the most influential variable and affects the influence of other variables, such as the political party of the mayor, the local abstention rate and the province.
Abstract:
The main objective of this master's thesis was to quantitatively study the reliability of the market and sales forecasts of a certain company by measuring the bias, precision and accuracy of these forecasts against actual values. Secondly, the differences in bias, precision and accuracy between markets were explained by various macroeconomic variables and market characteristics. The accuracy and precision of the forecasts seem to vary significantly depending on the market being forecasted, the variable being forecasted, the estimation period, the length of the estimated period, the forecast horizon and the granularity of the data. High inflation, low income level and high year-on-year market volatility seem to be related to higher annual market forecast uncertainty, and high year-on-year sales volatility to higher sales forecast uncertainty. When quarterly market size is forecasted, the correlation between macroeconomic variables and forecast errors is reduced. The uncertainty of the sales forecasts cannot be explained with macroeconomic variables. Longer forecasts are more uncertain, a shorter estimated period leads to higher uncertainty, and more recent market forecasts are usually less uncertain. Sales forecasts seem to be more uncertain than market forecasts, because they incorporate both market size and market share risks. When the lead time is more than one year, forecast risk seems to grow as a function of the square root of the forecast horizon. When the lead time is less than a year, sequential error terms are typically correlated, and therefore forecast errors are trending or mean-reverting. The bias of the forecasts seems to change in cycles, and therefore future forecasts cannot be systematically adjusted for it. The MASE (mean absolute scaled error) cannot be used to measure whether a forecast can anticipate year-on-year volatility; instead, we constructed a new relative accuracy measure to cope with this particular situation.
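The reliability measures named above (bias, accuracy, and the MASE) can be sketched with their common textbook definitions; the thesis's exact formulations may differ, and the function names are illustrative:

```python
def forecast_bias(actual, forecast):
    """Mean error: positive values indicate systematic over-forecasting."""
    return sum(f - a for a, f in zip(actual, forecast)) / len(actual)

def mean_absolute_error(actual, forecast):
    """Accuracy as mean absolute error (MAE)."""
    return sum(abs(f - a) for a, f in zip(actual, forecast)) / len(actual)

def mase(actual, forecast):
    """Mean absolute scaled error: MAE scaled by the in-sample MAE
    of the naive (previous-value) forecast."""
    naive_mae = sum(abs(actual[i] - actual[i - 1])
                    for i in range(1, len(actual))) / (len(actual) - 1)
    return mean_absolute_error(actual, forecast) / naive_mae
```

A MASE below 1 means the forecast beat the naive previous-value benchmark on average; a bias near zero with a large MAE indicates imprecise but unbiased forecasts.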
Abstract:
Viruses are among the most important pathogens present in water contaminated with feces or urine and represent a serious risk to human health. Four procedures for concentrating viruses from sewage have been compared in this work, three of which were developed in the present study. Viruses were quantified using PCR techniques. According to statistical analysis and the sensitivity to detect human adenoviruses (HAdV), JC polyomaviruses (JCPyV) and noroviruses genogroup II (NoV GGII): (i) a new procedure (elution and skimmed-milk flocculation procedure (ESMP)) based on the elution of the viruses with glycine-alkaline buffer followed by organic flocculation with skimmed milk was found to be the most efficient method when compared to (ii) ultrafiltration and glycine-alkaline elution, (iii) a lyophilization-based method and (iv) ultracentrifugation and glycine-alkaline elution. Through the analysis of replicate sewage samples, ESMP showed reproducible results with a coefficient of variation (CV) of 16% for HAdV, 12% for JCPyV and 17% for NoV GGII. Using spiked samples, the viral recoveries were estimated at 30-95% for HAdV, 55-90% for JCPyV and 45-50% for NoV GGII. ESMP was validated in a field study using twelve 24-h composite sewage samples collected in an urban sewage treatment plant in the North of Spain, all of which tested positive, with mean values of HAdV, JCPyV and NoV GGII similar to those observed in other studies. Although all of the methods compared in this work yield consistently high values of virus detection and recovery in urban sewage, some require expensive laboratory equipment. ESMP is an effective low-cost procedure which allows a large number of samples to be processed simultaneously and is easily standardized for routine use in a laboratory working in water monitoring. Moreover, in the present study, the CV was applied and proposed as a parameter to evaluate and compare methods for detecting viruses in sewage samples.
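The coefficient of variation used above to compare method reproducibility is the sample standard deviation relative to the mean, expressed as a percentage. A minimal sketch (function name illustrative):

```python
import statistics

def coefficient_of_variation(replicates):
    """CV (%) = 100 * sample standard deviation / mean,
    computed over replicate quantification results."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)
```

A lower CV across replicate sewage samples indicates a more reproducible concentration procedure, which is how it is used to rank the four methods.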
Abstract:
The main purpose of this work is the identification and quantification of phenolic compounds in coal tar samples from a ceramics factory in Cocal (SC), Brazil. The samples were subjected to preparative-scale liquid chromatography, using Amberlyst A-27™ ion-exchange resin as the stationary phase. The fractions obtained were classified as "acids" and "BN" (bases and neutrals). The identification and quantification of phenols in the acid fraction was performed by gas chromatography coupled to mass spectrometry (GC/MS). Nearly twenty-five phenols were identified in the samples and nine of them were also quantified. The results showed that coal tar contains large quantities of phenolic compounds of industrial interest.
Abstract:
Human activities have resulted in increased nutrient levels in many rivers all over Europe. Sustainable management of river basins demands an assessment of the causes and consequences of human alteration of nutrient flows, together with an evaluation of management options. In the context of an integrated and interdisciplinary environmental assessment (IEA) of nutrient flows, we present and discuss the application of the nutrient emission model MONERIS (MOdelling Nutrient Emissions into River Systems) to the Catalan river basin, La Tordera (north-east Spain), for the period 1996–2002. After a successful calibration and verification process (Nash-Sutcliffe efficiencies E=0.85 for phosphorus and E=0.86 for nitrogen), the application of the model MONERIS proved to be useful in estimating nutrient loads. Crucial for model calibration, in-stream retention was estimated to be about 50 % of nutrient emissions on an annual basis. Through this process, we identified the importance of point sources for phosphorus emissions (about 94% for 1996–2002), and diffuse sources, especially inputs via groundwater, for nitrogen emissions (about 31% for 1996–2002). Despite hurdles related to model structure, observed loads, and input data encountered during the modelling process, MONERIS provided a good representation of the major interannual and spatial patterns in nutrient emissions. An analysis of the model uncertainty and sensitivity to input data indicates that the model MONERIS, even in data-starved Mediterranean catchments, may be profitably used by water managers for evaluating quantitative nutrient emission scenarios for the purpose of managing river basins. As an example of scenario modelling, an analysis of the changes in nutrient emissions through two different future scenarios allowed the identification of a set of relevant measures to reduce nutrient loads.
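The Nash-Sutcliffe efficiency reported above for the MONERIS calibration compares model residuals with the variance of the observations: E = 1 is a perfect fit, while E ≤ 0 means the model predicts no better than the observed mean. A minimal sketch of the standard formula (function name illustrative):

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency:
    E = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot
```

Values of E = 0.85-0.86, as obtained for phosphorus and nitrogen loads, therefore indicate that the model explains most of the observed variance.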
Abstract:
Validation and verification operations encounter various challenges in the product development process. Requirements for an increasing development cycle pace place new demands on the component development process. Verification and validation usually represent the largest activities, consuming up to 40-50% of R&D resources. This research studies validation and verification as part of the case company's component development process. The target is to define a framework that can be used to improve the evaluation and development of validation and verification capability in display module development projects. The definition and background of validation and verification are studied in this research, together with theories of project management, systems, organisational learning and causality. The framework and key findings of this research are presented, and a feedback system based on the framework is defined and implemented at the case company. This research is divided into a theory part and an empirical part: the theory part is conducted as a literature review and the empirical part as a case study, using the constructive and design research methods. A framework for capability evaluation and development was defined and developed as a result of this research. A key finding of this study was that a double-loop learning approach combined with the validation and verification V+ model enables the definition of a feedback reporting solution. As additional results, some minor changes to the validation and verification process were proposed. A few concerns are expressed about the validity and reliability of this study, the most important being the selected research method and the selected model itself. The final state can be normative: the researcher may set study results before the actual study and, in the initial state, may describe expectations for the study. Finally, the reliability and validity of this work are discussed.
Abstract:
Zidovudine (AZT) and stavudine (D4T) are nucleoside reverse transcriptase inhibitors extensively used in patients infected with human immunodeficiency virus (HIV). In order to evaluate the quality of these drugs, two stability-indicating HPLC methods were developed. The validated methods were applied to the quantitative determination of AZT, D4T and their induced degradation products in capsule preparations. The stability studies were conducted under controlled temperature and relative humidity conditions based on the International Conference on Harmonization stability studies protocol for Zone IV areas. Easy sample preparation and low cost make these methods especially useful for quality control and stability studies of AZT and D4T in drug products.
Abstract:
In the context of the evidence-based practices movement, the emphasis on computing effect sizes and combining them via meta-analysis does not preclude the demonstration of functional relations. For the latter aim, we propose augmenting visual analysis to add consistency to decisions made on the existence of a functional relation, without losing sight of the need for a methodological evaluation of what stimuli and reinforcement or punishment are used to control the behavior. Four options for quantification are reviewed, illustrated, and tested with simulated data. These quantifications include comparing the projected baseline with the actual treatment measurements, on the basis of either parametric or nonparametric statistics. The simulated data used to test the quantifications include nine data patterns in terms of the presence and type of effect, comprising ABAB and multiple-baseline designs. Although none of the techniques is completely flawless in terms of detecting a functional relation only when it is present but not when it is absent, an option based on projecting the split-middle trend and considering data variability as in exploratory data analysis proves to be the best performer for most data patterns. We suggest that information on whether a functional relation has been demonstrated should be included in meta-analyses. It is also possible to use as a weight the inverse of the data variability measure used in the quantification for assessing the functional relation. We offer easy-to-use code for open-source software implementing some of the quantifications.
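One of the quantification options above projects a split-middle baseline trend into the treatment phase and compares it with the actual treatment measurements. A simplified sketch of that idea, assuming equally spaced sessions; the authors' actual procedure also incorporates data-variability bands, which are omitted here, and all function names are illustrative:

```python
import statistics

def split_middle_projection(baseline, horizon):
    """Fit a split-middle trend line (through the median point of each
    half of the baseline) and project it `horizon` sessions ahead."""
    n = len(baseline)
    first, second = baseline[:n // 2], baseline[(n + 1) // 2:]
    # x-coordinates: median session index of each half
    x1 = (n // 2 - 1) / 2
    x2 = ((n + 1) // 2 + n - 1) / 2
    slope = (statistics.median(second) - statistics.median(first)) / (x2 - x1)
    intercept = statistics.median(first) - slope * x1
    return [intercept + slope * (n + t) for t in range(horizon)]

def sessions_above_projection(baseline, treatment):
    """Count treatment measurements exceeding the projected baseline trend."""
    projected = split_middle_projection(baseline, len(treatment))
    return sum(1 for t, p in zip(treatment, projected) if t > p)
```

If most treatment points fall beyond the projected baseline trend, the comparison supports (but does not by itself demonstrate) a functional relation.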
Abstract:
The aim of our study was to assess the diagnostic usefulness of gray-level parameters to distinguish osteolytic lesions using radiological images. Materials and Methods: A retrospective study was carried out. A total of 76 skeletal radiographs of osteolytic metastases and 67 radiographs of multiple myeloma were used. The cases were classified into nonflat (MM1 and OL1) and flat bones (MM2 and OL2). These radiological images were analyzed using a computerized method. The parameters calculated were the mean, standard deviation, and coefficient of variation (MGL, SDGL, and CVGL) based on gray-level histogram analysis of a region of interest. Diagnostic utility was quantified by measurement of the parameters on osteolytic metastases and multiple myeloma, yielding quantification of the area under the receiver operating characteristic (ROC) curve (AUC). Results: The flat bone groups (MM2 and OL2) showed significant differences in mean values of MGL (p = 0.048) and SDGL (p = 0.003). The corresponding AUC values were 0.758 for MGL and 0.883 for SDGL in flat bones. In nonflat bones these gray-level parameters showed no diagnostic ability. Conclusion: The gray-level parameters MGL and SDGL show good discriminatory diagnostic ability to distinguish between multiple myeloma and lytic metastases in flat bones.
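The gray-level descriptors above (MGL, SDGL, CVGL) are first-order statistics of the region-of-interest histogram, and the AUC can be estimated empirically from the two groups' parameter values. A minimal sketch under those standard definitions; the study's computerized method is not public, and the function names are illustrative:

```python
import statistics

def gray_level_parameters(roi_pixels):
    """MGL, SDGL, CVGL: mean, standard deviation, and coefficient of
    variation of the gray levels inside a region of interest."""
    mgl = statistics.mean(roi_pixels)
    sdgl = statistics.stdev(roi_pixels)
    return mgl, sdgl, sdgl / mgl

def empirical_auc(group_a_values, group_b_values):
    """Empirical AUC: fraction of (a, b) pairs ranked correctly by the
    parameter, counting ties as 0.5 (equivalent to the Mann-Whitney U
    statistic divided by the number of pairs)."""
    wins = sum((a > b) + 0.5 * (a == b)
               for a in group_a_values for b in group_b_values)
    return wins / (len(group_a_values) * len(group_b_values))
```

An AUC of 0.5 corresponds to no discrimination and 1.0 to perfect separation, so the reported 0.883 for SDGL in flat bones reflects good discriminatory ability.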