142 results for Data Structure Evaluation


Relevance:

90.00%

Publisher:

Abstract:

Much information on the flavonoid content of Brazilian foods has already been obtained; however, this information is spread across scientific publications and unpublished data. The objectives of this work were to compile and evaluate the quality of national flavonoid data according to the United States Department of Agriculture's Data Quality Evaluation System (USDA-DQES), with a few modifications, for future dissemination in the TBCA-USP (Brazilian Food Composition Database). For the compilation, the most abundant compounds in the flavonoid subclasses were considered (flavonols, flavones, isoflavones, flavanones, flavan-3-ols, and anthocyanidins), and analysis of the compounds by HPLC was adopted as the criterion for data inclusion. The evaluation system considers five categories, and the maximum score assigned to each category is 20. For each data point, a confidence code (CC) was attributed (A, B, C, or D), indicating the quality and reliability of the information. Flavonoid data (773) present in 197 Brazilian foods were evaluated. The CC "C" (average) was attributed to 99% of the data and "B" (above average) to 1%. The categories assigned low average scores were number of samples, sampling plan, and analytical quality control (average scores of 2, 5, and 4, respectively). The analytical method category received an average score of 9. The category assigned the highest score was sample handling (average of 20). These results show that researchers need to be aware of the importance of the number of evaluated samples and the sampling plan, and of the complete description and documentation of all steps of methodology execution and analytical quality control. (C) 2010 Elsevier Inc. All rights reserved.
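The scoring scheme described above (five categories, 0-20 points each, mapped to a confidence code) can be sketched as follows; the letter thresholds below are illustrative inventions, not the official DQES cut-offs:

```python
# Sketch of a USDA-style confidence-code assignment: each data point is
# scored 0-20 in five categories and the mean determines a letter code.
# The thresholds below are illustrative, not the official DQES cut-offs.

CATEGORIES = ("number of samples", "sampling plan", "sample handling",
              "analytical method", "analytical quality control")

def confidence_code(scores):
    """Map five category scores (0-20 each) to a confidence code A-D."""
    assert len(scores) == len(CATEGORIES)
    mean = sum(scores) / len(scores)
    if mean >= 15:
        return "A"
    if mean >= 10:
        return "B"
    if mean >= 5:
        return "C"
    return "D"

# The average category scores reported in the abstract yield a "C":
print(confidence_code([2, 5, 20, 9, 4]))
```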

Relevance:

90.00%

Publisher:

Abstract:

The Brazilian Network of Food Data Systems (BRASILFOODS) has maintained the Brazilian Food Composition Database-USP (TBCA-USP) (http://www.fcf.usp.br/tabela) since 1998. Besides the constant compilation, analysis, and update work on the database, the network seeks to innovate by introducing food information that may contribute to decreasing the risk of non-transmissible chronic diseases, such as the profile of carbohydrates and flavonoids in foods. In 2008, individually analyzed carbohydrate data for 112 foods, and 41 data points on the glycemic response produced by foods widely consumed in the country, were included in the TBCA-USP. A total of 773 data points on the different flavonoid subclasses of 197 Brazilian foods were compiled, and the quality of each data point was evaluated according to the USDA's data quality evaluation system. In 2007, BRASILFOODS/USP and INFOODS/FAO organized the 7th International Food Data Conference "Food Composition and Biodiversity". This conference was a unique opportunity for interaction between renowned researchers and participants from several countries, and it allowed the discussion of aspects that may improve the food composition area. During the period, the LATINFOODS Regional Technical Compilation Committee and BRASILFOODS disseminated to Latin America the Form and Manual for Data Compilation, version 2009, offered a Food Composition Data Compilation course, and developed many activities related to data production and compilation. (C) 2010 Elsevier Inc. All rights reserved.

Relevance:

80.00%

Publisher:

Abstract:

This article presents a new proposal for evaluating Primary Health Care (PHC), using Donabedian's systemic approach, with the modification that the evaluation of this service should begin with the Process component, in order to identify the adequacy of the supply of, and the relationships among, the distinct procedures at this level of care. From this analysis, the predictive result should then be sought, allowing the relationship between these components to be established. On this basis, the evaluation of structure acquires meaning for the decision-making process. The approach includes the evaluation of the care network, with primary care as the gateway to the network serving as its focus and target image. It presents a more dynamic and flexible conception of the network and proposes a mixed-methods evaluation, encompassing a quantitative approach based on the existing databases of the Datasus system, complemented by a qualitative approach, allowing a greater understanding of the meaning of the relationships found in the services and in the network. With this methodological design, it will be possible to identify the variables of functioning and organization of PHC and of the service network, making it possible to direct decision making toward improving the quality of primary health care.

Relevance:

80.00%

Publisher:

Abstract:

The power loss reduction in distribution systems (DSs) is a nonlinear and multiobjective problem. Service restoration in DSs is even harder, since it additionally requires a solution in real time. Both DS problems are computationally complex. For large-scale networks, the usual problem formulation has thousands of constraint equations. The node-depth encoding (NDE) enables a modeling of DS problems that eliminates several constraint equations from the usual formulation, making the problem solution simpler. In turn, a multiobjective evolutionary algorithm (EA) based on subpopulation tables adequately models several objectives and constraints, enabling a better exploration of the search space. The combination of the multiobjective EA with the NDE (MEAN) results in the proposed approach for solving DS problems in large-scale networks. Simulation results have shown that the MEAN is able to find adequate restoration plans for a real DS with 3,860 buses and 632 switches in a running time of 0.68 s. Moreover, the MEAN has shown a sublinear running time as a function of the system size. Tests with networks ranging from 632 to 5,166 switches indicate that the MEAN can find network configurations corresponding to a power loss reduction of 27.64% for very large networks while requiring relatively little running time.
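A node-depth encoding represents each tree of the network's spanning forest as a list of (node, depth) pairs in depth-first order, which is what makes subtree transfers between feeders cheap. A minimal, hypothetical sketch (the function name and toy feeder graph are ours, not from the paper):

```python
# Build a node-depth encoding (NDE) for one tree of a spanning forest:
# a DFS ordering of nodes paired with their depth from the root.
# The adjacency list below is a toy radial feeder, not data from the paper.

def node_depth_encoding(tree, root):
    """Return the NDE of `tree` (dict: node -> list of children)
    as a list of (node, depth) pairs in DFS order."""
    nde, stack, seen = [], [(root, 0)], set()
    while stack:
        node, depth = stack.pop()
        if node in seen:
            continue
        seen.add(node)
        nde.append((node, depth))
        # Push children in reverse so they are visited in listed order.
        for child in reversed(tree[node]):
            stack.append((child, depth + 1))
    return nde

feeder = {0: [1, 2], 1: [3], 2: [], 3: []}
print(node_depth_encoding(feeder, 0))  # [(0, 0), (1, 1), (3, 2), (2, 1)]
```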

Relevance:

80.00%

Publisher:

Abstract:

Methods. Data from the Beginning and Ending Supportive Therapy for the Kidney (BEST Kidney) study, a prospective observational study of critically ill patients with severe AKI from 54 ICUs in 23 countries, were analysed. The RIFLE class was determined using observed (o) pre-morbid and estimated (e) baseline SCr values. Agreement was evaluated by correlation coefficients and Bland-Altman plots. A sensitivity analysis by chronic kidney disease (CKD) status was performed. Results. Seventy-six percent of patients (n = 1327) had a pre-morbid baseline SCr, and 1314 had complete data for evaluation. Forty-six percent had CKD. The median (IQR) values were 97 µmol/L (79-150) for oSCr and 88 µmol/L (71-97) for eSCr. The oSCr and eSCr determined at ICU admission and at study enrolment showed only a modest correlation (r = 0.49, r = 0.39). At ICU admission and study enrolment, eSCr misclassified 18.8% and 11.7% of patients as having AKI compared with oSCr. Exclusion of CKD patients improved the correlation between oSCr and eSCr at ICU admission and study enrolment (r = 0.90, r = 0.84), resulting in 6.6% and 4.0% being misclassified, respectively. Conclusions. While limited, estimating baseline SCr by the MDRD equation when pre-morbid SCr is unavailable appears to perform reasonably well for determining RIFLE categories, but only when pre-morbid GFR was near normal. However, in patients with suspected CKD, using the MDRD equation to estimate baseline SCr overestimates the incidence of AKI and should probably not be used. Improved methods to estimate baseline SCr are needed.
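The back-calculation of a baseline SCr can be sketched by inverting the four-variable MDRD equation under an assumed near-normal GFR (75 mL/min/1.73 m² is a common choice); the function name and example inputs are ours:

```python
import math  # imported for clarity; only ** arithmetic is actually used

def estimated_baseline_scr(age, female, black, assumed_gfr=75.0):
    """Back-calculate baseline serum creatinine (µmol/L) from the
    four-variable MDRD equation, assuming a near-normal GFR:
    GFR = 186 * SCr^-1.154 * age^-0.203 * (0.742 if female) * (1.212 if black),
    with SCr in mg/dL.  This is a sketch, not the study's exact procedure.
    """
    factor = 186.0 * age ** -0.203
    if female:
        factor *= 0.742
    if black:
        factor *= 1.212
    scr_mgdl = (assumed_gfr / factor) ** (-1.0 / 1.154)
    return scr_mgdl * 88.4  # convert mg/dL to µmol/L

# A hypothetical 60-year-old non-black male:
print(round(estimated_baseline_scr(age=60, female=False, black=False), 1))
```

The result lands in the mid-90s µmol/L, close to the median oSCr reported above, which is consistent with the "near-normal GFR" caveat in the conclusions.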

Relevance:

80.00%

Publisher:

Abstract:

Moving-least-squares (MLS) surfaces undergoing large deformations need periodic regeneration of the point set (point-set resampling) so as to keep the point-set density quasi-uniform. Previous work by the authors dealt with algebraic MLS surfaces and proposed a resampling strategy based on defining the new points at the intersections of the MLS surface with a suitable set of rays. That strategy has very low memory requirements and is easy to parallelize. In this article, new resampling strategies with reduced CPU-time cost are explored. The basic idea is to choose as the set of rays the lines of a regular Cartesian grid, and to fully exploit this grid: as a data structure for search queries, as a spatial structure for traversing the surface in a continuation-like algorithm, and also as an approximation grid for an interpolated version of the MLS surface. It is shown that in this way a very simple and compact resampling technique is obtained, which cuts the resampling cost in half with affordable memory requirements.
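The "grid as a data structure for search queries" idea can be illustrated with a uniform spatial hash: points are binned into cells, and a neighbor query only inspects the 27 surrounding cells. A minimal 3D sketch (cell size, names, and points are illustrative, not from the article):

```python
from collections import defaultdict
from itertools import product
import math

def build_grid(points, h):
    """Bin 3D points into a uniform Cartesian grid with cell size h."""
    grid = defaultdict(list)
    for i, (x, y, z) in enumerate(points):
        grid[(int(x // h), int(y // h), int(z // h))].append(i)
    return grid

def neighbors_within(points, grid, h, q, r):
    """Indices of points within distance r of query point q (assumes r <= h,
    so checking the 3x3x3 block of cells around q suffices)."""
    cx, cy, cz = (int(c // h) for c in q)
    found = []
    for dx, dy, dz in product((-1, 0, 1), repeat=3):
        for i in grid.get((cx + dx, cy + dy, cz + dz), ()):
            if math.dist(points[i], q) <= r:
                found.append(i)
    return found

pts = [(0.1, 0.1, 0.1), (0.9, 0.1, 0.1), (2.5, 2.5, 2.5)]
g = build_grid(pts, h=1.0)
print(neighbors_within(pts, g, 1.0, (0.0, 0.0, 0.0), 0.5))  # [0]
```

The same grid can then double as the traversal structure and interpolation lattice, which is the economy the article exploits.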

Relevance:

80.00%

Publisher:

Abstract:

Given two strings A and B of lengths n_a and n_b, respectively, with n_a <= n_b, the all-substrings longest common subsequence (ALCS) problem obtains, for every substring B' of B, the length of the longest string that is a subsequence of both A and B'. The ALCS problem has many applications, such as finding approximate tandem repeats in strings, solving the circular alignment of two strings, and finding the alignment of one string with several others that share a common substring. We present an algorithm to prepare the basic data structure for ALCS queries that takes O(n_a n_b) time and O(n_a + n_b) space. After this preparation, it is possible to build a matrix of size O(n_b^2) that allows any LCS length to be retrieved in constant time. Some trade-offs between the space required and the querying time are discussed. To our knowledge, this is the first algorithm in the literature for the ALCS problem. (C) 2007 Elsevier B.V. All rights reserved.
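For reference, the classic O(n_a n_b) dynamic program for a single LCS length looks like this; the ALCS structure described above effectively generalizes such a table so that the answer for every substring B' can be read off in constant time (this sketch is the textbook algorithm, not the paper's):

```python
def lcs_length(a, b):
    """Length of the longest common subsequence of strings a and b,
    via the classic O(len(a) * len(b)) dynamic program with O(len(b)) space."""
    prev = [0] * (len(b) + 1)
    for ca in a:
        cur = [0]
        for j, cb in enumerate(b, 1):
            # Extend a match, or carry the best of the two shorter prefixes.
            cur.append(prev[j - 1] + 1 if ca == cb else max(prev[j], cur[j - 1]))
        prev = cur
    return prev[len(b)]

print(lcs_length("AGCAT", "GAC"))  # 2 ("GC" or "AC")
```

A naive ALCS would rerun this for every substring B[i:j], which motivates the paper's single O(n_a n_b)-time preparation.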

Relevance:

80.00%

Publisher:

Abstract:

Usually, a Petri net is applied as a modeling tool for RFID systems. This paper, in contrast, presents another approach to Petri nets for RFID systems. This approach, called elementary Petri net inside an RFID distributed database (PNRD), is a first step toward improving the integration of RFID and control systems, based on a formal data structure to identify and update the product state during real-time process execution, allowing automatic discovery of unexpected events during tag data capture. There are two main features in this approach: the use of RFID tags as a distributed database of the expected object process and of the last product state; and the application of Petri net analysis to automatically update the last-product-state registry during reader data capture. In Petri net terms, RFID reader data capture can be viewed as a direct analysis of the locality of a specific transition that fires within a specific workflow. Following this direction, each RFID reader stores the Petri net control vector list related to each tag ID it is expected to perceive. This paper presents the PNRD cornerstones and a PNRD implementation example in software called DEMIS (Distributed Environment in Manufacturing Information Systems).
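The elementary Petri net semantics underlying this approach can be sketched in a few lines: a marking assigns tokens to places, and a transition fires only when all its input places are marked, which is how an unexpected tag read (a disabled transition) is detected. The net below is a toy product workflow, not the paper's PNRD model:

```python
# A minimal elementary Petri net sketch: places hold token counts, and a
# transition fires when every input place has a token, consuming one token
# from each input and producing one in each output.

def fire(marking, transition):
    """Fire `transition` = (inputs, outputs) on `marking` (dict place -> tokens).
    Returns the new marking, or None if the transition is not enabled,
    which in the PNRD setting would flag an unexpected event."""
    inputs, outputs = transition
    if any(marking.get(p, 0) < 1 for p in inputs):
        return None  # disabled: the observed read does not match the workflow
    new = dict(marking)
    for p in inputs:
        new[p] -= 1
    for p in outputs:
        new[p] = new.get(p, 0) + 1
    return new

received_to_assembled = (["received"], ["assembled"])
m0 = {"received": 1, "assembled": 0}
print(fire(m0, received_to_assembled))  # {'received': 0, 'assembled': 1}
```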

Relevance:

50.00%

Publisher:

Abstract:

The aim of this in vitro study was to evaluate four different approaches to the decision of whether or not to replace defective amalgam restorations in first primary molar teeth, with respect to the loss of dental structure. Ditched amalgam restorations (n = 11) were submitted to four different treatments, as follows: Control group - polishing and finishing of the restorations were carried out; Amalgam group - the ditched amalgam restorations were replaced by new amalgam restorations; Composite resin group - the initial amalgam restorations were replaced by composite resin restorations; Flowable resin group - the ditching around the amalgam restorations was filled with flowable resin. Images of the sectioned teeth were made, and the area of the cavities before and after the procedures was determined by image analysis software to assess structural loss. The data were submitted to ANOVA complemented by the Student-Newman-Keuls test (p < 0.05). The cavities in all the groups presented significantly greater areas after the procedures. However, the amalgam group showed more substantial dental loss. The other three groups presented no statistically significant difference in dental structure loss after the re-treatments. Thus, replacing ditched amalgam restorations with other similar restorations resulted in significant dental structure loss, whereas maintaining them or replacing them with resin restorations did not.

Relevance:

50.00%

Publisher:

Abstract:

Adult rats submitted to perinatal salt overload present renin-angiotensin system (RAS) functional disturbances. The RAS contributes to renal development and to renal damage in the 5/6 nephrectomy model. The aim of the present study was to analyze the renal structure and function of offspring from dams that received a high-salt intake during pregnancy and lactation. We also evaluated the influence of the prenatal high-salt intake on the evolution of 5/6 nephrectomy in adult rats. A total of 111 sixty-day-old rat pups from dams that received saline or water during pregnancy and lactation were submitted to 5/6 nephrectomy (nephrectomized) or to a sham operation (sham). The animals were killed 120 days after surgery, and the kidneys were removed for immunohistochemical and histological analysis. Systolic blood pressure (SBP), albuminuria, and glomerular filtration rate (GFR) were evaluated. Increased SBP and albuminuria and decreased GFR were observed before surgery in the rats from dams submitted to high-salt intake. However, there was no difference in these parameters between the groups after the 5/6 nephrectomy. The scores for tubulointerstitial lesions and glomerulosclerosis were higher in the rats from the sham saline group compared to same-age control rats, but there was no difference in the histological findings between the groups of nephrectomized rats. In conclusion, our data showed that high-salt intake during pregnancy and lactation in rats leads to structural changes in the kidney of adult offspring. However, the progression of the renal lesions after 5/6 nephrectomy was similar in both groups.

Relevance:

40.00%

Publisher:

Abstract:

Background: Genome-wide association studies (GWAS) are becoming the approach of choice to identify genetic determinants of complex phenotypes and common diseases. The astonishing amount of generated data and the use of distinct genotyping platforms with variable genomic coverage are still analytical challenges. Imputation algorithms combine directly genotyped marker information with the haplotypic structure of the population of interest to infer poorly genotyped or missing markers, and are considered a near-zero-cost approach to allow the comparison and combination of data generated in different studies. Several reports have stated that imputed markers have an overall acceptable accuracy, but no published report has performed a pairwise comparison of imputed and empirical association statistics for a complete set of GWAS markers. Results: In this report we identified a total of 73 imputed markers that yielded a nominally statistically significant association at P < 10^-5 for type 2 diabetes mellitus and compared them with results obtained from empirical allelic frequencies. Interestingly, despite their overall high correlation, association statistics based on imputed frequencies were discordant in 35 of the 73 (47%) associated markers, considerably inflating the type I error rate of imputed markers. We comprehensively tested several quality thresholds, the haplotypic structure underlying imputed markers, and the use of flanking markers as predictors of inaccurate association statistics derived from imputed markers. Conclusions: Our results suggest that association statistics from imputed markers in a specific MAF (minor allele frequency) range, located in weak linkage disequilibrium blocks, or strongly deviating from local patterns of association are prone to inflated false positive association signals. 
The present study highlights the potential of imputation procedures and proposes simple procedures for selecting the best imputed markers for follow-up genotyping studies.
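The kind of discordance discussed above can be reproduced with the basic allelic chi-square test on 2x2 allele-count tables; the counts below are invented for illustration, showing how a shift in one cell (e.g. from imputation error) changes the test statistic:

```python
def allelic_chi2(case_minor, case_major, ctrl_minor, ctrl_major):
    """Pearson chi-square statistic (1 df) for a 2x2 allele-count table."""
    table = [[case_minor, case_major], [ctrl_minor, ctrl_major]]
    n = sum(sum(row) for row in table)
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            # Expected count under independence: row total * column total / n.
            expected = sum(table[i]) * (table[0][j] + table[1][j]) / n
            chi2 += (table[i][j] - expected) ** 2 / expected
    return chi2

# Hypothetical empirical vs imputed control counts for the same marker:
print(round(allelic_chi2(300, 700, 240, 760), 2))  # strongly associated
print(round(allelic_chi2(300, 700, 270, 730), 2))  # association washes out
```

A small imputation-driven shift in control allele counts moves the statistic across the significance threshold, which is the inflation mechanism the report quantifies.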

Relevance:

40.00%

Publisher:

Abstract:

Grass reference evapotranspiration (ETo) is an important agrometeorological parameter for climatological and hydrological studies, as well as for irrigation planning and management. There are several methods to estimate ETo, but their performance in different environments varies, since all of them have some empirical background. The FAO Penman-Monteith (FAO PM) method has been considered a universal standard for estimating ETo for more than a decade. This method considers many parameters related to the evapotranspiration process: net radiation (Rn), air temperature (T), vapor pressure deficit (Δe), and wind speed (U); and it has presented very good results when compared to data from lysimeters populated with short grass or alfalfa. In some conditions, the use of the FAO PM method is restricted by the lack of input variables. In these cases, when data are missing, the option is to calculate ETo by the FAO PM method using estimated input variables, as recommended by FAO Irrigation and Drainage Paper 56. Based on that, the objective of this study was to evaluate the performance of the FAO PM method to estimate ETo when Rn, Δe, and U data are missing, in Southern Ontario, Canada. Other alternative methods were also tested for the region: Priestley-Taylor, Hargreaves, and Thornthwaite. Data from 12 locations across Southern Ontario, Canada, were used to compare ETo estimated by the FAO PM method with a complete data set and with missing data. The alternative ETo equations were also tested and calibrated for each location. When relative humidity (RH) and U data were missing, the FAO PM method was still a very good option for estimating ETo for Southern Ontario, with RMSE smaller than 0.53 mm day^-1. For these cases, U data were replaced by the normal values for the region and Δe was estimated from temperature data. 
The Priestley-Taylor method was also a good option for estimating ETo when U and Δe data were missing, mainly when calibrated locally (RMSE = 0.40 mm day^-1). When Rn was missing, the FAO PM method was not good enough for estimating ETo, with RMSE increasing to 0.79 mm day^-1. When only T data were available, the adjusted Hargreaves and modified Thornthwaite methods were better options for estimating ETo than the FAO PM method, since the RMSEs from these methods, respectively 0.79 and 0.83 mm day^-1, were significantly smaller than that obtained by FAO PM (RMSE = 1.12 mm day^-1). (C) 2009 Elsevier B.V. All rights reserved.
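As an illustration of the temperature-only alternative, the standard Hargreaves-Samani equation estimates ETo from daily temperature extremes and extraterrestrial radiation Ra; a sketch (the coefficients are the standard published ones, the input values are invented, and this is the base equation, not the locally adjusted version used in the study):

```python
def hargreaves_eto(t_mean, t_max, t_min, ra):
    """Hargreaves-Samani reference evapotranspiration (mm day^-1).
    Temperatures in deg C; ra is extraterrestrial radiation expressed
    in mm day^-1 of equivalent evaporation."""
    return 0.0023 * (t_mean + 17.8) * (t_max - t_min) ** 0.5 * ra

# A hypothetical summer day in Southern Ontario:
print(round(hargreaves_eto(t_mean=20.0, t_max=26.0, t_min=14.0, ra=15.0), 2))
```

The diurnal range (t_max - t_min) serves as a proxy for radiation and humidity, which is why the method degrades in regions where that proxy is weak and benefits from the local calibration the study performs.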

Relevance:

40.00%

Publisher:

Abstract:

The goal of this work was to study the liquid crystalline structure of a nanodispersion delivery system intended for use in photodynamic therapy after loading with photosensitizers (PSs) and additives such as preservatives and thickening polymers. Polarized light microscopy and light scattering were performed on a standard nanodispersion in order to determine the anisotropy of the liquid crystalline structure and the mean diameter of the nanoparticles, respectively. Small-angle X-ray diffraction (SAXRD) was used to verify the influence of drug loading and additives on the liquid crystalline structure of the nanodispersions. The samples, before and after the addition of PSs and additives, were stable over 90 days, as verified by dynamic light scattering. SAXRD revealed that, despite the alterations observed in some of the samples analyzed in the presence of photosensitizing drugs and additives, the hexagonal liquid crystalline phase was retained. (C) 2011 Wiley-Liss, Inc. and the American Pharmacists Association. J Pharm Sci 100: 2849-2857, 2011.

Relevance:

40.00%

Publisher:

Abstract:

Proteinuria has been associated with cardiovascular events and mortality in community-based cohorts. The association of proteinuria with mortality and cardiovascular events in patients undergoing percutaneous coronary intervention (PCI) was unknown. The association of urinary dipstick proteinuria with mortality and cardiovascular events (a composite of death, myocardial infarction, or nonhemorrhagic stroke) in 5,835 subjects of the EXCITE trial was evaluated. Dipstick urinalysis was performed before PCI, and proteinuria was defined as trace or greater. Subjects were followed up for 210 days (7 months) after enrollment for the occurrence of events. Multivariate Cox regression analysis evaluated the independent association of proteinuria with each outcome. Mean age was 59 years, 21% were women, 18% had diabetes mellitus, and mean estimated glomerular filtration rate was 90 ml/min/1.73 m(2). Proteinuria was present in 750 patients (13%). During follow-up, 22 subjects (2.9%) with proteinuria and 54 subjects (1.1%) without proteinuria died (adjusted hazard ratio 2.83, 95% confidence interval [CI] 1.65 to 4.84, p <0.001). The severity of proteinuria attenuated the strength of the association with mortality after PCI (low-grade proteinuria, hazard ratio 2.67, 95% CI 1.50 to 4.75; high-grade proteinuria, hazard ratio 3.76, 95% CI 1.24 to 11.37). No significant association was present for cardiovascular events during the relatively short follow-up, but high-grade proteinuria tended toward increased risk of cardiovascular events (hazard ratio 1.45, 95% CI 0.81 to 2.61). In conclusion, proteinuria was strongly and independently associated with mortality in patients undergoing PCI. These data suggest that a relatively simple, clinically easy-to-use tool such as the urinary dipstick may be useful for identifying and treating patients at high risk of mortality at the time of PCI. (C) 2008 Elsevier Inc. All rights reserved. (Am J Cardiol 2008;102:1151-1155)

Relevance:

40.00%

Publisher:

Abstract:

In this study, blood serum trace elements and biochemical and hematological parameters were obtained to assess the health status of an elderly population residing in São Paulo city, SP, Brazil. The results showed that more than 93% of the studied individuals presented most of the serum trace element concentrations and hematological and biochemical data within the reference values used in clinical laboratories. However, the percentage of elderly subjects presenting recommended low-density lipoprotein (LDL) cholesterol concentrations was low (70%). The study indicated a positive correlation between the concentrations of Zn and LDL-cholesterol (p < 0.06).