996 results for Exponential smoothing methods


Relevance: 20.00%

Abstract:

Objective To evaluate knowledge of diagnostic imaging methods among primary care and medical emergency physicians. Materials and Methods Study conducted with 119 primary care and medical emergency physicians in Montes Claros, MG, Brazil, using a structured questionnaire on general knowledge and indications of imaging methods in common clinical settings. A rate of correct responses ≥ 80% was considered satisfactory. A Poisson regression (PR) model was used in the data analysis. Results Among the 81 individuals who responded to the questionnaire, 65% (n = 53) demonstrated satisfactory general knowledge and 44% (n = 36) gave correct responses regarding indications of imaging methods. Notably, 65% (n = 53) and 51% (n = 41) of the respondents believed that radiography and computed tomography, respectively, do not use ionizing radiation. The prevalence of satisfactory general knowledge about imaging methods was associated with medical residency in the respondent's field of work (PR = 4.55; 95% CI: 1.18-16.67; p = 0.03), while the prevalence of correct responses regarding indications of imaging methods was associated with professional practice in primary health care (PR = 1.79; 95% CI: 1.16-2.70; p = 0.01). Conclusion Major deficiencies were observed in physicians' knowledge of imaging methods, with better results obtained by those involved in primary health care and by residents.
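The prevalence ratios reported above can be illustrated with a minimal sketch of how a PR and its 95% CI are derived from a 2x2 table. The counts below are purely hypothetical, and the log-based standard error is one common approximation, not necessarily the exact output of the study's Poisson regression (which also adjusted for covariates):

```python
import math

def prevalence_ratio(a, b, c, d, z=1.96):
    """Prevalence ratio with an approximate 95% CI from a 2x2 table:
    a/b = outcome present/absent among the exposed group,
    c/d = outcome present/absent among the unexposed group."""
    p1 = a / (a + b)                     # prevalence among exposed
    p0 = c / (c + d)                     # prevalence among unexposed
    pr = p1 / p0
    # Standard error of ln(PR), the usual delta-method approximation.
    se = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    lo = math.exp(math.log(pr) - z * se)
    hi = math.exp(math.log(pr) + z * se)
    return pr, lo, hi

# Hypothetical counts: 20/30 exposed vs 10/30 unexposed have the outcome.
pr, lo, hi = prevalence_ratio(20, 10, 10, 20)
```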

Relevance: 20.00%

Abstract:

The most suitable method for estimating size diversity is investigated. Size diversity is computed on the basis of the Shannon diversity expression adapted for continuous variables such as size. It takes the form of an integral involving the probability density function (pdf) of the size of the individuals. Different approaches to estimating the pdf are compared: parametric methods, which assume that the data come from a particular family of pdfs, and nonparametric methods, where the pdf is estimated by some kind of local evaluation. Exponential, generalized Pareto, normal, and log-normal distributions were used to generate simulated samples using parameters estimated from real samples. The nonparametric methods include discrete computation of data histograms based on size intervals and continuous kernel estimation of the pdf. The kernel approach gives accurate estimates of size diversity, whereas parametric methods are only useful when the reference distribution has a shape similar to the real one. Special attention is given to data standardization. Division of the data by the sample geometric mean is proposed as the most suitable standardization method, which offers additional advantages: the same size diversity value is obtained whether original or log-transformed size data are used, and size measurements of different dimensionality (lengths, areas, volumes or biomasses) can be compared directly by simply adding ln k, where k is the dimensionality (1, 2, or 3, respectively). Thus, kernel estimation, after standardization by the sample geometric mean, emerges as the most reliable and generalizable method for evaluating size diversity.
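The kernel-based estimator described above can be sketched as follows: standardize sizes by the sample geometric mean (equivalently, center the log-sizes), fit a Gaussian kernel density, and integrate -p ln p numerically. This is a minimal illustration, not the authors' code; the bandwidth and grid settings are arbitrary choices:

```python
import math

def size_diversity(sizes, bandwidth=0.25, grid_points=512):
    """Shannon size diversity H = -integral of p(x) ln p(x) dx, computed on
    geometric-mean-standardized, log-transformed sizes via a Gaussian
    kernel density estimate evaluated on a regular grid."""
    n = len(sizes)
    # Standardize by the sample geometric mean, then log-transform:
    # ln(size / geometric_mean) = ln(size) - mean(ln(size)).
    log_sizes = [math.log(s) for s in sizes]
    log_gm = sum(log_sizes) / n
    x = [ls - log_gm for ls in log_sizes]

    lo, hi = min(x) - 3 * bandwidth, max(x) + 3 * bandwidth
    step = (hi - lo) / (grid_points - 1)
    h = 0.0
    for i in range(grid_points):
        g = lo + i * step
        # Gaussian kernel density estimate at grid point g.
        p = sum(math.exp(-0.5 * ((g - xi) / bandwidth) ** 2)
                for xi in x) / (n * bandwidth * math.sqrt(2 * math.pi))
        if p > 0:
            h -= p * math.log(p) * step      # numerical integral of -p ln p
    return h
```

For log-normally distributed sizes with unit log-variance, the estimate should land near the theoretical entropy 0.5 ln(2*pi*e) of a standard normal, about 1.42.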

Relevance: 20.00%

Abstract:

Aim: Emerging polyploids may depend on environmental niche shifts for successful establishment. Using the alpine plant Ranunculus kuepferi as a model system, we explore the niche shift hypothesis at different spatial resolutions and in contrasting parts of the species' range. Location: European Alps. Methods: We sampled 12 individuals from each of 102 populations of R. kuepferi across the Alps, determined their ploidy levels, derived coarse-grain (100 x 100 m) environmental descriptors for all sampling sites by downscaling WorldClim maps, and calculated fine-grain (2 x 2 m) environmental descriptors from indicator values of the vegetation accompanying the sampled individuals. Both coarse- and fine-grain variables were also computed for 8,239 vegetation plots from across the Alps. Subsequently, we compared the niche optima and breadths of the diploid and tetraploid cytotypes by combining principal components analysis and kernel smoothing procedures. Comparisons were made separately for the coarse- and fine-grain data sets and for sympatric, allopatric and the total set of populations. Results: All comparisons indicate that the niches of the two cytotypes differ in optima and/or breadths, but the results vary in important details. The whole-range analysis suggests that differentiation along the temperature gradient is most important. However, the sympatric comparisons indicate that this climatic shift was not a direct response to competition with the diploid ancestors. Moreover, the fine-grain analyses demonstrate a niche contraction of tetraploids, especially in the sympatric range, that goes undetected with coarse-grain data. Main conclusions: Although the niche optima of the two cytotypes differ, separation along ecological gradients was probably less decisive for polyploid establishment than a shift towards facultative apomixis, a particularly effective strategy to avoid minority cytotype exclusion. In addition, our results suggest that coarse-grain analyses overestimate the niche breadths of widely distributed taxa. Niche comparison analyses should hence be conducted at environmental data resolutions appropriate for the organism and question under study.

Relevance: 20.00%

Abstract:

Contemporary public administrations have become increasingly complex, having to coordinate actions with emerging actors in the public and private spheres. In this scenario, modern ICTs have come to be seen as an ideal vehicle for resolving some of the problems of public administration. We argue that there is a clear need to explore the extent to which public administrations are undergoing a process of transformation towards network government linked to the systematic incorporation of ICTs into their basic activities. Through a critical analysis of a selection of e-government evaluation reports, we conclude that further research is needed if we are to build a solid government assessment framework based on network-like organisational characteristics.

Relevance: 20.00%

Abstract:

The User-centered design (UCD) game is a tool for human-computer interaction practitioners to demonstrate the key user-centered design methods and how they interrelate in the design process in an interactive and participatory manner. The target audiences are departments and institutions unfamiliar with UCD but whose work is related to the definition, creation, and update of a product or service.

Relevance: 20.00%

Abstract:

Molecular probe techniques have made important contributions to determining the microstructure of surfactant assemblies, such as their size, stability, micropolarity and conformation. Conductivity and surface tension were used to determine the critical aggregation concentration (cac) of polymer-surfactant complexes and the critical micelle concentration (cmc) of aqueous micellar aggregates. The results are compared with those of fluorescence techniques. Several surfactant systems were examined: 1-butanol-sodium dodecyl sulfate (SDS) mixtures, and solutions containing poly(ethylene oxide)-SDS, poly(vinylpyrrolidone)-SDS and poly(acrylic acid)-alkyltrimethylammonium bromide complexes. We found differences between the cac and cmc values obtained by conductivity or surface tension and those obtained by techniques that use hydrophobic probes.
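One standard way to read a cmc (or cac) off a conductivity curve is to locate the break in slope between the pre- and post-micellar regimes. A minimal sketch, not the authors' procedure: fit two straight lines on either side of every candidate split point and take the intersection of the best-fitting pair:

```python
def fit_line(xs, ys):
    """Ordinary least-squares slope, intercept and squared error for one segment."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    a = my - b * mx
    sse = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    return a, b, sse

def cmc_from_conductivity(conc, kappa):
    """Locate the slope break in a conductivity-vs-concentration data set by
    trying every split and keeping the two-line fit with the lowest total
    squared error; the cmc is where the two fitted lines intersect."""
    best = None
    for k in range(3, len(conc) - 2):        # each segment needs >= 3 points
        a1, b1, e1 = fit_line(conc[:k], kappa[:k])
        a2, b2, e2 = fit_line(conc[k:], kappa[k:])
        if best is None or e1 + e2 < best[0]:
            best = (e1 + e2, a1, b1, a2, b2)
    _, a1, b1, a2, b2 = best
    return (a2 - a1) / (b1 - b2)             # x where the two lines cross
```

With noiseless synthetic data that changes slope at 8 (arbitrary concentration units), the routine recovers the breakpoint exactly.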

Relevance: 20.00%

Abstract:

Seaports play an important part in the wellbeing of a nation. Many nations are highly dependent on foreign trade, and most trade is carried by sea vessels. This study is part of a larger research project in which a simulation model is required for further analyses of Finnish macro-logistical networks. The objective of this study is to create a system dynamics simulation model that gives an accurate forecast of the development of demand for Finnish seaports up to 2030. The emphasis of this study is on showing how a detailed harbour demand system dynamics model can be created with the help of statistical methods. The forecasting methods used were ARIMA (autoregressive integrated moving average) and regression models. The resulting simulation model gives a forecast with confidence intervals and allows different scenarios to be studied. The building process was found to be useful, and the model can be expanded to be more detailed. Required capacity for other parts of the Finnish logistical system could easily be included in the model.
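As an illustration of forecasting with confidence intervals, the sketch below fits a plain linear trend by least squares and extrapolates with approximate 95% prediction intervals. This is a deliberately simplified stand-in for the ARIMA and regression models actually used in the study:

```python
import math

def trend_forecast(y, horizon, z=1.96):
    """Fit a linear trend y_t = a + b*t by least squares and forecast
    `horizon` steps ahead, returning (point, lower, upper) triples with
    approximate 95% prediction intervals (normal approximation)."""
    n = len(y)
    t = list(range(n))
    mt, my = sum(t) / n, sum(y) / n
    stt = sum((ti - mt) ** 2 for ti in t)
    b = sum((ti - mt) * (yi - my) for ti, yi in zip(t, y)) / stt
    a = my - b * mt
    # Residual variance with n - 2 degrees of freedom.
    s2 = sum((yi - (a + b * ti)) ** 2 for ti, yi in zip(t, y)) / (n - 2)
    out = []
    for h in range(1, horizon + 1):
        tf = n - 1 + h
        # Prediction variance: residual noise + uncertainty of the fitted line.
        se = math.sqrt(s2 * (1 + 1 / n + (tf - mt) ** 2 / stt))
        yhat = a + b * tf
        out.append((yhat, yhat - z * se, yhat + z * se))
    return out
```

On perfectly linear input the intervals collapse to the point forecast; on noisy data they widen with the horizon, which is the qualitative behaviour the study's scenario analysis relies on.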

Relevance: 20.00%

Abstract:

Construction of multiple sequence alignments is a fundamental task in bioinformatics. Multiple sequence alignments are used as a prerequisite in many bioinformatics methods, and consequently the quality of such methods can depend critically on the quality of the alignment. However, automatic construction of a multiple sequence alignment for a set of remotely related sequences does not always provide biologically relevant alignments. Therefore, there is a need for an objective approach to evaluating the quality of automatically aligned sequences. The profile hidden Markov model is a powerful approach in comparative genomics. In the profile hidden Markov model, the symbol probabilities are estimated at each conserved alignment position. This can increase the dimension of the parameter space and cause an overfitting problem. These two research problems are both related to conservation. We have developed statistical measures for quantifying the conservation of multiple sequence alignments. Two types of methods are considered: those identifying conserved residues in an alignment position, and those calculating positional conservation scores. The positional conservation score was exploited in a statistical prediction model for assessing the quality of multiple sequence alignments. The residue conservation score was used as part of the emission probability estimation method proposed for profile hidden Markov models. The predicted alignment quality scores correlated highly with the correct alignment quality scores, indicating that our method is reliable for assessing the quality of any multiple sequence alignment. Comparison of the emission probability estimation method with the maximum likelihood method showed that the number of estimated parameters in the model was decreased dramatically while the same level of accuracy was maintained. To conclude, we have shown that conservation can be used successfully in the statistical model for alignment quality assessment and in the estimation of emission probabilities in profile hidden Markov models.
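A positional conservation score of the kind described can be sketched as one minus the normalized Shannon entropy of each alignment column. This is one simple variant for illustration, not necessarily the exact measure developed in the thesis:

```python
import math
from collections import Counter

def conservation_scores(alignment):
    """Per-column conservation of a multiple sequence alignment, scored as
    1 - (Shannon entropy of the residue distribution / log2 20), so a fully
    conserved column scores 1.0 and a maximally diverse one approaches 0.
    Gaps ('-') are ignored; an all-gap column scores 0."""
    n_cols = len(alignment[0])
    max_entropy = math.log2(20)              # 20-letter amino-acid alphabet
    scores = []
    for col in range(n_cols):
        residues = [seq[col] for seq in alignment if seq[col] != '-']
        total = len(residues)
        if total == 0:
            scores.append(0.0)
            continue
        counts = Counter(residues)
        entropy = -sum((c / total) * math.log2(c / total)
                       for c in counts.values())
        scores.append(1 - entropy / max_entropy)
    return scores
```

A quality-assessment pipeline like the one described could then aggregate these per-column scores (e.g. their mean) as a feature of the whole alignment.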

Relevance: 20.00%

Abstract:

This work presents a comparison of three analytical methods developed for the simultaneous determination of eight quinolones regulated by the European Union (marbofloxacin, ciprofloxacin, danofloxacin, enrofloxacin, difloxacin, sarafloxacin, oxolinic acid and flumequine) in pig muscle, using liquid chromatography with fluorescence detection (LC-FD), liquid chromatography-mass spectrometry (LC-MS) and liquid chromatography-tandem mass spectrometry (LC-MS/MS). The procedures involve extraction of the quinolones from the tissue, a clean-up and preconcentration step for the analytes by solid-phase extraction, and subsequent liquid chromatographic analysis. The limits of detection ranged from 0.1 to 2.1 ng g−1 for LC-FD, from 0.3 to 1.8 ng g−1 for LC-MS and from 0.2 to 0.3 ng g−1 for LC-MS/MS, while inter- and intra-day variability was below 15% in all cases. Most of these values are notably lower than the maximum residue limits established by the European Union for quinolones in pig tissues. The methods were applied to the determination of quinolones in six different commercial pig muscle samples purchased in supermarkets in the city of Granada (south-east Spain).

Relevance: 20.00%

Abstract:

Throughout history, indigo was derived from various plants, for example Dyer's Woad (Isatis tinctoria L.) in Europe. Synthetic dyes were developed in the 19th century, and nowadays indigo is mainly synthesized from by-products of fossil fuels. Indigo is a so-called vat dye, which means that it must be reduced to its water-soluble leuco form before dyeing. Nowadays, most industrial reduction is performed chemically with sodium dithionite. However, this is considered environmentally unfavourable because degradation products contaminate the waste water. There has therefore been interest in finding new ways to reduce indigo. Possible alternatives to dithionite as the reducing agent are biologically induced reduction and electrochemical reduction. Glucose and other reducing sugars have recently been suggested as environmentally friendly reducing agents for sulphur dyes, and there has also been interest in using glucose to reduce indigo. In spite of the development of several types of processes, very little is known about the mechanism and kinetics of the reduction of indigo. This study aims to investigate the reduction and electrochemical analysis methods of indigo and to give insight into the reduction mechanism. Anthraquinone, as well as its derivative 1,8-dihydroxyanthraquinone, was found to catalyse the glucose-induced reduction of indigo. Anthraquinone introduces a strong catalytic effect, which is explained by invoking a molecular "wedge effect" during co-intercalation of Na+ and anthraquinone into the layered indigo crystal. The study also includes research on the extraction of plant-derived indigo from woad and examines the effect of this method on the yield and purity of the indigo. Purity has conventionally been studied spectrophotometrically, and a new hydrodynamic electrode system is introduced in this study: a vibrating probe is used to follow electrochemically the formation of leuco-indigo with glucose as the reducing agent.

Relevance: 20.00%

Abstract:

In many industrial applications, accurate and fast surface reconstruction is essential for quality control. Variation in surface finishing parameters, such as surface roughness, can reflect defects in a manufacturing process, non-optimal operational efficiency of the product, and reduced product life expectancy. This thesis considers the reconstruction and analysis of high-frequency variation, that is, roughness, on planar surfaces. Standard roughness measures in industry are calculated from surface topography. A fast and non-contact way to obtain surface topography is to apply photometric stereo to estimate surface gradients and to reconstruct the surface by integrating the gradient fields. Alternatively, visual methods, such as statistical measures, fractal dimension and distance transforms, can be used to characterize surface roughness directly from gray-scale images. In this thesis, the accuracy of distance transforms, statistical measures, and fractal dimension in estimating surface roughness from gray-scale images and topographies is evaluated, and the results are contrasted with standard industry roughness measures. In distance transforms, the key idea is that distance values calculated along a highly varying surface are greater than distances calculated along a smoother surface. Statistical measures and fractal dimension are common surface roughness measures. In the experiments, the skewness and variance of the brightness distribution, fractal dimension, and distance transforms exhibited strong linear correlations with standard industry roughness measures. One of the key strengths of the photometric stereo method is its ability to acquire the higher-frequency variation of surfaces. In this thesis, the reconstruction of planar high-frequency varying surfaces is studied in the presence of imaging noise and blur. Two Wiener filter-based methods are proposed, one of which is optimal in the sense of surface power spectral density, given the spectral properties of the imaging noise and blur. Experiments show that the proposed methods preserve the inherent high-frequency variation in the reconstructed surfaces, whereas traditional reconstruction methods typically handle incorrect measurements by smoothing, which dampens the high-frequency variation.
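Two of the image-based roughness indicators mentioned, the variance and skewness of the brightness distribution, can be computed directly from a flattened list of gray-scale pixel values. A minimal sketch (the population formulas; the thesis's exact estimators may differ):

```python
import math

def brightness_roughness(pixels):
    """Variance and skewness of a gray-scale brightness distribution, taking
    a flattened list of pixel intensities. Wider, more asymmetric
    distributions typically indicate rougher surfaces."""
    n = len(pixels)
    mean = sum(pixels) / n
    var = sum((p - mean) ** 2 for p in pixels) / n      # population variance
    std = math.sqrt(var)
    # Standardized third moment (population skewness).
    skew = sum(((p - mean) / std) ** 3 for p in pixels) / n
    return var, skew
```

These scalar descriptors would then be correlated against standard topography-based roughness measures, as done in the experiments above.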

Relevance: 20.00%

Abstract:

Viruses are among the most important pathogens present in water contaminated with feces or urine and represent a serious risk to human health. Four procedures for concentrating viruses from sewage are compared in this work, three of which were developed in the present study. Viruses were quantified using PCR techniques. According to the statistical analysis and the sensitivity in detecting human adenoviruses (HAdV), JC polyomaviruses (JCPyV) and noroviruses genogroup II (NoV GGII): (i) a new procedure, the elution and skimmed-milk flocculation procedure (ESMP), based on elution of the viruses with glycine-alkaline buffer followed by organic flocculation with skimmed milk, was found to be the most efficient method when compared to (ii) ultrafiltration and glycine-alkaline elution, (iii) a lyophilization-based method and (iv) ultracentrifugation and glycine-alkaline elution. Through the analysis of replicate sewage samples, ESMP showed reproducible results, with a coefficient of variation (CV) of 16% for HAdV, 12% for JCPyV and 17% for NoV GGII. Using spiked samples, the viral recoveries were estimated at 30-95% for HAdV, 55-90% for JCPyV and 45-50% for NoV GGII. ESMP was validated in a field study using twelve 24-h composite sewage samples collected at an urban sewage treatment plant in the north of Spain, which yielded 100% positive samples with mean values of HAdV, JCPyV and NoV GGII similar to those observed in other studies. Although all of the methods compared in this work yield consistently high levels of virus detection and recovery in urban sewage, some require expensive laboratory equipment. ESMP is an effective low-cost procedure that allows a large number of samples to be processed simultaneously and is easily standardized for routine use in a water-monitoring laboratory. Moreover, in the present study, the CV was applied and proposed as a parameter for evaluating and comparing methods for detecting viruses in sewage samples.
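The reproducibility metric used above, the coefficient of variation of replicate measurements, is straightforward to compute. A minimal sketch (sample standard deviation over the mean, expressed as a percentage):

```python
import math

def coefficient_of_variation(values):
    """CV (%) of replicate measurements: sample standard deviation
    (n - 1 denominator) divided by the mean, times 100."""
    n = len(values)
    mean = sum(values) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
    return 100 * sd / mean
```

Applied to replicate quantifications of the same sewage sample, a lower CV indicates a more reproducible concentration procedure, which is how the study compares the four methods.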
