925 results for Weighted histogram analysis method


Relevance: 50.00%

Abstract:

Consumers are becoming more concerned about food quality, especially regarding how, when and where foods are produced (Haglund et al., 1999; Kahl et al., 2004; Alföldi et al., 2006). In recent years there has therefore been growing interest in methods for food quality assessment, especially in picture-development methods as a complement to the traditional chemical analysis of single compounds (Kahl et al., 2006). Biocrystallization, one such picture-developing method, is based on the crystallographic phenomenon that aqueous solutions of CuCl2 dihydrate, when crystallized with the addition of organic solutions originating, e.g., from crop samples, generate biocrystallograms with reproducible crystal patterns (Kleber & Steinike-Hartung, 1959). Its output is a crystal pattern on glass plates from which different variables (numbers) can be calculated by image analysis. However, a standardized evaluation method to quantify the morphological features of the biocrystallogram image is lacking. The main aims of this research are therefore (1) to optimize an existing statistical model in order to describe all the effects that contribute to the experiment; (2) to investigate the effect of image parameters on the texture analysis of the biocrystallogram images, i.e., region of interest (ROI), color transformation and histogram matching, on samples from project 020E170/F financed by the Federal Ministry of Food, Agriculture and Consumer Protection (BMELV); the samples are wheat and carrots from controlled field and farm trials; and (3) to relate the strongest texture-parameter effect to the visual evaluation criteria developed by a group of researchers (University of Kassel, Germany; Louis Bolk Institute (LBI), Netherlands; Biodynamic Research Association Denmark (BRAD), Denmark) in order to clarify how the texture parameters and the visual characteristics of an image are related.
The refined statistical model was implemented as an lme model with repeated measurements via crossed effects, programmed in R (version 2.1.0). The validity of the F and P values was checked against the SAS program. While the ANOVA yields the same F values, the P values are larger in R because of its more conservative approach; the refined model yields more significant P values. The optimization of the image analysis deals with the following parameters: ROI (region of interest, the area around the geometrical center), color transformation (calculation of a one-dimensional gray-level value from the three-dimensional color information of the scanned picture, which is necessary for the texture analysis) and histogram matching (normalization of the histogram of the picture to enhance the contrast and to minimize errors from lighting conditions). The samples were wheat from the DOC trial with 4 field replicates for the years 2003 and 2005 and "market samples" (organic and conventional neighbors of the same variety) for 2004 and 2005; carrots obtained from the University of Kassel (2 varieties, 2 nitrogen treatments) for the years 2004, 2005 and 2006; and "market samples" of carrots for the years 2004 and 2005. The criterion for the optimization was repeatability of the differentiation of the samples over the different harvests (years). Different ROIs were found for different samples, reflecting the different pictures. The color transformation that differentiates most efficiently is based on the gray scale, i.e., the equal color transformation. The second dimension of the color transformation appeared only in some years, as an effect of color wavelength (hue), for carrots treated with different nitrate fertilizer levels. The best histogram matching is to the Gaussian distribution. The approach was then to find a connection between the variables from textural image analysis and the different visual criteria.
The relation between the texture parameters and the visual evaluation criteria was examined for the carrot samples in particular, as they could be well differentiated by the texture analysis. It was possible to connect groups of variables from the texture analysis with groups of criteria from the visual evaluation. These selected variables were able to differentiate the samples but not to classify them according to treatment. In contrast, with visual criteria that describe the picture as a whole, classification was possible in 80% of the sample cases. This clearly shows the limits of the single-variable approach of the image (texture) analysis.
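Two of the image-analysis steps described above, the equal color transformation (a gray level computed as the plain average of the color channels) and histogram matching to a Gaussian target, can be sketched as follows. The function names, array shapes and the target mean and standard deviation are illustrative assumptions, not the project's actual implementation:

```python
import numpy as np
from statistics import NormalDist

def to_gray(rgb):
    """Equal color transformation: average the three channels."""
    return rgb.mean(axis=-1)

def match_to_gaussian(gray, mean=0.5, std=0.15):
    """Map the empirical histogram of `gray` onto a Gaussian target.

    Each pixel is replaced by the Gaussian quantile at its empirical
    CDF value, which normalizes contrast across lighting conditions.
    """
    flat = gray.ravel()
    # Rank of every pixel, turned into a CDF value strictly inside (0, 1).
    ranks = flat.argsort().argsort().astype(float)
    cdf = (ranks + 0.5) / flat.size
    nd = NormalDist(mu=mean, sigma=std)
    matched = np.array([nd.inv_cdf(p) for p in cdf])
    return matched.reshape(gray.shape)

rgb = np.random.default_rng(0).random((8, 8, 3))  # stand-in for a scanned image
gray = to_gray(rgb)
out = match_to_gaussian(gray)
```

After matching, the gray-level histogram follows the chosen Gaussian regardless of the input's original brightness distribution.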

Relevance: 50.00%

Abstract:

Starting with logratio biplots for compositional data, which are based on the principle of subcompositional coherence, and then adding weights, as in correspondence analysis, we rediscover Lewi's spectral map and many connections to analyses of two-way tables of non-negative data. Thanks to the weighting, the method also achieves the property of distributional equivalence.
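A numerical sketch of the construction described: log-transform the non-negative table, double-centre it with the row and column masses (the correspondence-analysis weights), and take an SVD. The toy matrix and the coordinate scaling are illustrative assumptions, not the paper's worked example:

```python
import numpy as np

def weighted_logratio_map(N):
    """Weighted log-ratio analysis (spectral map), a sketch.

    N is a strictly positive two-way table. Rows and columns are
    weighted by their masses, the log table is double-centred, and
    the SVD gives principal coordinates.
    """
    P = N / N.sum()
    r = P.sum(axis=1)                      # row masses
    c = P.sum(axis=0)                      # column masses
    L = np.log(P)
    # Weighted double-centring: subtract weighted column and row means.
    Lc = L - r @ L - (L @ c)[:, None] + r @ L @ c
    S = np.sqrt(r)[:, None] * Lc * np.sqrt(c)
    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    rows = U / np.sqrt(r)[:, None] * sv    # principal row coordinates
    cols = Vt.T / np.sqrt(c)[:, None] * sv
    return rows, cols, sv

# Toy non-negative table; any strictly positive matrix works.
N = np.arange(1.0, 21.0).reshape(5, 4)
rows, cols, sv = weighted_logratio_map(N)
```

The double-centring puts the weighted centroid of the row points at the origin and forces the smallest singular value to (numerically) zero, which is the dimension lost to the centring.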

Relevance: 50.00%

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance: 50.00%

Abstract:

I propose that the Last in, First out (LIFO) inventory valuation method needs to be reevaluated. I will evaluate the impact of the LIFO method on the earnings of publicly traded companies with a LIFO reserve over the past 10 years. I will begin my proposal with the history of how LIFO became an acceptable valuation method and discuss its significance within the accounting profession. Next, I will describe the LIFO, First in, First out (FIFO), and weighted average inventory valuation methods and explore the differences among them. More specifically, I will explore the arguments for and against the use of LIFO and the potential shift towards financial standards that do not allow it (a standard adopted and influenced by the International Accounting Standards Board). Data will be collected from Compustat for publicly traded companies (with a LIFO reserve) for the past 10 years. I will document which firms use LIFO, analyze trends relating to LIFO usage and LIFO reserves (the difference in the cost of inventory between LIFO and FIFO), and evaluate the effect on earnings. The purpose of this research is to evaluate the accuracy of LIFO in portraying earnings and to see how much tax has gone uncollected over the years because of its use. Moreover, I will provide an opinion as to whether U.S. GAAP should adopt a standard similar to IFRS and ban the LIFO method.
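The difference between the valuation methods, and hence the LIFO reserve, can be made concrete with a toy periodic-inventory calculation. The figures and the `cogs` helper are hypothetical illustrations, not data from the proposal:

```python
def cogs(layers, units_sold, method):
    """Cost of goods sold for a sequence of purchase layers.

    layers: list of (units, unit_cost) tuples in purchase order.
    LIFO consumes the newest layer first; FIFO the oldest.
    """
    order = layers[::-1] if method == "LIFO" else list(layers)
    cost, left = 0.0, units_sold
    for units, unit_cost in order:
        take = min(units, left)
        cost += take * unit_cost
        left -= take
        if left == 0:
            break
    return cost

# Rising prices: LIFO charges the newest (dearest) units to COGS first.
layers = [(100, 10.0), (100, 12.0), (100, 14.0)]
lifo = cogs(layers, 150, "LIFO")   # 100*14 + 50*12 = 2000
fifo = cogs(layers, 150, "FIFO")   # 100*10 + 50*12 = 1600
lifo_reserve = lifo - fifo         # 400 for this period
```

Under rising prices LIFO reports higher COGS and therefore lower taxable income, which is exactly the tax-deferral effect the proposal sets out to quantify.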

Relevance: 50.00%

Abstract:

Magnetoencephalography (MEG), a non-invasive technique for characterizing brain electrical activity, is gaining popularity as a tool for assessing group-level differences between experimental conditions. One method for assessing task-condition effects involves beamforming, where a weighted sum of field measurements is used to estimate activity on a voxel-by-voxel basis. However, this method has been shown to produce inhomogeneous smoothness differences as a function of signal-to-noise across a volumetric image, which can then produce false positives at the group level. Here we describe a novel method for group-level analysis with MEG beamformer images that utilizes the peak locations within each participant's volumetric image to assess group-level effects. We compared our peak-clustering algorithm with SnPM using simulated data and found that our method was immune to artefactual group effects that can arise from inhomogeneous smoothness differences across a volumetric image. Applying the peak-clustering algorithm to experimental data identified regions corresponding to task-related regions reported in the literature. These findings suggest that our technique is a robust method for group-level analysis with MEG beamformer images.
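A highly simplified sketch of the peak-based idea (not the authors' algorithm, and not SnPM): extract local maxima from each participant's volumetric image, then count how many participants show a peak near each candidate location. The function names, the 6-neighbourhood maximum rule and the support radius are all illustrative assumptions:

```python
import numpy as np

def image_peaks(vol, threshold):
    """Local maxima of a 3-D image above `threshold` (6-neighbourhood)."""
    peaks = []
    for idx in np.argwhere(vol > threshold):
        i, j, k = idx
        neigh = []
        for d in ((1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)):
            n = (i + d[0], j + d[1], k + d[2])
            if all(0 <= n[a] < vol.shape[a] for a in range(3)):
                neigh.append(vol[n])
        if all(vol[i, j, k] > v for v in neigh):
            peaks.append((i, j, k))
    return peaks

def cluster_support(all_peaks, radius):
    """For each peak location, count participants with a peak within `radius`."""
    support = []
    for p in {q for peaks in all_peaks for q in peaks}:
        n = sum(any(np.linalg.norm(np.subtract(p, q)) <= radius for q in peaks)
                for peaks in all_peaks)
        support.append((p, n))
    return sorted(support, key=lambda t: -t[1])

# Three synthetic participants, each with a bump at the same voxel.
vols = []
for _ in range(3):
    v = np.zeros((5, 5, 5))
    v[2, 2, 2] = 1.0
    vols.append(v)
all_peaks = [image_peaks(v, threshold=0.5) for v in vols]
ranked = cluster_support(all_peaks, radius=1.0)
```

Because only peak locations enter the group statistic, voxel-wise smoothness differences across the volume cannot inflate it, which is the property the abstract emphasizes.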

Relevance: 50.00%

Abstract:

This PhD thesis analyses networks of knowledge flows, focusing on the role of indirect ties in the knowledge transfer, knowledge accumulation and knowledge creation processes. It extends and improves existing methods for mapping networks of knowledge flows in two different applications and contributes to two streams of research. To support the underlying idea of the thesis, that an alternative method for ranking indirect network ties can shed new light on the dynamics of knowledge transfer, we apply Ordered Weighted Averaging (OWA) to two different network contexts. Knowledge flows in patent citation networks and in a company supply chain network are analysed using Social Network Analysis (SNA) and the OWA operator. The OWA is used here for the first time (i) to rank indirect citations in patent networks, providing new insight into their role in transferring knowledge among network nodes, and to analyse a long chain of patent generations over 13 years; and (ii) to rank indirect relations in a company supply chain network, to shed light on the role of indirectly connected individuals in the knowledge transfer and creation processes and to contribute to the literature on knowledge management in supply chains. In doing so, indirect ties are measured and their role as a means of knowledge transfer is shown. This thesis thus represents a first attempt to bridge the OWA and SNA fields and to show that the two methods can be used together to enrich the understanding of the role of indirectly connected nodes in a network. More specifically, the OWA scores enrich our understanding of knowledge evolution over time within complex networks. Future research can explore the usefulness of the OWA operator in other complex networks, such as online social networks consisting of thousands of nodes.
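The OWA operator itself is simple: weights attach to rank positions rather than to particular arguments, which is what makes it suitable for scoring ordered collections of tie strengths. A minimal sketch (the weight vector below is an illustrative choice, not one from the thesis):

```python
def owa(values, weights):
    """Ordered Weighted Averaging.

    Values are sorted in descending order and paired with the weights,
    so the first weight always applies to the largest value.
    """
    if len(weights) != len(values) or abs(sum(weights) - 1.0) > 1e-9:
        raise ValueError("weights must match values in length and sum to 1")
    return sum(w * v for w, v in zip(weights, sorted(values, reverse=True)))

# With descending weights, stronger ties dominate the aggregate score.
score = owa([0.2, 0.9, 0.5], [0.5, 0.3, 0.2])  # 0.5*0.9 + 0.3*0.5 + 0.2*0.2
```

Equal weights recover the plain mean; weights concentrated on the first position recover the maximum, so one operator spans the whole and/or aggregation range.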

Relevance: 40.00%

Abstract:

To determine the most adequate number and size of tissue microarray (TMA) cores for pleomorphic adenoma immunohistochemical studies. Eighty-two pleomorphic adenoma cases were distributed in 3 TMA blocks assembled in triplicate containing 1.0-, 2.0-, and 3.0-mm cores. Immunohistochemical analysis against cytokeratin 7, Ki67, p63, and CD34 was performed and subsequently evaluated with the PixelCount, nuclear, and microvessel software applications. Compared with conventional whole-section slides, the 1.0-mm TMA presented lower results than the 2.0- and 3.0-mm TMAs. Possibly because of an increased amount of stromal tissue, the 3.0-mm cores presented a higher microvessel density. Comparing the results obtained with one, two, and three 2.0-mm cores, there was no difference between triplicate or duplicate TMAs and a single-core TMA. Considering the possible loss of cylinders during immunohistochemical reactions, 2.0-mm TMAs in duplicate are the more reliable approach for pleomorphic adenoma immunohistochemical studies.

Relevance: 40.00%

Abstract:

A flow injection method for the quantitative analysis of ketoconazole in tablets, based on the reaction with iron(III) ions, is presented. Ketoconazole forms a red complex with iron ions in an acid medium, with maximum absorbance at 495 nm. The detection limit was estimated at 1×10⁻⁴ mol L⁻¹, the quantitation limit is about 3×10⁻⁴ mol L⁻¹, and approximately 30 determinations can be performed per hour. The results were compared with those obtained with a reference HPLC method; statistical comparisons were made using Student's t procedure and the F test. Complete agreement between the proposed flow injection and HPLC procedures was found at the 95% confidence level. The two methods show similar precision: the mean relative standard deviation was ca. 1.2% for HPLC and ca. 1.6% for FIA.
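The statistical comparison mentioned (Student's t for the means, the F test for the precisions) can be sketched with the classical formulas. The determinations below are invented numbers for illustration, not the paper's data:

```python
from statistics import mean, stdev

def f_ratio(a, b):
    """F statistic for comparing precisions: larger variance over smaller."""
    va, vb = stdev(a) ** 2, stdev(b) ** 2
    return max(va, vb) / min(va, vb)

def t_statistic(a, b):
    """Two-sample Student's t with pooled (equal-variance) standard deviation."""
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2) / (na + nb - 2)
    return (mean(a) - mean(b)) / (sp2 * (1 / na + 1 / nb)) ** 0.5

# Hypothetical determinations (mg per tablet) by the two methods.
fia = [199.1, 200.8, 198.7, 201.2, 200.3]
hplc = [199.6, 200.1, 199.2, 200.9, 199.9]
t = t_statistic(fia, hplc)
F = f_ratio(fia, hplc)
# |t| below the two-tailed critical value 2.306 (8 d.f., 95%) means the
# two methods' means agree at that confidence level.
```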

Relevance: 40.00%

Abstract:

The metrological principles of neutron activation analysis are discussed. It has been demonstrated that this method can provide elemental amount-of-substance values fully traceable to the SI. The method has been used by several laboratories worldwide in a number of CCQM key comparisons (interlaboratory comparison tests at the highest metrological level), supplying results equivalent to values from other methods for elemental or isotopic analysis in complex samples, without the need for chemical destruction and dissolution of these samples. In April 2007 the CCQM therefore accepted the claim that neutron activation analysis should have a status similar to the methods originally listed by the CCQM as "primary methods of measurement". Analytical characteristics and the scope of application are given.

Relevance: 40.00%

Abstract:

This paper proposes a physically non-linear formulation to deal with steel fiber reinforced concrete by the finite element method. The proposed formulation allows the consideration of short or long fibers placed arbitrarily inside a continuum domain (matrix). Its most important feature is that no additional degree of freedom is introduced into the pre-existing finite element numerical system to account for any distribution or quantity of fiber inclusions; in other words, the system of equations used to solve a non-reinforced medium has the same size as the one used to solve its reinforced counterpart. Another important characteristic is the reduced work required from the user to introduce reinforcements, avoiding "rebar" elements, node-by-node geometrical definitions and complex mesh generation. Bonded connection between long fibers and the continuum is considered; for short fibers, a simplified approach is proposed to account for splitting. Non-associative plasticity is adopted for the continuum and one-dimensional plasticity is adopted to model the fibers. Examples are presented to show the capabilities of the formulation.
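The key idea, adding a fiber's stiffness to the host element without creating new unknowns, can be sketched in 2-D with a linear triangle: the fiber-end displacements are interpolated from the element's own nodal displacements, so the truss stiffness condenses onto the existing degrees of freedom. The setting (one triangle, a straight perfectly bonded fiber, no splitting) is an illustrative reduction of the formulation, not the paper's method:

```python
import numpy as np

def bary(tri, p):
    """Barycentric (linear shape-function) values of point p in triangle tri."""
    A = np.vstack([np.ones(3), np.asarray(tri, dtype=float).T])
    return np.linalg.solve(A, np.array([1.0, p[0], p[1]]))

def embedded_fiber_stiffness(tri, p1, p2, EA):
    """6x6 stiffness contribution of a truss fiber p1-p2 embedded in a
    linear triangle, expressed on the triangle's 6 nodal DOFs only."""
    L = np.linalg.norm(np.subtract(p2, p1))
    cx, cy = np.subtract(p2, p1) / L
    # 4x4 truss stiffness in global axes.
    t = np.array([[cx, cy, 0, 0], [0, 0, cx, cy]])
    k_local = EA / L * np.array([[1.0, -1.0], [-1.0, 1.0]])
    k_fiber = t.T @ k_local @ t
    # Interpolate fiber-end displacements from the element nodes.
    H = np.zeros((4, 6))
    for end, p in enumerate((p1, p2)):
        N = bary(tri, p)
        for node in range(3):
            H[2 * end,     2 * node]     = N[node]   # u component
            H[2 * end + 1, 2 * node + 1] = N[node]   # v component
    return H.T @ k_fiber @ H   # added to the host element's stiffness

tri = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
K = embedded_fiber_stiffness(tri, (0.1, 0.1), (0.5, 0.2), EA=10.0)
```

Because the contribution is assembled onto the existing nodal DOFs, the global system keeps the size of the unreinforced mesh, which is the feature the abstract highlights.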

Relevance: 40.00%

Abstract:

Inverse analysis is currently an important subject of study in several fields of science and engineering, as the identification of physical and geometric parameters from experimental measurements is required in many applications. In this work a boundary element formulation to identify boundary and interface values, as well as material properties, is proposed. In particular, the formulation is dedicated to identifying material parameters when a cohesive crack model is assumed for 2D problems. A computer code is developed and implemented using the BEM multi-region technique and regularisation methods to perform the inverse analysis. Several examples demonstrate the efficiency of the proposed model. (C) 2010 Elsevier Ltd. All rights reserved.
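The abstract does not state which regularisation scheme is used; as a generic sketch of why regularisation is needed in such identification problems, here is a Tikhonov-regularised least-squares solve with an invented ill-conditioned forward operator `G` standing in for the BEM system (all names and values are assumptions for illustration):

```python
import numpy as np

def tikhonov(G, d, alpha):
    """Regularised least squares: minimise ||G m - d||^2 + alpha ||m||^2."""
    n = G.shape[1]
    return np.linalg.solve(G.T @ G + alpha * np.eye(n), G.T @ d)

# Hypothetical forward operator mapping unknown parameters to measured
# quantities; the tiny column scales mimic poorly observable parameters.
rng = np.random.default_rng(1)
G = rng.standard_normal((20, 5)) @ np.diag([1.0, 0.5, 0.1, 1e-3, 1e-6])
m_true = np.ones(5)
d = G @ m_true + 1e-4 * rng.standard_normal(20)   # noisy "measurements"
m_hat = tikhonov(G, d, alpha=1e-6)
```

The well-observed parameters are recovered accurately, while the regularisation keeps the nearly unobservable ones from blowing up under measurement noise, which is the practical role regularisation plays in BEM inverse analysis.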

Relevance: 40.00%

Abstract:

In this work, a new boundary element formulation for the analysis of plate-beam interaction is presented. The formulation uses three-nodal-value boundary elements, and each beam element is replaced by its actions on the plate, i.e., a distributed load and end-of-element forces. From the solution of the differential equation of a beam with a linearly distributed load, the plate-beam interaction tractions can be written as a function of the nodal values of the beam. With this transformation, a final system of equations in the nodal displacements of the plate boundary and beam nodes is obtained, from which all unknowns of the plate-beam system are determined. Many examples are analyzed and the results show excellent agreement with analytical solutions and other numerical methods. (C) 2009 Elsevier Ltd. All rights reserved.

Relevance: 40.00%

Abstract:

The Generalized Finite Element Method (GFEM) is employed in this paper for the numerical analysis of three-dimensional solids under nonlinear behavior. A brief summary of the GFEM and a description of the formulation of the hexahedral element based on the proposed enrichment strategy are presented first. Next, to introduce the nonlinear analysis of solids, two constitutive models are briefly reviewed: Lemaitre's model, in which damage and plasticity are coupled, and Mazars's damage model, suitable for concrete under increasing loading. Both models are employed in the framework of a nonlocal approach to ensure solution objectivity. In the numerical analyses carried out, selective enrichment of the approximation at regions of concern in the domain (mainly those with high strain and damage gradients) is exploited. This possibility makes three-dimensional analysis less expensive and more practicable, since the re-meshing resources characteristic of h-adaptivity can be minimized. Moreover, the combination of three-dimensional analysis and selective enrichment provides a valuable tool for a better description of the spread of both damage and plastic strain.
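The enrichment idea behind the GFEM can be sketched in 1-D: the standard hat functions form a partition of unity that is multiplied by extra local functions, so approximation power grows where needed without re-meshing. The node layout and the shifted-monomial enrichment below are illustrative choices, not the paper's hexahedral formulation:

```python
import numpy as np

def hat(x, xi, h):
    """Standard linear FE partition-of-unity function centred at node xi."""
    return np.clip(1 - np.abs(x - xi) / h, 0, None)

def gfem_basis(x, nodes, h):
    """GFEM basis: each hat times the local enrichment set {1, (x-xi)/h}."""
    cols = []
    for xi in nodes:
        N = hat(x, xi, h)
        cols.append(N)                    # standard degree of freedom
        cols.append(N * (x - xi) / h)     # enriched degree of freedom
    return np.column_stack(cols)

# Least-squares fit of a quadratic: the enriched space reproduces it
# exactly on the coarse mesh, which plain linear hats cannot.
nodes = np.linspace(0.0, 1.0, 5)
x = np.linspace(0.0, 1.0, 101)
target = x ** 2
B = gfem_basis(x, nodes, h=0.25)
coef, *_ = np.linalg.lstsq(B, target, rcond=None)
err = np.max(np.abs(B @ coef - target))

B_lin = B[:, ::2]                         # hats only, no enrichment
coef_lin, *_ = np.linalg.lstsq(B_lin, target, rcond=None)
err_lin = np.max(np.abs(B_lin @ coef_lin - target))
```

In the paper's setting the same mechanism enriches hexahedral elements selectively near high strain and damage gradients, instead of globally refining the mesh.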

Relevance: 40.00%

Abstract:

In this paper, a formulation for the representation of stiffeners in plane stress by the boundary element method (BEM) in linear analysis is presented. The strategy is to adopt approximations for the displacements along the central line of the stiffener. With this simplification, spurious oscillations in the stress along stiffeners of small thickness are prevented. Worked examples are analyzed to show the efficiency of these techniques, especially in the insertion of very narrow sub-regions, in which quasi-singular integrals are calculated, with stiffeners that are much stiffer than the main domain. The results obtained with this formulation are very close to those obtained with other formulations. (C) 2007 Elsevier Ltd. All rights reserved.

Relevance: 40.00%

Abstract:

A way of coupling digital image correlation (to measure displacement fields) and the boundary element method (to compute displacements and tractions along a crack surface) is presented herein. It allows for the identification of Young's modulus and of the fracture parameters associated with a cohesive model. The procedure is illustrated by analyzing the latter for an ordinary concrete in a three-point bend test on a notched beam. In view of measurement uncertainties, the results are deemed trustworthy because numerous measurement points are accessible and used as entries to the identification procedure. (C) 2010 Elsevier Ltd. All rights reserved.