945 results for Statistical analysis methods


Relevance: 90.00%

Abstract:

This study investigated the immunodetection of PCNA in epithelial components of dental follicles associated with impacted third molars without radiographic or morphologic signs of pathosis. A total of 105 specimens of dental follicles associated with impacted third molars with incomplete rhizogenesis (between Nolla's stages 6 and 9) were surgically removed from 56 patients. Epithelial cell proliferation was assessed by immunohistochemical labeling, and statistical analysis was performed using the Fisher exact test. Of the 105 dental follicles collected, 6 were PCNA-positive (approximately 6%). The specimens with squamous metaplasia and epithelial hyperplasia had higher rates of positivity for PCNA, as did those with proliferative remnants of odontogenic epithelium. In conclusion, this study shows that dental follicles at this stage of development have low proliferative potential, but it suggests that squamous metaplasia, hyperplasia of the epithelial lining and the presence of proliferative odontogenic epithelial rests in the connective tissue may be early signs of developing lesions of odontogenic origin.
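
As a hedged illustration of the kind of test reported above, the one-sided Fisher exact p-value for a 2x2 table can be computed directly from the hypergeometric distribution. The counts below are invented for the sketch (PCNA-positive vs negative, with vs without metaplasia) and are not the study's data:

```python
from math import comb

def fisher_exact_one_sided(a, b, c, d):
    """One-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]]:
    probability of a count of `a` or larger under the hypergeometric null."""
    row1, row2, col1 = a + b, c + d, a + c
    n = row1 + row2
    denom = comb(n, col1)
    return sum(comb(row1, k) * comb(row2, col1 - k)
               for k in range(a, min(row1, col1) + 1)) / denom

# Invented counts: PCNA-positive vs negative follicles, with vs without
# squamous metaplasia (illustrative only, not the study's data).
p = fisher_exact_one_sided(4, 8, 2, 91)
print(round(p, 4))
```

The two-sided variant used by most statistics packages also sums tables on the other tail whose probability does not exceed that of the observed table.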

Relevance: 90.00%

Abstract:

The importance of nutrient intake in the prevention and treatment of osteoporosis is widely recognized. The objective of the present study was to develop and validate a food frequency questionnaire (FFQ) for women with osteoporosis. The questionnaire comprised 60 items, separated into 10 groups. Relative validation was accomplished by comparing the FFQ with a 3-day food record (3DR). The 3DR was applied to 30 elderly women with confirmed osteoporosis, and after 45 days the FFQ was administered. Statistical analysis comprised the Kolmogorov-Smirnov test, Student's t-test and the Pearson correlation coefficient. Agreement between the two methods was evaluated by the frequency of similar classification into quartiles and by the Bland-Altman method. No significant differences between methods were observed for the mean of the evaluated nutrients, except for carbohydrate and magnesium. Pearson correlation coefficients were positive and statistically significant for all nutrients. The overall proportion of subjects classified in the same quartile by the two methods was on average 50.01%, and in opposite quartiles 0.47%. For calcium intake, only 3% of subjects were classified in opposite extreme quartiles by the two methods. The Bland-Altman analysis demonstrated that the differences obtained by the two methods for each subject were well distributed around the mean difference, and that disagreement increased as mean intake increased. These results indicate that the FFQ for elderly women with osteoporosis presented here is highly acceptable and accurate, and can be used in large-scale or clinical studies to evaluate nutrient intake in similar populations.
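
The Bland-Altman agreement analysis mentioned above amounts to examining the per-subject differences between the two methods around their mean (the bias) and its 95% limits of agreement. A minimal sketch with invented intake values, not the study's data:

```python
from statistics import mean, stdev

def bland_altman(method_a, method_b):
    """Mean bias and 95% limits of agreement (bias +/- 1.96 SD of the
    per-subject differences) between two paired measurement methods."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = mean(diffs)
    sd = stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Invented daily calcium intakes (mg) from a 3-day record vs an FFQ.
record = [620, 540, 710, 480, 655, 590]
ffq = [600, 560, 740, 470, 640, 615]
bias, (lo, hi) = bland_altman(record, ffq)
print(round(bias, 1), round(lo, 1), round(hi, 1))
```

In a full analysis each difference is also plotted against the pair's mean, which is how the study could see disagreement growing with mean intake.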

Relevance: 90.00%

Abstract:

In order to improve our understanding of climate change, the aim of this research project was to study the climatology and time trends of drizzle and fog events in the Sao Paulo Metropolitan Area, and the possible connections of this variability with the sea surface temperature (SST) of the Atlantic and Pacific Oceans. The climatologies of the two phenomena present both differences and similarities. Fog shows a marked maximum frequency in winter and a minimum in summer, while the seasonal variation of drizzle occurrence is less pronounced: there is a maximum in spring, and the other seasons present smaller and similar numbers of events. Both phenomena present a negative trend from 1933 to 2005, which is stronger for fog events. A multivariate statistical analysis indicates that the South Atlantic SST could increase warm advection to the continent. This could be one of the factors responsible for the negative tendency in the number of both fog and drizzle events.
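
The negative trend reported above is the kind of quantity an ordinary least-squares slope over an annual event series captures. A minimal sketch with invented counts (the real series runs 1933 to 2005 and the paper's analysis is multivariate):

```python
def linear_trend(years, counts):
    """Ordinary least-squares slope (events per year) of an annual series."""
    n = len(years)
    my, mc = sum(years) / n, sum(counts) / n
    num = sum((y - my) * (c - mc) for y, c in zip(years, counts))
    den = sum((y - my) ** 2 for y in years)
    return num / den

# Invented annual fog-event counts with a declining tendency.
years = list(range(1933, 1943))
counts = [50, 48, 49, 46, 45, 44, 42, 43, 40, 38]
slope = linear_trend(years, counts)
print(round(slope, 2), "events per year")
```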

Relevance: 90.00%

Abstract:

The dynamical processes that lead to open cluster disruption cause a cluster's mass to decrease. To investigate such processes from the observational point of view, it is important to identify open cluster remnants (OCRs), which are intrinsically poorly populated. Owing to their nature, distinguishing them from field-star fluctuations is still an unresolved issue. In this work, we developed a statistical diagnostic tool to distinguish poorly populated star concentrations from background field fluctuations. We use 2MASS photometry to explore one of the conditions required for a stellar group to be a physical group: producing distinct sequences in a colour-magnitude diagram (CMD). We use automated tools to (i) derive the limiting radius; (ii) decontaminate the field and assign membership probabilities; (iii) fit isochrones; and (iv) compare object and field CMDs, considering the isochrone solution, in order to verify their similarity. If the object cannot be statistically considered a field fluctuation, we derive its probable age, distance modulus, reddening and their uncertainties in a self-consistent way. As a test, we apply the tool to open clusters and comparison fields. Finally, we study the OCR candidates DoDz 6, NGC 272, ESO 435 SC48 and ESO 325 SC15. The tool is optimized to treat these low-statistics objects and to select the best OCR candidates for studies of kinematics and chemical composition. The study of possible OCRs should provide a deeper understanding of OCR properties, constraints for theoretical models, and insights into the evolution and dissolution rates of open clusters.
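
One simple way to ask whether a poorly populated concentration exceeds a field fluctuation, in the spirit of the diagnostic above (though not the paper's actual CMD-based algorithm), is a Poisson overdensity test against the expected field counts inside the limiting radius. The numbers below are invented:

```python
from math import exp

def poisson_sf(k, lam):
    """P(N >= k) for Poisson-distributed field counts with mean lam."""
    term = exp(-lam)
    cdf = 0.0
    for i in range(k):
        cdf += term
        term *= lam / (i + 1)
    return 1.0 - cdf

# Invented numbers: 25 stars inside the limiting radius where the
# surrounding field density predicts 12 on average.
p_fluct = poisson_sf(25, 12.0)
print(p_fluct < 0.01)
```

A small survival probability argues against a pure field fluctuation; the paper's tool additionally demands a coherent CMD sequence, which is a far stronger condition.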

Relevance: 90.00%

Abstract:

Evidence of jet precession in many galactic and extragalactic sources has been reported in the literature. Much of this evidence is based on studies of the kinematics of the jet knots, which depend on the correct identification of the components to determine their respective proper motions and position angles on the plane of the sky. Identification problems related to fitting procedures, as well as observations poorly sampled in time, may hamper the follow-up of the components in time and consequently contribute to a misinterpretation of the data. In order to deal with these limitations, we introduce a very powerful statistical tool for analysing jet precession: the cross-entropy method for continuous multi-extremal optimization. Based only on the raw data of the jet components (right ascension and declination offsets from the core), the cross-entropy method searches for the precession-model parameters that best represent the data. In this work we present a large number of tests to validate this technique, using synthetic precessing jets built from a given set of precession parameters. With the aim of recovering these parameters, we applied the cross-entropy method to our precession model, exhaustively varying the quantities associated with the method. Our results show that even in the most challenging tests, the cross-entropy method was able to find the correct parameters at the 1 per cent level. Even for a non-precessing jet, our optimization method successfully pointed out the lack of precession.
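
A minimal sketch of the cross-entropy method for continuous optimization, the core of the technique described above: sample candidate parameter sets from a Gaussian, keep the elite fraction, and refit the sampling distribution to the elite. The paraboloid target below merely stands in for the precession-model misfit, and all values are invented:

```python
import random

def cross_entropy_minimize(f, dim, iters=60, samples=200, elite=20, seed=1):
    """Cross-entropy method for continuous minimization: draw candidates
    from a Gaussian, keep the elite fraction, refit mean and sigma, repeat."""
    rng = random.Random(seed)
    mu, sigma = [0.0] * dim, [5.0] * dim
    for _ in range(iters):
        pop = [[rng.gauss(m, s) for m, s in zip(mu, sigma)]
               for _ in range(samples)]
        pop.sort(key=f)
        best = pop[:elite]
        mu = [sum(x[j] for x in best) / elite for j in range(dim)]
        sigma = [max(1e-6, (sum((x[j] - mu[j]) ** 2 for x in best) / elite) ** 0.5)
                 for j in range(dim)]
    return mu

# Illustrative target: recover (3, -2) as the minimizer of a paraboloid,
# a stand-in for the misfit between model and observed knot positions.
sol = cross_entropy_minimize(lambda p: (p[0] - 3) ** 2 + (p[1] + 2) ** 2, dim=2)
print([round(v, 3) for v in sol])
```

Because the sampling variance shrinks with the elite spread, the method concentrates around the global minimum even when the misfit has multiple extrema, which is the multi-extremal setting the paper addresses.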

Relevance: 90.00%

Abstract:

We present a new technique for obtaining model fits to very long baseline interferometric images of astrophysical jets. The method minimizes a performance function proportional to the sum of the squared differences between the model and observed images. The model image is constructed by summing N_s elliptical Gaussian sources, each characterized by six parameters: two-dimensional peak position, peak intensity, eccentricity, amplitude, and orientation angle of the major axis. We present results for the fitting of two benchmark jets: the first constructed from three individual Gaussian sources, the second formed by five Gaussian sources. Both jets were analyzed by our cross-entropy technique in finite and infinite signal-to-noise regimes, with the background noise chosen to mimic that found in interferometric radio maps. These images were constructed to simulate most of the conditions encountered in interferometric images of active galactic nuclei. We show that the cross-entropy technique is capable of recovering the parameters of the sources with an accuracy similar to that obtained with the traditional Astronomical Image Processing System task IMFIT when the image is relatively simple (e.g., few components). For more complex interferometric maps, our method displays superior performance in recovering the parameters of the jet components. Our methodology is also able to determine quantitatively the number of individual components present in an image. An additional application of the cross-entropy technique to a real image of a BL Lac object is shown and discussed. Our results indicate that our cross-entropy model-fitting technique should be used in situations involving the analysis of complex emission regions with more than three sources, even though it is substantially slower than current model-fitting tasks (at least 10,000 times slower on a single processor, depending on the number of sources to be optimized). As in the case of any model fitting performed in the image plane, caution is required when analyzing images constructed from a poorly sampled (u, v) plane.
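
A sketch of the ingredients named above, under one plausible parameterization of a six-parameter elliptical Gaussian component (the paper's exact parameterization may differ), together with the sum-of-squared-differences performance function. The two-component "jet" and its parameters are invented; by construction the true parameters give zero misfit:

```python
from math import exp, cos, sin

def elliptical_gaussian(x, y, x0, y0, peak, major, ecc, theta):
    """Elliptical Gaussian: peak at (x0, y0), width `major` along the
    major axis rotated by `theta` (radians), eccentricity `ecc`."""
    minor = major * (1.0 - ecc ** 2) ** 0.5
    u = (x - x0) * cos(theta) + (y - y0) * sin(theta)
    v = -(x - x0) * sin(theta) + (y - y0) * cos(theta)
    return peak * exp(-0.5 * ((u / major) ** 2 + (v / minor) ** 2))

def performance(model_params, observed, size):
    """Sum of squared differences between model and observed images."""
    total = 0.0
    for x in range(size):
        for y in range(size):
            m = sum(elliptical_gaussian(x, y, *p) for p in model_params)
            total += (m - observed[x][y]) ** 2
    return total

# Invented 2-component "jet": the true parameters give zero misfit.
true = [(8, 8, 1.0, 2.0, 0.5, 0.3), (16, 12, 0.6, 1.5, 0.2, 1.0)]
size = 24
obs_img = [[sum(elliptical_gaussian(x, y, *p) for p in true)
            for y in range(size)] for x in range(size)]
print(performance(true, obs_img, size))
```

The cross-entropy optimizer of the previous abstract would then search the 6 N_s-dimensional parameter space for the minimum of this performance function.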

Relevance: 90.00%

Abstract:

Flickering is a phenomenon related to mass accretion observed in many classes of astrophysical objects. In this paper we present a study of flickering in the emission lines and continuum of the cataclysmic variable V3885 Sgr. The flickering behavior was first analyzed through statistical analysis and the power spectra of light curves. Autocorrelation techniques were then employed to estimate the flickering timescale of the flares. A cross-correlation study between the line and its underlying continuum variability is presented, and the cross-correlation between the photometric and spectroscopic data is also discussed. Periodograms calculated from the emission-line data show a behavior similar to that obtained from photometric data sets found in the literature, with a plateau at lower frequencies and a power law at higher frequencies. The power-law index is consistent with stochastic events. The cross-correlation study indicates the presence of a correlation between the variability in Hα and that of its underlying continuum. Flickering timescales derived from the photometric data were estimated at 25 min for two light curves and 10 min for one of them. The average timescale of the line flickering is 40 min, while that of its underlying continuum drops to 20 min.
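
A hedged sketch of estimating a characteristic timescale from the autocorrelation of a light curve, using the common 1/e-crossing convention (the paper's exact estimator may differ). The smooth test curve below is purely illustrative; real flickering data would be irregular and noisy:

```python
from math import cos, pi, e

def autocorrelation(series, lag):
    """Normalized sample autocorrelation of a light curve at a given lag."""
    n = len(series)
    m = sum(series) / n
    var = sum((v - m) ** 2 for v in series)
    cov = sum((series[i] - m) * (series[i + lag] - m) for i in range(n - lag))
    return cov / var

def flickering_timescale(series, dt):
    """First lag at which the autocorrelation falls below 1/e, a common
    proxy for the characteristic timescale of the flares."""
    lag = 1
    while autocorrelation(series, lag) > 1.0 / e:
        lag += 1
    return lag * dt

# Invented light curve: a smooth oscillation sampled once per minute.
curve = [cos(2 * pi * i / 60.0) for i in range(420)]
ts = flickering_timescale(curve, dt=1.0)
print(ts, "min")
```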

Relevance: 90.00%

Abstract:

Astronomy has evolved almost exclusively through the use of spectroscopic and imaging techniques, operated separately. With the development of modern technologies, it has become possible to obtain data cubes that combine both techniques simultaneously, producing images with spectral resolution. Extracting information from them can be quite complex, and hence the development of new methods of data analysis is desirable. We present a method for the analysis of data cubes (data from single-field observations, containing two spatial and one spectral dimension) that uses Principal Component Analysis (PCA) to express the data in a form of reduced dimensionality, facilitating efficient information extraction from very large data sets. PCA transforms a system of correlated coordinates into a system of uncorrelated coordinates ordered by principal components of decreasing variance. The new coordinates are referred to as eigenvectors, and the projections of the data on to these coordinates produce images that we call tomograms. The association of tomograms (images) with eigenvectors (spectra) is important for the interpretation of both. The eigenvectors are mutually orthogonal, and this property is fundamental for their handling and interpretation. When the data cube contains objects that present uncorrelated physical phenomena, the eigenvectors' orthogonality may be instrumental in separating and identifying them. By handling eigenvectors and tomograms, one can enhance features, extract noise, compress data, extract spectra, etc. We applied the method, for illustration purposes only, to the central region of the low-ionization nuclear emission region (LINER) galaxy NGC 4736, and demonstrate that it has a type 1 active nucleus not previously known. Furthermore, we show that it is displaced from the centre of its stellar bulge.
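
A minimal sketch of the PCA step described above, on a toy "cube" flattened to spaxels x spectral channels: the leading eigenvector is found by power iteration on the covariance matrix, and the tomogram is the projection of each centred spectrum on that eigenvector. The data and dimensions are invented:

```python
def first_principal_component(data, iters=100):
    """Leading eigenvector of the covariance of `data` (rows = spaxels,
    columns = spectral channels), found by power iteration."""
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    centred = [[row[j] - means[j] for j in range(d)] for row in data]
    cov = [[sum(r[i] * r[j] for r in centred) / (n - 1) for j in range(d)]
           for i in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v, means

def tomogram(data, means, eigenvector):
    """Projection of each spaxel's centred spectrum on an eigenvector."""
    d = len(eigenvector)
    return [sum((row[j] - means[j]) * eigenvector[j] for j in range(d))
            for row in data]

# Toy "cube" flattened to 6 spaxels x 3 channels; all variance lies
# along the direction (1, 1, 0) in spectral space, channel 3 is flat.
data = [[1, 1, 5], [2, 2, 5], [3, 3, 5], [4, 4, 5], [5, 5, 5], [6, 6, 5]]
pc1, means = first_principal_component(data)
tom = tomogram(data, means, pc1)
print([round(c, 3) for c in pc1])
```

In a real application each tomogram is reshaped back to the two spatial dimensions to give an image, and successive eigenvectors are obtained after deflating the covariance matrix.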

Relevance: 90.00%

Abstract:

Non-linear methods for estimating variability in time series are currently in widespread use. Among such methods are approximate entropy (ApEn) and sample entropy (SampEn). The applicability of ApEn and SampEn to data analysis is evident and their use is increasing. However, consistency is a point of concern with these tools: the classification of the temporal organization of a data set might indicate that one series is relatively less ordered than another when the opposite is true. As highlighted by their proponents themselves, ApEn and SampEn might yield incorrect results because of this lack of consistency. In this study, we present a method that gains consistency by applying ApEn repeatedly over a wide range of combinations of window length and matching-error tolerance. The tool is called volumetric approximate entropy, vApEn. We analyze nine artificially generated prototypical time series with different degrees of temporal order (combinations of sine waves, logistic maps with different control-parameter values, and random noise). While ApEn/SampEn clearly fail to consistently identify the temporal order of the sequences, vApEn does so correctly. To validate the tool we performed shuffled and surrogate data analyses. Statistical analysis confirmed the consistency of the method. (C) 2008 Elsevier Ltd. All rights reserved.
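
A compact implementation of ApEn in the Pincus form (window length m, tolerance r), showing the kind of comparison the study performs: a strictly periodic series should score lower than a logistic map in its chaotic regime. vApEn would repeat this computation over a whole grid of (m, r) combinations; the signals below are illustrative:

```python
from math import log

def apen(series, m, r):
    """Approximate entropy ApEn(m, r), Pincus form: phi(m) - phi(m + 1)."""
    def phi(mm):
        n = len(series) - mm + 1
        templates = [series[i:i + mm] for i in range(n)]
        total = 0.0
        for t in templates:
            matches = sum(1 for u in templates
                          if max(abs(a - b) for a, b in zip(t, u)) <= r)
            total += log(matches / n)
        return total / n
    return phi(m) - phi(m + 1)

# A strictly periodic series versus a logistic map in its chaotic
# regime (control parameter 3.9); both signals are illustrative.
periodic = [float(i % 2) for i in range(200)]
x, chaotic = 0.3, []
for _ in range(200):
    x = 3.9 * x * (1 - x)
    chaotic.append(x)
ap_per = apen(periodic, 2, 0.2)
ap_cha = apen(chaotic, 2, 0.2)
print(ap_per < ap_cha)
```

The consistency problem arises because this ordering can flip for some single (m, r) choices; scanning a volume of (m, r) values is what vApEn uses to stabilize it.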

Relevance: 90.00%

Abstract:

Background and Objective: Mucositis is the most common oral complication of cancer chemotherapy; it causes pain on mastication and swallowing, impairs patients' ability to eat and take oral drugs, and may force interruption of the treatment. The aim of this study was to evaluate the effect of light-emitting diode (LED) therapy on chemotherapy-induced mucositis in hamsters. Study Design/Materials and Methods: Animals in both the experimental (Group I; n = 32) and positive control (Group II; n = 32) groups received intraperitoneal injections of 5-fluorouracil on days 0 and 2. All animals had their right and left cheek pouches irritated by superficial scratching on days 3 and 4. In Group I, LED irradiation (630 ± 10 nm, 160 mW, 12 J/cm²) was applied for 37.5 seconds on days 3, 4, 6, 8, 10, 12, and 14. In Group II, mucositis was induced but LED therapy was not performed. The oral mucosa was photographed from day 4 to day 14 at 2-day intervals. Photographs were randomly scored according to the severity of the induced mucositis (0 to 5). In the negative control group (Group III; n = 6), no mucositis was induced. Biopsies of the cheek pouches of 8 animals (Groups I and II) were surgically obtained on days 5, 9, 13 and 15 and processed for histological examination. Results: The statistical analysis showed significant differences between the irradiated and non-irradiated groups (P < 0.05). However, muscular degeneration was observed in 18% of the samples from Group I. Conclusion: It may be concluded that the LED therapy protocol established for this in vivo study was effective in reducing the severity of oral mucositis, although the oral lesions were not completely prevented. Lasers Surg. Med. 40:625-633, 2008. (c) 2008 Wiley-Liss, Inc.
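
The irradiation parameters quoted above are mutually consistent: 12 J/cm² delivered in 37.5 s corresponds to an irradiance of 0.32 W/cm², which at 160 mW implies a 0.5 cm² spot. The spot size itself is not stated in the abstract; the snippet below simply back-calculates it:

```python
def radiant_exposure(power_w, spot_area_cm2, time_s):
    """Radiant exposure (J/cm^2) = irradiance (W/cm^2) x exposure time (s)."""
    return power_w / spot_area_cm2 * time_s

# spot = power x time / exposure (not stated in the abstract; derived).
spot = 0.160 * 37.5 / 12.0
print(spot, "cm^2 ->", radiant_exposure(0.160, spot, 37.5), "J/cm^2")
```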

Relevance: 90.00%

Abstract:

Recently, the deterministic tourist walk has emerged as a novel approach to texture analysis. This method employs a traveler visiting image pixels according to a deterministic walk rule. The resulting trajectories provide clues about pixel interaction in the image that can be used for image classification and identification tasks. This paper proposes a new walk rule for the tourist based on the contrast direction of a neighborhood. The results yielded by this approach are comparable to those of traditional texture analysis methods in the classification of a set of Brodatz textures and their rotated versions, confirming the potential of the method as a feasible texture analysis methodology. (C) 2010 Elsevier B.V. All rights reserved.
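
A hedged sketch of a deterministic tourist walk using a simple nearest-intensity rule with a short memory (not the paper's contrast-direction rule) on a toy intensity grid; a real application would extract statistics of many such trajectories, typically the transient and attractor lengths, from texture images:

```python
def tourist_walk(image, start, memory, steps=50):
    """Deterministic tourist walk on a grayscale image: from each pixel,
    move to the 4-neighbour with the closest intensity that was not
    visited in the last `memory` steps; stop when no move is allowed."""
    h, w = len(image), len(image[0])
    path = [start]
    x, y = start
    for _ in range(steps):
        recent = set(path[-memory:]) if memory else set()
        candidates = [(abs(image[nx][ny] - image[x][y]), (nx, ny))
                      for nx, ny in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1))
                      if 0 <= nx < h and 0 <= ny < w and (nx, ny) not in recent]
        if not candidates:
            break
        _, (x, y) = min(candidates)
        path.append((x, y))
    return path

# Toy 4x4 intensity grid (a real texture would be a Brodatz patch).
img = [[10, 12, 50, 52],
       [11, 13, 51, 53],
       [90, 92, 20, 22],
       [91, 93, 21, 23]]
path = tourist_walk(img, (0, 0), memory=2)
print(len(path) > 1)
```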

Relevance: 90.00%

Abstract:

In this paper we present a novel approach to multispectral image contextual classification that combines iterative combinatorial optimization algorithms. The pixel-wise decision rule is defined using a Bayesian approach that combines two MRF models: a Gaussian Markov Random Field (GMRF) for the observations (likelihood) and a Potts model for the a priori knowledge, which regularizes the solution in the presence of noisy data. The classification problem is thus stated within a Maximum a Posteriori (MAP) framework. To approximate the MAP solution we apply several combinatorial optimization methods with multiple simultaneous initializations, making the solution less sensitive to the initial conditions and reducing both computational cost and time in comparison to Simulated Annealing, which is often infeasible in many real image processing applications. The Markov Random Field model parameters are estimated by the Maximum Pseudo-Likelihood (MPL) approach, avoiding manual adjustment of the regularization parameters. Asymptotic evaluations assess the accuracy of the proposed parameter estimation procedure. To test and evaluate the proposed classification method, we adopt metrics for quantitative performance assessment (Cohen's Kappa coefficient), allowing a robust and accurate statistical analysis. The results clearly show that combining sub-optimal contextual algorithms significantly improves the classification performance, indicating the effectiveness of the proposed methodology. (C) 2010 Elsevier B.V. All rights reserved.
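
A minimal sketch of one classic sub-optimal combinatorial optimizer for this kind of MAP problem, iterated conditional modes (ICM), with the likelihood simplified to an independent Gaussian per pixel rather than a full GMRF. The class means, noise level, Potts strength, and toy image are all invented:

```python
def icm_classify(obs, means, sigma, beta, iters=5):
    """Iterated conditional modes: greedy per-pixel MAP updates combining
    a Gaussian likelihood with a Potts prior of strength beta."""
    h, w = len(obs), len(obs[0])
    k_range = range(len(means))
    # initial labels: nearest class mean (maximum-likelihood start)
    labels = [[min(k_range, key=lambda k: (obs[i][j] - means[k]) ** 2)
               for j in range(w)] for i in range(h)]
    for _ in range(iters):
        for i in range(h):
            for j in range(w):
                def energy(k):
                    # data term (negative Gaussian log-likelihood, up to a constant)
                    lik = (obs[i][j] - means[k]) ** 2 / (2 * sigma ** 2)
                    # Potts term: beta per disagreeing 4-neighbour
                    neigh = ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                    prior = sum(beta for a, b in neigh
                                if 0 <= a < h and 0 <= b < w and labels[a][b] != k)
                    return lik + prior
                labels[i][j] = min(k_range, key=energy)
    return labels

# Toy 2-class image: two flat regions plus one ambiguous pixel (3.2)
# that the Potts prior should pull into the surrounding class.
obs = [[1.0, 1.1, 5.0, 5.1],
       [0.9, 3.2, 5.2, 4.9],
       [1.2, 1.0, 5.1, 5.0]]
labels = icm_classify(obs, means=[1.0, 5.0], sigma=1.0, beta=1.5)
print(labels)
```

ICM converges to a local optimum that depends on the start, which is precisely why the paper runs several such optimizers from multiple simultaneous initializations.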

Relevance: 90.00%

Abstract:

Background. The aim of this study was to evaluate Ki-67 and Bcl-2 antigen expression in colorectal polyps from women with breast cancer. Methods. A randomized, controlled study was carried out in 35 women, either with or without breast cancer, who had adenomatous colorectal polyps. The patients were divided into two groups: group A (without breast cancer; control group; n = 17) and group B (with breast cancer; study group; n = 18). Immunohistochemistry was performed on the colorectal polyps to evaluate Ki-67 and Bcl-2 antigen expression. Student's t-test and the χ² test were used for the statistical analysis of Ki-67 and Bcl-2 expression, respectively. Statistical significance was established as P < 0.05. Results. The mean percentage of Ki-67-stained nuclei in groups A and B was 36.25 ± 2.31 and 59.44 ± 3.34 (SEM), respectively (P < 0.0001), while the percentage of cases with cells expressing Bcl-2 in groups A and B was 23.5% and 77.8%, respectively (P < 0.001). Conclusions. In the present study, there was greater proliferative activity and greater expression of the antiapoptotic protein Bcl-2 in the colorectal polyps of women with breast cancer.

Relevance: 90.00%

Abstract:

Background: The aim of this study was to evaluate the effect of raloxifene on CD34 and Ki-67 antigen expression in breast cancer specimens from postmenopausal women. Methods: Sixteen postmenopausal patients with operable, stage II (>= 3 cm), estrogen receptor-positive breast cancer, who took 60 mg of raloxifene daily for 28 days, participated in this study. Immunohistochemistry was carried out on tumor samples prior to and following raloxifene treatment to evaluate CD34 and Ki-67 protein expression. Angiogenesis was quantified in 10 randomly selected fields per slide, and Ki-67-stained nuclei were counted in 1,000 cells per slide using an image capture and analysis system at 400× magnification. Student's t-test for paired samples was used for the statistical analysis of the data, with statistical significance established at p < 0.05. Results: The mean number of microvessels was 44.44 ± 3.54 prior to raloxifene therapy and 22.63 ± 1.61 following therapy (p < 0.001), and the mean percentage of Ki-67-stained nuclei was 19.28 ± 1.61 and 12.13 ± 1.48 prior to and following raloxifene treatment, respectively (p < 0.001). Conclusion: Raloxifene significantly reduces CD34 and Ki-67 protein expression in breast carcinoma in postmenopausal women. Copyright (C) 2008 S. Karger AG, Basel
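
The paired Student's t statistic used above is the mean within-patient difference divided by its standard error. A sketch with invented per-field counts (only the group means 44.44 and 22.63 come from the abstract):

```python
from math import sqrt

def paired_t_statistic(before, after):
    """t statistic for paired samples: mean within-pair difference divided
    by its standard error, with n - 1 degrees of freedom."""
    n = len(before)
    diffs = [a - b for a, b in zip(before, after)]
    mean_d = sum(diffs) / n
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)
    return mean_d / sqrt(var_d / n), n - 1

# Invented microvessel counts per field before/after raloxifene.
before = [45, 48, 41, 46, 44, 43]
after = [23, 25, 20, 24, 22, 21]
t, dof = paired_t_statistic(before, after)
print(round(t, 2), "with", dof, "degrees of freedom")
```

The p-value is then read from the t distribution with n - 1 degrees of freedom; pairing removes between-patient variability, which is why it is the appropriate test for before/after designs like this one.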

Relevance: 90.00%

Abstract:

In recent years, it has been observed that software clones and plagiarism are becoming an increasing threat to creativity. Clones are the result of copying and reusing others' work. According to the Merriam-Webster dictionary, "a clone is one that appears to be a copy of an original form"; it is a synonym for duplicate. Clones lead to code redundancy, but not all redundant code is a clone. Against this background, and in order to safeguard original ideas and to discourage intentional code duplication that passes off others' work as one's own, software clone detection deserves greater emphasis. The objective of this paper is to review methods for clone detection, to apply those methods to measure the extent of plagiarism in Master's-level computer science theses at Swedish universities, and to analyze the results. The remainder of the paper discusses software plagiarism detection using a data analysis technique, followed by statistical analysis of the results. Plagiarism is the act of stealing the ideas and words of another person and passing them off as one's own. Samples (Master's-level computer science thesis reports) were taken from various Swedish universities and processed with the Ephorus anti-plagiarism detection software. Ephorus gives a plagiarism percentage for each thesis document; from these results, a statistical analysis was carried out using Minitab software. The results show a very low extent of plagiarism among the Swedish universities, leading to the conclusion that plagiarism is not a threat to Sweden's standard of education in computer science. This paper is based on data analysis and intelligence techniques, the Ephorus plagiarism detection tool, and Minitab statistical software.