906 results for Exploratory statistical data analysis


Relevance:

100.00%

Publisher:

Abstract:

Human resource (HR) management theoretically represents an approach to business management concerned with organizing work so as to make the best use of it and, in particular, with engaging workers in the company's objectives. Management practices bear not only on the work itself but also, in complex ways, on the social interactions that take place in the workplace and on workers' personal lives, according to the premises and practices of the prevailing management style. The present study seeks to understand the nature of the correlations between human resource management and workers' sociability, that is, their capacity to build and maintain diverse social ties, from the workers' own points of view. Four supermarket chains in the Metropolitan Region of Belém, Pará, were taken as the empirical reference. The sector is a major employer, has modernized considerably over the last two decades, implementing some HR management procedures, and remains sheltered from the strong competition from national and international groups observed in other state capitals of the country. The methodology included systematic observation, document analysis, and structured and in-depth semi-structured interviews, with three hundred and eighty and with fourteen workers respectively, the latter selected from among those in the larger sample. The interviews covered social and demographic attributes, occupational trajectory, and patterns of personal and professional relationships. They also covered perceptions of rules and attitudes at work, based on the norms contained in the companies' service manuals. Thirty-four items were included in a Likert scale. These items were grouped into factors, two concerning management – quality of work (QT) and internalization of organizational norms (IN) – and three concerning sociability – trust (CF), maintenance of relationships (MR) and usefulness of relationships (UT) at work.
Respondents rated each item according to the degree of their perception of it. These data were submitted to the exploratory statistical technique of Correspondence Analysis (CA) in order to examine the correlation between the scale factors and the respondents' characteristics. Regarding the correlations between management and sociability, the work regime stood out first. Long working hours, variable shifts, long daily breaks, and the policy of on-the-job training absorb almost all of the employee's time and make it difficult to maintain personal relationships or even to extend those formed in the workplace beyond that space. It also hinders investment in education, another sphere of sociability, which is surprising in a sample whose predominant age group was under thirty and whose occupation offers few career prospects. Within this general picture of constraints, gender and family status were also relevant: women, particularly mothers, reported fewer leisure activities, in smaller groups and with minimal presence of co-workers, compared with men. On the other hand, some cases were found of people who built affective relationships in the workplace, even though contact was restricted to the company. Another striking characteristic was dependence on family support for carrying out work activity and for coping with the vicissitudes of the labour market. The importance of family ties was reinforced by long residence in the same neighbourhood and, in a significant proportion, in the same home; in many cases the home was near or in the same dwelling as parents or in-laws, which facilitated mutual help. Another salient aspect of HR management was the perceived imprecision of the criteria for professional advancement and for applying the norms, contributing to the existence of veiled conflicts.
In establishing social ties, employees place a selective trust, expressed in the small number of people trusted at work. A small variation between men and women is also worth noting here: men trusted their co-workers more than women did. The CA showed marked differences in perceptions of quality of work and internalization of norms between workers holding their first formal employment contract and those with previous formal work experience. The former noticed a more incisive management control (surveillance) and expressed less agreement with the organizational rules, whereas the latter did not perceive the control in the same way and saw themselves as compliant with those rules. These results were interpreted as stemming from different previous trajectories, in formal or informal occupations, or from this being the first job. It cannot be asserted that the restrictions on sociability are due exclusively to the characteristics of management in this sector, given the incidence of other factors, such as the family's socio-economic condition or the length of the employment relationship, on average two years among the interviewees, which may have contributed to these results.
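Several of the abstracts above relate scale factors to respondent characteristics through Correspondence Analysis. As a rough illustration of how the technique works (a minimal sketch on a hypothetical cross-tabulation, not the study's data), classical CA amounts to a singular value decomposition of the standardized residuals of a contingency table:

```python
import numpy as np

def correspondence_analysis(table):
    """Classical CA: SVD of the matrix of standardized residuals."""
    P = table / table.sum()                      # correspondence matrix
    r = P.sum(axis=1)                            # row masses
    c = P.sum(axis=0)                            # column masses
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))  # standardized residuals
    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    row_coords = (U * sv) / np.sqrt(r)[:, None]  # principal row coordinates
    col_coords = (Vt.T * sv) / np.sqrt(c)[:, None]
    inertia = sv ** 2                            # principal inertia per axis
    return row_coords, col_coords, inertia

# Hypothetical cross-tabulation: factor score bands (rows) x worker profiles (cols)
table = np.array([[20, 5, 10],
                  [10, 15, 5],
                  [5, 10, 20]], dtype=float)
rows, cols, inertia = correspondence_analysis(table)
print(inertia.sum())
```

The total inertia equals the table's chi-square statistic divided by the grand total, so the leading axes summarize where the association between rows and columns concentrates.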

Relevance:

100.00%

Publisher:

Abstract:

Longevity, although undoubtedly a triumph, shows important differences between developed and developing countries. Whereas in the former ageing occurred alongside improvements in general living conditions, in the latter the process is happening rapidly, without time for an adequate social and healthcare reorganization to meet the new emerging demands. This study aims to assess the health conditions of the elderly population enrolled in the Family Health Strategy of the municipality of Benevides, Pará; to describe the socio-epidemiological profile and the living and health conditions of this population; and, finally, to examine the relationships between the living and health condition variables and the established socio-epidemiological profile. A prospective cross-sectional study with a quantitative approach was carried out in the Family Health Strategy Units with 441 elderly people, using exploratory data analysis and correspondence analysis as statistical techniques. Most of the elderly were between 60 and 64 years old, female, married, Catholic, and had a family income of 1 to 3 minimum wages. The majority did not drink alcohol, did not smoke, and did not practise physical activity. Hypertension was the most prevalent chronic non-communicable disease. The study made it possible to establish a diagnosis of the living and health conditions of the elderly population, showing an ageing process accompanied by comorbidities, and can thereby support the implementation of public health policies aimed at the elderly population of Benevides in order to provide better living and health conditions.

Relevance:

100.00%

Publisher:

Abstract:

The purpose of this study was to compare the microhardness of four indirect composite resins. Forty cylindrical samples were prepared according to the manufacturers' recommendations using a Teflon mold. Ten specimens were produced from each tested material, constituting four groups (n=10) as follows: G1 - Artglass; G2 - Sinfony; G3 - Solidex; G4 - Targis. Microhardness was determined by the Vickers indentation technique with a load of 300 g for 10 seconds. Four indentations were made on each sample to determine the mean microhardness value for each specimen. Descriptive statistics for the experimental conditions were: G1 - Artglass (mean ± standard deviation: 55.26 ± 1.15 HVN; median: 52.6); G2 - Sinfony (31.22 ± 0.65 HVN; 31.30); G3 - Solidex (52.25 ± 1.55 HVN; 52.60); G4 - Targis (72.14 ± 2.82 HVN; 73.30). An exploratory data analysis was performed to determine the most appropriate statistical tests: (I) Levene's test for homogeneity of variances; (II) ANOVA on ranks (Kruskal-Wallis); (III) Dunn's multiple comparison test (α=0.05). Targis presented the highest microhardness values while Sinfony presented the lowest. Artglass and Solidex were found to be intermediate materials. These results indicate that distinct mechanical properties may be observed for specific materials. The composition of each material, as well as variations in polymerization methods, is possibly responsible for the differences found in microhardness. Therefore, indirect composite resin materials that guarantee both good esthetics and adequate mechanical properties may be considered as substitutes for natural teeth.
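The test sequence described above (Levene's test for homogeneity of variances followed by a Kruskal-Wallis ANOVA on ranks) can be reproduced with standard tools. This is a hedged sketch on synthetic data drawn to resemble the reported group means and standard deviations, not the study's raw measurements; Dunn's post-hoc test is not in SciPy and is omitted here:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Synthetic Vickers hardness samples (n=10 per group), loosely matching the
# reported means/SDs; these are illustrative values, not the study's raw data.
groups = {
    "Artglass": rng.normal(55.26, 1.15, 10),
    "Sinfony":  rng.normal(31.22, 0.65, 10),
    "Solidex":  rng.normal(52.25, 1.55, 10),
    "Targis":   rng.normal(72.14, 2.82, 10),
}

# (I) Levene's test for homogeneity of variances
lev_stat, lev_p = stats.levene(*groups.values())

# (II) Kruskal-Wallis (ANOVA on ranks) across the four materials
kw_stat, kw_p = stats.kruskal(*groups.values())

print(f"Levene p={lev_p:.3f}, Kruskal-Wallis p={kw_p:.4f}")
```

With groups this well separated, the Kruskal-Wallis test rejects the hypothesis of equal distributions; pairwise ranking of the materials would then fall to a multiple-comparison procedure such as Dunn's test.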

Relevance:

100.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

100.00%

Publisher:

Abstract:

Graduate Program in Geography - FCT

Relevance:

100.00%

Publisher:

Abstract:

Background: Aortic aneurysm and dissection are important causes of death in older people. Ruptured aneurysms show catastrophic fatality rates nearing 80%. Few population-based mortality studies have been published worldwide, and none in Brazil. The objective of the present study was to use multiple-cause-of-death methodology in the analysis of mortality trends related to aortic aneurysm and dissection in the state of Sao Paulo, between 1985 and 2009. Methods: We analyzed mortality data from the Sao Paulo State Data Analysis System, selecting all death certificates on which aortic aneurysm and dissection were listed as a cause of death. The variables sex, age, season of the year, and underlying, associated or total mentions of causes of death were studied using standardized mortality rates, proportions and historical trends. Statistical analyses were performed using chi-square goodness-of-fit tests, Kruskal-Wallis H tests, and analysis of variance. The joinpoint regression model was used to evaluate changes in trends of age-standardized rates. A p value less than 0.05 was regarded as significant. Results: Over the 25-year period, there were 42,615 deaths related to aortic aneurysm and dissection, of which 36,088 (84.7%) were identified as an underlying cause and 6,527 (15.3%) as an associated cause of death. Dissection and ruptured aneurysms were considered the underlying cause of death in 93% of the deaths. For the entire period, a significant increasing trend of age-standardized death rates was observed in men and women, while certain non-significant decreases occurred from 1996/2004 until 2009. Abdominal aortic aneurysms and aortic dissections prevailed among men, and aortic dissections and aortic aneurysms of unspecified site among women. In 1985 and 2009 the male-to-female death rate ratios were 2.86 and 2.19 respectively, corresponding to a 23.4% decrease in the difference between rates.
For aortic dissection, ruptured and non-ruptured aneurysms, the overall mean ages at death were, respectively, 63.2, 68.4 and 71.6 years. When these conditions were the underlying cause, the main associated causes of death were hemorrhages (in 43.8%/40.5%/13.9%), hypertensive diseases (in 49.2%/22.43%/24.5%) and atherosclerosis (in 14.8%/25.5%/15.3%); when they were associated causes, their principal underlying causes of death were diseases of the circulatory (55.7%) and respiratory (13.8%) systems and neoplasms (7.8%). A significant seasonal variation, with the highest frequency in winter, occurred in deaths identified as the underlying cause for aortic dissection, ruptured and non-ruptured aneurysms. Conclusions: This study introduces multiple-cause-of-death methodology to enhance epidemiologic knowledge of aortic aneurysm and dissection in Sao Paulo, Brazil. The results highlight the importance of mortality statistics and the need for epidemiologic studies to understand unique trends in our own population.
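The age-standardized death rates used throughout this analysis can be illustrated with the direct standardization method: weight the age-specific rates by a fixed standard population so that rates are comparable across years and sexes. A minimal sketch with entirely hypothetical counts (the study's actual age bands and standard population are not given here):

```python
import numpy as np

def age_standardized_rate(deaths, population, std_population):
    """Direct method: weight age-specific rates by a standard population."""
    rates = np.asarray(deaths) / np.asarray(population)       # age-specific rates
    weights = np.asarray(std_population) / np.sum(std_population)
    return np.sum(rates * weights)

# Hypothetical counts for three broad age bands (all numbers illustrative)
deaths  = [10, 120, 900]             # deaths related to aortic aneurysm/dissection
pop     = [2_000_000, 1_000_000, 300_000]
std_pop = [55_000, 30_000, 15_000]   # e.g. a WHO-style standard population

rate = age_standardized_rate(deaths, pop, std_pop) * 100_000  # per 100,000
print(round(rate, 1))   # 48.9 deaths per 100,000 standard population
```

Trend tests such as joinpoint regression are then run on the series of these standardized rates rather than on crude rates.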

Relevance:

100.00%

Publisher:

Abstract:

Aims. We studied four young star clusters to characterise their anomalous extinction or variable reddening and assess whether these could be due to contamination by either dense clouds or circumstellar effects. Methods. We evaluated the extinction law (R_V) by adopting two methods: (i) the use of theoretical expressions based on the colour excess of stars of known spectral type; and (ii) the analysis of two-colour diagrams, where the slope of the observed colour distribution was compared with that of the normal distribution. An algorithm to reproduce the zero-age main-sequence (ZAMS) reddened colours was developed to derive the average visual extinction (A_V) that provides the closest fit to the observational data. The structure of the clouds was evaluated by means of a statistical fractal analysis, designed to compare their geometric structure with the spatial distribution of the cluster members. Results. The cluster NGC 6530 is the only object in our sample affected by anomalous extinction. On average, the other clusters suffer normal extinction, but several of their members, mainly in NGC 2264, seem to have high R_V, probably because of circumstellar effects. The ZAMS fitting provides A_V values that are in good agreement with those found in the literature. The fractal analysis shows that NGC 6530 has a centrally concentrated distribution of stars that differs from the substructures found in the density distribution of the cloud projected in the A_V map, suggesting that the original cloud was changed by the cluster formation. However, the fractal dimension and statistical parameters of Berkeley 86, NGC 2244, and NGC 2264 indicate a good cloud-cluster correlation when compared to other works based on an artificial distribution of points.
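Method (i) above rests on the standard relation between total and selective extinction, R_V = A_V / E(B-V), with the colour excess obtained from the known intrinsic colour of each spectral type. A small illustrative computation (the magnitudes are hypothetical, not the paper's measurements):

```python
def colour_excess(observed_BV, intrinsic_BV):
    """E(B-V) = (B-V)_observed - (B-V)_intrinsic for a known spectral type."""
    return observed_BV - intrinsic_BV

def total_to_selective_ratio(A_V, E_BV):
    """R_V = A_V / E(B-V): total-to-selective extinction ratio."""
    return A_V / E_BV

# Hypothetical star: observed B-V = 0.95 mag, intrinsic colour (from its
# spectral type) = 0.30 mag, measured visual extinction A_V = 2.0 mag.
E_BV = colour_excess(0.95, 0.30)                 # 0.65 mag
R_V = total_to_selective_ratio(2.0, E_BV)
print(round(R_V, 2))                             # ≈ 3.08
```

A result near 3.1 corresponds to the "normal" diffuse-ISM extinction law; markedly larger values are the anomalous, possibly circumstellar, cases discussed above.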

Relevance:

100.00%

Publisher:

Abstract:

A common interest in gene expression data analysis is to identify, from a large pool of candidate genes, the genes that present significant changes in expression levels between a treatment and a control biological condition. Usually this is done using a statistic and a cutoff value that separate the differentially from the nondifferentially expressed genes. In this paper, we propose a Bayesian approach to identify differentially expressed genes by sequentially calculating credibility intervals from predictive densities, which are constructed using the sampled mean treatment effect from all genes in the study excluding the treatment effect of genes previously identified with statistical evidence for difference. We compare our Bayesian approach with the standard ones based on the use of the t-test and modified t-tests via a simulation study, using small sample sizes, which are common in gene expression data analysis. The results provide evidence that the proposed approach performs better than the standard ones, especially in cases with mean differences and with increases in treatment variance relative to control variance. We also apply the methodologies to a well-known publicly available data set on the Escherichia coli bacterium.
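The sequential scheme described above can be caricatured as follows: build a predictive interval from the effects of all genes not yet flagged, flag the most extreme gene outside it, and repeat. This is a deliberately simplified normal-interval sketch on simulated effects (the paper's predictive densities are fully Bayesian; the plain normal interval here is an assumption for brevity):

```python
import numpy as np

def sequential_flagging(effects, z=3.0):
    """Iteratively flag the most extreme gene effect lying outside a
    predictive interval built from the remaining (unflagged) genes.
    A simplified normal-interval stand-in for the sequential Bayesian idea."""
    effects = np.asarray(effects, dtype=float)
    flagged, remaining = [], list(range(len(effects)))
    while len(remaining) > 2:
        vals = effects[remaining]
        m, s = vals.mean(), vals.std(ddof=1)
        outside = [i for i in remaining
                   if effects[i] < m - z * s or effects[i] > m + z * s]
        if not outside:
            break
        worst = max(outside, key=lambda i: abs(effects[i] - m))
        flagged.append(worst)
        remaining.remove(worst)
    return sorted(flagged)

rng = np.random.default_rng(0)
effects = rng.normal(0.0, 0.1, 500)   # null genes: small random effects
effects[:3] = [2.5, -3.0, 2.8]        # three truly differentially expressed genes
flagged = sequential_flagging(effects)
print(flagged)
```

Because each flagged gene is removed before the next interval is computed, a few strong signals cannot inflate the predictive spread and mask weaker ones, which is the intuition behind the sequential construction.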

Relevance:

100.00%

Publisher:

Abstract:

In this article, we propose a new Bayesian flexible cure rate survival model, which generalises the stochastic model of Klebanov et al. [Klebanov LB, Rachev ST and Yakovlev AY. A stochastic-model of radiation carcinogenesis - latent time distributions and their properties. Math Biosci 1993; 113: 51-75], and has much in common with the destructive model formulated by Rodrigues et al. [Rodrigues J, de Castro M, Balakrishnan N and Cancho VG. Destructive weighted Poisson cure rate models. Technical Report, Universidade Federal de Sao Carlos, Sao Carlos-SP. Brazil, 2009 (accepted in Lifetime Data Analysis)]. In our approach, the accumulated number of lesions or altered cells follows a compound weighted Poisson distribution. This model is more flexible than the promotion time cure model in terms of dispersion. Moreover, it possesses an interesting and realistic interpretation of the biological mechanism of the occurrence of the event of interest as it includes a destructive process of tumour cells after an initial treatment or the capacity of an individual exposed to irradiation to repair altered cells that results in cancer induction. In other words, what is recorded is only the damaged portion of the original number of altered cells not eliminated by the treatment or repaired by the repair system of an individual. Markov Chain Monte Carlo (MCMC) methods are then used to develop Bayesian inference for the proposed model. Also, some discussions on the model selection and an illustration with a cutaneous melanoma data set analysed by Rodrigues et al. [Rodrigues J, de Castro M, Balakrishnan N and Cancho VG. Destructive weighted Poisson cure rate models. Technical Report, Universidade Federal de Sao Carlos, Sao Carlos-SP. Brazil, 2009 (accepted in Lifetime Data Analysis)] are presented.
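For reference, the promotion time cure model against which the proposed model's flexibility is measured has a standard textbook form (this sketch states the benchmark, not the authors' new compound weighted Poisson model): with a latent number of lesions N ~ Poisson(θ) and F the distribution function of each lesion's activation time,

```latex
S_{\mathrm{pop}}(t) = \Pr(\text{no lesion active by } t)
                    = \exp\{-\theta F(t)\},
\qquad
p_0 = \lim_{t\to\infty} S_{\mathrm{pop}}(t) = e^{-\theta},
```

where p_0 is the cured fraction. Replacing the Poisson law of N by a compound weighted Poisson distribution, as above, relaxes the equidispersion (mean equals variance) implicit in this benchmark.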

Relevance:

100.00%

Publisher:

Abstract:

In the past decade, the advent of efficient genome sequencing tools and high-throughput experimental biotechnology has led to enormous progress in the life sciences. Among the most important innovations is microarray technology. It allows the expression of thousands of genes to be quantified simultaneously by measuring the hybridization from a tissue of interest to probes on a small glass or plastic slide. The characteristics of these data include a fair amount of random noise, a predictor dimension in the thousands, and a sample size in the dozens. One of the most exciting areas to which microarray technology has been applied is the challenge of deciphering complex diseases such as cancer. In these studies, samples are taken from two or more groups of individuals with heterogeneous phenotypes, pathologies, or clinical outcomes. These samples are hybridized to microarrays in an effort to find a small number of genes which are strongly correlated with the group of individuals. Even though analysis methods today are well developed and close to reaching a standard organization (through the effort of proposed international projects such as the Microarray Gene Expression Data (MGED) Society [1]), it is not infrequent to come across a clinician's question for which no compelling statistical method exists to answer it. The contribution of this dissertation to deciphering disease is the development of new approaches aimed at handling open problems posed by clinicians in specific experimental designs. Chapter 1 starts from a necessary biological introduction, then reviews microarray technologies and all the important steps of an experiment, from the production of the array through quality controls to the preprocessing steps used in the data analysis in the rest of the dissertation.
Chapter 2 provides a critical review of standard analysis methods, stressing their main open problems. Chapter 3 introduces a method to address the issue of unbalanced design in microarray experiments. In microarray experiments, experimental design is a crucial starting point for obtaining reasonable results. In a two-class problem, an equal or similar number of samples should be collected for the two classes. However, in some cases, e.g. rare pathologies, the approach to be taken is less evident. We propose to address this issue by applying a modified version of SAM [2]. MultiSAM consists of a reiterated application of SAM, comparing the less populated class (LPC) with 1,000 random samplings of the same size from the more populated class (MPC). A list of the differentially expressed genes is generated for each SAM application. After 1,000 reiterations, each single probe is given a "score" ranging from 0 to 1,000 based on its recurrence in the 1,000 lists as differentially expressed. The performance of MultiSAM was compared to that of SAM and LIMMA [3] over two data sets simulated via beta and exponential distributions. The results of all three algorithms over low-noise data sets seem acceptable. However, on a real unbalanced two-channel data set regarding Chronic Lymphocytic Leukemia, LIMMA finds no significant probe, SAM finds 23 significantly changed probes but cannot separate the two classes, while MultiSAM finds 122 probes with score >300 and separates the data into two clusters by hierarchical clustering. We also report extra-assay validation in terms of differentially expressed genes. Although standard algorithms perform well over low-noise simulated data sets, MultiSAM seems to be the only one able to reveal subtle differences in gene expression profiles on real unbalanced data. Chapter 4 describes a method to address the evaluation of similarities in a three-class problem by means of the Relevance Vector Machine [4].
In fact, looking at microarray data in a prognostic and diagnostic clinical framework, differences are not the only thing that can play a crucial role. In some cases similarities can give useful and sometimes even more important information. The goal, given three classes, could be to establish, with a certain level of confidence, whether the third one is similar to the first or to the second. In this work we show that the Relevance Vector Machine (RVM) [2] could be a possible solution to the limitations of standard supervised classification. In fact, RVM offers many advantages compared, for example, with its well-known precursor, the Support Vector Machine (SVM) [3]. Among these advantages, the estimate of the posterior probability of class membership represents a key feature for addressing the similarity issue. This is a highly important, but often overlooked, option of any practical pattern recognition system. We focused on a three-class tumour-grade problem, with 67 samples of grade 1 (G1), 54 samples of grade 3 (G3), and 100 samples of grade 2 (G2). The goal is to find a model able to separate G1 from G3, then evaluate the third class G2 as a test set to obtain, for each G2 sample, the probability of membership of class G1 or class G3. The analysis showed that breast cancer samples of grade 2 have a molecular profile more similar to breast cancer samples of grade 1. Looking at the literature, this result had been conjectured, but no measure of significance had been given before.
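The MultiSAM loop described above (repeatedly comparing the LPC against equal-size random subsamples of the MPC and scoring each probe by its recurrence among the significant lists) can be sketched as follows. This is an illustrative reimplementation on simulated data in which a per-probe Welch t-test stands in for the SAM statistic (an assumption made for brevity, not the dissertation's actual SAM machinery):

```python
import numpy as np
from scipy import stats

def multi_resample_score(lpc, mpc, n_iter=1000, alpha=0.01, seed=0):
    """MultiSAM-style scoring sketch: repeatedly compare the less populated
    class (LPC) against equal-size random subsamples of the more populated
    class (MPC), counting how often each probe comes out significant."""
    rng = np.random.default_rng(seed)
    n_probes, n_lpc = lpc.shape
    score = np.zeros(n_probes, dtype=int)
    for _ in range(n_iter):
        idx = rng.choice(mpc.shape[1], size=n_lpc, replace=False)
        _, p = stats.ttest_ind(lpc, mpc[:, idx], axis=1, equal_var=False)
        score += (p < alpha).astype(int)
    return score   # 0..n_iter: recurrence as differentially expressed

rng = np.random.default_rng(1)
lpc = rng.normal(0, 1, (50, 6))    # 50 probes, 6 samples (rare class)
mpc = rng.normal(0, 1, (50, 60))   # 60 samples (common class)
lpc[:5] += 2.5                     # five truly shifted probes
scores = multi_resample_score(lpc, mpc, n_iter=200)
print(np.sort(np.argsort(scores)[-5:]))
```

Probes whose significance recurs across many random subsamples accumulate high scores, while probes that are significant only against an unlucky subsample do not, which is what makes the score robust to the class imbalance.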

Relevance:

100.00%

Publisher:

Abstract:

The candidate tackled an important issue in contemporary management: the role of CSR and Sustainability. The research proposal focused on longitudinal and inductive research, aimed at tracing the evolution of CSR and contributing to new institutional theory, in particular the institutional work framework, and to the relation between institutions and discourse analysis. The documental analysis covers the whole evolution of CSR, focusing also on a number of important networks and associations. Some of the methodologies employed in the thesis were adopted as a consequence of data analysis, in a truly inductive research process. The thesis is composed of two sections. The first section mainly describes the research process and the analysis results. The candidate employed several research methods: a longitudinal content analysis of documents, a vocabulary study with statistical metrics such as cluster analysis and factor analysis, and a rhetorical analysis of justifications. The second section relates the analysis results to theoretical frameworks and contributions. The candidate engaged with several frameworks: Actor-Network Theory, Institutional Work and Boundary Work, and Institutional Logics. Chapters focus on different issues: a historical reconstruction of CSR; a reflection on the symbolic adoption of recurrent labels; two case studies of Italian networks, in order to compare institutional and boundary work; a theoretical model of institutional change based on contradiction and institutional complexity; and the application of the model to CSR and Sustainability, proposing Sustainability as a possible institutional logic.

Relevance:

100.00%

Publisher:

Abstract:

Background: The recent development of semi-automated techniques for staining and analyzing flow cytometry samples has presented new challenges. Quality control and quality assessment are critical when developing new high-throughput technologies and their associated information services. Our experience suggests that significant bottlenecks remain in the development of high-throughput flow cytometry methods for data analysis and display. In particular, data quality control and quality assessment are crucial steps in processing and analyzing high-throughput flow cytometry data. Methods: We propose a variety of graphical exploratory data analytic tools for exploring ungated flow cytometry data. We have implemented a number of specialized functions and methods in the Bioconductor package rflowcyt. We demonstrate the use of these approaches by investigating two independent sets of high-throughput flow cytometry data. Results: We found that graphical representations can reveal substantial non-biological differences in samples. Empirical cumulative distribution function (ECDF) plots and summary scatterplots were especially useful in the rapid identification of problems not identified by manual review. Conclusions: Graphical exploratory data analytic tools are a quick and useful means of assessing data quality. We propose that the described visualizations be used as quality assessment tools and, where possible, for quality control.
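The ECDF-based screening mentioned in the Results can be mimicked numerically: compare a channel's empirical distribution between samples and flag large vertical gaps. A minimal sketch on simulated intensities (rflowcyt itself is an R/Bioconductor package; this only illustrates the idea):

```python
import numpy as np

def max_ecdf_gap(a, b):
    """Kolmogorov-Smirnov-style maximum vertical gap between the ECDFs of
    two samples -- a quick numeric screen for diverging distributions."""
    grid = np.union1d(a, b)
    Fa = np.searchsorted(np.sort(a), grid, side="right") / len(a)
    Fb = np.searchsorted(np.sort(b), grid, side="right") / len(b)
    return np.max(np.abs(Fa - Fb))

rng = np.random.default_rng(7)
good = rng.normal(100, 10, 5000)      # hypothetical channel values, plate A
shifted = rng.normal(120, 10, 5000)   # same channel on a problematic plate

print(round(max_ecdf_gap(good, good.copy()), 3))   # 0.0 (identical samples)
print(max_ecdf_gap(good, shifted) > 0.5)           # True: large gap flags the shift
```

A gap this large between replicate samples is almost certainly non-biological (staining, gating, or instrument drift), which is exactly the kind of problem the abstract reports manual review missing.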

Relevance:

100.00%

Publisher:

Abstract:

Genomic alterations have been linked to the development and progression of cancer. The technique of Comparative Genomic Hybridization (CGH) yields data consisting of fluorescence intensity ratios of test and reference DNA samples. The intensity ratios provide information about the number of copies in DNA. Practical issues such as the contamination of tumor cells in tissue specimens and normalization errors necessitate the use of statistics for learning about the genomic alterations from array-CGH data. As increasing amounts of array CGH data become available, there is a growing need for automated algorithms for characterizing genomic profiles. Specifically, there is a need for algorithms that can identify gains and losses in the number of copies based on statistical considerations, rather than merely detect trends in the data. We adopt a Bayesian approach, relying on the hidden Markov model to account for the inherent dependence in the intensity ratios. Posterior inferences are made about gains and losses in copy number. Localized amplifications (associated with oncogene mutations) and deletions (associated with mutations of tumor suppressors) are identified using posterior probabilities. Global trends such as extended regions of altered copy number are detected. Since the posterior distribution is analytically intractable, we implement a Metropolis-within-Gibbs algorithm for efficient simulation-based inference. Publicly available data on pancreatic adenocarcinoma, glioblastoma multiforme and breast cancer are analyzed, and comparisons are made with some widely-used algorithms to illustrate the reliability and success of the technique.
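The hidden Markov model idea (states for loss, neutral and gain, with sticky transitions capturing the dependence between neighbouring intensity ratios) can be illustrated with a point-estimate Viterbi decoder. This is a deliberately minimal stand-in for the paper's fully Bayesian Metropolis-within-Gibbs inference, with hand-picked rather than estimated parameters:

```python
import numpy as np

# Minimal 3-state (loss / neutral / gain) segmentation of log2 ratios.
STATES = ["loss", "neutral", "gain"]
MEANS = np.array([-0.5, 0.0, 0.5])   # illustrative state emission means
SD = 0.15                            # shared emission standard deviation
LOG_TRANS = np.log(np.array([[0.98, 0.01, 0.01],
                             [0.01, 0.98, 0.01],
                             [0.01, 0.01, 0.98]]))  # sticky transitions
LOG_START = np.log(np.array([0.1, 0.8, 0.1]))

def viterbi(obs):
    """Most probable state path under Gaussian emissions."""
    ll = -0.5 * ((obs[:, None] - MEANS) / SD) ** 2   # log-likelihood (up to const)
    n, k = ll.shape
    dp = np.zeros((n, k))
    back = np.zeros((n, k), dtype=int)
    dp[0] = LOG_START + ll[0]
    for t in range(1, n):
        cand = dp[t - 1][:, None] + LOG_TRANS        # cand[i, j]: from i to j
        back[t] = cand.argmax(axis=0)
        dp[t] = cand.max(axis=0) + ll[t]
    path = [int(dp[-1].argmax())]
    for t in range(n - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1]

rng = np.random.default_rng(3)
ratios = np.concatenate([rng.normal(0.0, 0.1, 30),   # neutral region
                         rng.normal(0.5, 0.1, 10),   # gained region
                         rng.normal(0.0, 0.1, 30)])  # neutral region
path = viterbi(ratios)
print([STATES[s] for s in path[28:42]])
```

The sticky diagonal of the transition matrix is what turns isolated noisy ratios into contiguous segments; the Bayesian treatment in the paper additionally yields posterior probabilities of gain and loss at each clone instead of a single decoded path.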