838 results for Jeffries-Matusita distance
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
The objective of this work is to characterize the genome of chromosome 1 of A. thaliana, a small flowering plant used as a model organism in studies of biology and genetics, on the basis of a recent mathematical model of the genetic code. I analyze and compare different portions of the genome: genes, exons, coding sequences (CDS), introns, long introns, intergenic regions, untranslated regions (UTR) and regulatory sequences. To accomplish this task, I transformed nucleotide sequences into binary sequences based on the definition of three different dichotomic classes. The descriptive analysis of the binary strings indicates the presence of regularities in each portion of the genome considered. In particular, there are remarkable differences between coding sequences (CDS and exons) and non-coding sequences, suggesting that the reading frame is important only for coding sequences and that dichotomic classes can be useful for recognizing them. I then assessed the existence of short-range dependence between binary sequences computed on the basis of the different dichotomic classes, using three measures of dependence: the well-known chi-squared test and two indices derived from the concept of entropy, namely Mutual Information (MI) and Sρ, a normalized version of the Bhattacharyya-Hellinger-Matusita distance. The results show a significant short-range dependence structure only for the coding sequences, whose existence hints at an underlying error detection and correction mechanism. Further studies are certainly needed to assess how the information carried by dichotomic classes could discriminate between coding and non-coding sequences and, therefore, contribute to unveiling the role of the mathematical structure in error detection and correction mechanisms. Still, I have shown the potential of the presented approach for understanding the management of genetic information.
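As an illustration of the dependence measures mentioned above, the following sketch computes the chi-squared statistic, the mutual information and a Hellinger/Matusita-type index for two binary sequences. It is a minimal sketch, not the author's implementation: the function name and, in particular, the exact normalization used for Sρ are assumptions.

```python
# Minimal sketch (not the author's exact pipeline): dependence measures
# between two binary sequences derived from dichotomic classes.
# The normalization used for S_rho below is an assumption.
import numpy as np
from scipy.stats import chi2_contingency

def dependence_measures(x, y):
    """x, y: 1-D arrays of 0/1 symbols of equal length."""
    x, y = np.asarray(x), np.asarray(y)
    # 2x2 contingency table of joint counts
    table = np.array([[np.sum((x == i) & (y == j)) for j in (0, 1)] for i in (0, 1)])
    chi2, p_value, _, _ = chi2_contingency(table, correction=False)

    joint = table / table.sum()                  # joint distribution p(x, y)
    marg = np.outer(joint.sum(1), joint.sum(0))  # product of marginals p(x)p(y)

    # Mutual information (in bits) between the two binary sequences
    nz = joint > 0
    mi = np.sum(joint[nz] * np.log2(joint[nz] / marg[nz]))

    # Squared Hellinger/Matusita distance between the joint distribution and
    # the product of marginals, bounded in [0, 1] (assumed form of S_rho)
    s_rho = 0.5 * np.sum((np.sqrt(joint) - np.sqrt(marg)) ** 2)
    return chi2, p_value, mi, s_rho
```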
Abstract:
Crop monitoring and, more generally, land use change detection are of primary importance for analyzing spatio-temporal dynamics and their impacts on the environment. This is especially true in a region such as the State of Mato Grosso (southern Brazilian Amazon Basin), which hosts an intensive pioneer front. Deforestation in this region has often been attributed to soybean expansion over the last three decades. Remote sensing techniques now represent an efficient and objective way to quantify, through crop mapping studies, the extent to which crop expansion really is a factor of deforestation. Given the particular characteristics of soybean farms in Mato Grosso (areas varying between 1,000 and 40,000 hectares, with individual fields often larger than 100 hectares), data from the Moderate Resolution Imaging Spectroradiometer (MODIS), with near-daily temporal resolution and 250 m spatial resolution, can be considered adequate for crop mapping. In particular, multitemporal vegetation index (VI) studies are commonly used for this task [1] [2]. In this study, 16-day EVI composites (MOD13Q1 product) are used. However, although these data are already processed, multitemporal VI profiles remain noisy because of cloudiness (extremely frequent in a tropical region such as the southern Amazon Basin), sensor problems, errors in atmospheric corrections and BRDF effects. Many works have therefore tried to develop algorithms that smooth multitemporal VI profiles in order to improve subsequent classification. The goal of this study is to test and compare different smoothing algorithms in order to select the one best suited to the task at hand, namely classifying crop classes. Those classes correspond to 6 different agricultural management practices observed in Mato Grosso through intensive field work that resulted in the mapping of more than 1,000 individual fields. These management practices are based on combinations of soybean, cotton, corn, millet and sorghum sown in single- or double-crop systems. Because some classes are difficult to separate owing to very similar agricultural calendars, the classification is reduced to 3 classes: cotton (single crop), soybean and cotton (double crop), and soybean (single or double crop with corn, millet or sorghum). The classification uses training data obtained from the 2005-2006 harvest and is then tested on the 2006-2007 harvest. In a first step, four smoothing techniques are presented and discussed: Best Index Slope Extraction (BISE) [3], Mean Value Iteration (MVI) [4], Weighted Least Squares (WLS) [5] and the Savitzky-Golay filter (SG) [6] [7]. These techniques are implemented and visually compared on a few individual pixels, allowing a first selection among the four techniques. The WLS and SG techniques are selected according to criteria proposed by [8]: the ability to eliminate frequent noise, to preserve the upper values of the VI profiles and to keep the temporality of the profiles. The selected algorithms are then applied to the MODIS/TERRA EVI data (16-day composite periods). Separability tests based on the Jeffries-Matusita distance are performed to determine whether the algorithms improve the potential for differentiation between the classes. These tests are run on the overall profile (comprising 23 MODIS images) as well as on each MODIS sub-period of the profile [1].
This last test serves a dual purpose: it allows the smoothing techniques to be compared and also makes it possible to select the set of images that carries the most information on the separability between classes. The selected dates can then be used for a supervised classification. Three different classifiers are tested here to evaluate whether the smoothing techniques have a particular effect on the classification depending on the classifier used: the Maximum Likelihood classifier, the Spectral Angle Mapper (SAM) classifier and a CHAID improved decision tree. The separability tests on the overall profile show that the smoothed profiles do not markedly improve the potential for discrimination between classes when compared with the original data. However, the same tests performed on the MODIS sub-periods show better results for the smoothing algorithms. The classification results confirm this first analysis. The Kappa coefficients are always better with the smoothing techniques, and the results obtained with the WLS and SG smoothed profiles are nearly equal. The results, however, differ depending on the classifier used. The impact of the smoothing algorithms is much greater with the decision tree model, where it yields a gain of 0.1 in the Kappa coefficient. With the Maximum Likelihood and SAM models, the gain remains positive but is much lower (Kappa improved by only 0.02). This work thus demonstrates the usefulness of smoothing VI profiles in order to improve the final results. However, the choice of smoothing algorithm has to be made with respect to the original data and the classifier model used; in this case the Savitzky-Golay filter gave the best results.
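As a rough illustration of the processing chain described above, the sketch below smooths an EVI profile with a Savitzky-Golay filter and computes the Jeffries-Matusita distance between two classes under a Gaussian assumption. The filter parameters and function names are assumptions for illustration, not the settings used in the study.

```python
# Illustrative sketch only: Savitzky-Golay smoothing of an EVI profile and the
# Jeffries-Matusita separability between two classes assumed Gaussian.
import numpy as np
from scipy.signal import savgol_filter

def smooth_evi(profile, window=5, polyorder=2):
    """Smooth a 23-point annual EVI profile (16-day composites); window/order are assumed."""
    return savgol_filter(profile, window_length=window, polyorder=polyorder)

def jeffries_matusita(x1, x2):
    """JM distance between two classes from samples x1, x2 (n_samples x n_bands)."""
    m1, m2 = x1.mean(0), x2.mean(0)
    c1 = np.cov(x1, rowvar=False)
    c2 = np.cov(x2, rowvar=False)
    c = (c1 + c2) / 2.0
    diff = (m1 - m2)[:, None]
    # Bhattacharyya distance for Gaussian class distributions
    b = (diff.T @ np.linalg.inv(c) @ diff / 8.0
         + 0.5 * np.log(np.linalg.det(c) / np.sqrt(np.linalg.det(c1) * np.linalg.det(c2))))
    return float(2.0 * (1.0 - np.exp(-b)))  # JM ranges from 0 (inseparable) to 2
```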
Abstract:
We consider a fully model-based approach for the analysis of distance sampling data. Distance sampling has been widely used to estimate the abundance (or density) of animals or plants in a spatially explicit study area, but there is no readily available method for making statistical inference on the relationships between abundance and environmental covariates. Spatial Poisson process likelihoods can be used to estimate detection and intensity parameters simultaneously by modeling distance sampling data as a thinned spatial point process. A model-based spatial approach to distance sampling data has three main benefits: it allows complex and opportunistic transect designs to be employed, it allows estimation of abundance in small subregions, and it provides a framework for assessing the effects of habitat or experimental manipulation on density. We demonstrate the model-based methodology with a small simulation study and an analysis of the Dubbo weed data set; a simple ad hoc method for handling overdispersion is also proposed. The simulation study showed that the model-based approach compared favorably with conventional distance sampling methods for abundance estimation, and the overdispersion correction performed adequately when the number of transects was high. Analysis of the Dubbo data set indicated a transect effect on abundance via Akaike’s information criterion model selection. Further goodness-of-fit analysis, however, indicated some potential confounding of intensity with the detection function.
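A minimal sketch of the kind of likelihood referred to above, assuming a constant intensity and a half-normal detection function on line transects; the parameterization, transect length, truncation distance and variable names are illustrative assumptions, not the authors' model.

```python
# Sketch of a thinned spatial Poisson process likelihood for line-transect
# distance sampling: constant intensity `lam`, half-normal detection scale `sigma`.
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

def neg_log_lik(params, distances, total_length, w):
    """distances: perpendicular distances of detections; w: truncation distance."""
    log_lam, log_sigma = params
    lam, sigma = np.exp(log_lam), np.exp(log_sigma)
    g = np.exp(-distances**2 / (2 * sigma**2))  # half-normal detection probability
    # Expected number of detections = lam * integral of g over the sampled strip
    # (both sides of the line), i.e. the thinned Poisson process intensity.
    effective_area = 2 * total_length * sigma * np.sqrt(2 * np.pi) * (norm.cdf(w / sigma) - 0.5)
    return -(np.sum(np.log(lam) + np.log(g)) - lam * effective_area)

# Usage (hypothetical data): maximise the likelihood over log-intensity and log-scale.
# distances = np.array([...])
# fit = minimize(neg_log_lik, x0=[0.0, 0.0], args=(distances, 100.0, 20.0))
```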
Abstract:
The Out-of-Africa (OOA) dispersal ∼50,000 y ago is characterized by a series of founder events as modern humans expanded into multiple continents. Population genetics theory predicts an increase of mutational load in populations undergoing serial founder effects during range expansions. To test this hypothesis, we have sequenced full genomes and high-coverage exomes from seven geographically divergent human populations from Namibia, Congo, Algeria, Pakistan, Cambodia, Siberia, and Mexico. We find that individual genomes vary modestly in the overall number of predicted deleterious alleles. We show via spatially explicit simulations that the observed distribution of deleterious allele frequencies is consistent with the OOA dispersal, particularly under a model where deleterious mutations are recessive. We conclude that there is a strong signal of purifying selection at conserved genomic positions within Africa, but that many predicted deleterious mutations have evolved as if they were neutral during the expansion out of Africa. Under a model where selection is inversely related to dominance, we show that OOA populations are likely to have a higher mutation load due to increased allele frequencies of nearly neutral variants that are recessive or partially recessive.
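A toy sketch of the serial founder effect idea underlying the simulations described above: repeated bottlenecks with selection against a recessive deleterious allele. Population sizes, the selection coefficient and the number of founder events are illustrative assumptions, not the study's spatially explicit model.

```python
# Toy sketch: allele-frequency change through serial founder events during a
# range expansion, with selection against the recessive deleterious homozygote.
import numpy as np

rng = np.random.default_rng(0)

def expand(p0=0.05, s=0.01, n_large=10_000, n_founder=200, n_events=10, gens=50):
    """Return the deleterious-allele frequency after successive founder events."""
    p = p0
    for _ in range(n_events):
        # founder event: a small group colonises the next deme
        p = rng.binomial(2 * n_founder, p) / (2 * n_founder)
        # growth phase: selection against the recessive homozygote, plus drift
        for _ in range(gens):
            w_bar = 1 - s * p**2                 # mean fitness
            p = p * (1 - s * p) / w_bar          # deterministic selection step
            p = rng.binomial(2 * n_large, p) / (2 * n_large)
    return p
```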
Abstract:
Two different methods of measuring binocular longitudinal corneal apex movements were applied synchronously: high-speed videokeratoscopy at a sampling frequency of 15 Hz and a custom-designed ultrasound distance sensor at 100 Hz were used for the left and the right eye, respectively. Four healthy subjects participated in the study. The cardiac electric cycle (ECG) was simultaneously recorded for each subject at 100 Hz. Each measurement took 20 s. Subjects were asked to suppress blinking during the measurements. A rigid headrest and a bite-bar were used to minimize undesirable head movements. Time, frequency and time-frequency representations of the acquired signals were obtained to establish their temporal and spectral contents. Coherence analysis was used to estimate the correlation between the measured signals. The results showed a close correlation between both corneal apex movements and the cardiopulmonary system. Unraveling these relationships could lead to a better understanding of the interactions between ocular biomechanics and vision. The advantages and disadvantages of the two methods in the context of measuring longitudinal movements of the corneal apex are outlined.
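For readers unfamiliar with coherence analysis, the sketch below shows how the magnitude-squared coherence between a corneal-apex displacement signal and a simultaneously recorded ECG could be estimated; the resampling and windowing choices are assumptions, not the authors' processing pipeline.

```python
# Sketch: coherence between a corneal-apex displacement trace and an ECG,
# both at a common 100 Hz sampling rate over a 20-s recording.
import numpy as np
from scipy.signal import coherence, resample

fs = 100.0  # common sampling rate in Hz
# The ultrasound apex signal is already at 100 Hz; the 15 Hz videokeratoscopy
# trace would first be resampled, e.g.:
# apex_100hz = resample(apex_15hz, int(len(apex_15hz) * fs / 15))

def apex_ecg_coherence(apex, ecg, fs=100.0, seg_seconds=5.0):
    """Return the frequency axis and magnitude-squared coherence of the two signals."""
    nperseg = int(fs * seg_seconds)  # Welch segment length (assumed)
    f, cxy = coherence(apex, ecg, fs=fs, nperseg=nperseg)
    return f, cxy
```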
Abstract:
Today more than ever, generating and managing knowledge is an essential source of competitive advantage for every organization, and particularly for multinational corporations (MNCs). However, despite the undisputed agreement on the importance of creating and managing knowledge, a large number of corporations still act unethically or illegally. Clearly, there is a lack of attention to gaining more knowledge about the management of ethical knowledge in organizations. This paper refers to value-based knowledge as the process of recognising and managing those values that stand at the heart of decision-making and action in organizations. In order to support MNCs in implementing a value-based knowledge process, the managerial ethical profile (MEP) is presented as a valuable tool to facilitate the knowledge management process at both the intra-organizational and the inter-organizational network level.
Abstract:
This chapter seeks to develop an analysis of the contemporary use of the ePortfolio (electronic portfolio) in education practices. Unlike other explorations of this new technology, which are deterministic in their approach, the authors seek to reveal the techniques and practices of government that underpin the implementation of the ePortfolio. By interrogating a specific case study from a large Australian university’s preservice teacher program, the authors find that the ePortfolio is represented as an eLearning technology but serves to govern students via autonomization and self-responsibilization. Using policy data and other key documents, they reveal the ePortfolio as a delegated authority in the governance of preservice teachers. However, despite this ongoing trend, they suggest that, like other practices of government, the ePortfolio will eventually fail. This, the authors conclude, opens up space for critical thought and engagement that is not presently afforded.
Abstract:
Historically, distance education consisted of a combination of face-to-face blocks of time and surface-mailed packages. However, advances in information technology literacy and the abundance of personal computers have placed e-learning in increased demand. The authors describe the planning, implementation, and evaluation of the blending of e-learning with face-to-face education in the postgraduate nursing forum. The experiences of this particular student group are also discussed.
Abstract:
Uncooperative iris identification systems operating at a distance and on the move often suffer from poor resolution and poor focus of the captured iris images. The lack of pixel resolution and well-focused images significantly degrades iris recognition performance. This paper proposes a new approach that incorporates a focus score into a reconstruction-based super-resolution process to generate a high-resolution iris image from a low-resolution, focus-inconsistent video sequence of an eye. A reconstruction-based technique is used that can incorporate middle- and high-frequency components from multiple low-resolution frames into one super-resolved frame without introducing false high-frequency components. A new focus assessment approach is proposed for uncooperative iris recognition at a distance and on the move, improving robustness to variations in lighting, size and occlusion. A novel fusion scheme is then proposed to incorporate the focus score into the super-resolution process. Experiments conducted on the Multiple Biometric Grand Challenge portal database show that the proposed approach achieves an EER of 2.1%, outperforming the existing state-of-the-art averaging signal-level fusion approach by 19.2% and the robust mean super-resolution approach by 8.7%.
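As a highly simplified illustration of incorporating a focus score into frame fusion (not the paper's reconstruction-based super-resolution algorithm), the sketch below weights aligned, upsampled frames by a Laplacian-variance sharpness surrogate; the focus measure and all names are assumptions.

```python
# Simplified sketch: focus-weighted fusion of registered low-resolution frames.
import numpy as np
from scipy import ndimage

def focus_score(frame):
    """Sharpness surrogate: variance of the Laplacian response (assumed focus measure)."""
    return ndimage.laplace(frame.astype(float)).var()

def focus_weighted_fusion(frames, scale=2):
    """frames: list of registered low-resolution images of identical size."""
    ups = [ndimage.zoom(f.astype(float), scale, order=3) for f in frames]  # bicubic upsampling
    w = np.array([focus_score(f) for f in frames])
    w = w / w.sum()                                # normalise focus scores into fusion weights
    return np.tensordot(w, np.stack(ups), axes=1)  # focus-weighted average of the frames
```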