850 results for Local classification method


Relevance:

30.00%

Abstract:

Historically, morphological features were the primary means of classifying organisms. The age of molecular genetics, however, allows this field to be approached through the organism's genetic code. Early work used highly conserved sequences such as ribosomal RNA; the increasing number of complete genomes in public data repositories now provides the opportunity to look not only at a single gene but at an organism's entire parts list.

Here the Sequence Comparison Index (SCI) and the Organism Comparison Index (OCI), algorithms and methods to compare proteins and proteomes, are presented. The complete proteomes of 104 sequenced organisms were compared. Over 280 million full Smith-Waterman alignments were performed on sequence pairs that had a reasonable expectation of being related, and from these alignments a whole-proteome phylogenetic tree was constructed. The same method was used to compare the small subunit (SSU) rRNA from each organism and to build a tree from those results. The SSU rRNA tree produced by the SCI/OCI method closely matches accepted SSU rRNA trees from sources such as the Ribosomal Database Project, validating the method. The SCI/OCI proteome tree shows a number of small but significant differences when compared to the SSU rRNA tree and to proteome trees constructed by other methods. Horizontal gene transfer does not appear to affect SCI/OCI trees until the transferred genes make up a large portion of the proteome.

As part of this work, the Database of Related Local Alignments (DaRLA) was created; it contains over 81 million rows of sequence alignment information. While DaRLA was primarily used to build the whole-proteome trees, it can also be applied to shared gene content analysis, gene order analysis, and the creation of individual protein trees.

Finally, the standard BLAST method for analyzing shared gene content was compared with the SCI method using 4 spirochetes. The SCI system performed flawlessly, finding all proteins when each organism was compared against itself and finding all ribosomal proteins between organisms. The BLAST system missed some proteins when an organism was compared against itself and failed to detect small ribosomal proteins between organisms.
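The abstract does not give the SCI/OCI formulas, so the Python sketch below only illustrates the core building block, Smith-Waterman local alignment, together with a clearly hypothetical organism-level aggregation (sci_like_score) standing in for the real index.

# Minimal Smith-Waterman local alignment scorer in pure Python.
# The aggregation step (sci_like_score) is a hypothetical stand-in:
# the published SCI/OCI formulas are not given in the abstract.

def smith_waterman(a: str, b: str, match: int = 2,
                   mismatch: int = -1, gap: int = -2) -> int:
    """Return the best local alignment score between sequences a and b."""
    rows, cols = len(a) + 1, len(b) + 1
    prev = [0] * cols
    best = 0
    for i in range(1, rows):
        curr = [0] * cols
        for j in range(1, cols):
            s = match if a[i - 1] == b[j - 1] else mismatch
            curr[j] = max(0,                  # local alignment: floor at zero
                          prev[j - 1] + s,    # diagonal: match/mismatch
                          prev[j] + gap,      # gap in b
                          curr[j - 1] + gap)  # gap in a
            best = max(best, curr[j])
        prev = curr
    return best

def sci_like_score(proteome_a: list, proteome_b: list) -> float:
    """Hypothetical organism-level score: mean of length-normalized
    pairwise local alignment scores (NOT the published SCI/OCI formula)."""
    scores = []
    for p in proteome_a:
        for q in proteome_b:
            norm = min(len(p), len(q)) or 1
            scores.append(smith_waterman(p, q) / norm)
    return sum(scores) / len(scores) if scores else 0.0

if __name__ == "__main__":
    print(smith_waterman("HEAGAWGHEE", "PAWHEAE"))  # small toy example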

Relevance:

30.00%

Abstract:

The Houston region is home to arguably the largest petrochemical and refining complex anywhere. The effluent of this complex includes many potentially hazardous compounds, and study of some of them has led to the recognition that a number of known and probable carcinogens are present at elevated levels in ambient air. Two of these, benzene and 1,3-butadiene, have been found at concentrations that may pose a health risk for residents of Houston.

Recent popular journalism and publications by local research institutions have increased public interest in Houston's air quality. Much of this literature has been critical of local regulatory agencies' oversight of industrial pollution, and a number of citizens in the region have begun to volunteer with air quality advocacy groups to test community air. Inexpensive methods exist for monitoring ambient concentrations of ozone, particulate matter, and airborne toxics. This study evaluates a technique that has been successfully applied to airborne toxics.

This technique, solid phase microextraction (SPME), has been used to measure airborne volatile organic hydrocarbons at community-level concentrations. It has yielded accurate and rapid concentration estimates at a relatively low cost per sample. Examples of its application to airborne benzene exist in the literature; none have been found for airborne 1,3-butadiene. These compounds were selected in order to evaluate SPME as a community-deployed technique, to replicate its previous application to benzene, to extend it to 1,3-butadiene, and because of their salience in this community.

This study demonstrates that SPME is a useful technique for quantifying 1,3-butadiene at the concentrations observed in Houston. Laboratory background levels precluded recommending the technique for benzene. One type of SPME fiber, 85 μm Carboxen/PDMS, was found to be a sensitive sampling device for 1,3-butadiene under the temperature and humidity conditions common in Houston. The study indicates that these variables affect instrument response, which suggests that calibration must be performed under matching conditions. While deployment of this technique was less expensive than other methods of quantifying 1,3-butadiene, the complexity of calibration may exclude an SPME method from broad deployment by community groups.

Relevance:

30.00%

Abstract:

My dissertation focuses on two aspects of RNA sequencing technology. The first is the methodology for modeling the overdispersion inherent in RNA-seq data for differential expression analysis, addressed in the first three sections. The second is the application of RNA-seq data to identify the CpG island methylator phenotype (CIMP) by integrating mRNA expression levels and DNA methylation status.

Section 1: The cost of DNA sequencing has fallen dramatically in the past decade, and genomic research consequently depends increasingly on sequencing technology. However, it remains unclear how sequencing capacity influences the accuracy of mRNA expression measurement. We observe that accuracy improves with increasing sequencing depth. To model the overdispersion, we use the beta-binomial distribution with a new parameter that captures the dependency between overdispersion and sequencing depth. Our modified beta-binomial model performs better than the binomial or the pure beta-binomial model, with a lower false discovery rate.

Section 2: Although a number of methods have been proposed to accurately analyze differential RNA expression at the gene level, modeling at the base-pair level is still required. Here we find that the overdispersion rate decreases as the sequencing depth increases at the base-pair level. We propose four models and compare them with each other; as expected, our beta-binomial model with a dynamic overdispersion rate proves superior.

Section 3: We investigate biases in RNA-seq by exploring the measurement of an external control, spike-in RNA, using two datasets with spike-in controls obtained from a recent study. We observe a previously undescribed bias in the measurement of the spike-in transcripts arising from the influence of the sample transcripts in RNA-seq, and we find that this influence is related to the local sequence of the random hexamer used in priming. We propose a model of this between-sample inequality and use it to correct this type of bias.

Section 4: The expression of a gene can be turned off when its promoter is highly methylated. Several studies have reported a clear threshold effect in gene silencing mediated by DNA methylation, and it is reasonable to assume the thresholds are gene-specific. It is also intriguing to investigate genes that are largely controlled by DNA methylation; we call these "L-shaped" genes. We develop a method to determine the DNA methylation threshold and identify a new CIMP of breast cancer (BRCA).

In conclusion, we provide a detailed understanding of the relationship between the overdispersion rate and sequencing depth; we reveal a new bias in RNA-seq and relate it to the local sequence; and we develop a powerful method to dichotomize methylation status, thereby identifying a new CIMP of breast cancer with distinct molecular characteristics and clinical features.
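As a rough illustration of Section 1's idea, the sketch below fits a beta-binomial model whose overdispersion shrinks with sequencing depth. The power-law link rho(n) = rho0 * n**(-gamma) is an assumed stand-in for the dissertation's extra parameter, and the data are simulated, not real counts.

# Hedged sketch: beta-binomial likelihood with depth-dependent overdispersion.
import numpy as np
from scipy import stats, optimize

def neg_log_lik(params, k, n):
    """k: counts per observation; n: total sequencing depth per observation."""
    p, rho0, gamma = params
    p = min(max(p, 1e-6), 1 - 1e-6)                   # keep mean in (0, 1)
    rho = np.clip(rho0 * n.astype(float) ** (-gamma),  # assumed link function
                  1e-6, 1 - 1e-6)
    a = p * (1.0 - rho) / rho                          # beta-binomial shape a
    b = (1.0 - p) * (1.0 - rho) / rho                  # beta-binomial shape b
    return -np.sum(stats.betabinom.logpmf(k, n, a, b))

rng = np.random.default_rng(0)
n = rng.integers(20, 2000, size=500)   # simulated sequencing depths
k = rng.binomial(n, 0.3)               # toy counts, not real data

fit = optimize.minimize(neg_log_lik, x0=[0.5, 0.1, 0.5], args=(k, n),
                        method="Nelder-Mead")
print(fit.x)  # estimated (p, rho0, gamma)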

Relevance:

30.00%

Abstract:

In this study, multibeam angular backscatter data acquired on the eastern slope of the Porcupine Seabight are analysed. The angular backscatter data were processed with the 'NRGCOR' software for 29 locations covering different geological provinces: carbonate mounds, buried mounds, seafloor channels, and inter-channel areas. A detailed methodology is developed to produce a map of angle-invariant (normalized) backscatter by correcting the local angular backscatter values, and the detailed processing steps and related technical aspects of the normalization approach are described. The resulting angle-invariant backscatter map possesses a 12 dB dynamic range in terms of grey scale. A clear distinction is seen between the mound-dominated northern area (Belgica province) and the Gollum channel seafloor at the southern end of the site. Qualitative analysis of the calculated mean backscatter values, i.e., grey-scale levels, of the angle-invariant backscatter data indicates that backscatter is highest (lighter grey scale) in the mound areas, followed by the buried mounds. Backscatter is lowest in the inter-channel areas (lowest grey-scale level). Moderate backscatter values (medium grey level) are observed in the Gollum and Kings channel data, with significant variability within the channel seafloor provinces. The channel seafloor provinces were segmented on the basis of the computed grey-scale levels for further analyses of the angular backscatter strength. Three parameters are used to classify four seafloor provinces of the Porcupine Seabight by employing a semi-empirical method to analyse the multibeam angular backscatter data. The predicted backscatter response computed at 20° is highest for the mound areas, as is the coefficient of variation (CV) of the mean backscatter response. Interestingly, the slope value of the buried mound areas is found to be the highest. The channel seafloor, with its moderate backscatter response, presents the lowest slope and CV values, and a critical examination of the inter-channel areas indicates little variability in the three estimated parameters.
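As a sketch of the normalization idea (not the actual 'NRGCOR' algorithm), the function below removes the mean angular response per incidence-angle bin and re-references all samples to a common angle such as 20°; the bin width and reference angle are illustrative choices.

# Hedged sketch of angle-invariant backscatter normalization.
import numpy as np

def normalize_backscatter(bs_db, angles_deg, ref_angle=20.0, bin_width=1.0):
    """bs_db: backscatter samples in dB; angles_deg: incidence angles."""
    bins = np.round(angles_deg / bin_width) * bin_width
    mean_per_bin = {b: bs_db[bins == b].mean() for b in np.unique(bins)}
    # mean angular response at the reference angle bin
    ref_bin = round(ref_angle / bin_width) * bin_width
    ref_level = mean_per_bin.get(ref_bin, np.mean(bs_db))
    # subtract each sample's angular mean, re-reference to the chosen level
    corrected = bs_db - np.array([mean_per_bin[b] for b in bins]) + ref_level
    return corrected

# usage with synthetic samples: angles 1-60 degrees, toy dB values
angles = np.random.default_rng(1).uniform(1, 60, 1000)
bs = -20.0 - 0.2 * angles + np.random.default_rng(2).normal(0, 1, 1000)
print(normalize_backscatter(bs, angles)[:5])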

Relevance:

30.00%

Abstract:

This study subdivides Potter Cove, King George Island, Antarctica, into seafloor regions using multivariate statistical methods. These regions serve as categories for comparing, contrasting, and quantifying biogeochemical processes and biodiversity, both between geographically distinct ocean regions and between regions undergoing change in the context of global change. The division obtained is characterized by its dominant components and interpreted in terms of the prevailing environmental conditions. The analysis includes a total of 42 environmental variables, interpolated from samples taken during the austral summer seasons 2010/2011 and 2011/2012. The statistical errors of several interpolation methods (e.g., IDW, Indicator, Ordinary, and Co-Kriging) with varying settings were compared, and the most suitable method was applied. The multivariate procedures used are regionalized classification via k-means cluster analysis, canonical-correlation analysis, and multidimensional scaling; canonical-correlation analysis identifies the influencing factors in the different parts of the cove. Several methods for identifying the optimal number of clusters were tested, and 4, 7, 10, and 12 were identified as reasonable cluster counts for Potter Cove. In particular, the 10- and 12-cluster results identify marine-influenced regions that can be clearly separated from those determined by the geological catchment area and from those dominated by river discharge.
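A minimal sketch of the regionalized-classification step, assuming standardized gridded variables; the silhouette coefficient shown here is just one of several possible criteria for choosing the cluster count, and the data are random placeholders.

# Hedged sketch: k-means regionalization with a cluster-count diagnostic.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 42))          # toy stand-in: 500 grid cells x 42 variables
X = StandardScaler().fit_transform(X)   # put variables on a common scale

for k in (4, 7, 10, 12):                # candidate cluster counts from the study
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    print(k, round(silhouette_score(X, labels), 3))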

Relevance:

30.00%

Abstract:

Large-scale environmental patterns in the Humboldt Current System (HCS) change markedly during strong El Niño episodes, leading to mass mortality of dominant species in coastal ecosystems. Here we explore how these changes affect the life-history traits of the surf clam Mesodesma donacium. Growth and mortality rates under normal temperature and salinity were compared to those under anomalously high temperature and reduced salinity (El Niño conditions). Moreover, the spatio-temporal patterns of reproduction along the distribution range were studied, and their relationship to large-scale environmental variability was assessed. M. donacium is highly sensitive to temperature changes, supporting the hypothesis of temperature as the key factor in the mass mortality events of this clam in northern populations. In contrast, this species, particularly its juveniles, was remarkably tolerant of low salinity, which may be related to submarine groundwater discharge at Hornitos, northern Chile. The enhanced osmotic tolerance of juveniles may represent an adaptation of early life stages that allows settlement in vacant areas at the outlets of estuarine areas. The strong seasonality in freshwater input and upwelling strength seems to be linked to the spatial and temporal patterns of the reproductive cycle. Owing to its origin and thermal sensitivity, the expansion and dominance of M. donacium from the Pliocene/Pleistocene transition to the present seem closely linked to the establishment and development of the cold HCS. The recurrence of warming events (particularly El Niño, since at least the Holocene) has therefore subjected this cold-water species to a continuous local extinction-recolonization process.

Relevance:

30.00%

Abstract:

The major-element and most of the trace-element data from the different laboratories that contributed to the study of samples recovered during Leg 82 are presented in the following tables. The different basalt groups, identified on the basis of their chemical properties (major and trace elements), were defined from the data available on board the Glomar Challenger as the cruise progressed (see site chapters, all sites, this volume). Most of the data obtained since the end of the cruise and presented in these tables confirm the classification proposed by the shipboard party (see site chapters, all sites, this volume). Nevertheless, special mention should be made of Site 564. The shipboard party proposed a single chemical group at this site but noticed significant variations down the hole, mainly in the trace-element data; however, the range of variation was small compared to the precision of the measurements. These variations were confirmed by the onshore studies (see papers in Part IV of this volume, especially Brannon's paper, which is partly devoted to this topic).

Relevance:

30.00%

Abstract:

This study describes detailed partitioning of phytomass carbon (C) and soil organic carbon (SOC) for four study areas in discontinuous permafrost terrain, Northeast European Russia. The mean aboveground phytomass C storage is 0.7 kg C/m**2. Estimated landscape SOC storage in the four areas varies between 34.5 and 47.0 kg C/m**2 with land cover classification (LCC) upscaling and between 32.5 and 49.0 kg C/m**2 with soil map upscaling. A nested upscaling approach using a Landsat Thematic Mapper land cover classification for the surrounding region provides estimates within 5 ± 5% of the local high-resolution estimates. Permafrost peat plateaus hold the majority of total and frozen SOC, especially in the more southern study areas. Burial of SOC through cryoturbation of O- or A-horizons contributes between 1% and 16% (mean 5%) of total landscape SOC. The effect of active layer deepening and thermokarst expansion on SOC remobilization is modeled for one of the four areas. Active layer thickness dynamics from 1980 to 2099 are modeled using a transient, spatially distributed permafrost model, and lateral expansion of peat plateau thermokarst lakes is simulated using geographic information system analyses. Active layer deepening is expected to increase the proportion of SOC affected by seasonal thawing from 29% to 58%. A lateral expansion of 30 m would increase the amount of SOC stored in thermokarst lakes/fens from 2% to 22% of all SOC. By the end of this century, active layer deepening will likely affect more SOC than thermokarst expansion, but the SOC stores vulnerable to thermokarst are less decomposed.
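The LCC upscaling reduces to an area-weighted sum of per-class SOC densities; the sketch below shows that arithmetic with placeholder class names and values, not the study's numbers.

# Hedged sketch of land-cover-classification (LCC) upscaling.
area_fraction = {"peat plateau": 0.35, "thermokarst fen": 0.20,
                 "tundra": 0.30, "lake": 0.15}          # illustrative fractions
soc_kg_per_m2 = {"peat plateau": 80.0, "thermokarst fen": 40.0,
                 "tundra": 25.0, "lake": 10.0}          # illustrative densities

# landscape storage = sum over classes of (area fraction x mean SOC density)
landscape_soc = sum(area_fraction[c] * soc_kg_per_m2[c] for c in area_fraction)
print(f"landscape SOC: {landscape_soc:.1f} kg C/m**2")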

Relevance:

30.00%

Abstract:

Locally weighted regression is a technique that predicts the response for new data items from their neighbors in the training data set, with closer data items assigned higher weights in the prediction. However, the original method may overfit and fail to select the relevant variables. In this paper we propose combining a regularization approach with locally weighted regression to achieve sparse models. Specifically, the lasso is a shrinkage and selection method for linear regression, and we present an algorithm that embeds it in an iterative procedure that alternately computes weights and performs lasso regression. The algorithm is tested on three synthetic scenarios and two real data sets. Results show that the proposed method outperforms both linear and local models across several kinds of scenarios.
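A one-pass sketch of the idea, assuming scikit-learn's Lasso with sample weights: weight the training rows by a Gaussian kernel on distance to the query, then fit an L1-penalized regression. The paper's algorithm alternates the weighting and lasso steps; this simplified version fits once per query point.

# Hedged sketch of locally weighted lasso prediction.
import numpy as np
from sklearn.linear_model import Lasso

def local_lasso_predict(X, y, x_query, tau=1.0, alpha=0.1):
    """Sparse local prediction at x_query (tau: kernel bandwidth)."""
    d2 = np.sum((X - x_query) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * tau ** 2))       # closer points get higher weight
    model = Lasso(alpha=alpha)
    model.fit(X, y, sample_weight=w)         # weighted, L1-penalized fit
    return model.predict(x_query[None, :])[0]

# toy usage: y depends only on the first of five features
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
print(local_lasso_predict(X, y, X[0]))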

Relevance:

30.00%

Abstract:

Most pixel-level satellite image fusion methodologies introduce false spatial details, i.e., artifacts, into the resulting fused images. In many cases these artifacts appear because the fusion methods do not consider the differences in roughness or textural characteristics between land covers; they consider only the digital values associated with single pixels. This effect increases as the spatial resolution of the image increases. To minimize this problem, we propose a new paradigm based on local measurements of the fractal dimension (FD). Fractal dimension maps (FDMs) are generated for each of the source images (the panchromatic image and each band of the multispectral image) with the box-counting algorithm applied through a windowing process. The average of the source image FDMs, previously indexed between 0 and 1, is used to discriminate the different land covers present in the satellite images. This paradigm has been applied through the fusion methodology based on the discrete wavelet transform (DWT) using the à trous algorithm (WAT). Two scenes registered by optical sensors on board the FORMOSAT-2 and IKONOS satellites were used to study the behaviour of the proposed methodology. Implementing this approach with the WAT method allows the fusion process to adapt to the roughness and shape of the regions present in the image to be fused, which improves the quality of the fused images and their classification results compared with the original WAT method.
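A minimal sketch of the windowed box-counting step used to build an FDM; binarization by the median, the window size, and the box sizes are illustrative choices, not the paper's settings.

# Hedged sketch: local fractal dimension map via box counting.
import numpy as np

def box_count_fd(window_bin):
    """Fractal dimension of a square binary window via box counting."""
    n = window_bin.shape[0]
    sizes = [s for s in (1, 2, 4, 8, 16) if s <= n // 2]
    counts = []
    for s in sizes:
        m = n // s
        blocks = window_bin[:m * s, :m * s].reshape(m, s, m, s)
        counts.append(np.any(blocks, axis=(1, 3)).sum())  # occupied boxes
    # slope of log(count) vs log(1/size) estimates the dimension
    coeffs = np.polyfit(np.log(1.0 / np.array(sizes)),
                        np.log(np.maximum(counts, 1)), 1)
    return coeffs[0]

def fd_map(image, win=32, thresh=None):
    """Sliding-window fractal dimension map, rescaled to [0, 1]."""
    t = np.median(image) if thresh is None else thresh
    binary = image > t
    h, w = image.shape
    out = np.zeros((h // win, w // win))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = box_count_fd(binary[i * win:(i + 1) * win,
                                            j * win:(j + 1) * win])
    lo, hi = out.min(), out.max()
    return (out - lo) / (hi - lo + 1e-12)   # index FDM between 0 and 1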

Relevance:

30.00%

Abstract:

A finite element model was used to simulate timber beams with defects and predict their maximum load in bending. By taking into account the elastoplastic constitutive law of timber, the prediction of the fracture load gives information about the mechanisms of timber failure, particularly the influence of knots, and of the local grain deviation around them, on fracture. The finite element model was built with the ANSYS element Plane42 in a plane-stress 2D analysis, setting the thickness equal to the width of the section so as to create a mesh as uniform as possible. Three sub-models reproduced the bending test according to UNE EN 408: i) timber with holes caused by knots; ii) timber with adherent knots that have structural continuity with the rest of the beam material; iii) timber with knots in only partial contact with the beam, simulated artificially by means of contact springs between the two materials. The model was validated using ten 45 × 145 × 3000 mm beams of Pinus sylvestris L. presenting knots and grain deviation. The fracture stress data obtained were compared with the results of the numerical simulations, yielding an adjustment error of less than 9.7%.

Relevance:

30.00%

Abstract:

A land classification method was designed for the Community of Madrid (CM), which has lands suitable for either agricultural use or natural spaces. The process started from an extensive previous CM study containing sets of land attributes with data for 122 land types, and from a minimum-requirements method providing a land quality classification (SQ) for each land. Borrowing tools from Operations Research (OR) and Decision Science, that SQ has been complemented in two ways: by an additive valuation method, which analyses a more restricted set of 13 representative attributes using Attribute Valuation Functions to obtain a quality index (QI), and by an original composite method, which uses a fuzzy-set procedure to obtain a combined quality index (CQI) containing relevant information from both the SQ and QI methods.
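The additive step amounts to mapping each attribute through a valuation function onto [0, 1] and taking a weighted sum; the attributes, valuation functions, and weights below are illustrative placeholders, not those of the CM study.

# Hedged sketch of an additive quality index (QI).
def linear_value(x, worst, best):
    """Piecewise-linear attribute valuation function, clipped to [0, 1]."""
    return min(1.0, max(0.0, (x - worst) / (best - worst)))

# (measured value, valuation function, weight) -- illustrative only
land = [
    (120.0, lambda x: linear_value(x, 20.0, 150.0), 0.4),  # soil depth, cm
    (6.0,   lambda x: linear_value(x, 25.0, 2.0),   0.3),  # slope, % (less is better)
    (6.8,   lambda x: linear_value(-abs(x - 6.5), -3.0, 0.0), 0.3),  # pH near 6.5
]

qi = sum(w * f(x) for x, f, w in land)  # weighted additive aggregation
print(f"QI = {qi:.2f}")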

Relevance:

30.00%

Abstract:

Free association of people, built from the bottom up to improve living conditions using local resources, is, among other things, an element of local development. LEADER (Liaison Entre Actions de Développement de l'Économie Rurale) is the European Union's model of rural development. The LEADER method comprises seven features that are factors of success when the approach is applied in different territories. The actions carried out by the municipal council of rural development of San Andrés Calpan during 2010 showed some elements of LEADER in its adaptation: 1) definition of the territory, 2) local association, 3) financing. The methodology consists of reviewing documents on financing and association in the territory studied, and of applying surveys to define the model of agricultural production and development together with the mayors of the different municipalities and the economic and social actors. The delimitation of the field of action, the integration into the territory of citizen councils as local action groups, and a financing strategy are among the results of this process of adaptation in this territory.

Relevance:

30.00%

Abstract:

After the devastating 12 January 2010 earthquake that hit the city of Port-au-Prince, Haiti, local authorities, NGOs, and national and international institutions have been developing strategies to minimize the country's high seismic risk. Two important tasks toward this objective are, on the one hand, the evaluation of the seismic risk associated with possible future earthquakes, in order to gauge the dimensions of the catastrophe; and, on the other, the design of preventive measures and emergency plans to minimize the consequences of such events. In this sense, this Master's Thesis provides a detailed estimate of the damage that a possible future earthquake, occurring with reasonable probability, would cause in Port-au-Prince. A methodology for calculating the seismic risk is proposed, adapted to the conditions of the study area and calibrated with data from the 2010 earthquake. The work was conducted within the Sismo-Haití cooperation project, supported by the Technical University of Madrid, which started ten months after the 2010 earthquake in response to a request for help from the Haitian government.
The seismic risk calculation requires two inputs: the seismic hazard, i.e., the ground motion expected for a scenario earthquake of given magnitude and location, and the elements exposed to that hazard, i.e., a classification of the building stock into constructive typologies together with their vulnerability. The vulnerability of these typologies is described through damage functions: capacity spectra, which represent the performance of a structure under the horizontal forces caused by earthquakes, and fragility curves, which give the probability that a structure is damaged when it reaches its maximum spectral displacement under that horizontal force. The proposed methodology specifies guidelines and criteria to estimate the ground motion, assign the vulnerability, and evaluate the damage, covering the whole process. First, different ground motion prediction models including local site effects are considered, and those that best fit the observations from the 2010 earthquake are identified. Second, the building stock is classified into constructive typologies using information collected during a field campaign together with a database provided by the Ministry of Public Works of Haiti, which contains relevant information on all the buildings in the city; this yields a total of 6 typologies. Finally, the damage is estimated with the capacity-demand (capacity-spectrum) method as implemented in the software SELENA (Molina et al., 2010).
Damage data from the 2010 earthquake were first used to calibrate the proposed seismic risk model: four ground motion models, three soil models, and a set of damage functions. With the calibrated model, a deterministic scenario corresponding to a possible earthquake with an epicenter close to Port-au-Prince was then simulated. The results show that structural damage would be considerable and could lead to economic and human losses with a great impact on the country, highlighting the high structural vulnerability of the city.
This result will be provided to the local authorities, constituting a solid basis for decision making and for the adoption of risk prevention and mitigation policies. It is recommended that efforts be directed towards reducing structural vulnerability, through the reinforcement of vulnerable buildings and the adoption of a seismic design code, and towards the development of emergency plans.
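As a sketch of the final damage-evaluation step, the snippet below turns a performance-point spectral displacement into damage-state probabilities via lognormal fragility curves; the medians and dispersion are placeholder values, not the calibrated ones from the thesis or SELENA.

# Hedged sketch: damage-state probabilities from lognormal fragility curves.
from scipy.stats import lognorm

# median spectral displacement (cm) to reach each damage state, and a common
# lognormal standard deviation beta -- placeholder numbers only
medians = {"slight": 1.0, "moderate": 2.5, "extensive": 6.0, "complete": 12.0}
beta = 0.6
sd_performance = 4.0   # spectral displacement at the performance point, cm

# P(damage >= state) from each fragility curve
p_exceed = {s: lognorm.cdf(sd_performance, s=beta, scale=m)
            for s, m in medians.items()}

# convert exceedance probabilities into discrete damage-state probabilities
states = list(medians)
p_state = {}
for i, s in enumerate(states):
    upper = p_exceed[states[i + 1]] if i + 1 < len(states) else 0.0
    p_state[s] = p_exceed[s] - upper
p_state["none"] = 1.0 - p_exceed[states[0]]
print(p_state)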

Relevance:

30.00%

Abstract:

The Simultaneous Multiple Surfaces (SMS) method was developed as a design method in Nonimaging Optics during the 1990s and was later extended to the design of Imaging Optics. We present an overview of the method applied to imaging optics in planar (2D) geometry and compare the results with more classical designs based on achieving aplanatism of different orders; these classical designs can also be viewed as particular cases of SMS designs. Systems with up to 4 aspheric surfaces are shown. The SMS design strategy is shown to always perform better than the classical designs in terms of image quality. Moreover, the SMS method is a direct method, i.e., it is not based on multi-parametric optimization techniques. This gives the method additional interest, since it can be used to explore solutions where multi-parametric techniques may get lost among multiple local minima.