956 results for Data Sets
Abstract:
Seismic recordings from IRIS/IDA/GSN station CMLA and from several temporary stations in the Azores archipelago are processed with P and S receiver function (PRF and SRF) techniques. Unlike regional seismic tomography, these methods provide estimates of the absolute velocities and of the Vp/Vs ratio down to a depth of ~300 km. Joint inversion of PRFs and SRFs for a few data sets consistently reveals a division of the subsurface medium into four zones with distinctly different Vp/Vs ratios: the ~20 km thick crust, with a ratio of ~1.9 in the lower crust; the high-Vs mantle lid, with a Vp/Vs ratio strongly reduced relative to the standard 1.8; the low-velocity zone (LVZ), with a velocity ratio of ~2.0; and the underlying upper-mantle layer, with a standard velocity ratio. Our estimates of crustal thickness greatly exceed previous estimates (~10 km). The base of the high-Vs lid (the Gutenberg discontinuity) is at a depth of ~80 km. The LVZ, with an S-velocity reduction of ~15% relative to the standard (IASP91) model, terminates at a depth of ~200 km. The average thickness of the mantle transition zone (TZ) is evaluated from the time difference between the S410p and SKS660p seismic phases, which are robustly detected in the S and SKS receiver functions. This thickness is practically equal to the standard IASP91 value of 250 km and is characteristic of a large region of the North Atlantic outside the Azores plateau. Our data are indicative of a reduction of the S-wave velocity of several percent relative to the standard velocity in a depth interval from 460 to 500 km. This reduction is found in the immediate vicinity of the Azores, in the region sampled by the PRFs, but, as evidenced by the SRFs, it is missing at a distance of a few hundred kilometers from the islands. We speculate that this anomaly may correspond to the source of a plume that generated the Azores hotspot.
Previously, a low S velocity in this depth range was found with SRF techniques beneath a few other hotspots.
Abstract:
The crustal and lithospheric mantle structure of the south segment of the west Iberian margin was investigated along a 370 km long seismic transect. The transect runs from unthinned continental crust onshore to oceanic crust, crossing the ocean-continent transition (OCT) zone. The wide-angle data set includes recordings from 6 OBSs and 2 inland seismic stations. Kinematic and dynamic modeling provided a 2D velocity model that proved consistent with the modeled free-air anomaly data. The interpretation of coincident multi-channel near-vertical and wide-angle reflection data sets allowed the identification of four main crustal domains: (i) continental (east of 9.4°W); (ii) continental thinning (9.4°W-9.7°W); (iii) transitional (9.7°W-~10.5°W); and (iv) oceanic (west of ~10.5°W). In the continental domain the complete crustal section of slightly thinned continental crust is present. The upper (UCC, 5.1-6.0 km/s) and lower continental crust (LCC, 6.9-7.2 km/s) are seismically reflective and have intermediate to low P-wave velocity gradients. The middle continental crust (MCC, 6.35-6.45 km/s) is generally unreflective, with a low velocity gradient. The main thinning of the continental crust occurs in the thinning domain by attenuation of the UCC and the LCC. Major thinning of the MCC starts west of the LCC pinchout point, where the MCC rests directly upon the mantle. In the thinning domain the Moho slope is at least 13° and the continental crust thins seaward from 22 to 11 km over a ~35 km distance, stretched by a factor of 1.5 to 3. In the oceanic domain a two-layer high-gradient igneous crust (5.3-6.0 km/s; 6.5-7.4 km/s) was modeled. The intra-crustal interface correlates with prominent mid-basement, 10-15 km long reflections in the multi-channel seismic profile. Strong secondary reflected PmP phases require a first-order discontinuity at the Moho.
The sedimentary cover can be as thick as 5 km, and the igneous crustal thickness varies from 4 to 11 km in the west, where the profile reaches the Madeira-Tore Rise. In the transitional domain the crust has a complex structure that varies both horizontally and vertically. Beneath the continental slope it includes exhumed continental crust (6.15-6.45 km/s). Strong diffractions were modeled to originate at the lower interface of this layer. The western segment of this transitional domain is highly reflective at all levels, probably due to dykes and sills, consistent with the high apparent susceptibility and density modeled at this location. The sub-Moho mantle velocity is found to be 8.0 km/s, but velocities lower than 8.0 km/s confined to short segments are not excluded by the data. Strong P-wave wide-angle reflections are modeled to originate at a depth of 20 km within the lithospheric mantle under the eastern segment of the oceanic domain, and even deeper beneath the transitional domain, suggesting a layered structure for the lithospheric mantle. Both the interface depths and the velocities of the continental section are in good agreement with those of the conjugate Newfoundland margin. A ~40 km wide OCT with a geophysical signature distinct from that of the OCT to the north favors a two-pulse continental breakup.
Abstract:
The only Iberian Lower Jurassic paleomagnetic pole comes from the "Central Atlantic Magmatic Province"-related Messejana Plasencia dyke, but the age and origin of its remanence have been a matter of discussion. With the aim of resolving this uncertainty, and of gaining a better understanding of the dyke's emplacement and other possible tectonic features, a systematic paleomagnetic investigation of 40 sites (625 specimens) distributed along the 530 km of the Messejana Plasencia dyke has been carried out. Rock magnetic experiments indicate pseudo-single-domain (PSD) low-Ti titanomagnetite and magnetite as the minerals carrying the NRM. The samples were mostly thermally demagnetized. Most sites exhibit a characteristic remanent component of normal polarity, with the exception of two sites where samples with reversed polarities have been observed. The paleomagnetic pole derived from a total of 35 valid sites is representative of the whole structure of the dyke and statistically well defined, with values of PLa = 70.4 degrees N, PLo = 237.6 degrees E, K = 47.9 and A(95) = 3.5 degrees. The paleomagnetic data indicate that: (i) there is no evidence of a Cretaceous remagnetization in the dyke, as had been suggested; (ii) most of the dyke had a brief emplacement time, although two dyke intrusion events separated from it by at least 10,000 y have been detected; (iii) the tight grouping of the VGP directions suggests no important tectonic perturbation of the whole structure of the dyke since its intrusion; (iv) the pole derived from this study is a good-quality Lower Jurassic paleopole for the Iberian plate; and (v) the Messejana Plasencia dyke paleopole is also in agreement with quality-selected European and North American Lower Jurassic paleopoles and with the magnetic anomaly data sets available to rotate them to Iberia.
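The site statistics quoted above (K and A(95)) follow Fisher's (1953) spherical statistics. A minimal sketch of that computation, using hypothetical declination/inclination pairs rather than the study's data:

```python
import math

def fisher_stats(directions):
    """Fisher (1953) statistics for (declination, inclination) pairs in
    degrees: resultant length R, precision parameter k, and the 95%
    confidence cone a95 (degrees) about the mean direction."""
    x = y = z = 0.0
    for dec, inc in directions:
        d, i = math.radians(dec), math.radians(inc)
        x += math.cos(i) * math.cos(d)   # north component of unit vector
        y += math.cos(i) * math.sin(d)   # east component
        z += math.sin(i)                 # down component
    n = len(directions)
    r = math.sqrt(x * x + y * y + z * z)
    k = (n - 1) / (n - r)                # precision parameter
    a95 = math.degrees(math.acos(
        1 - (n - r) / r * (20 ** (1 / (n - 1)) - 1)))  # p = 0.05 -> 1/p = 20
    return r, k, a95

# Tightly clustered hypothetical site means -> large k, small a95.
r, k, a95 = fisher_stats([(0, 45), (2, 45), (358, 45), (0, 43), (0, 47)])
```

Tightly grouped directions drive R toward n, so k grows and the confidence cone shrinks, which is why the well-defined pole above has K = 47.9 and a small A(95).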
Abstract:
The financial literature and the financial industry often use zero-coupon yield curves as input for testing hypotheses, pricing assets, or managing risk, and they assume the provided data are accurate. We analyze the implications of the methodology and of the sample-selection criteria used to estimate the zero-coupon bond yield term structure on the resulting volatility of spot rates with different maturities. We obtain the volatility term structure using historical volatilities and EGARCH volatilities. As input for these volatilities we consider our own spot-rate estimates from GovPX bond data and three popular interest rate data sets: from the Federal Reserve Board, from the US Department of the Treasury (H15), and from Bloomberg. We find strong evidence that the resulting zero-coupon bond yield volatility estimates, as well as the correlation coefficients among spot and forward rates, depend significantly on the data set. We observe differences that are relevant in economic terms when the volatilities are used to price derivatives.
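The historical-volatility leg of such an analysis reduces to annualizing the dispersion of daily spot-rate changes at each maturity. A minimal sketch under assumed conventions (252 trading days, a 60-day window, synthetic rates; none of these values come from the paper):

```python
import numpy as np

def historical_vol_term_structure(spot_rates, window=60, periods_per_year=252):
    """Annualized rolling historical volatility of daily spot-rate changes.

    spot_rates: array of shape (n_days, n_maturities), one column per
    maturity. Returns one volatility per maturity, estimated over the
    last `window` daily changes."""
    changes = np.diff(spot_rates, axis=0)   # daily rate changes
    recent = changes[-window:]              # most recent window
    return recent.std(axis=0, ddof=1) * np.sqrt(periods_per_year)

# Synthetic curves: the second maturity's rate moves twice as much per day.
rng = np.random.default_rng(0)
rates = 3.0 + np.cumsum(rng.normal(0.0, [0.01, 0.02], size=(500, 2)), axis=0)
vols = historical_vol_term_structure(rates)
```

Running the same routine on spot rates bootstrapped from different data sets is exactly where the paper finds the estimates diverge.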
Abstract:
OBJECTIVE: To assess the variation in Anopheles darlingi's biting activity compared to that of An. marajoara in the same locality and to biting-activity data from other regions. METHODS: Using human bait, eight observations of the biting activity of An. darlingi and An. marajoara were carried out during 1999 and 2000 in the municipality of São Raimundo do Pirativa, state of Amapá, Brazil. Each observation consisted of three consecutive 13-hour collections close to the full moon. Collectors rotated among observation points and nocturnal periods. RESULTS: An. darlingi revealed considerable plasticity of biting activity, in contrast to An. marajoara, which showed well-defined crepuscular biting peaks. No significant correlation between density and biting activity was found, but a significant correlation existed between time and proportional crepuscular activity, indicating underlying ecological processes not yet understood. Two of the four available data sets with multiple observations at one locality also showed considerable plasticity in this species' biting patterns. CONCLUSION: Intra-population variation of biting activity can be as significant as inter-population variation. Some implications for malaria vector control and for specific studies are also discussed.
Abstract:
Tuberculosis (TB) is a worldwide infectious disease that has shown extremely high mortality levels over time. The urgent need to develop new antitubercular drugs stems from the increasing rate of appearance of strains that are multi-drug resistant to the commonly used drugs, and from the longer durations of therapy and recovery, particularly in immuno-compromised patients. The major goal of the present study is the exploration of data from different families of compounds through a variety of machine learning techniques, so that robust QSAR-based models can be developed to further guide the quest for new potent anti-TB compounds. Eight QSAR models were built using various types of descriptors (from the ADRIANA.Code and Dragon software) with two publicly available, structurally diverse data sets, including recent data deposited in PubChem. The QSAR methodologies used Random Forests and Associative Neural Networks. Predictions for the external evaluation sets achieved accuracies in the range of 0.76-0.88 (for active/inactive classifications) and Q(2) = 0.66-0.89 for regressions. The models developed in this study can be used to estimate the anti-TB activity of drug candidates at early stages of drug development. (C) 2011 Elsevier B.V. All rights reserved.
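The performance measures reported for the external evaluation sets follow standard definitions: classification accuracy for active/inactive calls and Q(2) (predictive squared correlation) for regressions. A minimal sketch with synthetic values, not the study's data:

```python
import numpy as np

def q_squared(y_true, y_pred):
    """External-set predictive Q^2: 1 - SS_res / SS_tot, with SS_tot
    taken around the mean of the observed activities."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def accuracy(y_true, y_pred):
    """Fraction of correct active/inactive classifications."""
    return float(np.mean(np.asarray(y_true) == np.asarray(y_pred)))
```

Q(2) equals 1 for perfect external predictions and can go negative when a model predicts worse than the observed mean, which is why ranges like 0.66-0.89 indicate usable regression models.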
Abstract:
Copyright © 2013 Springer Netherlands.
Abstract:
Master's in Chemical Engineering, branch of energy optimization in the chemical industry.
Abstract:
The navigation of autonomous vehicles in unstructured environments remains an open problem, and the complexity of the real world is still a challenge. The difficulty of characterizing irregular terrain and dynamic, poorly distinctive objects, together with the absence of localization references, has driven the study and development of methods for modeling three-dimensional space efficiently and in real time. The work carried out in this dissertation is part of the strategy of the Laboratório de Sistemas Autónomos (LSA) for the research and development of sensor systems that increase the perception capabilities of robotic platforms. The development of a three-dimensional modeling system aims to give the LINCE (Land INtelligent Cooperative Explorer) and TIGRE (Terrestrial Intelligent General proposed Robot Explorer) projects greater autonomy and better exploration and mapping capabilities. We present some of the sensors used for acquiring three-dimensional models, some of the most widely used mapping methods, and their application on robotic platforms. Throughout this dissertation, techniques for obtaining three-dimensional models are presented and validated. The problems of analyzing the color and geometry of objects and of building realistic models that represent them are addressed. We developed a system that obtains three-dimensional volumetric data from multiple readings of a medium-range two-dimensional Laser Range Finder, merging the resulting data sets into a coherent, referenced point cloud. Segmentation techniques were developed and implemented that inspect a point cloud and classify it according to its geometric characteristics and the type of structures it represents.
Some techniques for building Digital Elevation Maps are presented, and a new method was developed that takes advantage of the segmentation performed.
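The core geometric step in building a 3D point cloud from a tilting 2D Laser Range Finder is projecting each range/bearing sample through the scanner's tilt angle. A minimal sketch, assuming a hypothetical frame convention (x forward, y left, z up; scanner pitched downward by `tilt`) rather than the dissertation's actual implementation:

```python
import math

def scan_to_points(ranges, angle_min, angle_step, tilt):
    """Project one 2D laser scan, taken with the scanner pitched downward
    by `tilt` radians, into 3D points in the sensor frame
    (x forward, y left, z up)."""
    points = []
    for i, r in enumerate(ranges):
        a = angle_min + i * angle_step          # bearing in the scan plane
        x_p, y_p = r * math.cos(a), r * math.sin(a)
        # rotate the scan plane about the y-axis by the tilt angle
        points.append((x_p * math.cos(tilt), y_p, -x_p * math.sin(tilt)))
    return points
```

Accumulating the points from many scans at successive tilt angles, after transforming each into a common reference frame, yields the coherent volumetric cloud described above.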
Abstract:
With the expansion of digital television and the convergence between conventional broadcasting and television over IP, the number of available channels has grown steadily, leaving viewers struggling to choose which program to watch. Overloaded with a large number of programs and associated information, many viewers systematically give up on watching a program and tend to zap across channels or to always watch the same programs or channels. Recommender systems present themselves as a solution to this information-overload problem. This thesis studies some of the existing television recommender systems and develops an application that recommends a set of programs of potential interest to the viewer. The main concepts of recommendation algorithms are covered, and some of the television program recommender systems developed to date are presented. To produce recommendations, two algorithms were developed, based respectively on collaborative filtering and content-based filtering. By computing the similarity between items or between users, these algorithms predict the rating a user would assign to a given item (television program, film, etc.), making it possible to estimate the user's potential interest in that item. The data sets describing program characteristics (title, genre, actors, etc.) are stored according to the TV-Anytime standard. This multimedia content description standard has the advantage of being specifically designed for audiovisual content and is freely available.
The set of recommendations obtained is presented to the user through a Web application that integrates all of the system's components. To validate the work, a test dataset named hetrec2011-movielens-2k was used, whose content corresponds to a set of films rated by many users in a real environment. Besides the ratings assigned by users, this film set includes data describing genre, directors, filmmakers, and country of origin. For final validation, several tests were performed, the most relevant of which evaluated the distance between predictions and actual values, with the goal of assessing how accurately the developed algorithms predict the ratings users would assign to the items analyzed.
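Collaborative filtering of the kind described predicts a rating as a similarity-weighted average over other ratings. A minimal item-based sketch, with a hypothetical `ratings` layout (item -> {user: rating}); cosine similarity stands in for whichever measure the thesis actually uses:

```python
import math

def cosine(u, v):
    """Cosine similarity between two items over the users who rated both.

    u, v: dicts mapping user -> rating."""
    common = set(u) & set(v)
    if not common:
        return 0.0
    num = sum(u[k] * v[k] for k in common)
    den = math.sqrt(sum(u[k] ** 2 for k in common)) * \
          math.sqrt(sum(v[k] ** 2 for k in common))
    return num / den if den else 0.0

def predict(ratings, user, item):
    """Predict `user`'s rating of `item` as the similarity-weighted
    average of that user's ratings of the other items.

    ratings: dict mapping item -> {user: rating}."""
    num = den = 0.0
    for other, user_ratings in ratings.items():
        if other == item or user not in user_ratings:
            continue
        s = cosine(ratings[item], user_ratings)
        num += s * user_ratings[user]
        den += abs(s)
    return num / den if den else None
```

Comparing such predictions against the held-out ratings in the test dataset is exactly the prediction-versus-actual-value evaluation described above.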
Abstract:
Proceedings of International Conference - SPIE 7477, Image and Signal Processing for Remote Sensing XV - 28 September 2009
Abstract:
Microarrays allow thousands of genes to be monitored simultaneously, quantifying the abundance of their transcripts under the same experimental condition at the same time. Among the available array technologies, two-channel cDNA microarray experiments appear in numerous technical protocols associated with genomic studies, and they are the focus of this work. Microarray experiments involve many steps, each of which can affect the quality of the raw data. Background correction and normalization are preprocessing techniques used to clean and correct the raw data when undesirable fluctuations arise from technical factors. Several recent studies have shown that no preprocessing strategy outperforms the others in all circumstances, so it seems difficult to provide general recommendations. In this work, we propose using exploratory techniques to visualize the effects of preprocessing methods on the statistical analysis of two-channel cancer microarray data sets in which the cancer types (classes) are known. The arrow plot was used to select differentially expressed genes, and the graph of profiles resulting from correspondence analysis was used to visualize the results. Six background-correction methods and six normalization methods were combined, giving 36 preprocessing strategies, which were analyzed on a published cDNA microarray database (Liver), available at http://genome-www5.stanford.edu/, whose microarrays were already classified by cancer type. All statistical analyses were performed with the R statistical software.
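As one concrete example of the normalization step, global median normalization shifts the per-spot log-ratios of the two channels so their median is zero. A minimal sketch with toy intensities, not the Liver data (the study itself compares six normalization methods, of which this is only the simplest kind):

```python
import math

def log_ratios(red, green):
    """Per-spot M-values: log2 of the red/green intensity ratio."""
    return [math.log2(r / g) for r, g in zip(red, green)]

def median_normalize(m_values):
    """Global median normalization: shift the M-values so their median is
    zero, removing a constant dye bias between the two channels."""
    s = sorted(m_values)
    n = len(s)
    med = s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])
    return [m - med for m in m_values]
```

After normalization, a spot's sign indicates over- or under-expression relative to the array-wide trend rather than to the raw dye balance.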
Abstract:
Master's in Radiation Applied to Health Technologies - Specialization branch: Radiation Therapy.
Abstract:
Catastrophic events, such as wars and terrorist attacks, tornadoes and hurricanes, earthquakes, tsunamis, floods and landslides, are always accompanied by a large number of casualties. The size distributions of these casualties have separately been shown to follow approximate power law (PL) distributions. In this paper, we analyze the statistical distributions of the number of victims of catastrophic phenomena, in particular terrorism, and find double PL behavior. This means that the data sets are better approximated by two PLs rather than a single one. We plot the PL parameters corresponding to several events and observe an interesting pattern in the charts, where the lines that connect each pair of points defining the double PLs are almost parallel to each other. A complementary data analysis is performed by means of the computation of the entropy. The results reveal relationships hidden in the data that may trigger a future comprehensive explanation of this type of phenomena.
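A single PL regime can be fitted with the continuous maximum-likelihood estimator, and a double PL by applying it on either side of a breakpoint. A minimal sketch with synthetic data; the breakpoint here is assumed known rather than estimated, unlike in the paper:

```python
import math
import random

def pl_exponent(data, xmin):
    """Continuous maximum-likelihood power-law exponent for the tail
    x >= xmin: alpha = 1 + n / sum(ln(x / xmin))."""
    tail = [x for x in data if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

def double_pl_exponents(data, xmin, xbreak):
    """Fit the two regimes of a double power law separately: one exponent
    for xmin <= x < xbreak and one for the tail x >= xbreak.  (The body
    fit ignores the truncation at xbreak, so it is only indicative.)"""
    body = [x for x in data if xmin <= x < xbreak]
    return pl_exponent(body, xmin), pl_exponent(data, xbreak)

# Synthetic check: inverse-transform sampling from a pure power law with
# alpha = 2.5 and xmin = 1, then recovery of the exponent.
random.seed(1)
sample = [(1 - random.random()) ** (-1 / 1.5) for _ in range(5000)]
alpha_hat = pl_exponent(sample, 1.0)
```

For genuinely double-PL data the two fitted exponents differ, which is the signature the paper detects in the casualty data sets.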
Abstract:
This paper reports on the analysis of tidal breathing patterns measured during noninvasive forced oscillation lung function tests in six individual groups. The three adult groups were healthy, with prediagnosed chronic obstructive pulmonary disease, and with prediagnosed kyphoscoliosis, respectively. The three groups of children were healthy, with prediagnosed asthma, and with prediagnosed cystic fibrosis, respectively. The analysis is applied to the pressure–volume curves and to the pseudo-phase-plane loop by means of the box-counting method, which gives a measure of the area within each loop. The objective was to verify whether there is a link between the area of the loops, power-law patterns, and alterations in the respiratory structure with disease. We obtained statistically significant variations between the data sets corresponding to the six groups of patients, also showing the existence of power-law patterns. Our findings support the idea that the respiratory system changes with disease in terms of airway geometry and tissue parameters, leading, in turn, to variations in the fractal dimension of the respiratory tree and its dynamics.
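The box-counting area measure described above can be sketched by overlaying a grid on each loop and counting occupied cells. A minimal illustration on a polygonal loop, using a ray-casting interior test; the helper names are hypothetical, not the authors' implementation:

```python
def inside(px, py, poly):
    """Even-odd ray-casting test: is point (px, py) inside the polygon?"""
    n, hit = len(poly), False
    for i in range(n):
        (x1, y1), (x2, y2) = poly[i], poly[(i + 1) % n]
        if (y1 > py) != (y2 > py) and \
           px < x1 + (py - y1) * (x2 - x1) / (y2 - y1):
            hit = not hit
    return hit

def box_count_area(poly, box):
    """Lay a grid of cell side `box` over the loop's bounding box and
    count the cells whose centers fall inside the loop; the enclosed
    area is approximately n_cells * box**2."""
    xs = [p[0] for p in poly]
    ys = [p[1] for p in poly]
    count = 0
    x = min(xs) + box / 2
    while x < max(xs):
        y = min(ys) + box / 2
        while y < max(ys):
            count += inside(x, y, poly)
            y += box
        x += box
    return count * box * box
```

Shrinking the box size refines the estimate, and the scaling of the count with box size is what links the loop areas to the power-law patterns discussed above.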