969 results for "Probabilidade de default"


Relevance: 10.00%

Abstract:

Seismic exploration is the main tool of petroleum exploration. As society's demand for petroleum grows and the level of exploration rises, imaging areas of complex geological structure has become the main task of the oil industry. Seismic prestack depth migration emerged for this purpose, since it images complex structures well, but its result depends strongly on the velocity model, so velocity model building for prestack depth migration has become a major research area. This thesis systematically analyzes the differences between domestic and international practice in seismic prestack depth migration and develops three methods: a tomographic velocity-analysis method that requires no layered velocity model, a residual-curvature velocity-analysis method based on a velocity model, and a method for removing pre-processing errors.

The thesis first analyzes the tomographic approach to velocity analysis, which is complete in theory but difficult to apply. The method picks first arrivals, compares the picked times with the arrival times computed in a trial velocity model, and back-projects the differences along the ray paths to obtain an updated velocity model. Its only assumption is the high-frequency approximation, so it is both effective and efficient. It still has a shortcoming, however: picking first arrivals in prestack data is difficult, because the signal-to-noise ratio is low and many events cross one another, especially in areas of complex geology. A new tomographic velocity-analysis method with no layered velocity model is therefore developed to ease the picking problem. Events need not be picked continuously in time; picks can be made selectively according to their reliability. Unlike routine tomography, the method requires not only the picked times but also the slopes of the events, and a high-resolution slope-analysis method is used to improve picking precision.

In addition, research on residual-curvature velocity analysis showed that it performs poorly and inefficiently: its assumptions are rigid, and as a local optimization method it cannot handle areas with strong lateral velocity variation. A new, globally optimized method is developed to improve the precision of velocity model building. So far, the prestack depth migration workflow here follows the one used abroad: before velocity building, the original seismic data must be corrected to a datum plane, and migration follows. The well-known success story is the Gulf of Mexico, where the near-surface structure is simple, pre-processing is simple, and its precision is high. In our country, however, most seismic work is on land, where the near-surface is complex; in some areas the pre-processing errors are large and degrade velocity building. A method is therefore developed to remove these pre-processing errors and improve the precision of the velocity model. The main contributions are: (1) an effective tomographic velocity-building method with no layered velocity model; (2) a new high-resolution slope-analysis method; (3) a globally optimized residual-curvature velocity-building method based on a velocity model; (4) an effective method for removing pre-processing errors. All of these methods have been verified on both synthetic and real seismic data.
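The back-projection step described above — compare picked and modeled traveltimes, then distribute each residual along its ray path — can be sketched in a few lines. This is a minimal, hypothetical illustration (the cell indexing, ray segments, picked times, and step size are invented), not the thesis's actual implementation:

```python
# Simplified traveltime tomography by back-projection (Kaczmarz/SIRT-style).
# A ray is a list of (cell_index, path_length) pairs; slowness s = 1/velocity.

def predicted_time(slowness, ray):
    """Traveltime through the current model along one ray."""
    return sum(slowness[c] * seg for c, seg in ray)

def backproject(slowness, rays, picked_times, n_iter=50, step=0.5):
    """Distribute each traveltime residual evenly along its ray path."""
    s = list(slowness)
    for _ in range(n_iter):
        for ray, t_obs in zip(rays, picked_times):
            residual = t_obs - predicted_time(s, ray)
            total_len = sum(seg for _, seg in ray)
            for c, _seg in ray:
                # Each cell absorbs its share of the residual:
                # delta_t = residual * seg/total_len, so delta_s = residual/total_len.
                s[c] += step * residual / total_len
    return s
```

With consistent picks, repeated sweeps drive the predicted traveltimes toward the picked ones; real implementations add ray retracing and regularization on top of this update.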

Relevance: 10.00%

Abstract:

Eight experiments tested how object-array structure and learning location influence the establishment and use of self-to-object and object-to-object spatial representations in locomotion and reorientation. In Experiments 1 to 4, participants learned either at the periphery of or amidst a regular or irregular object array, and then pointed to objects while blindfolded in three conditions: before turning (baseline), after rotating 240 degrees (updating), and after disorientation (disorientation). In Experiments 5 to 8, participants were instructed to keep track of self-to-object or object-to-object spatial representations before rotation. In each condition, the configuration error — the standard deviation, across target objects, of each target's mean signed pointing error — was calculated as the index of the fidelity of the representation used in that condition. The results indicate that participants form both self-to-object and object-to-object spatial representations after learning an object array. Object-array structure influences which representation is selected during updating: by default, the object-to-object representation is updated when people learn a regular array, and the self-to-object representation is updated when people learn an irregular array. People can also update the other representation when required to do so, but the fidelity of the representations constrains this kind of "switch": people can only switch from a low-fidelity representation to a high-fidelity one, or between two representations of similar fidelity, not from a high-fidelity representation to a low-fidelity one. Learning location may influence the fidelity of the representations. When people learned at the periphery of the array, they acquired both self-to-object and object-to-object representations of high fidelity; when they learned amidst the array, they acquired only a high-fidelity self-to-object representation, while the fidelity of the object-to-object representation was low.
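The configuration-error measure defined above lends itself to a short computational definition. A minimal sketch, with invented pointing data for three hypothetical targets:

```python
import statistics

def configuration_error(errors_by_target):
    """Configuration error: the standard deviation, across target objects,
    of each target's mean signed pointing error (in degrees)."""
    per_target_means = [statistics.fmean(errs)
                        for errs in errors_by_target.values()]
    return statistics.stdev(per_target_means)

# Hypothetical signed pointing errors (degrees) for three targets:
example = {"lamp": [10.0, 20.0], "door": [-5.0, 5.0], "chair": [5.0, 5.0]}
ce = configuration_error(example)
```

Because the per-target means are computed first, a constant bias shared by all targets cancels out; only inconsistency between targets inflates the measure, which is why it indexes the internal fidelity of the configuration.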

Relevance: 10.00%

Abstract:

The aim of this paper is to show the potential of integrating a geographic information system (GIS) with a probability model, using the Poisson distribution, to spatialize discrete variables. Statistical models are important tools in the study of environmental variables, especially given the growing importance of valuing environmental capital. The Poisson distribution is a good statistical model for handling discrete variables, since it describes their behavior. A further step is to know how these variables behave in space, by mapping their spatial distribution; for this, geographic information systems (GIS) are quite efficient (Miranda, 2005). To test the use of both tools and show their efficiency, this work presents a specific implementation using a discrete environmental variable: monthly droughts.
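As a concrete illustration of the Poisson model for a discrete variable such as the monthly number of droughts, the rate λ can be estimated from the sample mean and the probability of each count computed directly. A minimal sketch — the drought counts below are invented, not the paper's data:

```python
import math

def poisson_pmf(k, lam):
    """P(X = k) for a Poisson distribution with mean lam."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

# Hypothetical monthly drought counts for one grid cell over a year:
counts = [0, 1, 0, 2, 1, 0, 0, 3, 1, 0, 1, 0]
lam = sum(counts) / len(counts)      # maximum-likelihood estimate of the rate
p_no_drought = poisson_pmf(0, lam)   # probability of a month with no drought
```

In the GIS integration the paper describes, a value such as `p_no_drought` would be computed per cell and mapped, turning the fitted model into a spatial surface.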

Relevance: 10.00%

Abstract:

The aim of this paper is to present the potential of integrating a geographic information system (GIS) with a stochastic model, the Gamma probability distribution, to spatialize continuous variables. A case study is applied to rainfall in the State of São Paulo, using a fifteen-year time series of daily precipitation data, from January 1978 to December 1992.

Relevance: 10.00%

Abstract:

This study evaluated the growth and rubber yield of rubber tree clones [Hevea brasiliensis (Willd. ex Adr. de Juss.) Muell. Arg.] under different tapping systems, in the Cerrado conditions of the municipalities of Barro Alto and Goianésia, State of Goiás. Planting took place in February 1992, at a spacing of 8.0 x 2.5 m (500 plants/ha), in stands of 8 to 10 hectares for each of the clones RRIM 600, GT 1, PB 217, PB 235, PR 107 and PR 255, all of which received the same management practices. At eight years of age, the following evaluations were made: final stand; stem girth at 1.20 m above the ground; percentage of plants ready for tapping; cumulative rubber yield in the cup, weighed monthly; and incidence of tapping panel dryness. Yield was evaluated under nine half-spiral (1/2S) tapping systems, practiced five days per week (5d/7) and 10 months per year (10m/12), varying in tapping frequency (d/4 and d/7 = every 4 and 7 days), Ethephon concentration (ET 0.25%, 2.5%, 3.3% and 5.0%) and its application frequency during the rainy season (every 22, 30 and 35 days), as follows: 1) 1/2S, d/7, ET 2.5% every 22 days; 2) 1/2S, d/7, ET 2.5% every 30 days (reference); 3) 1/2S, d/4, ET 2.5% every 30 days; 4) 1/2S, d/7, ET 3.3% every 22 days; 5) 1/2S, d/7, ET 3.3% every 30 days; 6) 1/2S, d/7, ET 5.0% every 22 days; 7) 1/2S, d/7, ET 5.0% every 30 days; 8) 1/2S, d/7, ET 5.0% every 35 days; 9) 1/2S, d/7, ET 0.25% (10 mL sprayed per panel) every 22 days. In systems 1 to 8, Ethephon was brushed (1 mL) onto the tapping cut and up to 2 cm above it (Pa and La). The experimental design was randomized blocks, with four replicates of 10 plants per plot. Each clone constituted a separate experiment; annual cumulative yield results were subjected to analysis of variance and, where significant, system means were compared by Tukey's test at the 5% probability level.
No incidence of tapping panel dryness was observed, and the results support the following conclusions for the conditions of the region: 1) the system 1/2S, d/7, ET 2.5% every 30 days is the most suitable for tapping clones PR 255, PR 107, PB 235, PB 217 and GT 1; 2) the system 1/2S, d/7, ET 3.3% every 30 days is the most suitable for clone RRIM 600; 3) individual rubber yield (kg/plant/year) is highest in clones RRIM 600, PB 217 and PR 255, while total yield (kg/ha/year) is higher in clones RRIM 600 and PB 235; 4) clones PB 217 and PR 255 are less adapted to the region, showing lower final stand, stem girth, percentage of plants in tapping, and total rubber yield per hectare.
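The analysis step — a one-way ANOVA before Tukey's comparison of system means — can be sketched in pure Python. This is a simplified illustration: it ignores the block structure of the actual design, and the yield figures are invented:

```python
import statistics

def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA across treatment groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group sum of squares: spread of group means around the grand mean.
    ss_between = sum(len(g) * (statistics.fmean(g) - grand_mean) ** 2
                     for g in groups)
    # Within-group sum of squares: spread of observations around their group mean.
    ss_within = sum((x - statistics.fmean(g)) ** 2
                    for g in groups for x in g)
    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n - k)
    return ms_between / ms_within

# Hypothetical annual yields (kg/plant) under three tapping systems:
yields = [[2.1, 2.4, 2.2, 2.3], [2.8, 3.0, 2.9, 3.1], [2.0, 2.2, 2.1, 1.9]]
f_stat = one_way_anova_f(yields)  # a large F suggests the systems differ
```

When the F test is significant, a multiple-comparison procedure such as Tukey's HSD (as used in the study) identifies which pairs of system means actually differ.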

Relevance: 10.00%

Abstract:

This study aimed to evaluate the oil content and yield of sunflower genotypes sown as a second crop in 2014 in Campo Novo do Parecis, MT, in the experimental field of the Instituto Federal de Educação Ciência e Tecnologia de Mato Grosso. The experiment used a randomized block design with 16 treatments (16 genotypes) and four replications. Each experimental plot consisted of four rows 6.5 m long with 0.45 m row spacing, giving a plot area of 11.7 m² and a total area of 748 m². A population of 45,000 plants per hectare was used. Data were subjected to analysis of variance and the Scott-Knott test at 5% probability. The genotypes that stood out for achene yield were MG 360, AGUARÁ 06, MG 305, AGUARÁ 04, CF 101, SYN 045, GNZ NEON, HELIO 251 and SYN 3950HO. For achene oil content and oil yield, genotype MG 360 showed the highest value and stood out from the other genotypes analyzed.

Relevance: 10.00%

Abstract:

This study aimed to evaluate sunflower genotypes sown as a second crop in 2014 in Campo Novo do Parecis, MT, in the experimental field of the Instituto Federal de Educação Ciência e Tecnologia de Mato Grosso. The experiment used a randomized block design with 16 treatments (16 genotypes) and four replications. Each experimental plot consisted of four rows 6.5 m long with 0.45 m row spacing, giving a plot area of 11.7 m² and a total area of 748 m². A population of 45,000 plants per hectare was used. Data were subjected to analysis of variance and the Scott-Knott test at 5% probability. For thousand-achene mass, the genotypes that stood out were BRS 323, MG 360 and M734, while the most productive genotypes were MG 360, AGUARÁ 06, MG 305, AGUARÁ 04, CF 101, SYN 045, GNZ NEON, HELIO 251 and SYN 3950HO.

Relevance: 10.00%

Abstract:

This paper presents experimental results investigating the effects of soil liquefaction on the modal parameters (i.e. natural frequency and damping ratio) of pile-supported structures. The tests were carried out on the shaking table of the Bristol Laboratory for Advanced Dynamics Engineering (BLADE) at the University of Bristol (UK), where four pile-supported structures (two single piles and two pile groups), with and without superstructure mass, were tested. The investigation monitored the variation in natural frequency and damping of the four physical models at different degrees of excess pore water pressure generation and in the fully liquefied condition. The results showed that the natural frequency of pile-supported structures may decrease considerably owing to the loss of lateral support offered by the soil to the pile. On the other hand, the damping ratio of the structure may increase to values in excess of 20%. These findings have important design consequences: (a) for low-period structures, a substantial reduction of spectral acceleration is expected; (b) during and after liquefaction, the response of the system may be dictated by the interaction of multiple loadings — horizontal, axial and overturning moment — which were negligible prior to liquefaction; and (c) with the onset of liquefaction, the increased flexibility of the pile-supported structure may lead to larger spectral displacements, which in turn may enhance P-delta effects and consequently amplify the overturning moment. Practical implications for pile design are discussed.
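The reported drop in natural frequency can be illustrated with the single-degree-of-freedom relation f = (1/2π)√(k/m): when liquefaction removes the lateral support of the soil, the effective stiffness k falls and the frequency falls with its square root. A minimal sketch with invented stiffness and mass values (not the paper's test data):

```python
import math

def natural_frequency(k, m):
    """Natural frequency (Hz) of an SDOF oscillator: f = sqrt(k/m) / (2*pi)."""
    return math.sqrt(k / m) / (2.0 * math.pi)

# Hypothetical pile-supported structure: lateral stiffness is pile bending
# plus soil springs; liquefaction removes the soil-spring contribution.
m = 1000.0      # kg (invented)
k_pile = 1.0e5  # N/m, pile alone (invented)
k_soil = 3.0e5  # N/m, soil support (invented)

f_before = natural_frequency(k_pile + k_soil, m)
f_after = natural_frequency(k_pile, m)
# Losing 75% of the stiffness halves the frequency, since sqrt(1/4) = 1/2.
```

A halved natural frequency means a doubled period, which is consistent with the paper's point (a): a low-period structure sliding down the response spectrum sees a substantially reduced spectral acceleration.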

Relevance: 10.00%

Abstract:

For the past fifty years, interest in issues beyond pure philology has been a watchword in comparative literary studies. Comparative studies, which by default employ a variety of methods, run the major risk, as the experience of American comparative literature shows, of descending into a dangerous "everythingism" or losing their identity. The field performs well, however, when literature remains one of the segments of comparison. In such instances it proves effective in exploring the "correspondences of the arts" and the problems of identity and multiculturalism, and it contributes to research into the transfer of ideas. Hence it delves into phenomena that exist on the borderlines of literature, the fine arts and other fields of the humanities, employing the strategies of interpretation typical of each of those fields. In the process a "borderline methodology" emerges, whose distinctive feature is heterogeneity in the conduct of research. This, in turn, requires the scholar to be both ingenious and creative in selecting topics, and to possess competence in literary studies as well as in the related field.

Relevance: 10.00%

Abstract:

Postgraduate project/dissertation submitted to Universidade Fernando Pessoa in partial fulfilment of the requirements for the degree of Master in Pharmaceutical Sciences.

Relevance: 10.00%

Abstract:

Project work submitted to Universidade Fernando Pessoa in partial fulfilment of the requirements for the degree of Master in Speech Therapy, with specialization in Adult Language.

Relevance: 10.00%

Abstract:

Postgraduate project/dissertation submitted to Universidade Fernando Pessoa in partial fulfilment of the requirements for the degree of Master in Pharmaceutical Sciences.

Relevance: 10.00%

Abstract:

The Internet has brought unparalleled opportunities for expanding availability of research by bringing down economic and physical barriers to sharing. The digitally networked environment promises to democratize access, carry knowledge beyond traditional research niches, accelerate discovery, encourage new and interdisciplinary approaches to ever more complex research challenges, and enable new computational research strategies. However, despite these opportunities for increasing access to knowledge, the prices of scholarly journals have risen sharply over the past two decades, often forcing libraries to cancel subscriptions. Today even the wealthiest institutions cannot afford to sustain all of the journals needed by their faculties and students. To take advantage of the opportunities created by the Internet and to further their mission of creating, preserving, and disseminating knowledge, many academic institutions are taking steps to capture the benefits of more open research sharing. Colleges and universities have built digital repositories to preserve and distribute faculty scholarly articles and other research outputs. Many individual authors have taken steps to retain the rights they need, under copyright law, to allow their work to be made freely available on the Internet and in their institution's repository. And faculties at some institutions have adopted resolutions endorsing more open access to scholarly articles. Most recently, on February 12, 2008, the Faculty of Arts and Sciences (FAS) at Harvard University took a landmark step. The faculty voted to adopt a policy requiring that faculty authors send an electronic copy of their scholarly articles to the university's digital repository, and that faculty authors automatically grant copyright permission to the university to archive and distribute these articles unless a faculty member has waived the policy for a particular article.
Essentially, the faculty voted to make open access to the results of their published journal articles the default policy for the Faculty of Arts and Sciences of Harvard University. As of March 2008, a proposal is also under consideration in the University of California system by which faculty authors would commit routinely to grant copyright permission to the university to make copies of the faculty's scholarly work openly accessible over the Internet. Inspired by the example set by the Harvard faculty, this White Paper is addressed to the faculty and administrators of academic institutions who support equitable access to scholarly research and knowledge, and who believe that the institution can play an important role as steward of the scholarly literature produced by its faculty. This paper discusses both the motivation and the process for establishing a binding institutional policy that automatically grants a copyright license from each faculty member to permit deposit of his or her peer-reviewed scholarly articles in institutional repositories, from which the works become available for others to read and cite.

Relevance: 10.00%

Abstract:

Recent measurement-based studies reveal that most Internet connections are short in terms of the amount of traffic they carry (mice), while a small fraction of connections carry a large portion of the traffic (elephants). A careful study of the TCP protocol shows that, without help from an Active Queue Management (AQM) policy, short connections tend to lose to long connections in their competition for bandwidth. This is because short connections do not gain detailed knowledge of the network state, and are therefore doomed to be less competitive owing to the conservative nature of the TCP congestion control algorithm. Inspired by the Differentiated Services (Diffserv) architecture, we propose to give preferential treatment to short connections inside the bottleneck queue, so that short connections experience a lower packet drop rate than long connections. This is done by employing the RIO (RED with In and Out) queue management policy, which uses different drop functions for different classes of traffic. Our simulation results show that: (1) in a highly loaded network, preferential treatment is necessary to provide short TCP connections with better response time and fairness without hurting the performance of long TCP connections; (2) the proposed scheme still delivers packets in FIFO manner at each link, so it maintains the statistical multiplexing gain and does not reorder packets; (3) choosing a smaller default initial timeout value for TCP can help the performance of short TCP flows, though not as effectively as our scheme, and at the risk of congestion collapse; (4) in the worst case, our proposal works as well as a regular RED scheme in terms of response time and goodput.
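The RIO policy described above amounts to two RED drop curves: a lenient one for "In" (short-flow) packets and an aggressive one for "Out" (long-flow) packets. A minimal sketch of the drop-probability functions follows; the threshold and probability values are invented, not those used in the paper, and real RIO additionally tracks separate average queue sizes for the two classes, which this sketch omits:

```python
def red_drop_prob(avg_queue, min_th, max_th, max_p):
    """RED drop probability: 0 below min_th, a linear ramp up to max_p
    between min_th and max_th, and 1 at or above max_th."""
    if avg_queue < min_th:
        return 0.0
    if avg_queue >= max_th:
        return 1.0
    return max_p * (avg_queue - min_th) / (max_th - min_th)

def rio_drop_prob(avg_queue, is_short):
    """RIO sketch: more forgiving thresholds for short ('In') traffic."""
    if is_short:
        return red_drop_prob(avg_queue, min_th=40, max_th=70, max_p=0.02)
    return red_drop_prob(avg_queue, min_th=20, max_th=50, max_p=0.10)
```

With these illustrative parameters, at an average queue of 45 packets a short flow's packet is dropped with probability about 0.003 and a long flow's with about 0.083, so short connections see far fewer drops while all packets still leave the queue in FIFO order.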