928 results for Multiple attributes interactions
Abstract:
To extend 2D cross-hole seismic data to the surrounding 3D surface seismic data, the low-frequency data must be reconstructed toward higher frequencies, and blind deconvolution is a key technology for doing so. This paper introduces an implementation of blind deconvolution in which an optimized preconditioned conjugate gradient method is used to improve the stability of the algorithm and reduce the computational cost. The high-frequency retrieved seismic data and the cross-hole seismic data are then combined for constrained inversion. Processing of real data shows the method to be effective. To address the problem that seismic resolution cannot meet the requirements of reservoir prediction for thin fluvial-facies layers in eastern Chinese oil fields, a high-frequency data reconstruction method is proposed. The extrema of the seismic data are used to derive a modulation function, which is applied to the original seismic data to obtain the high-frequency component of the reconstructed data and rebuild broadband data. The method greatly reduces computation, and its parameters are easy to adjust. In the output profiles the original character of the seismic events is preserved, the common artifact of breaking events and introducing spurious zero crossings that produce aliasing is avoided, and interbedded details are enhanced relative to the original profiles. The effective bandwidth of the seismic data is extended, and the method is validated on field data. To address the problem, in the exploration and development of eastern Chinese oil fields, that high-frequency log data and relatively low-frequency seismic data cannot be merged, a workflow of log-data extrapolation constrained by a time-phase model based on local wave decomposition is proposed. The seismic instantaneous phase is obtained by local wave decomposition to build the time-phase model; the layers near the well are matched to establish the relation between log and seismic data; multiple log attributes are extrapolated under the constraint of the seismic equiphase map; and high-precision attribute inversion sections are produced. To compute the instantaneous phase, a new local wave decomposition method, Hilbert transform mean mode decomposition (HMMD), is proposed to improve computation speed and noise immunity. The method is applied to high-resolution reservoir prediction in the Mao2 survey of the Daqing oil field, producing multi-attribute profiles of wave impedance, gamma ray, electrical resistivity, and sand membership degree with high resolution and good lateral continuity. It proves to be an effective method for reservoir prediction and estimation.
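The abstract does not spell out the optimized variant of the preconditioned conjugate gradient method used in the blind-deconvolution step, so the following is only a minimal sketch of the standard preconditioned conjugate gradient iteration it builds on; the operator A, right-hand side b, and the Jacobi preconditioner in the usage example are illustrative placeholders, not the paper's actual wavelet/reflectivity system.

```python
import numpy as np

def preconditioned_cg(A, b, M_inv, x0=None, tol=1e-8, max_iter=200):
    """Solve A x = b for symmetric positive-definite A with preconditioner M_inv ~ A^-1."""
    x = np.zeros(b.size) if x0 is None else x0.copy()
    r = b - A @ x                  # initial residual
    z = M_inv(r)                   # preconditioned residual
    p = z.copy()                   # initial search direction
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)      # step length along p
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p  # conjugate direction update
        rz = rz_new
    return x

# Hypothetical usage on a small SPD system with a Jacobi (diagonal) preconditioner.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = preconditioned_cg(A, b, M_inv=lambda r: r / np.diag(A))
```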
Abstract:
In this paper, starting from anisotropic theory, the Zoeppritz equations of transmission theory, and the behavior of amplitude versus offset, we simplify the seismic reflection coefficient for different media and analyze the characteristics of gas- or oil-saturated strata and of VTI and HTI models, discussing the P-wave reflection relationships and the meaning of the different parameters. We use measured reservoir parameters to simulate the reservoir response, study the differing effects of gas and oil saturation, and analyze how the seismic response of the different models changes with incident angle and azimuth. Using field log data, we analyze the rock-property parameters and build the relationship between logs and elastic parameters through Gassmann theory or empirical functions: we calculate density, shear modulus, and bulk modulus, reconstruct the log curves, compute shear-wave logs, and correct logs affected by mud invasion and other environmental factors. Finally we establish the relationship between the seismic data and the logs of the saturated strata, improving the capability and reliability of reservoir prediction. Our aim is to obtain high-resolution, amplitude-preserved seismic data through prestack processing, because in incident-angle or azimuthal gathers the low signal-to-noise ratio and uneven fold degrade prestack reservoir prediction. We apply prestack noise attenuation, bin regularization, and relative amplitude preservation in the high-resolution processing sequence to preserve the stratigraphic response and suppress the effects of noise. In this paper we carried out prestack inversion in the BYT survey and fractured-reservoir characterization in the MB survey. From the inversion and multi-attribute crossplots we obtain lithology profiles and hydrocarbon-indicator profiles that predict the distribution of the reservoir and of oil. In the MB survey we obtain the orientation and density of the fractured reservoir from azimuthal variations of seismic amplitude and delineate potential oil and gas reservoirs. Prestack inversion performs better in distinguishing oil from reservoir rock.
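The abstract cites Gassmann theory for relating the logs to bulk and shear moduli without giving the relation; below is a minimal sketch of standard Gassmann fluid substitution, in which the saturated bulk modulus is computed from the dry-frame, mineral, and fluid moduli while the shear modulus is left unchanged by the pore fluid. The numerical values in the example are illustrative only and are not taken from the paper.

```python
def gassmann_saturated_bulk_modulus(k_dry, k_mineral, k_fluid, porosity):
    """Standard Gassmann equation: bulk modulus of the fluid-saturated rock.

    k_dry, k_mineral, k_fluid : bulk moduli of dry frame, mineral, and pore fluid (same units)
    porosity                  : fractional porosity (0-1)
    In Gassmann theory the shear modulus is unaffected by the pore fluid.
    """
    num = (1.0 - k_dry / k_mineral) ** 2
    den = (porosity / k_fluid
           + (1.0 - porosity) / k_mineral
           - k_dry / k_mineral ** 2)
    return k_dry + num / den

# Illustrative values (GPa): quartz-like mineral, gas-like vs brine-like pore fluid.
k_gas   = gassmann_saturated_bulk_modulus(k_dry=12.0, k_mineral=37.0, k_fluid=0.04, porosity=0.25)
k_brine = gassmann_saturated_bulk_modulus(k_dry=12.0, k_mineral=37.0, k_fluid=2.5,  porosity=0.25)
```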
Abstract:
Very long-term memory for popular music was investigated. Older and younger adults listened to 20-sec excerpts of popular songs drawn from across the 20th century. The subjects gave emotionality and preference ratings and tried to name the title, artist, and year of popularity for each excerpt. They also performed a cued memory test for the lyrics. The older adults' emotionality ratings were highest for songs from their youth; they remembered more about these songs, as well. However, the stimuli failed to cue many autobiographical memories of specific events. Further analyses revealed that the older adults were less likely than the younger adults to retrieve multiple attributes of a song together (i.e., title and artist) and that there was a significant positive correlation between emotion and memory, especially for the older adults. These results have implications for research on long-term memory, as well as on the relationship between emotion and memory.
Abstract:
The increased interconnectivity and complexity of supervisory control and data acquisition (SCADA) systems in power system networks has exposed the systems to a multitude of potential vulnerabilities. In this paper, we present a novel approach for a next-generation SCADA-specific intrusion detection system (IDS). The proposed system analyzes multiple attributes in order to provide a comprehensive solution that is able to mitigate varied cyber-attack threats. The multiattribute IDS comprises a heterogeneous white list and behavior-based concept in order to make SCADA cybersystems more secure. This paper also proposes a multilayer cyber-security framework based on IDS for protecting SCADA cybersecurity in smart grids without compromising the availability of normal data. In addition, this paper presents a SCADA-specific cybersecurity testbed to investigate simulated attacks, which has been used in this paper to validate the proposed approach.
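The abstract describes the multiattribute IDS only at the level of combining a heterogeneous whitelist with behavior-based checks; the sketch below illustrates that combination in the simplest possible form. The message fields, whitelist entries, and tolerance threshold are hypothetical and are not taken from the paper.

```python
from dataclasses import dataclass

# Hypothetical whitelist of allowed (source, destination, function_code) tuples.
ALLOWED_FLOWS = {
    ("hmi-01", "rtu-07", "read_registers"),
    ("hmi-01", "rtu-07", "write_setpoint"),
}

@dataclass
class ScadaMessage:
    source: str
    destination: str
    function_code: str
    measurement: float      # e.g. a reported bus voltage, in per-unit

def inspect_message(msg: ScadaMessage, expected: float, tolerance: float = 0.1) -> list[str]:
    """Return alerts combining a whitelist check with a behavior-based check."""
    alerts = []
    # Whitelist check: is this flow/function combination ever allowed?
    if (msg.source, msg.destination, msg.function_code) not in ALLOWED_FLOWS:
        alerts.append("non-whitelisted flow or function code")
    # Behavior check: does the reported value deviate abnormally from expectation?
    if abs(msg.measurement - expected) > tolerance * abs(expected):
        alerts.append("measurement outside expected behavioral envelope")
    return alerts

# Example: a spoofed write from an unknown host with an implausible value.
print(inspect_message(ScadaMessage("laptop-99", "rtu-07", "write_setpoint", 1.6), expected=1.0))
```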
Abstract:
Dissertation presented to the Escola Superior de Educação to obtain the degree of Master in Educational Sciences - Specialization in Special Education
Abstract:
This work is an exploratory study, grounded in the principles of qualitative research, of how Geography teachers in the city of Parnamirim (RN) conceive of landscape and of the pedagogical implications of such conceptions for the formation of students. As theoretical and methodological references we drew on principles of historical and dialectical materialism from Triviños (2007), the historical-cultural approach to education of Freire (1987; 1996) and Vygotsky (1993; 2001; 2007), the notion of conception in Morin (1996) and Ferreira (2007), and the critical approach to geography of Moraes (2005), Santos (1988; 2004; 2006) and Silva (1989; 2010). We also used oral history as a research technique, following Moraes (2004), Bertaux (2010), Ferraroti (2010) and Nóvoa (2010), with semi-structured interviews as the data collection tool. Our empirical base consists of four teachers working in four different public schools in the city mentioned above, who provided the data needed for the research. The objective of the interviews was not to verify the teachers' practice inside or outside the classroom; rather, it highlights the transitory nature of the evidence gathered in the research. We conclude that the conception of landscape most widely held by the teachers, being a process built over their lives and embedded in their pedagogical practice, prioritizes visual and morphological aspects and the affective experiences attached to them, and remains at a descriptive level. In effect, the pedagogical implications of these conceptions at school point to a teaching of geography centered on the uncritical reproduction of school content that does little to encourage learners to work through, via dialogue, the re-signification of its essential and multiple attributes, despite the various attempts at and possibilities of theoretical and methodological renewal in the application of geographical knowledge about landscape expressed in the interviewees' accounts.
Abstract:
The energy flow, dE/dη, is studied at large pseudorapidities in proton-proton collisions at the LHC, for centre-of-mass energies of 0.9 and 7 TeV. The measurements are made using the CMS detector in the pseudorapidity range 3.15 < |η| < 4.9, for both minimum-bias events and events with at least two high-momentum jets. The data are compared to various pp Monte Carlo event generators whose theoretical models and input parameter values are sensitive to the energy-flow measurements. Inclusion of multiple-parton interactions in the Monte Carlo event generators is found to improve the description of the energy-flow measurements.
Abstract:
Measurements are presented of the production of primary KS0 and Λ particles in proton-proton collisions at √s=7 TeV in the region transverse to the leading charged-particle jet in each event. The average multiplicity and average scalar transverse momentum sum of KS0 and Λ particles measured at pseudorapidities |η|<2 rise with increasing charged-particle jet pT in the range 1-10 GeV/c and saturate in the region 10-50 GeV/c. The rise and saturation of the strange-particle yields and transverse momentum sums in the underlying event are similar to those observed for inclusive charged particles, which confirms the impact-parameter picture of multiple parton interactions. The results are compared to recent tunes of the pythia Monte Carlo event generator. The pythia simulations underestimate the data by 15%-30% for KS0 mesons and by about 50% for Λ baryons, a deficit similar to that observed for the inclusive strange-particle production in non-single-diffractive proton-proton collisions. The constant strange- to charged-particle activity ratios with respect to the leading jet pT and similar trends for mesons and baryons indicate that the multiparton-interaction dynamics is decoupled from parton hadronization, which occurs at a later stage.
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
This dissertation has three separate parts: the first deals with general-pedigree association testing incorporating continuous covariates; the second deals with association tests under population stratification using conditional likelihood tests; the third deals with genome-wide association studies based on the real rheumatoid arthritis (RA) disease data sets from Genetic Analysis Workshop 16 (GAW16) Problem 1. Many statistical tests have been developed to test linkage and association using either case-control status or phenotype covariates for family data structures, but separately. Such univariate analyses may not use all the information available from the family members in practical studies. Moreover, complex human diseases do not have a clear inheritance pattern; genes may interact or act independently. In Part I, the newly proposed approach, MPDT, focuses on using both the case-control information and the phenotype covariates, and it can be applied to detect multiple marker effects. Built on two existing popular statistics in family studies, for case-control and quantitative traits respectively, the new approach can be used on simple family structures as well as general pedigrees. The combined statistic is calculated from the two component statistics, and a permutation procedure is applied to assess the p-value, with Bonferroni adjustment for multiple markers. We use simulation studies to evaluate the type I error rates and power of the proposed approach. Our results show that the combined test using both case-control information and phenotype covariates not only has correct type I error rates but is also more powerful than existing methods; for multiple marker interactions, the proposed method is also very powerful. Selective genotyping is an economical strategy for detecting and mapping quantitative trait loci in the genetic dissection of complex disease. When the samples arise from different ethnic groups or an admixed population, all existing selective genotyping methods may produce spurious associations owing to differing ancestry distributions. The problem can be more serious when the sample size is large, a general requirement for obtaining sufficient power to detect the modest genetic effects typical of most complex traits. In Part II, I describe a useful strategy for selective genotyping in the presence of population stratification. Our procedure uses a principal-component-based approach to eliminate any effect of population stratification, and its performance is evaluated using both simulated data from an earlier study and HapMap data sets under a variety of population admixture models generated from empirical data. The rheumatoid arthritis data set of GAW16 Problem 1 contains one binary trait and two continuous traits: RA status, anti-CCP, and IgM. To allow multiple traits, we propose a set of SNP-level F statistics based on the concept of multiple correlation to measure the genetic association between multiple trait values and SNP-specific genotypic scores, and we obtain their null distributions. We then perform six genome-wide association analyses using novel one- and two-stage approaches based on single, double, and triple traits. Incorporating all six analyses, we successfully validate the SNPs already identified in the literature as responsible for rheumatoid arthritis and detect additional disease-susceptibility SNPs for future follow-up studies. Except for chromosomes 13 and 18, every chromosome is found to harbour regions associated with susceptibility to rheumatoid arthritis or related diseases, such as lupus erythematosus. This topic is discussed in Part III.
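The MPDT combined statistic itself is not given in the abstract, so the sketch below only illustrates the assessment step it describes: a permutation procedure for per-marker statistics with Bonferroni adjustment across markers. The placeholder statistic (squared correlation between genotype score and a single combined phenotype score) stands in for the dissertation's actual combined case-control/covariate statistic.

```python
import numpy as np

def permutation_pvalues(genotypes, trait, n_perm=1000, rng=None):
    """Per-marker permutation p-values with Bonferroni adjustment.

    genotypes : (n_subjects, n_markers) array of genotype scores (0/1/2)
    trait     : (n_subjects,) combined phenotype score (placeholder statistic)
    """
    rng = np.random.default_rng(rng)
    n, m = genotypes.shape

    def stat(y):
        # Squared correlation of each marker with the (possibly permuted) trait.
        g = (genotypes - genotypes.mean(0)) / genotypes.std(0)
        yc = (y - y.mean()) / y.std()
        return (g.T @ yc / n) ** 2

    observed = stat(trait)
    exceed = np.zeros(m)
    for _ in range(n_perm):
        exceed += stat(rng.permutation(trait)) >= observed
    p_raw = (exceed + 1) / (n_perm + 1)          # permutation p-values
    p_bonf = np.minimum(p_raw * m, 1.0)          # Bonferroni correction for m markers
    return p_raw, p_bonf

# Illustrative call on simulated data: marker 0 carries a true effect.
rng = np.random.default_rng(0)
G = rng.integers(0, 3, size=(200, 5)).astype(float)
y = G[:, 0] * 0.5 + rng.normal(size=200)
print(permutation_pvalues(G, y, n_perm=500, rng=1)[1])
```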
Abstract:
The municipality of San Juan La Laguna, Guatemala is home to approximately 5,200 people and is located on the western side of the Lake Atitlán caldera. Steep slopes surround all but the eastern side of San Juan. The Lake Atitlán watershed is susceptible to many natural hazards, but the most predictable are the landslides that can occur annually with each rainy season, especially during high-intensity events. Hurricane Stan hit Guatemala in October 2005; the resulting flooding and landslides devastated the Atitlán region. Locations of landslide and non-landslide points were obtained from field observations and orthophotos taken following Hurricane Stan. This study used data on multiple attributes at every landslide and non-landslide point and applied different multivariate analyses to optimize a model for landslide prediction during high-intensity precipitation events like Hurricane Stan. The attributes considered in this study are: geology, geomorphology, distance to faults and streams, land use, slope, aspect, curvature, plan curvature, profile curvature and topographic wetness index. The attributes were pre-evaluated for their ability to predict landslides using four different attribute evaluators, all available in the open source data mining software Weka: filtered subset, information gain, gain ratio and chi-squared. Three multivariate algorithms (decision tree J48, logistic regression and BayesNet) were optimized for landslide prediction using different attributes. The following statistical parameters were used to evaluate model accuracy: precision, recall, F measure and area under the receiver operating characteristic (ROC) curve. The algorithm BayesNet yielded the most accurate model and was used to build a probability map of landslide initiation points. The probability map developed in this study was also compared to the results of a bivariate landslide susceptibility analysis conducted for the watershed, encompassing Lake Atitlán and San Juan. Landslides from Tropical Storm Agatha in 2010 were used to independently validate this study's multivariate model and the bivariate model. The ultimate aim of this study is to share the methodology and results with municipal contacts from the author's time as a U.S. Peace Corps volunteer, to facilitate more effective future landslide hazard planning and mitigation.
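The study's pipeline (attribute ranking, a Bayesian-network classifier, and evaluation by precision, recall, F-measure, and ROC AUC) was built in Weka; the sketch below reproduces the same shape of workflow in Python with scikit-learn stand-ins, using mutual information as an analogue of the information-gain evaluator and Gaussian naive Bayes in place of BayesNet. The data, attribute columns, and thresholds are placeholders rather than the study's inputs.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.feature_selection import mutual_info_classif
from sklearn.metrics import precision_score, recall_score, f1_score, roc_auc_score

# Placeholder data: rows = observation points, columns = terrain attributes,
# y = 1 for landslide initiation points, 0 for non-landslide points.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))                  # e.g. slope, aspect, curvature, ...
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)

# Attribute evaluation step (information-gain analogue).
scores = mutual_info_classif(X, y, random_state=0)
keep = np.argsort(scores)[-4:]                 # retain the four most informative attributes

# Train and evaluate a simple Bayesian classifier on the selected attributes.
X_tr, X_te, y_tr, y_te = train_test_split(X[:, keep], y, test_size=0.3, random_state=0)
model = GaussianNB().fit(X_tr, y_tr)
pred = model.predict(X_te)
prob = model.predict_proba(X_te)[:, 1]         # landslide probability, as for a hazard map

print("precision", precision_score(y_te, pred),
      "recall", recall_score(y_te, pred),
      "F", f1_score(y_te, pred),
      "ROC AUC", roc_auc_score(y_te, prob))
```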
Abstract:
The process pp → W± J/ψ provides a powerful probe of the production mechanism of charmonium in hadronic collisions, and is also sensitive to multiple parton interactions in the colliding protons. Using the 2011 ATLAS dataset of 4.5 fb−1 of √s = 7 TeV pp collisions at the LHC, the first observation is made of the production of W± + prompt J/ψ events in hadronic collisions, using W± → μν and J/ψ → μ+μ−. A yield of 27.4 +7.5 −6.5 W± + prompt J/ψ events is observed, with a statistical significance of 5.1σ. The production rate as a ratio to the inclusive W± boson production rate is measured, and the double parton scattering contribution to the cross section is estimated.
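The abstract does not state how the double parton scattering contribution is estimated; such estimates conventionally rely on the "pocket formula", in which the DPS cross section for two distinct hard processes is the product of their single-scattering cross sections divided by an effective cross section:

```latex
\sigma^{\mathrm{DPS}}_{W\,J/\psi} \;=\; \frac{\sigma_{W}\,\sigma_{J/\psi}}{\sigma_{\mathrm{eff}}}
```

Whether this is the exact estimator used in the ATLAS analysis is not specified in the abstract.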
Abstract:
Consumers are often less satisfied with a product chosen from a large assortment than a limited one. Experienced choice difficulty presumably causes this as consumers have to engage in a great number of individual comparisons. In two studies we tested whether partitioning the choice task so that consumers decided sequentially on each individual attribute may provide a solution. In a Starbucks coffee house, consumers who chose from the menu rated the coffee as less tasty when chosen from a large rather than a small assortment. However, when the consumers chose it by sequentially deciding about one attribute at a time, the effect reversed. In a tailored-suit customization, consumers who chose multiple attributes at a time were less satisfied with their suit, compared to those who chose one attribute at a time. Sequential attribute-based processing proves to be an effective strategy to reap the benefits of a large assortment.
Abstract:
Concerns about increasing atmospheric CO2 concentrations and global warming have initiated studies on the consequences of multiple-stressor interactions for marine organisms and ecosystems. We present a fully crossed factorial mesocosm study and assess how warming and acidification affect the abundance, body size, and fatty acid composition of copepods as a measure of nutritional quality. The experimental set-up allowed us to determine whether the effects of warming and acidification act additively, synergistically, or antagonistically on the abundance, body size, and fatty acid content of copepods, a major group of lower-level consumers in marine food webs. Copepodite (developmental stages 1-5) and nauplii abundance were antagonistically affected by warming and acidification: higher temperature decreased copepodite and nauplii abundance, while acidification partially compensated for the temperature effect. The abundance of adult copepods was negatively affected by warming. The prosome length of copepods was significantly reduced by warming, and the interaction of warming and CO2 antagonistically affected prosome length. Fatty acid composition was also significantly affected by warming: the content of saturated fatty acids increased, and the ratios of the polyunsaturated essential fatty acids docosahexaenoic acid (DHA) and arachidonic acid (ARA) to total fatty acid content increased with higher temperatures. Additionally, there was a significant additive interaction effect of both parameters on arachidonic acid. Our results indicate that in a future ocean scenario, acidification might partially counteract some observed effects of increased temperature on zooplankton, while adding to others. These may be the result of a fertilizing effect on phytoplankton as a copepod food source. In summary, copepod populations will be more strongly affected by warming than by acidifying oceans, but ocean acidification effects can modify some temperature impacts.
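In a fully crossed design like this one, whether the two stressors act additively or interactively (synergistically or antagonistically) is judged from the temperature × CO2 interaction term; the following minimal two-way ANOVA sketch shows that test on simulated numbers, not on the mesocosm data themselves.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Hypothetical fully crossed design: 2 temperature x 2 CO2 levels, replicated units.
rng = np.random.default_rng(1)
temp = np.repeat(["ambient", "warm"], 20)
co2 = np.tile(np.repeat(["ambient", "high"], 10), 2)
# Simulated copepod abundance with a built-in antagonistic interaction:
# warming lowers abundance, high CO2 partially offsets the loss under warming.
abundance = (100
             - 30 * (temp == "warm")
             + 15 * ((temp == "warm") & (co2 == "high"))
             + rng.normal(scale=5, size=40))
df = pd.DataFrame({"temp": temp, "co2": co2, "abundance": abundance})

# A significant temp:co2 term indicates a non-additive (synergistic/antagonistic) effect.
model = smf.ols("abundance ~ C(temp) * C(co2)", data=df).fit()
print(anova_lm(model, typ=2))
```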