830 results for blended workflow


Relevance: 10.00%

Abstract:

In China and worldwide, more than half of the reserves discovered in recent basin exploration are lithologic hydrocarbon reservoir reserves, and subtle reservoirs are the major target for further basin exploration. The Liaodong Bay prospect, which comprises the Liaoxi low uplift, the Liaodong uplift, the Liaoxi sag and the Liaozhong sag, is of great importance in the Bohai Sea. After decades of exploration in Liaodong Bay, few unexplored large or medium-sized favorable structural traps remain, and most of the remaining structural targets are fragmentary. Finding new prospect areas and achieving a breakthrough has therefore become the only way to relieve the severe exploration situation in Liaodong Bay.

Technique route: based on the petrophysical properties of the target area, seismic forward modeling of typical subtle trap models is carried out with analysis of logging, seismic and geologic data. According to the petrophysical characteristics, the forward-modeling results and the seismic response of actual seismic data in the target area, optimized geophysical techniques are applied to subtle trap identification, and a geophysical identification technique system for subtle reservoirs is formed.

The key research:
① Petrophysical model. Petrophysical parameters are the basic input for seismic wave simulation, and the differences in seismic response between rocks bearing different fluids must be established. Using crossplots of log data, the influence of petrophysical parameters such as porosity, shale index, fluid type and saturation on the elastic properties of rocks in the target area is analyzed. Based on current research on the Biot-Gassmann and Kuster-Toksöz models, a petrophysical parameter calculation program that can be used for fluid substitution is established.
② S-wave estimation from conventional log data. Shear-wave velocity is needed for AVO and other elastic wave-field forward modeling, but most conventional log data lack shear-wave measurements. Based on the petrophysical model research, rock S-wave parameters are therefore estimated from conventional log data with a probabilistic inverse method.
③ AVO forward modeling based on well data. For 6 wells in the JZ31-6 block and 9 wells in the LD22-1 block, synthetic AVO records are generated from log curves, and the AVO characteristics of the objective interval are classified using lithologic information.
④ 2D parameter model building and forward modeling of subtle hydrocarbon traps in the target area. According to the formation interpretation of the ESS03D seismic survey, 2D parameter models are built from log curves and seismic wave-field forward modeling is carried out for known and predicted subtle hydrocarbon traps.
⑤ Lithology and fluid identification of subtle traps in the target area. After studying the seismic response characteristics of lithology and fluids in the target area, optimized geophysical techniques are applied to lithology identification and fluid prediction.
⑥ The geophysical identification technique system for subtle reservoirs.

The innovative points of this paper:
① Based on laboratory measurements and petrophysical model theory, rock S-wave parameters are estimated from conventional log data with a probabilistic inverse method, and a fluid substitution method based on Biot-Gassmann and Kuster-Toksöz theory is provided.
② A method and workflow for simulating the seismic wave-field response of subtle hydrocarbon traps are established, based on petrophysical model building and wave-equation forward modeling.
③ The structural features of subtle traps are described; according to the different reflections of frequency wave-field structural attributes, the fluid properties of subtle traps can be identified by wave-field attenuation attributes and absorption analysis.
④ For the first time, subtle traps are identified by geophysical techniques and exploration drilling locations are proposed on that basis.
⑤ A technique system for geophysical identification of subtle reservoirs is formed, providing a workflow and research ideas applicable to other regions of interest.
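
The Biot-Gassmann fluid-substitution step named in point ① follows a standard relation. The sketch below is a minimal illustration of Gassmann's equation, not the paper's own calculator program (whose details are not given); the input values are assumed for the example.

```python
# Minimal Gassmann fluid-substitution sketch (illustrative only; the paper's
# calculator program is not described in detail). Moduli in GPa throughout.

def gassmann_saturated_k(k_dry, k_mineral, k_fluid, porosity):
    """Saturated-rock bulk modulus from Gassmann's equation."""
    numerator = (1.0 - k_dry / k_mineral) ** 2
    denominator = (porosity / k_fluid
                   + (1.0 - porosity) / k_mineral
                   - k_dry / k_mineral ** 2)
    return k_dry + numerator / denominator

# Example: compare gas- and brine-saturated moduli for an assumed sandstone.
k_dry, k_mineral, porosity = 12.0, 37.0, 0.25   # dry rock, quartz, 25% porosity
k_gas = gassmann_saturated_k(k_dry, k_mineral, k_fluid=0.04, porosity=porosity)
k_brine = gassmann_saturated_k(k_dry, k_mineral, k_fluid=2.7, porosity=porosity)
print(k_gas, k_brine)   # the shear modulus is unchanged by the pore fluid
```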

Relevance: 10.00%

Abstract:

To extend 2D cross-hole seismic data to the surrounding 3D seismic data, the low-frequency surface data must be reconstructed toward higher frequencies, and blind deconvolution is a key technology for this. In this paper, an implementation of blind deconvolution is introduced, and an optimized preconditioned conjugate gradient method is used to improve the stability of the algorithm and reduce the computation. The high-frequency reconstructed seismic data and the cross-hole seismic data are then combined for constrained inversion. Processing of real data shows that the method is effective.

To address the problem that seismic data resolution cannot meet the requirements of reservoir prediction for thin fluvial layers in eastern Chinese oil fields, a high-frequency data reconstruction method is proposed. The extrema of the seismic data are used to derive a modulation function, which is applied to the original seismic data to obtain the high-frequency part of the reconstruction and thus rebuild wide-band data. This method greatly reduces the computation and its parameters are easy to adjust. In the output profile the original character of the seismic events is preserved, the common artifact of breaking events and introducing new zero crossings that produce aliasing is avoided, and interbedded details are enhanced compared with the original profiles. The effective band of the seismic data is extended, and the method is validated by processing of field data.

To address the problem in the exploration and development of eastern Chinese oil fields that high-frequency log data and relatively low-frequency seismic data cannot be merged, a workflow for log-data extrapolation constrained by a time-phase model based on local wave decomposition is proposed. The seismic instantaneous phase is resolved by local wave decomposition to build the time-phase model, layers near the wells are matched to relate log and seismic data, multiple log attributes are extrapolated under the constraint of the seismic equiphase map, and high-precision inverted attribute sections are produced. For resolving the instantaneous phase, a new local wave decomposition method, Hilbert transform mean mode decomposition (HMMD), is proposed to improve computation speed and noise immunity. The method is applied to high-resolution reservoir prediction in the Mao2 survey of the Daqing oil field, producing attribute profiles of wave impedance, gamma-ray, electrical resistivity and sand membership degree with high resolution and good horizontal continuity. It is proved to be an effective method for reservoir prediction and estimation.
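
The instantaneous phase used to build the time-phase model is conventionally derived from the analytic signal. As a minimal sketch of that standard step (the HMMD decomposition itself is only named, not specified, in the abstract):

```python
# Instantaneous phase of a seismic trace via the analytic signal.
# Standard Hilbert-transform approach; the HMMD variant described in the
# abstract modifies the decomposition step and is not reproduced here.
import numpy as np
from scipy.signal import hilbert

def instantaneous_phase(trace):
    """Return the unwrapped instantaneous phase (radians) of a 1-D trace."""
    analytic = hilbert(trace)          # trace + i * HilbertTransform(trace)
    return np.unwrap(np.angle(analytic))

# Example on a synthetic decaying 30 Hz event sampled at 1 ms.
t = np.arange(0, 1.0, 0.001)
trace = np.sin(2 * np.pi * 30 * t) * np.exp(-3 * t)
phase = instantaneous_phase(trace)
```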

Relevance: 10.00%

Abstract:

Reflectivity sequence extraction is a key part of impedance inversion in seismic exploration. Although many valid inversion methods exist, with crosswell seismic data the frequency band of seismic data cannot be broadened enough to satisfy practical needs, and this remains an urgent problem. Prestack depth migration, developed over recent years, has become increasingly robust in exploration; it is a powerful technology for imaging geological objects with complex structure, and its final result is a reflectivity image. Based on reflectivity imaging from crosswell seismic data and the wave equation, this paper completes the following work.

First, it completes a workflow for blind deconvolution in which the Cauchy criterion is used to regularize the inversion (sparse inversion). A preconditioned conjugate gradient (PCG) method based on Krylov subspaces is incorporated to decrease the computation and improve speed, and the transition matrix is no longer required to be positive definite and symmetric. This method is applied to high-frequency recovery of crosswell seismic sections with satisfactory results.

Second, it applies a rotation transform and the Viterbi algorithm in the preprocessing for wave-equation prestack depth migration. Wave-equation prestack depth migration requires the seismic dataset to lie on a regular grid, but owing to complex terrain and folding, the acquisition geometry is sometimes irregular. At the same time, interpolation between traces is needed to avoid the aliasing produced by sparse sampling along the inline direction. In this paper the rotation transform is used to align the inlines with the coordinate axes, and the Viterbi algorithm is used for automatic picking of events, with satisfactory results.

Third, imaging is a key part of prestack depth migration besides wavefield extrapolation: however accurate the extrapolation operator is, the imaging condition strongly influences the final reflectivity image. The author migrates the Marmousi model under different imaging conditions and analyzes the methods according to the results. The computations show that the imaging condition that stabilizes the source wavefield and the least-squares estimation imaging condition proposed in this paper are better than the conventional correlation imaging condition.

Finally, the traditional pattern of "distributed computing and mass decision" widely adopted in seismic data processing is becoming an obstacle to raising the level of enterprise management. At the end of this paper a systematic solution employing the mode of "distributed computing - centralized storage - instant release" is therefore put forward, based on a combination of the C/S and B/S release models. The architecture of the solution, the corresponding web technology and the client software are introduced, and application shows the validity of this scheme.
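
The preconditioned conjugate gradient method mentioned above is a standard Krylov-subspace algorithm. As a minimal sketch with a Jacobi preconditioner (the paper's extension that relaxes the symmetric positive-definite requirement is not reproduced here):

```python
# Jacobi-preconditioned conjugate gradient for A x = b, with A symmetric
# positive definite. Illustrative sketch only.
import numpy as np

def pcg(A, b, tol=1e-8, max_iter=200):
    x = np.zeros_like(b)
    r = b - A @ x                       # initial residual
    M_inv = 1.0 / np.diag(A)            # Jacobi preconditioner
    z = M_inv * r
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p       # update search direction
        rz = rz_new
    return x

# Example on a small SPD system.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = pcg(A, b)                           # approx. [0.0909, 0.6364]
```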

Relevance: 10.00%

Abstract:

Content management. Installing Plone. Creating a Plone instance. Creating users. Document workflow scheme. Customizing the Plone interface. How to back up Plone instances.
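
For the document-workflow topic, a state transition can be scripted against Plone's workflow tool. This is a hedged sketch assuming a Plone version with plone.api available; the tutorial itself does not say which APIs it uses.

```python
# Move a document through Plone's default publication workflow.
# Sketch assuming plone.api is available; run inside a Plone instance.
from plone import api

portal = api.portal.get()
page = api.content.create(
    container=portal, type="Document", title="Workflow demo"
)
print(api.content.get_state(page))        # e.g. 'private'
api.content.transition(obj=page, transition="publish")
print(api.content.get_state(page))        # 'published'
```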

Relevance: 10.00%

Abstract:

An understanding of research is important to enable nurses to provide evidence-based care. However, undergraduate nursing students often find research a challenging subject. The purpose of this paper is to present an evaluation of the introduction of podcasts in an undergraduate research module, intended to strengthen research-teaching linkages between the theoretical content and research in practice and to improve the level of student support offered in a blended learning environment. Two cohorts of students (n=228 and n=233) were given access to a series of 5 "guest speaker" podcasts made up of presentations and interviews with research experts within Edinburgh Napier. These staff would not normally have contact with students on this module, but through the podcasts they were able to share their research expertise and methods with our learners. The main positive results suggest increased student understanding due to the multi-modal delivery approach, a more personal student/tutor relationship leading to greater engagement, and effective use of the materials for revision and consolidation. Negative effects centred on problems with the technology, most often difficulty in downloading and accessing the material. This paper contributes to the emerging knowledge base of podcasting in nurse education by demonstrating how podcasts can be used to enhance research-teaching linkages, and raises the question of why students do not exploit the opportunities for mobile learning.

Relevance: 10.00%

Abstract:

Traditional methods for phenotyping skeletal muscle (e.g., immunohistochemistry) are labor-intensive and ill-suited to multiplex analysis, i.e., assays must be performed in series. Addressing these concerns represents a largely unmet research need, but more comprehensive parallel analysis of myofibrillar proteins could advance knowledge regarding age- and activity-dependent changes in human muscle. We report a label-free, semi-automated and time-efficient LC-MS proteomic workflow for phenotyping the myofibrillar proteome. Application of this workflow in old and young as well as trained and untrained human skeletal muscle yielded several novel observations that were subsequently verified by multiple reaction monitoring (MRM). We report novel data demonstrating that human ageing is associated with lower myosin light chain 1 content and greater myosin light chain 3 content, consistent with an age-related reduction in type II muscle fibers. We also disambiguate conflicting data regarding myosin regulatory light chain, revealing that age-related changes in this protein more closely reflect physical activity status than ageing per se. This finding reinforces the need to control for physical activity levels when investigating the natural process of ageing. Taken together, our data confirm and extend knowledge regarding age- and activity-related phenotypes. In addition, the MRM transitions described here provide a methodological platform that can be fine-tuned to suit multiple research needs and thus advance myofibrillar phenotyping.

Relevance: 10.00%

Abstract:

Wydział Studiów Edukacyjnych (Faculty of Educational Studies)

Relevance: 10.00%

Abstract:

DSpace is an open source software platform that enables organizations to:

- Capture and describe digital material using a submission workflow module, or a variety of programmatic ingest options
- Distribute an organization's digital assets over the web through a search and retrieval system
- Preserve digital assets over the long term

This system documentation includes a functional overview of the system, which is a good introduction to the capabilities of the system and should be readable by nontechnical personnel. Everyone should read this section first, because it introduces some terminology used throughout the rest of the documentation. For people actually running a DSpace service, there is an installation guide and sections on configuration and the directory structure. Note that as of DSpace 1.2, the administration user interface guide is on-line help available from within the DSpace system. Finally, for those interested in the details of how DSpace works, and those potentially interested in modifying the code for their own purposes, there is a detailed architecture and design section.

Relevance: 10.00%

Abstract:

This module will introduce the item submission workflows available in DSpace. Workflows allow submissions to be checked before entering the repository. Submissions may be checked for accuracy, in order to improve the metadata, or simply to decide if they are OK to be archived. The module will show the three workflow steps available in DSpace, along with details about adding, changing and removing them from the submission process of collections.

Relevance: 10.00%

Abstract:

(adapted from the DSpace Procedures Manual developed by Kalamazoo College Digital Archive)

Relevance: 10.00%

Abstract:

For at least two millennia, and probably much longer, the traditional vehicle for communicating geographical information to end-users has been the map. With the advent of computers, the means of both producing and consuming maps have been radically transformed, while the inherent nature of the information product has also expanded and diversified rapidly. This has given rise in recent years to the new concept of geovisualisation (GVIS), which draws on the skills of the traditional cartographer but extends them into three spatial dimensions and may also add temporality, photorealistic representations and/or interactivity. Demand for GVIS technologies and their applications has increased significantly in recent years, driven by the need to study complex geographical events, in particular their associated consequences, and to communicate the results of these studies to a diversity of audiences and stakeholder groups. GVIS involves data integration, multi-dimensional spatial display, advanced modelling techniques, dynamic design and development environments, and field-specific application needs. To meet these needs, GVIS tools should be both powerful and inherently usable, in order to facilitate their role in helping to interpret and communicate geographic problems. However, no framework currently exists for ensuring this usability. The research presented here seeks to fill this gap by addressing the challenges of incorporating user requirements in GVIS tool design. It starts from the premise that usability in GVIS should be incorporated and implemented throughout the whole design and development process. To facilitate this, Subject Technology Matching (STM) is proposed as a new approach to assessing and interpreting user requirements. Based on STM, a new design framework called Usability Enhanced Coordination Design (UECD) is then presented, with the purpose of raising the overall usability of the design outputs. UECD places GVIS experts in a new key role in the design process, forming a more coordinated and integrated workflow and more focused and interactive usability testing. To prove the concept, these theoretical elements of the framework have been implemented in two test projects: one is the creation of a coastal inundation simulation for Whitegate, Cork, Ireland; the other is a flood mapping tool for Zhushan Town, Jiangsu, China. The two case studies successfully demonstrated the potential merits of the UECD approach when GVIS techniques are applied to geographic problem solving and decision making. The thesis delivers a comprehensive understanding of the development and challenges of GVIS technology, its usability concerns, and the associated user-centred design (UCD); it explores the possibility of applying a UCD framework to GVIS design; it constructs a new theoretical design framework, UECD, which aims to make the whole design process usability-driven; and it develops the key concept of STM into a template set to improve the performance of GVIS design. These conceptual and procedural foundations can be built on by future research aimed at further refining and developing UECD as a useful design methodology for GVIS scholars and practitioners.

Relevance: 10.00%

Abstract:

It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2^70 bytes), and this figure is expected to have grown by a factor of 10, to 44 zettabytes, by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store 33% of the digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner. Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising such systems have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats those characteristics as the dominant component affecting the results being sought. This multiplicity of analysis techniques introduces another layer of heterogeneity, namely heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. The question is therefore asked: can a generic solution for the monitoring and analysis of data be identified that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner? The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating those techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques to the same raw data without the danger of incorporating hidden bias. To illustrate and realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third-party analysis can be applied to verify any derived conclusions. To demonstrate these concepts, a complex real-world example involving the near real-time capture and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse those data using different analysis techniques, uncover information, incorporate that information into the system, and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, because it is complex and no comprehensive solution exists; secondly, because it requires tight interaction with domain experts, and thus the handling of subjective knowledge and inference; and thirdly, because, given the dearth of neurophysiologists, there is a real-world need to provide a solution for this domain.
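
The workflow based on production, interpretation and consumption of data suggests a simple provenance record per analysis step. The sketch below is an assumed illustration; the dissertation's actual platform and schema are not described in this abstract, and all names here are hypothetical.

```python
# Minimal provenance-record sketch for a produce/interpret/consume workflow.
# Illustrative assumptions only; not the dissertation's actual schema.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Callable

@dataclass
class ProvenanceRecord:
    step: str                     # e.g. "artifact-rejection"
    technique: str                # analysis technique applied
    inputs: list[str]             # ids of consumed data items
    outputs: list[str]            # ids of produced data items
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

log: list[ProvenanceRecord] = []

def run_step(step: str, technique: str, inputs: list[str], fn: Callable):
    """Run one analysis step and append its provenance to the log."""
    outputs = fn(inputs)
    log.append(ProvenanceRecord(step, technique, inputs, outputs))
    return outputs

# Example: a dummy filtering step over two raw EEG segment ids.
cleaned = run_step("filter", "bandpass-0.5-35Hz",
                   ["eeg-001", "eeg-002"],
                   lambda ids: [i + "-filtered" for i in ids])
```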

Relevance: 10.00%

Abstract:

The Million Mom March (favoring gun control) and Code Pink: Women for Peace (focusing on foreign policy, especially the war in Iraq) are organizations that have mobilized women as women in an era when other women's groups struggled to maintain critical mass and turned away from non-gender-specific public issues. This article addresses how these organizations fostered collective consciousness among women, a large and diverse group, while confronting the echoes of backlash against previous mobilization efforts by women. We argue that the March and Code Pink achieved mobilization success by creating hybrid organizations that blended elements of three major collective action frames: maternalism, egalitarianism, and feminine expression. These innovative organizations invented hybrid forms that cut across movements, constituencies, and political institutions. Using surveys, interviews, and content analysis of organizational documents, this article explains how the March and Code Pink met the contemporary challenges facing women's collective action in similar yet distinct ways. It highlights the role of feminine expression and concerns about the intersectional marginalization of women in resolving the historic tensions between maternalism and egalitarianism. It demonstrates hybridity as a useful analytical lens to understand gendered organizing and other forms of grassroots collective action. © 2010 American Political Science Association.

Relevance: 10.00%

Abstract:

The Harvard Dictionary of Music (1983) defines variation as "a technique modifying a given musical idea." From the Baroque period on, the form and techniques of variation were developed and enriched in Germany and France; I therefore presented works by composers from these two nations. Even though there was a vast number of possibilities, I wanted to be scholastically fair and interesting in making my selections, choosing well-known pieces along with lesser-known ones. Haydn's well-known Variations in F minor consist of two sets of double variations which break into an improvisatory fantasy. The first movement of Beethoven's Sonata in A-flat major, Op. 26, is a set of five variations on the composer's original theme; the variations take the place of the usual sonata-allegro first movement. In 1861 Brahms composed the Variations and Fugue, Op. 24, on a theme of Handel; Brahms displays a wealth of rhythmic, harmonic and textural contrasts in the variations. Chopin's E major Variations, without opus number, are written on a Swiss-influenced German folksong. Fauré's Theme and Variations in C-sharp minor, Op. 73, includes eleven variations; the work displays the composer's subtlety, grace and reticence. The 12 Variationen über ein eigenes Thema were written by Alban Berg as a composition study with Schoenberg. The finale of Dutilleux's Piano Sonata, titled "Chorale with Variations", is written in an impressionistic style; a rich expressiveness is well blended into a classical form. In 1742, the remarkable Aria and thirty variations known as the Goldberg Variations were composed by J. S. Bach; the thirty variations are unified by the bass line, which forms the foundation of the Aria. The pieces discussed above were presented in three recitals. Compact disc recordings of these recitals are available in the Michelle Smith Performing Arts Library of the Clarice Smith Performing Arts Center at the University of Maryland.

Relevance: 10.00%

Abstract:

Nolan and Temple Lang argue that "the ability to express statistical computations is an essential skill." A key related capacity is the ability to conduct and present data analysis in a way that another person can understand and replicate. The copy-and-paste workflow that is an artifact of antiquated user-interface design makes reproducibility of statistical analysis more difficult, especially as data become increasingly complex and statistical methods become increasingly sophisticated. R Markdown is a new technology that makes creating fully reproducible statistical analyses simple and painless. It provides a solution suitable not only for cutting-edge research, but also for use in an introductory statistics course. We present experiential and statistical evidence that R Markdown can be used effectively in introductory statistics courses, and discuss its role in the rapidly changing world of statistical computation.