883 results for information as a property good


Relevance:

30.00%

Publisher:

Abstract:

In oil and gas geophysical prospecting, reservoir prediction refers to forecasting the physical properties of a petroleum reservoir from seismic and well-logging data; it is research that directly guides oil field development. Singularities in seismic and logging data are caused by the heterogeneity of reservoir physical properties, and using the singularity characteristics of these data to study reservoir properties has recently become an important method. Within this approach, the main methodological difficulty is achieving quantitative reservoir prediction by analyzing the singularity of the data and improving the description of transitions. Based on wavelet transform and fractal theory, this dissertation establishes singularity judgment criteria for seismic and logging data, analyzes the quantitative relation between data singularity and reservoir physical properties, and applies the results to practical reservoir prediction. The main achievements are: 1. Herrmann (1999) proposed a method that estimates singular points and their strength at a single scale; building on it, the dissertation develops a modified algorithm that also detects singularity polarity. 2. The dissertation introduces an onset function to generalize the traditional model of geologic boundary variations, using singularity characteristics to represent the abruptness of lithologic velocity transitions. Based on convolutional seismic-model theory, we show that singularity analysis reveals generic singularity information carried from velocity or acoustic impedance into the seismogram. Theory and applications indicate that singularity information computed from seismic data is a natural attribute for delineating stratigraphic boundaries, owing to its excellent ability to detect detailed geologic features. We demonstrate that singularity analysis is a powerful tool for delineating stratigraphic boundaries and for inverting acoustic impedance and velocity. 3. The geologic significance of the singularity information in logging data is also presented. According to our analysis, the positions of singularities indicate sequence stratigraphic boundaries, there is a subtle relationship between singularity strength and the sedimentary environment, and singularity polarity can be used to recognize stratigraphic base-level cycles. On this basis, the dissertation proposes a new method for sedimentary cycle analysis from the multiscale singularity information of logging data. The method provides a quantitative tool for identifying sequence stratigraphic interfaces and achieved good results in practical application.
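The measurable quantity behind this kind of analysis is how wavelet-coefficient magnitudes scale across scales. As a minimal sketch of the idea (not the dissertation's algorithm, nor Herrmann's single-scale estimator), the following NumPy code detects a step singularity in a synthetic log and estimates its Hölder exponent from the scaling law max|W(s)| ~ s^(alpha + 1/2); the Mexican-hat wavelet and all signal parameters are illustrative assumptions.

```python
import numpy as np

def mexican_hat(t, s):
    """Mexican-hat (Ricker) wavelet at scale s, L2-normalized."""
    x = t / s
    return (1 - x**2) * np.exp(-x**2 / 2) / np.sqrt(s)

def cwt_at_scale(signal, s, width=10):
    """Continuous wavelet transform of `signal` at one scale, by convolution."""
    n = int(width * s)
    t = np.arange(-n, n + 1)
    return np.convolve(signal, mexican_hat(t, s), mode="same")

# Synthetic "log": a single velocity-like step, i.e. an isolated singularity.
x = np.zeros(1000)
x[500:] = 1.0

scales = np.array([2.0, 4.0, 8.0, 16.0])
coeffs = [cwt_at_scale(x, s) for s in scales]

# The singularity sits where the fine-scale wavelet modulus is maximal.
loc = int(np.argmax(np.abs(coeffs[0])))

# Holder exponent alpha from the scaling law max|W(s)| ~ s**(alpha + 1/2);
# for an ideal step discontinuity alpha is approximately 0.
mags = np.array([np.abs(c).max() for c in coeffs])
slope, _ = np.polyfit(np.log(scales), np.log(mags), 1)
alpha = slope - 0.5
```

A cusp or a smoothed transition would yield a different slope, which is what lets singularity strength discriminate between abrupt and gradational lithologic boundaries.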

Relevance:

30.00%

Publisher:

Abstract:

Conventional 3D seismic exploration can no longer meet the demands of high-yield, high-efficiency, safe production in coal mines, and improving the detection of coal mine geological structures is now urgent for coal production in China. Based on 3D three-component (3D3C) seismic data, this work fully exploits multi-component seismic information and carries out the first systematic study of 3D3C seismic data interpretation for coal measure strata. First, the coal measure strata are analyzed and a seismic-geologic model is built. Shear-wave logs are constructed by regression analysis, horizon calibration methods for PP- and PS-waves are studied, and the multi-wave data are interpreted jointly to map small faults. Using principal-amplitude analysis, small faults that cannot be identified on PP-wave sections can be interpreted on the low-frequency PS-wave sections, so that PS-wave data serve as an aid to fine structural interpretation. Second, post-stack, well-constrained inversion of PP- and PS-waves in coal measure strata is studied, and a joint PP/PS post-stack inversion workflow is established. Combining the inversion results yields additional attribute parameters for fine lithologic interpretation of coal measure strata. By exploring the relation between rocks with negative Poisson's ratio and anisotropy, fracture development in coal seams is predicted. The petrophysical features of coal measure strata are studied, relations between elastic parameters and lithology, fluid content and physical properties are established, and inversions of physical parameters such as porosity, permeability and water saturation, which reflect lithology and fluid properties, are obtained. Finally, shear-wave splitting analysis and Thomsen-parameter inversion, which offer new approaches to seismic anisotropy interpretation of coal measure strata, are studied to predict fracture development. The results of practical application indicate that these methods are feasible and applicable, with positive significance for high-yield, high-efficiency, safe production in coal mines.
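The fracture-prediction step hinges on Poisson's ratio derived from P- and S-wave velocities. A minimal sketch using the standard isotropic-elasticity formula nu = (Vp^2 - 2*Vs^2) / (2*(Vp^2 - Vs^2)); the velocity values below are illustrative assumptions, not data from this study.

```python
import numpy as np

def poissons_ratio(vp, vs):
    """Isotropic Poisson's ratio from P- and S-wave velocities."""
    vp = np.asarray(vp, dtype=float)
    vs = np.asarray(vs, dtype=float)
    return (vp**2 - 2 * vs**2) / (2 * (vp**2 - vs**2))

# Illustrative velocities in m/s (assumed, not from the study):
# a mudstone-like, a coal-like, and a limestone-like interval.
vp = np.array([3500.0, 2400.0, 4200.0])
vs = np.array([2000.0, 1200.0, 2500.0])
nu = poissons_ratio(vp, vs)
```

Because the formula depends only on the Vp/Vs ratio, joint PP/PS inversion, which constrains both velocities, is what makes a reliable Poisson's ratio volume obtainable in the first place.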

Relevance:

30.00%

Publisher:

Abstract:

As a typical geological and environmental hazard, landslides cause ever greater losses of property and life, yet predicting their exact time of occurrence is very difficult, and often impossible, because of their complex nature. It is also widely recognized that spending heavily on treating and preventing individual landslides is not a good solution. The research trend is therefore to study the spatial distribution of landslides and to predict potential hazard zones for a given region under given conditions. GIS (Geographical Information System) is a powerful tool for data management, for spatial analysis based on sound spatial models, and for visualization, and GIS-based landslide hazard analysis and prediction is a new and promising field. This thesis systematically studies the theory and methods of GIS-based landslide hazard analysis. Building on the project "Mountainous hazard study - landslides and debris flows" supported by the Chinese Academy of Sciences, and on earlier foundational work, it carries out model research, application, verification and analysis of model results. Landslides have identifiable triggering factors, and their distinctive landforms and topographic features can be identified from fieldwork and remote sensing imagery (aerial photographs); historical landslide records are the key to predicting future behavior. Together these form the basis for constructing a landslide spatial database. From an extensive literature review, a conceptual framework of model integration and unit combination is developed, and two types of model are put forward: a CF multiple regression model and a coupled landslide stability and hydrological distribution model. The CF multiple regression model comes from statistics and probability theory and is data-driven; because the data themselves carry the uncertain and random nature of landslide hazard, it is a good way to study and understand landslides' complex features and mechanics. The CF multiple regression model integrates CF (landslide certainty factor) with a multiple regression prediction model. CF readily handles the quantification of data and the combination of heterogeneous data types, and combining CF values helps determine the key triggering factors, which are then input to the multiple regression model; the CF regression model provides better prediction results than the traditional model. The landslide process can also be described and modeled by suitable physical and mechanical models. The coupled landslide stability and hydrological distribution model is such a physical, deterministic model, readily used for landslide hazard analysis and prediction: it couples the general limit equilibrium method with a DEM-based hydrological distribution model and serves as an effective approach both for predicting landslide occurrence under different precipitation conditions and for studying landslide mechanics. It can not only explain pre-existing landslides but also predict potential hazard regions as environmental conditions change. Finally, the thesis applies landslide hazard analysis and prediction in the Xiaojiang watershed, Yunnan, including hazard sensitivity analysis, regression prediction based on selected key factors, and determination of the relationship between landslide occurrence probability and triggering factors; the results of the coupled model are discussed in detail. After model verification and validation, the modeling results show high accuracy and good potential for application in landslide research.
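The limit-equilibrium core of such a coupled model can be illustrated with the classical infinite-slope factor of safety, where a relative wetness term stands in for the output of the hydrological distribution model. This is a sketch of the general idea only, not the thesis's implementation, and every parameter value below is an illustrative assumption.

```python
import math

def factor_of_safety(c, phi_deg, gamma, z, slope_deg, m, gamma_w=9.81):
    """Infinite-slope factor of safety with relative wetness m
    (0 = dry, 1 = fully saturated).
    Units: c in kPa, unit weights in kN/m^3, depth z in m, angles in degrees."""
    theta = math.radians(slope_deg)
    phi = math.radians(phi_deg)
    resisting = c + (gamma - m * gamma_w) * z * math.cos(theta)**2 * math.tan(phi)
    driving = gamma * z * math.sin(theta) * math.cos(theta)
    return resisting / driving

# Illustrative soil on a 35-degree slope: dry versus saturated after rainfall.
fs_dry = factor_of_safety(c=5.0, phi_deg=30.0, gamma=18.0, z=2.0,
                          slope_deg=35.0, m=0.0)
fs_wet = factor_of_safety(c=5.0, phi_deg=30.0, gamma=18.0, z=2.0,
                          slope_deg=35.0, m=1.0)
```

Evaluating this cell by cell over a DEM, with m supplied by a hydrological model for a given precipitation scenario, is the kind of coupling the thesis describes: cells whose factor of safety drops below 1 under wetter conditions mark the predicted hazard region.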

Relevance:

30.00%

Publisher:

Abstract:

SIR is a computer system, programmed in the LISP language, which accepts information and answers questions expressed in a restricted form of English. This system demonstrates what can reasonably be called an ability to "understand" semantic information. SIR's semantic and deductive ability is based on the construction of an internal model, which uses word associations and property lists, for the relational information normally conveyed in conversational statements. A format-matching procedure extracts semantic content from English sentences. If an input sentence is declarative, the system adds appropriate information to the model. If an input sentence is a question, the system searches the model until it either finds the answer or determines why it cannot find the answer. In all cases SIR reports its conclusions. The system has some capacity to recognize exceptions to general rules, resolve certain semantic ambiguities, and modify its model structure in order to save computer memory space. Judging from its conversational ability, SIR is a first step toward intelligent man-machine communication. The author proposes a next step by describing how to construct a more general system which is less complex and yet more powerful than SIR. This proposed system contains a generalized version of the SIR model, a formal logical system called SIR1, and a computer program for testing the truth of SIR1 statements with respect to the generalized model by using partial proof procedures in the predicate calculus. The thesis also describes the formal properties of SIR1 and how they relate to the logical structure of SIR.
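The original SIR was a LISP program; as an illustration of its core idea, relational facts stored on property lists and questions answered by searching the model, here is a toy Python analogue restricted to a single relation, set inclusion ("every X is a Y"), with transitive search. The vocabulary and function names are invented for illustration and are not SIR's.

```python
# Toy analogue of SIR's property-list model (the original was in LISP):
# declarative sentences add links; questions search the model transitively.
model = {}

def tell_isa(sub, sup):
    """'Every SUB is a SUP': add a subset link to SUB's property list."""
    model.setdefault(sub, set()).add(sup)

def ask_isa(sub, sup):
    """'Is every SUB a SUP?': search the model, following links
    transitively until the answer is found or the model is exhausted."""
    seen, frontier = set(), [sub]
    while frontier:
        node = frontier.pop()
        if node == sup:
            return True
        if node not in seen:
            seen.add(node)
            frontier.extend(model.get(node, ()))
    return False

tell_isa("boy", "person")
tell_isa("person", "living-thing")
```

With these two statements asserted, `ask_isa("boy", "living-thing")` succeeds by chaining the two links, while `ask_isa("person", "boy")` fails, mirroring SIR's search-until-found-or-explained behavior in miniature.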

Relevance:

30.00%

Publisher:

Abstract:

Srinivasan, A., King, R. D. and Bain, M.E. (2003) An Empirical Study of the Use of Relevance Information in Inductive Logic Programming. Journal of Machine Learning Research. 4(Jul):369-383

Relevance:

30.00%

Publisher:

Abstract:

Yeoman, A., Durbin, J. & Urquhart, C. (2004). Evaluating SWICE-R (South West Information for Clinical Effectiveness - Rural). Final report for South West Workforce Development Confederations, (Knowledge Resources Development Unit). Aberystwyth: Department of Information Studies, University of Wales Aberystwyth. Sponsorship: South West WDCs (NHS)

Relevance:

30.00%

Publisher:

Abstract:

Durbin, J., Urquhart, C. & Yeoman, A. (2003). Evaluation of resources to support production of high quality health information for patients and the public. Final report for NHS Research Outputs Programme. Aberystwyth: Department of Information Studies, University of Wales Aberystwyth. Sponsorship: Department of Health

Relevance:

30.00%

Publisher:

Abstract:

As the World Wide Web (Web) is increasingly adopted as the infrastructure for large-scale distributed information systems, issues of performance modeling become ever more critical. In particular, locality of reference is an important property in the performance modeling of distributed information systems. In the case of the Web, understanding the nature of reference locality will help improve the design of middleware, such as caching, prefetching, and document dissemination systems. For example, good measurements of reference locality would allow us to generate synthetic reference streams with accurate performance characteristics, would allow us to compare empirically measured streams to explain differences, and would allow us to predict expected performance for system design and capacity planning. In this paper we propose models for both temporal and spatial locality of reference in streams of requests arriving at Web servers. We show that simple models based only on document popularity (likelihood of reference) are insufficient for capturing either temporal or spatial locality. Instead, we rely on an equivalent, but numerical, representation of a reference stream: a stack distance trace. We show that temporal locality can be characterized by the marginal distribution of the stack distance trace, and we propose models for typical distributions and compare their cache performance to our traces. We also show that spatial locality in a reference stream can be characterized using the notion of self-similarity. Self-similarity describes long-range correlations in the dataset, which is a property that previous researchers have found hard to incorporate into synthetic reference strings. We show that stack distance strings appear to be strongly self-similar, and we provide measurements of the degree of self-similarity in our traces. 
Finally, we discuss methods for generating synthetic Web traces that exhibit the properties of temporal and spatial locality that we measured in our data.
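The stack distance representation at the heart of this analysis can be sketched directly: each request is replaced by the depth of the referenced document in an LRU stack. The following sketch uses a simple list-based implementation, quadratic in the worst case (the paper does not prescribe an implementation); first references, which have no finite depth, are assigned infinite distance.

```python
def stack_distance_trace(requests):
    """Transform a reference stream into its stack distance trace.
    Each request's distance is the depth of the requested document in
    an LRU stack (1 = top, i.e. most recently referenced); documents
    seen for the first time get float('inf')."""
    stack, trace = [], []
    for doc in requests:
        if doc in stack:
            trace.append(stack.index(doc) + 1)  # 1-based depth from the top
            stack.remove(doc)
        else:
            trace.append(float('inf'))          # first reference
        stack.insert(0, doc)                    # move (or push) to the top
    return trace

trace = stack_distance_trace(["a", "b", "a", "c", "b", "a"])
```

Temporal locality then shows up as a marginal distribution of the trace skewed toward small depths, which is exactly the property the paper models and compares against cache performance.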

Relevance:

30.00%

Publisher:

Abstract:

We describe a general technique for determining upper bounds on maximal values (or lower bounds on minimal costs) in stochastic dynamic programs. In this approach, we relax the nonanticipativity constraints that require decisions to depend only on the information available at the time a decision is made and impose a "penalty" that punishes violations of nonanticipativity. In applications, the hope is that this relaxed version of the problem will be simpler to solve than the original dynamic program. The upper bounds provided by this dual approach complement lower bounds on values that may be found by simulating with heuristic policies. We describe the theory underlying this dual approach and establish weak duality, strong duality, and complementary slackness results that are analogous to the duality results of linear programming. We also study properties of good penalties. Finally, we demonstrate the use of this dual approach in an adaptive inventory control problem with an unknown and changing demand distribution and in valuing options with stochastic volatilities and interest rates. These are complex problems of significant practical interest that are quite difficult to solve to optimality. In these examples, our dual approach requires relatively little additional computation and leads to tight bounds on the optimal values. © 2010 INFORMS.
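The zero-penalty special case of this dual approach, the perfect-information relaxation, is easy to sketch: simulate paths, evaluate a nonanticipative heuristic policy for a lower bound, and let a clairvoyant act with full hindsight for an upper bound. The toy stopping problem below, including its price dynamics and threshold policy, is entirely an illustrative assumption and not an example from the paper; the paper's penalties would tighten the clairvoyant bound.

```python
import random

random.seed(0)

def simulate_path(T=5, p0=1.0, sigma=0.3):
    """One sample path of an illustrative (assumed) price process."""
    path, p = [p0], p0
    for _ in range(T):
        p = max(p * (1.0 + random.gauss(0.0, sigma)), 0.0)
        path.append(p)
    return path

N = 2000
paths = [simulate_path() for _ in range(N)]

def heuristic_value(path):
    """Nonanticipative heuristic: stop the first time the price reaches
    1.2; otherwise take the final price. Its average value is a lower
    bound on the optimal stopping value."""
    for p in path:
        if p >= 1.2:
            return p
    return path[-1]

lower = sum(heuristic_value(pth) for pth in paths) / N

# Perfect-information relaxation with zero penalty: a clairvoyant who
# sees the whole path stops at its maximum, giving an upper bound.
upper = sum(max(pth) for pth in paths) / N
```

The optimal value is sandwiched between `lower` and `upper`; adding a well-chosen penalty for using future information, as the paper develops, shrinks the gap without requiring the original dynamic program to be solved.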

Relevance:

30.00%

Publisher:

Abstract:

We use an information-theoretic method developed by Neifeld and Lee [J. Opt. Soc. Am. A 25, C31 (2008)] to analyze the performance of a slow-light system. Slow light is realized in this system via stimulated Brillouin scattering in a 2 km-long, room-temperature, highly nonlinear fiber pumped by a laser whose spectrum is tailored and broadened to 5 GHz. We compute the information throughput (IT), which quantifies the fraction of information transferred from the source to the receiver, and the information delay (ID), which quantifies the delay of a data stream at which the information transfer is largest, for a range of experimental parameters. We also measure the eye-opening (EO) and signal-to-noise ratio (SNR) of the transmitted data stream and find that they scale in a similar fashion to the information-theoretic metrics. Our experimental findings are compared to a model of the slow-light system that accounts for all pertinent noise sources in the system as well as data-pulse distortion due to the filtering effect of the SBS process. The agreement between our observations and the predictions of our model is very good. Furthermore, we compare measurements of the IT for an optimal flattop gain profile and for a Gaussian-shaped gain profile. For a given pump-beam power, we find that the optimal profile gives a 36% larger ID and somewhat higher IT compared to the Gaussian profile. Specifically, the optimal (Gaussian) profile produces a fractional slow-light ID of 0.94 (0.69) and an IT of 0.86 (0.86) at a pump-beam power of 450 mW and a data rate of 2.5 Gbps. Thus, the optimal profile better utilizes the available pump-beam power, which is often a valuable resource in a system design.

Relevance:

30.00%

Publisher:

Abstract:

In the seventh edition, the book has been updated and revised to reflect changes in the market, the development of appraisal methods and the subsequent changes in professional practice. The initial overview in Part I of the book, The Economic and Legal Framework, has been revised to show the present position. Changes in appraisal techniques based on the research of the authors have been incorporated in Part II on Investment Valuation. Revisions have also been made, again based on the research activities of the authors, in Part III, which examines Investment Appraisal. The book serves a number of purposes. First, it provides a critical examination of valuation techniques, with particular reference to the investment method of valuation. Second, it supplies practising valuers and appraisers with more effective data, information and techniques to enable them to carry out their valuations, appraisals and negotiations in an increasingly competitive field. Finally, it provides assistance to students and academics in understanding the context of and a range of approaches to the valuation and appraisal of property investments. This book has been a key text in property investment appraisal for more than 30 years; it has sold many thousands of copies globally to academics, students and practitioners.

Relevance:

30.00%

Publisher:

Abstract:

The Healthy and Biologically Diverse Seas Evidence Group (HBDSEG) has been tasked with providing the technical advice for the implementation of the Marine Strategy Framework Directive (MSFD) with respect to descriptors linked to biodiversity. A workshop was held in London to address one of the Research and Development (R&D) proposals entitled: ‘Mapping the extent and distribution of habitats using acoustic and remote techniques, relevant to indicators for area/extent/habitat loss.’ The aim of the workshop was to identify, define and assess the feasibility of potential indicators of benthic habitat distribution and extent, and identify the R&D work which could be required to fully develop these indicators. The main points that came out of the workshop were: (i) There are many technical aspects of marine habitat mapping that still need to be resolved if cost-effective spatial indicators are to be developed. Many of the technical aspects that need addressing surround issues of consistency, confidence and repeatability. These areas should be tackled by the JNCC Habitat Mapping and Classification Working Group and the HBDSEG Seabed Mapping Working Group. (ii) There is a need for benthic ecologists (through the HBDSEG Benthic Habitats Subgroup and the JNCC Marine Indicators Group) to finalise the list of habitats for which extent and/or distribution indicators should be considered for development, building upon the recommendations from this report. When reviewing the list of indicators, benthic habitats could also be distinguished into those habitats that are defined/determined primarily by physical parameters (although including biological assemblages) (e.g. subtidal shallow sand) and those defined primarily by their biological assemblage (e.g. seagrass beds). This distinction is important as some anthropogenic pressures may influence the biological component of the ecosystem despite not having a quantifiable effect on the physical habitat distribution/extent. 
(iii) The scale and variety of UK benthic habitats make any attempt to undertake comprehensive direct mapping exercises prohibitively expensive (especially where there is a need for repeat surveys for assessment). There is a clear need therefore to develop a risk-based approach that uses indirect indicators (e.g. modelling), such as habitats at risk from pressures caused by current human activities, to develop priorities for information gathering. The next steps that came out of the workshop were: (i) A combined approach should be developed by the JNCC Marine Indicators Group together with the HBDSEG Benthic Habitats Subgroup, which will compile and ultimately synthesise all the criteria used by the three different groups from the workshop. The agreed combined approach will be used to undertake a final review of the habitats considered during the workshop, and to evaluate any remaining habitats in order to produce a list of habitats for indicator development for which extent and/or distribution indicators could be appropriate. (ii) The points of advice raised at this workshop, alongside the aforementioned combined approach and the final list of habitats for extent and/or distribution indicator development, will be used to develop a prioritised list of actions to inform the next round of R&D proposals for benthic habitat indicator development in 2014. This will be done through technical discussions within JNCC and the relevant HBDSEG Subgroups. The preparation of recommendations by these groups should take into account existing work programmes, and consider the limited resources available to undertake any further R&D work.

Relevance:

30.00%

Publisher:

Abstract:

We augment discussions about the Good Environmental Status of the North Sea by developing two extreme visions and assessing their societal benefits. One vision (‘Then’) assumes restoration of benthic functioning; we contend that trawling had already degraded the southern North Sea a century ago. Available information is used to speculate about benthic functioning in a relatively undisturbed southern North Sea. The second vision (‘Now’) draws on recent benthic functioning. The supply of five ecosystem services, supported by benthic functioning, is discussed. ‘Then’ offers confidence in the sustainable supply of diverse services but restoration of past function is uncertain and likely to be paired with costs, notably trawling restraints. ‘Now’ delivers known and valued services but sustained delivery is threatened by, for example, climate change. We do not advocate either vision. Our purpose is to stimulate debate about what society wants, and might receive, from the future southern North Sea.

Relevance:

30.00%

Publisher:

Abstract:

Coverage of conflict has, since its inception, been closely linked to the relationship between the military and the media: the relative freedom reporters maintained during the first conflicts, although not without problems; the strict censorship they suffered during World War I; and, lastly, the straitjacket treatment they have endured during recent wars. The Vietnam War marked a turning point in this relationship, and after the invasion of Grenada the military launched new information guidelines, the Department of Defense National Media Pool. The lack of clear guidance on both control and space has made the relationship between media and military a complicated one, so the rules have evolved after every conflict, shaping the future of press coverage and, with it, of war reporting.

Relevance:

30.00%

Publisher:

Abstract:

This study examines the relation between selection power and selection labor for information retrieval (IR). It is the first part of the development of a labor theoretic approach to IR. Existing models for evaluation of IR systems are reviewed and the distinction of operational from experimental systems partly dissolved. The often covert, but powerful, influence from technology on practice and theory is rendered explicit. Selection power is understood as the human ability to make informed choices between objects or representations of objects and is adopted as the primary value for IR. Selection power is conceived as a property of human consciousness, which can be assisted or frustrated by system design. The concept of selection power is further elucidated, and its value supported, by an example of the discrimination enabled by index descriptions, the discovery of analogous concepts in partly independent scholarly and wider public discourses, and its embodiment in the design and use of systems. Selection power is regarded as produced by selection labor, with the nature of that labor changing with different historical conditions and concurrent information technologies. Selection labor can itself be decomposed into description and search labor. Selection labor and its decomposition into description and search labor will be treated in a subsequent article, in a further development of a labor theoretic approach to information retrieval.