960 results for information flow


Relevance:

30.00%

Publisher:

Abstract:

Mudrocks and carbonates of the Isa superbasin in the Lawn Hill platform in northern Australia host major base metal sulfide mineralization, including the giant strata-bound Century Zn-Pb deposit. Mineral paragenesis, stable isotope, and K-Ar dating studies demonstrate that long-lived structures such as the Termite Range fault acted as hot fluid conduits several times during the Paleoproterozoic and Mesoproterozoic in response to major tectonic events. Illite and chlorite crystallinity studies suggest the southern part of the platform has experienced higher temperatures (up to 300 degrees C) than similar stratigraphic horizons in the north. The irregular downhole variation of illite crystallinity values provides further information on the thermal regime in the basin and shows that clay formation was controlled not only by temperature increase with depth but also by high water/rock ratios along relatively permeable zones. K-Ar dating of illite, in combination with other data, may indicate three major thermal events in the central and northern Lawn Hill platform at 1500, 1440 to 1400, and 1250 to 1150 Ma. This study did not detect the earlier Century base metal mineralizing event at 1575 Ma. The 1500 Ma ages are recorded only in the south and correspond to the age of the Late Isan orogeny and deposition of the Lower Roper superbasin. They may reflect exhumation of a provenance region. The 1440 to 1300 Ma ages are related to fault reactivation and a thermal pulse at ~1440 to 1400 Ma, possibly accompanied by fluid flow, with subsequent enhanced cooling possibly due to thermal relaxation or further crustal exhumation. The youngest thermal and/or fluid-flow event at 1250 to 1150 Ma is recorded mainly to the east of the Termite Range fault and may be related to the assembly of the Rodinia supercontinent. Fluids in equilibrium with illite that formed over a range of temperatures, at different times in different parts of the platform,
have relatively uniform oxygen isotope compositions and more variable hydrogen isotope compositions (delta O-18 = 3.5-9.7 parts per thousand V-SMOW; delta D = -94 to -36 parts per thousand V-SMOW). The extent of the O-18 enrichment and the variably depleted hydrogen isotope compositions suggest the illite interacted with deep-basin hypersaline brines that were composed of evaporated seawater and/or highly evolved meteoric water. Siderite is the most abundant iron-rich gangue phase in the Century Zn-Pb deposit, which is surrounded by an extensive ferroan carbonate alteration halo. Modeling suggests that the ore siderite formed at temperatures of 120 degrees to 150 degrees C, whereas siderite and ankerite in the alteration halo formed at temperatures of 150 degrees to 180 degrees C. The calculated isotopic compositions of the fluids are consistent with O-18-rich basinal brines and mixed inorganic and organic carbon sources (delta O-18 = 3-10 parts per thousand V-SMOW; delta C-13 = -7 to -3 parts per thousand V-PDB). In the northeast Lawn Hill platform, carbonate-rich rocks preserve marine to early diagenetic carbon and oxygen isotope compositions, whereas ferroan carbonate cements in siltstones and shales in the Desert Creek borehole are O-18 and C-13 depleted relative to the sedimentary carbonates. The good agreement between temperature estimates from illite crystallinity and organic reflectance (160 degrees-270 degrees C) and the inverse correlation with carbonate delta O-18 values indicate that organic maturation and carbonate precipitation in the northeast Lawn Hill platform resulted from interaction with the 1250 to 1150 Ma fluids. The calculated isotopic compositions of the fluid are consistent with an evolved basinal brine (delta O-18 = 5.1-9.4 parts per thousand V-SMOW; delta C-13 = -13.2 to -3.7 parts per thousand V-PDB) that contained a variable organic carbon component from the oxidation and/or hydrolysis of organic matter in the host sequence.
The occurrence of extensive O-18- and C-13-depleted ankerite and siderite alteration in Desert Creek is related to the high temperature of the 1250 to 1150 Ma fluid-flow event in the northeast Lawn Hill platform, in contrast to the lower temperature fluids associated with the earlier Century Zn-Pb deposit in the central Lawn Hill platform.

Relevance:

30.00%

Publisher:

Abstract:

A complete workflow specification requires careful integration of many different process characteristics. Decisions must be made as to the definitions of individual activities, their scope, the order of execution that maintains the overall business process logic, the rules governing the discipline of work list scheduling to performers, the identification of time constraints, and more. The goal of this paper is to address an important issue in workflow modelling and specification: data flow, its modelling, specification and validation. Researchers have neglected this dimension of process analysis for some time, mainly focussing on structural considerations with limited verification checks. In this paper, we identify and justify the importance of data modelling in overall workflow specification and verification. We illustrate and define several potential data flow problems that, if not detected prior to workflow deployment, may prevent the process from executing correctly, cause it to execute on inconsistent data, or even lead to process suspension. A discussion of the essential requirements of the workflow data model needed to support data validation is also given.
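One of the data flow problems of the kind the paper describes, missing data (an activity reads an item that no earlier activity has produced), can be sketched as a simple check over an ordered workflow. The activity and variable names below are hypothetical illustrations, not examples from the paper.

```python
# Minimal sketch of one data-flow check for a sequential workflow:
# flag "missing data" when an activity reads a variable that no
# earlier activity has written. Activities and variables here are
# hypothetical.

def find_missing_data(activities):
    """activities: ordered list of (name, reads, writes) tuples."""
    available = set()
    problems = []
    for name, reads, writes in activities:
        for var in reads:
            if var not in available:
                problems.append((name, var))
        available |= set(writes)
    return problems

workflow = [
    ("receive_order", set(),        {"order"}),
    ("check_credit",  {"customer"}, {"approved"}),  # 'customer' never produced
    ("ship_order",    {"order", "approved"}, set()),
]
print(find_missing_data(workflow))  # → [('check_credit', 'customer')]
```

A full validator would also handle branching and parallel paths, where a read may be satisfied on one path but not another; the ordered-list case above is the simplest instance of the idea.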

Relevance:

30.00%

Publisher:

Abstract:

Land-surface processes include a broad class of models that operate at a landscape scale. Current modelling approaches tend to be specialised towards one type of process, yet it is the interaction of processes that is increasingly seen as important for a more integrated approach to land management. This paper presents a technique and a tool that may be applied generically to landscape processes. The technique tracks moving interfaces across landscapes for processes such as water flow, biochemical diffusion, and plant dispersal. Its theoretical development applies a Lagrangian approach to motion over an Eulerian grid space by tracking quantities across a landscape as an evolving front. An algorithm for this technique, called the level set method, is implemented in a geographical information system (GIS). It fits with the field data model in GIS and is implemented as operators in map algebra. The paper describes an implementation of the level set method in a map algebra programming language, called MapScript, and gives example program scripts for applications in ecology and hydrology.
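The front-tracking idea can be illustrated with a minimal one-dimensional level set sketch: the front is the zero contour of a function phi, advanced by solving phi_t + F|∇phi| = 0 with first-order upwinding. The grid, speed, and time step here are illustrative assumptions, not MapScript code.

```python
# Minimal 1-D level-set sketch: a front (e.g., a water or dispersal
# boundary) is the zero contour of phi; advancing it at speed F
# means solving phi_t + F*|grad phi| = 0. Values are illustrative.

def evolve_front(phi, speed, dx, dt, steps):
    """Advance a 1-D level-set function with first-order upwinding."""
    n = len(phi)
    for _ in range(steps):
        new = phi[:]
        for i in range(1, n - 1):
            # upwind gradient magnitude for outward motion (speed > 0)
            dminus = (phi[i] - phi[i - 1]) / dx
            dplus = (phi[i + 1] - phi[i]) / dx
            grad = max(max(dminus, 0.0), max(-dplus, 0.0))
            new[i] = phi[i] - dt * speed * grad
        phi = new
    return phi

dx, n = 0.1, 101
x = [i * dx for i in range(n)]
phi = [abs(xi - 5.0) - 1.0 for xi in x]   # initial front at x = 4 and x = 6
phi = evolve_front(phi, speed=1.0, dx=dx, dt=0.05, steps=20)
inside = [xi for xi, p in zip(x, phi) if p < 0]
print(min(inside), max(inside))           # front has expanded outward
```

The same update, written as a map algebra operator over a 2-D raster, is essentially what the paper embeds in a GIS.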

Relevance:

30.00%

Publisher:

Abstract:

An experimental and theoretical study of the transport of mineral wool fibre agglomerates in nuclear power plant containment sumps is being performed. A racetrack channel was devised to provide data for the validation of numerical models, which are intended to model the transport of fibre agglomerates. The racetrack channel provides near-uniform and steady conditions that lead to either the sedimentation or suspension of the agglomerates. Various experimental techniques were used to determine the velocity conditions and the distribution of the fibre agglomerates in the channel. The fibre agglomerates are modelled as fluid particles in the Eulerian reference frame. Simulations of pure sedimentation of a known mass and volume of agglomerates show that the transport of the fibre agglomerates can be replicated. The suspension of the fibres is also replicated in the simulations; however, the definition of the fibre agglomerate phase is strongly dependent on the selected density and diameter. Detailed information on the morphology of the fibre agglomerates is lacking for the suspension conditions, as the fibre agglomerates may undergo breakage and erosion. Therefore, ongoing work, which is described here, is being pursued to improve the experimental characterisation of the suspended transport of the fibre agglomerates.
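The stated sensitivity to the selected density and diameter can be illustrated with the classical Stokes terminal settling velocity for a small particle. This is a back-of-envelope sketch with assumed material values, not the Eulerian two-phase model used in the study.

```python
# Illustrative sketch (not the study's CFD model): the sensitivity
# of agglomerate settling to the chosen density and diameter follows
# from the Stokes terminal velocity
#   v_t = g * d**2 * (rho_p - rho_f) / (18 * mu)
# All material values below are assumptions for illustration.

def stokes_settling_velocity(d, rho_p, rho_f=998.0, mu=1.0e-3, g=9.81):
    """Terminal settling velocity (m/s) of a small sphere in water."""
    return g * d ** 2 * (rho_p - rho_f) / (18.0 * mu)

# doubling the effective diameter roughly quadruples the velocity
v1 = stokes_settling_velocity(d=1e-3, rho_p=1100.0)
v2 = stokes_settling_velocity(d=2e-3, rho_p=1100.0)
print(v1, v2, v2 / v1)   # velocities (m/s) and their ratio (~4)
```

Because the velocity scales with d² and (ρp − ρf), an uncertain agglomerate morphology translates directly into a large uncertainty in whether the fibres settle or stay suspended, which is the difficulty the abstract describes.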

Relevance:

30.00%

Publisher:

Abstract:

Experimental investigations and computer modelling studies have been made of the refrigerant-water counterflow condenser section of a small air-to-water heat pump. The main objective of the investigation was a comparative study between the computer modelling predictions and the experimental observations for a range of operating conditions, but other characteristics of a counterflow heat exchanger are also discussed. The counterflow condenser consisted of 15 metres of a thermally coupled pair of copper pipes, one containing the R12 working fluid and the other water flowing in the opposite direction. This condenser was mounted horizontally and folded into 0.5 metre straight sections. Thermocouples were inserted in both pipes at one metre intervals, and transducers for pressure and flow measurement were also included. Data acquisition, storage and analysis were carried out by a micro-computer suitably interfaced with the transducers and thermocouples. Many sets of readings were taken under a variety of conditions, with air temperature ranging from 18 to 26 degrees Celsius, water inlet temperature from 13.5 to 21.7 degrees, R12 inlet temperature from 61.2 to 81.7 degrees, and water mass flow rate from 6.7 to 32.9 grammes per second. A Fortran computer model of the condenser (originally prepared by Carrington [1]) has been modified to match the information available from the experimental work. This program uses iterative segmental integration over the desuperheating, mixed-phase and subcooled regions for the R12 working fluid, the water always being in the liquid phase. Methods of estimating the inlet and exit fluid conditions from the available experimental data have been developed for application to the model.
Temperature profiles and other parameters have been predicted and compared with experimental values for the condenser over a range of evaporator conditions; the comparisons show that the model gives a satisfactory prediction of the physical behaviour of a simple counterflow heat exchanger in both single-phase and two-phase regions.
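The segmental integration idea can be sketched for the simplest case: a constant-temperature condensing (two-phase) region heating single-phase water, marching segment by segment along the pipe. The UA value, flow rate, and temperatures below are illustrative assumptions, not the experimental data or the Fortran model.

```python
# Minimal sketch of segmental integration along a counterflow
# condensing section, assuming the refrigerant stays at its
# saturation temperature (two-phase region only). UA, flow rate
# and temperatures are illustrative, not measured values.

def water_profile(t_sat, t_water_in, m_dot_w, ua_total, n_seg=15):
    """Water temperatures along the pipe, one entry per segment boundary."""
    cp_w = 4186.0                       # J/(kg K), liquid water
    ua_seg = ua_total / n_seg           # W/K per segment
    temps = [t_water_in]
    t = t_water_in
    for _ in range(n_seg):
        q = ua_seg * (t_sat - t)        # heat transferred in this segment, W
        t += q / (m_dot_w * cp_w)       # water warms toward t_sat
        temps.append(t)
    return temps

profile = water_profile(t_sat=55.0, t_water_in=15.0, m_dot_w=0.02,
                        ua_total=150.0)
print(round(profile[-1], 1))            # water outlet temperature (deg C)
```

The full model additionally marches through desuperheating and subcooled regions, where the refrigerant temperature also changes segment to segment, but the per-segment energy balance is the same idea.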

Relevance:

30.00%

Publisher:

Abstract:

This work is undertaken in an attempt to understand the processes at work at the cutting edge of the twist drill. Extensive drill life testing performed by the University has reinforced a survey of previously published information. This work demonstrated that there are two specific aspects of drilling which have not previously been explained comprehensively. The first concerns the interrelating of process data between differing drilling situations. There is no method currently available which allows the cutting geometry of drilling to be defined numerically, so such comparisons, where made, are purely subjective. Section one examines this problem by taking as an example a 4.5mm drill suitable for use with aluminium. This drill is examined using a prototype solid modelling program to explore how the required numerical information may be generated. The second aspect is the analysis of drill stiffness: what aspects of drill stiffness produce the very great difference in performance between short, medium and long flute length drills? These differences exist between drills of identical point geometry, and the practical superiority of short drills has been known to shop floor drilling operatives since drilling was first introduced. This problem has been dismissed repeatedly as over-complicated, but section two provides a first approximation and shows that, at least for smaller drills of 4.5mm, the effects are highly significant. Once the cutting action of the twist drill is defined geometrically, there is a huge body of machinability data that becomes applicable to the drilling process. Work remains to interpret the very high inclination angles of the drill cutting process in terms of cutting forces and tool wear, but aspects of drill design may already be looked at in new ways, with the prospect of a more analytical approach rather than the present mix of experience and trial and error.
Other problems are specific to the twist drill, such as the behaviour of the chips in the flute. It is now possible to predict the initial direction of chip flow leaving the drill cutting edge. In future, further parameters of chip behaviour may also be explored within this geometric model.

Relevance:

30.00%

Publisher:

Abstract:

The need for an adequate information system for the Highways Departments in the United Kingdom was recognised by the report of a committee presented to the Minister of Transport in 1970 (the Marshall Report). This research aims to present a comprehensive information system on a sound theoretical basis which should enable the different levels of management to execute their work adequately. The suggested system covers the different functions of the Highways Department and presents a suggested solution for problems which may occur during the planning and control of work in the different locations of the Highways Department. The information system consists of:
1. A coding system covering the cost units, cost centres and cost elements.
2. Cost accounting records for the cost units and cost centres.
3. A budgeting and budgetary control system covering the different planning methods and procedures required for preparing the capital expenditure budget, the improvement and maintenance operation flexible budgets and programme of work, the plant budget, the administration budget, and the purchasing budget.
4. A reporting system which ensures that the different levels of management receive relevant and timely information.
5. A flow of documents covering the relationship between the prime documents, the cost accounting records, budgets and reports, and their relation to the different sections and offices within the department.
Comprehensive cost unit, cost centre and cost element codes, together with a number of examples demonstrating the results of the survey and examples of the application and procedures of the suggested information system, are illustrated separately as appendices. The emphasis is on the information required for internal control by management personnel within the County Council.

Relevance:

30.00%

Publisher:

Abstract:

The research described here concerns the development of metrics and models to support the development of hybrid (conventional/knowledge-based) integrated systems. The thesis argues from the point that, although it is well known that estimating the cost, duration and quality of information systems is a difficult task, it is far from clear what sorts of tools and techniques would adequately support a project manager in the estimation of these properties. A literature review shows that metrics (measurements) and estimating tools have been developed for conventional systems since the 1960s, while there has been very little research on metrics for knowledge based systems (KBSs). Furthermore, although there are a number of theoretical problems with many of the 'classic' metrics developed for conventional systems, it also appears that the tools which such metrics can be used to develop are not widely used by project managers. A survey was carried out of large UK companies which confirmed this continuing state of affairs. Before any useful tools could be developed, therefore, it was important to find out why project managers were not already using these tools. By characterising those companies that use software cost estimating (SCE) tools against those which could but do not, it was possible to recognise the involvement of the client/customer in the process of estimation. Pursuing this point, a model of the early estimating and planning stages (the EEPS model) was developed to test exactly where estimating takes place. The EEPS model suggests that estimating could take place either before a fully developed plan has been produced, or while this plan is being produced. If it were the former, then SCE tools would be particularly useful, since there is very little other data available from which to produce an estimate.
A second survey, however, indicated that project managers see estimating as essentially the latter, at which point project management tools are available to support the process. It would seem, therefore, that SCE tools are not being used because project management tools are being used instead. The issue here is not with the method of developing an estimating model or tool, but with the way in which "an estimate" is intimately tied to an understanding of what tasks are being planned. Current SCE tools are perceived by project managers as targeting the wrong point of estimation. A model (called TABATHA) is then presented which describes how an estimating tool based on an analysis of tasks would fit into the planning stage. The issue of whether metrics can be usefully developed for hybrid systems (which also contain KBS components) is tested by extending a number of 'classic' program size and structure metrics to a KBS language, Prolog. Measurements of lines of code, Halstead's operators/operands, McCabe's cyclomatic complexity, Henry and Kafura's data flow fan-in/out, and post-release reported errors were taken for a set of 80 commercially developed LPA Prolog programs. By re-defining the metric counts for Prolog, it was found that estimates of program size and error-proneness comparable to the best conventional studies are possible. This suggests that metrics can be usefully applied to KBS languages such as Prolog, and thus that the development of metrics and models to support the development of hybrid information systems is both feasible and useful.
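The idea of re-defining size metrics for Prolog can be sketched with a toy counter that measures non-comment lines and clause counts (clauses being a more natural size unit for Prolog than raw lines). This is an illustration of the approach, not the thesis's actual metric suite.

```python
# Toy sketch of re-defining a size metric for Prolog source:
# count non-comment lines and (crudely) clauses, since clauses are
# a more meaningful unit for Prolog than raw lines of code.
# This illustrates the idea only; it is not the thesis's tooling.

def prolog_size_metrics(source):
    """Return (non-comment lines of code, approximate clause count)."""
    loc = 0
    for line in source.splitlines():
        stripped = line.strip()
        if stripped and not stripped.startswith("%"):
            loc += 1
    # crude clause count: full stops terminating a line
    clauses = sum(1 for line in source.splitlines()
                  if line.rstrip().endswith("."))
    return loc, clauses

sample = """% naive list length
len([], 0).
len([_|T], N) :-
    len(T, M),
    N is M + 1.
"""
print(prolog_size_metrics(sample))   # → (4, 2)
```

A real metric extractor would parse terms properly (full stops can appear inside atoms and comments), but even this crude count shows how "size" changes meaning when the counting unit moves from lines to clauses.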

Relevance:

30.00%

Publisher:

Abstract:

DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY AND INFORMATION SERVICES WITH PRIOR ARRANGEMENT

Relevance:

30.00%

Publisher:

Abstract:

DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY AND INFORMATION SERVICES WITH PRIOR ARRANGEMENT

Relevance:

30.00%

Publisher:

Abstract:

The non-preemptive two-machine flow-shop scheduling problem with uncertain processing times of n jobs is studied. In an uncertain version of a scheduling problem, there may not exist a unique schedule that remains optimal for all possible realizations of the job processing times. We find necessary and sufficient conditions (Theorem 1) for the existence of a dominant permutation that is optimal for all possible realizations of the job processing times. Our computational studies show the percentage of problems solvable under these conditions for randomly generated instances with n ≤ 100. We also show how to use additional information about the processing times of the completed jobs during optimal realization of a schedule (Theorems 2-4). Computational studies for randomly generated instances with n ≤ 50 show the percentage of two-machine flow-shop scheduling problems solvable under the sufficient conditions given in Theorems 2-4.
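When the processing times are known exactly, the underlying deterministic two-machine flow shop is solved by Johnson's rule: jobs with a ≤ b go first in increasing a, the rest go last in decreasing b. A minimal sketch with illustrative job times (not data from the paper):

```python
# Johnson's rule for the deterministic two-machine flow shop, the
# certain-data special case of the problem studied above. Job times
# below are illustrative.

def johnson_order(jobs):
    """jobs: list of (a, b) machine-1/machine-2 times.
    Returns a makespan-optimal processing order as job indices."""
    first = sorted((i for i, (a, b) in enumerate(jobs) if a <= b),
                   key=lambda i: jobs[i][0])
    last = sorted((i for i, (a, b) in enumerate(jobs) if a > b),
                  key=lambda i: jobs[i][1], reverse=True)
    return first + last

def makespan(jobs, order):
    """Completion time of the last job on machine 2."""
    t1 = t2 = 0
    for i in order:
        a, b = jobs[i]
        t1 += a                  # machine 1 finishes job i
        t2 = max(t2, t1) + b     # machine 2 starts when both are ready
    return t2

jobs = [(3, 6), (5, 2), (1, 2), (6, 6), (7, 5)]
order = johnson_order(jobs)
print(order, makespan(jobs, order))   # → [2, 0, 3, 4, 1] 24
```

Under uncertainty, Johnson's comparison may give different answers for different realizations of the times, which is exactly why a single dominant permutation need not exist.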

Relevance:

30.00%

Publisher:

Abstract:

* The research is supported partly by INTAS: 04-77-7173 project, http://www.intas.be

Relevance:

30.00%

Publisher:

Abstract:

This paper reviews the state of the art in measuring, modeling, and managing clogging in subsurface-flow treatment wetlands. Methods for measuring in situ hydraulic conductivity in treatment wetlands are now available, which provide valuable insight into assessing and evaluating the extent of clogging. These results, paired with the information from more traditional approaches (e.g., tracer testing and composition of the clog matter) are being incorporated into the latest treatment wetland models. Recent finite element analysis models can now simulate clogging development in subsurface-flow treatment wetlands with reasonable accuracy. Various management strategies have been developed to extend the life of clogged treatment wetlands, including gravel excavation and/or washing, chemical treatment, and application of earthworms. These strategies are compared and available cost information is reported. © 2012 Elsevier Ltd.

Relevance:

30.00%

Publisher:

Abstract:

The paper presents a new network-flow interpretation of Łukasiewicz's logic based on models with increased effectiveness. The obtained results show that the presented network-flow models may in principle work for multi-valued logics with more than three states of the variables, i.e. with a finite set of states in the interval from 0 to 1. The described models give the opportunity to formulate various logical functions. If the results from a given model, contained in the obtained values of the arc flow functions, are used as input data for other models, then other, more sophisticated logical structures can be interpreted successfully in Łukasiewicz's logic. The obtained models allow Łukasiewicz's logic to be studied with the specific, effective methods of network-flow programming. The specific properties and results pertaining to the traffic capacity of the network arcs can be used successfully. Based on the introduced network-flow approach it is possible to interpret other multi-valued logics, such as those of E. Post, L. Brauer, Kolmogorov, etc.
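For reference, the standard Łukasiewicz operations on truth values in [0, 1], which reduce to the classical three-valued tables on {0, 1/2, 1}, can be written directly. The network-flow encoding of these operations is the paper's contribution and is not reproduced here.

```python
# Łukasiewicz operations on truth values in [0, 1] (the logic the
# paper interprets via network flows). Restricting the values to
# {0, 0.5, 1} gives the classical three-valued truth tables.

def neg(x):        return 1 - x
def implies(x, y): return min(1, 1 - x + y)
def t_and(x, y):   return max(0, x + y - 1)     # strong conjunction
def t_or(x, y):    return min(1, x + y)         # strong disjunction

# classic three-valued example: U -> U is fully true in Lukasiewicz logic
U = 0.5
print(implies(U, U))          # → 1
print(t_and(U, U), t_or(U, U))
```

The property illustrated last (U → U = 1, unlike in Kleene's logic) is one reason Łukasiewicz's implication is the interesting operation to encode.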