901 results for source code analysis
Abstract:
The liquid metal flow in induction crucible models is known to be unstable, turbulent and difficult to predict in the regime of medium frequencies, when the electromagnetic skin layer is of considerable extent. We present long-term turbulent flow measurements made with a potential-difference velocity probe incorporating a permanent magnet, in a cylindrical container filled with the eutectic melt In-Ga-Sn. A parallel numerical simulation of the long-time-scale development of the turbulent average flow is presented. The numerical flow model uses an implicit pseudo-spectral code and a k-ω turbulence model recently developed for transitional flow modelling. The results compare reasonably well with the experiment and demonstrate the time development of the turbulent flow field and the turbulence energy.
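For reference, the transport equations of the baseline Wilcox k-ω model are sketched below; the transitional variant used in the paper will differ in its closure details, which the abstract does not give.

\[
\nu_t = \frac{k}{\omega}, \qquad
\frac{Dk}{Dt} = P_k - \beta^* k\omega + \frac{\partial}{\partial x_j}\!\left[(\nu + \sigma^*\nu_t)\frac{\partial k}{\partial x_j}\right], \qquad
\frac{D\omega}{Dt} = \alpha\frac{\omega}{k}P_k - \beta\omega^2 + \frac{\partial}{\partial x_j}\!\left[(\nu + \sigma\nu_t)\frac{\partial \omega}{\partial x_j}\right],
\]

where \(P_k\) is the turbulence production and, in the 1988 Wilcox form, \(\alpha = 5/9\), \(\beta = 3/40\), \(\beta^* = 9/100\) and \(\sigma = \sigma^* = 1/2\).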
Abstract:
SMARTFIRE, an open-architecture integrated CFD code and knowledge-based system, attempts to make fire field modelling accessible to non-experts in Computational Fluid Dynamics (CFD), such as firefighters, architects and fire safety engineers. This is achieved by embedding expert knowledge in the CFD software, enabling the 'black art' of CFD analysis, such as the selection of solvers, relaxation parameters, convergence criteria, time steps, and grid and boundary condition specification, to be guided by expert advice from the software. The user is, however, given the option of overriding these decisions, thus retaining ultimate control. SMARTFIRE also makes use of recent developments in CFD technology, such as unstructured meshes and group solvers, to make the CFD analysis more efficient. This paper describes the incorporation within SMARTFIRE of the expert fire modelling knowledge required for automatic problem setup and mesh generation, as well as the concept and use of group solvers for automatic and manual dynamic control of the CFD code.
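A minimal sketch of what "embedding expert knowledge" into solver setup can look like; SMARTFIRE's actual knowledge base and interfaces are not described in the abstract, so every name and threshold below is hypothetical.

```c
/* Hypothetical rule: pick conservative settings for large or badly
   shaped meshes, faster ones otherwise; the user may override. */
#include <stdio.h>

typedef struct {
    int    cell_count;        /* mesh size of the fire scenario      */
    int    transient;         /* 1 = time-dependent simulation       */
    double max_aspect_ratio;  /* worst cell aspect ratio in the mesh */
} CaseFeatures;

typedef struct {
    const char *pressure_solver;
    double      relaxation;   /* under-relaxation factor             */
    double      time_step;    /* seconds; unused for steady cases    */
} SolverSettings;

SolverSettings advise(CaseFeatures f) {
    SolverSettings s;
    if (f.cell_count > 500000 || f.max_aspect_ratio > 50.0) {
        s.pressure_solver = "multigrid";
        s.relaxation = 0.3;   /* cautious under-relaxation */
    } else {
        s.pressure_solver = "conjugate-gradient";
        s.relaxation = 0.7;
    }
    s.time_step = f.transient ? 0.1 : 0.0;
    return s;                 /* user retains ultimate control */
}

int main(void) {
    CaseFeatures f = { 750000, 1, 12.0 };
    SolverSettings s = advise(f);
    printf("solver=%s relax=%.1f dt=%.1fs\n",
           s.pressure_solver, s.relaxation, s.time_step);
    return 0;
}
```

The point of such a rule table is exactly what the abstract describes: the 'black art' choices are made by encoded expertise, while each field stays editable by the user.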
Abstract:
A cell-centred finite volume (CC-FV) solid mechanics formulation, based on a computational fluid dynamics (CFD) procedure, is presented. A CFD code is modified such that the velocity variable is replaced by the displacement variable, with the displacement and pressure fields treated as the unknown variables. The results are validated against finite element (FE) and cell-vertex finite volume (CV-FV) predictions based on discretisation of the equilibrium equations. The developed formulation is applicable to both compressible and incompressible solid behaviour. The method is general and can be extended to the simultaneous analysis of problems involving flow, thermal and stress effects.
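As a sketch of the mixed displacement-pressure system such a formulation typically solves (the abstract does not state the paper's exact equations), the equilibrium and constitutive relations can be written

\[
\nabla \cdot \boldsymbol{\sigma} + \mathbf{b} = \mathbf{0}, \qquad
\boldsymbol{\sigma} = 2G\,\operatorname{dev}\boldsymbol{\varepsilon}(\mathbf{u}) - p\,\mathbf{I}, \qquad
\frac{p}{K} + \nabla \cdot \mathbf{u} = 0,
\]

where \(\boldsymbol{\varepsilon}(\mathbf{u}) = \tfrac{1}{2}(\nabla\mathbf{u} + \nabla\mathbf{u}^{\mathsf T})\), \(G\) is the shear modulus and \(K\) the bulk modulus. In the incompressible limit \(K \to \infty\) the last relation reduces to \(\nabla \cdot \mathbf{u} = 0\), which is why the same pressure treatment covers both compressible and incompressible behaviour.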
Resumo:
The Aircraft Accident Statistics and Knowledge (AASK) database is a repository of survivor accounts from aviation accidents. Its main purpose is to store observational and anecdotal data from the actual interviews of the occupants involved in aircraft accidents. The database has wide application to aviation safety analysis, being a source of factual data regarding the evacuation process. It is also key to the development of aircraft evacuation models such as airEXODUS, where insight into how people actually behave during evacuation from survivable aircraft crashes is required. This paper describes recent developments with the database leading to the development of AASK v3.0. These include significantly increasing the number of passenger accounts in the database, the introduction of cabin crew accounts, the introduction of fatality information, improved functionality through the seat plan viewer utility and improved ease of access to the database via the internet. In addition, the paper demonstrates the use of the database by investigating a number of important issues associated with aircraft evacuation. These include issues associated with social bonding and evacuation, the relationship between the number of crew and evacuation efficiency, frequency of exit/slide failures in accidents and exploring possible relationships between seating location and chances of survival. Finally, the passenger behavioural trends described in analysis undertaken with the earlier database are confirmed with the wider data set.
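A hypothetical sketch of one occupant-account record in such a database; AASK's real schema is not described in the abstract, so every field name below is assumed for illustration.

```c
/* Illustrative record layout covering the v3.0 additions the abstract
   mentions: crew accounts, fatality information, seat plan data. */
#include <stdio.h>

typedef struct {
    int  accident_id;            /* links the account to one accident    */
    int  is_crew;                /* 1 = cabin crew account               */
    int  survived;               /* fatality information                 */
    int  seat_row;               /* feeds the seat plan viewer utility   */
    char seat_letter;
    int  exit_used;              /* -1 where the account does not say    */
    int  travelled_with_family;  /* supports social-bonding analyses     */
} OccupantAccount;

int main(void) {
    OccupantAccount a = { 17, 0, 1, 23, 'C', 2, 1 };
    printf("accident %d, seat %d%c, survived=%d\n",
           a.accident_id, a.seat_row, a.seat_letter, a.survived);
    return 0;
}
```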
Abstract:
This chapter discusses a code parallelization environment providing tools that address the main tasks of code parallelization, debugging and optimization. The parallelization tools include ParaWise and CAPO, which enable the near-automatic parallelization of real-world scientific application codes for shared- and distributed-memory parallel systems. The chapter discusses the use of ParaWise and CAPO to transform an original serial code into an equivalent parallel code containing appropriate OpenMP directives. Additionally, since user involvement can introduce errors, a relative debugging tool (P2d2) is also available and can be used to perform near-automatic relative debugging of an OpenMP program that has been parallelized either with the tools or manually. For these tools to be effective across a range of applications, a high-quality, fully interprocedural dependence analysis, together with user interaction, is vital both for generating efficient parallel code and for optimizing the backtracking and speculation process used in relative debugging. Results for parallelized NASA codes are discussed and show the benefits of using the environment.
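The kind of transformation ParaWise/CAPO performs, illustrated on a toy loop (the NASA codes themselves are not shown in the chapter, and this example is assumed; shown in C for brevity):

```c
/* Each iteration writes a distinct a[i] and reads only b, so the
   dependence analysis finds no loop-carried dependence and the tool
   can insert the OpenMP directive automatically. */
#include <stdio.h>

void smooth(double *a, const double *b, int n) {
    #pragma omp parallel for
    for (int i = 1; i < n - 1; i++)
        a[i] = 0.5 * (b[i - 1] + b[i + 1]);
}

int main(void) {
    double a[8] = {0}, b[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    smooth(a, b, 8);
    printf("%.1f\n", a[3]);  /* 0.5 * (b[2] + b[4]) = 4.0 */
    return 0;
}
```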
Abstract:
Despite the apparent simplicity of the OpenMP directive-based shared-memory programming model and the sophisticated dependence analysis and code generation capabilities of the ParaWise/CAPO tools, experience shows that a level of expertise is required to produce efficient parallel code. In a real-world application, the investigation of a single loop in a generated parallel code can soon become an in-depth inspection of numerous dependencies across many routines. An additional understanding of those dependencies is also needed to effectively interpret the information provided and to supply the required feedback. The ParaWise Expert Assistant has been developed to automate this investigation and to present questions to the user about, and in the context of, their application code. In this paper, we demonstrate that knowledge of dependence information and of OpenMP is no longer essential to produce efficient parallel code with the Expert Assistant. It is hoped that this will enable a far wider audience to use the tools and, subsequently, to exploit the benefits of large parallel systems.
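An illustrative example (assumed, not taken from the paper) of a loop whose parallelism no static analysis can decide, and about which an Expert Assistant style tool must therefore question the user rather than show raw dependence lists:

```c
/* Safe as an OpenMP parallel loop only if idx[] holds no duplicate
   entries -- an application-level fact the tool cannot prove, so it
   asks the user in terms of their own code. */
void scatter_add(double *a, const double *v, const int *idx, int n) {
    /* If the user confirms all idx[i] are distinct, the tool can place
       '#pragma omp parallel for' here; otherwise the updates to
       a[idx[i]] may race and the loop must stay serial (or use atomics). */
    for (int i = 0; i < n; i++)
        a[idx[i]] += v[i];
}
```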
Abstract:
Problems in preserving the quality of granular material products are complex and arise from a series of sources during transport and storage. Whether designing a new plant or, more likely, analysing problems that give rise to product quality degradation in existing operations, the process engineer needs practical measurement and simulation tools both to identify the source of such problems and then to design them out. As part of a major research programme on quality in particulate manufacturing, computational models have been developed for segregation in silos, degradation in pneumatic conveyors, and the development of caking during storage, using micro-mechanical relationships where possible to characterize the behaviour of granular materials. The objective of the work presented here is to demonstrate the use of these unit-process models in the analysis of large-scale processes involving the handling of granular materials. This paper presents a set of simulations of a complete large-scale granular materials handling operation, involving the discharge of material from a silo, its transport through a dilute-phase pneumatic conveyor, and its storage in a big bag under varying environmental temperature and humidity conditions. Conclusions are drawn on the capability of the computational models to represent key granular processes, including particle size segregation, degradation, and moisture-migration caking.
Abstract:
An aerodynamic sound source extraction from a general flow field is applied to a number of model problems and to a problem of engineering interest. The extraction technique is based on a variable decomposition, which leads to an acoustic correction method, of each flow variable into a dominant flow component and a perturbation component. The dominant flow component is obtained with a general-purpose Computational Fluid Dynamics (CFD) code that uses a cell-centred finite volume method to solve the Reynolds-averaged Navier–Stokes equations. The perturbations are calculated from a set of acoustic perturbation equations with source terms extracted from the unsteady CFD solution at each time step, using a staggered dispersion-relation-preserving (DRP) finite-difference scheme. Numerical experiments include (1) propagation of a 1-D acoustic pulse without mean flow, (2) propagation of a 2-D acoustic pulse with and without mean flow, (3) reflection of an acoustic pulse from a flat plate with mean flow, and (4) flow-induced noise generated by an unsteady laminar flow past a 2-D cavity. The computational results demonstrate the accuracy of the source extraction technique for the model problems and illustrate its feasibility for more complex aeroacoustic problems.
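Schematically (the abstract names but does not state the governing equations), the decomposition and the resulting perturbation system take the form

\[
\mathbf{q}(\mathbf{x},t) = \bar{\mathbf{q}}(\mathbf{x},t) + \mathbf{q}'(\mathbf{x},t), \qquad
\frac{\partial \mathbf{q}'}{\partial t} + \mathcal{L}\!\left(\bar{\mathbf{q}}\right)\mathbf{q}' = \mathbf{S}\!\left(\bar{\mathbf{q}}\right),
\]

where \(\bar{\mathbf{q}}\) is the dominant flow from the RANS solution, \(\mathcal{L}\) is a linear acoustic operator about that flow, and \(\mathbf{S}\) collects the source terms extracted from the unsteady CFD solution at each time step.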
Abstract:
We explore the potential application of a cognitive interrogator network (CIN) to the remote monitoring of mobile subjects in domestic environments, where the ultra-wideband radio frequency identification (UWB-RFID) technique is considered for accurate source localization. We first present the CIN architecture, in which the central base station (BS) continuously and intelligently customizes the illumination modes of the distributed transceivers in response to the system's changing knowledge of the channel conditions and subject movements. Subsequently, analytical results for the locating probability and the time-of-arrival (TOA) estimation uncertainty of a large-scale CIN with randomly distributed interrogators are derived based upon the implemented cognitive intelligence. Finally, numerical examples are used to demonstrate the key effects of the proposed cognitions on the system performance.
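For context, TOA estimation uncertainty is commonly characterised by the Cramér-Rao lower bound for a signal of effective bandwidth \(\beta\) received at a given SNR (the abstract does not state which bound the paper derives):

\[
\operatorname{var}(\hat{\tau}) \;\geq\; \frac{1}{8\pi^2 \beta^2\,\mathrm{SNR}}, \qquad
\beta^2 = \frac{\int f^2 |S(f)|^2\,df}{\int |S(f)|^2\,df},
\]

where \(S(f)\) is the signal spectrum. The very large bandwidth of UWB signals makes \(\beta\) large, which is what gives UWB-RFID its fine ranging accuracy.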
Abstract:
During the 1970s and 1980s, the late Dr Norman Holme undertook extensive towed sledge surveys in the English Channel and some in the Irish Sea. Only a minority of the resulting images were analysed and reported before his death in 1989, but the logbooks, video and film material have been archived in the National Marine Biological Library (NMBL) in Plymouth. A scoping study was therefore commissioned by the Joint Nature Conservation Committee, as part of the Mapping European Seabed Habitats (MESH) project, to identify the value of the archived material and the procedure and cost of undertaking further work. The results of the scoping study are:
1. NMBL archives hold 106 videotapes (reel-to-reel Sony HD format) and 59 video cassettes (including 15 from the Irish Sea) in VHS format, together with 90 rolls of 35 mm colour transparency film (various lengths, up to about 240 frames per film). These are stored in the Archive Room, either in a storage cabinet or in the original film canisters.
2. The reel-to-reel material is extensive and had already been selectively copied to VHS cassettes. The cost of transferring it to an accepted 'long-life' medium (Betamax) would be approximately £15,000. It was not possible to view the tapes, as a suitable machine was not located. The value of the tapes is uncertain, but they are likely to become beyond salvation within one to two years.
3. The video cassette material is in good condition and is expected to remain so for at least several more years. The images viewed were generally of poor quality, and the speed of tow often makes the pictures blurred. No immediate action is required.
4. The colour transparency films are in good condition and the images are very clear. They provide the best source of information for mapping seabed biotopes. They should be scanned to digital format, but inexpensive fast copying is problematic because there are no between-frame breaks between images and scanning machines need to centre each image based on between-frame breaks. The minimum cost to scan all of the images commercially is approximately £6,000 and could be as much as £40,000 on some quotations. There is a further cost in coding and databasing each image and, all in all, it would seem most economic to purchase a 'continuous film' scanner and undertake the work in-house.
5. Positional information in the ships' logs has been matched to the films and video tapes. Decca Chain co-ordinates recorded in the logbooks have been converted to latitude and longitude (degrees, minutes and seconds), and a further routine was developed to convert these to the decimal degrees required for GIS mapping (a minimal sketch of this conversion follows the list). However, it is unclear whether corrections to the Decca positions were applied at the time each position was noted. Tow tracks have been mapped onto an electronic copy of a Hydrographic Office chart.
6. The positions of the start and end of each tow were entered into a spreadsheet so that they can be displayed on GIS or on a Hydrographic Office chart backdrop. The cost of the Hydrographic Office chart backdrop at a scale of 1:75,000 for the whole area was £458 incl. VAT.
7. Viewing all of the video cassettes to note habitats and biological communities, even by an experienced marine biologist, would take at least in the order of 200 hours and is not recommended.
8. Once the colour transparencies are scanned and indexed, viewing them to identify seabed habitats and biological communities would probably take about 100 hours for an experienced marine biologist and is recommended.
9. It is expected that identifying biotopes along approximately 1 km lengths of each tow would be feasible, although uncertainties about Decca co-ordinate corrections and the exact positions of images most likely give a ±250 m position error. More work to locate each image accurately and to settle the Decca correction question would improve the accuracy of image location.
10. Using the codings produced by Holme to identify different seabed types, and some viewing of the video and transparency material, 10 biotopes have been identified, although more would be added as a result of full analysis.
11. Using the data available from the Holme archive, it is possible to populate various fields within the Marine Recorder database. The overall 'survey' will be 'English Channel towed video sled survey'. The 'events' become the 104 tows. Each tow could be described as four samples, i.e. the start and end of the tow and two areas in the middle, to give examples along the length of the tow. These samples would have their own latitude/longitude co-ordinates. The four samples would link to a GIS map.
12. Stills and video clips, together with text information, could be incorporated into a multimedia presentation to demonstrate the range of level seabed types found along a part of the northern English Channel. More recent images, taken during SCUBA diving of reef habitats in the same area as the towed sledge surveys, could be added to the Holme images.
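A minimal sketch, in C, of the degrees/minutes/seconds to decimal degrees step described in point 5; the archive's actual conversion routine and the unresolved Decca-correction question are separate matters.

```c
/* Convert degrees, minutes and seconds to the decimal degrees
   required for a GIS layer: dd = d + m/60 + s/3600. */
#include <stdio.h>

double dms_to_decimal(int deg, int min, double sec) {
    double dd = (double)deg + min / 60.0 + sec / 3600.0; /* magnitude */
    return dd; /* negate for southern latitudes / western longitudes */
}

int main(void) {
    /* e.g. 50 deg 14' 30" N -> 50.2417 */
    printf("%.4f\n", dms_to_decimal(50, 14, 30.0));
    return 0;
}
```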
Abstract:
Coccolithophores are the largest source of calcium carbonate in the oceans and are considered to play an important role in oceanic carbon cycles. Current methods to detect the presence of coccolithophore blooms from Earth observation data often produce high numbers of false positives in shelf seas and coastal zones, owing to the spectral similarity between coccolithophores and other suspended particulates; they are therefore unable to characterise bloom events in these waters, despite the importance of these phytoplankton in the global carbon cycle. A novel approach to detecting coccolithophore blooms from Earth observation data is presented. The method builds upon previous optical work and uses a statistical framework to combine spectral, spatial and temporal information into maps of coccolithophore bloom extent. Validation and verification results for an area of the north-east Atlantic are presented using an in situ database (N = 432) and all available SeaWiFS data for 2003 and 2004. Verification shows that the approach produces a seasonal signal consistent with biological studies of these phytoplankton. Validation against the in situ coccolithophore cell count database shows a high correct recognition rate of 80% and a low false-positive rate of 0.14 (in comparison to 63% and 0.34 respectively for the established, purely spectral approach). To guide its broader use, a full sensitivity analysis of the algorithm parameters is presented.
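The abstract does not specify the statistical framework; one plausible form, under a naive conditional-independence assumption across the three cues, would combine them as

\[
P(\text{bloom} \mid s, x, t) \;\propto\; P(s \mid \text{bloom})\, P(x \mid \text{bloom})\, P(t \mid \text{bloom})\, P(\text{bloom}),
\]

where \(s\), \(x\) and \(t\) denote the spectral, spatial and temporal evidence respectively; thresholding this posterior per pixel would yield a bloom-extent map.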
Abstract:
Carbon and nitrogen stable isotope ratios of amino acids (δ13C_AA and δ15N_AA) have recently been used to unravel trophic relationships in aquatic and terrestrial environments, but none of these studies has addressed the specific case of a symbiotic relationship. Here we use the stable isotope ratios of amino acids (AAs) to investigate the link between a scarab larva (Pericoptus truncatus) and its mite guest (Mumulaelaps, Mesostigmata: Laelapidae: Hypoaspidini). Five scenarios for the relationship between larva and mite were proposed, and the δ13C_AA and δ15N_AA data and patterns were used to eliminate those that were inconsistent. The calculated gap of two trophic levels ruled out a parasitic trophic relationship. The relationship between P. truncatus and its mite guest was shown most likely to be commensal, with the mites feeding on the larva's castings. Alongside this study, a comparison with the bulk stable isotope analysis method demonstrated that the AA method brings a significant refinement to the results, providing a means of determining absolute trophic level without the need for prior knowledge of the isotopic composition of the primary source material.
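For context, one widely used formulation of the absolute trophic level calculation from amino acid δ15N values (the abstract does not say which equation was applied) contrasts glutamic acid as a 'trophic' AA with phenylalanine as a 'source' AA:

\[
\mathrm{TP} = \frac{\delta^{15}\mathrm{N}_{\mathrm{Glu}} - \delta^{15}\mathrm{N}_{\mathrm{Phe}} - \beta}{\mathrm{TDF}} + 1,
\]

with \(\beta \approx 3.4\)‰ (the Glu-Phe offset in primary producers) and \(\mathrm{TDF} \approx 7.6\)‰ (the per-level trophic discrimination factor). Because both values come from the same organism, no separate baseline sample of the primary source material is needed, which is the refinement the comparison highlights.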
Abstract:
Aims/Hypothesis: To describe the epidemiology of childhood-onset Type 1 (insulin-dependent) diabetes in Europe, the EURODIAB collaborative group established prospective, geographically defined registers of children diagnosed under 15 years of age. A total of 16,362 cases were registered by 44 centres during the period 1989-1994. The registers cover a population of approximately 28 million children, with most European countries represented. Methods: In most centres a primary and a secondary source of ascertainment were used, so that the completeness of registration could be assessed by the capture-recapture method. Ecological correlation and regression analyses were used to study the relationship between incidence and various environmental, health and economic indicators. Findings: The standardised average annual incidence rate during the period 1989-94 ranged from 3.2 cases per 100,000 per annum in the Former Yugoslav Republic of Macedonia to 40.2 cases per 100,000 per annum in Finland. Indicators of national prosperity such as infant mortality (r = -0.64) and gross domestic product (r = 0.58) were most strongly and significantly correlated with incidence rate, and previously reported associations with coffee consumption (r = 0.51), milk consumption (r = 0.58) and latitude (r = 0.40) were also observed. Conclusion/Interpretation: The wide variation in childhood Type 1 diabetes incidence rates within Europe can be partially explained by indicators of national prosperity. These indicators may reflect differences in environmental risk factors, such as nutrition or lifestyle, that are important in determining a country's incidence rate.
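In its simplest two-source (Lincoln-Petersen) form, the capture-recapture assessment mentioned in the Methods works as follows: with \(n_1\) cases found by the primary source, \(n_2\) by the secondary source, and \(m\) found by both,

\[
\hat{N} = \frac{n_1 n_2}{m}, \qquad
\text{completeness} = \frac{n_1 + n_2 - m}{\hat{N}},
\]

so a register that captured 950 of an estimated 1,000 cases would report 95% ascertainment. The abstract does not state which capture-recapture variant each centre used.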
Abstract:
This paper proposes a modification to the ACI 318-02 equivalent frame method for the analysis of reinforced concrete flat plate exterior panels. Two existing code methods were examined: ACI 318 and BS 8110. The derivation of the torsional stiffness of the edge strip as proposed by ACI 318 is examined, and a more accurate estimate of this value is proposed, based on both theoretical analysis and experimental results. A series of 1/3-scale models of flat plate exterior panels was tested. Unique experimental results were obtained by measuring strains in the reinforcing bars at approximately 200 selected locations in the plate panel throughout the entire loading history. The measured strains were used to calculate curvatures and, hence, bending moments; these were used, along with the moments in the columns, to assess the accuracy of the equivalent frame methods. The proposed method leads to a more accurate prediction of the moments in the plate at the column front face, at the panel midspan, and in the edge column.
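For reference, the ACI 318 equivalent frame method computes the stiffness of the torsional (edge) member as below; the paper's modification adjusts this estimate, but the revised expression is not given in the abstract.

\[
K_t = \sum \frac{9\,E_{cs}\,C}{\ell_2 \left(1 - c_2/\ell_2\right)^3}, \qquad
C = \sum \left(1 - 0.63\,\frac{x}{y}\right) \frac{x^3 y}{3},
\]

where \(E_{cs}\) is the modulus of elasticity of the slab concrete, \(\ell_2\) the transverse span, \(c_2\) the column dimension transverse to the span, and \(x\), \(y\) the shorter and longer sides of the rectangles making up the torsional member's cross-section.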