231 results for crash modelling
Abstract:
Context: Learning can be regarded as knowledge construction, in which prior knowledge and experience serve as the basis for learners to expand their knowledge base. Such knowledge construction has to take place continuously in order to enhance learners' competence in a competitive working environment. As information consumers, individual users demand personalised information provision that meets their own specific purposes, goals, and expectations. Objectives: Current methods in requirements engineering are capable of modelling common user behaviour in the domain of knowledge construction. The users' requirements can be represented as a case in a defined structure, which can be reasoned over to enable requirements analysis. Such analysis needs to be enhanced so that personalised information provision can be tackled and modelled; however, there is a lack of suitable modelling methods to achieve this end. This paper presents a new ontological method for capturing an individual user's requirements and transforming them into personalised information provision specifications, so that the right information can be provided to the right user for the right purpose. Method: An experiment was conducted based on a qualitative method. A medium-sized group of users participated to validate the method and its techniques, i.e. articulate, map, configure, and learning content. The results were used as feedback for improvement. Result: The research has produced an ontology model with a set of techniques that support profiling users' requirements, reasoning over requirements patterns, generating workflows from norms, and formulating information provision specifications. Conclusion: Current requirements engineering (RE) approaches provide the methodical capability for developing solutions. Our research outcome, i.e. the ontology model with its techniques, can further enhance RE approaches for modelling individual users' needs and discovering users' requirements.
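As a rough sketch of the kind of function such a model supports, the fragment below pairs a requirements case (purpose, goals, prior knowledge) with a rule that formulates a provision specification. All names and the selection rule are invented for illustration; this is not the paper's ontology or its reasoning mechanism.

```python
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    """Hypothetical requirements case: purpose, goals and prior knowledge."""
    user_id: str
    purpose: str
    goals: list[str]
    known_topics: set[str] = field(default_factory=set)

def provision_spec(profile: UserProfile, catalogue: dict[str, set[str]]) -> list[str]:
    """Toy rule: select content covering goal topics the user does not yet know.

    `catalogue` maps a content item to the set of topics it covers.
    """
    wanted = set(profile.goals) - profile.known_topics
    return [item for item, topics in catalogue.items() if topics & wanted]

profile = UserProfile("u1", "exam revision", ["tidal dynamics"], {"basic hydrology"})
catalogue = {"intro-hydro": {"basic hydrology"}, "tides-101": {"tidal dynamics"}}
print(provision_spec(profile, catalogue))  # ['tides-101']
```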
Abstract:
The European research project TIDE (Tidal Inlets Dynamics and Environment) is developing and validating coupled models describing the morphological, biological and ecological evolution of tidal environments. The interactions between the physical and biological processes occurring in these regions require that the system be studied as a whole rather than as separate parts. Extensive use of remote sensing, including LiDAR, is being made to provide validation data for the modelling. This paper describes the different uses of LiDAR within the project and their relevance to the TIDE science objectives. LiDAR data have been acquired from three different environments: the Venice Lagoon in Italy, Morecambe Bay in England, and the Eden estuary in Scotland. LiDAR accuracy at each site has been evaluated using ground reference data acquired with differential GPS. A semi-automatic technique has been developed to extract tidal channel networks from LiDAR data, either used alone or fused with aerial photography. While the resulting networks may require some correction, the procedure does allow network extraction over large areas using objective criteria and reduces fieldwork requirements. The extracted networks may subsequently be used in geomorphological analyses, for example to describe the drainage patterns induced by networks and to examine the rate of change of networks. Estimation of the heights of the low and sparse vegetation on marshes is being investigated by analysis of the statistical distribution of the measured LiDAR heights. Species having different mean heights may be separated using the first-order moments of the height distribution.
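As a toy illustration of the moment-based separation mentioned above, the sketch below compares the mean and standard deviation of LiDAR heights for two simulated marsh patches; the vegetation types and height statistics are invented, not TIDE data.

```python
import numpy as np

def height_moments(heights: np.ndarray) -> tuple[float, float]:
    """First-order statistics of the LiDAR height distribution in one patch."""
    return float(np.mean(heights)), float(np.std(heights))

# Two hypothetical vegetation patches with distinct mean canopy heights.
rng = np.random.default_rng(0)
short_sward = rng.normal(0.3, 0.05, 500)  # metres; illustrative values only
tall_reeds = rng.normal(1.5, 0.20, 500)   # metres; illustrative values only
print(height_moments(short_sward), height_moments(tall_reeds))
```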
Abstract:
The role of convective processes in moistening the atmosphere during suppressed periods of a Madden-Julian oscillation is investigated in cloud-resolving model (CRM) simulations, and the impact of moistening on the subsequent evolution of convection is assessed as part of a Global Energy and Water Cycle Experiment Cloud System Study (GCSS) intercomparison project. The ability of single-column model (SCM) versions of a number of state-of-the-art climate and numerical weather prediction models to capture these convective processes is also evaluated. During the suppressed periods, the CRMs simulate a maximum moistening around 3 km, associated with a predominance of shallow convection. All SCMs produce adequate amounts of shallow convection during the suppressed periods, comparable to that seen in the CRMs, but the relatively drier SCMs have higher precipitation rates than the relatively wetter SCMs and the CRMs. The relatively drier SCMs dry, rather than moisten, the lower troposphere below the melting level. During the transition periods, convective processes act to moisten the atmosphere above the level at which mean advection changes from moistening to drying, despite an overall drying effect for the column. The SCMs capture some of this moistening at upper levels. A gradual transition from shallow to deep convection is simulated by the CRMs and the wetter SCMs during the transition periods, but the onset of deep convection is delayed in the drier SCMs. This results in lower precipitation rates for these SCMs during the active periods, although much better agreement exists between the models at this time.
Abstract:
Two ongoing projects at ESSC that involve the development of new techniques for extracting information from airborne LiDAR data and combining this information with environmental models will be discussed. The first project, in conjunction with Bristol University, aims to improve 2-D river flood flow models by using remote sensing to provide distributed data for model calibration and validation. Airborne LiDAR can provide such models with a dense and accurate floodplain topography, together with vegetation heights for parameterisation of model friction. The vegetation height data can be used to specify a friction factor at each node of a model's finite element mesh. A LiDAR range image segmenter has been developed which converts a LiDAR image into separate raster maps of surface topography and vegetation height for use in the model. Satellite and airborne SAR data have been used to measure flood extent remotely in order to validate the modelled flood extent. Methods have also been developed for improving the models by decomposing the model's finite element mesh to reflect floodplain features such as hedges and trees, which have different frictional properties to their surroundings. Originally developed for rural floodplains, the segmenter is currently being extended to provide DEMs and friction parameter maps for urban floods by fusing the LiDAR data with digital map data. The second project is concerned with the extraction of tidal channel networks from LiDAR. These networks are important features of the inter-tidal zone, and play a key role in tidal propagation and in the evolution of salt-marshes and tidal flats. The study of their morphology is currently an active area of research, and a number of theories related to networks have been developed which require validation using dense and extensive observations of network forms and cross-sections. The conventional method of measuring networks is cumbersome and subjective, involving manual digitisation of aerial photographs in conjunction with field measurement of channel depths and widths for selected parts of the network. A semi-automatic technique has been developed to extract networks from LiDAR data of the inter-tidal zone. A multi-level knowledge-based approach has been implemented, whereby low-level algorithms first extract channel fragments based mainly on image properties, then a high-level processing stage improves the network using domain knowledge. The approach adopted at the low level uses multi-scale edge detection to detect channel edges, then associates adjacent anti-parallel edges together to form channels. The higher-level processing includes a channel repair mechanism.
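A minimal sketch of the low-level stage only, assuming scikit-image is available: it takes the union of Canny edge maps over several scales on a synthetic surface standing in for real LiDAR. The anti-parallel edge association and the knowledge-based repair stage are omitted.

```python
import numpy as np
from skimage import feature

def multiscale_edges(dem: np.ndarray, sigmas=(1.0, 2.0, 4.0)) -> np.ndarray:
    """Union of Canny edge maps over several smoothing scales."""
    edges = np.zeros(dem.shape, dtype=bool)
    for sigma in sigmas:
        # Quantile thresholds keep the sketch robust to the DEM's units.
        edges |= feature.canny(dem, sigma=sigma,
                               low_threshold=0.6, high_threshold=0.9,
                               use_quantiles=True)
    return edges

# Synthetic surface with one incised channel running top to bottom.
x = np.linspace(-5.0, 5.0, 256)
dem = np.tile(-2.0 * np.exp(-x**2), (256, 1))  # 2 m deep channel cross-section
print(multiscale_edges(dem).sum(), "edge pixels found")
```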
Abstract:
Nitrogen oxide biogenic emissions from soils are driven by soil and environmental parameters, and the relationship between these parameters and NO fluxes is highly non-linear. A new algorithm, based on a neural network calculation, is used to reproduce the NO biogenic emissions linked to precipitation in the Sahel on 6 August 2006 during the AMMA campaign. This algorithm has been coupled into the surface scheme of a coupled chemistry-dynamics model (MesoNH Chemistry) to estimate the impact of the NO emissions on NOx and O3 formation in the lower troposphere for this particular episode. Four different simulations on the same domain and at the same period are compared: one with anthropogenic emissions only, one with soil NO emissions from a static inventory at low time and space resolution, one with NO emissions from the neural network, and one with NO from the neural network plus lightning NOx. The influence of NOx from lightning is limited to the upper troposphere. The NO emission from soils calculated with the neural network responds to changes in soil moisture, giving enhanced emissions over the wetted soil, as observed by aircraft measurements after the passage of a convective system. The subsequent enhancement of NOx and ozone is limited to the lowest layers of the atmosphere in the model, whereas measurements show higher concentrations above 1000 m. The neural network algorithm, applied in the Sahel region for one particular day of the wet season, allows an immediate response of fluxes to environmental parameters, unlike static emission inventories. Stewart et al. (2008) is a companion paper which looks at NOx and ozone concentrations in the boundary layer as measured on a research aircraft, examines how they vary with respect to soil moisture, as indicated by surface temperature anomalies, and deduces NOx fluxes. In the current paper the model-derived results are compared to the observations and calculated fluxes presented by Stewart et al. (2008).
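The sketch below shows the general idea of a neural-network emission algorithm responding to soil moisture; the predictors, the functional form of the training data, and the network size are all invented for illustration, not the AMMA parameterisation.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical training set: (soil moisture, soil temperature) -> NO flux.
rng = np.random.default_rng(1)
X = rng.uniform([0.0, 290.0], [0.4, 315.0], size=(500, 2))
y = 10.0 * X[:, 0] * np.exp((X[:, 1] - 290.0) / 25.0)  # invented response

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X, y)

# Unlike a static inventory, the fitted flux responds immediately to wetting.
dry, wet = [[0.05, 305.0]], [[0.30, 305.0]]
print(model.predict(dry), model.predict(wet))
```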
Abstract:
Discussion of the numerical modelling of NDT methods based on potential drop and on the disturbance of current flow lines, describing the nature, importance and application of such modelling. The first part is devoted to applications for inspection by eddy currents.
Abstract:
Inspection by direct-current potential drop. Inspection by alternating-current potential drop.
Abstract:
Eddy current testing by current deflection detects surface cracks and geometric features by sensing the re-routing of currents. Currents are diverted by cracks in two ways: down the walls, and along their length at the surface. Current deflection utilises the latter currents, detecting them via their tangential magnetic field. Results from 3-D finite element computer modelling, which show the two forms of deflection, are presented. Further results indicate that the current deflection technique is suitable for the detection of surface cracks in smooth materials with varying material properties.
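As a back-of-the-envelope illustration of why a current diverted along a crack produces a tangential magnetic field above the surface, the sketch below integrates the Biot-Savart law over a straight segment; this is a crude stand-in for the 3-D finite element modelling, with arbitrary geometry and current.

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability (H/m)

def b_field(start, end, current, point, n=1000):
    """Biot-Savart field of a straight current segment, by simple quadrature."""
    a, b = np.asarray(start, float), np.asarray(end, float)
    t = np.linspace(0.0, 1.0, n)
    sources = a + np.outer(t, b - a)   # points along the segment
    dl = np.tile((b - a) / n, (n, 1))  # current elements
    r = np.asarray(point, float) - sources
    dist = np.linalg.norm(r, axis=1, keepdims=True)
    return (MU0 * current / (4 * np.pi) * np.cross(dl, r) / dist**3).sum(axis=0)

# A 20 mm current path along the crack (x-axis), probed 2 mm above its centre.
B = b_field([-0.01, 0, 0], [0.01, 0, 0], current=1.0, point=[0, 0, 0.002])
print(B)  # only the y-component is significant: tangential to the surface
```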
Abstract:
Goal modelling is a well-known rigorous method for analysing problem rationale and developing requirements. Under the pressures typical of time-constrained projects, however, its benefits are not accessible, because of the effort and time needed to create the graph and because reading the results can be difficult owing to the effects of crosscutting concerns. Here we introduce an adaptation of KAOS to meet the needs of rapid turnaround and clarity. The main aim is to help stakeholders gain insight into the larger issues that might be overlooked if they start implementation prematurely. The method emphasises the use of obstacles, accepts under-refined goals, and has new methods for managing crosscutting concerns and strategic decision making. It is expected to be of value to agile as well as traditional processes.
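A lightweight sketch of the kind of artefact such a method might manipulate: a goal graph that records obstacles and deliberately under-refined goals, plus a walk that surfaces open issues for the stakeholders. The class names and example goals are invented here; this is not the KAOS metamodel.

```python
from dataclasses import dataclass, field

@dataclass
class Goal:
    """Node in a toy KAOS-style goal graph."""
    name: str
    refinements: list["Goal"] = field(default_factory=list)
    obstacles: list[str] = field(default_factory=list)
    under_refined: bool = False  # deliberately left coarse for now

def open_issues(goal: Goal, path=()) -> list[str]:
    """Report obstacles and under-refined goals anywhere in the graph."""
    here = "/".join(path + (goal.name,))
    issues = [f"{here}: obstacle '{o}'" for o in goal.obstacles]
    if goal.under_refined:
        issues.append(f"{here}: needs refinement")
    for sub in goal.refinements:
        issues.extend(open_issues(sub, path + (goal.name,)))
    return issues

root = Goal("Ship release", refinements=[
    Goal("Data migrated", obstacles=["legacy schema unknown"]),
    Goal("Users trained", under_refined=True),
])
print("\n".join(open_issues(root)))
```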
Abstract:
Uncertainties associated with the representation of various physical processes in global climate models (GCMs) mean that, when projections from GCMs are used in climate change impact studies, the uncertainty propagates through to the impact estimates. A complete treatment of this ‘climate model structural uncertainty’ is necessary so that decision-makers are presented with an uncertainty range around the impact estimates. This uncertainty is often underexplored owing to the human and computer processing time required to perform the numerous simulations. Here, we present a 189-member ensemble of global river runoff and water resource stress simulations that adequately addresses this uncertainty. Following several adaptations and modifications, the ensemble creation time has been reduced from 750 h on a typical single-processor personal computer to 9 h of high-throughput computing on the University of Reading Campus Grid. Here, we outline the changes that had to be made to the hydrological impacts model and to the Campus Grid, and present the main results. We show that, although there is considerable uncertainty in both the magnitude and the sign of regional runoff changes across different GCMs with climate change, there is much less uncertainty in runoff changes for regions that experience large runoff increases (e.g. the high northern latitudes and Central Asia) and large runoff decreases (e.g. the Mediterranean). Furthermore, there is consensus that the percentage of the global population at risk of water resource stress will increase with climate change.
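The speed-up is available because the 189 members are independent of one another and can be farmed out to idle machines. The sketch below shows that embarrassingly parallel structure using a local process pool; the 21-by-9 decomposition of the ensemble and the run_member stub are assumptions for illustration, not the actual Campus Grid workflow.

```python
from concurrent.futures import ProcessPoolExecutor
import itertools

def run_member(params):
    """Stub standing in for one hydrological-model run under one GCM pattern."""
    gcm, variant = params
    return gcm, variant, hash((gcm, variant)) % 100  # placeholder 'result'

# Assumed decomposition: 21 GCM patterns x 9 variants = 189 members.
members = list(itertools.product([f"gcm{i:02d}" for i in range(21)], range(9)))
assert len(members) == 189

if __name__ == "__main__":
    # Wall-clock time scales roughly with members / concurrent workers.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(run_member, members))
    print(len(results), "members completed")
```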
Abstract:
Northern hemisphere snow water equivalent (SWE) distributions from remote sensing (SSM/I), the ERA40 reanalysis product and the HadCM3 general circulation model are compared. Large differences are seen in the February climatologies, particularly over Siberia. The SSM/I retrieval algorithm may be overestimating SWE in this region, while comparison with independent runoff estimates suggests that HadCM3 is underestimating SWE. Treatment of snow grain size and vegetation parameterizations are concerns with the remotely sensed data. For this reason, ERA40 is used as 'truth' for the following experiments. Despite the climatology differences, HadCM3 is able to reproduce the distribution of ERA40 SWE anomalies when assimilating ERA40 anomaly fields of temperature, sea level pressure, atmospheric winds, and ocean temperature and salinity. However, when forecasts are released from these assimilated initial states, the SWE anomaly distribution diverges rapidly from that of ERA40. No predictability is seen from one season to another. Strong links between European SWE distribution and the North Atlantic Oscillation (NAO) are seen, but forecasts of this index by the assimilation scheme are poor. Longer-term relationships between SWE and the NAO, and between SWE and the El Niño-Southern Oscillation (ENSO), are also investigated in a multi-century run of HadCM3. SWE is impacted by ENSO in the Himalayas and North America, while the NAO affects SWE in North America and Europe. While significant connections with the NAO index were only present in DJF (and to an extent SON), the link between ENSO and February SWE distribution was seen to exist from the previous JJA ENSO index onwards. This represents a long lead time for SWE prediction for hydrological applications such as flood and wildfire forecasting. Further work is required to develop reliable large-scale observation-based SWE datasets with which to test these model-derived connections.
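A minimal sketch of the kind of lagged-correlation diagnostic behind such lead-time statements, run here on synthetic seasonal series rather than HadCM3 output; the lag-2 dependence is built into the fake data purely so the diagnostic has something to find.

```python
import numpy as np

def lagged_corr(index: np.ndarray, series: np.ndarray, lag: int) -> float:
    """Correlate a climate index with a field value `lag` seasons later."""
    if lag > 0:
        index, series = index[:-lag], series[lag:]
    return float(np.corrcoef(index, series)[0, 1])

# Synthetic multi-century seasonal series: SWE lags the index by two seasons.
rng = np.random.default_rng(2)
nino = rng.standard_normal(1200)
swe = 0.4 * np.roll(nino, 2) + rng.standard_normal(1200)

for lag in range(4):
    print(lag, round(lagged_corr(nino, swe, lag), 2))
```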