69 results for Virtual and remote laboratories
Abstract:
Reaction Injection Moulding (RIM) is a moulding technology used for the production of large and complex plastic parts. The RIM process is characterized essentially by the injection of a highly reactive chemical system (usually polyurethane) into a properly closed, thermally controlled mould, followed by a fast cure. Several studies show that rapidly manufactured moulds made from epoxy resins for Thermoplastic Injection Moulding (TIM) affect the moulding process and the final properties of the parts: cycle times increase and the mechanical properties of the final parts are reduced, owing to the low thermal conductivity of the epoxy materials. In contrast, the low conductivity of the materials usually applied for the rapid manufacturing of RIM moulds increases the mechanical properties of the final injected parts and reduces the cycle time. This study examines the effect of the rapidly manufactured mould material on the RIM process. Several materials were tested for the rapid manufacturing of RIM moulds, and analysis of both the temperature profiles of moulded parts during injection and the cure data obtained experimentally in a mixing and reaction cell makes it possible to determine and model the real effect of the mould material on the RIM process.
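A quick way to see why the mould material matters is the classical contact-temperature formula for two semi-infinite bodies: the interface temperature is weighted by the thermal effusivities e = sqrt(k·rho·cp) of melt and mould. The sketch below is illustrative only; the material property values are typical textbook figures, not data from the paper.

```python
# Illustrative sketch (not the paper's model): the semi-infinite
# contact-temperature formula shows how mould effusivity sets the
# part/mould interface temperature in RIM. Property values are
# typical textbook figures used as placeholders.
import math

def effusivity(k, rho, cp):
    """Thermal effusivity e = sqrt(k * rho * cp) [W s^0.5 / (m^2 K)]."""
    return math.sqrt(k * rho * cp)

def contact_temperature(T_melt, e_melt, T_mould, e_mould):
    """Interface temperature of two semi-infinite bodies in perfect contact."""
    return (e_melt * T_melt + e_mould * T_mould) / (e_melt + e_mould)

# Reacting polyurethane system injected at 40 C into a 60 C mould.
e_pu = effusivity(k=0.25, rho=1100.0, cp=1800.0)     # polyurethane (typical)
moulds = {
    "epoxy":     effusivity(k=0.35, rho=1200.0, cp=1100.0),
    "aluminium": effusivity(k=200.0, rho=2700.0, cp=900.0),
}
for name, e_m in moulds.items():
    Ti = contact_temperature(40.0, e_pu, 60.0, e_m)
    print(f"{name:10s} mould -> interface temperature ~ {Ti:.1f} C")
```

The low-effusivity epoxy mould barely shifts the interface away from the temperature of the reacting mixture, so the heat released by the exothermic cure stays in the part; the metal mould imposes its own temperature almost completely and exchanges heat far faster.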
Abstract:
Currently, many ontologies are available for addressing different domains. However, it is not always possible to deploy such ontologies to support collaborative working, so their full potential to implement intelligent cooperative applications capable of reasoning over a network of context-specific ontologies goes unexploited. The main problem arises from the fact that, at present, ontologies are created in isolation to address specific needs. However, we foresee the need for a network of ontologies that will support the next generation of intelligent applications and devices and the vision of Ambient Intelligence. The main objective of this paper is to motivate the design of a networked ontology (meta)model which formalises ways of connecting available ontologies so that they are easy to search, to characterise and to maintain. The aim is to make explicit the virtual and implicit network of ontologies serving the Semantic Web.
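A purely hypothetical sketch of the kind of structure such a meta-model might record: ontologies as nodes and typed relations between them as edges. All names and relation types here are illustrative inventions, not taken from the paper.

```python
# Hypothetical sketch of a networked-ontology meta-model: a typed graph
# over existing ontologies. Relation names are illustrative only.
from dataclasses import dataclass, field

@dataclass
class Ontology:
    uri: str
    domain: str

@dataclass
class OntologyNetwork:
    nodes: dict = field(default_factory=dict)   # uri -> Ontology
    edges: list = field(default_factory=list)   # (src, relation, dst)

    def add(self, onto):
        self.nodes[onto.uri] = onto

    def relate(self, src_uri, relation, dst_uri):
        assert relation in {"imports", "mapsTo", "extends"}
        self.edges.append((src_uri, relation, dst_uri))

    def related_to(self, uri):
        """Search: every ontology directly connected to `uri`."""
        return [(r, d) for s, r, d in self.edges if s == uri] + \
               [(r, s) for s, r, d in self.edges if d == uri]

net = OntologyNetwork()
net.add(Ontology("http://example.org/home", "smart-home devices"))
net.add(Ontology("http://example.org/health", "ambient health care"))
net.relate("http://example.org/health", "imports", "http://example.org/home")
print(net.related_to("http://example.org/home"))
```

Making these links explicit is what allows a network of ontologies to be searched, characterised and maintained as a whole rather than as isolated artefacts.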
Abstract:
Flooding is a major hazard in both rural and urban areas worldwide, but it is in urban areas that the impacts are most severe. An investigation of the ability of high-resolution TerraSAR-X Synthetic Aperture Radar (SAR) data to detect flooded regions in urban areas is described. The study uses a TerraSAR-X image of a 1-in-150-year flood near Tewkesbury, UK, in 2007, for which contemporaneous aerial photography exists for validation. The DLR SAR End-To-End simulator (SETES) was used in conjunction with airborne scanning laser altimetry (LiDAR) data to estimate regions of the image in which water would not be visible due to shadow or layover caused by buildings and taller vegetation. A semi-automatic algorithm for the detection of floodwater in urban areas is described, together with its validation against the aerial photographs. Of the urban water pixels visible to TerraSAR-X, 76% were correctly detected, with an associated false positive rate of 25%. If all urban water pixels are considered, including those in shadow and layover regions, these figures fall to 58% and 19%, respectively. The algorithm is aimed at producing urban flood extents with which to calibrate and validate urban flood inundation models, and these findings indicate that TerraSAR-X is capable of providing useful data for this purpose.
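The validation statistics quoted above amount to comparing a SAR-derived flood mask against an aerial-photo reference, with and without the shadow/layover exclusion mask. A minimal sketch of that bookkeeping follows; it is an assumed workflow with toy data, not the paper's code, and the false positive rate is computed here as the fraction of detected pixels that are wrong, which may differ from the paper's exact definition.

```python
# Minimal sketch (assumed workflow): score a flood-detection mask against
# reference "truth", optionally restricted to pixels visible to the SAR
# (i.e. not flagged as shadow/layover by the SETES simulation).
import numpy as np

def flood_scores(detected, truth, visible=None):
    """Return (correct detection rate, false positive rate) over a mask."""
    if visible is None:
        visible = np.ones_like(truth, dtype=bool)
    det, tru = detected[visible], truth[visible]
    detection_rate = (det & tru).sum() / max(tru.sum(), 1)
    false_pos_rate = (det & ~tru).sum() / max(det.sum(), 1)
    return detection_rate, false_pos_rate

# Toy 100x100 scene with made-up masks, for illustration only.
rng = np.random.default_rng(0)
truth = rng.random((100, 100)) < 0.3                # reference water pixels
detected = truth ^ (rng.random((100, 100)) < 0.1)   # noisy detection
visible = rng.random((100, 100)) < 0.8              # not shadow/layover

print(flood_scores(detected, truth, visible))   # visible pixels only
print(flood_scores(detected, truth))            # all pixels
```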
Abstract:
As part of its Data User Element programme, the European Space Agency funded the GlobMODEL project, which aimed at investigating the scientific, technical, and organizational issues associated with the use and exploitation of remotely-sensed observations, particularly from new sounders. A pilot study was performed as a "demonstrator" of the GlobMODEL idea, based on the use of new data, with a strong European heritage, that had not yet been assimilated operationally. Two parallel assimilation experiments were performed, using either total column ozone or ozone profiles retrieved at the Royal Netherlands Meteorological Institute (KNMI) from the Ozone Monitoring Instrument (OMI). In both cases, the impact of assimilating OMI data in addition to the total ozone columns from the SCanning Imaging Absorption spectroMeter for Atmospheric CartograpHY (SCIAMACHY) on the European Centre for Medium-Range Weather Forecasts (ECMWF) ozone analyses was assessed by means of independent measurements. We found that the impact of OMI total columns is mainly limited to the region between 20 and 80 hPa, and is particularly important at high latitudes in the Southern Hemisphere, where stratospheric ozone transport and chemical depletion are generally difficult to model with accuracy. Furthermore, the assimilation experiments carried out in this work suggest that OMI DOAS (Differential Optical Absorption Spectroscopy) total ozone columns are on average larger than SCIAMACHY total columns by up to 3 DU, while OMI total columns derived from OMI ozone profiles are on average about 8 DU larger than SCIAMACHY total columns. At the same time, the demonstrator brought to light a number of issues related to the assimilation of atmospheric composition profiles, such as the shortcomings that arise when the vertical resolution of the instrument is not properly accounted for in the assimilation. The GlobMODEL demonstrator accelerated the scientific and operational utilization of new observations, and its results prompted ECMWF to start the operational assimilation of OMI total column ozone data.
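The vertical-resolution shortcoming mentioned above is usually handled with the retrieval's averaging kernel: a retrieved profile relates to the truth through y = x_a + A(x_true - x_a), so a model profile must be smoothed by the same kernel before it is compared with y. The sketch below illustrates this with synthetic numbers; it is a generic textbook construction, not the operational ECMWF code.

```python
# Illustrative sketch of the averaging-kernel issue (synthetic data):
# comparing a model profile directly with a low-resolution retrieval
# mixes resolution effects into the innovation.
import numpy as np

n = 5
x_true = np.linspace(1.0, 6.0, n)            # "true" profile (arbitrary units)
x_apriori = np.full(n, 3.0)                  # retrieval a priori
A = 0.5 * np.eye(n) + 0.1 * np.ones((n, n))  # broad, smoothing kernel

y = x_apriori + A @ (x_true - x_apriori)     # what the retrieval reports

x_model = x_true + 0.3                       # model profile with a known bias
raw_innovation = y - x_model                 # ignores vertical resolution
smoothed_model = x_apriori + A @ (x_model - x_apriori)
proper_innovation = y - smoothed_model       # resolution-consistent comparison
print(raw_innovation)
print(proper_innovation)
```

With the kernel applied, the innovation reflects only the model bias; without it, the smoothing of the retrieval leaks into the comparison.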
Abstract:
In a recent investigation, Landsat TM and ETM+ data were used to simulate different resolutions of remotely-sensed images (from 30 to 1100 m) and to analyze the effect of resolution on a range of landscape metrics associated with spatial patterns of forest fragmentation in Chapare, Bolivia, since the mid-1980s. Whereas most metrics were found to be highly dependent on pixel size, several fractal metrics (DLFD, MPFD, and AWMPFD) were apparently independent of image resolution, in contradiction with a sizeable body of literature indicating that fractal dimensions of natural objects depend strongly on image characteristics. The present re-analysis of the Chapare images, using two alternative algorithms routinely used for the evaluation of fractal dimensions, shows that the values of the box-counting and information fractal dimensions are systematically larger, sometimes by as much as 85%, than the "fractal" indices DLFD, MPFD, and AWMPFD for the same images. In addition, the geometrical fractal features of the forest and non-forest patches in the Chapare region strongly depend on the resolution of the images used in the analysis. The largest dependency on resolution occurs for the box-counting fractal dimension in the case of the non-forest patches in 1993, where the difference between the 30 and 1100 m-resolution images corresponds to 24% of the full theoretical range (1.0 to 2.0) of the mass fractal dimension. The observation that the indices DLFD, MPFD, and AWMPFD, unlike the classical fractal dimensions, appear relatively unaffected by resolution in the case of the Chapare images seems due essentially to the fact that these indices are based on a heuristic, "non-geometric" approach to fractals. Because of their lack of a foundation in fractal geometry, nothing guarantees that these indices will be resolution-independent in general.
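The box-counting dimension used in the re-analysis has a standard definition: cover the pattern with boxes of side s, count the occupied boxes N(s), and take the slope of log N(s) against log(1/s). A generic sketch of that estimator follows (in the spirit of such algorithms, not the authors' exact implementation).

```python
# Standard box-counting estimate of fractal dimension for a binary
# patch map (generic sketch, not the paper's implementation).
import numpy as np

def box_count(mask, size):
    """Number of size x size boxes containing at least one foreground pixel."""
    h, w = mask.shape
    count = 0
    for i in range(0, h, size):
        for j in range(0, w, size):
            if mask[i:i + size, j:j + size].any():
                count += 1
    return count

def box_counting_dimension(mask, sizes=(1, 2, 4, 8, 16, 32)):
    counts = [box_count(mask, s) for s in sizes]
    # Dimension = slope of log N(s) versus log(1/s).
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# Sanity check on a filled square: the dimension should be 2.
mask = np.ones((64, 64), dtype=bool)
print(box_counting_dimension(mask))   # ~2.0
```

Because N(s) is recomputed from the geometry at every scale, the estimate is sensitive to pixel size, which is exactly why the classical dimensions vary with resolution while the heuristic indices need not.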
Abstract:
Airborne laser altimetry has the potential to make frequent, detailed observations that are important for many aspects of studying land surface processes. However, the uncertainties inherent in airborne laser altimetry data have rarely been well measured; uncertainty is often specified only in general terms, such as 20 cm in elevation and 40 cm planimetric. To better constrain these uncertainties, we present an analysis of several datasets acquired specifically to study the temporal consistency of laser altimetry data, and thus assess its operational value. The error budget has three main components, each with its own time regime. For measurements acquired less than 50 ms apart, elevations have a local standard deviation in height of 3.5 cm, enabling the local measurement of surface roughness of the order of 5 cm. Points acquired seconds apart incur an additional random error due to Differential Global Positioning System (DGPS) fluctuation. Measurements made up to an hour apart show an elevation drift of 7 cm over half an hour. Over months, this drift gives rise to a random elevation offset between swathes, averaging 6.4 cm. The RMS planimetric error in point location was derived as 37.4 cm. We conclude by considering the consequences of these uncertainties for the principal application of laser altimetry in the UK, intertidal zone monitoring.
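If the three components are treated as independent, the expected elevation uncertainty for a given comparison interval follows by adding them in quadrature. The sketch below is a back-of-envelope illustration under that independence assumption; the seconds-scale DGPS term is not quantified in the abstract, so its value here is a placeholder.

```python
# Back-of-envelope error budget (assumes independent terms; the DGPS
# jitter magnitude is a placeholder, not a figure from the paper).
import math

LOCAL_NOISE_CM = 3.5    # points < 50 ms apart
DGPS_JITTER_CM = 4.0    # placeholder for the seconds-scale term
SWATH_OFFSET_CM = 6.4   # average offset between swathes months apart

def elevation_sigma(seconds_apart=False, months_apart=False):
    terms = [LOCAL_NOISE_CM]
    if seconds_apart or months_apart:
        terms.append(DGPS_JITTER_CM)
    if months_apart:
        terms.append(SWATH_OFFSET_CM)
    return math.sqrt(sum(t * t for t in terms))

print(f"same swath, <50 ms apart: {elevation_sigma():.1f} cm")
print(f"seconds apart:            {elevation_sigma(seconds_apart=True):.1f} cm")
print(f"months apart:             {elevation_sigma(months_apart=True):.1f} cm")
```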
Abstract:
This article reviews recent developments in the application of capillary electrophoresis (CE) to the analysis of foods and food components. CE has been applied to a number of important areas of food analysis and is fast becoming an established technique within food analytical and research laboratories. Papers published in the two years since the previous review are covered.
Abstract:
Identifying a stimulus as the target for a goal-directed movement involves inhibiting competing responses. Separable inhibitory interconnections bias local competition to ensure that only one stimulus is selected and to alter movement initiation. Behavioural evidence of these inhibitory processes comes from the effects of distracters on oculomotor landing positions and saccade latencies. Here, we investigate the relationship between these two sources of inhibition. Targets were presented with or without close and remote distracters. In separate experiments, the possible position and identity of the target and distracters were manipulated. In all cases, the saccade landing position was less affected by the presence of the close distracter when remote distracters were also present. The involuntary increase in saccade latency caused by the presence of the remote distracters alters the state of the competitive processes involved in selecting the saccade target, thus changing its landing position.
Abstract:
Stroke is a leading cause of disability, particularly affecting older people. Although the causes of stroke are well known and it is possible to reduce the associated risks, there is still a need to improve rehabilitation techniques. Early studies in the literature suggest that early, intensive therapy can enhance a patient's recovery. According to the physiotherapy literature, attention and motivation are key factors in motor relearning following stroke. Machine-mediated therapy offers the potential to improve the outcome of stroke patients engaged in rehabilitation for upper limb motor impairment. Haptic interfaces are a particular group of robots that are attractive due to their ability to interact safely with humans. They can enhance traditional therapy tools, provide therapy "on demand", and can record accurate, objective measurements of a patient's progress. Our recent studies suggest that the use of tele-presence and VR-based systems can potentially motivate patients to exercise for longer periods of time. The creation of human-like trajectories is essential for retraining the upper limb movements of people who have lost manipulation functions following stroke. By coupling models of human arm movement with haptic interfaces and VR technology, it is possible to create a new class of robot-mediated neuro-rehabilitation tools. This paper provides an overview of different approaches to robot-mediated therapy and describes a system based on haptics and virtual reality visualisation techniques, with particular emphasis on different control strategies for interaction derived from minimum jerk theory and on the aid of virtual and mixed reality based exercises.
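The minimum jerk theory referred to above has a well-known closed form (Flash and Hogan): for normalised time tau = t/T, the position profile of a point-to-point reach is x(t) = x0 + (xf - x0)(10 tau^3 - 15 tau^4 + 6 tau^5). The sketch below shows how a controller might sample such a reference trajectory; it is illustrative only, and the paper's actual control strategies are richer than this.

```python
# Minimum-jerk reference trajectory (Flash & Hogan closed form).
# Illustrative sketch of sampling a human-like reach profile, not the
# paper's controller.
import numpy as np

def minimum_jerk(x0, xf, T, t):
    """Minimum-jerk position at time t for a reach from x0 to xf lasting T."""
    tau = np.clip(t / T, 0.0, 1.0)
    s = 10 * tau**3 - 15 * tau**4 + 6 * tau**5
    return x0 + (xf - x0) * s

# Reference positions for a 0.3 m reach over 1.5 s, sampled at ~10 Hz.
ts = np.linspace(0.0, 1.5, 16)
print(np.round(minimum_jerk(0.0, 0.3, 1.5, ts), 3))
```

The bell-shaped velocity profile this polynomial produces is what makes the resulting robot guidance feel natural to the patient.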
Abstract:
In the decade since OceanObs'99, great advances have been made in the field of ocean data dissemination. The use of Internet technologies has transformed the landscape: users can now find, evaluate and access data rapidly and securely using only a web browser. This paper describes the current state of the art in dissemination methods for ocean data, focusing particularly on ocean observations from in situ and remote sensing platforms. We discuss current efforts to improve the consistency of delivered data and to increase the potential for automated integration of diverse datasets. An important recent development is the adoption of open standards from the Geographic Information Systems community; we discuss the current impact of these new technologies and their future potential. We conclude that new approaches will indeed be necessary to exchange data more effectively and to forge links between communities, but these approaches must be evaluated critically through practical tests, and existing ocean data exchange technologies must be used to their best advantage. Investment in key technology components, cross-community pilot projects and the enhancement of end-user software tools will be required in order to assess and demonstrate the value of any new technology.
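One concrete benefit of the open GIS standards mentioned above is that data discovery becomes a plain HTTP exchange. As a hedged illustration, an OGC Web Map Service GetCapabilities request can be issued and parsed in a few lines; the server URL below is a placeholder, not an endpoint from the paper.

```python
# Sketch of open-standard data discovery: list the layers advertised by
# an OGC Web Map Service (WMS 1.3.0). The endpoint URL is hypothetical.
from urllib.parse import urlencode
from urllib.request import urlopen
import xml.etree.ElementTree as ET

def wms_layer_names(base_url):
    """Return the layer names advertised by a WMS server."""
    query = urlencode({"service": "WMS", "request": "GetCapabilities",
                       "version": "1.3.0"})
    with urlopen(f"{base_url}?{query}") as resp:
        tree = ET.parse(resp)
    ns = {"wms": "http://www.opengis.net/wms"}
    return [e.text for e in tree.iterfind(".//wms:Layer/wms:Name", ns)]

# Example usage (hypothetical endpoint):
# print(wms_layer_names("https://ocean-data.example.org/wms"))
```

Because the request and response formats are standardised, any client that speaks WMS can integrate such a server without custom code, which is precisely the interoperability argument the paper makes.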