892 results for Parallel computing, Virtual machine, Composition, Determinism, Abstraction
Abstract:
Poster presented at the 11th Mediterranean Congress of Chemical Engineering, Barcelona, October 21-24, 2008.
Abstract:
Interacting with a computer system in the operating room (OR) can be a frustrating experience for a surgeon, who currently has to verbally delegate every computer interaction task to an assistant. This indirect mode of interaction is time-consuming, error-prone, and can lead to poor usability of OR computer systems. This thesis describes the design and evaluation of a joystick-like device that allows direct surgeon control of the computer in the OR. The device was tested extensively, in comparison to a mouse and delegated dictation, with seven surgeons, eleven residents, and five graduate students. The device contains no electronic parts, is easy to use, is unobtrusive, has no physical connection to the computer, and makes use of an existing tool in the OR. We performed a user study to determine its effectiveness in allowing a user to perform all the tasks they would be expected to perform on an OR computer system during computer-assisted surgery. Dictation was found to be superior to the joystick in qualitative measures, but the joystick was preferred over dictation in user satisfaction responses. The mouse outperformed both the joystick and dictation, but it is not a readily accepted modality in the OR.
Abstract:
With the rapid advance of web service technologies, end-users can conduct various on-line tasks, such as on-line shopping. Usually, end-users compose a set of services to accomplish a task and need to enter values into the services to invoke the composite service. Quite often, users re-visit websites and use services to perform recurring tasks, and are required to enter the same information into various web services each time. Repetitively typing the same information into services is tedious and negatively impacts the user experience. Recent studies have proposed several approaches to help users fill in values for services automatically. However, prior studies suffer from the following drawbacks: (1) limited support for collecting and analyzing user inputs; (2) poor accuracy when filling values into services; and (3) a lack of support for service composition. To overcome these drawbacks, we need to maximize the reuse of previous user inputs across services and end-users. In this thesis, we introduce approaches that spare end-users from entering the same information for recurring on-line tasks. More specifically, we improve the process of filling in services in the following four aspects. First, we investigate the characteristics of input parameters and propose an ontology-based approach to automatically categorize parameters and fill in values for the categorized input parameters. Second, we propose a comprehensive framework that incorporates user contexts and usage patterns into the process of filling in values for services. Third, we propose an approach that maximizes value propagation among services and end-users by linking semantically related parameters and similar end-users together. Last, we propose a ranking-based framework that ranks previous user inputs for an input parameter, saving the user unnecessary data entry. The framework learns and analyzes the interactions between user inputs and input parameters to rank inputs under different contexts.
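As a rough illustration of the ranking idea described above (a sketch only; the scoring weights, field names, and helper functions here are hypothetical and not taken from the thesis), previously entered values for an input parameter could be ranked by combining how often and how recently they were used with how well their recorded context matches the current one:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Dict, List

@dataclass
class PastInput:
    value: str                      # value the user previously entered
    count: int                      # how many times it was used
    last_used: datetime             # when it was last used
    context: Dict[str, str] = field(default_factory=dict)  # e.g. {"site": "...", "task": "shopping"}

def context_similarity(a: Dict[str, str], b: Dict[str, str]) -> float:
    """Fraction of shared context keys that hold the same value (simple overlap measure)."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    return sum(a[k] == b[k] for k in shared) / len(shared)

def rank_inputs(history: List[PastInput], current_context: Dict[str, str],
                now: datetime) -> List[str]:
    """Rank candidate values for one input parameter by frequency, recency, and context match."""
    def score(p: PastInput) -> float:
        recency = 1.0 / (1.0 + (now - p.last_used).days)   # more recent inputs score higher
        return 0.4 * p.count + 0.3 * recency + 0.3 * context_similarity(p.context, current_context)
    return [p.value for p in sorted(history, key=score, reverse=True)]
```

The top-ranked value would be offered as the default suggestion for the parameter, with the remainder shown as alternatives.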
Abstract:
We present a machine learning-based system for automatically computing interpretable, quantitative measures of animal behavior. Through our interactive system, users encode their intuition about behavior by annotating a small set of video frames. These manual labels are converted into classifiers that can automatically annotate behaviors in screen-scale data sets. Our general-purpose system can create a variety of accurate individual and social behavior classifiers for different organisms, including mice and adult and larval Drosophila.
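As a hedged illustration of the label-then-propagate workflow described above (a sketch only; the per-frame features, the choice of a gradient-boosting classifier, and all numbers are assumptions, not the authors' pipeline), a small set of manually annotated frames can train a classifier that then labels every frame in a large data set:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

# Hypothetical per-frame features (e.g. speed, orientation, distance to neighbours),
# with manual labels available for only a small subset of frames.
rng = np.random.default_rng(0)
features = rng.normal(size=(10_000, 8))                  # one feature row per video frame
labeled_idx = rng.choice(10_000, size=200, replace=False)
labels = (features[labeled_idx, 0] > 0).astype(int)      # stand-in for user annotations (behavior / not)

# Train on the sparse manual labels, then annotate every frame automatically.
clf = GradientBoostingClassifier().fit(features[labeled_idx], labels)
predicted_behavior = clf.predict(features)               # 0/1 behavior label for all 10,000 frames
confidence = clf.predict_proba(features)[:, 1]           # useful for reviewing borderline frames
```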
Abstract:
We have analyzed inorganic and organic carbon and determined the isotopic composition of both sedimentary organic carbon and inorganic carbon in carbonates contained in sediments recovered from Holes 434, 434A, 434B, 435, and 435A on the landward slope and from Hole 436 on the oceanic slope of the Japan Trench. Both inorganic and organic carbon were assayed at the P. P. Shirshov Institute of Oceanology, in the same sample, using the Knopp technique and measuring evolved CO2 gravimetrically. Each sample was analyzed twice in parallel. Measurements had an accuracy of ±0.05 per cent at a probability level of 0.95. Carbon isotopic analysis was carried out on an MI-1305 mass spectrometer at the I. M. Gubkin Institute of Petrochemical and Gas Industry, and the results are presented as δ13C values relative to the PDB standard. The procedure for preparing samples for organic carbon isotopic analysis involved (1) drying damp sediments at 60°C; (2) treating samples, while heating, with 10 N HCl to remove carbonate carbon; and (3) evaporating surplus HCl at 60°C. The organic matter was converted to CO2 by oxidation in an oxygen atmosphere. To prepare samples for inorganic carbon isotopic analysis, we decomposed the carbonates with orthophosphoric acid and purified the evolved gas. The δ13C measurements, including a full cycle of sample preparation, had an accuracy of ±0.5 per cent at a probability level of 0.95.
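For reference, the δ13C values reported relative to the PDB standard follow the usual delta definition (a standard formula, not specific to this report):

$$\delta^{13}\mathrm{C} = \left(\frac{(^{13}\mathrm{C}/^{12}\mathrm{C})_{\mathrm{sample}}}{(^{13}\mathrm{C}/^{12}\mathrm{C})_{\mathrm{PDB}}} - 1\right) \times 1000 \ \text{(per mil)}$$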
Abstract:
The Middle Valley segment at the northern end of the Juan de Fuca Ridge is a deep extensional rift blanketed with 200-500 m of Pleistocene turbiditic sediment. Sites 857 and 858 were drilled during Ocean Drilling Program Leg 139 to determine whether these two sites were hydrologically linked end members of an active hydrothermal circulation system. Site 858 was placed in an area of active hydrothermal discharge with fluids up to 270°C venting through anhydrite-bearing mounds on top of altered sediment. The shallow basement of fine-grained basalt that underlies the vents at Site 858 is interpreted as a seamount that was subsequently buried by turbidites. Site 857 was placed 1.6 km south of the Site 858 vents in a zone of high heat flow and numerous seismically imaged ridge-parallel faults. Drilling at Site 857 encountered sediments that are increasingly altered with depth and that overlie a series of mafic sills at depths of 460-940 m below sea floor. Sill margins and adjacent baked sediment are highly altered to magnesian chlorite and crosscut with veins filled with quartz, chlorite, sulfides, epidote, and wairakite. The sill interiors vary from slightly altered, with unaltered plagioclase and clinopyroxene in a mesostasis replaced by chlorite, to local zones of intense alteration and brecciation. In these latter zones, the sill interiors are pervasively replaced by chlorite, epidote, quartz, pyrite, titanite, and rare actinolite. The most complete replacement is associated with brecciated horizons with low recovery and slickensides on fracture surfaces, which we interpret as intersections between faults and the sills. Geochemically, the alteration of the sill complex is reflected in significant whole-rock depletions in Ca, Sr, and Na with corresponding enrichments in Mg, Al, and most metals. The latter results from the formation of conspicuous sulfide poikiloblasts. In contrast, metamorphism of the Site 858 seamount includes incomplete albitization of plagioclase phenocrysts and replacement of sparse mafic phenocrysts. Much of the basement alteration at Site 858 is confined to crosscutting veins except for a highly altered and veined horizon at the contact between basaltic basement and the overlying sediment. The sill complex at Site 857 is more highly depleted in 18O (δ18O = 2.4 to 4.7 per mil) and more pervasively replaced by secondary minerals relative to the extrusives at Site 858 (δ18O = 4.5 to 5.5 per mil). There is no evidence of significant albitization of the plagioclase at Site 857, suggesting high Ca/Na in the pore fluids. Fluid-inclusion data from hydrothermal minerals in altered mafic rocks and veins at Sites 857 and 858 show a consistency of homogenization temperatures, varying from 245 to 270°C, which is within the range of temperatures observed for the fluids venting at Site 858. The consistency of the fluid inclusion temperatures, the lack of albitization within the Site 857 sills, and the apparently low water/rock ratio collectively suggest that the sill complex at Site 857 is in thermal equilibrium and being altered by a highly evolved Ca-rich fluid similar to the fluids now venting at Site 858. The alteration evident in these two deep crustal drill sites is a result of the ongoing hydrothermal circulation and is consistent with downhole logging results, instrumented borehole results, and hydrothermal fluid chemistry.
The pervasive alteration of the laterally extensive sill-sediment complex at Site 857 determines the chemistry of the fluids that are venting at Site 858. The limited alteration of the Site 858 lavas suggests that this basement edifice acts as a penetrator or ventilator for the regional hydrothermal reservoir with much of the flow focussed at the highly altered and veined sediment-basalt contact.
Abstract:
Sedimentary processes in the southeastern Weddell Sea are influenced by glacial-interglacial ice-shelf dynamics and the cyclonic circulation of the Weddell Gyre, which affects all water masses down to the sea floor. Significantly increased sedimentation rates occur during glacial stages, when ice sheets advance to the shelf edge and trigger gravitational sediment transport to the deep sea. Downslope transport on the Crary Fan and off Dronning Maud and Coats Land is channelized into three huge channel systems, which originate on the eastern, central, and western Crary Fan. They gradually turn from a northerly direction eastward until they follow a course parallel to the continental slope. All channels show strongly asymmetric cross sections with well-developed levees on their northwestern sides, forming wedge-shaped sediment bodies. They level off very gently. Levees on the southeastern sides are small, if present at all. This characteristic morphology likely results from the process of combined turbidite-contourite deposition. Strong thermohaline currents of the Weddell Gyre entrain particles from turbidity-current suspensions, which flow down the channels, and carry them westward out of the channel, where they settle on a surface gently dipping away from the channel. These sediments are intercalated with overbank deposits of high-energy and high-volume turbidity currents, which preferentially flood the left side of the channels (looking downchannel) as a result of the Coriolis force. In the distal setting of the easternmost channel-levee complex, where thermohaline currents are directed northeastward as a result of a recirculation of water masses from the Enderby Basin, the setting and the internal structures of a wedge-shaped sediment body indicate a contourite drift rather than a channel levee. Dating of the sediments reveals that the levees in their present form started to develop with a late Miocene cooling event, which caused an expansion of the East Antarctic Ice Sheet and an invigoration of thermohaline current activity.
Abstract:
Effectively using heterogeneous, distributed information has attracted much research in recent years. Current web service technologies have been used successfully in some non-data-intensive distributed prototype systems; however, most of them do not work well in data-intensive environments. This paper provides an infrastructure layer for effectively delivering spatial information services in data-intensive environments using web services over the Internet. We extensively investigate and analyze the overhead of web services in data-intensive environments and propose new optimization techniques that greatly increase the system's efficiency. Our experiments show that these techniques are well suited to data-intensive environments. Finally, we discuss the requirements of these techniques for providing spatial information web services over the Internet.
Abstract:
Despite the insight gained from 2-D particle models, and given that the dynamics of crustal faults occur in 3-D space, the question remains: how do 3-D fault gouge dynamics differ from those in 2-D? Traditionally, 2-D modeling has been preferred over 3-D simulations because of the computational cost of solving 3-D problems. However, modern high performance computing architectures, combined with a parallel implementation of the Lattice Solid Model (LSM), provide the opportunity to explore 3-D fault micro-mechanics and to advance understanding of effective constitutive relations of fault gouge layers. In this paper, macroscopic friction values from 2-D and 3-D LSM simulations, performed on an SGI Altix 3700 super-cluster, are compared. Two rectangular elastic blocks of bonded particles, with a rough fault plane and separated by a region of randomly sized non-bonded gouge particles, are sheared in opposite directions by normally loaded driving plates. The results demonstrate that the gouge particles in the 3-D models undergo significant out-of-plane motion during shear. The 3-D models also exhibit a higher mean macroscopic friction than the 2-D models for varying values of interparticle friction. 2-D LSM gouge models have previously been shown to exhibit accelerating energy release in simulated earthquake cycles, supporting the Critical Point hypothesis. The 3-D models are shown to also display accelerating energy release, and good fits of power law time-to-failure functions to the cumulative energy release are obtained.
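The power-law time-to-failure fit mentioned above is commonly written as E(t) = A + B(t_f - t)^m with B < 0 and 0 < m < 1. A minimal sketch of such a fit follows (synthetic data, hypothetical parameter values, and SciPy's curve_fit standing in for whatever fitting procedure was actually used in the study):

```python
import numpy as np
from scipy.optimize import curve_fit

def time_to_failure(t, A, B, tf, m):
    """Cumulative energy release E(t) = A + B * (tf - t)**m, with B < 0 and 0 < m < 1."""
    return A + B * np.clip(tf - t, 1e-9, None) ** m

# Synthetic cumulative energy release accelerating toward an (unknown) failure time tf = 100.
t = np.linspace(0.0, 95.0, 200)
true = time_to_failure(t, A=12.0, B=-3.0, tf=100.0, m=0.3)
observed = true + np.random.default_rng(1).normal(scale=0.05, size=t.size)

# Fit A, B, tf, m; bound tf to lie after the last observation so (tf - t) stays positive.
p0 = (observed[-1], -1.0, t[-1] + 10.0, 0.5)
bounds = ([-np.inf, -np.inf, t[-1] + 1e-3, 0.01], [np.inf, 0.0, t[-1] + 1000.0, 1.0])
(A, B, tf, m), _ = curve_fit(time_to_failure, t, observed, p0=p0, bounds=bounds)
print(f"estimated failure time tf = {tf:.1f}, exponent m = {m:.2f}")
```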
Abstract:
This paper reports on a current research project in which virtual reality simulators are being investigated as a means of simulating hazardous rail work conditions in order to allow train drivers to practice decision-making under stress. When working under high stress conditions, train drivers need to move beyond procedural responses into a response activated through their own problem-solving and decision-making skills. This study focuses on the use of stress inoculation training, which aims to build drivers' confidence in the use of new decision-making skills by repeatedly requiring them to respond to hazardous driving conditions. In particular, the study makes use of a train cab driving simulator to reproduce potentially stress-inducing real-world scenarios. Initial pilot research has been undertaken in which drivers have experienced the training simulation and subsequently completed surveys on the level of immersion experienced. Concurrently, drivers have also participated in a velocity perception experiment designed to objectively measure the fidelity of the virtual training environment. Baseline data, against which decision-making skills post-training will be measured, is being gathered via cognitive task analysis designed to identify primary decision requirements for specific rail events. While considerable efforts have been invested in improving virtual reality technology, little is known about how to best use this technology for training personnel to respond to workplace conditions in the rail industry. To enable the best use of simulators for training in the rail context, the project aims to identify those factors within virtual reality that support required learning outcomes and use this information to design training simulations that reliably and safely train staff in required workplace accident response skills.
Synthesis of serial communications controller using higher abstraction level derivation (HALD) model
Abstract:
The software implementation of the emergency shutdown feature in a major radiotherapy system was analyzed, using a directed form of code review based on module dependences. Dependences between modules are labelled by particular assumptions; this allows one to trace through the code and identify the fragments responsible for critical features. An "assumption tree" is constructed in parallel, showing the assumptions which each module makes about others. The root of the assumption tree is the critical feature of interest, and its leaves represent assumptions which, if not valid, might cause the critical feature to fail. The analysis revealed some unexpected assumptions that motivated improvements to the code.
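A hedged sketch of the assumption-tree idea follows (the data structure, field names, and example assumptions are illustrative, not taken from the analysis itself): the root holds the critical feature, interior nodes hold the assumptions one module makes about another, and collecting the leaves yields the assumptions whose failure could defeat the feature:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class AssumptionNode:
    description: str                      # e.g. "emergency shutdown stops the beam promptly"
    module: str                           # module the assumption is about
    children: List["AssumptionNode"] = field(default_factory=list)

def leaf_assumptions(node: AssumptionNode) -> List[AssumptionNode]:
    """Leaves are the unexamined assumptions; if any is invalid, the root feature may fail."""
    if not node.children:
        return [node]
    return [leaf for child in node.children for leaf in leaf_assumptions(child)]

# Root = critical feature of interest; children = assumptions its module makes about others.
root = AssumptionNode("Emergency shutdown disables the beam", "ShutdownController", [
    AssumptionNode("Interlock flag is polled every control cycle", "ControlLoop", [
        AssumptionNode("Timer interrupt is never masked for more than one cycle", "Scheduler"),
    ]),
    AssumptionNode("Relay driver honours the stop command", "RelayDriver"),
])
for leaf in leaf_assumptions(root):
    print(f"check: {leaf.module} -- {leaf.description}")
```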