Abstract:
The current gold standard for the design of orthopaedic implants is 3D models of long bones obtained using computed tomography (CT). However, high-resolution CT imaging involves high radiation exposure, which limits its use in healthy human volunteers. Magnetic resonance imaging (MRI) is an attractive alternative for the scanning of healthy human volunteers for research purposes. Current limitations of MRI include difficulties of tissue segmentation within joints and long scanning times. In this work, we explore the possibility of overcoming these limitations through the use of MRI scanners operating at a higher field strength. We quantitatively compare the quality of anatomical MR images of long bones obtained at 1.5 T and 3 T and optimise the scanning protocol of 3 T MRI. FLASH images of the right leg of five human volunteers acquired at 1.5 T and 3 T were compared in terms of signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR). The comparison showed a relatively high CNR and SNR at 3 T for most regions of the femur and tibia, with the exception of the distal diaphyseal region of the femur and the mid diaphyseal region of the tibia. This was accompanied by an ~65% increase in the longitudinal spin relaxation time (T1) of the muscle at 3 T compared to 1.5 T. The results suggest that MRI at 3 T may be able to enhance the segmentability and potentially improve the accuracy of 3D anatomical models of long bones, compared to 1.5 T. We discuss how the total imaging times at 3 T can be kept short while maximising the CNR and SNR of the images obtained.
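For illustration, the sketch below shows one common way SNR and CNR can be computed from region-of-interest statistics (mean signal over the standard deviation of background noise); the ROIs, tissues and image values here are hypothetical, and the paper's exact measurement procedure may differ.

```python
import numpy as np

def snr(signal_roi: np.ndarray, noise_roi: np.ndarray) -> float:
    """Signal-to-noise ratio: mean signal over the standard deviation of background noise."""
    return float(np.mean(signal_roi) / np.std(noise_roi))

def cnr(tissue_a: np.ndarray, tissue_b: np.ndarray, noise_roi: np.ndarray) -> float:
    """Contrast-to-noise ratio between two tissues (e.g. cortical bone vs. muscle)."""
    return float(abs(np.mean(tissue_a) - np.mean(tissue_b)) / np.std(noise_roi))

# Example with synthetic regions of interest (ROIs) standing in for an MR image.
rng = np.random.default_rng(0)
bone = rng.normal(40.0, 5.0, size=(20, 20))        # hypothetical low-signal cortical bone ROI
muscle = rng.normal(120.0, 8.0, size=(20, 20))     # hypothetical muscle ROI
background = rng.normal(0.0, 4.0, size=(20, 20))   # air / background noise ROI

print("SNR(muscle) =", round(snr(muscle, background), 1))
print("CNR(bone vs muscle) =", round(cnr(bone, muscle, background), 1))
```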
Abstract:
This paper details progress to date toward developing a small autonomous helicopter. We describe the system architecture, avionics, visual state estimation, custom IMU design, aircraft modelling, and various linear and neuro/fuzzy control algorithms. Experimental results are presented for state estimation using fused stereo vision and IMU data, heading control, and attitude control. FAM attitude and velocity controllers have been shown to be effective in simulation.
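The abstract does not give the estimator equations; purely as an illustration of vision/IMU fusion, the sketch below blends an integrated gyro heading with a hypothetical vision-derived heading using a simple complementary filter. It is not the authors' estimator or the FAM controllers.

```python
import random

def complementary_filter(heading_prev, gyro_rate, vision_heading, dt, alpha=0.98):
    """Blend an integrated gyro heading (fast but drifting) with a vision heading
    (slower, absolute). alpha near 1 trusts the gyro over short horizons, while
    (1 - alpha) pulls the estimate toward the vision fix to cancel drift."""
    gyro_heading = heading_prev + gyro_rate * dt
    return alpha * gyro_heading + (1.0 - alpha) * vision_heading

# Hypothetical 100 Hz loop: a constant 0.1 rad/s turn with noisy gyro and vision fixes.
heading, true_heading, dt = 0.0, 0.0, 0.01
for _ in range(100):
    true_heading += 0.1 * dt
    gyro_rate = 0.1 + random.gauss(0.0, 0.01)                # rate gyro with noise
    vision_heading = true_heading + random.gauss(0.0, 0.02)  # stereo-vision heading fix
    heading = complementary_filter(heading, gyro_rate, vision_heading, dt)
print(f"estimated heading: {heading:.3f} rad, true heading: {true_heading:.3f} rad")
```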
Abstract:
Online social networks can be modelled as graphs; in this paper, we analyze the use of graph metrics for identifying users with anomalous relationships to other users. A framework is proposed for analyzing the effectiveness of various graph theoretic properties such as the number of neighbouring nodes and edges, betweenness centrality, and community cohesiveness in detecting anomalous users. Experimental results on real-world data collected from online social networks show that the majority of users typically have friends who are friends themselves, whereas anomalous users’ graphs typically do not follow this common rule. Empirical analysis also shows that the relationship between average betweenness centrality and edges identifies anomalies more accurately than other approaches.
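As a rough sketch of the kind of graph metrics discussed (not the paper's framework or data), the example below uses the networkx library to compute degree, clustering coefficient (do a user's friends know each other?) and betweenness centrality on a toy graph containing one hub-like "anomalous" node.

```python
import networkx as nx

def anomaly_features(G: nx.Graph) -> dict:
    """Per-node graph features of the kind the abstract discusses.

    degree      -> number of neighbouring nodes
    clustering  -> fraction of a user's friends who are friends themselves (near 0 is suspicious)
    betweenness -> how often the user lies on shortest paths between other users
    """
    clustering = nx.clustering(G)
    betweenness = nx.betweenness_centrality(G)
    return {
        n: {"degree": G.degree(n),
            "clustering": clustering[n],
            "betweenness": betweenness[n]}
        for n in G.nodes
    }

# Toy example: a tight friend group plus one user connected to many mutual strangers.
G = nx.complete_graph(5)                          # normal users: friends of friends are friends
G.add_edges_from((99, i) for i in range(5, 15))   # hub-like "anomalous" user 99
features = anomaly_features(G)
print("normal user 0     :", features[0])
print("candidate anomaly :", features[99])
```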
Abstract:
Theoretical foundations of higher order spectral analysis are revisited to examine the use of time-varying bicoherence on non-stationary signals using a classical short-time Fourier approach. A methodology is developed to apply this to evoked EEG responses where a stimulus-locked time reference is available. Short-time windowed ensembles of the response at the same offset from the reference are considered as ergodic cyclostationary processes within a non-stationary random process. Bicoherence can be estimated reliably, with known levels at which it is significantly different from zero, and can be tracked as a function of offset from the stimulus. When this methodology is applied to multi-channel EEG, it is possible to obtain information about phase synchronization at different regions of the brain as the neural response develops. The methodology is applied to analyze evoked EEG responses to flash visual stimuli presented to the left and right eye separately. The EEG electrode array is segmented based on bicoherence evolution with time, using the mean absolute difference as a measure of dissimilarity. Segment maps confirm the importance of the occipital region in visual processing and demonstrate a link between the frontal and occipital regions during the response. Maps are constructed using bicoherence at bifrequencies that include the alpha band frequency of 8 Hz as well as 4 Hz and 20 Hz. Differences are observed between responses from the left eye and the right eye, and also between subjects. The methodology shows potential as a neurological functional imaging technique that can be further developed for diagnosis and monitoring using scalp EEG, which is less invasive and less expensive than magnetic resonance imaging.
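For readers unfamiliar with bicoherence, the sketch below shows one common ensemble estimator at a single bifrequency, applied to synthetic stimulus-locked epochs; normalization conventions and significance thresholds vary in the literature, and this is not the paper's exact estimator.

```python
import numpy as np

def bicoherence(segments: np.ndarray, f1: int, f2: int) -> float:
    """Ensemble-averaged bicoherence at frequency-bin pair (f1, f2).

    segments: shape (n_trials, n_samples) -- windowed epochs taken at the same
    offset from the stimulus in every trial and treated as an ensemble.
    Returns a value in [0, 1]; values near 1 suggest quadratic phase coupling.
    """
    X = np.fft.rfft(segments * np.hanning(segments.shape[1]), axis=1)
    triple = X[:, f1] * X[:, f2] * np.conj(X[:, f1 + f2])
    num = np.abs(np.mean(triple))
    den = np.sqrt(np.mean(np.abs(X[:, f1] * X[:, f2]) ** 2) *
                  np.mean(np.abs(X[:, f1 + f2]) ** 2))
    return float(num / den)

# Synthetic ensemble: components at bins 4 and 8 plus a phase-coupled component
# at bin 12 (phase = sum of the other two), buried in noise, over 200 trials.
rng = np.random.default_rng(1)
n_trials, n = 200, 256
t = np.arange(n) / n
phi1 = rng.uniform(0, 2 * np.pi, (n_trials, 1))
phi2 = rng.uniform(0, 2 * np.pi, (n_trials, 1))
trials = (np.cos(2 * np.pi * 4 * t + phi1) + np.cos(2 * np.pi * 8 * t + phi2)
          + np.cos(2 * np.pi * 12 * t + phi1 + phi2)
          + 0.5 * rng.standard_normal((n_trials, n)))
print("coupled   b(4, 8) =", round(bicoherence(trials, 4, 8), 2))   # expected near 1
print("uncoupled b(3, 7) =", round(bicoherence(trials, 3, 7), 2))   # expected near 0
```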
Abstract:
We present and analyze several gaze-based graphical password schemes based on recall and cued-recall of grid points; eye-trackers are used to record users' gazes, which can prevent shoulder-surfing and may be suitable for users with disabilities. Our 22-subject study observes that the success rate and entry time for the grid-based schemes we consider are comparable to those of other gaze-based graphical password schemes. We propose the first password security metrics suitable for analysis of graphical grid passwords and provide an in-depth security analysis of user-generated passwords from our study, observing that, on several metrics, user-generated graphical grid passwords are substantially weaker than uniformly random passwords, despite our attempts at designing schemes to improve the quality of user-generated passwords.
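The abstract does not state the proposed metrics; as a loose illustration only, the sketch below compares the empirical Shannon entropy of hypothetical user-chosen grid points against the uniform-choice baseline, a much simpler measure than the metrics the paper proposes.

```python
import math
from collections import Counter

def shannon_entropy_bits(choices) -> float:
    """Empirical Shannon entropy (bits per element) of observed password elements."""
    counts = Counter(choices)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical data: grid points on a 6x6 grid chosen by users, vs. the uniform baseline.
user_points = [(0, 0), (0, 0), (5, 5), (0, 0), (2, 2), (5, 5), (0, 5), (0, 0)]
uniform_bits = math.log2(6 * 6)   # ~5.17 bits per point if every cell were equally likely
print("user-chosen entropy:", round(shannon_entropy_bits(user_points), 2), "bits/point")
print("uniform baseline   :", round(uniform_bits, 2), "bits/point")
```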
Abstract:
This paper is concerned with the optimal path planning and initialization interval of one or two UAVs in the presence of a constant wind. The method builds on previous literature results on the synchronization of UAVs along convex curves, path planning and sampling in 2D, and extends them to 3D. This method can be applied to observe gas/particle emissions inside a control volume during sampling loops. The flight pattern is composed of two phases: a start-up interval and a sampling interval, which is represented by a semi-circular path. The methods were tested on four complex model test cases in 2D and 3D, as well as one simulated real-world scenario in 2D and one in 3D.
Abstract:
The main aim of this paper is to describe an adaptive re-planning algorithm, based on an RRT and game theory, that produces an efficient, collision-free, obstacle-adaptive mission path planner for Search and Rescue (SAR) missions. This will provide UAV autopilots and flight computers with the capability to autonomously avoid static obstacles and No Fly Zones (NFZs) through dynamic adaptive path re-planning. The methods and algorithms produce optimal collision-free paths and can be integrated into a decision-aid tool and UAV autopilots.
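As a sketch of the underlying planner family only (not the authors' game-theoretic replanner), the example below implements a minimal 2D RRT that grows a tree around circular no-fly-zone obstacles; all parameters and geometry are hypothetical.

```python
import math
import random

def rrt(start, goal, obstacles, x_max=100.0, y_max=100.0,
        step=3.0, goal_tol=3.0, max_iter=5000):
    """Minimal 2D RRT. obstacles: list of (cx, cy, radius) circular no-fly zones."""
    def collides(p):
        return any(math.hypot(p[0] - cx, p[1] - cy) <= r for cx, cy, r in obstacles)

    nodes, parent = [start], {0: None}
    for _ in range(max_iter):
        # Sample a random point, with a small bias toward the goal.
        sample = goal if random.random() < 0.05 else (random.uniform(0, x_max),
                                                      random.uniform(0, y_max))
        i_near = min(range(len(nodes)), key=lambda i: math.dist(nodes[i], sample))
        near = nodes[i_near]
        theta = math.atan2(sample[1] - near[1], sample[0] - near[0])
        new = (near[0] + step * math.cos(theta), near[1] + step * math.sin(theta))
        if collides(new):
            continue
        nodes.append(new)
        parent[len(nodes) - 1] = i_near
        if math.dist(new, goal) < goal_tol:       # reached the goal: walk back up the tree
            path, i = [], len(nodes) - 1
            while i is not None:
                path.append(nodes[i])
                i = parent[i]
            return path[::-1]
    return None

path = rrt(start=(5.0, 5.0), goal=(90.0, 90.0), obstacles=[(50.0, 50.0, 15.0)])
print("waypoints:", len(path) if path else "no path found")
```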
Abstract:
The main objective of this paper is to describe the development of a remote-sensing airborne air sampling system for Unmanned Aerial Systems (UAS), providing the capability to detect particle and gas concentrations in real time over remote locations. The design of the air sampling methodology started by defining the system architecture and then selecting and integrating each subsystem. A multifunctional air sampling instrument, with the capability for simultaneous measurement of particle and gas concentrations, was modified and integrated with ARCAA’s Flamingo UAS platform and communications protocols. As a result of the integration process, a system capable of both real-time geo-location monitoring and indexed-link sampling was obtained. Wind tunnel tests were conducted to evaluate the performance of the air sampling instrument under controlled non-stationary conditions at the typical operational velocities of the UAS platform. Once the fully operational remote air sampling system was obtained, the problem of mission design was analyzed through the simulation of different scenarios. Furthermore, flight tests of the complete air sampling system were conducted to check the dynamic characteristics of the UAS with the air sampling system and to prove its capability to perform an air sampling mission following a specific flight path.
Abstract:
Trivium is a bit-based stream cipher in the final portfolio of the eSTREAM project. In this paper, we apply the approach of Berbain et al. to Trivium-like ciphers and perform new algebraic analyses on them, namely Trivium and its reduced versions: Trivium-N, Bivium-A and Bivium-B. In doing so, we answer an open question in the literature. We demonstrate a new algebraic attack on Bivium-A. This attack requires less time and memory than previous techniques that use the F4 algorithm to recover Bivium-A's initial state. Although our attacks on Bivium-B, Trivium and Trivium-N are worse than exhaustive key search, the systems of equations constructed are smaller and less complex than those of previous algebraic analyses. Factors that can affect the complexity of our attack on Trivium-like ciphers are discussed in detail.
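For readers unfamiliar with the cipher's structure, the sketch below is an unoptimized bit-level Trivium keystream generator written from the published eSTREAM specification; it is illustrative only (test-vector bit/byte ordering is not handled) and should be checked against the reference implementation.

```python
def trivium_keystream(key_bits, iv_bits, n_bits):
    """Unoptimized bit-level Trivium keystream generator, following the public
    eSTREAM specification. Indexing is 0-based, so the spec's s1..s288 become
    s[0]..s[287]. Test-vector bit/byte ordering conventions are not handled."""
    assert len(key_bits) == 80 and len(iv_bits) == 80
    # Load the key and IV, pad with zeros, and set the last three state bits to 1.
    s = list(key_bits) + [0] * 13 + list(iv_bits) + [0] * 112 + [1, 1, 1]
    out = []
    for i in range(4 * 288 + n_bits):          # 4 full state cycles of warm-up, then output
        t1 = s[65] ^ s[92]
        t2 = s[161] ^ s[176]
        t3 = s[242] ^ s[287]
        if i >= 4 * 288:
            out.append(t1 ^ t2 ^ t3)           # keystream bit z = t1 + t2 + t3
        t1 ^= (s[90] & s[91]) ^ s[170]
        t2 ^= (s[174] & s[175]) ^ s[263]
        t3 ^= (s[285] & s[286]) ^ s[68]
        # Shift the three registers, feeding t3, t1, t2 into their heads.
        s = [t3] + s[:92] + [t1] + s[93:176] + [t2] + s[177:287]
    return out

# Demonstration only: all-zero key and IV.
print("first 16 keystream bits:", trivium_keystream([0] * 80, [0] * 80, 16))
```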
Abstract:
This research makes a major contribution by enabling efficient searching and indexing of large archives of spoken audio based on speaker identity. It introduces a novel technique, dubbed “speaker attribution”, which is the task of automatically determining ‘who spoke when?’ in recordings and then linking the unique speaker identities within each recording across multiple recordings. The outcome of the research will also have a significant impact on improving the performance of automatic speech recognition systems through the extracted speaker identities.
Abstract:
Radio Frequency Identification is a wireless identification method that utilizes the reception of electromagnetic radio waves. This research has proposed a novel model to allow for an in-depth security analysis of current protocols and developed new flexible protocols that can be adapted to offer either stronger security or better efficiency.
Abstract:
Purpose: Electronic Portal Imaging Devices (EPIDs) are available with most linear accelerators (Antonuk, 2002), the current technology being amorphous silicon flat-panel imagers. EPIDs are currently used routinely for patient positioning before radiotherapy treatments. There has been increasing interest in using EPID technology for dosimetric verification of radiotherapy treatments (van Elmpt, 2008). A straightforward technique involves using the EPID panel to measure the fluence exiting the patient during a treatment, which is then compared to a prediction of the fluence based on the treatment plan. However, a number of significant limitations exist in this method, resulting in limited proliferation of the technique in the clinical environment. In this paper, we aim to present a technique for simulating IMRT fields using Monte Carlo to predict the dose in an EPID, which can then be compared to the measured dose in the EPID. Materials: Measurements were made using an iView GT flat-panel a-Si EPID mounted on an Elekta Synergy linear accelerator. The images from the EPID were acquired using the XIS software (Heimann Imaging Systems). Monte Carlo simulations were performed using the BEAMnrc and DOSXYZnrc user codes. The IMRT fields to be delivered were taken from the treatment planning system in DICOM-RT format and converted into BEAMnrc and DOSXYZnrc input files using an in-house application (Crowe, 2009). Additionally, all image processing and analysis was performed using another in-house application written in the Interactive Data Language (IDL) (ITT Visual Information Systems). Comparison between the measured and Monte Carlo EPID images was performed using a gamma analysis (Low, 1998) incorporating dose and distance-to-agreement criteria. Results: The fluence maps recorded by the EPID were found to provide good agreement between measured and simulated data. Figure 1 shows an example of measured and simulated IMRT dose images and profiles in the x and y directions. References: D. A. Low et al., "A technique for the quantitative evaluation of dose distributions", Med Phys, 25(5), May 1998. S. Crowe, T. Kairn, A. Fielding, "The development of a Monte Carlo system to verify radiotherapy treatment dose calculations", Radiotherapy & Oncology, Volume 92, Supplement 1, August 2009, Page S71.
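As an illustration of the gamma comparison cited above (Low, 1998), the sketch below computes a simplified 1D gamma index with global normalization on hypothetical measured and simulated profiles; it is not the in-house IDL analysis used in the paper.

```python
import numpy as np

def gamma_1d(dose_ref, dose_eval, spacing_mm, dose_tol=0.03, dist_tol_mm=3.0):
    """Simplified 1D gamma index (global normalization), after Low et al. (1998).

    For each reference point, search all evaluated points for the minimum of
    sqrt((dx / DTA)^2 + (dD / dose_tol)^2); gamma <= 1 means the point passes.
    """
    dose_ref = np.asarray(dose_ref, dtype=float)
    dose_eval = np.asarray(dose_eval, dtype=float)
    x = np.arange(dose_eval.size) * spacing_mm
    dmax = dose_ref.max()
    gammas = []
    for i, d_ref in enumerate(dose_ref):
        dx = (x - i * spacing_mm) / dist_tol_mm
        dd = (dose_eval - d_ref) / (dose_tol * dmax)
        gammas.append(np.sqrt(dx ** 2 + dd ** 2).min())
    return np.array(gammas)

# Hypothetical measured (EPID) and simulated (Monte Carlo) profiles across an IMRT field.
measured = np.concatenate([np.linspace(0, 1, 20), np.ones(20), np.linspace(1, 0, 20)])
simulated = np.roll(measured, 1) * 1.01   # ~1 pixel shift and 1% dose difference
g = gamma_1d(measured, simulated, spacing_mm=1.0)
print(f"gamma pass rate (3%/3mm): {np.mean(g <= 1.0) * 100:.1f}%")
```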
Abstract:
User interfaces for source code editing are a crucial component in any software development environment, and in many editors visual annotations (overlaid on the textual source code) are used to provide important contextual information to the programmer. This paper focuses on the real-time programming activity of ‘cyberphysical’ programming, and considers the type of visual annotations which may be helpful in this programming context.
Abstract:
The main theme of this thesis is allowing users of cloud services to outsource their data without needing to trust the cloud provider. The method is based on combining existing proof-of-storage schemes with distance-bounding protocols. Specifically, cloud customers will be able to verify the confidentiality, integrity, availability, fairness (or mutual non-repudiation), freshness, geographic assurance and replication of their stored data directly, without having to rely on the word of the cloud provider.
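The abstract does not describe the protocol details; as a toy illustration of the timing idea behind distance bounding for geographic assurance, the sketch below checks whether a measured round-trip time is physically consistent with a claimed maximum distance (hypothetical numbers; real protocols run many timed single-bit challenge/response rounds and verify the responses).

```python
SPEED_OF_LIGHT_KM_PER_S = 299_792.458

def distance_bound_ok(measured_rtt_s: float, max_distance_km: float = 100.0) -> bool:
    """Toy check behind distance bounding: a reply cannot travel faster than light,
    so a round-trip time above the bound implies the responder (e.g. the data centre
    actually holding the data) is farther away than max_distance_km. Real protocols
    run many timed single-bit challenge/response rounds, verify the response bits,
    and account for processing delay; none of that is modelled here."""
    max_rtt_s = 2 * max_distance_km / SPEED_OF_LIGHT_KM_PER_S
    return measured_rtt_s <= max_rtt_s

print("RTT 0.2 ms (~30 km away) :", distance_bound_ok(0.0002))   # within the claimed region
print("RTT 4 ms   (~600 km away):", distance_bound_ok(0.0040))   # geographic claim fails
```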
Abstract:
A one-time program is a hypothetical device by which a user may evaluate a circuit on exactly one input of his choice, before the device self-destructs. One-time programs cannot be achieved by software alone, as any software can be copied and re-run. However, it is known that every circuit can be compiled into a one-time program using a very basic hypothetical hardware device called a one-time memory. At first glance it may seem that quantum information, which cannot be copied, might also allow for one-time programs. But it is not hard to see that this intuition is false: one-time programs for classical or quantum circuits based solely on quantum information do not exist, even with computational assumptions. This observation raises the question, "what assumptions are required to achieve one-time programs for quantum circuits?" Our main result is that any quantum circuit can be compiled into a one-time program assuming only the same basic one-time memory devices used for classical circuits. Moreover, these quantum one-time programs achieve statistical universal composability (UC-security) against any malicious user. Our construction employs methods for computation on authenticated quantum data, and we present a new quantum authentication scheme called the trap scheme for this purpose. As a corollary, we establish UC-security of a recent protocol for delegated quantum computation.