146 results for Software metrics
Abstract:
OBJECTIVE: To demonstrate the benefit of complexity metrics such as the modulation complexity score (MCS) and monitor units (MUs) in multi-institutional audits of volumetric-modulated arc therapy (VMAT) delivery.
METHODS: 39 VMAT treatment plans were analysed using MCS and MU. A virtual phantom planning exercise was planned and independently measured using the PTW Octavius® phantom and seven29® 2D array (PTW-Freiburg GmbH, Freiburg, Germany). MCS and MU were compared with the median gamma index pass rates (2%/2 mm and 3%/3 mm) and plan quality. The treatment planning systems (TPSs) were grouped by whether their VMAT modelling was designed specifically for the linear accelerator manufacturer's own treatment delivery system (Type 1) or was vendor-independent for VMAT delivery (Type 2). Differences in plan complexity (MCS and MU) between TPS types were compared.
RESULTS: For Varian® linear accelerators (Varian® Medical Systems, Inc., Palo Alto, CA), MCS and MU were significantly correlated with gamma pass rates. Type 2 TPSs created poorer quality, more complex plans with significantly higher MUs and lower MCS than Type 1 TPSs. Plan quality was significantly correlated with MU for Type 2 plans. A statistically significant correlation was observed between MU and MCS for all plans (R = -0.84, p < 0.01).
CONCLUSION: MU and MCS have a role in assessing plan complexity in audits along with plan quality metrics. Plan complexity metrics give some indication of plan deliverability but should be analysed with plan quality.
ADVANCES IN KNOWLEDGE: Complexity metrics were investigated for a national rotational audit involving 34 institutions and showed their value. The metrics indicated that more complex plans were created by planning systems that were vendor-independent for VMAT delivery.
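A minimal sketch of the kind of correlation analysis described above, assuming invented plan-level values for MU, MCS and gamma pass rate (none of these numbers come from the audit) and a plain Pearson correlation helper rather than the statistical software actually used:

```python
# Hypothetical illustration: correlating plan complexity metrics (MU, MCS)
# with gamma pass rates across a small cohort of plans. Data are invented.
import numpy as np

mu = np.array([420.0, 515.0, 610.0, 700.0, 830.0])     # monitor units per plan
mcs = np.array([0.42, 0.36, 0.31, 0.27, 0.21])         # modulation complexity score (lower = more modulated)
gamma_pass = np.array([99.1, 98.4, 97.2, 96.0, 94.3])  # % of points passing at 3%/3 mm

def pearson_r(x, y):
    """Pearson correlation coefficient between two 1-D arrays."""
    x = x - x.mean()
    y = y - y.mean()
    return float((x @ y) / np.sqrt((x @ x) * (y @ y)))

print("r(MU, gamma pass)  =", round(pearson_r(mu, gamma_pass), 2))
print("r(MCS, gamma pass) =", round(pearson_r(mcs, gamma_pass), 2))
print("r(MU, MCS)         =", round(pearson_r(mu, mcs), 2))
```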
Abstract:
The upcoming IEEE 802.11ac standard boosts the throughput of the previous IEEE 802.11n by adding wider 80 MHz and 160 MHz channels with up to 8 antennas (versus a 40 MHz channel and 4 antennas in 802.11n). This necessitates new 1-8 stream 256/512-point Fast Fourier Transform (FFT) / inverse FFT (IFFT) processing with 80/160 MSample/s throughput. Although there is abundant related work, none of it meets the requirements of IEEE 802.11ac FFT/IFFT on point size, throughput and multiple data streams at the same time. This paper proposes the first software-defined FFT/IFFT architecture as a solution. By making use of a customised soft stream processor on FPGA, we show how a software-defined FFT architecture can meet all the requirements of IEEE 802.11ac with low cost and high resource efficiency. When compared with the dedicated Xilinx FFT core, our implementation uses only one third of the resources while achieving up to three times the resource efficiency.
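As a point of reference for the transform sizes involved, the sketch below is a textbook radix-2 Cooley-Tukey FFT/IFFT in plain software; it illustrates the 256-point computation an 802.11ac receiver needs, not the soft stream processor architecture proposed in the paper:

```python
# Minimal recursive radix-2 Cooley-Tukey FFT/IFFT (reference sketch only).
import cmath

def fft(x):
    """Radix-2 decimation-in-time FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])
    odd = fft(x[1::2])
    twiddled = [cmath.exp(-2j * cmath.pi * k / n) * odd[k] for k in range(n // 2)]
    return [even[k] + twiddled[k] for k in range(n // 2)] + \
           [even[k] - twiddled[k] for k in range(n // 2)]

def ifft(x):
    """Inverse FFT via conjugation of the forward transform."""
    n = len(x)
    return [v.conjugate() / n for v in fft([v.conjugate() for v in x])]

# 256-point round trip, matching the smaller 802.11ac transform size.
signal = [complex(i % 4, 0) for i in range(256)]
spectrum = fft(signal)
recovered = ifft(spectrum)
assert max(abs(a - b) for a, b in zip(signal, recovered)) < 1e-9
```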
Abstract:
Software-programmable 'soft' processors have shown tremendous potential for efficient realisation of high-performance signal processing operations on Field Programmable Gate Array (FPGA), whilst lowering the design burden by avoiding the need to design fine-grained custom circuit architectures. However, the complex data access patterns, high memory bandwidth and computational requirements of sliding window applications, such as Motion Estimation (ME) and Matrix Multiplication (MM), lead to low-performance, inefficient soft processor realisations. This paper resolves this issue, showing how, by adding support for block data addressing and accelerators for high-performance loop execution, performance and resource efficiency over four times better than current best-in-class metrics can be achieved. In addition, it demonstrates the first recorded real-time soft ME realisation for H.263 systems.
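To make the sliding-window access pattern concrete, here is a naive full-search block-matching sketch (sum of absolute differences over a ±8 pixel search window); it is plain software for illustration only, not the soft processor or accelerators described above:

```python
# Naive full-search block matching with sum of absolute differences (SAD).
import numpy as np

def block_match(ref, cur, bx, by, block=16, search=8):
    """Find the motion vector (dx, dy) for the block at (bx, by) in `cur` within `ref`."""
    target = cur[by:by + block, bx:bx + block].astype(np.int32)
    best, best_sad = (0, 0), None
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = by + dy, bx + dx
            if y < 0 or x < 0 or y + block > ref.shape[0] or x + block > ref.shape[1]:
                continue
            cand = ref[y:y + block, x:x + block].astype(np.int32)
            sad = int(np.abs(cand - target).sum())
            if best_sad is None or sad < best_sad:
                best_sad, best = sad, (dx, dy)
    return best, best_sad

rng = np.random.default_rng(0)
ref = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
cur = np.roll(ref, shift=(2, 3), axis=(0, 1))  # shift by 2 rows, 3 cols: best match at (dx, dy) = (-3, -2)
print(block_match(ref, cur, bx=16, by=16))
```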
Abstract:
BACKGROUND: Smart tags attached to freely-roaming animals recording multiple parameters at infra-second rates are becoming commonplace, and are transforming our understanding of the way wild animals behave. Interpretation of such data is complex and currently limits the ability of biologists to realise the value of their recorded information.
DESCRIPTION: This work presents Framework4, an all-encompassing software suite which operates on smart sensor data to determine the 4 key elements considered pivotal for movement analysis from such tags (Endangered Species Res 4: 123-37, 2008). These are: animal trajectory, behaviour, energy expenditure and quantification of the environment in which the animal moves. The program transforms smart sensor data into dead-reckoned movements, template-matched behaviours, dynamic body acceleration-derived energetics and position-linked environmental data before outputting it all into a single file. Biologists are thus left with a single data set in which animal actions and environmental conditions can be linked across time and space.
CONCLUSIONS: Framework4 is a user-friendly software suite that assists biologists in elucidating 4 key aspects of wild animal ecology using data derived from tags with multiple sensors recording at high rates. Its use should enhance the ability of biologists to derive meaningful results rapidly from complex data.
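A rough sketch of one of the derived quantities mentioned above, dynamic body acceleration, computed from tri-axial accelerometer data; the 2 s running-mean window, 40 Hz sampling rate and synthetic signal are assumptions for illustration, not Framework4's actual processing:

```python
# Hedged sketch: ODBA/VeDBA from tri-axial accelerometer samples.
import numpy as np

def dynamic_body_acceleration(acc, fs, window_s=2.0):
    """acc: (n, 3) raw acceleration in g; fs: sampling rate in Hz."""
    win = max(1, int(window_s * fs))
    kernel = np.ones(win) / win
    # Estimate the static (gravitational) component per axis with a running mean.
    static = np.column_stack([np.convolve(acc[:, i], kernel, mode="same") for i in range(3)])
    dynamic = acc - static
    odba = np.abs(dynamic).sum(axis=1)           # sum of absolute dynamic accelerations
    vedba = np.sqrt((dynamic ** 2).sum(axis=1))  # vectorial dynamic body acceleration
    return odba, vedba

fs = 40  # Hz, an assumed tag sampling rate
t = np.arange(0, 10, 1 / fs)
acc = np.column_stack([0.2 * np.sin(2 * np.pi * 3 * t),
                       0.1 * np.cos(2 * np.pi * 3 * t),
                       1.0 + 0.3 * np.sin(2 * np.pi * 3 * t)])  # z axis carries gravity
odba, vedba = dynamic_body_acceleration(acc, fs)
print(round(float(odba.mean()), 3), round(float(vedba.mean()), 3))
```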
Abstract:
PURPOSE: To describe and evaluate a new method for measuring anterior chamber volume (ACV). DESIGN: Observational case series. METHODS: The authors measured ACV using the anterior chamber (AC) optical coherence tomographer (OCT) and applied image-processing software that they developed. Repeatability was evaluated. The ACV was measured in patient groups with normal ACs, shallow ACs, and deep ACs. The volume difference before and after laser peripheral iridotomy (LPI) was analyzed for the shallow and deep groups. RESULTS: Coefficients of repeatability for intraoperator, interoperator, and interimage measurements were 0.406%, 0.958%, and 0.851%, respectively. The limits of agreement for intraoperator and interoperator measurement were -0.911 µl to 1.343 µl and -7.875 µl to -2.463 µl, respectively. There were significant ACV differences in normal, shallow, and deep AC eyes (P < .001), and before and after LPI in shallow AC (P < .001) and deep AC (P = .008) eyes. CONCLUSIONS: The ACV values obtained by this method were repeatable and in accord with clinical observation.
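The agreement statistics quoted above follow the standard Bland-Altman pattern; the sketch below shows how limits of agreement can be computed from paired measurements, using invented ACV values with no connection to the authors' software:

```python
# Hedged sketch: Bland-Altman limits of agreement for paired ACV measurements
# (e.g. two operators measuring the same eyes). All values are invented.
import numpy as np

def limits_of_agreement(a, b):
    """Return (mean difference, lower LoA, upper LoA) for paired measurements."""
    diff = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    mean_diff = diff.mean()
    sd_diff = diff.std(ddof=1)
    return mean_diff, mean_diff - 1.96 * sd_diff, mean_diff + 1.96 * sd_diff

operator1 = [152.1, 148.7, 160.3, 141.9, 155.0]  # ACV in microlitres (hypothetical)
operator2 = [150.8, 149.5, 158.9, 143.2, 154.1]
bias, lower, upper = limits_of_agreement(operator1, operator2)
print(f"bias = {bias:.2f} µl, LoA = [{lower:.2f}, {upper:.2f}] µl")
```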
Abstract:
Network management tools must be able to monitor and analyze traffic flowing through network systems. According to the OpenFlow protocol applied in Software-Defined Networking (SDN), packets are classified into flows that are searched in flow tables. Further actions, such as packet forwarding, modification, and redirection to a group table, are determined in the flow table with respect to the search results. A novel hardware solution for SDN-enabled packet classification is presented in this paper. The proposed scheme is focused on a label-based search method, achieving high flexibility in memory usage. The implemented hardware architecture provides optimal lookup performance by configuring the search algorithm and by performing fast incremental updates as programmed by the software controller.
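For orientation, the sketch below models OpenFlow-style flow matching in plain software: each flow entry carries wildcardable match fields and a priority, and the highest-priority matching entry determines the action. It is a functional illustration of the classification problem only, not the label-based hardware scheme proposed in the paper:

```python
# Minimal software model of OpenFlow-style flow matching.
from dataclasses import dataclass
from typing import Optional

@dataclass
class FlowEntry:
    priority: int
    action: str
    src_ip: Optional[str] = None   # None means "wildcard" (match anything)
    dst_ip: Optional[str] = None
    dst_port: Optional[int] = None

    def matches(self, pkt: dict) -> bool:
        return all(v is None or pkt.get(k) == v
                   for k, v in [("src_ip", self.src_ip),
                                ("dst_ip", self.dst_ip),
                                ("dst_port", self.dst_port)])

def classify(flow_table, pkt):
    """Return the action of the highest-priority matching entry, or a table miss."""
    hits = [e for e in flow_table if e.matches(pkt)]
    return max(hits, key=lambda e: e.priority).action if hits else "send_to_controller"

table = [
    FlowEntry(priority=10, action="forward:port2", dst_ip="10.0.0.5"),
    FlowEntry(priority=20, action="redirect:group1", dst_ip="10.0.0.5", dst_port=443),
    FlowEntry(priority=1, action="drop"),
]
print(classify(table, {"src_ip": "10.0.0.9", "dst_ip": "10.0.0.5", "dst_port": 443}))
# -> redirect:group1
```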
Abstract:
Recent trends, such as Software-Defined Networking (SDN), introduce programmability to the network with the opportunity to dynamically route traffic based on flow descriptions. Packet header lookup is the first phase in this process. In this paper, we illustrate improved header lookup and flow rule update speeds over conventional lookup algorithms. This is achieved by performing individual packet header field searches and combining the search results. We propose that individual algorithms should be selected for packet classification based on the application requirements. Improving the network processing performance with our configurable solution will directly support the proposed capability of programmability in SDN.
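As a concrete illustration of combining individual field searches, the sketch below matches each header field against the rule set separately, represents each result as a set of candidate rule indices, and intersects the sets so that the highest-priority rule common to all fields wins. This mirrors the general bit-vector idea and is an assumed illustration, not the configurable architecture described above:

```python
# Per-field search combined by set intersection (bit-vector idea in miniature).
rules = [
    # (priority, src_ip, dst_ip, dst_port); None is a wildcard
    (10, None,       "10.0.0.5", None),
    (20, "10.0.0.9", "10.0.0.5", 443),
    (1,  None,       None,       None),
]

def field_candidates(field_index, value):
    """Indices of rules whose given field is a wildcard or equals the value."""
    return {i for i, rule in enumerate(rules)
            if rule[field_index] is None or rule[field_index] == value}

def classify(src_ip, dst_ip, dst_port):
    candidates = (field_candidates(1, src_ip)
                  & field_candidates(2, dst_ip)
                  & field_candidates(3, dst_port))
    return max(candidates, key=lambda i: rules[i][0]) if candidates else None

print(classify("10.0.0.9", "10.0.0.5", 443))  # -> 1 (index of the priority-20 rule)
```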
Abstract:
Cloud data centres are implemented as large-scale clusters with demanding requirements for service performance, availability and cost of operation. As a result of scale and complexity, data centres typically exhibit large numbers of system anomalies resulting from operator error, resource over/under-provisioning, hardware or software failures and security issues. These anomalies are inherently difficult to identify and resolve promptly via human inspection. Therefore, it is vital in a cloud system to have automatic system monitoring that detects potential anomalies and identifies their source. In this paper we present LADT, a lightweight anomaly detection tool for Cloud data centres which combines extended log analysis and rigorous correlation of system metrics, implemented by an efficient correlation algorithm which does not require training or a complex infrastructure set-up. The LADT algorithm is based on the premise that there is a strong correlation between node-level and VM-level metrics in a cloud system. This correlation will drop significantly in the event of any performance anomaly at the node level, and a continuous drop in the correlation can indicate the presence of a true anomaly in the node. The log analysis of LADT assists in determining whether the correlation drop could be caused by naturally occurring cloud management activity such as VM migration, creation, suspension, termination or resizing. In this way, any potential anomaly alerts are reasoned about to prevent false positives that could be caused by the cloud operator's activity. We demonstrate LADT with log analysis in a Cloud environment to show how the log analysis is combined with the correlation of system metrics to achieve accurate anomaly detection.
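A toy sketch of the core idea described above: compute a windowed correlation between a node-level metric and the corresponding VM-level metric, and flag windows where the correlation drops below a threshold. The window length, threshold and synthetic data are assumptions, and the real LADT additionally reasons over logs to discount cloud management activity:

```python
# Toy correlation-based anomaly detection: flag windows where the node-level
# metric stops tracking the VM-level metric.
import numpy as np

def correlation_alerts(node_metric, vm_metric, window=30, threshold=0.5):
    """Return (start_index, correlation) for windows whose Pearson r falls below threshold."""
    alerts = []
    for start in range(0, len(node_metric) - window + 1, window):
        n = node_metric[start:start + window]
        v = vm_metric[start:start + window]
        r = np.corrcoef(n, v)[0, 1]
        if r < threshold:
            alerts.append((start, float(r)))
    return alerts

rng = np.random.default_rng(1)
vm_cpu = 40 + 10 * np.sin(np.linspace(0, 12, 300)) + rng.normal(0, 1, 300)
node_cpu = vm_cpu + 5 + rng.normal(0, 1, 300)    # normally tracks the VM load
node_cpu[200:260] = 95 + rng.normal(0, 1, 60)    # simulated node-level anomaly
print(correlation_alerts(node_cpu, vm_cpu))
```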
Abstract:
PURPOSE: To provide a tool to enable gamma analysis software algorithms to be included in a quality assurance (QA) program.
METHODS: Four image sets were created, comprising two geometric images (to independently test the distance to agreement (DTA) and dose difference (DD) elements of the gamma algorithm), a clinical step-and-shoot IMRT field and a clinical VMAT arc. The images were analysed using global and local gamma analysis with 2 in-house and 8 commercially available software packages, encompassing 15 software versions. The effect of image resolution on gamma pass rates was also investigated.
RESULTS: All but one software package accurately calculated the gamma passing rate for the geometric images. Variation in global gamma passing rates of 1% at 3%/3 mm and over 2% at 1%/1 mm was measured between software packages and software versions with analysis of appropriately sampled images.
CONCLUSION: This study provides a suite of test images and the gamma pass rates achieved for a selection of commercially available software. This image suite will enable validation of gamma analysis software within a QA program and provide a frame of reference by which to compare results reported in the literature from various manufacturers and software versions.
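For reference, the sketch below implements a brute-force 1-D global gamma calculation from its definition (dose difference normalised to a percentage criterion, distance to agreement normalised to a millimetre criterion); it is a simplified illustration of the metric, not one of the software packages evaluated in the study:

```python
# Brute-force 1-D global gamma index: for each evaluated point, search all
# reference points and take the minimum combined dose/distance penalty.
import numpy as np

def gamma_1d(ref_pos, ref_dose, eval_pos, eval_dose, dd_percent=3.0, dta_mm=3.0):
    """Return the gamma value at each evaluated point (global normalisation)."""
    dd_abs = dd_percent / 100.0 * ref_dose.max()  # global dose-difference criterion
    gammas = np.empty_like(eval_dose, dtype=float)
    for i, (x_e, d_e) in enumerate(zip(eval_pos, eval_dose)):
        dist_term = ((ref_pos - x_e) / dta_mm) ** 2
        dose_term = ((ref_dose - d_e) / dd_abs) ** 2
        gammas[i] = np.sqrt(np.min(dist_term + dose_term))
    return gammas

x = np.linspace(-50, 50, 201)                       # positions in mm
reference = 100 * np.exp(-(x / 20.0) ** 2)          # idealised dose profile
measured = 100 * np.exp(-((x - 1.0) / 20.0) ** 2)   # same profile shifted by 1 mm
gamma = gamma_1d(x, reference, x, measured)
print(f"pass rate at 3%/3 mm: {100.0 * np.mean(gamma <= 1.0):.1f}%")
```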
Abstract:
Virtual Reality techniques are relatively new, having experienced significant development only during the last few years, in line with progress in computer science and in hardware and software technologies. The study of such advanced design systems has led to the realization of an immersive environment in which new procedures for the evaluation of product prototypes, ergonomics and manufacturing operations have been simulated. Applying this environment to robotics, ergonomics, plant simulation and maintainability verification has allowed us to highlight the advantages offered by such a design methodology: the possibility of working on the industrial product in the first phase of conception, of placing the designer in front of a realistic virtual reproduction of the product, and of interacting with that concept. The aim of this book is to present an updated vision of VM through its different aspects. We describe the trends and results achieved in the automotive, aerospace and railway fields, in terms of the Digital Product Creation Process used to design the product and the manufacturing process.