893 results for METRICS
Abstract:
Very high speed and low area hardware architectures of the SHACAL-1 encryption algorithm are presented in this paper. The SHACAL algorithm was a submission to the New European Schemes for Signatures, Integrity and Encryption (NESSIE) project and is based on the SHA-1 hash algorithm. To date, no performance metrics have been published for hardware implementations of this algorithm. A fully pipelined SHACAL-1 encryption architecture is described in this paper; when implemented on a Virtex-II XC2V4000 FPGA device, it runs at a throughput of 17 Gbps. A fully pipelined decryption architecture achieves a speed of 13 Gbps when implemented on the same device. In addition, iterative architectures of the algorithm are presented. The SHACAL-1 decryption algorithm is derived and also presented in this paper, since it was not provided in the submission to NESSIE.
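SHACAL-1 is, in essence, the SHA-1 compression function used as a block cipher: the 160-bit plaintext takes the role of the five 32-bit chaining words, and the 512-bit key is expanded like the SHA-1 message schedule. As a minimal illustrative sketch in Python (not the paper's hardware description; names are ours), one encryption round looks like:

```python
# One SHACAL-1 (SHA-1-style) encryption round, purely illustrative.
MASK = 0xFFFFFFFF

def rotl(x, n):
    """Rotate a 32-bit word left by n bits."""
    return ((x << n) | (x >> (32 - n))) & MASK

def f(t, b, c, d):
    """SHA-1 round function: Ch, Parity, Maj, Parity over four 20-round stages."""
    if t < 20:
        return (b & c) | (~b & d)
    if t < 40:
        return b ^ c ^ d
    if t < 60:
        return (b & c) | (b & d) | (c & d)
    return b ^ c ^ d

K = [0x5A827999, 0x6ED9EBA1, 0x8F1BBCDC, 0xCA62C1D6]

def encrypt_round(t, state, w_t):
    """Apply round t to the 5-word state using round-key word w_t."""
    a, b, c, d, e = state
    tmp = (rotl(a, 5) + f(t, b, c, d) + e + K[t // 20] + w_t) & MASK
    return (tmp, a, rotl(b, 30), c, d)
```

Decryption, which the paper derives, simply inverts this update (recovering the previous words from the rotated values and solving for e), applying the 80 rounds in reverse order.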
Abstract:
Quantum-dot cellular automata (QCA) is potentially a very attractive alternative to CMOS for future digital designs. Circuit designs in QCA have been extensively studied. However, how to properly evaluate QCA circuits has not been carefully considered. To date, metrics and area-delay cost functions mapped directly from CMOS technology have been used to compare QCA designs, which is inappropriate given the differences between the two technologies. In this paper, several cost metrics specifically aimed at QCA circuits are studied. It is found that delay, the number of QCA logic gates, and the number and type of crossovers are important metrics that should be considered when comparing QCA designs. A family of new cost functions for QCA circuits is proposed. As fundamental components of QCA arithmetic, QCA adders are reviewed and evaluated with the proposed cost functions. When the new cost metrics are taken into account, the previously best adders become unattractive, and different optimization goals are shown to lead to different “best” adders.
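The abstract does not give the functional form of the proposed cost functions, so purely as an illustration of the idea, a QCA-specific cost metric combining the quantities identified above might take a form such as

\[ \mathrm{Cost} = \left( G + w_{X} X \right) \cdot D, \]

where \(G\) is the number of QCA logic gates, \(X\) a (type-weighted) count of crossovers with weight \(w_{X}\), and \(D\) the delay; the family of functions actually proposed in the paper may weight or exponentiate these terms differently.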
Abstract:
Directional modulation (DM) is an emerging technology for securing wireless communications at the physical layer. Unlike conventional key-based cryptographic methods and key-based physical layer security approaches, this promising technology locks information signals without requiring keys. The locked information can only be fully recovered by the legitimate receiver(s) known a priori to the DM transmitters. This paper reviews the origin of the DM concept and, particularly, its development in recent years, including its mathematical model, assessment metrics, synthesis approaches, and physical realizations, and finally discusses promising directions for future study.
Abstract:
Cloud data centres are critical business infrastructures and the fastest growing service providers, so detecting anomalies in their operation is vital. Given the vast complexity of the data centre system software stack, applications and workloads, anomaly detection is a challenging endeavour. Current anomaly detection tools often use machine learning techniques, application instance behaviours or system metrics distributions, which are complex to implement in Cloud computing environments as they require training, access to application-level data and complex processing. This paper presents LADT, a lightweight anomaly detection tool for Cloud data centres that uses rigorous correlation of system metrics, implemented by an efficient correlation algorithm without the need for training or complex infrastructure setup. LADT is based on the hypothesis that, in an anomaly-free system, metrics from data centre host nodes and virtual machines (VMs) are strongly correlated. An anomaly is detected whenever this correlation drops below a threshold value. We demonstrate and evaluate LADT in a Cloud environment, showing that host-node I/O operations per second (IOPS) are strongly correlated with the aggregated VM IOPS, but that this correlation vanishes when an application stresses the disk, indicating a node-level anomaly.
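A minimal sketch of the correlation test described above, assuming host and aggregated-VM IOPS arrive as aligned time series; the window size and threshold are illustrative choices, not values from the paper:

```python
import numpy as np

def detect_anomalies(host_iops, vm_iops, window=60, threshold=0.5):
    """Flag windows where host/VM IOPS correlation drops below a threshold.

    host_iops, vm_iops: 1-D arrays of per-interval samples, where vm_iops
    is the sum over all VMs on the host. Returns start indices of
    anomalous windows."""
    host = np.asarray(host_iops, dtype=float)
    vms = np.asarray(vm_iops, dtype=float)
    anomalies = []
    for start in range(0, len(host) - window + 1, window):
        r = np.corrcoef(host[start:start + window],
                        vms[start:start + window])[0, 1]  # Pearson correlation
        if np.isnan(r) or r < threshold:
            anomalies.append(start)
    return anomalies
```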
Abstract:
AIMS: To determine the incidence and predictive factors of rib fracture and chest wall pain after lung stereotactic ablative radiotherapy (SABR).
MATERIALS AND METHODS: Patients were treated with lung SABR of 48-60 Gy in four to five fractions. The treatment plan and follow-up computed tomography scans of 289 tumours in 239 patients were reviewed. Dose-volume histogram (DVH) metrics and clinical factors were evaluated as potential predictors of chest wall toxicity.
RESULTS: The median follow-up was 21.0 months (range 6.2-52.1). Seventeen per cent (50/289) of tumours developed a rib fracture, of which 44% (22/50) were symptomatic; the median time to fracture was 16.4 months. On univariate analysis, female gender, osteoporosis, tumours adjacent (within 5 mm) to the chest wall and all of the chest wall DVH metrics predicted for rib fracture, but only tumour location adjacent to the chest wall remained significant in the multivariate model (P < 0.01). The 2 year fracture-free probability for tumours adjacent to the chest wall was 65.6%. Among tumours adjacent to the chest wall, only osteoporosis (P = 0.02) predicted for fracture, whereas none of the chest wall DVH metrics were predictive. Eight per cent (24/289) experienced chest wall pain without fracture.
CONCLUSIONS: None of the chest wall DVH metrics independently predicted for SABR-induced rib fracture when tumour location is taken into account. Patients with tumours adjacent (within 5 mm) to the chest wall are at greater risk of rib fracture after lung SABR, and among these, an additional risk was observed in osteoporotic patients.
Abstract:
Power, and consequently energy, has recently attained first-class system resource status, on par with conventional metrics such as CPU time. To reduce energy consumption, many hardware- and OS-level solutions have been investigated. However, application-level information, which can provide the system with valuable insights unattainable otherwise, has been considered in only a handful of cases. We introduce OpenMPE, an extension to OpenMP designed for power management. OpenMP is the de-facto standard for programming parallel shared memory systems, but does not yet provide any support for power control. Our extension exposes (i) per-region multi-objective optimization hints and (ii) application-level adaptation parameters, in order to create energy-saving opportunities for the whole system stack. We have implemented OpenMPE support in a compiler and runtime system, and empirically evaluated its performance on two architectures (mobile and desktop). Our results demonstrate the effectiveness of OpenMPE, with geometric mean energy savings of 15% across 9 use cases while maintaining full quality of service.
Abstract:
Bridge construction responds to the need for environmentally friendly design of motorways, facilitating passage through sensitive natural areas and the bypassing of urban areas. However, according to numerous research studies, bridge construction presents substantial budget overruns. It is therefore necessary, early in the planning process, for decision makers to have reliable estimates of the final cost based on previously constructed projects. At the same time, the current European financial crisis reduces the available capital for investments, and financial institutions are even less willing to finance transportation infrastructure. Consequently, it is even more necessary today to estimate the budget of high-cost construction projects (such as road bridges) with reasonable accuracy, so that state funds are invested with lower risk and projects are designed with the highest possible efficiency. In this paper, a Bill-of-Quantities (BoQ) estimation tool for road bridges is developed to support the decisions made at the preliminary planning and design stages of highways. Specifically, a feed-forward Artificial Neural Network (ANN) with a hidden layer of 10 neurons is trained to predict the superstructure material quantities (concrete, pre-stressed steel and reinforcing steel) using the width of the deck, the adjusted length of span or cantilever and the type of the bridge as input variables. The training dataset includes actual data from 68 recently constructed concrete motorway bridges in Greece. According to the relevant metrics, the developed model captures the complex interrelations in the dataset very well and demonstrates strong generalisation capability. Furthermore, it outperforms the linear regression models developed for the same dataset. The proposed cost estimation model therefore stands as a useful and reliable tool for the construction industry, as it enables planners to reach informed decisions for the technical and economic planning of concrete bridge projects from their early implementation stages.
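As a hedged illustration of the model class described (not the authors' trained network), a feed-forward network with a single 10-neuron hidden layer mapping the three inputs to the three material quantities can be set up with scikit-learn; the placeholder rows and preprocessing are our assumptions:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Inputs: deck width (m), adjusted span/cantilever length (m), bridge type (encoded).
# Targets: concrete (m3), pre-stressed steel (kg), reinforcing steel (kg).
X = np.array([[12.0, 35.0, 1],
              [14.0, 45.0, 2]])              # placeholder rows, not the Greek dataset
y = np.array([[420.0, 9_800.0, 52_000.0],
              [610.0, 14_500.0, 78_000.0]])  # placeholder quantities

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0),
)
model.fit(X, y)
print(model.predict([[13.0, 40.0, 1]]))      # predicted material quantities
```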
Abstract:
This paper addresses the representation of landscape complexity in stated preferences research. It integrates landscape ecology and landscape economics and conducts the landscape analysis in a three-dimensional space to provide ecologically meaningful quantitative landscape indicators that are used as variables for the monetary valuation of landscape in a stated preferences study. Expected heterogeneity in taste intensity across respondents is addressed with a mixed logit model in Willingness to Pay space. The results suggest that the integration of landscape ecology metrics in a stated preferences model provides useful insights for valuing landscape and landscape changes.
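For reference, a standard willingness-to-pay (WTP) space specification of the kind the abstract refers to writes respondent \(n\)'s utility for alternative \(j\) in choice situation \(t\) as

\[ U_{njt} = \lambda_n \left( \mathbf{w}_n^{\top} \mathbf{x}_{njt} - p_{njt} \right) + \varepsilon_{njt}, \]

where \(p_{njt}\) is the cost attribute, \(\mathbf{x}_{njt}\) the vector of landscape indicators, \(\lambda_n\) a price/scale coefficient, and \(\mathbf{w}_n\) the respondent's marginal WTP for each indicator, with \(\varepsilon_{njt}\) i.i.d. extreme value; the exact parameterization used in the paper may differ.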
Abstract:
We present the Coordinated Synoptic Investigation of NGC 2264, a continuous 30 day multi-wavelength photometric monitoring campaign on more than 1000 young cluster members using 16 telescopes. The unprecedented combination of multi-wavelength, high-precision, high-cadence, and long-duration data opens a new window into the time domain behavior of young stellar objects. Here we provide an overview of the observations, focusing on results from Spitzer and CoRoT. The highlight of this work is a detailed analysis of 162 classical T Tauri stars for which we can probe optical and mid-infrared flux variations to 1% amplitudes and sub-hour timescales. We present a morphological variability census and then use metrics of periodicity, stochasticity, and symmetry to statistically separate the light curves into seven distinct classes, which we suggest represent different physical processes and geometric effects. We provide distributions of the characteristic timescales and amplitudes and assess the fractional representation within each class. The largest category (>20%) is optical "dippers" with discrete fading events lasting ~1-5 days. The degree of correlation between the optical and infrared light curves is positive but weak; notably, the independently assigned optical and infrared morphology classes tend to be different for the same object. Assessment of flux variation behavior with respect to (circum)stellar properties reveals correlations of variability parameters with Hα emission and with effective temperature. Overall, our results point to multiple origins of young star variability, including circumstellar obscuration events, hot spots on the star and/or disk, accretion bursts, and rapid structural changes in the inner disk.
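As a hedged illustration of the kind of symmetry metric used in such classifications (the paper's exact definitions may differ), a flux-asymmetry statistic can compare the mean of the extreme deciles of a light curve against its median:

```python
import numpy as np

def flux_asymmetry(flux):
    """Mean of the top and bottom 10% of flux points minus the median,
    in units of the overall scatter. Large positive or negative values
    indicate asymmetric (bursting- or dipping-like) variability; the
    sign convention here is ours."""
    flux = np.sort(np.asarray(flux, dtype=float))
    n = max(1, len(flux) // 10)
    extremes = np.concatenate([flux[:n], flux[-n:]])
    return (extremes.mean() - np.median(flux)) / flux.std()
```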
Abstract:
Software-programmable `soft' processors have shown tremendous potential for efficient realisation of high performance signal processing operations on Field Programmable Gate Arrays (FPGAs), whilst lowering the design burden by avoiding the need to design fine-grained custom circuit architectures. However, the complex data access patterns, high memory bandwidth and computational requirements of sliding window applications, such as Motion Estimation (ME) and Matrix Multiplication (MM), lead to low-performance, inefficient soft processor realisations. This paper resolves this issue, showing how, by adding support for block data addressing and accelerators for high performance loop execution, performance and resource efficiency over four times better than current best-in-class metrics can be achieved. In addition, it demonstrates the first recorded real-time soft-processor ME realisation for H.263 systems.
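To make the sliding-window access pattern concrete, the following is a generic full-search block-matching ME kernel (an illustrative Python sketch, not the paper's soft-processor realisation); every candidate offset in the search window revisits overlapping reference data, which is exactly why block data addressing pays off:

```python
import numpy as np

def full_search_me(ref, cur, bx, by, block=16, search=8):
    """Return the motion vector (dy, dx) minimising SAD for the block at (by, bx).

    ref, cur: 2-D grayscale frames as integer arrays."""
    target = cur[by:by + block, bx:bx + block].astype(np.int32)
    best_sad, best_mv = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = by + dy, bx + dx
            if y < 0 or x < 0 or y + block > ref.shape[0] or x + block > ref.shape[1]:
                continue  # candidate block falls outside the reference frame
            cand = ref[y:y + block, x:x + block].astype(np.int32)
            sad = int(np.abs(target - cand).sum())  # sum of absolute differences
            if best_sad is None or sad < best_sad:
                best_sad, best_mv = sad, (dy, dx)
    return best_mv
```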
Abstract:
This paper presents initial results of evaluating the suitability of the conventional two-tone CW passive intermodulation (PIM) test for characterizing modulated signal distortion by passive nonlinearities in base station antennas and the RF front-end. A comprehensive analysis of analog and digitally modulated waveforms in transmission lines with weak distributed nonlinearity has been performed using harmonic balance analysis and X-parameters in the Advanced Design System (ADS) simulator. The nonlinear distortion metrics used in the conventional two-tone CW PIM test have been compared with the respective spectral metrics applied to the modulated waveforms, such as adjacent channel power ratio (ACPR) and error vector magnitude (EVM). It is shown that the results of two-tone CW PIM tests are consistent with the metrics used to assess signal integrity of both analog and digitally modulated waveforms.
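Of the spectral metrics mentioned, error vector magnitude is the most compact to state; a minimal sketch with our normalisation (definitions vary between standards):

```python
import numpy as np

def evm_percent(received, reference):
    """RMS EVM in percent: RMS error vector over RMS reference power.

    received, reference: equal-length arrays of complex baseband symbols."""
    received = np.asarray(received)
    reference = np.asarray(reference)
    err_power = np.mean(np.abs(received - reference) ** 2)
    ref_power = np.mean(np.abs(reference) ** 2)
    return 100.0 * np.sqrt(err_power / ref_power)
```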
Abstract:
Cloud data centres are implemented as large-scale clusters with demanding requirements for service performance, availability and cost of operation. As a result of their scale and complexity, data centres typically exhibit large numbers of system anomalies resulting from operator error, resource over/under-provisioning, hardware or software failures and security issues. Such anomalies are inherently difficult to identify and resolve promptly via human inspection. It is therefore vital for a cloud system to have automatic system monitoring that detects potential anomalies and identifies their source. In this paper we present LADT, a lightweight anomaly detection tool for Cloud data centres which combines extended log analysis with rigorous correlation of system metrics, implemented by an efficient correlation algorithm that requires neither training nor complex infrastructure setup. The LADT algorithm is based on the premise that there is a strong correlation between node-level and VM-level metrics in a cloud system. This correlation drops significantly in the event of a performance anomaly at the node level, and a continuous drop in the correlation can indicate the presence of a true anomaly in the node. The log analysis of LADT assists in determining whether the correlation drop could be caused by naturally occurring cloud management activity such as VM migration, creation, suspension, termination or resizing. In this way, potential anomaly alerts are reasoned about to prevent false positives that could be caused by the cloud operator's activity. We demonstrate LADT with log analysis in a Cloud environment, showing how the log analysis is combined with the correlation of system metrics to achieve accurate anomaly detection.
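A hedged sketch of how correlation alerts can be gated by log analysis as described; the event names and suppression window are illustrative assumptions, not LADT's actual implementation:

```python
MANAGEMENT_EVENTS = {"vm_migrate", "vm_create", "vm_suspend",
                     "vm_terminate", "vm_resize"}

def confirmed_anomalies(corr_alerts, log_events, window=120):
    """Drop correlation alerts that coincide with cloud-management activity.

    corr_alerts: timestamps (seconds) at which correlation fell below threshold.
    log_events:  (timestamp, event_name) pairs parsed from management logs."""
    confirmed = []
    for t in corr_alerts:
        explained = any(abs(t - ts) <= window and ev in MANAGEMENT_EVENTS
                        for ts, ev in log_events)
        if not explained:
            confirmed.append(t)  # no management activity explains this drop
    return confirmed
```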
Abstract:
This paper addresses the representation of landscape complexity in stated preferences research. It integrates landscape ecology and landscape economics and conducts the landscape analysis in a three-dimensional space to provide ecologically meaningful quantitative landscape indicators that are used as variables for the monetary valuation of landscape in a stated preferences study. Expected heterogeneity in taste intensity across respondents is addressed with a mixed logit model in Willingness to Pay space. Our methodology is applied to value, in monetary terms, the landscape of the Sorrento Peninsula in Italy, an area that has faced increasing pressure from urbanization affecting its traditional horticultural, herbaceous, and arboreal structure, with loss of biodiversity and an increasing risk of landslides. We find that residents of the Sorrento Peninsula would prefer landscapes characterized by large open views and natural features. Residents also appear to dislike heterogeneous landscapes and the presence of lemon orchards and farmers' stewardship, which are associated with the current failure to protect the traditional landscape. The outcomes suggest that the use of landscape ecology metrics in a stated preferences model may be an effective way to advance integrated methodologies that better understand and represent landscape and its complexity.
Abstract:
Key generation from the randomness of wireless channels is a promising alternative to public key cryptography for the establishment of cryptographic keys between any two users. This paper reviews the current techniques for wireless key generation. The principles, performance metrics and key generation procedure are comprehensively surveyed. Methods for optimizing the performance of key generation are also discussed. Key generation applications in various environments are then introduced along with the challenges of applying the approach in each scenario. The paper concludes with some suggestions for future studies.
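As a hedged illustration of one step of the procedure surveyed (channel probing, quantization, information reconciliation, privacy amplification), a simple mean-threshold quantizer turns channel measurements into key bits; this thresholding rule is a common textbook choice, not one prescribed by the paper:

```python
import numpy as np

def quantize_measurements(samples):
    """Map channel measurements (e.g. RSSI probes) to a bit string.

    Samples above the mean become 1, below become 0. Both users apply the
    same rule to their own reciprocal measurements; residual disagreements
    are removed afterwards by information reconciliation."""
    samples = np.asarray(samples, dtype=float)
    return (samples > samples.mean()).astype(int)
```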
Abstract:
Passive intermodulation (PIM) often limits the performance of communication systems with analog and digitally modulated signals, especially systems supporting multiple carriers. Since the origins of the apparently multiple physical sources of nonlinearity causing PIM are not fully understood, behavioral models are frequently used to describe the process of PIM generation. In this paper, a polynomial model of memoryless nonlinearity is deduced from PIM measurements of a microstrip line with distributed nonlinearity using two-tone CW signals. The analytical model of nonlinearity is incorporated in Keysight Technologies' ADS simulator to evaluate the metrics of signal fidelity in the receive band for analog and digitally modulated signals. PIM-induced distortion and cross-band interference with modulated signals are compared to those with two-tone CW signals. It is shown that conventional metrics can be applied to quantify the effect of distributed nonlinearities on signal fidelity. It is found that the two-tone CW test provides a worst-case estimate of cross-band interference for two-carrier modulated signals, whereas with a three-carrier signal PIM interference in the receive band is noticeably overestimated. The simulated constellation diagrams for QPSK signals demonstrate that PIM interference exhibits the distinctive signatures of correlated distortion, which indicates that there are opportunities for mitigating PIM interference and that PIM interference cannot be treated as noise. An interesting further result is that PIM distortion on a transmission line results in asymmetrical regrowth of output PIM interference for modulated signals.
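A minimal sketch of the modelling idea: a memoryless odd-order polynomial applied to a two-tone stimulus produces third-order intermodulation products at 2f1-f2 and 2f2-f1. The coefficients below are arbitrary, not those extracted in the paper:

```python
import numpy as np

fs = 1.0e6                        # sample rate, Hz
t = np.arange(0, 0.01, 1 / fs)
f1, f2 = 100e3, 110e3             # two-tone CW stimulus
x = np.cos(2 * np.pi * f1 * t) + np.cos(2 * np.pi * f2 * t)

# Memoryless polynomial nonlinearity: the cubic term sets the PIM level.
a1, a3 = 1.0, 1e-3
y = a1 * x + a3 * x ** 3

# The cubic term creates intermodulation products at 2*f1-f2 = 90 kHz and
# 2*f2-f1 = 120 kHz, visible as distinct lines in the spectrum of y.
spectrum = np.abs(np.fft.rfft(y))
freqs = np.fft.rfftfreq(len(y), 1 / fs)
for f_im in (90e3, 120e3):
    bin_ = np.argmin(np.abs(freqs - f_im))
    print(f"{f_im / 1e3:.0f} kHz level: {spectrum[bin_]:.2f}")
```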