935 results for Lightweight Ships
Abstract:
Cloud data centres are critical business infrastructures and the fastest growing service providers. Detecting anomalies in Cloud data centre operation is vital. Given the vast complexity of the data centre system software stack, applications and workloads, anomaly detection is a challenging endeavour. Current tools for detecting anomalies often use machine learning techniques, application instance behaviours or system metric distributions, which are complex to implement in Cloud computing environments as they require training, access to application-level data and complex processing. This paper presents LADT, a lightweight anomaly detection tool for Cloud data centres that uses rigorous correlation of system metrics, implemented by an efficient correlation algorithm without the need for training or complex infrastructure set-up. LADT is based on the hypothesis that, in an anomaly-free system, metrics from data centre host nodes and virtual machines (VMs) are strongly correlated. An anomaly is detected whenever the correlation drops below a threshold value. We demonstrate and evaluate LADT in a Cloud environment, showing that the hosting node's I/O operations per second (IOPS) are strongly correlated with the aggregated virtual machine IOPS, but that this correlation vanishes when an application stresses the disk, indicating a node-level anomaly.
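The correlation check described in this abstract can be sketched roughly as follows. This is an illustrative sketch only: the function names, the sampling window and the 0.8 threshold are assumptions, not details taken from the paper.

```python
# Illustrative sketch of a LADT-style correlation check: compare host-level
# IOPS against the aggregate of per-VM IOPS and flag an anomaly when the
# Pearson correlation falls below a threshold (threshold value assumed).
import math

def pearson(xs, ys):
    # Pearson correlation coefficient of two equal-length sample series.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def detect_anomaly(host_iops, vm_iops_per_vm, threshold=0.8):
    # Aggregate per-VM IOPS at each sample point, then test how well the
    # host-level series tracks the aggregate.
    agg = [sum(sample) for sample in zip(*vm_iops_per_vm)]
    return pearson(host_iops, agg) < threshold

# Healthy window: host IOPS tracks the VM aggregate, so no anomaly.
host = [100, 120, 90, 150, 130]
vms = [[60, 70, 50, 90, 80], [40, 50, 40, 60, 50]]
print(detect_anomaly(host, vms))  # -> False
```

A node-level disk stressor would add I/O to the host series that no VM accounts for, pushing the correlation below the threshold.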
Abstract:
Cloud data centres are implemented as large-scale clusters with demanding requirements for service performance, availability and cost of operation. As a result of their scale and complexity, data centres typically exhibit large numbers of system anomalies resulting from operator error, resource over/under-provisioning, hardware or software failures and security issues. These anomalies are inherently difficult to identify and resolve promptly via human inspection. It is therefore vital for a cloud system to have automatic monitoring that detects potential anomalies and identifies their source. In this paper we present LADT, a lightweight anomaly detection tool for Cloud data centres which combines extended log analysis with rigorous correlation of system metrics, implemented by an efficient correlation algorithm that requires neither training nor complex infrastructure set-up. The LADT algorithm is based on the premise that there is a strong correlation between node-level and VM-level metrics in a cloud system. This correlation drops significantly in the event of any performance anomaly at the node level, and a continuous drop in the correlation can indicate the presence of a true anomaly in the node. The log analysis in LADT helps determine whether a correlation drop could be caused by naturally occurring cloud management activity such as VM migration, creation, suspension, termination or resizing. In this way, potential anomaly alerts are reasoned about to prevent false positives that could be caused by the cloud operator's activity. We demonstrate LADT with log analysis in a Cloud environment to show how the log analysis is combined with the correlation of system metrics to achieve accurate anomaly detection.
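The log-analysis filter this abstract adds on top of the correlation test could look roughly like the sketch below. The event keywords, log format and threshold are illustrative assumptions, not the paper's actual implementation.

```python
# Sketch of a log-based false-positive filter: suppress an anomaly alarm
# when a correlation drop coincides with VM lifecycle activity recorded
# in the management logs (event names and log format assumed).
MANAGEMENT_EVENTS = {"migrate", "create", "suspend", "terminate", "resize"}

def is_management_activity(log_lines):
    # True if any recent log line records a VM lifecycle operation.
    return any(any(ev in line.lower() for ev in MANAGEMENT_EVENTS)
               for line in log_lines)

def raise_alarm(correlation, recent_logs, threshold=0.8):
    # Alarm only on a correlation drop NOT explained by operator activity.
    if correlation >= threshold:
        return False
    return not is_management_activity(recent_logs)

print(raise_alarm(0.45, ["vm42: live-migrate started"]))  # -> False (explained)
print(raise_alarm(0.45, ["vm42: heartbeat ok"]))          # -> True  (true anomaly)
```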
Abstract:
Existing benchmarking methods are time-consuming processes, as they typically benchmark the entire Virtual Machine (VM) in order to generate accurate performance data, making them less suitable for real-time analytics. The research in this paper aims to surmount this challenge by presenting DocLite, a Docker Container-based Lightweight benchmarking tool. DocLite explores lightweight cloud benchmarking methods for rapidly executing benchmarks in near real-time. DocLite is built on Docker container technology, which allows a user-defined memory size and number of CPU cores of the VM to be benchmarked. The tool incorporates two benchmarking methods: the first, referred to as the native method, employs containers to benchmark a small portion of the VM and generate performance ranks; the second uses historic benchmark data along with the native method as a hybrid to generate VM ranks. The proposed methods are evaluated on three use-cases and are observed to be up to 91 times faster than benchmarking the entire VM. In both methods, small containers provide the same quality of rankings as a large container. The native method generates ranks with over 90% and 86% accuracy for sequential and parallel execution of an application, respectively, compared against benchmarking the whole VM. The hybrid method did not significantly improve the quality of the rankings.
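The two ranking methods described in this abstract can be sketched as follows. Everything here is hypothetical: the VM names, scores and the equal-weight blend are placeholders, not DocLite's actual scheme.

```python
# Hypothetical sketch of DocLite's two ranking methods: the "native" method
# ranks VMs by a freshly measured container benchmark score, and the
# "hybrid" method blends that score with historic data (weighting assumed).
def rank(scores):
    # Rank VM names by score, best (highest) first.
    return sorted(scores, key=scores.get, reverse=True)

def hybrid_scores(native, historic, w=0.5):
    # Weighted blend of fresh container measurements and historic data.
    return {vm: w * native[vm] + (1 - w) * historic[vm] for vm in native}

native = {"vm_a": 8.2, "vm_b": 6.5, "vm_c": 7.9}    # container benchmark
historic = {"vm_a": 7.0, "vm_b": 9.0, "vm_c": 7.5}  # past full-VM runs

print(rank(native))                              # -> ['vm_a', 'vm_c', 'vm_b']
print(rank(hybrid_scores(native, historic)))     # -> ['vm_b', 'vm_c', 'vm_a']
```

The container itself would be launched with capped resources (e.g. a limited memory size and CPU-core count, as the abstract notes) so that only a small portion of the VM is exercised.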
Abstract:
This study examined the potential of bauxite residue (red mud) to replace pulverised fuel ash (PFA), commonly used as a way of recycling problematic wastes, in the manufacture of lightweight aggregate. Red mud replaced PFA at 25, 31, 38, 44 and 50%, blended in a mix with waste excavated clay and sewage sludge, all from the Chongqing municipality in China. Lightweight pellets were produced using a Trefoil rotary kiln and sintered at 1200 °C. Results showed that 44% bauxite residue replacement produced lightweight pellets with the highest compressive strength, highest density and largest water-holding capacity. This would be expected in materials with a low level of silicates, which causes insufficient glass-phase viscosity and therefore poor bloating during firing, producing an aggregate with a higher density but with open pores that allow larger water absorption. All ratios of red mud aggregates were significantly reduced in pH after firing, to around pH 8, and this reduced the leachability of the aggregates to levels below those set by the European landfill directive (2003/33/EC).
Abstract:
This paper presents a novel approach to WLAN propagation models for use in indoor localization. The major goal of this work is to eliminate the need for in situ data collection to generate the fingerprinting map; instead, the map is generated using analytical propagation models such as COST Multi-Wall, COST 231 average wall and Motley-Keenan. The location estimation algorithms kNN (K-Nearest Neighbour) and WkNN (Weighted K-Nearest Neighbour) were used to determine the accuracy of the proposed technique. This work relies on analytical and measurement tools to determine which path-loss propagation models are better suited to location estimation applications based on the Received Signal Strength Indicator (RSSI). The study presents different proposals for choosing the most appropriate values for the models' parameters, such as obstacle attenuations and coefficients. Some adjustments to these models, particularly to Motley-Keenan, considering the thickness of walls, are proposed. The best solution found is based on the adjusted Motley-Keenan and COST models, which allows the propagation loss to be estimated for several environments. Results obtained from two testing scenarios showed the reliability of the adjustments, providing smaller errors between measured and predicted values.
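A basic Motley-Keenan-style path-loss computation, of the kind used to generate such a fingerprinting map without site surveys, can be sketched as below. The reference loss, path-loss exponent and per-wall attenuations are placeholder values, not the parameters the paper tunes.

```python
# Illustrative Motley-Keenan-style path-loss sketch: a log-distance term
# plus a per-wall attenuation sum (all parameter values assumed).
import math

def motley_keenan(d_m, walls, pl0_db=40.0, n=2.0):
    # d_m: transmitter-receiver distance in metres.
    # walls: list of (count, attenuation_dB) per wall type crossed.
    # pl0_db: reference loss at 1 m; n: path-loss exponent.
    path = pl0_db + 10 * n * math.log10(d_m)
    return path + sum(count * att for count, att in walls)

# 10 m link crossing two brick walls (5 dB each) and one door (2 dB).
loss = motley_keenan(10.0, [(2, 5.0), (1, 2.0)])
print(round(loss, 1))  # -> 72.0
```

The paper's adjustment would then refine the per-wall terms, e.g. scaling attenuation with wall thickness, before the predicted RSSI values are fed to kNN/WkNN.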
Abstract:
The purpose of this research project was to explore how women lightweight rowers in Ontario negotiate their gender and body identity. Through a feminist post-structural lens I investigated both 'acceptable' and contradictory gender and sport performances that exist in the culture of rowing in order to understand how identity is constructed at the intersection of these discourses. My goal was to learn how human experiences are shaped by discourses of power and the resulting constructions of acceptable gender attributes. Seven university-aged lightweight women rowers were interviewed, and the following themes were uncovered: the women constantly engage in acts of bodily control; body image is often affected by participation in the sport; instances of femininity exist within the culture of lightweight rowing; inequalities are present within the culture, as are excuse-making practices; and the potential for resistance is extremely complicated.
Abstract:
In order to minimize the risk of failures or major renewals of hull structures during a ship's expected life span, it is imperative that precautions be taken to ensure an adequate margin of safety against any one, or combination, of failure modes including excessive yielding, buckling, brittle fracture, fatigue and corrosion. The most efficient system for combating underwater corrosion is cathodic protection. The basic principle of this method is that the ship's structure is made cathodic, i.e. the anodic (corrosion) reactions are suppressed by the application of an opposing current, and the ship is thereby protected. This paper deals with the state of the art in cathodic protection and its programming in ship structures.
Abstract:
The purpose of resource management is the efficient and effective use of network resources, for instance bandwidth. In this article, a connection-oriented network scenario is considered, where a certain amount of bandwidth is reserved for each label switched path (LSP), which is a logical path, in an MPLS or GMPLS environment. Assuming there is also some form of admission control (explicit or implicit), these environments typically provide quality of service (QoS) guarantees. It can happen that some LSPs become busy, thus rejecting connections, while other LSPs are under-utilised. We propose a distributed lightweight monitoring technique, based on threshold values, whose objective is to detect congestion when it occurs in an LSP and activate the corresponding alarm, which triggers a dynamic bandwidth reallocation mechanism.
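The threshold-based monitor this abstract proposes can be sketched roughly as follows. The 90% high-water threshold and the data layout are illustrative assumptions, not the article's values.

```python
# Sketch of a threshold-based LSP congestion monitor: an LSP whose
# utilisation of its reserved bandwidth exceeds a high-water threshold
# raises the alarm that triggers bandwidth reallocation (threshold assumed).
def is_congested(reserved_bw, used_bw, high=0.9):
    # Signal congestion when utilisation crosses the threshold.
    return used_bw / reserved_bw >= high

def scan(lsps, high=0.9):
    # Return the LSPs whose alarm should trigger reallocation.
    return [name for name, (res, used) in lsps.items()
            if is_congested(res, used, high)]

lsps = {"lsp1": (100.0, 95.0),   # busy: likely rejecting connections
        "lsp2": (100.0, 30.0)}   # under-utilised: a reallocation donor
print(scan(lsps))  # -> ['lsp1']
```

In a distributed deployment, each node would run this check locally on its own LSPs and only emit an alarm message on a threshold crossing, keeping monitoring overhead low.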
Abstract:
Despite the importance of microphysical cloud processes to the climate system, some topics remain under-explored. For example, few measurements of droplet charges in non-thunderstorm clouds exist. Balloon-carried charge sensors can be used to provide new measurements. A charge sensor is described for use with meteorological balloons, which has been tested over a range of atmospheric temperatures from −60 to 20 °C, in cloudy and clear air. The rapid time response of the sensor (to >10 V s⁻¹) permits charge densities from 100 fC m⁻³ to 1 nC m⁻³ to be determined, which is sufficient for it to act as a cloud-edge charge detector at weakly charged horizontal cloud boundaries.
Abstract:
The relationship between rock art and the material qualities of the rock surface on which it is executed is investigated. The case study of Revheim, Rogaland, Southwest Norway, is the starting-point for a discussion on the way in which the contours of the rock, quartz outcrops and the flow of water across the rock surface affect the placement of images on the rock. It is argued that a fuller examination of the interrelationship between rock and rock art provides a more coherent interpretation of rock art images.
The mountain of ships. The organisation of the Bronze Age cemetery at Snäckedal, Misterhult, Småland