357 results for Proximal Point Algorithm
Abstract:
Fusing data from multiple sensing modalities, e.g. laser and radar, is a promising approach to achieve resilient perception in challenging environmental conditions. However, this may lead to "catastrophic fusion" in the presence of inconsistent data, i.e. when the sensors do not detect the same target due to distinct attenuation properties. It is often difficult to discriminate consistent from inconsistent data across sensing modalities using local spatial information alone. In this paper we present a novel consistency test based on the log marginal likelihood of a Gaussian process model that evaluates data from range sensors in a relative manner. A new data point is deemed to be consistent if the model statistically improves as a result of its fusion. This approach avoids the need for absolute spatial distance threshold parameters as required by previous work. We report results from object reconstruction with both synthetic and experimental data that demonstrate an improvement in reconstruction quality, particularly in cases where data points are inconsistent yet spatially proximal.
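A minimal sketch of this relative consistency idea, assuming a scikit-learn Gaussian process and a per-point log marginal likelihood criterion (the helper names and acceptance rule are illustrative, not the paper's implementation):

```python
# Sketch of the relative consistency idea (illustrative helper names, not the
# paper's code): fit a GP to the already-fused range data, refit with the
# candidate point included, and accept the point only if the per-point log
# marginal likelihood does not degrade.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def per_point_lml(X, y):
    """Log marginal likelihood of a fitted GP, normalised by the number of points."""
    gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
    gp.fit(X, y)
    return gp.log_marginal_likelihood_value_ / len(y)

def is_consistent(X, y, x_new, y_new, tol=0.0):
    """Accept (x_new, y_new) only if fusing it does not worsen the model."""
    fused_X = np.vstack([X, x_new])
    fused_y = np.append(y, y_new)
    return per_point_lml(fused_X, fused_y) + tol >= per_point_lml(X, y)

# Toy example: laser returns from a smooth surface plus one candidate radar return.
rng = np.random.default_rng(0)
X = np.linspace(0.0, 1.0, 20).reshape(-1, 1)
y = np.sin(2.0 * np.pi * X).ravel() + 0.01 * rng.standard_normal(20)
print(is_consistent(X, y, np.array([[0.5]]), 0.0))  # consistent: lies near the surface
print(is_consistent(X, y, np.array([[0.5]]), 5.0))  # inconsistent: far from the surface
```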
Abstract:
The increase in data center dependent services has made energy optimization of data centers one of the most exigent challenges in today's Information Age. Green and energy-efficient measures are essential for reducing carbon footprints and exorbitant energy costs. However, inefficient application management of data centers results in high energy consumption and low resource utilization efficiency. Unfortunately, in most cases, deploying an energy-efficient application management solution inevitably degrades the resource utilization efficiency of the data centers. To address this problem, a Penalty-based Genetic Algorithm (GA) is presented in this paper to solve a defined profile-based application assignment problem whilst maintaining a trade-off between power consumption performance and resource utilization performance. Case studies show that the penalty-based GA is highly scalable and provides 16% to 32% better solutions than a greedy algorithm.
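A hypothetical sketch of a penalty-based fitness of the kind such a GA could use, with illustrative parameter names and a simple linear power model (not the paper's formulation); over-committed servers are penalised rather than discarded, so the search can trade power against utilisation near the feasibility boundary:

```python
# Hypothetical penalty-based fitness for assigning application profiles to
# servers (illustrative, not the paper's formulation). idle_power, peak_power
# and cpu_capacity are per-server arrays; cpu_demand is per application.
import numpy as np

def fitness(assignment, cpu_demand, cpu_capacity, idle_power, peak_power,
            penalty_weight=10.0):
    """assignment[i] = index of the server hosting application i (lower is better)."""
    load = np.zeros(len(cpu_capacity))
    for app, srv in enumerate(assignment):
        load[srv] += cpu_demand[app]
    active = load > 0
    util = np.divide(load, cpu_capacity, out=np.zeros_like(load), where=active)
    # Simple linear power model over the servers that host at least one application.
    power = np.sum(idle_power[active]
                   + (peak_power[active] - idle_power[active]) * util[active])
    # The penalty grows with the amount of over-commitment instead of rejecting it outright.
    overload = np.maximum(load - cpu_capacity, 0.0).sum()
    return power + penalty_weight * overload
```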
Abstract:
From an economic perspective, the sustainability crisis is ultimately characterized by a worsening relationship between the resources required to support the global population and the ability of the earth to supply them. Despite the ever-increasing threat of a calamity, modern society appears unable to alter its course. The very systems which underpin global human endeavor seem to actively prevent meaningful change and the one irrepressible goal to which all societies seem to strive is the very thing that makes such endeavor ultimately life threatening: that of global growth. Using the Australian experience as an exemplar, this paper explores how the concept of growth infiltrates societal reactions to the crisis at various scales – global, national and regional. Analysis includes historic studies, a critique of current misconceptions around population demographics, comparative evaluation of various interventions in the Australian context and considerations around potential ways to address the crisis.
Abstract:
In the past few years, the virtual machine (VM) placement problem has been studied intensively and many algorithms for it have been proposed. However, those proposed VM placement algorithms have not been widely used in today's cloud data centers as they do not consider the cost of migrating from the current VM placement to the new optimal VM placement. As a result, the gain from optimizing VM placement may be outweighed by the cost of migrating from the current placement to the new one. To address this issue, this paper presents a penalty-based genetic algorithm (GA) for the VM placement problem that considers the migration cost in addition to the energy consumption of the new VM placement and its total inter-VM traffic flow. The GA has been implemented and evaluated by experiments, and the experimental results show that the GA outperforms two well-known algorithms for the VM placement problem.
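An assumed, simplified form of such a migration-aware objective (the weights and callables below are illustrative, not the paper's model): a candidate placement is only preferred if its energy and traffic savings outweigh the cost of migrating away from the current placement.

```python
# Toy migration-aware placement objective (assumed form, not the paper's model).
# energy, traffic and migration_cost are caller-supplied functions; lower is better.
def placement_cost(new_placement, current_placement, energy, traffic,
                   migration_cost, w_energy=1.0, w_traffic=1.0, w_migrate=1.0):
    # VMs whose host changes incur a migration cost.
    moved = [vm for vm, host in enumerate(new_placement)
             if host != current_placement[vm]]
    return (w_energy * energy(new_placement)
            + w_traffic * traffic(new_placement)
            + w_migrate * sum(migration_cost(vm) for vm in moved))
```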
Abstract:
Although live VM migration has been intensively studied, the problem of live migration of multiple interdependent VMs has hardly been investigated. The most important problem in the live migration of multiple interdependent VMs is how to schedule VM migrations as the schedule will directly affect the total migration time and the total downtime of those VMs. Aiming at minimizing both the total migration time and the total downtime simultaneously, this paper presents a Strength Pareto Evolutionary Algorithm 2 (SPEA2) for the multi-VM migration scheduling problem. The SPEA2 has been evaluated by experiments, and the experimental results show that the SPEA2 can generate a set of VM migration schedules with a shorter total migration time and a shorter total downtime than an existing genetic algorithm, namely Random Key Genetic Algorithm (RKGA). This paper also studies the scalability of the SPEA2.
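A brief sketch of the bi-objective view, assuming each schedule is scored by the tuple (total migration time, total downtime); the Pareto-dominance check below is generic, not the paper's SPEA2 implementation:

```python
# Generic Pareto check over the two objectives (total migration time, total
# downtime); SPEA2 additionally uses fitness assignment and archive truncation,
# which are not reproduced here.
def dominates(a, b):
    """True if objective tuple a is no worse than b everywhere and strictly better somewhere."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(schedules, score):
    """Return the schedules whose (migration time, downtime) scores are non-dominated."""
    scored = [(s, score(s)) for s in schedules]
    return [s for s, sa in scored
            if not any(dominates(sb, sa) for _, sb in scored)]
```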
Abstract:
Projective Hjelmslev planes and affine Hjelmslev planes are generalisations of projective planes and affine planes. We present an algorithm for constructing projective Hjelmslev planes and affine Hjelmslev planes that uses projective planes, affine planes and orthogonal arrays. We show that all 2-uniform projective Hjelmslev planes, and all 2-uniform affine Hjelmslev planes, can be constructed in this way. As a corollary it is shown that all 2-uniform affine Hjelmslev planes are sub-geometries of 2-uniform projective Hjelmslev planes.
Abstract:
This paper investigates communication protocols for relaying sensor data from animal tracking applications back to base stations. While Delay Tolerant Networks (DTNs) are well suited to such challenging environments, most existing protocols do not consider the available energy, which is particularly important when tracking devices can harvest energy. This limits both the network lifetime and the delivery probability in energy-constrained applications, to the point where routing performance becomes worse than using no routing at all. Our work shows that substantial improvements in data yield can be achieved through simple yet efficient energy-aware strategies. Conceptually, there is a need to balance the energy spent on sensing, data mulling, and direct delivery of packets to the destination. We use empirical traces collected in a flying fox (fruit bat) tracking project and show that simple threshold-based energy-aware strategies yield up to 20% higher delivery rates. Furthermore, these results generalize well across a wide range of operating conditions.
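A toy illustration of a threshold-based energy-aware strategy of the kind evaluated here (threshold values and action names are assumptions, not taken from the paper):

```python
# Hypothetical threshold rule: a node spends energy on opportunistic data
# mulling only when its harvested energy budget is above a threshold, and
# otherwise reserves energy for sensing and for delivering its own packets
# directly to a base station.
def should_forward(battery_level, forwarding_threshold=0.5):
    """Forward other nodes' data only when energy is plentiful."""
    return battery_level >= forwarding_threshold

def next_action(battery_level, in_contact_with_base, has_own_data):
    if in_contact_with_base and has_own_data:
        return "deliver_direct"       # always prioritise own data at the base
    if should_forward(battery_level):
        return "mule_neighbour_data"  # spend surplus energy on routing
    return "sense_only"               # conserve energy for sensing
```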
Abstract:
PURPOSE: To compare pressures generated by 2 different cement pressurisers at various locations in the proximal femur. METHODS: Two groups of 5 synthetic femurs were used, and 6 pressure sensors were placed in the femur at 20-mm intervals proximally to distally. Cement was filled into the femoral canal retrogradely using a cement gun with either the half-moon pressuriser or the femoral canal pressuriser. Maximum pressures and pressure time integrals (cumulative pressure over time) of the 2 pressurisers were compared. RESULTS: At all sensors, the half-moon pressuriser produced higher maximum pressures and pressure time integrals than the femoral canal pressuriser, but the difference was significant only at sensor 1 (proximal femur). This may result in reduced cement interdigitation in the proximal femur. CONCLUSION: The half-moon pressuriser produced higher maximum cementation pressures and pressure time integrals than the femoral canal pressuriser in the proximal femur region, which is critical for rotational stability of the implant and prevention of implant fracture. KEYWORDS: arthroplasty, replacement, hip; bone cements; femur
Abstract:
Interstitial fibrosis, a histological process common to many kidney diseases, is the precursor state to end stage kidney disease, a devastating and costly outcome for the patient and the health system. Fibrosis is historically associated with chronic kidney disease (CKD) but emerging evidence is now linking many forms of acute kidney disease (AKD) with the development of CKD. Indeed, we and others have observed at least some degree of fibrosis in up to 50% of clinically defined cases of AKD. Epithelial cells of the proximal tubule (PTEC) are central in the development of kidney interstitial fibrosis. We combine the novel techniques of laser capture microdissection and multiplex-tandem PCR to identify and quantitate “real time” gene transcription profiles of purified PTEC isolated from human kidney biopsies that describe signaling pathways associated with this pathological fibrotic process. Our results: (i) confirm previous in-vitro and animal model studies; kidney injury molecule-1 is up-regulated in patients with acute tubular injury, inflammation, neutrophil infiltration and a range of chronic disease diagnoses, (ii) provide data to inform treatment; complement component 3 expression correlates with inflammation and acute tubular injury, (iii) identify potential new biomarkers; proline 4-hydroxylase transcription is down-regulated and vimentin is up-regulated across kidney diseases, (iv) describe previously unrecognized feedback mechanisms within PTEC; Smad-3 is down-regulated in many kidney diseases suggesting a possible negative feedback loop for TGF-β in the disease state, whilst tight junction protein-1 is up-regulated in many kidney diseases, suggesting feedback interactions with vimentin expression. These data demonstrate that the combined techniques of laser capture microdissection and multiplex-tandem PCR have the power to study molecular signaling within single cell populations derived from clinically sourced tissue.
Abstract:
Animal models of critical illness are vital in biomedical research. They provide possibilities for the investigation of pathophysiological processes that may not otherwise be possible in humans. In order to be clinically applicable, the model should simulate the critical care situation realistically, including anaesthesia, monitoring, sampling, utilising appropriate personnel skill mix, and therapeutic interventions. There are limited data documenting the constitution of ideal technologically advanced large animal critical care practices and all the processes of the animal model. In this paper, we describe the procedure of animal preparation, anaesthesia induction and maintenance, physiologic monitoring, data capture, point-of-care technology, and animal aftercare that has been successfully used to study several novel ovine models of critical illness. The relevant investigations are on respiratory failure due to smoke inhalation, transfusion related acute lung injury, endotoxin-induced proteogenomic alterations, haemorrhagic shock, septic shock, brain death, cerebral microcirculation, and artificial heart studies. We have demonstrated the functionality of monitoring practices during anaesthesia required to provide a platform for undertaking systematic investigations in complex ovine models of critical illness.
Abstract:
Reconstructing 3D motion data is highly under-constrained due to several common sources of data loss during measurement, such as projection, occlusion, or miscorrespondence. We present a statistical model of 3D motion data, based on the Kronecker structure of the spatiotemporal covariance of natural motion, as a prior on 3D motion. This prior is expressed as a matrix normal distribution, composed of separable and compact row and column covariances. We relate the marginals of the distribution to the shape, trajectory, and shape-trajectory models of prior art. When the marginal shape distribution is not available from training data, we show how placing a hierarchical prior over shapes results in a convex MAP solution in terms of the trace-norm. The matrix normal distribution, fit to a single sequence, outperforms state-of-the-art methods at reconstructing 3D motion data in the presence of significant data loss, while providing covariance estimates of the imputed points.
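A small sketch of the separable prior using SciPy's matrix normal distribution; the dimensions, covariances, and mean below are placeholders, not the learned model:

```python
# A matrix normal prior on a T x D motion matrix is equivalent to a Gaussian
# on its vectorisation whose covariance is the Kronecker product of the
# column (shape) covariance and the row (trajectory) covariance.
import numpy as np
from scipy.stats import matrix_normal

T, D = 8, 6                    # frames x flattened 3D point coordinates (placeholder sizes)
row_cov = np.eye(T) * 0.5      # trajectory (temporal) covariance
col_cov = np.eye(D) * 2.0      # shape (spatial) covariance
M = np.zeros((T, D))           # mean motion

X = matrix_normal.rvs(mean=M, rowcov=row_cov, colcov=col_cov, random_state=0)
logp = matrix_normal.logpdf(X, mean=M, rowcov=row_cov, colcov=col_cov)
print(logp)                    # log density of the sampled motion under the prior
```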
Abstract:
Rolling-element bearing failures are the most frequent problems in rotating machinery; they can be catastrophic and cause major downtime. Hence, providing advance failure warning and precise fault detection in such components is pivotal and cost-effective. The vast majority of past research has focused on signal processing and spectral analysis for fault diagnostics in rotating components. In this study, a data mining approach using a machine learning technique called anomaly detection (AD) is presented. This method employs classification techniques to discriminate between defect examples. Two features, kurtosis and Non-Gaussianity Score (NGS), are extracted to develop anomaly detection algorithms. The performance of the developed algorithms was examined on real data from a bearing run-to-failure test. Finally, the anomaly detection approach is compared with a popular method, the Support Vector Machine (SVM), to investigate its sensitivity, accuracy, and ability to detect anomalies at an early stage.
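A minimal sketch of the kurtosis feature extraction and a simple deviation-based anomaly flag (window length and threshold are assumptions, not the study's settings):

```python
# Kurtosis of a vibration window rises as impulsive bearing faults develop,
# so windows whose kurtosis deviates strongly from a healthy baseline can be
# flagged as anomalous. Illustrative sketch, not the study's pipeline.
import numpy as np
from scipy.stats import kurtosis

def window_kurtosis(signal, window=2048):
    """Kurtosis of consecutive, non-overlapping vibration windows."""
    n = len(signal) // window
    return np.array([kurtosis(signal[i * window:(i + 1) * window])
                     for i in range(n)])

def flag_anomalies(features, baseline, n_sigma=3.0):
    """Flag windows whose feature deviates strongly from the healthy baseline."""
    mu, sigma = baseline.mean(), baseline.std()
    return np.abs(features - mu) > n_sigma * sigma
```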
Abstract:
We present an algorithm for multiarmed bandits that achieves almost optimal performance in both stochastic and adversarial regimes without prior knowledge about the nature of the environment. Our algorithm is based on augmentation of the EXP3 algorithm with a new control lever in the form of exploration parameters that are tailored individually for each arm. The algorithm simultaneously applies the “old” control lever, the learning rate, to control the regret in the adversarial regime and the new control lever to detect and exploit gaps between the arm losses. This secures problem-dependent “logarithmic” regret when gaps are present without compromising on the worst-case performance guarantee in the adversarial regime. We show that the algorithm can exploit both the usual expected gaps between the arm losses in the stochastic regime and deterministic gaps between the arm losses in the adversarial regime. The algorithm retains “logarithmic” regret guarantee in the stochastic regime even when some observations are contaminated by an adversary, as long as on average the contamination does not reduce the gap by more than a half. Our results for the stochastic regime are supported by experimental validation.
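A bare-bones sketch of an EXP3-style step with per-arm exploration terms; the specific schedules for the learning rate and the per-arm parameters are given in the paper and are not reproduced here:

```python
# EXP3-style sketch (ours, not the paper's algorithm): sampling probabilities
# mix exponential weights with per-arm exploration terms xi[a], the "new
# control lever", while the learning rate eta keeps its usual role.
import numpy as np

def exp3_step(cum_loss_est, eta, xi, rng):
    """Sample one arm; return the arm index and its sampling probability."""
    w = np.exp(-eta * (cum_loss_est - cum_loss_est.min()))  # stabilised weights
    eps = np.minimum(xi, 1.0 / (2 * len(w)))                # per-arm exploration
    p = (1.0 - eps.sum()) * w / w.sum() + eps
    arm = rng.choice(len(p), p=p)
    return arm, p[arm]

def update(cum_loss_est, arm, observed_loss, prob):
    """Importance-weighted estimate: only the played arm's loss is observed."""
    cum_loss_est[arm] += observed_loss / prob
    return cum_loss_est

rng = np.random.default_rng(0)
losses = np.zeros(5)
arm, prob = exp3_step(losses, eta=0.1, xi=np.full(5, 0.02), rng=rng)
losses = update(losses, arm, observed_loss=0.3, prob=prob)
```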
Abstract:
This paper reviews the innovations that have been introduced in the milling train at Rocky Point mill since 2001 and provides some operational, performance and maintenance comparisons of the technologies in use. The decision to install BHEM mills in the #2 and #3 mill positions to complement the six-roll mills in the #1 and #4 mill positions has proven a good one. Satisfactory performance is being obtained by these mills while maintenance costs are significantly less. Very good #1 mill extraction and final bagasse moisture content are being achieved. The innovation of using Hägglunds hydraulic drives at higher speed…
Abstract:
We investigate the terminating concept of BKZ reduction first introduced by Hanrot et al. [Crypto'11] and perform extensive experiments to predict the number of tours necessary to obtain the best possible trade-off between reduction time and quality. We then improve Buchmann and Lindner's result [Indocrypt'09] for finding sub-lattice collisions in SWIFFT. We illustrate that further improvement in time is possible through a special setting of the SWIFFT parameters and through adaptively combining different reduction parameters. Our contribution also includes a probabilistic simulation approach that tops up the deterministic simulation described by Chen and Nguyen [Asiacrypt'11] and can predict the Gram-Schmidt norms more accurately for large block sizes.
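A conceptual sketch of terminating BKZ with a fixed tour budget and an early-exit test; bkz_tour and first_vector_norm are hypothetical helpers supplied by the caller, not a real lattice library's API:

```python
# Conceptual terminating-BKZ loop: rather than running BKZ-beta to
# convergence, stop after a fixed budget of tours, or earlier once the basis
# quality (here, the norm of the first basis vector) stops improving.
def terminating_bkz(basis, beta, max_tours, bkz_tour, first_vector_norm, tol=1e-6):
    """Run at most max_tours BKZ tours; stop early when progress stalls."""
    best = first_vector_norm(basis)
    tours_run = 0
    for _ in range(max_tours):
        basis = bkz_tour(basis, beta)     # one full BKZ tour at blocksize beta
        tours_run += 1
        norm = first_vector_norm(basis)
        if best - norm < tol * best:      # negligible improvement: terminate
            break
        best = norm
    return basis, tours_run
```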