1000 results for Ecm Algorithm


Relevance: 20.00%

Publisher:

Abstract:

A composite SaaS (Software as a Service) is software comprising several software components and data components. The composite SaaS placement problem is to determine where each of these components should be deployed in a cloud computing environment so that the performance of the composite SaaS is optimal. From a computational point of view, the composite SaaS placement problem is a large-scale combinatorial optimization problem, and an Iterative Cooperative Co-evolutionary Genetic Algorithm (ICCGA) was previously proposed for it. The ICCGA finds solutions of reasonable quality, but its computation time is noticeably long. To improve the computation time, we propose an unsynchronized Parallel Cooperative Co-evolutionary Genetic Algorithm (PCCGA) in this paper. Experimental results show that the PCCGA not only runs faster but also generates better-quality solutions than the ICCGA.
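As an illustration of the cooperative co-evolutionary idea behind the ICCGA/PCCGA, the following is a minimal sketch in which the placement vector is split between two co-evolving subpopulations; the cost model, operators and parameter values are hypothetical, and the unsynchronised parallel scheme of the actual PCCGA is only indicated in the comments.

```python
import random

NUM_COMPONENTS = 8          # components of the composite SaaS (assumed)
NUM_SERVERS = 4             # candidate servers (assumed)
HALF = NUM_COMPONENTS // 2  # each species evolves half of the placement vector
POP_SIZE = 20

def cost(placement):
    # Hypothetical objective: use as few distinct servers as possible.
    return len(set(placement))

def new_individual():
    return [random.randrange(NUM_SERVERS) for _ in range(HALF)]

def step(population, partner_representative, first_half):
    """Evolve one species for one generation, evaluating each individual
    together with the representative (best) individual of the other species."""
    def fitness(ind):
        full = ind + partner_representative if first_half else partner_representative + ind
        return cost(full)
    population = sorted(population, key=fitness)
    parents = population[: POP_SIZE // 2]
    children = []
    while len(parents) + len(children) < POP_SIZE:
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, HALF)
        child = a[:cut] + b[cut:]
        if random.random() < 0.1:                    # mutation
            child[random.randrange(HALF)] = random.randrange(NUM_SERVERS)
        children.append(child)
    return parents + children

species_a = [new_individual() for _ in range(POP_SIZE)]
species_b = [new_individual() for _ in range(POP_SIZE)]
for _ in range(50):
    # In the PCCGA each species would run in its own worker and read the other's
    # current representative without waiting (unsynchronised); here they alternate.
    species_a = step(species_a, species_b[0], first_half=True)
    species_b = step(species_b, species_a[0], first_half=False)
print("best placement found:", species_a[0] + species_b[0])
```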

Relevance: 20.00%

Publisher:

Abstract:

Breast cancer is a leading contributor to the burden of disease in Australia. Fortunately, the recent introduction of diverse therapeutic strategies has improved the survival outcome for many women. Despite this, the clinical management of breast cancer remains problematic, as not all approaches are sufficiently sophisticated to take into account the heterogeneity of this disease, and they are unable to predict disease progression, in particular metastasis. As such, women with good prognostic outcomes are exposed to the side effects of therapies without added benefit. Furthermore, women with aggressive disease for whom these advanced treatments would deliver benefit cannot be distinguished, and opportunities for more intensive or novel treatment are lost. This study was designed to identify novel factors associated with disease progression that have the potential to inform disease prognosis. Frequently overlooked, yet common, mediators of disease are the interactions that take place between the insulin-like growth factor (IGF) system and the extracellular matrix (ECM). Our laboratory has previously demonstrated that multiprotein insulin-like growth factor-I (IGF-I):insulin-like growth factor binding protein (IGFBP):vitronectin (VN) complexes stimulate migration of breast cancer cells in vitro, via the cooperative involvement of the insulin-like growth factor type I receptor (IGF-IR) and VN-binding integrins. However, the effects of IGF and ECM protein interactions on the dissemination and progression of breast cancer in vivo are unknown. It was hypothesised that interactions between proteins required for IGF-induced signalling events and those within the ECM contribute to breast cancer metastasis and are prognostic and predictive indicators of patient outcome. To address this hypothesis, semiquantitative immunohistochemistry (IHC) analyses were performed to compare the extracellular and subcellular distribution of IGF- and ECM-induced signalling proteins between matched normal, primary cancer, and metastatic cancer archival formalin-fixed paraffin-embedded (FFPE) breast tissue samples collected from women attending the Princess Alexandra Hospital, Brisbane. Multivariate Cox proportional hazards (PH) regression survival models, in conjunction with a modified 'purposeful selection of covariates' method, were applied to determine the prognostic potential of these proteins. This study provides the first in-depth, compartmentalised analysis of the distribution of IGF- and ECM-induced signalling proteins. As protein function and protein localisation are closely correlated, these findings provide novel insights into IGF signalling and ECM protein function during breast cancer development and progression. Distinct IGF signalling and ECM protein immunoreactivity was observed in the stroma and/or in subcellular locations in normal breast, primary cancer and metastatic cancer tissues. Analysis of the presence and location of stratifin (SFN) suggested a causal role in ECM remodelling events during breast cancer development and progression. The results of this study also suggest that fibronectin (FN) and β1 integrin are important for the formation of invadopodia and epithelial-to-mesenchymal transition (EMT) events.
Our data also highlighted the importance of the temporal and spatial distribution of IGF-induced signalling proteins in breast cancer metastasis; in particular, SFN, enhancer-of-split and hairy-related protein 2 (SHARP-2), total akt/protein kinase B 1 (Total-AKT1), phosphorylated akt/protein kinase B (P-AKT), extracellular signal-related kinases 1 and 2 (ERK1/2) and phosphorylated extracellular signal-related kinases 1 and 2 (P-ERK1/2). Multivariate survival models were created from the immunohistochemical data and were found to fit these data well, with very high statistical confidence. Numerous prognostic confounding effects and effect modifications were identified among elements of the ECM and the IGF signalling cascade, corroborating the survival models. This finding provides further evidence for the prognostic potential of IGF- and ECM-induced signalling proteins. In addition, the adjusted measures of association obtained in this study strengthen the validity and utility of the resulting models. The findings from this study provide insights into the biological interactions that occur during the development of breast tissue and contribute to disease progression. Importantly, these multivariate survival models could provide prognostic and predictive indicators that assist the clinical management of breast disease, namely in the early identification of cancers with a propensity to metastasise and/or recur following adjuvant therapy. The outcomes of this study also inform the development of new therapeutics to aid patient recovery. The findings have widespread clinical application in the diagnosis of disease, the prognosis of disease progression, and the selection of the most appropriate clinical management for individuals with breast cancer.
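The survival-modelling step described above can be illustrated with a generic Cox proportional hazards fit. The sketch below uses the lifelines Python library with a small, entirely made-up data frame; the column names are hypothetical stand-ins for the study's IHC scores, and the modified purposeful-selection procedure is not reproduced.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical, made-up data: follow-up time, event indicator, and two IHC scores.
df = pd.DataFrame({
    "survival_months": [34, 12, 60, 25, 48, 7, 39, 18],
    "event_observed":  [1, 1, 0, 1, 0, 1, 0, 1],   # 1 = progression/death, 0 = censored
    "sfn_score":       [2, 1, 3, 3, 0, 2, 1, 0],   # hypothetical stromal SFN staining score
    "p_akt_score":     [1, 3, 0, 2, 1, 3, 2, 1],   # hypothetical P-AKT staining score
})

cph = CoxPHFitter()
cph.fit(df, duration_col="survival_months", event_col="event_observed")
cph.print_summary()   # hazard ratios, confidence intervals and p-values per covariate
```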

Relevance: 20.00%

Publisher:

Abstract:

The 2010 LAGI competition was held on three underutilized sites in the United Arab Emirates. By choosing Staten Island, New York in 2012, the competition organisers have again brought into question new roles for public open space in the contemporary city. In the case of the UAE sites, the competition produced many entries that aimed to create a sculpture and, by doing so, attract people to the selected empty spaces in an arid climate. In a way these proposals were the incubators and the new characters of these empty spaces. The competition was thus successful at advancing understandings of the expanded role of public open spaces in the UAE and elsewhere. LAGI 2012 differs significantly from the UAE program because Fresh Kills Park has already been planned as a public open space for New Yorkers - with or without these clean energy sculptures. Furthermore, Fresh Kills Park is already a (gas) energy-generating site in its own right. We believe Fresh Kills Park, as a site, presents a problem that somewhat transcends the aims of the competition brief. Advancing a sustainable urban design proposition for the site therefore requires a fundamental reconsideration of the established paradigms of public open space. Hence our strategy is not only to create an energy-generating, site-specific artwork, but to create synergy between the public and the site engagement while at the same time complementing the idiosyncrasies of the pre-existing engineered landscape. Current PhD research on energy generation in public open spaces informs this work.

Relevance: 20.00%

Publisher:

Abstract:

Software as a Service (SaaS) in the Cloud has recently become increasingly significant for software users and providers. A SaaS delivered as a composite application has many benefits, including reduced delivery costs, flexible offerings of SaaS functions, and decreased subscription costs for users. However, this approach introduces a new problem in managing the resources allocated to the composite SaaS. The resources allocated at the initial stage may become overloaded or wasted in the dynamic environment of a Cloud. A typical data center resource manager usually triggers a placement reconfiguration for the SaaS in order to maintain its performance as well as to minimize the resources used. Existing approaches for this problem often ignore the underlying dependencies between SaaS components. In addition, the reconfiguration also has to comply with SaaS constraints in terms of resource requirements, placement requirements and the SLA. To tackle the problem, this paper proposes a penalty-based Grouping Genetic Algorithm for clustering the components of multiple composite SaaS applications in the Cloud. The main objective is to minimize the resources used by the SaaS by clustering its components without violating any constraint. Experimental results demonstrate the feasibility and the scalability of the proposed algorithm.
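The penalty idea can be illustrated with a minimal fitness function for a grouping chromosome: feasible groupings are scored by the number of servers they use, and each violated capacity or placement constraint adds a fixed penalty. The resource model, constraints and penalty weight below are assumptions for illustration only.

```python
# Illustrative penalty-based fitness for grouping SaaS components onto servers.
PENALTY = 1000   # cost added per violated constraint (assumed weight)

def fitness(groups, component_demand, server_capacity, must_separate):
    """groups: one set of component ids per server used; lower fitness is better."""
    cost = len(groups)                                   # objective: use fewer servers
    for group in groups:
        demand = sum(component_demand[c] for c in group)
        if demand > server_capacity:                     # resource (capacity) constraint
            cost += PENALTY
        for a, b in must_separate:                       # placement constraint
            if a in group and b in group:
                cost += PENALTY
    return cost

# Example: four components; components 0 and 3 must not share a server.
demand = {0: 2, 1: 3, 2: 1, 3: 4}
print(fitness([{0, 1}, {2, 3}], demand, server_capacity=6, must_separate=[(0, 3)]))  # 2
print(fitness([{0, 1, 2, 3}], demand, server_capacity=6, must_separate=[(0, 3)]))    # 2001
```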

Relevance: 20.00%

Publisher:

Abstract:

Improving energy efficiency has become increasingly important in data centers in recent years to curb rapidly growing electricity consumption. The power dissipation of the physical servers is the root cause of the power usage of other systems, such as cooling systems. Many efforts have been made to make data centers more energy efficient. One of them is to minimize the total power consumption of the servers in a data center through virtual machine consolidation, which is implemented by virtual machine placement. The placement problem is often modeled as a bin packing problem. Due to the NP-hard nature of the problem, heuristics such as the First Fit and Best Fit algorithms are often used and generally give good results. However, their performance leaves room for further improvement. In this paper we propose a Simulated Annealing based algorithm, which aims at further improvement from any feasible placement. This is the first published attempt to use SA to solve the VM placement problem for power optimization. Experimental results show that the SA algorithm can generate better results, saving up to 25 percent more energy than First Fit Decreasing within an acceptable time frame.
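A minimal sketch of the simulated-annealing approach is given below, starting from a feasible placement and repeatedly moving a random VM while accepting worse placements with a temperature-dependent probability. The power model, capacities and cooling schedule are illustrative assumptions, not the paper's experimental setup.

```python
import math
import random

VM_LOAD = [0.3, 0.2, 0.5, 0.1, 0.4, 0.25]    # CPU demand per VM (assumed)
CAPACITY = 1.0                                # per-server capacity (assumed)
IDLE_POWER, PEAK_POWER = 100.0, 200.0         # simple linear power model (assumed)

def power(placement):
    loads = {}
    for vm, host in enumerate(placement):
        loads[host] = loads.get(host, 0.0) + VM_LOAD[vm]
    if any(load > CAPACITY for load in loads.values()):
        return float("inf")                   # infeasible placements are rejected
    return sum(IDLE_POWER + (PEAK_POWER - IDLE_POWER) * load for load in loads.values())

def anneal(placement, temp=50.0, cooling=0.95, steps=2000):
    current, best = list(placement), list(placement)
    for _ in range(steps):
        neighbour = list(current)
        neighbour[random.randrange(len(neighbour))] = random.choice(current)  # move one VM
        delta = power(neighbour) - power(current)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            current = neighbour
            if power(current) < power(best):
                best = list(current)
        temp *= cooling
    return best

initial = [0, 1, 2, 3, 4, 5]   # a feasible but wasteful start: one VM per server
print("initial power:", power(initial))
print("annealed power:", power(anneal(initial)))
```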

Relevance: 20.00%

Publisher:

Abstract:

Server consolidation using virtualization has become an important technique for improving the energy efficiency of data centers, and virtual machine placement is the key to server consolidation. In the past few years, many approaches to virtual machine placement have been proposed. However, existing approaches consider only the energy consumed by the physical machines in a data center and ignore the energy consumed by the data center's communication network. This network energy consumption is not trivial, and it should therefore be considered in virtual machine placement in order to make the data center more energy-efficient. In this paper, we propose a genetic algorithm for a new virtual machine placement problem that considers the energy consumption in both the servers and the communication network of the data center. Experimental results show that the genetic algorithm performs well when tackling test problems of different kinds, and scales up well when the problem size increases.
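The combined objective can be sketched as a fitness function that sums a linear server power model and a hop-count-based network energy term; a genetic algorithm would then minimise this value. The traffic matrix, toy topology and energy coefficients below are assumptions, not the paper's model.

```python
VM_LOAD = [0.4, 0.3, 0.2, 0.5]                # CPU demand per VM (assumed)
TRAFFIC = {(0, 1): 5.0, (2, 3): 8.0}          # data rate between VM pairs (assumed)
IDLE_POWER, PEAK_POWER = 100.0, 200.0         # linear server power model (assumed)
ENERGY_PER_HOP = 2.0                          # network energy per unit traffic per hop (assumed)

def hops(host_a, host_b):
    # Toy topology: same host = 0 hops, same rack (same tens digit) = 2, otherwise 4.
    if host_a == host_b:
        return 0
    return 2 if host_a // 10 == host_b // 10 else 4

def energy(placement):
    loads = {}
    for vm, host in enumerate(placement):
        loads[host] = loads.get(host, 0.0) + VM_LOAD[vm]
    server_energy = sum(IDLE_POWER + (PEAK_POWER - IDLE_POWER) * load for load in loads.values())
    network_energy = sum(rate * ENERGY_PER_HOP * hops(placement[a], placement[b])
                         for (a, b), rate in TRAFFIC.items())
    return server_energy + network_energy     # the value a GA would minimise

print(energy([10, 10, 20, 21]))   # VMs 0 and 1 co-located; VMs 2 and 3 in the same rack
print(energy([10, 30, 20, 40]))   # spread across racks: higher network energy
```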

Relevance: 20.00%

Publisher:

Abstract:

A simple and effective down-sampling algorithm, the Peak-Hold-Down-Sample (PHDS) algorithm, is developed in this paper to enable rapid and efficient data transfer in remote condition monitoring applications. The algorithm is particularly useful for high-frequency Condition Monitoring (CM) techniques and for low-speed machine applications, since the combination of a high sampling frequency and a low rotating speed generally leads to large, unwieldy data sizes. The effectiveness of the algorithm was evaluated and tested on four sets of data in the study. One set of data was extracted from the condition monitoring signal of a practical industrial application. Another set was acquired from a low-speed machine test rig in the laboratory. The other two sets were computer-simulated bearing defect signals containing either a single defect or multiple bearing defects. The results show that the PHDS algorithm can substantially reduce the size of the data while preserving the critical bearing defect information for all data sets used in this work, even when a large down-sample ratio is used (e.g., 500 times down-sampled). In contrast, down-sampling with a conventional signal processing technique eliminates useful and critical information, such as bearing defect frequencies, when the same down-sample ratio is employed. The conventional technique also introduces noise and artificial frequency components, which limits its usefulness for machine condition monitoring applications.
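The core peak-hold idea can be sketched as a block-wise peak selection: instead of keeping every Nth sample, each block of N samples is represented by its largest-magnitude value, so short defect impulses survive the reduction. The block handling below is an assumption; the published PHDS algorithm may differ in detail.

```python
import numpy as np

def peak_hold_downsample(signal, ratio):
    """Down-sample by `ratio`, keeping the largest-magnitude sample of each block."""
    signal = np.asarray(signal, dtype=float)
    usable = len(signal) - len(signal) % ratio        # drop the incomplete tail block
    blocks = signal[:usable].reshape(-1, ratio)
    peak_index = np.argmax(np.abs(blocks), axis=1)    # position of each block's peak
    return blocks[np.arange(blocks.shape[0]), peak_index]

# A short impulsive "defect" survives a 500x reduction, whereas naive decimation
# (taking every 500th sample) would usually miss it entirely.
fs = 100_000
signal = 0.01 * np.random.randn(fs)
signal[12_345] = 1.0                                  # simulated bearing-defect impulse
print(peak_hold_downsample(signal, 500).max())        # ~1.0: the impulse is preserved
print(np.abs(signal[::500]).max())                    # typically ~0.03: the impulse is lost
```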

Relevance: 20.00%

Publisher:

Abstract:

A fundamental problem faced by stereo vision algorithms is that of determining correspondences between the two images that comprise a stereo pair. This paper presents work towards the development of a new matching algorithm, based on the rank transform. The algorithm makes use of both area-based and edge-based information, and is therefore referred to as a hybrid algorithm. In addition, it uses a number of matching constraints, including the novel rank constraint. Results obtained using a number of test pairs show that the matching algorithm is capable of removing a significant proportion of invalid matches. The accuracy of matching in the vicinity of edges is also improved.
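For reference, the rank transform underlying the matcher replaces each pixel with the number of neighbouring pixels whose intensity is lower than its own; matching is then performed on the transformed images. The sketch below shows only this generic transform, not the hybrid area/edge matcher or the rank constraint.

```python
import numpy as np

def rank_transform(image, window=5):
    """Replace each pixel by the count of neighbours darker than it (window x window)."""
    image = np.asarray(image, dtype=float)
    r = window // 2
    padded = np.pad(image, r, mode="edge")
    out = np.zeros(image.shape, dtype=int)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            if dy == 0 and dx == 0:
                continue
            shifted = padded[r + dy : r + dy + image.shape[0],
                             r + dx : r + dx + image.shape[1]]
            out += (shifted < image)            # neighbour darker than the centre pixel
    return out

img = np.random.randint(0, 256, (6, 8))
print(rank_transform(img, window=3))            # values range from 0 to 8 for a 3x3 window
```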

Relevance: 20.00%

Publisher:

Abstract:

A fundamental problem faced by stereo vision algorithms is that of determining correspondences between two images which comprise a stereo pair. This paper presents work towards the development of a new matching algorithm, based on the rank transform. This algorithm makes use of both area-based and edge-based information, and is therefore referred to as a hybrid algorithm. In addition, this algorithm uses a number of matching constraints, including the novel rank constraint. Results obtained using a number of test pairs show that the matching algorithm is capable of removing most invalid matches. The accuracy of matching in the vicinity of edges is also improved.

Relevance: 20.00%

Publisher:

Abstract:

The aim of this paper is to implement a Game-Theory-based offline mission path planner for aerial inspection tasks of large linear infrastructures. Like most real-world optimisation problems, mission path planning involves a number of objectives which ideally should be minimised simultaneously. The goal of this work is therefore to develop a Multi-Objective (MO) optimisation tool able to provide a set of optimal solutions for the inspection task, given the environment data, the mission requirements and the definition of the objectives to minimise. Results indicate the robustness of the method and its capability to find trade-offs among the Pareto-optimal solutions.
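One building block of any such MO planner is identifying the non-dominated (Pareto-optimal) candidates among a set of evaluated paths. The sketch below shows a minimal Pareto filter, assuming all objectives are to be minimised; the candidate paths and objective values are hypothetical placeholders.

```python
def dominates(a, b):
    """True if a is no worse than b in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    return [s for s in solutions
            if not any(dominates(other, s) for other in solutions if other is not s)]

# Hypothetical (path length, flight time, collision risk) for candidate inspection paths.
candidates = [(10.0, 3.0, 2.0), (12.0, 2.0, 2.5), (11.0, 4.0, 3.0), (9.0, 5.0, 1.0)]
print(pareto_front(candidates))   # the dominated path (11.0, 4.0, 3.0) is filtered out
```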

Relevance: 20.00%

Publisher:

Abstract:

In this paper we propose a framework for gradient-descent image and object alignment in the Fourier domain. Our method centers upon the classical Lucas & Kanade (LK) algorithm, where we represent the source and template/model in the complex 2D Fourier domain rather than in the spatial 2D domain. We refer to our approach as the Fourier LK (FLK) algorithm. The FLK formulation is advantageous when one pre-processes the source image and template/model with a bank of filters (e.g. oriented edges, Gabor, etc.) as: (i) it can handle substantial illumination variations; (ii) the inefficient pre-processing filter bank step can be subsumed within the FLK algorithm as a sparse diagonal weighting matrix; (iii) unlike traditional LK, the computational cost is invariant to the number of filters, making the approach far more efficient; and (iv) the approach can be extended to the inverse compositional form of the LK algorithm, where nearly all steps (including the Fourier transform and filter bank pre-processing) can be pre-computed, leading to an extremely efficient and robust approach to gradient-descent image matching. Further, these computational savings translate to non-rigid object alignment tasks that are considered extensions of the LK algorithm, such as those found in Active Appearance Models (AAMs).
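The key observation, that a bank of pre-processing filters collapses into a single diagonal weighting in the Fourier domain, follows from Parseval's theorem and can be verified numerically. The sketch below does this for 1-D signals with random filters; it illustrates only the weighting equivalence, not the full FLK alignment algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64
source, template = rng.standard_normal(n), rng.standard_normal(n)
filters = rng.standard_normal((5, n))               # a bank of 5 arbitrary filters

# Spatial domain: filter both signals with every filter, then sum the SSDs.
spatial = sum(
    np.sum((np.real(np.fft.ifft(np.fft.fft(f) * np.fft.fft(source)))
            - np.real(np.fft.ifft(np.fft.fft(f) * np.fft.fft(template)))) ** 2)
    for f in filters)

# Fourier domain: the same quantity via a single diagonal (per-frequency) weighting.
weights = np.sum(np.abs(np.fft.fft(filters, axis=1)) ** 2, axis=0)
diff = np.fft.fft(source) - np.fft.fft(template)
fourier = np.sum(weights * np.abs(diff) ** 2) / n    # the 1/n comes from Parseval's theorem

print(spatial, fourier)                              # the two values agree
```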

Relevance: 20.00%

Publisher:

Abstract:

An algorithm for computing dense correspondences between the images of a stereo pair or image sequence is presented. The algorithm can make use of both standard matching metrics and the rank and census filters, two filters based on order statistics which have been applied to the image matching problem. Their advantages include robustness to radiometric distortion and amenability to hardware implementation. Results obtained using both real stereo pairs and a synthetic stereo pair with ground truth were compared. The rank and census filters were shown to significantly improve performance in the case of radiometric distortion. In all cases, the results obtained were comparable to, if not better than, those obtained using standard matching metrics. Furthermore, the rank and census filters have the additional advantage that their computational overhead is lower than that of these metrics. For all techniques tested, the difference between the results obtained for the synthetic stereo pair and the ground truth was small.
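As a companion to the rank transform, the census transform encodes each pixel's neighbourhood comparisons as a bit string, and the matching cost is the Hamming distance between bit strings, which is what makes it robust to monotonic radiometric distortion. The window size and the toy distortion below are illustrative assumptions.

```python
import numpy as np

def census_transform(image, window=3):
    """Encode each pixel as a bit string of neighbourhood intensity comparisons."""
    image = np.asarray(image, dtype=float)
    r = window // 2
    padded = np.pad(image, r, mode="edge")
    codes = np.zeros(image.shape, dtype=np.uint64)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            if dy == 0 and dx == 0:
                continue
            shifted = padded[r + dy : r + dy + image.shape[0],
                             r + dx : r + dx + image.shape[1]]
            codes = (codes << np.uint64(1)) | (shifted < image).astype(np.uint64)
    return codes

def hamming_cost(code_a, code_b):
    """Matching cost between two census codes: the number of differing bits."""
    return bin(int(code_a) ^ int(code_b)).count("1")

left = np.random.randint(0, 256, (5, 7))
right = 2 * left + 30                              # monotonic radiometric distortion
c_left, c_right = census_transform(left), census_transform(right)
print(hamming_cost(c_left[2, 3], c_right[2, 3]))   # 0: the census codes are unchanged
```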

Relevance: 20.00%

Publisher:

Abstract:

This paper presents an adaptive metering algorithm for enhancing the electronic screening (e-screening) operation at truck weight stations. The algorithm uses a feedback control mechanism to regulate the number of trucks entering the weight station. When the weight station is underutilized, the algorithm lowers the weight threshold so that more trucks are inspected. Conversely, when the station is overutilized, it restricts the number of trucks to be inspected in order to prevent queue spillover. The proposed control concept is demonstrated and evaluated in a simulation environment. The simulation results demonstrate the considerable benefits of the proposed algorithm in improving overweight enforcement with minimal negative impact on non-overweight trucks. The test results also reveal that the effectiveness of the algorithm improves with higher truck participation rates in the e-screening program.
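The feedback mechanism can be pictured as a simple proportional rule: the weight threshold is raised when measured station utilisation exceeds a target and lowered when it falls below it. The gain, bounds and target value below are assumed for illustration and are not the paper's calibrated parameters.

```python
TARGET_UTILISATION = 0.85    # desired fraction of inspection capacity in use (assumed)
GAIN = 10_000.0              # kg of threshold change per unit of utilisation error (assumed)
MIN_THRESHOLD, MAX_THRESHOLD = 30_000.0, 45_000.0   # assumed bounds on the screening threshold

def update_threshold(threshold, utilisation):
    """One control step: a busy station raises the threshold, an idle one lowers it."""
    error = utilisation - TARGET_UTILISATION
    threshold += GAIN * error
    return min(MAX_THRESHOLD, max(MIN_THRESHOLD, threshold))

threshold = 36_000.0
for utilisation in [0.4, 0.5, 0.9, 1.0, 0.95, 0.7]:   # measured once per control interval
    threshold = update_threshold(threshold, utilisation)
    print(round(threshold))
```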

Relevance: 20.00%

Publisher:

Abstract:

Long traffic queues on off-ramps significantly compromise the safety and throughput of motorways, and obtaining accurate queue information is crucial for countermeasure strategies. However, it is challenging to estimate traffic queues with locally installed inductive loop detectors. This paper deals with the problem of queue estimation by interpreting queuing dynamics and the corresponding time-occupancy distribution on motorway off-ramps. A novel algorithm for real-time queue estimation with two detectors is presented and discussed. Results derived from microscopic traffic simulation validated the effectiveness of the algorithm and revealed some of its useful features: (a) long and intermediate traffic queues could be accurately measured, (b) relatively simple detector input (i.e., time occupancy) was required, and (c) the estimation approach was independent of signal timing changes and has the potential to cooperate with advanced signal control strategies. Some issues concerning field implementation are also discussed.
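Purely to illustrate how two detectors' time occupancy can be read as a queue measure, the sketch below uses a simple threshold-and-interpolate rule; the thresholds, detector spacing and interpolation are assumptions and do not reproduce the paper's estimation algorithm.

```python
QUEUE_OCCUPANCY = 0.6    # time occupancy above this suggests standing traffic (assumed)
STOPLINE_TO_D1 = 50.0    # metres from the stop line to the downstream detector (assumed)
D1_TO_D2 = 150.0         # metres between the downstream and upstream detectors (assumed)

def estimate_queue(occ_downstream, occ_upstream):
    """Approximate queue length in metres from the two detectors' time occupancy."""
    if occ_downstream < QUEUE_OCCUPANCY:
        return 0.0                                    # no standing queue on the ramp
    if occ_upstream < QUEUE_OCCUPANCY:
        # The queue tail lies between the detectors: interpolate on upstream occupancy.
        return STOPLINE_TO_D1 + D1_TO_D2 * (occ_upstream / QUEUE_OCCUPANCY)
    return STOPLINE_TO_D1 + D1_TO_D2                  # queue extends past both detectors

print(estimate_queue(0.8, 0.3))   # intermediate queue, roughly 125 m
print(estimate_queue(0.9, 0.9))   # long queue, at least 200 m
```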