547 results for least common subgraph algorithm


Relevance:

20.00%

Publisher:

Abstract:

A Software-as-a-Service, or SaaS, can be delivered in a composite form, consisting of a set of application and data components that work together to deliver higher-level functional software. Components in a composite SaaS may need to be scaled (replicated or deleted) to accommodate the user's load. It may not be necessary to replicate all components of the SaaS, as some components can be shared by other instances. On the other hand, when the load is low, some of the instances may need to be deleted to avoid resource underutilisation. Thus, it is important to determine which components are to be scaled such that the performance of the SaaS is still maintained. Extensive research on SaaS resource management in the Cloud has not yet addressed the challenges of the scaling process for composite SaaS. Therefore, a hybrid genetic algorithm is proposed that exploits problem-specific knowledge to explore the best combination of scaling actions for the components. Experimental results demonstrate that the proposed algorithm outperforms existing heuristic-based solutions.
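
The abstract gives no implementation details; as a rough illustration, here is a minimal genetic-algorithm sketch that searches over one scaling action per component (replicate, keep, or delete). The cost function, action encoding, load figures and parameters are hypothetical placeholders, not the authors' hybrid algorithm.

    # Minimal genetic-algorithm sketch for choosing a scaling action per SaaS component.
    # The cost model below is purely illustrative; it is not the paper's fitness function.
    import random

    ACTIONS = ("delete", "keep", "replicate")            # one action per component
    N_COMPONENTS = 8
    LOAD = [0.9, 0.4, 0.7, 0.2, 0.8, 0.5, 0.3, 0.6]      # hypothetical per-component load

    def cost(plan):
        # Penalise overload (too few instances) and resource waste (too many).
        total = 0.0
        for action, load in zip(plan, LOAD):
            instances = {"delete": 0, "keep": 1, "replicate": 2}[action]
            if instances == 0 and load > 0.3:
                total += 10 * load                       # dropped a component that is still needed
            else:
                total += abs(instances - 2 * load)       # mismatch between capacity and load
        return total

    def mutate(plan, rate=0.1):
        return tuple(random.choice(ACTIONS) if random.random() < rate else a for a in plan)

    def crossover(a, b):
        cut = random.randrange(1, N_COMPONENTS)
        return a[:cut] + b[cut:]

    def evolve(pop_size=30, generations=50):
        pop = [tuple(random.choice(ACTIONS) for _ in range(N_COMPONENTS)) for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=cost)
            survivors = pop[: pop_size // 2]
            children = [mutate(crossover(random.choice(survivors), random.choice(survivors)))
                        for _ in range(pop_size - len(survivors))]
            pop = survivors + children
        return min(pop, key=cost)

    print(evolve())

The hybrid aspect mentioned in the abstract (injecting problem knowledge) would typically replace the random initialisation or the mutation step with load-aware heuristics; the sketch above keeps both purely random for brevity.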

Relevance:

20.00%

Publisher:

Abstract:

Recent findings from the clinic and the laboratory have transformed the way proteases and their inhibitors are perceived in the outermost layer of the skin, the epidermis. It now appears that an integrated proteolytic network operates within the epidermis, comprising more than 30 enzymes that carry out a growing list of essential functions. Equally, defective regulation or execution of protease-mediated processes is emerging as a key contributor to diverse human skin pathologies, and in recent years the number of diseases attributable to aberrant proteolytic activity has more than doubled. Here, we survey the different roles of proteases in epidermal homeostasis (from processing enzymes to signalling molecules) and explore the spectrum of rare and common human skin disorders where proteolytic pathways are dysregulated.

Relevance:

20.00%

Publisher:

Abstract:

Several analytical methods for Dynamic System Optimum (DSO) assignment have been proposed, but they are basically classified into two kinds. This chapter attempts to establish DSO by equilibrating the path dynamic marginal time (DMT). The authors analyse the path DMT for a single path with tandem bottlenecks and show that the path DMT is not the simple summation of the DMT associated with each bottleneck along the path. Next, the authors examine the DMT of several paths passing through a common bottleneck. It is shown that the externality at the bottleneck is shared by the paths in proportion to their demand from the current time until the queue vanishes. This sharing of the externality is caused by the departure-rate shift under first-in-first-out (FIFO), and the externality propagates to the downstream bottlenecks. However, the externalities propagated downstream are cancelled out where downstream bottlenecks exist. Therefore, the authors conclude that the path DMT can be evaluated without considering the propagation of the externalities, just as in the evaluation of the path DMT for a single path passing through a series of bottlenecks between the origin and destination. Based on the DMT analysis, the authors finally propose a heuristic solution algorithm and verify it by comparing the numerical solution with the analytical one.
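
As a rough formalisation of the sharing rule described above (an illustrative sketch with assumed notation, not the chapter's derivation): for a point bottleneck b with capacity \mu_b whose queue vanishes at time t_b^e, one extra vehicle entering at time t delays each subsequent vehicle by 1/\mu_b until the queue clears, so

    % Illustrative sketch only; the notation is assumed, not taken from the chapter.
    e_b(t) \approx \mu_b \,\bigl(t_b^e - t\bigr)\cdot\frac{1}{\mu_b} = t_b^e - t,
    \qquad
    e_{b,p}(t) \approx e_b(t)\,
      \frac{\int_t^{t_b^e} q_p(s)\,\mathrm{d}s}{\sum_{p'} \int_t^{t_b^e} q_{p'}(s)\,\mathrm{d}s},

where q_p is the departure (demand) rate of path p through the bottleneck; each path thus carries a share of the bottleneck externality e_b(t) in proportion to its demand between the current time and the queue-vanishing time.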

Relevance:

20.00%

Publisher:

Abstract:

Most urban-dwelling Australians take secure and safe water supplies for granted. That is, they have an adequate quantity of water at a quality that can be used by people without harm from human and animal wastes, salinity and hardness, or pollutants from agriculture and manufacturing industries. Australia-wide, urban and peri-urban dwellers use safe water for all domestic as well as industrial purposes. However, this is not the situation in remote regions of Australia, where limited availability and poor water quality can be a development constraint. Nor is it the case in Sri Lanka, where people in rural regions are struggling to obtain a secure supply of water, let alone a safe one, because of the impact of faecal and other contaminants. The purposes of this paper are to overview: the population and environmental health challenges arising from the lack of safe water in rural and remote communities; response pathways to address water quality issues; and the status of and need for integrated catchment management (ICM) in selected remote regions of Australia and vulnerable and lagging rural regions in Sri Lanka. Conclusions are drawn that focus on the opportunity for inter-regional collaboration between Australia and Sri Lanka for the delivery of safe water through ICM.

Relevance:

20.00%

Publisher:

Abstract:

Capacity reduction programmes, in the form of buybacks or decommissioning, have had relatively widespread application in fisheries in the US, Europe and Australia. A common criticism of such programmes is that they remove the least efficient vessels first, resulting in an increase in average efficiency of the remaining fleet, which tends to increase the effective fishing power of the remaining fleet. In this paper, the effects of a buyback programme on average technical efficiency in Australia’s Northern Prawn Fishery are examined using a multi-output production function approach with an explicit inefficiency model. As expected, the results indicate that average efficiency of the remaining vessels was generally greater than that of the removed vessels. Further, there was some evidence of an increase in average scale efficiency in the fleet as the remaining vessels were closer, on average, to the optimal scale. Key factors affecting technical efficiency included company structure and the number of vessels fishing. In regard to fleet size, our model suggests positive externalities associated with more boats fishing at any point in time (due to information sharing and reduced search costs), but also negative externalities due to crowding, with the latter effect dominating the former. Hence, the buyback resulted in a net increase in the individual efficiency of the remaining vessels due to reduced crowding, as well as raising average efficiency through removal of less efficient vessels.
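
A common way to write such a frontier with an explicit inefficiency model is the form below, shown for orientation only; it is a generic single-output simplification, whereas the paper itself uses a multi-output specification.

    % Generic stochastic production frontier with an inefficiency model (illustrative form).
    \ln y_{it} = \beta_0 + \sum_{k}\beta_k \ln x_{kit} + v_{it} - u_{it},
    \qquad v_{it}\sim N(0,\sigma_v^2),
    \qquad u_{it} = \sum_{j}\delta_j z_{jit} + w_{it}\ \ge 0,

where y_{it} is the output of vessel i in period t, the x_{kit} are inputs, and the z_{jit} are efficiency covariates such as company structure and the number of vessels fishing; removing low-efficiency vessels shifts the distribution of u_{it} in the remaining fleet, which is the effect the paper measures.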

Relevance:

20.00%

Publisher:

Abstract:

The Fluid–Structure Interaction (FSI) problem is significant in science and engineering and poses challenges for computational mechanics. The coupled model of Finite Element and Smoothed Particle Hydrodynamics (FE-SPH) is a robust technique for simulation of FSI problems. However, two important steps in the coupled FE-SPH model, neighbor searching and contact searching, are extremely time-consuming. The Point-In-Box (PIB) searching algorithm was developed by Swegle to improve searching efficiency. However, it has the shortcoming that its efficiency can be significantly affected by the distribution of points (nodes in FEM and particles in SPH). In this paper, in order to improve searching efficiency, a novel Striped-PIB (S-PIB) searching algorithm is proposed to overcome the point-distribution shortcoming of the PIB algorithm, and the two time-consuming steps of neighbor searching and contact searching are integrated into one searching step. The accuracy and efficiency of the newly developed searching algorithm are studied through efficiency tests and FSI problems. It has been found that the newly developed model can significantly improve computational efficiency, and it is believed to be a powerful tool for FSI analysis.
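
For orientation, the sketch below shows a plain uniform-grid (cell-binning) neighbour search in 2-D, the generic kind of query that FE-SPH coupling relies on; it is not the PIB or S-PIB algorithm described in the paper, and all sizes and coordinates are made up.

    # Minimal uniform-grid neighbour search in 2-D: a generic cell-binning stand-in used to
    # illustrate the kind of neighbour query FE-SPH coupling needs; it is not the PIB or
    # S-PIB algorithm from the paper.
    from collections import defaultdict

    def build_grid(points, h):
        """Hash each point into a square cell of side h (the smoothing/contact radius)."""
        grid = defaultdict(list)
        for i, (x, y) in enumerate(points):
            grid[(int(x // h), int(y // h))].append(i)
        return grid

    def neighbours(points, grid, i, h):
        """Return indices of points within distance h of point i, checking only adjacent cells."""
        x, y = points[i]
        cx, cy = int(x // h), int(y // h)
        found = []
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for j in grid.get((cx + dx, cy + dy), []):
                    if j != i:
                        px, py = points[j]
                        if (px - x) ** 2 + (py - y) ** 2 <= h * h:
                            found.append(j)
        return found

    pts = [(0.1, 0.2), (0.15, 0.22), (0.9, 0.9), (0.12, 0.19)]
    g = build_grid(pts, h=0.1)
    print(neighbours(pts, g, 0, h=0.1))

A grid like this degrades when points cluster into a few cells, which is the sensitivity to point distribution that the paper's S-PIB scheme is designed to avoid.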

Relevance:

20.00%

Publisher:

Abstract:

Purpose Accelerometers are recognized as a valid and objective tool to assess free-living physical activity. Despite the widespread use of accelerometers, there is no standardized way to process and summarize their data, which limits our ability to compare results across studies. This paper a) reviews decision rules researchers have used in the past, b) compares the impact of using different decision rules on a common data set, and c) identifies issues to consider for accelerometer data reduction. Methods The methods sections of studies published in 2003 and 2004 were reviewed to determine what decision rules previous researchers have used to identify the wearing period, the minimal wear requirement for a valid day, spurious data, and the number of days used to calculate the outcome variables, and to extract bouts of moderate-to-vigorous physical activity (MVPA). For this study, four data reduction algorithms that employ different decision rules were used to analyze the same data set. Results The review showed that, among studies that reported their decision rules, much variability was observed. Overall, the analyses suggested that using different algorithms impacted several important outcome variables. The most stringent algorithm yielded significantly lower wearing time, the lowest activity counts per minute and counts per day, and fewer minutes of MVPA per day. An exploratory sensitivity analysis revealed that the most stringent inclusion criterion had an impact on sample size and wearing time, which in turn affected many outcome variables. Conclusions These findings suggest that the decision rules employed to process accelerometer data have a significant impact on important outcome variables. Until guidelines are developed, it will remain difficult to compare findings across studies.
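
To make the idea of "decision rules" concrete, here is a small sketch that applies one possible set of rules to a day of 1-minute epoch counts; the thresholds (60-minute zero runs for non-wear, 10 h of wear for a valid day, a 1952 counts-per-minute MVPA cut-point) are examples of commonly used choices and are not the specific algorithms compared in the paper.

    # Illustrative accelerometer data-reduction rules (thresholds are example choices,
    # not the decision rules compared in the paper).
    def non_wear_minutes(counts, run=60):
        """Flag minutes that fall inside a run of >= `run` consecutive zero counts."""
        flags = [False] * len(counts)
        i = 0
        while i < len(counts):
            if counts[i] == 0:
                j = i
                while j < len(counts) and counts[j] == 0:
                    j += 1
                if j - i >= run:
                    for k in range(i, j):
                        flags[k] = True
                i = j
            else:
                i += 1
        return flags

    def summarise_day(counts, min_wear_hours=10, mvpa_cutpoint=1952):
        """Return (valid_day, wear_minutes, mvpa_minutes) for one day of 1-min epochs."""
        non_wear = non_wear_minutes(counts)
        wear = [c for c, nw in zip(counts, non_wear) if not nw]
        valid = len(wear) >= min_wear_hours * 60
        mvpa = sum(1 for c in wear if c >= mvpa_cutpoint)
        return valid, len(wear), mvpa

    day = [0] * 120 + [300, 2500, 2100, 50] * 200 + [0] * 120   # toy 1-min epoch counts
    print(summarise_day(day))

Changing any of the three thresholds changes wear time, valid-day status and MVPA minutes for the same raw counts, which is exactly the sensitivity the paper documents.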

Relevance:

20.00%

Publisher:

Abstract:

ORIGO Stepping Stones is written and developed by a team of experts to provide teachers with a world-class elementary math program. Our expert team of authors and consultants is utilizing all available educational research to create a unique program that has never before been available to teachers. The full-color Student Practice Book provides practice pages that support previous and current lessons.

Relevance:

20.00%

Publisher:

Abstract:

Recently, the tissue origin of the MDA-MB-435 cell line has been the subject of considerable debate. In this study, we set out to determine whether MDA-MB-435-DTP cells shown to express melanoma-specific genes were identical to various other MDA-MB-435 cell stocks worldwide. CGH-microarray, genetic polymorphism genotyping, microsatellite fingerprint analysis and/or chromosomal number confirmed that the MDA-MB-435 cells maintained at the Lombardi Comprehensive Cancer Center (MDA-MB-435-LCC) are almost identical to the MDA-MB-435-DTP cells, and showed a very similar profile to those obtained from the same original source (MD Anderson Cancer Center) but maintained independently (MDA-MB-435-PMCC). Gene expression profile analysis confirmed common expression of genes among different MDA-MB-435-LCC cell stocks, and identified some unique gene products in MDA-MB-435-PMCC cells. RT-PCR analysis confirmed the expression of the melanoma marker tyrosinase across multiple MDA-MB-435 cell stocks. Collectively, our results show that the MDA-MB-435 cells used widely have identical origins to those that exhibit a melanoma-like gene expression signature, but exhibit a small degree of genotypic and phenotypic drift.

Relevance:

20.00%

Publisher:

Abstract:

A statistical approach is used in the design of a battery-supercapacitor energy storage system for a wind farm. The design exploits the technical merits of the two energy storage media, in terms of the differences in their specific power and energy densities and their ability to accommodate different rates of change in the charging/discharging powers. By treating the input wind power as random and using a proposed coordinated power flow control strategy for the battery and the supercapacitor, the approach evaluates the energy storage capacities, the corresponding expected life cycle cost/year of the storage media, and the expected cost/year of unmet power dispatch. A computational procedure is then developed for the design of a least-cost/year hybrid energy storage system to realize wind power dispatch at a specified confidence level.
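
As a toy illustration of the kind of statistical evaluation described (not the paper's coordinated control strategy or cost model), the sketch below runs a simple Monte Carlo simulation in which a rate-limited battery and a small supercapacitor absorb the mismatch between random wind power and a fixed dispatch target, and it reports the fraction of hours with unmet dispatch; all capacities, rates and the wind model are assumptions.

    # Toy Monte Carlo estimate of unmet dispatch for an assumed battery/supercapacitor split.
    import random

    def simulate(hours=10000, dispatch=0.5, batt_cap=2.0, sc_cap=0.2,
                 batt_rate=0.05, sc_rate=0.5, seed=1):
        """Estimate the fraction of hours in which the dispatch target is not met."""
        random.seed(seed)
        batt, sc = batt_cap / 2, sc_cap / 2                      # states of charge (per-unit-hours)
        wind, shortfalls = 0.5, 0
        for _ in range(hours):
            wind = min(1.0, max(0.0, wind + random.gauss(0, 0.1)))   # smooth random wind
            mismatch = wind - dispatch                               # + surplus, - deficit
            if mismatch >= 0:                                        # charge, battery first
                to_batt = min(mismatch, batt_rate, batt_cap - batt)
                to_sc = min(mismatch - to_batt, sc_rate, sc_cap - sc)
                batt, sc = batt + to_batt, sc + to_sc                # remaining surplus is curtailed
            else:                                                    # discharge to cover the deficit
                need = -mismatch
                from_batt = min(need, batt_rate, batt)
                from_sc = min(need - from_batt, sc_rate, sc)
                batt, sc = batt - from_batt, sc - from_sc
                if from_batt + from_sc < need - 1e-9:
                    shortfalls += 1                                  # unmet dispatch this hour
        return shortfalls / hours

    print(simulate())

Sweeping batt_cap and sc_cap and attaching cost figures to each candidate pair would give the kind of least-cost/year trade-off at a chosen confidence level that the abstract describes.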

Relevance:

20.00%

Publisher:

Abstract:

Interior permanent-magnet synchronous motors (IPMSMs) have become attractive candidates in modern hybrid electric vehicles and industrial applications. Usually, to obtain good control performance, the electric drives of this kind of motor require one position sensor, one dc-link sensor, and at least two current sensors. Failure of any of these sensors might lead to degraded system performance or even instability. As such, sensor-fault-resilient control becomes a very important issue in modern drive systems. This paper proposes a novel sensor fault detection and isolation algorithm based on an extended Kalman filter. It is robust to system random noise and efficient in real-time implementation. Moreover, the proposed algorithm is compact and can detect and isolate all the sensor faults for IPMSM drives. A thorough theoretical analysis is provided, and the effectiveness of the proposed approach is proven by extensive experimental results.
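
The detection idea can be illustrated with a residual (innovation) test: if a measurement's innovation exceeds a few standard deviations of the predicted innovation covariance, the reading is flagged and excluded. The sketch below uses a scalar linear Kalman filter with made-up noise levels purely to show this test; the paper's algorithm is an extended Kalman filter built on the full IPMSM drive model.

    # Toy residual-based sensor fault detection with a scalar linear Kalman filter.
    # Only the innovation-test idea is shown; noise levels and the fault are made up.
    import math, random

    def detect_faults(measurements, q=1e-4, r=1e-2, gate=4.0):
        """Return indices of measurements whose innovation exceeds `gate` sigma."""
        x, p = measurements[0], 1.0        # state estimate and its variance
        faults = []
        for k, z in enumerate(measurements):
            p += q                         # predict (state assumed slowly varying)
            s = p + r                      # innovation covariance
            nu = z - x                     # innovation (residual)
            if abs(nu) > gate * math.sqrt(s):
                faults.append(k)           # isolate: skip the update for this reading
                continue
            kgain = p / s                  # update with a healthy reading
            x += kgain * nu
            p *= (1 - kgain)
        return faults

    random.seed(0)
    true_speed = 100.0
    zs = [true_speed + random.gauss(0, 0.1) for _ in range(200)]
    zs[120] += 5.0                         # inject one spurious sensor reading
    print(detect_faults(zs))

In a drive, the same test would be applied to each sensor channel of the EKF so that a faulty position, dc-link or current reading can be identified and replaced by the model-based estimate.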

Relevance:

20.00%

Publisher:

Abstract:

Background Heatwaves can cause excess deaths in a local population ranging from tens to thousands within a couple of weeks. The excess mortality due to a special event (e.g., a heatwave or an epidemic outbreak) is estimated by subtracting the mortality figure under 'normal' conditions from the historical daily mortality records. The calculation of the excess mortality is a scientific challenge because of the stochastic temporal pattern of the daily mortality data, which is characterised by (a) long-term changes in the mean level (i.e., non-stationarity) and (b) a non-linear temperature-mortality association. The Hilbert-Huang Transform (HHT) algorithm is a novel method originally developed for analysing non-linear and non-stationary time series data in the field of signal processing; however, it has not been applied in public health research. This paper aimed to demonstrate the applicability and strength of the HHT algorithm in analysing health data. Methods Special R functions were developed to implement the HHT algorithm and decompose the daily mortality time series into trend and non-trend components in terms of the underlying physical mechanism. The excess mortality is calculated directly from the resulting non-trend component series. Results The Brisbane (Queensland, Australia) and Chicago (United States) daily mortality time series data were used to calculate the excess mortality associated with heatwaves. The HHT algorithm estimated 62 excess deaths related to the February 2004 Brisbane heatwave. To calculate the excess mortality associated with the July 1995 Chicago heatwave, the HHT algorithm needed to handle the mode-mixing issue; it estimated 510 excess deaths for that event. To exemplify potential applications, the HHT decomposition results were used as the input data for a subsequent regression analysis, using the Brisbane data, to investigate the association between excess mortality and different risk factors. Conclusions The HHT algorithm is a novel and powerful analytical tool for time series data analysis. It has real potential for a wide range of applications in public health research because of its ability to decompose a non-linear and non-stationary time series into trend and non-trend components consistently and efficiently.
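
Conceptually, the excess mortality is the observed daily deaths minus a smooth baseline (the trend component). In the sketch below a centred moving average stands in for the HHT-derived trend, purely for illustration; the window length and the toy mortality series are assumptions, and the paper's R implementation uses the HHT decomposition instead.

    # Illustrative excess-mortality calculation: observed deaths minus a smooth baseline.
    # A centred moving average stands in here for the HHT trend component.
    def moving_average(series, window=31):
        half = window // 2
        out = []
        for i in range(len(series)):
            lo, hi = max(0, i - half), min(len(series), i + half + 1)
            out.append(sum(series[lo:hi]) / (hi - lo))
        return out

    def excess_deaths(daily_deaths, event_days, window=31):
        """Sum of (observed - baseline) over the days of the event."""
        baseline = moving_average(daily_deaths, window)
        return sum(daily_deaths[d] - baseline[d] for d in event_days)

    # Toy data: ~20 deaths/day with a 5-day spike standing in for a heatwave period.
    deaths = [20] * 100 + [28, 33, 35, 30, 26] + [20] * 100
    print(round(excess_deaths(deaths, event_days=range(100, 105)), 1))

The advantage the paper claims for HHT is that its trend is extracted adaptively from the data, so it copes with non-stationary mean levels better than a fixed smoothing window like the one used in this stand-in.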

Relevance:

20.00%

Publisher:

Abstract:

The paper provides a systematic approach to designing the laboratory phase of a multiphase experiment, taking into account previous phases. General principles are outlined for experiments in which orthogonal designs can be employed. Multiphase experiments occur widely, although their multiphase nature is often not recognized. The need to randomize the material produced from the first phase in the laboratory phase is emphasized. Factor-allocation diagrams are used to depict the randomizations in a design, and the use of skeleton analysis-of-variance (ANOVA) tables to evaluate their properties is discussed. The methods are illustrated using a scenario and a case study. A basis for categorizing designs is suggested. This article has supplementary material online.

Relevance:

20.00%

Publisher:

Abstract:

This article explains the new pre-court procedures and additional procedures designed to foster settlement of claims introduced by the Workcover Queensland Act 1996, and the implications of the new provisions for practitioners.

Relevance:

20.00%

Publisher:

Abstract:

We consider the following problem: users in a dynamic group store their encrypted documents on an untrusted server and wish to retrieve documents containing some keywords without any loss of data confidentiality. In this paper, we investigate common secure indices, which allow multiple users in a dynamic group to securely retrieve the encrypted documents shared among the group members without re-encrypting them. We give a formal definition of a common secure index for conjunctive keyword-based retrieval over encrypted data (CSI-CKR), define the security requirement for CSI-CKR, and construct a CSI-CKR based on dynamic accumulators, Paillier's cryptosystem and blind signatures. The security of the proposed scheme is proved under the strong RSA and co-DDH assumptions.
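
To show the shape of a conjunctive keyword query over a shared index (and nothing more), here is a toy sketch in which keyword tags are derived with an HMAC under a shared group key; it deliberately omits the dynamic accumulators, Paillier encryption and blind signatures that the actual CSI-CKR construction relies on, and the group key handling is a placeholder.

    # Toy conjunctive keyword search over a shared document index using HMAC-derived
    # keyword tags. This is a simplified stand-in for illustration only; it is not the
    # paper's CSI-CKR scheme and provides none of its security guarantees.
    import hmac, hashlib

    def tag(key, keyword):
        return hmac.new(key, keyword.encode(), hashlib.sha256).hexdigest()

    def build_index(key, docs):
        """docs: {doc_id: [keywords]} -> {doc_id: set of keyword tags}."""
        return {doc_id: {tag(key, w) for w in words} for doc_id, words in docs.items()}

    def conjunctive_search(index, trapdoors):
        """Return doc ids whose tag sets contain every queried trapdoor."""
        return [doc_id for doc_id, tags in index.items() if set(trapdoors) <= tags]

    group_key = b"shared-group-key"          # placeholder; a real scheme manages member keys
    index = build_index(group_key, {
        "doc1": ["cloud", "storage", "security"],
        "doc2": ["cloud", "pricing"],
    })
    query = [tag(group_key, w) for w in ("cloud", "security")]   # trapdoors for the conjunction
    print(conjunctive_search(index, query))   # -> ['doc1']

In the paper's setting the server never learns the keywords or the documents, and group membership can change without re-encrypting the stored data; the sketch above only conveys how a conjunctive query reduces to a subset test over per-document keyword tags.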