503 results for Combination of classifiers

in Queensland University of Technology - ePrints Archive


Relevance: 100.00%

Abstract:

With the advent of Service Oriented Architecture, Web services have gained tremendous popularity. Because a large number of Web services is available, finding an appropriate Web service that matches the user's requirement is a challenge. This warrants an effective and reliable process of Web service discovery, and a considerable body of research has emerged on methods for improving the accuracy of discovery in matching the best service. The process of Web service discovery suggests many individual services that partially fulfil the user's interest. Considering the semantic relationships of the words used to describe the services, together with their input and output parameters, can lead to accurate Web service discovery; appropriate linking of individually matched services should then fully satisfy the user's requirements. This research integrates a semantic model and a data mining technique to enhance the accuracy of Web service discovery through a novel three-phase methodology. The first phase performs match-making to find semantically similar Web services for a user query. To perform semantic analysis on the content of the Web service description language document, a support-based latent semantic kernel is constructed using an innovative concept of binning and merging over a large collection of text documents covering diverse domains of knowledge. The use of a generic latent semantic kernel constructed from a large number of terms helps to uncover hidden meanings of the query terms that could otherwise not be found. Sometimes a single Web service cannot fully satisfy the user's requirement; in such cases, a composition of multiple inter-related Web services is presented to the user. Checking the possibility of linking multiple Web services is the task of the second phase. Once the feasibility of linking Web services has been checked, the objective is to provide the user with the best composition of Web services. In this link analysis phase, the Web services are modelled as nodes of a graph and an all-pairs shortest-path algorithm is applied to find the optimum path at minimum traversal cost. The third phase, system integration, integrates the results of the preceding two phases using an original fusion algorithm in the fusion engine. Finally, the recommendation engine, an integral part of the system integration phase, makes the final recommendations of individual and composite Web services to the user. To evaluate the performance of the proposed method, extensive experimentation has been performed. Results of the proposed support-based semantic kernel method of Web service discovery are compared with those of a standard keyword-based information-retrieval method and a clustering-based machine-learning method; the proposed method outperforms both. Experimental results and statistical analysis also show that the best Web service compositions are obtained by considering 10 to 15 of the Web services found in phase-I for linking. Empirical results further ascertain that the fusion engine boosts the accuracy of Web service discovery by systematically combining the inputs from the semantic analysis (phase-I) and the link analysis (phase-II). Overall, the accuracy of Web service discovery with the proposed method shows a significant improvement over traditional discovery methods.
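The link analysis phase described above can be sketched as follows. This is a minimal illustration, not the thesis implementation: matched Web services are graph nodes, and the Floyd-Warshall all-pairs shortest-path algorithm finds the minimum-cost composition. The linking costs below are made up for the example.

```python
INF = float("inf")

def floyd_warshall(cost):
    """All-pairs shortest paths; cost[i][j] is the linking cost between
    services i and j (INF if they cannot be linked)."""
    n = len(cost)
    dist = [row[:] for row in cost]
    for k in range(n):                       # allow service k as an intermediate hop
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist

# Hypothetical linking costs between four matched services.
cost = [
    [0,   3,   INF, 9],
    [3,   0,   2,   INF],
    [INF, 2,   0,   2],
    [9,   INF, 2,   0],
]
dist = floyd_warshall(cost)
print(dist[0][3])  # cheapest composition chaining services 0 -> 1 -> 2 -> 3
```

The composition 0 → 1 → 2 → 3 (cost 7) beats the direct link (cost 9), illustrating why linking several partially matching services can outperform any single match.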

Relevance: 100.00%

Abstract:

For certain continuum problems it is desirable and beneficial to combine two different methods so as to exploit their advantages while evading their disadvantages. In this paper, a bridging transition algorithm is developed for combining the meshfree method (MM) with the finite element method (FEM). In this coupled method, the meshfree method is used in the sub-domains where high accuracy is required, and the finite element method is employed in the other sub-domains to improve computational efficiency. The MM domain and the FEM domain are connected by a transition (bridging) region. A modified variational formulation and the Lagrange multiplier method are used to ensure the compatibility of displacements and their gradients. To improve computational efficiency and reduce meshing cost in the transition region, regularly distributed transition particles, independent of both the meshfree nodes and the FE nodes, can be inserted into the transition region. The newly developed coupled method is applied to the stress analysis of 2D solids and structures in order to investigate its performance and study its parameters. Numerical results show that the present coupled method is convergent, accurate and stable. It has promising potential for practical applications because it takes advantage of both the meshfree method and FEM while overcoming their shortcomings.
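The abstract does not reproduce the paper's modified variational formulation. As a hedged illustration only (the symbols here are generic, not the authors' notation), a Lagrange multiplier coupling of two sub-domain functionals over a transition interface $\Gamma_t$ takes the form:

```latex
\Pi = \Pi_{\mathrm{MM}}\left(u^{\mathrm{MM}}\right)
    + \Pi_{\mathrm{FEM}}\left(u^{\mathrm{FEM}}\right)
    + \int_{\Gamma_t} \boldsymbol{\lambda} \cdot
      \left(u^{\mathrm{MM}} - u^{\mathrm{FEM}}\right) \, \mathrm{d}\Gamma
```

Setting the variation of $\Pi$ with respect to the multiplier field $\boldsymbol{\lambda}$ to zero enforces displacement compatibility $u^{\mathrm{MM}} = u^{\mathrm{FEM}}$ on the interface, which is the role the abstract assigns to the Lagrange multiplier method.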

Relevance: 100.00%

Abstract:

This paper suggests an approach for finding an appropriate combination of the various parameters used to extract texture features (e.g. the choice of spectral band for extracting the texture feature, the size of the moving window, the quantization level of the image, and the choice of texture feature) for use in the classification process. The gray level co-occurrence matrix (GLCM) method is used to extract texture from a remotely sensed satellite image. Results of the classification of an Indian urban environment using a spatial property (texture), derived from spectral and multi-resolution wavelet-decomposed images, are also reported. A multivariate data analysis technique called conjoint analysis is used to analyze the relative importance of these parameters. Results indicate that the choice of texture feature and the window size have higher relative importance in the classification process than the quantization level or the choice of image band for extracting the texture feature. For texture features derived from wavelet-decomposed images, the decomposition level has almost the same relative importance as the size of the moving window, and decomposition up to level one is sufficient, with no need for further decomposition. It was also observed that incorporating texture features improves the overall classification accuracy in a statistically significant manner compared with purely spectral classification.
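To make the parameters under study concrete, here is a minimal sketch (not the paper's code) of a GLCM for a single horizontal one-pixel offset, plus the contrast feature derived from it. Window size, offset, quantization level and the choice of derived feature are exactly the knobs whose relative importance the conjoint analysis ranks; the toy window below is invented.

```python
import numpy as np

def glcm(image, levels):
    """Joint probabilities of gray-level pairs (i, j) one pixel apart horizontally."""
    m = np.zeros((levels, levels), dtype=float)
    for row in image:
        for a, b in zip(row[:-1], row[1:]):  # horizontally adjacent pixel pairs
            m[a, b] += 1
    m /= m.sum()                             # normalise counts to probabilities
    return m

def contrast(m):
    """GLCM contrast feature: sum over (i - j)^2 * p(i, j)."""
    i, j = np.indices(m.shape)
    return float(((i - j) ** 2 * m).sum())

# Toy 4x4 window already quantized to 4 gray levels.
window = np.array([[0, 0, 1, 1],
                   [0, 0, 1, 1],
                   [2, 2, 3, 3],
                   [2, 2, 3, 3]])
print(contrast(glcm(window, 4)))
```

Changing the quantization level or the offset direction changes the matrix, and hence the feature value, which is why those choices must be tuned jointly rather than in isolation.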

Relevance: 100.00%

Abstract:

An algorithm that combines Kalman filter and Least Error Square (LES) techniques is proposed in this paper. The algorithm estimates signal attributes such as amplitude, frequency and phase angle online. The technique can be used in protection relays, digital AVRs, DGs, DSTATCOMs, FACTS devices and other power electronics applications. The Kalman filter is modified to operate on a fictitious input signal and provides precise estimates that are insensitive to noise and other disturbances. At the same time, the LES system is arranged to operate in critical transient cases to compensate for the delay and inaccuracy arising from the response of the standard Kalman filter. Practical considerations such as the effects of noise and higher-order harmonics, and the computational issues of the algorithm, are considered and tested in the paper. Several computer simulations and a laboratory test are presented to highlight the usefulness of the proposed method. Simulation results show that the proposed technique can simultaneously estimate the signal attributes even when the signal is highly distorted by non-linear loads and noise.

Relevance: 100.00%

Abstract:

We hypothesised that a potentially disease-modifying osteoarthritis (OA) drug such as hyaluronic acid (HA), given in combination with an anti-inflammatory signalling agent such as the mitogen-activated protein kinase kinase–extracellular signal-regulated kinase (MEK-ERK) signalling inhibitor U0126, could have additive or synergistic effects in preventing the degeneration of articular cartilage. Chondrocyte differentiation and hypertrophy were evaluated in human OA primary cells treated with HA, U0126, or the combination HA + U0126. Cartilage degeneration in a meniscectomy (MSX)-induced rat OA model was investigated by intra-articular delivery of HA, U0126, or the combination HA + U0126. Histology, immunostaining, RT-qPCR, Western blotting and zymography were performed to assess the expression of cartilage matrix proteins and hypertrophic markers. Phosphorylated ERK (pERK)1/2-positive chondrocytes were significantly more numerous in OA samples than in healthy controls, suggesting a pathological role for this pathway in OA. HA + U0126 significantly reduced the levels of pERK, chondrocyte hypertrophic markers (COL10 and RUNX2) and degenerative markers (ADAMTS5 and MMP-13), while increasing the levels of the chondrogenic marker COL2, compared with no treatment or with HA or U0126 alone. In agreement with the in vitro results, intra-articular delivery of HA + U0126 showed significant therapeutic improvement of cartilage in the rat MSX OA model compared with no treatment or with HA or U0126 alone. Our study suggests that the combination of HA and MEK-ERK inhibition has a synergistic effect in preventing cartilage degeneration.

Relevance: 100.00%

Abstract:

Nitrogen balance is increasingly used as an indicator of the environmental performance of the agricultural sector in national, international, and global contexts. There are three main methods for accounting a national nitrogen balance: farm gate, soil surface, and soil system. OECD (2008) recently reported the nitrogen and phosphorus balances of member countries for the 1985-2004 period using the soil surface method, while the farm gate and soil system methods have been used in some international projects. Studies comparing these methods reach mixed conclusions. The motivation of the present paper is to combine the three methods to provide a more detailed auditing of the nitrogen balance and flows of national agricultural production. In addition, the paper provides a new strategy for using reliable international and national data sources to calculate the nitrogen balance with the farm gate method. The empirical study focuses on the nitrogen balance of OECD countries from 1985 to 2003. The N surplus that OECD countries sent to the environment surged dramatically in the early 1980s, gradually decreased during the 1990s, but exhibited an increasing trend in the early 2000s. The overall N efficiency, however, fluctuated without a clear increasing trend. The eco-environmental ranking shows that Australia and Ireland performed worst while Korea and Greece performed best.
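The farm gate accounting identity underlying such balances can be sketched in a few lines: surplus is nitrogen entering the farm minus nitrogen leaving in products, and efficiency is the output/input ratio. The categories and per-hectare figures below are invented for illustration, not the paper's data.

```python
def n_balance(inputs_kg, outputs_kg):
    """Farm gate N balance: surplus (kg N/ha) and output/input efficiency."""
    total_in = sum(inputs_kg.values())
    total_out = sum(outputs_kg.values())
    return total_in - total_out, total_out / total_in

# Hypothetical farm gate flows (kg N per hectare per year).
inputs_kg = {"fertilizer": 120.0, "purchased_feed": 40.0, "biological_fixation": 25.0}
outputs_kg = {"crop_products": 70.0, "animal_products": 35.0}

surplus, eff = n_balance(inputs_kg, outputs_kg)
print(surplus, eff)  # surplus sent to the environment, and N use efficiency
```

A falling surplus with a flat efficiency, as the paper reports for the OECD aggregate, simply means inputs and outputs shrank roughly in proportion.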

Relevance: 100.00%

Abstract:

Trastuzumab is a humanised monoclonal antibody against the extracellular domain of HER2 (human epidermal growth factor receptor-2), which is overexpressed in about 25% of human breast cancers. It has shown clinical benefit in HER2-positive breast cancer when used alone or in combination with chemotherapy, and it increases the response rate to chemotherapy and prolongs survival when combined with taxanes. In this article, we review the clinical trials in which trastuzumab has been administered together with docetaxel, and we present the results of the trastuzumab expanded access programme (EAP) in the UK. Combining trastuzumab with docetaxel yields response rates and times to progression similar to those of trastuzumab/paclitaxel combinations. The toxicity of the combination and the risk of heart failure are low. The clinical data for the docetaxel/trastuzumab combination indicate a favourable profile in terms of both efficacy and safety, and confirm the feasibility and safety of trastuzumab administration both as monotherapy and in combination with docetaxel. © 2004 Blackwell Publishing Ltd.

Relevance: 100.00%

Abstract:

Diagnostics is based on characterizing the condition of a mechanical system and allows early detection of a possible fault. Signal processing is widely used in diagnostics, since it directly characterizes the state of the system. Several types of advanced signal processing techniques have been proposed in recent decades alongside more conventional ones, yet these techniques are seldom able to handle non-stationary operating conditions. The diagnostics of roller bearings is no exception. In this paper, a new vibration signal processing tool, able to perform roller bearing diagnostics under any working condition and noise level, is developed on the basis of two data-adaptive techniques, Empirical Mode Decomposition (EMD) and Minimum Entropy Deconvolution (MED), coupled through the mathematics of the Hilbert transform. The effectiveness of the new signal processing tool is proven on experimental data measured in a test rig that employs high-power, industrial-size components.
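The Hilbert-transform step that couples the two techniques can be illustrated in isolation: the envelope spectrum of an amplitude-modulated vibration signal exposes the modulation (fault) frequency hidden under a high-frequency carrier. The EMD/MED preprocessing stages are not reproduced here, and the fault and resonance frequencies below are invented.

```python
import numpy as np
from scipy.signal import hilbert

fs = 2000.0                                # sample rate (Hz)
t = np.arange(0, 1.0, 1 / fs)
f_fault, f_carrier = 20.0, 200.0           # hypothetical fault and resonance freqs
# Bearing-like signal: carrier at a structural resonance, amplitude-modulated
# at the fault repetition frequency.
x = (1 + 0.5 * np.cos(2 * np.pi * f_fault * t)) * np.cos(2 * np.pi * f_carrier * t)

envelope = np.abs(hilbert(x))              # magnitude of the analytic signal
spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))  # drop the DC term
freqs = np.fft.rfftfreq(len(envelope), 1 / fs)
print(freqs[spectrum.argmax()])            # dominant envelope frequency
```

The spectrum of the raw signal peaks at the 200 Hz carrier, while the envelope spectrum peaks at the 20 Hz modulation, which is why envelope analysis, rather than a plain FFT, is the standard readout after deconvolution and mode decomposition.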

Relevance: 100.00%

Abstract:

Long-term autonomy in robotics requires perception systems that are resilient to unusual but realistic conditions that will eventually occur during extended missions. For example, unmanned ground vehicles (UGVs) need to be capable of operating safely in adverse and low-visibility conditions, such as at night or in the presence of smoke. The key to a resilient UGV perception system lies in the use of multiple sensor modalities, e.g., operating at different frequencies of the electromagnetic spectrum, to compensate for the limitations of a single sensor type. In this paper, visual and infrared imaging are combined in a Visual-SLAM algorithm to achieve localization. We propose to evaluate the quality of data provided by each sensor modality prior to data combination. This evaluation is used to discard low-quality data, i.e., data most likely to induce large localization errors. In this way, perceptual failures are anticipated and mitigated. An extensive experimental evaluation is conducted on data sets collected with a UGV in a range of environments and adverse conditions, including the presence of smoke (obstructing the visual camera), fire, extreme heat (saturating the infrared camera), low-light conditions (dusk), and at night with sudden variations of artificial light. A total of 240 trajectory estimates are obtained using five different variations of data sources and data combination strategies in the localization method. In particular, the proposed approach for selective data combination is compared to methods using a single sensor type or combining both modalities without preselection. We show that the proposed framework allows for camera-based localization resilient to a large range of low-visibility conditions.
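The selective-combination idea described above can be sketched in a few lines: each modality's estimate carries a quality score, estimates below a threshold are discarded rather than fused, and the rest are combined. The scoring function, threshold and one-dimensional poses below are invented for illustration, not the paper's method.

```python
def select_and_fuse(estimates, threshold=0.5):
    """estimates: list of (pose, quality) pairs, one per sensor modality.
    Discards low-quality data, then returns a quality-weighted mean pose,
    or None when no modality is trustworthy (anticipated perceptual failure)."""
    kept = [(pose, q) for pose, q in estimates if q >= threshold]
    if not kept:
        return None
    total = sum(q for _, q in kept)
    return sum(pose * q for pose, q in kept) / total

# Visual camera obscured by smoke (low score); infrared still reliable.
visual = (10.0, 0.2)     # (1-D pose estimate, quality score)
infrared = (10.4, 0.9)
print(select_and_fuse([visual, infrared]))
```

The point of the pre-selection is visible here: without it, the smoke-degraded visual estimate would pull the fused pose toward a large localization error instead of being discarded.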

Relevance: 100.00%

Abstract:

The combination of thermally- and photochemically-induced polymerization using light sensitive alkoxyamines was investigated. The thermally driven polymerizations were performed via the cleavage of the alkoxyamine functionality, whereas the photochemically-induced polymerizations were carried out either by nitroxide mediated photo-polymerization (NMP2) or by a classical type II mechanism, depending on the structure of the light-sensitive alkoxyamine employed. Once the potential of the various structures as initiators of thermally- and photo-induced polymerizations was established, their use in combination for block copolymer syntheses was investigated. With each alkoxyamine investigated, block copolymers were successfully obtained and the system was applied to the post-modification of polymer coatings for application in patterning and photografting.

Relevance: 100.00%

Abstract:

Background: The expression of biomass-degrading enzymes (such as cellobiohydrolases) in transgenic plants has the potential to reduce the cost of biomass saccharification by providing a source of enzymes to supplement commercial cellulase mixtures, in which cellobiohydrolases are the main enzymes. In the present study, a cellobiohydrolase was expressed in transgenic corn stover leaf and assessed as an additive for two commercial cellulase mixtures in the saccharification of sugar cane bagasse pretreated by different processes.

Results: Recombinant cellobiohydrolase was extracted from the senescent leaves of transgenic corn using a simple buffer with no concentration step. The extract enhanced the performance of Celluclast 1.5 L (a commercial cellulase mixture) by up to fourfold, relative to the commercial mixture on its own, on sugar cane bagasse pretreated at the pilot scale by a dilute sulfuric acid steam explosion process. The extracts also enhanced the performance of Cellic CTec2 (a commercial cellulase mixture) by up to fourfold on a range of residues from sugar cane bagasse pretreated at the laboratory scale (using acidified ethylene carbonate/ethylene glycol, 1-butyl-3-methylimidazolium chloride, and ball-milling) and at the pilot scale (dilute sodium hydroxide and glycerol/hydrochloric acid steam explosion). Using tap water as the solvent, under conditions that mimic an industrial process, we extracted about 90% of the recombinant cellobiohydrolase from senescent transgenic corn stover leaf with minimal tissue disruption.

Conclusions: The accumulation of recombinant cellobiohydrolase in senescent transgenic corn stover leaf is a viable strategy for reducing the saccharification cost of producing fermentable sugars from pretreated biomass. We envisage an industrial-scale process in which transgenic plants provide both the fibre and the biomass-degrading enzymes for pretreatment and enzymatic hydrolysis, respectively.

Relevance: 100.00%

Abstract:

This study reports on an original concept of additive manufacturing for the fabrication of tissue engineered constructs (TECs), offering the possibility of concomitantly manufacturing a customized scaffold and a bioreactor chamber of any size and shape. As a proof of concept towards the development of anatomically relevant TECs, the approach was used to design and fabricate a highly porous sheep tibia scaffold around which a bioreactor chamber of similar shape was simultaneously built. The morphology of the bioreactor/scaffold device was investigated by micro-computed tomography and scanning electron microscopy, confirming the porous architecture of the tibia scaffold as opposed to the non-porous nature of the bioreactor chamber. Additionally, this study demonstrates that both the shape and the inner architecture of the device can significantly impact the perfusion of fluid within the scaffold. Fluid flow modelling revealed that these factors are significant in controlling the nutrient flow pattern within the scaffold and the bioreactor chamber, avoiding the formation of stagnant flow regions detrimental to in vitro tissue development. The bioreactor/scaffold device was dynamically seeded with human primary osteoblasts and cultured under bi-directional perfusion for two and six weeks. Primary human osteoblasts were observed homogeneously distributed throughout the scaffold and remained viable over the six-week culture period. This work demonstrates a novel application of additive manufacturing in the development of scaffolds and bioreactors. Given the intrinsic flexibility of the additive manufacturing technology platform developed, more complex culture systems can be fabricated, contributing to advances in customized and patient-specific tissue engineering strategies for a wide range of applications.

Relevance: 100.00%

Abstract:

Adoptive T cell therapy uses the specificity of the adaptive immune system to target cancer and virally infected cells. Yet the mechanisms by which T cell function can be enhanced are incompletely described, especially in the skin. In this study, we use a murine model of immunotherapy to optimize cell-mediated immunity in the skin. We show that in vitro-derived central, but not effector, memory-like T cells bring about rapid regression of skin expressing cognate Ag as a transgene in keratinocytes. Local inflammation induced by the TLR7 agonist imiquimod subtly yet reproducibly decreases the time to skin graft rejection elicited by central, but not effector, memory T cells in an immunodeficient mouse model. Local CCL4, a chemokine liberated by TLR7 agonism, similarly enhances central memory T cell function. In this model, IL-2 facilitates the in vivo development of effector function from central, but not effector, memory T cells. In a model of T cell tolerogenesis, we further show that adoptively transferred central, but not effector, memory T cells can give rise to successful cutaneous immunity, which depends on a local inflammatory cue in the target tissue at the time of adoptive T cell transfer. Thus, the efficacy of adoptive T cell therapy can be enhanced when CD8+ T cells with a central memory phenotype are transferred and IL-2 is present with contemporaneous local inflammation. Copyright © 2012 by The American Association of Immunologists, Inc.

Relevance: 100.00%

Abstract:

Forecasting volatility has received a great deal of research attention, with the relative performances of econometric-model-based and option-implied volatility forecasts often being compared. While many studies find that implied volatility is the preferred approach, a number of issues remain unresolved, including the relative merit of combining forecasts and whether the performance differences between forecasts are statistically significant. Utilising recent econometric advances, this paper considers whether combination forecasts of S&P 500 volatility are statistically superior to a wide range of model-based forecasts and to implied volatility. A combination of model-based forecasts is found to be the dominant approach, indicating that implied volatility cannot simply be viewed as a combination of various model-based forecasts. Therefore, while often viewed as a superior volatility forecast, implied volatility is in fact an inferior forecast of S&P 500 volatility relative to model-based forecasts.
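The basic mechanics of forecast combination can be sketched with synthetic data (none of this reproduces the paper's models or S&P 500 data): several noisy "model-based" forecasts are averaged, and by the per-period convexity of squared error, the equal-weight combination's mean squared error never exceeds the average MSE of its components.

```python
import numpy as np

rng = np.random.default_rng(0)
# Made-up "true" volatility path for illustration.
true_vol = 0.2 + 0.05 * np.sin(np.linspace(0, 6, 250))

# Three hypothetical model-based forecasts: truth plus distinct noise and bias.
forecasts = [true_vol + rng.normal(0.0, s, true_vol.size) + b
             for s, b in [(0.02, 0.0), (0.03, 0.01), (0.04, -0.01)]]

combo = np.mean(forecasts, axis=0)         # equal-weight combination forecast

def mse(f):
    """Mean squared error of a forecast path against the true path."""
    return float(np.mean((f - true_vol) ** 2))

print(mse(combo), [mse(f) for f in forecasts])
```

The guarantee is only against the average of the components' MSEs, not against the single best component, which is why studies such as this one still need formal tests to rank the combination against individual forecasts.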