949 results for Biologically optimal dose combination


Relevance: 20.00%

Publisher:

Abstract:

The main aim of radiotherapy is to deliver a dose of radiation that is high enough to destroy the tumour cells while at the same time minimising the damage to normal healthy tissues. Clinically, this has been achieved by assigning a prescription dose to the tumour volume and a set of dose constraints on critical structures. Once an optimal treatment plan has been achieved the dosimetry is assessed using the physical parameters of dose and volume. There has been an interest in using radiobiological parameters to evaluate and predict the outcome of a treatment plan in terms of both a tumour control probability (TCP) and a normal tissue complication probability (NTCP). In this study, simple radiobiological models that are available in a commercial treatment planning system were used to compare three dimensional conformal radiotherapy treatments (3D-CRT) and intensity modulated radiotherapy (IMRT) treatments of the prostate. Initially both 3D-CRT and IMRT were planned for 2 Gy/fraction to a total dose of 60 Gy to the prostate. The sensitivity of the TCP and the NTCP to both conventional dose escalation and hypo-fractionation was investigated. The biological responses were calculated using the Källman S-model. The complication free tumour control probability (P+) is generated from the combined NTCP and TCP response values. It has been suggested that the alpha/beta ratio for prostate carcinoma cells may be lower than for most other tumour cell types. The effect of this on the modelled biological response for the different fractionation schedules was also investigated.
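The radiobiological comparison described above can be pictured with a minimal sketch, assuming a Poisson/linear-quadratic TCP and the common approximation P+ = TCP × (1 − NTCP) for independent responses; the alpha, beta and clonogen numbers below are illustrative placeholders, not values from the study.

```python
import math

def surviving_fraction(dose_per_fraction, n_fractions, alpha, beta):
    # Linear-quadratic cell survival, accumulated over the fractionation schedule.
    return math.exp(-n_fractions * (alpha * dose_per_fraction
                                    + beta * dose_per_fraction ** 2))

def tcp_poisson(n_clonogens, sf):
    # Poisson TCP: probability that no clonogenic cell survives.
    return math.exp(-n_clonogens * sf)

def p_plus(tcp, ntcp):
    # Complication-free tumour control, assuming independent responses.
    return tcp * (1.0 - ntcp)

# Illustrative comparison of a conventional and a hypofractionated schedule
# for a low alpha/beta ratio (alpha/beta = 1.5 Gy, as suggested for prostate).
alpha, beta = 0.15, 0.10          # Gy^-1, Gy^-2 (hypothetical values)
n0 = 1e7                          # hypothetical clonogen number

tcp_conv = tcp_poisson(n0, surviving_fraction(2.0, 30, alpha, beta))  # 30 x 2 Gy
tcp_hypo = tcp_poisson(n0, surviving_fraction(3.0, 20, alpha, beta))  # 20 x 3 Gy
```

With a low alpha/beta ratio, the larger dose per fraction of the hypofractionated schedule yields the higher modelled TCP, which is the qualitative effect the study investigates.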


With the advent of Service Oriented Architecture, Web services have gained tremendous popularity. Due to the availability of a large number of Web services, finding an appropriate Web service that matches the user’s requirements is a challenge. This warrants the need to establish an effective and reliable process of Web service discovery. A considerable body of research has emerged to develop methods that improve the accuracy of Web service discovery and match the best service. The process of Web service discovery results in suggesting many individual services that partially fulfil the user’s interest. Considering the semantic relationships of the words used to describe the services, as well as their input and output parameters, can lead to accurate Web service discovery. Appropriate linking of individual matched services should then fully satisfy the requirements the user is looking for. This research proposes to integrate a semantic model and a data mining technique to enhance the accuracy of Web service discovery. A novel three-phase Web service discovery methodology is proposed. The first phase performs match-making to find semantically similar Web services for a user query. To perform semantic analysis on the content of the Web Service Description Language document, a support-based latent semantic kernel is constructed using an innovative concept of binning and merging over a large collection of text documents covering diverse domains of knowledge. The use of a generic latent semantic kernel constructed with a large number of terms helps to find hidden meanings of the query terms that could not otherwise be found. Sometimes a single Web service is unable to fully satisfy the user’s requirement. In such cases, a composition of multiple inter-related Web services is presented to the user. The task of checking the feasibility of linking multiple Web services is done in the second phase.
Once the feasibility of linking Web services has been checked, the objective is to provide the user with the best composition of Web services. In the link-analysis phase, the Web services are modelled as nodes of a graph and an all-pairs shortest-path algorithm is applied to find the optimal path with the minimum traversal cost. The third phase, system integration, combines the results of the preceding two phases using an original fusion algorithm in the fusion engine. Finally, the recommendation engine, an integral part of the system integration phase, makes the final recommendations, including individual and composite Web services, to the user. To evaluate the performance of the proposed method, extensive experimentation has been performed. Results of the proposed support-based semantic-kernel method of Web service discovery are compared with those of a standard keyword-based information-retrieval method and a clustering-based machine-learning method of Web service discovery. The proposed method outperforms both the information-retrieval and machine-learning based methods. Experimental results and statistical analysis also show that the best Web service compositions are obtained by considering 10 to 15 of the Web services found in phase I for linking. Empirical results also show that the fusion engine boosts the accuracy of Web service discovery by systematically combining the inputs from the semantic analysis (phase I) and the link analysis (phase II). Overall, the accuracy of Web service discovery with the proposed method shows a significant improvement over traditional discovery methods.
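The link-analysis step can be sketched with a standard all-pairs shortest-path algorithm such as Floyd-Warshall over a service graph; the four-service cost matrix below is hypothetical, not from the thesis.

```python
INF = float("inf")

def floyd_warshall(cost):
    # All-pairs shortest paths over a cost matrix, where cost[i][j] is the
    # traversal cost of linking service i's output to service j's input.
    n = len(cost)
    dist = [row[:] for row in cost]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist

# Hypothetical 4-service graph: INF means the two services cannot be linked.
cost = [
    [0,   2,   INF, INF],
    [INF, 0,   1,   7],
    [INF, INF, 0,   3],
    [INF, INF, INF, 0],
]
dist = floyd_warshall(cost)
# The cheapest composition from service 0 to service 3 goes via services 1 and 2.
```

Running every pair at once suits the composition setting, since the cheapest chain between any two matched services is needed before the fusion engine ranks candidates.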


For certain continuum problems, it is desirable and beneficial to combine two different methods in order to exploit their advantages while evading their disadvantages. In this paper, a bridging transition algorithm is developed for combining the meshfree method (MM) with the finite element method (FEM). In this coupled method, the meshfree method is used in the sub-domains where high accuracy is required, and the finite element method is employed in the other sub-domains to improve computational efficiency. The MM domain and the FEM domain are connected by a transition (bridging) region. A modified variational formulation and the Lagrange multiplier method are used to ensure the compatibility of displacements and their gradients. To improve the computational efficiency and reduce the meshing cost in the transition region, regularly distributed transition particles, which are independent of both the meshfree nodes and the FE nodes, can be inserted into the transition region. The newly developed coupled method is applied to the stress analysis of 2D solids and structures in order to investigate its performance and study its parameters. Numerical results show that the present coupled method is convergent, accurate and stable. The coupled method has promising potential for practical applications, because it takes advantage of both the meshfree method and the FEM while overcoming their shortcomings.
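The role of the transition region can be pictured as a ramp weight that blends the two approximations between the MM and FEM sub-domains. This is a minimal 1D sketch of that idea only, not the paper's modified variational formulation with Lagrange multipliers, and the interval bounds are hypothetical.

```python
def ramp(x, x_start, x_end):
    # Blending weight: 1 inside the meshfree sub-domain, 0 inside the FE
    # sub-domain, and linear across the transition (bridging) region.
    if x <= x_start:
        return 1.0
    if x >= x_end:
        return 0.0
    return (x_end - x) / (x_end - x_start)

def blended_field(u_mm, u_fem, x, x_start=0.4, x_end=0.6):
    # Combined approximation u = w * u_MM + (1 - w) * u_FEM, so the field
    # passes smoothly from the meshfree solution to the FE solution.
    w = ramp(x, x_start, x_end)
    return w * u_mm(x) + (1.0 - w) * u_fem(x)
```

In the actual method the compatibility of displacements and gradients across this region is enforced variationally rather than by simple averaging; the ramp only illustrates why a transition zone, rather than a sharp interface, is used.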


This paper reports the initial steps of research on the planning of rural MV and LV networks. Two different cases are studied. In the first case, 100 loads are distributed uniformly along a 100 km transmission line in a distribution network; in the second case, the load structure becomes closer to the rural situation. In case 2, 21 loads are located in a distribution system with increasing spacing between them (the distance between loads 1 and 2 is 3 km, between loads 2 and 3 is 6 km, and so on). These two models to some extent represent the distribution systems in urban and rural areas, respectively. The objective function for the design of the optimal system consists of three main parts: the cost of transformers, and the costs of MV and LV conductors. The bus voltage is expressed as a constraint and should be maintained within a standard level, rising or falling by no more than 5%.
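The rural load layout and the voltage constraint described above can be sketched as follows; the nominal voltage used in the example is a placeholder, not a value from the paper.

```python
def load_positions(n_loads=21, base_km=3.0):
    # Cumulative positions for the rural case: the spacing grows by base_km
    # per span (3 km, 6 km, 9 km, ...), with load 1 at the feeder start.
    positions = [0.0]
    for span in range(1, n_loads):
        positions.append(positions[-1] + base_km * span)
    return positions

def voltage_ok(v_bus, v_nominal, tolerance=0.05):
    # Bus-voltage constraint: within +/-5% of the nominal level.
    return abs(v_bus - v_nominal) <= tolerance * v_nominal

positions = load_positions()
feeder_length = positions[-1]   # total length of the rural feeder
```

The growing spans give a 630 km feeder for 21 loads, which is what makes the rural case markedly different from the uniform 100 km urban case when trading off transformer and conductor costs.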


As the problems involving infrastructure delivery have become more complex and contentious, there has been an acknowledgement that these problems cannot be resolved by any one body working alone. This understanding has driven multi-sectoral collaboration and has led to an expansion of the set of actors, including stakeholders, who are now involved in delivery of infrastructure projects and services. However, more needs to be understood about how to include stakeholders in these processes and the optimal ways of developing the requisite combination of stakeholders to achieve effective outcomes. This thesis draws on stakeholder theory and governance network theory to obtain insights into how three networks delivering public outcomes within the Roads Alliance in Queensland engage with stakeholders in the delivery of complex and sensitive infrastructure services and projects. New knowledge about stakeholders will be obtained by testing a model of Stakeholder Salience and Engagement which combines and extends the stakeholder identification and salience theory (Mitchell, Agle, and Wood, 1997), ladder of stakeholder management and engagement (Friedman and Miles, 2006) and the model of stakeholder engagement and moral treatment of stakeholders (Greenwood, 2007). By applying this model, the broad research question: “Who or what decides how stakeholders are optimally engaged by governance networks delivering public outcomes?” will be addressed. The case studies will test a theoretical model of stakeholder salience and engagement which links strategic management decisions about stakeholder salience with the quality and quantity of engagement strategies for engaging different types of stakeholders. The outcomes of this research will contribute to and extend stakeholder theory by showing how stakeholder salience impacts on decisions about the types of engagement processes implemented. 
Governance network theory will be extended by showing how governance networks interact with stakeholders through the concepts of stakeholder salience and engagement. From a practical perspective this research will provide governance networks with an indication of how to optimise engagement with different types of stakeholders.


The project has further developed two programs for the industry partners, related to service life prediction and salt deposition. The program for the Queensland Department of Main Roads, which predicts salt deposition on different bridge structures at any point in Queensland, has been further refined by considering more variables. It was found that the height of the bridge significantly affects the salt deposition levels only very close to the coast. The effect of the natural cleaning of salt by rainfall, however, was incorporated into the program. The user interface allows selection of a location in Queensland, followed by a bridge component; the program then predicts the annual salt deposition rate and rates the likely severity of the environment. The service life prediction program for the Queensland Department of Public Works has been expanded to include 10 common building components in a variety of environments. Data mining procedures have been used to develop the program and increase the usefulness of the application. A Query Based Learning System (QBLS) has been developed, based on a data-centric model with extensions to support user interaction. The program is based on a number of sources of information about the service life of building components. These include the Delphi survey, the CSIRO Holistic model and a school survey. During the project, the Holistic model was modified for each building component and databases were generated for the locations of all Queensland schools. Experiments were carried out to verify the modelling and provide parameters for it. These included the instrumentation of a downpipe, measurements of pH and chloride levels in leaf litter, EIS measurements, chromate leaching from Colorbond materials, and dose tests to measure the corrosion rates of new materials. A further database was also generated for inclusion in the program through a large school survey.
Over 30 schools in a range of environments, from tropical coastal to temperate inland, were visited and the condition of the building components was rated on a scale of 0-5. The data were analysed and used to calculate an average service life for each component/material combination in each environment where sufficient examples were available.


In this paper, the placement of sectionalizers, as well as a cross-connection, is optimally determined so that an objective function is minimized. The objective function employed in this paper consists of two main parts: the switch cost and the reliability cost. The switch cost is composed of the cost of the sectionalizers and the cross-connection, and the reliability cost is assumed to be proportional to a reliability index, SAIDI. To model the sectionalizer and cross-connection allocation problem realistically, the cost related to each element is treated as discrete. Because the availability of each sectionalizer is represented by a binary variable, the problem is highly discrete; the risk of becoming trapped in a local minimum is therefore high, and a heuristic optimization method is needed. A Discrete Particle Swarm Optimization (DPSO) algorithm is employed in this paper to deal with this discrete problem. Finally, a test distribution system is used to validate the proposed method.
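A single DPSO update can be sketched as below, assuming the common sigmoid mapping from real-valued velocities to binary positions; the inertia and acceleration coefficients are typical textbook values, not necessarily those used in the paper.

```python
import math
import random

def dpso_step(position, velocity, p_best, g_best,
              w=0.7, c1=1.5, c2=1.5, v_max=4.0):
    # One discrete-PSO update: velocities stay real-valued, positions are
    # binary. Each bit flags whether a sectionalizer is installed at that
    # candidate location.
    new_pos, new_vel = [], []
    for x, v, pb, gb in zip(position, velocity, p_best, g_best):
        v = (w * v
             + c1 * random.random() * (pb - x)
             + c2 * random.random() * (gb - x))
        v = max(-v_max, min(v_max, v))          # clamp the velocity
        prob = 1.0 / (1.0 + math.exp(-v))       # sigmoid -> P(bit = 1)
        new_pos.append(1 if random.random() < prob else 0)
        new_vel.append(v)
    return new_pos, new_vel
```

Each particle's bit vector would be scored by the combined switch-plus-SAIDI cost, with p_best and g_best tracking the cheapest layouts seen so far.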


This work aims to take advantage of recent developments in joint factor analysis (JFA) in the context of a phonetically conditioned GMM speaker verification system. Previous work has shown performance advantages through phonetic conditioning, but this has not been shown to date with the JFA framework. Our focus is particularly on strategies for combining the phone-conditioned systems. We show that the classic fusion of the scores is suboptimal when using multiple GMM systems. We investigate several combination strategies in the model space, and demonstrate improvement over score-level combination as well as over a non-phonetic baseline system. This work was conducted during the 2008 CLSP Workshop at Johns Hopkins University.
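The score-level fusion that serves as the baseline for the model-space strategies can be sketched as a weighted sum over the phone-conditioned subsystems; the scores and weights below are hypothetical, and in practice the weights would be tuned on a development set.

```python
def fuse_scores(scores, weights):
    # Linear score-level fusion of phone-conditioned subsystem scores:
    # a single verification score formed as a weighted sum.
    assert len(scores) == len(weights)
    return sum(w * s for w, s in zip(weights, scores))

# Hypothetical log-likelihood-ratio scores from three phone-class systems.
fused = fuse_scores([1.2, -0.4, 0.8], [0.5, 0.2, 0.3])
```

The finding reported above is that combining the phone-conditioned systems in the model space outperforms this kind of late, score-level combination.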


Water-filled portable road safety barriers are a common fixture in road works; however, their use of water can be problematic, both in terms of the quantity of water used and the transportation of the water to the installation site. This project aims to develop a new design of portable road safety barrier which will make novel use of composite and foam materials in order to reduce the barrier’s reliance on water for controlling errant vehicles. The project makes use of finite element (FE) techniques to simulate and evaluate design concepts. FE methods and models that have previously been tested and validated will be used in combination to provide the most accurate numerical simulations available to drive the project forward. LS-DYNA is a highly non-linear, dynamic numerical solver which is commonly used in the automotive and road safety industries. Several complex materials and physical interactions are to be simulated throughout the course of the project, including aluminium foams, composite laminates and the water within the barrier during standardised impact tests. Techniques to be used include FE, smoothed particle hydrodynamics (SPH) and weighted multi-parameter optimisation. A detailed optimisation of several design parameters against specific design goals will be performed with LS-DYNA and LS-OPT, which will require a large number of high-accuracy simulations and advanced visualisation techniques. Supercomputing will play a central role in the project, enabling the numerous medium-element-count simulations necessary to determine the optimal design parameters of the barrier. Supercomputing will also allow the development of useful methods of visualising results and the production of highly detailed simulations for end-product validation purposes.
Efforts thus far have been directed towards integrating the various numerical methods (including FEM, SPH and advanced material models) in an efficient and accurate manner. Various designs of joining mechanism have been devised and are currently being developed into FE models and simulations.
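The weighted multi-parameter optimisation can be pictured as a scalarised objective over the simulation responses; the response names, weights and values below are placeholders for whatever LS-OPT would actually drive, not quantities from the project.

```python
def weighted_objective(responses, weights):
    # Scalarise several normalised simulation responses -- e.g. peak
    # deceleration, barrier deflection, barrier mass -- into one design
    # score. Responses are assumed normalised so that smaller is better.
    return sum(w * r for w, r in zip(weights, responses))

# Hypothetical normalised responses for two candidate barrier designs.
design_a = weighted_objective([0.8, 0.3, 0.5], [0.5, 0.3, 0.2])
design_b = weighted_objective([0.4, 0.6, 0.5], [0.5, 0.3, 0.2])

# Pick the design with the lower scalarised score.
best = min(("A", design_a), ("B", design_b), key=lambda t: t[1])
```

Each evaluation of such an objective corresponds to one full impact simulation, which is why the optimisation loop dominates the project's supercomputing demand.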


Objective - We report the first randomised controlled trial (RCT) using a combination of St. John’s wort (SJW) and Kava for the treatment of major depressive disorder (MDD) with comorbid anxiety. Methods - Twenty-eight adults with MDD and co-occurring anxiety were recruited for a double-blind RCT. After a placebo run-in of 2 weeks, the trial had a crossover design testing SJW and Kava against placebo over two controlled phases, each of 4 weeks. The primary analyses used intention-to-treat and completer analyses. Results - On both intention-to-treat (p = 0.047) and completer analyses (p = 0.003), SJW and Kava gave a significantly greater reduction in self-reported depression on the Beck Depression Inventory (BDI-II) over placebo in the first controlled phase. However, in the crossover phase, a replication of those effects in the delayed medication group did not occur. Nor were there significant effects on anxiety or quality of life. Conclusion - There was some evidence of antidepressant effects using SJW and Kava in a small sample with comorbid anxiety. Possible explanations for the absence of anxiolysis may include a potential interaction with SJW, the presence of depression, or an inadequate dose of Kava.