71 results for Gross operating margin


Relevance: 20.00%

Abstract:

Aims: To develop clinical protocols for acquiring PET images, performing CT-PET registration and defining tumour volumes based on the PET image data, for radiotherapy of lung cancer patients, and then to test these protocols for accuracy and reproducibility.

Method: A phantom-based quality assurance study of the processes associated with using registered CT and PET scans for tumour volume definition was conducted to: (1) investigate image acquisition and manipulation techniques for registering and contouring CT and PET images in a radiotherapy treatment planning system, and (2) determine technology-based errors in the registration and contouring processes. The outcomes of the phantom-based quality assurance study were used to determine clinical protocols. Protocols were developed for (1) acquiring patient PET image data for incorporation into the 3DCRT process, particularly for ensuring that the patient is positioned in their treatment position; (2) CT-PET image registration techniques; and (3) GTV definition using the PET image data. The developed clinical protocols were tested in retrospective clinical trials to assess the levels of inter-user variability attributable to their use. A Siemens Somatom Open Sensation 20-slice CT scanner and a Philips Allegro stand-alone PET scanner were used to acquire the images for this research. The Philips Pinnacle3 treatment planning system was used to perform the image registration and contouring of the CT and PET images.

Results: Both the attenuation-corrected and transmission images obtained from standard whole-body PET staging clinical scanning protocols were acquired and imported into the treatment planning system for the phantom-based quality assurance study. Protocols for manipulating the PET images in the treatment planning system were determined, particularly for quantifying uptake in volumes of interest and setting window levels for accurate geometric visualisation. The automatic registration algorithms were found to have sub-voxel accuracy, with transmission scan-based CT-PET registration more accurate than emission scan-based registration of the phantom images. Respiration-induced image artifacts were not found to influence registration accuracy, while inadequate pre-registration overlap of the CT and PET images was found to result in large registration errors. A threshold value based on a percentage of the maximum uptake within a volume of interest was found to accurately contour the different features of the phantom despite the lower spatial resolution of the PET images. Appropriate selection of the threshold value is dependent on target-to-background ratios and the presence of respiratory motion. The results from the phantom-based study were used to design, implement and test clinical CT-PET fusion protocols. The patient PET image acquisition protocols enabled patients to be successfully identified and positioned in their radiotherapy treatment position during the acquisition of their whole-body PET staging scan. While automatic registration techniques were found to reduce inter-user variation compared to manual techniques, there was no significant difference in the registration outcomes for transmission or emission scan-based registration of the patient images using the protocol. Tumour volumes contoured on registered patient CT-PET images, using the tested threshold values and viewing windows determined from the phantom study, demonstrated less inter-user variation for the primary tumour volume contours than those contoured using only the patient's planning CT scans.

Conclusions: The developed clinical protocols allow a patient's whole-body PET staging scan to be incorporated, manipulated and quantified in the treatment planning process to improve the accuracy of gross tumour volume localisation in 3D conformal radiotherapy for lung cancer. Image registration protocols that allow for potential software-based errors, combined with adequate user training, are recommended to increase the accuracy and reproducibility of registration outcomes. A semi-automated adaptive threshold contouring technique incorporating a PET windowing protocol accurately defines the geometric edge of a tumour volume using PET image data from a stand-alone PET scanner, including 4D target volumes.
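The threshold contouring step described above lends itself to a compact illustration. Below is a minimal sketch, assuming the PET uptake for a volume of interest is available as a NumPy array; the function name and the example threshold fraction are illustrative, not values from the study.

```python
import numpy as np

def threshold_contour(pet_voi, fraction):
    """Return a binary mask of voxels whose uptake exceeds a fixed
    percentage of the maximum uptake within the volume of interest.

    pet_voi  : 3D NumPy array of PET uptake values for the VOI
    fraction : threshold as a fraction of the maximum uptake (illustrative);
               the appropriate value depends on the target-to-background
               ratio and the presence of respiratory motion.
    """
    threshold = fraction * pet_voi.max()
    return pet_voi >= threshold

# Illustrative use on synthetic data: a bright cube on a dim background.
rng = np.random.default_rng(0)
voi = rng.normal(1.0, 0.1, size=(32, 32, 32))   # background uptake
voi[12:20, 12:20, 12:20] += 4.0                 # "tumour" uptake
mask = threshold_contour(voi, 0.42)
print(f"contoured voxels: {mask.sum()}")
```

Since the study found the appropriate threshold depends on target-to-background ratio and respiratory motion, a fraction like this would be calibrated per protocol rather than fixed.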

Relevance: 20.00%

Abstract:

This paper presents a bifurcation-theory-based stability analysis of a distribution static compensator (DSTATCOM) operating in current control mode. Bifurcations delimit the operating zones of nonlinear circuits; hence, the capability to compute them is of considerable interest for practical design. A control design for the DSTATCOM is proposed. Along with this control, a suitable mathematical representation of the DSTATCOM is proposed to carry out the bifurcation analysis efficiently. The stability regions in the Thevenin equivalent plane are computed for different power factors at the point of common coupling. In addition, the stability regions in the control gain space, as well as the contour lines for different Floquet multipliers, are computed. It is demonstrated through bifurcation analysis that the loss of stability in the DSTATCOM is due to the emergence of a Neimark bifurcation. The observations are verified through simulation studies.
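As a rough illustration of the Floquet-multiplier computation underpinning such an analysis, the sketch below numerically integrates the variational equations over one period to obtain the monodromy matrix, whose eigenvalues are the Floquet multipliers. The dynamics used are a generic placeholder, not the paper's DSTATCOM model.

```python
import numpy as np
from scipy.integrate import solve_ivp

def monodromy(f, jac, x0, T, n):
    """Integrate the variational equation dPhi/dt = J(t, x(t)) Phi over one
    period T alongside the state, starting from a point x0 on the periodic
    orbit; returns the n x n monodromy matrix Phi(T)."""
    def rhs(t, z):
        x, phi = z[:n], z[n:].reshape(n, n)
        return np.concatenate([f(t, x), (jac(t, x) @ phi).ravel()])
    z0 = np.concatenate([x0, np.eye(n).ravel()])
    sol = solve_ivp(rhs, (0.0, T), z0, rtol=1e-9, atol=1e-9)
    return sol.y[n:, -1].reshape(n, n)

# Placeholder periodically forced linear system (not the DSTATCOM model).
A = np.array([[0.0, 1.0], [-4.0, -0.3]])
f = lambda t, x: A @ x + np.array([0.0, np.cos(t)])
jac = lambda t, x: A

multipliers = np.linalg.eigvals(monodromy(f, jac, np.zeros(2), 2 * np.pi, 2))
# The orbit is stable while all multipliers lie inside the unit circle; a
# complex pair crossing the circle signals a Neimark(-Sacker) bifurcation.
print(np.abs(multipliers))
```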

Relevance: 20.00%

Abstract:

A technique was developed to investigate the capture/retention characteristics of a gross pollutant trap (GPT) with fully and partially blocked internal screens. Custom-modified spheres of variable density, filled with liquid, were released into the GPT inlet and monitored at the outlet. The outlet data show that the capture/retention performance of a GPT with fully blocked screens deteriorates rapidly. At higher flow rates, the GPT approached maximum capture efficiency for screen blockages below 68%. At lower flow rates, this high-performance trend was reversed and the variation in behaviour between pollutants of different densities became more noticeable. Additional experiments with a second GPT, configured with an upstream inlet, showed improved capture/retention performance. It was also noted that the bypass allows incoming pollutants to escape when the GPT is blocked; this useful feature prevents upstream blockages between cleaning intervals.

Relevance: 20.00%

Abstract:

This article assesses the non-linear oscillations of a distribution static compensator (DSTATCOM) operating in voltage control mode using bifurcation theory. A mathematical model of the DSTATCOM in voltage control mode is derived to carry out the bifurcation analysis. The stability regions in the Thevenin equivalent plane are computed. In addition, the stability regions in the control gain space, as well as the contour lines for different Floquet multipliers, are computed. The impacts of the AC and DC capacitors on stability are analysed through bifurcation theory. The observations are verified through simulation studies. The computation of the stability region allows the assessment of the stable operating zones for a power system that includes a DSTATCOM operating in voltage control mode.

Relevance: 20.00%

Abstract:

The Transport Certification Australia on-board mass (OBM) feasibility project is testing various OBM devices in a range of heavy vehicles (HVs). Extensive field tests of OBM measurement systems for HVs were conducted during 2008, assessing the accuracy, robustness and tamper evidence of HV OBM telematics. All the systems tested showed accuracies within approximately +/- 500 kg of gross combination mass, or approximately +/- 2% of the attendant weighbridge reading. Analysis of the dynamic data also showed encouraging results and raised the possibility of using such dynamic information for tamper evidence in two areas: determining whether averaged dynamic data could identify potential tampering or incorrect operating procedures, and flagging a tamper event from dynamic measurements using metrics such as a tampering index (TIX). Technical and business options to detect tamper events will now be developed during implementation of regulatory OBM system applications for Australian HVs.
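The reported tolerances suggest a simple acceptance check. The sketch below is illustrative only, with hypothetical function and parameter names; it encodes the reported bounds of roughly +/- 500 kg or +/- 2% of the weighbridge reading.

```python
def within_tolerance(obm_reading_kg, weighbridge_kg,
                     abs_tol_kg=500.0, rel_tol=0.02):
    """Check an on-board mass (OBM) reading against the reference
    weighbridge value using the tolerances reported in the trials:
    approximately +/- 500 kg of gross combination mass or +/- 2% of
    the weighbridge reading. Names are illustrative."""
    error = abs(obm_reading_kg - weighbridge_kg)
    return error <= abs_tol_kg or error <= rel_tol * weighbridge_kg

print(within_tolerance(42_300, 42_000))   # 300 kg error -> True
print(within_tolerance(44_000, 42_000))   # 2000 kg error -> False
```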

Relevance: 20.00%

Abstract:

This paper describes the development of a simulation model for operating theatres. Elective patient scheduling is complicated by several factors: stochastic demand for resources due to variation in the nature and severity of a patient's illness, unexpected complications in a patient's course of treatment, and the arrival of non-scheduled emergency patients competing for resources. Extend simulation software was used for its ability to represent highly complex systems and analyse model outputs. Patient arrivals and lengths of surgery were determined by analysis of historical data. The model was used to explore the effects that increasing patient arrivals and alternative elective patient admission disciplines would have on the performance measures. The model can be used as a decision support system for hospital planners.
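Extend is a commercial simulation package, so as a language-neutral illustration of the kind of discrete-event logic involved, here is a minimal sketch in Python: pre-scheduled elective arrivals and random emergency arrivals compete for a fixed number of theatres. The arrival and duration distributions here are assumptions for illustration; the study derived them from historical data.

```python
import heapq, random

random.seed(1)
THEATRES = 2
SHIFT_MIN = 480                       # one working day in minutes

# Event queue of (time, kind); electives are pre-scheduled, emergencies
# arrive at random (exponential inter-arrival times are an assumption).
events = [(t, "elective") for t in range(0, SHIFT_MIN, 60)]
t = 0.0
while t < SHIFT_MIN:
    t += random.expovariate(1 / 180)  # mean 180 min between emergencies
    events.append((t, "emergency"))
heapq.heapify(events)

free_at = [0.0] * THEATRES            # when each theatre next falls idle
waits = []
while events:
    arrival, kind = heapq.heappop(events)
    theatre = min(range(THEATRES), key=free_at.__getitem__)
    start = max(arrival, free_at[theatre])
    # Surgery duration would be sampled from historical data; a lognormal
    # is a common stand-in for skewed length-of-surgery distributions.
    duration = random.lognormvariate(4.0, 0.5)
    free_at[theatre] = start + duration
    waits.append((kind, start - arrival))

for kind in ("elective", "emergency"):
    ws = [w for k, w in waits if k == kind]
    print(f"{kind}: n={len(ws)}, mean wait={sum(ws)/len(ws):.1f} min")
```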

Relevance: 20.00%

Abstract:

Prognostics and asset life prediction is one of the research priorities in engineering asset health management. We previously developed the Explicit Hazard Model (EHM) to effectively and explicitly predict asset life using three types of information: population characteristics, condition indicators, and operating environment indicators. We have also studied the application of both the semi-parametric EHM and the non-parametric EHM to survival probability estimation in the reliability field. The survival time in these models depends not only upon the age of the monitored asset, but also upon the condition and operating environment information obtained. This paper is a further study of the semi-parametric and non-parametric EHMs, applied to hazard and residual life prediction for a set of resistance elements. The resistance elements were used as corrosion sensors for measuring the atmospheric corrosion rate in a laboratory experiment. The hazard of a resistance element estimated by the semi-parametric EHM is compared to the traditional Weibull model, since the semi-parametric EHM assumes a Weibull distribution for its baseline hazard; the hazard estimated by the non-parametric EHM is compared to the Aalen Linear Regression Model (ALRM), a well-known non-parametric covariate-based hazard model. Finally, the residual life of the resistance element predicted by both EHMs is compared to the actual life data.
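For reference, a standard covariate-based proportional-hazard form with a Weibull baseline is sketched below; the exact EHM formulation, which makes the covariate dependence explicit, is given in the authors' earlier work, so this shows only the family of models it builds on.

```latex
% Sketch of a covariate-based proportional hazard with a Weibull baseline;
% z(t) collects the condition and operating-environment indicators.
h\big(t \mid \mathbf{z}(t)\big)
  = h_0(t)\,\exp\!\big(\boldsymbol{\gamma}^{\top}\mathbf{z}(t)\big),
\qquad
h_0(t) = \frac{\beta}{\eta}\Big(\frac{t}{\eta}\Big)^{\beta-1},
\qquad
S(t) = \exp\!\Big(-\int_0^{t} h\big(u \mid \mathbf{z}(u)\big)\,du\Big).
```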

Relevance: 20.00%

Abstract:

This paper presents a comprehensive review of scientific and grey literature on gross pollutant traps (GPTs). GPTs are designed with internal screens to capture gross pollutants—organic matter and anthropogenic litter. Their application involves professional societies, research organisations, local city councils, government agencies and the stormwater industry—often in partnership. In view of this, the 113 references include unpublished manuscripts from these bodies along with scientific peer-reviewed conference papers and journal articles. The literature reviewed was organised into a matrix of six main devices and nine research areas (testing methodologies): design appraisal study, field monitoring/testing, experimental flow fields, gross pollutant capture/retention characteristics, residence time calculations, hydraulic head loss, screen blockages, flow visualisations and computational fluid dynamics (CFD). When this fifty-four-item matrix was analysed, twenty-eight research gaps were found in the tabulated literature, and the number of gaps increased if only the scientific literature was considered. It is hoped that, in addition to informing the research community at QUT, this literature review will also be of use to other researchers in this field.

Relevance: 20.00%

Abstract:

A novel and comprehensive testing approach for examining the performance of gross pollutant traps (GPTs) was developed. A proprietary GPT with internal screens for capturing gross pollutants—organic matter and anthropogenic litter—was used as a case study. This work is the first investigation of its kind and provides valuable practical information for the design, selection and operation of GPTs, as well as for the management of street waste in an urban environment. A combination of physical and theoretical models was used to examine in detail the hydrodynamic and capture/retention characteristics of the GPT. The results showed that the GPT operated efficiently until at least 68% of the screens were blocked, particularly at high flow rates. At lower flow rates, the high capture/retention performance trend was reversed. It was also found that a raised-inlet GPT offered better capture/retention performance. This finding indicates that cleaning operations could be planned more effectively by tracking the deterioration in the GPT's capture/retention performance.

Relevance: 20.00%

Abstract:

Since the establishment of the first national strategic development plan in the early 1970s, the construction industry has played an important role in the economic, social and cultural development of Indonesia. The industry's contribution to Indonesia's gross domestic product (GDP) increased from 3.9% in 1973 to 7.7% in 2007. Business Monitoring International (2009) forecasts that Indonesia is home to one of the fastest-growing construction industries in Asia, even though the average construction growth rate was expected to remain under 10% over the period 2006–2010. Similarly, Howlett and Powell (2006) place Indonesia among the 20 largest construction markets in 2010. Although the prospects for the Indonesian construction industry are now very promising, many local construction firms still face serious difficulties, such as poor performance and low competitiveness. There are two main reasons behind this problem: one is that the environment these firms face is not favourable; the other is a lack of strategic direction for improving competitiveness and performance. Furthermore, although strategic management is now widely used by many large construction firms in developed countries, practical examples and empirical studies related to the Indonesian construction industry remain scarce, and research endeavours on these topics in developing countries appear limited. This has potentially become one of the factors hampering efforts to guide Indonesian construction enterprises. This research aims to construct a conceptual model to enable Indonesian construction enterprises to develop a sound long-term corporate strategy that generates competitive advantage and superior performance. The conceptual model addresses the main prescription of the dynamic capabilities framework (Teece, Pisano & Shuen, 1997; Teece, 2007) within the context of the Indonesian construction industry. It is hypothesised that, in a rapidly changing and varied environment, competitive success arises from the continuous development and reconfiguration of a firm's specific assets: achieving competitive advantage depends not only on the exploitation of specific assets/capabilities, but on the exploitation of combinations of all the assets and capabilities in the dynamic capabilities framework. The model was refined through sequential statistical regression analyses of survey results, with a sample of 120 valid responses. The results of this study provide empirical evidence that competitive advantage achieved via the implementation of a dynamic capabilities framework is an important way for a construction enterprise to improve its organisational performance. The characteristics of asset-capability combinations were found to be significant determinants of the competitive advantage of Indonesian construction enterprises, and such advantage sequentially contributes to organisational performance. If a dynamic capabilities framework can work in the context of Indonesia, the framework has potential applicability in other emerging and developing countries. This study also demonstrates the importance of the multi-stage nature of the model, which provides a rich understanding of the dynamic process by which asset-capability combinations should be exploited by construction firms operating at varying levels of hostility. Such findings should be useful to both academics and practitioners; however, as this research examines the dynamic capabilities framework at the enterprise level, future studies should continue to explore and examine the framework at other levels of strategic management in construction, as well as in other countries where different cultures or similar conditions prevail.

Relevance: 20.00%

Abstract:

While recent research has provided valuable information on the composition of laser printer particles and their formation mechanisms, and has explained why some printers are emitters whilst others are low emitters, fundamental questions relating to the potential exposure of office workers remained unanswered. In particular: (i) what impact does the operation of laser printers have on the background particle number concentration (PNC) of an office environment over the duration of a typical working day? (ii) what is the airborne particle exposure of office workers in the vicinity of laser printers? (iii) what influence does the office ventilation have upon the transport and concentration of particles? (iv) is there a need to control the generation and/or transport of particles arising from the operation of laser printers within an office environment? (v) what instrumentation and methodology are relevant for characterising such particles within an office location? We present experimental evidence on temporal and spatial printer PNC during the operation of 107 laser printers within open-plan offices of five buildings. We show for the first time that the eight-hour time-weighted average printer particle exposure is significantly less than the eight-hour time-weighted local background particle exposure, but that peak printer particle exposure can be more than two orders of magnitude higher than local background particle exposure. The particle size range is predominantly ultrafine (< 100 nm diameter). In addition, we have established that office workers are constantly exposed to non-printer-derived particle concentrations, with up to an order of magnitude difference in such exposure amongst offices, and propose that this exposure be controlled along with exposure to printer-derived particles. We also propose, for the first time, that peak particle reference values be calculated for each office area, analogous to the criteria used in Australia and elsewhere for evaluating exposure excursions above occupational hazardous chemical exposure standards. A universal peak particle reference value of 2.0 × 10⁴ particles cm⁻³ has been proposed.
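As a rough sketch of the exposure metrics involved, an eight-hour time-weighted average (TWA) and a count of peak excursions above the proposed reference value might be computed as follows; the data are synthetic and the function names illustrative.

```python
import numpy as np

PEAK_REFERENCE = 2.0e4   # proposed peak particle reference value, particles cm^-3

def eight_hour_twa(pnc, sample_min=1.0, shift_min=480.0):
    """Eight-hour time-weighted average particle number concentration,
    by analogy with TWA calculations for chemical exposure standards;
    pnc is a series of evenly spaced PNC samples (particles cm^-3)."""
    return np.asarray(pnc, dtype=float).sum() * sample_min / shift_min

def peak_excursions(pnc, reference=PEAK_REFERENCE):
    """Indices of samples exceeding the peak particle reference value."""
    return np.flatnonzero(np.asarray(pnc) > reference)

# Synthetic day: low office background with a brief printing burst.
rng = np.random.default_rng(3)
day = rng.normal(3e3, 5e2, size=480)   # background PNC, one-minute samples
day[200:205] += 6e5                    # short printer burst
print(f"TWA = {eight_hour_twa(day):.3g} particles cm^-3")
print(f"samples above reference: {len(peak_excursions(day))}")
```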

Relevance: 20.00%

Abstract:

One of the surprising recurring phenomena observed in experiments with boosting is that the test error of the generated classifier usually does not increase as its size becomes very large, and often is observed to decrease even after the training error reaches zero. In this paper, we show that this phenomenon is related to the distribution of margins of the training examples with respect to the generated voting classification rule, where the margin of an example is simply the difference between the number of correct votes and the maximum number of votes received by any incorrect label. We show that techniques used in the analysis of Vapnik's support vector classifiers and of neural networks with small weights can be applied to voting methods to relate the margin distribution to the test error. We also show theoretically and experimentally that boosting is especially effective at increasing the margins of the training examples. Finally, we compare our explanation to those based on the bias-variance decomposition.
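The margin definition in this abstract translates directly into code. Below is a minimal sketch for a generic voting classifier, assuming a matrix of vote counts or weights per class; names are illustrative.

```python
import numpy as np

def voting_margins(votes, labels):
    """Margin of each example under a voting classifier: the (normalised)
    number of votes for the correct label minus the maximum number of
    votes received by any incorrect label.

    votes  : (n_examples, n_classes) array of vote counts or weights
    labels : (n_examples,) array of correct class indices
    """
    votes = np.asarray(votes, dtype=float)
    votes = votes / votes.sum(axis=1, keepdims=True)   # normalise votes
    n = len(labels)
    correct = votes[np.arange(n), labels]
    wrong = votes.copy()
    wrong[np.arange(n), labels] = -np.inf              # mask correct label
    return correct - wrong.max(axis=1)                 # margins in [-1, 1]

# A positive margin means a correct, confident vote; the paper's point is
# that boosting tends to push this whole distribution upward over rounds.
votes = np.array([[7, 2, 1], [3, 4, 3], [2, 2, 6]])
print(voting_margins(votes, np.array([0, 1, 0])))
```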

Relevance: 20.00%

Abstract:

Log-linear and maximum-margin models are two commonly-used methods in supervised machine learning, and are frequently used in structured prediction problems. Efficient learning of parameters in these models is therefore an important problem, and becomes a key factor when learning from very large data sets. This paper describes exponentiated gradient (EG) algorithms for training such models, where EG updates are applied to the convex dual of either the log-linear or max-margin objective function; the dual in both the log-linear and max-margin cases corresponds to minimizing a convex function with simplex constraints. We study both batch and online variants of the algorithm, and provide rates of convergence for both cases. In the max-margin case, O(1/ε) EG updates are required to reach a given accuracy ε in the dual; in contrast, for log-linear models only O(log(1/ε)) updates are required. For both the max-margin and log-linear cases, our bounds suggest that the online EG algorithm requires a factor of n less computation to reach a desired accuracy than the batch EG algorithm, where n is the number of training examples. Our experiments confirm that the online algorithms are much faster than the batch algorithms in practice. We describe how the EG updates factor in a convenient way for structured prediction problems, allowing the algorithms to be efficiently applied to problems such as sequence learning or natural language parsing. We perform extensive evaluation of the algorithms, comparing them to L-BFGS and stochastic gradient descent for log-linear models, and to SVM-Struct for max-margin models. The algorithms are applied to a multi-class problem as well as to a more complex large-scale parsing task. In all these settings, the EG algorithms presented here outperform the other methods.
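To make the update concrete, here is a minimal sketch of a single exponentiated-gradient step on the probability simplex, applied to a toy quadratic objective rather than the log-linear or max-margin duals treated in the paper.

```python
import numpy as np

def eg_step(u, grad, eta):
    """One exponentiated-gradient update on the probability simplex:
    multiply each dual variable by exp(-eta * gradient) and renormalise,
    so the iterate remains a proper distribution throughout."""
    w = u * np.exp(-eta * grad)
    return w / w.sum()

# Toy convex objective over the simplex: f(u) = 0.5 * u^T Q u,
# whose gradient is Q u (Q symmetric positive definite).
Q = np.array([[2.0, 0.2, 0.0],
              [0.2, 1.0, 0.3],
              [0.0, 0.3, 3.0]])
u = np.full(3, 1 / 3)                 # uniform starting point
for _ in range(200):
    u = eg_step(u, Q @ u, eta=0.5)
print(u, 0.5 * u @ Q @ u)             # minimiser over the simplex
```

The multiplicative form is what makes EG natural for simplex-constrained duals: no projection step is needed, since positivity and normalisation are preserved by construction.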

Relevance: 20.00%

Abstract:

We consider the problem of structured classification, where the task is to predict a label y from an input x, and y has meaningful internal structure. Our framework includes supervised training of Markov random fields and weighted context-free grammars as special cases. We describe an algorithm that solves the large-margin optimization problem defined in [12], using an exponential-family (Gibbs distribution) representation of structured objects. The algorithm is efficient—even in cases where the number of labels y is exponential in size—provided that certain expectations under Gibbs distributions can be calculated efficiently. The method for structured labels relies on a more general result, specifically the application of exponentiated gradient updates [7, 8] to quadratic programs.
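For orientation, the exponential-family (Gibbs) representation referred to above is typically written as follows, assuming a feature map φ(x, y) and weight vector w (standard notation, not necessarily the paper's):

```latex
% Gibbs (exponential-family) representation of structured labels:
p_{\mathbf{w}}(y \mid x)
  = \frac{\exp\big(\mathbf{w}^{\top}\boldsymbol{\phi}(x,y)\big)}
         {\sum_{y'}\exp\big(\mathbf{w}^{\top}\boldsymbol{\phi}(x,y')\big)},
\qquad
\mathbb{E}_{p_{\mathbf{w}}}\big[\boldsymbol{\phi}(x,y)\big]
  = \sum_{y} p_{\mathbf{w}}(y \mid x)\,\boldsymbol{\phi}(x,y).
% The required feature expectations can be computed by dynamic programming
% (sum-product for Markov random fields, inside-outside for weighted
% context-free grammars) even when the label set is exponentially large.
```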