253 results for Averaging operators


Relevance: 10.00%

Abstract:

Failing injectors are among the most common faults in diesel engines. These faults can seriously affect diesel engine operation, causing engine misfire, knocking, insufficient power output or even complete engine breakdown. It is thus essential to prevent such faults by monitoring the condition of the injectors. In this paper, the authors present the results of an experimental investigation into identifying the signal characteristics of a simulated incipient injector fault in a diesel engine using both in-cylinder pressure and acoustic emission (AE) techniques. A time-waveform, event-driven synchronous averaging technique was used to minimize or eliminate the effect of engine speed variation and amplitude fluctuation. It was found that AE is an effective method for detecting the simulated injector fault in both the time (crank angle) and frequency (order) domains. It was also shown that the time-domain in-cylinder pressure signal is a poor indicator for condition monitoring and diagnosis of the simulated injector fault, owing to the small effect of the simulated fault on the engine combustion process. Nevertheless, good correlations between the simulated injector fault and the lower-order components of the enveloped in-cylinder pressure spectrum were found at various engine loading conditions.
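The event-driven synchronous averaging step can be sketched as follows. This is an illustrative sketch, not the authors' implementation: `signal` is assumed to be a 1-D AE or pressure trace, and `events` the sample indices of a once-per-cycle reference (e.g. a TDC pulse); both names are assumptions.

```python
import numpy as np

def synchronous_average(signal, events, n_points=360):
    """Event-driven synchronous average: resample every cycle onto a common
    crank-angle grid, then average across cycles."""
    cycles = []
    for start, end in zip(events[:-1], events[1:]):
        cycle = np.asarray(signal[start:end], dtype=float)
        # Map this cycle onto n_points equally spaced crank-angle positions,
        # removing the effect of speed variation between cycles.
        grid = np.linspace(0, len(cycle) - 1, n_points)
        cycles.append(np.interp(grid, np.arange(len(cycle)), cycle))
    return np.mean(cycles, axis=0)
```

Because each cycle is resampled onto the same crank-angle grid before averaging, cycle-to-cycle speed variation no longer smears the averaged waveform.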

Relevance: 10.00%

Abstract:

The Web has become a worldwide repository of information which individuals, companies, and organizations utilize to solve or address various information problems. Many Web users employ automated agents to gather this information for them. Some assume that this approach represents a more sophisticated method of searching. However, there is little research investigating how Web agents search for online information. In this research, we first provide a classification for information agents based on stages of information gathering, gathering approaches, and agent architecture. We then examine an implementation of one of the resulting classifications in detail, investigating how agents search for information on Web search engines, including the session, query, term, duration and frequency of interactions. For this temporal study, we analyzed three data sets of queries and page views from agents interacting with the Excite and AltaVista search engines from 1997 to 2002, examining approximately 900,000 queries submitted by over 3,000 agents. Findings include: (1) agent sessions are extremely interactive, with sometimes hundreds of interactions per second; (2) agent queries are comparable to those of human searchers, with little use of query operators; (3) Web agents search for a relatively limited variety of information, with only 18% of the terms used being unique; and (4) the duration of agent-Web search engine interaction typically spans several hours. We discuss the implications for Web information agents and search engines.
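As an illustration of the kind of metric reported in finding (3), a unique-term fraction could be computed from a query log as below. This is a sketch, not the study's analysis code, and whitespace tokenisation is a simplifying assumption.

```python
# Fraction of distinct terms among all terms submitted in a query log.
def unique_term_fraction(queries):
    terms = [t for q in queries for t in q.lower().split()]
    return len(set(terms)) / len(terms)
```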

Relevance: 10.00%

Abstract:

Biomarker analysis has been implemented in sports research in an attempt to monitor the effects of exertion and fatigue in athletes. This study proposed that while such biomarkers may be useful for monitoring injury risk in workers, proteomic approaches might also be utilised to identify novel exertion or injury markers. We found that urinary urea and cortisol levels were significantly elevated in mining workers following a 12-hour overnight shift. These levels failed to return to baseline over 24 h in the more active maintenance crew compared to the truck drivers (operators), suggesting a lack of recovery between shifts. Use of a SELDI-TOF MS approach to detect novel exertion or injury markers revealed a spectral feature associated with workers in both work categories who were engaged in higher levels of physical activity. This feature was identified as the LG3 peptide, a C-terminal fragment of the anti-angiogenic/anti-tumourigenic protein endorepellin. This finding suggests that the urinary LG3 peptide may be a biomarker of physical activity. It is also possible that the activity-mediated release of LG3/endorepellin into the circulation may represent a biological mechanism for the known inverse association between physical activity and cancer risk/survival.

Relevance: 10.00%

Abstract:

Deterministic transit capacity analysis applies to planning, design and operational management of urban transit systems. The Transit Capacity and Quality of Service Manual (1) and Vuchic (2, 3) enable transit performance to be quantified and assessed using transit capacity and productive capacity. This paper further defines important productive performance measures of an individual transit service and transit line. Transit work (p-km) captures the transit task performed over distance. Passenger transmission (p-km/h) captures the passenger task delivered by service at speed. Transit productiveness (p-km/h) captures transit work performed over time. These measures are useful to operators in understanding their services’ or systems’ capabilities and passenger quality of service. This paper accounts for variability in utilized demand by passengers along a line and for high passenger load conditions where passenger pass-up delay occurs. A hypothetical case study of an individual bus service’s operation demonstrates the usefulness of passenger transmission in comparing existing and growth scenarios. A hypothetical case study of a bus line’s operation during a peak hour window demonstrates the theory’s usefulness in examining the contribution of individual services to line productive performance. The theory may be used to benchmark and compare lines, segments and operating conditions, and to assess proposed improvements.
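The three measures can be illustrated with a minimal sketch. The segment-level representation and the dwell-time handling here are assumptions for illustration, not the paper's exact formulation.

```python
# Illustrative computation: each segment is a tuple of
# (passengers on board, segment length in km, running time in h).
def transit_measures(segments, dwell_time_h=0.0):
    work = sum(p * d for p, d, _ in segments)       # transit work, p-km
    running_time = sum(t for _, _, t in segments)   # time spent moving, h
    elapsed = running_time + dwell_time_h           # total elapsed time, h
    transmission = work / running_time              # p-km/h delivered at speed
    productiveness = work / elapsed                 # p-km/h over elapsed time
    return work, transmission, productiveness
```

For a single non-stop segment, transmission and productiveness coincide; in this sketch it is the dwell time that separates the task delivered at speed from the work performed over elapsed time.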

Relevance: 10.00%

Abstract:

Road dust contains potentially toxic pollutants originating from a range of anthropogenic sources common to urban land uses, together with soil inputs from surrounding areas. The research study analysed the mineralogy and morphology of dust samples from road surfaces in different land uses, and of background soil samples, to characterise the relative source contributions to road dust. The road dust consists primarily of soil-derived minerals (60%), with quartz averaging 40-50% and the remainder being the clay-forming minerals albite, microcline, chlorite and muscovite originating from surrounding soils. About 2% was organic matter, primarily originating from plant matter. Potentially toxic pollutants represented about 30% of the build-up. These pollutants derive from brake and tire wear, combustion emissions and fly ash from asphalt. Heavy metals such as Zn, Cu, Pb, Ni, Cr and Cd primarily originate from vehicular traffic, while Fe, Al and Mn primarily originate from surrounding soils. The research study confirmed the significant contribution of vehicular traffic to dust deposited on urban road surfaces.

Relevance: 10.00%

Abstract:

The current regulatory approach to coal seam gas projects in Queensland is based on the philosophy of adaptive environmental management. This method of “learning by doing” is implemented in Queensland primarily through the imposition of layered monitoring and reporting duties on the coal seam gas operator, alongside obligations to compensate for and “make good” harm caused. The purpose of this article is to provide a critical review of the Queensland regulatory approach to the approval and minimisation of adverse impacts from coal seam gas activities. Following an overview of the hallmarks of an effective adaptive management approach, the article begins by addressing the mosaic of approval processes and impact assessment regimes that may apply to coal seam gas projects, including the recent Strategic Cropping Land reforms. It then considers the preconditions for land access in Queensland and the emerging issues for landholders relating to the negotiation of access and compensation agreements. The article then undertakes a critical review of the environmental duties imposed on coal seam gas operators relating to hydraulic fracturing, well head leaks, groundwater management, and the disposal and beneficial use of produced water. Finally, conclusions are drawn regarding the overall effectiveness of the Queensland framework and the lessons that may be drawn from Queensland’s adaptive environmental management approach.

Relevance: 10.00%

Abstract:

This article presents a critical analysis of the current and proposed CCS legal frameworks across a number of jurisdictions in Australia in order to examine the legal treatment of the risks of carbon leakage from CCS operations. It does so through an analysis of the statutory obligations and liability rules established under the offshore Commonwealth and Victorian regimes, and onshore Queensland and Victorian legislative frameworks. Exposure draft legislation for CCS laws in Western Australia is also examined. In considering where the losses will fall in the event of leakage, the potential tortious and statutory liabilities of private operators and the State are addressed alongside the operation of statutory protections from liability. The current legal treatment of CCS under the new Australian Carbon Pricing Mechanism is also critiqued.

Relevance: 10.00%

Abstract:

Airports represent the epitome of complex systems, with multiple stakeholders, multiple jurisdictions and complex interactions between many actors. The large number of existing models that capture different aspects of the airport are a testament to this. However, these existing models do not systematically consider modelling requirements, nor how stakeholders such as airport operators or airlines would make use of these models. This can detrimentally impact the verification and validation of models and makes the development of extensible and reusable modelling tools difficult. This paper develops, from the Concept of Operations (CONOPS) framework, a methodology to help structure the review and development of modelling capabilities and usage scenarios. The method is applied to the review of existing airport terminal passenger models. It is found that existing models can be broadly categorised according to four usage scenarios: capacity planning, operational planning and design, security policy and planning, and airport performance review. The models, the performance metrics that they evaluate and their usage scenarios are discussed. It is found that capacity and operational planning models predominantly focus on performance metrics such as waiting time, service time and congestion, whereas performance review models attempt to link those to passenger satisfaction outcomes. Security policy models, on the other hand, focus on probabilistic risk assessment. However, there is an emerging focus on the need to capture trade-offs between multiple criteria such as security and processing time. Based on the CONOPS framework and literature findings, guidance is provided for the development of future airport terminal models.

Relevance: 10.00%

Abstract:

A Multimodal Seaport Container Terminal (MSCT) is a complex system which requires careful planning and control in order to operate efficiently. It consists of a number of subsystems that require optimisation of the operations within them, as well as synchronisation of machines and containers between the various subsystems. Inefficiency in the terminal can delay ships from their scheduled timetables, as well as cause delays in delivering containers to their inland destinations, both of which can be very costly to their operators. The purpose of this PhD thesis is to use Operations Research methodologies to optimise and synchronise these subsystems as an integrated application. An initial model is developed for the overall MSCT; however, due to the large number of assumptions that had to be made, as well as other issues, it is found to be too inaccurate and infeasible for practical use. Instead, a method is proposed in which models are developed for each subsystem and then integrated with each other. Mathematical models are developed for the Storage Area System (SAS) and the Intra-terminal Transportation System (ITTS). The SAS deals with the movement and assignment of containers to stacks within the storage area, both when they arrive and when they are rehandled to retrieve containers below them. The ITTS deals with scheduling the movement of containers and machines between the storage areas and other sections of the terminal, such as the berth and road/rail terminals. Various constructive heuristics are explored and compared for these models to produce good initial solutions for large-sized problems, which are otherwise impractical to compute by exact methods. These initial solutions are further improved through the use of an innovative hyper-heuristic algorithm that integrates the SAS and ITTS solutions together and optimises them through meta-heuristic techniques.
The method by which the two models can interact with each other as an integrated system will be discussed, as well as how this method can be extended to the other subsystems of the MSCT.
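To give a flavour of the constructive heuristics mentioned for the SAS, a generic greedy placement rule is sketched below. This is not the thesis's actual heuristic; the scoring rule, the representation of stacks as lists of retrieval times, and the height limit are all illustrative assumptions.

```python
# Illustrative greedy SAS-style placement: each stack is a list of retrieval
# times, bottom to top; a container placed on top of one that is needed
# earlier will force a rehandle later.
def assign_to_stack(stacks, retrieval_time, max_height=4):
    """Return the index of the stack to place a container on, or None if all
    stacks are full. Prefers stacks whose top container leaves no earlier than
    the new one (no forced rehandle), then lower stacks to keep options open."""
    best, best_score = None, None
    for i, stack in enumerate(stacks):
        if len(stack) >= max_height:
            continue  # stack already at maximum stacking height
        # penalty 1 if the container below would be needed first (rehandle)
        penalty = 0 if not stack or stack[-1] >= retrieval_time else 1
        score = (penalty, len(stack))
        if best_score is None or score < best_score:
            best, best_score = i, score
    return best
```

Constructive rules of this kind give the cheap initial solutions that a hyper-heuristic can then improve.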

Relevance: 10.00%

Abstract:

Retrofit projects differ from newly-built projects in many respects. A retrofit project involves an existing building, which imposes constraints on the owners, designers, operators and constructors throughout the project process. Retrofit projects are risky, complex, less predictable and difficult to plan well, and they need greater coordination. For office building retrofit projects, further restrictions apply, as these buildings are often located in CBD areas and most have to remain operational during the progression of project work. Issues such as site space, material storage and handling, and noise and dust need to be considered and well addressed. In this context, waste management is even more challenging, with small spaces for waste handling, uncertainties in waste control, and the impact of waste management activities on project delivery and building occupants. The current literature on waste management in office building retrofit projects focuses on increasing the waste recovery rate through project planning, monitoring and stakeholders’ collaboration. However, previous research has not produced knowledge of the particular retrofit processes and their impact on waste generation and management. This paper discusses the interim results of continuing research on new strategies for waste management in office building retrofit projects. First, based on the literature review, it summarizes the unique characteristics of office building retrofit projects and their influence on waste management, and an assumption on waste management strategies is formed. Semi-structured interviews were then conducted with industry practitioners, and the findings are presented in the paper. The assumption of the research was validated in the interviews by the opinions and experiences of the respondents. Finally, the research develops a process model for waste management in office building retrofit projects, introducing two different waste management strategies.
For the dismantling phase, waste is generated quickly as the work progresses, so integrated planning of project delivery and waste generation is needed in order to organize prompt handling and treatment. For the fit-out phase, the work is similar to new construction: factors particularly linked to generating waste on site need to be controlled and monitored. Continuing research in this space will help improve the practice of waste management in office building retrofit projects. The new strategies will help promote the practicality of project waste planning and management and stakeholders’ capability of coordinating waste management and project delivery.

Relevance: 10.00%

Abstract:

Serving as a powerful tool for extracting localized variations in non-stationary signals, wavelet transforms (WTs) have been introduced into traffic engineering applications; however, some important theoretical fundamentals have been lacking. In particular, there is little guidance on selecting an appropriate WT across potential transport applications. The research described in this paper contributes uniquely to the literature by first describing a numerical experiment that demonstrates the shortcomings of commonly-used data processing techniques in traffic engineering (i.e., averaging, moving averaging, second-order difference, oblique cumulative curve, and short-time Fourier transform). It then mathematically describes the WT’s ability to detect singularities in traffic data. Next, selecting a suitable WT for a particular research topic in traffic engineering is discussed in detail by objectively and quantitatively comparing candidate wavelets’ performances in a numerical experiment. Finally, based on several case studies using both loop detector data and vehicle trajectories, it is shown that selecting a suitable wavelet largely depends on the specific research topic, and that the Mexican hat wavelet generally gives a satisfactory performance in detecting singularities in traffic and vehicular data.
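A Mexican hat (Ricker) analysis of the kind described can be sketched as below. The width parameter and the use of a single scale are simplifying assumptions; a full continuous wavelet transform would sweep many scales.

```python
import numpy as np

def ricker(points, a):
    """Mexican hat (Ricker) wavelet with width parameter a."""
    t = np.arange(points) - (points - 1) / 2.0
    amp = 2.0 / (np.sqrt(3.0 * a) * np.pi ** 0.25)
    return amp * (1.0 - (t / a) ** 2) * np.exp(-0.5 * (t / a) ** 2)

def wavelet_response(signal, a=8.0, points=64):
    """Single-scale wavelet response; large-magnitude coefficients flag
    localized disturbances (singularities) in the signal."""
    return np.convolve(signal, ricker(points, a), mode="same")
```

A sharp local disturbance in a traffic series produces a coefficient extremum at its location, which is what makes the Mexican hat attractive for singularity detection.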

Relevance: 10.00%

Abstract:

Fractional differential equations are becoming more widely accepted as a powerful tool in modelling anomalous diffusion, which is exhibited by various materials and processes. Recently, researchers have suggested that rather than using constant-order fractional operators, some processes are more accurately modelled using fractional orders that vary with time and/or space. In this paper we develop computationally efficient techniques for solving time-variable-order time-space fractional reaction-diffusion equations (TSFRDEs) using finite difference schemes. We adopt the Coimbra variable-order time fractional operator and a variable-order fractional Laplacian operator in space, where both orders are functions of time. Because the fractional operator is nonlocal, it is challenging to deal efficiently with its long-range dependence when using classical numerical techniques to solve such equations. The novelty of our method is that the numerical solution of the time-variable-order TSFRDE is written in terms of a matrix function vector product at each time step. This product is approximated efficiently by the Lanczos method, a powerful iterative technique for approximating the action of a matrix function by projecting onto a Krylov subspace. Furthermore, an adaptive preconditioner is constructed that dramatically reduces the size of the required Krylov subspaces and hence the overall computational cost. Numerical examples, including the variable-order fractional Fisher equation, are presented to demonstrate the accuracy and efficiency of the approach.
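The core idea, approximating a matrix function vector product f(A)b by Lanczos projection onto a Krylov subspace, can be sketched for a symmetric A as follows. This minimal version omits the variable-order fractional operators and the adaptive preconditioner described in the paper.

```python
import numpy as np

def lanczos_fAb(A, b, f, m=30):
    """Approximate f(A) @ b for symmetric A via an m-step Lanczos projection."""
    n = len(b)
    m = min(m, n)
    V = np.zeros((n, m))          # orthonormal Krylov basis
    alpha = np.zeros(m)           # diagonal of the tridiagonal projection
    beta = np.zeros(max(m - 1, 1))  # off-diagonal of the projection
    V[:, 0] = b / np.linalg.norm(b)
    for j in range(m):
        w = A @ V[:, j]
        alpha[j] = V[:, j] @ w
        w = w - alpha[j] * V[:, j]
        if j > 0:
            w = w - beta[j - 1] * V[:, j - 1]
        # full reorthogonalisation keeps this sketch numerically robust
        w = w - V[:, :j + 1] @ (V[:, :j + 1].T @ w)
        if j < m - 1:
            beta[j] = np.linalg.norm(w)
            V[:, j + 1] = w / beta[j]
    # evaluate f on the small tridiagonal projection T_m = V^T A V
    T = np.diag(alpha) + np.diag(beta[:m - 1], 1) + np.diag(beta[:m - 1], -1)
    tvals, tvecs = np.linalg.eigh(T)
    fT_e1 = tvecs @ (f(tvals) * tvecs[0, :])  # f(T_m) e_1
    return np.linalg.norm(b) * (V @ fT_e1)
```

Only m matrix-vector products with A are needed, which is why the approach scales to the large, dense-in-effect operators that arise from nonlocal fractional discretisations.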

Relevance: 10.00%

Abstract:

In this study the interplay effects for Enhanced Dynamic Wedge (EDW) treatments are experimentally investigated. Single and multiple field EDW plans for different wedge angles were delivered to a phantom and detector on a moving platform, with various periods and amplitudes for parallel and perpendicular motions. A four-field 4D CT planned lung EDW treatment was delivered to a dummy tumor over four fractions. For the single field parallel case, the amplitude and the period of motion both affect the interplay, resulting in the appearance of a step function and penumbral cut-off, with the discrepancy worst where the collimator and tumor speeds are similar. For perpendicular motion the amplitude of tumor motion is the only dominant factor. For a large wedge angle the dose discrepancy is more pronounced than for a small wedge angle at the same field size and amplitude-period values. For a small field size, i.e. 5 × 5 cm², the loss of the wedged distribution was observed for both 60° and 15° wedge angles for both parallel and perpendicular motions. Film results from the 4D CT planned delivery displayed a mix of over- and under-dosage over the 4 fractions, with a gamma pass rate of 40% for the averaged film image at 3%/1 mm DTA (distance to agreement). The amplitude and period of tumor motion both affect the interplay for single and multi-field EDW treatments, and for a limited (4 or 5) fraction delivery there is a possibility of non-averaging of the EDW interplay.
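The gamma analysis used to score the film results combines a dose-difference tolerance with a distance-to-agreement tolerance. A 1-D global-gamma sketch is given below; it is illustrative, not the actual film-analysis software, and the brute-force search over all points is a simplification.

```python
import numpy as np

def gamma_pass_rate(ref, measured, dx_mm, dose_tol=0.03, dist_tol_mm=3.0):
    """1-D global gamma analysis: fraction of reference points with gamma <= 1."""
    positions = np.arange(len(ref)) * dx_mm
    norm = dose_tol * ref.max()  # global dose normalisation (e.g. 3% of max)
    passed = 0
    for i, d_ref in enumerate(ref):
        # combined dose/distance metric against every evaluated point
        dose_diff = (measured - d_ref) / norm
        dist = (positions - positions[i]) / dist_tol_mm
        gamma = np.sqrt(dose_diff ** 2 + dist ** 2).min()
        passed += gamma <= 1.0
    return passed / len(ref)
```

A point passes if the measured distribution comes within the dose tolerance somewhere inside the distance tolerance, which is why gamma tolerates small spatial shifts that a point-by-point dose comparison would fail.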

Relevance: 10.00%

Abstract:

The effects of tumour motion during radiation therapy delivery have been widely investigated. Motion effects have become increasingly important with the introduction of dynamic radiotherapy delivery modalities such as enhanced dynamic wedges (EDWs) and intensity modulated radiation therapy (IMRT), where a dynamically collimated radiation beam is delivered to the moving target, resulting in dose blurring and interplay effects which are a consequence of the combined tumor and beam motion. Prior to this work, reported studies on EDW-based interplay effects had been restricted to experimental methods for assessing single-field non-fractionated treatments. In this work, the interplay effects have been investigated for EDW treatments. Single and multiple field treatments have been studied using experimental and Monte Carlo (MC) methods. Initially this work experimentally studies interplay effects for single-field non-fractionated EDW treatments, using radiation dosimetry systems placed on a sinusoidally moving platform. A number of wedge angles (60°, 45° and 15°), field sizes (20 × 20, 10 × 10 and 5 × 5 cm²), amplitudes (10-40 mm in steps of 10 mm) and periods (2 s, 3 s, 4.5 s and 6 s) of tumor motion are analysed (using gamma analysis) for parallel and perpendicular motions (where the tumor and jaw motions are either parallel or perpendicular to each other). For parallel motion it was found that both the amplitude and period of tumor motion affect the interplay; this becomes more prominent where the collimator and tumor speeds become identical. For perpendicular motion the amplitude of tumor motion is the dominant factor, whereas varying the period of tumor motion has no observable effect on the dose distribution. The wedge angle results suggest that the use of a large wedge angle generates greater dose variation for both parallel and perpendicular motions.
The use of a small field size with a large tumor motion results in the loss of the wedged dose distribution for both parallel and perpendicular motion. From these single field measurements, a motion amplitude and period have been identified which show the poorest agreement between the target motion and dynamic delivery, and these are used as the "worst case motion parameters". The experimental work is then extended to multiple-field fractionated treatments. Here a number of pre-existing, multiple-field, wedged lung plans are delivered to the radiation dosimetry systems, employing the worst case motion parameters. Moreover, a four-field EDW lung plan (using a 4D CT data set) is delivered to the IMRT quality control phantom with a dummy tumor insert over four fractions using the worst case parameters, i.e. 40 mm amplitude and 6 s period. The analysis of the film doses using gamma analysis at 3%/3 mm indicates non-averaging of the interplay effects for this particular study, with a gamma pass rate of 49%. To enable Monte Carlo modelling of the problem, the DYNJAWS component module (CM) of the BEAMnrc user code, recently introduced to model dynamic wedges, is validated and automated. DYNJAWS is commissioned for 6 MV and 10 MV photon energies. It is shown that this CM can accurately model the EDWs for a number of wedge angles and field sizes. The dynamic and step-and-shoot modes of the CM are compared for their accuracy in modelling the EDW, and it is shown that the dynamic mode is more accurate. The generation of the DYNJAWS-specific input file, which specifies the probability of selection of a subfield and the respective jaw coordinates, has been automated, simplifying the creation of BEAMnrc input files for DYNJAWS. The commissioned DYNJAWS model is then used to study multiple field EDW treatments using MC methods.
The 4D CT data of an IMRT phantom with the dummy tumor is used to produce a set of Monte Carlo simulation phantoms, onto which the delivery of single field and multiple field EDW treatments is simulated. A number of static and motion multiple field EDW plans have been simulated. The comparison of dose volume histograms (DVHs) and gamma volume histograms (GVHs) for four-field EDW treatments (where the collimator and patient motion is in the same direction) using small (15°) and large (60°) wedge angles indicates a greater mismatch between the static and motion cases for the large wedge angle. Finally, to use gel dosimetry as a validation tool, a new technique called the "zero-scan method" is developed for reading the gel dosimeters with x-ray computed tomography (CT). It has been shown that multiple scans of a gel dosimeter (in this case 360 scans) can be used to reconstruct a zero-scan image. This zero-scan image has a similar precision to an image obtained by averaging the CT images, without the additional dose delivered by the CT scans. In this investigation the interplay effects have been studied for single and multiple field fractionated EDW treatments using experimental and Monte Carlo methods. For the Monte Carlo methods, the DYNJAWS component module of the BEAMnrc code has been validated and automated, and further used to study the interplay for multiple field EDW treatments. The zero-scan method, a new gel dosimetry readout technique, has been developed for reading gel images using x-ray CT without losing precision or accuracy.
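The zero-scan reconstruction can be sketched as a per-pixel extrapolation, assuming (as the description suggests) that each pixel's value is fitted against scan number and the fit extrapolated back to scan zero; the linear fit and array layout here are illustrative assumptions, not the published algorithm's details.

```python
import numpy as np

def zero_scan_image(scans):
    """Zero-scan sketch: fit each pixel's value against scan number and
    extrapolate back to scan zero, so the result has averaged-image noise
    without the dose of the extra scans. scans: shape (n_scans, H, W)."""
    n = scans.shape[0]
    x = np.arange(1, n + 1, dtype=float)  # scan numbers 1..n
    xm = x.mean()
    ym = scans.mean(axis=0)
    # closed-form least-squares slope per pixel, then the intercept at scan 0
    slope = ((x[:, None, None] - xm) * (scans - ym)).sum(axis=0) / ((x - xm) ** 2).sum()
    return ym - slope * xm
```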

Relevance: 10.00%

Abstract:

The objective of this chapter is to provide rail practitioners with a practical approach for determining the safety requirements of low-cost level crossing warning devices (LCLCWDs) on an Australian railway, by way of a case study. LCLCWDs, in theory, allow railway operators to improve the safety of passively controlled crossings by upgrading a larger number of level crossings with the same budget that would otherwise be used to upgrade them with conventional active level crossing control technologies, e.g. track-circuit-initiated flashing light systems. The chapter discusses the experience of and obstacles to adopting LCLCWDs in Australia, and demonstrates how a risk-based approach may be used to make the case for LCLCWDs.