927 results for Piecewise linear techniques


Relevance:

20.00%

Publisher:

Abstract:

Magnetic resonance is a well-established tool for structural characterisation of porous media. Features of pore-space morphology can be inferred from NMR diffusion-diffraction plots or the time-dependence of the apparent diffusion coefficient. Diffusion NMR signal attenuation can be computed from the restricted diffusion propagator, which describes the distribution of diffusing particles for a given starting position and diffusion time. We present two techniques for efficient evaluation of restricted diffusion propagators for use in NMR porous-media characterisation. The first is the Lattice Path Count (LPC). Its physical essence is that the restricted diffusion propagator connecting points A and B in time t is proportional to the number of distinct length-t paths from A to B. By using a discrete lattice, the number of such paths can be counted exactly. The second technique is the Markov transition matrix (MTM). The matrix represents the probabilities of jumps between every pair of lattice nodes within a single timestep. The propagator for an arbitrary diffusion time can be calculated as the appropriate matrix power. For periodic geometries, the transition matrix needs to be defined only for a single unit cell. This makes MTM ideally suited for periodic systems. Both LPC and MTM are closely related to existing computational techniques: LPC, to combinatorial techniques; and MTM, to the Fokker-Planck master equation. The relationship between LPC, MTM and other computational techniques is briefly discussed in the paper. Both LPC and MTM perform favourably compared to Monte Carlo sampling, yielding highly accurate and almost noiseless restricted diffusion propagators. Initial tests indicate that their computational performance is comparable to that of finite element methods. Both LPC and MTM can be applied to complicated pore-space geometries with no analytic solution. 
We discuss the new methods in the context of diffusion propagator calculation in porous materials and model biological tissues.
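The MTM construction described above can be sketched in a few lines. The following toy example (ours, not the paper's code; the lattice size, jump probabilities and reflecting walls are illustrative assumptions) builds a single-timestep transition matrix for a 1D pore with reflecting walls and obtains the restricted diffusion propagator as a matrix power:

```python
import numpy as np

# 1D lattice of N nodes bounded by reflecting walls (a simple "pore").
N = 11                      # lattice nodes across the pore (assumed)
p = 0.5                     # per-step jump probability to each neighbour

# Single-timestep transition matrix T[i, j] = P(node j -> node i).
T = np.zeros((N, N))
for j in range(N):
    if j > 0:
        T[j - 1, j] += p    # jump left
    else:
        T[j, j] += p        # reflected at the left wall
    if j < N - 1:
        T[j + 1, j] += p    # jump right
    else:
        T[j, j] += p        # reflected at the right wall

# The propagator for n timesteps is the n-th matrix power; column j is
# the distribution after n steps for a particle that started at node j.
n = 50
P = np.linalg.matrix_power(T, n)

start = N // 2              # particle released at the pore centre
propagator = P[:, start]
print(propagator.sum())     # probability is conserved (sums to 1)
```

In two or three dimensions the same idea applies with nodes indexed over the lattice, and for periodic geometries the matrix need only be defined over one unit cell.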


Long-term measurements of particle number size distribution (PNSD) produce a very large number of observations, and their analysis requires an efficient approach in order to produce results in the least possible time and with maximum accuracy. Clustering techniques are a family of sophisticated methods which have recently been employed to analyse PNSD data; however, very little information is available comparing the performance of different clustering techniques on PNSD data. This study applies several clustering techniques (K-means, PAM, CLARA and SOM) in order to identify the optimum technique for PNSD data measured at 25 sites across Brisbane, Australia. A new method, based on the Generalised Additive Model (GAM) with a basis of penalised B-splines, was proposed to parameterise the PNSD data, and the temporal weight of each cluster was also estimated using the GAM. In addition, each cluster was associated with its possible source based on the results of this parameterisation, together with the characteristics of each cluster. The performance of the four clustering techniques was compared using the Dunn index and silhouette width validation values; the K-means technique performed best, with five clusters being the optimum. The diurnal occurrence of each cluster was used together with other air quality parameters, temporal trends and the physical properties of each cluster in order to attribute each cluster to its source and origin. The five clusters were attributed to three major sources and origins: regional background particles, photochemically induced nucleated particles and vehicle-generated particles. Overall, clustering was found to be an effective technique for attributing each particle size spectrum to its source, and the GAM was suitable for parameterising the PNSD data. 
These two techniques can help researchers immensely in analysing PNSD data for characterisation and source apportionment purposes.
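The cluster-number selection step reported above (fit several k, keep the one with the highest average silhouette width) can be sketched as follows. The synthetic data matrix is an assumption standing in for the PNSD spectra, and scikit-learn is used for illustration rather than any package named in the study:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

# Synthetic stand-in for PNSD data: rows = size spectra, columns = size
# bins, drawn from three well-separated groups (an assumed toy dataset).
rng = np.random.default_rng(0)
spectra = np.vstack([
    rng.normal(loc=m, scale=0.3, size=(100, 16)) for m in (0.0, 2.0, 4.0)
])

# Sweep k and keep the clustering with the highest silhouette width.
best_k, best_width = None, -1.0
for k in range(2, 8):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(spectra)
    width = silhouette_score(spectra, labels)
    if width > best_width:
        best_k, best_width = k, width

print(best_k)   # the synthetic data were generated with 3 groups
```

The Dunn index would be computed analogously from the fitted labels; only the silhouette criterion is shown here.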


The commercialization of aerial image processing is highly dependent on platforms such as UAVs (Unmanned Aerial Vehicles). However, the lack of an automated UAV forced landing site detection system has been identified as one of the main impediments to allowing UAV flight over populated areas in civilian airspace. This article proposes a UAV forced landing site detection system based on machine learning approaches, including the Gaussian Mixture Model and the Support Vector Machine. A range of learning parameters is analysed, including the number of Gaussian mixtures; the support vector kernel (linear, radial basis function (RBF) and polynomial); and the order of the RBF and polynomial kernels. Moreover, a modified footprint operator is employed during feature extraction to better describe the geometric characteristics of the local area surrounding a pixel. The performance of the presented system is compared to a baseline UAV forced landing site detection system which uses edge features and an Artificial Neural Network (ANN) region type classifier. Experiments conducted on aerial image datasets captured over typical urban environments reveal that improved landing site detection can be achieved with an SVM classifier with an RBF kernel using a combination of colour and texture features. Compared to the baseline system, the proposed system provides a significant improvement in the likelihood of detecting a safe landing area, and its performance is more stable than the baseline's in the presence of changes to the UAV altitude.
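A minimal sketch of the kernel comparison step, assuming scikit-learn and random stand-in features (the study's actual colour and texture descriptors are not reproduced here; the non-linear label rule is invented so the RBF kernel has structure to exploit):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Random feature vectors standing in for per-region descriptors.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 8))
# Assumed non-linear ground truth (radial boundary): safe vs unsafe.
y = (np.sum(X**2, axis=1) > 8.0).astype(int)

# Cross-validated accuracy for each candidate kernel.
scores = {}
for kernel in ("linear", "rbf", "poly"):
    scores[kernel] = cross_val_score(SVC(kernel=kernel), X, y, cv=5).mean()
    print(kernel, round(scores[kernel], 3))
```

On a radially separable problem like this, the RBF kernel should outperform the linear one, mirroring the qualitative finding reported above.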


Objective: To estimate the relative inpatient costs of hospital-acquired conditions. Methods: Patient level costs were estimated using computerized costing systems that log individual utilization of inpatient services and apply sophisticated cost estimates from the hospital's general ledger. Occurrence of hospital-acquired conditions was identified using an Australian ‘condition-onset’ flag for diagnoses not present on admission. These were grouped to yield a comprehensive set of 144 categories of hospital-acquired conditions to summarize data coded with ICD-10. Standard linear regression techniques were used to identify the independent contribution of hospital-acquired conditions to costs, taking into account the case-mix of a sample of acute inpatients (n = 1,699,997) treated in Australian public hospitals in Victoria (2005/06) and Queensland (2006/07). Results: The most costly types of complications were post-procedure endocrine/metabolic disorders, adding AU$21,827 to the cost of an episode, followed by MRSA (AU$19,881) and enterocolitis due to Clostridium difficile (AU$19,743). Aggregate costs to the system, however, were highest for septicaemia (AU$41.4 million), complications of cardiac and vascular implants other than septicaemia (AU$28.7 million), acute lower respiratory infections, including influenza and pneumonia (AU$27.8 million) and UTI (AU$24.7 million). Hospital-acquired complications are estimated to add 17.3% to treatment costs in this sample. Conclusions: Patient safety efforts frequently focus on dramatic but rare complications with very serious patient harm. Previous studies of the costs of adverse events have provided information on ‘indicators’ of safety problems rather than the full range of hospital-acquired conditions. Adding a cost dimension to priority-setting could result in changes to the focus of patient safety programmes and research. 
Financial information should be combined with information on patient outcomes to allow for cost-utility evaluation of future interventions.
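The regression idea, an indicator for the condition-onset flag alongside case-mix controls whose coefficient is the independent cost contribution, can be illustrated on simulated data. Every figure below is invented for the sketch and is not from the study:

```python
import numpy as np

# Simulated episode data (assumed): one case-mix covariate, one
# hospital-acquired-condition (HAC) indicator, and episode cost.
rng = np.random.default_rng(2)
n = 5000
severity = rng.normal(size=n)                  # stand-in for case-mix
hac = rng.binomial(1, 0.1, size=n)             # condition-onset flag
cost = 8000 + 3000 * severity + 20000 * hac + rng.normal(0, 1000, size=n)

# Ordinary least squares: the HAC coefficient is the independent
# contribution of the condition to episode cost, holding case-mix fixed.
X = np.column_stack([np.ones(n), severity, hac])
beta, *_ = np.linalg.lstsq(X, cost, rcond=None)
print(round(beta[2]))   # ≈ 20000, the simulated marginal HAC cost
```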


This thesis examines and compares imaging methods used during the radiotherapy treatment of prostate cancer. The studies found that radiation therapists were able to localise and target the prostate consistently with planar imaging techniques and that the use of small gold markers in the prostate reduced the variation in prostate localisation when using volumetric imaging. It was concluded that larger safety margins are required when using volumetric imaging without gold markers.


Railway capacity determination and expansion are very important topics. In prior research, however, the competition between entities such as train services and train types on different network corridors has been ignored, poorly modelled, or assumed to be static. In response, a comprehensive set of multi-objective models is formulated in this article to perform a trade-off analysis. These models determine the total absolute capacity of railway networks as the most equitable solution according to a clearly defined set of competing objectives, and they perform a sensitivity analysis of capacity with respect to those objectives. The models have been extensively tested on a case study, demonstrating their worth. They were solved using a variety of techniques, of which an adaptive ε-constraint method proved the most effective. In order to identify only the best solution, a Simulated Annealing meta-heuristic was implemented and tested; a linearisation technique based upon separable programming was also developed and shown to be superior in terms of solution quality, though considerably more expensive in computational time.
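An ε-constraint method of the general kind mentioned above optimises one objective while constraining the others to thresholds that are swept to trace the trade-off frontier. A toy two-train-type sketch follows; the coefficients, capacities and SciPy formulation are our assumptions, not the article's model:

```python
from scipy.optimize import linprog

# Toy shared corridor: 1.0 unit of track time per passenger path,
# 1.5 per freight path, 240 units available (all values assumed).
frontier = []
for eps in range(0, 161, 20):
    res = linprog(
        c=[-1.0, 0.0],                        # maximise x_passenger
        A_ub=[[1.0, 1.5]], b_ub=[240.0],      # shared-capacity constraint
        bounds=[(0, None), (eps, None)],      # ε-constraint: freight >= eps
    )
    frontier.append((eps, -res.fun))

# Sweeping eps traces the trade-off between the two train types.
for eps, passenger in frontier:
    print(eps, passenger)
```

Each solve maximises passenger paths subject to a minimum freight commitment; increasing the threshold ε walks along the capacity trade-off frontier.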


Linear assets are engineering infrastructure, such as pipelines, railway lines, and electricity cables, which span long distances and can be divided into different segments. Optimal management of such assets is critical for asset owners as they normally involve significant capital investment. Currently, Time Based Preventive Maintenance (TBPM) strategies are commonly used in industry to improve the reliability of such assets, as they are easy to implement compared with reliability or risk-based preventive maintenance strategies. Linear assets are normally of large scale and thus their preventive maintenance is costly. Their owners and maintainers are always seeking to optimize their TBPM outcomes in terms of minimizing total expected costs over a long term involving multiple maintenance cycles. These costs include repair costs, preventive maintenance costs, and production losses. A TBPM strategy defines when Preventive Maintenance (PM) starts, how frequently the PM is conducted and which segments of a linear asset are operated on in each PM action. A number of factors such as required minimal mission time, customer satisfaction, human resources, and acceptable risk levels need to be considered when planning such a strategy. However, in current practice, TBPM decisions are often made based on decision makers’ expertise or industrial historical practice, and lack a systematic analysis of the effects of these factors. To address this issue, here we investigate the characteristics of TBPM of linear assets, and develop an effective multiple criteria decision making approach for determining an optimal TBPM strategy. We develop a recursive optimization equation which makes it possible to evaluate the effect of different maintenance options for linear assets, such as the best partitioning of the asset into segments and the maintenance cost per segment.


This article aims to fill the gap of second-order accurate, unconditionally stable schemes for the time-fractional subdiffusion equation. Two fully discrete schemes are first proposed for the time-fractional subdiffusion equation, with space discretised by finite elements and time discretised by fractional linear multistep methods. These two methods are unconditionally stable with a maximum global convergence order of $O(\tau+h^{r+1})$ in the $L^2$ norm, where $\tau$ and $h$ are the step sizes in time and space, respectively, and $r$ is the degree of the piecewise polynomial space. The average convergence rates of the two methods in time are also investigated and shown to be $O(\tau^{1.5}+h^{r+1})$. Furthermore, two improved algorithms are constructed; they are also unconditionally stable and convergent of order $O(\tau^2+h^{r+1})$. Numerical examples are provided to verify the theoretical analysis. Comparisons between the present algorithms and existing ones are included, showing that our numerical algorithms exhibit better performance than the known ones.
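Convergence orders like the $O(\tau^2)$ claim above are typically verified numerically by halving the time step (with a fine fixed spatial mesh) and estimating the observed order from successive errors. A minimal sketch with a synthetic error model standing in for the schemes' actual errors:

```python
import math

# Synthetic error model e(tau) = C * tau^2, standing in for measured
# L2 errors of a second-order scheme (C and the steps are assumptions).
C = 0.7
taus = [0.1 / 2**k for k in range(5)]
errors = [C * tau**2 for tau in taus]

# Observed order between consecutive step halvings:
# order = log2(e(tau) / e(tau/2)).
orders = [math.log2(errors[k] / errors[k + 1]) for k in range(len(errors) - 1)]
print(orders)   # each entry is 2, matching the O(tau^2) claim
```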


This thesis is concerned with the detection and prediction of rain in environmental recordings using different machine learning algorithms. The results obtained in this research will help ecologists to efficiently analyse environmental data and monitor biodiversity.


In the finite element modelling of structural frames, external loads such as wind, dead and imposed loads usually act along the elements rather than at the nodes only. Conventionally, when an element is subjected to such general transverse element loads, they are converted to nodal forces acting at the ends of the element by either the lumped or the consistent load approach. This conversion is especially consequential for first- and second-order elastic behaviour, to which steel structures, and thin-walled steel structures in particular, are critically prone, whereas stocky sections may instead be governed by inelastic behaviour. Accurate first- and second-order elastic displacement solutions along a loaded element are therefore crucial, but they cannot be obtained by either the lumped or the consistent load method alone, since no equilibrium condition is enforced along the element in the finite element formulation; this can impair the assessment of structural safety. A distinct element load method is therefore required to account for element loads non-linearly. If accurate displacement solutions are sought for first- and second-order elastic behaviour on the basis of a sophisticated non-linear element stiffness formulation, numerous prescribed stiffness matrices would be indispensable, one for each specific transverse loading pattern encountered. To circumvent this shortcoming, the present paper proposes a numerical technique that includes transverse element loading in the non-linear stiffness formulation without numerous prescribed stiffness matrices, and that predicts structural responses involving the effect of first-order element loads as well as the second-order coupling between the transverse load and the axial force in the element. 
This paper shows that the principle of superposition can be applied to derive a generalised stiffness formulation for the element load effect: the form of the stiffness matrix remains unchanged across loading patterns, and only the magnitude of the loading (the element load coefficients) needs to be adjusted. The non-linear effect of the element loading is then accounted for by updating the element load coefficients through the non-linear solution procedure. In principle, the element load distribution is converted into a single loading magnitude at mid-span, which provides the initial perturbation triggering the member bowing effect due to the transverse loads. This approach sacrifices the effect of the loading distribution away from mid-span, so the load-deflection behaviour elsewhere may be less accurate than at mid-span, but the discrepancy is shown to be trivial. This novelty allows a very useful generalised stiffness formulation to be derived for a single higher-order element under arbitrary transverse loading patterns. A further significance of this paper lies in shifting from purely nodal response (system analysis) to both nodal and element response (sophisticated element formulation): in the conventional finite element method, such as with the cubic element, accurate solutions are available only at the nodes, so structural safety cannot be reliably assessed within an element, which hinders engineering application. The results are verified using analytical stability function studies, as well as numerical results reported by independent researchers for several simple frames.
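For contrast with the conventional conversion the paper improves upon, the consistent (work-equivalent) nodal load vector for a uniformly distributed load on an Euler beam element can be sketched as below. The cubic Hermite shape functions and the quadrature are textbook material, not the paper's higher-order formulation:

```python
import numpy as np

def consistent_udl_loads(w: float, L: float) -> np.ndarray:
    """Nodal loads [V1, M1, V2, M2] work-equivalent to a UDL of intensity w."""
    xs, dx = np.linspace(0.0, L, 2001, retstep=True)
    xi = xs / L
    # Cubic Hermite shape functions for the transverse displacement field.
    N = np.stack([
        1 - 3 * xi**2 + 2 * xi**3,      # translation at node 1
        L * (xi - 2 * xi**2 + xi**3),   # rotation at node 1
        3 * xi**2 - 2 * xi**3,          # translation at node 2
        L * (-(xi**2) + xi**3),         # rotation at node 2
    ])
    vals = w * N                        # integrand of f = ∫ Nᵀ w dx
    # Composite trapezoidal rule along the element.
    return ((vals[:, :-1] + vals[:, 1:]) / 2 * dx).sum(axis=1)

f = consistent_udl_loads(w=5.0, L=2.0)
print(np.round(f, 4))  # ≈ [wL/2, wL²/12, wL/2, -wL²/12] = [5.0, 1.6667, 5.0, -1.6667]
```

The computed vector recovers the classical fixed-end forces; the paper's point is that such end forces alone, however computed, cannot reproduce accurate displacements along the element without the non-linear element load treatment it proposes.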


Peggy Shaw’s RUFF (USA 2013) and Queensland Theatre Company’s collaboration with Queensland University of Technology, Total Dik! (Australia 2013), overtly and evocatively draw on an aestheticized use of the cinematic techniques and technologies of chroma key to reveal the tensions in their production and to add layers to their performances. In doing so they offer invaluable insight into where the filmic and theatrical approaches overlap. This paper draws on Eckersall, Grehan and Scheer’s New Media Dramaturgy (2014) to reposition the frame as a contribution to intermedial theatre and performance practices, in light of increasing convergence between seemingly disparate discourses. In RUFF, the scenic environment replicates a chroma-key ‘studio’ which facilitates the reconstruction of memory displaced after a stroke. RUFF uses the screen and projections to recall crooners, lounge singers, movie stars, rock and roll bands, and an eclectic line of eccentric family members living inside Shaw. While the show pays tribute to those who have kept her company across decades of theatrical performance, the use of a non-composited chroma-key technique as a theatrical device, and the work’s taciturn revelation of the production process during performance, play a central role in its exploration of the juxtaposition between its reconstructed form and content. In contrast, Total Dik! uses real-time green-screen compositing during performance as a scenic device. Actors manipulate scale models, refocus cameras and generate scenes within scenes in the construction of the work’s examination of an isolated Dictator. The ‘studio’ is again replicated as a site for (re)construction, only in this case Total Dik! actively seeks to reveal the process of production as the performance plays out. 
Building on RUFF and other works such as By the Way, Meet Vera Stark (2012) and Hotel Modern’s God’s Beard (2012), this work blends mobile technologies, scale models and green-screen capture to explore aspects of transmedia storytelling in a theatrical environment (Jenkins, 2009, 2013). When a green screen is placed on stage, it reads at once as metaphor and as a challenge to the language of theatre. It becomes, or rather acts as, a ‘sign’ that alludes to the nature of the reconstructed, recomposited, manipulated and controlled. In RUFF and in Total Dik! it is also a place where, as a mode of production and subsequent reveal, it adds weight to performance. These works are informed by Auslander (1999) and Giesekam (2007), and speak to and echo Lehmann’s Postdramatic Theatre (2006). This paper’s consideration of the integration of studio technique and live performance as a dynamic approach to multi-layered theatrical production develops our understanding of their combined use in a live performance environment.


The study sought to explore the initial impact of the ACT's implementation of a roadside oral fluid drug screening program. A number of individuals reported intentions to drug drive in the future. The classical deterrence theory variables of certainty of apprehension and severity and swiftness of sanctions were not predictive of intentions to drug drive in the future. In contrast, having avoided apprehension, and having known of others who have avoided apprehension, were predictive of such intentions. Increasing perceptions of the certainty of apprehension, increased testing frequency, and increased awareness of the oral fluid drug screening program could potentially lead to reductions in drug driving and result in a safer road environment for all ACT community members.


Organizations executing similar business processes need to understand the differences and similarities in activities performed across work environments. Presently, research interest is directed towards the potential of visualization for the display of process models, to support users in their analysis tasks. Although the recent literature on process mining and comparison provides several methods and algorithms for process and log comparison, few contributions explore novel visualization approaches. This paper analyses process comparison from a design perspective, providing practical visualization techniques as analysis solutions to support process analysis. The design of the visual comparison is tackled from three different points of view: the general model, the projected model and the side-by-side comparison, in order to support the needs of business analysts. A case study is presented showing the application of process mining and visualization techniques to patient treatment across two Australian hospitals.


In this paper we excite bound long-range stripe plasmon modes with a highly focused laser beam. We demonstrate highly confined plasmons propagating along a 50 μm long silver stripe, 750 nm wide and 30 nm thick. Two excitation techniques were studied: focusing the laser spot onto the waveguide end, and focusing it onto a silver grating. By comparing the intensity of the out-coupled photons at the end of the stripe for both grating and end excitation, we show that gratings provide a factor-of-two increase in output intensity, so plasmons excited by this technique are easier to detect. The authors expect that the outcome of this paper will prove beneficial for the development of passive nano-optical devices based on stripe waveguides, by providing insight into the different excitation techniques available and the advantages of each.


Resource assignment and scheduling is a difficult task when job processing times are stochastic and resources are to be used for both known and unknown demand. To operate effectively within such an environment, several novel strategies are investigated. The first focuses upon the creation of a robust schedule and utilises the concept of strategically placed idle time (i.e. buffering). The second approach introduces the idea of maintaining a number of free resources at each time, and culminates in another form of strategically placed buffering. The attraction of these approaches is that they are easy to grasp conceptually and mimic what practitioners already do in practice. Our extensive numerical testing has shown that these techniques ensure more prompt job processing and reduce job cancellations and waiting time. They are effective in the considered setting and could easily be adapted to many real-life problems, for instance those in health care. More importantly, this article has demonstrated that integrating the two approaches is a better strategy and provides an effective stochastic scheduling approach.
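The effect of strategically placed idle time can be illustrated with a toy single-resource simulation; the distributions, sizes and the delay metric below are our assumptions, not the article's experiments:

```python
import random
import statistics

def mean_delay(buffer: float, n_jobs: int = 50, trials: int = 200) -> float:
    """Average start-time delay when each planned slot includes `buffer` idle time."""
    rng = random.Random(3)          # same seed for both calls -> fair comparison
    mean_dur = 10.0
    delays = []
    for _ in range(trials):
        # Published schedule: mean duration plus a strategically placed buffer.
        planned = [i * (mean_dur + buffer) for i in range(n_jobs)]
        t = 0.0
        for i in range(n_jobs):
            start = max(t, planned[i])            # wait for slot and for the resource
            delays.append(start - planned[i])     # delay versus the published schedule
            t = start + rng.expovariate(1 / mean_dur)  # stochastic processing time
    return statistics.mean(delays)

print(mean_delay(buffer=0.0) > mean_delay(buffer=2.0))  # buffering cuts delays
```

With zero buffer the schedule runs at full utilisation and delays cascade from job to job; a modest buffer absorbs duration variability, which is exactly the robustness effect described above.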