163 results for Linear Quadratic


Relevance:

20.00%

Publisher:

Abstract:

Prevention and safety promotion programmes. Traditionally, in-depth investigations of crash risks are conducted using exposure-controlled or case-control methodologies. However, these studies need either observational data for control cases or exogenous exposure data, such as vehicle-kilometres travelled, entry flow, or the product of conflicting flows for a particular traffic location or site. These data are not readily available and often require extensive system-wide data collection. Aim: The objective of this research is to propose an alternative methodology for investigating the crash risks of a road-user group under different circumstances using readily available traffic police crash data. Methods: This study employs a combination of a log-linear model and the quasi-induced exposure technique to estimate the crash risks of a road-user group. While the log-linear model reveals the significant interactions, and thus the prevalence of crashes of a road-user group, under various sets of traffic, environmental and roadway factors, the quasi-induced exposure technique estimates the relative exposure of that road-user group for the same set of explanatory variables. The combination of these two techniques therefore provides relative measures of crash risk under various roadway, environmental and traffic conditions. The proposed methodology is illustrated using five years of Brisbane motorcycle crash data. Results: Interpretation of the results for different combinations of interacting factors shows that the poor conspicuity of motorcycles is a predominant cause of motorcycle crashes. The inability of other drivers to correctly judge the speed and distance of an oncoming motorcyclist is also evident in right-of-way-violation motorcycle crashes at intersections. Discussion and Conclusions: The combination of a log-linear model and the induced exposure technique is a promising methodology and can be applied to better estimate the crash risks of other road users.
This study also highlights the importance of considering interaction effects to better understand hazardous situations. A further study comparing the proposed methodology with the case-control method would be useful.
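
As a toy illustration of the quasi-induced exposure idea (not the paper's full log-linear analysis), the not-at-fault parties in two-vehicle crashes can serve as an exposure proxy, so no external traffic-volume data are needed. All counts below are invented:

```python
# Quasi-induced exposure: not-at-fault parties in two-vehicle crashes
# act as a proxy for the exposure of each road-user group.

def relative_risk(at_fault, not_at_fault, group):
    """Relative crash risk of `group`: its share among at-fault parties
    divided by its share among not-at-fault (exposure proxy) parties."""
    p_fault = at_fault[group] / sum(at_fault.values())
    p_exposed = not_at_fault[group] / sum(not_at_fault.values())
    return p_fault / p_exposed

# Hypothetical night-time crash counts by road-user group.
at_fault     = {"motorcycle": 120, "car": 300, "truck": 80}
not_at_fault = {"motorcycle": 60,  "car": 360, "truck": 80}

rr = relative_risk(at_fault, not_at_fault, "motorcycle")
print(round(rr, 2))  # > 1 means over-involvement relative to exposure
```

A ratio above 1 indicates that the group is over-represented among at-fault parties relative to its estimated presence in traffic under that condition.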

Relevance:

20.00%

Publisher:

Abstract:

The work presented in this poster outlines the steps taken to model a 4 mm conical collimator (BrainLab, Germany) on a Novalis Tx linear accelerator (Varian, Palo Alto, USA) capable of producing a 6 MV photon beam for the treatment of Stereotactic Radiosurgery (SRS) patients. The model was verified by measurements in liquid water and in virtual water: scanning depth-dose curves and profiles in a water tank, as well as measuring output factors in virtual water using Gafchromic® EBT3 film.

Relevance:

20.00%

Publisher:

Abstract:

Purpose – The purpose of this paper is to examine the possibility of an inverted U-shaped relationship between job demands and work engagement, and whether social support moderates this relationship. Design/methodology/approach – This study draws on 307 technical and information technology (IT) managers who responded to an online survey. Multiple regressions are employed to examine linear and curvilinear relationships among the variables. Findings – Overall, the results support a quadratic effect of job demands on employee engagement. However, only supervisor support, not colleague support, moderated the relationship between job demands and work engagement. Originality/value – The paper is the first to shed light on the quadratic effect of job demands on work engagement. The findings have noteworthy implications for managers seeking to design optimal job demands that increase employee engagement.
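
As a sketch of how such a curvilinear effect can be tested (on synthetic data, not the paper's survey), one can regress engagement on job demands and its square; a negative coefficient on the squared term indicates the inverted-U shape:

```python
import numpy as np

# Synthetic data with a built-in inverted-U relationship.
rng = np.random.default_rng(0)
demands = rng.uniform(0, 10, 300)
engagement = 2 + 1.5 * demands - 0.15 * demands**2 + rng.normal(0, 0.5, 300)

# Least-squares fit of the quadratic model (coefficients: highest power first).
b2, b1, b0 = np.polyfit(demands, engagement, deg=2)

# Vertex of the parabola: the demand level at which engagement peaks.
peak = -b1 / (2 * b2)
print(b2 < 0, round(peak, 1))
```

In practice the quadratic term is tested hierarchically against the linear model, and moderation is examined by adding interaction terms with the support variables.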

Relevance:

20.00%

Publisher:

Abstract:

Motorcyclists are the most crash-prone road-user group in many Asian countries, including Singapore; however, the factors influencing motorcycle crashes are still not well understood. This study examines the effects of various roadway characteristics, traffic control measures and environmental factors on motorcycle crashes at different location types, including expressways and intersections. Using techniques of categorical data analysis, this study develops a set of log-linear models to investigate multi-vehicle motorcycle crashes in Singapore. Motorcycle crash risks in different circumstances are calculated after controlling for exposure estimated by the induced exposure technique. Results show that night-time conditions increase the crash risks of motorcycles, particularly during merging and diverging manoeuvres on expressways and turning manoeuvres at intersections. Riders appear to exercise more care while riding on wet road surfaces, particularly at night. Many hazardous interactions at intersections tend to be related to the failure of drivers to notice a motorcycle, as well as to judge correctly the speed and distance of an oncoming motorcycle. Roadside conflicts due to stopping or waiting vehicles, and interactions with opposing traffic on undivided roads, have been found to be detrimental to motorcycle safety along arterial, main and local roads away from intersections. Based on the findings of this study, several targeted countermeasures in the form of legislation, rider training and safety awareness programmes are recommended.
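
For intuition on the log-linear side: in a saturated log-linear model of a 2×2 table, the strength of the two-way interaction is captured by the (log) odds ratio of the cell counts. A minimal sketch with invented crash and induced-exposure counts:

```python
import math

# Invented counts: motorcycle multi-vehicle crashes vs. induced-exposure
# cases, split by time of day.  An odds ratio above 1 means crashes are
# over-represented at night relative to estimated exposure.
table = {
    ("night", "crash"): 90,  ("night", "exposure"): 40,
    ("day",   "crash"): 150, ("day",   "exposure"): 170,
}

odds_ratio = (table[("night", "crash")] * table[("day", "exposure")]) / (
    table[("day", "crash")] * table[("night", "exposure")])
interaction = math.log(odds_ratio)  # proportional to the log-linear interaction term
print(round(odds_ratio, 2), interaction > 0)
```

A positive log odds ratio here corresponds to a significant time-of-day × crash-involvement interaction in the log-linear model, i.e. elevated night-time risk after accounting for exposure.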

Relevance:

20.00%

Publisher:

Abstract:

Hybrid system representations have been exploited in a number of challenging modelling situations, including situations where the original nonlinear dynamics are too complex (or too imprecisely known) to be filtered directly. Unfortunately, the question of how best to design suitable hybrid system models has not yet been fully addressed, particularly in situations involving model uncertainty. This paper proposes a novel joint state-measurement relative entropy rate based approach for the design of hybrid system filters in the presence of (parameterised) model uncertainty. We also present a design approach suitable for suboptimal hybrid system filters. The benefits of our proposed approaches are illustrated through design examples and simulation studies.

Relevance:

20.00%

Publisher:

Abstract:

This paper formulates an edge-based smoothed conforming point interpolation method (ES-CPIM) for solid mechanics using triangular background cells. In the ES-CPIM, a technique for obtaining conforming PIM shape functions (CPIM) is used to create a continuous and piecewise-quadratic displacement field over the whole problem domain. The smoothed strain field is then obtained through a smoothing operation over each smoothing domain associated with the edges of the triangular background cells. The generalized smoothed Galerkin weak form is then used to create the discretized system equations. Numerical studies demonstrate that the ES-CPIM possesses the following good properties: (1) ES-CPIM creates conforming quadratic PIM shape functions and always passes the standard patch test; (2) ES-CPIM produces a quadratic displacement field without introducing any additional degrees of freedom; (3) the results of ES-CPIM are generally of very high accuracy.

Relevance:

20.00%

Publisher:

Abstract:

A major challenge in modern photonics and nano-optics is the diffraction limit of light, which does not allow field localisation into regions with dimensions smaller than half the wavelength. Localisation of light into nanoscale regions (beyond its diffraction limit) has applications ranging from the design of optical sensors and measurement techniques with resolutions as high as a few nanometres, to the effective delivery of optical energy into targeted nanoscale regions such as quantum dots, nano-electronic and nano-optical devices. This field has become a major research direction over the last decade. The use of strongly localised surface plasmons in metallic nanostructures is one of the most promising approaches to overcoming this problem. Therefore, the aim of this thesis is to investigate the linear and non-linear propagation of surface plasmons in metallic nanostructures. This thesis focuses on two main areas of plasmonic research: plasmon nanofocusing and plasmon nanoguiding. Plasmon nanofocusing – The main aim of plasmon nanofocusing research is to focus plasmon energy into nanoscale regions using metallic nanostructures and at the same time achieve strong local field enhancement. Various structures for nanofocusing purposes have been proposed and analysed, such as sharp metal wedges, tapered metal films on dielectric substrates, tapered metal rods, and dielectric V-grooves in metals. However, a number of important practical issues related to nanofocusing in these structures remain unclear. Therefore, one of the main aims of this thesis is to address two of the most important of these issues: the coupling efficiency and the heating effects of surface plasmons in metallic nanostructures. The method of analysis developed throughout this thesis is a general treatment that can be applied to a diversity of nanofocusing structures, with results shown here for the specific case of sharp metal wedges.
Based on the geometrical optics approximation, it is demonstrated that the coupling efficiency from plasmons generated with a metal grating into the nanofocused symmetric or quasi-symmetric modes may vary between ~50% and ~100%, depending on the structural parameters. Optimal conditions for nanofocusing, with a view to minimising coupling and dissipative losses, are also determined and discussed. It is shown that the temperature near the tip of a metal wedge heated by nanosecond plasmonic pulses can increase by several hundred degrees Celsius. This temperature increase is expected to lead to nonlinear effects, self-influence of the focused plasmon, and ultimately self-destruction of the metal tip. This thesis also investigates a different type of nanofocusing structure, which consists of a tapered high-index dielectric layer resting on a metal surface. It is shown that the nanofocusing mechanism that occurs in this structure is somewhat different from that of the other structures considered thus far. For example, the surface plasmon experiences significant back-reflection and mode transformation at a cut-off thickness. In addition, the reflected plasmon shows negative refraction properties that have not been observed in other nanofocusing structures considered to date. Plasmon nanoguiding – Guiding surface plasmons using metallic nanostructures is important for the development of highly integrated optical components and circuits, which are expected to have superior performance compared to their electronic-based counterparts. A number of different plasmonic waveguides have been considered over the last decade, including the recently considered gap and trench plasmon waveguides, which have proven difficult to fabricate.
Therefore, this thesis proposes and analyses four different modified gap and trench plasmon waveguides that are expected to be easier to fabricate while at the same time offering improved propagation characteristics of the guided mode. In particular, it is demonstrated that the guided modes are significantly screened by the extended metal at the bottom of the structure. This is important for the design of highly integrated optics, as it provides the opportunity to place two waveguides close together without significant cross-talk. This thesis also investigates the use of plasmonic nanowires to construct a Fabry-Pérot resonator/interferometer. It is shown that the resonance effect can be achieved with an appropriate resonator length and gap width. Typical quality factors of the Fabry-Pérot cavity are determined and explained in terms of radiative and dissipative losses. The possibility of using a nanowire resonator for the design of plasmonic filters with close to ~100% transmission is also demonstrated. It is expected that the results obtained in this thesis will play a vital role in the development of high-resolution near-field microscopy and spectroscopy, new measurement techniques and devices for single-molecule detection, highly integrated optical devices, and nanobiotechnology devices for the diagnostics of living cells.

Relevance:

20.00%

Publisher:

Abstract:

The overall objective of this thesis is to explore how and why the content of individuals' psychological contracts changes over time. The contract is generally understood as "individual beliefs, shaped by the organisation, regarding the terms of an exchange agreement between individuals and their organisation" (Rousseau, 1995, p. 9). With an overall study sampling frame of 320 graduate organisational newcomers, a mixed-methods longitudinal research design comprising three sequential, inter-related studies is employed in order to capture the change process. From the 15 semi-structured interviews conducted in Study 1, the key findings included identifying a relatively high degree of mutuality between employees' and their managers' reciprocal contract beliefs around the time of organisational entry. Also at this time, individuals had developed specific components of their contract content through a mix of social network information (regarding broader employment expectations) and perceptions of various elements of their particular organisation's reputation (for more firm-specific expectations). Study 2 utilised a four-wave survey approach (available to the full sampling frame) over the 14 months following organisational entry to explore the 'shape' of individuals' contract change trajectories and the role of four theorised change predictors in driving these trajectories. The predictors represented an organisational-level informational cue (perceptions of corporate reputation), a dyadic-level informational cue (perceptions of manager-employee relationship quality) and two individual difference variables (affect and hardiness). Through the use of individual growth modelling, the findings showed differences in the general change patterns across contract content components, with perceived employer obligations exhibiting generally quadratic change patterns and perceived employee obligations exhibiting generally no-change patterns.
Further, individuals differentially used the predictor variables to construct beliefs about specific contract content. While both organisational- and dyadic-level cues were focused upon to construct employer obligation beliefs, organisational-level cues and individual difference variables were focused upon to construct employee obligation beliefs. Through undertaking 26 semi-structured interviews, Study 3 focused upon gaining a richer understanding of why participants' contracts changed, or otherwise, over the study period, with a particular focus upon the roles of breach and violation. Breach refers to an employee's perception that an employer obligation has not been met, and violation refers to the negative, affective employee reactions which may ensue following a breach. The main contribution of these findings was identifying that, subsequent to a breach or violation event, a range of 'remediation effects' could be activated by employees which, depending upon their effectiveness, served to instigate breach repair, contract repair or both. These effects mostly instigated broader contract repair and were generally cognitive strategies enacted by an individual to re-evaluate the breach situation and re-focus upon other positive aspects of the employment relationship. As such, the findings offered new evidence for a clear distinction between remedial effects which serve only to repair the breach (and thus the contract) and effects which only repair the contract more broadly; however, when effective, both resulted in individuals again viewing their employment relationships positively. Overall, in response to the overarching research question of this thesis (how and why individuals' psychological contract beliefs change), individuals do indeed draw upon various information sources, particularly at the organisational level, as cues or guides in shaping their contract content.
Further, the 'shapes' of the changes in beliefs about employer and employee obligations generally follow different, and not necessarily linear, trajectories over time. Finally, breach and violation, as well as the remedial actions which address these occurrences (either by remedying the breach itself, and thus the contract, or the contract only), play central roles in guiding individuals' contract changes to greater or lesser degrees. The findings of the thesis provide both academics and practitioners with greater insights into how employees construct their contract beliefs over time, the salient informational cues used to do this, and how the effects of breach and violation can be mitigated by creating an environment which facilitates the use of effective remediation strategies.
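
As a purely illustrative sketch of the trajectory-shape comparison (invented four-wave data, not the thesis's individual growth models), one can ask whether a quadratic curve fits a belief trajectory substantially better than a no-change (constant) model:

```python
import numpy as np

# Four survey waves with invented mean belief ratings.
waves = np.array([0.0, 1.0, 2.0, 3.0])
beliefs = {
    "employer obligations": np.array([4.0, 3.4, 3.2, 3.6]),  # dips, then recovers
    "employee obligations": np.array([3.5, 3.5, 3.6, 3.5]),  # essentially flat
}

def sse(y, coeffs):
    """Sum of squared residuals of a polynomial fit at the wave points."""
    return float(np.sum((y - np.polyval(coeffs, waves)) ** 2))

quadratic_wins = {}
for name, y in beliefs.items():
    flat = sse(y, np.polyfit(waves, y, 0))    # no-change model (constant)
    quad = sse(y, np.polyfit(waves, y, 2))    # quadratic-change model
    quadratic_wins[name] = quad < 0.5 * flat  # does curvature clearly help?
    print(name, "-> quadratic clearly better:", quadratic_wins[name])
```

In the thesis's actual analysis this comparison is done with individual growth modelling across the full sample, which also estimates person-level variation around the average trajectory.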

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we present the outcomes of a project exploring the use of Field Programmable Gate Arrays (FPGAs) as co-processors for scientific computation. We designed a custom circuit for the pipelined solving of multiple tri-diagonal linear systems. The design is well suited to applications that require many independent tri-diagonal system solves, such as finite difference methods for solving PDEs or applications utilising cubic spline interpolation. The selected solver algorithm was the Tri-Diagonal Matrix Algorithm (TDMA, or Thomas algorithm). Our solver supports user-specified precision through the use of a custom floating-point VHDL library supporting addition, subtraction, multiplication and division. The variable-precision TDMA solver was tested for correctness in simulation mode. The TDMA pipeline was tested successfully in hardware using a simplified solver model. The details of the implementation, its limitations, and future work are also discussed.
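
For reference, a plain-software sketch of the Thomas algorithm that the hardware pipeline implements (standard forward elimination and back substitution; plain floats stand in for the custom variable-precision arithmetic, and this is not the VHDL design itself):

```python
def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system with sub-diagonal a, diagonal b,
    super-diagonal c and right-hand side d (a[0] and c[-1] unused)."""
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                      # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):             # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# 1D Laplacian-style system: tridiagonal [-1, 2, -1] with RHS [1, 0, 1].
x = thomas_solve([0, -1, -1], [2, 2, 2], [-1, -1, 0], [1, 0, 1])
print(x)  # → [1.0, 1.0, 1.0]
```

Each solve is a strictly sequential recurrence, which is why the FPGA design pipelines many independent systems rather than parallelising within one solve.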

Relevance:

20.00%

Publisher:

Abstract:

In 1991, McNabb introduced the concept of mean action time (MAT) as a finite measure of the time required for a diffusive process to effectively reach steady state. Although this concept was initially adopted by others within the Australian and New Zealand applied mathematics community, it appears to have had little use outside this region until very recently, when in 2010 Berezhkovskii and coworkers rediscovered the concept of MAT in their study of morphogen gradient formation. All previous work in this area has been limited to single-species differential equations, such as the linear advection-diffusion-reaction equation. Here we generalise the concept of MAT by showing how the theory can be applied to coupled linear processes. We begin by studying coupled ordinary differential equations and extend our approach to coupled partial differential equations. Our new results have broad applications, including the analysis of models describing coupled chemical decay and cell differentiation processes, amongst others.
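
As a small worked example of the MAT idea for the simplest linear process (not the paper's coupled-system machinery): for decay dC/dt = -kC, treating F(t) = 1 - C(t)/C(0) as a cumulative distribution of the "action" time gives MAT = ∫ t F'(t) dt = 1/k, which a crude numerical integration confirms:

```python
import math

# Numerically integrate t * f(t) dt, where f(t) = dF/dt = k e^{-kt}
# is the "action time" density for dC/dt = -k C.  The exact MAT is 1/k.
k = 2.0
dt, T = 1e-4, 20.0          # step size and truncation time
mat = 0.0
for i in range(int(T / dt)):  # trapezoidal rule
    t0, t1 = i * dt, (i + 1) * dt
    f0 = k * math.exp(-k * t0)
    f1 = k * math.exp(-k * t1)
    mat += 0.5 * (t0 * f0 + t1 * f1) * dt
print(round(mat, 4))  # → 0.5, i.e. 1/k
```

The paper's contribution is extending this scalar construction to vectors of coupled linear ODEs and PDEs, where each species gets its own finite mean action time.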

Relevance:

20.00%

Publisher:

Abstract:

Linear adaptive channel equalization using the least mean square (LMS) and recursive least-squares (RLS) algorithms is proposed for an innovative multi-user (MU) MIMO-OFDM wireless broadband communications system. The proposed equalization method adaptively compensates for the channel impairments caused by frequency selectivity in the propagation environment. Simulations for the proposed adaptive equalizer are conducted using a training-sequence method to determine optimal performance through a comparative analysis. Results show an improvement of 0.15 in BER (at an SNR of 16 dB) when using adaptive equalization with the RLS algorithm compared to the case in which no equalization is employed. In general, adaptive equalization using the LMS and RLS algorithms proved to be significantly beneficial for MU-MIMO-OFDM systems.
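
As a toy sketch of training-sequence-based LMS equalization on a simple two-tap ISI channel (synthetic BPSK data; the paper's MU-MIMO-OFDM setting is far richer):

```python
import numpy as np

rng = np.random.default_rng(1)
symbols = rng.choice([-1.0, 1.0], size=2000)     # known BPSK training sequence
received = np.convolve(symbols, [1.0, 0.4])[:len(symbols)]  # 2-tap ISI channel
received += rng.normal(0, 0.05, len(symbols))    # additive channel noise

n_taps, mu = 5, 0.01                             # equalizer length, LMS step size
w = np.zeros(n_taps)
for i in range(n_taps, len(symbols)):            # LMS training pass
    x = received[i - n_taps + 1:i + 1][::-1]     # newest sample first
    e = symbols[i] - w @ x                       # error vs. known training symbol
    w += mu * e * x                              # stochastic-gradient weight update

# Second pass with frozen weights: hard decisions vs. the sent symbols.
errors = sum(
    (1.0 if w @ received[i - n_taps + 1:i + 1][::-1] > 0 else -1.0) != symbols[i]
    for i in range(n_taps, len(symbols)))
print("symbol errors with trained equalizer:", errors)
```

RLS follows the same training-sequence pattern but replaces the gradient step with a recursively updated inverse correlation matrix, converging faster at higher per-update cost.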

Relevance:

20.00%

Publisher:

Abstract:

Significant wheel-rail dynamic forces occur because of imperfections in the wheels and/or rail. One of the key consequences of the transmission of these forces down through the track is impact loading on the sleepers. Dynamic analysis of nonlinear systems is very complicated and does not lend itself easily to a classical solution of multiple equations. Deducing the behaviour of track components from experimental data is very difficult, because such data are hard to obtain and apply only to the particular conditions of the track being tested. The finite element method can be the best solution to this dilemma. This paper describes a finite element model, built with the software package ANSYS, of various sized flat defects in the tread of a wheel rolling at a typical speed on heavy haul track. The paper explores the dynamic response of a prestressed concrete sleeper to these defects.

Relevance:

20.00%

Publisher:

Abstract:

The R statistical environment and language has demonstrated particular strengths for interactive development of statistical algorithms, as well as data modelling and visualisation. Its current implementation has an interpreter at its core, which may result in a performance penalty in comparison to directly executing user algorithms in the native machine code of the host CPU. In contrast, the C++ language has no built-in visualisation capabilities, handling of linear algebra, or even basic statistical algorithms; however, user programs are converted to high-performance machine code ahead of execution. A new method avoids possible speed penalties in R by using the Rcpp extension package in conjunction with the Armadillo C++ matrix library. In addition to the inherent performance advantages of compiled code, Armadillo provides an easy-to-use template-based meta-programming framework that allows several linear algebra operations to be automatically pooled into one, which in turn can lead to further speedups. With the aid of Rcpp and Armadillo, conversion of linear-algebra-centred algorithms from R to C++ becomes straightforward. The converted algorithms retain their overall structure and readability, all while maintaining a bidirectional link with the host R environment. Empirical timing comparisons of R and C++ implementations of a Kalman filtering algorithm indicate a speedup of several orders of magnitude.