117 results for Jernström Offset


Relevance: 10.00%

Abstract:

Background: Cervical cancer and infection with human immunodeficiency virus (HIV) are both important public health problems in South Africa (SA). The aim of this study was to determine the prevalence of cervical squamous intraepithelial lesions (SILs), high-risk human papillomavirus (HR-HPV), HPV viral load and HPV genotypes in HIV-positive women initiating anti-retroviral (ARV) therapy. Methods: A cross-sectional survey was conducted at an ARV treatment clinic in Cape Town, SA in 2007. Cervical specimens were taken for cytological analysis and HPV testing. The Digene Hybrid Capture 2 (HC2) test was used to detect HR-HPV. Relative light units (RLU) were used as a measure of HPV viral load. HPV types were determined using the Roche Linear Array HPV Genotyping test. Crude associations with abnormal cytology were tested, and multiple logistic regression was used to determine independent risk factors for abnormal cytology. Results: The median age of the 109 participants was 31 years and the median CD4 count was 125/mm³; 66.3% had an abnormal Pap smear, the HR-HPV prevalence was 78.9% (Digene), the median HPV viral load was 181.1 RLU (HC2-positive samples only) and 78.4% had multiple genotypes. Among women with abnormal smears the most prevalent HR-HPV types were HPV types 16, 58 and 51, all with a prevalence of 28.5%. On univariate analysis, HR-HPV, multiple HPV types and HPV viral load were significantly associated with the presence of low- and high-grade SILs (LSIL/HSIL). The multivariate logistic regression showed that HPV viral load was associated with increased odds of LSIL/HSIL: an odds ratio of 10.7 (95% CI 2.0–57.7) for those who were HC2 positive with a viral load ≤ 181.1 RLU (the median HPV viral load), and 33.8 (95% CI 6.4–178.9) for those who were HC2 positive with an HPV viral load > 181.1 RLU. Conclusion: Women initiating ARVs have a high prevalence of abnormal Pap smears and HR-HPV. 
Our results underscore the need for locally relevant, rigorous screening protocols for the increasing numbers of women accessing ARV therapy, so that the benefits of ARVs are not partially offset by an excess risk of cervical cancer.
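The study reports odds ratios with Wald-style 95% confidence intervals from logistic regression. As an illustrative sketch only (the counts below are hypothetical, not the study's data), a crude odds ratio and its Wald interval can be computed from a 2×2 table:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts, NOT the study's data:
or_, lo, hi = odds_ratio_ci(40, 10, 20, 30)
```

A multivariate logistic regression, as used in the paper, adjusts such crude estimates for the other covariates; the arithmetic above shows only the unadjusted form of the reported quantities.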

Relevance: 10.00%

Abstract:

A significant issue encountered when fusing data received from multiple sensors is the accuracy of the timestamp associated with each piece of data. This is particularly important in applications such as Simultaneous Localisation and Mapping (SLAM), where vehicle velocity forms an important part of the mapping algorithms; on fast-moving vehicles, even millisecond inconsistencies in data timestamping can produce errors which need to be compensated for. The timestamping problem is compounded in a robot swarm environment due to the use of non-deterministic, readily-available hardware (such as 802.11-based wireless) and inaccurate clock synchronisation protocols (such as the Network Time Protocol (NTP)). As a result, the synchronisation of the clocks between robots can be out by tens to hundreds of milliseconds, making correlation of data difficult and preventing the units from performing synchronised actions such as triggering cameras or intricate swarm manoeuvres. In this thesis, a complete data fusion unit is designed, implemented and tested. The unit, named BabelFuse, is able to accept sensor data from a number of low-speed communication buses (such as RS232, RS485 and CAN Bus) and also to timestamp events that occur on General Purpose Input/Output (GPIO) pins by referencing a submillisecond-accurate, wirelessly-distributed "global" clock signal. In addition to its timestamping capabilities, it can also be used to trigger an attached camera at a predefined start time and frame rate. This functionality enables the creation of a wirelessly-synchronised distributed image acquisition system over a large geographic area; a real-world application of this functionality is a platform to facilitate wirelessly-distributed 3D stereoscopic vision. A ‘best-practice’ design methodology is adopted within the project to ensure the final system operates according to its requirements. 
Initially, requirements are generated, from which a high-level architecture is distilled. This architecture is then converted into a hardware specification and low-level design, which is then manufactured. The manufactured hardware is verified to ensure it operates as designed, and firmware and Linux Operating System (OS) drivers are written to provide the features and connectivity required of the system. Finally, integration testing is performed to ensure the unit functions as per its requirements. The BabelFuse system comprises a single Grand Master unit, which is responsible for maintaining the absolute value of the "global" clock. Slave nodes then determine their local clock offset from that of the Grand Master via synchronisation events which occur multiple times per second. The mechanism used for wirelessly synchronising the clocks between the boards makes use of specific hardware and a firmware protocol based on elements of the IEEE-1588 Precision Time Protocol (PTP). With the key requirement of the system being submillisecond-accurate clock synchronisation (as a basis for timestamping and camera triggering), automated testing is carried out to monitor the offsets between each Slave and the Grand Master over time. A common strobe pulse is also sent to each unit for timestamping; the correlation between the timestamps of the different units is used to validate the clock offset results. Analysis of the automated test results shows that the BabelFuse units are almost three orders of magnitude more accurate than their requirement: the clocks of the Slave and Grand Master units do not differ by more than three microseconds over a running time of six hours, and the mean clock offset of Slaves to the Grand Master is less than one microsecond. The common strobe pulse used to verify the clock offset data yields a positive result, with a maximum variation between units of less than two microseconds and a mean value of less than one microsecond. 
The camera triggering functionality is verified by connecting the trigger pulse output of each board to a four-channel digital oscilloscope and setting each unit to output a 100 Hz periodic pulse with a common start time. The resulting waveform shows a maximum variation between the rising edges of the pulses of approximately 39 µs, well below its target of 1 ms.
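The Slave-to-Grand-Master offset estimation described above follows the IEEE-1588 two-way message exchange. A minimal sketch of the standard offset and path-delay computation (the timestamp values below are illustrative, and a symmetric network path is assumed):

```python
def ptp_offset_delay(t1, t2, t3, t4):
    """IEEE-1588 delay request-response mechanism:
    t1: Sync sent by master      (master clock)
    t2: Sync received by slave   (slave clock)
    t3: Delay_Req sent by slave  (slave clock)
    t4: Delay_Req received by master (master clock)
    Assumes the path delay is symmetric in both directions."""
    offset = ((t2 - t1) - (t4 - t3)) / 2.0   # slave clock minus master clock
    delay = ((t2 - t1) + (t4 - t3)) / 2.0    # one-way path delay
    return offset, delay

# Example timestamps in microseconds (illustrative values only):
off, d = ptp_offset_delay(t1=1000.0, t2=1012.0, t3=1020.0, t4=1024.0)
```

The slave then steers its clock by the estimated offset; asymmetry in the wireless path is the main residual error source, which is why the thesis pairs this protocol with specific timestamping hardware.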

Relevance: 10.00%

Abstract:

Carbon credit markets are in the early stages of development, and media headlines illustrate emerging levels of concern and foreboding over the potential for fraudulent crime within these markets. Australian companies are continuing to venture into the largely unregulated voluntary carbon credit market to offset their emissions and/or give their customers the opportunity to be ‘carbon neutral’. Accordingly, the voluntary market has seen a proliferation of carbon brokers that offer carbon offset products tailored according to need and taste. With the instigation of the Australian compliance market and with pressure increasing for political responses to combat climate change, we would expect Australian companies to experience greater exposure to carbon products in both compliance and voluntary markets. This paper examines the risks of carbon fraud in these markets by reviewing cases of actual fraud and analysing and identifying the contexts where risks of carbon fraud are most likely.

Relevance: 10.00%

Abstract:

New materials technology has provided the potential for the development of an innovative Hybrid Composite Floor Plate System (HCFPS) with many desirable properties: it is lightweight, easy to construct, economical, demountable, recyclable and reusable. Component materials of the HCFPS include a central Polyurethane (PU) core, outer layers of Glass-fibre Reinforced Cement (GRC), and steel laminates at the tensile regions. The HCFPS is configured such that the positive inherent properties of the individual component materials are combined to offset any weaknesses and achieve optimum performance. Research has been carried out using extensive Finite Element (FE) computer simulations supported by experimental testing, and both the strength and serviceability requirements have been established for this lightweight floor plate system. This paper presents some of the research towards the development of the HCFPS, along with a parametric study to select suitable span lengths.

Relevance: 10.00%

Abstract:

Advanced substation applications, such as synchrophasors and IEC 61850-9-2 sampled value process buses, depend upon highly accurate synchronizing signals for correct operation. The IEEE 1588 Precision Time Protocol (PTP) is the recommended means of providing precise timing for future substations. This paper presents a quantitative assessment of PTP reliability using Fault Tree Analysis. Two network topologies are proposed that use grandmaster clocks with dual network connections and take advantage of the Best Master Clock Algorithm (BMCA) from IEEE 1588. The cross-connected grandmaster topology doubles reliability, and the addition of a shared third grandmaster gives a nine-fold improvement over duplicated grandmasters. The performance of BMCA-mediated handover of the grandmaster role during contingencies in the timing system was evaluated experimentally. The 1 µs performance requirement of sampled values and synchrophasors is met, even during network or GPS antenna outages. Slave clocks are shown to synchronize to the backup grandmaster in response to degraded performance or loss of the main grandmaster. Slave disturbances are less than 350 ns provided the grandmaster reference clocks are not offset from one another. A clear understanding of PTP reliability and the factors that affect availability will encourage the adoption of PTP for substation time synchronization.
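The reliability gains from redundant grandmasters follow the basic arithmetic of an AND gate in a fault tree: if unit failures are independent, the timing system loses its reference only when every grandmaster is down at once. The sketch below illustrates that arithmetic with an assumed per-unit unavailability; it is not the paper's actual fault-tree model, which accounts for network paths and shared components as well:

```python
def system_unavailability(p, n):
    """Probability that all n redundant, independent units are
    failed simultaneously (AND gate in a fault tree)."""
    return p ** n

p = 0.01                                 # assumed per-unit unavailability
single = system_unavailability(p, 1)     # one grandmaster
dual = system_unavailability(p, 2)       # duplicated grandmasters
triple = system_unavailability(p, 3)     # shared third grandmaster
```

Real topologies fall short of these ideal ratios because common-cause failures (e.g. a shared GPS antenna or network segment) break the independence assumption, which is why the paper's cross-connected designs matter.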

Relevance: 10.00%

Abstract:

Theoretical foundations of higher order spectral analysis are revisited to examine the use of time-varying bicoherence on non-stationary signals using a classical short-time Fourier approach. A methodology is developed to apply this to evoked EEG responses where a stimulus-locked time reference is available. Short-time windowed ensembles of the response at the same offset from the reference are considered as ergodic cyclostationary processes within a non-stationary random process. Bicoherence can be estimated reliably, with known levels at which it is significantly different from zero, and can be tracked as a function of offset from the stimulus. When this methodology is applied to multi-channel EEG, it is possible to obtain information about phase synchronization at different regions of the brain as the neural response develops. The methodology is applied to analyze the evoked EEG response to flash visual stimuli presented to the left and right eye separately. The EEG electrode array is segmented based on bicoherence evolution with time, using the mean absolute difference as a measure of dissimilarity. Segment maps confirm the importance of the occipital region in visual processing and demonstrate a link between the frontal and occipital regions during the response. Maps are constructed using bicoherence at bifrequencies that include the alpha band frequency of 8 Hz as well as 4 and 20 Hz. Differences are observed between responses from the left eye and the right eye, and also between subjects. The methodology shows potential as a neurological functional imaging technique that can be further developed for diagnosis and monitoring using scalp EEG, which is less invasive and less expensive than magnetic resonance imaging.
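The ensemble bicoherence estimate underlying this methodology can be sketched in NumPy. The normalization below is one standard form of bicoherence (the paper's exact estimator and significance levels may differ); X holds Fourier coefficients of each ensemble member taken at the same latency from the stimulus, and the synthetic trials contain a phase-coupled component so the bicoherence at the coupled bifrequency is high:

```python
import numpy as np

def bicoherence(X, f1, f2):
    """Ensemble bicoherence at bifrequency (f1, f2).
    X: complex array (n_trials, n_freqs) of short-time Fourier
    coefficients, one row per trial at the same stimulus latency."""
    triple = X[:, f1] * X[:, f2] * np.conj(X[:, f1 + f2])
    num = np.abs(np.mean(triple))
    den = np.sqrt(np.mean(np.abs(X[:, f1] * X[:, f2]) ** 2)
                  * np.mean(np.abs(X[:, f1 + f2]) ** 2))
    return num / den

# Synthetic trials: component at bin 5 is phase-coupled to bins 2 and 3,
# so the triple-product phase cancels and bicoherence(2, 3) is near 1.
rng = np.random.default_rng(0)
n_trials, n = 200, 64
t = np.arange(n)
trials = []
for _ in range(n_trials):
    ph1, ph2 = rng.uniform(0, 2 * np.pi, 2)
    x = (np.cos(2 * np.pi * 2 * t / n + ph1)
         + np.cos(2 * np.pi * 3 * t / n + ph2)
         + np.cos(2 * np.pi * 5 * t / n + ph1 + ph2))  # coupled component
    trials.append(np.fft.fft(x))
X = np.array(trials)
b = bicoherence(X, 2, 3)
```

With random, uncoupled phases at bin 5 the same estimate would fluctuate near zero, which is what the paper's significance thresholds are designed to distinguish.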

Relevance: 10.00%

Abstract:

Construction practitioners often experience unexpected results from their scheduling-related decisions. This is mainly due to a lack of understanding of the dynamic nature of construction systems. However, very little attention has been given to this issue, and few empirical studies have been undertaken on it. This paper therefore analyzes the effects of aggressive scheduling, overtime, resource adding and schedule slippage on construction performance, focusing on workers’ reactions to those scheduling decisions. Survey data from 102 construction practitioners in 38 construction sites are used for the analysis. The results indicate that efforts to increase the work rate by working overtime, adding resources and scheduling aggressively can be offset by losses in productivity and quality. Based on the research findings, practical guidelines are then discussed to help site managers effectively deal with the dynamics of scheduling and improve construction performance.

Relevance: 10.00%

Abstract:

Accelerating a project can be rewarding. The consequences, however, can be troublesome if productivity and quality are sacrificed for the sake of remaining ahead of schedule, such that the actual schedule benefits are often barely worth the effort. The tradeoffs and paths of schedule pressure and its causes and effects are often overlooked when schedule decisions are being made. This paper analyses the effects that schedule pressure has on construction performance, and focuses on tradeoffs in scheduling. A research framework has been developed using a causal diagram to illustrate the cause-and-effect analysis of schedule pressure. An empirical investigation has been performed by using survey data collected from 102 construction practitioners working in 38 construction sites in Singapore. The results of this survey data analysis indicate that advantages of increasing the pace of work—by working under schedule pressure—can be offset by losses in productivity and quality. The negative effects of schedule pressure arise mainly by working out of sequence, generating work defects, cutting corners, and losing the motivation to work. The adverse effects of schedule pressure can be minimized by scheduling construction activities realistically and planning them proactively, motivating workers, and by establishing an effective project coordination and communication mechanism.

Relevance: 10.00%

Abstract:

It is widely recognised that defining trade-offs between greenhouse gas emissions using ‘emission equivalence’ based on global warming potentials (GWPs) referenced to carbon dioxide produces anomalous results when applied to methane. The short atmospheric lifetime of methane, compared to the timescales of CO2 uptake, leads to the greenhouse warming depending strongly on the temporal pattern of emission substitution. We argue that a more appropriate way to consider the relationship between the warming effects of methane and carbon dioxide is to define a ‘mixed metric’ that compares ongoing methane emissions (or reductions) to one-off emissions (or reductions) of carbon dioxide. Quantifying this approach, we propose that a one-off sequestration of 1 t of carbon would offset an ongoing methane emission in the range 0.90–1.05 kg CH4 per year. We present an example of how our approach would apply to rangeland cattle production, and consider the broader context of mitigation of climate change, noting the reverse trade-off would raise significant challenges in managing the risk of non-compliance. Our analysis is consistent with other approaches to addressing the criticisms of GWP-based emission equivalence, but provides a simpler and more robust approach while still achieving close equivalence of climate mitigation outcomes ranging over decadal to multi-century timescales.
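The proposed mixed metric trades a one-off removal of carbon against an ongoing methane flow. The sketch below illustrates the arithmetic: 1 t of carbon corresponds to 44/12 ≈ 3.67 t of CO2 by molar mass, and the paper's proposed equivalence rate is 0.90–1.05 kg CH4 per year per tonne of carbon. The herd-level emission figure is an assumption for illustration, not a value from the paper:

```python
C_TO_CO2 = 44.0 / 12.0   # molar mass ratio of CO2 to C

def one_off_carbon_needed(ch4_kg_per_year, rate=1.0):
    """Tonnes of one-off carbon sequestration offsetting an ongoing
    methane emission, at `rate` kg CH4/yr offset per t C.
    The paper's proposed range for `rate` is 0.90-1.05."""
    return ch4_kg_per_year / rate

# An animal emitting ~100 kg CH4/yr (illustrative figure only):
t_carbon = one_off_carbon_needed(100.0, rate=1.0)   # tonnes of carbon
t_co2 = t_carbon * C_TO_CO2                          # equivalent tonnes of CO2
```

The key contrast with GWP-based accounting is that the CO2 side of the trade is a single, permanent sequestration rather than an annual flow, which is what keeps the equivalence stable across decadal to multi-century timescales.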

Relevance: 10.00%

Abstract:

The Australian Federal Government has recently passed reforms to the shipping industry. These reforms are aimed at removing barriers to investment in Australian shipping, fostering global competitiveness and securing a stable maritime skills base. The shipping reform package adopts a two-pronged approach designed to achieve its stated goals by providing both a ‘stick’ and a ‘carrot’ to industry participants. First, the ‘stick’ is delivered via tighter regulation of coastal trading operations through a new licensing system, along with the introduction of a civil penalty regime and an increase in existing penalties. Second, the ‘carrot’ is delivered via taxation incentives available to vessels registered in Australia where the registrant meets certain specified criteria. These incentives, introduced through amendments to the Income Tax Assessment Act 1997 and the Income Tax Assessment Act 1936 and contained in the Tax Laws Amendment (Shipping Reform) Act 2012, provide five key tax incentives to the shipping industry. From 1 July 2012, the amendments give effect to an income tax exemption for qualifying ship operators, accelerated depreciation of vessels, roll-over relief from income tax on the sale of a vessel, an employer refundable tax offset, and an exemption from royalty withholding tax for payments made for the lease of certain shipping vessels.

Relevance: 10.00%

Abstract:

Effective digital human model (DHM) simulation of automotive driver packaging ergonomics, safety and comfort depends on accurate modelling of occupant posture, which is strongly related to the mechanical interaction between human body soft tissue and flexible seat components. This paper presents a finite-element study simulating the deflection of seat cushion foam and supportive seat structures, as well as human buttock and thigh soft tissue, when seated. The three-dimensional data used for modelling thigh and buttock geometry were taken from one 95th-percentile male subject, representing the bivariate percentiles of the combined hip breadth (seated) and buttock-to-knee length distributions of a selected Australian and US population. A thigh-buttock surface shell based on these data was generated for the analytic model. A 6 mm neoprene layer was offset from the shell to account for the compression of body tissue expected through sitting in a seat. The thigh-buttock model is therefore made of two layers, covering thin to moderate thigh and buttock proportions, but not fleshier builds. To replicate the effects of skin and fat, the neoprene rubber layer was modelled as a hyperelastic material with viscoelastic behaviour in a Neo-Hookean material model. Finite element (FE) analysis was performed in ANSYS V13 WB (Canonsburg, USA). It is hypothesized that the presented FE simulation delivers a valid result compared to a standard SAE physical test and the real phenomenon of human-seat indentation. The analytical model is based on the CAD assembly of a Ford Territory seat. The optimized seat frame, suspension and foam pad CAD data were transformed and meshed into FE models and indented by the two-layer, soft-surface human FE model. 
Converging results with the least computational effort were achieved for a bonded connection between cushion and seat base as well as between cushion and suspension, no separation between the neoprene and the indenter shell, and a frictional connection between the cushion pad and the neoprene. The result is compared to a previous simulation of an indentation with a hard-shell human finite-element model of equal geometry, and to the physical indentation result, which is approached with very high fidelity. We conclude that (a) SAE composite buttock form indentation of a suspended seat cushion can be validly simulated in an FE model of merely similar geometry, but using a two-layer hard/soft structure; and (b) human-seat indentation of a suspended seat cushion can be validly simulated with a simplified human buttock-thigh model for a selected anthropometry.
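The soft layer is modelled as a Neo-Hookean hyperelastic material. As a minimal illustration of that material class (not the ANSYS model itself, which also includes viscoelasticity and full 3D contact), the incompressible Neo-Hookean uniaxial Cauchy stress-stretch relation is sigma = mu * (lambda^2 - 1/lambda). The shear modulus below is an assumed, illustrative value:

```python
def neo_hookean_uniaxial(stretch, mu):
    """Cauchy stress (Pa) for an incompressible Neo-Hookean solid
    under uniaxial loading; mu is the shear modulus (Pa).
    stretch < 1 corresponds to compression, as in seat indentation."""
    return mu * (stretch ** 2 - 1.0 / stretch)

mu = 0.3e6                           # assumed shear modulus, 0.3 MPa
s = neo_hookean_uniaxial(0.8, mu)    # 20% compression -> negative stress
```

The nonlinearity (stress growing faster than linearly in compression) is what lets a thin rubber layer mimic the progressive stiffening of skin and fat under the indenter.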

Relevance: 10.00%

Abstract:

Ambiguity resolution plays a crucial role in real-time kinematic GNSS positioning, which gives centimetre-precision positioning results if all the ambiguities in each epoch are correctly fixed to integers. However, incorrectly fixed ambiguities can result in large positioning offsets of up to several metres without notice. Hence, ambiguity validation is essential to control the quality of ambiguity resolution. Currently, the most popular ambiguity validation method is the ratio test. The criterion of the ratio test is often empirically determined. An empirically determined criterion can be dangerous, because a fixed criterion cannot fit all scenarios and does not directly control the ambiguity resolution risk. In practice, depending on the underlying model strength, the ratio test criterion can be too conservative for some models and too risky for others. A more rational approach is to determine the criterion according to the underlying model and the user requirement. Missed detection of incorrect integers leads to a hazardous result, which should be strictly controlled; in ambiguity resolution, the missed-detection rate is known as the failure rate. In this paper, a fixed failure rate ratio test method is presented and applied in the analysis of GPS and Compass positioning scenarios. The fixed failure rate approach is derived from integer aperture estimation theory, which is theoretically rigorous. In this approach, a criteria table for the ratio test is computed based on extensive data simulations, and real-time users can determine the ratio test criterion by looking it up in the table. This method has been applied to medium-distance GPS ambiguity resolution, but multi-constellation and high-dimensional scenarios have not been discussed so far. In this paper, a general ambiguity validation model is derived based on hypothesis testing theory, the fixed failure rate approach is introduced, and in particular the relationship between the ratio test threshold and the failure rate is examined. 
Finally, the factors that influence the ratio test threshold in the fixed failure rate approach are discussed based on extensive data simulation. The results show that the fixed failure rate approach is a more reasonable ambiguity validation method when a proper stochastic model is used.
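The ratio test compares the weighted squared norms of the best and second-best integer candidates from the ambiguity search. A minimal sketch of the accept/reject decision follows; in the fixed failure rate approach the criterion c would be looked up from the simulated criteria table rather than fixed empirically as it is here, and the float solution, candidates and covariance below are all illustrative values:

```python
import numpy as np

def ratio_test(float_amb, candidates, Q_inv, c=2.0):
    """Accept the best integer candidate if the second-best weighted
    squared norm exceeds the best by at least factor c (common form:
    ||a - a2||^2_Q / ||a - a1||^2_Q >= c).
    float_amb:  float ambiguity estimate
    candidates: integer candidate vectors, best (closest) first
    Q_inv:      inverse of the ambiguity covariance matrix
    c:          criterion; fixed here, but taken from a
                failure-rate table in the fixed failure rate approach."""
    def qnorm(a):
        d = float_amb - a
        return float(d @ Q_inv @ d)
    r = qnorm(candidates[1]) / qnorm(candidates[0])
    return (r >= c), r

a_hat = np.array([1.9, -3.1])                      # float ambiguities
cands = [np.array([2, -3]), np.array([1, -3])]     # best, second-best
Q_inv = np.eye(2)                                  # illustrative covariance
accepted, r = ratio_test(a_hat, cands, Q_inv)
```

When the float solution sits nearly midway between two integer candidates, r approaches 1 and the fix is rejected; tying c to a target failure rate makes that rejection boundary reflect the actual model strength.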

Relevance: 10.00%

Abstract:

Deep Raman Spectroscopy is a domain within Raman spectroscopy consisting of techniques that facilitate the depth profiling of diffusely scattering media. Such variants include Time-Resolved Raman Spectroscopy (TRRS) and Spatially-Offset Raman Spectroscopy (SORS). A recent study has also demonstrated the integration of TRRS and SORS in the development of Time-Resolved Spatially-Offset Raman Spectroscopy (TR-SORS). This research demonstrates the application of specific deep Raman spectroscopic techniques to concealed samples commonly encountered in forensic and homeland security at various working distances. Additionally, the concepts behind these techniques are discussed at depth and prospective improvements to the individual techniques are investigated. Qualitative and quantitative analysis of samples based on spectral data acquired from SORS is performed with the aid of multivariate statistical techniques. By the end of this study, an objective comparison is made among the techniques within Deep Raman Spectroscopy based on their capabilities. The efficiency and quality of these techniques are determined based on the results procured which facilitates the understanding of the degree of selectivity for the deeper layer exhibited by the individual techniques relative to each other. TR-SORS was shown to exhibit an enhanced selectivity for the deeper layer relative to TRRS and SORS whilst providing spectral results with good signal-to-noise ratio. Conclusive results indicate that TR-SORS is a prospective deep Raman technique that offers higher selectivity towards deep layers and therefore enhances the non-invasive analysis of concealed substances from close range as well as standoff distances.
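A common way to isolate the deep-layer contribution in SORS processing is scaled subtraction: the zero-offset spectrum (dominated by the surface layer) is scaled and subtracted from the spatially offset spectrum so that the surface bands cancel. The sketch below uses synthetic Gaussian bands and illustrative layer weights; it shows the principle only, not this study's processing chain:

```python
import numpy as np

# Synthetic two-layer spectra: surface band at channel 30,
# subsurface band at channel 70 (illustrative shapes and weights).
x = np.arange(100)
surface = np.exp(-0.5 * ((x - 30) / 3.0) ** 2)
deep = np.exp(-0.5 * ((x - 70) / 3.0) ** 2)

zero_offset = 1.0 * surface + 0.2 * deep   # collection at the laser spot
offset_3mm = 0.4 * surface + 0.3 * deep    # spatially offset collection

# Scale factor chosen to null the surface band in the difference:
k = offset_3mm[30] / zero_offset[30]
recovered = offset_3mm - k * zero_offset   # predominantly the deep layer
```

The offset spectrum carries a relatively larger subsurface fraction because photons migrating laterally through the medium sample deeper regions, which is what the subtraction exploits.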

Relevance: 10.00%

Abstract:

In recent years, carbon has been increasingly rendered ‘visible’, both discursively and through political processes that have imbued it with economic value. Greenhouse gas (GHG) emissions have been constructed as social and environmental costs, and their reduction or avoidance as social and economic gain. The ‘marketisation’ of carbon, which has been facilitated through various compliance schemes such as the European Union Emissions Trading Scheme (EU ETS), the Kyoto Protocol and the proposed Australian Emissions Reduction Scheme, and through the voluntary carbon credit market, has attempted to bring carbon into the ‘foreground’ as an economic liability and/or opportunity. Accompanying the increasing economic visibility of carbon are reports of frauds and scams – the ‘gaming of carbon markets’ (Chan 2010). As Lohmann (2010: 21) points out, ‘what are conventionally classed as scams or frauds are an inevitable feature of carbon offset markets, not something that could be eliminated by regulation targeting the specific businesses or state agencies involved’. This paper critiques the disparate discourses of fraud risk in carbon markets and examines cases of fraud within the emerging landscape of green criminology.

Relevance: 10.00%

Abstract:

We have explored the potential of deep Raman spectroscopy, specifically surface enhanced spatially offset Raman spectroscopy (SESORS), for non-invasive detection from within animal tissue, by employing SERS-barcoded nanoparticle (NP) assemblies as the diagnostic agent. This concept has been experimentally verified in a clinic-relevant backscattered Raman system with an excitation line of 785 nm under ex vivo conditions. We have shown that our SORS system, with a fixed offset of 2–3 mm, offered sensitive probing of injected QTH-barcoded NP assemblies through animal tissue containing both protein and lipid. In comparison to non-aggregated SERS-barcoded gold NPs, we have demonstrated that the tailored SERS-barcoded aggregated NP assemblies have significantly higher detection sensitivity. We report that these NP assemblies can be readily detected at depths of 7–8 mm within animal proteinaceous tissue with a high signal-to-noise (S/N) ratio. In addition, they could also be detected from beneath 1–2 mm of animal tissue with high lipid content, which generally poses a challenge due to the high absorption of lipids in the near-infrared region. We have also shown that the signal intensity and S/N ratio at a particular depth are a function of the SERS tag concentration used, and that our SORS system has a QTH detection limit of 10⁻⁶ M. Higher detection depths may be obtained with optimisation of the NP assemblies, along with improvements in the instrumentation. Such NP assemblies offer prospects for in vivo, non-invasive detection of tumours, along with scope for the incorporation of drugs and their targeted and controlled release at tumour sites. These diagnostic agents combined with drug delivery systems could serve as a “theranostic agent”: an integration of diagnostics and therapeutics into a single platform.