945 results for NON-HOMOLOGOUS END JOINING
Abstract:
In Newson v Aust Scan Pty Ltd t/a Ikea Springwood [2010] QSC 223 the Supreme Court examined the discretion under s 32(2) of the Personal Injuries Proceedings Act 2002 (Qld) to permit a document that has not been disclosed as required by the pre-court procedures under the PIPA to be used in a subsequent court proceeding. This appears to be the first time that the nature and parameters of the discretion have been judicially considered.
Abstract:
Australian climate, soils and agricultural management practices are significantly different from those of the northern hemisphere nations. Consequently, experimental data on greenhouse gas production from European and North American agricultural soils, and its interpretation, are unlikely to be directly applicable to Australian systems. A programme of studies of non-CO2 greenhouse gas emissions from agriculture has been established that is designed to reduce the uncertainty of non-CO2 greenhouse gas emissions in the Australian National Greenhouse Gas Inventory and to provide outputs that will enable better on-farm management practices for reducing non-CO2 greenhouse gas emissions, particularly nitrous oxide. The systems being examined and their locations are irrigated pasture (Kyabram, Victoria), irrigated cotton (Narrabri, NSW), irrigated maize (Griffith, NSW), rain-fed wheat (Rutherglen, Victoria) and rain-fed wheat (Cunderdin, WA). The field studies include treatments with and without fertilizer addition, stubble burning versus stubble retention, conventional cultivation versus direct drilling, and crop rotation, in order to determine emission factors and treatment possibilities for best management options. The data to date suggest that nitrous oxide emissions from nitrogen fertilizer applied to irrigated dairy pastures and rain-fed winter wheat are much lower than the average of northern hemisphere grain and pasture studies. More variable emissions have been found in studies of an irrigated cotton/vetch/wheat rotation, and substantially higher emissions from irrigated maize.
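For readers unfamiliar with the term, a fertilizer-induced emission factor is conventionally expressed as the fraction of applied nitrogen emitted as nitrous oxide-N after subtracting emissions from an unfertilized control. The sketch below illustrates that arithmetic only; the function name and the plot totals are hypothetical, not data from the programme.

```python
# Illustrative fertilizer-induced N2O emission factor (EF) calculation.
# All numbers are hypothetical placeholders, not measurements from this programme.

def emission_factor(n2o_n_fertilized_kg_ha, n2o_n_control_kg_ha, n_applied_kg_ha):
    """EF as a percentage of applied fertilizer N emitted as N2O-N."""
    fertilizer_induced = n2o_n_fertilized_kg_ha - n2o_n_control_kg_ha
    return 100.0 * fertilizer_induced / n_applied_kg_ha

# Hypothetical seasonal totals for an irrigated pasture treatment
ef = emission_factor(n2o_n_fertilized_kg_ha=1.2,
                     n2o_n_control_kg_ha=0.4,
                     n_applied_kg_ha=200.0)
print(f"Emission factor: {ef:.2f}% of applied N")  # -> 0.40% in this example
```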
Abstract:
This instrument was used in the project named Teachers Reporting Child Sexual Abuse: Towards Evidence-based Reform of Law, Policy and Practice (ARC DP0664847)
Abstract:
This paper examines parents' responses to key factors associated with mode choices for school trips. The research was conducted with parents of elementary school students in Denver, Colorado as part of a larger investigation of school travel. School-based active travel programs aim to encourage students to walk or bike to school more frequently. To that end, planning research has identified an array of factors associated with parents' decisions to drive children to school. Many findings are interpreted as ‘barriers’ to active travel, implying that parents have similar objectives with respect to travel mode choices and that parents respond similarly and consistently to external conditions. While these conclusions are appropriate for forecasting demand and mode share with large populations, they are generally too coarse for programs that aim to influence the travel behavior of individuals and small groups. This research uses content analysis of interview transcripts to examine the contexts of factors associated with parents' mode choices for trips to and from elementary school. Short, semi-structured interviews were conducted with 65 parents from 12 Denver Public Elementary Schools that had been selected to receive 2007–08 Safe Routes to School non-infrastructure grants. Transcripts were analyzed using Nvivo 8.0 to find out how parents respond to selected factors that are often described in the planning literature as ‘barriers’ to active travel. Two contrasting themes emerged from the analysis: barrier elimination and barrier negotiation. Regular active travel appears to diminish parents' perceptions of barriers so that negotiation becomes second nature. Findings from this study suggest that interventions should build capacity and inclination in order to increase rates of active travel.
Abstract:
A time-resolved inverse spatially offset Raman spectrometer was constructed for depth profiling of Raman-active substances in both laboratory and field environments. The system's operating principles and performance are discussed, along with its advantages relative to traditional continuous-wave spatially offset Raman spectrometers. The developed spectrometer uses a combination of space- and time-resolved detection in order to obtain high-quality Raman spectra from substances hidden behind coloured opaque surface layers, such as plastic and garments, with a single measurement. The time-gated spatially offset Raman spectrometer was successfully used to detect concealed explosives and drug precursors under incandescent and fluorescent background light as well as under daylight. The average screening time was 50 s per measurement. The excitation energy requirements were relatively low (20 mW), which makes the probe safe for screening hazardous substances. The unit was designed with nanosecond laser excitation and gated detection, making it of lower cost and complexity than previous picosecond-based systems, and providing a functional platform for in-line or in-field sensing of chemical substances.
Abstract:
In this paper, spatially offset Raman spectroscopy (SORS) is demonstrated for non-invasively investigating the composition of drug mixtures inside an opaque plastic container. The mixtures consisted of three components including a target drug (acetaminophen or phenylephrine hydrochloride) and two diluents (glucose and caffeine). The target drug concentrations ranged from 5% to 100%. After conducting SORS analysis to ascertain the Raman spectra of the concealed mixtures, principal component analysis (PCA) was performed on the SORS spectra to reveal trends within the data. Partial least squares (PLS) regression was used to construct models that predicted the concentration of each target drug, in the presence of the other two diluents. The PLS models were able to predict the concentration of acetaminophen in the validation samples with a root-mean-square error of prediction (RMSEP) of 3.8% and the concentration of phenylephrine hydrochloride with an RMSEP of 4.6%. This work demonstrates the potential of SORS, used in conjunction with multivariate statistical techniques, to perform non-invasive, quantitative analysis on mixtures inside opaque containers. This has applications for pharmaceutical analysis, such as monitoring the degradation of pharmaceutical products on the shelf, in forensic investigations of counterfeit drugs, and for the analysis of illicit drug mixtures which may contain multiple components.
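As a rough illustration of the chemometric workflow described above (PLS regression on spectra, validated by RMSEP), the following sketch uses scikit-learn on synthetic data; the spectrum shapes, sample counts and component number are assumptions, not details taken from the study.

```python
# Minimal sketch of PLS calibration and RMSEP evaluation for spectral data.
# Synthetic spectra stand in for the SORS measurements; all settings are assumed.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n_samples, n_points = 40, 500

# Synthetic "spectra": a concentration-dependent band plus noise.
concentration = rng.uniform(5, 100, n_samples)  # target drug content, % w/w
band = np.exp(-0.5 * ((np.arange(n_points) - 250) / 10) ** 2)
spectra = np.outer(concentration, band) + rng.normal(0, 0.5, (n_samples, n_points))

# Split into calibration and validation sets.
train, val = slice(0, 30), slice(30, None)

pls = PLSRegression(n_components=3)
pls.fit(spectra[train], concentration[train])

predicted = pls.predict(spectra[val]).ravel()
rmsep = np.sqrt(np.mean((predicted - concentration[val]) ** 2))
print(f"RMSEP: {rmsep:.2f} % w/w")
```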
Abstract:
This thesis addresses one of the fundamental issues that remains unresolved in patent law today. It is a question that strikes at the heart of what a patent is and what it is supposed to protect. That question is whether an invention must produce a physical effect or cause a physical transformation of matter to be patentable, or whether it is sufficient that an invention involves a specific practical application of an idea or principle to achieve a useful result. In short, the question is whether patent law contains a physicality requirement. Resolving this issue will determine whether only traditional mechanical, industrial and manufacturing processes are patent eligible, or whether patent eligibility extends to include purely intangible, or non-physical, products and processes. To this end, this thesis seeks to identify where the dividing line lies between patentable subject matter and the recognised categories of excluded matter, namely, fundamental principles of nature, physical phenomena, and abstract ideas. It involves determining which technological advances are worth the inconvenience monopoly protection causes the public at large, and which should remain free for all to use without restriction. This is an issue that has important ramifications for innovation in the ‘knowledge economy’ of the Information Age. Determining whether patent law contains a physicality requirement is integral to deciding whether much of the valuable innovation we are likely to witness, in what are likely to be the emerging areas of technology in the near future, will receive the same encouragement as industrial and manufacturing advances of previous times.
Abstract:
Introduction: This study reports on the development of a self-report assessment tool to increase the efficacy of crash prediction within Australian fleet settings. Over the last 20 years an array of measures has been produced (Driver Anger Scale, Driving Skill Inventory, Manchester Driver Behaviour Questionnaire, Driver Attitude Questionnaire, Driver Stress Inventory, Safety Climate Questionnaire). While these tools are useful, research has demonstrated limited ability to accurately identify the individuals most likely to be involved in a crash. Reasons cited include:
- Crashes are relatively rare.
- Other competing factors may influence a crash event.
- Ongoing questions remain regarding the validity of self-report measures (common method variance, etc.).
- Existing measures lack coverage of contemporary issues relating to fleet driving performance.
Abstract:
Current research in secure messaging for Vehicular Ad hoc Networks (VANETs) appears to focus on employing a digital certificate-based Public Key Cryptosystem (PKC) to support security. The security overhead of such a scheme, however, creates a transmission delay and introduces a time-consuming verification process to VANET communications. This paper proposes a non-certificate-based public key management for VANETs. A comprehensive evaluation of performance and scalability of the proposed public key management regime is presented, which is compared to a certificate-based PKC by employing a number of quantified analyses and simulations. Not only does this paper demonstrate that the proposal can maintain security, but it also asserts that it can improve overall performance and scalability at a lower cost, compared to the certificate-based PKC. It is believed that the proposed scheme will add a new dimension to the key management and verification services for VANETs.
Abstract:
Recently an innovative composite panel system was developed, in which a thin insulation layer was used externally between two plasterboards to improve the fire performance of light gauge cold-formed steel frame walls. In this research, finite-element thermal models of both the traditional light gauge cold-formed steel frame wall panels with cavity insulation and the new light gauge cold-formed steel frame composite wall panels were developed to simulate their thermal behaviour under standard and realistic fire conditions. Suitable apparent thermal properties of gypsum plasterboard, insulation materials and steel were proposed and used. The developed models were then validated by comparing their results with available fire test results. This article presents the details of the developed finite-element models of small-scale non-load-bearing light gauge cold-formed steel frame wall panels and the results of the thermal analysis. It has been shown that accurate finite-element models can be used to simulate the thermal behaviour of small-scale light gauge cold-formed steel frame walls with varying configurations of insulation and plasterboards. The numerical results show that the use of cavity insulation was detrimental to the fire rating of light gauge cold-formed steel frame walls, while the use of external insulation offered superior thermal protection. The effects of real fire conditions are also presented.
Abstract:
Tabernacle is an experimental game world-building project which explores the relationship between the map and the 3-dimensional visualisation enabled by high-end game engines. The project is named after the 6th century tabernacle maps of Cosmas Indicopleustes in his Christian Topography. These maps articulate a cultural or metaphoric, rather than measured, view of the world, contravening Alper's distinction which observes that “maps are measurement, art is experience”. The project builds on previous research into the use of game engines and 3D navigable representation to enable cultural experience, particularly non-Western cultural experiences and ways of seeing. Like the earlier research, Tabernacle highlights the problematic disjuncture between the modern Cartesian map structures of the engine and the mapping traditions of non-Western cultures. Tabernacle represents a practice-based research provocation. The project exposes assumptions about the maps which underpin 3D game worlds, and the autocratic tendencies of world construction software. This research is of critical importance as game engines and simulation technologies are becoming more popular in the recreation of culture and history. A key learning from the Tabernacle project concerned the ways in which available game engines (technologies with roots in the Enlightenment) constrained the team's ability to represent a very different culture with a different conceptualisation of space and maps. Understanding the cultural legacies of the software itself is critical as we are tempted by the opportunities for the representation of culture and history that they seem to offer. The project was presented at Perth Digital Arts and Culture in 2007 and reiterated using a different game engine in 2009. Further reflections were discussed in a conference paper presented at OZCHI 2009 and a peer-reviewed journal article, and insights gained from the experience continue to inform the author's research.
Abstract:
The objective quantification of three-dimensional kinematics during different functional and occupational tasks is now more in demand than ever. The introduction of a new generation of low-cost passive motion capture systems from a number of manufacturers has made this technology accessible for teaching, clinical practice and small/medium industry. Despite the attractive nature of these systems, their accuracy remains unproven in independent tests. We assessed static linear accuracy and dynamic linear accuracy, and compared gait kinematics from a Vicon MX20 system with those from a Natural Point OptiTrack system. In all experiments data were sampled simultaneously. We found that both systems perform excellently in linear accuracy tests, with absolute errors not exceeding 1%. In the gait data there was again strong agreement between the two systems in sagittal and coronal plane kinematics. Transverse plane kinematics differed by up to 3° at the knee and hip, which we attributed to the impact of soft tissue artifact accelerations on the data. We suggest that low-cost systems are comparably accurate to their high-end competitors and offer a platform with accuracy acceptable for research in laboratories with a limited budget.
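By way of illustration only, a linear-accuracy comparison of the kind described above can be reduced to measuring a known inter-marker distance with each system and expressing the mean absolute error as a percentage of that reference; the values below are hypothetical, not the Vicon or OptiTrack results.

```python
# Illustrative linear-accuracy check: compare measured inter-marker distances
# from two capture systems against a known reference length.
# All values are hypothetical, not data from the study.
import numpy as np

REFERENCE_MM = 500.0  # known distance between two markers on a rigid bar

def percent_abs_error(measured_mm, reference_mm=REFERENCE_MM):
    """Mean absolute error as a percentage of the reference length."""
    measured_mm = np.asarray(measured_mm, dtype=float)
    return 100.0 * np.mean(np.abs(measured_mm - reference_mm)) / reference_mm

# Simulated per-frame distance measurements from two systems
system_a = REFERENCE_MM + np.random.default_rng(1).normal(0.0, 1.0, 1000)
system_b = REFERENCE_MM + np.random.default_rng(2).normal(0.0, 2.0, 1000)

print(f"System A error: {percent_abs_error(system_a):.2f}%")
print(f"System B error: {percent_abs_error(system_b):.2f}%")
```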
Abstract:
In 1999 Richards compared the accuracy of commercially available motion capture systems commonly used in biomechanics. Richards identified that in static tests the optical motion capture systems generally produced RMS errors of less than 1.0 mm. During dynamic tests, the RMS error increased to up to 4.2 mm in some systems. In the last 12 years motion capture systems have continued to evolve and now include high-resolution CCD or CMOS image sensors, wireless communication, and high full-frame sampling frequencies. In addition to hardware advances, there have also been a number of advances in software, including improved calibration and tracking algorithms, real-time data streaming, and the introduction of the c3d standard. These advances have allowed the system manufacturers to maintain a high retail price in the name of advancement. In areas such as gait analysis and ergonomics, many of the advanced features such as high-resolution image sensors and high sampling frequencies are not required due to the nature of the tasks often investigated. Recently Natural Point introduced low-cost cameras which, at face value, appear to be suitable as at the very least a high-quality teaching tool in biomechanics, and possibly even a research tool when coupled with the correct calibration and tracking software. The aim of the study was therefore to compare both the linear accuracy and the quality of angular kinematics from a typical high-end motion capture system and a low-cost system during a simple task.
Abstract:
A precise definition of interaction behavior between services is a prerequisite for successful business-to-business integration. Service choreographies provide a view on message exchanges and their ordering constraints from a global perspective. Treating the sending and receiving of a message as one atomic step reduces the modelers' effort. As a downside, problematic race conditions resulting in deadlocks might appear when realizing the choreography using services that exchange messages asynchronously. This paper presents typical issues that arise when desynchronizing service choreographies. Solutions from practice are discussed, and a formal approach based on Petri nets is introduced for identifying desynchronizable choreographies.
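To make the deadlock-detection idea concrete, here is a minimal sketch (not the paper's formalism) of a Petri-net reachability check applied to a toy two-service exchange: in one variant each service waits for the other's message before sending its own and the net is stuck immediately, while in the other one service sends first and the exchange completes. All nets and place names are hypothetical.

```python
# Minimal sketch: exhaustive reachability analysis of a small Petri net to flag
# deadlocks that can appear when a choreography is desynchronized.
# Both toy nets below are illustrative, not examples from the paper.
from collections import deque

def normalize(marking):
    """Hashable marking with zero-count places dropped."""
    return tuple(sorted((p, n) for p, n in marking.items() if n > 0))

def deadlocks(initial, transitions, final):
    """BFS over reachable markings; return terminal markings other than the
    expected final marking."""
    seen, queue, stuck = {normalize(initial)}, deque([initial]), []
    while queue:
        m = queue.popleft()
        enabled = False
        for pre, post in transitions:
            if all(m.get(p, 0) >= n for p, n in pre.items()):
                enabled = True
                nxt = dict(m)
                for p, n in pre.items():
                    nxt[p] -= n
                for p, n in post.items():
                    nxt[p] = nxt.get(p, 0) + n
                if normalize(nxt) not in seen:
                    seen.add(normalize(nxt))
                    queue.append(nxt)
        if not enabled and normalize(m) != normalize(final):
            stuck.append(dict(m))
    return stuck

# Desynchronized variant where each service waits for the other's message
# before sending its own: a cyclic wait, so the net is stuck immediately.
racy = [
    ({"A_wait": 1, "msg_B_to_A": 1}, {"A_done": 1, "msg_A_to_B": 1}),
    ({"B_wait": 1, "msg_A_to_B": 1}, {"B_done": 1, "msg_B_to_A": 1}),
]
# Variant where service A sends first and then waits: deadlock-free.
safe = [
    ({"A_wait": 1}, {"A_sent": 1, "msg_A_to_B": 1}),
    ({"A_sent": 1, "msg_B_to_A": 1}, {"A_done": 1}),
    ({"B_wait": 1, "msg_A_to_B": 1}, {"B_done": 1, "msg_B_to_A": 1}),
]

start = {"A_wait": 1, "B_wait": 1}
goal = {"A_done": 1, "B_done": 1}
print("deadlocks (racy):", deadlocks(start, racy, goal))  # initial marking is stuck
print("deadlocks (safe):", deadlocks(start, safe, goal))  # []
```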