386 results for NON-HOMOLOGOUS END JOINING


Relevance:

20.00%

Publisher:

Abstract:

This paper examines parents' responses to key factors associated with mode choices for school trips. The research was conducted with parents of elementary school students in Denver, Colorado, as part of a larger investigation of school travel. School-based active travel programs aim to encourage students to walk or bike to school more frequently. To that end, planning research has identified an array of factors associated with parents' decisions to drive children to school. Many findings are interpreted as ‘barriers’ to active travel, implying that parents have similar objectives with respect to travel mode choices and that they respond similarly and consistently to external conditions. While such conclusions are appropriate for forecasting demand and mode share in large populations, they are generally too coarse for programs that aim to influence the travel behavior of individuals and small groups. This research uses content analysis of interview transcripts to examine the contexts of factors associated with parents' mode choices for trips to and from elementary school. Short, semi-structured interviews were conducted with 65 parents from 12 Denver Public Elementary Schools that had been selected to receive 2007–08 Safe Routes to School non-infrastructure grants. Transcripts were analyzed using NVivo 8.0 to determine how parents respond to selected factors that are often described in the planning literature as ‘barriers’ to active travel. Two contrasting themes emerged from the analysis: barrier elimination and barrier negotiation. Regular active travel appears to diminish parents' perceptions of barriers, so that negotiation becomes second nature. Findings from this study suggest that interventions should build capacity and inclination in order to increase rates of active travel.


A time-resolved inverse spatially offset Raman spectrometer was constructed for depth profiling of Raman-active substances in both laboratory and field environments. The system's operating principles and performance are discussed, along with its advantages relative to a traditional continuous-wave spatially offset Raman spectrometer. The developed spectrometer uses a combination of space- and time-resolved detection to obtain high-quality Raman spectra from substances hidden behind coloured opaque surface layers, such as plastics and garments, with a single measurement. The time-gated spatially offset Raman spectrometer was successfully used to detect concealed explosives and drug precursors under incandescent and fluorescent background light as well as under daylight. The average screening time was 50 s per measurement. The excitation power requirements were relatively low (20 mW), which makes the probe safe for screening hazardous substances. The unit has been designed with nanosecond laser excitation and gated detection, making it of lower cost and complexity than previous picosecond-based systems, to provide a functional platform for in-line or in-field sensing of chemical substances.


In this paper, spatially offset Raman spectroscopy (SORS) is demonstrated for non-invasively investigating the composition of drug mixtures inside an opaque plastic container. The mixtures consisted of three components: a target drug (acetaminophen or phenylephrine hydrochloride) and two diluents (glucose and caffeine). The target drug concentrations ranged from 5% to 100%. After conducting SORS analysis to obtain the Raman spectra of the concealed mixtures, principal component analysis (PCA) was performed on the SORS spectra to reveal trends within the data. Partial least squares (PLS) regression was used to construct models that predicted the concentration of each target drug in the presence of the other two diluents. The PLS models were able to predict the concentration of acetaminophen in the validation samples with a root-mean-square error of prediction (RMSEP) of 3.8%, and the concentration of phenylephrine hydrochloride with an RMSEP of 4.6%. This work demonstrates the potential of SORS, used in conjunction with multivariate statistical techniques, to perform non-invasive, quantitative analysis of mixtures inside opaque containers. This has applications in pharmaceutical analysis, such as monitoring the degradation of pharmaceutical products on the shelf, in forensic investigations of counterfeit drugs, and in the analysis of illicit drug mixtures that may contain multiple components.


This thesis addresses one of the fundamental issues that remain unresolved in patent law today. It is a question that strikes at the heart of what a patent is and what it is supposed to protect. That question is whether an invention must produce a physical effect or cause a physical transformation of matter to be patentable, or whether it is sufficient that an invention involves a specific practical application of an idea or principle to achieve a useful result. In short, the question is whether patent law contains a physicality requirement. Resolving this issue will determine whether only traditional mechanical, industrial and manufacturing processes are patent eligible, or whether patent eligibility extends to purely intangible, or non-physical, products and processes. To this end, this thesis seeks to identify where the dividing line lies between patentable subject matter and the recognised categories of excluded matter, namely fundamental principles of nature, physical phenomena, and abstract ideas. It involves determining which technological advances are worth the inconvenience that monopoly protection causes the public at large, and which should remain free for all to use without restriction. This is an issue with important ramifications for innovation in the ‘knowledge economy’ of the Information Age. Determining whether patent law contains a physicality requirement is integral to deciding whether much of the valuable innovation we are likely to witness in the emerging areas of technology will receive the same encouragement as the industrial and manufacturing advances of previous times.


Introduction: This study reports on the development of a self-report assessment tool to increase the efficacy of crash prediction within Australian fleet settings. Over the last 20 years an array of measures has been produced (Driver Anger Scale, Driving Skill Inventory, Manchester Driver Behaviour Questionnaire, Driver Attitude Questionnaire, Driver Stress Inventory, Safety Climate Questionnaire). While these tools are useful, research has demonstrated limited ability to accurately identify the individuals most likely to be involved in a crash. Reasons cited include:
- Crashes are relatively rare
- Other competing factors may influence a crash event
- Ongoing questions regarding the validity of self-report measures (common method variance etc.)
- Lack of coverage of contemporary issues relating to fleet driving performance


Current research in secure messaging for Vehicular Ad hoc Networks (VANETs) appears to focus on employing a digital certificate-based Public Key Cryptosystem (PKC) to support security. The security overhead of such a scheme, however, creates a transmission delay and introduces a time-consuming verification process to VANET communications. This paper proposes a non-certificate-based public key management for VANETs. A comprehensive evaluation of performance and scalability of the proposed public key management regime is presented, which is compared to a certificate-based PKC by employing a number of quantified analyses and simulations. Not only does this paper demonstrate that the proposal can maintain security, but it also asserts that it can improve overall performance and scalability at a lower cost, compared to the certificate-based PKC. It is believed that the proposed scheme will add a new dimension to the key management and verification services for VANETs.


Recently, an innovative composite panel system was developed in which a thin insulation layer was used externally between two plasterboards to improve the fire performance of light gauge cold-formed steel frame (LSF) walls. In this research, finite-element thermal models of both traditional LSF wall panels with cavity insulation and the new LSF composite wall panels were developed to simulate their thermal behaviour under standard and realistic fire conditions. Suitable apparent thermal properties of gypsum plasterboard, insulation materials and steel were proposed and used. The developed models were then validated by comparing their results with available fire test results. This article presents the details of the developed finite-element models of small-scale non-load-bearing LSF wall panels and the results of the thermal analyses. It is shown that accurate finite-element models can be used to simulate the thermal behaviour of small-scale LSF walls with varying configurations of insulation and plasterboards. The numerical results show that the use of cavity insulation was detrimental to the fire rating of LSF walls, while the use of external insulation offered them superior thermal protection. The effects of realistic fire conditions are also presented.
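The flavour of such a simulation can be illustrated with a one-dimensional transient conduction sketch; note this uses explicit finite differences rather than the paper's finite elements, the layer geometry is a generic plasterboard-insulation-plasterboard sandwich, and every material value is a hypothetical placeholder, not one of the apparent properties proposed in the paper:

```python
import numpy as np

# 1D transient heat conduction through a layered wall section exposed to the
# ISO 834 standard fire curve on one face. All properties are illustrative.
L, n = 0.06, 120                      # wall thickness (m), number of cells
dx = L / n
x = (np.arange(n) + 0.5) * dx         # cell centres
core = (x > 0.02) & (x < 0.04)        # middle layer = insulation
k = np.where(core, 0.04, 0.25)        # conductivity, W/(m.K)
rho_c = np.where(core, 3.0e4, 7.4e5)  # volumetric heat capacity, J/(m^3.K)

T = np.full(n, 20.0)                  # initial temperature, deg C
dt = 0.25 * dx**2 * rho_c.min() / k.max()   # conservative stable time step

t, t_end = 0.0, 600.0                 # simulate the first 10 minutes
while t < t_end:
    T[0] = 20 + 345 * np.log10(8 * (t / 60) + 1)  # ISO 834 fire temperature
    flux = k[:-1] * np.diff(T) / dx   # interface flux (left-cell conductivity)
    T[1:-1] += dt * np.diff(flux) / (dx * rho_c[1:-1])
    T[-1] = T[-2]                     # adiabatic unexposed face
    t += dt

print(f"Unexposed-face temperature after 10 min: {T[-1]:.1f} C")
```

A real fire-rating analysis would run for hours of fire exposure, use temperature-dependent apparent properties for the gypsum dehydration reactions, and model radiation and convection at both faces.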


Tabernacle is an experimental game world-building project which explores the relationship between the map and the 3-dimensional visualisation enabled by high-end game engines. The project is named after the 6th-century tabernacle maps of Cosmas Indicopleustes in his Christian Topography. These maps articulate a cultural or metaphoric, rather than measured, view of the world, contravening Alpers' distinction that “maps are measurement, art is experience”. The project builds on previous research into the use of game engines and 3D navigable representation to enable cultural experience, particularly non-Western cultural experiences and ways of seeing. Like the earlier research, Tabernacle highlights the problematic disjuncture between the modern Cartesian map structures of the engine and the mapping traditions of non-Western cultures. Tabernacle represents a practice-based research provocation. The project exposes assumptions about the maps which underpin 3D game worlds, and the autocratic tendencies of world-construction software. This research is of critical importance as game engines and simulation technologies become more popular in the recreation of culture and history. A key learning from the Tabernacle project was the ways in which available game engines, technologies with roots in the Enlightenment, constrained the team's ability to represent a very different culture with a different conceptualisation of space and maps. Understanding the cultural legacies of the software itself is critical as we are tempted by the opportunities for representing culture and history that these technologies seem to offer. The project was presented at Perth Digital Arts and Culture in 2007 and reiterated using a different game engine in 2009. Further reflections were discussed in a conference paper presented at OZCHI 2009 and in a peer-reviewed journal article, and insights gained from the experience continue to inform the author's research.


The objective quantification of three-dimensional kinematics during different functional and occupational tasks is now more in demand than ever. The introduction of a new generation of low-cost passive motion capture systems from a number of manufacturers has made this technology accessible for teaching, clinical practice and small/medium industry. Despite the attractive nature of these systems, their accuracy remains unproven in independent tests. We assessed static linear accuracy and dynamic linear accuracy, and compared gait kinematics from a Vicon MX20 system against a Natural Point OptiTrack system. In all experiments data were sampled simultaneously. We found that both systems perform excellently in linear accuracy tests, with absolute errors not exceeding 1%. In the gait data there was again strong agreement between the two systems in sagittal and coronal plane kinematics. Transverse plane kinematics differed by up to 3° at the knee and hip, which we attributed to the impact of soft tissue artifact accelerations on the data. We suggest that low-cost systems are comparably accurate to their high-end competitors and offer laboratories with a limited budget a platform whose accuracy is acceptable for research.
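A linear-accuracy check of the kind reported can be sketched as follows; the fixed-bar protocol is an assumption, and the marker positions and noise level are synthetic stand-ins rather than output from either system:

```python
import numpy as np

# Sketch of a static linear-accuracy test: two markers fixed a known distance
# apart, accuracy taken as the error of the measured inter-marker distance.
rng = np.random.default_rng(1)
known_mm = 500.0                                   # true marker separation, mm
m1 = rng.normal([0.0, 0.0, 0.0], 0.3, (1000, 3))   # marker 1 samples, mm
m2 = rng.normal([known_mm, 0.0, 0.0], 0.3, (1000, 3))  # marker 2 samples, mm

# Per-frame measured separation and its absolute percentage error
measured = np.linalg.norm(m2 - m1, axis=1)
abs_err_pct = 100 * np.abs(measured - known_mm) / known_mm
print(f"mean absolute error: {abs_err_pct.mean():.3f}%")
```

The dynamic test is the same computation applied while the bar is moved through the capture volume, where soft-tissue-free rigid geometry makes any increase in error attributable to the system itself.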


In 1999, Richards compared the accuracy of commercially available motion capture systems commonly used in biomechanics. Richards identified that in static tests the optical motion capture systems generally produced RMS errors of less than 1.0 mm. During dynamic tests, the RMS error increased to up to 4.2 mm in some systems. In the 12 years since, motion capture systems have continued to evolve and now include high-resolution CCD or CMOS image sensors, wireless communication, and high full-frame sampling frequencies. In addition to hardware advances, there have also been a number of advances in software, including improved calibration and tracking algorithms, real-time data streaming, and the introduction of the C3D standard. These advances have allowed the system manufacturers to maintain a high retail price in the name of advancement. In areas such as gait analysis and ergonomics, many of the advanced features, such as high-resolution image sensors and high sampling frequencies, are not required due to the nature of the tasks often investigated. Recently, Natural Point introduced low-cost cameras which on face value appear suitable as, at the very least, a high-quality teaching tool in biomechanics, and possibly even as a research tool when coupled with the correct calibration and tracking software. The aim of this study was therefore to compare both the linear accuracy and the quality of angular kinematics from a typical high-end motion capture system and a low-cost system during a simple task.


A precise definition of the interaction behavior between services is a prerequisite for successful business-to-business integration. Service choreographies provide a view of message exchanges and their ordering constraints from a global perspective. Treating message sending and receiving as one atomic step reduces the modelers' effort. As a downside, problematic race conditions resulting in deadlocks might appear when realizing the choreography using services that exchange messages asynchronously. This paper presents typical issues that arise when desynchronizing service choreographies. Solutions from practice are discussed, and a formal approach based on Petri nets is introduced for identifying desynchronizable choreographies.
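A toy version of the Petri-net idea can be sketched as follows. The net below is a hypothetical example, not the paper's formalism: two desynchronized services each wait to receive the other's message before sending their own, and an exhaustive reachability search over markings flags markings with no enabled transition as deadlocks:

```python
from collections import deque

# Each transition maps to (places consumed, places produced). A token on a
# place records a state such as "a message to B is in transit".
transitions = {
    "A_sends": ({"a_got_msg"}, {"msg_to_B"}),  # A only sends after receiving
    "B_sends": ({"b_got_msg"}, {"msg_to_A"}),  # B only sends after receiving
    "A_recv":  ({"msg_to_A"}, {"a_got_msg"}),
    "B_recv":  ({"msg_to_B"}, {"b_got_msg"}),
}
initial = frozenset()   # no message in transit: each service waits for the other

def enabled(marking):
    """Transitions whose input places all carry a token."""
    return [t for t, (pre, _) in transitions.items() if pre <= marking]

# Breadth-first reachability over markings (1-safe net: a place holds 0 or 1 token)
seen, queue, deadlocks = {initial}, deque([initial]), []
while queue:
    m = queue.popleft()
    ts = enabled(m)
    if not ts:
        deadlocks.append(m)   # nothing can fire: a deadlocked marking
    for t in ts:
        m2 = frozenset((m - transitions[t][0]) | transitions[t][1])
        if m2 not in seen:
            seen.add(m2)
            queue.append(m2)

print("deadlocked markings:", deadlocks)
```

In the atomic choreography this exchange is unproblematic, but its asynchronous realization deadlocks immediately, which is exactly the class of situation a reachability analysis is meant to expose.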


Melodic alarms proposed in the IEC 60601-1-8 standard for medical electrical equipment were tested for learnability and discriminability. Thirty-three non-anaesthetist participants learned the alarms over two sessions of practice, with or without mnemonics suggested in the standard. Fewer than 30% of participants could identify the alarms with 100% accuracy at the end of practice. Confusions persisted between pairs of alarms, especially if mnemonics were used during learning (p = 0.011). Participants responded faster (p < 0.00001) and more accurately (p = 0.002) to medium priority alarms than to high priority alarms, even though they rated the high priority alarms as sounding more urgent (p < 0.00001). Participants with at least 1 year of formal musical training identified the alarms more accurately (p = 0.0002) than musically untrained participants, and found the task easier overall (p < 0.00001). More intensive studies of the IEC 60601-1-8 alarms are needed for their effectiveness to be determined.


Deep Raman spectroscopy has been utilized for the standoff detection of concealed chemical threat agents from a distance of 15 meters under real-life background illumination conditions. Using combined time- and space-resolved measurements, various explosive precursors hidden in opaque plastic containers were identified non-invasively. Our results confirm that combined time- and space-resolved Raman spectroscopy achieves higher selectivity for the sub-layer over the surface layer, as well as enhanced rejection of fluorescence from the container surface, compared to standoff spatially offset Raman spectroscopy. Raman spectra with minimal interference from the packaging material and a good signal-to-noise ratio were acquired within 5 seconds of measurement time. A new combined time- and space-resolved Raman spectrometer has been designed with nanosecond laser excitation and gated detection, making it of lower cost and complexity than picosecond-based laboratory systems.


This paper seeks to identify and quantify the sources of the lagging productivity in Singapore's retail sector reported by the Economic Strategies Committee in its 2010 report. A two-stage analysis is adopted. In the first stage, the Malmquist productivity index is employed, which provides measures of productivity change, technological change and efficiency change. In the second stage, technical efficiency estimates are regressed against explanatory variables using a truncated regression model. Technical efficiency was attributed to the quality of workers, while product assortment and competition impacted negatively on efficiency.
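The first-stage decomposition can be made concrete with the standard Malmquist index formula (Färe et al.), which splits productivity change into efficiency change (catching up to the frontier) and technical change (the frontier shifting); the four distance-function values below are hypothetical illustrative numbers, not estimates from the paper:

```python
import math

# Output distance function values D^s(x^t, y^t) for one retail firm,
# evaluated against the period-s frontier (hypothetical values).
D_t_t   = 0.80   # D^t  (x^t,     y^t)
D_t_t1  = 0.95   # D^t  (x^{t+1}, y^{t+1})
D_t1_t  = 0.70   # D^{t+1}(x^t,   y^t)
D_t1_t1 = 0.85   # D^{t+1}(x^{t+1}, y^{t+1})

# Malmquist index M = efficiency change x technical change
efficiency_change = D_t1_t1 / D_t_t
technical_change = math.sqrt((D_t_t1 / D_t1_t1) * (D_t_t / D_t1_t))
malmquist = efficiency_change * technical_change
print(f"EC={efficiency_change:.3f}  TC={technical_change:.3f}  M={malmquist:.3f}")
```

Values above 1 indicate improvement; in a full study the distance functions themselves are estimated for every firm by data envelopment analysis before this decomposition is applied.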


Consider the concept combination ‘pet human’. In word association experiments, human subjects produce the associate ‘slave’ in relation to this combination. The striking aspect of this associate is that it is not produced as an associate of ‘pet’ or ‘human’ in isolation. In other words, the associate ‘slave’ seems to be emergent. Such emergent associations sometimes have a creative character, and cognitive science is largely silent about how we produce them. Departing from a dimensional model of human conceptual space, this article explores concept combinations and argues that emergent associations are a result of abductive reasoning within conceptual space, that is, below the symbolic level of cognition. A tensor-based approach is used to model concept combinations, allowing such combinations to be formalized as interacting quantum systems. Free association norm data are used to motivate the underlying basis of the conceptual space. It is shown by analogy how some concept combinations may behave like quantum-entangled (non-separable) particles. Two methods of analysis are presented for empirically validating the presence of non-separable concept combinations in human cognition. One method is based on quantum theory; the other is based on comparing a joint (true theoretic) probability distribution with a distribution constructed under a separability assumption, using a chi-square goodness-of-fit test. Although these methods were inconclusive in relation to an empirical study of bi-ambiguous concept combinations, avenues for further refinement of these methods are identified.
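The second, separability-based method can be sketched directly: build the distribution a separable model predicts (the product of the marginals) and test the observed joint counts against it with a chi-square statistic. The 2x2 associate outcomes and counts below are hypothetical illustrative data, not the paper's free-association norms:

```python
import numpy as np

# Observed joint counts over two binary associate outcomes, one per concept
# in the combination (hypothetical data).
observed = np.array([[40.0, 10.0],
                     [ 5.0, 45.0]])
n = observed.sum()

# Separability assumption: the joint distribution factorises into marginals
p_row = observed.sum(axis=1) / n
p_col = observed.sum(axis=0) / n
expected = np.outer(p_row, p_col) * n    # counts a separable model predicts

# Chi-square goodness-of-fit of observed joint vs. separable prediction;
# a large statistic is evidence against separability
chi2 = ((observed - expected) ** 2 / expected).sum()
print(f"chi-square statistic: {chi2:.2f}")
```

The statistic would then be compared with the chi-square critical value for one degree of freedom (for a 2x2 table) to decide whether the separable model can be rejected.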