915 results for HYDRIDE CAPTURE
Abstract:
Advances in information and communications technologies during the last two decades have allowed organisations to capture and utilise data on a vast scale, thus heightening the importance of adequate measures to protect against the unauthorised disclosure of personal information. In this respect, data breach notification has emerged as an issue of increasing importance throughout the world. It has been the subject of law reform in the United States and in other international jurisdictions. Following the Australian Law Reform Commission's review of privacy, data breach notification will soon be addressed in Australia. This article provides a review of US and Australian legal initiatives regarding the notification of data breaches. The authors highlight areas of concern, based on the extant US literature, that require specific consideration in the development of an Australian legal framework for the notification of data breaches.
Abstract:
Background: The proportion of older individuals in the driving population is predicted to increase in the next 50 years. This has important implications for driving safety, as abilities which are important for safe driving, such as vision (which accounts for the majority of the sensory input required for driving), processing ability and cognition, have been shown to decline with age. The current methods employed for screening older drivers upon re-licensure are also vision based. This study, which investigated social, behavioural and professional aspects involved with older drivers, aimed to determine: (i) if the current visual standards in place for testing upon re-licensure are effective in reducing the older driver fatality rate in Australia; (ii) if the recommended visual standards are actually implemented as part of the testing procedures by Australian optometrists; and (iii) if there are other non-standardised tests which may be better at predicting the on-road incident risk (including near misses and minor incidents) in older drivers than those tests recommended in the standards. Methods: For the first phase of the study, state-based age- and gender-stratified numbers of older driver fatalities for 2000-2003 were obtained from the Australian Transport Safety Bureau database. Poisson regression analyses of fatality rates were considered by renewal frequency and jurisdiction (as separate models), adjusting for the possible confounding variables of age, gender and year. For the second phase, all practising optometrists in Australia were surveyed on the vision tests they conduct in consultations relating to driving and their knowledge of vision requirements for older drivers. Finally, for the third phase of the study, to investigate determinants of on-road incident risk, a stratified random sample of 600 Brisbane residents aged 60 years and older were selected and invited to participate using an introductory letter explaining the project requirements.
In order to capture the number and type of road incidents which occurred for each participant over 12 months (including near misses and minor incidents), an important component of the prospective research study was the development and validation of a driving diary. The diary was a tool in which incidents could be logged at the time they occurred (or very close to it) and thus, in comparison with relying on participant memory over time, recall bias of incident occurrence was minimised. Associations between all visual tests, cognition and scores obtained for non-standard functional tests and retrospective and prospective incident occurrence were investigated. Results: In the first phase, drivers aged 60-69 years had a 25% lower fatality risk (Rate Ratio [RR] = 0.75, 95% CI 0.32-1.77) in states with vision testing upon re-licensure compared with states with no vision testing upon re-licensure; however, because the CIs are wide, crossing 1.00, this result should be regarded with caution. Overall fatality rates and fatality rates for those aged 70 years and older (RR = 1.17, CI 0.64-2.13) did not differ between states with and without licence renewal procedures, indicating no apparent benefit of vision testing legislation. For the second phase of the study, nearly all optometrists measured visual acuity (VA) as part of a vision assessment for re-licensing; however, 20% of optometrists did not perform any visual field (VF) testing and only 20% routinely performed automated VF on older drivers, despite the standards for licensing advocating automated VF as part of the vision standard. This demonstrates the need for more effective communication between the policy makers and those responsible for carrying out the standards. It may also indicate that the tests recommended by the standards are only partially being conducted by optometrists, which could contribute to the overall higher driver fatality rate in jurisdictions with vision testing requirements.
Hence a standardised protocol for the screening of older drivers for re-licensure across the nation must be established. The opinions of Australian optometrists with regard to the responsibility of reporting older drivers who fail to meet the licensing standards highlighted the conflict between maintaining patient confidentiality and upholding public safety. Mandatory reporting requirements for those drivers who fail to reach the standards necessary for driving would minimise potential conflict between the patient and their practitioner, and help maintain patient trust and goodwill. The final phase of the PhD program investigated the efficacy of vision, functional and cognitive tests to discriminate between at-risk and safe older drivers. Nearly 80% of the participants experienced an incident of some form over the prospective 12 months, with the total incident rate being 4.65/10 000 km. Sixty-three percent reported having a near miss and 28% had a minor incident. The results from the prospective diary study indicate that the current vision screening tests (VA and VF) used for re-licensure do not accurately predict which older drivers are at increased odds of having an on-road incident. However, the variation in visual measurements of the cohort was narrow, which also affected the results seen with the visual function questionnaires. Hence a larger cohort with greater variability should be considered for a future study. A slightly lower cognitive level (as measured with the Mini-Mental State Examination [MMSE]) showed an association with incident involvement, as did slower reaction time (RT); however, the Useful Field of View (UFOV) provided the most compelling results of the study. Cut-off values of UFOV processing (>23.3 ms), divided attention (>113 ms), selective attention (>258 ms) and overall score (moderate/high/very high risk) were effective in identifying older drivers at increased odds of having any on-road incident and of having minor incidents.
Discussion: The results have shown that for the 60-69 year age-group, there is a potential benefit in testing vision upon licence renewal. However, overall fatality rates and fatality rates for those aged 70 years and older indicated no benefit of vision testing legislation, suggesting a need to include screening tests which better predict on-road incidents. Although VA is routinely performed by Australian optometrists on older drivers renewing their licence, VF is not. There is therefore a need for a standardised protocol to be developed and administered throughout the nation for the screening of older drivers upon re-licensure. Communication between the community, policy makers and those conducting the protocol should be maximised. By implementing a standardised screening protocol which incorporates a level of mandatory reporting by the practitioner, the ethical dilemma of breaching patient confidentiality would also be resolved. The tests included in this screening protocol, however, cannot solely be ones which have been implemented in the past. In this investigation, RT, MMSE and UFOV were shown to be better determinants of on-road incidents in older drivers than VA and VF; however, as previously mentioned, there was a lack of variability in visual status within the cohort. Nevertheless, this investigation recommends that, subject to appropriate sensitivity and specificity being demonstrated in the future using a cohort with wider variation in vision, functional performance and cognition, these tests of cognition and information processing be added to the current protocol for the screening of older drivers, which may be conducted at licensing centres across the nation.
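A Poisson rate-ratio comparison of the kind run in the first phase can be sketched, for two exposure groups, with the closed-form estimate and a Wald confidence interval on the log scale. The counts below are hypothetical, not the study's data:

```python
import math

def poisson_rate_ratio(events_a, exposure_a, events_b, exposure_b):
    """Rate ratio of group A vs group B with a Wald 95% CI on the log scale."""
    rr = (events_a / exposure_a) / (events_b / exposure_b)
    se_log_rr = math.sqrt(1 / events_a + 1 / events_b)
    lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
    hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
    return rr, lo, hi

# Hypothetical fatality counts and driver-years for states with vs
# without vision testing upon re-licensure.
rr, lo, hi = poisson_rate_ratio(30, 200_000, 45, 200_000)
print(f"RR = {rr:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```

As in the study's 60-69 age group, a point estimate below 1 whose interval crosses 1.00 should be read with caution. The full analysis additionally adjusted for age, gender and year, which requires a regression model rather than this two-group estimate.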
Abstract:
In this chapter, ideas from ecological psychology and nonlinear dynamics are integrated to characterise decision-making as an emergent property of self-organisation processes in the interpersonal interactions that occur in sports teams. A conceptual model is proposed to capture constraints on the dynamics of decisions and actions in dyadic systems, which has been empirically evaluated in simulations of interpersonal interactions in team sports. For this purpose, co-adaptive interpersonal dynamics in team sports such as rugby union have been studied to reveal control parameter and collective variable relations in attacker-defender dyads. Although the interpersonal dynamics of attackers and defenders in 1 vs 1 situations showed characteristics of chaotic attractors, the informational constraints of rugby union typically bounded dyadic systems into low dimensional attractors. Our work suggests that the dynamics of attacker-defender dyads can be characterised as an evolving sequence, since players' positioning and movements are connected in diverse ways over time.
Abstract:
In the region of self-organized criticality (SOC), interdependency between multi-agent system components exists, and slight changes in near-neighbor interactions can break the balance of equally poised options, leading to transitions in system order. In this region, the frequency of events of differing magnitudes exhibits a power law distribution. The aim of this paper was to investigate whether a power law distribution characterized attacker-defender interactions in team sports. For this purpose we observed attackers and defenders in a dyadic sub-phase of rugby union near the try line. Videogrammetry was used to capture players' motion over time as player locations were digitized. Power laws were calculated for the rate of change of players' relative position. Data revealed that three emergent patterns from dyadic system interactions (i.e., try; unsuccessful tackle; effective tackle) displayed a power law distribution. Results suggested that the pattern-forming dynamics of dyads in rugby union exhibited SOC. It was concluded that rugby union dyads evolve in SOC regions, suggesting that players' decisions and actions are governed by local interaction rules.
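A power law in event frequencies appears as a straight line on log-log axes, so its exponent can be estimated with a simple least-squares fit in log space. A minimal sketch (synthetic data, not the study's digitized player positions):

```python
import numpy as np

def fit_power_law(magnitudes, frequencies):
    """Fit log(frequency) = log(c) + alpha * log(magnitude) by least squares.

    A good straight-line fit with alpha < 0 on log-log axes is consistent
    with a power law f(m) = c * m**alpha.
    """
    alpha, log_c = np.polyfit(np.log(magnitudes), np.log(frequencies), 1)
    return alpha, np.exp(log_c)

# Synthetic event counts whose frequency falls off as magnitude**-2
magnitudes = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
frequencies = 100.0 * magnitudes ** -2.0

alpha, c = fit_power_law(magnitudes, frequencies)
print(alpha, c)  # exponent close to -2, prefactor close to 100
```

In practice, binning choices and heavy-tailed noise make log-log regression a rough diagnostic; maximum-likelihood estimators are often preferred for formal power-law tests.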
Abstract:
Innovation Management (IM) in most knowledge-based firms is used on an ad hoc basis, where senior managers use the term to leverage competitive edge without understanding its true meaning and how its robust application impacts organisational performance. There have been attempts in the manufacturing industry to harness the innovative potential of the business and apply it as a point of difference to improve financial and non-financial outcomes. However, further work is required to extrapolate the lessons learnt in order to introduce incremental and/or radical innovation to knowledge-based firms. An international structural engineering firm has been proactive in exploring and implementing this idea and has forged an alliance with the Queensland University of Technology to start the Innovation Management Program (IMP). The aim was to develop a permanent and sustainable program through which innovation can be woven through the fabric of the organisation. There was an intention to reinforce the firm's vision, reinvigorate ideas and create new options that help in its realisation. This paper outlines the need for innovation in knowledge-based firms and how this consulting engineering firm reacted to this exigency. The development of the Innovation Management Program, its different themes (and associated projects) and how they integrate to form a holistic model is also discussed. The model is designed around the need to provide professional qualification improvement opportunities for staff, to set up organised, structured and easily accessible knowledge repositories to capture tacit and explicit knowledge, and to implement efficient project management strategies with a view to enhancing client satisfaction. A Delphi-type workshop is used to confirm the themes and projects. Some of the individual projects and their expected outcomes are also discussed.
A questionnaire and interviews were used to collect data to select appropriate candidates responsible for leading these projects. Following an in-depth analysis of preliminary research results, some recommendations on the selection process will also be presented.
Synthesis of 4-arm star poly(L-Lactide) oligomers using an in situ-generated calcium-based initiator
Abstract:
Using an in situ-generated calcium-based initiating species derived from pentaerythritol, the bulk synthesis of well-defined 4-arm star poly(L-lactide) oligomers has been studied in detail. The substitution of the traditional initiator, stannous octoate, with calcium hydride allowed the synthesis of oligomers that had both low PDIs and a comparable number of polymeric arms (3.7-3.9) to oligomers of similar molecular weight. Investigations into the degree of control observed during the course of the polymerization found that the insolubility of pentaerythritol in molten L-lactide resulted in an uncontrolled polymerization only when the feed mole ratio of L-lactide to pentaerythritol was 13. At feed ratios of 40 and greater, a pseudo-living polymerization was observed. As part of this study, in situ FT-Raman spectroscopy was demonstrated to be a suitable method to monitor the kinetics of the ring-opening polymerization (ROP) of lactide. The advantages of using this technique rather than FT-IR-ATR and 1H NMR for monitoring L-lactide consumption during polymerization are discussed.
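Monomer-consumption data of the kind obtained here by in situ FT-Raman typically feed a pseudo-first-order kinetic analysis: for a living ROP, ln([M]0/[M]t) grows linearly with time, and the slope gives the apparent rate constant. A sketch under that assumption, with hypothetical conversion values (not measured data):

```python
import math

def apparent_rate_constant(times, conversions):
    """Least-squares slope of ln(1/(1-x)) versus t.

    For first-order consumption of monomer, ln([M]0/[M]t) = ln(1/(1-x))
    is linear in time with slope k_app; curvature indicates loss of control.
    """
    ys = [math.log(1.0 / (1.0 - x)) for x in conversions]
    n = len(times)
    t_mean = sum(times) / n
    y_mean = sum(ys) / n
    num = sum((t - t_mean) * (y - y_mean) for t, y in zip(times, ys))
    den = sum((t - t_mean) ** 2 for t in times)
    return num / den

# Hypothetical conversions consistent with k_app = 0.02 min^-1
times = [0, 10, 20, 30, 40]                       # minutes
conversions = [1 - math.exp(-0.02 * t) for t in times]

k_app = apparent_rate_constant(times, conversions)
print(k_app)  # ~0.02 per minute
```

With real spectroscopic data, the conversion values would first be derived from the decay of a monomer band intensity relative to an internal reference.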
Abstract:
Emissions from airport operations are of significant concern because of their potential impact on local air quality and human health. The currently limited scientific knowledge of aircraft emissions is an important issue worldwide when considering air pollution associated with airport operation, and this is especially so for ultrafine particles. This limited knowledge is due to the scientific complexities associated with measuring aircraft emissions during normal operations on the ground. In particular, this type of research has required the development of novel sampling techniques which must take into account aircraft plume dispersion and dilution, as well as the various particle dynamics that can affect measurements of the engine plume from an operational aircraft. In order to address this scientific problem, a novel mobile emission measurement method called the Plume Capture and Analysis System (PCAS) was developed and tested. The PCAS permits the capture and analysis of aircraft exhaust during ground level operations including landing, taxiing, takeoff and idle. The PCAS uses a sampling bag to temporarily store a sample, providing sufficient time for sensitive but slow instrumental techniques to be employed to measure gas and particle emissions simultaneously and to record detailed particle size distributions. The challenges in relation to the development of the technique include complexities associated with the assessment of the various particle loss and deposition mechanisms which are active during storage in the PCAS. Laboratory based assessment of the method showed that the bag sampling technique can be used to accurately measure particle emissions (e.g. particle number, mass and size distribution) from a moving aircraft or vehicle. Further assessment of the sensitivity of PCAS results to distance from the source and plume concentration was conducted in the airfield with taxiing aircraft.
The results showed that the PCAS is a robust method capable of capturing the plume in only 10 seconds. The PCAS is able to account for aircraft plume dispersion and dilution at distances of 60 to 180 meters downwind of a moving aircraft, along with particle deposition loss mechanisms during the measurements. Characterization of the plume in terms of particle number, mass (PM2.5), gaseous emissions and particle size distribution takes only 5 minutes, allowing large numbers of tests to be completed in a short time. The results were broadly consistent and compared well with the available data. Comprehensive measurements and analyses of the aircraft plumes during various modes of the landing and takeoff (LTO) cycle (e.g. idle, taxi, landing and takeoff) were conducted at Brisbane Airport (BNE). Gaseous (NOx, CO2) emission factors, particle number and mass (PM2.5) emission factors and size distributions were determined for a range of Boeing and Airbus aircraft, as a function of aircraft type and engine thrust level. The scientific complexities, including the analysis of the often multimodal particle size distributions to describe the contributions of different particle source processes during the various stages of aircraft operation, were addressed through comprehensive data analysis and interpretation. The measurement results were used to develop an inventory of aircraft emissions at BNE, including all modes of the aircraft LTO cycle and ground running procedures (GRP). Measuring the actual duration of aircraft activity in each mode of operation (time-in-mode) and compiling a comprehensive matrix of gas and particle emission rates as a function of aircraft type and engine thrust level for real world situations were crucial for developing the inventory. The significance of the resulting matrix of emission rates in this study lies in the estimate it provides of the annual particle emissions due to aircraft operations, especially in terms of particle number.
In summary, this PhD thesis presents for the first time a comprehensive study of the particle and NOx emission factors and rates along with the particle size distributions from aircraft operations and provides a basis for estimating such emissions at other airports. This is a significant addition to the scientific knowledge in terms of particle emissions from aircraft operations, since the standard particle number emissions rates are not currently available for aircraft activities.
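Fuel-based emission factors of the kind reported above are commonly derived by normalising the background-corrected pollutant concentration in a captured plume sample to the background-corrected CO2, then scaling by the CO2 yield of the fuel, since both species are diluted by the same factor. A minimal sketch of that arithmetic (the exact PCAS data processing may differ; the CO2 yield of roughly 3160 g per kg of kerosene-type jet fuel is an assumed constant):

```python
def emission_factor(delta_pollutant, delta_co2_g_m3, ef_co2_g_kg=3160.0):
    """Fuel-based emission factor from background-corrected plume concentrations.

    delta_pollutant -- pollutant above background in the captured sample
                       (e.g. particles per cm^3, or g/m^3 for PM2.5)
    delta_co2_g_m3  -- CO2 above background in the same sample (g/m^3)
    ef_co2_g_kg     -- grams of CO2 emitted per kg of fuel burned
                       (assumed ~3160 g/kg for kerosene-type jet fuel)

    Returns pollutant emitted per kg of fuel burned. Plume dilution cancels
    out because pollutant and CO2 are diluted identically.
    """
    return delta_pollutant / delta_co2_g_m3 * ef_co2_g_kg

# Example: 2 units of excess pollutant per 1 g/m^3 of excess CO2
print(emission_factor(2.0, 1.0))  # 6320.0 units per kg of fuel
```

Multiplying such per-kilogram factors by fuel burn per mode and time-in-mode is one route to the kind of LTO-cycle inventory described above.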
Abstract:
The use of the PC and Internet for placing telephone calls will present new opportunities to capture vast amounts of un-transcribed speech for a particular speaker. This paper investigates how to best exploit this data for speaker-dependent speech recognition. Supervised and unsupervised experiments in acoustic model and language model adaptation are presented. Using one hour of automatically transcribed speech per speaker with a word error rate of 36.0%, unsupervised adaptation resulted in an absolute gain of 6.3%, equivalent to 70% of the gain from the supervised case, with additional adaptation data likely to yield further improvements. LM adaptation experiments suggested that although there seems to be a small degree of speaker idiolect, adaptation to the speaker alone, without considering the topic of the conversation, is in itself unlikely to improve transcription accuracy.
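The 36.0% figure quoted above is a word error rate (WER), the standard metric for transcription accuracy: the word-level edit distance between hypothesis and reference, divided by the reference length. A minimal sketch of its computation:

```python
def word_error_rate(reference, hypothesis):
    """WER = (substitutions + insertions + deletions) / reference length,
    computed with the standard edit-distance dynamic program over words."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,          # deletion
                           dp[i][j - 1] + 1,          # insertion
                           dp[i - 1][j - 1] + cost)   # substitution / match
    return dp[len(ref)][len(hyp)] / len(ref)

print(word_error_rate("the cat sat on the mat", "the cat sat on mat"))  # 1/6
```

The "absolute gain of 6.3%" reported above is then simply the difference between the baseline WER and the WER after adaptation.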
Abstract:
To date, most theories of business models have theorized value capture assuming that appropriability regimes were exogenous and that the firm would face a unique, ideal-typical appropriability regime. This has led theory contributions to focus on governance structures to minimize transaction costs, to downplay the interdependencies between value capture and value creation, and to ignore revenue generation strategies. We propose a reconceptualization of business models' value-capture mechanisms that relies on assumptions of endogeneity and multiplicity of appropriability regimes. This new approach to business model construction highlights the interdependencies and trade-offs between value creation and value capture offered by different types and combinations of appropriability regimes. The theory is illustrated by the analysis of three cases of open source software business models.
Abstract:
This chapter considers shared encounters through blogging in the light of John Urry’s new mobilities paradigm. We review relevant literature on mobile blogging (moblogging) – blogging, pervasive image capture and sharing, moblogging and video blogging – and describe common issues with these digital content sharing practices. We then document some features of how technology affords “reflexive encounters” through the description of a blogging study involving smokers trying to quit, describing important connections between mobilities – physical, object, and communicative mobility. Finally, we present some challenges for new blogging technologies, their relevance to social encounters, and possible future directions through considering the mobile self; the new digital life document; and digital content sharing practices.
Abstract:
Background: Poor appetite is a marker of morbidity and mortality in hemodialysis patients, making it an important area for research. Visual analog scales (VAS) can capture a range of subjective sensations related to appetite (such as hunger, desire to eat or fullness), but have not been commonly used to measure appetite in dialysis patients. The aim of this study was to explore the association between retrospective ratings of appetite using VAS and a range of clinical variables as well as biomarkers of appetite in hemodialysis patients. Methods: 28 hemodialysis patients (mean age 61±17 y, 50% male, median dialysis vintage 19.5 (4-101) months) rated their appetite using VAS for hunger, fullness and desire to eat, and a 5-point categorical scale measuring general appetite. Blood levels of the appetite peptides leptin, ghrelin and peptide YY were also measured. Results: Hunger ratings measured by VAS were significantly (p<0.05) correlated with a range of clinical, nutritional and inflammatory markers: age (r=-0.376), co-morbidities (r=-0.380), PG-SGA score (r=-0.451), weight (r=-0.375), fat-free mass (r=-0.435), C-reactive protein (CRP) (r=-0.383) and intercellular adhesion molecule (sICAM-1) (r=-0.387). There was a consistent relationship between VAS and appetite on a 5-point categorical scale for questions of hunger, and a similar trend for desire to eat, but not for fullness. Neither method of measuring subjective appetite correlated with the appetite peptides. Conclusions: Retrospective ratings of hunger on a VAS are associated with a range of clinical variables, and further studies are warranted to support their use as a method of measuring appetite in dialysis patients.
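The r values reported above are Pearson correlation coefficients between hunger ratings and each clinical variable. As a sketch of the calculation (hypothetical numbers, not the study's data):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical VAS hunger ratings against age: a perfectly linear
# negative trend gives r = -1, mirroring the negative r values above.
ages = [40, 50, 60, 70, 80]
hunger = [90, 80, 70, 60, 50]
print(pearson_r(ages, hunger))  # -1.0
```

Real data would of course yield intermediate values like the study's r = -0.376 for age, with significance assessed against the sample size.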
Abstract:
The central aim for the research undertaken in this PhD thesis is the development of a model for simulating water droplet movement on a leaf surface and to compare the model behavior with experimental observations. A series of five papers has been presented to explain systematically the way in which this droplet modelling work has been realised. Knowing the path of the droplet on the leaf surface is important for understanding how a droplet of water, pesticide, or nutrient will be absorbed through the leaf surface. An important aspect of the research is the generation of a leaf surface representation that acts as the foundation of the droplet model. Initially a laser scanner is used to capture the surface characteristics for two types of leaves in the form of a large scattered data set. After the identification of the leaf surface boundary, a set of internal points is chosen over which a triangulation of the surface is constructed. We present a novel hybrid approach for leaf surface fitting on this triangulation that combines Clough-Tocher (CT) and radial basis function (RBF) methods to achieve a surface with a continuously turning normal. The accuracy of the hybrid technique is assessed using numerical experimentation. The hybrid CT-RBF method is shown to give good representations of Frangipani and Anthurium leaves. Such leaf models facilitate an understanding of plant development and permit the modelling of the interaction of plants with their environment. The motion of a droplet traversing this virtual leaf surface is affected by various forces including gravity, friction and resistance between the surface and the droplet. The innovation of our model is the use of thin-film theory in the context of droplet movement to determine the thickness of the droplet as it moves on the surface. Experimental verification shows that the droplet model captures reality quite well and produces realistic droplet motion on the leaf surface. 
Most importantly, we observed that the simulated droplet motion follows the contours of the surface and spreads as a thin film. In the future, the model may be applied to determine the path of a droplet of pesticide along a leaf surface before it falls from or comes to a standstill on the surface. It will also be used to study the paths of many droplets of water or pesticide moving and colliding on the surface.
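The RBF half of the hybrid CT-RBF surface fit described above can be sketched with a Gaussian radial basis function interpolant over scattered points (illustrative only: the thesis combines RBFs with Clough-Tocher patches on a triangulation, and its choice of basis function may differ):

```python
import numpy as np

def rbf_fit(points, values, eps=1.0):
    """Solve for Gaussian RBF weights so the surface interpolates the data.

    points : (n, 2) scattered (x, y) locations; values : (n,) heights.
    The Gaussian kernel matrix is positive definite for distinct points.
    """
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    phi = np.exp(-((eps * d) ** 2))
    return np.linalg.solve(phi, values)

def rbf_eval(points, weights, query, eps=1.0):
    """Evaluate the fitted RBF surface at (m, 2) query locations."""
    d = np.linalg.norm(query[:, None, :] - points[None, :, :], axis=-1)
    return np.exp(-((eps * d) ** 2)) @ weights

# Hypothetical scattered leaf-surface heights
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]])
z = np.array([0.0, 0.1, 0.1, 0.0, 0.3])

w = rbf_fit(pts, z)
print(rbf_eval(pts, w, pts))  # reproduces z at the data points
```

A pure RBF fit already gives a smooth surface; the appeal of the hybrid scheme in the thesis is combining this smoothness with the locality and efficiency of Clough-Tocher triangular patches.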
Abstract:
Over the last decade, the rapid growth and adoption of the World Wide Web has further exacerbated user needs for efficient mechanisms for information and knowledge location, selection, and retrieval. How to gather useful and meaningful information from the Web has become challenging for users. The capture of user information needs is key to delivering users' desired information, and user profiles can help to capture information needs. However, effectively acquiring user profiles is difficult. It is argued that if user background knowledge can be specified by ontologies, more accurate user profiles can be acquired and thus information needs can be captured effectively. Web users implicitly possess concept models that are obtained from their experience and education, and use the concept models in information gathering. Prior to this work, much research has attempted to use ontologies to specify user background knowledge and user concept models. However, these works have a drawback in that they cannot move beyond the subsumption of super- and sub-class structure to emphasising the specific semantic relations in a single computational model. This has also been a challenge for years in the knowledge engineering community. Thus, using ontologies to represent user concept models and to acquire user profiles remains an unsolved problem in personalised Web information gathering and knowledge engineering. In this thesis, an ontology learning and mining model is proposed to acquire user profiles for personalised Web information gathering. The proposed computational model emphasises the specific is-a and part-of semantic relations in one computational model. The world knowledge and users' Local Instance Repositories are used to attempt to discover and specify user background knowledge. From a world knowledge base, personalised ontologies are constructed by adopting automatic or semi-automatic techniques to extract user interest concepts, focusing on user information needs.
A multidimensional ontology mining method, Specificity and Exhaustivity, is also introduced in this thesis for analysing the user background knowledge discovered and specified in user personalised ontologies. The ontology learning and mining model is evaluated by comparison with human-based and state-of-the-art computational models in experiments, using a large, standard data set. The experimental results are promising. The proposed ontology learning and mining model in this thesis helps to develop a better understanding of user profile acquisition, thus providing better design of personalised Web information gathering systems. The contributions are increasingly significant, given both the rapid explosion of Web information in recent years and today's accessibility to the Internet and the full text world.
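As a loose illustration of how depth and coverage can be scored over an is-a taxonomy, consider the toy sketch below. This is illustrative only: the thesis defines Specificity and Exhaustivity as its own multidimensional measures over both is-a and part-of relations, and the taxonomy here is invented.

```python
# A toy is-a taxonomy as child -> parent mappings.
IS_A = {
    "sports_car": "car",
    "car": "vehicle",
    "truck": "vehicle",
    "vehicle": "thing",
}

def specificity(concept):
    """Depth in the taxonomy: more specific concepts sit deeper."""
    depth = 0
    while concept in IS_A:
        concept = IS_A[concept]
        depth += 1
    return depth

def exhaustivity(concept):
    """Number of concepts subsumed by `concept`, including itself."""
    children = [c for c, p in IS_A.items() if p == concept]
    return 1 + sum(exhaustivity(c) for c in children)

print(specificity("sports_car"))  # 3 (sports_car -> car -> vehicle -> thing)
print(exhaustivity("vehicle"))    # 4 (vehicle, car, sports_car, truck)
```

The intuition these scores capture is the trade-off the thesis analyses: very specific concepts describe a user interest precisely but cover little, while general concepts cover much but say little.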
Abstract:
Credentials are a salient form of cultural capital and if a student’s learning and productions are not assessed, they are invisible in current social systems of education and employment. In this field, invisible equals non-existent. This paper arises from the context of an alternative education institution where conventional educational assessment techniques currently fail to recognise the creativity and skills of a cohort of marginalised young people. In order to facilitate a new assessment model an electronic portfolio system (EPS) is being developed and trialled to capture evidence of students’ learning and their productions. In so doing a dynamic system of arranging, exhibiting, exploiting and disseminating assessment data in the form of coherent, meaningful and valuable reports will be maintained. The paper investigates the notion of assessing development of creative thinking and skills through the means of a computerised system that operates in an area described as the efield. A model of the efield is delineated and is explained as a zone existing within the internet where free users exploit the cloud and cultivate social and cultural capital. Drawing largely on sociocultural theory and Bourdieu’s concepts of field, habitus and capitals, the article positions the efield as a potentially productive instrument in assessment for learning practices. An important aspect of the dynamics of this instrument is the recognition of teachers as learners. This is seen as an integral factor in the sociocultural approach to assessment for learning practices that will be deployed with the EPS. What actually takes place is argued to be assessment for learning as a field of exchange. The model produced in this research is aimed at delivering visibility and recognition through an engaging instrument that will enhance the prospects of marginalised young people and shift the paradigm for assessment in a creative world.
Abstract:
Business Process Management (BPM) has emerged as a popular management approach in both Information Technology (IT) and management practice. While there has been much research on business process modelling and the BPM life cycle, there has been little attention given to managing the quality of a business process during its life cycle. This study addresses this gap by providing a framework for organisations to manage the quality of business processes during different phases of the BPM life cycle. This study employs a multi-method research design which is based on the design science approach and the action research methodology. During the design science phase, the artifacts to model a quality-aware business process were developed. These artifacts were then evaluated through three cycles of action research which were conducted within three large Australian-based organisations. This study contributes to the body of BPM knowledge in a number of ways. Firstly, it presents a quality-aware BPM life cycle that provides a framework on how quality can be incorporated into a business process and subsequently managed during the BPM life cycle. Secondly, it provides a framework to capture and model quality requirements of a business process as a set of measurable elements that can be incorporated into the business process model. Finally, it proposes a novel root cause analysis technique for determining the causes of quality issues within business processes.