Abstract:
It is important that we understand the factors and conditions that shape driver behaviour – those conditions within the road transport system that contribute to driver error and the situations in which driver non-compliance with road regulations is likely. This report presents the findings of a program of research investigating the nature of errors made by drivers, comprising a literature review and an on-road study. The review indicates that, despite significant investigation, the role of different error types in road traffic crashes remains unclear, as does the role of wider road transport system failures in driver error causation.
Abstract:
This paper proposes a recommendation system that supports process participants in making risk-informed decisions, with the goal of reducing risks that may arise during process execution. Risk reduction involves decreasing both the likelihood of a process fault occurring and its severity. Given a business process exposed to risks, e.g. a financial process exposed to a risk of reputation loss, we enact this process and, whenever a process participant needs to provide input to the process, e.g. by selecting the next task to execute or by filling out a form, we suggest to the participant the action that minimizes the predicted process risk. Risks are predicted by traversing decision trees generated from the logs of past process executions, which consider process data, involved resources, task durations and other information elements such as task frequencies. When applied to multiple process instances running concurrently, a second technique is employed that uses integer linear programming to compute the optimal assignment of resources to the tasks to be performed, in order to deal with the interplay between risks related to different instances. The recommendation system has been implemented as a set of components on top of the YAWL BPM system and its effectiveness has been evaluated in a real-life scenario, in collaboration with risk analysts of a large insurance company. The results, based on a simulation of the real-life scenario and its comparison with the event data provided by the company, show that concurrently executed process instances complete with significantly fewer faults and lower fault severities when the recommendations provided by our recommendation system are taken into account.
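The two computational steps in this approach, risk prediction with a decision tree learned from past execution logs and risk-minimising resource assignment via integer linear programming, can be outlined with a minimal sketch. This is an illustration only: the feature set, task and resource identifiers and the fault-cost column below are assumptions for the example, not components of the YAWL-based implementation described in the paper.

# Minimal sketch: learn per-(resource, task) fault risk from past logs with a
# decision tree, then assign resources to pending tasks with an ILP that
# minimises total predicted risk. All data here is invented for illustration.
import pandas as pd
import pulp
from sklearn.tree import DecisionTreeRegressor

# Hypothetical execution log: features plus observed fault cost (likelihood x severity).
log = pd.DataFrame({
    "resource_id": [0, 0, 1, 1, 2, 2],
    "task_id":     [0, 1, 0, 1, 0, 1],
    "duration_h":  [2.0, 5.0, 1.5, 6.0, 3.0, 4.5],
    "fault_cost":  [0.1, 0.8, 0.2, 0.3, 0.6, 0.4],
})
tree = DecisionTreeRegressor(max_depth=3).fit(
    log[["resource_id", "task_id", "duration_h"]].values, log["fault_cost"])

resources, tasks = [0, 1, 2], [0, 1]
expected_duration = {(r, t): 3.0 for r in resources for t in tasks}  # assumed constant

# Predicted risk of assigning task t to resource r.
risk = {(r, t): float(tree.predict([[r, t, expected_duration[(r, t)]]])[0])
        for r in resources for t in tasks}

# ILP: every pending task gets exactly one resource; a resource takes at most one task.
prob = pulp.LpProblem("risk_minimising_assignment", pulp.LpMinimize)
x = pulp.LpVariable.dicts("assign", list(risk), cat="Binary")
prob += pulp.lpSum(risk[k] * x[k] for k in risk)
for t in tasks:
    prob += pulp.lpSum(x[(r, t)] for r in resources) == 1
for r in resources:
    prob += pulp.lpSum(x[(r, t)] for t in tasks) <= 1
prob.solve(pulp.PULP_CBC_CMD(msg=False))

for (r, t), var in x.items():
    if var.value() == 1:
        print(f"recommend resource {r} for task {t} (predicted risk {risk[(r, t)]:.2f})")

In a running engine the prediction and assignment would be recomputed at each user decision point, which is where the recommendations described in the abstract come from.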
Abstract:
Objective. To evaluate the effectiveness of a single-session online theory of planned behaviour (TPB)-based intervention to improve sun-protective attitudes and behaviour among Australian adults. Methods. Australian adults (N = 534; 38.7% male; mean age = 39.3 years) from major cities (80.9%), regional areas (17.6%) and remote areas (1.5%) were recruited and randomly allocated to an intervention group (N = 265) or an information-only group (N = 267). The online intervention focused on fostering positive attitudes, perceptions of normative support, and control perceptions for sun protection. Participants completed questionnaires assessing standard TPB measures (attitude, subjective norm, perceived behavioural control, intention, behaviour) and extended TPB constructs of group norm (friends, family), personal norm, and image norm, pre-intervention (Time 1), one week post-intervention (Time 2) and one month post-intervention (Time 3). Repeated-measures multivariate analysis of variance tested intervention effects across time. Results. Intervention participants reported more positive attitudes towards sun protection and used sun-protective measures more often in the subsequent month than participants receiving information only. The intervention effects on control perceptions and norms were non-significant. Conclusions. A theory-based online intervention fostering more favourable attitudes towards sun safety can increase sun-protection attitudes and self-reported behaviour among Australian adults in the short term.
Abstract:
Non-use values (i.e. economic values assigned by individuals to ecosystem goods and services unrelated to current or future uses) provide one of the most compelling incentives for the preservation of ecosystems and biodiversity. Assessing the non-use values of non-users is relatively straightforward using stated preference methods, but the standard approaches for estimating the non-use values of users (stated decomposition) have substantial shortcomings which undermine the robustness of their results. In this paper, we propose a pragmatic interpretation of non-use values to derive estimates that capture their main dimensions, based on the identification of a willingness to pay for ecosystem protection beyond one's expected lifetime. We empirically test our approach using a choice experiment on coral reef ecosystem protection in two coastal areas of New Caledonia with different institutional, cultural, environmental and socio-economic contexts. We compute individual willingness-to-pay estimates, and derive individual non-use value estimates using our interpretation. We find that, at a minimum, estimates of non-use values may comprise between 25 and 40% of the mean willingness to pay for ecosystem preservation, which is less than has been found in most studies.
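For readers unfamiliar with how willingness to pay is recovered from a choice experiment, a standard linear-in-parameters random-utility formulation (an assumed illustration; the abstract does not state the paper's own econometric specification) yields marginal willingness to pay for an attribute q as the ratio of its coefficient to the cost coefficient:

U_{ij} = \beta_{q}\, q_{ij} + \beta_{c}\, c_{ij} + \varepsilon_{ij},
\qquad
\mathrm{WTP}_{q} = -\frac{\beta_{q}}{\beta_{c}}

Under the paper's interpretation, the non-use component is the part of this willingness to pay that an individual attaches to protection beyond their expected lifetime, which the authors estimate at roughly 25-40% of mean willingness to pay.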
Abstract:
Halevi and Krawczyk proposed a message randomization algorithm called RMX as a front-end tool to hash-then-sign digital signature schemes such as DSS and RSA, in order to free them from reliance on the collision resistance of the hash function. They showed that to forge an RMX-hash-then-sign signature, one has to solve a cryptanalytical task related to finding second preimages for the hash function. In this article, we show how Dean's method of finding expandable messages, originally devised for finding second preimages in the Merkle-Damgård hash function, can be used to existentially forge a signature scheme based on a t-bit RMX hash that uses a Davies-Meyer compression function (e.g., MD4, MD5, the SHA family), using 2^(t/2) chosen messages plus 2^(t/2+1) off-line operations of the compression function and a similar amount of memory. This forgery attack also works on signature schemes that use Davies-Meyer schemes and a variant of RMX published by NIST in its Draft Special Publication (SP) 800-106. We discuss some important applications of our attack.
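To put the stated cost in perspective, an illustrative calculation (not a figure from the article) for a 160-bit hash such as SHA-1, i.e. t = 160, gives

2^{t/2} = 2^{80} \text{ chosen messages and a comparable number of off-line compression-function calls,}

which is far below the roughly 2^{160} effort expected of a generic second-preimage search against an ideal 160-bit hash.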
Abstract:
Heavy metals that build up on urban impervious surfaces such as roads are transported to urban water resources through stormwater runoff. It is therefore essential to understand the predominant pathways by which heavy metals build up on roads in order to develop suitable pollution mitigation strategies to protect the receiving water environment. The study presented in this paper investigated the sources and transport pathways of manganese, lead, copper, zinc and chromium, heavy metals commonly present in urban road build-up. It was found that manganese and lead are contributed to road build-up primarily through direct deposition caused by the re-suspension of roadside soil by wind turbulence, while traffic is the predominant source of copper, zinc and chromium to the atmosphere and road build-up. Atmospheric deposition is also the major transport pathway for copper and zinc, whereas for chromium direct deposition from traffic sources is the predominant pathway.
Abstract:
This thesis investigated biopsychological factors involved in successfully resisting overconsumption in an environment promoting obesity, and differences between individuals who were and were not able to resist overconsumption. Results showed that self-control was a key factor in successful resistance, whereas sensitivity to food reward was associated with susceptibility to overconsumption. Reduced self-control may be a consequence as well as a cause of obesity, and may not recover following weight loss. Self-control was not enhanced through an exercise programme that aimed to improve brain fitness via improved cardiovascular fitness.
Abstract:
Although local food consumption is growing in importance, there remains a lack of research addressing local food consumption preferences in less-developed countries. This paper aims to examine the drivers of local food purchase intentions among Chilean consumers. A model of local food behavioral intention was developed from consumer behavior theory. The model was tested using structural equation modeling with data from Chilean shoppers in Santiago (n = 283). The analysis revealed that Chilean consumers are willing to purchase local food based on their positive attitude towards buying local food and their feelings of connectedness with the environment, but not because of a desire to support local businesses. These findings have implications for retailers, marketers and food producers.
Abstract:
1. The vasodilator effects of adenosine receptor agonists, isoprenaline and histamine were examined in perfused heart preparations from young (4–6 weeks) and mature (12–20 weeks) rats. 2. Adenosine induced a biphasic concentration-dependent decrease in KCl (35 mM) raised coronary perfusion pressure in hearts from young and mature rats, suggesting the presence of both high- and low-affinity sites for adenosine receptors in the two age groups tested. In heart preparations from mature rats, vasodilator responses to adenosine were significantly reduced compared with responses observed in young rats. 3. Responses to 5′-N-ethylcarboxamidoadenosine (NECA) and 2-p-(2-carboxyethyl)phenethylamino-5′-N-ethylcarboxamidoadenosine hydrochloride (CGS-21680) were reduced in preparations from mature rats, whereas the vasodilator actions of N6-cyclopentyladenosine (CPA) and N6-2-(4-aminophenyl)ethyladenosine (APNEA) did not change with age. 4. The results presented in this study suggest that several adenosine receptor subtypes mediate vasodilator responses in the coronary circulation of the rat and that a reduction in response to adenosine with age may be due to changes in the high-affinity receptor site.
Abstract:
This paper explores a gap in serious game design research: the ambiguity surrounding the process of aligning the instructional objectives of serious games with their core-gameplay, i.e. the moment-to-moment activity that is the core of player interaction. A core-gameplay focused design framework is proposed that can work alongside existing, more broadly focused serious game design frameworks. The framework utilises an inquiry-based approach that allows the serious game designer to use key questions as a means of clearly aligning instructional objectives with the core-gameplay. The use of this design framework is considered in the context of a small section of gameplay from an educational game currently in development. This demonstration of the framework shows how instructional objectives can be embedded into a serious game's core-gameplay.
Abstract:
An estimated A$75,000 is lost by Australians every day to online fraud, according to the Australian Competition and Consumer Commission (ACCC). Given that this figure is based on reported crime, the real figure is likely to be much higher. It is well known that fraud, particularly online fraud, has a very low reporting rate. Nor does this figure begin to encompass the non-financial costs to victims. The real cost is likely to be much, much higher. There are many challenges to policing this type of crime, and victims who send money to overseas jurisdictions make it even harder, as does the likelihood of offenders creating false identities or simply stealing legitimate ones. But despite these challenges, police have started to do something to prevent the impact and losses of online fraud. By accessing financial intelligence, police are able to identify individuals who are sending money to countries known to be high-risk for fraud. They then notify these people of their suspicion that they may be involved in fraud. In many cases the people do not even know that they may be victims of, or involved in, online fraud.
Abstract:
Objective To synthesise recent research on the use of machine learning approaches to mining textual injury surveillance data. Design Systematic review. Data sources The electronic databases searched included PubMed, CINAHL, Medline, Google Scholar and ProQuest. The bibliography of all relevant articles was examined and associated articles were identified using a snowballing technique. Selection criteria For inclusion, articles were required to meet the following criteria: (a) used a health-related database, (b) focused on injury-related cases, and (c) used machine learning approaches to analyse textual data. Methods The papers identified through the search were screened, resulting in 16 papers selected for review. Articles were reviewed to describe the databases and methodology used, the strengths and limitations of different techniques, and the quality assurance approaches used. Due to heterogeneity between studies, meta-analysis was not performed. Results Occupational injuries were the focus of half of the machine learning studies, and the most common methods described were Bayesian probability or Bayesian network based methods used either to predict injury categories or to extract common injury scenarios. Models were evaluated through comparison with gold-standard data, content-expert evaluation, or statistical measures of quality. Machine learning was found to provide high precision and accuracy when predicting a small number of categories, and was valuable for visualisation of injury patterns and prediction of future outcomes. However, difficulties related to generalisability, source data quality, complexity of models and integration of content and technical knowledge were discussed. Conclusions The use of narrative text for injury surveillance has grown in popularity, complexity and quality over recent years. With advances in data mining techniques, increased capacity for analysis of large databases, the involvement of computer scientists in the injury prevention field, and more comprehensive use and description of quality assurance methods in text mining approaches, it is likely that we will see continued growth and advancement in knowledge of text mining in the injury field.
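As an illustration of the Bayesian text-classification approach common to the reviewed studies, the following minimal sketch classifies free-text injury narratives into broad categories. The narratives, labels and pipeline choices are invented for the example and do not come from any of the reviewed datasets.

# Minimal sketch of Bayesian classification of injury narratives,
# in the spirit of the reviewed studies; all data here is invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

narratives = [
    "worker slipped on wet floor and fractured wrist",
    "fell from ladder while repairing roof",
    "burned hand on hot press during shift",
    "caught finger in conveyor belt guard",
]
categories = ["fall", "fall", "burn", "machinery"]

# Bag-of-words/TF-IDF features feeding a multinomial naive Bayes classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), MultinomialNB())
model.fit(narratives, categories)

# Most likely predicts 'fall', given the overlapping tokens with the training texts.
print(model.predict(["slipped on oily floor near loading dock"]))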
Abstract:
The research field of urban computing – defined as “the integration of computing, sensing, and actuation technologies into everyday urban settings and lifestyles” – considers the design and use of ubiquitous computing technology in public and shared urban environments. Its impact on cities, buildings, and spaces evokes innumerable kinds of change. Embedded into our everyday lived environments, urban computing technologies have the potential to alter the meaning of physical space, and affect the activities performed in those spaces. This paper starts a multi-themed discussion of various aspects that make up the, at times, messy and certainly transdisciplinary field of urban computing and urban informatics.
Abstract:
This project was a comprehensive study of drink driving in two Chinese cities. It examined general motor vehicle drivers' and drunk driving offenders' knowledge of, and practices relating to, drinking and driving, and their interaction with alcohol misuse problems. In addition, traffic police officers' perceptions of drink driving and their legal enforcement practices were studied. The differences between the two cities (Guangzhou and Yinchuan) were discussed, and the approaches of China and Australia to drink driving legislation, legal enforcement and policy were also compared.
Abstract:
Structural Health Monitoring (SHM) schemes are useful for proper management of the performance of structures and for preventing catastrophic failures. Vibration-based SHM schemes have gained popularity during the past two decades, resulting in significant research. It is hence inevitable that future SHM schemes will include robust and automated vibration-based damage assessment techniques (VBDAT) to detect, localize and quantify damage. In this context, the Damage Index (DI) method, which is classified as a non-model or output-based VBDAT, has the ability to automate the damage assessment process without using a computer or numerical model alongside actual measurements. Although damage assessment using DI methods has achieved reasonable success for structures made of homogeneous materials such as steel, the same level of success has not been reported for Reinforced Concrete (RC) structures. The complexity of flexural cracks is claimed to be the main reason hindering the applicability of existing DI methods to RC structures. Past research also indicates that use of a constant baseline throughout the damage assessment process undermines the potential of the Modal Strain Energy based Damage Index (MSEDI). To address this situation, this paper presents a novel method developed as part of a comprehensive research project carried out at Queensland University of Technology, Brisbane, Australia. This novel process, referred to as the baseline updating method, continuously updates the baseline and systematically tracks both crack formation and propagation, with the ability to automate the damage assessment process using output-only data. The proposed method is illustrated through examples and the results demonstrate the capability of the method to achieve the desired outcomes.
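A simplified numerical sketch of a modal-strain-energy damage index with baseline updating is given below. The curvature-based index is a generic Stubbs-type formulation and the mode shapes are synthetic, so this illustrates the baseline-updating idea rather than the authors' exact algorithm.

# Simplified sketch of a modal-strain-energy damage index (MSEDI) with
# baseline updating: after each assessment step the current state becomes
# the new baseline, so later steps track crack propagation rather than
# re-detecting old damage. Mode shapes here are synthetic.
import numpy as np

def element_strain_energy(mode_shape, dx=1.0):
    # Per-element modal strain energy proxy: squared curvature from a
    # central second difference of the mode shape.
    curvature = np.gradient(np.gradient(mode_shape, dx), dx)
    return curvature**2

def damage_index(baseline_mode, current_mode):
    # Stubbs-type index: ratio of normalised element strain energy in the
    # current state to that in the baseline state.
    eb = element_strain_energy(baseline_mode)
    ec = element_strain_energy(current_mode)
    num = (ec + ec.sum()) / ec.sum()
    den = (eb + eb.sum()) / eb.sum()
    return num / den

# Synthetic first mode of a simply supported beam, then two damage steps
# (a localised stiffness loss near x = 1.0 that grows between steps).
x = np.linspace(0, np.pi, 21)
baseline = np.sin(x)
steps = [np.sin(x) * (1 - 0.05 * np.exp(-((x - 1.0) ** 2) / 0.02)),   # crack forms
         np.sin(x) * (1 - 0.10 * np.exp(-((x - 1.0) ** 2) / 0.05))]   # crack grows

for k, measured in enumerate(steps, start=1):
    di = damage_index(baseline, measured)
    print(f"step {k}: suspected damage near element {int(np.argmax(di))}")
    baseline = measured  # baseline updating: carry the current state forward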