Abstract:
Because order dependencies between process tasks can become complex, it is easy to make mistakes in process model design, especially behavioral ones such as deadlocks. Notions such as soundness formalize behavioral errors, and tools exist that can identify such errors. However, these tools do not provide assistance with the correction of the process models. Error correction can be very challenging, as the intentions of the process modeler are not known and there may be many ways in which an error can be corrected. We present a novel technique for automatic error correction in process models based on simulated annealing. Via this technique, a number of process model alternatives are identified that resolve one or more errors in the original model. The technique is implemented and validated on a sample of industrial process models. The tests show that at least one sound solution can be found for each input model and that the response times are short.
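The search loop behind such a technique can be illustrated generically. The following is a minimal sketch of simulated annealing, not the authors' implementation: the `neighbour` and `errors` callbacks stand in for the paper's model-mutation operators and soundness checker, and the toy usage replaces a process model with a single integer.

```python
import math
import random

def simulated_annealing(initial, neighbour, errors, t0=1.0, cooling=0.95,
                        steps=500, seed=42):
    """Search for a low-error model by accepting occasional uphill moves."""
    rng = random.Random(seed)
    current = best = initial
    t = t0
    for _ in range(steps):
        candidate = neighbour(current, rng)
        delta = errors(candidate) - errors(current)
        # Always accept improvements; accept worse candidates with
        # probability exp(-delta / t), which shrinks as t cools.
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            current = candidate
        if errors(current) < errors(best):
            best = current
        t *= cooling  # geometric cooling schedule
    return best

# Toy usage: "models" are integers, the "error count" is the distance from 7,
# and the neighbourhood move nudges the model by one step.
repaired = simulated_annealing(0,
                               lambda m, rng: m + rng.choice([-1, 1]),
                               lambda m: abs(m - 7))
```

In the paper's setting the error count would come from a soundness checker, and keeping several distinct low-error states (rather than one `best`) would yield the multiple correction alternatives the abstract describes.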
Abstract:
Skid resistance is a condition parameter characterising the contribution that a road makes to the friction between a road surface and a vehicle tyre. Studies of traffic crash histories around the world have consistently found that a disproportionate number of crashes occur where the road surface has a low level of surface friction and/or surface texture, particularly when the road surface is wet. Various studies published over many years have tried to quantify the influence of skid resistance on accident occurrence and to characterise a correlation between skid resistance and accident frequency. Most of these studies used simple statistical correlation methods in analysing skid resistance and crash data.

Preliminary findings of a systematic and extensive literature search conclude that there is rarely a single causation factor in a crash. Findings from research projects do affirm various levels of correlation between skid resistance and accident occurrence. Studies indicate that the level of skid resistance at critical places such as intersections, curves, roundabouts, ramps and approaches to pedestrian crossings needs to be well maintained.

Management of risk is an integral aspect of the Queensland Department of Main Roads (QDMR) strategy for managing its infrastructure assets. The risk-based approach has been used in many areas of infrastructure engineering. However, very limited information is reported on using a risk-based approach to mitigate crash rates related to the road surface. Low skid resistance and surface texture may increase the risk of traffic crashes.

The objectives of this paper are to explore current issues of skid resistance in relation to crashes, to provide a framework for a probability-based approach to be adopted by QDMR in assessing the relationship between crashes and pavement properties, and to explain why the probability-based approach is a suitable tool for QDMR to reduce accident rates due to skid resistance.
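A probability-based link between a pavement property and crash occurrence is often expressed as a logistic curve. The sketch below is purely illustrative: the coefficients are hypothetical stand-ins, not values fitted to QDMR data.

```python
import math

# Hypothetical coefficients for illustration only; a real model would be
# fitted to crash and skid-resistance survey data.
INTERCEPT = 2.0
SLOPE = -8.0  # negative: crash risk falls as skid resistance rises

def crash_probability(skid_resistance):
    """Logistic crash-risk curve: P(crash) = 1 / (1 + exp(-(b0 + b1 * SR)))."""
    z = INTERCEPT + SLOPE * skid_resistance
    return 1.0 / (1.0 + math.exp(-z))
```

Under such a model, a section with low measured skid resistance receives a higher estimated crash probability, which is the kind of quantity a risk-based maintenance prioritisation can rank on.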
Abstract:
Automobiles have deeply impacted the way in which we travel, but they have also contributed to many deaths and injuries due to crashes. A number of reasons for these crashes have been pointed out by researchers. Inexperience has been identified as a contributing factor to road crashes. A driver's driving abilities also play a vital role in judging the road environment and reacting in time to avoid any possible collision. Therefore, a driver's perceptual and motor skills remain the key factors impacting road safety. Our failure to understand what is really important for learners, in terms of competent driving, is one of the many challenges in building better training programs. Driver training is one of the interventions aimed at decreasing the number of crashes that involve young drivers. Currently, there is a need to develop a comprehensive driver evaluation system that benefits from the advances in Driver Assistance Systems. A multidisciplinary approach is necessary to explain how driving abilities evolve with on-road driving experience. To our knowledge, driver assistance systems have never been comprehensively used in a driver training context to assess the safety aspect of driving. The aim and novelty of this thesis is to develop and evaluate an Intelligent Driver Training System (IDTS) as an automated assessment tool that will help drivers and their trainers to comprehensively view complex driving manoeuvres and potentially provide effective feedback by post-processing the data recorded during driving. This system is designed to help driver trainers accurately evaluate driver performance and has the potential to provide valuable feedback to the drivers. Since driving is dependent on fuzzy inputs from the driver (i.e.
approximate distance calculation from the other vehicles, approximate assumption of the other vehicle's speed), it is necessary that the evaluation system is based on criteria and rules that handle the uncertain and fuzzy characteristics of the driving tasks. Therefore, the proposed IDTS utilizes fuzzy set theory for the assessment of driver performance. The proposed research program focuses on integrating the multi-sensory information acquired from the vehicle, driver and environment to assess driving competencies. After information acquisition, the current research focuses on automated segmentation of the selected manoeuvres from the driving scenario. This leads to the creation of a model that determines a “competency” criterion through the driving performance protocol used by driver trainers (i.e. expert knowledge) to assess drivers. This is achieved by comprehensively evaluating and assessing the data stream acquired from multiple in-vehicle sensors using fuzzy rules, and classifying the driving manoeuvres (i.e. overtake, lane change, T-crossing and turn) as low or high competency. The fuzzy rules use parameters such as following distance, gaze depth and scan area, distance with respect to lanes, and excessive acceleration or braking during the manoeuvres to assess competency. The rules that identify driving competency were initially designed with the help of experts' knowledge (i.e. driver trainers). In order to fine-tune these rules and the parameters that define them, a driving experiment was conducted to identify the empirical differences between novice and experienced drivers. The results from the driving experiment indicated that significant differences existed between novice and experienced drivers in terms of their gaze pattern and duration, speed, stop time at the T-crossing, lane keeping and the time spent in lanes while performing the selected manoeuvres.
These differences were used to refine the fuzzy membership functions and rules that govern the assessments of the driving tasks. Next, this research focused on providing an integrated visual assessment interface to both driver trainers and their trainees. By providing a rich set of interactive graphical interfaces displaying information about the driving tasks, the IDTS visualisation module has the potential to give empirical feedback to its users. Lastly, the validation of the IDTS system's assessment was conducted by comparing IDTS objective assessments for the driving experiment with the subjective assessments of the driver trainers for particular manoeuvres. Results show that IDTS was not only able to match the subjective assessments made by driver trainers during the driving experiment but also identified some additional driving manoeuvres performed with low competency that were not identified by the driver trainers, owing to the trainers' increased mental workload when assessing the multiple variables that constitute driving. The validation of IDTS emphasized the need for an automated assessment tool that can segment the manoeuvres from the driving scenario, further investigate the variables within each manoeuvre to determine its competency, and provide integrated visualisation regarding the manoeuvre to its users (i.e. trainers and trainees). Through analysis and validation it was shown that IDTS is a useful assistance tool for driver trainers to empirically assess, and potentially provide feedback on, the manoeuvres undertaken by drivers.
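The fuzzy machinery described above can be made concrete with a minimal sketch. The membership shapes, thresholds and the single `overtake_competency` rule below are hypothetical stand-ins for the thesis's trainer-derived rules; they only illustrate how a fuzzy AND over manoeuvre parameters yields a graded competency score.

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def ramp_down(x, lo, hi):
    """Membership that is full below lo and falls linearly to zero at hi."""
    if x <= lo:
        return 1.0
    if x >= hi:
        return 0.0
    return (hi - x) / (hi - lo)

def overtake_competency(following_distance_m, braking_g):
    """Fuzzy AND (min) of two illustrative rule antecedents:
    an adequate following distance and non-excessive braking."""
    adequate_gap = triangular(following_distance_m, 5.0, 30.0, 60.0)
    gentle_braking = ramp_down(abs(braking_g), 0.2, 0.6)
    return min(adequate_gap, gentle_braking)
```

A real rule base would combine many such antecedents (gaze depth, lane position, speed) per manoeuvre, with the membership parameters tuned from the novice/experienced experiment data, as the abstract describes.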
Abstract:
This is my penultimate report as National President of the Australian Institute of Traffic Planning and Management, Inc. As an academic, I would like to take this opportunity to raise some issues and challenges I see in transport professional education in Australia. My general view is that the transport profession has until recently been less conspicuous to others as an identifiable discipline. This is both a blessing and somewhat of a curse. People mostly enter, or sometimes fall into, the transport profession having taken a degree in civil engineering, other engineering, urban and regional planning, economics, industrial psychology or business, followed by the less obvious disciplines. This ordering is probably roughly proportional to the spread of members' background qualifications in our ranks too. However, once a graduate destined to become a transport professional has spent about five years or so out of the academic estuary, they tend to specialise in an area that cannot necessarily be easily correlated to the well-known courses I have rattled off above. I can say from experience that it is not out of the question to see SIDRA models having been prepared by a transport professional who did not take traffic engineering as part of a civil engineering degree. So, looking into the future, I see a couple of key challenges for the transport profession, which happens to be represented by a number of bodies, with our AITPM perhaps being the peak body.
Abstract:
The perceived benefits of Wellness Education in university environments are substantiated by a number of studies in relation to the place, impact and purpose of Wellness curricula. Many authors recommend that Wellness curriculum design must include personal experiences, reflective practice and active self-managed learning approaches in order to legitimise the adoption of Wellness as a personal lifestyle approach. Wellness Education provides opportunities to engage in learning self-regulation skills both within and beyond the context of the Wellness construct. Learner success is optimised by creating authentic opportunities to develop and practise self-regulation strategies that facilitate making meaning of life's experiences. Such opportunities include the provision of options for self-determined outcomes and are scaffolded according to learner needs; thus, configuring a learner-centred curriculum in Wellness Education would potentially benefit from overlaying principles from the domains of Self Determination Theory, Self Regulated Learning and Transformative Education Theory to highlight authentic, transformative learning as a lifelong approach to Wellness.
Abstract:
In a world of constant and rapid change, there are greater demands placed on learners not only to gain content knowledge but also to develop learning skills and to adopt new strategies that will enable them to produce better and faster learning outcomes. Especially in internationally advancing nations like Kuwait, this will be a major challenge of the future. This literature review examines theoretical frameworks that enhance Kuwaiti teachers' knowledge and skills to adopt culturally relevant reform practices across a number of disciplines and provides guidance for the exploration and use of newer pedagogical tools like graphic organisers. It analyses the effects of graphic organisers on higher-order learning and evaluates how they can effect professional development and pedagogical change in Kuwait.
Abstract:
Complex surveillance problems are common in biosecurity: prioritizing detection among multiple invasive species, specifying risk over a heterogeneous landscape, combining multiple sources of surveillance data, designing for a specified power to detect, managing resources, and accounting for collateral effects on the environment. Moreover, when designing for multiple target species, inherent biological differences among species result in different ecological models underpinning the individual surveillance systems for each. Species are likely to have different habitat requirements, different introduction mechanisms and locations, require different methods of detection, have different levels of detectability, and vary in rates of movement and spread. Often there is a further challenge of a lack of knowledge, literature, or data for any number of the above problems. Even so, governments and industry need to proceed with surveillance programs that aim to detect incursions in order to meet environmental, social and political requirements. We present an approach taken to meet these challenges in one comprehensive and statistically powerful surveillance design for non-indigenous terrestrial vertebrates on Barrow Island, a high-conservation nature reserve off the Western Australian coast. Here, the possibility of incursions is increased due to construction and expanding industry on the island. The design, which includes mammals, amphibians and reptiles, provides a complete surveillance program for most potential terrestrial vertebrate invaders. Individual surveillance systems were developed for various potential invaders, and then integrated into an overall surveillance system which meets the above challenges using a statistical model and expert elicitation.
We discuss the ecological basis for the design, the flexibility of the surveillance scheme, how it meets the above challenges, design limitations, and how it can be updated as data are collected as a basis for adaptive management.
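One standard building block for integrating individual surveillance systems into an overall design is the combined probability of detection. The sketch below assumes independent components, a simplification relative to the statistical model with expert elicitation described above, but it shows how per-component detection probabilities roll up into a system-level sensitivity.

```python
def system_sensitivity(component_probs):
    """Probability that at least one of several independent surveillance
    components detects an incursion: 1 minus the product of the
    per-component miss probabilities."""
    miss = 1.0
    for p in component_probs:
        miss *= (1.0 - p)
    return 1.0 - miss
```

For example, three independent detection methods with probabilities 0.2, 0.3 and 0.4 give a combined sensitivity of 1 - (0.8 × 0.7 × 0.6) = 0.664, which is how a design can achieve a specified power to detect even when no single method is strong.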
Abstract:
Osteoclasts are specialised bone-resorbing cells. This particular ability makes osteoclasts irreplaceable for the continual physiological process of bone remodelling as well as for the repair process during bone healing. Whereas the effects of systemic diseases on osteoclasts have been described by many authors, the spatial and temporal distribution of osteoclasts during bone healing has so far remained unclear. In the present study, healing of a tibial osteotomy under standardised external fixation was examined after 2, 3, 6 and 9 weeks (n = 8) in sheep. The number of osteoclasts was counted, the area of mineralised bone tissue was measured histomorphometrically, and the density of osteoclasts per square millimetre of mineralised tissue was calculated. The osteoclastic density in the endosteal region increased, whereas the density in the periosteal region remained relatively constant. The density of osteoclasts within the cortical bone increased slightly over the first 6 weeks; however, there was a more rapid increase between the sixth and ninth weeks. The findings of this study imply that remodelling and resorption already take place in the very early phase of bone healing. Remodelling is most frequent in the periosteal callus, emphasising its role as the main stabiliser. The endosteal space undergoes resorption in order to recanalise the medullary cavity, a process that also starts at a low level in the very early phase of healing and increases significantly during healing. The cortical bone adapts in its outward appearance to the surrounding callus structure. This paradoxical loosening is caused by the continually increasing number and density of osteoclasts in the cortical bone ends. This study clearly emphasises the osteoclastic role, especially during early bone healing.
These cells do not simply resorb bone but participate in a fine adjusted system with the bone-producing osteoblasts in order to maintain and improve the structural strength of bone tissue.
Abstract:
In order to achieve meaningful reductions in individual ecological footprints, individuals must dramatically alter their day-to-day behaviours. Effective interventions will need to be evidence-based, and there is a need for the rapid transfer of information from the point of research into policy and practice. A number of health disciplines, including psychology and public health, share a common mission to promote health and well-being, and it is becoming clear that the most practical pathway to achieving this mission is through interdisciplinary collaboration. This paper argues that an interdisciplinary collaborative approach will facilitate research that results in the rapid transfer of findings into policy and practice. The application of this approach is described in relation to the Green Living project, which explored the psycho-social predictors of environmentally friendly behaviour. Following a qualitative pilot study, and in consultation with an expert panel comprising academics, industry professionals and government representatives, a self-administered mail survey was distributed to a random sample of 3000 residents of Brisbane and Moreton Bay (Queensland, Australia). The Green Living survey explored specific beliefs, including attitudes, norms, perceived control, intention and behaviour, as well as a number of other constructs such as environmental concern and altruism. This research has two beneficial outcomes. First, it will inform a practical model for predicting sustainable living behaviours, and a number of local councils have already expressed an interest in making use of the results as part of their ongoing community engagement programs. Second, it provides an example of how a collaborative interdisciplinary project can provide a more comprehensive approach to research than can be accomplished by a single-discipline project.
Abstract:
The Saffman-Taylor finger problem is to predict the shape and, in particular, the width of a finger of fluid travelling in a Hele-Shaw cell filled with a different, more viscous fluid. In experiments the width depends on the speed of propagation of the finger, tending to half the total cell width as the speed increases. To predict this result mathematically, nonlinear effects on the fluid interface must be considered; usually surface tension is included for this purpose. This makes the mathematical problem sufficiently difficult that asymptotic or numerical methods must be used. In this paper we adapt numerical methods used to solve the Saffman-Taylor finger problem with surface tension to instead include the effect of kinetic undercooling, a regularisation effect important in Stefan melting-freezing problems, for which Hele-Shaw flow serves as a leading-order approximation when the specific heat of a substance is much smaller than its latent heat. We find the existence of a solution branch where the finger width tends to zero as the propagation speed increases, disagreeing with some aspects of the asymptotic analysis of the same problem. We also find a second solution branch, supporting the idea of a countably infinite number of branches, as with the surface tension problem.
Abstract:
This paper develops a composite participation index (PI) to identify patterns of transport disadvantage in space and time. It is operationalised using 157 weekly activity-travel diaries collected from three case study areas in rural Northern Ireland. A review of activity space and travel behaviour research found that six dimensional indicators of activity spaces were typically used to identify transport disadvantage: the number of unique locations visited, distance travelled, area of activity spaces, frequency of activity participation, types of activity participated in, and duration of participation. A combined measure using six individual indices was developed based on these six dimensional indicators, taking into account the relativity of the measures for weekdays, weekends, and the week as a whole. Factor analyses were conducted to derive the weights of these indices that form the PI measure. Multivariate analysis using general linear models of the different indicators/indices identified new patterns of transport disadvantage. The research found that indicator-based measures and index-based measures complement each other; that interactions between different factors generated new patterns of transport disadvantage; and that these patterns vary in space and time. The analysis also indicates that the transport needs of different disadvantaged groups vary.
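The composite index construction can be sketched as a weighted combination of the six activity-space indices. The weights in the usage example are arbitrary illustrations; in the paper they are derived from factor analysis, and each index would first be normalised for weekday/weekend relativity.

```python
def participation_index(indices, weights):
    """Weighted composite of the individual activity-space indices.
    In the paper the weights come from factor analysis; here they are
    caller-supplied and purely illustrative."""
    if len(indices) != len(weights):
        raise ValueError("one weight per index is required")
    return sum(i * w for i, w in zip(indices, weights)) / sum(weights)

# Hypothetical person with three (of six) normalised indices and weights.
pi_score = participation_index([0.8, 0.5, 0.2], [0.5, 0.3, 0.2])
```

A low PI score would then flag a potentially transport-disadvantaged individual, to be examined alongside the individual indicator-based measures.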
Abstract:
As online social spaces continue to grow in importance, the complex relationship between users and the private providers of the platforms continues to raise increasingly difficult questions about legitimacy in online governance. This article examines two issues that go to the core of legitimate governance in online communities: how are rules enforced and punishments imposed, and how should the law support legitimate governance and protect participants from the illegitimate exercise of power? Because the rules of online communities are generally ultimately backed by contractual terms of service, the imposition of punishment for the breach of internal rules exists in a difficult conceptual gap between criminal law and the predominantly compensatory remedies of contractual doctrine. When theorists have addressed the need for the rules of virtual communities to be enforced, a dichotomy has generally emerged between the appropriate role of criminal law for 'real' crimes and the private, internal resolution of 'virtual' or 'fantasy' crimes. In this structure, the punitive effect of internal measures is downplayed and the harm that can be caused to participants by internal sanctions is systemically undervalued.
Abstract:
A significant proportion of the cost of software development is due to software testing and maintenance. This is in part the result of the inevitable imperfections due to human error, a lack of quality during the design and coding of software, and the increasing need to reduce faults to improve customer satisfaction in a competitive marketplace. Given the cost and importance of removing errors, improvements in fault detection and removal can be of significant benefit. The earlier in the development process faults can be found, the less it costs to correct them and the less likely other faults are to develop. This research aims to make the testing process more efficient and effective by identifying those software modules most likely to contain faults, allowing testing efforts to be carefully targeted. This is done with machine learning algorithms which use examples of fault-prone and not fault-prone modules to develop predictive models of quality. In order to learn the numerical mapping between a module and its classification, a module is represented in terms of software metrics. A difficulty in this sort of problem is sourcing software engineering data of adequate quality. In this work, data is obtained from two sources: the NASA Metrics Data Program and the open source Eclipse project. Feature selection is applied before learning, and a number of different feature selection methods are compared to find which work best. Two machine learning algorithms are applied to the data - Naive Bayes and the Support Vector Machine - and predictive results are compared to those of previous efforts and found to be superior on selected data sets and comparable on others. In addition, a new classification method is proposed, Rank Sum, in which a ranking abstraction is laid over bin densities for each class, and a classification is determined based on the sum of ranks over features.
A novel extension of this method is also described, based on an observed polarising of points by class when rank sum is applied to training data to convert it into a 2D rank sum space. SVM is applied to this transformed data to produce models whose parameters can be set according to trade-off curves to obtain a particular performance trade-off.
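One reading of the "sum of ranks over features" idea can be sketched as follows. This is an illustration, not the thesis's actual code: the per-feature density functions stand in for its per-class bin densities, and the tie-handling and ranking details are assumptions.

```python
def rank_sum_classify(sample, class_densities):
    """For each feature, rank the classes by their estimated density
    (bin height) at the sample's value; the class with the largest total
    rank across all features wins. class_densities maps a class label to
    one density function per feature."""
    classes = list(class_densities)
    totals = {c: 0 for c in classes}
    for j, value in enumerate(sample):
        # Ascending sort: the densest class receives the highest rank.
        ranked = sorted(classes, key=lambda c: class_densities[c][j](value))
        for rank, c in enumerate(ranked, start=1):
            totals[c] += rank
    return max(classes, key=lambda c: totals[c])
```

In the fault-prediction setting, each feature would be a software metric, and the densities would come from histograms of fault-prone versus not fault-prone training modules; the pair of per-class rank sums is also what would place a point in the 2D rank sum space mentioned above.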
Abstract:
In today's electronic world, vast amounts of knowledge are stored within many datasets and databases. Often the default format of this data means that the knowledge within is not immediately accessible, but rather has to be mined and extracted. This requires automated tools, and they need to be effective and efficient. Association rule mining is one approach to obtaining the knowledge stored within datasets / databases, which includes frequent patterns and association rules between the items / attributes of a dataset with varying levels of strength. However, this is also association rule mining's downside: the number of rules that can be found is usually very large. In order to effectively use the association rules (and the knowledge within), the number of rules needs to be kept manageable, so it is necessary to have a method to reduce the number of association rules. However, we do not want to lose knowledge through this process. Thus the idea of non-redundant association rule mining was born. A second issue with association rule mining is determining which rules are interesting. The standard approach has been to use support and confidence, but they have their limitations. Approaches which use information about the dataset's structure to measure association rules are limited, but could yield useful association rules if tapped. Finally, while it is important to be able to get interesting association rules from a dataset in a manageable size, it is equally important to be able to apply them in a practical way, where the knowledge they contain can be taken advantage of. Association rules show items / attributes that appear together frequently. Recommendation systems also look at patterns and items / attributes that occur together frequently in order to make a recommendation to a person. It should therefore be possible to bring the two together. In this thesis we look at these three issues and propose approaches to help.
For discovering non-redundant rules, we propose enhanced approaches to rule mining in multi-level datasets that allow hierarchically redundant association rules to be identified and removed without information loss. For discovering interesting association rules based on the dataset's structure, we propose three measures for use in multi-level datasets. Lastly, we propose and demonstrate an approach that allows association rules to be used practically and effectively in a recommender system, while at the same time improving the recommender system's performance. This becomes especially evident when looking at the user cold-start problem for a recommender system. In fact, our proposal helps to solve this serious problem facing recommender systems.
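The support and confidence measures referred to above are standard and easy to state. A minimal sketch over transactions represented as sets of items:

```python
def support(transactions, itemset):
    """Fraction of transactions that contain every item in itemset."""
    itemset = set(itemset)
    return sum(1 for t in transactions if itemset <= set(t)) / len(transactions)

def confidence(transactions, antecedent, consequent):
    """Confidence of the rule antecedent -> consequent:
    support(antecedent union consequent) / support(antecedent)."""
    base = support(transactions, antecedent)
    if base == 0:
        return 0.0
    return support(transactions, set(antecedent) | set(consequent)) / base
```

For example, with four baskets of which three contain bread and two contain both bread and milk, the rule bread -> milk has support 0.5 and confidence 2/3. The thesis's point is that such counts ignore the dataset's hierarchical structure, which motivates its additional structure-based interestingness measures.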
Abstract:
In recent debates about the regulation of technologies that deliver pornographic content, the greatest concerns have been about the increasing ease with which young people can access such material. Because of the ethical difficulties in researching this topic, little data has been available on the potential harm done to young people by exposure to pornography. This paper gathers a number of data sources that address this issue indirectly, including the results of our own survey of over 1000 consumers of pornography. Research shows that healthy sexual development includes natural curiosity about sexuality. Retrospective studies show that accidental exposure to real-life scenes of sexuality does not harm children. Our survey shows that age of first exposure to pornography does not correlate with negative attitudes towards women. Studies with non-explicit representations of sexuality show that young people who seek out sexualised representations tend to be those with a pre-existing interest in sexuality. These studies also suggest that current generations of children are no more sexualised than previous generations, that they are not innocent about sexuality, and that a key negative effect of this knowledge is the requirement for them to feign ignorance in order to satisfy adults' expectations of them. Research also suggests important differences between pre- and post-pubescent attitudes towards pornography, and that pornography is not addictive.