978 results for "privileges prevail over UCPR pleading rules"


Relevance: 20.00%

Abstract:

Background: Young motorists engaging in anti-social and often dangerous driving manoeuvres (often referred to as "hooning" in Australia) are an increasing road safety problem. While anecdotal evidence suggests that such behaviour is positively linked with crash involvement, researchers have yet to examine whether younger drivers who deliberately break road rules and drive in an erratic manner (usually with peers) are in fact over-represented in crash statistics. This paper outlines research that aimed to identify the characteristics of individuals most likely to engage in hooning behaviours, to examine the frequency of such driving behaviours, and to determine whether such activity is linked with self-reported crash involvement.

Methods: A total of 717 young drivers in Queensland voluntarily completed a questionnaire investigating their driving behaviour and crash history.

Results: Quantitative analysis of the data revealed that almost half the sample reported engaging in some form of "hooning" behaviour at least once in their lifetime, although only 4% indicated heavy participation in the behaviour (e.g., more than 50 times). Street racing was the most common activity reported by participants, followed by "drifting" and then "burnouts". Logistic regression analysis indicated that being younger and male was predictive of reporting such anti-social driving behaviours, and, importantly, a trend was identified between such behaviour and self-reported crash involvement.

Conclusions: This research provides preliminary evidence that younger male drivers are more likely to engage in dangerous driving behaviours, which may ultimately increase their overall risk of becoming involved in a crash. The paper further outlines the study findings in regard to current enforcement efforts to deter such driving activity and provides direction for future research in this area.

Research highlights: ► The self-reported driving behaviours of 717 younger Queensland drivers were examined to investigate the relationship between deliberately breaking road rules and self-reported crash involvement. ► Younger male drivers were most likely to engage in such aberrant driving behaviours, and a trend was identified between such behaviour and self-reported crash involvement.
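
A logistic regression with a single binary predictor corresponds to a log odds ratio, so the kind of association reported above can be sketched with a simple 2×2 odds-ratio calculation. The counts below are entirely invented for illustration and are not the study's data:

```python
import math

# Hypothetical 2x2 table (illustrative counts, NOT from the study):
# rows: reported hooning vs not; columns: self-reported crash vs none
a, b = 120, 230   # hooning: crash, no crash
c, d = 80, 287    # no hooning: crash, no crash

odds_ratio = (a * d) / (b * c)           # strength of the association
log_or = math.log(odds_ratio)
se = math.sqrt(1/a + 1/b + 1/c + 1/d)    # Woolf standard error of log OR
ci_low = math.exp(log_or - 1.96 * se)    # 95% confidence interval bounds
ci_high = math.exp(log_or + 1.96 * se)

print(f"OR = {odds_ratio:.2f}, 95% CI [{ci_low:.2f}, {ci_high:.2f}]")
```

An odds ratio above 1 with a confidence interval excluding 1 would correspond to the "trend" between hooning and crash involvement the abstract describes.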

Relevance: 20.00%

Abstract:

Confucius was, and still is, one of the most eminent Chinese philosophers. Such is the importance of Confucius's teachings that they have influenced all aspects of social life in Chinese societies. In the post-Enron, post-WorldCom and post-Global Financial Crisis era, there are rising doubts about the mantra of so-called conventional wisdom on law and economic order. Many recent publications have offered solutions to these problems, such as advocating more laws, rules or reforms of regulatory institutions to enhance the regulation of corporate governance. What Confucius advocated, by contrast, was a non-legal, social mode of regulation based on moral ideals that should be embedded in the minds of every person. Although this is an ancient concept originating in early societies, its relevance and merits can be seen in modern Chinese societies such as Hong Kong. In essence, Confucian principles of governance build a relational and paternalistic order based on moral ideals.

Relevance: 20.00%

Abstract:

Recommender systems are widely used online to help users find products, items, etc. that they may be interested in, based on what is known about each user from their profile. Often, however, user profiles may be short on information, making it difficult for a recommender system to make quality recommendations. This is known as the cold-start problem. Here we investigate using association rules as a source of information to expand a user profile and thus avoid this problem. Our experiments show that it is possible to use association rules to noticeably improve the performance of a recommender system in the cold-start situation. Furthermore, we show that this improvement can be achieved while using non-redundant rule sets. This demonstrates that non-redundant rules cause no loss of information and are just as informative as a set of association rules that contains redundancy.
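
The profile-expansion idea can be sketched as follows; the items, rules and confidence values are invented for illustration and are not from the paper's experiments:

```python
# Illustrative association rules (antecedent -> consequent, confidence);
# item names and numbers are hypothetical, not from the paper.
rules = [
    (frozenset({"camera"}), "tripod", 0.8),
    (frozenset({"camera", "tripod"}), "sd_card", 0.7),
    (frozenset({"laptop"}), "mouse", 0.9),
]

def expand_profile(profile, rules, min_conf=0.6):
    """Add consequents of rules whose antecedent the profile satisfies,
    iterating until no further rule fires (newly added items can in turn
    trigger other rules)."""
    expanded = set(profile)
    changed = True
    while changed:
        changed = False
        for antecedent, consequent, conf in rules:
            if conf >= min_conf and antecedent <= expanded and consequent not in expanded:
                expanded.add(consequent)
                changed = True
    return expanded

short_profile = {"camera"}   # cold-start: only one item is known
print(expand_profile(short_profile, rules))
```

The expanded profile gives the recommender more items to match against, which is how the rules mitigate the cold-start situation described above.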

Relevance: 20.00%

Abstract:

After a great deal of success with last year's "emo" adaptation of Hamlet, David Berthold begins La Boite Theatre Company's 2011 season, his second season at the helm, with an adaptation of Julius Caesar.

Relevance: 20.00%

Abstract:

The paper provides an assessment of the performance of commercial Real Time Kinematic (RTK) systems over longer than recommended inter-station distances. The experiments were set up to test and analyse solutions from the i-MAX, MAX and VRS systems operating over three triangular network cells with average inter-station distances of 69 km, 118 km and 166 km. The performance characteristics appraised included initialisation success rate, initialisation time, RTK position accuracy and availability, ambiguity resolution risk and RTK integrity risk, in order to provide a wider perspective on the performance of the tested systems.

The results showed that the performance of all network RTK solutions assessed was affected to a similar degree by the increase in inter-station distance. The MAX solution achieved the highest initialisation success rate, 96.6% on average, albeit with a longer initialisation time. The two VRS approaches achieved a lower initialisation success rate of 80% over the largest triangle. In terms of RTK positioning accuracy after successful initialisation, the results indicated good agreement between the actual error growth, in both horizontal and vertical components, and the accuracy specified by the manufacturers as RMS and parts-per-million (ppm) values.

Additionally, the VRS approaches performed better than MAX and i-MAX when tested on the standard triangular network with a mean inter-station distance of 69 km. However, as the inter-station distance increases, the network RTK software may fail to generate VRS corrections and fall back to operating in the nearest single-base RTK (or RAW) mode. The position uncertainty occasionally exceeded 2 meters, showing that the RTK rover software was using an incorrectly fixed ambiguity solution to estimate the rover position rather than automatically dropping back to an ambiguity-float solution. Results identified that the risk of incorrectly resolving ambiguities reached 18%, 20%, 13% and 25% for i-MAX, MAX, Leica VRS and Trimble VRS respectively when operating over the largest triangle. Additionally, the Coordinate Quality indicator values given by the Leica GX1230 GG rover receiver tended to be over-optimistic and did not function well in identifying incorrectly fixed integer ambiguity solutions.

In summary, this independent assessment has identified problems and failures that can occur in all of the systems tested, especially when they are pushed beyond their recommended limits. While such failures are expected, they offer useful insights into where users should be wary and how manufacturers might improve their products. The results also demonstrate that integrity monitoring of RTK solutions is necessary for precision applications and deserves serious attention from researchers and system providers.
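
Manufacturer accuracy specifications of the RMS-plus-ppm form mentioned above can be turned into an expected error budget for each baseline length. The 8 mm + 1 ppm figures below are a typical nominal specification chosen for illustration, not values quoted in the paper:

```python
def rtk_horizontal_uncertainty(rms_mm, ppm, baseline_km):
    """Expected 1-sigma RTK error: a fixed RMS term plus a part
    proportional to baseline length (1 ppm = 1 mm of error per km).
    This is the generic form of manufacturer specs; the specific
    numbers used below are illustrative."""
    return rms_mm + ppm * baseline_km

# A nominal 8 mm + 1 ppm spec evaluated at the three test-cell distances
for d_km in (69, 118, 166):
    print(f"{d_km} km -> {rtk_horizontal_uncertainty(8, 1.0, d_km)} mm")
```

The linear growth with distance is why performance degrades steadily as inter-station distances are pushed beyond the recommended limits.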

Relevance: 20.00%

Abstract:

Real-time kinematic (RTK) GPS techniques have been extensively developed for applications including surveying, structural monitoring and machine automation. Limitations of the existing RTK techniques that hinder their application for geodynamics purposes are twofold: (1) the achievable RTK accuracy is at the level of a few centimeters, and the uncertainty of the vertical component is 1.5–2 times worse than that of the horizontal components; and (2) the RTK position uncertainty grows in proportion to the base-to-rover distance. The key limiting factor behind these problems is the significant effect of residual tropospheric errors on the positioning solutions, especially on the highly correlated height component. This paper develops a geometry-specified troposphere decorrelation strategy to achieve subcentimeter kinematic positioning accuracy in all three components. The key is to set up a relative zenith tropospheric delay (RZTD) parameter to absorb the residual tropospheric effects and to solve the resulting model as an ill-posed problem using the regularization method. In order to compute a reasonable regularization parameter and obtain an optimal regularized solution, the covariance matrix of the positional parameters estimated without the RZTD parameter, which is characterized by the observation geometry, is used in place of the quadratic matrix of their "true" values. As a result, the regularization parameter is computed adaptively as the observation geometry varies. The experimental results show that the new method can efficiently alleviate the model's ill-conditioning and stabilize the solution from a single data epoch. Compared with the results from the conventional least squares method, the new method improves the long-range RTK solution precision from several centimeters to the subcentimeter level in all components; notably, the precision of the height component is even higher.

Several geoscience applications that require subcentimeter real-time solutions could benefit substantially from the proposed approach, such as real-time monitoring of earthquakes and large dams, high-precision GPS leveling and refinement of the vertical datum. In addition, the high-resolution RZTD solutions can contribute to effective recovery of tropospheric slant path delays for establishing 4-D troposphere tomography.
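
The regularized solution described above follows the standard Tikhonov form x = (AᵀA + αI)⁻¹Aᵀb. The paper computes α adaptively from the observation geometry; this minimal sketch instead uses a fixed α and an invented, nearly collinear design matrix simply to show the stabilising effect on an ill-conditioned problem:

```python
# Minimal Tikhonov-regularized least squares for a 2-parameter model:
# x = (A^T A + alpha*I)^(-1) A^T b, written out for the 2x2 case.

def regularized_lsq_2d(A, b, alpha):
    # Regularized normal matrix N = A^T A + alpha*I
    n00 = sum(r[0] * r[0] for r in A) + alpha
    n01 = sum(r[0] * r[1] for r in A)
    n11 = sum(r[1] * r[1] for r in A) + alpha
    # Right-hand side A^T b
    t0 = sum(r[0] * y for r, y in zip(A, b))
    t1 = sum(r[1] * y for r, y in zip(A, b))
    # Solve the 2x2 system by Cramer's rule
    det = n00 * n11 - n01 * n01
    return ((n11 * t0 - n01 * t1) / det, (n00 * t1 - n01 * t0) / det)

# Nearly collinear columns make the unregularized problem ill-conditioned;
# a small alpha stabilizes the solution near the true parameters (1, 1).
A = [[1.0, 1.0], [1.0, 1.001], [1.0, 0.999]]
b = [2.0, 2.001, 1.999]
x0, x1 = regularized_lsq_2d(A, b, alpha=1e-3)
print(x0, x1)
```

In the paper's setting the extra RZTD parameter plays the role of the nearly collinear column, which is why regularization, rather than plain least squares, is needed for a single-epoch solution.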

Relevance: 20.00%

Abstract:

This paper presents a framework for performing real-time recursive estimation of landmarks' visual appearance. Imaging data in its original high-dimensional space is probabilistically mapped to a compressed low-dimensional space through the definition of likelihood functions. The likelihoods are subsequently fused with prior information using a Bayesian update. This process produces a probabilistic estimate of the low-dimensional representation of the landmark's visual appearance. The overall filter provides information complementary to the conventional position estimates, which is used to enhance data association. In addition to robotic observations, the filter integrates human observations into the appearance estimates. The appearance tracks computed by the filter allow landmark classification. The set of labels involved in the classification task is treated as an observation space in which human observations are made by selecting a label. The low-dimensional appearance estimates returned by the filter allow for low-cost communication in low-bandwidth sensor networks. Deployment of the filter in such a network is demonstrated in an outdoor mapping application involving a human operator, a ground vehicle and an air vehicle.
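
The Bayesian fusion step can be illustrated with a discrete update over a small label space. The labels and likelihood values below are invented; the paper's filter actually operates in a compressed appearance space rather than directly on labels, so this is only a sketch of the update rule:

```python
# Discrete Bayesian update: posterior ∝ prior × likelihood, renormalized.

def bayes_update(prior, likelihood):
    """Fuse a likelihood with a prior over the same label set (Bayes' rule)."""
    posterior = {k: prior[k] * likelihood.get(k, 0.0) for k in prior}
    z = sum(posterior.values())
    return {k: v / z for k, v in posterior.items()}

prior = {"tree": 0.5, "rock": 0.3, "car": 0.2}
robot_obs = {"tree": 0.6, "rock": 0.3, "car": 0.1}    # from imaging data
human_obs = {"tree": 0.9, "rock": 0.05, "car": 0.05}  # operator picks a label

belief = bayes_update(prior, robot_obs)
belief = bayes_update(belief, human_obs)  # human input sharpens the estimate
print(max(belief, key=belief.get))
```

Fusing the (sharper) human observation after the robot observation mirrors how the paper integrates operator input alongside imaging data.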

Relevance: 20.00%

Abstract:

This paper presents findings from a rural and remote road safety study conducted in Queensland, Australia, from March 2004 to June 2007, comparing fatal crashes with non-fatal but serious crashes in respect of their environmental, vehicle and operator factors. During the study period there were 613 non-fatal crashes resulting in 684 hospitalised casualties, and 119 fatal crashes resulting in 130 fatalities. Additional information from police sources was available for 103 fatal and 309 non-fatal serious crashes. Over three quarters of both fatal and hospitalised casualties were male, and the median age in both groups was 34 years. Fatal crashes were more likely to involve speed, alcohol and violations of road rules, and fatal crash victims were 2.5 times more likely than non-fatal casualties to be unrestrained inside the vehicle, consistent with current international evidence. After controlling for human factors, vehicle and road conditions made a minimal contribution to the seriousness of the crash outcome. Targeted interventions to prevent fatalities on rural and remote roads should focus on reducing speeding and drink driving and on promoting seatbelt wearing.

Relevance: 20.00%

Abstract:

This study examines whether outcome expectancies (perceived consequences of engaging in certain behavior) and self-efficacy expectancies (confidence in personal capacity to regulate behavior) contribute to treatment outcome for alcohol dependence. Few clinical studies have examined these constructs. The Drinking Expectancy Profile (DEP), a psychometric measure of alcohol expectancy and drinking refusal self-efficacy, was administered to 298 alcohol-dependent patients (207 males) at assessment and on completion of a 12-week cognitive–behavioral therapy alcohol abstinence program. Baseline measures of expectancy and self-efficacy were not strong predictors of outcome. However, for the 164 patients who completed treatment, all alcohol expectancy and self-efficacy factors of the DEP showed change over time, with DEP scores approximating community norms at the end of treatment. Discriminant analysis indicated that changes in social-pressure drinking refusal self-efficacy, sexual enhancement expectancies and assertion expectancies successfully discriminated those who completed treatment from those who did not. Future research should examine the basis of expectancies related to social functioning as a possible mechanism of treatment response and a means of enhancing treatment outcome.

Relevance: 20.00%

Abstract:

This article presents a case study that shows how a creative music educator uses the internet to enable participatory performance.

Relevance: 20.00%

Abstract:

Similarity solutions for flow over an impermeable, non-linearly (quadratically) stretching sheet were studied recently by Raptis and Perdikis (Int. J. Non-Linear Mech. 41 (2006) 527–529) using a stream function of the form ψ = αx f(η) + βx² g(η). A fundamental error in their problem formulation is pointed out. On correction, it is shown that similarity solutions do not exist for this choice of ψ.

Relevance: 20.00%

Abstract:

Freeways are divided roadways designed to facilitate the uninterrupted movement of motor vehicles. However, many freeways now experience demand flows in excess of capacity, leading to recurrent congestion. The Highway Capacity Manual (TRB, 1994) uses empirical macroscopic relationships between speed, flow and density to quantify freeway operations and performance, with capacity predicted as the maximum uncongested flow achievable. Although macroscopic models are effective tools for design and analysis, they lack an understanding of the nature of the processes taking place in the system. Szwed and Smith (1972, 1974) and Makigami and Matsuo (1990) have shown that microscopic modelling is also applicable to freeway operations. Such models facilitate an understanding of the processes while providing for the assessment of performance through measures of capacity and delay; however, these models are limited to only a few circumstances. The aim of this study was therefore to produce more comprehensive and practical microscopic models. These models were required to accurately portray the mechanisms of freeway operations at the specific locations under consideration; to be calibrated using data acquired at those locations; to produce outputs that could be validated with data acquired at the same sites, and so be truly descriptive of the performance of the facility; to rest on a theoretical basis rather than on the empiricism of the macroscopic models currently used; and to be adaptable to variable operating conditions, so that they may be applied, where possible, to other similar systems and facilities. It was not possible to produce a stand-alone model applicable to all facilities and locations in this single study; however, the scene has been set for applying the models to a much broader range of operating conditions.

Opportunities for further development of the models were identified, and procedures provided for their calibration and validation over a wide range of conditions. The models developed do, however, have limitations in their applicability. Only uncongested operations were studied and represented. Driver behaviour in Brisbane was applied to the models; different mechanisms are likely in other locations owing to variability in road rules and driving cultures. Not all observed manoeuvres were modelled, some unusual manoeuvres being considered unwarranted to model. The models developed nevertheless contain the principal processes of freeway operations: merging and lane changing. Gap acceptance theory was applied to these critical operations to assess freeway performance. Gap acceptance theory was found to be applicable to merging; however, the major stream (the kerb-lane traffic) exercises only a limited priority over the minor stream (the on-ramp traffic), and theory was established to account for this behaviour. Kerb-lane drivers were also found to change to the median lane where possible to assist coincident mergers; the net limited-priority model accounts for this by predicting a reduced major-stream flow rate that excludes lane changers. Cowan's M3 model was calibrated for both streams, with on-ramp and total upstream flow required as input. Relationships between the proportion of headways greater than 1 s and flow differed between on-ramps fed by signalised intersections and those fed by unsignalised intersections. Constant-departure on-ramp metering was also modelled. Minimum follow-on times of 1 to 1.2 s were calibrated, and critical gaps were shown to lie between the minimum follow-on time and the sum of the minimum follow-on time and the 1 s minimum headway. Limited-priority capacity and other boundary relationships were established by Troutbeck (1995).

The minimum average minor-stream delay and the corresponding proportion of drivers delayed were quantified theoretically in this study. A simulation model was constructed to predict intermediate minor- and major-stream delays across all minor- and major-stream flows, and pseudo-empirical relationships were established to predict average delays. Major-stream average delays are limited to 0.5 s, insignificant compared with minor-stream delays, which reach infinity at capacity. Minor-stream delays were shown to be smaller when unsignalised rather than signalised intersections are located upstream of on-ramps, and smaller still when ramp metering is installed; smaller delays correspond to improved merge-area performance. A more tangible performance measure, the distribution of distances required to merge, was established by including design speeds. This distribution can be measured to validate the model, and merging probabilities can be predicted for given taper lengths, a most useful performance measure. The model was also shown to be applicable to lane changing. Tolerable limits on merging probabilities require calibration; from these, practical capacities can be estimated. Further calibration of the traffic inputs, critical gap and minimum follow-on time is required for both merging and lane changing, and a general relationship to predict the proportion of drivers delayed requires development. These models can then be used to complement existing macroscopic models in assessing performance and to provide further insight into the nature of operations.
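
A merge-capacity calculation under Cowan's M3 headway model can be sketched in a Troutbeck-style closed form. The parameter values below are illustrative (the follow-on time and 1 s minimum headway echo the ranges quoted above), and this generic textbook form stands in for, rather than reproduces, the thesis's calibrated limited-priority model:

```python
import math

def m3_capacity(q, alpha, delta, t_c, t_f):
    """Minor-stream (merge) capacity when major-stream headways follow
    Cowan's M3 distribution.
    q: major-stream flow (veh/s); alpha: proportion of free (unbunched)
    vehicles; delta: minimum headway (s); t_c: critical gap (s);
    t_f: follow-on time (s). A common Troutbeck-style closed form,
    offered as a sketch only; with alpha=1, delta=0 it reduces to the
    familiar exponential-headway capacity formula."""
    lam = alpha * q / (1.0 - delta * q)   # decay rate of free headways
    return alpha * q * math.exp(-lam * (t_c - delta)) / (1.0 - math.exp(-lam * t_f))

# Illustrative merge: 1080 veh/h major flow, 1 s minimum headway,
# 1.8 s critical gap, 1.2 s follow-on time, 70% free vehicles
q = 1080 / 3600.0
cap = m3_capacity(q, alpha=0.7, delta=1.0, t_c=1.8, t_f=1.2) * 3600
print(round(cap), "veh/h")
```

The small critical gaps calibrated in the study (between the follow-on time and the follow-on time plus the 1 s minimum headway) are what keep freeway merge capacities this high compared with typical intersection values.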

Relevance: 20.00%

Abstract:

Osteoclasts are specialised bone-resorbing cells. This particular ability makes osteoclasts irreplaceable for the continual physiological process of bone remodelling, as well as for the repair process during bone healing. Whereas the effects of systemic diseases on osteoclasts have been described by many authors, the spatial and temporal distribution of osteoclasts during bone healing has so far remained unclear. In the present study, healing of a tibial osteotomy under standardised external fixation was examined after 2, 3, 6 and 9 weeks (n = 8) in sheep. The number of osteoclasts was counted, the area of mineralised bone tissue was measured histomorphometrically, and the density of osteoclasts per square millimetre of mineralised tissue was calculated. The osteoclast density in the endosteal region increased, whereas the density in the periosteal region remained relatively constant. The density of osteoclasts within the cortical bone increased slightly over the first 6 weeks, with a more rapid increase between the sixth and ninth weeks. The findings of this study imply that remodelling and resorption already take place in the very early phase of bone healing. The most frequent remodelling activity is found in the periosteal callus, emphasising its role as the main stabiliser. The endosteal space undergoes resorption in order to recanalise the medullary cavity, a process that also starts at a low level in the very early phase of healing and increases significantly as healing progresses. The cortical bone adapts in its outward appearance to the surrounding callus structure; this paradoxical loosening is caused by the continually increasing number and density of osteoclasts in the cortical bone ends. This study clearly emphasises the role of osteoclasts, especially during early bone healing. These cells do not simply resorb bone but participate in a finely adjusted system with the bone-producing osteoblasts in order to maintain and improve the structural strength of bone tissue.

Relevance: 20.00%

Abstract:

Association rule mining has contributed to many advances in the area of knowledge discovery. However, the quality of the discovered association rules is a significant concern that has drawn increasing attention. One problem is the huge size of the extracted rule set: a dataset often yields an enormous number of rules, many of which are redundant with respect to other rules and thus useless in practice. Mining non-redundant rules is a promising approach to solving this problem. In this paper, we first propose a definition of redundancy, then propose a concise representation, called a Reliable basis, for representing non-redundant association rules. The Reliable basis contains a set of non-redundant rules derived using frequent closed itemsets and their generators, instead of the frequent itemsets usually used by traditional association rule mining approaches. An important contribution of this paper is the proposal to use the certainty factor as the criterion for measuring the strength of the discovered association rules. Using this criterion, we can ensure the elimination of as many redundant rules as possible without reducing the inference capacity of the remaining non-redundant rules. We prove that the redundancy elimination based on the proposed Reliable basis does not reduce the strength of belief in the extracted rules, and that all association rules, with their supports and confidences, can be retrieved from the Reliable basis without accessing the dataset. The Reliable basis is therefore a lossless representation of association rules. Experimental results show that the proposed Reliable basis can significantly reduce the number of extracted rules. We also conduct experiments on the application of association rules to product recommendation. The experimental results show that the non-redundant association rules extracted using the proposed method retain the same inference capacity as the entire rule set, indicating that non-redundant rules alone are sufficient to solve real problems, without needing the entire rule set.
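
The certainty factor mentioned above is commonly defined, in the MYCIN-style form usually adapted for association rules, as the relative change in belief in the consequent Y once the antecedent X is known; a minimal sketch:

```python
def certainty_factor(conf_xy, supp_y):
    """Certainty factor of a rule X -> Y, given the rule's confidence
    and the consequent's baseline support. Positive when knowing X
    increases belief in Y, negative when it decreases it, and 0 when
    X tells us nothing about Y. (Standard definition; the paper's exact
    usage within the Reliable basis may differ in detail.)"""
    if conf_xy > supp_y:
        return (conf_xy - supp_y) / (1.0 - supp_y)
    if conf_xy < supp_y:
        return (conf_xy - supp_y) / supp_y
    return 0.0

# A rule whose confidence (0.8) clearly exceeds the consequent's
# baseline support (0.4) gets a strong positive certainty factor
print(certainty_factor(0.8, 0.4))
```

Unlike raw confidence, this measure penalises rules whose consequent is frequent anyway, which is why it is a stricter criterion of rule strength.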

Relevance: 20.00%

Abstract:

In today's electronic world, vast amounts of knowledge are stored within many datasets and databases. Often the default format of this data means that the knowledge within is not immediately accessible but has to be mined and extracted, which requires automated tools that are effective and efficient. Association rule mining is one approach to obtaining the knowledge stored within datasets and databases; it yields frequent patterns and association rules between the items or attributes of a dataset, with varying levels of strength. However, this is also association rule mining's downside: the number of rules that can be found is usually very large. To use association rules (and the knowledge within them) effectively, the number of rules needs to be kept manageable, so a method is needed to reduce their number without losing knowledge in the process. Thus the idea of non-redundant association rule mining was born. A second issue in association rule mining is determining which rules are interesting. The standard approach has been to use support and confidence, but these have their limitations. Approaches which use information about the dataset's structure to measure association rules are few, but could yield useful rules if tapped. Finally, while it is important to be able to obtain interesting association rules from a dataset in a manageable quantity, it is equally important to be able to apply them in a practical way, so that the knowledge they contain can be taken advantage of. Association rules show items or attributes that frequently appear together; recommender systems likewise look at patterns and items or attributes that occur together frequently in order to make recommendations to a person. It should therefore be possible to bring the two together. In this thesis we look at these three issues and propose approaches to address them.

For discovering non-redundant rules, we propose enhanced approaches to rule mining in multi-level datasets that allow hierarchically redundant association rules to be identified and removed without information loss. For discovering interesting association rules based on the dataset's structure, we propose three measures for use in multi-level datasets. Lastly, we propose and demonstrate an approach that allows association rules to be used practically and effectively in a recommender system while at the same time improving the recommender system's performance. This becomes especially evident in the user cold-start problem, where our proposal helps to solve a serious problem facing recommender systems.