881 results for Low Speed Switched Reluctance Machine
Abstract:
An information filtering (IF) system monitors an incoming document stream to find the documents that match the information needs specified by user profiles. Learning to use the user profiles effectively is one of the most challenging tasks in developing an IF system. With document selection criteria better defined from the users’ needs, filtering large streams of information can be more efficient and effective. To learn user profiles, term-based approaches have been widely used in the IF community because of their simplicity and directness, and they are relatively well established. However, these approaches have problems dealing with polysemy and synonymy, which often lead to an information overload problem. Recently, pattern-based approaches (or Pattern Taxonomy Models (PTM) [160]) have been proposed for IF by the data mining community. These approaches are better at capturing semantic information and have shown encouraging results for improving the effectiveness of IF systems. On the other hand, pattern discovery from large data streams is not computationally efficient, and these approaches also have to deal with low-frequency pattern issues. The measures used by data mining techniques (for example, “support” and “confidence”) to learn the profile have turned out to be unsuitable for filtering and can lead to a mismatch problem. This thesis combines rough set-based reasoning (term-based) and pattern mining in a unified framework for information filtering to overcome these problems. The system consists of two stages: topic filtering and pattern mining. The topic filtering stage is intended to minimize information overload by filtering out the most likely irrelevant information based on the user profiles; a novel user-profile learning method and a theoretical model for threshold setting have been developed using rough set decision theory. The second stage (pattern mining) aims at solving the information mismatch problem and is precision-oriented. A new document-ranking function has been derived by exploiting the patterns in the pattern taxonomy, so that the most likely relevant documents are assigned higher scores. Because a relatively small number of documents remain after the first stage, the computational cost is markedly reduced while pattern discovery yields more accurate results, and the overall performance of the system is improved significantly. The new two-stage information filtering model has been evaluated by extensive experiments based on well-known IR benchmarking processes, using the latest version of the Reuters dataset, the Reuters Corpus Volume 1 (RCV1). The performance of the new two-stage model was compared with both term-based and data mining-based IF models. The results demonstrate that the proposed information filtering system significantly outperforms other IF systems, including the traditional Rocchio IF model and state-of-the-art term-based models such as BM25 and Support Vector Machines (SVM), as well as the Pattern Taxonomy Model (PTM).
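The two-stage design can be pictured with a short sketch. The Python snippet below is a hypothetical illustration only, not the thesis's implementation: the function names, profile weights, pattern weights and threshold are assumed values. Stage one discards documents whose term-based topic score falls below the threshold (recall-oriented); stage two re-ranks the survivors with a pattern-based scoring function (precision-oriented).

```python
# Minimal sketch of a two-stage information filtering pipeline.
# Stage 1: term-based topic filtering against a threshold (recall-oriented).
# Stage 2: pattern-based re-ranking of the remaining documents (precision-oriented).
# All names, weights and data here are illustrative assumptions.

def topic_score(doc_terms, profile_weights):
    """Term-based score: sum of profile weights for terms present in the document."""
    return sum(profile_weights.get(t, 0.0) for t in doc_terms)

def pattern_score(doc_terms, patterns):
    """Pattern-based score: credit each pattern fully contained in the document."""
    doc = set(doc_terms)
    return sum(w for pattern, w in patterns if set(pattern) <= doc)

def two_stage_filter(docs, profile_weights, patterns, threshold):
    # Stage 1: drop documents that are most likely irrelevant.
    candidates = [d for d in docs if topic_score(d["terms"], profile_weights) >= threshold]
    # Stage 2: rank the (much smaller) candidate set by pattern score.
    return sorted(candidates,
                  key=lambda d: pattern_score(d["terms"], patterns),
                  reverse=True)

if __name__ == "__main__":
    docs = [{"id": 1, "terms": ["market", "oil", "price"]},
            {"id": 2, "terms": ["football", "score"]}]
    profile = {"oil": 1.5, "price": 1.0}
    patterns = [(("oil", "price"), 2.0)]
    print([d["id"] for d in two_stage_filter(docs, profile, patterns, threshold=1.0)])
```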
Abstract:
Research has highlighted the relationship between vehicle speed and increased crash risk and severity. Evidence suggests that police speed enforcement, in particular speed camera operations, can be an effective tool for reducing traffic crashes. A quantitative survey of Queensland drivers (n = 852) was conducted to investigate the impact of police speed enforcement methods on self-reported speeding behaviour. Results indicate that visible enforcement was associated with significantly greater self-reported compliance than covert operations, irrespective of the mobility of the approach, and that its effects on behaviour were longer lasting. The mobility of operations appeared to be moderated by the visibility of the approach. Specifically, increased mobility was associated with increased reported compliant behaviour, but only for covert operations, and with longer-lasting reported compliant behaviour, but only for overt operations. The perceived effectiveness of various speed enforcement approaches is also analysed across a range of driving scenarios. Results are discussed in light of the small effect sizes, and recommendations for policy and future research are presented.
Abstract:
Traffic law enforcement is based on deterrence principles, whereby drivers control their behaviour in order to avoid an undesirable sanction. For “hooning”-related driving behaviours in Queensland, the driver’s vehicle can be impounded for 48 hours, 3 months, or permanently, depending on the number of previous hooning offences. It is assumed that the threat of losing something of value, their vehicle, will discourage drivers from hooning. While official data show that the rate of repeat offending is low, an in-depth understanding of the deterrent effects of these laws requires qualitative research with targeted drivers. A sample of 22 drivers who reported engaging in hooning behaviours participated in focus group discussions about the vehicle impoundment laws as applied to hooning offences in Queensland. The findings suggested that deterrence theory alone cannot fully explain hooning behaviour, as participants reported hooning frequently, and intended to continue doing so, despite believing it likely that they would be caught and perceiving the vehicle impoundment laws to be extremely severe. The punishment avoidance aspect of deterrence theory appears important, as do factors over and above legal issues, particularly social influences. A concerning finding was drivers’ willingness to flee from police in order to avoid losing their vehicle permanently for a third offence, despite acknowledging risks to their own safety and that of others. This paper discusses the study findings in terms of the implications for future research directions, enforcement practices and policy development for hooning and other traffic offences to which vehicle impoundment is applied.
Abstract:
Review of 'The Kursk', La Boite Theatre Company, published in The Australian, 3 September 2009.
Seismic performance of brick infilled RC frame structures in low and medium rise buildings in Bhutan
Abstract:
The construction of reinforced concrete buildings with unreinforced infill is common practice even in a seismically active country such as Bhutan, which is located in the high-seismicity region of the Eastern Himalaya. All buildings constructed prior to 1998 were built without seismic provisions, while those constructed after this period adopted the seismic codes of the neighbouring country, India. However, these codes provide limited information on the design of infilled structures, and differences in architectural requirements may compound the structural problems. Although the influence of infill on reinforced concrete framed structures is known, the present seismic codes do not consider it because of a lack of sufficient information. Time history analyses were performed to study the influence of infill on the performance of concrete framed structures. Important parameters were considered and the results are presented in a manner that can be used by practitioners. The results show that the influence of infill on the structural performance is significant. Structural responses such as the fundamental period, roof displacement, inter-storey drift ratio, stresses in the infill walls and member forces in beams and columns generally reduce with the incorporation of infill walls. Structures designed and constructed with or without seismic provisions perform in a similar manner if high-strength infills are used.
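Of the response quantities listed above, the inter-storey drift ratio is simply the relative lateral displacement of consecutive storeys divided by the storey height. The short Python sketch below shows that computation only; the displacement values and storey height are assumptions for demonstration, not results from the analyses.

```python
# Illustrative computation of inter-storey drift ratios from storey displacements.
# The displacement values and storey height below are assumed demonstration data.

def interstorey_drift_ratios(displacements_mm, storey_height_mm):
    """Drift ratio of storey i = (d_i - d_{i-1}) / storey height."""
    ratios = []
    previous = 0.0  # ground level assumed fixed
    for d in displacements_mm:
        ratios.append((d - previous) / storey_height_mm)
        previous = d
    return ratios

if __name__ == "__main__":
    # Peak lateral displacements (mm) at each floor of a hypothetical 4-storey frame.
    peak_displacements = [8.0, 18.0, 26.0, 31.0]
    print(interstorey_drift_ratios(peak_displacements, storey_height_mm=3000.0))
```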
Abstract:
The term self-selected (i.e., individual or comfortable walking pace or speed) is commonly used in the literature (Frost, Dowling, Bar-Or, & Dyson, 1997; Jeng, Liao, Lai, & Hou, 1997; Wergel-Kolmert & Wohlfart, 1999; Maltais, Bar-Or, Pienynowski, & Galea, 2003; Browning & Kram, 2005; Browning, Baker, Herron, & Kram, 2006; Hills, Byrne, Wearing, & Armstrong, 2006) and is identified as the most efficient walking speed, with increased efficiency defined by lower oxygen uptake (VO₂) per unit mechanical work (Hoyt & Taylor, 1981; Taylor, Heglund, & Maloiy, 1982; Hreljac, 1993). [...] assessing individual and group differences in metabolic energy expenditure using oxygen uptake requires individuals to be comfortable with, and able to accommodate to, the equipment.
Abstract:
Objective: Obesity associated with atypical antipsychotic medications is an important clinical issue for people with schizophrenia. The purpose of this project was to determine whether there were any differences in resting energy expenditure (REE) and respiratory quotient (RQ) between men with schizophrenia and controls. Method: Thirty-one men with schizophrenia were individually matched for age and relative body weight with healthy, sedentary controls. Deuterium dilution was used to determine total body water and subsequently fat-free mass (FFM). Indirect calorimetry using a Deltatrac metabolic cart was used to determine REE and RQ. Results: When corrected for FFM, there was no significant difference in REE between the groups. However, fasting RQ was significantly higher in the men with schizophrenia than the controls. Conclusion: Men with schizophrenia oxidised proportionally less fat and more carbohydrate under resting conditions than healthy controls. These differences in substrate utilisation at rest may be an important consideration in obesity in this clinical group.
Abstract:
In this article we present an alternative theoretical perspective on contemporary cultural, political and economic practices in advanced countries. Like other articles in this issue of parallax, our focus is on conceptualising the economies of excess. However, our ideas do not draw on the writings of Georges Bataille in The Accursed Share, but principally on Virilio’s Speed & Politics: An Essay on Dromology and Marx’s Capital and the Grundrisse. Using a modest synthesis of tools provided by these theorists, we put forward a tentative conceptualisation of ‘dromoeconomics’, or, a political economy of speed.
Abstract:
The portability and runtime safety of programs executed on the Java Virtual Machine (JVM) make the JVM an attractive target for compilers of languages other than Java. Unfortunately, the JVM was designed with the Java language in mind, and lacks many of the primitives required for a straightforward implementation of other languages. Here, we discuss how the JVM may be used to implement other object-oriented languages. As a practical example of the possibilities, we report on a comprehensive case study. The open source Gardens Point Component Pascal compiler compiles the entire Component Pascal language, a dialect of Oberon-2, to JVM bytecodes. This compiler achieves runtime efficiencies which are comparable to native-code implementations of procedural languages.
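The general idea of targeting JVM bytecodes from a non-Java front end can be pictured with a toy code generator. The Python snippet below is only an assumed illustration of stack-machine code generation, emitting Jasmin-style mnemonics for a tiny expression tree; it is not part of the Gardens Point Component Pascal compiler.

```python
# Toy illustration of compiling to the JVM's operand-stack model: a post-order walk
# of an expression tree emits Jasmin-style bytecode mnemonics. Purely an assumed
# sketch of the general technique, not the compiler described in the abstract.

def emit(expr):
    """expr is an int literal or a tuple (op, left, right) with op in '+', '*'."""
    if isinstance(expr, int):
        return [f"ldc {expr}"]              # push an integer constant
    op, left, right = expr
    return emit(left) + emit(right) + [{"+": "iadd", "*": "imul"}[op]]

if __name__ == "__main__":
    # (2 + 3) * 4  ->  stack-machine code for the JVM operand stack
    for instr in emit(("*", ("+", 2, 3), 4)):
        print(instr)
```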
Abstract:
The care of low-vision patients is termed vision rehabilitation, and optometrists have an essential role to play in the provision of vision rehabilitation services. Ideally, if patients stay with one optometrist or practice, their low-vision care becomes part of a continuum of eye care, from the time when they had normal vision. If progressive vision loss occurs, the role of the optometrist changes from primary eye care only to one of monitoring vision loss and gradually introducing low-vision care, especially magnification and advice on lighting and contrast, in conjunction with other vision rehabilitation professionals.
Abstract:
Modern machines are complex and often required to operate for long hours to achieve production targets. The ability to detect symptoms of failure, and hence forecast the remaining useful life of the machine, is vital to preventing catastrophic failures. This is essential to reducing maintenance costs, operational downtime and safety hazards. Recent advances in condition monitoring technologies have given rise to a number of prognosis models that attempt to forecast machinery health based on either condition data or reliability data. In practice, failure condition trending data are seldom kept by industry, and data that end with a suspension are sometimes treated as failure data. This paper presents a novel approach for incorporating historical failure data and suspended condition trending data in the prognostic model. The proposed model consists of a feed-forward neural network (FFNN) whose training targets are asset survival probabilities estimated using a variation of the Kaplan-Meier estimator and a degradation-based failure probability density function (PDF) estimator. The output survival probabilities collectively form an estimated survival curve. The viability of the model was tested using a set of industry vibration data.
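The idea of turning both run-to-failure and suspended histories into training targets can be illustrated with a small Kaplan-Meier sketch in which suspensions are treated as right-censored observations rather than failures. The Python code below is a minimal assumed example: the event times are invented, and using the resulting curve as FFNN training targets is only indicated in a comment; it is not the paper's implementation.

```python
# Kaplan-Meier survival estimate treating suspended histories as right-censored,
# rather than (incorrectly) as failures. Event times are illustrative assumptions.

def kaplan_meier(durations, failed):
    """Return (time, survival probability) pairs.
    durations: operating time of each unit; failed: True if it failed, False if suspended."""
    times = sorted(set(t for t, f in zip(durations, failed) if f))
    survival, curve = 1.0, []
    for t in times:
        at_risk = sum(1 for d in durations if d >= t)   # units still operating just before t
        deaths = sum(1 for d, f in zip(durations, failed) if d == t and f)
        survival *= 1.0 - deaths / at_risk
        curve.append((t, survival))
    return curve

if __name__ == "__main__":
    # Five units: three ran to failure, two were suspended (still healthy at last record).
    hours = [120, 150, 150, 200, 240]
    failed = [True, True, False, True, False]
    for t, s in kaplan_meier(hours, failed):
        print(f"t={t} h  S(t)={s:.3f}")  # such pairs could serve as network training targets
```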
Abstract:
Objectives. To evaluate the performance of the dynamic-area high-speed videokeratoscopy technique in the assessment of tear film surface quality with and without the presence of soft contact lenses on eye. Methods. Retrospective data from a tear film study using basic high-speed videokeratoscopy, captured at 25 frames per second (Kopf et al., 2008, J Optom), were used. Eleven subjects had tear film analysis conducted in the morning, midday and evening on the first and seventh day of one week of no lens wear. Five of the eleven subjects then completed an extra week of hydrogel lens wear followed by a week of silicone hydrogel lens wear. Analysis was performed on a 6-second period of the inter-blink recording. The dynamic-area high-speed videokeratoscopy technique uses the maximum available area of the Placido ring pattern reflected from the tear interface and eliminates regions of disturbance due to shadows from the eyelashes. A value of tear film surface quality was derived using image processing techniques, based on the quality of the reflected ring pattern orientation. Results. The group mean tear film surface quality and the standard deviations for each of the conditions (bare eye, hydrogel lens, and silicone hydrogel lens) showed a much lower coefficient of variation than previous methods (an average reduction of about 92%). Bare-eye measurements from the right and left eyes of the eleven individuals showed high correlation values (Pearson’s correlation r = 0.73, p < 0.05). Repeated measures ANOVA across the 6-second period of measurement in the normal inter-blink period for the bare eye condition showed no statistically significant changes. However, across the 6-second inter-blink period with both contact lenses, statistically significant changes were observed (p < 0.001) for both types of contact lens material. Overall, wearing hydrogel and silicone hydrogel lenses caused the tear film surface quality to worsen compared with the bare eye condition (repeated measures ANOVA, p < 0.0001 for both hydrogel and silicone hydrogel). Conclusions. The results suggest that the dynamic-area method of high-speed videokeratoscopy was able to distinguish and quantify the subtle but systematic worsening of tear film surface quality in the inter-blink interval during contact lens wear. It was also able to clearly show a difference between bare eye and contact lens wearing conditions.
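One crude way to picture a metric "based on the quality of the reflected ring pattern orientation" is a structure-tensor coherence measure averaged over the analysed corneal area. The Python sketch below is an assumed stand-in for the authors' image-processing pipeline; the metric, the mask and the synthetic ring image are all illustrative.

```python
import numpy as np
from scipy.ndimage import uniform_filter

# Illustrative ring-pattern regularity metric: structure-tensor coherence averaged
# over a mask of valid (non-shadowed) pixels. An assumed stand-in for the
# dynamic-area TFSQ estimator described in the abstract, not the authors' algorithm.

def orientation_coherence(image, mask, window=9):
    gy, gx = np.gradient(image.astype(float))
    # Locally averaged structure-tensor components.
    jxx = uniform_filter(gx * gx, window)
    jyy = uniform_filter(gy * gy, window)
    jxy = uniform_filter(gx * gy, window)
    num = np.sqrt((jxx - jyy) ** 2 + 4.0 * jxy ** 2)
    den = jxx + jyy + 1e-12
    coherence = num / den   # ~1 where rings are clean and regular, lower where disrupted
    return float(coherence[mask].mean())

if __name__ == "__main__":
    # Synthetic concentric-ring pattern standing in for a videokeratoscopy frame.
    y, x = np.mgrid[-128:128, -128:128]
    rings = np.sin(np.hypot(x, y) / 4.0)
    mask = np.hypot(x, y) < 100          # analysis area, e.g. after removing lash shadows
    noisy = rings + 0.8 * np.random.randn(*rings.shape)
    print(orientation_coherence(rings, mask), orientation_coherence(noisy, mask))
```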
Abstract:
A new method for noninvasive assessment of tear film surface quality (TFSQ) is proposed. The method is based on high-speed videokeratoscopy in which the corneal area for analysis is dynamically estimated in a manner that removes videokeratoscopy interference caused by shadows of the eyelashes but not the interference related to the poor quality of the precorneal tear film, which is of interest. The separation between these two types of seemingly similar videokeratoscopy interference is achieved by region-based classification, in which the overall noise is first separated from the useful signal (the unaltered videokeratoscopy pattern), followed by a dedicated interference classification algorithm that distinguishes between the two considered interferences. The proposed technique provides a much wider corneal area for the analysis of TFSQ than previously reported techniques. A preliminary study with the proposed technique, carried out for a range of anterior eye conditions, showed effective behavior in terms of noise-to-signal separation and interference classification, as well as consistent TFSQ results. Subsequently, the method proved able not only to discriminate between the bare-eye and lens-on-eye conditions but also to have the potential to discriminate between the two types of contact lenses.
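The region-based classification step described above can also be sketched in simplified form: flag low-quality ("disturbed") pixels first, then decide for each connected disturbed region whether it looks like an eyelash shadow (dark and touching the image border, so it is excluded from the analysis area) or a genuine tear-film disruption (retained, since it carries the TFSQ information of interest). The thresholds, features and demo data in the Python snippet below are assumptions for demonstration, not the published algorithm.

```python
import numpy as np
from scipy.ndimage import label

# Illustrative two-step region classification, loosely following the idea in the
# abstract: step 1 separates noise (disturbed pixels) from useful signal; step 2
# classifies each disturbed region as an eyelash shadow or a tear-film disruption.
# Thresholds and features are assumptions for demonstration only.

def classify_disturbances(image, coherence, coh_thresh=0.6, dark_thresh=0.25):
    disturbed = coherence < coh_thresh                   # step 1: noise vs useful signal
    regions, n = label(disturbed)
    shadow_mask = np.zeros_like(disturbed)
    tearfilm_mask = np.zeros_like(disturbed)
    for i in range(1, n + 1):
        region = regions == i
        touches_border = (region[0].any() or region[-1].any()
                          or region[:, 0].any() or region[:, -1].any())
        is_dark = image[region].mean() < dark_thresh
        # step 2: interference classification
        if is_dark and touches_border:
            shadow_mask |= region                        # exclude from analysis area
        else:
            tearfilm_mask |= region                      # retain: reflects tear film quality
    return shadow_mask, tearfilm_mask

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.uniform(0.4, 0.9, (64, 64))
    coh = rng.uniform(0.7, 1.0, (64, 64))
    img[:20, :5], coh[:20, :5] = 0.1, 0.2     # dark, border-touching patch -> shadow
    coh[30:40, 30:40] = 0.3                   # bright low-coherence patch -> tear film
    shadows, tearfilm = classify_disturbances(img, coh)
    print(shadows.sum(), tearfilm.sum())
```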