891 results for Explicit criteria


Relevance:

20.00%

Publisher:

Abstract:

The pioneering work of Runge and Kutta a hundred years ago has ultimately led to suites of sophisticated numerical methods suitable for solving complex systems of deterministic ordinary differential equations. However, in many modelling situations the appropriate representation is a stochastic differential equation, and here numerical methods are much less sophisticated. In this paper a very general class of stochastic Runge-Kutta methods is presented, and classes of explicit methods that are much more efficient than existing methods are constructed. In particular, a method of strong order 2 with a deterministic component based on the classical Runge-Kutta method is constructed, and some numerical results are presented to demonstrate the efficacy of this approach.
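The specific strong order 2 scheme constructed in the paper is not reproduced here. As a point of reference only, the Python sketch below (with a hypothetical drift and diffusion) contrasts the Euler-Maruyama scheme (strong order 0.5) with Platen's explicit, derivative-free strong order 1.0 scheme, whose supporting stage gives it the Runge-Kutta-like structure discussed above.

```python
# Illustrative sketch only -- not the scheme constructed in the paper.
# Compares Euler-Maruyama (strong order 0.5) with Platen's explicit,
# derivative-free strong order 1.0 scheme for a scalar Ito SDE with
# hypothetical drift a(x) and diffusion b(x).
import numpy as np

def a(x):               # hypothetical drift
    return 1.5 * x

def b(x):               # hypothetical diffusion
    return 0.1 * x

def euler_maruyama(x0, T, n, rng):
    dt = T / n
    x = x0
    for _ in range(n):
        dW = rng.normal(0.0, np.sqrt(dt))
        x = x + a(x) * dt + b(x) * dW
    return x

def platen_explicit(x0, T, n, rng):
    """Explicit strong order 1.0 scheme: the diffusion derivative of the
    Milstein term is replaced by a finite difference through a supporting
    stage, which is what makes the method Runge-Kutta-like."""
    dt = T / n
    sdt = np.sqrt(dt)
    x = x0
    for _ in range(n):
        dW = rng.normal(0.0, sdt)
        x_sup = x + a(x) * dt + b(x) * sdt          # supporting stage
        x = (x + a(x) * dt + b(x) * dW
             + (b(x_sup) - b(x)) * (dW**2 - dt) / (2.0 * sdt))
    return x

rng = np.random.default_rng(0)
print(euler_maruyama(1.0, 1.0, 1000, rng), platen_explicit(1.0, 1.0, 1000, rng))
```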

Relevance:

20.00%

Publisher:

Abstract:

Compression ignition (CI) engine design is subject to many constraints, which presents a multi-criteria optimisation problem that the engine researcher must solve. In particular, the modern CI engine must not only be efficient, but must also deliver low gaseous, particulate and life-cycle greenhouse gas emissions so that its impacts on urban air quality, human health and global warming are minimised. Consequently, this study undertakes a multi-criteria analysis which seeks to identify alternative fuels, injection technologies and combustion strategies that could potentially satisfy these CI engine design constraints. Three datasets are analysed with the Preference Ranking Organization Method for Enrichment Evaluations and Geometrical Analysis for Interactive Aid (PROMETHEE-GAIA) algorithm to explore the impact on the resulting emissions and efficiency profiles of the test engines of: (1) an ethanol fumigation system; (2) alternative fuels (20 % biodiesel and synthetic diesel) and alternative injection technologies (mechanical direct injection and common rail injection); and (3) various biodiesel fuels made from three feedstocks (soy, tallow and canola) tested at several blend percentages (20-100 %). The results show that moderate ethanol substitutions (~20 % by energy) at moderate load, high-percentage soy blends (60-100 %), and alternative fuels (biodiesel and synthetic diesel) provide an efficiency and emissions profile that yields the most “preferred” solutions to this multi-criteria engine design problem. Further research is, however, required to reduce Reactive Oxygen Species (ROS) emissions with alternative fuels, and to deliver technologies that do not significantly reduce the median diameter of particle emissions.
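As an illustration of the PROMETHEE ranking step only (the study's datasets, criteria and weights are not reproduced, and the GAIA plane is omitted), the Python sketch below computes net outranking flows for a few hypothetical alternatives using the "usual" preference function.

```python
# Minimal PROMETHEE II sketch on hypothetical data, not the study's datasets.
import numpy as np

# rows = alternatives (e.g. fuel/technology options), columns = criteria
X = np.array([[0.38, 4.1, 2.0],     # hypothetical efficiency, NOx, PM scores
              [0.35, 3.2, 1.1],
              [0.40, 4.8, 2.6]])
weights = np.array([0.5, 0.3, 0.2])          # hypothetical criterion weights
maximise = np.array([True, False, False])    # efficiency up, emissions down

n = X.shape[0]
signs = np.where(maximise, 1.0, -1.0)
phi_net = np.zeros(n)
for a_idx in range(n):
    for b_idx in range(n):
        if a_idx == b_idx:
            continue
        d = signs * (X[a_idx] - X[b_idx])              # signed pairwise differences
        pref_ab = weights @ (d > 0).astype(float)      # "usual" preference function
        pref_ba = weights @ (d < 0).astype(float)
        phi_net[a_idx] += (pref_ab - pref_ba) / (n - 1)

print(np.argsort(-phi_net))   # alternatives ranked by net outranking flow
```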

Relevance:

20.00%

Publisher:

Abstract:

There exists an important tradition of content analyses of aggression in sexually explicit material. The majority of these analyses use a definition of aggression that excludes consent. This article identifies three problems with this approach. First, it does not distinguish between aggression and some positive acts. Second, it excludes a key element of healthy sexuality. Third, it can lead to heteronormative definitions of healthy sexuality. It would be better for our content analyses to use a definition of aggression that includes consideration of consent, such as Baron and Richardson's (1994). A number of difficulties with attending to consent have been identified, but this article offers solutions to each of them.

Relevance:

20.00%

Publisher:

Abstract:

Speaker diarization is the process of annotating an input audio stream with information that attributes temporal regions of the signal to their respective sources, which may include both speech and non-speech events. For speech regions, the diarization system also specifies the locations of speaker boundaries and assigns relative speaker labels to each homogeneous segment of speech. In short, speaker diarization systems answer the question of ‘who spoke when’. There are several important applications for speaker diarization technology, such as facilitating speaker indexing systems that allow users to directly access the relevant segments of interest within a given audio recording, and assisting with downstream processes such as summarizing and parsing. When combined with automatic speech recognition (ASR) systems, the metadata extracted by a speaker diarization system can provide complementary information for ASR transcripts, including the location of speaker turns and relative speaker segment labels, making the transcripts more readable. Speaker diarization output can also be used to localize the instances of specific speakers and pool data for model adaptation, which in turn boosts transcription accuracy. Speaker diarization therefore plays an important role as a preliminary step in the automatic transcription of audio data.

The aim of this work is to improve the usefulness and practicality of speaker diarization technology through the reduction of diarization error rates. In particular, this research focuses on the segmentation and clustering stages within a diarization system. Although particular emphasis is placed on the broadcast news audio domain, and the systems developed throughout this work are trained and tested on broadcast news data, the techniques proposed in this dissertation are also applicable to other domains, including telephone conversations and meeting audio. Three main research themes were pursued: heuristic rules for speaker segmentation, modelling uncertainty in speaker model estimates, and modelling uncertainty in eigenvoice speaker modelling.

The use of heuristic approaches for the speaker segmentation task was investigated first, with emphasis placed on minimizing missed boundary detections. A set of heuristic rules was proposed to govern the detection and heuristic selection of candidate speaker segment boundaries. A second pass, using the same heuristic algorithm with a smaller window, was also proposed with the aim of improving the detection of boundaries around short speaker segments. Compared to single-threshold-based methods, the proposed heuristic approach was shown to provide improved segmentation performance, leading to a reduction in the overall diarization error rate.

Methods to model the uncertainty in speaker model estimates were then developed, to address the difficulties associated with making segmentation and clustering decisions from the limited data in the speaker segments. The Bayes factor, derived specifically for multivariate Gaussian speaker modelling, was introduced to account for the uncertainty of the speaker model estimates. The use of the Bayes factor also enabled the incorporation of prior information about the audio to aid segmentation and clustering decisions. The idea of modelling uncertainty in speaker model estimates was then extended to the eigenvoice speaker modelling framework for the speaker clustering task. Building on the application of Bayesian approaches to the speaker diarization problem, the proposed approach takes into account the uncertainty associated with the explicit estimation of the speaker factors. The proposed decision criteria, based on Bayesian theory, were shown to generally outperform their non-Bayesian counterparts.
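The Bayes factor derived in this work for multivariate Gaussian speaker models is not reproduced here. As a simpler, widely used point of reference for the same merge-or-split decision, the Python sketch below implements the standard delta-BIC criterion on hypothetical feature data.

```python
# Not the Bayes factor derived in the thesis -- a simpler, widely used
# reference: the delta-BIC test for whether two speech segments are better
# modelled by one full-covariance Gaussian (same speaker) or two (different
# speakers). Feature matrices are hypothetical MFCC-like data.
import numpy as np

def delta_bic(X1, X2, lam=1.0):
    """Returns delta BIC; a negative value favours merging the two segments."""
    X = np.vstack([X1, X2])
    n1, n2, n = len(X1), len(X2), len(X1) + len(X2)
    d = X.shape[1]
    logdet = lambda A: np.linalg.slogdet(np.cov(A, rowvar=False))[1]
    r = 0.5 * (n * logdet(X) - n1 * logdet(X1) - n2 * logdet(X2))
    penalty = 0.5 * (d + 0.5 * d * (d + 1)) * np.log(n)
    return r - lam * penalty

rng = np.random.default_rng(1)
seg_a = rng.normal(0.0, 1.0, (200, 12))   # hypothetical 12-dim features
seg_b = rng.normal(0.5, 1.0, (150, 12))
print(delta_bic(seg_a, seg_b))            # > 0 suggests different speakers
```

Unlike this point estimate, the Bayes factor approach described above also allows prior information about the audio to enter the merge decision.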

Relevance:

20.00%

Publisher:

Abstract:

Context: The Ober and Thomas tests are subjective and involve a "negative" or "positive" assessment, making them difficult to apply within the paradigm of evidence-based medicine. No authors have combined the subjective clinical assessment with an objective measurement for these special tests. Objective: To compare the subjective assessment of iliotibial band and iliopsoas flexibility with the objective measurement of a digital inclinometer, to establish normative values, and to provide an evidence-based critical criterion for determining tissue tightness. Design: Cross-sectional study. Setting: Clinical research laboratory. Patients or Other Participants: Three hundred recreational athletes (125 men, 175 women; 250 in the injured group, 50 in the control group). Main Outcome Measure(s): Iliotibial band and iliopsoas muscle flexibility were determined subjectively using the modified Ober and Thomas tests, respectively. Using a digital inclinometer, we objectively measured limb position. Interrater reliability for the subjective assessment was compared between 2 clinicians for a random sample of 100 injured participants, who were classified subjectively as either negative or positive for iliotibial band and iliopsoas tightness. Percentage of agreement indicated interrater reliability for the subjective assessment. Results: For iliotibial band flexibility, the average inclinometer angle was -24.59° ± 7.27°. A total of 432 limbs were subjectively assessed as negative (-27.13° ± 5.53°) and 168 as positive (-16.29° ± 6.87°). For iliopsoas flexibility, the average inclinometer angle was -10.60° ± 9.61°. A total of 392 limbs were subjectively assessed as negative (-15.51° ± 5.82°) and 208 as positive (0.34° ± 7.00°). The critical criteria for iliotibial band and iliopsoas flexibility were determined to be -23.16° and -9.69°, respectively. Between-clinician agreement was very good, at 95.0% and 97.6% for the Thomas and Ober tests, respectively. Conclusions: Subjective assessments and instrumented measurements were combined to establish normative values and critical criteria for tissue flexibility for the modified Ober and Thomas tests.

Relevance:

20.00%

Publisher:

Abstract:

Particulate matter research is essential because of the well-known significant adverse effects of aerosol particles on human health and the environment. In particular, identification of the origin or sources of particulate matter emissions is of paramount importance in assisting efforts to control and reduce air pollution in the atmosphere. This thesis aims to: identify the sources of particulate matter; compare pollution conditions at urban, rural and roadside receptor sites; combine information about the sources with meteorological conditions at the sites to locate the emission sources; compare sources based on particle size or mass; and, ultimately, provide the basis for control and reduction of particulate matter concentrations in the atmosphere.

To achieve these objectives, data were obtained from assorted local and international receptor sites over long sampling periods. The samples were analysed using Ion Beam Analysis and Scanning Mobility Particle Sizer methods to measure the particle mass with chemical composition and the particle size distribution, respectively. Advanced data analysis techniques were employed to derive information from these large, complex data sets. Multi-Criteria Decision Making (MCDM), a ranking method, drew on data variability to examine the overall trends and provided a rank ordering of the sites and the years in which sampling was conducted. Coupled with the receptor model Positive Matrix Factorisation (PMF), the pollution emission sources were identified and meaningful information pertinent to the prioritisation of control and reduction strategies was obtained.

This thesis is presented in the thesis-by-publication format. It includes four refereed papers which together demonstrate a novel combination of data analysis techniques that enabled particulate matter sources to be identified and sampling sites/years to be ranked. The strength of this source identification process was corroborated when the analysis procedure was expanded to encompass multiple receptor sites. Initially applied to identify the contributing sources at roadside and suburban sites in Brisbane, the technique was subsequently applied to three receptor sites (roadside, urban and rural) located in Hong Kong. The comparable results from these international and national sites over several sampling periods indicated similarities in source contributions between receptor site types, irrespective of global location, and suggested the need to apply these methods to air pollution investigations worldwide. Furthermore, an investigation into particle size distribution data was conducted to deduce the sources of aerosol emissions based on particle size and elemental composition. Considering the adverse effects on human health caused by small particles, knowledge of particle size distribution and elemental composition provides a different perspective on the pollution problem.

This thesis clearly illustrates that the application of an innovative combination of advanced data interpretation methods to identify particulate matter sources and rank sampling sites/years provides the basis for the prioritisation of future air pollution control measures. Moreover, this study contributes significantly to knowledge of the chemical composition of airborne particulate matter in Brisbane, Australia, and of the identity and plausible locations of the contributing sources. Such novel source apportionment and ranking procedures are ultimately applicable to environmental investigations worldwide.
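As an illustration of the factorisation idea underlying PMF only (receptor-modelling PMF additionally weights each observation by its measurement uncertainty, which this sketch omits), the Python snippet below decomposes a hypothetical samples-by-elements concentration matrix into non-negative source contributions and source profiles using scikit-learn's NMF.

```python
# Illustrative only: PMF in receptor modelling weights each data point by its
# measurement uncertainty, which plain NMF does not. This sketch shows the
# underlying non-negative factorisation X ~ G @ F (samples x species into
# source contributions x source profiles) on hypothetical data.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
true_profiles = rng.random((3, 8))              # 3 sources x 8 elements
true_contrib = rng.random((50, 3))              # 50 samples x 3 sources
X = true_contrib @ true_profiles + 0.01 * rng.random((50, 8))

model = NMF(n_components=3, init="nndsvda", max_iter=1000, random_state=0)
G = model.fit_transform(X)     # estimated source contributions per sample
F = model.components_          # estimated source profiles (element signatures)
print(G.shape, F.shape)
```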

Relevance:

20.00%

Publisher:

Abstract:

Making institutional expectations explicit using clear and common language engages commencing students and promotes help-seeking behaviour. When first year students enter university they cross the threshold into an unfamiliar environment (Devlin, Kift, Nelson, Smith & McKay, 2012). Universities endeavour to provide appropriate learning support services and resources; however, research suggests that there is limited uptake of these services, particularly by high-risk students (Nelson-Field & Goodman, 2005). The Successful Student Skills Checklist is a tool that will be trialled during the 2013 Orientation period at the QUT Caboolture campus. The new tool is a response to the university’s commitment to provide “an environment where [students] are supported to take responsibility for their own learning, and to embrace an active role in succeeding to their full potential” (QUT, 2012, 6.2.1). This paper outlines the design of the support tool implemented during Orientation and discusses the anticipated outcomes of the trial.

Relevance:

20.00%

Publisher:

Abstract:

The overall aim of our research was to characterize airborne particles from selected nanotechnology processes and to use the data to develop and test quantitative, particle concentration-based criteria that can be used to trigger an assessment of particle emission controls. We investigated particle number concentration (PNC), particle mass (PM) concentration, count median diameter (CMD), alveolar deposited surface area, elemental composition, and morphology from sampling of aerosols arising from six nanotechnology processes. These included fibrous and non-fibrous particles, including carbon nanotubes (CNTs). We adopted standard occupational hygiene principles in relation to controlling peak emissions and exposures, as outlined by both Safe Work Australia (1) and the American Conference of Governmental Industrial Hygienists (ACGIH®) (2). Analysis of peak (highest value recorded) and 30-minute averaged particle number and mass concentration values measured during operation of the nanotechnology processes revealed that peak PNC in the 20–1000 nm range was up to three orders of magnitude greater than the local background particle concentration (LBPC), peak PNC in the 300–3000 nm range was up to an order of magnitude greater, and PM2.5 concentrations were up to four orders of magnitude greater. For three of these nanotechnology processes, the 30-minute averaged particle number and mass concentrations were also significantly different from the LBPC (p-value < 0.001). We propose that emission or exposure controls may need to be implemented or modified, or a further assessment of the controls undertaken, if concentrations exceed three times the LBPC, which is also used as the local particle reference value, for more than a total of 30 minutes during a workday, and/or if a single short-term measurement exceeds five times the local particle reference value. The use of these quantitative criteria, which we term the universal excursion guidance criteria, accounts for the typical variation in the LBPC and the inaccuracy of instruments, while remaining precautionary enough to highlight peaks in particle concentration likely to be associated with particle emission from the nanotechnology process. Recommendations on when to use local excursion guidance criteria are also provided.
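A minimal sketch of the proposed excursion test as stated above, assuming a fixed sampling interval and using hypothetical concentration data and a hypothetical reference value:

```python
# Sketch of the universal excursion guidance criteria described above: flag a
# process if concentrations exceed 3x the local particle reference value for
# more than 30 minutes in total during a workday, or if any single short-term
# measurement exceeds 5x the reference value. Data and reference are hypothetical.
import numpy as np

def excursion_flag(concs, reference, interval_min=1.0):
    """concs: concentrations measured at a fixed sampling interval (minutes)."""
    minutes_above_3x = np.sum(concs > 3.0 * reference) * interval_min
    any_above_5x = np.any(concs > 5.0 * reference)
    return minutes_above_3x > 30.0 or bool(any_above_5x)

rng = np.random.default_rng(2)
background = 8000.0                                         # hypothetical LBPC
workday = background * rng.lognormal(0.0, 0.4, size=480)    # 8 h at 1-min intervals
print(excursion_flag(workday, reference=background))
```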

Relevance:

20.00%

Publisher:

Abstract:

Increasing global competition, rapid technological change, advances in manufacturing and information technology, and discerning customers are forcing supply chains to adopt improvement practices that enable them to deliver high-quality products at a lower cost and in a shorter period of time. A lean initiative is one of the most effective approaches toward achieving this goal. In the lean improvement process, it is critical to measure current and desired performance levels in order to clearly evaluate lean implementation efforts. Many attempts have been made to measure supply chain performance incorporating both quantitative and qualitative measures, but they have failed to provide an effective method of measuring performance improvements in dynamic lean supply chain situations. Appropriate measurement of lean supply chain performance has therefore become imperative. There are many lean tools available for supply chains; however, the effectiveness of a lean tool depends on the type of product and supply chain. One tool may be highly effective for a supply chain involved in high-volume products but may not be effective for low-volume products. There is currently no systematic methodology available for selecting appropriate lean strategies based on the type of supply chain and market strategy.

This thesis develops an effective method to measure the performance of supply chains using both quantitative and qualitative metrics, and investigates the effects of product types and lean tool selection on supply chain performance. Supply chain performance metrics, and the effects of various lean tools on the performance metrics defined in the SCOR framework, are investigated. A lean supply chain model based on the SCOR metric framework is then developed, in which non-lean and lean as well as quantitative and qualitative metrics are incorporated. The values of the metrics are converted into triangular fuzzy numbers using similarity rules and heuristic methods. Data were collected from an apparel manufacturing company for multiple supply chain products, and a fuzzy-based method was applied to measure the performance improvements in the supply chains. Using the fuzzy TOPSIS method, which chooses an optimum alternative so as to maximise similarity with the positive ideal solution and minimise similarity with the negative ideal solution, the performances of lean and non-lean supply chain situations for three different apparel products were evaluated. To address the research questions relating to an effective performance evaluation method and the effects of lean tools on different types of supply chains, a conceptual framework and two hypotheses were investigated.

Empirical results show that the implementation of lean tools has significant effects on performance improvements in terms of time, quality and flexibility. The fuzzy TOPSIS-based method developed is able to integrate multiple supply chain metrics into a single performance measure, while the lean supply chain model incorporates qualitative and quantitative metrics. It can therefore effectively measure the improvements in a supply chain after implementing lean tools. It is demonstrated that the product types involved in the supply chain and the ability to select the right lean tools have a significant effect on lean supply chain performance. Future studies could conduct multiple case studies in different contexts.
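As an illustration of the fuzzy TOPSIS ranking step only (the SCOR-based metric hierarchy, similarity rules and company data are not reproduced), the Python sketch below ranks three hypothetical alternatives described by triangular fuzzy numbers via their closeness coefficients.

```python
# Minimal fuzzy TOPSIS sketch with triangular fuzzy numbers (l, m, u).
# Scores, criteria and weights are hypothetical; all criteria are treated as
# benefit criteria for brevity.
import numpy as np

# alternatives x criteria x (l, m, u): e.g. linguistic ratings for
# time, quality and flexibility converted to triangular fuzzy numbers
X = np.array([[[5, 7, 9], [3, 5, 7], [7, 9, 9]],
              [[3, 5, 7], [5, 7, 9], [5, 7, 9]],
              [[7, 9, 9], [7, 9, 9], [3, 5, 7]]], dtype=float)
W = np.array([[0.2, 0.3, 0.4],      # fuzzy weight per criterion, as (l, m, u)
              [0.3, 0.4, 0.5],
              [0.3, 0.4, 0.5]])

# linear normalisation for benefit criteria: divide by the column-wise max upper bound
u_max = X[:, :, 2].max(axis=0)
R = X / u_max[None, :, None]
V = R * W[None, :, :]                    # weighted normalised fuzzy matrix

def vertex_dist(a, b):
    """Vertex distance between triangular fuzzy numbers."""
    return np.sqrt(np.mean((a - b) ** 2, axis=-1))

fpis = np.ones(3)                        # fuzzy positive ideal solution (1, 1, 1)
fnis = np.zeros(3)                       # fuzzy negative ideal solution (0, 0, 0)
d_plus = vertex_dist(V, fpis).sum(axis=1)
d_minus = vertex_dist(V, fnis).sum(axis=1)
cc = d_minus / (d_plus + d_minus)        # closeness coefficient per alternative
print(np.argsort(-cc))                   # alternatives ranked best to worst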

Relevance:

20.00%

Publisher:

Abstract:

Online dating websites enable a specific form of social networking, and their efficiency can be increased by supporting proactive recommendations based on participants' preferences with the use of data mining. This research develops two-way recommendation methods for people-to-people recommendation in large online social networks such as online dating networks. It identifies the characteristics of online dating networks and utilises these characteristics in developing efficient people-to-people recommendation methods. The methods developed improve recommendation accuracy, can handle the data sparsity that often comes with large data sets, and are scalable to online networks with a large number of users.
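The thesis's recommendation methods are not reproduced here. As one common way of making a score two-way, the sketch below combines the preference satisfaction in each direction with a harmonic mean, so that one-sided interest scores poorly; the profiles, preference fields and matching rule are all hypothetical.

```python
# Illustrative two-way (reciprocal) score, not the thesis's method: each
# direction's compatibility is the fraction of a user's stated preferences
# satisfied by the candidate, and the two directions are combined with a
# harmonic mean so that one-sided interest scores poorly.
def directional_score(prefs, candidate_profile):
    """Fraction of the user's preferences satisfied by the candidate."""
    matched = sum(1 for key, accepted in prefs.items()
                  if candidate_profile.get(key) in accepted)
    return matched / max(len(prefs), 1)

def reciprocal_score(user_a, user_b):
    s_ab = directional_score(user_a["prefs"], user_b["profile"])
    s_ba = directional_score(user_b["prefs"], user_a["profile"])
    if s_ab == 0 or s_ba == 0:
        return 0.0
    return 2 * s_ab * s_ba / (s_ab + s_ba)      # harmonic mean

alice = {"profile": {"age_band": "25-34", "city": "Brisbane"},
         "prefs": {"age_band": {"25-34", "35-44"}, "city": {"Brisbane"}}}
bob = {"profile": {"age_band": "35-44", "city": "Brisbane"},
       "prefs": {"age_band": {"25-34"}, "city": {"Brisbane", "Sydney"}}}
print(reciprocal_score(alice, bob))
```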

Relevance:

20.00%

Publisher:

Abstract:

A crucial task in contractor prequalification is to establish a set of decision criteria through which the capabilities of contractors are measured and judged. However, in the UK, there are no nationwide standards or guidelines governing the selection of decision criteria for contractor prequalification. The decision criteria are usually established by individual clients on an ad hoc basis. This paper investigates the divergence of decision criteria used by different client and consultant organisations in contractor prequalification through a large empirical survey conducted in the UK. The results indicate that there are significant differences in the selection and use of decision criteria for prequalification.

Relevance:

20.00%

Publisher:

Abstract:

This thesis presents a multi-criteria optimisation study of group replacement schedules for water pipelines, a capital-intensive and service-critical decision. A new mathematical model was developed which minimises total replacement costs while maintaining a satisfactory level of service. The research outcomes are expected to enrich the body of knowledge on multi-criteria decision optimisation where group scheduling is required. The model has the potential to optimise replacement planning for other types of linear asset networks, resulting in bottom-line benefits for end users and communities. The results of a real case study show that the new model can effectively reduce total costs and service interruptions.
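The thesis's mathematical model is not reproduced here. As an illustration of the trade-off it addresses, the sketch below scores hypothetical candidate group schedules against a weighted sum of replacement cost (grouping shares a fixed mobilisation cost) and service interruption time; all costs, durations and weights are hypothetical.

```python
# Illustrative sketch only, not the thesis's model: scoring candidate group
# replacement schedules for a small set of pipes against two criteria --
# total replacement cost and total service interruption time.
pipes = {"P1": {"cost": 40, "interrupt_h": 6},
         "P2": {"cost": 55, "interrupt_h": 8},
         "P3": {"cost": 30, "interrupt_h": 5}}
MOBILISATION = 20        # fixed cost paid once per group of works (hypothetical)
W_COST, W_INTERRUPT = 0.7, 0.3

def score(schedule):
    """schedule: list of groups, each group a tuple of pipe ids; lower is better."""
    cost = sum(MOBILISATION + sum(pipes[p]["cost"] for p in grp) for grp in schedule)
    # assume pipes in a group are replaced during one concurrent shutdown
    interrupt = sum(max(pipes[p]["interrupt_h"] for p in grp) for grp in schedule)
    return W_COST * cost + W_INTERRUPT * interrupt

candidates = [[("P1",), ("P2",), ("P3",)],       # replace individually
              [("P1", "P3"), ("P2",)],           # group two nearby pipes
              [("P1", "P2", "P3")]]              # one combined shutdown
print(min(candidates, key=score))
```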

Relevance:

20.00%

Publisher:

Abstract:

There is little disagreement that quality teaching is essential to student achievement and well-being. Whilst much has been written about the importance of quality teaching, including its link to pre-service teacher education, to date there has been little investigation into specific pedagogical practices that can enhance quality teaching dimensions within a pre-service teacher education programme. This paper reports on a small-scale qualitative research study, undertaken in an Australian university, which linked the fields of quality teaching, pre-service teacher education and values education. The study followed the journey of five pre-service teacher education students as they undertook their second field experience unit, in which the focus was on the values-based pedagogy of Philosophy in the Classroom. The research findings, collected via interviews, demonstrated that an explicit values-based pedagogy can have a positive impact on the development of quality teaching dimensions. This new knowledge has potential for further research examining the ways quality teaching dimensions are gained and practised by pre-service teacher education students, and these findings and recommendations are discussed in this paper.