127 results for Explicit criteria


Relevance: 20.00%

Abstract:

Hazard and reliability prediction of an engineering asset is a significant field of research in Engineering Asset Health Management (EAHM). In real-life situations, where an engineering asset operates under dynamic operational and environmental conditions, its lifetime can be influenced and/or indicated by different factors, termed covariates. The Explicit Hazard Model (EHM), a covariate-based hazard model, is a new approach to hazard prediction that explicitly incorporates both internal and external covariates into one model. EHM is an appropriate model for analysing lifetime data in the presence of both internal and external covariates in the reliability field. This paper presents applications of the methodology introduced and illustrated in the theoretical part of this study: the semi-parametric EHM is applied to a case study to predict the hazard and reliability of resistance elements on a Resistance Corrosion Sensor Board (RCSB).
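The abstract leaves EHM's exact functional form to the theoretical part of the study. Purely as a hedged illustration of the covariate-based hazard idea, the sketch below combines a Weibull baseline hazard with a log-linear covariate link (a proportional-hazards-style form, not necessarily the authors' EHM) and integrates it numerically to obtain a reliability curve; all parameter values are invented:

```python
import math

def weibull_baseline_hazard(t, shape, scale):
    """Weibull baseline hazard h0(t) = (shape/scale) * (t/scale)**(shape - 1)."""
    return (shape / scale) * (t / scale) ** (shape - 1)

def covariate_hazard(t, covariates, coeffs, shape=1.5, scale=100.0):
    """Proportional-hazards-style combination of a Weibull baseline with a
    log-linear covariate term: h(t|z) = h0(t) * exp(sum(b_i * z_i))."""
    link = math.exp(sum(b * z for b, z in zip(coeffs, covariates)))
    return weibull_baseline_hazard(t, shape, scale) * link

def reliability(t, covariates, coeffs, shape=1.5, scale=100.0, steps=1000):
    """R(t) = exp(-integral_0^t h(s) ds), via trapezoidal integration."""
    dt = t / steps
    H = 0.0
    for i in range(steps):
        s0, s1 = i * dt + 1e-9, (i + 1) * dt
        H += 0.5 * (covariate_hazard(s0, covariates, coeffs, shape, scale)
                    + covariate_hazard(s1, covariates, coeffs, shape, scale)) * dt
    return math.exp(-H)
```

With all coefficients zero this reduces to the plain Weibull reliability exp(-(t/scale)**shape), which gives a quick sanity check on the integration.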

Relevance: 20.00%

Abstract:

This paper reports the application of the multi-criteria decision-making techniques PROMETHEE and GAIA, and the receptor models PCA/APCS and PMF, to data from an air monitoring site located on the campus of Queensland University of Technology in Brisbane, Australia, operated by the Queensland Environmental Protection Agency (QEPA). The data consisted of the concentrations of 21 chemical species and meteorological data collected between 1995 and 2003. PROMETHEE/GAIA separated the samples into those collected when leaded and unleaded petrol were used to power vehicles in the region. The number and source profiles of the factors obtained from the PCA/APCS and PMF analyses were compared, and there are noticeable differences in the outcomes, possibly because of the non-negativity constraints imposed in the PMF analysis: PCA/APCS identified 6 sources, whereas PMF resolved 9 factors. Each factor had a distinctive composition suggesting that motor vehicle emissions, controlled burning of forests, secondary sulphate, sea salt and road dust/soil were the most important sources of fine particulate matter at the site. The most plausible locations of the sources were identified by combining the results obtained from the receptor models with the meteorological data. The study demonstrated the potential benefits of combining results from multi-criteria decision-making analysis with those from receptor models to gain insights that could enhance the development of air pollution control measures.
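To illustrate the factor-analytic step that PCA-based receptor modelling starts from, here is a self-contained sketch on synthetic "samples x species" data; the two source profiles are invented (not the QEPA data), and the APCS regression step is omitted:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic "samples x chemical species" matrix: two hidden source
# profiles (e.g. vehicle exhaust vs sea salt) mixed with small noise.
profiles = np.array([[5.0, 1.0, 0.2, 0.0],
                     [0.1, 0.2, 3.0, 4.0]])
contributions = rng.uniform(0, 1, size=(200, 2))
X = contributions @ profiles + rng.normal(0, 0.05, size=(200, 4))

# PCA: centre the data, then eigendecompose the species covariance.
Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)          # ascending order
order = np.argsort(eigvals)[::-1]               # flip to descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()
# Two components should capture nearly all variance for two sources.
print(round(explained[:2].sum(), 3))
```

Receptor models then interpret the retained components (or PMF factors) as candidate emission sources via their species loadings.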

Relevance: 20.00%

Abstract:

1. Ecological data sets often involve clustered measurements or repeated sampling in a longitudinal design. Choosing the correct covariance structure is an important step in the analysis of such data, as the covariance describes the degree of similarity among the repeated observations.
2. Three methods for choosing the covariance structure are the Akaike information criterion (AIC), the quasi-information criterion (QIC) and the deviance information criterion (DIC). We compared the methods using a simulation study and a data set that explored the effects of forest fragmentation on avian species richness over 15 years.
3. The overall success rate was 80.6% for the AIC, 29.4% for the QIC and 81.6% for the DIC. For the forest fragmentation study, the AIC and DIC selected the unstructured covariance, whereas the QIC selected the simpler autoregressive covariance. Graphical diagnostics suggested that the unstructured covariance was probably correct.
4. We recommend using the DIC for selecting the covariance structure.
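As a reminder of what such criteria compute, here is a minimal sketch of AIC-based selection between two covariance structures; the log-likelihoods and parameter counts are invented for illustration (QIC and DIC need the fitted quasi-likelihood and posterior deviance respectively, so only the AIC is shown):

```python
def aic(log_likelihood, n_params):
    """Akaike information criterion: AIC = 2k - 2*lnL (lower is better)."""
    return 2 * n_params - 2 * log_likelihood

# Hypothetical fits of the same repeated-measures data under two
# covariance structures: an AR(1) covariance (2 parameters) and a
# fully unstructured covariance over 5 repeats (15 parameters).
fits = {
    "ar1":          {"loglik": -512.3, "k": 2},
    "unstructured": {"loglik": -498.1, "k": 15},
}
scores = {name: aic(f["loglik"], f["k"]) for name, f in fits.items()}
best = min(scores, key=scores.get)
print(best, {k: round(v, 1) for k, v in scores.items()})
```

The penalty term 2k is what stops the richer unstructured covariance from winning automatically: it must improve the log-likelihood by more than its extra parameters cost.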

Relevance: 20.00%

Abstract:

In this paper, we consider the variable-order nonlinear fractional diffusion equation
$$\frac{\partial u(x,t)}{\partial t} = B(x,t)\, {}_{x}R^{\alpha(x,t)} u(x,t) + f(u,x,t),$$
where ${}_{x}R^{\alpha(x,t)}$ is a generalized Riesz fractional derivative of variable order $\alpha(x,t)$ and the nonlinear reaction term $f(u,x,t)$ satisfies the Lipschitz condition $|f(u_1,x,t) - f(u_2,x,t)| \le L\,|u_1 - u_2|$. A new explicit finite-difference approximation is introduced. The convergence and stability of this approximation are proved. Finally, some numerical examples are provided to show that this method is computationally efficient. The proposed method and techniques are applicable to other variable-order nonlinear fractional differential equations.
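As an illustration of the Grunwald-Letnikov machinery that explicit schemes for Riesz fractional diffusion are typically built on, the sketch below approximates a constant-order Riesz derivative and takes one explicit Euler step. It is not the paper's scheme: the paper handles variable order alpha(x,t) and proves stability and convergence, none of which is reproduced here.

```python
import math

def gl_weights(alpha, n):
    """Grunwald-Letnikov weights g_k = (-1)^k * binom(alpha, k),
    via the recurrence g_0 = 1, g_k = g_{k-1} * (1 - (alpha + 1)/k)."""
    w = [1.0]
    for k in range(1, n + 1):
        w.append(w[-1] * (1.0 - (alpha + 1.0) / k))
    return w

def explicit_step(u, dt, dx, alpha, f):
    """One explicit Euler step for u_t = d^alpha u / d|x|^alpha + f(u),
    with the Riesz derivative approximated by shifted Grunwald-Letnikov
    sums from both sides; boundary values are held fixed.  For alpha = 2
    this collapses to the classical second central difference."""
    n = len(u)
    w = gl_weights(alpha, n)
    coef = -1.0 / (2.0 * math.cos(math.pi * alpha / 2.0) * dx ** alpha)
    new = list(u)
    for i in range(1, n - 1):
        left = sum(w[k] * u[i - k + 1] for k in range(i + 2))
        right = sum(w[k] * u[i + k - 1] for k in range(n - i + 1))
        new[i] = u[i] + dt * (coef * (left + right) + f(u[i]))
    return new
```

Checking the alpha = 2 limit against the ordinary heat-equation stencil is a convenient sanity test for the weights and the normalisation constant.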

Relevance: 20.00%

Abstract:

In this paper, technology is described as involving processes whereby resources are utilised to satisfy human needs, to take advantage of opportunities, and to develop practical solutions to problems. This study, set within one type of technology context, information technology, investigated how, through a one-semester undergraduate university course, elements of technological processes were made explicit to students. While the development and implementation of this course acknowledged that students needed to learn technical skills, technological skills and knowledge, including design, were also seen as vital, to enable students to think about information technology from a perspective not confined to 'technology as hardware and software'. This paper describes how the course, set within a three-year program of study, aimed to help students develop their thinking and knowledge about design processes in an explicit way. An interpretive research approach was used, and data sources included a repertory grid 'survey'; student interviews; video recordings of classroom interactions; audio recordings of lectures; observations of classroom interactions made by researchers; and artefacts, which included students' journals and portfolios. The development of students' knowledge about design practices is discussed, and reflections are made on that development in conjunction with the students' learning experiences. Implications for ensuring explicitness of design practice within information technology contexts are presented, and the need to identify what constitutes design knowledge is argued.

Relevance: 20.00%

Abstract:

Most online assessment systems now incorporate social networking features, and recent developments in social media spaces include protocols that allow the synchronisation and aggregation of data across multiple user profiles. In light of these advances, and the concomitant fear of data sharing in secondary school education, this paper provides important research findings about generic features of online social networking that educators can use to make sound and efficient assessments in collaboration with their students and colleagues. The paper reports on a design experiment in flexible educational settings that challenges the dichotomous legacy of success and failure evident in many assessment activities for at-risk youth. Combining social networking practices with the sociology of education, the paper proposes that assessment activities are best understood as a negotiable field of exchange. In this design experiment, students, peers and educators engage in explicit, "front-end" assessment (Wyatt-Smith, 2008) to translate digital artefacts into institutional, and potentially economic, capital without continually referring to paper-based pre-set criteria. This approach invites students and educators to use social networking functions to assess "work in progress" and final submissions in collaboration; in doing so, assessors refine their evaluative expertise and negotiate the value of students' work, from which new criteria can emerge. The mobile advantages of web-based technologies aggregate, externalise and democratise this transparent assessment model for most, if not all, student work that can be digitally represented.

Relevance: 20.00%

Abstract:

The notion of pedagogy is innocuous for anyone in the teaching profession. The term itself is steeped in history, but the details of the practice can be elusive. What does it mean for an academic to be embracing pedagogy? The problem is not limited to academics; most teachers baulk at the introduction of a pedagogic agenda and resist attempts to have them reflect on their classroom teaching practice, wherever that classroom might be constituted. This paper explores the application of a pedagogic model (Education Queensland, 2001) which was developed in the context of primary and secondary teaching and was part of a schooling agenda to improve pedagogy. As a teacher educator, I introduced the model to classroom teachers (Hill, 2002) using an Appreciative Inquiry model (Cooperrider and Srivastva, 1987) and at the same time applied the model to my own pedagogy as an academic. Despite its being instigated as a model for classroom teachers, I found through my own practitioner investigation that the model was useful for exploring my own pedagogy as a university academic (Hill, 2007, 2008).
References:
Cooperrider, D.L. and Srivastva, S. (1987) Appreciative inquiry in organisational life, in Passmore, W. and Woodman, R. (Eds) Research in Organisational Changes and Development (Vol. 1). Greenwich, CT: JAI Press, pp. 129-69.
Education Queensland (2001) School Reform Longitudinal Study (QSRLS). Brisbane: Queensland Government.
Hill, G. (2002, December) Reflecting on professional practice with a cracked mirror: Productive Pedagogy experiences. Australian Association for Research in Education Conference, Brisbane, Australia.
Hill, G. (2007) Making the assessment criteria explicit through writing feedback: A pedagogical approach to developing academic writing. International Journal of Pedagogies and Learning 3(1), 59-66.
Hill, G. (2008) Supervising Practice Based Research. Studies in Learning, Evaluation, Innovation and Development 5(4), 78-87.

Relevance: 20.00%

Abstract:

Identification of hot spots, also known as sites with promise, black spots, accident-prone locations, or priority investigation locations, is an important and routine activity for improving the overall safety of roadway networks. Extensive literature focuses on methods for hot spot identification (HSID), and a subset of this considerable literature is dedicated to performance assessments of the various HSID methods. A central issue in comparing HSID methods is the development and selection of quantitative and qualitative performance measures or criteria. The authors contend that the currently employed HSID assessment criteria, namely false positives and false negatives, are necessary but not sufficient, and that additional criteria are needed to exploit the ordinal nature of site-ranking data. To equip road safety professionals and researchers with more useful tools for comparing HSID methods and to raise the level of HSID assessments, this paper proposes four quantitative HSID evaluation tests that are, to the authors' knowledge, new and unique. These tests evaluate different aspects of HSID method performance, including reliability of results, ranking consistency, and false identification consistency and reliability. Road safety professionals can apply these evaluation tests, in addition to existing ones, to compare the performance of candidate HSID methods and then select the most appropriate method for screening road networks to identify sites that require further analysis. The work demonstrates the four new criteria using 3 years of Arizona road-section accident data and four commonly applied HSID methods: accident frequency ranking, accident rate ranking, accident reduction potential, and empirical Bayes (EB). The EB method proves superior in most of the evaluation tests, whereas identifying hot spots by accident rate ranking performs worst; the accident frequency and accident reduction potential methods perform similarly, with slight differences explained. The authors believe the four new evaluation tests offer insight into HSID performance heretofore unavailable to analysts and researchers.
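The empirical Bayes method weighs a site's observed crash count against the prediction of a safety performance function, shrinking noisy counts toward the model mean before ranking. Below is a minimal sketch of the standard EB estimate (Hauer's form; the abstract does not spell this out), with made-up site data and overdispersion parameter:

```python
def eb_estimate(observed, predicted, overdispersion):
    """Empirical Bayes expected crash frequency for one site:
    EB = w * mu + (1 - w) * x,  with  w = 1 / (1 + mu / phi),
    where mu is the safety-performance-function prediction, x the
    observed count and phi the negative-binomial overdispersion."""
    w = 1.0 / (1.0 + predicted / overdispersion)
    return w * predicted + (1.0 - w) * observed

sites = [  # (site id, observed crashes, SPF-predicted crashes)
    ("A", 12, 4.0), ("B", 3, 5.0), ("C", 9, 9.5), ("D", 1, 0.8),
]
phi = 2.0
ranked = sorted(sites, key=lambda s: eb_estimate(s[1], s[2], phi), reverse=True)
hot_spots = [s[0] for s in ranked[:2]]
print(hot_spots)
```

Note how site A, whose observed count far exceeds its prediction, is pulled down toward the model mean yet still outranks site C; a plain frequency ranking would order them the same way here, but the two rankings diverge exactly where regression-to-the-mean bias matters.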

Relevance: 20.00%

Abstract:

Survival probability prediction using a covariate-based hazard approach is a known statistical methodology in engineering asset health management. We have previously reported the semi-parametric Explicit Hazard Model (EHM), which incorporates three types of information for hazard prediction: population characteristics, condition indicators, and operating environment indicators. That model assumes the baseline hazard has the form of the Weibull distribution. To avoid this assumption, this paper presents the non-parametric EHM, a distribution-free covariate-based hazard model. An application of the non-parametric EHM is demonstrated via a case study in which survival probabilities of a set of resistance elements predicted by the non-parametric EHM are compared with those of the Weibull proportional hazards model and the traditional Weibull model. The results show that the non-parametric EHM can effectively predict asset life using the condition indicator, operating environment indicator, and failure history.
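For contrast with the Weibull-based variants, a distribution-free survival estimate can be computed directly from failure history. The sketch below uses the standard Kaplan-Meier product-limit estimator as a stand-in for a non-parametric baseline; it is not the authors' non-parametric EHM (which also folds in condition and operating environment indicators), and the failure times are hypothetical:

```python
def kaplan_meier(event_times, censored=()):
    """Distribution-free survival estimate S(t) built from failure
    history alone -- the kind of baseline a non-parametric model avoids
    tying to a Weibull form.  Returns a list of (time, S(t)) steps;
    `censored` times only reduce the risk set."""
    points = sorted(set(event_times))
    all_times = sorted(list(event_times) + list(censored))
    s, out = 1.0, []
    for t in points:
        at_risk = sum(1 for x in all_times if x >= t)
        deaths = sum(1 for x in event_times if x == t)
        s *= 1.0 - deaths / at_risk   # product-limit update at each failure
        out.append((t, s))
    return out

# Hypothetical failure times (hours) for a set of resistance elements:
steps = kaplan_meier([10, 14, 14, 20, 25], censored=[18, 30])
print(steps)
```

Each factor (1 - d/n) drops the survival curve only at observed failure times, so no distributional shape is ever assumed.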

Relevance: 20.00%

Abstract:

In this paper, weighted fair rate allocation for the ATM available bit rate (ABR) service is discussed with attention to the minimum cell rate (MCR). Weighted fairness with an MCR guarantee has been discussed recently in the literature. In those studies, each ABR virtual connection (VC) is first allocated its MCR, and the remaining available bandwidth is then shared among the ABR VCs according to their weights. For the weighted fairness defined in this paper, bandwidth is first allocated according to each VC's weight; if a VC's weighted share is less than its MCR, it is allocated its MCR instead of the weighted share. This weighted fairness with an MCR guarantee is referred to as extended weighted (EXW) fairness. Certain theoretical issues related to EXW, such as its global solution and bottleneck structure, are first discussed. A distributed explicit rate allocation algorithm is then proposed to achieve EXW fairness in ATM networks. It is a general-purpose explicit rate algorithm in the sense that it can realise almost all the fairness principles proposed for ABR so far, while only minor modifications may be needed.
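The EXW rule described above (weight-proportional shares, lifted to the MCR wherever the weighted share falls short) can be sketched for a single link. The iterative water-filling below is an illustrative centralised computation, not the paper's distributed explicit rate algorithm, and the link capacity and VC parameters are made up:

```python
def exw_allocation(capacity, vcs):
    """Extended weighted (EXW) fair shares on one link: allocate in
    proportion to weight, pin any VC whose weighted share is below its
    MCR at that MCR, and re-share the remaining bandwidth among the
    others until no further VC needs lifting.  vcs: (name, weight, mcr)."""
    alloc = {}
    pinned = {}  # VCs fixed at their MCR
    while True:
        free = capacity - sum(pinned.values())
        active = [(n, w, m) for n, w, m in vcs if n not in pinned]
        total_w = sum(w for _, w, _ in active)
        changed = False
        for n, w, m in active:
            if free * w / total_w < m:   # weighted share below MCR: pin it
                pinned[n] = m
                changed = True
        if not changed:
            for n, w, m in active:
                alloc[n] = free * w / total_w
            alloc.update(pinned)
            return alloc

# Hypothetical 100 Mb/s link with three VCs (name, weight, MCR in Mb/s):
alloc = exw_allocation(100.0, [("vc1", 1, 5.0), ("vc2", 1, 40.0), ("vc3", 2, 10.0)])
print(alloc)
```

Here vc2's weighted share (25) falls below its MCR (40), so it is pinned at 40 and the remaining 60 is split 1:2 between vc1 and vc3; the earlier MCR-first schemes would instead have shared only capacity-minus-all-MCRs by weight.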