972 results for evaluation methodologies
Abstract:
Public road authorities have a key responsibility in driving initiatives to reduce greenhouse gas (GHG) emissions across the road construction project lifecycle. A coherent and efficient chain of procurement processes and methods is needed to convert green policies into tangible actions that capture the potential for GHG reduction. Yet many infrastructure clients lack developed methodologies for green procurement practices. Designing more efficient solutions for green procurement requires an evaluation of the current initiatives and their stages of development. This paper presents a mapping of current GHG reduction initiatives in Australian public road procurement. The study includes the five largest Australian state road authorities, which cover 94% of the total 817,089 km of Australian main roads (excluding local roads) and account for 96% of the total A$13 billion annual major road construction and maintenance expenditure. The state road authorities’ green procurement processes and tools are evaluated based on interviews and a review of documents. Altogether 12 people, one to three from each organisation, participated in the interviews and provided documents. An evaluation matrix was developed for mapping the findings across the lifecycle of road construction project delivery. The results show how Australian state road authorities drive decisions with an impact on GHG emissions in the strategic planning, project development and project implementation phases. The road authorities demonstrate varying levels of advancement in their green procurement methodologies. Six major gaps in the current green procurement processes are identified, and six corresponding recommendations for future research and development are made. The greatest gaps remain in the project development phase, which has a critical role in setting the project’s GHG reduction goals, identifying risks and opportunities, and selecting the contractor to deliver the project. Specifically, the role of mass-haul optimisation as part of GHG minimisation was reviewed, and mass-haul management was found to be an underutilised element with GHG reduction potential.
Abstract:
Existing compliance management frameworks (CMFs) offer a multitude of compliance management capabilities, which makes it difficult for enterprises to decide on the suitability of a given framework. Deciding on suitability requires a deep understanding of the framework's functionality, and gaining such an understanding is itself a difficult task that requires specialised evaluation tools and methodologies. Current compliance research lacks such tools and methodologies for evaluating CMFs. This paper reports a methodological evaluation of existing CMFs against pre-defined evaluation criteria. Our evaluation highlights what existing CMFs can offer and what they cannot. It also identifies various open questions and discusses the challenges in this direction.
Abstract:
There is a growing trend to offer students learning opportunities that are flexible, innovative and engaging. As educators embrace student-centred agile teaching and learning methodologies, which require continuous reflection and adaptation, the need to evaluate students’ learning in a timely manner has become more pressing. Conventional evaluation surveys currently dominate the evaluation landscape internationally, despite recognition that they are insufficient to effectively evaluate curriculum and teaching quality. Surveys often: (1) fail to address the issues for which educators need feedback, (2) constrain student voice, (3) have low response rates and (4) occur too late to benefit current students. Consequently, this paper explores principles of effective feedback to propose a framework for learner-focused evaluation. We apply a three-stage control model, involving feedforward, concurrent and feedback evaluation, to investigate the intersection of assessment and evaluation in agile learning environments. We conclude that learner-focused evaluation cycles can be used to guide action so that evaluation is not undertaken simply for the benefit of future offerings, but rather to benefit current students by allowing ‘real-time’ learning activities to be adapted in the moment. As a result, students become co-producers of learning and evaluation becomes a meaningful, responsive dialogue between students and their instructors.
Abstract:
Existing crowd counting algorithms rely on holistic, local or histogram-based features to capture crowd properties; regression is then employed to estimate the crowd size. Insufficient testing across multiple datasets has made it difficult to compare and contrast different methodologies. This paper presents an evaluation across multiple datasets to compare holistic, local and histogram-based methods, and to compare various image features and regression models. A K-fold cross-validation protocol is followed to evaluate performance across five public datasets: UCSD, PETS 2009, Fudan, Mall and Grand Central. Image features are categorised into five types: size, shape, edges, keypoints and textures. The regression models evaluated are Gaussian process regression (GPR), linear regression, K nearest neighbours (KNN) and neural networks (NN). The results demonstrate that local features outperform equivalent holistic and histogram-based features; that optimal performance is observed using all image features except textures; and that GPR outperforms linear, KNN and NN regression.
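A minimal sketch of the evaluation protocol this abstract describes, assuming scikit-learn and a pre-computed feature matrix; the random data stands in for extracted size/shape/edge/keypoint features and frame-level crowd counts, which are hypothetical placeholders here.

```python
# Minimal sketch: K-fold comparison of the four regression models named
# above for crowd counting. Features and counts are synthetic stand-ins.
import numpy as np
from sklearn.model_selection import KFold, cross_val_score
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.linear_model import LinearRegression
from sklearn.neighbors import KNeighborsRegressor
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.random((200, 30))                     # placeholder image features
y = rng.integers(0, 50, 200).astype(float)    # placeholder crowd counts

models = {
    "GPR": GaussianProcessRegressor(),
    "Linear": LinearRegression(),
    "KNN": KNeighborsRegressor(n_neighbors=5),
    "NN": MLPRegressor(hidden_layer_sizes=(64,), max_iter=2000, random_state=0),
}

cv = KFold(n_splits=5, shuffle=True, random_state=0)
for name, model in models.items():
    mae = -cross_val_score(model, X, y, cv=cv,
                           scoring="neg_mean_absolute_error")
    print(f"{name}: MAE = {mae.mean():.2f} +/- {mae.std():.2f}")
```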
Abstract:
This paper presents three methodologies for determining optimum locations and magnitudes of reactive power compensation in power distribution systems. Methods I and II are suitable for complex distribution systems with a combination of radial and ring-main feeders at different voltage levels. Method III is suitable for low-tension, single-voltage-level radial feeders. Method I is based on an iterative scheme with successive power-flow analyses, with the optimization problem formulated and solved using linear programming. Methods II and III are essentially based on the steady-state performance of distribution systems. These methods are simple to implement and yield satisfactory results comparable with those of Method I. The proposed methods have been applied to several distribution systems, and results obtained for two typical systems are presented for illustration.
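The linear-programming step of a Method I-style iteration can be illustrated with a small sketch; the loss-sensitivity coefficients, bus limits and kvar budget below are hypothetical, and a real implementation would recompute the sensitivities from each successive power-flow solution.

```python
# Illustrative LP step for reactive power compensation (hypothetical data):
# choose capacitor sizes q[i] at candidate buses to maximise loss reduction
# (minimise its negative) subject to a total-kvar budget and per-bus limits.
import numpy as np
from scipy.optimize import linprog

# dLoss/dq: loss reduction per kvar at each candidate bus, fixed here;
# an iterative scheme would refresh these after every power-flow run.
sensitivity = np.array([0.031, 0.024, 0.018, 0.040])
budget = 1200.0                                       # total kvar available
q_max = np.array([600, 600, 400, 500], dtype=float)   # per-bus limits

res = linprog(
    c=-sensitivity,                   # maximise sensitivity . q
    A_ub=np.ones((1, 4)), b_ub=[budget],
    bounds=list(zip(np.zeros(4), q_max)),
    method="highs",
)
print("kvar per bus:", res.x)
```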
Abstract:
Modeling the distributions of species, especially of invasive species in non-native ranges, involves multiple challenges. Here, we developed novel approaches to species distribution modeling aimed at reducing the influence of these challenges and improving the realism of projections. We estimated species-environment relationships with four modeling methods run under multiple scenarios of (1) sources of occurrences and geographically isolated background ranges for absences, (2) approaches to drawing background (absence) points, and (3) alternate sets of predictor variables. We further tested various quantitative metrics of model evaluation against biological insight. Model projections were very sensitive to the choice of training dataset. Model accuracy was much improved by using a global dataset for model training, rather than restricting data input to the species’ native range. The AUC score was a poor metric for model evaluation and, if used alone, was not a useful criterion for assessing model performance. Projections away from the sampled space (i.e. into areas of potential future invasion) differed greatly depending on the modeling method used, raising questions about the reliability of ensemble projections. Generalized linear models gave very unrealistic projections far from the training region. Models that efficiently fit the dominant pattern but exclude highly local patterns in the dataset and capture interactions as they appear in the data (e.g. boosted regression trees) generalized better. Biological knowledge of the species and its distribution was important in refining choices about the best set of projections. A post-hoc test conducted on a new Parthenium dataset from Nepal validated the excellent predictive performance of our “best” model. We showed that vast stretches of currently uninvaded geographic areas on multiple continents harbor highly suitable habitats for Parthenium hysterophorus L. (Asteraceae; parthenium). However, discrepancies between model predictions and parthenium invasion in Australia indicate successful management of this globally significant weed.
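A minimal sketch of the kind of comparison the abstract reports: a GLM-style logistic regression versus boosted trees, scored with AUC alongside a second metric so AUC is not used alone. The synthetic presence/background data and predictor names are hypothetical, not the paper's dataset or methods.

```python
# Sketch: GLM vs boosted-trees comparison on synthetic presence/background
# data, reporting AUC plus a complementary calibration metric (Brier score).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score, brier_score_loss

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 6))                 # placeholder climate predictors
y = (X[:, 0] + 0.5 * X[:, 1] ** 2              # nonlinear "true" response
     + rng.normal(scale=0.5, size=1000)) > 0   # presence/background labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)
for name, model in [("GLM", LogisticRegression(max_iter=1000)),
                    ("BRT", GradientBoostingClassifier(random_state=1))]:
    p = model.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
    print(f"{name}: AUC={roc_auc_score(y_te, p):.3f} "
          f"Brier={brier_score_loss(y_te, p):.3f}")
```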
Abstract:
This paper presents methodologies for residual strength evaluation of concrete structural components using linear elastic and nonlinear fracture mechanics principles. The effect of cohesive forces due to aggregate bridging is represented mathematically by employing tension softening models. Various tension softening models, namely linear, bilinear, trilinear, exponential and power-curve models, are described with appropriate expressions. These models are validated by predicting the remaining life of concrete structural components and comparing the predictions with the corresponding experimental values available in the literature. The remaining life predicted using the power model and the modified bilinear model is in good agreement with the corresponding experimental values. Residual strength has also been predicted using these tension softening models, and the predicted residual strength is in good agreement with the corresponding analytical values in the literature. In general, the variation of the predicted residual moment with the chosen tension softening model follows a trend similar to that observed for remaining life: the linear model predicts the largest residual moments, followed by the trilinear, bilinear and power models.
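For concreteness, a sketch of generic forms of three of the softening curves named above, mapping a crack opening w to the cohesive (bridging) stress. The parameter values (tensile strength, critical opening, kink point, exponent) are illustrative assumptions, not the paper's calibrated expressions.

```python
# Generic tension softening curves sigma(w): stress carried across a crack
# of opening w, dropping from the tensile strength f_t to zero at w_c.
# All parameter values are illustrative, not taken from the paper.
import numpy as np

def linear_softening(w, f_t=3.0, w_c=0.16):
    return f_t * np.clip(1.0 - w / w_c, 0.0, None)

def bilinear_softening(w, f_t=3.0, s1=1.0, w1=0.03, w_c=0.16):
    # Two straight segments: (0, f_t) -> (w1, s1) -> (w_c, 0).
    seg1 = f_t + (s1 - f_t) * w / w1
    seg2 = s1 * (w_c - w) / (w_c - w1)
    return np.clip(np.where(w < w1, seg1, seg2), 0.0, None)

def power_softening(w, f_t=3.0, w_c=0.16, n=1.5):
    return f_t * np.clip(1.0 - w / w_c, 0.0, None) ** n

w = np.linspace(0.0, 0.2, 5)
print(linear_softening(w), bilinear_softening(w), power_softening(w))
```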
Abstract:
This paper presents advanced analytical methodologies, namely the double-G and double-K models, for fracture analysis of specimens made of high strength concrete (HSC, HSC1) and ultra high strength concrete (UHSC). Brief details of the characterization and experimentation of HSC, HSC1 and UHSC are provided. The double-G model is based on an energy concept and couples Griffith's brittle fracture theory with the bridging softening property of concrete. The double-K model is based on a stress intensity factor approach. Various fracture parameters, namely the cohesive fracture toughness (K-Ic(c)), unstable fracture toughness (K-Ic(un)) and initiation fracture toughness (K-Ic(ini)), have been evaluated based on linear elastic and nonlinear fracture mechanics principles. Both the double-G and double-K methods use the secant compliance at the peak point of the measured P-CMOD curves to determine the effective crack length. A bilinear tension softening model has been employed to account for cohesive stresses ahead of the crack tip. The studies show that the fracture parameters obtained using the double-G and double-K models are in good agreement with each other. Crack extension resistance has been estimated using the fracture parameters obtained through the double-K model, and its values at the critical unstable point are almost equal to the unstable fracture toughness K-Ic(un) of the materials. The computed fracture parameters will be useful for crack growth studies and for remaining life and residual strength evaluation of concrete structural components.
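The secant-compliance step can be sketched as follows: take the secant compliance at the peak of the P-CMOD curve and invert a compliance-versus-crack-length relation for the effective crack length. The compliance function, specimen dimensions and peak values below are placeholders, not the paper's geometry-specific LEFM expression or measured data.

```python
# Sketch of the secant-compliance step in the double-K method: the secant
# compliance C = CMOD_peak / P_peak is matched against a monotonic C(a)
# relation to recover the effective crack length a_eff.
# The C(a) below is a hypothetical placeholder, not the paper's expression.
from scipy.optimize import brentq

def compliance(a, depth=0.2, thickness=0.1, E=40e9):
    # Illustrative monotonic C(a) that grows as the crack approaches the
    # back face; a real analysis uses the specimen's LEFM geometry function.
    alpha = a / depth
    return (1.0 / (E * thickness)) * alpha / (1.0 - alpha) ** 2

P_peak, cmod_peak = 4.2e3, 0.08e-3    # assumed peak load (N) and CMOD (m)
C_secant = cmod_peak / P_peak          # secant compliance at the peak

a_eff = brentq(lambda a: compliance(a) - C_secant, 1e-4, 0.199)
print(f"effective crack length = {a_eff * 1e3:.1f} mm")
```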
Abstract:
What are the scope and responsibilities of design? This work partially answers this question by employing a normative approach to the design of a biomass cook stove. The study examines the sufficiency of existing design methodologies in the light of the capability approach. A case study of the Astra Ole biomass cook stove elaborates the theoretical constructs of the capability approach, which in turn structures insights from the field to evaluate the product. A capability-approach-based methodology is also used prescriptively to design the mould for rapid dissemination of the Astra Ole.
Abstract:
The importance of quantifying the economic returns to investments in aquatic resources research together with the social, environmental and institutional impacts of such investments is widely recognized among ICLARM's donors, trustees and beneficiaries. As with other Consultative Group on International Agricultural Research (CGIAR) centers, ICLARM is being asked to provide specific accounts of the outputs of its research and their impact on farms and on fisheries, including their socioeconomic impact. Such impact information has become a necessary, though not sufficient, basis for setting priorities and allocating resources for research for the CGIAR centers. This paper discusses the types and methods of impact assessment relevant to ICLARM's work. A three-pronged assessment approach is envisaged to capture the full range of impacts: 1) ex ante assessment for research priority setting; 2) assessment prior to dissemination or adoption along with monitoring and evaluation; and 3) ex post impact assessment. It also discusses the objectives and scope for operational impact assessment of ICLARM's research.
Abstract:
Humans appear to be sensitive to relatively small changes in their surroundings. These changes are often initially perceived as irrelevant, but they can cause significant changes in behavior; exactly how behavior changes, however, is often hard to quantify. A reliable and valid tool is needed to address this question, ideally one that measures an important point of interaction, such as the hand. Wearable body-sensor systems can be used to obtain valuable behavioral information. These systems are particularly useful for assessing the functional interactions that occur between the endpoints of the upper limbs and our surroundings. A new method is explored that computes hand position using a wearable sensor system and validates it against a gold-standard reference measurement (an optical tracking device). Initial outcomes related well to the gold-standard measurements (r = 0.81), with an acceptable average root mean square error of 0.09 m. The approach was then investigated further by measuring differences in motor behavior in response to a changing environment: three subjects were asked to perform a water-pouring task with three slightly different containers. Wavelet analysis was introduced to assess how motor consistency was affected by these small environmental changes. The results showed that behavioral motor adjustments to a variable environment can be assessed by applying wavelet coherence techniques. Applying these procedures in everyday life, combined with appropriate research methodologies, can help quantify how environmental changes alter our motor behavior.
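A minimal sketch of the validation step reported above: agreement between sensor-derived and optical-tracker hand positions, summarised as a Pearson correlation and an RMSE. The synthetic trajectories below are stand-ins for real recordings; the wavelet coherence analysis itself is left to a dedicated library.

```python
# Sketch: agreement between a wearable-sensor position estimate and a
# gold-standard optical trace, via Pearson r and RMSE. Data are synthetic.
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 10, 1000)
optical = 0.3 * np.sin(1.5 * t)                          # reference hand path (m)
sensor = optical + rng.normal(scale=0.09, size=t.size)   # wearable estimate

r = np.corrcoef(sensor, optical)[0, 1]
rmse = np.sqrt(np.mean((sensor - optical) ** 2))
print(f"r = {r:.2f}, RMSE = {rmse:.2f} m")
```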
Abstract:
Yeoman, A. J., Cooper, J. M., Urquhart, C. J. & Tyler, A. (2003). The management of health library outreach services: evaluation and reflection on lessons learned on the VIVOS project. Journal of the Medical Library Association, 91(4), 426-433.
Abstract:
This paper describes the methodologies employed in the collection and storage of first-hand accounts of evacuation experiences, derived from face-to-face interviews with evacuees from the World Trade Center (WTC) Twin Towers complex on 11 September 2001. In particular, the paper describes the development of the High-rise Evacuation Evaluation Database (HEED), a flexible qualitative research tool which contains the full transcribed interview accounts and the coded evacuee experiences extracted from those transcripts. The data and information captured and stored in the HEED database are not only unique but also provide a means to address current and emerging human-factors issues associated with the evacuation of high-rise buildings.
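A minimal sketch of how a coded evacuee account of the kind HEED stores might be represented: a transcript plus coded experience fragments. The field names and codes are hypothetical and do not reflect HEED's actual schema.

```python
# Hypothetical record structure for a coded interview account: a full
# transcript plus coded experience excerpts. Not HEED's actual schema.
from dataclasses import dataclass, field

@dataclass
class CodedExperience:
    code: str            # e.g. "stairwell_congestion" (hypothetical code)
    excerpt: str         # transcript span the code was applied to

@dataclass
class EvacueeAccount:
    interview_id: str
    floor: int
    transcript: str
    codes: list[CodedExperience] = field(default_factory=list)

account = EvacueeAccount("WTC-0001", 64, "Full transcribed interview...")
account.codes.append(CodedExperience("stairwell_congestion",
                                     "the stairs were crowded by floor 40"))
print(account.interview_id, [c.code for c in account.codes])
```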