885 results for fuzzy based evaluation method
Abstract:
A soft linguistic evaluation method based on fuzzy relations is proposed for the environmental assessment of physical infrastructure projects. Infrastructure projects are characterized by linguistic expressions of 'performance' with respect to factors or impacts and of the 'importance' of those factors/impacts. A simple example illustrates the method in the context of three road infrastructure projects assessed against five factors/impacts. In addition, a means of including hard or crisp factors is presented and illustrated with respect to a sixth factor.
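The abstract's core ingredients, linguistic 'performance' ratings combined with linguistic 'importance' weights, can be illustrated with a short sketch. The paper itself works with fuzzy relations; the sketch below substitutes a simpler weighted aggregation of triangular fuzzy numbers, and the linguistic scale, factor weights and project ratings are invented for illustration, not taken from the paper.

```python
# Illustrative sketch: projects receive linguistic 'performance' ratings per
# factor, factors carry linguistic 'importance' weights, and both are mapped
# to triangular fuzzy numbers aggregated into one fuzzy score per project.

# Linguistic terms as triangular fuzzy numbers (low, mode, high) on [0, 1].
TERMS = {
    "poor": (0.0, 0.0, 0.3), "fair": (0.2, 0.5, 0.8), "good": (0.7, 1.0, 1.0),
    "unimportant": (0.0, 0.0, 0.3), "moderate": (0.2, 0.5, 0.8), "critical": (0.7, 1.0, 1.0),
}

def weighted_score(performance, importance):
    """Aggregate fuzzy performance ratings with fuzzy importance weights via
    a weighted average of the triangular parameters (an approximation that
    keeps the result triangular)."""
    perf = [TERMS[p] for p in performance]
    # Use the modal value of each importance term as the crisp weight.
    weights = [TERMS[w][1] for w in importance]
    total = sum(weights) or 1.0
    return tuple(
        sum(w * p[k] for w, p in zip(weights, perf)) / total for k in range(3)
    )

# Three hypothetical road projects rated against five factors/impacts.
importance = ["critical", "moderate", "moderate", "unimportant", "critical"]
for name, ratings in {
    "road A": ["good", "fair", "good", "poor", "fair"],
    "road B": ["fair", "good", "poor", "good", "good"],
    "road C": ["poor", "fair", "fair", "fair", "good"],
}.items():
    low, mode, high = weighted_score(ratings, importance)
    print(f"{name}: fuzzy score = ({low:.2f}, {mode:.2f}, {high:.2f})")
```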
Abstract:
Advances in multiscale material modeling of structural concrete have created an upsurge of interest in the accurate evaluation of the mechanical properties and volume fractions of its nanoscale constituents. The task is accomplished by analyzing the response of a material to indentation, obtained from a nanoindentation experiment, using the Oliver and Pharr (OP) method. Despite its widespread use, the accuracy of this method is often questioned when it is applied to data from heterogeneous materials or from materials that pile up or sink in during indentation, which necessitates the development of an alternative method.

In this study, a model is developed within the framework of contact mechanics to compute the nanomechanical properties of a material from its indentation response. Unlike the OP method, indentation energies are employed, in the form of dimensionless constants, to evaluate the model parameters. Analysis of load-displacement data for a wide range of materials revealed that the energy constants may be used to determine the indenter tip bluntness, hardness and initial unloading stiffness of the material. The proposed model has two main advantages: (1) it does not require computation of the contact area, a source of error in the existing method; and (2) it explicitly incorporates the effects of peak indentation load, dwelling period and indenter tip bluntness on the measured mechanical properties.

Indentation tests were also carried out on cement paste samples to validate the energy-based model by determining the elastic modulus and hardness of the different phases of the paste. The model computes mechanical properties in close agreement with those obtained by the OP method; a discrepancy, though insignificant, is more pronounced for C-S-H than for the anhydrous phase. The proposed method is computationally efficient, and thus well suited to grid indentation. In addition, several empirical relations are developed that prove useful in understanding the nanomechanical behavior of cementitious materials.
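For reference, the contact-area step that the energy-based model avoids sits at the heart of the standard OP analysis; a commonly cited form of those relations (standard textbook expressions, not taken from this thesis) is

$$S = \left.\frac{dP}{dh}\right|_{h=h_{\max}},\qquad h_c = h_{\max} - \varepsilon\,\frac{P_{\max}}{S},\qquad H = \frac{P_{\max}}{A(h_c)},\qquad E_r = \frac{\sqrt{\pi}}{2\beta}\,\frac{S}{\sqrt{A(h_c)}},$$

with $A(h_c)\approx 24.56\,h_c^2$ for an ideal Berkovich tip, $\varepsilon\approx 0.75$, and $1/E_r = (1-\nu^2)/E + (1-\nu_i^2)/E_i$ relating the reduced modulus to the specimen and indenter moduli. The dependence of $H$ and $E_r$ on the estimated area $A(h_c)$ is precisely what pile-up and sink-in corrupt, which motivates replacing the area with energy-based dimensionless constants.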
Abstract:
Purpose: This paper aims to design an evaluation method that enables an organization to assess its current IT landscape and provides a readiness assessment prior to Software as a Service (SaaS) adoption. Design/methodology/approach: The research employs a mix of quantitative and qualitative approaches to conduct an IT application assessment. Quantitative data, such as end users' feedback on the IT applications, capture the technical impact on efficiency and productivity. Qualitative data, such as business domain, business services and IT application cost drivers, are used to determine the business value of the IT applications in an organization. Findings: The assessment of IT applications leads to decisions on the suitability of each IT application for migration to a cloud environment. Research limitations/implications: The evaluation of how a particular IT application impacts a business service is based on logical interpretation. A data mining method is suggested in order to derive patterns of IT application capabilities. Practical implications: The method has been applied in a local council in the UK, helping the council decide the future status of its IT applications for cost-saving purposes.
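The shape of such an assessment can be sketched in a few lines: a technical score from user feedback and a business-value score feed a simple decision rule per application. The field names, scales and thresholds below are hypothetical illustrations, not the paper's instrument.

```python
# Illustrative sketch of a SaaS-readiness screen: each IT application gets a
# technical score (from end-user feedback) and a business-value score (from
# domain/service/cost inputs); a simple rule maps the pair to a migration
# decision. All names and thresholds are assumptions for illustration.

from dataclasses import dataclass

@dataclass
class Application:
    name: str
    user_satisfaction: float    # 0-10, survey-derived
    productivity_impact: float  # 0-10, survey-derived
    business_value: float       # 0-10, analyst-assessed

def migration_advice(app: Application) -> str:
    technical = (app.user_satisfaction + app.productivity_impact) / 2
    if technical >= 6 and app.business_value >= 6:
        return "migrate to SaaS"
    if technical < 4 and app.business_value < 4:
        return "retire"
    return "retain on-premises / re-evaluate"

apps = [
    Application("HR portal", 7.5, 8.0, 7.0),
    Application("legacy GIS", 3.0, 3.5, 2.5),
]
for app in apps:
    print(f"{app.name}: {migration_advice(app)}")
```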
Abstract:
This paper reports on the development and optimization of a modified Quick, Easy, Cheap, Effective, Rugged and Safe (QuEChERS) based extraction technique coupled with a clean-up dispersive solid-phase extraction (dSPE) as a new, reliable and powerful strategy to enhance the extraction efficiency of free low molecular-weight polyphenols in selected species of dietary vegetables. The process involves two simple steps. First, the homogenized samples are extracted and partitioned using an organic solvent and salt solution. Then, the supernatant is further extracted and cleaned using a dSPE technique. Final clear extracts of vegetables were concentrated under vacuum to near dryness and taken up into the initial mobile phase (0.1% formic acid and 20% methanol). The separation and quantification of free low molecular-weight polyphenols from the vegetable extracts was achieved by ultra-high-pressure liquid chromatography (UHPLC) equipped with a photodiode array (PDA) detection system and a trifunctional High Strength Silica capillary analytical column (HSS T3), specially designed for polar compounds. The performance of the method was assessed by studying the selectivity, linear dynamic range, limit of detection (LOD), limit of quantification (LOQ), precision, trueness, and matrix effects. The validation parameters of the method showed satisfactory figures of merit. Good linearity (R² > 0.954; (+)-catechin in carrot samples) was achieved over the studied concentration range. Reproducibility was better than 3%. Consistent recoveries of polyphenols ranging from 78.4 to 99.9% were observed when all target vegetable samples were spiked at two concentration levels, with relative standard deviations (RSDs, n = 5) lower than 2.9%. The LODs and LOQs ranged from 0.005 μg mL⁻¹ (trans-resveratrol, carrot) to 0.62 μg mL⁻¹ (syringic acid, garlic) and from 0.016 μg mL⁻¹ (trans-resveratrol, carrot) to 0.87 μg mL⁻¹ ((+)-catechin, carrot), depending on the compound. The method was applied to study the occurrence of free low molecular-weight polyphenols in eight selected dietary vegetables (broccoli, tomato, carrot, garlic, onion, red pepper, green pepper and beetroot), providing a valuable and promising tool for food quality evaluation.
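The abstract does not state which convention was used for the detection limits; under the common calibration-based (ICH) convention they would be computed as

$$\mathrm{LOD} = \frac{3.3\,\sigma}{S},\qquad \mathrm{LOQ} = \frac{10\,\sigma}{S},$$

where $\sigma$ is the standard deviation of the response (e.g., of the calibration intercept) and $S$ is the slope of the calibration curve, while recovery at each spiking level follows $R(\%) = 100\,(C_{\mathrm{found}} - C_{\mathrm{native}})/C_{\mathrm{spiked}}$.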
Abstract:
Efficient implementation of recycling networks requires appropriate logistical structures for managing the reverse flow of materials from users to producers. The steel sheet distributor studied had established a protocol for scrap recovery with the steel plant and its customers. To implement the logistics network for recycling, the company invested in producing containers, hiring a specialized labor force, and purchasing trucks for container transportation. The network interconnected the company with two kinds of customers: those returning scrap and those who preferred to continue business as usual. The logistical network was analyzed using emergy synthesis, and the data obtained were used to evaluate and compare the system's environmental costs and benefits from the perspectives of the distributor and the steel plant operator. Emergy ternary diagrams provided a way to assess recycling strategies and to compare the relative economic and environmental benefits of the logistical network implemented. The minimum quantity of scrap that the distributor must recover to improve environmental benefits was determined, allowing a decision on whether the system is worth keeping in operation. The proposed assessment method may also help policy-makers create strategies to reward or incentivize users of reverse logistics, and help establish regulations, by decreasing taxes or stimulating innovation, to effectively implement the National Policy on Solid Waste.
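The abstract does not list the indicators plotted on the ternary diagrams; emergy evaluations of this kind conventionally summarize a system with ratio indices such as

$$\mathrm{EYR} = \frac{Y}{F},\qquad \mathrm{ELR} = \frac{N+F}{R},\qquad \mathrm{ESI} = \frac{\mathrm{EYR}}{\mathrm{ELR}},$$

where $R$, $N$ and $F$ are the renewable, non-renewable and purchased (feedback) emergy inflows and $Y = R + N + F$ is the total emergy yield; comparing such indices with and without scrap recovery is one way to locate the break-even recovery quantity the abstract mentions.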
Abstract:
The uptake of Linked Data (LD) has promoted the proliferation of datasets and their associated ontologies for describing different domains. Particular characteristics of LD development, such as agility and web-based architecture, necessitate the revision, adaptation, and lightening of existing methodologies for ontology development. This thesis proposes a lightweight method for ontology development in an LD context based on data-driven agile development, the reuse of existing resources, and the evaluation of the obtained products against both classical ontological engineering principles and LD characteristics.
Abstract:
Activity recognition using the information provided by the different sensors embedded in a smartphone faces limitations due to the capabilities of those devices when the computations are carried out on the terminal. In this work, a fuzzy inference module is implemented to decide which classifier is the most appropriate to use at a given moment, considering the application requirements and the device context as characterized by its battery level, available memory and CPU load. The set of classifiers considered is composed of Decision Tables and Trees trained using different numbers of sensors and features. In addition, some classifiers perform activity recognition regardless of the on-body device position, while others rely on prior recognition of that position and use a classifier trained with measurements gathered with the mobile placed at that specific position. The implemented modules show that an evaluation of the classifiers allows them to be ranked, so that the fuzzy inference module can periodically choose the one that best suits the device context and application requirements.
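The selection idea can be sketched compactly: fuzzify the device context into a resource-availability score, then pick the most accurate classifier whose cost fits. The membership functions, classifier table and costs below are hypothetical stand-ins, not the paper's module.

```python
# Illustrative sketch: map device context (battery, free memory, CPU load)
# to a fuzzy "resource availability" score, then choose the most accurate
# classifier whose estimated cost fits that availability.

def tri(x, a, b, c):
    """Triangular membership function with vertices (a, b, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def resource_availability(battery, free_mem, cpu_load):
    """Fuzzy score in [0, 1]: high battery/memory and low CPU load -> high."""
    high_batt = tri(battery, 0.2, 1.0, 1.8)
    high_mem = tri(free_mem, 0.1, 1.0, 1.9)
    low_cpu = tri(1.0 - cpu_load, 0.1, 1.0, 1.9)
    return min(high_batt, high_mem, low_cpu)  # conjunctive (min) rule

# (classifier, accuracy, relative cost) -- hypothetical evaluation results.
CLASSIFIERS = [
    ("tree_all_sensors", 0.93, 0.9),
    ("tree_accel_only", 0.88, 0.5),
    ("table_accel_only", 0.83, 0.2),
]

def choose(battery, free_mem, cpu_load):
    budget = resource_availability(battery, free_mem, cpu_load)
    feasible = [c for c in CLASSIFIERS if c[2] <= budget] or [CLASSIFIERS[-1]]
    return max(feasible, key=lambda c: c[1])[0]

print(choose(battery=1.0, free_mem=0.95, cpu_load=0.05))  # -> tree_all_sensors
print(choose(battery=0.2, free_mem=0.3, cpu_load=0.8))    # -> table_accel_only
```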
Abstract:
Intelligent Transportation Systems (ITS) cover a broad range of methods and technologies that provide answers to many problems of transportation. Unmanned control of the steering wheel is one of the most important challenges facing researchers in this area. This paper presents a method to automatically adjust a fuzzy controller that manages the steering wheel of a mass-produced vehicle so as to reproduce the steering of a human driver. To this end, information about the car's state is recorded while it is driven by human drivers, and this information is used to obtain, via genetic algorithms, fuzzy controllers that can drive the car the way humans do. These controllers must satisfy two main objectives: to reproduce the human behavior, and to provide smooth actions that ensure comfortable driving. Finally, the results of automated driving on a test circuit are presented, showing both good route tracking (similar to the performance obtained by humans on the same task) and smooth driving.
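A toy version of the tuning loop may clarify the approach. The genome below has just two gains standing in for the fuzzy controller's parameters, and the driving log is synthetic; the fitness function combines the two stated objectives, imitation of the human steering signal and smoothness of actions. Nothing here reproduces the paper's actual controller or data.

```python
# Illustrative sketch: a genetic algorithm searches controller parameters so
# that the controller's output matches recorded human steering while
# penalizing abrupt commands.

import random

random.seed(0)

# Synthetic (lateral_error, heading_error, human_steering) samples.
LOG = [(e / 10, h / 20, 0.8 * e / 10 + 0.3 * h / 20)
       for e in range(-5, 6) for h in range(-5, 6)]

def fitness(genome):
    k_lat, k_head = genome
    err = smooth = prev = 0.0
    for lat, head, human in LOG:
        out = k_lat * lat + k_head * head
        err += (out - human) ** 2      # objective 1: reproduce human behavior
        smooth += (out - prev) ** 2    # objective 2: penalize abrupt steering
        prev = out
    return err + 0.1 * smooth          # lower is better

def evolve(pop_size=30, generations=40):
    pop = [[random.uniform(0, 2), random.uniform(0, 2)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]             # elitist selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = random.sample(parents, 2)
            # Arithmetic crossover plus Gaussian mutation.
            children.append([(x + y) / 2 + random.gauss(0, 0.05)
                             for x, y in zip(a, b)])
        pop = parents + children
    return min(pop, key=fitness)

print("best gains:", evolve())
```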
Abstract:
This paper decomposes the conventional measure of selection bias in observational studies into three components. The first two components are due to differences in the distributions of characteristics between participant and nonparticipant (comparison) group members: the first arises from differences in the supports, and the second from differences in densities over the region of common support. The third component arises from selection bias precisely defined. Using data from a recent social experiment, we find that the component due to selection bias, precisely defined, is smaller than the first two components. However, selection bias still represents a substantial fraction of the experimental impact estimate. The empirical performance of matching methods of program evaluation is also examined. We find that matching based on the propensity score eliminates some but not all of the measured selection bias, with the remaining bias still a substantial fraction of the estimated impact. We find that the support of the distribution of propensity scores for the comparison group is typically only a small portion of the support for the participant group. For values outside the common support, it is impossible to reliably estimate the effect of program participation using matching methods. If the impact of participation depends on the propensity score, as we find in our data, the failure of the common support condition severely limits matching compared with random assignment as an evaluation estimator.
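Schematically, and up to the normalization conventions of the original paper: with $S_1$ and $S_0$ the supports of the characteristics $X$ for participants and comparisons, $S = S_1 \cap S_0$ their common support, and $\bar Y_0(X, d) = E(Y_0 \mid X, D = d)$, the conventional bias $B = E(Y_0 \mid D{=}1) - E(Y_0 \mid D{=}0)$ decomposes as $B = B_1 + B_2 + B_3$ with

$$B_1 = \int_{S_1\setminus S}\bar Y_0(X,1)\,dF(X\mid D{=}1) - \int_{S_0\setminus S}\bar Y_0(X,0)\,dF(X\mid D{=}0),$$
$$B_2 = \int_{S}\bar Y_0(X,0)\,\big[dF(X\mid D{=}1)-dF(X\mid D{=}0)\big],\qquad B_3 = \int_{S}\big[\bar Y_0(X,1)-\bar Y_0(X,0)\big]\,dF(X\mid D{=}1),$$

so that $B_1$ captures nonoverlapping support, $B_2$ the reweighting of densities over the common support, and $B_3$ selection bias precisely defined.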
Abstract:
This case study evaluates the implementation of a secondary land use plan in Winnipeg, MB. The area selected is the Northeast Neighbourhood in Waverley West; the development of this neighbourhood was guided by the Northeast Neighbourhood Area Structure Plan (NNASP). The case study evaluates the implementation of the NNASP through a conformance analysis that answers the following research questions: 1) Does the developed land use pattern in the NNASP area conform to what was planned? 2) Does the implementation of the NNASP conform to the goals, objectives, policies, and intent of the plan? The implementation of the NNASP was evaluated against 62 evaluation criteria generated from the policies of the NNASP. Using this method, the development of the Northeast Neighbourhood is effectively evaluated against the requirements of the NNASP. The conformity test used a threefold approach comprising GIS analysis, a site visit, and document analysis.
Abstract:
Semipermeable membrane devices (SPMDs) have been used as passive air samplers for semivolatile organic compounds in a range of studies. However, due to a lack of calibration data for polycyclic aromatic hydrocarbons (PAHs), SPMD data have not been used to estimate air concentrations of target PAHs. In this study, SPMDs were deployed for 32 days at two sites in a major metropolitan area in Australia. High-volume active sampling systems (HiVols) were co-deployed at both sites. Using the HiVol air concentration data from one site, SPMD sampling rates were measured for 12 US EPA Priority Pollutant PAHs, and these values were then used to determine air concentrations at the second site from SPMD concentrations. Air concentrations were also measured at the second site with co-deployed HiVols to validate the SPMD results. PAHs mostly associated with the vapour phase (fluorene to pyrene) dominated both the HiVol and passive air samples. Reproducibility between replicate passive samplers was satisfactory (CV < 20%) for the majority of compounds. Sampling rates ranged between 0.6 and 6.1 m³ d⁻¹. SPMD-based air concentrations were calculated at the second site for each compound using these sampling rates, and the differences between SPMD-derived air concentrations and those measured using a HiVol were, on average, within a factor of 1.5. The dominant processes for the uptake of PAHs by SPMDs were also assessed. Using the SPMD method described herein, estimates of particulate-sorbed airborne PAHs with five rings or more were within 1.8-fold of HiVol-measured values.
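Assuming the SPMDs remained in the linear (integrative) uptake phase over the 32-day deployment, which is the usual basis for compound-specific sampling rates, the calibration and application steps reduce to

$$R_s = \frac{M_{\mathrm{SPMD}}}{C_{\mathrm{air}}\,t}\ \ \text{(calibration site, HiVol-measured } C_{\mathrm{air}}\text{)},\qquad C_{\mathrm{air}} = \frac{M_{\mathrm{SPMD}}}{R_s\,t}\ \ \text{(second site)},$$

where $M_{\mathrm{SPMD}}$ is the mass of the compound accumulated in the sampler, $R_s$ the sampling rate (m³ d⁻¹), and $t = 32$ d the deployment time.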
Abstract:
This paper presents an investigation of design code provisions for steel-concrete composite columns. The study covers the national building codes of the United States, Canada and Brazil, and the transnational EUROCODE. It is based on experimental results for 93 axially loaded concrete-filled tubular steel columns: 36 unpublished, full-scale experimental results by the authors and 57 results from the literature. The error of the resistance models is determined by comparing experimental ultimate loads with code-predicted column resistances. Regression analysis is used to describe the variation of model error with column slenderness and to characterize model uncertainty. The paper shows that the Canadian and European codes are able to predict mean column resistance, since their resistance models present detailed formulations for concrete confinement by the steel tube. The ANSI/AISC and Brazilian codes make limited allowance for concrete confinement and become very conservative for short columns. Reliability analysis is used to evaluate the safety level of the code provisions; it includes model error and other random problem parameters such as steel and concrete strengths and dead and live loads. Design code provisions are evaluated in terms of sufficient and uniform reliability criteria. Results show that the four design codes studied provide uniform reliability, with the Canadian code best achieving this goal, the result of a code well balanced in terms of both load combinations and resistance model. The European code is less successful in providing uniform reliability, a consequence of the partial factors used in its load combinations. The paper also shows that reliability indexes of columns designed according to the European code can be as low as 2.2, well below the target reliability levels of EUROCODE.
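A common formalization of the two analysis steps, consistent with (though not spelled out in) the abstract: the model error for specimen $i$ is the ratio of measured to predicted resistance, and the reliability index follows from the failure probability of a load-resistance limit state,

$$\lambda_i = \frac{R_{\mathrm{exp},i}}{R_{\mathrm{pred},i}},\qquad g(\mathbf{X}) = \lambda\,R - (D + L),\qquad \beta = -\Phi^{-1}\!\big(P[g(\mathbf{X}) \le 0]\big),$$

with $\lambda$ treated as a random variable (possibly a function of slenderness via the regression), $R$ the resistance, $D$ and $L$ the random dead and live load effects, and $\Phi$ the standard normal CDF. On this definition the reported $\beta = 2.2$ corresponds to a failure probability of roughly 1.4%.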